Lucas Batistussi (lucasrpb): GitHub Gists
import java.util.concurrent.LinkedBlockingDeque
/*
Given n pairs of parentheses, write a function to generate all combinations of well-formed parentheses.
For example, given n = 3, a solution set is:
[
"((()))",
"(()())",
"(())()",
"()(())",
"()()()"
]
*/
package feed
import MyPostgresProfile.api._
import com.zaxxer.hikari.HikariConfig
import org.apache.pulsar.shade.org.glassfish.jersey.message.internal.DataSourceProvider
import org.postgresql.ds.PGSimpleDataSource
import org.slf4j.LoggerFactory
import slick.dbio.DBIO
import scala.concurrent.{Await, Future}
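Only the imports survive in this preview. A minimal sketch of how these pieces usually fit together, assuming the standard Slick PostgresProfile instead of the gist's custom MyPostgresProfile; host, database name and credentials are placeholders:

import com.zaxxer.hikari.{HikariConfig, HikariDataSource}
import org.postgresql.ds.PGSimpleDataSource
import slick.jdbc.PostgresProfile.api._

object FeedDb {
  // Plain Postgres data source; connection details are placeholders
  private val pg = new PGSimpleDataSource()
  pg.setServerName("localhost")
  pg.setDatabaseName("feed")
  pg.setUser("postgres")
  pg.setPassword("postgres")

  // Pool it with HikariCP
  private val hikariConfig = new HikariConfig()
  hikariConfig.setDataSource(pg)
  hikariConfig.setMaximumPoolSize(10)
  private val pool = new HikariDataSource(hikariConfig)

  // Hand the pooled DataSource to Slick, capping Slick at the pool size
  val db = Database.forDataSource(pool, Some(10))
}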

Dockerfile:

FROM openjdk:17.0.1-slim

# Copy the jar to the production image from the builder stage.
COPY /target/scala-2.13/service.jar /service.jar
COPY secure-connect-test.zip /secure-connect-test.zip

# Run the web service on container startup.
# (the preview ends here; the entrypoint below is the usual way to launch the copied fat jar)
ENTRYPOINT ["java", "-jar", "/service.jar"]
package io.scalac.githubrank
import akka.actor.typed.{ActorRef, ActorSystem, Behavior}
import akka.actor.typed.scaladsl.Behaviors
import akka.http.scaladsl.Http
import org.slf4j.LoggerFactory
import akka.http.scaladsl.client.RequestBuilding._
import akka.http.scaladsl.model.{HttpHeader, HttpRequest, ResponseEntity, StatusCode, StatusCodes}
import akka.http.scaladsl.model.HttpProtocols._
import akka.http.scaladsl.model.MediaTypes._
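Again only the imports are visible. A minimal sketch of issuing a request with the Akka HTTP client, using a placeholder URL rather than anything from the original gist:

import akka.actor.typed.ActorSystem
import akka.actor.typed.scaladsl.Behaviors
import akka.http.scaladsl.Http
import akka.http.scaladsl.client.RequestBuilding.Get
import scala.concurrent.Future
import scala.concurrent.duration._

object GitHubClientSketch {
  implicit val system: ActorSystem[Nothing] = ActorSystem(Behaviors.empty, "github-rank")
  import system.executionContext

  // Issue a single GET and read the response body as a string (url is supplied by the caller)
  def fetch(url: String): Future[String] =
    Http()
      .singleRequest(Get(url))
      .flatMap(_.entity.toStrict(5.seconds).map(_.data.utf8String))
}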
package scheduler
import akka.Done
import akka.actor.ActorSystem
import akka.kafka.scaladsl.Consumer.DrainingControl
import akka.kafka.{CommitDelivery, CommitterSettings, ConsumerSettings, Subscriptions}
import akka.kafka.scaladsl.{Committer, Consumer}
import akka.stream.KillSwitches
import akka.stream.impl.Cancel
import akka.stream.scaladsl.{Keep, Sink}
// Akka Kafka Stream Consumer
import akka.Done
import akka.actor.ActorSystem
import akka.kafka.scaladsl.Consumer.DrainingControl
import akka.kafka.{CommitDelivery, CommitterSettings, ConsumerSettings, Subscriptions}
import akka.kafka.scaladsl.{Committer, Consumer}
import akka.stream.scaladsl.Sink
import org.apache.kafka.clients.consumer.ConsumerConfig
import org.apache.kafka.common.serialization.{ByteArrayDeserializer, StringDeserializer}
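Both of these previews show only the imports; the stream itself is cut off. A self-contained sketch of the at-least-once pattern those imports point at (broker address, group id and topic are placeholders, not the original gist code):

import akka.Done
import akka.actor.ActorSystem
import akka.kafka.scaladsl.Consumer.DrainingControl
import akka.kafka.scaladsl.{Committer, Consumer}
import akka.kafka.{CommitterSettings, ConsumerSettings, Subscriptions}
import akka.stream.scaladsl.Keep
import org.apache.kafka.clients.consumer.ConsumerConfig
import org.apache.kafka.common.serialization.{ByteArrayDeserializer, StringDeserializer}
import scala.concurrent.Future

object ConsumerSketch {
  implicit val system: ActorSystem = ActorSystem("scheduler")
  import system.dispatcher

  val consumerSettings =
    ConsumerSettings(system, new ByteArrayDeserializer, new StringDeserializer)
      .withBootstrapServers("localhost:9092")   // placeholder broker
      .withGroupId("scheduler-group")           // placeholder group id
      .withProperty(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest")

  val control: DrainingControl[Done] =
    Consumer
      .committableSource(consumerSettings, Subscriptions.topics("tasks")) // placeholder topic
      .mapAsync(1) { msg =>
        // process msg.record.value(), then hand the offset to the committer
        Future.successful(msg.committableOffset)
      }
      .toMat(Committer.sink(CommitterSettings(system)))(Keep.both)
      .mapMaterializedValue(DrainingControl.apply)
      .run()

  // Graceful stop: stop polling, let in-flight commits finish, then shut the stream down
  def stop(): Future[Done] = control.drainAndShutdown()
}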
lucasrpb / Main.scala (created March 6, 2021)
How to init and stop Akka Typed sharded singletons around the cluster
import akka.actor.typed.scaladsl.Behaviors
import akka.actor.typed.{ActorSystem, Behavior}
import akka.cluster.sharding.typed.scaladsl.{EntityTypeKey, ShardedDaemonProcess}
import akka.util.Timeout
import com.typesafe.config.ConfigFactory
import scala.concurrent.duration._
object Main {
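The preview stops at the object declaration. A sketch of how ShardedDaemonProcess is typically initialized with a stop message, assuming Akka 2.6 APIs and a cluster-enabled application.conf; the name and instance count are illustrative:

import akka.actor.typed.{ActorSystem, Behavior}
import akka.actor.typed.scaladsl.Behaviors
import akka.cluster.sharding.typed.ShardedDaemonProcessSettings
import akka.cluster.sharding.typed.scaladsl.ShardedDaemonProcess
import com.typesafe.config.ConfigFactory

object ShardedWorkersSketch {
  sealed trait Command
  case object Stop extends Command

  // Each instance gets a stable index; Stop lets sharding hand the instance
  // off gracefully during rebalancing or shutdown
  def worker(index: Int): Behavior[Command] =
    Behaviors.receiveMessage {
      case Stop => Behaviors.stopped
    }

  def main(args: Array[String]): Unit = {
    implicit val system: ActorSystem[Nothing] =
      ActorSystem(Behaviors.empty, "cluster", ConfigFactory.load())

    ShardedDaemonProcess(system).init[Command](
      "feed-workers",                       // process name (hypothetical)
      4,                                    // instances spread over the cluster
      index => worker(index),
      ShardedDaemonProcessSettings(system),
      Some(Stop)                            // stop message sent when an instance is rebalanced
    )
  }
}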
// Test file for using the Raspberry Pi and Johnny-Five
const five = require('johnny-five');
const raspi = require('raspi-io').RaspiIO;
// Make a new `Board()` instance and use raspi-io
const board = new five.Board({
io: new raspi()
});
var mosca = require('mosca');
var led = null;
lucasrpb / README.md (created January 31, 2020, forked from davideicardi/README.md)
Alpakka Kafka connector (akka-stream-kafka) example. Produce and consume Kafka messages using Akka Streams.

Alpakka Kafka connector (akka-stream-kafka) example

A simple setup that uses the Alpakka Kafka connector to produce and consume Kafka messages.

I assume that you have two Scala apps: a producer and a consumer.

Producer

Add the following dependencies:
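The preview cuts off before the dependency list. A typical sbt setup for Alpakka Kafka looks like this (artifact versions are illustrative, not taken from the original gist):

// build.sbt
libraryDependencies ++= Seq(
  "com.typesafe.akka" %% "akka-stream"       % "2.6.20",
  "com.typesafe.akka" %% "akka-stream-kafka" % "2.1.1"
)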

#!/bin/bash
# Download etcd v3.3.1, install the binaries into /usr/local/bin, and print the installed version
curl -L https://github.com/coreos/etcd/releases/download/v3.3.1/etcd-v3.3.1-linux-amd64.tar.gz -o etcd-v3.3.1-linux-amd64.tar.gz
tar xzvf etcd-v3.3.1-linux-amd64.tar.gz
cd etcd-v3.3.1-linux-amd64
sudo cp etcd /usr/local/bin/
sudo cp etcdctl /usr/local/bin/
etcd --version