Gists from Sam Bessalah (samklr)
samklr / binlog_rds.md
Last active Dec 20, 2018
Enabling RDS MySQL for Change Data Capture

Do this before you add any data to your instance.

For RDS: create a parameter group with the following parameters changed:
- Set binlog_format = ROW
- Set binlog_row_image = FULL
- Set binlog_rows_query_log_events = ON (1)
- (Optional) Increase max_allowed_packet (large row events can exceed the default)

Once the instance is created with the above parameter group, you'll also need to increase the binlog retention time, as shown below.
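Binlog retention on RDS is not a parameter-group setting; it is changed through a stored procedure on the instance itself. A minimal sketch (144 hours is just an example value):

CALL mysql.rds_set_configuration('binlog retention hours', 144);
CALL mysql.rds_show_configuration;  -- verify the new retention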

jeqo / KafkaStreamsTopologyGraphvizPrinter.java
Last active Jul 11, 2018
Generating Graphviz from Kafka Streams
import org.apache.kafka.streams.TopologyDescription;
import java.util.stream.Collectors;

// Renders a topology as a Graphviz digraph, one edge per node -> successor pair.
// The preview cut off at the class declaration; this body is a minimal reconstruction.
public class KafkaStreamsTopologyGraphvizPrinter {
    public static String print(TopologyDescription description) {
        return description.subtopologies().stream()
                .flatMap(sub -> sub.nodes().stream())
                .flatMap(node -> node.successors().stream()
                        .map(succ -> "  \"" + node.name() + "\" -> \"" + succ.name() + "\";"))
                .collect(Collectors.joining("\n", "digraph topology {\n", "\n}"));
    }
}
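Fed the result of topology.describe() from a built topology, a printer like this emits plain DOT text, so the output can be piped to Graphviz (for example dot -Tpng) to get a picture of the stream topology.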
MorganGeek / output
Last active May 16, 2019
Install transmission on WD My Cloud
FMCloud:~# wget https://gist.githubusercontent.com/MorganGeek/43ecb0ef1ac65322cd9237c16e77a5bb/raw/c8be05b2305c5b03b0e6ea1f2f27628625145371/wdmycloud-install-transmission.sh
--2017-06-17 07:54:43-- https://gist.githubusercontent.com/MorganGeek/43ecb0ef1ac65322cd9237c16e77a5bb/raw/c8be05b2305c5b03b0e6ea1f2f27628625145371/wdmycloud-install-transmission.sh
Resolving gist.githubusercontent.com (gist.githubusercontent.com)... 151.101.36.133
Connecting to gist.githubusercontent.com (gist.githubusercontent.com)|151.101.36.133|:443... connected.
HTTP request sent, awaiting response... 200 OK
Length: 1136 (1.1K) [text/plain]
Saving to: `wdmycloud-install-transmission.sh'
100%[===================================================================================================================>] 1,136 --.-K/s in 0.001s
jkpl / article.md
Last active Nov 12, 2018
Error handling pitfalls in Scala

Error handling pitfalls in Scala

There are multiple strategies for error handling in Scala.

Errors can be represented as [exceptions][], which is a common way of dealing with errors in languages such as Java. However, exceptions are invisible to the type system, which can make them challenging to deal with. It's easy to leave out the necessary error handling, which can result in unfortunate runtime errors.
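A minimal sketch of the pitfall (a hypothetical example, not taken from the article): the unsafe signature hides the failure mode, while the Either version forces callers to deal with it.

// Invisible to the type system: the signature promises an Int unconditionally,
// but s.toInt throws NumberFormatException on bad input.
def parseUnsafe(s: String): Int = s.toInt

// Failure is now part of the type; callers must handle Left explicitly.
def parse(s: String): Either[String, Int] =
  try Right(s.toInt)
  catch { case _: NumberFormatException => Left(s"not a number: $s") }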

elakito / sc_subtest.go
Created Dec 13, 2016
sarama-cluster publisher/subscriber sample to test partition assignment
package main

import (
	"flag"
	"fmt"
	"log"
	"strings"
	"time"

	"github.com/Shopify/sarama"
)
jkpl / article.org
Last active Nov 13, 2018
Enforcing invariants in Scala datatypes

Enforcing invariants in Scala datatypes

Scala provides many tools to help us build programs with fewer runtime errors. Instead of relying on nulls, the recommended practice is to use the Option type. Instead of throwing exceptions, the Try and Either types are used to represent potential error scenarios. What's common to these features is that they capture runtime scenarios in the type system, lifting their handling to the compilation phase: your program doesn't compile until you've explicitly handled nulls, exceptions, and other failure cases in your code.
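A minimal sketch of the technique (Age is a hypothetical example, not from the article): a private constructor plus a validating factory means a value that breaks the invariant can never exist.

// The sealed-abstract-case-class idiom suppresses the generated apply/copy,
// so Age.fromInt is the only way to construct an Age.
sealed abstract case class Age private (value: Int)

object Age {
  def fromInt(value: Int): Option[Age] =
    if (value >= 0 && value <= 150) Some(new Age(value) {}) else None
}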

In his “Strategic Scala Style” blog post series,

aws_kafka_bench.tf
/* Terraform setup to evaluate Kafka performance on various AWS instance types and EBS sizes */
provider "aws" {
  region = "eu-west-1"
}

variable "ssh_key_name" {
  default = "ben@ici"
}
adamw / windowing.scala
Created Aug 5, 2016
Windowing data in Akka
package com.softwaremill.akka

import java.time._
import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.Source
import scala.collection.mutable
import scala.concurrent.Await
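The preview stops at the imports. A minimal, hypothetical sketch of one way to window a stream with Akka Streams (not necessarily the approach the full gist takes):

import akka.actor.ActorSystem
import akka.stream.ActorMaterializer
import akka.stream.scaladsl.Source
import scala.concurrent.duration._

object WindowingSketch extends App {
  implicit val system: ActorSystem = ActorSystem("windowing")
  implicit val mat: ActorMaterializer = ActorMaterializer()

  // Emit windows of at most 100 elements, closing each window after 1 second.
  Source(1 to 1000)
    .groupedWithin(100, 1.second)
    .runForeach(window => println(s"window of ${window.size} elements"))
}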
longcao / SparkCopyPostgres.scala
Last active Sep 14, 2019
COPY Spark DataFrame rows to PostgreSQL (via JDBC)
import java.io.InputStream
import org.apache.spark.sql.execution.datasources.jdbc.JdbcUtils
import org.apache.spark.sql.{ DataFrame, Row }
import org.postgresql.copy.CopyManager
import org.postgresql.core.BaseConnection
val jdbcUrl = s"jdbc:postgresql://..." // db credentials elided
val connectionProperties = {
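The preview cuts off early. A minimal sketch of the core idea, CSV-encoding each partition and streaming it through PostgreSQL's COPY protocol instead of row-by-row INSERTs (all names below are assumptions, not the gist's actual code):

import java.io.ByteArrayInputStream
import java.sql.DriverManager
import org.apache.spark.sql.{ DataFrame, Row }
import org.postgresql.copy.CopyManager
import org.postgresql.core.BaseConnection

// Hypothetical helper: one COPY per partition, executed on the workers.
def copyToPostgres(df: DataFrame, jdbcUrl: String, table: String): Unit =
  df.foreachPartition { rows: Iterator[Row] =>
    val conn = DriverManager.getConnection(jdbcUrl)
    try {
      // Naive CSV encoding: no quoting or escaping, fine only for simple data.
      val csv = rows.map(_.mkString(",")).mkString("\n")
      val copy = new CopyManager(conn.unwrap(classOf[BaseConnection]))
      copy.copyIn(s"COPY $table FROM STDIN WITH (FORMAT csv)",
        new ByteArrayInputStream(csv.getBytes("UTF-8")))
    } finally conn.close()
  }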
MinByColumn.scala
import org.apache.spark.sql.expressions.MutableAggregationBuffer
import org.apache.spark.sql.expressions.UserDefinedAggregateFunction
import org.apache.spark.sql.Row
import sqlContext.implicits._
import org.apache.spark.sql.types.{StructType, StructField, DataType, ByteType, ShortType, IntegerType, LongType, FloatType, DoubleType, DecimalType, StringType, BinaryType, BooleanType, TimestampType, DateType, ArrayType}

// "min by" as a UDAF: returns the value column taken from the row where minCol is smallest.
class MinBy(valueType: DataType, minType: DataType) extends UserDefinedAggregateFunction {
  // Input and buffer share the same shape: the candidate value and the column being minimised.
  def inputSchema: StructType = StructType(StructField("value", valueType) :: StructField("minCol", minType) :: Nil)
  def bufferSchema: StructType = StructType(StructField("value", valueType) :: StructField("minCol", minType) :: Nil)