Train Chen TrainTravel

@TrainTravel
TrainTravel / introrx.md
Created May 15, 2019 03:46 — forked from staltz/introrx.md
The introduction to Reactive Programming you've been missing
@TrainTravel
TrainTravel / Instructions.md
Created July 13, 2019 07:56 — forked from jmdobry/Instructions.md
Nginx reverse-proxy for RethinkDB Admin UI

Start your rethinkdb instance with this flag: --bind all (or bind=all in the configuration file for your instance)

Block external access to the web UI with these two commands:

sudo iptables -A INPUT -i eth0 -p tcp --dport 8080 -j DROP
sudo iptables -I INPUT -i eth0 -s 127.0.0.1 -p tcp --dport 8080 -j ACCEPT

Install nginx: sudo apt-get install nginx
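
The preview cuts off before the proxy configuration itself, so the following server block is only a rough sketch of the usual pattern; the /rethinkdb-admin/ path and the listen port are placeholders, not taken from the gist:

server {
    listen 80;

    location /rethinkdb-admin/ {
        proxy_pass http://127.0.0.1:8080/;
        proxy_set_header Host $host;
        proxy_set_header X-Real-IP $remote_addr;
    }
}

Combined with the iptables rules above, the web UI is then reachable only through nginx, which is also the natural place to add authentication.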

@TrainTravel
TrainTravel / SingleCoreSolution.cpp
Created April 14, 2020 03:24
Nvidia Assignments
#include"SingleCoreSolution.h"
namespace SingleCoreSolution {
std::string Solution::TrickOrTreat(const std::vector<int>& pieces,
const int homes, const int max) {
//calculate prefix sum;
std::vector<int> prefixSum(homes + 1, 0);
for (int i = 0; i < homes; ++i) {
prefixSum[i + 1] = prefixSum[i] + pieces[i];
}
@TrainTravel
TrainTravel / README.md
Created April 14, 2020 07:04 — forked from kmader/README.md
Beating Serialization in Spark

Serialization

Because all objects used in RDD operations in Spark must be Serializable, it can be difficult to work with libraries whose classes do not implement this interface.

Java Solutions

Simple Classes

For simple classes, it is easiest to make a wrapper interface that extends Serializable. This means that even though UnserializableObject cannot be serialized, we can pass in the following object without any issue:

public interface UnserializableWrapper extends Serializable {
    public UnserializableObject create(String parm1, String parm2);
}
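
How such a wrapper gets used is cut off in the preview; a minimal sketch in Spark-flavoured Scala (my own illustration — the rdd value, the arguments, and the UnserializableObject constructor are assumptions):

val wrapper: UnserializableWrapper = new UnserializableWrapper {
  override def create(parm1: String, parm2: String): UnserializableObject =
    new UnserializableObject(parm1, parm2) // assumed constructor
}

rdd.mapPartitions { records =>
  // Only the Serializable wrapper was shipped to the executor; the
  // unserializable object is built here, once per partition.
  val obj = wrapper.create("parm1", "parm2")
  records.map(r => (r, obj.toString))
}
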
@TrainTravel
TrainTravel / unapply.scala
Created May 11, 2020 09:07 — forked from linasm/unapply.scala
Output of live coding session "Scala pattern matching: apply the unapply"
import java.time.{LocalDate, LocalDateTime, LocalTime}

/*case */class FullName(val first: String, val last: String)

object FullName {
  def apply(first: String, last: String): FullName =
    new FullName(first, last)

  def unapply(full: FullName): Some[(String, String)] =
    Some((full.first, full.last))
}
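
For context (my own example, not part of the preview), the hand-written apply/unapply pair is what lets this plain class be constructed and pattern matched as if it were a case class:

val name = FullName("Ada", "Lovelace") // calls FullName.apply

name match {
  case FullName(first, last) => println(s"$first $last") // calls FullName.unapply
}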

Recursion and Trampolines in Scala

Recursion is beautiful. As an example, let's consider this perfectly acceptable definition of the functions even and odd in Scala, whose semantics you can guess:

def even(i: Int): Boolean = i match {
  case 0 => true
  case _ => odd(i - 1)
}

def odd(i: Int): Boolean = i match {
  case 0 => false
  case _ => even(i - 1)
}
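
As written, even and odd consume one stack frame per step, so a large argument overflows the stack; that is what the trampoline is for. A sketch using scala.util.control.TailCalls (my choice of trampoline here — the session may build its own):

import scala.util.control.TailCalls._

def even(i: Int): TailRec[Boolean] = i match {
  case 0 => done(true)
  case _ => tailcall(odd(i - 1))
}

def odd(i: Int): TailRec[Boolean] = i match {
  case 0 => done(false)
  case _ => tailcall(even(i - 1))
}

even(1000000).result // true, and no StackOverflowError
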
@TrainTravel
TrainTravel / gist:b7aa2671dd543850d06ba5dae7a82a07
Created June 2, 2020 07:28 — forked from rednaxelafx/gist:998018
Example of /proc/{pid}/maps and pmap -x of a groovysh process on Ubuntu 10.10 x64
rednaxelafx@fx-laptop:~$ jps
7013 Jps
6946 GroovyStarter
rednaxelafx@fx-laptop:~$ cd /proc/6946
rednaxelafx@fx-laptop:/proc/6946$ ls
attr cpuset latency mountstats root statm
auxv cwd limits net sched status
cgroup environ loginuid numa_maps schedstat syscall
clear_refs exe maps oom_adj sessionid task
cmdline fd mem oom_score smaps wchan
@TrainTravel
TrainTravel / AggregateByKey.scala
Created July 8, 2020 02:59 — forked from bbejeck/AggregateByKey.scala
Sample code for the Spark PairRDDFunctions - AggregateByKey
package bbejeck.grouping
import org.apache.log4j.{Level, Logger}
import org.apache.spark.{SparkConf, SparkContext}
import scala.collection.mutable
/**
 * Created by bbejeck on 7/31/15.
 */
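
The preview ends at the imports; as a reminder of the shape of the API being demonstrated, here is a minimal aggregateByKey sketch (my own example, assuming a SparkContext named sc — not necessarily the code in the gist):

val pairs = sc.parallelize(Seq(("a", 1), ("b", 2), ("a", 3)))

// Zero value (0, 0), then:
//   seqOp  folds one value into a partition-local (sum, count) accumulator,
//   combOp merges accumulators coming from different partitions.
val sumAndCount = pairs.aggregateByKey((0, 0))(
  (acc, v) => (acc._1 + v, acc._2 + 1),
  (a, b) => (a._1 + b._1, a._2 + b._2)
)

sumAndCount.collect().foreach(println) // e.g. (a,(4,2)) and (b,(2,1))
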
sealed trait Option[+A]
case object None extends Option[Nothing]
case class Some[A](a: A) extends Option[A]

object ProductType {
  type TwoBooleansOrNone = Option[(Boolean, Boolean)]

  def mustBeBothTrue(in: TwoBooleansOrNone): Boolean = in match {
    case None => false
    case Some((true, true)) => true
    case _ => false
  }
}

class UserManagerLifecycleTest
    extends ScalaTestWithActorTestKit
    with AnyWordSpecLike {

  implicit val dynamoClient: DynamoDbAsyncClient =
    new DynamoDbAsyncClient {
      override def serviceName(): String = "test"
      override def close(): Unit = ()
    }

  implicit val ec: ExecutionContext =