Cody Koeninger (koeninger)

sub listener {
    my ($event) = @_;
    console->log("event is " . $event);
    # doesn't work
    # $event->respondWith(handleRequest($event->request));
    # works
    my $req  = $event->request;
    my $resp = handleRequest($req);
    $event->respondWith($resp);
}
@koeninger
koeninger / gist:6155cd94a19d1a6373ba0b40039e97e3
Created February 22, 2018 04:02
scala assertions aren't disabled by jvm commandline options
package example;

public class JavaExample {
    public static void main(String[] args) {
        System.out.println("before assert");
        assert 1 == 0;
        System.out.println("after assert");
    }
}
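Run with plain `java example.JavaExample`, the Java version prints both lines, since JVM assertions are off unless `-ea` is passed. A Scala counterpart (a sketch added here, not part of the gist) always throws, because Predef.assert is an ordinary method call that the JVM flags never see; it can only be removed at compile time via scalac's -Xelide-below flag:
// sketch: Scala's assert is Predef.assert, a plain method call, so
// java -da / -ea have no effect on it
object ScalaExample {
  def main(args: Array[String]): Unit = {
    println("before assert")
    assert(1 == 0) // always evaluated at runtime; throws AssertionError
    println("after assert")
  }
}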
scala> def useInt(i: Int) = println(s"used $i")
useInt: (i: Int)Unit
scala> val i: java.lang.Integer = null
i: Integer = null
// implicit conversion from Integer to Int fails
scala> useInt(i)
java.lang.NullPointerException
at scala.Predef$.Integer2int(Predef.scala:392)
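A guard for this pitfall, sketched here (not from the gist): lift the nullable java.lang.Integer into Option before any implicit unboxing happens.
// hypothetical helper, not in the gist: Option(null) is None, so the
// Integer2int implicit only ever runs on a non-null value
def useBoxed(i: java.lang.Integer): Unit = Option(i) match {
  case Some(n) => useInt(n) // safe: n is non-null here
  case None    => println("no value")
}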
@koeninger
koeninger / ... results in this exception
Created January 12, 2015 22:10
spark checkpoint loading exception
15/01/12 16:07:07 INFO CheckpointReader: Attempting to load checkpoint from file file:/var/tmp/cp/checkpoint-1421100410000.bk
15/01/12 16:07:07 WARN CheckpointReader: Error reading checkpoint from file file:/var/tmp/cp/checkpoint-1421100410000.bk
java.io.IOException: java.lang.ClassNotFoundException: org.apache.spark.rdd.kafka.KafkaRDDPartition
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1043)
at org.apache.spark.streaming.dstream.DStreamCheckpointData.readObject(DStreamCheckpointData.scala:146)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
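For context, a minimal sketch (not from the gist) of the recovery call that exercises this code path; the ClassNotFoundException indicates the jar containing org.apache.spark.rdd.kafka.KafkaRDDPartition was not on the classpath when the checkpoint was deserialized. The batch interval and master here are illustrative assumptions:
// minimal checkpoint-recovery sketch using the Spark Streaming API
import org.apache.spark.streaming.{ Seconds, StreamingContext }

val checkpointDir = "file:/var/tmp/cp"

def createContext(): StreamingContext = {
  val ssc = new StreamingContext("local[2]", "checkpoint-example", Seconds(5))
  ssc.checkpoint(checkpointDir)
  ssc
}

// CheckpointReader runs inside getOrCreate: it tries each checkpoint file
// in turn and logs the WARN above when deserialization fails
val ssc = StreamingContext.getOrCreate(checkpointDir, createContext _)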
@koeninger
koeninger / spark-sql-parquet-graphx
Created September 3, 2014 04:47
Spark SQL, Parquet, GraphX examples
import org.apache.spark.sql.{ SQLContext, Row }
val sqlContext = new SQLContext(sc)
import sqlContext._
/*
Next you'll need to get your data registered as a table. Note that in order
to infer the schema (column names and types) it must be a case class. A
normal class won't work, unless you manually implement much of what a case
class would do, e.g. extend Product and Serializable.

Any data source can be used, as long as you can transform it to an RDD of
case classes. For now, we'll just use some dummy data:
*/
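The preview cuts off before the dummy data. A minimal sketch of that step on the Spark 1.x API this gist uses; the Person class, values, and table name are illustrative assumptions:
// illustrative dummy data, not from the gist preview
case class Person(name: String, age: Int)

// the import sqlContext._ above provides the implicit that turns an RDD of
// case classes into a SchemaRDD with inferred column names and types
val people = sc.parallelize(Seq(Person("alice", 30), Person("bob", 25)))
people.registerTempTable("people") // registerAsTable on Spark 1.0.x

sqlContext.sql("SELECT name FROM people WHERE age >= 18").collect().foreach(println)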
@koeninger
koeninger / gist:6497066
Created September 9, 2013 15:18
hector vs datastax batch performance
// hector version
val mutator: Mutator[Composite] = HFactory.createMutator(ksp, CompositeSerializer.get)
val column = HFactory.createColumn(deviceId, Array[Byte]())
column.setClock(clock)
profiles.foreach { p =>
  if (p(attributes))
    mutator.addInsertion(rowkey(brandCode, p.id.get), columnFamilyName, column)
  else
    mutator.addDeletion(rowkey(brandCode, p.id.get), columnFamilyName, deviceId, StringSerializer.get, clock)
}
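The preview stops after the hector version. For comparison, a hypothetical sketch of the datastax side using a CQL batch; the keyspace, table, column names, and contact point are assumptions, not from the gist:
// hypothetical datastax-driver version, mirroring the insert/delete
// decision above; schema names are made up for illustration
import com.datastax.driver.core.Cluster

val cluster = Cluster.builder().addContactPoint("127.0.0.1").build()
val session = cluster.connect("profiles_ks")

val statements = profiles.map { p =>
  if (p(attributes))
    s"INSERT INTO cf (rowkey, device_id, value) VALUES ('${rowkey(brandCode, p.id.get)}', '$deviceId', '')"
  else
    s"DELETE FROM cf WHERE rowkey = '${rowkey(brandCode, p.id.get)}' AND device_id = '$deviceId'"
}
// one round trip for the whole batch, analogous to the single mutator.execute
session.execute(statements.mkString("BEGIN BATCH\n", ";\n", ";\nAPPLY BATCH"))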
@koeninger
koeninger / gist:5216636
Created March 21, 2013 20:54
async dispatch example
import dispatch._
import scala.util.Random

object AsyncExample {
  val total = 2000000
  val groupSize = 100

  def svc = url("http://localhost:8080/")
  def req(q: String) = Http(svc.addQueryParameter("q", q) OK as.String)
  def res = Seq.fill(total)(Random.nextInt).
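The preview is truncated mid-definition of res. A hedged guess at the shape of the continuation, assuming Dispatch 0.10+, where Http(...) returns a standard scala.concurrent.Future:
// hypothetical continuation, not the gist's actual code: fire the requests
// in batches of groupSize, blocking on each batch before starting the next
import scala.concurrent.ExecutionContext.Implicits.global
import scala.concurrent.duration._
import scala.concurrent.{ Await, Future }

def resSketch: Unit =
  Seq.fill(total)(Random.nextInt).
    grouped(groupSize).
    foreach { group =>
      val batch = Future.sequence(group.map(i => req(i.toString)))
      Await.result(batch, 1.minute)
    }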