Cody Koeninger (koeninger)
Austin, TX
@koeninger
koeninger / gist:6155cd94a19d1a6373ba0b40039e97e3
Created Feb 22, 2018
Scala assertions aren't disabled by JVM command-line options
package example;

public class JavaExample {
    public static void main(String[] args) {
        // Java asserts are disabled by default; this only fires under -ea
        System.out.println("before assert");
        assert 1 == 0;
        System.out.println("after assert");
    }
}
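For comparison, a minimal Scala counterpart (a sketch illustrating the gist's point, not part of the original files): Scala's assert is Predef.assert, an ordinary method call, so the JVM's -da/-disableassertions flag has no effect on it; it can only be elided at compile time, e.g. via scalac -Xdisable-assertions.

object ScalaExample {
  def main(args: Array[String]): Unit = {
    println("before assert")
    assert(1 == 0) // Predef.assert: throws AssertionError even under -da
    println("after assert")
  }
}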
@koeninger
koeninger / gist:a388f64b8e5c20e05358f593146dfa7c
scala> def useInt(i: Int) = println(s"used $i")
useInt: (i: Int)Unit

scala> val i: java.lang.Integer = null
i: Integer = null

// implicit conversion from Integer to Int fails
scala> useInt(i)
java.lang.NullPointerException
  at scala.Predef$.Integer2int(Predef.scala:392)
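One way to avoid the NPE is to check for null (or wrap in Option) before the implicit Integer2int unboxing runs; a minimal sketch continuing the session above (the output shown is what these lines would be expected to print):

scala> if (i != null) useInt(i) else println("skipped null Integer")
skipped null Integer

scala> Option(i).map(x => useInt(x))
res0: Option[Unit] = None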
@koeninger
koeninger / ... results in this exception
Created Jan 12, 2015
Spark checkpoint loading exception
15/01/12 16:07:07 INFO CheckpointReader: Attempting to load checkpoint from file file:/var/tmp/cp/checkpoint-1421100410000.bk
15/01/12 16:07:07 WARN CheckpointReader: Error reading checkpoint from file file:/var/tmp/cp/checkpoint-1421100410000.bk
java.io.IOException: java.lang.ClassNotFoundException: org.apache.spark.rdd.kafka.KafkaRDDPartition
at org.apache.spark.util.Utils$.tryOrIOException(Utils.scala:1043)
at org.apache.spark.streaming.dstream.DStreamCheckpointData.readObject(DStreamCheckpointData.scala:146)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1017)
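For context, checkpoint recovery is normally driven by StreamingContext.getOrCreate; the ClassNotFoundException above typically means the jar containing the checkpointed class (here org.apache.spark.rdd.kafka.KafkaRDDPartition) wasn't on the recovering driver's classpath. A minimal sketch of the recovery call (app name and batch interval are illustrative):

import org.apache.spark.SparkConf
import org.apache.spark.streaming.{Seconds, StreamingContext}

def createContext(): StreamingContext = {
  val conf = new SparkConf().setAppName("checkpoint-example")
  val ssc = new StreamingContext(conf, Seconds(10))
  ssc.checkpoint("file:/var/tmp/cp") // same directory as the log above
  // ... define DStreams here ...
  ssc
}

// Deserializes the checkpoint if one exists (which is where the class
// above must be loadable), otherwise falls back to createContext().
val ssc = StreamingContext.getOrCreate("file:/var/tmp/cp", createContext _)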
@koeninger
koeninger / spark-sql-parquet-graphx
Created Sep 3, 2014
Spark SQL, Parquet, GraphX examples
import org.apache.spark.sql.{ SQLContext, Row }

val sqlContext = new SQLContext(sc)
import sqlContext._

/*
Next you'll need to get your data registered as a table. Note that in order
to infer the schema (column names and types), it must be a case class. A
normal class won't work unless you manually implement much of what a case
class would do, e.g. extend Product and Serializable.

Any data source can be used, as long as you can transform it to an RDD of
case classes. For now, we'll just use some dummy data:
*/
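A minimal sketch of that dummy-data step, using the Spark 1.x API contemporary with this gist (the Person case class and table name are illustrative):

// Schema (column names and types) is inferred from the case class via
// the implicit conversion brought in by `import sqlContext._`.
case class Person(name: String, age: Int)

val people = sc.parallelize(Seq(Person("alice", 30), Person("bob", 25)))
people.registerTempTable("people") // registerAsTable on Spark 1.0.x

val adults = sqlContext.sql("SELECT name FROM people WHERE age >= 21")
adults.collect().foreach(println)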
@koeninger
koeninger / gist:6497066
Created Sep 9, 2013
Hector vs. DataStax batch performance
// hector version
val mutator: Mutator[Composite] = HFactory.createMutator(ksp, CompositeSerializer.get)
val column = HFactory.createColumn(deviceId, Array[Byte]())
column.setClock(clock)

profiles.foreach { p =>
  if (p(attributes))
    mutator.addInsertion(rowkey(brandCode, p.id.get), columnFamilyName, column)
  else
    mutator.addDeletion(rowkey(brandCode, p.id.get), columnFamilyName, deviceId, StringSerializer.get, clock)
}
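The preview only shows the Hector half of the comparison; below is a hedged sketch of what the DataStax-driver side could look like, using BatchStatement from driver 2.0+ (possibly newer than what the gist used). Keyspace, table, and column names are placeholders, and profiles, rowkey, brandCode, deviceId, and attributes refer to the values in the Hector snippet above.

// datastax version (sketch, not the gist's actual code)
import com.datastax.driver.core.{BatchStatement, Cluster}

val cluster = Cluster.builder().addContactPoint("127.0.0.1").build()
val session = cluster.connect("ks")

// prepare once, bind per profile, send as a single batch
val insert = session.prepare("INSERT INTO profiles (key, device) VALUES (?, ?)")
val delete = session.prepare("DELETE FROM profiles WHERE key = ? AND device = ?")

val batch = new BatchStatement()
profiles.foreach { p =>
  if (p(attributes))
    batch.add(insert.bind(rowkey(brandCode, p.id.get), deviceId))
  else
    batch.add(delete.bind(rowkey(brandCode, p.id.get), deviceId))
}
session.execute(batch)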
@koeninger
koeninger / gist:5216636
Created Mar 21, 2013
async dispatch example
import dispatch._
import scala.util.Random

object AsyncExample {
  val total = 2000000
  val groupSize = 100
  def svc = url("http://localhost:8080/")
  def req(q: String) = Http(svc.addQueryParameter("q", q) OK as.String)
  def res = Seq.fill(total)(Random.nextInt)
}
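The gist preview is cut off at res; below is a hedged sketch of how such a driver loop might continue (the grouping/awaiting strategy is an assumption, not the gist's code), assuming a dispatch version where req returns a scala.concurrent.Future of the response body:

import scala.concurrent.{Await, Future}
import scala.concurrent.duration.Duration
import scala.concurrent.ExecutionContext.Implicits.global

// Bound concurrency: fire groupSize requests at a time and wait for
// each group to finish before starting the next.
def run(): Unit =
  AsyncExample.res.grouped(AsyncExample.groupSize).foreach { group =>
    val inFlight = Future.sequence(group.map(i => AsyncExample.req(i.toString)))
    Await.ready(inFlight, Duration.Inf)
  }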