Evan Chan (velvia)

velvia / sparkjobapi.scala
Last active December 13, 2016 17:56
SCarman's spark job API on top of existing API
package spark.jobserver.api
import com.typesafe.config.Config
import org.scalactic._
import spark.jobserver.api._
trait ContextProvider[C] {
val ctx: C with ContextLike = null.asInstanceOf[C with ContextLike]
def context: C with ContextLike = ctx
}
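The trait above only compiles against spark-jobserver's real `ContextLike`. As a rough illustration of how a job might mix it in, here is a self-contained sketch in which `ContextLike`, `FakeSparkContext`, and `WordCountJob` are all hypothetical stand-ins, not the actual spark-jobserver API:

```scala
// Stand-in types, purely for illustration of the ContextProvider shape.
trait ContextLike { def stop(): Unit }
class FakeSparkContext extends ContextLike { def stop(): Unit = () }

trait ContextProvider[C] {
  def context: C with ContextLike
}

// A toy job wiring a concrete context into the provider trait.
class WordCountJob(sc: FakeSparkContext) extends ContextProvider[FakeSparkContext] {
  def context: FakeSparkContext with ContextLike = sc
  def run(words: Seq[String]): Map[String, Int] =
    words.groupBy(identity).map { case (w, ws) => (w, ws.length) }
}
```

Note the sketch makes `context` abstract rather than defaulting `ctx` to null; that trades a little wiring boilerplate for not having a nullable field in the trait.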
velvia / gist:5690cd1b44ee7f2057a81b3b6cc100fc
Created September 29, 2016 16:45
SBT 0.13.11 download issue
:: problems summary ::
:::: WARNINGS
[FAILED ] org.scala-sbt#main;0.13.11!main.jar: The HTTP response code for https://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/main/0.13.11/jars/main.jar did not indicate a success. See log for more detail. (323ms)
==== typesafe-ivy-releases: tried
https://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/main/0.13.11/jars/main.jar
[FAILED ] org.scala-sbt#compiler-interface;0.13.11!compiler-interface.jar: The HTTP response code for https://repo.typesafe.com/typesafe/ivy-releases/org.scala-sbt/compiler-interface/0.13.11/jars/compiler-interface.jar did not indicate a success. See log for more detail. (308ms)
velvia / spark-jobserver-ui-thoughts
Created July 29, 2016 07:45
Thoughts on requirements for SJS UI
* See all current contexts
* For each context see all the jobs that are running
* History of past jobs for a given context. This is not technically available via the API today, but a job history would still be worth showing
* Be able to see the configuration for each job, present or past (available via a tiny "C" link in the current UI)
* Job results
* Link to Spark UI for detailed analysis
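The requirements above boil down to a per-context, per-job data model the UI would need to render. The case classes below are illustrative guesses at that shape; none of the field names come from the real spark-jobserver API:

```scala
// Hypothetical shapes mirroring the requirement list, not the actual API.
case class JobInfo(
  jobId: String,
  status: String,            // running, finished, failed, ...
  config: String,            // what the tiny "C" link shows today
  result: Option[String],    // job results, once available
  sparkUiUrl: Option[String] // link to the Spark UI for detailed analysis
)

case class ContextInfo(
  name: String,
  runningJobs: Seq[JobInfo],
  pastJobs: Seq[JobInfo]     // would need new API support for history
)
```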
velvia / scala-call-func-type-param.md
Created September 30, 2015 17:54
Calling a function by type parameter in Scala

Let's say you have a Scala function that takes a type parameter:

  def myFunc[K]: T

Let's say I have several functions like that. Right now, if K could be one of several different types at runtime, I'd need some code like the following:

 kType match {
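One way to finish that match is to dispatch on a runtime `Class` token and re-enter the type-parameterized function with a concrete type argument. In the sketch below, `handle` and `dispatch` are hypothetical names standing in for `myFunc` and its call site:

```scala
import scala.reflect.ClassTag

// A type-parameterized function standing in for myFunc[K]; it just reports
// which runtime class its type parameter resolved to.
def handle[K: ClassTag]: String = implicitly[ClassTag[K]].runtimeClass.getSimpleName

// Dispatch: choose the concrete type parameter from a runtime Class token.
def dispatch(kType: Class[_]): String = kType match {
  case c if c == classOf[Int]    => handle[Int]
  case c if c == classOf[Long]   => handle[Long]
  case c if c == classOf[String] => handle[String]
  case _                         => handle[Any]
}
```

A sealed ADT of supported key types would be safer than raw `Class` tokens, but the token keeps the example minimal.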
velvia / gist:909408dc6f053d6934be
Created June 29, 2015 07:38
Patch to Phantom 1.8.x for ByteBuffer handling
// Primitive.scala - only change is asCql method
implicit object BlobIsPrimitive extends Primitive[ByteBuffer] {
override type PrimitiveType = java.nio.ByteBuffer
val cassandraType = CQLSyntax.Types.Blob
override def fromRow(column: String, row: Row): Try[ByteBuffer] = nullCheck(column, row) {
r => r.getBytes(column)
velvia / gist:213b837c6e02c4982a9a
Last active September 21, 2015 09:28
Notes for velvia/filo 50x performance improvement

...to be turned into a blog post later. These are notes with references to commits; the blog post will have snippets of code so folks don't have to look things up.

How I tuned Filo for 50x speedup in 24 hours

Filo is an extreme serialization library for vector data. Think of it as the good parts of Parquet without the HDFS and file format garbage -- just the serdes and fast columnar storage.

I recently added a JMH benchmark for reading a Filo binary buffer containing 10,000 Ints using the simplest apply() method to sum up all the Ints.

Oh, and before we get started - avoid throwing exceptions in inner loops, especially Try(....).getOrElse(...) patterns. Even if they occur only occasionally they can be extremely expensive.
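To make that cost concrete, here is a small self-contained sketch (not Filo code) contrasting a Try-per-element inner loop with an explicit check. Both compute the same sum over strings that are usually, but not always, numeric; the slow version pays for a `Try` allocation on every element and stack-trace construction on every failure:

```scala
// Slow: allocates a Try per element; each bad element throws and catches.
def sumSlow(xs: Array[String]): Int =
  xs.foldLeft(0)((acc, s) => acc + scala.util.Try(s.toInt).getOrElse(0))

// Fast: validate explicitly, never throw in the loop.
def sumFast(xs: Array[String]): Int = {
  var acc = 0
  var i = 0
  while (i < xs.length) {
    val s = xs(i)
    if (s != null && s.nonEmpty && s.forall(_.isDigit)) acc += s.toInt
    i += 1
  }
  acc
}
```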

velvia / gist:69ca1ab5e758d3b0ab13
Created March 2, 2015 18:18
JTS custom CoordinateSequence example
import com.vividsolutions.jts.geom._
import com.vividsolutions.jts.geom.impl.PackedCoordinateSequence
import com.vividsolutions.jts.geom.util.GeometryTransformer
/**
* A custom CoordSequence based on byte arrays for compactness and speed
* Just 2 dimensions for now.
*
* It's an example of creating a custom CoordinateSequence.
* NOTE: This is much more memory efficient, but slower because of deserialization cost.
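The JTS snippet is cut off by the gist preview, but the core idea (backing coordinates with one flat byte array instead of boxed Coordinate objects) can be sketched without JTS. `ByteBufferCoords` below is a hypothetical simplification, not the JTS `CoordinateSequence` interface:

```scala
import java.nio.ByteBuffer

// 2-D coordinates packed as doubles into a single ByteBuffer:
// each point occupies 16 bytes (x at offset 0, y at offset 8).
class ByteBufferCoords(buf: ByteBuffer) {
  def size: Int = buf.capacity / 16
  def getX(i: Int): Double = buf.getDouble(i * 16)
  def getY(i: Int): Double = buf.getDouble(i * 16 + 8)
}

object ByteBufferCoords {
  def fromPairs(pts: Seq[(Double, Double)]): ByteBufferCoords = {
    val buf = ByteBuffer.allocate(pts.length * 16)
    pts.foreach { case (x, y) => buf.putDouble(x).putDouble(y) }
    new ByteBufferCoords(buf)
  }
}
```

This is the compactness win the gist's NOTE refers to: no per-point object headers, at the cost of a double read on every access.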
velvia / gist:10447394
Created April 11, 2014 07:42
Patch to spark-jobserver to clean up GC
case StopContext(name) =>
if (contexts contains name) {
logger.info("Shutting down context {}", name)
context.watch(contexts(name)) // watch for termination event
contexts(name) ! PoisonPill
contexts.remove(name)
resultActors.remove(name)
sender ! ContextStopped
} else {
sender ! NoSuchContext
}
velvia / gist:8967611
Last active August 29, 2015 13:56
Chunking Array Serializer/Deserializer in Scala
// We want a generic way to chunk large arrays into segments of byte arrays,
// and to do so in a way that doesn't blow up memory for large objects, i.e. the ability to chunk / page / stream
trait ChunkingArraySerDe[T] {
def apply(data: Array[T], chunkSize: Int): Iterator[Array[Byte]]
def unapply(serialized: Iterator[Array[Byte]]): Array[T]
}
// For arrays with fixed-size elements
trait PrimitiveChunkingSerDe[@specialized(Int, Long, Boolean) T] extends ChunkingArraySerDe[T] {
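A hypothetical concrete implementation for `Int` shows how the trait's shape supports streaming without one giant allocation. It assumes `chunkSize` is in bytes and a multiple of 4 (the bytes per Int); neither the object name nor that convention comes from the original gist:

```scala
import java.nio.ByteBuffer

// Illustrative Int serde matching the ChunkingArraySerDe shape above.
object IntChunkingSerDe {
  // Serialize: emit one byte-array chunk per group of Ints.
  def apply(data: Array[Int], chunkSize: Int): Iterator[Array[Byte]] =
    data.grouped(chunkSize / 4).map { ints =>
      val buf = ByteBuffer.allocate(ints.length * 4)
      ints.foreach(i => buf.putInt(i))
      buf.array
    }

  // Deserialize: stream the chunks back, decoding Ints as we go.
  def unapply(serialized: Iterator[Array[Byte]]): Array[Int] =
    serialized.flatMap { bytes =>
      val buf = ByteBuffer.wrap(bytes)
      Iterator.fill(bytes.length / 4)(buf.getInt())
    }.toArray
}
```

Because both directions are `Iterator`-based, only one chunk needs to be resident at a time until the final `toArray`.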

Useful Scalac Flags

So, there have been some discussions and angry tweets recently about irritating Scala "features" (like value discarding and auto-tupling) that can actually be turned off by selecting the right compiler flags in conjunction with -Xfatal-warnings. I highly recommend a set of options like those below.

scalacOptions ++= Seq(
  "-deprecation",
  "-encoding", "UTF-8",       // yes, this is 2 args
  "-feature",
  "-language:existentials",