@damieng
Last active August 29, 2015 14:16
build.sbt for a Spark Streaming 1.2.1 job using Scala 2.10 that can assemble (remote) or run (local)
name := "MyApp"
version := "1.0"
scalaVersion := "2.10.4"
libraryDependencies ++= Seq(
  // Log4j 1.2.17 avoids the unresolvable JMX dependencies of 1.2.15
  "log4j" % "log4j" % "1.2.17",
  // Spark components - core and streaming are provided on the cluster,
  // but the Kafka connector is not, so it must go into the assembly
  "org.apache.spark" %% "spark-core" % "1.2.1" % "provided",
  "org.apache.spark" %% "spark-streaming" % "1.2.1" % "provided",
  "org.apache.spark" %% "spark-streaming-kafka" % "1.2.1",
  // Pin Netty so 'sbt run' works locally; provided keeps it out of the assembly
  "io.netty" % "netty-all" % "4.0.23.Final" % "provided"
)
// Make 'sbt run' bring in provided dependencies
run in Compile <<= Defaults.runTask(fullClasspath in Compile, mainClass in (Compile, run), runner in (Compile, run))
// Final assembled uber-jar name
assemblyJarName in assembly := "my-assembly.jar"
// Do not assemble any of the Scala runtime, also provided
assemblyOption in assembly := (assemblyOption in assembly).value.copy(includeScala = false)
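The assembly keys above (assemblyJarName, assemblyOption) come from the sbt-assembly plugin, which must be on the build classpath. A minimal project/plugins.sbt sketch; the 0.13.0 version is an assumption from the Spark 1.2.1 / sbt 0.13 era, so use whichever sbt-assembly release matches your sbt:

```scala
// project/plugins.sbt
// sbt-assembly provides the 'assembly' task and its settings keys.
// Version 0.13.0 is an assumption - pick the release matching your sbt.
addSbtPlugin("com.eed3si9n" % "sbt-assembly" % "0.13.0")
```

With the plugin in place, `sbt assembly` produces my-assembly.jar for the cluster, while `sbt run` (thanks to the runTask override) executes locally with the provided dependencies on the classpath.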