@masato
Last active December 26, 2015 18:54
Distributed data analysis with Apache Zeppelin - Part 5: Ambari on Docker and a Scala build environment with sbt ref: http://qiita.com/masato/items/49c0d7911f3ba0baa340
name := "Sample app"
version := "1.0"
scalaVersion := "2.10.5"
libraryDependencies += "org.apache.spark" %% "spark-core" % "1.4.1"
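The `%%` operator in the dependency line appends the Scala binary version to the artifact name, which is why the packaged jar later in the article is named `sample-app_2.10-1.0.jar`. A quick sketch of how the binary version is derived from the `scalaVersion` setting above:

```shell
# The Scala binary version is the major.minor prefix of scalaVersion;
# "%%" uses it to pick the matching Spark artifact.
SCALA_VERSION=2.10.5
SCALA_BINARY=${SCALA_VERSION%.*}      # drops the patch level -> 2.10
echo "spark-core_${SCALA_BINARY}"     # the artifact sbt actually resolves
```

The same binary version appears in the output path `target/scala-2.10/` after `sbt package`.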
sbt:
  build: .
  volumes:
    - .:/app
    - /temp/.sbt:/root/.sbt
    - /temp/.ivy2:/root/.ivy2
  dns:
    - $AMB_CONSUL
    - 172.17.0.1
    - 8.8.8.8
  dns_search:
    - service.consul
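`$AMB_CONSUL` in the `dns:` list is substituted by docker-compose from the shell environment, so it must hold the Consul container's IP before any `docker-compose run` (the export is shown later in the article). A small hedged guard like this fails fast when the variable is missing:

```shell
# Example value only; normally taken from `docker inspect` of the amb-consul container.
export AMB_CONSUL=172.17.0.2
# Abort with a message if AMB_CONSUL is unset or empty.
: "${AMB_CONSUL:?set AMB_CONSUL to the amb-consul container IP first}"
echo "container DNS servers: $AMB_CONSUL 172.17.0.1 8.8.8.8"
```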
FROM java:openjdk-7-jdk
MAINTAINER Masato Shimizu <ma6ato@gmail.com>
ENV SBT_VERSION 0.13.9
ENV PATH ${PATH}:/usr/local/sbt/bin
WORKDIR /app
RUN wget -O- http://dl.bintray.com/sbt/native-packages/sbt/$SBT_VERSION/sbt-$SBT_VERSION.tgz | tar xz -C /usr/local
VOLUME ["/root/.ivy2", "/root/.sbt", "/app"]
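The `VOLUME` directive pairs with the `/temp/.sbt` and `/temp/.ivy2` bind mounts in docker-compose.yml, so dependencies downloaded by sbt survive `docker-compose run --rm`. A minimal sketch, assuming the host cache directories are created up front (creating them yourself avoids the Docker daemon creating them root-owned):

```shell
# Pre-create the host-side cache directories that back /root/.sbt and /root/.ivy2.
# SBT_CACHE defaults to /temp to match the compose file above.
SBT_CACHE=${SBT_CACHE:-/temp}
mkdir -p "$SBT_CACHE/.sbt" "$SBT_CACHE/.ivy2"
ls -d "$SBT_CACHE/.sbt" "$SBT_CACHE/.ivy2"
```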
$ mkdir -p ~/scala_apps/spike
$ cd !$
$ docker exec amb1 hadoop fs -ls /user
Found 5 items
drwxrwx--- - ambari-qa hdfs 0 2015-12-25 17:08 /user/ambari-qa
drwxr-xr-x - hcat hdfs 0 2015-12-25 17:10 /user/hcat
drwx------ - hive hdfs 0 2015-12-25 17:10 /user/hive
drwxr-xr-x - root hdfs 0 2015-12-26 02:40 /user/root
drwxrwxr-x - spark hdfs 0 2015-12-25 17:09 /user/spark
$ su hdfs
$ hadoop fs -mkdir /user/root
$ hadoop fs -chown root /user/root
/usr/hdp/2.3.4.0-3485/spark/LICENSE
$ docker exec amb1 hadoop fs -put /usr/hdp/2.3.4.0-3485/spark/LICENSE /user/root
$ docker exec amb1 hadoop fs -ls /user/root/LICENSE
-rw-r--r-- 3 root hdfs 17356 2015-12-25 17:20 LICENSE
$ docker-compose run --rm sbt \
curl -L -X PUT -T /app/target/scala-2.10/sample-app_2.10-1.0.jar "http://amb1:50070/webhdfs/v1/user/root/test/sample-app_2.10-1.0.jar?user.name=root&op=CREATE&overwrite=true"
$ docker exec amb1 hadoop fs -ls ./test/sample-app_2.10-1.0.jar
-rwxr-xr-x 3 root hdfs 1862 2015-12-26 04:16 test/sample-app_2.10-1.0.jar
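The WebHDFS upload above works in one command because `CREATE` first answers with a 307 redirect to a datanode and `curl -L` follows it, re-sending the file there. A sketch of the URL's anatomy, using the names from this article's setup:

```shell
# Build the WebHDFS CREATE URL piece by piece.
NN=amb1:50070                                   # NameNode HTTP address
DEST=/user/root/test/sample-app_2.10-1.0.jar    # target path in HDFS
# user.name authenticates as root (simple auth); overwrite=true replaces any old jar.
URL="http://${NN}/webhdfs/v1${DEST}?user.name=root&op=CREATE&overwrite=true"
echo "$URL"
```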
$ docker exec amb1 \
spark-submit \
--class SampleApp \
--master yarn-cluster \
--driver-memory 1g \
--executor-memory 1g \
--executor-cores 1 \
hdfs://amb1/user/root/test/sample-app_2.10-1.0.jar
...
15/12/26 02:54:18 INFO Client:
client token: N/A
diagnostics: N/A
ApplicationMaster host: 172.17.0.5
ApplicationMaster RPC port: 0
queue: default
start time: 1451098443955
final status: SUCCEEDED
tracking URL: http://amb1.service.consul:8088/proxy/application_1451063385654_0011/
user: root
15/12/26 02:54:18 INFO ShutdownHookManager: Shutdown hook called
15/12/26 02:54:18 INFO ShutdownHookManager: Deleting directory /tmp/spark-c30cdca1-6f2b-4432-a866-c63391e619b2
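In `yarn-cluster` mode the driver runs inside the ApplicationMaster, so the application's `println` output lands in the YARN container logs, not in the submitting terminal; that is why `yarn logs -applicationId` is used next. A small sketch for pulling the application id out of a saved spark-submit log (the sample line mimics the tracking URL above):

```shell
# Extract the YARN application id from a spark-submit log line.
LOG_LINE="tracking URL: http://amb1.service.consul:8088/proxy/application_1451063385654_0011/"
APP_ID=$(echo "$LOG_LINE" | grep -o 'application_[0-9_]*')
echo "$APP_ID"   # feed this to: yarn logs -applicationId "$APP_ID"
```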
$ docker exec amb1 \
yarn logs -applicationId application_1451063385654_0011
...
LogType:stdout
Log Upload Time:Sat Dec 26 02:54:19 +0000 2015
LogLength:10
Log Contents:
Lines 294
End of LogType:stdout
$ docker-compose build
$ export AMB_CONSUL=$(docker inspect --format="{{ .NetworkSettings.IPAddress }}" amb-consul)
$ docker-compose run --rm sbt sbt console
[info] Set current project to Sample app (in build file:/app/)
[info] Starting scala interpreter...
[info]
Welcome to Scala version 2.10.5 (OpenJDK 64-Bit Server VM, Java 1.7.0_91).
Type in expressions to have them evaluated.
Type :help for more information.
scala> :quit
$ cd ~/scala_apps/spike
$ mkdir -p src/main/scala/
$ tree
.
├── Dockerfile
├── build.sbt
├── docker-compose.yml
└── src
└── main
└── scala
└── SampleApp.scala
$ docker-compose run --rm sbt sbt package
[info] Set current project to Sample app (in build file:/app/)
[info] Packaging /app/target/scala-2.10/sample-app_2.10-1.0.jar ...
[info] Done packaging.
[success] Total time: 1 s, completed Dec 26, 2015 4:13:52 AM
import org.apache.spark.{SparkContext, SparkConf}

object SampleApp {
  def main(args: Array[String]) {
    val conf = new SparkConf().setAppName("App")
    val sc = new SparkContext(conf)
    val src = sc.textFile("LICENSE")
    val lines = src.count()
    println("Lines %,d".format(lines))
  }
}
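Note that `sc.textFile("LICENSE")` takes a relative path: Hadoop resolves it against the submitting user's HDFS home directory, which was created earlier as `/user/root`. A sketch of the fully qualified path the job actually reads in this setup:

```shell
# Relative paths resolve against the user's HDFS home directory.
HDFS_HOME=/user/root              # created earlier with: hadoop fs -mkdir /user/root
echo "hdfs://amb1${HDFS_HOME}/LICENSE"
```

This matches the `hadoop fs -put ... /user/root` step above, and the job's stdout (`Lines 294`) is the line count of that LICENSE file.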