@tbertelsen
Created February 4, 2015 10:38
Script for testing the precedence of Spark worker-directory configuration: the spark-env.sh config file vs. the SPARK_WORKER_DIR environment variable vs. the -d command-line flag. The check appended at the end of the script shows which directory the worker actually used.
#!/bin/bash
# Set the worker dir via the config file (spark-env.sh).
echo "SPARK_WORKER_DIR=/tmp/sparktest/file/" > $SPARK_HOME/conf/spark-env.sh
# Alternatively, clear spark-env.sh so only the other two settings remain:
#echo "" > $SPARK_HOME/conf/spark-env.sh
LOCAL_HOSTNAME=$(hostname)
# Start a standalone master on this host.
$SPARK_HOME/sbin/start-master.sh
echo "sleeping for 10 s"
sleep 10
# Start worker 1, setting the worker dir both via the SPARK_WORKER_DIR
# environment variable and via the -d command-line flag. Try removing the -d
# parameter and/or the leading environment variable to compare results.
SPARK_WORKER_DIR=/tmp/sparktest/envvar/ $SPARK_HOME/sbin/start-slave.sh 1 spark://$LOCAL_HOSTNAME:7077 -d /tmp/sparktest/cmdline/
echo "sleeping for 10 s"
sleep 10
echo "starting SparkPi job"
# Run the SparkPi example so the worker creates application directories in
# whichever work dir actually took effect.
$SPARK_HOME/bin/spark-submit --class org.apache.spark.examples.SparkPi --master spark://$LOCAL_HOSTNAME:7077 $SPARK_HOME/lib/spark-examples-1.2.0-hadoop2.4.0.jar 100
echo "sleeping for 10 s"
sleep 10
# Stop worker instance 1, then the master.
$SPARK_HOME/sbin/spark-daemon.sh stop org.apache.spark.deploy.worker.Worker 1
$SPARK_HOME/sbin/stop-master.sh
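# A minimal verification sketch (not part of the original gist): list the three
# candidate directories after the run. The assumption is that the worker creates
# app-* subdirectories only under the work dir that actually took effect, so the
# non-empty directory shows which setting won.
ls -lR /tmp/sparktest/ 2>/dev/null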