Created September 5, 2015 04:24
Gist: mrcsparker/6f908eb08da102b69cc7
// Running Spark from the command line
//
// Built with Scala 2.11:
//   mvn -Pyarn -Phadoop-2.4 -Dscala-2.11 -DskipTests clean package
// in /Users/mrcsparker/Downloads/spark-1.4.1
//
// Run with:
//   scala -classpath ~/Downloads/spark-1.4.1/assembly/target/scala-2.11/spark-assembly-1.4.1-hadoop2.6.0.jar pi.scala 1000

import org.apache.spark._

val conf = new SparkConf().setAppName("Spark Pi").setMaster("local[2]")
val spark = new SparkContext(conf)

// Number of random samples; defaults to 1000 when no argument is given.
val par_to = if (args.length == 0) 1000 else args(0).toInt

// Monte Carlo estimate of Pi: sample points in the unit square and
// count those that land inside the quarter circle x^2 + y^2 < 1.
val count = spark.parallelize(1 to par_to).map { i =>
  val x = Math.random()
  val y = Math.random()
  if (x * x + y * y < 1) 1 else 0
}.reduce(_ + _)

// The fraction of points inside the quarter circle approximates Pi / 4,
// so multiply by 4.0 (avoiding integer division) to estimate Pi.
println("Pi is roughly " + 4.0 * count / par_to)

spark.stop()
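For reference, the Monte Carlo estimator the script parallelizes can be sketched in plain Scala without Spark: sample n points in the unit square, count those inside the quarter circle, and scale by 4. This is a minimal sketch for illustration, not part of the gist; the `PiEstimate` object name and fixed seed are assumptions added here for reproducibility.

```scala
// Plain-Scala Monte Carlo estimate of Pi, mirroring the Spark job above.
object PiEstimate {
  // Sample n points in the unit square; the fraction falling inside the
  // quarter circle x^2 + y^2 < 1 approximates Pi / 4.
  def estimate(n: Int, seed: Long = 42L): Double = {
    val rng = new scala.util.Random(seed)
    val inside = (1 to n).count { _ =>
      val x = rng.nextDouble()
      val y = rng.nextDouble()
      x * x + y * y < 1
    }
    4.0 * inside / n
  }

  def main(args: Array[String]): Unit = {
    // With a million samples the estimate is typically within ~0.01 of Pi.
    println(PiEstimate.estimate(1000000))
  }
}
```

The Spark version distributes exactly this loop across partitions with `parallelize(...).map(...).reduce(_ + _)`; the sequential sketch is only useful for checking the math.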