Gist rhasson/43caaeb46ee18113005bfda24d758a19, created June 30, 2016 02:52
#Config file is a tab-delimited file with these expected fields:
# field1 - EMR AMI version (may be a, a.b or a.b.c format, or 'default'; 'default' will be used if an exact match cannot be found)
# field2 - Spark build version (the REQUESTED_VERSION is matched against this field; if not specified it will look for the term 'default'; 'default' will be used if an exact match cannot be found)
# field3 - (optional) shell program to use to execute the install script, defaults to python
# field4 - s3 location of a specific install script for this AMI and Spark version (is run with environment variables S3SparkInstallPath, SparkBuild and Ec2Region)
# field5 - s3 path to files that will be used for Spark binaries; how this field is used is up to the install script; sets environment variable 'SparkS3InstallPath'
# field6 - (optional) s3 path to the script executed when the "-x" argument is used, defaults to the AWS-provided version if empty
# field7 - (optional) s3 path to the script executed when the "-g" argument is used to install ganglia, defaults to the AWS-provided version if empty
# field8 - (optional) s3 location for Scala binaries to install; how this field is used is up to the install script; sets environment variable 'ScalaS3Location'
# field9 - (optional) s3 base path for supporting reference files, for example, where to find configure-spark.bash; this file would be expected to be found in this path; sets environment variable 'SparkS3SupportingFilesPath'
#
default 2.0 python s3://support.elasticmapreduce/spark/install-spark-script.py s3://your-bucket-name/spark-2.0.0-preview-bin-hadoop2.7.tgz s3://support.elasticmapreduce/spark/maximize-spark-default-config s3://support.elasticmapreduce/spark/install-ganglia-metrics s3://support.elasticmapreduce/spark/scala-2.11-builds/spark-1.2.1.a.tgz s3://support.elasticmapreduce/spark/
#
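As a minimal sketch, this is how a wrapper script might parse the tab-delimited config above and pick a row, trying an exact match on the AMI version and requested Spark build first, then falling back to 'default' as the comments describe. The field names, the function names, and the exact fallback order are assumptions inferred from the comments; this is not the actual AWS-provided install script.

```python
# Hypothetical parser for the tab-delimited config format described above.
# Field names mirror the field1..field9 comments; all names are illustrative.
FIELDS = [
    "ami_version",          # field1
    "spark_build",          # field2
    "shell",                # field3 (optional, defaults to python)
    "install_script",       # field4
    "spark_s3_path",        # field5 -> SparkS3InstallPath
    "maximize_script",      # field6 (used with "-x")
    "ganglia_script",       # field7 (used with "-g")
    "scala_s3_location",    # field8 -> ScalaS3Location
    "support_files_path",   # field9 -> SparkS3SupportingFilesPath
]

def parse_config(text):
    """Return a list of dicts, one per non-comment, non-blank config line."""
    rows = []
    for line in text.splitlines():
        line = line.rstrip()
        if not line or line.lstrip().startswith("#"):
            continue  # skip comments and blank lines
        values = line.split("\t")
        values += [""] * (len(FIELDS) - len(values))  # pad optional fields
        rows.append(dict(zip(FIELDS, values)))
    return rows

def select_row(rows, ami_version, requested_spark):
    """Exact AMI/Spark match first, then fall back to 'default' in each field."""
    for ami in (ami_version, "default"):
        for spark in (requested_spark, "default"):
            for row in rows:
                if row["ami_version"] == ami and row["spark_build"] == spark:
                    return row
    return None
```

A caller would then export the selected row's field5, field8, and field9 values as the SparkS3InstallPath, ScalaS3Location, and SparkS3SupportingFilesPath environment variables before invoking the field4 install script with the field3 shell.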