Playing with Spark on 数人云

### RUN SPARK ON 数人云

### Single-machine version

  • Mesos runs on the host
  • The Spark driver and executors run in Docker containers

##### 1. Set up the Mesos cluster environment

For details, log in to 数人云: after signing in to the console, create your own cluster under Cluster Management.

##### 2. Run the Spark demo

Log in to the master node host and start the Spark driver container:

docker run -it --net host mesosphere/spark:1.5.0-hadoop2.6.0 bash 

Edit the Spark configuration files:

cd $SPARK_HOME
vi conf/spark-defaults.conf

########################
spark.mesos.coarse true
spark.mesos.executor.home /opt/spark/dist
spark.mesos.executor.docker.image mesosphere/spark:1.5.0-hadoop2.6.0
########################
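The same Mesos options can also be set programmatically when building the context. A minimal sketch, assuming the same image and executor home as above (the application name and the MESOS_MASTER_IP placeholder are made up for illustration):

import org.apache.spark.{SparkConf, SparkContext}

// hypothetical programmatic equivalent of the spark-defaults.conf entries above
val conf = new SparkConf()
  .setAppName("spark-on-mesos-demo")                    // made-up application name
  .setMaster("mesos://MESOS_MASTER_IP:5050")            // same master URL as in spark-env.sh below
  .set("spark.mesos.coarse", "true")
  .set("spark.mesos.executor.home", "/opt/spark/dist")
  .set("spark.mesos.executor.docker.image", "mesosphere/spark:1.5.0-hadoop2.6.0")
val sc = new SparkContext(conf)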

vi conf/spark-env.sh

#####################
export JAVA_HOME=$(readlink -f /usr/bin/java | sed "s:jre/bin/java::")
export MASTER=mesos://$MESOS_MASTER_IP:5050    # replace MESOS_MASTER_IP with the master's IP address; for multiple masters use a ZooKeeper connection string
export SPARK_HOME=/opt/spark/dist
# bind Spark to the eth0 address (ifconfig output format from older net-tools)
export SPARK_LOCAL_IP=`ifconfig eth0 | awk '/inet addr/{print substr($2,6)}'`
export SPARK_LOCAL_HOSTNAME=`ifconfig eth0 | awk '/inet addr/{print substr($2,6)}'`
#####################

Start the Spark shell:

bin/spark-shell
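Once the shell is up, a quick sanity check that the Mesos settings were picked up (assuming the defaults file above was loaded; sc is created by the shell):

sc.master                                            // should print mesos://<MESOS_MASTER_IP>:5050
sc.getConf.get("spark.mesos.executor.docker.image")  // should print mesosphere/spark:1.5.0-hadoop2.6.0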

Run the demo:

sc.parallelize(1 to 1000).count()
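As an illustrative follow-up (not part of the original demo), a small word count forces a shuffle, so the work is visibly spread across the Mesos executors:

// tiny in-memory word count; the input strings are made up for illustration
val lines = sc.parallelize(Seq("spark on mesos", "spark on docker", "mesos on docker"))
val counts = lines.flatMap(_.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
counts.collect().foreach(println)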