# Spark On Mesos

### RUN SPARK ON MESOS

### Single-machine setup

#### Mode 1 (tested successfully)

- Mesos runs on the host
- The Spark driver and executors run in Docker containers

##### 1. Set up a single-node Mesos environment (details)

##### 2. Run the Spark demo

Start the Spark driver container:

```bash
docker run -it --net host registry.dataman.io/centos7/spark:1.5.1 bash
```

Edit the Spark configuration files:

```bash
cd $SPARK_HOME
vi conf/spark-defaults.conf
```

Add the following to `conf/spark-defaults.conf`:

```
spark.mesos.executor.home /spark-1.5.1-bin-hadoop2.6
spark.mesos.executor.docker.image registry.dataman.io/centos7/spark:1.5.1
```
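These two properties can also be set programmatically when building the context. A minimal sketch, assuming the Spark 1.5 Mesos property names shown above; `MESOS_MASTER_IP` is a placeholder for your master's address:

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Programmatic equivalent of the spark-defaults.conf entries above.
// Replace MESOS_MASTER_IP with the Mesos master's IP address.
val conf = new SparkConf()
  .setAppName("spark-on-mesos-demo")
  .setMaster("mesos://MESOS_MASTER_IP:5050")
  .set("spark.mesos.executor.home", "/spark-1.5.1-bin-hadoop2.6")
  .set("spark.mesos.executor.docker.image",
    "registry.dataman.io/centos7/spark:1.5.1")
val sc = new SparkContext(conf)
```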

```bash
vi conf/spark-env.sh
```

Add the following to `conf/spark-env.sh` (replace `MESOS_MASTER_IP` with the Mesos master's IP address; note the closing backtick on the last two lines):

```bash
export JAVA_HOME=$(readlink -f /usr/bin/java | sed "s:jre/bin/java::")
export MASTER=mesos://$MESOS_MASTER_IP:5050
export SPARK_HOME=/spark-1.5.1-bin-hadoop2.6
export SPARK_LOCAL_IP=`ifconfig eth0 | awk '/inet addr/{print substr($2,6)}'`
export SPARK_LOCAL_HOSTNAME=`ifconfig eth0 | awk '/inet addr/{print substr($2,6)}'`
```
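To confirm these settings were picked up, the exported variables can be printed from inside `spark-shell`; a quick sketch (the names match the file above):

```scala
// Print the variables exported by conf/spark-env.sh as seen by the JVM.
for (k <- Seq("MASTER", "SPARK_HOME", "SPARK_LOCAL_IP", "SPARK_LOCAL_HOSTNAME"))
  println(s"$k=${sys.env.getOrElse(k, "<unset>")}")
```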

Start the Spark shell:

```bash
bin/spark-shell
```

Run the demo:

```scala
sc.parallelize(1 to 1000).count()
```
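If the count returns, executors are launching correctly. A slightly larger sanity check that exercises the executors a bit more (a sketch; any small distributed computation works):

```scala
// Monte Carlo estimate of Pi, distributed across the Mesos executors.
val n = 100000
val inside = sc.parallelize(1 to n).map { _ =>
  val x = math.random
  val y = math.random
  if (x * x + y * y <= 1) 1 else 0
}.reduce(_ + _)
println(s"Pi is roughly ${4.0 * inside / n}")
```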

#### Mode 2 (test failed)

- Mesos is deployed via Dataman Cloud (数人云)
- The Spark driver and executors run in containers

##### 1. Deploy Mesos via Dataman Cloud

##### 2. Run Spark (same as Mode 1, step 2)

**@xiaods** commented on Oct 14, 2015:

These two lines:

```bash
export SPARK_LOCAL_IP=`ifconfig eth0 | awk '/inet addr/{print substr($2,6)}'
export SPARK_LOCAL_HOSTNAME=`ifconfig eth0 | awk '/inet addr/{print substr($2,6)}'
```

are each missing the final backtick (`).

**@xiaods** commented on Oct 14, 2015:

This image has too many problems. Test with https://hub.docker.com/r/mesosphere/spark/ directly instead.

**@xiaods** commented on Oct 14, 2015:

```bash
docker run -it --net host mesosphere/spark:1.5.0-hadoop2.6.0 bash
```

**@xiaods** commented on Oct 14, 2015:

```
root@omegamaster1:/opt/spark/dist# cat conf/spark-defaults.conf

spark.mesos.coarse=true
spark.mesos.executor.home /opt/spark/dist
spark.mesos.executor.docker.image  mesosphere/spark:1.5.0-hadoop2.6.0
```

That fixed the problem.
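For context, `spark.mesos.coarse=true` switches Spark from Mesos fine-grained mode (one Mesos task per Spark task) to coarse-grained mode, where long-lived executors hold their Mesos resources for the lifetime of the application. A quick check from inside `spark-shell` that the settings were picked up (a sketch; `sc` is the shell's built-in SparkContext):

```scala
// Confirm the shell is connected to Mesos in coarse-grained mode.
println(sc.master)                                     // expect mesos://...
println(sc.getConf.get("spark.mesos.coarse", "false")) // expect true
```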
