Logback - Graylog (GELF) integration on Apache Spark

Assuming SPARK_HOME=/spark246

  • Put logback.xml into $SPARK_HOME/conf
  • Create additionaljars directory under $SPARK_HOME
  • Download the required logback and logback-gelf jars into the additionaljars directory:
    • log4j-over-slf4j-1.7.30.jar
    • logback-core-1.2.3.jar
    • logback-classic-1.2.3.jar
    • logback-gelf-2.2.0.jar
  • Tell Spark to use the custom jar folder additionaljars as the primary classpath folder with the following spark-submit parameters:
    • --conf "spark.driver.extraClassPath=/spark246/additionaljars/*"
    • --conf "spark.executor.extraClassPath=/spark246/additionaljars/*"

Example spark-submit command:

/spark246/bin/spark-submit --class com.baybatu.myJob \
--master spark://sparksal:7077 \
--driver-java-options "-Dapp_name=app_name1 -Dspark_job_name=job_name1" \
--conf "spark.driver.extraClassPath=/spark246/additionaljars/*" \
--conf "spark.executor.extraClassPath=/spark246/additionaljars/*" \
--total-executor-cores=2 myfat.jar
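For reference, a minimal logback.xml for the conf directory could look like the sketch below. The appender and encoder class names are assumptions based on the logback-gelf library's documented API, and graylog.example.com:12201 is a placeholder for your Graylog input; note that logback resolves ${...} against system properties, so the staticField entries pick up the -Dapp_name and -Dspark_job_name options passed via --driver-java-options above.

```xml
<configuration>
  <!-- Ship logs to Graylog as GELF over UDP; host/port are placeholders -->
  <appender name="GELF" class="de.siegmar.logbackgelf.GelfUdpAppender">
    <graylogHost>graylog.example.com</graylogHost>
    <graylogPort>12201</graylogPort>
    <encoder class="de.siegmar.logbackgelf.GelfEncoder">
      <!-- Forward the -Dapp_name/-Dspark_job_name system properties as GELF fields -->
      <staticField>app_name:${app_name}</staticField>
      <staticField>spark_job_name:${spark_job_name}</staticField>
    </encoder>
  </appender>
  <root level="INFO">
    <appender-ref ref="GELF"/>
  </root>
</configuration>
```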

Source: https://stackoverflow.com/a/45480145
