@erp12
Last active September 24, 2021 23:33
Spark Serialization ClassCastException

This gist documents an issue I have had when performing Spark interop from Clojure. When higher-order functions are used, a serialization error is thrown that I can't make sense of.

  • not_working.clj has the minimal Clojure to reproduce the issue.
  • working.scala has a direct translation of the Clojure code into Scala. It does not throw the exception.
  • logs_and_exception.log has the Spark logs and exception trace that are produced when running not_working.clj.

Below is additional information about when the exception does and doesn't occur.

  • The exception is not raised (and -main behaves correctly) when:
    • not_working.clj is compiled into an uberjar.
    • -main is called in a REPL.
  • The exception is raised when:
    • Running clj -X:spark erp12.clark-example.repro2/-main
    • Running -main via a test runner
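The trace below ends in a ClassCastException on java.lang.invoke.SerializedLambda, which suggests the lambda's deserialization hook was never resolved on the executor side. One plausible cause, given that an AOT-compiled uberjar works while clj -X and test runners fail, is a classloader mismatch: Clojure's DynamicClassLoader holds the compiled classes in the latter cases, while an uberjar puts them on the application classpath. The following self-contained Java sketch (the class and method names are illustrative, not from the gist) shows the SerializedLambda round trip that Spark's JavaSerializer relies on:

```java
import java.io.*;
import java.util.function.Function;

public class LambdaSerDemo {
    // Serialize a lambda and read it back, returning the rehydrated function.
    static Function<Integer, Integer> roundTrip(Function<Integer, Integer> f) throws Exception {
        ByteArrayOutputStream bos = new ByteArrayOutputStream();
        try (ObjectOutputStream oos = new ObjectOutputStream(bos)) {
            // A serializable lambda is written to the stream as a
            // java.lang.invoke.SerializedLambda via a compiler-generated writeReplace.
            oos.writeObject(f);
        }
        try (ObjectInputStream ois = new ObjectInputStream(
                new ByteArrayInputStream(bos.toByteArray()))) {
            // On read, SerializedLambda.readResolve re-creates the lambda by calling
            // the capturing class's $deserializeLambda$ hook. If that hook cannot be
            // reached -- e.g. the capturing class lives in a different classloader,
            // as can happen with Clojure's DynamicClassLoader -- a raw SerializedLambda
            // is what ends up being assigned, producing a ClassCastException like the
            // one Spark reports for MapPartitionsRDD.f (a scala.Function3 field).
            @SuppressWarnings("unchecked")
            Function<Integer, Integer> g = (Function<Integer, Integer>) ois.readObject();
            return g;
        }
    }

    public static void main(String[] args) throws Exception {
        // The intersection cast makes the lambda's target type Serializable.
        Function<Integer, Integer> inc =
                (Function<Integer, Integer> & Serializable) x -> x + 1;
        System.out.println(roundTrip(inc).apply(41)); // prints 42
    }
}
```

When both ends share a classloader, as in this single-JVM demo, the round trip succeeds. The gist's failure mode is consistent with the executor deserializing in a context where the capturing class cannot be rehydrated, though I have not confirmed that this is the exact mechanism here.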
/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/bin/java "-javaagent:/Applications/IntelliJ IDEA CE.app/Contents/lib/idea_rt.jar=62394:/Applications/IntelliJ IDEA CE.app/Contents/bin" -Dfile.encoding=UTF-8 -classpath /Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/charsets.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/deploy.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/cldrdata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/dnsns.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/jaccess.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/jfxrt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/localedata.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/nashorn.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/sunec.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/sunjce_provider.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/sunpkcs11.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/ext/zipfs.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/javaws.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/jce.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/jfr.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/jfxswt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/jsse.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/management-agent.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/plugin.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/resources.jar:/Library/Java/JavaVirtu
alMachines/jdk1.8.0_172.jdk/Contents/Home/jre/lib/rt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/ant-javafx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/dt.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/javafx-mx.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/jconsole.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/packager.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/sa-jdi.jar:/Library/Java/JavaVirtualMachines/jdk1.8.0_172.jdk/Contents/Home/lib/tools.jar:/Users/eddie/Projects/Software/clark/clark-example/src:/Users/eddie/Projects/Software/clark/clark-core/src:/Users/eddie/Projects/Software/clark/clark-repl/src:/Users/eddie/Projects/Software/fijit/src:/Users/eddie/.m2/repository/com/sun/jersey/jersey-core/1.19/jersey-core-1.19.jar:/Users/eddie/.m2/repository/com/sun/jersey/jersey-client/1.19/jersey-client-1.19.jar:/Users/eddie/.m2/repository/org/lz4/lz4-java/1.7.1/lz4-java-1.7.1.jar:/Users/eddie/.m2/repository/org/clojure/clojure/1.10.3/clojure-1.10.3.jar:/Users/eddie/.m2/repository/org/apache/parquet/parquet-encoding/1.10.1/parquet-encoding-1.10.1.jar:/Users/eddie/.m2/repository/com/fasterxml/woodstox/woodstox-core/5.0.3/woodstox-core-5.0.3.jar:/Users/eddie/.m2/repository/org/apache/hadoop/hadoop-hdfs-client/3.2.0/hadoop-hdfs-client-3.2.0.jar:/Users/eddie/.m2/repository/org/thnetos/cd-client/0.3.6/cd-client-0.3.6.jar:/Users/eddie/.m2/repository/org/spark-project/spark/unused/1.0.0/unused-1.0.0.jar:/Users/eddie/.m2/repository/org/apache/curator/curator-framework/2.13.0/curator-framework-2.13.0.jar:/Users/eddie/.m2/repository/org/apache/avro/avro/1.8.2/avro-1.8.2.jar:/Users/eddie/.m2/repository/commons-codec/commons-codec/1.11/commons-codec-1.11.jar:/Users/eddie/.m2/repository/org/apache/kerby/kerby-asn1/1.0.1/kerby-asn1-1.0.1.jar:/Users/eddie/.m2/repository/org/eclipse/jetty/jetty-servlet/9.3.24.v20180605/
jetty-servlet-9.3.24.v20180605.jar:/Users/eddie/.m2/repository/org/apache/curator/curator-client/2.13.0/curator-client-2.13.0.jar:/Users/eddie/.m2/repository/org/apache/commons/commons-compress/1.8.1/commons-compress-1.8.1.jar:/Users/eddie/.m2/repository/org/apache/commons/commons-lang3/3.10/commons-lang3-3.10.jar:/Users/eddie/.m2/repository/com/esotericsoftware/minlog/1.3.0/minlog-1.3.0.jar:/Users/eddie/.m2/repository/org/glassfish/jersey/core/jersey-common/2.30/jersey-common-2.30.jar:/Users/eddie/.m2/repository/org/clojure/core.specs.alpha/0.2.56/core.specs.alpha-0.2.56.jar:/Users/eddie/.m2/repository/org/tukaani/xz/1.5/xz-1.5.jar:/Users/eddie/.m2/repository/com/sun/jersey/jersey-servlet/1.19/jersey-servlet-1.19.jar:/Users/eddie/.m2/repository/com/fasterxml/jackson/module/jackson-module-jaxb-annotations/2.9.5/jackson-module-jaxb-annotations-2.9.5.jar:/Users/eddie/.m2/repository/jakarta/validation/jakarta.validation-api/2.0.2/jakarta.validation-api-2.0.2.jar:/Users/eddie/.m2/repository/com/esotericsoftware/kryo-shaded/4.0.2/kryo-shaded-4.0.2.jar:/Users/eddie/.m2/repository/org/apache/arrow/arrow-format/2.0.0/arrow-format-2.0.0.jar:/Users/eddie/.m2/repository/org/scala-lang/scala-library/2.12.13/scala-library-2.12.13.jar:/Users/eddie/.m2/repository/com/fasterxml/jackson/core/jackson-databind/2.10.5.1/jackson-databind-2.10.5.1.jar:/Users/eddie/.m2/repository/org/clojure/spec.alpha/0.2.194/spec.alpha-0.2.194.jar:/Users/eddie/.m2/repository/org/antlr/antlr4-runtime/4.8-1/antlr4-runtime-4.8-1.jar:/Users/eddie/.m2/repository/org/apache/spark/spark-unsafe_2.12/3.1.1/spark-unsafe_2.12-3.1.1.jar:/Users/eddie/.m2/repository/org/clojure/tools.cli/1.0.206/tools.cli-1.0.206.jar:/Users/eddie/.m2/repository/org/glassfish/jersey/containers/jersey-container-servlet/2.30/jersey-container-servlet-2.30.jar:/Users/eddie/.m2/repository/org/apache/orc/orc-shims/1.5.12/orc-shims-1.5.12.jar:/Users/eddie/.m2/repository/org/glassfish/hk2/hk2-utils/2.6.1/hk2-utils-2.6.1.jar:/Users/eddie/.m2/r
epository/commons-fileupload/commons-fileupload/1.2.1/commons-fileupload-1.2.1.jar:/Users/eddie/.m2/repository/org/apache/htrace/htrace-core4/4.1.0-incubating/htrace-core4-4.1.0-incubating.jar:/Users/eddie/.m2/repository/org/json4s/json4s-jackson_2.12/3.7.0-M5/json4s-jackson_2.12-3.7.0-M5.jar:/Users/eddie/.m2/repository/org/apache/arrow/arrow-memory-netty/2.0.0/arrow-memory-netty-2.0.0.jar:/Users/eddie/.m2/repository/com/squareup/okio/okio/1.6.0/okio-1.6.0.jar:/Users/eddie/.m2/repository/javax/activation/activation/1.1.1/activation-1.1.1.jar:/Users/eddie/.m2/repository/org/apache/kerby/kerb-common/1.0.1/kerb-common-1.0.1.jar:/Users/eddie/.m2/repository/io/dropwizard/metrics/metrics-jvm/4.1.1/metrics-jvm-4.1.1.jar:/Users/eddie/.m2/repository/org/eclipse/jetty/jetty-util/9.3.24.v20180605/jetty-util-9.3.24.v20180605.jar:/Users/eddie/.m2/repository/org/apache/kerby/token-provider/1.0.1/token-provider-1.0.1.jar:/Users/eddie/.m2/repository/org/codehaus/janino/janino/3.0.16/janino-3.0.16.jar:/Users/eddie/.m2/repository/org/apache/parquet/parquet-common/1.10.1/parquet-common-1.10.1.jar:/Users/eddie/.m2/repository/org/json4s/json4s-ast_2.12/3.7.0-M5/json4s-ast_2.12-3.7.0-M5.jar:/Users/eddie/.m2/repository/org/slf4j/jcl-over-slf4j/1.7.30/jcl-over-slf4j-1.7.30.jar:/Users/eddie/.m2/repository/jakarta/xml/bind/jakarta.xml.bind-api/2.3.2/jakarta.xml.bind-api-2.3.2.jar:/Users/eddie/.m2/repository/org/apache/avro/avro-ipc/1.8.2/avro-ipc-1.8.2.jar:/Users/eddie/.m2/repository/org/glassfish/jersey/containers/jersey-container-servlet-core/2.30/jersey-container-servlet-core-2.30.jar:/Users/eddie/.m2/repository/org/apache/kerby/kerb-admin/1.0.1/kerb-admin-1.0.1.jar:/Users/eddie/.m2/repository/com/github/luben/zstd-jni/1.4.8-1/zstd-jni-1.4.8-1.jar:/Users/eddie/.m2/repository/org/apache/hadoop/hadoop-auth/3.2.0/hadoop-auth-3.2.0.jar:/Users/eddie/.m2/repository/org/apache/spark/spark-launcher_2.12/3.1.1/spark-launcher_2.12-3.1.1.jar:/Users/eddie/.m2/repository/org/glassfish/hk2/external/jak
arta.inject/2.6.1/jakarta.inject-2.6.1.jar:/Users/eddie/.m2/repository/org/apache/commons/commons-math3/3.4.1/commons-math3-3.4.1.jar:/Users/eddie/.m2/repository/net/cgrand/parsley/0.9.2/parsley-0.9.2.jar:/Users/eddie/.m2/repository/reply/reply/0.4.3/reply-0.4.3.jar:/Users/eddie/.m2/repository/org/apache/spark/spark-network-common_2.12/3.1.1/spark-network-common_2.12-3.1.1.jar:/Users/eddie/.m2/repository/org/apache/kerby/kerb-core/1.0.1/kerb-core-1.0.1.jar:/Users/eddie/.m2/repository/org/apache/hadoop/hadoop-yarn-client/3.2.0/hadoop-yarn-client-3.2.0.jar:/Users/eddie/.m2/repository/org/glassfish/hk2/external/aopalliance-repackaged/2.6.1/aopalliance-repackaged-2.6.1.jar:/Users/eddie/.m2/repository/commons-cli/commons-cli/1.2/commons-cli-1.2.jar:/Users/eddie/.m2/repository/commons-lang/commons-lang/2.6/commons-lang-2.6.jar:/Users/eddie/.m2/repository/slingshot/slingshot/0.10.3/slingshot-0.10.3.jar:/Users/eddie/.m2/repository/org/scala-lang/modules/scala-parser-combinators_2.12/1.1.2/scala-parser-combinators_2.12-1.1.2.jar:/Users/eddie/.m2/repository/commons-io/commons-io/2.5/commons-io-2.5.jar:/Users/eddie/.m2/repository/org/objenesis/objenesis/2.5.1/objenesis-2.5.1.jar:/Users/eddie/.m2/repository/org/json4s/json4s-scalap_2.12/3.7.0-M5/json4s-scalap_2.12-3.7.0-M5.jar:/Users/eddie/.m2/repository/org/apache/kerby/kerb-simplekdc/1.0.1/kerb-simplekdc-1.0.1.jar:/Users/eddie/.m2/repository/com/nimbusds/nimbus-jose-jwt/4.41.1/nimbus-jose-jwt-4.41.1.jar:/Users/eddie/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-common/3.2.0/hadoop-mapreduce-client-common-3.2.0.jar:/Users/eddie/.m2/repository/com/fasterxml/jackson/core/jackson-core/2.10.5/jackson-core-2.10.5.jar:/Users/eddie/.m2/repository/org/apache/hive/hive-storage-api/2.7.2/hive-storage-api-2.7.2.jar:/Users/eddie/.m2/repository/nrepl/drawbridge/0.1.0/drawbridge-0.1.0.jar:/Users/eddie/.m2/repository/org/apache/parquet/parquet-format/2.4.0/parquet-format-2.4.0.jar:/Users/eddie/.m2/repository/clj-http/clj-http/0.3.
6/clj-http-0.3.6.jar:/Users/eddie/.m2/repository/org/slf4j/jul-to-slf4j/1.7.30/jul-to-slf4j-1.7.30.jar:/Users/eddie/.m2/repository/org/apache/hadoop/hadoop-annotations/3.2.0/hadoop-annotations-3.2.0.jar:/Users/eddie/.m2/repository/commons-collections/commons-collections/3.2.2/commons-collections-3.2.2.jar:/Users/eddie/.m2/repository/trptcolin/versioneer/0.1.1/versioneer-0.1.1.jar:/Users/eddie/.m2/repository/org/scala-lang/scala-compiler/2.12.13/scala-compiler-2.12.13.jar:/Users/eddie/.m2/repository/com/google/flatbuffers/flatbuffers-java/1.9.0/flatbuffers-java-1.9.0.jar:/Users/eddie/.m2/repository/org/apache/httpcomponents/httpcore/4.4.4/httpcore-4.4.4.jar:/Users/eddie/.m2/repository/org/eclipse/jetty/jetty-security/9.3.24.v20180605/jetty-security-9.3.24.v20180605.jar:/Users/eddie/.m2/repository/com/fasterxml/jackson/module/jackson-module-paranamer/2.10.0/jackson-module-paranamer-2.10.0.jar:/Users/eddie/.m2/repository/dnsjava/dnsjava/2.1.7/dnsjava-2.1.7.jar:/Users/eddie/.m2/repository/org/slf4j/slf4j-log4j12/1.7.30/slf4j-log4j12-1.7.30.jar:/Users/eddie/.m2/repository/jakarta/annotation/jakarta.annotation-api/1.3.5/jakarta.annotation-api-1.3.5.jar:/Users/eddie/.m2/repository/io/dropwizard/metrics/metrics-core/4.1.1/metrics-core-4.1.1.jar:/Users/eddie/.m2/repository/io/dropwizard/metrics/metrics-graphite/4.1.1/metrics-graphite-4.1.1.jar:/Users/eddie/.m2/repository/org/apache/spark/spark-catalyst_2.12/3.1.1/spark-catalyst_2.12-3.1.1.jar:/Users/eddie/.m2/repository/io/airlift/aircompressor/0.10/aircompressor-0.10.jar:/Users/eddie/.m2/repository/clojure-complete/clojure-complete/0.2.5/clojure-complete-0.2.5.jar:/Users/eddie/.m2/repository/com/fasterxml/jackson/core/jackson-annotations/2.10.5/jackson-annotations-2.10.5.jar:/Users/eddie/.m2/repository/org/json4s/json4s-core_2.12/3.7.0-M5/json4s-core_2.12-3.7.0-M5.jar:/Users/eddie/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-core/3.2.0/hadoop-mapreduce-client-core-3.2.0.jar:/Users/eddie/.m2/repository/org/clojar
s/trptcolin/sjacket/0.1.1.1/sjacket-0.1.1.1.jar:/Users/eddie/.m2/repository/org/apache/commons/commons-text/1.6/commons-text-1.6.jar:/Users/eddie/.m2/repository/org/javassist/javassist/3.25.0-GA/javassist-3.25.0-GA.jar:/Users/eddie/.m2/repository/org/glassfish/hk2/hk2-api/2.6.1/hk2-api-2.6.1.jar:/Users/eddie/.m2/repository/org/roaringbitmap/shims/0.9.0/shims-0.9.0.jar:/Users/eddie/.m2/repository/org/apache/commons/commons-crypto/1.1.0/commons-crypto-1.1.0.jar:/Users/eddie/.m2/repository/commons-logging/commons-logging/1.1.1/commons-logging-1.1.1.jar:/Users/eddie/.m2/repository/com/google/guava/guava/16.0.1/guava-16.0.1.jar:/Users/eddie/.m2/repository/com/clearspring/analytics/stream/2.9.6/stream-2.9.6.jar:/Users/eddie/.m2/repository/org/apache/arrow/arrow-vector/2.0.0/arrow-vector-2.0.0.jar:/Users/eddie/.m2/repository/org/xerial/snappy/snappy-java/1.1.8.2/snappy-java-1.1.8.2.jar:/Users/eddie/.m2/repository/oro/oro/2.0.8/oro-2.0.8.jar:/Users/eddie/.m2/repository/org/apache/hadoop/hadoop-common/3.2.0/hadoop-common-3.2.0.jar:/Users/eddie/.m2/repository/commons-beanutils/commons-beanutils/1.9.3/commons-beanutils-1.9.3.jar:/Users/eddie/.m2/repository/net/minidev/json-smart/2.3/json-smart-2.3.jar:/Users/eddie/.m2/repository/org/apache/avro/avro-mapred/1.8.2/avro-mapred-1.8.2-hadoop2.jar:/Users/eddie/.m2/repository/org/apache/spark/spark-sketch_2.12/3.1.1/spark-sketch_2.12-3.1.1.jar:/Users/eddie/.m2/repository/net/cgrand/regex/1.1.0/regex-1.1.0.jar:/Users/eddie/.m2/repository/org/apache/kerby/kerb-crypto/1.0.1/kerb-crypto-1.0.1.jar:/Users/eddie/.m2/repository/com/fasterxml/jackson/jaxrs/jackson-jaxrs-base/2.9.5/jackson-jaxrs-base-2.9.5.jar:/Users/eddie/.m2/repository/org/apache/hadoop/hadoop-yarn-api/3.2.0/hadoop-yarn-api-3.2.0.jar:/Users/eddie/.m2/repository/org/apache/arrow/arrow-memory-core/2.0.0/arrow-memory-core-2.0.0.jar:/Users/eddie/.m2/repository/org/apache/orc/orc-mapreduce/1.5.12/orc-mapreduce-1.5.12.jar:/Users/eddie/.m2/repository/com/twitter/chill_2.12/0.9.5/ch
ill_2.12-0.9.5.jar:/Users/eddie/.m2/repository/org/scala-lang/scala-reflect/2.12.13/scala-reflect-2.12.13.jar:/Users/eddie/.m2/repository/org/apache/httpcomponents/httpclient/4.5.2/httpclient-4.5.2.jar:/Users/eddie/.m2/repository/net/sf/py4j/py4j/0.10.9/py4j-0.10.9.jar:/Users/eddie/.m2/repository/com/thoughtworks/paranamer/paranamer/2.8/paranamer-2.8.jar:/Users/eddie/.m2/repository/org/apache/ivy/ivy/2.4.0/ivy-2.4.0.jar:/Users/eddie/.m2/repository/org/glassfish/jersey/inject/jersey-hk2/2.30/jersey-hk2-2.30.jar:/Users/eddie/.m2/repository/org/fusesource/leveldbjni/leveldbjni-all/1.8/leveldbjni-all-1.8.jar:/Users/eddie/.m2/repository/com/twitter/chill-java/0.9.5/chill-java-0.9.5.jar:/Users/eddie/.m2/repository/org/apache/kerby/kerb-server/1.0.1/kerb-server-1.0.1.jar:/Users/eddie/.m2/repository/org/apache/hadoop/hadoop-yarn-common/3.2.0/hadoop-yarn-common-3.2.0.jar:/Users/eddie/.m2/repository/cheshire/cheshire/4.0.3/cheshire-4.0.3.jar:/Users/eddie/.m2/repository/org/apache/orc/orc-core/1.5.12/orc-core-1.5.12.jar:/Users/eddie/.m2/repository/org/codehaus/woodstox/stax2-api/3.1.4/stax2-api-3.1.4.jar:/Users/eddie/.m2/repository/org/apache/parquet/parquet-hadoop/1.10.1/parquet-hadoop-1.10.1.jar:/Users/eddie/.m2/repository/org/glassfish/hk2/hk2-locator/2.6.1/hk2-locator-2.6.1.jar:/Users/eddie/.m2/repository/com/univocity/univocity-parsers/2.9.1/univocity-parsers-2.9.1.jar:/Users/eddie/.m2/repository/org/apache/kerby/kerby-xdr/1.0.1/kerby-xdr-1.0.1.jar:/Users/eddie/.m2/repository/org/codehaus/jackson/jackson-core-asl/1.9.13/jackson-core-asl-1.9.13.jar:/Users/eddie/.m2/repository/org/glassfish/jersey/core/jersey-server/2.30/jersey-server-2.30.jar:/Users/eddie/.m2/repository/org/apache/zookeeper/zookeeper/3.4.14/zookeeper-3.4.14.jar:/Users/eddie/.m2/repository/com/squareup/okhttp/okhttp/2.7.5/okhttp-2.7.5.jar:/Users/eddie/.m2/repository/org/apache/commons/commons-configuration2/2.1.1/commons-configuration2-2.1.1.jar:/Users/eddie/.m2/repository/org/codehaus/janino/commons-compil
er/3.0.16/commons-compiler-3.0.16.jar:/Users/eddie/.m2/repository/javax/servlet/servlet-api/2.5/servlet-api-2.5.jar:/Users/eddie/.m2/repository/org/apache/parquet/parquet-jackson/1.10.1/parquet-jackson-1.10.1.jar:/Users/eddie/.m2/repository/org/threeten/threeten-extra/1.5.0/threeten-extra-1.5.0.jar:/Users/eddie/.m2/repository/com/ning/compress-lzf/1.0.3/compress-lzf-1.0.3.jar:/Users/eddie/.m2/repository/org/apache/spark/spark-network-shuffle_2.12/3.1.1/spark-network-shuffle_2.12-3.1.1.jar:/Users/eddie/.m2/repository/nrepl/nrepl/0.8.3/nrepl-0.8.3.jar:/Users/eddie/.m2/repository/jline/jline/2.14.6/jline-2.14.6.jar:/Users/eddie/.m2/repository/com/fasterxml/jackson/module/jackson-module-scala_2.12/2.10.0/jackson-module-scala_2.12-2.10.0.jar:/Users/eddie/.m2/repository/org/apache/curator/curator-recipes/2.13.0/curator-recipes-2.13.0.jar:/Users/eddie/.m2/repository/org/glassfish/jersey/core/jersey-client/2.30/jersey-client-2.30.jar:/Users/eddie/.m2/repository/clj-stacktrace/clj-stacktrace/0.2.7/clj-stacktrace-0.2.7.jar:/Users/eddie/.m2/repository/org/slf4j/slf4j-api/1.7.30/slf4j-api-1.7.30.jar:/Users/eddie/.m2/repository/com/google/re2j/re2j/1.1/re2j-1.1.jar:/Users/eddie/.m2/repository/org/apache/spark/spark-tags_2.12/3.1.1/spark-tags_2.12-3.1.1.jar:/Users/eddie/.m2/repository/org/glassfish/jersey/media/jersey-media-jaxb/2.30/jersey-media-jaxb-2.30.jar:/Users/eddie/.m2/repository/org/apache/hadoop/hadoop-mapreduce-client-jobclient/3.2.0/hadoop-mapreduce-client-jobclient-3.2.0.jar:/Users/eddie/.m2/repository/com/fasterxml/jackson/jaxrs/jackson-jaxrs-json-provider/2.9.5/jackson-jaxrs-json-provider-2.9.5.jar:/Users/eddie/.m2/repository/io/dropwizard/metrics/metrics-jmx/4.1.1/metrics-jmx-4.1.1.jar:/Users/eddie/.m2/repository/org/apache/kerby/kerby-config/1.0.1/kerby-config-1.0.1.jar:/Users/eddie/.m2/repository/net/razorvine/pyrolite/4.30/pyrolite-4.30.jar:/Users/eddie/.m2/repository/org/apache/kerby/kerb-util/1.0.1/kerb-util-1.0.1.jar:/Users/eddie/.m2/repository/com/sun/activ
ation/jakarta.activation/1.2.1/jakarta.activation-1.2.1.jar:/Users/eddie/.m2/repository/org/apache/spark/spark-kvstore_2.12/3.1.1/spark-kvstore_2.12-3.1.1.jar:/Users/eddie/.m2/repository/log4j/log4j/1.2.17/log4j-1.2.17.jar:/Users/eddie/.m2/repository/com/google/protobuf/protobuf-java/2.5.0/protobuf-java-2.5.0.jar:/Users/eddie/.m2/repository/org/glassfish/hk2/osgi-resource-locator/1.0.3/osgi-resource-locator-1.0.3.jar:/Users/eddie/.m2/repository/io/dropwizard/metrics/metrics-json/4.1.1/metrics-json-4.1.1.jar:/Users/eddie/.m2/repository/camel-snake-kebab/camel-snake-kebab/0.4.2/camel-snake-kebab-0.4.2.jar:/Users/eddie/.m2/repository/javax/servlet/jsp/jsp-api/2.1/jsp-api-2.1.jar:/Users/eddie/.m2/repository/org/apache/yetus/audience-annotations/0.5.0/audience-annotations-0.5.0.jar:/Users/eddie/.m2/repository/org/apache/kerby/kerby-util/1.0.1/kerby-util-1.0.1.jar:/Users/eddie/.m2/repository/org/apache/xbean/xbean-asm7-shaded/4.15/xbean-asm7-shaded-4.15.jar:/Users/eddie/.m2/repository/clj-http-lite/clj-http-lite/0.2.0/clj-http-lite-0.2.0.jar:/Users/eddie/.m2/repository/org/scala-lang/modules/scala-xml_2.12/1.2.0/scala-xml_2.12-1.2.0.jar:/Users/eddie/.m2/repository/org/apache/kerby/kerb-identity/1.0.1/kerb-identity-1.0.1.jar:/Users/eddie/.m2/repository/net/minidev/accessors-smart/1.2/accessors-smart-1.2.jar:/Users/eddie/.m2/repository/jakarta/ws/rs/jakarta.ws.rs-api/2.1.6/jakarta.ws.rs-api-2.1.6.jar:/Users/eddie/.m2/repository/io/netty/netty-all/4.1.51.Final/netty-all-4.1.51.Final.jar:/Users/eddie/.m2/repository/org/apache/httpcomponents/httpmime/4.1.2/httpmime-4.1.2.jar:/Users/eddie/.m2/repository/org/apache/spark/spark-core_2.12/3.1.1/spark-core_2.12-3.1.1.jar:/Users/eddie/.m2/repository/com/google/code/findbugs/jsr305/3.0.2/jsr305-3.0.2.jar:/Users/eddie/.m2/repository/org/apache/parquet/parquet-column/1.10.1/parquet-column-1.10.1.jar:/Users/eddie/.m2/repository/org/apache/kerby/kerb-client/1.0.1/kerb-client-1.0.1.jar:/Users/eddie/.m2/repository/ring/ring-core/1.0.2/ring
-core-1.0.2.jar:/Users/eddie/.m2/repository/commons-net/commons-net/3.6/commons-net-3.6.jar:/Users/eddie/.m2/repository/org/roaringbitmap/RoaringBitmap/0.9.0/RoaringBitmap-0.9.0.jar:/Users/eddie/.m2/repository/org/codehaus/jackson/jackson-mapper-asl/1.9.13/jackson-mapper-asl-1.9.13.jar:/Users/eddie/.m2/repository/jakarta/activation/jakarta.activation-api/1.2.1/jakarta.activation-api-1.2.1.jar:/Users/eddie/.m2/repository/jakarta/servlet/jakarta.servlet-api/4.0.3/jakarta.servlet-api-4.0.3.jar:/Users/eddie/.m2/repository/org/apache/hadoop/hadoop-client/3.2.0/hadoop-client-3.2.0.jar:/Users/eddie/.m2/repository/com/fasterxml/jackson/dataformat/jackson-dataformat-smile/2.0.6/jackson-dataformat-smile-2.0.6.jar:/Users/eddie/.m2/repository/com/github/stephenc/jcip/jcip-annotations/1.0-1/jcip-annotations-1.0-1.jar:/Users/eddie/.m2/repository/javax/xml/bind/jaxb-api/2.2.11/jaxb-api-2.2.11.jar:/Users/eddie/.m2/repository/org/apache/kerby/kerby-pkix/1.0.1/kerby-pkix-1.0.1.jar:/Users/eddie/.m2/repository/com/google/code/gson/gson/2.2.4/gson-2.2.4.jar:/Users/eddie/.m2/repository/org/apache/spark/spark-sql_2.12/3.1.1/spark-sql_2.12-3.1.1.jar clojure.main -m erp12.clark-example.repro2
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
21/09/24 19:20:54 WARN Utils: Your hostname, Eddies-MacBook-Pro.local resolves to a loopback address: 127.0.0.1; using 192.168.1.13 instead (on interface en0)
21/09/24 19:20:54 WARN Utils: Set SPARK_LOCAL_IP if you need to bind to another address
21/09/24 19:20:54 INFO SparkContext: Running Spark version 3.1.1
21/09/24 19:20:55 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
21/09/24 19:20:55 INFO ResourceUtils: ==============================================================
21/09/24 19:20:55 INFO ResourceUtils: No custom resources configured for spark.driver.
21/09/24 19:20:55 INFO ResourceUtils: ==============================================================
21/09/24 19:20:55 INFO SparkContext: Submitted application: 98c91a2f-42e7-43cf-8c21-770558925874
21/09/24 19:20:55 INFO ResourceProfile: Default ResourceProfile created, executor resources: Map(cores -> name: cores, amount: 1, script: , vendor: , memory -> name: memory, amount: 1024, script: , vendor: , offHeap -> name: offHeap, amount: 0, script: , vendor: ), task resources: Map(cpus -> name: cpus, amount: 1.0)
21/09/24 19:20:55 INFO ResourceProfile: Limiting resource is cpu
21/09/24 19:20:55 INFO ResourceProfileManager: Added ResourceProfile id: 0
21/09/24 19:20:55 INFO SecurityManager: Changing view acls to: eddie
21/09/24 19:20:55 INFO SecurityManager: Changing modify acls to: eddie
21/09/24 19:20:55 INFO SecurityManager: Changing view acls groups to:
21/09/24 19:20:55 INFO SecurityManager: Changing modify acls groups to:
21/09/24 19:20:55 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(eddie); groups with view permissions: Set(); users with modify permissions: Set(eddie); groups with modify permissions: Set()
21/09/24 19:20:56 INFO Utils: Successfully started service 'sparkDriver' on port 62400.
21/09/24 19:20:56 INFO SparkEnv: Registering MapOutputTracker
21/09/24 19:20:56 INFO SparkEnv: Registering BlockManagerMaster
21/09/24 19:20:56 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
21/09/24 19:20:56 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
21/09/24 19:20:56 INFO SparkEnv: Registering BlockManagerMasterHeartbeat
21/09/24 19:20:56 INFO DiskBlockManager: Created local directory at /private/var/folders/zl/hlg5bvnj1y9753gvjsf0_86r0000gn/T/blockmgr-81fda9b5-a54d-4ec5-9c15-523eaeda445e
21/09/24 19:20:56 INFO MemoryStore: MemoryStore started with capacity 2004.6 MiB
21/09/24 19:20:56 INFO SparkEnv: Registering OutputCommitCoordinator
21/09/24 19:20:56 INFO Utils: Successfully started service 'SparkUI' on port 4040.
21/09/24 19:20:57 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.1.13:4040
21/09/24 19:20:57 INFO Executor: Starting executor ID driver on host 192.168.1.13
21/09/24 19:20:57 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 62402.
21/09/24 19:20:57 INFO NettyBlockTransferService: Server created on 192.168.1.13:62402
21/09/24 19:20:57 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
21/09/24 19:20:57 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.1.13, 62402, None)
21/09/24 19:20:57 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.1.13:62402 with 2004.6 MiB RAM, BlockManagerId(driver, 192.168.1.13, 62402, None)
21/09/24 19:20:57 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.1.13, 62402, None)
21/09/24 19:20:57 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.1.13, 62402, None)
21/09/24 19:20:59 INFO SharedState: Setting hive.metastore.warehouse.dir ('null') to the value of spark.sql.warehouse.dir ('file:/Users/eddie/Projects/Software/clark/clark-example/spark-warehouse').
21/09/24 19:20:59 INFO SharedState: Warehouse path is 'file:/Users/eddie/Projects/Software/clark/clark-example/spark-warehouse'.
21/09/24 19:21:01 INFO CodeGenerator: Code generated in 215.279552 ms
21/09/24 19:21:01 INFO SparkContext: Starting job: show at repro2.clj:20
21/09/24 19:21:01 INFO DAGScheduler: Got job 0 (show at repro2.clj:20) with 1 output partitions
21/09/24 19:21:01 INFO DAGScheduler: Final stage: ResultStage 0 (show at repro2.clj:20)
21/09/24 19:21:01 INFO DAGScheduler: Parents of final stage: List()
21/09/24 19:21:01 INFO DAGScheduler: Missing parents: List()
21/09/24 19:21:01 INFO DAGScheduler: Submitting ResultStage 0 (MapPartitionsRDD[3] at show at repro2.clj:20), which has no missing parents
21/09/24 19:21:01 INFO MemoryStore: Block broadcast_0 stored as values in memory (estimated size 12.8 KiB, free 2004.6 MiB)
21/09/24 19:21:01 INFO MemoryStore: Block broadcast_0_piece0 stored as bytes in memory (estimated size 5.6 KiB, free 2004.6 MiB)
21/09/24 19:21:01 INFO BlockManagerInfo: Added broadcast_0_piece0 in memory on 192.168.1.13:62402 (size: 5.6 KiB, free: 2004.6 MiB)
21/09/24 19:21:01 INFO SparkContext: Created broadcast 0 from broadcast at DAGScheduler.scala:1383
21/09/24 19:21:01 INFO DAGScheduler: Submitting 1 missing tasks from ResultStage 0 (MapPartitionsRDD[3] at show at repro2.clj:20) (first 15 tasks are for partitions Vector(0))
21/09/24 19:21:01 INFO TaskSchedulerImpl: Adding task set 0.0 with 1 tasks resource profile 0
21/09/24 19:21:01 INFO TaskSetManager: Starting task 0.0 in stage 0.0 (TID 0) (192.168.1.13, executor driver, partition 0, PROCESS_LOCAL, 4578 bytes) taskResourceAssignments Map()
21/09/24 19:21:01 INFO Executor: Running task 0.0 in stage 0.0 (TID 0)
21/09/24 19:21:01 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2233)
at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1405)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2291)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:527)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1158)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2176)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
at org.apache.spark.scheduler.Task.run(Task.scala:131)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
21/09/24 19:21:02 WARN TaskSetManager: Lost task 0.0 in stage 0.0 (TID 0) (192.168.1.13 executor driver): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2233)
at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1405)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2291)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:527)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1158)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2176)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
at org.apache.spark.scheduler.Task.run(Task.scala:131)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
21/09/24 19:21:02 ERROR TaskSetManager: Task 0 in stage 0.0 failed 1 times; aborting job
21/09/24 19:21:02 INFO TaskSchedulerImpl: Removed TaskSet 0.0, whose tasks have all completed, from pool
21/09/24 19:21:02 INFO TaskSchedulerImpl: Cancelling stage 0
21/09/24 19:21:02 INFO TaskSchedulerImpl: Killing all running tasks in stage 0: Stage cancelled
21/09/24 19:21:02 INFO DAGScheduler: ResultStage 0 (show at repro2.clj:20) failed in 0.650 s due to Job aborted due to stage failure: Task 0 in stage 0.0 failed 1 times, most recent failure: Lost task 0.0 in stage 0.0 (TID 0) (192.168.1.13 executor driver): java.lang.ClassCastException: cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
at java.io.ObjectStreamClass$FieldReflector.setObjFieldValues(ObjectStreamClass.java:2233)
at java.io.ObjectStreamClass.setObjFieldValues(ObjectStreamClass.java:1405)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2291)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at scala.collection.immutable.List$SerializationProxy.readObject(List.scala:527)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:498)
at java.io.ObjectStreamClass.invokeReadObject(ObjectStreamClass.java:1158)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2176)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:2285)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:2209)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:2067)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1571)
at java.io.ObjectInputStream.readObject(ObjectInputStream.java:431)
at org.apache.spark.serializer.JavaDeserializationStream.readObject(JavaSerializer.scala:76)
at org.apache.spark.serializer.JavaSerializerInstance.deserialize(JavaSerializer.scala:115)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:83)
at org.apache.spark.scheduler.Task.run(Task.scala:131)
at org.apache.spark.executor.Executor$TaskRunner.$anonfun$run$3(Executor.scala:497)
at org.apache.spark.util.Utils$.tryWithSafeFinally(Utils.scala:1439)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:500)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)
Driver stacktrace:
21/09/24 19:21:02 INFO DAGScheduler: Job 0 failed: show at repro2.clj:20, took 0.690595 s
Execution error (ClassCastException) at java.io.ObjectStreamClass$FieldReflector/setObjFieldValues (ObjectStreamClass.java:2233).
cannot assign instance of java.lang.invoke.SerializedLambda to field org.apache.spark.rdd.MapPartitionsRDD.f of type scala.Function3 in instance of org.apache.spark.rdd.MapPartitionsRDD
Full report at:
/var/folders/zl/hlg5bvnj1y9753gvjsf0_86r0000gn/T/clojure-1888777375856289776.edn
21/09/24 19:21:02 INFO SparkContext: Invoking stop() from shutdown hook
21/09/24 19:21:02 INFO SparkUI: Stopped Spark web UI at http://192.168.1.13:4040
21/09/24 19:21:02 INFO MapOutputTrackerMasterEndpoint: MapOutputTrackerMasterEndpoint stopped!
21/09/24 19:21:02 INFO MemoryStore: MemoryStore cleared
21/09/24 19:21:02 INFO BlockManager: BlockManager stopped
21/09/24 19:21:02 INFO BlockManagerMaster: BlockManagerMaster stopped
21/09/24 19:21:02 INFO OutputCommitCoordinator$OutputCommitCoordinatorEndpoint: OutputCommitCoordinator stopped!
21/09/24 19:21:02 INFO SparkContext: Successfully stopped SparkContext
21/09/24 19:21:02 INFO ShutdownHookManager: Shutdown hook called
21/09/24 19:21:02 INFO ShutdownHookManager: Deleting directory /private/var/folders/zl/hlg5bvnj1y9753gvjsf0_86r0000gn/T/spark-dff3d6d0-f584-409f-a9a5-2cf998d2e70d
Process finished with exit code 1
(ns erp12.clark-example.repro2
  (:gen-class)
  (:import (org.apache.spark.sql SparkSession Encoders)
           (scala Function1)
           (java.io Serializable)))

(defn -main
  [& _]
  (-> (SparkSession/builder)
      (.master "local[*]")
      (.getOrCreate)
      (.range 1)
      (.map (reify
              Function1
              (apply [_ row]
                row)
              Serializable)
            (Encoders/LONG))
      (.show)))
package erp12.clark.debug

import java.lang.Long

import org.apache.spark.sql.{Encoders, SparkSession}

case object Play extends App {
  SparkSession
    .builder
    .master("local[*]")
    .getOrCreate()
    .range(1)
    .map(new Function1[Long, Long] with Serializable {
      def apply(v1: Long): Long = v1
    })(Encoders.LONG)
    .show()
}