This gist documents an issue I have had when performing Spark interop from Clojure. When higher order functions are used, a serialization error is thrown that I can't make sense of.
`not_working.clj` has the minimal Clojure needed to reproduce the issue. `working.scala` is a direct translation of the Clojure code into Scala; it does not throw the exception. `logs_and_exception.log` contains the Spark logs and exception trace produced when running `not_working.clj`.
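For context, a minimal sketch of the general shape such interop code takes — this is illustrative only, not the gist's actual `not_working.clj`; the namespace, app name, and values are made up:

```clojure
(ns example.core
  (:import [org.apache.spark SparkConf]
           [org.apache.spark.api.java JavaSparkContext]
           [org.apache.spark.api.java.function Function]))

(defn -main [& args]
  (let [conf (-> (SparkConf.)
                 (.setMaster "local[*]")
                 (.setAppName "repro"))
        sc   (JavaSparkContext. conf)
        rdd  (.parallelize sc [1 2 3])
        ;; The reified Function is a class generated at runtime;
        ;; Spark must serialize it and ship it to the executors,
        ;; which is where this kind of interop tends to go wrong.
        doubled (.map rdd (reify Function
                            (call [_ x] (* 2 x))))]
    (println (.collect doubled))
    (.stop sc)))
```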
Below is additional information about when the exception does and does not occur.
- The exception is not raised (and `-main` behaves correctly) when `not_working.clj` is compiled into an uberjar.