Often it is important to change a core configuration setting in PySpark before running a job (for example the serializer, or PYTHONHASHSEED for Python 3 users). Because most settings cannot be changed on a SparkContext that is already running, the usual pattern is to stop the current context and start a new one with the updated configuration:
from pyspark import SparkContext
from pyspark.serializers import PickleSerializer

# Build a new conf from the running context's settings (note: _conf is a
# private attribute) and add the extra executor environment variable.
new_conf = sc._conf.setExecutorEnv('PYTHONHASHSEED', '1234')
sc.stop()  # the old context must be stopped before a new one can start
sc = SparkContext(conf=new_conf, serializer=PickleSerializer())
This is a quick solution to a few serialization problems, and to the exception Python 3 raises when string hashing is used across executors without a fixed PYTHONHASHSEED.
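Why pinning PYTHONHASHSEED matters can be sketched without Spark at all: Python 3 randomizes string hashes per interpreter process, so two executors started without a fixed seed can disagree on `hash(key)` and hence on hash-based partitioning. The helper below (a hypothetical name, not a Spark API) simulates two executors as fresh interpreter processes sharing a pinned seed:

```python
import os
import subprocess
import sys

def child_hash(value, seed):
    # Spawn a fresh interpreter -- standing in for a new Spark executor --
    # with PYTHONHASHSEED pinned in its environment, and return hash(value).
    env = dict(os.environ, PYTHONHASHSEED=seed)
    out = subprocess.check_output(
        [sys.executable, "-c", f"print(hash({value!r}))"], env=env, text=True
    )
    return int(out)

# Two "executors" started with the same PYTHONHASHSEED compute identical
# hashes, so a hash-partitioned key lands in the same partition everywhere.
print(child_hash("spark", "1234") == child_hash("spark", "1234"))  # True
```

Without the seed pinned, repeated runs of the child interpreter would generally disagree, which is exactly the situation `setExecutorEnv('PYTHONHASHSEED', ...)` prevents across a cluster.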