@jhu-chang
jhu-chang / DStream initialize information
Created April 15, 2015 08:30
Spark streaming stack overflow
15/04/15 15:50:42 [main] INFO org.apache.spark.streaming.dstream.ForEachDStream: metadataCleanupDelay = -1
15/04/15 15:50:42 [main] INFO org.apache.spark.streaming.dstream.MappedDStream: metadataCleanupDelay = -1
15/04/15 15:50:42 [main] INFO org.apache.spark.streaming.dstream.StateDStream: metadataCleanupDelay = -1
15/04/15 15:50:42 [main] INFO org.apache.spark.streaming.dstream.MappedDStream: metadataCleanupDelay = -1
15/04/15 15:50:42 [main] INFO org.apache.spark.streaming.dstream.SocketInputDStream: metadataCleanupDelay = -1
15/04/15 15:50:42 [main] INFO org.apache.spark.streaming.dstream.SocketInputDStream: Slide time = 2000 ms
15/04/15 15:50:42 [main] INFO org.apache.spark.streaming.dstream.SocketInputDStream: Storage level = StorageLevel(false, false, false, false, 1)
15/04/15 15:50:42 [main] INFO org.apache.spark.streaming.dstream.SocketInputDStream: Checkpoint interval = null
15/04/15 15:50:42 [main] INFO org.apache.spark.streaming.dstream.SocketInputDStream: Remember duration = 40000 ms
15/04/14 11:28:20 [Executor task launch worker-1] ERROR org.apache.spark.executor.Executor: Exception in task 1.0 in stage 27554.0 (TID 3801)
java.lang.StackOverflowError
at java.io.ObjectStreamClass.setPrimFieldValues(ObjectStreamClass.java:1243)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1984)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
at java.io.ObjectInputStream.readObject0(ObjectInputStream.java:1351)
at java.io.ObjectInputStream.defaultReadFields(ObjectInputStream.java:1993)
at java.io.ObjectInputStream.readSerialData(ObjectInputStream.java:1918)
at java.io.ObjectInputStream.readOrdinaryObject(ObjectInputStream.java:1801)
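The repeating `readObject0` / `defaultReadFields` frames above are the signature of Java serialization recursing over a very long object graph. With a stateful stream (note the `StateDStream` in the initialization log), each batch's state RDD depends on the previous batch's, so without effective checkpointing the lineage chain grows every batch until (de)serializing a task blows the stack; shortening the checkpoint interval (`ssc.checkpoint(dir)` plus `DStream.checkpoint(interval)`) is the usual remedy, since checkpointing truncates the lineage. The failure mode itself can be reproduced in miniature with any recursive serializer. A minimal Python sketch (the `RDD` class here is an illustrative stand-in, not Spark's):

```python
import pickle


class RDD:
    """Toy stand-in: each object references its parent, like RDD lineage."""

    def __init__(self, parent):
        self.parent = parent


# Build a long dependency chain, as a stateful DStream does batch after
# batch when its lineage is never truncated by a checkpoint.
rdd = None
for _ in range(100000):
    rdd = RDD(rdd)

try:
    # The serializer descends the chain one stack frame per link,
    # just like the nested ObjectInputStream.readObject0 frames above.
    pickle.dumps(rdd)
    overflowed = False
except RecursionError:
    overflowed = True

print("overflowed:", overflowed)
```

A 100,000-link chain comfortably exceeds the default recursion limit, so the dump fails; a short chain (or one periodically "checkpointed" by cutting the `parent` reference) serializes fine.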
@jhu-chang
jhu-chang / FileDStreamIssue
Last active August 29, 2015 14:13
FileDStreamIssue
package org.apache.spark.streaming
import org.apache.spark.SparkConf
import org.apache.spark.streaming.StreamingContext._
import org.apache.spark.streaming.dstream._
import org.apache.hadoop.io.{LongWritable, Text}
import org.apache.hadoop.mapreduce.lib.input.TextInputFormat
import org.apache.hadoop.fs.Path
object FileDStreamTest {
  // The gist body is truncated at this point; the main method below is a
  // hypothetical minimal reconstruction inferred from the imports above.
  def main(args: Array[String]): Unit = {
    val ssc = new StreamingContext(new SparkConf().setAppName("FileDStreamTest"), Seconds(10))
    // Monitor a directory (placeholder path) for new text files.
    ssc.fileStream[LongWritable, Text, TextInputFormat]("/tmp/file-dstream-test")
      .map(_._2.toString).print()
    ssc.start()
    ssc.awaitTermination()
  }
}

# coding=utf-8
import datetime
import sys
import time
import threading
import traceback
import SocketServer
from dnslib import *