@kawamon
Last active December 15, 2015 19:08
Exception log for the case where the Key type specified in the driver is Text, but the Reducer's Key type is LongWritable.
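For reference, a minimal sketch of the kind of mismatch that produces the "wrong key class" IOException in the log below: the driver declares Text as the job's output key class, while the reducer writes LongWritable keys, so SequenceFileOutputFormat rejects the write at reduce time. WordCountDriver and SumReducer are the names taken from the log; the mapper class and the exact method bodies are assumptions, not the original code.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;
import org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat;

public class WordCountDriver {

  // Illustrative mapper (name and body are assumptions): emits <word, 1> pairs.
  public static class WordMapper
      extends Mapper<LongWritable, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();

    @Override
    public void map(LongWritable key, Text value, Context context)
        throws IOException, InterruptedException {
      for (String token : value.toString().split("\\W+")) {
        if (!token.isEmpty()) {
          word.set(token);
          context.write(word, ONE);
        }
      }
    }
  }

  // Reducer whose output key type is LongWritable, unlike what the driver declares.
  public static class SumReducer
      extends Reducer<Text, IntWritable, LongWritable, IntWritable> {
    @Override
    public void reduce(Text key, Iterable<IntWritable> values, Context context)
        throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable value : values) {
        sum += value.get();
      }
      // This write() is what fails at reduce time: SequenceFileOutputFormat was told
      // (via setOutputKeyClass below) to expect Text keys, not LongWritable.
      context.write(new LongWritable(sum), new IntWritable(sum));
    }
  }

  public static void main(String[] args) throws Exception {
    Job job = new Job(new Configuration(), "wordcount");
    job.setJarByClass(WordCountDriver.class);
    job.setMapperClass(WordMapper.class);
    job.setReducerClass(SumReducer.class);
    job.setOutputFormatClass(SequenceFileOutputFormat.class);

    job.setMapOutputKeyClass(Text.class);
    job.setMapOutputValueClass(IntWritable.class);

    // Mismatch: the driver declares Text as the job's output key class,
    // but SumReducer actually writes LongWritable keys.
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);

    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}

The fix is to make the two sides agree, for example by having SumReducer declare and emit Text keys, or by changing the driver to call job.setOutputKeyClass(LongWritable.class). The actual log of the mismatched run follows.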
[training@localhost seqoutput]$ hadoop jar wc.jar WordCountDriver -D mapreduce.map.log.level=DEBUG -D mapred.map.child.log.level=DEBUG -D mapred.reduce.child.log.level=DEBUG shakespeare outs2
13/04/04 03:19:00 WARN mapred.JobClient: Use GenericOptionsParser for parsing the arguments. Applications should implement Tool for the same.
13/04/04 03:19:01 INFO input.FileInputFormat: Total input paths to process : 5
13/04/04 03:19:01 WARN snappy.LoadSnappy: Snappy native library is available
13/04/04 03:19:01 INFO snappy.LoadSnappy: Snappy native library loaded
13/04/04 03:19:01 INFO mapred.JobClient: Running job: job_201304040226_0013
13/04/04 03:19:02 INFO mapred.JobClient: map 0% reduce 0%
13/04/04 03:19:10 INFO mapred.JobClient: map 40% reduce 0%
13/04/04 03:19:15 INFO mapred.JobClient: map 60% reduce 0%
13/04/04 03:19:17 INFO mapred.JobClient: map 80% reduce 0%
13/04/04 03:19:18 INFO mapred.JobClient: map 100% reduce 0%
13/04/04 03:19:22 INFO mapred.JobClient: Task Id : attempt_201304040226_0013_r_000000_0, Status : FAILED
java.io.IOException: wrong key class: org.apache.hadoop.io.LongWritable is not class org.apache.hadoop.io.Text
at org.apache.hadoop.io.SequenceFile$Writer.append(SequenceFile.java:1266)
at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat$1.write(SequenceFileOutputFormat.java:74)
at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:537)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:88)
at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.write(WrappedReducer.java:99)
at SumReducer.reduce(SumReducer.java:20)
at SumReducer.reduce(SumReducer.java:11)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:164)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:595)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:433)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subj
attempt_201304040226_0013_r_000000_0: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201304040226_0013_r_000000_0: log4j:WARN Please initialize the log4j system properly.
attempt_201304040226_0013_r_000000_0: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/04/04 03:19:27 INFO mapred.JobClient: Task Id : attempt_201304040226_0013_r_000000_1, Status : FAILED
java.io.IOException: wrong key class: org.apache.hadoop.io.LongWritable is not class org.apache.hadoop.io.Text
at org.apache.hadoop.io.SequenceFile$Writer.append(SequenceFile.java:1266)
at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat$1.write(SequenceFileOutputFormat.java:74)
at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:537)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:88)
at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.write(WrappedReducer.java:99)
at SumReducer.reduce(SumReducer.java:20)
at SumReducer.reduce(SumReducer.java:11)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:164)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:595)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:433)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subj
attempt_201304040226_0013_r_000000_1: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201304040226_0013_r_000000_1: log4j:WARN Please initialize the log4j system properly.
attempt_201304040226_0013_r_000000_1: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/04/04 03:19:33 INFO mapred.JobClient: Task Id : attempt_201304040226_0013_r_000000_2, Status : FAILED
java.io.IOException: wrong key class: org.apache.hadoop.io.LongWritable is not class org.apache.hadoop.io.Text
at org.apache.hadoop.io.SequenceFile$Writer.append(SequenceFile.java:1266)
at org.apache.hadoop.mapreduce.lib.output.SequenceFileOutputFormat$1.write(SequenceFileOutputFormat.java:74)
at org.apache.hadoop.mapred.ReduceTask$NewTrackingRecordWriter.write(ReduceTask.java:537)
at org.apache.hadoop.mapreduce.task.TaskInputOutputContextImpl.write(TaskInputOutputContextImpl.java:88)
at org.apache.hadoop.mapreduce.lib.reduce.WrappedReducer$Context.write(WrappedReducer.java:99)
at SumReducer.reduce(SumReducer.java:20)
at SumReducer.reduce(SumReducer.java:11)
at org.apache.hadoop.mapreduce.Reducer.run(Reducer.java:164)
at org.apache.hadoop.mapred.ReduceTask.runNewReducer(ReduceTask.java:595)
at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:433)
at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
at java.security.AccessController.doPrivileged(Native Method)
at javax.security.auth.Subj
attempt_201304040226_0013_r_000000_2: log4j:WARN No appenders could be found for logger (org.apache.hadoop.hdfs.DFSClient).
attempt_201304040226_0013_r_000000_2: log4j:WARN Please initialize the log4j system properly.
attempt_201304040226_0013_r_000000_2: log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
13/04/04 03:19:40 INFO mapred.JobClient: Job complete: job_201304040226_0013
13/04/04 03:19:40 INFO mapred.JobClient: Counters: 29
13/04/04 03:19:40 INFO mapred.JobClient: File System Counters
13/04/04 03:19:40 INFO mapred.JobClient: FILE: Number of bytes read=10154450
13/04/04 03:19:40 INFO mapred.JobClient: FILE: Number of bytes written=21901615
13/04/04 03:19:40 INFO mapred.JobClient: FILE: Number of read operations=0
13/04/04 03:19:40 INFO mapred.JobClient: FILE: Number of large read operations=0
13/04/04 03:19:40 INFO mapred.JobClient: FILE: Number of write operations=0
13/04/04 03:19:40 INFO mapred.JobClient: HDFS: Number of bytes read=5343801
13/04/04 03:19:40 INFO mapred.JobClient: HDFS: Number of bytes written=0
13/04/04 03:19:40 INFO mapred.JobClient: HDFS: Number of read operations=10
13/04/04 03:19:40 INFO mapred.JobClient: HDFS: Number of large read operations=0
13/04/04 03:19:40 INFO mapred.JobClient: HDFS: Number of write operations=0
13/04/04 03:19:40 INFO mapred.JobClient: Job Counters
13/04/04 03:19:40 INFO mapred.JobClient: Failed reduce tasks=1
13/04/04 03:19:40 INFO mapred.JobClient: Launched map tasks=5
13/04/04 03:19:40 INFO mapred.JobClient: Launched reduce tasks=4
13/04/04 03:19:40 INFO mapred.JobClient: Data-local map tasks=5
13/04/04 03:19:40 INFO mapred.JobClient: Total time spent by all maps in occupied slots (ms)=28008
13/04/04 03:19:40 INFO mapred.JobClient: Total time spent by all reduces in occupied slots (ms)=28718
13/04/04 03:19:40 INFO mapred.JobClient: Total time spent by all maps waiting after reserving slots (ms)=0
13/04/04 03:19:40 INFO mapred.JobClient: Total time spent by all reduces waiting after reserving slots (ms)=0
13/04/04 03:19:40 INFO mapred.JobClient: Map-Reduce Framework
13/04/04 03:19:40 INFO mapred.JobClient: Map input records=175558
13/04/04 03:19:40 INFO mapred.JobClient: Map output records=974078
13/04/04 03:19:40 INFO mapred.JobClient: Map output bytes=8880434
13/04/04 03:19:40 INFO mapred.JobClient: Input split bytes=594
13/04/04 03:19:40 INFO mapred.JobClient: Combine input records=0
13/04/04 03:19:40 INFO mapred.JobClient: Combine output records=0
13/04/04 03:19:40 INFO mapred.JobClient: Spilled Records=1888319
13/04/04 03:19:40 INFO mapred.JobClient: CPU time spent (ms)=5130
13/04/04 03:19:40 INFO mapred.JobClient: Physical memory (bytes) snapshot=906711040
13/04/04 03:19:40 INFO mapred.JobClient: Virtual memory (bytes) snapshot=1937862656
13/04/04 03:19:40 INFO mapred.JobClient: Total committed heap usage (bytes)=802508800
[training@localhost seqoutput]$