Created October 24, 2011 08:43
hduser@host:/usr/local/hadoop$ bin/hadoop jar hadoop*examples*.jar wordcount /user/hduser/gutenberg /user/hduser/gutenberg-output
11/10/22 22:03:33 INFO input.FileInputFormat: Total input paths to process : 3
11/10/22 22:03:34 INFO mapred.JobClient: Running job: job_201110222200_0001
11/10/22 22:03:35 INFO mapred.JobClient: map 0% reduce 0%
11/10/22 22:03:54 INFO mapred.JobClient: map 66% reduce 0%
11/10/22 22:04:00 INFO mapred.JobClient: map 100% reduce 0%
11/10/22 22:04:02 INFO mapred.JobClient: Task Id : attempt_201110222200_0001_r_000000_0, Status : FAILED
Error: java.lang.NullPointerException
	at java.util.concurrent.ConcurrentHashMap.get(ConcurrentHashMap.java:768)
	at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.getMapCompletionEvents(ReduceTask.java:2900)
	at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.run(ReduceTask.java:2820)
11/10/22 22:04:11 INFO mapred.JobClient: Task Id : attempt_201110222200_0001_r_000000_1, Status : FAILED
Error: java.lang.NullPointerException
	at java.util.concurrent.ConcurrentHashMap.get(ConcurrentHashMap.java:768)
	at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.getMapCompletionEvents(ReduceTask.java:2900)
	at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.run(ReduceTask.java:2820)
11/10/22 22:04:20 INFO mapred.JobClient: Task Id : attempt_201110222200_0001_r_000000_2, Status : FAILED
Error: java.lang.NullPointerException
	at java.util.concurrent.ConcurrentHashMap.get(ConcurrentHashMap.java:768)
	at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.getMapCompletionEvents(ReduceTask.java:2900)
	at org.apache.hadoop.mapred.ReduceTask$ReduceCopier$GetMapEventsThread.run(ReduceTask.java:2820)
11/10/22 22:04:35 INFO mapred.JobClient: Job complete: job_201110222200_0001
11/10/22 22:04:35 INFO mapred.JobClient: Counters: 20
11/10/22 22:04:35 INFO mapred.JobClient:   Job Counters
11/10/22 22:04:35 INFO mapred.JobClient:     Launched reduce tasks=4
11/10/22 22:04:35 INFO mapred.JobClient:     SLOTS_MILLIS_MAPS=31504
11/10/22 22:04:35 INFO mapred.JobClient:     Total time spent by all reduces waiting after reserving slots (ms)=0
11/10/22 22:04:35 INFO mapred.JobClient:     Total time spent by all maps waiting after reserving slots (ms)=0
11/10/22 22:04:35 INFO mapred.JobClient:     Launched map tasks=3
11/10/22 22:04:35 INFO mapred.JobClient:     Data-local map tasks=3
11/10/22 22:04:35 INFO mapred.JobClient:     Failed reduce tasks=1
11/10/22 22:04:35 INFO mapred.JobClient:     SLOTS_MILLIS_REDUCES=18089
11/10/22 22:04:35 INFO mapred.JobClient:   FileSystemCounters
11/10/22 22:04:35 INFO mapred.JobClient:     FILE_BYTES_READ=740458
11/10/22 22:04:35 INFO mapred.JobClient:     HDFS_BYTES_READ=3671840
11/10/22 22:04:35 INFO mapred.JobClient:     FILE_BYTES_WRITTEN=2278193
11/10/22 22:04:35 INFO mapred.JobClient:   File Input Format Counters
11/10/22 22:04:35 INFO mapred.JobClient:     Bytes Read=3671479
11/10/22 22:04:35 INFO mapred.JobClient:   Map-Reduce Framework
11/10/22 22:04:35 INFO mapred.JobClient:     Map output materialized bytes=1474279
11/10/22 22:04:35 INFO mapred.JobClient:     Combine output records=102317
11/10/22 22:04:35 INFO mapred.JobClient:     Map input records=77931
11/10/22 22:04:35 INFO mapred.JobClient:     Spilled Records=153630
11/10/22 22:04:35 INFO mapred.JobClient:     Map output bytes=6076039
11/10/22 22:04:35 INFO mapred.JobClient:     Combine input records=629167
11/10/22 22:04:35 INFO mapred.JobClient:     Map output records=629167
11/10/22 22:04:35 INFO mapred.JobClient:     SPLIT_RAW_BYTES=361
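A note on the stack trace above (not part of the original log): `java.util.concurrent.ConcurrentHashMap` does not permit null keys, and `get(null)` throws `NullPointerException`. So the `GetMapEventsThread` most likely performed a map lookup with a null key; in this Hadoop version that key is typically derived from a task-tracker host name, and this symptom is often reported alongside hostname-resolution problems (e.g. an inconsistent `/etc/hosts`). That diagnosis is an assumption based on the trace, not something the log itself confirms. A minimal sketch of the failure mode:

```java
import java.util.concurrent.ConcurrentHashMap;

public class NullKeyDemo {
    public static void main(String[] args) {
        // ConcurrentHashMap rejects null keys and values, unlike HashMap.
        ConcurrentHashMap<String, Integer> mapLocations = new ConcurrentHashMap<>();
        mapLocations.put("tracker-host-1", 1);

        // A host name that failed to resolve can surface as a null lookup key,
        // which reproduces the exception seen in ConcurrentHashMap.get above.
        String unresolvedHost = null;
        try {
            mapLocations.get(unresolvedHost);
            System.out.println("no exception");
        } catch (NullPointerException e) {
            System.out.println("NullPointerException on null key");
        }
    }
}
```

If hostname resolution is the culprit, verifying that the machine's hostname resolves to a reachable address (and restarting the cluster daemons afterwards) is a reasonable first check.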