Created January 19, 2017 16:57
Running LastCountryOfUser
969637439385692816-temp-2017-01-19T16-33-41.818263
2017-01-19 16:33:45,906 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:33:45 WARN streaming.StreamJob: -file option is deprecated, please use generic option -files instead.
2017-01-19 16:33:49,043 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:33:49 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
2017-01-19 16:33:49,544 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:33:49 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032
2017-01-19 16:33:52,698 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:33:52 INFO mapred.FileInputFormat: Total input paths to process : 1
2017-01-19 16:33:52,879 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:33:52 INFO mapreduce.JobSubmitter: number of splits:2
2017-01-19 16:33:52,887 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:33:52 INFO Configuration.deprecation: mapred.job.name is deprecated. Instead, use mapreduce.job.name
2017-01-19 16:33:52,904 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:33:52 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. Instead, use mapreduce.job.reduces
2017-01-19 16:33:53,236 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:33:53 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1484841390427_0002
2017-01-19 16:33:53,873 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:33:53 INFO impl.YarnClientImpl: Submitted application application_1484841390427_0002
2017-01-19 16:33:54,003 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:33:54 INFO mapreduce.Job: The url to track the job: http://ip-172-30-0-176:8088/proxy/application_1484841390427_0002/
2017-01-19 16:33:54,008 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:33:54 INFO mapreduce.Job: Running job: job_1484841390427_0002
2017-01-19 16:34:12,591 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:34:12 INFO mapreduce.Job: Job job_1484841390427_0002 running in uber mode : false
2017-01-19 16:34:12,601 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:34:12 INFO mapreduce.Job: map 0% reduce 0%
2017-01-19 16:34:31,414 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:34:31 INFO mapreduce.Job: map 50% reduce 0%
2017-01-19 16:34:32,445 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:34:32 INFO mapreduce.Job: map 100% reduce 0%
2017-01-19 16:34:47,727 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:34:47 INFO mapreduce.Job: map 100% reduce 100%
2017-01-19 16:35:02,029 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:35:02 INFO mapreduce.Job: Task Id : attempt_1484841390427_0002_r_000000_0, Status : FAILED
2017-01-19 16:35:02,103 INFO 10701 [luigi-interface] hadoop.py:273 - Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
2017-01-19 16:35:02,103 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
2017-01-19 16:35:02,103 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
2017-01-19 16:35:02,104 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.streaming.PipeReducer.close(PipeReducer.java:134)
2017-01-19 16:35:02,104 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)
2017-01-19 16:35:02,104 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)
2017-01-19 16:35:02,104 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
2017-01-19 16:35:02,105 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
2017-01-19 16:35:02,105 INFO 10701 [luigi-interface] hadoop.py:273 - at java.security.AccessController.doPrivileged(Native Method)
2017-01-19 16:35:02,105 INFO 10701 [luigi-interface] hadoop.py:273 - at javax.security.auth.Subject.doAs(Subject.java:422)
2017-01-19 16:35:02,105 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
2017-01-19 16:35:02,105 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
2017-01-19 16:35:03,155 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:35:03 INFO mapreduce.Job: map 100% reduce 0%
2017-01-19 16:35:21,519 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:35:21 INFO mapreduce.Job: map 100% reduce 100%
2017-01-19 16:35:35,732 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:35:35 INFO mapreduce.Job: Task Id : attempt_1484841390427_0002_r_000000_1, Status : FAILED
2017-01-19 16:35:35,734 INFO 10701 [luigi-interface] hadoop.py:273 - Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
2017-01-19 16:35:35,734 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
2017-01-19 16:35:35,734 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
2017-01-19 16:35:35,735 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.streaming.PipeReducer.close(PipeReducer.java:134)
2017-01-19 16:35:35,735 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)
2017-01-19 16:35:35,735 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)
2017-01-19 16:35:35,736 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
2017-01-19 16:35:35,737 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
2017-01-19 16:35:35,738 INFO 10701 [luigi-interface] hadoop.py:273 - at java.security.AccessController.doPrivileged(Native Method)
2017-01-19 16:35:35,738 INFO 10701 [luigi-interface] hadoop.py:273 - at javax.security.auth.Subject.doAs(Subject.java:422)
2017-01-19 16:35:35,738 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
2017-01-19 16:35:35,739 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
2017-01-19 16:35:36,741 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:35:36 INFO mapreduce.Job: map 100% reduce 0%
2017-01-19 16:35:50,959 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:35:50 INFO mapreduce.Job: map 100% reduce 100%
2017-01-19 16:36:06,279 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:36:06 INFO mapreduce.Job: Task Id : attempt_1484841390427_0002_r_000000_2, Status : FAILED
2017-01-19 16:36:06,280 INFO 10701 [luigi-interface] hadoop.py:273 - Error: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1
2017-01-19 16:36:06,280 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)
2017-01-19 16:36:06,281 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)
2017-01-19 16:36:06,281 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.streaming.PipeReducer.close(PipeReducer.java:134)
2017-01-19 16:36:06,281 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)
2017-01-19 16:36:06,281 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)
2017-01-19 16:36:06,281 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)
2017-01-19 16:36:06,282 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)
2017-01-19 16:36:06,282 INFO 10701 [luigi-interface] hadoop.py:273 - at java.security.AccessController.doPrivileged(Native Method)
2017-01-19 16:36:06,282 INFO 10701 [luigi-interface] hadoop.py:273 - at javax.security.auth.Subject.doAs(Subject.java:422)
2017-01-19 16:36:06,282 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
2017-01-19 16:36:06,282 INFO 10701 [luigi-interface] hadoop.py:273 - at org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)
2017-01-19 16:36:07,327 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:36:07 INFO mapreduce.Job: map 100% reduce 0%
2017-01-19 16:36:23,711 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:36:23 INFO mapreduce.Job: map 100% reduce 100%
2017-01-19 16:36:38,999 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:36:38 INFO mapreduce.Job: Job job_1484841390427_0002 failed with state FAILED due to: Task failed task_1484841390427_0002_r_000000
2017-01-19 16:36:38,999 INFO 10701 [luigi-interface] hadoop.py:273 - Job failed as tasks failed. failedMaps:0 failedReduces:1
2017-01-19 16:36:39,356 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:36:39 INFO mapreduce.Job: Counters: 37
2017-01-19 16:36:39,356 INFO 10701 [luigi-interface] hadoop.py:273 - File System Counters
2017-01-19 16:36:39,356 INFO 10701 [luigi-interface] hadoop.py:273 - FILE: Number of bytes read=0
2017-01-19 16:36:39,356 INFO 10701 [luigi-interface] hadoop.py:273 - FILE: Number of bytes written=195423
2017-01-19 16:36:39,357 INFO 10701 [luigi-interface] hadoop.py:273 - FILE: Number of read operations=0
2017-01-19 16:36:39,357 INFO 10701 [luigi-interface] hadoop.py:273 - FILE: Number of large read operations=0
2017-01-19 16:36:39,357 INFO 10701 [luigi-interface] hadoop.py:273 - FILE: Number of write operations=0
2017-01-19 16:36:39,357 INFO 10701 [luigi-interface] hadoop.py:273 - HDFS: Number of bytes read=686974
2017-01-19 16:36:39,357 INFO 10701 [luigi-interface] hadoop.py:273 - HDFS: Number of bytes written=0
2017-01-19 16:36:39,366 INFO 10701 [luigi-interface] hadoop.py:273 - HDFS: Number of read operations=6
2017-01-19 16:36:39,366 INFO 10701 [luigi-interface] hadoop.py:273 - HDFS: Number of large read operations=0
2017-01-19 16:36:39,366 INFO 10701 [luigi-interface] hadoop.py:273 - HDFS: Number of write operations=0
2017-01-19 16:36:39,366 INFO 10701 [luigi-interface] hadoop.py:273 - Job Counters
2017-01-19 16:36:39,366 INFO 10701 [luigi-interface] hadoop.py:273 - Failed reduce tasks=4
2017-01-19 16:36:39,367 INFO 10701 [luigi-interface] hadoop.py:273 - Launched map tasks=2
2017-01-19 16:36:39,367 INFO 10701 [luigi-interface] hadoop.py:273 - Launched reduce tasks=4
2017-01-19 16:36:39,367 INFO 10701 [luigi-interface] hadoop.py:273 - Data-local map tasks=2
2017-01-19 16:36:39,367 INFO 10701 [luigi-interface] hadoop.py:273 - Total time spent by all maps in occupied slots (ms)=35481
2017-01-19 16:36:39,367 INFO 10701 [luigi-interface] hadoop.py:273 - Total time spent by all reduces in occupied slots (ms)=115440
2017-01-19 16:36:39,367 INFO 10701 [luigi-interface] hadoop.py:273 - Total time spent by all map tasks (ms)=35481
2017-01-19 16:36:39,367 INFO 10701 [luigi-interface] hadoop.py:273 - Total time spent by all reduce tasks (ms)=115440
2017-01-19 16:36:39,367 INFO 10701 [luigi-interface] hadoop.py:273 - Total vcore-seconds taken by all map tasks=35481
2017-01-19 16:36:39,367 INFO 10701 [luigi-interface] hadoop.py:273 - Total vcore-seconds taken by all reduce tasks=115440
2017-01-19 16:36:39,368 INFO 10701 [luigi-interface] hadoop.py:273 - Total megabyte-seconds taken by all map tasks=36332544
2017-01-19 16:36:39,368 INFO 10701 [luigi-interface] hadoop.py:273 - Total megabyte-seconds taken by all reduce tasks=118210560
2017-01-19 16:36:39,369 INFO 10701 [luigi-interface] hadoop.py:273 - Map-Reduce Framework
2017-01-19 16:36:39,369 INFO 10701 [luigi-interface] hadoop.py:273 - Map input records=1742
2017-01-19 16:36:39,369 INFO 10701 [luigi-interface] hadoop.py:273 - Map output records=167
2017-01-19 16:36:39,369 INFO 10701 [luigi-interface] hadoop.py:273 - Map output bytes=15809
2017-01-19 16:36:39,369 INFO 10701 [luigi-interface] hadoop.py:273 - Map output materialized bytes=16155
2017-01-19 16:36:39,369 INFO 10701 [luigi-interface] hadoop.py:273 - Input split bytes=182
2017-01-19 16:36:39,369 INFO 10701 [luigi-interface] hadoop.py:273 - Combine input records=0
2017-01-19 16:36:39,369 INFO 10701 [luigi-interface] hadoop.py:273 - Spilled Records=167
2017-01-19 16:36:39,370 INFO 10701 [luigi-interface] hadoop.py:273 - Failed Shuffles=0
2017-01-19 16:36:39,370 INFO 10701 [luigi-interface] hadoop.py:273 - Merged Map outputs=0
2017-01-19 16:36:39,370 INFO 10701 [luigi-interface] hadoop.py:273 - GC time elapsed (ms)=558
2017-01-19 16:36:39,370 INFO 10701 [luigi-interface] hadoop.py:273 - CPU time spent (ms)=1970
2017-01-19 16:36:39,370 INFO 10701 [luigi-interface] hadoop.py:273 - Physical memory (bytes) snapshot=415866880
2017-01-19 16:36:39,373 INFO 10701 [luigi-interface] hadoop.py:273 - Virtual memory (bytes) snapshot=4414054400
2017-01-19 16:36:39,373 INFO 10701 [luigi-interface] hadoop.py:273 - Total committed heap usage (bytes)=317587456
2017-01-19 16:36:39,382 INFO 10701 [luigi-interface] hadoop.py:273 - File Input Format Counters
2017-01-19 16:36:39,382 INFO 10701 [luigi-interface] hadoop.py:273 - Bytes Read=686792
2017-01-19 16:36:39,382 INFO 10701 [luigi-interface] hadoop.py:273 - 17/01/19 16:36:39 ERROR streaming.StreamJob: Job not Successful!
2017-01-19 16:36:39,383 INFO 10701 [luigi-interface] hadoop.py:273 - Streaming Command Failed!
2017-01-19 16:36:39,403 ERROR 10701 [luigi-interface] worker.py:304 - [pid 10701] Worker Worker(salt=761015386, host=ip-172-30-0-176, username=hadoop, pid=10701) failed LastDailyIpAddressOfUserTask(source=('hdfs://localhost:9000/data/',), interval=2017-01-03-2017-01-19, expand_interval=2 days, 0:00:00, pattern=('.*tracking.log.*',), date_pattern=%Y%m%d, warehouse_path=hdfs://localhost:9000/edx-analytics-pipeline/warehouse/)
Traceback (most recent call last):
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/worker.py", line 292, in _run_task
    task.run()
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/edx/analytics/tasks/insights/location_per_course.py", line 163, in run
    super(LastDailyIpAddressOfUserTask, self).run()
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/hadoop.py", line 612, in run
    self.job_runner().run_job(self)
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/hadoop.py", line 482, in run_job
    run_and_track_hadoop_job(arglist)
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/hadoop.py", line 318, in run_and_track_hadoop_job
    return track_process(arglist, tracking_url_callback, env)
  File "/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/hadoop.py", line 302, in track_process
    raise HadoopJobError(message + 'Also, no tracking url found.', out, err)
HadoopJobError: ('Streaming job failed with exit code 1. Also, no tracking url found.', 'packageJobJar: [/var/lib/analytics-tasks/analyticstack/venv/local/lib/python2.7/site-packages/luigi/mrrunner.py, /tmp/tmp7SE7Lw/packages.tar, /tmp/tmp7SE7Lw/job-instance.pickle, /tmp/hadoop-hadoop/hadoop-unjar3829775231970839201/] [] /tmp/streamjob6930265245717095292.jar tmpDir=null\n', '17/01/19 16:33:45 WARN streaming.StreamJob: -file option is deprecated, please use generic option -files instead.\n17/01/19 16:33:49 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032\n17/01/19 16:33:49 INFO client.RMProxy: Connecting to ResourceManager at /0.0.0.0:8032\n17/01/19 16:33:52 INFO mapred.FileInputFormat: Total input paths to process : 1\n17/01/19 16:33:52 INFO mapreduce.JobSubmitter: number of splits:2\n17/01/19 16:33:52 INFO Configuration.deprecation: mapred.job.name is deprecated. Instead, use mapreduce.job.name\n17/01/19 16:33:52 INFO Configuration.deprecation: mapred.reduce.tasks is deprecated. 
Instead, use mapreduce.job.reduces\n17/01/19 16:33:53 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1484841390427_0002\n17/01/19 16:33:53 INFO impl.YarnClientImpl: Submitted application application_1484841390427_0002\n17/01/19 16:33:54 INFO mapreduce.Job: The url to track the job: http://ip-172-30-0-176:8088/proxy/application_1484841390427_0002/\n17/01/19 16:33:54 INFO mapreduce.Job: Running job: job_1484841390427_0002\n17/01/19 16:34:12 INFO mapreduce.Job: Job job_1484841390427_0002 running in uber mode : false\n17/01/19 16:34:12 INFO mapreduce.Job: map 0% reduce 0%\n17/01/19 16:34:31 INFO mapreduce.Job: map 50% reduce 0%\n17/01/19 16:34:32 INFO mapreduce.Job: map 100% reduce 0%\n17/01/19 16:34:47 INFO mapreduce.Job: map 100% reduce 100%\n17/01/19 16:35:02 INFO mapreduce.Job: Task Id : attempt_1484841390427_0002_r_000000_0, Status : FAILED\nError: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1\n\tat org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)\n\tat org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)\n\tat org.apache.hadoop.streaming.PipeReducer.close(PipeReducer.java:134)\n\tat org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)\n\tat org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)\n\tat org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)\n\tat org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)\n\tat java.security.AccessController.doPrivileged(Native Method)\n\tat javax.security.auth.Subject.doAs(Subject.java:422)\n\tat org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)\n\tat org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)\n\n17/01/19 16:35:03 INFO mapreduce.Job: map 100% reduce 0%\n17/01/19 16:35:21 INFO mapreduce.Job: map 100% reduce 100%\n17/01/19 16:35:35 INFO mapreduce.Job: Task Id : attempt_1484841390427_0002_r_000000_1, Status : FAILED\nError: 
java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1\n\tat org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)\n\tat org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)\n\tat org.apache.hadoop.streaming.PipeReducer.close(PipeReducer.java:134)\n\tat org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)\n\tat org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)\n\tat org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)\n\tat org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)\n\tat java.security.AccessController.doPrivileged(Native Method)\n\tat javax.security.auth.Subject.doAs(Subject.java:422)\n\tat org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)\n\tat org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)\n\n17/01/19 16:35:36 INFO mapreduce.Job: map 100% reduce 0%\n17/01/19 16:35:50 INFO mapreduce.Job: map 100% reduce 100%\n17/01/19 16:36:06 INFO mapreduce.Job: Task Id : attempt_1484841390427_0002_r_000000_2, Status : FAILED\nError: java.lang.RuntimeException: PipeMapRed.waitOutputThreads(): subprocess failed with code 1\n\tat org.apache.hadoop.streaming.PipeMapRed.waitOutputThreads(PipeMapRed.java:320)\n\tat org.apache.hadoop.streaming.PipeMapRed.mapRedFinished(PipeMapRed.java:533)\n\tat org.apache.hadoop.streaming.PipeReducer.close(PipeReducer.java:134)\n\tat org.apache.hadoop.io.IOUtils.cleanup(IOUtils.java:237)\n\tat org.apache.hadoop.mapred.ReduceTask.runOldReducer(ReduceTask.java:459)\n\tat org.apache.hadoop.mapred.ReduceTask.run(ReduceTask.java:392)\n\tat org.apache.hadoop.mapred.YarnChild$2.run(YarnChild.java:168)\n\tat java.security.AccessController.doPrivileged(Native Method)\n\tat javax.security.auth.Subject.doAs(Subject.java:422)\n\tat org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)\n\tat 
org.apache.hadoop.mapred.YarnChild.main(YarnChild.java:163)\n\n17/01/19 16:36:07 INFO mapreduce.Job: map 100% reduce 0%\n17/01/19 16:36:23 INFO mapreduce.Job: map 100% reduce 100%\n17/01/19 16:36:38 INFO mapreduce.Job: Job job_1484841390427_0002 failed with state FAILED due to: Task failed task_1484841390427_0002_r_000000\nJob failed as tasks failed. failedMaps:0 failedReduces:1\n\n17/01/19 16:36:39 INFO mapreduce.Job: Counters: 37\n\tFile System Counters\n\t\tFILE: Number of bytes read=0\n\t\tFILE: Number of bytes written=195423\n\t\tFILE: Number of read operations=0\n\t\tFILE: Number of large read operations=0\n\t\tFILE: Number of write operations=0\n\t\tHDFS: Number of bytes read=686974\n\t\tHDFS: Number of bytes written=0\n\t\tHDFS: Number of read operations=6\n\t\tHDFS: Number of large read operations=0\n\t\tHDFS: Number of write operations=0\n\tJob Counters \n\t\tFailed reduce tasks=4\n\t\tLaunched map tasks=2\n\t\tLaunched reduce tasks=4\n\t\tData-local map tasks=2\n\t\tTotal time spent by all maps in occupied slots (ms)=35481\n\t\tTotal time spent by all reduces in occupied slots (ms)=115440\n\t\tTotal time spent by all map tasks (ms)=35481\n\t\tTotal time spent by all reduce tasks (ms)=115440\n\t\tTotal vcore-seconds taken by all map tasks=35481\n\t\tTotal vcore-seconds taken by all reduce tasks=115440\n\t\tTotal megabyte-seconds taken by all map tasks=36332544\n\t\tTotal megabyte-seconds taken by all reduce tasks=118210560\n\tMap-Reduce Framework\n\t\tMap input records=1742\n\t\tMap output records=167\n\t\tMap output bytes=15809\n\t\tMap output materialized bytes=16155\n\t\tInput split bytes=182\n\t\tCombine input records=0\n\t\tSpilled Records=167\n\t\tFailed Shuffles=0\n\t\tMerged Map outputs=0\n\t\tGC time elapsed (ms)=558\n\t\tCPU time spent (ms)=1970\n\t\tPhysical memory (bytes) snapshot=415866880\n\t\tVirtual memory (bytes) snapshot=4414054400\n\t\tTotal committed heap usage (bytes)=317587456\n\tFile Input Format Counters \n\t\tBytes 
Read=686792\n17/01/19 16:36:39 ERROR streaming.StreamJob: Job not Successful!\nStreaming Command Failed!\n')
2017-01-19 16:36:39,405 INFO 10701 [luigi-interface] notifications.py:96 - Skipping error email. Set `error-email` in the `core` section of the luigi config file to receive error emails.
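The "Skipping error email" message above points at luigi's notification settings. A minimal sketch of the `luigi.cfg` section it refers to (the address is a placeholder, and an SMTP setup or similar is still needed for mail to actually go out):

```ini
[core]
error-email = alerts@example.com
```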
2017-01-19 16:36:47,680 INFO 10701 [luigi-interface] worker.py:337 - Done
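Note that "PipeMapRed.waitOutputThreads(): subprocess failed with code 1" only says that the streaming reducer process exited non-zero; the actual Python traceback is not in this log. One common way to surface it is to wrap the reducer entry point so any exception is written to stderr (where it lands in the YARN container logs) before exiting. A generic sketch, not the pipeline's actual reducer; `reduce_lines` and its last-value-wins logic are hypothetical:

```python
import sys
import traceback


def reduce_lines(lines):
    """Hypothetical reducer body: keep the last value seen per tab-separated key."""
    last = {}
    for line in lines:
        key, _, value = line.rstrip("\n").partition("\t")
        last[key] = value
    return ["%s\t%s" % (k, v) for k, v in sorted(last.items())]


def main():
    try:
        for out_line in reduce_lines(sys.stdin):
            print(out_line)
    except Exception:
        # Write the full traceback to stderr so it appears in the YARN
        # container logs instead of being reported only as "exit code 1".
        traceback.print_exc(file=sys.stderr)
        sys.exit(1)


if __name__ == "__main__":
    main()
```

On a live cluster, `yarn logs -applicationId application_1484841390427_0002` (the application ID from the log above) should retrieve the container logs, including anything the reducer wrote to stderr.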