Last active August 29, 2015 13:58
failed states
failed: [silo-03] => {"changed": true, "cmd": "su - hdfs -c \"export HADOOP_MAPRED_HOME=/usr/lib/hadoop-mapreduce && hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar grep /source-for-mapreduce-test /outputfile 'hello'\" ", "delta": "0:00:11.452524", "end": "2014-04-10 20:01:31.965314", "rc": 255, "start": "2014-04-10 20:01:20.512790"}
stderr: 14/04/10 20:01:22 INFO client.RMProxy: Connecting to ResourceManager at silo-03/33.12.34.53:8032
14/04/10 20:01:23 WARN mapreduce.JobSubmitter: No job jar file set. User classes may not be found. See Job or Job#setJar(String).
14/04/10 20:01:23 INFO input.FileInputFormat: Total input paths to process : 1
14/04/10 20:01:23 INFO mapreduce.JobSubmitter: number of splits:1
14/04/10 20:01:23 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1397160042836_0001
14/04/10 20:01:23 INFO mapred.YARNRunner: Job jar is not present. Not adding any jar to the list of resources.
14/04/10 20:01:24 INFO impl.YarnClientImpl: Submitted application application_1397160042836_0001
14/04/10 20:01:24 INFO mapreduce.Job: The url to track the job: http://silo-03:8088/proxy/application_1397160042836_0001/
14/04/10 20:01:24 INFO mapreduce.Job: Running job: job_1397160042836_0001
14/04/10 20:01:31 INFO mapreduce.Job: Job job_1397160042836_0001 running in uber mode : false
14/04/10 20:01:31 INFO mapreduce.Job: map 0% reduce 0%
14/04/10 20:01:31 INFO mapreduce.Job: Job job_1397160042836_0001 failed with state FAILED due to: Application application_1397160042836_0001 failed 2 times due to AM Container for appattempt_1397160042836_0001_000002 exited with exitCode: 1 due to: Exception from container-launch: org.apache.hadoop.util.Shell$ExitCodeException:
org.apache.hadoop.util.Shell$ExitCodeException:
	at org.apache.hadoop.util.Shell.runCommand(Shell.java:505)
	at org.apache.hadoop.util.Shell.run(Shell.java:418)
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
	at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:283)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:79)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:744)
Container exited with a non-zero exit code 1
.Failing this attempt.. Failing the application.
14/04/10 20:01:31 INFO mapreduce.Job: Counters: 0
14/04/10 20:01:31 INFO client.RMProxy: Connecting to ResourceManager at silo-03/33.12.34.53:8032
14/04/10 20:01:31 WARN mapreduce.JobSubmitter: No job jar file set. User classes may not be found. See Job or Job#setJar(String).
14/04/10 20:01:31 INFO mapreduce.JobSubmitter: Cleaning up the staging area /user/hdfs/.staging/job_1397160042836_0002
14/04/10 20:01:31 WARN security.UserGroupInformation: PriviledgedActionException as:hdfs (auth:SIMPLE) cause:org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://spacestation/user/hdfs/grep-temp-1278052657
org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://spacestation/user/hdfs/grep-temp-1278052657
	at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:285)
	at org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat.listStatus(SequenceFileInputFormat.java:59)
	at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:340)
	at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493)
	at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
	at org.apache.hadoop.examples.Grep.run(Grep.java:92)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.examples.Grep.main(Grep.java:101)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
	at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
	at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
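Note that the client-side output above never shows *why* the AM container exited with code 1; the second stack trace (InvalidInputException on the grep-temp path) is only a downstream symptom, because the examples `grep` driver runs a second job over the first job's output. A rough diagnosis sketch, assuming a live YARN cluster with log aggregation enabled (the application id is taken from the log above; the `command -v` guard is just so the snippet degrades gracefully on a machine without Hadoop):

```shell
#!/bin/sh
# Pull the aggregated container logs for the failed application.
# The real launch error (bad classpath, missing JAVA_HOME, etc.) is
# usually printed in the AM container's stderr, not in the client output.
APP_ID="application_1397160042836_0001"

if command -v yarn >/dev/null 2>&1; then
    # Run as the user that submitted the job (hdfs, per the Ansible task).
    yarn logs -applicationId "$APP_ID"
else
    echo "yarn CLI not found; run this on a cluster node" >&2
fi
```

Failing that, the NodeManager's local container log directory on silo-03 (under `yarn.nodemanager.log-dirs`) holds the same stderr before aggregation kicks in.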
Just make sure your symlinks have symlinks to other symlinks that symlink to a shell script that sets up all those environment variables and then executes another shell script that is really a symlink to java.
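The symlink chain the comment jokes about is real on most Linux Hadoop hosts and easy to inspect. A small sketch (paths are whatever the local install uses; `JAVA_BIN` here is just an illustrative variable name):

```shell
#!/bin/sh
# Resolve the chain of symlinks behind `java` on PATH.
# On typical installs this hops through /etc/alternatives (or a vendor
# wrapper script) before landing on the actual JVM binary.
JAVA_BIN="$(command -v java || echo /usr/bin/java)"

# readlink -f follows every link in the chain and prints the final target.
readlink -f "$JAVA_BIN" 2>/dev/null || echo "no java resolved at $JAVA_BIN"
```

When a container launch dies with a bare exit code 1, checking that this chain (and the environment-exporting wrapper scripts along it) actually resolves on every NodeManager is a cheap first step.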