failed: [silo-03] => {"changed": true, "cmd": "su - hdfs -c \"export HADOOP_MAPRED_HOME=/usr/lib/hadoop-mapreduce && hadoop jar /usr/lib/hadoop-mapreduce/hadoop-mapreduce-examples.jar grep /source-for-mapreduce-test /outputfile 'hello'\" ", "delta": "0:00:11.452524", "end": "2014-04-10 20:01:31.965314", "rc": 255, "start": "2014-04-10 20:01:20.512790"}
stderr: 14/04/10 20:01:22 INFO client.RMProxy: Connecting to ResourceManager at silo-03/33.12.34.53:8032
14/04/10 20:01:23 WARN mapreduce.JobSubmitter: No job jar file set. User classes may not be found. See Job or Job#setJar(String).
14/04/10 20:01:23 INFO input.FileInputFormat: Total input paths to process : 1
14/04/10 20:01:23 INFO mapreduce.JobSubmitter: number of splits:1
14/04/10 20:01:23 INFO mapreduce.JobSubmitter: Submitting tokens for job: job_1397160042836_0001
14/04/10 20:01:23 INFO mapred.YARNRunner: Job jar is not present. Not adding any jar to the list of resources.
14/04/10 20:01:24 INFO impl.YarnClientImpl: Submitted application application_1397160042836_0001
14/04/10 20:01:24 INFO mapreduce.Job: The url to track the job: http://silo-03:8088/proxy/application_1397160042836_0001/
14/04/10 20:01:24 INFO mapreduce.Job: Running job: job_1397160042836_0001
14/04/10 20:01:31 INFO mapreduce.Job: Job job_1397160042836_0001 running in uber mode : false
14/04/10 20:01:31 INFO mapreduce.Job: map 0% reduce 0%
14/04/10 20:01:31 INFO mapreduce.Job: Job job_1397160042836_0001 failed with state FAILED due to: Application application_1397160042836_0001 failed 2 times due to AM Container for appattempt_1397160042836_0001_000002 exited with exitCode: 1 due to: Exception from container-launch: org.apache.hadoop.util.Shell$ExitCodeException:
org.apache.hadoop.util.Shell$ExitCodeException:
	at org.apache.hadoop.util.Shell.runCommand(Shell.java:505)
	at org.apache.hadoop.util.Shell.run(Shell.java:418)
	at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:650)
	at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:283)
	at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:79)
	at java.util.concurrent.FutureTask.run(FutureTask.java:262)
	at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1145)
	at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
	at java.lang.Thread.run(Thread.java:744)
Container exited with a non-zero exit code 1
.Failing this attempt.. Failing the application.
14/04/10 20:01:31 INFO mapreduce.Job: Counters: 0
14/04/10 20:01:31 INFO client.RMProxy: Connecting to ResourceManager at silo-03/33.12.34.53:8032
14/04/10 20:01:31 WARN mapreduce.JobSubmitter: No job jar file set. User classes may not be found. See Job or Job#setJar(String).
14/04/10 20:01:31 INFO mapreduce.JobSubmitter: Cleaning up the staging area /user/hdfs/.staging/job_1397160042836_0002
14/04/10 20:01:31 WARN security.UserGroupInformation: PriviledgedActionException as:hdfs (auth:SIMPLE) cause:org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://spacestation/user/hdfs/grep-temp-1278052657
org.apache.hadoop.mapreduce.lib.input.InvalidInputException: Input path does not exist: hdfs://spacestation/user/hdfs/grep-temp-1278052657
	at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.listStatus(FileInputFormat.java:285)
	at org.apache.hadoop.mapreduce.lib.input.SequenceFileInputFormat.listStatus(SequenceFileInputFormat.java:59)
	at org.apache.hadoop.mapreduce.lib.input.FileInputFormat.getSplits(FileInputFormat.java:340)
	at org.apache.hadoop.mapreduce.JobSubmitter.writeNewSplits(JobSubmitter.java:493)
	at org.apache.hadoop.mapreduce.JobSubmitter.writeSplits(JobSubmitter.java:510)
	at org.apache.hadoop.mapreduce.JobSubmitter.submitJobInternal(JobSubmitter.java:394)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1295)
	at org.apache.hadoop.mapreduce.Job$10.run(Job.java:1292)
	at java.security.AccessController.doPrivileged(Native Method)
	at javax.security.auth.Subject.doAs(Subject.java:415)
	at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1548)
	at org.apache.hadoop.mapreduce.Job.submit(Job.java:1292)
	at org.apache.hadoop.mapreduce.Job.waitForCompletion(Job.java:1313)
	at org.apache.hadoop.examples.Grep.run(Grep.java:92)
	at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:70)
	at org.apache.hadoop.examples.Grep.main(Grep.java:101)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.ProgramDriver$ProgramDescription.invoke(ProgramDriver.java:72)
	at org.apache.hadoop.util.ProgramDriver.run(ProgramDriver.java:144)
	at org.apache.hadoop.examples.ExampleDriver.main(ExampleDriver.java:74)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:606)
	at org.apache.hadoop.util.RunJar.main(RunJar.java:212)
The most similar issue I've found is documented here: http://tonylixu.blogspot.com/2014/02/mapreduce-with-yarn-troubleshooting.html
I've tried both specifying the done directories explicitly and letting it use the defaults. No success with either.
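For reference, the "specify the done directories explicitly" variant looked roughly like this. The property names are the standard Hadoop 2.x job history ones; the paths here are assumptions chosen to match the HDFS layout further down:

```xml
<!-- mapred-site.xml (sketch): pinning the job history done directories
     explicitly instead of relying on the defaults. -->
<property>
  <name>mapreduce.jobhistory.done-dir</name>
  <value>/user/history/done</value>
</property>
<property>
  <name>mapreduce.jobhistory.intermediate-done-dir</name>
  <value>/user/history/done_intermediate</value>
</property>
```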
Directory layout:
hdfs@silo-01:/home/vagrant$ hadoop fs -ls -R /
-rw-r--r-- 3 hdfs hadoop 67 2014-04-10 20:01 /source-for-mapreduce-test
drwxrwxrwt - hdfs hadoop 0 2014-04-10 20:01 /tmp
drwxr-xr-x - yarn hadoop 0 2014-04-10 20:01 /tmp/hadoop-yarn
drwxrwxrwx - yarn hadoop 0 2014-04-10 20:01 /tmp/hadoop-yarn/fail
drwxr-xr-x - hdfs hadoop 0 2014-04-10 19:58 /user
drwxr-xr-x - hdfs hadoop 0 2014-04-10 20:01 /user/hdfs
drwx------ - hdfs hadoop 0 2014-04-10 20:01 /user/hdfs/.staging
drwx------ - hdfs hadoop 0 2014-04-10 20:01 /user/hdfs/.staging/job_1397160042836_0001
-rw-r--r-- 10 hdfs hadoop 117 2014-04-10 20:01 /user/hdfs/.staging/job_1397160042836_0001/job.split
-rw-r--r-- 3 hdfs hadoop 37 2014-04-10 20:01 /user/hdfs/.staging/job_1397160042836_0001/job.splitmetainfo
-rw-r--r-- 3 hdfs hadoop 74863 2014-04-10 20:01 /user/hdfs/.staging/job_1397160042836_0001/job.xml
drwxrwxrwt - yarn hadoop 0 2014-04-10 19:57 /user/history
drwxr-x--T - yarn hadoop 0 2014-04-10 19:57 /user/history/done
drwxrwxrwt - yarn hadoop 0 2014-04-10 19:57 /user/history/done_intermediate
drwxr-xr-x - mapred hadoop 0 2014-04-10 19:58 /user/mapred
drwxr-xr-x - yarn hadoop 0 2014-04-10 19:58 /user/yarn
drwxr-xr-x - hdfs hadoop 0 2014-04-10 19:57 /var
drwxr-xr-x - hdfs hadoop 0 2014-04-10 19:57 /var/log
drwxr-xr-x - yarn mapred 0 2014-04-10 19:57 /var/log/hadoop-yarn
Oh, would you look at this:
yarn@silo-02:/home/vagrant$ cat /data/yarn/yarn-logs/application_1397160042836_0009/container_1397160042836_0009_02_000001/stderr
Exception in thread "main" java.lang.NoClassDefFoundError: org/apache/hadoop/yarn/exceptions/YarnRuntimeException
at java.lang.Class.getDeclaredMethods0(Native Method)
at java.lang.Class.privateGetDeclaredMethods(Class.java:2531)
at java.lang.Class.getMethod0(Class.java:2774)
at java.lang.Class.getMethod(Class.java:1663)
at sun.launcher.LauncherHelper.getMainMethod(LauncherHelper.java:494)
at sun.launcher.LauncherHelper.checkAndLoadMain(LauncherHelper.java:486)
Caused by: java.lang.ClassNotFoundException: org.apache.hadoop.yarn.exceptions.YarnRuntimeException
at java.net.URLClassLoader$1.run(URLClassLoader.java:366)
at java.net.URLClassLoader$1.run(URLClassLoader.java:355)
at java.security.AccessController.doPrivileged(Native Method)
at java.net.URLClassLoader.findClass(URLClassLoader.java:354)
at java.lang.ClassLoader.loadClass(ClassLoader.java:425)
at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:308)
at java.lang.ClassLoader.loadClass(ClassLoader.java:358)
Imagine, dear reader, a sigh that crumbles the tallest of mountains. With its roots torn from the Earth, the path to Hell is clear. A yellow elephant dances on the shore while the river Styx flows silently by.
Changing YARN_HOME to HADOOP_YARN_HOME in yarn-site.xml remedies all things. HADOOP_YARN_HOME supersedes the United States of America as humanity's last best hope. Achievement unlocked.
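Concretely, the fix amounts to making sure yarn.application.classpath in yarn-site.xml references $HADOOP_YARN_HOME instead of the obsolete $YARN_HOME variable. A sketch of what that property ends up looking like (this mirrors the stock Hadoop 2.x classpath; adjust to your layout):

```xml
<!-- yarn-site.xml (sketch): if the container classpath still says
     $YARN_HOME, the AM can't find the YARN jars and dies with the
     NoClassDefFoundError: YarnRuntimeException shown above. -->
<property>
  <name>yarn.application.classpath</name>
  <value>
    $HADOOP_CONF_DIR,
    $HADOOP_COMMON_HOME/*,$HADOOP_COMMON_HOME/lib/*,
    $HADOOP_HDFS_HOME/*,$HADOOP_HDFS_HOME/lib/*,
    $HADOOP_MAPRED_HOME/*,$HADOOP_MAPRED_HOME/lib/*,
    $HADOOP_YARN_HOME/*,$HADOOP_YARN_HOME/lib/*
  </value>
</property>
```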
Just make sure your symlinks have symlinks to other symlinks that symlink to a shell script that sets up all those environment variables and then executes another shell script that is really a symlink to java.
Running on Debian wheezy.