snowplow-emr-error3
17/08/30 17:59:48 INFO RMProxy: Connecting to ResourceManager at ip-172-31-11-77.ec2.internal/172.31.11.77:8032
17/08/30 17:59:48 INFO Client: Requesting a new application from cluster with 1 NodeManagers
17/08/30 17:59:49 INFO Client: Verifying our application has not requested more than the maximum memory capability of the cluster (2048 MB per container)
17/08/30 17:59:49 INFO Client: Will allocate AM container, with 1024 MB memory including 384 MB overhead
17/08/30 17:59:49 INFO Client: Setting up container launch context for our AM
17/08/30 17:59:49 INFO Client: Setting up the launch environment for our AM container
17/08/30 17:59:49 INFO Client: Preparing resources for our AM container
17/08/30 17:59:53 WARN Client: Neither spark.yarn.jars nor spark.yarn.archive is set, falling back to uploading libraries under SPARK_HOME.
17/08/30 17:59:58 INFO Client: Uploading resource file:/mnt/tmp/spark-c6fb60da-3108-4a58-bf8c-4cffb559850e/__spark_libs__2753975237409157279.zip -> hdfs://ip-172-31-11-77.ec2.internal:8020/user/hadoop/.sparkStaging/application_1504115279181_0003/__spark_libs__2753975237409157279.zip
17/08/30 18:00:08 INFO Client: Uploading resource s3://snowplow-hosted-assets-us-east-1/3-enrich/spark-enrich/snowplow-spark-enrich-1.9.0.jar -> hdfs://ip-172-31-11-77.ec2.internal:8020/user/hadoop/.sparkStaging/application_1504115279181_0003/snowplow-spark-enrich-1.9.0.jar
17/08/30 18:00:08 INFO S3NativeFileSystem: Opening 's3://snowplow-hosted-assets-us-east-1/3-enrich/spark-enrich/snowplow-spark-enrich-1.9.0.jar' for reading
17/08/30 18:00:14 INFO Client: Uploading resource file:/mnt/tmp/spark-c6fb60da-3108-4a58-bf8c-4cffb559850e/__spark_conf__2809700159682013988.zip -> hdfs://ip-172-31-11-77.ec2.internal:8020/user/hadoop/.sparkStaging/application_1504115279181_0003/__spark_conf__.zip
17/08/30 18:00:14 INFO SecurityManager: Changing view acls to: hadoop
17/08/30 18:00:14 INFO SecurityManager: Changing modify acls to: hadoop
17/08/30 18:00:14 INFO SecurityManager: Changing view acls groups to:
17/08/30 18:00:14 INFO SecurityManager: Changing modify acls groups to:
17/08/30 18:00:14 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(hadoop); groups with view permissions: Set(); users with modify permissions: Set(hadoop); groups with modify permissions: Set()
17/08/30 18:00:14 INFO Client: Submitting application application_1504115279181_0003 to ResourceManager
17/08/30 18:00:15 INFO YarnClientImpl: Submitted application application_1504115279181_0003
17/08/30 18:00:16 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:16 INFO Client:
	 client token: N/A
	 diagnostics: N/A
	 ApplicationMaster host: N/A
	 ApplicationMaster RPC port: -1
	 queue: default
	 start time: 1504116014963
	 final status: UNDEFINED
	 tracking URL: http://ip-172-31-11-77.ec2.internal:20888/proxy/application_1504115279181_0003/
	 user: hadoop
17/08/30 18:00:17 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:18 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:19 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:20 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:21 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:22 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:23 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:24 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:25 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:26 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:27 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:28 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:29 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:30 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:31 INFO Client: Application report for application_1504115279181_0003 (state: ACCEPTED)
17/08/30 18:00:32 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:32 INFO Client:
	 client token: N/A
	 diagnostics: N/A
	 ApplicationMaster host: 172.31.7.218
	 ApplicationMaster RPC port: 0
	 queue: default
	 start time: 1504116014963
	 final status: UNDEFINED
	 tracking URL: http://ip-172-31-11-77.ec2.internal:20888/proxy/application_1504115279181_0003/
	 user: hadoop
17/08/30 18:00:33 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:34 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:35 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:36 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:37 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:38 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:39 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:40 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:41 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:42 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:43 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:44 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:45 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:46 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:47 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:48 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:49 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:50 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:51 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:52 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:53 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:54 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:55 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:56 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:57 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:58 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:00:59 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:01:00 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:01:01 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:01:02 INFO Client: Application report for application_1504115279181_0003 (state: RUNNING)
17/08/30 18:01:03 INFO Client: Application report for application_1504115279181_0003 (state: FINISHED)
17/08/30 18:01:03 INFO Client:
	 client token: N/A
	 diagnostics: User class threw exception: java.io.IOException: Not a file: hdfs://ip-172-31-11-77.ec2.internal:8020/local/snowplow/raw-events/j-18HIRIGZU5W2U/node
	 ApplicationMaster host: 172.31.7.218
	 ApplicationMaster RPC port: 0
	 queue: default
	 start time: 1504116014963
	 final status: FAILED
	 tracking URL: http://ip-172-31-11-77.ec2.internal:20888/proxy/application_1504115279181_0003/
	 user: hadoop
Exception in thread "main" org.apache.spark.SparkException: Application application_1504115279181_0003 finished with failed status
	at org.apache.spark.deploy.yarn.Client.run(Client.scala:1167)
	at org.apache.spark.deploy.yarn.Client$.main(Client.scala:1213)
	at org.apache.spark.deploy.yarn.Client.main(Client.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:738)
	at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:187)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:212)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:126)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
17/08/30 18:01:03 INFO ShutdownHookManager: Shutdown hook called
17/08/30 18:01:03 INFO ShutdownHookManager: Deleting directory /mnt/tmp/spark-c6fb60da-3108-4a58-bf8c-4cffb559850e
Command exiting with ret '1'