Ivy race condition in Spark
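The logs below show several concurrent `spark-submit --packages` invocations failing while reading/writing the same Ivy resolution report (`~/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml`): one process truncates or deletes the file while another parses it. A possible workaround (a sketch, not taken from these logs) is to give each concurrent job its own Ivy directory; the sample names and directory layout here are illustrative, and `spark.jars.ivy` must be supported by the Spark version in use:

```shell
# Sketch: one private Ivy directory per concurrent job, so the jobs no longer
# race on the shared resolution report under ~/.ivy2/cache.
# Sample names and paths are illustrative assumptions, not from the logs.
for sample in ERR174313 ERR174314 ERR174332; do
  ivy_dir="$HOME/.ivy2-$sample"
  mkdir -p "$ivy_dir"
  # The real submission would then look something like:
  #   adam-submit --conf spark.jars.ivy="$ivy_dir" \
  #     --packages org.hammerlab:spark-json-relay:2.0.0 ...
  echo "would submit $sample with Ivy dir $ivy_dir"
done
```

Each process then resolves and retrieves into its own cache, at the cost of re-downloading artifacts once per directory.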
error: could not lock config file /hpc/users/willir31/.gitconfig: File exists
Dynamically requesting 1:1000 executors
Woops: /etc/hadoop/conf.cloudera.yarn/topology.py requires Python 2
Found $PYTHON2_HOME at /hpc/users/willir31/.local; attempting to prepend /hpc/users/willir31/.local/bin to $PATH...
Running command:
/hpc/users/willir31/c/adam/bin/adam-submit --master yarn --deploy-mode client --executor-cores 6 --executor-memory 17g --driver-memory 10g --conf spark.default.parallelism=1000 --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs://demeter-nn1.demeter.hpc.mssm.edu:/spark/tmp/logs --conf spark.slim.host=hammerlab-dev3.hpc.mssm.edu --conf spark.extraListeners=org.apache.spark.JsonRelay --packages org.hammerlab:spark-json-relay:2.0.0 --conf spark.storage.memoryFraction=0.05 --conf spark.shuffle.memoryFraction=0.4 --conf spark.shuffle.service.enabled=true --conf spark.speculation=true --conf spark.speculation.interval=1000 --conf spark.speculation.multiplier=2 --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.minExecutors=1 --conf spark.dynamicAllocation.initialExecutors=1 --conf spark.dynamicAllocation.maxExecutors=1000 --conf spark.dynamicAllocation.schedulerBacklogTimeout=1 --conf spark.dynamicAllocation.executorIdleTimeout=300 --conf spark.yarn.executor.memoryOverhead=1024 --driver-java-options -Dyarn.resourcemanager.am.max-attempts=1 --conf spark.akka.timeout=120 --conf spark.file.transferTo=false --conf spark.core.connection.ack.wait.timeout=120 --conf spark.ui.enabled=false --executor-cores 10 --executor-memory 5g --driver-memory 1g --conf spark.shuffle.memoryFraction=0.8 -- transform /datasets/illumina_platinum/200x/fastq/ERR174313_1.fastq -paired_fastq /datasets/illumina_platinum/200x/fastq/ERR174313_2.fastq -concat /datasets/illumina_platinum/200x/fastq/ERR174313_unpaired.fastq -record_group ERR174313 /datasets/illumina_platinum/200x/ERR174313.adam
Using SPARK_SUBMIT=/hpc/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/bin/spark-submit
Ivy Default Cache set to: /hpc/users/willir31/.ivy2/cache
The jars for the packages stored in: /hpc/users/willir31/.ivy2/jars
:: loading settings :: url = jar:file:/demeter/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/lib/spark-assembly-1.5.0-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.hammerlab#spark-json-relay added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
	confs: [default]
	found org.hammerlab#spark-json-relay;2.0.0 in list
	found org.json4s#json4s-jackson_2.10;3.2.10 in list
	found org.json4s#json4s-core_2.10;3.2.10 in list
	found org.json4s#json4s-ast_2.10;3.2.10 in list
	found com.thoughtworks.paranamer#paranamer;2.6 in list
	found com.fasterxml.jackson.core#jackson-databind;2.3.1 in list
	found com.fasterxml.jackson.core#jackson-annotations;2.3.0 in list
	found com.fasterxml.jackson.core#jackson-core;2.3.1 in list
	found org.scala-lang#scalap;2.10.4 in list
	found org.scala-lang#scala-compiler;2.10.4 in list
	found org.scala-lang#scala-reflect;2.10.4 in list
:: resolution report :: resolve 1481ms :: artifacts dl 112ms
	:: modules in use:
	com.fasterxml.jackson.core#jackson-annotations;2.3.0 from list in [default]
	com.fasterxml.jackson.core#jackson-core;2.3.1 from list in [default]
	com.fasterxml.jackson.core#jackson-databind;2.3.1 from list in [default]
	com.thoughtworks.paranamer#paranamer;2.6 from list in [default]
	org.hammerlab#spark-json-relay;2.0.0 from list in [default]
	org.json4s#json4s-ast_2.10;3.2.10 from list in [default]
	org.json4s#json4s-core_2.10;3.2.10 from list in [default]
	org.json4s#json4s-jackson_2.10;3.2.10 from list in [default]
	org.scala-lang#scala-compiler;2.10.4 from list in [default]
	org.scala-lang#scala-reflect;2.10.4 from list in [default]
	org.scala-lang#scalap;2.10.4 from list in [default]
	:: evicted modules:
	org.scala-lang#scalap;2.10.0 by [org.scala-lang#scalap;2.10.4] in [default]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   12  |   0   |   0   |   1   ||   11  |   0   |
	---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
	confs: [default]
Exception in thread "main" java.lang.RuntimeException: problem during retrieve of org.apache.spark#spark-submit-parent: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Premature end of file.
	at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:249)
	at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:83)
	at org.apache.ivy.Ivy.retrieve(Ivy.java:551)
	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1006)
	at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Premature end of file.
	at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:293)
	at org.apache.ivy.core.retrieve.RetrieveEngine.determineArtifactsToCopy(RetrieveEngine.java:329)
	at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:118)
	... 7 more
Caused by: org.xml.sax.SAXParseException; Premature end of file.
	at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source)
	at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source)
	at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
	at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
	at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
	at org.apache.xerces.impl.XMLVersionDetector.determineDocVersion(Unknown Source)
	at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
	at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
	at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
	at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
	at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
	at org.apache.xerces.jaxp.SAXParserImpl.parse(Unknown Source)
	at javax.xml.parsers.SAXParser.parse(SAXParser.java:328)
	at org.apache.ivy.plugins.report.XmlReportParser$SaxXmlReportParser.parse(XmlReportParser.java:249)
	at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:291)
	... 9 more
ln: cannot remove `/hpc/users/willir31/s/yarn/ldl': No such file or directory
ln: cannot remove `/hpc/users/willir31/s/yarn/ycl': No such file or directory
ln: creating symbolic link `/hpc/users/willir31/s/yarn/yll': File exists
error: could not lock config file /hpc/users/willir31/.gitconfig: File exists
Dynamically requesting 1:1000 executors
Woops: /etc/hadoop/conf.cloudera.yarn/topology.py requires Python 2
Found $PYTHON2_HOME at /hpc/users/willir31/.local; attempting to prepend /hpc/users/willir31/.local/bin to $PATH...
Running command:
/hpc/users/willir31/c/adam/bin/adam-submit --master yarn --deploy-mode client --executor-cores 6 --executor-memory 17g --driver-memory 10g --conf spark.default.parallelism=1000 --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs://demeter-nn1.demeter.hpc.mssm.edu:/spark/tmp/logs --conf spark.slim.host=hammerlab-dev3.hpc.mssm.edu --conf spark.extraListeners=org.apache.spark.JsonRelay --packages org.hammerlab:spark-json-relay:2.0.0 --conf spark.storage.memoryFraction=0.05 --conf spark.shuffle.memoryFraction=0.4 --conf spark.shuffle.service.enabled=true --conf spark.speculation=true --conf spark.speculation.interval=1000 --conf spark.speculation.multiplier=2 --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.minExecutors=1 --conf spark.dynamicAllocation.initialExecutors=1 --conf spark.dynamicAllocation.maxExecutors=1000 --conf spark.dynamicAllocation.schedulerBacklogTimeout=1 --conf spark.dynamicAllocation.executorIdleTimeout=300 --conf spark.yarn.executor.memoryOverhead=1024 --driver-java-options -Dyarn.resourcemanager.am.max-attempts=1 --conf spark.akka.timeout=120 --conf spark.file.transferTo=false --conf spark.core.connection.ack.wait.timeout=120 --conf spark.ui.enabled=false --executor-cores 10 --executor-memory 5g --driver-memory 1g --conf spark.shuffle.memoryFraction=0.8 -- transform /datasets/illumina_platinum/200x/fastq/ERR174314_1.fastq -paired_fastq /datasets/illumina_platinum/200x/fastq/ERR174314_2.fastq -concat /datasets/illumina_platinum/200x/fastq/ERR174314_unpaired.fastq -record_group ERR174314 /datasets/illumina_platinum/200x/ERR174314.adam
Using SPARK_SUBMIT=/hpc/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/bin/spark-submit
Ivy Default Cache set to: /hpc/users/willir31/.ivy2/cache
The jars for the packages stored in: /hpc/users/willir31/.ivy2/jars
:: loading settings :: url = jar:file:/demeter/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/lib/spark-assembly-1.5.0-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.hammerlab#spark-json-relay added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
	confs: [default]
	found org.hammerlab#spark-json-relay;2.0.0 in list
	found org.json4s#json4s-jackson_2.10;3.2.10 in list
	found org.json4s#json4s-core_2.10;3.2.10 in list
	found org.json4s#json4s-ast_2.10;3.2.10 in list
	found com.thoughtworks.paranamer#paranamer;2.6 in list
	found com.fasterxml.jackson.core#jackson-databind;2.3.1 in list
	found com.fasterxml.jackson.core#jackson-annotations;2.3.0 in list
	found com.fasterxml.jackson.core#jackson-core;2.3.1 in list
	found org.scala-lang#scalap;2.10.4 in list
	found org.scala-lang#scala-compiler;2.10.4 in list
	found org.scala-lang#scala-reflect;2.10.4 in list
:: resolution report :: resolve 1525ms :: artifacts dl 65ms
	:: modules in use:
	com.fasterxml.jackson.core#jackson-annotations;2.3.0 from list in [default]
	com.fasterxml.jackson.core#jackson-core;2.3.1 from list in [default]
	com.fasterxml.jackson.core#jackson-databind;2.3.1 from list in [default]
	com.thoughtworks.paranamer#paranamer;2.6 from list in [default]
	org.hammerlab#spark-json-relay;2.0.0 from list in [default]
	org.json4s#json4s-ast_2.10;3.2.10 from list in [default]
	org.json4s#json4s-core_2.10;3.2.10 from list in [default]
	org.json4s#json4s-jackson_2.10;3.2.10 from list in [default]
	org.scala-lang#scala-compiler;2.10.4 from list in [default]
	org.scala-lang#scala-reflect;2.10.4 from list in [default]
	org.scala-lang#scalap;2.10.4 from list in [default]
	:: evicted modules:
	org.scala-lang#scalap;2.10.0 by [org.scala-lang#scalap;2.10.4] in [default]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   12  |   0   |   0   |   1   ||   11  |   0   |
	---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
	confs: [default]
Exception in thread "main" java.lang.RuntimeException: problem during retrieve of org.apache.spark#spark-submit-parent: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Premature end of file.
	at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:249)
	at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:83)
	at org.apache.ivy.Ivy.retrieve(Ivy.java:551)
	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1006)
	at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Premature end of file.
	at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:293)
	at org.apache.ivy.core.retrieve.RetrieveEngine.determineArtifactsToCopy(RetrieveEngine.java:329)
	at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:118)
	... 7 more
Caused by: org.xml.sax.SAXParseException; Premature end of file.
	at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source)
	at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source)
	at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
	at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
	at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
	at org.apache.xerces.impl.XMLVersionDetector.determineDocVersion(Unknown Source)
	at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
	at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
	at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
	at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
	at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
	at org.apache.xerces.jaxp.SAXParserImpl.parse(Unknown Source)
	at javax.xml.parsers.SAXParser.parse(SAXParser.java:328)
	at org.apache.ivy.plugins.report.XmlReportParser$SaxXmlReportParser.parse(XmlReportParser.java:249)
	at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:291)
	... 9 more
error: could not lock config file /hpc/users/willir31/.gitconfig: File exists
Dynamically requesting 1:1000 executors
Woops: /etc/hadoop/conf.cloudera.yarn/topology.py requires Python 2
Found $PYTHON2_HOME at /hpc/users/willir31/.local; attempting to prepend /hpc/users/willir31/.local/bin to $PATH...
Running command:
/hpc/users/willir31/c/adam/bin/adam-submit --master yarn --deploy-mode client --executor-cores 6 --executor-memory 17g --driver-memory 10g --conf spark.default.parallelism=1000 --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs://demeter-nn1.demeter.hpc.mssm.edu:/spark/tmp/logs --conf spark.slim.host=hammerlab-dev3.hpc.mssm.edu --conf spark.extraListeners=org.apache.spark.JsonRelay --packages org.hammerlab:spark-json-relay:2.0.0 --conf spark.storage.memoryFraction=0.05 --conf spark.shuffle.memoryFraction=0.4 --conf spark.shuffle.service.enabled=true --conf spark.speculation=true --conf spark.speculation.interval=1000 --conf spark.speculation.multiplier=2 --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.minExecutors=1 --conf spark.dynamicAllocation.initialExecutors=1 --conf spark.dynamicAllocation.maxExecutors=1000 --conf spark.dynamicAllocation.schedulerBacklogTimeout=1 --conf spark.dynamicAllocation.executorIdleTimeout=300 --conf spark.yarn.executor.memoryOverhead=1024 --driver-java-options -Dyarn.resourcemanager.am.max-attempts=1 --conf spark.akka.timeout=120 --conf spark.file.transferTo=false --conf spark.core.connection.ack.wait.timeout=120 --conf spark.ui.enabled=false --executor-cores 10 --executor-memory 5g --driver-memory 1g --conf spark.shuffle.memoryFraction=0.8 -- transform /datasets/illumina_platinum/200x/fastq/ERR174332_1.fastq -paired_fastq /datasets/illumina_platinum/200x/fastq/ERR174332_2.fastq -concat /datasets/illumina_platinum/200x/fastq/ERR174332_unpaired.fastq -record_group ERR174332 -stringency SILENT /datasets/illumina_platinum/200x/ERR174332.adam
Using SPARK_SUBMIT=/hpc/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/bin/spark-submit
Ivy Default Cache set to: /hpc/users/willir31/.ivy2/cache
The jars for the packages stored in: /hpc/users/willir31/.ivy2/jars
:: loading settings :: url = jar:file:/demeter/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/lib/spark-assembly-1.5.0-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.hammerlab#spark-json-relay added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
	confs: [default]
	found org.hammerlab#spark-json-relay;2.0.0 in list
	found org.json4s#json4s-jackson_2.10;3.2.10 in list
	found org.json4s#json4s-core_2.10;3.2.10 in list
	found org.json4s#json4s-ast_2.10;3.2.10 in list
	found com.thoughtworks.paranamer#paranamer;2.6 in list
	found com.fasterxml.jackson.core#jackson-databind;2.3.1 in list
	found com.fasterxml.jackson.core#jackson-annotations;2.3.0 in list
	found com.fasterxml.jackson.core#jackson-core;2.3.1 in list
	found org.scala-lang#scalap;2.10.4 in list
	found org.scala-lang#scala-compiler;2.10.4 in list
	found org.scala-lang#scala-reflect;2.10.4 in list
:: resolution report :: resolve 1581ms :: artifacts dl 65ms
	:: modules in use:
	com.fasterxml.jackson.core#jackson-annotations;2.3.0 from list in [default]
	com.fasterxml.jackson.core#jackson-core;2.3.1 from list in [default]
	com.fasterxml.jackson.core#jackson-databind;2.3.1 from list in [default]
	com.thoughtworks.paranamer#paranamer;2.6 from list in [default]
	org.hammerlab#spark-json-relay;2.0.0 from list in [default]
	org.json4s#json4s-ast_2.10;3.2.10 from list in [default]
	org.json4s#json4s-core_2.10;3.2.10 from list in [default]
	org.json4s#json4s-jackson_2.10;3.2.10 from list in [default]
	org.scala-lang#scala-compiler;2.10.4 from list in [default]
	org.scala-lang#scala-reflect;2.10.4 from list in [default]
	org.scala-lang#scalap;2.10.4 from list in [default]
	:: evicted modules:
	org.scala-lang#scalap;2.10.0 by [org.scala-lang#scalap;2.10.4] in [default]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   12  |   0   |   0   |   1   ||   11  |   0   |
	---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
	confs: [default]
Exception in thread "main" java.lang.RuntimeException: problem during retrieve of org.apache.spark#spark-submit-parent: java.lang.IllegalStateException: Report file '/hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml' does not exist.
	at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:249)
	at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:83)
	at org.apache.ivy.Ivy.retrieve(Ivy.java:551)
	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1006)
	at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.lang.IllegalStateException: Report file '/hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml' does not exist.
	at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:285)
	at org.apache.ivy.core.retrieve.RetrieveEngine.determineArtifactsToCopy(RetrieveEngine.java:329)
	at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:118)
	... 7 more
Dynamically requesting 1:1000 executors
Woops: /etc/hadoop/conf.cloudera.yarn/topology.py requires Python 2
Found $PYTHON2_HOME at /hpc/users/willir31/.local; attempting to prepend /hpc/users/willir31/.local/bin to $PATH...
Running command:
/hpc/users/willir31/c/adam/bin/adam-submit --master yarn --deploy-mode client --executor-cores 6 --executor-memory 17g --driver-memory 10g --conf spark.default.parallelism=1000 --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs://demeter-nn1.demeter.hpc.mssm.edu:/spark/tmp/logs --conf spark.slim.host=hammerlab-dev3.hpc.mssm.edu --conf spark.extraListeners=org.apache.spark.JsonRelay --packages org.hammerlab:spark-json-relay:2.0.0 --conf spark.storage.memoryFraction=0.05 --conf spark.shuffle.memoryFraction=0.4 --conf spark.shuffle.service.enabled=true --conf spark.speculation=true --conf spark.speculation.interval=1000 --conf spark.speculation.multiplier=2 --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.minExecutors=1 --conf spark.dynamicAllocation.initialExecutors=1 --conf spark.dynamicAllocation.maxExecutors=1000 --conf spark.dynamicAllocation.schedulerBacklogTimeout=1 --conf spark.dynamicAllocation.executorIdleTimeout=300 --conf spark.yarn.executor.memoryOverhead=1024 --driver-java-options -Dyarn.resourcemanager.am.max-attempts=1 --conf spark.akka.timeout=120 --conf spark.file.transferTo=false --conf spark.core.connection.ack.wait.timeout=120 --conf spark.ui.enabled=false --executor-cores 10 --executor-memory 5g --driver-memory 1g --conf spark.shuffle.memoryFraction=0.8 -- transform /datasets/illumina_platinum/200x/fastq/ERR174335_1.fastq -paired_fastq /datasets/illumina_platinum/200x/fastq/ERR174335_2.fastq -concat /datasets/illumina_platinum/200x/fastq/ERR174335_unpaired.fastq -record_group ERR174335 -stringency SILENT /datasets/illumina_platinum/200x/ERR174335.adam
Using SPARK_SUBMIT=/hpc/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/bin/spark-submit
Ivy Default Cache set to: /hpc/users/willir31/.ivy2/cache
The jars for the packages stored in: /hpc/users/willir31/.ivy2/jars
:: loading settings :: url = jar:file:/demeter/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/lib/spark-assembly-1.5.0-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.hammerlab#spark-json-relay added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
	confs: [default]
	found org.hammerlab#spark-json-relay;2.0.0 in list
	found org.json4s#json4s-jackson_2.10;3.2.10 in list
	found org.json4s#json4s-core_2.10;3.2.10 in list
	found org.json4s#json4s-ast_2.10;3.2.10 in list
	found com.thoughtworks.paranamer#paranamer;2.6 in list
	found com.fasterxml.jackson.core#jackson-databind;2.3.1 in list
	found com.fasterxml.jackson.core#jackson-annotations;2.3.0 in list
	found com.fasterxml.jackson.core#jackson-core;2.3.1 in list
	found org.scala-lang#scalap;2.10.4 in list
	found org.scala-lang#scala-compiler;2.10.4 in list
	found org.scala-lang#scala-reflect;2.10.4 in list
:: resolution report :: resolve 1678ms :: artifacts dl 87ms
	:: modules in use:
	com.fasterxml.jackson.core#jackson-annotations;2.3.0 from list in [default]
	com.fasterxml.jackson.core#jackson-core;2.3.1 from list in [default]
	com.fasterxml.jackson.core#jackson-databind;2.3.1 from list in [default]
	com.thoughtworks.paranamer#paranamer;2.6 from list in [default]
	org.hammerlab#spark-json-relay;2.0.0 from list in [default]
	org.json4s#json4s-ast_2.10;3.2.10 from list in [default]
	org.json4s#json4s-core_2.10;3.2.10 from list in [default]
	org.json4s#json4s-jackson_2.10;3.2.10 from list in [default]
	org.scala-lang#scala-compiler;2.10.4 from list in [default]
	org.scala-lang#scala-reflect;2.10.4 from list in [default]
	org.scala-lang#scalap;2.10.4 from list in [default]
	:: evicted modules:
	org.scala-lang#scalap;2.10.0 by [org.scala-lang#scalap;2.10.4] in [default]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   12  |   0   |   0   |   1   ||   11  |   0   |
	---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
	confs: [default]
Exception in thread "main" java.lang.RuntimeException: problem during retrieve of org.apache.spark#spark-submit-parent: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Premature end of file.
	at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:249)
	at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:83)
	at org.apache.ivy.Ivy.retrieve(Ivy.java:551)
	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1006)
	at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
	at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120)
	at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
Caused by: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Premature end of file.
	at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:293)
	at org.apache.ivy.core.retrieve.RetrieveEngine.determineArtifactsToCopy(RetrieveEngine.java:329)
	at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:118)
	... 7 more
Caused by: org.xml.sax.SAXParseException; Premature end of file.
	at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source)
	at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source)
	at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
	at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
	at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source)
	at org.apache.xerces.impl.XMLVersionDetector.determineDocVersion(Unknown Source)
	at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
	at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source)
	at org.apache.xerces.parsers.XMLParser.parse(Unknown Source)
	at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source)
	at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source)
	at org.apache.xerces.jaxp.SAXParserImpl.parse(Unknown Source)
	at javax.xml.parsers.SAXParser.parse(SAXParser.java:328)
	at org.apache.ivy.plugins.report.XmlReportParser$SaxXmlReportParser.parse(XmlReportParser.java:249)
	at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:291)
	... 9 more
error: could not lock config file /hpc/users/willir31/.gitconfig: File exists
Dynamically requesting 1:1000 executors
Woops: /etc/hadoop/conf.cloudera.yarn/topology.py requires Python 2
Found $PYTHON2_HOME at /hpc/users/willir31/.local; attempting to prepend /hpc/users/willir31/.local/bin to $PATH...
Running command:
/hpc/users/willir31/c/adam/bin/adam-submit --master yarn --deploy-mode client --executor-cores 6 --executor-memory 17g --driver-memory 10g --conf spark.default.parallelism=1000 --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs://demeter-nn1.demeter.hpc.mssm.edu:/spark/tmp/logs --conf spark.slim.host=hammerlab-dev3.hpc.mssm.edu --conf spark.extraListeners=org.apache.spark.JsonRelay --packages org.hammerlab:spark-json-relay:2.0.0 --conf spark.storage.memoryFraction=0.05 --conf spark.shuffle.memoryFraction=0.4 --conf spark.shuffle.service.enabled=true --conf spark.speculation=true --conf spark.speculation.interval=1000 --conf spark.speculation.multiplier=2 --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.minExecutors=1 --conf spark.dynamicAllocation.initialExecutors=1 --conf spark.dynamicAllocation.maxExecutors=1000 --conf spark.dynamicAllocation.schedulerBacklogTimeout=1 --conf spark.dynamicAllocation.executorIdleTimeout=300 --conf spark.yarn.executor.memoryOverhead=1024 --driver-java-options -Dyarn.resourcemanager.am.max-attempts=1 --conf spark.akka.timeout=120 --conf spark.file.transferTo=false --conf spark.core.connection.ack.wait.timeout=120 --conf spark.ui.enabled=false --executor-cores 10 --executor-memory 5g --driver-memory 1g --conf spark.shuffle.memoryFraction=0.8 -- transform /datasets/illumina_platinum/200x/fastq/ERR174336_1.fastq -paired_fastq /datasets/illumina_platinum/200x/fastq/ERR174336_2.fastq -concat /datasets/illumina_platinum/200x/fastq/ERR174336_unpaired.fastq -record_group ERR174336 -stringency SILENT /datasets/illumina_platinum/200x/ERR174336.adam
Using SPARK_SUBMIT=/hpc/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/bin/spark-submit
Ivy Default Cache set to: /hpc/users/willir31/.ivy2/cache
The jars for the packages stored in: /hpc/users/willir31/.ivy2/jars
:: loading settings :: url = jar:file:/demeter/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/lib/spark-assembly-1.5.0-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
org.hammerlab#spark-json-relay added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
	confs: [default]
	found org.hammerlab#spark-json-relay;2.0.0 in list
	found org.json4s#json4s-jackson_2.10;3.2.10 in list
	found org.json4s#json4s-core_2.10;3.2.10 in list
	found org.json4s#json4s-ast_2.10;3.2.10 in list
	found com.thoughtworks.paranamer#paranamer;2.6 in list
	found com.fasterxml.jackson.core#jackson-databind;2.3.1 in list
	found com.fasterxml.jackson.core#jackson-annotations;2.3.0 in list
	found com.fasterxml.jackson.core#jackson-core;2.3.1 in list
	found org.scala-lang#scalap;2.10.4 in list
	found org.scala-lang#scala-compiler;2.10.4 in list
	found org.scala-lang#scala-reflect;2.10.4 in list
:: resolution report :: resolve 2273ms :: artifacts dl 46ms
	:: modules in use:
	com.fasterxml.jackson.core#jackson-annotations;2.3.0 from list in [default]
	com.fasterxml.jackson.core#jackson-core;2.3.1 from list in [default]
	com.fasterxml.jackson.core#jackson-databind;2.3.1 from list in [default]
	com.thoughtworks.paranamer#paranamer;2.6 from list in [default]
	org.hammerlab#spark-json-relay;2.0.0 from list in [default]
	org.json4s#json4s-ast_2.10;3.2.10 from list in [default]
	org.json4s#json4s-core_2.10;3.2.10 from list in [default]
	org.json4s#json4s-jackson_2.10;3.2.10 from list in [default]
	org.scala-lang#scala-compiler;2.10.4 from list in [default]
	org.scala-lang#scala-reflect;2.10.4 from list in [default]
	org.scala-lang#scalap;2.10.4 from list in [default]
	:: evicted modules:
	org.scala-lang#scalap;2.10.0 by [org.scala-lang#scalap;2.10.4] in [default]
	---------------------------------------------------------------------
	|                  |            modules            ||   artifacts   |
	|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
	---------------------------------------------------------------------
	|      default     |   12  |   0   |   0   |   1   ||   11  |   0   |
	---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
	confs: [default]
Exception in thread "main" java.lang.RuntimeException: problem during retrieve of org.apache.spark#spark-submit-parent: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Premature end of file.
	at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:249)
	at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:83)
	at org.apache.ivy.Ivy.retrieve(Ivy.java:551)
	at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1006)
	at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286)
	at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) | |
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) | |
Caused by: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Premature end of file. | |
at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:293) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.determineArtifactsToCopy(RetrieveEngine.java:329) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:118) | |
... 7 more | |
Caused by: org.xml.sax.SAXParseException; Premature end of file. | |
at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source) | |
at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLVersionDetector.determineDocVersion(Unknown Source) | |
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) | |
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) | |
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source) | |
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source) | |
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source) | |
at org.apache.xerces.jaxp.SAXParserImpl.parse(Unknown Source) | |
at javax.xml.parsers.SAXParser.parse(SAXParser.java:328) | |
at org.apache.ivy.plugins.report.XmlReportParser$SaxXmlReportParser.parse(XmlReportParser.java:249) | |
at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:291) | |
... 9 more |
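Each failure above comes from concurrent `spark-submit --packages` invocations sharing `~/.ivy2/cache` and racing on the same resolution report (`org.apache.spark-spark-submit-parent-default.xml`): one process truncates or deletes the file while another is parsing it, yielding "Premature end of file", "No such file or directory", or "Content is not allowed in prolog". A workaround sketch, assuming each concurrent job can be pointed at a private Ivy directory via Spark's `spark.jars.ivy` setting (the directory name pattern here is illustrative, not from the log):

```shell
#!/bin/sh
# Workaround sketch: give each concurrent spark-submit its own Ivy directory,
# so parallel resolutions no longer clobber the shared report file in
# ~/.ivy2/cache. mktemp -d creates a fresh, uniquely named directory per job.
ivy_dir="$(mktemp -d "${TMPDIR:-/tmp}/ivy2.XXXXXX")"
echo "using private Ivy dir: $ivy_dir"

# Then pass it to spark-submit (flags abbreviated; the full command is above):
# spark-submit --conf "spark.jars.ivy=$ivy_dir" \
#   --packages org.hammerlab:spark-json-relay:2.0.0 ...
```

The trade-off is that each job re-resolves and re-downloads its packages into its own directory instead of hitting a warm shared cache.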
ln: cannot remove `/hpc/users/willir31/s/yarn/ldl': No such file or directory | |
ln: cannot remove `/hpc/users/willir31/s/yarn/ya': No such file or directory | |
Dynamically requesting 1:1000 executors | |
Woops: /etc/hadoop/conf.cloudera.yarn/topology.py requires Python 2 | |
Found $PYTHON2_HOME at /hpc/users/willir31/.local; attempting to prepend /hpc/users/willir31/.local/bin to $PATH... | |
Running command: | |
/hpc/users/willir31/c/adam/bin/adam-submit --master yarn --deploy-mode client --executor-cores 6 --executor-memory 17g --driver-memory 10g --conf spark.default.parallelism=1000 --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs://demeter-nn1.demeter.hpc.mssm.edu:/spark/tmp/logs --conf spark.slim.host=hammerlab-dev3.hpc.mssm.edu --conf spark.extraListeners=org.apache.spark.JsonRelay --packages org.hammerlab:spark-json-relay:2.0.0 --conf spark.storage.memoryFraction=0.05 --conf spark.shuffle.memoryFraction=0.4 --conf spark.shuffle.service.enabled=true --conf spark.speculation=true --conf spark.speculation.interval=1000 --conf spark.speculation.multiplier=2 --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.minExecutors=1 --conf spark.dynamicAllocation.initialExecutors=1 --conf spark.dynamicAllocation.maxExecutors=1000 --conf spark.dynamicAllocation.schedulerBacklogTimeout=1 --conf spark.dynamicAllocation.executorIdleTimeout=300 --conf spark.yarn.executor.memoryOverhead=1024 --driver-java-options -Dyarn.resourcemanager.am.max-attempts=1 --conf spark.akka.timeout=120 --conf spark.file.transferTo=false --conf spark.core.connection.ack.wait.timeout=120 --conf spark.ui.enabled=false --executor-cores 10 --executor-memory 5g --driver-memory 1g --conf spark.shuffle.memoryFraction=0.8 -- transform /datasets/illumina_platinum/200x/fastq/ERR174340_1.fastq -paired_fastq /datasets/illumina_platinum/200x/fastq/ERR174340_2.fastq -concat /datasets/illumina_platinum/200x/fastq/ERR174340_unpaired.fastq -record_group ERR174340 -stringency SILENT /datasets/illumina_platinum/200x/ERR174340.adam | |
Using SPARK_SUBMIT=/hpc/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/bin/spark-submit | |
Ivy Default Cache set to: /hpc/users/willir31/.ivy2/cache | |
The jars for the packages stored in: /hpc/users/willir31/.ivy2/jars | |
:: loading settings :: url = jar:file:/demeter/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/lib/spark-assembly-1.5.0-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml | |
org.hammerlab#spark-json-relay added as a dependency | |
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0 | |
confs: [default] | |
found org.hammerlab#spark-json-relay;2.0.0 in list | |
found org.json4s#json4s-jackson_2.10;3.2.10 in list | |
found org.json4s#json4s-core_2.10;3.2.10 in list | |
found org.json4s#json4s-ast_2.10;3.2.10 in list | |
found com.thoughtworks.paranamer#paranamer;2.6 in list | |
found com.fasterxml.jackson.core#jackson-databind;2.3.1 in list | |
found com.fasterxml.jackson.core#jackson-annotations;2.3.0 in list | |
found com.fasterxml.jackson.core#jackson-core;2.3.1 in list | |
found org.scala-lang#scalap;2.10.4 in list | |
found org.scala-lang#scala-compiler;2.10.4 in list | |
found org.scala-lang#scala-reflect;2.10.4 in list | |
:: resolution report :: resolve 1870ms :: artifacts dl 113ms | |
:: modules in use: | |
com.fasterxml.jackson.core#jackson-annotations;2.3.0 from list in [default] | |
com.fasterxml.jackson.core#jackson-core;2.3.1 from list in [default] | |
com.fasterxml.jackson.core#jackson-databind;2.3.1 from list in [default] | |
com.thoughtworks.paranamer#paranamer;2.6 from list in [default] | |
org.hammerlab#spark-json-relay;2.0.0 from list in [default] | |
org.json4s#json4s-ast_2.10;3.2.10 from list in [default] | |
org.json4s#json4s-core_2.10;3.2.10 from list in [default] | |
org.json4s#json4s-jackson_2.10;3.2.10 from list in [default] | |
org.scala-lang#scala-compiler;2.10.4 from list in [default] | |
org.scala-lang#scala-reflect;2.10.4 from list in [default] | |
org.scala-lang#scalap;2.10.4 from list in [default] | |
:: evicted modules: | |
org.scala-lang#scalap;2.10.0 by [org.scala-lang#scalap;2.10.4] in [default] | |
--------------------------------------------------------------------- | |
| | modules || artifacts | | |
| conf | number| search|dwnlded|evicted|| number|dwnlded| | |
--------------------------------------------------------------------- | |
| default | 12 | 0 | 0 | 1 || 11 | 0 | | |
--------------------------------------------------------------------- | |
:: retrieving :: org.apache.spark#spark-submit-parent | |
confs: [default] | |
Exception in thread "main" java.lang.RuntimeException: problem during retrieve of org.apache.spark#spark-submit-parent: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml (No such file or directory) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:249) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:83) | |
at org.apache.ivy.Ivy.retrieve(Ivy.java:551) | |
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1006) | |
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286) | |
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153) | |
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) | |
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) | |
Caused by: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml (No such file or directory) | |
at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:293) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.determineArtifactsToCopy(RetrieveEngine.java:329) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:118) | |
... 7 more | |
Caused by: java.io.FileNotFoundException: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml (No such file or directory) | |
at java.io.FileInputStream.open(Native Method) | |
at java.io.FileInputStream.<init>(FileInputStream.java:146) | |
at java.io.FileInputStream.<init>(FileInputStream.java:101) | |
at sun.net.www.protocol.file.FileURLConnection.connect(FileURLConnection.java:90) | |
at sun.net.www.protocol.file.FileURLConnection.getInputStream(FileURLConnection.java:188) | |
at org.apache.xerces.impl.XMLEntityManager.setupCurrentEntity(Unknown Source) | |
at org.apache.xerces.impl.XMLVersionDetector.determineDocVersion(Unknown Source) | |
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) | |
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) | |
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source) | |
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source) | |
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source) | |
at org.apache.xerces.jaxp.SAXParserImpl.parse(Unknown Source) | |
at javax.xml.parsers.SAXParser.parse(SAXParser.java:328) | |
at org.apache.ivy.plugins.report.XmlReportParser$SaxXmlReportParser.parse(XmlReportParser.java:249) | |
at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:291) | |
... 9 more |
error: could not lock config file /hpc/users/willir31/.gitconfig: File exists | |
Dynamically requesting 1:1000 executors | |
Woops: /etc/hadoop/conf.cloudera.yarn/topology.py requires Python 2 | |
Found $PYTHON2_HOME at /hpc/users/willir31/.local; attempting to prepend /hpc/users/willir31/.local/bin to $PATH... | |
Running command: | |
/hpc/users/willir31/c/adam/bin/adam-submit --master yarn --deploy-mode client --executor-cores 6 --executor-memory 17g --driver-memory 10g --conf spark.default.parallelism=1000 --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs://demeter-nn1.demeter.hpc.mssm.edu:/spark/tmp/logs --conf spark.slim.host=hammerlab-dev3.hpc.mssm.edu --conf spark.extraListeners=org.apache.spark.JsonRelay --packages org.hammerlab:spark-json-relay:2.0.0 --conf spark.storage.memoryFraction=0.05 --conf spark.shuffle.memoryFraction=0.4 --conf spark.shuffle.service.enabled=true --conf spark.speculation=true --conf spark.speculation.interval=1000 --conf spark.speculation.multiplier=2 --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.minExecutors=1 --conf spark.dynamicAllocation.initialExecutors=1 --conf spark.dynamicAllocation.maxExecutors=1000 --conf spark.dynamicAllocation.schedulerBacklogTimeout=1 --conf spark.dynamicAllocation.executorIdleTimeout=300 --conf spark.yarn.executor.memoryOverhead=1024 --driver-java-options -Dyarn.resourcemanager.am.max-attempts=1 --conf spark.akka.timeout=120 --conf spark.file.transferTo=false --conf spark.core.connection.ack.wait.timeout=120 --conf spark.ui.enabled=false --executor-cores 10 --executor-memory 5g --driver-memory 1g --conf spark.shuffle.memoryFraction=0.8 -- transform /datasets/illumina_platinum/200x/fastq/ERR174346_1.fastq -paired_fastq /datasets/illumina_platinum/200x/fastq/ERR174346_2.fastq -concat /datasets/illumina_platinum/200x/fastq/ERR174346_unpaired.fastq -record_group ERR174346 -stringency SILENT /datasets/illumina_platinum/200x/ERR174346.adam | |
Using SPARK_SUBMIT=/hpc/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/bin/spark-submit | |
Ivy Default Cache set to: /hpc/users/willir31/.ivy2/cache | |
The jars for the packages stored in: /hpc/users/willir31/.ivy2/jars | |
:: loading settings :: url = jar:file:/demeter/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/lib/spark-assembly-1.5.0-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml | |
org.hammerlab#spark-json-relay added as a dependency | |
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0 | |
confs: [default] | |
found org.hammerlab#spark-json-relay;2.0.0 in list | |
found org.json4s#json4s-jackson_2.10;3.2.10 in list | |
found org.json4s#json4s-core_2.10;3.2.10 in list | |
found org.json4s#json4s-ast_2.10;3.2.10 in list | |
found com.thoughtworks.paranamer#paranamer;2.6 in list | |
found com.fasterxml.jackson.core#jackson-databind;2.3.1 in list | |
found com.fasterxml.jackson.core#jackson-annotations;2.3.0 in list | |
found com.fasterxml.jackson.core#jackson-core;2.3.1 in list | |
found org.scala-lang#scalap;2.10.4 in list | |
found org.scala-lang#scala-compiler;2.10.4 in list | |
found org.scala-lang#scala-reflect;2.10.4 in list | |
:: resolution report :: resolve 1298ms :: artifacts dl 104ms | |
:: modules in use: | |
com.fasterxml.jackson.core#jackson-annotations;2.3.0 from list in [default] | |
com.fasterxml.jackson.core#jackson-core;2.3.1 from list in [default] | |
com.fasterxml.jackson.core#jackson-databind;2.3.1 from list in [default] | |
com.thoughtworks.paranamer#paranamer;2.6 from list in [default] | |
org.hammerlab#spark-json-relay;2.0.0 from list in [default] | |
org.json4s#json4s-ast_2.10;3.2.10 from list in [default] | |
org.json4s#json4s-core_2.10;3.2.10 from list in [default] | |
org.json4s#json4s-jackson_2.10;3.2.10 from list in [default] | |
org.scala-lang#scala-compiler;2.10.4 from list in [default] | |
org.scala-lang#scala-reflect;2.10.4 from list in [default] | |
org.scala-lang#scalap;2.10.4 from list in [default] | |
:: evicted modules: | |
org.scala-lang#scalap;2.10.0 by [org.scala-lang#scalap;2.10.4] in [default] | |
--------------------------------------------------------------------- | |
| | modules || artifacts | | |
| conf | number| search|dwnlded|evicted|| number|dwnlded| | |
--------------------------------------------------------------------- | |
| default | 12 | 0 | 0 | 1 || 11 | 0 | | |
--------------------------------------------------------------------- | |
:: problems summary :: | |
:::: ERRORS | |
unknown resolver null | |
unknown resolver null | |
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS | |
:: retrieving :: org.apache.spark#spark-submit-parent | |
confs: [default] | |
Exception in thread "main" java.lang.RuntimeException: problem during retrieve of org.apache.spark#spark-submit-parent: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Premature end of file. | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:249) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:83) | |
at org.apache.ivy.Ivy.retrieve(Ivy.java:551) | |
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1006) | |
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286) | |
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153) | |
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) | |
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) | |
Caused by: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Premature end of file. | |
at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:293) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.determineArtifactsToCopy(RetrieveEngine.java:329) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:118) | |
... 7 more | |
Caused by: org.xml.sax.SAXParseException; Premature end of file. | |
at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source) | |
at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLVersionDetector.determineDocVersion(Unknown Source) | |
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) | |
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) | |
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source) | |
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source) | |
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source) | |
at org.apache.xerces.jaxp.SAXParserImpl.parse(Unknown Source) | |
at javax.xml.parsers.SAXParser.parse(SAXParser.java:328) | |
at org.apache.ivy.plugins.report.XmlReportParser$SaxXmlReportParser.parse(XmlReportParser.java:249) | |
at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:291) | |
... 9 more |
ln: creating symbolic link `/hpc/users/willir31/s/yarn/ldl': File exists | |
ln: cannot remove `/hpc/users/willir31/s/yarn/yl': No such file or directory | |
Dynamically requesting 1:1000 executors | |
Woops: /etc/hadoop/conf.cloudera.yarn/topology.py requires Python 2 | |
Found $PYTHON2_HOME at /hpc/users/willir31/.local; attempting to prepend /hpc/users/willir31/.local/bin to $PATH... | |
Running command: | |
/hpc/users/willir31/c/adam/bin/adam-submit --master yarn --deploy-mode client --executor-cores 6 --executor-memory 17g --driver-memory 10g --conf spark.default.parallelism=1000 --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs://demeter-nn1.demeter.hpc.mssm.edu:/spark/tmp/logs --conf spark.slim.host=hammerlab-dev3.hpc.mssm.edu --conf spark.extraListeners=org.apache.spark.JsonRelay --packages org.hammerlab:spark-json-relay:2.0.0 --conf spark.storage.memoryFraction=0.05 --conf spark.shuffle.memoryFraction=0.4 --conf spark.shuffle.service.enabled=true --conf spark.speculation=true --conf spark.speculation.interval=1000 --conf spark.speculation.multiplier=2 --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.minExecutors=1 --conf spark.dynamicAllocation.initialExecutors=1 --conf spark.dynamicAllocation.maxExecutors=1000 --conf spark.dynamicAllocation.schedulerBacklogTimeout=1 --conf spark.dynamicAllocation.executorIdleTimeout=300 --conf spark.yarn.executor.memoryOverhead=1024 --driver-java-options -Dyarn.resourcemanager.am.max-attempts=1 --conf spark.akka.timeout=120 --conf spark.file.transferTo=false --conf spark.core.connection.ack.wait.timeout=120 --conf spark.ui.enabled=false --executor-cores 10 --executor-memory 5g --driver-memory 1g --conf spark.shuffle.memoryFraction=0.8 -- transform /datasets/illumina_platinum/200x/fastq/ERR174364_1.fastq -paired_fastq /datasets/illumina_platinum/200x/fastq/ERR174364_2.fastq -concat /datasets/illumina_platinum/200x/fastq/ERR174364_unpaired.fastq -record_group ERR174364 -stringency SILENT /datasets/illumina_platinum/200x/ERR174364.adam | |
Using SPARK_SUBMIT=/hpc/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/bin/spark-submit | |
Ivy Default Cache set to: /hpc/users/willir31/.ivy2/cache | |
The jars for the packages stored in: /hpc/users/willir31/.ivy2/jars | |
:: loading settings :: url = jar:file:/demeter/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/lib/spark-assembly-1.5.0-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml | |
org.hammerlab#spark-json-relay added as a dependency | |
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0 | |
confs: [default] | |
found org.hammerlab#spark-json-relay;2.0.0 in list | |
found org.json4s#json4s-jackson_2.10;3.2.10 in list | |
found org.json4s#json4s-core_2.10;3.2.10 in list | |
found org.json4s#json4s-ast_2.10;3.2.10 in list | |
found com.thoughtworks.paranamer#paranamer;2.6 in list | |
found com.fasterxml.jackson.core#jackson-databind;2.3.1 in list | |
found com.fasterxml.jackson.core#jackson-annotations;2.3.0 in list | |
found com.fasterxml.jackson.core#jackson-core;2.3.1 in list | |
found org.scala-lang#scalap;2.10.4 in list | |
found org.scala-lang#scala-compiler;2.10.4 in list | |
found org.scala-lang#scala-reflect;2.10.4 in list | |
:: resolution report :: resolve 2466ms :: artifacts dl 129ms | |
:: modules in use: | |
com.fasterxml.jackson.core#jackson-annotations;2.3.0 from list in [default] | |
com.fasterxml.jackson.core#jackson-core;2.3.1 from list in [default] | |
com.fasterxml.jackson.core#jackson-databind;2.3.1 from list in [default] | |
com.thoughtworks.paranamer#paranamer;2.6 from list in [default] | |
org.hammerlab#spark-json-relay;2.0.0 from list in [default] | |
org.json4s#json4s-ast_2.10;3.2.10 from list in [default] | |
org.json4s#json4s-core_2.10;3.2.10 from list in [default] | |
org.json4s#json4s-jackson_2.10;3.2.10 from list in [default] | |
org.scala-lang#scala-compiler;2.10.4 from list in [default] | |
org.scala-lang#scala-reflect;2.10.4 from list in [default] | |
org.scala-lang#scalap;2.10.4 from list in [default] | |
:: evicted modules: | |
org.scala-lang#scalap;2.10.0 by [org.scala-lang#scalap;2.10.4] in [default] | |
--------------------------------------------------------------------- | |
| | modules || artifacts | | |
| conf | number| search|dwnlded|evicted|| number|dwnlded| | |
--------------------------------------------------------------------- | |
| default | 12 | 0 | 0 | 1 || 11 | 0 | | |
--------------------------------------------------------------------- | |
:: retrieving :: org.apache.spark#spark-submit-parent | |
confs: [default] | |
Exception in thread "main" java.lang.RuntimeException: problem during retrieve of org.apache.spark#spark-submit-parent: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Content is not allowed in prolog. | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:249) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:83) | |
at org.apache.ivy.Ivy.retrieve(Ivy.java:551) | |
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1006) | |
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286) | |
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153) | |
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) | |
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) | |
Caused by: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Content is not allowed in prolog. | |
at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:293) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.determineArtifactsToCopy(RetrieveEngine.java:329) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:118) | |
... 7 more | |
Caused by: org.xml.sax.SAXParseException; systemId: file:/hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml; lineNumber: 1; columnNumber: 1; Content is not allowed in prolog. | |
at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source) | |
at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLScanner.reportFatalError(Unknown Source) | |
at org.apache.xerces.impl.XMLDocumentScannerImpl$PrologDispatcher.dispatch(Unknown Source) | |
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source) | |
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) | |
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) | |
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source) | |
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source) | |
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source) | |
at org.apache.xerces.jaxp.SAXParserImpl.parse(Unknown Source) | |
at javax.xml.parsers.SAXParser.parse(SAXParser.java:328) | |
at org.apache.ivy.plugins.report.XmlReportParser$SaxXmlReportParser.parse(XmlReportParser.java:249) | |
at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:291) | |
... 9 more |
error: could not lock config file /hpc/users/willir31/.gitconfig: File exists | |
error: could not lock config file /hpc/users/willir31/.gitconfig: File exists | |
Dynamically requesting 1:1000 executors | |
Woops: /etc/hadoop/conf.cloudera.yarn/topology.py requires Python 2 | |
Found $PYTHON2_HOME at /hpc/users/willir31/.local; attempting to prepend /hpc/users/willir31/.local/bin to $PATH... | |
Running command: | |
/hpc/users/willir31/c/adam/bin/adam-submit --master yarn --deploy-mode client --executor-cores 6 --executor-memory 17g --driver-memory 10g --conf spark.default.parallelism=1000 --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs://demeter-nn1.demeter.hpc.mssm.edu:/spark/tmp/logs --conf spark.slim.host=hammerlab-dev3.hpc.mssm.edu --conf spark.extraListeners=org.apache.spark.JsonRelay --packages org.hammerlab:spark-json-relay:2.0.0 --conf spark.storage.memoryFraction=0.05 --conf spark.shuffle.memoryFraction=0.4 --conf spark.shuffle.service.enabled=true --conf spark.speculation=true --conf spark.speculation.interval=1000 --conf spark.speculation.multiplier=2 --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.minExecutors=1 --conf spark.dynamicAllocation.initialExecutors=1 --conf spark.dynamicAllocation.maxExecutors=1000 --conf spark.dynamicAllocation.schedulerBacklogTimeout=1 --conf spark.dynamicAllocation.executorIdleTimeout=300 --conf spark.yarn.executor.memoryOverhead=1024 --driver-java-options -Dyarn.resourcemanager.am.max-attempts=1 --conf spark.akka.timeout=120 --conf spark.file.transferTo=false --conf spark.core.connection.ack.wait.timeout=120 --conf spark.ui.enabled=false --executor-cores 10 --executor-memory 5g --driver-memory 1g --conf spark.shuffle.memoryFraction=0.8 -- transform /datasets/illumina_platinum/200x/fastq/ERR174369_1.fastq -paired_fastq /datasets/illumina_platinum/200x/fastq/ERR174369_2.fastq -concat /datasets/illumina_platinum/200x/fastq/ERR174369_unpaired.fastq -record_group ERR174369 -stringency SILENT /datasets/illumina_platinum/200x/ERR174369.adam | |
Using SPARK_SUBMIT=/hpc/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/bin/spark-submit | |
Ivy Default Cache set to: /hpc/users/willir31/.ivy2/cache | |
The jars for the packages stored in: /hpc/users/willir31/.ivy2/jars | |
:: loading settings :: url = jar:file:/demeter/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/lib/spark-assembly-1.5.0-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml | |
org.hammerlab#spark-json-relay added as a dependency | |
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0 | |
confs: [default] | |
found org.hammerlab#spark-json-relay;2.0.0 in list | |
found org.json4s#json4s-jackson_2.10;3.2.10 in list | |
found org.json4s#json4s-core_2.10;3.2.10 in list | |
found org.json4s#json4s-ast_2.10;3.2.10 in list | |
found com.thoughtworks.paranamer#paranamer;2.6 in list | |
found com.fasterxml.jackson.core#jackson-databind;2.3.1 in list | |
found com.fasterxml.jackson.core#jackson-annotations;2.3.0 in list | |
found com.fasterxml.jackson.core#jackson-core;2.3.1 in list | |
found org.scala-lang#scalap;2.10.4 in list | |
found org.scala-lang#scala-compiler;2.10.4 in list | |
found org.scala-lang#scala-reflect;2.10.4 in list | |
:: resolution report :: resolve 2873ms :: artifacts dl 180ms | |
:: modules in use: | |
com.fasterxml.jackson.core#jackson-annotations;2.3.0 from list in [default] | |
com.fasterxml.jackson.core#jackson-core;2.3.1 from list in [default] | |
com.fasterxml.jackson.core#jackson-databind;2.3.1 from list in [default] | |
com.thoughtworks.paranamer#paranamer;2.6 from list in [default] | |
org.hammerlab#spark-json-relay;2.0.0 from list in [default] | |
org.json4s#json4s-ast_2.10;3.2.10 from list in [default] | |
org.json4s#json4s-core_2.10;3.2.10 from list in [default] | |
org.json4s#json4s-jackson_2.10;3.2.10 from list in [default] | |
org.scala-lang#scala-compiler;2.10.4 from list in [default] | |
org.scala-lang#scala-reflect;2.10.4 from list in [default] | |
org.scala-lang#scalap;2.10.4 from list in [default] | |
:: evicted modules: | |
org.scala-lang#scalap;2.10.0 by [org.scala-lang#scalap;2.10.4] in [default] | |
--------------------------------------------------------------------- | |
| | modules || artifacts | | |
| conf | number| search|dwnlded|evicted|| number|dwnlded| | |
--------------------------------------------------------------------- | |
| default | 12 | 0 | 0 | 1 || 11 | 0 | | |
--------------------------------------------------------------------- | |
:: problems summary :: | |
:::: ERRORS | |
unknown resolver null | |
unknown resolver null | |
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS | |
:: retrieving :: org.apache.spark#spark-submit-parent | |
confs: [default] | |
Exception in thread "main" java.lang.RuntimeException: problem during retrieve of org.apache.spark#spark-submit-parent: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Premature end of file. | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:249) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:83) | |
at org.apache.ivy.Ivy.retrieve(Ivy.java:551) | |
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1006) | |
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286) | |
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153) | |
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) | |
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) | |
Caused by: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Premature end of file. | |
at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:293) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.determineArtifactsToCopy(RetrieveEngine.java:329) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:118) | |
... 7 more | |
Caused by: org.xml.sax.SAXParseException; Premature end of file. | |
at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source) | |
at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLVersionDetector.determineDocVersion(Unknown Source) | |
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) | |
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) | |
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source) | |
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source) | |
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source) | |
at org.apache.xerces.jaxp.SAXParserImpl.parse(Unknown Source) | |
at javax.xml.parsers.SAXParser.parse(SAXParser.java:328) | |
at org.apache.ivy.plugins.report.XmlReportParser$SaxXmlReportParser.parse(XmlReportParser.java:249) | |
at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:291) | |
... 9 more |
ln: creating symbolic link `/hpc/users/willir31/s/yarn/ldl': File exists | |
ln: creating symbolic link `/hpc/users/willir31/s/yarn/ycl': File exists | |
ln: cannot remove `/hpc/users/willir31/s/yarn/yl': No such file or directory | |
ln: cannot remove `/hpc/users/willir31/s/yarn/yll': No such file or directory | |
Dynamically requesting 1:1000 executors | |
Woops: /etc/hadoop/conf.cloudera.yarn/topology.py requires Python 2 | |
Found $PYTHON2_HOME at /hpc/users/willir31/.local; attempting to prepend /hpc/users/willir31/.local/bin to $PATH... | |
Running command: | |
/hpc/users/willir31/c/adam/bin/adam-submit --master yarn --deploy-mode client --executor-cores 6 --executor-memory 17g --driver-memory 10g --conf spark.default.parallelism=1000 --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs://demeter-nn1.demeter.hpc.mssm.edu:/spark/tmp/logs --conf spark.slim.host=hammerlab-dev3.hpc.mssm.edu --conf spark.extraListeners=org.apache.spark.JsonRelay --packages org.hammerlab:spark-json-relay:2.0.0 --conf spark.storage.memoryFraction=0.05 --conf spark.shuffle.memoryFraction=0.4 --conf spark.shuffle.service.enabled=true --conf spark.speculation=true --conf spark.speculation.interval=1000 --conf spark.speculation.multiplier=2 --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.minExecutors=1 --conf spark.dynamicAllocation.initialExecutors=1 --conf spark.dynamicAllocation.maxExecutors=1000 --conf spark.dynamicAllocation.schedulerBacklogTimeout=1 --conf spark.dynamicAllocation.executorIdleTimeout=300 --conf spark.yarn.executor.memoryOverhead=1024 --driver-java-options -Dyarn.resourcemanager.am.max-attempts=1 --conf spark.akka.timeout=120 --conf spark.file.transferTo=false --conf spark.core.connection.ack.wait.timeout=120 --conf spark.ui.enabled=false --executor-cores 10 --executor-memory 5g --driver-memory 1g --conf spark.shuffle.memoryFraction=0.8 -- transform /datasets/illumina_platinum/200x/fastq/ERR174372_1.fastq -paired_fastq /datasets/illumina_platinum/200x/fastq/ERR174372_2.fastq -concat /datasets/illumina_platinum/200x/fastq/ERR174372_unpaired.fastq -record_group ERR174372 -stringency SILENT /datasets/illumina_platinum/200x/ERR174372.adam | |
Using SPARK_SUBMIT=/hpc/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/bin/spark-submit | |
Ivy Default Cache set to: /hpc/users/willir31/.ivy2/cache | |
The jars for the packages stored in: /hpc/users/willir31/.ivy2/jars | |
:: loading settings :: url = jar:file:/demeter/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/lib/spark-assembly-1.5.0-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml | |
org.hammerlab#spark-json-relay added as a dependency | |
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0 | |
confs: [default] | |
found org.hammerlab#spark-json-relay;2.0.0 in list | |
found org.json4s#json4s-jackson_2.10;3.2.10 in list | |
found org.json4s#json4s-core_2.10;3.2.10 in list | |
found org.json4s#json4s-ast_2.10;3.2.10 in list | |
found com.thoughtworks.paranamer#paranamer;2.6 in list | |
found com.fasterxml.jackson.core#jackson-databind;2.3.1 in list | |
found com.fasterxml.jackson.core#jackson-annotations;2.3.0 in list | |
found com.fasterxml.jackson.core#jackson-core;2.3.1 in list | |
found org.scala-lang#scalap;2.10.4 in list | |
found org.scala-lang#scala-compiler;2.10.4 in list | |
found org.scala-lang#scala-reflect;2.10.4 in list | |
:: resolution report :: resolve 3123ms :: artifacts dl 76ms | |
:: modules in use: | |
com.fasterxml.jackson.core#jackson-annotations;2.3.0 from list in [default] | |
com.fasterxml.jackson.core#jackson-core;2.3.1 from list in [default] | |
com.fasterxml.jackson.core#jackson-databind;2.3.1 from list in [default] | |
com.thoughtworks.paranamer#paranamer;2.6 from list in [default] | |
org.hammerlab#spark-json-relay;2.0.0 from list in [default] | |
org.json4s#json4s-ast_2.10;3.2.10 from list in [default] | |
org.json4s#json4s-core_2.10;3.2.10 from list in [default] | |
org.json4s#json4s-jackson_2.10;3.2.10 from list in [default] | |
org.scala-lang#scala-compiler;2.10.4 from list in [default] | |
org.scala-lang#scala-reflect;2.10.4 from list in [default] | |
org.scala-lang#scalap;2.10.4 from list in [default] | |
:: evicted modules: | |
org.scala-lang#scalap;2.10.0 by [org.scala-lang#scalap;2.10.4] in [default] | |
--------------------------------------------------------------------- | |
| | modules || artifacts | | |
| conf | number| search|dwnlded|evicted|| number|dwnlded| | |
--------------------------------------------------------------------- | |
| default | 12 | 0 | 0 | 1 || 11 | 0 | | |
--------------------------------------------------------------------- | |
:: problems summary :: | |
:::: ERRORS | |
unknown resolver null | |
unknown resolver null | |
unknown resolver null | |
:: USE VERBOSE OR DEBUG MESSAGE LEVEL FOR MORE DETAILS | |
:: retrieving :: org.apache.spark#spark-submit-parent | |
confs: [default] | |
Exception in thread "main" java.lang.RuntimeException: problem during retrieve of org.apache.spark#spark-submit-parent: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Content is not allowed in prolog. | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:249) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:83) | |
at org.apache.ivy.Ivy.retrieve(Ivy.java:551) | |
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1006) | |
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286) | |
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153) | |
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) | |
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) | |
Caused by: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Content is not allowed in prolog. | |
at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:293) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.determineArtifactsToCopy(RetrieveEngine.java:329) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:118) | |
... 7 more | |
Caused by: org.xml.sax.SAXParseException; systemId: file:/hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml; lineNumber: 1; columnNumber: 1; Content is not allowed in prolog. | |
at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source) | |
at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLScanner.reportFatalError(Unknown Source) | |
at org.apache.xerces.impl.XMLDocumentScannerImpl$PrologDispatcher.dispatch(Unknown Source) | |
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source) | |
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) | |
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) | |
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source) | |
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source) | |
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source) | |
at org.apache.xerces.jaxp.SAXParserImpl.parse(Unknown Source) | |
at javax.xml.parsers.SAXParser.parse(SAXParser.java:328) | |
at org.apache.ivy.plugins.report.XmlReportParser$SaxXmlReportParser.parse(XmlReportParser.java:249) | |
at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:291) | |
... 9 more |
ln: cannot remove `/hpc/users/willir31/s/yarn/ycl': No such file or directory | |
error: could not lock config file /hpc/users/willir31/.gitconfig: File exists | |
Dynamically requesting 1:1000 executors | |
Woops: /etc/hadoop/conf.cloudera.yarn/topology.py requires Python 2 | |
Found $PYTHON2_HOME at /hpc/users/willir31/.local; attempting to prepend /hpc/users/willir31/.local/bin to $PATH... | |
Running command: | |
/hpc/users/willir31/c/adam/bin/adam-submit --master yarn --deploy-mode client --executor-cores 6 --executor-memory 17g --driver-memory 10g --conf spark.default.parallelism=1000 --conf spark.eventLog.enabled=true --conf spark.eventLog.dir=hdfs://demeter-nn1.demeter.hpc.mssm.edu:/spark/tmp/logs --conf spark.slim.host=hammerlab-dev3.hpc.mssm.edu --conf spark.extraListeners=org.apache.spark.JsonRelay --packages org.hammerlab:spark-json-relay:2.0.0 --conf spark.storage.memoryFraction=0.05 --conf spark.shuffle.memoryFraction=0.4 --conf spark.shuffle.service.enabled=true --conf spark.speculation=true --conf spark.speculation.interval=1000 --conf spark.speculation.multiplier=2 --conf spark.dynamicAllocation.enabled=true --conf spark.dynamicAllocation.minExecutors=1 --conf spark.dynamicAllocation.initialExecutors=1 --conf spark.dynamicAllocation.maxExecutors=1000 --conf spark.dynamicAllocation.schedulerBacklogTimeout=1 --conf spark.dynamicAllocation.executorIdleTimeout=300 --conf spark.yarn.executor.memoryOverhead=1024 --driver-java-options -Dyarn.resourcemanager.am.max-attempts=1 --conf spark.akka.timeout=120 --conf spark.file.transferTo=false --conf spark.core.connection.ack.wait.timeout=120 --conf spark.ui.enabled=false --executor-cores 10 --executor-memory 5g --driver-memory 1g --conf spark.shuffle.memoryFraction=0.8 -- transform /datasets/illumina_platinum/200x/fastq/ERR174377_1.fastq -paired_fastq /datasets/illumina_platinum/200x/fastq/ERR174377_2.fastq -concat /datasets/illumina_platinum/200x/fastq/ERR174377_unpaired.fastq -record_group ERR174377 -stringency SILENT /datasets/illumina_platinum/200x/ERR174377.adam | |
Using SPARK_SUBMIT=/hpc/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/bin/spark-submit | |
Ivy Default Cache set to: /hpc/users/willir31/.ivy2/cache | |
The jars for the packages stored in: /hpc/users/willir31/.ivy2/jars | |
:: loading settings :: url = jar:file:/demeter/users/willir31/sparks/spark-1.5.0-bin-hadoop2.6/lib/spark-assembly-1.5.0-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml | |
org.hammerlab#spark-json-relay added as a dependency | |
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0 | |
confs: [default] | |
found org.hammerlab#spark-json-relay;2.0.0 in list | |
found org.json4s#json4s-jackson_2.10;3.2.10 in list | |
found org.json4s#json4s-core_2.10;3.2.10 in list | |
found org.json4s#json4s-ast_2.10;3.2.10 in list | |
found com.thoughtworks.paranamer#paranamer;2.6 in list | |
found com.fasterxml.jackson.core#jackson-databind;2.3.1 in list | |
found com.fasterxml.jackson.core#jackson-annotations;2.3.0 in list | |
found com.fasterxml.jackson.core#jackson-core;2.3.1 in list | |
found org.scala-lang#scalap;2.10.4 in list | |
found org.scala-lang#scala-compiler;2.10.4 in list | |
found org.scala-lang#scala-reflect;2.10.4 in list | |
:: resolution report :: resolve 3240ms :: artifacts dl 131ms | |
:: modules in use: | |
com.fasterxml.jackson.core#jackson-annotations;2.3.0 from list in [default] | |
com.fasterxml.jackson.core#jackson-core;2.3.1 from list in [default] | |
com.fasterxml.jackson.core#jackson-databind;2.3.1 from list in [default] | |
com.thoughtworks.paranamer#paranamer;2.6 from list in [default] | |
org.hammerlab#spark-json-relay;2.0.0 from list in [default] | |
org.json4s#json4s-ast_2.10;3.2.10 from list in [default] | |
org.json4s#json4s-core_2.10;3.2.10 from list in [default] | |
org.json4s#json4s-jackson_2.10;3.2.10 from list in [default] | |
org.scala-lang#scala-compiler;2.10.4 from list in [default] | |
org.scala-lang#scala-reflect;2.10.4 from list in [default] | |
org.scala-lang#scalap;2.10.4 from list in [default] | |
:: evicted modules: | |
org.scala-lang#scalap;2.10.0 by [org.scala-lang#scalap;2.10.4] in [default] | |
--------------------------------------------------------------------- | |
| | modules || artifacts | | |
| conf | number| search|dwnlded|evicted|| number|dwnlded| | |
--------------------------------------------------------------------- | |
| default | 12 | 0 | 0 | 1 || 11 | 0 | | |
--------------------------------------------------------------------- | |
:: retrieving :: org.apache.spark#spark-submit-parent | |
confs: [default] | |
Exception in thread "main" java.lang.RuntimeException: problem during retrieve of org.apache.spark#spark-submit-parent: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Content is not allowed in prolog. | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:249) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:83) | |
at org.apache.ivy.Ivy.retrieve(Ivy.java:551) | |
at org.apache.spark.deploy.SparkSubmitUtils$.resolveMavenCoordinates(SparkSubmit.scala:1006) | |
at org.apache.spark.deploy.SparkSubmit$.prepareSubmitEnvironment(SparkSubmit.scala:286) | |
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:153) | |
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:120) | |
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala) | |
Caused by: java.text.ParseException: failed to parse report: /hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml: Content is not allowed in prolog. | |
at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:293) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.determineArtifactsToCopy(RetrieveEngine.java:329) | |
at org.apache.ivy.core.retrieve.RetrieveEngine.retrieve(RetrieveEngine.java:118) | |
... 7 more | |
Caused by: org.xml.sax.SAXParseException; systemId: file:/hpc/users/willir31/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml; lineNumber: 1; columnNumber: 1; Content is not allowed in prolog. | |
at org.apache.xerces.util.ErrorHandlerWrapper.createSAXParseException(Unknown Source) | |
at org.apache.xerces.util.ErrorHandlerWrapper.fatalError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLErrorReporter.reportError(Unknown Source) | |
at org.apache.xerces.impl.XMLScanner.reportFatalError(Unknown Source) | |
at org.apache.xerces.impl.XMLDocumentScannerImpl$PrologDispatcher.dispatch(Unknown Source) | |
at org.apache.xerces.impl.XMLDocumentFragmentScannerImpl.scanDocument(Unknown Source) | |
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) | |
at org.apache.xerces.parsers.XML11Configuration.parse(Unknown Source) | |
at org.apache.xerces.parsers.XMLParser.parse(Unknown Source) | |
at org.apache.xerces.parsers.AbstractSAXParser.parse(Unknown Source) | |
at org.apache.xerces.jaxp.SAXParserImpl$JAXPSAXParser.parse(Unknown Source) | |
at org.apache.xerces.jaxp.SAXParserImpl.parse(Unknown Source) | |
at javax.xml.parsers.SAXParser.parse(SAXParser.java:328) | |
at org.apache.ivy.plugins.report.XmlReportParser$SaxXmlReportParser.parse(XmlReportParser.java:249) | |
at org.apache.ivy.plugins.report.XmlReportParser.parse(XmlReportParser.java:291) | |
... 9 more |
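The failures above ("Premature end of file", "Content is not allowed in prolog") all come from parsing the same shared report file, `~/.ivy2/cache/org.apache.spark-spark-submit-parent-default.xml`, which is consistent with concurrent `spark-submit --packages` invocations racing on one Ivy cache: each resolution rewrites that report, so a second process can read it half-written. A minimal sketch of a workaround, assuming the `spark.jars.ivy` configuration property (which redirects the Ivy user dir) is honored by this Spark version, is to give each concurrent job a private Ivy directory:

```shell
# Sketch: avoid the shared-report race by pointing each concurrent
# spark-submit at its own Ivy dir. The per-job directory name here is
# illustrative; the cost is re-downloading/caching packages per job.
IVY_DIR="$(mktemp -d "${TMPDIR:-/tmp}/spark-ivy.XXXXXX")"
echo "Using per-job Ivy dir: $IVY_DIR"

# Then add to the submit invocation (commented out; spark-submit and the
# remaining arguments are as in the logs above):
#   spark-submit --conf spark.jars.ivy="$IVY_DIR" \
#     --packages org.hammerlab:spark-json-relay:2.0.0 ...
```

Alternatively, serializing the submissions (or pre-populating the cache with a single warm-up resolution before launching the batch) sidesteps the race without duplicating the cache.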