Created
March 20, 2017 06:31
xgboost build problem
mvn clean install [±master ✓]
[INFO] Scanning for projects...
[WARNING]
[WARNING] Some problems were encountered while building the effective model for ml.dmlc:xgboost4j:jar:0.7
[WARNING] 'build.plugins.plugin.version' for org.codehaus.mojo:exec-maven-plugin is missing. @ line 40, column 29
[WARNING]
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING]
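The missing-version warning above refers to the exec-maven-plugin declaration in jvm-packages/pom.xml (line 40 per the warning). Since the log further down resolves exec-maven-plugin:1.6.0, a minimal fix is to pin that version explicitly; the exact placement inside the existing <plugin> element is an assumption:

```xml
<!-- jvm-packages/pom.xml: declare the plugin version explicitly so Maven
     stops warning about the missing 'build.plugins.plugin.version'.
     1.6.0 matches the version the build resolves later in this log. -->
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <version>1.6.0</version>
</plugin>
```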
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] xgboost-jvm
[INFO] xgboost4j
[INFO] xgboost4j-spark
[INFO] xgboost4j-flink
[INFO] xgboost4j-example
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building xgboost-jvm 0.7
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ xgboost-jvm ---
[INFO] Deleting /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/target
[INFO]
[INFO] --- scalastyle-maven-plugin:0.8.0:check (checkstyle) @ xgboost-jvm ---
[WARNING] sourceDirectory is not specified or does not exist value=/Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/src/main/scala
[WARNING] testSourceDirectory is not specified or does not exist value=/Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/src/test/scala
Processed 0 file(s)
Found 0 errors
Found 0 warnings
Found 0 infos
Finished in 71 ms
[INFO]
[INFO] --- maven-checkstyle-plugin:2.17:check (checkstyle) @ xgboost-jvm ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (default) @ xgboost-jvm ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] ml.dmlc:xgboost-jvm:0.7 requires scala version: 2.11.8
[WARNING] org.scala-lang:scala-compiler:2.11.8 requires scala version: 2.11.8
[WARNING] org.scala-lang.modules:scala-xml_2.11:1.0.4 requires scala version: 2.11.4
[WARNING] Multiple versions of scala libraries detected!
[INFO] No sources to compile
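The repeated "Multiple versions of scala libraries detected" warning comes from transitive dependencies compiled against older 2.11.x releases (scala-xml against 2.11.4 here, and akka-actor against 2.11.5 in later modules). Scala 2.11.x minor versions are binary compatible, so this is noise rather than a build breaker; one way to silence it is to force a single scala-xml version via dependencyManagement in the parent POM. A sketch, assuming jvm-packages/pom.xml is the right place and that pinning 1.0.4 (the version already on the classpath) is acceptable:

```xml
<!-- jvm-packages/pom.xml: force one scala-xml version across all modules.
     This silences the mixed-Scala-version warning; it does not change
     behavior, since 2.11.x minor versions are binary compatible. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.scala-lang.modules</groupId>
      <artifactId>scala-xml_2.11</artifactId>
      <version>1.0.4</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```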
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ xgboost-jvm ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] ml.dmlc:xgboost-jvm:0.7 requires scala version: 2.11.8
[WARNING] org.scala-lang:scala-compiler:2.11.8 requires scala version: 2.11.8
[WARNING] org.scala-lang.modules:scala-xml_2.11:1.0.4 requires scala version: 2.11.4
[WARNING] Multiple versions of scala libraries detected!
[INFO] No sources to compile
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:add-source (scala-compile-first) @ xgboost-jvm ---
[INFO] Add Source directory: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/src/main/scala
[INFO] Add Test Source directory: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/src/test/scala
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (compile) @ xgboost-jvm ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] ml.dmlc:xgboost-jvm:0.7 requires scala version: 2.11.8
[WARNING] org.scala-lang:scala-compiler:2.11.8 requires scala version: 2.11.8
[WARNING] org.scala-lang.modules:scala-xml_2.11:1.0.4 requires scala version: 2.11.4
[WARNING] Multiple versions of scala libraries detected!
[INFO] No sources to compile
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:testCompile (test-compile) @ xgboost-jvm ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] ml.dmlc:xgboost-jvm:0.7 requires scala version: 2.11.8
[WARNING] org.scala-lang:scala-compiler:2.11.8 requires scala version: 2.11.8
[WARNING] org.scala-lang.modules:scala-xml_2.11:1.0.4 requires scala version: 2.11.4
[WARNING] Multiple versions of scala libraries detected!
[INFO] No sources to compile
[INFO]
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ xgboost-jvm ---
Discovery starting.
Discovery completed in 34 milliseconds.
Run starting. Expected test count is: 0
DiscoverySuite:
Run completed in 66 milliseconds.
Total number of tests run: 0
Suites: completed 1, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
[INFO]
[INFO] --- maven-assembly-plugin:2.6:single (make-assembly) @ xgboost-jvm ---
[INFO] Assemblies have been skipped per configuration of the skipAssembly parameter.
[INFO]
[INFO] --- maven-source-plugin:3.0.1:jar-no-fork (attach-sources) @ xgboost-jvm ---
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ xgboost-jvm ---
[INFO] Installing /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/pom.xml to /Users/geoheil/.m2/repository/ml/dmlc/xgboost-jvm/0.7/xgboost-jvm-0.7.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building xgboost4j 0.7
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ xgboost4j ---
[INFO] Deleting /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target
[INFO]
[INFO] --- scalastyle-maven-plugin:0.8.0:check (checkstyle) @ xgboost4j ---
Processed 13 file(s)
Found 0 errors
Found 0 warnings
Found 0 infos
Finished in 1061 ms
[INFO]
[INFO] --- maven-checkstyle-plugin:2.17:check (checkstyle) @ xgboost4j ---
[INFO]
[INFO] --- exec-maven-plugin:1.6.0:exec (native) @ xgboost4j ---
build java wrapper
g++-6 -std=c++11 -Wall -Wno-unknown-pragmas -Iinclude -Idmlc-core/include -Irabit/include -O3 -funroll-loops -msse2 -fPIC -fopenmp -I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include -I./java -I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include/darwin -shared -o jvm-packages/lib/libxgboost4j.so jvm-packages/xgboost4j/src/native/xgboost4j.cpp build/learner.o build/logging.o build/c_api/c_api.o build/c_api/c_api_error.o build/common/common.o build/common/hist_util.o build/data/data.o build/data/simple_csr_source.o build/data/simple_dmatrix.o build/data/sparse_page_dmatrix.o build/data/sparse_page_raw_format.o build/data/sparse_page_source.o build/data/sparse_page_writer.o build/gbm/gblinear.o build/gbm/gbm.o build/gbm/gbtree.o build/metric/elementwise_metric.o build/metric/metric.o build/metric/multiclass_metric.o build/metric/rank_metric.o build/objective/multiclass_obj.o build/objective/objective.o build/objective/rank_obj.o build/objective/regression_obj.o build/tree/tree_model.o build/tree/tree_updater.o build/tree/updater_colmaker.o build/tree/updater_fast_hist.o build/tree/updater_histmaker.o build/tree/updater_prune.o build/tree/updater_refresh.o build/tree/updater_skmaker.o build/tree/updater_sync.o dmlc-core/libdmlc.a rabit/lib/librabit.a -pthread -lm -fopenmp
move native lib
/Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages
complete
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ xgboost4j ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (default) @ xgboost4j ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/main/java:-1: info: compiling
[INFO] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/main/scala:-1: info: compiling
[INFO] Compiling 24 source files to /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/classes at 1489991262063
[WARNING] warning: there was one deprecation warning; re-run with -deprecation for details
[WARNING] warning: there were four feature warnings; re-run with -feature for details
[WARNING] two warnings found
[INFO] prepare-compile in 0 s
[INFO] compile in 6 s
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ xgboost4j ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:add-source (scala-compile-first) @ xgboost4j ---
[INFO] Add Source directory: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/main/scala
[INFO] Add Test Source directory: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/test/scala
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ xgboost4j ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 14 source files to /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/classes
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (compile) @ xgboost4j ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ xgboost4j ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/test/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ xgboost4j ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 2 source files to /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/test-classes
[WARNING] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/test/java/ml/dmlc/xgboost4j/java/DMatrixTest.java: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/test/java/ml/dmlc/xgboost4j/java/DMatrixTest.java uses or overrides a deprecated API.
[WARNING] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/test/java/ml/dmlc/xgboost4j/java/DMatrixTest.java: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:testCompile (test-compile) @ xgboost4j ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/test/java:-1: info: compiling
[INFO] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/test/scala:-1: info: compiling
[INFO] Compiling 5 source files to /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/test-classes at 1489991269808
[WARNING] warning: there were two deprecation warnings; re-run with -deprecation for details
[WARNING] one warning found
[INFO] prepare-compile in 0 s
[INFO] compile in 4 s
[INFO]
[INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ xgboost4j ---
-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running ml.dmlc.xgboost4j.java.BoosterImplTest
[07:27:55] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[0] training-auc:0.958228
[07:27:55] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[1] training-auc:0.987161
[2] training-auc:0.993971
[3] training-auc:0.997779
[4] training-auc:0.998543
[5] training-auc:0.998960
[6] training-auc:0.998960
[7] training-auc:0.998960
[8] training-auc:0.999250
[9] training-auc:0.999365
Mär 20, 2017 7:27:55 AM ml.dmlc.xgboost4j.java.XGBoost crossValidation
INFORMATION: [0] cv-test-error:0,014439 cv-train-error:0,014431
Mär 20, 2017 7:27:55 AM ml.dmlc.xgboost4j.java.XGBoost crossValidation
INFORMATION: [1] cv-test-error:0,001690 cv-train-error:0,001688
[0] training-auc:0.994714
[07:27:55] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[1] training-auc:0.998459
[2] training-auc:0.998459
[3] training-auc:0.998459
[4] training-auc:0.999977
[5] training-auc:0.999991
[6] training-auc:0.999991
[7] training-auc:0.999991
[8] training-auc:1.000000
[9] training-auc:1.000000
[0] test-error:0.042831 train-error:0.046522
[1] test-error:0.021726 train-error:0.022263
[2] test-error:0.006207 train-error:0.007063
[3] test-error:0.018001 train-error:0.015200
[4] test-error:0.006207 train-error:0.007063
[0] test-error:0.042831 train-error:0.046522
[1] test-error:0.021726 train-error:0.022263
[2] test-error:0.006207 train-error:0.007063
[3] test-error:0.018001 train-error:0.015200
[4] test-error:0.006207 train-error:0.007063
[0] training-auc:0.958228
[1] training-auc:0.987161
[2] training-auc:0.993971
[3] training-auc:0.997779
[4] training-auc:0.998543
[5] training-auc:0.998960
[6] training-auc:0.998960
[7] training-auc:0.998960
[8] training-auc:0.999250
[9] training-auc:0.999365
[07:27:55] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[0] test-auc:0.986255 training-auc:0.987112
[07:27:55] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[1] test-auc:0.998653 training-auc:0.997538
[2] test-auc:0.998958 training-auc:0.997675
[3] test-auc:0.999957 training-auc:0.998421
[4] test-auc:1.000000 training-auc:0.999880
[5] test-auc:1.000000 training-auc:0.999880
[6] test-auc:1.000000 training-auc:0.999935
[7] test-auc:1.000000 training-auc:1.000000
[8] test-auc:1.000000 training-auc:1.000000
[9] test-auc:1.000000 training-auc:1.000000
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.403 sec - in ml.dmlc.xgboost4j.java.BoosterImplTest
Running ml.dmlc.xgboost4j.java.DMatrixTest
5
[0] test-auc:0.994937 training-auc:0.994714
[1] test-auc:0.999972 training-auc:0.998459
[2] test-auc:0.999972 training-auc:0.998459
[07:27:55] src/c_api/c_api.cc:264: num_row=3
[3] test-auc:0.999972 training-auc:0.998459
[4] test-auc:1.000000 training-auc:0.999977
[5] test-auc:1.000000 training-auc:0.999991
[6] test-auc:1.000000 training-auc:0.999991
[7] test-auc:1.000000 training-auc:0.999991
[8] test-auc:1.000000 training-auc:1.000000
[9] test-auc:1.000000 training-auc:1.000000
[0] test-error:0.042831 train-error:0.046522
[1] test-error:0.021726 train-error:0.022263
[2] test-error:0.006207 train-error:0.007063
[3] test-error:0.018001 train-error:0.015200
[4] test-error:0.006207 train-error:0.007063
5
[07:27:55] src/c_api/c_api.cc:264: num_row=3
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.009 sec - in ml.dmlc.xgboost4j.java.DMatrixTest
Running ml.dmlc.xgboost4j.scala.rabit.RabitTrackerConnectionHandlerTest
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.669 sec - in ml.dmlc.xgboost4j.scala.rabit.RabitTrackerConnectionHandlerTest
Results :
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0
[INFO]
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ xgboost4j ---
Discovery starting.
Discovery completed in 441 milliseconds.
Run starting. Expected test count is: 20
RabitTrackerConnectionHandlerTest:
RabitTrackerConnectionHandler
- should handle Rabit client 'start' command properly
- should forward print command to tracker
- should handle fragmented print command without throwing exception
- should handle spill-over Tcp data correctly between state transition
ScalaBoosterImplSuite:
- basic operation of booster
- save/load model with path
- save/load model with stream
[0] train-error:0.046522 test-error:0.042831
[1] train-error:0.022263 test-error:0.021726
[0] train-error:0.046522 test-error:0.042831
[1] train-error:0.022263 test-error:0.021726
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[0] train-error:0.046522 test-error:0.042831
[1] train-error:0.022263 test-error:0.021726
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
Mär 20, 2017 7:27:57 AM ml.dmlc.xgboost4j.java.XGBoost crossValidation
INFORMATION: [0] cv-test-error:0,014439 cv-train-error:0,014431
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 10 extra nodes, 0 pruned nodes, max_depth=3
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 10 extra nodes, 0 pruned nodes, max_depth=3
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 10 extra nodes, 0 pruned nodes, max_depth=3
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 10 extra nodes, 0 pruned nodes, max_depth=3
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 10 extra nodes, 0 pruned nodes, max_depth=3
Mär 20, 2017 7:27:57 AM ml.dmlc.xgboost4j.java.XGBoost crossValidation
INFORMATION: [1] cv-test-error:0,001229 cv-train-error:0,001228
- cross validation
[07:27:57] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[0] training-auc:0.987112 test-auc:0.986255
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 10 extra nodes, 0 pruned nodes, max_depth=3
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[1] training-auc:0.997538 test-auc:0.998653
[2] training-auc:0.997675 test-auc:0.998958
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[3] training-auc:0.998421 test-auc:0.999957
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[4] training-auc:0.999880 test-auc:1.000000
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[5] training-auc:0.999880 test-auc:1.000000
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[6] training-auc:0.999935 test-auc:1.000000
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[7] training-auc:1.000000 test-auc:1.000000
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[8] training-auc:1.000000 test-auc:1.000000
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
- test with fast histo depthwise
[9] training-auc:1.000000 test-auc:1.000000
[07:27:57] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[0] training-auc:0.994714 test-auc:0.994937
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[1] training-auc:0.998459 test-auc:0.999972
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[2] training-auc:0.998459 test-auc:0.999972
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[3] training-auc:0.998459 test-auc:0.999972
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[4] training-auc:0.999977 test-auc:1.000000
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[5] training-auc:0.999991 test-auc:1.000000
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[6] training-auc:0.999991 test-auc:1.000000
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[7] training-auc:0.999991 test-auc:1.000000
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[8] training-auc:1.000000 test-auc:1.000000
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
- test with fast histo lossguide
[07:27:57] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[9] training-auc:1.000000 test-auc:1.000000
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[0] training-auc:0.994714
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[1] training-auc:0.998459
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[2] training-auc:0.998459
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[3] training-auc:0.998459
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[4] training-auc:0.999977
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[5] training-auc:0.999991
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[6] training-auc:0.999991
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[7] training-auc:0.999991
[8] training-auc:1.000000
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
- test with fast histo lossguide with max bin
[9] training-auc:1.000000
[07:27:57] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[0] training-auc:0.958228
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[1] training-auc:0.987161
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[2] training-auc:0.993971
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[3] training-auc:0.997779
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 4 extra nodes, 0 pruned nodes, max_depth=2
[4] training-auc:0.998543
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 4 extra nodes, 0 pruned nodes, max_depth=2
[5] training-auc:0.998960
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[6] training-auc:0.998960
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 4 extra nodes, 0 pruned nodes, max_depth=2
[7] training-auc:0.998960
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[8] training-auc:0.999250
- test with fast histo depthwidth with max depth
[07:27:57] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[9] training-auc:0.999365
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[0] training-auc:0.958228
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[1] training-auc:0.987161
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[2] training-auc:0.993971
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[3] training-auc:0.997779
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 4 extra nodes, 0 pruned nodes, max_depth=2
[4] training-auc:0.998543
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 4 extra nodes, 0 pruned nodes, max_depth=2
[5] training-auc:0.998960
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[6] training-auc:0.998960
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 4 extra nodes, 0 pruned nodes, max_depth=2
[7] training-auc:0.998960
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[8] training-auc:0.999250
[07:27:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
- test with fast histo depthwidth with max depth and max bin
DMatrixSuite:
- create DMatrix from File
[9] training-auc:0.999365
[07:27:57] src/c_api/c_api.cc:264: num_row=3
- create DMatrix from CSR
[07:27:57] src/c_api/c_api.cc:264: num_row=3
- create DMatrix from CSREx
- create DMatrix from CSC
- create DMatrix from CSCEx
- create DMatrix from DenseMatrix
- create DMatrix from DenseMatrix with missing value
Run completed in 979 milliseconds.
Total number of tests run: 20
Suites: completed 4, aborted 0
Tests: succeeded 20, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ xgboost4j ---
[INFO] Building jar: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/xgboost4j-0.7.jar
[INFO]
[INFO] --- maven-assembly-plugin:2.6:single (make-assembly) @ xgboost4j ---
[INFO] Building jar: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/xgboost4j-0.7-jar-with-dependencies.jar
[INFO]
[INFO] --- maven-source-plugin:3.0.1:jar-no-fork (attach-sources) @ xgboost4j ---
[INFO] Building jar: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/xgboost4j-0.7-sources.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ xgboost4j ---
[INFO] Installing /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/xgboost4j-0.7.jar to /Users/geoheil/.m2/repository/ml/dmlc/xgboost4j/0.7/xgboost4j-0.7.jar
[INFO] Installing /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/pom.xml to /Users/geoheil/.m2/repository/ml/dmlc/xgboost4j/0.7/xgboost4j-0.7.pom
[INFO] Installing /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/xgboost4j-0.7-jar-with-dependencies.jar to /Users/geoheil/.m2/repository/ml/dmlc/xgboost4j/0.7/xgboost4j-0.7-jar-with-dependencies.jar
[INFO] Installing /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/xgboost4j-0.7-sources.jar to /Users/geoheil/.m2/repository/ml/dmlc/xgboost4j/0.7/xgboost4j-0.7-sources.jar
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building xgboost4j-spark 0.7
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ xgboost4j-spark ---
[INFO] Deleting /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/target
[INFO]
[INFO] --- scalastyle-maven-plugin:0.8.0:check (checkstyle) @ xgboost4j-spark ---
Processed 19 file(s)
Found 0 errors
Found 0 warnings
Found 0 infos
Finished in 836 ms
[INFO]
[INFO] --- maven-checkstyle-plugin:2.17:check (checkstyle) @ xgboost4j-spark ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ xgboost4j-spark ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/src/main/resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (default) @ xgboost4j-spark ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/src/main/scala:-1: info: compiling
[INFO] Compiling 11 source files to /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/target/classes at 1489991283776
[WARNING] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/src/main/scala/ml/dmlc/xgboost4j/scala/spark/XGBoostEstimator.scala:74: warning: match may not be exhaustive.
[WARNING] It would fail on the following input: Some((x: org.apache.spark.ml.param.Param[?] forSome x not in (org.apache.spark.ml.param.BooleanParam, org.apache.spark.ml.param.DoubleParam, org.apache.spark.ml.param.FloatParam, org.apache.spark.ml.param.IntParam, org.apache.spark.ml.param.Param[?])))
[WARNING] params.find(_.name == paramName) match {
[WARNING] ^
[WARNING] warning: there were four feature warnings; re-run with -feature for details
[WARNING] two warnings found
[INFO] prepare-compile in 0 s
[INFO] compile in 7 s
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ xgboost4j-spark ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:add-source (scala-compile-first) @ xgboost4j-spark ---
[INFO] Add Source directory: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/src/main/scala
[INFO] Add Test Source directory: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/src/test/scala
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ xgboost4j-spark ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (compile) @ xgboost4j-spark ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ xgboost4j-spark ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 11 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ xgboost4j-spark ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:testCompile (test-compile) @ xgboost4j-spark ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5 | |
[WARNING] Multiple versions of scala libraries detected! | |
[INFO] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/src/test/scala:-1: info: compiling | |
[INFO] Compiling 8 source files to /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/target/test-classes at 1489991291083 | |
[WARNING] warning: there were four feature warnings; re-run with -feature for details | |
[WARNING] one warning found | |
[INFO] prepare-compile in 0 s | |
[INFO] compile in 8 s | |
[INFO] | |
[INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ xgboost4j-spark --- | |
[INFO] | |
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ xgboost4j-spark --- | |
Discovery starting. | |
Discovery completed in 207 milliseconds. | |
Run starting. Expected test count is: 32 | |
XGBoostDFSuite: | |
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties | |
17/03/20 07:28:20 INFO SparkContext: Running Spark version 2.1.0 | |
17/03/20 07:28:20 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable | |
17/03/20 07:28:26 INFO SecurityManager: Changing view acls to: geoheil | |
17/03/20 07:28:26 INFO SecurityManager: Changing modify acls to: geoheil | |
17/03/20 07:28:26 INFO SecurityManager: Changing view acls groups to: | |
17/03/20 07:28:26 INFO SecurityManager: Changing modify acls groups to: | |
17/03/20 07:28:26 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(geoheil); groups with view permissions: Set(); users with modify permissions: Set(geoheil); groups with modify permissions: Set() | |
17/03/20 07:28:26 INFO Utils: Successfully started service 'sparkDriver' on port 60316. | |
17/03/20 07:28:26 INFO SparkEnv: Registering MapOutputTracker | |
17/03/20 07:28:26 INFO SparkEnv: Registering BlockManagerMaster | |
17/03/20 07:28:26 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information | |
17/03/20 07:28:26 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up | |
17/03/20 07:28:26 INFO DiskBlockManager: Created local directory at /private/var/folders/_2/nr_kz09s1db8ysqykv2lxvsc0000gn/T/blockmgr-1094dc91-f3af-4d42-bc33-82f814510e01 | |
17/03/20 07:28:26 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB | |
17/03/20 07:28:26 INFO SparkEnv: Registering OutputCommitCoordinator | |
17/03/20 07:28:27 INFO Utils: Successfully started service 'SparkUI' on port 4040. | |
17/03/20 07:28:27 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.0.5:4040 | |
17/03/20 07:28:27 INFO Executor: Starting executor ID driver on host localhost | |
17/03/20 07:28:27 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 60319. | |
17/03/20 07:28:27 INFO NettyBlockTransferService: Server created on 192.168.0.5:60319 | |
17/03/20 07:28:27 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy | |
17/03/20 07:28:27 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.0.5, 60319, None) | |
17/03/20 07:28:27 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.0.5:60319 with 2004.6 MB RAM, BlockManagerId(driver, 192.168.0.5, 60319, None) | |
17/03/20 07:28:27 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.0.5, 60319, None) | |
17/03/20 07:28:27 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.0.5, 60319, None) | |
Tracker started, with env={} | |
17/03/20 07:28:51 ERROR RabitTracker: Uncaught exception thrown by worker: | |
java.lang.InterruptedException | |
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1302) | |
at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:202) | |
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:218) | |
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153) | |
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:619) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1918) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1931) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1944) | |
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1958) | |
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:925) | |
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:923) | |
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151) | |
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112) | |
at org.apache.spark.rdd.RDD.withScope(RDD.scala:362) | |
at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:923) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anon$2.run(XGBoost.scala:295) | |
- test consistency and order preservation of dataframe-based model *** FAILED *** | |
ml.dmlc.xgboost4j.java.XGBoostError: XGBoostModel training failed | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$.postTrackerReturnProcessing(XGBoost.scala:322) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$.trainWithRDD(XGBoost.scala:303) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoostEstimator.train(XGBoostEstimator.scala:119) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoostEstimator.train(XGBoostEstimator.scala:34) | |
at org.apache.spark.ml.Predictor.fit(Predictor.scala:96) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$.trainWithDataFrame(XGBoost.scala:187) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoostDFSuite$$anonfun$1.apply$mcV$sp(XGBoostDFSuite.scala:67) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoostDFSuite$$anonfun$1.apply(XGBoostDFSuite.scala:50) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoostDFSuite$$anonfun$1.apply(XGBoostDFSuite.scala:50) | |
at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85) | |
... | |
[07:28:52] dmlc-core/include/dmlc/logging.h:300: [07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
[07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
[07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
[07:28:52] dmlc-core/include/dmlc/logging.h:300: [07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
[07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
[07:28:52] dmlc-core/include/dmlc/logging.h:300: [07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
[07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
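Note: every native failure above is the same guard in src/objective/regression_obj.cc:41 — for a logistic objective, base_score is interpreted as a probability and must lie strictly inside (0,1); the RabitTracker InterruptedException earlier is only a secondary symptom of the aborted workers. A minimal sketch of that guard (the param names mirror xgboost's documented training parameters; the check_base_score helper itself is hypothetical, added here for illustration):

```python
def check_base_score(params):
    """Mirror of the native check in regression_obj.cc: for logistic
    objectives, base_score is a probability and must be strictly
    inside (0, 1)."""
    base_score = float(params.get("base_score", 0.5))  # xgboost's documented default
    objective = params.get("objective", "reg:linear")
    if objective in ("binary:logistic", "binary:logitraw", "reg:logistic"):
        if not (0.0 < base_score < 1.0):
            raise ValueError(
                "base_score must be in (0,1) for logistic loss, got %r" % base_score)
    return base_score

# The log suggests the effective base_score reaching the native layer was
# outside (0,1); pinning it explicitly in the training params avoids the check:
params = {"objective": "binary:logistic", "base_score": 0.5}
```

If this is the cause, setting "base_score" -> 0.5 explicitly in the params map passed to trainWithDataFrame should work around it.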
17/03/20 07:28:52 ERROR Executor: Exception in task 2.0 in stage 0.0 (TID 2) | |
ml.dmlc.xgboost4j.java.XGBoostError: [07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48) | |
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133) | |
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115) | |
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53) | |
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111) | |
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796) | |
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796) | |
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) | |
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323) | |
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336) | |
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334) | |
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957) | |
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948) | |
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888) | |
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948) | |
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694) | |
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334) | |
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285) | |
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87) | |
at org.apache.spark.scheduler.Task.run(Task.scala:99) | |
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282) | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) | |
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) | |
at java.lang.Thread.run(Thread.java:745) | |
17/03/20 07:28:52 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0) | |
ml.dmlc.xgboost4j.java.XGBoostError: [07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48) | |
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133) | |
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115) | |
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53) | |
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111) | |
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796) | |
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796) | |
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) | |
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323) | |
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336) | |
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334) | |
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957) | |
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948) | |
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888) | |
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948) | |
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694) | |
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334) | |
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285) | |
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87) | |
at org.apache.spark.scheduler.Task.run(Task.scala:99) | |
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282) | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) | |
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) | |
at java.lang.Thread.run(Thread.java:745) | |
17/03/20 07:28:52 ERROR Executor: Exception in task 7.0 in stage 0.0 (TID 7) | |
ml.dmlc.xgboost4j.java.XGBoostError: [07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48) | |
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133) | |
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115) | |
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53) | |
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111) | |
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796) | |
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796) | |
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) | |
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323) | |
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336) | |
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334) | |
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957) | |
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948) | |
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888) | |
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948) | |
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694) | |
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334) | |
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285) | |
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87) | |
at org.apache.spark.scheduler.Task.run(Task.scala:99) | |
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282) | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) | |
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) | |
at java.lang.Thread.run(Thread.java:745) | |
17/03/20 07:28:52 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1) | |
ml.dmlc.xgboost4j.java.XGBoostError: [07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48) | |
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133) | |
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115) | |
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53) | |
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111) | |
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796) | |
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796) | |
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) | |
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323) | |
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336) | |
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334) | |
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957) | |
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948) | |
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888) | |
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948) | |
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694) | |
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334) | |
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285) | |
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87) | |
at org.apache.spark.scheduler.Task.run(Task.scala:99) | |
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282) | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) | |
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) | |
at java.lang.Thread.run(Thread.java:745) | |
[07:28:52] dmlc-core/include/dmlc/logging.h:300: [07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
17/03/20 07:28:52 ERROR Executor: Exception in task 3.0 in stage 0.0 (TID 3) | |
ml.dmlc.xgboost4j.java.XGBoostError: [07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48) | |
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133) | |
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115) | |
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53) | |
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111) | |
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796) | |
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796) | |
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) | |
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323) | |
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336) | |
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334) | |
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957) | |
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948) | |
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888) | |
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948) | |
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694) | |
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334) | |
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285) | |
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87) | |
at org.apache.spark.scheduler.Task.run(Task.scala:99) | |
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282) | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) | |
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) | |
at java.lang.Thread.run(Thread.java:745) | |
17/03/20 07:28:52 ERROR Executor: Exception in task 4.0 in stage 0.0 (TID 4) | |
ml.dmlc.xgboost4j.java.XGBoostError: [07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48) | |
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133) | |
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115) | |
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53) | |
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111) | |
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796) | |
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796) | |
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) | |
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323) | |
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336) | |
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334) | |
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957) | |
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948) | |
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888) | |
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948) | |
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694) | |
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334) | |
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285) | |
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87) | |
at org.apache.spark.scheduler.Task.run(Task.scala:99) | |
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282) | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) | |
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) | |
at java.lang.Thread.run(Thread.java:745) | |
17/03/20 07:28:52 ERROR Executor: Exception in task 5.0 in stage 0.0 (TID 5) | |
ml.dmlc.xgboost4j.java.XGBoostError: [07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16 | |
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48) | |
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133) | |
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115) | |
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53) | |
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132) | |
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111) | |
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796) | |
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796) | |
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38) | |
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323) | |
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336) | |
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334) | |
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957) | |
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948) | |
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888) | |
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948) | |
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694) | |
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334) | |
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285) | |
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87) | |
at org.apache.spark.scheduler.Task.run(Task.scala:99) | |
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282) | |
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) | |
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) | |
at java.lang.Thread.run(Thread.java:745) | |
17/03/20 07:28:52 ERROR Executor: Exception in task 6.0 in stage 0.0 (TID 6) | |
ml.dmlc.xgboost4j.java.XGBoostError: [07:28:52] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss | |
Stack trace returned 2 entries: | |
[bt] (0) 0 libxgboost4j9089129105300950492.dylib 0x0000000125afba99 _ZN4dmlc15LogMessageFatalD1Ev + 41 | |
[bt] (1) 1 libstdc++.6.dylib 0x0000000125eb7f60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48)
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133)
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17/03/20 07:28:52 ERROR TaskSetManager: Task 1 in stage 0.0 failed 1 times; aborting job
Tracker started, with env={}
- test transformLeaf *** FAILED ***
ml.dmlc.xgboost4j.java.XGBoostError: XGBoostModel training failed
at ml.dmlc.xgboost4j.scala.spark.XGBoost$.postTrackerReturnProcessing(XGBoost.scala:322)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$.trainWithRDD(XGBoost.scala:303)
at ml.dmlc.xgboost4j.scala.spark.XGBoostEstimator.train(XGBoostEstimator.scala:119)
at ml.dmlc.xgboost4j.scala.spark.XGBoostEstimator.train(XGBoostEstimator.scala:34)
at org.apache.spark.ml.Predictor.fit(Predictor.scala:96)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$.trainWithDataFrame(XGBoost.scala:187)
at ml.dmlc.xgboost4j.scala.spark.XGBoostDFSuite$$anonfun$2.apply$mcV$sp(XGBoostDFSuite.scala:90)
at ml.dmlc.xgboost4j.scala.spark.XGBoostDFSuite$$anonfun$2.apply(XGBoostDFSuite.scala:85)
at ml.dmlc.xgboost4j.scala.spark.XGBoostDFSuite$$anonfun$2.apply(XGBoostDFSuite.scala:85)
at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
...
17/03/20 07:29:06 ERROR RabitTracker: Uncaught exception thrown by worker:
java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1302)
at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:202)
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:218)
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:619)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1918)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1931)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1944)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1958)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:925)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:923)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:923)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anon$2.run(XGBoost.scala:295)
rabit::Init is already called in this thread
rabit::Init is already called in this thread
rabit::Init is already called in this thread
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] xgboost-jvm ........................................ SUCCESS [  2.591 s]
[INFO] xgboost4j .......................................... SUCCESS [ 22.047 s]
[INFO] xgboost4j-spark .................................... FAILURE [01:06 min]
[INFO] xgboost4j-flink .................................... SKIPPED
[INFO] xgboost4j-example .................................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:30 min
[INFO] Finished at: 2017-03-20T07:29:07+01:00
[INFO] Final Memory: 48M/1007M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project xgboost4j-spark: There are test failures -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :xgboost4j-spark
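Per Maven's hint above, the build can be resumed from the failing module rather than rebuilt from scratch. A minimal sketch of composing such a resume command, assuming the goal is `install` and that tests are skipped via `-DskipTests` while the native-library crash is still being debugged (the helper function name is hypothetical, for illustration only):

```shell
# Hypothetical helper: build the Maven resume command suggested in the log.
# "-rf :<module>" resumes the reactor from the named module; "-DskipTests"
# is an assumption here, useful only while the failing tests are under
# investigation (it does not fix the underlying crash).
resume_cmd() {
  local module="$1"
  echo "mvn install -rf :${module} -DskipTests"
}

# Prints the command to resume from the module that failed above.
resume_cmd xgboost4j-spark
```

Note that skipping tests only unblocks the packaging step; the `rabit::Init is already called in this thread` and native `libstdc++` crash still need to be resolved before the Spark module works correctly.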
How can I resolve this problem?