@geoHeil
Created April 3, 2017 16:52
Gist: geoHeil/913dc5cf5f48af1614c3e1550a294815
(xgb) $ mvn clean install
[INFO] Scanning for projects...
[WARNING]
[WARNING] Some problems were encountered while building the effective model for ml.dmlc:xgboost4j:jar:0.7
[WARNING] 'build.plugins.plugin.version' for org.codehaus.mojo:exec-maven-plugin is missing. @ line 40, column 29
[WARNING]
[WARNING] It is highly recommended to fix these problems because they threaten the stability of your build.
[WARNING]
[WARNING] For this reason, future Maven versions might no longer support building such malformed projects.
[WARNING]
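The warning above refers to line 40 of the jvm-packages pom, where the exec-maven-plugin is declared without a `<version>`. A minimal sketch of the fix (the surrounding plugin configuration in the actual pom is elided here) is to pin the version explicitly; 1.6.0 is the version Maven resolves later in this same log, so pinning to it should not change behavior:

```xml
<plugin>
  <groupId>org.codehaus.mojo</groupId>
  <artifactId>exec-maven-plugin</artifactId>
  <!-- Pinning the version silences the "missing plugin version" warning
       and makes the build reproducible across Maven releases. -->
  <version>1.6.0</version>
</plugin>
```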
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] xgboost-jvm
[INFO] xgboost4j
[INFO] xgboost4j-spark
[INFO] xgboost4j-flink
[INFO] xgboost4j-example
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building xgboost-jvm 0.7
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ xgboost-jvm ---
[INFO] Deleting /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/target
[INFO]
[INFO] --- scalastyle-maven-plugin:0.8.0:check (checkstyle) @ xgboost-jvm ---
[WARNING] sourceDirectory is not specified or does not exist value=/Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/src/main/scala
[WARNING] testSourceDirectory is not specified or does not exist value=/Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/src/test/scala
Processed 0 file(s)
Found 0 errors
Found 0 warnings
Found 0 infos
Finished in 85 ms
[INFO]
[INFO] --- maven-checkstyle-plugin:2.17:check (checkstyle) @ xgboost-jvm ---
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (default) @ xgboost-jvm ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] ml.dmlc:xgboost-jvm:0.7 requires scala version: 2.11.8
[WARNING] org.scala-lang:scala-compiler:2.11.8 requires scala version: 2.11.8
[WARNING] org.scala-lang.modules:scala-xml_2.11:1.0.4 requires scala version: 2.11.4
[WARNING] Multiple versions of scala libraries detected!
[INFO] No sources to compile
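This "Multiple versions of scala libraries detected!" block recurs throughout the log. It is benign here (all versions are in the binary-compatible 2.11.x line), but it can be silenced by forcing a single scala-xml build via `dependencyManagement`. A sketch, assuming a scala-xml release compiled against a matching 2.11.x compiler (the exact version below is illustrative, not taken from this build):

```xml
<dependencyManagement>
  <dependencies>
    <!-- Override the transitive scala-xml_2.11:1.0.4 (built on 2.11.4)
         with a release aligned to the project's Scala 2.11.8 toolchain. -->
    <dependency>
      <groupId>org.scala-lang.modules</groupId>
      <artifactId>scala-xml_2.11</artifactId>
      <version>1.0.6</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```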
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ xgboost-jvm ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] ml.dmlc:xgboost-jvm:0.7 requires scala version: 2.11.8
[WARNING] org.scala-lang:scala-compiler:2.11.8 requires scala version: 2.11.8
[WARNING] org.scala-lang.modules:scala-xml_2.11:1.0.4 requires scala version: 2.11.4
[WARNING] Multiple versions of scala libraries detected!
[INFO] No sources to compile
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:add-source (scala-compile-first) @ xgboost-jvm ---
[INFO] Add Source directory: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/src/main/scala
[INFO] Add Test Source directory: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/src/test/scala
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (compile) @ xgboost-jvm ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] ml.dmlc:xgboost-jvm:0.7 requires scala version: 2.11.8
[WARNING] org.scala-lang:scala-compiler:2.11.8 requires scala version: 2.11.8
[WARNING] org.scala-lang.modules:scala-xml_2.11:1.0.4 requires scala version: 2.11.4
[WARNING] Multiple versions of scala libraries detected!
[INFO] No sources to compile
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:testCompile (test-compile) @ xgboost-jvm ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] ml.dmlc:xgboost-jvm:0.7 requires scala version: 2.11.8
[WARNING] org.scala-lang:scala-compiler:2.11.8 requires scala version: 2.11.8
[WARNING] org.scala-lang.modules:scala-xml_2.11:1.0.4 requires scala version: 2.11.4
[WARNING] Multiple versions of scala libraries detected!
[INFO] No sources to compile
[INFO]
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ xgboost-jvm ---
Discovery starting.
Discovery completed in 36 milliseconds.
Run starting. Expected test count is: 0
DiscoverySuite:
Run completed in 79 milliseconds.
Total number of tests run: 0
Suites: completed 1, aborted 0
Tests: succeeded 0, failed 0, canceled 0, ignored 0, pending 0
No tests were executed.
[INFO]
[INFO] --- maven-assembly-plugin:2.6:single (make-assembly) @ xgboost-jvm ---
[INFO] Assemblies have been skipped per configuration of the skipAssembly parameter.
[INFO]
[INFO] --- maven-source-plugin:3.0.1:jar-no-fork (attach-sources) @ xgboost-jvm ---
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ xgboost-jvm ---
[INFO] Installing /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/pom.xml to /Users/geoheil/.m2/repository/ml/dmlc/xgboost-jvm/0.7/xgboost-jvm-0.7.pom
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building xgboost4j 0.7
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ xgboost4j ---
[INFO] Deleting /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target
[INFO]
[INFO] --- scalastyle-maven-plugin:0.8.0:check (checkstyle) @ xgboost4j ---
Processed 13 file(s)
Found 0 errors
Found 0 warnings
Found 0 infos
Finished in 879 ms
[INFO]
[INFO] --- maven-checkstyle-plugin:2.17:check (checkstyle) @ xgboost4j ---
[INFO]
[INFO] --- exec-maven-plugin:1.6.0:exec (native) @ xgboost4j ---
build java wrapper
g++-6 -std=c++11 -Wall -Wno-unknown-pragmas -Iinclude -Idmlc-core/include -Irabit/include -O3 -funroll-loops -msse2 -fPIC -fopenmp -I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include -I./java -I/Library/Java/JavaVirtualMachines/jdk1.8.0_121.jdk/Contents/Home/include/darwin -shared -o jvm-packages/lib/libxgboost4j.so jvm-packages/xgboost4j/src/native/xgboost4j.cpp build/learner.o build/logging.o build/c_api/c_api.o build/c_api/c_api_error.o build/common/common.o build/common/hist_util.o build/data/data.o build/data/simple_csr_source.o build/data/simple_dmatrix.o build/data/sparse_page_dmatrix.o build/data/sparse_page_raw_format.o build/data/sparse_page_source.o build/data/sparse_page_writer.o build/gbm/gblinear.o build/gbm/gbm.o build/gbm/gbtree.o build/metric/elementwise_metric.o build/metric/metric.o build/metric/multiclass_metric.o build/metric/rank_metric.o build/objective/multiclass_obj.o build/objective/objective.o build/objective/rank_obj.o build/objective/regression_obj.o build/tree/tree_model.o build/tree/tree_updater.o build/tree/updater_colmaker.o build/tree/updater_fast_hist.o build/tree/updater_histmaker.o build/tree/updater_prune.o build/tree/updater_refresh.o build/tree/updater_skmaker.o build/tree/updater_sync.o dmlc-core/libdmlc.a rabit/lib/librabit.a -pthread -lm -fopenmp
move native lib
/Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages
complete
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ xgboost4j ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (default) @ xgboost4j ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/main/java:-1: info: compiling
[INFO] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/main/scala:-1: info: compiling
[INFO] Compiling 24 source files to /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/classes at 1491238059808
[WARNING] warning: there was one deprecation warning; re-run with -deprecation for details
[WARNING] warning: there were four feature warnings; re-run with -feature for details
[WARNING] two warnings found
[INFO] prepare-compile in 0 s
[INFO] compile in 6 s
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ xgboost4j ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:add-source (scala-compile-first) @ xgboost4j ---
[INFO] Add Source directory: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/main/scala
[INFO] Add Test Source directory: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/test/scala
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ xgboost4j ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 14 source files to /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/classes
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (compile) @ xgboost4j ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ xgboost4j ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/test/resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ xgboost4j ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 2 source files to /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/test-classes
[WARNING] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/test/java/ml/dmlc/xgboost4j/java/DMatrixTest.java: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/test/java/ml/dmlc/xgboost4j/java/DMatrixTest.java uses or overrides a deprecated API.
[WARNING] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/test/java/ml/dmlc/xgboost4j/java/DMatrixTest.java: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:testCompile (test-compile) @ xgboost4j ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/test/java:-1: info: compiling
[INFO] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/src/test/scala:-1: info: compiling
[INFO] Compiling 5 source files to /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/test-classes at 1491238067484
[WARNING] warning: there were two deprecation warnings; re-run with -deprecation for details
[WARNING] one warning found
[INFO] prepare-compile in 0 s
[INFO] compile in 5 s
[INFO]
[INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ xgboost4j ---
-------------------------------------------------------
T E S T S
-------------------------------------------------------
Running ml.dmlc.xgboost4j.java.BoosterImplTest
[18:47:53] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[0] training-auc:0.958228
[1] training-auc:0.987161
[2] training-auc:0.993971
[3] training-auc:0.997779
[4] training-auc:0.998543
[18:47:53] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[5] training-auc:0.998960
[6] training-auc:0.998960
[7] training-auc:0.998960
[8] training-auc:0.999250
[9] training-auc:0.999365
Apr 03, 2017 6:47:54 PM ml.dmlc.xgboost4j.java.XGBoost crossValidation
INFORMATION: [0] cv-test-error:0,014439 cv-train-error:0,014431
Apr 03, 2017 6:47:54 PM ml.dmlc.xgboost4j.java.XGBoost crossValidation
INFORMATION: [1] cv-test-error:0,003072 cv-train-error:0,001459
[18:47:54] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[0] training-auc:0.994714
[1] training-auc:0.998459
[2] training-auc:0.998459
[3] training-auc:0.998459
[4] training-auc:0.999977
[5] training-auc:0.999991
[6] training-auc:0.999991
[7] training-auc:0.999991
[8] training-auc:1.000000
[9] training-auc:1.000000
[0] test-error:0.042831 train-error:0.046522
[1] test-error:0.021726 train-error:0.022263
[2] test-error:0.006207 train-error:0.007063
[3] test-error:0.018001 train-error:0.015200
[4] test-error:0.006207 train-error:0.007063
[0] test-error:0.042831 train-error:0.046522
[1] test-error:0.021726 train-error:0.022263
[2] test-error:0.006207 train-error:0.007063
[3] test-error:0.018001 train-error:0.015200
[4] test-error:0.006207 train-error:0.007063
[0] training-auc:0.958228
[18:47:54] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[1] training-auc:0.987161
[2] training-auc:0.993971
[3] training-auc:0.997779
[4] training-auc:0.998543
[5] training-auc:0.998960
[6] training-auc:0.998960
[7] training-auc:0.998960
[8] training-auc:0.999250
[9] training-auc:0.999365
[0] test-auc:0.986255 training-auc:0.987112
[18:47:54] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[1] test-auc:0.998653 training-auc:0.997538
[2] test-auc:0.998958 training-auc:0.997675
[3] test-auc:0.999957 training-auc:0.998421
[4] test-auc:1.000000 training-auc:0.999880
[5] test-auc:1.000000 training-auc:0.999880
[6] test-auc:1.000000 training-auc:0.999935
[7] test-auc:1.000000 training-auc:1.000000
[8] test-auc:1.000000 training-auc:1.000000
[9] test-auc:1.000000 training-auc:1.000000
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.437 sec - in ml.dmlc.xgboost4j.java.BoosterImplTest
Running ml.dmlc.xgboost4j.java.DMatrixTest
5
[0] test-auc:0.994937 training-auc:0.994714
[1] test-auc:0.999972 training-auc:0.998459
[2] test-auc:0.999972 training-auc:0.998459
[18:47:54] src/c_api/c_api.cc:264: num_row=3
[3] test-auc:0.999972 training-auc:0.998459
[4] test-auc:1.000000 training-auc:0.999977
[5] test-auc:1.000000 training-auc:0.999991
[6] test-auc:1.000000 training-auc:0.999991
[7] test-auc:1.000000 training-auc:0.999991
[8] test-auc:1.000000 training-auc:1.000000
[9] test-auc:1.000000 training-auc:1.000000
[0] test-error:0.042831 train-error:0.046522
[1] test-error:0.021726 train-error:0.022263
[2] test-error:0.006207 train-error:0.007063
[3] test-error:0.018001 train-error:0.015200
[4] test-error:0.006207 train-error:0.007063
5
[18:47:54] src/c_api/c_api.cc:264: num_row=3
Tests run: 8, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.008 sec - in ml.dmlc.xgboost4j.java.DMatrixTest
Running ml.dmlc.xgboost4j.scala.rabit.RabitTrackerConnectionHandlerTest
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.857 sec - in ml.dmlc.xgboost4j.scala.rabit.RabitTrackerConnectionHandlerTest
Results :
Tests run: 21, Failures: 0, Errors: 0, Skipped: 0
[INFO]
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ xgboost4j ---
Discovery starting.
Discovery completed in 464 milliseconds.
Run starting. Expected test count is: 20
RabitTrackerConnectionHandlerTest:
RabitTrackerConnectionHandler
- should handle Rabit client 'start' command properly
- should forward print command to tracker
- should handle fragmented print command without throwing exception
- should handle spill-over Tcp data correctly between state transition
ScalaBoosterImplSuite:
- basic operation of booster
- save/load model with path
- save/load model with stream
[0] train-error:0.046522 test-error:0.042831
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[1] train-error:0.022263 test-error:0.021726
[0] train-error:0.046522 test-error:0.042831
[1] train-error:0.022263 test-error:0.021726
[0] train-error:0.046522 test-error:0.042831
[1] train-error:0.022263 test-error:0.021726
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
Apr 03, 2017 6:47:56 PM ml.dmlc.xgboost4j.java.XGBoost crossValidation
INFORMATION: [0] cv-test-error:0,014439 cv-train-error:0,014431
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 10 extra nodes, 0 pruned nodes, max_depth=3
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 10 extra nodes, 0 pruned nodes, max_depth=3
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 10 extra nodes, 0 pruned nodes, max_depth=3
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 10 extra nodes, 0 pruned nodes, max_depth=3
Apr 03, 2017 6:47:56 PM ml.dmlc.xgboost4j.java.XGBoost crossValidation
INFORMATION: [1] cv-test-error:0,001075 cv-train-error:0,000959
- cross validation
[18:47:56] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 10 extra nodes, 0 pruned nodes, max_depth=3
[0] training-auc:0.987112 test-auc:0.986255
[1] training-auc:0.997538 test-auc:0.998653
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[2] training-auc:0.997675 test-auc:0.998958
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[3] training-auc:0.998421 test-auc:0.999957
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[4] training-auc:0.999880 test-auc:1.000000
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[5] training-auc:0.999880 test-auc:1.000000
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[6] training-auc:0.999935 test-auc:1.000000
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[7] training-auc:1.000000 test-auc:1.000000
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
[8] training-auc:1.000000 test-auc:1.000000
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 12 extra nodes, 0 pruned nodes, max_depth=3
- test with fast histo depthwise
[9] training-auc:1.000000 test-auc:1.000000
[18:47:56] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[0] training-auc:0.994714 test-auc:0.994937
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[1] training-auc:0.998459 test-auc:0.999972
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[2] training-auc:0.998459 test-auc:0.999972
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[3] training-auc:0.998459 test-auc:0.999972
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[4] training-auc:0.999977 test-auc:1.000000
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[5] training-auc:0.999991 test-auc:1.000000
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[6] training-auc:0.999991 test-auc:1.000000
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[7] training-auc:0.999991 test-auc:1.000000
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[8] training-auc:1.000000 test-auc:1.000000
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
- test with fast histo lossguide
[18:47:56] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[9] training-auc:1.000000 test-auc:1.000000
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[0] training-auc:0.994714
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[1] training-auc:0.998459
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[2] training-auc:0.998459
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[3] training-auc:0.998459
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[4] training-auc:0.999977
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[5] training-auc:0.999991
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
[6] training-auc:0.999991
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[7] training-auc:0.999991
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=4
[8] training-auc:1.000000
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 14 extra nodes, 0 pruned nodes, max_depth=5
- test with fast histo lossguide with max bin
[18:47:56] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[9] training-auc:1.000000
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[0] training-auc:0.958228
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[1] training-auc:0.987161
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[2] training-auc:0.993971
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[3] training-auc:0.997779
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 4 extra nodes, 0 pruned nodes, max_depth=2
[4] training-auc:0.998543
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 4 extra nodes, 0 pruned nodes, max_depth=2
[5] training-auc:0.998960
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[6] training-auc:0.998960
[18:47:56] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 4 extra nodes, 0 pruned nodes, max_depth=2
[7] training-auc:0.998960
[18:47:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[8] training-auc:0.999250
[18:47:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
- test with fast histo depthwidth with max depth
[18:47:57] Tree method is selected to be 'hist', which uses histogram aggregation for faster training. Using default sequence of updaters: grow_fast_histmaker,prune
[9] training-auc:0.999365
[18:47:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[0] training-auc:0.958228
[18:47:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[1] training-auc:0.987161
[18:47:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[2] training-auc:0.993971
[18:47:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[3] training-auc:0.997779
[18:47:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 4 extra nodes, 0 pruned nodes, max_depth=2
[4] training-auc:0.998543
[18:47:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 4 extra nodes, 0 pruned nodes, max_depth=2
[5] training-auc:0.998960
[18:47:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[6] training-auc:0.998960
[18:47:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 4 extra nodes, 0 pruned nodes, max_depth=2
[7] training-auc:0.998960
[18:47:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
[8] training-auc:0.999250
[18:47:57] src/tree/updater_prune.cc:74: tree pruning end, 1 roots, 6 extra nodes, 0 pruned nodes, max_depth=2
- test with fast histo depthwidth with max depth and max bin
DMatrixSuite:
- create DMatrix from File
[9] training-auc:0.999365
[18:47:57] src/c_api/c_api.cc:264: num_row=3
- create DMatrix from CSR
[18:47:57] src/c_api/c_api.cc:264: num_row=3
- create DMatrix from CSREx
- create DMatrix from CSC
- create DMatrix from CSCEx
- create DMatrix from DenseMatrix
- create DMatrix from DenseMatrix with missing value
Run completed in 1 second, 209 milliseconds.
Total number of tests run: 20
Suites: completed 4, aborted 0
Tests: succeeded 20, failed 0, canceled 0, ignored 0, pending 0
All tests passed.
[INFO]
[INFO] --- maven-jar-plugin:2.4:jar (default-jar) @ xgboost4j ---
[INFO] Building jar: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/xgboost4j-0.7.jar
[INFO]
[INFO] --- maven-assembly-plugin:2.6:single (make-assembly) @ xgboost4j ---
[INFO] Building jar: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/xgboost4j-0.7-jar-with-dependencies.jar
[INFO]
[INFO] --- maven-source-plugin:3.0.1:jar-no-fork (attach-sources) @ xgboost4j ---
[INFO] Building jar: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/xgboost4j-0.7-sources.jar
[INFO]
[INFO] --- maven-install-plugin:2.4:install (default-install) @ xgboost4j ---
[INFO] Installing /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/xgboost4j-0.7.jar to /Users/geoheil/.m2/repository/ml/dmlc/xgboost4j/0.7/xgboost4j-0.7.jar
[INFO] Installing /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/pom.xml to /Users/geoheil/.m2/repository/ml/dmlc/xgboost4j/0.7/xgboost4j-0.7.pom
[INFO] Installing /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/xgboost4j-0.7-jar-with-dependencies.jar to /Users/geoheil/.m2/repository/ml/dmlc/xgboost4j/0.7/xgboost4j-0.7-jar-with-dependencies.jar
[INFO] Installing /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j/target/xgboost4j-0.7-sources.jar to /Users/geoheil/.m2/repository/ml/dmlc/xgboost4j/0.7/xgboost4j-0.7-sources.jar
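With the artifacts installed to the local repository as shown above, a downstream project on the same machine can depend on this locally built xgboost4j (coordinates taken directly from the install lines in this log):

```xml
<dependency>
  <groupId>ml.dmlc</groupId>
  <artifactId>xgboost4j</artifactId>
  <version>0.7</version>
</dependency>
```

Note that the jar bundles the native `libxgboost4j.so` built earlier in this log, so it is only portable to machines with a compatible platform (here, macOS).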
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building xgboost4j-spark 0.7
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ xgboost4j-spark ---
[INFO] Deleting /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/target
[INFO]
[INFO] --- scalastyle-maven-plugin:0.8.0:check (checkstyle) @ xgboost4j-spark ---
Processed 20 file(s)
Found 0 errors
Found 0 warnings
Found 0 infos
Finished in 634 ms
[INFO]
[INFO] --- maven-checkstyle-plugin:2.17:check (checkstyle) @ xgboost4j-spark ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ xgboost4j-spark ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/src/main/resources
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (default) @ xgboost4j-spark ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/src/main/scala:-1: info: compiling
[INFO] Compiling 12 source files to /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/target/classes at 1491238084064
[WARNING] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/src/main/scala/ml/dmlc/xgboost4j/scala/spark/XGBoostEstimator.scala:74: warning: match may not be exhaustive.
[WARNING] It would fail on the following input: Some((x: org.apache.spark.ml.param.Param[?] forSome x not in (org.apache.spark.ml.param.BooleanParam, org.apache.spark.ml.param.DoubleParam, org.apache.spark.ml.param.FloatParam, org.apache.spark.ml.param.IntParam, org.apache.spark.ml.param.Param[?])))
[WARNING] params.find(_.name == paramName) match {
[WARNING] ^
[WARNING] warning: there were four feature warnings; re-run with -feature for details
[WARNING] two warnings found
[INFO] prepare-compile in 0 s
[INFO] compile in 9 s
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (scala-compile-first) @ xgboost4j-spark ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:add-source (scala-compile-first) @ xgboost4j-spark ---
[INFO] Add Source directory: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/src/main/scala
[INFO] Add Test Source directory: /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/src/test/scala
[INFO]
[INFO] --- maven-compiler-plugin:3.1:compile (default-compile) @ xgboost4j-spark ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:compile (compile) @ xgboost4j-spark ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ xgboost4j-spark ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 11 resources
[INFO]
[INFO] --- maven-compiler-plugin:3.1:testCompile (default-testCompile) @ xgboost4j-spark ---
[INFO] Nothing to compile - all classes are up to date
[INFO]
[INFO] --- scala-maven-plugin:3.2.2:testCompile (test-compile) @ xgboost4j-spark ---
[WARNING] Expected all dependencies to require Scala version: 2.11.8
[WARNING] com.typesafe.akka:akka-actor_2.11:2.3.11 requires scala version: 2.11.5
[WARNING] Multiple versions of scala libraries detected!
[INFO] /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/src/test/scala:-1: info: compiling
[INFO] Compiling 8 source files to /Users/geoheil/Dropbox/masterThesis/xgboost/jvm-packages/xgboost4j-spark/target/test-classes at 1491238093213
[WARNING] warning: there were four feature warnings; re-run with -feature for details
[WARNING] one warning found
[INFO] prepare-compile in 0 s
[INFO] compile in 8 s
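Note on the repeated warning above: the scala-maven-plugin expects every dependency to be built against Scala 2.11.8, but akka-actor_2.11:2.3.11 declares 2.11.5. Versions within the 2.11.x line are binary compatible, so this is usually benign; one common way to silence it is to pin a single scala-library version via dependencyManagement. This is a sketch only (the coordinates mirror the warning, but the actual pom layout is assumed):

```xml
<!-- Sketch: pin one scala-library version so the scala-maven-plugin
     version check sees a consistent 2.11.8 across all dependencies.
     Adapt to the real jvm-packages pom. -->
<dependencyManagement>
  <dependencies>
    <dependency>
      <groupId>org.scala-lang</groupId>
      <artifactId>scala-library</artifactId>
      <version>2.11.8</version>
    </dependency>
  </dependencies>
</dependencyManagement>
```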
[INFO]
[INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ xgboost4j-spark ---
[INFO]
[INFO] --- scalatest-maven-plugin:1.0:test (test) @ xgboost4j-spark ---
Discovery starting.
Discovery completed in 199 milliseconds.
Run starting. Expected test count is: 33
XGBoostDFSuite:
Using Spark's default log4j profile: org/apache/spark/log4j-defaults.properties
17/04/03 18:48:23 INFO SparkContext: Running Spark version 2.1.0
17/04/03 18:48:23 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
17/04/03 18:48:28 INFO SecurityManager: Changing view acls to: geoheil
17/04/03 18:48:28 INFO SecurityManager: Changing modify acls to: geoheil
17/04/03 18:48:28 INFO SecurityManager: Changing view acls groups to:
17/04/03 18:48:28 INFO SecurityManager: Changing modify acls groups to:
17/04/03 18:48:28 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(geoheil); groups with view permissions: Set(); users with modify permissions: Set(geoheil); groups with modify permissions: Set()
17/04/03 18:48:28 INFO Utils: Successfully started service 'sparkDriver' on port 59415.
17/04/03 18:48:28 INFO SparkEnv: Registering MapOutputTracker
17/04/03 18:48:28 INFO SparkEnv: Registering BlockManagerMaster
17/04/03 18:48:28 INFO BlockManagerMasterEndpoint: Using org.apache.spark.storage.DefaultTopologyMapper for getting topology information
17/04/03 18:48:28 INFO BlockManagerMasterEndpoint: BlockManagerMasterEndpoint up
17/04/03 18:48:28 INFO DiskBlockManager: Created local directory at /private/var/folders/_2/nr_kz09s1db8ysqykv2lxvsc0000gn/T/blockmgr-bf495f6d-56a6-4043-9f85-40728bd6c3ec
17/04/03 18:48:28 INFO MemoryStore: MemoryStore started with capacity 2004.6 MB
17/04/03 18:48:28 INFO SparkEnv: Registering OutputCommitCoordinator
17/04/03 18:48:29 INFO Utils: Successfully started service 'SparkUI' on port 4040.
17/04/03 18:48:29 INFO SparkUI: Bound SparkUI to 0.0.0.0, and started at http://192.168.44.134:4040
17/04/03 18:48:29 INFO Executor: Starting executor ID driver on host localhost
17/04/03 18:48:29 INFO Utils: Successfully started service 'org.apache.spark.network.netty.NettyBlockTransferService' on port 59416.
17/04/03 18:48:29 INFO NettyBlockTransferService: Server created on 192.168.44.134:59416
17/04/03 18:48:29 INFO BlockManager: Using org.apache.spark.storage.RandomBlockReplicationPolicy for block replication policy
17/04/03 18:48:29 INFO BlockManagerMaster: Registering BlockManager BlockManagerId(driver, 192.168.44.134, 59416, None)
17/04/03 18:48:29 INFO BlockManagerMasterEndpoint: Registering block manager 192.168.44.134:59416 with 2004.6 MB RAM, BlockManagerId(driver, 192.168.44.134, 59416, None)
17/04/03 18:48:29 INFO BlockManagerMaster: Registered BlockManager BlockManagerId(driver, 192.168.44.134, 59416, None)
17/04/03 18:48:29 INFO BlockManager: Initialized BlockManager: BlockManagerId(driver, 192.168.44.134, 59416, None)
Tracker started, with env={}
- test consistency and order preservation of dataframe-based model *** FAILED ***
ml.dmlc.xgboost4j.java.XGBoostError: XGBoostModel training failed
at ml.dmlc.xgboost4j.scala.spark.XGBoost$.postTrackerReturnProcessing(XGBoost.scala:322)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$.trainWithRDD(XGBoost.scala:303)
at ml.dmlc.xgboost4j.scala.spark.XGBoostEstimator.train(XGBoostEstimator.scala:119)
at ml.dmlc.xgboost4j.scala.spark.XGBoostEstimator.train(XGBoostEstimator.scala:34)
at org.apache.spark.ml.Predictor.fit(Predictor.scala:96)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$.trainWithDataFrame(XGBoost.scala:187)
at ml.dmlc.xgboost4j.scala.spark.XGBoostDFSuite$$anonfun$1.apply$mcV$sp(XGBoostDFSuite.scala:67)
at ml.dmlc.xgboost4j.scala.spark.XGBoostDFSuite$$anonfun$1.apply(XGBoostDFSuite.scala:50)
at ml.dmlc.xgboost4j.scala.spark.XGBoostDFSuite$$anonfun$1.apply(XGBoostDFSuite.scala:50)
at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
...
17/04/03 18:48:48 ERROR RabitTracker: Uncaught exception thrown by worker:
java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1302)
at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:202)
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:218)
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:619)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1918)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1931)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1944)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1958)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:925)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:923)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:923)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anon$2.run(XGBoost.scala:295)
[18:48:48] dmlc-core/include/dmlc/logging.h:300: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
[18:48:48] dmlc-core/include/dmlc/logging.h:300: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
[18:48:48] dmlc-core/include/dmlc/logging.h:300: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
[18:48:48] dmlc-core/include/dmlc/logging.h:300: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
[18:48:48] dmlc-core/include/dmlc/logging.h:300: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
[18:48:48] dmlc-core/include/dmlc/logging.h:300: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
[18:48:48] dmlc-core/include/dmlc/logging.h:300: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
[18:48:48] dmlc-core/include/dmlc/logging.h:300: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
17/04/03 18:48:48 ERROR Executor: Exception in task 2.0 in stage 0.0 (TID 2)
ml.dmlc.xgboost4j.java.XGBoostError: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48)
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133)
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17/04/03 18:48:48 ERROR Executor: Exception in task 3.0 in stage 0.0 (TID 3)
ml.dmlc.xgboost4j.java.XGBoostError: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48)
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133)
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17/04/03 18:48:48 ERROR Executor: Exception in task 5.0 in stage 0.0 (TID 5)
ml.dmlc.xgboost4j.java.XGBoostError: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48)
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133)
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17/04/03 18:48:48 ERROR Executor: Exception in task 1.0 in stage 0.0 (TID 1)
ml.dmlc.xgboost4j.java.XGBoostError: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48)
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133)
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17/04/03 18:48:48 ERROR Executor: Exception in task 7.0 in stage 0.0 (TID 7)
ml.dmlc.xgboost4j.java.XGBoostError: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48)
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133)
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17/04/03 18:48:48 ERROR Executor: Exception in task 6.0 in stage 0.0 (TID 6)
ml.dmlc.xgboost4j.java.XGBoostError: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48)
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133)
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17/04/03 18:48:48 ERROR Executor: Exception in task 0.0 in stage 0.0 (TID 0)
ml.dmlc.xgboost4j.java.XGBoostError: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48)
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133)
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17/04/03 18:48:48 ERROR Executor: Exception in task 4.0 in stage 0.0 (TID 4)
ml.dmlc.xgboost4j.java.XGBoostError: [18:48:48] src/objective/regression_obj.cc:41: Check failed: base_score > 0.0f && base_score < 1.0f base_score must be in (0,1) for logistic loss
Stack trace returned 2 entries:
[bt] (0) 0 libxgboost4j5320766796355203025.dylib 0x0000000122a31a99 _ZN4dmlc15LogMessageFatalD1Ev + 41
[bt] (1) 1 libstdc++.6.dylib 0x0000000122decf60 _ZTVNSt7__cxx1115basic_stringbufIcSt11char_traitsIcESaIcEEE + 16
at ml.dmlc.xgboost4j.java.JNIErrorHandle.checkCall(JNIErrorHandle.java:48)
at ml.dmlc.xgboost4j.java.Booster.update(Booster.java:133)
at ml.dmlc.xgboost4j.java.XGBoost.train(XGBoost.java:115)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:53)
at ml.dmlc.xgboost4j.scala.XGBoost$.train(XGBoost.scala:83)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:132)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anonfun$buildDistributedBoosters$1.apply(XGBoost.scala:111)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.RDD$$anonfun$mapPartitions$1$$anonfun$apply$23.apply(RDD.scala:796)
at org.apache.spark.rdd.MapPartitionsRDD.compute(MapPartitionsRDD.scala:38)
at org.apache.spark.rdd.RDD.computeOrReadCheckpoint(RDD.scala:323)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:336)
at org.apache.spark.rdd.RDD$$anonfun$8.apply(RDD.scala:334)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:957)
at org.apache.spark.storage.BlockManager$$anonfun$doPutIterator$1.apply(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.doPut(BlockManager.scala:888)
at org.apache.spark.storage.BlockManager.doPutIterator(BlockManager.scala:948)
at org.apache.spark.storage.BlockManager.getOrElseUpdate(BlockManager.scala:694)
at org.apache.spark.rdd.RDD.getOrCompute(RDD.scala:334)
at org.apache.spark.rdd.RDD.iterator(RDD.scala:285)
at org.apache.spark.scheduler.ResultTask.runTask(ResultTask.scala:87)
at org.apache.spark.scheduler.Task.run(Task.scala:99)
at org.apache.spark.executor.Executor$TaskRunner.run(Executor.scala:282)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:745)
17/04/03 18:48:48 ERROR TaskSetManager: Task 2 in stage 0.0 failed 1 times; aborting job
Tracker started, with env={}
- test transformLeaf *** FAILED ***
ml.dmlc.xgboost4j.java.XGBoostError: XGBoostModel training failed
at ml.dmlc.xgboost4j.scala.spark.XGBoost$.postTrackerReturnProcessing(XGBoost.scala:322)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$.trainWithRDD(XGBoost.scala:303)
at ml.dmlc.xgboost4j.scala.spark.XGBoostEstimator.train(XGBoostEstimator.scala:119)
at ml.dmlc.xgboost4j.scala.spark.XGBoostEstimator.train(XGBoostEstimator.scala:34)
at org.apache.spark.ml.Predictor.fit(Predictor.scala:96)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$.trainWithDataFrame(XGBoost.scala:187)
at ml.dmlc.xgboost4j.scala.spark.XGBoostDFSuite$$anonfun$2.apply$mcV$sp(XGBoostDFSuite.scala:90)
at ml.dmlc.xgboost4j.scala.spark.XGBoostDFSuite$$anonfun$2.apply(XGBoostDFSuite.scala:85)
at ml.dmlc.xgboost4j.scala.spark.XGBoostDFSuite$$anonfun$2.apply(XGBoostDFSuite.scala:85)
at org.scalatest.OutcomeOf$class.outcomeOf(OutcomeOf.scala:85)
...
17/04/03 18:49:03 ERROR RabitTracker: Uncaught exception thrown by worker:
java.lang.InterruptedException
at java.util.concurrent.locks.AbstractQueuedSynchronizer.acquireSharedInterruptibly(AbstractQueuedSynchronizer.java:1302)
at scala.concurrent.impl.Promise$DefaultPromise.tryAwait(Promise.scala:202)
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:218)
at scala.concurrent.impl.Promise$DefaultPromise.ready(Promise.scala:153)
at org.apache.spark.scheduler.DAGScheduler.runJob(DAGScheduler.scala:619)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1918)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1931)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1944)
at org.apache.spark.SparkContext.runJob(SparkContext.scala:1958)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:925)
at org.apache.spark.rdd.RDD$$anonfun$foreachPartition$1.apply(RDD.scala:923)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:151)
at org.apache.spark.rdd.RDDOperationScope$.withScope(RDDOperationScope.scala:112)
at org.apache.spark.rdd.RDD.withScope(RDD.scala:362)
at org.apache.spark.rdd.RDD.foreachPartition(RDD.scala:923)
at ml.dmlc.xgboost4j.scala.spark.XGBoost$$anon$2.run(XGBoost.scala:295)
rabit::Init is already called in this thread
rabit::Init is already called in this thread
rabit::Init is already called in this thread
rabit::Init is already called in this thread
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] xgboost-jvm ........................................ SUCCESS [ 3.274 s]
[INFO] xgboost4j .......................................... SUCCESS [ 23.757 s]
[INFO] xgboost4j-spark .................................... FAILURE [01:02 min]
[INFO] xgboost4j-flink .................................... SKIPPED
[INFO] xgboost4j-example .................................. SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 01:30 min
[INFO] Finished at: 2017-04-03T18:49:03+02:00
[INFO] Final Memory: 45M/801M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.scalatest:scalatest-maven-plugin:1.0:test (test) on project xgboost4j-spark: There are test failures -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
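The root cause of the test failures above is the native check at src/objective/regression_obj.cc:41: with the `binary:logistic` objective, `base_score` must lie strictly inside (0,1). A minimal sketch of that validation on the JVM side follows; `objective` and `base_score` are real XGBoost parameter names taken from the log, but the helper class itself is illustrative and not part of xgboost4j:

```java
import java.util.HashMap;
import java.util.Map;

public class BaseScoreCheck {
    // Mirrors the native check that aborted this build:
    // base_score must be in (0,1) for logistic loss.
    static void validate(Map<String, Object> params) {
        Object objective = params.getOrDefault("objective", "reg:linear");
        if ("binary:logistic".equals(objective)) {
            double baseScore = Double.parseDouble(
                params.getOrDefault("base_score", "0.5").toString());
            if (!(baseScore > 0.0 && baseScore < 1.0)) {
                throw new IllegalArgumentException(
                    "base_score must be in (0,1) for logistic loss, got " + baseScore);
            }
        }
    }

    public static void main(String[] args) {
        Map<String, Object> ok = new HashMap<>();
        ok.put("objective", "binary:logistic");
        ok.put("base_score", "0.5");
        validate(ok); // in range: passes

        Map<String, Object> bad = new HashMap<>();
        bad.put("objective", "binary:logistic");
        bad.put("base_score", "1.0");
        boolean rejected = false;
        try {
            validate(bad); // reproduces the failure mode from this log
        } catch (IllegalArgumentException e) {
            rejected = true;
        }
        System.out.println("outOfRangeRejected=" + rejected);
    }
}
```

Checking the training parameter map this way before calling trainWithDataFrame would surface the problem on the driver instead of crashing every executor task.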