Created November 27, 2019 11:25
AlexDBlack/68fe556289fcd32a16b31304a3218a17
c:\DL4J\Git\deeplearning4j\datavec>mvn clean test -Ptest-nd4j-native -Ptestresources
[INFO] Scanning for projects...
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Build Order:
[INFO]
[INFO] DataVec
[INFO] datavec-api
[INFO] datavec-arrow
[INFO] datavec-geo
[INFO] datavec-python
[INFO] datavec-local
[INFO] datavec-data
[INFO] datavec-data-audio
[INFO] datavec-data-image
[INFO] datavec-data-codec
[INFO] datavec-data-nlp
[INFO] datavec-hadoop
[INFO] datavec-spark_2.11
[INFO] DataVec Camel Component
[INFO] datavec-spark-inference-parent
[INFO] datavec-spark-inference-model
[INFO] datavec-spark-inference-server
[INFO] datavec-spark-inference-client
[INFO] datavec-jdbc
[INFO] datavec-excel
[INFO] datavec-perf
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building DataVec 1.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ datavec-parent ---
[INFO] Deleting c:\DL4J\Git\deeplearning4j\datavec\target
[INFO]
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-maven) @ datavec-parent ---
[INFO]
[INFO] --- lint-maven-plugin:0.0.11:check (pom-lint) @ datavec-parent ---
[INFO] Writing summary report
[INFO] [LINT] Completed with no violations
[INFO] Writing xml report
[INFO]
[INFO] --- git-commit-id-plugin:2.2.2:revision (default) @ datavec-parent ---
[INFO]
[INFO] --- build-helper-maven-plugin:3.0.0:add-resource (add-resource) @ datavec-parent ---
[INFO]
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-test-resources) @ datavec-parent ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building datavec-api 1.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ datavec-api ---
[INFO] Deleting C:\DL4J\Git\deeplearning4j\datavec\datavec-api\target
[INFO]
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-maven) @ datavec-api ---
[INFO]
[INFO] --- lint-maven-plugin:0.0.11:check (pom-lint) @ datavec-api ---
[INFO] Writing summary report
[INFO] [LINT] Completed with no violations
[INFO] Writing xml report
[INFO]
[INFO] --- git-commit-id-plugin:2.2.2:revision (default) @ datavec-api ---
[INFO]
[INFO] --- build-helper-maven-plugin:3.0.0:add-resource (add-resource) @ datavec-api ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ datavec-api ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 2 resources
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-compiler-plugin:3.7.0:compile (default-compile) @ datavec-api ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 399 source files to C:\DL4J\Git\deeplearning4j\datavec\datavec-api\target\classes
[INFO] /C:/DL4J/Git/deeplearning4j/datavec/datavec-api/src/main/java/org/datavec/api/conf/Configuration.java: Some input files use or override a deprecated API.
[INFO] /C:/DL4J/Git/deeplearning4j/datavec/datavec-api/src/main/java/org/datavec/api/conf/Configuration.java: Recompile with -Xlint:deprecation for details.
[INFO] /C:/DL4J/Git/deeplearning4j/datavec/datavec-api/src/main/java/org/datavec/api/transform/transform/floattransform/FloatColumnsMathOpTransform.java: Some input files use unchecked or unsafe operations.
[INFO] /C:/DL4J/Git/deeplearning4j/datavec/datavec-api/src/main/java/org/datavec/api/transform/transform/floattransform/FloatColumnsMathOpTransform.java: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ datavec-api ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory C:\DL4J\Git\deeplearning4j\datavec\datavec-api\src\test\resources
[INFO]
[INFO] --- maven-compiler-plugin:3.7.0:testCompile (default-testCompile) @ datavec-api ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 58 source files to C:\DL4J\Git\deeplearning4j\datavec\datavec-api\target\test-classes
[INFO] /C:/DL4J/Git/deeplearning4j/datavec/datavec-api/src/test/java/org/datavec/api/records/reader/impl/LineReaderTest.java: Some input files use or override a deprecated API.
[INFO] /C:/DL4J/Git/deeplearning4j/datavec/datavec-api/src/test/java/org/datavec/api/records/reader/impl/LineReaderTest.java: Recompile with -Xlint:deprecation for details.
[INFO] /C:/DL4J/Git/deeplearning4j/datavec/datavec-api/src/test/java/org/datavec/api/transform/reduce/TestMultiOpReduce.java: C:\DL4J\Git\deeplearning4j\datavec\datavec-api\src\test\java\org\datavec\api\transform\reduce\TestMultiOpReduce.java uses unchecked or unsafe operations.
[INFO] /C:/DL4J/Git/deeplearning4j/datavec/datavec-api/src/test/java/org/datavec/api/transform/reduce/TestMultiOpReduce.java: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ datavec-api ---
-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.datavec.api.records.reader.impl.CSVLineSequenceRecordReaderTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.909 sec - in org.datavec.api.records.reader.impl.CSVLineSequenceRecordReaderTest
Running org.datavec.api.records.reader.impl.CSVMultiSequenceRecordReaderTest
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.051 sec - in org.datavec.api.records.reader.impl.CSVMultiSequenceRecordReaderTest
Running org.datavec.api.records.reader.impl.CSVNLinesSequenceRecordReaderTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.061 sec - in org.datavec.api.records.reader.impl.CSVNLinesSequenceRecordReaderTest
Running org.datavec.api.records.reader.impl.CSVRecordReaderTest
--------------------------------
java.lang.RuntimeException: Error during LineRecordReader reset
	at org.datavec.api.records.reader.impl.LineRecordReader.reset(LineRecordReader.java:171)
	at org.datavec.api.records.reader.impl.csv.CSVRecordReader.reset(CSVRecordReader.java:235)
	at org.datavec.api.records.reader.impl.CSVRecordReaderTest.testStreamReset(CSVRecordReaderTest.java:311)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:367)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:274)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:161)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
Caused by: java.lang.UnsupportedOperationException: Reset not supported from streams
	at org.datavec.api.split.InputStreamInputSplit.reset(InputStreamInputSplit.java:143)
	at org.datavec.api.records.reader.impl.LineRecordReader.reset(LineRecordReader.java:167)
	... 26 more
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.076 sec - in org.datavec.api.records.reader.impl.CSVRecordReaderTest
Running org.datavec.api.records.reader.impl.CSVSequenceRecordReaderTest
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.191 sec - in org.datavec.api.records.reader.impl.CSVSequenceRecordReaderTest
Running org.datavec.api.records.reader.impl.CSVVariableSlidingWindowRecordReaderTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.028 sec - in org.datavec.api.records.reader.impl.CSVVariableSlidingWindowRecordReaderTest
Running org.datavec.api.records.reader.impl.FileBatchRecordReaderTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.104 sec - in org.datavec.api.records.reader.impl.FileBatchRecordReaderTest
Running org.datavec.api.records.reader.impl.FileRecordReaderTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.017 sec - in org.datavec.api.records.reader.impl.FileRecordReaderTest
Running org.datavec.api.records.reader.impl.JacksonLineRecordReaderTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.36 sec - in org.datavec.api.records.reader.impl.JacksonLineRecordReaderTest
Running org.datavec.api.records.reader.impl.JacksonRecordReaderTest
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.604 sec - in org.datavec.api.records.reader.impl.JacksonRecordReaderTest
Running org.datavec.api.records.reader.impl.LibSvmRecordReaderTest
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.186 sec - in org.datavec.api.records.reader.impl.LibSvmRecordReaderTest
Running org.datavec.api.records.reader.impl.LineReaderTest
java.io.IOException: Unable to delete file: C:\Users\Alex\AppData\Local\Temp\tmpdir-testLineReader\tmp3.txt
	at org.apache.commons.io.FileUtils.forceDelete(FileUtils.java:2381)
	at org.apache.commons.io.FileUtils.cleanDirectory(FileUtils.java:1679)
	at org.apache.commons.io.FileUtils.deleteDirectory(FileUtils.java:1575)
	at org.datavec.api.records.reader.impl.LineReaderTest.testLineReader(LineReaderTest.java:89)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:367)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:274)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:161)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
java.io.IOException: Unable to delete file: C:\Users\Alex\AppData\Local\Temp\junit8362671457948134290\junit7281348152887334603\tmp3.txt
	at org.apache.commons.io.FileUtils.forceDelete(FileUtils.java:2381)
	at org.apache.commons.io.FileUtils.cleanDirectory(FileUtils.java:1679)
	at org.apache.commons.io.FileUtils.deleteDirectory(FileUtils.java:1575)
	at org.datavec.api.records.reader.impl.LineReaderTest.testLineReaderMetaData(LineReaderTest.java:151)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:367)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:274)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:161)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
java.io.IOException: Unable to delete file: C:\Users\Alex\AppData\Local\Temp\junit1994159827995476413\junit3522288626564733443\tmp1.txt.gz
	at org.apache.commons.io.FileUtils.forceDelete(FileUtils.java:2381)
	at org.apache.commons.io.FileUtils.cleanDirectory(FileUtils.java:1679)
	at org.apache.commons.io.FileUtils.deleteDirectory(FileUtils.java:1575)
	at org.datavec.api.records.reader.impl.LineReaderTest.testLineReaderWithInputStreamInputSplit(LineReaderTest.java:182)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.junit.runners.model.FrameworkMethod$1.runReflectiveCall(FrameworkMethod.java:50)
	at org.junit.internal.runners.model.ReflectiveCallable.run(ReflectiveCallable.java:12)
	at org.junit.runners.model.FrameworkMethod.invokeExplosively(FrameworkMethod.java:47)
	at org.junit.internal.runners.statements.InvokeMethod.evaluate(InvokeMethod.java:17)
	at org.junit.rules.ExternalResource$1.evaluate(ExternalResource.java:48)
	at org.junit.rules.RunRules.evaluate(RunRules.java:20)
	at org.junit.runners.ParentRunner.runLeaf(ParentRunner.java:325)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:78)
	at org.junit.runners.BlockJUnit4ClassRunner.runChild(BlockJUnit4ClassRunner.java:57)
	at org.junit.runners.ParentRunner$3.run(ParentRunner.java:290)
	at org.junit.runners.ParentRunner$1.schedule(ParentRunner.java:71)
	at org.junit.runners.ParentRunner.runChildren(ParentRunner.java:288)
	at org.junit.runners.ParentRunner.access$000(ParentRunner.java:58)
	at org.junit.runners.ParentRunner$2.evaluate(ParentRunner.java:268)
	at org.junit.runners.ParentRunner.run(ParentRunner.java:363)
	at org.apache.maven.surefire.junit4.JUnit4Provider.execute(JUnit4Provider.java:367)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeWithRerun(JUnit4Provider.java:274)
	at org.apache.maven.surefire.junit4.JUnit4Provider.executeTestSet(JUnit4Provider.java:238)
	at org.apache.maven.surefire.junit4.JUnit4Provider.invoke(JUnit4Provider.java:161)
	at org.apache.maven.surefire.booter.ForkedBooter.invokeProviderInSameClassLoader(ForkedBooter.java:290)
	at org.apache.maven.surefire.booter.ForkedBooter.runSuitesInProcess(ForkedBooter.java:242)
	at org.apache.maven.surefire.booter.ForkedBooter.main(ForkedBooter.java:121)
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.068 sec - in org.datavec.api.records.reader.impl.LineReaderTest
Running org.datavec.api.records.reader.impl.RegexRecordReaderTest
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.16 sec - in org.datavec.api.records.reader.impl.RegexRecordReaderTest
Running org.datavec.api.records.reader.impl.SVMLightRecordReaderTest
Tests run: 15, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.105 sec - in org.datavec.api.records.reader.impl.SVMLightRecordReaderTest
Running org.datavec.api.records.reader.impl.TestCollectionRecordReaders
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 sec - in org.datavec.api.records.reader.impl.TestCollectionRecordReaders
Running org.datavec.api.records.reader.impl.TestConcatenatingRecordReader
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.022 sec - in org.datavec.api.records.reader.impl.TestConcatenatingRecordReader
Running org.datavec.api.records.reader.impl.TestSerialization
org.datavec.api.records.reader.impl.csv.CSVNLinesSequenceRecordReader
org.datavec.api.records.reader.impl.csv.CSVRecordReader
org.datavec.api.records.reader.impl.csv.CSVSequenceRecordReader
org.datavec.api.records.reader.impl.csv.CSVVariableSlidingWindowRecordReader
org.datavec.api.records.reader.impl.csv.CSVRegexRecordReader
org.datavec.api.records.reader.impl.jackson.JacksonRecordReader
org.datavec.api.records.reader.impl.jackson.JacksonLineRecordReader
org.datavec.api.records.reader.impl.misc.LibSvmRecordReader
org.datavec.api.records.reader.impl.misc.SVMLightRecordReader
org.datavec.api.records.reader.impl.regex.RegexLineRecordReader
org.datavec.api.records.reader.impl.regex.RegexSequenceRecordReader
org.datavec.api.records.reader.impl.transform.TransformProcessRecordReader
org.datavec.api.records.reader.impl.transform.TransformProcessSequenceRecordReader
org.datavec.api.records.reader.impl.LineRecordReader
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.179 sec - in org.datavec.api.records.reader.impl.TestSerialization
Running org.datavec.api.records.writer.impl.CSVRecordWriterTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 sec - in org.datavec.api.records.writer.impl.CSVRecordWriterTest
Running org.datavec.api.records.writer.impl.LibSvmRecordWriterTest
22:22:28.977 [main] INFO org.nd4j.linalg.factory.Nd4jBackend - Loaded [CpuBackend] backend
22:22:45.146 [main] INFO org.nd4j.nativeblas.NativeOpsHolder - Number of threads used for linear algebra: 8
22:22:45.227 [main] INFO org.nd4j.nativeblas.Nd4jBlas - Number of threads used for OpenMP BLAS: 8
22:22:45.232 [main] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Backend used: [CPU]; OS: [Windows 10]
22:22:45.232 [main] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Cores: [16]; Memory: [7.1GB];
22:22:45.232 [main] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Blas vendor: [OPENBLAS]
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 16.493 sec - in org.datavec.api.records.writer.impl.LibSvmRecordWriterTest
Running org.datavec.api.records.writer.impl.SVMLightRecordWriterTest
Tests run: 11, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.117 sec - in org.datavec.api.records.writer.impl.SVMLightRecordWriterTest
Running org.datavec.api.split.TestStreamInputSplit
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.046 sec - in org.datavec.api.split.TestStreamInputSplit
Running org.datavec.api.split.TransformSplitTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 sec - in org.datavec.api.split.TransformSplitTest
Running org.datavec.api.transform.condition.TestConditions
Tests run: 13, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.051 sec - in org.datavec.api.transform.condition.TestConditions
Running org.datavec.api.transform.filter.TestFilters
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 sec - in org.datavec.api.transform.filter.TestFilters
Running org.datavec.api.transform.join.TestJoin
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.008 sec - in org.datavec.api.transform.join.TestJoin
Running org.datavec.api.transform.ops.AggregableMultiOpTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.03 sec - in org.datavec.api.transform.ops.AggregableMultiOpTest
Running org.datavec.api.transform.ops.AggregatorImplsTest
Tests run: 14, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.019 sec - in org.datavec.api.transform.ops.AggregatorImplsTest
Running org.datavec.api.transform.ops.DispatchOpTest
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 sec - in org.datavec.api.transform.ops.DispatchOpTest
Running org.datavec.api.transform.reduce.TestMultiOpReduce
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.008 sec - in org.datavec.api.transform.reduce.TestMultiOpReduce
Running org.datavec.api.transform.reduce.TestReductions
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 sec - in org.datavec.api.transform.reduce.TestReductions
Running org.datavec.api.transform.schema.TestJsonYaml
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.291 sec - in org.datavec.api.transform.schema.TestJsonYaml
Running org.datavec.api.transform.schema.TestSchemaMethods
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 sec - in org.datavec.api.transform.schema.TestSchemaMethods
Running org.datavec.api.transform.sequence.TestReduceSequenceByWindowFunction
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 sec - in org.datavec.api.transform.sequence.TestReduceSequenceByWindowFunction
Running org.datavec.api.transform.sequence.TestSequenceSplit
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 sec - in org.datavec.api.transform.sequence.TestSequenceSplit
Running org.datavec.api.transform.sequence.TestWindowFunctions
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 sec - in org.datavec.api.transform.sequence.TestWindowFunctions
Running org.datavec.api.transform.serde.TestCustomTransformJsonYaml
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.024 sec - in org.datavec.api.transform.serde.TestCustomTransformJsonYaml
Running org.datavec.api.transform.serde.TestYamlJsonSerde
Tests run: 9, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.258 sec - in org.datavec.api.transform.serde.TestYamlJsonSerde
Running org.datavec.api.transform.stringreduce.TestReduce
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 sec - in org.datavec.api.transform.stringreduce.TestReduce
Running org.datavec.api.transform.TestTransformProcess
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.003 sec - in org.datavec.api.transform.TestTransformProcess
Running org.datavec.api.transform.transform.ndarray.TestNDArrayWritableTransforms
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.286 sec - in org.datavec.api.transform.transform.ndarray.TestNDArrayWritableTransforms
Running org.datavec.api.transform.transform.ndarray.TestYamlJsonSerde
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.01 sec - in org.datavec.api.transform.transform.ndarray.TestYamlJsonSerde
Running org.datavec.api.transform.transform.parse.ParseDoubleTransformTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 sec - in org.datavec.api.transform.transform.parse.ParseDoubleTransformTest
Running org.datavec.api.transform.transform.TestJsonYaml
Tests run: 2, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.063 sec - in org.datavec.api.transform.transform.TestJsonYaml
Running org.datavec.api.transform.transform.TestTransforms
Tests run: 53, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.102 sec - in org.datavec.api.transform.transform.TestTransforms
Running org.datavec.api.transform.ui.TestUI
C:\Users\Alex\AppData\Local\Temp\junit2065293018270009152\junit2641309503950758663\datavec_transform_UITest.html
C:\Users\Alex\AppData\Local\Temp\junit2065293018270009152\junit2641309503950758663\datavec_transform_UITest_seq.html
Tests run: 2, Failures: 0, Errors: 0, Skipped: 1, Time elapsed: 0.317 sec - in org.datavec.api.transform.ui.TestUI
Running org.datavec.api.util.ClassPathResourceTest
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.035 sec - in org.datavec.api.util.ClassPathResourceTest
Running org.datavec.api.util.TimeSeriesUtilsTest
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 sec - in org.datavec.api.util.TimeSeriesUtilsTest
Running org.datavec.api.writable.RecordConverterTest
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.039 sec - in org.datavec.api.writable.RecordConverterTest
Running org.datavec.api.writable.TestNDArrayWritableAndSerialization
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.01 sec - in org.datavec.api.writable.TestNDArrayWritableAndSerialization
Running org.datavec.api.writable.WritableTest
Tests run: 7, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.018 sec - in org.datavec.api.writable.WritableTest
Results :
Tests run: 261, Failures: 0, Errors: 0, Skipped: 1
[INFO] | |
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-test-resources) @ datavec-api --- | |
[INFO] | |
[INFO] ------------------------------------------------------------------------ | |
[INFO] Building datavec-arrow 1.0.0-SNAPSHOT | |
[INFO] ------------------------------------------------------------------------ | |
[INFO] | |
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ datavec-arrow --- | |
[INFO] Deleting C:\DL4J\Git\deeplearning4j\datavec\datavec-arrow\target | |
[INFO] | |
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-maven) @ datavec-arrow --- | |
[INFO] | |
[INFO] --- lint-maven-plugin:0.0.11:check (pom-lint) @ datavec-arrow --- | |
[INFO] Writing summary report | |
[INFO] [LINT] Completed with no violations | |
[INFO] Writing xml report | |
[INFO] | |
[INFO] --- git-commit-id-plugin:2.2.2:revision (default) @ datavec-arrow --- | |
[INFO] | |
[INFO] --- build-helper-maven-plugin:3.0.0:add-resource (add-resource) @ datavec-arrow --- | |
[INFO] | |
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ datavec-arrow --- | |
[INFO] Using 'UTF-8' encoding to copy filtered resources. | |
[INFO] skip non existing resourceDirectory C:\DL4J\Git\deeplearning4j\datavec\datavec-arrow\src\main\resources | |
[INFO] Copying 1 resource | |
[INFO] | |
[INFO] --- maven-compiler-plugin:3.7.0:compile (default-compile) @ datavec-arrow --- | |
[INFO] Changes detected - recompiling the module! | |
[INFO] Compiling 6 source files to C:\DL4J\Git\deeplearning4j\datavec\datavec-arrow\target\classes
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ datavec-arrow ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory C:\DL4J\Git\deeplearning4j\datavec\datavec-arrow\src\test\resources
[INFO]
[INFO] --- maven-compiler-plugin:3.7.0:testCompile (default-testCompile) @ datavec-arrow ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 3 source files to C:\DL4J\Git\deeplearning4j\datavec\datavec-arrow\target\test-classes
[INFO] /C:/DL4J/Git/deeplearning4j/datavec/datavec-arrow/src/test/java/org/datavec/arrow/RecordMapperTest.java: C:\DL4J\Git\deeplearning4j\datavec\datavec-arrow\src\test\java\org\datavec\arrow\RecordMapperTest.java uses or overrides a deprecated API.
[INFO] /C:/DL4J/Git/deeplearning4j/datavec/datavec-arrow/src/test/java/org/datavec/arrow/RecordMapperTest.java: Recompile with -Xlint:deprecation for details.
[INFO]
[INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ datavec-arrow ---
-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.datavec.arrow.ArrowConverterTest
SLF4J: Failed to load class "org.slf4j.impl.StaticLoggerBinder".
SLF4J: Defaulting to no-operation (NOP) logger implementation
SLF4J: See http://www.slf4j.org/codes.html#StaticLoggerBinder for further details.
[ 1.0000, 2.0000, 3.0000, 4.0000]
Tests run: 17, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 20.827 sec - in org.datavec.arrow.ArrowConverterTest
Running org.datavec.arrow.RecordMapperTest
ArrowWritableRecordBatch(list=[org.apache.arrow.vector.UInt4Vector@2bb7bd00[name = 0, ...], org.apache.arrow.vector.UInt4Vector@53812a9b[name = 1, ...], org.apache.arrow.vector.UInt4Vector@18230356[name = 2, ...]], size=10, schema=Schema():
idx name type meta data
0 "0" Integer IntegerMetaData(name="0",)
1 "1" Integer IntegerMetaData(name="1",)
2 "2" Integer IntegerMetaData(name="2",)
, arrowRecordBatch=ArrowRecordBatch [length=10, nodes=[ArrowFieldNode [length=10, nullCount=0], ArrowFieldNode [length=10, nullCount=0], ArrowFieldNode [length=10, nullCount=0]], #buffers=6, buffersLayout=[ArrowBuffer [offset=0, size=2], ArrowBuffer [offset=8, size=40], ArrowBuffer [offset=48, size=2], ArrowBuffer [offset=56, size=40], ArrowBuffer [offset=96, size=2], ArrowBuffer [offset=104, size=40]], closed=false], vectorLoader=null, unloader=org.apache.arrow.vector.VectorUnloader@e077866, offset=0, rows=0)
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.128 sec - in org.datavec.arrow.RecordMapperTest
Results :
Tests run: 20, Failures: 0, Errors: 0, Skipped: 0
[INFO]
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-test-resources) @ datavec-arrow ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building datavec-geo 1.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ datavec-geo ---
[INFO] Deleting C:\DL4J\Git\deeplearning4j\datavec\datavec-geo\target
[INFO]
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-maven) @ datavec-geo ---
[INFO]
[INFO] --- lint-maven-plugin:0.0.11:check (pom-lint) @ datavec-geo ---
[INFO] Writing summary report
[INFO] [LINT] Completed with no violations
[INFO] Writing xml report
[INFO]
[INFO] --- git-commit-id-plugin:2.2.2:revision (default) @ datavec-geo ---
[INFO]
[INFO] --- build-helper-maven-plugin:3.0.0:add-resource (add-resource) @ datavec-geo ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ datavec-geo ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory C:\DL4J\Git\deeplearning4j\datavec\datavec-geo\src\main\resources
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-compiler-plugin:3.7.0:compile (default-compile) @ datavec-geo ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 6 source files to C:\DL4J\Git\deeplearning4j\datavec\datavec-geo\target\classes
[INFO] /C:/DL4J/Git/deeplearning4j/datavec/datavec-geo/src/main/java/org/datavec/api/transform/transform/geo/CoordinatesDistanceTransform.java: C:\DL4J\Git\deeplearning4j\datavec\datavec-geo\src\main\java\org\datavec\api\transform\transform\geo\CoordinatesDistanceTransform.java uses unchecked or unsafe operations.
[INFO] /C:/DL4J/Git/deeplearning4j/datavec/datavec-geo/src/main/java/org/datavec/api/transform/transform/geo/CoordinatesDistanceTransform.java: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ datavec-geo ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory C:\DL4J\Git\deeplearning4j\datavec\datavec-geo\src\test\resources
[INFO]
[INFO] --- maven-compiler-plugin:3.7.0:testCompile (default-testCompile) @ datavec-geo ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 2 source files to C:\DL4J\Git\deeplearning4j\datavec\datavec-geo\target\test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ datavec-geo ---
-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.datavec.api.transform.reduce.TestGeoReduction
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.153 sec - in org.datavec.api.transform.reduce.TestGeoReduction
Running org.datavec.api.transform.transform.TestGeoTransforms
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.3 sec - in org.datavec.api.transform.transform.TestGeoTransforms
Results :
Tests run: 4, Failures: 0, Errors: 0, Skipped: 0
[INFO]
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-test-resources) @ datavec-geo ---
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building datavec-python 1.0.0-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.5:clean (default-clean) @ datavec-python ---
[INFO] Deleting C:\DL4J\Git\deeplearning4j\datavec\datavec-python\target
[INFO]
[INFO] --- maven-enforcer-plugin:1.4.1:enforce (enforce-maven) @ datavec-python ---
[INFO]
[INFO] --- lint-maven-plugin:0.0.11:check (pom-lint) @ datavec-python ---
[INFO] Writing summary report
[INFO] [LINT] Completed with no violations
[INFO] Writing xml report
[INFO]
[INFO] --- git-commit-id-plugin:2.2.2:revision (default) @ datavec-python ---
[INFO]
[INFO] --- build-helper-maven-plugin:3.0.0:add-resource (add-resource) @ datavec-python ---
[INFO]
[INFO] --- maven-resources-plugin:2.6:resources (default-resources) @ datavec-python ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 8 resources
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-compiler-plugin:3.7.0:compile (default-compile) @ datavec-python ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 6 source files to C:\DL4J\Git\deeplearning4j\datavec\datavec-python\target\classes
[INFO] /C:/DL4J/Git/deeplearning4j/datavec/datavec-python/src/main/java/org/datavec/python/PythonUtils.java: Some input files use unchecked or unsafe operations.
[INFO] /C:/DL4J/Git/deeplearning4j/datavec/datavec-python/src/main/java/org/datavec/python/PythonUtils.java: Recompile with -Xlint:unchecked for details.
[INFO]
[INFO] --- maven-resources-plugin:2.6:testResources (default-testResources) @ datavec-python ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] skip non existing resourceDirectory C:\DL4J\Git\deeplearning4j\datavec\datavec-python\src\test\resources
[INFO]
[INFO] --- maven-compiler-plugin:3.7.0:testCompile (default-testCompile) @ datavec-python ---
[INFO] Changes detected - recompiling the module!
[INFO] Compiling 5 source files to C:\DL4J\Git\deeplearning4j\datavec\datavec-python\target\test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.19.1:test (default-test) @ datavec-python ---
-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.datavec.python.TestPythonExecutioner
22:23:56.085 [main] INFO org.nd4j.linalg.factory.Nd4jBackend - Loaded [CpuBackend] backend
22:24:15.930 [main] INFO org.nd4j.nativeblas.NativeOpsHolder - Number of threads used for linear algebra: 8
22:24:16.050 [main] INFO org.nd4j.nativeblas.Nd4jBlas - Number of threads used for OpenMP BLAS: 8
22:24:16.059 [main] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Backend used: [CPU]; OS: [Windows 10]
22:24:16.059 [main] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Cores: [16]; Memory: [7.1GB];
22:24:16.059 [main] INFO org.nd4j.linalg.api.ops.executioner.DefaultOpExecutioner - Blas vendor: [OPENBLAS]
22:24:16.070 [main] INFO org.datavec.python.PythonExecutioner - Setting python default path
22:24:19.386 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_InitThreads()
22:24:19.386 [main] INFO org.datavec.python.PythonExecutioner - CPython: Py_InitializeEx()
22:24:35.103 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyGILState_Release()
22:24:36.076 [main] INFO org.datavec.python.PythonExecutioner - acquireGIL()
22:24:36.076 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_SaveThread()
22:24:36.076 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyThreadState_New()
22:24:36.189 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_RestoreThread()
22:24:36.189 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyThreadState_Swap()
22:24:36.189 [main] INFO org.datavec.python.PythonExecutioner - import sys
import traceback
import json
import inspect
try:
    import numpy as np
    sys.stdout.flush()
    sys.stderr.flush()
except Exception as ex:
    try:
        exc_info = sys.exc_info()
    finally:
        print(ex)
        traceback.print_exception(*exc_info)
        sys.stdout.flush()
        sys.stderr.flush()
22:24:36.190 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyRun_SimpleStringFlag()
cannot import name 'ARRAY_FUNCTION_ENABLED' from 'numpy.core.overrides' (C:\Users\Alex\.javacpp\cache\numpy-1.17.3-1.5.2-windows-x86_64.jar\org\bytedeco\numpy\windows-x86_64\python\numpy\core\overrides.py)
Traceback (most recent call last):
  File "<string>", line 9, in <module>
  File "C:\Users\Alex\.javacpp\cache\numpy-1.17.3-1.5.2-windows-x86_64.jar\org\bytedeco\numpy\windows-x86_64\python\numpy\__init__.py", line 145, in <module>
    from . import lib
  File "C:\Users\Alex\.javacpp\cache\numpy-1.17.3-1.5.2-windows-x86_64.jar\org\bytedeco\numpy\windows-x86_64\python\numpy\lib\__init__.py", line 8, in <module>
    from .type_check import *
  File "C:\Users\Alex\.javacpp\cache\numpy-1.17.3-1.5.2-windows-x86_64.jar\org\bytedeco\numpy\windows-x86_64\python\numpy\lib\type_check.py", line 17, in <module>
    from .ufunclike import isneginf, isposinf
  File "C:\Users\Alex\.javacpp\cache\numpy-1.17.3-1.5.2-windows-x86_64.jar\org\bytedeco\numpy\windows-x86_64\python\numpy\lib\ufunclike.py", line 11, in <module>
    from numpy.core.overrides import (
ImportError: cannot import name 'ARRAY_FUNCTION_ENABLED' from 'numpy.core.overrides' (C:\Users\Alex\.javacpp\cache\numpy-1.17.3-1.5.2-windows-x86_64.jar\org\bytedeco\numpy\windows-x86_64\python\numpy\core\overrides.py)
22:24:36.210 [main] INFO org.datavec.python.PythonExecutioner - Exec done
22:24:36.210 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_SaveThread()
22:24:36.210 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_RestoreThread()
22:24:36.211 [main] INFO org.datavec.python.PythonExecutioner - acquireGIL()
22:24:36.211 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_SaveThread()
22:24:36.211 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyThreadState_New()
22:24:36.211 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_RestoreThread()
22:24:36.211 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyThreadState_Swap()
22:24:36.211 [main] INFO org.datavec.python.PythonExecutioner - import sys
import traceback
import json
import inspect
try:
    __overrides_path = np.core.overrides.__file__
    sys.stdout.flush()
    sys.stderr.flush()
except Exception as ex:
    try:
        exc_info = sys.exc_info()
    finally:
        print(ex)
        traceback.print_exception(*exc_info)
        sys.stdout.flush()
        sys.stderr.flush()
22:24:36.211 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyRun_SimpleStringFlag()
name 'np' is not defined
Traceback (most recent call last):
  File "<string>", line 9, in <module>
NameError: name 'np' is not defined
22:24:36.211 [main] INFO org.datavec.python.PythonExecutioner - Exec done
22:24:36.211 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_SaveThread()
22:24:36.211 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_RestoreThread()
22:24:36.212 [main] INFO org.datavec.python.PythonExecutioner - acquireGIL()
22:24:36.212 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_SaveThread()
22:24:36.212 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyThreadState_New()
22:24:36.212 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_RestoreThread()
22:24:36.212 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyThreadState_Swap()
22:24:36.212 [main] INFO org.datavec.python.PythonExecutioner - import sys
import traceback
import json
import inspect
try:
    __random_path = np.random.__file__
    sys.stdout.flush()
    sys.stderr.flush()
except Exception as ex:
    try:
        exc_info = sys.exc_info()
    finally:
        print(ex)
        traceback.print_exception(*exc_info)
        sys.stdout.flush()
        sys.stderr.flush()
22:24:36.213 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyRun_SimpleStringFlag()
name 'np' is not defined
Traceback (most recent call last):
  File "<string>", line 9, in <module>
NameError: name 'np' is not defined
22:24:36.213 [main] INFO org.datavec.python.PythonExecutioner - Exec done
22:24:36.213 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_SaveThread()
22:24:36.213 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_RestoreThread()
22:24:36.214 [main] INFO org.datavec.python.PythonExecutioner - temp_1_main.json
22:24:36.215 [main] INFO org.datavec.python.PythonExecutioner - acquireGIL()
22:24:36.215 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_SaveThread()
22:24:36.215 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyThreadState_New()
22:24:36.215 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_RestoreThread()
22:24:36.215 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyThreadState_Swap()
22:24:36.215 [main] INFO org.datavec.python.PythonExecutioner - import sys
import traceback
import json
import inspect
try:
    print('')
    import json
    def __is_numpy_array(x):
        return str(type(x))== "<class 'numpy.ndarray'>"
    def __maybe_serialize_ndarray_metadata(x):
        return __serialize_ndarray_metadata(x) if __is_numpy_array(x) else x
    def __serialize_ndarray_metadata(x):
        return {"address": x.__array_interface__['data'][0],
                "shape": x.shape,
                "strides": x.strides,
                "dtype": str(x.dtype),
                "_is_numpy_array": True} if __is_numpy_array(x) else x
    def __serialize_list(x):
        import json
        return json.dumps(__recursive_serialize_list(x))
    def __serialize_dict(x):
        import json
        return json.dumps(__recursive_serialize_dict(x))
    def __recursive_serialize_list(x):
        out = []
        for i in x:
            if __is_numpy_array(i):
                out.append(__serialize_ndarray_metadata(i))
            elif isinstance(i, (list, tuple)):
                out.append(__recursive_serialize_list(i))
            elif isinstance(i, dict):
                out.append(__recursive_serialize_dict(i))
            else:
                out.append(i)
        return out
    def __recursive_serialize_dict(x):
        out = {}
        for k in x:
            v = x[k]
            if __is_numpy_array(v):
                out[k] = __serialize_ndarray_metadata(v)
            elif isinstance(v, (list, tuple)):
                out[k] = __recursive_serialize_list(v)
            elif isinstance(v, dict):
                out[k] = __recursive_serialize_dict(v)
            else:
                out[k] = v
        return out
    _1_main_out = __serialize_dict({"__overrides_path": __overrides_path})
    with open('temp_1_main.json', 'w') as _f1554073280:_f1554073280.write(_1_main_out)
    sys.stdout.flush()
    sys.stderr.flush()
except Exception as ex:
    try:
        exc_info = sys.exc_info()
    finally:
        print(ex)
        traceback.print_exception(*exc_info)
        sys.stdout.flush()
        sys.stderr.flush()
22:24:36.215 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyRun_SimpleStringFlag()
name '__overrides_path' is not defined
Traceback (most recent call last):
  File "<string>", line 61, in <module>
NameError: name '__overrides_path' is not defined
22:24:36.216 [main] INFO org.datavec.python.PythonExecutioner - Exec done
22:24:36.216 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_SaveThread()
22:24:36.216 [main] INFO org.datavec.python.PythonExecutioner - CPython: PyEval_RestoreThread()
22:24:36.216 [main] INFO org.datavec.python.PythonExecutioner - temp_1_main.json
Tests run: 10, Failures: 0, Errors: 10, Skipped: 0, Time elapsed: 40.546 sec <<< FAILURE! - in org.datavec.python.TestPythonExecutioner
testInt(org.datavec.python.TestPythonExecutioner) Time elapsed: 40.334 sec <<< ERROR!
java.lang.ExceptionInInitializerError
    at org.datavec.python.TestPythonExecutioner.testInt(TestPythonExecutioner.java:71)
Caused by: java.lang.IllegalStateException: File C:\DL4J\Git\deeplearning4j\datavec\datavec-python\temp_1_main.json failed to get written for reading outputs!
    at org.datavec.python.TestPythonExecutioner.testInt(TestPythonExecutioner.java:71)
testStr(org.datavec.python.TestPythonExecutioner) Time elapsed: 0 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.datavec.python.PythonExecutioner
    at org.datavec.python.TestPythonExecutioner.testStr(TestPythonExecutioner.java:49)
testTensorflowCustomAnaconda(org.datavec.python.TestPythonExecutioner) Time elapsed: 0 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.datavec.python.PythonExecutioner
    at org.datavec.python.TestPythonExecutioner.testTensorflowCustomAnaconda(TestPythonExecutioner.java:148)
testList(org.datavec.python.TestPythonExecutioner) Time elapsed: 0 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.datavec.python.PythonExecutioner
    at org.datavec.python.TestPythonExecutioner.testList(TestPythonExecutioner.java:95)
testNDArrayLong(org.datavec.python.TestPythonExecutioner) Time elapsed: 0.076 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.datavec.python.PythonExecutioner
    at org.datavec.python.TestPythonExecutioner.testNDArrayLong(TestPythonExecutioner.java:215)
testPythonSysVersion(org.datavec.python.TestPythonExecutioner) Time elapsed: 0 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.datavec.python.PythonExecutioner
    at org.datavec.python.TestPythonExecutioner.testPythonSysVersion(TestPythonExecutioner.java:33)
testNDArrayInt(org.datavec.python.TestPythonExecutioner) Time elapsed: 0.006 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.datavec.python.PythonExecutioner
    at org.datavec.python.TestPythonExecutioner.testNDArrayInt(TestPythonExecutioner.java:197)
testNDArrayFloat(org.datavec.python.TestPythonExecutioner) Time elapsed: 0.007 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.datavec.python.PythonExecutioner
    at org.datavec.python.TestPythonExecutioner.testNDArrayFloat(TestPythonExecutioner.java:138)
testNDArrayShort(org.datavec.python.TestPythonExecutioner) Time elapsed: 0.006 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.datavec.python.PythonExecutioner
    at org.datavec.python.TestPythonExecutioner.testNDArrayShort(TestPythonExecutioner.java:179)
testNDArrayDouble(org.datavec.python.TestPythonExecutioner) Time elapsed: 0 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.datavec.python.PythonExecutioner
    at org.datavec.python.TestPythonExecutioner.testNDArrayDouble(TestPythonExecutioner.java:162)
Running org.datavec.python.TestPythonExecutionSandbox
Tests run: 3, Failures: 0, Errors: 3, Skipped: 0, Time elapsed: 0 sec <<< FAILURE! - in org.datavec.python.TestPythonExecutionSandbox
testNumpyRandom(org.datavec.python.TestPythonExecutionSandbox) Time elapsed: 0 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.datavec.python.PythonExecutioner
    at org.datavec.python.TestPythonExecutionSandbox.testNumpyRandom(TestPythonExecutionSandbox.java:72)
testInt(org.datavec.python.TestPythonExecutionSandbox) Time elapsed: 0 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.datavec.python.PythonExecutioner
    at org.datavec.python.TestPythonExecutionSandbox.testInt(TestPythonExecutionSandbox.java:28)
testNDArray(org.datavec.python.TestPythonExecutionSandbox) Time elapsed: 0 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.datavec.python.PythonExecutioner
    at org.datavec.python.TestPythonExecutionSandbox.testNDArray(TestPythonExecutionSandbox.java:48)
Running org.datavec.python.TestPythonSetupAndRun
Tests run: 1, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0 sec <<< FAILURE! - in org.datavec.python.TestPythonSetupAndRun
testPythonWithSetupAndRun(org.datavec.python.TestPythonSetupAndRun) Time elapsed: 0 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.datavec.python.PythonExecutioner
    at org.datavec.python.TestPythonSetupAndRun.testPythonWithSetupAndRun(TestPythonSetupAndRun.java:21)
Running org.datavec.python.TestPythonVariables
null
Tests run: 2, Failures: 0, Errors: 1, Skipped: 0, Time elapsed: 0 sec <<< FAILURE! - in org.datavec.python.TestPythonVariables
testImportNumpy(org.datavec.python.TestPythonVariables) Time elapsed: 0 sec <<< ERROR!
java.lang.NoClassDefFoundError: Could not initialize class org.datavec.python.PythonExecutioner
    at org.datavec.python.TestPythonVariables.testImportNumpy(TestPythonVariables.java:42)
Running org.datavec.python.TestSerde
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 1.376 sec - in org.datavec.python.TestSerde
Results :
Tests in error:
  TestPythonExecutionSandbox.testInt:28 NoClassDefFound Could not initialize cla...
  TestPythonExecutionSandbox.testNDArray:48 NoClassDefFound Could not initialize...
  TestPythonExecutionSandbox.testNumpyRandom:72 NoClassDefFound Could not initia...
  TestPythonExecutioner.testInt:71 ExceptionInInitializer
  TestPythonExecutioner.testList:95 NoClassDefFound Could not initialize class o...
  TestPythonExecutioner.testNDArrayDouble:162 NoClassDefFound Could not initiali...
  TestPythonExecutioner.testNDArrayFloat:138 NoClassDefFound Could not initializ...
  TestPythonExecutioner.testNDArrayInt:197 NoClassDefFound Could not initialize ...
  TestPythonExecutioner.testNDArrayLong:215 NoClassDefFound Could not initialize...
  TestPythonExecutioner.testNDArrayShort:179 NoClassDefFound Could not initializ...
  TestPythonExecutioner.testPythonSysVersion:33 NoClassDefFound Could not initia...
  TestPythonExecutioner.testStr:49 NoClassDefFound Could not initialize class or...
  TestPythonExecutioner.testTensorflowCustomAnaconda:148 NoClassDefFound Could n...
  TestPythonSetupAndRun.testPythonWithSetupAndRun:21 NoClassDefFound Could not i...
  TestPythonVariables.testImportNumpy:42 NoClassDefFound Could not initialize cl...
Tests run: 17, Failures: 0, Errors: 15, Skipped: 0
[INFO] ------------------------------------------------------------------------
[INFO] Reactor Summary:
[INFO]
[INFO] DataVec ............................................ SUCCESS [ 2.740 s]
[INFO] datavec-api ........................................ SUCCESS [01:01 min]
[INFO] datavec-arrow ...................................... SUCCESS [ 37.670 s]
[INFO] datavec-geo ........................................ SUCCESS [ 15.444 s]
[INFO] datavec-python ..................................... FAILURE [ 57.250 s]
[INFO] datavec-local ...................................... SKIPPED
[INFO] datavec-data ....................................... SKIPPED
[INFO] datavec-data-audio ................................. SKIPPED
[INFO] datavec-data-image ................................. SKIPPED
[INFO] datavec-data-codec ................................. SKIPPED
[INFO] datavec-data-nlp ................................... SKIPPED
[INFO] datavec-hadoop ..................................... SKIPPED
[INFO] datavec-spark_2.11 ................................. SKIPPED
[INFO] DataVec Camel Component ............................ SKIPPED
[INFO] datavec-spark-inference-parent ..................... SKIPPED
[INFO] datavec-spark-inference-model ...................... SKIPPED
[INFO] datavec-spark-inference-server ..................... SKIPPED
[INFO] datavec-spark-inference-client ..................... SKIPPED
[INFO] datavec-jdbc ....................................... SKIPPED
[INFO] datavec-excel ...................................... SKIPPED
[INFO] datavec-perf ....................................... SKIPPED
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 02:55 min
[INFO] Finished at: 2019-11-27T22:24:38+11:00
[INFO] Final Memory: 235M/1110M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.19.1:test (default-test) on project datavec-python: There are test failures.
[ERROR]
[ERROR] Please refer to C:\DL4J\Git\deeplearning4j\datavec\datavec-python\target\surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
[ERROR]
[ERROR] After correcting the problems, you can resume the build with the command
[ERROR]   mvn <goals> -rf :datavec-python
c:\DL4J\Git\deeplearning4j\datavec>
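
The failure chain above starts with the numpy ImportError during the first snippet: the wrapper code visible in the log runs each snippet inside a try/except that prints the traceback and continues, so the broken `import numpy as np` only resurfaces later as `NameError: name 'np' is not defined`, the output file `temp_1_main.json` never gets written, and the Java side fails with `ExceptionInInitializerError` / `NoClassDefFoundError`. A minimal sketch of that swallow-and-continue pattern (the helper name `guarded_exec` is hypothetical, used only to illustrate the behavior, not the actual DataVec API):

```python
import sys
import traceback


def guarded_exec(code):
    """Hypothetical illustration of the wrapper seen in the log:
    run a snippet, and on failure print the exception with its
    traceback and keep going instead of aborting the run."""
    env = {}
    try:
        exec(code, env)
        return True
    except Exception as ex:
        exc_info = sys.exc_info()
        print(ex)
        traceback.print_exception(*exc_info)
        return False


# The first snippet fails to import (as numpy does above); the error
# is only printed, not raised.
ok1 = guarded_exec("import no_such_module_xyz as np")
# Later snippets then hit NameError because 'np' was never defined,
# mirroring the "name 'np' is not defined" messages in the log.
ok2 = guarded_exec("path = np.random.__file__")
print(ok1, ok2)
```

Given the traceback points into `C:\Users\Alex\.javacpp\cache\numpy-1.17.3-...`, a plausible (unverified here) cause is a stale or partially extracted numpy under the JavaCPP cache; deleting that cache directory so the artifacts re-extract on the next run is a common first troubleshooting step.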