mvn test output for elasticsearch-analysis-combo using ES 1.4.0, Lucene 4.10.2 and randomizedtesting-runner 2.1.10
[INFO] Scanning for projects...
[INFO]
[INFO] ------------------------------------------------------------------------
[INFO] Building elasticsearch-analysis-combo 1.5.2-SNAPSHOT
[INFO] ------------------------------------------------------------------------
[INFO]
[INFO] --- maven-clean-plugin:2.3:clean (default-clean) @ elasticsearch-analysis-combo ---
[INFO] Deleting file set: /data/Yakaz/elasticsearch-analysis-combo/target (included: [**], excluded: [])
[INFO]
[INFO] --- maven-enforcer-plugin:1.0:enforce (enforce-maven) @ elasticsearch-analysis-combo ---
[INFO]
[INFO] --- maven-resources-plugin:2.3:resources (default-resources) @ elasticsearch-analysis-combo ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-compiler-plugin:2.3.2:compile (default-compile) @ elasticsearch-analysis-combo ---
[INFO] Compiling 18 source files to /data/Yakaz/elasticsearch-analysis-combo/target/classes
[INFO]
[INFO] --- maven-resources-plugin:2.3:testResources (default-testResources) @ elasticsearch-analysis-combo ---
[INFO] Using 'UTF-8' encoding to copy filtered resources.
[INFO] Copying 1 resource
[INFO]
[INFO] --- maven-compiler-plugin:2.3.2:testCompile (default-testCompile) @ elasticsearch-analysis-combo ---
[INFO] Compiling 11 source files to /data/Yakaz/elasticsearch-analysis-combo/target/test-classes
[INFO]
[INFO] --- maven-surefire-plugin:2.11:test (default-test) @ elasticsearch-analysis-combo ---
[INFO] Surefire report directory: /data/Yakaz/elasticsearch-analysis-combo/target/surefire-reports
-------------------------------------------------------
 T E S T S
-------------------------------------------------------
Running org.elasticsearch.common.io.FastStringReaderClonerTests
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.091 sec
Running org.elasticsearch.common.io.FastCharArrayReaderClonerTests
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.001 sec
Running org.elasticsearch.index.analysis.TestIntegration
[2015-02-01 23:11:46,149][INFO ][org.elasticsearch.test ] Test testAnalysis(org.elasticsearch.index.analysis.TestIntegration) started
[2015-02-01 23:11:46,233][INFO ][org.elasticsearch.test ] Setup InternalTestCluster [GLOBAL-TeKa-Laptop-CHILD_VM=[0]-CLUSTER_SEED=[-1201736048358110605]-HASH=[41C05C10F71]] with seed [EF52936E5BD54A73] using [2] data nodes and [1] client nodes
[2015-02-01 23:11:46,469][INFO ][org.elasticsearch.node ] [node_0] version[1.4.0], pid[9310], build[bc94bd8/2014-11-05T14:26:12Z]
[2015-02-01 23:11:46,470][INFO ][org.elasticsearch.node ] [node_0] initializing ...
[2015-02-01 23:11:46,474][INFO ][org.elasticsearch.plugins] [node_0] loaded [], sites []
[2015-02-01 23:11:49,130][INFO ][org.elasticsearch.node ] [node_0] initialized
[2015-02-01 23:11:49,131][INFO ][org.elasticsearch.node ] [node_0] starting ...
[2015-02-01 23:11:49,136][INFO ][org.elasticsearch.test.transport] [node_0] bound_address {local[1]}, publish_address {local[1]}
[2015-02-01 23:11:49,151][INFO ][org.elasticsearch.discovery] [node_0] GLOBAL-TeKa-Laptop-CHILD_VM=[0]-CLUSTER_SEED=[-1201736048358110605]-HASH=[41C05C10F71]/0Kc_MGbRR_G_GSwejVGSAA
[2015-02-01 23:11:49,155][INFO ][org.elasticsearch.cluster.service] [node_0] new_master [node_0][0Kc_MGbRR_G_GSwejVGSAA][TeKa-Laptop][local[1]]{mode=local}, reason: local-disco-initial_connect(master)
[2015-02-01 23:11:49,189][INFO ][org.elasticsearch.gateway] [node_0] recovered [0] indices into cluster_state
[2015-02-01 23:11:49,251][INFO ][org.elasticsearch.http ] [node_0] bound_address {inet[/0:0:0:0:0:0:0:0:9501]}, publish_address {inet[/192.168.1.71:9501]}
[2015-02-01 23:11:49,251][INFO ][org.elasticsearch.node ] [node_0] started
[2015-02-01 23:11:49,252][INFO ][org.elasticsearch.test ] Start Shared Node [node_0] not shared
[2015-02-01 23:11:49,262][INFO ][org.elasticsearch.node ] [node_1] version[1.4.0], pid[9310], build[bc94bd8/2014-11-05T14:26:12Z]
[2015-02-01 23:11:49,263][INFO ][org.elasticsearch.node ] [node_1] initializing ...
[2015-02-01 23:11:49,267][INFO ][org.elasticsearch.plugins] [node_1] loaded [], sites []
[2015-02-01 23:11:49,573][INFO ][org.elasticsearch.node ] [node_1] initialized
[2015-02-01 23:11:49,574][INFO ][org.elasticsearch.node ] [node_1] starting ...
[2015-02-01 23:11:49,576][INFO ][org.elasticsearch.test.transport] [node_1] bound_address {local[2]}, publish_address {local[2]}
[2015-02-01 23:11:49,579][INFO ][org.elasticsearch.discovery] [node_1] GLOBAL-TeKa-Laptop-CHILD_VM=[0]-CLUSTER_SEED=[-1201736048358110605]-HASH=[41C05C10F71]/KAvCRQAZTK2FMBPooxNuEw
[2015-02-01 23:11:49,579][INFO ][org.elasticsearch.cluster.service] [node_0] added {[node_1][KAvCRQAZTK2FMBPooxNuEw][TeKa-Laptop][local[2]]{mode=local},}, reason: local-disco-receive(from node[[node_1][KAvCRQAZTK2FMBPooxNuEw][TeKa-Laptop][local[2]]{mode=local}])
[2015-02-01 23:11:49,583][INFO ][org.elasticsearch.cluster.service] [node_1] detected_master [node_0][0Kc_MGbRR_G_GSwejVGSAA][TeKa-Laptop][local[1]]{mode=local}, added {[node_0][0Kc_MGbRR_G_GSwejVGSAA][TeKa-Laptop][local[1]]{mode=local},}, reason: local-disco-receive(from master)
[2015-02-01 23:11:49,648][INFO ][org.elasticsearch.http ] [node_1] bound_address {inet[/0:0:0:0:0:0:0:0:9502]}, publish_address {inet[/192.168.1.71:9502]}
[2015-02-01 23:11:49,648][INFO ][org.elasticsearch.node ] [node_1] started
[2015-02-01 23:11:49,648][INFO ][org.elasticsearch.test ] Start Shared Node [node_1] not shared
[2015-02-01 23:11:49,651][INFO ][org.elasticsearch.node ] [node_2] version[1.4.0], pid[9310], build[bc94bd8/2014-11-05T14:26:12Z]
[2015-02-01 23:11:49,651][INFO ][org.elasticsearch.node ] [node_2] initializing ...
[2015-02-01 23:11:49,651][INFO ][org.elasticsearch.plugins] [node_2] loaded [], sites []
[2015-02-01 23:11:49,911][INFO ][org.elasticsearch.node ] [node_2] initialized
[2015-02-01 23:11:49,911][INFO ][org.elasticsearch.node ] [node_2] starting ...
[2015-02-01 23:11:49,912][INFO ][org.elasticsearch.test.transport] [node_2] bound_address {local[3]}, publish_address {local[3]}
[2015-02-01 23:11:49,926][INFO ][org.elasticsearch.discovery] [node_2] GLOBAL-TeKa-Laptop-CHILD_VM=[0]-CLUSTER_SEED=[-1201736048358110605]-HASH=[41C05C10F71]/NB0yyfH1TESTJiiV9cSiaQ
[2015-02-01 23:11:49,927][INFO ][org.elasticsearch.cluster.service] [node_0] added {[node_2][NB0yyfH1TESTJiiV9cSiaQ][TeKa-Laptop][local[3]]{bench=true, client=true, data=false, mode=local},}, reason: local-disco-receive(from node[[node_2][NB0yyfH1TESTJiiV9cSiaQ][TeKa-Laptop][local[3]]{bench=true, client=true, data=false, mode=local}])
[2015-02-01 23:11:49,929][INFO ][org.elasticsearch.cluster.service] [node_1] added {[node_2][NB0yyfH1TESTJiiV9cSiaQ][TeKa-Laptop][local[3]]{bench=true, client=true, data=false, mode=local},}, reason: local-disco-receive(from master)
[2015-02-01 23:11:49,933][INFO ][org.elasticsearch.cluster.service] [node_2] detected_master [node_0][0Kc_MGbRR_G_GSwejVGSAA][TeKa-Laptop][local[1]]{mode=local}, added {[node_0][0Kc_MGbRR_G_GSwejVGSAA][TeKa-Laptop][local[1]]{mode=local},[node_1][KAvCRQAZTK2FMBPooxNuEw][TeKa-Laptop][local[2]]{mode=local},}, reason: local-disco-receive(from master)
[2015-02-01 23:11:49,997][INFO ][org.elasticsearch.http ] [node_2] bound_address {inet[/0:0:0:0:0:0:0:0:9503]}, publish_address {inet[/192.168.1.71:9503]}
[2015-02-01 23:11:49,997][INFO ][org.elasticsearch.node ] [node_2] started
[2015-02-01 23:11:49,998][INFO ][org.elasticsearch.test ] Start Shared Node [node_2] not shared
[2015-02-01 23:11:50,013][INFO ][org.elasticsearch.plugins] [transport_client_node_0] loaded [], sites []
[2015-02-01 23:11:50,071][INFO ][org.elasticsearch.transport] [transport_client_node_0] bound_address {local[4]}, publish_address {local[4]}
[2015-02-01 23:11:50,136][INFO ][org.elasticsearch.plugins] [transport_client_node_1] loaded [], sites []
[2015-02-01 23:11:50,191][INFO ][org.elasticsearch.transport] [transport_client_node_1] bound_address {local[5]}, publish_address {local[5]}
[2015-02-01 23:11:50,309][INFO ][org.elasticsearch.plugins] [transport_client_node_2] loaded [], sites []
[2015-02-01 23:11:50,348][INFO ][org.elasticsearch.transport] [transport_client_node_2] bound_address {local[6]}, publish_address {local[6]}
[2015-02-01 23:11:50,374][INFO ][org.elasticsearch.index.analysis] [TestIntegration#testAnalysis]: before test
[2015-02-01 23:11:50,745][INFO ][org.elasticsearch.cluster.metadata] [node_0] [some_index] creating index, cause [api], shards [9]/[0], mappings [_default_]
[2015-02-01 23:11:51,438][INFO ][org.elasticsearch.index.analysis] [TestIntegration#testAnalysis]: cleaning up after test
[2015-02-01 23:11:51,468][INFO ][org.elasticsearch.cluster.metadata] [node_0] [some_index] deleting index
[2015-02-01 23:11:51,528][INFO ][org.elasticsearch.index.analysis] [TestIntegration#testAnalysis]: cleaned up after test
[2015-02-01 23:11:51,529][INFO ][org.elasticsearch.test ] Wipe data directory for all nodes locations: [/data/Yakaz/elasticsearch-analysis-combo/data/GLOBAL-TeKa-Laptop-CHILD_VM=[0]-CLUSTER_SEED=[-1201736048358110605]-HASH=[41C05C10F71]/nodes/0, /data/Yakaz/elasticsearch-analysis-combo/data/GLOBAL-TeKa-Laptop-CHILD_VM=[0]-CLUSTER_SEED=[-1201736048358110605]-HASH=[41C05C10F71]/nodes/1] success: true
[2015-02-01 23:11:51,537][INFO ][org.elasticsearch.test ] Test testAnalysis(org.elasticsearch.index.analysis.TestIntegration) finished
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 6.476 sec
Running org.apache.lucene.util.ReaderClonerDefaultImplTests
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 sec
Running org.apache.lucene.util.TestReaderCloneFactory
Tests run: 5, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.237 sec
Running org.apache.lucene.analysis.TestReusableStringReaderCloner
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0.083 sec
Running org.apache.lucene.analysis.TestComboAnalyzer
Feb 01, 2015 11:11:53 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
WARNING: Will linger awaiting termination of 2 leaked thread(s).
Feb 01, 2015 11:12:13 PM com.carrotsearch.randomizedtesting.ThreadLeakControl checkThreadLeaks
SEVERE: 3 threads leaked from SUITE scope at org.apache.lucene.analysis.TestComboAnalyzer:
   1) Thread[id=96, name=elasticsearch[node_2][management][T#3], state=WAITING, group=TGRP-TestIntegration]
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
        at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:735)
        at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:644)
        at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1137)
        at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1068)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
   2) Thread[id=98, name=elasticsearch[node_2][management][T#5], state=WAITING, group=TGRP-TestIntegration]
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
        at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:735)
        at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:644)
        at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1137)
        at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1068)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
   3) Thread[id=97, name=elasticsearch[node_2][management][T#4], state=WAITING, group=TGRP-TestIntegration]
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
        at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:735)
        at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:644)
        at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1137)
        at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1068)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
Feb 01, 2015 11:12:13 PM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
INFO: Starting to interrupt leaked threads:
   1) Thread[id=96, name=elasticsearch[node_2][management][T#3], state=WAITING, group=TGRP-TestIntegration]
   2) Thread[id=98, name=elasticsearch[node_2][management][T#5], state=WAITING, group=TGRP-TestIntegration]
   3) Thread[id=97, name=elasticsearch[node_2][management][T#4], state=WAITING, group=TGRP-TestIntegration]
Feb 01, 2015 11:12:16 PM com.carrotsearch.randomizedtesting.ThreadLeakControl tryToInterruptAll
SEVERE: There are still zombie threads that couldn't be terminated:
   1) Thread[id=96, name=elasticsearch[node_2][management][T#3], state=WAITING, group=TGRP-TestIntegration]
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
        at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:735)
        at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:644)
        at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1137)
        at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1068)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
   2) Thread[id=98, name=elasticsearch[node_2][management][T#5], state=WAITING, group=TGRP-TestIntegration]
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
        at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:735)
        at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:644)
        at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1137)
        at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1068)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
   3) Thread[id=97, name=elasticsearch[node_2][management][T#4], state=WAITING, group=TGRP-TestIntegration]
        at sun.misc.Unsafe.park(Native Method)
        at java.util.concurrent.locks.LockSupport.park(LockSupport.java:186)
        at java.util.concurrent.LinkedTransferQueue.awaitMatch(LinkedTransferQueue.java:735)
        at java.util.concurrent.LinkedTransferQueue.xfer(LinkedTransferQueue.java:644)
        at java.util.concurrent.LinkedTransferQueue.take(LinkedTransferQueue.java:1137)
        at java.util.concurrent.ThreadPoolExecutor.getTask(ThreadPoolExecutor.java:1068)
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1130)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:615)
        at java.lang.Thread.run(Thread.java:745)
NOTE: test params are: codec=Lucene410: {}, docValues:{}, sim=RandomSimilarityProvider(queryNorm=false,coord=no): {}, locale=es_CO, timezone=Canada/Pacific
NOTE: Linux 3.2.0-4-amd64 amd64/Oracle Corporation 1.7.0_65 (64-bit)/cpus=2,threads=1,free=172031272,total=268959744
NOTE: All tests run in this JVM: [TestReaderCloneFactory, TestReusableStringReaderCloner, TestComboAnalyzer]
NOTE: reproduce with: ant test -Dtestcase=TestComboAnalyzer -Dtests.seed=610D0FD5FFB9356F -Dtests.locale=es_CO -Dtests.timezone=Canada/Pacific -Dtests.asserts=true -Dtests.file.encoding=UTF-8
Tests run: 11, Failures: 0, Errors: 2, Skipped: 0, Time elapsed: 25.132 sec <<< FAILURE!
Running org.apache.lucene.analysis.ReusableStringReaderClonerTests
Tests run: 3, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 sec
Running org.apache.lucene.analysis.TestComboTokenStream
Tests run: 5, Failures: 0, Errors: 0, Skipped: 5, Time elapsed: 0.001 sec
Running javax.io.StringReaderClonerTests
Tests run: 1, Failures: 0, Errors: 0, Skipped: 0, Time elapsed: 0 sec

Results :

Tests in error:
  org.apache.lucene.analysis.TestComboAnalyzer: 3 threads leaked from SUITE scope at org.apache.lucene.analysis.TestComboAnalyzer: (..)
  org.apache.lucene.analysis.TestComboAnalyzer: There are still zombie threads that couldn't be terminated:(..)

Tests run: 30, Failures: 0, Errors: 2, Skipped: 5
[INFO] ------------------------------------------------------------------------
[INFO] BUILD FAILURE
[INFO] ------------------------------------------------------------------------
[INFO] Total time: 40.881s
[INFO] Finished at: Sun Feb 01 23:12:17 CET 2015
[INFO] Final Memory: 29M/180M
[INFO] ------------------------------------------------------------------------
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-surefire-plugin:2.11:test (default-test) on project elasticsearch-analysis-combo: There are test failures.
[ERROR]
[ERROR] Please refer to /data/Yakaz/elasticsearch-analysis-combo/target/surefire-reports for the individual test results.
[ERROR] -> [Help 1]
[ERROR]
[ERROR] To see the full stack trace of the errors, re-run Maven with the -e switch.
[ERROR] Re-run Maven using the -X switch to enable full debug logging.
[ERROR]
[ERROR] For more information about the errors and possible solutions, please read the following articles:
[ERROR] [Help 1] http://cwiki.apache.org/confluence/display/MAVEN/MojoFailureException
@ofavre
Hi, did you manage to fix this thread leak, and if so, how?
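For context, the leaked threads in the log above are idle `management` pool workers parked inside the queue's `take()` (visible as `LinkedTransferQueue.take` → `ThreadPoolExecutor.getTask`); they stay alive until their executor is shut down, which is why randomizedtesting flags them at suite teardown. Below is a minimal, stdlib-only sketch of that same pattern and its fix. The class name is made up for illustration, and `Executors.newFixedThreadPool` uses a `LinkedBlockingQueue` rather than the `LinkedTransferQueue` Elasticsearch 1.4 used, but the parked-worker behavior is the same:

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.TimeUnit;

// Illustrative sketch of the leak pattern: pool workers finish their tasks,
// then park waiting for more work, and remain alive until shutdown() is called.
public class ThreadLeakSketch {
    public static void main(String[] args) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(3);
        for (int i = 0; i < 3; i++) {
            pool.submit(() -> { /* short task; the worker then parks in the queue */ });
        }
        Thread.sleep(200); // tasks are done, but all 3 workers are idle and alive
        System.out.println("before shutdown, pool terminated: " + pool.isTerminated());

        // The fix in a test suite is to release such pools deterministically
        // (e.g. in an @AfterClass hook) instead of letting workers linger:
        pool.shutdown();
        boolean done = pool.awaitTermination(5, TimeUnit.SECONDS);
        System.out.println("after shutdown, pool terminated: " + done);
    }
}
```

In this suite's case the pools belong to the test cluster's nodes, so the equivalent of `shutdown()` is closing the `InternalTestCluster` (or its nodes) before the suite ends; alternatively, randomizedtesting's thread-leak checks can be tuned with its `@ThreadLeakScope`/`@ThreadLeakFilters` annotations when a known framework thread cannot be joined.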