@cmeiklejohn
Created May 8, 2018 16:20
results for unir-865d54db67-vc4v4
find . -name console.log | grep `ls -d ./undefined* | tail -1` | xargs cat
2018-05-08 16:17:16.516 [info] <0.33.0> Application lager started on node 'node_3@127.0.0.1'
2018-05-08 16:17:16.530 [info] <0.33.0> Application types started on node 'node_3@127.0.0.1'
2018-05-08 16:17:16.535 [info] <0.33.0> Application acceptor_pool started on node 'node_3@127.0.0.1'
2018-05-08 16:17:16.569 [info] <0.33.0> Application asn1 started on node 'node_3@127.0.0.1'
2018-05-08 16:17:16.570 [info] <0.33.0> Application public_key started on node 'node_3@127.0.0.1'
2018-05-08 16:17:16.782 [info] <0.33.0> Application ssl started on node 'node_3@127.0.0.1'
2018-05-08 16:17:16.799 [info] <0.33.0> Application rand_compat started on node 'node_3@127.0.0.1'
2018-05-08 16:17:16.813 [info] <0.33.0> Application quickrand started on node 'node_3@127.0.0.1'
2018-05-08 16:17:16.813 [info] <0.33.0> Application uuid started on node 'node_3@127.0.0.1'
2018-05-08 16:17:16.826 [info] <0.191.0>@partisan_config:init:58 Using node name: 'node_3@127.0.0.1'
2018-05-08 16:17:16.846 [info] <0.193.0>@partisan_config:get_node_address:168 Resolving "127.0.0.1"...
2018-05-08 16:17:16.846 [info] <0.193.0>@partisan_config:get_node_address:171 Resolved "node_3@127.0.0.1" to {127,0,0,1}
2018-05-08 16:17:16.847 [info] <0.191.0>@partisan_config:get_node_address:188 Resolved "127.0.0.1" to {127,0,0,1}
2018-05-08 16:17:17.255 [info] <0.33.0> Application partisan started on node 'node_3@127.0.0.1'
2018-05-08 16:17:17.260 [info] <0.33.0> Application gen_fsm_compat started on node 'node_3@127.0.0.1'
2018-05-08 16:17:17.269 [info] <0.33.0> Application riak_sysmon started on node 'node_3@127.0.0.1'
2018-05-08 16:17:17.296 [info] <0.33.0> Application os_mon started on node 'node_3@127.0.0.1'
2018-05-08 16:17:17.296 [info] <0.92.0> alarm_handler: {set,{system_memory_high_watermark,[]}}
2018-05-08 16:17:17.303 [info] <0.33.0> Application basho_stats started on node 'node_3@127.0.0.1'
2018-05-08 16:17:17.305 [info] <0.33.0> Application eleveldb started on node 'node_3@127.0.0.1'
2018-05-08 16:17:17.306 [info] <0.33.0> Application pbkdf2 started on node 'node_3@127.0.0.1'
2018-05-08 16:17:17.307 [info] <0.33.0> Application poolboy started on node 'node_3@127.0.0.1'
2018-05-08 16:17:17.368 [info] <0.270.0>@exometer_report:do_start_reporters:613 Starting reporters with []
2018-05-08 16:17:17.369 [info] <0.33.0> Application exometer_core started on node 'node_3@127.0.0.1'
2018-05-08 16:17:17.425 [info] <0.33.0> Application clique started on node 'node_3@127.0.0.1'
2018-05-08 16:17:17.445 [info] <0.33.0> Application folsom started on node 'node_3@127.0.0.1'
2018-05-08 16:17:17.500 [info] <0.288.0>@riak_core_partisan_utils:configure_dispatch:145 Configuring partisan dispatch: true
2018-05-08 16:17:17.565 [warning] <0.304.0>@riak_core_ring_manager:reload_ring:380 No ring file available.
2018-05-08 16:17:17.784 [info] <0.321.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,vnode_routing} = proxy
2018-05-08 16:17:17.795 [info] <0.321.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,staged_joins} = true
2018-05-08 16:17:17.803 [info] <0.321.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,resizable_ring} = true
2018-05-08 16:17:17.818 [info] <0.321.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,fold_req_version} = v2
2018-05-08 16:17:17.829 [info] <0.321.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,security} = true
2018-05-08 16:17:17.838 [info] <0.321.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,bucket_types} = true
2018-05-08 16:17:17.844 [info] <0.321.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,net_ticktime} = true
2018-05-08 16:17:17.960 [info] <0.33.0> Application riak_core started on node 'node_3@127.0.0.1'
2018-05-08 16:17:17.967 [info] <0.33.0> Application setup started on node 'node_3@127.0.0.1'
2018-05-08 16:17:29.378 [info] <0.3430.0>@riak_core_partisan_utils:join:100 Starting join from partisan utils from 'node_3@127.0.0.1' to 'node_1@127.0.0.1'
2018-05-08 16:17:30.415 [info] <0.3430.0>@riak_core_partisan_utils:join:106 Finishing join from 'node_3@127.0.0.1' to 'node_1@127.0.0.1'
2018-05-08 16:17:30.417 [info] <0.3430.0>@riak_core_partisan_utils:join:100 Starting join from partisan utils from 'node_3@127.0.0.1' to 'node_3@127.0.0.1'
2018-05-08 16:17:30.427 [info] <0.3430.0>@riak_core_partisan_utils:join:106 Finishing join from 'node_3@127.0.0.1' to 'node_3@127.0.0.1'
2018-05-08 16:17:40.137 [info] <0.304.0>@riak_core_gossip:log_node_changed:357 'node_3@127.0.0.1' changed from 'joining' to 'valid'
2018-05-08 16:18:35.954 [info] <0.249.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.3434.0> [{initial_call,{partisan_peer_service_client,init,1}},{almost_current_function,{partisan_util,term_to_iolist_,1}},{message_queue_len,0},{dictionary,[{{partisan_peer_service_client,listen_addr},#{ip => {127,0,0,1},port => 37509}},{{partisan_peer_service_client,peer},#{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 37509}],name => 'node_1@127.0.0.1',parallelism => 1}},{'$initial_call',{partisan_peer_service_client,init,1}},{'$ancestors',[partisan_default_peer_service_manager,partisan_sup,<0.190.0>]},{{partisan_peer_service_client,channel},{monotonic,gossip}},{{partisan_peer_service_client,from},<0.224.0>},{last_transmission_time,-576460660150}]}] [{timeout,52},{in,{partisan_util,'-term_to_iolist_/1-lc$^0/1-1-',1}},{out,{partisan_util,term_to_iolist_,1}}]
2018-05-08 16:19:38.049 [info] <0.249.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.304.0> [{name,riak_core_ring_manager},{initial_call,{riak_core_ring_manager,init,1}},{almost_current_function,{gen_server,loop,7}},{message_queue_len,0},{dictionary,[{rand_seed,{#{bits => 58,jump => #Fun<rand.8.15449617>,next => #Fun<rand.5.15449617>,type => exrop,uniform => #Fun<rand.6.15449617>,uniform_n => #Fun<rand.7.15449617>,weak_low_bits => 1},[149660036360679688|75021657176756335]}},{'$initial_call',{riak_core_ring_manager,init,1}},{'$ancestors',[riak_core_sup,<0.287.0>]}]}] [{timeout,57},{in,{gen_server,loop,7}},{out,{gen_server,loop,7}}]
2018-05-08 16:19:48.780 [info] <0.249.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.303.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.287.0>]}]}] [{timeout,51},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:19:48.999 [info] <0.249.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.303.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.287.0>]}]}] [{timeout,99},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:19:50.772 [warning] <0.5173.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 37509}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:19:51.804 [warning] <0.5175.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 37509}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:19:52.849 [warning] <0.5176.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 43973}],name => 'node_2@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:19:52.851 [warning] <0.5177.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 37509}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:19:53.880 [warning] <0.5179.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 43973}],name => 'node_2@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:19:53.880 [warning] <0.5180.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 37509}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:19:54.142 [info] <0.287.0>@riak_core_app:stop:47 Stopped application riak_core.
2018-05-08 16:19:54.222 [info] <0.92.0> alarm_handler: {clear,system_memory_high_watermark}
2018-05-08 16:17:10.126 [info] <0.33.0> Application lager started on node 'node_1@127.0.0.1'
2018-05-08 16:17:10.162 [info] <0.33.0> Application types started on node 'node_1@127.0.0.1'
2018-05-08 16:17:10.169 [info] <0.33.0> Application acceptor_pool started on node 'node_1@127.0.0.1'
2018-05-08 16:17:10.235 [info] <0.33.0> Application asn1 started on node 'node_1@127.0.0.1'
2018-05-08 16:17:10.235 [info] <0.33.0> Application public_key started on node 'node_1@127.0.0.1'
2018-05-08 16:17:10.345 [info] <0.33.0> Application ssl started on node 'node_1@127.0.0.1'
2018-05-08 16:17:10.353 [info] <0.33.0> Application rand_compat started on node 'node_1@127.0.0.1'
2018-05-08 16:17:10.401 [info] <0.33.0> Application quickrand started on node 'node_1@127.0.0.1'
2018-05-08 16:17:10.410 [info] <0.33.0> Application uuid started on node 'node_1@127.0.0.1'
2018-05-08 16:17:10.457 [info] <0.188.0>@partisan_config:init:58 Using node name: 'node_1@127.0.0.1'
2018-05-08 16:17:10.602 [info] <0.190.0>@partisan_config:get_node_address:168 Resolving "127.0.0.1"...
2018-05-08 16:17:10.602 [info] <0.190.0>@partisan_config:get_node_address:171 Resolved "node_1@127.0.0.1" to {127,0,0,1}
2018-05-08 16:17:10.602 [info] <0.188.0>@partisan_config:get_node_address:188 Resolved "127.0.0.1" to {127,0,0,1}
2018-05-08 16:17:11.005 [info] <0.33.0> Application partisan started on node 'node_1@127.0.0.1'
2018-05-08 16:17:11.022 [info] <0.33.0> Application gen_fsm_compat started on node 'node_1@127.0.0.1'
2018-05-08 16:17:11.049 [info] <0.33.0> Application riak_sysmon started on node 'node_1@127.0.0.1'
2018-05-08 16:17:11.099 [info] <0.33.0> Application os_mon started on node 'node_1@127.0.0.1'
2018-05-08 16:17:11.109 [info] <0.33.0> Application basho_stats started on node 'node_1@127.0.0.1'
2018-05-08 16:17:11.121 [info] <0.33.0> Application eleveldb started on node 'node_1@127.0.0.1'
2018-05-08 16:17:11.124 [info] <0.33.0> Application pbkdf2 started on node 'node_1@127.0.0.1'
2018-05-08 16:17:11.128 [info] <0.33.0> Application poolboy started on node 'node_1@127.0.0.1'
2018-05-08 16:17:11.232 [info] <0.267.0>@exometer_report:do_start_reporters:613 Starting reporters with []
2018-05-08 16:17:11.232 [info] <0.33.0> Application exometer_core started on node 'node_1@127.0.0.1'
2018-05-08 16:17:11.311 [info] <0.33.0> Application clique started on node 'node_1@127.0.0.1'
2018-05-08 16:17:11.337 [info] <0.33.0> Application folsom started on node 'node_1@127.0.0.1'
2018-05-08 16:17:11.425 [info] <0.285.0>@riak_core_partisan_utils:configure_dispatch:145 Configuring partisan dispatch: true
2018-05-08 16:17:11.598 [warning] <0.301.0>@riak_core_ring_manager:reload_ring:380 No ring file available.
2018-05-08 16:17:11.998 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.311.0> [{timeout,124},{in,{code_server,call,1}},{out,{code_server,'-handle_on_load/5-fun-0-',1}}]
2018-05-08 16:17:12.318 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,vnode_routing} = proxy
2018-05-08 16:17:12.330 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,staged_joins} = true
2018-05-08 16:17:12.347 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,resizable_ring} = true
2018-05-08 16:17:12.360 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,fold_req_version} = v2
2018-05-08 16:17:12.370 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,security} = true
2018-05-08 16:17:12.385 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,bucket_types} = true
2018-05-08 16:17:12.400 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,net_ticktime} = true
2018-05-08 16:17:12.688 [info] <0.33.0> Application riak_core started on node 'node_1@127.0.0.1'
2018-05-08 16:17:12.811 [info] <0.33.0> Application setup started on node 'node_1@127.0.0.1'
2018-05-08 16:17:24.380 [error] <0.221.0>@partisan_default_peer_service_manager:do_send_message:940 Node 'node_2@127.0.0.1' is not yet connected during send!
2018-05-08 16:17:26.657 [info] <0.301.0>@riak_core_gossip:log_node_added:360 'node_2@127.0.0.1' joined cluster with status 'valid'
2018-05-08 16:17:30.470 [error] <0.221.0>@partisan_default_peer_service_manager:do_send_message:940 Node 'node_3@127.0.0.1' is not yet connected during send!
2018-05-08 16:17:38.785 [info] <0.301.0>@riak_core_gossip:log_node_added:360 'node_3@127.0.0.1' joined cluster with status 'joining'
2018-05-08 16:17:39.988 [info] <0.301.0>@riak_core_gossip:log_node_changed:357 'node_3@127.0.0.1' changed from 'joining' to 'valid'
2018-05-08 16:18:11.085 [info] <0.89.0> alarm_handler: {set,{system_memory_high_watermark,[]}}
2018-05-08 16:18:25.249 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule #Port<0.718> name tcp_inet [{links,[<0.76.0>]},{id,5744},{connected,<0.76.0>},{input,38211},{output,60},{os_pid,undefined},{queue_size,0}] [{timeout,57},{port_op,dist_cmd}]
2018-05-08 16:19:48.771 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.300.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.284.0>]}]}] [{timeout,60},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:19:48.995 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.300.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.284.0>]}]}] [{timeout,90},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:19:50.046 [info] <0.284.0>@riak_core_app:stop:47 Stopped application riak_core.
2018-05-08 16:19:50.127 [info] <0.89.0> alarm_handler: {clear,system_memory_high_watermark}
2018-05-08 16:17:13.526 [info] <0.33.0> Application lager started on node 'node_2@127.0.0.1'
2018-05-08 16:17:13.549 [info] <0.33.0> Application types started on node 'node_2@127.0.0.1'
2018-05-08 16:17:13.552 [info] <0.33.0> Application acceptor_pool started on node 'node_2@127.0.0.1'
2018-05-08 16:17:13.613 [info] <0.33.0> Application asn1 started on node 'node_2@127.0.0.1'
2018-05-08 16:17:13.613 [info] <0.33.0> Application public_key started on node 'node_2@127.0.0.1'
2018-05-08 16:17:13.782 [info] <0.33.0> Application ssl started on node 'node_2@127.0.0.1'
2018-05-08 16:17:13.789 [info] <0.33.0> Application rand_compat started on node 'node_2@127.0.0.1'
2018-05-08 16:17:13.809 [info] <0.33.0> Application quickrand started on node 'node_2@127.0.0.1'
2018-05-08 16:17:13.809 [info] <0.33.0> Application uuid started on node 'node_2@127.0.0.1'
2018-05-08 16:17:13.850 [info] <0.188.0>@partisan_config:init:58 Using node name: 'node_2@127.0.0.1'
2018-05-08 16:17:13.882 [info] <0.190.0>@partisan_config:get_node_address:168 Resolving "127.0.0.1"...
2018-05-08 16:17:13.885 [info] <0.190.0>@partisan_config:get_node_address:171 Resolved "node_2@127.0.0.1" to {127,0,0,1}
2018-05-08 16:17:13.885 [info] <0.188.0>@partisan_config:get_node_address:188 Resolved "127.0.0.1" to {127,0,0,1}
2018-05-08 16:17:14.431 [info] <0.33.0> Application partisan started on node 'node_2@127.0.0.1'
2018-05-08 16:17:14.461 [info] <0.33.0> Application gen_fsm_compat started on node 'node_2@127.0.0.1'
2018-05-08 16:17:14.488 [info] <0.33.0> Application riak_sysmon started on node 'node_2@127.0.0.1'
2018-05-08 16:17:14.546 [info] <0.33.0> Application os_mon started on node 'node_2@127.0.0.1'
2018-05-08 16:17:14.546 [info] <0.90.0> alarm_handler: {set,{system_memory_high_watermark,[]}}
2018-05-08 16:17:14.564 [info] <0.33.0> Application basho_stats started on node 'node_2@127.0.0.1'
2018-05-08 16:17:14.569 [info] <0.33.0> Application eleveldb started on node 'node_2@127.0.0.1'
2018-05-08 16:17:14.573 [info] <0.33.0> Application pbkdf2 started on node 'node_2@127.0.0.1'
2018-05-08 16:17:14.576 [info] <0.33.0> Application poolboy started on node 'node_2@127.0.0.1'
2018-05-08 16:17:14.685 [info] <0.267.0>@exometer_report:do_start_reporters:613 Starting reporters with []
2018-05-08 16:17:14.685 [info] <0.33.0> Application exometer_core started on node 'node_2@127.0.0.1'
2018-05-08 16:17:14.794 [info] <0.33.0> Application clique started on node 'node_2@127.0.0.1'
2018-05-08 16:17:14.841 [info] <0.33.0> Application folsom started on node 'node_2@127.0.0.1'
2018-05-08 16:17:15.049 [info] <0.285.0>@riak_core_partisan_utils:configure_dispatch:145 Configuring partisan dispatch: true
2018-05-08 16:17:15.233 [warning] <0.301.0>@riak_core_ring_manager:reload_ring:380 No ring file available.
2018-05-08 16:17:15.714 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,vnode_routing} = proxy
2018-05-08 16:17:15.723 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,staged_joins} = true
2018-05-08 16:17:15.732 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,resizable_ring} = true
2018-05-08 16:17:15.739 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,fold_req_version} = v2
2018-05-08 16:17:15.749 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,security} = true
2018-05-08 16:17:15.756 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,bucket_types} = true
2018-05-08 16:17:15.766 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,net_ticktime} = true
2018-05-08 16:17:15.978 [info] <0.33.0> Application riak_core started on node 'node_2@127.0.0.1'
2018-05-08 16:17:15.990 [info] <0.33.0> Application setup started on node 'node_2@127.0.0.1'
2018-05-08 16:17:23.167 [info] <0.1375.0>@riak_core_partisan_utils:join:100 Starting join from partisan utils from 'node_2@127.0.0.1' to 'node_1@127.0.0.1'
2018-05-08 16:17:24.299 [info] <0.1375.0>@riak_core_partisan_utils:join:106 Finishing join from 'node_2@127.0.0.1' to 'node_1@127.0.0.1'
2018-05-08 16:17:24.311 [info] <0.1375.0>@riak_core_partisan_utils:join:100 Starting join from partisan utils from 'node_2@127.0.0.1' to 'node_2@127.0.0.1'
2018-05-08 16:17:24.332 [info] <0.1375.0>@riak_core_partisan_utils:join:106 Finishing join from 'node_2@127.0.0.1' to 'node_2@127.0.0.1'
2018-05-08 16:17:26.696 [info] <0.301.0>@riak_core_gossip:log_node_changed:357 'node_2@127.0.0.1' changed from 'joining' to 'valid'
2018-05-08 16:17:39.823 [info] <0.301.0>@riak_core_gossip:log_node_added:360 'node_3@127.0.0.1' joined cluster with status 'joining'
2018-05-08 16:17:40.069 [info] <0.301.0>@riak_core_gossip:log_node_changed:357 'node_3@127.0.0.1' changed from 'joining' to 'valid'
2018-05-08 16:18:35.950 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.317.0> [{name,riak_core_vnode_manager},{initial_call,{riak_core_vnode_manager,init,1}},{almost_current_function,{gen_server,loop,7}},{message_queue_len,0},{dictionary,[{'$initial_call',{riak_core_vnode_manager,init,1}},{'$ancestors',[riak_core_sup,<0.284.0>]}]}] [{timeout,51},{in,{riak_core_util,pmap_collect_one,1}},{out,{gen_server,loop,7}}]
2018-05-08 16:19:48.642 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.300.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.284.0>]}]}] [{timeout,51},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:19:49.354 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.300.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.284.0>]}]}] [{timeout,134},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:19:50.416 [warning] <0.4276.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 37509}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:19:51.449 [warning] <0.4278.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 37509}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:19:52.038 [info] <0.284.0>@riak_core_app:stop:47 Stopped application riak_core.
2018-05-08 16:19:52.079 [info] <0.90.0> alarm_handler: {clear,system_memory_high_watermark}
===> Linking _build/default/plugins/rebar_erl_vsn to _build/test/plugins/rebar_erl_vsn
===> Linking _build/default/plugins/rebar3_proper to _build/test/plugins/rebar3_proper
===> Linking _build/default/plugins/rebar3_run to _build/test/plugins/rebar3_run
===> Verifying dependencies...
===> Fetching proper ({pkg,<<"proper">>,<<"1.2.0">>})
===> Version cached at /root/.cache/rebar3/hex/default/packages/proper-1.2.0.tar is up to date, reusing it
===> Fetching recon ({pkg,<<"recon">>,<<"2.3.5">>})
===> Downloaded package, caching at /root/.cache/rebar3/hex/default/packages/recon-2.3.5.tar
===> Linking _build/default/lib/exometer_core to _build/test/lib/exometer_core
===> Linking _build/default/lib/lager to _build/test/lib/lager
===> Linking _build/default/lib/lasp_bench to _build/test/lib/lasp_bench
===> Linking _build/default/lib/partisan to _build/test/lib/partisan
===> Linking _build/default/lib/pbkdf2 to _build/test/lib/pbkdf2
===> Linking _build/default/lib/poolboy to _build/test/lib/poolboy
===> Linking _build/default/lib/riak_core to _build/test/lib/riak_core
===> Linking _build/default/lib/riak_core_partisan_utils to _build/test/lib/riak_core_partisan_utils
===> Linking _build/default/lib/riak_ensemble to _build/test/lib/riak_ensemble
===> Linking _build/default/plugins/pc to _build/test/plugins/pc
===> Linking _build/default/lib/acceptor_pool to _build/test/lib/acceptor_pool
===> Linking _build/default/lib/basho_stats to _build/test/lib/basho_stats
===> Linking _build/default/lib/blume to _build/test/lib/blume
===> Linking _build/default/lib/chash to _build/test/lib/chash
===> Linking _build/default/lib/clique to _build/test/lib/clique
===> Linking _build/default/lib/cuttlefish to _build/test/lib/cuttlefish
===> Linking _build/default/lib/eleveldb to _build/test/lib/eleveldb
===> Linking _build/default/lib/folsom to _build/test/lib/folsom
===> Linking _build/default/lib/gen_fsm_compat to _build/test/lib/gen_fsm_compat
===> Linking _build/default/lib/getopt to _build/test/lib/getopt
===> Linking _build/default/lib/goldrush to _build/test/lib/goldrush
===> Linking _build/default/lib/jam to _build/test/lib/jam
===> Linking _build/default/lib/parse_trans to _build/test/lib/parse_trans
===> Linking _build/default/lib/rand_compat to _build/test/lib/rand_compat
===> Linking _build/default/lib/riak_sysmon to _build/test/lib/riak_sysmon
===> Linking _build/default/lib/setup to _build/test/lib/setup
===> Linking _build/default/lib/time_compat to _build/test/lib/time_compat
===> Linking _build/default/lib/types to _build/test/lib/types
===> Linking _build/default/lib/uuid to _build/test/lib/uuid
===> Linking _build/default/lib/bear to _build/test/lib/bear
===> Linking _build/default/lib/edown to _build/test/lib/edown
===> Linking _build/default/lib/meck to _build/test/lib/meck
===> Linking _build/default/lib/quickrand to _build/test/lib/quickrand
===> Compiling proper
make: 'include/compile_flags.hrl' is up to date.
/opt/unir/_build/test/lib/proper/src/proper_typeserver.erl:553: Warning: erlang:get_stacktrace/0 used following a 'try' expression may stop working in a future release. (Use it inside 'try'.)
/opt/unir/_build/test/lib/proper/src/proper_typeserver.erl:556: Warning: erlang:get_stacktrace/0 used following a 'try' expression may stop working in a future release. (Use it inside 'try'.)
_build/test/lib/proper/src/proper_typeserver.erl:553: Warning: erlang:get_stacktrace/0 used following a 'try' expression may stop working in a future release. (Use it inside 'try'.)
_build/test/lib/proper/src/proper_typeserver.erl:556: Warning: erlang:get_stacktrace/0 used following a 'try' expression may stop working in a future release. (Use it inside 'try'.)
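Note: these warnings flag calls to erlang:get_stacktrace/0 made after the surrounding try expression has already finished. Since OTP 21 the stacktrace is bound directly in the catch clause head instead. A generic sketch of the replacement pattern (not proper's actual code):

-module(stacktrace_example).
-export([safe_apply/1]).

%% OTP 21+ style: the stacktrace is captured in the clause head,
%% so erlang:get_stacktrace/0 is no longer needed after the try.
safe_apply(F) ->
    try
        {ok, F()}
    catch
        Class:Reason:Stacktrace ->
            {error, {Class, Reason, Stacktrace}}
    end.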
===> Compiling quickrand
===> Compiling uuid
===> Compiling types
===> Compiling time_compat
===> Compiling rand_compat
===> Compiling acceptor_pool
===> Compiling partisan
===> Compiling riak_core_partisan_utils
===> Compiling eleveldb
make: Nothing to be done for 'all'.
make: Nothing to be done for 'tools'.
===> Compiling c_src/eleveldb.cc
===> Compiling c_src/refobjects.cc
===> Compiling c_src/workitems.cc
===> Linking priv/eleveldb.so
===> Compiling getopt
===> Compiling cuttlefish
===> Building escript...
===> Compiling riak_ensemble
===> Linking priv/riak_ensemble_drv.so
===> Compiling chash
===> Compiling basho_stats
===> Compiling edown
===> Compiling setup
/usr/local/bin/rebar skip_deps=true escriptize
==> setup (escriptize)
===> Compiling gen_fsm_compat
===> Compiling poolboy
===> Compiling pbkdf2
===> Compiling parse_trans
===> Compiling exometer_core
===> Compiling clique
===> Compiling blume
===> Compiling riak_core
===> Compiling lasp_bench
===> Compiling recon
===> Compiling unir
src/unir_get_fsm.erl:58: Warning: gen_fsm:start_link/3 is deprecated and will be removed in a future release; use gen_statem:start_link/3
src/unir_ping_fsm.erl:57: Warning: gen_fsm:start_link/3 is deprecated and will be removed in a future release; use gen_statem:start_link/3
src/unir_put_fsm.erl:59: Warning: gen_fsm:start_link/3 is deprecated and will be removed in a future release; use gen_statem:start_link/3
_build/test/lib/unir/test/support.erl:25: Warning: export_all flag enabled - all functions will be exported
_build/test/lib/unir/test/prop_unir.erl:11: Warning: export_all flag enabled - all functions will be exported
_build/test/lib/unir/test/functionality_SUITE.erl:36: Warning: export_all flag enabled - all functions will be exported
_build/test/lib/unir/test/throughput_SUITE.erl:36: Warning: export_all flag enabled - all functions will be exported
test/support.erl:25: Warning: export_all flag enabled - all functions will be exported
test/prop_unir.erl:11: Warning: export_all flag enabled - all functions will be exported
test/functionality_SUITE.erl:36: Warning: export_all flag enabled - all functions will be exported
test/throughput_SUITE.erl:36: Warning: export_all flag enabled - all functions will be exported
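Note: the gen_fsm deprecation warnings above come from the three unir FSM modules. The general shape of a gen_fsm to gen_statem migration looks roughly like this (a generic sketch, not the project's actual FSMs):

-module(statem_example).
-behaviour(gen_statem).
-export([start_link/0]).
-export([init/1, callback_mode/0, waiting/3]).

%% gen_fsm:start_link/3 maps onto gen_statem:start_link/3 with the
%% same (Module, Args, Opts) argument shape.
start_link() ->
    gen_statem:start_link(?MODULE, [], []).

init([]) ->
    {ok, waiting, #{}}.

%% gen_statem additionally requires callback_mode/0.
callback_mode() ->
    state_functions.

%% State callbacks take (EventType, EventContent, Data) rather than
%% gen_fsm's (Event, StateData).
waiting({call, From}, ping, Data) ->
    {keep_state, Data, [{reply, From, pong}]}.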
===> Testing prop_unir:prop_sequential()
16:16:59.136 [info] Using "node_1@127.0.0.1" as name, since running >= 20.0
16:16:59.138 [info] Starting node: 'node_1@127.0.0.1'
16:16:59.138 [info] Application lager started on node 'runner@127.0.0.1'
16:17:01.444 [info] Node started: 'node_1@127.0.0.1'
16:17:01.444 [info] Using "node_2@127.0.0.1" as name, since running >= 20.0
16:17:01.444 [info] Starting node: 'node_2@127.0.0.1'
16:17:03.659 [info] Node started: 'node_2@127.0.0.1'
16:17:03.659 [info] Using "node_3@127.0.0.1" as name, since running >= 20.0
16:17:03.661 [info] Starting node: 'node_3@127.0.0.1'
16:17:06.864 [info] Node started: 'node_3@127.0.0.1'
16:17:24.359 [info] Issuing normal join from 'node_2@127.0.0.1' to 'node_1@127.0.0.1': ok
16:17:30.448 [info] Issuing normal join from 'node_3@127.0.0.1' to 'node_1@127.0.0.1': ok
16:17:30.448 [info] Waiting for nodes to be ready...
16:17:30.448 [info] Waiting for node 'node_1@127.0.0.1' to be ready...
16:17:30.452 [info] Waiting for node 'node_2@127.0.0.1' to be ready...
16:17:30.455 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:17:31.465 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:17:32.468 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:17:33.471 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:17:34.476 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:17:35.479 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:17:36.481 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:17:37.483 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:17:38.486 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:17:39.489 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:17:40.492 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:17:40.496 [info] All nodes ready!
16:17:40.496 [info] Waiting for ownership agreement...
16:19:36.149 [info] Waiting for handoff...
16:19:36.153 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:19:36.517 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:19:37.519 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:19:38.592 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:19:39.595 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:19:39.635 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:19:40.636 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:19:40.686 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:19:41.687 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:19:41.753 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:19:42.755 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:19:42.804 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:19:43.807 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:19:43.829 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:19:44.830 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:19:44.885 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:19:45.887 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:19:45.938 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:19:46.939 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:19:47.017 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:19:48.018 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:19:48.063 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:19:49.064 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:19:49.122 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [true,true,true]
16:19:49.122 [info] No pending changes remain!
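Note: the repeated "Wait until no pending changes" lines above reflect a wait-until polling loop in the test support code, which is not included in this gist. A generic sketch of that kind of helper, with hypothetical names:

%% Hypothetical helper: retry Fun() every Delay ms until it returns
%% true or Retries attempts are used up. The real support code that
%% produced the output above is not shown here.
wait_until(Fun, Retries, Delay) when Retries > 0 ->
    case Fun() of
        true -> ok;
        false ->
            timer:sleep(Delay),
            wait_until(Fun, Retries - 1, Delay)
    end;
wait_until(_Fun, 0, _Delay) ->
    {error, timeout}.

%% Example use: wait until every node reports no pending ring changes
%% (no_pending_changes/1 is a hypothetical per-node predicate).
%% wait_until(fun() -> lists:all(fun no_pending_changes/1, Nodes) end, 60, 1000).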
16:19:49.122 [info] Waiting for ring convergence...
16:19:49.373 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
----------------------------------------------------
2018-05-08 16:19:49.871
Stopping node: node_1
Warning: ct_logs not started
Stopping node: node_1
16:19:49.879 [info] Using "node_1@127.0.0.1" as name, since running >= 20.0
----------------------------------------------------
2018-05-08 16:19:51.900
Stopping node: node_2
Warning: ct_logs not started
Stopping node: node_2
16:19:51.901 [info] Using "node_2@127.0.0.1" as name, since running >= 20.0
----------------------------------------------------
2018-05-08 16:19:53.909
Stopping node: node_3
Warning: ct_logs not started
Stopping node: node_3
16:19:53.909 [info] Using "node_3@127.0.0.1" as name, since running >= 20.0
.
16:19:55.921 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:19:55.958 [info] Using "node_1@127.0.0.1" as name, since running >= 20.0
16:19:55.958 [info] Starting node: 'node_1@127.0.0.1'
!
Failed: After 2 test(s).
An exception was raised: exit:{test_case_failed,{error,boot_timeout,'node_1@127.0.0.1'}}.
Stacktrace: [{lists,map,2,[{file,"lists.erl"},{line,1239}]},
{support,start,3,
[{file,"/opt/unir/test/support.erl"},{line,117}]},
{prop_unir,start_nodes,0,
[{file,"/opt/unir/test/prop_unir.erl"},{line,307}]},
{prop_unir,'-prop_sequential/0-fun-2-',1,
[{file,"/opt/unir/test/prop_unir.erl"},{line,32}]},
{proper,apply_args,3,
[{file,"/opt/unir/_build/test/lib/proper/src/proper.erl"},
{line,1353}]},
{proper,perform,7,
[{file,"/opt/unir/_build/test/lib/proper/src/proper.erl"},
{line,1146}]},
{proper,inner_test,2,
[{file,"/opt/unir/_build/test/lib/proper/src/proper.erl"},
{line,1028}]}].
[]
Shrinking
16:19:59.033 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:19:59.048 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:19:59.049 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:19:59.062 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:19:59.102 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
(0 time(s))
[]
===>
0/1 properties passed, 1 failed
===> Failed test cases:
prop_unir:prop_sequential() -> false
{prop_unir,prop_sequential,[[]]}.
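Note: the {error,boot_timeout,'node_1@127.0.0.1'} term behind this failure has the shape ct_slave:start/3 returns when a peer node does not boot within its boot_timeout. Assuming support:start/3 starts nodes through ct_slave (an assumption; the helper itself is not part of this gist), the relevant timeouts could be raised along these lines:

%% Hypothetical sketch: start a test node with more generous timeouts.
%% ct_slave's defaults (boot_timeout 3s, init_timeout 1s,
%% startup_timeout 1s) are easy to exceed on a loaded CI host.
start_node(Host, Name) ->
    ct_slave:start(Host, Name,
                   [{boot_timeout, 30},
                    {init_timeout, 30},
                    {startup_timeout, 30},
                    {monitor_master, true}]).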