@cmeiklejohn
Created May 8, 2018 16:13
results for unir-865d54db67-dcndd
find . -name console.log | grep `ls -d ./undefined* | tail -1` | xargs cat
2018-05-08 16:10:11.970 [info] <0.33.0> Application lager started on node 'node_3@127.0.0.1'
2018-05-08 16:10:11.988 [info] <0.33.0> Application types started on node 'node_3@127.0.0.1'
2018-05-08 16:10:11.992 [info] <0.33.0> Application acceptor_pool started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.027 [info] <0.33.0> Application asn1 started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.027 [info] <0.33.0> Application public_key started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.157 [info] <0.33.0> Application ssl started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.164 [info] <0.33.0> Application rand_compat started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.176 [info] <0.33.0> Application quickrand started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.176 [info] <0.33.0> Application uuid started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.188 [info] <0.185.0>@partisan_config:init:58 Using node name: 'node_3@127.0.0.1'
2018-05-08 16:10:12.223 [info] <0.187.0>@partisan_config:get_node_address:168 Resolving "127.0.0.1"...
2018-05-08 16:10:12.228 [info] <0.187.0>@partisan_config:get_node_address:171 Resolved "node_3@127.0.0.1" to {127,0,0,1}
2018-05-08 16:10:12.228 [info] <0.185.0>@partisan_config:get_node_address:188 Resolved "127.0.0.1" to {127,0,0,1}
2018-05-08 16:10:12.509 [info] <0.33.0> Application partisan started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.519 [info] <0.33.0> Application gen_fsm_compat started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.537 [info] <0.33.0> Application riak_sysmon started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.598 [info] <0.33.0> Application os_mon started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.618 [info] <0.33.0> Application basho_stats started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.621 [info] <0.33.0> Application eleveldb started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.626 [info] <0.33.0> Application pbkdf2 started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.627 [info] <0.33.0> Application poolboy started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.737 [info] <0.266.0>@exometer_report:do_start_reporters:613 Starting reporters with []
2018-05-08 16:10:12.737 [info] <0.33.0> Application exometer_core started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.853 [info] <0.33.0> Application clique started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.887 [info] <0.33.0> Application folsom started on node 'node_3@127.0.0.1'
2018-05-08 16:10:12.982 [info] <0.284.0>@riak_core_partisan_utils:configure_dispatch:145 Configuring partisan dispatch: true
2018-05-08 16:10:13.151 [warning] <0.300.0>@riak_core_ring_manager:reload_ring:380 No ring file available.
2018-05-08 16:10:13.589 [info] <0.317.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,vnode_routing} = proxy
2018-05-08 16:10:13.601 [info] <0.317.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,staged_joins} = true
2018-05-08 16:10:13.610 [info] <0.317.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,resizable_ring} = true
2018-05-08 16:10:13.619 [info] <0.317.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,fold_req_version} = v2
2018-05-08 16:10:13.627 [info] <0.317.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,security} = true
2018-05-08 16:10:13.638 [info] <0.317.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,bucket_types} = true
2018-05-08 16:10:13.648 [info] <0.317.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,net_ticktime} = true
2018-05-08 16:10:13.838 [info] <0.33.0> Application riak_core started on node 'node_3@127.0.0.1'
2018-05-08 16:10:13.844 [info] <0.33.0> Application setup started on node 'node_3@127.0.0.1'
2018-05-08 16:10:25.068 [info] <0.3426.0>@riak_core_partisan_utils:join:100 Starting join from partisan utils from 'node_3@127.0.0.1' to 'node_1@127.0.0.1'
2018-05-08 16:10:26.096 [info] <0.3426.0>@riak_core_partisan_utils:join:106 Finishing join from 'node_3@127.0.0.1' to 'node_1@127.0.0.1'
2018-05-08 16:10:26.099 [info] <0.3426.0>@riak_core_partisan_utils:join:100 Starting join from partisan utils from 'node_3@127.0.0.1' to 'node_3@127.0.0.1'
2018-05-08 16:10:26.114 [info] <0.3426.0>@riak_core_partisan_utils:join:106 Finishing join from 'node_3@127.0.0.1' to 'node_3@127.0.0.1'
2018-05-08 16:10:36.193 [info] <0.300.0>@riak_core_gossip:log_node_changed:357 'node_3@127.0.0.1' changed from 'joining' to 'valid'
2018-05-08 16:11:30.174 [info] <0.245.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.1675.0> [{timeout,53},{in,{gen_fsm_compat,loop,8}},{out,{proc_lib,exit_p,3}}]
2018-05-08 16:11:31.204 [info] <0.245.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.3430.0> [{initial_call,{partisan_peer_service_client,init,1}},{almost_current_function,{gen_server,loop,7}},{message_queue_len,0},{dictionary,[{{partisan_peer_service_client,listen_addr},#{ip => {127,0,0,1},port => 55742}},{{partisan_peer_service_client,peer},#{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 55742}],name => 'node_1@127.0.0.1',parallelism => 1}},{{partisan_peer_service_client,channel},{monotonic,gossip}},{'$initial_call',{partisan_peer_service_client,init,1}},{{partisan_peer_service_client,from},<0.220.0>},{last_transmission_time,-576460666876},{'$ancestors',[partisan_default_peer_service_manager,partisan_sup,<0.184.0>]}]}] [{timeout,64},{in,{lists,all,2}},{out,{gen_server,loop,7}}]
2018-05-08 16:11:53.660 [info] <0.245.0>@riak_core_sysmon_handler:handle_event:92 monitor long_gc <0.316.0> [{name,riak_core_vnode_manager},{initial_call,{riak_core_vnode_manager,init,1}},{almost_current_function,{dict,'-from_list/1-fun-0-',2}},{message_queue_len,4},{dictionary,[{'$initial_call',{riak_core_vnode_manager,init,1}},{'$ancestors',[riak_core_sup,<0.283.0>]}]}] [{timeout,62},{old_heap_block_size,514838},{heap_block_size,75113},{mbuf_size,0},{stack_size,43},{old_heap_size,90328},{heap_size,3127}]
2018-05-08 16:11:53.660 [info] <0.245.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.316.0> [{name,riak_core_vnode_manager},{initial_call,{riak_core_vnode_manager,init,1}},{almost_current_function,{dict,'-from_list/1-fun-0-',2}},{message_queue_len,4},{dictionary,[{'$initial_call',{riak_core_vnode_manager,init,1}},{'$ancestors',[riak_core_sup,<0.283.0>]}]}] [{timeout,64},{in,{dict,store_bkt_val,3}},{out,{dict,on_bucket,3}}]
2018-05-08 16:12:52.838 [info] <0.245.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.299.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.283.0>]}]}] [{timeout,113},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:12:53.343 [info] <0.245.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.299.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.283.0>]}]}] [{timeout,161},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:12:55.661 [warning] <0.5276.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 55742}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:12:56.688 [warning] <0.5277.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 55742}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:12:56.692 [warning] <0.5278.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 54572}],name => 'node_2@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:12:57.760 [warning] <0.5279.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 55742}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:12:57.760 [warning] <0.5280.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 54572}],name => 'node_2@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:12:58.606 [info] <0.283.0>@riak_core_app:stop:47 Stopped application riak_core.
2018-05-08 16:10:08.717 [info] <0.33.0> Application lager started on node 'node_1@127.0.0.1'
2018-05-08 16:10:08.760 [info] <0.33.0> Application types started on node 'node_1@127.0.0.1'
2018-05-08 16:10:08.763 [info] <0.33.0> Application acceptor_pool started on node 'node_1@127.0.0.1'
2018-05-08 16:10:08.799 [info] <0.33.0> Application asn1 started on node 'node_1@127.0.0.1'
2018-05-08 16:10:08.799 [info] <0.33.0> Application public_key started on node 'node_1@127.0.0.1'
2018-05-08 16:10:08.936 [info] <0.33.0> Application ssl started on node 'node_1@127.0.0.1'
2018-05-08 16:10:08.940 [info] <0.33.0> Application rand_compat started on node 'node_1@127.0.0.1'
2018-05-08 16:10:08.960 [info] <0.33.0> Application quickrand started on node 'node_1@127.0.0.1'
2018-05-08 16:10:08.960 [info] <0.33.0> Application uuid started on node 'node_1@127.0.0.1'
2018-05-08 16:10:08.974 [info] <0.186.0>@partisan_config:init:58 Using node name: 'node_1@127.0.0.1'
2018-05-08 16:10:09.001 [info] <0.188.0>@partisan_config:get_node_address:168 Resolving "127.0.0.1"...
2018-05-08 16:10:09.001 [info] <0.188.0>@partisan_config:get_node_address:171 Resolved "node_1@127.0.0.1" to {127,0,0,1}
2018-05-08 16:10:09.001 [info] <0.186.0>@partisan_config:get_node_address:188 Resolved "127.0.0.1" to {127,0,0,1}
2018-05-08 16:10:09.218 [info] <0.33.0> Application partisan started on node 'node_1@127.0.0.1'
2018-05-08 16:10:09.237 [info] <0.33.0> Application gen_fsm_compat started on node 'node_1@127.0.0.1'
2018-05-08 16:10:09.268 [info] <0.33.0> Application riak_sysmon started on node 'node_1@127.0.0.1'
2018-05-08 16:10:09.305 [info] <0.33.0> Application os_mon started on node 'node_1@127.0.0.1'
2018-05-08 16:10:09.312 [info] <0.33.0> Application basho_stats started on node 'node_1@127.0.0.1'
2018-05-08 16:10:09.316 [info] <0.33.0> Application eleveldb started on node 'node_1@127.0.0.1'
2018-05-08 16:10:09.317 [info] <0.33.0> Application pbkdf2 started on node 'node_1@127.0.0.1'
2018-05-08 16:10:09.319 [info] <0.33.0> Application poolboy started on node 'node_1@127.0.0.1'
2018-05-08 16:10:09.391 [info] <0.265.0>@exometer_report:do_start_reporters:613 Starting reporters with []
2018-05-08 16:10:09.393 [info] <0.33.0> Application exometer_core started on node 'node_1@127.0.0.1'
2018-05-08 16:10:09.467 [info] <0.33.0> Application clique started on node 'node_1@127.0.0.1'
2018-05-08 16:10:09.498 [info] <0.33.0> Application folsom started on node 'node_1@127.0.0.1'
2018-05-08 16:10:09.541 [info] <0.283.0>@riak_core_partisan_utils:configure_dispatch:145 Configuring partisan dispatch: true
2018-05-08 16:10:09.604 [warning] <0.299.0>@riak_core_ring_manager:reload_ring:380 No ring file available.
2018-05-08 16:10:09.806 [info] <0.244.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.309.0> [{timeout,86},{in,{code_server,call,1}},{out,{code_server,'-handle_on_load/5-fun-0-',1}}]
2018-05-08 16:10:09.951 [info] <0.316.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,vnode_routing} = proxy
2018-05-08 16:10:09.959 [info] <0.316.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,staged_joins} = true
2018-05-08 16:10:09.967 [info] <0.316.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,resizable_ring} = true
2018-05-08 16:10:09.971 [info] <0.316.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,fold_req_version} = v2
2018-05-08 16:10:09.975 [info] <0.316.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,security} = true
2018-05-08 16:10:09.981 [info] <0.316.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,bucket_types} = true
2018-05-08 16:10:09.986 [info] <0.316.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,net_ticktime} = true
2018-05-08 16:10:10.079 [info] <0.33.0> Application riak_core started on node 'node_1@127.0.0.1'
2018-05-08 16:10:10.128 [info] <0.33.0> Application setup started on node 'node_1@127.0.0.1'
2018-05-08 16:10:20.085 [error] <0.219.0>@partisan_default_peer_service_manager:do_send_message:940 Node 'node_2@127.0.0.1' is not yet connected during send!
2018-05-08 16:10:22.328 [info] <0.299.0>@riak_core_gossip:log_node_added:360 'node_2@127.0.0.1' joined cluster with status 'valid'
2018-05-08 16:10:26.137 [error] <0.219.0>@partisan_default_peer_service_manager:do_send_message:940 Node 'node_3@127.0.0.1' is not yet connected during send!
2018-05-08 16:10:31.419 [info] <0.244.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.299.0> [{name,riak_core_ring_manager},{initial_call,{riak_core_ring_manager,init,1}},{almost_current_function,{riak_core_claimant,handle_down_nodes,2}},{message_queue_len,0},{dictionary,[{rand_seed,{#{bits => 58,jump => #Fun<rand.8.15449617>,next => #Fun<rand.5.15449617>,type => exrop,uniform => #Fun<rand.6.15449617>,uniform_n => #Fun<rand.7.15449617>,weak_low_bits => 1},[269565471622683752|10668982613975908]}},{'$initial_call',{riak_core_ring_manager,init,1}},{'$ancestors',[riak_core_sup,<0.282.0>]}]}] [{timeout,55},{in,{riak_core_claimant,'-transfer_ownership/2-fun-0-',2}},{out,{lists,'-filter/2-lc$^0/1-0-',2}}]
2018-05-08 16:10:34.637 [info] <0.299.0>@riak_core_gossip:log_node_added:360 'node_3@127.0.0.1' joined cluster with status 'joining'
2018-05-08 16:10:35.923 [info] <0.299.0>@riak_core_gossip:log_node_changed:357 'node_3@127.0.0.1' changed from 'joining' to 'valid'
2018-05-08 16:11:30.627 [info] <0.244.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule #Port<0.718> name tcp_inet [{links,[<0.76.0>]},{id,5744},{connected,<0.76.0>},{input,41144},{output,60},{os_pid,undefined},{queue_size,0}] [{timeout,83},{port_op,dist_cmd}]
2018-05-08 16:11:42.852 [info] <0.244.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.299.0> [{name,riak_core_ring_manager},{initial_call,{riak_core_ring_manager,init,1}},{almost_current_function,{prim_file,drv_get_response,1}},{message_queue_len,1},{dictionary,[{rand_seed,{#{bits => 58,jump => #Fun<rand.8.15449617>,next => #Fun<rand.5.15449617>,type => exrop,uniform => #Fun<rand.6.15449617>,uniform_n => #Fun<rand.7.15449617>,weak_low_bits => 1},[269565471622683752|10668982613975908]}},{'$initial_call',{riak_core_ring_manager,init,1}},{'$ancestors',[riak_core_sup,<0.282.0>]}]}] [{timeout,51},{in,{prim_file,drv_get_response,1}},{out,{prim_file,drv_get_response,1}}]
2018-05-08 16:11:56.942 [info] <0.244.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.2093.0> [{initial_call,{riak_core_vnode,init,1}},{almost_current_function,{gen_fsm_compat,loop,8}},{message_queue_len,0},{dictionary,[{rand_seed,{#{bits => 58,jump => #Fun<rand.8.15449617>,next => #Fun<rand.5.15449617>,type => exrop,uniform => #Fun<rand.6.15449617>,uniform_n => #Fun<rand.7.15449617>,weak_low_bits => 1},[75131654660335307|224667251395002285]}},{'$initial_call',{riak_core_vnode,init,1}},{'$ancestors',[riak_core_vnode_sup,riak_core_sup,<0.282.0>]}]}] [{timeout,53},{in,{gen_server,call,3}},{out,{gen_fsm_compat,loop,8}}]
2018-05-08 16:12:20.227 [info] <0.244.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.6.0> [{name,erl_prim_loader},{initial_call,{erlang,apply,2}},{almost_current_function,{erl_prim_loader,loop,3}},{message_queue_len,0},{dictionary,[]}] [{timeout,72},{in,{erl_prim_loader,loop,3}},{out,{prim_file,drv_get_response,1}}]
2018-05-08 16:12:30.062 [info] <0.244.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.6714.0> [{initial_call,{erlang,apply,2}},{almost_current_function,{riak_core_ring,'-random_other_index/2-lc$^0/1-0-',2}},{message_queue_len,0},{dictionary,[]}] [{timeout,57},{in,{gen,do_call,4}},{out,{riak_core_ring,'-random_other_index/2-lc$^0/1-0-',2}}]
2018-05-08 16:12:52.642 [info] <0.244.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.298.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.282.0>]}]}] [{timeout,89},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:12:53.032 [info] <0.244.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.298.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.282.0>]}]}] [{timeout,89},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:12:53.494 [info] <0.244.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.298.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.282.0>]}]}] [{timeout,93},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:12:54.518 [info] <0.282.0>@riak_core_app:stop:47 Stopped application riak_core.
2018-05-08 16:10:10.477 [info] <0.33.0> Application lager started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.482 [info] <0.33.0> Application types started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.484 [info] <0.33.0> Application acceptor_pool started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.499 [info] <0.33.0> Application asn1 started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.499 [info] <0.33.0> Application public_key started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.542 [info] <0.33.0> Application ssl started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.544 [info] <0.33.0> Application rand_compat started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.549 [info] <0.33.0> Application quickrand started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.549 [info] <0.33.0> Application uuid started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.557 [info] <0.186.0>@partisan_config:init:58 Using node name: 'node_2@127.0.0.1'
2018-05-08 16:10:10.580 [info] <0.188.0>@partisan_config:get_node_address:168 Resolving "127.0.0.1"...
2018-05-08 16:10:10.580 [info] <0.188.0>@partisan_config:get_node_address:171 Resolved "node_2@127.0.0.1" to {127,0,0,1}
2018-05-08 16:10:10.580 [info] <0.186.0>@partisan_config:get_node_address:188 Resolved "127.0.0.1" to {127,0,0,1}
2018-05-08 16:10:10.746 [info] <0.33.0> Application partisan started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.751 [info] <0.33.0> Application gen_fsm_compat started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.765 [info] <0.33.0> Application riak_sysmon started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.797 [info] <0.33.0> Application os_mon started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.803 [info] <0.33.0> Application basho_stats started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.805 [info] <0.33.0> Application eleveldb started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.807 [info] <0.33.0> Application pbkdf2 started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.808 [info] <0.33.0> Application poolboy started on node 'node_2@127.0.0.1'
2018-05-08 16:10:10.867 [info] <0.265.0>@exometer_report:do_start_reporters:613 Starting reporters with []
2018-05-08 16:10:10.867 [info] <0.33.0> Application exometer_core started on node 'node_2@127.0.0.1'
2018-05-08 16:10:11.032 [info] <0.283.0>@riak_core_partisan_utils:configure_dispatch:145 Configuring partisan dispatch: true
2018-05-08 16:10:11.110 [warning] <0.299.0>@riak_core_ring_manager:reload_ring:380 No ring file available.
2018-05-08 16:10:11.348 [info] <0.316.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,vnode_routing} = proxy
2018-05-08 16:10:11.357 [info] <0.316.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,staged_joins} = true
2018-05-08 16:10:11.362 [info] <0.316.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,resizable_ring} = true
2018-05-08 16:10:11.366 [info] <0.316.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,fold_req_version} = v2
2018-05-08 16:10:11.370 [info] <0.316.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,security} = true
2018-05-08 16:10:11.374 [info] <0.316.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,bucket_types} = true
2018-05-08 16:10:11.378 [info] <0.316.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,net_ticktime} = true
2018-05-08 16:10:11.475 [info] <0.33.0> Application riak_core started on node 'node_2@127.0.0.1'
2018-05-08 16:10:11.484 [info] <0.33.0> Application setup started on node 'node_2@127.0.0.1'
2018-05-08 16:10:18.948 [info] <0.1373.0>@riak_core_partisan_utils:join:100 Starting join from partisan utils from 'node_2@127.0.0.1' to 'node_1@127.0.0.1'
2018-05-08 16:10:20.012 [info] <0.1373.0>@riak_core_partisan_utils:join:106 Finishing join from 'node_2@127.0.0.1' to 'node_1@127.0.0.1'
2018-05-08 16:10:20.020 [info] <0.1373.0>@riak_core_partisan_utils:join:100 Starting join from partisan utils from 'node_2@127.0.0.1' to 'node_2@127.0.0.1'
2018-05-08 16:10:20.042 [info] <0.1373.0>@riak_core_partisan_utils:join:106 Finishing join from 'node_2@127.0.0.1' to 'node_2@127.0.0.1'
2018-05-08 16:10:22.364 [info] <0.299.0>@riak_core_gossip:log_node_changed:357 'node_2@127.0.0.1' changed from 'joining' to 'valid'
2018-05-08 16:10:35.745 [info] <0.299.0>@riak_core_gossip:log_node_added:360 'node_3@127.0.0.1' joined cluster with status 'joining'
2018-05-08 16:10:36.095 [info] <0.299.0>@riak_core_gossip:log_node_changed:357 'node_3@127.0.0.1' changed from 'joining' to 'valid'
2018-05-08 16:11:42.852 [info] <0.244.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule #Port<0.1167> name tcp_inet [{links,[<0.104.0>]},{id,9336},{connected,<0.104.0>},{input,43899},{output,44},{os_pid,undefined},{queue_size,0}] [{timeout,51},{port_op,dist_cmd}]
2018-05-08 16:12:20.196 [info] <0.244.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.6.0> [{name,erl_prim_loader},{initial_call,{erlang,apply,2}},{almost_current_function,{erl_prim_loader,loop,3}},{message_queue_len,0},{dictionary,[]}] [{timeout,52},{in,{erl_prim_loader,loop,3}},{out,{prim_file,drv_get_response,1}}]
2018-05-08 16:12:53.002 [info] <0.244.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.298.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.282.0>]}]}] [{timeout,101},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:12:53.328 [info] <0.244.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.298.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.282.0>]}]}] [{timeout,154},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:12:55.539 [warning] <0.4347.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 55742}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:12:56.496 [info] <0.282.0>@riak_core_app:stop:47 Stopped application riak_core.
2018-05-08 16:12:56.557 [warning] <0.4355.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 55742}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
===> Linking _build/default/plugins/rebar_erl_vsn to _build/test/plugins/rebar_erl_vsn
===> Linking _build/default/plugins/rebar3_proper to _build/test/plugins/rebar3_proper
===> Linking _build/default/plugins/rebar3_run to _build/test/plugins/rebar3_run
===> Verifying dependencies...
===> Fetching proper ({pkg,<<"proper">>,<<"1.2.0">>})
===> Version cached at /root/.cache/rebar3/hex/default/packages/proper-1.2.0.tar is up to date, reusing it
===> Fetching recon ({pkg,<<"recon">>,<<"2.3.5">>})
===> Downloaded package, caching at /root/.cache/rebar3/hex/default/packages/recon-2.3.5.tar
===> Linking _build/default/lib/exometer_core to _build/test/lib/exometer_core
===> Linking _build/default/lib/lager to _build/test/lib/lager
===> Linking _build/default/lib/lasp_bench to _build/test/lib/lasp_bench
===> Linking _build/default/lib/partisan to _build/test/lib/partisan
===> Linking _build/default/lib/pbkdf2 to _build/test/lib/pbkdf2
===> Linking _build/default/lib/poolboy to _build/test/lib/poolboy
===> Linking _build/default/lib/riak_core to _build/test/lib/riak_core
===> Linking _build/default/lib/riak_core_partisan_utils to _build/test/lib/riak_core_partisan_utils
===> Linking _build/default/lib/riak_ensemble to _build/test/lib/riak_ensemble
===> Linking _build/default/plugins/pc to _build/test/plugins/pc
===> Linking _build/default/lib/acceptor_pool to _build/test/lib/acceptor_pool
===> Linking _build/default/lib/basho_stats to _build/test/lib/basho_stats
===> Linking _build/default/lib/blume to _build/test/lib/blume
===> Linking _build/default/lib/chash to _build/test/lib/chash
===> Linking _build/default/lib/clique to _build/test/lib/clique
===> Linking _build/default/lib/cuttlefish to _build/test/lib/cuttlefish
===> Linking _build/default/lib/eleveldb to _build/test/lib/eleveldb
===> Linking _build/default/lib/folsom to _build/test/lib/folsom
===> Linking _build/default/lib/gen_fsm_compat to _build/test/lib/gen_fsm_compat
===> Linking _build/default/lib/getopt to _build/test/lib/getopt
===> Linking _build/default/lib/goldrush to _build/test/lib/goldrush
===> Linking _build/default/lib/jam to _build/test/lib/jam
===> Linking _build/default/lib/parse_trans to _build/test/lib/parse_trans
===> Linking _build/default/lib/rand_compat to _build/test/lib/rand_compat
===> Linking _build/default/lib/riak_sysmon to _build/test/lib/riak_sysmon
===> Linking _build/default/lib/setup to _build/test/lib/setup
===> Linking _build/default/lib/time_compat to _build/test/lib/time_compat
===> Linking _build/default/lib/types to _build/test/lib/types
===> Linking _build/default/lib/uuid to _build/test/lib/uuid
===> Linking _build/default/lib/bear to _build/test/lib/bear
===> Linking _build/default/lib/edown to _build/test/lib/edown
===> Linking _build/default/lib/meck to _build/test/lib/meck
===> Linking _build/default/lib/quickrand to _build/test/lib/quickrand
===> Compiling proper
make: 'include/compile_flags.hrl' is up to date.
/opt/unir/_build/test/lib/proper/src/proper_typeserver.erl:553: Warning: erlang:get_stacktrace/0 used following a 'try' expression may stop working in a future release. (Use it inside 'try'.)
/opt/unir/_build/test/lib/proper/src/proper_typeserver.erl:556: Warning: erlang:get_stacktrace/0 used following a 'try' expression may stop working in a future release. (Use it inside 'try'.)
_build/test/lib/proper/src/proper_typeserver.erl:553: Warning: erlang:get_stacktrace/0 used following a 'try' expression may stop working in a future release. (Use it inside 'try'.)
_build/test/lib/proper/src/proper_typeserver.erl:556: Warning: erlang:get_stacktrace/0 used following a 'try' expression may stop working in a future release. (Use it inside 'try'.)
===> Compiling quickrand
===> Compiling uuid
===> Compiling types
===> Compiling time_compat
===> Compiling rand_compat
===> Compiling acceptor_pool
===> Compiling partisan
===> Compiling riak_core_partisan_utils
===> Compiling eleveldb
make: Nothing to be done for 'all'.
make: Nothing to be done for 'tools'.
===> Compiling c_src/eleveldb.cc
===> Compiling c_src/refobjects.cc
===> Compiling c_src/workitems.cc
===> Linking priv/eleveldb.so
===> Compiling getopt
===> Compiling cuttlefish
===> Building escript...
===> Compiling riak_ensemble
===> Linking priv/riak_ensemble_drv.so
===> Compiling chash
===> Compiling basho_stats
===> Compiling edown
===> Compiling setup
/usr/local/bin/rebar skip_deps=true escriptize
==> setup (escriptize)
===> Compiling gen_fsm_compat
===> Compiling poolboy
===> Compiling pbkdf2
===> Compiling parse_trans
===> Compiling exometer_core
===> Compiling clique
===> Compiling blume
===> Compiling riak_core
===> Compiling lasp_bench
===> Compiling recon
===> Compiling unir
src/unir_get_fsm.erl:58: Warning: gen_fsm:start_link/3 is deprecated and will be removed in a future release; use gen_statem:start_link/3
src/unir_ping_fsm.erl:57: Warning: gen_fsm:start_link/3 is deprecated and will be removed in a future release; use gen_statem:start_link/3
src/unir_put_fsm.erl:59: Warning: gen_fsm:start_link/3 is deprecated and will be removed in a future release; use gen_statem:start_link/3
_build/test/lib/unir/test/support.erl:25: Warning: export_all flag enabled - all functions will be exported
_build/test/lib/unir/test/prop_unir.erl:11: Warning: export_all flag enabled - all functions will be exported
_build/test/lib/unir/test/functionality_SUITE.erl:36: Warning: export_all flag enabled - all functions will be exported
_build/test/lib/unir/test/throughput_SUITE.erl:36: Warning: export_all flag enabled - all functions will be exported
test/support.erl:25: Warning: export_all flag enabled - all functions will be exported
test/prop_unir.erl:11: Warning: export_all flag enabled - all functions will be exported
test/functionality_SUITE.erl:36: Warning: export_all flag enabled - all functions will be exported
test/throughput_SUITE.erl:36: Warning: export_all flag enabled - all functions will be exported
===> Testing prop_unir:prop_sequential()
16:10:02.711 [info] Using "node_1@127.0.0.1" as name, since running >= 20.0
16:10:02.711 [info] Application lager started on node 'runner@127.0.0.1'
----------------------------------------------------
2018-05-08 16:10:02.791
Starting node: 'node_1@127.0.0.1'
Warning: ct_logs not started
Starting node: 'node_1@127.0.0.1'
----------------------------------------------------
2018-05-08 16:10:04.911
Node started: 'node_1@127.0.0.1'
Warning: ct_logs not started
Node started: 'node_1@127.0.0.1'
----------------------------------------------------
2018-05-08 16:10:04.911
Starting node: 'node_2@127.0.0.1'
Warning: ct_logs not started
16:10:04.911 [info] Using "node_2@127.0.0.1" as name, since running >= 20.0
Starting node: 'node_2@127.0.0.1'
----------------------------------------------------
2018-05-08 16:10:06.021
Node started: 'node_2@127.0.0.1'
Warning: ct_logs not started
Node started: 'node_2@127.0.0.1'
----------------------------------------------------
2018-05-08 16:10:06.021
Starting node: 'node_3@127.0.0.1'
Warning: ct_logs not started
16:10:06.021 [info] Using "node_3@127.0.0.1" as name, since running >= 20.0
Starting node: 'node_3@127.0.0.1'
----------------------------------------------------
2018-05-08 16:10:07.076
Node started: 'node_3@127.0.0.1'
Warning: ct_logs not started
Node started: 'node_3@127.0.0.1'
16:10:07.123 [info] Node 'node_1@127.0.0.1' PrivDir: undefined NodeDir: "undefined/node_1@127.0.0.1"
16:10:07.190 [info] Node 'node_2@127.0.0.1' PrivDir: undefined NodeDir: "undefined/node_2@127.0.0.1"
16:10:07.264 [info] Node 'node_3@127.0.0.1' PrivDir: undefined NodeDir: "undefined/node_3@127.0.0.1"
16:10:20.064 [info] Issuing normal join from 'node_2@127.0.0.1' to 'node_1@127.0.0.1': ok
16:10:26.132 [info] Issuing normal join from 'node_3@127.0.0.1' to 'node_1@127.0.0.1': ok
16:10:26.132 [info] Waiting for nodes to be ready...
16:10:26.132 [info] Waiting for node 'node_1@127.0.0.1' to be ready...
16:10:26.133 [info] -> Ready members: ['node_1@127.0.0.1','node_2@127.0.0.1']
16:10:26.133 [info] Waiting for node 'node_2@127.0.0.1' to be ready...
16:10:26.135 [info] -> Ready members: ['node_1@127.0.0.1','node_2@127.0.0.1']
16:10:26.135 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:10:26.137 [info] -> Ready members: ['node_1@127.0.0.1','node_2@127.0.0.1']
16:10:27.143 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:10:27.154 [info] -> Ready members: ['node_1@127.0.0.1','node_2@127.0.0.1']
16:10:28.155 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:10:28.156 [info] -> Ready members: ['node_1@127.0.0.1','node_2@127.0.0.1']
16:10:29.157 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:10:29.171 [info] -> Ready members: ['node_1@127.0.0.1','node_2@127.0.0.1']
16:10:30.172 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:10:30.174 [info] -> Ready members: ['node_1@127.0.0.1','node_2@127.0.0.1']
16:10:31.175 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:10:31.176 [info] -> Ready members: ['node_1@127.0.0.1','node_2@127.0.0.1']
16:10:32.178 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:10:32.182 [info] -> Ready members: ['node_1@127.0.0.1','node_2@127.0.0.1']
16:10:33.183 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:10:33.185 [info] -> Ready members: ['node_1@127.0.0.1','node_2@127.0.0.1']
16:10:34.186 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:10:34.189 [info] -> Ready members: ['node_1@127.0.0.1','node_2@127.0.0.1']
16:10:35.190 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:10:35.193 [info] -> Ready members: ['node_1@127.0.0.1','node_2@127.0.0.1']
16:10:36.197 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:10:36.202 [info] -> Ready members: ['node_1@127.0.0.1','node_2@127.0.0.1']
16:10:37.203 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:10:37.205 [info] -> Ready members: ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:10:37.205 [info] All nodes ready!
16:10:37.205 [info] Waiting for ownership agreement...
16:12:33.205 [info] Waiting for handoff...
16:12:33.205 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:33.406 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:34.409 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:34.443 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:35.445 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:35.487 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:36.491 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:36.532 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:37.533 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:37.573 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:38.574 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:38.623 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:39.628 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:39.721 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:40.723 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:40.766 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:41.767 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:41.813 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:42.815 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:42.874 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:43.879 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:43.941 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:44.943 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:45.006 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:46.007 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:46.802 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:47.807 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:47.856 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:48.857 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:48.886 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:49.887 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:49.926 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:50.927 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:50.970 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:51.972 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:51.989 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:12:52.993 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:12:53.060 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [true,true,true]
16:12:53.060 [info] No pending changes remain!
16:12:53.060 [info] Waiting for ring convergence...
16:12:54.195 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:12:54.195 [info] write_object: node node_1 key <<"key">> value <<>>
----------------------------------------------------
2018-05-08 16:12:54.328
Stopping node: node_1
Warning: ct_logs not started
Stopping node: node_1
16:12:54.328 [info] Using "node_1@127.0.0.1" as name, since running >= 20.0
----------------------------------------------------
2018-05-08 16:12:56.335
Stopping node: node_2
Warning: ct_logs not started
Stopping node: node_2
16:12:56.336 [info] Using "node_2@127.0.0.1" as name, since running >= 20.0
----------------------------------------------------
2018-05-08 16:12:58.346
Stopping node: node_3
Warning: ct_logs not started
Stopping node: node_3
16:12:58.347 [info] Using "node_3@127.0.0.1" as name, since running >= 20.0
.16:13:00.357 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
----------------------------------------------------
2018-05-08 16:13:00.401
Starting node: 'node_1@127.0.0.1'
16:13:00.401 [info] Using "node_1@127.0.0.1" as name, since running >= 20.0
Warning: ct_logs not started
Starting node: 'node_1@127.0.0.1'!
Failed: After 2 test(s).
An exception was raised: exit:{test_case_failed,{error,boot_timeout,'node_1@127.0.0.1'}}.
Stacktrace: [{lists,map,2,[{file,"lists.erl"},{line,1239}]},
{support,start,3,
[{file,"/opt/unir/test/support.erl"},{line,117}]},
{prop_unir,start_nodes,0,
[{file,"/opt/unir/test/prop_unir.erl"},{line,307}]},
{prop_unir,'-prop_sequential/0-fun-2-',1,
[{file,"/opt/unir/test/prop_unir.erl"},{line,32}]},
{proper,apply_args,3,
[{file,"/opt/unir/_build/test/lib/proper/src/proper.erl"},
{line,1353}]},
{proper,perform,7,
[{file,"/opt/unir/_build/test/lib/proper/src/proper.erl"},
{line,1146}]},
{proper,inner_test,2,
[{file,"/opt/unir/_build/test/lib/proper/src/proper.erl"},
{line,1028}]}].
[{set,{var,1},{call,prop_unir,read_object,[node_1,<<107,101,121>>]}}]
Shrinking 16:13:03.508 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:13:03.529 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:13:03.572 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:13:03.580 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:13:03.694 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
(0 time(s))
[{set,{var,1},{call,prop_unir,read_object,[node_1,<<107,101,121>>]}}]
===>
0/1 properties passed, 1 failed
===> Failed test cases:
prop_unir:prop_sequential() -> false
{prop_unir,prop_sequential,
[[{set,{var,1},{call,prop_unir,read_object,[node_1,<<"key">>]}}]]}.