@cmeiklejohn
Created May 8, 2018 16:17
results for unir-865d54db67-dcndd
find . -name console.log | grep `ls -d ./undefined* | tail -1` | xargs cat
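The one-liner above concatenates every console.log from the most recent run directory, relying on the lexical order of `ls` to pick the latest `./undefined*` directory. A slightly more defensive variant (a sketch, assuming the same `./undefined*` run-directory layout) prefixes each file with its path, so the interleaved per-node logs below are easier to tell apart:

```shell
# cat_latest_run: print every console.log under the newest ./undefined*
# run directory, each preceded by a header naming its source file.
# Assumes lexical order of ls matches run recency, as the one-liner does.
cat_latest_run() {
  latest=$(ls -d ./undefined* | tail -1)
  find "$latest" -name console.log | sort | while read -r f; do
    printf '==== %s ====\n' "$f"
    cat "$f"
  done
}
```

With headers in place, the boundary between each node's log (node_3, node_1, node_2 in the output below) is explicit rather than inferred from timestamps resetting.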
2018-05-08 16:14:15.235 [info] <0.33.0> Application lager started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.262 [info] <0.33.0> Application types started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.264 [info] <0.33.0> Application acceptor_pool started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.280 [info] <0.33.0> Application asn1 started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.281 [info] <0.33.0> Application public_key started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.365 [info] <0.33.0> Application ssl started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.369 [info] <0.33.0> Application rand_compat started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.374 [info] <0.33.0> Application quickrand started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.374 [info] <0.33.0> Application uuid started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.385 [info] <0.191.0>@partisan_config:init:58 Using node name: 'node_3@127.0.0.1'
2018-05-08 16:14:15.406 [info] <0.193.0>@partisan_config:get_node_address:168 Resolving "127.0.0.1"...
2018-05-08 16:14:15.406 [info] <0.193.0>@partisan_config:get_node_address:171 Resolved "node_3@127.0.0.1" to {127,0,0,1}
2018-05-08 16:14:15.406 [info] <0.191.0>@partisan_config:get_node_address:188 Resolved "127.0.0.1" to {127,0,0,1}
2018-05-08 16:14:15.576 [info] <0.33.0> Application partisan started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.580 [info] <0.33.0> Application gen_fsm_compat started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.592 [info] <0.33.0> Application riak_sysmon started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.621 [info] <0.33.0> Application os_mon started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.624 [info] <0.33.0> Application basho_stats started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.630 [info] <0.33.0> Application eleveldb started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.631 [info] <0.33.0> Application pbkdf2 started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.632 [info] <0.33.0> Application poolboy started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.679 [info] <0.270.0>@exometer_report:do_start_reporters:613 Starting reporters with []
2018-05-08 16:14:15.679 [info] <0.33.0> Application exometer_core started on node 'node_3@127.0.0.1'
2018-05-08 16:14:15.792 [info] <0.288.0>@riak_core_partisan_utils:configure_dispatch:145 Configuring partisan dispatch: true
2018-05-08 16:14:15.978 [warning] <0.304.0>@riak_core_ring_manager:reload_ring:380 No ring file available.
2018-05-08 16:14:16.530 [info] <0.321.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,vnode_routing} = proxy
2018-05-08 16:14:16.544 [info] <0.321.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,staged_joins} = true
2018-05-08 16:14:16.558 [info] <0.321.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,resizable_ring} = true
2018-05-08 16:14:16.567 [info] <0.321.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,fold_req_version} = v2
2018-05-08 16:14:16.584 [info] <0.321.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,security} = true
2018-05-08 16:14:16.609 [info] <0.321.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,bucket_types} = true
2018-05-08 16:14:16.621 [info] <0.321.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,net_ticktime} = true
2018-05-08 16:14:16.742 [info] <0.33.0> Application riak_core started on node 'node_3@127.0.0.1'
2018-05-08 16:14:16.748 [info] <0.33.0> Application setup started on node 'node_3@127.0.0.1'
2018-05-08 16:14:27.920 [info] <0.3430.0>@riak_core_partisan_utils:join:100 Starting join from partisan utils from 'node_3@127.0.0.1' to 'node_1@127.0.0.1'
2018-05-08 16:14:28.980 [info] <0.3430.0>@riak_core_partisan_utils:join:106 Finishing join from 'node_3@127.0.0.1' to 'node_1@127.0.0.1'
2018-05-08 16:14:28.987 [info] <0.3430.0>@riak_core_partisan_utils:join:100 Starting join from partisan utils from 'node_3@127.0.0.1' to 'node_3@127.0.0.1'
2018-05-08 16:14:28.997 [info] <0.3430.0>@riak_core_partisan_utils:join:106 Finishing join from 'node_3@127.0.0.1' to 'node_3@127.0.0.1'
2018-05-08 16:14:38.602 [info] <0.304.0>@riak_core_gossip:log_node_changed:357 'node_3@127.0.0.1' changed from 'joining' to 'valid'
2018-05-08 16:15:15.618 [info] <0.92.0> alarm_handler: {set,{system_memory_high_watermark,[]}}
2018-05-08 16:16:06.163 [info] <0.249.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.3434.0> [{initial_call,{partisan_peer_service_client,init,1}},{almost_current_function,{gen_server,loop,7}},{message_queue_len,0},{dictionary,[{{partisan_peer_service_client,listen_addr},#{ip => {127,0,0,1},port => 35986}},{{partisan_peer_service_client,peer},#{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 35986}],name => 'node_1@127.0.0.1',parallelism => 1}},{'$initial_call',{partisan_peer_service_client,init,1}},{'$ancestors',[partisan_default_peer_service_manager,partisan_sup,<0.190.0>]},{{partisan_peer_service_client,channel},{monotonic,gossip}},{{partisan_peer_service_client,from},<0.224.0>},{last_transmission_time,-576460635703}]}] [{timeout,93},{in,{partisan_util,term_to_iolist_,1}},{out,{gen_server,loop,7}}]
2018-05-08 16:16:21.973 [info] <0.249.0>@riak_core_sysmon_handler:handle_event:92 monitor long_gc <0.6.0> [{name,erl_prim_loader},{initial_call,{erlang,apply,2}},{almost_current_function,{erl_prim_loader,loop,3}},{message_queue_len,0},{dictionary,[]}] [{timeout,51},{old_heap_block_size,28690},{heap_block_size,4185},{mbuf_size,0},{stack_size,14},{old_heap_size,14919},{heap_size,180}]
2018-05-08 16:16:55.854 [info] <0.249.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.303.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.287.0>]}]}] [{timeout,67},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:16:57.245 [error] <0.224.0>@partisan_default_peer_service_manager:do_send_message:933 Node 'node_1@127.0.0.1' was connected, but is now disconnected!
2018-05-08 16:16:57.701 [warning] <0.5270.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 35986}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:16:58.714 [warning] <0.5289.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 35986}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:16:59.731 [warning] <0.5290.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 35986}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:16:59.731 [warning] <0.5291.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 53483}],name => 'node_2@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:17:00.748 [warning] <0.5293.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 35986}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:17:00.748 [warning] <0.5294.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 53483}],name => 'node_2@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:17:01.103 [info] <0.287.0>@riak_core_app:stop:47 Stopped application riak_core.
2018-05-08 16:17:01.130 [info] <0.92.0> alarm_handler: {clear,system_memory_high_watermark}
2018-05-08 16:14:12.716 [info] <0.33.0> Application lager started on node 'node_1@127.0.0.1'
2018-05-08 16:14:12.727 [info] <0.33.0> Application types started on node 'node_1@127.0.0.1'
2018-05-08 16:14:12.728 [info] <0.33.0> Application acceptor_pool started on node 'node_1@127.0.0.1'
2018-05-08 16:14:12.752 [info] <0.33.0> Application asn1 started on node 'node_1@127.0.0.1'
2018-05-08 16:14:12.752 [info] <0.33.0> Application public_key started on node 'node_1@127.0.0.1'
2018-05-08 16:14:12.793 [info] <0.33.0> Application ssl started on node 'node_1@127.0.0.1'
2018-05-08 16:14:12.796 [info] <0.33.0> Application rand_compat started on node 'node_1@127.0.0.1'
2018-05-08 16:14:12.804 [info] <0.33.0> Application quickrand started on node 'node_1@127.0.0.1'
2018-05-08 16:14:12.805 [info] <0.33.0> Application uuid started on node 'node_1@127.0.0.1'
2018-05-08 16:14:12.813 [info] <0.188.0>@partisan_config:init:58 Using node name: 'node_1@127.0.0.1'
2018-05-08 16:14:12.824 [info] <0.190.0>@partisan_config:get_node_address:168 Resolving "127.0.0.1"...
2018-05-08 16:14:12.824 [info] <0.190.0>@partisan_config:get_node_address:171 Resolved "node_1@127.0.0.1" to {127,0,0,1}
2018-05-08 16:14:12.824 [info] <0.188.0>@partisan_config:get_node_address:188 Resolved "127.0.0.1" to {127,0,0,1}
2018-05-08 16:14:12.980 [info] <0.33.0> Application partisan started on node 'node_1@127.0.0.1'
2018-05-08 16:14:12.985 [info] <0.33.0> Application gen_fsm_compat started on node 'node_1@127.0.0.1'
2018-05-08 16:14:12.997 [info] <0.33.0> Application riak_sysmon started on node 'node_1@127.0.0.1'
2018-05-08 16:14:13.023 [info] <0.33.0> Application os_mon started on node 'node_1@127.0.0.1'
2018-05-08 16:14:13.025 [info] <0.33.0> Application basho_stats started on node 'node_1@127.0.0.1'
2018-05-08 16:14:13.030 [info] <0.33.0> Application eleveldb started on node 'node_1@127.0.0.1'
2018-05-08 16:14:13.031 [info] <0.33.0> Application pbkdf2 started on node 'node_1@127.0.0.1'
2018-05-08 16:14:13.033 [info] <0.33.0> Application poolboy started on node 'node_1@127.0.0.1'
2018-05-08 16:14:13.068 [info] <0.267.0>@exometer_report:do_start_reporters:613 Starting reporters with []
2018-05-08 16:14:13.068 [info] <0.33.0> Application exometer_core started on node 'node_1@127.0.0.1'
2018-05-08 16:14:13.097 [info] <0.33.0> Application clique started on node 'node_1@127.0.0.1'
2018-05-08 16:14:13.109 [info] <0.33.0> Application folsom started on node 'node_1@127.0.0.1'
2018-05-08 16:14:13.143 [info] <0.285.0>@riak_core_partisan_utils:configure_dispatch:145 Configuring partisan dispatch: true
2018-05-08 16:14:13.216 [warning] <0.301.0>@riak_core_ring_manager:reload_ring:380 No ring file available.
2018-05-08 16:14:13.374 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.311.0> [{timeout,64},{in,{code_server,call,1}},{out,{code_server,'-handle_on_load/5-fun-0-',1}}]
2018-05-08 16:14:13.516 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,vnode_routing} = proxy
2018-05-08 16:14:13.530 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,staged_joins} = true
2018-05-08 16:14:13.540 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,resizable_ring} = true
2018-05-08 16:14:13.549 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,fold_req_version} = v2
2018-05-08 16:14:13.561 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,security} = true
2018-05-08 16:14:13.572 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,bucket_types} = true
2018-05-08 16:14:13.577 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,net_ticktime} = true
2018-05-08 16:14:13.658 [info] <0.33.0> Application riak_core started on node 'node_1@127.0.0.1'
2018-05-08 16:14:13.708 [info] <0.33.0> Application setup started on node 'node_1@127.0.0.1'
2018-05-08 16:14:22.919 [error] <0.221.0>@partisan_default_peer_service_manager:do_send_message:940 Node 'node_2@127.0.0.1' is not yet connected during send!
2018-05-08 16:14:25.802 [info] <0.301.0>@riak_core_gossip:log_node_added:360 'node_2@127.0.0.1' joined cluster with status 'valid'
2018-05-08 16:14:29.024 [error] <0.221.0>@partisan_default_peer_service_manager:do_send_message:940 Node 'node_3@127.0.0.1' is not yet connected during send!
2018-05-08 16:14:38.513 [info] <0.301.0>@riak_core_gossip:log_node_added:360 'node_3@127.0.0.1' joined cluster with status 'valid'
2018-05-08 16:15:07.119 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.301.0> [{name,riak_core_ring_manager},{initial_call,{riak_core_ring_manager,init,1}},{almost_current_function,{lists,usplit_1,5}},{message_queue_len,0},{dictionary,[{rand_seed,{#{bits => 58,jump => #Fun<rand.8.15449617>,next => #Fun<rand.5.15449617>,type => exrop,uniform => #Fun<rand.6.15449617>,uniform_n => #Fun<rand.7.15449617>,weak_low_bits => 1},[2579874926871652|161513737163266859]}},{'$initial_call',{riak_core_ring_manager,init,1}},{'$ancestors',[riak_core_sup,<0.284.0>]}]}] [{timeout,51},{in,{chash,nodes,1}},{out,{lists,'-filter/2-lc$^0/1-0-',2}}]
2018-05-08 16:15:13.019 [info] <0.86.0> alarm_handler: {set,{system_memory_high_watermark,[]}}
2018-05-08 16:15:24.063 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.3521.0> [{initial_call,{partisan_peer_service_client,init,1}},{almost_current_function,{gen_server,loop,7}},{message_queue_len,0},{dictionary,[{{partisan_peer_service_client,peer},#{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 38500}],name => 'node_3@127.0.0.1',parallelism => 1}},{{partisan_peer_service_client,from},<0.221.0>},{{partisan_peer_service_client,listen_addr},#{ip => {127,0,0,1},port => 38500}},{last_transmission_time,-576460675653},{{partisan_peer_service_client,channel},{monotonic,gossip}},{'$initial_call',{partisan_peer_service_client,init,1}},{'$ancestors',[partisan_default_peer_service_manager,partisan_sup,<0.187.0>]}]}] [{timeout,61},{in,{partisan_util,term_to_iolist_,1}},{out,{gen_server,loop,7}}]
2018-05-08 16:15:44.665 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.317.0> [{name,riak_core_vnode_manager},{initial_call,{riak_core_vnode_manager,init,1}},{almost_current_function,{riak_core_util,pmap_collect_one,1}},{message_queue_len,1},{dictionary,[{'$initial_call',{riak_core_vnode_manager,init,1}},{'$ancestors',[riak_core_sup,<0.284.0>]}]}] [{timeout,53},{in,{gen_server,loop,7}},{out,{riak_core_util,pmap_collect_one,1}}]
2018-05-08 16:15:58.766 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.4666.0> [{initial_call,{riak_core_vnode,init,1}},{almost_current_function,{gen_fsm_compat,loop,8}},{message_queue_len,0},{dictionary,[{rand_seed,{#{bits => 58,jump => #Fun<rand.8.15449617>,next => #Fun<rand.5.15449617>,type => exrop,uniform => #Fun<rand.6.15449617>,uniform_n => #Fun<rand.7.15449617>,weak_low_bits => 1},[66293688590342551|104158784899781145]}},{'$initial_call',{riak_core_vnode,init,1}},{'$ancestors',[riak_core_vnode_sup,riak_core_sup,<0.284.0>]}]}] [{timeout,62},{in,{gen,do_call,4}},{out,{gen_fsm_compat,loop,8}}]
2018-05-08 16:16:00.988 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule #Port<0.718> name tcp_inet [{links,[<0.76.0>]},{id,5744},{connected,<0.76.0>},{input,78846},{output,60},{os_pid,undefined},{queue_size,0}] [{timeout,56},{port_op,dist_cmd}]
2018-05-08 16:16:06.235 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.53.0> [{name,file_server_2},{initial_call,{file_server,init,1}},{almost_current_function,{gen_server,loop,7}},{message_queue_len,0},{dictionary,[{'$initial_call',{file_server,init,1}},{'$ancestors',[kernel_sup,<0.36.0>]}]}] [{timeout,106},{in,{gen_server,loop,7}},{out,{prim_file,drv_get_response,1}}]
2018-05-08 16:16:43.290 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.53.0> [{name,file_server_2},{initial_call,{file_server,init,1}},{almost_current_function,{gen_server,loop,7}},{message_queue_len,0},{dictionary,[{'$initial_call',{file_server,init,1}},{'$ancestors',[kernel_sup,<0.36.0>]}]}] [{timeout,51},{in,{gen_server,loop,7}},{out,{prim_file,drv_get_response,1}}]
2018-05-08 16:16:57.105 [info] <0.284.0>@riak_core_app:stop:47 Stopped application riak_core.
2018-05-08 16:16:57.122 [info] <0.86.0> alarm_handler: {clear,system_memory_high_watermark}
2018-05-08 16:14:13.997 [info] <0.33.0> Application lager started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.003 [info] <0.33.0> Application types started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.005 [info] <0.33.0> Application acceptor_pool started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.020 [info] <0.33.0> Application asn1 started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.020 [info] <0.33.0> Application public_key started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.069 [info] <0.33.0> Application ssl started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.075 [info] <0.33.0> Application rand_compat started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.086 [info] <0.33.0> Application quickrand started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.087 [info] <0.33.0> Application uuid started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.094 [info] <0.188.0>@partisan_config:init:58 Using node name: 'node_2@127.0.0.1'
2018-05-08 16:14:14.104 [info] <0.190.0>@partisan_config:get_node_address:168 Resolving "127.0.0.1"...
2018-05-08 16:14:14.104 [info] <0.190.0>@partisan_config:get_node_address:171 Resolved "node_2@127.0.0.1" to {127,0,0,1}
2018-05-08 16:14:14.104 [info] <0.188.0>@partisan_config:get_node_address:188 Resolved "127.0.0.1" to {127,0,0,1}
2018-05-08 16:14:14.293 [info] <0.33.0> Application partisan started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.322 [info] <0.33.0> Application gen_fsm_compat started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.347 [info] <0.33.0> Application riak_sysmon started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.384 [info] <0.33.0> Application os_mon started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.399 [info] <0.33.0> Application basho_stats started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.404 [info] <0.33.0> Application eleveldb started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.406 [info] <0.33.0> Application pbkdf2 started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.407 [info] <0.33.0> Application poolboy started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.448 [info] <0.267.0>@exometer_report:do_start_reporters:613 Starting reporters with []
2018-05-08 16:14:14.448 [info] <0.33.0> Application exometer_core started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.483 [info] <0.33.0> Application clique started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.499 [info] <0.33.0> Application folsom started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.564 [info] <0.285.0>@riak_core_partisan_utils:configure_dispatch:145 Configuring partisan dispatch: true
2018-05-08 16:14:14.622 [warning] <0.301.0>@riak_core_ring_manager:reload_ring:380 No ring file available.
2018-05-08 16:14:14.817 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,vnode_routing} = proxy
2018-05-08 16:14:14.827 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,staged_joins} = true
2018-05-08 16:14:14.832 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,resizable_ring} = true
2018-05-08 16:14:14.835 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,fold_req_version} = v2
2018-05-08 16:14:14.839 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,security} = true
2018-05-08 16:14:14.843 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,bucket_types} = true
2018-05-08 16:14:14.851 [info] <0.318.0>@riak_core_capability:process_capability_changes:568 New capability: {riak_core,net_ticktime} = true
2018-05-08 16:14:14.974 [info] <0.33.0> Application riak_core started on node 'node_2@127.0.0.1'
2018-05-08 16:14:14.981 [info] <0.33.0> Application setup started on node 'node_2@127.0.0.1'
2018-05-08 16:14:21.856 [info] <0.1375.0>@riak_core_partisan_utils:join:100 Starting join from partisan utils from 'node_2@127.0.0.1' to 'node_1@127.0.0.1'
2018-05-08 16:14:22.888 [info] <0.1375.0>@riak_core_partisan_utils:join:106 Finishing join from 'node_2@127.0.0.1' to 'node_1@127.0.0.1'
2018-05-08 16:14:22.890 [info] <0.1375.0>@riak_core_partisan_utils:join:100 Starting join from partisan utils from 'node_2@127.0.0.1' to 'node_2@127.0.0.1'
2018-05-08 16:14:22.898 [info] <0.1375.0>@riak_core_partisan_utils:join:106 Finishing join from 'node_2@127.0.0.1' to 'node_2@127.0.0.1'
2018-05-08 16:14:25.841 [info] <0.301.0>@riak_core_gossip:log_node_changed:357 'node_2@127.0.0.1' changed from 'joining' to 'valid'
2018-05-08 16:14:37.471 [info] <0.301.0>@riak_core_gossip:log_node_added:360 'node_3@127.0.0.1' joined cluster with status 'joining'
2018-05-08 16:14:38.547 [info] <0.301.0>@riak_core_gossip:log_node_changed:357 'node_3@127.0.0.1' changed from 'joining' to 'valid'
2018-05-08 16:15:14.375 [info] <0.90.0> alarm_handler: {set,{system_memory_high_watermark,[]}}
2018-05-08 16:15:56.088 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.2683.0> [{initial_call,{riak_core_vnode,init,1}},{almost_current_function,{gen_fsm_compat,loop,8}},{message_queue_len,0},{dictionary,[{rand_seed,{#{bits => 58,jump => #Fun<rand.8.15449617>,next => #Fun<rand.5.15449617>,type => exrop,uniform => #Fun<rand.6.15449617>,uniform_n => #Fun<rand.7.15449617>,weak_low_bits => 1},[119060259614748250|250217001700325936]}},{'$initial_call',{riak_core_vnode,init,1}},{'$ancestors',[riak_core_vnode_sup,riak_core_sup,<0.284.0>]}]}] [{timeout,56},{in,undefined},{out,{gen_fsm_compat,loop,8}}]
2018-05-08 16:15:58.782 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.1554.0> [{initial_call,{riak_core_vnode,init,1}},{almost_current_function,{gen_fsm_compat,loop,8}},{message_queue_len,0},{dictionary,[{rand_seed,{#{bits => 58,jump => #Fun<rand.8.15449617>,next => #Fun<rand.5.15449617>,type => exrop,uniform => #Fun<rand.6.15449617>,uniform_n => #Fun<rand.7.15449617>,weak_low_bits => 1},[148281531231106023|287645769648315321]}},{'$initial_call',{riak_core_vnode,init,1}},{'$ancestors',[riak_core_vnode_sup,riak_core_sup,<0.284.0>]}]}] [{timeout,52},{in,{gen,do_call,4}},{out,{gen_fsm_compat,loop,8}}]
2018-05-08 16:16:55.866 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.300.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.284.0>]}]}] [{timeout,68},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:16:56.142 [info] <0.246.0>@riak_core_sysmon_handler:handle_event:92 monitor long_schedule <0.300.0> [{name,riak_core_ring_events},{initial_call,{gen_event,init_it,6}},{almost_current_function,{gen_event,fetch_msg,6}},{message_queue_len,0},{dictionary,[{'$initial_call',{gen_event,init_it,6}},{'$ancestors',[riak_core_sup,<0.284.0>]}]}] [{timeout,51},{in,{riak_core_ring_handler,'-maybe_stop_vnode_proxies/1-lc$^3/1-3-',2}},{out,{gen_event,fetch_msg,6}}]
2018-05-08 16:16:57.809 [warning] <0.4391.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 35986}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:16:58.836 [warning] <0.4392.0>@partisan_peer_service_client:init:72 unable to connect to #{channels => [broadcast,vnode,{monotonic,gossip}],listen_addrs => [#{ip => {127,0,0,1},port => 35986}],name => 'node_1@127.0.0.1',parallelism => 1} due to {error,econnrefused}
2018-05-08 16:16:59.109 [info] <0.284.0>@riak_core_app:stop:47 Stopped application riak_core.
2018-05-08 16:16:59.126 [info] <0.90.0> alarm_handler: {clear,system_memory_high_watermark}
===> Linking _build/default/plugins/rebar_erl_vsn to _build/test/plugins/rebar_erl_vsn
===> Linking _build/default/plugins/rebar3_proper to _build/test/plugins/rebar3_proper
===> Linking _build/default/plugins/rebar3_run to _build/test/plugins/rebar3_run
===> Verifying dependencies...
===> Fetching proper ({pkg,<<"proper">>,<<"1.2.0">>})
===> Version cached at /root/.cache/rebar3/hex/default/packages/proper-1.2.0.tar is up to date, reusing it
===> Fetching recon ({pkg,<<"recon">>,<<"2.3.5">>})
===> Downloaded package, caching at /root/.cache/rebar3/hex/default/packages/recon-2.3.5.tar
===> Linking _build/default/lib/exometer_core to _build/test/lib/exometer_core
===> Linking _build/default/lib/lager to _build/test/lib/lager
===> Linking _build/default/lib/lasp_bench to _build/test/lib/lasp_bench
===> Linking _build/default/lib/partisan to _build/test/lib/partisan
===> Linking _build/default/lib/pbkdf2 to _build/test/lib/pbkdf2
===> Linking _build/default/lib/poolboy to _build/test/lib/poolboy
===> Linking _build/default/lib/riak_core to _build/test/lib/riak_core
===> Linking _build/default/lib/riak_core_partisan_utils to _build/test/lib/riak_core_partisan_utils
===> Linking _build/default/lib/riak_ensemble to _build/test/lib/riak_ensemble
===> Linking _build/default/plugins/pc to _build/test/plugins/pc
===> Linking _build/default/lib/acceptor_pool to _build/test/lib/acceptor_pool
===> Linking _build/default/lib/basho_stats to _build/test/lib/basho_stats
===> Linking _build/default/lib/blume to _build/test/lib/blume
===> Linking _build/default/lib/chash to _build/test/lib/chash
===> Linking _build/default/lib/clique to _build/test/lib/clique
===> Linking _build/default/lib/cuttlefish to _build/test/lib/cuttlefish
===> Linking _build/default/lib/eleveldb to _build/test/lib/eleveldb
===> Linking _build/default/lib/folsom to _build/test/lib/folsom
===> Linking _build/default/lib/gen_fsm_compat to _build/test/lib/gen_fsm_compat
===> Linking _build/default/lib/getopt to _build/test/lib/getopt
===> Linking _build/default/lib/goldrush to _build/test/lib/goldrush
===> Linking _build/default/lib/jam to _build/test/lib/jam
===> Linking _build/default/lib/parse_trans to _build/test/lib/parse_trans
===> Linking _build/default/lib/rand_compat to _build/test/lib/rand_compat
===> Linking _build/default/lib/riak_sysmon to _build/test/lib/riak_sysmon
===> Linking _build/default/lib/setup to _build/test/lib/setup
===> Linking _build/default/lib/time_compat to _build/test/lib/time_compat
===> Linking _build/default/lib/types to _build/test/lib/types
===> Linking _build/default/lib/uuid to _build/test/lib/uuid
===> Linking _build/default/lib/bear to _build/test/lib/bear
===> Linking _build/default/lib/edown to _build/test/lib/edown
===> Linking _build/default/lib/meck to _build/test/lib/meck
===> Linking _build/default/lib/quickrand to _build/test/lib/quickrand
===> Compiling proper
make: 'include/compile_flags.hrl' is up to date.
/opt/unir/_build/test/lib/proper/src/proper_typeserver.erl:553: Warning: erlang:get_stacktrace/0 used following a 'try' expression may stop working in a future release. (Use it inside 'try'.)
/opt/unir/_build/test/lib/proper/src/proper_typeserver.erl:556: Warning: erlang:get_stacktrace/0 used following a 'try' expression may stop working in a future release. (Use it inside 'try'.)
_build/test/lib/proper/src/proper_typeserver.erl:553: Warning: erlang:get_stacktrace/0 used following a 'try' expression may stop working in a future release. (Use it inside 'try'.)
_build/test/lib/proper/src/proper_typeserver.erl:556: Warning: erlang:get_stacktrace/0 used following a 'try' expression may stop working in a future release. (Use it inside 'try'.)
===> Compiling quickrand
===> Compiling uuid
===> Compiling types
===> Compiling time_compat
===> Compiling rand_compat
===> Compiling acceptor_pool
===> Compiling partisan
===> Compiling riak_core_partisan_utils
===> Compiling eleveldb
make: Nothing to be done for 'all'.
make: Nothing to be done for 'tools'.
===> Compiling c_src/eleveldb.cc
===> Compiling c_src/refobjects.cc
===> Compiling c_src/workitems.cc
===> Linking priv/eleveldb.so
===> Compiling getopt
===> Compiling cuttlefish
===> Building escript...
===> Compiling riak_ensemble
===> Linking priv/riak_ensemble_drv.so
===> Compiling chash
===> Compiling basho_stats
===> Compiling edown
===> Compiling setup
/usr/local/bin/rebar skip_deps=true escriptize
==> setup (escriptize)
===> Compiling gen_fsm_compat
===> Compiling poolboy
===> Compiling pbkdf2
===> Compiling parse_trans
===> Compiling exometer_core
===> Compiling clique
===> Compiling blume
===> Compiling riak_core
===> Compiling lasp_bench
===> Compiling recon
===> Compiling unir
src/unir_get_fsm.erl:58: Warning: gen_fsm:start_link/3 is deprecated and will be removed in a future release; use gen_statem:start_link/3
src/unir_ping_fsm.erl:57: Warning: gen_fsm:start_link/3 is deprecated and will be removed in a future release; use gen_statem:start_link/3
src/unir_put_fsm.erl:59: Warning: gen_fsm:start_link/3 is deprecated and will be removed in a future release; use gen_statem:start_link/3
_build/test/lib/unir/test/support.erl:25: Warning: export_all flag enabled - all functions will be exported
_build/test/lib/unir/test/prop_unir.erl:11: Warning: export_all flag enabled - all functions will be exported
_build/test/lib/unir/test/functionality_SUITE.erl:36: Warning: export_all flag enabled - all functions will be exported
_build/test/lib/unir/test/throughput_SUITE.erl:36: Warning: export_all flag enabled - all functions will be exported
test/support.erl:25: Warning: export_all flag enabled - all functions will be exported
test/prop_unir.erl:11: Warning: export_all flag enabled - all functions will be exported
test/functionality_SUITE.erl:36: Warning: export_all flag enabled - all functions will be exported
test/throughput_SUITE.erl:36: Warning: export_all flag enabled - all functions will be exported
===> Testing prop_unir:prop_sequential()
16:14:07.614 [info] Using "node_1@127.0.0.1" as name, since running >= 20.0
16:14:07.614 [info] Starting node: 'node_1@127.0.0.1'
16:14:07.614 [info] Application lager started on node 'runner@127.0.0.1'
16:14:08.714 [info] Node started: 'node_1@127.0.0.1'
16:14:08.714 [info] Using "node_2@127.0.0.1" as name, since running >= 20.0
16:14:08.714 [info] Starting node: 'node_2@127.0.0.1'
16:14:09.767 [info] Node started: 'node_2@127.0.0.1'
16:14:09.767 [info] Using "node_3@127.0.0.1" as name, since running >= 20.0
16:14:09.767 [info] Starting node: 'node_3@127.0.0.1'
16:14:10.873 [info] Node started: 'node_3@127.0.0.1'
16:14:22.910 [info] Issuing normal join from 'node_2@127.0.0.1' to 'node_1@127.0.0.1': ok
16:14:29.020 [info] Issuing normal join from 'node_3@127.0.0.1' to 'node_1@127.0.0.1': ok
16:14:29.020 [info] Waiting for nodes to be ready...
16:14:29.020 [info] Waiting for node 'node_1@127.0.0.1' to be ready...
16:14:29.021 [info] Waiting for node 'node_2@127.0.0.1' to be ready...
16:14:29.022 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:14:30.025 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:14:31.027 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:14:32.029 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:14:33.031 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:14:34.034 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:14:35.036 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:14:36.039 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:14:37.042 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:14:38.044 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:14:39.047 [info] Waiting for node 'node_3@127.0.0.1' to be ready...
16:14:39.048 [info] All nodes ready!
16:14:39.048 [info] Waiting for ownership agreement...
16:16:36.035 [info] Waiting for handoff...
16:16:36.035 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:36.373 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:37.374 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:37.664 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:38.667 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:38.704 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:39.708 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:39.747 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:40.748 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:40.787 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:41.788 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:41.865 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:42.869 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:42.909 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:43.910 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:43.945 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:44.950 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:45.013 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:46.016 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:46.040 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:47.041 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:47.109 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:48.111 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:48.133 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:49.135 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:49.171 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:50.172 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:50.193 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:51.195 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:51.229 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:52.231 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:52.255 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:53.257 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:53.315 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:54.316 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:54.343 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:55.345 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:55.388 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [false,false,false]
16:16:56.389 [info] Wait until no pending changes on nodes ['node_1@127.0.0.1','node_2@127.0.0.1','node_3@127.0.0.1']
16:16:56.435 [info] -> BadNodes: [], length(Changes): 3, length(Nodes): 3, Changes: [true,true,true]
16:16:56.435 [info] No pending changes remain!
16:16:56.435 [info] Waiting for ring convergence...
16:16:56.550 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:16:56.550 [info] read_object: node node_2 key <<"key">>
16:16:56.843 [info] read_object: returned key <<"key">> value not_found
16:16:56.843 [info] read_object: object wasn't written yet, not_found OK
----------------------------------------------------
2018-05-08 16:16:56.987
Stopping node: node_1
Warning: ct_logs not started
Stopping node: node_1
16:16:56.990 [info] Using "node_1@127.0.0.1" as name, since running >= 20.0
----------------------------------------------------
2018-05-08 16:16:59.013
Stopping node: node_2
Warning: ct_logs not started
Stopping node: node_2
16:16:59.013 [info] Using "node_2@127.0.0.1" as name, since running >= 20.0
----------------------------------------------------
2018-05-08 16:17:01.026
Stopping node: node_3
Warning: ct_logs not started
Stopping node: node_3
16:17:01.027 [info] Using "node_3@127.0.0.1" as name, since running >= 20.0
.
16:17:03.039 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:17:03.074 [info] Using "node_1@127.0.0.1" as name, since running >= 20.0
16:17:03.074 [info] Starting node: 'node_1@127.0.0.1'
!
Failed: After 2 test(s).
An exception was raised: exit:{test_case_failed,{error,boot_timeout,'node_1@127.0.0.1'}}.
Stacktrace: [{lists,map,2,[{file,"lists.erl"},{line,1239}]},
{support,start,3,
[{file,"/opt/unir/test/support.erl"},{line,117}]},
{prop_unir,start_nodes,0,
[{file,"/opt/unir/test/prop_unir.erl"},{line,307}]},
{prop_unir,'-prop_sequential/0-fun-2-',1,
[{file,"/opt/unir/test/prop_unir.erl"},{line,32}]},
{proper,apply_args,3,
[{file,"/opt/unir/_build/test/lib/proper/src/proper.erl"},
{line,1353}]},
{proper,perform,7,
[{file,"/opt/unir/_build/test/lib/proper/src/proper.erl"},
{line,1146}]},
{proper,inner_test,2,
[{file,"/opt/unir/_build/test/lib/proper/src/proper.erl"},
{line,1028}]}].
[{set,{var,1},{call,prop_unir,write_object,[node_3,<<107,101,121>>,<<>>]}}]
Shrinking
16:17:06.129 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:17:06.133 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:17:06.139 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:17:06.145 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
16:17:06.161 [info] initial_state: nodes [node_1,node_2,node_3] joined_nodes [node_1,node_2,node_3]
(0 time(s))
[{set,{var,1},{call,prop_unir,write_object,[node_3,<<107,101,121>>,<<>>]}}]
===>
0/1 properties passed, 1 failed
===> Failed test cases:
prop_unir:prop_sequential() -> false
{prop_unir,prop_sequential,
[[{set,{var,1},
{call,prop_unir,write_object,[node_3,<<"key">>,<<>>]}}]]}.
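The failing exit is `{test_case_failed,{error,boot_timeout,'node_1@127.0.0.1'}}`, raised from `support:start/3` while restarting the peer nodes between PropEr test cases. Assuming `support:start/3` boots peers with `ct_slave` (a guess; the gist doesn't show that module), the boot timeout is configurable and the default is only a few seconds, so a slow restart after the previous teardown could plausibly trip it:

```erlang
%% Hypothetical sketch: raising ct_slave's boot timeout when starting
%% a peer node. boot_timeout is in seconds (default 3); the host and
%% node names here mirror the ones in the log.
{ok, Node} = ct_slave:start('127.0.0.1', node_1,
                            [{boot_timeout, 30},
                             {monitor_master, true}]).
```

Note also that the shrunk counterexample (`write_object` on `node_3`) is almost certainly incidental: the exception comes from node startup inside the property's setup, not from the command itself.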