Brian Long (brianlong)
brianlong / gist:ad07be56202b2e3bd4c5e6bbee17cc26
Created August 7, 2022 18:03
Solana Quic Status 2022-08-07
The 81 nodes listed below are not ready for QUIC; some of them are running v1.10.32+ but still have configuration problems. Active stake is shown in lamports.
See https://github.com/rpcpool/triton-one-rpc-debug-scripts/tree/main/rust for the program used to create this output.
IP Address | Identity | Version | Active Stake | QUIC OK?
----------------+----------------------------------------------+---------+----------------------+---------
35.164.26.16 | krakeNd6ednDPEXxHAmoBs1qKVM8kLg79PvWF2mhXV1 | 1.10.31 | 5689282260813854 | timed out
5.9.65.10 | VymDdiepH77edNcNcKBKtRUb3gbQPtPyGh5NLcWaynj | 1.10.34 | 3928499805934152 | timed out
145.40.106.49 | Fd7btgySsrjuo25CJCj7oE7VPMyezDhnx7pZkj2v69Nk | 1.10.31 | 3302546620325862 | timed out
104.238.222.158 | GZNnph4EvmyjjL5uzF9xNNTHyV46RzbkW4w4HYU8BQCW | 1.10.34 | 1409218929873249 | timed out
The 107 nodes listed below are not ready for QUIC; some of them are running v1.10.32+ but still have configuration problems.
See https://github.com/rpcpool/triton-one-rpc-debug-scripts/tree/main/rust for the program used to create this output.
IP Address | Identity | Version | Active Stake | QUIC OK?
----------------+----------------------------------------------+---------+----------------------+---------
35.164.26.16 | krakeNd6ednDPEXxHAmoBs1qKVM8kLg79PvWF2mhXV1 | 1.10.31 | 5689282263096734 | timed out
5.62.126.196 | Bx7SNaHfyeLZDeja9HPsn61XYjeForHtrTxLCb6cu62o | 1.10.29 | 4763221343500111 | timed out
85.190.150.166 | Certusm1sa411sMpV9FPqU5dXAYhmmhygvxJ23S6hJ24 | 1.10.31 | 4501105946372643 | timed out
5.9.65.10 | VymDdiepH77edNcNcKBKtRUb3gbQPtPyGh5NLcWaynj | 1.10.34 | 3928499808217032 | timed out
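The linked Rust program probes each node's QUIC TPU port directly. As a rough, hedged approximation from the CLI alone, the sketch below simply lists staked validators with their reported software version so that pre-v1.10.32 nodes stand out; the JSON field names (identityPubkey, version, activatedStake) are assumptions about the `solana validators --output json` layout.

# Rough sketch (assumed JSON field names): staked validators with their reported
# version, highest stake first, so nodes still below v1.10.32 are easy to spot.
solana validators --url https://api.mainnet-beta.solana.com --output json \
  | jq -r '.validators[]
           | select(.activatedStake > 0)
           | [.identityPubkey, (.version // "unknown"), .activatedStake]
           | @tsv' \
  | sort -t$'\t' -k3,3nr \
  | head -n 25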
brianlong / gist:fc0a6db0eaba5f66ea605cb69ed5b9a9
Created June 2, 2021 14:50
Solana Gossip slot=80900153
Jun 1 06:26:32 solana-production-lax chrt[27102]: [2021-06-01T06:26:32.996449667Z INFO solana_core::cluster_info]
Jun 1 06:26:32 solana-production-lax chrt[27102]: IP Address |Age(ms)| Node identifier | Version |Gossip| TPU |TPUfwd| TVU |TVUfwd|Repair|ServeR|ShredVer
Jun 1 06:26:32 solana-production-lax chrt[27102]: ------------------+-------+----------------------------------------------+---------+------+------+------+------+------+------+------+--------
Jun 1 06:26:32 solana-production-lax chrt[27102]: 37.10.126.90 me| 3150 | 8CXhjBBfF6zegLTc81fFTaqvJNVNjv61Q14nwzmR6SJd | 1.6.10 | 8001 | 8004 | 8005 | 8002 | 8003 | 8007 | 8008 | 13490
Jun 1 06:26:32 solana-production-lax chrt[27102]: 168.119.212.88 | 541 | 4JryygoiM1j324fYkeBzcQDcwRfd2WpgkEzUePFj1rJY | 1.6.10 | 8001 | 8004 | 8005 | 8002 | 8003 | 8007 | 8008 | 13490
Jun 1 06:26:32 solana-production-lax chrt[27102]: 109.238.11.33 | 535 | 8vnrJNMDERZRwWgUMSqwLyYHEPkQQg3Ww6BM9AH4u
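The table above is written to syslog by solana_core::cluster_info, so it can be pulled back out with grep; `solana gossip` prints a similar live view of peers. A hedged sketch (the log path and service tag are taken from the lines above and will differ per host):

# Grab the gossip table entries that cluster_info wrote to syslog.
sudo grep 'solana_core::cluster_info' /var/log/syslog | tail -n 200
# A similar live view of gossip peers (IP, identity, ports, version) from the CLI.
solana gossip --url https://api.mainnet-beta.solana.com | head -n 20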
$ sudo cat /var/log/syslog | grep -B 50 -A 1500 'fork:81046360' | grep replay_stage
Jun 2 06:27:08 solana-production-lax solana-validator[32716]: [2021-06-02T06:27:08.876644586Z INFO solana_core::replay_stage] new fork:81046359 parent:81046358 root:81046291
Jun 2 06:27:09 solana-production-lax solana-validator[32716]: [2021-06-02T06:27:09.194384482Z INFO solana_core::replay_stage] bank frozen: 81046359
Jun 2 06:27:09 solana-production-lax solana-validator[32716]: [2021-06-02T06:27:09.209850727Z INFO solana_core::replay_stage] 9GJmEHGom9eWo4np4L5vC6b6ri1Df2xN8KFoWixvD1Bs slot_weight: 81046359 0 0 81046358
Jun 2 06:27:09 solana-production-lax solana-validator[32716]: [2021-06-02T06:27:09.209956060Z INFO solana_core::replay_stage] voting: 81046359 0
Jun 2 06:27:09 solana-production-lax solana-validator[32716]: [2021-06-02T06:27:09.212213310Z INFO solana_core::replay_stage] vote bank: Some((81046359, SameFork)) reset bank: 81046359
Jun 2 06:27:09 solana-production-lax solana-validator[32716]: [2021-06-
brianlong / gist:5d7822a1827bc016dc8d1a6f479c171f
Last active April 8, 2021 19:23
Solana Skipped Slot Investigation 72677732-72677735
Given the slot sequence below, I am DDnA... and I skipped all four of my slots:
`% solana block-production -v | grep -B 8 -A 8 DDnA`
72677728 5XKJwdKB2Hs7pkEXzifAysjSk6q7Rt6k5KfHwmAMPtoQ
72677729 5XKJwdKB2Hs7pkEXzifAysjSk6q7Rt6k5KfHwmAMPtoQ
72677730 5XKJwdKB2Hs7pkEXzifAysjSk6q7Rt6k5KfHwmAMPtoQ
72677731 5XKJwdKB2Hs7pkEXzifAysjSk6q7Rt6k5KfHwmAMPtoQ
72677732 DDnAqxJVFo2GVTujibHt5cjevHMSE9bo8HJaydHoshdp SKIPPED
72677733 DDnAqxJVFo2GVTujibHt5cjevHMSE9bo8HJaydHoshdp SKIPPED
72677734 DDnAqxJVFo2GVTujibHt5cjevHMSE9bo8HJaydHoshdp SKIPPED
Looking at this sequence of slots, I am DDnA... and I skipped 4 slots:
--
66728400 Hqc2qT3vXvBSKozmz7Rd7gLF6jUJHuQEGBTnCHzEBnqk
66728401 Hqc2qT3vXvBSKozmz7Rd7gLF6jUJHuQEGBTnCHzEBnqk
66728402 Hqc2qT3vXvBSKozmz7Rd7gLF6jUJHuQEGBTnCHzEBnqk
66728403 Hqc2qT3vXvBSKozmz7Rd7gLF6jUJHuQEGBTnCHzEBnqk
66728404 XkCriyrNwS3G4rzAXtG5B1nnvb5Ka1JtCku93VqeKAr
66728405 XkCriyrNwS3G4rzAXtG5B1nnvb5Ka1JtCku93VqeKAr
66728406 XkCriyrNwS3G4rzAXtG5B1nnvb5Ka1JtCku93VqeKAr
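To turn verbose output like the sequences above into a per-leader skip count, a small awk tally is enough. This is a hedged sketch that assumes the three-column layout shown here (slot, leader identity, optional SKIPPED marker):

# Count SKIPPED slots per leader identity in `solana block-production -v` output.
solana block-production -v \
  | awk '$3 == "SKIPPED" { skipped[$2]++ } END { for (id in skipped) print skipped[id], id }' \
  | sort -rn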
# In the log segment below, my node skipped votes for slots 63390772-63390775. After voting for 63390771, I see some
# repair activity. Does this mean that I did not receive all of the required shreds and was forced to pause for repair?
# Is this caused by network propagation errors, or are there other possible causes related to my node? Look for
# `voting: 63390771` and scan down from there to see the details.
solana@solana-production:~$ cat DDnAqxJVFo2GVTujibHt5cjevHMSE9bo8HJaydHoshdp.log | grep -B 50 -A 1000 'voting: 63390771'
[2021-02-01T01:58:26.321387225Z INFO solana_metrics::metrics] datapoint: tower-vote latest=63390769i root=63390738i
[2021-02-01T01:58:26.331460497Z INFO solana_metrics::counter] COUNTER:{"name": "tower_save-ms", "counts": 9001272, "samples": 1047000, "now": 1612144706331, "events": 10}
[2021-02-01T01:58:26.332045365Z INFO solana_metrics::metrics] datapoint: purge_slots_time remove_storages_elapsed=0i drop_storage_entries_elapsed=0i num_slots_removed=0i total_removed_storage_entries=0i total_r
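One way to narrow down the questions above is to keep only the vote and repair lines from the same capture and check whether repair activity starts right after the last successful vote. A hedged sketch ('repair' is a loose pattern; the exact module names, e.g. solana_core::repair_service, vary by release):

# Keep only voting and repair-related lines after the last good vote.
cat DDnAqxJVFo2GVTujibHt5cjevHMSE9bo8HJaydHoshdp.log \
  | grep -A 1000 'voting: 63390771' \
  | grep -Ei 'voting:|repair' \
  | head -n 100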
solana@solana-tds:~$ cat 71bhKKL89U3dNHzuZVZ7KarqV6XtHEgjXjvJTsguD11B.log | grep -B 20 -A 140 '52731816 leader: 71bhKKL89U3dNHzuZVZ7KarqV6XtHEgjXjvJTsguD11B'
[2020-12-14T23:57:00.036427361Z INFO solana_metrics::metrics] datapoint: cluster_info_stats entrypoint=0i entrypoint2=0i push_vote_read=89i vote_process_push=0i get_votes=16496i get_accounts_hash=0i all_tvu_peers=0i tvu_peers=12926i new_push_requests_num=4451i table_size=64020i purged_values_size=110146i failed_inserts_size=3376i
[2020-12-14T23:57:00.036445975Z INFO solana_metrics::metrics] datapoint: cluster_info_stats2 gossip_packets_dropped_count=0i retransmit_peers=6642i repair_peers=62801i new_push_requests=14082i new_push_requests2=830i purge=53267i process_gossip_packets_time=1288881i handle_batch_ping_messages_time=509i handle_batch_pong_messages_time=36i handle_batch_prune_messages_time=5395i handle_batch_pull_requests_time=516126i handle_batch_pull_responses_time=44671i handle_batch_push_messages_time=322092i process_pull_resp=25838i filter_p
solana@solana-tds:~$ cat 71bhKKL89U3dNHzuZVZ7KarqV6XtHEgjXjvJTsguD11B.log | grep -B 20 -A 140 '52742164 leader: 71bhKKL89U3dNHzuZVZ7KarqV6XtHEgjXjvJTsguD11B'
[2020-12-15T01:14:53.673326033Z WARN solana_metrics::counter] COUNTER:{"name": "poh_recorder-record_ms", "counts": 1037622, "samples": 1034000, "now": 1607994893673, "events": 0}
[2020-12-15T01:14:53.673757860Z INFO solana_metrics::metrics] datapoint: best_slot slot=52742163i best_slot=52742163i
[2020-12-15T01:14:53.673776904Z INFO solana_metrics::metrics] datapoint: bank_weight slot=52742163i weight="0"
[2020-12-15T01:14:53.673759069Z INFO solana_core::replay_stage] 71bhKKL89U3dNHzuZVZ7KarqV6XtHEgjXjvJTsguD11B slot_weight: 52742163 0 0 52742162
[2020-12-15T01:14:53.673813457Z INFO solana_core::replay_stage] validator fork confirmed 52742161 1660ms
[2020-12-15T01:14:53.673828085Z INFO solana_core::replay_stage] validator fork confirmed 52742162 929ms
[2020-12-15T01:14:53.673830074Z INFO solana_metrics::metrics] datapoint: validator-confirmation d
stack backtrace:
[2020-09-10T12:24:05.948229058Z INFO solana_metrics::counter] COUNTER:{"name": "bank-process_transactions-txs", "counts": 66418645, "samples": 72530000, "now": 1599740645948, "events": 1}
[2020-09-10T12:24:05.948260339Z INFO solana_metrics::counter] COUNTER:{"name": "bank-process_transactions-sigs", "counts": 75578998, "samples": 72530000, "now": 1599740645948, "events": 1}
[2020-09-10T12:24:05.949498353Z INFO solana_ledger::blockstore] slot 35012034 is full, last: 145
[2020-09-10T12:24:05.951303326Z INFO solana_ledger::blockstore] slot 35012035 is full, last: 137
[2020-09-10T12:24:05.951395939Z INFO solana_core::replay_stage] bank frozen: 35012033
[2020-09-10T12:24:05.951405651Z INFO solana_metrics::metrics] datapoint: replay-slot-stats slot=35012033i fetch_entries_time=1123i fetch_entries_fail_time=0i entry_poh_verification_time=57677i entry_transaction_verification_time=9141i replay_time=479786i replay_total_elapsed=553054i total_entries=349i total_shreds=164i
[2020-09-10T12:24:05.