@SnabbBot
Created February 26, 2016 09:06
Host: Linux davos 3.13.0-46-generic x86_64 Intel(R) Xeon(R) CPU E5-2603 v2 @ 1.80GHz
Image: eugeneia/snabb-nfv-test
Pull Request: #736
Current Head: 0d6470d9a1e3954ccf35077cd28d76f87a5ac730
Pull Request Head: 2012bd6606ade368e6e8f05335e15389a23b6005
SNABB_PCI0=0000:84:00.0 Ethernet controller: Intel Corporation 82599ES 10-Gigabit SFI/SFP+ Network Connection (rev 01)
SNABB_PCI1=0000:84:00.1 Ethernet controller: Intel Corporation 82599ES 10-Gigabit SFI/SFP+ Network Connection (rev 01)
SNABB_PCI_INTEL0=0000:84:00.0 Ethernet controller: Intel Corporation 82599ES 10-Gigabit SFI/SFP+ Network Connection (rev 01)
SNABB_PCI_INTEL1=0000:84:00.1 Ethernet controller: Intel Corporation 82599ES 10-Gigabit SFI/SFP+ Network Connection (rev 01)
Checking for performance regressions:
BENCH basic1-100e6 -> 0.990424 of 14.62 (SD: 0.0979796 )
BENCH packetblaster-64 -> 0.997166 of 10.586 (SD: 0.026533 )
BENCH packetblaster-synth-64 -> 0.998877 of 10.682 (SD: 0.0193907 )
BENCH snabbnfv-iperf-1500 -> 1.04683 of 5.466 (SD: 0.331095 )
BENCH snabbnfv-iperf-jumbo -> 0.997417 of 6.194 (SD: 0.1201 )
BENCH snabbnfv-loadgen-dpdk -> 0.990965 of 2.6784 (SD: 0.0321969 )
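The `BENCH` lines above can be read as `<name> -> <score> of <mean> (SD: <sd>)`, where the score is the current run relative to the stored baseline. A minimal parser for that format (the field layout is inferred from this log; the actual regression threshold SnabbBot applies is not shown here):

```python
import re

# Assumed line format: "BENCH <name> -> <score> of <mean> (SD: <sd> )"
BENCH_RE = re.compile(r'BENCH (\S+) -> ([\d.]+) of ([\d.]+) \(SD: ([\d.]+) \)')

def parse_bench(line):
    """Return (name, score, mean, sd) parsed from one BENCH line."""
    name, score, mean, sd = BENCH_RE.match(line).groups()
    return name, float(score), float(mean), float(sd)
```

A score near 1.0, as in every line above, indicates performance on par with the baseline.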
Checking test suite:
DIR testlog
TEST program.snabbnfv.nfvconfig
TEST program.snabbnfv.neutron2snabb.neutron2snabb
TEST program.snabbnfv.neutron2snabb.neutron2snabb_schema
TEST program.l2fwd.l2fwd
TEST lib.hash.murmur
TEST lib.hardware.pci
TEST lib.protocol.datagram
TEST lib.protocol.ipv6
TEST lib.protocol.tcp
TEST lib.protocol.ipv4
TEST lib.ipc.shmem.shmem
TEST lib.traceprof.traceprof
TEST lib.checksum
TEST lib.bloom_filter
TEST lib.pmu
SKIPPED testlog/lib.pmu
TEST core.app
TEST core.counter
TEST core.lib
TEST core.timer
TEST core.link
TEST core.shm
TEST core.main
TEST core.memory
TEST apps.virtio_net.virtio_net
SKIPPED testlog/apps.virtio_net.virtio_net
TEST apps.ipv6.nd_light
TEST apps.rate_limiter.rate_limiter
TEST apps.keyed_ipv6_tunnel.tunnel
TEST apps.vhost.vhost_user
SKIPPED testlog/apps.vhost.vhost_user
TEST apps.socket.raw
TEST apps.tap.tap
SKIPPED testlog/apps.tap.tap
TEST apps.intel.intel1g
SKIPPED testlog/apps.intel.intel1g
TEST apps.intel.intel_app
TEST apps.packet_filter.pcap_filter
TEST apps.bridge.mac_table
TEST apps.vpn.vpws
TEST apps.test.synth
TEST program/packetblaster/selftest.sh
TEST program/snabbnfv/selftest.sh
ERROR testlog/program.snabbnfv.selftest.sh
ERROR during tests:
src/testlog/apps.bridge.mac_table:
Feb 26 2016 09:05:07 mac_table: resizing from 512 to 4096 hash buckets, new target size 2048 (1295 MAC entries, old target size 256, size/bucket overflow: true/true)
src/testlog/apps.intel.intel1g:
selftest: Intel1g
SNABB_SELFTEST_INTEL1G_0 not set
EXITCODE: 43
src/testlog/apps.intel.intel_app:
selftest: intel_app
100 VF initializations:
Running iterated VMDq test...
test # 1: VMDq VLAN=101; 100ms burst. packet sent: 484,755
test # 2: VMDq VLAN=102; 100ms burst. packet sent: 1,065,135
test # 3: VMDq VLAN=103; 100ms burst. packet sent: 1,645,005
test # 4: VMDq VLAN=104; 100ms burst. packet sent: 2,231,250
test # 5: VMDq VLAN=105; 100ms burst. packet sent: 2,815,200
test # 6: VMDq VLAN=106; 100ms burst. packet sent: 3,398,385
test # 7: VMDq VLAN=107; 100ms burst. packet sent: 3,980,295
test # 8: VMDq VLAN=108; 100ms burst. packet sent: 4,560,930
test # 9: VMDq VLAN=109; 100ms burst. packet sent: 5,138,250
test # 10: VMDq VLAN=110; 100ms burst. packet sent: 5,709,960
test # 11: VMDq VLAN=111; 100ms burst. packet sent: 6,284,985
test # 12: VMDq VLAN=112; 100ms burst. packet sent: 6,857,205
test # 13: VMDq VLAN=113; 100ms burst. packet sent: 7,429,170
test # 14: VMDq VLAN=114; 100ms burst. packet sent: 8,003,175
test # 15: VMDq VLAN=115; 100ms burst. packet sent: 8,573,100
test # 16: VMDq VLAN=116; 100ms burst. packet sent: 9,146,340
test # 17: VMDq VLAN=117; 100ms burst. packet sent: 9,714,480
test # 18: VMDq VLAN=118; 100ms burst. packet sent: 10,284,405
test # 19: VMDq VLAN=119; 100ms burst. packet sent: 10,859,175
test # 20: VMDq VLAN=120; 100ms burst. packet sent: 11,432,925
test # 21: VMDq VLAN=121; 100ms burst. packet sent: 12,005,400
test # 22: VMDq VLAN=122; 100ms burst. packet sent: 12,577,365
test # 23: VMDq VLAN=123; 100ms burst. packet sent: 13,148,055
test # 24: VMDq VLAN=124; 100ms burst. packet sent: 13,717,470
test # 25: VMDq VLAN=125; 100ms burst. packet sent: 14,283,825
test # 26: VMDq VLAN=126; 100ms burst. packet sent: 14,849,670
test # 27: VMDq VLAN=127; 100ms burst. packet sent: 15,415,005
test # 28: VMDq VLAN=128; 100ms burst. packet sent: 15,979,830
test # 29: VMDq VLAN=129; 100ms burst. packet sent: 16,543,890
test # 30: VMDq VLAN=130; 100ms burst. packet sent: 17,105,910
test # 31: VMDq VLAN=131; 100ms burst. packet sent: 17,669,205
test # 32: VMDq VLAN=132; 100ms burst. packet sent: 18,235,560
test # 33: VMDq VLAN=133; 100ms burst. packet sent: 18,796,815
test # 34: VMDq VLAN=134; 100ms burst. packet sent: 19,357,815
test # 35: VMDq VLAN=135; 100ms burst. packet sent: 19,918,305
test # 36: VMDq VLAN=136; 100ms burst. packet sent: 20,475,735
test # 37: VMDq VLAN=137; 100ms burst. packet sent: 21,031,635
test # 38: VMDq VLAN=138; 100ms burst. packet sent: 21,587,280
test # 39: VMDq VLAN=139; 100ms burst. packet sent: 22,143,180
test # 40: VMDq VLAN=140; 100ms burst. packet sent: 22,698,060
test # 41: VMDq VLAN=141; 100ms burst. packet sent: 23,250,900
test # 42: VMDq VLAN=142; 100ms burst. packet sent: 23,803,995
test # 43: VMDq VLAN=143; 100ms burst. packet sent: 24,355,815
test # 44: VMDq VLAN=144; 100ms burst. packet sent: 24,907,125
test # 45: VMDq VLAN=145; 100ms burst. packet sent: 25,458,180
test # 46: VMDq VLAN=146; 100ms burst. packet sent: 26,006,175
test # 47: VMDq VLAN=147; 100ms burst. packet sent: 26,553,915
test # 48: VMDq VLAN=148; 100ms burst. packet sent: 27,101,400
test # 49: VMDq VLAN=149; 100ms burst. packet sent: 27,648,885
test # 50: VMDq VLAN=150; 100ms burst. packet sent: 28,192,545
test # 51: VMDq VLAN=151; 100ms burst. packet sent: 28,737,735
test # 52: VMDq VLAN=152; 100ms burst. packet sent: 29,288,025
test # 53: VMDq VLAN=153; 100ms burst. packet sent: 29,840,355
test # 54: VMDq VLAN=154; 100ms burst. packet sent: 30,392,430
test # 55: VMDq VLAN=155; 100ms burst. packet sent: 30,943,485
test # 56: VMDq VLAN=156; 100ms burst. packet sent: 31,494,030
test # 57: VMDq VLAN=157; 100ms burst. packet sent: 32,043,810
test # 58: VMDq VLAN=158; 100ms burst. packet sent: 32,592,315
test # 59: VMDq VLAN=159; 100ms burst. packet sent: 33,140,820
test # 60: VMDq VLAN=160; 100ms burst. packet sent: 33,690,600
test # 61: VMDq VLAN=161; 100ms burst. packet sent: 34,239,360
test # 62: VMDq VLAN=162; 100ms burst. packet sent: 34,787,100
test # 63: VMDq VLAN=163; 100ms burst. packet sent: 35,334,585
test # 64: VMDq VLAN=164; 100ms burst. packet sent: 35,885,895
test # 65: VMDq VLAN=165; 100ms burst. packet sent: 36,431,850
test # 66: VMDq VLAN=166; 100ms burst. packet sent: 36,976,530
test # 67: VMDq VLAN=167; 100ms burst. packet sent: 37,519,170
test # 68: VMDq VLAN=168; 100ms burst. packet sent: 38,062,320
test # 69: VMDq VLAN=169; 100ms burst. packet sent: 38,604,705
test # 70: VMDq VLAN=170; 100ms burst. packet sent: 39,147,090
test # 71: VMDq VLAN=171; 100ms burst. packet sent: 39,688,455
test # 72: VMDq VLAN=172; 100ms burst. packet sent: 40,226,760
test # 73: VMDq VLAN=173; 100ms burst. packet sent: 40,765,575
test # 74: VMDq VLAN=174; 100ms burst. packet sent: 41,298,780
test # 75: VMDq VLAN=175; 100ms burst. packet sent: 41,833,515
test # 76: VMDq VLAN=176; 100ms burst. packet sent: 42,368,250
test # 77: VMDq VLAN=177; 100ms burst. packet sent: 42,902,730
test # 78: VMDq VLAN=178; 100ms burst. packet sent: 43,436,190
test # 79: VMDq VLAN=179; 100ms burst. packet sent: 43,968,885
test # 80: VMDq VLAN=180; 100ms burst. packet sent: 44,500,305
test # 81: VMDq VLAN=181; 100ms burst. packet sent: 45,031,215
test # 82: VMDq VLAN=182; 100ms burst. packet sent: 45,561,360
test # 83: VMDq VLAN=183; 100ms burst. packet sent: 46,096,095
test # 84: VMDq VLAN=184; 100ms burst. packet sent: 46,634,145
test # 85: VMDq VLAN=185; 100ms burst. packet sent: 47,171,685
test # 86: VMDq VLAN=186; 100ms burst. packet sent: 47,709,225
test # 87: VMDq VLAN=187; 100ms burst. packet sent: 48,246,765
test # 88: VMDq VLAN=188; 100ms burst. packet sent: 48,783,540
test # 89: VMDq VLAN=189; 100ms burst. packet sent: 49,319,805
test # 90: VMDq VLAN=190; 100ms burst. packet sent: 49,856,325
test # 91: VMDq VLAN=191; 100ms burst. packet sent: 50,392,080
test # 92: VMDq VLAN=192; 100ms burst. packet sent: 50,926,560
test # 93: VMDq VLAN=193; 100ms burst. packet sent: 51,461,040
test # 94: VMDq VLAN=194; 100ms burst. packet sent: 51,997,560
test # 95: VMDq VLAN=195; 100ms burst. packet sent: 52,531,020
test # 96: VMDq VLAN=196; 100ms burst. packet sent: 53,064,735
test # 97: VMDq VLAN=197; 100ms burst. packet sent: 53,595,390
test # 98: VMDq VLAN=198; 100ms burst. packet sent: 54,126,300
test # 99: VMDq VLAN=199; 100ms burst. packet sent: 54,656,445
test #100: VMDq VLAN=200; 100ms burst. packet sent: 55,186,590
0000:84:00.0: avg wait_lu: 210, max redos: 0, avg: 0
100 PF full cycles
Running iterated VMDq test...
test # 1: VMDq VLAN=101; 100ms burst. packet sent: 575,790
test # 2: VMDq VLAN=102; 100ms burst. packet sent: 569,925
test # 3: VMDq VLAN=103; 100ms burst. packet sent: 569,925
test # 4: VMDq VLAN=104; 100ms burst. packet sent: 570,435
test # 5: VMDq VLAN=105; 100ms burst. packet sent: 574,515
test # 6: VMDq VLAN=106; 100ms burst. packet sent: 479,145
test # 7: VMDq VLAN=107; 100ms burst. packet sent: 569,670
test # 8: VMDq VLAN=108; 100ms burst. packet sent: 555,900
test # 9: VMDq VLAN=109; 100ms burst. packet sent: 579,360
test # 10: VMDq VLAN=110; 100ms burst. packet sent: 578,085
test # 11: VMDq VLAN=111; 100ms burst. packet sent: 583,185
test # 12: VMDq VLAN=112; 100ms burst. packet sent: 568,395
test # 13: VMDq VLAN=113; 100ms burst. packet sent: 575,790
test # 14: VMDq VLAN=114; 100ms burst. packet sent: 571,200
test # 15: VMDq VLAN=115; 100ms burst. packet sent: 570,435
test # 16: VMDq VLAN=116; 100ms burst. packet sent: 570,180
test # 17: VMDq VLAN=117; 100ms burst. packet sent: 574,515
test # 18: VMDq VLAN=118; 100ms burst. packet sent: 567,885
test # 19: VMDq VLAN=119; 100ms burst. packet sent: 580,380
test # 20: VMDq VLAN=120; 100ms burst. packet sent: 576,555
test # 21: VMDq VLAN=121; 100ms burst. packet sent: 579,870
test # 22: VMDq VLAN=122; 100ms burst. packet sent: 392,445
test # 23: VMDq VLAN=123; 100ms burst. packet sent: 552,585
test # 24: VMDq VLAN=124; 100ms burst. packet sent: 566,100
test # 25: VMDq VLAN=125; 100ms burst. packet sent: 583,440
test # 26: VMDq VLAN=126; 100ms burst. packet sent: 577,320
test # 27: VMDq VLAN=127; 100ms burst. packet sent: 580,125
test # 28: VMDq VLAN=128; 100ms burst. packet sent: 565,590
test # 29: VMDq VLAN=129; 100ms burst. packet sent: 586,500
test # 30: VMDq VLAN=130; 100ms burst. packet sent: 584,970
test # 31: VMDq VLAN=131; 100ms burst. packet sent: 572,985
test # 32: VMDq VLAN=132; 100ms burst. packet sent: 585,735
test # 33: VMDq VLAN=133; 100ms burst. packet sent: 572,220
test # 34: VMDq VLAN=134; 100ms burst. packet sent: 563,550
test # 35: VMDq VLAN=135; 100ms burst. packet sent: 574,770
test # 36: VMDq VLAN=136; 100ms burst. packet sent: 583,185
test # 37: VMDq VLAN=137; 100ms burst. packet sent: 566,100
test # 38: VMDq VLAN=138; 100ms burst. packet sent: 579,105
test # 39: VMDq VLAN=139; 100ms burst. packet sent: 576,045
test # 40: VMDq VLAN=140; 100ms burst. packet sent: 561,255
test # 41: VMDq VLAN=141; 100ms burst. packet sent: 571,710
test # 42: VMDq VLAN=142; 100ms burst. packet sent: 565,335
test # 43: VMDq VLAN=143; 100ms burst. packet sent: 571,455
test # 44: VMDq VLAN=144; 100ms burst. packet sent: 563,040
test # 45: VMDq VLAN=145; 100ms burst. packet sent: 576,300
test # 46: VMDq VLAN=146; 100ms burst. packet sent: 575,535
test # 47: VMDq VLAN=147; 100ms burst. packet sent: 564,060
test # 48: VMDq VLAN=148; 100ms burst. packet sent: 552,330
test # 49: VMDq VLAN=149; 100ms burst. packet sent: 547,230
test # 50: VMDq VLAN=150; 100ms burst. packet sent: 575,280
test # 51: VMDq VLAN=151; 100ms burst. packet sent: 579,105
test # 52: VMDq VLAN=152; 100ms burst. packet sent: 564,825
test # 53: VMDq VLAN=153; 100ms burst. packet sent: 207,825
test # 54: VMDq VLAN=154; 100ms burst. packet sent: 592,620
test # 55: VMDq VLAN=155; 100ms burst. packet sent: 570,180
test # 56: VMDq VLAN=156; 100ms burst. packet sent: 570,945
test # 57: VMDq VLAN=157; 100ms burst. packet sent: 573,750
test # 58: VMDq VLAN=158; 100ms burst. packet sent: 576,300
test # 59: VMDq VLAN=159; 100ms burst. packet sent: 570,690
test # 60: VMDq VLAN=160; 100ms burst. packet sent: 558,195
test # 61: VMDq VLAN=161; 100ms burst. packet sent: 594,150
test # 62: VMDq VLAN=162; 100ms burst. packet sent: 579,360
test # 63: VMDq VLAN=163; 100ms burst. packet sent: 577,575
test # 64: VMDq VLAN=164; 100ms burst. packet sent: 561,255
test # 65: VMDq VLAN=165; 100ms burst. packet sent: 557,685
test # 66: VMDq VLAN=166; 100ms burst. packet sent: 568,140
test # 67: VMDq VLAN=167; 100ms burst. packet sent: 562,785
test # 68: VMDq VLAN=168; 100ms burst. packet sent: 565,845
test # 69: VMDq VLAN=169; 100ms burst. packet sent: 564,060
test # 70: VMDq VLAN=170; 100ms burst. packet sent: 565,335
test # 71: VMDq VLAN=171; 100ms burst. packet sent: 555,645
test # 72: VMDq VLAN=172; 100ms burst. packet sent: 575,025
test # 73: VMDq VLAN=173; 100ms burst. packet sent: 590,325
test # 74: VMDq VLAN=174; 100ms burst. packet sent: 581,145
test # 75: VMDq VLAN=175; 100ms burst. packet sent: 562,785
test # 76: VMDq VLAN=176; 100ms burst. packet sent: 568,650
test # 77: VMDq VLAN=177; 100ms burst. packet sent: 576,810
test # 78: VMDq VLAN=178; 100ms burst. packet sent: 570,945
test # 79: VMDq VLAN=179; 100ms burst. packet sent: 570,945
test # 80: VMDq VLAN=180; 100ms burst. packet sent: 574,005
test # 81: VMDq VLAN=181; 100ms burst. packet sent: 571,965
test # 82: VMDq VLAN=182; 100ms burst. packet sent: 576,300
test # 83: VMDq VLAN=183; 100ms burst. packet sent: 561,510
test # 84: VMDq VLAN=184; 100ms burst. packet sent: 570,945
test # 85: VMDq VLAN=185; 100ms burst. packet sent: 568,650
test # 86: VMDq VLAN=186; 100ms burst. packet sent: 580,635
test # 87: VMDq VLAN=187; 100ms burst. packet sent: 569,925
test # 88: VMDq VLAN=188; 100ms burst. packet sent: 583,695
test # 89: VMDq VLAN=189; 100ms burst. packet sent: 573,495
test # 90: VMDq VLAN=190; 100ms burst. packet sent: 587,520
test # 91: VMDq VLAN=191; 100ms burst. packet sent: 542,640
test # 92: VMDq VLAN=192; 100ms burst. packet sent: 584,205
test # 93: VMDq VLAN=193; 100ms burst. packet sent: 585,735
test # 94: VMDq VLAN=194; 100ms burst. packet sent: 584,205
test # 95: VMDq VLAN=195; 100ms burst. packet sent: 571,710
test # 96: VMDq VLAN=196; 100ms burst. packet sent: 576,300
test # 97: VMDq VLAN=197; 100ms burst. packet sent: 581,145
test # 98: VMDq VLAN=198; 100ms burst. packet sent: 574,515
test # 99: VMDq VLAN=199; 100ms burst. packet sent: 587,520
test #100: VMDq VLAN=200; 100ms burst. packet sent: 581,655
0000:84:00.0: avg wait_lu: 134.75, max redos: 0, avg: 0
-------
Send a bunch of packets from Am0
half of them go to nicAm1 and half go nowhere
link report:
0 sent on nicAm0.tx -> sink_ms.in1 (loss rate: 0%)
2,929,385 sent on nicAm1.tx -> sink_ms.in2 (loss rate: 0%)
5,859,390 sent on repeater_ms.output -> nicAm0.rx (loss rate: 0%)
2 sent on source_ms.out -> repeater_ms.input (loss rate: 0%)
-------
Transmitting bidirectionally between nicA and nicB
link report:
950,400 sent on nicA.tx -> sink.in1 (loss rate: 0%)
950,400 sent on nicB.tx -> sink.in2 (loss rate: 0%)
1,893,885 sent on source1.out -> nicA.rx (loss rate: 0%)
1,893,885 sent on source2.out -> nicB.rx (loss rate: 0%)
-------
Send traffic from a nicA (SF) to nicB (two VFs)
The packets should arrive evenly split between the VFs
link report:
0 sent on nicAs.tx -> sink_ms.in1 (loss rate: 0%)
2,108,678 sent on nicBm0.tx -> sink_ms.in2 (loss rate: 0%)
2,108,678 sent on nicBm1.tx -> sink_ms.in3 (loss rate: 0%)
4,217,955 sent on repeater_ms.output -> nicAs.rx (loss rate: 0%)
2 sent on source_ms.out -> repeater_ms.input (loss rate: 0%)
selftest: ok
src/testlog/apps.ipv6.nd_light:
Feb 26 2016 09:04:10 nd_light: Sending neighbor solicitation for next-hop 2001:db8::1
Feb 26 2016 09:04:10 nd_light: Sending neighbor solicitation for next-hop 2001:db8::2
Feb 26 2016 09:04:10 nd_light: Resolved next-hop 2001:db8::1 to 00:00:00:00:00:01
Feb 26 2016 09:04:10 nd_light: Resolved next-hop 2001:db8::2 to 00:00:00:00:00:02
Feb 26 2016 09:04:11 nd_light: Sending neighbor solicitation for next-hop 2001:db8::1
Feb 26 2016 09:04:11 nd_light: Sending neighbor solicitation for next-hop 2001:db8::2
src/testlog/apps.keyed_ipv6_tunnel.tunnel:
Keyed IPv6 tunnel selftest
run simple one second benchmark ...
load: time: 1.00s fps: 12,411,385 fpGbps: 6.851 fpb: 255 bpp: 60 sleep: 0 us
selftest passed
src/testlog/apps.packet_filter.pcap_filter:
selftest: pcap_filter
Run for 1 second (stateful = false)...
link report:
237,806 sent on pcap_filter.output -> sink.input (loss rate: 0%)
6,381,120 sent on repeater.output -> pcap_filter.input (loss rate: 0%)
161 sent on source.output -> repeater.input (loss rate: 0%)
ok: accepted 3.7267% of inputs (within tolerance)
Run for 1 second (stateful = true)...
link report:
303,263 sent on pcap_filter.output -> sink.input (loss rate: 0%)
4,068,780 sent on repeater.output -> pcap_filter.input (loss rate: 0%)
161 sent on source.output -> repeater.input (loss rate: 0%)
ok: accepted 7.4534% of inputs (within tolerance)
selftest: ok
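The pcap_filter acceptance percentages above are simply the ratio of packets leaving the filter to packets entering it. A sketch of that check (function names are illustrative, not from the Snabb source):

```python
def accepted_pct(sent_out, sent_in):
    """Percentage of input packets that passed the filter."""
    return 100.0 * sent_out / sent_in

def within_tolerance(pct, expected, tol=1.0):
    """Assumed tolerance check: accept if within +/- tol percentage points."""
    return abs(pct - expected) <= tol
```

For the stateless run above: 237,806 of 6,381,120 packets passed, i.e. about 3.7267%.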
src/testlog/apps.rate_limiter.rate_limiter:
Rate limiter selftest
test effective rate, non-busy loop
load: time: 1.00s fps: 2,326,295 fpGbps: 1.284 fpb: 255 bpp: 60 sleep: 0 us
load: time: 1.00s fps: 2,325,152 fpGbps: 1.283 fpb: 255 bpp: 60 sleep: 0 us
load: time: 1.00s fps: 2,325,609 fpGbps: 1.284 fpb: 255 bpp: 60 sleep: 0 us
load: time: 1.00s fps: 2,328,764 fpGbps: 1.285 fpb: 255 bpp: 60 sleep: 0 us
load: time: 0.00s fps: NaN fpGbps: nan fpb: NaN bpp: - sleep: 0 us
configured rate is 200000 bytes per second
effective rate is 209973 bytes per second
measure throughput on heavy load...
load: time: 1.00s fps: 23,405,068 fpGbps: 12.920 fpb: 255 bpp: 60 sleep: 0 us
elapsed time 1.093915023 seconds
packets received 25500000 23 Mpps
configured rate is 1200000000 bytes per second
effective rate is 1308117242 bytes per second
throughput is 21 Mpps
selftest passed
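The rate limiter selftest compares a configured byte rate against the measured effective rate (209,973 vs 200,000 B/s above, a ~5% overshoot it accepts). A token-bucket sketch of the general technique, purely illustrative and not the Snabb implementation:

```python
class TokenBucket:
    """Classic token bucket: tokens refill at `rate` bytes/s, capped at `burst`."""
    def __init__(self, rate, burst):
        self.rate = rate
        self.burst = burst
        self.tokens = burst

    def advance(self, dt):
        # Refill proportionally to elapsed time, never exceeding the burst cap.
        self.tokens = min(self.burst, self.tokens + self.rate * dt)

    def allow(self, nbytes):
        # Forward the packet only if enough tokens remain.
        if self.tokens >= nbytes:
            self.tokens -= nbytes
            return True
        return False
```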
src/testlog/apps.socket.raw:
selftest passed
src/testlog/apps.tap.tap:
EXITCODE: 43
src/testlog/apps.test.synth:
link report:
255 sent on synth.output -> writer.input (loss rate: 66%)
src/testlog/apps.vhost.vhost_user:
selftest: vhost_user
SNABB_TEST_VHOST_USER_SOCKET was not set
Test skipped
EXITCODE: 43
src/testlog/apps.virtio_net.virtio_net:
SNABB_TEST_VIRTIO_PCIDEV was not set
Test skipped
EXITCODE: 43
src/testlog/apps.vpn.vpws:
src/testlog/core.app:
Restarting app2 (died at 3957850.075187: core/app.lua:462: Push error.)
Restarting app1 (died at 3957850.075187: core/app.lua:456: Pull error.)
Restarting app2 (died at 3957852.075222: core/app.lua:462: Push error.)
Restarting app1 (died at 3957852.075222: core/app.lua:456: Pull error.)
Restarting app3 (died at 3957854.075091: core/app.lua:468: Report error.)
Restarting app2 (died at 3957854.075329: core/app.lua:462: Push error.)
Restarting app1 (died at 3957854.075329: core/app.lua:456: Pull error.)
selftest: app
empty -> c1
c1 -> c1
c1 -> c2
c2 -> c1
c1 -> empty
c_fail
apps report:
app3
app2 [dead: core/app.lua:462: Push error.]
app1 [dead: core/app.lua:456: Pull error.]
apps report:
app3
app2 [dead: core/app.lua:462: Push error.]
app1 [dead: core/app.lua:456: Pull error.]
OK
src/testlog/core.counter:
selftest: core.counter
selftest ok
src/testlog/core.lib:
selftest: lib
Testing equal
Testing load_string
Testing load/store_conf
Testing csum
Testing hex(un)dump
src/testlog/core.link:
selftest: link
selftest OK
src/testlog/core.main:
selftest
src/testlog/core.memory:
selftest: memory
Kernel vm.nr_hugepages: 10240
Allocating a 2MB HugeTLB: Got 2MB
Physical address: 0x000266a00000
Virtual address: 0x500266a00000
Allocating a 2MB HugeTLB: Got 2MB
Physical address: 0x000266800000
Virtual address: 0x500266800000
Allocating a 2MB HugeTLB: Got 2MB
Physical address: 0x000266600000
Virtual address: 0x500266600000
Allocating a 2MB HugeTLB: Got 2MB
Physical address: 0x000266400000
Virtual address: 0x500266400000
Kernel vm.nr_hugepages: 10240
HugeTLB page allocation OK.
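In each allocation pair above, the virtual address is the physical address with a fixed `0x5000_0000_0000` tag in the upper bits (an observation drawn from the four pairs this selftest printed; the mapping scheme itself lives in Snabb's core.memory):

```python
# Assumed fixed tag, inferred from the physical/virtual pairs in this log.
TAG = 0x500000000000

def virt_from_phys(phys):
    """Map a DMA page's physical address to its tagged virtual address."""
    return phys | TAG
```

This makes virtual-to-physical translation for DMA a simple mask rather than a page-table walk.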
src/testlog/core.shm:
selftest: shm
checking paths..
checking shared memory..
create obj
checking many objects..
10000 objects created
10000 objects unmapped
selftest ok
src/testlog/core.timer:
selftest: timer
ok (973,855 callbacks in 0.3045 seconds)
src/testlog/lib.bloom_filter:
src/testlog/lib.checksum:
selftest: checksum
no avx2
sse2: 1000/1000
selftest: tcp/ipv4
selftest: ok
src/testlog/lib.hardware.pci:
selftest: pci
pciaddress model interface status driver usable
01:00.0 Intel 350 - - apps.intel.intel1g yes
01:00.1 Intel 350 - - apps.intel.intel1g yes
03:00.0 Intel 82599 SFP - - apps.intel.intel_app yes
03:00.1 Intel 82599 SFP - - apps.intel.intel_app yes
09:00.0 Intel 350 - - apps.intel.intel1g yes
09:00.1 Intel 350 - - apps.intel.intel1g yes
81:00.0 Intel 82574L - - apps.intel.intel_app yes
82:00.0 Intel 82599 SFP - - apps.intel.intel_app yes
82:00.1 Intel 82599 SFP - - apps.intel.intel_app yes
84:00.0 Intel 82599 SFP - - apps.intel.intel_app yes
84:00.1 Intel 82599 SFP - - apps.intel.intel_app yes
86:00.0 Intel 82599 SFP - - apps.intel.intel_app yes
86:00.1 Intel 82599 SFP - - apps.intel.intel_app yes
89:00.0 Intel 82599 T3 - - apps.intel.intel_app yes
89:00.1 Intel 82599 T3 - - apps.intel.intel_app yes
src/testlog/lib.hash.murmur:
Selftest hash MurmurHash3_x86_32
Passed
Selftest hash MurmurHash3_x64_128
Passed
src/testlog/lib.ipc.shmem.shmem:
ok
src/testlog/lib.pmu:
selftest: pmu
PMU not available:
single core cpu affinity required
selftest skipped
EXITCODE: 43
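The `EXITCODE: 43` entries throughout this log pair with `SKIPPED` lines; a minimal sketch of that convention (assuming status 43 is the agreed "skipped" code, e.g. when a required `SNABB_*` environment variable is unset, with 0 as pass and anything else a failure):

```python
def classify(exit_code):
    """Map a selftest exit status to the marker SnabbBot appears to print."""
    if exit_code == 0:
        return "TEST"
    if exit_code == 43:
        return "SKIPPED"
    return "ERROR"
```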
src/testlog/lib.protocol.datagram:
src/testlog/lib.protocol.ipv4:
src/testlog/lib.protocol.ipv6:
src/testlog/lib.protocol.tcp:
src/testlog/lib.traceprof.traceprof:
traceprof report (recorded 333/333 samples):
40% TRACE 4:LOOP ->loop
33% TRACE 4 ->loop
26% TRACE 5 ->4
src/testlog/program.l2fwd.l2fwd:
selftest: l2fwd
OK
src/testlog/program.packetblaster.selftest.sh:
selftest: packetblaster
selftest: ok
src/testlog/program.snabbnfv.neutron2snabb.neutron2snabb:
selftest: neutron2snabb
ok: {{direction='ingress', ethertype='IPv6'}}
=> ip6
ok: {{direction='ingress', ethertype='IPv4'}}
=> (arp or ip)
ok: {{direction='ingress', ethertype='IPv4', protocol='tcp'}}
=> (arp or (ip and tcp))
ok: {{direction='ingress', ethertype='IPv4', protocol='udp'}}
=> (arp or (ip and udp))
ok: {{direction='ingress', ethertype='IPv4', protocol='udp', port_range_min=1000}}
=> (arp or (ip and udp and dst portrange 1000-1000))
ok: {{direction='ingress', ethertype='IPv4', protocol='udp', port_range_max=2000}}
=> (arp or (ip and udp and dst portrange 2000-2000))
ok: {{direction='ingress', ethertype='IPv4', protocol='tcp', port_range_min=1000, port_range_max=2000}}
=> (arp or (ip and tcp and dst portrange 1000-2000))
ok: {{direction='ingress', ethertype='IPv6', protocol='tcp'}, {direction='ingress', ethertype='IPv4', protocol='udp', remote_ip_prefix='10.0.0.0/8'}}
=> ((ip6 and tcp) or (arp or (ip and udp and src net 10.0.0.0/8)))
selftest ok
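The `ok:` pairs above show Neutron security-group rules being compiled to pcap filter expressions. A simplified Python sketch of that translation, covering only the cases in this log (the real neutron2snabb logic is Lua and handles more):

```python
def rule_to_pcap(rule):
    """Translate one ingress rule dict to a pcap expression (simplified)."""
    inner = ['ip6' if rule['ethertype'] == 'IPv6' else 'ip']
    if rule.get('protocol'):
        inner.append(rule['protocol'])
    lo, hi = rule.get('port_range_min'), rule.get('port_range_max')
    if lo or hi:
        # A single bound yields a degenerate range, e.g. 1000-1000.
        inner.append('dst portrange %d-%d' % (lo or hi, hi or lo))
    expr = inner[0] if len(inner) == 1 else '(%s)' % ' and '.join(inner)
    if rule['ethertype'] == 'IPv4':
        expr = '(arp or %s)' % expr  # IPv4 rules also admit ARP
    return expr
```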
src/testlog/program.snabbnfv.neutron2snabb.neutron2snabb_schema:
selftest: neutron2snabb_schema
selftest: ok
src/testlog/program.snabbnfv.nfvconfig:
selftest: lib.nfv.config
testing: program/snabbnfv/test_fixtures/nfvconfig/switch_nic/x
engine: start app id0_Virtio
engine: start app id0_NIC
testing: program/snabbnfv/test_fixtures/nfvconfig/switch_filter/x
engine: start app id0_Filter_in
load: time: 0.30s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
testing: program/snabbnfv/test_fixtures/nfvconfig/switch_qos/x
engine: stop app id0_Filter_in
load: time: 0.26s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
testing: program/snabbnfv/test_fixtures/nfvconfig/switch_tunnel/x
engine: start app id0_Tunnel
engine: start app id0_ND
Feb 26 2016 09:03:57 nd_light: Sending neighbor solicitation for next-hop 2::2
load: time: 0.27s fps: 3 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
testing: program/snabbnfv/test_fixtures/nfvconfig/scale_up/y
engine: stop app id0_Tunnel
engine: stop app id0_ND
engine: start app id1_Virtio
engine: start app id1_NIC
load: time: 0.26s fps: 7 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
testing: program/snabbnfv/test_fixtures/nfvconfig/scale_up/x
engine: start app id63_Virtio
engine: start app id48_NIC
engine: start app id10_NIC
engine: start app id52_Virtio
engine: start app id6_NIC
engine: start app id45_NIC
engine: start app id34_NIC
engine: start app id35_NIC
engine: start app id40_NIC
engine: start app id61_Virtio
engine: start app id61_NIC
engine: start app id60_Virtio
engine: start app id49_Virtio
engine: start app id59_Virtio
engine: start app id12_NIC
engine: start app id42_NIC
engine: start app id38_NIC
engine: start app id24_NIC
engine: start app id58_Virtio
engine: start app id39_Virtio
engine: start app id51_NIC
engine: start app id57_Virtio
engine: start app id46_NIC
engine: start app id3_NIC
engine: start app id56_NIC
engine: start app id55_Virtio
engine: start app id45_Virtio
engine: start app id28_NIC
engine: start app id25_NIC
engine: start app id15_Virtio
engine: start app id41_NIC
engine: start app id35_Virtio
engine: start app id25_Virtio
engine: start app id54_Virtio
engine: start app id36_NIC
engine: start app id42_Virtio
engine: start app id32_NIC
engine: start app id18_NIC
engine: start app id50_Virtio
engine: start app id62_Virtio
engine: start app id12_Virtio
engine: start app id52_NIC
engine: start app id32_Virtio
engine: start app id22_Virtio
engine: start app id51_Virtio
engine: start app id53_NIC
engine: start app id60_NIC
engine: start app id8_NIC
engine: start app id31_NIC
engine: start app id33_NIC
engine: start app id37_Virtio
engine: start app id27_Virtio
engine: start app id20_Virtio
engine: start app id30_Virtio
engine: start app id50_NIC
engine: start app id10_Virtio
engine: start app id53_Virtio
engine: start app id47_Virtio
engine: start app id23_NIC
engine: start app id43_NIC
engine: start app id13_Virtio
engine: start app id24_Virtio
engine: start app id14_Virtio
engine: start app id21_NIC
engine: start app id7_Virtio
engine: start app id6_Virtio
engine: start app id5_Virtio
engine: start app id63_NIC
engine: start app id21_Virtio
engine: start app id31_Virtio
engine: start app id62_NIC
engine: start app id11_Virtio
engine: start app id59_NIC
engine: start app id13_NIC
engine: start app id41_Virtio
engine: start app id57_NIC
engine: start app id44_Virtio
engine: start app id9_Virtio
engine: start app id8_Virtio
engine: start app id44_NIC
engine: start app id43_Virtio
engine: start app id49_NIC
engine: start app id4_Virtio
engine: start app id4_NIC
engine: start app id2_Virtio
engine: start app id40_Virtio
engine: start app id47_NIC
engine: start app id58_NIC
engine: start app id33_Virtio
engine: start app id23_Virtio
engine: start app id34_Virtio
engine: start app id29_NIC
engine: start app id37_NIC
engine: start app id54_NIC
engine: start app id36_Virtio
engine: start app id26_NIC
engine: start app id16_Virtio
engine: start app id39_NIC
engine: start app id38_Virtio
engine: start app id46_Virtio
engine: start app id56_Virtio
engine: start app id28_Virtio
engine: start app id27_NIC
engine: start app id48_Virtio
engine: start app id30_NIC
engine: start app id16_NIC
engine: start app id5_NIC
engine: start app id19_NIC
engine: start app id17_Virtio
engine: start app id22_NIC
engine: start app id15_NIC
engine: start app id18_Virtio
engine: start app id20_NIC
engine: start app id2_NIC
engine: start app id9_NIC
engine: start app id19_Virtio
engine: start app id29_Virtio
engine: start app id17_NIC
engine: start app id11_NIC
engine: start app id55_NIC
engine: start app id7_NIC
engine: start app id14_NIC
engine: start app id3_Virtio
engine: start app id26_Virtio
load: time: 0.38s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
testing: program/snabbnfv/test_fixtures/nfvconfig/scale_change/x
load: time: 0.26s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
testing: program/snabbnfv/test_fixtures/nfvconfig/scale_change/y
engine: reconfig app id0_NIC
engine: reconfig app id3_NIC
engine: reconfig app id1_NIC
engine: reconfig app id2_NIC
load: time: 0.26s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
src/testlog/program.snabbnfv.selftest.sh:
Defaulting to SNABB_TELNET0=5000
Defaulting to SNABB_TELNET1=5001
USING program/snabbnfv/test_fixtures/nfvconfig/test_functions/other_vlan.ports
Defaulting to MAC=52:54:00:00:00:
Defaulting to IP=fe80::5054:ff:fe00:
Defaulting to GUEST_MEM=512
Defaulting to HUGETLBFS=/hugetlbfs
Defaulting to QUEUES=1
Defaulting to QEMU=/root/.test_env/qemu/obj/x86_64-softmmu/qemu-system-x86_64
Waiting for VM listening on telnet port 5000 to get ready... [OK]
Waiting for VM listening on telnet port 5001 to get ready... [OK]
USING program/snabbnfv/test_fixtures/nfvconfig/test_functions/same_vlan.ports
1 packets transmitted, 1 received, 0% packet loss, time 0ms
PING succeeded.
[ 3] 0.0-10.0 sec 5.37 GBytes 4.59 Gbits/sec
IPERF succeeded.
1 packets transmitted, 1 received, 0% packet loss, time 0ms
JUMBOPING succeeded.
[ 3] 0.0-10.0 sec 6.67 GBytes 5.73 Gbits/sec
IPERF succeeded.
tx-checksumming: on
TX-CHECKSUMMING succeeded.
tx-checksumming: on
TX-CHECKSUMMING succeeded.
USING program/snabbnfv/test_fixtures/nfvconfig/test_functions/tx_rate_limit.ports
Trying ::1...
Connected to localhost.
Escape character is '^]'.
ping6 -c 1 fe80::5054:ff:fe00:0001%eth0
PING fe80::5054:ff:fe00:0001%eth0(fe80::5054:ff:fe00:1) 56 data bytes
Connection closed by foreign host.
PING failed.
qemu0.log:
QEMU waiting for connection on: disconnected:unix:vhost_A.sock,server
qemu-system-x86_64: -netdev type=vhost-user,id=net0,chardev=char0: chardev "char0" went up
WARNING: Image format was not specified for '/root/.test_env/qemu0.img' and probing guessed raw.
Automatically detecting the format is dangerous for raw images, write operations on block 0 will be restricted.
Specify the 'raw' format explicitly to remove the restrictions.
QEMU 2.4.0 monitor - type 'help' for more information
(qemu)
(process:397): GLib-CRITICAL **: g_io_channel_unref: assertion 'channel != NULL' failed
qemu1.log:
QEMU waiting for connection on: disconnected:unix:vhost_B.sock,server
qemu-system-x86_64: -netdev type=vhost-user,id=net0,chardev=char0: chardev "char0" went up
WARNING: Image format was not specified for '/root/.test_env/qemu1.img' and probing guessed raw.
Automatically detecting the format is dangerous for raw images, write operations on block 0 will be restricted.
Specify the 'raw' format explicitly to remove the restrictions.
QEMU 2.4.0 monitor - type 'help' for more information
(qemu)
snabb0.log:
snabbnfv traffic starting
Loading /tmp/snabb_nfv_selftest_ports.344
engine: start app A_NIC
engine: start app B_NIC
engine: start app B_Virtio
engine: start app A_Virtio
Get features 0x18428001
VIRTIO_F_ANY_LAYOUT VIRTIO_NET_F_MQ VIRTIO_NET_F_CTRL_VQ VIRTIO_NET_F_MRG_RXBUF VIRTIO_RING_F_INDIRECT_DESC VIRTIO_NET_F_CSUM
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
Get features 0x18428001
VIRTIO_F_ANY_LAYOUT VIRTIO_NET_F_MQ VIRTIO_NET_F_CTRL_VQ VIRTIO_NET_F_MRG_RXBUF VIRTIO_RING_F_INDIRECT_DESC VIRTIO_NET_F_CSUM
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
vhost_user: Caching features (0x18028001) in /tmp/vhost_features_vhost_A.sock
Set features 0x18028001
VIRTIO_F_ANY_LAYOUT VIRTIO_NET_F_CTRL_VQ VIRTIO_NET_F_MRG_RXBUF VIRTIO_RING_F_INDIRECT_DESC VIRTIO_NET_F_CSUM
rxavail = 0 rxused = 0
rxavail = 0 rxused = 0
vhost_user: Connected and initialized: vhost_A.sock
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
vhost_user: Caching features (0x18028001) in /tmp/vhost_features_vhost_B.sock
Set features 0x18028001
VIRTIO_F_ANY_LAYOUT VIRTIO_NET_F_CTRL_VQ VIRTIO_NET_F_MRG_RXBUF VIRTIO_RING_F_INDIRECT_DESC VIRTIO_NET_F_CSUM
rxavail = 0 rxused = 0
rxavail = 0 rxused = 0
vhost_user: Connected and initialized: vhost_B.sock
load: time: 1.00s fps: 3 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
load: time: 1.00s fps: 4 fpGbps: 0.000 fpb: 0 bpp: 83 sleep: 100 us
load: time: 1.00s fps: 4 fpGbps: 0.000 fpb: 0 bpp: 85 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 1 fpGbps: 0.000 fpb: 0 bpp: 70 sleep: 100 us
load: time: 1.00s fps: 1 fpGbps: 0.000 fpb: 0 bpp: 70 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: 70 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: 70 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
Loading /tmp/snabb_nfv_selftest_ports.344
engine: reconfig app A_NIC
engine: reconfig app B_NIC
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 7 fpGbps: 0.000 fpb: 0 bpp: 102 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 4 fpGbps: 0.000 fpb: 0 bpp: 82 sleep: 100 us
load: time: 1.00s fps: 8,132 fpGbps: 0.064 fpb: 1 bpp: 978 sleep: 1 us
load: time: 1.00s fps: 1,067,629 fpGbps: 10.390 fpb: 280 bpp: 1207 sleep: 0 us
load: time: 1.00s fps: 1,102,937 fpGbps: 10.776 fpb: 308 bpp: 1212 sleep: 0 us
load: time: 1.00s fps: 966,126 fpGbps: 9.187 fpb: 271 bpp: 1179 sleep: 0 us
load: time: 1.00s fps: 1,078,115 fpGbps: 10.607 fpb: 271 bpp: 1220 sleep: 0 us
load: time: 1.00s fps: 876,822 fpGbps: 8.543 fpb: 204 bpp: 1208 sleep: 0 us
load: time: 1.00s fps: 1,088,939 fpGbps: 10.683 fpb: 294 bpp: 1217 sleep: 0 us
load: time: 1.00s fps: 1,104,378 fpGbps: 10.819 fpb: 311 bpp: 1215 sleep: 0 us
load: time: 1.00s fps: 993,483 fpGbps: 9.624 fpb: 246 bpp: 1201 sleep: 0 us
load: time: 1.00s fps: 1,079,774 fpGbps: 10.635 fpb: 262 bpp: 1222 sleep: 0 us
load: time: 1.00s fps: 900,762 fpGbps: 8.743 fpb: 217 bpp: 1204 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 3 fpGbps: 0.000 fpb: 0 bpp: 9062 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 3 fpGbps: 0.000 fpb: 0 bpp: 82 sleep: 100 us
load: time: 1.00s fps: 8,625 fpGbps: 0.419 fpb: 1 bpp: 6067 sleep: 0 us
load: time: 1.00s fps: 245,398 fpGbps: 12.226 fpb: 21 bpp: 6218 sleep: 0 us
load: time: 1.00s fps: 199,679 fpGbps: 9.916 fpb: 17 bpp: 6198 sleep: 1 us
load: time: 1.00s fps: 254,121 fpGbps: 12.637 fpb: 24 bpp: 6207 sleep: 0 us
load: time: 1.00s fps: 253,654 fpGbps: 12.581 fpb: 24 bpp: 6190 sleep: 0 us
load: time: 1.00s fps: 226,101 fpGbps: 11.215 fpb: 21 bpp: 6191 sleep: 0 us
load: time: 1.00s fps: 256,611 fpGbps: 12.727 fpb: 25 bpp: 6190 sleep: 0 us
load: time: 1.00s fps: 239,616 fpGbps: 11.889 fpb: 24 bpp: 6193 sleep: 100 us
load: time: 1.00s fps: 229,623 fpGbps: 11.374 fpb: 24 bpp: 6182 sleep: 0 us
load: time: 1.00s fps: 253,972 fpGbps: 12.626 fpb: 24 bpp: 6205 sleep: 1 us
load: time: 1.00s fps: 183,227 fpGbps: 9.081 fpb: 19 bpp: 6186 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
Loading /tmp/snabb_nfv_selftest_ports.344
engine: start app A_TxLimit
engine: start app B_TxLimit
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 2 fpGbps: 0.000 fpb: 0 bpp: 118 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
EXITCODE: 1
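A note on the `Get features 0x18428001` / `Set features 0x18028001` lines in snabb0.log above: these hex masks are virtio feature bitmaps, and the flag names printed next to them correspond to individual bits. A minimal decoding sketch, assuming the bit positions from the virtio specification (the mapping below is an assumption of this sketch, not taken from the log):

```python
# Decode a virtio feature bitmask like the "Get features 0x18428001"
# lines in the log above. Bit positions follow the virtio spec and are
# an assumption of this sketch.
FEATURE_BITS = {
    0:  "VIRTIO_NET_F_CSUM",
    15: "VIRTIO_NET_F_MRG_RXBUF",
    17: "VIRTIO_NET_F_CTRL_VQ",
    22: "VIRTIO_NET_F_MQ",
    27: "VIRTIO_F_ANY_LAYOUT",
    28: "VIRTIO_RING_F_INDIRECT_DESC",
}

def decode_features(mask):
    """Return the names of the known feature bits set in `mask`."""
    return [name for bit, name in sorted(FEATURE_BITS.items())
            if mask & (1 << bit)]

print(decode_features(0x18428001))  # mask offered in "Get features"
print(decode_features(0x18028001))  # mask cached in "Set features"
```

Decoding both masks shows that the difference between the offered mask (`0x18428001`) and the negotiated one (`0x18028001`) is exactly bit 22, i.e. `VIRTIO_NET_F_MQ` was dropped during negotiation, which matches the flag lists printed in the log.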