@SnabbBot
Created April 26, 2016 14:50
Host: Linux davos 3.13.0-46-generic x86_64 Intel(R) Xeon(R) CPU E5-2603 v2 @ 1.80GHz
Image: eugeneia/snabb-nfv-test
Pull Request: #892
Target Head: 0b58c44e09c321325f54c7774f093e0e5b45a9a5
Pull Request Head: 39c36405bfb876b0e1838a2d3c0eb4f5fabf48a6
SNABB_PCI0=0000:84:00.0 Ethernet controller: Intel Corporation 82599ES 10-Gigabit SFI/SFP+ Network Connection (rev 01)
SNABB_PCI1=0000:84:00.1 Ethernet controller: Intel Corporation 82599ES 10-Gigabit SFI/SFP+ Network Connection (rev 01)
SNABB_PCI_INTEL0=0000:84:00.0 Ethernet controller: Intel Corporation 82599ES 10-Gigabit SFI/SFP+ Network Connection (rev 01)
SNABB_PCI_INTEL1=0000:84:00.1 Ethernet controller: Intel Corporation 82599ES 10-Gigabit SFI/SFP+ Network Connection (rev 01)
Checking for performance regressions:
BENCH basic1-100e6 -> 1.00564 of 14.18 (SD: 0.04 )
BENCH packetblaster-64 -> 0.992584 of 10.518 (SD: 0.0365513 )
BENCH packetblaster-synth-64 -> 1.00547 of 10.602 (SD: 0.0462169 )
BENCH snabbnfv-iperf-1500 -> 1.06477 of 4.91 (SD: 0.623506 )
BENCH snabbnfv-iperf-jumbo -> 1.03536 of 6.222 (SD: 0.228333 )
BENCH snabbnfv-loadgen-dpdk -> 0.998568 of 2.6528 (SD: 0.0298757 )
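Each BENCH line appears to report the new result as a ratio of a stored baseline mean, with the baseline's standard deviation in parentheses (so 1.00564 of 14.18 is a score of about 14.26). A minimal standalone Lua sketch of that kind of ratio check; the table layout and the 0.85 threshold are assumptions for illustration, not SnabbBot's actual code:

  -- Hypothetical benchmark regression check: compare a new score against a
  -- stored baseline mean and flag results that fall too far below it.
  local baselines = {
    ["basic1-100e6"]     = { mean = 14.18,  sd = 0.04 },
    ["packetblaster-64"] = { mean = 10.518, sd = 0.0365513 },
  }

  local function check(name, score)
    local base = baselines[name]
    local ratio = score / base.mean
    print(string.format("BENCH %s -> %g of %g (SD: %g )", name, ratio, base.mean, base.sd))
    return ratio >= 0.85   -- assumed regression threshold
  end

  assert(check("basic1-100e6", 14.26))   -- prints a ratio of about 1.00564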
Checking test suite:
DIR testlog
TEST program.snabbnfv.nfvconfig
TEST program.snabbnfv.neutron2snabb.neutron2snabb
TEST program.snabbnfv.neutron2snabb.neutron2snabb_schema
TEST lib.hash.murmur
TEST lib.ctable
TEST lib.hardware.pci
TEST lib.protocol.datagram
TEST lib.protocol.ipv6
TEST lib.protocol.tcp
TEST lib.protocol.ipv4
TEST lib.ipc.shmem.shmem
TEST lib.traceprof.traceprof
TEST lib.checksum
TEST lib.pmu
SKIPPED testlog/lib.pmu
TEST core.app
TEST core.counter
TEST core.histogram
TEST core.lib
TEST core.timer
TEST core.link
TEST core.shm
TEST core.main
TEST core.memory
TEST apps.lwaftr.ndp
TEST apps.lwaftr.rangemap
TEST apps.lwaftr.conf
TEST apps.lwaftr.fragmentv4_test
TEST apps.lwaftr.binding_table
TEST apps.lwaftr.channel
TEST apps.lwaftr.lwdebug
TEST apps.lwaftr.podhashmap
TEST apps.virtio_net.virtio_net
SKIPPED testlog/apps.virtio_net.virtio_net
TEST apps.ipv6.nd_light
TEST apps.rate_limiter.rate_limiter
TEST apps.keyed_ipv6_tunnel.tunnel
TEST apps.vhost.vhost_user
SKIPPED testlog/apps.vhost.vhost_user
TEST apps.socket.raw
TEST apps.vlan.vlan
TEST apps.tap.tap
SKIPPED testlog/apps.tap.tap
TEST apps.intel.intel1g
SKIPPED testlog/apps.intel.intel1g
TEST apps.intel.intel_app
TEST apps.packet_filter.pcap_filter
TEST apps.bridge.mac_table
TEST apps.vpn.vpws
TEST apps.test.synth
TEST program/lwaftr/tests/end-to-end/selftest.sh
TEST program/packetblaster/selftest.sh
TEST program/snabbnfv/selftest.sh
ERROR testlog/program.snabbnfv.selftest.sh
ERROR during tests:
src/testlog/apps.bridge.mac_table:
Apr 26 2016 14:47:33 mac_table: resizing from 512 to 4096 hash buckets, new target size 2048 (1295 MAC entries, old target size 256, size/bucket overflow: true/true)
src/testlog/apps.intel.intel1g:
selftest: Intel1g
SNABB_SELFTEST_INTEL1G_0 not set
EXITCODE: 43
src/testlog/apps.intel.intel_app:
selftest: intel_app
100 VF initializations:
Running iterated VMDq test...
test # 1: VMDq VLAN=101; 100ms burst. packet sent: 469,455
test # 2: VMDq VLAN=102; 100ms burst. packet sent: 1,048,815
test # 3: VMDq VLAN=103; 100ms burst. packet sent: 1,627,665
test # 4: VMDq VLAN=104; 100ms burst. packet sent: 2,215,695
test # 5: VMDq VLAN=105; 100ms burst. packet sent: 2,797,095
test # 6: VMDq VLAN=106; 100ms burst. packet sent: 3,379,005
test # 7: VMDq VLAN=107; 100ms burst. packet sent: 3,959,640
test # 8: VMDq VLAN=108; 100ms burst. packet sent: 4,539,765
test # 9: VMDq VLAN=109; 100ms burst. packet sent: 5,116,065
test # 10: VMDq VLAN=110; 100ms burst. packet sent: 5,685,735
test # 11: VMDq VLAN=111; 100ms burst. packet sent: 6,251,580
test # 12: VMDq VLAN=112; 100ms burst. packet sent: 6,822,270
test # 13: VMDq VLAN=113; 100ms burst. packet sent: 7,395,000
test # 14: VMDq VLAN=114; 100ms burst. packet sent: 7,966,455
test # 15: VMDq VLAN=115; 100ms burst. packet sent: 8,535,615
test # 16: VMDq VLAN=116; 100ms burst. packet sent: 9,107,835
test # 17: VMDq VLAN=117; 100ms burst. packet sent: 9,675,975
test # 18: VMDq VLAN=118; 100ms burst. packet sent: 10,250,490
test # 19: VMDq VLAN=119; 100ms burst. packet sent: 10,825,005
test # 20: VMDq VLAN=120; 100ms burst. packet sent: 11,398,500
test # 21: VMDq VLAN=121; 100ms burst. packet sent: 11,971,740
test # 22: VMDq VLAN=122; 100ms burst. packet sent: 12,544,215
test # 23: VMDq VLAN=123; 100ms burst. packet sent: 13,114,395
test # 24: VMDq VLAN=124; 100ms burst. packet sent: 13,682,535
test # 25: VMDq VLAN=125; 100ms burst. packet sent: 14,250,420
test # 26: VMDq VLAN=126; 100ms burst. packet sent: 14,817,795
test # 27: VMDq VLAN=127; 100ms burst. packet sent: 15,384,915
test # 28: VMDq VLAN=128; 100ms burst. packet sent: 15,950,760
test # 29: VMDq VLAN=129; 100ms burst. packet sent: 16,516,605
test # 30: VMDq VLAN=130; 100ms burst. packet sent: 17,079,900
test # 31: VMDq VLAN=131; 100ms burst. packet sent: 17,644,215
test # 32: VMDq VLAN=132; 100ms burst. packet sent: 18,212,865
test # 33: VMDq VLAN=133; 100ms burst. packet sent: 18,775,905
test # 34: VMDq VLAN=134; 100ms burst. packet sent: 19,337,415
test # 35: VMDq VLAN=135; 100ms burst. packet sent: 19,898,160
test # 36: VMDq VLAN=136; 100ms burst. packet sent: 20,459,670
test # 37: VMDq VLAN=137; 100ms burst. packet sent: 21,016,590
test # 38: VMDq VLAN=138; 100ms burst. packet sent: 21,573,255
test # 39: VMDq VLAN=139; 100ms burst. packet sent: 22,130,175
test # 40: VMDq VLAN=140; 100ms burst. packet sent: 22,685,820
test # 41: VMDq VLAN=141; 100ms burst. packet sent: 23,240,955
test # 42: VMDq VLAN=142; 100ms burst. packet sent: 23,794,560
test # 43: VMDq VLAN=143; 100ms burst. packet sent: 24,349,440
test # 44: VMDq VLAN=144; 100ms burst. packet sent: 24,902,280
test # 45: VMDq VLAN=145; 100ms burst. packet sent: 25,453,590
test # 46: VMDq VLAN=146; 100ms burst. packet sent: 26,005,155
test # 47: VMDq VLAN=147; 100ms burst. packet sent: 26,553,915
test # 48: VMDq VLAN=148; 100ms burst. packet sent: 27,101,910
test # 49: VMDq VLAN=149; 100ms burst. packet sent: 27,649,905
test # 50: VMDq VLAN=150; 100ms burst. packet sent: 28,195,605
test # 51: VMDq VLAN=151; 100ms burst. packet sent: 28,741,305
test # 52: VMDq VLAN=152; 100ms burst. packet sent: 29,286,495
test # 53: VMDq VLAN=153; 100ms burst. packet sent: 29,833,980
test # 54: VMDq VLAN=154; 100ms burst. packet sent: 30,387,330
test # 55: VMDq VLAN=155; 100ms burst. packet sent: 30,940,935
test # 56: VMDq VLAN=156; 100ms burst. packet sent: 31,493,775
test # 57: VMDq VLAN=157; 100ms burst. packet sent: 32,045,340
test # 58: VMDq VLAN=158; 100ms burst. packet sent: 32,597,415
test # 59: VMDq VLAN=159; 100ms burst. packet sent: 33,147,960
test # 60: VMDq VLAN=160; 100ms burst. packet sent: 33,697,740
test # 61: VMDq VLAN=161; 100ms burst. packet sent: 34,248,030
test # 62: VMDq VLAN=162; 100ms burst. packet sent: 34,797,300
test # 63: VMDq VLAN=163; 100ms burst. packet sent: 35,345,295
test # 64: VMDq VLAN=164; 100ms burst. packet sent: 35,898,900
test # 65: VMDq VLAN=165; 100ms burst. packet sent: 36,447,150
test # 66: VMDq VLAN=166; 100ms burst. packet sent: 36,993,105
test # 67: VMDq VLAN=167; 100ms burst. packet sent: 37,539,060
test # 68: VMDq VLAN=168; 100ms burst. packet sent: 38,083,740
test # 69: VMDq VLAN=169; 100ms burst. packet sent: 38,626,890
test # 70: VMDq VLAN=170; 100ms burst. packet sent: 39,169,275
test # 71: VMDq VLAN=171; 100ms burst. packet sent: 39,711,405
test # 72: VMDq VLAN=172; 100ms burst. packet sent: 40,250,475
test # 73: VMDq VLAN=173; 100ms burst. packet sent: 40,789,545
test # 74: VMDq VLAN=174; 100ms burst. packet sent: 41,323,515
test # 75: VMDq VLAN=175; 100ms burst. packet sent: 41,858,760
test # 76: VMDq VLAN=176; 100ms burst. packet sent: 42,393,750
test # 77: VMDq VLAN=177; 100ms burst. packet sent: 42,928,740
test # 78: VMDq VLAN=178; 100ms burst. packet sent: 43,463,475
test # 79: VMDq VLAN=179; 100ms burst. packet sent: 43,997,955
test # 80: VMDq VLAN=180; 100ms burst. packet sent: 44,531,415
test # 81: VMDq VLAN=181; 100ms burst. packet sent: 45,063,600
test # 82: VMDq VLAN=182; 100ms burst. packet sent: 45,594,510
test # 83: VMDq VLAN=183; 100ms burst. packet sent: 46,125,675
test # 84: VMDq VLAN=184; 100ms burst. packet sent: 46,656,330
test # 85: VMDq VLAN=185; 100ms burst. packet sent: 47,190,810
test # 86: VMDq VLAN=186; 100ms burst. packet sent: 47,729,370
test # 87: VMDq VLAN=187; 100ms burst. packet sent: 48,267,675
test # 88: VMDq VLAN=188; 100ms burst. packet sent: 48,804,195
test # 89: VMDq VLAN=189; 100ms burst. packet sent: 49,341,990
test # 90: VMDq VLAN=190; 100ms burst. packet sent: 49,879,020
test # 91: VMDq VLAN=191; 100ms burst. packet sent: 50,415,795
test # 92: VMDq VLAN=192; 100ms burst. packet sent: 50,951,805
test # 93: VMDq VLAN=193; 100ms burst. packet sent: 51,486,540
test # 94: VMDq VLAN=194; 100ms burst. packet sent: 52,021,275
test # 95: VMDq VLAN=195; 100ms burst. packet sent: 52,555,245
test # 96: VMDq VLAN=196; 100ms burst. packet sent: 53,088,960
test # 97: VMDq VLAN=197; 100ms burst. packet sent: 53,621,910
test # 98: VMDq VLAN=198; 100ms burst. packet sent: 54,153,585
test # 99: VMDq VLAN=199; 100ms burst. packet sent: 54,685,515
test #100: VMDq VLAN=200; 100ms burst. packet sent: 55,215,660
0000:84:00.0: avg wait_lu: 235, max redos: 0, avg: 0
100 PF full cycles
Running iterated VMDq test...
test # 1: VMDq VLAN=101; 100ms burst. packet sent: 554,115
test # 2: VMDq VLAN=102; 100ms burst. packet sent: 558,450
test # 3: VMDq VLAN=103; 100ms burst. packet sent: 570,945
test # 4: VMDq VLAN=104; 100ms burst. packet sent: 565,845
test # 5: VMDq VLAN=105; 100ms burst. packet sent: 578,340
test # 6: VMDq VLAN=106; 100ms burst. packet sent: 472,005
test # 7: VMDq VLAN=107; 100ms burst. packet sent: 584,970
test # 8: VMDq VLAN=108; 100ms burst. packet sent: 565,335
test # 9: VMDq VLAN=109; 100ms burst. packet sent: 570,180
test # 10: VMDq VLAN=110; 100ms burst. packet sent: 566,610
test # 11: VMDq VLAN=111; 100ms burst. packet sent: 589,050
test # 12: VMDq VLAN=112; 100ms burst. packet sent: 575,280
test # 13: VMDq VLAN=113; 100ms burst. packet sent: 572,985
test # 14: VMDq VLAN=114; 100ms burst. packet sent: 567,120
test # 15: VMDq VLAN=115; 100ms burst. packet sent: 567,630
test # 16: VMDq VLAN=116; 100ms burst. packet sent: 578,340
test # 17: VMDq VLAN=117; 100ms burst. packet sent: 577,830
test # 18: VMDq VLAN=118; 100ms burst. packet sent: 573,240
test # 19: VMDq VLAN=119; 100ms burst. packet sent: 574,515
test # 20: VMDq VLAN=120; 100ms burst. packet sent: 587,520
test # 21: VMDq VLAN=121; 100ms burst. packet sent: 575,025
test # 22: VMDq VLAN=122; 100ms burst. packet sent: 392,190
test # 23: VMDq VLAN=123; 100ms burst. packet sent: 579,615
test # 24: VMDq VLAN=124; 100ms burst. packet sent: 583,950
test # 25: VMDq VLAN=125; 100ms burst. packet sent: 574,260
test # 26: VMDq VLAN=126; 100ms burst. packet sent: 567,630
test # 27: VMDq VLAN=127; 100ms burst. packet sent: 539,580
test # 28: VMDq VLAN=128; 100ms burst. packet sent: 568,650
test # 29: VMDq VLAN=129; 100ms burst. packet sent: 570,945
test # 30: VMDq VLAN=130; 100ms burst. packet sent: 562,020
test # 31: VMDq VLAN=131; 100ms burst. packet sent: 570,180
test # 32: VMDq VLAN=132; 100ms burst. packet sent: 580,890
test # 33: VMDq VLAN=133; 100ms burst. packet sent: 567,375
test # 34: VMDq VLAN=134; 100ms burst. packet sent: 552,330
test # 35: VMDq VLAN=135; 100ms burst. packet sent: 569,925
test # 36: VMDq VLAN=136; 100ms burst. packet sent: 569,670
test # 37: VMDq VLAN=137; 100ms burst. packet sent: 568,395
test # 38: VMDq VLAN=138; 100ms burst. packet sent: 578,850
test # 39: VMDq VLAN=139; 100ms burst. packet sent: 564,570
test # 40: VMDq VLAN=140; 100ms burst. packet sent: 559,470
test # 41: VMDq VLAN=141; 100ms burst. packet sent: 589,815
test # 42: VMDq VLAN=142; 100ms burst. packet sent: 569,925
test # 43: VMDq VLAN=143; 100ms burst. packet sent: 580,125
test # 44: VMDq VLAN=144; 100ms burst. packet sent: 547,995
test # 45: VMDq VLAN=145; 100ms burst. packet sent: 579,360
test # 46: VMDq VLAN=146; 100ms burst. packet sent: 554,625
test # 47: VMDq VLAN=147; 100ms burst. packet sent: 572,985
test # 48: VMDq VLAN=148; 100ms burst. packet sent: 562,275
test # 49: VMDq VLAN=149; 100ms burst. packet sent: 590,070
test # 50: VMDq VLAN=150; 100ms burst. packet sent: 583,695
test # 51: VMDq VLAN=151; 100ms burst. packet sent: 578,850
test # 52: VMDq VLAN=152; 100ms burst. packet sent: 565,845
test # 53: VMDq VLAN=153; 100ms burst. packet sent: 203,235
test # 54: VMDq VLAN=154; 100ms burst. packet sent: 552,075
test # 55: VMDq VLAN=155; 100ms burst. packet sent: 577,830
test # 56: VMDq VLAN=156; 100ms burst. packet sent: 565,080
test # 57: VMDq VLAN=157; 100ms burst. packet sent: 561,255
test # 58: VMDq VLAN=158; 100ms burst. packet sent: 569,670
test # 59: VMDq VLAN=159; 100ms burst. packet sent: 560,235
test # 60: VMDq VLAN=160; 100ms burst. packet sent: 553,860
test # 61: VMDq VLAN=161; 100ms burst. packet sent: 570,690
test # 62: VMDq VLAN=162; 100ms burst. packet sent: 573,750
test # 63: VMDq VLAN=163; 100ms burst. packet sent: 575,535
test # 64: VMDq VLAN=164; 100ms burst. packet sent: 572,985
test # 65: VMDq VLAN=165; 100ms burst. packet sent: 582,930
test # 66: VMDq VLAN=166; 100ms burst. packet sent: 567,375
test # 67: VMDq VLAN=167; 100ms burst. packet sent: 592,110
test # 68: VMDq VLAN=168; 100ms burst. packet sent: 564,570
test # 69: VMDq VLAN=169; 100ms burst. packet sent: 575,025
test # 70: VMDq VLAN=170; 100ms burst. packet sent: 572,220
test # 71: VMDq VLAN=171; 100ms burst. packet sent: 576,555
test # 72: VMDq VLAN=172; 100ms burst. packet sent: 566,865
test # 73: VMDq VLAN=173; 100ms burst. packet sent: 572,985
test # 74: VMDq VLAN=174; 100ms burst. packet sent: 563,040
test # 75: VMDq VLAN=175; 100ms burst. packet sent: 574,005
test # 76: VMDq VLAN=176; 100ms burst. packet sent: 577,830
test # 77: VMDq VLAN=177; 100ms burst. packet sent: 585,735
test # 78: VMDq VLAN=178; 100ms burst. packet sent: 573,240
test # 79: VMDq VLAN=179; 100ms burst. packet sent: 571,455
test # 80: VMDq VLAN=180; 100ms burst. packet sent: 562,530
test # 81: VMDq VLAN=181; 100ms burst. packet sent: 590,835
test # 82: VMDq VLAN=182; 100ms burst. packet sent: 566,100
test # 83: VMDq VLAN=183; 100ms burst. packet sent: 585,990
test # 84: VMDq VLAN=184; 100ms burst. packet sent: 545,190
test # 85: VMDq VLAN=185; 100ms burst. packet sent: 558,195
test # 86: VMDq VLAN=186; 100ms burst. packet sent: 575,025
test # 87: VMDq VLAN=187; 100ms burst. packet sent: 570,945
test # 88: VMDq VLAN=188; 100ms burst. packet sent: 578,595
test # 89: VMDq VLAN=189; 100ms burst. packet sent: 573,750
test # 90: VMDq VLAN=190; 100ms burst. packet sent: 570,690
test # 91: VMDq VLAN=191; 100ms burst. packet sent: 579,360
test # 92: VMDq VLAN=192; 100ms burst. packet sent: 580,380
test # 93: VMDq VLAN=193; 100ms burst. packet sent: 571,455
test # 94: VMDq VLAN=194; 100ms burst. packet sent: 581,145
test # 95: VMDq VLAN=195; 100ms burst. packet sent: 586,245
test # 96: VMDq VLAN=196; 100ms burst. packet sent: 579,360
test # 97: VMDq VLAN=197; 100ms burst. packet sent: 581,145
test # 98: VMDq VLAN=198; 100ms burst. packet sent: 571,965
test # 99: VMDq VLAN=199; 100ms burst. packet sent: 569,925
test #100: VMDq VLAN=200; 100ms burst. packet sent: 579,870
0000:84:00.0: avg wait_lu: 146.2, max redos: 0, avg: 0
-------
Send a bunch of packets from Am0
half of them go to nicAm1 and half go nowhere
link report:
0 sent on nicAm0.tx -> sink_ms.in1 (loss rate: 0%)
2,914,079 sent on nicAm1.tx -> sink_ms.in2 (loss rate: 0%)
5,828,790 sent on repeater_ms.output -> nicAm0.rx (loss rate: 0%)
2 sent on source_ms.out -> repeater_ms.input (loss rate: 0%)
-------
Transmitting bidirectionally between nicA and nicB
link report:
952,960 sent on nicA.tx -> sink.in1 (loss rate: 0%)
952,960 sent on nicB.tx -> sink.in2 (loss rate: 0%)
1,898,985 sent on source1.out -> nicA.rx (loss rate: 0%)
1,898,985 sent on source2.out -> nicB.rx (loss rate: 0%)
-------
Send traffic from nicA (SF) to nicB (two VFs)
The packets should arrive evenly split between the VFs
link report:
0 sent on nicAs.tx -> sink_ms.in1 (loss rate: 0%)
2,064,669 sent on nicBm0.tx -> sink_ms.in2 (loss rate: 0%)
2,064,664 sent on nicBm1.tx -> sink_ms.in3 (loss rate: 0%)
4,129,980 sent on repeater_ms.output -> nicAs.rx (loss rate: 0%)
2 sent on source_ms.out -> repeater_ms.input (loss rate: 0%)
selftest: ok
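The link reports above encode the expected splits: roughly half of the 5,828,790 packets pushed into nicAm0 come back out on nicAm1, and the SF-to-two-VFs test delivers its 4,129,980 packets almost evenly to the two VF queues (2,064,669 vs 2,064,664). A small standalone Lua check of that kind of even-split tolerance test; the 1% tolerance is an assumption:

  -- Hypothetical even-split check for the link reports above.
  local function roughly_even(a, b, tolerance)
    tolerance = tolerance or 0.01           -- assumed 1% tolerance
    return math.abs(a - b) / math.max(a, b) <= tolerance
  end

  -- VF0 vs VF1 packet counts from the SF -> two VFs link report
  assert(roughly_even(2064669, 2064664))
  -- "half of them go to nicAm1": 2,914,079 out of 5,828,790 pushed in
  assert(roughly_even(2914079, 5828790 / 2, 0.01))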
src/testlog/apps.ipv6.nd_light:
Apr 26 2016 14:46:33 nd_light: Sending neighbor solicitation for next-hop 2001:db8::1
Apr 26 2016 14:46:33 nd_light: Sending neighbor solicitation for next-hop 2001:db8::2
Apr 26 2016 14:46:33 nd_light: Resolved next-hop 2001:db8::1 to 00:00:00:00:00:01
Apr 26 2016 14:46:33 nd_light: Resolved next-hop 2001:db8::2 to 00:00:00:00:00:02
Apr 26 2016 14:46:34 nd_light: Sending neighbor solicitation for next-hop 2001:db8::1
Apr 26 2016 14:46:34 nd_light: Sending neighbor solicitation for next-hop 2001:db8::2
src/testlog/apps.keyed_ipv6_tunnel.tunnel:
Keyed IPv6 tunnel selftest
run simple one second benchmark ...
load: time: 1.00s fps: 12,377,640 fpGbps: 6.832 fpb: 255 bpp: 60 sleep: 0 us
selftest passed
src/testlog/apps.lwaftr.binding_table:
selftest: binding_table
loading compiled binding table from /tmp/lua_5EsQ4D
loading compiled binding table from /tmp/lua_HRhxju
ok
src/testlog/apps.lwaftr.channel:
selftest: channel
selftest: channel ok
src/testlog/apps.lwaftr.conf:
selftest: conf
ok
src/testlog/apps.lwaftr.fragmentv4_test:
test: lwaftr.fragmentv4.fragment_ipv4
test: payload=1200 mtu=1500
test: payload=1200 mtu=1000
test: payload=1200 mtu=400
test: packet with "don't fragment" flag
test: length=1046 mtu=520 + reassembly
test: vlan tagging
test: lwaftr.fragmentv4.reassemble_ipv4 (no vlan tag)
test: no reassembly needed (single packet)
test: two fragments (one missing)
test: three fragments (one/two missing)
test: payload=1200 mtu=1000
test: payload=1000 mtu=400
test: lwaftr.fragmentv4.reassemble_ipv4 (vlan id=42)
test: no reassembly needed (single packet)
test: two fragments (one missing)
test: three fragments (one/two missing)
test: payload=1200 mtu=1000
test: payload=1000 mtu=400
src/testlog/apps.lwaftr.lwdebug:
src/testlog/apps.lwaftr.ndp:
src/testlog/apps.lwaftr.podhashmap:
No PMU available: single core cpu affinity required
jenkins hash: 4.93 ns per iteration (result: 2964940424)
WARNING: perfmark failed: exceeded maximum ns 4
murmur hash (32 bit): 8.71 ns per iteration (result: 2347483648)
WARNING: perfmark failed: exceeded maximum ns 8
insertion (40% occupancy): 178.99 ns per iteration (result: nil)
WARNING: perfmark failed: exceeded maximum ns 100
max displacement: 8
selfcheck: pass
population check: pass
lookup (40% occupancy): 91.07 ns per iteration (result: 1975650)
streaming lookup, stride=1: 210.54 ns per iteration (result: -2000001)
WARNING: perfmark failed: exceeded maximum ns 100
streaming lookup, stride=2: 138.67 ns per iteration (result: -2000001)
WARNING: perfmark failed: exceeded maximum ns 100
streaming lookup, stride=4: 152.03 ns per iteration (result: -2000001)
WARNING: perfmark failed: exceeded maximum ns 100
streaming lookup, stride=8: 126.00 ns per iteration (result: -2000001)
WARNING: perfmark failed: exceeded maximum ns 100
streaming lookup, stride=16: 123.19 ns per iteration (result: -2000001)
WARNING: perfmark failed: exceeded maximum ns 100
streaming lookup, stride=32: 121.85 ns per iteration (result: -2000001)
WARNING: perfmark failed: exceeded maximum ns 100
streaming lookup, stride=64: 119.08 ns per iteration (result: -2000001)
WARNING: perfmark failed: exceeded maximum ns 100
streaming lookup, stride=128: 115.58 ns per iteration (result: -2000001)
WARNING: perfmark failed: exceeded maximum ns 100
streaming lookup, stride=256: 114.78 ns per iteration (result: -2000001)
WARNING: perfmark failed: exceeded maximum ns 100
lookup (40% occupancy): 86.93 ns per iteration (result: 1975650)
src/testlog/apps.lwaftr.rangemap:
No PMU available: single core cpu affinity required
lookup: 11.23 ns per iteration (result: 0)
WARNING: perfmark failed: exceeded maximum ns 10
src/testlog/apps.packet_filter.pcap_filter:
selftest: pcap_filter
Run for 1 second (stateful = false)...
link report:
302,901 sent on pcap_filter.output -> sink.input (loss rate: 0%)
8,127,870 sent on repeater.output -> pcap_filter.input (loss rate: 0%)
161 sent on source.output -> repeater.input (loss rate: 0%)
ok: accepted 3.7267% of inputs (within tolerance)
Run for 1 second (stateful = true)...
link report:
317,705 sent on pcap_filter.output -> sink.input (loss rate: 0%)
4,262,580 sent on repeater.output -> pcap_filter.input (loss rate: 0%)
161 sent on source.output -> repeater.input (loss rate: 0%)
ok: accepted 7.4533% of inputs (within tolerance)
selftest: ok
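The accepted percentages are the ratio of packets leaving the filter to packets entering it: 302,901 / 8,127,870 ≈ 3.7267% and 317,705 / 4,262,580 ≈ 7.4533%. A two-line Lua check of that arithmetic:

  -- Recompute the pcap_filter acceptance rates from the link reports above.
  local function accepted_pct(out, into) return 100 * out / into end

  print(accepted_pct(302901, 8127870))   -- ~3.7267 (stateful = false)
  print(accepted_pct(317705, 4262580))   -- ~7.4533 (stateful = true)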
src/testlog/apps.rate_limiter.rate_limiter:
Rate limiter selftest
test effective rate, non-busy loop
load: time: 1.00s fps: 2,373,726 fpGbps: 1.310 fpb: 255 bpp: 60 sleep: 0 us
load: time: 1.00s fps: 2,378,062 fpGbps: 1.313 fpb: 255 bpp: 60 sleep: 0 us
load: time: 1.00s fps: 2,369,890 fpGbps: 1.308 fpb: 255 bpp: 60 sleep: 0 us
load: time: 1.00s fps: 2,365,972 fpGbps: 1.306 fpb: 255 bpp: 60 sleep: 0 us
load: time: 0.00s fps: NaN fpGbps: nan fpb: NaN bpp: - sleep: 0 us
configured rate is 200000 bytes per second
effective rate is 209972 bytes per second
measure throughput on heavy load...
load: time: 1.00s fps: 23,345,313 fpGbps: 12.887 fpb: 255 bpp: 60 sleep: 0 us
elapsed time 1.096531682 seconds
packets received 25500000 23 Mpps
configured rate is 1200000000 bytes per second
effective rate is 1307108051 bytes per second
throughput is 21 Mpps
selftest passed
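The throughput figure is the packet count divided by the elapsed time (25,500,000 / 1.096531682 s ≈ 23 Mpps), and the effective byte rate is compared against the configured rate. A small Lua illustration of that arithmetic; the tolerance factor is an assumption, not the rate limiter's actual check:

  -- Throughput arithmetic from the heavy-load measurement above.
  local packets, seconds = 25500000, 1.096531682
  print(packets / seconds / 1e6)         -- ~23 Mpps

  -- Effective-rate check: the measured rate should stay near the configured
  -- rate; the 1.1 tolerance factor here is an assumption.
  local function within_limit(effective, configured)
    return effective <= configured * 1.1
  end
  assert(within_limit(209972, 200000))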
src/testlog/apps.socket.raw:
selftest passed
src/testlog/apps.tap.tap:
EXITCODE: 43
src/testlog/apps.test.synth:
link report:
255 sent on synth.output -> writer.input (loss rate: 66%)
src/testlog/apps.vhost.vhost_user:
selftest: vhost_user
SNABB_TEST_VHOST_USER_SOCKET was not set
Test skipped
EXITCODE: 43
src/testlog/apps.virtio_net.virtio_net:
SNABB_TEST_VIRTIO_PCIDEV was not set
Test skipped
EXITCODE: 43
src/testlog/apps.vlan.vlan:
source sent: 16413330
sink received: 16413330
src/testlog/apps.vpn.vpws:
src/testlog/core.app:
Restarting app2 (died at 9162385.386646: core/app.lua:472: Push error.)
Restarting app1 (died at 9162385.386646: core/app.lua:466: Pull error.)
Restarting app2 (died at 9162387.386664: core/app.lua:472: Push error.)
Restarting app1 (died at 9162387.386664: core/app.lua:466: Pull error.)
Restarting app3 (died at 9162389.386304: core/app.lua:478: Report error.)
Restarting app2 (died at 9162389.386900: core/app.lua:472: Push error.)
Restarting app1 (died at 9162389.386900: core/app.lua:466: Pull error.)
selftest: app
empty -> c1
c1 -> c1
c1 -> c2
c2 -> c1
c1 -> empty
c_fail
apps report:
app3
app2 [dead: core/app.lua:472: Push error.]
app1 [dead: core/app.lua:466: Pull error.]
apps report:
app3
app2 [dead: core/app.lua:472: Push error.]
app1 [dead: core/app.lua:466: Pull error.]
OK
src/testlog/core.counter:
selftest: core.counter
selftest ok
src/testlog/core.histogram:
selftest: histogram
selftest ok
src/testlog/core.lib:
selftest: lib
Testing equal
Testing load_string
Testing load/store_conf
Testing csum
Testing hex(un)dump
src/testlog/core.link:
selftest: link
selftest OK
src/testlog/core.main:
selftest
src/testlog/core.memory:
selftest: memory
Kernel vm.nr_hugepages: 10240
Allocating a 2MB HugeTLB: Got 2MB
Physical address: 0x0002c9e00000
Virtual address: 0x5002c9e00000
Allocating a 2MB HugeTLB: Got 2MB
Physical address: 0x0002c9c00000
Virtual address: 0x5002c9c00000
Allocating a 2MB HugeTLB: Got 2MB
Physical address: 0x0002c9a00000
Virtual address: 0x5002c9a00000
Allocating a 2MB HugeTLB: Got 2MB
Physical address: 0x0002c9800000
Virtual address: 0x5002c9800000
Kernel vm.nr_hugepages: 10240
HugeTLB page allocation OK.
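In each allocation above the virtual address is the physical address plus a fixed 0x500000000000 offset (0x0002c9e00000 maps to 0x5002c9e00000). A standalone Lua check of that relationship as read off the log; the constant is inferred from the output, not taken from the source:

  -- Physical -> virtual relationship visible in the core.memory log above.
  local tag = 0x500000000000
  local samples = {
    { phys = 0x0002c9e00000, virt = 0x5002c9e00000 },
    { phys = 0x0002c9c00000, virt = 0x5002c9c00000 },
    { phys = 0x0002c9a00000, virt = 0x5002c9a00000 },
    { phys = 0x0002c9800000, virt = 0x5002c9800000 },
  }
  for _, s in ipairs(samples) do
    assert(s.phys + tag == s.virt)
  end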
src/testlog/core.shm:
selftest: shm
checking paths..
checking shared memory..
create obj
checking many objects..
10000 objects created
10000 objects unmapped
selftest ok
src/testlog/core.timer:
selftest: timer
ok (973,855 callbacks in 0.3119 seconds)
src/testlog/lib.checksum:
selftest: checksum
no avx2
sse2: 1000/1000
selftest: tcp/ipv4
selftest: ok
src/testlog/lib.ctable:
selftest: ctable
selftest: ok
src/testlog/lib.hardware.pci:
selftest: pci
pciaddress  model            interface  status  driver                usable
01:00.0     Intel 350        -          -       apps.intel.intel1g    yes
01:00.1     Intel 350        -          -       apps.intel.intel1g    yes
03:00.0     Intel 82599 SFP  -          -       apps.intel.intel_app  yes
03:00.1     Intel 82599 SFP  -          -       apps.intel.intel_app  yes
09:00.0     Intel 350        -          -       apps.intel.intel1g    yes
09:00.1     Intel 350        -          -       apps.intel.intel1g    yes
81:00.0     Intel 82574L     -          -       apps.intel.intel_app  yes
82:00.0     Intel 82599 SFP  -          -       apps.intel.intel_app  yes
82:00.1     Intel 82599 SFP  -          -       apps.intel.intel_app  yes
84:00.0     Intel 82599 SFP  -          -       apps.intel.intel_app  yes
84:00.1     Intel 82599 SFP  -          -       apps.intel.intel_app  yes
86:00.0     Intel 82599 SFP  -          -       apps.intel.intel_app  yes
86:00.1     Intel 82599 SFP  -          -       apps.intel.intel_app  yes
89:00.0     Intel 82599 T3   -          -       apps.intel.intel_app  yes
89:00.1     Intel 82599 T3   -          -       apps.intel.intel_app  yes
src/testlog/lib.hash.murmur:
Selftest hash MurmurHash3_x86_32
Passed
Selftest hash MurmurHash3_x64_128
Passed
src/testlog/lib.ipc.shmem.shmem:
ok
src/testlog/lib.pmu:
selftest: pmu
PMU not available:
single core cpu affinity required
selftest skipped
EXITCODE: 43
src/testlog/lib.protocol.datagram:
src/testlog/lib.protocol.ipv4:
src/testlog/lib.protocol.ipv6:
src/testlog/lib.protocol.tcp:
src/testlog/lib.traceprof.traceprof:
traceprof report (recorded 333/333 samples):
42% TRACE 4:LOOP ->loop
30% TRACE 4 ->loop
26% TRACE 5 ->4
src/testlog/program.lwaftr.tests.end-to-end.selftest.sh:
Testing: from-internet IPv4 packet found in the binding table.
loading source binding table from ../data/binding-table.txt
done
Test passed
Testing: from-internet IPv4 packet found in the binding table with vlan tag.
done
Test passed
Testing: NDP: incoming NDP Neighbor Solicitation
done
Test passed
Testing: NDP: incoming NDP Neighbor Solicitation, secondary IP
done
Test passed
Testing: NDP: incoming NDP Neighbor Solicitation, non-lwAFTR IP
done
Test passed
Testing: NDP: IPv6 but not eth addr of next IPv6 hop set, do Neighbor Solicitation
done
Test passed
Testing: NDP: Without receiving NA, next_hop6_mac not set
done
Test passed
Testing: NDP: With receiving NA, next_hop6_mac not initially set
done
Test passed
Testing: IPv6 packet, next hop NA, packet, eth addr not set in configuration.
done
Test passed
Testing: from-internet IPv4 fragmented packets found in the binding table.
done
Test passed
Testing: traffic class mapping
done
Test passed
Testing: from-internet IPv4 packet found in the binding table, original TTL=1.
done
Test passed
Testing: from-B4 IPv4 fragmentation (2)
done
Test passed
Testing: from-B4 IPv4 fragmentation (3)
done
Test passed
Testing: from-internet IPv4 packet found in the binding table, needs IPv6 fragmentation (2).
done
Test passed
Testing: from-internet IPv4 packet found in the binding table, needs IPv6 fragmentation (3).
done
Test passed
Testing: IPv6 reassembly (followed by decapsulation).
done
Test passed
Testing: from-internet IPv4 packet found in the binding table, needs IPv6 fragmentation, DF set, ICMP-3,4.
done
Test passed
Testing: from-internet IPv4 packet NOT found in the binding table, no ICMP.
done
Test passed
Testing: from-internet IPv4 packet NOT found in the binding table (IPv4 matches, but port doesn't), no ICMP.
done
Test passed
Testing: from-internet IPv4 packet NOT found in the binding table (ICMP-on-fail).
done
Test passed
Testing: from-internet IPv4 packet NOT found in the binding table (IPv4 matches, but port doesn't) (ICMP-on-fail).
done
Test passed
Testing: from-to-b4 IPv6 packet NOT found in the binding table, no ICMP.
done
Test passed
Testing: from-b4 to-internet IPv6 packet found in the binding table.
done
Test passed
Testing: from-b4 to-internet IPv6 packet found in the binding table with vlan tag.
done
Test passed
Testing: from-b4 to-internet IPv6 packet NOT found in the binding table, no ICMP
done
Test passed
Testing: from-b4 to-internet IPv6 packet NOT found in the binding table, (IPv4 matches, but port doesn't), no ICMP
done
Test passed
Testing: from-b4 to-internet IPv6 packet NOT found in the binding table (ICMP-on-fail)
done
Test passed
Testing: from-b4 to-internet IPv6 packet NOT found in the binding table (IPv4 matches, but port doesn't) (ICMP-on-fail)
done
Test passed
Testing: from-to-b4 IPv6 packet, no hairpinning
done
Test passed
Testing: from-to-b4 IPv6 packet, with hairpinning
done
Test passed
Testing: from-to-b4 tunneled ICMPv4 ping, with hairpinning
done
Test passed
Testing: from-to-b4 tunneled ICMPv4 ping reply, with hairpinning
done
Test passed
Testing: from-to-b4 tunneled ICMPv4 ping, with hairpinning, unbound
done
Test passed
Testing: from-to-b4 tunneled ICMPv4 ping reply, with hairpinning, port 0 not bound
done
Test passed
Testing: from-to-b4 TCP packet, with hairpinning, TTL 1
done
Test passed
Testing: from-to-b4 IPv6 packet, with hairpinning, with vlan tag
done
Test passed
Testing: from-b4 IPv6 packet, with hairpinning, to B4 with custom lwAFTR address
done
Test passed
Testing: from-b4 IPv6 packet, with hairpinning, from B4 with custom lwAFTR address
done
Test passed
Testing: from-b4 IPv6 packet, with hairpinning, different non-default lwAFTR addresses
done
Test passed
Testing: from-internet bound IPv4 UDP packet
done
Test passed
Testing: unfragmented IPv4 UDP -> outgoing IPv6 UDP fragments
done
Test passed
Testing: IPv6 incoming UDP fragments -> unfragmented IPv4
done
Test passed
Testing: IPv6 incoming UDP fragments -> outgoing IPv4 UDP fragments
done
Test passed
Testing: IPv4 incoming UDP fragments -> outgoing IPv6 UDP fragments
done
Test passed
Testing: incoming ICMPv4 echo request, matches binding table
done
Test passed
Testing: incoming ICMPv4 echo request, matches binding table, bad checksum
done
Test passed
Testing: incoming ICMPv4 echo request, matches binding table, dropping ICMP
done
Test passed
Testing: incoming ICMPv4 echo request, doesn't match binding table
done
Test passed
Testing: incoming ICMPv4 echo reply, matches binding table
done
Test passed
Testing: incoming ICMPv4 3,4 'too big' notification, matches binding table
done
Test passed
Testing: incoming ICMPv6 1,3 destination/address unreachable, OPE from internet
done
Test passed
Testing: incoming ICMPv6 2,0 'too big' notification, OPE from internet
done
Test passed
Testing: incoming ICMPv6 3,0 hop limit exceeded, OPE from internet
done
Test passed
Testing: incoming ICMPv6 3,1 frag reassembly time exceeded, OPE from internet
done
Test passed
Testing: incoming ICMPv6 4,3 parameter problem, OPE from internet
done
Test passed
Testing: incoming ICMPv6 3,0 hop limit exceeded, OPE hairpinned
done
Test passed
Testing: ingress-filter: from-internet (IPv4) packet found in binding table (ACCEPT)
done
Test passed
Testing: ingress-filter: from-internet (IPv4) packet found in binding table (DROP)
done
Test passed
Testing: ingress-filter: from-b4 (IPv6) packet found in binding table (ACCEPT)
done
Test passed
Testing: ingress-filter: from-b4 (IPv6) packet found in binding table (DROP)
done
Test passed
Testing: egress-filter: to-internet (IPv4) (ACCEPT)
done
Test passed
Testing: egress-filter: to-internet (IPv4) (DROP)
done
Test passed
Testing: egress-filter: to-b4 (IPv4) (ACCEPT)
done
Test passed
Testing: egress-filter: to-b4 (IPv4) (DROP)
done
Test passed
Testing: ICMP Echo to AFTR (IPv4)
done
Test passed
Testing: ICMP Echo to AFTR (IPv4) + data
done
Test passed
Testing: ICMP Echo to AFTR (IPv6)
done
Test passed
Testing: ICMP Echo to AFTR (IPv6) + data
done
Test passed
All end-to-end lwAFTR tests passed.
Testing: from-internet IPv4 packet found in the binding table.
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: traffic class mapping
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: NDP: incoming NDP Neighbor Solicitation
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: NDP: incoming NDP Neighbor Solicitation, secondary IP
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: NDP: incoming NDP Neighbor Solicitation, non-lwAFTR IP
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: NDP: IPv6 but not eth addr of next IPv6 hop set, do Neighbor Solicitation
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: NDP: Without receiving NA, next_hop6_mac not set
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: NDP: With receiving NA, next_hop6_mac not initially set
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-internet IPv4 packet found in the binding table, original TTL=1.
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-internet IPv4 fragmented packets found in the binding table
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-B4 IPv4 fragmentation (2)
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-B4 IPv4 fragmentation (3)
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-internet IPv4 packet found in the binding table, needs IPv6 fragmentation (2).
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-internet IPv4 packet found in the binding table, needs IPv6 fragmentation (3).
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: IPv6 reassembly (followed by decapsulation).
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-internet IPv4 packet found in the binding table, needs IPv6 fragmentation, DF set, ICMP-3,4.
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-internet IPv4 packet NOT found in the binding table, no ICMP.
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-internet IPv4 packet NOT found in the binding table (matches IPv4, but not port), no ICMP.
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-internet IPv4 packet NOT found in the binding table (ICMP-on-fail).
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-internet IPv4 packet NOT found in the binding table (matches IPv4, but not port) (ICMP-on-fail).
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-to-b4 IPv6 packet NOT found in the binding table, no ICMP.
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-b4 to-internet IPv6 packet found in the binding table.
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-b4 to-internet IPv6 packet NOT found in the binding table, no ICMP
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-b4 to-internet IPv6 packet NOT found in the binding table (matches IPv4, but not port), no ICMP
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-b4 to-internet IPv6 packet NOT found in the binding table (ICMP-on-fail)
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-b4 to-internet IPv6 packet NOT found in the binding table (matches IPv4, but not port) (ICMP-on-fail)
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-to-b4 IPv6 packet, no hairpinning
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-to-b4 IPv6 packet, with hairpinning
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-to-b4 tunneled ICMPv4 ping, with hairpinning
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-to-b4 tunneled ICMPv4 ping reply, with hairpinning
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-to-b4 tunneled ICMPv4 ping, with hairpinning, unbound
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-to-b4 tunneled ICMPv4 ping reply, with hairpinning, port 0 not bound
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-to-b4 TCP packet, with hairpinning, TTL 1
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-b4 IPv6 packet, with hairpinning, to B4 with custom lwAFTR address
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-b4 IPv6 packet, with hairpinning, from B4 with custom lwAFTR address
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-b4 IPv6 packet, with hairpinning, different non-default lwAFTR addresses
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: from-internet bound IPv4 UDP packet
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: unfragmented IPv4 UDP -> outgoing IPv6 UDP fragments
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: IPv6 incoming UDP fragments -> unfragmented IPv4
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: IPv6 incoming UDP fragments -> outgoing IPv4 UDP fragments
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: IPv4 incoming UDP fragments -> outgoing IPv6 UDP fragments
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: incoming ICMPv4 echo request, matches binding table
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: incoming ICMPv4 echo request, matches binding table
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: incoming ICMPv4 echo request, matches binding table, dropping ICMP
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: incoming ICMPv4 echo request, doesn't match binding table
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: incoming ICMPv4 echo reply, matches binding table
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: incoming ICMPv4 3,4 'too big' notification, matches binding table
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: incoming ICMPv6 1,3 destination/address unreachable, OPE from internet
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: incoming ICMPv6 2,0 'too big' notification, OPE from internet
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: incoming ICMPv6 3,0 hop limit exceeded, OPE from internet
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: incoming ICMPv6 3,1 frag reassembly time exceeded, OPE from internet
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: incoming ICMPv6 4,3 parameter problem, OPE from internet
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: incoming ICMPv6 3,0 hop limit exceeded, OPE hairpinned
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: ingress-filter: from-internet (IPv4) packet found in binding table (ACCEPT)
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: ingress-filter: from-internet (IPv4) packet found in binding table (DROP)
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: ingress-filter: from-b4 (IPv6) packet found in binding table (ACCEPT)
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: ingress-filter: from-b4 (IPv6) packet found in binding table (DROP)
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: egress-filter: to-internet (IPv4) (ACCEPT)
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: egress-filter: to-internet (IPv4) (DROP)
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: egress-filter: to-b4 (IPv4) (ACCEPT)
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: egress-filter: to-b4 (IPv4) (DROP)
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: ICMP Echo to AFTR (IPv4)
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: ICMP Echo to AFTR (IPv4) + data
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: ICMP Echo to AFTR (IPv6)
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
Testing: ICMP Echo to AFTR (IPv6) + data
loading compiled binding table from ../data/binding-table.o
compiled binding table ../data/binding-table.o is up to date.
done
Test passed
All end-to-end lwAFTR vlan tests passed.
src/testlog/program.packetblaster.selftest.sh:
selftest: packetblaster
selftest: ok
src/testlog/program.snabbnfv.neutron2snabb.neutron2snabb:
selftest: neutron2snabb
ok: {{direction='ingress', ethertype='IPv6'}}
=> ip6
ok: {{direction='ingress', ethertype='IPv4'}}
=> (arp or ip)
ok: {{direction='ingress', ethertype='IPv4', protocol='tcp'}}
=> (arp or (ip and tcp))
ok: {{direction='ingress', ethertype='IPv4', protocol='udp'}}
=> (arp or (ip and udp))
ok: {{direction='ingress', ethertype='IPv4', protocol='udp', port_range_min=1000}}
=> (arp or (ip and udp and dst portrange 1000-1000))
ok: {{direction='ingress', ethertype='IPv4', protocol='udp', port_range_max=2000}}
=> (arp or (ip and udp and dst portrange 2000-2000))
ok: {{direction='ingress', ethertype='IPv4', protocol='tcp', port_range_min=1000, port_range_max=2000}}
=> (arp or (ip and tcp and dst portrange 1000-2000))
ok: {{direction='ingress', ethertype='IPv6', protocol='tcp'}, {direction='ingress', ethertype='IPv4', protocol='udp', remote_ip_prefix='10.0.0.0/8'}}
=> ((ip6 and tcp) or (arp or (ip and udp and src net 10.0.0.0/8)))
selftest ok
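The ok lines show how neutron2snabb turns each security-group rule into a pcap filter expression: IPv6 rules map straight to ip6 (plus the protocol), while IPv4 rules also admit ARP and turn port ranges into dst portrange clauses. A rough standalone Lua sketch of that mapping for the single-rule cases above; it ignores remote_ip_prefix and is illustrative, not the neutron2snabb code:

  -- Hypothetical rule -> pcap-expression translation for the simple cases
  -- shown above. Field names follow the log output.
  local function rule_to_filter(rule)
    if rule.ethertype == 'IPv6' then
      if rule.protocol then return ('(ip6 and %s)'):format(rule.protocol) end
      return 'ip6'
    end
    -- IPv4 rules also admit ARP so address resolution keeps working.
    local inner = 'ip'
    if rule.protocol then inner = ('ip and %s'):format(rule.protocol) end
    local min = rule.port_range_min or rule.port_range_max
    local max = rule.port_range_max or rule.port_range_min
    if min then inner = ('%s and dst portrange %d-%d'):format(inner, min, max) end
    if rule.protocol or min then return ('(arp or (%s))'):format(inner) end
    return '(arp or ip)'
  end

  print(rule_to_filter({direction='ingress', ethertype='IPv4', protocol='tcp',
                        port_range_min=1000, port_range_max=2000}))
  -- (arp or (ip and tcp and dst portrange 1000-2000))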
src/testlog/program.snabbnfv.neutron2snabb.neutron2snabb_schema:
selftest: neutron2snabb_schema
selftest: ok
src/testlog/program.snabbnfv.nfvconfig:
selftest: lib.nfv.config
testing: program/snabbnfv/test_fixtures/nfvconfig/switch_nic/x
engine: start app id0_Virtio
engine: start app id0_NIC
testing: program/snabbnfv/test_fixtures/nfvconfig/switch_filter/x
engine: start app id0_Filter_in
load: time: 0.30s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
testing: program/snabbnfv/test_fixtures/nfvconfig/switch_qos/x
engine: stop app id0_Filter_in
load: time: 0.26s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
testing: program/snabbnfv/test_fixtures/nfvconfig/switch_tunnel/x
engine: start app id0_Tunnel
engine: start app id0_ND
Apr 26 2016 14:46:10 nd_light: Sending neighbor solicitation for next-hop 2::2
load: time: 0.27s fps: 3 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
testing: program/snabbnfv/test_fixtures/nfvconfig/scale_up/y
engine: stop app id0_Tunnel
engine: stop app id0_ND
engine: start app id1_Virtio
engine: start app id1_NIC
load: time: 0.26s fps: 7 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
testing: program/snabbnfv/test_fixtures/nfvconfig/scale_up/x
engine: start app id63_Virtio
engine: start app id48_NIC
engine: start app id10_NIC
engine: start app id52_Virtio
engine: start app id6_NIC
engine: start app id45_NIC
engine: start app id34_NIC
engine: start app id35_NIC
engine: start app id40_NIC
engine: start app id61_Virtio
engine: start app id61_NIC
engine: start app id60_Virtio
engine: start app id49_Virtio
engine: start app id59_Virtio
engine: start app id12_NIC
engine: start app id42_NIC
engine: start app id38_NIC
engine: start app id24_NIC
engine: start app id58_Virtio
engine: start app id39_Virtio
engine: start app id51_NIC
engine: start app id57_Virtio
engine: start app id46_NIC
engine: start app id3_NIC
engine: start app id56_NIC
engine: start app id55_Virtio
engine: start app id45_Virtio
engine: start app id28_NIC
engine: start app id25_NIC
engine: start app id15_Virtio
engine: start app id41_NIC
engine: start app id35_Virtio
engine: start app id25_Virtio
engine: start app id54_Virtio
engine: start app id36_NIC
engine: start app id42_Virtio
engine: start app id32_NIC
engine: start app id18_NIC
engine: start app id50_Virtio
engine: start app id62_Virtio
engine: start app id12_Virtio
engine: start app id52_NIC
engine: start app id32_Virtio
engine: start app id22_Virtio
engine: start app id51_Virtio
engine: start app id53_NIC
engine: start app id60_NIC
engine: start app id8_NIC
engine: start app id31_NIC
engine: start app id33_NIC
engine: start app id37_Virtio
engine: start app id27_Virtio
engine: start app id20_Virtio
engine: start app id30_Virtio
engine: start app id50_NIC
engine: start app id10_Virtio
engine: start app id53_Virtio
engine: start app id47_Virtio
engine: start app id23_NIC
engine: start app id43_NIC
engine: start app id13_Virtio
engine: start app id24_Virtio
engine: start app id14_Virtio
engine: start app id21_NIC
engine: start app id7_Virtio
engine: start app id6_Virtio
engine: start app id5_Virtio
engine: start app id63_NIC
engine: start app id21_Virtio
engine: start app id31_Virtio
engine: start app id62_NIC
engine: start app id11_Virtio
engine: start app id59_NIC
engine: start app id13_NIC
engine: start app id41_Virtio
engine: start app id57_NIC
engine: start app id44_Virtio
engine: start app id9_Virtio
engine: start app id8_Virtio
engine: start app id44_NIC
engine: start app id43_Virtio
engine: start app id49_NIC
engine: start app id4_Virtio
engine: start app id4_NIC
engine: start app id2_Virtio
engine: start app id40_Virtio
engine: start app id47_NIC
engine: start app id58_NIC
engine: start app id33_Virtio
engine: start app id23_Virtio
engine: start app id34_Virtio
engine: start app id29_NIC
engine: start app id37_NIC
engine: start app id54_NIC
engine: start app id36_Virtio
engine: start app id26_NIC
engine: start app id16_Virtio
engine: start app id39_NIC
engine: start app id38_Virtio
engine: start app id46_Virtio
engine: start app id56_Virtio
engine: start app id28_Virtio
engine: start app id27_NIC
engine: start app id48_Virtio
engine: start app id30_NIC
engine: start app id16_NIC
engine: start app id5_NIC
engine: start app id19_NIC
engine: start app id17_Virtio
engine: start app id22_NIC
engine: start app id15_NIC
engine: start app id18_Virtio
engine: start app id20_NIC
engine: start app id2_NIC
engine: start app id9_NIC
engine: start app id19_Virtio
engine: start app id29_Virtio
engine: start app id17_NIC
engine: start app id11_NIC
engine: start app id55_NIC
engine: start app id7_NIC
engine: start app id14_NIC
engine: start app id3_Virtio
engine: start app id26_Virtio
load: time: 0.39s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
testing: program/snabbnfv/test_fixtures/nfvconfig/scale_change/x
load: time: 0.26s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
testing: program/snabbnfv/test_fixtures/nfvconfig/scale_change/y
engine: reconfig app id0_NIC
engine: reconfig app id3_NIC
engine: reconfig app id1_NIC
engine: reconfig app id2_NIC
load: time: 0.26s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
src/testlog/program.snabbnfv.selftest.sh:
Defaulting to SNABB_TELNET0=5000
Defaulting to SNABB_TELNET1=5001
USING program/snabbnfv/test_fixtures/nfvconfig/test_functions/other_vlan.ports
Defaulting to MAC=52:54:00:00:00:
Defaulting to IP=fe80::5054:ff:fe00:
Defaulting to GUEST_MEM=512
Defaulting to HUGETLBFS=/hugetlbfs
Defaulting to QUEUES=1
Defaulting to QEMU=/root/.test_env/qemu/obj/x86_64-softmmu/qemu-system-x86_64
Waiting for VM listening on telnet port 5000 to get ready... [OK]
Waiting for VM listening on telnet port 5001 to get ready... [OK]
USING program/snabbnfv/test_fixtures/nfvconfig/test_functions/same_vlan.ports
1 packets transmitted, 1 received, 0% packet loss, time 0ms
PING succeeded.
[ 3] 0.0-10.0 sec 5.82 GBytes 4.99 Gbits/sec
IPERF succeeded.
1 packets transmitted, 1 received, 0% packet loss, time 0ms
JUMBOPING succeeded.
[ 3] 0.0-10.0 sec 6.74 GBytes 5.79 Gbits/sec
IPERF succeeded.
tx-checksumming: on
TX-CHECKSUMMING succeeded.
tx-checksumming: on
TX-CHECKSUMMING succeeded.
USING program/snabbnfv/test_fixtures/nfvconfig/test_functions/tx_rate_limit.ports
1 packets transmitted, 1 received, 0% packet loss, time 0ms
PING succeeded.
IPERF (RATE_LIMITED) succeeded.
IPERF rate is 813 Mbits/sec (900 Mbits/sec allowed).
RATE_LIMITED succeeded.
1 packets transmitted, 1 received, 0% packet loss, time 0ms
JUMBOPING succeeded.
IPERF (RATE_LIMITED) succeeded.
IPERF rate is 813 Mbits/sec (900 Mbits/sec allowed).
RATE_LIMITED succeeded.
USING program/snabbnfv/test_fixtures/nfvconfig/test_functions/rx_rate_limit.ports
1 packets transmitted, 1 received, 0% packet loss, time 0ms
PING succeeded.
IPERF (RATE_LIMITED) succeeded.
IPERF rate is 812 Mbits/sec (1200 Mbits/sec allowed).
RATE_LIMITED succeeded.
1 packets transmitted, 1 received, 0% packet loss, time 0ms
JUMBOPING succeeded.
IPERF (RATE_LIMITED) succeeded.
IPERF rate is 812 Mbits/sec (1200 Mbits/sec allowed).
RATE_LIMITED succeeded.
USING program/snabbnfv/test_fixtures/nfvconfig/test_functions/tunnel.ports
Apr 26 2016 14:51:40 nd_light: Resolved next-hop fe80::5054:ff:fe00:0 to 52:54:00:00:00:00
Apr 26 2016 14:51:40 nd_light: Resolved next-hop fe80::5054:ff:fe00:1 to 52:54:00:00:00:01
ND succeeded.
Trying ::1...
Connected to localhost.
Escape character is '^]'.
ping6 -c 1 fe80::5054:ff:fe00:0001%eth0
PING fe80::5054:ff:fe00:0001%eth0(fe80::5054:ff:fe00:1) 56 data bytes
Connection closed by foreign host.
PING failed.
qemu0.log:
QEMU waiting for connection on: disconnected:unix:vhost_A.sock,server
qemu-system-x86_64: -netdev type=vhost-user,id=net0,chardev=char0: chardev "char0" went up
WARNING: Image format was not specified for '/root/.test_env/qemu0.img' and probing guessed raw.
Automatically detecting the format is dangerous for raw images, write operations on block 0 will be restricted.
Specify the 'raw' format explicitly to remove the restrictions.
QEMU 2.4.0 monitor - type 'help' for more information
(qemu)
qemu1.log:
QEMU waiting for connection on: disconnected:unix:vhost_B.sock,server
qemu-system-x86_64: -netdev type=vhost-user,id=net0,chardev=char0: chardev "char0" went up
WARNING: Image format was not specified for '/root/.test_env/qemu1.img' and probing guessed raw.
Automatically detecting the format is dangerous for raw images, write operations on block 0 will be restricted.
Specify the 'raw' format explicitly to remove the restrictions.
QEMU 2.4.0 monitor - type 'help' for more information
(qemu)
snabb0.log:
snabbnfv traffic starting
Loading /tmp/snabb_nfv_selftest_ports.1066
engine: start app A_NIC
engine: start app B_NIC
engine: start app B_Virtio
engine: start app A_Virtio
Get features 0x18428001
VIRTIO_F_ANY_LAYOUT VIRTIO_NET_F_MQ VIRTIO_NET_F_CTRL_VQ VIRTIO_NET_F_MRG_RXBUF VIRTIO_RING_F_INDIRECT_DESC VIRTIO_NET_F_CSUM
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
Get features 0x18428001
VIRTIO_F_ANY_LAYOUT VIRTIO_NET_F_MQ VIRTIO_NET_F_CTRL_VQ VIRTIO_NET_F_MRG_RXBUF VIRTIO_RING_F_INDIRECT_DESC VIRTIO_NET_F_CSUM
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
vhost_user: Caching features (0x18028001) in /tmp/vhost_features_vhost_A.sock
Set features 0x18028001
VIRTIO_F_ANY_LAYOUT VIRTIO_NET_F_CTRL_VQ VIRTIO_NET_F_MRG_RXBUF VIRTIO_RING_F_INDIRECT_DESC VIRTIO_NET_F_CSUM
rxavail = 0 rxused = 0
rxavail = 0 rxused = 0
vhost_user: Connected and initialized: vhost_A.sock
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 2 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
load: time: 1.00s fps: 3 fpGbps: 0.000 fpb: 0 bpp: 83 sleep: 100 us
vhost_user: Caching features (0x18028001) in /tmp/vhost_features_vhost_B.sock
Set features 0x18028001
VIRTIO_F_ANY_LAYOUT VIRTIO_NET_F_CTRL_VQ VIRTIO_NET_F_MRG_RXBUF VIRTIO_RING_F_INDIRECT_DESC VIRTIO_NET_F_CSUM
rxavail = 0 rxused = 0
rxavail = 0 rxused = 0
vhost_user: Connected and initialized: vhost_B.sock
load: time: 1.00s fps: 2 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
load: time: 1.00s fps: 1 fpGbps: 0.000 fpb: 0 bpp: 80 sleep: 100 us
load: time: 1.00s fps: 1 fpGbps: 0.000 fpb: 0 bpp: 90 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: 70 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: 70 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: 70 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 1 fpGbps: 0.000 fpb: 0 bpp: 70 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
Loading /tmp/snabb_nfv_selftest_ports.1066
engine: reconfig app A_NIC
engine: reconfig app B_NIC
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 7 fpGbps: 0.000 fpb: 0 bpp: 102 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 3 fpGbps: 0.000 fpb: 0 bpp: 82 sleep: 100 us
load: time: 1.00s fps: 873,862 fpGbps: 8.635 fpb: 210 bpp: 1226 sleep: 0 us
load: time: 1.00s fps: 1,151,522 fpGbps: 11.560 fpb: 303 bpp: 1245 sleep: 0 us
load: time: 1.00s fps: 937,871 fpGbps: 9.463 fpb: 212 bpp: 1252 sleep: 0 us
load: time: 1.00s fps: 1,110,971 fpGbps: 11.233 fpb: 245 bpp: 1254 sleep: 0 us
load: time: 1.00s fps: 1,054,685 fpGbps: 10.673 fpb: 244 bpp: 1255 sleep: 100 us
load: time: 1.00s fps: 1,025,502 fpGbps: 10.416 fpb: 245 bpp: 1260 sleep: 0 us
load: time: 1.00s fps: 1,168,850 fpGbps: 11.735 fpb: 306 bpp: 1246 sleep: 0 us
load: time: 1.00s fps: 1,052,265 fpGbps: 10.685 fpb: 237 bpp: 1260 sleep: 0 us
load: time: 1.00s fps: 1,129,729 fpGbps: 11.376 fpb: 263 bpp: 1249 sleep: 0 us
load: time: 1.00s fps: 1,162,797 fpGbps: 11.840 fpb: 279 bpp: 1263 sleep: 0 us
load: time: 1.00s fps: 54,766 fpGbps: 0.555 fpb: 9 bpp: 1258 sleep: 100 us
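For reference, the load lines appear to report packets freed per second (fps), the same traffic expressed in Gbps (fpGbps), packets per engine breath (fpb), the average packet size in bytes (bpp), and the engine's current sleep interval. On that reading, a line's throughput is roughly fps * bpp * 8 bits. A back-of-the-envelope check against one line of the burst above (the field interpretation is an assumption, and bpp is a rounded average, so the result only approximates the reported 11.560):

  -- Rough check of "fps: 1,151,522 ... fpGbps: 11.560 ... bpp: 1245",
  -- assuming fps is packets per second and bpp the average bytes per packet.
  local fps, bpp = 1151522, 1245
  print(fps * bpp * 8 / 1e9)   --> ~11.47 (Gbps)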
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 4 fpGbps: 0.000 fpb: 0 bpp: 9062 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 4 fpGbps: 0.000 fpb: 0 bpp: 82 sleep: 100 us
load: time: 1.00s fps: 176,997 fpGbps: 8.694 fpb: 20 bpp: 6130 sleep: 0 us
load: time: 1.00s fps: 261,596 fpGbps: 13.020 fpb: 40 bpp: 6212 sleep: 0 us
load: time: 1.00s fps: 219,104 fpGbps: 10.850 fpb: 33 bpp: 6180 sleep: 0 us
load: time: 1.00s fps: 265,433 fpGbps: 13.347 fpb: 44 bpp: 6276 sleep: 0 us
load: time: 1.00s fps: 194,331 fpGbps: 9.594 fpb: 28 bpp: 6162 sleep: 0 us
load: time: 1.00s fps: 251,369 fpGbps: 12.422 fpb: 36 bpp: 6167 sleep: 0 us
load: time: 1.00s fps: 255,563 fpGbps: 12.621 fpb: 37 bpp: 6163 sleep: 0 us
load: time: 1.00s fps: 227,847 fpGbps: 11.338 fpb: 34 bpp: 6211 sleep: 0 us
load: time: 1.00s fps: 257,976 fpGbps: 12.849 fpb: 39 bpp: 6216 sleep: 0 us
load: time: 1.00s fps: 221,016 fpGbps: 10.922 fpb: 31 bpp: 6168 sleep: 0 us
load: time: 1.00s fps: 44,053 fpGbps: 2.205 fpb: 7 bpp: 6247 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
Loading /tmp/snabb_nfv_selftest_ports.1066
engine: start app A_TxLimit
engine: start app B_TxLimit
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 4 fpGbps: 0.000 fpb: 0 bpp: 118 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 3 fpGbps: 0.000 fpb: 0 bpp: 82 sleep: 100 us
load: time: 1.00s fps: 108,234 fpGbps: 1.334 fpb: 6 bpp: 1531 sleep: 1 us
load: time: 1.00s fps: 128,505 fpGbps: 1.584 fpb: 5 bpp: 1531 sleep: 1 us
load: time: 1.00s fps: 103,864 fpGbps: 1.280 fpb: 4 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 109,679 fpGbps: 1.352 fpb: 4 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 104,949 fpGbps: 1.294 fpb: 4 bpp: 1531 sleep: 1 us
load: time: 1.00s fps: 104,350 fpGbps: 1.286 fpb: 4 bpp: 1531 sleep: 1 us
load: time: 1.00s fps: 109,896 fpGbps: 1.355 fpb: 4 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 103,890 fpGbps: 1.281 fpb: 4 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 109,882 fpGbps: 1.355 fpb: 4 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 109,772 fpGbps: 1.353 fpb: 4 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 21,224 fpGbps: 0.262 fpb: 2 bpp: 1531 sleep: 100 us
load: time: 1.00s fps: 4 fpGbps: 0.000 fpb: 0 bpp: 1413 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 4 fpGbps: 0.000 fpb: 0 bpp: 9062 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 8 fpGbps: 0.000 fpb: 0 bpp: 82 sleep: 100 us
load: time: 1.00s fps: 104,521 fpGbps: 1.289 fpb: 7 bpp: 1531 sleep: 1 us
load: time: 1.00s fps: 105,915 fpGbps: 1.306 fpb: 5 bpp: 1531 sleep: 1 us
load: time: 1.00s fps: 109,878 fpGbps: 1.355 fpb: 5 bpp: 1531 sleep: 1 us
load: time: 1.00s fps: 93,724 fpGbps: 1.155 fpb: 4 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 109,400 fpGbps: 1.349 fpb: 5 bpp: 1531 sleep: 1 us
load: time: 1.00s fps: 109,889 fpGbps: 1.355 fpb: 5 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 95,516 fpGbps: 1.178 fpb: 4 bpp: 1531 sleep: 1 us
load: time: 1.00s fps: 109,895 fpGbps: 1.355 fpb: 5 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 98,732 fpGbps: 1.217 fpb: 4 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 105,684 fpGbps: 1.303 fpb: 5 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 27,357 fpGbps: 0.337 fpb: 2 bpp: 1531 sleep: 62 us
load: time: 1.00s fps: 4 fpGbps: 0.000 fpb: 0 bpp: 1413 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
Loading /tmp/snabb_nfv_selftest_ports.1066
engine: stop app A_TxLimit
engine: stop app B_TxLimit
engine: start app A_RxLimit
engine: start app B_RxLimit
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 3 fpGbps: 0.000 fpb: 0 bpp: 118 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 4 fpGbps: 0.000 fpb: 0 bpp: 82 sleep: 100 us
load: time: 1.00s fps: 100,242 fpGbps: 1.236 fpb: 7 bpp: 1531 sleep: 22 us
load: time: 1.00s fps: 138,088 fpGbps: 1.702 fpb: 8 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 123,240 fpGbps: 1.519 fpb: 6 bpp: 1531 sleep: 1 us
load: time: 1.00s fps: 138,178 fpGbps: 1.703 fpb: 8 bpp: 1531 sleep: 1 us
load: time: 1.00s fps: 138,172 fpGbps: 1.703 fpb: 8 bpp: 1531 sleep: 1 us
load: time: 1.00s fps: 120,709 fpGbps: 1.488 fpb: 6 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 138,170 fpGbps: 1.703 fpb: 8 bpp: 1531 sleep: 1 us
load: time: 1.00s fps: 124,829 fpGbps: 1.539 fpb: 7 bpp: 1531 sleep: 2 us
load: time: 1.00s fps: 137,956 fpGbps: 1.701 fpb: 7 bpp: 1531 sleep: 1 us
load: time: 1.00s fps: 138,156 fpGbps: 1.703 fpb: 8 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 38,089 fpGbps: 0.470 fpb: 4 bpp: 1531 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 3 fpGbps: 0.000 fpb: 0 bpp: 9062 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 4 fpGbps: 0.000 fpb: 0 bpp: 82 sleep: 100 us
load: time: 1.00s fps: 95,951 fpGbps: 1.183 fpb: 7 bpp: 1531 sleep: 0 us
load: time: 1.00s fps: 137,184 fpGbps: 1.691 fpb: 8 bpp: 1531 sleep: 27 us
load: time: 1.00s fps: 124,626 fpGbps: 1.536 fpb: 6 bpp: 1531 sleep: 25 us
load: time: 1.00s fps: 138,199 fpGbps: 1.704 fpb: 8 bpp: 1531 sleep: 24 us
load: time: 1.00s fps: 134,110 fpGbps: 1.653 fpb: 7 bpp: 1531 sleep: 20 us
load: time: 1.00s fps: 121,874 fpGbps: 1.502 fpb: 6 bpp: 1531 sleep: 20 us
load: time: 1.00s fps: 138,203 fpGbps: 1.704 fpb: 8 bpp: 1531 sleep: 19 us
load: time: 1.00s fps: 114,664 fpGbps: 1.414 fpb: 6 bpp: 1531 sleep: 18 us
load: time: 1.00s fps: 138,203 fpGbps: 1.704 fpb: 8 bpp: 1531 sleep: 39 us
load: time: 1.00s fps: 138,210 fpGbps: 1.704 fpb: 8 bpp: 1531 sleep: 13 us
load: time: 1.00s fps: 29,347 fpGbps: 0.362 fpb: 2 bpp: 1531 sleep: 100 us
load: time: 1.00s fps: 5 fpGbps: 0.000 fpb: 0 bpp: 1452 sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
load: time: 1.00s fps: 0 fpGbps: 0.000 fpb: 0 bpp: - sleep: 100 us
Loading /tmp/snabb_nfv_selftest_ports.1066
engine: stop app A_RxLimit
engine: stop app B_RxLimit
engine: start app B_ND
engine: start app B_Tunnel
engine: start app A_Tunnel
engine: start app A_ND
Apr 26 2016 14:51:40 nd_light: Sending neighbor solicitation for next-hop fe80::5054:ff:fe00:0
Apr 26 2016 14:51:40 nd_light: Sending neighbor solicitation for next-hop fe80::5054:ff:fe00:1
Apr 26 2016 14:51:40 nd_light: Resolved next-hop fe80::5054:ff:fe00:0 to 52:54:00:00:00:00
Apr 26 2016 14:51:40 nd_light: Resolved next-hop fe80::5054:ff:fe00:1 to 52:54:00:00:00:01
load: time: 1.00s fps: 7 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
Apr 26 2016 14:51:41 nd_light: Sending neighbor solicitation for next-hop fe80::5054:ff:fe00:0
Apr 26 2016 14:51:41 nd_light: Sending neighbor solicitation for next-hop fe80::5054:ff:fe00:1
load: time: 1.00s fps: 8 fpGbps: 0.000 fpb: 0 bpp: 86 sleep: 100 us
load: time: 1.00s fps: 3 fpGbps: 0.000 fpb: 0 bpp: 162 sleep: 100 us
EXITCODE: 1