cmessages-test_rolling_evac-fail-3
Jul 8 19:24:22 10.35.0.14 nova-compute 2014-07-08 19:24:22.946 6347 INFO nova.virt.libvirt.imagecache [-] Removable base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:24:22 10.35.0.14 nova-compute 2014-07-08 19:24:22.946 6347 INFO nova.virt.libvirt.imagecache [-] Base file too young to remove: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:24:23 10.35.0.13 [2014-07-08 19:24:23.049] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:24:24 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py:435: Warning: Incorrect datetime value: '2014-07-08 19:14:24+00:00' for column 'updated_at' at row 1
Jul 8 19:24:24 10.35.0.15 nova-network.log: cursor.execute(statement, parameters)
Jul 8 19:24:26 10.35.0.13 nova-compute 2014-07-08 19:24:26.962 4705 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:24:26 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:24:27 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.131 from 00:25:90:2f:2d:07 via management
Jul 8 19:24:27 10.35.0.2 dhcpd: DHCPACK on 10.35.1.131 to 00:25:90:2f:2d:07 via management
Jul 8 19:24:27 10.35.0.13 nova-compute 2014-07-08 19:24:27.338 4705 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 46705
Jul 8 19:24:27 10.35.0.13 nova-compute 2014-07-08 19:24:27.339 4705 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1860
Jul 8 19:24:27 10.35.0.13 nova-compute 2014-07-08 19:24:27.339 4705 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 2
Jul 8 19:24:27 10.35.0.13 nova-compute 2014-07-08 19:24:27.365 4705 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.13:node-10-35-0-13
Jul 8 19:24:29 10.35.0.14 [2014-07-08 19:24:29.496] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:24:31 10.35.0.15 nova-compute 2014-07-08 19:24:31.614 3328 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:24:31 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:24:31 10.35.0.13 pound: keystone-35.i.pistoncloud.com:35357 172.31.0.13 - - [08/Jul/2014:19:24:31 +0000] "POST /v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:24:31 10.35.0.13 nova-api 2014-07-08 19:24:31.888 4844 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:24:31 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:24:31 +0000] "GET /v2.0/tokens/d392a9c2a07b493697512c52094c15ac HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:24:32 10.35.0.15 nova-compute 2014-07-08 19:24:32.046 3328 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47217
Jul 8 19:24:32 10.35.0.15 nova-compute 2014-07-08 19:24:32.046 3328 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1861
Jul 8 19:24:32 10.35.0.15 nova-compute 2014-07-08 19:24:32.046 3328 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 3
Jul 8 19:24:32 10.35.0.13 pound: 10.35.2.8:8774 172.31.0.13 - - [08/Jul/2014:19:24:31 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5070 "" "python-novaclient"
Jul 8 19:24:32 10.35.0.13 nova-api 2014-07-08 19:24:32.045 4844 INFO nova.osapi_compute.wsgi.server [req-4c191bec-62c2-4a62-9514-f7aa697a1078 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 172.31.0.13,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5244 time: 0.1592360
Jul 8 19:24:32 10.35.0.15 nova-api 2014-07-08 19:24:32.059 3443 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:24:32 10.35.0.15 nova-compute 2014-07-08 19:24:32.072 3328 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.15:node-10-35-0-15
Jul 8 19:24:32 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:24:32 +0000] "GET /v2.0/tokens/d392a9c2a07b493697512c52094c15ac HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:24:32 10.35.0.15 nova-compute 2014-07-08 19:24:32.109 3328 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): checking
Jul 8 19:24:32 10.35.0.15 nova-compute 2014-07-08 19:24:32.110 3328 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): in use: on this node 1 local, 0 on other nodes sharing this instance storage
Jul 8 19:24:32 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf chown 43 /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:24:32 10.35.0.13 pound: 10.35.2.8:8774 172.31.0.13 - - [08/Jul/2014:19:24:32 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5070 "" "python-novaclient"
Jul 8 19:24:32 10.35.0.15 nova-api 2014-07-08 19:24:32.182 3443 INFO nova.osapi_compute.wsgi.server [req-267d703c-edb2-47fc-9c34-8a888bb0915a 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 172.31.0.13,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5244 time: 0.1245031
Jul 8 19:24:32 10.35.0.14 nova-api 2014-07-08 19:24:32.191 6476 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:24:32 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:24:32 +0000] "GET /v2.0/tokens/d392a9c2a07b493697512c52094c15ac HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:24:32 10.35.0.13 pound: 10.35.2.8:8774 172.31.0.13 - - [08/Jul/2014:19:24:32 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/os-floating-ips HTTP/1.1" 200 273 "" "python-novaclient"
Jul 8 19:24:32 10.35.0.14 nova-api 2014-07-08 19:24:32.274 6476 INFO nova.osapi_compute.wsgi.server [req-377548ea-be53-4842-bd54-32bf670de606 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 172.31.0.13,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/os-floating-ips HTTP/1.1" status: 200 len: 446 time: 0.0861552
Jul 8 19:24:32 10.35.0.15 nova-compute 2014-07-08 19:24:32.298 3328 INFO nova.virt.libvirt.imagecache [-] Active base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:24:32 10.35.0.13 pound: 10.35.2.8:8774 172.31.0.13 - - [08/Jul/2014:19:24:32 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/os-floating-ips HTTP/1.1" 200 273 "" "python-novaclient"
Jul 8 19:24:32 10.35.0.13 nova-api 2014-07-08 19:24:32.334 4844 INFO nova.osapi_compute.wsgi.server [req-21e3fc23-ed34-4f59-bdc2-1eb0e3ae34a6 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 172.31.0.13,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/os-floating-ips HTTP/1.1" status: 200 len: 446 time: 0.0507660
Jul 8 19:24:32 10.35.0.15 nova-api 2014-07-08 19:24:32.347 3441 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:24:32 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:24:32 +0000] "GET /v2.0/tokens/d392a9c2a07b493697512c52094c15ac HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:24:32 10.35.0.13 pound: 10.35.2.8:8774 172.31.0.13 - - [08/Jul/2014:19:24:32 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/os-floating-ips HTTP/1.1" 200 273 "" "python-novaclient"
Jul 8 19:24:32 10.35.0.15 nova-api 2014-07-08 19:24:32.418 3441 INFO nova.osapi_compute.wsgi.server [req-ebb318c9-8738-4915-87f0-0c2343e3034d 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 172.31.0.13,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/os-floating-ips HTTP/1.1" status: 200 len: 446 time: 0.0734670
Jul 8 19:24:32 10.35.0.2 [2014-07-08 19:24:32.460] 18897/Thread-11907 savage.Node[10.35.1.129]/INFO: State change: ready -> freeze:ready: Node<id=2, host_ip=10.35.0.13, status=freeze:ready, ipmi_macs=['00:25:90:8B:3F:A7'], hostmacs=['00:25:90:8B:01:70', '00:25:90:8B:01:71', '90:E2:BA:49:9C:24', '90:E2:BA:49:9C:25']>
Jul 8 19:24:32 10.35.0.2 [2014-07-08 19:24:32.707] 18897/Thread-11913 savage.Node[10.35.1.129]/INFO: State change: freeze:ready -> freeze:send-freeze: Node<id=2, host_ip=10.35.0.13, status=freeze:send-freeze, ipmi_macs=['00:25:90:8B:3F:A7'], hostmacs=['00:25:90:8B:01:70', '00:25:90:8B:01:71', '90:E2:BA:49:9C:24', '90:E2:BA:49:9C:25']>
Jul 8 19:24:32 10.35.0.13 nova-compute 2014-07-08 19:24:32.796 4705 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:24:32 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:24:33 10.35.0.13 nova-compute 2014-07-08 19:24:33.160 4705 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 46705
Jul 8 19:24:33 10.35.0.13 nova-compute 2014-07-08 19:24:33.160 4705 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1860
Jul 8 19:24:33 10.35.0.13 nova-compute 2014-07-08 19:24:33.160 4705 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 2
Jul 8 19:24:33 10.35.0.13 nova-compute 2014-07-08 19:24:33.185 4705 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.13:node-10-35-0-13
Jul 8 19:24:33 10.35.0.13 nova-compute 2014-07-08 19:24:33.241 4705 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): checking
Jul 8 19:24:33 10.35.0.13 nova-compute 2014-07-08 19:24:33.242 4705 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): in use: on this node 2 local, 0 on other nodes sharing this instance storage
Jul 8 19:24:33 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf chown 43 /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:24:33 10.35.0.13 nova-compute 2014-07-08 19:24:33.426 4705 INFO nova.virt.libvirt.imagecache [-] Active base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:24:35 10.35.0.14 nova-compute 2014-07-08 19:24:35.620 6347 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:24:35 10.35.0.14 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:24:35 10.35.0.13 [2014-07-08 19:24:35.825] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "POST /private/freeze HTTP/1.1" 200 0
Jul 8 19:24:35 10.35.0.2 [2014-07-08 19:24:35.825] 18897/Thread-11919 savage.Node[10.35.1.129]/INFO: State change: freeze:send-freeze -> freeze:freezing: Node<id=2, host_ip=10.35.0.13, status=freeze:freezing, ipmi_macs=['00:25:90:8B:3F:A7'], hostmacs=['00:25:90:8B:01:70', '00:25:90:8B:01:71', '90:E2:BA:49:9C:24', '90:E2:BA:49:9C:25']>
Jul 8 19:24:35 10.35.0.15 [2014-07-08 19:24:35.831] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:24:35 10.35.0.14 nova-compute 2014-07-08 19:24:35.984 6347 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47729
Jul 8 19:24:35 10.35.0.14 nova-compute 2014-07-08 19:24:35.984 6347 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1862
Jul 8 19:24:35 10.35.0.14 nova-compute 2014-07-08 19:24:35.984 6347 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 4
Jul 8 19:24:36 10.35.0.14 nova-compute 2014-07-08 19:24:36.012 6347 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.14:node-10-35-0-14
Jul 8 19:24:37 10.35.0.13 [2014-07-08 19:24:37.440] 941/MainThread savage.Fred/INFO: State change: ready -> freezing: Fred<ip=10.35.0.13, state=freezing>
Jul 8 19:24:37 10.35.0.13 [2014-07-08 19:24:37.440] 941/MainThread savage.Fred/INFO: Freezing.
Jul 8 19:24:37 10.35.0.13 [2014-07-08 19:24:37.440] 941/MainThread savage.Fred/INFO: Waiting for ceph to settle.
Jul 8 19:24:37 10.35.0.13 [2014-07-08 19:24:37.440] 941/MainThread savage/INFO: Called shutdown_safe
Jul 8 19:24:37 10.35.0.13 [2014-07-08 19:24:37.441] 941/MainThread savage/INFO: Called get_running_osds_mons
Jul 8 19:24:37 10.35.0.13 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._refresh_host
Jul 8 19:24:37 10.35.0.13 [2014-07-08 19:24:37.467] 941/MainThread savage/INFO: get_running_osds_mons returned ([0, 4], [1])
Jul 8 19:24:37 10.35.0.13 [2014-07-08 19:24:37.467] 941/MainThread savage/INFO: Called _verify_safe
Jul 8 19:24:37 10.35.0.13 [2014-07-08 19:24:37.467] 941/MainThread savage/INFO: execute['ceph', 'pg', 'dump', '--format=json', '-o', '/tmp/ceph_verifyzysns4']
Jul 8 19:24:37 10.35.0.13 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._clean
Jul 8 19:24:37 10.35.0.13 cobalt-compute: vmsconn DEBUG Starting to clean symlinks in /dev/nova
Jul 8 19:24:37 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/ce77d052-f021-44a2-b5f3-5908ec04fcc7_disk
Jul 8 19:24:37 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/5ee78913-e655-4cb1-a92c-500639c819ad_disk
Jul 8 19:24:37 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/libvirt_save
Jul 8 19:24:37 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/novadisk
Jul 8 19:24:37 10.35.0.13 cobalt-compute: loopingcall DEBUG Dynamic looping call sleeping for 60.00 seconds
Jul 8 19:24:37 10.35.0.13 [2014-07-08 19:24:37.791] 941/MainThread savage/WARNING: cmd ceph pg dump --format=json -o /tmp/ceph_verifyzysns4 returned with ignored stderr lines: ['dumped all in format json']
Jul 8 19:24:37 10.35.0.13 [2014-07-08 19:24:37.950] 941/MainThread savage/INFO: execute['ceph', 'quorum_status']
Jul 8 19:24:38 10.35.0.13 [2014-07-08 19:24:38.201] 941/MainThread savage/INFO: _verify_safe returned True(clean)
Jul 8 19:24:38 10.35.0.13 [2014-07-08 19:24:38.201] 941/MainThread savage/INFO: shutdown_safe returned True(clean)
Jul 8 19:24:38 10.35.0.13 [2014-07-08 19:24:38.202] 941/MainThread savage.Fred/INFO: Preventing new VMs from being scheduled here.
Jul 8 19:24:38 10.35.0.13 [2014-07-08 19:24:38.224] 941/MainThread savage/INFO: guaranteed that /etc/init exists (created=[], mode=None, owner=None, group=None)
Jul 8 19:24:38 10.35.0.13 [2014-07-08 19:24:38.224] 941/MainThread savage/INFO: /etc/init/nova-service-state.conf and /etc/init/template_write141ojI are the same, ignoring update.
Jul 8 19:24:38 10.35.0.13 [2014-07-08 19:24:38.224] 941/MainThread savage.NovaServiceState[nova-service-state]/INFO: Called start
Jul 8 19:24:38 10.35.0.13 [2014-07-08 19:24:38.225] 941/MainThread savage/INFO: execute['/sbin/start', 'nova-service-state', 'ACTION=disable']
Jul 8 19:24:38 10.35.0.13 nova-service-state.log: Trying to disable service nova-compute on 10.35.0.13.
Jul 8 19:24:40 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.129 from 00:25:90:8b:3f:a7 via management
Jul 8 19:24:40 10.35.0.2 dhcpd: DHCPACK on 10.35.1.129 to 00:25:90:8b:3f:a7 via management
Jul 8 19:24:41 10.35.0.13 nova-service-state.log: Service nova-compute on host 10.35.0.13 disabled.
Jul 8 19:24:41 10.35.0.13 nova-service-state.log: Successfully set service nova-compute to disable on 10.35.0.13.
Jul 8 19:24:41 10.35.0.13 nova-service-state.log: Trying to disable service nova-cert on 10.35.0.13.
Jul 8 19:24:44 10.35.0.13 nova-service-state.log: Service nova-cert on host 10.35.0.13 disabled.
Jul 8 19:24:44 10.35.0.13 nova-service-state.log: Successfully set service nova-cert to disable on 10.35.0.13.
Jul 8 19:24:44 10.35.0.13 nova-service-state.log: Trying to disable service nova-network on 10.35.0.13.
Jul 8 19:24:45 10.35.0.14 [2014-07-08 19:24:45.166] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:24:47 10.35.0.13 nova-service-state.log: Service nova-network on host 10.35.0.13 disabled.
Jul 8 19:24:47 10.35.0.13 nova-service-state.log: Successfully set service nova-network to disable on 10.35.0.13.
Jul 8 19:24:47 10.35.0.13 nova-service-state.log: Trying to disable service nova-scheduler on 10.35.0.13.
Jul 8 19:24:47 10.35.0.14 sshd[16220]: error: Could not load host key: /etc/ssh/ssh_host_ed25519_key
Jul 8 19:24:47 10.35.0.14 sshd[16220]: WARNING: /etc/ssh/moduli does not exist, using fixed modulus
Jul 8 19:24:49 10.35.0.14 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._refresh_host
Jul 8 19:24:49 10.35.0.14 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._clean
Jul 8 19:24:49 10.35.0.14 cobalt-compute: vmsconn DEBUG Starting to clean symlinks in /dev/nova
Jul 8 19:24:49 10.35.0.14 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/libvirt_save
Jul 8 19:24:49 10.35.0.14 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/novadisk
Jul 8 19:24:49 10.35.0.14 cobalt-compute: loopingcall DEBUG Dynamic looping call sleeping for 60.00 seconds
Jul 8 19:24:49 10.35.0.14 sshd[16220]: Failed password for root from 10.35.0.2 port 50130 ssh2
Jul 8 19:24:50 10.35.0.13 nova-service-state.log: Service nova-scheduler on host 10.35.0.13 disabled.
Jul 8 19:24:50 10.35.0.13 nova-service-state.log: Successfully set service nova-scheduler to disable on 10.35.0.13.
Jul 8 19:24:50 10.35.0.13 nova-service-state.log: Trying to disable service cobalt on 10.35.0.13.
Jul 8 19:24:51 10.35.0.13 [2014-07-08 19:24:51.393] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3098
Jul 8 19:24:51 10.35.0.15 [2014-07-08 19:24:51.396] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:24:53 10.35.0.13 nova-service-state.log: Service cobalt on host 10.35.0.13 disabled.
Jul 8 19:24:53 10.35.0.13 nova-service-state.log: Successfully set service cobalt to disable on 10.35.0.13.
Jul 8 19:24:53 10.35.0.13 [2014-07-08 19:24:53.143] 941/MainThread savage.NovaServiceState[nova-service-state]/INFO: start returned nova-service-state stop/waiting
Jul 8 19:24:53 10.35.0.13 [2014-07-08 19:24:53.143] 941/MainThread savage.Fred/INFO: Evacuating VMs.
Jul 8 19:24:53 10.35.0.13 [2014-07-08 19:24:53.224] 941/MainThread savage.node-evacuation/INFO: Called evacuate
Jul 8 19:24:53 10.35.0.13 [2014-07-08 19:24:53.224] 941/MainThread savage.Fred/INFO: Checking for VMs to migrate.
Jul 8 19:24:53 10.35.0.13 [2014-07-08 19:24:53.224] 941/MainThread savage.node-evacuation/INFO: Called list
Jul 8 19:24:53 10.35.0.13 [2014-07-08 19:24:53.225] 941/MainThread novaclient.client/DEBUG: New session created for: (http://10.35.2.3:5000)
Jul 8 19:24:53 10.35.0.15 ceph-watchdog.log: Testing for ceph timout...
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.351] 21566/MainThread savage.kick_ceph/DEBUG: Testing for ceph timout...
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.352] 21566/MainThread savage/INFO: Called lspools
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.382] 21566/MainThread savage/INFO: lspools returned ['data', 'metadata', 'rbd', 'images', 'slowrbd', 'crash-dumps', 'gc-vms', '.rgw', '.rgw.buckets', '.rgw.control', '.rgw.gc', '.log', '.intent-log', '.usage', '.users', '.users.email', '.users.swift', '.users.uid', '.rgw.root']
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.382] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.397] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.397] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.13 pound: 10.35.2.3:5000 10.35.2.7 - - [08/Jul/2014:19:24:53 +0000] "POST /v2.0/tokens HTTP/1.1" 200 3290 "" "python-novaclient"
Jul 8 19:24:53 10.35.0.13 [2014-07-08 19:24:53.406] 941/MainThread novaclient.client/DEBUG: New session created for: (http://10.35.2.8:8774)
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.411] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.411] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.13 nova-api 2014-07-08 19:24:53.411 4842 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.424] 21566/MainThread savage/INFO: rbd_list returned ['mysql']
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.425] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:24:53 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.438] 21566/MainThread savage/INFO: rbd_list returned ['8f23694a-2995-4070-a5e8-661de3207f2e']
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.438] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.452] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.452] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.465] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.465] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.479] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.479] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.493] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.493] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.506] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.507] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:24:53 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/detail?all_tenants=True&host=10.35.0.13 HTTP/1.1" 200 3384 "" "python-novaclient"
Jul 8 19:24:53 10.35.0.13 nova-api 2014-07-08 19:24:53.516 4842 INFO nova.osapi_compute.wsgi.server [req-f833e2a0-c7ee-4fd5-b80c-177636a251b7 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/detail?all_tenants=True&host=10.35.0.13 HTTP/1.1" status: 200 len: 3558 time: 0.1065788
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.521] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.13 [2014-07-08 19:24:53.519] 941/MainThread savage.node-evacuation/INFO: list returned [<Server: vm-39852899123>, <Server: vm-3904852117>]
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.521] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.13 [2014-07-08 19:24:53.519] 941/MainThread savage.node-evacuation/INFO: Requesting gc migration of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:24:53 10.35.0.15 nova-api 2014-07-08 19:24:53.528 3443 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.533] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.534] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:24:53 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.547] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.547] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.561] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.561] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.575] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.575] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.588] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.588] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.601] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.601] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.614] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.615] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.628] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.629] 21566/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:53 10.35.0.15 nova-api 2014-07-08 19:24:53.632 3443 ERROR nova.cobalt.api [req-a377b0fb-15d0-49a4-8b46-7904db6d118e 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: migrate_instance instance_uuid=5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.642] 21566/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:53 10.35.0.15 ceph-watchdog.log: Ceph is responding without timeout
Jul 8 19:24:53 10.35.0.15 [2014-07-08 19:24:53.642] 21566/MainThread savage.kick_ceph/DEBUG: Ceph is responding without timeout
Jul 8 19:24:53 10.35.0.15 nova-api 2014-07-08 19:24:53.744 3443 ERROR nova.cobalt.api [req-a377b0fb-15d0-49a4-8b46-7904db6d118e 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: migrate_instance dest=10.35.0.14 set state=MIGRATING
Jul 8 19:24:53 10.35.0.15 nova-api 2014-07-08 19:24:53.746 3443 ERROR oslo.messaging._drivers.impl_rabbit [-] Failed to publish message to topic 'cobalt.10.35.0.13': [Errno 104] Connection reset by peer
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit Traceback (most recent call last):
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 672, in ensure
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit return method(*args, **kwargs)
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 768, in _publish
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit publisher = cls(self.conf, self.channel, topic, **kwargs)
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 429, in __init__
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit **options)
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 376, in __init__
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit self.reconnect(channel)
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 384, in reconnect
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit routing_key=self.routing_key)
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/messaging.py", line 85, in __init__
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit self.revive(self._channel)
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/messaging.py", line 224, in revive
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit self.declare()
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/messaging.py", line 105, in declare
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit self.exchange.declare()
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/entity.py", line 166, in declare
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit nowait=nowait, passive=passive,
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/channel.py", line 613, in exchange_declare
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit self._send_method((40, 10), args)
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/abstract_channel.py", line 56, in _send_method
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit self.channel_id, method_sig, args, content,
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/method_framing.py", line 221, in write_method
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit write_frame(1, channel, payload)
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/transport.py", line 177, in write_frame
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit frame_type, channel, size, payload, 0xce,
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/eventlet/greenio.py", line 307, in sendall
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit tail = self.send(data, flags)
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/eventlet/greenio.py", line 293, in send
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit total_sent += fd.send(data[total_sent:], flags)
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit error: [Errno 104] Connection reset by peer
2014-07-08 19:24:53.746 3443 TRACE oslo.messaging._drivers.impl_rabbit
Jul 8 19:24:53 10.35.0.15 nova-api 2014-07-08 19:24:53.748 3443 INFO oslo.messaging._drivers.impl_rabbit [-] Reconnecting to AMQP server on 10.35.0.3:5672
Jul 8 19:24:53 10.35.0.15 nova-api 2014-07-08 19:24:53.748 3443 INFO oslo.messaging._drivers.impl_rabbit [-] Delaying reconnect for 1.0 seconds...
Jul 8 19:24:53 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py:435: Warning: Incorrect datetime value: '2014-07-08 19:14:53+00:00' for column 'updated_at' at row 1
Jul 8 19:24:53 10.35.0.13 nova-network.log: cursor.execute(statement, parameters)
Jul 8 19:24:54 10.35.0.13 rabbitmq.log: =INFO REPORT==== 8-Jul-2014::19:24:54 ===
Jul 8 19:24:54 10.35.0.13 rabbitmq.log: accepting AMQP connection <0.2010.0> (10.35.0.15:41498 -> 10.35.0.3:5672)
Jul 8 19:24:54 10.35.0.15 nova-api 2014-07-08 19:24:54.768 3443 INFO oslo.messaging._drivers.impl_rabbit [-] Connected to AMQP server on 10.35.0.3:5672
Jul 8 19:24:54 10.35.0.15 nova-api 2014-07-08 19:24:54.769 3443 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-a377b0fb-15d0-49a4-8b46-7904db6d118e\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"dest\\": \\"10.35.0.14\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"567cfaa672cf4deeb5010eb56a1397fc\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:53.546069\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:24:54 10.35.0.15 nova-api 2014-07-08 19:24:54.769 3443 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-a377b0fb-15d0-49a4-8b46-7904db6d118e\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"dest\\": \\"10.35.0.14\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"567cfaa672cf4deeb5010eb56a1397fc\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:53.546069\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key=u'cobalt.10.35.0.13'
Jul 8 19:24:54 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-a377b0fb-15d0-49a4-8b46-7904db6d118e\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"dest\\": \\"10.35.0.14\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"567cfaa672cf4deeb5010eb56a1397fc\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:53.546069\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'2', 'redelivered': False, 'routing_key': u'cobalt.10.35.0.13', 'delivery_tag': 1, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d3711150>}
Jul 8 19:24:54 10.35.0.13 cobalt-compute: common DEBUG received {u'_context_roles': [u'Admin', u'_member_', u'ProjectAdmin', u'admin'], u'_context_request_id': u'req-a377b0fb-15d0-49a4-8b46-7904db6d118e', u'_context_quota_class': None, u'_context_instance_lock_checked': False, u'_context_project_name': u'service', u'_context_service_catalog': [{u'endpoints_links': [], u'endpoints': [{u'adminURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'region': u'RegionOne', u'internalURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'publicURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a'}], u'type': u'volume', u'name': u'Volume Service'}], u'args': {u'instance_uuid': u'5ee78913-e655-4cb1-a92c-500639c819ad', u'dest': u'10.35.0.14'}, u'_context_tenant': u'9a663247f58a4412a840385af8d9e73a', u'_context_user': u'260659016df54f47833004653cfe0cb8', u'_context_auth_token': '<SANITIZED>', u'method': u'migrate_instance', u'_context_is_admin': True, u'version': u'2.0', u'_context_project_id': u'9a663247f58a4412a840385af8d9e73a', u'_context_timestamp': u'2014-07-08T19:24:53.546069', u'_unique_id': u'567cfaa672cf4deeb5010eb56a1397fc', u'_context_read_deleted': u'no', u'_context_user_id': u'260659016df54f47833004653cfe0cb8', u'_context_user_name': u'nova', u'_context_remote_address': u'10.35.0.13'}
Jul 8 19:24:54 10.35.0.15 nova-api 2014-07-08 19:24:54.772 3443 INFO nova.osapi_compute.wsgi.server [req-a377b0fb-15d0-49a4-8b46-7904db6d118e 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "POST /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad/action HTTP/1.1" status: 200 len: 179 time: 1.2467179
Jul 8 19:24:54 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:24:53 +0000] "POST /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad/action HTTP/1.1" 200 - "" "python-novaclient"
Jul 8 19:24:54 10.35.0.13 cobalt-compute: common DEBUG unpacked context: {'tenant': u'9a663247f58a4412a840385af8d9e73a', 'project_name': u'service', 'user_id': u'260659016df54f47833004653cfe0cb8', 'roles': [u'Admin', u'_member_', u'ProjectAdmin', u'admin'], 'timestamp': u'2014-07-08T19:24:53.546069', 'auth_token': '<SANITIZED>', 'remote_address': u'10.35.0.13', 'quota_class': None, 'is_admin': True, 'user': u'260659016df54f47833004653cfe0cb8', 'service_catalog': [{u'endpoints': [{u'adminURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'region': u'RegionOne', u'internalURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'publicURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a'}], u'endpoints_links': [], u'type': u'volume', u'name': u'Volume Service'}], 'request_id': u'req-a377b0fb-15d0-49a4-8b46-7904db6d118e', 'instance_lock_checked': False, 'project_id': u'9a663247f58a4412a840385af8d9e73a', 'user_name': u'nova', 'read_deleted': u'no'}
Jul 8 19:24:54 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-a377b0fb-15d0-49a4-8b46-7904db6d118e\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"dest\\": \\"10.35.0.14\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"567cfaa672cf4deeb5010eb56a1397fc\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:53.546069\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:24:54 10.35.0.13 [2014-07-08 19:24:54.771] 941/MainThread savage.Fred/INFO: Requested VM migrations.
Jul 8 19:24:54 10.35.0.13 [2014-07-08 19:24:54.771] 941/MainThread savage.node-evacuation/INFO: Requesting gc migration of VM id ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:24:54 10.35.0.14 nova-api 2014-07-08 19:24:54.776 6476 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:24:54 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:24:54 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:24:54 10.35.0.13 cobalt-compute: manager DEBUG migrate_instance called: {u'instance_uuid': u'5ee78913-e655-4cb1-a92c-500639c819ad', u'dest': u'10.35.0.14', 'instance': <nova.objects.instance.Instance object at 0x7ff0d29185d0>}
Jul 8 19:24:54 10.35.0.13 cobalt-compute: manager DEBUG Locking instance 5ee78913-e655-4cb1-a92c-500639c819ad (fn:migrate_instance)
Jul 8 19:24:54 10.35.0.13 cobalt-compute: manager DEBUG Acquiring lock for instance 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:24:54 10.35.0.13 cobalt-compute: manager DEBUG Acquired lock for instance 5ee78913-e655-4cb1-a92c-500639c819ad (me: 140672292366768, refcount=1)
Jul 8 19:24:54 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance=5ee78913-e655-4cb1-a92c-500639c819ad calling pre_migrate
Jul 8 19:24:54 10.35.0.13 cobalt-compute: hooks INFO Calling hooks for pre_migrate action
Jul 8 19:24:54 10.35.0.13 cobalt-compute: hooks INFO Done calling hooks for pre_migrate action
Jul 8 19:24:54 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='5ee78913-e655-4cb1-a92c-500639c819ad' self.host='10.35.0.13' instance.host=u'10.35.0.13'
Jul 8 19:24:54 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='5ee78913-e655-4cb1-a92c-500639c819ad' migration_address='10.35.0.13'
Jul 8 19:24:54 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='5ee78913-e655-4cb1-a92c-500639c819ad' calling get_instance_nw_info
Jul 8 19:24:54 10.35.0.13 cobalt-compute: amqpdriver DEBUG MSG_ID is ebea5c3e6cd54b0cb4cf6a28624c211d
Jul 8 19:24:54 10.35.0.13 cobalt-compute: amqp DEBUG UNIQUE_ID is d2562ad9fdbd4eebb4e5cdc4c53c531b.
Jul 8 19:24:54 10.35.0.13 cobalt-compute: impl_rabbit ERROR Failed to publish message to topic 'network': [Errno 104] Connection reset by peer
Traceback (most recent call last):
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 672, in ensure
return method(*args, **kwargs)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 768, in _publish
publisher = cls(self.conf, self.channel, topic, **kwargs)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 429, in __init__
**options)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 376, in __init__
self.reconnect(channel)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 384, in reconnect
routing_key=self.routing_key)
File "/usr/lib64/python2.7/site-packages/kombu/messaging.py", line 85, in __init__
self.revive(self._channel)
File "/usr/lib64/python2.7/site-packages/kombu/messaging.py", line 224, in revive
self.declare()
File "/usr/lib64/python2.7/site-packages/kombu/messaging.py", line 105, in declare
self.exchange.declare()
File "/usr/lib64/python2.7/site-packages/kombu/entity.py", line 166, in declare
nowait=nowait, passive=passive,
File "/usr/lib64/python2.7/site-packages/amqp/channel.py", line 613, in exchange_declare
self._send_method((40, 10), args)
File "/usr/lib64/python2.7/site-packages/amqp/abstract_channel.py", line 56, in _send_method
self.channel_id, method_sig, args, content,
File "/usr/lib64/python2.7/site-packages/amqp/method_framing.py", line 221, in write_method
write_frame(1, channel, payload)
File "/usr/lib64/python2.7/site-packages/amqp/transport.py", line 177, in write_frame
frame_type, channel, size, payload, 0xce,
File "/usr/lib64/python2.7/site-packages/eventlet/greenio.py", line 307, in sendall
tail = self.send(data, flags)
File "/usr/lib64/python2.7/site-packages/eventlet/greenio.py", line 293, in send
total_sent += fd.send(data[total_sent:], flags)
error: [Errno 104] Connection reset by peer
Jul 8 19:24:54 10.35.0.13 cobalt-compute: impl_rabbit INFO Reconnecting to AMQP server on 10.35.0.3:5672
Jul 8 19:24:54 10.35.0.13 cobalt-compute: impl_rabbit INFO Delaying reconnect for 1.0 seconds...
Jul 8 19:24:54 10.35.0.14 nova-api 2014-07-08 19:24:54.886 6476 ERROR nova.cobalt.api [req-99e565b0-13a0-44d0-823a-8a4128c519b0 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: migrate_instance instance_uuid=ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:24:54 10.35.0.14 nova-api 2014-07-08 19:24:54.991 6476 ERROR nova.cobalt.api [req-99e565b0-13a0-44d0-823a-8a4128c519b0 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: migrate_instance dest=10.35.0.15 set state=MIGRATING
Jul 8 19:24:54 10.35.0.14 nova-api 2014-07-08 19:24:54.992 6476 ERROR oslo.messaging._drivers.impl_rabbit [-] Failed to publish message to topic 'cobalt.10.35.0.13': [Errno 104] Connection reset by peer
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit Traceback (most recent call last):
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 672, in ensure
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit return method(*args, **kwargs)
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 768, in _publish
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit publisher = cls(self.conf, self.channel, topic, **kwargs)
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 429, in __init__
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit **options)
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 376, in __init__
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit self.reconnect(channel)
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 384, in reconnect
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit routing_key=self.routing_key)
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/messaging.py", line 85, in __init__
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit self.revive(self._channel)
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/messaging.py", line 224, in revive
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit self.declare()
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/messaging.py", line 105, in declare
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit self.exchange.declare()
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/entity.py", line 166, in declare
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit nowait=nowait, passive=passive,
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/channel.py", line 613, in exchange_declare
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit self._send_method((40, 10), args)
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/abstract_channel.py", line 56, in _send_method
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit self.channel_id, method_sig, args, content,
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/method_framing.py", line 221, in write_method
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit write_frame(1, channel, payload)
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/transport.py", line 177, in write_frame
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit frame_type, channel, size, payload, 0xce,
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/eventlet/greenio.py", line 307, in sendall
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit tail = self.send(data, flags)
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/eventlet/greenio.py", line 293, in send
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit total_sent += fd.send(data[total_sent:], flags)
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit error: [Errno 104] Connection reset by peer
2014-07-08 19:24:54.992 6476 TRACE oslo.messaging._drivers.impl_rabbit
Jul 8 19:24:54 10.35.0.14 nova-api 2014-07-08 19:24:54.995 6476 INFO oslo.messaging._drivers.impl_rabbit [-] Reconnecting to AMQP server on 10.35.0.3:5672
Jul 8 19:24:54 10.35.0.14 nova-api 2014-07-08 19:24:54.996 6476 INFO oslo.messaging._drivers.impl_rabbit [-] Delaying reconnect for 1.0 seconds...
Jul 8 19:24:55 10.35.0.13 rabbitmq.log: =INFO REPORT==== 8-Jul-2014::19:24:55 ===
Jul 8 19:24:55 10.35.0.13 rabbitmq.log: accepting AMQP connection <0.2022.0> (10.35.0.13:42589 -> 10.35.0.3:5672)
Jul 8 19:24:55 10.35.0.13 cobalt-compute: connection DEBUG Start from server, version: 0.9, properties: {u'information': u'Licensed under the MPL. See http://www.rabbitmq.com/', u'product': u'RabbitMQ', u'copyright': u'Copyright (C) 2007-2013 VMware, Inc.', u'capabilities': {u'exchange_exchange_bindings': True, u'consumer_cancel_notify': True, u'publisher_confirms': True, u'basic.nack': True}, u'platform': u'Erlang/OTP', u'version': u'3.1.3'}, mechanisms: [u'PLAIN', u'AMQPLAIN'], locales: [u'en_US']
Jul 8 19:24:55 10.35.0.13 cobalt-compute: connection DEBUG Open OK!
Jul 8 19:24:55 10.35.0.13 cobalt-compute: channel DEBUG using channel_id: 1
Jul 8 19:24:55 10.35.0.13 cobalt-compute: channel DEBUG Channel open
Jul 8 19:24:55 10.35.0.13 cobalt-compute: impl_rabbit INFO Connected to AMQP server on 10.35.0.3:5672
Jul 8 19:24:55 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"ebea5c3e6cd54b0cb4cf6a28624c211d\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-a377b0fb-15d0-49a4-8b46-7904db6d118e\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"d2562ad9fdbd4eebb4e5cdc4c53c531b\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:53.546069\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:24:55 10.35.0.13 cobalt-compute: messaging ERROR NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"ebea5c3e6cd54b0cb4cf6a28624c211d\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-a377b0fb-15d0-49a4-8b46-7904db6d118e\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"d2562ad9fdbd4eebb4e5cdc4c53c531b\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:53.546069\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:24:55 10.35.0.13 cobalt-compute: channel DEBUG Closed channel #1
Jul 8 19:24:55 10.35.0.13 cobalt-compute: channel DEBUG using channel_id: 1
Jul 8 19:24:55 10.35.0.13 nova-network 2014-07-08 19:24:55.895 4876 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"ebea5c3e6cd54b0cb4cf6a28624c211d\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-a377b0fb-15d0-49a4-8b46-7904db6d118e\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"d2562ad9fdbd4eebb4e5cdc4c53c531b\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:53.546069\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 25, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fdeb6472390>}
Jul 8 19:24:55 10.35.0.13 cobalt-compute: channel DEBUG Channel open
Jul 8 19:24:55 10.35.0.13 cobalt-compute: impl_rabbit ERROR Failed to consume message from queue: [Errno 104] Connection reset by peer
Traceback (most recent call last):
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 672, in ensure
return method(*args, **kwargs)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 752, in _consume
return self.connection.drain_events(timeout=timeout)
File "/usr/lib64/python2.7/site-packages/kombu/connection.py", line 281, in drain_events
return self.transport.drain_events(self.connection, **kwargs)
File "/usr/lib64/python2.7/site-packages/kombu/transport/pyamqp.py", line 103, in drain_events
return connection.drain_events(**kwargs)
File "/usr/lib64/python2.7/site-packages/amqp/connection.py", line 320, in drain_events
return amqp_method(channel, args)
File "/usr/lib64/python2.7/site-packages/amqp/connection.py", line 523, in _close
self._x_close_ok()
File "/usr/lib64/python2.7/site-packages/amqp/connection.py", line 551, in _x_close_ok
self._send_method((10, 51))
File "/usr/lib64/python2.7/site-packages/amqp/abstract_channel.py", line 56, in _send_method
self.channel_id, method_sig, args, content,
File "/usr/lib64/python2.7/site-packages/amqp/method_framing.py", line 221, in write_method
write_frame(1, channel, payload)
File "/usr/lib64/python2.7/site-packages/amqp/transport.py", line 177, in write_frame
frame_type, channel, size, payload, 0xce,
File "/usr/lib64/python2.7/site-packages/eventlet/greenio.py", line 307, in sendall
tail = self.send(data, flags)
File "/usr/lib64/python2.7/site-packages/eventlet/greenio.py", line 293, in send
total_sent += fd.send(data[total_sent:], flags)
error: [Errno 104] Connection reset by peer
Jul 8 19:24:55 10.35.0.13 cobalt-compute: impl_rabbit INFO Reconnecting to AMQP server on 10.35.0.3:5672
Jul 8 19:24:55 10.35.0.13 cobalt-compute: impl_rabbit INFO Delaying reconnect for 1.0 seconds...
Jul 8 19:24:55 10.35.0.13 nova-network 2014-07-08 19:24:55.896 4876 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"ebea5c3e6cd54b0cb4cf6a28624c211d\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-a377b0fb-15d0-49a4-8b46-7904db6d118e\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"d2562ad9fdbd4eebb4e5cdc4c53c531b\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:53.546069\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:24:56 10.35.0.13 rabbitmq.log: =INFO REPORT==== 8-Jul-2014::19:24:56 ===
Jul 8 19:24:56 10.35.0.13 rabbitmq.log: accepting AMQP connection <0.2034.0> (10.35.0.14:35461 -> 10.35.0.3:5672)
Jul 8 19:24:56 10.35.0.14 nova-api 2014-07-08 19:24:56.014 6476 INFO oslo.messaging._drivers.impl_rabbit [-] Connected to AMQP server on 10.35.0.3:5672
Jul 8 19:24:56 10.35.0.14 nova-api 2014-07-08 19:24:56.015 6476 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-99e565b0-13a0-44d0-823a-8a4128c519b0\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"dest\\": \\"10.35.0.15\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"1660e1c7c3ff46ce8c178f11fc35a478\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:54.794950\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:24:56 10.35.0.14 nova-api 2014-07-08 19:24:56.015 6476 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-99e565b0-13a0-44d0-823a-8a4128c519b0\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"dest\\": \\"10.35.0.15\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"1660e1c7c3ff46ce8c178f11fc35a478\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:54.794950\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key=u'cobalt.10.35.0.13'
Jul 8 19:24:56 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-99e565b0-13a0-44d0-823a-8a4128c519b0\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"dest\\": \\"10.35.0.15\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"1660e1c7c3ff46ce8c178f11fc35a478\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:54.794950\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'2', 'redelivered': False, 'routing_key': u'cobalt.10.35.0.13', 'delivery_tag': 2, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d3711150>}
Jul 8 19:24:56 10.35.0.13 cobalt-compute: common DEBUG received {u'_context_roles': [u'Admin', u'_member_', u'ProjectAdmin', u'admin'], u'_context_request_id': u'req-99e565b0-13a0-44d0-823a-8a4128c519b0', u'_context_quota_class': None, u'_context_instance_lock_checked': False, u'_context_project_name': u'service', u'_context_service_catalog': [{u'endpoints_links': [], u'endpoints': [{u'adminURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'region': u'RegionOne', u'internalURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'publicURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a'}], u'type': u'volume', u'name': u'Volume Service'}], u'args': {u'instance_uuid': u'ce77d052-f021-44a2-b5f3-5908ec04fcc7', u'dest': u'10.35.0.15'}, u'_context_tenant': u'9a663247f58a4412a840385af8d9e73a', u'_context_user': u'260659016df54f47833004653cfe0cb8', u'_context_auth_token': '<SANITIZED>', u'method': u'migrate_instance', u'_context_is_admin': True, u'version': u'2.0', u'_context_project_id': u'9a663247f58a4412a840385af8d9e73a', u'_context_timestamp': u'2014-07-08T19:24:54.794950', u'_unique_id': u'1660e1c7c3ff46ce8c178f11fc35a478', u'_context_read_deleted': u'no', u'_context_user_id': u'260659016df54f47833004653cfe0cb8', u'_context_user_name': u'nova', u'_context_remote_address': u'10.35.0.13'}
Jul 8 19:24:56 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:24:54 +0000] "POST /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7/action HTTP/1.1" 200 - "" "python-novaclient"
Jul 8 19:24:56 10.35.0.14 nova-api 2014-07-08 19:24:56.018 6476 INFO nova.osapi_compute.wsgi.server [req-99e565b0-13a0-44d0-823a-8a4128c519b0 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "POST /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7/action HTTP/1.1" status: 200 len: 179 time: 1.2437801
Jul 8 19:24:56 10.35.0.13 cobalt-compute: common DEBUG unpacked context: {'tenant': u'9a663247f58a4412a840385af8d9e73a', 'project_name': u'service', 'user_id': u'260659016df54f47833004653cfe0cb8', 'roles': [u'Admin', u'_member_', u'ProjectAdmin', u'admin'], 'timestamp': u'2014-07-08T19:24:54.794950', 'auth_token': '<SANITIZED>', 'remote_address': u'10.35.0.13', 'quota_class': None, 'is_admin': True, 'user': u'260659016df54f47833004653cfe0cb8', 'service_catalog': [{u'endpoints': [{u'adminURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'region': u'RegionOne', u'internalURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'publicURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a'}], u'endpoints_links': [], u'type': u'volume', u'name': u'Volume Service'}], 'request_id': u'req-99e565b0-13a0-44d0-823a-8a4128c519b0', 'instance_lock_checked': False, 'project_id': u'9a663247f58a4412a840385af8d9e73a', 'user_name': u'nova', 'read_deleted': u'no'}
Jul 8 19:24:56 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-99e565b0-13a0-44d0-823a-8a4128c519b0\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"dest\\": \\"10.35.0.15\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"1660e1c7c3ff46ce8c178f11fc35a478\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:54.794950\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:24:56 10.35.0.13 [2014-07-08 19:24:56.020] 941/MainThread savage.Fred/INFO: Requested VM migrations.
Jul 8 19:24:56 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:24:56 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:24:56 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:24:56 10.35.0.13 nova-network 2014-07-08 19:24:56.039 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"ce31e73a061b43e2ac575d90d792fdf6\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"ebea5c3e6cd54b0cb4cf6a28624c211d\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:24:56 10.35.0.13 nova-network 2014-07-08 19:24:56.039 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"ce31e73a061b43e2ac575d90d792fdf6\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"ebea5c3e6cd54b0cb4cf6a28624c211d\\"}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:24:56 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:24:56 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:24:56 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:24:56 10.35.0.13 nova-network 2014-07-08 19:24:56.041 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"671667177f9d4b2e9c7e695d4595ac17\\", \\"failure\\": null, \\"_msg_id\\": \\"ebea5c3e6cd54b0cb4cf6a28624c211d\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:24:56 10.35.0.13 nova-network 2014-07-08 19:24:56.041 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"671667177f9d4b2e9c7e695d4595ac17\\", \\"failure\\": null, \\"_msg_id\\": \\"ebea5c3e6cd54b0cb4cf6a28624c211d\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:24:56 10.35.0.13 cobalt-compute: manager DEBUG migrate_instance called: {u'instance_uuid': u'ce77d052-f021-44a2-b5f3-5908ec04fcc7', u'dest': u'10.35.0.15', 'instance': <nova.objects.instance.Instance object at 0x7ff0d2864250>}
Jul 8 19:24:56 10.35.0.13 cobalt-compute: manager DEBUG Locking instance ce77d052-f021-44a2-b5f3-5908ec04fcc7 (fn:migrate_instance)
Jul 8 19:24:56 10.35.0.13 cobalt-compute: manager DEBUG Acquiring lock for instance ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:24:56 10.35.0.13 cobalt-compute: manager DEBUG Acquired lock for instance ce77d052-f021-44a2-b5f3-5908ec04fcc7 (me: 140672304066384, refcount=1)
Jul 8 19:24:56 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance=ce77d052-f021-44a2-b5f3-5908ec04fcc7 calling pre_migrate
Jul 8 19:24:56 10.35.0.13 cobalt-compute: hooks INFO Calling hooks for pre_migrate action
Jul 8 19:24:56 10.35.0.13 cobalt-compute: hooks INFO Done calling hooks for pre_migrate action
Jul 8 19:24:56 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='ce77d052-f021-44a2-b5f3-5908ec04fcc7' self.host='10.35.0.13' instance.host=u'10.35.0.13'
Jul 8 19:24:56 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='ce77d052-f021-44a2-b5f3-5908ec04fcc7' migration_address='10.35.0.13'
Jul 8 19:24:56 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='ce77d052-f021-44a2-b5f3-5908ec04fcc7' calling get_instance_nw_info
Jul 8 19:24:56 10.35.0.13 cobalt-compute: amqpdriver DEBUG MSG_ID is 29c6579c5de24c00b0b0e55579b8e047
Jul 8 19:24:56 10.35.0.13 cobalt-compute: amqp DEBUG UNIQUE_ID is 8a7433d783ff45c29e3b78a2af86e979.
Jul 8 19:24:56 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"29c6579c5de24c00b0b0e55579b8e047\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-99e565b0-13a0-44d0-823a-8a4128c519b0\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"8a7433d783ff45c29e3b78a2af86e979\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:54.794950\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:24:56 10.35.0.13 cobalt-compute: messaging ERROR NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"29c6579c5de24c00b0b0e55579b8e047\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-99e565b0-13a0-44d0-823a-8a4128c519b0\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"8a7433d783ff45c29e3b78a2af86e979\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:54.794950\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:24:56 10.35.0.13 cobalt-compute: channel DEBUG Closed channel #1
Jul 8 19:24:56 10.35.0.13 cobalt-compute: channel DEBUG using channel_id: 1
Jul 8 19:24:56 10.35.0.14 nova-network 2014-07-08 19:24:56.115 6514 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"29c6579c5de24c00b0b0e55579b8e047\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-99e565b0-13a0-44d0-823a-8a4128c519b0\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"8a7433d783ff45c29e3b78a2af86e979\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:54.794950\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 26, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f5c1424f1d0>}
Jul 8 19:24:56 10.35.0.13 cobalt-compute: channel DEBUG Channel open
Jul 8 19:24:56 10.35.0.14 nova-network 2014-07-08 19:24:56.116 6514 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"29c6579c5de24c00b0b0e55579b8e047\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-99e565b0-13a0-44d0-823a-8a4128c519b0\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"8a7433d783ff45c29e3b78a2af86e979\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:54.794950\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:24:56 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:24:56 10.35.0.14 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:24:56 10.35.0.14 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:24:56 10.35.0.14 nova-network 2014-07-08 19:24:56.254 6514 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"70f77028f6114c25994361644719e643\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"29c6579c5de24c00b0b0e55579b8e047\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:24:56 10.35.0.14 nova-network 2014-07-08 19:24:56.254 6514 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"70f77028f6114c25994361644719e643\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"29c6579c5de24c00b0b0e55579b8e047\\"}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:24:56 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:24:56 10.35.0.14 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:24:56 10.35.0.14 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:24:56 10.35.0.14 nova-network 2014-07-08 19:24:56.255 6514 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"a9063363d14549f0a4f1f8308004a74b\\", \\"failure\\": null, \\"_msg_id\\": \\"29c6579c5de24c00b0b0e55579b8e047\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:24:56 10.35.0.14 nova-network 2014-07-08 19:24:56.256 6514 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"a9063363d14549f0a4f1f8308004a74b\\", \\"failure\\": null, \\"_msg_id\\": \\"29c6579c5de24c00b0b0e55579b8e047\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:24:56 10.35.0.14 ceph-watchdog.log: Testing for ceph timout...
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.429] 16228/MainThread savage.kick_ceph/DEBUG: Testing for ceph timout...
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.429] 16228/MainThread savage/INFO: Called lspools
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.459] 16228/MainThread savage/INFO: lspools returned ['data', 'metadata', 'rbd', 'images', 'slowrbd', 'crash-dumps', 'gc-vms', '.rgw', '.rgw.buckets', '.rgw.control', '.rgw.gc', '.log', '.intent-log', '.usage', '.users', '.users.email', '.users.swift', '.users.uid', '.rgw.root']
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.459] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.474] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.474] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.487] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.488] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.501] 16228/MainThread savage/INFO: rbd_list returned ['mysql']
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.501] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.518] 16228/MainThread savage/INFO: rbd_list returned ['8f23694a-2995-4070-a5e8-661de3207f2e']
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.518] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.531] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.532] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.546] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.546] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.560] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.560] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.574] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.574] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.587] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.587] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.604] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.604] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.618] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.618] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.633] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.633] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.647] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.647] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.661] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.661] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.675] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.676] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.690] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.690] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.705] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.705] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.720] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.720] 16228/MainThread savage/INFO: Called rbd_list
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.733] 16228/MainThread savage/INFO: rbd_list returned []
Jul 8 19:24:56 10.35.0.14 ceph-watchdog.log: Ceph is responding without timeout
Jul 8 19:24:56 10.35.0.14 [2014-07-08 19:24:56.733] 16228/MainThread savage.kick_ceph/DEBUG: Ceph is responding without timeout
Jul 8 19:24:56 10.35.0.13 rabbitmq.log: =INFO REPORT==== 8-Jul-2014::19:24:56 ===
Jul 8 19:24:56 10.35.0.13 rabbitmq.log: accepting AMQP connection <0.2058.0> (10.35.0.13:42590 -> 10.35.0.3:5672)
Jul 8 19:24:56 10.35.0.13 cobalt-compute: connection DEBUG Start from server, version: 0.9, properties: {u'information': u'Licensed under the MPL. See http://www.rabbitmq.com/', u'product': u'RabbitMQ', u'copyright': u'Copyright (C) 2007-2013 VMware, Inc.', u'capabilities': {u'exchange_exchange_bindings': True, u'consumer_cancel_notify': True, u'publisher_confirms': True, u'basic.nack': True}, u'platform': u'Erlang/OTP', u'version': u'3.1.3'}, mechanisms: [u'PLAIN', u'AMQPLAIN'], locales: [u'en_US']
Jul 8 19:24:56 10.35.0.13 cobalt-compute: connection DEBUG Open OK!
Jul 8 19:24:56 10.35.0.13 cobalt-compute: channel DEBUG using channel_id: 1
Jul 8 19:24:56 10.35.0.13 cobalt-compute: channel DEBUG Channel open
Jul 8 19:24:56 10.35.0.13 cobalt-compute.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:24:56 10.35.0.13 cobalt-compute.log: from py-amqp v1.5.0.
Jul 8 19:24:56 10.35.0.13 cobalt-compute.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:24:56 10.35.0.13 cobalt-compute: impl_rabbit INFO Connected to AMQP server on 10.35.0.3:5672
Jul 8 19:24:58 10.35.0.15 cinder-volume 2014-07-08 19:24:58.282 9067 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-cace079f-6a5f-4088-9ec4-b6fead43a9ae\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:58.279551\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1282, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"63b0a96da2584d73b03a3cbea4576222\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:24:58 10.35.0.15 cinder-volume 2014-07-08 19:24:58.282 9067 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-cace079f-6a5f-4088-9ec4-b6fead43a9ae\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:58.279551\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1282, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"63b0a96da2584d73b03a3cbea4576222\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:24:58 10.35.0.15 cinder-scheduler 2014-07-08 19:24:58.284 8863 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-cace079f-6a5f-4088-9ec4-b6fead43a9ae\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:58.279551\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1282, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"63b0a96da2584d73b03a3cbea4576222\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'3', 'redelivered': False, 'routing_key': u'', 'delivery_tag': 77, 'exchange': u'cinder-scheduler_fanout'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fa03d158690>}
Jul 8 19:24:58 10.35.0.15 cinder-volume 2014-07-08 19:24:58.285 9067 INFO cinder.volume.manager [-] Updating volume status
Jul 8 19:24:58 10.35.0.15 cinder-scheduler 2014-07-08 19:24:58.286 8863 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-cace079f-6a5f-4088-9ec4-b6fead43a9ae\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:24:58.279551\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1282, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"63b0a96da2584d73b03a3cbea4576222\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}'
Jul 8 19:24:58 10.35.0.15 cinder-scheduler 2014-07-08 19:24:58.287 8863 WARNING cinder.context [-] Arguments dropped when creating context: {'user': None, 'tenant': None, 'user_identity': u'- - - - -'}
Jul 8 19:24:58 10.35.0.15 cinder-scheduler 2014-07-08 19:24:58.289 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"c26e0e7dde414711a28534e2f7f31f4e\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:24:58 10.35.0.15 cinder-scheduler 2014-07-08 19:24:58.289 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"c26e0e7dde414711a28534e2f7f31f4e\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:24:58 10.35.0.15 cinder-scheduler 2014-07-08 19:24:58.290 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"09818b22692d4aacb8424e68b651e093\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:24:58 10.35.0.15 cinder-scheduler 2014-07-08 19:24:58.291 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"09818b22692d4aacb8424e68b651e093\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:25:00 10.35.0.14 [2014-07-08 19:25:00.733] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:25:02 10.35.0.14 nova-compute 2014-07-08 19:25:02.480 6347 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:25:02 10.35.0.14 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:25:02 10.35.0.14 nova-compute 2014-07-08 19:25:02.822 6347 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47729
Jul 8 19:25:02 10.35.0.14 nova-compute 2014-07-08 19:25:02.822 6347 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1862
Jul 8 19:25:02 10.35.0.14 nova-compute 2014-07-08 19:25:02.823 6347 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 4
Jul 8 19:25:02 10.35.0.14 nova-compute 2014-07-08 19:25:02.848 6347 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.14:node-10-35-0-14
Jul 8 19:25:02 10.35.0.14 nova-compute 2014-07-08 19:25:02.901 6347 WARNING nova.virt.libvirt.imagecache [-] Unknown base file: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:25:02 10.35.0.14 nova-compute 2014-07-08 19:25:02.901 6347 INFO nova.virt.libvirt.imagecache [-] Removable base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:25:02 10.35.0.14 nova-compute 2014-07-08 19:25:02.901 6347 INFO nova.virt.libvirt.imagecache [-] Base file too young to remove: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:25:03 10.35.0.15 nova-compute 2014-07-08 19:25:03.888 3328 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:25:03 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:25:04 10.35.0.15 nova-compute 2014-07-08 19:25:04.253 3328 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47217
Jul 8 19:25:04 10.35.0.15 nova-compute 2014-07-08 19:25:04.254 3328 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1861
Jul 8 19:25:04 10.35.0.15 nova-compute 2014-07-08 19:25:04.254 3328 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 3
Jul 8 19:25:04 10.35.0.15 nova-compute 2014-07-08 19:25:04.280 3328 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.15:node-10-35-0-15
Jul 8 19:25:04 10.35.0.13 nova-compute 2014-07-08 19:25:04.309 4705 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:25:04 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:25:04 10.35.0.15 nova-compute 2014-07-08 19:25:04.411 3328 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"7209e493451945e7aff8fd24eba95714\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-26b6b4bc-fb69-4377-bc1b-c63d5ca2edaf\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"e2d10532de21425592ef51b4129b656c\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:25:03.887100\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:25:04 10.35.0.15 nova-compute 2014-07-08 19:25:04.411 3328 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"7209e493451945e7aff8fd24eba95714\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-26b6b4bc-fb69-4377-bc1b-c63d5ca2edaf\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"e2d10532de21425592ef51b4129b656c\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:25:03.887100\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:25:04 10.35.0.15 nova-network 2014-07-08 19:25:04.413 3465 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"7209e493451945e7aff8fd24eba95714\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-26b6b4bc-fb69-4377-bc1b-c63d5ca2edaf\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"e2d10532de21425592ef51b4129b656c\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:25:03.887100\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 26, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7eff2d221290>}
Jul 8 19:25:04 10.35.0.15 nova-network 2014-07-08 19:25:04.414 3465 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"7209e493451945e7aff8fd24eba95714\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-26b6b4bc-fb69-4377-bc1b-c63d5ca2edaf\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"e2d10532de21425592ef51b4129b656c\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:25:03.887100\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}'
Jul 8 19:25:04 10.35.0.13 [2014-07-08 19:25:04.482] 11286/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.9']
Jul 8 19:25:04 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:25:04 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:25:04 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:25:04 10.35.0.15 nova-network 2014-07-08 19:25:04.547 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"633336a5827248bcb7f45745a51e7d8a\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"7209e493451945e7aff8fd24eba95714\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:25:04 10.35.0.15 nova-network 2014-07-08 19:25:04.548 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"633336a5827248bcb7f45745a51e7d8a\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"7209e493451945e7aff8fd24eba95714\\"}", "oslo.version": "2.0"}', routing_key=u'reply_af4fb5adaca445a3b6d555c358806a4e'
Jul 8 19:25:04 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:25:04 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:25:04 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:25:04 10.35.0.15 nova-network 2014-07-08 19:25:04.549 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"45a022900a7d4702b539bc51d9423d21\\", \\"failure\\": null, \\"_msg_id\\": \\"7209e493451945e7aff8fd24eba95714\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:25:04 10.35.0.15 nova-network 2014-07-08 19:25:04.549 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"45a022900a7d4702b539bc51d9423d21\\", \\"failure\\": null, \\"_msg_id\\": \\"7209e493451945e7aff8fd24eba95714\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_af4fb5adaca445a3b6d555c358806a4e'
Jul 8 19:25:04 10.35.0.15 nova-compute 2014-07-08 19:25:04.549 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"633336a5827248bcb7f45745a51e7d8a\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"7209e493451945e7aff8fd24eba95714\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_af4fb5adaca445a3b6d555c358806a4e', 'delivery_tag': 75, 'exchange': u'reply_af4fb5adaca445a3b6d555c358806a4e'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030b1ff10>}
Jul 8 19:25:04 10.35.0.15 nova-compute 2014-07-08 19:25:04.550 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"633336a5827248bcb7f45745a51e7d8a\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"7209e493451945e7aff8fd24eba95714\\"}", "oslo.version": "2.0"}'
Jul 8 19:25:04 10.35.0.15 nova-compute 2014-07-08 19:25:04.551 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"45a022900a7d4702b539bc51d9423d21\\", \\"failure\\": null, \\"_msg_id\\": \\"7209e493451945e7aff8fd24eba95714\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_af4fb5adaca445a3b6d555c358806a4e', 'delivery_tag': 76, 'exchange': u'reply_af4fb5adaca445a3b6d555c358806a4e'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030b1ff10>}
Jul 8 19:25:04 10.35.0.15 nova-compute 2014-07-08 19:25:04.551 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"45a022900a7d4702b539bc51d9423d21\\", \\"failure\\": null, \\"_msg_id\\": \\"7209e493451945e7aff8fd24eba95714\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
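The kombu/impl_rabbit "NBK" entries above dump each AMQP payload as a doubly-encoded envelope: the outer JSON object carries "oslo.version" plus an "oslo.message" field whose value is itself a JSON string holding the RPC call (method, args, context) or its reply (result, ending). A minimal sketch, standard library only and not part of the captured tooling, for unpacking one of these logged bodies:

    import json

    def unpack_oslo_envelope(body):
        # body is the string dumped after "NBK: body=" / "message.body=" in the log
        envelope = json.loads(body)                  # {"oslo.message": "...", "oslo.version": "2.0"}
        return json.loads(envelope["oslo.message"])  # the inner RPC payload

    # small hand-made sample in the same shape as the logged bodies
    sample = ('{"oslo.message": "{\\"method\\": \\"get_instance_nw_info\\", '
              '\\"args\\": {\\"host\\": \\"10.35.0.15\\"}}", "oslo.version": "2.0"}')
    msg = unpack_oslo_envelope(sample)
    print(msg["method"], msg["args"])                # get_instance_nw_info {'host': '10.35.0.15'}
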
Jul 8 19:25:04 10.35.0.13 nova-compute 2014-07-08 19:25:04.684 4705 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 46705
Jul 8 19:25:04 10.35.0.13 nova-compute 2014-07-08 19:25:04.684 4705 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1860
Jul 8 19:25:04 10.35.0.13 nova-compute 2014-07-08 19:25:04.684 4705 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 2
Jul 8 19:25:04 10.35.0.13 nova-compute 2014-07-08 19:25:04.710 4705 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.13:node-10-35-0-13
Jul 8 19:25:04 10.35.0.15 [2014-07-08 19:25:04.765] 8877/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.6']
Jul 8 19:25:04 10.35.0.13 nova-compute 2014-07-08 19:25:04.823 4705 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"c6e78a44901a4b4c933abaa467b94e81\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-82162252-40d0-48d7-80ff-785ea7823aa2\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"b95cef3ca5464bcb8d3762372a836591\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:25:04.308188\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:25:04 10.35.0.13 nova-compute 2014-07-08 19:25:04.823 4705 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"c6e78a44901a4b4c933abaa467b94e81\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-82162252-40d0-48d7-80ff-785ea7823aa2\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"b95cef3ca5464bcb8d3762372a836591\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:25:04.308188\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:25:04 10.35.0.13 nova-network 2014-07-08 19:25:04.824 4876 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"c6e78a44901a4b4c933abaa467b94e81\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-82162252-40d0-48d7-80ff-785ea7823aa2\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"b95cef3ca5464bcb8d3762372a836591\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:25:04.308188\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 26, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fdeb6472390>}
Jul 8 19:25:04 10.35.0.13 nova-network 2014-07-08 19:25:04.826 4876 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"c6e78a44901a4b4c933abaa467b94e81\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-82162252-40d0-48d7-80ff-785ea7823aa2\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"b95cef3ca5464bcb8d3762372a836591\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:25:04.308188\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}'
Jul 8 19:25:04 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:25:04 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:25:04 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:25:04 10.35.0.13 nova-network 2014-07-08 19:25:04.965 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"4642752749b44c11a8395b8c45378511\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"c6e78a44901a4b4c933abaa467b94e81\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:25:04 10.35.0.13 nova-network 2014-07-08 19:25:04.966 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"4642752749b44c11a8395b8c45378511\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"c6e78a44901a4b4c933abaa467b94e81\\"}", "oslo.version": "2.0"}', routing_key=u'reply_5a152b20506e4303bfaf3007cbe974c3'
Jul 8 19:25:04 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:25:04 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:25:04 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:25:04 10.35.0.13 nova-network 2014-07-08 19:25:04.967 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"4cb0cfe66bca450daedd3b36c633d071\\", \\"failure\\": null, \\"_msg_id\\": \\"c6e78a44901a4b4c933abaa467b94e81\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:25:04 10.35.0.13 nova-network 2014-07-08 19:25:04.967 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"4cb0cfe66bca450daedd3b36c633d071\\", \\"failure\\": null, \\"_msg_id\\": \\"c6e78a44901a4b4c933abaa467b94e81\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_5a152b20506e4303bfaf3007cbe974c3'
Jul 8 19:25:04 10.35.0.13 nova-compute 2014-07-08 19:25:04.967 4705 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"4642752749b44c11a8395b8c45378511\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"c6e78a44901a4b4c933abaa467b94e81\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_5a152b20506e4303bfaf3007cbe974c3', 'delivery_tag': 75, 'exchange': u'reply_5a152b20506e4303bfaf3007cbe974c3'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fde2d448390>}
Jul 8 19:25:04 10.35.0.13 nova-compute 2014-07-08 19:25:04.968 4705 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"4642752749b44c11a8395b8c45378511\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"c6e78a44901a4b4c933abaa467b94e81\\"}", "oslo.version": "2.0"}'
Jul 8 19:25:04 10.35.0.13 nova-compute 2014-07-08 19:25:04.968 4705 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"4cb0cfe66bca450daedd3b36c633d071\\", \\"failure\\": null, \\"_msg_id\\": \\"c6e78a44901a4b4c933abaa467b94e81\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_5a152b20506e4303bfaf3007cbe974c3', 'delivery_tag': 76, 'exchange': u'reply_5a152b20506e4303bfaf3007cbe974c3'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fde2d448390>}
Jul 8 19:25:04 10.35.0.13 nova-compute 2014-07-08 19:25:04.969 4705 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"4cb0cfe66bca450daedd3b36c633d071\\", \\"failure\\": null, \\"_msg_id\\": \\"c6e78a44901a4b4c933abaa467b94e81\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:25:05 10.35.0.15 [2014-07-08 19:25:05.624] 8900/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.4']
Jul 8 19:25:05 10.35.0.13 [2014-07-08 19:25:05.685] 11310/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.2']
Jul 8 19:25:06 10.35.0.13 [2014-07-08 19:25:06.021] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:25:06 10.35.0.15 nova-api 2014-07-08 19:25:06.032 3441 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:25:06 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:25:06 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:25:06 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:25:06 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:25:06 10.35.0.15 nova-api 2014-07-08 19:25:06.169 3441 INFO nova.osapi_compute.wsgi.server [req-71d8b3d9-4ba6-4453-8e16-b6ad9b00b5cf 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1403289
Jul 8 19:25:06 10.35.0.13 [2014-07-08 19:25:06.170] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:25:06 10.35.0.13 [2014-07-08 19:25:06.170] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:25:06 10.35.0.13 nova-api 2014-07-08 19:25:06.177 4844 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:25:06 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:25:06 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:25:06 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:25:06 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" 200 1690 "" "python-novaclient"
Jul 8 19:25:06 10.35.0.13 nova-api 2014-07-08 19:25:06.290 4844 INFO nova.osapi_compute.wsgi.server [req-2fb8a52e-5f69-4659-bb9b-e830b3509dd0 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" status: 200 len: 1864 time: 0.1157980
Jul 8 19:25:06 10.35.0.13 [2014-07-08 19:25:06.292] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:25:06 10.35.0.15 [2014-07-08 19:25:06.347] 9060/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.5']
Jul 8 19:25:06 10.35.0.13 [2014-07-08 19:25:06.359] 11330/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'host', '-c', '1', '-U', '10.35.0.3']
Jul 8 19:25:06 10.35.0.13 [2014-07-08 19:25:06.957] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3080
Jul 8 19:25:06 10.35.0.15 [2014-07-08 19:25:06.960] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:25:10 10.35.0.15 [2014-07-08 19:25:10.068] 9162/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'host', '-c', '1', '-U', '10.35.0.4']
Jul 8 19:25:11 10.35.0.13 [2014-07-08 19:25:11.670] 10213/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.7']
Jul 8 19:25:12 10.35.0.15 nova-compute 2014-07-08 19:25:12.352 3328 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:25:12 10.35.0.15 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._refresh_host
Jul 8 19:25:12 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:25:12 10.35.0.15 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._clean
Jul 8 19:25:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Starting to clean symlinks in /dev/nova
Jul 8 19:25:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/3e65c658-731c-4a56-8c17-1cfac00bf754_disk
Jul 8 19:25:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/libvirt_save
Jul 8 19:25:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/novadisk
Jul 8 19:25:12 10.35.0.15 cobalt-compute: loopingcall DEBUG Dynamic looping call sleeping for 60.00 seconds
Jul 8 19:25:12 10.35.0.15 nova-compute 2014-07-08 19:25:12.721 3328 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47217
Jul 8 19:25:12 10.35.0.15 nova-compute 2014-07-08 19:25:12.722 3328 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1861
Jul 8 19:25:12 10.35.0.15 nova-compute 2014-07-08 19:25:12.722 3328 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 3
Jul 8 19:25:12 10.35.0.15 nova-compute 2014-07-08 19:25:12.746 3328 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.15:node-10-35-0-15
Jul 8 19:25:12 10.35.0.15 nova-compute 2014-07-08 19:25:12.804 3328 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): checking
Jul 8 19:25:12 10.35.0.15 nova-compute 2014-07-08 19:25:12.804 3328 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): in use: on this node 1 local, 0 on other nodes sharing this instance storage
Jul 8 19:25:12 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf chown 43 /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:25:12 10.35.0.13 [2014-07-08 19:25:12.934] 10232/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.8']
Jul 8 19:25:12 10.35.0.15 nova-compute 2014-07-08 19:25:12.991 3328 INFO nova.virt.libvirt.imagecache [-] Active base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:25:13 10.35.0.13 [2014-07-08 19:25:13.009] 10245/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.3']
Jul 8 19:25:13 10.35.0.13 nova-compute 2014-07-08 19:25:13.452 4705 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:25:13 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:25:13 10.35.0.13 nova-compute 2014-07-08 19:25:13.819 4705 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 46705
Jul 8 19:25:13 10.35.0.13 nova-compute 2014-07-08 19:25:13.820 4705 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1860
Jul 8 19:25:13 10.35.0.13 nova-compute 2014-07-08 19:25:13.820 4705 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 2
Jul 8 19:25:13 10.35.0.13 nova-compute 2014-07-08 19:25:13.849 4705 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.13:node-10-35-0-13
Jul 8 19:25:13 10.35.0.13 nova-compute 2014-07-08 19:25:13.908 4705 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): checking
Jul 8 19:25:13 10.35.0.13 nova-compute 2014-07-08 19:25:13.908 4705 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): in use: on this node 2 local, 0 on other nodes sharing this instance storage
Jul 8 19:25:13 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf chown 43 /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:25:14 10.35.0.13 nova-compute 2014-07-08 19:25:14.094 4705 INFO nova.virt.libvirt.imagecache [-] Active base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
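Both compute nodes repeat the same resource_tracker audit about once a minute (Free ram (MB), Free disk (GB), Free VCPUS, then the "Compute_service record updated" line). A small sketch, assuming nothing beyond the line format shown above, that collects the most recent numbers per host from a log like this one; the filename messages.log is a placeholder:

    import re
    from collections import defaultdict

    # e.g. "... 10.35.0.13 nova-compute ... AUDIT nova.compute.resource_tracker [-] Free ram (MB): 46705"
    AUDIT = re.compile(
        r'(?P<host>\d+\.\d+\.\d+\.\d+) nova-compute .*'
        r'Free (?P<what>ram \(MB\)|disk \(GB\)|VCPUS): (?P<value>\d+)'
    )

    def latest_free_resources(lines):
        latest = defaultdict(dict)
        for line in lines:
            m = AUDIT.search(line)
            if m:
                latest[m.group("host")][m.group("what")] = int(m.group("value"))
        return dict(latest)

    with open("messages.log") as fh:   # placeholder path
        print(latest_free_resources(fh))
    # e.g. {'10.35.0.13': {'ram (MB)': 46705, 'disk (GB)': 1860, 'VCPUS': 2}, ...}
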
Jul 8 19:25:16 10.35.0.14 [2014-07-08 19:25:16.297] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:25:16 10.35.0.13 [2014-07-08 19:25:16.300] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:25:16 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:25:16 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:25:16 10.35.0.13 nova-api 2014-07-08 19:25:16.410 4842 INFO nova.osapi_compute.wsgi.server [req-20418213-503d-4203-883f-ad4b6d6f0e3b 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1049702
Jul 8 19:25:16 10.35.0.13 [2014-07-08 19:25:16.412] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:25:16 10.35.0.13 [2014-07-08 19:25:16.413] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:25:16 10.35.0.15 nova-api 2014-07-08 19:25:16.422 3442 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:25:16 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:25:16 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:25:16 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:25:16 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" 200 1690 "" "python-novaclient"
Jul 8 19:25:16 10.35.0.15 nova-api 2014-07-08 19:25:16.546 3442 INFO nova.osapi_compute.wsgi.server [req-edefe9bc-906f-4c6b-aa24-ba3277ba9fd6 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" status: 200 len: 1864 time: 0.1275840
Jul 8 19:25:16 10.35.0.13 [2014-07-08 19:25:16.546] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:25:18 10.35.0.15 cinder-volume 2014-07-08 19:25:18.141 9068 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-16f0403c-9d1d-40f3-a827-d48ccd9e29f6\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:25:18.138576\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1282, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"4bc8385f0f6f41a2b398063dafc109d9\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:25:18 10.35.0.15 cinder-volume 2014-07-08 19:25:18.141 9068 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-16f0403c-9d1d-40f3-a827-d48ccd9e29f6\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:25:18.138576\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1282, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"4bc8385f0f6f41a2b398063dafc109d9\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:25:18 10.35.0.15 cinder-scheduler 2014-07-08 19:25:18.143 8863 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-16f0403c-9d1d-40f3-a827-d48ccd9e29f6\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:25:18.138576\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1282, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"4bc8385f0f6f41a2b398063dafc109d9\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'3', 'redelivered': False, 'routing_key': u'', 'delivery_tag': 78, 'exchange': u'cinder-scheduler_fanout'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fa03d158690>}
Jul 8 19:25:18 10.35.0.15 cinder-volume 2014-07-08 19:25:18.144 9068 INFO cinder.volume.manager [-] Updating volume status
Jul 8 19:25:18 10.35.0.15 cinder-scheduler 2014-07-08 19:25:18.145 8863 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-16f0403c-9d1d-40f3-a827-d48ccd9e29f6\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:25:18.138576\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1282, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"4bc8385f0f6f41a2b398063dafc109d9\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}'
Jul 8 19:25:18 10.35.0.15 cinder-scheduler 2014-07-08 19:25:18.146 8863 WARNING cinder.context [-] Arguments dropped when creating context: {'user': None, 'tenant': None, 'user_identity': u'- - - - -'}
Jul 8 19:25:18 10.35.0.15 cinder-scheduler 2014-07-08 19:25:18.148 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"7b9d0dd0b08a442992d3f63266b27f33\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:25:18 10.35.0.15 cinder-scheduler 2014-07-08 19:25:18.148 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"7b9d0dd0b08a442992d3f63266b27f33\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:25:18 10.35.0.15 cinder-scheduler 2014-07-08 19:25:18.149 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"155d3f144d8f439da0a41b4be0653aeb\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:25:18 10.35.0.15 cinder-scheduler 2014-07-08 19:25:18.149 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"155d3f144d8f439da0a41b4be0653aeb\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:25:19 10.35.0.15 sshd[22114]: error: Could not load host key: /etc/ssh/ssh_host_ed25519_key
Jul 8 19:25:19 10.35.0.15 sshd[22114]: Accepted publickey for root from 10.35.0.2 port 54337 ssh2: RSA 4a:52:62:54:e9:e7:3b:24:a9:74:2e:77:c4:67:3e:5e
Jul 8 19:25:19 10.35.0.15 sshd[22114]: Received disconnect from 10.35.0.2: 11: disconnected by user
Jul 8 19:25:19 10.35.0.13 sshd[21991]: error: Could not load host key: /etc/ssh/ssh_host_ed25519_key
Jul 8 19:25:19 10.35.0.13 sshd[21991]: Accepted publickey for root from 10.35.0.2 port 54926 ssh2: RSA 4a:52:62:54:e9:e7:3b:24:a9:74:2e:77:c4:67:3e:5e
Jul 8 19:25:19 10.35.0.13 sshd[21991]: Received disconnect from 10.35.0.2: 11: disconnected by user
Jul 8 19:25:19 10.35.0.14 sshd[16603]: error: Could not load host key: /etc/ssh/ssh_host_ed25519_key
Jul 8 19:25:20 10.35.0.14 sshd[16603]: Accepted publickey for root from 10.35.0.2 port 50145 ssh2: RSA 4a:52:62:54:e9:e7:3b:24:a9:74:2e:77:c4:67:3e:5e
Jul 8 19:25:20 10.35.0.14 sshd[16603]: Received disconnect from 10.35.0.2: 11: disconnected by user
Jul 8 19:25:20 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py:435: Warning: Incorrect datetime value: '2014-07-08 19:15:20+00:00' for column 'updated_at' at row 1
Jul 8 19:25:20 10.35.0.14 nova-network.log: cursor.execute(statement, parameters)
Jul 8 19:25:22 10.35.0.13 [2014-07-08 19:25:22.521] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3080
Jul 8 19:25:22 10.35.0.15 [2014-07-08 19:25:22.524] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:25:23 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.130 from 00:25:90:2f:2d:56 via management
Jul 8 19:25:23 10.35.0.2 dhcpd: DHCPACK on 10.35.1.130 to 00:25:90:2f:2d:56 via management
Jul 8 19:25:24 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py:435: Warning: Incorrect datetime value: '2014-07-08 19:15:24+00:00' for column 'updated_at' at row 1
Jul 8 19:25:24 10.35.0.15 nova-network.log: cursor.execute(statement, parameters)
Jul 8 19:25:26 10.35.0.13 [2014-07-08 19:25:26.556] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:25:26 10.35.0.14 nova-api 2014-07-08 19:25:26.563 6474 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:25:26 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:25:26 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:25:26 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:25:26 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:25:26 10.35.0.14 nova-api 2014-07-08 19:25:26.702 6474 INFO nova.osapi_compute.wsgi.server [req-bcb4c067-4366-4358-94a0-5f613690f937 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1417708
Jul 8 19:25:26 10.35.0.13 [2014-07-08 19:25:26.705] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:25:26 10.35.0.13 [2014-07-08 19:25:26.705] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:25:26 10.35.0.13 nova-api 2014-07-08 19:25:26.712 4840 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:25:26 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:25:26 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:25:26 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:25:26 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" 200 1690 "" "python-novaclient"
Jul 8 19:25:26 10.35.0.13 nova-api 2014-07-08 19:25:26.835 4840 INFO nova.osapi_compute.wsgi.server [req-05a37618-79fa-4d4e-a712-7164dbf32ba4 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" status: 200 len: 1864 time: 0.1262190
Jul 8 19:25:26 10.35.0.13 [2014-07-08 19:25:26.836] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:25:27 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.131 from 00:25:90:2f:2d:07 via management
Jul 8 19:25:27 10.35.0.2 dhcpd: DHCPACK on 10.35.1.131 to 00:25:90:2f:2d:07 via management
Jul 8 19:25:28 10.35.0.15 sshd[22174]: error: Could not load host key: /etc/ssh/ssh_host_ed25519_key
Jul 8 19:25:29 10.35.0.15 sshd[22174]: Accepted publickey for root from 10.35.0.2 port 54342 ssh2: RSA 4a:52:62:54:e9:e7:3b:24:a9:74:2e:77:c4:67:3e:5e
Jul 8 19:25:29 10.35.0.15 sshd[22174]: Received disconnect from 10.35.0.2: 11: disconnected by user
Jul 8 19:25:29 10.35.0.13 sshd[22041]: error: Could not load host key: /etc/ssh/ssh_host_ed25519_key
Jul 8 19:25:29 10.35.0.13 sshd[22041]: Accepted publickey for root from 10.35.0.2 port 54931 ssh2: RSA 4a:52:62:54:e9:e7:3b:24:a9:74:2e:77:c4:67:3e:5e
Jul 8 19:25:29 10.35.0.13 sshd[22041]: Received disconnect from 10.35.0.2: 11: disconnected by user
Jul 8 19:25:29 10.35.0.14 sshd[16626]: error: Could not load host key: /etc/ssh/ssh_host_ed25519_key
Jul 8 19:25:29 10.35.0.14 sshd[16626]: Accepted publickey for root from 10.35.0.2 port 50150 ssh2: RSA 4a:52:62:54:e9:e7:3b:24:a9:74:2e:77:c4:67:3e:5e
Jul 8 19:25:29 10.35.0.14 sshd[16626]: Received disconnect from 10.35.0.2: 11: disconnected by user
Jul 8 19:25:31 10.35.0.14 [2014-07-08 19:25:31.862] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:25:33 10.35.0.14 sshd[16220]: Accepted password for root from 10.35.0.2 port 50130 ssh2
Jul 8 19:25:36 10.35.0.13 [2014-07-08 19:25:36.837] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:25:36 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:25:36 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:25:36 10.35.0.15 nova-api 2014-07-08 19:25:36.949 3442 INFO nova.osapi_compute.wsgi.server [req-ea4b56bc-2f27-41b1-a757-6e804882e48c 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1037219
Jul 8 19:25:36 10.35.0.13 [2014-07-08 19:25:36.949] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:25:36 10.35.0.13 [2014-07-08 19:25:36.949] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:25:37 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:25:36 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" 200 1690 "" "python-novaclient"
Jul 8 19:25:37 10.35.0.14 nova-api 2014-07-08 19:25:37.056 6474 INFO nova.osapi_compute.wsgi.server [req-246ac43e-0e21-42f2-b659-046b674bf686 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" status: 200 len: 1864 time: 0.1032529
Jul 8 19:25:37 10.35.0.13 [2014-07-08 19:25:37.059] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
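The savage.node-evacuation loop above simply re-reads each server record through the Nova API (the GET /v2/<tenant>/servers/<uuid> calls routed through pound) and keeps logging "Waiting for VMs to migrate." until the instances have left 10.35.0.13. A rough sketch of the same poll using requests; the endpoint is the one pound logs, while the token and the exit condition are illustrative assumptions, not taken from the test harness:

    import time
    import requests

    NOVA = "http://10.35.2.8:8774/v2/9a663247f58a4412a840385af8d9e73a"  # endpoint seen in the pound lines
    TOKEN = "<keystone token>"   # placeholder; normally obtained first via POST /v2.0/tokens

    def wait_for_migration(server_id, source_host="10.35.0.13", interval=10):
        # Re-read the server until it is ACTIVE somewhere other than the evacuated node.
        while True:
            r = requests.get("%s/servers/%s" % (NOVA, server_id),
                             headers={"X-Auth-Token": TOKEN})
            r.raise_for_status()
            server = r.json()["server"]
            host = server.get("OS-EXT-SRV-ATTR:host")    # admin-only extended attribute
            print(server["status"], host)
            if server["status"] == "ACTIVE" and host not in (None, source_host):
                return server
            time.sleep(interval)

    wait_for_migration("5ee78913-e655-4cb1-a92c-500639c819ad")
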
Jul 8 19:25:37 10.35.0.13 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._refresh_host
Jul 8 19:25:37 10.35.0.13 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._clean
Jul 8 19:25:37 10.35.0.13 cobalt-compute: vmsconn DEBUG Starting to clean symlinks in /dev/nova
Jul 8 19:25:37 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/ce77d052-f021-44a2-b5f3-5908ec04fcc7_disk
Jul 8 19:25:37 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/5ee78913-e655-4cb1-a92c-500639c819ad_disk
Jul 8 19:25:37 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/libvirt_save
Jul 8 19:25:37 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/novadisk
Jul 8 19:25:37 10.35.0.13 cobalt-compute: loopingcall DEBUG Dynamic looping call sleeping for 60.00 seconds
Jul 8 19:25:38 10.35.0.15 [2014-07-08 19:25:38.094] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3033
Jul 8 19:25:38 10.35.0.13 [2014-07-08 19:25:38.092] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3080
Jul 8 19:25:40 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.129 from 00:25:90:8b:3f:a7 via management
Jul 8 19:25:40 10.35.0.2 dhcpd: DHCPACK on 10.35.1.129 to 00:25:90:8b:3f:a7 via management
Jul 8 19:25:42 10.35.0.14 nova-compute 2014-07-08 19:25:42.927 6347 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:25:42 10.35.0.14 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:25:43 10.35.0.14 nova-compute 2014-07-08 19:25:43.271 6347 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47729
Jul 8 19:25:43 10.35.0.14 nova-compute 2014-07-08 19:25:43.272 6347 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1862
Jul 8 19:25:43 10.35.0.14 nova-compute 2014-07-08 19:25:43.272 6347 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 4
Jul 8 19:25:43 10.35.0.14 nova-compute 2014-07-08 19:25:43.297 6347 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.14:node-10-35-0-14
Jul 8 19:25:43 10.35.0.14 nova-compute 2014-07-08 19:25:43.350 6347 WARNING nova.virt.libvirt.imagecache [-] Unknown base file: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:25:43 10.35.0.14 nova-compute 2014-07-08 19:25:43.350 6347 INFO nova.virt.libvirt.imagecache [-] Removable base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:25:43 10.35.0.14 nova-compute 2014-07-08 19:25:43.350 6347 INFO nova.virt.libvirt.imagecache [-] Base file too young to remove: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:25:47 10.35.0.13 [2014-07-08 19:25:47.065] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:25:47 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:25:47 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:25:47 10.35.0.14 nova-api 2014-07-08 19:25:47.171 6474 INFO nova.osapi_compute.wsgi.server [req-4ba2ec42-cb9e-41ed-a8fb-a52aca240390 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1045630
Jul 8 19:25:47 10.35.0.13 [2014-07-08 19:25:47.174] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:25:47 10.35.0.13 [2014-07-08 19:25:47.174] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:25:47 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:25:47 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" 200 1690 "" "python-novaclient"
Jul 8 19:25:47 10.35.0.13 nova-api 2014-07-08 19:25:47.277 4840 INFO nova.osapi_compute.wsgi.server [req-06b06e3f-5c77-4bcd-9809-f340f70864b3 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" status: 200 len: 1864 time: 0.1002240
Jul 8 19:25:47 10.35.0.13 [2014-07-08 19:25:47.279] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:25:47 10.35.0.14 [2014-07-08 19:25:47.427] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:25:49 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:25:49 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:25:49 10.35.0.15 nova-api 2014-07-08 19:25:49.335 3442 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:25:49 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:25:49 +0000] "GET /v2.0/tokens/d95ed906463147b482c6ad30a2069faa HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:25:49 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:25:49 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:25:49 10.35.0.15 nova-api 2014-07-08 19:25:49.461 3442 INFO nova.osapi_compute.wsgi.server [req-420b1e49-a922-48a2-922a-34395842ad84 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1283062
Jul 8 19:25:49 10.35.0.14 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._refresh_host
Jul 8 19:25:49 10.35.0.13 -- MARK --
Jul 8 19:25:49 10.35.0.14 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._clean
Jul 8 19:25:49 10.35.0.14 cobalt-compute: vmsconn DEBUG Starting to clean symlinks in /dev/nova
Jul 8 19:25:49 10.35.0.14 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/libvirt_save
Jul 8 19:25:49 10.35.0.14 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/novadisk
Jul 8 19:25:49 10.35.0.14 cobalt-compute: loopingcall DEBUG Dynamic looping call sleeping for 60.00 seconds
Jul 8 19:25:52 10.35.0.15 nova-compute 2014-07-08 19:25:52.994 3328 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:25:53 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:25:53 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.130 from 00:25:90:2f:2d:56 via management
Jul 8 19:25:53 10.35.0.2 dhcpd: DHCPACK on 10.35.1.130 to 00:25:90:2f:2d:56 via management
Jul 8 19:25:53 10.35.0.15 nova-compute 2014-07-08 19:25:53.365 3328 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47217
Jul 8 19:25:53 10.35.0.15 nova-compute 2014-07-08 19:25:53.365 3328 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1861
Jul 8 19:25:53 10.35.0.15 nova-compute 2014-07-08 19:25:53.365 3328 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 3
Jul 8 19:25:53 10.35.0.15 nova-compute 2014-07-08 19:25:53.390 3328 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.15:node-10-35-0-15
Jul 8 19:25:53 10.35.0.15 nova-compute 2014-07-08 19:25:53.449 3328 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): checking
Jul 8 19:25:53 10.35.0.15 nova-compute 2014-07-08 19:25:53.449 3328 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): in use: on this node 1 local, 0 on other nodes sharing this instance storage
Jul 8 19:25:53 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf chown 43 /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:25:53 10.35.0.15 nova-compute 2014-07-08 19:25:53.635 3328 INFO nova.virt.libvirt.imagecache [-] Active base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:25:53 10.35.0.15 [2014-07-08 19:25:53.663] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:25:53 10.35.0.13 [2014-07-08 19:25:53.661] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3080
Jul 8 19:25:54 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py:435: Warning: Incorrect datetime value: '2014-07-08 19:15:53+00:00' for column 'updated_at' at row 1
Jul 8 19:25:54 10.35.0.13 nova-network.log: cursor.execute(statement, parameters)
Jul 8 19:25:54 10.35.0.13 nova-compute 2014-07-08 19:25:54.096 4705 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:25:54 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:25:54 10.35.0.13 nova-compute 2014-07-08 19:25:54.457 4705 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 46705
Jul 8 19:25:54 10.35.0.13 nova-compute 2014-07-08 19:25:54.457 4705 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1860
Jul 8 19:25:54 10.35.0.13 nova-compute 2014-07-08 19:25:54.457 4705 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 2
Jul 8 19:25:54 10.35.0.13 nova-compute 2014-07-08 19:25:54.485 4705 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.13:node-10-35-0-13
Jul 8 19:25:54 10.35.0.13 nova-compute 2014-07-08 19:25:54.549 4705 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): checking
Jul 8 19:25:54 10.35.0.13 nova-compute 2014-07-08 19:25:54.549 4705 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): in use: on this node 2 local, 0 on other nodes sharing this instance storage
Jul 8 19:25:54 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf chown 43 /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:25:54 10.35.0.13 nova-compute 2014-07-08 19:25:54.734 4705 INFO nova.virt.libvirt.imagecache [-] Active base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:25:56 10.35.0.13 cobalt-compute: manager DEBUG Unlocked instance ce77d052-f021-44a2-b5f3-5908ec04fcc7 (fn: migrate_instance)
Jul 8 19:25:56 10.35.0.13 cobalt-compute: dispatcher ERROR Exception during message handling: Timed out waiting for a reply to message ID 29c6579c5de24c00b0b0e55579b8e047
Traceback (most recent call last):
File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 133, in _dispatch_and_reply
incoming.message))
File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 176, in _dispatch
return self._do_dispatch(endpoint, method, ctxt, args)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 122, in _do_dispatch
result = getattr(endpoint, method)(ctxt, **new_args)
File "/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py", line 132, in wrapped_fn
return fn(self, context, **kwargs)
File "/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py", line 756, in migrate_instance
network_info = self.network_api.get_instance_nw_info(context, instance)
File "/usr/lib64/python2.7/site-packages/nova/network/api.py", line 95, in wrapped
return func(self, context, *args, **kwargs)
File "/usr/lib64/python2.7/site-packages/nova/network/api.py", line 390, in get_instance_nw_info
result = self._get_instance_nw_info(context, instance)
File "/usr/lib64/python2.7/site-packages/nova/network/api.py", line 406, in _get_instance_nw_info
nw_info = self.network_rpcapi.get_instance_nw_info(context, **args)
File "/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py", line 222, in get_instance_nw_info
host=host, project_id=project_id)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py", line 150, in call
wait_for_reply=True, timeout=timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py", line 90, in _send
timeout=timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 412, in send
return self._send(target, ctxt, message, wait_for_reply, timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 403, in _send
result = self._waiter.wait(msg_id, timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 280, in wait
reply, ending, trylock = self._poll_queue(msg_id, timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 220, in _poll_queue
message = self.waiters.get(msg_id, timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 126, in get
'to message ID %s' % msg_id)
MessagingTimeout: Timed out waiting for a reply to message ID 29c6579c5de24c00b0b0e55579b8e047
Jul 8 19:25:56 10.35.0.13 cobalt-compute: common ERROR Returning exception Timed out waiting for a reply to message ID 29c6579c5de24c00b0b0e55579b8e047 to caller
Jul 8 19:25:56 10.35.0.13 cobalt-compute: common ERROR ['Traceback (most recent call last):\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 133, in _dispatch_and_reply\n incoming.message))\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 176, in _dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 122, in _do_dispatch\n result = getattr(endpoint, method)(ctxt, **new_args)\n', ' File "/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py", line 132, in wrapped_fn\n return fn(self, context, **kwargs)\n', ' File "/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py", line 756, in migrate_instance\n network_info = self.network_api.get_instance_nw_info(context, instance)\n', ' File "/usr/lib64/python2.7/site-packages/nova/network/api.py", line 95, in wrapped\n return func(self, context, *args, **kwargs)\n', ' File "/usr/lib64/python2.7/site-packages/nova/network/api.py", line 390, in get_instance_nw_info\n result = self._get_instance_nw_info(context, instance)\n', ' File "/usr/lib64/python2.7/site-packages/nova/network/api.py", line 406, in _get_instance_nw_info\n nw_info = self.network_rpcapi.get_instance_nw_info(context, **args)\n', ' File "/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py", line 222, in get_instance_nw_info\n host=host, project_id=project_id)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py", line 150, in call\n wait_for_reply=True, timeout=timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py", line 90, in _send\n timeout=timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 412, in send\n return self._send(target, ctxt, message, wait_for_reply, timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 403, in _send\n result = self._waiter.wait(msg_id, timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 280, in wait\n reply, ending, trylock = self._poll_queue(msg_id, timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 220, in _poll_queue\n message = self.waiters.get(msg_id, timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 126, in get\n \'to message ID %s\' % msg_id)\n', 'MessagingTimeout: Timed out waiting for a reply to message ID 29c6579c5de24c00b0b0e55579b8e047\n']
Jul 8 19:25:56 10.35.0.13 cobalt-compute: amqp DEBUG UNIQUE_ID is e64dbaed42d347f5b2275638a62684a8.
Jul 8 19:25:56 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"e64dbaed42d347f5b2275638a62684a8\\", \\"failure\\": \\"{\\\\\\"args\\\\\\": [\\\\\\"Timed out waiting for a reply to message ID 29c6579c5de24c00b0b0e55579b8e047\\\\\\"], \\\\\\"module\\\\\\": \\\\\\"oslo.messaging.exceptions\\\\\\", \\\\\\"kwargs\\\\\\": {}, \\\\\\"message\\\\\\": \\\\\\"Timed out waiting for a reply to message ID 29c6579c5de24c00b0b0e55579b8e047\\\\\\", \\\\\\"tb\\\\\\": [\\\\\\"Traceback (most recent call last):\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 133, in _dispatch_and_reply\\\\\\\\n incoming.message))\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 176, in _dispatch\\\\\\\\n return self._do_dispatch(endpoint, method, ctxt, args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 122, in _do_dispatch\\\\\\\\n result = getattr(endpoint, method)(ctxt, **new_args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py\\\\\\\\\\\\\\", line 132, in wrapped_fn\\\\\\\\n return fn(self, context, **kwargs)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py\\\\\\\\\\\\\\", line 756, in migrate_instance\\\\\\\\n network_info = self.network_api.get_instance_nw_info(context, instance)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/api.py\\\\\\\\\\\\\\", line 95, in wrapped\\\\\\\\n return func(self, context, *args, **kwargs)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/api.py\\\\\\\\\\\\\\", line 390, in get_instance_nw_info\\\\\\\\n result = self._get_instance_nw_info(context, instance)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/api.py\\\\\\\\\\\\\\", line 406, in _get_instance_nw_info\\\\\\\\n nw_info = self.network_rpcapi.get_instance_nw_info(context, **args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py\\\\\\\\\\\\\\", line 222, in get_instance_nw_info\\\\\\\\n host=host, project_id=project_id)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py\\\\\\\\\\\\\\", line 150, in call\\\\\\\\n wait_for_reply=True, timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py\\\\\\\\\\\\\\", line 90, in _send\\\\\\\\n timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 412, in send\\\\\\\\n return self._send(target, ctxt, message, wait_for_reply, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 403, in _send\\\\\\\\n result = self._waiter.wait(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 280, in wait\\\\\\\\n reply, ending, trylock = self._poll_queue(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File 
\\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 220, in _poll_queue\\\\\\\\n message = self.waiters.get(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 126, in get\\\\\\\\n \'to message ID %s\' % msg_id)\\\\\\\\n\\\\\\", \\\\\\"MessagingTimeout: Timed out waiting for a reply to message ID 29c6579c5de24c00b0b0e55579b8e047\\\\\\\\n\\\\\\"], \\\\\\"class\\\\\\": \\\\\\"MessagingTimeout\\\\\\"}\\", \\"result\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:25:56 10.35.0.13 cobalt-compute: messaging ERROR NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"e64dbaed42d347f5b2275638a62684a8\\", \\"failure\\": \\"{\\\\\\"args\\\\\\": [\\\\\\"Timed out waiting for a reply to message ID 29c6579c5de24c00b0b0e55579b8e047\\\\\\"], \\\\\\"module\\\\\\": \\\\\\"oslo.messaging.exceptions\\\\\\", \\\\\\"kwargs\\\\\\": {}, \\\\\\"message\\\\\\": \\\\\\"Timed out waiting for a reply to message ID 29c6579c5de24c00b0b0e55579b8e047\\\\\\", \\\\\\"tb\\\\\\": [\\\\\\"Traceback (most recent call last):\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 133, in _dispatch_and_reply\\\\\\\\n incoming.message))\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 176, in _dispatch\\\\\\\\n return self._do_dispatch(endpoint, method, ctxt, args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 122, in _do_dispatch\\\\\\\\n result = getattr(endpoint, method)(ctxt, **new_args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py\\\\\\\\\\\\\\", line 132, in wrapped_fn\\\\\\\\n return fn(self, context, **kwargs)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py\\\\\\\\\\\\\\", line 756, in migrate_instance\\\\\\\\n network_info = self.network_api.get_instance_nw_info(context, instance)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/api.py\\\\\\\\\\\\\\", line 95, in wrapped\\\\\\\\n return func(self, context, *args, **kwargs)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/api.py\\\\\\\\\\\\\\", line 390, in get_instance_nw_info\\\\\\\\n result = self._get_instance_nw_info(context, instance)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/api.py\\\\\\\\\\\\\\", line 406, in _get_instance_nw_info\\\\\\\\n nw_info = self.network_rpcapi.get_instance_nw_info(context, **args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py\\\\\\\\\\\\\\", line 222, in get_instance_nw_info\\\\\\\\n host=host, project_id=project_id)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py\\\\\\\\\\\\\\", line 150, in call\\\\\\\\n wait_for_reply=True, timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py\\\\\\\\\\\\\\", line 90, in _send\\\\\\\\n timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 412, in send\\\\\\\\n return self._send(target, ctxt, message, wait_for_reply, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 403, in _send\\\\\\\\n result = self._waiter.wait(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 280, in wait\\\\\\\\n reply, ending, trylock = self._poll_queue(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File 
\\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 220, in _poll_queue\\\\\\\\n message = self.waiters.get(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 126, in get\\\\\\\\n \'to message ID %s\' % msg_id)\\\\\\\\n\\\\\\", \\\\\\"MessagingTimeout: Timed out waiting for a reply to message ID 29c6579c5de24c00b0b0e55579b8e047\\\\\\\\n\\\\\\"], \\\\\\"class\\\\\\": \\\\\\"MessagingTimeout\\\\\\"}\\", \\"result\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:25:56 10.35.0.13 cobalt-compute: amqp DEBUG UNIQUE_ID is f6832043927b482ba0fdfea3e064039c.
Jul 8 19:25:56 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"f6832043927b482ba0fdfea3e064039c\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:25:56 10.35.0.13 cobalt-compute: messaging ERROR NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"f6832043927b482ba0fdfea3e064039c\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:25:56 10.35.0.13 cobalt-compute: channel DEBUG Closed channel #1
Jul 8 19:25:56 10.35.0.13 cobalt-compute: channel DEBUG using channel_id: 1
Jul 8 19:25:56 10.35.0.13 cobalt-compute: channel DEBUG Channel open
Jul 8 19:25:56 10.35.0.13 cobalt-compute: impl_rabbit DEBUG Timed out waiting for RPC response: timed out
Jul 8 19:25:56 10.35.0.13 cobalt-compute: manager DEBUG Unlocked instance 5ee78913-e655-4cb1-a92c-500639c819ad (fn: migrate_instance)
Jul 8 19:25:56 10.35.0.13 cobalt-compute: dispatcher ERROR Exception during message handling: Timed out waiting for a reply to message ID ebea5c3e6cd54b0cb4cf6a28624c211d
Traceback (most recent call last):
File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 133, in _dispatch_and_reply
incoming.message))
File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 176, in _dispatch
return self._do_dispatch(endpoint, method, ctxt, args)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 122, in _do_dispatch
result = getattr(endpoint, method)(ctxt, **new_args)
File "/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py", line 132, in wrapped_fn
return fn(self, context, **kwargs)
File "/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py", line 756, in migrate_instance
network_info = self.network_api.get_instance_nw_info(context, instance)
File "/usr/lib64/python2.7/site-packages/nova/network/api.py", line 95, in wrapped
return func(self, context, *args, **kwargs)
File "/usr/lib64/python2.7/site-packages/nova/network/api.py", line 390, in get_instance_nw_info
result = self._get_instance_nw_info(context, instance)
File "/usr/lib64/python2.7/site-packages/nova/network/api.py", line 406, in _get_instance_nw_info
nw_info = self.network_rpcapi.get_instance_nw_info(context, **args)
File "/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py", line 222, in get_instance_nw_info
host=host, project_id=project_id)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py", line 150, in call
wait_for_reply=True, timeout=timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py", line 90, in _send
timeout=timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 412, in send
return self._send(target, ctxt, message, wait_for_reply, timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 403, in _send
result = self._waiter.wait(msg_id, timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 267, in wait
reply, ending = self._poll_connection(msg_id, timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 217, in _poll_connection
% msg_id)
MessagingTimeout: Timed out waiting for a reply to message ID ebea5c3e6cd54b0cb4cf6a28624c211d
Jul 8 19:25:56 10.35.0.13 cobalt-compute: common ERROR Returning exception Timed out waiting for a reply to message ID ebea5c3e6cd54b0cb4cf6a28624c211d to caller
Jul 8 19:25:56 10.35.0.13 cobalt-compute: common ERROR ['Traceback (most recent call last):\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 133, in _dispatch_and_reply\n incoming.message))\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 176, in _dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 122, in _do_dispatch\n result = getattr(endpoint, method)(ctxt, **new_args)\n', ' File "/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py", line 132, in wrapped_fn\n return fn(self, context, **kwargs)\n', ' File "/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py", line 756, in migrate_instance\n network_info = self.network_api.get_instance_nw_info(context, instance)\n', ' File "/usr/lib64/python2.7/site-packages/nova/network/api.py", line 95, in wrapped\n return func(self, context, *args, **kwargs)\n', ' File "/usr/lib64/python2.7/site-packages/nova/network/api.py", line 390, in get_instance_nw_info\n result = self._get_instance_nw_info(context, instance)\n', ' File "/usr/lib64/python2.7/site-packages/nova/network/api.py", line 406, in _get_instance_nw_info\n nw_info = self.network_rpcapi.get_instance_nw_info(context, **args)\n', ' File "/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py", line 222, in get_instance_nw_info\n host=host, project_id=project_id)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py", line 150, in call\n wait_for_reply=True, timeout=timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py", line 90, in _send\n timeout=timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 412, in send\n return self._send(target, ctxt, message, wait_for_reply, timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 403, in _send\n result = self._waiter.wait(msg_id, timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 267, in wait\n reply, ending = self._poll_connection(msg_id, timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 217, in _poll_connection\n % msg_id)\n', 'MessagingTimeout: Timed out waiting for a reply to message ID ebea5c3e6cd54b0cb4cf6a28624c211d\n']
Jul 8 19:25:56 10.35.0.13 cobalt-compute: amqp DEBUG UNIQUE_ID is e4a2779d58d64444a0b961d0670c86b2.
Jul 8 19:25:56 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"e4a2779d58d64444a0b961d0670c86b2\\", \\"failure\\": \\"{\\\\\\"args\\\\\\": [\\\\\\"Timed out waiting for a reply to message ID ebea5c3e6cd54b0cb4cf6a28624c211d\\\\\\"], \\\\\\"module\\\\\\": \\\\\\"oslo.messaging.exceptions\\\\\\", \\\\\\"kwargs\\\\\\": {}, \\\\\\"message\\\\\\": \\\\\\"Timed out waiting for a reply to message ID ebea5c3e6cd54b0cb4cf6a28624c211d\\\\\\", \\\\\\"tb\\\\\\": [\\\\\\"Traceback (most recent call last):\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 133, in _dispatch_and_reply\\\\\\\\n incoming.message))\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 176, in _dispatch\\\\\\\\n return self._do_dispatch(endpoint, method, ctxt, args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 122, in _do_dispatch\\\\\\\\n result = getattr(endpoint, method)(ctxt, **new_args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py\\\\\\\\\\\\\\", line 132, in wrapped_fn\\\\\\\\n return fn(self, context, **kwargs)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py\\\\\\\\\\\\\\", line 756, in migrate_instance\\\\\\\\n network_info = self.network_api.get_instance_nw_info(context, instance)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/api.py\\\\\\\\\\\\\\", line 95, in wrapped\\\\\\\\n return func(self, context, *args, **kwargs)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/api.py\\\\\\\\\\\\\\", line 390, in get_instance_nw_info\\\\\\\\n result = self._get_instance_nw_info(context, instance)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/api.py\\\\\\\\\\\\\\", line 406, in _get_instance_nw_info\\\\\\\\n nw_info = self.network_rpcapi.get_instance_nw_info(context, **args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py\\\\\\\\\\\\\\", line 222, in get_instance_nw_info\\\\\\\\n host=host, project_id=project_id)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py\\\\\\\\\\\\\\", line 150, in call\\\\\\\\n wait_for_reply=True, timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py\\\\\\\\\\\\\\", line 90, in _send\\\\\\\\n timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 412, in send\\\\\\\\n return self._send(target, ctxt, message, wait_for_reply, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 403, in _send\\\\\\\\n result = self._waiter.wait(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 267, in wait\\\\\\\\n reply, ending = self._poll_connection(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File 
\\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 217, in _poll_connection\\\\\\\\n % msg_id)\\\\\\\\n\\\\\\", \\\\\\"MessagingTimeout: Timed out waiting for a reply to message ID ebea5c3e6cd54b0cb4cf6a28624c211d\\\\\\\\n\\\\\\"], \\\\\\"class\\\\\\": \\\\\\"MessagingTimeout\\\\\\"}\\", \\"result\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:25:56 10.35.0.13 cobalt-compute: messaging ERROR NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"e4a2779d58d64444a0b961d0670c86b2\\", \\"failure\\": \\"{\\\\\\"args\\\\\\": [\\\\\\"Timed out waiting for a reply to message ID ebea5c3e6cd54b0cb4cf6a28624c211d\\\\\\"], \\\\\\"module\\\\\\": \\\\\\"oslo.messaging.exceptions\\\\\\", \\\\\\"kwargs\\\\\\": {}, \\\\\\"message\\\\\\": \\\\\\"Timed out waiting for a reply to message ID ebea5c3e6cd54b0cb4cf6a28624c211d\\\\\\", \\\\\\"tb\\\\\\": [\\\\\\"Traceback (most recent call last):\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 133, in _dispatch_and_reply\\\\\\\\n incoming.message))\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 176, in _dispatch\\\\\\\\n return self._do_dispatch(endpoint, method, ctxt, args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 122, in _do_dispatch\\\\\\\\n result = getattr(endpoint, method)(ctxt, **new_args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py\\\\\\\\\\\\\\", line 132, in wrapped_fn\\\\\\\\n return fn(self, context, **kwargs)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py\\\\\\\\\\\\\\", line 756, in migrate_instance\\\\\\\\n network_info = self.network_api.get_instance_nw_info(context, instance)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/api.py\\\\\\\\\\\\\\", line 95, in wrapped\\\\\\\\n return func(self, context, *args, **kwargs)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/api.py\\\\\\\\\\\\\\", line 390, in get_instance_nw_info\\\\\\\\n result = self._get_instance_nw_info(context, instance)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/api.py\\\\\\\\\\\\\\", line 406, in _get_instance_nw_info\\\\\\\\n nw_info = self.network_rpcapi.get_instance_nw_info(context, **args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py\\\\\\\\\\\\\\", line 222, in get_instance_nw_info\\\\\\\\n host=host, project_id=project_id)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py\\\\\\\\\\\\\\", line 150, in call\\\\\\\\n wait_for_reply=True, timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py\\\\\\\\\\\\\\", line 90, in _send\\\\\\\\n timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 412, in send\\\\\\\\n return self._send(target, ctxt, message, wait_for_reply, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 403, in _send\\\\\\\\n result = self._waiter.wait(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 267, in wait\\\\\\\\n reply, ending = self._poll_connection(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File 
\\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 217, in _poll_connection\\\\\\\\n % msg_id)\\\\\\\\n\\\\\\", \\\\\\"MessagingTimeout: Timed out waiting for a reply to message ID ebea5c3e6cd54b0cb4cf6a28624c211d\\\\\\\\n\\\\\\"], \\\\\\"class\\\\\\": \\\\\\"MessagingTimeout\\\\\\"}\\", \\"result\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:25:56 10.35.0.13 cobalt-compute: amqp DEBUG UNIQUE_ID is c8f2747829d642cdb2de8268f0ad0d4d.
Jul 8 19:25:56 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"c8f2747829d642cdb2de8268f0ad0d4d\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:25:56 10.35.0.13 cobalt-compute: messaging ERROR NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"c8f2747829d642cdb2de8268f0ad0d4d\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:25:56 10.35.0.13 cobalt-compute: channel DEBUG Closed channel #1
Jul 8 19:25:56 10.35.0.13 cobalt-compute: channel DEBUG using channel_id: 1
Jul 8 19:25:56 10.35.0.13 cobalt-compute: channel DEBUG Channel open
Jul 8 19:25:57 10.35.0.13 [2014-07-08 19:25:57.289] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:25:57 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:25:57 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:25:57 10.35.0.13 nova-api 2014-07-08 19:25:57.397 4840 INFO nova.osapi_compute.wsgi.server [req-dfb88317-20e7-4e1e-a16e-e97bda37ad41 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1047931
Jul 8 19:25:57 10.35.0.13 [2014-07-08 19:25:57.399] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:25:57 10.35.0.13 [2014-07-08 19:25:57.399] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:25:57 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:25:57 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" 200 1690 "" "python-novaclient"
Jul 8 19:25:57 10.35.0.13 nova-api 2014-07-08 19:25:57.501 4840 INFO nova.osapi_compute.wsgi.server [req-4dec1112-da12-4554-a061-e61b3e5060d9 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" status: 200 len: 1864 time: 0.0994101
Jul 8 19:25:57 10.35.0.13 [2014-07-08 19:25:57.502] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:25:57 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.131 from 00:25:90:2f:2d:07 via management
Jul 8 19:25:57 10.35.0.2 dhcpd: DHCPACK on 10.35.1.131 to 00:25:90:2f:2d:07 via management
Jul 8 19:25:58 10.35.0.15 cinder-volume 2014-07-08 19:25:58.286 9067 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-fb1a0176-1151-4e40-a7f6-14ea0613da3f\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:25:58.284792\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1282, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"c6f1c56c5959423794de2a22eb269afc\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:25:58 10.35.0.15 cinder-volume 2014-07-08 19:25:58.287 9067 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-fb1a0176-1151-4e40-a7f6-14ea0613da3f\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:25:58.284792\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1282, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"c6f1c56c5959423794de2a22eb269afc\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:25:58 10.35.0.15 cinder-scheduler 2014-07-08 19:25:58.288 8863 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-fb1a0176-1151-4e40-a7f6-14ea0613da3f\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:25:58.284792\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1282, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"c6f1c56c5959423794de2a22eb269afc\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'3', 'redelivered': False, 'routing_key': u'', 'delivery_tag': 79, 'exchange': u'cinder-scheduler_fanout'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fa03d158690>}
Jul 8 19:25:58 10.35.0.15 cinder-volume 2014-07-08 19:25:58.289 9067 INFO cinder.volume.manager [-] Updating volume status
Jul 8 19:25:58 10.35.0.15 cinder-scheduler 2014-07-08 19:25:58.290 8863 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-fb1a0176-1151-4e40-a7f6-14ea0613da3f\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:25:58.284792\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1282, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"c6f1c56c5959423794de2a22eb269afc\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}'
Jul 8 19:25:58 10.35.0.15 cinder-scheduler 2014-07-08 19:25:58.290 8863 WARNING cinder.context [-] Arguments dropped when creating context: {'user': None, 'tenant': None, 'user_identity': u'- - - - -'}
Jul 8 19:25:58 10.35.0.15 cinder-scheduler 2014-07-08 19:25:58.291 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"cdf0f648b82044788fb6208e121a40a1\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:25:58 10.35.0.15 cinder-scheduler 2014-07-08 19:25:58.291 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"cdf0f648b82044788fb6208e121a40a1\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:25:58 10.35.0.15 cinder-scheduler 2014-07-08 19:25:58.292 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"b47c621a7af04cffa7dcc5a71a6a1697\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:25:58 10.35.0.15 cinder-scheduler 2014-07-08 19:25:58.292 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"b47c621a7af04cffa7dcc5a71a6a1697\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:25:59 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:25:59 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:25:59 10.35.0.13 nova-api 2014-07-08 19:25:59.219 4842 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:25:59 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:25:59 +0000] "GET /v2.0/tokens/e122f128c6084e8e9679708f07109303 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:25:59 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:25:59 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:25:59 10.35.0.13 nova-api 2014-07-08 19:25:59.337 4842 INFO nova.osapi_compute.wsgi.server [req-b9728266-c28c-45a4-99d3-27fe35e3390e 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1197751
Jul 8 19:26:02 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:01 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:02 10.35.0.15 nova-api 2014-07-08 19:26:02.117 3440 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:02 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:26:02 +0000] "GET /v2.0/tokens/6346a8ad9e01475e9593bcc3011fa355 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:02 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:02 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:26:02 10.35.0.15 nova-api 2014-07-08 19:26:02.265 3440 INFO nova.osapi_compute.wsgi.server [req-60548884-e5d0-4a25-9c72-6522ed040417 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1505959
Jul 8 19:26:02 10.35.0.14 nova-compute 2014-07-08 19:26:02.904 6347 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:26:02 10.35.0.14 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:26:02 10.35.0.14 [2014-07-08 19:26:02.999] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:26:03 10.35.0.14 nova-compute 2014-07-08 19:26:03.247 6347 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47729
Jul 8 19:26:03 10.35.0.14 nova-compute 2014-07-08 19:26:03.247 6347 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1862
Jul 8 19:26:03 10.35.0.14 nova-compute 2014-07-08 19:26:03.247 6347 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 4
Jul 8 19:26:03 10.35.0.14 nova-compute 2014-07-08 19:26:03.273 6347 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.14:node-10-35-0-14
Jul 8 19:26:04 10.35.0.15 nova-compute 2014-07-08 19:26:04.309 3328 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:26:04 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:26:04 10.35.0.13 [2014-07-08 19:26:04.529] 11286/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.9']
Jul 8 19:26:04 10.35.0.15 nova-compute 2014-07-08 19:26:04.680 3328 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47217
Jul 8 19:26:04 10.35.0.15 nova-compute 2014-07-08 19:26:04.680 3328 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1861
Jul 8 19:26:04 10.35.0.15 nova-compute 2014-07-08 19:26:04.680 3328 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 3
Jul 8 19:26:04 10.35.0.15 nova-compute 2014-07-08 19:26:04.707 3328 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.15:node-10-35-0-15
Jul 8 19:26:04 10.35.0.13 nova-compute 2014-07-08 19:26:04.736 4705 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:26:04 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:26:04 10.35.0.15 [2014-07-08 19:26:04.821] 8877/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.6']
Jul 8 19:26:04 10.35.0.15 nova-compute 2014-07-08 19:26:04.839 3328 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"1a713828fadd4c60b926f6d4486e57ac\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-16e449c5-93cf-4677-bab8-baf6f507d0a0\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"b67b69157776412f9d6146d8707630ae\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:04.307898\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:26:04 10.35.0.15 nova-compute 2014-07-08 19:26:04.839 3328 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"1a713828fadd4c60b926f6d4486e57ac\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-16e449c5-93cf-4677-bab8-baf6f507d0a0\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"b67b69157776412f9d6146d8707630ae\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:04.307898\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:26:04 10.35.0.14 nova-network 2014-07-08 19:26:04.837 6514 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"1a713828fadd4c60b926f6d4486e57ac\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-16e449c5-93cf-4677-bab8-baf6f507d0a0\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"b67b69157776412f9d6146d8707630ae\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:04.307898\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 27, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f5c1424f1d0>}
Jul 8 19:26:04 10.35.0.14 nova-network 2014-07-08 19:26:04.839 6514 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"1a713828fadd4c60b926f6d4486e57ac\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-16e449c5-93cf-4677-bab8-baf6f507d0a0\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"b67b69157776412f9d6146d8707630ae\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:04.307898\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}'
Jul 8 19:26:04 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:26:04 10.35.0.14 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:26:04 10.35.0.14 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:26:04 10.35.0.14 nova-network 2014-07-08 19:26:04.979 6514 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"255bb6ac5d724293b603d3aaaeab8e5b\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"1a713828fadd4c60b926f6d4486e57ac\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:26:04 10.35.0.14 nova-network 2014-07-08 19:26:04.979 6514 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"255bb6ac5d724293b603d3aaaeab8e5b\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"1a713828fadd4c60b926f6d4486e57ac\\"}", "oslo.version": "2.0"}', routing_key=u'reply_af4fb5adaca445a3b6d555c358806a4e'
Jul 8 19:26:04 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:26:04 10.35.0.14 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:26:04 10.35.0.14 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:26:04 10.35.0.14 nova-network 2014-07-08 19:26:04.980 6514 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"d0a85f844eed49d2921698d830ef52cf\\", \\"failure\\": null, \\"_msg_id\\": \\"1a713828fadd4c60b926f6d4486e57ac\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:26:04 10.35.0.14 nova-network 2014-07-08 19:26:04.980 6514 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"d0a85f844eed49d2921698d830ef52cf\\", \\"failure\\": null, \\"_msg_id\\": \\"1a713828fadd4c60b926f6d4486e57ac\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_af4fb5adaca445a3b6d555c358806a4e'
Jul 8 19:26:04 10.35.0.15 nova-compute 2014-07-08 19:26:04.984 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"255bb6ac5d724293b603d3aaaeab8e5b\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"1a713828fadd4c60b926f6d4486e57ac\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_af4fb5adaca445a3b6d555c358806a4e', 'delivery_tag': 77, 'exchange': u'reply_af4fb5adaca445a3b6d555c358806a4e'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030b1ff10>}
Jul 8 19:26:04 10.35.0.15 nova-compute 2014-07-08 19:26:04.984 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"255bb6ac5d724293b603d3aaaeab8e5b\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"1a713828fadd4c60b926f6d4486e57ac\\"}", "oslo.version": "2.0"}'
Jul 8 19:26:04 10.35.0.15 nova-compute 2014-07-08 19:26:04.985 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"d0a85f844eed49d2921698d830ef52cf\\", \\"failure\\": null, \\"_msg_id\\": \\"1a713828fadd4c60b926f6d4486e57ac\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_af4fb5adaca445a3b6d555c358806a4e', 'delivery_tag': 78, 'exchange': u'reply_af4fb5adaca445a3b6d555c358806a4e'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030b1ff10>}
Jul 8 19:26:04 10.35.0.15 nova-compute 2014-07-08 19:26:04.985 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"d0a85f844eed49d2921698d830ef52cf\\", \\"failure\\": null, \\"_msg_id\\": \\"1a713828fadd4c60b926f6d4486e57ac\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:26:05 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:04 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:05 10.35.0.15 nova-api 2014-07-08 19:26:05.033 3442 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:05 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:26:05 +0000] "GET /v2.0/tokens/4c941a3e16914671a9a5a4bd3e825ce6 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:05 10.35.0.13 nova-compute 2014-07-08 19:26:05.104 4705 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 46705
Jul 8 19:26:05 10.35.0.13 nova-compute 2014-07-08 19:26:05.104 4705 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1860
Jul 8 19:26:05 10.35.0.13 nova-compute 2014-07-08 19:26:05.105 4705 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 2
Jul 8 19:26:05 10.35.0.13 nova-compute 2014-07-08 19:26:05.130 4705 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.13:node-10-35-0-13
Jul 8 19:26:05 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:05 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:26:05 10.35.0.15 nova-api 2014-07-08 19:26:05.149 3442 INFO nova.osapi_compute.wsgi.server [req-6ccbefbe-840a-4cd6-8447-a1ad80422605 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1180308
Jul 8 19:26:05 10.35.0.13 nova-compute 2014-07-08 19:26:05.263 4705 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"8b8e6781a081424ea7933e90623f00ff\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-ce6f3584-67c9-432e-ac47-b47b1df57761\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"cb47544b4d324db9bcad094a0874ea39\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:04.735239\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:26:05 10.35.0.13 nova-compute 2014-07-08 19:26:05.264 4705 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"8b8e6781a081424ea7933e90623f00ff\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-ce6f3584-67c9-432e-ac47-b47b1df57761\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"cb47544b4d324db9bcad094a0874ea39\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:04.735239\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:26:05 10.35.0.15 nova-network 2014-07-08 19:26:05.267 3465 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"8b8e6781a081424ea7933e90623f00ff\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-ce6f3584-67c9-432e-ac47-b47b1df57761\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"cb47544b4d324db9bcad094a0874ea39\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:04.735239\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 27, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7eff2d221290>}
Jul 8 19:26:05 10.35.0.15 nova-network 2014-07-08 19:26:05.269 3465 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"8b8e6781a081424ea7933e90623f00ff\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-ce6f3584-67c9-432e-ac47-b47b1df57761\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"cb47544b4d324db9bcad094a0874ea39\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:04.735239\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}'
Jul 8 19:26:05 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:26:05 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:26:05 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:26:05 10.35.0.15 nova-network 2014-07-08 19:26:05.403 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"86d591ea98c74703b961bf35c9faaa3a\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"8b8e6781a081424ea7933e90623f00ff\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:26:05 10.35.0.15 nova-network 2014-07-08 19:26:05.404 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"86d591ea98c74703b961bf35c9faaa3a\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"8b8e6781a081424ea7933e90623f00ff\\"}", "oslo.version": "2.0"}', routing_key=u'reply_5a152b20506e4303bfaf3007cbe974c3'
Jul 8 19:26:05 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:26:05 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:26:05 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:26:05 10.35.0.15 nova-network 2014-07-08 19:26:05.405 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"bd5cc18e229d4829b355a6049baaba9e\\", \\"failure\\": null, \\"_msg_id\\": \\"8b8e6781a081424ea7933e90623f00ff\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:26:05 10.35.0.15 nova-network 2014-07-08 19:26:05.405 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"bd5cc18e229d4829b355a6049baaba9e\\", \\"failure\\": null, \\"_msg_id\\": \\"8b8e6781a081424ea7933e90623f00ff\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_5a152b20506e4303bfaf3007cbe974c3'
Jul 8 19:26:05 10.35.0.13 nova-compute 2014-07-08 19:26:05.403 4705 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"86d591ea98c74703b961bf35c9faaa3a\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"8b8e6781a081424ea7933e90623f00ff\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_5a152b20506e4303bfaf3007cbe974c3', 'delivery_tag': 77, 'exchange': u'reply_5a152b20506e4303bfaf3007cbe974c3'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fde2d448390>}
Jul 8 19:26:05 10.35.0.13 nova-compute 2014-07-08 19:26:05.404 4705 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"86d591ea98c74703b961bf35c9faaa3a\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"8b8e6781a081424ea7933e90623f00ff\\"}", "oslo.version": "2.0"}'
Jul 8 19:26:05 10.35.0.13 nova-compute 2014-07-08 19:26:05.404 4705 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"bd5cc18e229d4829b355a6049baaba9e\\", \\"failure\\": null, \\"_msg_id\\": \\"8b8e6781a081424ea7933e90623f00ff\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_5a152b20506e4303bfaf3007cbe974c3', 'delivery_tag': 78, 'exchange': u'reply_5a152b20506e4303bfaf3007cbe974c3'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fde2d448390>}
Jul 8 19:26:05 10.35.0.13 nova-compute 2014-07-08 19:26:05.405 4705 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"bd5cc18e229d4829b355a6049baaba9e\\", \\"failure\\": null, \\"_msg_id\\": \\"8b8e6781a081424ea7933e90623f00ff\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:26:05 10.35.0.15 [2014-07-08 19:26:05.694] 8900/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.4']
Jul 8 19:26:05 10.35.0.13 [2014-07-08 19:26:05.697] 11310/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.2']
Jul 8 19:26:06 10.35.0.13 [2014-07-08 19:26:06.374] 11330/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'host', '-c', '1', '-U', '10.35.0.3']
Jul 8 19:26:06 10.35.0.15 [2014-07-08 19:26:06.423] 9060/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.5']
Jul 8 19:26:07 10.35.0.13 [2014-07-08 19:26:07.513] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:26:07 10.35.0.14 nova-api 2014-07-08 19:26:07.517 6473 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:07 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:26:07 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:07 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:26:07 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:26:07 10.35.0.14 nova-api 2014-07-08 19:26:07.646 6473 INFO nova.osapi_compute.wsgi.server [req-b0442f3c-ab36-4c59-9008-d63806a73018 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1305370
Jul 8 19:26:07 10.35.0.13 [2014-07-08 19:26:07.650] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:26:07 10.35.0.13 [2014-07-08 19:26:07.650] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:26:07 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:26:07 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" 200 1690 "" "python-novaclient"
Jul 8 19:26:07 10.35.0.13 nova-api 2014-07-08 19:26:07.761 4842 INFO nova.osapi_compute.wsgi.server [req-2548b1a5-6825-49b2-8d20-ff7d4a485ef3 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" status: 200 len: 1864 time: 0.1064060
Jul 8 19:26:07 10.35.0.13 [2014-07-08 19:26:07.762] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:26:07 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:07 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:07 10.35.0.13 nova-api 2014-07-08 19:26:07.927 4841 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:07 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:26:07 +0000] "GET /v2.0/tokens/2b3828091bfe4829b82ac7242d54df1f HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:08 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:07 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:26:08 10.35.0.13 nova-api 2014-07-08 19:26:08.056 4841 INFO nova.osapi_compute.wsgi.server [req-f5534f27-82e0-4a93-9ad2-5b4eae20ff7e 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1312730
Jul 8 19:26:09 10.35.0.15 [2014-07-08 19:26:09.228] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:26:09 10.35.0.13 [2014-07-08 19:26:09.227] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3080
Jul 8 19:26:10 10.35.0.15 [2014-07-08 19:26:10.106] 9162/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'host', '-c', '1', '-U', '10.35.0.4']
Jul 8 19:26:10 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:10 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:10 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.129 from 00:25:90:8b:3f:a7 via management
Jul 8 19:26:10 10.35.0.2 dhcpd: DHCPACK on 10.35.1.129 to 00:25:90:8b:3f:a7 via management
Jul 8 19:26:10 10.35.0.13 nova-api 2014-07-08 19:26:10.841 4840 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:10 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:26:10 +0000] "GET /v2.0/tokens/559ba2fc913346058d3457511dfd8388 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:10 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:10 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:26:10 10.35.0.13 nova-api 2014-07-08 19:26:10.976 4840 INFO nova.osapi_compute.wsgi.server [req-c5910bf7-0942-4d17-a859-e3cd338759a4 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1378350
Jul 8 19:26:11 10.35.0.13 [2014-07-08 19:26:11.739] 10213/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.7']
Jul 8 19:26:12 10.35.0.15 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._refresh_host
Jul 8 19:26:12 10.35.0.15 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._clean
Jul 8 19:26:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Starting to clean symlinks in /dev/nova
Jul 8 19:26:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/3e65c658-731c-4a56-8c17-1cfac00bf754_disk
Jul 8 19:26:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/libvirt_save
Jul 8 19:26:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/novadisk
Jul 8 19:26:12 10.35.0.15 cobalt-compute: loopingcall DEBUG Dynamic looping call sleeping for 60.00 seconds
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.004] 10232/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.8']
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.084] 10245/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.3']
Jul 8 19:26:13 10.35.0.13 ceph-watchdog.log: Testing for ceph timout...
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.281] 22204/MainThread savage.kick_ceph/DEBUG: Testing for ceph timout...
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.281] 22204/MainThread savage/INFO: Called lspools
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.311] 22204/MainThread savage/INFO: lspools returned ['data', 'metadata', 'rbd', 'images', 'slowrbd', 'crash-dumps', 'gc-vms', '.rgw', '.rgw.buckets', '.rgw.control', '.rgw.gc', '.log', '.intent-log', '.usage', '.users', '.users.email', '.users.swift', '.users.uid', '.rgw.root']
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.311] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.324] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.324] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.337] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.337] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.350] 22204/MainThread savage/INFO: rbd_list returned ['mysql']
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.350] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.365] 22204/MainThread savage/INFO: rbd_list returned ['8f23694a-2995-4070-a5e8-661de3207f2e']
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.365] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.379] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.379] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.393] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.393] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.403] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.404] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.417] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.417] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.431] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.432] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.449] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.450] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.475] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.475] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.489] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.489] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.502] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.502] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.515] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.515] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.530] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.530] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.546] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.546] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.562] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.562] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.579] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.579] 22204/MainThread savage/INFO: Called rbd_list
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.594] 22204/MainThread savage/INFO: rbd_list returned []
Jul 8 19:26:13 10.35.0.13 ceph-watchdog.log: Ceph is responding without timeout
Jul 8 19:26:13 10.35.0.13 [2014-07-08 19:26:13.595] 22204/MainThread savage.kick_ceph/DEBUG: Ceph is responding without timeout
Jul 8 19:26:13 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:13 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:13 10.35.0.13 nova-api 2014-07-08 19:26:13.733 4841 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:13 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:26:13 +0000] "GET /v2.0/tokens/7c607f9d44434fe49fa1906be8bede0e HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:13 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:13 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:26:13 10.35.0.13 nova-api 2014-07-08 19:26:13.856 4841 INFO nova.osapi_compute.wsgi.server [req-e95a03da-7c68-424e-84cc-aa84303eb81e 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1246190
Jul 8 19:26:16 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:16 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:16 10.35.0.13 nova-api 2014-07-08 19:26:16.623 4842 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:16 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:26:16 +0000] "GET /v2.0/tokens/7a1971581ecf43fdb02037cb89aac55e HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:16 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:16 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:26:16 10.35.0.13 nova-api 2014-07-08 19:26:16.741 4842 INFO nova.osapi_compute.wsgi.server [req-73c49400-ba03-4204-b4a5-32fcb6806ab5 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1201870
Jul 8 19:26:17 10.35.0.13 [2014-07-08 19:26:17.772] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:26:17 10.35.0.13 nova-api 2014-07-08 19:26:17.779 4841 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:17 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:26:17 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:17 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:26:17 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:26:17 10.35.0.13 nova-api 2014-07-08 19:26:17.903 4841 INFO nova.osapi_compute.wsgi.server [req-8e388c04-ef25-43a6-929a-c31da8c447d8 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1261680
Jul 8 19:26:17 10.35.0.13 [2014-07-08 19:26:17.905] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:26:17 10.35.0.13 [2014-07-08 19:26:17.905] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:26:17 10.35.0.14 nova-api 2014-07-08 19:26:17.910 6472 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:17 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:26:17 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:18 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:26:17 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" 200 1690 "" "python-novaclient"
Jul 8 19:26:18 10.35.0.14 nova-api 2014-07-08 19:26:18.033 6472 INFO nova.osapi_compute.wsgi.server [req-d1c344f6-cb5d-4089-8a7c-e83c79df2e43 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" status: 200 len: 1864 time: 0.1257091
Jul 8 19:26:18 10.35.0.13 [2014-07-08 19:26:18.036] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:26:18 10.35.0.15 cinder-volume 2014-07-08 19:26:18.143 9068 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-be6e2e4e-9a55-4dad-8f2c-6d4d13d5e76d\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:26:18.141116\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"44af45526c8c439d9aaf98d51a3b1e0e\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:26:18 10.35.0.15 cinder-volume 2014-07-08 19:26:18.143 9068 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-be6e2e4e-9a55-4dad-8f2c-6d4d13d5e76d\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:26:18.141116\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"44af45526c8c439d9aaf98d51a3b1e0e\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:26:18 10.35.0.15 cinder-scheduler 2014-07-08 19:26:18.144 8863 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-be6e2e4e-9a55-4dad-8f2c-6d4d13d5e76d\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:26:18.141116\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"44af45526c8c439d9aaf98d51a3b1e0e\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'3', 'redelivered': False, 'routing_key': u'', 'delivery_tag': 80, 'exchange': u'cinder-scheduler_fanout'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fa03d158690>}
Jul 8 19:26:18 10.35.0.15 cinder-volume 2014-07-08 19:26:18.145 9068 INFO cinder.volume.manager [-] Updating volume status
Jul 8 19:26:18 10.35.0.15 cinder-scheduler 2014-07-08 19:26:18.146 8863 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-be6e2e4e-9a55-4dad-8f2c-6d4d13d5e76d\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:26:18.141116\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"44af45526c8c439d9aaf98d51a3b1e0e\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}'
Jul 8 19:26:18 10.35.0.15 cinder-scheduler 2014-07-08 19:26:18.147 8863 WARNING cinder.context [-] Arguments dropped when creating context: {'user': None, 'tenant': None, 'user_identity': u'- - - - -'}
Jul 8 19:26:18 10.35.0.15 cinder-scheduler 2014-07-08 19:26:18.148 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"8c234ce8eee64b22b2fb0ddd9bb377da\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:26:18 10.35.0.15 cinder-scheduler 2014-07-08 19:26:18.148 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"8c234ce8eee64b22b2fb0ddd9bb377da\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:26:18 10.35.0.15 cinder-scheduler 2014-07-08 19:26:18.149 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"5273c82dff234d15a08a668f16ed5fd7\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:26:18 10.35.0.15 cinder-scheduler 2014-07-08 19:26:18.149 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"5273c82dff234d15a08a668f16ed5fd7\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:26:18 10.35.0.14 [2014-07-08 19:26:18.565] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:26:19 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:19 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:19 10.35.0.15 nova-api 2014-07-08 19:26:19.492 3442 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:19 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:26:19 +0000] "GET /v2.0/tokens/6d21fff457ea42a2b62ac792d9d64c30 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:19 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:19 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:26:19 10.35.0.15 nova-api 2014-07-08 19:26:19.598 3442 INFO nova.osapi_compute.wsgi.server [req-eadcdada-7111-4b2c-b47c-9f275f89bfb3 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1080120
Jul 8 19:26:20 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py:435: Warning: Incorrect datetime value: '2014-07-08 19:16:20+00:00' for column 'updated_at' at row 1
Jul 8 19:26:20 10.35.0.14 nova-network.log: cursor.execute(statement, parameters)
Jul 8 19:26:22 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:22 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:22 10.35.0.15 nova-api 2014-07-08 19:26:22.365 3442 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:22 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:26:22 +0000] "GET /v2.0/tokens/5662409f1450452da99483af4917d514 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:22 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:22 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:26:22 10.35.0.15 nova-api 2014-07-08 19:26:22.483 3442 INFO nova.osapi_compute.wsgi.server [req-f323ea3e-b347-4b3b-bc57-7e3dd7294195 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1198020
Jul 8 19:26:23 10.35.0.14 nova-compute 2014-07-08 19:26:23.349 6347 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:26:23 10.35.0.14 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:26:23 10.35.0.14 nova-compute 2014-07-08 19:26:23.689 6347 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47729
Jul 8 19:26:23 10.35.0.14 nova-compute 2014-07-08 19:26:23.690 6347 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1862
Jul 8 19:26:23 10.35.0.14 nova-compute 2014-07-08 19:26:23.690 6347 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 4
Jul 8 19:26:23 10.35.0.14 nova-compute 2014-07-08 19:26:23.716 6347 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.14:node-10-35-0-14
Jul 8 19:26:23 10.35.0.14 nova-compute 2014-07-08 19:26:23.776 6347 WARNING nova.virt.libvirt.imagecache [-] Unknown base file: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:26:23 10.35.0.14 nova-compute 2014-07-08 19:26:23.776 6347 INFO nova.virt.libvirt.imagecache [-] Removable base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:26:23 10.35.0.14 nova-compute 2014-07-08 19:26:23.776 6347 INFO nova.virt.libvirt.imagecache [-] Base file too young to remove: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:26:24 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py:435: Warning: Incorrect datetime value: '2014-07-08 19:16:24+00:00' for column 'updated_at' at row 1
Jul 8 19:26:24 10.35.0.15 nova-network.log: cursor.execute(statement, parameters)
Jul 8 19:26:24 10.35.0.13 [2014-07-08 19:26:24.789] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3080
Jul 8 19:26:24 10.35.0.15 [2014-07-08 19:26:24.793] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:26:25 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:25 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:25 10.35.0.13 nova-api 2014-07-08 19:26:25.235 4842 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:25 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:26:25 +0000] "GET /v2.0/tokens/ecefca617e8e4cd8b564c57a2d4f16c1 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:25 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:25 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:26:25 10.35.0.13 nova-api 2014-07-08 19:26:25.357 4842 INFO nova.osapi_compute.wsgi.server [req-672eed4e-4252-490b-80e2-ba7335636bf7 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1246910
Jul 8 19:26:28 10.35.0.13 [2014-07-08 19:26:28.047] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:26:28 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:27 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:28 10.35.0.14 nova-api 2014-07-08 19:26:28.131 6472 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:28 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:26:28 +0000] "GET /v2.0/tokens/a089cd6a432746088ee51f4e11f9db03 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:28 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:26:28 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:26:28 10.35.0.15 nova-api 2014-07-08 19:26:28.152 3442 INFO nova.osapi_compute.wsgi.server [req-f9518126-2b25-47dc-bf2d-140edb9f836f 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.0977750
Jul 8 19:26:28 10.35.0.13 [2014-07-08 19:26:28.152] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:26:28 10.35.0.13 [2014-07-08 19:26:28.152] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:26:28 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:28 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:26:28 10.35.0.14 nova-api 2014-07-08 19:26:28.249 6472 INFO nova.osapi_compute.wsgi.server [req-c8ecd59e-4f9a-4a68-a690-495b9e835e94 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1197460
Jul 8 19:26:28 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:26:28 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" 200 1690 "" "python-novaclient"
Jul 8 19:26:28 10.35.0.14 nova-api 2014-07-08 19:26:28.254 6474 INFO nova.osapi_compute.wsgi.server [req-9d7e9451-be40-4331-98a4-c5c61652e9bf 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" status: 200 len: 1864 time: 0.0991521
Jul 8 19:26:28 10.35.0.13 [2014-07-08 19:26:28.257] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:26:31 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:30 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:31 10.35.0.14 nova-api 2014-07-08 19:26:31.022 6472 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:31 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:26:31 +0000] "GET /v2.0/tokens/1d465d5749074690af19ada0416c28e6 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:31 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:31 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:26:31 10.35.0.14 nova-api 2014-07-08 19:26:31.148 6472 INFO nova.osapi_compute.wsgi.server [req-a09ddbc7-1563-4818-bce9-5f12a72122ee 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1274550
Jul 8 19:26:33 10.35.0.15 nova-compute 2014-07-08 19:26:33.676 3328 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:26:33 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:26:33 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:33 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:33 10.35.0.13 nova-api 2014-07-08 19:26:33.916 4842 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:33 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:26:33 +0000] "GET /v2.0/tokens/ff57aecf664345a0a32ff00ecf662f17 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:34 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:33 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:26:34 10.35.0.13 nova-api 2014-07-08 19:26:34.024 4842 INFO nova.osapi_compute.wsgi.server [req-d2398a01-2a7d-4240-b918-cf951121848f 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1101680
Jul 8 19:26:34 10.35.0.15 nova-compute 2014-07-08 19:26:34.036 3328 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47217
Jul 8 19:26:34 10.35.0.15 nova-compute 2014-07-08 19:26:34.037 3328 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1861
Jul 8 19:26:34 10.35.0.15 nova-compute 2014-07-08 19:26:34.037 3328 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 3
Jul 8 19:26:34 10.35.0.15 nova-compute 2014-07-08 19:26:34.062 3328 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.15:node-10-35-0-15
Jul 8 19:26:34 10.35.0.15 nova-compute 2014-07-08 19:26:34.119 3328 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): checking
Jul 8 19:26:34 10.35.0.15 nova-compute 2014-07-08 19:26:34.119 3328 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): in use: on this node 1 local, 0 on other nodes sharing this instance storage
Jul 8 19:26:34 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf chown 43 /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:26:34 10.35.0.14 [2014-07-08 19:26:34.128] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:26:34 10.35.0.15 nova-compute 2014-07-08 19:26:34.305 3328 INFO nova.virt.libvirt.imagecache [-] Active base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:26:34 10.35.0.13 nova-compute 2014-07-08 19:26:34.773 4705 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:26:34 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:26:35 10.35.0.13 nova-compute 2014-07-08 19:26:35.300 4705 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 46705
Jul 8 19:26:35 10.35.0.13 nova-compute 2014-07-08 19:26:35.300 4705 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1860
Jul 8 19:26:35 10.35.0.13 nova-compute 2014-07-08 19:26:35.300 4705 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 2
Jul 8 19:26:35 10.35.0.13 nova-compute 2014-07-08 19:26:35.325 4705 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.13:node-10-35-0-13
Jul 8 19:26:35 10.35.0.13 nova-compute 2014-07-08 19:26:35.381 4705 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): checking
Jul 8 19:26:35 10.35.0.13 nova-compute 2014-07-08 19:26:35.381 4705 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): in use: on this node 2 local, 0 on other nodes sharing this instance storage
Jul 8 19:26:35 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf chown 43 /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:26:35 10.35.0.13 nova-compute 2014-07-08 19:26:35.574 4705 INFO nova.virt.libvirt.imagecache [-] Active base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:26:36 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:36 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:36 10.35.0.13 nova-api 2014-07-08 19:26:36.784 4842 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:36 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:26:36 +0000] "GET /v2.0/tokens/ba11a7ef8a534d25b15de5f4d698f3e8 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:36 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:36 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:26:36 10.35.0.13 nova-api 2014-07-08 19:26:36.898 4842 INFO nova.osapi_compute.wsgi.server [req-08190bf6-4d32-4656-885c-b4611017f2c0 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1161191
Jul 8 19:26:37 10.35.0.13 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._refresh_host
Jul 8 19:26:37 10.35.0.13 cobalt-compute: instance DEBUG Lazy-loading `system_metadata' on Instance uuid ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:26:37 10.35.0.13 cobalt-compute: amqpdriver DEBUG MSG_ID is 787336d1e03a46d9be320bce7d28cc1c
Jul 8 19:26:37 10.35.0.13 cobalt-compute: amqp DEBUG UNIQUE_ID is fca7cba420634ae79e800aa4325a690c.
Jul 8 19:26:37 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"787336d1e03a46d9be320bce7d28cc1c\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-b2bb0bc5-9b42-4c7d-a4fd-ab160d4b70eb\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": 1, \\"teardown\\": false, \\"host\\": \\"10.35.0.13\\"}, \\"_unique_id\\": \\"fca7cba420634ae79e800aa4325a690c\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:37.549982\\", \\"_context_user_name\\": null, \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:26:37 10.35.0.13 cobalt-compute: messaging ERROR NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"787336d1e03a46d9be320bce7d28cc1c\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-b2bb0bc5-9b42-4c7d-a4fd-ab160d4b70eb\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": 1, \\"teardown\\": false, \\"host\\": \\"10.35.0.13\\"}, \\"_unique_id\\": \\"fca7cba420634ae79e800aa4325a690c\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:37.549982\\", \\"_context_user_name\\": null, \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:26:37 10.35.0.13 cobalt-compute: channel DEBUG Closed channel #1
Jul 8 19:26:37 10.35.0.13 cobalt-compute: channel DEBUG using channel_id: 1
Jul 8 19:26:37 10.35.0.13 nova-network 2014-07-08 19:26:37.681 4876 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"787336d1e03a46d9be320bce7d28cc1c\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-b2bb0bc5-9b42-4c7d-a4fd-ab160d4b70eb\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": 1, \\"teardown\\": false, \\"host\\": \\"10.35.0.13\\"}, \\"_unique_id\\": \\"fca7cba420634ae79e800aa4325a690c\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:37.549982\\", \\"_context_user_name\\": null, \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 27, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fdeb6472390>}
Jul 8 19:26:37 10.35.0.13 cobalt-compute: channel DEBUG Channel open
Jul 8 19:26:37 10.35.0.13 nova-network 2014-07-08 19:26:37.682 4876 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"787336d1e03a46d9be320bce7d28cc1c\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-b2bb0bc5-9b42-4c7d-a4fd-ab160d4b70eb\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": 1, \\"teardown\\": false, \\"host\\": \\"10.35.0.13\\"}, \\"_unique_id\\": \\"fca7cba420634ae79e800aa4325a690c\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:37.549982\\", \\"_context_user_name\\": null, \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}'
Jul 8 19:26:37 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf ip route replace 10.35.3.0/24 dev cloud table cloud
Jul 8 19:26:38 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf ip route replace 0.0.0.0/0 via 10.35.3.1 dev cloud table cloud
Jul 8 19:26:38 10.35.0.13 [2014-07-08 19:26:38.267] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:26:38 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf kill -HUP 7234
Jul 8 19:26:38 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:26:38 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:26:38 10.35.0.14 nova-api 2014-07-08 19:26:38.378 6474 INFO nova.osapi_compute.wsgi.server [req-a056393a-f00b-4525-93cf-ca3780f7f1c3 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1080921
Jul 8 19:26:38 10.35.0.13 [2014-07-08 19:26:38.381] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:26:38 10.35.0.13 [2014-07-08 19:26:38.381] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:26:38 10.35.0.13 dnsmasq-dhcp[7234]: read /var/lib/dnsmasq/nova-cloud.conf
Jul 8 19:26:38 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:26:38 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:26:38 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:26:38 10.35.0.13 nova-network 2014-07-08 19:26:38.450 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"eda26879ff7f4b1e84db95cc9194e6af\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"787336d1e03a46d9be320bce7d28cc1c\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:26:38 10.35.0.13 nova-network 2014-07-08 19:26:38.450 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"eda26879ff7f4b1e84db95cc9194e6af\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"787336d1e03a46d9be320bce7d28cc1c\\"}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:26:38 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:26:38 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:26:38 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:26:38 10.35.0.13 nova-network 2014-07-08 19:26:38.451 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"215a23f864e04090972d00caa82f7a58\\", \\"failure\\": null, \\"_msg_id\\": \\"787336d1e03a46d9be320bce7d28cc1c\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:26:38 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"eda26879ff7f4b1e84db95cc9194e6af\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"787336d1e03a46d9be320bce7d28cc1c\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_ff10fb357a86427594db414dc3e5acc6', 'delivery_tag': 1, 'exchange': u'reply_ff10fb357a86427594db414dc3e5acc6'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d2864c50>}
Jul 8 19:26:38 10.35.0.13 nova-network 2014-07-08 19:26:38.451 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"215a23f864e04090972d00caa82f7a58\\", \\"failure\\": null, \\"_msg_id\\": \\"787336d1e03a46d9be320bce7d28cc1c\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:26:38 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"eda26879ff7f4b1e84db95cc9194e6af\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"787336d1e03a46d9be320bce7d28cc1c\\"}", "oslo.version": "2.0"}'
Jul 8 19:26:38 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"215a23f864e04090972d00caa82f7a58\\", \\"failure\\": null, \\"_msg_id\\": \\"787336d1e03a46d9be320bce7d28cc1c\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_ff10fb357a86427594db414dc3e5acc6', 'delivery_tag': 2, 'exchange': u'reply_ff10fb357a86427594db414dc3e5acc6'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d2864c50>}
Jul 8 19:26:38 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"215a23f864e04090972d00caa82f7a58\\", \\"failure\\": null, \\"_msg_id\\": \\"787336d1e03a46d9be320bce7d28cc1c\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:26:38 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:26:38 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" 200 1690 "" "python-novaclient"
Jul 8 19:26:38 10.35.0.14 nova-api 2014-07-08 19:26:38.480 6474 INFO nova.osapi_compute.wsgi.server [req-3157dc0d-b4fa-4a07-9a0b-13fce84eafde 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" status: 200 len: 1864 time: 0.0974510
Jul 8 19:26:38 10.35.0.13 [2014-07-08 19:26:38.483] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:26:38 10.35.0.13 cobalt-compute: instance DEBUG Lazy-loading `system_metadata' on Instance uuid 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:26:38 10.35.0.13 cobalt-compute: amqpdriver DEBUG MSG_ID is 3303d69cf18e4bcb8ee69b3b8a4496af
Jul 8 19:26:38 10.35.0.13 cobalt-compute: amqp DEBUG UNIQUE_ID is 7dc0b0a8eb134b278b0c1842c0a2a4f6.
Jul 8 19:26:38 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"3303d69cf18e4bcb8ee69b3b8a4496af\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-b2bb0bc5-9b42-4c7d-a4fd-ab160d4b70eb\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": 2, \\"teardown\\": false, \\"host\\": \\"10.35.0.13\\"}, \\"_unique_id\\": \\"7dc0b0a8eb134b278b0c1842c0a2a4f6\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:37.549982\\", \\"_context_user_name\\": null, \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:26:38 10.35.0.13 cobalt-compute: messaging ERROR NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"3303d69cf18e4bcb8ee69b3b8a4496af\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-b2bb0bc5-9b42-4c7d-a4fd-ab160d4b70eb\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": 2, \\"teardown\\": false, \\"host\\": \\"10.35.0.13\\"}, \\"_unique_id\\": \\"7dc0b0a8eb134b278b0c1842c0a2a4f6\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:37.549982\\", \\"_context_user_name\\": null, \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:26:38 10.35.0.13 cobalt-compute: channel DEBUG Closed channel #1
Jul 8 19:26:38 10.35.0.13 cobalt-compute: channel DEBUG using channel_id: 1
Jul 8 19:26:38 10.35.0.13 cobalt-compute: channel DEBUG Channel open
Jul 8 19:26:38 10.35.0.14 nova-network 2014-07-08 19:26:38.637 6514 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"3303d69cf18e4bcb8ee69b3b8a4496af\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-b2bb0bc5-9b42-4c7d-a4fd-ab160d4b70eb\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": 2, \\"teardown\\": false, \\"host\\": \\"10.35.0.13\\"}, \\"_unique_id\\": \\"7dc0b0a8eb134b278b0c1842c0a2a4f6\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:37.549982\\", \\"_context_user_name\\": null, \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 28, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f5c1424f1d0>}
Jul 8 19:26:38 10.35.0.14 nova-network 2014-07-08 19:26:38.639 6514 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"3303d69cf18e4bcb8ee69b3b8a4496af\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-b2bb0bc5-9b42-4c7d-a4fd-ab160d4b70eb\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": 2, \\"teardown\\": false, \\"host\\": \\"10.35.0.13\\"}, \\"_unique_id\\": \\"7dc0b0a8eb134b278b0c1842c0a2a4f6\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:37.549982\\", \\"_context_user_name\\": null, \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}'
Jul 8 19:26:38 10.35.0.14 nova-network 2014-07-08 19:26:38.798 6514 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"d9bd0f5d12014520a7165790429929c3\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-b2bb0bc5-9b42-4c7d-a4fd-ab160d4b70eb\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"network_id\\": 1, \\"teardown\\": false}, \\"_unique_id\\": \\"2b6f3bde4ee941f68c3e7b45c6d3a7a9\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_665664b1b2364a339f6ee930172547c8\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:37.549982\\", \\"_context_user_name\\": null, \\"method\\": \\"rpc_setup_network_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:26:38 10.35.0.14 nova-network 2014-07-08 19:26:38.798 6514 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"d9bd0f5d12014520a7165790429929c3\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-b2bb0bc5-9b42-4c7d-a4fd-ab160d4b70eb\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"network_id\\": 1, \\"teardown\\": false}, \\"_unique_id\\": \\"2b6f3bde4ee941f68c3e7b45c6d3a7a9\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_665664b1b2364a339f6ee930172547c8\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:37.549982\\", \\"_context_user_name\\": null, \\"method\\": \\"rpc_setup_network_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', routing_key=u'network.10.35.0.13'
Jul 8 19:26:38 10.35.0.13 nova-network 2014-07-08 19:26:38.801 4876 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"d9bd0f5d12014520a7165790429929c3\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-b2bb0bc5-9b42-4c7d-a4fd-ab160d4b70eb\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"network_id\\": 1, \\"teardown\\": false}, \\"_unique_id\\": \\"2b6f3bde4ee941f68c3e7b45c6d3a7a9\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_665664b1b2364a339f6ee930172547c8\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:37.549982\\", \\"_context_user_name\\": null, \\"method\\": \\"rpc_setup_network_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'2', 'redelivered': False, 'routing_key': u'network.10.35.0.13', 'delivery_tag': 28, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fdeb6472390>}
Jul 8 19:26:38 10.35.0.14 nova-network 2014-07-08 19:26:38.800 6514 ERROR oslo.messaging._drivers.impl_rabbit [-] Failed to consume message from queue: [Errno 104] Connection reset by peer
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit Traceback (most recent call last):
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 672, in ensure
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit return method(*args, **kwargs)
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 752, in _consume
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit return self.connection.drain_events(timeout=timeout)
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/connection.py", line 281, in drain_events
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit return self.transport.drain_events(self.connection, **kwargs)
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/transport/pyamqp.py", line 103, in drain_events
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit return connection.drain_events(**kwargs)
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/connection.py", line 320, in drain_events
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit return amqp_method(channel, args)
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/connection.py", line 523, in _close
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit self._x_close_ok()
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/connection.py", line 551, in _x_close_ok
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit self._send_method((10, 51))
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/abstract_channel.py", line 56, in _send_method
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit self.channel_id, method_sig, args, content,
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/method_framing.py", line 221, in write_method
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit write_frame(1, channel, payload)
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/transport.py", line 177, in write_frame
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit frame_type, channel, size, payload, 0xce,
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/eventlet/greenio.py", line 307, in sendall
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit tail = self.send(data, flags)
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/eventlet/greenio.py", line 293, in send
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit total_sent += fd.send(data[total_sent:], flags)
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit error: [Errno 104] Connection reset by peer
2014-07-08 19:26:38.800 6514 TRACE oslo.messaging._drivers.impl_rabbit
Jul 8 19:26:38 10.35.0.13 nova-network 2014-07-08 19:26:38.802 4876 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"d9bd0f5d12014520a7165790429929c3\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-b2bb0bc5-9b42-4c7d-a4fd-ab160d4b70eb\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"network_id\\": 1, \\"teardown\\": false}, \\"_unique_id\\": \\"2b6f3bde4ee941f68c3e7b45c6d3a7a9\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_665664b1b2364a339f6ee930172547c8\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:26:37.549982\\", \\"_context_user_name\\": null, \\"method\\": \\"rpc_setup_network_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}'
Jul 8 19:26:38 10.35.0.14 nova-network 2014-07-08 19:26:38.801 6514 INFO oslo.messaging._drivers.impl_rabbit [-] Reconnecting to AMQP server on 10.35.0.3:5672
Jul 8 19:26:38 10.35.0.14 nova-network 2014-07-08 19:26:38.801 6514 INFO oslo.messaging._drivers.impl_rabbit [-] Delaying reconnect for 1.0 seconds...
Jul 8 19:26:38 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf ip route replace 10.35.3.0/24 dev cloud table cloud
Jul 8 19:26:39 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf ip route replace 0.0.0.0/0 via 10.35.3.1 dev cloud table cloud
Jul 8 19:26:39 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf kill -HUP 7234
Jul 8 19:26:39 10.35.0.13 dnsmasq-dhcp[7234]: read /var/lib/dnsmasq/nova-cloud.conf
Jul 8 19:26:39 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:26:39 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:26:39 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:26:39 10.35.0.13 nova-network 2014-07-08 19:26:39.417 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"e54b721ca8364990be54ec6a7c9f27a1\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"d9bd0f5d12014520a7165790429929c3\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:26:39 10.35.0.13 nova-network 2014-07-08 19:26:39.418 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"e54b721ca8364990be54ec6a7c9f27a1\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"d9bd0f5d12014520a7165790429929c3\\"}", "oslo.version": "2.0"}', routing_key=u'reply_665664b1b2364a339f6ee930172547c8'
Jul 8 19:26:39 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:26:39 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:26:39 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:26:39 10.35.0.13 nova-network 2014-07-08 19:26:39.419 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"10209baf6f65487ea8aacdee3a7786cf\\", \\"failure\\": null, \\"_msg_id\\": \\"d9bd0f5d12014520a7165790429929c3\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:26:39 10.35.0.13 nova-network 2014-07-08 19:26:39.419 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"10209baf6f65487ea8aacdee3a7786cf\\", \\"failure\\": null, \\"_msg_id\\": \\"d9bd0f5d12014520a7165790429929c3\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_665664b1b2364a339f6ee930172547c8'
Jul 8 19:26:39 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:39 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:39 10.35.0.15 nova-api 2014-07-08 19:26:39.655 3443 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:39 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:26:39 +0000] "GET /v2.0/tokens/d42a930f56ea4a05b9f9c91ced653d47 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:39 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:39 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:26:39 10.35.0.15 nova-api 2014-07-08 19:26:39.776 3443 INFO nova.osapi_compute.wsgi.server [req-9188c5b6-dde9-4822-a3c0-d7a527ce4e64 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1234579
Jul 8 19:26:39 10.35.0.13 rabbitmq.log: =INFO REPORT==== 8-Jul-2014::19:26:39 ===
Jul 8 19:26:39 10.35.0.13 rabbitmq.log: accepting AMQP connection <0.2169.0> (10.35.0.14:35560 -> 10.35.0.3:5672)
Jul 8 19:26:39 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:26:39 10.35.0.14 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:26:39 10.35.0.14 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:26:39 10.35.0.14 nova-network 2014-07-08 19:26:39.823 6514 INFO oslo.messaging._drivers.impl_rabbit [-] Connected to AMQP server on 10.35.0.3:5672
Jul 8 19:26:40 10.35.0.15 [2014-07-08 19:26:40.357] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:26:40 10.35.0.13 [2014-07-08 19:26:40.355] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3080
Jul 8 19:26:42 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:42 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:42 10.35.0.15 nova-api 2014-07-08 19:26:42.546 3443 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:42 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:26:42 +0000] "GET /v2.0/tokens/b30a6b012b2e43e4a69adc3ad8c51f6b HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:42 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:42 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:26:42 10.35.0.15 nova-api 2014-07-08 19:26:42.657 3443 INFO nova.osapi_compute.wsgi.server [req-ffa5f9f7-45d4-49ed-9cfe-870da4c6b238 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1126990
Jul 8 19:26:45 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:45 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:45 10.35.0.14 nova-api 2014-07-08 19:26:45.417 6476 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:45 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:26:45 +0000] "GET /v2.0/tokens/44ff60c2f56a4e6088d988c9ebf144f6 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:45 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:45 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:26:45 10.35.0.14 nova-api 2014-07-08 19:26:45.549 6476 INFO nova.osapi_compute.wsgi.server [req-8483918c-9c36-4cff-9913-825f5b2b286e 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1339259
Jul 8 19:26:47 127.0.0.1 gunicorn.yellin: [17317] Autorestarting worker after current request.
Jul 8 19:26:47 10.35.0.2 [2014-07-08 19:26:47.713] 17317/MainThread gunicorn.error/INFO: Autorestarting worker after current request.
Jul 8 19:26:48 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:48 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:48 10.35.0.13 nova-api 2014-07-08 19:26:48.321 4840 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:48 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:26:48 +0000] "GET /v2.0/tokens/21860a2187ff4d1a9fa06a89a3bbb9ce HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:48 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:48 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:26:48 10.35.0.13 nova-api 2014-07-08 19:26:48.444 4840 INFO nova.osapi_compute.wsgi.server [req-320cbd3f-5a49-4215-a06c-72714c170c87 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1262701
Jul 8 19:26:48 10.35.0.13 [2014-07-08 19:26:48.492] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:26:48 10.35.0.15 nova-api 2014-07-08 19:26:48.501 3440 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:48 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:26:48 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:48 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:26:48 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:26:48 10.35.0.15 nova-api 2014-07-08 19:26:48.618 3440 INFO nova.osapi_compute.wsgi.server [req-23a1dcbc-e5fc-418a-b015-8bd5d7ca4d9c 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1196511
Jul 8 19:26:48 10.35.0.13 [2014-07-08 19:26:48.618] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:26:48 10.35.0.13 [2014-07-08 19:26:48.618] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:26:48 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:26:48 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" 200 1695 "" "python-novaclient"
Jul 8 19:26:48 10.35.0.15 nova-api 2014-07-08 19:26:48.720 3440 INFO nova.osapi_compute.wsgi.server [req-2924ed40-6f4b-47ed-a0e9-66ba13866640 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7 HTTP/1.1" status: 200 len: 1869 time: 0.0981250
Jul 8 19:26:48 10.35.0.13 [2014-07-08 19:26:48.720] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:26:49 10.35.0.14 [2014-07-08 19:26:49.692] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:26:49 10.35.0.14 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._refresh_host
Jul 8 19:26:49 10.35.0.14 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._clean
Jul 8 19:26:49 10.35.0.14 cobalt-compute: vmsconn DEBUG Starting to clean symlinks in /dev/nova
Jul 8 19:26:49 10.35.0.14 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/libvirt_save
Jul 8 19:26:49 10.35.0.14 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/novadisk
Jul 8 19:26:49 10.35.0.14 cobalt-compute: loopingcall DEBUG Dynamic looping call sleeping for 60.00 seconds
Jul 8 19:26:51 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:51 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:51 10.35.0.15 nova-api 2014-07-08 19:26:51.214 3440 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:51 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:26:51 +0000] "GET /v2.0/tokens/c3fdc526b58e40c881f5067d761d7ddb HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:51 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:51 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:26:51 10.35.0.15 nova-api 2014-07-08 19:26:51.353 3440 INFO nova.osapi_compute.wsgi.server [req-c952aeac-bd79-4bef-87a0-81dcba548e34 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1409822
Jul 8 19:26:53 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.130 from 00:25:90:2f:2d:56 via management
Jul 8 19:26:53 10.35.0.2 dhcpd: DHCPACK on 10.35.1.130 to 00:25:90:2f:2d:56 via management
Jul 8 19:26:54 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py:435: Warning: Incorrect datetime value: '2014-07-08 19:16:54+00:00' for column 'updated_at' at row 1
Jul 8 19:26:54 10.35.0.13 nova-network.log: cursor.execute(statement, parameters)
Jul 8 19:26:54 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:53 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:54 10.35.0.15 nova-api 2014-07-08 19:26:54.123 3441 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:54 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:26:54 +0000] "GET /v2.0/tokens/f97796a01f20411ba4f046168a902c9e HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:54 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:54 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:26:54 10.35.0.15 nova-api 2014-07-08 19:26:54.239 3441 INFO nova.osapi_compute.wsgi.server [req-46972b6a-e393-4713-8147-461ddebf6c83 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1181109
Jul 8 19:26:55 10.35.0.15 [2014-07-08 19:26:55.925] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:26:55 10.35.0.13 [2014-07-08 19:26:55.925] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3080
Jul 8 19:26:56 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:56 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:56 10.35.0.15 nova-api 2014-07-08 19:26:56.999 3441 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:57 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:26:56 +0000] "GET /v2.0/tokens/e0677b1d3b2f45a7a59ef40b2a49c34d HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:26:57 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:56 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:26:57 10.35.0.15 nova-api 2014-07-08 19:26:57.116 3441 INFO nova.osapi_compute.wsgi.server [req-fe5fb29d-4d4d-4a06-8c4b-70ec247cfac6 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1184850
Jul 8 19:26:58 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.131 from 00:25:90:2f:2d:07 via management
Jul 8 19:26:58 10.35.0.2 dhcpd: DHCPACK on 10.35.1.131 to 00:25:90:2f:2d:07 via management
Jul 8 19:26:58 10.35.0.15 cinder-volume 2014-07-08 19:26:58.286 9067 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-1e210e6b-4102-414d-ba49-daffb144aa32\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:26:58.284850\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"ac2d4fce69a8431890ef75b12be1d9e4\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:26:58 10.35.0.15 cinder-volume 2014-07-08 19:26:58.287 9067 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-1e210e6b-4102-414d-ba49-daffb144aa32\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:26:58.284850\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"ac2d4fce69a8431890ef75b12be1d9e4\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:26:58 10.35.0.15 cinder-scheduler 2014-07-08 19:26:58.288 8863 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-1e210e6b-4102-414d-ba49-daffb144aa32\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:26:58.284850\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"ac2d4fce69a8431890ef75b12be1d9e4\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'3', 'redelivered': False, 'routing_key': u'', 'delivery_tag': 81, 'exchange': u'cinder-scheduler_fanout'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fa03d158690>}
Jul 8 19:26:58 10.35.0.15 cinder-volume 2014-07-08 19:26:58.289 9067 INFO cinder.volume.manager [-] Updating volume status
Jul 8 19:26:58 10.35.0.15 cinder-scheduler 2014-07-08 19:26:58.290 8863 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-1e210e6b-4102-414d-ba49-daffb144aa32\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:26:58.284850\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"ac2d4fce69a8431890ef75b12be1d9e4\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}'
Jul 8 19:26:58 10.35.0.15 cinder-scheduler 2014-07-08 19:26:58.291 8863 WARNING cinder.context [-] Arguments dropped when creating context: {'user': None, 'tenant': None, 'user_identity': u'- - - - -'}
Jul 8 19:26:58 10.35.0.15 cinder-scheduler 2014-07-08 19:26:58.292 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"72d7d015568442e0b85b14df521146f2\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:26:58 10.35.0.15 cinder-scheduler 2014-07-08 19:26:58.292 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"72d7d015568442e0b85b14df521146f2\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:26:58 10.35.0.15 cinder-scheduler 2014-07-08 19:26:58.293 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"b590a873fcfc4ceebbe5976b0eedca52\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:26:58 10.35.0.15 cinder-scheduler 2014-07-08 19:26:58.293 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"b590a873fcfc4ceebbe5976b0eedca52\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:26:58 10.35.0.13 [2014-07-08 19:26:58.730] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:26:58 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:26:58 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:26:58 10.35.0.13 nova-api 2014-07-08 19:26:58.845 4844 INFO nova.osapi_compute.wsgi.server [req-8f9225c7-a3e5-48e0-8f95-8731a466ddab 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1098459
Jul 8 19:26:58 10.35.0.13 [2014-07-08 19:26:58.847] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:26:59 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:26:59 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:26:59 10.35.0.15 nova-api 2014-07-08 19:26:59.891 3443 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:26:59 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:26:59 +0000] "GET /v2.0/tokens/5afa4ec17d4b49288d08557612206c2b HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:00 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:26:59 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:00 10.35.0.15 nova-api 2014-07-08 19:27:00.030 3443 INFO nova.osapi_compute.wsgi.server [req-9a7ecea6-20a0-4817-9489-37ea9e4b7d71 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1410289
Jul 8 19:27:02 127.0.0.1 gunicorn.yellin: [17317] Worker exiting (pid: 17317)
Jul 8 19:27:02 10.35.0.2 [2014-07-08 19:27:02.744] 17317/MainThread gunicorn.error/INFO: Worker exiting (pid: 17317)
Jul 8 19:27:02 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:02 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:02 10.35.0.15 nova-api 2014-07-08 19:27:02.792 3442 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:02 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:27:02 +0000] "GET /v2.0/tokens/7f35097069bf4e359f8be42e67438722 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:02 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:02 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:02 10.35.0.15 nova-api 2014-07-08 19:27:02.903 3442 INFO nova.osapi_compute.wsgi.server [req-92a08915-68e5-4a9d-806e-91796bc8a22b 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1138420
Jul 8 19:27:02 127.0.0.1 gunicorn.yellin: [1634] Booting worker with pid: 1634
Jul 8 19:27:03 10.35.0.2 [2014-07-08 19:27:03.211] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('YellinConfiguration = savage.services.yellin.app:YellinConfiguration')
Jul 8 19:27:03 10.35.0.2 [2014-07-08 19:27:03.212] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaServiceState = savage.jobs:NovaServiceState')
Jul 8 19:27:03 10.35.0.14 nova-compute 2014-07-08 19:27:03.301 6347 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:27:03 10.35.0.14 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:27:03 10.35.0.14 nova-compute 2014-07-08 19:27:03.642 6347 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47729
Jul 8 19:27:03 10.35.0.14 nova-compute 2014-07-08 19:27:03.642 6347 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1862
Jul 8 19:27:03 10.35.0.14 nova-compute 2014-07-08 19:27:03.643 6347 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 4
Jul 8 19:27:03 10.35.0.14 nova-compute 2014-07-08 19:27:03.669 6347 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.14:node-10-35-0-14
Jul 8 19:27:03 10.35.0.14 nova-compute 2014-07-08 19:27:03.722 6347 WARNING nova.virt.libvirt.imagecache [-] Unknown base file: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:27:03 10.35.0.14 nova-compute 2014-07-08 19:27:03.723 6347 INFO nova.virt.libvirt.imagecache [-] Removable base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:27:03 10.35.0.14 nova-compute 2014-07-08 19:27:03.723 6347 INFO nova.virt.libvirt.imagecache [-] Base file too young to remove: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.399] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BootNode = savage.jobs:BootNode')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.400] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BasicConfiguration = savage.jobs:BasicConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.401] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BeforeNova = savage.jobs:BeforeNova')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.401] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('KeystoneLoadBalancer = savage.jobs:KeystoneLoadBalancer')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.401] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('MoveCrashDumps = savage.jobs:MoveCrashDumps')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.402] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('RadosgwConfiguration = savage.jobs:RadosgwConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.402] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Nova = savage.jobs:Nova')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.402] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('KeystoneConfiguration = savage.jobs:KeystoneConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.403] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CinderFinalSetup = savage.jobs:CinderFinalSetup')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.403] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CephWatchdog = savage.jobs.ceph_kick:CephWatchdog')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.405] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CronConfiguration = savage.jobs:CronConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.405] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Keystone = savage.jobs:Keystone')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.406] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('AuxiliaryNetwork = savage.jobs.network:AuxiliaryNetwork')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.427] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Radosgw = savage.jobs:Radosgw')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.427] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Mysql = savage.jobs:Mysql')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.428] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('MyHost = savage.jobs:MyHost')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.428] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Snmpd = savage.jobs:Snmpd')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.428] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaConfiguration = savage.jobs:NovaConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.429] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Neutron = savage.jobs:Neutron')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.429] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LocalDisk = savage.jobs:LocalDisk')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.429] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Ceph = savage.jobs:Ceph')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.430] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NetTest = savage.jobs.network:NetTest')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.430] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('AuthStoreConfiguration = savage.jobs.auth:AuthStoreConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.438] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('ShutdownLevels = savage.services.fred:ShutdownLevels')
Jul 8 19:27:04 10.35.0.13 [2014-07-08 19:27:04.541] 11286/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.9']
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.560] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LocalDiskConfiguration = savage.jobs:LocalDiskConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.561] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SnmpdConfiguration = savage.jobs:SnmpdConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.561] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('GlanceConfiguration = savage.jobs:GlanceConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.561] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LibvirtSave = savage.jobs:LibvirtSave')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.562] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Memcache = savage.jobs:Memcache')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.562] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SSLConfiguration = savage.jobs:SSLConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.563] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('RabbitConfiguration = savage.jobs:RabbitConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.563] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Rabbit = savage.jobs:Rabbit')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.563] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Apache = savage.jobs:Apache')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.564] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('RadosgwLoadBalancer = savage.jobs:RadosgwLoadBalancer')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.564] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('KeystoneFinalSetup = savage.jobs:KeystoneFinalSetup')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.564] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('FirewallConfiguration = savage.jobs.network:FirewallConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.565] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NeutronConfiguration = savage.jobs:NeutronConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.565] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('UpstreamServers = savage.jobs.network:UpstreamServers')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.566] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Debug = savage.jobs:Debug')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.566] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NoVNC = savage.jobs:NoVNC')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.566] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('DashboardLoadBalancer = savage.jobs:DashboardLoadBalancer')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.567] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('DashboardConfiguration = savage.jobs:DashboardConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.567] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('HostStoreConfiguration = savage.jobs:HostStoreConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.568] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaLoadBalancer = savage.jobs:NovaLoadBalancer')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.568] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Hosts = savage.jobs:Hosts')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.569] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Dashboard = savage.jobs:Dashboard')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.569] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('MysqlConfiguration = savage.jobs:MysqlConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.569] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NoVNCConfiguration = savage.jobs:NoVNCConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.570] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Cinder = savage.jobs:Cinder')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.570] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('OpenstackInternal = savage.jobs:OpenstackInternal')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.571] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BeforeNeutron = savage.jobs:BeforeNeutron')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.571] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NetworkConfiguration = savage.jobs.network:NetworkConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.571] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BootNodeMinimalConfiguration = savage.machines:BootNodeMinimalConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.572] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaNetworkFinalSetup = savage.jobs:NovaNetworkFinalSetup')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.572] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CephMounted = savage.jobs:CephMounted')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.572] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Arpwatch = savage.jobs.arpwatch:Arpwatch')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.574] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CephReplication = savage.jobs:CephReplication')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.574] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Firewall = savage.jobs.network:Firewall')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.575] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Zookeeper = savage.jobs:Zookeeper')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.575] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('GlanceLoadBalancer = savage.jobs:GlanceLoadBalancer')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.575] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('FredConfiguration = savage.services.fred:FredConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.576] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CinderLoadBalancer = savage.jobs:CinderLoadBalancer')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.576] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Cron = savage.jobs:Cron')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.577] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LibVirtd = savage.jobs:LibVirtd')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.577] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SSH = savage.jobs:SSH')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.577] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SupportConfiguration = savage.services.support:SupportConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.717] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CinderConfiguration = savage.jobs:CinderConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.717] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Openstack = savage.jobs:Openstack')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.718] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Glance = savage.jobs:Glance')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.718] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Disk = savage.jobs:Disk')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.719] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('PublicApi = savage.jobs.api:PublicApi')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.722] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('WriteUUID = savage.jobs:WriteUUID')
Jul 8 19:27:04 10.35.0.15 nova-compute 2014-07-08 19:27:04.736 3328 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:27:04 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.786] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('YellinConfiguration = savage.services.yellin.app:YellinConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.786] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaServiceState = savage.jobs:NovaServiceState')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.787] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BootNode = savage.jobs:BootNode')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.787] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BasicConfiguration = savage.jobs:BasicConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.787] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BeforeNova = savage.jobs:BeforeNova')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.788] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('KeystoneLoadBalancer = savage.jobs:KeystoneLoadBalancer')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.788] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('MoveCrashDumps = savage.jobs:MoveCrashDumps')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.789] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('RadosgwConfiguration = savage.jobs:RadosgwConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.789] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Nova = savage.jobs:Nova')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.789] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('KeystoneConfiguration = savage.jobs:KeystoneConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.790] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CinderFinalSetup = savage.jobs:CinderFinalSetup')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.790] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CephWatchdog = savage.jobs.ceph_kick:CephWatchdog')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.790] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CronConfiguration = savage.jobs:CronConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.791] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Keystone = savage.jobs:Keystone')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.791] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('AuxiliaryNetwork = savage.jobs.network:AuxiliaryNetwork')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.792] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Radosgw = savage.jobs:Radosgw')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.792] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Mysql = savage.jobs:Mysql')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.792] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('MyHost = savage.jobs:MyHost')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.793] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Snmpd = savage.jobs:Snmpd')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.793] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaConfiguration = savage.jobs:NovaConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.794] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Neutron = savage.jobs:Neutron')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.794] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LocalDisk = savage.jobs:LocalDisk')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.794] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Ceph = savage.jobs:Ceph')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.795] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NetTest = savage.jobs.network:NetTest')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.795] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('AuthStoreConfiguration = savage.jobs.auth:AuthStoreConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.796] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('ShutdownLevels = savage.services.fred:ShutdownLevels')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.796] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LocalDiskConfiguration = savage.jobs:LocalDiskConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.796] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SnmpdConfiguration = savage.jobs:SnmpdConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.797] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('GlanceConfiguration = savage.jobs:GlanceConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.797] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LibvirtSave = savage.jobs:LibvirtSave')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.797] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Memcache = savage.jobs:Memcache')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.798] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SSLConfiguration = savage.jobs:SSLConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.798] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('RabbitConfiguration = savage.jobs:RabbitConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.799] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Rabbit = savage.jobs:Rabbit')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.799] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Apache = savage.jobs:Apache')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.800] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('RadosgwLoadBalancer = savage.jobs:RadosgwLoadBalancer')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.800] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('KeystoneFinalSetup = savage.jobs:KeystoneFinalSetup')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.800] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('FirewallConfiguration = savage.jobs.network:FirewallConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.801] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NeutronConfiguration = savage.jobs:NeutronConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.801] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('UpstreamServers = savage.jobs.network:UpstreamServers')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.802] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Debug = savage.jobs:Debug')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.802] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NoVNC = savage.jobs:NoVNC')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.802] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('DashboardLoadBalancer = savage.jobs:DashboardLoadBalancer')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.803] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('DashboardConfiguration = savage.jobs:DashboardConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.803] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('HostStoreConfiguration = savage.jobs:HostStoreConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.803] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaLoadBalancer = savage.jobs:NovaLoadBalancer')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.804] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Hosts = savage.jobs:Hosts')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.805] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Dashboard = savage.jobs:Dashboard')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.805] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('MysqlConfiguration = savage.jobs:MysqlConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.805] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NoVNCConfiguration = savage.jobs:NoVNCConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.806] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Cinder = savage.jobs:Cinder')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.806] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('OpenstackInternal = savage.jobs:OpenstackInternal')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.807] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BeforeNeutron = savage.jobs:BeforeNeutron')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.807] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NetworkConfiguration = savage.jobs.network:NetworkConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.807] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BootNodeMinimalConfiguration = savage.machines:BootNodeMinimalConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.808] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaNetworkFinalSetup = savage.jobs:NovaNetworkFinalSetup')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.808] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CephMounted = savage.jobs:CephMounted')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.808] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Arpwatch = savage.jobs.arpwatch:Arpwatch')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.809] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CephReplication = savage.jobs:CephReplication')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.809] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Firewall = savage.jobs.network:Firewall')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.810] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Zookeeper = savage.jobs:Zookeeper')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.810] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('GlanceLoadBalancer = savage.jobs:GlanceLoadBalancer')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.810] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('FredConfiguration = savage.services.fred:FredConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.811] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CinderLoadBalancer = savage.jobs:CinderLoadBalancer')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.811] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Cron = savage.jobs:Cron')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.811] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LibVirtd = savage.jobs:LibVirtd')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.812] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SSH = savage.jobs:SSH')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.812] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SupportConfiguration = savage.services.support:SupportConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.813] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CinderConfiguration = savage.jobs:CinderConfiguration')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.813] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Openstack = savage.jobs:Openstack')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.813] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Glance = savage.jobs:Glance')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.814] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Disk = savage.jobs:Disk')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.814] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('PublicApi = savage.jobs.api:PublicApi')
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.814] 1634/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('WriteUUID = savage.jobs:WriteUUID')
Jul 8 19:27:04 10.35.0.15 [2014-07-08 19:27:04.853] 8877/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.6']
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.850] 1634/MainThread savage.moxie.meta/DEBUG: requested enabled=frozenset(['group:YellinConfiguration'])
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.853] 1634/MainThread savage.moxie.meta/DEBUG: expanded enabled=set([<savage.jobs.auth.AuthStoreConfiguration(id=id:AuthStoreConfiguration)>, <savage.jobs.network.FirewallConfiguration(id=id:FirewallConfiguration)>, <savage.jobs.network.UpstreamServers(id=id:UpstreamServers)>, <savage.jobs.BasicConfiguration(id=id:BasicConfiguration)>, <savage.jobs.KeystoneConfiguration(id=id:KeystoneConfiguration)>, <savage.moxie.implicit.group:Basic(id=group:Basic)>, <savage.services.yellin.app.YellinConfiguration(id=group:YellinConfiguration)>, <savage.services.fred.ShutdownLevels(id=id:ShutdownLevels)>, <savage.jobs.DashboardConfiguration(id=id:DashboardConfiguration)>, <savage.moxie.implicit.group:FirewallConf(id=group:FirewallConf)>, <savage.moxie.implicit.group:NetworkConf(id=group:NetworkConf)>, <savage.moxie.implicit.group:SslConf(id=group:SslConf)>, <savage.moxie.implicit.group:KeystoneConf(id=group:KeystoneConf)>, <savage.jobs.MysqlConfiguration(id=id:MysqlConfiguration)>, <savage.jobs.network.NetworkConfiguration(id=id:NetworkConfiguration)>, <savage.jobs.BootNode(id=id:BootNode)>, <savage.jobs.SSLConfiguration(id=id:SSLConfiguration)>, <savage.moxie.implicit.group:BootNode(id=group:BootNode)>, <savage.moxie.implicit.group:MysqlConf(id=group:MysqlConf)>, <savage.jobs.CronConfiguration(id=id:CronConfiguration)>])
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.893] 1634/MainThread passlib.registry/DEBUG: registered 'md5_crypt' handler: <class 'passlib.handlers.md5_crypt.md5_crypt'>
Jul 8 19:27:04 10.35.0.2 [2014-07-08 19:27:04.895] 1634/MainThread passlib.registry/DEBUG: registered 'sha512_crypt' handler: <class 'passlib.handlers.sha2_crypt.sha512_crypt'>
Jul 8 19:27:05 10.35.0.15 nova-compute 2014-07-08 19:27:05.095 3328 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47217
Jul 8 19:27:05 10.35.0.15 nova-compute 2014-07-08 19:27:05.095 3328 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1861
Jul 8 19:27:05 10.35.0.15 nova-compute 2014-07-08 19:27:05.095 3328 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 3
Jul 8 19:27:05 10.35.0.15 nova-compute 2014-07-08 19:27:05.120 3328 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.15:node-10-35-0-15
Jul 8 19:27:05 10.35.0.13 nova-compute 2014-07-08 19:27:05.158 4705 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:27:05 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:27:05 10.35.0.15 nova-compute 2014-07-08 19:27:05.246 3328 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"0be7d105483e4156bbeca5f35323035b\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-834817e7-51b1-4120-b422-f1812ab79eb1\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"cbd408f7e05345cdace5f4204e68a13f\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:27:04.735211\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:27:05 10.35.0.15 nova-compute 2014-07-08 19:27:05.246 3328 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"0be7d105483e4156bbeca5f35323035b\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-834817e7-51b1-4120-b422-f1812ab79eb1\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"cbd408f7e05345cdace5f4204e68a13f\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:27:04.735211\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:27:05 10.35.0.15 nova-network 2014-07-08 19:27:05.248 3465 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"0be7d105483e4156bbeca5f35323035b\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-834817e7-51b1-4120-b422-f1812ab79eb1\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"cbd408f7e05345cdace5f4204e68a13f\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:27:04.735211\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 28, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7eff2d221290>}
Jul 8 19:27:05 10.35.0.15 nova-network 2014-07-08 19:27:05.249 3465 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"0be7d105483e4156bbeca5f35323035b\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-834817e7-51b1-4120-b422-f1812ab79eb1\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"cbd408f7e05345cdace5f4204e68a13f\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:27:04.735211\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}'
Jul 8 19:27:05 10.35.0.14 [2014-07-08 19:27:05.260] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3034
Jul 8 19:27:05 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:27:05 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:27:05 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:27:05 10.35.0.15 nova-network 2014-07-08 19:27:05.378 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"2c0becf261db46cfa2285562724917b4\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"0be7d105483e4156bbeca5f35323035b\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:27:05 10.35.0.15 nova-network 2014-07-08 19:27:05.378 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"2c0becf261db46cfa2285562724917b4\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"0be7d105483e4156bbeca5f35323035b\\"}", "oslo.version": "2.0"}', routing_key=u'reply_af4fb5adaca445a3b6d555c358806a4e'
Jul 8 19:27:05 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:27:05 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:27:05 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:27:05 10.35.0.15 nova-network 2014-07-08 19:27:05.379 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"8f5d5c7f02f746a6bf61f29e0d2bebb6\\", \\"failure\\": null, \\"_msg_id\\": \\"0be7d105483e4156bbeca5f35323035b\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:27:05 10.35.0.15 nova-network 2014-07-08 19:27:05.380 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"8f5d5c7f02f746a6bf61f29e0d2bebb6\\", \\"failure\\": null, \\"_msg_id\\": \\"0be7d105483e4156bbeca5f35323035b\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_af4fb5adaca445a3b6d555c358806a4e'
Jul 8 19:27:05 10.35.0.15 nova-compute 2014-07-08 19:27:05.380 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"2c0becf261db46cfa2285562724917b4\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"0be7d105483e4156bbeca5f35323035b\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_af4fb5adaca445a3b6d555c358806a4e', 'delivery_tag': 79, 'exchange': u'reply_af4fb5adaca445a3b6d555c358806a4e'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030b1ff10>}
Jul 8 19:27:05 10.35.0.15 nova-compute 2014-07-08 19:27:05.380 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"2c0becf261db46cfa2285562724917b4\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"0be7d105483e4156bbeca5f35323035b\\"}", "oslo.version": "2.0"}'
Jul 8 19:27:05 10.35.0.15 nova-compute 2014-07-08 19:27:05.381 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"8f5d5c7f02f746a6bf61f29e0d2bebb6\\", \\"failure\\": null, \\"_msg_id\\": \\"0be7d105483e4156bbeca5f35323035b\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_af4fb5adaca445a3b6d555c358806a4e', 'delivery_tag': 80, 'exchange': u'reply_af4fb5adaca445a3b6d555c358806a4e'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030b1ff10>}
Jul 8 19:27:05 10.35.0.15 nova-compute 2014-07-08 19:27:05.381 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"8f5d5c7f02f746a6bf61f29e0d2bebb6\\", \\"failure\\": null, \\"_msg_id\\": \\"0be7d105483e4156bbeca5f35323035b\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:27:05 10.35.0.13 nova-compute 2014-07-08 19:27:05.530 4705 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 46705
Jul 8 19:27:05 10.35.0.13 nova-compute 2014-07-08 19:27:05.530 4705 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1860
Jul 8 19:27:05 10.35.0.13 nova-compute 2014-07-08 19:27:05.530 4705 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 2
Jul 8 19:27:05 10.35.0.13 nova-compute 2014-07-08 19:27:05.560 4705 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.13:node-10-35-0-13
Jul 8 19:27:05 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:05 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:05 10.35.0.14 nova-api 2014-07-08 19:27:05.653 6472 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:05 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:27:05 +0000] "GET /v2.0/tokens/e4eb5103b0c643088f2b13c7e6ba698e HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:05 10.35.0.13 nova-compute 2014-07-08 19:27:05.672 4705 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"02aff61aafc74cc1ac86e46c5f60a82b\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-664369ae-3774-4050-8fbe-cb546f9091db\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"355a783d8ba54b389817e722d7a66047\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:27:05.157166\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:27:05 10.35.0.13 nova-compute 2014-07-08 19:27:05.673 4705 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"02aff61aafc74cc1ac86e46c5f60a82b\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-664369ae-3774-4050-8fbe-cb546f9091db\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"355a783d8ba54b389817e722d7a66047\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:27:05.157166\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:27:05 10.35.0.13 nova-network 2014-07-08 19:27:05.674 4876 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"02aff61aafc74cc1ac86e46c5f60a82b\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-664369ae-3774-4050-8fbe-cb546f9091db\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"355a783d8ba54b389817e722d7a66047\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:27:05.157166\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 29, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fdeb6472390>}
Jul 8 19:27:05 10.35.0.13 nova-network 2014-07-08 19:27:05.675 4876 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"02aff61aafc74cc1ac86e46c5f60a82b\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-664369ae-3774-4050-8fbe-cb546f9091db\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"355a783d8ba54b389817e722d7a66047\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:27:05.157166\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}'
Jul 8 19:27:05 10.35.0.15 [2014-07-08 19:27:05.747] 8900/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.4']
Jul 8 19:27:05 10.35.0.13 [2014-07-08 19:27:05.766] 11310/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.2']
Jul 8 19:27:05 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:05 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:05 10.35.0.14 nova-api 2014-07-08 19:27:05.769 6472 INFO nova.osapi_compute.wsgi.server [req-0a59ef34-65ad-4ec5-ba1c-3c17e422b209 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1187689
Jul 8 19:27:05 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:27:05 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:27:05 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:27:05 10.35.0.13 nova-network 2014-07-08 19:27:05.823 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"66e4c55c24d64375bf09526a24503cc3\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"02aff61aafc74cc1ac86e46c5f60a82b\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:27:05 10.35.0.13 nova-network 2014-07-08 19:27:05.823 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"66e4c55c24d64375bf09526a24503cc3\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"02aff61aafc74cc1ac86e46c5f60a82b\\"}", "oslo.version": "2.0"}', routing_key=u'reply_5a152b20506e4303bfaf3007cbe974c3'
Jul 8 19:27:05 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:27:05 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:27:05 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:27:05 10.35.0.13 nova-network 2014-07-08 19:27:05.824 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"6e813cb97bbd4cd693d9b941ece08344\\", \\"failure\\": null, \\"_msg_id\\": \\"02aff61aafc74cc1ac86e46c5f60a82b\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:27:05 10.35.0.13 nova-network 2014-07-08 19:27:05.825 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"6e813cb97bbd4cd693d9b941ece08344\\", \\"failure\\": null, \\"_msg_id\\": \\"02aff61aafc74cc1ac86e46c5f60a82b\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_5a152b20506e4303bfaf3007cbe974c3'
Jul 8 19:27:05 10.35.0.13 nova-compute 2014-07-08 19:27:05.825 4705 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"66e4c55c24d64375bf09526a24503cc3\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"02aff61aafc74cc1ac86e46c5f60a82b\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_5a152b20506e4303bfaf3007cbe974c3', 'delivery_tag': 79, 'exchange': u'reply_5a152b20506e4303bfaf3007cbe974c3'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fde2d448390>}
Jul 8 19:27:05 10.35.0.13 nova-compute 2014-07-08 19:27:05.825 4705 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"66e4c55c24d64375bf09526a24503cc3\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"02aff61aafc74cc1ac86e46c5f60a82b\\"}", "oslo.version": "2.0"}'
Jul 8 19:27:05 10.35.0.13 nova-compute 2014-07-08 19:27:05.826 4705 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"6e813cb97bbd4cd693d9b941ece08344\\", \\"failure\\": null, \\"_msg_id\\": \\"02aff61aafc74cc1ac86e46c5f60a82b\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_5a152b20506e4303bfaf3007cbe974c3', 'delivery_tag': 80, 'exchange': u'reply_5a152b20506e4303bfaf3007cbe974c3'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fde2d448390>}
Jul 8 19:27:05 10.35.0.13 nova-compute 2014-07-08 19:27:05.826 4705 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"6e813cb97bbd4cd693d9b941ece08344\\", \\"failure\\": null, \\"_msg_id\\": \\"02aff61aafc74cc1ac86e46c5f60a82b\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:27:06 10.35.0.13 [2014-07-08 19:27:06.437] 11330/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'host', '-c', '1', '-U', '10.35.0.3']
Jul 8 19:27:06 10.35.0.15 [2014-07-08 19:27:06.495] 9060/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.5']
Jul 8 19:27:08 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:08 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:08 10.35.0.14 nova-api 2014-07-08 19:27:08.548 6473 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:08 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:27:08 +0000] "GET /v2.0/tokens/2cb89e7e47b6404e9f2c520a2e1de0be HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:08 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:08 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:08 10.35.0.14 nova-api 2014-07-08 19:27:08.670 6473 INFO nova.osapi_compute.wsgi.server [req-33990752-c872-48a2-9eaf-ac80b6334d6d 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1242630
Jul 8 19:27:08 10.35.0.13 [2014-07-08 19:27:08.855] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:27:08 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:27:08 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:27:08 10.35.0.14 nova-api 2014-07-08 19:27:08.963 6474 INFO nova.osapi_compute.wsgi.server [req-0e3c983a-d2b6-4904-853e-50b3138de012 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1038709
Jul 8 19:27:08 10.35.0.13 [2014-07-08 19:27:08.967] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:27:10 10.35.0.15 [2014-07-08 19:27:10.171] 9162/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'host', '-c', '1', '-U', '10.35.0.4']
Jul 8 19:27:11 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.129 from 00:25:90:8b:3f:a7 via management
Jul 8 19:27:11 10.35.0.2 dhcpd: DHCPACK on 10.35.1.129 to 00:25:90:8b:3f:a7 via management
Jul 8 19:27:11 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:11 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:11 10.35.0.13 nova-api 2014-07-08 19:27:11.441 4844 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:11 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:27:11 +0000] "GET /v2.0/tokens/b33d3bb76d7749108e6d461041f01654 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:11 10.35.0.15 [2014-07-08 19:27:11.494] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3034
Jul 8 19:27:11 10.35.0.13 [2014-07-08 19:27:11.492] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3080
Jul 8 19:27:11 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:11 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:11 10.35.0.13 nova-api 2014-07-08 19:27:11.571 4844 INFO nova.osapi_compute.wsgi.server [req-bd3c44b0-a6e3-427b-bca5-65674b81d2dd 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1313422
Jul 8 19:27:11 10.35.0.13 [2014-07-08 19:27:11.769] 10213/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.7']
Jul 8 19:27:12 10.35.0.15 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._refresh_host
Jul 8 19:27:12 10.35.0.15 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._clean
Jul 8 19:27:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Starting to clean symlinks in /dev/nova
Jul 8 19:27:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/3e65c658-731c-4a56-8c17-1cfac00bf754_disk
Jul 8 19:27:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/libvirt_save
Jul 8 19:27:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/novadisk
Jul 8 19:27:12 10.35.0.15 cobalt-compute: loopingcall DEBUG Dynamic looping call sleeping for 60.00 seconds
Jul 8 19:27:13 10.35.0.13 [2014-07-08 19:27:13.014] 10232/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.8']
Jul 8 19:27:13 10.35.0.13 [2014-07-08 19:27:13.094] 10245/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.3']
Jul 8 19:27:14 10.35.0.15 nova-compute 2014-07-08 19:27:14.333 3328 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:27:14 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:27:14 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:14 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:14 10.35.0.13 nova-api 2014-07-08 19:27:14.372 4844 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:14 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:27:14 +0000] "GET /v2.0/tokens/7eaf91b6f8bc4bbb9e98044c7f89ed3c HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:14 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:14 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:14 10.35.0.13 nova-api 2014-07-08 19:27:14.486 4844 INFO nova.osapi_compute.wsgi.server [req-8fe1758c-d45d-4beb-a856-3a40185a5767 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1158080
Jul 8 19:27:14 10.35.0.15 nova-compute 2014-07-08 19:27:14.693 3328 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47217
Jul 8 19:27:14 10.35.0.15 nova-compute 2014-07-08 19:27:14.693 3328 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1861
Jul 8 19:27:14 10.35.0.15 nova-compute 2014-07-08 19:27:14.693 3328 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 3
Jul 8 19:27:14 10.35.0.15 nova-compute 2014-07-08 19:27:14.718 3328 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.15:node-10-35-0-15
Jul 8 19:27:14 10.35.0.15 nova-compute 2014-07-08 19:27:14.771 3328 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): checking
Jul 8 19:27:14 10.35.0.15 nova-compute 2014-07-08 19:27:14.771 3328 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): in use: on this node 1 local, 0 on other nodes sharing this instance storage
Jul 8 19:27:14 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf chown 43 /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:27:14 10.35.0.15 nova-compute 2014-07-08 19:27:14.957 3328 INFO nova.virt.libvirt.imagecache [-] Active base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:27:15 10.35.0.13 nova-compute 2014-07-08 19:27:15.602 4705 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:27:15 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:27:15 10.35.0.13 nova-compute 2014-07-08 19:27:15.984 4705 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 46705
Jul 8 19:27:15 10.35.0.13 nova-compute 2014-07-08 19:27:15.984 4705 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1860
Jul 8 19:27:15 10.35.0.13 nova-compute 2014-07-08 19:27:15.984 4705 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 2
Jul 8 19:27:16 10.35.0.13 nova-compute 2014-07-08 19:27:16.019 4705 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.13:node-10-35-0-13
Jul 8 19:27:16 10.35.0.13 nova-compute 2014-07-08 19:27:16.080 4705 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): checking
Jul 8 19:27:16 10.35.0.13 nova-compute 2014-07-08 19:27:16.080 4705 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): in use: on this node 2 local, 0 on other nodes sharing this instance storage
Jul 8 19:27:16 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf chown 43 /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:27:16 10.35.0.13 nova-compute 2014-07-08 19:27:16.266 4705 INFO nova.virt.libvirt.imagecache [-] Active base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:27:17 10.35.0.14 ceph-osd: 2014-07-08 19:27:17.320264 7f54d8f57700 0 -- 10.35.0.14:6805/2095 >> 10.35.0.13:6801/1949 pipe(0x7f54f982cf00 sd=90 :6805 s=0 pgs=0 cs=0 l=0 c=0x7f54f9303340).accept connect_seq 2 vs existing 1 state standby
Jul 8 19:27:17 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:17 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:17 10.35.0.14 nova-api 2014-07-08 19:27:17.382 6472 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:17 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:27:17 +0000] "GET /v2.0/tokens/f104dfe29b65454aa8ca970014833b87 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:17 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:17 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:17 10.35.0.14 nova-api 2014-07-08 19:27:17.498 6472 INFO nova.osapi_compute.wsgi.server [req-b9db69c1-bb49-42db-981f-69d7e1bd2663 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1186872
Jul 8 19:27:18 10.35.0.15 cinder-volume 2014-07-08 19:27:18.144 9068 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-6114273d-3cf8-4224-a217-d65763192d58\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:27:18.141961\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"d593e947e3ea4eb38a5d86c9eb1a2409\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:27:18 10.35.0.15 cinder-volume 2014-07-08 19:27:18.144 9068 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-6114273d-3cf8-4224-a217-d65763192d58\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:27:18.141961\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"d593e947e3ea4eb38a5d86c9eb1a2409\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:27:18 10.35.0.15 cinder-scheduler 2014-07-08 19:27:18.146 8863 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-6114273d-3cf8-4224-a217-d65763192d58\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:27:18.141961\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"d593e947e3ea4eb38a5d86c9eb1a2409\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'3', 'redelivered': False, 'routing_key': u'', 'delivery_tag': 82, 'exchange': u'cinder-scheduler_fanout'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fa03d158690>}
Jul 8 19:27:18 10.35.0.15 cinder-volume 2014-07-08 19:27:18.147 9068 INFO cinder.volume.manager [-] Updating volume status
Jul 8 19:27:18 10.35.0.15 cinder-scheduler 2014-07-08 19:27:18.149 8863 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-6114273d-3cf8-4224-a217-d65763192d58\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:27:18.141961\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"d593e947e3ea4eb38a5d86c9eb1a2409\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}'
Jul 8 19:27:18 10.35.0.15 cinder-scheduler 2014-07-08 19:27:18.150 8863 WARNING cinder.context [-] Arguments dropped when creating context: {'user': None, 'tenant': None, 'user_identity': u'- - - - -'}
Jul 8 19:27:18 10.35.0.15 cinder-scheduler 2014-07-08 19:27:18.151 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"645154283e464073814c026239d979ee\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:27:18 10.35.0.15 cinder-scheduler 2014-07-08 19:27:18.151 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"645154283e464073814c026239d979ee\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:27:18 10.35.0.15 cinder-scheduler 2014-07-08 19:27:18.152 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"e3ec718c0a3a46fe807f95782deb4d87\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:27:18 10.35.0.15 cinder-scheduler 2014-07-08 19:27:18.153 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"e3ec718c0a3a46fe807f95782deb4d87\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:27:18 10.35.0.13 [2014-07-08 19:27:18.974] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:27:19 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:27:18 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:27:19 10.35.0.14 nova-api 2014-07-08 19:27:19.078 6474 INFO nova.osapi_compute.wsgi.server [req-8a0cd8ae-2ff1-4a90-8f6b-051a30c6dba8 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1003549
Jul 8 19:27:19 10.35.0.13 [2014-07-08 19:27:19.082] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:27:20 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:20 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:20 10.35.0.14 nova-api 2014-07-08 19:27:20.261 6474 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:20 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py:435: Warning: Incorrect datetime value: '2014-07-08 19:17:20+00:00' for column 'updated_at' at row 1
Jul 8 19:27:20 10.35.0.14 nova-network.log: cursor.execute(statement, parameters)
Jul 8 19:27:20 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:27:20 +0000] "GET /v2.0/tokens/e3557d20b9d04203a78d7b14ccb47924 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:20 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:20 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:20 10.35.0.14 nova-api 2014-07-08 19:27:20.393 6474 INFO nova.osapi_compute.wsgi.server [req-ea970547-b4a3-42a9-8109-f077e12eb53a 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1345069
Jul 8 19:27:20 10.35.0.14 [2014-07-08 19:27:20.827] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:27:23 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:23 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:23 10.35.0.15 nova-api 2014-07-08 19:27:23.156 3442 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:23 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:27:23 +0000] "GET /v2.0/tokens/43a67346925c4d8a943c05a38298eefd HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:23 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:23 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:23 10.35.0.15 nova-api 2014-07-08 19:27:23.270 3442 INFO nova.osapi_compute.wsgi.server [req-933ed2be-f077-4519-8376-b0e8c0302d9b 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1169021
Jul 8 19:27:23 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.130 from 00:25:90:2f:2d:56 via management
Jul 8 19:27:23 10.35.0.2 dhcpd: DHCPACK on 10.35.1.130 to 00:25:90:2f:2d:56 via management
Jul 8 19:27:24 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py:435: Warning: Incorrect datetime value: '2014-07-08 19:17:24+00:00' for column 'updated_at' at row 1
Jul 8 19:27:24 10.35.0.15 nova-network.log: cursor.execute(statement, parameters)
Jul 8 19:27:26 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:25 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:26 10.35.0.15 nova-api 2014-07-08 19:27:26.050 3442 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:26 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:27:26 +0000] "GET /v2.0/tokens/d9d87e802602437e8df9fb126f485f90 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:26 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:26 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:26 10.35.0.15 nova-api 2014-07-08 19:27:26.166 3442 INFO nova.osapi_compute.wsgi.server [req-0f7dc623-a6a6-43f3-8671-8891819f10c6 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1194630
Jul 8 19:27:27 10.35.0.13 [2014-07-08 19:27:27.057] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3080
Jul 8 19:27:27 10.35.0.15 [2014-07-08 19:27:27.060] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:27:28 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.131 from 00:25:90:2f:2d:07 via management
Jul 8 19:27:28 10.35.0.2 dhcpd: DHCPACK on 10.35.1.131 to 00:25:90:2f:2d:07 via management
Jul 8 19:27:28 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:28 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:28 10.35.0.13 nova-api 2014-07-08 19:27:28.926 4844 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:28 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:27:28 +0000] "GET /v2.0/tokens/caf31cfe201542df8c377b85d7e03df7 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:29 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:28 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:29 10.35.0.13 nova-api 2014-07-08 19:27:29.045 4844 INFO nova.osapi_compute.wsgi.server [req-55cd7af3-3b6c-4c50-a639-8d73f451b261 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1216731
Jul 8 19:27:29 10.35.0.13 [2014-07-08 19:27:29.090] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:27:29 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:27:29 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:27:29 10.35.0.13 nova-api 2014-07-08 19:27:29.197 4840 INFO nova.osapi_compute.wsgi.server [req-3168c441-cb08-4302-aacb-56c2704a9b86 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1035180
Jul 8 19:27:29 10.35.0.13 [2014-07-08 19:27:29.199] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:27:31 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:31 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:31 10.35.0.15 nova-api 2014-07-08 19:27:31.901 3441 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:31 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:27:31 +0000] "GET /v2.0/tokens/e54f71bf8e5e4180ac62dba8e28dbd7d HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:32 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:31 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:32 10.35.0.15 nova-api 2014-07-08 19:27:32.018 3441 INFO nova.osapi_compute.wsgi.server [req-b56ac3f1-2326-41d1-a927-94de674357b8 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1194630
Jul 8 19:27:34 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:34 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:34 10.35.0.13 nova-api 2014-07-08 19:27:34.775 4840 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:34 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:27:34 +0000] "GET /v2.0/tokens/7a8d7b38fa2647e68f58d7f8e1c7be49 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:34 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:34 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:34 10.35.0.13 nova-api 2014-07-08 19:27:34.894 4840 INFO nova.osapi_compute.wsgi.server [req-91760570-c65e-4fc4-aefb-59a35b22a64a 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1211050
Jul 8 19:27:36 10.35.0.14 [2014-07-08 19:27:36.393] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3034
Jul 8 19:27:37 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:37 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:37 10.35.0.14 nova-api 2014-07-08 19:27:37.652 6476 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:37 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:27:37 +0000] "GET /v2.0/tokens/e16c82403488492abeab377a47fd7e60 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:37 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:37 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:37 10.35.0.14 nova-api 2014-07-08 19:27:37.775 6476 INFO nova.osapi_compute.wsgi.server [req-086271a9-24e2-4b2d-b8f2-22387dd20ac8 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1248798
Jul 8 19:27:38 10.35.0.13 cobalt-compute: impl_rabbit DEBUG Timed out waiting for RPC response: timed out
Jul 8 19:27:38 10.35.0.13 cobalt-compute: periodic_task ERROR Error during CobaltManager._refresh_host: Timed out waiting for a reply to message ID 3303d69cf18e4bcb8ee69b3b8a4496af
Traceback (most recent call last):
File "/usr/lib64/python2.7/site-packages/nova/openstack/common/periodic_task.py", line 182, in run_periodic_tasks
task(self, context)
File "/usr/lib64/python2.7/site-packages/cobalt/nova/extension/manager.py", line 315, in _refresh_host
self.network_api.setup_networks_on_host(context, instance)
File "/usr/lib64/python2.7/site-packages/nova/network/api.py", line 95, in wrapped
return func(self, context, *args, **kwargs)
File "/usr/lib64/python2.7/site-packages/nova/network/api.py", line 509, in setup_networks_on_host
self.network_rpcapi.setup_networks_on_host(context, **args)
File "/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py", line 270, in setup_networks_on_host
teardown=teardown)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py", line 361, in call
return self.prepare().call(ctxt, method, **kwargs)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py", line 150, in call
wait_for_reply=True, timeout=timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py", line 90, in _send
timeout=timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 412, in send
return self._send(target, ctxt, message, wait_for_reply, timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 403, in _send
result = self._waiter.wait(msg_id, timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 267, in wait
reply, ending = self._poll_connection(msg_id, timeout)
File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 217, in _poll_connection
% msg_id)
MessagingTimeout: Timed out waiting for a reply to message ID 3303d69cf18e4bcb8ee69b3b8a4496af
Jul 8 19:27:38 10.35.0.13 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._clean
Jul 8 19:27:38 10.35.0.13 cobalt-compute: vmsconn DEBUG Starting to clean symlinks in /dev/nova
Jul 8 19:27:38 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/ce77d052-f021-44a2-b5f3-5908ec04fcc7_disk
Jul 8 19:27:38 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/5ee78913-e655-4cb1-a92c-500639c819ad_disk
Jul 8 19:27:38 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/libvirt_save
Jul 8 19:27:38 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/novadisk
Jul 8 19:27:38 10.35.0.13 cobalt-compute: loopingcall DEBUG Dynamic looping call sleeping for 60.00 seconds
Jul 8 19:27:39 10.35.0.13 [2014-07-08 19:27:39.205] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:27:39 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:27:39 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:27:39 10.35.0.15 nova-api 2014-07-08 19:27:39.312 3440 INFO nova.osapi_compute.wsgi.server [req-14a862c8-73e4-44c3-bfdd-1ffbe9eeca8a 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1010289
Jul 8 19:27:39 10.35.0.13 [2014-07-08 19:27:39.312] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:27:39 10.35.0.14 nova-network.log: Traceback (most recent call last):
Jul 8 19:27:39 10.35.0.14 nova-network.log: File "/usr/lib64/python2.7/site-packages/eventlet/hubs/hub.py", line 346, in fire_timers
Jul 8 19:27:39 10.35.0.14 nova-network.log: timer()
Jul 8 19:27:39 10.35.0.14 nova-network.log: File "/usr/lib64/python2.7/site-packages/eventlet/hubs/timer.py", line 56, in __call__
Jul 8 19:27:39 10.35.0.14 nova-network.log: cb(*args, **kw)
Jul 8 19:27:39 10.35.0.14 nova-network.log: File "/usr/lib64/python2.7/site-packages/eventlet/greenthread.py", line 194, in main
Jul 8 19:27:39 10.35.0.14 nova-network.log: result = function(*args, **kwargs)
Jul 8 19:27:39 10.35.0.14 nova-network.log: File "/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py", line 281, in rpc_setup_network_on_host
Jul 8 19:27:39 10.35.0.14 nova-network.log: network_id=network_id, teardown=teardown)
Jul 8 19:27:39 10.35.0.14 nova-network.log: File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py", line 150, in call
Jul 8 19:27:39 10.35.0.14 nova-network.log: wait_for_reply=True, timeout=timeout)
Jul 8 19:27:39 10.35.0.14 nova-network.log: File "/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py", line 90, in _send
Jul 8 19:27:39 10.35.0.14 nova-network.log: timeout=timeout)
Jul 8 19:27:39 10.35.0.14 nova-network.log: File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 412, in send
Jul 8 19:27:39 10.35.0.14 nova-network.log: return self._send(target, ctxt, message, wait_for_reply, timeout)
Jul 8 19:27:39 10.35.0.14 nova-network.log: File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 403, in _send
Jul 8 19:27:39 10.35.0.14 nova-network.log: result = self._waiter.wait(msg_id, timeout)
Jul 8 19:27:39 10.35.0.14 nova-network.log: File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 267, in wait
Jul 8 19:27:39 10.35.0.14 nova-network.log: reply, ending = self._poll_connection(msg_id, timeout)
Jul 8 19:27:39 10.35.0.14 nova-network.log: File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 217, in _poll_connection
Jul 8 19:27:39 10.35.0.14 nova-network.log: % msg_id)
Jul 8 19:27:39 10.35.0.14 nova-network.log: MessagingTimeout: Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3
Jul 8 19:27:39 10.35.0.14 nova-network 2014-07-08 19:27:39.831 6514 ERROR oslo.messaging.rpc.dispatcher [-] Exception during message handling: Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher Traceback (most recent call last):
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 133, in _dispatch_and_reply
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher incoming.message))
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 176, in _dispatch
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher return self._do_dispatch(endpoint, method, ctxt, args)
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 122, in _do_dispatch
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher result = getattr(endpoint, method)(ctxt, **new_args)
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/nova/network/manager.py", line 1323, in setup_networks_on_host
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher gt.wait()
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/eventlet/greenthread.py", line 168, in wait
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher return self._exit_event.wait()
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/eventlet/event.py", line 116, in wait
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher return hubs.get_hub().switch()
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/eventlet/hubs/hub.py", line 187, in switch
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher return self.greenlet.switch()
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/eventlet/greenthread.py", line 194, in main
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher result = function(*args, **kwargs)
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py", line 281, in rpc_setup_network_on_host
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher network_id=network_id, teardown=teardown)
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py", line 150, in call
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher wait_for_reply=True, timeout=timeout)
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py", line 90, in _send
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher timeout=timeout)
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 412, in send
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher return self._send(target, ctxt, message, wait_for_reply, timeout)
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 403, in _send
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher result = self._waiter.wait(msg_id, timeout)
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 267, in wait
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher reply, ending = self._poll_connection(msg_id, timeout)
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 217, in _poll_connection
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher % msg_id)
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher MessagingTimeout: Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3
2014-07-08 19:27:39.831 6514 TRACE oslo.messaging.rpc.dispatcher
Jul 8 19:27:39 10.35.0.14 nova-network 2014-07-08 19:27:39.832 6514 ERROR oslo.messaging._drivers.common [-] Returning exception Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3 to caller
Jul 8 19:27:39 10.35.0.14 nova-network 2014-07-08 19:27:39.833 6514 ERROR oslo.messaging._drivers.common [-] ['Traceback (most recent call last):\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 133, in _dispatch_and_reply\n incoming.message))\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 176, in _dispatch\n return self._do_dispatch(endpoint, method, ctxt, args)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py", line 122, in _do_dispatch\n result = getattr(endpoint, method)(ctxt, **new_args)\n', ' File "/usr/lib64/python2.7/site-packages/nova/network/manager.py", line 1323, in setup_networks_on_host\n gt.wait()\n', ' File "/usr/lib64/python2.7/site-packages/eventlet/greenthread.py", line 168, in wait\n return self._exit_event.wait()\n', ' File "/usr/lib64/python2.7/site-packages/eventlet/event.py", line 116, in wait\n return hubs.get_hub().switch()\n', ' File "/usr/lib64/python2.7/site-packages/eventlet/hubs/hub.py", line 187, in switch\n return self.greenlet.switch()\n', ' File "/usr/lib64/python2.7/site-packages/eventlet/greenthread.py", line 194, in main\n result = function(*args, **kwargs)\n', ' File "/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py", line 281, in rpc_setup_network_on_host\n network_id=network_id, teardown=teardown)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py", line 150, in call\n wait_for_reply=True, timeout=timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py", line 90, in _send\n timeout=timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 412, in send\n return self._send(target, ctxt, message, wait_for_reply, timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 403, in _send\n result = self._waiter.wait(msg_id, timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 267, in wait\n reply, ending = self._poll_connection(msg_id, timeout)\n', ' File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py", line 217, in _poll_connection\n % msg_id)\n', 'MessagingTimeout: Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3\n']
Jul 8 19:27:39 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:27:39 10.35.0.14 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:27:39 10.35.0.14 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:27:39 10.35.0.14 nova-network 2014-07-08 19:27:39.834 6514 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"2cc5455f05e0486088ee12d57bfc15f6\\", \\"failure\\": \\"{\\\\\\"args\\\\\\": [\\\\\\"Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3\\\\\\"], \\\\\\"module\\\\\\": \\\\\\"oslo.messaging.exceptions\\\\\\", \\\\\\"kwargs\\\\\\": {}, \\\\\\"message\\\\\\": \\\\\\"Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3\\\\\\", \\\\\\"tb\\\\\\": [\\\\\\"Traceback (most recent call last):\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 133, in _dispatch_and_reply\\\\\\\\n incoming.message))\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 176, in _dispatch\\\\\\\\n return self._do_dispatch(endpoint, method, ctxt, args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 122, in _do_dispatch\\\\\\\\n result = getattr(endpoint, method)(ctxt, **new_args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/manager.py\\\\\\\\\\\\\\", line 1323, in setup_networks_on_host\\\\\\\\n gt.wait()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/greenthread.py\\\\\\\\\\\\\\", line 168, in wait\\\\\\\\n return self._exit_event.wait()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/event.py\\\\\\\\\\\\\\", line 116, in wait\\\\\\\\n return hubs.get_hub().switch()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/hubs/hub.py\\\\\\\\\\\\\\", line 187, in switch\\\\\\\\n return self.greenlet.switch()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/greenthread.py\\\\\\\\\\\\\\", line 194, in main\\\\\\\\n result = function(*args, **kwargs)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py\\\\\\\\\\\\\\", line 281, in rpc_setup_network_on_host\\\\\\\\n network_id=network_id, teardown=teardown)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py\\\\\\\\\\\\\\", line 150, in call\\\\\\\\n wait_for_reply=True, timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py\\\\\\\\\\\\\\", line 90, in _send\\\\\\\\n timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 412, in send\\\\\\\\n return self._send(target, ctxt, message, wait_for_reply, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 403, in _send\\\\\\\\n result = self._waiter.wait(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 267, in wait\\\\\\\\n reply, ending = self._poll_connection(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 217, in _poll_connection\\\\\\\\n % msg_id)\\\\\\\\n\\\\\\", \\\\\\"MessagingTimeout: Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3\\\\\\\\n\\\\\\"], \\\\\\"class\\\\\\": \\\\\\"MessagingTimeout\\\\\\"}\\", \\"result\\": null, \\"_msg_id\\": \\"3303d69cf18e4bcb8ee69b3b8a4496af\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:27:39 10.35.0.14 nova-network 2014-07-08 19:27:39.834 6514 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"2cc5455f05e0486088ee12d57bfc15f6\\", \\"failure\\": \\"{\\\\\\"args\\\\\\": [\\\\\\"Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3\\\\\\"], \\\\\\"module\\\\\\": \\\\\\"oslo.messaging.exceptions\\\\\\", \\\\\\"kwargs\\\\\\": {}, \\\\\\"message\\\\\\": \\\\\\"Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3\\\\\\", \\\\\\"tb\\\\\\": [\\\\\\"Traceback (most recent call last):\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 133, in _dispatch_and_reply\\\\\\\\n incoming.message))\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 176, in _dispatch\\\\\\\\n return self._do_dispatch(endpoint, method, ctxt, args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 122, in _do_dispatch\\\\\\\\n result = getattr(endpoint, method)(ctxt, **new_args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/manager.py\\\\\\\\\\\\\\", line 1323, in setup_networks_on_host\\\\\\\\n gt.wait()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/greenthread.py\\\\\\\\\\\\\\", line 168, in wait\\\\\\\\n return self._exit_event.wait()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/event.py\\\\\\\\\\\\\\", line 116, in wait\\\\\\\\n return hubs.get_hub().switch()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/hubs/hub.py\\\\\\\\\\\\\\", line 187, in switch\\\\\\\\n return self.greenlet.switch()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/greenthread.py\\\\\\\\\\\\\\", line 194, in main\\\\\\\\n result = function(*args, **kwargs)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py\\\\\\\\\\\\\\", line 281, in rpc_setup_network_on_host\\\\\\\\n network_id=network_id, teardown=teardown)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py\\\\\\\\\\\\\\", line 150, in call\\\\\\\\n wait_for_reply=True, timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py\\\\\\\\\\\\\\", line 90, in _send\\\\\\\\n timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 412, in send\\\\\\\\n return self._send(target, ctxt, message, wait_for_reply, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 403, in _send\\\\\\\\n result = self._waiter.wait(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 267, in wait\\\\\\\\n reply, ending = self._poll_connection(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 217, in _poll_connection\\\\\\\\n % msg_id)\\\\\\\\n\\\\\\", \\\\\\"MessagingTimeout: Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3\\\\\\\\n\\\\\\"], \\\\\\"class\\\\\\": \\\\\\"MessagingTimeout\\\\\\"}\\", \\"result\\": null, \\"_msg_id\\": \\"3303d69cf18e4bcb8ee69b3b8a4496af\\"}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:27:39 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:27:39 10.35.0.14 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:27:39 10.35.0.14 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:27:39 10.35.0.14 nova-network 2014-07-08 19:27:39.836 6514 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"b3f5fbea2282418ea920f244badad2bf\\", \\"failure\\": null, \\"_msg_id\\": \\"3303d69cf18e4bcb8ee69b3b8a4496af\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:27:39 10.35.0.14 nova-network 2014-07-08 19:27:39.836 6514 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"b3f5fbea2282418ea920f244badad2bf\\", \\"failure\\": null, \\"_msg_id\\": \\"3303d69cf18e4bcb8ee69b3b8a4496af\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:27:40 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:40 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:40 10.35.0.15 nova-api 2014-07-08 19:27:40.545 3441 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:40 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:27:40 +0000] "GET /v2.0/tokens/d8114cf5d97d4e38a930ea2fb868ec8e HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:40 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:40 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:40 10.35.0.15 nova-api 2014-07-08 19:27:40.671 3441 INFO nova.osapi_compute.wsgi.server [req-c987b5c9-4f7b-4967-8f41-530a6f3b5cad 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1285062
Jul 8 19:27:41 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.129 from 00:25:90:8b:3f:a7 via management
Jul 8 19:27:41 10.35.0.2 dhcpd: DHCPACK on 10.35.1.129 to 00:25:90:8b:3f:a7 via management
Jul 8 19:27:42 10.35.0.15 [2014-07-08 19:27:42.627] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:27:42 10.35.0.13 [2014-07-08 19:27:42.626] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3080
Jul 8 19:27:43 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:43 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:43 10.35.0.15 nova-api 2014-07-08 19:27:43.433 3442 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:43 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:27:43 +0000] "GET /v2.0/tokens/9d7303f6642645e791c015b26992d3b5 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:43 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:43 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:43 10.35.0.15 nova-api 2014-07-08 19:27:43.544 3442 INFO nova.osapi_compute.wsgi.server [req-b4da7486-94f6-4571-95d4-012d76a58b37 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1126852
Jul 8 19:27:43 10.35.0.14 nova-compute 2014-07-08 19:27:43.748 6347 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:27:43 10.35.0.14 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:27:44 10.35.0.14 -- MARK --
Jul 8 19:27:44 10.35.0.14 nova-compute 2014-07-08 19:27:44.105 6347 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47729
Jul 8 19:27:44 10.35.0.14 nova-compute 2014-07-08 19:27:44.106 6347 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1862
Jul 8 19:27:44 10.35.0.14 nova-compute 2014-07-08 19:27:44.106 6347 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 4
Jul 8 19:27:44 10.35.0.14 nova-compute 2014-07-08 19:27:44.132 6347 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.14:node-10-35-0-14
Jul 8 19:27:44 10.35.0.14 nova-compute 2014-07-08 19:27:44.189 6347 WARNING nova.virt.libvirt.imagecache [-] Unknown base file: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:27:44 10.35.0.14 nova-compute 2014-07-08 19:27:44.189 6347 INFO nova.virt.libvirt.imagecache [-] Removable base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:27:44 10.35.0.14 nova-compute 2014-07-08 19:27:44.190 6347 INFO nova.virt.libvirt.imagecache [-] Base file too young to remove: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:27:46 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:46 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:46 10.35.0.15 nova-api 2014-07-08 19:27:46.316 3442 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:46 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:27:46 +0000] "GET /v2.0/tokens/78d1352af6bf49a38d515d1621722412 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:46 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:46 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:46 10.35.0.15 nova-api 2014-07-08 19:27:46.430 3442 INFO nova.osapi_compute.wsgi.server [req-288fc83d-c08b-40c7-b433-61bc6de99f84 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1162591
Jul 8 19:27:47 10.35.0.15 -- MARK --
Jul 8 19:27:49 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:49 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:49 10.35.0.13 nova-api 2014-07-08 19:27:49.188 4841 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:49 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:27:49 +0000] "GET /v2.0/tokens/c0bb80bdfa3d4c1f856bada82fc657e6 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:49 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:49 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:49 10.35.0.13 nova-api 2014-07-08 19:27:49.301 4841 INFO nova.osapi_compute.wsgi.server [req-ef30ba0d-cd3b-4470-90d1-b0a1f4a1b3ef 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1150820
Jul 8 19:27:49 10.35.0.13 [2014-07-08 19:27:49.322] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:27:49 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:27:49 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:27:49 10.35.0.13 nova-api 2014-07-08 19:27:49.426 4842 INFO nova.osapi_compute.wsgi.server [req-6fa03053-8bd6-4592-9bef-7e0f79c5fc97 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1004140
Jul 8 19:27:49 10.35.0.13 [2014-07-08 19:27:49.428] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:27:49 10.35.0.14 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._refresh_host
Jul 8 19:27:49 10.35.0.14 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._clean
Jul 8 19:27:49 10.35.0.14 cobalt-compute: vmsconn DEBUG Starting to clean symlinks in /dev/nova
Jul 8 19:27:49 10.35.0.14 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/libvirt_save
Jul 8 19:27:49 10.35.0.14 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/novadisk
Jul 8 19:27:49 10.35.0.14 cobalt-compute: loopingcall DEBUG Dynamic looping call sleeping for 60.00 seconds
Jul 8 19:27:51 10.35.0.14 [2014-07-08 19:27:51.962] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:27:52 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:51 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:52 10.35.0.15 nova-api 2014-07-08 19:27:52.082 3441 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:52 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:27:52 +0000] "GET /v2.0/tokens/2ea0008d48574f178666ac35fbe38379 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:52 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:52 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:52 10.35.0.15 nova-api 2014-07-08 19:27:52.194 3441 INFO nova.osapi_compute.wsgi.server [req-b10fdba7-7bb8-4fd7-8a5b-4d5b542adf00 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1139798
Jul 8 19:27:54 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py:435: Warning: Incorrect datetime value: '2014-07-08 19:17:54+00:00' for column 'updated_at' at row 1
Jul 8 19:27:54 10.35.0.13 nova-network.log: cursor.execute(statement, parameters)
Jul 8 19:27:54 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:54 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:54 10.35.0.13 nova-api 2014-07-08 19:27:54.951 4840 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:54 10.35.0.15 nova-compute 2014-07-08 19:27:54.961 3328 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:27:54 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:27:54 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:27:54 +0000] "GET /v2.0/tokens/88af6e056f8b4f938151a243dae4b53e HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:55 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:54 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:55 10.35.0.13 nova-api 2014-07-08 19:27:55.070 4840 INFO nova.osapi_compute.wsgi.server [req-d63e4b32-1b75-40f0-a678-5b4489d6aa7c 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1208801
Jul 8 19:27:55 10.35.0.15 nova-compute 2014-07-08 19:27:55.312 3328 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47217
Jul 8 19:27:55 10.35.0.15 nova-compute 2014-07-08 19:27:55.312 3328 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1861
Jul 8 19:27:55 10.35.0.15 nova-compute 2014-07-08 19:27:55.312 3328 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 3
Jul 8 19:27:55 10.35.0.15 nova-compute 2014-07-08 19:27:55.337 3328 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.15:node-10-35-0-15
Jul 8 19:27:55 10.35.0.15 nova-compute 2014-07-08 19:27:55.391 3328 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): checking
Jul 8 19:27:55 10.35.0.15 nova-compute 2014-07-08 19:27:55.391 3328 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): in use: on this node 1 local, 0 on other nodes sharing this instance storage
Jul 8 19:27:55 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf chown 43 /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:27:55 10.35.0.15 nova-compute 2014-07-08 19:27:55.578 3328 INFO nova.virt.libvirt.imagecache [-] Active base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:27:56 10.35.0.13 nova-compute 2014-07-08 19:27:56.269 4705 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:27:56 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:27:56 10.35.0.13 nova-compute 2014-07-08 19:27:56.659 4705 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 46705
Jul 8 19:27:56 10.35.0.13 nova-compute 2014-07-08 19:27:56.659 4705 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1860
Jul 8 19:27:56 10.35.0.13 nova-compute 2014-07-08 19:27:56.659 4705 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 2
Jul 8 19:27:56 10.35.0.13 nova-compute 2014-07-08 19:27:56.686 4705 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.13:node-10-35-0-13
Jul 8 19:27:56 10.35.0.13 nova-compute 2014-07-08 19:27:56.751 4705 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): checking
Jul 8 19:27:56 10.35.0.13 nova-compute 2014-07-08 19:27:56.752 4705 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): in use: on this node 2 local, 0 on other nodes sharing this instance storage
Jul 8 19:27:56 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf chown 43 /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:27:56 10.35.0.13 nova-compute 2014-07-08 19:27:56.940 4705 INFO nova.virt.libvirt.imagecache [-] Active base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:27:57 10.35.0.2 -- MARK --
Jul 8 19:27:57 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:27:57 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:27:57 10.35.0.15 nova-api 2014-07-08 19:27:57.845 3441 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:27:57 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:27:57 +0000] "GET /v2.0/tokens/e165c3aa8dd142ffafb8928d1eca7bdd HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:27:57 10.35.0.15 nova-api 2014-07-08 19:27:57.967 3441 INFO nova.osapi_compute.wsgi.server [req-98771dfa-9f66-41b6-974f-427f83cac9fd 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1259179
Jul 8 19:27:57 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:27:57 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:27:58 10.35.0.13 [2014-07-08 19:27:58.197] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3080
Jul 8 19:27:58 10.35.0.15 [2014-07-08 19:27:58.199] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:27:58 10.35.0.15 cinder-volume 2014-07-08 19:27:58.292 9067 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-b875b81b-d9c8-44dc-b9a7-7446cf50db04\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:27:58.290068\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"c14007db290f47cd8b442c460abd7c7c\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:27:58 10.35.0.15 cinder-volume 2014-07-08 19:27:58.293 9067 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-b875b81b-d9c8-44dc-b9a7-7446cf50db04\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:27:58.290068\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"c14007db290f47cd8b442c460abd7c7c\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:27:58 10.35.0.15 cinder-scheduler 2014-07-08 19:27:58.295 8863 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-b875b81b-d9c8-44dc-b9a7-7446cf50db04\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:27:58.290068\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"c14007db290f47cd8b442c460abd7c7c\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'3', 'redelivered': False, 'routing_key': u'', 'delivery_tag': 83, 'exchange': u'cinder-scheduler_fanout'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fa03d158690>}
Jul 8 19:27:58 10.35.0.15 cinder-volume 2014-07-08 19:27:58.296 9067 INFO cinder.volume.manager [-] Updating volume status
Jul 8 19:27:58 10.35.0.15 cinder-scheduler 2014-07-08 19:27:58.297 8863 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-b875b81b-d9c8-44dc-b9a7-7446cf50db04\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:27:58.290068\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-fast\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPH\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"c14007db290f47cd8b442c460abd7c7c\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}'
Jul 8 19:27:58 10.35.0.15 cinder-scheduler 2014-07-08 19:27:58.298 8863 WARNING cinder.context [-] Arguments dropped when creating context: {'user': None, 'tenant': None, 'user_identity': u'- - - - -'}
Jul 8 19:27:58 10.35.0.15 cinder-scheduler 2014-07-08 19:27:58.299 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"d6e81d3a840d4b50aaa835254703bf9c\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:27:58 10.35.0.15 cinder-scheduler 2014-07-08 19:27:58.299 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"d6e81d3a840d4b50aaa835254703bf9c\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:27:58 10.35.0.15 cinder-scheduler 2014-07-08 19:27:58.301 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"51c405ad7afc4c6c8062cee988580c09\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:27:58 10.35.0.15 cinder-scheduler 2014-07-08 19:27:58.301 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"51c405ad7afc4c6c8062cee988580c09\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:27:59 10.35.0.13 [2014-07-08 19:27:59.437] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:27:59 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:27:59 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:27:59 10.35.0.14 nova-api 2014-07-08 19:27:59.540 6474 INFO nova.osapi_compute.wsgi.server [req-ca1a4766-b2da-4bd7-93de-c3b91faa275f 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1000190
Jul 8 19:27:59 10.35.0.13 [2014-07-08 19:27:59.544] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:28:00 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:00 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:00 10.35.0.14 nova-api 2014-07-08 19:28:00.740 6474 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:00 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:28:00 +0000] "GET /v2.0/tokens/cf0d04e058fe43e588e503d112f5341e HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:00 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:00 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:28:00 10.35.0.14 nova-api 2014-07-08 19:28:00.858 6474 INFO nova.osapi_compute.wsgi.server [req-36e36c35-ad82-4bd4-8037-90a2fe80e73c 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1201451
Jul 8 19:28:03 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:03 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:03 10.35.0.14 nova-api 2014-07-08 19:28:03.612 6473 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:03 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:28:03 +0000] "GET /v2.0/tokens/5b4efe155887428e90ff2d144f3116d2 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:03 10.35.0.14 nova-compute 2014-07-08 19:28:03.727 6347 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:28:03 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:03 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:28:03 10.35.0.14 nova-api 2014-07-08 19:28:03.736 6473 INFO nova.osapi_compute.wsgi.server [req-11d3b4ce-5aaf-409a-894c-68d8159b5116 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1259708
Jul 8 19:28:03 10.35.0.14 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:28:04 10.35.0.14 nova-compute 2014-07-08 19:28:04.079 6347 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47729
Jul 8 19:28:04 10.35.0.14 nova-compute 2014-07-08 19:28:04.079 6347 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1862
Jul 8 19:28:04 10.35.0.14 nova-compute 2014-07-08 19:28:04.079 6347 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 4
Jul 8 19:28:04 10.35.0.14 nova-compute 2014-07-08 19:28:04.229 6347 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.14:node-10-35-0-14
Jul 8 19:28:04 10.35.0.13 [2014-07-08 19:28:04.611] 11286/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.9']
Jul 8 19:28:04 10.35.0.15 [2014-07-08 19:28:04.925] 8877/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.6']
Jul 8 19:28:05 10.35.0.15 nova-compute 2014-07-08 19:28:05.146 3328 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:28:05 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:28:05 10.35.0.15 nova-compute 2014-07-08 19:28:05.502 3328 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47217
Jul 8 19:28:05 10.35.0.15 nova-compute 2014-07-08 19:28:05.503 3328 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1861
Jul 8 19:28:05 10.35.0.15 nova-compute 2014-07-08 19:28:05.503 3328 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 3
Jul 8 19:28:05 10.35.0.15 nova-compute 2014-07-08 19:28:05.529 3328 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.15:node-10-35-0-15
Jul 8 19:28:05 10.35.0.13 nova-compute 2014-07-08 19:28:05.587 4705 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:28:05 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:28:05 10.35.0.15 nova-compute 2014-07-08 19:28:05.654 3328 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"d5611b4067a842b7a9862f465b0c55aa\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-5dcf6838-683e-49fd-9e77-44d8e0c01c63\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"c97cc94d53ef4a70be0efb4c6d8dccf8\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:28:05.145119\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:28:05 10.35.0.15 nova-compute 2014-07-08 19:28:05.654 3328 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"d5611b4067a842b7a9862f465b0c55aa\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-5dcf6838-683e-49fd-9e77-44d8e0c01c63\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"c97cc94d53ef4a70be0efb4c6d8dccf8\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:28:05.145119\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:28:05 10.35.0.14 nova-network 2014-07-08 19:28:05.652 6514 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"d5611b4067a842b7a9862f465b0c55aa\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-5dcf6838-683e-49fd-9e77-44d8e0c01c63\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"c97cc94d53ef4a70be0efb4c6d8dccf8\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:28:05.145119\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 29, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f5c1424f1d0>}
Jul 8 19:28:05 10.35.0.14 nova-network 2014-07-08 19:28:05.654 6514 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"d5611b4067a842b7a9862f465b0c55aa\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-5dcf6838-683e-49fd-9e77-44d8e0c01c63\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"3e65c658-731c-4a56-8c17-1cfac00bf754\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"c97cc94d53ef4a70be0efb4c6d8dccf8\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:28:05.145119\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}'
Jul 8 19:28:05 10.35.0.15 [2014-07-08 19:28:05.773] 8900/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.4']
Jul 8 19:28:05 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:05 10.35.0.14 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:05 10.35.0.14 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:05 10.35.0.14 nova-network 2014-07-08 19:28:05.796 6514 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"240f18f62638412095e497168685f8dc\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"d5611b4067a842b7a9862f465b0c55aa\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:05 10.35.0.14 nova-network 2014-07-08 19:28:05.796 6514 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"240f18f62638412095e497168685f8dc\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"d5611b4067a842b7a9862f465b0c55aa\\"}", "oslo.version": "2.0"}', routing_key=u'reply_af4fb5adaca445a3b6d555c358806a4e'
Jul 8 19:28:05 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:05 10.35.0.14 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:05 10.35.0.14 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:05 10.35.0.14 nova-network 2014-07-08 19:28:05.797 6514 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"a358550430a54c498c74e2dcabbe047e\\", \\"failure\\": null, \\"_msg_id\\": \\"d5611b4067a842b7a9862f465b0c55aa\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:05 10.35.0.14 nova-network 2014-07-08 19:28:05.797 6514 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"a358550430a54c498c74e2dcabbe047e\\", \\"failure\\": null, \\"_msg_id\\": \\"d5611b4067a842b7a9862f465b0c55aa\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_af4fb5adaca445a3b6d555c358806a4e'
Jul 8 19:28:05 10.35.0.15 nova-compute 2014-07-08 19:28:05.800 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"240f18f62638412095e497168685f8dc\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"d5611b4067a842b7a9862f465b0c55aa\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_af4fb5adaca445a3b6d555c358806a4e', 'delivery_tag': 81, 'exchange': u'reply_af4fb5adaca445a3b6d555c358806a4e'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030b1ff10>}
Jul 8 19:28:05 10.35.0.15 nova-compute 2014-07-08 19:28:05.801 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"240f18f62638412095e497168685f8dc\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.6\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:c1:d4:f3\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"ecd8dbf8-7280-45c3-89e1-919f48b2591c\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"d5611b4067a842b7a9862f465b0c55aa\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:05 10.35.0.15 nova-compute 2014-07-08 19:28:05.802 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"a358550430a54c498c74e2dcabbe047e\\", \\"failure\\": null, \\"_msg_id\\": \\"d5611b4067a842b7a9862f465b0c55aa\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_af4fb5adaca445a3b6d555c358806a4e', 'delivery_tag': 82, 'exchange': u'reply_af4fb5adaca445a3b6d555c358806a4e'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030b1ff10>}
Jul 8 19:28:05 10.35.0.15 nova-compute 2014-07-08 19:28:05.802 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"a358550430a54c498c74e2dcabbe047e\\", \\"failure\\": null, \\"_msg_id\\": \\"d5611b4067a842b7a9862f465b0c55aa\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:28:05 10.35.0.13 [2014-07-08 19:28:05.839] 11310/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.2']
Jul 8 19:28:05 10.35.0.13 nova-compute 2014-07-08 19:28:05.982 4705 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 46705
Jul 8 19:28:05 10.35.0.13 nova-compute 2014-07-08 19:28:05.982 4705 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1860
Jul 8 19:28:05 10.35.0.13 nova-compute 2014-07-08 19:28:05.982 4705 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 2
Jul 8 19:28:06 10.35.0.13 nova-compute 2014-07-08 19:28:06.010 4705 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.13:node-10-35-0-13
Jul 8 19:28:06 10.35.0.13 nova-compute 2014-07-08 19:28:06.146 4705 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"73b3f3277d0746a4a019df20fd2a6a8f\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-aa39c20c-b199-4939-95a8-a87f81e7b1d1\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"9bf3c89495a840fd986508765be08fe0\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:28:05.585776\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:28:06 10.35.0.13 nova-compute 2014-07-08 19:28:06.147 4705 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"73b3f3277d0746a4a019df20fd2a6a8f\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-aa39c20c-b199-4939-95a8-a87f81e7b1d1\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"9bf3c89495a840fd986508765be08fe0\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:28:05.585776\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:28:06 10.35.0.15 nova-network 2014-07-08 19:28:06.150 3465 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"73b3f3277d0746a4a019df20fd2a6a8f\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-aa39c20c-b199-4939-95a8-a87f81e7b1d1\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"9bf3c89495a840fd986508765be08fe0\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:28:05.585776\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 29, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7eff2d221290>}
Jul 8 19:28:06 10.35.0.15 nova-network 2014-07-08 19:28:06.152 3465 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"73b3f3277d0746a4a019df20fd2a6a8f\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-aa39c20c-b199-4939-95a8-a87f81e7b1d1\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"9bf3c89495a840fd986508765be08fe0\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_5a152b20506e4303bfaf3007cbe974c3\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:28:05.585776\\", \\"_context_user_name\\": null, \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}'
Jul 8 19:28:06 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:06 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:06 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:06 10.35.0.15 nova-network 2014-07-08 19:28:06.291 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"f920e90fd6b845aeab326231afdb553d\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"73b3f3277d0746a4a019df20fd2a6a8f\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:06 10.35.0.15 nova-network 2014-07-08 19:28:06.291 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"f920e90fd6b845aeab326231afdb553d\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"73b3f3277d0746a4a019df20fd2a6a8f\\"}", "oslo.version": "2.0"}', routing_key=u'reply_5a152b20506e4303bfaf3007cbe974c3'
Jul 8 19:28:06 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:06 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:06 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:06 10.35.0.15 nova-network 2014-07-08 19:28:06.292 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"6a8c2aaf2bb74535ab3f45096c71257d\\", \\"failure\\": null, \\"_msg_id\\": \\"73b3f3277d0746a4a019df20fd2a6a8f\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:06 10.35.0.13 nova-compute 2014-07-08 19:28:06.290 4705 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"f920e90fd6b845aeab326231afdb553d\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"73b3f3277d0746a4a019df20fd2a6a8f\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_5a152b20506e4303bfaf3007cbe974c3', 'delivery_tag': 81, 'exchange': u'reply_5a152b20506e4303bfaf3007cbe974c3'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fde2d448390>}
Jul 8 19:28:06 10.35.0.15 nova-network 2014-07-08 19:28:06.293 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"6a8c2aaf2bb74535ab3f45096c71257d\\", \\"failure\\": null, \\"_msg_id\\": \\"73b3f3277d0746a4a019df20fd2a6a8f\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_5a152b20506e4303bfaf3007cbe974c3'
Jul 8 19:28:06 10.35.0.13 nova-compute 2014-07-08 19:28:06.291 4705 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"f920e90fd6b845aeab326231afdb553d\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"73b3f3277d0746a4a019df20fd2a6a8f\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:06 10.35.0.13 nova-compute 2014-07-08 19:28:06.292 4705 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"6a8c2aaf2bb74535ab3f45096c71257d\\", \\"failure\\": null, \\"_msg_id\\": \\"73b3f3277d0746a4a019df20fd2a6a8f\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_5a152b20506e4303bfaf3007cbe974c3', 'delivery_tag': 82, 'exchange': u'reply_5a152b20506e4303bfaf3007cbe974c3'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fde2d448390>}
Jul 8 19:28:06 10.35.0.13 nova-compute 2014-07-08 19:28:06.292 4705 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"6a8c2aaf2bb74535ab3f45096c71257d\\", \\"failure\\": null, \\"_msg_id\\": \\"73b3f3277d0746a4a019df20fd2a6a8f\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:28:06 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:06 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:06 10.35.0.14 nova-api 2014-07-08 19:28:06.498 6473 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:06 10.35.0.13 [2014-07-08 19:28:06.508] 11330/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'host', '-c', '1', '-U', '10.35.0.3']
Jul 8 19:28:06 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:28:06 +0000] "GET /v2.0/tokens/89929b2e009f437eaea3b9b1778731ad HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:06 10.35.0.15 [2014-07-08 19:28:06.565] 9060/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.5']
Jul 8 19:28:06 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:06 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:28:06 10.35.0.14 nova-api 2014-07-08 19:28:06.613 6473 INFO nova.osapi_compute.wsgi.server [req-2b2f18f3-61f4-4318-8582-d66766b943b8 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1179242
Jul 8 19:28:07 10.35.0.14 [2014-07-08 19:28:07.531] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:28:09 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:09 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:09 10.35.0.15 nova-api 2014-07-08 19:28:09.379 3441 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:09 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:28:09 +0000] "GET /v2.0/tokens/3688cd39eb054c95a39a381f1d13d8ff HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:09 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:09 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:28:09 10.35.0.15 nova-api 2014-07-08 19:28:09.491 3441 INFO nova.osapi_compute.wsgi.server [req-0a19173d-9d7f-4413-9a75-70b74add1256 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1143689
Jul 8 19:28:09 10.35.0.13 [2014-07-08 19:28:09.552] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:28:09 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:28:09 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:28:09 10.35.0.14 nova-api 2014-07-08 19:28:09.661 6474 INFO nova.osapi_compute.wsgi.server [req-acd1bb40-7485-47c1-87ce-f9f45379be7b 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1043789
Jul 8 19:28:09 10.35.0.13 [2014-07-08 19:28:09.665] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:28:10 10.35.0.15 [2014-07-08 19:28:10.242] 9162/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'host', '-c', '1', '-U', '10.35.0.4']
Jul 8 19:28:11 10.35.0.13 [2014-07-08 19:28:11.834] 10213/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.7']
Jul 8 19:28:12 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:12 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:12 10.35.0.13 nova-api 2014-07-08 19:28:12.251 4844 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:12 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:28:12 +0000] "GET /v2.0/tokens/befa95768bc54c3bb5aed7d47c63d9d5 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:12 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:12 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:28:12 10.35.0.13 nova-api 2014-07-08 19:28:12.374 4844 INFO nova.osapi_compute.wsgi.server [req-037c3e62-834c-49f2-814c-2613394a52e4 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1255510
Jul 8 19:28:12 10.35.0.15 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._refresh_host
Jul 8 19:28:12 10.35.0.15 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._clean
Jul 8 19:28:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Starting to clean symlinks in /dev/nova
Jul 8 19:28:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/3e65c658-731c-4a56-8c17-1cfac00bf754_disk
Jul 8 19:28:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/libvirt_save
Jul 8 19:28:12 10.35.0.15 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/novadisk
Jul 8 19:28:12 10.35.0.15 cobalt-compute: loopingcall DEBUG Dynamic looping call sleeping for 60.00 seconds
Jul 8 19:28:13 10.35.0.13 [2014-07-08 19:28:13.023] 10232/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.8']
Jul 8 19:28:13 10.35.0.13 [2014-07-08 19:28:13.160] 10245/MainThread savage.arp_forever/INFO: execute['arping', '-q', '-A', '-I', 'services', '-c', '1', '-U', '10.35.2.3']
Jul 8 19:28:13 10.35.0.15 [2014-07-08 19:28:13.761] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:28:13 10.35.0.13 [2014-07-08 19:28:13.761] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3079
Jul 8 19:28:15 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:15 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:15 10.35.0.15 nova-api 2014-07-08 19:28:15.138 3441 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:15 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:28:15 +0000] "GET /v2.0/tokens/70ae315a099f44aba222a43f4fa8dee4 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:15 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:15 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:28:15 10.35.0.15 nova-api 2014-07-08 19:28:15.249 3441 INFO nova.osapi_compute.wsgi.server [req-db132302-e529-4111-97ee-00174b28d2b6 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1130791
Jul 8 19:28:17 127.0.0.1 gunicorn.yellin: [2749] Autorestarting worker after current request.
Jul 8 19:28:17 10.35.0.2 [2014-07-08 19:28:17.974] 2749/MainThread gunicorn.error/INFO: Autorestarting worker after current request.
Jul 8 19:28:18 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:17 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:18 10.35.0.15 nova-api 2014-07-08 19:28:18.022 3442 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:18 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:28:18 +0000] "GET /v2.0/tokens/235163e895f6483fb0a7aab856cd11d1 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:18 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:18 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:28:18 10.35.0.15 nova-api 2014-07-08 19:28:18.143 3442 INFO nova.osapi_compute.wsgi.server [req-a0cd2222-ed3c-47df-aa7f-1148096dccd4 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1232579
Jul 8 19:28:18 10.35.0.15 cinder-volume 2014-07-08 19:28:18.144 9068 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-107eda9f-fac3-4ed8-9553-95e4ee9ff672\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:18.142009\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"514c6d2aef4b44029db1d37e28d767fb\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:18 10.35.0.15 cinder-volume 2014-07-08 19:28:18.144 9068 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-107eda9f-fac3-4ed8-9553-95e4ee9ff672\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:18.142009\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"514c6d2aef4b44029db1d37e28d767fb\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:28:18 10.35.0.15 cinder-scheduler 2014-07-08 19:28:18.146 8863 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-107eda9f-fac3-4ed8-9553-95e4ee9ff672\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:18.142009\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"514c6d2aef4b44029db1d37e28d767fb\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'3', 'redelivered': False, 'routing_key': u'', 'delivery_tag': 84, 'exchange': u'cinder-scheduler_fanout'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fa03d158690>}
Jul 8 19:28:18 10.35.0.15 cinder-volume 2014-07-08 19:28:18.146 9068 INFO cinder.volume.manager [-] Updating volume status
Jul 8 19:28:18 10.35.0.15 cinder-scheduler 2014-07-08 19:28:18.147 8863 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_domain\\": null, \\"_context_request_id\\": \\"req-107eda9f-fac3-4ed8-9553-95e4ee9ff672\\", \\"_context_quota_class\\": null, \\"_context_service_catalog\\": [], \\"_context_auth_token\\": null, \\"_context_user_id\\": null, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:18.142009\\", \\"_context_project_domain\\": null, \\"_context_user\\": null, \\"method\\": \\"update_service_capabilities\\", \\"_context_remote_address\\": null, \\"_context_roles\\": [\\"admin\\"], \\"args\\": {\\"service_name\\": \\"volume\\", \\"host\\": \\"cinder@rbd-slow\\", \\"capabilities\\": {\\"allocated_capacity_gb\\": 0, \\"volume_backend_name\\": \\"CEPHSLOW\\", \\"free_capacity_gb\\": 1281, \\"driver_version\\": \\"1.1.0\\", \\"total_capacity_gb\\": 1282, \\"reserved_percentage\\": 0, \\"vendor_name\\": \\"Open Source\\", \\"storage_protocol\\": \\"ceph\\"}}, \\"_unique_id\\": \\"514c6d2aef4b44029db1d37e28d767fb\\", \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_context_user_identity\\": \\"- - - - -\\", \\"_context_tenant\\": null, \\"_context_project_id\\": null, \\"_context_user_domain\\": null}", "oslo.version": "2.0"}'
Jul 8 19:28:18 10.35.0.15 cinder-scheduler 2014-07-08 19:28:18.148 8863 WARNING cinder.context [-] Arguments dropped when creating context: {'user': None, 'tenant': None, 'user_identity': u'- - - - -'}
Jul 8 19:28:18 10.35.0.15 cinder-scheduler 2014-07-08 19:28:18.149 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"1284ff2c03c14219849c21a019d2ffaf\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:18 10.35.0.15 cinder-scheduler 2014-07-08 19:28:18.149 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"1284ff2c03c14219849c21a019d2ffaf\\", \\"failure\\": null, \\"result\\": null}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:28:18 10.35.0.15 cinder-scheduler 2014-07-08 19:28:18.150 8863 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"69945d476a734e04952a2317c09a932f\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:18 10.35.0.15 cinder-scheduler 2014-07-08 19:28:18.150 8863 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"69945d476a734e04952a2317c09a932f\\", \\"failure\\": null, \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=''
Jul 8 19:28:19 10.35.0.13 [2014-07-08 19:28:19.676] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:28:19 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:28:19 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:28:19 10.35.0.13 nova-api 2014-07-08 19:28:19.788 4842 INFO nova.osapi_compute.wsgi.server [req-5774deb0-baf4-40e5-afc7-269be1306ea8 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1074131
Jul 8 19:28:19 10.35.0.13 [2014-07-08 19:28:19.790] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:28:20 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py:435: Warning: Incorrect datetime value: '2014-07-08 19:18:20+00:00' for column 'updated_at' at row 1
Jul 8 19:28:20 10.35.0.14 nova-network.log: cursor.execute(statement, parameters)
Jul 8 19:28:20 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:20 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:20 10.35.0.15 nova-api 2014-07-08 19:28:20.898 3441 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:20 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:28:20 +0000] "GET /v2.0/tokens/e255a70a77124d1a82c6403c4b71378b HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:21 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:20 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:28:21 10.35.0.15 nova-api 2014-07-08 19:28:21.009 3441 INFO nova.osapi_compute.wsgi.server [req-b5e998c5-d944-4aff-83dc-adfdc22c70a1 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1127489
Jul 8 19:28:23 10.35.0.14 [2014-07-08 19:28:23.093] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:28:23 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:23 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:23 10.35.0.15 nova-api 2014-07-08 19:28:23.775 3442 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:23 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:28:23 +0000] "GET /v2.0/tokens/06af45d8a0a34655892f4dd352f45fee HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:23 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:23 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:28:23 10.35.0.15 nova-api 2014-07-08 19:28:23.900 3442 INFO nova.osapi_compute.wsgi.server [req-0a42e6e8-2a2f-48a8-9c40-11ebadb94a07 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1268561
Jul 8 19:28:24 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/sqlalchemy/engine/default.py:435: Warning: Incorrect datetime value: '2014-07-08 19:18:24+00:00' for column 'updated_at' at row 1
Jul 8 19:28:24 10.35.0.15 nova-network.log: cursor.execute(statement, parameters)
Jul 8 19:28:24 10.35.0.14 nova-compute 2014-07-08 19:28:24.179 6347 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:28:24 10.35.0.14 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:28:24 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.130 from 00:25:90:2f:2d:56 via management
Jul 8 19:28:24 10.35.0.2 dhcpd: DHCPACK on 10.35.1.130 to 00:25:90:2f:2d:56 via management
Jul 8 19:28:24 10.35.0.14 nova-compute 2014-07-08 19:28:24.528 6347 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47729
Jul 8 19:28:24 10.35.0.14 nova-compute 2014-07-08 19:28:24.529 6347 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1862
Jul 8 19:28:24 10.35.0.14 nova-compute 2014-07-08 19:28:24.529 6347 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 4
Jul 8 19:28:24 10.35.0.14 nova-compute 2014-07-08 19:28:24.556 6347 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.14:node-10-35-0-14
Jul 8 19:28:24 10.35.0.14 nova-compute 2014-07-08 19:28:24.613 6347 WARNING nova.virt.libvirt.imagecache [-] Unknown base file: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:28:24 10.35.0.14 nova-compute 2014-07-08 19:28:24.613 6347 INFO nova.virt.libvirt.imagecache [-] Removable base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:28:24 10.35.0.14 nova-compute 2014-07-08 19:28:24.613 6347 INFO nova.virt.libvirt.imagecache [-] Base file too young to remove: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:28:26 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:26 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:26 10.35.0.14 nova-api 2014-07-08 19:28:26.675 6473 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:26 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:28:26 +0000] "GET /v2.0/tokens/0090403d359a43e9aa8aebf9ceb09558 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:26 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:26 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:28:26 10.35.0.14 nova-api 2014-07-08 19:28:26.798 6473 INFO nova.osapi_compute.wsgi.server [req-31941cb0-1dce-4984-b306-0c9508ff6c0c 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1264648
Jul 8 19:28:28 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.131 from 00:25:90:2f:2d:07 via management
Jul 8 19:28:28 10.35.0.2 dhcpd: DHCPACK on 10.35.1.131 to 00:25:90:2f:2d:07 via management
Jul 8 19:28:29 10.35.0.15 [2014-07-08 19:28:29.327] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:28:29 10.35.0.13 [2014-07-08 19:28:29.326] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3080
Jul 8 19:28:29 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:29 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:29 10.35.0.14 nova-api 2014-07-08 19:28:29.559 6472 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:29 10.35.0.13 pound: 10.35.2.3:35357 10.35.0.14 - - [08/Jul/2014:19:28:29 +0000] "GET /v2.0/tokens/ce3b5ad6cc3347eb95407ae6adb9206d HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:29 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:29 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:28:29 10.35.0.14 nova-api 2014-07-08 19:28:29.678 6472 INFO nova.osapi_compute.wsgi.server [req-4eba9689-aeda-4ab5-a424-bd9240e61409 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1208680
Jul 8 19:28:29 10.35.0.13 [2014-07-08 19:28:29.800] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:28:29 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:28:29 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1691 "" "python-novaclient"
Jul 8 19:28:29 10.35.0.15 nova-api 2014-07-08 19:28:29.914 3440 INFO nova.osapi_compute.wsgi.server [req-45b996da-9d50-4ff1-9501-f52541991b5e 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1865 time: 0.1054718
Jul 8 19:28:29 10.35.0.13 [2014-07-08 19:28:29.915] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:28:32 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:32 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:32 10.35.0.15 nova-api 2014-07-08 19:28:32.451 3443 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:32 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:28:32 +0000] "GET /v2.0/tokens/52622cfdd2d2438b9b2a3d457bcfe5ea HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:32 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:32 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:28:32 10.35.0.15 nova-api 2014-07-08 19:28:32.578 3443 INFO nova.osapi_compute.wsgi.server [req-854e95f3-29cf-4810-a97d-7f102a4734a1 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1301329
Jul 8 19:28:33 127.0.0.1 gunicorn.yellin: [2749] Worker exiting (pid: 2749)
Jul 8 19:28:33 10.35.0.2 [2014-07-08 19:28:32.998] 2749/MainThread gunicorn.error/INFO: Worker exiting (pid: 2749)
Jul 8 19:28:33 127.0.0.1 gunicorn.yellin: [1818] Booting worker with pid: 1818
Jul 8 19:28:33 10.35.0.2 [2014-07-08 19:28:33.461] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('YellinConfiguration = savage.services.yellin.app:YellinConfiguration')
Jul 8 19:28:33 10.35.0.2 [2014-07-08 19:28:33.461] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaServiceState = savage.jobs:NovaServiceState')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.630] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BootNode = savage.jobs:BootNode')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.631] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BasicConfiguration = savage.jobs:BasicConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.632] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BeforeNova = savage.jobs:BeforeNova')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.632] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('KeystoneLoadBalancer = savage.jobs:KeystoneLoadBalancer')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.632] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('MoveCrashDumps = savage.jobs:MoveCrashDumps')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.633] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('RadosgwConfiguration = savage.jobs:RadosgwConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.633] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Nova = savage.jobs:Nova')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.633] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('KeystoneConfiguration = savage.jobs:KeystoneConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.634] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CinderFinalSetup = savage.jobs:CinderFinalSetup')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.634] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CephWatchdog = savage.jobs.ceph_kick:CephWatchdog')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.636] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CronConfiguration = savage.jobs:CronConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.636] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Keystone = savage.jobs:Keystone')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.637] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('AuxiliaryNetwork = savage.jobs.network:AuxiliaryNetwork')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.657] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Radosgw = savage.jobs:Radosgw')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.658] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Mysql = savage.jobs:Mysql')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.658] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('MyHost = savage.jobs:MyHost')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.659] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Snmpd = savage.jobs:Snmpd')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.659] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaConfiguration = savage.jobs:NovaConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.659] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Neutron = savage.jobs:Neutron')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.660] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LocalDisk = savage.jobs:LocalDisk')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.660] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Ceph = savage.jobs:Ceph')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.660] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NetTest = savage.jobs.network:NetTest')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.661] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('AuthStoreConfiguration = savage.jobs.auth:AuthStoreConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.668] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('ShutdownLevels = savage.services.fred:ShutdownLevels')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.780] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LocalDiskConfiguration = savage.jobs:LocalDiskConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.781] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SnmpdConfiguration = savage.jobs:SnmpdConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.781] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('GlanceConfiguration = savage.jobs:GlanceConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.782] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LibvirtSave = savage.jobs:LibvirtSave')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.782] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Memcache = savage.jobs:Memcache')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.783] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SSLConfiguration = savage.jobs:SSLConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.783] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('RabbitConfiguration = savage.jobs:RabbitConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.783] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Rabbit = savage.jobs:Rabbit')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.784] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Apache = savage.jobs:Apache')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.784] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('RadosgwLoadBalancer = savage.jobs:RadosgwLoadBalancer')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.784] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('KeystoneFinalSetup = savage.jobs:KeystoneFinalSetup')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.785] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('FirewallConfiguration = savage.jobs.network:FirewallConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.785] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NeutronConfiguration = savage.jobs:NeutronConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.786] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('UpstreamServers = savage.jobs.network:UpstreamServers')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.786] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Debug = savage.jobs:Debug')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.786] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NoVNC = savage.jobs:NoVNC')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.787] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('DashboardLoadBalancer = savage.jobs:DashboardLoadBalancer')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.787] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('DashboardConfiguration = savage.jobs:DashboardConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.787] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('HostStoreConfiguration = savage.jobs:HostStoreConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.788] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaLoadBalancer = savage.jobs:NovaLoadBalancer')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.789] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Hosts = savage.jobs:Hosts')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.789] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Dashboard = savage.jobs:Dashboard')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.789] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('MysqlConfiguration = savage.jobs:MysqlConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.790] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NoVNCConfiguration = savage.jobs:NoVNCConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.790] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Cinder = savage.jobs:Cinder')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.791] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('OpenstackInternal = savage.jobs:OpenstackInternal')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.791] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BeforeNeutron = savage.jobs:BeforeNeutron')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.791] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NetworkConfiguration = savage.jobs.network:NetworkConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.792] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BootNodeMinimalConfiguration = savage.machines:BootNodeMinimalConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.792] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaNetworkFinalSetup = savage.jobs:NovaNetworkFinalSetup')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.792] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CephMounted = savage.jobs:CephMounted')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.793] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Arpwatch = savage.jobs.arpwatch:Arpwatch')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.794] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CephReplication = savage.jobs:CephReplication')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.795] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Firewall = savage.jobs.network:Firewall')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.795] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Zookeeper = savage.jobs:Zookeeper')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.795] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('GlanceLoadBalancer = savage.jobs:GlanceLoadBalancer')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.796] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('FredConfiguration = savage.services.fred:FredConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.796] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CinderLoadBalancer = savage.jobs:CinderLoadBalancer')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.797] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Cron = savage.jobs:Cron')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.797] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LibVirtd = savage.jobs:LibVirtd')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.797] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SSH = savage.jobs:SSH')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.798] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SupportConfiguration = savage.services.support:SupportConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.937] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CinderConfiguration = savage.jobs:CinderConfiguration')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.938] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Openstack = savage.jobs:Openstack')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.938] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Glance = savage.jobs:Glance')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.938] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Disk = savage.jobs:Disk')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.939] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('PublicApi = savage.jobs.api:PublicApi')
Jul 8 19:28:34 10.35.0.2 [2014-07-08 19:28:34.942] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('WriteUUID = savage.jobs:WriteUUID')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.005] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('YellinConfiguration = savage.services.yellin.app:YellinConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.006] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaServiceState = savage.jobs:NovaServiceState')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.007] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BootNode = savage.jobs:BootNode')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.007] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BasicConfiguration = savage.jobs:BasicConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.007] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BeforeNova = savage.jobs:BeforeNova')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.008] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('KeystoneLoadBalancer = savage.jobs:KeystoneLoadBalancer')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.008] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('MoveCrashDumps = savage.jobs:MoveCrashDumps')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.008] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('RadosgwConfiguration = savage.jobs:RadosgwConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.009] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Nova = savage.jobs:Nova')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.009] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('KeystoneConfiguration = savage.jobs:KeystoneConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.009] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CinderFinalSetup = savage.jobs:CinderFinalSetup')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.010] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CephWatchdog = savage.jobs.ceph_kick:CephWatchdog')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.010] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CronConfiguration = savage.jobs:CronConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.010] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Keystone = savage.jobs:Keystone')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.011] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('AuxiliaryNetwork = savage.jobs.network:AuxiliaryNetwork')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.011] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Radosgw = savage.jobs:Radosgw')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.011] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Mysql = savage.jobs:Mysql')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.012] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('MyHost = savage.jobs:MyHost')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.012] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Snmpd = savage.jobs:Snmpd')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.012] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaConfiguration = savage.jobs:NovaConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.013] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Neutron = savage.jobs:Neutron')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.013] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LocalDisk = savage.jobs:LocalDisk')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.013] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Ceph = savage.jobs:Ceph')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.014] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NetTest = savage.jobs.network:NetTest')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.014] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('AuthStoreConfiguration = savage.jobs.auth:AuthStoreConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.014] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('ShutdownLevels = savage.services.fred:ShutdownLevels')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.015] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LocalDiskConfiguration = savage.jobs:LocalDiskConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.015] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SnmpdConfiguration = savage.jobs:SnmpdConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.016] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('GlanceConfiguration = savage.jobs:GlanceConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.016] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LibvirtSave = savage.jobs:LibvirtSave')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.016] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Memcache = savage.jobs:Memcache')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.016] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SSLConfiguration = savage.jobs:SSLConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.017] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('RabbitConfiguration = savage.jobs:RabbitConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.017] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Rabbit = savage.jobs:Rabbit')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.017] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Apache = savage.jobs:Apache')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.018] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('RadosgwLoadBalancer = savage.jobs:RadosgwLoadBalancer')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.018] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('KeystoneFinalSetup = savage.jobs:KeystoneFinalSetup')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.019] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('FirewallConfiguration = savage.jobs.network:FirewallConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.019] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NeutronConfiguration = savage.jobs:NeutronConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.019] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('UpstreamServers = savage.jobs.network:UpstreamServers')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.019] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Debug = savage.jobs:Debug')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.020] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NoVNC = savage.jobs:NoVNC')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.020] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('DashboardLoadBalancer = savage.jobs:DashboardLoadBalancer')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.021] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('DashboardConfiguration = savage.jobs:DashboardConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.021] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('HostStoreConfiguration = savage.jobs:HostStoreConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.021] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaLoadBalancer = savage.jobs:NovaLoadBalancer')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.022] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Hosts = savage.jobs:Hosts')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.022] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Dashboard = savage.jobs:Dashboard')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.023] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('MysqlConfiguration = savage.jobs:MysqlConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.023] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NoVNCConfiguration = savage.jobs:NoVNCConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.023] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Cinder = savage.jobs:Cinder')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.024] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('OpenstackInternal = savage.jobs:OpenstackInternal')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.024] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BeforeNeutron = savage.jobs:BeforeNeutron')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.024] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NetworkConfiguration = savage.jobs.network:NetworkConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.024] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('BootNodeMinimalConfiguration = savage.machines:BootNodeMinimalConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.025] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('NovaNetworkFinalSetup = savage.jobs:NovaNetworkFinalSetup')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.025] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CephMounted = savage.jobs:CephMounted')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.025] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Arpwatch = savage.jobs.arpwatch:Arpwatch')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.026] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CephReplication = savage.jobs:CephReplication')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.026] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Firewall = savage.jobs.network:Firewall')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.026] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Zookeeper = savage.jobs:Zookeeper')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.027] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('GlanceLoadBalancer = savage.jobs:GlanceLoadBalancer')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.027] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('FredConfiguration = savage.services.fred:FredConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.027] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CinderLoadBalancer = savage.jobs:CinderLoadBalancer')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.028] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Cron = savage.jobs:Cron')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.028] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('LibVirtd = savage.jobs:LibVirtd')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.028] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SSH = savage.jobs:SSH')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.029] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('SupportConfiguration = savage.services.support:SupportConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.029] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('CinderConfiguration = savage.jobs:CinderConfiguration')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.029] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Openstack = savage.jobs:Openstack')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.030] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Glance = savage.jobs:Glance')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.030] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('Disk = savage.jobs:Disk')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.030] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('PublicApi = savage.jobs.api:PublicApi')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.031] 1818/MainThread stevedore.extension/DEBUG: found extension EntryPoint.parse('WriteUUID = savage.jobs:WriteUUID')
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.066] 1818/MainThread savage.moxie.meta/DEBUG: requested enabled=frozenset(['group:YellinConfiguration'])
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.069] 1818/MainThread savage.moxie.meta/DEBUG: expanded enabled=set([<savage.jobs.auth.AuthStoreConfiguration(id=id:AuthStoreConfiguration)>, <savage.jobs.network.FirewallConfiguration(id=id:FirewallConfiguration)>, <savage.jobs.network.UpstreamServers(id=id:UpstreamServers)>, <savage.jobs.BasicConfiguration(id=id:BasicConfiguration)>, <savage.jobs.KeystoneConfiguration(id=id:KeystoneConfiguration)>, <savage.moxie.implicit.group:Basic(id=group:Basic)>, <savage.services.yellin.app.YellinConfiguration(id=group:YellinConfiguration)>, <savage.services.fred.ShutdownLevels(id=id:ShutdownLevels)>, <savage.jobs.DashboardConfiguration(id=id:DashboardConfiguration)>, <savage.moxie.implicit.group:FirewallConf(id=group:FirewallConf)>, <savage.moxie.implicit.group:NetworkConf(id=group:NetworkConf)>, <savage.moxie.implicit.group:SslConf(id=group:SslConf)>, <savage.moxie.implicit.group:KeystoneConf(id=group:KeystoneConf)>, <savage.jobs.MysqlConfiguration(id=id:MysqlConfiguration)>, <savage.jobs.network.NetworkConfiguration(id=id:NetworkConfiguration)>, <savage.jobs.BootNode(id=id:BootNode)>, <savage.jobs.SSLConfiguration(id=id:SSLConfiguration)>, <savage.moxie.implicit.group:BootNode(id=group:BootNode)>, <savage.moxie.implicit.group:MysqlConf(id=group:MysqlConf)>, <savage.jobs.CronConfiguration(id=id:CronConfiguration)>])
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.109] 1818/MainThread passlib.registry/DEBUG: registered 'md5_crypt' handler: <class 'passlib.handlers.md5_crypt.md5_crypt'>
Jul 8 19:28:35 10.35.0.2 [2014-07-08 19:28:35.111] 1818/MainThread passlib.registry/DEBUG: registered 'sha512_crypt' handler: <class 'passlib.handlers.sha2_crypt.sha512_crypt'>
Jul 8 19:28:35 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:35 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:35 10.35.0.13 nova-api 2014-07-08 19:28:35.339 4842 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:35 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:28:35 +0000] "GET /v2.0/tokens/a982c81ad1164944bdcc5ec4775561af HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:35 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:35 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:28:35 10.35.0.13 nova-api 2014-07-08 19:28:35.453 4842 INFO nova.osapi_compute.wsgi.server [req-0729de01-ece7-4e9e-a1f0-c7a64fd3b362 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1165159
Jul 8 19:28:35 10.35.0.15 nova-compute 2014-07-08 19:28:35.622 3328 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:28:35 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:28:36 10.35.0.15 nova-compute 2014-07-08 19:28:36.045 3328 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47217
Jul 8 19:28:36 10.35.0.15 nova-compute 2014-07-08 19:28:36.046 3328 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1861
Jul 8 19:28:36 10.35.0.15 nova-compute 2014-07-08 19:28:36.046 3328 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 3
Jul 8 19:28:36 10.35.0.15 nova-compute 2014-07-08 19:28:36.081 3328 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.15:node-10-35-0-15
Jul 8 19:28:36 10.35.0.15 nova-compute 2014-07-08 19:28:36.135 3328 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): checking
Jul 8 19:28:36 10.35.0.15 nova-compute 2014-07-08 19:28:36.135 3328 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): in use: on this node 1 local, 0 on other nodes sharing this instance storage
Jul 8 19:28:36 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf chown 43 /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:28:36 10.35.0.15 nova-compute 2014-07-08 19:28:36.324 3328 INFO nova.virt.libvirt.imagecache [-] Active base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:28:36 10.35.0.13 nova-compute 2014-07-08 19:28:36.969 4705 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:28:36 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:28:37 10.35.0.13 nova-compute 2014-07-08 19:28:37.335 4705 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 46705
Jul 8 19:28:37 10.35.0.13 nova-compute 2014-07-08 19:28:37.336 4705 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1860
Jul 8 19:28:37 10.35.0.13 nova-compute 2014-07-08 19:28:37.336 4705 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 2
Jul 8 19:28:37 10.35.0.13 nova-compute 2014-07-08 19:28:37.362 4705 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.13:node-10-35-0-13
Jul 8 19:28:37 10.35.0.13 nova-compute 2014-07-08 19:28:37.428 4705 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): checking
Jul 8 19:28:37 10.35.0.13 nova-compute 2014-07-08 19:28:37.428 4705 INFO nova.virt.libvirt.imagecache [-] image 8f23694a-2995-4070-a5e8-661de3207f2e at (/mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375): in use: on this node 2 local, 0 on other nodes sharing this instance storage
Jul 8 19:28:37 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf chown 43 /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:28:37 10.35.0.13 nova-compute 2014-07-08 19:28:37.629 4705 INFO nova.virt.libvirt.imagecache [-] Active base files: /mnt/novadisk/nova/instances/_base/afc9137c447f75f367d2a200ace85682280e8375
Jul 8 19:28:38 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:38 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:38 10.35.0.13 nova-api 2014-07-08 19:28:38.214 4840 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:38 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:28:38 +0000] "GET /v2.0/tokens/c2ed650f55214279b573fa094bb6528d HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:38 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:38 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5065 "" "python-novaclient"
Jul 8 19:28:38 10.35.0.13 nova-api 2014-07-08 19:28:38.330 4840 INFO nova.osapi_compute.wsgi.server [req-97cea485-d7d7-4f54-86d1-ed4de96eafc9 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5239 time: 0.1175518
Jul 8 19:28:38 10.35.0.13 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._refresh_host
Jul 8 19:28:38 10.35.0.14 [2014-07-08 19:28:38.660] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:28:38 10.35.0.13 cobalt-compute: instance DEBUG Lazy-loading `system_metadata' on Instance uuid 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:28:38 10.35.0.13 cobalt-compute: amqpdriver DEBUG MSG_ID is 8a59996570cb4ea1a25d403a810bc336
Jul 8 19:28:38 10.35.0.13 cobalt-compute: amqp DEBUG UNIQUE_ID is 67557022cc2a4c9b9c7b2e479c00df99.
Jul 8 19:28:38 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"8a59996570cb4ea1a25d403a810bc336\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-d1286800-f6ef-4133-825d-05c02e8b33f3\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": 2, \\"teardown\\": false, \\"host\\": \\"10.35.0.13\\"}, \\"_unique_id\\": \\"67557022cc2a4c9b9c7b2e479c00df99\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:28:38.643246\\", \\"_context_user_name\\": null, \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:28:38 10.35.0.13 cobalt-compute: messaging ERROR NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"8a59996570cb4ea1a25d403a810bc336\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-d1286800-f6ef-4133-825d-05c02e8b33f3\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": 2, \\"teardown\\": false, \\"host\\": \\"10.35.0.13\\"}, \\"_unique_id\\": \\"67557022cc2a4c9b9c7b2e479c00df99\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:28:38.643246\\", \\"_context_user_name\\": null, \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:28:38 10.35.0.13 cobalt-compute: channel DEBUG Closed channel #1
Jul 8 19:28:38 10.35.0.13 cobalt-compute: channel DEBUG using channel_id: 1
Jul 8 19:28:38 10.35.0.13 nova-network 2014-07-08 19:28:38.768 4876 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"8a59996570cb4ea1a25d403a810bc336\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-d1286800-f6ef-4133-825d-05c02e8b33f3\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": 2, \\"teardown\\": false, \\"host\\": \\"10.35.0.13\\"}, \\"_unique_id\\": \\"67557022cc2a4c9b9c7b2e479c00df99\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:28:38.643246\\", \\"_context_user_name\\": null, \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 30, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fdeb6472390>}
Jul 8 19:28:38 10.35.0.13 cobalt-compute: channel DEBUG Channel open
Jul 8 19:28:38 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"2cc5455f05e0486088ee12d57bfc15f6\\", \\"failure\\": \\"{\\\\\\"args\\\\\\": [\\\\\\"Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3\\\\\\"], \\\\\\"module\\\\\\": \\\\\\"oslo.messaging.exceptions\\\\\\", \\\\\\"kwargs\\\\\\": {}, \\\\\\"message\\\\\\": \\\\\\"Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3\\\\\\", \\\\\\"tb\\\\\\": [\\\\\\"Traceback (most recent call last):\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 133, in _dispatch_and_reply\\\\\\\\n incoming.message))\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 176, in _dispatch\\\\\\\\n return self._do_dispatch(endpoint, method, ctxt, args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 122, in _do_dispatch\\\\\\\\n result = getattr(endpoint, method)(ctxt, **new_args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/manager.py\\\\\\\\\\\\\\", line 1323, in setup_networks_on_host\\\\\\\\n gt.wait()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/greenthread.py\\\\\\\\\\\\\\", line 168, in wait\\\\\\\\n return self._exit_event.wait()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/event.py\\\\\\\\\\\\\\", line 116, in wait\\\\\\\\n return hubs.get_hub().switch()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/hubs/hub.py\\\\\\\\\\\\\\", line 187, in switch\\\\\\\\n return self.greenlet.switch()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/greenthread.py\\\\\\\\\\\\\\", line 194, in main\\\\\\\\n result = function(*args, **kwargs)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py\\\\\\\\\\\\\\", line 281, in rpc_setup_network_on_host\\\\\\\\n network_id=network_id, teardown=teardown)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py\\\\\\\\\\\\\\", line 150, in call\\\\\\\\n wait_for_reply=True, timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py\\\\\\\\\\\\\\", line 90, in _send\\\\\\\\n timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 412, in send\\\\\\\\n return self._send(target, ctxt, message, wait_for_reply, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 403, in _send\\\\\\\\n result = self._waiter.wait(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 267, in wait\\\\\\\\n reply, ending = self._poll_connection(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 217, in _poll_connection\\\\\\\\n % msg_id)\\\\\\\\n\\\\\\", \\\\\\"MessagingTimeout: Timed out waiting for a reply to 
message ID d9bd0f5d12014520a7165790429929c3\\\\\\\\n\\\\\\"], \\\\\\"class\\\\\\": \\\\\\"MessagingTimeout\\\\\\"}\\", \\"result\\": null, \\"_msg_id\\": \\"3303d69cf18e4bcb8ee69b3b8a4496af\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_ff10fb357a86427594db414dc3e5acc6', 'delivery_tag': 3, 'exchange': u'reply_ff10fb357a86427594db414dc3e5acc6'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d2864c50>}
Jul 8 19:28:38 10.35.0.13 nova-network 2014-07-08 19:28:38.769 4876 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [], \\"_msg_id\\": \\"8a59996570cb4ea1a25d403a810bc336\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-d1286800-f6ef-4133-825d-05c02e8b33f3\\", \\"_context_service_catalog\\": [], \\"args\\": {\\"instance_id\\": 2, \\"teardown\\": false, \\"host\\": \\"10.35.0.13\\"}, \\"_unique_id\\": \\"67557022cc2a4c9b9c7b2e479c00df99\\", \\"_context_user\\": null, \\"_context_user_id\\": null, \\"_context_project_name\\": null, \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": null, \\"_context_tenant\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": null, \\"_context_timestamp\\": \\"2014-07-08T19:28:38.643246\\", \\"_context_user_name\\": null, \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": null}", "oslo.version": "2.0"}'
Jul 8 19:28:38 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"2cc5455f05e0486088ee12d57bfc15f6\\", \\"failure\\": \\"{\\\\\\"args\\\\\\": [\\\\\\"Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3\\\\\\"], \\\\\\"module\\\\\\": \\\\\\"oslo.messaging.exceptions\\\\\\", \\\\\\"kwargs\\\\\\": {}, \\\\\\"message\\\\\\": \\\\\\"Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3\\\\\\", \\\\\\"tb\\\\\\": [\\\\\\"Traceback (most recent call last):\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 133, in _dispatch_and_reply\\\\\\\\n incoming.message))\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 176, in _dispatch\\\\\\\\n return self._do_dispatch(endpoint, method, ctxt, args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\\\\\\\\\\\\\", line 122, in _do_dispatch\\\\\\\\n result = getattr(endpoint, method)(ctxt, **new_args)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/manager.py\\\\\\\\\\\\\\", line 1323, in setup_networks_on_host\\\\\\\\n gt.wait()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/greenthread.py\\\\\\\\\\\\\\", line 168, in wait\\\\\\\\n return self._exit_event.wait()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/event.py\\\\\\\\\\\\\\", line 116, in wait\\\\\\\\n return hubs.get_hub().switch()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/hubs/hub.py\\\\\\\\\\\\\\", line 187, in switch\\\\\\\\n return self.greenlet.switch()\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/eventlet/greenthread.py\\\\\\\\\\\\\\", line 194, in main\\\\\\\\n result = function(*args, **kwargs)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py\\\\\\\\\\\\\\", line 281, in rpc_setup_network_on_host\\\\\\\\n network_id=network_id, teardown=teardown)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py\\\\\\\\\\\\\\", line 150, in call\\\\\\\\n wait_for_reply=True, timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py\\\\\\\\\\\\\\", line 90, in _send\\\\\\\\n timeout=timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 412, in send\\\\\\\\n return self._send(target, ctxt, message, wait_for_reply, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 403, in _send\\\\\\\\n result = self._waiter.wait(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 267, in wait\\\\\\\\n reply, ending = self._poll_connection(msg_id, timeout)\\\\\\\\n\\\\\\", \\\\\\" File \\\\\\\\\\\\\\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\\\\\\\\\\\\\", line 217, in _poll_connection\\\\\\\\n % msg_id)\\\\\\\\n\\\\\\", \\\\\\"MessagingTimeout: Timed out waiting for a reply 
to message ID d9bd0f5d12014520a7165790429929c3\\\\\\\\n\\\\\\"], \\\\\\"class\\\\\\": \\\\\\"MessagingTimeout\\\\\\"}\\", \\"result\\": null, \\"_msg_id\\": \\"3303d69cf18e4bcb8ee69b3b8a4496af\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:38 10.35.0.13 cobalt-compute: amqpdriver WARNING No calling threads waiting for msg_id : 3303d69cf18e4bcb8ee69b3b8a4496af, message : {u'_unique_id': u'2cc5455f05e0486088ee12d57bfc15f6', u'failure': u'{"args": ["Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3"], "module": "oslo.messaging.exceptions", "kwargs": {}, "message": "Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3", "tb": ["Traceback (most recent call last):\\n", " File \\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\", line 133, in _dispatch_and_reply\\n incoming.message))\\n", " File \\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\", line 176, in _dispatch\\n return self._do_dispatch(endpoint, method, ctxt, args)\\n", " File \\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/dispatcher.py\\", line 122, in _do_dispatch\\n result = getattr(endpoint, method)(ctxt, **new_args)\\n", " File \\"/usr/lib64/python2.7/site-packages/nova/network/manager.py\\", line 1323, in setup_networks_on_host\\n gt.wait()\\n", " File \\"/usr/lib64/python2.7/site-packages/eventlet/greenthread.py\\", line 168, in wait\\n return self._exit_event.wait()\\n", " File \\"/usr/lib64/python2.7/site-packages/eventlet/event.py\\", line 116, in wait\\n return hubs.get_hub().switch()\\n", " File \\"/usr/lib64/python2.7/site-packages/eventlet/hubs/hub.py\\", line 187, in switch\\n return self.greenlet.switch()\\n", " File \\"/usr/lib64/python2.7/site-packages/eventlet/greenthread.py\\", line 194, in main\\n result = function(*args, **kwargs)\\n", " File \\"/usr/lib64/python2.7/site-packages/nova/network/rpcapi.py\\", line 281, in rpc_setup_network_on_host\\n network_id=network_id, teardown=teardown)\\n", " File \\"/usr/lib64/python2.7/site-packages/oslo/messaging/rpc/client.py\\", line 150, in call\\n wait_for_reply=True, timeout=timeout)\\n", " File \\"/usr/lib64/python2.7/site-packages/oslo/messaging/transport.py\\", line 90, in _send\\n timeout=timeout)\\n", " File \\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\", line 412, in send\\n return self._send(target, ctxt, message, wait_for_reply, timeout)\\n", " File \\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\", line 403, in _send\\n result = self._waiter.wait(msg_id, timeout)\\n", " File \\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\", line 267, in wait\\n reply, ending = self._poll_connection(msg_id, timeout)\\n", " File \\"/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/amqpdriver.py\\", line 217, in _poll_connection\\n % msg_id)\\n", "MessagingTimeout: Timed out waiting for a reply to message ID d9bd0f5d12014520a7165790429929c3\\n"], "class": "MessagingTimeout"}', u'result': None}
Jul 8 19:28:38 10.35.0.13 cobalt-compute: amqpdriver WARNING _queues: {'8a59996570cb4ea1a25d403a810bc336': <Queue at 0x7ff0d293ca10 maxsize=None>}
Jul 8 19:28:38 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"b3f5fbea2282418ea920f244badad2bf\\", \\"failure\\": null, \\"_msg_id\\": \\"3303d69cf18e4bcb8ee69b3b8a4496af\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_ff10fb357a86427594db414dc3e5acc6', 'delivery_tag': 4, 'exchange': u'reply_ff10fb357a86427594db414dc3e5acc6'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d2864c50>}
Jul 8 19:28:38 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"b3f5fbea2282418ea920f244badad2bf\\", \\"failure\\": null, \\"_msg_id\\": \\"3303d69cf18e4bcb8ee69b3b8a4496af\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:28:38 10.35.0.13 cobalt-compute: amqpdriver WARNING No calling threads waiting for msg_id : 3303d69cf18e4bcb8ee69b3b8a4496af, message : {u'_unique_id': u'b3f5fbea2282418ea920f244badad2bf', u'failure': None, u'result': None, u'ending': True}
Jul 8 19:28:38 10.35.0.13 cobalt-compute: amqpdriver WARNING _queues: {'8a59996570cb4ea1a25d403a810bc336': <Queue at 0x7ff0d293ca10 maxsize=None>}
Jul 8 19:28:38 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf ip route replace 10.35.3.0/24 dev cloud table cloud
Jul 8 19:28:39 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf ip route replace 0.0.0.0/0 via 10.35.3.1 dev cloud table cloud
Jul 8 19:28:39 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf kill -HUP 7234
Jul 8 19:28:39 10.35.0.13 dnsmasq-dhcp[7234]: read /var/lib/dnsmasq/nova-cloud.conf
Jul 8 19:28:39 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:39 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:39 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:39 10.35.0.13 nova-network 2014-07-08 19:28:39.530 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"e8bdde5c4703434aa24c50131fe708e5\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"8a59996570cb4ea1a25d403a810bc336\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:39 10.35.0.13 nova-network 2014-07-08 19:28:39.531 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"e8bdde5c4703434aa24c50131fe708e5\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"8a59996570cb4ea1a25d403a810bc336\\"}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:28:39 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:39 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:39 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:39 10.35.0.13 nova-network 2014-07-08 19:28:39.532 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"225abf412b4b4c008e1d7674cc8a9eb0\\", \\"failure\\": null, \\"_msg_id\\": \\"8a59996570cb4ea1a25d403a810bc336\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:39 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"e8bdde5c4703434aa24c50131fe708e5\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"8a59996570cb4ea1a25d403a810bc336\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_ff10fb357a86427594db414dc3e5acc6', 'delivery_tag': 5, 'exchange': u'reply_ff10fb357a86427594db414dc3e5acc6'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d2864c50>}
Jul 8 19:28:39 10.35.0.13 nova-network 2014-07-08 19:28:39.532 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"225abf412b4b4c008e1d7674cc8a9eb0\\", \\"failure\\": null, \\"_msg_id\\": \\"8a59996570cb4ea1a25d403a810bc336\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:28:39 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"e8bdde5c4703434aa24c50131fe708e5\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"8a59996570cb4ea1a25d403a810bc336\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:39 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"225abf412b4b4c008e1d7674cc8a9eb0\\", \\"failure\\": null, \\"_msg_id\\": \\"8a59996570cb4ea1a25d403a810bc336\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_ff10fb357a86427594db414dc3e5acc6', 'delivery_tag': 6, 'exchange': u'reply_ff10fb357a86427594db414dc3e5acc6'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d2864c50>}
Jul 8 19:28:39 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"225abf412b4b4c008e1d7674cc8a9eb0\\", \\"failure\\": null, \\"_msg_id\\": \\"8a59996570cb4ea1a25d403a810bc336\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:28:39 10.35.0.13 cobalt-compute: periodic_task DEBUG Running periodic task CobaltManager._clean
Jul 8 19:28:39 10.35.0.13 cobalt-compute: vmsconn DEBUG Starting to clean symlinks in /dev/nova
Jul 8 19:28:39 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/ce77d052-f021-44a2-b5f3-5908ec04fcc7_disk
Jul 8 19:28:39 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/5ee78913-e655-4cb1-a92c-500639c819ad_disk
Jul 8 19:28:39 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/libvirt_save
Jul 8 19:28:39 10.35.0.13 cobalt-compute: vmsconn DEBUG Link is still active /dev/nova/novadisk
Jul 8 19:28:39 10.35.0.13 cobalt-compute: loopingcall DEBUG Dynamic looping call sleeping for 60.00 seconds
Jul 8 19:28:39 10.35.0.13 [2014-07-08 19:28:39.924] 941/MainThread savage.node-evacuation/INFO: Requesting status refresh of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:28:40 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:28:39 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" 200 1696 "" "python-novaclient"
Jul 8 19:28:40 10.35.0.14 nova-api 2014-07-08 19:28:40.031 6474 INFO nova.osapi_compute.wsgi.server [req-f9a18822-60eb-4c86-8e39-2353e5869659 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad HTTP/1.1" status: 200 len: 1870 time: 0.1029961
Jul 8 19:28:40 10.35.0.13 [2014-07-08 19:28:40.033] 941/MainThread savage.Fred/INFO: Waiting for VMs to migrate.
Jul 8 19:28:40 10.35.0.13 [2014-07-08 19:28:40.034] 941/MainThread savage/WARNING: Failed to run command "evacuate(<savage.command.commands.gc.EvacuationManager object at 0x7fa847b9f190>, True, progress=<savage.services.fred._ProgressTracker object at 0x7fa8434ace90>)": savage.command.commands.gc.RecheckVMs(There might still be some vms left.) (took 226.8s) (try 1/inf, retry in 0.8s)
Jul 8 19:28:40 10.35.0.13 [2014-07-08 19:28:40.873] 941/MainThread savage.node-evacuation/INFO: Called evacuate
Jul 8 19:28:40 10.35.0.13 [2014-07-08 19:28:40.874] 941/MainThread savage.Fred/INFO: Checking for VMs to migrate.
Jul 8 19:28:40 10.35.0.13 [2014-07-08 19:28:40.874] 941/MainThread savage.node-evacuation/INFO: Called list
Jul 8 19:28:40 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:28:40 +0000] "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/detail?all_tenants=True&host=10.35.0.13 HTTP/1.1" 200 3384 "" "python-novaclient"
Jul 8 19:28:40 10.35.0.15 nova-api 2014-07-08 19:28:40.966 3442 INFO nova.osapi_compute.wsgi.server [req-ad89c3fb-872e-4a5a-9aeb-389d8b5144f8 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "GET /v2/9a663247f58a4412a840385af8d9e73a/servers/detail?all_tenants=True&host=10.35.0.13 HTTP/1.1" status: 200 len: 3558 time: 0.0860031
Jul 8 19:28:40 10.35.0.13 [2014-07-08 19:28:40.966] 941/MainThread savage.node-evacuation/INFO: list returned [<Server: vm-39852899123>, <Server: vm-3904852117>]
Jul 8 19:28:40 10.35.0.13 [2014-07-08 19:28:40.966] 941/MainThread savage.node-evacuation/INFO: Requesting gc migration of VM id 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:28:41 10.35.0.15 nova-api 2014-07-08 19:28:41.063 3442 ERROR nova.cobalt.api [req-585393f6-c238-40c5-88fe-fbf1dd0c48e5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: migrate_instance instance_uuid=5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:28:41 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:40 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:41 10.35.0.15 nova-api 2014-07-08 19:28:41.095 3440 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:41 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.6 - - [08/Jul/2014:19:28:41 +0000] "GET /v2.0/tokens/1aea3c24a08e403db07188584a824621 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:41 10.35.0.15 nova-api 2014-07-08 19:28:41.170 3442 ERROR nova.cobalt.api [req-585393f6-c238-40c5-88fe-fbf1dd0c48e5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: migrate_instance dest=10.35.0.15 set state=MIGRATING
Jul 8 19:28:41 10.35.0.15 nova-api 2014-07-08 19:28:41.171 3442 ERROR oslo.messaging._drivers.impl_rabbit [-] Failed to publish message to topic 'cobalt.10.35.0.13': [Errno 104] Connection reset by peer
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit Traceback (most recent call last):
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 672, in ensure
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit return method(*args, **kwargs)
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 768, in _publish
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit publisher = cls(self.conf, self.channel, topic, **kwargs)
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 429, in __init__
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit **options)
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 376, in __init__
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit self.reconnect(channel)
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 384, in reconnect
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit routing_key=self.routing_key)
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/messaging.py", line 85, in __init__
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit self.revive(self._channel)
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/messaging.py", line 224, in revive
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit self.declare()
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/messaging.py", line 105, in declare
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit self.exchange.declare()
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/entity.py", line 166, in declare
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit nowait=nowait, passive=passive,
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/channel.py", line 613, in exchange_declare
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit self._send_method((40, 10), args)
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/abstract_channel.py", line 56, in _send_method
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit self.channel_id, method_sig, args, content,
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/method_framing.py", line 221, in write_method
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit write_frame(1, channel, payload)
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/transport.py", line 177, in write_frame
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit frame_type, channel, size, payload, 0xce,
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/eventlet/greenio.py", line 307, in sendall
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit tail = self.send(data, flags)
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/eventlet/greenio.py", line 293, in send
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit total_sent += fd.send(data[total_sent:], flags)
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit error: [Errno 104] Connection reset by peer
2014-07-08 19:28:41.171 3442 TRACE oslo.messaging._drivers.impl_rabbit
Jul 8 19:28:41 10.35.0.15 nova-api 2014-07-08 19:28:41.174 3442 INFO oslo.messaging._drivers.impl_rabbit [-] Reconnecting to AMQP server on 10.35.0.3:5672
Jul 8 19:28:41 10.35.0.15 nova-api 2014-07-08 19:28:41.175 3442 INFO oslo.messaging._drivers.impl_rabbit [-] Delaying reconnect for 1.0 seconds...
Jul 8 19:28:41 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:41 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5070 "" "python-novaclient"
Jul 8 19:28:41 10.35.0.15 nova-api 2014-07-08 19:28:41.211 3440 INFO nova.osapi_compute.wsgi.server [req-646e252e-b498-434d-ba7c-b9fe55dcef12 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5244 time: 0.1184280
Jul 8 19:28:41 10.35.0.2 dhcpd: DHCPREQUEST for 10.35.1.129 from 00:25:90:8b:3f:a7 via management
Jul 8 19:28:41 10.35.0.2 dhcpd: DHCPACK on 10.35.1.129 to 00:25:90:8b:3f:a7 via management
Jul 8 19:28:42 10.35.0.13 rabbitmq.log: =INFO REPORT==== 8-Jul-2014::19:28:42 ===
Jul 8 19:28:42 10.35.0.13 rabbitmq.log: accepting AMQP connection <0.2267.0> (10.35.0.15:41620 -> 10.35.0.3:5672)
Jul 8 19:28:42 10.35.0.15 nova-api 2014-07-08 19:28:42.195 3442 INFO oslo.messaging._drivers.impl_rabbit [-] Connected to AMQP server on 10.35.0.3:5672
Jul 8 19:28:42 10.35.0.15 nova-api 2014-07-08 19:28:42.197 3442 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"dest\\": \\"10.35.0.15\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"acec40fea9304cffa221ad628b72c509\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.15 nova-api 2014-07-08 19:28:42.197 3442 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"dest\\": \\"10.35.0.15\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"acec40fea9304cffa221ad628b72c509\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key=u'cobalt.10.35.0.13'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"dest\\": \\"10.35.0.15\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"acec40fea9304cffa221ad628b72c509\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'2', 'redelivered': False, 'routing_key': u'cobalt.10.35.0.13', 'delivery_tag': 3, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d3711150>}
Jul 8 19:28:42 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:28:40 +0000] "POST /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad/action HTTP/1.1" 200 - "" "python-novaclient"
Jul 8 19:28:42 10.35.0.13 cobalt-compute: common DEBUG received {u'_context_roles': [u'Admin', u'_member_', u'ProjectAdmin', u'admin'], u'_context_request_id': u'req-585393f6-c238-40c5-88fe-fbf1dd0c48e5', u'_context_quota_class': None, u'_context_instance_lock_checked': False, u'_context_project_name': u'service', u'_context_service_catalog': [{u'endpoints_links': [], u'endpoints': [{u'adminURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'region': u'RegionOne', u'internalURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'publicURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a'}], u'type': u'volume', u'name': u'Volume Service'}], u'args': {u'instance_uuid': u'5ee78913-e655-4cb1-a92c-500639c819ad', u'dest': u'10.35.0.15'}, u'_context_tenant': u'9a663247f58a4412a840385af8d9e73a', u'_context_user': u'260659016df54f47833004653cfe0cb8', u'_context_auth_token': '<SANITIZED>', u'method': u'migrate_instance', u'_context_is_admin': True, u'version': u'2.0', u'_context_project_id': u'9a663247f58a4412a840385af8d9e73a', u'_context_timestamp': u'2014-07-08T19:28:40.972908', u'_unique_id': u'acec40fea9304cffa221ad628b72c509', u'_context_read_deleted': u'no', u'_context_user_id': u'260659016df54f47833004653cfe0cb8', u'_context_user_name': u'nova', u'_context_remote_address': u'10.35.0.13'}
Jul 8 19:28:42 10.35.0.15 nova-api 2014-07-08 19:28:42.200 3442 INFO nova.osapi_compute.wsgi.server [req-585393f6-c238-40c5-88fe-fbf1dd0c48e5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "POST /v2/9a663247f58a4412a840385af8d9e73a/servers/5ee78913-e655-4cb1-a92c-500639c819ad/action HTTP/1.1" status: 200 len: 179 time: 1.2287190
Jul 8 19:28:42 10.35.0.13 cobalt-compute: common DEBUG unpacked context: {'tenant': u'9a663247f58a4412a840385af8d9e73a', 'project_name': u'service', 'user_id': u'260659016df54f47833004653cfe0cb8', 'roles': [u'Admin', u'_member_', u'ProjectAdmin', u'admin'], 'timestamp': u'2014-07-08T19:28:40.972908', 'auth_token': '<SANITIZED>', 'remote_address': u'10.35.0.13', 'quota_class': None, 'is_admin': True, 'user': u'260659016df54f47833004653cfe0cb8', 'service_catalog': [{u'endpoints': [{u'adminURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'region': u'RegionOne', u'internalURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'publicURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a'}], u'endpoints_links': [], u'type': u'volume', u'name': u'Volume Service'}], 'request_id': u'req-585393f6-c238-40c5-88fe-fbf1dd0c48e5', 'instance_lock_checked': False, 'project_id': u'9a663247f58a4412a840385af8d9e73a', 'user_name': u'nova', 'read_deleted': u'no'}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"dest\\": \\"10.35.0.15\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"acec40fea9304cffa221ad628b72c509\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.13 [2014-07-08 19:28:42.199] 941/MainThread savage.Fred/INFO: Requested VM migrations.
Jul 8 19:28:42 10.35.0.13 [2014-07-08 19:28:42.199] 941/MainThread savage.node-evacuation/INFO: Requesting gc migration of VM id ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG migrate_instance called: {u'instance_uuid': u'5ee78913-e655-4cb1-a92c-500639c819ad', u'dest': u'10.35.0.15', 'instance': <nova.objects.instance.Instance object at 0x7ff0d2ee5950>}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG Locking instance 5ee78913-e655-4cb1-a92c-500639c819ad (fn:migrate_instance)
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG Acquiring lock for instance 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG Acquired lock for instance 5ee78913-e655-4cb1-a92c-500639c819ad (me: 140672304066384, refcount=1)
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance=5ee78913-e655-4cb1-a92c-500639c819ad calling pre_migrate
Jul 8 19:28:42 10.35.0.13 cobalt-compute: hooks INFO Calling hooks for pre_migrate action
Jul 8 19:28:42 10.35.0.13 cobalt-compute: hooks INFO Done calling hooks for pre_migrate action
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='5ee78913-e655-4cb1-a92c-500639c819ad' self.host='10.35.0.13' instance.host=u'10.35.0.13'
Jul 8 19:28:42 10.35.0.15 nova-api 2014-07-08 19:28:42.299 3442 ERROR nova.cobalt.api [req-e9cd61c8-c319-45ab-a742-f01416bbb0ff 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: migrate_instance instance_uuid=ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='5ee78913-e655-4cb1-a92c-500639c819ad' migration_address='10.35.0.13'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='5ee78913-e655-4cb1-a92c-500639c819ad' calling get_instance_nw_info
Jul 8 19:28:42 10.35.0.13 cobalt-compute: amqpdriver DEBUG MSG_ID is 76569656e91a44c6b3b7e389de570b7a
Jul 8 19:28:42 10.35.0.13 cobalt-compute: amqp DEBUG UNIQUE_ID is 6fe888e7624a4856be125e58fd172dbd.
Jul 8 19:28:42 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"76569656e91a44c6b3b7e389de570b7a\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"6fe888e7624a4856be125e58fd172dbd\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: messaging ERROR NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"76569656e91a44c6b3b7e389de570b7a\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"6fe888e7624a4856be125e58fd172dbd\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: channel DEBUG Closed channel #1
Jul 8 19:28:42 10.35.0.13 cobalt-compute: channel DEBUG using channel_id: 1
Jul 8 19:28:42 10.35.0.13 cobalt-compute: channel DEBUG Channel open
Jul 8 19:28:42 10.35.0.14 nova-network 2014-07-08 19:28:42.300 6514 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"76569656e91a44c6b3b7e389de570b7a\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"6fe888e7624a4856be125e58fd172dbd\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 30, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f5c1424f1d0>}
Jul 8 19:28:42 10.35.0.14 nova-network 2014-07-08 19:28:42.302 6514 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"76569656e91a44c6b3b7e389de570b7a\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"6fe888e7624a4856be125e58fd172dbd\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.15 nova-api 2014-07-08 19:28:42.401 3442 ERROR nova.cobalt.api [req-e9cd61c8-c319-45ab-a742-f01416bbb0ff 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: migrate_instance dest=10.35.0.15 set state=MIGRATING
Jul 8 19:28:42 10.35.0.15 nova-api 2014-07-08 19:28:42.403 3442 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"dest\\": \\"10.35.0.15\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"7fec3b89a99849c28d16839cd03e2d67\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.15 nova-api 2014-07-08 19:28:42.404 3442 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"dest\\": \\"10.35.0.15\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"7fec3b89a99849c28d16839cd03e2d67\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key=u'cobalt.10.35.0.13'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"dest\\": \\"10.35.0.15\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"7fec3b89a99849c28d16839cd03e2d67\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'2', 'redelivered': False, 'routing_key': u'cobalt.10.35.0.13', 'delivery_tag': 4, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d3711150>}
Jul 8 19:28:42 10.35.0.13 pound: 10.35.2.8:8774 10.35.2.7 - - [08/Jul/2014:19:28:42 +0000] "POST /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7/action HTTP/1.1" 200 - "" "python-novaclient"
Jul 8 19:28:42 10.35.0.13 cobalt-compute: common DEBUG received {u'_context_roles': [u'Admin', u'_member_', u'ProjectAdmin', u'admin'], u'_context_request_id': u'req-e9cd61c8-c319-45ab-a742-f01416bbb0ff', u'_context_quota_class': None, u'_context_instance_lock_checked': False, u'_context_project_name': u'service', u'_context_service_catalog': [{u'endpoints_links': [], u'endpoints': [{u'adminURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'region': u'RegionOne', u'internalURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'publicURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a'}], u'type': u'volume', u'name': u'Volume Service'}], u'args': {u'instance_uuid': u'ce77d052-f021-44a2-b5f3-5908ec04fcc7', u'dest': u'10.35.0.15'}, u'_context_tenant': u'9a663247f58a4412a840385af8d9e73a', u'_context_user': u'260659016df54f47833004653cfe0cb8', u'_context_auth_token': '<SANITIZED>', u'method': u'migrate_instance', u'_context_is_admin': True, u'version': u'2.0', u'_context_project_id': u'9a663247f58a4412a840385af8d9e73a', u'_context_timestamp': u'2014-07-08T19:28:42.206685', u'_unique_id': u'7fec3b89a99849c28d16839cd03e2d67', u'_context_read_deleted': u'no', u'_context_user_id': u'260659016df54f47833004653cfe0cb8', u'_context_user_name': u'nova', u'_context_remote_address': u'10.35.0.13'}
Jul 8 19:28:42 10.35.0.15 nova-api 2014-07-08 19:28:42.406 3442 INFO nova.osapi_compute.wsgi.server [req-e9cd61c8-c319-45ab-a742-f01416bbb0ff 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] 10.35.2.7,10.35.0.13 "POST /v2/9a663247f58a4412a840385af8d9e73a/servers/ce77d052-f021-44a2-b5f3-5908ec04fcc7/action HTTP/1.1" status: 200 len: 179 time: 0.2013800
Jul 8 19:28:42 10.35.0.13 cobalt-compute: common DEBUG unpacked context: {'tenant': u'9a663247f58a4412a840385af8d9e73a', 'project_name': u'service', 'user_id': u'260659016df54f47833004653cfe0cb8', 'roles': [u'Admin', u'_member_', u'ProjectAdmin', u'admin'], 'timestamp': u'2014-07-08T19:28:42.206685', 'auth_token': '<SANITIZED>', 'remote_address': u'10.35.0.13', 'quota_class': None, 'is_admin': True, 'user': u'260659016df54f47833004653cfe0cb8', 'service_catalog': [{u'endpoints': [{u'adminURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'region': u'RegionOne', u'internalURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a', u'publicURL': u'http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a'}], u'endpoints_links': [], u'type': u'volume', u'name': u'Volume Service'}], 'request_id': u'req-e9cd61c8-c319-45ab-a742-f01416bbb0ff', 'instance_lock_checked': False, 'project_id': u'9a663247f58a4412a840385af8d9e73a', 'user_name': u'nova', 'read_deleted': u'no'}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_context_project_name\\": \\"service\\", \\"_context_quota_class\\": null, \\"_context_instance_lock_checked\\": false, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"_context_user_name\\": \\"nova\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"args\\": {\\"instance_uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"dest\\": \\"10.35.0.15\\"}, \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_unique_id\\": \\"7fec3b89a99849c28d16839cd03e2d67\\", \\"_context_is_admin\\": true, \\"version\\": \\"2.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_read_deleted\\": \\"no\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"method\\": \\"migrate_instance\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.13 [2014-07-08 19:28:42.406] 941/MainThread savage.Fred/INFO: Requested VM migrations.
Jul 8 19:28:42 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:42 10.35.0.14 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:42 10.35.0.14 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:42 10.35.0.14 nova-network 2014-07-08 19:28:42.462 6514 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"b781b714b6ce4285af4ef95e94b2244a\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"76569656e91a44c6b3b7e389de570b7a\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.14 nova-network 2014-07-08 19:28:42.463 6514 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"b781b714b6ce4285af4ef95e94b2244a\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"76569656e91a44c6b3b7e389de570b7a\\"}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:28:42 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:42 10.35.0.14 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:42 10.35.0.14 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:42 10.35.0.14 nova-network 2014-07-08 19:28:42.464 6514 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"335e5596978645188c6bf7522f23ed81\\", \\"failure\\": null, \\"_msg_id\\": \\"76569656e91a44c6b3b7e389de570b7a\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.14 nova-network 2014-07-08 19:28:42.464 6514 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"335e5596978645188c6bf7522f23ed81\\", \\"failure\\": null, \\"_msg_id\\": \\"76569656e91a44c6b3b7e389de570b7a\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"b781b714b6ce4285af4ef95e94b2244a\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"76569656e91a44c6b3b7e389de570b7a\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_ff10fb357a86427594db414dc3e5acc6', 'delivery_tag': 7, 'exchange': u'reply_ff10fb357a86427594db414dc3e5acc6'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d2864c50>}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"b781b714b6ce4285af4ef95e94b2244a\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"76569656e91a44c6b3b7e389de570b7a\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"335e5596978645188c6bf7522f23ed81\\", \\"failure\\": null, \\"_msg_id\\": \\"76569656e91a44c6b3b7e389de570b7a\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_ff10fb357a86427594db414dc3e5acc6', 'delivery_tag': 8, 'exchange': u'reply_ff10fb357a86427594db414dc3e5acc6'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d2864c50>}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"335e5596978645188c6bf7522f23ed81\\", \\"failure\\": null, \\"_msg_id\\": \\"76569656e91a44c6b3b7e389de570b7a\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: api DEBUG Updating cache with info: [VIF({'ovs_interfaceid': None, 'network': Network({'bridge': u'cloud', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.35.3.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.35.3.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.35.3.0/24', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.35.3.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.35.3.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True}, 'id': u'933f56fa-7929-413f-ab40-9c7c91cb840b', 'label': u'private'}), 'devname': None, 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:da:1a:ff', 'active': False, 'type': u'bridge', 'id': u'615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72', 'qbg_params': None})]
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG migrate_instance called: {u'instance_uuid': u'ce77d052-f021-44a2-b5f3-5908ec04fcc7', u'dest': u'10.35.0.15', 'instance': <nova.objects.instance.Instance object at 0x7ff0d34fc3d0>}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG Locking instance ce77d052-f021-44a2-b5f3-5908ec04fcc7 (fn:migrate_instance)
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG Acquiring lock for instance ce77d052-f021-44a2-b5f3-5908ec04fcc7
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG Acquired lock for instance ce77d052-f021-44a2-b5f3-5908ec04fcc7 (me: 140672292366768, refcount=1)
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance=ce77d052-f021-44a2-b5f3-5908ec04fcc7 calling pre_migrate
Jul 8 19:28:42 10.35.0.13 cobalt-compute: hooks INFO Calling hooks for pre_migrate action
Jul 8 19:28:42 10.35.0.13 cobalt-compute: hooks INFO Done calling hooks for pre_migrate action
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='ce77d052-f021-44a2-b5f3-5908ec04fcc7' self.host='10.35.0.13' instance.host=u'10.35.0.13'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='ce77d052-f021-44a2-b5f3-5908ec04fcc7' migration_address='10.35.0.13'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='ce77d052-f021-44a2-b5f3-5908ec04fcc7' calling get_instance_nw_info
Jul 8 19:28:42 10.35.0.13 cobalt-compute: amqpdriver DEBUG MSG_ID is 4fbd991c37934e068e532109c938376e
Jul 8 19:28:42 10.35.0.13 cobalt-compute: amqp DEBUG UNIQUE_ID is 295ab4ed01a348d0a9f0ccbc6137fb0a.
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='5ee78913-e655-4cb1-a92c-500639c819ad' get_instance_nw_info returned network_info=[VIF({'ovs_interfaceid': None, 'network': Network({'bridge': u'cloud', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.35.3.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.35.3.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.35.3.0/24', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.35.3.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.35.3.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True}, 'id': u'933f56fa-7929-413f-ab40-9c7c91cb840b', 'label': u'private'}), 'devname': None, 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:da:1a:ff', 'active': False, 'type': u'bridge', 'id': u'615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72', 'qbg_params': None})]
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='5ee78913-e655-4cb1-a92c-500639c819ad' calling _instance_update system_metadata={u'gc_src_host': '10.35.0.13', u'logical_volumes': u'5ee78913-e655-4cb1-a92c-500639c819ad_disk:1073741824', u'instance_type_memory_mb': u'512', u'instance_type_swap': u'0', u'instance_type_vcpu_weight': None, u'instance_type_root_gb': u'1', u'instance_type_name': u'm1.tiny', u'instance_type_id': u'2', u'instance_type_ephemeral_gb': u'0', u'instance_type_rxtx_factor': u'1.0', u'clean_attempts': u'1', u'image_disk_format': u'qcow2', u'gc_dst_host': u'10.35.0.15', u'network_allocated': u'True', u'instance_type_flavorid': u'1', u'instance_type_vcpus': u'1', u'images': u'725c4da6-c116-4d31-8acd-2f6b147dec93,9d0edc56-9927-4102-a9b1-6b08bc0ead91', u'image_container_format': u'bare', u'image_min_ram': u'0', u'image_min_disk': u'1', u'image_base_image_ref': u'8f23694a-2995-4070-a5e8-661de3207f2e'}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"4fbd991c37934e068e532109c938376e\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"295ab4ed01a348d0a9f0ccbc6137fb0a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: messaging ERROR NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"4fbd991c37934e068e532109c938376e\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"295ab4ed01a348d0a9f0ccbc6137fb0a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:28:42 10.35.0.15 nova-network 2014-07-08 19:28:42.618 3465 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"4fbd991c37934e068e532109c938376e\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"295ab4ed01a348d0a9f0ccbc6137fb0a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 30, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7eff2d221290>}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='5ee78913-e655-4cb1-a92c-500639c819ad' calling pre_live_migration
Jul 8 19:28:42 10.35.0.15 nova-network 2014-07-08 19:28:42.620 3465 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"4fbd991c37934e068e532109c938376e\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.13\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"295ab4ed01a348d0a9f0ccbc6137fb0a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: amqpdriver DEBUG MSG_ID is 60a700beb6074e6e9999235c9252007c
Jul 8 19:28:42 10.35.0.13 cobalt-compute: amqp DEBUG UNIQUE_ID is 7ea0c9431d6d4d38966180617810f45a.
Jul 8 19:28:42 10.35.0.13 cobalt-compute: amqp DEBUG Pool creating new connection
Jul 8 19:28:42 10.35.0.13 cobalt-compute: channel DEBUG Closed channel #1
Jul 8 19:28:42 10.35.0.13 cobalt-compute: channel DEBUG using channel_id: 1
Jul 8 19:28:42 10.35.0.13 rabbitmq.log: =INFO REPORT==== 8-Jul-2014::19:28:42 ===
Jul 8 19:28:42 10.35.0.13 rabbitmq.log: accepting AMQP connection <0.2292.0> (10.35.0.13:42941 -> 10.35.0.3:5672)
Jul 8 19:28:42 10.35.0.13 cobalt-compute: connection DEBUG Start from server, version: 0.9, properties: {u'information': u'Licensed under the MPL. See http://www.rabbitmq.com/', u'product': u'RabbitMQ', u'copyright': u'Copyright (C) 2007-2013 VMware, Inc.', u'capabilities': {u'exchange_exchange_bindings': True, u'consumer_cancel_notify': True, u'publisher_confirms': True, u'basic.nack': True}, u'platform': u'Erlang/OTP', u'version': u'3.1.3'}, mechanisms: [u'PLAIN', u'AMQPLAIN'], locales: [u'en_US']
Jul 8 19:28:42 10.35.0.13 cobalt-compute: channel DEBUG Channel open
Jul 8 19:28:42 10.35.0.13 cobalt-compute: connection DEBUG Open OK!
Jul 8 19:28:42 10.35.0.13 cobalt-compute: channel DEBUG using channel_id: 1
Jul 8 19:28:42 10.35.0.13 cobalt-compute: channel DEBUG Channel open
Jul 8 19:28:42 10.35.0.13 cobalt-compute: impl_rabbit INFO Connected to AMQP server on 10.35.0.3:5672
Jul 8 19:28:42 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"60a700beb6074e6e9999235c9252007c\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance\\": {\\"nova_object.version\\": \\"1.13\\", \\"nova_object.changes\\": [\\"host\\"], \\"nova_object.name\\": \\"Instance\\", \\"nova_object.data\\": {\\"vm_state\\": \\"active\\", \\"availability_zone\\": null, \\"terminated_at\\": null, \\"ephemeral_gb\\": 0, \\"instance_type_id\\": 2, \\"user_data\\": null, \\"cleaned\\": true, \\"vm_mode\\": null, \\"deleted_at\\": null, \\"reservation_id\\": \\"r-0uipv92o\\", \\"id\\": 2, \\"security_groups\\": {\\"nova_object.version\\": \\"1.0\\", \\"nova_object.name\\": \\"SecurityGroupList\\", \\"nova_object.data\\": {\\"objects\\": [{\\"nova_object.version\\": \\"1.1\\", \\"nova_object.name\\": \\"SecurityGroup\\", \\"nova_object.data\\": {\\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"description\\": \\"default\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:33:56Z\\", \\"updated_at\\": null, \\"deleted_at\\": null, \\"id\\": 1, \\"name\\": \\"default\\"}, \\"nova_object.namespace\\": \\"nova\\"}]}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disable_terminate\\": false, \\"display_name\\": \\"vm-39852899123\\", \\"uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"default_swap_device\\": null, \\"info_cache\\": {\\"nova_object.version\\": \\"1.5\\", \\"nova_object.name\\": \\"InstanceInfoCache\\", \\"nova_object.data\\": {\\"instance_uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:39:21Z\\", \\"updated_at\\": \\"2014-07-08T18:44:54Z\\", \\"network_info\\": \\"[{\\\\\\"ovs_interfaceid\\\\\\": null, \\\\\\"network\\\\\\": {\\\\\\"bridge\\\\\\": \\\\\\"cloud\\\\\\", \\\\\\"subnets\\\\\\": [{\\\\\\"ips\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"fixed\\\\\\", \\\\\\"floating_ips\\\\\\": [], \\\\\\"address\\\\\\": \\\\\\"10.35.3.4\\\\\\"}], \\\\\\"version\\\\\\": 4, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"dns\\\\\\", \\\\\\"address\\\\\\": \\\\\\"8.8.4.4\\\\\\"}], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": \\\\\\"10.35.3.0/24\\\\\\", \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": \\\\\\"10.35.3.1\\\\\\"}}, {\\\\\\"ips\\\\\\": [], \\\\\\"version\\\\\\": null, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": null, \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": null, \\\\\\"type\\\\\\": \\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": null}}], 
\\\\\\"meta\\\\\\": {\\\\\\"tenant_id\\\\\\": null, \\\\\\"multi_host\\\\\\": true, \\\\\\"should_create_bridge\\\\\\": true}, \\\\\\"id\\\\\\": \\\\\\"933f56fa-7929-413f-ab40-9c7c91cb840b\\\\\\", \\\\\\"label\\\\\\": \\\\\\"private\\\\\\"}, \\\\\\"devname\\\\\\": null, \\\\\\"qbh_params\\\\\\": null, \\\\\\"meta\\\\\\": {}, \\\\\\"details\\\\\\": {}, \\\\\\"address\\\\\\": \\\\\\"fa:16:3e:da:1a:ff\\\\\\", \\\\\\"active\\\\\\": false, \\\\\\"type\\\\\\": \\\\\\"bridge\\\\\\", \\\\\\"id\\\\\\": \\\\\\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\\\\\", \\\\\\"qbg_params\\\\\\": null}]\\", \\"deleted_at\\": null}, \\"nova_object.namespace\\": \\"nova\\"}, \\"hostname\\": \\"vm-39852899123\\", \\"launched_on\\": \\"10.35.0.15\\", \\"display_description\\": \\"vm-39852899123\\", \\"key_data\\": \\"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCZ0xCXPLLD8BFIxM8ZO3PxjxlOvrCi2dVZ+9WHUhbc/vSo+eF3Q8J7uuO8U1d9AvDaP1GepfTTVSrFg2XPbnqURvidIuAPXzPJoECYGJyaMfxeaS9qhiIZVRK7GwWPj2cqEtUEDMu6JsUyoc5uzYvFobnrPQlVYEy7jsX0OgBDReH+hz3jY2r2gnbUoGAfSINPiqfSNCxwYUL7/WAHEjv18YPZPCCwV9OwgzB42Z3qWlEL9E6CW3ITLQeNnnHrI/xAoJdcNbqxPYTHHwrpxO0/XOWJhnvxKs8czgiqaXojc7TZTv4YMPcQs7tR/TxF7KmCq0VKM5F8u07VuuoovsuL Generated by Nova\\\\n\\", \\"kernel_id\\": \\"\\", \\"power_state\\": 1, \\"default_ephemeral_device\\": null, \\"progress\\": 0, \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"launched_at\\": \\"2014-07-08T18:39:37Z\\", \\"config_drive\\": \\"\\", \\"node\\": \\"node-10-35-0-13\\", \\"ramdisk_id\\": \\"\\", \\"access_ip_v6\\": null, \\"access_ip_v4\\": null, \\"deleted\\": false, \\"key_name\\": \\"keypair-20849026277\\", \\"updated_at\\": \\"2014-07-08T19:28:41Z\\", \\"host\\": \\"10.35.0.15\\", \\"ephemeral_key_uuid\\": null, \\"architecture\\": null, \\"user_id\\": \\"579c706cdd354c3bace1d0e30ac24d59\\", \\"system_metadata\\": {\\"gc_src_host\\": \\"10.35.0.13\\", \\"logical_volumes\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad_disk:1073741824\\", \\"instance_type_memory_mb\\": \\"512\\", \\"instance_type_swap\\": \\"0\\", \\"instance_type_vcpu_weight\\": null, \\"instance_type_root_gb\\": \\"1\\", \\"instance_type_id\\": \\"2\\", \\"instance_type_name\\": \\"m1.tiny\\", \\"instance_type_ephemeral_gb\\": \\"0\\", \\"instance_type_rxtx_factor\\": \\"1.0\\", \\"clean_attempts\\": \\"1\\", \\"image_min_disk\\": \\"1\\", \\"gc_dst_host\\": \\"10.35.0.15\\", \\"network_allocated\\": \\"True\\", \\"instance_type_flavorid\\": \\"1\\", \\"image_container_format\\": \\"bare\\", \\"images\\": \\"725c4da6-c116-4d31-8acd-2f6b147dec93,9d0edc56-9927-4102-a9b1-6b08bc0ead91\\", \\"instance_type_vcpus\\": \\"1\\", \\"image_min_ram\\": \\"0\\", \\"image_disk_format\\": \\"qcow2\\", \\"image_base_image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\"}, \\"task_state\\": \\"migrating\\", \\"shutdown_terminate\\": false, \\"cell_name\\": null, \\"root_gb\\": 1, \\"locked\\": false, \\"created_at\\": \\"2014-07-08T18:39:21Z\\", \\"locked_by\\": null, \\"launch_index\\": 0, \\"metadata\\": {}, \\"memory_mb\\": 512, \\"vcpus\\": 1, \\"image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\", \\"root_device_name\\": \\"/dev/vda\\", \\"auto_disk_config\\": false, \\"os_type\\": null, \\"scheduled_at\\": \\"2014-07-08T18:39:21Z\\"}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disk\\": null, \\"migrate_data\\": null, \\"block_migration\\": false}, \\"_unique_id\\": \\"7ea0c9431d6d4d38966180617810f45a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", 
\\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"3.19\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"pre_live_migration\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: messaging ERROR NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"60a700beb6074e6e9999235c9252007c\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance\\": {\\"nova_object.version\\": \\"1.13\\", \\"nova_object.changes\\": [\\"host\\"], \\"nova_object.name\\": \\"Instance\\", \\"nova_object.data\\": {\\"vm_state\\": \\"active\\", \\"availability_zone\\": null, \\"terminated_at\\": null, \\"ephemeral_gb\\": 0, \\"instance_type_id\\": 2, \\"user_data\\": null, \\"cleaned\\": true, \\"vm_mode\\": null, \\"deleted_at\\": null, \\"reservation_id\\": \\"r-0uipv92o\\", \\"id\\": 2, \\"security_groups\\": {\\"nova_object.version\\": \\"1.0\\", \\"nova_object.name\\": \\"SecurityGroupList\\", \\"nova_object.data\\": {\\"objects\\": [{\\"nova_object.version\\": \\"1.1\\", \\"nova_object.name\\": \\"SecurityGroup\\", \\"nova_object.data\\": {\\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"description\\": \\"default\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:33:56Z\\", \\"updated_at\\": null, \\"deleted_at\\": null, \\"id\\": 1, \\"name\\": \\"default\\"}, \\"nova_object.namespace\\": \\"nova\\"}]}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disable_terminate\\": false, \\"display_name\\": \\"vm-39852899123\\", \\"uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"default_swap_device\\": null, \\"info_cache\\": {\\"nova_object.version\\": \\"1.5\\", \\"nova_object.name\\": \\"InstanceInfoCache\\", \\"nova_object.data\\": {\\"instance_uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:39:21Z\\", \\"updated_at\\": \\"2014-07-08T18:44:54Z\\", \\"network_info\\": \\"[{\\\\\\"ovs_interfaceid\\\\\\": null, \\\\\\"network\\\\\\": {\\\\\\"bridge\\\\\\": \\\\\\"cloud\\\\\\", \\\\\\"subnets\\\\\\": [{\\\\\\"ips\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"fixed\\\\\\", \\\\\\"floating_ips\\\\\\": [], \\\\\\"address\\\\\\": \\\\\\"10.35.3.4\\\\\\"}], \\\\\\"version\\\\\\": 4, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"dns\\\\\\", \\\\\\"address\\\\\\": \\\\\\"8.8.4.4\\\\\\"}], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": \\\\\\"10.35.3.0/24\\\\\\", \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": \\\\\\"10.35.3.1\\\\\\"}}, {\\\\\\"ips\\\\\\": [], \\\\\\"version\\\\\\": null, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": null, \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": null, \\\\\\"type\\\\\\": \\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": null}}], 
\\\\\\"meta\\\\\\": {\\\\\\"tenant_id\\\\\\": null, \\\\\\"multi_host\\\\\\": true, \\\\\\"should_create_bridge\\\\\\": true}, \\\\\\"id\\\\\\": \\\\\\"933f56fa-7929-413f-ab40-9c7c91cb840b\\\\\\", \\\\\\"label\\\\\\": \\\\\\"private\\\\\\"}, \\\\\\"devname\\\\\\": null, \\\\\\"qbh_params\\\\\\": null, \\\\\\"meta\\\\\\": {}, \\\\\\"details\\\\\\": {}, \\\\\\"address\\\\\\": \\\\\\"fa:16:3e:da:1a:ff\\\\\\", \\\\\\"active\\\\\\": false, \\\\\\"type\\\\\\": \\\\\\"bridge\\\\\\", \\\\\\"id\\\\\\": \\\\\\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\\\\\", \\\\\\"qbg_params\\\\\\": null}]\\", \\"deleted_at\\": null}, \\"nova_object.namespace\\": \\"nova\\"}, \\"hostname\\": \\"vm-39852899123\\", \\"launched_on\\": \\"10.35.0.15\\", \\"display_description\\": \\"vm-39852899123\\", \\"key_data\\": \\"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCZ0xCXPLLD8BFIxM8ZO3PxjxlOvrCi2dVZ+9WHUhbc/vSo+eF3Q8J7uuO8U1d9AvDaP1GepfTTVSrFg2XPbnqURvidIuAPXzPJoECYGJyaMfxeaS9qhiIZVRK7GwWPj2cqEtUEDMu6JsUyoc5uzYvFobnrPQlVYEy7jsX0OgBDReH+hz3jY2r2gnbUoGAfSINPiqfSNCxwYUL7/WAHEjv18YPZPCCwV9OwgzB42Z3qWlEL9E6CW3ITLQeNnnHrI/xAoJdcNbqxPYTHHwrpxO0/XOWJhnvxKs8czgiqaXojc7TZTv4YMPcQs7tR/TxF7KmCq0VKM5F8u07VuuoovsuL Generated by Nova\\\\n\\", \\"kernel_id\\": \\"\\", \\"power_state\\": 1, \\"default_ephemeral_device\\": null, \\"progress\\": 0, \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"launched_at\\": \\"2014-07-08T18:39:37Z\\", \\"config_drive\\": \\"\\", \\"node\\": \\"node-10-35-0-13\\", \\"ramdisk_id\\": \\"\\", \\"access_ip_v6\\": null, \\"access_ip_v4\\": null, \\"deleted\\": false, \\"key_name\\": \\"keypair-20849026277\\", \\"updated_at\\": \\"2014-07-08T19:28:41Z\\", \\"host\\": \\"10.35.0.15\\", \\"ephemeral_key_uuid\\": null, \\"architecture\\": null, \\"user_id\\": \\"579c706cdd354c3bace1d0e30ac24d59\\", \\"system_metadata\\": {\\"gc_src_host\\": \\"10.35.0.13\\", \\"logical_volumes\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad_disk:1073741824\\", \\"instance_type_memory_mb\\": \\"512\\", \\"instance_type_swap\\": \\"0\\", \\"instance_type_vcpu_weight\\": null, \\"instance_type_root_gb\\": \\"1\\", \\"instance_type_id\\": \\"2\\", \\"instance_type_name\\": \\"m1.tiny\\", \\"instance_type_ephemeral_gb\\": \\"0\\", \\"instance_type_rxtx_factor\\": \\"1.0\\", \\"clean_attempts\\": \\"1\\", \\"image_min_disk\\": \\"1\\", \\"gc_dst_host\\": \\"10.35.0.15\\", \\"network_allocated\\": \\"True\\", \\"instance_type_flavorid\\": \\"1\\", \\"image_container_format\\": \\"bare\\", \\"images\\": \\"725c4da6-c116-4d31-8acd-2f6b147dec93,9d0edc56-9927-4102-a9b1-6b08bc0ead91\\", \\"instance_type_vcpus\\": \\"1\\", \\"image_min_ram\\": \\"0\\", \\"image_disk_format\\": \\"qcow2\\", \\"image_base_image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\"}, \\"task_state\\": \\"migrating\\", \\"shutdown_terminate\\": false, \\"cell_name\\": null, \\"root_gb\\": 1, \\"locked\\": false, \\"created_at\\": \\"2014-07-08T18:39:21Z\\", \\"locked_by\\": null, \\"launch_index\\": 0, \\"metadata\\": {}, \\"memory_mb\\": 512, \\"vcpus\\": 1, \\"image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\", \\"root_device_name\\": \\"/dev/vda\\", \\"auto_disk_config\\": false, \\"os_type\\": null, \\"scheduled_at\\": \\"2014-07-08T18:39:21Z\\"}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disk\\": null, \\"migrate_data\\": null, \\"block_migration\\": false}, \\"_unique_id\\": \\"7ea0c9431d6d4d38966180617810f45a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", 
\\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"3.19\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"pre_live_migration\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key=u'compute.10.35.0.15'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: channel DEBUG Closed channel #1
Jul 8 19:28:42 10.35.0.13 cobalt-compute: channel DEBUG using channel_id: 1
Jul 8 19:28:42 10.35.0.13 cobalt-compute: channel DEBUG Channel open
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.645 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"60a700beb6074e6e9999235c9252007c\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance\\": {\\"nova_object.version\\": \\"1.13\\", \\"nova_object.changes\\": [\\"host\\"], \\"nova_object.name\\": \\"Instance\\", \\"nova_object.data\\": {\\"vm_state\\": \\"active\\", \\"availability_zone\\": null, \\"terminated_at\\": null, \\"ephemeral_gb\\": 0, \\"instance_type_id\\": 2, \\"user_data\\": null, \\"cleaned\\": true, \\"vm_mode\\": null, \\"deleted_at\\": null, \\"reservation_id\\": \\"r-0uipv92o\\", \\"id\\": 2, \\"security_groups\\": {\\"nova_object.version\\": \\"1.0\\", \\"nova_object.name\\": \\"SecurityGroupList\\", \\"nova_object.data\\": {\\"objects\\": [{\\"nova_object.version\\": \\"1.1\\", \\"nova_object.name\\": \\"SecurityGroup\\", \\"nova_object.data\\": {\\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"description\\": \\"default\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:33:56Z\\", \\"updated_at\\": null, \\"deleted_at\\": null, \\"id\\": 1, \\"name\\": \\"default\\"}, \\"nova_object.namespace\\": \\"nova\\"}]}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disable_terminate\\": false, \\"display_name\\": \\"vm-39852899123\\", \\"uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"default_swap_device\\": null, \\"info_cache\\": {\\"nova_object.version\\": \\"1.5\\", \\"nova_object.name\\": \\"InstanceInfoCache\\", \\"nova_object.data\\": {\\"instance_uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:39:21Z\\", \\"updated_at\\": \\"2014-07-08T18:44:54Z\\", \\"network_info\\": \\"[{\\\\\\"ovs_interfaceid\\\\\\": null, \\\\\\"network\\\\\\": {\\\\\\"bridge\\\\\\": \\\\\\"cloud\\\\\\", \\\\\\"subnets\\\\\\": [{\\\\\\"ips\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"fixed\\\\\\", \\\\\\"floating_ips\\\\\\": [], \\\\\\"address\\\\\\": \\\\\\"10.35.3.4\\\\\\"}], \\\\\\"version\\\\\\": 4, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"dns\\\\\\", \\\\\\"address\\\\\\": \\\\\\"8.8.4.4\\\\\\"}], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": \\\\\\"10.35.3.0/24\\\\\\", \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": \\\\\\"10.35.3.1\\\\\\"}}, {\\\\\\"ips\\\\\\": [], \\\\\\"version\\\\\\": null, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": null, \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": null, \\\\\\"type\\\\\\": 
\\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": null}}], \\\\\\"meta\\\\\\": {\\\\\\"tenant_id\\\\\\": null, \\\\\\"multi_host\\\\\\": true, \\\\\\"should_create_bridge\\\\\\": true}, \\\\\\"id\\\\\\": \\\\\\"933f56fa-7929-413f-ab40-9c7c91cb840b\\\\\\", \\\\\\"label\\\\\\": \\\\\\"private\\\\\\"}, \\\\\\"devname\\\\\\": null, \\\\\\"qbh_params\\\\\\": null, \\\\\\"meta\\\\\\": {}, \\\\\\"details\\\\\\": {}, \\\\\\"address\\\\\\": \\\\\\"fa:16:3e:da:1a:ff\\\\\\", \\\\\\"active\\\\\\": false, \\\\\\"type\\\\\\": \\\\\\"bridge\\\\\\", \\\\\\"id\\\\\\": \\\\\\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\\\\\", \\\\\\"qbg_params\\\\\\": null}]\\", \\"deleted_at\\": null}, \\"nova_object.namespace\\": \\"nova\\"}, \\"hostname\\": \\"vm-39852899123\\", \\"launched_on\\": \\"10.35.0.15\\", \\"display_description\\": \\"vm-39852899123\\", \\"key_data\\": \\"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCZ0xCXPLLD8BFIxM8ZO3PxjxlOvrCi2dVZ+9WHUhbc/vSo+eF3Q8J7uuO8U1d9AvDaP1GepfTTVSrFg2XPbnqURvidIuAPXzPJoECYGJyaMfxeaS9qhiIZVRK7GwWPj2cqEtUEDMu6JsUyoc5uzYvFobnrPQlVYEy7jsX0OgBDReH+hz3jY2r2gnbUoGAfSINPiqfSNCxwYUL7/WAHEjv18YPZPCCwV9OwgzB42Z3qWlEL9E6CW3ITLQeNnnHrI/xAoJdcNbqxPYTHHwrpxO0/XOWJhnvxKs8czgiqaXojc7TZTv4YMPcQs7tR/TxF7KmCq0VKM5F8u07VuuoovsuL Generated by Nova\\\\n\\", \\"kernel_id\\": \\"\\", \\"power_state\\": 1, \\"default_ephemeral_device\\": null, \\"progress\\": 0, \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"launched_at\\": \\"2014-07-08T18:39:37Z\\", \\"config_drive\\": \\"\\", \\"node\\": \\"node-10-35-0-13\\", \\"ramdisk_id\\": \\"\\", \\"access_ip_v6\\": null, \\"access_ip_v4\\": null, \\"deleted\\": false, \\"key_name\\": \\"keypair-20849026277\\", \\"updated_at\\": \\"2014-07-08T19:28:41Z\\", \\"host\\": \\"10.35.0.15\\", \\"ephemeral_key_uuid\\": null, \\"architecture\\": null, \\"user_id\\": \\"579c706cdd354c3bace1d0e30ac24d59\\", \\"system_metadata\\": {\\"gc_src_host\\": \\"10.35.0.13\\", \\"logical_volumes\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad_disk:1073741824\\", \\"instance_type_memory_mb\\": \\"512\\", \\"instance_type_swap\\": \\"0\\", \\"instance_type_vcpu_weight\\": null, \\"instance_type_root_gb\\": \\"1\\", \\"instance_type_id\\": \\"2\\", \\"instance_type_name\\": \\"m1.tiny\\", \\"instance_type_ephemeral_gb\\": \\"0\\", \\"instance_type_rxtx_factor\\": \\"1.0\\", \\"clean_attempts\\": \\"1\\", \\"image_min_disk\\": \\"1\\", \\"gc_dst_host\\": \\"10.35.0.15\\", \\"network_allocated\\": \\"True\\", \\"instance_type_flavorid\\": \\"1\\", \\"image_container_format\\": \\"bare\\", \\"images\\": \\"725c4da6-c116-4d31-8acd-2f6b147dec93,9d0edc56-9927-4102-a9b1-6b08bc0ead91\\", \\"instance_type_vcpus\\": \\"1\\", \\"image_min_ram\\": \\"0\\", \\"image_disk_format\\": \\"qcow2\\", \\"image_base_image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\"}, \\"task_state\\": \\"migrating\\", \\"shutdown_terminate\\": false, \\"cell_name\\": null, \\"root_gb\\": 1, \\"locked\\": false, \\"created_at\\": \\"2014-07-08T18:39:21Z\\", \\"locked_by\\": null, \\"launch_index\\": 0, \\"metadata\\": {}, \\"memory_mb\\": 512, \\"vcpus\\": 1, \\"image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\", \\"root_device_name\\": \\"/dev/vda\\", \\"auto_disk_config\\": false, \\"os_type\\": null, \\"scheduled_at\\": \\"2014-07-08T18:39:21Z\\"}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disk\\": null, \\"migrate_data\\": null, \\"block_migration\\": false}, \\"_unique_id\\": \\"7ea0c9431d6d4d38966180617810f45a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": 
\\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"3.19\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"pre_live_migration\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'2', 'redelivered': False, 'routing_key': u'compute.10.35.0.15', 'delivery_tag': 1, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030d47e90>}
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.649 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"60a700beb6074e6e9999235c9252007c\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance\\": {\\"nova_object.version\\": \\"1.13\\", \\"nova_object.changes\\": [\\"host\\"], \\"nova_object.name\\": \\"Instance\\", \\"nova_object.data\\": {\\"vm_state\\": \\"active\\", \\"availability_zone\\": null, \\"terminated_at\\": null, \\"ephemeral_gb\\": 0, \\"instance_type_id\\": 2, \\"user_data\\": null, \\"cleaned\\": true, \\"vm_mode\\": null, \\"deleted_at\\": null, \\"reservation_id\\": \\"r-0uipv92o\\", \\"id\\": 2, \\"security_groups\\": {\\"nova_object.version\\": \\"1.0\\", \\"nova_object.name\\": \\"SecurityGroupList\\", \\"nova_object.data\\": {\\"objects\\": [{\\"nova_object.version\\": \\"1.1\\", \\"nova_object.name\\": \\"SecurityGroup\\", \\"nova_object.data\\": {\\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"description\\": \\"default\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:33:56Z\\", \\"updated_at\\": null, \\"deleted_at\\": null, \\"id\\": 1, \\"name\\": \\"default\\"}, \\"nova_object.namespace\\": \\"nova\\"}]}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disable_terminate\\": false, \\"display_name\\": \\"vm-39852899123\\", \\"uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"default_swap_device\\": null, \\"info_cache\\": {\\"nova_object.version\\": \\"1.5\\", \\"nova_object.name\\": \\"InstanceInfoCache\\", \\"nova_object.data\\": {\\"instance_uuid\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:39:21Z\\", \\"updated_at\\": \\"2014-07-08T18:44:54Z\\", \\"network_info\\": \\"[{\\\\\\"ovs_interfaceid\\\\\\": null, \\\\\\"network\\\\\\": {\\\\\\"bridge\\\\\\": \\\\\\"cloud\\\\\\", \\\\\\"subnets\\\\\\": [{\\\\\\"ips\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"fixed\\\\\\", \\\\\\"floating_ips\\\\\\": [], \\\\\\"address\\\\\\": \\\\\\"10.35.3.4\\\\\\"}], \\\\\\"version\\\\\\": 4, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"dns\\\\\\", \\\\\\"address\\\\\\": \\\\\\"8.8.4.4\\\\\\"}], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": \\\\\\"10.35.3.0/24\\\\\\", \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": \\\\\\"10.35.3.1\\\\\\"}}, {\\\\\\"ips\\\\\\": [], \\\\\\"version\\\\\\": null, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": null, \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": null, 
\\\\\\"type\\\\\\": \\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": null}}], \\\\\\"meta\\\\\\": {\\\\\\"tenant_id\\\\\\": null, \\\\\\"multi_host\\\\\\": true, \\\\\\"should_create_bridge\\\\\\": true}, \\\\\\"id\\\\\\": \\\\\\"933f56fa-7929-413f-ab40-9c7c91cb840b\\\\\\", \\\\\\"label\\\\\\": \\\\\\"private\\\\\\"}, \\\\\\"devname\\\\\\": null, \\\\\\"qbh_params\\\\\\": null, \\\\\\"meta\\\\\\": {}, \\\\\\"details\\\\\\": {}, \\\\\\"address\\\\\\": \\\\\\"fa:16:3e:da:1a:ff\\\\\\", \\\\\\"active\\\\\\": false, \\\\\\"type\\\\\\": \\\\\\"bridge\\\\\\", \\\\\\"id\\\\\\": \\\\\\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\\\\\", \\\\\\"qbg_params\\\\\\": null}]\\", \\"deleted_at\\": null}, \\"nova_object.namespace\\": \\"nova\\"}, \\"hostname\\": \\"vm-39852899123\\", \\"launched_on\\": \\"10.35.0.15\\", \\"display_description\\": \\"vm-39852899123\\", \\"key_data\\": \\"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQCZ0xCXPLLD8BFIxM8ZO3PxjxlOvrCi2dVZ+9WHUhbc/vSo+eF3Q8J7uuO8U1d9AvDaP1GepfTTVSrFg2XPbnqURvidIuAPXzPJoECYGJyaMfxeaS9qhiIZVRK7GwWPj2cqEtUEDMu6JsUyoc5uzYvFobnrPQlVYEy7jsX0OgBDReH+hz3jY2r2gnbUoGAfSINPiqfSNCxwYUL7/WAHEjv18YPZPCCwV9OwgzB42Z3qWlEL9E6CW3ITLQeNnnHrI/xAoJdcNbqxPYTHHwrpxO0/XOWJhnvxKs8czgiqaXojc7TZTv4YMPcQs7tR/TxF7KmCq0VKM5F8u07VuuoovsuL Generated by Nova\\\\n\\", \\"kernel_id\\": \\"\\", \\"power_state\\": 1, \\"default_ephemeral_device\\": null, \\"progress\\": 0, \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"launched_at\\": \\"2014-07-08T18:39:37Z\\", \\"config_drive\\": \\"\\", \\"node\\": \\"node-10-35-0-13\\", \\"ramdisk_id\\": \\"\\", \\"access_ip_v6\\": null, \\"access_ip_v4\\": null, \\"deleted\\": false, \\"key_name\\": \\"keypair-20849026277\\", \\"updated_at\\": \\"2014-07-08T19:28:41Z\\", \\"host\\": \\"10.35.0.15\\", \\"ephemeral_key_uuid\\": null, \\"architecture\\": null, \\"user_id\\": \\"579c706cdd354c3bace1d0e30ac24d59\\", \\"system_metadata\\": {\\"gc_src_host\\": \\"10.35.0.13\\", \\"logical_volumes\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad_disk:1073741824\\", \\"instance_type_memory_mb\\": \\"512\\", \\"instance_type_swap\\": \\"0\\", \\"instance_type_vcpu_weight\\": null, \\"instance_type_root_gb\\": \\"1\\", \\"instance_type_id\\": \\"2\\", \\"instance_type_name\\": \\"m1.tiny\\", \\"instance_type_ephemeral_gb\\": \\"0\\", \\"instance_type_rxtx_factor\\": \\"1.0\\", \\"clean_attempts\\": \\"1\\", \\"image_min_disk\\": \\"1\\", \\"gc_dst_host\\": \\"10.35.0.15\\", \\"network_allocated\\": \\"True\\", \\"instance_type_flavorid\\": \\"1\\", \\"image_container_format\\": \\"bare\\", \\"images\\": \\"725c4da6-c116-4d31-8acd-2f6b147dec93,9d0edc56-9927-4102-a9b1-6b08bc0ead91\\", \\"instance_type_vcpus\\": \\"1\\", \\"image_min_ram\\": \\"0\\", \\"image_disk_format\\": \\"qcow2\\", \\"image_base_image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\"}, \\"task_state\\": \\"migrating\\", \\"shutdown_terminate\\": false, \\"cell_name\\": null, \\"root_gb\\": 1, \\"locked\\": false, \\"created_at\\": \\"2014-07-08T18:39:21Z\\", \\"locked_by\\": null, \\"launch_index\\": 0, \\"metadata\\": {}, \\"memory_mb\\": 512, \\"vcpus\\": 1, \\"image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\", \\"root_device_name\\": \\"/dev/vda\\", \\"auto_disk_config\\": false, \\"os_type\\": null, \\"scheduled_at\\": \\"2014-07-08T18:39:21Z\\"}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disk\\": null, \\"migrate_data\\": null, \\"block_migration\\": false}, \\"_unique_id\\": \\"7ea0c9431d6d4d38966180617810f45a\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", 
\\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"3.19\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"pre_live_migration\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.653 3328 ERROR nova.compute.manager [req-585393f6-c238-40c5-88fe-fbf1dd0c48e5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: pre_live_migration
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.653 3328 ERROR nova.compute.manager [req-585393f6-c238-40c5-88fe-fbf1dd0c48e5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: pre_live_migration instance='5ee78913-e655-4cb1-a92c-500639c819ad'
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.660 3328 ERROR nova.compute.manager [req-585393f6-c238-40c5-88fe-fbf1dd0c48e5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: pre_live_migration instance='5ee78913-e655-4cb1-a92c-500639c819ad' calling _get_instance_nw_info
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.663 3328 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"45535700d43f41aebea041c064b28c44\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"d42641825f14413ab4e41a60444376e0\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.663 3328 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"45535700d43f41aebea041c064b28c44\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"d42641825f14413ab4e41a60444376e0\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:28:42 10.35.0.13 nova-network 2014-07-08 19:28:42.662 4876 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"45535700d43f41aebea041c064b28c44\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"d42641825f14413ab4e41a60444376e0\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 31, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fdeb6472390>}
Jul 8 19:28:42 10.35.0.13 nova-network 2014-07-08 19:28:42.664 4876 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"45535700d43f41aebea041c064b28c44\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"5ee78913-e655-4cb1-a92c-500639c819ad\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"d42641825f14413ab4e41a60444376e0\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:42 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:42 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:42 10.35.0.15 nova-network 2014-07-08 19:28:42.759 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"a834c797aeb1418f9826240254be4a17\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"4fbd991c37934e068e532109c938376e\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.15 nova-network 2014-07-08 19:28:42.760 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"a834c797aeb1418f9826240254be4a17\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"4fbd991c37934e068e532109c938376e\\"}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:28:42 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:42 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:42 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:42 10.35.0.15 nova-network 2014-07-08 19:28:42.761 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"3bd58252d08f4e1bbef1272c2f892693\\", \\"failure\\": null, \\"_msg_id\\": \\"4fbd991c37934e068e532109c938376e\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.15 nova-network 2014-07-08 19:28:42.761 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"3bd58252d08f4e1bbef1272c2f892693\\", \\"failure\\": null, \\"_msg_id\\": \\"4fbd991c37934e068e532109c938376e\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"a834c797aeb1418f9826240254be4a17\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"4fbd991c37934e068e532109c938376e\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_ff10fb357a86427594db414dc3e5acc6', 'delivery_tag': 9, 'exchange': u'reply_ff10fb357a86427594db414dc3e5acc6'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d2864c50>}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"a834c797aeb1418f9826240254be4a17\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.3\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"4fbd991c37934e068e532109c938376e\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"3bd58252d08f4e1bbef1272c2f892693\\", \\"failure\\": null, \\"_msg_id\\": \\"4fbd991c37934e068e532109c938376e\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_ff10fb357a86427594db414dc3e5acc6', 'delivery_tag': 10, 'exchange': u'reply_ff10fb357a86427594db414dc3e5acc6'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d2864c50>}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"3bd58252d08f4e1bbef1272c2f892693\\", \\"failure\\": null, \\"_msg_id\\": \\"4fbd991c37934e068e532109c938376e\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: api DEBUG Updating cache with info: [VIF({'ovs_interfaceid': None, 'network': Network({'bridge': u'cloud', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.35.3.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.35.3.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.35.3.0/24', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.35.3.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.35.3.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True}, 'id': u'933f56fa-7929-413f-ab40-9c7c91cb840b', 'label': u'private'}), 'devname': None, 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:0e:79:17', 'active': False, 'type': u'bridge', 'id': u'3821355e-1e28-4fa1-adfe-520202d90d3e', 'qbg_params': None})]
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='ce77d052-f021-44a2-b5f3-5908ec04fcc7' get_instance_nw_info returned network_info=[VIF({'ovs_interfaceid': None, 'network': Network({'bridge': u'cloud', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.35.3.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.35.3.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.35.3.0/24', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.35.3.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.35.3.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True}, 'id': u'933f56fa-7929-413f-ab40-9c7c91cb840b', 'label': u'private'}), 'devname': None, 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:0e:79:17', 'active': False, 'type': u'bridge', 'id': u'3821355e-1e28-4fa1-adfe-520202d90d3e', 'qbg_params': None})]
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='ce77d052-f021-44a2-b5f3-5908ec04fcc7' calling _instance_update system_metadata={u'gc_src_host': '10.35.0.13', u'logical_volumes': u'ce77d052-f021-44a2-b5f3-5908ec04fcc7_disk:1073741824', u'instance_type_memory_mb': u'512', u'instance_type_swap': u'0', u'instance_type_vcpu_weight': None, u'instance_type_root_gb': u'1', u'instance_type_name': u'm1.tiny', u'instance_type_id': u'2', u'instance_type_ephemeral_gb': u'0', u'instance_type_rxtx_factor': u'1.0', u'clean_attempts': u'2', u'image_disk_format': u'qcow2', u'gc_dst_host': u'10.35.0.15', u'network_allocated': u'True', u'instance_type_flavorid': u'1', u'instance_type_vcpus': u'1', u'images': u'2f57332b-5d86-4ed6-a7e9-69942f23bbea,19d92c5d-6891-4938-8831-746791d29124', u'image_container_format': u'bare', u'image_min_ram': u'0', u'image_min_disk': u'1', u'image_base_image_ref': u'8f23694a-2995-4070-a5e8-661de3207f2e'}
Jul 8 19:28:42 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:42 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:42 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:42 10.35.0.13 nova-network 2014-07-08 19:28:42.810 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"8c73ef3552324d39813228604ee14a9b\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"45535700d43f41aebea041c064b28c44\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.13 nova-network 2014-07-08 19:28:42.811 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"8c73ef3552324d39813228604ee14a9b\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"45535700d43f41aebea041c064b28c44\\"}", "oslo.version": "2.0"}', routing_key=u'reply_af4fb5adaca445a3b6d555c358806a4e'
Jul 8 19:28:42 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:42 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:42 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:42 10.35.0.13 nova-network 2014-07-08 19:28:42.812 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"fd17faec5c454215b21240cbffd9935a\\", \\"failure\\": null, \\"_msg_id\\": \\"45535700d43f41aebea041c064b28c44\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.13 nova-network 2014-07-08 19:28:42.812 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"fd17faec5c454215b21240cbffd9935a\\", \\"failure\\": null, \\"_msg_id\\": \\"45535700d43f41aebea041c064b28c44\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_af4fb5adaca445a3b6d555c358806a4e'
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.814 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"8c73ef3552324d39813228604ee14a9b\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"45535700d43f41aebea041c064b28c44\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_af4fb5adaca445a3b6d555c358806a4e', 'delivery_tag': 83, 'exchange': u'reply_af4fb5adaca445a3b6d555c358806a4e'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030b1ff10>}
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.815 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"8c73ef3552324d39813228604ee14a9b\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.4\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:da:1a:ff\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"45535700d43f41aebea041c064b28c44\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.815 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"fd17faec5c454215b21240cbffd9935a\\", \\"failure\\": null, \\"_msg_id\\": \\"45535700d43f41aebea041c064b28c44\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_af4fb5adaca445a3b6d555c358806a4e', 'delivery_tag': 84, 'exchange': u'reply_af4fb5adaca445a3b6d555c358806a4e'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030b1ff10>}
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.816 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"fd17faec5c454215b21240cbffd9935a\\", \\"failure\\": null, \\"_msg_id\\": \\"45535700d43f41aebea041c064b28c44\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.833 3328 ERROR nova.compute.manager [req-585393f6-c238-40c5-88fe-fbf1dd0c48e5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: pre_live_migration instance='5ee78913-e655-4cb1-a92c-500639c819ad' _get_instance_nw_info returned network_info=[VIF({'ovs_interfaceid': None, 'network': Network({'bridge': u'cloud', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.35.3.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.35.3.5'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.35.3.0/24', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.35.3.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.35.3.5'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True}, 'id': u'933f56fa-7929-413f-ab40-9c7c91cb840b', 'label': u'private'}), 'devname': None, 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:da:1a:ff', 'active': False, 'type': u'bridge', 'id': u'615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72', 'qbg_params': None})]
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.835 3328 ERROR nova.compute.manager [req-585393f6-c238-40c5-88fe-fbf1dd0c48e5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: pre_live_migration instance='5ee78913-e655-4cb1-a92c-500639c819ad' setup_networks_on_host self.host='10.35.0.15' pre_live_migration_data=None
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.836 3328 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"be1be11e962d4f198720ff9a8bb226a7\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": 2, \\"teardown\\": false, \\"host\\": \\"10.35.0.15\\"}, \\"_unique_id\\": \\"341f1e8ff02b42778985446801d5d10c\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.837 3328 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"be1be11e962d4f198720ff9a8bb226a7\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": 2, \\"teardown\\": false, \\"host\\": \\"10.35.0.15\\"}, \\"_unique_id\\": \\"341f1e8ff02b42778985446801d5d10c\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:28:42 10.35.0.14 nova-network 2014-07-08 19:28:42.835 6514 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"be1be11e962d4f198720ff9a8bb226a7\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": 2, \\"teardown\\": false, \\"host\\": \\"10.35.0.15\\"}, \\"_unique_id\\": \\"341f1e8ff02b42778985446801d5d10c\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 31, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f5c1424f1d0>}
Jul 8 19:28:42 10.35.0.14 nova-network 2014-07-08 19:28:42.836 6514 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"be1be11e962d4f198720ff9a8bb226a7\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": 2, \\"teardown\\": false, \\"host\\": \\"10.35.0.15\\"}, \\"_unique_id\\": \\"341f1e8ff02b42778985446801d5d10c\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='ce77d052-f021-44a2-b5f3-5908ec04fcc7' calling pre_live_migration
Jul 8 19:28:42 10.35.0.13 cobalt-compute: amqpdriver DEBUG MSG_ID is 2401eafe1a604dc8b14d26dc4c5cd1aa
Jul 8 19:28:42 10.35.0.13 cobalt-compute: amqp DEBUG UNIQUE_ID is 7291027f6aea4c4cb57857b954427d06.
Jul 8 19:28:42 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"2401eafe1a604dc8b14d26dc4c5cd1aa\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance\\": {\\"nova_object.version\\": \\"1.13\\", \\"nova_object.changes\\": [\\"host\\"], \\"nova_object.name\\": \\"Instance\\", \\"nova_object.data\\": {\\"vm_state\\": \\"active\\", \\"availability_zone\\": null, \\"terminated_at\\": null, \\"ephemeral_gb\\": 0, \\"instance_type_id\\": 2, \\"user_data\\": null, \\"cleaned\\": true, \\"vm_mode\\": null, \\"deleted_at\\": null, \\"reservation_id\\": \\"r-vu83jpp3\\", \\"id\\": 1, \\"security_groups\\": {\\"nova_object.version\\": \\"1.0\\", \\"nova_object.name\\": \\"SecurityGroupList\\", \\"nova_object.data\\": {\\"objects\\": [{\\"nova_object.version\\": \\"1.1\\", \\"nova_object.name\\": \\"SecurityGroup\\", \\"nova_object.data\\": {\\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"description\\": \\"default\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:33:56Z\\", \\"updated_at\\": null, \\"deleted_at\\": null, \\"id\\": 1, \\"name\\": \\"default\\"}, \\"nova_object.namespace\\": \\"nova\\"}]}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disable_terminate\\": false, \\"display_name\\": \\"vm-3904852117\\", \\"uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"default_swap_device\\": null, \\"info_cache\\": {\\"nova_object.version\\": \\"1.5\\", \\"nova_object.name\\": \\"InstanceInfoCache\\", \\"nova_object.data\\": {\\"instance_uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:39:17Z\\", \\"updated_at\\": \\"2014-07-08T18:44:54Z\\", \\"network_info\\": \\"[{\\\\\\"ovs_interfaceid\\\\\\": null, \\\\\\"network\\\\\\": {\\\\\\"bridge\\\\\\": \\\\\\"cloud\\\\\\", \\\\\\"subnets\\\\\\": [{\\\\\\"ips\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"fixed\\\\\\", \\\\\\"floating_ips\\\\\\": [], \\\\\\"address\\\\\\": \\\\\\"10.35.3.2\\\\\\"}], \\\\\\"version\\\\\\": 4, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"dns\\\\\\", \\\\\\"address\\\\\\": \\\\\\"8.8.4.4\\\\\\"}], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": \\\\\\"10.35.3.0/24\\\\\\", \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": \\\\\\"10.35.3.1\\\\\\"}}, {\\\\\\"ips\\\\\\": [], \\\\\\"version\\\\\\": null, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": null, \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": null, \\\\\\"type\\\\\\": \\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": null}}], 
\\\\\\"meta\\\\\\": {\\\\\\"tenant_id\\\\\\": null, \\\\\\"multi_host\\\\\\": true, \\\\\\"should_create_bridge\\\\\\": true}, \\\\\\"id\\\\\\": \\\\\\"933f56fa-7929-413f-ab40-9c7c91cb840b\\\\\\", \\\\\\"label\\\\\\": \\\\\\"private\\\\\\"}, \\\\\\"devname\\\\\\": null, \\\\\\"qbh_params\\\\\\": null, \\\\\\"meta\\\\\\": {}, \\\\\\"details\\\\\\": {}, \\\\\\"address\\\\\\": \\\\\\"fa:16:3e:0e:79:17\\\\\\", \\\\\\"active\\\\\\": false, \\\\\\"type\\\\\\": \\\\\\"bridge\\\\\\", \\\\\\"id\\\\\\": \\\\\\"3821355e-1e28-4fa1-adfe-520202d90d3e\\\\\\", \\\\\\"qbg_params\\\\\\": null}]\\", \\"deleted_at\\": null}, \\"nova_object.namespace\\": \\"nova\\"}, \\"hostname\\": \\"vm-3904852117\\", \\"launched_on\\": \\"10.35.0.13\\", \\"display_description\\": \\"vm-3904852117\\", \\"key_data\\": \\"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDI9gPkFoUDfotrFuS7aWZAD1k8UB1CCBrNnopMWVEOeMbfPHdQHJRHw+ATUi75dOcUzBXNdV0gbfjMvEp9OtOZY9Pf/Z4vUPSVZRPcT7YAuJnKx3bF9YXPFNQeIMZx+FxFXJeiS9Jp9puTzsavNTW/rssIl92RKqO6yDUCWr7uD3S1y8T3hMKzb5gfXZiN5jvYg69r38YxXBj2pM81b8uHkM4KsuXwSKRVnt/TS6RnyS5l9FqRYv112GtkszY8+vCvIDcGDA65dUmekKsFZOvIWv0EH19YHQ39UA6eTp/OY6SWkfMeyyzUbs9OO6hVi6R3LpVSUzO2rpOivWotNmxJ Generated by Nova\\\\n\\", \\"kernel_id\\": \\"\\", \\"power_state\\": 1, \\"default_ephemeral_device\\": null, \\"progress\\": 0, \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"launched_at\\": \\"2014-07-08T18:39:32Z\\", \\"config_drive\\": \\"\\", \\"node\\": \\"node-10-35-0-13\\", \\"ramdisk_id\\": \\"\\", \\"access_ip_v6\\": null, \\"access_ip_v4\\": null, \\"deleted\\": false, \\"key_name\\": \\"keypair-96183545624\\", \\"updated_at\\": \\"2014-07-08T19:28:42Z\\", \\"host\\": \\"10.35.0.15\\", \\"ephemeral_key_uuid\\": null, \\"architecture\\": null, \\"user_id\\": \\"579c706cdd354c3bace1d0e30ac24d59\\", \\"system_metadata\\": {\\"gc_src_host\\": \\"10.35.0.13\\", \\"logical_volumes\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7_disk:1073741824\\", \\"instance_type_memory_mb\\": \\"512\\", \\"instance_type_swap\\": \\"0\\", \\"instance_type_vcpu_weight\\": null, \\"instance_type_root_gb\\": \\"1\\", \\"instance_type_id\\": \\"2\\", \\"instance_type_name\\": \\"m1.tiny\\", \\"instance_type_ephemeral_gb\\": \\"0\\", \\"instance_type_rxtx_factor\\": \\"1.0\\", \\"clean_attempts\\": \\"2\\", \\"image_min_disk\\": \\"1\\", \\"gc_dst_host\\": \\"10.35.0.15\\", \\"network_allocated\\": \\"True\\", \\"instance_type_flavorid\\": \\"1\\", \\"image_container_format\\": \\"bare\\", \\"images\\": \\"2f57332b-5d86-4ed6-a7e9-69942f23bbea,19d92c5d-6891-4938-8831-746791d29124\\", \\"instance_type_vcpus\\": \\"1\\", \\"image_min_ram\\": \\"0\\", \\"image_disk_format\\": \\"qcow2\\", \\"image_base_image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\"}, \\"task_state\\": \\"migrating\\", \\"shutdown_terminate\\": false, \\"cell_name\\": null, \\"root_gb\\": 1, \\"locked\\": false, \\"created_at\\": \\"2014-07-08T18:39:17Z\\", \\"locked_by\\": null, \\"launch_index\\": 0, \\"metadata\\": {}, \\"memory_mb\\": 512, \\"vcpus\\": 1, \\"image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\", \\"root_device_name\\": \\"/dev/vda\\", \\"auto_disk_config\\": false, \\"os_type\\": null, \\"scheduled_at\\": \\"2014-07-08T18:39:17Z\\"}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disk\\": null, \\"migrate_data\\": null, \\"block_migration\\": false}, \\"_unique_id\\": \\"7291027f6aea4c4cb57857b954427d06\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", 
\\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"3.19\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"pre_live_migration\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: messaging ERROR NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"2401eafe1a604dc8b14d26dc4c5cd1aa\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance\\": {\\"nova_object.version\\": \\"1.13\\", \\"nova_object.changes\\": [\\"host\\"], \\"nova_object.name\\": \\"Instance\\", \\"nova_object.data\\": {\\"vm_state\\": \\"active\\", \\"availability_zone\\": null, \\"terminated_at\\": null, \\"ephemeral_gb\\": 0, \\"instance_type_id\\": 2, \\"user_data\\": null, \\"cleaned\\": true, \\"vm_mode\\": null, \\"deleted_at\\": null, \\"reservation_id\\": \\"r-vu83jpp3\\", \\"id\\": 1, \\"security_groups\\": {\\"nova_object.version\\": \\"1.0\\", \\"nova_object.name\\": \\"SecurityGroupList\\", \\"nova_object.data\\": {\\"objects\\": [{\\"nova_object.version\\": \\"1.1\\", \\"nova_object.name\\": \\"SecurityGroup\\", \\"nova_object.data\\": {\\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"description\\": \\"default\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:33:56Z\\", \\"updated_at\\": null, \\"deleted_at\\": null, \\"id\\": 1, \\"name\\": \\"default\\"}, \\"nova_object.namespace\\": \\"nova\\"}]}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disable_terminate\\": false, \\"display_name\\": \\"vm-3904852117\\", \\"uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"default_swap_device\\": null, \\"info_cache\\": {\\"nova_object.version\\": \\"1.5\\", \\"nova_object.name\\": \\"InstanceInfoCache\\", \\"nova_object.data\\": {\\"instance_uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:39:17Z\\", \\"updated_at\\": \\"2014-07-08T18:44:54Z\\", \\"network_info\\": \\"[{\\\\\\"ovs_interfaceid\\\\\\": null, \\\\\\"network\\\\\\": {\\\\\\"bridge\\\\\\": \\\\\\"cloud\\\\\\", \\\\\\"subnets\\\\\\": [{\\\\\\"ips\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"fixed\\\\\\", \\\\\\"floating_ips\\\\\\": [], \\\\\\"address\\\\\\": \\\\\\"10.35.3.2\\\\\\"}], \\\\\\"version\\\\\\": 4, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"dns\\\\\\", \\\\\\"address\\\\\\": \\\\\\"8.8.4.4\\\\\\"}], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": \\\\\\"10.35.3.0/24\\\\\\", \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": \\\\\\"10.35.3.1\\\\\\"}}, {\\\\\\"ips\\\\\\": [], \\\\\\"version\\\\\\": null, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": null, \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": null, \\\\\\"type\\\\\\": \\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": null}}], 
\\\\\\"meta\\\\\\": {\\\\\\"tenant_id\\\\\\": null, \\\\\\"multi_host\\\\\\": true, \\\\\\"should_create_bridge\\\\\\": true}, \\\\\\"id\\\\\\": \\\\\\"933f56fa-7929-413f-ab40-9c7c91cb840b\\\\\\", \\\\\\"label\\\\\\": \\\\\\"private\\\\\\"}, \\\\\\"devname\\\\\\": null, \\\\\\"qbh_params\\\\\\": null, \\\\\\"meta\\\\\\": {}, \\\\\\"details\\\\\\": {}, \\\\\\"address\\\\\\": \\\\\\"fa:16:3e:0e:79:17\\\\\\", \\\\\\"active\\\\\\": false, \\\\\\"type\\\\\\": \\\\\\"bridge\\\\\\", \\\\\\"id\\\\\\": \\\\\\"3821355e-1e28-4fa1-adfe-520202d90d3e\\\\\\", \\\\\\"qbg_params\\\\\\": null}]\\", \\"deleted_at\\": null}, \\"nova_object.namespace\\": \\"nova\\"}, \\"hostname\\": \\"vm-3904852117\\", \\"launched_on\\": \\"10.35.0.13\\", \\"display_description\\": \\"vm-3904852117\\", \\"key_data\\": \\"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDI9gPkFoUDfotrFuS7aWZAD1k8UB1CCBrNnopMWVEOeMbfPHdQHJRHw+ATUi75dOcUzBXNdV0gbfjMvEp9OtOZY9Pf/Z4vUPSVZRPcT7YAuJnKx3bF9YXPFNQeIMZx+FxFXJeiS9Jp9puTzsavNTW/rssIl92RKqO6yDUCWr7uD3S1y8T3hMKzb5gfXZiN5jvYg69r38YxXBj2pM81b8uHkM4KsuXwSKRVnt/TS6RnyS5l9FqRYv112GtkszY8+vCvIDcGDA65dUmekKsFZOvIWv0EH19YHQ39UA6eTp/OY6SWkfMeyyzUbs9OO6hVi6R3LpVSUzO2rpOivWotNmxJ Generated by Nova\\\\n\\", \\"kernel_id\\": \\"\\", \\"power_state\\": 1, \\"default_ephemeral_device\\": null, \\"progress\\": 0, \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"launched_at\\": \\"2014-07-08T18:39:32Z\\", \\"config_drive\\": \\"\\", \\"node\\": \\"node-10-35-0-13\\", \\"ramdisk_id\\": \\"\\", \\"access_ip_v6\\": null, \\"access_ip_v4\\": null, \\"deleted\\": false, \\"key_name\\": \\"keypair-96183545624\\", \\"updated_at\\": \\"2014-07-08T19:28:42Z\\", \\"host\\": \\"10.35.0.15\\", \\"ephemeral_key_uuid\\": null, \\"architecture\\": null, \\"user_id\\": \\"579c706cdd354c3bace1d0e30ac24d59\\", \\"system_metadata\\": {\\"gc_src_host\\": \\"10.35.0.13\\", \\"logical_volumes\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7_disk:1073741824\\", \\"instance_type_memory_mb\\": \\"512\\", \\"instance_type_swap\\": \\"0\\", \\"instance_type_vcpu_weight\\": null, \\"instance_type_root_gb\\": \\"1\\", \\"instance_type_id\\": \\"2\\", \\"instance_type_name\\": \\"m1.tiny\\", \\"instance_type_ephemeral_gb\\": \\"0\\", \\"instance_type_rxtx_factor\\": \\"1.0\\", \\"clean_attempts\\": \\"2\\", \\"image_min_disk\\": \\"1\\", \\"gc_dst_host\\": \\"10.35.0.15\\", \\"network_allocated\\": \\"True\\", \\"instance_type_flavorid\\": \\"1\\", \\"image_container_format\\": \\"bare\\", \\"images\\": \\"2f57332b-5d86-4ed6-a7e9-69942f23bbea,19d92c5d-6891-4938-8831-746791d29124\\", \\"instance_type_vcpus\\": \\"1\\", \\"image_min_ram\\": \\"0\\", \\"image_disk_format\\": \\"qcow2\\", \\"image_base_image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\"}, \\"task_state\\": \\"migrating\\", \\"shutdown_terminate\\": false, \\"cell_name\\": null, \\"root_gb\\": 1, \\"locked\\": false, \\"created_at\\": \\"2014-07-08T18:39:17Z\\", \\"locked_by\\": null, \\"launch_index\\": 0, \\"metadata\\": {}, \\"memory_mb\\": 512, \\"vcpus\\": 1, \\"image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\", \\"root_device_name\\": \\"/dev/vda\\", \\"auto_disk_config\\": false, \\"os_type\\": null, \\"scheduled_at\\": \\"2014-07-08T18:39:17Z\\"}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disk\\": null, \\"migrate_data\\": null, \\"block_migration\\": false}, \\"_unique_id\\": \\"7291027f6aea4c4cb57857b954427d06\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", 
\\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"3.19\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"pre_live_migration\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key=u'compute.10.35.0.15'
Jul 8 19:28:42 10.35.0.13 cobalt-compute: channel DEBUG Closed channel #1
Jul 8 19:28:42 10.35.0.13 cobalt-compute: channel DEBUG using channel_id: 1
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.881 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"2401eafe1a604dc8b14d26dc4c5cd1aa\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance\\": {\\"nova_object.version\\": \\"1.13\\", \\"nova_object.changes\\": [\\"host\\"], \\"nova_object.name\\": \\"Instance\\", \\"nova_object.data\\": {\\"vm_state\\": \\"active\\", \\"availability_zone\\": null, \\"terminated_at\\": null, \\"ephemeral_gb\\": 0, \\"instance_type_id\\": 2, \\"user_data\\": null, \\"cleaned\\": true, \\"vm_mode\\": null, \\"deleted_at\\": null, \\"reservation_id\\": \\"r-vu83jpp3\\", \\"id\\": 1, \\"security_groups\\": {\\"nova_object.version\\": \\"1.0\\", \\"nova_object.name\\": \\"SecurityGroupList\\", \\"nova_object.data\\": {\\"objects\\": [{\\"nova_object.version\\": \\"1.1\\", \\"nova_object.name\\": \\"SecurityGroup\\", \\"nova_object.data\\": {\\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"description\\": \\"default\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:33:56Z\\", \\"updated_at\\": null, \\"deleted_at\\": null, \\"id\\": 1, \\"name\\": \\"default\\"}, \\"nova_object.namespace\\": \\"nova\\"}]}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disable_terminate\\": false, \\"display_name\\": \\"vm-3904852117\\", \\"uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"default_swap_device\\": null, \\"info_cache\\": {\\"nova_object.version\\": \\"1.5\\", \\"nova_object.name\\": \\"InstanceInfoCache\\", \\"nova_object.data\\": {\\"instance_uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:39:17Z\\", \\"updated_at\\": \\"2014-07-08T18:44:54Z\\", \\"network_info\\": \\"[{\\\\\\"ovs_interfaceid\\\\\\": null, \\\\\\"network\\\\\\": {\\\\\\"bridge\\\\\\": \\\\\\"cloud\\\\\\", \\\\\\"subnets\\\\\\": [{\\\\\\"ips\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"fixed\\\\\\", \\\\\\"floating_ips\\\\\\": [], \\\\\\"address\\\\\\": \\\\\\"10.35.3.2\\\\\\"}], \\\\\\"version\\\\\\": 4, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"dns\\\\\\", \\\\\\"address\\\\\\": \\\\\\"8.8.4.4\\\\\\"}], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": \\\\\\"10.35.3.0/24\\\\\\", \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": \\\\\\"10.35.3.1\\\\\\"}}, {\\\\\\"ips\\\\\\": [], \\\\\\"version\\\\\\": null, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": null, \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": null, \\\\\\"type\\\\\\": 
\\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": null}}], \\\\\\"meta\\\\\\": {\\\\\\"tenant_id\\\\\\": null, \\\\\\"multi_host\\\\\\": true, \\\\\\"should_create_bridge\\\\\\": true}, \\\\\\"id\\\\\\": \\\\\\"933f56fa-7929-413f-ab40-9c7c91cb840b\\\\\\", \\\\\\"label\\\\\\": \\\\\\"private\\\\\\"}, \\\\\\"devname\\\\\\": null, \\\\\\"qbh_params\\\\\\": null, \\\\\\"meta\\\\\\": {}, \\\\\\"details\\\\\\": {}, \\\\\\"address\\\\\\": \\\\\\"fa:16:3e:0e:79:17\\\\\\", \\\\\\"active\\\\\\": false, \\\\\\"type\\\\\\": \\\\\\"bridge\\\\\\", \\\\\\"id\\\\\\": \\\\\\"3821355e-1e28-4fa1-adfe-520202d90d3e\\\\\\", \\\\\\"qbg_params\\\\\\": null}]\\", \\"deleted_at\\": null}, \\"nova_object.namespace\\": \\"nova\\"}, \\"hostname\\": \\"vm-3904852117\\", \\"launched_on\\": \\"10.35.0.13\\", \\"display_description\\": \\"vm-3904852117\\", \\"key_data\\": \\"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDI9gPkFoUDfotrFuS7aWZAD1k8UB1CCBrNnopMWVEOeMbfPHdQHJRHw+ATUi75dOcUzBXNdV0gbfjMvEp9OtOZY9Pf/Z4vUPSVZRPcT7YAuJnKx3bF9YXPFNQeIMZx+FxFXJeiS9Jp9puTzsavNTW/rssIl92RKqO6yDUCWr7uD3S1y8T3hMKzb5gfXZiN5jvYg69r38YxXBj2pM81b8uHkM4KsuXwSKRVnt/TS6RnyS5l9FqRYv112GtkszY8+vCvIDcGDA65dUmekKsFZOvIWv0EH19YHQ39UA6eTp/OY6SWkfMeyyzUbs9OO6hVi6R3LpVSUzO2rpOivWotNmxJ Generated by Nova\\\\n\\", \\"kernel_id\\": \\"\\", \\"power_state\\": 1, \\"default_ephemeral_device\\": null, \\"progress\\": 0, \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"launched_at\\": \\"2014-07-08T18:39:32Z\\", \\"config_drive\\": \\"\\", \\"node\\": \\"node-10-35-0-13\\", \\"ramdisk_id\\": \\"\\", \\"access_ip_v6\\": null, \\"access_ip_v4\\": null, \\"deleted\\": false, \\"key_name\\": \\"keypair-96183545624\\", \\"updated_at\\": \\"2014-07-08T19:28:42Z\\", \\"host\\": \\"10.35.0.15\\", \\"ephemeral_key_uuid\\": null, \\"architecture\\": null, \\"user_id\\": \\"579c706cdd354c3bace1d0e30ac24d59\\", \\"system_metadata\\": {\\"gc_src_host\\": \\"10.35.0.13\\", \\"logical_volumes\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7_disk:1073741824\\", \\"instance_type_memory_mb\\": \\"512\\", \\"instance_type_swap\\": \\"0\\", \\"instance_type_vcpu_weight\\": null, \\"instance_type_root_gb\\": \\"1\\", \\"instance_type_id\\": \\"2\\", \\"instance_type_name\\": \\"m1.tiny\\", \\"instance_type_ephemeral_gb\\": \\"0\\", \\"instance_type_rxtx_factor\\": \\"1.0\\", \\"clean_attempts\\": \\"2\\", \\"image_min_disk\\": \\"1\\", \\"gc_dst_host\\": \\"10.35.0.15\\", \\"network_allocated\\": \\"True\\", \\"instance_type_flavorid\\": \\"1\\", \\"image_container_format\\": \\"bare\\", \\"images\\": \\"2f57332b-5d86-4ed6-a7e9-69942f23bbea,19d92c5d-6891-4938-8831-746791d29124\\", \\"instance_type_vcpus\\": \\"1\\", \\"image_min_ram\\": \\"0\\", \\"image_disk_format\\": \\"qcow2\\", \\"image_base_image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\"}, \\"task_state\\": \\"migrating\\", \\"shutdown_terminate\\": false, \\"cell_name\\": null, \\"root_gb\\": 1, \\"locked\\": false, \\"created_at\\": \\"2014-07-08T18:39:17Z\\", \\"locked_by\\": null, \\"launch_index\\": 0, \\"metadata\\": {}, \\"memory_mb\\": 512, \\"vcpus\\": 1, \\"image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\", \\"root_device_name\\": \\"/dev/vda\\", \\"auto_disk_config\\": false, \\"os_type\\": null, \\"scheduled_at\\": \\"2014-07-08T18:39:17Z\\"}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disk\\": null, \\"migrate_data\\": null, \\"block_migration\\": false}, \\"_unique_id\\": \\"7291027f6aea4c4cb57857b954427d06\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": 
\\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"3.19\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"pre_live_migration\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'2', 'redelivered': False, 'routing_key': u'compute.10.35.0.15', 'delivery_tag': 2, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030d47e90>}
Jul 8 19:28:42 10.35.0.13 cobalt-compute: channel DEBUG Channel open
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.884 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"2401eafe1a604dc8b14d26dc4c5cd1aa\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance\\": {\\"nova_object.version\\": \\"1.13\\", \\"nova_object.changes\\": [\\"host\\"], \\"nova_object.name\\": \\"Instance\\", \\"nova_object.data\\": {\\"vm_state\\": \\"active\\", \\"availability_zone\\": null, \\"terminated_at\\": null, \\"ephemeral_gb\\": 0, \\"instance_type_id\\": 2, \\"user_data\\": null, \\"cleaned\\": true, \\"vm_mode\\": null, \\"deleted_at\\": null, \\"reservation_id\\": \\"r-vu83jpp3\\", \\"id\\": 1, \\"security_groups\\": {\\"nova_object.version\\": \\"1.0\\", \\"nova_object.name\\": \\"SecurityGroupList\\", \\"nova_object.data\\": {\\"objects\\": [{\\"nova_object.version\\": \\"1.1\\", \\"nova_object.name\\": \\"SecurityGroup\\", \\"nova_object.data\\": {\\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"description\\": \\"default\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:33:56Z\\", \\"updated_at\\": null, \\"deleted_at\\": null, \\"id\\": 1, \\"name\\": \\"default\\"}, \\"nova_object.namespace\\": \\"nova\\"}]}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disable_terminate\\": false, \\"display_name\\": \\"vm-3904852117\\", \\"uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"default_swap_device\\": null, \\"info_cache\\": {\\"nova_object.version\\": \\"1.5\\", \\"nova_object.name\\": \\"InstanceInfoCache\\", \\"nova_object.data\\": {\\"instance_uuid\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"deleted\\": false, \\"created_at\\": \\"2014-07-08T18:39:17Z\\", \\"updated_at\\": \\"2014-07-08T18:44:54Z\\", \\"network_info\\": \\"[{\\\\\\"ovs_interfaceid\\\\\\": null, \\\\\\"network\\\\\\": {\\\\\\"bridge\\\\\\": \\\\\\"cloud\\\\\\", \\\\\\"subnets\\\\\\": [{\\\\\\"ips\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"fixed\\\\\\", \\\\\\"floating_ips\\\\\\": [], \\\\\\"address\\\\\\": \\\\\\"10.35.3.2\\\\\\"}], \\\\\\"version\\\\\\": 4, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [{\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"dns\\\\\\", \\\\\\"address\\\\\\": \\\\\\"8.8.4.4\\\\\\"}], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": \\\\\\"10.35.3.0/24\\\\\\", \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": 4, \\\\\\"type\\\\\\": \\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": \\\\\\"10.35.3.1\\\\\\"}}, {\\\\\\"ips\\\\\\": [], \\\\\\"version\\\\\\": null, \\\\\\"meta\\\\\\": {\\\\\\"dhcp_server\\\\\\": \\\\\\"10.35.3.3\\\\\\"}, \\\\\\"dns\\\\\\": [], \\\\\\"routes\\\\\\": [], \\\\\\"cidr\\\\\\": null, \\\\\\"gateway\\\\\\": {\\\\\\"meta\\\\\\": {}, \\\\\\"version\\\\\\": null, 
\\\\\\"type\\\\\\": \\\\\\"gateway\\\\\\", \\\\\\"address\\\\\\": null}}], \\\\\\"meta\\\\\\": {\\\\\\"tenant_id\\\\\\": null, \\\\\\"multi_host\\\\\\": true, \\\\\\"should_create_bridge\\\\\\": true}, \\\\\\"id\\\\\\": \\\\\\"933f56fa-7929-413f-ab40-9c7c91cb840b\\\\\\", \\\\\\"label\\\\\\": \\\\\\"private\\\\\\"}, \\\\\\"devname\\\\\\": null, \\\\\\"qbh_params\\\\\\": null, \\\\\\"meta\\\\\\": {}, \\\\\\"details\\\\\\": {}, \\\\\\"address\\\\\\": \\\\\\"fa:16:3e:0e:79:17\\\\\\", \\\\\\"active\\\\\\": false, \\\\\\"type\\\\\\": \\\\\\"bridge\\\\\\", \\\\\\"id\\\\\\": \\\\\\"3821355e-1e28-4fa1-adfe-520202d90d3e\\\\\\", \\\\\\"qbg_params\\\\\\": null}]\\", \\"deleted_at\\": null}, \\"nova_object.namespace\\": \\"nova\\"}, \\"hostname\\": \\"vm-3904852117\\", \\"launched_on\\": \\"10.35.0.13\\", \\"display_description\\": \\"vm-3904852117\\", \\"key_data\\": \\"ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDI9gPkFoUDfotrFuS7aWZAD1k8UB1CCBrNnopMWVEOeMbfPHdQHJRHw+ATUi75dOcUzBXNdV0gbfjMvEp9OtOZY9Pf/Z4vUPSVZRPcT7YAuJnKx3bF9YXPFNQeIMZx+FxFXJeiS9Jp9puTzsavNTW/rssIl92RKqO6yDUCWr7uD3S1y8T3hMKzb5gfXZiN5jvYg69r38YxXBj2pM81b8uHkM4KsuXwSKRVnt/TS6RnyS5l9FqRYv112GtkszY8+vCvIDcGDA65dUmekKsFZOvIWv0EH19YHQ39UA6eTp/OY6SWkfMeyyzUbs9OO6hVi6R3LpVSUzO2rpOivWotNmxJ Generated by Nova\\\\n\\", \\"kernel_id\\": \\"\\", \\"power_state\\": 1, \\"default_ephemeral_device\\": null, \\"progress\\": 0, \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"launched_at\\": \\"2014-07-08T18:39:32Z\\", \\"config_drive\\": \\"\\", \\"node\\": \\"node-10-35-0-13\\", \\"ramdisk_id\\": \\"\\", \\"access_ip_v6\\": null, \\"access_ip_v4\\": null, \\"deleted\\": false, \\"key_name\\": \\"keypair-96183545624\\", \\"updated_at\\": \\"2014-07-08T19:28:42Z\\", \\"host\\": \\"10.35.0.15\\", \\"ephemeral_key_uuid\\": null, \\"architecture\\": null, \\"user_id\\": \\"579c706cdd354c3bace1d0e30ac24d59\\", \\"system_metadata\\": {\\"gc_src_host\\": \\"10.35.0.13\\", \\"logical_volumes\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7_disk:1073741824\\", \\"instance_type_memory_mb\\": \\"512\\", \\"instance_type_swap\\": \\"0\\", \\"instance_type_vcpu_weight\\": null, \\"instance_type_root_gb\\": \\"1\\", \\"instance_type_id\\": \\"2\\", \\"instance_type_name\\": \\"m1.tiny\\", \\"instance_type_ephemeral_gb\\": \\"0\\", \\"instance_type_rxtx_factor\\": \\"1.0\\", \\"clean_attempts\\": \\"2\\", \\"image_min_disk\\": \\"1\\", \\"gc_dst_host\\": \\"10.35.0.15\\", \\"network_allocated\\": \\"True\\", \\"instance_type_flavorid\\": \\"1\\", \\"image_container_format\\": \\"bare\\", \\"images\\": \\"2f57332b-5d86-4ed6-a7e9-69942f23bbea,19d92c5d-6891-4938-8831-746791d29124\\", \\"instance_type_vcpus\\": \\"1\\", \\"image_min_ram\\": \\"0\\", \\"image_disk_format\\": \\"qcow2\\", \\"image_base_image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\"}, \\"task_state\\": \\"migrating\\", \\"shutdown_terminate\\": false, \\"cell_name\\": null, \\"root_gb\\": 1, \\"locked\\": false, \\"created_at\\": \\"2014-07-08T18:39:17Z\\", \\"locked_by\\": null, \\"launch_index\\": 0, \\"metadata\\": {}, \\"memory_mb\\": 512, \\"vcpus\\": 1, \\"image_ref\\": \\"8f23694a-2995-4070-a5e8-661de3207f2e\\", \\"root_device_name\\": \\"/dev/vda\\", \\"auto_disk_config\\": false, \\"os_type\\": null, \\"scheduled_at\\": \\"2014-07-08T18:39:17Z\\"}, \\"nova_object.namespace\\": \\"nova\\"}, \\"disk\\": null, \\"migrate_data\\": null, \\"block_migration\\": false}, \\"_unique_id\\": \\"7291027f6aea4c4cb57857b954427d06\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", 
\\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_ff10fb357a86427594db414dc3e5acc6\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"3.19\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"pre_live_migration\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.888 3328 ERROR nova.compute.manager [req-e9cd61c8-c319-45ab-a742-f01416bbb0ff 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: pre_live_migration
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.888 3328 ERROR nova.compute.manager [req-e9cd61c8-c319-45ab-a742-f01416bbb0ff 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: pre_live_migration instance='ce77d052-f021-44a2-b5f3-5908ec04fcc7'
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.893 3328 ERROR nova.compute.manager [req-e9cd61c8-c319-45ab-a742-f01416bbb0ff 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: pre_live_migration instance='ce77d052-f021-44a2-b5f3-5908ec04fcc7' calling _get_instance_nw_info
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.895 3328 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"464e9d1d7df74f49acaa72263bb06d11\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"52ad8486fc90479fa71dacc3a1ce359f\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.15 nova-compute 2014-07-08 19:28:42.896 3328 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"464e9d1d7df74f49acaa72263bb06d11\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"52ad8486fc90479fa71dacc3a1ce359f\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:28:42 10.35.0.15 nova-network 2014-07-08 19:28:42.897 3465 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"464e9d1d7df74f49acaa72263bb06d11\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"52ad8486fc90479fa71dacc3a1ce359f\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 31, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7eff2d221290>}
Jul 8 19:28:42 10.35.0.15 nova-network 2014-07-08 19:28:42.899 3465 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"464e9d1d7df74f49acaa72263bb06d11\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": \\"ce77d052-f021-44a2-b5f3-5908ec04fcc7\\", \\"host\\": \\"10.35.0.15\\", \\"project_id\\": \\"13f85c1e1f6b431eb47db205456ef242\\", \\"rxtx_factor\\": 1.0}, \\"_unique_id\\": \\"52ad8486fc90479fa71dacc3a1ce359f\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.9\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"get_instance_nw_info\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:42 10.35.0.14 nova-network 2014-07-08 19:28:42.995 6514 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"8dd500f0bcdf42c4ae0034e983986598\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"network_id\\": 1, \\"teardown\\": false}, \\"_unique_id\\": \\"ea038bc5357142a68c2cbb483738fa11\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_665664b1b2364a339f6ee930172547c8\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"rpc_setup_network_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:28:42 10.35.0.14 nova-network 2014-07-08 19:28:42.995 6514 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"8dd500f0bcdf42c4ae0034e983986598\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"network_id\\": 1, \\"teardown\\": false}, \\"_unique_id\\": \\"ea038bc5357142a68c2cbb483738fa11\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_665664b1b2364a339f6ee930172547c8\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"rpc_setup_network_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key=u'network.10.35.0.15'
Jul 8 19:28:43 10.35.0.15 nova-network 2014-07-08 19:28:43.001 3465 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"8dd500f0bcdf42c4ae0034e983986598\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"network_id\\": 1, \\"teardown\\": false}, \\"_unique_id\\": \\"ea038bc5357142a68c2cbb483738fa11\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_665664b1b2364a339f6ee930172547c8\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"rpc_setup_network_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'2', 'redelivered': False, 'routing_key': u'network.10.35.0.15', 'delivery_tag': 32, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7eff2d221290>}
Jul 8 19:28:43 10.35.0.15 nova-network 2014-07-08 19:28:43.003 3465 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"8dd500f0bcdf42c4ae0034e983986598\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-585393f6-c238-40c5-88fe-fbf1dd0c48e5\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"network_id\\": 1, \\"teardown\\": false}, \\"_unique_id\\": \\"ea038bc5357142a68c2cbb483738fa11\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_665664b1b2364a339f6ee930172547c8\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:40.972908\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"rpc_setup_network_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:43 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:43 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:43 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:43 10.35.0.15 nova-network 2014-07-08 19:28:43.081 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"70ffc3cac1994cce8075a2c0afc092df\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"464e9d1d7df74f49acaa72263bb06d11\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:43 10.35.0.15 nova-network 2014-07-08 19:28:43.081 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"70ffc3cac1994cce8075a2c0afc092df\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"464e9d1d7df74f49acaa72263bb06d11\\"}", "oslo.version": "2.0"}', routing_key=u'reply_af4fb5adaca445a3b6d555c358806a4e'
Jul 8 19:28:43 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf ip route replace 10.35.3.0/24 dev cloud table cloud
Jul 8 19:28:43 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:43 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:43 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.083 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"70ffc3cac1994cce8075a2c0afc092df\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"464e9d1d7df74f49acaa72263bb06d11\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_af4fb5adaca445a3b6d555c358806a4e', 'delivery_tag': 85, 'exchange': u'reply_af4fb5adaca445a3b6d555c358806a4e'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030b1ff10>}
Jul 8 19:28:43 10.35.0.15 nova-network 2014-07-08 19:28:43.083 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"270483332a364806b155301388293ad3\\", \\"failure\\": null, \\"_msg_id\\": \\"464e9d1d7df74f49acaa72263bb06d11\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:43 10.35.0.15 nova-network 2014-07-08 19:28:43.083 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"270483332a364806b155301388293ad3\\", \\"failure\\": null, \\"_msg_id\\": \\"464e9d1d7df74f49acaa72263bb06d11\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_af4fb5adaca445a3b6d555c358806a4e'
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.083 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"70ffc3cac1994cce8075a2c0afc092df\\", \\"failure\\": null, \\"result\\": [{\\"ovs_interfaceid\\": null, \\"network\\": {\\"bridge\\": \\"cloud\\", \\"subnets\\": [{\\"ips\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"fixed\\", \\"floating_ips\\": [], \\"address\\": \\"10.35.3.2\\"}], \\"version\\": 4, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [{\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"dns\\", \\"address\\": \\"8.8.4.4\\"}], \\"routes\\": [], \\"cidr\\": \\"10.35.3.0/24\\", \\"gateway\\": {\\"meta\\": {}, \\"version\\": 4, \\"type\\": \\"gateway\\", \\"address\\": \\"10.35.3.1\\"}}, {\\"ips\\": [], \\"version\\": null, \\"meta\\": {\\"dhcp_server\\": \\"10.35.3.5\\"}, \\"dns\\": [], \\"routes\\": [], \\"cidr\\": null, \\"gateway\\": {\\"meta\\": {}, \\"version\\": null, \\"type\\": \\"gateway\\", \\"address\\": null}}], \\"meta\\": {\\"tenant_id\\": null, \\"multi_host\\": true, \\"should_create_bridge\\": true}, \\"id\\": \\"933f56fa-7929-413f-ab40-9c7c91cb840b\\", \\"label\\": \\"private\\"}, \\"devname\\": null, \\"qbh_params\\": null, \\"meta\\": {}, \\"details\\": {}, \\"address\\": \\"fa:16:3e:0e:79:17\\", \\"active\\": false, \\"type\\": \\"bridge\\", \\"id\\": \\"3821355e-1e28-4fa1-adfe-520202d90d3e\\", \\"qbg_params\\": null}], \\"_msg_id\\": \\"464e9d1d7df74f49acaa72263bb06d11\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.084 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"270483332a364806b155301388293ad3\\", \\"failure\\": null, \\"_msg_id\\": \\"464e9d1d7df74f49acaa72263bb06d11\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_af4fb5adaca445a3b6d555c358806a4e', 'delivery_tag': 86, 'exchange': u'reply_af4fb5adaca445a3b6d555c358806a4e'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030b1ff10>}
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.084 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"270483332a364806b155301388293ad3\\", \\"failure\\": null, \\"_msg_id\\": \\"464e9d1d7df74f49acaa72263bb06d11\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.102 3328 ERROR nova.compute.manager [req-e9cd61c8-c319-45ab-a742-f01416bbb0ff 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: pre_live_migration instance='ce77d052-f021-44a2-b5f3-5908ec04fcc7' _get_instance_nw_info returned network_info=[VIF({'ovs_interfaceid': None, 'network': Network({'bridge': u'cloud', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.35.3.2'})], 'version': 4, 'meta': {u'dhcp_server': u'10.35.3.5'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.35.3.0/24', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.35.3.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.35.3.5'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True}, 'id': u'933f56fa-7929-413f-ab40-9c7c91cb840b', 'label': u'private'}), 'devname': None, 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:0e:79:17', 'active': False, 'type': u'bridge', 'id': u'3821355e-1e28-4fa1-adfe-520202d90d3e', 'qbg_params': None})]
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.103 3328 ERROR nova.compute.manager [req-e9cd61c8-c319-45ab-a742-f01416bbb0ff 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: pre_live_migration instance='ce77d052-f021-44a2-b5f3-5908ec04fcc7' setup_networks_on_host self.host='10.35.0.15' pre_live_migration_data=None
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.105 3328 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"32ac1a0ceca44ada925deeb99fa3ddaf\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": 1, \\"teardown\\": false, \\"host\\": \\"10.35.0.15\\"}, \\"_unique_id\\": \\"18eea49f8bf64354aeb43af8b5556937\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.105 3328 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"32ac1a0ceca44ada925deeb99fa3ddaf\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": 1, \\"teardown\\": false, \\"host\\": \\"10.35.0.15\\"}, \\"_unique_id\\": \\"18eea49f8bf64354aeb43af8b5556937\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key='network'
Jul 8 19:28:43 10.35.0.13 nova-network 2014-07-08 19:28:43.104 4876 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"32ac1a0ceca44ada925deeb99fa3ddaf\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": 1, \\"teardown\\": false, \\"host\\": \\"10.35.0.15\\"}, \\"_unique_id\\": \\"18eea49f8bf64354aeb43af8b5556937\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'network', 'delivery_tag': 32, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7fdeb6472390>}
Jul 8 19:28:43 10.35.0.13 nova-network 2014-07-08 19:28:43.106 4876 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"32ac1a0ceca44ada925deeb99fa3ddaf\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"instance_id\\": 1, \\"teardown\\": false, \\"host\\": \\"10.35.0.15\\"}, \\"_unique_id\\": \\"18eea49f8bf64354aeb43af8b5556937\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_af4fb5adaca445a3b6d555c358806a4e\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"setup_networks_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:43 10.35.0.13 nova-network 2014-07-08 19:28:43.263 4876 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"61db0e29caf64e38952ae323700fa33a\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"network_id\\": 1, \\"teardown\\": false}, \\"_unique_id\\": \\"4ebd8f75ee064be493d1a488434b71b8\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_423752426aac4df5930f72f335e8adf0\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"rpc_setup_network_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', headers={'ttl': 60000}, properties={'delivery_mode': 2}
Jul 8 19:28:43 10.35.0.13 nova-network 2014-07-08 19:28:43.264 4876 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"61db0e29caf64e38952ae323700fa33a\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"network_id\\": 1, \\"teardown\\": false}, \\"_unique_id\\": \\"4ebd8f75ee064be493d1a488434b71b8\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_423752426aac4df5930f72f335e8adf0\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"rpc_setup_network_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', routing_key=u'network.10.35.0.15'
Jul 8 19:28:43 10.35.0.15 nova-network 2014-07-08 19:28:43.267 3465 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"61db0e29caf64e38952ae323700fa33a\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"network_id\\": 1, \\"teardown\\": false}, \\"_unique_id\\": \\"4ebd8f75ee064be493d1a488434b71b8\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_423752426aac4df5930f72f335e8adf0\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"rpc_setup_network_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'2', 'redelivered': False, 'routing_key': u'network.10.35.0.15', 'delivery_tag': 33, 'exchange': u'nova'}, 'properties': {'priority': 0, 'application_headers': {u'ttl': 60000}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7eff2d221290>}
Jul 8 19:28:43 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf ip route replace 0.0.0.0/0 via 10.35.3.1 dev cloud table cloud
Jul 8 19:28:43 10.35.0.13 nova-network 2014-07-08 19:28:43.266 4876 ERROR oslo.messaging._drivers.impl_rabbit [-] Failed to consume message from queue: [Errno 104] Connection reset by peer
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit Traceback (most recent call last):
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 672, in ensure
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit return method(*args, **kwargs)
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/oslo/messaging/_drivers/impl_rabbit.py", line 752, in _consume
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit return self.connection.drain_events(timeout=timeout)
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/connection.py", line 281, in drain_events
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit return self.transport.drain_events(self.connection, **kwargs)
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/kombu/transport/pyamqp.py", line 103, in drain_events
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit return connection.drain_events(**kwargs)
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/connection.py", line 320, in drain_events
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit return amqp_method(channel, args)
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/connection.py", line 523, in _close
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit self._x_close_ok()
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/connection.py", line 551, in _x_close_ok
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit self._send_method((10, 51))
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/abstract_channel.py", line 56, in _send_method
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit self.channel_id, method_sig, args, content,
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/method_framing.py", line 221, in write_method
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit write_frame(1, channel, payload)
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/amqp/transport.py", line 177, in write_frame
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit frame_type, channel, size, payload, 0xce,
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/eventlet/greenio.py", line 307, in sendall
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit tail = self.send(data, flags)
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit File "/usr/lib64/python2.7/site-packages/eventlet/greenio.py", line 293, in send
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit total_sent += fd.send(data[total_sent:], flags)
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit error: [Errno 104] Connection reset by peer
2014-07-08 19:28:43.266 4876 TRACE oslo.messaging._drivers.impl_rabbit
Jul 8 19:28:43 10.35.0.13 nova-network 2014-07-08 19:28:43.267 4876 INFO oslo.messaging._drivers.impl_rabbit [-] Reconnecting to AMQP server on 10.35.0.3:5672
Jul 8 19:28:43 10.35.0.13 nova-network 2014-07-08 19:28:43.267 4876 INFO oslo.messaging._drivers.impl_rabbit [-] Delaying reconnect for 1.0 seconds...
Jul 8 19:28:43 10.35.0.15 nova-network 2014-07-08 19:28:43.270 3465 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_context_roles\\": [\\"Admin\\", \\"_member_\\", \\"ProjectAdmin\\", \\"admin\\"], \\"_msg_id\\": \\"61db0e29caf64e38952ae323700fa33a\\", \\"_context_quota_class\\": null, \\"_context_request_id\\": \\"req-e9cd61c8-c319-45ab-a742-f01416bbb0ff\\", \\"_context_service_catalog\\": [{\\"endpoints_links\\": [], \\"endpoints\\": [{\\"adminURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"region\\": \\"RegionOne\\", \\"internalURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\", \\"publicURL\\": \\"http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a\\"}], \\"type\\": \\"volume\\", \\"name\\": \\"Volume Service\\"}], \\"args\\": {\\"network_id\\": 1, \\"teardown\\": false}, \\"_unique_id\\": \\"4ebd8f75ee064be493d1a488434b71b8\\", \\"_context_user\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_user_id\\": \\"260659016df54f47833004653cfe0cb8\\", \\"_context_project_name\\": \\"service\\", \\"_context_read_deleted\\": \\"no\\", \\"_reply_q\\": \\"reply_423752426aac4df5930f72f335e8adf0\\", \\"_context_auth_token\\": \\"e461a54513ed421ab86f2cc67e181011\\", \\"_context_tenant\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_instance_lock_checked\\": false, \\"_context_is_admin\\": true, \\"version\\": \\"1.0\\", \\"_context_project_id\\": \\"9a663247f58a4412a840385af8d9e73a\\", \\"_context_timestamp\\": \\"2014-07-08T19:28:42.206685\\", \\"_context_user_name\\": \\"nova\\", \\"method\\": \\"rpc_setup_network_on_host\\", \\"_context_remote_address\\": \\"10.35.0.13\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:43 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf kill -HUP 5300
Jul 8 19:28:43 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf ip route replace 10.35.3.0/24 dev cloud table cloud
Jul 8 19:28:43 10.35.0.15 dnsmasq-dhcp[5300]: read /var/lib/dnsmasq/nova-cloud.conf
Jul 8 19:28:43 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:43 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:43 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:43 10.35.0.15 nova-network 2014-07-08 19:28:43.670 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"75dd27f5acb442c3b3a4948f70cb20ff\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"8dd500f0bcdf42c4ae0034e983986598\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:43 10.35.0.15 nova-network 2014-07-08 19:28:43.670 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"75dd27f5acb442c3b3a4948f70cb20ff\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"8dd500f0bcdf42c4ae0034e983986598\\"}", "oslo.version": "2.0"}', routing_key=u'reply_665664b1b2364a339f6ee930172547c8'
Jul 8 19:28:43 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:43 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:43 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:43 10.35.0.15 nova-network 2014-07-08 19:28:43.672 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"d590e033ac8d4abb8fc927227e9f12df\\", \\"failure\\": null, \\"_msg_id\\": \\"8dd500f0bcdf42c4ae0034e983986598\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:43 10.35.0.15 nova-network 2014-07-08 19:28:43.672 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"d590e033ac8d4abb8fc927227e9f12df\\", \\"failure\\": null, \\"_msg_id\\": \\"8dd500f0bcdf42c4ae0034e983986598\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_665664b1b2364a339f6ee930172547c8'
Jul 8 19:28:43 10.35.0.14 nova-network 2014-07-08 19:28:43.669 6514 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"75dd27f5acb442c3b3a4948f70cb20ff\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"8dd500f0bcdf42c4ae0034e983986598\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_665664b1b2364a339f6ee930172547c8', 'delivery_tag': 1, 'exchange': u'reply_665664b1b2364a339f6ee930172547c8'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f5c1455c950>}
Jul 8 19:28:43 10.35.0.14 nova-network 2014-07-08 19:28:43.669 6514 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"75dd27f5acb442c3b3a4948f70cb20ff\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"8dd500f0bcdf42c4ae0034e983986598\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:43 10.35.0.14 nova-network 2014-07-08 19:28:43.670 6514 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"d590e033ac8d4abb8fc927227e9f12df\\", \\"failure\\": null, \\"_msg_id\\": \\"8dd500f0bcdf42c4ae0034e983986598\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_665664b1b2364a339f6ee930172547c8', 'delivery_tag': 2, 'exchange': u'reply_665664b1b2364a339f6ee930172547c8'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f5c1455c950>}
Jul 8 19:28:43 10.35.0.14 nova-network 2014-07-08 19:28:43.670 6514 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"d590e033ac8d4abb8fc927227e9f12df\\", \\"failure\\": null, \\"_msg_id\\": \\"8dd500f0bcdf42c4ae0034e983986598\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:28:43 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:43 10.35.0.14 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:43 10.35.0.14 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:43 10.35.0.14 nova-network 2014-07-08 19:28:43.672 6514 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"20cf1161fc184ba383c821681ab733cf\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"be1be11e962d4f198720ff9a8bb226a7\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:43 10.35.0.14 nova-network 2014-07-08 19:28:43.672 6514 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"20cf1161fc184ba383c821681ab733cf\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"be1be11e962d4f198720ff9a8bb226a7\\"}", "oslo.version": "2.0"}', routing_key=u'reply_af4fb5adaca445a3b6d555c358806a4e'
Jul 8 19:28:43 10.35.0.14 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:43 10.35.0.14 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:43 10.35.0.14 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:43 10.35.0.14 nova-network 2014-07-08 19:28:43.673 6514 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"5ea58a0c0ca44347b423e8ca1667afe6\\", \\"failure\\": null, \\"_msg_id\\": \\"be1be11e962d4f198720ff9a8bb226a7\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:43 10.35.0.14 nova-network 2014-07-08 19:28:43.674 6514 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"5ea58a0c0ca44347b423e8ca1667afe6\\", \\"failure\\": null, \\"_msg_id\\": \\"be1be11e962d4f198720ff9a8bb226a7\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_af4fb5adaca445a3b6d555c358806a4e'
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.677 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"20cf1161fc184ba383c821681ab733cf\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"be1be11e962d4f198720ff9a8bb226a7\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_af4fb5adaca445a3b6d555c358806a4e', 'delivery_tag': 87, 'exchange': u'reply_af4fb5adaca445a3b6d555c358806a4e'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030b1ff10>}
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.677 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"20cf1161fc184ba383c821681ab733cf\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"be1be11e962d4f198720ff9a8bb226a7\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.678 3328 ERROR kombu.transport.pyamqp [-] NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"5ea58a0c0ca44347b423e8ca1667afe6\\", \\"failure\\": null, \\"_msg_id\\": \\"be1be11e962d4f198720ff9a8bb226a7\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_af4fb5adaca445a3b6d555c358806a4e', 'delivery_tag': 88, 'exchange': u'reply_af4fb5adaca445a3b6d555c358806a4e'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7f3030b1ff10>}
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.678 3328 ERROR oslo.messaging._drivers.impl_rabbit [-] NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"5ea58a0c0ca44347b423e8ca1667afe6\\", \\"failure\\": null, \\"_msg_id\\": \\"be1be11e962d4f198720ff9a8bb226a7\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.679 3328 ERROR nova.compute.manager [req-585393f6-c238-40c5-88fe-fbf1dd0c48e5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: pre_live_migration instance='5ee78913-e655-4cb1-a92c-500639c819ad' ensure_filtering_rules_for_instance
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.679 3328 INFO nova.virt.libvirt.firewall [req-585393f6-c238-40c5-88fe-fbf1dd0c48e5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] [instance: 5ee78913-e655-4cb1-a92c-500639c819ad] Called setup_basic_filtering in nwfilter
Jul 8 19:28:43 10.35.0.15 nova-compute 2014-07-08 19:28:43.680 3328 INFO nova.virt.libvirt.firewall [req-585393f6-c238-40c5-88fe-fbf1dd0c48e5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] [instance: 5ee78913-e655-4cb1-a92c-500639c819ad] Ensuring static filters
Jul 8 19:28:43 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf ip route replace 0.0.0.0/0 via 10.35.3.1 dev cloud table cloud
Jul 8 19:28:43 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf iptables-save -c
Jul 8 19:28:43 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf kill -HUP 5300
Jul 8 19:28:43 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf iptables-restore -c
Jul 8 19:28:43 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:43 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:43 10.35.0.13 nova-api 2014-07-08 19:28:43.975 4840 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:43 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:28:43 +0000] "GET /v2.0/tokens/f4e18ae48a1f4b668bbf8f83fda56bc7 HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:44 10.35.0.15 dnsmasq-dhcp[5300]: read /var/lib/dnsmasq/nova-cloud.conf
Jul 8 19:28:44 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:43 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:28:44 10.35.0.13 nova-api 2014-07-08 19:28:44.093 4840 INFO nova.osapi_compute.wsgi.server [req-63a6596a-9493-4cd0-b4e9-197b6df2cc1a 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1214042
Jul 8 19:28:44 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:44 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:44 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:44 10.35.0.15 nova-network 2014-07-08 19:28:44.102 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"526f83b00e8a44bdade34d378d5d5c39\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"61db0e29caf64e38952ae323700fa33a\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:44 10.35.0.15 nova-network 2014-07-08 19:28:44.102 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"526f83b00e8a44bdade34d378d5d5c39\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"61db0e29caf64e38952ae323700fa33a\\"}", "oslo.version": "2.0"}', routing_key=u'reply_423752426aac4df5930f72f335e8adf0'
Jul 8 19:28:44 10.35.0.15 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:44 10.35.0.15 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:44 10.35.0.15 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:44 10.35.0.15 nova-network 2014-07-08 19:28:44.104 3465 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"0b1a56e8ff5c403d93806ab951f0e878\\", \\"failure\\": null, \\"_msg_id\\": \\"61db0e29caf64e38952ae323700fa33a\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:44 10.35.0.15 nova-network 2014-07-08 19:28:44.104 3465 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"0b1a56e8ff5c403d93806ab951f0e878\\", \\"failure\\": null, \\"_msg_id\\": \\"61db0e29caf64e38952ae323700fa33a\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_423752426aac4df5930f72f335e8adf0'
Jul 8 19:28:44 10.35.0.15 nova-compute 2014-07-08 19:28:44.129 3328 ERROR nova.compute.manager [req-585393f6-c238-40c5-88fe-fbf1dd0c48e5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: pre_live_migration instance='5ee78913-e655-4cb1-a92c-500639c819ad' _notify_about_instance_usage live_migration.pre.end
Jul 8 19:28:44 10.35.0.15 nova-compute 2014-07-08 19:28:44.131 3328 ERROR nova.compute.manager [req-585393f6-c238-40c5-88fe-fbf1dd0c48e5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a] NBK: pre_live_migration instance='5ee78913-e655-4cb1-a92c-500639c819ad' returning pre_live_migration_data=None
Jul 8 19:28:44 10.35.0.15 nova-compute.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:44 10.35.0.15 nova-compute.log: from py-amqp v1.5.0.
Jul 8 19:28:44 10.35.0.15 nova-compute.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:44 10.35.0.15 nova-compute 2014-07-08 19:28:44.133 3328 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"b14eb8a49fa9493795c8c3ae6d6dbf65\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"60a700beb6074e6e9999235c9252007c\\"}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:44 10.35.0.15 nova-compute 2014-07-08 19:28:44.133 3328 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"b14eb8a49fa9493795c8c3ae6d6dbf65\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"60a700beb6074e6e9999235c9252007c\\"}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:28:44 10.35.0.15 nova-compute.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:44 10.35.0.15 nova-compute.log: from py-amqp v1.5.0.
Jul 8 19:28:44 10.35.0.15 nova-compute.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:44 10.35.0.15 nova-compute 2014-07-08 19:28:44.134 3328 ERROR kombu.transport.pyamqp [-] NBK: body='{"oslo.message": "{\\"_unique_id\\": \\"53a425556a1a4fb79e900a1a61e552cd\\", \\"failure\\": null, \\"_msg_id\\": \\"60a700beb6074e6e9999235c9252007c\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', headers={}, properties={'delivery_mode': 2}
Jul 8 19:28:44 10.35.0.15 nova-compute 2014-07-08 19:28:44.134 3328 ERROR kombu.messaging [-] NBK: message.body='{"oslo.message": "{\\"_unique_id\\": \\"53a425556a1a4fb79e900a1a61e552cd\\", \\"failure\\": null, \\"_msg_id\\": \\"60a700beb6074e6e9999235c9252007c\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', routing_key=u'reply_ff10fb357a86427594db414dc3e5acc6'
Jul 8 19:28:44 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"b14eb8a49fa9493795c8c3ae6d6dbf65\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"60a700beb6074e6e9999235c9252007c\\"}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_ff10fb357a86427594db414dc3e5acc6', 'delivery_tag': 11, 'exchange': u'reply_ff10fb357a86427594db414dc3e5acc6'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d2864c50>}
Jul 8 19:28:44 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"b14eb8a49fa9493795c8c3ae6d6dbf65\\", \\"failure\\": null, \\"result\\": null, \\"_msg_id\\": \\"60a700beb6074e6e9999235c9252007c\\"}", "oslo.version": "2.0"}'
Jul 8 19:28:44 10.35.0.13 cobalt-compute: pyamqp ERROR NBK: raw_message.__dict__={'body': u'{"oslo.message": "{\\"_unique_id\\": \\"53a425556a1a4fb79e900a1a61e552cd\\", \\"failure\\": null, \\"_msg_id\\": \\"60a700beb6074e6e9999235c9252007c\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}', 'delivery_info': {'consumer_tag': u'1', 'redelivered': False, 'routing_key': u'reply_ff10fb357a86427594db414dc3e5acc6', 'delivery_tag': 12, 'exchange': u'reply_ff10fb357a86427594db414dc3e5acc6'}, 'properties': {'priority': 0, 'application_headers': {}, 'delivery_mode': 2, 'content_encoding': u'utf-8', 'content_type': u'application/json'}, 'channel': <kombu.transport.pyamqp.Channel object at 0x7ff0d2864c50>}
Jul 8 19:28:44 10.35.0.13 cobalt-compute: impl_rabbit ERROR NBK: acknowledge self._raw_message.body=u'{"oslo.message": "{\\"_unique_id\\": \\"53a425556a1a4fb79e900a1a61e552cd\\", \\"failure\\": null, \\"_msg_id\\": \\"60a700beb6074e6e9999235c9252007c\\", \\"result\\": null, \\"ending\\": true}", "oslo.version": "2.0"}'
Jul 8 19:28:44 10.35.0.13 cobalt-compute: manager DEBUG NBK: manager.migrate_instance instance='5ee78913-e655-4cb1-a92c-500639c819ad' calling bless_instance instance=<nova.objects.instance.Instance object at 0x7ff0d2ee5950>
Jul 8 19:28:44 10.35.0.13 cobalt-compute: manager DEBUG bless_instance called: {'migration_network_info': [VIF({'ovs_interfaceid': None, 'network': Network({'bridge': u'cloud', 'subnets': [Subnet({'ips': [FixedIP({'meta': {}, 'version': 4, 'type': u'fixed', 'floating_ips': [], 'address': u'10.35.3.4'})], 'version': 4, 'meta': {u'dhcp_server': u'10.35.3.3'}, 'dns': [IP({'meta': {}, 'version': 4, 'type': u'dns', 'address': u'8.8.4.4'})], 'routes': [], 'cidr': u'10.35.3.0/24', 'gateway': IP({'meta': {}, 'version': 4, 'type': u'gateway', 'address': u'10.35.3.1'})}), Subnet({'ips': [], 'version': None, 'meta': {u'dhcp_server': u'10.35.3.3'}, 'dns': [], 'routes': [], 'cidr': None, 'gateway': IP({'meta': {}, 'version': None, 'type': u'gateway', 'address': None})})], 'meta': {u'tenant_id': None, u'multi_host': True, u'should_create_bridge': True}, 'id': u'933f56fa-7929-413f-ab40-9c7c91cb840b', 'label': u'private'}), 'devname': None, 'qbh_params': None, 'meta': {}, 'details': {}, 'address': u'fa:16:3e:da:1a:ff', 'active': False, 'type': u'bridge', 'id': u'615297b1-6e8b-4e6c-ad03-1d8ca9bb3a72', 'qbg_params': None})], 'instance': <nova.objects.instance.Instance object at 0x7ff0d2ee5950>, 'migration_url': 'mcdist://10.35.0.13', 'instance_uuid': '5ee78913-e655-4cb1-a92c-500639c819ad'}
Jul 8 19:28:44 10.35.0.13 cobalt-compute: manager DEBUG Locking instance 5ee78913-e655-4cb1-a92c-500639c819ad (fn:bless_instance)
Jul 8 19:28:44 10.35.0.13 cobalt-compute: manager DEBUG Acquiring lock for instance 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:28:44 10.35.0.13 cobalt-compute: manager DEBUG Acquired lock for instance 5ee78913-e655-4cb1-a92c-500639c819ad (me: 140672304066384, refcount=2)
Jul 8 19:28:44 10.35.0.13 cobalt-compute: hooks INFO Calling hooks for pre_bless action
Jul 8 19:28:44 10.35.0.13 cobalt-compute: hooks INFO Done calling hooks for pre_bless action
Jul 8 19:28:44 10.35.0.13 cobalt-compute: vmsconn DEBUG Calling get_instance_info with args=(<nova.objects.instance.Instance object at 0x7ff0d2ee5950>,) kwargs={}
Jul 8 19:28:44 10.35.0.13 cobalt-compute: vmsconn DEBUG Called get_instance_info with args=(<nova.objects.instance.Instance object at 0x7ff0d2ee5950>,) kwargs={}
Jul 8 19:28:44 10.35.0.13 cobalt-compute: vmsconn DEBUG Calling bless with args=(<nova.context.RequestContext object at 0x7ff0d2cd3850>, 'instance-00000002', <nova.objects.instance.Instance object at 0x7ff0d2ee5950>) kwargs={'migration_url': 'mcdist://10.35.0.13'}
Jul 8 19:28:44 10.35.0.13 cobalt-compute: vmsapi DEBUG Executing vms command ['vmsctl', '--use.names', '-p', 'libvirt', '-m', 'connection_url=qemu:///system', 'bless', 'instance-00000002', 'instance-00000002', '', '', 'mcdist://10.35.0.13', 'True']
Jul 8 19:28:44 10.35.0.13 cobalt-compute: processutils DEBUG Running cmd (subprocess): vmsctl --use.names -p libvirt -m connection_url=qemu:///system bless instance-00000002 instance-00000002 mcdist://10.35.0.13 True
Jul 8 19:28:44 10.35.0.14 nova-compute 2014-07-08 19:28:44.173 6347 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:28:44 10.35.0.14 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:28:44 10.35.0.13 rabbitmq.log: =INFO REPORT==== 8-Jul-2014::19:28:44 ===
Jul 8 19:28:44 10.35.0.13 rabbitmq.log: accepting AMQP connection <0.2364.0> (10.35.0.13:42946 -> 10.35.0.3:5672)
Jul 8 19:28:44 10.35.0.13 nova-network.log: /usr/lib64/python2.7/site-packages/amqp/channel.py:616: VDeprecationWarning: The auto_delete flag for exchanges has been deprecated and will be removed
Jul 8 19:28:44 10.35.0.13 nova-network.log: from py-amqp v1.5.0.
Jul 8 19:28:44 10.35.0.13 nova-network.log: warn(VDeprecationWarning(EXCHANGE_AUTODELETE_DEPRECATED))
Jul 8 19:28:44 10.35.0.13 nova-network 2014-07-08 19:28:44.287 4876 INFO oslo.messaging._drivers.impl_rabbit [-] Connected to AMQP server on 10.35.0.3:5672
Jul 8 19:28:44 10.35.0.13 nova-compute 2014-07-08 19:28:44.350 4705 INFO nova.compute.manager [-] Lifecycle event 2 on VM 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:28:44 10.35.0.13 nova-compute 2014-07-08 19:28:44.479 4705 INFO nova.compute.manager [-] [instance: 5ee78913-e655-4cb1-a92c-500639c819ad] During sync_power_state the instance has a pending task. Skip.
Jul 8 19:28:44 10.35.0.14 nova-compute 2014-07-08 19:28:44.521 6347 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47729
Jul 8 19:28:44 10.35.0.14 nova-compute 2014-07-08 19:28:44.521 6347 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1862
Jul 8 19:28:44 10.35.0.14 nova-compute 2014-07-08 19:28:44.522 6347 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 4
Jul 8 19:28:44 10.35.0.14 nova-compute 2014-07-08 19:28:44.548 6347 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.14:node-10-35-0-14
Jul 8 19:28:44 10.35.0.13 [2014-07-08 19:28:44.892] 941/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3077
Jul 8 19:28:44 10.35.0.15 [2014-07-08 19:28:44.894] 942/Thread-1 savage.WSGI-FredInterface/INFO: 10.35.0.2 "GET /private/status HTTP/1.1" 200 3035
Jul 8 19:28:46 10.35.0.13 vms.log: 1404847726.543961 vmsd:23144 [ERROR] <save.9785> Unsupported ABI 0 for Linux guest agent data.
Jul 8 19:28:46 10.35.0.13 vms.log: 1404847726.545724 vmsd:23144 [INFO] <save.9785> Using mappings for guest -> none.
Jul 8 19:28:46 10.35.0.13 vms.log: 1404847726.547265 vmsd:23144 [INFO] <save.9785> Pagedb serialization successful.
Jul 8 19:28:46 10.35.0.13 vms.log: 1404847726.573031 vmsd:23144 [INFO] <save.9785> Saved 1 packed pages, with 0 errors.
Jul 8 19:28:46 10.35.0.13 vms.log: info: first multicast group is 57790
Jul 8 19:28:46 10.35.0.13 vms.log: warning: NLMSG_ERROR during MTU probe (Unknown error -34).
Jul 8 19:28:46 10.35.0.13 vms.log: warning: MTU request failed.
Jul 8 19:28:46 10.35.0.13 vms.log: warning: returning default MTU.
Jul 8 19:28:46 10.35.0.13 vms.log: info: providing slice 0 length = 536870912, counters_avail = 10
Jul 8 19:28:46 10.35.0.13 kernel: [ 3844.897316] cloud: port 2(vnet0) entered disabled state
Jul 8 19:28:46 10.35.0.13 kernel: [ 3844.897482] device vnet0 left promiscuous mode
Jul 8 19:28:46 10.35.0.13 kernel: [ 3844.897555] cloud: port 2(vnet0) entered disabled state
Jul 8 19:28:46 10.35.0.13 libvirtd: 4561: info : qemuDomainUndefineFlags:6334 : Undefining domain 'instance-00000002'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: error : qemuMonitorIO:660 : internal error: End of file from monitor
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyGroup:844 : Starting transaction for 0x7fc5311ef290 flags=1
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -D libvirt-out -m physdev --physdev-is-bridged --physdev-out vnet0 -g FP-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -D libvirt-out -m physdev --physdev-out vnet0 -g FP-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -D libvirt-in -m physdev --physdev-in vnet0 -g FJ-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -D libvirt-host-in -m physdev --physdev-in vnet0 -g HJ-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -F FP-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -X FP-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -F FJ-vnet0'
Jul 8 19:28:46 10.35.0.13 pound: 10.35.2.3:5000 10.35.0.14 - - [08/Jul/2014:19:28:46 +0000] "POST //v2.0/tokens HTTP/1.1" 200 3288 "" "python-novaclient"
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -X FJ-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -F HJ-vnet0'
Jul 8 19:28:46 10.35.0.13 cobalt-compute: processutils DEBUG Result was 0
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -X HJ-vnet0'
Jul 8 19:28:46 10.35.0.13 cobalt-compute: vmsapi DEBUG vmctl stderr: <23131> [INFO] qmp.py:222 The parameter "silent" is not supported.Falling back on old behavior, which is to not stop silently. YMMV.
Jul 8 19:28:46 10.35.0.13 cobalt-compute: vmsapi DEBUG vmctl stderr: <23131> [INFO] vmsrun.py:585 vmsrun [exec False]: vmsd save hypervisor=kvm memsrv=mcdist://10.35.0.13 memtap=null:// id=9785 pages=131072 vcpus=1 generation=0bd88247-e0b04c87-4c87-8cee-732516dd1225 fd=4 dir.cache=/dev/shm/vms control.filepattern=/dev/shm/vms/control/ dir.store=/mnt/novadisk/gc-store log.prefix=<save.9785> module.path=/usr/lib64/vms:/usr/lib/vms kvm.mem_path=/proc/9785/fd/9 kvm.qmp_path=/tmp/qmp_vms_mon-sceGKN.sock file.sparse=1 kvm.hmp_path=/tmp/hmp_vms_mon-PviBrs.sock replace.spec=
Jul 8 19:28:46 10.35.0.13 cobalt-compute: vmsapi DEBUG vmctl stderr: <23131> [INFO] image.py:65 Running command qemu-img info /dev/nova/5ee78913-e655-4cb1-a92c-500639c819ad_disk
Jul 8 19:28:46 10.35.0.13 cobalt-compute: vmsapi DEBUG vmctl stderr: <23131> [INFO] image.py:86 Running command qemu-nbd -p 65272 -r --persistent --shared 5 /dev/nova/5ee78913-e655-4cb1-a92c-500639c819ad_disk.x6xlyu
Jul 8 19:28:46 10.35.0.13 cobalt-compute: vmsapi DEBUG vmctl stderr: <23131> [INFO] image.py:86 Running command qemu-nbd -p 47971 -r --persistent --shared 5 /proc/self/fd/4
Jul 8 19:28:46 10.35.0.13 cobalt-compute: vmsapi DEBUG vmctl stderr: <23131> [WARNING] management.py:228 Unable to change the ownership of the vms file /dev/shm/vms/prefetch.0bd88247-e0b04c87-4c87-8cee-732516dd1225.
Jul 8 19:28:46 10.35.0.13 cobalt-compute: vmsapi DEBUG vmctl stderr: <23131> [WARNING] management.py:228 Unable to change the ownership of the vms file /dev/shm/vms/share.0bd88247-e0b04c87-4c87-8cee-732516dd1225.
Jul 8 19:28:46 10.35.0.13 cobalt-compute: vmsapi DEBUG vmctl stderr:
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -D libvirt-out -m physdev --physdev-is-bridged --physdev-out vnet0 -g FP-vnet0'
Jul 8 19:28:46 10.35.0.13 cobalt-compute: vmsconn DEBUG Called bless with args=(<nova.context.RequestContext object at 0x7ff0d2cd3850>, 'instance-00000002', <nova.objects.instance.Instance object at 0x7ff0d2ee5950>) kwargs={'migration_url': 'mcdist://10.35.0.13'}
Jul 8 19:28:46 10.35.0.13 cobalt-compute: vmsconn DEBUG Calling post_bless with args=(<nova.context.RequestContext object at 0x7ff0d2cd3850>, <nova.objects.instance.Instance object at 0x7ff0d2ee5950>, [u'/mnt/novadisk/gc-shared/instance-00000002.xml', u'/mnt/novadisk/gc-shared/instance-00000002.gc']) kwargs={'vms_policy_template': ';blessed=5ee78913-e655-4cb1-a92c-500639c819ad;;flavor=m1.tiny;;tenant=%(tenant)s;;uuid=%(uuid)s;'}
Jul 8 19:28:46 10.35.0.13 cobalt-compute: http DEBUG curl -i -X POST -H 'X-Service-Catalog: [{"endpoints_links": [], "endpoints": [{"adminURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "region": "RegionOne", "internalURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "publicURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a"}], "type": "volume", "name": "Volume Service"}]' -H 'X-Identity-Status: Confirmed' -H 'x-image-meta-property-user_id: 260659016df54f47833004653cfe0cb8' -H 'x-image-meta-property-instance_uuid: 5ee78913-e655-4cb1-a92c-500639c819ad' -H 'User-Agent: python-glanceclient' -H 'X-User-Id: 260659016df54f47833004653cfe0cb8' -H 'X-Roles: Admin,_member_,ProjectAdmin,admin' -H 'x-image-meta-property-image_state: creating' -H 'x-image-meta-is_public: False' -H 'X-Auth-Token: e461a54513ed421ab86f2cc67e181011' -H 'Content-Type: application/octet-stream' -H 'X-Tenant-Id: 9a663247f58a4412a840385af8d9e73a' -H 'x-image-meta-name: vm-39852899123.xml' http://10.35.0.13:9292/v1/images
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -D libvirt-out -m physdev --physdev-out vnet0 -g FP-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -D libvirt-in -m physdev --physdev-in vnet0 -g FJ-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -D libvirt-host-in -m physdev --physdev-in vnet0 -g HJ-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -F FP-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -X FP-vnet0'
Jul 8 19:28:46 10.35.0.13 nova-api 2014-07-08 19:28:46.859 4840 INFO requests.packages.urllib3.connectionpool [-] Starting new HTTP connection (1): 10.35.2.3
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -F FJ-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -X FJ-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -F HJ-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -X HJ-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -D PREROUTING -i vnet0 -j libvirt-J-vnet0'
Jul 8 19:28:46 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:28:46 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:46 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:28:46 +0000] "GET /v2.0/tokens/1d7c486c603b410eb605cee6c3afe16a HTTP/1.1" 200 3288 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -D POSTROUTING -o vnet0 -j libvirt-P-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -L libvirt-J-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -L libvirt-P-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -F libvirt-J-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -X libvirt-J-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -F libvirt-P-vnet0'
Jul 8 19:28:46 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:28:46 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -X libvirt-P-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -D libvirt-out -m physdev --physdev-is-bridged --physdev-out vnet0 -g FO-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -D libvirt-out -m physdev --physdev-out vnet0 -g FO-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -D libvirt-in -m physdev --physdev-in vnet0 -g FI-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -D libvirt-host-in -m physdev --physdev-in vnet0 -g HI-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -D libvirt-in-post -m physdev --physdev-in vnet0 -j ACCEPT'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -F FO-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -X FO-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -F FI-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -X FI-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -F HI-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/iptables -X HI-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -D libvirt-out -m physdev --physdev-is-bridged --physdev-out vnet0 -g FO-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -D libvirt-out -m physdev --physdev-out vnet0 -g FO-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -D libvirt-in -m physdev --physdev-in vnet0 -g FI-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -D libvirt-host-in -m physdev --physdev-in vnet0 -g HI-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -D libvirt-in-post -m physdev --physdev-in vnet0 -j ACCEPT'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -F FO-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -X FO-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -F FI-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -X FI-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -F HI-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ip6tables -X HI-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -D PREROUTING -i vnet0 -j libvirt-I-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -D POSTROUTING -o vnet0 -j libvirt-O-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -L libvirt-I-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -L libvirt-O-vnet0'
Jul 8 19:28:46 10.35.0.13 glance-registry 2014-07-08 19:28:46.957 4328 INFO glance.registry.api.v1.images [7671cf2b-1014-4922-b11c-34351113de45 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Successfully created image 6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:46 10.35.0.13 glance-registry 2014-07-08 19:28:46.959 4328 INFO glance.wsgi.server [7671cf2b-1014-4922-b11c-34351113de45 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:46] "POST /images HTTP/1.1" 200 807 0.085103
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -F libvirt-I-vnet0'
Jul 8 19:28:46 10.35.0.13 glance-api 2014-07-08 19:28:46.961 4334 INFO glance.wsgi.server [13bf811a-dc55-4425-978a-b9c78b31e4ca 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:46] "POST /v1/images HTTP/1.1" 201 839 0.110246
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -X libvirt-I-vnet0'
Jul 8 19:28:46 10.35.0.13 cobalt-compute: http DEBUG
HTTP/1.1 201 Created
date: Tue, 08 Jul 2014 19:28:46 GMT
content-length: 578
content-type: application/json
location: http://10.35.0.13:9292/v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321
x-openstack-request-id: req-13bf811a-dc55-4425-978a-b9c78b31e4ca
{"image": {"status": "queued", "deleted": false, "container_format": null, "min_ram": 0, "updated_at": "2014-07-08T19:28:46", "owner": "9a663247f58a4412a840385af8d9e73a", "min_disk": 0, "is_public": false, "deleted_at": null, "id": "6eeaab98-c1ee-40c1-b0c5-f0408c065321", "size": 0, "virtual_size": null, "name": "vm-39852899123.xml", "checksum": null, "created_at": "2014-07-08T19:28:46", "disk_format": null, "properties": {"instance_uuid": "5ee78913-e655-4cb1-a92c-500639c819ad", "image_state": "creating", "user_id": "260659016df54f47833004653cfe0cb8"}, "protected": false}}
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:46 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'46', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee69250>
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'2014' for 'year' with default None
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'07' for 'monthdash' with default 1
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 7 for 'month' with default 7
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'08' for 'daydash' with default 1
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 8 for 'day' with default 8
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'19' for 'hour' with default None
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'28' for 'minute' with default None
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'46' for 'second' with default None
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:46 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'46', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee69250>
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'2014' for 'year' with default None
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'07' for 'monthdash' with default 1
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 7 for 'month' with default 7
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'08' for 'daydash' with default 1
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 8 for 'day' with default 8
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'19' for 'hour' with default None
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'28' for 'minute' with default None
Jul 8 19:28:46 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'46' for 'second' with default None
Jul 8 19:28:46 10.35.0.13 cobalt-compute: image DEBUG Uploading image /mnt/novadisk/gc-shared/instance-00000002.xml
Jul 8 19:28:46 10.35.0.13 cobalt-compute: http DEBUG curl -i -X PUT -H 'X-Service-Catalog: [{"endpoints_links": [], "endpoints": [{"adminURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "region": "RegionOne", "internalURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "publicURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a"}], "type": "volume", "name": "Volume Service"}]' -H 'X-Identity-Status: Confirmed' -H 'x-image-meta-property-owner_id: 9a663247f58a4412a840385af8d9e73a' -H 'x-image-meta-container_format: bare' -H 'Transfer-Encoding: chunked' -H 'x-glance-registry-purge-props: true' -H 'x-image-meta-protected: True' -H 'User-Agent: python-glanceclient' -H 'x-image-meta-size: 4771' -H 'X-Roles: Admin,_member_,ProjectAdmin,admin' -H 'x-image-meta-property-image_state: available' -H 'x-image-meta-is_public: False' -H 'X-User-Id: 260659016df54f47833004653cfe0cb8' -H 'X-Auth-Token: e461a54513ed421ab86f2cc67e181011' -H 'Content-Type: application/octet-stream' -H 'X-Tenant-Id: 9a663247f58a4412a840385af8d9e73a' -H 'x-image-meta-disk_format: raw' -d '<open file u'/mnt/novadisk/gc-shared/instance-00000002.xml', mode 'r' at 0x7ff0d35da9c0>' http://10.35.0.13:9292/v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -F libvirt-O-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -X libvirt-O-vnet0'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -L I-vnet0-mac'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -F I-vnet0-mac'
Jul 8 19:28:46 10.35.0.13 pound: 10.35.2.8:8774 10.35.0.14 - - [08/Jul/2014:19:28:46 +0000] "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" 200 5060 "" "python-novaclient"
Jul 8 19:28:46 10.35.0.13 nova-api 2014-07-08 19:28:46.980 4840 INFO nova.osapi_compute.wsgi.server [req-a8c011fd-3df9-4264-8dfd-afae27229f45 579c706cdd354c3bace1d0e30ac24d59 13f85c1e1f6b431eb47db205456ef242] 10.35.0.14,10.35.0.13 "GET /v2/13f85c1e1f6b431eb47db205456ef242/servers/detail HTTP/1.1" status: 200 len: 5234 time: 0.1230640
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -X I-vnet0-mac'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -L I-vnet0-ipv4-ip'
Jul 8 19:28:46 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:28:46 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -F I-vnet0-ipv4-ip'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -X I-vnet0-ipv4-ip'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -L I-vnet0-ipv4'
Jul 8 19:28:46 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -F I-vnet0-ipv4'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -X I-vnet0-ipv4'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -L I-vnet0-arp-mac'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -F I-vnet0-arp-mac'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -X I-vnet0-arp-mac'
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.008 4328 INFO glance.registry.api.v1.images [eb5714d0-17f9-410e-9a0b-8e98fc2497c8 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Successfully retrieved image 6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.010 4328 INFO glance.wsgi.server [eb5714d0-17f9-410e-9a0b-8e98fc2497c8 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "GET /images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 807 0.018213
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -L I-vnet0-arp-ip'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -F I-vnet0-arp-ip'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -X I-vnet0-arp-ip'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -L O-vnet0-ipv4'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -F O-vnet0-ipv4'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -X O-vnet0-ipv4'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -L O-vnet0-ipv6'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -F O-vnet0-ipv6'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virFirewallApplyRule:785 : Applying rule '/sbin/ebtables -t nat -X O-vnet0-ipv6'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virSecurityDACRestoreSecurityFileLabel:271 : Restoring DAC user and group on '/dev/nova/5ee78913-e655-4cb1-a92c-500639c819ad_disk'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virSecurityDACRestoreSecurityFileLabel:275 : cannot resolve symlink /dev/nova/5ee78913-e655-4cb1-a92c-500639c819ad_disk: No such file or directory
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virSecurityDACRestoreSecurityFileLabel:271 : Restoring DAC user and group on '/mnt/novadisk/nova/instances/5ee78913-e655-4cb1-a92c-500639c819ad/disk.config'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virSecurityDACRestoreSecurityFileLabel:275 : cannot resolve symlink /mnt/novadisk/nova/instances/5ee78913-e655-4cb1-a92c-500639c819ad/disk.config: No such file or directory
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virSecurityDACRestoreSecurityFileLabel:271 : Restoring DAC user and group on '/mnt/novadisk/nova/instances/5ee78913-e655-4cb1-a92c-500639c819ad/console.log'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: info : virSecurityDACSetOwnership:227 : Setting DAC user and group on '/mnt/novadisk/nova/instances/5ee78913-e655-4cb1-a92c-500639c819ad/console.log' to '0:0'
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/cpu/machine/instance-00000002.libvirt-qemu//emulator (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/cpu/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/cpuacct/machine/instance-00000002.libvirt-qemu//emulator (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/cpuacct/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/memory/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/devices/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/freezer/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/blkio/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/net_cls/machine/instance-00000002.libvirt-qemu/ (16)
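errno 16 in the virCgroupRemoveRecursively errors above is EBUSY: the instance-00000002 cgroup directories still hold tasks (or child groups such as .../emulator) when libvirtd tries to rmdir them during teardown. A minimal sketch of checking that condition, assuming the cgroup v1 layout these paths show (the helper name is illustrative, not a libvirt or nova API):

    import errno
    import os

    def cgroup_busy_pids(path):
        """Return PIDs still attached to a cgroup v1 directory (empty if none or already gone)."""
        try:
            with open(os.path.join(path, "tasks")) as f:
                return [int(pid) for pid in f.read().split()]
        except IOError as e:
            if e.errno == errno.ENOENT:
                return []  # the group was already removed
            raise

    # rmdir also fails with EBUSY while child groups like .../emulator still exist
    print(cgroup_busy_pids("/sys/fs/cgroup/cpu/machine/instance-00000002.libvirt-qemu"))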
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.080 4328 INFO glance.registry.api.v1.images [4339e599-8743-4742-b1f9-f975b6d785cf 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Updating metadata for image 6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.082 4328 INFO glance.wsgi.server [4339e599-8743-4742-b1f9-f975b6d785cf 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "PUT /images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 861 0.068042
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.101 4328 INFO glance.registry.api.v1.images [5c7b390f-dce6-42bf-abc0-dc196e4675bc 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Successfully retrieved image 6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.102 4328 INFO glance.wsgi.server [5c7b390f-dce6-42bf-abc0-dc196e4675bc 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "GET /images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 861 0.016919
Jul 8 19:28:47 10.35.0.15 nova-compute 2014-07-08 19:28:47.226 3328 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:28:47 10.35.0.15 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
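The nova-rootwrap vgs call above is how the resource tracker samples the "nova" volume group: with --noheadings, --units b and --separator | it prints a single size|free pair in bytes, which nova folds into its local-storage accounting. A rough sketch of parsing that output, under the assumption it looks like "2000398934016|1997998989312" (example figures, not taken from this log):

    def parse_vgs_output(output):
        """Parse 'vg_size|vg_free' in bytes into GB totals."""
        size_b, free_b = (int(field) for field in output.strip().split("|"))
        return {"total_gb": size_b // 1024 ** 3, "free_gb": free_b // 1024 ** 3}

    print(parse_vgs_output("  2000398934016|1997998989312\n"))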
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/cpu/machine/instance-00000002.libvirt-qemu//emulator (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/cpu/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/cpuacct/machine/instance-00000002.libvirt-qemu//emulator (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/cpuacct/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/memory/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/devices/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/freezer/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/blkio/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/net_cls/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.247 4328 INFO glance.registry.api.v1.images [95ab5be1-0712-404c-bcd8-23d08b8acb98 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Updating metadata for image 6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.248 4328 INFO glance.wsgi.server [95ab5be1-0712-404c-bcd8-23d08b8acb98 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "PUT /images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 861 0.142196
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.410 4328 INFO glance.registry.api.v1.images [39737b55-e565-41f0-839e-259c8f085498 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Updating metadata for image 6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.411 4328 INFO glance.wsgi.server [39737b55-e565-41f0-839e-259c8f085498 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "PUT /images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 891 0.060631
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/cpu/machine/instance-00000002.libvirt-qemu//emulator (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/cpu/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/cpuacct/machine/instance-00000002.libvirt-qemu//emulator (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/cpuacct/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/memory/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/devices/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/freezer/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/blkio/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 libvirtd: 4555: error : virCgroupRemoveRecursively:3178 : Unable to remove /sys/fs/cgroup/net_cls/machine/instance-00000002.libvirt-qemu/ (16)
Jul 8 19:28:47 10.35.0.13 vms.log: 1404847727.479780 vmsd:9803 [INFO] <load.9785> VM appears to have died.
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.515 4328 INFO glance.registry.api.v1.images [e8a3f2ce-37e7-45ef-8f6f-76a3635c7a11 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Updating metadata for image 6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.516 4328 INFO glance.wsgi.server [e8a3f2ce-37e7-45ef-8f6f-76a3635c7a11 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "PUT /images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 1098 0.101098
Jul 8 19:28:47 10.35.0.13 glance-api 2014-07-08 19:28:47.521 4338 INFO glance.wsgi.server [6c8367ab-0bef-4b2a-8b01-a4fa2be6135b 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "PUT /v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 958 0.551092
Jul 8 19:28:47 10.35.0.13 cobalt-compute: http DEBUG
HTTP/1.1 200 OK
content-length: 662
etag: de806a660303a3ee27d5e2bd7ab2b328
location: http://10.35.0.13:9292/v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321
date: Tue, 08 Jul 2014 19:28:47 GMT
content-type: application/json
x-openstack-request-id: req-6c8367ab-0bef-4b2a-8b01-a4fa2be6135b
{"image": {"status": "active", "deleted": false, "container_format": "bare", "min_ram": 0, "updated_at": "2014-07-08T19:28:47", "owner": "9a663247f58a4412a840385af8d9e73a", "min_disk": 0, "is_public": false, "deleted_at": null, "id": "6eeaab98-c1ee-40c1-b0c5-f0408c065321", "size": 4771, "virtual_size": null, "name": "vm-39852899123.xml", "checksum": "de806a660303a3ee27d5e2bd7ab2b328", "created_at": "2014-07-08T19:28:46", "disk_format": "raw", "properties": {"instance_uuid": "5ee78913-e655-4cb1-a92c-500639c819ad", "image_state": "available", "user_id": "260659016df54f47833004653cfe0cb8", "owner_id": "9a663247f58a4412a840385af8d9e73a"}, "protected": true}}
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:46 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'46', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee69250>
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'2014' for 'year' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'07' for 'monthdash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 7 for 'month' with default 7
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'08' for 'daydash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 8 for 'day' with default 8
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'19' for 'hour' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'28' for 'minute' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'46' for 'second' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:47 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'47', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee69250>
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'2014' for 'year' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'07' for 'monthdash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 7 for 'month' with default 7
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'08' for 'daydash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 8 for 'day' with default 8
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'19' for 'hour' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'28' for 'minute' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'47' for 'second' with default None
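The iso8601 DEBUG runs above are the library tracing how it parses the created_at/updated_at timestamps out of that response; each "Parsed ... / Got ..." group corresponds to one parse_date call, roughly:

    import iso8601

    dt = iso8601.parse_date("2014-07-08T19:28:47")  # no offset in the input, so the default UTC timezone is applied
    print(dt.isoformat())  # 2014-07-08T19:28:47+00:00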
Jul 8 19:28:47 10.35.0.13 cobalt-compute: http DEBUG curl -i -X HEAD -H 'X-Service-Catalog: [{"endpoints_links": [], "endpoints": [{"adminURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "region": "RegionOne", "internalURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "publicURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a"}], "type": "volume", "name": "Volume Service"}]' -H 'X-Identity-Status: Confirmed' -H 'X-Roles: Admin,_member_,ProjectAdmin,admin' -H 'User-Agent: python-glanceclient' -H 'X-Tenant-Id: 9a663247f58a4412a840385af8d9e73a' -H 'X-User-Id: 260659016df54f47833004653cfe0cb8' -H 'X-Auth-Token: e461a54513ed421ab86f2cc67e181011' -H 'Content-Type: application/octet-stream' http://10.35.0.13:9292/v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.560 4328 INFO glance.registry.api.v1.images [2230cb51-1f57-4646-96d6-0d5b8da5817c 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Successfully retrieved image 6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.561 4328 INFO glance.wsgi.server [2230cb51-1f57-4646-96d6-0d5b8da5817c 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "GET /images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 1098 0.024365
Jul 8 19:28:47 10.35.0.13 glance-api 2014-07-08 19:28:47.564 4338 INFO glance.wsgi.server [63f92e8d-71f3-4407-a680-62e6c49a0837 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "HEAD /v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 1114 0.033803
Jul 8 19:28:47 10.35.0.13 cobalt-compute: http DEBUG
HTTP/1.1 200 OK
content-length: 0
x-image-meta-property-user_id: 260659016df54f47833004653cfe0cb8
x-image-meta-status: active
x-image-meta-property-image_state: available
x-image-meta-owner: 9a663247f58a4412a840385af8d9e73a
x-image-meta-name: vm-39852899123.xml
x-image-meta-container_format: bare
x-image-meta-created_at: 2014-07-08T19:28:46
etag: de806a660303a3ee27d5e2bd7ab2b328
location: http://10.35.0.13:9292/v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321
x-image-meta-min_ram: 0
x-image-meta-updated_at: 2014-07-08T19:28:47
x-image-meta-id: 6eeaab98-c1ee-40c1-b0c5-f0408c065321
x-image-meta-property-instance_uuid: 5ee78913-e655-4cb1-a92c-500639c819ad
date: Tue, 08 Jul 2014 19:28:47 GMT
x-openstack-request-id: req-63f92e8d-71f3-4407-a680-62e6c49a0837
x-image-meta-deleted: False
x-image-meta-checksum: de806a660303a3ee27d5e2bd7ab2b328
content-type: text/html; charset=UTF-8
x-image-meta-protected: True
x-image-meta-min_disk: 0
x-image-meta-size: 4771
x-image-meta-is_public: False
x-image-meta-property-owner_id: 9a663247f58a4412a840385af8d9e73a
x-image-meta-disk_format: raw
Jul 8 19:28:47 10.35.0.13 cobalt-compute: http DEBUG curl -i -X HEAD -H 'X-Service-Catalog: [{"endpoints_links": [], "endpoints": [{"adminURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "region": "RegionOne", "internalURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "publicURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a"}], "type": "volume", "name": "Volume Service"}]' -H 'X-Identity-Status: Confirmed' -H 'X-Roles: Admin,_member_,ProjectAdmin,admin' -H 'User-Agent: python-glanceclient' -H 'X-Tenant-Id: 9a663247f58a4412a840385af8d9e73a' -H 'X-User-Id: 260659016df54f47833004653cfe0cb8' -H 'X-Auth-Token: e461a54513ed421ab86f2cc67e181011' -H 'Content-Type: application/octet-stream' http://10.35.0.13:9292/v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:28:47 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.611 4328 INFO glance.registry.api.v1.images [f9e63b69-3862-493e-9dbb-ddefdc9cd61e 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Successfully retrieved image 6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.612 4328 INFO glance.wsgi.server [f9e63b69-3862-493e-9dbb-ddefdc9cd61e 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "GET /images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 1098 0.017463
Jul 8 19:28:47 10.35.0.13 glance-api 2014-07-08 19:28:47.615 4337 INFO glance.wsgi.server [829bfa68-f6d7-405e-a46e-5016ff6fa53d 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "HEAD /v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 1114 0.041196
Jul 8 19:28:47 10.35.0.13 cobalt-compute: http DEBUG
HTTP/1.1 200 OK
content-length: 0
x-image-meta-property-user_id: 260659016df54f47833004653cfe0cb8
x-image-meta-status: active
x-image-meta-property-image_state: available
x-image-meta-owner: 9a663247f58a4412a840385af8d9e73a
x-image-meta-name: vm-39852899123.xml
x-image-meta-container_format: bare
x-image-meta-created_at: 2014-07-08T19:28:46
etag: de806a660303a3ee27d5e2bd7ab2b328
location: http://10.35.0.13:9292/v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321
x-image-meta-min_ram: 0
x-image-meta-updated_at: 2014-07-08T19:28:47
x-image-meta-id: 6eeaab98-c1ee-40c1-b0c5-f0408c065321
x-image-meta-property-instance_uuid: 5ee78913-e655-4cb1-a92c-500639c819ad
date: Tue, 08 Jul 2014 19:28:47 GMT
x-openstack-request-id: req-829bfa68-f6d7-405e-a46e-5016ff6fa53d
x-image-meta-deleted: False
x-image-meta-checksum: de806a660303a3ee27d5e2bd7ab2b328
content-type: text/html; charset=UTF-8
x-image-meta-protected: True
x-image-meta-min_disk: 0
x-image-meta-size: 4771
x-image-meta-is_public: False
x-image-meta-property-owner_id: 9a663247f58a4412a840385af8d9e73a
x-image-meta-disk_format: raw
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:46 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'46', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee69250>
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'2014' for 'year' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'07' for 'monthdash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 7 for 'month' with default 7
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'08' for 'daydash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 8 for 'day' with default 8
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'19' for 'hour' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'28' for 'minute' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'46' for 'second' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:47 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'47', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee69250>
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'2014' for 'year' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'07' for 'monthdash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 7 for 'month' with default 7
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'08' for 'daydash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 8 for 'day' with default 8
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'19' for 'hour' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'28' for 'minute' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'47' for 'second' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: http DEBUG curl -i -X HEAD -H 'X-Service-Catalog: [{"endpoints_links": [], "endpoints": [{"adminURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "region": "RegionOne", "internalURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "publicURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a"}], "type": "volume", "name": "Volume Service"}]' -H 'X-Identity-Status: Confirmed' -H 'X-Roles: Admin,_member_,ProjectAdmin,admin' -H 'User-Agent: python-glanceclient' -H 'X-Tenant-Id: 9a663247f58a4412a840385af8d9e73a' -H 'X-User-Id: 260659016df54f47833004653cfe0cb8' -H 'X-Auth-Token: e461a54513ed421ab86f2cc67e181011' -H 'Content-Type: application/octet-stream' http://10.35.0.13:9292/v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.15 nova-compute 2014-07-08 19:28:47.630 3328 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 47217
Jul 8 19:28:47 10.35.0.15 nova-compute 2014-07-08 19:28:47.630 3328 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1861
Jul 8 19:28:47 10.35.0.15 nova-compute 2014-07-08 19:28:47.631 3328 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 3
Jul 8 19:28:47 10.35.0.13 nova-compute 2014-07-08 19:28:47.645 4705 INFO nova.compute.manager [-] Lifecycle event 1 on VM 5ee78913-e655-4cb1-a92c-500639c819ad
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.648 4328 INFO glance.registry.api.v1.images [0e53b6da-ddb7-46a8-b9a7-6a73613f313d 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Successfully retrieved image 6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.650 4328 INFO glance.wsgi.server [0e53b6da-ddb7-46a8-b9a7-6a73613f313d 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "GET /images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 1098 0.017973
Jul 8 19:28:47 10.35.0.13 glance-api 2014-07-08 19:28:47.653 4337 INFO glance.wsgi.server [0371db4d-1456-44c9-a124-190ffc1e6b22 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "HEAD /v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 1114 0.026630
Jul 8 19:28:47 10.35.0.15 nova-compute 2014-07-08 19:28:47.658 3328 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.15:node-10-35-0-15
Jul 8 19:28:47 10.35.0.13 cobalt-compute: http DEBUG
HTTP/1.1 200 OK
content-length: 0
x-image-meta-property-user_id: 260659016df54f47833004653cfe0cb8
x-image-meta-status: active
x-image-meta-property-image_state: available
x-image-meta-owner: 9a663247f58a4412a840385af8d9e73a
x-image-meta-name: vm-39852899123.xml
x-image-meta-container_format: bare
x-image-meta-created_at: 2014-07-08T19:28:46
etag: de806a660303a3ee27d5e2bd7ab2b328
location: http://10.35.0.13:9292/v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321
x-image-meta-min_ram: 0
x-image-meta-updated_at: 2014-07-08T19:28:47
x-image-meta-id: 6eeaab98-c1ee-40c1-b0c5-f0408c065321
x-image-meta-property-instance_uuid: 5ee78913-e655-4cb1-a92c-500639c819ad
date: Tue, 08 Jul 2014 19:28:47 GMT
x-openstack-request-id: req-0371db4d-1456-44c9-a124-190ffc1e6b22
x-image-meta-deleted: False
x-image-meta-checksum: de806a660303a3ee27d5e2bd7ab2b328
content-type: text/html; charset=UTF-8
x-image-meta-protected: True
x-image-meta-min_disk: 0
x-image-meta-size: 4771
x-image-meta-is_public: False
x-image-meta-property-owner_id: 9a663247f58a4412a840385af8d9e73a
x-image-meta-disk_format: raw
Jul 8 19:28:47 10.35.0.13 cobalt-compute: http DEBUG curl -i -X HEAD -H 'X-Service-Catalog: [{"endpoints_links": [], "endpoints": [{"adminURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "region": "RegionOne", "internalURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "publicURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a"}], "type": "volume", "name": "Volume Service"}]' -H 'X-Identity-Status: Confirmed' -H 'X-Roles: Admin,_member_,ProjectAdmin,admin' -H 'User-Agent: python-glanceclient' -H 'X-Tenant-Id: 9a663247f58a4412a840385af8d9e73a' -H 'X-User-Id: 260659016df54f47833004653cfe0cb8' -H 'X-Auth-Token: e461a54513ed421ab86f2cc67e181011' -H 'Content-Type: application/octet-stream' http://10.35.0.13:9292/v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.683 4328 INFO glance.registry.api.v1.images [1687b1e1-8be4-4d94-bc10-36dcaf86d458 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Successfully retrieved image 6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.684 4328 INFO glance.wsgi.server [1687b1e1-8be4-4d94-bc10-36dcaf86d458 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "GET /images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 1098 0.017172
Jul 8 19:28:47 10.35.0.13 glance-api 2014-07-08 19:28:47.688 4337 INFO glance.wsgi.server [c08c36ee-260f-4a13-9b91-2880a13ea827 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "HEAD /v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 1114 0.024758
Jul 8 19:28:47 10.35.0.13 cobalt-compute: http DEBUG
HTTP/1.1 200 OK
content-length: 0
x-image-meta-property-user_id: 260659016df54f47833004653cfe0cb8
x-image-meta-status: active
x-image-meta-property-image_state: available
x-image-meta-owner: 9a663247f58a4412a840385af8d9e73a
x-image-meta-name: vm-39852899123.xml
x-image-meta-container_format: bare
x-image-meta-created_at: 2014-07-08T19:28:46
etag: de806a660303a3ee27d5e2bd7ab2b328
location: http://10.35.0.13:9292/v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321
x-image-meta-min_ram: 0
x-image-meta-updated_at: 2014-07-08T19:28:47
x-image-meta-id: 6eeaab98-c1ee-40c1-b0c5-f0408c065321
x-image-meta-property-instance_uuid: 5ee78913-e655-4cb1-a92c-500639c819ad
date: Tue, 08 Jul 2014 19:28:47 GMT
x-openstack-request-id: req-c08c36ee-260f-4a13-9b91-2880a13ea827
x-image-meta-deleted: False
x-image-meta-checksum: de806a660303a3ee27d5e2bd7ab2b328
content-type: text/html; charset=UTF-8
x-image-meta-protected: True
x-image-meta-min_disk: 0
x-image-meta-size: 4771
x-image-meta-is_public: False
x-image-meta-property-owner_id: 9a663247f58a4412a840385af8d9e73a
x-image-meta-disk_format: raw
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:46 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'46', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee69250>
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'2014' for 'year' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'07' for 'monthdash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 7 for 'month' with default 7
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'08' for 'daydash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 8 for 'day' with default 8
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'19' for 'hour' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'28' for 'minute' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'46' for 'second' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:47 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'47', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee69250>
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'2014' for 'year' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'07' for 'monthdash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 7 for 'month' with default 7
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'08' for 'daydash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 8 for 'day' with default 8
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'19' for 'hour' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'28' for 'minute' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'47' for 'second' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: image DEBUG Updating image 6eeaab98-c1ee-40c1-b0c5-f0408c065321: {'status': u'active', 'name': u'vm-39852899123.xml', 'deleted': False, 'container_format': u'bare', 'created_at': datetime.datetime(2014, 7, 8, 19, 28, 46, tzinfo=<iso8601.iso8601.Utc object at 0x7ff0d34fc450>), 'disk_format': u'raw', 'updated_at': datetime.datetime(2014, 7, 8, 19, 28, 47, tzinfo=<iso8601.iso8601.Utc object at 0x7ff0d34fc450>), 'id': u'6eeaab98-c1ee-40c1-b0c5-f0408c065321', 'owner': u'9a663247f58a4412a840385af8d9e73a', 'min_ram': 0, 'checksum': u'de806a660303a3ee27d5e2bd7ab2b328', 'min_disk': 0, 'is_public': False, 'deleted_at': None, 'properties': {u'instance_uuid': u'5ee78913-e655-4cb1-a92c-500639c819ad', u'image_state': u'available', u'user_id': u'260659016df54f47833004653cfe0cb8', 'image_type': 'Image', 'file_name': u'instance-00000002.xml', u'owner_id': u'9a663247f58a4412a840385af8d9e73a'}, 'size': 4771}
Jul 8 19:28:47 10.35.0.13 cobalt-compute: http DEBUG curl -i -X PUT -H 'x-image-meta-property-user_id: 260659016df54f47833004653cfe0cb8' -H 'X-Auth-Token: e461a54513ed421ab86f2cc67e181011' -H 'x-image-meta-property-image_state: available' -H 'x-image-meta-owner: 9a663247f58a4412a840385af8d9e73a' -H 'User-Agent: python-glanceclient' -H 'x-image-meta-name: vm-39852899123.xml' -H 'x-image-meta-container_format: bare' -H 'x-image-meta-property-image_type: Image' -H 'X-User-Id: 260659016df54f47833004653cfe0cb8' -H 'x-image-meta-min_ram: 0' -H 'x-glance-registry-purge-props: true' -H 'x-image-meta-property-instance_uuid: 5ee78913-e655-4cb1-a92c-500639c819ad' -H 'X-Service-Catalog: [{"endpoints_links": [], "endpoints": [{"adminURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "region": "RegionOne", "internalURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "publicURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a"}], "type": "volume", "name": "Volume Service"}]' -H 'x-image-meta-deleted: False' -H 'x-image-meta-checksum: de806a660303a3ee27d5e2bd7ab2b328' -H 'x-image-meta-property-owner_id: 9a663247f58a4412a840385af8d9e73a' -H 'x-image-meta-min_disk: 0' -H 'X-Roles: Admin,_member_,ProjectAdmin,admin' -H 'x-image-meta-size: 4771' -H 'X-Tenant-Id: 9a663247f58a4412a840385af8d9e73a' -H 'x-image-meta-is_public: False' -H 'x-image-meta-property-file_name: instance-00000002.xml' -H 'X-Identity-Status: Confirmed' -H 'Content-Type: application/octet-stream' -H 'x-image-meta-disk_format: raw' http://10.35.0.13:9292/v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.723 4328 INFO glance.registry.api.v1.images [dd555c38-29fa-4a17-abe3-386b618ea782 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Successfully retrieved image 6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.725 4328 INFO glance.wsgi.server [dd555c38-29fa-4a17-abe3-386b618ea782 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "GET /images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 1098 0.018250
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.768 4328 INFO glance.registry.api.v1.members [322eba1a-6d25-4f97-ad27-03ff97cde610 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Returning member list for image 6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.769 4328 INFO glance.wsgi.server [322eba1a-6d25-4f97-ad27-03ff97cde610 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "GET /images/6eeaab98-c1ee-40c1-b0c5-f0408c065321/members HTTP/1.1" 200 204 0.035162
Jul 8 19:28:47 10.35.0.13 nova-compute 2014-07-08 19:28:47.789 4705 AUDIT nova.compute.resource_tracker [-] Auditing locally available compute resources
Jul 8 19:28:47 10.35.0.13 sudo: nova : TTY=unknown ; PWD=/ ; USER=root ; COMMAND=/usr/bin/nova-rootwrap /etc/nova/rootwrap.conf vgs --noheadings --nosuffix --separator | --units b -o vg_size,vg_free nova
Jul 8 19:28:47 10.35.0.13 nova-compute 2014-07-08 19:28:47.801 4705 INFO nova.compute.manager [-] [instance: 5ee78913-e655-4cb1-a92c-500639c819ad] During sync_power_state the instance has a pending task. Skip.
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.851 4328 INFO glance.registry.api.v1.images [bac92dc8-5728-4025-b68b-6b097d88eb9b 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Updating metadata for image 6eeaab98-c1ee-40c1-b0c5-f0408c065321
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.852 4328 INFO glance.wsgi.server [bac92dc8-5728-4025-b68b-6b097d88eb9b 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "PUT /images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 1159 0.078348
Jul 8 19:28:47 10.35.0.13 glance-api 2014-07-08 19:28:47.855 4337 INFO glance.wsgi.server [3f90d007-1839-4df3-8840-8eafb7df63a9 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "PUT /v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321 HTTP/1.1" 200 1019 0.153600
Jul 8 19:28:47 10.35.0.13 cobalt-compute: http DEBUG
HTTP/1.1 200 OK
content-length: 723
etag: de806a660303a3ee27d5e2bd7ab2b328
location: http://10.35.0.13:9292/v1/images/6eeaab98-c1ee-40c1-b0c5-f0408c065321
date: Tue, 08 Jul 2014 19:28:47 GMT
content-type: application/json
x-openstack-request-id: req-3f90d007-1839-4df3-8840-8eafb7df63a9
{"image": {"status": "active", "deleted": false, "container_format": "bare", "min_ram": 0, "updated_at": "2014-07-08T19:28:47", "owner": "9a663247f58a4412a840385af8d9e73a", "min_disk": 0, "is_public": false, "deleted_at": null, "id": "6eeaab98-c1ee-40c1-b0c5-f0408c065321", "size": 4771, "virtual_size": null, "name": "vm-39852899123.xml", "checksum": "de806a660303a3ee27d5e2bd7ab2b328", "created_at": "2014-07-08T19:28:46", "disk_format": "raw", "properties": {"instance_uuid": "5ee78913-e655-4cb1-a92c-500639c819ad", "image_state": "available", "user_id": "260659016df54f47833004653cfe0cb8", "image_type": "Image", "file_name": "instance-00000002.xml", "owner_id": "9a663247f58a4412a840385af8d9e73a"}, "protected": true}}
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:46 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'46', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee69250>
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'2014' for 'year' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'07' for 'monthdash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 7 for 'month' with default 7
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'08' for 'daydash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 8 for 'day' with default 8
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'19' for 'hour' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'28' for 'minute' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'46' for 'second' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:47 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'47', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee69250>
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'2014' for 'year' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'07' for 'monthdash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 7 for 'month' with default 7
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'08' for 'daydash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 8 for 'day' with default 8
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'19' for 'hour' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'28' for 'minute' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'47' for 'second' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: http DEBUG curl -i -X POST -H 'X-Service-Catalog: [{"endpoints_links": [], "endpoints": [{"adminURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "region": "RegionOne", "internalURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "publicURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a"}], "type": "volume", "name": "Volume Service"}]' -H 'X-Identity-Status: Confirmed' -H 'x-image-meta-property-user_id: 260659016df54f47833004653cfe0cb8' -H 'x-image-meta-property-instance_uuid: 5ee78913-e655-4cb1-a92c-500639c819ad' -H 'User-Agent: python-glanceclient' -H 'X-User-Id: 260659016df54f47833004653cfe0cb8' -H 'X-Roles: Admin,_member_,ProjectAdmin,admin' -H 'x-image-meta-property-image_state: creating' -H 'x-image-meta-is_public: False' -H 'X-Auth-Token: e461a54513ed421ab86f2cc67e181011' -H 'Content-Type: application/octet-stream' -H 'X-Tenant-Id: 9a663247f58a4412a840385af8d9e73a' -H 'x-image-meta-name: vm-39852899123' http://10.35.0.13:9292/v1/images
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.907 4328 INFO glance.registry.api.v1.images [46446599-09be-4a03-b558-b5bed40a8376 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Successfully created image 34ce76ec-7425-4c73-820b-e0efd32e4d72
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.908 4328 INFO glance.wsgi.server [46446599-09be-4a03-b558-b5bed40a8376 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "POST /images HTTP/1.1" 200 803 0.040335
Jul 8 19:28:47 10.35.0.13 glance-api 2014-07-08 19:28:47.910 4334 INFO glance.wsgi.server [b40b5600-2434-40a7-a6af-7f2306f516ab 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "POST /v1/images HTTP/1.1" 201 835 0.048480
Jul 8 19:28:47 10.35.0.13 cobalt-compute: http DEBUG
HTTP/1.1 201 Created
date: Tue, 08 Jul 2014 19:28:47 GMT
content-length: 574
content-type: application/json
location: http://10.35.0.13:9292/v1/images/34ce76ec-7425-4c73-820b-e0efd32e4d72
x-openstack-request-id: req-b40b5600-2434-40a7-a6af-7f2306f516ab
{"image": {"status": "queued", "deleted": false, "container_format": null, "min_ram": 0, "updated_at": "2014-07-08T19:28:47", "owner": "9a663247f58a4412a840385af8d9e73a", "min_disk": 0, "is_public": false, "deleted_at": null, "id": "34ce76ec-7425-4c73-820b-e0efd32e4d72", "size": 0, "virtual_size": null, "name": "vm-39852899123", "checksum": null, "created_at": "2014-07-08T19:28:47", "disk_format": null, "properties": {"instance_uuid": "5ee78913-e655-4cb1-a92c-500639c819ad", "image_state": "creating", "user_id": "260659016df54f47833004653cfe0cb8"}, "protected": false}}
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:47 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'47', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee69250>
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'2014' for 'year' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'07' for 'monthdash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 7 for 'month' with default 7
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'08' for 'daydash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 8 for 'day' with default 8
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'19' for 'hour' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'28' for 'minute' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'47' for 'second' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:47 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'47', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee69250>
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'2014' for 'year' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'07' for 'monthdash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 7 for 'month' with default 7
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'08' for 'daydash' with default 1
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 8 for 'day' with default 8
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'19' for 'hour' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'28' for 'minute' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'47' for 'second' with default None
Jul 8 19:28:47 10.35.0.13 cobalt-compute: image DEBUG Uploading image /mnt/novadisk/gc-shared/instance-00000002.gc
Jul 8 19:28:47 10.35.0.13 cobalt-compute: http DEBUG curl -i -X PUT -H 'X-Service-Catalog: [{"endpoints_links": [], "endpoints": [{"adminURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "region": "RegionOne", "internalURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "publicURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a"}], "type": "volume", "name": "Volume Service"}]' -H 'X-Identity-Status: Confirmed' -H 'x-image-meta-property-owner_id: 9a663247f58a4412a840385af8d9e73a' -H 'x-image-meta-container_format: bare' -H 'Transfer-Encoding: chunked' -H 'x-glance-registry-purge-props: true' -H 'x-image-meta-protected: True' -H 'User-Agent: python-glanceclient' -H 'x-image-meta-size: 1372160' -H 'X-Roles: Admin,_member_,ProjectAdmin,admin' -H 'x-image-meta-property-image_state: available' -H 'x-image-meta-is_public: False' -H 'X-User-Id: 260659016df54f47833004653cfe0cb8' -H 'X-Auth-Token: e461a54513ed421ab86f2cc67e181011' -H 'Content-Type: application/octet-stream' -H 'X-Tenant-Id: 9a663247f58a4412a840385af8d9e73a' -H 'x-image-meta-disk_format: raw' -d '<open file u'/mnt/novadisk/gc-shared/instance-00000002.gc', mode 'r' at 0x7ff0d35da9c0>' http://10.35.0.13:9292/v1/images/34ce76ec-7425-4c73-820b-e0efd32e4d72
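The -d '<open file ...>' in the logged curl command is glanceclient streaming the 1372160-byte instance-00000002.gc artifact into the queued image record with chunked transfer encoding. A minimal sketch of the same upload using requests, which likewise streams file-like bodies instead of reading them into memory (token, paths and metadata headers taken from the log):

    import requests

    headers = {
        "X-Auth-Token": "e461a54513ed421ab86f2cc67e181011",
        "Content-Type": "application/octet-stream",
        "x-image-meta-size": "1372160",
        "x-image-meta-disk_format": "raw",
        "x-image-meta-container_format": "bare",
        "x-image-meta-is_public": "False",
    }
    with open("/mnt/novadisk/gc-shared/instance-00000002.gc", "rb") as body:
        resp = requests.put(
            "http://10.35.0.13:9292/v1/images/34ce76ec-7425-4c73-820b-e0efd32e4d72",
            headers=headers,
            data=body,  # streamed from disk, mirroring the chunked upload above
        )
    print(resp.status_code)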
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.939 4328 INFO glance.registry.api.v1.images [d37adb69-d5ac-4711-b595-e07eafa25505 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Successfully retrieved image 34ce76ec-7425-4c73-820b-e0efd32e4d72
Jul 8 19:28:47 10.35.0.13 glance-registry 2014-07-08 19:28:47.940 4328 INFO glance.wsgi.server [d37adb69-d5ac-4711-b595-e07eafa25505 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:47] "GET /images/34ce76ec-7425-4c73-820b-e0efd32e4d72 HTTP/1.1" 200 803 0.017079
Jul 8 19:28:48 10.35.0.13 glance-registry 2014-07-08 19:28:48.005 4328 INFO glance.registry.api.v1.images [e72b3f70-e11b-4a54-9ef5-413081bcb284 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Updating metadata for image 34ce76ec-7425-4c73-820b-e0efd32e4d72
Jul 8 19:28:48 10.35.0.13 glance-registry 2014-07-08 19:28:48.006 4328 INFO glance.wsgi.server [e72b3f70-e11b-4a54-9ef5-413081bcb284 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:48] "PUT /images/34ce76ec-7425-4c73-820b-e0efd32e4d72 HTTP/1.1" 200 860 0.062644
Jul 8 19:28:48 10.35.0.13 glance-registry 2014-07-08 19:28:48.025 4328 INFO glance.registry.api.v1.images [786f1223-467f-42b9-abbd-71cbe3c5943d 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Successfully retrieved image 34ce76ec-7425-4c73-820b-e0efd32e4d72
Jul 8 19:28:48 10.35.0.13 glance-registry 2014-07-08 19:28:48.030 4328 INFO glance.wsgi.server [786f1223-467f-42b9-abbd-71cbe3c5943d 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:48] "GET /images/34ce76ec-7425-4c73-820b-e0efd32e4d72 HTTP/1.1" 200 860 0.020535
Jul 8 19:28:48 10.35.0.13 glance-registry 2014-07-08 19:28:48.087 4328 INFO glance.registry.api.v1.images [d0f00095-d877-4497-9882-7d515cd1cf58 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Updating metadata for image 34ce76ec-7425-4c73-820b-e0efd32e4d72
Jul 8 19:28:48 10.35.0.13 glance-registry 2014-07-08 19:28:48.089 4328 INFO glance.wsgi.server [d0f00095-d877-4497-9882-7d515cd1cf58 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:48] "PUT /images/34ce76ec-7425-4c73-820b-e0efd32e4d72 HTTP/1.1" 200 860 0.057697
Jul 8 19:28:48 10.35.0.13 nova-compute 2014-07-08 19:28:48.152 4705 AUDIT nova.compute.resource_tracker [-] Free ram (MB): 46705
Jul 8 19:28:48 10.35.0.13 nova-compute 2014-07-08 19:28:48.152 4705 AUDIT nova.compute.resource_tracker [-] Free disk (GB): 1860
Jul 8 19:28:48 10.35.0.13 nova-compute 2014-07-08 19:28:48.153 4705 AUDIT nova.compute.resource_tracker [-] Free VCPUS: 2
Jul 8 19:28:48 10.35.0.13 nova-compute 2014-07-08 19:28:48.188 4705 INFO nova.compute.resource_tracker [-] Compute_service record updated for 10.35.0.13:node-10-35-0-13
Jul 8 19:28:48 10.35.0.13 glance-registry 2014-07-08 19:28:48.271 4328 INFO glance.registry.api.v1.images [3b48a102-1e7d-4fb3-845c-8dcacdeec8c0 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Updating metadata for image 34ce76ec-7425-4c73-820b-e0efd32e4d72
Jul 8 19:28:48 10.35.0.13 glance-registry 2014-07-08 19:28:48.272 4328 INFO glance.wsgi.server [3b48a102-1e7d-4fb3-845c-8dcacdeec8c0 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:48] "PUT /images/34ce76ec-7425-4c73-820b-e0efd32e4d72 HTTP/1.1" 200 890 0.066165
Jul 8 19:28:48 10.35.0.13 glance-registry 2014-07-08 19:28:48.348 4328 INFO glance.registry.api.v1.images [f8fde62f-889b-437a-9e16-9bf69acc1fb2 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Updating metadata for image 34ce76ec-7425-4c73-820b-e0efd32e4d72
Jul 8 19:28:48 10.35.0.13 glance-registry 2014-07-08 19:28:48.349 4328 INFO glance.wsgi.server [f8fde62f-889b-437a-9e16-9bf69acc1fb2 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:48] "PUT /images/34ce76ec-7425-4c73-820b-e0efd32e4d72 HTTP/1.1" 200 1097 0.072638
Jul 8 19:28:48 10.35.0.13 glance-api 2014-07-08 19:28:48.352 4334 INFO glance.wsgi.server [b0707d3f-d099-4d12-ad03-89857a48222d 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:48] "PUT /v1/images/34ce76ec-7425-4c73-820b-e0efd32e4d72 HTTP/1.1" 200 957 0.434001
Jul 8 19:28:48 10.35.0.13 cobalt-compute: http DEBUG
HTTP/1.1 200 OK
content-length: 661
etag: 36906d65e7a428aabe245af48b6e7f08
location: http://10.35.0.13:9292/v1/images/34ce76ec-7425-4c73-820b-e0efd32e4d72
date: Tue, 08 Jul 2014 19:28:48 GMT
content-type: application/json
x-openstack-request-id: req-b0707d3f-d099-4d12-ad03-89857a48222d
{"image": {"status": "active", "deleted": false, "container_format": "bare", "min_ram": 0, "updated_at": "2014-07-08T19:28:48", "owner": "9a663247f58a4412a840385af8d9e73a", "min_disk": 0, "is_public": false, "deleted_at": null, "id": "34ce76ec-7425-4c73-820b-e0efd32e4d72", "size": 1372160, "virtual_size": null, "name": "vm-39852899123", "checksum": "36906d65e7a428aabe245af48b6e7f08", "created_at": "2014-07-08T19:28:47", "disk_format": "raw", "properties": {"instance_uuid": "5ee78913-e655-4cb1-a92c-500639c819ad", "image_state": "available", "user_id": "260659016df54f47833004653cfe0cb8", "owner_id": "9a663247f58a4412a840385af8d9e73a"}, "protected": true}}
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:47 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'47', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee69250>
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'2014' for 'year' with default None
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'07' for 'monthdash' with default 1
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 7 for 'month' with default 7
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'08' for 'daydash' with default 1
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 8 for 'day' with default 8
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'19' for 'hour' with default None
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'28' for 'minute' with default None
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'47' for 'second' with default None
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:48 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'48', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee69250>
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'2014' for 'year' with default None
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'07' for 'monthdash' with default 1
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 7 for 'month' with default 7
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'08' for 'daydash' with default 1
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got 8 for 'day' with default 8
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'19' for 'hour' with default None
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'28' for 'minute' with default None
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Got u'48' for 'second' with default None
Jul 8 19:28:48 10.35.0.13 cobalt-compute: http DEBUG curl -i -X HEAD -H 'X-Service-Catalog: [{"endpoints_links": [], "endpoints": [{"adminURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "region": "RegionOne", "internalURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "publicURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a"}], "type": "volume", "name": "Volume Service"}]' -H 'X-Identity-Status: Confirmed' -H 'X-Roles: Admin,_member_,ProjectAdmin,admin' -H 'User-Agent: python-glanceclient' -H 'X-Tenant-Id: 9a663247f58a4412a840385af8d9e73a' -H 'X-User-Id: 260659016df54f47833004653cfe0cb8' -H 'X-Auth-Token: e461a54513ed421ab86f2cc67e181011' -H 'Content-Type: application/octet-stream' http://10.35.0.13:9292/v1/images/34ce76ec-7425-4c73-820b-e0efd32e4d72
Jul 8 19:28:48 10.35.0.13 glance-registry 2014-07-08 19:28:48.379 4328 INFO glance.registry.api.v1.images [f9e88907-5fa5-44e4-a342-f919a48cc0f5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Successfully retrieved image 34ce76ec-7425-4c73-820b-e0efd32e4d72
Jul 8 19:28:48 10.35.0.13 glance-registry 2014-07-08 19:28:48.380 4328 INFO glance.wsgi.server [f9e88907-5fa5-44e4-a342-f919a48cc0f5 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:48] "GET /images/34ce76ec-7425-4c73-820b-e0efd32e4d72 HTTP/1.1" 200 1097 0.017136
Jul 8 19:28:48 10.35.0.13 glance-api 2014-07-08 19:28:48.383 4334 INFO glance.wsgi.server [b59d9616-d3da-4c27-8708-b82f1c1b7845 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:48] "HEAD /v1/images/34ce76ec-7425-4c73-820b-e0efd32e4d72 HTTP/1.1" 200 1113 0.024371
Jul 8 19:28:48 10.35.0.13 cobalt-compute: http DEBUG
HTTP/1.1 200 OK
content-length: 0
x-image-meta-property-user_id: 260659016df54f47833004653cfe0cb8
x-image-meta-status: active
x-image-meta-property-image_state: available
x-image-meta-owner: 9a663247f58a4412a840385af8d9e73a
x-image-meta-name: vm-39852899123
x-image-meta-container_format: bare
x-image-meta-created_at: 2014-07-08T19:28:47
etag: 36906d65e7a428aabe245af48b6e7f08
location: http://10.35.0.13:9292/v1/images/34ce76ec-7425-4c73-820b-e0efd32e4d72
x-image-meta-min_ram: 0
x-image-meta-updated_at: 2014-07-08T19:28:48
x-image-meta-id: 34ce76ec-7425-4c73-820b-e0efd32e4d72
x-image-meta-property-instance_uuid: 5ee78913-e655-4cb1-a92c-500639c819ad
date: Tue, 08 Jul 2014 19:28:48 GMT
x-openstack-request-id: req-b59d9616-d3da-4c27-8708-b82f1c1b7845
x-image-meta-deleted: False
x-image-meta-checksum: 36906d65e7a428aabe245af48b6e7f08
content-type: text/html; charset=UTF-8
x-image-meta-protected: True
x-image-meta-min_disk: 0
x-image-meta-size: 1372160
x-image-meta-is_public: False
x-image-meta-property-owner_id: 9a663247f58a4412a840385af8d9e73a
x-image-meta-disk_format: raw
Jul 8 19:28:48 10.35.0.13 cobalt-compute: http DEBUG curl -i -X HEAD -H 'X-Service-Catalog: [{"endpoints_links": [], "endpoints": [{"adminURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "region": "RegionOne", "internalURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a", "publicURL": "http://10.35.2.6:8776/v1/9a663247f58a4412a840385af8d9e73a"}], "type": "volume", "name": "Volume Service"}]' -H 'X-Identity-Status: Confirmed' -H 'X-Roles: Admin,_member_,ProjectAdmin,admin' -H 'User-Agent: python-glanceclient' -H 'X-Tenant-Id: 9a663247f58a4412a840385af8d9e73a' -H 'X-User-Id: 260659016df54f47833004653cfe0cb8' -H 'X-Auth-Token: e461a54513ed421ab86f2cc67e181011' -H 'Content-Type: application/octet-stream' http://10.35.0.13:9292/v1/images/34ce76ec-7425-4c73-820b-e0efd32e4d72
Jul 8 19:28:48 10.35.0.13 pound: 10.35.2.3:35357 10.35.2.7 - - [08/Jul/2014:19:28:48 +0000] "GET /v2.0/tokens/e461a54513ed421ab86f2cc67e181011 HTTP/1.1" 200 3290 "" "python-requests/2.3.0 CPython/2.7.6 Linux/3.10.46"
Jul 8 19:28:48 10.35.0.13 glance-registry 2014-07-08 19:28:48.436 4328 INFO glance.registry.api.v1.images [bf547826-44e7-4926-8ac1-5d5eb4b8018a 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] Successfully retrieved image 34ce76ec-7425-4c73-820b-e0efd32e4d72
Jul 8 19:28:48 10.35.0.13 glance-registry 2014-07-08 19:28:48.437 4328 INFO glance.wsgi.server [bf547826-44e7-4926-8ac1-5d5eb4b8018a 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:48] "GET /images/34ce76ec-7425-4c73-820b-e0efd32e4d72 HTTP/1.1" 200 1097 0.017768
Jul 8 19:28:48 10.35.0.13 glance-api 2014-07-08 19:28:48.440 4335 INFO glance.wsgi.server [8277b206-302b-4d03-84e0-ad3dd64e9092 260659016df54f47833004653cfe0cb8 9a663247f58a4412a840385af8d9e73a - - -] 10.35.0.13 - - [08/Jul/2014 19:28:48] "HEAD /v1/images/34ce76ec-7425-4c73-820b-e0efd32e4d72 HTTP/1.1" 200 1113 0.047851
Jul 8 19:28:48 10.35.0.13 cobalt-compute: http DEBUG
HTTP/1.1 200 OK
content-length: 0
x-image-meta-property-user_id: 260659016df54f47833004653cfe0cb8
x-image-meta-status: active
x-image-meta-property-image_state: available
x-image-meta-owner: 9a663247f58a4412a840385af8d9e73a
x-image-meta-name: vm-39852899123
x-image-meta-container_format: bare
x-image-meta-created_at: 2014-07-08T19:28:47
etag: 36906d65e7a428aabe245af48b6e7f08
location: http://10.35.0.13:9292/v1/images/34ce76ec-7425-4c73-820b-e0efd32e4d72
x-image-meta-min_ram: 0
x-image-meta-updated_at: 2014-07-08T19:28:48
x-image-meta-id: 34ce76ec-7425-4c73-820b-e0efd32e4d72
x-image-meta-property-instance_uuid: 5ee78913-e655-4cb1-a92c-500639c819ad
date: Tue, 08 Jul 2014 19:28:48 GMT
x-openstack-request-id: req-8277b206-302b-4d03-84e0-ad3dd64e9092
x-image-meta-deleted: False
x-image-meta-checksum: 36906d65e7a428aabe245af48b6e7f08
content-type: text/html; charset=UTF-8
x-image-meta-protected: True
x-image-meta-min_disk: 0
x-image-meta-size: 1372160
x-image-meta-is_public: False
x-image-meta-property-owner_id: 9a663247f58a4412a840385af8d9e73a
x-image-meta-disk_format: raw
Jul 8 19:28:48 10.35.0.13 cobalt-compute: iso8601 DEBUG Parsed 2014-07-08T19:28:47 into {'tz_sign': None, 'second_fraction': None, 'hour': u'19', 'daydash': u'08', 'tz_hour': None, 'month': None, 'timezone': None, 'second': u'47', 'tz_minute': None, 'year': u'2014', 'separator': u'T', 'monthdash': u'07', 'day': None, 'minute': u'28'} with default timezone <iso8601.iso8601.Utc object at 0x7ff0cee6925