Error log for o-cw (octavia-controller-worker): the ComputeWait task times out while creating a load balancer, because nova cannot schedule the amphora instance.
2015-07-07 16:12:34.475 98003 WARNING octavia.controller.worker.controller_worker [-] Task 'octavia.controller.worker.tasks.compute_tasks.ComputeWait' (5fc81d5d-3ce2-460b-967c-a3108961179a) transitioned into state 'FAILURE' from state 'RUNNING'
2015-07-07 16:12:34.475 98003 ERROR octavia.controller.worker.controller_worker Traceback (most recent call last):
2015-07-07 16:12:34.475 98003 ERROR octavia.controller.worker.controller_worker File "/usr/local/lib/python2.7/dist-packages/taskflow/engines/action_engine/executor.py", line 68, in _execute_task
2015-07-07 16:12:34.475 98003 ERROR octavia.controller.worker.controller_worker result = task.execute(**arguments)
2015-07-07 16:12:34.475 98003 ERROR octavia.controller.worker.controller_worker File "/opt/stack/octavia/octavia/controller/worker/tasks/compute_tasks.py", line 155, in execute
2015-07-07 16:12:34.475 98003 ERROR octavia.controller.worker.controller_worker raise exceptions.ComputeWaitTimeoutException()
2015-07-07 16:12:34.475 98003 ERROR octavia.controller.worker.controller_worker ComputeWaitTimeoutException: Waiting for compute to go active timeout.
2015-07-07 16:12:34.475 98003 ERROR octavia.controller.worker.controller_worker
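For context: the ComputeWait task raising here (compute_tasks.py:155) is a poll-until-ACTIVE loop against nova. A minimal sketch of that pattern, assuming a generic novaclient-style "compute" handle and illustrative interval/timeout values, not Octavia's exact implementation:

    import time

    class ComputeWaitTimeoutException(Exception):
        """Raised when the instance has not gone ACTIVE within the budget."""

    def wait_for_active(compute, compute_id, interval=5, timeout=10):
        # Poll nova for the server's status; each poll shows up in the
        # log as a keystoneclient REQ/RESP pair against /servers/<id>.
        deadline = time.time() + timeout
        while time.time() < deadline:
            server = compute.servers.get(compute_id)
            if server.status == 'ACTIVE':
                return server
            time.sleep(interval)
        # The instance never went ACTIVE (here it sits in ERROR), so the
        # task fails with the timeout exception seen in the traceback.
        raise ComputeWaitTimeoutException(
            'Waiting for compute to go active timeout.')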
2015-07-07 16:12:34.476 98003 DEBUG taskflow.engines.action_engine.completer [-] Applying resolver 'RevertAndRetry(strategy=RETRY)' to resolve failure 'Failure: octavia.common.exceptions.ComputeWaitTimeoutException: Waiting for compute to go active timeout.' of atom 'octavia.controller.worker.tasks.compute_tasks.ComputeWait==1.0' _process_atom_failure /usr/local/lib/python2.7/dist-packages/taskflow/engines/action_engine/completer.py:195
2015-07-07 16:12:34.486 98003 DEBUG taskflow.engines.action_engine.completer [-] Modified/tweaked 14 nodes while applying resolver 'RevertAndRetry(strategy=RETRY)' _process_atom_failure /usr/local/lib/python2.7/dist-packages/taskflow/engines/action_engine/completer.py:204
2015-07-07 16:12:34.486 98003 DEBUG taskflow.engines.action_engine.runner [-] Discarding failure 'Failure: octavia.common.exceptions.ComputeWaitTimeoutException: Waiting for compute to go active timeout.' (in response to event 'executed') under completion units request during completion of node 'octavia.controller.worker.tasks.compute_tasks.ComputeWait==1.0' (intention is to REVERT) analyze /usr/local/lib/python2.7/dist-packages/taskflow/engines/action_engine/runner.py:194
2015-07-07 16:12:34.488 98003 DEBUG octavia.controller.worker.controller_worker [-] Task 'octavia.controller.worker.tasks.compute_tasks.ComputeWait' (5fc81d5d-3ce2-460b-967c-a3108961179a) transitioned into state 'REVERTING' from state 'FAILURE' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:202
2015-07-07 16:12:34.490 98003 WARNING octavia.controller.worker.controller_worker [-] Task 'octavia.controller.worker.tasks.compute_tasks.ComputeWait' (5fc81d5d-3ce2-460b-967c-a3108961179a) transitioned into state 'REVERTED' from state 'REVERTING'
2015-07-07 16:12:34.493 98003 DEBUG octavia.controller.worker.controller_worker [-] Task 'octavia.controller.worker.tasks.compute_tasks.ComputeWait' (5fc81d5d-3ce2-460b-967c-a3108961179a) transitioned into state 'PENDING' from state 'REVERTED' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:202
2015-07-07 16:12:34.501 98003 DEBUG octavia.controller.worker.controller_worker [-] Task 'octavia.controller.worker.tasks.compute_tasks.ComputeWait' (5fc81d5d-3ce2-460b-967c-a3108961179a) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:202
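The 'RevertAndRetry(strategy=RETRY)' resolver above comes from a retry controller attached to the flow: on failure, TaskFlow reverts the atoms back to PENDING and runs them again until the retry budget is exhausted. A minimal standalone illustration of the same mechanism (the flow and task names here are illustrative, not Octavia's):

    from taskflow import engines, retry, task
    from taskflow.patterns import linear_flow

    class AlwaysFails(task.Task):
        """Stand-in for ComputeWait: fails on every attempt."""
        def execute(self):
            raise RuntimeError('simulated compute wait timeout')

    # retry.Times(2) attaches a retry controller: the first failure is
    # resolved with RevertAndRetry (atoms reverted back to PENDING and
    # re-run, exactly as in the log above); the second failure exhausts
    # the budget, the resolver becomes plain Revert, and the whole flow
    # transitions to REVERTED with the failure re-raised to the caller.
    flow = linear_flow.Flow('demo-flow', retry=retry.Times(2))
    flow.add(AlwaysFails())

    try:
        engines.run(flow)
    except RuntimeError:
        print('flow REVERTED after retries were exhausted')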
2015-07-07 16:12:44.504 98003 DEBUG keystoneclient.session [-] REQ: curl -g -i -X GET http://192.168.38.204:8774/v2/ee9162862e7b436187fb4f3280c10315/servers/ed4748f5-4d35-4871-8764-b0c72ffc1162 -H "User-Agent: python-novaclient" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}5a3b25bdb00e98cdd42216ac97e6e59307983789" _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195
2015-07-07 16:12:44.594 98003 DEBUG keystoneclient.session [-] RESP: [200] date: Tue, 07 Jul 2015 23:12:44 GMT connection: keep-alive content-type: application/json content-length: 2936 x-compute-request-id: req-628464ad-1536-4ce6-b24a-3e9f4b480814
RESP BODY: {"server": {"status": "ERROR", "updated": "2015-07-07T23:11:02Z", "hostId": "", "OS-EXT-SRV-ATTR:host": null, "addresses": {}, "links": [{"href": "http://192.168.38.204:8774/v2/ee9162862e7b436187fb4f3280c10315/servers/ed4748f5-4d35-4871-8764-b0c72ffc1162", "rel": "self"}, {"href": "http://192.168.38.204:8774/ee9162862e7b436187fb4f3280c10315/servers/ed4748f5-4d35-4871-8764-b0c72ffc1162", "rel": "bookmark"}], "key_name": "octavia_ssh_key", "image": {"id": "7a8d2a72-e803-4e08-8388-bec9290222b8", "links": [{"href": "http://192.168.38.204:8774/ee9162862e7b436187fb4f3280c10315/images/7a8d2a72-e803-4e08-8388-bec9290222b8", "rel": "bookmark"}]}, "OS-EXT-STS:task_state": null, "OS-EXT-STS:vm_state": "error", "OS-EXT-SRV-ATTR:instance_name": "instance-0000000a", "OS-SRV-USG:launched_at": null, "OS-EXT-SRV-ATTR:hypervisor_hostname": null, "flavor": {"id": "2", "links": [{"href": "http://192.168.38.204:8774/ee9162862e7b436187fb4f3280c10315/flavors/2", "rel": "bookmark"}]}, "id": "ed4748f5-4d35-4871-8764-b0c72ffc1162", "OS-SRV-USG:terminated_at": null, "OS-EXT-AZ:availability_zone": "nova", "user_id": "6019ab8f7ea84d53848afb5c595e0847", "name": "amphora-52b34550-b733-4df1-afa7-39c8b398c49e", "created": "2015-07-07T23:11:02Z", "tenant_id": "ee9162862e7b436187fb4f3280c10315", "OS-DCF:diskConfig": "MANUAL", "os-extended-volumes:volumes_attached": [], "accessIPv4": "", "accessIPv6": "", "fault": {"message": "No valid host was found. There are not enough hosts available.", "code": 500, "details": " File \"/opt/stack/nova/nova/conductor/manager.py\", line 701, in build_instances\n request_spec, filter_properties)\n File \"/opt/stack/nova/nova/scheduler/utils.py\", line 342, in wrapped\n return func(*args, **kwargs)\n File \"/opt/stack/nova/nova/scheduler/client/__init__.py\", line 52, in select_destinations\n context, request_spec, filter_properties)\n File \"/opt/stack/nova/nova/scheduler/client/__init__.py\", line 37, in __run_method\n return getattr(self.instance, __name)(*args, **kwargs)\n File \"/opt/stack/nova/nova/scheduler/client/query.py\", line 34, in select_destinations\n context, request_spec, filter_properties)\n File \"/opt/stack/nova/nova/scheduler/rpcapi.py\", line 120, in select_destinations\n request_spec=request_spec, filter_properties=filter_properties)\n File \"/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py\", line 158, in call\n retry=self.retry)\n File \"/usr/local/lib/python2.7/dist-packages/oslo_messaging/transport.py\", line 90, in _send\n timeout=timeout, retry=retry)\n File \"/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py\", line 361, in send\n retry=retry)\n File \"/usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py\", line 352, in _send\n raise result\n", "created": "2015-07-07T23:11:02Z"}, "OS-EXT-STS:power_state": 0, "config_drive": "True", "metadata": {}}}
_http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224
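The RESP BODY above is the actual root cause: nova put the amphora in ERROR with fault "No valid host was found. There are not enough hosts available." (the scheduler raised NoValidHost), so ComputeWait could only keep polling until it timed out. The fault is readable directly with python-novaclient; a sketch with placeholder credentials (only the 8774 compute endpoint appears in the log, the keystone URL here is an assumption):

    from novaclient import client

    # Hypothetical devstack credentials; substitute your own.
    nova = client.Client('2', 'admin', 'secret', 'admin',
                         'http://192.168.38.204:5000/v2.0')

    server = nova.servers.get('ed4748f5-4d35-4871-8764-b0c72ffc1162')
    print(server.status)            # ERROR
    print(server.fault['message'])  # No valid host was found. ...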
2015-07-07 16:12:44.595 98003 DEBUG keystoneclient.session [-] REQ: curl -g -i -X GET http://192.168.38.204:8774/v2/ee9162862e7b436187fb4f3280c10315/os-networks/a2851004-63c7-4de0-9cac-3fac6b1593e9 -H "User-Agent: python-novaclient" -H "Accept: application/json" -H "X-Auth-Token: {SHA1}5a3b25bdb00e98cdd42216ac97e6e59307983789" _http_log_request /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:195
2015-07-07 16:12:44.624 98003 DEBUG keystoneclient.session [-] RESP: [200] date: Tue, 07 Jul 2015 23:12:44 GMT connection: keep-alive content-type: application/json content-length: 657 x-compute-request-id: req-852f9f4f-d493-486b-acce-28f1714259d8
RESP BODY: {"network": {"bridge": null, "vpn_public_port": null, "dhcp_start": null, "bridge_interface": null, "share_address": null, "updated_at": null, "id": "a2851004-63c7-4de0-9cac-3fac6b1593e9", "cidr_v6": null, "deleted_at": null, "gateway": null, "rxtx_base": null, "label": "lb-mgmt-net", "priority": null, "project_id": null, "vpn_private_address": null, "deleted": null, "vlan": null, "broadcast": null, "netmask": null, "injected": null, "cidr": null, "vpn_public_address": null, "multi_host": null, "enable_dhcp": null, "dns2": null, "created_at": null, "host": null, "mtu": null, "gateway_v6": null, "netmask_v6": null, "dhcp_server": null, "dns1": null}}
_http_log_response /usr/local/lib/python2.7/dist-packages/keystoneclient/session.py:224
2015-07-07 16:12:44.625 98003 WARNING octavia.controller.worker.controller_worker [-] Task 'octavia.controller.worker.tasks.compute_tasks.ComputeWait' (5fc81d5d-3ce2-460b-967c-a3108961179a) transitioned into state 'FAILURE' from state 'RUNNING'
2015-07-07 16:12:44.625 98003 ERROR octavia.controller.worker.controller_worker Traceback (most recent call last):
2015-07-07 16:12:44.625 98003 ERROR octavia.controller.worker.controller_worker File "/usr/local/lib/python2.7/dist-packages/taskflow/engines/action_engine/executor.py", line 68, in _execute_task
2015-07-07 16:12:44.625 98003 ERROR octavia.controller.worker.controller_worker result = task.execute(**arguments)
2015-07-07 16:12:44.625 98003 ERROR octavia.controller.worker.controller_worker File "/opt/stack/octavia/octavia/controller/worker/tasks/compute_tasks.py", line 155, in execute
2015-07-07 16:12:44.625 98003 ERROR octavia.controller.worker.controller_worker raise exceptions.ComputeWaitTimeoutException()
2015-07-07 16:12:44.625 98003 ERROR octavia.controller.worker.controller_worker ComputeWaitTimeoutException: Waiting for compute to go active timeout.
2015-07-07 16:12:44.625 98003 ERROR octavia.controller.worker.controller_worker
2015-07-07 16:12:44.626 98003 DEBUG taskflow.engines.action_engine.completer [-] Applying resolver 'Revert(strategy=REVERT)' to resolve failure 'Failure: octavia.common.exceptions.ComputeWaitTimeoutException: Waiting for compute to go active timeout.' of atom 'octavia.controller.worker.tasks.compute_tasks.ComputeWait==1.0' _process_atom_failure /usr/local/lib/python2.7/dist-packages/taskflow/engines/action_engine/completer.py:195
2015-07-07 16:12:44.631 98003 DEBUG taskflow.engines.action_engine.completer [-] Modified/tweaked 14 nodes while applying resolver 'Revert(strategy=REVERT)' _process_atom_failure /usr/local/lib/python2.7/dist-packages/taskflow/engines/action_engine/completer.py:204
2015-07-07 16:12:44.632 98003 DEBUG taskflow.engines.action_engine.runner [-] Discarding failure 'Failure: octavia.common.exceptions.ComputeWaitTimeoutException: Waiting for compute to go active timeout.' (in response to event 'executed') under completion units request during completion of node 'octavia.controller.worker.tasks.compute_tasks.ComputeWait==1.0' (intention is to REVERT) analyze /usr/local/lib/python2.7/dist-packages/taskflow/engines/action_engine/runner.py:194
2015-07-07 16:12:44.634 98003 DEBUG octavia.controller.worker.controller_worker [-] Task 'octavia.controller.worker.tasks.compute_tasks.ComputeWait' (5fc81d5d-3ce2-460b-967c-a3108961179a) transitioned into state 'REVERTING' from state 'FAILURE' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:202
2015-07-07 16:12:44.635 98003 WARNING octavia.controller.worker.controller_worker [-] Task 'octavia.controller.worker.tasks.compute_tasks.ComputeWait' (5fc81d5d-3ce2-460b-967c-a3108961179a) transitioned into state 'REVERTED' from state 'REVERTING'
2015-07-07 16:12:44.639 98003 WARNING octavia.controller.worker.controller_worker [-] Flow 'octavia-create-amp-for-lb-flow' (12606138-4183-4bd6-8ff1-b74c68fc729b) transitioned into state 'REVERTED' from state 'RUNNING'
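With the flow REVERTED, the engine re-raises the stored failure, which is what the oslo_messaging traceback below shows. Since the underlying problem is NoValidHost, the usual follow-up is to check hypervisor capacity against the amphora flavor (flavor id 2, per the RESP BODY above); a sketch reusing the client from the previous snippet:

    # Does the host have room for another amphora?
    stats = nova.hypervisors.statistics()
    print('free RAM (MB):', stats.free_ram_mb)
    print('free disk (GB):', stats.free_disk_gb)
    print('vcpus:', stats.vcpus_used, 'used of', stats.vcpus)

    flavor = nova.flavors.get('2')
    print('amphora needs:', flavor.ram, 'MB RAM,',
          flavor.disk, 'GB disk,', flavor.vcpus, 'vcpus')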
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher [-] Exception during message handling: Waiting for compute to go active timeout.
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher Traceback (most recent call last):
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 142, in _dispatch_and_reply
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher executor_callback))
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 186, in _dispatch
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher executor_callback)
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/oslo_messaging/rpc/dispatcher.py", line 130, in _do_dispatch
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher result = func(ctxt, **new_args)
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/octavia/octavia/controller/queue/endpoint.py", line 38, in create_load_balancer
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher self.worker.create_load_balancer(load_balancer_id)
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/octavia/octavia/controller/worker/controller_worker.py", line 266, in create_load_balancer
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher create_amp_lb_tf.run()
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/taskflow/engines/action_engine/engine.py", line 140, in run
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher for _state in self.run_iter():
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/taskflow/engines/action_engine/engine.py", line 198, in run_iter
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher failure.Failure.reraise_if_any(failures.values())
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/taskflow/types/failure.py", line 302, in reraise_if_any
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher failures[0].reraise()
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/taskflow/types/failure.py", line 309, in reraise
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher six.reraise(*self._exc_info)
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher File "/usr/local/lib/python2.7/dist-packages/taskflow/engines/action_engine/executor.py", line 68, in _execute_task
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher result = task.execute(**arguments)
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher File "/opt/stack/octavia/octavia/controller/worker/tasks/compute_tasks.py", line 155, in execute
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher raise exceptions.ComputeWaitTimeoutException()
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher ComputeWaitTimeoutException: Waiting for compute to go active timeout.
2015-07-07 16:12:44.640 98003 ERROR oslo_messaging.rpc.dispatcher