@mbforbes
Created May 14, 2015 18:05
Reboot test failure — reboot (pods not 'Running' after reboot)
Reboot
should reboot each node and ensure they function upon restart
/go/src/github.com/GoogleCloudPlatform/kubernetes/_output/dockerized/go/src/github.com/GoogleCloudPlatform/kubernetes/test/e2e/reboot.go:97
[BeforeEach] Reboot
/go/src/github.com/GoogleCloudPlatform/kubernetes/_output/dockerized/go/src/github.com/GoogleCloudPlatform/kubernetes/test/e2e/reboot.go:62
>>> testContext.KubeConfig: /Users/maxforbes/.kube/config
[It] should reboot each node and ensure they function upon restart
/go/src/github.com/GoogleCloudPlatform/kubernetes/_output/dockerized/go/src/github.com/GoogleCloudPlatform/kubernetes/test/e2e/reboot.go:97
INFO: Getting e2e-test-maxforbes-minion-3t1x
INFO: Getting e2e-test-maxforbes-minion-66vd
INFO: Waiting up to 20s for node e2e-test-maxforbes-minion-3t1x readiness to be true
INFO: Waiting up to 20s for node e2e-test-maxforbes-minion-66vd readiness to be true
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: True
INFO: Successfully found node e2e-test-maxforbes-minion-3t1x readiness to be true
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: True
INFO: Successfully found node e2e-test-maxforbes-minion-66vd readiness to be true
INFO: Node e2e-test-maxforbes-minion-3t1x has 4 pods: [elasticsearch-logging-x8est fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x kube-dns-u248l monitoring-heapster-controller-uftv5]
INFO: Waiting up to 20s for the following pods to be running and ready: [elasticsearch-logging-x8est fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x kube-dns-u248l monitoring-heapster-controller-uftv5]
INFO: Waiting up to 20s for pod elasticsearch-logging-x8est status to be running and ready
INFO: Waiting up to 20s for pod monitoring-heapster-controller-uftv5 status to be running and ready
INFO: Waiting up to 20s for pod kube-dns-u248l status to be running and ready
INFO: Waiting up to 20s for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x status to be running and ready
INFO: Node e2e-test-maxforbes-minion-66vd has 4 pods: [elasticsearch-logging-uoaxg fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd kibana-logging-2qlm8 monitoring-influx-grafana-controller-ohuml]
INFO: Waiting up to 20s for the following pods to be running and ready: [elasticsearch-logging-uoaxg fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd kibana-logging-2qlm8 monitoring-influx-grafana-controller-ohuml]
INFO: Waiting up to 20s for pod elasticsearch-logging-uoaxg status to be running and ready
INFO: Waiting up to 20s for pod kibana-logging-2qlm8 status to be running and ready
INFO: Waiting up to 20s for pod monitoring-influx-grafana-controller-ohuml status to be running and ready
INFO: Waiting up to 20s for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd status to be running and ready
INFO: Wanted all pods to be running and ready. Result: true. Pods: [elasticsearch-logging-x8est fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x kube-dns-u248l monitoring-heapster-controller-uftv5]
INFO: Getting external IP address for e2e-test-maxforbes-minion-3t1x
INFO: Calling reboot on e2e-test-maxforbes-minion-3t1x
INFO: Using SSH key: /Users/maxforbes/.ssh/google_compute_engine
INFO: Wanted all pods to be running and ready. Result: true. Pods: [elasticsearch-logging-uoaxg fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd kibana-logging-2qlm8 monitoring-influx-grafana-controller-ohuml]
INFO: Getting external IP address for e2e-test-maxforbes-minion-66vd
INFO: Calling reboot on e2e-test-maxforbes-minion-66vd
INFO: Using SSH key: /Users/maxforbes/.ssh/google_compute_engine
INFO: Waiting up to 2m0s for node e2e-test-maxforbes-minion-3t1x readiness to be false
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: True
INFO: Waiting up to 2m0s for node e2e-test-maxforbes-minion-66vd readiness to be false
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: True
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: Unknown
INFO: Successfully found node e2e-test-maxforbes-minion-3t1x readiness to be false
INFO: Waiting up to 5m0s for node e2e-test-maxforbes-minion-3t1x readiness to be true
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: Unknown
INFO: Successfully found node e2e-test-maxforbes-minion-66vd readiness to be false
INFO: Waiting up to 5m0s for node e2e-test-maxforbes-minion-66vd readiness to be true
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: Unknown
INFO: Node e2e-test-maxforbes-minion-66vd condition 1/1: type: Ready, status: True
INFO: Successfully found node e2e-test-maxforbes-minion-66vd readiness to be true
INFO: Waiting up to 2m0s for the following pods to be running and ready: [elasticsearch-logging-uoaxg fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd kibana-logging-2qlm8 monitoring-influx-grafana-controller-ohuml]
INFO: Waiting up to 2m0s for pod elasticsearch-logging-uoaxg status to be running and ready
INFO: Waiting up to 2m0s for pod kibana-logging-2qlm8 status to be running and ready
INFO: Waiting up to 2m0s for pod monitoring-influx-grafana-controller-ohuml status to be running and ready
INFO: Waiting up to 2m0s for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd status to be running and ready
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (43.287634ms)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (45.215178ms)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (107.137054ms)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (108.495561ms)
INFO: Node e2e-test-maxforbes-minion-3t1x condition 1/1: type: Ready, status: True
INFO: Successfully found node e2e-test-maxforbes-minion-3t1x readiness to be true
INFO: Waiting up to 2m0s for the following pods to be running and ready: [elasticsearch-logging-x8est fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x kube-dns-u248l monitoring-heapster-controller-uftv5]
INFO: Waiting up to 2m0s for pod elasticsearch-logging-x8est status to be running and ready
INFO: Waiting up to 2m0s for pod kube-dns-u248l status to be running and ready
INFO: Waiting up to 2m0s for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x status to be running and ready
INFO: Waiting up to 2m0s for pod monitoring-heapster-controller-uftv5 status to be running and ready
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (43.164267ms)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (45.026673ms)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (105.042359ms)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (105.208071ms)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (5.087470939s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (5.088982693s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (5.150627397s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (5.152173507s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (5.121070157s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (5.162979424s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (5.207536067s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (5.371580208s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (10.132099551s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (10.133574915s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (10.195934709s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (10.197173952s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (10.166765973s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (10.229809949s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (10.249892402s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (10.415583611s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (15.179643907s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (15.17965962s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (15.240081936s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (15.241208694s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (15.22811162s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (15.274535856s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (15.293072688s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (15.460306214s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (20.224500927s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (20.225380194s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (20.285414226s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (20.286232821s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (20.274190576s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (20.319333815s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (20.336866654s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (20.503693552s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (25.271577531s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (25.271879625s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (25.328826988s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (25.3303714s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (25.319618225s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (25.362067974s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (25.384637917s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (25.546744567s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (30.31673409s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (30.317330979s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (30.371978588s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (30.372609858s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (30.364615383s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (30.405421287s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (30.473124659s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (30.590207181s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (35.360519915s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (35.363479934s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (35.417593407s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (35.418406047s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (35.411985961s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (35.449542731s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (35.515670093s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (35.633967692s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (40.40514117s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (40.406460623s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (40.46140216s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (40.462620657s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (40.457450905s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (40.494360141s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (40.558931639s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (40.67803343s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (45.449054631s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (45.449650182s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (45.505525101s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (45.506911811s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (45.502005196s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (45.538194463s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (45.601838448s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (45.721215731s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (50.494143562s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (50.495709607s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (50.550324378s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (50.551570787s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (50.547355112s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (50.582613821s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (50.64563638s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (50.764906924s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (55.539111229s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (55.541342009s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (55.595880232s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (55.614031733s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (55.600273768s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (55.63748845s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (55.689349985s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (55.809468991s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (1m0.58257048s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (1m0.585048349s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (1m0.640682737s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (1m0.657806489s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (1m0.646284606s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (1m0.731354551s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (1m0.734610363s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (1m0.853694531s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (1m5.627049281s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (1m5.6279762s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (1m5.686042111s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (1m5.701243411s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (1m5.69075893s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (1m5.774778557s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (1m5.780211908s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (1m5.897502178s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (1m10.671976766s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (1m10.675409778s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (1m10.732053923s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (1m10.744955665s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (1m10.767637928s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (1m10.819331988s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (1m10.82338948s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (1m10.940886115s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (1m15.716497675s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (1m15.71819396s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (1m15.776679415s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (1m15.788610258s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (1m15.811936149s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (1m15.862391515s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (1m15.86863957s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (1m15.98465021s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (1m20.759794604s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (1m20.76095417s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (1m20.819139825s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (1m20.832219513s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (1m20.85853785s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (1m20.906600935s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (1m20.911503288s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (1m21.028812097s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (1m25.805981714s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (1m25.806381503s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (1m25.86342355s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (1m25.878006063s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (1m25.90433314s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (1m25.949425172s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (1m25.954430281s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (1m26.072227709s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (1m30.84966132s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (1m30.850850471s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (1m30.914769077s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (1m30.921715993s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (1m30.950799426s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (1m30.993045027s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (1m30.997845809s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (1m31.11945071s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (1m35.895489228s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (1m35.895939552s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (1m35.958435795s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (1m35.968778669s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (1m35.99584573s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (1m36.036515855s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (1m36.041066728s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (1m36.164027319s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (1m40.939973293s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (1m40.941199197s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (1m41.013564479s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (1m41.044029451s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (1m41.041773s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (1m41.081032336s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (1m41.083593719s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (1m41.208906038s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (1m45.984956491s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (1m45.985842429s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (1m46.057637331s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (1m46.087589811s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (1m46.086532445s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (1m46.125424097s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (1m46.128275601s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (1m46.252079635s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (1m51.028059378s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (1m51.029201646s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (1m51.1014368s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (1m51.130282302s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (1m51.137191579s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (1m51.17070366s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (1m51.180450692s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (1m51.299421861s)
INFO: Waiting for pod kibana-logging-2qlm8 in namespace default status to be "running and ready" (found "Pending") (1m56.079190845s)
INFO: Waiting for pod elasticsearch-logging-uoaxg in namespace default status to be "running and ready" (found "Pending") (1m56.080258519s)
INFO: Waiting for pod monitoring-influx-grafana-controller-ohuml in namespace default status to be "running and ready" (found "Pending") (1m56.145842468s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd in namespace default status to be "running and ready" (found "Pending") (1m56.174474315s)
INFO: Waiting for pod kube-dns-u248l in namespace default status to be "running and ready" (found "Pending") (1m56.183179643s)
INFO: Waiting for pod monitoring-heapster-controller-uftv5 in namespace default status to be "running and ready" (found "Pending") (1m56.2147446s)
INFO: Waiting for pod fluentd-elasticsearch-e2e-test-maxforbes-minion-3t1x in namespace default status to be "running and ready" (found "Pending") (1m56.224862199s)
INFO: Waiting for pod elasticsearch-logging-x8est in namespace default status to be "running and ready" (found "Pending") (1m56.34311137s)
INFO: Pod elasticsearch-logging-uoaxg failed to be running and ready.
INFO: Pod fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd failed to be running and ready.
INFO: Pod kibana-logging-2qlm8 failed to be running and ready.
INFO: Pod monitoring-influx-grafana-controller-ohuml failed to be running and ready.
INFO: Wanted all pods to be running and ready. Result: false. Pods: [elasticsearch-logging-uoaxg fluentd-elasticsearch-e2e-test-maxforbes-minion-66vd kibana-logging-2qlm8 monitoring-influx-grafana-controller-ohuml]
• Failure [219.177 seconds]
Reboot
/go/src/github.com/GoogleCloudPlatform/kubernetes/_output/dockerized/go/src/github.com/GoogleCloudPlatform/kubernetes/test/e2e/reboot.go:98
should reboot each node and ensure they function upon restart [It]
/go/src/github.com/GoogleCloudPlatform/kubernetes/_output/dockerized/go/src/github.com/GoogleCloudPlatform/kubernetes/test/e2e/reboot.go:97
Node e2e-test-maxforbes-minion-3t1x failed reboot test.
/go/src/github.com/GoogleCloudPlatform/kubernetes/_output/dockerized/go/src/github.com/GoogleCloudPlatform/kubernetes/test/e2e/reboot.go:90
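For context on what the repeated "running and ready" lines above mean, here is a minimal, standalone sketch of that polling loop. It is not the e2e framework's helper (reboot.go polls through the in-process API client); this version shells out to kubectl (assuming a kubectl build with -o jsonpath support), and the pod name, ~5s interval, and 2m0s budget are copied from the log purely for illustration.

// poll_pod_ready.go: illustrative sketch only, not the e2e test code.
package main

import (
	"fmt"
	"os/exec"
	"strings"
	"time"
)

// podRunningAndReady reports whether the pod's phase is Running and its
// Ready condition is True, mirroring what the log calls "running and ready".
func podRunningAndReady(namespace, name string) (bool, error) {
	phase, err := kubectlJSONPath(namespace, name, "{.status.phase}")
	if err != nil {
		return false, err
	}
	ready, err := kubectlJSONPath(namespace, name,
		`{.status.conditions[?(@.type=="Ready")].status}`)
	if err != nil {
		return false, err
	}
	return phase == "Running" && ready == "True", nil
}

// kubectlJSONPath runs `kubectl get pod` with a jsonpath expression and
// returns the trimmed output.
func kubectlJSONPath(namespace, name, expr string) (string, error) {
	out, err := exec.Command("kubectl", "get", "pod", name,
		"--namespace", namespace, "-o", "jsonpath="+expr).Output()
	return strings.TrimSpace(string(out)), err
}

func main() {
	const (
		namespace = "default"
		pod       = "elasticsearch-logging-uoaxg" // one of the pods stuck in Pending above
		timeout   = 2 * time.Minute               // matches the 2m0s budget in the log
		interval  = 5 * time.Second               // matches the ~5s poll spacing in the log
	)
	start := time.Now()
	for {
		ok, err := podRunningAndReady(namespace, pod)
		elapsed := time.Since(start)
		switch {
		case err != nil:
			fmt.Printf("error checking pod %s: %v (%v)\n", pod, err, elapsed)
		case ok:
			fmt.Printf("pod %s is running and ready (%v)\n", pod, elapsed)
			return
		default:
			fmt.Printf("waiting for pod %s to be running and ready (%v)\n", pod, elapsed)
		}
		if elapsed > timeout {
			fmt.Printf("pod %s failed to be running and ready within %v\n", pod, timeout)
			return
		}
		time.Sleep(interval)
	}
}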