@shashank855
Created May 25, 2020 22:34
```
devuser@mlrack1:~$ kubectl logs -f litmus-velero-backup-restore-pz8z8-t2zsk -n litmus
ansible-playbook 2.7.3
config file = None
configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python2.7/dist-packages/ansible
executable location = /usr/local/bin/ansible-playbook
python version = 2.7.17 (default, Apr 15 2020, 17:20:14) [GCC 7.5.0]
No config file found; using defaults
/etc/ansible/hosts did not meet host_list requirements, check plugin documentation if this is unexpected
/etc/ansible/hosts did not meet script requirements, check plugin documentation if this is unexpected
PLAYBOOK: test.yml *************************************************************
1 plays in ./experiments/functional/backup_and_restore/test.yml
PLAY [localhost] ***************************************************************
2020-05-25T21:44:30.663493 (delta: 0.092619) elapsed: 0.092619 ********
===============================================================================
TASK [Gathering Facts] *********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:1
2020-05-25T21:44:30.678293 (delta: 0.014768) elapsed: 0.107419 ********
ok: [127.0.0.1]
META: ran handlers
TASK [include_tasks] ***********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:12
2020-05-25T21:44:38.962408 (delta: 8.284096) elapsed: 8.391534 ********
included: /utils/fcm/create_testname.yml for 127.0.0.1
TASK [Record test instance/run ID] *********************************************
task path: /utils/fcm/create_testname.yml:3
2020-05-25T21:44:39.062470 (delta: 0.100007) elapsed: 8.491596 ********
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Construct testname appended with runID] **********************************
task path: /utils/fcm/create_testname.yml:7
2020-05-25T21:44:39.140366 (delta: 0.077836) elapsed: 8.569492 ********
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [include_tasks] ***********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:15
2020-05-25T21:44:39.201034 (delta: 0.060613) elapsed: 8.63016 *********
included: /utils/fcm/update_litmus_result_resource.yml for 127.0.0.1
TASK [Generate the litmus result CR to reflect SOT (Start of Test)] ************
task path: /utils/fcm/update_litmus_result_resource.yml:3
2020-05-25T21:44:39.329890 (delta: 0.128801) elapsed: 8.759016 ********
changed: [127.0.0.1] => {"changed": true, "checksum": "cb23761ee0ecfea3a093f94c0443739c777b3007", "dest": "./litmus-result.yaml", "gid": 0, "group": "root", "md5sum": "0da5664e16958d4f3c263662ebe36b5c", "mode": "0644", "owner": "root", "size": 422, "src": "/root/.ansible/tmp/ansible-tmp-1590443079.39-184115203946513/source", "state": "file", "uid": 0}
TASK [Analyze the cr yaml] *****************************************************
task path: /utils/fcm/update_litmus_result_resource.yml:14
2020-05-25T21:44:40.065200 (delta: 0.735251) elapsed: 9.494326 ********
changed: [127.0.0.1] => {"changed": true, "cmd": "cat litmus-result.yaml", "delta": "0:00:00.818046", "end": "2020-05-25 21:44:41.240863", "rc": 0, "start": "2020-05-25 21:44:40.422817", "stderr": "", "stderr_lines": [], "stdout": "---\napiVersion: litmus.io/v1alpha1\nkind: LitmusResult\nmetadata:\n\n # name of the litmus testcase\n name: velero-backup-restore \nspec:\n\n # holds information on the testcase\n testMetadata:\n app: \n chaostype: \n\n # holds the state of testcase, manually updated by json merge patch\n # result is the useful value today, but anticipate phase use in future \n testStatus: \n phase: in-progress \n result: none ", "stdout_lines": ["---", "apiVersion: litmus.io/v1alpha1", "kind: LitmusResult", "metadata:", "", " # name of the litmus testcase", " name: velero-backup-restore ", "spec:", "", " # holds information on the testcase", " testMetadata:", " app: ", " chaostype: ", "", " # holds the state of testcase, manually updated by json merge patch", " # result is the useful value today, but anticipate phase use in future ", " testStatus: ", " phase: in-progress ", " result: none "]}
TASK [Apply the litmus result CR] **********************************************
task path: /utils/fcm/update_litmus_result_resource.yml:17
2020-05-25T21:44:41.299975 (delta: 1.234723) elapsed: 10.729101 *******
changed: [127.0.0.1] => {"changed": true, "failed_when_result": false, "method": "patch", "result": {"apiVersion": "litmus.io/v1alpha1", "kind": "LitmusResult", "metadata": {"creationTimestamp": "2020-05-25T21:08:59Z", "generation": 5, "name": "velero-backup-restore", "resourceVersion": "1680731", "selfLink": "/apis/litmus.io/v1alpha1/litmusresults/velero-backup-restore", "uid": "f39fdbaf-9ecb-11ea-be3f-005056987ecb"}, "spec": {"testMetadata": {}, "testStatus": {"phase": "in-progress", "result": "none"}}}}
TASK [Generate the litmus result CR to reflect EOT (End of Test)] **************
task path: /utils/fcm/update_litmus_result_resource.yml:27
2020-05-25T21:44:43.045869 (delta: 1.745838) elapsed: 12.474995 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Analyze the cr yaml] *****************************************************
task path: /utils/fcm/update_litmus_result_resource.yml:38
2020-05-25T21:44:43.167056 (delta: 0.121096) elapsed: 12.596182 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Apply the litmus result CR] **********************************************
task path: /utils/fcm/update_litmus_result_resource.yml:41
2020-05-25T21:44:43.280789 (delta: 0.113604) elapsed: 12.709915 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify that the AUT is running] ******************************************
task path: /experiments/functional/backup_and_restore/test.yml:19
2020-05-25T21:44:43.393282 (delta: 0.1124) elapsed: 12.822408 *********
included: /utils/k8s/check_deployment_status.yml for 127.0.0.1
TASK [Check the pod status] ****************************************************
task path: /utils/k8s/check_deployment_status.yml:8
2020-05-25T21:44:43.632547 (delta: 0.239172) elapsed: 13.061673 *******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get pods -n app-percona-ns -l name=percona --no-headers -o custom-columns=:status.phase", "delta": "0:00:01.302921", "end": "2020-05-25 21:44:45.168420", "rc": 0, "start": "2020-05-25 21:44:43.865499", "stderr": "", "stderr_lines": [], "stdout": "Running", "stdout_lines": ["Running"]}
TASK [obtain the number of replicas.] ******************************************
task path: /utils/k8s/check_deployment_status.yml:22
2020-05-25T21:44:45.301839 (delta: 1.669232) elapsed: 14.730965 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Obtain the ready replica count and compare with the replica count.] ******
task path: /utils/k8s/check_deployment_status.yml:29
2020-05-25T21:44:45.450893 (delta: 0.148954) elapsed: 14.880019 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [include_tasks] ***********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:22
2020-05-25T21:44:45.539298 (delta: 0.08831) elapsed: 14.968424 ********
included: /experiments/functional/backup_and_restore/setup_dependency.yml for 127.0.0.1
TASK [Downloading velero binary] ***********************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:1
2020-05-25T21:44:45.744384 (delta: 0.205032) elapsed: 15.17351 ********
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "checksum_dest": null, "checksum_src": "30e3e5cdaf2df32b84472dc0ab1fa1ae591aae40", "dest": "./velero-v1.3.0-linux-amd64.tar.gz", "gid": 0, "group": "root", "md5sum": "b45cbd835a139e4c3ed5b56309dcf22d", "mode": "0644", "msg": "OK (24524254 bytes)", "owner": "root", "size": 24524254, "src": "/root/.ansible/tmp/ansible-tmp-1590443085.8-67546738846513/tmpntZetN", "state": "file", "status_code": 200, "uid": 0, "url": "https://github.com/vmware-tanzu/velero/releases/download/v1.3.0/velero-v1.3.0-linux-amd64.tar.gz"}
TASK [Installing velero inside litmus container] *******************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:11
2020-05-25T21:44:56.671039 (delta: 10.926599) elapsed: 26.100165 ******
[WARNING]: Consider using the unarchive module rather than running tar. If
you need to use command because unarchive is insufficient you can add
warn=False to this command task or set command_warnings=False in ansible.cfg to
get rid of this message.
changed: [127.0.0.1] => {"changed": true, "cmd": "tar -xvf velero-v1.3.0-linux-amd64.tar.gz\n mv velero-v1.3.0-linux-amd64/velero /usr/local/bin/", "delta": "0:00:01.680983", "end": "2020-05-25 21:44:58.702266", "rc": 0, "start": "2020-05-25 21:44:57.021283", "stderr": "", "stderr_lines": [], "stdout": "velero-v1.3.0-linux-amd64/LICENSE\nvelero-v1.3.0-linux-amd64/examples/README.md\nvelero-v1.3.0-linux-amd64/examples/minio\nvelero-v1.3.0-linux-amd64/examples/minio/00-minio-deployment.yaml\nvelero-v1.3.0-linux-amd64/examples/nginx-app\nvelero-v1.3.0-linux-amd64/examples/nginx-app/README.md\nvelero-v1.3.0-linux-amd64/examples/nginx-app/base.yaml\nvelero-v1.3.0-linux-amd64/examples/nginx-app/with-pv.yaml\nvelero-v1.3.0-linux-amd64/velero", "stdout_lines": ["velero-v1.3.0-linux-amd64/LICENSE", "velero-v1.3.0-linux-amd64/examples/README.md", "velero-v1.3.0-linux-amd64/examples/minio", "velero-v1.3.0-linux-amd64/examples/minio/00-minio-deployment.yaml", "velero-v1.3.0-linux-amd64/examples/nginx-app", "velero-v1.3.0-linux-amd64/examples/nginx-app/README.md", "velero-v1.3.0-linux-amd64/examples/nginx-app/base.yaml", "velero-v1.3.0-linux-amd64/examples/nginx-app/with-pv.yaml", "velero-v1.3.0-linux-amd64/velero"]}
TASK [Checking the velero version] *********************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:16
2020-05-25T21:44:58.809775 (delta: 2.138583) elapsed: 28.238901 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "velero version", "delta": "0:00:01.069617", "end": "2020-05-25 21:45:00.221766", "failed_when_result": false, "rc": 0, "start": "2020-05-25 21:44:59.152149", "stderr": "", "stderr_lines": [], "stdout": "Client:\n\tVersion: v1.3.0\n\tGit commit: 8fec8ed7fb8b4776b191753497eafeb47f2f9136\n<error getting server version: namespaces \"velero\" not found>", "stdout_lines": ["Client:", "\tVersion: v1.3.0", "\tGit commit: 8fec8ed7fb8b4776b191753497eafeb47f2f9136", "<error getting server version: namespaces \"velero\" not found>"]}
TASK [Installing minio s3-bucket] **********************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:23
2020-05-25T21:45:00.408325 (delta: 1.598472) elapsed: 29.837451 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Checking for minio pod status] *******************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:26
2020-05-25T21:45:00.538749 (delta: 0.130346) elapsed: 29.967875 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Waiting for minio job to create bucket] **********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:33
2020-05-25T21:45:00.668726 (delta: 0.129888) elapsed: 30.097852 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Fetching service account key.json from configmap] ************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:44
2020-05-25T21:45:00.800011 (delta: 0.131196) elapsed: 30.229137 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get cm gcp-key -n litmus -o jsonpath='{.data.key\\.json}' > ./key.json", "delta": "0:00:01.060223", "end": "2020-05-25 21:45:02.226779", "rc": 0, "start": "2020-05-25 21:45:01.166556", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}
TASK [Authenticating using gcp service account] ********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:47
2020-05-25T21:45:02.292971 (delta: 1.492865) elapsed: 31.722097 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "gcloud auth activate-service-account --key-file key.json", "delta": "0:00:02.525077", "end": "2020-05-25 21:45:05.083971", "rc": 0, "start": "2020-05-25 21:45:02.558894", "stderr": "Activated service account credentials for: [mayastor@openebs-ci.iam.gserviceaccount.com]", "stderr_lines": ["Activated service account credentials for: [mayastor@openebs-ci.iam.gserviceaccount.com]"], "stdout": "", "stdout_lines": []}
TASK [Setting up the project ID] ***********************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:50
2020-05-25T21:45:05.209215 (delta: 2.916183) elapsed: 34.638341 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "gcloud config set project openebs-ci", "delta": "0:00:03.442528", "end": "2020-05-25 21:45:09.066729", "rc": 0, "start": "2020-05-25 21:45:05.624201", "stderr": "Updated property [core/project].", "stderr_lines": ["Updated property [core/project]."], "stdout": "", "stdout_lines": []}
TASK [Deleting GCP bucket if exist] ********************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:53
2020-05-25T21:45:09.123529 (delta: 3.914194) elapsed: 38.552655 *******
fatal: [127.0.0.1]: FAILED! => {"changed": true, "cmd": "gsutil rm -r gs://e2e-gcp-bucket", "delta": "0:00:03.297024", "end": "2020-05-25 21:45:12.614650", "msg": "non-zero return code", "rc": 1, "start": "2020-05-25 21:45:09.317626", "stderr": "BucketNotFoundException: 404 gs://e2e-gcp-bucket bucket does not exist.", "stderr_lines": ["BucketNotFoundException: 404 gs://e2e-gcp-bucket bucket does not exist."], "stdout": "", "stdout_lines": []}
...ignoring
TASK [Creating gcp bucket] *****************************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:59
2020-05-25T21:45:12.727214 (delta: 3.603633) elapsed: 42.15634 ********
changed: [127.0.0.1] => {"changed": true, "cmd": "gsutil mb gs://e2e-gcp-bucket/", "delta": "0:00:03.920948", "end": "2020-05-25 21:45:17.036435", "rc": 0, "start": "2020-05-25 21:45:13.115487", "stderr": "Creating gs://e2e-gcp-bucket/...", "stderr_lines": ["Creating gs://e2e-gcp-bucket/..."], "stdout": "", "stdout_lines": []}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:68
2020-05-25T21:45:17.153609 (delta: 4.426309) elapsed: 46.582735 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:79
2020-05-25T21:45:17.289379 (delta: 0.135681) elapsed: 46.718505 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:95
2020-05-25T21:45:17.418733 (delta: 0.129269) elapsed: 46.847859 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:105
2020-05-25T21:45:17.555232 (delta: 0.136414) elapsed: 46.984358 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Patching restic demonset for privileged access] **************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:118
2020-05-25T21:45:17.687600 (delta: 0.132272) elapsed: 47.116726 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the number of compute nodes] *************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:123
2020-05-25T21:45:17.841466 (delta: 0.15377) elapsed: 47.270592 ********
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Checking the status of restic demonset pods] *****************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:127
2020-05-25T21:45:17.976206 (delta: 0.134639) elapsed: 47.405332 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:140
2020-05-25T21:45:18.084996 (delta: 0.108692) elapsed: 47.514122 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:150
2020-05-25T21:45:18.167742 (delta: 0.082696) elapsed: 47.596868 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:165
2020-05-25T21:45:18.304043 (delta: 0.136224) elapsed: 47.733169 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:174
2020-05-25T21:45:18.451784 (delta: 0.14762) elapsed: 47.88091 *********
changed: [127.0.0.1] => {"changed": true, "cmd": "velero install --provider gcp --plugins velero/velero-plugin-for-gcp:v1.0.0 --bucket e2e-gcp-bucket --use-volume-snapshots=false --secret-file ./key.json", "delta": "0:00:02.806455", "end": "2020-05-25 21:45:21.685303", "rc": 0, "start": "2020-05-25 21:45:18.878848", "stderr": "", "stderr_lines": [], "stdout": "CustomResourceDefinition/backups.velero.io: attempting to create resource\nCustomResourceDefinition/backups.velero.io: already exists, proceeding\nCustomResourceDefinition/backups.velero.io: created\nCustomResourceDefinition/backupstoragelocations.velero.io: attempting to create resource\nCustomResourceDefinition/backupstoragelocations.velero.io: already exists, proceeding\nCustomResourceDefinition/backupstoragelocations.velero.io: created\nCustomResourceDefinition/deletebackuprequests.velero.io: attempting to create resource\nCustomResourceDefinition/deletebackuprequests.velero.io: already exists, proceeding\nCustomResourceDefinition/deletebackuprequests.velero.io: created\nCustomResourceDefinition/downloadrequests.velero.io: attempting to create resource\nCustomResourceDefinition/downloadrequests.velero.io: already exists, proceeding\nCustomResourceDefinition/downloadrequests.velero.io: created\nCustomResourceDefinition/podvolumebackups.velero.io: attempting to create resource\nCustomResourceDefinition/podvolumebackups.velero.io: already exists, proceeding\nCustomResourceDefinition/podvolumebackups.velero.io: created\nCustomResourceDefinition/podvolumerestores.velero.io: attempting to create resource\nCustomResourceDefinition/podvolumerestores.velero.io: already exists, proceeding\nCustomResourceDefinition/podvolumerestores.velero.io: created\nCustomResourceDefinition/resticrepositories.velero.io: attempting to create resource\nCustomResourceDefinition/resticrepositories.velero.io: already exists, proceeding\nCustomResourceDefinition/resticrepositories.velero.io: created\nCustomResourceDefinition/restores.velero.io: attempting to create resource\nCustomResourceDefinition/restores.velero.io: already exists, proceeding\nCustomResourceDefinition/restores.velero.io: created\nCustomResourceDefinition/schedules.velero.io: attempting to create resource\nCustomResourceDefinition/schedules.velero.io: already exists, proceeding\nCustomResourceDefinition/schedules.velero.io: created\nCustomResourceDefinition/serverstatusrequests.velero.io: attempting to create resource\nCustomResourceDefinition/serverstatusrequests.velero.io: already exists, proceeding\nCustomResourceDefinition/serverstatusrequests.velero.io: created\nCustomResourceDefinition/volumesnapshotlocations.velero.io: attempting to create resource\nCustomResourceDefinition/volumesnapshotlocations.velero.io: already exists, proceeding\nCustomResourceDefinition/volumesnapshotlocations.velero.io: created\nWaiting for resources to be ready in cluster...\nNamespace/velero: attempting to create resource\nNamespace/velero: created\nClusterRoleBinding/velero: attempting to create resource\nClusterRoleBinding/velero: already exists, proceeding\nClusterRoleBinding/velero: created\nServiceAccount/velero: attempting to create resource\nServiceAccount/velero: created\nSecret/cloud-credentials: attempting to create resource\nSecret/cloud-credentials: created\nBackupStorageLocation/default: attempting to create resource\nBackupStorageLocation/default: created\nDeployment/velero: attempting to create resource\nDeployment/velero: created\nVelero is installed! ? Use 'kubectl logs deployment/velero -n velero' to view the status.", "stdout_lines": ["CustomResourceDefinition/backups.velero.io: attempting to create resource", "CustomResourceDefinition/backups.velero.io: already exists, proceeding", "CustomResourceDefinition/backups.velero.io: created", "CustomResourceDefinition/backupstoragelocations.velero.io: attempting to create resource", "CustomResourceDefinition/backupstoragelocations.velero.io: already exists, proceeding", "CustomResourceDefinition/backupstoragelocations.velero.io: created", "CustomResourceDefinition/deletebackuprequests.velero.io: attempting to create resource", "CustomResourceDefinition/deletebackuprequests.velero.io: already exists, proceeding", "CustomResourceDefinition/deletebackuprequests.velero.io: created", "CustomResourceDefinition/downloadrequests.velero.io: attempting to create resource", "CustomResourceDefinition/downloadrequests.velero.io: already exists, proceeding", "CustomResourceDefinition/downloadrequests.velero.io: created", "CustomResourceDefinition/podvolumebackups.velero.io: attempting to create resource", "CustomResourceDefinition/podvolumebackups.velero.io: already exists, proceeding", "CustomResourceDefinition/podvolumebackups.velero.io: created", "CustomResourceDefinition/podvolumerestores.velero.io: attempting to create resource", "CustomResourceDefinition/podvolumerestores.velero.io: already exists, proceeding", "CustomResourceDefinition/podvolumerestores.velero.io: created", "CustomResourceDefinition/resticrepositories.velero.io: attempting to create resource", "CustomResourceDefinition/resticrepositories.velero.io: already exists, proceeding", "CustomResourceDefinition/resticrepositories.velero.io: created", "CustomResourceDefinition/restores.velero.io: attempting to create resource", "CustomResourceDefinition/restores.velero.io: already exists, proceeding", "CustomResourceDefinition/restores.velero.io: created", "CustomResourceDefinition/schedules.velero.io: attempting to create resource", "CustomResourceDefinition/schedules.velero.io: already exists, proceeding", "CustomResourceDefinition/schedules.velero.io: created", "CustomResourceDefinition/serverstatusrequests.velero.io: attempting to create resource", "CustomResourceDefinition/serverstatusrequests.velero.io: already exists, proceeding", "CustomResourceDefinition/serverstatusrequests.velero.io: created", "CustomResourceDefinition/volumesnapshotlocations.velero.io: attempting to create resource", "CustomResourceDefinition/volumesnapshotlocations.velero.io: already exists, proceeding", "CustomResourceDefinition/volumesnapshotlocations.velero.io: created", "Waiting for resources to be ready in cluster...", "Namespace/velero: attempting to create resource", "Namespace/velero: created", "ClusterRoleBinding/velero: attempting to create resource", "ClusterRoleBinding/velero: already exists, proceeding", "ClusterRoleBinding/velero: created", "ServiceAccount/velero: attempting to create resource", "ServiceAccount/velero: created", "Secret/cloud-credentials: attempting to create resource", "Secret/cloud-credentials: created", "BackupStorageLocation/default: attempting to create resource", "BackupStorageLocation/default: created", "Deployment/velero: attempting to create resource", "Deployment/velero: created", "Velero is installed! ? Use 'kubectl logs deployment/velero -n velero' to view the status."]}
TASK [Checking for velero server status] ***************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:188
2020-05-25T21:45:21.808819 (delta: 3.356948) elapsed: 51.237945 *******
FAILED - RETRYING: Checking for velero server status (20 retries left).
FAILED - RETRYING: Checking for velero server status (19 retries left).
FAILED - RETRYING: Checking for velero server status (18 retries left).
FAILED - RETRYING: Checking for velero server status (17 retries left).
FAILED - RETRYING: Checking for velero server status (16 retries left).
FAILED - RETRYING: Checking for velero server status (15 retries left).
FAILED - RETRYING: Checking for velero server status (14 retries left).
changed: [127.0.0.1] => {"attempts": 8, "changed": true, "cmd": "kubectl get pod -n velero -l component=velero -ojsonpath='{.items[0].status.phase}'", "delta": "0:00:01.028490", "end": "2020-05-25 21:46:07.439051", "rc": 0, "start": "2020-05-25 21:46:06.410561", "stderr": "", "stderr_lines": [], "stdout": "Running", "stdout_lines": ["Running"]}
TASK [Creating volume snapshot location] ***************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:197
2020-05-25T21:46:07.537791 (delta: 45.72887) elapsed: 96.966917 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating volume snapshot location] ***************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:203
2020-05-25T21:46:07.686239 (delta: 0.148358) elapsed: 97.115365 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl apply -f gcp-volume-snapshot-location.yml", "delta": "0:00:01.619587", "end": "2020-05-25 21:46:09.595096", "failed_when_result": false, "rc": 0, "start": "2020-05-25 21:46:07.975509", "stderr": "", "stderr_lines": [], "stdout": "volumesnapshotlocation.velero.io/gcp-bucket created", "stdout_lines": ["volumesnapshotlocation.velero.io/gcp-bucket created"]}
TASK [Creating volume snapshot location] ***************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:209
2020-05-25T21:46:09.659595 (delta: 1.973258) elapsed: 99.088721 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing development image of OpenEBS velero-plugin] *******************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:215
2020-05-25T21:46:09.727446 (delta: 0.067794) elapsed: 99.156572 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "velero plugin add openebs/velero-plugin:1.10.0", "delta": "0:00:00.889305", "end": "2020-05-25 21:46:10.875882", "rc": 0, "start": "2020-05-25 21:46:09.986577", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}
TASK [Waiting for new velero server pod to come up] ****************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:219
2020-05-25T21:46:10.932972 (delta: 1.205471) elapsed: 100.362098 ******
FAILED - RETRYING: Waiting for new velero server pod to come up (20 retries left).
FAILED - RETRYING: Waiting for new velero server pod to come up (19 retries left).
FAILED - RETRYING: Waiting for new velero server pod to come up (18 retries left).
FAILED - RETRYING: Waiting for new velero server pod to come up (17 retries left).
changed: [127.0.0.1] => {"attempts": 5, "changed": true, "cmd": "kubectl get pod -n velero --no-headers -l deploy=velero | wc -l", "delta": "0:00:01.014332", "end": "2020-05-25 21:46:37.807537", "rc": 0, "start": "2020-05-25 21:46:36.793205", "stderr": "", "stderr_lines": [], "stdout": "1", "stdout_lines": ["1"]}
TASK [Checking for velero server status] ***************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:228
2020-05-25T21:46:37.946057 (delta: 27.013033) elapsed: 127.375183 *****
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get pod -n velero -l component=velero -ojsonpath='{.items[0].status.phase}'", "delta": "0:00:01.083580", "end": "2020-05-25 21:46:39.367773", "rc": 0, "start": "2020-05-25 21:46:38.284193", "stderr": "", "stderr_lines": [], "stdout": "Running", "stdout_lines": ["Running"]}
TASK [Checking for velero server container status] *****************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:235
2020-05-25T21:46:39.438218 (delta: 1.492034) elapsed: 128.867344 ******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get pod -n velero -l component=velero -ojsonpath='{.items[0].status.containerStatuses[0].ready}'", "delta": "0:00:00.960023", "end": "2020-05-25 21:46:40.579566", "rc": 0, "start": "2020-05-25 21:46:39.619543", "stderr": "", "stderr_lines": [], "stdout": "true", "stdout_lines": ["true"]}
TASK [Get Application pod details] *********************************************
task path: /experiments/functional/backup_and_restore/test.yml:24
2020-05-25T21:46:40.678740 (delta: 1.240468) elapsed: 130.107866 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get pods -n app-percona-ns -l name=percona --no-headers -o custom-columns=:metadata.name", "delta": "0:00:01.003993", "end": "2020-05-25 21:46:41.916172", "rc": 0, "start": "2020-05-25 21:46:40.912179", "stderr": "", "stderr_lines": [], "stdout": "percona-755f6678bf-rw7vk", "stdout_lines": ["percona-755f6678bf-rw7vk"]}
TASK [Create new database in the mysql] ****************************************
task path: /experiments/functional/backup_and_restore/test.yml:28
2020-05-25T21:46:42.026108 (delta: 1.347294) elapsed: 131.455234 ******
included: /utils/scm/applications/mysql/mysql_data_persistence.yml for 127.0.0.1
TASK [Create some test data in the mysql database] *****************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:4
2020-05-25T21:46:42.386926 (delta: 0.360651) elapsed: 131.816052 ******
changed: [127.0.0.1] => (item=mysql -uroot -pk8sDem0 -e 'create database backup;') => {"changed": true, "cmd": "kubectl exec percona-755f6678bf-rw7vk -n app-percona-ns -- mysql -uroot -pk8sDem0 -e 'create database backup;'", "delta": "0:00:01.439458", "end": "2020-05-25 21:46:44.209680", "failed_when_result": false, "item": "mysql -uroot -pk8sDem0 -e 'create database backup;'", "rc": 0, "start": "2020-05-25 21:46:42.770222", "stderr": "mysql: [Warning] Using a password on the command line interface can be insecure.", "stderr_lines": ["mysql: [Warning] Using a password on the command line interface can be insecure."], "stdout": "", "stdout_lines": []}
changed: [127.0.0.1] => (item=mysql -uroot -pk8sDem0 -e 'create table ttbl (Data VARCHAR(20));' backup) => {"changed": true, "cmd": "kubectl exec percona-755f6678bf-rw7vk -n app-percona-ns -- mysql -uroot -pk8sDem0 -e 'create table ttbl (Data VARCHAR(20));' backup", "delta": "0:00:01.474776", "end": "2020-05-25 21:46:46.011706", "failed_when_result": false, "item": "mysql -uroot -pk8sDem0 -e 'create table ttbl (Data VARCHAR(20));' backup", "rc": 0, "start": "2020-05-25 21:46:44.536930", "stderr": "mysql: [Warning] Using a password on the command line interface can be insecure.", "stderr_lines": ["mysql: [Warning] Using a password on the command line interface can be insecure."], "stdout": "", "stdout_lines": []}
changed: [127.0.0.1] => (item=mysql -uroot -pk8sDem0 -e 'insert into ttbl (Data) VALUES ("tdata");' backup) => {"changed": true, "cmd": "kubectl exec percona-755f6678bf-rw7vk -n app-percona-ns -- mysql -uroot -pk8sDem0 -e 'insert into ttbl (Data) VALUES (\"tdata\");' backup", "delta": "0:00:01.751073", "end": "2020-05-25 21:46:48.122216", "failed_when_result": false, "item": "mysql -uroot -pk8sDem0 -e 'insert into ttbl (Data) VALUES (\"tdata\");' backup", "rc": 0, "start": "2020-05-25 21:46:46.371143", "stderr": "mysql: [Warning] Using a password on the command line interface can be insecure.", "stderr_lines": ["mysql: [Warning] Using a password on the command line interface can be insecure."], "stdout": "", "stdout_lines": []}
TASK [Kill the application pod] ************************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:21
2020-05-25T21:46:48.242804 (delta: 5.855788) elapsed: 137.67193 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify if the application pod is deleted] ********************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:27
2020-05-25T21:46:48.363371 (delta: 0.120475) elapsed: 137.792497 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Obtain the newly created pod name for application] ***********************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:37
2020-05-25T21:46:48.477713 (delta: 0.114248) elapsed: 137.906839 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Checking application pod is in running state] ****************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:44
2020-05-25T21:46:48.596848 (delta: 0.119037) elapsed: 138.025974 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Get the container status of application.] ********************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:51
2020-05-25T21:46:48.720775 (delta: 0.123828) elapsed: 138.149901 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Check if db is ready for connections] ************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:61
2020-05-25T21:46:48.836516 (delta: 0.115645) elapsed: 138.265642 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Checking for the Corrupted tables] ***************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:68
2020-05-25T21:46:48.954326 (delta: 0.117721) elapsed: 138.383452 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify mysql data persistence] *******************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:77
2020-05-25T21:46:49.053353 (delta: 0.098938) elapsed: 138.482479 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Delete/drop MySQL database] **********************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:90
2020-05-25T21:46:49.115186 (delta: 0.061777) elapsed: 138.544312 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify successful db delete] *********************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:98
2020-05-25T21:46:49.188783 (delta: 0.073545) elapsed: 138.617909 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the application mount point] *************************************
task path: /experiments/functional/backup_and_restore/test.yml:40
2020-05-25T21:46:49.252908 (delta: 0.064066) elapsed: 138.682034 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Writing data on application mount point] *********************************
task path: /experiments/functional/backup_and_restore/test.yml:45
2020-05-25T21:46:49.366665 (delta: 0.113699) elapsed: 138.795791 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the application deployment name] *********************************
task path: /experiments/functional/backup_and_restore/test.yml:52
2020-05-25T21:46:49.494534 (delta: 0.127796) elapsed: 138.92366 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Fetching the volume name of application] *********************************
task path: /experiments/functional/backup_and_restore/test.yml:56
2020-05-25T21:46:49.645247 (delta: 0.150619) elapsed: 139.074373 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Annotating the application pod] ******************************************
task path: /experiments/functional/backup_and_restore/test.yml:61
2020-05-25T21:46:49.782454 (delta: 0.137111) elapsed: 139.21158 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup for the provided application] ****************************
task path: /experiments/functional/backup_and_restore/test.yml:66
2020-05-25T21:46:49.917982 (delta: 0.135434) elapsed: 139.347108 ******
included: /experiments/functional/backup_and_restore/backup-restore.yml for 127.0.0.1
TASK [Creating Backup using minio bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:6
2020-05-25T21:46:50.274795 (delta: 0.356722) elapsed: 139.703921 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup using GCP bucket] ****************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:13
2020-05-25T21:46:50.392114 (delta: 0.117274) elapsed: 139.82124 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "velero backup create percona-backup --include-namespaces=app-percona-ns --snapshot-volumes --volume-snapshot-locations=gcp-bucket", "delta": "0:00:01.083779", "end": "2020-05-25 21:46:51.775325", "rc": 0, "start": "2020-05-25 21:46:50.691546", "stderr": "", "stderr_lines": [], "stdout": "Backup request \"percona-backup\" submitted successfully.\nRun `velero backup describe percona-backup` or `velero backup logs percona-backup` for more details.", "stdout_lines": ["Backup request \"percona-backup\" submitted successfully.", "Run `velero backup describe percona-backup` or `velero backup logs percona-backup` for more details."]}
TASK [Creating Backup using GCP bucket] ****************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:20
2020-05-25T21:46:51.904518 (delta: 1.512276) elapsed: 141.333644 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:31
2020-05-25T21:46:52.055861 (delta: 0.151247) elapsed: 141.484987 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:41
2020-05-25T21:46:52.201448 (delta: 0.145499) elapsed: 141.630574 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using minio bucket] ************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:54
2020-05-25T21:46:52.368009 (delta: 0.166467) elapsed: 141.797135 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using GCP bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:58
2020-05-25T21:46:52.486106 (delta: 0.118003) elapsed: 141.915232 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the state of Backup] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:66
2020-05-25T21:46:52.603887 (delta: 0.117651) elapsed: 142.033013 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:79
2020-05-25T21:46:52.726864 (delta: 0.122887) elapsed: 142.15599 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:85
2020-05-25T21:46:52.855947 (delta: 0.128995) elapsed: 142.285073 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:95
2020-05-25T21:46:52.936762 (delta: 0.080724) elapsed: 142.365888 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:105
2020-05-25T21:46:52.994281 (delta: 0.057467) elapsed: 142.423407 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:110
2020-05-25T21:46:53.051426 (delta: 0.057091) elapsed: 142.480552 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting restore name] ****************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:122
2020-05-25T21:46:53.125698 (delta: 0.074219) elapsed: 142.554824 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Checking the restore status] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:126
2020-05-25T21:46:53.211899 (delta: 0.086146) elapsed: 142.641025 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [include_tasks] ***********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:75
2020-05-25T21:46:53.320661 (delta: 0.108703) elapsed: 142.749787 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the backup name] *************************************************
task path: /experiments/functional/backup_and_restore/test.yml:79
2020-05-25T21:46:53.453296 (delta: 0.13255) elapsed: 142.882422 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the state of backup] *********************************************
task path: /experiments/functional/backup_and_restore/test.yml:83
2020-05-25T21:46:53.586746 (delta: 0.133361) elapsed: 143.015872 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [include_tasks] ***********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:90
2020-05-25T21:46:53.721920 (delta: 0.135086) elapsed: 143.151046 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Waiting for completed backup] ********************************************
task path: /experiments/functional/backup_and_restore/test.yml:95
2020-05-25T21:46:53.860580 (delta: 0.138569) elapsed: 143.289706 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the base backup name] ********************************************
task path: /experiments/functional/backup_and_restore/test.yml:102
2020-05-25T21:46:53.993672 (delta: 0.133003) elapsed: 143.422798 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting state of backup] *************************************************
task path: /experiments/functional/backup_and_restore/test.yml:108
2020-05-25T21:46:54.126847 (delta: 0.133087) elapsed: 143.555973 ******
included: /experiments/functional/backup_and_restore/backup-restore.yml for 127.0.0.1
TASK [Creating Backup using minio bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:6
2020-05-25T21:46:54.444521 (delta: 0.317586) elapsed: 143.873647 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup using GCP bucket] ****************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:13
2020-05-25T21:46:54.518211 (delta: 0.073638) elapsed: 143.947337 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup using GCP bucket] ****************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:20
2020-05-25T21:46:54.620086 (delta: 0.101816) elapsed: 144.049212 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:31
2020-05-25T21:46:54.725016 (delta: 0.104808) elapsed: 144.154142 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:41
2020-05-25T21:46:54.829686 (delta: 0.104593) elapsed: 144.258812 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using minio bucket] ************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:54
2020-05-25T21:46:54.944067 (delta: 0.114292) elapsed: 144.373193 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using GCP bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:58
2020-05-25T21:46:55.063840 (delta: 0.119652) elapsed: 144.492966 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the state of Backup] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:66
2020-05-25T21:46:55.182962 (delta: 0.119029) elapsed: 144.612088 ******
FAILED - RETRYING: Getting the state of Backup (100 retries left).
FAILED - RETRYING: Getting the state of Backup (99 retries left).
FAILED - RETRYING: Getting the state of Backup (98 retries left).
FAILED - RETRYING: Getting the state of Backup (97 retries left).
FAILED - RETRYING: Getting the state of Backup (96 retries left).
FAILED - RETRYING: Getting the state of Backup (95 retries left).
FAILED - RETRYING: Getting the state of Backup (94 retries left).
FAILED - RETRYING: Getting the state of Backup (93 retries left).
FAILED - RETRYING: Getting the state of Backup (92 retries left).
changed: [127.0.0.1] => {"attempts": 10, "changed": true, "cmd": "kubectl get backup percona-backup -n velero -o jsonpath='{.status.phase}'", "delta": "0:00:01.107340", "end": "2020-05-25 21:47:54.657748", "rc": 0, "start": "2020-05-25 21:47:53.550408", "stderr": "", "stderr_lines": [], "stdout": "Completed", "stdout_lines": ["Completed"]}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:79
2020-05-25T21:47:54.788390 (delta: 59.605334) elapsed: 204.217516 *****
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:85
2020-05-25T21:47:54.899491 (delta: 0.111001) elapsed: 204.328617 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:95
2020-05-25T21:47:54.959528 (delta: 0.059982) elapsed: 204.388654 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:105
2020-05-25T21:47:55.045239 (delta: 0.085659) elapsed: 204.474365 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:110
2020-05-25T21:47:55.134873 (delta: 0.089559) elapsed: 204.563999 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting restore name] ****************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:122
2020-05-25T21:47:55.253806 (delta: 0.118857) elapsed: 204.682932 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Checking the restore status] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:126
2020-05-25T21:47:55.378736 (delta: 0.124836) elapsed: 204.807862 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verifying for local snapshot deletion] ***********************************
task path: /experiments/functional/backup_and_restore/test.yml:115
2020-05-25T21:47:55.502334 (delta: 0.123497) elapsed: 204.93146 *******
included: /experiments/functional/backup_and_restore/additional_verification.yml for 127.0.0.1
TASK [Getting the volume name] *************************************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:4
2020-05-25T21:47:55.773005 (delta: 0.270581) elapsed: 205.202131 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get pvc percona-mysql-claim -n app-percona-ns -o jsonpath='{.spec.volumeName}'", "delta": "0:00:01.124790", "end": "2020-05-25 21:47:57.165199", "rc": 0, "start": "2020-05-25 21:47:56.040409", "stderr": "", "stderr_lines": [], "stdout": "pvc-a83b23f3-9ecf-11ea-be3f-005056987ecb", "stdout_lines": ["pvc-a83b23f3-9ecf-11ea-be3f-005056987ecb"]}
TASK [Getting the cstor backup name] *******************************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:8
2020-05-25T21:47:57.287782 (delta: 1.514726) elapsed: 206.716908 ******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get cstorbackup -n app-percona-ns | grep percona-backup | awk '{print $1}'", "delta": "0:00:01.243949", "end": "2020-05-25 21:47:58.917272", "rc": 0, "start": "2020-05-25 21:47:57.673323", "stderr": "No resources found.", "stderr_lines": ["No resources found."], "stdout": "", "stdout_lines": []}
TASK [Getting the cstor pool UID] **********************************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:15
2020-05-25T21:47:59.035111 (delta: 1.747239) elapsed: 208.464237 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get cstorbackup -n app-percona-ns -o jsonpath='{.metadata.labels.cstorpool\\.openebs\\.io/uid}'", "delta": "0:00:01.038088", "end": "2020-05-25 21:48:00.478089", "rc": 0, "start": "2020-05-25 21:47:59.440001", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}
TASK [Getting the CVR name from the corresponding pool uid] ********************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:19
2020-05-25T21:48:00.579873 (delta: 1.544641) elapsed: 210.008999 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get cvr -n openebs -l cstorpool.openebs.io/uid=,cstorvolume.openebs.io/name=pvc-a83b23f3-9ecf-11ea-be3f-005056987ecb -o jsonpath='{.items[*].metadata.name}'", "delta": "0:00:01.213954", "end": "2020-05-25 21:48:02.212533", "rc": 0, "start": "2020-05-25 21:48:00.998579", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}
TASK [Getting the cstor pool label from cvr] ***********************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:23
2020-05-25T21:48:02.272924 (delta: 1.692958) elapsed: 211.70205 *******
TASK [Getting the pool pod name corresponding to the label] ********************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:29
2020-05-25T21:48:02.332916 (delta: 0.059936) elapsed: 211.762042 ******
TASK [Deleting the cstor pool pod] *********************************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:37
2020-05-25T21:48:02.405166 (delta: 0.072194) elapsed: 211.834292 ******
TASK [Verifying whether local snapshot is deleted] *****************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:48
2020-05-25T21:48:02.514337 (delta: 0.109078) elapsed: 211.943463 ******
TASK [Getting maya-api-server pod name] ****************************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:61
2020-05-25T21:48:02.637982 (delta: 0.123555) elapsed: 212.067108 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Deleting the maya-api-server pod] ****************************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:65
2020-05-25T21:48:02.756207 (delta: 0.118129) elapsed: 212.185333 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Deleting Application] ****************************************************
task path: /experiments/functional/backup_and_restore/test.yml:124
2020-05-25T21:48:02.877072 (delta: 0.120768) elapsed: 212.306198 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl delete pvc,deploy,svc --all -n app-percona-ns\n kubectl delete ns app-percona-ns", "delta": "0:00:22.798503", "end": "2020-05-25 21:48:25.954961", "rc": 0, "start": "2020-05-25 21:48:03.156458", "stderr": "", "stderr_lines": [], "stdout": "persistentvolumeclaim \"percona-mysql-claim\" deleted\ndeployment.extensions \"percona\" deleted\nservice \"percona-mysql\" deleted\nnamespace \"app-percona-ns\" deleted", "stdout_lines": ["persistentvolumeclaim \"percona-mysql-claim\" deleted", "deployment.extensions \"percona\" deleted", "service \"percona-mysql\" deleted", "namespace \"app-percona-ns\" deleted"]}
TASK [Checking whether namespace is deleted] ***********************************
task path: /experiments/functional/backup_and_restore/test.yml:129
2020-05-25T21:48:26.073581 (delta: 23.196329) elapsed: 235.502707 *****
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get ns --no-headers -o custom-columns=:metadata.name", "delta": "0:00:01.109597", "end": "2020-05-25 21:48:27.427191", "rc": 0, "start": "2020-05-25 21:48:26.317594", "stderr": "", "stderr_lines": [], "stdout": "default\nkube-node-lease\nkube-public\nkube-system\nlitmus\nopenebs\nopenshift\nopenshift-apiserver\nopenshift-apiserver-operator\nopenshift-authentication\nopenshift-authentication-operator\nopenshift-cloud-credential-operator\nopenshift-cluster-machine-approver\nopenshift-cluster-node-tuning-operator\nopenshift-cluster-samples-operator\nopenshift-cluster-storage-operator\nopenshift-cluster-version\nopenshift-config\nopenshift-config-managed\nopenshift-console\nopenshift-console-operator\nopenshift-controller-manager\nopenshift-controller-manager-operator\nopenshift-dns\nopenshift-dns-operator\nopenshift-etcd\nopenshift-image-registry\nopenshift-infra\nopenshift-ingress\nopenshift-ingress-operator\nopenshift-insights\nopenshift-kni-infra\nopenshift-kube-apiserver\nopenshift-kube-apiserver-operator\nopenshift-kube-controller-manager\nopenshift-kube-controller-manager-operator\nopenshift-kube-scheduler\nopenshift-kube-scheduler-operator\nopenshift-machine-api\nopenshift-machine-config-operator\nopenshift-marketplace\nopenshift-monitoring\nopenshift-multus\nopenshift-network-operator\nopenshift-node\nopenshift-openstack-infra\nopenshift-operator-lifecycle-manager\nopenshift-operators\nopenshift-sdn\nopenshift-service-ca\nopenshift-service-ca-operator\nopenshift-service-catalog-apiserver-operator\nopenshift-service-catalog-controller-manager-operator\nvelero", "stdout_lines": ["default", "kube-node-lease", "kube-public", "kube-system", "litmus", "openebs", "openshift", "openshift-apiserver", "openshift-apiserver-operator", "openshift-authentication", "openshift-authentication-operator", "openshift-cloud-credential-operator", "openshift-cluster-machine-approver", "openshift-cluster-node-tuning-operator", "openshift-cluster-samples-operator", "openshift-cluster-storage-operator", "openshift-cluster-version", "openshift-config", "openshift-config-managed", "openshift-console", "openshift-console-operator", "openshift-controller-manager", "openshift-controller-manager-operator", "openshift-dns", "openshift-dns-operator", "openshift-etcd", "openshift-image-registry", "openshift-infra", "openshift-ingress", "openshift-ingress-operator", "openshift-insights", "openshift-kni-infra", "openshift-kube-apiserver", "openshift-kube-apiserver-operator", "openshift-kube-controller-manager", "openshift-kube-controller-manager-operator", "openshift-kube-scheduler", "openshift-kube-scheduler-operator", "openshift-machine-api", "openshift-machine-config-operator", "openshift-marketplace", "openshift-monitoring", "openshift-multus", "openshift-network-operator", "openshift-node", "openshift-openstack-infra", "openshift-operator-lifecycle-manager", "openshift-operators", "openshift-sdn", "openshift-service-ca", "openshift-service-ca-operator", "openshift-service-catalog-apiserver-operator", "openshift-service-catalog-controller-manager-operator", "velero"]}
TASK [Restoring Application] ***************************************************
task path: /experiments/functional/backup_and_restore/test.yml:146
2020-05-25T21:48:27.556990 (delta: 1.483308) elapsed: 236.986116 ******
included: /experiments/functional/backup_and_restore/backup-restore.yml for 127.0.0.1
TASK [Creating Backup using minio bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:6
2020-05-25T21:48:27.801325 (delta: 0.244244) elapsed: 237.230451 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup using GCP bucket] ****************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:13
2020-05-25T21:48:27.883927 (delta: 0.082551) elapsed: 237.313053 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup using GCP bucket] ****************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:20
2020-05-25T21:48:27.958320 (delta: 0.074344) elapsed: 237.387446 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:31
2020-05-25T21:48:28.205868 (delta: 0.247458) elapsed: 237.634994 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:41
2020-05-25T21:48:28.348033 (delta: 0.142089) elapsed: 237.777159 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using minio bucket] ************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:54
2020-05-25T21:48:28.486082 (delta: 0.13792) elapsed: 237.915208 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using GCP bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:58
2020-05-25T21:48:28.591938 (delta: 0.105725) elapsed: 238.021064 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the state of Backup] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:66
2020-05-25T21:48:28.677239 (delta: 0.085247) elapsed: 238.106365 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:79
2020-05-25T21:48:28.796865 (delta: 0.119549) elapsed: 238.225991 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:85
2020-05-25T21:48:28.946373 (delta: 0.149419) elapsed: 238.375499 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:95
2020-05-25T21:48:29.091559 (delta: 0.145097) elapsed: 238.520685 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:105
2020-05-25T21:48:29.233647 (delta: 0.141997) elapsed: 238.662773 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl create ns app-percona-ns-local", "delta": "0:00:01.023922", "end": "2020-05-25 21:48:30.685067", "failed_when_result": false, "rc": 0, "start": "2020-05-25 21:48:29.661145", "stderr": "", "stderr_lines": [], "stdout": "namespace/app-percona-ns-local created", "stdout_lines": ["namespace/app-percona-ns-local created"]}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:110
2020-05-25T21:48:30.805446 (delta: 1.571707) elapsed: 240.234572 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "velero restore create --from-backup percona-backup --restore-volumes=true --namespace-mappings app-percona-ns:app-percona-ns-local", "delta": "0:00:01.068272", "end": "2020-05-25 21:48:32.266848", "rc": 0, "start": "2020-05-25 21:48:31.198576", "stderr": "", "stderr_lines": [], "stdout": "Restore request \"percona-backup-20200525214832\" submitted successfully.\nRun `velero restore describe percona-backup-20200525214832` or `velero restore logs percona-backup-20200525214832` for more details.", "stdout_lines": ["Restore request \"percona-backup-20200525214832\" submitted successfully.", "Run `velero restore describe percona-backup-20200525214832` or `velero restore logs percona-backup-20200525214832` for more details."]}
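# [editor's note] The restore step above can be sketched as a one-line velero
# invocation; build_restore_cmd is a hypothetical helper that only assembles
# the command string (running it needs the velero CLI and a completed backup
# named "percona-backup").
build_restore_cmd() {
  # $1 = backup name, $2 = source namespace, $3 = target namespace
  echo "velero restore create --from-backup $1 --restore-volumes=true --namespace-mappings $2:$3"
}
build_restore_cmd percona-backup app-percona-ns app-percona-ns-local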
TASK [Getting restore name] ****************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:122
2020-05-25T21:48:32.348810 (delta: 1.543276) elapsed: 241.777936 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Checking the restore status] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:126
2020-05-25T21:48:32.468780 (delta: 0.119899) elapsed: 241.897906 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting state of restore] ************************************************
task path: /experiments/functional/backup_and_restore/test.yml:152
2020-05-25T21:48:32.605725 (delta: 0.136866) elapsed: 242.034851 ******
included: /experiments/functional/backup_and_restore/backup-restore.yml for 127.0.0.1
TASK [Creating Backup using minio bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:6
2020-05-25T21:48:32.908070 (delta: 0.302252) elapsed: 242.337196 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup using GCP bucket] ****************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:13
2020-05-25T21:48:33.006554 (delta: 0.098416) elapsed: 242.43568 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup using GCP bucket] ****************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:20
2020-05-25T21:48:33.135437 (delta: 0.128829) elapsed: 242.564563 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:31
2020-05-25T21:48:33.279184 (delta: 0.143653) elapsed: 242.70831 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:41
2020-05-25T21:48:33.417177 (delta: 0.137888) elapsed: 242.846303 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using minio bucket] ************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:54
2020-05-25T21:48:33.552398 (delta: 0.135113) elapsed: 242.981524 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using GCP bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:58
2020-05-25T21:48:33.694631 (delta: 0.142143) elapsed: 243.123757 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the state of Backup] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:66
2020-05-25T21:48:33.831401 (delta: 0.136675) elapsed: 243.260527 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:79
2020-05-25T21:48:33.982487 (delta: 0.150996) elapsed: 243.411613 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:85
2020-05-25T21:48:34.120501 (delta: 0.137925) elapsed: 243.549627 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:95
2020-05-25T21:48:34.259501 (delta: 0.138911) elapsed: 243.688627 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:105
2020-05-25T21:48:34.403311 (delta: 0.143714) elapsed: 243.832437 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:110
2020-05-25T21:48:34.546786 (delta: 0.143375) elapsed: 243.975912 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting restore name] ****************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:122
2020-05-25T21:48:34.687018 (delta: 0.140142) elapsed: 244.116144 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "velero get restore | grep percona-backup | awk '{print $1}'", "delta": "0:00:01.189265", "end": "2020-05-25 21:48:36.169701", "rc": 0, "start": "2020-05-25 21:48:34.980436", "stderr": "", "stderr_lines": [], "stdout": "percona-backup-20200525214832", "stdout_lines": ["percona-backup-20200525214832"]}
TASK [Checking the restore status] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:126
2020-05-25T21:48:36.261600 (delta: 1.574491) elapsed: 245.690726 ******
FAILED - RETRYING: Checking the restore status (60 retries left).
FAILED - RETRYING: Checking the restore status (59 retries left).
FAILED - RETRYING: Checking the restore status (58 retries left).
FAILED - RETRYING: Checking the restore status (57 retries left).
changed: [127.0.0.1] => {"attempts": 5, "changed": true, "cmd": "kubectl get restore percona-backup-20200525214832 -n velero -o jsonpath='{.status.phase}'", "delta": "0:00:01.136297", "end": "2020-05-25 21:49:03.554296", "rc": 0, "start": "2020-05-25 21:49:02.417999", "stderr": "", "stderr_lines": [], "stdout": "Completed", "stdout_lines": ["Completed"]}
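# [editor's note] The status check above retries until the restore phase is
# "Completed" (up to 60 attempts in the playbook). A minimal shell equivalent;
# wait_for_phase is a hypothetical helper, and the probe command is passed in
# as a string so the loop can be exercised without a cluster.
wait_for_phase() {
  # $1 = command that prints the phase, $2 = max retries, $3 = delay seconds
  i=1
  while [ "$i" -le "$2" ]; do
    phase=$(eval "$1")
    [ "$phase" = "Completed" ] && return 0
    sleep "$3"
    i=$((i + 1))
  done
  return 1
}
# usage on a live cluster (restore name taken from this run):
# wait_for_phase "kubectl get restore percona-backup-20200525214832 -n velero -o jsonpath='{.status.phase}'" 60 5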
TASK [Restoring Application] ***************************************************
task path: /experiments/functional/backup_and_restore/test.yml:162
2020-05-25T21:49:03.683088 (delta: 27.421422) elapsed: 273.112214 *****
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting state of restore] ************************************************
task path: /experiments/functional/backup_and_restore/test.yml:168
2020-05-25T21:49:03.818281 (delta: 0.135068) elapsed: 273.247407 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the volume name] *************************************************
task path: /experiments/functional/backup_and_restore/test.yml:180
2020-05-25T21:49:03.948007 (delta: 0.129631) elapsed: 273.377133 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the volume name] *************************************************
task path: /experiments/functional/backup_and_restore/test.yml:188
2020-05-25T21:49:04.067242 (delta: 0.119138) elapsed: 273.496368 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get pvc percona-mysql-claim -n app-percona-ns-local -o jsonpath='{.spec.volumeName}'", "delta": "0:00:01.368575", "end": "2020-05-25 21:49:05.869585", "rc": 0, "start": "2020-05-25 21:49:04.501010", "stderr": "", "stderr_lines": [], "stdout": "pvc-7cd59133-9ed1-11ea-be3f-005056987ecb", "stdout_lines": ["pvc-7cd59133-9ed1-11ea-be3f-005056987ecb"]}
TASK [Getting the target IP] ***************************************************
task path: /experiments/functional/backup_and_restore/test.yml:194
2020-05-25T21:49:05.988639 (delta: 1.921343) elapsed: 275.417765 ******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get pod -n openebs -l openebs.io/persistent-volume=pvc-7cd59133-9ed1-11ea-be3f-005056987ecb -o jsonpath='{.items[0].status.podIP}'", "delta": "0:00:01.266130", "end": "2020-05-25 21:49:07.663973", "rc": 0, "start": "2020-05-25 21:49:06.397843", "stderr": "", "stderr_lines": [], "stdout": "172.20.4.18", "stdout_lines": ["172.20.4.18"]}
TASK [Getting the CVR name from the corresponding PV] **************************
task path: /experiments/functional/backup_and_restore/test.yml:201
2020-05-25T21:49:07.797508 (delta: 1.808778) elapsed: 277.226634 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get cvr -n openebs -l openebs.io/persistent-volume=pvc-7cd59133-9ed1-11ea-be3f-005056987ecb -o jsonpath='{range.items[*]}{.metadata.labels.cstorpool\\.openebs\\.io/name}{\"\\n\"}{end}'", "delta": "0:00:01.019689", "end": "2020-05-25 21:49:09.190106", "rc": 0, "start": "2020-05-25 21:49:08.170417", "stderr": "", "stderr_lines": [], "stdout": "disk-pool-4ruh\ndisk-pool-ft1p\ndisk-pool-kt2c", "stdout_lines": ["disk-pool-4ruh", "disk-pool-ft1p", "disk-pool-kt2c"]}
TASK [Getting the CSP name corresponding to CVR] *******************************

task path: /experiments/functional/backup_and_restore/test.yml:205
2020-05-25T21:49:09.249278 (delta: 1.451679) elapsed: 278.678404 ******
changed: [127.0.0.1] => (item=disk-pool-4ruh) => {"changed": true, "cmd": "kubectl get csp disk-pool-4ruh -n openebs -o jsonpath='{.metadata.name}'", "delta": "0:00:01.004763", "end": "2020-05-25 21:49:10.510313", "item": "disk-pool-4ruh", "rc": 0, "start": "2020-05-25 21:49:09.505550", "stderr": "", "stderr_lines": [], "stdout": "disk-pool-4ruh", "stdout_lines": ["disk-pool-4ruh"]}
changed: [127.0.0.1] => (item=disk-pool-ft1p) => {"changed": true, "cmd": "kubectl get csp disk-pool-ft1p -n openebs -o jsonpath='{.metadata.name}'", "delta": "0:00:00.997934", "end": "2020-05-25 21:49:11.687078", "item": "disk-pool-ft1p", "rc": 0, "start": "2020-05-25 21:49:10.689144", "stderr": "", "stderr_lines": [], "stdout": "disk-pool-ft1p", "stdout_lines": ["disk-pool-ft1p"]}
changed: [127.0.0.1] => (item=disk-pool-kt2c) => {"changed": true, "cmd": "kubectl get csp disk-pool-kt2c -n openebs -o jsonpath='{.metadata.name}'", "delta": "0:00:01.138618", "end": "2020-05-25 21:49:13.076197", "item": "disk-pool-kt2c", "rc": 0, "start": "2020-05-25 21:49:11.937579", "stderr": "", "stderr_lines": [], "stdout": "disk-pool-kt2c", "stdout_lines": ["disk-pool-kt2c"]}
TASK [Getting the pool pod name] ***********************************************
task path: /experiments/functional/backup_and_restore/test.yml:210
2020-05-25T21:49:13.198902 (delta: 3.949563) elapsed: 282.628028 ******
changed: [127.0.0.1] => (item=disk-pool-4ruh) => {"changed": true, "cmd": "kubectl get pod -n openebs | grep disk-pool-4ruh | awk '{print $1}'", "delta": "0:00:01.128899", "end": "2020-05-25 21:49:14.749917", "item": "disk-pool-4ruh", "rc": 0, "start": "2020-05-25 21:49:13.621018", "stderr": "", "stderr_lines": [], "stdout": "disk-pool-4ruh-84b96d57cb-2pzjt", "stdout_lines": ["disk-pool-4ruh-84b96d57cb-2pzjt"]}
changed: [127.0.0.1] => (item=disk-pool-ft1p) => {"changed": true, "cmd": "kubectl get pod -n openebs | grep disk-pool-ft1p | awk '{print $1}'", "delta": "0:00:01.119941", "end": "2020-05-25 21:49:16.217960", "item": "disk-pool-ft1p", "rc": 0, "start": "2020-05-25 21:49:15.098019", "stderr": "", "stderr_lines": [], "stdout": "disk-pool-ft1p-7786f7dc9c-fn2vr", "stdout_lines": ["disk-pool-ft1p-7786f7dc9c-fn2vr"]}
changed: [127.0.0.1] => (item=disk-pool-kt2c) => {"changed": true, "cmd": "kubectl get pod -n openebs | grep disk-pool-kt2c | awk '{print $1}'", "delta": "0:00:01.212581", "end": "2020-05-25 21:49:17.753646", "item": "disk-pool-kt2c", "rc": 0, "start": "2020-05-25 21:49:16.541065", "stderr": "", "stderr_lines": [], "stdout": "disk-pool-kt2c-58c4569c46-rdb7n", "stdout_lines": ["disk-pool-kt2c-58c4569c46-rdb7n"]}
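# [editor's note] Each pool pod name above is recovered with the same grep/awk
# pipe the task runs; applied here to sample `kubectl get pod` output (pod
# names are from this log; the READY/AGE columns are illustrative).
pods='disk-pool-4ruh-84b96d57cb-2pzjt   1/1   Running   0   5d
disk-pool-ft1p-7786f7dc9c-fn2vr   1/1   Running   0   5d
disk-pool-kt2c-58c4569c46-rdb7n   1/1   Running   0   5d'
pool_pod=$(echo "$pods" | grep disk-pool-4ruh | awk '{print $1}')
# pool_pod = disk-pool-4ruh-84b96d57cb-2pzjt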
TASK [Getting the volume name inside cstor pool] *******************************
task path: /experiments/functional/backup_and_restore/test.yml:215
2020-05-25T21:49:17.865985 (delta: 4.66699) elapsed: 287.295111 *******
changed: [127.0.0.1] => (item={'_ansible_parsed': True, 'stderr_lines': [], u'changed': True, u'stdout': u'disk-pool-4ruh-84b96d57cb-2pzjt', '_ansible_item_result': True, u'delta': u'0:00:01.128899', 'stdout_lines': [u'disk-pool-4ruh-84b96d57cb-2pzjt'], '_ansible_item_label': u'disk-pool-4ruh', u'end': u'2020-05-25 21:49:14.749917', '_ansible_no_log': False, 'failed': False, u'cmd': u"kubectl get pod -n openebs | grep disk-pool-4ruh | awk '{print $1}'", 'item': u'disk-pool-4ruh', u'stderr': u'', u'rc': 0, u'invocation': {u'module_args': {u'warn': True, u'executable': None, u'_uses_shell': True, u'_raw_params': u"kubectl get pod -n openebs | grep disk-pool-4ruh | awk '{print $1}'", u'removes': None, u'argv': None, u'creates': None, u'chdir': None, u'stdin': None}}, u'start': u'2020-05-25 21:49:13.621018', '_ansible_ignore_errors': None}) => {"changed": true, "cmd": "echo $(kubectl exec -it disk-pool-4ruh-84b96d57cb-2pzjt -n openebs -c cstor-pool -- zfs list | grep pvc-7cd59133-9ed1-11ea-be3f-005056987ecb | awk '{print $1}') >> cstor-vol", "delta": "0:00:01.341002", "end": "2020-05-25 21:49:19.558948", "item": {"changed": true, "cmd": "kubectl get pod -n openebs | grep disk-pool-4ruh | awk '{print $1}'", "delta": "0:00:01.128899", "end": "2020-05-25 21:49:14.749917", "failed": false, "invocation": {"module_args": {"_raw_params": "kubectl get pod -n openebs | grep disk-pool-4ruh | awk '{print $1}'", "_uses_shell": true, "argv": null, "chdir": null, "creates": null, "executable": null, "removes": null, "stdin": null, "warn": true}}, "item": "disk-pool-4ruh", "rc": 0, "start": "2020-05-25 21:49:13.621018", "stderr": "", "stderr_lines": [], "stdout": "disk-pool-4ruh-84b96d57cb-2pzjt", "stdout_lines": ["disk-pool-4ruh-84b96d57cb-2pzjt"]}, "rc": 0, "start": "2020-05-25 21:49:18.217946", "stderr": "Unable to use a TTY - input is not a terminal or the right kind of file", "stderr_lines": ["Unable to use a TTY - input is not a terminal or the right kind of file"], "stdout": 
"", "stdout_lines": []}
changed: [127.0.0.1] => (item={'_ansible_parsed': True, 'stderr_lines': [], u'changed': True, u'stdout': u'disk-pool-ft1p-7786f7dc9c-fn2vr', '_ansible_item_result': True, u'delta': u'0:00:01.119941', 'stdout_lines': [u'disk-pool-ft1p-7786f7dc9c-fn2vr'], '_ansible_item_label': u'disk-pool-ft1p', u'end': u'2020-05-25 21:49:16.217960', '_ansible_no_log': False, 'failed': False, u'cmd': u"kubectl get pod -n openebs | grep disk-pool-ft1p | awk '{print $1}'", 'item': u'disk-pool-ft1p', u'stderr': u'', u'rc': 0, u'invocation': {u'module_args': {u'warn': True, u'executable': None, u'_uses_shell': True, u'_raw_params': u"kubectl get pod -n openebs | grep disk-pool-ft1p | awk '{print $1}'", u'removes': None, u'argv': None, u'creates': None, u'chdir': None, u'stdin': None}}, u'start': u'2020-05-25 21:49:15.098019', '_ansible_ignore_errors': None}) => {"changed": true, "cmd": "echo $(kubectl exec -it disk-pool-ft1p-7786f7dc9c-fn2vr -n openebs -c cstor-pool -- zfs list | grep pvc-7cd59133-9ed1-11ea-be3f-005056987ecb | awk '{print $1}') >> cstor-vol", "delta": "0:00:01.269939", "end": "2020-05-25 21:49:21.177831", "item": {"changed": true, "cmd": "kubectl get pod -n openebs | grep disk-pool-ft1p | awk '{print $1}'", "delta": "0:00:01.119941", "end": "2020-05-25 21:49:16.217960", "failed": false, "invocation": {"module_args": {"_raw_params": "kubectl get pod -n openebs | grep disk-pool-ft1p | awk '{print $1}'", "_uses_shell": true, "argv": null, "chdir": null, "creates": null, "executable": null, "removes": null, "stdin": null, "warn": true}}, "item": "disk-pool-ft1p", "rc": 0, "start": "2020-05-25 21:49:15.098019", "stderr": "", "stderr_lines": [], "stdout": "disk-pool-ft1p-7786f7dc9c-fn2vr", "stdout_lines": ["disk-pool-ft1p-7786f7dc9c-fn2vr"]}, "rc": 0, "start": "2020-05-25 21:49:19.907892", "stderr": "Unable to use a TTY - input is not a terminal or the right kind of file", "stderr_lines": ["Unable to use a TTY - input is not a terminal or the right kind of file"], "stdout": 
"", "stdout_lines": []}
changed: [127.0.0.1] => (item={'_ansible_parsed': True, 'stderr_lines': [], u'changed': True, u'stdout': u'disk-pool-kt2c-58c4569c46-rdb7n', '_ansible_item_result': True, u'delta': u'0:00:01.212581', 'stdout_lines': [u'disk-pool-kt2c-58c4569c46-rdb7n'], '_ansible_item_label': u'disk-pool-kt2c', u'end': u'2020-05-25 21:49:17.753646', '_ansible_no_log': False, 'failed': False, u'cmd': u"kubectl get pod -n openebs | grep disk-pool-kt2c | awk '{print $1}'", 'item': u'disk-pool-kt2c', u'stderr': u'', u'rc': 0, u'invocation': {u'module_args': {u'warn': True, u'executable': None, u'_uses_shell': True, u'_raw_params': u"kubectl get pod -n openebs | grep disk-pool-kt2c | awk '{print $1}'", u'removes': None, u'argv': None, u'creates': None, u'chdir': None, u'stdin': None}}, u'start': u'2020-05-25 21:49:16.541065', '_ansible_ignore_errors': None}) => {"changed": true, "cmd": "echo $(kubectl exec -it disk-pool-kt2c-58c4569c46-rdb7n -n openebs -c cstor-pool -- zfs list | grep pvc-7cd59133-9ed1-11ea-be3f-005056987ecb | awk '{print $1}') >> cstor-vol", "delta": "0:00:01.255475", "end": "2020-05-25 21:49:22.783046", "item": {"changed": true, "cmd": "kubectl get pod -n openebs | grep disk-pool-kt2c | awk '{print $1}'", "delta": "0:00:01.212581", "end": "2020-05-25 21:49:17.753646", "failed": false, "invocation": {"module_args": {"_raw_params": "kubectl get pod -n openebs | grep disk-pool-kt2c | awk '{print $1}'", "_uses_shell": true, "argv": null, "chdir": null, "creates": null, "executable": null, "removes": null, "stdin": null, "warn": true}}, "item": "disk-pool-kt2c", "rc": 0, "start": "2020-05-25 21:49:16.541065", "stderr": "", "stderr_lines": [], "stdout": "disk-pool-kt2c-58c4569c46-rdb7n", "stdout_lines": ["disk-pool-kt2c-58c4569c46-rdb7n"]}, "rc": 0, "start": "2020-05-25 21:49:21.527571", "stderr": "Unable to use a TTY - input is not a terminal or the right kind of file", "stderr_lines": ["Unable to use a TTY - input is not a terminal or the right kind of file"], "stdout": 
"", "stdout_lines": []}
TASK [Assigning the cstor-pool volume name to variable] ************************
task path: /experiments/functional/backup_and_restore/test.yml:220
2020-05-25T21:49:22.904559 (delta: 5.038484) elapsed: 292.333685 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "cat ./cstor-vol", "delta": "0:00:00.912770", "end": "2020-05-25 21:49:24.007959", "rc": 0, "start": "2020-05-25 21:49:23.095189", "stderr": "", "stderr_lines": [], "stdout": "cstor-736ea2c9-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb\ncstor-734cf2af-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb\ncstor-731faf87-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb", "stdout_lines": ["cstor-736ea2c9-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb", "cstor-734cf2af-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb", "cstor-731faf87-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb"]}
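# [editor's note] Each line of ./cstor-vol is "<pool-dataset>/<pv-name>";
# POSIX parameter expansion splits the two halves (sample value copied from
# the task output above).
vol='cstor-736ea2c9-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb'
pool_ds=${vol%%/*}   # pool dataset: cstor-736ea2c9-9eca-11ea-be3f-005056987ecb
pv_name=${vol#*/}    # volume:       pvc-7cd59133-9ed1-11ea-be3f-005056987ecb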
TASK [Setting target IP for volume inside cstor pool] **************************
task path: /experiments/functional/backup_and_restore/test.yml:224
2020-05-25T21:49:24.095105 (delta: 1.190492) elapsed: 293.524231 ******
changed: [127.0.0.1] => (item=[{'_ansible_parsed': True, 'stderr_lines': [], u'changed': True, u'stdout': u'disk-pool-4ruh-84b96d57cb-2pzjt', '_ansible_item_result': True, u'delta': u'0:00:01.128899', 'stdout_lines': [u'disk-pool-4ruh-84b96d57cb-2pzjt'], '_ansible_item_label': u'disk-pool-4ruh', u'end': u'2020-05-25 21:49:14.749917', '_ansible_no_log': False, u'start': u'2020-05-25 21:49:13.621018', 'failed': False, u'cmd': u"kubectl get pod -n openebs | grep disk-pool-4ruh | awk '{print $1}'", 'item': u'disk-pool-4ruh', u'stderr': u'', u'rc': 0, u'invocation': {u'module_args': {u'creates': None, u'executable': None, u'_uses_shell': True, u'_raw_params': u"kubectl get pod -n openebs | grep disk-pool-4ruh | awk '{print $1}'", u'removes': None, u'argv': None, u'warn': True, u'chdir': None, u'stdin': None}}, '_ansible_ignore_errors': None}, u'cstor-736ea2c9-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb']) => {"changed": true, "cmd": "kubectl exec -it disk-pool-4ruh-84b96d57cb-2pzjt -n openebs -- zfs set io.openebs:targetip=172.20.4.18 cstor-736ea2c9-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb", "delta": "0:00:01.312116", "end": "2020-05-25 21:49:25.713806", "item": [{"_ansible_ignore_errors": null, "_ansible_item_label": "disk-pool-4ruh", "_ansible_item_result": true, "_ansible_no_log": false, "_ansible_parsed": true, "changed": true, "cmd": "kubectl get pod -n openebs | grep disk-pool-4ruh | awk '{print $1}'", "delta": "0:00:01.128899", "end": "2020-05-25 21:49:14.749917", "failed": false, "invocation": {"module_args": {"_raw_params": "kubectl get pod -n openebs | grep disk-pool-4ruh | awk '{print $1}'", "_uses_shell": true, "argv": null, "chdir": null, "creates": null, "executable": null, "removes": null, "stdin": null, "warn": true}}, "item": "disk-pool-4ruh", "rc": 0, "start": "2020-05-25 21:49:13.621018", "stderr": "", "stderr_lines": [], "stdout": "disk-pool-4ruh-84b96d57cb-2pzjt", "stdout_lines": 
["disk-pool-4ruh-84b96d57cb-2pzjt"]}, "cstor-736ea2c9-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb"], "rc": 0, "start": "2020-05-25 21:49:24.401690", "stderr": "Defaulting container name to cstor-pool.\nUse 'kubectl describe pod/disk-pool-4ruh-84b96d57cb-2pzjt -n openebs' to see all of the containers in this pod.\nUnable to use a TTY - input is not a terminal or the right kind of file", "stderr_lines": ["Defaulting container name to cstor-pool.", "Use 'kubectl describe pod/disk-pool-4ruh-84b96d57cb-2pzjt -n openebs' to see all of the containers in this pod.", "Unable to use a TTY - input is not a terminal or the right kind of file"], "stdout": "", "stdout_lines": []}
changed: [127.0.0.1] => (item=[{'_ansible_parsed': True, 'stderr_lines': [], u'changed': True, u'stdout': u'disk-pool-ft1p-7786f7dc9c-fn2vr', '_ansible_item_result': True, u'delta': u'0:00:01.119941', 'stdout_lines': [u'disk-pool-ft1p-7786f7dc9c-fn2vr'], '_ansible_item_label': u'disk-pool-ft1p', u'end': u'2020-05-25 21:49:16.217960', '_ansible_no_log': False, u'start': u'2020-05-25 21:49:15.098019', 'failed': False, u'cmd': u"kubectl get pod -n openebs | grep disk-pool-ft1p | awk '{print $1}'", 'item': u'disk-pool-ft1p', u'stderr': u'', u'rc': 0, u'invocation': {u'module_args': {u'creates': None, u'executable': None, u'_uses_shell': True, u'_raw_params': u"kubectl get pod -n openebs | grep disk-pool-ft1p | awk '{print $1}'", u'removes': None, u'argv': None, u'warn': True, u'chdir': None, u'stdin': None}}, '_ansible_ignore_errors': None}, u'cstor-734cf2af-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb']) => {"changed": true, "cmd": "kubectl exec -it disk-pool-ft1p-7786f7dc9c-fn2vr -n openebs -- zfs set io.openebs:targetip=172.20.4.18 cstor-734cf2af-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb", "delta": "0:00:01.356579", "end": "2020-05-25 21:49:27.400734", "item": [{"_ansible_ignore_errors": null, "_ansible_item_label": "disk-pool-ft1p", "_ansible_item_result": true, "_ansible_no_log": false, "_ansible_parsed": true, "changed": true, "cmd": "kubectl get pod -n openebs | grep disk-pool-ft1p | awk '{print $1}'", "delta": "0:00:01.119941", "end": "2020-05-25 21:49:16.217960", "failed": false, "invocation": {"module_args": {"_raw_params": "kubectl get pod -n openebs | grep disk-pool-ft1p | awk '{print $1}'", "_uses_shell": true, "argv": null, "chdir": null, "creates": null, "executable": null, "removes": null, "stdin": null, "warn": true}}, "item": "disk-pool-ft1p", "rc": 0, "start": "2020-05-25 21:49:15.098019", "stderr": "", "stderr_lines": [], "stdout": "disk-pool-ft1p-7786f7dc9c-fn2vr", "stdout_lines": 
["disk-pool-ft1p-7786f7dc9c-fn2vr"]}, "cstor-734cf2af-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb"], "rc": 0, "start": "2020-05-25 21:49:26.044155", "stderr": "Defaulting container name to cstor-pool.\nUse 'kubectl describe pod/disk-pool-ft1p-7786f7dc9c-fn2vr -n openebs' to see all of the containers in this pod.\nUnable to use a TTY - input is not a terminal or the right kind of file", "stderr_lines": ["Defaulting container name to cstor-pool.", "Use 'kubectl describe pod/disk-pool-ft1p-7786f7dc9c-fn2vr -n openebs' to see all of the containers in this pod.", "Unable to use a TTY - input is not a terminal or the right kind of file"], "stdout": "", "stdout_lines": []}
changed: [127.0.0.1] => (item=[{'_ansible_parsed': True, 'stderr_lines': [], u'changed': True, u'stdout': u'disk-pool-kt2c-58c4569c46-rdb7n', '_ansible_item_result': True, u'delta': u'0:00:01.212581', 'stdout_lines': [u'disk-pool-kt2c-58c4569c46-rdb7n'], '_ansible_item_label': u'disk-pool-kt2c', u'end': u'2020-05-25 21:49:17.753646', '_ansible_no_log': False, u'start': u'2020-05-25 21:49:16.541065', 'failed': False, u'cmd': u"kubectl get pod -n openebs | grep disk-pool-kt2c | awk '{print $1}'", 'item': u'disk-pool-kt2c', u'stderr': u'', u'rc': 0, u'invocation': {u'module_args': {u'creates': None, u'executable': None, u'_uses_shell': True, u'_raw_params': u"kubectl get pod -n openebs | grep disk-pool-kt2c | awk '{print $1}'", u'removes': None, u'argv': None, u'warn': True, u'chdir': None, u'stdin': None}}, '_ansible_ignore_errors': None}, u'cstor-731faf87-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb']) => {"changed": true, "cmd": "kubectl exec -it disk-pool-kt2c-58c4569c46-rdb7n -n openebs -- zfs set io.openebs:targetip=172.20.4.18 cstor-731faf87-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb", "delta": "0:00:01.345028", "end": "2020-05-25 21:49:29.095735", "item": [{"_ansible_ignore_errors": null, "_ansible_item_label": "disk-pool-kt2c", "_ansible_item_result": true, "_ansible_no_log": false, "_ansible_parsed": true, "changed": true, "cmd": "kubectl get pod -n openebs | grep disk-pool-kt2c | awk '{print $1}'", "delta": "0:00:01.212581", "end": "2020-05-25 21:49:17.753646", "failed": false, "invocation": {"module_args": {"_raw_params": "kubectl get pod -n openebs | grep disk-pool-kt2c | awk '{print $1}'", "_uses_shell": true, "argv": null, "chdir": null, "creates": null, "executable": null, "removes": null, "stdin": null, "warn": true}}, "item": "disk-pool-kt2c", "rc": 0, "start": "2020-05-25 21:49:16.541065", "stderr": "", "stderr_lines": [], "stdout": "disk-pool-kt2c-58c4569c46-rdb7n", "stdout_lines": 
["disk-pool-kt2c-58c4569c46-rdb7n"]}, "cstor-731faf87-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb"], "rc": 0, "start": "2020-05-25 21:49:27.750707", "stderr": "Defaulting container name to cstor-pool.\nUse 'kubectl describe pod/disk-pool-kt2c-58c4569c46-rdb7n -n openebs' to see all of the containers in this pod.\nUnable to use a TTY - input is not a terminal or the right kind of file", "stderr_lines": ["Defaulting container name to cstor-pool.", "Use 'kubectl describe pod/disk-pool-kt2c-58c4569c46-rdb7n -n openebs' to see all of the containers in this pod.", "Unable to use a TTY - input is not a terminal or the right kind of file"], "stdout": "", "stdout_lines": []}
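# [editor's note] Sketch of the per-replica step above: each restored dataset
# gets the new target pod IP via `zfs set io.openebs:targetip`.
# build_targetip_cmd is a hypothetical helper that only assembles the command;
# passing `-c cstor-pool` explicitly (as the earlier `zfs list` task does)
# avoids the "Defaulting container name" warning seen above, and `-i` without
# `-t` avoids the TTY warning.
build_targetip_cmd() {
  # $1 = pool pod, $2 = target pod IP, $3 = <pool-dataset>/<pv-name>
  echo "kubectl exec -i $1 -n openebs -c cstor-pool -- zfs set io.openebs:targetip=$2 $3"
}
build_targetip_cmd disk-pool-4ruh-84b96d57cb-2pzjt 172.20.4.18 cstor-736ea2c9-9eca-11ea-be3f-005056987ecb/pvc-7cd59133-9ed1-11ea-be3f-005056987ecb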
TASK [Waiting for application to be in running state] **************************
task path: /experiments/functional/backup_and_restore/test.yml:234
2020-05-25T21:49:29.221418 (delta: 5.126244) elapsed: 298.650544 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verifying Data persistence] **********************************************
task path: /experiments/functional/backup_and_restore/test.yml:241
2020-05-25T21:49:29.366673 (delta: 0.145163) elapsed: 298.795799 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Waiting for application to be in running state] **************************
task path: /experiments/functional/backup_and_restore/test.yml:256
2020-05-25T21:49:29.509531 (delta: 0.142758) elapsed: 298.938657 ******
FAILED - RETRYING: Waiting for application to be in running state (30 retries left).
FAILED - RETRYING: Waiting for application to be in running state (29 retries left).
changed: [127.0.0.1] => {"attempts": 3, "changed": true, "cmd": "kubectl get pod percona-755f6678bf-rw7vk -n app-percona-ns-local -o jsonpath='{.status.phase}'", "delta": "0:00:01.152446", "end": "2020-05-25 21:49:43.894857", "rc": 0, "start": "2020-05-25 21:49:42.742411", "stderr": "", "stderr_lines": [], "stdout": "Running", "stdout_lines": ["Running"]}
TASK [Verifying Data persistence] **********************************************
task path: /experiments/functional/backup_and_restore/test.yml:263
2020-05-25T21:49:44.023365 (delta: 14.513743) elapsed: 313.452491 *****
included: /utils/scm/applications/mysql/mysql_data_persistence.yml for 127.0.0.1
TASK [Create some test data in the mysql database] *****************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:4
2020-05-25T21:49:44.288003 (delta: 0.264547) elapsed: 313.717129 ******
skipping: [127.0.0.1] => (item=mysql -uroot -pk8sDem0 -e 'create database backup;') => {"changed": false, "item": "mysql -uroot -pk8sDem0 -e 'create database backup;'", "skip_reason": "Conditional result was False"}
skipping: [127.0.0.1] => (item=mysql -uroot -pk8sDem0 -e 'create table ttbl (Data VARCHAR(20));' backup) => {"changed": false, "item": "mysql -uroot -pk8sDem0 -e 'create table ttbl (Data VARCHAR(20));' backup", "skip_reason": "Conditional result was False"}
skipping: [127.0.0.1] => (item=mysql -uroot -pk8sDem0 -e 'insert into ttbl (Data) VALUES ("tdata");' backup) => {"changed": false, "item": "mysql -uroot -pk8sDem0 -e 'insert into ttbl (Data) VALUES (\"tdata\");' backup", "skip_reason": "Conditional result was False"}
TASK [Kill the application pod] ************************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:21
2020-05-25T21:49:44.418080 (delta: 0.130019) elapsed: 313.847206 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl delete pod percona-755f6678bf-rw7vk -n app-percona-ns-local", "delta": "0:00:14.693835", "end": "2020-05-25 21:49:59.486462", "rc": 0, "start": "2020-05-25 21:49:44.792627", "stderr": "", "stderr_lines": [], "stdout": "pod \"percona-755f6678bf-rw7vk\" deleted", "stdout_lines": ["pod \"percona-755f6678bf-rw7vk\" deleted"]}
TASK [Verify if the application pod is deleted] ********************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:27
2020-05-25T21:49:59.642838 (delta: 15.224673) elapsed: 329.071964 *****
[WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: "{{ pod_name }}" not in podstatus.stdout
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get pods -n app-percona-ns-local", "delta": "0:00:01.099759", "end": "2020-05-25 21:50:01.043559", "rc": 0, "start": "2020-05-25 21:49:59.943800", "stderr": "", "stderr_lines": [], "stdout": "NAME READY STATUS RESTARTS AGE\npercona-755f6678bf-xmdjh 1/1 Running 0 15s", "stdout_lines": ["NAME READY STATUS RESTARTS AGE", "percona-755f6678bf-xmdjh 1/1 Running 0 15s"]}
TASK [Obtain the newly created pod name for application] ***********************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:37
2020-05-25T21:50:01.134061 (delta: 1.491103) elapsed: 330.563187 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get pods -n app-percona-ns-local -l name=percona -o jsonpath='{.items[].metadata.name}'", "delta": "0:00:01.216353", "end": "2020-05-25 21:50:02.724449", "rc": 0, "start": "2020-05-25 21:50:01.508096", "stderr": "", "stderr_lines": [], "stdout": "percona-755f6678bf-xmdjh", "stdout_lines": ["percona-755f6678bf-xmdjh"]}
TASK [Checking application pod is in running state] ****************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:44
2020-05-25T21:50:02.854785 (delta: 1.720641) elapsed: 332.283911 ******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get pods -n app-percona-ns-local -o jsonpath='{.items[?(@.metadata.name==\"percona-755f6678bf-xmdjh\")].status.phase}'", "delta": "0:00:01.152967", "end": "2020-05-25 21:50:04.420796", "rc": 0, "start": "2020-05-25 21:50:03.267829", "stderr": "", "stderr_lines": [], "stdout": "Running", "stdout_lines": ["Running"]}
TASK [Get the container status of application.] ********************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:51
2020-05-25T21:50:04.516452 (delta: 1.661575) elapsed: 333.945578 ******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get pods -n app-percona-ns-local -o jsonpath='{.items[?(@.metadata.name==\"percona-755f6678bf-xmdjh\")].status.containerStatuses[].state}' | grep running", "delta": "0:00:01.233843", "end": "2020-05-25 21:50:06.042516", "rc": 0, "start": "2020-05-25 21:50:04.808673", "stderr": "", "stderr_lines": [], "stdout": "map[running:map[startedAt:2020-05-25T21:49:54Z]]", "stdout_lines": ["map[running:map[startedAt:2020-05-25T21:49:54Z]]"]}
TASK [Check if db is ready for connections] ************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:61
2020-05-25T21:50:06.192533 (delta: 1.676013) elapsed: 335.621659 ******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl logs percona-755f6678bf-xmdjh -n app-percona-ns-local | grep 'ready for connections'", "delta": "0:00:01.239410", "end": "2020-05-25 21:50:07.867882", "rc": 0, "start": "2020-05-25 21:50:06.628472", "stderr": "", "stderr_lines": [], "stdout": "2020-05-25T21:49:56.806571Z 0 [Note] mysqld: ready for connections.", "stdout_lines": ["2020-05-25T21:49:56.806571Z 0 [Note] mysqld: ready for connections."]}
TASK [Checking for the Corrupted tables] ***************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:68
2020-05-25T21:50:08.028279 (delta: 1.835653) elapsed: 337.457405 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl exec percona-755f6678bf-xmdjh -n app-percona-ns-local -- mysqlcheck -c backup -uroot -pk8sDem0", "delta": "0:00:01.248645", "end": "2020-05-25 21:50:09.657832", "failed_when_result": false, "rc": 0, "start": "2020-05-25 21:50:08.409187", "stderr": "mysqlcheck: [Warning] Using a password on the command line interface can be insecure.", "stderr_lines": ["mysqlcheck: [Warning] Using a password on the command line interface can be insecure."], "stdout": "backup.ttbl OK", "stdout_lines": ["backup.ttbl OK"]}
TASK [Verify mysql data persistence] *******************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:77
2020-05-25T21:50:09.722638 (delta: 1.694269) elapsed: 339.151764 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl exec percona-755f6678bf-xmdjh -n app-percona-ns-local -- mysql -uroot -pk8sDem0 -e 'select * from ttbl' backup;", "delta": "0:00:01.355427", "end": "2020-05-25 21:50:11.272579", "failed_when_result": false, "rc": 0, "start": "2020-05-25 21:50:09.917152", "stderr": "mysql: [Warning] Using a password on the command line interface can be insecure.", "stderr_lines": ["mysql: [Warning] Using a password on the command line interface can be insecure."], "stdout": "Data\ntdata", "stdout_lines": ["Data", "tdata"]}
TASK [Delete/drop MySQL database] **********************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:90
2020-05-25T21:50:11.335979 (delta: 1.613288) elapsed: 340.765105 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify successful db delete] *********************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:98
2020-05-25T21:50:11.402731 (delta: 0.066699) elapsed: 340.831857 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /experiments/functional/backup_and_restore/test.yml:277
2020-05-25T21:50:11.469355 (delta: 0.066569) elapsed: 340.898481 ******
ok: [127.0.0.1] => {"ansible_facts": {"flag": "Pass"}, "changed": false}
TASK [include_tasks] ***********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:285
2020-05-25T21:50:11.553563 (delta: 0.084153) elapsed: 340.982689 ******
included: /utils/fcm/update_litmus_result_resource.yml for 127.0.0.1
TASK [Generate the litmus result CR to reflect SOT (Start of Test)] ************
task path: /utils/fcm/update_litmus_result_resource.yml:3
2020-05-25T21:50:11.722330 (delta: 0.16871) elapsed: 341.151456 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Analyze the cr yaml] *****************************************************
task path: /utils/fcm/update_litmus_result_resource.yml:14
2020-05-25T21:50:11.797955 (delta: 0.075561) elapsed: 341.227081 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Apply the litmus result CR] **********************************************
task path: /utils/fcm/update_litmus_result_resource.yml:17
2020-05-25T21:50:11.873236 (delta: 0.075219) elapsed: 341.302362 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Generate the litmus result CR to reflect EOT (End of Test)] **************
task path: /utils/fcm/update_litmus_result_resource.yml:27
2020-05-25T21:50:11.970758 (delta: 0.097461) elapsed: 341.399884 ******
changed: [127.0.0.1] => {"changed": true, "checksum": "218cc4cd06f4a20f4ce3c234b033d66800aa02b5", "dest": "./litmus-result.yaml", "gid": 0, "group": "root", "md5sum": "9f2161a292cb589ce3599bf99ea6818b", "mode": "0644", "owner": "root", "size": 420, "src": "/root/.ansible/tmp/ansible-tmp-1590443412.08-149850154909233/source", "state": "file", "uid": 0}
TASK [Analyze the cr yaml] *****************************************************
task path: /utils/fcm/update_litmus_result_resource.yml:38
2020-05-25T21:50:14.498889 (delta: 2.528058) elapsed: 343.928015 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "cat litmus-result.yaml", "delta": "0:00:01.004079", "end": "2020-05-25 21:50:15.878865", "rc": 0, "start": "2020-05-25 21:50:14.874786", "stderr": "", "stderr_lines": [], "stdout": "---\napiVersion: litmus.io/v1alpha1\nkind: LitmusResult\nmetadata:\n\n # name of the litmus testcase\n name: velero-backup-restore \nspec:\n\n # holds information on the testcase\n testMetadata:\n app: \n chaostype: \n\n # holds the state of testcase, manually updated by json merge patch\n # result is the useful value today, but anticipate phase use in future \n testStatus: \n phase: completed \n result: Pass ", "stdout_lines": ["---", "apiVersion: litmus.io/v1alpha1", "kind: LitmusResult", "metadata:", "", " # name of the litmus testcase", " name: velero-backup-restore ", "spec:", "", " # holds information on the testcase", " testMetadata:", " app: ", " chaostype: ", "", " # holds the state of testcase, manually updated by json merge patch", " # result is the useful value today, but anticipate phase use in future ", " testStatus: ", " phase: completed ", " result: Pass "]}
TASK [Apply the litmus result CR] **********************************************
task path: /utils/fcm/update_litmus_result_resource.yml:41
2020-05-25T21:50:15.960682 (delta: 1.461739) elapsed: 345.389808 ******
changed: [127.0.0.1] => {"changed": true, "failed_when_result": false, "method": "patch", "result": {"apiVersion": "litmus.io/v1alpha1", "kind": "LitmusResult", "metadata": {"creationTimestamp": "2020-05-25T21:08:59Z", "generation": 6, "name": "velero-backup-restore", "resourceVersion": "1683175", "selfLink": "/apis/litmus.io/v1alpha1/litmusresults/velero-backup-restore", "uid": "f39fdbaf-9ecb-11ea-be3f-005056987ecb"}, "spec": {"testMetadata": {}, "testStatus": {"phase": "completed", "result": "Pass"}}}}
TASK [Deleting GCP bucket] *****************************************************
task path: /experiments/functional/backup_and_restore/test.yml:289
2020-05-25T21:50:17.422759 (delta: 1.46202) elapsed: 346.851885 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "gsutil rm -r gs://e2e-gcp-bucket", "delta": "0:00:09.445650", "end": "2020-05-25 21:50:27.188469", "rc": 0, "start": "2020-05-25 21:50:17.742819", "stderr": "Removing gs://e2e-gcp-bucket/backups/percona-backup/-pvc-a83b23f3-9ecf-11ea-be3f-005056987ecb-percona-backup#1590443260919563...\n/ [1 objects] \rRemoving gs://e2e-gcp-bucket/backups/percona-backup/-pvc-a83b23f3-9ecf-11ea-be3f-005056987ecb-percona-backup.pvc#1590443242137659...\n/ [2 objects] \rRemoving gs://e2e-gcp-bucket/backups/percona-backup/percona-backup-logs.gz#1590443269829082...\n/ [3 objects] \rRemoving gs://e2e-gcp-bucket/backups/percona-backup/percona-backup-podvolumebackups.json.gz#1590443271079723...\n/ [4 objects] \r\n==> NOTE: You are performing a sequence of gsutil operations that may\nrun significantly faster if you instead use gsutil -m rm ... Please\nsee the -m section under \"gsutil help options\" for further information\nabout when gsutil -m can be advantageous.\n\nRemoving gs://e2e-gcp-bucket/backups/percona-backup/percona-backup-resource-list.json.gz#1590443271875880...\n/ [5 objects] \rRemoving gs://e2e-gcp-bucket/backups/percona-backup/percona-backup-volumesnapshots.json.gz#1590443271490792...\n/ [6 objects] \rRemoving gs://e2e-gcp-bucket/backups/percona-backup/percona-backup.tar.gz#1590443270688635...\n/ [7 objects] \rRemoving gs://e2e-gcp-bucket/backups/percona-backup/velero-backup.json#1590443270249372...\n/ [8 objects] \rRemoving gs://e2e-gcp-bucket/restores/percona-backup-20200525214832/restore-percona-backup-20200525214832-logs.gz#1590443342128771...\n/ [9 objects] \rRemoving gs://e2e-gcp-bucket/restores/percona-backup-20200525214832/restore-percona-backup-20200525214832-results.gz#1590443342620199...\n/ [10 objects] \r\nOperation completed over 10 objects. \nRemoving gs://e2e-gcp-bucket/...", "stderr_lines": ["Removing gs://e2e-gcp-bucket/backups/percona-backup/-pvc-a83b23f3-9ecf-11ea-be3f-005056987ecb-percona-backup#1590443260919563...", "/ [1 objects] ", "Removing gs://e2e-gcp-bucket/backups/percona-backup/-pvc-a83b23f3-9ecf-11ea-be3f-005056987ecb-percona-backup.pvc#1590443242137659...", "/ [2 objects] ", "Removing gs://e2e-gcp-bucket/backups/percona-backup/percona-backup-logs.gz#1590443269829082...", "/ [3 objects] ", "Removing gs://e2e-gcp-bucket/backups/percona-backup/percona-backup-podvolumebackups.json.gz#1590443271079723...", "/ [4 objects] ", "==> NOTE: You are performing a sequence of gsutil operations that may", "run significantly faster if you instead use gsutil -m rm ... Please", "see the -m section under \"gsutil help options\" for further information", "about when gsutil -m can be advantageous.", "", "Removing gs://e2e-gcp-bucket/backups/percona-backup/percona-backup-resource-list.json.gz#1590443271875880...", "/ [5 objects] ", "Removing gs://e2e-gcp-bucket/backups/percona-backup/percona-backup-volumesnapshots.json.gz#1590443271490792...", "/ [6 objects] ", "Removing gs://e2e-gcp-bucket/backups/percona-backup/percona-backup.tar.gz#1590443270688635...", "/ [7 objects] ", "Removing gs://e2e-gcp-bucket/backups/percona-backup/velero-backup.json#1590443270249372...", "/ [8 objects] ", "Removing gs://e2e-gcp-bucket/restores/percona-backup-20200525214832/restore-percona-backup-20200525214832-logs.gz#1590443342128771...", "/ [9 objects] ", "Removing gs://e2e-gcp-bucket/restores/percona-backup-20200525214832/restore-percona-backup-20200525214832-results.gz#1590443342620199...", "/ [10 objects] ", "Operation completed over 10 objects. ", "Removing gs://e2e-gcp-bucket/..."], "stdout": "", "stdout_lines": []}
TASK [Deprovisioning Velero server] ********************************************
task path: /experiments/functional/backup_and_restore/test.yml:293
2020-05-25T21:50:27.305226 (delta: 9.882374) elapsed: 356.734352 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
META: ran handlers
META: ran handlers
PLAY RECAP *********************************************************************
127.0.0.1 : ok=68 changed=54 unreachable=0 failed=0
2020-05-25T21:50:27.373836 (delta: 0.068518) elapsed: 356.802962 ******
===============================================================================
```
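Note on the `[WARNING]` emitted during the "Verify if the application pod is deleted" task: Ansible warns because the `when`/`until` condition uses `{{ }}` delimiters, but conditionals are already evaluated as raw Jinja2 expressions. A minimal sketch of the corrected task follows — this is an illustration only, not the actual contents of `mysql_data_persistence.yml`; the task structure, `retries`, and `delay` values are assumptions, with only the command, namespace, and the `pod_name`/`podstatus` variable names taken from the log:

```yaml
# Hypothetical reconstruction of the task behind the warning.
# Writing the condition without {{ }} removes the jinja2-delimiter warning.
- name: Verify if the application pod is deleted
  shell: kubectl get pods -n app-percona-ns-local
  register: podstatus
  until: pod_name not in podstatus.stdout   # bare expression, no {{ }}
  retries: 10        # assumed value
  delay: 5           # assumed value
```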