@shashank855
Created June 2, 2020 13:40
ansible-playbook 2.7.3
config file = None
configured module search path = [u'/root/.ansible/plugins/modules', u'/usr/share/ansible/plugins/modules']
ansible python module location = /usr/local/lib/python2.7/dist-packages/ansible
executable location = /usr/local/bin/ansible-playbook
python version = 2.7.17 (default, Apr 15 2020, 17:20:14) [GCC 7.5.0]
No config file found; using defaults
/etc/ansible/hosts did not meet host_list requirements, check plugin documentation if this is unexpected
/etc/ansible/hosts did not meet script requirements, check plugin documentation if this is unexpected
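The task-path and per-task timestamp lines in this trace come from running the playbook with extra verbosity (the timestamps suggest a profiling callback is enabled as well). A rough sketch of the invocation, assuming the litmus e2e repo layout shown in the banner below:

  # reproduce a similar trace; -vv prints the "task path" lines seen throughout this log
  ansible-playbook -vv ./experiments/functional/backup_and_restore/test.yml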
PLAYBOOK: test.yml *************************************************************
1 plays in ./experiments/functional/backup_and_restore/test.yml
PLAY [localhost] ***************************************************************
2020-06-02T13:35:02.874519 (delta: 0.048864) elapsed: 0.048864 ********
===============================================================================
TASK [Gathering Facts] *********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:1
2020-06-02T13:35:02.908899 (delta: 0.034351) elapsed: 0.083244 ********
ok: [127.0.0.1]
META: ran handlers
TASK [include_tasks] ***********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:12
2020-06-02T13:35:12.246088 (delta: 9.337163) elapsed: 9.420433 ********
included: /utils/fcm/create_testname.yml for 127.0.0.1
TASK [Record test instance/run ID] *********************************************
task path: /utils/fcm/create_testname.yml:3
2020-06-02T13:35:12.317441 (delta: 0.071312) elapsed: 9.491786 ********
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Construct testname appended with runID] **********************************
task path: /utils/fcm/create_testname.yml:7
2020-06-02T13:35:12.369684 (delta: 0.052188) elapsed: 9.544029 ********
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [include_tasks] ***********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:15
2020-06-02T13:35:12.425312 (delta: 0.055582) elapsed: 9.599657 ********
included: /utils/fcm/update_litmus_result_resource.yml for 127.0.0.1
TASK [Generate the litmus result CR to reflect SOT (Start of Test)] ************
task path: /utils/fcm/update_litmus_result_resource.yml:3
2020-06-02T13:35:12.505894 (delta: 0.08054) elapsed: 9.680239 *********
changed: [127.0.0.1] => {"changed": true, "checksum": "cb23761ee0ecfea3a093f94c0443739c777b3007", "dest": "./litmus-result.yaml", "gid": 0, "group": "root", "md5sum": "0da5664e16958d4f3c263662ebe36b5c", "mode": "0644", "owner": "root", "size": 422, "src": "/root/.ansible/tmp/ansible-tmp-1591104912.55-108846767627484/source", "state": "file", "uid": 0}
TASK [Analyze the cr yaml] *****************************************************
task path: /utils/fcm/update_litmus_result_resource.yml:14
2020-06-02T13:35:13.050580 (delta: 0.544643) elapsed: 10.224925 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "cat litmus-result.yaml", "delta": "0:00:01.056832", "end": "2020-06-02 13:35:14.385435", "rc": 0, "start": "2020-06-02 13:35:13.328603", "stderr": "", "stderr_lines": [], "stdout": "---\napiVersion: litmus.io/v1alpha1\nkind: LitmusResult\nmetadata:\n\n # name of the litmus testcase\n name: velero-backup-restore \nspec:\n\n # holds information on the testcase\n testMetadata:\n app: \n chaostype: \n\n # holds the state of testcase, manually updated by json merge patch\n # result is the useful value today, but anticipate phase use in future \n testStatus: \n phase: in-progress \n result: none ", "stdout_lines": ["---", "apiVersion: litmus.io/v1alpha1", "kind: LitmusResult", "metadata:", "", " # name of the litmus testcase", " name: velero-backup-restore ", "spec:", "", " # holds information on the testcase", " testMetadata:", " app: ", " chaostype: ", "", " # holds the state of testcase, manually updated by json merge patch", " # result is the useful value today, but anticipate phase use in future ", " testStatus: ", " phase: in-progress ", " result: none "]}
TASK [Apply the litmus result CR] **********************************************
task path: /utils/fcm/update_litmus_result_resource.yml:17
2020-06-02T13:35:14.430392 (delta: 1.379764) elapsed: 11.604737 *******
changed: [127.0.0.1] => {"changed": true, "failed_when_result": false, "method": "patch", "result": {"apiVersion": "litmus.io/v1alpha1", "kind": "LitmusResult", "metadata": {"creationTimestamp": "2020-06-02T12:28:40Z", "generation": 3, "name": "velero-backup-restore", "resourceVersion": "1233958", "selfLink": "/apis/litmus.io/v1alpha1/litmusresults/velero-backup-restore", "uid": "96e7db5a-a4cc-11ea-9ff8-005056987ecb"}, "spec": {"testMetadata": {}, "testStatus": {"phase": "in-progress", "result": "none"}}}}
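The comments in the generated CR above note that testStatus is updated by a JSON merge patch. Outside the playbook, the same kind of update could be applied by hand; a sketch, with the resource name taken from the CR above and the final phase/result values assumed:

  # hypothetical manual merge-patch of the cluster-scoped LitmusResult CR
  kubectl patch litmusresult velero-backup-restore --type=merge \
    -p '{"spec":{"testStatus":{"phase":"completed","result":"pass"}}}'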
TASK [Generate the litmus result CR to reflect EOT (End of Test)] **************
task path: /utils/fcm/update_litmus_result_resource.yml:27
2020-06-02T13:35:15.508011 (delta: 1.077578) elapsed: 12.682356 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Analyze the cr yaml] *****************************************************
task path: /utils/fcm/update_litmus_result_resource.yml:38
2020-06-02T13:35:15.564877 (delta: 0.056818) elapsed: 12.739222 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Apply the litmus result CR] **********************************************
task path: /utils/fcm/update_litmus_result_resource.yml:41
2020-06-02T13:35:15.613149 (delta: 0.048198) elapsed: 12.787494 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify that the AUT is running] ******************************************
task path: /experiments/functional/backup_and_restore/test.yml:19
2020-06-02T13:35:15.656895 (delta: 0.043692) elapsed: 12.83124 ********
included: /utils/k8s/check_deployment_status.yml for 127.0.0.1
TASK [Check the pod status] ****************************************************
task path: /utils/k8s/check_deployment_status.yml:8
2020-06-02T13:35:15.737877 (delta: 0.080934) elapsed: 12.912222 *******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get pods -n app-percona -l name=percona --no-headers -o custom-columns=:status.phase", "delta": "0:00:01.281022", "end": "2020-06-02 13:35:17.173191", "rc": 0, "start": "2020-06-02 13:35:15.892169", "stderr": "", "stderr_lines": [], "stdout": "Running", "stdout_lines": ["Running"]}
TASK [obtain the number of replicas.] ******************************************
task path: /utils/k8s/check_deployment_status.yml:22
2020-06-02T13:35:17.228719 (delta: 1.490794) elapsed: 14.403064 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Obtain the ready replica count and compare with the replica count.] ******
task path: /utils/k8s/check_deployment_status.yml:29
2020-06-02T13:35:17.286139 (delta: 0.057378) elapsed: 14.460484 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [include_tasks] ***********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:22
2020-06-02T13:35:17.336495 (delta: 0.050315) elapsed: 14.51084 ********
included: /experiments/functional/backup_and_restore/setup_dependency.yml for 127.0.0.1
TASK [Downloading velero binary] ***********************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:1
2020-06-02T13:35:17.484578 (delta: 0.148041) elapsed: 14.658923 *******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "checksum_dest": null, "checksum_src": "30e3e5cdaf2df32b84472dc0ab1fa1ae591aae40", "dest": "./velero-v1.3.0-linux-amd64.tar.gz", "gid": 0, "group": "root", "md5sum": "b45cbd835a139e4c3ed5b56309dcf22d", "mode": "0644", "msg": "OK (24524254 bytes)", "owner": "root", "size": 24524254, "src": "/root/.ansible/tmp/ansible-tmp-1591104917.53-99185819536291/tmpw9PVva", "state": "file", "status_code": 200, "uid": 0, "url": "https://github.com/vmware-tanzu/velero/releases/download/v1.3.0/velero-v1.3.0-linux-amd64.tar.gz"}
TASK [Installing velero inside litmus container] *******************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:11
2020-06-02T13:35:28.067333 (delta: 10.582711) elapsed: 25.241678 ******
[WARNING]: Consider using the unarchive module rather than running tar. If
you need to use command because unarchive is insufficient you can add
warn=False to this command task or set command_warnings=False in ansible.cfg to
get rid of this message.
changed: [127.0.0.1] => {"changed": true, "cmd": "tar -xvf velero-v1.3.0-linux-amd64.tar.gz\n mv velero-v1.3.0-linux-amd64/velero /usr/local/bin/", "delta": "0:00:01.616372", "end": "2020-06-02 13:35:29.849344", "rc": 0, "start": "2020-06-02 13:35:28.232972", "stderr": "", "stderr_lines": [], "stdout": "velero-v1.3.0-linux-amd64/LICENSE\nvelero-v1.3.0-linux-amd64/examples/README.md\nvelero-v1.3.0-linux-amd64/examples/minio\nvelero-v1.3.0-linux-amd64/examples/minio/00-minio-deployment.yaml\nvelero-v1.3.0-linux-amd64/examples/nginx-app\nvelero-v1.3.0-linux-amd64/examples/nginx-app/README.md\nvelero-v1.3.0-linux-amd64/examples/nginx-app/base.yaml\nvelero-v1.3.0-linux-amd64/examples/nginx-app/with-pv.yaml\nvelero-v1.3.0-linux-amd64/velero", "stdout_lines": ["velero-v1.3.0-linux-amd64/LICENSE", "velero-v1.3.0-linux-amd64/examples/README.md", "velero-v1.3.0-linux-amd64/examples/minio", "velero-v1.3.0-linux-amd64/examples/minio/00-minio-deployment.yaml", "velero-v1.3.0-linux-amd64/examples/nginx-app", "velero-v1.3.0-linux-amd64/examples/nginx-app/README.md", "velero-v1.3.0-linux-amd64/examples/nginx-app/base.yaml", "velero-v1.3.0-linux-amd64/examples/nginx-app/with-pv.yaml", "velero-v1.3.0-linux-amd64/velero"]}
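For reference, the download and install performed by the two tasks above amount to roughly the following when done by hand (URL and paths taken from the log; the curl usage is an assumption, the playbook itself uses get_url and a shell task):

  # fetch the velero v1.3.0 client release, same URL as the get_url task above
  curl -L -o velero-v1.3.0-linux-amd64.tar.gz \
    https://github.com/vmware-tanzu/velero/releases/download/v1.3.0/velero-v1.3.0-linux-amd64.tar.gz
  # unpack and put the binary on PATH, as the shell task above does
  tar -xvf velero-v1.3.0-linux-amd64.tar.gz
  mv velero-v1.3.0-linux-amd64/velero /usr/local/bin/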
TASK [Checking the velero version] *********************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:16
2020-06-02T13:35:29.896841 (delta: 1.829452) elapsed: 27.071186 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "velero version", "delta": "0:00:01.170662", "end": "2020-06-02 13:35:31.205891", "failed_when_result": false, "rc": 0, "start": "2020-06-02 13:35:30.035229", "stderr": "", "stderr_lines": [], "stdout": "Client:\n\tVersion: v1.3.0\n\tGit commit: 8fec8ed7fb8b4776b191753497eafeb47f2f9136\n<error getting server version: namespaces \"velero\" not found>", "stdout_lines": ["Client:", "\tVersion: v1.3.0", "\tGit commit: 8fec8ed7fb8b4776b191753497eafeb47f2f9136", "<error getting server version: namespaces \"velero\" not found>"]}
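The "<error getting server version: namespaces \"velero\" not found>" part is expected at this point, since only the client binary has been installed so far. Limiting the check to the client side avoids that noise:

  # client-only version check, so no server lookup is attempted
  velero version --client-only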
TASK [Replacing Profile name in credential file] *******************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:21
2020-06-02T13:35:31.269279 (delta: 1.372402) elapsed: 28.443624 *******
changed: [127.0.0.1] => {"changed": true, "msg": "1 replacements made"}
TASK [Installing minio s3-bucket] **********************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:30
2020-06-02T13:35:31.610733 (delta: 0.341409) elapsed: 28.785078 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl apply -f velero-v1.3.0-linux-amd64/examples/minio/00-minio-deployment.yaml", "delta": "0:00:01.613245", "end": "2020-06-02 13:35:33.410421", "rc": 0, "start": "2020-06-02 13:35:31.797176", "stderr": "", "stderr_lines": [], "stdout": "namespace/velero created\ndeployment.apps/minio created\nservice/minio created\njob.batch/minio-setup created", "stdout_lines": ["namespace/velero created", "deployment.apps/minio created", "service/minio created", "job.batch/minio-setup created"]}
TASK [Checking for minio pod status] *******************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:33
2020-06-02T13:35:33.463156 (delta: 1.852377) elapsed: 30.637501 *******
FAILED - RETRYING: Checking for minio pod status (15 retries left).
FAILED - RETRYING: Checking for minio pod status (14 retries left).
changed: [127.0.0.1] => {"attempts": 3, "changed": true, "cmd": "kubectl get pod -n velero -l component=minio -ojsonpath='{.items[0].status.phase}'", "delta": "0:00:01.186251", "end": "2020-06-02 13:35:47.479568", "rc": 0, "start": "2020-06-02 13:35:46.293317", "stderr": "", "stderr_lines": [], "stdout": "Running", "stdout_lines": ["Running"]}
TASK [Waiting for minio job to create bucket] **********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:40
2020-06-02T13:35:47.529650 (delta: 14.066433) elapsed: 44.703995 ******
FAILED - RETRYING: Waiting for minio job to create bucket (10 retries left).
changed: [127.0.0.1] => {"attempts": 2, "changed": true, "cmd": "kubectl get pod -n velero -l job-name=minio-setup -o jsonpath='{.items[*].status.phase}'", "delta": "0:00:01.235280", "end": "2020-06-02 13:35:55.231800", "rc": 0, "start": "2020-06-02 13:35:53.996520", "stderr": "", "stderr_lines": [], "stdout": "Succeeded", "stdout_lines": ["Succeeded"]}
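The two retry loops above poll pod phases with kubectl get and jsonpath. A more declarative alternative (shown only as a sketch, using the same selectors and job name visible in the log) is kubectl wait:

  # wait for the minio pod to become Ready
  kubectl wait --for=condition=Ready pod -l component=minio -n velero --timeout=180s
  # wait for the bucket-setup job to complete
  kubectl wait --for=condition=complete job/minio-setup -n velero --timeout=180s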
TASK [Fetching service account key.json from configmap] ************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:51
2020-06-02T13:35:55.284831 (delta: 7.755138) elapsed: 52.459176 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Authenticating using gcp service account] ********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:54
2020-06-02T13:35:55.337947 (delta: 0.053068) elapsed: 52.512292 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Setting up the project ID] ***********************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:57
2020-06-02T13:35:55.397656 (delta: 0.059663) elapsed: 52.572001 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Deleting GCP bucket if exist] ********************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:60
2020-06-02T13:35:55.452823 (delta: 0.055118) elapsed: 52.627168 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating gcp bucket] *****************************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:66
2020-06-02T13:35:55.509795 (delta: 0.056924) elapsed: 52.68414 ********
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:75
2020-06-02T13:35:55.564149 (delta: 0.054302) elapsed: 52.738494 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:86
2020-06-02T13:35:55.617434 (delta: 0.053237) elapsed: 52.791779 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:102
2020-06-02T13:35:55.686954 (delta: 0.069458) elapsed: 52.861299 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:112
2020-06-02T13:35:55.747541 (delta: 0.060539) elapsed: 52.921886 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Patching restic demonset for privileged access] **************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:125
2020-06-02T13:35:55.806565 (delta: 0.058973) elapsed: 52.98091 ********
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the number of compute nodes] *************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:130
2020-06-02T13:35:55.860129 (delta: 0.053515) elapsed: 53.034474 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Checking the status of restic demonset pods] *****************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:134
2020-06-02T13:35:55.915881 (delta: 0.055704) elapsed: 53.090226 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:147
2020-06-02T13:35:55.972114 (delta: 0.056187) elapsed: 53.146459 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:157
2020-06-02T13:35:56.032811 (delta: 0.060649) elapsed: 53.207156 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "velero install --provider aws --bucket velero --secret-file ./credentials-velero --use-volume-snapshots=false --plugins velero/velero-plugin-for-aws:v1.0.0 --backup-location-config region=minio,s3ForcePathStyle=\"true\",s3Url=http://minio.velero.svc:9000,profile=test", "delta": "0:00:02.968612", "end": "2020-06-02 13:35:59.183606", "rc": 0, "start": "2020-06-02 13:35:56.214994", "stderr": "", "stderr_lines": [], "stdout": "CustomResourceDefinition/backups.velero.io: attempting to create resource\nCustomResourceDefinition/backups.velero.io: already exists, proceeding\nCustomResourceDefinition/backups.velero.io: created\nCustomResourceDefinition/backupstoragelocations.velero.io: attempting to create resource\nCustomResourceDefinition/backupstoragelocations.velero.io: already exists, proceeding\nCustomResourceDefinition/backupstoragelocations.velero.io: created\nCustomResourceDefinition/deletebackuprequests.velero.io: attempting to create resource\nCustomResourceDefinition/deletebackuprequests.velero.io: already exists, proceeding\nCustomResourceDefinition/deletebackuprequests.velero.io: created\nCustomResourceDefinition/downloadrequests.velero.io: attempting to create resource\nCustomResourceDefinition/downloadrequests.velero.io: already exists, proceeding\nCustomResourceDefinition/downloadrequests.velero.io: created\nCustomResourceDefinition/podvolumebackups.velero.io: attempting to create resource\nCustomResourceDefinition/podvolumebackups.velero.io: already exists, proceeding\nCustomResourceDefinition/podvolumebackups.velero.io: created\nCustomResourceDefinition/podvolumerestores.velero.io: attempting to create resource\nCustomResourceDefinition/podvolumerestores.velero.io: already exists, proceeding\nCustomResourceDefinition/podvolumerestores.velero.io: created\nCustomResourceDefinition/resticrepositories.velero.io: attempting to create resource\nCustomResourceDefinition/resticrepositories.velero.io: already exists, proceeding\nCustomResourceDefinition/resticrepositories.velero.io: created\nCustomResourceDefinition/restores.velero.io: attempting to create resource\nCustomResourceDefinition/restores.velero.io: already exists, proceeding\nCustomResourceDefinition/restores.velero.io: created\nCustomResourceDefinition/schedules.velero.io: attempting to create resource\nCustomResourceDefinition/schedules.velero.io: already exists, proceeding\nCustomResourceDefinition/schedules.velero.io: created\nCustomResourceDefinition/serverstatusrequests.velero.io: attempting to create resource\nCustomResourceDefinition/serverstatusrequests.velero.io: already exists, proceeding\nCustomResourceDefinition/serverstatusrequests.velero.io: created\nCustomResourceDefinition/volumesnapshotlocations.velero.io: attempting to create resource\nCustomResourceDefinition/volumesnapshotlocations.velero.io: already exists, proceeding\nCustomResourceDefinition/volumesnapshotlocations.velero.io: created\nWaiting for resources to be ready in cluster...\nNamespace/velero: attempting to create resource\nNamespace/velero: already exists, proceeding\nNamespace/velero: created\nClusterRoleBinding/velero: attempting to create resource\nClusterRoleBinding/velero: already exists, proceeding\nClusterRoleBinding/velero: created\nServiceAccount/velero: attempting to create resource\nServiceAccount/velero: created\nSecret/cloud-credentials: attempting to create resource\nSecret/cloud-credentials: created\nBackupStorageLocation/default: attempting to create 
resource\nBackupStorageLocation/default: created\nDeployment/velero: attempting to create resource\nDeployment/velero: created\nVelero is installed! ? Use 'kubectl logs deployment/velero -n velero' to view the status.", "stdout_lines": ["CustomResourceDefinition/backups.velero.io: attempting to create resource", "CustomResourceDefinition/backups.velero.io: already exists, proceeding", "CustomResourceDefinition/backups.velero.io: created", "CustomResourceDefinition/backupstoragelocations.velero.io: attempting to create resource", "CustomResourceDefinition/backupstoragelocations.velero.io: already exists, proceeding", "CustomResourceDefinition/backupstoragelocations.velero.io: created", "CustomResourceDefinition/deletebackuprequests.velero.io: attempting to create resource", "CustomResourceDefinition/deletebackuprequests.velero.io: already exists, proceeding", "CustomResourceDefinition/deletebackuprequests.velero.io: created", "CustomResourceDefinition/downloadrequests.velero.io: attempting to create resource", "CustomResourceDefinition/downloadrequests.velero.io: already exists, proceeding", "CustomResourceDefinition/downloadrequests.velero.io: created", "CustomResourceDefinition/podvolumebackups.velero.io: attempting to create resource", "CustomResourceDefinition/podvolumebackups.velero.io: already exists, proceeding", "CustomResourceDefinition/podvolumebackups.velero.io: created", "CustomResourceDefinition/podvolumerestores.velero.io: attempting to create resource", "CustomResourceDefinition/podvolumerestores.velero.io: already exists, proceeding", "CustomResourceDefinition/podvolumerestores.velero.io: created", "CustomResourceDefinition/resticrepositories.velero.io: attempting to create resource", "CustomResourceDefinition/resticrepositories.velero.io: already exists, proceeding", "CustomResourceDefinition/resticrepositories.velero.io: created", "CustomResourceDefinition/restores.velero.io: attempting to create resource", "CustomResourceDefinition/restores.velero.io: already exists, proceeding", "CustomResourceDefinition/restores.velero.io: created", "CustomResourceDefinition/schedules.velero.io: attempting to create resource", "CustomResourceDefinition/schedules.velero.io: already exists, proceeding", "CustomResourceDefinition/schedules.velero.io: created", "CustomResourceDefinition/serverstatusrequests.velero.io: attempting to create resource", "CustomResourceDefinition/serverstatusrequests.velero.io: already exists, proceeding", "CustomResourceDefinition/serverstatusrequests.velero.io: created", "CustomResourceDefinition/volumesnapshotlocations.velero.io: attempting to create resource", "CustomResourceDefinition/volumesnapshotlocations.velero.io: already exists, proceeding", "CustomResourceDefinition/volumesnapshotlocations.velero.io: created", "Waiting for resources to be ready in cluster...", "Namespace/velero: attempting to create resource", "Namespace/velero: already exists, proceeding", "Namespace/velero: created", "ClusterRoleBinding/velero: attempting to create resource", "ClusterRoleBinding/velero: already exists, proceeding", "ClusterRoleBinding/velero: created", "ServiceAccount/velero: attempting to create resource", "ServiceAccount/velero: created", "Secret/cloud-credentials: attempting to create resource", "Secret/cloud-credentials: created", "BackupStorageLocation/default: attempting to create resource", "BackupStorageLocation/default: created", "Deployment/velero: attempting to create resource", "Deployment/velero: created", "Velero is installed! ? 
Use 'kubectl logs deployment/velero -n velero' to view the status."]}
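Broken out of the single-line cmd field above, the install invocation that produced this output was:

  velero install \
    --provider aws \
    --bucket velero \
    --secret-file ./credentials-velero \
    --use-volume-snapshots=false \
    --plugins velero/velero-plugin-for-aws:v1.0.0 \
    --backup-location-config region=minio,s3ForcePathStyle="true",s3Url=http://minio.velero.svc:9000,profile=test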
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:172
2020-06-02T13:35:59.246305 (delta: 3.213446) elapsed: 56.42065 ********
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing velero server inside cluster] *********************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:181
2020-06-02T13:35:59.303566 (delta: 0.057213) elapsed: 56.477911 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Checking for velero server status] ***************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:195
2020-06-02T13:35:59.362617 (delta: 0.059006) elapsed: 56.536962 *******
FAILED - RETRYING: Checking for velero server status (20 retries left).
FAILED - RETRYING: Checking for velero server status (19 retries left).
changed: [127.0.0.1] => {"attempts": 3, "changed": true, "cmd": "kubectl get pod -n velero -l component=velero -ojsonpath='{.items[0].status.phase}'", "delta": "0:00:01.254637", "end": "2020-06-02 13:36:13.393264", "rc": 0, "start": "2020-06-02 13:36:12.138627", "stderr": "", "stderr_lines": [], "stdout": "Running", "stdout_lines": ["Running"]}
TASK [Replacing profile spec in volumesnapshotlocation.yml] ********************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:204
2020-06-02T13:36:13.442327 (delta: 14.079665) elapsed: 70.616672 ******
changed: [127.0.0.1] => {"changed": true, "msg": "1 replacements made"}
TASK [Creating volume snapshot location] ***************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:211
2020-06-02T13:36:13.641080 (delta: 0.198708) elapsed: 70.815425 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl apply -f minio-volume-snapshot-location.yml", "delta": "0:00:01.533324", "end": "2020-06-02 13:36:15.380307", "failed_when_result": false, "rc": 0, "start": "2020-06-02 13:36:13.846983", "stderr": "", "stderr_lines": [], "stdout": "volumesnapshotlocation.velero.io/minio created", "stdout_lines": ["volumesnapshotlocation.velero.io/minio created"]}
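The rendered minio-volume-snapshot-location.yml itself is not echoed in this log; once applied, its content can be read back from the cluster:

  # inspect the VolumeSnapshotLocation the task just created
  kubectl get volumesnapshotlocation minio -n velero -o yaml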
TASK [Creating volume snapshot location] ***************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:217
2020-06-02T13:36:15.435689 (delta: 1.794563) elapsed: 72.610034 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating volume snapshot location] ***************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:223
2020-06-02T13:36:15.493815 (delta: 0.058082) elapsed: 72.66816 ********
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Installing development image of OpenEBS velero-plugin] *******************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:229
2020-06-02T13:36:15.549439 (delta: 0.055577) elapsed: 72.723784 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "velero plugin add openebs/velero-plugin:1.10.0", "delta": "0:00:01.153224", "end": "2020-06-02 13:36:16.853841", "rc": 0, "start": "2020-06-02 13:36:15.700617", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}
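velero plugin add patches the velero deployment (adding an init container for the plugin image), which rolls the server pod; that is why the next task waits for a new pod to come up. The rollout and the registered plugins can be checked directly:

  # confirm the deployment finished rolling after the plugin was added
  kubectl rollout status deployment/velero -n velero
  # list the plugins the server now has registered
  velero plugin get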
TASK [Waiting for new velero server pod to come up] ****************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:233
2020-06-02T13:36:16.907039 (delta: 1.357541) elapsed: 74.081384 *******
FAILED - RETRYING: Waiting for new velero server pod to come up (20 retries left).
FAILED - RETRYING: Waiting for new velero server pod to come up (19 retries left).
FAILED - RETRYING: Waiting for new velero server pod to come up (18 retries left).
FAILED - RETRYING: Waiting for new velero server pod to come up (17 retries left).
FAILED - RETRYING: Waiting for new velero server pod to come up (16 retries left).
changed: [127.0.0.1] => {"attempts": 6, "changed": true, "cmd": "kubectl get pod -n velero --no-headers -l deploy=velero | wc -l", "delta": "0:00:01.108283", "end": "2020-06-02 13:36:49.830282", "rc": 0, "start": "2020-06-02 13:36:48.721999", "stderr": "", "stderr_lines": [], "stdout": "1", "stdout_lines": ["1"]}
TASK [Checking for velero server status] ***************************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:242
2020-06-02T13:36:49.879825 (delta: 32.972713) elapsed: 107.05417 ******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get pod -n velero -l component=velero -ojsonpath='{.items[0].status.phase}'", "delta": "0:00:01.184318", "end": "2020-06-02 13:36:51.198408", "rc": 0, "start": "2020-06-02 13:36:50.014090", "stderr": "", "stderr_lines": [], "stdout": "Running", "stdout_lines": ["Running"]}
TASK [Checking for velero server container status] *****************************
task path: /experiments/functional/backup_and_restore/setup_dependency.yml:249
2020-06-02T13:36:51.254537 (delta: 1.374667) elapsed: 108.428882 ******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get pod -n velero -l component=velero -ojsonpath='{.items[0].status.containerStatuses[0].ready}'", "delta": "0:00:01.030236", "end": "2020-06-02 13:36:52.423443", "rc": 0, "start": "2020-06-02 13:36:51.393207", "stderr": "", "stderr_lines": [], "stdout": "true", "stdout_lines": ["true"]}
TASK [Get Application pod details] *********************************************
task path: /experiments/functional/backup_and_restore/test.yml:24
2020-06-02T13:36:52.476546 (delta: 1.221965) elapsed: 109.650891 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get pods -n app-percona -l name=percona --no-headers -o custom-columns=:metadata.name", "delta": "0:00:01.165485", "end": "2020-06-02 13:36:53.801593", "rc": 0, "start": "2020-06-02 13:36:52.636108", "stderr": "", "stderr_lines": [], "stdout": "percona-755f6678bf-j78cm", "stdout_lines": ["percona-755f6678bf-j78cm"]}
TASK [Create new database in the mysql] ****************************************
task path: /experiments/functional/backup_and_restore/test.yml:28
2020-06-02T13:36:53.849094 (delta: 1.372503) elapsed: 111.023439 ******
included: /utils/scm/applications/mysql/mysql_data_persistence.yml for 127.0.0.1
TASK [Create some test data in the mysql database] *****************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:4
2020-06-02T13:36:53.963959 (delta: 0.11482) elapsed: 111.138304 *******
changed: [127.0.0.1] => (item=mysql -uroot -pk8sDem0 -e 'create database backup;') => {"changed": true, "cmd": "kubectl exec percona-755f6678bf-j78cm -n app-percona -- mysql -uroot -pk8sDem0 -e 'create database backup;'", "delta": "0:00:01.330264", "end": "2020-06-02 13:36:55.452161", "failed_when_result": false, "item": "mysql -uroot -pk8sDem0 -e 'create database backup;'", "rc": 0, "start": "2020-06-02 13:36:54.121897", "stderr": "mysql: [Warning] Using a password on the command line interface can be insecure.", "stderr_lines": ["mysql: [Warning] Using a password on the command line interface can be insecure."], "stdout": "", "stdout_lines": []}
changed: [127.0.0.1] => (item=mysql -uroot -pk8sDem0 -e 'create table ttbl (Data VARCHAR(20));' backup) => {"changed": true, "cmd": "kubectl exec percona-755f6678bf-j78cm -n app-percona -- mysql -uroot -pk8sDem0 -e 'create table ttbl (Data VARCHAR(20));' backup", "delta": "0:00:01.479200", "end": "2020-06-02 13:36:57.063501", "failed_when_result": false, "item": "mysql -uroot -pk8sDem0 -e 'create table ttbl (Data VARCHAR(20));' backup", "rc": 0, "start": "2020-06-02 13:36:55.584301", "stderr": "mysql: [Warning] Using a password on the command line interface can be insecure.", "stderr_lines": ["mysql: [Warning] Using a password on the command line interface can be insecure."], "stdout": "", "stdout_lines": []}
changed: [127.0.0.1] => (item=mysql -uroot -pk8sDem0 -e 'insert into ttbl (Data) VALUES ("tdata");' backup) => {"changed": true, "cmd": "kubectl exec percona-755f6678bf-j78cm -n app-percona -- mysql -uroot -pk8sDem0 -e 'insert into ttbl (Data) VALUES (\"tdata\");' backup", "delta": "0:00:01.426467", "end": "2020-06-02 13:36:58.622504", "failed_when_result": false, "item": "mysql -uroot -pk8sDem0 -e 'insert into ttbl (Data) VALUES (\"tdata\");' backup", "rc": 0, "start": "2020-06-02 13:36:57.196037", "stderr": "mysql: [Warning] Using a password on the command line interface can be insecure.", "stderr_lines": ["mysql: [Warning] Using a password on the command line interface can be insecure."], "stdout": "", "stdout_lines": []}
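A quick read-back of the seeded row (pod name, namespace, and credentials taken from the commands above; the pod name is specific to this run) would look like:

  # verify the test data landed in the "backup" database
  kubectl exec percona-755f6678bf-j78cm -n app-percona -- \
    mysql -uroot -pk8sDem0 -e 'select * from ttbl;' backup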
TASK [Kill the application pod] ************************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:21
2020-06-02T13:36:58.674142 (delta: 4.710139) elapsed: 115.848487 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify if the application pod is deleted] ********************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:27
2020-06-02T13:36:58.719171 (delta: 0.044983) elapsed: 115.893516 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Obtain the newly created pod name for application] ***********************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:37
2020-06-02T13:36:58.764726 (delta: 0.045505) elapsed: 115.939071 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Checking application pod is in running state] ****************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:44
2020-06-02T13:36:58.811338 (delta: 0.046563) elapsed: 115.985683 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Get the container status of application.] ********************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:51
2020-06-02T13:36:58.856891 (delta: 0.045506) elapsed: 116.031236 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Check if db is ready for connections] ************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:61
2020-06-02T13:36:58.901313 (delta: 0.044363) elapsed: 116.075658 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Checking for the Corrupted tables] ***************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:68
2020-06-02T13:36:58.946773 (delta: 0.045414) elapsed: 116.121118 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify mysql data persistence] *******************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:77
2020-06-02T13:36:58.991996 (delta: 0.045176) elapsed: 116.166341 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Delete/drop MySQL database] **********************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:90
2020-06-02T13:36:59.049588 (delta: 0.057531) elapsed: 116.223933 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify successful db delete] *********************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:98
2020-06-02T13:36:59.172770 (delta: 0.123134) elapsed: 116.347115 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the application mount point] *************************************
task path: /experiments/functional/backup_and_restore/test.yml:40
2020-06-02T13:36:59.221353 (delta: 0.048544) elapsed: 116.395698 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Writing data on application mount point] *********************************
task path: /experiments/functional/backup_and_restore/test.yml:45
2020-06-02T13:36:59.272408 (delta: 0.051011) elapsed: 116.446753 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the application deployment name] *********************************
task path: /experiments/functional/backup_and_restore/test.yml:52
2020-06-02T13:36:59.326082 (delta: 0.053614) elapsed: 116.500427 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Fetching the volume name of application] *********************************
task path: /experiments/functional/backup_and_restore/test.yml:56
2020-06-02T13:36:59.379443 (delta: 0.053315) elapsed: 116.553788 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Annotating the application pod] ******************************************
task path: /experiments/functional/backup_and_restore/test.yml:61
2020-06-02T13:36:59.431310 (delta: 0.051785) elapsed: 116.605655 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup for the provided application] ****************************
task path: /experiments/functional/backup_and_restore/test.yml:66
2020-06-02T13:36:59.492348 (delta: 0.060984) elapsed: 116.666693 ******
included: /experiments/functional/backup_and_restore/backup-restore.yml for 127.0.0.1
TASK [Creating Backup using minio bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:6
2020-06-02T13:36:59.631028 (delta: 0.13863) elapsed: 116.805373 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "velero backup create percona-backup --include-namespaces=app-percona --snapshot-volumes --volume-snapshot-locations=minio", "delta": "0:00:01.128074", "end": "2020-06-02 13:37:00.922719", "rc": 0, "start": "2020-06-02 13:36:59.794645", "stderr": "", "stderr_lines": [], "stdout": "Backup request \"percona-backup\" submitted successfully.\nRun `velero backup describe percona-backup` or `velero backup logs percona-backup` for more details.", "stdout_lines": ["Backup request \"percona-backup\" submitted successfully.", "Run `velero backup describe percona-backup` or `velero backup logs percona-backup` for more details."]}
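As the command output itself suggests, the progress of this backup and any failures can be inspected with:

  velero backup describe percona-backup --details
  velero backup logs percona-backup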
TASK [Creating Backup using GCP bucket] ****************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:13
2020-06-02T13:37:00.986252 (delta: 1.355179) elapsed: 118.160597 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Local Backup] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:20
2020-06-02T13:37:01.051655 (delta: 0.06534) elapsed: 118.226 **********
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:31
2020-06-02T13:37:01.112081 (delta: 0.060372) elapsed: 118.286426 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:41
2020-06-02T13:37:01.168436 (delta: 0.056279) elapsed: 118.342781 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using minio bucket] ************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:54
2020-06-02T13:37:01.229798 (delta: 0.061293) elapsed: 118.404143 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using GCP bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:58
2020-06-02T13:37:01.280843 (delta: 0.050966) elapsed: 118.455188 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the state of Backup] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:66
2020-06-02T13:37:01.328370 (delta: 0.047478) elapsed: 118.502715 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:79
2020-06-02T13:37:01.372704 (delta: 0.044288) elapsed: 118.547049 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:85
2020-06-02T13:37:01.416918 (delta: 0.04417) elapsed: 118.591263 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:95
2020-06-02T13:37:01.459487 (delta: 0.042524) elapsed: 118.633832 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:105
2020-06-02T13:37:01.501302 (delta: 0.04177) elapsed: 118.675647 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:110
2020-06-02T13:37:01.552118 (delta: 0.050774) elapsed: 118.726463 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting restore name] ****************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:122
2020-06-02T13:37:01.596186 (delta: 0.044025) elapsed: 118.770531 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Checking the restore status] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:126
2020-06-02T13:37:01.639161 (delta: 0.042931) elapsed: 118.813506 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [include_tasks] ***********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:75
2020-06-02T13:37:01.691845 (delta: 0.052641) elapsed: 118.86619 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the backup name] *************************************************
task path: /experiments/functional/backup_and_restore/test.yml:79
2020-06-02T13:37:01.742569 (delta: 0.05068) elapsed: 118.916914 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the state of backup] *********************************************
task path: /experiments/functional/backup_and_restore/test.yml:83
2020-06-02T13:37:01.800700 (delta: 0.058081) elapsed: 118.975045 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [include_tasks] ***********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:90
2020-06-02T13:37:01.859058 (delta: 0.058305) elapsed: 119.033403 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Waiting for completed backup] ********************************************
task path: /experiments/functional/backup_and_restore/test.yml:95
2020-06-02T13:37:01.912024 (delta: 0.05289) elapsed: 119.086369 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the base backup name] ********************************************
task path: /experiments/functional/backup_and_restore/test.yml:102
2020-06-02T13:37:01.964491 (delta: 0.052418) elapsed: 119.138836 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting state of backup] *************************************************
task path: /experiments/functional/backup_and_restore/test.yml:108
2020-06-02T13:37:02.019908 (delta: 0.055367) elapsed: 119.194253 ******
included: /experiments/functional/backup_and_restore/backup-restore.yml for 127.0.0.1
TASK [Creating Backup using minio bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:6
2020-06-02T13:37:02.162034 (delta: 0.142077) elapsed: 119.336379 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup using GCP bucket] ****************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:13
2020-06-02T13:37:02.212311 (delta: 0.05023) elapsed: 119.386656 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Local Backup] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:20
2020-06-02T13:37:02.260974 (delta: 0.048615) elapsed: 119.435319 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:31
2020-06-02T13:37:02.311562 (delta: 0.050542) elapsed: 119.485907 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:41
2020-06-02T13:37:02.359305 (delta: 0.047696) elapsed: 119.53365 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using minio bucket] ************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:54
2020-06-02T13:37:02.406698 (delta: 0.047344) elapsed: 119.581043 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using GCP bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:58
2020-06-02T13:37:02.454714 (delta: 0.047958) elapsed: 119.629059 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the state of Backup] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:66
2020-06-02T13:37:02.503004 (delta: 0.048243) elapsed: 119.677349 ******
FAILED - RETRYING: Getting the state of Backup (100 retries left).
FAILED - RETRYING: Getting the state of Backup (99 retries left).
FAILED - RETRYING: Getting the state of Backup (98 retries left).
FAILED - RETRYING: Getting the state of Backup (97 retries left).
FAILED - RETRYING: Getting the state of Backup (96 retries left).
FAILED - RETRYING: Getting the state of Backup (95 retries left).
FAILED - RETRYING: Getting the state of Backup (94 retries left).
changed: [127.0.0.1] => {"attempts": 8, "changed": true, "cmd": "kubectl get backup percona-backup -n velero -o jsonpath='{.status.phase}'", "delta": "0:00:01.236844", "end": "2020-06-02 13:37:48.297869", "rc": 0, "start": "2020-06-02 13:37:47.061025", "stderr": "", "stderr_lines": [], "stdout": "Completed", "stdout_lines": ["Completed"]}
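The retry loop above polls the Backup CR's .status.phase with kubectl; the velero CLI reports the same state in one line:

  # CLI view of the status that the jsonpath query returns
  velero backup get percona-backup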
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:79
2020-06-02T13:37:48.351310 (delta: 45.848258) elapsed: 165.525655 *****
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:85
2020-06-02T13:37:48.397723 (delta: 0.046366) elapsed: 165.572068 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:95
2020-06-02T13:37:48.441949 (delta: 0.044175) elapsed: 165.616294 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:105
2020-06-02T13:37:48.489797 (delta: 0.0478) elapsed: 165.664142 ********
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:110
2020-06-02T13:37:48.631878 (delta: 0.142032) elapsed: 165.806223 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting restore name] ****************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:122
2020-06-02T13:37:48.678425 (delta: 0.046507) elapsed: 165.85277 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Checking the restore status] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:126
2020-06-02T13:37:48.723991 (delta: 0.045507) elapsed: 165.898336 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verifying for local snaphot deletion] ************************************
task path: /experiments/functional/backup_and_restore/test.yml:115
2020-06-02T13:37:48.770593 (delta: 0.046557) elapsed: 165.944938 ******
included: /experiments/functional/backup_and_restore/additional_verification.yml for 127.0.0.1
TASK [Getting the volume name] *************************************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:4
2020-06-02T13:37:48.910860 (delta: 0.140223) elapsed: 166.085205 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get pvc percona-mysql-claim -n app-percona -o jsonpath='{.spec.volumeName}'", "delta": "0:00:01.187592", "end": "2020-06-02 13:37:50.247744", "rc": 0, "start": "2020-06-02 13:37:49.060152", "stderr": "", "stderr_lines": [], "stdout": "pvc-624a7e35-a4d5-11ea-9ff8-005056987ecb", "stdout_lines": ["pvc-624a7e35-a4d5-11ea-9ff8-005056987ecb"]}
TASK [Getting the cstor backup name] *******************************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:8
2020-06-02T13:37:50.295961 (delta: 1.385057) elapsed: 167.470306 ******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get cstorbackup -n app-percona | grep percona-backup | awk '{print $1}'", "delta": "0:00:01.333111", "end": "2020-06-02 13:37:51.783913", "rc": 0, "start": "2020-06-02 13:37:50.450802", "stderr": "No resources found.", "stderr_lines": ["No resources found."], "stdout": "", "stdout_lines": []}
TASK [Getting the cstor pool UID] **********************************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:15
2020-06-02T13:37:51.840191 (delta: 1.544185) elapsed: 169.014536 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get cstorbackup -n app-percona -o jsonpath='{.metadata.labels.cstorpool\\.openebs\\.io/uid}'", "delta": "0:00:01.201025", "end": "2020-06-02 13:37:53.210851", "rc": 0, "start": "2020-06-02 13:37:52.009826", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}
TASK [Getting the CVR name from the corresponding pool uid] ********************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:19
2020-06-02T13:37:53.262145 (delta: 1.421909) elapsed: 170.43649 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get cvr -n openebs -l cstorpool.openebs.io/uid=,cstorvolume.openebs.io/name=pvc-624a7e35-a4d5-11ea-9ff8-005056987ecb -o jsonpath='{.items[*].metadata.name}'", "delta": "0:00:01.156702", "end": "2020-06-02 13:37:54.576307", "rc": 0, "start": "2020-06-02 13:37:53.419605", "stderr": "", "stderr_lines": [], "stdout": "", "stdout_lines": []}
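The empty stdout from the cstorbackup queries above means no CStorBackup resource matched in app-percona for this run, which is why the pool UID and CVR lookups came back empty as well. When debugging this by hand, a broader listing is useful (a sketch, using the CRDs and PV name already seen in this log):

  # look for cstor backup resources in every namespace rather than just app-percona
  kubectl get cstorbackup --all-namespaces
  # list CVRs for the volume without filtering on a (possibly empty) pool uid
  kubectl get cvr -n openebs -l cstorvolume.openebs.io/name=pvc-624a7e35-a4d5-11ea-9ff8-005056987ecb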
TASK [Getting the cstor pool label from cvr] ***********************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:23
2020-06-02T13:37:54.627614 (delta: 1.365415) elapsed: 171.801959 ******
TASK [Geting the pool pod name corresponding to the label] *********************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:29
2020-06-02T13:37:54.674409 (delta: 0.046739) elapsed: 171.848754 ******
TASK [Deleting the cstor pool pod] *********************************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:37
2020-06-02T13:37:54.719653 (delta: 0.045183) elapsed: 171.893998 ******
TASK [Verifying whether local snapshot is deleted] *****************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:48
2020-06-02T13:37:54.765270 (delta: 0.045569) elapsed: 171.939615 ******
TASK [Getting maya-pai-server pod name] ****************************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:61
2020-06-02T13:37:54.813388 (delta: 0.04807) elapsed: 171.987733 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Deleting the maya-api-server pod] ****************************************
task path: /experiments/functional/backup_and_restore/additional_verification.yml:65
2020-06-02T13:37:54.859726 (delta: 0.04629) elapsed: 172.034071 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Deleting Application] ****************************************************
task path: /experiments/functional/backup_and_restore/test.yml:124
2020-06-02T13:37:54.906512 (delta: 0.046727) elapsed: 172.080857 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl delete pvc,deploy,svc --all -n app-percona\n kubectl delete ns app-percona", "delta": "0:00:16.022987", "end": "2020-06-02 13:38:11.077310", "rc": 0, "start": "2020-06-02 13:37:55.054323", "stderr": "", "stderr_lines": [], "stdout": "persistentvolumeclaim \"percona-mysql-claim\" deleted\ndeployment.extensions \"percona\" deleted\nservice \"percona-mysql\" deleted\nnamespace \"app-percona\" deleted", "stdout_lines": ["persistentvolumeclaim \"percona-mysql-claim\" deleted", "deployment.extensions \"percona\" deleted", "service \"percona-mysql\" deleted", "namespace \"app-percona\" deleted"]}
TASK [Checking whether namespace is deleted] ***********************************
task path: /experiments/functional/backup_and_restore/test.yml:129
2020-06-02T13:38:11.127276 (delta: 16.220717) elapsed: 188.301621 *****
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get ns --no-headers -o custom-columns=:metadata.name", "delta": "0:00:01.209410", "end": "2020-06-02 13:38:12.494369", "rc": 0, "start": "2020-06-02 13:38:11.284959", "stderr": "", "stderr_lines": [], "stdout": "app-nfsv4-jiva\napp-nfsv4-ns\nbackup-percona-cstor\nbackup-percona-cstor-local\nbackup-percona-jiva\nbackup-percona-localpv\nbusybox\nbusybox-jiva-snap\nbusybox-snap\ndefault\ne2e\nfio\nfio-jiva\nkube-node-lease\nkube-public\nkube-system\nlitmus\nlogging\nmemleak-jiva\nopenebs\nopenshift\nopenshift-apiserver\nopenshift-apiserver-operator\nopenshift-authentication\nopenshift-authentication-operator\nopenshift-cloud-credential-operator\nopenshift-cluster-machine-approver\nopenshift-cluster-node-tuning-operator\nopenshift-cluster-samples-operator\nopenshift-cluster-storage-operator\nopenshift-cluster-version\nopenshift-config\nopenshift-config-managed\nopenshift-console\nopenshift-console-operator\nopenshift-controller-manager\nopenshift-controller-manager-operator\nopenshift-dns\nopenshift-dns-operator\nopenshift-etcd\nopenshift-image-registry\nopenshift-infra\nopenshift-ingress\nopenshift-ingress-operator\nopenshift-insights\nopenshift-kni-infra\nopenshift-kube-apiserver\nopenshift-kube-apiserver-operator\nopenshift-kube-controller-manager\nopenshift-kube-controller-manager-operator\nopenshift-kube-scheduler\nopenshift-kube-scheduler-operator\nopenshift-machine-api\nopenshift-machine-config-operator\nopenshift-marketplace\nopenshift-monitoring\nopenshift-multus\nopenshift-network-operator\nopenshift-node\nopenshift-openstack-infra\nopenshift-operator-lifecycle-manager\nopenshift-operators\nopenshift-sdn\nopenshift-service-ca\nopenshift-service-ca-operator\nopenshift-service-catalog-apiserver-operator\nopenshift-service-catalog-controller-manager-operator\npercona-target-nw-loss\npool-kill\npost-rebuild\ntarget-delay\nthick-provision-ns\nvelero", "stdout_lines": ["app-nfsv4-jiva", "app-nfsv4-ns", "backup-percona-cstor", "backup-percona-cstor-local", "backup-percona-jiva", "backup-percona-localpv", "busybox", "busybox-jiva-snap", "busybox-snap", "default", "e2e", "fio", "fio-jiva", "kube-node-lease", "kube-public", "kube-system", "litmus", "logging", "memleak-jiva", "openebs", "openshift", "openshift-apiserver", "openshift-apiserver-operator", "openshift-authentication", "openshift-authentication-operator", "openshift-cloud-credential-operator", "openshift-cluster-machine-approver", "openshift-cluster-node-tuning-operator", "openshift-cluster-samples-operator", "openshift-cluster-storage-operator", "openshift-cluster-version", "openshift-config", "openshift-config-managed", "openshift-console", "openshift-console-operator", "openshift-controller-manager", "openshift-controller-manager-operator", "openshift-dns", "openshift-dns-operator", "openshift-etcd", "openshift-image-registry", "openshift-infra", "openshift-ingress", "openshift-ingress-operator", "openshift-insights", "openshift-kni-infra", "openshift-kube-apiserver", "openshift-kube-apiserver-operator", "openshift-kube-controller-manager", "openshift-kube-controller-manager-operator", "openshift-kube-scheduler", "openshift-kube-scheduler-operator", "openshift-machine-api", "openshift-machine-config-operator", "openshift-marketplace", "openshift-monitoring", "openshift-multus", "openshift-network-operator", "openshift-node", "openshift-openstack-infra", "openshift-operator-lifecycle-manager", "openshift-operators", "openshift-sdn", "openshift-service-ca", 
"openshift-service-ca-operator", "openshift-service-catalog-apiserver-operator", "openshift-service-catalog-controller-manager-operator", "percona-target-nw-loss", "pool-kill", "post-rebuild", "target-delay", "thick-provision-ns", "velero"]}
TASK [Restoring Application] ***************************************************
task path: /experiments/functional/backup_and_restore/test.yml:146
2020-06-02T13:38:12.552970 (delta: 1.425651) elapsed: 189.727315 ******
included: /experiments/functional/backup_and_restore/backup-restore.yml for 127.0.0.1
TASK [Creating Backup using minio bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:6
2020-06-02T13:38:12.653488 (delta: 0.10047) elapsed: 189.827833 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup using GCP bucket] ****************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:13
2020-06-02T13:38:12.705703 (delta: 0.052166) elapsed: 189.880048 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Local Backup] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:20
2020-06-02T13:38:12.759394 (delta: 0.053642) elapsed: 189.933739 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:31
2020-06-02T13:38:12.818004 (delta: 0.058562) elapsed: 189.992349 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:41
2020-06-02T13:38:12.871353 (delta: 0.053301) elapsed: 190.045698 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using minio bucket] ************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:54
2020-06-02T13:38:12.926332 (delta: 0.054929) elapsed: 190.100677 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using GCP bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:58
2020-06-02T13:38:12.987064 (delta: 0.060685) elapsed: 190.161409 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the state of Backup] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:66
2020-06-02T13:38:13.066440 (delta: 0.079329) elapsed: 190.240785 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:79
2020-06-02T13:38:13.126995 (delta: 0.060496) elapsed: 190.30134 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl create ns app-percona", "delta": "0:00:01.170323", "end": "2020-06-02 13:38:14.453493", "failed_when_result": false, "rc": 0, "start": "2020-06-02 13:38:13.283170", "stderr": "", "stderr_lines": [], "stdout": "namespace/app-percona created", "stdout_lines": ["namespace/app-percona created"]}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:85
2020-06-02T13:38:14.526905 (delta: 1.399861) elapsed: 191.70125 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "velero restore create --from-backup percona-backup --restore-volumes=true", "delta": "0:00:01.182687", "end": "2020-06-02 13:38:15.875952", "rc": 0, "start": "2020-06-02 13:38:14.693265", "stderr": "", "stderr_lines": [], "stdout": "Restore request \"percona-backup-20200602133815\" submitted successfully.\nRun `velero restore describe percona-backup-20200602133815` or `velero restore logs percona-backup-20200602133815` for more details.", "stdout_lines": ["Restore request \"percona-backup-20200602133815\" submitted successfully.", "Run `velero restore describe percona-backup-20200602133815` or `velero restore logs percona-backup-20200602133815` for more details."]}
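The restore itself goes through the Velero CLI, which recreates the backed-up namespace contents and, with --restore-volumes=true, the volume data from the percona-backup backup:

    # Submit a restore from the existing backup, including volume data.
    velero restore create --from-backup percona-backup --restore-volumes=true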
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:95
2020-06-02T13:38:15.923135 (delta: 1.396162) elapsed: 193.09748 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:105
2020-06-02T13:38:15.978383 (delta: 0.055203) elapsed: 193.152728 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:110
2020-06-02T13:38:16.035721 (delta: 0.057295) elapsed: 193.210066 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting restore name] ****************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:122
2020-06-02T13:38:16.092607 (delta: 0.056838) elapsed: 193.266952 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Checking the restore status] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:126
2020-06-02T13:38:16.151210 (delta: 0.058543) elapsed: 193.325555 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting state of restore] ************************************************
task path: /experiments/functional/backup_and_restore/test.yml:152
2020-06-02T13:38:16.204009 (delta: 0.052752) elapsed: 193.378354 ******
included: /experiments/functional/backup_and_restore/backup-restore.yml for 127.0.0.1
TASK [Creating Backup using minio bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:6
2020-06-02T13:38:16.317000 (delta: 0.112941) elapsed: 193.491345 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup using GCP bucket] ****************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:13
2020-06-02T13:38:16.393934 (delta: 0.076886) elapsed: 193.568279 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Local Backup] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:20
2020-06-02T13:38:16.461788 (delta: 0.067775) elapsed: 193.636133 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:31
2020-06-02T13:38:16.517176 (delta: 0.055337) elapsed: 193.691521 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating Backup] *********************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:41
2020-06-02T13:38:16.575987 (delta: 0.058758) elapsed: 193.750332 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using minio bucket] ************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:54
2020-06-02T13:38:16.630965 (delta: 0.054928) elapsed: 193.80531 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating schedule using GCP bucket] **************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:58
2020-06-02T13:38:16.690112 (delta: 0.059099) elapsed: 193.864457 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the state of Backup] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:66
2020-06-02T13:38:16.747041 (delta: 0.056881) elapsed: 193.921386 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:79
2020-06-02T13:38:16.800820 (delta: 0.053731) elapsed: 193.975165 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:85
2020-06-02T13:38:16.855653 (delta: 0.054782) elapsed: 194.029998 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:95
2020-06-02T13:38:16.936115 (delta: 0.08041) elapsed: 194.11046 ********
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Creating application namespace] ******************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:105
2020-06-02T13:38:17.114052 (delta: 0.177864) elapsed: 194.288397 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Restoring application] ***************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:110
2020-06-02T13:38:17.186506 (delta: 0.072414) elapsed: 194.360851 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting restore name] ****************************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:122
2020-06-02T13:38:17.242541 (delta: 0.055978) elapsed: 194.416886 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "velero get restore | grep percona-backup | awk '{print $1}'", "delta": "0:00:01.126220", "end": "2020-06-02 13:38:18.557308", "rc": 0, "start": "2020-06-02 13:38:17.431088", "stderr": "", "stderr_lines": [], "stdout": "percona-backup-20200602133815", "stdout_lines": ["percona-backup-20200602133815"]}
TASK [Checking the restore status] *********************************************
task path: /experiments/functional/backup_and_restore/backup-restore.yml:126
2020-06-02T13:38:18.606614 (delta: 1.364025) elapsed: 195.780959 ******
FAILED - RETRYING: Checking the restore status (60 retries left).
FAILED - RETRYING: Checking the restore status (59 retries left).
FAILED - RETRYING: Checking the restore status (58 retries left).
changed: [127.0.0.1] => {"attempts": 4, "changed": true, "cmd": "kubectl get restore percona-backup-20200602133815 -n velero -o jsonpath='{.status.phase}'", "delta": "0:00:01.179767", "end": "2020-06-02 13:38:38.856950", "rc": 0, "start": "2020-06-02 13:38:37.677183", "stderr": "", "stderr_lines": [], "stdout": "Completed", "stdout_lines": ["Completed"]}
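Velero names the restore <backup-name>-<timestamp>, so the playbook first recovers the generated name from `velero get restore` and then polls the Restore CR in the velero namespace until .status.phase reports Completed (up to 60 retries are allowed; this run completed on the fourth attempt). Roughly:

    # Resolve the generated restore name, then wait for it to finish.
    restore=$(velero get restore | grep percona-backup | awk '{print $1}')
    until [ "$(kubectl get restore "$restore" -n velero -o jsonpath='{.status.phase}')" = "Completed" ]; do
      sleep 5   # poll interval is illustrative
    done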
TASK [Restoring Application] ***************************************************
task path: /experiments/functional/backup_and_restore/test.yml:162
2020-06-02T13:38:38.913892 (delta: 20.30723) elapsed: 216.088237 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting state of restore] ************************************************
task path: /experiments/functional/backup_and_restore/test.yml:168
2020-06-02T13:38:38.966315 (delta: 0.052368) elapsed: 216.14066 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Getting the volume name] *************************************************
task path: /experiments/functional/backup_and_restore/test.yml:180
2020-06-02T13:38:39.018112 (delta: 0.051749) elapsed: 216.192457 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get pvc percona-mysql-claim -n app-percona -o jsonpath='{.spec.volumeName}'", "delta": "0:00:01.203007", "end": "2020-06-02 13:38:40.374375", "rc": 0, "start": "2020-06-02 13:38:39.171368", "stderr": "", "stderr_lines": [], "stdout": "pvc-500c764b-a4d6-11ea-9ff8-005056987ecb", "stdout_lines": ["pvc-500c764b-a4d6-11ea-9ff8-005056987ecb"]}
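With the PVC restored, its bound PV name is read from .spec.volumeName; every post-restore check that follows (target service, CVRs, pool datasets) is keyed off this name:

    # PV name backing the restored claim (pvc-500c764b-... in this run).
    pv=$(kubectl get pvc percona-mysql-claim -n app-percona -o jsonpath='{.spec.volumeName}')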
TASK [include_tasks] ***********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:184
2020-06-02T13:38:40.425100 (delta: 1.406941) elapsed: 217.599445 ******
included: /experiments/functional/backup_and_restore/cstor_post_restore.yml for 127.0.0.1
TASK [Getting the target IP] ***************************************************
task path: /experiments/functional/backup_and_restore/cstor_post_restore.yml:2
2020-06-02T13:38:40.525950 (delta: 0.100801) elapsed: 217.700295 ******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get svc -n openebs --no-headers -l openebs.io/persistent-volume=pvc-500c764b-a4d6-11ea-9ff8-005056987ecb -o jsonpath='{.items[0].spec.clusterIP}'", "delta": "0:00:01.217894", "end": "2020-06-02 13:38:41.909866", "rc": 0, "start": "2020-06-02 13:38:40.691972", "stderr": "", "stderr_lines": [], "stdout": "172.22.152.73", "stdout_lines": ["172.22.152.73"]}
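For a cStor volume the iSCSI target is a Service in the openebs namespace labelled with the PV name, and its clusterIP is what the replicas must point at after the restore. The lookup from this run (the PV name is specific to this run):

    # Cluster IP of the cStor target service for the restored PV.
    target_ip=$(kubectl get svc -n openebs --no-headers \
      -l openebs.io/persistent-volume=pvc-500c764b-a4d6-11ea-9ff8-005056987ecb \
      -o jsonpath='{.items[0].spec.clusterIP}')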
TASK [Getting the CVR name from the corresponding PV] **************************
task path: /experiments/functional/backup_and_restore/cstor_post_restore.yml:9
2020-06-02T13:38:41.971688 (delta: 1.445692) elapsed: 219.146033 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get cvr -n openebs -l openebs.io/persistent-volume=pvc-500c764b-a4d6-11ea-9ff8-005056987ecb -o jsonpath='{range.items[*]}{.metadata.labels.cstorpool\\.openebs\\.io/name}{\"\\n\"}{end}'", "delta": "0:00:01.164430", "end": "2020-06-02 13:38:43.293152", "rc": 0, "start": "2020-06-02 13:38:42.128722", "stderr": "", "stderr_lines": [], "stdout": "cstor-block-disk-pool-stripe-ge4f\ncstor-block-disk-pool-stripe-opa4\ncstor-block-disk-pool-stripe-vp8q", "stdout_lines": ["cstor-block-disk-pool-stripe-ge4f", "cstor-block-disk-pool-stripe-opa4", "cstor-block-disk-pool-stripe-vp8q"]}
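Each replica of the volume is a CVR carrying a label that names the cStor pool it lives on; iterating the CVRs with a jsonpath range yields the three pool names used in the next steps:

    # One pool name per replica of the restored volume.
    kubectl get cvr -n openebs \
      -l openebs.io/persistent-volume=pvc-500c764b-a4d6-11ea-9ff8-005056987ecb \
      -o jsonpath='{range .items[*]}{.metadata.labels.cstorpool\.openebs\.io/name}{"\n"}{end}'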
TASK [Getting the CSP name corresponding to the CVR] ***************************
task path: /experiments/functional/backup_and_restore/cstor_post_restore.yml:13
2020-06-02T13:38:43.341836 (delta: 1.370097) elapsed: 220.516181 ******
changed: [127.0.0.1] => (item=cstor-block-disk-pool-stripe-ge4f) => {"changed": true, "cmd": "kubectl get csp cstor-block-disk-pool-stripe-ge4f -n openebs -o jsonpath='{.metadata.name}'", "delta": "0:00:01.185971", "end": "2020-06-02 13:38:44.746720", "item": "cstor-block-disk-pool-stripe-ge4f", "rc": 0, "start": "2020-06-02 13:38:43.560749", "stderr": "", "stderr_lines": [], "stdout": "cstor-block-disk-pool-stripe-ge4f", "stdout_lines": ["cstor-block-disk-pool-stripe-ge4f"]}
changed: [127.0.0.1] => (item=cstor-block-disk-pool-stripe-opa4) => {"changed": true, "cmd": "kubectl get csp cstor-block-disk-pool-stripe-opa4 -n openebs -o jsonpath='{.metadata.name}'", "delta": "0:00:01.220414", "end": "2020-06-02 13:38:46.118804", "item": "cstor-block-disk-pool-stripe-opa4", "rc": 0, "start": "2020-06-02 13:38:44.898390", "stderr": "", "stderr_lines": [], "stdout": "cstor-block-disk-pool-stripe-opa4", "stdout_lines": ["cstor-block-disk-pool-stripe-opa4"]}
changed: [127.0.0.1] => (item=cstor-block-disk-pool-stripe-vp8q) => {"changed": true, "cmd": "kubectl get csp cstor-block-disk-pool-stripe-vp8q -n openebs -o jsonpath='{.metadata.name}'", "delta": "0:00:01.180069", "end": "2020-06-02 13:38:47.467994", "item": "cstor-block-disk-pool-stripe-vp8q", "rc": 0, "start": "2020-06-02 13:38:46.287925", "stderr": "", "stderr_lines": [], "stdout": "cstor-block-disk-pool-stripe-vp8q", "stdout_lines": ["cstor-block-disk-pool-stripe-vp8q"]}
TASK [Getting the pool pod name] ***********************************************
task path: /experiments/functional/backup_and_restore/cstor_post_restore.yml:18
2020-06-02T13:38:47.533060 (delta: 4.191175) elapsed: 224.707405 ******
changed: [127.0.0.1] => (item=cstor-block-disk-pool-stripe-ge4f) => {"changed": true, "cmd": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-ge4f | awk '{print $1}'", "delta": "0:00:01.392297", "end": "2020-06-02 13:38:49.114583", "item": "cstor-block-disk-pool-stripe-ge4f", "rc": 0, "start": "2020-06-02 13:38:47.722286", "stderr": "", "stderr_lines": [], "stdout": "cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb", "stdout_lines": ["cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb"]}
changed: [127.0.0.1] => (item=cstor-block-disk-pool-stripe-opa4) => {"changed": true, "cmd": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-opa4 | awk '{print $1}'", "delta": "0:00:01.199336", "end": "2020-06-02 13:38:50.468004", "item": "cstor-block-disk-pool-stripe-opa4", "rc": 0, "start": "2020-06-02 13:38:49.268668", "stderr": "", "stderr_lines": [], "stdout": "cstor-block-disk-pool-stripe-opa4-795b7bb5b7-w4px2", "stdout_lines": ["cstor-block-disk-pool-stripe-opa4-795b7bb5b7-w4px2"]}
changed: [127.0.0.1] => (item=cstor-block-disk-pool-stripe-vp8q) => {"changed": true, "cmd": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-vp8q | awk '{print $1}'", "delta": "0:00:01.171674", "end": "2020-06-02 13:38:51.800212", "item": "cstor-block-disk-pool-stripe-vp8q", "rc": 0, "start": "2020-06-02 13:38:50.628538", "stderr": "", "stderr_lines": [], "stdout": "cstor-block-disk-pool-stripe-vp8q-5658b85649-j5hdr", "stdout_lines": ["cstor-block-disk-pool-stripe-vp8q-5658b85649-j5hdr"]}
TASK [Getting the volume name inside cstor pool] *******************************
task path: /experiments/functional/backup_and_restore/cstor_post_restore.yml:23
2020-06-02T13:38:51.851541 (delta: 4.31843) elapsed: 229.025886 *******
changed: [127.0.0.1] => (item={'_ansible_parsed': True, 'stderr_lines': [], u'changed': True, u'stdout': u'cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb', '_ansible_item_result': True, u'delta': u'0:00:01.392297', 'stdout_lines': [u'cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb'], '_ansible_item_label': u'cstor-block-disk-pool-stripe-ge4f', u'end': u'2020-06-02 13:38:49.114583', '_ansible_no_log': False, 'failed': False, u'cmd': u"kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-ge4f | awk '{print $1}'", 'item': u'cstor-block-disk-pool-stripe-ge4f', u'stderr': u'', u'rc': 0, u'invocation': {u'module_args': {u'warn': True, u'executable': None, u'_uses_shell': True, u'_raw_params': u"kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-ge4f | awk '{print $1}'", u'removes': None, u'argv': None, u'creates': None, u'chdir': None, u'stdin': None}}, u'start': u'2020-06-02 13:38:47.722286', '_ansible_ignore_errors': None}) => {"changed": true, "cmd": "echo $(kubectl exec -it cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb -n openebs -c cstor-pool -- zfs list | grep pvc-500c764b-a4d6-11ea-9ff8-005056987ecb | awk '{print $1}') >> cstor-vol", "delta": "0:00:01.417809", "end": "2020-06-02 13:38:53.449337", "item": {"changed": true, "cmd": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-ge4f | awk '{print $1}'", "delta": "0:00:01.392297", "end": "2020-06-02 13:38:49.114583", "failed": false, "invocation": {"module_args": {"_raw_params": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-ge4f | awk '{print $1}'", "_uses_shell": true, "argv": null, "chdir": null, "creates": null, "executable": null, "removes": null, "stdin": null, "warn": true}}, "item": "cstor-block-disk-pool-stripe-ge4f", "rc": 0, "start": "2020-06-02 13:38:47.722286", "stderr": "", "stderr_lines": [], "stdout": "cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb", "stdout_lines": ["cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb"]}, "rc": 0, "start": "2020-06-02 13:38:52.031528", "stderr": "Unable to use a TTY - input is not a terminal or the right kind of file", "stderr_lines": ["Unable to use a TTY - input is not a terminal or the right kind of file"], "stdout": "", "stdout_lines": []}
changed: [127.0.0.1] => (item={'_ansible_parsed': True, 'stderr_lines': [], u'changed': True, u'stdout': u'cstor-block-disk-pool-stripe-opa4-795b7bb5b7-w4px2', '_ansible_item_result': True, u'delta': u'0:00:01.199336', 'stdout_lines': [u'cstor-block-disk-pool-stripe-opa4-795b7bb5b7-w4px2'], '_ansible_item_label': u'cstor-block-disk-pool-stripe-opa4', u'end': u'2020-06-02 13:38:50.468004', '_ansible_no_log': False, 'failed': False, u'cmd': u"kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-opa4 | awk '{print $1}'", 'item': u'cstor-block-disk-pool-stripe-opa4', u'stderr': u'', u'rc': 0, u'invocation': {u'module_args': {u'warn': True, u'executable': None, u'_uses_shell': True, u'_raw_params': u"kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-opa4 | awk '{print $1}'", u'removes': None, u'argv': None, u'creates': None, u'chdir': None, u'stdin': None}}, u'start': u'2020-06-02 13:38:49.268668', '_ansible_ignore_errors': None}) => {"changed": true, "cmd": "echo $(kubectl exec -it cstor-block-disk-pool-stripe-opa4-795b7bb5b7-w4px2 -n openebs -c cstor-pool -- zfs list | grep pvc-500c764b-a4d6-11ea-9ff8-005056987ecb | awk '{print $1}') >> cstor-vol", "delta": "0:00:01.342765", "end": "2020-06-02 13:38:54.960929", "item": {"changed": true, "cmd": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-opa4 | awk '{print $1}'", "delta": "0:00:01.199336", "end": "2020-06-02 13:38:50.468004", "failed": false, "invocation": {"module_args": {"_raw_params": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-opa4 | awk '{print $1}'", "_uses_shell": true, "argv": null, "chdir": null, "creates": null, "executable": null, "removes": null, "stdin": null, "warn": true}}, "item": "cstor-block-disk-pool-stripe-opa4", "rc": 0, "start": "2020-06-02 13:38:49.268668", "stderr": "", "stderr_lines": [], "stdout": "cstor-block-disk-pool-stripe-opa4-795b7bb5b7-w4px2", "stdout_lines": ["cstor-block-disk-pool-stripe-opa4-795b7bb5b7-w4px2"]}, "rc": 0, "start": "2020-06-02 13:38:53.618164", "stderr": "Unable to use a TTY - input is not a terminal or the right kind of file", "stderr_lines": ["Unable to use a TTY - input is not a terminal or the right kind of file"], "stdout": "", "stdout_lines": []}
changed: [127.0.0.1] => (item={'_ansible_parsed': True, 'stderr_lines': [], u'changed': True, u'stdout': u'cstor-block-disk-pool-stripe-vp8q-5658b85649-j5hdr', '_ansible_item_result': True, u'delta': u'0:00:01.171674', 'stdout_lines': [u'cstor-block-disk-pool-stripe-vp8q-5658b85649-j5hdr'], '_ansible_item_label': u'cstor-block-disk-pool-stripe-vp8q', u'end': u'2020-06-02 13:38:51.800212', '_ansible_no_log': False, 'failed': False, u'cmd': u"kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-vp8q | awk '{print $1}'", 'item': u'cstor-block-disk-pool-stripe-vp8q', u'stderr': u'', u'rc': 0, u'invocation': {u'module_args': {u'warn': True, u'executable': None, u'_uses_shell': True, u'_raw_params': u"kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-vp8q | awk '{print $1}'", u'removes': None, u'argv': None, u'creates': None, u'chdir': None, u'stdin': None}}, u'start': u'2020-06-02 13:38:50.628538', '_ansible_ignore_errors': None}) => {"changed": true, "cmd": "echo $(kubectl exec -it cstor-block-disk-pool-stripe-vp8q-5658b85649-j5hdr -n openebs -c cstor-pool -- zfs list | grep pvc-500c764b-a4d6-11ea-9ff8-005056987ecb | awk '{print $1}') >> cstor-vol", "delta": "0:00:01.374989", "end": "2020-06-02 13:38:56.490139", "item": {"changed": true, "cmd": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-vp8q | awk '{print $1}'", "delta": "0:00:01.171674", "end": "2020-06-02 13:38:51.800212", "failed": false, "invocation": {"module_args": {"_raw_params": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-vp8q | awk '{print $1}'", "_uses_shell": true, "argv": null, "chdir": null, "creates": null, "executable": null, "removes": null, "stdin": null, "warn": true}}, "item": "cstor-block-disk-pool-stripe-vp8q", "rc": 0, "start": "2020-06-02 13:38:50.628538", "stderr": "", "stderr_lines": [], "stdout": "cstor-block-disk-pool-stripe-vp8q-5658b85649-j5hdr", "stdout_lines": ["cstor-block-disk-pool-stripe-vp8q-5658b85649-j5hdr"]}, "rc": 0, "start": "2020-06-02 13:38:55.115150", "stderr": "Unable to use a TTY - input is not a terminal or the right kind of file", "stderr_lines": ["Unable to use a TTY - input is not a terminal or the right kind of file"], "stdout": "", "stdout_lines": []}
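Inside each pool pod the replica is a ZFS dataset named <pool-uid>/<pv-name>. The task execs into the cstor-pool container, greps `zfs list` for the PV, and appends the dataset name to a local cstor-vol file that the next task reads back; the -it flags in the logged commands only produce the harmless "Unable to use a TTY" warnings. A per-pod sketch using one of this run's pool pods:

    # Record the replica dataset for the restored PV from one pool pod.
    kubectl exec cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb -n openebs -c cstor-pool -- zfs list \
      | grep pvc-500c764b-a4d6-11ea-9ff8-005056987ecb | awk '{print $1}' >> cstor-vol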
TASK [Assigning the cstor-pool volume name to variable] ************************
task path: /experiments/functional/backup_and_restore/cstor_post_restore.yml:28
2020-06-02T13:38:56.546983 (delta: 4.695383) elapsed: 233.721328 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "cat ./cstor-vol", "delta": "0:00:01.104166", "end": "2020-06-02 13:38:57.847308", "rc": 0, "start": "2020-06-02 13:38:56.743142", "stderr": "", "stderr_lines": [], "stdout": "cstor-fc73ae51-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb\ncstor-fc56eca3-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb\ncstor-fc21451f-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb", "stdout_lines": ["cstor-fc73ae51-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb", "cstor-fc56eca3-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb", "cstor-fc21451f-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb"]}
TASK [Setting target IP for volume inside cstor pool] **************************
task path: /experiments/functional/backup_and_restore/cstor_post_restore.yml:32
2020-06-02T13:38:57.898655 (delta: 1.351625) elapsed: 235.073 *********
changed: [127.0.0.1] => (item=[{'_ansible_parsed': True, 'stderr_lines': [], u'changed': True, u'stdout': u'cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb', '_ansible_item_result': True, u'delta': u'0:00:01.392297', 'stdout_lines': [u'cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb'], '_ansible_item_label': u'cstor-block-disk-pool-stripe-ge4f', u'end': u'2020-06-02 13:38:49.114583', '_ansible_no_log': False, u'start': u'2020-06-02 13:38:47.722286', 'failed': False, u'cmd': u"kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-ge4f | awk '{print $1}'", 'item': u'cstor-block-disk-pool-stripe-ge4f', u'stderr': u'', u'rc': 0, u'invocation': {u'module_args': {u'creates': None, u'executable': None, u'_uses_shell': True, u'_raw_params': u"kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-ge4f | awk '{print $1}'", u'removes': None, u'argv': None, u'warn': True, u'chdir': None, u'stdin': None}}, '_ansible_ignore_errors': None}, u'cstor-fc73ae51-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb']) => {"changed": true, "cmd": "kubectl exec -it cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb -n openebs -- zfs set io.openebs:targetip=172.22.152.73 cstor-fc73ae51-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb", "delta": "0:00:01.565797", "end": "2020-06-02 13:38:59.664107", "item": [{"_ansible_ignore_errors": null, "_ansible_item_label": "cstor-block-disk-pool-stripe-ge4f", "_ansible_item_result": true, "_ansible_no_log": false, "_ansible_parsed": true, "changed": true, "cmd": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-ge4f | awk '{print $1}'", "delta": "0:00:01.392297", "end": "2020-06-02 13:38:49.114583", "failed": false, "invocation": {"module_args": {"_raw_params": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-ge4f | awk '{print $1}'", "_uses_shell": true, "argv": null, "chdir": null, "creates": null, "executable": null, "removes": null, "stdin": null, "warn": true}}, "item": "cstor-block-disk-pool-stripe-ge4f", "rc": 0, "start": "2020-06-02 13:38:47.722286", "stderr": "", "stderr_lines": [], "stdout": "cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb", "stdout_lines": ["cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb"]}, "cstor-fc73ae51-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb"], "rc": 0, "start": "2020-06-02 13:38:58.098310", "stderr": "Defaulting container name to cstor-pool.\nUse 'kubectl describe pod/cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb -n openebs' to see all of the containers in this pod.\nUnable to use a TTY - input is not a terminal or the right kind of file", "stderr_lines": ["Defaulting container name to cstor-pool.", "Use 'kubectl describe pod/cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb -n openebs' to see all of the containers in this pod.", "Unable to use a TTY - input is not a terminal or the right kind of file"], "stdout": "", "stdout_lines": []}
changed: [127.0.0.1] => (item=[{'_ansible_parsed': True, 'stderr_lines': [], u'changed': True, u'stdout': u'cstor-block-disk-pool-stripe-opa4-795b7bb5b7-w4px2', '_ansible_item_result': True, u'delta': u'0:00:01.199336', 'stdout_lines': [u'cstor-block-disk-pool-stripe-opa4-795b7bb5b7-w4px2'], '_ansible_item_label': u'cstor-block-disk-pool-stripe-opa4', u'end': u'2020-06-02 13:38:50.468004', '_ansible_no_log': False, u'start': u'2020-06-02 13:38:49.268668', 'failed': False, u'cmd': u"kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-opa4 | awk '{print $1}'", 'item': u'cstor-block-disk-pool-stripe-opa4', u'stderr': u'', u'rc': 0, u'invocation': {u'module_args': {u'creates': None, u'executable': None, u'_uses_shell': True, u'_raw_params': u"kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-opa4 | awk '{print $1}'", u'removes': None, u'argv': None, u'warn': True, u'chdir': None, u'stdin': None}}, '_ansible_ignore_errors': None}, u'cstor-fc56eca3-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb']) => {"changed": true, "cmd": "kubectl exec -it cstor-block-disk-pool-stripe-opa4-795b7bb5b7-w4px2 -n openebs -- zfs set io.openebs:targetip=172.22.152.73 cstor-fc56eca3-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb", "delta": "0:00:01.403872", "end": "2020-06-02 13:39:01.245033", "item": [{"_ansible_ignore_errors": null, "_ansible_item_label": "cstor-block-disk-pool-stripe-opa4", "_ansible_item_result": true, "_ansible_no_log": false, "_ansible_parsed": true, "changed": true, "cmd": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-opa4 | awk '{print $1}'", "delta": "0:00:01.199336", "end": "2020-06-02 13:38:50.468004", "failed": false, "invocation": {"module_args": {"_raw_params": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-opa4 | awk '{print $1}'", "_uses_shell": true, "argv": null, "chdir": null, "creates": null, "executable": null, "removes": null, "stdin": null, "warn": true}}, "item": "cstor-block-disk-pool-stripe-opa4", "rc": 0, "start": "2020-06-02 13:38:49.268668", "stderr": "", "stderr_lines": [], "stdout": "cstor-block-disk-pool-stripe-opa4-795b7bb5b7-w4px2", "stdout_lines": ["cstor-block-disk-pool-stripe-opa4-795b7bb5b7-w4px2"]}, "cstor-fc56eca3-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb"], "rc": 0, "start": "2020-06-02 13:38:59.841161", "stderr": "Defaulting container name to cstor-pool.\nUse 'kubectl describe pod/cstor-block-disk-pool-stripe-opa4-795b7bb5b7-w4px2 -n openebs' to see all of the containers in this pod.\nUnable to use a TTY - input is not a terminal or the right kind of file", "stderr_lines": ["Defaulting container name to cstor-pool.", "Use 'kubectl describe pod/cstor-block-disk-pool-stripe-opa4-795b7bb5b7-w4px2 -n openebs' to see all of the containers in this pod.", "Unable to use a TTY - input is not a terminal or the right kind of file"], "stdout": "", "stdout_lines": []}
changed: [127.0.0.1] => (item=[{'_ansible_parsed': True, 'stderr_lines': [], u'changed': True, u'stdout': u'cstor-block-disk-pool-stripe-vp8q-5658b85649-j5hdr', '_ansible_item_result': True, u'delta': u'0:00:01.171674', 'stdout_lines': [u'cstor-block-disk-pool-stripe-vp8q-5658b85649-j5hdr'], '_ansible_item_label': u'cstor-block-disk-pool-stripe-vp8q', u'end': u'2020-06-02 13:38:51.800212', '_ansible_no_log': False, u'start': u'2020-06-02 13:38:50.628538', 'failed': False, u'cmd': u"kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-vp8q | awk '{print $1}'", 'item': u'cstor-block-disk-pool-stripe-vp8q', u'stderr': u'', u'rc': 0, u'invocation': {u'module_args': {u'creates': None, u'executable': None, u'_uses_shell': True, u'_raw_params': u"kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-vp8q | awk '{print $1}'", u'removes': None, u'argv': None, u'warn': True, u'chdir': None, u'stdin': None}}, '_ansible_ignore_errors': None}, u'cstor-fc21451f-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb']) => {"changed": true, "cmd": "kubectl exec -it cstor-block-disk-pool-stripe-vp8q-5658b85649-j5hdr -n openebs -- zfs set io.openebs:targetip=172.22.152.73 cstor-fc21451f-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb", "delta": "0:00:01.375276", "end": "2020-06-02 13:39:02.773503", "item": [{"_ansible_ignore_errors": null, "_ansible_item_label": "cstor-block-disk-pool-stripe-vp8q", "_ansible_item_result": true, "_ansible_no_log": false, "_ansible_parsed": true, "changed": true, "cmd": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-vp8q | awk '{print $1}'", "delta": "0:00:01.171674", "end": "2020-06-02 13:38:51.800212", "failed": false, "invocation": {"module_args": {"_raw_params": "kubectl get pod -n openebs | grep cstor-block-disk-pool-stripe-vp8q | awk '{print $1}'", "_uses_shell": true, "argv": null, "chdir": null, "creates": null, "executable": null, "removes": null, "stdin": null, "warn": true}}, "item": "cstor-block-disk-pool-stripe-vp8q", "rc": 0, "start": "2020-06-02 13:38:50.628538", "stderr": "", "stderr_lines": [], "stdout": "cstor-block-disk-pool-stripe-vp8q-5658b85649-j5hdr", "stdout_lines": ["cstor-block-disk-pool-stripe-vp8q-5658b85649-j5hdr"]}, "cstor-fc21451f-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb"], "rc": 0, "start": "2020-06-02 13:39:01.398227", "stderr": "Defaulting container name to cstor-pool.\nUse 'kubectl describe pod/cstor-block-disk-pool-stripe-vp8q-5658b85649-j5hdr -n openebs' to see all of the containers in this pod.\nUnable to use a TTY - input is not a terminal or the right kind of file", "stderr_lines": ["Defaulting container name to cstor-pool.", "Use 'kubectl describe pod/cstor-block-disk-pool-stripe-vp8q-5658b85649-j5hdr -n openebs' to see all of the containers in this pod.", "Unable to use a TTY - input is not a terminal or the right kind of file"], "stdout": "", "stdout_lines": []}
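A restored cStor volume gets a fresh target service, so each replica dataset still carries the old target IP; this step rewrites the io.openebs:targetip ZFS property in every pool pod so the replicas reconnect to the new target (172.22.152.73 here). One of the three invocations from this run, with the container named explicitly to avoid the "Defaulting container name" notice seen in stderr:

    # Point one replica dataset at the new target service IP.
    kubectl exec cstor-block-disk-pool-stripe-ge4f-75dd775f8c-7cpkb -n openebs -c cstor-pool -- \
      zfs set io.openebs:targetip=172.22.152.73 \
      cstor-fc73ae51-a4ca-11ea-9ff8-005056987ecb/pvc-500c764b-a4d6-11ea-9ff8-005056987ecb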
TASK [Getting the volume name] *************************************************
task path: /experiments/functional/backup_and_restore/test.yml:193
2020-06-02T13:39:02.833153 (delta: 4.934447) elapsed: 240.007498 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [include_tasks] ***********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:197
2020-06-02T13:39:02.901201 (delta: 0.067994) elapsed: 240.075546 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Waiting for application to be in running state] **************************
task path: /experiments/functional/backup_and_restore/test.yml:208
2020-06-02T13:39:02.962748 (delta: 0.061484) elapsed: 240.137093 ******
FAILED - RETRYING: Waiting for application to be in running state (30 retries left).
FAILED - RETRYING: Waiting for application to be in running state (29 retries left).
changed: [127.0.0.1] => {"attempts": 3, "changed": true, "cmd": "kubectl get pod percona-755f6678bf-j78cm -n app-percona -o jsonpath='{.status.phase}'", "delta": "0:00:01.174775", "end": "2020-06-02 13:39:17.073657", "rc": 0, "start": "2020-06-02 13:39:15.898882", "stderr": "", "stderr_lines": [], "stdout": "Running", "stdout_lines": ["Running"]}
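The restored percona pod is then polled (up to 30 retries in this run) until its .status.phase reaches Running. An equivalent wait:

    # Wait for the restored application pod to come up.
    until [ "$(kubectl get pod percona-755f6678bf-j78cm -n app-percona -o jsonpath='{.status.phase}')" = "Running" ]; do
      sleep 5   # interval is illustrative
    done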
TASK [Verifying data persistence] **********************************************
task path: /experiments/functional/backup_and_restore/test.yml:215
2020-06-02T13:39:17.125145 (delta: 14.16235) elapsed: 254.29949 *******
included: /utils/scm/applications/mysql/mysql_data_persistence.yml for 127.0.0.1
TASK [Create some test data in the mysql database] *****************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:4
2020-06-02T13:39:17.225350 (delta: 0.100153) elapsed: 254.399695 ******
skipping: [127.0.0.1] => (item=mysql -uroot -pk8sDem0 -e 'create database backup;') => {"changed": false, "item": "mysql -uroot -pk8sDem0 -e 'create database backup;'", "skip_reason": "Conditional result was False"}
skipping: [127.0.0.1] => (item=mysql -uroot -pk8sDem0 -e 'create table ttbl (Data VARCHAR(20));' backup) => {"changed": false, "item": "mysql -uroot -pk8sDem0 -e 'create table ttbl (Data VARCHAR(20));' backup", "skip_reason": "Conditional result was False"}
skipping: [127.0.0.1] => (item=mysql -uroot -pk8sDem0 -e 'insert into ttbl (Data) VALUES ("tdata");' backup) => {"changed": false, "item": "mysql -uroot -pk8sDem0 -e 'insert into ttbl (Data) VALUES (\"tdata\");' backup", "skip_reason": "Conditional result was False"}
TASK [Kill the application pod] ************************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:21
2020-06-02T13:39:17.310474 (delta: 0.085078) elapsed: 254.484819 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl delete pod percona-755f6678bf-j78cm -n app-percona", "delta": "0:00:06.269962", "end": "2020-06-02 13:39:23.746277", "rc": 0, "start": "2020-06-02 13:39:17.476315", "stderr": "", "stderr_lines": [], "stdout": "pod \"percona-755f6678bf-j78cm\" deleted", "stdout_lines": ["pod \"percona-755f6678bf-j78cm\" deleted"]}
TASK [Verify if the application pod is deleted] ********************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:27
2020-06-02T13:39:23.797437 (delta: 6.486909) elapsed: 260.971782 ******
[WARNING]: when statements should not include jinja2 templating delimiters
such as {{ }} or {% %}. Found: "{{ pod_name }}" not in podstatus.stdout
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get pods -n app-percona", "delta": "0:00:01.205030", "end": "2020-06-02 13:39:25.178591", "rc": 0, "start": "2020-06-02 13:39:23.973561", "stderr": "", "stderr_lines": [], "stdout": "NAME READY STATUS RESTARTS AGE\npercona-755f6678bf-mv86f 0/1 ContainerCreating 0 7s", "stdout_lines": ["NAME READY STATUS RESTARTS AGE", "percona-755f6678bf-mv86f 0/1 ContainerCreating 0 7s"]}
TASK [Obtain the newly created pod name for application] ***********************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:37
2020-06-02T13:39:25.236495 (delta: 1.438991) elapsed: 262.41084 *******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl get pods -n app-percona -l name=percona -o jsonpath='{.items[].metadata.name}'", "delta": "0:00:01.150583", "end": "2020-06-02 13:39:26.554811", "rc": 0, "start": "2020-06-02 13:39:25.404228", "stderr": "", "stderr_lines": [], "stdout": "percona-755f6678bf-mv86f", "stdout_lines": ["percona-755f6678bf-mv86f"]}
TASK [Checking application pod is in running state] ****************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:44
2020-06-02T13:39:26.609220 (delta: 1.372678) elapsed: 263.783565 ******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get pods -n app-percona -o jsonpath='{.items[?(@.metadata.name==\"percona-755f6678bf-mv86f\")].status.phase}'", "delta": "0:00:01.310596", "end": "2020-06-02 13:39:28.086419", "rc": 0, "start": "2020-06-02 13:39:26.775823", "stderr": "", "stderr_lines": [], "stdout": "Running", "stdout_lines": ["Running"]}
TASK [Get the container status of application.] ********************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:51
2020-06-02T13:39:28.168004 (delta: 1.558734) elapsed: 265.342349 ******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl get pods -n app-percona -o jsonpath='{.items[?(@.metadata.name==\"percona-755f6678bf-mv86f\")].status.containerStatuses[].state}' | grep running", "delta": "0:00:01.229842", "end": "2020-06-02 13:39:29.584503", "rc": 0, "start": "2020-06-02 13:39:28.354661", "stderr": "", "stderr_lines": [], "stdout": "map[running:map[startedAt:2020-06-02T13:39:26Z]]", "stdout_lines": ["map[running:map[startedAt:2020-06-02T13:39:26Z]]"]}
TASK [Check if db is ready for connections] ************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:61
2020-06-02T13:39:29.636523 (delta: 1.468444) elapsed: 266.810868 ******
changed: [127.0.0.1] => {"attempts": 1, "changed": true, "cmd": "kubectl logs percona-755f6678bf-mv86f -n app-percona | grep 'ready for connections'", "delta": "0:00:01.413856", "end": "2020-06-02 13:39:31.212914", "rc": 0, "start": "2020-06-02 13:39:29.799058", "stderr": "", "stderr_lines": [], "stdout": "2020-06-02T13:39:27.740555Z 0 [Note] mysqld: ready for connections.", "stdout_lines": ["2020-06-02T13:39:27.740555Z 0 [Note] mysqld: ready for connections."]}
TASK [Checking for corrupted tables] *******************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:68
2020-06-02T13:39:31.267021 (delta: 1.630449) elapsed: 268.441366 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl exec percona-755f6678bf-mv86f -n app-percona -- mysqlcheck -c backup -uroot -pk8sDem0", "delta": "0:00:01.338225", "end": "2020-06-02 13:39:32.775345", "failed_when_result": false, "rc": 0, "start": "2020-06-02 13:39:31.437120", "stderr": "mysqlcheck: [Warning] Using a password on the command line interface can be insecure.", "stderr_lines": ["mysqlcheck: [Warning] Using a password on the command line interface can be insecure."], "stdout": "backup.ttbl OK", "stdout_lines": ["backup.ttbl OK"]}
TASK [Verify mysql data persistence] *******************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:77
2020-06-02T13:39:32.832397 (delta: 1.565326) elapsed: 270.006742 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "kubectl exec percona-755f6678bf-mv86f -n app-percona -- mysql -uroot -pk8sDem0 -e 'select * from ttbl' backup;", "delta": "0:00:01.521322", "end": "2020-06-02 13:39:34.519087", "failed_when_result": false, "rc": 0, "start": "2020-06-02 13:39:32.997765", "stderr": "mysql: [Warning] Using a password on the command line interface can be insecure.", "stderr_lines": ["mysql: [Warning] Using a password on the command line interface can be insecure."], "stdout": "Data\ntdata", "stdout_lines": ["Data", "tdata"]}
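Data persistence is verified by checking the restored database for corruption and reading back the row written before the backup (database backup, table ttbl, value tdata), both via kubectl exec into the new percona pod:

    # The table should check clean and still contain the pre-backup row.
    kubectl exec percona-755f6678bf-mv86f -n app-percona -- mysqlcheck -c backup -uroot -pk8sDem0
    kubectl exec percona-755f6678bf-mv86f -n app-percona -- mysql -uroot -pk8sDem0 -e 'select * from ttbl' backup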
TASK [Delete/drop MySQL database] **********************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:90
2020-06-02T13:39:34.579378 (delta: 1.74693) elapsed: 271.753723 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verify successful db delete] *********************************************
task path: /utils/scm/applications/mysql/mysql_data_persistence.yml:98
2020-06-02T13:39:34.634377 (delta: 0.054933) elapsed: 271.808722 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Waiting for application to be in running state] **************************
task path: /experiments/functional/backup_and_restore/test.yml:230
2020-06-02T13:39:34.693986 (delta: 0.05956) elapsed: 271.868331 *******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Verifying data persistence] **********************************************
task path: /experiments/functional/backup_and_restore/test.yml:237
2020-06-02T13:39:34.754959 (delta: 0.060923) elapsed: 271.929304 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [set_fact] ****************************************************************
task path: /experiments/functional/backup_and_restore/test.yml:251
2020-06-02T13:39:34.805921 (delta: 0.050914) elapsed: 271.980266 ******
ok: [127.0.0.1] => {"ansible_facts": {"flag": "Pass"}, "changed": false}
TASK [include_tasks] ***********************************************************
task path: /experiments/functional/backup_and_restore/test.yml:259
2020-06-02T13:39:34.867580 (delta: 0.061611) elapsed: 272.041925 ******
included: /utils/fcm/update_litmus_result_resource.yml for 127.0.0.1
TASK [Generate the litmus result CR to reflect SOT (Start of Test)] ************
task path: /utils/fcm/update_litmus_result_resource.yml:3
2020-06-02T13:39:34.981776 (delta: 0.114147) elapsed: 272.156121 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Analyze the cr yaml] *****************************************************
task path: /utils/fcm/update_litmus_result_resource.yml:14
2020-06-02T13:39:35.028436 (delta: 0.046613) elapsed: 272.202781 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Apply the litmus result CR] **********************************************
task path: /utils/fcm/update_litmus_result_resource.yml:17
2020-06-02T13:39:35.077342 (delta: 0.048838) elapsed: 272.251687 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Generate the litmus result CR to reflect EOT (End of Test)] **************
task path: /utils/fcm/update_litmus_result_resource.yml:27
2020-06-02T13:39:35.126738 (delta: 0.04935) elapsed: 272.301083 *******
changed: [127.0.0.1] => {"changed": true, "checksum": "218cc4cd06f4a20f4ce3c234b033d66800aa02b5", "dest": "./litmus-result.yaml", "gid": 0, "group": "root", "md5sum": "9f2161a292cb589ce3599bf99ea6818b", "mode": "0644", "owner": "root", "size": 420, "src": "/root/.ansible/tmp/ansible-tmp-1591105175.28-171965597197056/source", "state": "file", "uid": 0}
TASK [Analyze the cr yaml] *****************************************************
task path: /utils/fcm/update_litmus_result_resource.yml:38
2020-06-02T13:39:37.791851 (delta: 2.665067) elapsed: 274.966196 ******
changed: [127.0.0.1] => {"changed": true, "cmd": "cat litmus-result.yaml", "delta": "0:00:01.052966", "end": "2020-06-02 13:39:39.026624", "rc": 0, "start": "2020-06-02 13:39:37.973658", "stderr": "", "stderr_lines": [], "stdout": "---\napiVersion: litmus.io/v1alpha1\nkind: LitmusResult\nmetadata:\n\n # name of the litmus testcase\n name: velero-backup-restore \nspec:\n\n # holds information on the testcase\n testMetadata:\n app: \n chaostype: \n\n # holds the state of testcase, manually updated by json merge patch\n # result is the useful value today, but anticipate phase use in future \n testStatus: \n phase: completed \n result: Pass ", "stdout_lines": ["---", "apiVersion: litmus.io/v1alpha1", "kind: LitmusResult", "metadata:", "", " # name of the litmus testcase", " name: velero-backup-restore ", "spec:", "", " # holds information on the testcase", " testMetadata:", " app: ", " chaostype: ", "", " # holds the state of testcase, manually updated by json merge patch", " # result is the useful value today, but anticipate phase use in future ", " testStatus: ", " phase: completed ", " result: Pass "]}
TASK [Apply the litmus result CR] **********************************************
task path: /utils/fcm/update_litmus_result_resource.yml:41
2020-06-02T13:39:39.076899 (delta: 1.285006) elapsed: 276.251244 ******
changed: [127.0.0.1] => {"changed": true, "failed_when_result": false, "method": "patch", "result": {"apiVersion": "litmus.io/v1alpha1", "kind": "LitmusResult", "metadata": {"creationTimestamp": "2020-06-02T12:28:40Z", "generation": 4, "name": "velero-backup-restore", "resourceVersion": "1237850", "selfLink": "/apis/litmus.io/v1alpha1/litmusresults/velero-backup-restore", "uid": "96e7db5a-a4cc-11ea-9ff8-005056987ecb"}, "spec": {"testMetadata": {}, "testStatus": {"phase": "completed", "result": "Pass"}}}}
TASK [Deleting GCP bucket] *****************************************************
task path: /experiments/functional/backup_and_restore/test.yml:263
2020-06-02T13:39:39.830121 (delta: 0.753173) elapsed: 277.004466 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
TASK [Deprovisioning Velero server] ********************************************
task path: /experiments/functional/backup_and_restore/test.yml:267
2020-06-02T13:39:39.881807 (delta: 0.051636) elapsed: 277.056152 ******
skipping: [127.0.0.1] => {"changed": false, "skip_reason": "Conditional result was False"}
META: ran handlers
META: ran handlers
PLAY RECAP *********************************************************************
127.0.0.1 : ok=68 changed=53 unreachable=0 failed=0
2020-06-02T13:39:39.911400 (delta: 0.029543) elapsed: 277.085745 ******
===============================================================================