gsoc_project_error2
[root@viaq openshift-ansible]# ANSIBLE_LOG_PATH=/tmp/ansible.log ansible-playbook -vvv -e @/root/vars.yaml -i ansible-inventory playbooks/byo/config.yml
Using /etc/ansible/ansible.cfg as config file
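Note: the ansible-inventory file passed with -i is not part of this gist. Based on the groups evaluated later in the log (masters, nodes, and oo_first_master all resolving to localhost, no etcd/lb/nfs hosts, deployment type origin), an equivalent minimal inventory might look like the YAML sketch below. This is an assumption for illustration only; the real file is likely INI-format and may carry many more variables.

# Hypothetical minimal inventory consistent with this log; every value
# here is inferred, not copied from the actual ansible-inventory file.
all:
  children:
    OSEv3:
      children:
        masters:
          hosts:
            localhost:
              ansible_connection: local
        nodes:
          hosts:
            localhost:
              ansible_connection: local
      vars:
        openshift_deployment_type: origin   # matches the set_fact seen later in the log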
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_excluder/tasks/init.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_excluder/tasks/unexclude.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_excluder/tasks/install.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_excluder/tasks/exclude.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_excluder/tasks/unexclude.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml
statically included: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/etcd/tasks/system_container.yml
statically included: /usr/share/ansible/openshift-ansible/roles/etcd/tasks/etcdctl.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml
statically included: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_cloud_provider/tasks/openstack.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_cloud_provider/tasks/aws.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_cloud_provider/tasks/gce.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/nuage_master/tasks/serviceaccount.yml
statically included: /usr/share/ansible/openshift-ansible/roles/nuage_master/tasks/certificates.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_master/tasks/systemd_units.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_master/tasks/system_container.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_master/tasks/set_loopback_context.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_master_cluster/tasks/configure.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml
statically included: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_cloud_provider/tasks/openstack.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_cloud_provider/tasks/aws.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_cloud_provider/tasks/gce.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node_dnsmasq/tasks/./network-manager.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node_dnsmasq/tasks/./no-network-manager.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/systemd_units.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/node_system_container.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/openvswitch_system_container.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/storage_plugins/nfs.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/storage_plugins/glusterfs.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/storage_plugins/ceph.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/storage_plugins/iscsi.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml
statically included: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_cloud_provider/tasks/openstack.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_cloud_provider/tasks/aws.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_cloud_provider/tasks/gce.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node_dnsmasq/tasks/./network-manager.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node_dnsmasq/tasks/./no-network-manager.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/systemd_units.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/node_system_container.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/openvswitch_system_container.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/storage_plugins/nfs.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/storage_plugins/glusterfs.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/storage_plugins/ceph.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_node/tasks/storage_plugins/iscsi.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/nuage_node/tasks/certificates.yml
statically included: /usr/share/ansible/openshift-ansible/roles/nuage_node/tasks/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml
statically included: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml
statically included: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml
statically included: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_hosted/tasks/router/router.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_hosted/tasks/registry/registry.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_excluder/tasks/init.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_excluder/tasks/install.yml
statically included: /usr/share/ansible/openshift-ansible/roles/openshift_excluder/tasks/exclude.yml
PLAYBOOK: config.yml ***************************************************************************************************************************************************
28 plays in playbooks/byo/config.yml
PLAY [Create initial host groups for localhost] ************************************************************************************************************************
META: ran handlers
TASK [include_vars] ****************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/std_include.yml:10
ok: [localhost] => {
"ansible_facts": {
"g_all_hosts": "{{ g_master_hosts | union(g_node_hosts) | union(g_etcd_hosts) | union(g_lb_hosts) | union(g_nfs_hosts) | union(g_new_node_hosts)| union(g_new_master_hosts) | default([]) }}",
"g_etcd_hosts": "{{ groups.etcd | default([]) }}",
"g_lb_hosts": "{{ groups.lb | default([]) }}",
"g_master_hosts": "{{ groups.masters | default([]) }}",
"g_new_master_hosts": "{{ groups.new_masters | default([]) }}",
"g_new_node_hosts": "{{ groups.new_nodes | default([]) }}",
"g_nfs_hosts": "{{ groups.nfs | default([]) }}",
"g_node_hosts": "{{ groups.nodes | default([]) }}"
},
"changed": false
}
TASK [Evaluate group l_oo_all_hosts] ***********************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/std_include.yml:11
creating host via 'add_host': hostname=localhost
ok: [localhost] => (item=localhost) => {
"add_host": {
"groups": [
"l_oo_all_hosts"
],
"host_name": "localhost",
"host_vars": {}
},
"changed": false,
"item": "localhost"
}
META: ran handlers
META: ran handlers
PLAY [Create initial host groups for all hosts] ************************************************************************************************************************
META: ran handlers
TASK [include_vars] ****************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/std_include.yml:24
ok: [localhost] => {
"ansible_facts": {
"g_all_hosts": "{{ g_master_hosts | union(g_node_hosts) | union(g_etcd_hosts) | union(g_lb_hosts) | union(g_nfs_hosts) | union(g_new_node_hosts)| union(g_new_master_hosts) | default([]) }}",
"g_etcd_hosts": "{{ groups.etcd | default([]) }}",
"g_lb_hosts": "{{ groups.lb | default([]) }}",
"g_master_hosts": "{{ groups.masters | default([]) }}",
"g_new_master_hosts": "{{ groups.new_masters | default([]) }}",
"g_new_node_hosts": "{{ groups.new_nodes | default([]) }}",
"g_nfs_hosts": "{{ groups.nfs | default([]) }}",
"g_node_hosts": "{{ groups.nodes | default([]) }}"
},
"changed": false
}
TASK [set_fact] ********************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/std_include.yml:25
ok: [localhost] => {
"ansible_facts": {
"openshift_deployment_type": "origin"
},
"changed": false
}
META: ran handlers
META: ran handlers
PLAY [Populate config host groups] *************************************************************************************************************************************
META: ran handlers
TASK [fail] ************************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:8
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [fail] ************************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:12
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [fail] ************************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:16
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [fail] ************************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:20
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [fail] ************************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:24
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [fail] ************************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:28
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [Evaluate oo_all_hosts] *******************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:32
creating host via 'add_host': hostname=localhost
ok: [localhost] => (item=localhost) => {
"add_host": {
"groups": [
"oo_all_hosts"
],
"host_name": "localhost",
"host_vars": {}
},
"changed": false,
"item": "localhost"
}
TASK [Evaluate oo_masters] *********************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:41
creating host via 'add_host': hostname=localhost
ok: [localhost] => (item=localhost) => {
"add_host": {
"groups": [
"oo_masters"
],
"host_name": "localhost",
"host_vars": {}
},
"changed": false,
"item": "localhost"
}
TASK [Evaluate oo_etcd_to_config] **************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:50
TASK [Evaluate oo_masters_to_config] ***********************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:59
creating host via 'add_host': hostname=localhost
ok: [localhost] => (item=localhost) => {
"add_host": {
"groups": [
"oo_masters_to_config"
],
"host_name": "localhost",
"host_vars": {}
},
"changed": false,
"item": "localhost"
}
TASK [Evaluate oo_nodes_to_config] *************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:68
creating host via 'add_host': hostname=localhost
ok: [localhost] => (item=localhost) => {
"add_host": {
"groups": [
"oo_nodes_to_config"
],
"host_name": "localhost",
"host_vars": {}
},
"changed": false,
"item": "localhost"
}
TASK [Add master to oo_nodes_to_config] ********************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:78
skipping: [localhost] => (item=localhost) => {
"changed": false,
"item": "localhost",
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [Evaluate oo_first_etcd] ******************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:88
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [Evaluate oo_first_master] ****************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:96
creating host via 'add_host': hostname=localhost
ok: [localhost] => {
"add_host": {
"groups": [
"oo_first_master"
],
"host_name": "localhost",
"host_vars": {}
},
"changed": false
}
TASK [Evaluate oo_lb_to_config] ****************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:105
TASK [Evaluate oo_nfs_to_config] ***************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/evaluate_groups.yml:114
META: ran handlers
META: ran handlers
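Each "Evaluate ..." task in the play above re-exposes a statically defined g_* inventory group as a runtime oo_* group via add_host, so later plays can target it. A paraphrased sketch of the pattern, reconstructed from the output rather than copied from evaluate_groups.yml (exact parameters are assumptions):

# Reconstructed group-evaluation pattern; the loop source and any extra
# host_vars in the real task are assumptions.
- name: Evaluate oo_masters_to_config
  add_host:
    name: "{{ item }}"
    groups: oo_masters_to_config
  with_items: "{{ g_master_hosts | default([]) }}"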
PLAY [Ensure that all non-node hosts are accessible] *******************************************************************************************************************
TASK [Gathering Facts] *************************************************************************************************************************************************
Using module file /usr/lib/python2.7/site-packages/ansible/modules/system/setup.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367105.7-165107962052476 `" && echo ansible-tmp-1497367105.7-165107962052476="` echo /root/.ansible/tmp/ansible-tmp-1497367105.7-165107962052476 `" ) && sleep 0'
<localhost> PUT /tmp/tmpN0CxXx TO /root/.ansible/tmp/ansible-tmp-1497367105.7-165107962052476/setup.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367105.7-165107962052476/ /root/.ansible/tmp/ansible-tmp-1497367105.7-165107962052476/setup.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367105.7-165107962052476/setup.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367105.7-165107962052476/" > /dev/null 2>&1 && sleep 0'
ok: [localhost]
META: ran handlers
META: ran handlers
META: ran handlers
PLAY [Initialize host facts] *******************************************************************************************************************************************
TASK [Gathering Facts] *************************************************************************************************************************************************
Using module file /usr/lib/python2.7/site-packages/ansible/modules/system/setup.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367106.29-182243180370702 `" && echo ansible-tmp-1497367106.29-182243180370702="` echo /root/.ansible/tmp/ansible-tmp-1497367106.29-182243180370702 `" ) && sleep 0'
<localhost> PUT /tmp/tmpYq16Oh TO /root/.ansible/tmp/ansible-tmp-1497367106.29-182243180370702/setup.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367106.29-182243180370702/ /root/.ansible/tmp/ansible-tmp-1497367106.29-182243180370702/setup.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367106.29-182243180370702/setup.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367106.29-182243180370702/" > /dev/null 2>&1 && sleep 0'
ok: [localhost]
META: ran handlers
TASK [openshift_facts : Detecting Operating System] ********************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:2
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/stat.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367107.06-108066039614406 `" && echo ansible-tmp-1497367107.06-108066039614406="` echo /root/.ansible/tmp/ansible-tmp-1497367107.06-108066039614406 `" ) && sleep 0'
<localhost> PUT /tmp/tmpOBgNVo TO /root/.ansible/tmp/ansible-tmp-1497367107.06-108066039614406/stat.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367107.06-108066039614406/ /root/.ansible/tmp/ansible-tmp-1497367107.06-108066039614406/stat.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367107.06-108066039614406/stat.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367107.06-108066039614406/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
"changed": false,
"invocation": {
"module_args": {
"checksum_algorithm": "sha1",
"follow": false,
"get_attributes": true,
"get_checksum": true,
"get_md5": true,
"get_mime": true,
"path": "/run/ostree-booted"
}
},
"stat": {
"exists": false
}
}
TASK [openshift_facts : set_fact] **************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:8
ok: [localhost] => {
"ansible_facts": {
"l_is_atomic": false
},
"changed": false
}
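The two tasks above implement the Atomic Host check: stat /run/ostree-booted, then derive l_is_atomic from whether the file exists. A sketch of that logic (the path, task name, and fact name come from the log; the register variable name is an assumption):

- name: Detecting Operating System
  stat:
    path: /run/ostree-booted
  register: ostree_booted        # name assumed; the log only shows the stat result

- set_fact:
    l_is_atomic: "{{ ostree_booted.stat.exists }}"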
TASK [openshift_facts : set_fact] **************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:10
ok: [localhost] => {
"ansible_facts": {
"l_is_containerized": false,
"l_is_etcd_system_container": false,
"l_is_master_system_container": false,
"l_is_node_system_container": false,
"l_is_openvswitch_system_container": false
},
"changed": false
}
TASK [openshift_facts : set_fact] **************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:16
ok: [localhost] => {
"ansible_facts": {
"l_any_system_container": false
},
"changed": false
}
TASK [openshift_facts : Validate python version] ***********************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:19
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Validate python version] ***********************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:26
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Determine Atomic Host Docker Version] **********************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:39
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : assert] ****************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:43
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Ensure various deps are installed] *************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:50
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367108.49-110308861724427 `" && echo ansible-tmp-1497367108.49-110308861724427="` echo /root/.ansible/tmp/ansible-tmp-1497367108.49-110308861724427 `" ) && sleep 0'
<localhost> PUT /tmp/tmpywcs2G TO /root/.ansible/tmp/ansible-tmp-1497367108.49-110308861724427/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367108.49-110308861724427/ /root/.ansible/tmp/ansible-tmp-1497367108.49-110308861724427/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367108.49-110308861724427/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367108.49-110308861724427/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
ok: [localhost] => (item=iproute) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"iproute"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "iproute",
"rc": 0,
"results": [
"iproute-3.10.0-74.el7.x86_64 providing iproute is already installed"
]
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367109.1-220839630995130 `" && echo ansible-tmp-1497367109.1-220839630995130="` echo /root/.ansible/tmp/ansible-tmp-1497367109.1-220839630995130 `" ) && sleep 0'
<localhost> PUT /tmp/tmp9IKf21 TO /root/.ansible/tmp/ansible-tmp-1497367109.1-220839630995130/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367109.1-220839630995130/ /root/.ansible/tmp/ansible-tmp-1497367109.1-220839630995130/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367109.1-220839630995130/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367109.1-220839630995130/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
ok: [localhost] => (item=python-dbus) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"python-dbus"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "python-dbus",
"rc": 0,
"results": [
"dbus-python-1.1.1-9.el7.x86_64 providing python-dbus is already installed"
]
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367110.12-113846771358539 `" && echo ansible-tmp-1497367110.12-113846771358539="` echo /root/.ansible/tmp/ansible-tmp-1497367110.12-113846771358539 `" ) && sleep 0'
<localhost> PUT /tmp/tmpC5ey0J TO /root/.ansible/tmp/ansible-tmp-1497367110.12-113846771358539/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367110.12-113846771358539/ /root/.ansible/tmp/ansible-tmp-1497367110.12-113846771358539/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367110.12-113846771358539/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367110.12-113846771358539/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
ok: [localhost] => (item=python-six) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"python-six"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "python-six",
"rc": 0,
"results": [
"python-six-1.9.0-2.el7.noarch providing python-six is already installed"
]
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367110.67-40724649859999 `" && echo ansible-tmp-1497367110.67-40724649859999="` echo /root/.ansible/tmp/ansible-tmp-1497367110.67-40724649859999 `" ) && sleep 0'
<localhost> PUT /tmp/tmpnANe7h TO /root/.ansible/tmp/ansible-tmp-1497367110.67-40724649859999/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367110.67-40724649859999/ /root/.ansible/tmp/ansible-tmp-1497367110.67-40724649859999/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367110.67-40724649859999/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367110.67-40724649859999/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
ok: [localhost] => (item=PyYAML) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"PyYAML"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "PyYAML",
"rc": 0,
"results": [
"PyYAML-3.10-11.el7.x86_64 providing PyYAML is already installed"
]
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367111.21-52869294383641 `" && echo ansible-tmp-1497367111.21-52869294383641="` echo /root/.ansible/tmp/ansible-tmp-1497367111.21-52869294383641 `" ) && sleep 0'
<localhost> PUT /tmp/tmpbWSNWC TO /root/.ansible/tmp/ansible-tmp-1497367111.21-52869294383641/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367111.21-52869294383641/ /root/.ansible/tmp/ansible-tmp-1497367111.21-52869294383641/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367111.21-52869294383641/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367111.21-52869294383641/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => (item=yum-utils) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"yum-utils"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "yum-utils",
"rc": 0,
"results": [
"yum-utils-1.1.31-40.el7.noarch providing yum-utils is already installed"
]
}
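The dependency check above runs the yum module once per package, and each reports rc 0 with "already installed". The equivalent task, reconstructed from the item list in the output (the real task in openshift_facts/tasks/main.yml may use the package module or additional conditionals):

- name: Ensure various deps are installed
  yum:
    name: "{{ item }}"
    state: present
  with_items:
    - iproute
    - python-dbus
    - python-six
    - PyYAML
    - yum-utils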
TASK [openshift_facts : Ensure various deps for running system containers are installed] *******************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:55
skipping: [localhost] => (item=atomic) => {
"changed": false,
"item": "atomic",
"skip_reason": "Conditional result was False",
"skipped": true
}
skipping: [localhost] => (item=runc) => {
"changed": false,
"item": "runc",
"skip_reason": "Conditional result was False",
"skipped": true
}
skipping: [localhost] => (item=ostree) => {
"changed": false,
"item": "ostree",
"skip_reason": "Conditional result was False",
"skipped": true
}
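The system-container dependencies are skipped because no system container was requested (l_any_system_container was set to false earlier in this play). A sketch of the guarded task, with the when condition inferred from the skip reason rather than taken from the role source:

- name: Ensure various deps for running system containers are installed
  yum:
    name: "{{ item }}"
    state: present
  with_items:
    - atomic
    - runc
    - ostree
  when: l_any_system_container | bool   # assumed guard; the log only says the conditional was false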
TASK [openshift_facts : Gather Cluster facts and set is_containerized if needed] ***************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:62
Using module file /usr/share/ansible/openshift-ansible/roles/openshift_facts/library/openshift_facts.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367112.57-238229209130284 `" && echo ansible-tmp-1497367112.57-238229209130284="` echo /root/.ansible/tmp/ansible-tmp-1497367112.57-238229209130284 `" ) && sleep 0'
<localhost> PUT /tmp/tmpZbmS45 TO /root/.ansible/tmp/ansible-tmp-1497367112.57-238229209130284/openshift_facts.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367112.57-238229209130284/ /root/.ansible/tmp/ansible-tmp-1497367112.57-238229209130284/openshift_facts.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367112.57-238229209130284/openshift_facts.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367112.57-238229209130284/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
"ansible_facts": {
"openshift": {
"common": {
"admin_binary": "oadm",
"all_hostnames": [
"kubernetes.default",
"kubernetes.default.svc.cluster.local",
"kubernetes",
"openshift.default",
"openshift.default.svc",
"172.16.93.5",
"172.30.0.1",
"viaq.logging.test",
"openshift.default.svc.cluster.local",
"kubernetes.default.svc",
"openshift"
],
"cli_image": "openshift/origin",
"client_binary": "oc",
"cluster_id": "default",
"config_base": "/etc/origin",
"data_dir": "/var/lib/origin",
"debug_level": "2",
"deployer_image": "openshift/origin-deployer",
"deployment_subtype": "basic",
"deployment_type": "origin",
"dns_domain": "cluster.local",
"examples_content_version": "v1.5",
"generate_no_proxy_hosts": true,
"hostname": "viaq.logging.test",
"install_examples": true,
"installed_variant_rpms": [],
"internal_hostnames": [
"kubernetes.default",
"kubernetes.default.svc.cluster.local",
"kubernetes",
"openshift.default",
"openshift.default.svc",
"172.16.93.5",
"172.30.0.1",
"viaq.logging.test",
"openshift.default.svc.cluster.local",
"kubernetes.default.svc",
"openshift"
],
"ip": "172.16.93.5",
"is_atomic": false,
"is_containerized": false,
"is_etcd_system_container": false,
"is_master_system_container": false,
"is_node_system_container": false,
"is_openvswitch_system_container": false,
"kube_svc_ip": "172.30.0.1",
"pod_image": "openshift/origin-pod",
"portal_net": "172.30.0.0/16",
"public_hostname": "viaq.logging.test",
"public_ip": "172.16.93.5",
"registry_image": "openshift/origin-docker-registry",
"router_image": "openshift/origin-haproxy-router",
"sdn_network_plugin_name": "redhat/openshift-ovs-subnet",
"service_type": "origin",
"use_contiv": false,
"use_dnsmasq": true,
"use_flannel": false,
"use_manageiq": true,
"use_nuage": false,
"use_openshift_sdn": true,
"version_gte_3_1_1_or_1_1_1": true,
"version_gte_3_1_or_1_1": true,
"version_gte_3_2_or_1_2": true,
"version_gte_3_3_or_1_3": true,
"version_gte_3_4_or_1_4": true,
"version_gte_3_5_or_1_5": true,
"version_gte_3_6_or_1_6": false
},
"current_config": {
"roles": [
"node",
"docker",
"master"
]
},
"docker": {
"additional_registries": [],
"api_version": 1.24,
"blocked_registries": [],
"disable_push_dockerhub": false,
"gte_1_10": true,
"hosted_registry_insecure": false,
"hosted_registry_network": "172.30.0.0/16",
"insecure_registries": [],
"options": "--log-driver=journald",
"version": "1.12.6"
},
"hosted": {
"logging": {
"selector": null
},
"metrics": {
"selector": null
},
"registry": {
"selector": "region=infra"
},
"router": {
"selector": "region=infra"
}
},
"master": {
"access_token_max_seconds": 86400,
"api_port": "8443",
"api_url": "https://viaq.logging.test:8443",
"api_use_ssl": true,
"auth_token_max_seconds": 500,
"bind_addr": "0.0.0.0",
"console_path": "/console",
"console_port": "8443",
"console_url": "https://viaq.logging.test:8443/console",
"console_use_ssl": true,
"controllers_port": "8444",
"default_node_selector": "",
"dns_port": 8053,
"dynamic_provisioning_enabled": true,
"embedded_dns": true,
"embedded_etcd": true,
"embedded_kube": true,
"etcd_hosts": "",
"etcd_port": "4001",
"etcd_urls": [
"https://viaq.logging.test:4001"
],
"etcd_use_ssl": true,
"ha": false,
"identity_providers": [
{
"challenge": true,
"kind": "AllowAllPasswordIdentityProvider",
"login": true,
"name": "allow_all"
}
],
"loopback_api_url": "https://viaq.logging.test:8443",
"loopback_cluster_name": "viaq-logging-test:8443",
"loopback_context_name": "default/viaq-logging-test:8443/system:openshift-master",
"loopback_user": "system:openshift-master/viaq-logging-test:8443",
"master_count": "1",
"master_image": "openshift/origin",
"master_system_image": "openshift/origin",
"max_requests_inflight": 500,
"mcs_allocator_range": "s0:/2",
"mcs_labels_per_project": 5,
"oauth_grant_method": "auto",
"portal_net": "172.30.0.0/16",
"project_request_message": "",
"project_request_template": "",
"public_api_url": "https://viaq.logging.test:8443",
"public_console_url": "https://viaq.logging.test:8443/console",
"registry_url": "openshift/origin-${component}:${version}",
"sdn_cluster_network_cidr": "10.128.0.0/14",
"sdn_host_subnet_length": "9",
"session_max_seconds": 3600,
"session_name": "ssn",
"session_secrets_file": "",
"uid_allocator_range": "1000000000-1999999999/10000"
},
"node": {
"annotations": {},
"iptables_sync_period": "30s",
"kubelet_args": {
"node-labels": []
},
"labels": {},
"local_quota_per_fsgroup": "",
"node_image": "openshift/node",
"node_system_image": "openshift/node",
"nodename": "viaq.logging.test",
"ovs_image": "openshift/openvswitch",
"ovs_system_image": "openshift/openvswitch",
"registry_url": "openshift/origin-${component}:${version}",
"schedulable": false,
"sdn_mtu": "1450",
"set_node_ip": false,
"storage_plugin_deps": [
"ceph",
"glusterfs",
"iscsi"
]
}
}
},
"changed": false,
"invocation": {
"module_args": {
"additive_facts_to_overwrite": [],
"attributes": null,
"backup": null,
"content": null,
"delimiter": null,
"directory_mode": null,
"filter": "*",
"follow": false,
"force": null,
"gather_subset": [
"hardware",
"network",
"virtual",
"facter"
],
"gather_timeout": 10,
"group": null,
"local_facts": {
"cluster_id": "default",
"debug_level": "2",
"deployment_subtype": "",
"deployment_type": "origin",
"generate_no_proxy_hosts": true,
"hostname": "viaq.logging.test",
"http_proxy": "",
"https_proxy": "",
"ip": "172.16.93.5",
"is_containerized": false,
"is_etcd_system_container": false,
"is_master_system_container": false,
"is_node_system_container": false,
"is_openvswitch_system_container": false,
"no_proxy": "",
"no_proxy_internal_hostnames": "",
"portal_net": "",
"public_hostname": "viaq.logging.test",
"public_ip": "172.16.93.5",
"sdn_network_plugin_name": "",
"system_images_registry": "",
"use_openshift_sdn": ""
},
"mode": null,
"openshift_env": {},
"openshift_env_structures": [],
"owner": null,
"protected_facts_to_overwrite": [],
"regexp": null,
"remote_src": null,
"role": "common",
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": null,
"unsafe_writes": null
}
}
}
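The openshift_facts module call above seeds the "common" role facts; everything under ansible_facts.openshift.common is derived from the local_facts passed in plus gathered system facts. A sketch of the invocation implied by the module_args block (only the non-empty local_facts are repeated here; the full argument set is in the invocation output above):

- openshift_facts:
    role: common
    local_facts:
      hostname: viaq.logging.test
      public_hostname: viaq.logging.test
      ip: 172.16.93.5
      public_ip: 172.16.93.5
      deployment_type: origin
      cluster_id: default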
TASK [openshift_facts : Set repoquery command] *************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:90
ok: [localhost] => {
"ansible_facts": {
"repoquery_cmd": "repoquery --plugins"
},
"changed": false
}
TASK [openshift_facts] *************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/initialize_facts.yml:12
Using module file /usr/share/ansible/openshift-ansible/roles/openshift_facts/library/openshift_facts.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367114.09-279233519361691 `" && echo ansible-tmp-1497367114.09-279233519361691="` echo /root/.ansible/tmp/ansible-tmp-1497367114.09-279233519361691 `" ) && sleep 0'
<localhost> PUT /tmp/tmpmPZYyA TO /root/.ansible/tmp/ansible-tmp-1497367114.09-279233519361691/openshift_facts.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367114.09-279233519361691/ /root/.ansible/tmp/ansible-tmp-1497367114.09-279233519361691/openshift_facts.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367114.09-279233519361691/openshift_facts.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367114.09-279233519361691/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
"ansible_facts": {
"openshift": {
"common": {
"admin_binary": "oadm",
"all_hostnames": [
"kubernetes.default",
"kubernetes.default.svc.cluster.local",
"kubernetes",
"openshift.default",
"openshift.default.svc",
"172.16.93.5",
"172.30.0.1",
"viaq.logging.test",
"openshift.default.svc.cluster.local",
"kubernetes.default.svc",
"openshift"
],
"cli_image": "openshift/origin",
"client_binary": "oc",
"cluster_id": "default",
"config_base": "/etc/origin",
"data_dir": "/var/lib/origin",
"debug_level": "2",
"deployer_image": "openshift/origin-deployer",
"deployment_subtype": "basic",
"deployment_type": "origin",
"dns_domain": "cluster.local",
"examples_content_version": "v1.5",
"generate_no_proxy_hosts": true,
"hostname": "viaq.logging.test",
"install_examples": true,
"installed_variant_rpms": [],
"internal_hostnames": [
"kubernetes.default",
"kubernetes.default.svc.cluster.local",
"kubernetes",
"openshift.default",
"openshift.default.svc",
"172.16.93.5",
"172.30.0.1",
"viaq.logging.test",
"openshift.default.svc.cluster.local",
"kubernetes.default.svc",
"openshift"
],
"ip": "172.16.93.5",
"is_atomic": false,
"is_containerized": false,
"is_etcd_system_container": false,
"is_master_system_container": false,
"is_node_system_container": false,
"is_openvswitch_system_container": false,
"kube_svc_ip": "172.30.0.1",
"pod_image": "openshift/origin-pod",
"portal_net": "172.30.0.0/16",
"public_hostname": "viaq.logging.test",
"public_ip": "172.16.93.5",
"registry_image": "openshift/origin-docker-registry",
"router_image": "openshift/origin-haproxy-router",
"sdn_network_plugin_name": "redhat/openshift-ovs-subnet",
"service_type": "origin",
"use_contiv": false,
"use_dnsmasq": true,
"use_flannel": false,
"use_manageiq": true,
"use_nuage": false,
"use_openshift_sdn": true,
"version_gte_3_1_1_or_1_1_1": true,
"version_gte_3_1_or_1_1": true,
"version_gte_3_2_or_1_2": true,
"version_gte_3_3_or_1_3": true,
"version_gte_3_4_or_1_4": true,
"version_gte_3_5_or_1_5": true,
"version_gte_3_6_or_1_6": false
},
"current_config": {
"roles": [
"node",
"docker",
"master"
]
},
"docker": {
"additional_registries": [],
"api_version": 1.24,
"blocked_registries": [],
"disable_push_dockerhub": false,
"gte_1_10": true,
"hosted_registry_insecure": false,
"hosted_registry_network": "172.30.0.0/16",
"insecure_registries": [],
"options": "--log-driver=journald",
"version": "1.12.6"
},
"hosted": {
"logging": {
"selector": null
},
"metrics": {
"selector": null
},
"registry": {
"selector": "region=infra"
},
"router": {
"selector": "region=infra"
}
},
"master": {
"access_token_max_seconds": 86400,
"api_port": "8443",
"api_url": "https://viaq.logging.test:8443",
"api_use_ssl": true,
"auth_token_max_seconds": 500,
"bind_addr": "0.0.0.0",
"console_path": "/console",
"console_port": "8443",
"console_url": "https://viaq.logging.test:8443/console",
"console_use_ssl": true,
"controllers_port": "8444",
"default_node_selector": "",
"dns_port": 8053,
"dynamic_provisioning_enabled": true,
"embedded_dns": true,
"embedded_etcd": true,
"embedded_kube": true,
"etcd_hosts": "",
"etcd_port": "4001",
"etcd_urls": [
"https://viaq.logging.test:4001"
],
"etcd_use_ssl": true,
"ha": false,
"identity_providers": [
{
"challenge": true,
"kind": "AllowAllPasswordIdentityProvider",
"login": true,
"name": "allow_all"
}
],
"loopback_api_url": "https://viaq.logging.test:8443",
"loopback_cluster_name": "viaq-logging-test:8443",
"loopback_context_name": "default/viaq-logging-test:8443/system:openshift-master",
"loopback_user": "system:openshift-master/viaq-logging-test:8443",
"master_count": "1",
"master_image": "openshift/origin",
"master_system_image": "openshift/origin",
"max_requests_inflight": 500,
"mcs_allocator_range": "s0:/2",
"mcs_labels_per_project": 5,
"oauth_grant_method": "auto",
"portal_net": "172.30.0.0/16",
"project_request_message": "",
"project_request_template": "",
"public_api_url": "https://viaq.logging.test:8443",
"public_console_url": "https://viaq.logging.test:8443/console",
"registry_url": "openshift/origin-${component}:${version}",
"sdn_cluster_network_cidr": "10.128.0.0/14",
"sdn_host_subnet_length": "9",
"session_max_seconds": 3600,
"session_name": "ssn",
"session_secrets_file": "",
"uid_allocator_range": "1000000000-1999999999/10000"
},
"node": {
"annotations": {},
"iptables_sync_period": "30s",
"kubelet_args": {
"node-labels": []
},
"labels": {},
"local_quota_per_fsgroup": "",
"node_image": "openshift/node",
"node_system_image": "openshift/node",
"nodename": "viaq.logging.test",
"ovs_image": "openshift/openvswitch",
"ovs_system_image": "openshift/openvswitch",
"registry_url": "openshift/origin-${component}:${version}",
"schedulable": false,
"sdn_mtu": "1450",
"set_node_ip": false,
"storage_plugin_deps": [
"ceph",
"glusterfs",
"iscsi"
]
}
}
},
"changed": false,
"invocation": {
"module_args": {
"additive_facts_to_overwrite": [],
"attributes": null,
"backup": null,
"content": null,
"delimiter": null,
"directory_mode": null,
"filter": "*",
"follow": false,
"force": null,
"gather_subset": [
"hardware",
"network",
"virtual",
"facter"
],
"gather_timeout": 10,
"group": null,
"local_facts": {
"hostname": "viaq.logging.test"
},
"mode": null,
"openshift_env": {},
"openshift_env_structures": [],
"owner": null,
"protected_facts_to_overwrite": [],
"regexp": null,
"remote_src": null,
"role": "common",
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": null,
"unsafe_writes": null
}
}
}
TASK [set_fact] ********************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/initialize_facts.yml:16
ok: [localhost] => {
"ansible_facts": {
"openshift_docker_hosted_registry_network": "172.30.0.0/16"
},
"changed": false
}
TASK [set_fact] ********************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/initialize_facts.yml:18
ok: [localhost] => {
"ansible_facts": {
"openshift_deployment_type": "origin"
},
"changed": false
}
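The two set_fact tasks above pin openshift_docker_hosted_registry_network (from the freshly gathered docker facts) and openshift_deployment_type for the rest of the run. A sketch of the likely shape of those tasks in initialize_facts.yml; the exact Jinja expressions are assumptions:

- set_fact:
    openshift_docker_hosted_registry_network: "{{ openshift.docker.hosted_registry_network }}"

- set_fact:
    openshift_deployment_type: "{{ openshift.common.deployment_type }}"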
META: ran handlers
META: ran handlers
PLAY [Gather and set facts for node hosts] *****************************************************************************************************************************
TASK [Gathering Facts] *************************************************************************************************************************************************
Using module file /usr/lib/python2.7/site-packages/ansible/modules/system/setup.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367115.73-246361821336810 `" && echo ansible-tmp-1497367115.73-246361821336810="` echo /root/.ansible/tmp/ansible-tmp-1497367115.73-246361821336810 `" ) && sleep 0'
<localhost> PUT /tmp/tmpMPjvHj TO /root/.ansible/tmp/ansible-tmp-1497367115.73-246361821336810/setup.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367115.73-246361821336810/ /root/.ansible/tmp/ansible-tmp-1497367115.73-246361821336810/setup.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367115.73-246361821336810/setup.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367115.73-246361821336810/" > /dev/null 2>&1 && sleep 0'
ok: [localhost]
META: ran handlers
TASK [openshift_facts : Detecting Operating System] ********************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:2
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/stat.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367116.24-170848916668968 `" && echo ansible-tmp-1497367116.24-170848916668968="` echo /root/.ansible/tmp/ansible-tmp-1497367116.24-170848916668968 `" ) && sleep 0'
<localhost> PUT /tmp/tmpHCzSA9 TO /root/.ansible/tmp/ansible-tmp-1497367116.24-170848916668968/stat.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367116.24-170848916668968/ /root/.ansible/tmp/ansible-tmp-1497367116.24-170848916668968/stat.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367116.24-170848916668968/stat.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367116.24-170848916668968/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
"changed": false,
"invocation": {
"module_args": {
"checksum_algorithm": "sha1",
"follow": false,
"get_attributes": true,
"get_checksum": true,
"get_md5": true,
"get_mime": true,
"path": "/run/ostree-booted"
}
},
"stat": {
"exists": false
}
}
TASK [openshift_facts : set_fact] **************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:8
ok: [localhost] => {
"ansible_facts": {
"l_is_atomic": false
},
"changed": false
}
TASK [openshift_facts : set_fact] **************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:10
ok: [localhost] => {
"ansible_facts": {
"l_is_containerized": false,
"l_is_etcd_system_container": false,
"l_is_master_system_container": false,
"l_is_node_system_container": false,
"l_is_openvswitch_system_container": false
},
"changed": false
}
TASK [openshift_facts : set_fact] **************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:16
ok: [localhost] => {
"ansible_facts": {
"l_any_system_container": false
},
"changed": false
}
TASK [openshift_facts : Validate python version] ***********************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:19
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Validate python version] ***********************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:26
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Determine Atomic Host Docker Version] **********************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:39
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : assert] ****************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:43
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Ensure various deps are installed] *************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:50
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367117.39-63716300156519 `" && echo ansible-tmp-1497367117.39-63716300156519="` echo /root/.ansible/tmp/ansible-tmp-1497367117.39-63716300156519 `" ) && sleep 0'
<localhost> PUT /tmp/tmpiAkaS8 TO /root/.ansible/tmp/ansible-tmp-1497367117.39-63716300156519/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367117.39-63716300156519/ /root/.ansible/tmp/ansible-tmp-1497367117.39-63716300156519/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367117.39-63716300156519/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367117.39-63716300156519/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
ok: [localhost] => (item=iproute) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"iproute"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "iproute",
"rc": 0,
"results": [
"iproute-3.10.0-74.el7.x86_64 providing iproute is already installed"
]
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367117.97-229024398749285 `" && echo ansible-tmp-1497367117.97-229024398749285="` echo /root/.ansible/tmp/ansible-tmp-1497367117.97-229024398749285 `" ) && sleep 0'
<localhost> PUT /tmp/tmpeG1O56 TO /root/.ansible/tmp/ansible-tmp-1497367117.97-229024398749285/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367117.97-229024398749285/ /root/.ansible/tmp/ansible-tmp-1497367117.97-229024398749285/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367117.97-229024398749285/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367117.97-229024398749285/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
ok: [localhost] => (item=python-dbus) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"python-dbus"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "python-dbus",
"rc": 0,
"results": [
"dbus-python-1.1.1-9.el7.x86_64 providing python-dbus is already installed"
]
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367118.97-147497312981379 `" && echo ansible-tmp-1497367118.97-147497312981379="` echo /root/.ansible/tmp/ansible-tmp-1497367118.97-147497312981379 `" ) && sleep 0'
<localhost> PUT /tmp/tmphT2xn2 TO /root/.ansible/tmp/ansible-tmp-1497367118.97-147497312981379/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367118.97-147497312981379/ /root/.ansible/tmp/ansible-tmp-1497367118.97-147497312981379/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367118.97-147497312981379/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367118.97-147497312981379/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
ok: [localhost] => (item=python-six) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"python-six"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "python-six",
"rc": 0,
"results": [
"python-six-1.9.0-2.el7.noarch providing python-six is already installed"
]
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367119.49-142087898311880 `" && echo ansible-tmp-1497367119.49-142087898311880="` echo /root/.ansible/tmp/ansible-tmp-1497367119.49-142087898311880 `" ) && sleep 0'
<localhost> PUT /tmp/tmpUCZgjN TO /root/.ansible/tmp/ansible-tmp-1497367119.49-142087898311880/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367119.49-142087898311880/ /root/.ansible/tmp/ansible-tmp-1497367119.49-142087898311880/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367119.49-142087898311880/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367119.49-142087898311880/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
ok: [localhost] => (item=PyYAML) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"PyYAML"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "PyYAML",
"rc": 0,
"results": [
"PyYAML-3.10-11.el7.x86_64 providing PyYAML is already installed"
]
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367120.0-56721582995108 `" && echo ansible-tmp-1497367120.0-56721582995108="` echo /root/.ansible/tmp/ansible-tmp-1497367120.0-56721582995108 `" ) && sleep 0'
<localhost> PUT /tmp/tmpqandn0 TO /root/.ansible/tmp/ansible-tmp-1497367120.0-56721582995108/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367120.0-56721582995108/ /root/.ansible/tmp/ansible-tmp-1497367120.0-56721582995108/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367120.0-56721582995108/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367120.0-56721582995108/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => (item=yum-utils) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"yum-utils"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "yum-utils",
"rc": 0,
"results": [
"yum-utils-1.1.31-40.el7.noarch providing yum-utils is already installed"
]
}
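NOTE: all five dependencies were already installed, so nothing changed. The loop above (yum module, state=present, one package per item) is roughly equivalent to:
    yum install -y iproute python-dbus python-six PyYAML yum-utils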
TASK [openshift_facts : Ensure various deps for running system containers are installed] *******************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:55
skipping: [localhost] => (item=atomic) => {
"changed": false,
"item": "atomic",
"skip_reason": "Conditional result was False",
"skipped": true
}
skipping: [localhost] => (item=runc) => {
"changed": false,
"item": "runc",
"skip_reason": "Conditional result was False",
"skipped": true
}
skipping: [localhost] => (item=ostree) => {
"changed": false,
"item": "ostree",
"skip_reason": "Conditional result was False",
"skipped": true
}
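NOTE: atomic, runc and ostree are only pulled in for system-container installs; these items are skipped because every *_system_container fact set earlier is false. Whether they happen to be installed anyway can be checked with:
    rpm -q atomic runc ostree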
TASK [openshift_facts : Gather Cluster facts and set is_containerized if needed] ***************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:62
Using module file /usr/share/ansible/openshift-ansible/roles/openshift_facts/library/openshift_facts.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367120.83-209426240277985 `" && echo ansible-tmp-1497367120.83-209426240277985="` echo /root/.ansible/tmp/ansible-tmp-1497367120.83-209426240277985 `" ) && sleep 0'
<localhost> PUT /tmp/tmpzTXM3K TO /root/.ansible/tmp/ansible-tmp-1497367120.83-209426240277985/openshift_facts.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367120.83-209426240277985/ /root/.ansible/tmp/ansible-tmp-1497367120.83-209426240277985/openshift_facts.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367120.83-209426240277985/openshift_facts.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367120.83-209426240277985/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
"ansible_facts": {
"openshift": {
"common": {
"admin_binary": "oadm",
"all_hostnames": [
"kubernetes.default",
"kubernetes.default.svc.cluster.local",
"kubernetes",
"openshift.default",
"openshift.default.svc",
"172.16.93.5",
"172.30.0.1",
"viaq.logging.test",
"openshift.default.svc.cluster.local",
"kubernetes.default.svc",
"openshift"
],
"cli_image": "openshift/origin",
"client_binary": "oc",
"cluster_id": "default",
"config_base": "/etc/origin",
"data_dir": "/var/lib/origin",
"debug_level": "2",
"deployer_image": "openshift/origin-deployer",
"deployment_subtype": "basic",
"deployment_type": "origin",
"dns_domain": "cluster.local",
"examples_content_version": "v1.5",
"generate_no_proxy_hosts": true,
"hostname": "viaq.logging.test",
"install_examples": true,
"installed_variant_rpms": [],
"internal_hostnames": [
"kubernetes.default",
"kubernetes.default.svc.cluster.local",
"kubernetes",
"openshift.default",
"openshift.default.svc",
"172.16.93.5",
"172.30.0.1",
"viaq.logging.test",
"openshift.default.svc.cluster.local",
"kubernetes.default.svc",
"openshift"
],
"ip": "172.16.93.5",
"is_atomic": false,
"is_containerized": false,
"is_etcd_system_container": false,
"is_master_system_container": false,
"is_node_system_container": false,
"is_openvswitch_system_container": false,
"kube_svc_ip": "172.30.0.1",
"pod_image": "openshift/origin-pod",
"portal_net": "172.30.0.0/16",
"public_hostname": "viaq.logging.test",
"public_ip": "172.16.93.5",
"registry_image": "openshift/origin-docker-registry",
"router_image": "openshift/origin-haproxy-router",
"sdn_network_plugin_name": "redhat/openshift-ovs-subnet",
"service_type": "origin",
"use_contiv": false,
"use_dnsmasq": true,
"use_flannel": false,
"use_manageiq": true,
"use_nuage": false,
"use_openshift_sdn": true,
"version_gte_3_1_1_or_1_1_1": true,
"version_gte_3_1_or_1_1": true,
"version_gte_3_2_or_1_2": true,
"version_gte_3_3_or_1_3": true,
"version_gte_3_4_or_1_4": true,
"version_gte_3_5_or_1_5": true,
"version_gte_3_6_or_1_6": false
},
"current_config": {
"roles": [
"node",
"docker",
"master"
]
},
"docker": {
"additional_registries": [],
"api_version": 1.24,
"blocked_registries": [],
"disable_push_dockerhub": false,
"gte_1_10": true,
"hosted_registry_insecure": false,
"hosted_registry_network": "172.30.0.0/16",
"insecure_registries": [],
"options": "--log-driver=journald",
"version": "1.12.6"
},
"hosted": {
"logging": {
"selector": null
},
"metrics": {
"selector": null
},
"registry": {
"selector": "region=infra"
},
"router": {
"selector": "region=infra"
}
},
"master": {
"access_token_max_seconds": 86400,
"api_port": "8443",
"api_url": "https://viaq.logging.test:8443",
"api_use_ssl": true,
"auth_token_max_seconds": 500,
"bind_addr": "0.0.0.0",
"console_path": "/console",
"console_port": "8443",
"console_url": "https://viaq.logging.test:8443/console",
"console_use_ssl": true,
"controllers_port": "8444",
"default_node_selector": "",
"dns_port": 8053,
"dynamic_provisioning_enabled": true,
"embedded_dns": true,
"embedded_etcd": true,
"embedded_kube": true,
"etcd_hosts": "",
"etcd_port": "4001",
"etcd_urls": [
"https://viaq.logging.test:4001"
],
"etcd_use_ssl": true,
"ha": false,
"identity_providers": [
{
"challenge": true,
"kind": "AllowAllPasswordIdentityProvider",
"login": true,
"name": "allow_all"
}
],
"loopback_api_url": "https://viaq.logging.test:8443",
"loopback_cluster_name": "viaq-logging-test:8443",
"loopback_context_name": "default/viaq-logging-test:8443/system:openshift-master",
"loopback_user": "system:openshift-master/viaq-logging-test:8443",
"master_count": "1",
"master_image": "openshift/origin",
"master_system_image": "openshift/origin",
"max_requests_inflight": 500,
"mcs_allocator_range": "s0:/2",
"mcs_labels_per_project": 5,
"oauth_grant_method": "auto",
"portal_net": "172.30.0.0/16",
"project_request_message": "",
"project_request_template": "",
"public_api_url": "https://viaq.logging.test:8443",
"public_console_url": "https://viaq.logging.test:8443/console",
"registry_url": "openshift/origin-${component}:${version}",
"sdn_cluster_network_cidr": "10.128.0.0/14",
"sdn_host_subnet_length": "9",
"session_max_seconds": 3600,
"session_name": "ssn",
"session_secrets_file": "",
"uid_allocator_range": "1000000000-1999999999/10000"
},
"node": {
"annotations": {},
"iptables_sync_period": "30s",
"kubelet_args": {
"node-labels": []
},
"labels": {},
"local_quota_per_fsgroup": "",
"node_image": "openshift/node",
"node_system_image": "openshift/node",
"nodename": "viaq.logging.test",
"ovs_image": "openshift/openvswitch",
"ovs_system_image": "openshift/openvswitch",
"registry_url": "openshift/origin-${component}:${version}",
"schedulable": false,
"sdn_mtu": "1450",
"set_node_ip": false,
"storage_plugin_deps": [
"ceph",
"glusterfs",
"iscsi"
]
}
}
},
"changed": false,
"invocation": {
"module_args": {
"additive_facts_to_overwrite": [],
"attributes": null,
"backup": null,
"content": null,
"delimiter": null,
"directory_mode": null,
"filter": "*",
"follow": false,
"force": null,
"gather_subset": [
"hardware",
"network",
"virtual",
"facter"
],
"gather_timeout": 10,
"group": null,
"local_facts": {
"cluster_id": "default",
"debug_level": "2",
"deployment_subtype": "",
"deployment_type": "origin",
"generate_no_proxy_hosts": true,
"hostname": "viaq.logging.test",
"http_proxy": "",
"https_proxy": "",
"ip": "172.16.93.5",
"is_containerized": false,
"is_etcd_system_container": false,
"is_master_system_container": false,
"is_node_system_container": false,
"is_openvswitch_system_container": false,
"no_proxy": "",
"no_proxy_internal_hostnames": "",
"portal_net": "",
"public_hostname": "viaq.logging.test",
"public_ip": "172.16.93.5",
"sdn_network_plugin_name": "",
"system_images_registry": "",
"use_openshift_sdn": ""
},
"mode": null,
"openshift_env": {},
"openshift_env_structures": [],
"owner": null,
"protected_facts_to_overwrite": [],
"regexp": null,
"remote_src": null,
"role": "common",
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": null,
"unsafe_writes": null
}
}
}
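NOTE: openshift_facts persists the gathered values as local facts, conventionally in /etc/ansible/facts.d/openshift.fact (an assumption from the role's usual behavior, not visible in this log). If present, the cache can be pretty-printed with:
    python -m json.tool /etc/ansible/facts.d/openshift.fact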
TASK [openshift_facts : Set repoquery command] *************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:90
ok: [localhost] => {
"ansible_facts": {
"repoquery_cmd": "repoquery --plugins"
},
"changed": false
}
TASK [command] *********************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/validate_hostnames.yml:7
Using module file /usr/lib/python2.7/site-packages/ansible/modules/commands/command.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367122.53-120112763884334 `" && echo ansible-tmp-1497367122.53-120112763884334="` echo /root/.ansible/tmp/ansible-tmp-1497367122.53-120112763884334 `" ) && sleep 0'
<localhost> PUT /tmp/tmpXkGWB6 TO /root/.ansible/tmp/ansible-tmp-1497367122.53-120112763884334/command.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367122.53-120112763884334/ /root/.ansible/tmp/ansible-tmp-1497367122.53-120112763884334/command.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367122.53-120112763884334/command.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367122.53-120112763884334/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
"changed": false,
"cmd": "getent ahostsv4 viaq.logging.test | head -n 1 | awk '{ print $1 }'",
"delta": "0:00:00.011767",
"end": "2017-06-13 08:18:42.750964",
"failed": false,
"failed_when_result": false,
"invocation": {
"module_args": {
"_raw_params": "getent ahostsv4 viaq.logging.test | head -n 1 | awk '{ print $1 }'",
"_uses_shell": true,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"warn": true
}
},
"rc": 0,
"start": "2017-06-13 08:18:42.739197"
}
STDOUT:
172.16.93.5
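NOTE: hostname validation passed: viaq.logging.test resolves to 172.16.93.5, matching the ip fact gathered above, which is why the warning task below is skipped. The exact check can be repeated by hand:
    getent ahostsv4 viaq.logging.test | head -n 1 | awk '{ print $1 }'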
TASK [Warn user about bad openshift_hostname values] *******************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/validate_hostnames.yml:12
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
META: ran handlers
META: ran handlers
PLAY [Verify compatible yum/subscription-manager combination] **********************************************************************************************************
META: ran handlers
TASK [Check for bad combinations of yum and subscription-manager] ******************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/initialize_openshift_version.yml:11
Using module file /usr/lib/python2.7/site-packages/ansible/modules/commands/command.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367123.0-79008758521988 `" && echo ansible-tmp-1497367123.0-79008758521988="` echo /root/.ansible/tmp/ansible-tmp-1497367123.0-79008758521988 `" ) && sleep 0'
<localhost> PUT /tmp/tmpT8LVN4 TO /root/.ansible/tmp/ansible-tmp-1497367123.0-79008758521988/command.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367123.0-79008758521988/ /root/.ansible/tmp/ansible-tmp-1497367123.0-79008758521988/command.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367123.0-79008758521988/command.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367123.0-79008758521988/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
"changed": false,
"cmd": [
"repoquery",
"--plugins",
"--installed",
"--qf",
"%{version}",
"yum"
],
"delta": "0:00:00.287791",
"end": "2017-06-13 08:18:43.533975",
"invocation": {
"module_args": {
"_raw_params": "repoquery --plugins --installed --qf '%{version}' \"yum\"",
"_uses_shell": false,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"warn": true
}
},
"rc": 0,
"start": "2017-06-13 08:18:43.246184"
}
STDOUT:
3.4.3
STDERR:
Repository centos-openshift-origin is listed more than once in the configuration
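NOTE: the check itself passed (yum 3.4.3, rc 0); the STDERR line is yum warning that the repo id centos-openshift-origin is defined more than once in its configuration. Harmless here, but the duplicate definitions can be located (and one removed) with something like:
    grep -rl 'centos-openshift-origin' /etc/yum.conf /etc/yum.repos.d/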
TASK [fail] ************************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/playbooks/common/openshift-cluster/initialize_openshift_version.yml:17
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
META: ran handlers
META: ran handlers
PLAY [Determine openshift_version to configure on first master] ********************************************************************************************************
TASK [Gathering Facts] *************************************************************************************************************************************************
Using module file /usr/lib/python2.7/site-packages/ansible/modules/system/setup.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367123.82-102664238307559 `" && echo ansible-tmp-1497367123.82-102664238307559="` echo /root/.ansible/tmp/ansible-tmp-1497367123.82-102664238307559 `" ) && sleep 0'
<localhost> PUT /tmp/tmpvL6qAD TO /root/.ansible/tmp/ansible-tmp-1497367123.82-102664238307559/setup.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367123.82-102664238307559/ /root/.ansible/tmp/ansible-tmp-1497367123.82-102664238307559/setup.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367123.82-102664238307559/setup.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367123.82-102664238307559/" > /dev/null 2>&1 && sleep 0'
ok: [localhost]
META: ran handlers
TASK [openshift_repos : openshift_repos detect ostree] *****************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_repos/tasks/main.yaml:2
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/stat.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367124.34-17746058314312 `" && echo ansible-tmp-1497367124.34-17746058314312="` echo /root/.ansible/tmp/ansible-tmp-1497367124.34-17746058314312 `" ) && sleep 0'
<localhost> PUT /tmp/tmpmbjRvO TO /root/.ansible/tmp/ansible-tmp-1497367124.34-17746058314312/stat.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367124.34-17746058314312/ /root/.ansible/tmp/ansible-tmp-1497367124.34-17746058314312/stat.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367124.34-17746058314312/stat.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367124.34-17746058314312/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
"changed": false,
"invocation": {
"module_args": {
"checksum_algorithm": "sha1",
"follow": false,
"get_attributes": true,
"get_checksum": true,
"get_md5": true,
"get_mime": true,
"path": "/run/ostree-booted"
}
},
"stat": {
"exists": false
}
}
TASK [openshift_repos : assert] ****************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_repos/tasks/main.yaml:7
ok: [localhost] => {
"changed": false
}
MSG:
All assertions passed
TASK [openshift_repos : Ensure libselinux-python is installed] *********************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_repos/tasks/main.yaml:12
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367124.84-226445267577051 `" && echo ansible-tmp-1497367124.84-226445267577051="` echo /root/.ansible/tmp/ansible-tmp-1497367124.84-226445267577051 `" ) && sleep 0'
<localhost> PUT /tmp/tmp9zHeXz TO /root/.ansible/tmp/ansible-tmp-1497367124.84-226445267577051/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367124.84-226445267577051/ /root/.ansible/tmp/ansible-tmp-1497367124.84-226445267577051/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367124.84-226445267577051/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367124.84-226445267577051/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"libselinux-python"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"rc": 0,
"results": [
"libselinux-python-2.5-6.el7.x86_64 providing libselinux-python is already installed"
]
}
TASK [openshift_repos : Create any additional repos that are defined] **************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_repos/tasks/main.yaml:15
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_repos : Remove the additional repos if no longer defined] **********************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_repos/tasks/main.yaml:23
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/file.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367125.88-249712289677806 `" && echo ansible-tmp-1497367125.88-249712289677806="` echo /root/.ansible/tmp/ansible-tmp-1497367125.88-249712289677806 `" ) && sleep 0'
<localhost> PUT /tmp/tmpHs264n TO /root/.ansible/tmp/ansible-tmp-1497367125.88-249712289677806/file.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367125.88-249712289677806/ /root/.ansible/tmp/ansible-tmp-1497367125.88-249712289677806/file.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367125.88-249712289677806/file.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367125.88-249712289677806/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
"changed": false,
"invocation": {
"module_args": {
"attributes": null,
"backup": null,
"content": null,
"delimiter": null,
"dest": "/etc/yum.repos.d/openshift_additional.repo",
"diff_peek": null,
"directory_mode": null,
"follow": false,
"force": false,
"group": null,
"mode": null,
"original_basename": null,
"owner": null,
"path": "/etc/yum.repos.d/openshift_additional.repo",
"recurse": false,
"regexp": null,
"remote_src": null,
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": null,
"state": "absent",
"unsafe_writes": null,
"validate": null
}
},
"path": "/etc/yum.repos.d/openshift_additional.repo",
"state": "absent"
}
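NOTE: since the "Create any additional repos" task above was skipped, this task just ensures /etc/yum.repos.d/openshift_additional.repo stays absent; "changed": false means it was already gone. Quick verification:
    test -e /etc/yum.repos.d/openshift_additional.repo || echo "absent, as expected"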
TASK [openshift_repos : Configure origin gpg keys if needed] ***********************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_repos/tasks/main.yaml:31
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/stat.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367126.27-271597700143786 `" && echo ansible-tmp-1497367126.27-271597700143786="` echo /root/.ansible/tmp/ansible-tmp-1497367126.27-271597700143786 `" ) && sleep 0'
<localhost> PUT /tmp/tmpvzmWY2 TO /root/.ansible/tmp/ansible-tmp-1497367126.27-271597700143786/stat.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367126.27-271597700143786/ /root/.ansible/tmp/ansible-tmp-1497367126.27-271597700143786/stat.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367126.27-271597700143786/stat.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367126.27-271597700143786/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/file.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367126.54-86721934173802 `" && echo ansible-tmp-1497367126.54-86721934173802="` echo /root/.ansible/tmp/ansible-tmp-1497367126.54-86721934173802 `" ) && sleep 0'
<localhost> PUT /tmp/tmpoEBE3h TO /root/.ansible/tmp/ansible-tmp-1497367126.54-86721934173802/file.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367126.54-86721934173802/ /root/.ansible/tmp/ansible-tmp-1497367126.54-86721934173802/file.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367126.54-86721934173802/file.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367126.54-86721934173802/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/stat.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
ok: [localhost] => (item={u'dest': u'/etc/pki/rpm-gpg/', u'src': u'origin/gpg_keys/openshift-ansible-CentOS-SIG-PaaS'}) => {
"changed": false,
"checksum": "e350bf139f91980ff9cc39877e14643ba3139da1",
"dest": "/etc/pki/rpm-gpg/openshift-ansible-CentOS-SIG-PaaS",
"diff": {
"after": {
"path": "/etc/pki/rpm-gpg/openshift-ansible-CentOS-SIG-PaaS"
},
"before": {
"path": "/etc/pki/rpm-gpg/openshift-ansible-CentOS-SIG-PaaS"
}
},
"gid": 0,
"group": "root",
"invocation": {
"module_args": {
"attributes": null,
"backup": null,
"content": null,
"delimiter": null,
"dest": "/etc/pki/rpm-gpg/",
"diff_peek": null,
"directory_mode": null,
"follow": false,
"force": false,
"group": null,
"mode": null,
"original_basename": "openshift-ansible-CentOS-SIG-PaaS",
"owner": null,
"path": "/etc/pki/rpm-gpg/openshift-ansible-CentOS-SIG-PaaS",
"recurse": false,
"regexp": null,
"remote_src": null,
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": "openshift-ansible-CentOS-SIG-PaaS",
"state": null,
"unsafe_writes": null,
"validate": null
}
},
"item": {
"dest": "/etc/pki/rpm-gpg/",
"src": "origin/gpg_keys/openshift-ansible-CentOS-SIG-PaaS"
},
"mode": "0644",
"owner": "root",
"path": "/etc/pki/rpm-gpg/openshift-ansible-CentOS-SIG-PaaS",
"secontext": "system_u:object_r:cert_t:s0",
"size": 1037,
"state": "file",
"uid": 0
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367126.82-249049485005505 `" && echo ansible-tmp-1497367126.82-249049485005505="` echo /root/.ansible/tmp/ansible-tmp-1497367126.82-249049485005505 `" ) && sleep 0'
<localhost> PUT /tmp/tmpmvfvc8 TO /root/.ansible/tmp/ansible-tmp-1497367126.82-249049485005505/stat.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367126.82-249049485005505/ /root/.ansible/tmp/ansible-tmp-1497367126.82-249049485005505/stat.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367126.82-249049485005505/stat.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367126.82-249049485005505/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/file.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367127.09-39449883151769 `" && echo ansible-tmp-1497367127.09-39449883151769="` echo /root/.ansible/tmp/ansible-tmp-1497367127.09-39449883151769 `" ) && sleep 0'
<localhost> PUT /tmp/tmpRH2n37 TO /root/.ansible/tmp/ansible-tmp-1497367127.09-39449883151769/file.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367127.09-39449883151769/ /root/.ansible/tmp/ansible-tmp-1497367127.09-39449883151769/file.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367127.09-39449883151769/file.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367127.09-39449883151769/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => (item={u'dest': u'/etc/yum.repos.d/', u'src': u'origin/repos/openshift-ansible-centos-paas-sig.repo'}) => {
"changed": false,
"checksum": "f50e4ab48783725fab7d6b13e855bd3cee84b664",
"dest": "/etc/yum.repos.d/openshift-ansible-centos-paas-sig.repo",
"diff": {
"after": {
"path": "/etc/yum.repos.d/openshift-ansible-centos-paas-sig.repo"
},
"before": {
"path": "/etc/yum.repos.d/openshift-ansible-centos-paas-sig.repo"
}
},
"gid": 0,
"group": "root",
"invocation": {
"module_args": {
"attributes": null,
"backup": null,
"content": null,
"delimiter": null,
"dest": "/etc/yum.repos.d/",
"diff_peek": null,
"directory_mode": null,
"follow": false,
"force": false,
"group": null,
"mode": null,
"original_basename": "openshift-ansible-centos-paas-sig.repo",
"owner": null,
"path": "/etc/yum.repos.d/openshift-ansible-centos-paas-sig.repo",
"recurse": false,
"regexp": null,
"remote_src": null,
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": "openshift-ansible-centos-paas-sig.repo",
"state": null,
"unsafe_writes": null,
"validate": null
}
},
"item": {
"dest": "/etc/yum.repos.d/",
"src": "origin/repos/openshift-ansible-centos-paas-sig.repo"
},
"mode": "0644",
"owner": "root",
"path": "/etc/yum.repos.d/openshift-ansible-centos-paas-sig.repo",
"secontext": "system_u:object_r:system_conf_t:s0",
"size": 893,
"state": "file",
"uid": 0
}
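NOTE: both files were already in place; the checksums reported above for the CentOS PaaS SIG gpg key and the .repo file can be re-verified with:
    sha1sum /etc/pki/rpm-gpg/openshift-ansible-CentOS-SIG-PaaS /etc/yum.repos.d/openshift-ansible-centos-paas-sig.repo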
TASK [openshift_facts : Detecting Operating System] ********************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:2
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/stat.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367127.41-279216863670415 `" && echo ansible-tmp-1497367127.41-279216863670415="` echo /root/.ansible/tmp/ansible-tmp-1497367127.41-279216863670415 `" ) && sleep 0'
<localhost> PUT /tmp/tmpfhmu7s TO /root/.ansible/tmp/ansible-tmp-1497367127.41-279216863670415/stat.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367127.41-279216863670415/ /root/.ansible/tmp/ansible-tmp-1497367127.41-279216863670415/stat.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367127.41-279216863670415/stat.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367127.41-279216863670415/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
"changed": false,
"invocation": {
"module_args": {
"checksum_algorithm": "sha1",
"follow": false,
"get_attributes": true,
"get_checksum": true,
"get_md5": true,
"get_mime": true,
"path": "/run/ostree-booted"
}
},
"stat": {
"exists": false
}
}
TASK [openshift_facts : set_fact] **************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:8
ok: [localhost] => {
"ansible_facts": {
"l_is_atomic": false
},
"changed": false
}
TASK [openshift_facts : set_fact] **************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:10
ok: [localhost] => {
"ansible_facts": {
"l_is_containerized": false,
"l_is_etcd_system_container": false,
"l_is_master_system_container": false,
"l_is_node_system_container": false,
"l_is_openvswitch_system_container": false
},
"changed": false
}
TASK [openshift_facts : set_fact] **************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:16
ok: [localhost] => {
"ansible_facts": {
"l_any_system_container": false
},
"changed": false
}
TASK [openshift_facts : Validate python version] ***********************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:19
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Validate python version] ***********************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:26
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Determine Atomic Host Docker Version] **********************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:39
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : assert] ****************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:43
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Ensure various deps are installed] *************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:50
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367128.58-171381454303429 `" && echo ansible-tmp-1497367128.58-171381454303429="` echo /root/.ansible/tmp/ansible-tmp-1497367128.58-171381454303429 `" ) && sleep 0'
<localhost> PUT /tmp/tmpQhGL3g TO /root/.ansible/tmp/ansible-tmp-1497367128.58-171381454303429/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367128.58-171381454303429/ /root/.ansible/tmp/ansible-tmp-1497367128.58-171381454303429/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367128.58-171381454303429/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367128.58-171381454303429/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
ok: [localhost] => (item=iproute) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"iproute"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "iproute",
"rc": 0,
"results": [
"iproute-3.10.0-74.el7.x86_64 providing iproute is already installed"
]
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367129.2-165036410521176 `" && echo ansible-tmp-1497367129.2-165036410521176="` echo /root/.ansible/tmp/ansible-tmp-1497367129.2-165036410521176 `" ) && sleep 0'
<localhost> PUT /tmp/tmpZiZnmO TO /root/.ansible/tmp/ansible-tmp-1497367129.2-165036410521176/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367129.2-165036410521176/ /root/.ansible/tmp/ansible-tmp-1497367129.2-165036410521176/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367129.2-165036410521176/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367129.2-165036410521176/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
ok: [localhost] => (item=python-dbus) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"python-dbus"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "python-dbus",
"rc": 0,
"results": [
"dbus-python-1.1.1-9.el7.x86_64 providing python-dbus is already installed"
]
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367130.11-7332030008185 `" && echo ansible-tmp-1497367130.11-7332030008185="` echo /root/.ansible/tmp/ansible-tmp-1497367130.11-7332030008185 `" ) && sleep 0'
<localhost> PUT /tmp/tmpdPwNU5 TO /root/.ansible/tmp/ansible-tmp-1497367130.11-7332030008185/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367130.11-7332030008185/ /root/.ansible/tmp/ansible-tmp-1497367130.11-7332030008185/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367130.11-7332030008185/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367130.11-7332030008185/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
ok: [localhost] => (item=python-six) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"python-six"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "python-six",
"rc": 0,
"results": [
"python-six-1.9.0-2.el7.noarch providing python-six is already installed"
]
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367130.63-156578718524332 `" && echo ansible-tmp-1497367130.63-156578718524332="` echo /root/.ansible/tmp/ansible-tmp-1497367130.63-156578718524332 `" ) && sleep 0'
<localhost> PUT /tmp/tmpEpLUel TO /root/.ansible/tmp/ansible-tmp-1497367130.63-156578718524332/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367130.63-156578718524332/ /root/.ansible/tmp/ansible-tmp-1497367130.63-156578718524332/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367130.63-156578718524332/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367130.63-156578718524332/" > /dev/null 2>&1 && sleep 0'
Using module file /usr/lib/python2.7/site-packages/ansible/modules/packaging/os/yum.py
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
ok: [localhost] => (item=PyYAML) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"PyYAML"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "PyYAML",
"rc": 0,
"results": [
"PyYAML-3.10-11.el7.x86_64 providing PyYAML is already installed"
]
}
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367131.15-42690734604483 `" && echo ansible-tmp-1497367131.15-42690734604483="` echo /root/.ansible/tmp/ansible-tmp-1497367131.15-42690734604483 `" ) && sleep 0'
<localhost> PUT /tmp/tmpN_tYby TO /root/.ansible/tmp/ansible-tmp-1497367131.15-42690734604483/yum.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367131.15-42690734604483/ /root/.ansible/tmp/ansible-tmp-1497367131.15-42690734604483/yum.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367131.15-42690734604483/yum.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367131.15-42690734604483/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => (item=yum-utils) => {
"changed": false,
"invocation": {
"module_args": {
"conf_file": null,
"disable_gpg_check": false,
"disablerepo": null,
"enablerepo": null,
"exclude": null,
"install_repoquery": true,
"installroot": "/",
"list": null,
"name": [
"yum-utils"
],
"skip_broken": false,
"state": "present",
"update_cache": false,
"validate_certs": true
}
},
"item": "yum-utils",
"rc": 0,
"results": [
"yum-utils-1.1.31-40.el7.noarch providing yum-utils is already installed"
]
}
TASK [openshift_facts : Ensure various deps for running system containers are installed] *******************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:55
skipping: [localhost] => (item=atomic) => {
"changed": false,
"item": "atomic",
"skip_reason": "Conditional result was False",
"skipped": true
}
skipping: [localhost] => (item=runc) => {
"changed": false,
"item": "runc",
"skip_reason": "Conditional result was False",
"skipped": true
}
skipping: [localhost] => (item=ostree) => {
"changed": false,
"item": "ostree",
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_facts : Gather Cluster facts and set is_containerized if needed] ***************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:62
Using module file /usr/share/ansible/openshift-ansible/roles/openshift_facts/library/openshift_facts.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367131.97-206533704814723 `" && echo ansible-tmp-1497367131.97-206533704814723="` echo /root/.ansible/tmp/ansible-tmp-1497367131.97-206533704814723 `" ) && sleep 0'
<localhost> PUT /tmp/tmppJhV9y TO /root/.ansible/tmp/ansible-tmp-1497367131.97-206533704814723/openshift_facts.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367131.97-206533704814723/ /root/.ansible/tmp/ansible-tmp-1497367131.97-206533704814723/openshift_facts.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367131.97-206533704814723/openshift_facts.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367131.97-206533704814723/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
"ansible_facts": {
"openshift": {
"common": {
"admin_binary": "oadm",
"all_hostnames": [
"kubernetes.default",
"kubernetes.default.svc.cluster.local",
"kubernetes",
"openshift.default",
"openshift.default.svc",
"172.16.93.5",
"172.30.0.1",
"viaq.logging.test",
"openshift.default.svc.cluster.local",
"kubernetes.default.svc",
"openshift"
],
"cli_image": "openshift/origin",
"client_binary": "oc",
"cluster_id": "default",
"config_base": "/etc/origin",
"data_dir": "/var/lib/origin",
"debug_level": "2",
"deployer_image": "openshift/origin-deployer",
"deployment_subtype": "basic",
"deployment_type": "origin",
"dns_domain": "cluster.local",
"examples_content_version": "v1.5",
"generate_no_proxy_hosts": true,
"hostname": "viaq.logging.test",
"install_examples": true,
"installed_variant_rpms": [],
"internal_hostnames": [
"kubernetes.default",
"kubernetes.default.svc.cluster.local",
"kubernetes",
"openshift.default",
"openshift.default.svc",
"172.16.93.5",
"172.30.0.1",
"viaq.logging.test",
"openshift.default.svc.cluster.local",
"kubernetes.default.svc",
"openshift"
],
"ip": "172.16.93.5",
"is_atomic": false,
"is_containerized": false,
"is_etcd_system_container": false,
"is_master_system_container": false,
"is_node_system_container": false,
"is_openvswitch_system_container": false,
"kube_svc_ip": "172.30.0.1",
"pod_image": "openshift/origin-pod",
"portal_net": "172.30.0.0/16",
"public_hostname": "viaq.logging.test",
"public_ip": "172.16.93.5",
"registry_image": "openshift/origin-docker-registry",
"router_image": "openshift/origin-haproxy-router",
"sdn_network_plugin_name": "redhat/openshift-ovs-subnet",
"service_type": "origin",
"use_contiv": false,
"use_dnsmasq": true,
"use_flannel": false,
"use_manageiq": true,
"use_nuage": false,
"use_openshift_sdn": true,
"version_gte_3_1_1_or_1_1_1": true,
"version_gte_3_1_or_1_1": true,
"version_gte_3_2_or_1_2": true,
"version_gte_3_3_or_1_3": true,
"version_gte_3_4_or_1_4": true,
"version_gte_3_5_or_1_5": true,
"version_gte_3_6_or_1_6": false
},
"current_config": {
"roles": [
"node",
"docker",
"master"
]
},
"docker": {
"additional_registries": [],
"api_version": 1.24,
"blocked_registries": [],
"disable_push_dockerhub": false,
"gte_1_10": true,
"hosted_registry_insecure": false,
"hosted_registry_network": "172.30.0.0/16",
"insecure_registries": [],
"options": "--log-driver=journald",
"version": "1.12.6"
},
"hosted": {
"logging": {
"selector": null
},
"metrics": {
"selector": null
},
"registry": {
"selector": "region=infra"
},
"router": {
"selector": "region=infra"
}
},
"master": {
"access_token_max_seconds": 86400,
"api_port": "8443",
"api_url": "https://viaq.logging.test:8443",
"api_use_ssl": true,
"auth_token_max_seconds": 500,
"bind_addr": "0.0.0.0",
"console_path": "/console",
"console_port": "8443",
"console_url": "https://viaq.logging.test:8443/console",
"console_use_ssl": true,
"controllers_port": "8444",
"default_node_selector": "",
"dns_port": 8053,
"dynamic_provisioning_enabled": true,
"embedded_dns": true,
"embedded_etcd": true,
"embedded_kube": true,
"etcd_hosts": "",
"etcd_port": "4001",
"etcd_urls": [
"https://viaq.logging.test:4001"
],
"etcd_use_ssl": true,
"ha": false,
"identity_providers": [
{
"challenge": true,
"kind": "AllowAllPasswordIdentityProvider",
"login": true,
"name": "allow_all"
}
],
"loopback_api_url": "https://viaq.logging.test:8443",
"loopback_cluster_name": "viaq-logging-test:8443",
"loopback_context_name": "default/viaq-logging-test:8443/system:openshift-master",
"loopback_user": "system:openshift-master/viaq-logging-test:8443",
"master_count": "1",
"master_image": "openshift/origin",
"master_system_image": "openshift/origin",
"max_requests_inflight": 500,
"mcs_allocator_range": "s0:/2",
"mcs_labels_per_project": 5,
"oauth_grant_method": "auto",
"portal_net": "172.30.0.0/16",
"project_request_message": "",
"project_request_template": "",
"public_api_url": "https://viaq.logging.test:8443",
"public_console_url": "https://viaq.logging.test:8443/console",
"registry_url": "openshift/origin-${component}:${version}",
"sdn_cluster_network_cidr": "10.128.0.0/14",
"sdn_host_subnet_length": "9",
"session_max_seconds": 3600,
"session_name": "ssn",
"session_secrets_file": "",
"uid_allocator_range": "1000000000-1999999999/10000"
},
"node": {
"annotations": {},
"iptables_sync_period": "30s",
"kubelet_args": {
"node-labels": []
},
"labels": {},
"local_quota_per_fsgroup": "",
"node_image": "openshift/node",
"node_system_image": "openshift/node",
"nodename": "viaq.logging.test",
"ovs_image": "openshift/openvswitch",
"ovs_system_image": "openshift/openvswitch",
"registry_url": "openshift/origin-${component}:${version}",
"schedulable": false,
"sdn_mtu": "1450",
"set_node_ip": false,
"storage_plugin_deps": [
"ceph",
"glusterfs",
"iscsi"
]
}
}
},
"changed": false,
"invocation": {
"module_args": {
"additive_facts_to_overwrite": [],
"attributes": null,
"backup": null,
"content": null,
"delimiter": null,
"directory_mode": null,
"filter": "*",
"follow": false,
"force": null,
"gather_subset": [
"hardware",
"network",
"virtual",
"facter"
],
"gather_timeout": 10,
"group": null,
"local_facts": {
"cluster_id": "default",
"debug_level": "2",
"deployment_subtype": "",
"deployment_type": "origin",
"generate_no_proxy_hosts": true,
"hostname": "viaq.logging.test",
"http_proxy": "",
"https_proxy": "",
"ip": "172.16.93.5",
"is_containerized": false,
"is_etcd_system_container": false,
"is_master_system_container": false,
"is_node_system_container": false,
"is_openvswitch_system_container": false,
"no_proxy": "",
"no_proxy_internal_hostnames": "",
"portal_net": "",
"public_hostname": "viaq.logging.test",
"public_ip": "172.16.93.5",
"sdn_network_plugin_name": "",
"system_images_registry": "",
"use_openshift_sdn": ""
},
"mode": null,
"openshift_env": {},
"openshift_env_structures": [],
"owner": null,
"protected_facts_to_overwrite": [],
"regexp": null,
"remote_src": null,
"role": "common",
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": null,
"unsafe_writes": null
}
}
}
TASK [openshift_facts : Set repoquery command] *************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_facts/tasks/main.yml:90
ok: [localhost] => {
"ansible_facts": {
"repoquery_cmd": "repoquery --plugins"
},
"changed": false
}
TASK [openshift_docker_facts : Set docker facts] ***********************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_docker_facts/tasks/main.yml:2
Using module file /usr/share/ansible/openshift-ansible/roles/openshift_facts/library/openshift_facts.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367133.45-63513964400268 `" && echo ansible-tmp-1497367133.45-63513964400268="` echo /root/.ansible/tmp/ansible-tmp-1497367133.45-63513964400268 `" ) && sleep 0'
<localhost> PUT /tmp/tmp24X8A3 TO /root/.ansible/tmp/ansible-tmp-1497367133.45-63513964400268/openshift_facts.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367133.45-63513964400268/ /root/.ansible/tmp/ansible-tmp-1497367133.45-63513964400268/openshift_facts.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367133.45-63513964400268/openshift_facts.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367133.45-63513964400268/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => (item={u'local_facts': {u'blocked_registries': u'', u'hosted_registry_insecure': False, u'log_driver': u'', u'disable_push_dockerhub': u'', u'selinux_enabled': u'', u'additional_registries': u'', u'hosted_registry_network': u'172.30.0.0/16', u'log_options': u'', u'insecure_registries': u'', u'options': u''}, u'role': u'docker'}) => {
"ansible_facts": {
"openshift": {
"common": {
"admin_binary": "oadm",
"all_hostnames": [
"kubernetes.default",
"kubernetes.default.svc.cluster.local",
"kubernetes",
"openshift.default",
"openshift.default.svc",
"172.16.93.5",
"172.30.0.1",
"viaq.logging.test",
"openshift.default.svc.cluster.local",
"kubernetes.default.svc",
"openshift"
],
"cli_image": "openshift/origin",
"client_binary": "oc",
"cluster_id": "default",
"config_base": "/etc/origin",
"data_dir": "/var/lib/origin",
"debug_level": "2",
"deployer_image": "openshift/origin-deployer",
"deployment_subtype": "basic",
"deployment_type": "origin",
"dns_domain": "cluster.local",
"examples_content_version": "v1.5",
"generate_no_proxy_hosts": true,
"hostname": "viaq.logging.test",
"install_examples": true,
"installed_variant_rpms": [],
"internal_hostnames": [
"kubernetes.default",
"kubernetes.default.svc.cluster.local",
"kubernetes",
"openshift.default",
"openshift.default.svc",
"172.16.93.5",
"172.30.0.1",
"viaq.logging.test",
"openshift.default.svc.cluster.local",
"kubernetes.default.svc",
"openshift"
],
"ip": "172.16.93.5",
"is_atomic": false,
"is_containerized": false,
"is_etcd_system_container": false,
"is_master_system_container": false,
"is_node_system_container": false,
"is_openvswitch_system_container": false,
"kube_svc_ip": "172.30.0.1",
"pod_image": "openshift/origin-pod",
"portal_net": "172.30.0.0/16",
"public_hostname": "viaq.logging.test",
"public_ip": "172.16.93.5",
"registry_image": "openshift/origin-docker-registry",
"router_image": "openshift/origin-haproxy-router",
"sdn_network_plugin_name": "redhat/openshift-ovs-subnet",
"service_type": "origin",
"use_contiv": false,
"use_dnsmasq": true,
"use_flannel": false,
"use_manageiq": true,
"use_nuage": false,
"use_openshift_sdn": true,
"version_gte_3_1_1_or_1_1_1": true,
"version_gte_3_1_or_1_1": true,
"version_gte_3_2_or_1_2": true,
"version_gte_3_3_or_1_3": true,
"version_gte_3_4_or_1_4": true,
"version_gte_3_5_or_1_5": true,
"version_gte_3_6_or_1_6": false
},
"current_config": {
"roles": [
"node",
"docker",
"master"
]
},
"docker": {
"additional_registries": [],
"api_version": 1.24,
"blocked_registries": [],
"disable_push_dockerhub": false,
"gte_1_10": true,
"hosted_registry_insecure": false,
"hosted_registry_network": "172.30.0.0/16",
"insecure_registries": [],
"options": "--log-driver=journald",
"version": "1.12.6"
},
"hosted": {
"logging": {
"selector": null
},
"metrics": {
"selector": null
},
"registry": {
"selector": "region=infra"
},
"router": {
"selector": "region=infra"
}
},
"master": {
"access_token_max_seconds": 86400,
"api_port": "8443",
"api_url": "https://viaq.logging.test:8443",
"api_use_ssl": true,
"auth_token_max_seconds": 500,
"bind_addr": "0.0.0.0",
"console_path": "/console",
"console_port": "8443",
"console_url": "https://viaq.logging.test:8443/console",
"console_use_ssl": true,
"controllers_port": "8444",
"default_node_selector": "",
"dns_port": 8053,
"dynamic_provisioning_enabled": true,
"embedded_dns": true,
"embedded_etcd": true,
"embedded_kube": true,
"etcd_hosts": "",
"etcd_port": "4001",
"etcd_urls": [
"https://viaq.logging.test:4001"
],
"etcd_use_ssl": true,
"ha": false,
"identity_providers": [
{
"challenge": true,
"kind": "AllowAllPasswordIdentityProvider",
"login": true,
"name": "allow_all"
}
],
"loopback_api_url": "https://viaq.logging.test:8443",
"loopback_cluster_name": "viaq-logging-test:8443",
"loopback_context_name": "default/viaq-logging-test:8443/system:openshift-master",
"loopback_user": "system:openshift-master/viaq-logging-test:8443",
"master_count": "1",
"master_image": "openshift/origin",
"master_system_image": "openshift/origin",
"max_requests_inflight": 500,
"mcs_allocator_range": "s0:/2",
"mcs_labels_per_project": 5,
"oauth_grant_method": "auto",
"portal_net": "172.30.0.0/16",
"project_request_message": "",
"project_request_template": "",
"public_api_url": "https://viaq.logging.test:8443",
"public_console_url": "https://viaq.logging.test:8443/console",
"registry_url": "openshift/origin-${component}:${version}",
"sdn_cluster_network_cidr": "10.128.0.0/14",
"sdn_host_subnet_length": "9",
"session_max_seconds": 3600,
"session_name": "ssn",
"session_secrets_file": "",
"uid_allocator_range": "1000000000-1999999999/10000"
},
"node": {
"annotations": {},
"iptables_sync_period": "30s",
"kubelet_args": {
"node-labels": []
},
"labels": {},
"local_quota_per_fsgroup": "",
"node_image": "openshift/node",
"node_system_image": "openshift/node",
"nodename": "viaq.logging.test",
"ovs_image": "openshift/openvswitch",
"ovs_system_image": "openshift/openvswitch",
"registry_url": "openshift/origin-${component}:${version}",
"schedulable": false,
"sdn_mtu": "1450",
"set_node_ip": false,
"storage_plugin_deps": [
"ceph",
"glusterfs",
"iscsi"
]
}
}
},
"changed": false,
"invocation": {
"module_args": {
"additive_facts_to_overwrite": [],
"attributes": null,
"backup": null,
"content": null,
"delimiter": null,
"directory_mode": null,
"filter": "*",
"follow": false,
"force": null,
"gather_subset": [
"hardware",
"network",
"virtual",
"facter"
],
"gather_timeout": 10,
"group": null,
"local_facts": {
"additional_registries": "",
"blocked_registries": "",
"disable_push_dockerhub": "",
"hosted_registry_insecure": false,
"hosted_registry_network": "172.30.0.0/16",
"insecure_registries": "",
"log_driver": "",
"log_options": "",
"options": "",
"selinux_enabled": ""
},
"mode": null,
"openshift_env": {},
"openshift_env_structures": [],
"owner": null,
"protected_facts_to_overwrite": [],
"regexp": null,
"remote_src": null,
"role": "docker",
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": null,
"unsafe_writes": null
}
},
"item": {
"local_facts": {
"additional_registries": "",
"blocked_registries": "",
"disable_push_dockerhub": "",
"hosted_registry_insecure": false,
"hosted_registry_network": "172.30.0.0/16",
"insecure_registries": "",
"log_driver": "",
"log_options": "",
"options": "",
"selinux_enabled": ""
},
"role": "docker"
}
}
TASK [openshift_docker_facts : set_fact] *******************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_docker_facts/tasks/main.yml:20
ok: [localhost] => {
"ansible_facts": {
"docker_additional_registries": [],
"docker_blocked_registries": [],
"docker_insecure_registries": [],
"docker_push_dockerhub": false
},
"changed": false
}
TASK [openshift_docker_facts : set_fact] *******************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_docker_facts/tasks/main.yml:36
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_docker_facts : set_fact] *******************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_docker_facts/tasks/main.yml:41
ok: [localhost] => {
"ansible_facts": {
"docker_options": "--log-driver=journald"
},
"changed": false
}
TASK [os_firewall : Assert - Do not use firewalld on Atomic Host] ******************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/main.yml:2
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [os_firewall : Install firewalld packages] ************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml:2
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [os_firewall : Ensure iptables services are not enabled] **********************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml:7
skipping: [localhost] => (item=iptables) => {
"changed": false,
"item": "iptables",
"skip_reason": "Conditional result was False",
"skipped": true
}
skipping: [localhost] => (item=ip6tables) => {
"changed": false,
"item": "ip6tables",
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [os_firewall : Wait 10 seconds after disabling iptables] **********************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml:19
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [os_firewall : Start and enable firewalld service] ****************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml:24
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [os_firewall : need to pause here, otherwise the firewalld service starting can sometimes cause ssh to fail] ******************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml:33
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [os_firewall : Add firewalld allow rules] *************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml:37
TASK [os_firewall : Remove firewalld allow rules] **********************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/firewalld.yml:45
TASK [os_firewall : Ensure firewalld service is not enabled] ***********************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml:3
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [os_firewall : Wait 10 seconds after disabling firewalld] *********************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml:12
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [os_firewall : Install iptables packages] *************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml:17
skipping: [localhost] => (item=iptables-services) => {
"changed": false,
"item": "iptables-services",
"skip_reason": "Conditional result was False",
"skipped": true
}
skipping: [localhost] => (item=iptables) => {
"changed": false,
"item": "iptables",
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [os_firewall : Start and enable iptables service] *****************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml:24
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [os_firewall : need to pause here, otherwise the iptables service starting can sometimes cause ssh to fail] *******************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml:33
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [os_firewall : Add iptables allow rules] **************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml:37
TASK [os_firewall : Remove iptables rules] *****************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/os_firewall/tasks/firewall/iptables.yml:45
TASK [docker : Get current installed Docker version] *******************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml:2
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : Error out if Docker pre-installed but too old] **********************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml:8
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : Error out if requested Docker is too old] ***************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml:13
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : Fail if Docker version requested but downgrade is required] *********************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml:20
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : Error out if attempting to upgrade Docker across the 1.10 boundary] *************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml:27
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : Install Docker] *****************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml:34
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : Ensure docker.service.d directory exists] ***************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml:40
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : Configure Docker service unit file] *********************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml:45
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : Getting current systemd-udevd exec command] *************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml:3
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : Assure systemd-udevd.service.d directory exists] ********************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml:8
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : Create systemd-udevd override file] *********************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/udev_workaround.yml:13
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : stat] ***************************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml:54
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : Set registry params] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml:57
skipping: [localhost] => (item={u'reg_conf_var': u'ADD_REGISTRY', u'reg_flag': u'--add-registry', u'reg_fact_val': u''}) => {
"changed": false,
"item": {
"reg_conf_var": "ADD_REGISTRY",
"reg_fact_val": "",
"reg_flag": "--add-registry"
},
"skip_reason": "Conditional result was False",
"skipped": true
}
skipping: [localhost] => (item={u'reg_conf_var': u'INSECURE_REGISTRY', u'reg_flag': u'--insecure-registry', u'reg_fact_val': u''}) => {
"changed": false,
"item": {
"reg_conf_var": "INSECURE_REGISTRY",
"reg_fact_val": "",
"reg_flag": "--insecure-registry"
},
"skip_reason": "Conditional result was False",
"skipped": true
}
skipping: [localhost] => (item={u'reg_conf_var': u'BLOCK_REGISTRY', u'reg_flag': u'--block-registry', u'reg_fact_val': u''}) => {
"changed": false,
"item": {
"reg_conf_var": "BLOCK_REGISTRY",
"reg_fact_val": "",
"reg_flag": "--block-registry"
},
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : Set Proxy Settings] *************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml:76
skipping: [localhost] => (item={u'reg_conf_var': u'HTTP_PROXY', u'reg_fact_val': u''}) => {
"changed": false,
"item": {
"reg_conf_var": "HTTP_PROXY",
"reg_fact_val": ""
},
"skip_reason": "Conditional result was False",
"skipped": true
}
skipping: [localhost] => (item={u'reg_conf_var': u'NO_PROXY', u'reg_fact_val': u''}) => {
"changed": false,
"item": {
"reg_conf_var": "NO_PROXY",
"reg_fact_val": ""
},
"skip_reason": "Conditional result was False",
"skipped": true
}
skipping: [localhost] => (item={u'reg_conf_var': u'HTTPS_PROXY', u'reg_fact_val': u''}) => {
"changed": false,
"item": {
"reg_conf_var": "HTTPS_PROXY",
"reg_fact_val": ""
},
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : Set various Docker options] *****************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml:94
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : Start the Docker service] *******************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml:108
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [docker : set_fact] ***********************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/docker/tasks/main.yml:116
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
META: ran handlers
TASK [openshift_version : set_fact] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:4
ok: [localhost] => {
"ansible_facts": {
"is_containerized": false
},
"changed": false
}
TASK [openshift_version : fail] ****************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:10
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : set_fact] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:15
ok: [localhost] => {
"ansible_facts": {
"openshift_release": "1.5"
},
"changed": false
}
TASK [openshift_version : set_fact] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:19
ok: [localhost] => {
"ansible_facts": {
"openshift_release": "1.5"
},
"changed": false
}
TASK [openshift_version : set_fact] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:23
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : set_fact] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:27
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : set_fact] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:32
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : Use openshift.common.version fact as version to configure if already installed] **************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:38
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : Set rpm version to configure if openshift_pkg_version specified] *****************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml:2
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : Create temporary yum.conf file] **************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml:11
Using module file /usr/lib/python2.7/site-packages/ansible/modules/commands/command.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367139.67-104426862863291 `" && echo ansible-tmp-1497367139.67-104426862863291="` echo /root/.ansible/tmp/ansible-tmp-1497367139.67-104426862863291 `" ) && sleep 0'
<localhost> PUT /tmp/tmpBtUrm4 TO /root/.ansible/tmp/ansible-tmp-1497367139.67-104426862863291/command.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367139.67-104426862863291/ /root/.ansible/tmp/ansible-tmp-1497367139.67-104426862863291/command.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367139.67-104426862863291/command.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367139.67-104426862863291/" > /dev/null 2>&1 && sleep 0'
changed: [localhost] => {
"changed": true,
"cmd": [
"mktemp",
"-d",
"/tmp/yum.conf.XXXXXX"
],
"delta": "0:00:00.004283",
"end": "2017-06-13 08:18:59.904608",
"invocation": {
"module_args": {
"_raw_params": "mktemp -d /tmp/yum.conf.XXXXXX",
"_uses_shell": false,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"warn": true
}
},
"rc": 0,
"start": "2017-06-13 08:18:59.900325"
}
STDOUT:
/tmp/yum.conf.CWAsmU
TASK [openshift_version : set_fact] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml:15
ok: [localhost] => {
"ansible_facts": {
"yum_conf_temp_file": "/tmp/yum.conf.CWAsmU/yum.conf"
},
"changed": false
}
TASK [openshift_version : Copy yum.conf into the temporary file] *******************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml:18
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/copy.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367140.44-26006192652006 `" && echo ansible-tmp-1497367140.44-26006192652006="` echo /root/.ansible/tmp/ansible-tmp-1497367140.44-26006192652006 `" ) && sleep 0'
<localhost> PUT /tmp/tmpy3WSv2 TO /root/.ansible/tmp/ansible-tmp-1497367140.44-26006192652006/copy.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367140.44-26006192652006/ /root/.ansible/tmp/ansible-tmp-1497367140.44-26006192652006/copy.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367140.44-26006192652006/copy.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367140.44-26006192652006/" > /dev/null 2>&1 && sleep 0'
changed: [localhost] => {
"changed": true,
"checksum": "de079542f6812661899f13c08f8b7d34df8fe917",
"dest": "/tmp/yum.conf.CWAsmU/yum.conf",
"gid": 0,
"group": "root",
"invocation": {
"module_args": {
"attributes": null,
"backup": false,
"content": null,
"delimiter": null,
"dest": "/tmp/yum.conf.CWAsmU/yum.conf",
"directory_mode": null,
"follow": false,
"force": true,
"group": null,
"mode": null,
"original_basename": null,
"owner": null,
"regexp": null,
"remote_src": true,
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": "/etc/yum.conf",
"unsafe_writes": null,
"validate": null
}
},
"md5sum": "b2113d639fd25196e8105c7c61464fc3",
"mode": "0644",
"owner": "root",
"secontext": "unconfined_u:object_r:user_tmp_t:s0",
"size": 1091,
"src": "/etc/yum.conf",
"state": "file",
"uid": 0
}
TASK [openshift_version : Clear the exclude= list in the temporary yum.conf] *******************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml:24
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/lineinfile.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367141.11-212811378534698 `" && echo ansible-tmp-1497367141.11-212811378534698="` echo /root/.ansible/tmp/ansible-tmp-1497367141.11-212811378534698 `" ) && sleep 0'
<localhost> PUT /tmp/tmpeC7TvY TO /root/.ansible/tmp/ansible-tmp-1497367141.11-212811378534698/lineinfile.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367141.11-212811378534698/ /root/.ansible/tmp/ansible-tmp-1497367141.11-212811378534698/lineinfile.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367141.11-212811378534698/lineinfile.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367141.11-212811378534698/" > /dev/null 2>&1 && sleep 0'
changed: [localhost] => {
"backup": "",
"changed": true,
"diff": [
{
"after": "",
"after_header": "/tmp/yum.conf.CWAsmU/yum.conf (content)",
"before": "",
"before_header": "/tmp/yum.conf.CWAsmU/yum.conf (content)"
},
{
"after_header": "/tmp/yum.conf.CWAsmU/yum.conf (file attributes)",
"before_header": "/tmp/yum.conf.CWAsmU/yum.conf (file attributes)"
}
],
"invocation": {
"module_args": {
"attributes": null,
"backrefs": false,
"backup": false,
"content": null,
"create": false,
"delimiter": null,
"dest": "/tmp/yum.conf.CWAsmU/yum.conf",
"directory_mode": null,
"follow": false,
"force": null,
"group": null,
"insertafter": null,
"insertbefore": null,
"line": "exclude=",
"mode": null,
"owner": null,
"path": "/tmp/yum.conf.CWAsmU/yum.conf",
"regexp": "^exclude=",
"remote_src": null,
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": null,
"state": "present",
"unsafe_writes": null,
"validate": null
}
}
}
MSG:
line replaced
TASK [openshift_version : Gather common package version] ***************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml:31
Using module file /usr/lib/python2.7/site-packages/ansible/modules/commands/command.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367141.51-197794143920260 `" && echo ansible-tmp-1497367141.51-197794143920260="` echo /root/.ansible/tmp/ansible-tmp-1497367141.51-197794143920260 `" ) && sleep 0'
<localhost> PUT /tmp/tmpBxBsLA TO /root/.ansible/tmp/ansible-tmp-1497367141.51-197794143920260/command.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367141.51-197794143920260/ /root/.ansible/tmp/ansible-tmp-1497367141.51-197794143920260/command.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367141.51-197794143920260/command.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367141.51-197794143920260/" > /dev/null 2>&1 && sleep 0'
ok: [localhost] => {
"changed": false,
"cmd": [
"repoquery",
"--plugins",
"--config",
"/tmp/yum.conf.CWAsmU/yum.conf",
"--qf",
"%{version}",
"origin"
],
"delta": "0:00:00.577509",
"end": "2017-06-13 08:19:02.341264",
"failed": false,
"failed_when_result": false,
"invocation": {
"module_args": {
"_raw_params": "repoquery --plugins --config \"/tmp/yum.conf.CWAsmU/yum.conf\" --qf '%{version}' \"origin\"",
"_uses_shell": false,
"chdir": null,
"creates": null,
"executable": null,
"removes": null,
"warn": true
}
},
"rc": 0,
"start": "2017-06-13 08:19:01.763755"
}
STDOUT:
Could not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=os&infra=stock error was
14: curl#6 - "Could not resolve host: mirrorlist.centos.org; Unknown error"
Could not get metalink https://mirrors.fedoraproject.org/metalink?repo=epel-7&arch=x86_64 error was
14: curl#6 - "Could not resolve host: mirrors.fedoraproject.org; Unknown error"
Could not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=extras&infra=stock error was
14: curl#6 - "Could not resolve host: mirrorlist.centos.org; Unknown error"
Could not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=updates&infra=stock error was
14: curl#6 - "Could not resolve host: mirrorlist.centos.org; Unknown error"
1.5.1
STDERR:
Repository centos-openshift-origin is listed more than once in the configuration
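Note that yum prints the four mirrorlist failures to stdout here, not stderr, so the registered stdout of this task is the warning text plus "1.5.1" rather than a bare version string; everything downstream that captures this output verbatim inherits the blob. A quick manual check (a sketch, assuming the same repo configuration as this run; the temporary yum.conf path will differ, so the plain system config is queried instead):

    # re-run the same query and compare the full stdout with just its last line
    repoquery --plugins --qf '%{version}' origin
    # in this run the real version happens to be the final line of stdout
    repoquery --plugins --qf '%{version}' origin | tail -n 1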
TASK [openshift_version : Delete the temporary yum.conf] ***************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml:39
Using module file /usr/lib/python2.7/site-packages/ansible/modules/files/file.py
<localhost> ESTABLISH LOCAL CONNECTION FOR USER: ith
<localhost> EXEC /bin/sh -c 'echo ~ && sleep 0'
<localhost> EXEC /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp/ansible-tmp-1497367142.47-199745651861494 `" && echo ansible-tmp-1497367142.47-199745651861494="` echo /root/.ansible/tmp/ansible-tmp-1497367142.47-199745651861494 `" ) && sleep 0'
<localhost> PUT /tmp/tmp1bqAGS TO /root/.ansible/tmp/ansible-tmp-1497367142.47-199745651861494/file.py
<localhost> EXEC /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1497367142.47-199745651861494/ /root/.ansible/tmp/ansible-tmp-1497367142.47-199745651861494/file.py && sleep 0'
<localhost> EXEC /bin/sh -c '/usr/bin/python /root/.ansible/tmp/ansible-tmp-1497367142.47-199745651861494/file.py; rm -rf "/root/.ansible/tmp/ansible-tmp-1497367142.47-199745651861494/" > /dev/null 2>&1 && sleep 0'
changed: [localhost] => {
"changed": true,
"diff": {
"after": {
"path": "/tmp/yum.conf.CWAsmU",
"state": "absent"
},
"before": {
"path": "/tmp/yum.conf.CWAsmU",
"state": "directory"
}
},
"invocation": {
"module_args": {
"attributes": null,
"backup": null,
"content": null,
"delimiter": null,
"diff_peek": null,
"directory_mode": null,
"follow": false,
"force": false,
"group": null,
"mode": null,
"original_basename": null,
"owner": null,
"path": "/tmp/yum.conf.CWAsmU",
"recurse": false,
"regexp": null,
"remote_src": null,
"selevel": null,
"serole": null,
"setype": null,
"seuser": null,
"src": null,
"state": "absent",
"unsafe_writes": null,
"validate": null
}
},
"path": "/tmp/yum.conf.CWAsmU",
"state": "absent"
}
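For reference, the temp-yum.conf tasks above (create, copy, clear excludes, query, delete) amount to roughly this shell sequence; this is a sketch of what the log shows, not the role's actual implementation, and the sed line only clears an existing exclude= entry, whereas the lineinfile task would also append one if it were missing:

    tmpdir=$(mktemp -d /tmp/yum.conf.XXXXXX)             # Create temporary yum.conf dir
    cp /etc/yum.conf "$tmpdir/yum.conf"                  # Copy yum.conf into the temp dir
    sed -i 's/^exclude=.*/exclude=/' "$tmpdir/yum.conf"  # Clear the exclude= list
    repoquery --plugins --config "$tmpdir/yum.conf" \
        --qf '%{version}' origin                         # Gather the origin package version
    rm -rf "$tmpdir"                                     # Delete the temporary yum.conf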
TASK [openshift_version : set_fact] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_rpm.yml:44
ok: [localhost] => {
"ansible_facts": {
"openshift_version": "Could not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=os&infra=stock error was\n14: curl#6 - \"Could not resolve host: mirrorlist.centos.org; Unknown error\"\nCould not get metalink https://mirrors.fedoraproject.org/metalink?repo=epel-7&arch=x86_64 error was\n14: curl#6 - \"Could not resolve host: mirrors.fedoraproject.org; Unknown error\"\nCould not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=extras&infra=stock error was\n14: curl#6 - \"Could not resolve host: mirrorlist.centos.org; Unknown error\"\nCould not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=updates&infra=stock error was\n14: curl#6 - \"Could not resolve host: mirrorlist.centos.org; Unknown error\"\n1.5.1"
},
"changed": false
}
TASK [openshift_version : Set containerized version to configure if openshift_image_tag specified] *********************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml:2
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : Set containerized version to configure if openshift_release specified] ***********************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml:9
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : Set default image tag based on OCP versus origin] ********************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml:14
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : Lookup latest containerized version if no version specified] *********************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml:18
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : set_fact] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml:25
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : set_fact] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml:29
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : Set precise containerized version to configure if openshift_release specified] ***************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml:35
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : set_fact] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml:41
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : set_fact] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/set_version_containerized.yml:48
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : set_fact] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:56
ok: [localhost] => {
"ansible_facts": {
"openshift_image_tag": "vCould not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=os&infra=stock error was\n14: curl#6 - \"Could not resolve host: mirrorlist.centos.org; Unknown error\"\nCould not get metalink https://mirrors.fedoraproject.org/metalink?repo=epel-7&arch=x86_64 error was\n14: curl#6 - \"Could not resolve host: mirrors.fedoraproject.org; Unknown error\"\nCould not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=extras&infra=stock error was\n14: curl#6 - \"Could not resolve host: mirrorlist.centos.org; Unknown error\"\nCould not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=updates&infra=stock error was\n14: curl#6 - \"Could not resolve host: mirrorlist.centos.org; Unknown error\"\n1.5.1"
},
"changed": false
}
TASK [openshift_version : set_fact] ************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:60
ok: [localhost] => {
"ansible_facts": {
"openshift_pkg_version": "-Could not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=os&infra=stock error was\n14: curl#6 - \"Could not resolve host: mirrorlist.centos.org; Unknown error\"\nCould not get metalink https://mirrors.fedoraproject.org/metalink?repo=epel-7&arch=x86_64 error was\n14: curl#6 - \"Could not resolve host: mirrors.fedoraproject.org; Unknown error\"\nCould not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=extras&infra=stock error was\n14: curl#6 - \"Could not resolve host: mirrorlist.centos.org; Unknown error\"\nCould not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=updates&infra=stock error was\n14: curl#6 - \"Could not resolve host: mirrorlist.centos.org; Unknown error\"\n1.5.1"
},
"changed": false
}
TASK [openshift_version : fail] ****************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:64
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : fail] ****************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:68
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : fail] ****************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:72
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : fail] ****************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:76
skipping: [localhost] => {
"changed": false,
"skip_reason": "Conditional result was False",
"skipped": true
}
TASK [openshift_version : fail] ****************************************************************************************************************************************
task path: /usr/share/ansible/openshift-ansible/roles/openshift_version/tasks/main.yml:82
fatal: [localhost]: FAILED! => {
"changed": false,
"failed": true
}
MSG:
Detected OpenShift version Could not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=os&infra=stock error was
14: curl#6 - "Could not resolve host: mirrorlist.centos.org; Unknown error"
Could not get metalink https://mirrors.fedoraproject.org/metalink?repo=epel-7&arch=x86_64 error was
14: curl#6 - "Could not resolve host: mirrors.fedoraproject.org; Unknown error"
Could not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=extras&infra=stock error was
14: curl#6 - "Could not resolve host: mirrorlist.centos.org; Unknown error"
Could not retrieve mirrorlist http://mirrorlist.centos.org/?release=7&arch=x86_64&repo=updates&infra=stock error was
14: curl#6 - "Could not resolve host: mirrorlist.centos.org; Unknown error"
1.5.1 does not match requested openshift_release 1.5. You may need to adjust your yum repositories, inventory, or run the appropriate OpenShift upgrade playbook.
to retry, use: --limit @/usr/share/ansible/openshift-ansible/playbooks/byo/config.retry
PLAY RECAP *************************************************************************************************************************************************************
localhost : ok=59 changed=4 unreachable=0 failed=1
[root@viaq openshift-ansible]#
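The underlying failure is name resolution on the host: curl error 6 means the resolver could not look up mirrorlist.centos.org or mirrors.fedoraproject.org, yum emits those warnings on stdout, and the openshift_version role then records the whole blob as the detected version, so the comparison against the requested openshift_release 1.5 fails. A starting point for diagnosis (a sketch; the exact repo file names on your host may vary):

    # confirm whether DNS works at all from this host
    getent hosts mirrorlist.centos.org || cat /etc/resolv.conf
    # the STDERR warning above also points at a duplicated repo definition
    grep -rl 'centos-openshift-origin' /etc/yum.repos.d/ /etc/yum.conf

Once resolution works and the duplicate repo entry is removed, the run can be resumed with the --limit retry file shown in the failure message.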