@mirceaulinic
Created March 21, 2017 11:50
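What follows is a verbatim debug log from a Salt proxy minion (`device1`) driving a Cisco CSR router over SSH via netmiko, while a scheduled `state.sls router.ntp` job repeatedly fails because NAPALM is not installed on the proxy. A proxy pillar along these lines would produce such a session (a sketch only; the `proxytype`, `device_type`, and credential values are assumptions, not taken from the log):

```yaml
# Hypothetical pillar for device1 -- only the hostname appears in the log itself.
proxy:
  proxytype: netmiko          # assumed: the log shows netmiko read/write_channel calls
  device_type: cisco_ios      # assumed from the IOS-style prompt and commands
  host: ip-172-31-46-249.us-east-2.compute.internal
  username: admin             # placeholder
  password: '<password>'      # placeholder
```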
[INFO ] nxos proxy __virtual__() called...
[DEBUG ] rest_sample proxy __virtual__() called...
[INFO ] ssh_sample proxy __virtual__() called...
[DEBUG ] Could not LazyLoad netmiko.grains: 'netmiko.grains' is not available.
[DEBUG ] starting thread (client mode): 0x40279690L
[DEBUG ] Local version/idstring: SSH-2.0-paramiko_2.1.2
[DEBUG ] Remote version/idstring: SSH-2.0-Cisco-1.25
[INFO ] Connected (version 2.0, client Cisco-1.25)
[DEBUG ] kex algos:[u'diffie-hellman-group-exchange-sha1', u'diffie-hellman-group14-sha1'] server key:[u'ssh-rsa'] client encrypt:[u'aes128-ctr', u'aes192-ctr', u'aes256-ctr', u'aes128-cbc', u'3des-cbc', u'aes192-cbc', u'aes256-cbc'] server encrypt:[u'aes128-ctr', u'aes192-ctr', u'aes256-ctr', u'aes128-cbc', u'3des-cbc', u'aes192-cbc', u'aes256-cbc'] client mac:[u'hmac-sha1', u'hmac-sha1-96'] server mac:[u'hmac-sha1', u'hmac-sha1-96'] client compress:[u'none'] server compress:[u'none'] client lang:[u''] server lang:[u''] kex follows?False
[DEBUG ] Kex agreed: diffie-hellman-group14-sha1
[DEBUG ] Cipher agreed: aes128-ctr
[DEBUG ] MAC agreed: hmac-sha1-96
[DEBUG ] Compression agreed: none
[DEBUG ] kex engine KexGroup14 specified hash_algo <built-in function openssl_sha1>
[DEBUG ] Switch to new keys ...
[DEBUG ] Adding ssh-rsa host key for ip-172-31-46-249.us-east-2.compute.internal: d26b4df49e4d6c7d58bb4e2eae20a5c5
[DEBUG ] userauth is OK
[INFO ] Authentication (password) successful!
[DEBUG ] [chan 0] Max packet in: 32768 bytes
[DEBUG ] [chan 0] Max packet out: 4096 bytes
[DEBUG ] Secsh channel 0 opened.
[DEBUG ] [chan 0] Sesch channel 0 request ok
[DEBUG ] [chan 0] Sesch channel 0 request ok
[DEBUG ] read_channel:
ip-172-31-46-249#
[DEBUG ] read_channel:
[DEBUG ] read_channel:
[DEBUG ] read_channel:
[DEBUG ] write_channel:
[DEBUG ] read_channel:
ip-172-31-46-249#
[DEBUG ] read_channel:
[DEBUG ] read_channel:
[DEBUG ] write_channel: terminal length 0
[DEBUG ] _read_channel_expect read_data: te
[DEBUG ] _read_channel_expect read_data: rminal length 0
ip-172-31-46-249#
[DEBUG ] write_channel: terminal width 511
[DEBUG ] _read_channel_expect read_data: termi
[DEBUG ] _read_channel_expect read_data: nal width 511
ip-172-31-46-249#
[DEBUG ] read_channel:
[DEBUG ] Reading configuration from /etc/salt/proxy
[DEBUG ] /etc/resolv.conf: The domain and search keywords are mutually exclusive.
[DEBUG ] Please install 'virt-what' to improve results of the 'virtual' grain.
[DEBUG ] Reading configuration from /etc/salt/proxy
[DEBUG ] /etc/resolv.conf: The domain and search keywords are mutually exclusive.
[DEBUG ] Please install 'virt-what' to improve results of the 'virtual' grain.
[DEBUG ] Initializing new Schedule
[DEBUG ] LazyLoaded timezone.get_offset
[DEBUG ] LazyLoaded cmd.run
[INFO ] Executing command ['date', '+%z'] in directory '/home/admin'
[DEBUG ] output: +0000
[DEBUG ] LazyLoaded config.merge
[DEBUG ] read_channel:
ip-172-31-46-249#
[DEBUG ] read_channel:
[DEBUG ] read_channel:
[DEBUG ] write_channel: show clock
[DEBUG ] read_channel:
[DEBUG ] read_channel: show clock
*11:46:08.849 UTC Tue Mar 21 2017
ip-172-31-46-249#
[DEBUG ] Minion return retry timer set to 9 seconds (randomized)
[INFO ] Returning information for job: 20170321114608960624
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506', 'aes')
[DEBUG ] Initializing new AsyncAuth for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506')
[INFO ] Running scheduled job: stuff
[DEBUG ] schedule: This job was scheduled with jid_include, adding to cache (jid_include defaults to True)
[INFO ] schedule: maxrunning parameter was not specified for job stuff, defaulting to 1.
[DEBUG ] schedule.handle_func: adding this job to the jobcache with data {'fun_args': [], 'jid': '20170321114621144666', 'schedule': 'stuff', 'pid': 27805, 'fun': 'state.sls', 'id': 'device1'}
[DEBUG ] LazyLoaded state.sls
[DEBUG ] LazyLoaded saltutil.is_running
[DEBUG ] LazyLoaded grains.get
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506', 'aes')
[DEBUG ] Initializing new AsyncAuth for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506')
[DEBUG ] Determining pillar cache
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506', 'aes')
[DEBUG ] Initializing new AsyncAuth for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506')
[DEBUG ] Loaded minion key: /etc/salt/pki/proxy/minion.pem
[INFO ] Loading fresh modules for state activity
[DEBUG ] LazyLoaded jinja.render
[DEBUG ] LazyLoaded yaml.render
[DEBUG ] In saltenv 'base', looking at rel_path 'router/ntp.sls' to resolve 'salt://router/ntp.sls'
[DEBUG ] In saltenv 'base', ** considering ** path '/var/cache/salt/proxy/device1/files/base/router/ntp.sls' to resolve 'salt://router/ntp.sls'
[INFO ] Fetching file from saltenv 'base', ** skipped ** latest already in cache 'salt://router/ntp.sls', mode up-to-date
[DEBUG ] compile template: /var/cache/salt/proxy/device1/files/base/router/ntp.sls
[DEBUG ] Jinja search path: ['/var/cache/salt/proxy/device1/files/base']
[PROFILE ] Time (in seconds) to render '/var/cache/salt/proxy/device1/files/base/router/ntp.sls' using 'jinja' renderer: 0.00514888763428
[DEBUG ] Rendered data from file: /var/cache/salt/proxy/device1/files/base/router/ntp.sls:
update_my_ntp_peers:
netconfig.managed:
- template_name: salt://ntp.jinja
- peers: ["1.2.3.4"]
- servers: ["9.10.11.12"]
[DEBUG ] LazyLoaded config.get
[DEBUG ] Results of YAML rendering:
OrderedDict([('update_my_ntp_peers', OrderedDict([('netconfig.managed', [OrderedDict([('template_name', 'salt://ntp.jinja')]), OrderedDict([('peers', ['1.2.3.4'])]), OrderedDict([('servers', ['9.10.11.12'])])])]))])
[PROFILE ] Time (in seconds) to render '/var/cache/salt/proxy/device1/files/base/router/ntp.sls' using 'yaml' renderer: 0.00222086906433
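The SLS rendered above passes `peers` and `servers` to `salt://ntp.jinja`, which is not included in this log. A template consistent with those variables might look like this (purely illustrative; the real template is not shown here):

```jinja
{#- Hypothetical ntp.jinja; the actual template is not part of this log -#}
{%- for peer in peers %}
ntp peer {{ peer }}
{%- endfor %}
{%- for server in servers %}
ntp server {{ server }}
{%- endfor %}
```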
[DEBUG ] LazyLoaded boto.assign_funcs
[DEBUG ] boto_kms requires boto 2.38.0.
[DEBUG ] Could not LazyLoad acme.cert: 'acme' __virtual__ returned False: The ACME execution module cannot be loaded: letsencrypt-auto not installed.
[DEBUG ] LazyLoaded apache.config
[DEBUG ] LazyLoaded apache.a2enconf
[DEBUG ] LazyLoaded apache.a2enmod
[DEBUG ] LazyLoaded apache.a2ensite
[DEBUG ] Could not LazyLoad pkg.get_selections: 'pkg.get_selections' is not available.
[DEBUG ] LazyLoaded at.at
[DEBUG ] Could not LazyLoad augeas.execute: 'augeas.execute' is not available.
[DEBUG ] LazyLoaded boto3_elasticache.cache_cluster_exists
[DEBUG ] LazyLoaded boto3_route53.find_hosted_zone
[DEBUG ] LazyLoaded boto_apigateway.describe_apis
[DEBUG ] LazyLoaded boto_asg.exists
[DEBUG ] LazyLoaded boto_cloudtrail.exists
[DEBUG ] LazyLoaded boto_cloudwatch_event.exists
[DEBUG ] LazyLoaded boto_cognitoidentity.describe_identity_pools
[DEBUG ] LazyLoaded boto_datapipeline.create_pipeline
[DEBUG ] LazyLoaded boto_elasticsearch_domain.exists
[DEBUG ] LazyLoaded boto_iot.policy_exists
[DEBUG ] LazyLoaded boto_kinesis.exists
[DEBUG ] LazyLoaded boto_kms.describe_key
[DEBUG ] LazyLoaded boto_lambda.function_exists
[DEBUG ] LazyLoaded boto_asg.exists
[DEBUG ] LazyLoaded boto_rds.exists
[DEBUG ] LazyLoaded boto_route53.get_record
[DEBUG ] LazyLoaded boto_s3_bucket.exists
[DEBUG ] LazyLoaded boto_vpc.exists
[DEBUG ] LazyLoaded bower.list
[DEBUG ] LazyLoaded chef.client
[DEBUG ] LazyLoaded chocolatey.install
[DEBUG ] LazyLoaded cyg.list
[DEBUG ] LazyLoaded ddns.update
[DEBUG ] LazyLoaded eselect.exec_action
[INFO ] Executing command ['git', '--version'] in directory '/home/admin'
[DEBUG ] stdout: git version 2.11.0
[DEBUG ] LazyLoaded github.list_users
[DEBUG ] LazyLoaded glusterfs.list_volumes
[DEBUG ] LazyLoaded elasticsearch.exists
[DEBUG ] LazyLoaded icinga2.generate_ticket
[DEBUG ] LazyLoaded ifttt.trigger_event
[DEBUG ] Could not LazyLoad influxdb08.db_exists: 'influxdb08.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb08.db_exists: 'influxdb08.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb.db_exists: 'influxdb.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb.db_exists: 'influxdb.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb.db_exists: 'influxdb.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb.db_exists: 'influxdb.db_exists' is not available.
[DEBUG ] LazyLoaded ipset.version
[DEBUG ] LazyLoaded kapacitor.version
[DEBUG ] LazyLoaded keystone.auth
[DEBUG ] LazyLoaded kmod.available
[DEBUG ] LazyLoaded layman.add
[DEBUG ] LazyLoaded lvs.get_rules
[DEBUG ] LazyLoaded lvs.get_rules
[DEBUG ] LazyLoaded makeconf.get_var
[DEBUG ] LazyLoaded memcached.status
[DEBUG ] LazyLoaded mongodb.user_exists
[DEBUG ] LazyLoaded monit.summary
[DEBUG ] LazyLoaded mysql.db_exists
[DEBUG ] LazyLoaded mysql.grant_exists
[DEBUG ] LazyLoaded mysql.query
[DEBUG ] LazyLoaded mysql.user_create
[DEBUG ] LazyLoaded nftables.version
[DEBUG ] LazyLoaded npm.list
[DEBUG ] LazyLoaded openvswitch.bridge_create
[DEBUG ] LazyLoaded openvswitch.port_add
[DEBUG ] LazyLoaded pdbedit.create
[DEBUG ] LazyLoaded pecl.list
[DEBUG ] lzma module is not available
[DEBUG ] Registered VCS backend: git
[DEBUG ] Registered VCS backend: hg
[DEBUG ] Registered VCS backend: svn
[DEBUG ] Registered VCS backend: bzr
[DEBUG ] Could not LazyLoad pkg.install: 'pkg.install' is not available.
[DEBUG ] Could not LazyLoad pkg.mod_repo: 'pkg.mod_repo' is not available.
[DEBUG ] LazyLoaded portage_config.get_missing_flags
[DEBUG ] LazyLoaded postgres.cluster_exists
[DEBUG ] LazyLoaded postgres.user_exists
[DEBUG ] LazyLoaded postgres.create_extension
[DEBUG ] LazyLoaded postgres.group_create
[DEBUG ] LazyLoaded postgres.datadir_init
[DEBUG ] LazyLoaded postgres.language_create
[DEBUG ] LazyLoaded postgres.privileges_grant
[DEBUG ] LazyLoaded postgres.schema_exists
[DEBUG ] LazyLoaded postgres.tablespace_exists
[DEBUG ] LazyLoaded postgres.user_exists
[DEBUG ] LazyLoaded ps.pkill
[DEBUG ] LazyLoaded quota.report
[DEBUG ] Could not LazyLoad rbac.profile_list: 'rbac.profile_list' is not available.
[DEBUG ] LazyLoaded rdp.enable
[DEBUG ] Could not LazyLoad redis.set_key: 'redis.set_key' is not available.
[DEBUG ] LazyLoaded reg.read_value
[DEBUG ] LazyLoaded selinux.getenforce
[DEBUG ] LazyLoaded service.start
[DEBUG ] Could not LazyLoad vmadm.create: 'vmadm.create' is not available.
[DEBUG ] LazyLoaded snapper.diff
[DEBUG ] LazyLoaded splunk.list_users
[DEBUG ] LazyLoaded splunk_search.get
[DEBUG ] LazyLoaded stormpath.create_account
[DEBUG ] Could not LazyLoad sysctl.show: 'sysctl.show' is not available.
[DEBUG ] LazyLoaded tls.cert_info
[DEBUG ] LazyLoaded tomcat.status
[DEBUG ] LazyLoaded trafficserver.set_config
[DEBUG ] LazyLoaded victorops.create_event
[DEBUG ] LazyLoaded virt.node_info
[DEBUG ] LazyLoaded win_dacl.add_ace
[DEBUG ] LazyLoaded win_dns_client.add_dns
[DEBUG ] Could not LazyLoad firewall.get_config: 'firewall.get_config' is not available.
[DEBUG ] LazyLoaded win_iis.create_site
[DEBUG ] Could not LazyLoad lgpo.set: 'lgpo.set' is not available.
[DEBUG ] LazyLoaded win_path.rehash
[DEBUG ] LazyLoaded win_pki.get_stores
[DEBUG ] LazyLoaded win_servermanager.install
[DEBUG ] LazyLoaded win_smtp_server.get_server_setting
[DEBUG ] LazyLoaded win_snmp.get_agent_settings
[DEBUG ] LazyLoaded x509.get_pem_entry
[DEBUG ] LazyLoaded xmpp.send_msg
[DEBUG ] LazyLoaded zabbix.host_create
[DEBUG ] LazyLoaded zabbix.hostgroup_create
[DEBUG ] LazyLoaded zabbix.user_create
[DEBUG ] LazyLoaded zabbix.usergroup_create
[DEBUG ] LazyLoaded zfs.create
[DEBUG ] LazyLoaded config.merge
[DEBUG ] LazyLoaded zk_concurrency.lock
[DEBUG ] LazyLoaded zonecfg.create
[DEBUG ] LazyLoaded zpool.create
[DEBUG ] Could not LazyLoad netconfig.managed: 'netconfig' __virtual__ returned False: "netconfig"" (/home/admin/salt/salt/states/netconfig.pyc) cannot be loaded: NAPALM is not installed or not running in a (proxy) minion
[INFO ] Running state [update_my_ntp_peers] at time 11:46:22.143197
[DEBUG ] LazyLoaded netconfig.managed
[ERROR ] State 'netconfig.managed' was not found in SLS 'router.ntp'
Reason: 'netconfig' __virtual__ returned False: "netconfig"" (/home/admin/salt/salt/states/netconfig.pyc) cannot be loaded: NAPALM is not installed or not running in a (proxy) minion
[DEBUG ] File /var/cache/salt/proxy/device1/accumulator/140669616011024 does not exist, no need to cleanup.
[DEBUG ] SaltEvent PUB socket URI: /var/run/salt/minion/minion_event_18faa0dd7a_pub.ipc
[DEBUG ] SaltEvent PULL socket URI: /var/run/salt/minion/minion_event_18faa0dd7a_pull.ipc
[DEBUG ] Initializing new IPCClient for path: /var/run/salt/minion/minion_event_18faa0dd7a_pull.ipc
[DEBUG ] Sending event: tag = __schedule_return; data = {'fun_args': ['router.ntp'], 'jid': 'req', 'return': {'netconfig_|-update_my_ntp_peers_|-update_my_ntp_peers_|-managed': {'comment': 'State \'netconfig.managed\' was not found in SLS \'router.ntp\'\nReason: \'netconfig\' __virtual__ returned False: "netconfig"" (/home/admin/salt/salt/states/netconfig.pyc) cannot be loaded: NAPALM is not installed or not running in a (proxy) minion\n', '__run_num__': 0, 'changes': {}, 'result': False, 'name': 'update_my_ntp_peers'}}, 'success': True, 'schedule': 'stuff', 'cmd': '_return', 'pid': 27805, '_stamp': '2017-03-21T11:46:22.146119', 'fun': 'state.sls', 'id': 'device1'}
[DEBUG ] schedule.handle_func: Removing /var/cache/salt/proxy/device1/proc/20170321114621144666
[DEBUG ] Minion of "localhost" is handling event tag '__schedule_return'
[INFO ] Returning information for job: req
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506', 'aes')
[DEBUG ] Re-using AsyncAuth for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506')
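The `netconfig` state fails to load above because its `__virtual__` function cannot find NAPALM on the proxy minion. A minimal probe of the same condition (a sketch; Salt's actual check may test additional NAPALM sub-packages and the proxy type):

```python
# Mirrors the kind of import probe a Salt __virtual__ function performs.
# Returns True only if the napalm library is importable in this environment.
def napalm_available():
    try:
        import napalm  # noqa: F401  -- the dependency the error message names
        return True
    except ImportError:
        return False

print(napalm_available())
```

If this prints False, installing NAPALM (e.g. `pip install napalm`) and restarting the proxy minion would let `netconfig.managed` load on the next scheduler run.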
[INFO ] Running scheduled job: stuff
[DEBUG ] schedule: This job was scheduled with jid_include, adding to cache (jid_include defaults to True)
[INFO ] schedule: maxrunning parameter was not specified for job stuff, defaulting to 1.
[DEBUG ] schedule.handle_func: adding this job to the jobcache with data {'fun_args': [], 'jid': '20170321114722197932', 'schedule': 'stuff', 'pid': 27805, 'fun': 'state.sls', 'id': 'device1'}
[DEBUG ] LazyLoaded state.sls
[DEBUG ] LazyLoaded saltutil.is_running
[DEBUG ] LazyLoaded grains.get
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506', 'aes')
[DEBUG ] Initializing new AsyncAuth for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506')
[DEBUG ] Determining pillar cache
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506', 'aes')
[DEBUG ] Initializing new AsyncAuth for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506')
[DEBUG ] Loaded minion key: /etc/salt/pki/proxy/minion.pem
[INFO ] Loading fresh modules for state activity
[DEBUG ] LazyLoaded jinja.render
[DEBUG ] LazyLoaded yaml.render
[DEBUG ] In saltenv 'base', looking at rel_path 'router/ntp.sls' to resolve 'salt://router/ntp.sls'
[DEBUG ] In saltenv 'base', ** considering ** path '/var/cache/salt/proxy/device1/files/base/router/ntp.sls' to resolve 'salt://router/ntp.sls'
[INFO ] Fetching file from saltenv 'base', ** skipped ** latest already in cache 'salt://router/ntp.sls', mode up-to-date
[DEBUG ] compile template: /var/cache/salt/proxy/device1/files/base/router/ntp.sls
[DEBUG ] Jinja search path: ['/var/cache/salt/proxy/device1/files/base']
[PROFILE ] Time (in seconds) to render '/var/cache/salt/proxy/device1/files/base/router/ntp.sls' using 'jinja' renderer: 0.00308299064636
[DEBUG ] Rendered data from file: /var/cache/salt/proxy/device1/files/base/router/ntp.sls:
update_my_ntp_peers:
netconfig.managed:
- template_name: salt://ntp.jinja
- peers: ["1.2.3.4"]
- servers: ["9.10.11.12"]
[DEBUG ] LazyLoaded config.get
[DEBUG ] Results of YAML rendering:
OrderedDict([('update_my_ntp_peers', OrderedDict([('netconfig.managed', [OrderedDict([('template_name', 'salt://ntp.jinja')]), OrderedDict([('peers', ['1.2.3.4'])]), OrderedDict([('servers', ['9.10.11.12'])])])]))])
[PROFILE ] Time (in seconds) to render '/var/cache/salt/proxy/device1/files/base/router/ntp.sls' using 'yaml' renderer: 0.00321102142334
[DEBUG ] LazyLoaded boto.assign_funcs
[DEBUG ] boto_kms requires boto 2.38.0.
[DEBUG ] Could not LazyLoad acme.cert: 'acme' __virtual__ returned False: The ACME execution module cannot be loaded: letsencrypt-auto not installed.
[DEBUG ] LazyLoaded apache.config
[DEBUG ] LazyLoaded apache.a2enconf
[DEBUG ] LazyLoaded apache.a2enmod
[DEBUG ] LazyLoaded apache.a2ensite
[DEBUG ] Could not LazyLoad pkg.get_selections: 'pkg.get_selections' is not available.
[DEBUG ] LazyLoaded at.at
[DEBUG ] Could not LazyLoad augeas.execute: 'augeas.execute' is not available.
[DEBUG ] LazyLoaded boto3_elasticache.cache_cluster_exists
[DEBUG ] LazyLoaded boto3_route53.find_hosted_zone
[DEBUG ] LazyLoaded boto_apigateway.describe_apis
[DEBUG ] LazyLoaded boto_asg.exists
[DEBUG ] LazyLoaded boto_cloudtrail.exists
[DEBUG ] LazyLoaded boto_cloudwatch_event.exists
[DEBUG ] LazyLoaded boto_cognitoidentity.describe_identity_pools
[DEBUG ] LazyLoaded boto_datapipeline.create_pipeline
[DEBUG ] LazyLoaded boto_elasticsearch_domain.exists
[DEBUG ] LazyLoaded boto_iot.policy_exists
[DEBUG ] LazyLoaded boto_kinesis.exists
[DEBUG ] LazyLoaded boto_kms.describe_key
[DEBUG ] LazyLoaded boto_lambda.function_exists
[DEBUG ] LazyLoaded boto_asg.exists
[DEBUG ] LazyLoaded boto_rds.exists
[DEBUG ] LazyLoaded boto_route53.get_record
[DEBUG ] LazyLoaded boto_s3_bucket.exists
[DEBUG ] LazyLoaded boto_vpc.exists
[DEBUG ] LazyLoaded bower.list
[DEBUG ] LazyLoaded chef.client
[DEBUG ] LazyLoaded chocolatey.install
[DEBUG ] LazyLoaded cyg.list
[DEBUG ] LazyLoaded ddns.update
[DEBUG ] LazyLoaded eselect.exec_action
[INFO ] Executing command ['git', '--version'] in directory '/home/admin'
[DEBUG ] stdout: git version 2.11.0
[DEBUG ] LazyLoaded github.list_users
[DEBUG ] LazyLoaded glusterfs.list_volumes
[DEBUG ] LazyLoaded elasticsearch.exists
[DEBUG ] LazyLoaded icinga2.generate_ticket
[DEBUG ] LazyLoaded ifttt.trigger_event
[DEBUG ] Could not LazyLoad influxdb08.db_exists: 'influxdb08.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb08.db_exists: 'influxdb08.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb.db_exists: 'influxdb.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb.db_exists: 'influxdb.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb.db_exists: 'influxdb.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb.db_exists: 'influxdb.db_exists' is not available.
[DEBUG ] LazyLoaded ipset.version
[DEBUG ] LazyLoaded kapacitor.version
[DEBUG ] LazyLoaded keystone.auth
[DEBUG ] LazyLoaded kmod.available
[DEBUG ] LazyLoaded layman.add
[DEBUG ] LazyLoaded lvs.get_rules
[DEBUG ] LazyLoaded lvs.get_rules
[DEBUG ] LazyLoaded makeconf.get_var
[DEBUG ] LazyLoaded memcached.status
[DEBUG ] LazyLoaded mongodb.user_exists
[DEBUG ] LazyLoaded monit.summary
[DEBUG ] LazyLoaded mysql.db_exists
[DEBUG ] LazyLoaded mysql.grant_exists
[DEBUG ] LazyLoaded mysql.query
[DEBUG ] LazyLoaded mysql.user_create
[DEBUG ] LazyLoaded nftables.version
[DEBUG ] LazyLoaded npm.list
[DEBUG ] LazyLoaded openvswitch.bridge_create
[DEBUG ] LazyLoaded openvswitch.port_add
[DEBUG ] LazyLoaded pdbedit.create
[DEBUG ] LazyLoaded pecl.list
[DEBUG ] Could not LazyLoad pkg.install: 'pkg.install' is not available.
[DEBUG ] Could not LazyLoad pkg.mod_repo: 'pkg.mod_repo' is not available.
[DEBUG ] LazyLoaded portage_config.get_missing_flags
[DEBUG ] LazyLoaded postgres.cluster_exists
[DEBUG ] LazyLoaded postgres.user_exists
[DEBUG ] LazyLoaded postgres.create_extension
[DEBUG ] LazyLoaded postgres.group_create
[DEBUG ] LazyLoaded postgres.datadir_init
[DEBUG ] LazyLoaded postgres.language_create
[DEBUG ] LazyLoaded postgres.privileges_grant
[DEBUG ] LazyLoaded postgres.schema_exists
[DEBUG ] LazyLoaded postgres.tablespace_exists
[DEBUG ] LazyLoaded postgres.user_exists
[DEBUG ] LazyLoaded ps.pkill
[DEBUG ] LazyLoaded quota.report
[DEBUG ] Could not LazyLoad rbac.profile_list: 'rbac.profile_list' is not available.
[DEBUG ] LazyLoaded rdp.enable
[DEBUG ] Could not LazyLoad redis.set_key: 'redis.set_key' is not available.
[DEBUG ] LazyLoaded reg.read_value
[DEBUG ] LazyLoaded selinux.getenforce
[DEBUG ] LazyLoaded service.start
[DEBUG ] Could not LazyLoad vmadm.create: 'vmadm.create' is not available.
[DEBUG ] LazyLoaded snapper.diff
[DEBUG ] LazyLoaded splunk.list_users
[DEBUG ] LazyLoaded splunk_search.get
[DEBUG ] LazyLoaded stormpath.create_account
[DEBUG ] Could not LazyLoad sysctl.show: 'sysctl.show' is not available.
[DEBUG ] LazyLoaded tls.cert_info
[DEBUG ] LazyLoaded tomcat.status
[DEBUG ] LazyLoaded trafficserver.set_config
[DEBUG ] LazyLoaded victorops.create_event
[DEBUG ] LazyLoaded virt.node_info
[DEBUG ] LazyLoaded win_dacl.add_ace
[DEBUG ] LazyLoaded win_dns_client.add_dns
[DEBUG ] Could not LazyLoad firewall.get_config: 'firewall.get_config' is not available.
[DEBUG ] LazyLoaded win_iis.create_site
[DEBUG ] Could not LazyLoad lgpo.set: 'lgpo.set' is not available.
[DEBUG ] LazyLoaded win_path.rehash
[DEBUG ] LazyLoaded win_pki.get_stores
[DEBUG ] LazyLoaded win_servermanager.install
[DEBUG ] LazyLoaded win_smtp_server.get_server_setting
[DEBUG ] LazyLoaded win_snmp.get_agent_settings
[DEBUG ] LazyLoaded x509.get_pem_entry
[DEBUG ] LazyLoaded xmpp.send_msg
[DEBUG ] LazyLoaded zabbix.host_create
[DEBUG ] LazyLoaded zabbix.hostgroup_create
[DEBUG ] LazyLoaded zabbix.user_create
[DEBUG ] LazyLoaded zabbix.usergroup_create
[DEBUG ] LazyLoaded zfs.create
[DEBUG ] LazyLoaded zk_concurrency.lock
[DEBUG ] LazyLoaded zonecfg.create
[DEBUG ] LazyLoaded zpool.create
[DEBUG ] Could not LazyLoad netconfig.managed: 'netconfig' __virtual__ returned False: "netconfig"" (/home/admin/salt/salt/states/netconfig.pyc) cannot be loaded: NAPALM is not installed or not running in a (proxy) minion
[INFO ] Running state [update_my_ntp_peers] at time 11:47:22.851233
[DEBUG ] LazyLoaded netconfig.managed
[ERROR ] State 'netconfig.managed' was not found in SLS 'router.ntp'
Reason: 'netconfig' __virtual__ returned False: "netconfig"" (/home/admin/salt/salt/states/netconfig.pyc) cannot be loaded: NAPALM is not installed or not running in a (proxy) minion
[DEBUG ] File /var/cache/salt/proxy/device1/accumulator/140669308685776 does not exist, no need to cleanup.
[DEBUG ] SaltEvent PUB socket URI: /var/run/salt/minion/minion_event_18faa0dd7a_pub.ipc
[DEBUG ] SaltEvent PULL socket URI: /var/run/salt/minion/minion_event_18faa0dd7a_pull.ipc
[DEBUG ] Initializing new IPCClient for path: /var/run/salt/minion/minion_event_18faa0dd7a_pull.ipc
[DEBUG ] Sending event: tag = __schedule_return; data = {'fun_args': ['router.ntp'], 'jid': 'req', 'return': {'netconfig_|-update_my_ntp_peers_|-update_my_ntp_peers_|-managed': {'comment': 'State \'netconfig.managed\' was not found in SLS \'router.ntp\'\nReason: \'netconfig\' __virtual__ returned False: "netconfig"" (/home/admin/salt/salt/states/netconfig.pyc) cannot be loaded: NAPALM is not installed or not running in a (proxy) minion\n', '__run_num__': 0, 'changes': {}, 'result': False, 'name': 'update_my_ntp_peers'}}, 'success': True, 'schedule': 'stuff', 'cmd': '_return', 'pid': 27805, '_stamp': '2017-03-21T11:47:22.854229', 'fun': 'state.sls', 'id': 'device1'}
[DEBUG ] schedule.handle_func: Removing /var/cache/salt/proxy/device1/proc/20170321114722197932
[DEBUG ] Minion of "localhost" is handling event tag '__schedule_return'
[INFO ] Returning information for job: req
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506', 'aes')
[DEBUG ] Re-using AsyncAuth for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506')
[DEBUG ] LazyLoaded config.merge
[INFO ] User sudo_admin Executing command netmiko.send_command with jid 20170321114752111514
[DEBUG ] Command details {'tgt_type': 'glob', 'jid': '20170321114752111514', 'tgt': 'device1', 'ret': '', 'user': 'sudo_admin', 'arg': ['show ip interface brief'], 'fun': 'netmiko.send_command'}
[INFO ] Starting a new job with PID 27805
[DEBUG ] read_channel:
[DEBUG ] write_channel:
[DEBUG ] read_channel:
ip-172-31-46-249#
[DEBUG ] read_channel:
[DEBUG ] read_channel:
[DEBUG ] write_channel: show ip interface brief
[DEBUG ] read_channel:
[DEBUG ] read_channel: show ip interface brief
Interface IP-Address OK? Method Status Protocol
GigabitEthernet1 172.31.46.249 YES DHCP up up
ip-172-31-46-249#
[DEBUG ] Minion return retry timer set to 9 seconds (randomized)
[INFO ] Returning information for job: 20170321114752111514
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506', 'aes')
[DEBUG ] Initializing new AsyncAuth for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506')
[INFO ] User sudo_admin Executing command netmiko.send_command with jid 20170321114815590236
[DEBUG ] Command details {'tgt_type': 'glob', 'jid': '20170321114815590236', 'tgt': 'device1', 'ret': '', 'user': 'sudo_admin', 'arg': ['show interfaces'], 'fun': 'netmiko.send_command'}
[INFO ] Starting a new job with PID 27805
[DEBUG ] read_channel:
[DEBUG ] write_channel:
[DEBUG ] read_channel:
ip-172-31-46-249#
[DEBUG ] read_channel:
[DEBUG ] read_channel:
[DEBUG ] write_channel: show interfaces
[DEBUG ] read_channel:
[DEBUG ] read_channel: show interfaces
GigabitEthernet1 is up, line protocol is up
Hardware is CSR vNIC, address is 0a11.aabe.c10f (bia 0a11.aabe.c10f)
Internet address is 172.31.46.249/20
MTU 1500 bytes, BW 1000000 Kbit/sec, DLY 10 usec,
reliability 255/255, txload 1/255, rxload 1/255
Encapsulation ARPA, loopback not set
Keepalive set (10 sec)
Full Duplex, 1000Mbps, link type is auto, media type is RJ45
output flow-control is unsupported, input flow-control is unsupported
ARP type: ARPA, ARP Timeout 04:00:00
Last input 00:00:02, output 00:00:02, output hang never
Last clearing of "show interface" counters never
Input queue: 0/375/0/0 (size/max/drops/flushes); Total output drops: 0
Queueing strategy: fifo
Output queue: 0/40 (size/max)
5 minute input rate 0 bits/sec, 0 packets/sec
5 minute output rate 0 bits/sec, 0 packets/sec
541 packets input, 35059 bytes, 0 no buffer
Received 0 broadcasts (0 IP multicasts)
0 runts, 0 giants, 0 throttles
0 input errors, 0 CRC, 0 frame, 0 overrun, 0 ignored
0 watchdog, 0 multicast, 0 pause input
669 packets output, 71714 bytes, 0 underruns
0 output errors, 0 collisions, 0 interface resets
0 unknown protocol drops
0 babbles, 0 late collision, 0 deferred
0 lost carrier, 0 no carrier, 0 pause output
0 output buffer failures, 0 output buffers swapped out
ip-172-31-46-249#
[DEBUG ] Minion return retry timer set to 9 seconds (randomized)
[INFO ] Returning information for job: 20170321114815590236
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506', 'aes')
[DEBUG ] Initializing new AsyncAuth for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506')
[INFO ] User sudo_admin Executing command netmiko.send_command with jid 20170321114819637861
[DEBUG ] Command details {'tgt_type': 'glob', 'jid': '20170321114819637861', 'tgt': 'device1', 'ret': '', 'user': 'sudo_admin', 'arg': ['show interfaces summary'], 'fun': 'netmiko.send_command'}
[INFO ] Starting a new job with PID 27805
[DEBUG ] read_channel:
[DEBUG ] write_channel:
[DEBUG ] read_channel:
ip-172-31-46-249#
[DEBUG ] read_channel:
[DEBUG ] read_channel:
[DEBUG ] write_channel: show interfaces summary
[DEBUG ] read_channel:
[DEBUG ] read_channel: show interfaces summary
*: interface is up
IHQ: pkts in input hold queue IQD: pkts dropped from input queue
OHQ: pkts in output hold queue OQD: pkts dropped from output queue
RXBS: rx rate (bits/sec) RXPS: rx rate (pkts/sec)
TXBS: tx rate (bits/sec) TXPS: tx rate (pkts/sec)
TRTL: throttle count
Interface IHQ IQD OHQ OQD RXBS RXPS TXBS TXPS TRTL
-----------------------------------------------------------------------------------------------------------------
* GigabitEthernet1 0 0 0 0 1000 1 1000 1 0
ip-172-31-46-249#
[DEBUG ] Minion return retry timer set to 10 seconds (randomized)
[INFO ] Returning information for job: 20170321114819637861
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506', 'aes')
[DEBUG ] Initializing new AsyncAuth for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506')
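The three `netmiko.send_command` jobs above would have been started from the master with something like the following (presumed invocations; only the function name and arguments are taken from the log):

```shell
salt device1 netmiko.send_command 'show ip interface brief'
salt device1 netmiko.send_command 'show interfaces'
salt device1 netmiko.send_command 'show interfaces summary'
```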
[INFO ] Running scheduled job: stuff
[DEBUG ] schedule: This job was scheduled with jid_include, adding to cache (jid_include defaults to True)
[DEBUG ] schedule: This job was scheduled with a max number of 1
[DEBUG ] schedule.handle_func: adding this job to the jobcache with data {'fun_args': [], 'jid': '20170321114822145514', 'schedule': 'stuff', 'pid': 27805, 'fun': 'state.sls', 'id': 'device1'}
[DEBUG ] LazyLoaded state.sls
[DEBUG ] LazyLoaded saltutil.is_running
[DEBUG ] LazyLoaded grains.get
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506', 'aes')
[DEBUG ] Initializing new AsyncAuth for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506')
[DEBUG ] Determining pillar cache
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506', 'aes')
[DEBUG ] Initializing new AsyncAuth for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506')
[DEBUG ] Loaded minion key: /etc/salt/pki/proxy/minion.pem
[INFO ] Loading fresh modules for state activity
[DEBUG ] LazyLoaded jinja.render
[DEBUG ] LazyLoaded yaml.render
[DEBUG ] In saltenv 'base', looking at rel_path 'router/ntp.sls' to resolve 'salt://router/ntp.sls'
[DEBUG ] In saltenv 'base', ** considering ** path '/var/cache/salt/proxy/device1/files/base/router/ntp.sls' to resolve 'salt://router/ntp.sls'
[INFO ] Fetching file from saltenv 'base', ** skipped ** latest already in cache 'salt://router/ntp.sls', mode up-to-date
[DEBUG ] compile template: /var/cache/salt/proxy/device1/files/base/router/ntp.sls
[DEBUG ] Jinja search path: ['/var/cache/salt/proxy/device1/files/base']
[PROFILE ] Time (in seconds) to render '/var/cache/salt/proxy/device1/files/base/router/ntp.sls' using 'jinja' renderer: 0.00292491912842
[DEBUG ] Rendered data from file: /var/cache/salt/proxy/device1/files/base/router/ntp.sls:
update_my_ntp_peers:
netconfig.managed:
- template_name: salt://ntp.jinja
- peers: ["1.2.3.4"]
- servers: ["9.10.11.12"]
[DEBUG ] LazyLoaded config.get
[DEBUG ] Results of YAML rendering:
OrderedDict([('update_my_ntp_peers', OrderedDict([('netconfig.managed', [OrderedDict([('template_name', 'salt://ntp.jinja')]), OrderedDict([('peers', ['1.2.3.4'])]), OrderedDict([('servers', ['9.10.11.12'])])])]))])
[PROFILE ] Time (in seconds) to render '/var/cache/salt/proxy/device1/files/base/router/ntp.sls' using 'yaml' renderer: 0.00311303138733
[DEBUG ] LazyLoaded boto.assign_funcs
[DEBUG ] boto_kms requires boto 2.38.0.
[DEBUG ] Could not LazyLoad acme.cert: 'acme' __virtual__ returned False: The ACME execution module cannot be loaded: letsencrypt-auto not installed.
[DEBUG ] LazyLoaded apache.config
[DEBUG ] LazyLoaded apache.a2enconf
[DEBUG ] LazyLoaded apache.a2enmod
[DEBUG ] LazyLoaded apache.a2ensite
[DEBUG ] Could not LazyLoad pkg.get_selections: 'pkg.get_selections' is not available.
[DEBUG ] LazyLoaded at.at
[DEBUG ] Could not LazyLoad augeas.execute: 'augeas.execute' is not available.
[DEBUG ] LazyLoaded boto3_elasticache.cache_cluster_exists
[DEBUG ] LazyLoaded boto3_route53.find_hosted_zone
[DEBUG ] LazyLoaded boto_apigateway.describe_apis
[DEBUG ] LazyLoaded boto_asg.exists
[DEBUG ] LazyLoaded boto_cloudtrail.exists
[DEBUG ] LazyLoaded boto_cloudwatch_event.exists
[DEBUG ] LazyLoaded boto_cognitoidentity.describe_identity_pools
[DEBUG ] LazyLoaded boto_datapipeline.create_pipeline
[DEBUG ] LazyLoaded boto_elasticsearch_domain.exists
[DEBUG ] LazyLoaded boto_iot.policy_exists
[DEBUG ] LazyLoaded boto_kinesis.exists
[DEBUG ] LazyLoaded boto_kms.describe_key
[DEBUG ] LazyLoaded boto_lambda.function_exists
[DEBUG ] LazyLoaded boto_asg.exists
[DEBUG ] LazyLoaded boto_rds.exists
[DEBUG ] LazyLoaded boto_route53.get_record
[DEBUG ] LazyLoaded boto_s3_bucket.exists
[DEBUG ] LazyLoaded boto_vpc.exists
[DEBUG ] LazyLoaded bower.list
[DEBUG ] LazyLoaded chef.client
[DEBUG ] LazyLoaded chocolatey.install
[DEBUG ] LazyLoaded cyg.list
[DEBUG ] LazyLoaded ddns.update
[DEBUG ] LazyLoaded eselect.exec_action
[INFO ] Executing command ['git', '--version'] in directory '/home/admin'
[DEBUG ] stdout: git version 2.11.0
[DEBUG ] LazyLoaded github.list_users
[DEBUG ] LazyLoaded glusterfs.list_volumes
[DEBUG ] LazyLoaded elasticsearch.exists
[DEBUG ] LazyLoaded icinga2.generate_ticket
[DEBUG ] LazyLoaded ifttt.trigger_event
[DEBUG ] Could not LazyLoad influxdb08.db_exists: 'influxdb08.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb08.db_exists: 'influxdb08.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb.db_exists: 'influxdb.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb.db_exists: 'influxdb.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb.db_exists: 'influxdb.db_exists' is not available.
[DEBUG ] Could not LazyLoad influxdb.db_exists: 'influxdb.db_exists' is not available.
[DEBUG ] LazyLoaded ipset.version
[DEBUG ] LazyLoaded kapacitor.version
[DEBUG ] LazyLoaded keystone.auth
[DEBUG ] LazyLoaded kmod.available
[DEBUG ] LazyLoaded layman.add
[DEBUG ] LazyLoaded lvs.get_rules
[DEBUG ] LazyLoaded lvs.get_rules
[DEBUG ] LazyLoaded makeconf.get_var
[DEBUG ] LazyLoaded memcached.status
[DEBUG ] LazyLoaded mongodb.user_exists
[DEBUG ] LazyLoaded monit.summary
[DEBUG ] LazyLoaded mysql.db_exists
[DEBUG ] LazyLoaded mysql.grant_exists
[DEBUG ] LazyLoaded mysql.query
[DEBUG ] LazyLoaded mysql.user_create
[DEBUG ] LazyLoaded nftables.version
[DEBUG ] LazyLoaded npm.list
[DEBUG ] LazyLoaded openvswitch.bridge_create
[DEBUG ] LazyLoaded openvswitch.port_add
[DEBUG ] LazyLoaded pdbedit.create
[DEBUG ] LazyLoaded pecl.list
[DEBUG ] Could not LazyLoad pkg.install: 'pkg.install' is not available.
[DEBUG ] Could not LazyLoad pkg.mod_repo: 'pkg.mod_repo' is not available.
[DEBUG ] LazyLoaded portage_config.get_missing_flags
[DEBUG ] LazyLoaded postgres.cluster_exists
[DEBUG ] LazyLoaded postgres.user_exists
[DEBUG ] LazyLoaded postgres.create_extension
[DEBUG ] LazyLoaded postgres.group_create
[DEBUG ] LazyLoaded postgres.datadir_init
[DEBUG ] LazyLoaded postgres.language_create
[DEBUG ] LazyLoaded postgres.privileges_grant
[DEBUG ] LazyLoaded postgres.schema_exists
[DEBUG ] LazyLoaded postgres.tablespace_exists
[DEBUG ] LazyLoaded postgres.user_exists
[DEBUG ] LazyLoaded ps.pkill
[DEBUG ] LazyLoaded quota.report
[DEBUG ] Could not LazyLoad rbac.profile_list: 'rbac.profile_list' is not available.
[DEBUG ] LazyLoaded rdp.enable
[DEBUG ] Could not LazyLoad redis.set_key: 'redis.set_key' is not available.
[DEBUG ] LazyLoaded reg.read_value
[DEBUG ] LazyLoaded selinux.getenforce
[DEBUG ] LazyLoaded service.start
[DEBUG ] Could not LazyLoad vmadm.create: 'vmadm.create' is not available.
[DEBUG ] LazyLoaded snapper.diff
[DEBUG ] LazyLoaded splunk.list_users
[DEBUG ] LazyLoaded splunk_search.get
[DEBUG ] LazyLoaded stormpath.create_account
[DEBUG ] Could not LazyLoad sysctl.show: 'sysctl.show' is not available.
[DEBUG ] LazyLoaded tls.cert_info
[DEBUG ] LazyLoaded tomcat.status
[DEBUG ] LazyLoaded trafficserver.set_config
[DEBUG ] LazyLoaded victorops.create_event
[DEBUG ] LazyLoaded virt.node_info
[DEBUG ] LazyLoaded win_dacl.add_ace
[DEBUG ] LazyLoaded win_dns_client.add_dns
[DEBUG ] Could not LazyLoad firewall.get_config: 'firewall.get_config' is not available.
[DEBUG ] LazyLoaded win_iis.create_site
[DEBUG ] Could not LazyLoad lgpo.set: 'lgpo.set' is not available.
[DEBUG ] LazyLoaded win_path.rehash
[DEBUG ] LazyLoaded win_pki.get_stores
[DEBUG ] LazyLoaded win_servermanager.install
[DEBUG ] LazyLoaded win_smtp_server.get_server_setting
[DEBUG ] LazyLoaded win_snmp.get_agent_settings
[DEBUG ] LazyLoaded x509.get_pem_entry
[DEBUG ] LazyLoaded xmpp.send_msg
[DEBUG ] LazyLoaded zabbix.host_create
[DEBUG ] LazyLoaded zabbix.hostgroup_create
[DEBUG ] LazyLoaded zabbix.user_create
[DEBUG ] LazyLoaded zabbix.usergroup_create
[DEBUG ] LazyLoaded zfs.create
[DEBUG ] LazyLoaded zk_concurrency.lock
[DEBUG ] LazyLoaded zonecfg.create
[DEBUG ] LazyLoaded zpool.create
[DEBUG ] Could not LazyLoad netconfig.managed: 'netconfig' __virtual__ returned False: "netconfig"" (/home/admin/salt/salt/states/netconfig.pyc) cannot be loaded: NAPALM is not installed or not running in a (proxy) minion
[INFO ] Running state [update_my_ntp_peers] at time 11:48:22.794615
[DEBUG ] LazyLoaded netconfig.managed
[ERROR ] State 'netconfig.managed' was not found in SLS 'router.ntp'
Reason: 'netconfig' __virtual__ returned False: "netconfig"" (/home/admin/salt/salt/states/netconfig.pyc) cannot be loaded: NAPALM is not installed or not running in a (proxy) minion
[DEBUG ] File /var/cache/salt/proxy/device1/accumulator/140669285107280 does not exist, no need to cleanup.
[DEBUG ] SaltEvent PUB socket URI: /var/run/salt/minion/minion_event_18faa0dd7a_pub.ipc
[DEBUG ] SaltEvent PULL socket URI: /var/run/salt/minion/minion_event_18faa0dd7a_pull.ipc
[DEBUG ] Initializing new IPCClient for path: /var/run/salt/minion/minion_event_18faa0dd7a_pull.ipc
[DEBUG ] Sending event: tag = __schedule_return; data = {'fun_args': ['router.ntp'], 'jid': 'req', 'return': {'netconfig_|-update_my_ntp_peers_|-update_my_ntp_peers_|-managed': {'comment': 'State \'netconfig.managed\' was not found in SLS \'router.ntp\'\nReason: \'netconfig\' __virtual__ returned False: "netconfig"" (/home/admin/salt/salt/states/netconfig.pyc) cannot be loaded: NAPALM is not installed or not running in a (proxy) minion\n', '__run_num__': 0, 'changes': {}, 'result': False, 'name': 'update_my_ntp_peers'}}, 'success': True, 'schedule': 'stuff', 'cmd': '_return', 'pid': 27805, '_stamp': '2017-03-21T11:48:22.797503', 'fun': 'state.sls', 'id': 'device1'}
[DEBUG ] schedule.handle_func: Removing /var/cache/salt/proxy/device1/proc/20170321114822145514
[DEBUG ] Minion of "localhost" is handling event tag '__schedule_return'
[INFO ] Returning information for job: req
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506', 'aes')
[DEBUG ] Re-using AsyncAuth for ('/etc/salt/pki/proxy', 'device1', 'tcp://127.0.0.1:4506')
[DEBUG ] LazyLoaded config.merge
[INFO ] User sudo_admin Executing command netmiko.send_command with jid 20170321114830616297
[DEBUG ] Command details {'tgt_type': 'glob', 'jid': '20170321114830616297', 'tgt': 'device1', 'ret': '', 'user': 'sudo_admin', 'arg': ['show arp'], 'fun': 'netmiko.send_command'}
[INFO ] Starting a new job with PID 27805
[DEBUG ] read_channel:
[DEBUG ] write_channel:
[DEBUG ] read_channel:
ip-172-31-46-249#
[DEBUG ] read_channel:
[DEBUG ] read_channel:
[DEBUG ] write_channel: show arp
[DEBUG ] read_channel:
[DEBUG ] read_channel: show arp
Protocol Address Age (min) Hardware Addr Type Interface
Internet 172.31.32.1 0 0a9a.d74c.2695 ARPA GigabitEthernet1
Internet 172.31.46.249 - 0a11.aabe.c10f ARPA GigabitEthernet1
ip-172-31-46-249#