
@PredatorVI
Last active February 4, 2016 23:09
[DEBUG ] Including configuration from '/etc/salt/minion.d/_schedule.conf'
[DEBUG ] Reading configuration from /etc/salt/minion.d/_schedule.conf
[INFO ] Completed state [oracle-java8-installer] at time 16:01:05.066113
[DEBUG ] LazyLoaded config.option
[DEBUG ] LazyLoaded file.symlink
[INFO ] Running state [/usr/lib/jvm/default-java] at time 16:01:05.101762
[INFO ] Executing state file.symlink for /usr/lib/jvm/default-java
[DEBUG ] LazyLoaded file.gid_to_group
[DEBUG ] LazyLoaded user.info
[INFO ] {'new': '/usr/lib/jvm/default-java'}
[INFO ] Completed state [/usr/lib/jvm/default-java] at time 16:01:05.106351
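The file.symlink run above is consistent with an SLS state along these lines (a sketch only: the symlink target and the `force` flag are assumptions, inferred from the java-8-oracle paths later in the log):

```yaml
# Hypothetical reconstruction -- target and force are assumptions.
/usr/lib/jvm/default-java:
  file.symlink:
    - target: /usr/lib/jvm/java-8-oracle
    - force: True
```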
[INFO ] Running state [/usr/lib/jvm/java-8-oracle/jre/lib/security/local_policy.jar] at time 16:01:05.106536
[INFO ] Executing state file.managed for /usr/lib/jvm/java-8-oracle/jre/lib/security/local_policy.jar
[DEBUG ] LazyLoaded cp.hash_file
[DEBUG ] In saltenv 'base', looking at rel_path u'oracle-java/local_policy.jar.java8' to resolve u'salt://oracle-java/local_policy.jar.java8'
[DEBUG ] In saltenv 'base', ** considering ** path u'/var/cache/salt/minion/files/base/oracle-java/local_policy.jar.java8' to resolve u'salt://oracle-java/local_policy.jar.java8'
[INFO ] Fetching file from saltenv 'base', ** skipped ** latest already in cache u'salt://oracle-java/local_policy.jar.java8'
[INFO ] File changed:
Replace binary file
[INFO ] Completed state [/usr/lib/jvm/java-8-oracle/jre/lib/security/local_policy.jar] at time 16:01:05.147161
[INFO ] Running state [/usr/lib/jvm/java-8-oracle/jre/lib/security/US_export_policy.jar] at time 16:01:05.147336
[INFO ] Executing state file.managed for /usr/lib/jvm/java-8-oracle/jre/lib/security/US_export_policy.jar
[DEBUG ] Please install 'virt-what' to improve results of the 'virtual' grain.
[DEBUG ] In saltenv 'base', looking at rel_path u'oracle-java/US_export_policy.jar.java8' to resolve u'salt://oracle-java/US_export_policy.jar.java8'
[DEBUG ] In saltenv 'base', ** considering ** path u'/var/cache/salt/minion/files/base/oracle-java/US_export_policy.jar.java8' to resolve u'salt://oracle-java/US_export_policy.jar.java8'
[INFO ] Fetching file from saltenv 'base', ** skipped ** latest already in cache u'salt://oracle-java/US_export_policy.jar.java8'
[INFO ] File changed:
Replace binary file
[INFO ] Completed state [/usr/lib/jvm/java-8-oracle/jre/lib/security/US_export_policy.jar] at time 16:01:05.166652
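The two file.managed runs for the JCE policy jars would come from states shaped roughly like this (the `salt://` source URI is taken from the log; ownership and mode are assumptions):

```yaml
# Sketch; the US_export_policy.jar state is analogous.
# user/group/mode are assumed, not shown in the log.
/usr/lib/jvm/java-8-oracle/jre/lib/security/local_policy.jar:
  file.managed:
    - source: salt://oracle-java/local_policy.jar.java8
    - user: root
    - group: root
    - mode: 644
```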
[DEBUG ] LazyLoaded pkg.install
[DEBUG ] LazyLoaded pkg.latest
[INFO ] Running state [unzip] at time 16:01:05.170359
[INFO ] Executing state pkg.latest for unzip
[DEBUG ] LazyLoaded pkg_resource.version
[DEBUG ] LazyLoaded cmd.run_all
[INFO ] Executing command ['apt-cache', '-q', 'policy', 'unzip'] in directory '/root'
[DEBUG ] LazyLoaded grains.item
[INFO ] Package unzip is already up-to-date
[INFO ] Completed state [unzip] at time 16:01:05.191467
[INFO ] Running state [jsvc] at time 16:01:05.191709
[INFO ] Executing state pkg.latest for jsvc
[INFO ] Executing command ['apt-cache', '-q', 'policy', 'jsvc'] in directory '/root'
[INFO ] Package jsvc is already up-to-date
[INFO ] Completed state [jsvc] at time 16:01:05.207170
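pkg.latest checks each package against `apt-cache policy` and is a no-op when the installed version is already current, which is what the unzip and jsvc runs show. A minimal sketch of the corresponding states:

```yaml
# State IDs match the log; no extra options are implied.
unzip:
  pkg.latest: []

jsvc:
  pkg.latest: []
```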
[INFO ] Running state [/opt/myuser01/minion-processor/] at time 16:01:05.207432
[INFO ] Executing state file.directory for /opt/myuser01/minion-processor/
[DEBUG ] List of kept files when use file.directory with clean: []
[INFO ] {'removed': ['/opt/myuser01/minion-processor/minion-processor.jar']}
[INFO ] Completed state [/opt/myuser01/minion-processor/] at time 16:01:05.257364
[INFO ] Running state [/etc/myuser01/minion-processor/] at time 16:01:05.257576
[INFO ] Executing state file.directory for /etc/myuser01/minion-processor/
[DEBUG ] List of kept files when use file.directory with clean: []
[INFO ] {'/etc/myuser01/minion-processor': 'New Dir'}
[INFO ] Completed state [/etc/myuser01/minion-processor/] at time 16:01:05.277946
[INFO ] Running state [/var/log/myuser01] at time 16:01:05.279444
[INFO ] Executing state file.directory for /var/log/myuser01
[INFO ] {'/var/log/myuser01': 'New Dir'}
[INFO ] Completed state [/var/log/myuser01] at time 16:01:05.280945
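The `removed` entry for minion-processor.jar suggests file.directory was run with `clean: True`, which deletes anything in the directory not kept by a requisite. A sketch under that assumption:

```yaml
# clean: True is inferred from the "List of kept files ... clean: []"
# and "removed" log lines; makedirs is an assumption.
/opt/myuser01/minion-processor/:
  file.directory:
    - makedirs: True
    - clean: True
```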
[INFO ] Running state [/opt/myuser01/minion-processor/minion-processor.jar] at time 16:01:05.281747
[INFO ] Executing state file.managed for /opt/myuser01/minion-processor/minion-processor.jar
[DEBUG ] Reading configuration from /etc/salt/minion
[DEBUG ] Including configuration from '/etc/salt/minion.d/_schedule.conf'
[DEBUG ] Reading configuration from /etc/salt/minion.d/_schedule.conf
[DEBUG ] Using cached minion ID from /etc/salt/minion_id: minion-acc01.mycompany.com
[DEBUG ] LazyLoaded config.merge
[DEBUG ] LazyLoaded mine.update
[DEBUG ] Requesting URL http://nexus.mycompany.com/nexus/content/groups/mycompany/com/mycompany/myuser01/minion-processor/1.0-SNAPSHOT/minion-processor-1.0-20160204.180935-144-jar-with-dependencies.jar using GET method
[INFO ] File changed:
New file
[INFO ] Completed state [/opt/myuser01/minion-processor/minion-processor.jar] at time 16:01:06.568150
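Fetching a file over HTTP with file.managed requires a `source_hash`; both the Nexus URL and the sha1 (echoed later in the generated properties header) appear in the log, so the state was presumably close to:

```yaml
# URL and hash are taken verbatim from the log; ownership/mode not shown.
/opt/myuser01/minion-processor/minion-processor.jar:
  file.managed:
    - source: http://nexus.mycompany.com/nexus/content/groups/mycompany/com/mycompany/myuser01/minion-processor/1.0-SNAPSHOT/minion-processor-1.0-20160204.180935-144-jar-with-dependencies.jar
    - source_hash: sha1=df91bd8d4f64c7786bf40d5a4bf4a13e23585fe8
```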
[DEBUG ] Error loading module.rh_service: Cannot load rh_service module: OS not in set(['SUSE Enterprise Server', 'SUSE', 'RedHat', 'CentOS', 'CloudLinux', 'McAfee OS Server', 'XenServer', 'Amazon', 'OEL', 'ScientificLinux', 'ALT', 'Fedora'])
[DEBUG ] Error loading module.ifttt: IFTTT Secret Key Unavailable, not loading.
[DEBUG ] Error loading module.dockerng: Docker module could not get imported
[DEBUG ] Error loading module.victorops: No VictorOps api key found.
[DEBUG ] Error loading module.ipmi: No module named pyghmi.ipmi
[DEBUG ] Error loading module.win_status: Cannot load win_status module on non-windows
[DEBUG ] You should upgrade pyOpenSSL to at least 0.14.1 to enable the use of X509 extensions in the tls module
[DEBUG ] Error loading module.glusterfs: glusterfs server is not installed
[DEBUG ] Error loading module.vsphere: Missing dependency: The vSphere module requires the pyVmomi Python module.
[DEBUG ] Error loading module.nacl: libnacl import error, perhaps missing python libnacl package
[DEBUG ] LazyLoaded service.start
[DEBUG ] LazyLoaded service.running
[DEBUG ] LazyLoaded cmd.run
[INFO ] Running state [/etc/init/minion-processor.conf] at time 16:01:06.696266
[INFO ] Executing state file.managed for /etc/init/minion-processor.conf
[INFO ] File /etc/init/minion-processor.conf is in the correct state
[INFO ] Completed state [/etc/init/minion-processor.conf] at time 16:01:06.708585
[INFO ] Running state [initctl reload-configuration] at time 16:01:06.709182
[INFO ] Executing state cmd.run for initctl reload-configuration
stdin: is not a tty
[INFO ] Executing command 'initctl reload-configuration' as user 'root' in directory '/root'
[INFO ] {'pid': 2573, 'retcode': 0, 'stderr': '', 'stdout': ''}
[INFO ] Completed state [initctl reload-configuration] at time 16:01:06.784548
[INFO ] Running state [minion-processor] at time 16:01:06.785321
[INFO ] Executing state service.running for minion-processor
[INFO ] Executing command ['runlevel', '/run/utmp'] in directory '/root'
[DEBUG ] output: N 2
[INFO ] Executing command ['service', 'minion-processor', 'status'] in directory '/root'
[DEBUG ] output: minion-processor stop/waiting
[INFO ] Executing command ['service', 'minion-processor', 'start'] in directory '/root'
[DEBUG ] output: minion-processor start/running, process 2587
[INFO ] Executing command ['service', 'minion-processor', 'status'] in directory '/root'
[DEBUG ] output: minion-processor start/running, process 2587
[INFO ] {'minion-processor': True}
[INFO ] Completed state [minion-processor] at time 16:01:06.850697
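On this Upstart-based Ubuntu minion, a new job file needs `initctl reload-configuration` before service.running can start it, which matches the ordering above. A sketch of the pair of states (the onchanges/require wiring is an assumption):

```yaml
initctl reload-configuration:
  cmd.run:
    - onchanges:            # assumption: only reload when the job file changes
      - file: /etc/init/minion-processor.conf

minion-processor:
  service.running:
    - require:
      - cmd: initctl reload-configuration
```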
[INFO ] Running state [/tmp/minion-processor_salt/properties] at time 16:01:06.850946
[INFO ] Executing state file.directory for /tmp/minion-processor_salt/properties
[DEBUG ] Creating directory: /tmp/minion-processor_salt
[DEBUG ] List of kept files when use file.directory with clean: []
[INFO ] {'/tmp/minion-processor_salt/properties': 'New Dir'}
[INFO ] Completed state [/tmp/minion-processor_salt/properties] at time 16:01:06.853293
[INFO ] Running state [unzip -j /opt/myuser01/minion-processor/minion-processor.jar "**/acc.minion-processor.properties" -d /tmp/minion-processor_salt/properties
chown -R root:root /tmp/minion-processor_salt/properties
chmod 644 /tmp/minion-processor_salt/properties/acc.minion-processor.properties
mkdir -p /etc/myuser01/minion-processor/
mv /tmp/minion-processor_salt/properties/acc.minion-processor.properties /etc/myuser01/minion-processor/minion-processor.properties
] at time 16:01:06.855055
[INFO ] Executing state cmd.run for unzip -j /opt/myuser01/minion-processor/minion-processor.jar "**/acc.minion-processor.properties" -d /tmp/minion-processor_salt/properties
chown -R root:root /tmp/minion-processor_salt/properties
chmod 644 /tmp/minion-processor_salt/properties/acc.minion-processor.properties
mkdir -p /etc/myuser01/minion-processor/
mv /tmp/minion-processor_salt/properties/acc.minion-processor.properties /etc/myuser01/minion-processor/minion-processor.properties
[INFO ] Executing command 'unzip -j /opt/myuser01/minion-processor/minion-processor.jar "**/acc.minion-processor.properties" -d /tmp/minion-processor_salt/properties\nchown -R root:root /tmp/minion-processor_salt/properties\nchmod 644 /tmp/minion-processor_salt/properties/acc.minion-processor.properties\nmkdir -p /etc/myuser01/minion-processor/\nmv /tmp/minion-processor_salt/properties/acc.minion-processor.properties /etc/myuser01/minion-processor/minion-processor.properties\n' in directory '/root'
[DEBUG ] stdout: Archive: /opt/myuser01/minion-processor/minion-processor.jar
inflating: /tmp/minion-processor_salt/properties/acc.minion-processor.properties
[INFO ] {'pid': 2603, 'retcode': 0, 'stderr': '', 'stdout': 'Archive: /opt/myuser01/minion-processor/minion-processor.jar\n inflating: /tmp/minion-processor_salt/properties/acc.minion-processor.properties'}
[INFO ] Completed state [unzip -j /opt/myuser01/minion-processor/minion-processor.jar "**/acc.minion-processor.properties" -d /tmp/minion-processor_salt/properties
chown -R root:root /tmp/minion-processor_salt/properties
chmod 644 /tmp/minion-processor_salt/properties/acc.minion-processor.properties
mkdir -p /etc/myuser01/minion-processor/
mv /tmp/minion-processor_salt/properties/acc.minion-processor.properties /etc/myuser01/minion-processor/minion-processor.properties
] at time 16:01:07.027094
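The long state ID above is the shell script itself passed to cmd.run. Reconstructed as SLS with a hypothetical ID (the commands are verbatim from the log; only the `name:` wrapper is added):

```yaml
extract-properties:   # hypothetical ID; in the log the script doubles as the ID
  cmd.run:
    - name: |
        unzip -j /opt/myuser01/minion-processor/minion-processor.jar "**/acc.minion-processor.properties" -d /tmp/minion-processor_salt/properties
        chown -R root:root /tmp/minion-processor_salt/properties
        chmod 644 /tmp/minion-processor_salt/properties/acc.minion-processor.properties
        mkdir -p /etc/myuser01/minion-processor/
        mv /tmp/minion-processor_salt/properties/acc.minion-processor.properties /etc/myuser01/minion-processor/minion-processor.properties
```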
[INFO ] Running state [/etc/myuser01/minion-processor/minion-processor.properties] at time 16:01:07.027382
[INFO ] Executing state file.prepend for /etc/myuser01/minion-processor/minion-processor.properties
[INFO ] File changed:
---
+++
@@ -1,3 +1,10 @@
+###############################################################
+# This file is auto-generated by Salt
+# JAR_FILE: http://nexus.mycompany.com/nexus/content/groups/mycompany/com/mycompany/myuser01/minion-processor/1.0-SNAPSHOT/minion-processor-1.0-20160204.180935-144-jar-with-dependencies.jar
+# JAR_HASH: sha1=df91bd8d4f64c7786bf40d5a4bf4a13e23585fe8
+# ORIGINAL_PROPS: acc.minion-processor.properties
+# DEPLOYED: Thu Feb 4 22:58:49 UTC 2016
+###############################################################
ENCRYPTION_KEY_FILE=/etc/myuser01/encryption/encryptionkey-processor.properties
[INFO ] Completed state [/etc/myuser01/minion-processor/minion-processor.properties] at time 16:01:07.033231
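file.prepend added the auto-generated header shown in the diff. Sketched as a state (the header text follows the diff; the templating that fills in JAR_FILE, JAR_HASH, and DEPLOYED is not visible in the log and is omitted here):

```yaml
/etc/myuser01/minion-processor/minion-processor.properties:
  file.prepend:
    - text: |
        ###############################################################
        # This file is auto-generated by Salt
        ###############################################################
```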
[INFO ] Running state [tomcat6] at time 16:01:07.036600
[INFO ] Executing state service.dead for tomcat6
[INFO ] The named service tomcat6 is not available
[INFO ] Completed state [tomcat6] at time 16:01:07.049469
[INFO ] Running state [tomcat6] at time 16:01:07.050040
[INFO ] Executing state pkg.purged for tomcat6
[DEBUG ] Error loading module.boto_cloudwatch: The boto_cloudwatch module cannot be loaded: boto libraries are unavailable.
[DEBUG ] Error loading module.npm: npm execution module could not be loaded because the npm binary could not be located
[DEBUG ] Error loading module.x509: Could not load x509 module, m2crypto unavailable
[DEBUG ] Could not LazyLoad pkg.normalize_name
[DEBUG ] Could not LazyLoad pkg.normalize_name
[INFO ] All specified packages are already absent
[INFO ] Completed state [tomcat6] at time 16:01:07.159382
[INFO ] Running state [tomcat6-admin] at time 16:01:07.160086
[INFO ] Executing state pkg.purged for tomcat6-admin
[DEBUG ] Could not LazyLoad pkg.normalize_name
[DEBUG ] Could not LazyLoad pkg.normalize_name
[INFO ] All specified packages are already absent
[INFO ] Completed state [tomcat6-admin] at time 16:01:07.166119
[INFO ] Running state [tomcat6-common] at time 16:01:07.166675
[INFO ] Executing state pkg.purged for tomcat6-common
[DEBUG ] Could not LazyLoad pkg.normalize_name
[DEBUG ] Could not LazyLoad pkg.normalize_name
[INFO ] All specified packages are already absent
[INFO ] Completed state [tomcat6-common] at time 16:01:07.172547
[INFO ] Running state [libtomcat6-java] at time 16:01:07.173126
[INFO ] Executing state pkg.purged for libtomcat6-java
[DEBUG ] Could not LazyLoad pkg.normalize_name
[DEBUG ] Could not LazyLoad pkg.normalize_name
[INFO ] All specified packages are already absent
[INFO ] Completed state [libtomcat6-java] at time 16:01:07.178937
[INFO ] Running state [oracle-java7-installer] at time 16:01:07.179478
[INFO ] Executing state pkg.purged for oracle-java7-installer
[DEBUG ] Could not LazyLoad pkg.normalize_name
[DEBUG ] Could not LazyLoad pkg.normalize_name
[INFO ] All specified packages are already absent
[INFO ] Completed state [oracle-java7-installer] at time 16:01:07.185377
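The five purge states all report "already absent", so they are pure no-ops here. A minimal sketch:

```yaml
# IDs match the log; no purge options are implied.
tomcat6:
  pkg.purged: []
tomcat6-admin:
  pkg.purged: []
tomcat6-common:
  pkg.purged: []
libtomcat6-java:
  pkg.purged: []
oracle-java7-installer:
  pkg.purged: []
```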
[INFO ] Running state [tomcat7-common] at time 16:01:07.186205
[INFO ] Executing state pkg.installed for tomcat7-common
[DEBUG ] Could not LazyLoad pkg.normalize_name
[DEBUG ] Could not LazyLoad pkg.check_db
[DEBUG ] Could not LazyLoad pkg.normalize_name
[INFO ] Executing command ['apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'tomcat7-common'] in directory '/root'
[INFO ] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
[INFO ] Made the following changes:
'tomcat7-common' changed from 'absent' to '7.0.52-1ubuntu0.3'
'libcommons-dbcp-java' changed from 'absent' to '1.4-3ubuntu1'
'libservlet3.0-java' changed from 'absent' to '7.0.52-1ubuntu0.3'
'libcommons-pool-java' changed from 'absent' to '1.6-2'
'libtomcat7-java' changed from 'absent' to '7.0.52-1ubuntu0.3'
'libgeronimo-jta-1.0.1b-spec-java' changed from 'absent' to '1'
'libcommons-collections3-java' changed from 'absent' to '3.2.1-6'
'libecj-java' changed from 'absent' to '3.9.0-1'
'libgeronimo-jta-1.1-spec-java' changed from 'absent' to '1.1.1-3ubuntu1'
[DEBUG ] Refreshing modules...
[INFO ] Loading fresh modules for state activity
[DEBUG ] LazyLoaded jinja.render
[DEBUG ] LazyLoaded yaml.render
[DEBUG ] LazyLoaded saltutil.refresh_modules
[DEBUG ] LazyLoaded event.fire
[DEBUG ] SaltEvent PUB socket URI: ipc:///var/run/salt/minion/minion_event_13dcaed882_pub.ipc
[DEBUG ] SaltEvent PULL socket URI: ipc:///var/run/salt/minion/minion_event_13dcaed882_pull.ipc
[DEBUG ] Sending event - data = {'_stamp': '2016-02-04T23:01:11.191572'}
[DEBUG ] Handling event 'module_refresh\n\n\x81\xa6_stamp\xba2016-02-04T23:01:11.191572'
[DEBUG ] Refreshing modules. Notify=False
[DEBUG ] Reading configuration from /etc/salt/minion
[DEBUG ] Including configuration from '/etc/salt/minion.d/_schedule.conf'
[DEBUG ] Reading configuration from /etc/salt/minion.d/_schedule.conf
[INFO ] Completed state [tomcat7-common] at time 16:01:11.218073
[DEBUG ] LazyLoaded config.option
[DEBUG ] LazyLoaded pkg.install
[DEBUG ] LazyLoaded pkg.installed
[INFO ] Running state [tomcat7] at time 16:01:11.225329
[INFO ] Executing state pkg.installed for tomcat7
[DEBUG ] Error loading module.ifttt: IFTTT Secret Key Unavailable, not loading.
[DEBUG ] Error loading module.dockerng: Docker module could not get imported
[DEBUG ] Error loading module.victorops: No VictorOps api key found.
[DEBUG ] Error loading module.ipmi: No module named pyghmi.ipmi
[DEBUG ] Error loading module.rh_service: Cannot load rh_service module: OS not in set(['SUSE Enterprise Server', 'SUSE', 'RedHat', 'CentOS', 'CloudLinux', 'McAfee OS Server', 'XenServer', 'Amazon', 'OEL', 'ScientificLinux', 'ALT', 'Fedora'])
[DEBUG ] Error loading module.win_status: Cannot load win_status module on non-windows
[DEBUG ] Please install 'virt-what' to improve results of the 'virtual' grain.
[DEBUG ] You should upgrade pyOpenSSL to at least 0.14.1 to enable the use of X509 extensions in the tls module
[DEBUG ] Error loading module.glusterfs: glusterfs server is not installed
[DEBUG ] LazyLoaded config.merge
[DEBUG ] LazyLoaded mine.update
[DEBUG ] Error loading module.vsphere: Missing dependency: The vSphere module requires the pyVmomi Python module.
[DEBUG ] Error loading module.nacl: libnacl import error, perhaps missing python libnacl package
[DEBUG ] Error loading module.boto_cloudwatch: The boto_cloudwatch module cannot be loaded: boto libraries are unavailable.
[DEBUG ] Error loading module.npm: npm execution module could not be loaded because the npm binary could not be located
[DEBUG ] Error loading module.x509: Could not load x509 module, m2crypto unavailable
[DEBUG ] Could not LazyLoad pkg.normalize_name
[DEBUG ] Could not LazyLoad pkg.check_db
[DEBUG ] Could not LazyLoad pkg.normalize_name
[INFO ] Executing command ['apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'tomcat7'] in directory '/root'
[INFO ] User sudo_myuser01 Executing command saltutil.find_job with jid 20160204160123094258
[DEBUG ] Command details {'tgt_type': 'glob', 'jid': '20160204160123094258', 'tgt': 'minion-acc01.mycompany.com', 'ret': '', 'user': 'sudo_myuser01', 'arg': ['20160204155857806709'], 'fun': 'saltutil.find_job'}
[INFO ] Starting a new job with PID 3518
[DEBUG ] LazyLoaded saltutil.find_job
[DEBUG ] Minion return retry timer set to 9 seconds (randomized)
[INFO ] Returning information for job: 20160204160123094258
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/minion', 'minion-acc01.mycompany.com', 'tcp://10.0.24.11:4506', 'aes')
[DEBUG ] Initializing new SAuth for ('/etc/salt/pki/minion', 'minion-acc01.mycompany.com', 'tcp://10.0.24.11:4506')
[INFO ] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
[INFO ] Made the following changes:
'tomcat7' changed from 'absent' to '7.0.52-1ubuntu0.3'
'authbind' changed from 'absent' to '2.1.1'
[DEBUG ] Refreshing modules...
[INFO ] Loading fresh modules for state activity
[DEBUG ] LazyLoaded jinja.render
[DEBUG ] LazyLoaded yaml.render
[DEBUG ] LazyLoaded saltutil.refresh_modules
[DEBUG ] LazyLoaded event.fire
[DEBUG ] SaltEvent PUB socket URI: ipc:///var/run/salt/minion/minion_event_13dcaed882_pub.ipc
[DEBUG ] SaltEvent PULL socket URI: ipc:///var/run/salt/minion/minion_event_13dcaed882_pull.ipc
[DEBUG ] Sending event - data = {'_stamp': '2016-02-04T23:01:18.582488'}
[DEBUG ] Handling event 'module_refresh\n\n\x81\xa6_stamp\xba2016-02-04T23:01:18.582488'
[DEBUG ] Refreshing modules. Notify=False
[DEBUG ] Reading configuration from /etc/salt/minion
[DEBUG ] Including configuration from '/etc/salt/minion.d/_schedule.conf'
[DEBUG ] Reading configuration from /etc/salt/minion.d/_schedule.conf
[INFO ] Completed state [tomcat7] at time 16:01:18.610074
[DEBUG ] LazyLoaded config.option
[DEBUG ] LazyLoaded pkg.install
[DEBUG ] LazyLoaded pkg.installed
[INFO ] Running state [tomcat7-admin] at time 16:01:18.617100
[INFO ] Executing state pkg.installed for tomcat7-admin
[DEBUG ] Error loading module.ifttt: IFTTT Secret Key Unavailable, not loading.
[DEBUG ] Error loading module.dockerng: Docker module could not get imported
[DEBUG ] Error loading module.victorops: No VictorOps api key found.
[DEBUG ] Error loading module.ipmi: No module named pyghmi.ipmi
[DEBUG ] Error loading module.rh_service: Cannot load rh_service module: OS not in set(['SUSE Enterprise Server', 'SUSE', 'RedHat', 'CentOS', 'CloudLinux', 'McAfee OS Server', 'XenServer', 'Amazon', 'OEL', 'ScientificLinux', 'ALT', 'Fedora'])
[DEBUG ] Error loading module.win_status: Cannot load win_status module on non-windows
[DEBUG ] Please install 'virt-what' to improve results of the 'virtual' grain.
[DEBUG ] You should upgrade pyOpenSSL to at least 0.14.1 to enable the use of X509 extensions in the tls module
[DEBUG ] Error loading module.glusterfs: glusterfs server is not installed
[DEBUG ] Error loading module.vsphere: Missing dependency: The vSphere module requires the pyVmomi Python module.
[DEBUG ] Error loading module.nacl: libnacl import error, perhaps missing python libnacl package
[DEBUG ] Error loading module.boto_cloudwatch: The boto_cloudwatch module cannot be loaded: boto libraries are unavailable.
[DEBUG ] Error loading module.npm: npm execution module could not be loaded because the npm binary could not be located
[DEBUG ] Error loading module.x509: Could not load x509 module, m2crypto unavailable
[DEBUG ] Could not LazyLoad pkg.normalize_name
[DEBUG ] Could not LazyLoad pkg.check_db
[DEBUG ] Could not LazyLoad pkg.normalize_name
[INFO ] Executing command ['apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'tomcat7-admin'] in directory '/root'
[DEBUG ] LazyLoaded config.merge
[DEBUG ] LazyLoaded mine.update
[INFO ] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
[INFO ] Made the following changes:
'tomcat7-admin' changed from 'absent' to '7.0.52-1ubuntu0.3'
[DEBUG ] Refreshing modules...
[INFO ] Loading fresh modules for state activity
[DEBUG ] LazyLoaded jinja.render
[DEBUG ] LazyLoaded yaml.render
[DEBUG ] LazyLoaded saltutil.refresh_modules
[DEBUG ] LazyLoaded event.fire
[DEBUG ] SaltEvent PUB socket URI: ipc:///var/run/salt/minion/minion_event_13dcaed882_pub.ipc
[DEBUG ] SaltEvent PULL socket URI: ipc:///var/run/salt/minion/minion_event_13dcaed882_pull.ipc
[DEBUG ] Sending event - data = {'_stamp': '2016-02-04T23:01:19.788962'}
[DEBUG ] Handling event 'module_refresh\n\n\x81\xa6_stamp\xba2016-02-04T23:01:19.788962'
[DEBUG ] Refreshing modules. Notify=False
[DEBUG ] Reading configuration from /etc/salt/minion
[DEBUG ] Including configuration from '/etc/salt/minion.d/_schedule.conf'
[DEBUG ] Reading configuration from /etc/salt/minion.d/_schedule.conf
[INFO ] Completed state [tomcat7-admin] at time 16:01:19.818121
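Each Tomcat 7 package is installed in its own state, and each install triggers a module refresh (the "Loading fresh modules" lines) because newly installed packages can change what Salt is able to load. A sketch, with the ordering requisites an assumption:

```yaml
tomcat7-common:
  pkg.installed: []
tomcat7:
  pkg.installed:
    - require:
      - pkg: tomcat7-common   # ordering inferred from the log, not confirmed
tomcat7-admin:
  pkg.installed:
    - require:
      - pkg: tomcat7
```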
[DEBUG ] LazyLoaded config.option
[DEBUG ] LazyLoaded file.blockreplace
[DEBUG ] LazyLoaded user.present
[INFO ] Running state [tomcat7] at time 16:01:19.829381
[INFO ] Executing state user.present for tomcat7
[DEBUG ] LazyLoaded shadow.info
[DEBUG ] LazyLoaded user.info
[DEBUG ] LazyLoaded file.gid_to_group
[INFO ] User tomcat7 is present and up to date
[INFO ] Completed state [tomcat7] at time 16:01:19.838164
[DEBUG ] LazyLoaded group.present
[INFO ] Running state [tomcat7] at time 16:01:19.845775
[INFO ] Executing state group.present for tomcat7
[DEBUG ] LazyLoaded group.info
[INFO ] Group tomcat7 is present and up to date
[INFO ] Completed state [tomcat7] at time 16:01:19.848584
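The tomcat7 user and group are managed but already in the desired state. A minimal sketch (one ID may carry two different state modules in SLS; the gid linkage is an assumption):

```yaml
tomcat7:
  group.present: []
  user.present:
    - gid: tomcat7      # assumption: primary group tied to the group above
    - require:
      - group: tomcat7
```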
[INFO ] Running state [/var/lib/tomcat7/conf/tomcat-users.xml] at time 16:01:19.849852
[INFO ] Executing state file.managed for /var/lib/tomcat7/conf/tomcat-users.xml
[DEBUG ] LazyLoaded cp.hash_file
[DEBUG ] In saltenv 'base', looking at rel_path u'tomcat/tomcat-users-template.xml' to resolve u'salt://tomcat/tomcat-users-template.xml'
[DEBUG ] In saltenv 'base', ** considering ** path u'/var/cache/salt/minion/files/base/tomcat/tomcat-users-template.xml' to resolve u'salt://tomcat/tomcat-users-template.xml'
[DEBUG ] Please install 'virt-what' to improve results of the 'virtual' grain.
[INFO ] Fetching file from saltenv 'base', ** skipped ** latest already in cache u'salt://tomcat/tomcat-users-template.xml'
[INFO ] File changed:
---
+++
@@ -33,4 +33,8 @@
<user username="both" password="tomcat" roles="tomcat,role1"/>
<user username="role1" password="tomcat" roles="role1"/>
-->
+<!-- Begin myuser01 User Section -->
+<!-- End myuser01 User Section -->
+<!-- Begin myuser01 Monitoring Section -->
+<!-- End myuser01 Monitoring Section -->
</tomcat-users>
[INFO ] Completed state [/var/lib/tomcat7/conf/tomcat-users.xml] at time 16:01:19.891396
[INFO ] Running state [/var/lib/tomcat7/conf/tomcat-users.xml] at time 16:01:19.892029
[INFO ] Executing state file.blockreplace for /var/lib/tomcat7/conf/tomcat-users.xml
[INFO ] File changed:
---
+++
@@ -34,6 +34,8 @@
<user username="role1" password="tomcat" roles="role1"/>
-->
<!-- Begin myuser01 User Section -->
+<user username="admin" password="pwd" roles="manager-gui,manager-status"/>
+<user username="scriptor" password="pwd" roles="manager-script"/>
<!-- End myuser01 User Section -->
<!-- Begin myuser01 Monitoring Section -->
<!-- End myuser01 Monitoring Section -->
[INFO ] Completed state [/var/lib/tomcat7/conf/tomcat-users.xml] at time 16:01:19.896988
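file.blockreplace edits only the text between its markers, which is why the managed template first lays down the empty Begin/End comment pairs. The markers and content in this sketch are taken from the diff above:

```yaml
/var/lib/tomcat7/conf/tomcat-users.xml:
  file.blockreplace:
    - marker_start: "<!-- Begin myuser01 User Section -->"
    - marker_end: "<!-- End myuser01 User Section -->"
    - content: |
        <user username="admin" password="pwd" roles="manager-gui,manager-status"/>
        <user username="scriptor" password="pwd" roles="manager-script"/>
```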
[DEBUG ] Error loading module.rh_service: Cannot load rh_service module: OS not in set(['SUSE Enterprise Server', 'SUSE', 'RedHat', 'CentOS', 'CloudLinux', 'McAfee OS Server', 'XenServer', 'Amazon', 'OEL', 'ScientificLinux', 'ALT', 'Fedora'])
[DEBUG ] Error loading module.ifttt: IFTTT Secret Key Unavailable, not loading.
[DEBUG ] Error loading module.dockerng: Docker module could not get imported
[DEBUG ] Error loading module.victorops: No VictorOps api key found.
[DEBUG ] Error loading module.ipmi: No module named pyghmi.ipmi
[DEBUG ] Error loading module.win_status: Cannot load win_status module on non-windows
[DEBUG ] You should upgrade pyOpenSSL to at least 0.14.1 to enable the use of X509 extensions in the tls module
[DEBUG ] Error loading module.glusterfs: glusterfs server is not installed
[DEBUG ] Error loading module.vsphere: Missing dependency: The vSphere module requires the pyVmomi Python module.
[DEBUG ] Error loading module.nacl: libnacl import error, perhaps missing python libnacl package
[DEBUG ] LazyLoaded service.start
[DEBUG ] LazyLoaded service.running
[INFO ] Running state [tomcat7] at time 16:01:20.071248
[INFO ] Executing state service.running for tomcat7
[DEBUG ] LazyLoaded status.pid
[DEBUG ] LazyLoaded cmd.run_stdout
[INFO ] Executing command 'ps -efHww' in directory '/root'
[DEBUG ] stdout: UID PID PPID C STIME TTY TIME CMD
root 2 0 0 15:57 ? 00:00:00 [kthreadd]
root 3 2 0 15:57 ? 00:00:00 [ksoftirqd/0]
root 4 2 0 15:57 ? 00:00:00 [kworker/0:0]
root 5 2 0 15:57 ? 00:00:00 [kworker/0:0H]
root 6 2 0 15:57 ? 00:00:01 [kworker/u128:0]
root 7 2 0 15:57 ? 00:00:00 [rcu_sched]
root 8 2 0 15:57 ? 00:00:00 [rcuos/0]
root 9 2 0 15:57 ? 00:00:00 [rcuos/1]
root 10 2 0 15:57 ? 00:00:00 [rcuos/2]
root 11 2 0 15:57 ? 00:00:00 [rcuos/3]
root 12 2 0 15:57 ? 00:00:00 [rcuos/4]
root 13 2 0 15:57 ? 00:00:00 [rcuos/5]
root 14 2 0 15:57 ? 00:00:00 [rcuos/6]
root 15 2 0 15:57 ? 00:00:00 [rcuos/7]
root 16 2 0 15:57 ? 00:00:00 [rcuos/8]
root 17 2 0 15:57 ? 00:00:00 [rcuos/9]
root 18 2 0 15:57 ? 00:00:00 [rcuos/10]
root 19 2 0 15:57 ? 00:00:00 [rcuos/11]
root 20 2 0 15:57 ? 00:00:00 [rcuos/12]
root 21 2 0 15:57 ? 00:00:00 [rcuos/13]
root 22 2 0 15:57 ? 00:00:00 [rcuos/14]
root 23 2 0 15:57 ? 00:00:00 [rcuos/15]
root 24 2 0 15:57 ? 00:00:00 [rcuos/16]
root 25 2 0 15:57 ? 00:00:00 [rcuos/17]
root 26 2 0 15:57 ? 00:00:00 [rcuos/18]
root 27 2 0 15:57 ? 00:00:00 [rcuos/19]
root 28 2 0 15:57 ? 00:00:00 [rcuos/20]
root 29 2 0 15:57 ? 00:00:00 [rcuos/21]
root 30 2 0 15:57 ? 00:00:00 [rcuos/22]
root 31 2 0 15:57 ? 00:00:00 [rcuos/23]
root 32 2 0 15:57 ? 00:00:00 [rcuos/24]
root 33 2 0 15:57 ? 00:00:00 [rcuos/25]
root 34 2 0 15:57 ? 00:00:00 [rcuos/26]
root 35 2 0 15:57 ? 00:00:00 [rcuos/27]
root 36 2 0 15:57 ? 00:00:00 [rcuos/28]
root 37 2 0 15:57 ? 00:00:00 [rcuos/29]
root 38 2 0 15:57 ? 00:00:00 [rcuos/30]
root 39 2 0 15:57 ? 00:00:00 [rcuos/31]
root 40 2 0 15:57 ? 00:00:00 [rcuos/32]
root 41 2 0 15:57 ? 00:00:00 [rcuos/33]
root 42 2 0 15:57 ? 00:00:00 [rcuos/34]
root 43 2 0 15:57 ? 00:00:00 [rcuos/35]
root 44 2 0 15:57 ? 00:00:00 [rcuos/36]
root 45 2 0 15:57 ? 00:00:00 [rcuos/37]
root 46 2 0 15:57 ? 00:00:00 [rcuos/38]
root 47 2 0 15:57 ? 00:00:00 [rcuos/39]
root 48 2 0 15:57 ? 00:00:00 [rcuos/40]
root 49 2 0 15:57 ? 00:00:00 [rcuos/41]
root 50 2 0 15:57 ? 00:00:00 [rcuos/42]
root 51 2 0 15:57 ? 00:00:00 [rcuos/43]
root 52 2 0 15:57 ? 00:00:00 [rcuos/44]
root 53 2 0 15:57 ? 00:00:00 [rcuos/45]
root 54 2 0 15:57 ? 00:00:00 [rcuos/46]
root 55 2 0 15:57 ? 00:00:00 [rcuos/47]
root 56 2 0 15:57 ? 00:00:00 [rcuos/48]
root 57 2 0 15:57 ? 00:00:00 [rcuos/49]
root 58 2 0 15:57 ? 00:00:00 [rcuos/50]
root 59 2 0 15:57 ? 00:00:00 [rcuos/51]
root 60 2 0 15:57 ? 00:00:00 [rcuos/52]
root 61 2 0 15:57 ? 00:00:00 [rcuos/53]
root 62 2 0 15:57 ? 00:00:00 [rcuos/54]
root 63 2 0 15:57 ? 00:00:00 [rcuos/55]
root 64 2 0 15:57 ? 00:00:00 [rcuos/56]
root 65 2 0 15:57 ? 00:00:00 [rcuos/57]
root 66 2 0 15:57 ? 00:00:00 [rcuos/58]
root 67 2 0 15:57 ? 00:00:00 [rcuos/59]
root 68 2 0 15:57 ? 00:00:00 [rcuos/60]
root 69 2 0 15:57 ? 00:00:00 [rcuos/61]
root 70 2 0 15:57 ? 00:00:00 [rcuos/62]
root 71 2 0 15:57 ? 00:00:00 [rcuos/63]
root 72 2 0 15:57 ? 00:00:00 [rcu_bh]
root 73 2 0 15:57 ? 00:00:00 [rcuob/0]
root 74 2 0 15:57 ? 00:00:00 [rcuob/1]
root 75 2 0 15:57 ? 00:00:00 [rcuob/2]
root 76 2 0 15:57 ? 00:00:00 [rcuob/3]
root 77 2 0 15:57 ? 00:00:00 [rcuob/4]
root 78 2 0 15:57 ? 00:00:00 [rcuob/5]
root 79 2 0 15:57 ? 00:00:00 [rcuob/6]
root 80 2 0 15:57 ? 00:00:00 [rcuob/7]
root 81 2 0 15:57 ? 00:00:00 [rcuob/8]
root 82 2 0 15:57 ? 00:00:00 [rcuob/9]
root 83 2 0 15:57 ? 00:00:00 [rcuob/10]
root 84 2 0 15:57 ? 00:00:00 [rcuob/11]
root 85 2 0 15:57 ? 00:00:00 [rcuob/12]
root 86 2 0 15:57 ? 00:00:00 [rcuob/13]
root 87 2 0 15:57 ? 00:00:00 [rcuob/14]
root 88 2 0 15:57 ? 00:00:00 [rcuob/15]
root 89 2 0 15:57 ? 00:00:00 [rcuob/16]
root 90 2 0 15:57 ? 00:00:00 [rcuob/17]
root 91 2 0 15:57 ? 00:00:00 [rcuob/18]
root 92 2 0 15:57 ? 00:00:00 [rcuob/19]
root 93 2 0 15:57 ? 00:00:00 [rcuob/20]
root 94 2 0 15:57 ? 00:00:00 [rcuob/21]
root 95 2 0 15:57 ? 00:00:00 [rcuob/22]
root 96 2 0 15:57 ? 00:00:00 [rcuob/23]
root 97 2 0 15:57 ? 00:00:00 [rcuob/24]
root 98 2 0 15:57 ? 00:00:00 [rcuob/25]
root 99 2 0 15:57 ? 00:00:00 [rcuob/26]
root 100 2 0 15:57 ? 00:00:00 [rcuob/27]
root 101 2 0 15:57 ? 00:00:00 [rcuob/28]
root 102 2 0 15:57 ? 00:00:00 [rcuob/29]
root 103 2 0 15:57 ? 00:00:00 [rcuob/30]
root 104 2 0 15:57 ? 00:00:00 [rcuob/31]
root 105 2 0 15:57 ? 00:00:00 [rcuob/32]
root 106 2 0 15:57 ? 00:00:00 [rcuob/33]
root 107 2 0 15:57 ? 00:00:00 [rcuob/34]
root 108 2 0 15:57 ? 00:00:00 [rcuob/35]
root 109 2 0 15:57 ? 00:00:00 [rcuob/36]
root 110 2 0 15:57 ? 00:00:00 [rcuob/37]
root 111 2 0 15:57 ? 00:00:00 [rcuob/38]
root 112 2 0 15:57 ? 00:00:00 [rcuob/39]
root 113 2 0 15:57 ? 00:00:00 [rcuob/40]
root 114 2 0 15:57 ? 00:00:00 [rcuob/41]
root 115 2 0 15:57 ? 00:00:00 [rcuob/42]
root 116 2 0 15:57 ? 00:00:00 [rcuob/43]
root 117 2 0 15:57 ? 00:00:00 [rcuob/44]
root 118 2 0 15:57 ? 00:00:00 [rcuob/45]
root 119 2 0 15:57 ? 00:00:00 [rcuob/46]
root 120 2 0 15:57 ? 00:00:00 [rcuob/47]
root 121 2 0 15:57 ? 00:00:00 [rcuob/48]
root 122 2 0 15:57 ? 00:00:00 [rcuob/49]
root 123 2 0 15:57 ? 00:00:00 [rcuob/50]
root 124 2 0 15:57 ? 00:00:00 [rcuob/51]
root 125 2 0 15:57 ? 00:00:00 [rcuob/52]
root 126 2 0 15:57 ? 00:00:00 [rcuob/53]
root 127 2 0 15:57 ? 00:00:00 [rcuob/54]
root 128 2 0 15:57 ? 00:00:00 [rcuob/55]
root 129 2 0 15:57 ? 00:00:00 [rcuob/56]
root 130 2 0 15:57 ? 00:00:00 [rcuob/57]
root 131 2 0 15:57 ? 00:00:00 [rcuob/58]
root 132 2 0 15:57 ? 00:00:00 [rcuob/59]
root 133 2 0 15:57 ? 00:00:00 [rcuob/60]
root 134 2 0 15:57 ? 00:00:00 [rcuob/61]
root 135 2 0 15:57 ? 00:00:00 [rcuob/62]
root 136 2 0 15:57 ? 00:00:00 [rcuob/63]
root 137 2 0 15:57 ? 00:00:00 [migration/0]
root 138 2 0 15:57 ? 00:00:00 [watchdog/0]
root 139 2 0 15:57 ? 00:00:00 [watchdog/1]
root 140 2 0 15:57 ? 00:00:00 [migration/1]
root 141 2 0 15:57 ? 00:00:00 [ksoftirqd/1]
root 142 2 0 15:57 ? 00:00:00 [kworker/1:0]
root 143 2 0 15:57 ? 00:00:00 [kworker/1:0H]
root 144 2 0 15:57 ? 00:00:00 [khelper]
root 145 2 0 15:57 ? 00:00:00 [kdevtmpfs]
root 146 2 0 15:57 ? 00:00:00 [netns]
root 147 2 0 15:57 ? 00:00:00 [writeback]
root 148 2 0 15:57 ? 00:00:00 [kintegrityd]
root 149 2 0 15:57 ? 00:00:00 [bioset]
root 150 2 0 15:57 ? 00:00:00 [kworker/u129:0]
root 151 2 0 15:57 ? 00:00:00 [kblockd]
root 152 2 0 15:57 ? 00:00:00 [ata_sff]
root 153 2 0 15:57 ? 00:00:00 [khubd]
root 154 2 0 15:57 ? 00:00:00 [md]
root 155 2 0 15:57 ? 00:00:00 [devfreq_wq]
root 156 2 0 15:57 ? 00:00:00 [kworker/1:1]
root 158 2 0 15:57 ? 00:00:00 [khungtaskd]
root 159 2 0 15:57 ? 00:00:00 [kswapd0]
root 160 2 0 15:57 ? 00:00:00 [ksmd]
root 161 2 0 15:57 ? 00:00:00 [khugepaged]
root 162 2 0 15:57 ? 00:00:00 [fsnotify_mark]
root 163 2 0 15:57 ? 00:00:00 [ecryptfs-kthrea]
root 164 2 0 15:57 ? 00:00:00 [crypto]
root 176 2 0 15:57 ? 00:00:00 [kthrotld]
root 177 2 0 15:57 ? 00:00:00 [kworker/u128:1]
root 178 2 0 15:57 ? 00:00:00 [scsi_eh_0]
root 179 2 0 15:57 ? 00:00:00 [scsi_eh_1]
root 180 2 0 15:57 ? 00:00:00 [kworker/u128:2]
root 181 2 0 15:57 ? 00:00:00 [kworker/0:1]
root 182 2 0 15:57 ? 00:00:00 [kworker/u128:3]
root 201 2 0 15:57 ? 00:00:00 [deferwq]
root 202 2 0 15:57 ? 00:00:00 [charger_manager]
root 253 2 0 15:57 ? 00:00:00 [kpsmoused]
root 254 2 0 15:57 ? 00:00:00 [mpt_poll_0]
root 255 2 0 15:57 ? 00:00:00 [mpt/0]
root 256 2 0 15:57 ? 00:00:00 [scsi_eh_2]
root 258 2 0 15:57 ? 00:00:00 [kworker/1:2]
root 264 2 0 15:57 ? 00:00:00 [kdmflush]
root 265 2 0 15:57 ? 00:00:00 [bioset]
root 267 2 0 15:57 ? 00:00:00 [kdmflush]
root 268 2 0 15:57 ? 00:00:00 [bioset]
root 283 2 0 15:57 ? 00:00:00 [jbd2/dm-0-8]
root 284 2 0 15:57 ? 00:00:00 [ext4-rsv-conver]
root 461 2 0 15:58 ? 00:00:00 [ttm_swap]
root 474 2 0 15:58 ? 00:00:00 [ext4-rsv-conver]
root 1322 2 0 15:58 ? 00:00:00 [kauditd]
root 1 0 0 15:57 ? 00:00:01 /sbin/init
root 420 1 0 15:58 ? 00:00:00 upstart-udev-bridge --daemon
root 424 1 0 15:58 ? 00:00:00 /lib/systemd/systemd-udevd --daemon
message+ 597 1 0 15:58 ? 00:00:00 dbus-daemon --system --fork
root 647 1 0 15:58 ? 00:00:00 /lib/systemd/systemd-logind
syslog 653 1 0 15:58 ? 00:00:00 rsyslogd
root 867 1 0 15:58 ? 00:00:00 upstart-file-bridge --daemon
root 869 1 0 15:58 ? 00:00:00 upstart-socket-bridge --daemon
root 930 1 0 15:58 tty4 00:00:00 /sbin/getty -8 38400 tty4
root 933 1 0 15:58 tty5 00:00:00 /sbin/getty -8 38400 tty5
root 940 1 0 15:58 tty2 00:00:00 /sbin/getty -8 38400 tty2
root 941 1 0 15:58 tty3 00:00:00 /sbin/getty -8 38400 tty3
root 943 1 0 15:58 tty6 00:00:00 /sbin/getty -8 38400 tty6
root 977 1 0 15:58 ? 00:00:00 /usr/sbin/sshd -D
root 1320 977 0 15:58 ? 00:00:00 sshd: myuser01 [priv]
myuser01 1369 1320 0 15:58 ? 00:00:00 sshd: myuser01@pts/0
myuser01 1370 1369 0 15:58 pts/0 00:00:00 -bash
root 1384 1370 0 15:58 pts/0 00:00:00 sudo su
root 1385 1384 0 15:58 pts/0 00:00:00 su
root 1386 1385 0 15:58 pts/0 00:00:00 bash
root 1415 1386 0 15:58 pts/0 00:00:00 /usr/bin/python /usr/bin/salt-minion -l debug
root 1420 1415 0 15:58 pts/0 00:00:00 /usr/bin/python /usr/bin/salt-minion -l debug
root 997 1 0 15:58 ? 00:00:00 cron
daemon 998 1 0 15:58 ? 00:00:00 atd
root 1024 1 0 15:58 ? 00:00:00 acpid -c /etc/acpi/events -s /var/run/acpid.socket
root 1069 1 0 15:58 ? 00:00:00 /usr/sbin/irqbalance
root 1143 1 0 15:58 ? 00:00:00 /usr/bin/vmtoolsd
root 1205 1 0 15:58 tty1 00:00:00 /sbin/getty -8 38400 tty1
root 1511 1 2 15:58 ? 00:00:04 /usr/bin/python /usr/bin/salt-minion -l debug
root 3624 1511 0 16:01 ? 00:00:00 ps -efHww
root 2587 1 0 16:01 ? 00:00:00 minion-processor -home /usr/lib/jvm/default-java -jvm server -cp /opt/myuser01/minion-processor/minion-processor.jar -user myuser01 -procname minion-processor -nodetach -outfile /var/log/myuser01/minion-processor_output.log -errfile /var/log/myuser01/minion-processor_error.log -pidfile /run/minion-processor.pid com.mycompany.minion.minionProcessorDaemon -p /etc/myuser01/minion-processor/minion-processor.properties
myuser01 2591 2587 35 16:01 ? 00:00:04 minion-processor -home /usr/lib/jvm/default-java -jvm server -cp /opt/myuser01/minion-processor/minion-processor.jar -user myuser01 -procname minion-processor -nodetach -outfile /var/log/myuser01/minion-processor_output.log -errfile /var/log/myuser01/minion-processor_error.log -pidfile /run/minion-processor.pid com.mycompany.minion.minionProcessorDaemon -p /etc/myuser01/minion-processor/minion-processor.properties
tomcat7 3502 1 44 16:01 ? 00:00:02 /usr/lib/jvm/default-java/bin/java -Djava.util.logging.config.file=/var/lib/tomcat7/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager -Djava.awt.headless=true -Xmx128m -XX:+UseConcMarkSweepGC -Djava.endorsed.dirs=/usr/share/tomcat7/endorsed -classpath /usr/share/tomcat7/bin/bootstrap.jar:/usr/share/tomcat7/bin/tomcat-juli.jar -Dcatalina.base=/var/lib/tomcat7 -Dcatalina.home=/usr/share/tomcat7 -Djava.io.tmpdir=/tmp/tomcat7-tomcat7-tmp org.apache.catalina.startup.Bootstrap start
[INFO ] The service tomcat7 is already running
[INFO ] Completed state [tomcat7] at time 16:01:20.127708
[INFO ] Running state [tomcat7] at time 16:01:20.127915
[INFO ] Executing state service.mod_watch for tomcat7
[INFO ] Executing command 'ps -efHww' in directory '/root'
[DEBUG ] stdout: UID PID PPID C STIME TTY TIME CMD
root 2 0 0 15:57 ? 00:00:00 [kthreadd]
root 3 2 0 15:57 ? 00:00:00 [ksoftirqd/0]
root 4 2 0 15:57 ? 00:00:00 [kworker/0:0]
root 5 2 0 15:57 ? 00:00:00 [kworker/0:0H]
root 6 2 0 15:57 ? 00:00:01 [kworker/u128:0]
root 7 2 0 15:57 ? 00:00:00 [rcu_sched]
root 8 2 0 15:57 ? 00:00:00 [rcuos/0]
root 9 2 0 15:57 ? 00:00:00 [rcuos/1]
root 10 2 0 15:57 ? 00:00:00 [rcuos/2]
root 11 2 0 15:57 ? 00:00:00 [rcuos/3]
root 12 2 0 15:57 ? 00:00:00 [rcuos/4]
root 13 2 0 15:57 ? 00:00:00 [rcuos/5]
root 14 2 0 15:57 ? 00:00:00 [rcuos/6]
root 15 2 0 15:57 ? 00:00:00 [rcuos/7]
root 16 2 0 15:57 ? 00:00:00 [rcuos/8]
root 17 2 0 15:57 ? 00:00:00 [rcuos/9]
root 18 2 0 15:57 ? 00:00:00 [rcuos/10]
root 19 2 0 15:57 ? 00:00:00 [rcuos/11]
root 20 2 0 15:57 ? 00:00:00 [rcuos/12]
root 21 2 0 15:57 ? 00:00:00 [rcuos/13]
root 22 2 0 15:57 ? 00:00:00 [rcuos/14]
root 23 2 0 15:57 ? 00:00:00 [rcuos/15]
root 24 2 0 15:57 ? 00:00:00 [rcuos/16]
root 25 2 0 15:57 ? 00:00:00 [rcuos/17]
root 26 2 0 15:57 ? 00:00:00 [rcuos/18]
root 27 2 0 15:57 ? 00:00:00 [rcuos/19]
root 28 2 0 15:57 ? 00:00:00 [rcuos/20]
root 29 2 0 15:57 ? 00:00:00 [rcuos/21]
root 30 2 0 15:57 ? 00:00:00 [rcuos/22]
root 31 2 0 15:57 ? 00:00:00 [rcuos/23]
root 32 2 0 15:57 ? 00:00:00 [rcuos/24]
root 33 2 0 15:57 ? 00:00:00 [rcuos/25]
root 34 2 0 15:57 ? 00:00:00 [rcuos/26]
root 35 2 0 15:57 ? 00:00:00 [rcuos/27]
root 36 2 0 15:57 ? 00:00:00 [rcuos/28]
root 37 2 0 15:57 ? 00:00:00 [rcuos/29]
root 38 2 0 15:57 ? 00:00:00 [rcuos/30]
root 39 2 0 15:57 ? 00:00:00 [rcuos/31]
root 40 2 0 15:57 ? 00:00:00 [rcuos/32]
root 41 2 0 15:57 ? 00:00:00 [rcuos/33]
root 42 2 0 15:57 ? 00:00:00 [rcuos/34]
root 43 2 0 15:57 ? 00:00:00 [rcuos/35]
root 44 2 0 15:57 ? 00:00:00 [rcuos/36]
root 45 2 0 15:57 ? 00:00:00 [rcuos/37]
root 46 2 0 15:57 ? 00:00:00 [rcuos/38]
root 47 2 0 15:57 ? 00:00:00 [rcuos/39]
root 48 2 0 15:57 ? 00:00:00 [rcuos/40]
root 49 2 0 15:57 ? 00:00:00 [rcuos/41]
root 50 2 0 15:57 ? 00:00:00 [rcuos/42]
root 51 2 0 15:57 ? 00:00:00 [rcuos/43]
root 52 2 0 15:57 ? 00:00:00 [rcuos/44]
root 53 2 0 15:57 ? 00:00:00 [rcuos/45]
root 54 2 0 15:57 ? 00:00:00 [rcuos/46]
root 55 2 0 15:57 ? 00:00:00 [rcuos/47]
root 56 2 0 15:57 ? 00:00:00 [rcuos/48]
root 57 2 0 15:57 ? 00:00:00 [rcuos/49]
root 58 2 0 15:57 ? 00:00:00 [rcuos/50]
root 59 2 0 15:57 ? 00:00:00 [rcuos/51]
root 60 2 0 15:57 ? 00:00:00 [rcuos/52]
root 61 2 0 15:57 ? 00:00:00 [rcuos/53]
root 62 2 0 15:57 ? 00:00:00 [rcuos/54]
root 63 2 0 15:57 ? 00:00:00 [rcuos/55]
root 64 2 0 15:57 ? 00:00:00 [rcuos/56]
root 65 2 0 15:57 ? 00:00:00 [rcuos/57]
root 66 2 0 15:57 ? 00:00:00 [rcuos/58]
root 67 2 0 15:57 ? 00:00:00 [rcuos/59]
root 68 2 0 15:57 ? 00:00:00 [rcuos/60]
root 69 2 0 15:57 ? 00:00:00 [rcuos/61]
root 70 2 0 15:57 ? 00:00:00 [rcuos/62]
root 71 2 0 15:57 ? 00:00:00 [rcuos/63]
root 72 2 0 15:57 ? 00:00:00 [rcu_bh]
root 73 2 0 15:57 ? 00:00:00 [rcuob/0]
root 74 2 0 15:57 ? 00:00:00 [rcuob/1]
root 75 2 0 15:57 ? 00:00:00 [rcuob/2]
root 76 2 0 15:57 ? 00:00:00 [rcuob/3]
root 77 2 0 15:57 ? 00:00:00 [rcuob/4]
root 78 2 0 15:57 ? 00:00:00 [rcuob/5]
root 79 2 0 15:57 ? 00:00:00 [rcuob/6]
root 80 2 0 15:57 ? 00:00:00 [rcuob/7]
root 81 2 0 15:57 ? 00:00:00 [rcuob/8]
root 82 2 0 15:57 ? 00:00:00 [rcuob/9]
root 83 2 0 15:57 ? 00:00:00 [rcuob/10]
root 84 2 0 15:57 ? 00:00:00 [rcuob/11]
root 85 2 0 15:57 ? 00:00:00 [rcuob/12]
root 86 2 0 15:57 ? 00:00:00 [rcuob/13]
root 87 2 0 15:57 ? 00:00:00 [rcuob/14]
root 88 2 0 15:57 ? 00:00:00 [rcuob/15]
root 89 2 0 15:57 ? 00:00:00 [rcuob/16]
root 90 2 0 15:57 ? 00:00:00 [rcuob/17]
root 91 2 0 15:57 ? 00:00:00 [rcuob/18]
root 92 2 0 15:57 ? 00:00:00 [rcuob/19]
root 93 2 0 15:57 ? 00:00:00 [rcuob/20]
root 94 2 0 15:57 ? 00:00:00 [rcuob/21]
root 95 2 0 15:57 ? 00:00:00 [rcuob/22]
root 96 2 0 15:57 ? 00:00:00 [rcuob/23]
root 97 2 0 15:57 ? 00:00:00 [rcuob/24]
root 98 2 0 15:57 ? 00:00:00 [rcuob/25]
root 99 2 0 15:57 ? 00:00:00 [rcuob/26]
root 100 2 0 15:57 ? 00:00:00 [rcuob/27]
root 101 2 0 15:57 ? 00:00:00 [rcuob/28]
root 102 2 0 15:57 ? 00:00:00 [rcuob/29]
root 103 2 0 15:57 ? 00:00:00 [rcuob/30]
root 104 2 0 15:57 ? 00:00:00 [rcuob/31]
root 105 2 0 15:57 ? 00:00:00 [rcuob/32]
root 106 2 0 15:57 ? 00:00:00 [rcuob/33]
root 107 2 0 15:57 ? 00:00:00 [rcuob/34]
root 108 2 0 15:57 ? 00:00:00 [rcuob/35]
root 109 2 0 15:57 ? 00:00:00 [rcuob/36]
root 110 2 0 15:57 ? 00:00:00 [rcuob/37]
root 111 2 0 15:57 ? 00:00:00 [rcuob/38]
root 112 2 0 15:57 ? 00:00:00 [rcuob/39]
root 113 2 0 15:57 ? 00:00:00 [rcuob/40]
root 114 2 0 15:57 ? 00:00:00 [rcuob/41]
root 115 2 0 15:57 ? 00:00:00 [rcuob/42]
root 116 2 0 15:57 ? 00:00:00 [rcuob/43]
root 117 2 0 15:57 ? 00:00:00 [rcuob/44]
root 118 2 0 15:57 ? 00:00:00 [rcuob/45]
root 119 2 0 15:57 ? 00:00:00 [rcuob/46]
root 120 2 0 15:57 ? 00:00:00 [rcuob/47]
root 121 2 0 15:57 ? 00:00:00 [rcuob/48]
root 122 2 0 15:57 ? 00:00:00 [rcuob/49]
root 123 2 0 15:57 ? 00:00:00 [rcuob/50]
root 124 2 0 15:57 ? 00:00:00 [rcuob/51]
root 125 2 0 15:57 ? 00:00:00 [rcuob/52]
root 126 2 0 15:57 ? 00:00:00 [rcuob/53]
root 127 2 0 15:57 ? 00:00:00 [rcuob/54]
root 128 2 0 15:57 ? 00:00:00 [rcuob/55]
root 129 2 0 15:57 ? 00:00:00 [rcuob/56]
root 130 2 0 15:57 ? 00:00:00 [rcuob/57]
root 131 2 0 15:57 ? 00:00:00 [rcuob/58]
root 132 2 0 15:57 ? 00:00:00 [rcuob/59]
root 133 2 0 15:57 ? 00:00:00 [rcuob/60]
root 134 2 0 15:57 ? 00:00:00 [rcuob/61]
root 135 2 0 15:57 ? 00:00:00 [rcuob/62]
root 136 2 0 15:57 ? 00:00:00 [rcuob/63]
root 137 2 0 15:57 ? 00:00:00 [migration/0]
root 138 2 0 15:57 ? 00:00:00 [watchdog/0]
root 139 2 0 15:57 ? 00:00:00 [watchdog/1]
root 140 2 0 15:57 ? 00:00:00 [migration/1]
root 141 2 0 15:57 ? 00:00:00 [ksoftirqd/1]
root 142 2 0 15:57 ? 00:00:00 [kworker/1:0]
root 143 2 0 15:57 ? 00:00:00 [kworker/1:0H]
root 144 2 0 15:57 ? 00:00:00 [khelper]
root 145 2 0 15:57 ? 00:00:00 [kdevtmpfs]
root 146 2 0 15:57 ? 00:00:00 [netns]
root 147 2 0 15:57 ? 00:00:00 [writeback]
root 148 2 0 15:57 ? 00:00:00 [kintegrityd]
root 149 2 0 15:57 ? 00:00:00 [bioset]
root 150 2 0 15:57 ? 00:00:00 [kworker/u129:0]
root 151 2 0 15:57 ? 00:00:00 [kblockd]
root 152 2 0 15:57 ? 00:00:00 [ata_sff]
root 153 2 0 15:57 ? 00:00:00 [khubd]
root 154 2 0 15:57 ? 00:00:00 [md]
root 155 2 0 15:57 ? 00:00:00 [devfreq_wq]
root 156 2 0 15:57 ? 00:00:00 [kworker/1:1]
root 158 2 0 15:57 ? 00:00:00 [khungtaskd]
root 159 2 0 15:57 ? 00:00:00 [kswapd0]
root 160 2 0 15:57 ? 00:00:00 [ksmd]
root 161 2 0 15:57 ? 00:00:00 [khugepaged]
root 162 2 0 15:57 ? 00:00:00 [fsnotify_mark]
root 163 2 0 15:57 ? 00:00:00 [ecryptfs-kthrea]
root 164 2 0 15:57 ? 00:00:00 [crypto]
root 176 2 0 15:57 ? 00:00:00 [kthrotld]
root 177 2 0 15:57 ? 00:00:00 [kworker/u128:1]
root 178 2 0 15:57 ? 00:00:00 [scsi_eh_0]
root 179 2 0 15:57 ? 00:00:00 [scsi_eh_1]
root 180 2 0 15:57 ? 00:00:00 [kworker/u128:2]
root 181 2 0 15:57 ? 00:00:00 [kworker/0:1]
root 182 2 0 15:57 ? 00:00:00 [kworker/u128:3]
root 201 2 0 15:57 ? 00:00:00 [deferwq]
root 202 2 0 15:57 ? 00:00:00 [charger_manager]
root 253 2 0 15:57 ? 00:00:00 [kpsmoused]
root 254 2 0 15:57 ? 00:00:00 [mpt_poll_0]
root 255 2 0 15:57 ? 00:00:00 [mpt/0]
root 256 2 0 15:57 ? 00:00:00 [scsi_eh_2]
root 258 2 0 15:57 ? 00:00:00 [kworker/1:2]
root 264 2 0 15:57 ? 00:00:00 [kdmflush]
root 265 2 0 15:57 ? 00:00:00 [bioset]
root 267 2 0 15:57 ? 00:00:00 [kdmflush]
root 268 2 0 15:57 ? 00:00:00 [bioset]
root 283 2 0 15:57 ? 00:00:00 [jbd2/dm-0-8]
root 284 2 0 15:57 ? 00:00:00 [ext4-rsv-conver]
root 461 2 0 15:58 ? 00:00:00 [ttm_swap]
root 474 2 0 15:58 ? 00:00:00 [ext4-rsv-conver]
root 1322 2 0 15:58 ? 00:00:00 [kauditd]
root 1 0 0 15:57 ? 00:00:01 /sbin/init
root 420 1 0 15:58 ? 00:00:00 upstart-udev-bridge --daemon
root 424 1 0 15:58 ? 00:00:00 /lib/systemd/systemd-udevd --daemon
message+ 597 1 0 15:58 ? 00:00:00 dbus-daemon --system --fork
root 647 1 0 15:58 ? 00:00:00 /lib/systemd/systemd-logind
syslog 653 1 0 15:58 ? 00:00:00 rsyslogd
root 867 1 0 15:58 ? 00:00:00 upstart-file-bridge --daemon
root 869 1 0 15:58 ? 00:00:00 upstart-socket-bridge --daemon
root 930 1 0 15:58 tty4 00:00:00 /sbin/getty -8 38400 tty4
root 933 1 0 15:58 tty5 00:00:00 /sbin/getty -8 38400 tty5
root 940 1 0 15:58 tty2 00:00:00 /sbin/getty -8 38400 tty2
root 941 1 0 15:58 tty3 00:00:00 /sbin/getty -8 38400 tty3
root 943 1 0 15:58 tty6 00:00:00 /sbin/getty -8 38400 tty6
root 977 1 0 15:58 ? 00:00:00 /usr/sbin/sshd -D
root 1320 977 0 15:58 ? 00:00:00 sshd: myuser01 [priv]
myuser01 1369 1320 0 15:58 ? 00:00:00 sshd: myuser01@pts/0
myuser01 1370 1369 0 15:58 pts/0 00:00:00 -bash
root 1384 1370 0 15:58 pts/0 00:00:00 sudo su
root 1385 1384 0 15:58 pts/0 00:00:00 su
root 1386 1385 0 15:58 pts/0 00:00:00 bash
root 1415 1386 0 15:58 pts/0 00:00:00 /usr/bin/python /usr/bin/salt-minion -l debug
root 1420 1415 0 15:58 pts/0 00:00:00 /usr/bin/python /usr/bin/salt-minion -l debug
root 997 1 0 15:58 ? 00:00:00 cron
daemon 998 1 0 15:58 ? 00:00:00 atd
root 1024 1 0 15:58 ? 00:00:00 acpid -c /etc/acpi/events -s /var/run/acpid.socket
root 1069 1 0 15:58 ? 00:00:00 /usr/sbin/irqbalance
root 1143 1 0 15:58 ? 00:00:00 /usr/bin/vmtoolsd
root 1205 1 0 15:58 tty1 00:00:00 /sbin/getty -8 38400 tty1
root 1511 1 2 15:58 ? 00:00:04 /usr/bin/python /usr/bin/salt-minion -l debug
root 3625 1511 0 16:01 ? 00:00:00 ps -efHww
root 2587 1 0 16:01 ? 00:00:00 minion-processor -home /usr/lib/jvm/default-java -jvm server -cp /opt/myuser01/minion-processor/minion-processor.jar -user myuser01 -procname minion-processor -nodetach -outfile /var/log/myuser01/minion-processor_output.log -errfile /var/log/myuser01/minion-processor_error.log -pidfile /run/minion-processor.pid com.mycompany.minion.minionProcessorDaemon -p /etc/myuser01/minion-processor/minion-processor.properties
myuser01 2591 2587 35 16:01 ? 00:00:04 minion-processor -home /usr/lib/jvm/default-java -jvm server -cp /opt/myuser01/minion-processor/minion-processor.jar -user myuser01 -procname minion-processor -nodetach -outfile /var/log/myuser01/minion-processor_output.log -errfile /var/log/myuser01/minion-processor_error.log -pidfile /run/minion-processor.pid com.mycompany.minion.minionProcessorDaemon -p /etc/myuser01/minion-processor/minion-processor.properties
tomcat7 3502 1 44 16:01 ? 00:00:02 /usr/lib/jvm/default-java/bin/java -Djava.util.logging.config.file=/var/lib/tomcat7/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager -Djava.awt.headless=true -Xmx128m -XX:+UseConcMarkSweepGC -Djava.endorsed.dirs=/usr/share/tomcat7/endorsed -classpath /usr/share/tomcat7/bin/bootstrap.jar:/usr/share/tomcat7/bin/tomcat-juli.jar -Dcatalina.base=/var/lib/tomcat7 -Dcatalina.home=/usr/share/tomcat7 -Djava.io.tmpdir=/tmp/tomcat7-tomcat7-tmp org.apache.catalina.startup.Bootstrap start
[INFO ] Executing command ['service', 'tomcat7', 'restart'] in directory '/root'
[DEBUG ] LazyLoaded config.merge
[DEBUG ] LazyLoaded mine.update
[INFO ] User sudo_myuser01 Executing command saltutil.find_job with jid 20160204160133108731
[DEBUG ] Command details {'tgt_type': 'glob', 'jid': '20160204160133108731', 'tgt': 'minion-acc01.mycompany.com', 'ret': '', 'user': 'sudo_myuser01', 'arg': ['20160204155857806709'], 'fun': 'saltutil.find_job'}
[INFO ] Starting a new job with PID 3687
[DEBUG ] LazyLoaded saltutil.find_job
[DEBUG ] Minion return retry timer set to 8 seconds (randomized)
[INFO ] Returning information for job: 20160204160133108731
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/minion', 'minion-acc01.mycompany.com', 'tcp://10.0.24.11:4506', 'aes')
[DEBUG ] Initializing new SAuth for ('/etc/salt/pki/minion', 'minion-acc01.mycompany.com', 'tcp://10.0.24.11:4506')
[DEBUG ] output: * Stopping Tomcat servlet engine tomcat7
...done.
* Starting Tomcat servlet engine tomcat7
...done.
[INFO ] {'tomcat7': True}
[INFO ] Completed state [tomcat7] at time 16:01:26.266465
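The restart above is Salt's `service.mod_watch` firing: a watched state reported changes, so the running service gets `service tomcat7 restart`. A minimal sketch of the kind of SLS that produces this sequence (the state ID and the watched file are assumptions inferred from this log, not taken from the actual state tree):

```yaml
tomcat7:
  service.running:
    - enable: True
    - watch:
      # Any change reported by the watched state triggers
      # service.mod_watch, i.e. the restart seen in the log above.
      - file: /usr/lib/jvm/java-8-oracle/jre/lib/security/local_policy.jar
```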
[DEBUG ] LazyLoaded pkg.latest
[INFO ] Running state [unzip] at time 16:01:26.267650
[INFO ] Executing state pkg.latest for unzip
[DEBUG ] LazyLoaded pkg_resource.version
[INFO ] Executing command ['apt-cache', '-q', 'policy', 'unzip'] in directory '/root'
[INFO ] Package unzip is already up-to-date
[INFO ] Completed state [unzip] at time 16:01:26.289650
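The `unzip` state above is a `pkg.latest`: Salt runs `apt-cache -q policy unzip`, compares the installed version against the candidate, and reports a no-op when they match. A sketch of the state behind it (ID and options assumed):

```yaml
unzip:
  pkg.latest:
    # Upgrades the package when the repo offers a newer candidate;
    # otherwise logs "Package unzip is already up-to-date" as above.
    - refresh: False
```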
[INFO ] Running state [tomcat7] at time 16:01:26.289965
[INFO ] Executing state service.dead for tomcat7
[INFO ] Executing command 'ps -efHww' in directory '/root'
[DEBUG ] stdout: UID PID PPID C STIME TTY TIME CMD
root 2 0 0 15:57 ? 00:00:00 [kthreadd]
root 3 2 0 15:57 ? 00:00:00 [ksoftirqd/0]
root 4 2 0 15:57 ? 00:00:00 [kworker/0:0]
root 5 2 0 15:57 ? 00:00:00 [kworker/0:0H]
root 6 2 0 15:57 ? 00:00:01 [kworker/u128:0]
root 7 2 0 15:57 ? 00:00:00 [rcu_sched]
root 8 2 0 15:57 ? 00:00:00 [rcuos/0]
root 9 2 0 15:57 ? 00:00:00 [rcuos/1]
root 10 2 0 15:57 ? 00:00:00 [rcuos/2]
root 11 2 0 15:57 ? 00:00:00 [rcuos/3]
root 12 2 0 15:57 ? 00:00:00 [rcuos/4]
root 13 2 0 15:57 ? 00:00:00 [rcuos/5]
root 14 2 0 15:57 ? 00:00:00 [rcuos/6]
root 15 2 0 15:57 ? 00:00:00 [rcuos/7]
root 16 2 0 15:57 ? 00:00:00 [rcuos/8]
root 17 2 0 15:57 ? 00:00:00 [rcuos/9]
root 18 2 0 15:57 ? 00:00:00 [rcuos/10]
root 19 2 0 15:57 ? 00:00:00 [rcuos/11]
root 20 2 0 15:57 ? 00:00:00 [rcuos/12]
root 21 2 0 15:57 ? 00:00:00 [rcuos/13]
root 22 2 0 15:57 ? 00:00:00 [rcuos/14]
root 23 2 0 15:57 ? 00:00:00 [rcuos/15]
root 24 2 0 15:57 ? 00:00:00 [rcuos/16]
root 25 2 0 15:57 ? 00:00:00 [rcuos/17]
root 26 2 0 15:57 ? 00:00:00 [rcuos/18]
root 27 2 0 15:57 ? 00:00:00 [rcuos/19]
root 28 2 0 15:57 ? 00:00:00 [rcuos/20]
root 29 2 0 15:57 ? 00:00:00 [rcuos/21]
root 30 2 0 15:57 ? 00:00:00 [rcuos/22]
root 31 2 0 15:57 ? 00:00:00 [rcuos/23]
root 32 2 0 15:57 ? 00:00:00 [rcuos/24]
root 33 2 0 15:57 ? 00:00:00 [rcuos/25]
root 34 2 0 15:57 ? 00:00:00 [rcuos/26]
root 35 2 0 15:57 ? 00:00:00 [rcuos/27]
root 36 2 0 15:57 ? 00:00:00 [rcuos/28]
root 37 2 0 15:57 ? 00:00:00 [rcuos/29]
root 38 2 0 15:57 ? 00:00:00 [rcuos/30]
root 39 2 0 15:57 ? 00:00:00 [rcuos/31]
root 40 2 0 15:57 ? 00:00:00 [rcuos/32]
root 41 2 0 15:57 ? 00:00:00 [rcuos/33]
root 42 2 0 15:57 ? 00:00:00 [rcuos/34]
root 43 2 0 15:57 ? 00:00:00 [rcuos/35]
root 44 2 0 15:57 ? 00:00:00 [rcuos/36]
root 45 2 0 15:57 ? 00:00:00 [rcuos/37]
root 46 2 0 15:57 ? 00:00:00 [rcuos/38]
root 47 2 0 15:57 ? 00:00:00 [rcuos/39]
root 48 2 0 15:57 ? 00:00:00 [rcuos/40]
root 49 2 0 15:57 ? 00:00:00 [rcuos/41]
root 50 2 0 15:57 ? 00:00:00 [rcuos/42]
root 51 2 0 15:57 ? 00:00:00 [rcuos/43]
root 52 2 0 15:57 ? 00:00:00 [rcuos/44]
root 53 2 0 15:57 ? 00:00:00 [rcuos/45]
root 54 2 0 15:57 ? 00:00:00 [rcuos/46]
root 55 2 0 15:57 ? 00:00:00 [rcuos/47]
root 56 2 0 15:57 ? 00:00:00 [rcuos/48]
root 57 2 0 15:57 ? 00:00:00 [rcuos/49]
root 58 2 0 15:57 ? 00:00:00 [rcuos/50]
root 59 2 0 15:57 ? 00:00:00 [rcuos/51]
root 60 2 0 15:57 ? 00:00:00 [rcuos/52]
root 61 2 0 15:57 ? 00:00:00 [rcuos/53]
root 62 2 0 15:57 ? 00:00:00 [rcuos/54]
root 63 2 0 15:57 ? 00:00:00 [rcuos/55]
root 64 2 0 15:57 ? 00:00:00 [rcuos/56]
root 65 2 0 15:57 ? 00:00:00 [rcuos/57]
root 66 2 0 15:57 ? 00:00:00 [rcuos/58]
root 67 2 0 15:57 ? 00:00:00 [rcuos/59]
root 68 2 0 15:57 ? 00:00:00 [rcuos/60]
root 69 2 0 15:57 ? 00:00:00 [rcuos/61]
root 70 2 0 15:57 ? 00:00:00 [rcuos/62]
root 71 2 0 15:57 ? 00:00:00 [rcuos/63]
root 72 2 0 15:57 ? 00:00:00 [rcu_bh]
root 73 2 0 15:57 ? 00:00:00 [rcuob/0]
root 74 2 0 15:57 ? 00:00:00 [rcuob/1]
root 75 2 0 15:57 ? 00:00:00 [rcuob/2]
root 76 2 0 15:57 ? 00:00:00 [rcuob/3]
root 77 2 0 15:57 ? 00:00:00 [rcuob/4]
root 78 2 0 15:57 ? 00:00:00 [rcuob/5]
root 79 2 0 15:57 ? 00:00:00 [rcuob/6]
root 80 2 0 15:57 ? 00:00:00 [rcuob/7]
root 81 2 0 15:57 ? 00:00:00 [rcuob/8]
root 82 2 0 15:57 ? 00:00:00 [rcuob/9]
root 83 2 0 15:57 ? 00:00:00 [rcuob/10]
root 84 2 0 15:57 ? 00:00:00 [rcuob/11]
root 85 2 0 15:57 ? 00:00:00 [rcuob/12]
root 86 2 0 15:57 ? 00:00:00 [rcuob/13]
root 87 2 0 15:57 ? 00:00:00 [rcuob/14]
root 88 2 0 15:57 ? 00:00:00 [rcuob/15]
root 89 2 0 15:57 ? 00:00:00 [rcuob/16]
root 90 2 0 15:57 ? 00:00:00 [rcuob/17]
root 91 2 0 15:57 ? 00:00:00 [rcuob/18]
root 92 2 0 15:57 ? 00:00:00 [rcuob/19]
root 93 2 0 15:57 ? 00:00:00 [rcuob/20]
root 94 2 0 15:57 ? 00:00:00 [rcuob/21]
root 95 2 0 15:57 ? 00:00:00 [rcuob/22]
root 96 2 0 15:57 ? 00:00:00 [rcuob/23]
root 97 2 0 15:57 ? 00:00:00 [rcuob/24]
root 98 2 0 15:57 ? 00:00:00 [rcuob/25]
root 99 2 0 15:57 ? 00:00:00 [rcuob/26]
root 100 2 0 15:57 ? 00:00:00 [rcuob/27]
root 101 2 0 15:57 ? 00:00:00 [rcuob/28]
root 102 2 0 15:57 ? 00:00:00 [rcuob/29]
root 103 2 0 15:57 ? 00:00:00 [rcuob/30]
root 104 2 0 15:57 ? 00:00:00 [rcuob/31]
root 105 2 0 15:57 ? 00:00:00 [rcuob/32]
root 106 2 0 15:57 ? 00:00:00 [rcuob/33]
root 107 2 0 15:57 ? 00:00:00 [rcuob/34]
root 108 2 0 15:57 ? 00:00:00 [rcuob/35]
root 109 2 0 15:57 ? 00:00:00 [rcuob/36]
root 110 2 0 15:57 ? 00:00:00 [rcuob/37]
root 111 2 0 15:57 ? 00:00:00 [rcuob/38]
root 112 2 0 15:57 ? 00:00:00 [rcuob/39]
root 113 2 0 15:57 ? 00:00:00 [rcuob/40]
root 114 2 0 15:57 ? 00:00:00 [rcuob/41]
root 115 2 0 15:57 ? 00:00:00 [rcuob/42]
root 116 2 0 15:57 ? 00:00:00 [rcuob/43]
root 117 2 0 15:57 ? 00:00:00 [rcuob/44]
root 118 2 0 15:57 ? 00:00:00 [rcuob/45]
root 119 2 0 15:57 ? 00:00:00 [rcuob/46]
root 120 2 0 15:57 ? 00:00:00 [rcuob/47]
root 121 2 0 15:57 ? 00:00:00 [rcuob/48]
root 122 2 0 15:57 ? 00:00:00 [rcuob/49]
root 123 2 0 15:57 ? 00:00:00 [rcuob/50]
root 124 2 0 15:57 ? 00:00:00 [rcuob/51]
root 125 2 0 15:57 ? 00:00:00 [rcuob/52]
root 126 2 0 15:57 ? 00:00:00 [rcuob/53]
root 127 2 0 15:57 ? 00:00:00 [rcuob/54]
root 128 2 0 15:57 ? 00:00:00 [rcuob/55]
root 129 2 0 15:57 ? 00:00:00 [rcuob/56]
root 130 2 0 15:57 ? 00:00:00 [rcuob/57]
root 131 2 0 15:57 ? 00:00:00 [rcuob/58]
root 132 2 0 15:57 ? 00:00:00 [rcuob/59]
root 133 2 0 15:57 ? 00:00:00 [rcuob/60]
root 134 2 0 15:57 ? 00:00:00 [rcuob/61]
root 135 2 0 15:57 ? 00:00:00 [rcuob/62]
root 136 2 0 15:57 ? 00:00:00 [rcuob/63]
root 137 2 0 15:57 ? 00:00:00 [migration/0]
root 138 2 0 15:57 ? 00:00:00 [watchdog/0]
root 139 2 0 15:57 ? 00:00:00 [watchdog/1]
root 140 2 0 15:57 ? 00:00:00 [migration/1]
root 141 2 0 15:57 ? 00:00:00 [ksoftirqd/1]
root 142 2 0 15:57 ? 00:00:00 [kworker/1:0]
root 143 2 0 15:57 ? 00:00:00 [kworker/1:0H]
root 144 2 0 15:57 ? 00:00:00 [khelper]
root 145 2 0 15:57 ? 00:00:00 [kdevtmpfs]
root 146 2 0 15:57 ? 00:00:00 [netns]
root 147 2 0 15:57 ? 00:00:00 [writeback]
root 148 2 0 15:57 ? 00:00:00 [kintegrityd]
root 149 2 0 15:57 ? 00:00:00 [bioset]
root 150 2 0 15:57 ? 00:00:00 [kworker/u129:0]
root 151 2 0 15:57 ? 00:00:00 [kblockd]
root 152 2 0 15:57 ? 00:00:00 [ata_sff]
root 153 2 0 15:57 ? 00:00:00 [khubd]
root 154 2 0 15:57 ? 00:00:00 [md]
root 155 2 0 15:57 ? 00:00:00 [devfreq_wq]
root 156 2 0 15:57 ? 00:00:00 [kworker/1:1]
root 158 2 0 15:57 ? 00:00:00 [khungtaskd]
root 159 2 0 15:57 ? 00:00:00 [kswapd0]
root 160 2 0 15:57 ? 00:00:00 [ksmd]
root 161 2 0 15:57 ? 00:00:00 [khugepaged]
root 162 2 0 15:57 ? 00:00:00 [fsnotify_mark]
root 163 2 0 15:57 ? 00:00:00 [ecryptfs-kthrea]
root 164 2 0 15:57 ? 00:00:00 [crypto]
root 176 2 0 15:57 ? 00:00:00 [kthrotld]
root 177 2 0 15:57 ? 00:00:00 [kworker/u128:1]
root 178 2 0 15:57 ? 00:00:00 [scsi_eh_0]
root 179 2 0 15:57 ? 00:00:00 [scsi_eh_1]
root 180 2 0 15:57 ? 00:00:00 [kworker/u128:2]
root 181 2 0 15:57 ? 00:00:00 [kworker/0:1]
root 182 2 0 15:57 ? 00:00:00 [kworker/u128:3]
root 201 2 0 15:57 ? 00:00:00 [deferwq]
root 202 2 0 15:57 ? 00:00:00 [charger_manager]
root 253 2 0 15:57 ? 00:00:00 [kpsmoused]
root 254 2 0 15:57 ? 00:00:00 [mpt_poll_0]
root 255 2 0 15:57 ? 00:00:00 [mpt/0]
root 256 2 0 15:57 ? 00:00:00 [scsi_eh_2]
root 258 2 0 15:57 ? 00:00:00 [kworker/1:2]
root 264 2 0 15:57 ? 00:00:00 [kdmflush]
root 265 2 0 15:57 ? 00:00:00 [bioset]
root 267 2 0 15:57 ? 00:00:00 [kdmflush]
root 268 2 0 15:57 ? 00:00:00 [bioset]
root 283 2 0 15:57 ? 00:00:00 [jbd2/dm-0-8]
root 284 2 0 15:57 ? 00:00:00 [ext4-rsv-conver]
root 461 2 0 15:58 ? 00:00:00 [ttm_swap]
root 474 2 0 15:58 ? 00:00:00 [ext4-rsv-conver]
root 1322 2 0 15:58 ? 00:00:00 [kauditd]
root 1 0 0 15:57 ? 00:00:01 /sbin/init
root 420 1 0 15:58 ? 00:00:00 upstart-udev-bridge --daemon
root 424 1 0 15:58 ? 00:00:00 /lib/systemd/systemd-udevd --daemon
message+ 597 1 0 15:58 ? 00:00:00 dbus-daemon --system --fork
root 647 1 0 15:58 ? 00:00:00 /lib/systemd/systemd-logind
syslog 653 1 0 15:58 ? 00:00:00 rsyslogd
root 867 1 0 15:58 ? 00:00:00 upstart-file-bridge --daemon
root 869 1 0 15:58 ? 00:00:00 upstart-socket-bridge --daemon
root 930 1 0 15:58 tty4 00:00:00 /sbin/getty -8 38400 tty4
root 933 1 0 15:58 tty5 00:00:00 /sbin/getty -8 38400 tty5
root 940 1 0 15:58 tty2 00:00:00 /sbin/getty -8 38400 tty2
root 941 1 0 15:58 tty3 00:00:00 /sbin/getty -8 38400 tty3
root 943 1 0 15:58 tty6 00:00:00 /sbin/getty -8 38400 tty6
root 977 1 0 15:58 ? 00:00:00 /usr/sbin/sshd -D
root 1320 977 0 15:58 ? 00:00:00 sshd: myuser01 [priv]
myuser01 1369 1320 0 15:58 ? 00:00:00 sshd: myuser01@pts/0
myuser01 1370 1369 0 15:58 pts/0 00:00:00 -bash
root 1384 1370 0 15:58 pts/0 00:00:00 sudo su
root 1385 1384 0 15:58 pts/0 00:00:00 su
root 1386 1385 0 15:58 pts/0 00:00:00 bash
root 1415 1386 0 15:58 pts/0 00:00:00 /usr/bin/python /usr/bin/salt-minion -l debug
root 1420 1415 0 15:58 pts/0 00:00:00 /usr/bin/python /usr/bin/salt-minion -l debug
root 997 1 0 15:58 ? 00:00:00 cron
daemon 998 1 0 15:58 ? 00:00:00 atd
root 1024 1 0 15:58 ? 00:00:00 acpid -c /etc/acpi/events -s /var/run/acpid.socket
root 1069 1 0 15:58 ? 00:00:00 /usr/sbin/irqbalance
root 1143 1 0 15:58 ? 00:00:00 /usr/bin/vmtoolsd
root 1205 1 0 15:58 tty1 00:00:00 /sbin/getty -8 38400 tty1
root 1511 1 2 15:58 ? 00:00:04 /usr/bin/python /usr/bin/salt-minion -l debug
root 3695 1511 0 16:01 ? 00:00:00 ps -efHww
root 2587 1 0 16:01 ? 00:00:00 minion-processor -home /usr/lib/jvm/default-java -jvm server -cp /opt/myuser01/minion-processor/minion-processor.jar -user myuser01 -procname minion-processor -nodetach -outfile /var/log/myuser01/minion-processor_output.log -errfile /var/log/myuser01/minion-processor_error.log -pidfile /run/minion-processor.pid com.mycompany.minion.minionProcessorDaemon -p /etc/myuser01/minion-processor/minion-processor.properties
myuser01 2591 2587 23 16:01 ? 00:00:04 minion-processor -home /usr/lib/jvm/default-java -jvm server -cp /opt/myuser01/minion-processor/minion-processor.jar -user myuser01 -procname minion-processor -nodetach -outfile /var/log/myuser01/minion-processor_output.log -errfile /var/log/myuser01/minion-processor_error.log -pidfile /run/minion-processor.pid com.mycompany.minion.minionProcessorDaemon -p /etc/myuser01/minion-processor/minion-processor.properties
tomcat7 3665 1 60 16:01 ? 00:00:03 /usr/lib/jvm/default-java/bin/java -Djava.util.logging.config.file=/var/lib/tomcat7/conf/logging.properties -Djava.util.logging.manager=org.apache.juli.ClassLoaderLogManager -Djava.awt.headless=true -Xmx128m -XX:+UseConcMarkSweepGC -Djava.endorsed.dirs=/usr/share/tomcat7/endorsed -classpath /usr/share/tomcat7/bin/bootstrap.jar:/usr/share/tomcat7/bin/tomcat-juli.jar -Dcatalina.base=/var/lib/tomcat7 -Dcatalina.home=/usr/share/tomcat7 -Djava.io.tmpdir=/tmp/tomcat7-tomcat7-tmp org.apache.catalina.startup.Bootstrap start
[INFO ] Executing command ['service', 'tomcat7', 'stop'] in directory '/root'
[DEBUG ] output: * Stopping Tomcat servlet engine tomcat7
...done.
[INFO ] Executing command ['service', 'tomcat7', 'status'] in directory '/root'
[ERROR ] Command ['service', 'tomcat7', 'status'] failed with return code: 3
[ERROR ] output: * Tomcat servlet engine is not running.
[INFO ] {'tomcat7': True}
[INFO ] Completed state [tomcat7] at time 16:01:26.436235
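Here the state explicitly stops the service with `service.dead`. On this Upstart system Salt verifies the result with `ps -efHww` and `service tomcat7 status`; the `[ERROR]` with return code 3 is expected ("not running" is the desired outcome, not a failure). A sketch of the corresponding state (the state ID is an assumption):

```yaml
tomcat7-stopped:
  service.dead:
    - name: tomcat7
```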
[INFO ] Running state [/var/lib/tomcat7/webapps/minion-web.war] at time 16:01:26.438540
[INFO ] Executing state file.absent for /var/lib/tomcat7/webapps/minion-web.war
[INFO ] File /var/lib/tomcat7/webapps/minion-web.war is not present
[INFO ] Completed state [/var/lib/tomcat7/webapps/minion-web.war] at time 16:01:26.439506
[INFO ] Running state [/usr/share/tomcat7/shared/classes/minion-web.properties] at time 16:01:26.439815
[INFO ] Executing state file.absent for /usr/share/tomcat7/shared/classes/minion-web.properties
[INFO ] File /usr/share/tomcat7/shared/classes/minion-web.properties is not present
[INFO ] Completed state [/usr/share/tomcat7/shared/classes/minion-web.properties] at time 16:01:26.440633
[INFO ] Running state [/var/lib/tomcat7/webapps/minion-web] at time 16:01:26.440954
[INFO ] Executing state file.absent for /var/lib/tomcat7/webapps/minion-web
[INFO ] File /var/lib/tomcat7/webapps/minion-web is not present
[INFO ] Completed state [/var/lib/tomcat7/webapps/minion-web] at time 16:01:26.441757
[INFO ] Running state [/usr/share/tomcat7/shared/] at time 16:01:26.443034
[INFO ] Executing state file.directory for /usr/share/tomcat7/shared/
[DEBUG ] List of kept files when use file.directory with clean: ['/var/lib/tomcat7/webapps/minion-web.war', '/usr/share/tomcat7/shared/classes/minion-web.properties', '/var/lib/tomcat7/webapps/minion-web']
[INFO ] {'/usr/share/tomcat7/shared': 'New Dir'}
[INFO ] Completed state [/usr/share/tomcat7/shared/] at time 16:01:26.444576
[INFO ] Running state [/usr/share/tomcat7/shared/classes/] at time 16:01:26.445825
[INFO ] Executing state file.directory for /usr/share/tomcat7/shared/classes/
[DEBUG ] List of kept files when use file.directory with clean: ['/var/lib/tomcat7/webapps/minion-web.war', '/usr/share/tomcat7/shared/classes/minion-web.properties', '/var/lib/tomcat7/webapps/minion-web']
[INFO ] {'/usr/share/tomcat7/shared/classes': 'New Dir'}
[INFO ] Completed state [/usr/share/tomcat7/shared/classes/] at time 16:01:26.447236
[INFO ] Running state [/var/lib/tomcat7/work] at time 16:01:26.448457
[INFO ] Executing state file.directory for /var/lib/tomcat7/work
[DEBUG ] List of kept files when use file.directory with clean: ['/var/lib/tomcat7/webapps/minion-web.war', '/usr/share/tomcat7/shared/classes/minion-web.properties', '/var/lib/tomcat7/webapps/minion-web']
[INFO ] {'removed': ['/var/lib/tomcat7/work/Catalina', '/var/lib/tomcat7/work/catalina.policy']}
[INFO ] Completed state [/var/lib/tomcat7/work] at time 16:01:26.450176
[INFO ] Running state [/var/lib/tomcat7/logs] at time 16:01:26.451345
[INFO ] Executing state file.directory for /var/lib/tomcat7/logs
[DEBUG ] List of kept files when use file.directory with clean: ['/var/lib/tomcat7/webapps/minion-web.war', '/usr/share/tomcat7/shared/classes/minion-web.properties', '/var/lib/tomcat7/webapps/minion-web']
[INFO ] {'removed': ['/var/lib/tomcat7/logs/catalina.out', '/var/lib/tomcat7/logs/localhost_access_log.2016-02-04.txt', '/var/lib/tomcat7/logs/catalina.2016-02-04.log', '/var/lib/tomcat7/logs/localhost.2016-02-04.log']}
[INFO ] Completed state [/var/lib/tomcat7/logs] at time 16:01:26.452552
[INFO ] Running state [/var/lib/tomcat7/conf/tomcat-users.xml] at time 16:01:26.453098
[INFO ] Executing state file.blockreplace for /var/lib/tomcat7/conf/tomcat-users.xml
[INFO ] File changed:
---
+++
@@ -38,5 +38,7 @@
<user username="scriptor" password="pwd" roles="manager-script"/>
<!-- End myuser01 User Section -->
<!-- Begin myuser01 Monitoring Section -->
+<role rolename="monitoring"/>
+<user username="myuser01" password="pwd" roles="monitoring"/>
<!-- End myuser01 Monitoring Section -->
</tomcat-users>
[INFO ] Completed state [/var/lib/tomcat7/conf/tomcat-users.xml] at time 16:01:26.455420
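The tomcat-users.xml diff above is the signature of a file.blockreplace keyed on the Begin/End marker comments. A minimal sketch of such a state (the real SLS is not part of this gist; the state ID and append behavior are assumptions):

```yaml
# Hypothetical reconstruction of the state that produced the diff above.
tomcat-users-monitoring:
  file.blockreplace:
    - name: /var/lib/tomcat7/conf/tomcat-users.xml
    - marker_start: '<!-- Begin myuser01 Monitoring Section -->'
    - marker_end: '<!-- End myuser01 Monitoring Section -->'
    - content: |
        <role rolename="monitoring"/>
        <user username="myuser01" password="pwd" roles="monitoring"/>
    - append_if_not_found: True
```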
[INFO ] Running state [/var/lib/tomcat7/webapps/minion-web.war] at time 16:01:26.455613
[INFO ] Executing state file.managed for /var/lib/tomcat7/webapps/minion-web.war
[DEBUG ] Reading configuration from /etc/salt/minion
[DEBUG ] Including configuration from '/etc/salt/minion.d/_schedule.conf'
[DEBUG ] Reading configuration from /etc/salt/minion.d/_schedule.conf
[DEBUG ] Using cached minion ID from /etc/salt/minion_id: minion-acc01.mycompany.com
[DEBUG ] Requesting URL http://nexus.mycompany.com/nexus/content/groups/mycompany/com/mycompany/myuser01/minion-web/1.0-SNAPSHOT/minion-web-1.0-20160204.180827-187.war using GET method
[INFO ] File changed:
New file
[INFO ] Completed state [/var/lib/tomcat7/webapps/minion-web.war] at time 16:01:27.885460
[INFO ] Running state [/tmp/minion-web_salt] at time 16:01:27.885710
[INFO ] Executing state file.directory for /tmp/minion-web_salt
[DEBUG ] List of kept files when use file.directory with clean: []
[INFO ] {'/tmp/minion-web_salt': 'New Dir'}
[INFO ] Completed state [/tmp/minion-web_salt] at time 16:01:27.887351
[INFO ] Running state [/tmp/minion-web_salt/properties] at time 16:01:27.887557
[INFO ] Executing state file.directory for /tmp/minion-web_salt/properties
[DEBUG ] List of kept files when use file.directory with clean: []
[INFO ] {'/tmp/minion-web_salt/properties': 'New Dir'}
[INFO ] Completed state [/tmp/minion-web_salt/properties] at time 16:01:27.889237
[DEBUG ] LazyLoaded cmd.run
[INFO ] Running state [unzip -j /var/lib/tomcat7/webapps/minion-web.war "**/acc.minion-web.properties" -d /tmp/minion-web_salt/properties
chown -R tomcat7:tomcat7 /tmp/minion-web_salt/properties
mv /tmp/minion-web_salt/properties/acc.minion-web.properties /usr/share/tomcat7/shared/classes/minion-web.properties
rm -rf /tmp/minion-web_salt/properties
] at time 16:01:27.891046
[INFO ] Executing state cmd.run for unzip -j /var/lib/tomcat7/webapps/minion-web.war "**/acc.minion-web.properties" -d /tmp/minion-web_salt/properties
chown -R tomcat7:tomcat7 /tmp/minion-web_salt/properties
mv /tmp/minion-web_salt/properties/acc.minion-web.properties /usr/share/tomcat7/shared/classes/minion-web.properties
rm -rf /tmp/minion-web_salt/properties
[INFO ] Executing command 'unzip -j /var/lib/tomcat7/webapps/minion-web.war "**/acc.minion-web.properties" -d /tmp/minion-web_salt/properties\nchown -R tomcat7:tomcat7 /tmp/minion-web_salt/properties\nmv /tmp/minion-web_salt/properties/acc.minion-web.properties /usr/share/tomcat7/shared/classes/minion-web.properties\nrm -rf /tmp/minion-web_salt/properties\n' in directory '/root'
[DEBUG ] stdout: Archive: /var/lib/tomcat7/webapps/minion-web.war
inflating: /tmp/minion-web_salt/properties/acc.minion-web.properties
[INFO ] {'pid': 3714, 'retcode': 0, 'stderr': '', 'stdout': 'Archive: /var/lib/tomcat7/webapps/minion-web.war\n inflating: /tmp/minion-web_salt/properties/acc.minion-web.properties'}
[INFO ] Completed state [unzip -j /var/lib/tomcat7/webapps/minion-web.war "**/acc.minion-web.properties" -d /tmp/minion-web_salt/properties
chown -R tomcat7:tomcat7 /tmp/minion-web_salt/properties
mv /tmp/minion-web_salt/properties/acc.minion-web.properties /usr/share/tomcat7/shared/classes/minion-web.properties
rm -rf /tmp/minion-web_salt/properties
] at time 16:01:27.900539
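The four-line shell block logged above maps onto a single multi-line cmd.run. A sketch that would log the same 'Executing command' entry (the state ID and the onchanges requisite are assumptions, not shown in the log):

```yaml
# Hypothetical sketch; re-runs only when the WAR file changes (assumed requisite).
extract-war-properties:
  cmd.run:
    - name: |
        unzip -j /var/lib/tomcat7/webapps/minion-web.war "**/acc.minion-web.properties" -d /tmp/minion-web_salt/properties
        chown -R tomcat7:tomcat7 /tmp/minion-web_salt/properties
        mv /tmp/minion-web_salt/properties/acc.minion-web.properties /usr/share/tomcat7/shared/classes/minion-web.properties
        rm -rf /tmp/minion-web_salt/properties
    - onchanges:
      - file: /var/lib/tomcat7/webapps/minion-web.war
```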
[INFO ] Running state [/usr/share/tomcat7/shared/classes/minion-web.properties] at time 16:01:27.901003
[INFO ] Executing state file.prepend for /usr/share/tomcat7/shared/classes/minion-web.properties
[INFO ] File changed:
---
+++
@@ -1,3 +1,10 @@
+###############################################################
+# This file is auto-generated by Salt
+# WAR_FILE: http://nexus.mycompany.com/nexus/content/groups/mycompany/com/mycompany/myuser01/minion-web/1.0-SNAPSHOT/minion-web-1.0-20160204.180827-187.war
+# WAR_HASH: sha1=437e0a4ec4a3c0a0878b7928c007c7e56b213a82
+# ORIGINAL_PROPS: WEB-INF/classes/wp_properties_jinja/generated/acc.minion-web.properties
+# DEPLOYED: Thu Feb 4 22:58:49 UTC 2016
+###############################################################
SKIP_DB_VALIDATION=true
[INFO ] Completed state [/usr/share/tomcat7/shared/classes/minion-web.properties] at time 16:01:27.907119
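The auto-generated header prepended above (WAR_FILE, WAR_HASH, DEPLOYED) is what a file.prepend state with templated text produces; a sketch with hypothetical Jinja variables standing in for the real pillar/context values:

```yaml
# Hypothetical; the Jinja variable names are illustrative, not from the real SLS.
minion-web-properties-header:
  file.prepend:
    - name: /usr/share/tomcat7/shared/classes/minion-web.properties
    - text: |
        ###############################################################
        # This file is auto-generated by Salt
        # WAR_FILE: {{ war_url }}
        # WAR_HASH: sha1={{ war_sha1 }}
        # ORIGINAL_PROPS: {{ props_path }}
        # DEPLOYED: {{ deploy_time }}
        ###############################################################
```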
[INFO ] Running state [tomcat7] at time 16:01:27.907785
[INFO ] Executing state service.running for tomcat7
[INFO ] Executing command 'ps -efHww' in directory '/root'
[DEBUG ] stdout: UID PID PPID C STIME TTY TIME CMD
root 2 0 0 15:57 ? 00:00:00 [kthreadd]
root 3 2 0 15:57 ? 00:00:00 [ksoftirqd/0]
root 4 2 0 15:57 ? 00:00:00 [kworker/0:0]
root 5 2 0 15:57 ? 00:00:00 [kworker/0:0H]
root 6 2 0 15:57 ? 00:00:01 [kworker/u128:0]
root 7 2 0 15:57 ? 00:00:00 [rcu_sched]
root 8 2 0 15:57 ? 00:00:00 [rcuos/0]
root 9 2 0 15:57 ? 00:00:00 [rcuos/1]
root 10 2 0 15:57 ? 00:00:00 [rcuos/2]
root 11 2 0 15:57 ? 00:00:00 [rcuos/3]
root 12 2 0 15:57 ? 00:00:00 [rcuos/4]
root 13 2 0 15:57 ? 00:00:00 [rcuos/5]
root 14 2 0 15:57 ? 00:00:00 [rcuos/6]
root 15 2 0 15:57 ? 00:00:00 [rcuos/7]
root 16 2 0 15:57 ? 00:00:00 [rcuos/8]
root 17 2 0 15:57 ? 00:00:00 [rcuos/9]
root 18 2 0 15:57 ? 00:00:00 [rcuos/10]
root 19 2 0 15:57 ? 00:00:00 [rcuos/11]
root 20 2 0 15:57 ? 00:00:00 [rcuos/12]
root 21 2 0 15:57 ? 00:00:00 [rcuos/13]
root 22 2 0 15:57 ? 00:00:00 [rcuos/14]
root 23 2 0 15:57 ? 00:00:00 [rcuos/15]
root 24 2 0 15:57 ? 00:00:00 [rcuos/16]
root 25 2 0 15:57 ? 00:00:00 [rcuos/17]
root 26 2 0 15:57 ? 00:00:00 [rcuos/18]
root 27 2 0 15:57 ? 00:00:00 [rcuos/19]
root 28 2 0 15:57 ? 00:00:00 [rcuos/20]
root 29 2 0 15:57 ? 00:00:00 [rcuos/21]
root 30 2 0 15:57 ? 00:00:00 [rcuos/22]
root 31 2 0 15:57 ? 00:00:00 [rcuos/23]
root 32 2 0 15:57 ? 00:00:00 [rcuos/24]
root 33 2 0 15:57 ? 00:00:00 [rcuos/25]
root 34 2 0 15:57 ? 00:00:00 [rcuos/26]
root 35 2 0 15:57 ? 00:00:00 [rcuos/27]
root 36 2 0 15:57 ? 00:00:00 [rcuos/28]
root 37 2 0 15:57 ? 00:00:00 [rcuos/29]
root 38 2 0 15:57 ? 00:00:00 [rcuos/30]
root 39 2 0 15:57 ? 00:00:00 [rcuos/31]
root 40 2 0 15:57 ? 00:00:00 [rcuos/32]
root 41 2 0 15:57 ? 00:00:00 [rcuos/33]
root 42 2 0 15:57 ? 00:00:00 [rcuos/34]
root 43 2 0 15:57 ? 00:00:00 [rcuos/35]
root 44 2 0 15:57 ? 00:00:00 [rcuos/36]
root 45 2 0 15:57 ? 00:00:00 [rcuos/37]
root 46 2 0 15:57 ? 00:00:00 [rcuos/38]
root 47 2 0 15:57 ? 00:00:00 [rcuos/39]
root 48 2 0 15:57 ? 00:00:00 [rcuos/40]
root 49 2 0 15:57 ? 00:00:00 [rcuos/41]
root 50 2 0 15:57 ? 00:00:00 [rcuos/42]
root 51 2 0 15:57 ? 00:00:00 [rcuos/43]
root 52 2 0 15:57 ? 00:00:00 [rcuos/44]
root 53 2 0 15:57 ? 00:00:00 [rcuos/45]
root 54 2 0 15:57 ? 00:00:00 [rcuos/46]
root 55 2 0 15:57 ? 00:00:00 [rcuos/47]
root 56 2 0 15:57 ? 00:00:00 [rcuos/48]
root 57 2 0 15:57 ? 00:00:00 [rcuos/49]
root 58 2 0 15:57 ? 00:00:00 [rcuos/50]
root 59 2 0 15:57 ? 00:00:00 [rcuos/51]
root 60 2 0 15:57 ? 00:00:00 [rcuos/52]
root 61 2 0 15:57 ? 00:00:00 [rcuos/53]
root 62 2 0 15:57 ? 00:00:00 [rcuos/54]
root 63 2 0 15:57 ? 00:00:00 [rcuos/55]
root 64 2 0 15:57 ? 00:00:00 [rcuos/56]
root 65 2 0 15:57 ? 00:00:00 [rcuos/57]
root 66 2 0 15:57 ? 00:00:00 [rcuos/58]
root 67 2 0 15:57 ? 00:00:00 [rcuos/59]
root 68 2 0 15:57 ? 00:00:00 [rcuos/60]
root 69 2 0 15:57 ? 00:00:00 [rcuos/61]
root 70 2 0 15:57 ? 00:00:00 [rcuos/62]
root 71 2 0 15:57 ? 00:00:00 [rcuos/63]
root 72 2 0 15:57 ? 00:00:00 [rcu_bh]
root 73 2 0 15:57 ? 00:00:00 [rcuob/0]
root 74 2 0 15:57 ? 00:00:00 [rcuob/1]
root 75 2 0 15:57 ? 00:00:00 [rcuob/2]
root 76 2 0 15:57 ? 00:00:00 [rcuob/3]
root 77 2 0 15:57 ? 00:00:00 [rcuob/4]
root 78 2 0 15:57 ? 00:00:00 [rcuob/5]
root 79 2 0 15:57 ? 00:00:00 [rcuob/6]
root 80 2 0 15:57 ? 00:00:00 [rcuob/7]
root 81 2 0 15:57 ? 00:00:00 [rcuob/8]
root 82 2 0 15:57 ? 00:00:00 [rcuob/9]
root 83 2 0 15:57 ? 00:00:00 [rcuob/10]
root 84 2 0 15:57 ? 00:00:00 [rcuob/11]
root 85 2 0 15:57 ? 00:00:00 [rcuob/12]
root 86 2 0 15:57 ? 00:00:00 [rcuob/13]
root 87 2 0 15:57 ? 00:00:00 [rcuob/14]
root 88 2 0 15:57 ? 00:00:00 [rcuob/15]
root 89 2 0 15:57 ? 00:00:00 [rcuob/16]
root 90 2 0 15:57 ? 00:00:00 [rcuob/17]
root 91 2 0 15:57 ? 00:00:00 [rcuob/18]
root 92 2 0 15:57 ? 00:00:00 [rcuob/19]
root 93 2 0 15:57 ? 00:00:00 [rcuob/20]
root 94 2 0 15:57 ? 00:00:00 [rcuob/21]
root 95 2 0 15:57 ? 00:00:00 [rcuob/22]
root 96 2 0 15:57 ? 00:00:00 [rcuob/23]
root 97 2 0 15:57 ? 00:00:00 [rcuob/24]
root 98 2 0 15:57 ? 00:00:00 [rcuob/25]
root 99 2 0 15:57 ? 00:00:00 [rcuob/26]
root 100 2 0 15:57 ? 00:00:00 [rcuob/27]
root 101 2 0 15:57 ? 00:00:00 [rcuob/28]
root 102 2 0 15:57 ? 00:00:00 [rcuob/29]
root 103 2 0 15:57 ? 00:00:00 [rcuob/30]
root 104 2 0 15:57 ? 00:00:00 [rcuob/31]
root 105 2 0 15:57 ? 00:00:00 [rcuob/32]
root 106 2 0 15:57 ? 00:00:00 [rcuob/33]
root 107 2 0 15:57 ? 00:00:00 [rcuob/34]
root 108 2 0 15:57 ? 00:00:00 [rcuob/35]
root 109 2 0 15:57 ? 00:00:00 [rcuob/36]
root 110 2 0 15:57 ? 00:00:00 [rcuob/37]
root 111 2 0 15:57 ? 00:00:00 [rcuob/38]
root 112 2 0 15:57 ? 00:00:00 [rcuob/39]
root 113 2 0 15:57 ? 00:00:00 [rcuob/40]
root 114 2 0 15:57 ? 00:00:00 [rcuob/41]
root 115 2 0 15:57 ? 00:00:00 [rcuob/42]
root 116 2 0 15:57 ? 00:00:00 [rcuob/43]
root 117 2 0 15:57 ? 00:00:00 [rcuob/44]
root 118 2 0 15:57 ? 00:00:00 [rcuob/45]
root 119 2 0 15:57 ? 00:00:00 [rcuob/46]
root 120 2 0 15:57 ? 00:00:00 [rcuob/47]
root 121 2 0 15:57 ? 00:00:00 [rcuob/48]
root 122 2 0 15:57 ? 00:00:00 [rcuob/49]
root 123 2 0 15:57 ? 00:00:00 [rcuob/50]
root 124 2 0 15:57 ? 00:00:00 [rcuob/51]
root 125 2 0 15:57 ? 00:00:00 [rcuob/52]
root 126 2 0 15:57 ? 00:00:00 [rcuob/53]
root 127 2 0 15:57 ? 00:00:00 [rcuob/54]
root 128 2 0 15:57 ? 00:00:00 [rcuob/55]
root 129 2 0 15:57 ? 00:00:00 [rcuob/56]
root 130 2 0 15:57 ? 00:00:00 [rcuob/57]
root 131 2 0 15:57 ? 00:00:00 [rcuob/58]
root 132 2 0 15:57 ? 00:00:00 [rcuob/59]
root 133 2 0 15:57 ? 00:00:00 [rcuob/60]
root 134 2 0 15:57 ? 00:00:00 [rcuob/61]
root 135 2 0 15:57 ? 00:00:00 [rcuob/62]
root 136 2 0 15:57 ? 00:00:00 [rcuob/63]
root 137 2 0 15:57 ? 00:00:00 [migration/0]
root 138 2 0 15:57 ? 00:00:00 [watchdog/0]
root 139 2 0 15:57 ? 00:00:00 [watchdog/1]
root 140 2 0 15:57 ? 00:00:00 [migration/1]
root 141 2 0 15:57 ? 00:00:00 [ksoftirqd/1]
root 142 2 0 15:57 ? 00:00:00 [kworker/1:0]
root 143 2 0 15:57 ? 00:00:00 [kworker/1:0H]
root 144 2 0 15:57 ? 00:00:00 [khelper]
root 145 2 0 15:57 ? 00:00:00 [kdevtmpfs]
root 146 2 0 15:57 ? 00:00:00 [netns]
root 147 2 0 15:57 ? 00:00:00 [writeback]
root 148 2 0 15:57 ? 00:00:00 [kintegrityd]
root 149 2 0 15:57 ? 00:00:00 [bioset]
root 150 2 0 15:57 ? 00:00:00 [kworker/u129:0]
root 151 2 0 15:57 ? 00:00:00 [kblockd]
root 152 2 0 15:57 ? 00:00:00 [ata_sff]
root 153 2 0 15:57 ? 00:00:00 [khubd]
root 154 2 0 15:57 ? 00:00:00 [md]
root 155 2 0 15:57 ? 00:00:00 [devfreq_wq]
root 156 2 0 15:57 ? 00:00:00 [kworker/1:1]
root 158 2 0 15:57 ? 00:00:00 [khungtaskd]
root 159 2 0 15:57 ? 00:00:00 [kswapd0]
root 160 2 0 15:57 ? 00:00:00 [ksmd]
root 161 2 0 15:57 ? 00:00:00 [khugepaged]
root 162 2 0 15:57 ? 00:00:00 [fsnotify_mark]
root 163 2 0 15:57 ? 00:00:00 [ecryptfs-kthrea]
root 164 2 0 15:57 ? 00:00:00 [crypto]
root 176 2 0 15:57 ? 00:00:00 [kthrotld]
root 177 2 0 15:57 ? 00:00:00 [kworker/u128:1]
root 178 2 0 15:57 ? 00:00:00 [scsi_eh_0]
root 179 2 0 15:57 ? 00:00:00 [scsi_eh_1]
root 180 2 0 15:57 ? 00:00:00 [kworker/u128:2]
root 181 2 0 15:57 ? 00:00:00 [kworker/0:1]
root 182 2 0 15:57 ? 00:00:00 [kworker/u128:3]
root 201 2 0 15:57 ? 00:00:00 [deferwq]
root 202 2 0 15:57 ? 00:00:00 [charger_manager]
root 253 2 0 15:57 ? 00:00:00 [kpsmoused]
root 254 2 0 15:57 ? 00:00:00 [mpt_poll_0]
root 255 2 0 15:57 ? 00:00:00 [mpt/0]
root 256 2 0 15:57 ? 00:00:00 [scsi_eh_2]
root 258 2 0 15:57 ? 00:00:00 [kworker/1:2]
root 264 2 0 15:57 ? 00:00:00 [kdmflush]
root 265 2 0 15:57 ? 00:00:00 [bioset]
root 267 2 0 15:57 ? 00:00:00 [kdmflush]
root 268 2 0 15:57 ? 00:00:00 [bioset]
root 283 2 0 15:57 ? 00:00:00 [jbd2/dm-0-8]
root 284 2 0 15:57 ? 00:00:00 [ext4-rsv-conver]
root 461 2 0 15:58 ? 00:00:00 [ttm_swap]
root 474 2 0 15:58 ? 00:00:00 [ext4-rsv-conver]
root 1322 2 0 15:58 ? 00:00:00 [kauditd]
root 1 0 0 15:57 ? 00:00:01 /sbin/init
root 420 1 0 15:58 ? 00:00:00 upstart-udev-bridge --daemon
root 424 1 0 15:58 ? 00:00:00 /lib/systemd/systemd-udevd --daemon
message+ 597 1 0 15:58 ? 00:00:00 dbus-daemon --system --fork
root 647 1 0 15:58 ? 00:00:00 /lib/systemd/systemd-logind
syslog 653 1 0 15:58 ? 00:00:00 rsyslogd
root 867 1 0 15:58 ? 00:00:00 upstart-file-bridge --daemon
root 869 1 0 15:58 ? 00:00:00 upstart-socket-bridge --daemon
root 930 1 0 15:58 tty4 00:00:00 /sbin/getty -8 38400 tty4
root 933 1 0 15:58 tty5 00:00:00 /sbin/getty -8 38400 tty5
root 940 1 0 15:58 tty2 00:00:00 /sbin/getty -8 38400 tty2
root 941 1 0 15:58 tty3 00:00:00 /sbin/getty -8 38400 tty3
root 943 1 0 15:58 tty6 00:00:00 /sbin/getty -8 38400 tty6
root 977 1 0 15:58 ? 00:00:00 /usr/sbin/sshd -D
root 1320 977 0 15:58 ? 00:00:00 sshd: myuser01 [priv]
myuser01 1369 1320 0 15:58 ? 00:00:00 sshd: myuser01@pts/0
myuser01 1370 1369 0 15:58 pts/0 00:00:00 -bash
root 1384 1370 0 15:58 pts/0 00:00:00 sudo su
root 1385 1384 0 15:58 pts/0 00:00:00 su
root 1386 1385 0 15:58 pts/0 00:00:00 bash
root 1415 1386 0 15:58 pts/0 00:00:00 /usr/bin/python /usr/bin/salt-minion -l debug
root 1420 1415 0 15:58 pts/0 00:00:00 /usr/bin/python /usr/bin/salt-minion -l debug
root 997 1 0 15:58 ? 00:00:00 cron
daemon 998 1 0 15:58 ? 00:00:00 atd
root 1024 1 0 15:58 ? 00:00:00 acpid -c /etc/acpi/events -s /var/run/acpid.socket
root 1069 1 0 15:58 ? 00:00:00 /usr/sbin/irqbalance
root 1143 1 0 15:58 ? 00:00:00 /usr/bin/vmtoolsd
root 1205 1 0 15:58 tty1 00:00:00 /sbin/getty -8 38400 tty1
root 1511 1 3 15:58 ? 00:00:05 /usr/bin/python /usr/bin/salt-minion -l debug
root 3718 1511 0 16:01 ? 00:00:00 ps -efHww
root 2587 1 0 16:01 ? 00:00:00 minion-processor -home /usr/lib/jvm/default-java -jvm server -cp /opt/myuser01/minion-processor/minion-processor.jar -user myuser01 -procname minion-processor -nodetach -outfile /var/log/myuser01/minion-processor_output.log -errfile /var/log/myuser01/minion-processor_error.log -pidfile /run/minion-processor.pid com.mycompany.minion.minionProcessorDaemon -p /etc/myuser01/minion-processor/minion-processor.properties
myuser01 2591 2587 22 16:01 ? 00:00:04 minion-processor -home /usr/lib/jvm/default-java -jvm server -cp /opt/myuser01/minion-processor/minion-processor.jar -user myuser01 -procname minion-processor -nodetach -outfile /var/log/myuser01/minion-processor_output.log -errfile /var/log/myuser01/minion-processor_error.log -pidfile /run/minion-processor.pid com.mycompany.minion.minionProcessorDaemon -p /etc/myuser01/minion-processor/minion-processor.properties
[INFO ] Executing command ['service', 'tomcat7', 'start'] in directory '/root'
[DEBUG ] output: * Starting Tomcat servlet engine tomcat7
...done.
[INFO ] Executing command ['service', 'tomcat7', 'status'] in directory '/root'
[DEBUG ] output: * Tomcat servlet engine is running with pid 3744
[INFO ] {'tomcat7': True}
[INFO ] Completed state [tomcat7] at time 16:01:33.069698
[INFO ] Running state [/t] at time 16:01:33.069935
[INFO ] Executing state file.symlink for /t
[INFO ] Symlink /t is present and owned by tomcat7:tomcat7
[INFO ] Completed state [/t] at time 16:01:33.070888
[DEBUG ] LazyLoaded pkgrepo.managed
[INFO ] Running state [deb http://www.rabbitmq.com/debian/ testing main] at time 16:01:33.080946
[INFO ] Executing state pkgrepo.managed for deb http://www.rabbitmq.com/debian/ testing main
[INFO ] Package repo 'deb http://www.rabbitmq.com/debian/ testing main' already configured
[INFO ] Completed state [deb http://www.rabbitmq.com/debian/ testing main] at time 16:01:33.136451
[INFO ] Running state [rabbitmq-server] at time 16:01:33.136895
[INFO ] Executing state pkg.installed for rabbitmq-server
[DEBUG ] Error loading module.boto_cloudwatch: The boto_cloudwatch module cannot be loaded: boto libraries are unavailable.
[DEBUG ] Error loading module.npm: npm execution module could not be loaded because the npm binary could not be located
[DEBUG ] Error loading module.x509: Could not load x509 module, m2crypto unavailable
[DEBUG ] Could not LazyLoad pkg.normalize_name
[DEBUG ] Could not LazyLoad pkg.check_db
[DEBUG ] Could not LazyLoad pkg.normalize_name
[INFO ] Executing command ['apt-get', '-q', '-y', '-o', 'DPkg::Options::=--force-confold', '-o', 'DPkg::Options::=--force-confdef', 'install', 'rabbitmq-server'] in directory '/root'
[INFO ] User sudo_myuser01 Executing command saltutil.find_job with jid 20160204160143126975
[DEBUG ] Command details {'tgt_type': 'glob', 'jid': '20160204160143126975', 'tgt': 'minion-acc01.mycompany.com', 'ret': '', 'user': 'sudo_myuser01', 'arg': ['20160204155857806709'], 'fun': 'saltutil.find_job'}
[INFO ] Starting a new job with PID 3787
[DEBUG ] LazyLoaded saltutil.find_job
[DEBUG ] Minion return retry timer set to 6 seconds (randomized)
[INFO ] Returning information for job: 20160204160143126975
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/minion', 'minion-acc01.mycompany.com', 'tcp://10.0.24.11:4506', 'aes')
[DEBUG ] Initializing new SAuth for ('/etc/salt/pki/minion', 'minion-acc01.mycompany.com', 'tcp://10.0.24.11:4506')
[INFO ] Executing command ['dpkg-query', '--showformat', '${Status} ${Package} ${Version} ${Architecture}\n', '-W'] in directory '/root'
[INFO ] Made the following changes:
'libltdl7' changed from 'absent' to '2.4.2-1.7ubuntu1'
'erlang-asn1' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-inets' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-snmp' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-percept' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-ic' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-os-mon' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'libodbc1' changed from 'absent' to '2.2.14p2-5ubuntu5'
'erlang-syntax-tools' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'rabbitmq-server' changed from 'absent' to '3.6.0-1'
'erlang-ssl' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-ssh' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'libsctp1' changed from 'absent' to '1.0.15+dfsg-1'
'erlang-runtime-tools' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-eunit' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-tools' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-odbc' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-parsetools' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-public-key' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'lksctp-tools' changed from 'absent' to '1.0.15+dfsg-1'
'erlang-diameter' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-corba' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-xmerl' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-nox' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-eldap' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-edoc' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-mnesia' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-webtool' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-crypto' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-erl-docgen' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
'erlang-abi-15.b' changed from 'absent' to '1'
'erlang-base' changed from 'absent' to '1:16.b.3-dfsg-1ubuntu2.1'
[DEBUG ] Refreshing modules...
[INFO ] Loading fresh modules for state activity
[DEBUG ] LazyLoaded jinja.render
[DEBUG ] LazyLoaded yaml.render
[DEBUG ] LazyLoaded saltutil.refresh_modules
[DEBUG ] LazyLoaded event.fire
[DEBUG ] SaltEvent PUB socket URI: ipc:///var/run/salt/minion/minion_event_13dcaed882_pub.ipc
[DEBUG ] SaltEvent PULL socket URI: ipc:///var/run/salt/minion/minion_event_13dcaed882_pull.ipc
[DEBUG ] Sending event - data = {'_stamp': '2016-02-04T23:01:41.166026'}
[DEBUG ] Handling event 'module_refresh\n\n\x81\xa6_stamp\xba2016-02-04T23:01:41.166026'
[DEBUG ] Refreshing modules. Notify=False
[DEBUG ] Reading configuration from /etc/salt/minion
[DEBUG ] Including configuration from '/etc/salt/minion.d/_schedule.conf'
[DEBUG ] Reading configuration from /etc/salt/minion.d/_schedule.conf
[INFO ] Completed state [rabbitmq-server] at time 16:01:41.194060
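The repo check at 16:01:33 followed by the apt-get install above corresponds to a pkgrepo.managed / pkg.installed pair. A sketch (the repo state ID and the require link are assumptions):

```yaml
# Hypothetical pair matching the 'deb http://www.rabbitmq.com/debian/ testing main'
# repo and the rabbitmq-server install logged above.
rabbitmq-repo:
  pkgrepo.managed:
    - name: deb http://www.rabbitmq.com/debian/ testing main

rabbitmq-server:
  pkg.installed:
    - require:
      - pkgrepo: rabbitmq-repo
```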
[DEBUG ] LazyLoaded config.option
[DEBUG ] Error loading module.rh_service: Cannot load rh_service module: OS not in set(['SUSE Enterprise Server', 'SUSE', 'RedHat', 'CentOS', 'CloudLinux', 'McAfee OS Server', 'XenServer', 'Amazon', 'OEL', 'ScientificLinux', 'ALT', 'Fedora'])
[DEBUG ] Error loading module.ifttt: IFTTT Secret Key Unavailable, not loading.
[DEBUG ] Error loading module.dockerng: Docker module could not get imported
[DEBUG ] Error loading module.victorops: No VictorOps api key found.
[DEBUG ] Error loading module.ipmi: No module named pyghmi.ipmi
[DEBUG ] Error loading module.win_status: Cannot load win_status module on non-windows
[DEBUG ] You should upgrade pyOpenSSL to at least 0.14.1 to enable the use of X509 extensions in the tls module
[DEBUG ] Error loading module.glusterfs: glusterfs server is not installed
[DEBUG ] Error loading module.vsphere: Missing dependency: The vSphere module requires the pyVmomi Python module.
[DEBUG ] Error loading module.nacl: libnacl import error, perhaps missing python libnacl package
[DEBUG ] Please install 'virt-what' to improve results of the 'virtual' grain.
[DEBUG ] LazyLoaded service.start
[DEBUG ] LazyLoaded service.dead
[DEBUG ] LazyLoaded file.managed
[INFO ] Running state [/var/lib/rabbitmq/.erlang.cookie] at time 16:01:41.426379
[DEBUG ] LazyLoaded file.user_to_uid
[DEBUG ] In saltenv 'base', looking at rel_path u'minion/erlang.cookie' to resolve u'salt://minion/erlang.cookie'
[DEBUG ] In saltenv 'base', ** considering ** path u'/var/cache/salt/minion/files/base/minion/erlang.cookie' to resolve u'salt://minion/erlang.cookie'
[DEBUG ] LazyLoaded config.merge
[DEBUG ] LazyLoaded mine.update
[INFO ] Fetching file from saltenv 'base', ** skipped ** latest already in cache u'salt://minion/erlang.cookie'
[INFO ] Running state [rabbitmq-server] at time 16:01:41.478896
[INFO ] Executing state service.dead for rabbitmq-server
[DEBUG ] LazyLoaded cmd.retcode
[INFO ] Executing command ['service', 'rabbitmq-server', 'status'] in directory '/root'
[DEBUG ] output: Status of node 'rabbit@minion-acc01' ...
[{pid,4914},
{running_applications,[{rabbit,"RabbitMQ","3.6.0"},
{mnesia,"MNESIA CXC 138 12","4.11"},
{os_mon,"CPO CXC 138 46","2.2.14"},
{rabbit_common,[],"3.6.0"},
{xmerl,"XML parser","1.3.5"},
{ranch,"Socket acceptor pool for TCP protocols.",
"1.2.1"},
{sasl,"SASL CXC 138 11","2.3.4"},
{stdlib,"ERTS CXC 138 10","1.19.4"},
{kernel,"ERTS CXC 138 10","2.16.4"}]},
{os,{unix,linux}},
{erlang_version,"Erlang R16B03 (erts-5.10.4) [source] [64-bit] [smp:2:2] [async-threads:64] [kernel-poll:true]\n"},
{memory,[{total,41833864},
{connection_readers,0},
{connection_writers,0},
{connection_channels,0},
{connection_other,0},
{queue_procs,2704},
{queue_slave_procs,0},
{plugins,0},
{other_proc,18916240},
{mnesia,61368},
{mgmt_db,0},
{msg_index,40224},
{other_ets,883960},
{binary,28080},
{code,17094921},
{atom,654217},
{other_system,4152150}]},
{alarms,[]},
{listeners,[{clustering,25672,"::"},{amqp,5672,"::"}]},
{vm_memory_high_watermark,0.4},
{vm_memory_limit,836683366},
{disk_free_limit,50000000},
{disk_free,116929630208},
{file_descriptors,[{total_limit,924},
{total_used,2},
{sockets_limit,829},
{sockets_used,0}]},
{processes,[{limit,1048576},{used,129}]},
{run_queue,0},
{uptime,2},
{kernel,{net_ticktime,60}}]
[INFO ] Executing command ['service', 'rabbitmq-server', 'stop'] in directory '/root'
[INFO ] User sudo_myuser01 Executing command saltutil.find_job with jid 20160204160153151102
[DEBUG ] Command details {'tgt_type': 'glob', 'jid': '20160204160153151102', 'tgt': 'minion-acc01.mycompany.com', 'ret': '', 'user': 'sudo_myuser01', 'arg': ['20160204155857806709'], 'fun': 'saltutil.find_job'}
[INFO ] Starting a new job with PID 5611
[DEBUG ] LazyLoaded saltutil.find_job
[DEBUG ] Minion return retry timer set to 7 seconds (randomized)
[INFO ] Returning information for job: 20160204160153151102
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/minion', 'minion-acc01.mycompany.com', 'tcp://10.0.24.11:4506', 'aes')
[DEBUG ] Initializing new SAuth for ('/etc/salt/pki/minion', 'minion-acc01.mycompany.com', 'tcp://10.0.24.11:4506')
[DEBUG ] output: * Stopping message broker rabbitmq-server
...done.
[INFO ] Executing command ['service', 'rabbitmq-server', 'status'] in directory '/root'
[ERROR ] Command ['service', 'rabbitmq-server', 'status'] failed with return code: 3
[ERROR ] output: Status of node 'rabbit@minion-acc01' ...
Error: unable to connect to node 'rabbit@minion-acc01': nodedown
DIAGNOSTICS
===========
attempted to contact: ['rabbit@minion-acc01']
rabbit@minion-acc01:
* connected to epmd (port 4369) on minion-acc01
* epmd reports: node 'rabbit' not running at all
no other nodes on minion-acc01
* suggestion: start the node
current node details:
- node name: 'rabbitmq-cli-5703@minion-acc01'
- home dir: /var/lib/rabbitmq
- cookie hash: saG4+7chFAnjc4YHUIvZWQ==
[INFO ] {'rabbitmq-server': True}
[INFO ] Completed state [rabbitmq-server] at time 16:01:44.962473
[INFO ] Running state [/var/lib/rabbitmq/.erlang.cookie] at time 16:01:44.963375
[INFO ] Executing state file.managed for /var/lib/rabbitmq/.erlang.cookie
[DEBUG ] In saltenv 'base', looking at rel_path u'minion/erlang.cookie' to resolve u'salt://minion/erlang.cookie'
[DEBUG ] In saltenv 'base', ** considering ** path u'/var/cache/salt/minion/files/base/minion/erlang.cookie' to resolve u'salt://minion/erlang.cookie'
[INFO ] Fetching file from saltenv 'base', ** skipped ** latest already in cache u'salt://minion/erlang.cookie'
[INFO ] File changed:
---
+++
@@ -1 +1 @@
-UHELPGXEVTYGSYCAJYZQ+AVIPLIEBGJXMRONWUZHIJKLMNO
[INFO ] Completed state [/var/lib/rabbitmq/.erlang.cookie] at time 16:01:44.984433
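Note the ordering here: the broker is stopped first, then .erlang.cookie is replaced, and the service is started again further down. That is the usual pattern for rotating the Erlang cookie, commonly expressed with a prereq so the broker only stops when the cookie would actually change. A sketch (requisites, ownership, and mode are assumptions):

```yaml
# Hypothetical; the cookie must be swapped while the broker is down.
rabbitmq-stopped-for-cookie:
  service.dead:
    - name: rabbitmq-server
    - prereq:
      - file: /var/lib/rabbitmq/.erlang.cookie

/var/lib/rabbitmq/.erlang.cookie:
  file.managed:
    - source: salt://minion/erlang.cookie
    - user: rabbitmq
    - group: rabbitmq
    - mode: 400
```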
[DEBUG ] LazyLoaded rabbitmq_plugin.enabled
[INFO ] Running state [/usr/local/bin/rabbitmq-env] at time 16:01:44.992513
[INFO ] Executing state file.symlink for /usr/local/bin/rabbitmq-env
[INFO ] Symlink /usr/local/bin/rabbitmq-env is present and owned by root:root
[INFO ] Completed state [/usr/local/bin/rabbitmq-env] at time 16:01:45.014938
[INFO ] Running state [/usr/local/bin/rabbitmq-plugins] at time 16:01:45.015884
[INFO ] Executing state file.symlink for /usr/local/bin/rabbitmq-plugins
[INFO ] Symlink /usr/local/bin/rabbitmq-plugins is present and owned by root:root
[INFO ] Completed state [/usr/local/bin/rabbitmq-plugins] at time 16:01:45.017247
[INFO ] Running state [rabbitmq_management] at time 16:01:45.018181
[INFO ] Executing state rabbitmq_plugin.enabled for rabbitmq_management
stdin: is not a tty
[INFO ] Executing command '/usr/local/bin/rabbitmq-plugins list -m -e' as user 'root' in directory '/root'
stdin: is not a tty
[INFO ] Executing command '/usr/local/bin/rabbitmq-plugins enable rabbitmq_management' as user 'root' in directory '/root'
[DEBUG ] stdout: The following plugins have been enabled:
mochiweb
webmachine
rabbitmq_web_dispatch
amqp_client
rabbitmq_management_agent
rabbitmq_management
Applying plugin configuration to rabbit@minion-acc01... failed.
* Could not contact node rabbit@minion-acc01.
Changes will take effect at broker restart.
* Options: --online - fail if broker cannot be contacted.
--offline - do not try to contact broker.
[INFO ] {'new': 'rabbitmq_management', 'old': ''}
[INFO ] Completed state [rabbitmq_management] at time 16:01:46.918688
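The plugin is enabled while the broker is down ("Could not contact node ... Changes will take effect at broker restart"), which is harmless because a service.running state for rabbitmq-server follows immediately. The state behind this step is likely no more than:

```yaml
# Minimal sketch matching the rabbitmq_plugin.enabled call logged above;
# the require link is an assumption.
rabbitmq_management:
  rabbitmq_plugin.enabled:
    - require:
      - pkg: rabbitmq-server
```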
[INFO ] Running state [rabbitmq-server] at time 16:01:46.919895
[INFO ] Executing state service.running for rabbitmq-server
[INFO ] Executing command ['service', 'rabbitmq-server', 'status'] in directory '/root'
[ERROR ] Command ['service', 'rabbitmq-server', 'status'] failed with return code: 3
[ERROR ] output: Status of node 'rabbit@minion-acc01' ...
Error: unable to connect to node 'rabbit@minion-acc01': nodedown
DIAGNOSTICS
===========
attempted to contact: ['rabbit@minion-acc01']
rabbit@minion-acc01:
* connected to epmd (port 4369) on minion-acc01
* epmd reports: node 'rabbit' not running at all
no other nodes on minion-acc01
* suggestion: start the node
current node details:
- node name: 'rabbitmq-cli-6150@minion-acc01'
- home dir: /var/lib/rabbitmq
- cookie hash: lPQ2SksvJe+eVo7TjqOxcQ==
[INFO ] Executing command ['service', 'rabbitmq-server', 'start'] in directory '/root'
[DEBUG ] output: * Starting message broker rabbitmq-server
...done.
[INFO ] Executing command ['service', 'rabbitmq-server', 'status'] in directory '/root'
[DEBUG ] output: Status of node 'rabbit@minion-acc01' ...
[{pid,6770},
{running_applications,
[{rabbitmq_management,"RabbitMQ Management Console","3.6.0"},
{rabbitmq_web_dispatch,"RabbitMQ Web Dispatcher","3.6.0"},
{webmachine,"webmachine","git"},
{mochiweb,"MochiMedia Web Server","2.13.0"},
{rabbitmq_management_agent,"RabbitMQ Management Agent","3.6.0"},
{rabbit,"RabbitMQ","3.6.0"},
{os_mon,"CPO CXC 138 46","2.2.14"},
{ssl,"Erlang/OTP SSL application","5.3.2"},
{public_key,"Public key infrastructure","0.21"},
{crypto,"CRYPTO version 2","3.2"},
{amqp_client,"RabbitMQ AMQP Client","3.6.0"},
{rabbit_common,[],"3.6.0"},
{inets,"INETS CXC 138 49","5.9.7"},
{mnesia,"MNESIA CXC 138 12","4.11"},
{compiler,"ERTS CXC 138 10","4.9.4"},
{xmerl,"XML parser","1.3.5"},
{syntax_tools,"Syntax tools","1.6.12"},
{asn1,"The Erlang ASN1 compiler version 2.0.4","2.0.4"},
{ranch,"Socket acceptor pool for TCP protocols.","1.2.1"},
{sasl,"SASL CXC 138 11","2.3.4"},
{stdlib,"ERTS CXC 138 10","1.19.4"},
{kernel,"ERTS CXC 138 10","2.16.4"}]},
{os,{unix,linux}},
{erlang_version,
"Erlang R16B03 (erts-5.10.4) [source] [64-bit] [smp:2:2] [async-threads:64] [kernel-poll:true]\n"},
{memory,
[{total,53556840},
{connection_readers,0},
{connection_writers,0},
{connection_channels,0},
{connection_other,2704},
{queue_procs,2704},
{queue_slave_procs,0},
{plugins,177144},
{other_proc,19286624},
{mnesia,61032},
{mgmt_db,73856},
{msg_index,62592},
{other_ets,1361344},
{binary,35152},
{code,26652340},
{atom,951441},
{other_system,4889907}]},
{alarms,[]},
{listeners,[{clustering,25672,"::"},{amqp,5672,"::"}]},
{vm_memory_high_watermark,0.4},
{vm_memory_limit,836683366},
{disk_free_limit,50000000},
{disk_free,116991930368},
{file_descriptors,
[{total_limit,924},{total_used,2},{sockets_limit,829},{sockets_used,0}]},
{processes,[{limit,1048576},{used,192}]},
{run_queue,0},
{uptime,2},
{kernel,{net_ticktime,60}}]
[INFO ] {'rabbitmq-server': True}
[INFO ] Completed state [rabbitmq-server] at time 16:01:50.252934
[DEBUG ] LazyLoaded rabbitmq_policy.present
[INFO ] Running state [HA] at time 16:01:50.262791
[INFO ] Executing state rabbitmq_policy.present for HA
stdin: is not a tty
[INFO ] Executing command 'rabbitmqctl list_policies -p /' as user 'root' in directory '/root'
[DEBUG ] output: Listing policies ...
[DEBUG ] Listing policies: {}
[DEBUG ] Policy doesn't exist - Creating
stdin: is not a tty
[INFO ] Executing command 'rabbitmqctl set_policy -p / HA \'.*\' \'{"ha-mode": "all"}\'' as user 'root' in directory '/root'
[DEBUG ] output: Setting policy "HA" for pattern ".*" to "{\"ha-mode\": \"all\"}" with priority "0" ...
[DEBUG ] Set policy: Setting policy "HA" for pattern ".*" to "{\"ha-mode\": \"all\"}" with priority "0" ...
[INFO ] {'new': None, 'old': ''}
[INFO ] Completed state [HA] at time 16:01:50.950627
[DEBUG ] LazyLoaded rabbitmq_user.present
[INFO ] Running state [minion_processor] at time 16:01:50.958694
[INFO ] Executing state rabbitmq_user.present for minion_processor
stdin: is not a tty
[INFO ] Executing command 'rabbitmqctl list_users' as user 'root' in directory '/root'
[DEBUG ] output: Listing users ...
guest [administrator]
[DEBUG ] {'guest': set(['administrator'])}
[DEBUG ] RabbitMQ user 'minion_processor' doesn't exist - Creating.
stdin: is not a tty
stdin: is not a tty
[INFO ] Executing command 'rabbitmqctl list_user_permissions minion_processor' as user 'root' in directory '/root'
[DEBUG ] output: Listing permissions for user "minion_processor" ...
stdin: is not a tty
[INFO ] Executing command 'rabbitmqctl set_permissions -p / minion_processor ".*" ".*" ".*"' as user 'root' in directory '/root'
[DEBUG ] output: Setting permissions for user "minion_processor" in vhost "/" ...
[INFO ] Made the following changes:
'perms' changed from 'absent' to {'/': ['.*', '.*', '.*']}
'user' changed from 'absent' to 'minion_processor'
[INFO ] Completed state [minion_processor] at time 16:01:52.329385
[INFO ] Running state [myuser01] at time 16:01:52.330061
[INFO ] Executing state rabbitmq_user.present for myuser01
stdin: is not a tty
[INFO ] Executing command 'rabbitmqctl list_users' as user 'root' in directory '/root'
[DEBUG ] output: Listing users ...
minion_processor []
guest [administrator]
[DEBUG ] {'minion_processor': set(['']), 'guest': set(['administrator'])}
[DEBUG ] RabbitMQ user 'myuser01' doesn't exist - Creating.
stdin: is not a tty
stdin: is not a tty
[INFO ] Executing command 'rabbitmqctl list_users' as user 'root' in directory '/root'
[DEBUG ] output: Listing users ...
minion_processor []
guest [administrator]
myuser01 []
stdin: is not a tty
[INFO ] Executing command 'rabbitmqctl set_user_tags myuser01 administrator' as user 'root' in directory '/root'
[INFO ] User sudo_myuser01 Executing command saltutil.find_job with jid 20160204160203174583
[DEBUG ] Command details {'tgt_type': 'glob', 'jid': '20160204160203174583', 'tgt': 'minion-acc01.mycompany.com', 'ret': '', 'user': 'sudo_myuser01', 'arg': ['20160204155857806709'], 'fun': 'saltutil.find_job'}
[INFO ] Starting a new job with PID 8527
[DEBUG ] LazyLoaded saltutil.find_job
[DEBUG ] Minion return retry timer set to 5 seconds (randomized)
[INFO ] Returning information for job: 20160204160203174583
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/minion', 'minion-acc01.mycompany.com', 'tcp://10.0.24.11:4506', 'aes')
[DEBUG ] Initializing new SAuth for ('/etc/salt/pki/minion', 'minion-acc01.mycompany.com', 'tcp://10.0.24.11:4506')
[DEBUG ] output: Setting tags for user "myuser01" to [administrator] ...
stdin: is not a tty
[INFO ] Executing command 'rabbitmqctl list_user_permissions myuser01' as user 'root' in directory '/root'
[DEBUG ] output: Listing permissions for user "myuser01" ...
stdin: is not a tty
[INFO ] Executing command 'rabbitmqctl set_permissions -p / myuser01 ".*" ".*" ".*"' as user 'root' in directory '/root'
[DEBUG ] output: Setting permissions for user "myuser01" in vhost "/" ...
[INFO ] Made the following changes:
'perms' changed from 'absent' to {'/': ['.*', '.*', '.*']}
'user' changed from 'absent' to 'myuser01'
'tags' changed from 'administrator' to set([''])
[INFO ] Completed state [myuser01] at time 16:01:54.355665
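Note the `'tags' changed from 'administrator' to set([''])` line above: the `set([''])` comes from how the `rabbitmqctl list_users` output is parsed. A user with no tags prints as `name []`, and stripping the brackets leaves an empty string that survives the split and becomes a one-element set. A hypothetical sketch reproducing the parsed dict seen in the debug output (this is an illustration of the behavior, not Salt's actual parsing code):

```python
def parse_list_users(output):
    # Hypothetical parser reproducing the {'user': set([''])} values
    # visible in the [DEBUG] lines of this log.
    users = {}
    for line in output.splitlines():
        if line.startswith("Listing"):
            continue  # skip the "Listing users ..." banner
        parts = line.split(None, 1)  # "guest\t[administrator]" -> name, raw tags
        if len(parts) != 2:
            continue
        name, raw_tags = parts
        # "[]".strip("[]") == "" and "".split(",") == [""], so a user
        # with no tags ends up as set([""]) instead of an empty set.
        users[name] = set(raw_tags.strip("[]").split(","))
    return users

sample = "Listing users ...\nminion_processor\t[]\nguest\t[administrator]"
print(parse_list_users(sample))
```

That stray empty-string set is harmless in the diff output, but it ends up inside the state's `changes` dict, which matters later when the return payload is serialized.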
[INFO ] Running state [minion_web] at time 16:01:54.356343
[INFO ] Executing state rabbitmq_user.present for minion_web
stdin: is not a tty
[INFO ] Executing command 'rabbitmqctl list_users' as user 'root' in directory '/root'
[DEBUG ] output: Listing users ...
minion_processor []
guest [administrator]
myuser01 [administrator]
[DEBUG ] {'minion_processor': set(['']), 'guest': set(['administrator']), 'myuser01': set(['administrator'])}
[DEBUG ] RabbitMQ user 'minion_web' doesn't exist - Creating.
stdin: is not a tty
stdin: is not a tty
[INFO ] Executing command 'rabbitmqctl list_user_permissions minion_web' as user 'root' in directory '/root'
[DEBUG ] output: Listing permissions for user "minion_web" ...
stdin: is not a tty
[INFO ] Executing command 'rabbitmqctl set_permissions -p / minion_web ".*" ".*" ".*"' as user 'root' in directory '/root'
[DEBUG ] output: Setting permissions for user "minion_web" in vhost "/" ...
[INFO ] Made the following changes:
'perms' changed from 'absent' to {'/': ['.*', '.*', '.*']}
'user' changed from 'absent' to 'minion_web'
[INFO ] Completed state [minion_web] at time 16:01:55.696214
[DEBUG ] LazyLoaded rabbitmq_vhost.present
[INFO ] Running state [/] at time 16:01:55.706217
[INFO ] Executing state rabbitmq_vhost.present for /
stdin: is not a tty
[INFO ] Executing command 'rabbitmqctl list_vhosts -q' as user 'root' in directory '/root'
[DEBUG ] output: /
[INFO ] VHost / already exists
[INFO ] Completed state [/] at time 16:01:56.036293
[DEBUG ] File /var/cache/salt/minion/accumulator/139681067854160 does not exist, no need to cleanup.
[DEBUG ] Minion return retry timer set to 5 seconds (randomized)
[INFO ] Returning information for job: 20160204155857806709
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/minion', 'minion-acc01.mycompany.com', 'tcp://10.0.24.11:4506', 'aes')
[DEBUG ] Initializing new SAuth for ('/etc/salt/pki/minion', 'minion-acc01.mycompany.com', 'tcp://10.0.24.11:4506')
Process Process-1:3:
Traceback (most recent call last):
File "/usr/lib/python2.7/multiprocessing/process.py", line 258, in _bootstrap
self.run()
File "/usr/lib/python2.7/multiprocessing/process.py", line 114, in run
self._target(*self._args, **self._kwargs)
File "/usr/lib/python2.7/dist-packages/salt/minion.py", line 1155, in _thread_return
timeout=minion_instance._return_retry_timer()
File "/usr/lib/python2.7/dist-packages/salt/minion.py", line 1305, in _return_pub
ret_val = channel.send(load, timeout=timeout)
File "/usr/lib/python2.7/dist-packages/salt/utils/async.py", line 73, in wrap
ret = self._block_future(ret)
File "/usr/lib/python2.7/dist-packages/salt/utils/async.py", line 83, in _block_future
return future.result()
File "/usr/lib/python2.7/dist-packages/tornado/concurrent.py", line 215, in result
raise_exc_info(self._exc_info)
File "/usr/lib/python2.7/dist-packages/tornado/gen.py", line 876, in run
yielded = self.gen.throw(*exc_info)
File "/usr/lib/python2.7/dist-packages/salt/transport/zeromq.py", line 231, in send
ret = yield self._crypted_transfer(load, tries=tries, timeout=timeout)
File "/usr/lib/python2.7/dist-packages/tornado/gen.py", line 870, in run
value = future.result()
File "/usr/lib/python2.7/dist-packages/tornado/concurrent.py", line 215, in result
raise_exc_info(self._exc_info)
File "/usr/lib/python2.7/dist-packages/tornado/gen.py", line 876, in run
yielded = self.gen.throw(*exc_info)
File "/usr/lib/python2.7/dist-packages/salt/transport/zeromq.py", line 199, in _crypted_transfer
ret = yield _do_transfer()
File "/usr/lib/python2.7/dist-packages/tornado/gen.py", line 870, in run
value = future.result()
File "/usr/lib/python2.7/dist-packages/tornado/concurrent.py", line 215, in result
raise_exc_info(self._exc_info)
File "/usr/lib/python2.7/dist-packages/tornado/gen.py", line 230, in wrapper
yielded = next(result)
File "/usr/lib/python2.7/dist-packages/salt/transport/zeromq.py", line 183, in _do_transfer
self._package_load(self.auth.crypticle.dumps(load)),
File "/usr/lib/python2.7/dist-packages/salt/crypt.py", line 1164, in dumps
return self.encrypt(self.PICKLE_PAD + self.serial.dumps(obj))
File "/usr/lib/python2.7/dist-packages/salt/payload.py", line 140, in dumps
return msgpack.dumps(msg)
File "_packer.pyx", line 259, in msgpack._packer.packb (msgpack/_packer.cpp:259)
File "_packer.pyx", line 184, in msgpack._packer.Packer.pack (msgpack/_packer.cpp:184)
File "_packer.pyx", line 159, in msgpack._packer.Packer._pack (msgpack/_packer.cpp:159)
File "_packer.pyx", line 159, in msgpack._packer.Packer._pack (msgpack/_packer.cpp:159)
File "_packer.pyx", line 159, in msgpack._packer.Packer._pack (msgpack/_packer.cpp:159)
File "_packer.pyx", line 159, in msgpack._packer.Packer._pack (msgpack/_packer.cpp:159)
File "_packer.pyx", line 159, in msgpack._packer.Packer._pack (msgpack/_packer.cpp:159)
File "_packer.pyx", line 179, in msgpack._packer.Packer._pack (msgpack/_packer.cpp:179)
TypeError: can't serialize set([''])
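The traceback shows the root cause of the crash: the state's `changes` dict still contains the Python `set([''])` from the tags diff, and msgpack cannot serialize sets, so the minion's return-publish process dies mid-job. A minimal illustration of the failure class and an assumed workaround, using the stdlib `json` serializer as a stand-in (msgpack-python rejects sets the same way); `sanitize` is a hypothetical helper, not part of Salt:

```python
import json

def sanitize(obj):
    # Recursively convert sets to sorted lists so the payload serializes.
    if isinstance(obj, set):
        return sorted(obj)
    if isinstance(obj, dict):
        return {k: sanitize(v) for k, v in obj.items()}
    if isinstance(obj, list):
        return [sanitize(v) for v in obj]
    return obj

changes = {"tags": set([""]), "user": "myuser01"}
try:
    json.dumps(changes)  # stdlib json, like msgpack, rejects sets
except TypeError as exc:
    print("unserializable:", exc)

print(json.dumps(sanitize(changes)))  # serializes once sets become lists
```

The general fix is the same in either serializer: normalize sets (and any other non-primitive types) to lists before the return payload is packed.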
[INFO ] User sudo_myuser01 Executing command saltutil.find_job with jid 20160204160213212394
[DEBUG ] Command details {'tgt_type': 'glob', 'jid': '20160204160213212394', 'tgt': 'minion-acc01.mycompany.com', 'ret': '', 'user': 'sudo_myuser01', 'arg': ['20160204155857806709'], 'fun': 'saltutil.find_job'}
[INFO ] Starting a new job with PID 9548
[DEBUG ] LazyLoaded saltutil.find_job
[DEBUG ] Minion return retry timer set to 5 seconds (randomized)
[INFO ] Returning information for job: 20160204160213212394
[DEBUG ] Initializing new AsyncZeroMQReqChannel for ('/etc/salt/pki/minion', 'minion-acc01.mycompany.com', 'tcp://10.0.24.11:4506', 'aes')
[DEBUG ] Initializing new SAuth for ('/etc/salt/pki/minion', 'minion-acc01.mycompany.com', 'tcp://10.0.24.11:4506')
^C[INFO ] The salt minion is shut down