Remove splunkforwarder and run chef-client
Gist barkerd427/4ad76e5a983f628d26b3, created December 11, 2014 02:53
root@kc7aeonmap001:/root
20:27:06 # rpm -e splunkforwarder
root@kc7aeonmap001:/root
20:27:12 # rpm -q splunkforwarder
package splunkforwarder is not installed
root@kc7aeonmap001:/root
20:27:27 # chef-client
[2014-12-10T20:30:17-06:00] INFO: *** Chef 10.24.0 ***
[2014-12-10T20:30:18-06:00] INFO: [inet6] no default interface, picking the first ipaddress
[2014-12-10T20:30:19-06:00] INFO: Run List is [role[Hadoop-jdk6], role[Hadoop-client-role], role[dnsmasq-hadoop-kc7], role[aeon_mapreduce_kc7staging], role[aeon_orgsecurity_mapreduce_staging], role[aeon_cron_kc7staging], role[salt_shaker_kc7staging], role[aeon_formulary_mapreduce_staging], role[aeon_temporary_storage_cron_staging], role[pharmacy_mappings_pre-prod], role[aeon_orders_reference_mapreduce_staging], role[zabbix_agent_non_prod_jmx_disabled], role[statesman_cli_install_staging], role[concept_tags_2_staging], role[hdfs_tmp_cleanup_staging], role[aeon_eventset_mapreduce_kc7staging]]
[2014-12-10T20:30:19-06:00] INFO: Run List expands to [java, kepler-tools-cookbook, cdh4::base, dnsmasq, cerner_splunk, aeon_mapreduce, aeon_org_security_mapreduce, aeon_cron, salt_shaker, aeon_formulary_mapreduce, aeon_formulary_mapreduce::cron, aeon_temporary_storage_cron, aeon_pharmacy_mappings, aeon_orders_reference_mapreduce, zabbix_agent, statesman-cli, concept_tags, hdfs_upload::hdfs_cleanup, aeon_eventset_mapreduce, aeon_eventset_mapreduce::cron]
[2014-12-10T20:30:19-06:00] INFO: HTTP Request Returned 404 Not Found: No routes match the request: /reports/nodes/kc7aeonmap001.cernerasp.com/runs
[2014-12-10T20:30:19-06:00] INFO: Starting Chef Run for kc7aeonmap001.cernerasp.com
[2014-12-10T20:30:19-06:00] INFO: Running start handlers
[2014-12-10T20:30:19-06:00] INFO: Start handlers complete.
[2014-12-10T20:30:19-06:00] INFO: Loading cookbooks [aeon_cron, aeon_eventset_mapreduce, aeon_formulary_mapreduce, aeon_mapreduce, aeon_orders_reference_mapreduce, aeon_org_security_mapreduce, aeon_pharmacy_mappings, aeon_temporary_storage_cron, bashum, cdh4, cerner_splunk, concept_tags, dnsmasq, hdfs_upload, java, kepler-tools-cookbook, maven_artifact, ruby_build, salt_shaker, statesman-cli, sun_java, ulimit, zabbix_agent]
/var/chef/cache/cookbooks/cerner_splunk/libraries/recipe.rb:14: warning: already initialized constant NODE_TYPE
[2014-12-10T20:30:20-06:00] WARN: Setting attributes without specifying a precedence is deprecated and will be
removed in Chef 11.0. To set attributes at normal precedence, change code like:
`node["key"] = "value"` # Not this
to:
`node.set["key"] = "value"` # This
Called from:
/opt/chef/embedded/lib/ruby/gems/1.9.1/gems/chef-10.24.0/lib/chef/node/attribute.rb:449:in `method_missing'
/opt/chef/embedded/lib/ruby/gems/1.9.1/gems/chef-10.24.0/lib/chef/node.rb:346:in `method_missing'
/var/chef/cache/cookbooks/cerner_splunk/attributes/_install.rb:5:in `from_file'
[2014-12-10T20:30:20-06:00] WARN: Setting attributes without specifying a precedence is deprecated and will be
removed in Chef 11.0. To set attributes at normal precedence, change code like:
`node["key"] = "value"` # Not this
to:
`node.set["key"] = "value"` # This
Called from:
/opt/chef/embedded/lib/ruby/gems/1.9.1/gems/chef-10.24.0/lib/chef/node/attribute.rb:449:in `method_missing'
/opt/chef/embedded/lib/ruby/gems/1.9.1/gems/chef-10.24.0/lib/chef/node.rb:346:in `method_missing'
/var/chef/cache/cookbooks/cerner_splunk/attributes/_user_management.rb:4:in `from_file'
[2014-12-10T20:30:20-06:00] WARN: Using java::default instead is recommended.
SELinux status: disabled
[2014-12-10T20:30:27-06:00] INFO: Processing group[hive] action create (cdh4::base line 130)
[2014-12-10T20:30:27-06:00] INFO: Processing user[hive] action create (cdh4::base line 134)
[2014-12-10T20:30:27-06:00] WARN: Cloning resource attributes for directory[/etc/hadoop/conf] from prior resource (CHEF-3694)
[2014-12-10T20:30:27-06:00] WARN: Previous directory[/etc/hadoop/conf]: /var/chef/cache/cookbooks/kepler-tools-cookbook/recipes/default.rb:53:in `from_file'
[2014-12-10T20:30:27-06:00] WARN: Current directory[/etc/hadoop/conf]: /var/chef/cache/cookbooks/cdh4/recipes/config-hadoop-hbase.rb:14:in `from_file'
[2014-12-10T20:30:27-06:00] INFO: Processing directory[/opt/hive-aux] action create (cdh4::config-hadoop-hbase line 257)
[2014-12-10T20:30:27-06:00] INFO: Processing cookbook_file[/opt/hive-aux/mysql-connector-java-5.1.22-bin.jar] action create (cdh4::config-hadoop-hbase line 266)
[2014-12-10T20:30:27-06:00] INFO: Processing chef_gem[chef-vault] action install (cerner_splunk::_install line 8)
[2014-12-10T20:30:27-06:00] INFO: This version of Chef client does not support removing users from groups. If you need to remove 'SYSTEM' from groups you must do so manually.
[2014-12-10T20:30:27-06:00] INFO: Roles not configured for this node.
[2014-12-10T20:30:27-06:00] INFO: Splunk Authentication not configured for this node.
[2014-12-10T20:30:27-06:00] INFO: Splunk Alerts not configured for this node.
[2014-12-10T20:30:32-06:00] WARN: Cloning resource attributes for hdfs_upload[upload] from prior resource (CHEF-3694)
[2014-12-10T20:30:32-06:00] WARN: Previous hdfs_upload[upload]: /var/chef/cache/cookbooks/salt_shaker/recipes/default.rb:4:in `from_file'
[2014-12-10T20:30:32-06:00] WARN: Current hdfs_upload[upload]: /var/chef/cache/cookbooks/aeon_pharmacy_mappings/recipes/default.rb:6:in `from_file'
[2014-12-10T20:30:32-06:00] WARN: Cloning resource attributes for package[git] from prior resource (CHEF-3694)
[2014-12-10T20:30:32-06:00] WARN: Previous package[git]: /var/chef/cache/cookbooks/aeon_mapreduce/recipes/bane.rb:47:in `from_file'
[2014-12-10T20:30:32-06:00] WARN: Current package[git]: /var/chef/cache/cookbooks/ruby_build/recipes/default.rb:38:in `block in from_file'
[2014-12-10T20:30:32-06:00] INFO: Installing hdfs_cleanup hdfs_tmp_cleanup
[2014-12-10T20:30:32-06:00] INFO: Processing ruby_block[set-env-java-home] action create (java::set_java_home line 19)
[2014-12-10T20:30:32-06:00] INFO: ruby_block[set-env-java-home] called
[2014-12-10T20:30:32-06:00] INFO: Processing directory[/etc/profile.d] action create (java::set_java_home line 26)
[2014-12-10T20:30:32-06:00] INFO: Processing file[/etc/profile.d/jdk.sh] action create (java::set_java_home line 30)
[2014-12-10T20:30:32-06:00] INFO: Processing java_ark[jdk] action install (java::oracle line 53)
[2014-12-10T20:30:32-06:00] INFO: Processing java_alternatives[set-java-alternatives] action set (/var/chef/cache/cookbooks/java/providers/ark.rb line 206)
[2014-12-10T20:30:34-06:00] INFO: Processing cookbook_file[/var/chef/cache/kepler-tools-2.5.tar.gz] action create (kepler-tools-cookbook::default line 16)
[2014-12-10T20:30:35-06:00] INFO: Processing execute[untar_kepler_tools] action run (kepler-tools-cookbook::default line 22)
[2014-12-10T20:30:35-06:00] INFO: execute[untar_kepler_tools] ran successfully
[2014-12-10T20:30:35-06:00] INFO: Processing link[/usr/local/bin/kepler] action create (kepler-tools-cookbook::default line 28)
[2014-12-10T20:30:35-06:00] INFO: Processing ruby_block[delete_kepler_tools_environement] action create (kepler-tools-cookbook::default line 34)
[2014-12-10T20:30:35-06:00] INFO: ruby_block[delete_kepler_tools_environement] called
[2014-12-10T20:30:35-06:00] INFO: Processing execute[create_kepler_tools_environment] action run (kepler-tools-cookbook::default line 46)
[2014-12-10T20:30:35-06:00] INFO: execute[create_kepler_tools_environment] ran successfully
[2014-12-10T20:30:35-06:00] INFO: Processing directory[/etc/hadoop/conf] action create (kepler-tools-cookbook::default line 53)
[2014-12-10T20:30:35-06:00] INFO: Processing template[/etc/hadoop/conf/kafka-site.xml] action create (kepler-tools-cookbook::default line 60)
[2014-12-10T20:30:35-06:00] INFO: Processing ruby_block[check for java_home] action create (cdh4::base line 18)
[2014-12-10T20:30:35-06:00] INFO: ruby_block[check for java_home] called
[2014-12-10T20:30:35-06:00] INFO: Processing execute[download_cdh4_repo] action run (cdh4::base line 34)
[2014-12-10T20:30:35-06:00] INFO: Processing template[/etc/yum.repos.d/cdh4-repository.repo] action create (cdh4::base line 40)
[2014-12-10T20:30:35-06:00] INFO: Processing bash[clean yum] action nothing (cdh4::base line 48)
[2014-12-10T20:30:35-06:00] INFO: Processing package[libxml2-devel] action install (cdh4::base line 60)
[2014-12-10T20:30:48-06:00] INFO: Processing gem_package[libxml-ruby] action install (cdh4::base line 64)
[2014-12-10T20:30:48-06:00] INFO: Processing gem_package[highline] action install (cdh4::base line 69)
[2014-12-10T20:30:48-06:00] INFO: Processing package[hadoop] action install (cdh4::base line 74)
[2014-12-10T20:30:48-06:00] INFO: Processing package[hadoop] action upgrade (cdh4::base line 74)
[2014-12-10T20:30:48-06:00] INFO: Processing package[hbase] action install (cdh4::base line 80)
[2014-12-10T20:30:48-06:00] INFO: Processing package[hbase] action upgrade (cdh4::base line 80)
[2014-12-10T20:30:48-06:00] INFO: Processing package[zookeeper] action install (cdh4::base line 86)
[2014-12-10T20:30:48-06:00] INFO: Processing package[zookeeper] action upgrade (cdh4::base line 86)
[2014-12-10T20:30:48-06:00] INFO: Processing package[hadoop-0.20-mapreduce] action install (cdh4::base line 91)
[2014-12-10T20:30:48-06:00] INFO: Processing package[hadoop-0.20-mapreduce] action upgrade (cdh4::base line 91)
[2014-12-10T20:30:48-06:00] INFO: Processing group[hbase] action create (cdh4::base line 98)
[2014-12-10T20:30:48-06:00] INFO: Processing user[hbase] action create (cdh4::base line 102)
[2014-12-10T20:30:48-06:00] INFO: Processing group[hadoop] action create (cdh4::base line 106)
[2014-12-10T20:30:48-06:00] INFO: Processing group[yarn] action create (cdh4::base line 110)
[2014-12-10T20:30:48-06:00] INFO: Processing user[yarn] action create (cdh4::base line 114)
[2014-12-10T20:30:48-06:00] INFO: Processing user[hdfs] action create (cdh4::base line 118)
[2014-12-10T20:30:48-06:00] INFO: Processing group[mapred] action create (cdh4::base line 122)
[2014-12-10T20:30:48-06:00] INFO: Processing user[mapred] action create (cdh4::base line 126)
[2014-12-10T20:30:48-06:00] INFO: Processing group[hive] action create (cdh4::base line 130)
[2014-12-10T20:30:48-06:00] INFO: Processing user[hive] action create (cdh4::base line 134)
[2014-12-10T20:30:48-06:00] INFO: Processing group[oozie] action create (cdh4::base line 138)
[2014-12-10T20:30:48-06:00] INFO: Processing user[oozie] action create (cdh4::base line 142)
[2014-12-10T20:30:48-06:00] INFO: Processing group[rvm] action manage (cdh4::base line 147)
[2014-12-10T20:30:48-06:00] INFO: Processing ruby_block[reset user/group list] action nothing (cdh4::base line 155)
[2014-12-10T20:30:48-06:00] INFO: Processing package[libgcj] action install (cdh4::lzo line 15)
[2014-12-10T20:30:48-06:00] INFO: Processing yum_package[lzo] action install (cdh4::lzo line 20)
[2014-12-10T20:30:48-06:00] INFO: Processing yum_package[lzo] action upgrade (cdh4::lzo line 20)
[2014-12-10T20:30:48-06:00] INFO: Processing cookbook_file[/var/chef/cache/hadoop-lzo-0.4.15-2.el6.x86_64.rpm] action create (cdh4::lzo line 24)
[2014-12-10T20:30:48-06:00] INFO: Processing yum_package[hadoop-lzo-0.4.15-2.el6.x86_64.rpm] action install (cdh4::lzo line 27)
[2014-12-10T20:30:49-06:00] INFO: Processing bash[rename_old_jars] action run (cdh4::base line 167)
[2014-12-10T20:30:49-06:00] INFO: bash[rename_old_jars] ran successfully
[2014-12-10T20:30:49-06:00] INFO: Processing cookbook_file[/usr/lib/hadoop/lib/core-3.2.0.666.jar] action create (cdh4::base line 178)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/opt/hbase/aux-jars/] action create (cdh4::base line 186)
[2014-12-10T20:30:49-06:00] INFO: Processing remote_file[download hbase libjar aeon-mapreduce-filter-4.7.0-filter.jar] action create_if_missing (cdh4::base line 196)
[2014-12-10T20:30:49-06:00] INFO: Processing remote_file[download hbase libjar kepler-filters-2.5.jar] action create_if_missing (cdh4::base line 196)
[2014-12-10T20:30:49-06:00] INFO: Processing remote_file[download hbase libjar aeon-measurement-filter-0.14.jar] action create_if_missing (cdh4::base line 196)
[2014-12-10T20:30:49-06:00] INFO: Processing remote_file[download hbase libjar fluentdb-filter-0.2.jar] action create_if_missing (cdh4::base line 196)
[2014-12-10T20:30:49-06:00] INFO: Processing remote_file[download hbase libjar thresh-filter-0.5.jar] action create_if_missing (cdh4::base line 196)
[2014-12-10T20:30:49-06:00] INFO: Processing remote_file[download hbase libjar aeon-scratchpad-filter-1.0.jar] action create_if_missing (cdh4::base line 196)
[2014-12-10T20:30:49-06:00] INFO: Processing ruby_block[cleanup hbase filter jars] action create (cdh4::base line 207)
[2014-12-10T20:30:49-06:00] INFO: ruby_block[cleanup hbase filter jars] called
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/var/log/hbase] action create (cdh4::base line 219)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/var/log/hadoop] action create (cdh4::base line 225)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/etc/sysctl.d] action create (cdh4::base line 233)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/sysctl.d/hadoop.sysctl] action create (cdh4::base line 239)
[2014-12-10T20:30:49-06:00] INFO: Processing execute[load sysctl] action run (cdh4::base line 245)
vm.swappiness = 0
[2014-12-10T20:30:49-06:00] INFO: execute[load sysctl] ran successfully
[2014-12-10T20:30:49-06:00] INFO: Processing execute[copy capacity scheduler jar] action run (cdh4::base line 250)
[2014-12-10T20:30:49-06:00] INFO: Processing package[hive] action install (cdh4::hive line 10)
[2014-12-10T20:30:49-06:00] INFO: Processing execute[untar cerner hive] action run (cdh4::hive line 18)
[2014-12-10T20:30:49-06:00] INFO: Processing execute[symlink hive utils jar] action run (cdh4::hive line 32)
[2014-12-10T20:30:49-06:00] INFO: Processing execute[symlink hive-conf] action run (cdh4::hive line 39)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/etc/hadoop/conf] action create (cdh4::config-hadoop-hbase line 14)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/etc/oozie/conf.dist] action create (cdh4::config-hadoop-hbase line 22)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hadoop/conf/log4j.properties] action create (cdh4::config-hadoop-hbase line 30)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hbase/conf/log4j.properties] action create (cdh4::config-hadoop-hbase line 35)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hadoop/conf/core-site.xml] action create (cdh4::config-hadoop-hbase line 43)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hadoop/conf/hdfs-site.xml] action create (cdh4::config-hadoop-hbase line 43)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hadoop/conf/mapred-site.xml] action create (cdh4::config-hadoop-hbase line 43)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hadoop/conf/mapred-queue-acls.xml] action create (cdh4::config-hadoop-hbase line 43)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hadoop/conf/capacity-scheduler.xml] action create (cdh4::config-hadoop-hbase line 43)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hadoop/conf/httpfs-site.xml] action create (cdh4::config-hadoop-hbase line 43)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hadoop/conf/yarn-site.xml] action create (cdh4::config-hadoop-hbase line 55)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hadoop/conf/hadoop-env.sh] action create (cdh4::config-hadoop-hbase line 66)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hadoop/conf/slaves] action create (cdh4::config-hadoop-hbase line 66)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hadoop/conf/yarn-env.sh] action create (cdh4::config-hadoop-hbase line 66)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/oozie/conf.dist/oozie-site.xml] action create (cdh4::config-hadoop-hbase line 74)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/oozie/conf.dist/oozie-env.sh] action create (cdh4::config-hadoop-hbase line 85)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hbase/conf/hadoop-metrics.properties] action create (cdh4::config-hadoop-hbase line 93)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hadoop/conf/hadoop-metrics.properties] action create (cdh4::config-hadoop-hbase line 93)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/usr/lib/hadoop/logs] action create (cdh4::config-hadoop-hbase line 100)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/etc/hbase/conf] action create (cdh4::config-hadoop-hbase line 107)
[2014-12-10T20:30:49-06:00] INFO: directory[/etc/hbase/conf] owner changed to 496
[2014-12-10T20:30:49-06:00] INFO: directory[/etc/hbase/conf] group changed to 397
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hbase/conf/hbase-env.sh] action create (cdh4::config-hadoop-hbase line 117)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hbase/conf/hbase-site.xml] action create (cdh4::config-hadoop-hbase line 117)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/hbase/conf/regionservers] action create (cdh4::config-hadoop-hbase line 117)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/storage/hadoop/1/mapred_local] action create (cdh4::config-hadoop-hbase line 152)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/storage/hadoop/2/mapred_local] action create (cdh4::config-hadoop-hbase line 152)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/storage/hadoop/3/mapred_local] action create (cdh4::config-hadoop-hbase line 152)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/storage/hadoop/1/hdfs_data] action create (cdh4::config-hadoop-hbase line 163)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/storage/hadoop/2/hdfs_data] action create (cdh4::config-hadoop-hbase line 163)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/storage/hadoop/3/hdfs_data] action create (cdh4::config-hadoop-hbase line 163)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/storage/hadoop/hdfs_jn] action create (cdh4::config-hadoop-hbase line 174)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/storage/hadoop/hdfs_name] action create (cdh4::config-hadoop-hbase line 195)
[2014-12-10T20:30:49-06:00] INFO: Processing template[/etc/security/limits.conf] action create (cdh4::config-hadoop-hbase line 207)
[2014-12-10T20:30:49-06:00] INFO: Processing directory[/usr/lib/hadoop/badjars] action create (cdh4::config-hadoop-hbase line 212)
[2014-12-10T20:30:49-06:00] INFO: Processing execute[move jar /usr/lib/hadoop/hadoop-common-2.0.0-cdh4.2.0-tests.jar] action run (cdh4::config-hadoop-hbase line 220)
[2014-12-10T20:30:50-06:00] INFO: Processing template[/usr/lib/hadoop/bin/rack-config.sh] action create (cdh4::config-hadoop-hbase line 226)
[2014-12-10T20:30:50-06:00] INFO: Processing cookbook_file[/usr/lib/hadoop/bin/rack-awareness.rb] action create (cdh4::config-hadoop-hbase line 232)
[2014-12-10T20:30:50-06:00] INFO: Processing template[/etc/hive/conf/hive-env.sh] action create (cdh4::config-hadoop-hbase line 237)
[2014-12-10T20:30:50-06:00] INFO: Processing template[/etc/hive/conf/hive-site.xml] action create (cdh4::config-hadoop-hbase line 245)
[2014-12-10T20:30:50-06:00] INFO: Processing directory[/opt/hive-aux] action create (cdh4::config-hadoop-hbase line 257)
[2014-12-10T20:30:50-06:00] INFO: Processing cookbook_file[/opt/hive-aux/mysql-connector-java-5.1.22-bin.jar] action create_if_missing (cdh4::config-hadoop-hbase line 266)
[2014-12-10T20:30:50-06:00] INFO: Processing ruby_block[cleanup hive aux jars] action create (cdh4::config-hadoop-hbase line 291)
[2014-12-10T20:30:50-06:00] INFO: ruby_block[cleanup hive aux jars] called
[2014-12-10T20:30:50-06:00] INFO: Processing package[pig] action install (cdh4::pig line 10)
[2014-12-10T20:30:50-06:00] INFO: Processing package[pig] action upgrade (cdh4::pig line 10)
[2014-12-10T20:30:50-06:00] INFO: Processing remote_file[/var/chef/cache/dnsmasq-2.66.tar.gz] action create (dnsmasq::source line 26)
[2014-12-10T20:30:50-06:00] INFO: Processing execute[Extract dnsmasq] action run (dnsmasq::source line 31)
[2014-12-10T20:30:50-06:00] INFO: Processing execute[Compile dnsmasq] action run (dnsmasq::source line 37)
[2014-12-10T20:30:50-06:00] INFO: Processing execute[Install dnsmasq] action run (dnsmasq::source line 43)
[2014-12-10T20:30:50-06:00] INFO: Processing directory[/var/chef/cache/dnsmasq-2.66] action delete (dnsmasq::source line 49)
[2014-12-10T20:30:50-06:00] INFO: Processing file[/var/chef/cache/dnsmasq-2.66.tar.gz] action delete (dnsmasq::source line 55)
[2014-12-10T20:30:50-06:00] INFO: Processing template[/etc/logrotate.d/dnsmasq] action create (dnsmasq::config line 11)
[2014-12-10T20:30:50-06:00] INFO: Processing template[/etc/init.d/dnsmasq] action create (dnsmasq::config line 17)
[2014-12-10T20:30:50-06:00] INFO: Processing template[/etc/dnsmasq.conf] action create (dnsmasq::config line 23)
[2014-12-10T20:30:50-06:00] INFO: Processing service[dnsmasq] action enable (dnsmasq::config line 30)
[2014-12-10T20:30:50-06:00] INFO: Processing service[dnsmasq] action start (dnsmasq::config line 30)
[2014-12-10T20:30:50-06:00] INFO: Processing execute[Create upstream resolver] action run (dnsmasq::config line 37)
[2014-12-10T20:30:50-06:00] INFO: Processing execute[Reload dnsmasq] action nothing (dnsmasq::config line 45)
[2014-12-10T20:30:50-06:00] INFO: Processing template[/etc/resolv.conf] action create (dnsmasq::config line 53)
[2014-12-10T20:30:50-06:00] INFO: Processing chef_gem[chef-vault] action install (cerner_splunk::_install line 8)
[2014-12-10T20:30:50-06:00] INFO: Processing ruby_block[clean-bashrc] action create (cerner_splunk::_cleanup_aeon line 10)
[2014-12-10T20:30:50-06:00] INFO: ruby_block[clean-bashrc] called
[2014-12-10T20:30:50-06:00] INFO: Processing ruby_block[uninstall-tar-forwarder] action create (cerner_splunk::_cleanup_aeon line 25)
[2014-12-10T20:30:50-06:00] INFO: Processing service[aeon-forwarder] action nothing (cerner_splunk::_cleanup_aeon line 35)
[2014-12-10T20:30:50-06:00] INFO: Processing execute[disable-tar-boot-start] action nothing (cerner_splunk::_cleanup_aeon line 43)
[2014-12-10T20:30:50-06:00] INFO: Processing directory[tar-dir] action nothing (cerner_splunk::_cleanup_aeon line 48)
[2014-12-10T20:30:50-06:00] INFO: Processing service[splunk] action nothing (cerner_splunk::_install line 42)
[2014-12-10T20:30:50-06:00] INFO: Processing remote_file[/var/chef/cache/splunkforwarder-6.0.6-228831-linux-2.6-x86_64.rpm] action create (cerner_splunk::_install line 50)
[2014-12-10T20:31:04-06:00] INFO: Processing package[splunkforwarder-6.0.6-228831] action install (cerner_splunk::_install line 56)
[2014-12-10T20:31:05-06:00] INFO: package[splunkforwarder-6.0.6-228831] sending run action to execute[splunk-first-run] (immediate)
[2014-12-10T20:31:05-06:00] INFO: Processing execute[splunk-first-run] action run (cerner_splunk::_install line 77)
================================================================================
Error executing action `run` on resource 'execute[splunk-first-run]'
================================================================================
Mixlib::ShellOut::ShellCommandFailed
------------------------------------
Expected process to exit with [0], but received '8'
---- Begin output of /opt/splunkforwarder/bin/splunk help commands --accept-license --answer-yes --no-prompt ----
STDOUT:
STDERR: Warning: cannot create "/opt/splunkforwarder/var/log/splunk"
Warning: cannot create "/opt/splunkforwarder/etc/licenses/enterprise"
---- End output of /opt/splunkforwarder/bin/splunk help commands --accept-license --answer-yes --no-prompt ----
Ran /opt/splunkforwarder/bin/splunk help commands --accept-license --answer-yes --no-prompt returned 8
Resource Declaration:
---------------------
# In /var/chef/cache/cookbooks/cerner_splunk/recipes/_install.rb
77: execute 'splunk-first-run' do
78: command "#{node[:splunk][:cmd]} help commands --accept-license --answer-yes --no-prompt"
79: user node[:splunk][:user]
80: group node[:splunk][:group]
81: action :nothing
82: end
83:
Compiled Resource:
------------------
# Declared in /var/chef/cache/cookbooks/cerner_splunk/recipes/_install.rb:77:in `from_file'
execute("splunk-first-run") do
action [:nothing]
retries 0
retry_delay 2
command "/opt/splunkforwarder/bin/splunk help commands --accept-license --answer-yes --no-prompt"
backup 5
group "SYSTEM"
returns 0
user "SYSTEM"
cookbook_name "cerner_splunk"
recipe_name "_install"
end
[2014-12-10T20:31:05-06:00] ERROR: Running exception handlers
[2014-12-10T20:31:05-06:00] FATAL: Saving node information to /var/chef/cache/failed-run-data.json
[2014-12-10T20:31:05-06:00] ERROR: Exception handlers complete
[2014-12-10T20:31:05-06:00] FATAL: Stacktrace dumped to /var/chef/cache/chef-stacktrace.out
[2014-12-10T20:31:05-06:00] FATAL: Mixlib::ShellOut::ShellCommandFailed: execute[splunk-first-run] (cerner_splunk::_install line 77) had an error: Mixlib::ShellOut::ShellCommandFailed: Expected process to exit with [0], but received '8'
---- Begin output of /opt/splunkforwarder/bin/splunk help commands --accept-license --answer-yes --no-prompt ----
STDOUT:
STDERR: Warning: cannot create "/opt/splunkforwarder/var/log/splunk"
Warning: cannot create "/opt/splunkforwarder/etc/licenses/enterprise"
---- End output of /opt/splunkforwarder/bin/splunk help commands --accept-license --answer-yes --no-prompt ----
Ran /opt/splunkforwarder/bin/splunk help commands --accept-license --answer-yes --no-prompt returned 8
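The "cannot create" warnings and exit status 8 point at a permissions problem: the compiled resource runs the command as user/group `SYSTEM`, but after the manual `rpm -e splunkforwarder` and reinstall, the directories under `/opt/splunkforwarder` may not be writable by that account. A minimal diagnostic sketch, assuming the `SYSTEM` account and the paths from the log; the `chown` remediation is an assumption, not something the log confirms:

```shell
# Return success only if the directory exists and the invoking user can write to it
writable_by_current_user() {
  [ -d "$1" ] && [ -w "$1" ]
}

# Probe the two locations the forwarder failed to create (paths from the log);
# run this as the SYSTEM user (e.g. via `sudo -u SYSTEM sh`) to reproduce its view
for d in /opt/splunkforwarder/var /opt/splunkforwarder/etc/licenses; do
  if ! writable_by_current_user "$d"; then
    echo "not writable (or missing): $d"
  fi
done

# Assumed remediation before rerunning chef-client:
#   chown -R SYSTEM:SYSTEM /opt/splunkforwarder
```

If the probe reports the directories as unwritable for `SYSTEM`, restoring ownership of the tree to the forwarder's service account and rerunning `chef-client` would be the next step to try.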