
@jpeppercorn
Created April 24, 2018 20:10
logstash-plain.log
[2018-04-24T16:04:37,837][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2018-04-24T16:04:37,841][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x20c85d69 @module_name="fb_apache", @directory="/usr/share/logstash/modules/fb_apache/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2018-04-24T16:04:37,842][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2018-04-24T16:04:37,842][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x5e25b1dc @module_name="netflow", @directory="/usr/share/logstash/modules/netflow/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2018-04-24T16:04:37,857][DEBUG][logstash.plugins.registry] Executing hooks {:name=>"", :type=>"pack", :hooks_file=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.2.3-java/lib/logstash_registry.rb"}
[2018-04-24T16:04:37,996][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"metrics", :type=>:input, :class=>LogStash::Inputs::Metrics}
[2018-04-24T16:04:37,997][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"monitoring", :type=>:universal, :class=>LogStash::MonitoringExtension}
[2018-04-24T16:04:37,997][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"config_management", :type=>:universal, :class=>LogStash::ConfigManagement::Extension}
[2018-04-24T16:04:37,998][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"arcsight", :directory=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.2.3-java/modules/arcsight/configuration"}
[2018-04-24T16:04:37,998][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"arcsight", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0xf9390ab @module_name="arcsight", @directory="/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.2.3-java/modules/arcsight/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2018-04-24T16:04:38,022][INFO ][logstash.configmanagement.bootstrapcheck] Using Elasticsearch as config store {:pipeline_id=>["main"], :poll_interval=>"5000000000ns"}
[2018-04-24T16:04:38,023][DEBUG][logstash.configmanagement.hooks] Removing the `Logstash::Config::Source::Local` and replacing it with `ElasticsearchSource`
[2018-04-24T16:04:38,065][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"plain", :type=>"codec", :class=>LogStash::Codecs::Plain}
[2018-04-24T16:04:38,074][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_e6b76bbe-e1e5-4212-8ed2-a94b93d84f05"
[2018-04-24T16:04:38,074][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T16:04:38,074][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T16:04:38,078][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://10.54.52.31:9200]
[2018-04-24T16:04:38,078][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_writer"
[2018-04-24T16:04:38,078][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2018-04-24T16:04:38,078][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2018-04-24T16:04:38,078][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@cacert = "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T16:04:38,078][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl = true
[2018-04-24T16:04:38,078][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "elasticsearch_66a5fc99-8a90-4748-a155-d215b3915e06"
[2018-04-24T16:04:38,078][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2018-04-24T16:04:38,079][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_e6b76bbe-e1e5-4212-8ed2-a94b93d84f05", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T16:04:38,079][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2018-04-24T16:04:38,079][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "logstash-%{+YYYY.MM.dd}"
[2018-04-24T16:04:38,079][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2018-04-24T16:04:38,080][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2018-04-24T16:04:38,080][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2018-04-24T16:04:38,080][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2018-04-24T16:04:38,080][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2018-04-24T16:04:38,080][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2018-04-24T16:04:38,080][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2018-04-24T16:04:38,080][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2018-04-24T16:04:38,080][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2018-04-24T16:04:38,080][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2018-04-24T16:04:38,080][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2018-04-24T16:04:38,080][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2018-04-24T16:04:38,080][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2018-04-24T16:04:38,081][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2018-04-24T16:04:38,081][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2018-04-24T16:04:38,081][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2018-04-24T16:04:38,081][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2018-04-24T16:04:38,081][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2018-04-24T16:04:38,081][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2018-04-24T16:04:38,081][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2018-04-24T16:04:38,081][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2018-04-24T16:04:38,081][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2018-04-24T16:04:38,081][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2018-04-24T16:04:38,081][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2018-04-24T16:04:38,081][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2018-04-24T16:04:38,082][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2018-04-24T16:04:38,083][DEBUG][logstash.licensechecker.licensereader] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-04-24T16:04:38,264][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_writer:xxxxxx@10.54.52.31:9200/]}}
[2018-04-24T16:04:38,266][INFO ][logstash.licensechecker.licensereader] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://logstash_writer:xxxxxx@10.54.52.31:9200/, :path=>"/"}
[2018-04-24T16:04:38,511][WARN ][logstash.licensechecker.licensereader] Restored connection to ES instance {:url=>"https://logstash_writer:xxxxxx@10.54.52.31:9200/"}
[2018-04-24T16:04:38,542][INFO ][logstash.licensechecker.licensereader] ES Output version determined {:es_version=>6}
[2018-04-24T16:04:38,543][WARN ][logstash.licensechecker.licensereader] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-24T16:04:38,560][DEBUG][logstash.licensechecker.licensemanager] updating observers of xpack info change
[2018-04-24T16:04:38,571][DEBUG][logstash.configmanagement.elasticsearchsource] updating licensing state installed:true,
license:{"status"=>"active", "uid"=>"9e6cfba8-7c54-4f36-9ab0-3d4dd63415ee", "type"=>"platinum", "issue_date"=>"2017-07-31T00:00:00.000Z", "issue_date_in_millis"=>1501459200000, "expiry_date"=>"2018-07-31T23:59:59.999Z", "expiry_date_in_millis"=>1533081599999, "max_nodes"=>23, "issued_to"=>"XPO Logistics, Inc. (non-production environments)", "issuer"=>"Aunik Bhattacharjee", "start_date_in_millis"=>1501459200000},
last_updated:}
[2018-04-24T16:04:38,571][INFO ][logstash.configmanagement.elasticsearchsource] Configuration Management License OK
[2018-04-24T16:04:38,571][DEBUG][logstash.config.sourceloader] Adding source {:source=>"#<LogStash::ConfigManagement::ElasticsearchSource:0x7406db58>"}
[2018-04-24T16:04:38,741][DEBUG][logstash.runner ] -------- Logstash Settings (* means modified) ---------
[2018-04-24T16:04:38,741][DEBUG][logstash.runner ] node.name: "LMUWU0438"
[2018-04-24T16:04:38,741][DEBUG][logstash.runner ] *path.data: "/var/lib/logstash" (default: "/usr/share/logstash/data")
[2018-04-24T16:04:38,741][DEBUG][logstash.runner ] modules.cli: []
[2018-04-24T16:04:38,741][DEBUG][logstash.runner ] modules: []
[2018-04-24T16:04:38,741][DEBUG][logstash.runner ] modules_setup: false
[2018-04-24T16:04:38,741][DEBUG][logstash.runner ] config.test_and_exit: false
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] *config.reload.automatic: true (default: false)
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] *config.reload.interval: 5000000000 (default: 3000000000)
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] config.support_escapes: false
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] metric.collect: true
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] pipeline.id: "main"
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] pipeline.system: false
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] pipeline.workers: 4
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] pipeline.output.workers: 1
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] pipeline.batch.size: 125
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] pipeline.batch.delay: 50
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] pipeline.unsafe_shutdown: false
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] pipeline.java_execution: false
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] pipeline.reloadable: true
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] path.plugins: []
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] config.debug: false
[2018-04-24T16:04:38,742][DEBUG][logstash.runner ] *log.level: "debug" (default: "info")
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] version: false
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] help: false
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] log.format: "plain"
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] http.host: "127.0.0.1"
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] http.port: 9600..9700
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] http.environment: "production"
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] queue.type: "memory"
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] queue.drain: false
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] queue.page_capacity: 67108864
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] queue.max_bytes: 1073741824
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] queue.max_events: 0
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] queue.checkpoint.acks: 1024
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] queue.checkpoint.writes: 1024
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] queue.checkpoint.interval: 1000
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] dead_letter_queue.enable: false
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] dead_letter_queue.max_bytes: 1073741824
[2018-04-24T16:04:38,743][DEBUG][logstash.runner ] slowlog.threshold.warn: -1
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] slowlog.threshold.info: -1
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] slowlog.threshold.debug: -1
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] slowlog.threshold.trace: -1
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] *keystore.file: "/etc/logstash/logstash.keystore" (default: "/usr/share/logstash/config/logstash.keystore")
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] *path.queue: "/var/lib/logstash/queue" (default: "/usr/share/logstash/data/queue")
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] *path.dead_letter_queue: "/var/lib/logstash/dead_letter_queue" (default: "/usr/share/logstash/data/dead_letter_queue")
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] *path.settings: "/etc/logstash" (default: "/usr/share/logstash/config")
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] *path.logs: "/var/log/logstash" (default: "/usr/share/logstash/logs")
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] xpack.monitoring.enabled: true
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] *xpack.monitoring.elasticsearch.url: ["https://10.54.52.31:9200"] (default: ["http://localhost:9200"])
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] xpack.monitoring.collection.interval: 10000000000
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] xpack.monitoring.collection.timeout_interval: 600000000000
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] *xpack.monitoring.elasticsearch.username: "logstash_writer" (default: "logstash_system")
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] *xpack.monitoring.elasticsearch.password: "lastmile"
[2018-04-24T16:04:38,744][DEBUG][logstash.runner ] *xpack.monitoring.elasticsearch.ssl.ca: "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T16:04:38,745][DEBUG][logstash.runner ] *xpack.monitoring.elasticsearch.ssl.truststore.password: "lastmile"
[2018-04-24T16:04:38,745][DEBUG][logstash.runner ] *xpack.monitoring.elasticsearch.ssl.keystore.password: "lastmile"
[2018-04-24T16:04:38,745][DEBUG][logstash.runner ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2018-04-24T16:04:38,745][DEBUG][logstash.runner ] xpack.monitoring.elasticsearch.sniffing: false
[2018-04-24T16:04:38,745][DEBUG][logstash.runner ] xpack.monitoring.collection.pipeline.details.enabled: true
[2018-04-24T16:04:38,745][DEBUG][logstash.runner ] xpack.monitoring.collection.config.enabled: true
[2018-04-24T16:04:38,745][DEBUG][logstash.runner ] node.uuid: ""
[2018-04-24T16:04:38,745][DEBUG][logstash.runner ] *xpack.management.enabled: true (default: false)
[2018-04-24T16:04:38,745][DEBUG][logstash.runner ] xpack.management.logstash.poll_interval: 5000000000
[2018-04-24T16:04:38,745][DEBUG][logstash.runner ] xpack.management.pipeline.id: ["main"]
[2018-04-24T16:04:38,745][DEBUG][logstash.runner ] *xpack.management.elasticsearch.username: "logstash_writer" (default: "logstash_system")
[2018-04-24T16:04:38,745][DEBUG][logstash.runner ] *xpack.management.elasticsearch.password: "lastmile"
[2018-04-24T16:04:38,746][DEBUG][logstash.runner ] *xpack.management.elasticsearch.url: ["https://10.54.52.31:9200"] (default: ["https://localhost:9200"])
[2018-04-24T16:04:38,746][DEBUG][logstash.runner ] *xpack.management.elasticsearch.ssl.ca: "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T16:04:38,746][DEBUG][logstash.runner ] xpack.management.elasticsearch.sniffing: false
[2018-04-24T16:04:38,746][DEBUG][logstash.runner ] --------------- Logstash Settings -------------------
[2018-04-24T16:04:38,765][DEBUG][logstash.agent ] Setting up metric collection
[2018-04-24T16:04:38,770][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-04-24T16:04:38,791][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-04-24T16:04:38,813][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-04-24T16:04:38,814][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-04-24T16:04:38,817][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-04-24T16:04:38,819][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-04-24T16:04:38,835][DEBUG][logstash.monitoringextension.pipelineregisterhook] compiled metrics pipeline config: {:config=>"# [2017] Elasticsearch Incorporated. All Rights Reserved.\n#\n# NOTICE: All information contained herein is, and remains\n# the property of Elasticsearch Incorporated and its suppliers,\n# if any. The intellectual and technical concepts contained\n# herein are proprietary to Elasticsearch Incorporated\n# and its suppliers and may be covered by U.S. and Foreign Patents,\n# patents in process, and are protected by trade secret or copyright law.\n# Dissemination of this information or reproduction of this material\n# is strictly forbidden unless prior written permission is obtained\n# from Elasticsearch Incorporated.\n#\ninput {\n metrics {\n collection_interval => 10\n collection_timeout_interval => 600\n extended_performance_collection => true\n config_collection => true\n }\n}\noutput {\n elasticsearch {\n hosts => [\"https://10.54.52.31:9200\"]\n bulk_path => \"/_xpack/monitoring/_bulk?system_id=logstash&system_api_version=2&interval=1s\"\n manage_template => false\n document_type => \"%{[@metadata][document_type]}\"\n index => \"\"\n sniffing => false\n \n user => \"logstash_writer\"\n password => \"lastmile\"\n \n \n ssl => true\n \n cacert => \"/etc/logstash/cert/LMUWU0438.pem\"\n \n \n \n \n }\n}\n"}
[2018-04-24T16:04:38,842][DEBUG][logstash.config.sourceloader] Adding source {:source=>"#<LogStash::Monitoring::InternalPipelineSource:0x602b9b05>"}
[2018-04-24T16:04:38,844][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2018-04-24T16:04:38,847][DEBUG][logstash.agent ] Starting agent
[2018-04-24T16:04:38,850][DEBUG][logstash.agent ] Starting puma
[2018-04-24T16:04:38,851][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2018-04-24T16:04:38,865][DEBUG][logstash.api.service ] [api-service] start
[2018-04-24T16:04:38,897][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-04-24T16:04:38,991][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_2b120c92-e024-4bab-ae27-746a0a0d65b7"
[2018-04-24T16:04:38,991][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T16:04:38,991][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T16:04:38,997][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://10.54.52.31:9200]
[2018-04-24T16:04:39,003][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_writer"
[2018-04-24T16:04:39,003][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2018-04-24T16:04:39,003][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2018-04-24T16:04:39,003][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@cacert = "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T16:04:39,003][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl = true
[2018-04-24T16:04:39,003][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "elasticsearch_70b82df6-2dd4-430d-950d-99659fc1d62e"
[2018-04-24T16:04:39,004][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2018-04-24T16:04:39,005][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_2b120c92-e024-4bab-ae27-746a0a0d65b7", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T16:04:39,005][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2018-04-24T16:04:39,005][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "logstash-%{+YYYY.MM.dd}"
[2018-04-24T16:04:39,005][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2018-04-24T16:04:39,005][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2018-04-24T16:04:39,006][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2018-04-24T16:04:39,006][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2018-04-24T16:04:39,006][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2018-04-24T16:04:39,006][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2018-04-24T16:04:39,006][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2018-04-24T16:04:39,006][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2018-04-24T16:04:39,006][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2018-04-24T16:04:39,006][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2018-04-24T16:04:39,006][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2018-04-24T16:04:39,006][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2018-04-24T16:04:39,007][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2018-04-24T16:04:39,007][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2018-04-24T16:04:39,007][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2018-04-24T16:04:39,007][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2018-04-24T16:04:39,007][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2018-04-24T16:04:39,008][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2018-04-24T16:04:39,008][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2018-04-24T16:04:39,008][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2018-04-24T16:04:39,008][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2018-04-24T16:04:39,008][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2018-04-24T16:04:39,008][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2018-04-24T16:04:39,008][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2018-04-24T16:04:39,008][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2018-04-24T16:04:39,008][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2018-04-24T16:04:39,010][DEBUG][logstash.configmanagement.elasticsearchsource] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-04-24T16:04:39,023][INFO ][logstash.configmanagement.elasticsearchsource] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_writer:xxxxxx@10.54.52.31:9200/]}}
[2018-04-24T16:04:39,025][INFO ][logstash.configmanagement.elasticsearchsource] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://logstash_writer:xxxxxx@10.54.52.31:9200/, :path=>"/"}
[2018-04-24T16:04:39,055][WARN ][logstash.configmanagement.elasticsearchsource] Restored connection to ES instance {:url=>"https://logstash_writer:xxxxxx@10.54.52.31:9200/"}
[2018-04-24T16:04:39,059][INFO ][logstash.configmanagement.elasticsearchsource] ES Output version determined {:es_version=>6}
[2018-04-24T16:04:39,059][WARN ][logstash.configmanagement.elasticsearchsource] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-24T16:04:39,091][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>2}
[2018-04-24T16:04:39,093][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:.monitoring-logstash}
[2018-04-24T16:04:39,317][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_7a2b95e9-e45d-49d6-820a-7439789967c4"
[2018-04-24T16:04:39,317][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T16:04:39,317][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T16:04:39,318][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@collection_interval = 10
[2018-04-24T16:04:39,318][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@collection_timeout_interval = 600
[2018-04-24T16:04:39,318][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@extended_performance_collection = "true"
[2018-04-24T16:04:39,318][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@config_collection = "true"
[2018-04-24T16:04:39,318][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@id = "cdf1ea97fc5684fe6e369cd88ce9e30114cd1f2f93adfad2101c922d259ef5b4"
[2018-04-24T16:04:39,318][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@enable_metric = true
[2018-04-24T16:04:39,318][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@codec = <LogStash::Codecs::Plain id=>"plain_7a2b95e9-e45d-49d6-820a-7439789967c4", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T16:04:39,318][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@add_field = {}
[2018-04-24T16:04:39,319][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"elasticsearch", :type=>"output", :class=>LogStash::Outputs::ElasticSearch}
[2018-04-24T16:04:39,325][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_25df1d5c-28f9-4a71-a4d0-51ac9a9a1e7d"
[2018-04-24T16:04:39,325][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T16:04:39,325][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T16:04:39,327][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch hosts=>[https://10.54.52.31:9200], bulk_path=>"/_xpack/monitoring/_bulk?system_id=logstash&system_api_version=2&interval=1s", manage_template=>false, document_type=>"%{[@metadata][document_type]}", sniffing=>false, user=>"logstash_writer", password=><password>, ssl=>true, cacert=>"/etc/logstash/cert/LMUWU0438.pem", id=>"bbab00ed38755642d2fb5362673e378c4205645a5c0dee2bf5d0df95fdbcd636", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_25df1d5c-28f9-4a71-a4d0-51ac9a9a1e7d", enable_metric=>true, charset=>"UTF-8">, workers=>1, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-04-24T16:04:39,327][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://10.54.52.31:9200]
[2018-04-24T16:04:39,328][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@bulk_path = "/_xpack/monitoring/_bulk?system_id=logstash&system_api_version=2&interval=1s"
[2018-04-24T16:04:39,328][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = false
[2018-04-24T16:04:39,328][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@document_type = "%{[@metadata][document_type]}"
[2018-04-24T16:04:39,328][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = ""
[2018-04-24T16:04:39,328][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2018-04-24T16:04:39,328][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_writer"
[2018-04-24T16:04:39,328][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2018-04-24T16:04:39,329][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl = true
[2018-04-24T16:04:39,329][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@cacert = "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T16:04:39,329][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "bbab00ed38755642d2fb5362673e378c4205645a5c0dee2bf5d0df95fdbcd636"
[2018-04-24T16:04:39,329][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2018-04-24T16:04:39,329][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_25df1d5c-28f9-4a71-a4d0-51ac9a9a1e7d", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T16:04:39,329][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2018-04-24T16:04:39,329][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2018-04-24T16:04:39,329][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2018-04-24T16:04:39,329][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2018-04-24T16:04:39,329][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2018-04-24T16:04:39,329][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2018-04-24T16:04:39,329][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2018-04-24T16:04:39,329][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2018-04-24T16:04:39,329][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2018-04-24T16:04:39,330][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2018-04-24T16:04:39,331][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2018-04-24T16:04:39,331][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2018-04-24T16:04:39,332][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50}
[2018-04-24T16:04:39,336][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-04-24T16:04:39,349][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_writer:xxxxxx@10.54.52.31:9200/]}}
[2018-04-24T16:04:39,350][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://logstash_writer:xxxxxx@10.54.52.31:9200/, :path=>"/"}
[2018-04-24T16:04:39,398][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://logstash_writer:xxxxxx@10.54.52.31:9200/"}
[2018-04-24T16:04:39,402][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-04-24T16:04:39,402][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-24T16:04:39,404][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://10.54.52.31:9200"]}
[2018-04-24T16:04:39,414][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_2e11fd06-c3d0-4a66-ad0d-57c38af355f1"
[2018-04-24T16:04:39,414][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T16:04:39,414][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T16:04:39,417][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://10.54.52.31:9200]
[2018-04-24T16:04:39,417][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_writer"
[2018-04-24T16:04:39,417][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2018-04-24T16:04:39,417][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2018-04-24T16:04:39,417][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@cacert = "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T16:04:39,417][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl = true
[2018-04-24T16:04:39,417][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "elasticsearch_7c1e015a-987b-49c9-921f-619499bdb232"
[2018-04-24T16:04:39,417][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2018-04-24T16:04:39,418][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_2e11fd06-c3d0-4a66-ad0d-57c38af355f1", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T16:04:39,418][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2018-04-24T16:04:39,418][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "logstash-%{+YYYY.MM.dd}"
[2018-04-24T16:04:39,418][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2018-04-24T16:04:39,418][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2018-04-24T16:04:39,418][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2018-04-24T16:04:39,418][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2018-04-24T16:04:39,418][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2018-04-24T16:04:39,419][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2018-04-24T16:04:39,419][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2018-04-24T16:04:39,419][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2018-04-24T16:04:39,419][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2018-04-24T16:04:39,419][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2018-04-24T16:04:39,419][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2018-04-24T16:04:39,419][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2018-04-24T16:04:39,419][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2018-04-24T16:04:39,419][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2018-04-24T16:04:39,419][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2018-04-24T16:04:39,419][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2018-04-24T16:04:39,419][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2018-04-24T16:04:39,419][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2018-04-24T16:04:39,420][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2018-04-24T16:04:39,420][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2018-04-24T16:04:39,420][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2018-04-24T16:04:39,420][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2018-04-24T16:04:39,420][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2018-04-24T16:04:39,420][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2018-04-24T16:04:39,420][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2018-04-24T16:04:39,420][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2018-04-24T16:04:39,421][DEBUG][logstash.licensechecker.licensereader] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-04-24T16:04:39,425][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_writer:xxxxxx@10.54.52.31:9200/]}}
[2018-04-24T16:04:39,426][INFO ][logstash.licensechecker.licensereader] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://logstash_writer:xxxxxx@10.54.52.31:9200/, :path=>"/"}
[2018-04-24T16:04:39,443][WARN ][logstash.licensechecker.licensereader] Restored connection to ES instance {:url=>"https://logstash_writer:xxxxxx@10.54.52.31:9200/"}
[2018-04-24T16:04:39,446][INFO ][logstash.licensechecker.licensereader] ES Output version determined {:es_version=>6}
[2018-04-24T16:04:39,447][WARN ][logstash.licensechecker.licensereader] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-24T16:04:39,456][DEBUG][logstash.licensechecker.licensemanager] updating observers of xpack info change
[2018-04-24T16:04:39,457][DEBUG][logstash.inputs.metrics ] updating licensing state installed:true,
license:{"status"=>"active", "uid"=>"9e6cfba8-7c54-4f36-9ab0-3d4dd63415ee", "type"=>"platinum", "issue_date"=>"2017-07-31T00:00:00.000Z", "issue_date_in_millis"=>1501459200000, "expiry_date"=>"2018-07-31T23:59:59.999Z", "expiry_date_in_millis"=>1533081599999, "max_nodes"=>23, "issued_to"=>"XPO Logistics, Inc. (non-production environments)", "issuer"=>"Aunik Bhattacharjee", "start_date_in_millis"=>1501459200000},
last_updated:}
[2018-04-24T16:04:39,462][DEBUG][logstash.inputs.metrics ] Metric: input started
[2018-04-24T16:04:39,467][INFO ][logstash.pipeline ] Pipeline started succesfully {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x35cf001e@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 sleep>"}
[2018-04-24T16:04:39,473][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2018-04-24T16:04:40,402][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"beats", :type=>"input", :class=>LogStash::Inputs::Beats}
[2018-04-24T16:04:40,422][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_6393b589-eb35-43ec-b3a5-ea66a9297d7d"
[2018-04-24T16:04:40,423][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T16:04:40,423][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T16:04:40,424][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@port = 5044
[2018-04-24T16:04:40,424][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@id = "0740b5540d560fb13a000edf7e31d3e6418a27a943d2994d87a96282570c2f10"
[2018-04-24T16:04:40,424][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@enable_metric = true
[2018-04-24T16:04:40,424][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@codec = <LogStash::Codecs::Plain id=>"plain_6393b589-eb35-43ec-b3a5-ea66a9297d7d", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T16:04:40,424][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@add_field = {}
[2018-04-24T16:04:40,425][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@host = "0.0.0.0"
[2018-04-24T16:04:40,425][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl = false
[2018-04-24T16:04:40,425][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_certificate_authorities = []
[2018-04-24T16:04:40,425][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_verify_mode = "none"
[2018-04-24T16:04:40,425][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@include_codec_tag = true
[2018-04-24T16:04:40,425][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_handshake_timeout = 10000
[2018-04-24T16:04:40,425][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@tls_min_version = 1
[2018-04-24T16:04:40,435][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@tls_max_version = 1.2
[2018-04-24T16:04:40,435][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@cipher_suites = ["TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256"]
[2018-04-24T16:04:40,435][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@client_inactivity_timeout = 60
[2018-04-24T16:04:40,435][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@executor_threads = 4
[2018-04-24T16:04:40,444][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"mutate", :type=>"filter", :class=>LogStash::Filters::Mutate}
[2018-04-24T16:04:40,452][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_field = ["[beat][hostname]", "[beat][name]"]
[2018-04-24T16:04:40,453][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@id = "0f1b71bed12899aaa2102ad1b5e3a7b1d9fe53717a30ca01d8cd88453aca8064"
[2018-04-24T16:04:40,453][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@enable_metric = true
[2018-04-24T16:04:40,453][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_tag = []
[2018-04-24T16:04:40,453][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_tag = []
[2018-04-24T16:04:40,453][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_field = {}
[2018-04-24T16:04:40,453][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@periodic_flush = false
[2018-04-24T16:04:40,456][DEBUG][logstash.filters.mutate ] Replacing `${HOSTNAME}` with actual value
[2018-04-24T16:04:40,460][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2018-04-24T16:04:40,464][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to load or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2018-04-24T16:04:40,470][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::OrgLogstashSecretStore::SecretStoreException::AccessException", :message=>"Could not determine keystore password. Please ensure the file at /etc/logstash/logstash.keystore is a valid Logstash keystore", :backtrace=>["org.logstash.secret.store.backend.JavaKeyStore.getKeyStorePassword(org/logstash/secret/store/backend/JavaKeyStore.java:198)", "org.logstash.secret.store.backend.JavaKeyStore.init(org/logstash/secret/store/backend/JavaKeyStore.java:207)", "org.logstash.secret.store.backend.JavaKeyStore.load(org/logstash/secret/store/backend/JavaKeyStore.java:254)", "org.logstash.secret.store.backend.JavaKeyStore.load(org/logstash/secret/store/backend/JavaKeyStore.java:40)", "org.logstash.secret.store.SecretStoreFactory.doIt(org/logstash/secret/store/SecretStoreFactory.java:79)", "org.logstash.secret.store.SecretStoreFactory.load(org/logstash/secret/store/SecretStoreFactory.java:65)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method)", "org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:453)", "org.jruby.javasupport.JavaMethod.invokeStaticDirect(org/jruby/javasupport/JavaMethod.java:365)", "RUBY.get_if_exists(/usr/share/logstash/logstash-core/lib/logstash/util/secretstore.rb:28)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther11:get_if_exists(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:46)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.block in replace_placeholders(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:46)", "org.jruby.RubyString.gsubCommon19(org/jruby/RubyString.java:2629)", "org.jruby.RubyString.gsubCommon19(org/jruby/RubyString.java:2583)", "org.jruby.RubyString.gsub(org/jruby/RubyString.java:2541)", "org.jruby.RubyString$INVOKER$i$gsub19.call(org/jruby/RubyString$INVOKER$i$gsub19.gen)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther27:gsub(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:36)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.replace_placeholders(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:36)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther12:replace_placeholders(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:24)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.deep_replace(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:24)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther4:deep_replace(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:21)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.block in deep_replace(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:21)", "org.jruby.RubyArray.eachIndex(org/jruby/RubyArray.java:1790)", 
"org.jruby.RubyArray.each_index(org/jruby/RubyArray.java:1797)", "org.jruby.RubyArray$INVOKER$i$0$0$each_index.call(org/jruby/RubyArray$INVOKER$i$0$0$each_index.gen)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther11:each_index(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:20)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.deep_replace(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:20)", "usr.share.logstash.logstash_minus_core.lib.logstash.config.mixin.invokeOther1:deep_replace(usr/share/logstash/logstash_minus_core/lib/logstash/config//usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:85)", "usr.share.logstash.logstash_minus_core.lib.logstash.config.mixin.block in config_init(/usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:85)", "org.jruby.RubyHash$12.visit(org/jruby/RubyHash.java:1362)", "org.jruby.RubyHash$12.visit(org/jruby/RubyHash.java:1359)", "org.jruby.RubyHash.visitLimited(org/jruby/RubyHash.java:662)", "org.jruby.RubyHash.visitAll(org/jruby/RubyHash.java:647)", "org.jruby.RubyHash.iteratorVisitAll(org/jruby/RubyHash.java:1319)", "org.jruby.RubyHash.each_pairCommon(org/jruby/RubyHash.java:1354)", "org.jruby.RubyHash.each(org/jruby/RubyHash.java:1343)", "org.jruby.RubyHash$INVOKER$i$0$0$each.call(org/jruby/RubyHash$INVOKER$i$0$0$each.gen)", "RUBY.config_init(/usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:84)", "RUBY.initialize(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:128)", "org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:1001)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)", "RUBY.initialize(/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:22)", "org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:1022)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)", "RUBY.plugin(/usr/share/logstash/logstash-core/lib/logstash/plugins/plugin_factory.rb:87)", "RUBY.plugin(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:112)", "RUBY.<eval>((eval):41)", "org.jruby.RubyKernel.evalCommon(org/jruby/RubyKernel.java:1027)", "org.jruby.RubyKernel.eval(org/jruby/RubyKernel.java:994)", "org.jruby.RubyKernel$INVOKER$s$0$3$eval19.call(org/jruby/RubyKernel$INVOKER$s$0$3$eval19.gen)", "RUBY.initialize(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:84)", "RUBY.initialize(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:169)", "org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:1022)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)", "RUBY.execute(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40)", "RUBY.block in converge_state(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:315)", "RUBY.with_pipelines(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141)", "RUBY.block in converge_state(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1734)", "org.jruby.RubyArray$INVOKER$i$0$0$each.call(org/jruby/RubyArray$INVOKER$i$0$0$each.gen)", "RUBY.converge_state(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299)", "RUBY.block in converge_state_and_update(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166)", 
"RUBY.with_pipelines(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141)", "RUBY.converge_state_and_update(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164)", "RUBY.execute(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:90)", "RUBY.block in execute(/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:289)", "org.jruby.RubyProc.call19(org/jruby/RubyProc.java:273)", "org.jruby.RubyProc$INVOKER$i$0$0$call19.call(org/jruby/RubyProc$INVOKER$i$0$0$call19.gen)", "RUBY.block in initialize(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:289)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:246)", "java.lang.Thread.run(java/lang/Thread)"]}
[2018-04-24T16:04:40,471][ERROR][logstash.agent ] An exception happened when converging configuration {:exception=>LogStash::Error, :message=>"Don't know how to handle `Java::OrgLogstashSecretStore::SecretStoreException::AccessException` for `LogStash::PipelineAction::Create/pipeline_id:main`", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/converge_result.rb:27:in `create'", "/usr/share/logstash/logstash-core/lib/logstash/converge_result.rb:67:in `add'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:327:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:90:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
[2018-04-24T16:04:43,831][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-04-24T16:04:43,832][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-04-24T16:04:44,461][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x35cf001e@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 sleep>"}
[2018-04-24T16:04:45,482][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>1}
[2018-04-24T16:04:45,482][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2018-04-24T16:04:46,127][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_a296ee22-932a-4f22-b3c7-16a5c17bf8ae"
[2018-04-24T16:04:46,127][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T16:04:46,127][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T16:04:46,128][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@port = 5044
[2018-04-24T16:04:46,128][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@id = "0740b5540d560fb13a000edf7e31d3e6418a27a943d2994d87a96282570c2f10"
[2018-04-24T16:04:46,128][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@enable_metric = true
[2018-04-24T16:04:46,129][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@codec = <LogStash::Codecs::Plain id=>"plain_a296ee22-932a-4f22-b3c7-16a5c17bf8ae", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T16:04:46,129][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@add_field = {}
[2018-04-24T16:04:46,129][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@host = "0.0.0.0"
[2018-04-24T16:04:46,129][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl = false
[2018-04-24T16:04:46,129][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_certificate_authorities = []
[2018-04-24T16:04:46,129][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_verify_mode = "none"
[2018-04-24T16:04:46,129][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@include_codec_tag = true
[2018-04-24T16:04:46,130][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_handshake_timeout = 10000
[2018-04-24T16:04:46,130][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@tls_min_version = 1
[2018-04-24T16:04:46,130][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@tls_max_version = 1.2
[2018-04-24T16:04:46,130][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@cipher_suites = ["TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256"]
[2018-04-24T16:04:46,130][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@client_inactivity_timeout = 60
[2018-04-24T16:04:46,130][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@executor_threads = 4
[2018-04-24T16:04:46,132][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_field = ["[beat][hostname]", "[beat][name]"]
[2018-04-24T16:04:46,132][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@id = "0f1b71bed12899aaa2102ad1b5e3a7b1d9fe53717a30ca01d8cd88453aca8064"
[2018-04-24T16:04:46,132][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@enable_metric = true
[2018-04-24T16:04:46,132][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_tag = []
[2018-04-24T16:04:46,132][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_tag = []
[2018-04-24T16:04:46,132][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_field = {}
[2018-04-24T16:04:46,132][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@periodic_flush = false
[2018-04-24T16:04:46,133][DEBUG][logstash.filters.mutate ] Replacing `${HOSTNAME}` with actual value
[2018-04-24T16:04:46,133][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2018-04-24T16:04:46,134][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to load or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2018-04-24T16:04:46,136][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::OrgLogstashSecretStore::SecretStoreException::AccessException", :message=>"Could not determine keystore password. Please ensure the file at /etc/logstash/logstash.keystore is a valid Logstash keystore", :backtrace=>["org.logstash.secret.store.backend.JavaKeyStore.getKeyStorePassword(org/logstash/secret/store/backend/JavaKeyStore.java:198)", "org.logstash.secret.store.backend.JavaKeyStore.init(org/logstash/secret/store/backend/JavaKeyStore.java:207)", "org.logstash.secret.store.backend.JavaKeyStore.load(org/logstash/secret/store/backend/JavaKeyStore.java:254)", "org.logstash.secret.store.backend.JavaKeyStore.load(org/logstash/secret/store/backend/JavaKeyStore.java:40)", "org.logstash.secret.store.SecretStoreFactory.doIt(org/logstash/secret/store/SecretStoreFactory.java:79)", "org.logstash.secret.store.SecretStoreFactory.load(org/logstash/secret/store/SecretStoreFactory.java:65)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method)", "org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:453)", "org.jruby.javasupport.JavaMethod.invokeStaticDirect(org/jruby/javasupport/JavaMethod.java:365)", "RUBY.get_if_exists(/usr/share/logstash/logstash-core/lib/logstash/util/secretstore.rb:28)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther11:get_if_exists(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:46)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.block in replace_placeholders(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:46)", "org.jruby.RubyString.gsubCommon19(org/jruby/RubyString.java:2629)", "org.jruby.RubyString.gsubCommon19(org/jruby/RubyString.java:2583)", "org.jruby.RubyString.gsub(org/jruby/RubyString.java:2541)", "org.jruby.RubyString$INVOKER$i$gsub19.call(org/jruby/RubyString$INVOKER$i$gsub19.gen)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther27:gsub(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:36)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.replace_placeholders(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:36)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther12:replace_placeholders(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:24)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.deep_replace(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:24)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther4:deep_replace(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:21)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.block in deep_replace(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:21)", "org.jruby.RubyArray.eachIndex(org/jruby/RubyArray.java:1790)", 
"org.jruby.RubyArray.each_index(org/jruby/RubyArray.java:1797)", "org.jruby.RubyArray$INVOKER$i$0$0$each_index.call(org/jruby/RubyArray$INVOKER$i$0$0$each_index.gen)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther11:each_index(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:20)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.deep_replace(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:20)", "usr.share.logstash.logstash_minus_core.lib.logstash.config.mixin.invokeOther1:deep_replace(usr/share/logstash/logstash_minus_core/lib/logstash/config//usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:85)", "usr.share.logstash.logstash_minus_core.lib.logstash.config.mixin.block in config_init(/usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:85)", "org.jruby.RubyHash$12.visit(org/jruby/RubyHash.java:1362)", "org.jruby.RubyHash$12.visit(org/jruby/RubyHash.java:1359)", "org.jruby.RubyHash.visitLimited(org/jruby/RubyHash.java:662)", "org.jruby.RubyHash.visitAll(org/jruby/RubyHash.java:647)", "org.jruby.RubyHash.iteratorVisitAll(org/jruby/RubyHash.java:1319)", "org.jruby.RubyHash.each_pairCommon(org/jruby/RubyHash.java:1354)", "org.jruby.RubyHash.each(org/jruby/RubyHash.java:1343)", "org.jruby.RubyHash$INVOKER$i$0$0$each.call(org/jruby/RubyHash$INVOKER$i$0$0$each.gen)", "RUBY.config_init(/usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:84)", "RUBY.initialize(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:128)", "org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:1001)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)", "RUBY.initialize(/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:22)", "org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:1022)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)", "RUBY.plugin(/usr/share/logstash/logstash-core/lib/logstash/plugins/plugin_factory.rb:87)", "RUBY.plugin(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:112)", "RUBY.<eval>((eval):41)", "org.jruby.RubyKernel.evalCommon(org/jruby/RubyKernel.java:1027)", "org.jruby.RubyKernel.eval(org/jruby/RubyKernel.java:994)", "org.jruby.RubyKernel$INVOKER$s$0$3$eval19.call(org/jruby/RubyKernel$INVOKER$s$0$3$eval19.gen)", "RUBY.initialize(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:84)", "RUBY.initialize(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:169)", "org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:1022)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)", "RUBY.execute(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40)", "RUBY.block in converge_state(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:315)", "RUBY.with_pipelines(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141)", "RUBY.block in converge_state(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1734)", "org.jruby.RubyArray$INVOKER$i$0$0$each.call(org/jruby/RubyArray$INVOKER$i$0$0$each.gen)", "RUBY.converge_state(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299)", "RUBY.block in converge_state_and_update(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166)", 
"RUBY.with_pipelines(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141)", "RUBY.converge_state_and_update(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164)", "RUBY.block in execute(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:105)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:289)", "org.jruby.RubyProc.call19(org/jruby/RubyProc.java:273)", "org.jruby.RubyProc$INVOKER$i$0$0$call19.call(org/jruby/RubyProc$INVOKER$i$0$0$call19.gen)", "RUBY.interval(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/interval.rb:18)", "RUBY.execute(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:94)", "RUBY.block in execute(/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:289)", "org.jruby.RubyProc.call19(org/jruby/RubyProc.java:273)", "org.jruby.RubyProc$INVOKER$i$0$0$call19.call(org/jruby/RubyProc$INVOKER$i$0$0$call19.gen)", "RUBY.block in initialize(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:289)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:246)", "java.lang.Thread.run(java/lang/Thread)"]}
[2018-04-24T16:04:46,137][ERROR][logstash.agent ] An exception happened when converging configuration {:exception=>LogStash::Error, :message=>"Don't know how to handle `Java::OrgLogstashSecretStore::SecretStoreException::AccessException` for `LogStash::PipelineAction::Create/pipeline_id:main`", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/converge_result.rb:27:in `create'", "/usr/share/logstash/logstash-core/lib/logstash/converge_result.rb:67:in `add'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:327:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:105:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/interval.rb:18:in `interval'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:94:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
[2018-04-24T16:04:48,839][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-04-24T16:04:48,839][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-04-24T16:04:49,462][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x35cf001e@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 sleep>"}
[2018-04-24T16:04:49,467][INFO ][logstash.inputs.metrics ] Monitoring License OK
[2018-04-24T16:04:49,468][DEBUG][logstash.inputs.metrics ] Metrics input: received a new snapshot {:created_at=>2018-04-24 16:04:49 -0400, :snapshot=>#<LogStash::Instrument::Snapshot:0x33746c23 @metric_store=#<LogStash::Instrument::MetricStore:0x7b8ad1f @store=#<Concurrent::Map:0x00000000000fbc entries=4 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x10e75fc6>, @fast_lookup=#<Concurrent::Map:0x00000000000fc0 entries=56 default_proc=nil>>, @created_at=2018-04-24 16:04:49 -0400>}
[2018-04-24T16:04:49,487][ERROR][logstash.inputs.metrics ] Failed to create monitoring event {:message=>"undefined method `ephemeral_id' for nil:NilClass", :error=>"NoMethodError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.2.3-java/lib/monitoring/inputs/metrics/stats_event_factory.rb:124:in `fetch_node_stats'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.2.3-java/lib/monitoring/inputs/metrics/stats_event_factory.rb:29:in `make'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.2.3-java/lib/monitoring/inputs/metrics.rb:126:in `update_stats'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.2.3-java/lib/monitoring/inputs/metrics.rb:117:in `block in update'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.2.3-java/lib/license_checker/licensed.rb:76:in `with_license_check'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.2.3-java/lib/monitoring/inputs/metrics.rb:116:in `update'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.2.3-java/lib/monitoring/inputs/metrics.rb:83:in `block in configure_snapshot_poller'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/concurrent-ruby-1.0.5-java/lib/concurrent/executor/safe_task_executor.rb:24:in `block in execute'", "com/concurrent_ruby/ext/SynchronizationLibrary.java:222:in `synchronize'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/concurrent-ruby-1.0.5-java/lib/concurrent/executor/safe_task_executor.rb:19:in `execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/concurrent-ruby-1.0.5-java/lib/concurrent/timer_task.rb:309:in `execute_task'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/concurrent-ruby-1.0.5-java/lib/concurrent/executor/safe_task_executor.rb:24:in `block in execute'", "com/concurrent_ruby/ext/SynchronizationLibrary.java:222:in `synchronize'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/concurrent-ruby-1.0.5-java/lib/concurrent/executor/safe_task_executor.rb:19:in `execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/concurrent-ruby-1.0.5-java/lib/concurrent/ivar.rb:170:in `safe_execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/concurrent-ruby-1.0.5-java/lib/concurrent/scheduled_task.rb:285:in `process_task'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/concurrent-ruby-1.0.5-java/lib/concurrent/executor/timer_set.rb:168:in `block in process_tasks'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/concurrent-ruby-1.0.5-java/lib/concurrent/executor/java_executor_service.rb:94:in `run'"]}
[2018-04-24T16:04:50,481][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>1}
[2018-04-24T16:04:50,482][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2018-04-24T16:04:50,971][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_d4f8def3-4ea9-4516-ae3c-973b917108dc"
[2018-04-24T16:04:50,971][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T16:04:50,971][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T16:04:50,972][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@port = 5044
[2018-04-24T16:04:50,972][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@id = "0740b5540d560fb13a000edf7e31d3e6418a27a943d2994d87a96282570c2f10"
[2018-04-24T16:04:50,972][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@enable_metric = true
[2018-04-24T16:04:50,972][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@codec = <LogStash::Codecs::Plain id=>"plain_d4f8def3-4ea9-4516-ae3c-973b917108dc", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T16:04:50,972][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@add_field = {}
[2018-04-24T16:04:50,972][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@host = "0.0.0.0"
[2018-04-24T16:04:50,972][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl = false
[2018-04-24T16:04:50,972][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_certificate_authorities = []
[2018-04-24T16:04:50,972][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_verify_mode = "none"
[2018-04-24T16:04:50,972][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@include_codec_tag = true
[2018-04-24T16:04:50,972][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_handshake_timeout = 10000
[2018-04-24T16:04:50,972][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@tls_min_version = 1
[2018-04-24T16:04:50,973][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@tls_max_version = 1.2
[2018-04-24T16:04:50,973][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@cipher_suites = ["TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256"]
[2018-04-24T16:04:50,973][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@client_inactivity_timeout = 60
[2018-04-24T16:04:50,973][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@executor_threads = 4
[2018-04-24T16:04:50,974][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_field = ["[beat][hostname]", "[beat][name]"]
[2018-04-24T16:04:50,974][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@id = "0f1b71bed12899aaa2102ad1b5e3a7b1d9fe53717a30ca01d8cd88453aca8064"
[2018-04-24T16:04:50,974][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@enable_metric = true
[2018-04-24T16:04:50,974][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_tag = []
[2018-04-24T16:04:50,975][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_tag = []
[2018-04-24T16:04:50,975][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_field = {}
[2018-04-24T16:04:50,975][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@periodic_flush = false
[2018-04-24T16:04:50,976][DEBUG][logstash.filters.mutate ] Replacing `${HOSTNAME}` with actual value
[2018-04-24T16:04:50,976][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to exists or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2018-04-24T16:04:50,976][DEBUG][org.logstash.secret.store.SecretStoreFactory] Attempting to load or secret store with implementation: org.logstash.secret.store.backend.JavaKeyStore
[2018-04-24T16:04:50,977][ERROR][logstash.agent ] Failed to execute action {:action=>LogStash::PipelineAction::Create/pipeline_id:main, :exception=>"Java::OrgLogstashSecretStore::SecretStoreException::AccessException", :message=>"Could not determine keystore password. Please ensure the file at /etc/logstash/logstash.keystore is a valid Logstash keystore", :backtrace=>["org.logstash.secret.store.backend.JavaKeyStore.getKeyStorePassword(org/logstash/secret/store/backend/JavaKeyStore.java:198)", "org.logstash.secret.store.backend.JavaKeyStore.init(org/logstash/secret/store/backend/JavaKeyStore.java:207)", "org.logstash.secret.store.backend.JavaKeyStore.load(org/logstash/secret/store/backend/JavaKeyStore.java:254)", "org.logstash.secret.store.backend.JavaKeyStore.load(org/logstash/secret/store/backend/JavaKeyStore.java:40)", "org.logstash.secret.store.SecretStoreFactory.doIt(org/logstash/secret/store/SecretStoreFactory.java:79)", "org.logstash.secret.store.SecretStoreFactory.load(org/logstash/secret/store/SecretStoreFactory.java:65)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method)", "org.jruby.javasupport.JavaMethod.invokeDirectWithExceptionHandling(org/jruby/javasupport/JavaMethod.java:453)", "org.jruby.javasupport.JavaMethod.invokeStaticDirect(org/jruby/javasupport/JavaMethod.java:365)", "RUBY.get_if_exists(/usr/share/logstash/logstash-core/lib/logstash/util/secretstore.rb:28)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther11:get_if_exists(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:46)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.block in replace_placeholders(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:46)", "org.jruby.RubyString.gsubCommon19(org/jruby/RubyString.java:2629)", "org.jruby.RubyString.gsubCommon19(org/jruby/RubyString.java:2583)", "org.jruby.RubyString.gsub(org/jruby/RubyString.java:2541)", "org.jruby.RubyString$INVOKER$i$gsub19.call(org/jruby/RubyString$INVOKER$i$gsub19.gen)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther27:gsub(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:36)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.replace_placeholders(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:36)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther12:replace_placeholders(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:24)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.deep_replace(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:24)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther4:deep_replace(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:21)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.block in deep_replace(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:21)", "org.jruby.RubyArray.eachIndex(org/jruby/RubyArray.java:1790)", 
"org.jruby.RubyArray.each_index(org/jruby/RubyArray.java:1797)", "org.jruby.RubyArray$INVOKER$i$0$0$each_index.call(org/jruby/RubyArray$INVOKER$i$0$0$each_index.gen)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.invokeOther11:each_index(usr/share/logstash/logstash_minus_core/lib/logstash/util//usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:20)", "usr.share.logstash.logstash_minus_core.lib.logstash.util.substitution_variables.deep_replace(/usr/share/logstash/logstash-core/lib/logstash/util/substitution_variables.rb:20)", "usr.share.logstash.logstash_minus_core.lib.logstash.config.mixin.invokeOther1:deep_replace(usr/share/logstash/logstash_minus_core/lib/logstash/config//usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:85)", "usr.share.logstash.logstash_minus_core.lib.logstash.config.mixin.block in config_init(/usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:85)", "org.jruby.RubyHash$12.visit(org/jruby/RubyHash.java:1362)", "org.jruby.RubyHash$12.visit(org/jruby/RubyHash.java:1359)", "org.jruby.RubyHash.visitLimited(org/jruby/RubyHash.java:662)", "org.jruby.RubyHash.visitAll(org/jruby/RubyHash.java:647)", "org.jruby.RubyHash.iteratorVisitAll(org/jruby/RubyHash.java:1319)", "org.jruby.RubyHash.each_pairCommon(org/jruby/RubyHash.java:1354)", "org.jruby.RubyHash.each(org/jruby/RubyHash.java:1343)", "org.jruby.RubyHash$INVOKER$i$0$0$each.call(org/jruby/RubyHash$INVOKER$i$0$0$each.gen)", "RUBY.config_init(/usr/share/logstash/logstash-core/lib/logstash/config/mixin.rb:84)", "RUBY.initialize(/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:128)", "org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:1001)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)", "RUBY.initialize(/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:22)", "org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:1022)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)", "RUBY.plugin(/usr/share/logstash/logstash-core/lib/logstash/plugins/plugin_factory.rb:87)", "RUBY.plugin(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:112)", "RUBY.<eval>((eval):41)", "org.jruby.RubyKernel.evalCommon(org/jruby/RubyKernel.java:1027)", "org.jruby.RubyKernel.eval(org/jruby/RubyKernel.java:994)", "org.jruby.RubyKernel$INVOKER$s$0$3$eval19.call(org/jruby/RubyKernel$INVOKER$s$0$3$eval19.gen)", "RUBY.initialize(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:84)", "RUBY.initialize(/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:169)", "org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:1022)", "org.jruby.RubyClass$INVOKER$i$newInstance.call(org/jruby/RubyClass$INVOKER$i$newInstance.gen)", "RUBY.execute(/usr/share/logstash/logstash-core/lib/logstash/pipeline_action/create.rb:40)", "RUBY.block in converge_state(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:315)", "RUBY.with_pipelines(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141)", "RUBY.block in converge_state(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1734)", "org.jruby.RubyArray$INVOKER$i$0$0$each.call(org/jruby/RubyArray$INVOKER$i$0$0$each.gen)", "RUBY.converge_state(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299)", "RUBY.block in converge_state_and_update(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166)", 
"RUBY.with_pipelines(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141)", "RUBY.converge_state_and_update(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164)", "RUBY.block in execute(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:105)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:289)", "org.jruby.RubyProc.call19(org/jruby/RubyProc.java:273)", "org.jruby.RubyProc$INVOKER$i$0$0$call19.call(org/jruby/RubyProc$INVOKER$i$0$0$call19.gen)", "RUBY.interval(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/interval.rb:18)", "RUBY.execute(/usr/share/logstash/logstash-core/lib/logstash/agent.rb:94)", "RUBY.block in execute(/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:289)", "org.jruby.RubyProc.call19(org/jruby/RubyProc.java:273)", "org.jruby.RubyProc$INVOKER$i$0$0$call19.call(org/jruby/RubyProc$INVOKER$i$0$0$call19.gen)", "RUBY.block in initialize(/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:289)", "org.jruby.RubyProc.call(org/jruby/RubyProc.java:246)", "java.lang.Thread.run(java/lang/Thread)"]}
[2018-04-24T16:04:50,980][ERROR][logstash.agent ] An exception happened when converging configuration {:exception=>LogStash::Error, :message=>"Don't know how to handle `Java::OrgLogstashSecretStore::SecretStoreException::AccessException` for `LogStash::PipelineAction::Create/pipeline_id:main`", :backtrace=>["/usr/share/logstash/logstash-core/lib/logstash/converge_result.rb:27:in `create'", "/usr/share/logstash/logstash-core/lib/logstash/converge_result.rb:67:in `add'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:327:in `block in converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:312:in `block in converge_state'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:299:in `converge_state'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:166:in `block in converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:141:in `with_pipelines'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:164:in `converge_state_and_update'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:105:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/interval.rb:18:in `interval'", "/usr/share/logstash/logstash-core/lib/logstash/agent.rb:94:in `execute'", "/usr/share/logstash/logstash-core/lib/logstash/runner.rb:348:in `block in execute'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/stud-0.0.23/lib/stud/task.rb:24:in `block in initialize'"]}
[2018-04-24T16:04:53,852][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-04-24T16:04:53,852][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-04-24T16:04:54,463][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x35cf001e@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 sleep>"}
[2018-04-24T16:04:54,892][WARN ][logstash.runner ] SIGTERM received. Shutting down.
[2018-04-24T16:04:54,893][DEBUG][logstash.instrument.periodicpoller.os] Stopping
[2018-04-24T16:04:54,897][DEBUG][logstash.instrument.periodicpoller.jvm] Stopping
[2018-04-24T16:04:54,897][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Stopping
[2018-04-24T16:04:54,897][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Stopping
[2018-04-24T16:04:54,905][DEBUG][logstash.agent ] Shutting down all pipelines {:pipelines_count=>1}
[2018-04-24T16:04:54,906][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>1}
[2018-04-24T16:04:54,906][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Stop/pipeline_id:.monitoring-logstash}
[2018-04-24T16:04:54,907][DEBUG][logstash.pipeline ] Stopping inputs {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x35cf001e@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 sleep>"}
[2018-04-24T16:04:54,907][DEBUG][logstash.inputs.metrics ] Stopping {:plugin=>"LogStash::Inputs::Metrics"}
[2018-04-24T16:04:54,908][DEBUG][logstash.inputs.metrics ] Metrics input: stopped
[2018-04-24T16:04:54,908][DEBUG][logstash.pipeline ] Stopped inputs {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x35cf001e@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 sleep>"}
[2018-04-24T16:04:55,467][DEBUG][logstash.inputs.metrics ] Closing {:plugin=>"LogStash::Inputs::Metrics"}
[2018-04-24T16:04:55,567][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x35cf001e@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 sleep>"}
[2018-04-24T16:04:55,567][DEBUG][logstash.pipeline ] Shutting down filter/output workers {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x35cf001e@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 run>"}
[2018-04-24T16:04:55,568][DEBUG][logstash.pipeline ] Pushing shutdown {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x5fa58dd4@[.monitoring-logstash]>worker0@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:384 run>"}
[2018-04-24T16:04:55,568][DEBUG][logstash.pipeline ] Shutdown waiting for worker thread {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x5fa58dd4@[.monitoring-logstash]>worker0@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:384 run>"}
[2018-04-24T16:04:55,690][DEBUG][logstash.outputs.elasticsearch] Closing {:plugin=>"LogStash::Outputs::ElasticSearch"}
[2018-04-24T16:04:55,690][DEBUG][logstash.pipeline ] Worker terminated {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x5fa58dd4@[.monitoring-logstash]>worker0@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:384 dead>"}
[2018-04-24T16:04:55,691][DEBUG][logstash.outputs.elasticsearch] Stopping sniffer
[2018-04-24T16:04:55,691][DEBUG][logstash.outputs.elasticsearch] Stopping resurrectionist
[2018-04-24T16:04:56,415][DEBUG][logstash.outputs.elasticsearch] Waiting for in use manticore connections
[2018-04-24T16:04:56,417][DEBUG][logstash.outputs.elasticsearch] Closing adapter #<LogStash::Outputs::ElasticSearch::HttpClient::ManticoreAdapter:0x4e6ef74f>
[2018-04-24T16:04:56,418][INFO ][logstash.pipeline ] Pipeline has terminated {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x35cf001e@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 run>"}