@tsg
Created September 15, 2016 09:17
[2016-09-15T09:15:34,279][WARN ][logstash.runner ] --config.debug was specified, but log.level was not set to 'debug'! No config info will be logged.
[2016-09-15T09:15:34,312][DEBUG][logstash.runner ] -------- Logstash Settings (* means modified) ---------
[2016-09-15T09:15:34,312][DEBUG][logstash.runner ] node.name: "e3a4261bc5be"
[2016-09-15T09:15:34,313][DEBUG][logstash.runner ] *path.config: "/logstash.conf"
[2016-09-15T09:15:34,314][DEBUG][logstash.runner ] path.data: "/opt/logstash-5.0.0-beta1/data"
[2016-09-15T09:15:34,315][DEBUG][logstash.runner ] config.test_and_exit: false
[2016-09-15T09:15:34,316][DEBUG][logstash.runner ] config.reload.automatic: false
[2016-09-15T09:15:34,317][DEBUG][logstash.runner ] config.reload.interval: 3
[2016-09-15T09:15:34,319][DEBUG][logstash.runner ] metric.collect: true
[2016-09-15T09:15:34,322][DEBUG][logstash.runner ] pipeline.id: "main"
[2016-09-15T09:15:34,323][DEBUG][logstash.runner ] pipeline.workers: 2
[2016-09-15T09:15:34,323][DEBUG][logstash.runner ] pipeline.output.workers: 1
[2016-09-15T09:15:34,324][DEBUG][logstash.runner ] pipeline.batch.size: 125
[2016-09-15T09:15:34,324][DEBUG][logstash.runner ] pipeline.batch.delay: 5
[2016-09-15T09:15:34,325][DEBUG][logstash.runner ] pipeline.unsafe_shutdown: false
[2016-09-15T09:15:34,325][DEBUG][logstash.runner ] path.plugins: []
[2016-09-15T09:15:34,333][DEBUG][logstash.runner ] *config.debug: true (default: false)
[2016-09-15T09:15:34,333][DEBUG][logstash.runner ] *log.level: "debug" (default: "info")
[2016-09-15T09:15:34,334][DEBUG][logstash.runner ] version: false
[2016-09-15T09:15:34,334][DEBUG][logstash.runner ] help: false
[2016-09-15T09:15:34,334][DEBUG][logstash.runner ] log.format: "plain"
[2016-09-15T09:15:34,334][DEBUG][logstash.runner ] http.host: "127.0.0.1"
[2016-09-15T09:15:34,335][DEBUG][logstash.runner ] http.port: 9600..9700
[2016-09-15T09:15:34,335][DEBUG][logstash.runner ] http.environment: "production"
[2016-09-15T09:15:34,337][DEBUG][logstash.runner ] path.settings: "/opt/logstash-5.0.0-beta1/config"
[2016-09-15T09:15:34,338][DEBUG][logstash.runner ] path.logs: "/opt/logstash-5.0.0-beta1/logs"
[2016-09-15T09:15:34,340][DEBUG][logstash.runner ] --------------- Logstash Settings -------------------
[2016-09-15T09:15:34,382][DEBUG][logstash.agent ] Agent: Configuring metric collection
[2016-09-15T09:15:34,386][DEBUG][logstash.instrument.periodicpoller.os] PeriodicPoller: Starting {:polling_interval=>1, :polling_timeout=>60}
[2016-09-15T09:15:34,408][DEBUG][logstash.instrument.periodicpoller.jvm] PeriodicPoller: Starting {:polling_interval=>1, :polling_timeout=>60}
[2016-09-15T09:15:34,518][DEBUG][logstash.agent ] Reading config file {:config_file=>"/logstash.conf"}
[2016-09-15T09:15:34,524][DEBUG][logstash.agent ] The following is the content of a file {:config_file=>"/logstash.conf"}
[2016-09-15T09:15:34,527][DEBUG][logstash.agent ]
input {
  beats {
    port => 5044
    ssl => false
  }
  beats {
    port => 5055
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash.crt"
    ssl_key => "/etc/pki/tls/private/logstash.key"
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    #user => "beats"
    #password => "testing"
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
  # Used for easier debugging
  #stdout { codec => rubydebug { metadata => true } }
}
[2016-09-15T09:15:34,528][DEBUG][logstash.agent ] The following is the merged configuration
[2016-09-15T09:15:34,529][DEBUG][logstash.agent ]
input {
  beats {
    port => 5044
    ssl => false
  }
  beats {
    port => 5055
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash.crt"
    ssl_key => "/etc/pki/tls/private/logstash.key"
  }
}
output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    #user => "beats"
    #password => "testing"
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
  # Used for easier debugging
  #stdout { codec => rubydebug { metadata => true } }
}
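
[Editor's note] The merged configuration above can also be syntax-checked without starting the pipeline. A minimal sketch using the install and config paths from the settings dump above (flag names as of Logstash 5.0); this only validates config compilation and option types, it does not start the inputs, so it would likely not catch the SSL key failure seen further down:

  # compile/validate the config and exit; does not register the beats inputs
  /opt/logstash-5.0.0-beta1/bin/logstash -f /logstash.conf --config.test_and_exit
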
[2016-09-15T09:15:34,647][DEBUG][logstash.pipeline ] Compiled pipeline code {:code=>" @inputs = []\n @filters = []\n @outputs = []\n @periodic_flushers = []\n @shutdown_flushers = []\n @generated_objects = {}\n\n @generated_objects[:input_beats_1] = plugin(\"input\", \"beats\", LogStash::Util.hash_merge_many({ \"port\" => 5044 }, { \"ssl\" => (\"false\") }))\n\n @inputs << @generated_objects[:input_beats_1]\n\n @generated_objects[:input_beats_2] = plugin(\"input\", \"beats\", LogStash::Util.hash_merge_many({ \"port\" => 5055 }, { \"ssl\" => (\"true\") }, { \"ssl_certificate\" => (\"/etc/pki/tls/certs/logstash.crt\") }, { \"ssl_key\" => (\"/etc/pki/tls/private/logstash.key\") }))\n\n @inputs << @generated_objects[:input_beats_2]\n\n @generated_objects[:output_elasticsearch_3] = plugin(\"output\", \"elasticsearch\", LogStash::Util.hash_merge_many({ \"hosts\" => [(\"elasticsearch:9200\")] }, { \"index\" => (\"%{[@metadata][beat]}-%{+YYYY.MM.dd}\") }, { \"document_type\" => (\"%{[@metadata][type]}\") }))\n\n @outputs << @generated_objects[:output_elasticsearch_3]\n\n define_singleton_method :filter_func do |event|\n events = [event]\n @logger.debug? && @logger.debug(\"filter received\", \"event\" => event.to_hash)\n events\n end\n define_singleton_method :output_func do |event|\n targeted_outputs = []\n @logger.debug? && @logger.debug(\"output received\", \"event\" => event.to_hash)\n targeted_outputs << @generated_objects[:output_elasticsearch_3]\n \n targeted_outputs\n end"}
[2016-09-15T09:15:34,764][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_2966da5b-8ba0-42ec-b8f0-989af11baef4"
[2016-09-15T09:15:34,780][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2016-09-15T09:15:34,782][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2016-09-15T09:15:34,795][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@port = 5044
[2016-09-15T09:15:34,803][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl = false
[2016-09-15T09:15:34,811][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@id = "19c43bf807295f7021a23b430ac8a9125c97c670-1"
[2016-09-15T09:15:34,814][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@enable_metric = true
[2016-09-15T09:15:34,818][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@codec = <LogStash::Codecs::Plain id=>"plain_2966da5b-8ba0-42ec-b8f0-989af11baef4", enable_metric=>true, charset=>"UTF-8">
[2016-09-15T09:15:34,821][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@add_field = {}
[2016-09-15T09:15:34,822][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@host = "0.0.0.0"
[2016-09-15T09:15:34,826][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_certificate_authorities = []
[2016-09-15T09:15:34,845][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_verify_mode = "none"
[2016-09-15T09:15:34,850][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_handshake_timeout = 10000
[2016-09-15T09:15:34,868][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@congestion_threshold = 5
[2016-09-15T09:15:34,870][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@target_field_for_codec = "message"
[2016-09-15T09:15:34,881][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@tls_min_version = 1
[2016-09-15T09:15:34,890][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@tls_max_version = 1.2
[2016-09-15T09:15:34,892][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@cipher_suites = ["TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA38", "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256"]
[2016-09-15T09:15:34,893][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@client_inactivity_timeout = 15
[2016-09-15T09:15:34,917][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_149ddd60-7f99-4372-bf29-d4bea1d2daab"
[2016-09-15T09:15:34,923][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2016-09-15T09:15:34,924][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2016-09-15T09:15:34,928][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@port = 5055
[2016-09-15T09:15:34,929][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl = true
[2016-09-15T09:15:34,932][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_certificate = "/etc/pki/tls/certs/logstash.crt"
[2016-09-15T09:15:34,933][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_key = "/etc/pki/tls/private/logstash.key"
[2016-09-15T09:15:34,944][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@id = "19c43bf807295f7021a23b430ac8a9125c97c670-2"
[2016-09-15T09:15:34,947][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@enable_metric = true
[2016-09-15T09:15:34,948][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@codec = <LogStash::Codecs::Plain id=>"plain_149ddd60-7f99-4372-bf29-d4bea1d2daab", enable_metric=>true, charset=>"UTF-8">
[2016-09-15T09:15:34,949][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@add_field = {}
[2016-09-15T09:15:34,949][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@host = "0.0.0.0"
[2016-09-15T09:15:34,949][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_certificate_authorities = []
[2016-09-15T09:15:34,950][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_verify_mode = "none"
[2016-09-15T09:15:34,950][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_handshake_timeout = 10000
[2016-09-15T09:15:34,951][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@congestion_threshold = 5
[2016-09-15T09:15:34,951][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@target_field_for_codec = "message"
[2016-09-15T09:15:34,951][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@tls_min_version = 1
[2016-09-15T09:15:34,952][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@tls_max_version = 1.2
[2016-09-15T09:15:34,953][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@cipher_suites = ["TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA38", "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256"]
[2016-09-15T09:15:34,953][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@client_inactivity_timeout = 15
[2016-09-15T09:15:35,332][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_6594c839-6836-4126-898c-75ada214c93c"
[2016-09-15T09:15:35,333][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2016-09-15T09:15:35,334][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2016-09-15T09:15:35,344][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = ["elasticsearch:9200"]
[2016-09-15T09:15:35,345][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
[2016-09-15T09:15:35,346][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@document_type = "%{[@metadata][type]}"
[2016-09-15T09:15:35,347][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "19c43bf807295f7021a23b430ac8a9125c97c670-3"
[2016-09-15T09:15:35,349][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2016-09-15T09:15:35,350][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_6594c839-6836-4126-898c-75ada214c93c", enable_metric=>true, charset=>"UTF-8">
[2016-09-15T09:15:35,350][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2016-09-15T09:15:35,351][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2016-09-15T09:15:35,351][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2016-09-15T09:15:35,352][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2016-09-15T09:15:35,352][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2016-09-15T09:15:35,353][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@flush_size = 500
[2016-09-15T09:15:35,353][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@idle_flush_time = 1
[2016-09-15T09:15:35,353][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2016-09-15T09:15:35,354][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2016-09-15T09:15:35,354][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2016-09-15T09:15:35,355][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2016-09-15T09:15:35,355][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = ""
[2016-09-15T09:15:35,356][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2016-09-15T09:15:35,357][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2016-09-15T09:15:35,358][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2016-09-15T09:15:35,359][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2016-09-15T09:15:35,359][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2016-09-15T09:15:35,360][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2016-09-15T09:15:35,361][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2016-09-15T09:15:35,361][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2016-09-15T09:15:35,362][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2016-09-15T09:15:35,362][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2016-09-15T09:15:35,362][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2016-09-15T09:15:35,363][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2016-09-15T09:15:35,363][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2016-09-15T09:15:35,363][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2016-09-15T09:15:35,364][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@healthcheck_path = "/"
[2016-09-15T09:15:35,364][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2016-09-15T09:15:35,364][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2016-09-15T09:15:35,371][DEBUG][logstash.agent ] starting agent
[2016-09-15T09:15:35,382][DEBUG][logstash.agent ] starting pipeline {:id=>"main"}
[2016-09-15T09:15:35,517][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2016-09-15T09:15:35,527][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2016-09-15 09:15:35 +0000}
[2016-09-15T09:15:35,648][DEBUG][io.netty.util.internal.logging.InternalLoggerFactory] Using Log4J as the default logging framework
[2016-09-15T09:15:35,660][DEBUG][io.netty.channel.MultithreadEventLoopGroup] -Dio.netty.eventLoopThreads: 4
[2016-09-15T09:15:35,732][DEBUG][io.netty.util.internal.PlatformDependent0] java.nio.Buffer.address: available
[2016-09-15T09:15:35,733][DEBUG][io.netty.util.internal.PlatformDependent0] sun.misc.Unsafe.theUnsafe: available
[2016-09-15T09:15:35,734][DEBUG][io.netty.util.internal.PlatformDependent0] sun.misc.Unsafe.copyMemory: available
[2016-09-15T09:15:35,735][DEBUG][io.netty.util.internal.PlatformDependent0] java.nio.Bits.unaligned: true
[2016-09-15T09:15:35,736][DEBUG][io.netty.util.internal.PlatformDependent0] java.nio.DirectByteBuffer.<init>(long, int): available
[2016-09-15T09:15:35,744][DEBUG][io.netty.util.internal.Cleaner0] java.nio.ByteBuffer.cleaner(): available
[2016-09-15T09:15:35,749][DEBUG][io.netty.util.internal.PlatformDependent] Java version: 8
[2016-09-15T09:15:35,751][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.noUnsafe: false
[2016-09-15T09:15:35,799][DEBUG][io.netty.util.internal.PlatformDependent] sun.misc.Unsafe: available
[2016-09-15T09:15:35,809][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.noJavassist: false
[2016-09-15T09:15:36,055][DEBUG][io.netty.util.internal.PlatformDependent] Javassist: available
[2016-09-15T09:15:36,057][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
[2016-09-15T09:15:36,058][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.bitMode: 64 (sun.arch.data.model)
[2016-09-15T09:15:36,058][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.noPreferDirect: false
[2016-09-15T09:15:36,062][DEBUG][io.netty.util.internal.PlatformDependent] io.netty.maxDirectMemory: 1056309248 bytes
[2016-09-15T09:15:36,100][DEBUG][io.netty.channel.nio.NioEventLoop] -Dio.netty.noKeySetOptimization: false
[2016-09-15T09:15:36,102][DEBUG][io.netty.channel.nio.NioEventLoop] -Dio.netty.selectorAutoRebuildThreshold: 512
[2016-09-15T09:15:36,126][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5055"}
[2016-09-15T09:15:36,213][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2016-09-15T09:15:36,219][DEBUG][org.logstash.netty.PrivateKeyConverter] Converting Private keys if needed
[2016-09-15T09:15:36,285][DEBUG][io.netty.util.internal.JavassistTypeParameterMatcherGenerator] Generated: io.netty.util.internal.__matchers__.org.logstash.beats.BatchMatcher
[2016-09-15T09:15:36,413][DEBUG][io.netty.channel.DefaultChannelId] -Dio.netty.processId: 12 (auto-detected)
[2016-09-15T09:15:36,445][ERROR][logstash.agent ] Pipeline aborted due to error {:exception=>java.lang.NullPointerException, :backtrace=>["org.logstash.netty.PrivateKeyConverter.generatePkcs8(org/logstash/netty/PrivateKeyConverter.java:43)", "org.logstash.netty.PrivateKeyConverter.convert(org/logstash/netty/PrivateKeyConverter.java:39)", "java.lang.reflect.Method.invoke(java/lang/reflect/Method.java:498)", "RUBY.create_server(/opt/logstash-5.0.0-beta1/vendor/bundle/jruby/1.9/gems/logstash-input-beats-3.1.0.beta4-java/lib/logstash/inputs/beats.rb:139)", "RUBY.register(/opt/logstash-5.0.0-beta1/vendor/bundle/jruby/1.9/gems/logstash-input-beats-3.1.0.beta4-java/lib/logstash/inputs/beats.rb:132)", "RUBY.start_inputs(/opt/logstash-5.0.0-beta1/logstash-core/lib/logstash/pipeline.rb:324)", "org.jruby.RubyArray.each(org/jruby/RubyArray.java:1613)", "RUBY.start_inputs(/opt/logstash-5.0.0-beta1/logstash-core/lib/logstash/pipeline.rb:323)", "RUBY.start_workers(/opt/logstash-5.0.0-beta1/logstash-core/lib/logstash/pipeline.rb:195)", "RUBY.run(/opt/logstash-5.0.0-beta1/logstash-core/lib/logstash/pipeline.rb:153)", "RUBY.start_pipeline(/opt/logstash-5.0.0-beta1/logstash-core/lib/logstash/agent.rb:250)", "java.lang.Thread.run(java/lang/Thread.java:745)"]}
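
[Editor's note] The NullPointerException above is thrown while logstash-input-beats 3.1.0.beta4 converts the configured ssl_key for the port-5055 listener, which suggests the key at /etc/pki/tls/private/logstash.key could not be read or parsed inside the container. A hedged workaround sketch, assuming the file exists but is in a format the beta converter cannot handle: re-encode it as an unencrypted PKCS#8 PEM with openssl and point ssl_key at the result (the .pkcs8.key filename is a hypothetical choice):

  # first verify the key is actually present and non-empty inside the container
  ls -l /etc/pki/tls/private/logstash.key
  # re-encode as unencrypted PKCS#8 PEM (add -passin if the source key is encrypted)
  openssl pkcs8 -topk8 -nocrypt \
    -in  /etc/pki/tls/private/logstash.key \
    -out /etc/pki/tls/private/logstash.pkcs8.key
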
[2016-09-15T09:15:36,446][DEBUG][io.netty.util.NetUtil ] Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%lo)
[2016-09-15T09:15:36,463][DEBUG][io.netty.util.NetUtil ] /proc/sys/net/core/somaxconn: 128
[2016-09-15T09:15:36,467][DEBUG][logstash.agent ] Starting puma
[2016-09-15T09:15:36,473][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2016-09-15T09:15:36,473][DEBUG][io.netty.channel.DefaultChannelId] -Dio.netty.machineId: 02:42:ac:ff:fe:11:00:04 (auto-detected)
[2016-09-15T09:15:36,474][DEBUG][io.netty.util.internal.ThreadLocalRandom] -Dio.netty.initialSeedUniquifier: 0x7f18ac5220ec92fb
[2016-09-15T09:15:36,478][DEBUG][logstash.api.service ] [api-service] start
[2016-09-15T09:15:36,532][DEBUG][io.netty.util.ResourceLeakDetector] -Dio.netty.leakDetection.level: simple
[2016-09-15T09:15:36,533][DEBUG][io.netty.util.ResourceLeakDetector] -Dio.netty.leakDetection.maxRecords: 4
[2016-09-15T09:15:36,550][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2016-09-15 09:15:36 +0000}
[2016-09-15T09:15:36,625][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.numHeapArenas: 4
[2016-09-15T09:15:36,626][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.numDirectArenas: 4
[2016-09-15T09:15:36,626][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.pageSize: 8192
[2016-09-15T09:15:36,627][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.maxOrder: 11
[2016-09-15T09:15:36,627][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.chunkSize: 16777216
[2016-09-15T09:15:36,627][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.tinyCacheSize: 512
[2016-09-15T09:15:36,627][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.smallCacheSize: 256
[2016-09-15T09:15:36,627][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.normalCacheSize: 64
[2016-09-15T09:15:36,627][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.maxCachedBufferCapacity: 32768
[2016-09-15T09:15:36,628][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.cacheTrimInterval: 8192
[2016-09-15T09:15:36,658][DEBUG][io.netty.buffer.ByteBufUtil] -Dio.netty.allocator.type: pooled
[2016-09-15T09:15:36,661][DEBUG][io.netty.buffer.ByteBufUtil] -Dio.netty.threadLocalDirectBufferSize: 65536
[2016-09-15T09:15:36,664][DEBUG][io.netty.buffer.ByteBufUtil] -Dio.netty.maxThreadLocalCharBufferSize: 16384
[2016-09-15T09:15:36,681][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2016-09-15T09:15:37,564][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2016-09-15 09:15:37 +0000}
[2016-09-15T09:15:38,568][DEBUG][logstash.instrument.collector] Collector: Sending snapshot to observers {:created_at=>2016-09-15 09:15:38 +0000}
[2016-09-15T09:15:39,465][DEBUG][logstash.instrument.periodicpoller.os] PeriodicPoller: Stopping
[2016-09-15T09:15:39,467][DEBUG][logstash.instrument.periodicpoller.jvm] PeriodicPoller: Stopping
[2016-09-15T09:15:39,481][WARN ][logstash.agent ] stopping pipeline {:id=>"main"}
[2016-09-15T09:15:39,484][DEBUG][logstash.pipeline ] Closing inputs
[2016-09-15T09:15:39,485][DEBUG][logstash.inputs.beats ] stopping {:plugin=>"LogStash::Inputs::Beats"}
[2016-09-15T09:15:39,486][DEBUG][org.logstash.beats.Server] Server shutting down
[2016-09-15T09:15:39,514][DEBUG][logstash.inputs.beats ] closing {:plugin=>"LogStash::Inputs::Beats"}
[2016-09-15T09:15:40,492][DEBUG][org.logstash.beats.Server] Server stopped
[2016-09-15T09:15:40,495][DEBUG][logstash.inputs.beats ] stopping {:plugin=>"LogStash::Inputs::Beats"}
[2016-09-15T09:15:40,514][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<NoMethodError: undefined method `stop' for nil:NilClass>, :backtrace=>["/opt/logstash-5.0.0-beta1/vendor/bundle/jruby/1.9/gems/logstash-input-beats-3.1.0.beta4-java/lib/logstash/inputs/beats.rb:173:in `stop'", "/opt/logstash-5.0.0-beta1/logstash-core/lib/logstash/inputs/base.rb:89:in `do_stop'", "org/jruby/RubyArray.java:1613:in `each'", "/opt/logstash-5.0.0-beta1/logstash-core/lib/logstash/pipeline.rb:379:in `shutdown'", "/opt/logstash-5.0.0-beta1/logstash-core/lib/logstash/agent.rb:267:in `stop_pipeline'", "/opt/logstash-5.0.0-beta1/logstash-core/lib/logstash/agent.rb:280:in `shutdown_pipelines'", "org/jruby/RubyHash.java:1342:in `each'", "/opt/logstash-5.0.0-beta1/logstash-core/lib/logstash/agent.rb:280:in `shutdown_pipelines'", "/opt/logstash-5.0.0-beta1/logstash-core/lib/logstash/agent.rb:130:in `shutdown'", "/opt/logstash-5.0.0-beta1/logstash-core/lib/logstash/runner.rb:263:in `execute'", "/opt/logstash-5.0.0-beta1/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:67:in `run'", "/opt/logstash-5.0.0-beta1/logstash-core/lib/logstash/runner.rb:174:in `run'", "/opt/logstash-5.0.0-beta1/vendor/bundle/jruby/1.9/gems/clamp-0.6.5/lib/clamp/command.rb:132:in `run'", "/opt/logstash-5.0.0-beta1/lib/bootstrap/environment.rb:68:in `(root)'"]}
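
[Editor's note] The FATAL NoMethodError during shutdown looks like a follow-on effect of the aborted pipeline: the beats server for the SSL listener was never created, so the plugin's stop call lands on nil. If the key conversion sketched above resolves the startup failure, only the second beats input would need to change; the path below is the hypothetical converted key from that step:

  beats {
    port => 5055
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash.crt"
    ssl_key => "/etc/pki/tls/private/logstash.pkcs8.key"   # hypothetical PKCS#8 copy of the original key
  }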