@jpeppercorn
Created April 24, 2018 21:22
Logstash-plain2.log: no errors (the mutate filter is commented out)
This file has been truncated.
[2018-04-24T17:17:35,479][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2018-04-24T17:17:35,496][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"fb_apache", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x6de7bd67 @module_name="fb_apache", @directory="/usr/share/logstash/modules/fb_apache/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2018-04-24T17:17:35,497][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2018-04-24T17:17:35,497][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x6e206b5c @module_name="netflow", @directory="/usr/share/logstash/modules/netflow/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2018-04-24T17:17:35,518][DEBUG][logstash.plugins.registry] Executing hooks {:name=>"", :type=>"pack", :hooks_file=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.2.3-java/lib/logstash_registry.rb"}
[2018-04-24T17:17:35,828][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"metrics", :type=>:input, :class=>LogStash::Inputs::Metrics}
[2018-04-24T17:17:35,829][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"monitoring", :type=>:universal, :class=>LogStash::MonitoringExtension}
[2018-04-24T17:17:35,829][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"config_management", :type=>:universal, :class=>LogStash::ConfigManagement::Extension}
[2018-04-24T17:17:35,830][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"arcsight", :directory=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.2.3-java/modules/arcsight/configuration"}
[2018-04-24T17:17:35,830][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"arcsight", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x740ff2bd @module_name="arcsight", @directory="/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.2.3-java/modules/arcsight/configuration", @kibana_version_parts=["6", "0", "0"]>}
[2018-04-24T17:17:35,892][INFO ][logstash.configmanagement.bootstrapcheck] Using Elasticsearch as config store {:pipeline_id=>["main"], :poll_interval=>"5000000000ns"}
[2018-04-24T17:17:35,893][DEBUG][logstash.configmanagement.hooks] Removing the `Logstash::Config::Source::Local` and replacing it with `ElasticsearchSource`
[2018-04-24T17:17:35,954][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"plain", :type=>"codec", :class=>LogStash::Codecs::Plain}
[2018-04-24T17:17:35,958][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_259eb90d-af0d-4b6a-8cdf-976243e64bc8"
[2018-04-24T17:17:35,958][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:35,958][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:35,961][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://10.54.52.31:9200]
[2018-04-24T17:17:35,961][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_writer"
[2018-04-24T17:17:35,961][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2018-04-24T17:17:35,961][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2018-04-24T17:17:35,961][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@cacert = "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T17:17:35,961][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl = true
[2018-04-24T17:17:35,961][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "elasticsearch_a473cb00-3940-43e1-ad66-4eb9336d69e1"
[2018-04-24T17:17:35,961][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2018-04-24T17:17:35,963][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_259eb90d-af0d-4b6a-8cdf-976243e64bc8", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T17:17:35,963][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2018-04-24T17:17:35,963][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "logstash-%{+YYYY.MM.dd}"
[2018-04-24T17:17:35,963][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2018-04-24T17:17:35,963][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2018-04-24T17:17:35,963][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2018-04-24T17:17:35,963][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2018-04-24T17:17:35,963][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2018-04-24T17:17:35,963][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2018-04-24T17:17:35,964][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2018-04-24T17:17:35,964][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2018-04-24T17:17:35,964][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2018-04-24T17:17:35,964][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2018-04-24T17:17:35,964][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2018-04-24T17:17:35,964][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2018-04-24T17:17:35,964][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2018-04-24T17:17:35,964][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2018-04-24T17:17:35,964][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2018-04-24T17:17:35,964][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2018-04-24T17:17:35,964][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2018-04-24T17:17:35,964][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2018-04-24T17:17:35,964][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2018-04-24T17:17:35,965][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2018-04-24T17:17:35,965][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2018-04-24T17:17:35,965][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2018-04-24T17:17:35,965][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2018-04-24T17:17:35,965][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2018-04-24T17:17:35,965][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2018-04-24T17:17:35,965][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
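The block of @-settings above is the pipeline's elasticsearch output being instantiated. Only the first few values differ from plugin defaults, so the stored pipeline most likely contains an output block along these lines (a reconstruction from the debug dump, not the actual config fetched from Elasticsearch; the password is redacted as <password> in the log):

    output {
      elasticsearch {
        hosts    => ["https://10.54.52.31:9200"]
        user     => "logstash_writer"
        password => "<password>"   # value redacted in the debug output
        ssl      => true
        cacert   => "/etc/logstash/cert/LMUWU0438.pem"
        sniffing => false
        # @index, @template_name, the retry settings, etc. above are plugin defaults
      }
    }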
[2018-04-24T17:17:35,966][DEBUG][logstash.licensechecker.licensereader] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-04-24T17:17:36,265][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_writer:xxxxxx@10.54.52.31:9200/]}}
[2018-04-24T17:17:36,267][INFO ][logstash.licensechecker.licensereader] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://logstash_writer:xxxxxx@10.54.52.31:9200/, :path=>"/"}
[2018-04-24T17:17:36,464][WARN ][logstash.licensechecker.licensereader] Restored connection to ES instance {:url=>"https://logstash_writer:xxxxxx@10.54.52.31:9200/"}
[2018-04-24T17:17:36,493][INFO ][logstash.licensechecker.licensereader] ES Output version determined {:es_version=>6}
[2018-04-24T17:17:36,493][WARN ][logstash.licensechecker.licensereader] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-24T17:17:36,509][DEBUG][logstash.licensechecker.licensemanager] updating observers of xpack info change
[2018-04-24T17:17:36,518][DEBUG][logstash.configmanagement.elasticsearchsource] updating licensing state installed:true,
license:{"status"=>"active", "uid"=>"9e6cfba8-7c54-4f36-9ab0-3d4dd63415ee", "type"=>"platinum", "issue_date"=>"2017-07-31T00:00:00.000Z", "issue_date_in_millis"=>1501459200000, "expiry_date"=>"2018-07-31T23:59:59.999Z", "expiry_date_in_millis"=>1533081599999, "max_nodes"=>23, "issued_to"=>"XPO Logistics, Inc. (non-production environments)", "issuer"=>"Aunik Bhattacharjee", "start_date_in_millis"=>1501459200000},
last_updated:}
[2018-04-24T17:17:36,542][INFO ][logstash.configmanagement.elasticsearchsource] Configuration Management License OK
[2018-04-24T17:17:36,542][DEBUG][logstash.config.sourceloader] Adding source {:source=>"#<LogStash::ConfigManagement::ElasticsearchSource:0x4a9c137a>"}
[2018-04-24T17:17:36,713][DEBUG][logstash.runner ] -------- Logstash Settings (* means modified) ---------
[2018-04-24T17:17:36,713][DEBUG][logstash.runner ] node.name: "LMUWU0438"
[2018-04-24T17:17:36,713][DEBUG][logstash.runner ] *path.data: "/var/lib/logstash" (default: "/usr/share/logstash/data")
[2018-04-24T17:17:36,713][DEBUG][logstash.runner ] modules.cli: []
[2018-04-24T17:17:36,713][DEBUG][logstash.runner ] modules: []
[2018-04-24T17:17:36,713][DEBUG][logstash.runner ] modules_setup: false
[2018-04-24T17:17:36,713][DEBUG][logstash.runner ] config.test_and_exit: false
[2018-04-24T17:17:36,713][DEBUG][logstash.runner ] *config.reload.automatic: true (default: false)
[2018-04-24T17:17:36,713][DEBUG][logstash.runner ] *config.reload.interval: 5000000000 (default: 3000000000)
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] config.support_escapes: false
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] metric.collect: true
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] pipeline.id: "main"
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] pipeline.system: false
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] pipeline.workers: 4
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] pipeline.output.workers: 1
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] pipeline.batch.size: 125
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] pipeline.batch.delay: 50
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] pipeline.unsafe_shutdown: false
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] pipeline.java_execution: false
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] pipeline.reloadable: true
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] path.plugins: []
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] config.debug: false
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] *log.level: "debug" (default: "info")
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] version: false
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] help: false
[2018-04-24T17:17:36,714][DEBUG][logstash.runner ] log.format: "plain"
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] http.host: "127.0.0.1"
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] http.port: 9600..9700
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] http.environment: "production"
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] queue.type: "memory"
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] queue.drain: false
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] queue.page_capacity: 67108864
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] queue.max_bytes: 1073741824
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] queue.max_events: 0
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] queue.checkpoint.acks: 1024
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] queue.checkpoint.writes: 1024
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] queue.checkpoint.interval: 1000
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] dead_letter_queue.enable: false
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] dead_letter_queue.max_bytes: 1073741824
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] slowlog.threshold.warn: -1
[2018-04-24T17:17:36,715][DEBUG][logstash.runner ] slowlog.threshold.info: -1
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] slowlog.threshold.debug: -1
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] slowlog.threshold.trace: -1
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] keystore.classname: "org.logstash.secret.store.backend.JavaKeyStore"
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] *keystore.file: "/etc/logstash/logstash.keystore" (default: "/usr/share/logstash/config/logstash.keystore")
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] *path.queue: "/var/lib/logstash/queue" (default: "/usr/share/logstash/data/queue")
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] *path.dead_letter_queue: "/var/lib/logstash/dead_letter_queue" (default: "/usr/share/logstash/data/dead_letter_queue")
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] *path.settings: "/etc/logstash" (default: "/usr/share/logstash/config")
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] *path.logs: "/var/log/logstash" (default: "/usr/share/logstash/logs")
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] xpack.monitoring.enabled: true
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] *xpack.monitoring.elasticsearch.url: ["https://10.54.52.31:9200"] (default: ["http://localhost:9200"])
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] xpack.monitoring.collection.interval: 10000000000
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] xpack.monitoring.collection.timeout_interval: 600000000000
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] *xpack.monitoring.elasticsearch.username: "logstash_writer" (default: "logstash_system")
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] *xpack.monitoring.elasticsearch.password: "lastmile"
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] *xpack.monitoring.elasticsearch.ssl.ca: "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] *xpack.monitoring.elasticsearch.ssl.truststore.password: "lastmile"
[2018-04-24T17:17:36,716][DEBUG][logstash.runner ] *xpack.monitoring.elasticsearch.ssl.keystore.password: "lastmile"
[2018-04-24T17:17:36,717][DEBUG][logstash.runner ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2018-04-24T17:17:36,717][DEBUG][logstash.runner ] xpack.monitoring.elasticsearch.sniffing: false
[2018-04-24T17:17:36,717][DEBUG][logstash.runner ] xpack.monitoring.collection.pipeline.details.enabled: true
[2018-04-24T17:17:36,717][DEBUG][logstash.runner ] xpack.monitoring.collection.config.enabled: true
[2018-04-24T17:17:36,717][DEBUG][logstash.runner ] node.uuid: ""
[2018-04-24T17:17:36,717][DEBUG][logstash.runner ] *xpack.management.enabled: true (default: false)
[2018-04-24T17:17:36,717][DEBUG][logstash.runner ] xpack.management.logstash.poll_interval: 5000000000
[2018-04-24T17:17:36,717][DEBUG][logstash.runner ] xpack.management.pipeline.id: ["main"]
[2018-04-24T17:17:36,717][DEBUG][logstash.runner ] *xpack.management.elasticsearch.username: "logstash_writer" (default: "logstash_system")
[2018-04-24T17:17:36,717][DEBUG][logstash.runner ] *xpack.management.elasticsearch.password: "lastmile"
[2018-04-24T17:17:36,717][DEBUG][logstash.runner ] *xpack.management.elasticsearch.url: ["https://10.54.52.31:9200"] (default: ["https://localhost:9200"])
[2018-04-24T17:17:36,717][DEBUG][logstash.runner ] *xpack.management.elasticsearch.ssl.ca: "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T17:17:36,718][DEBUG][logstash.runner ] xpack.management.elasticsearch.sniffing: false
[2018-04-24T17:17:36,718][DEBUG][logstash.runner ] --------------- Logstash Settings -------------------
[2018-04-24T17:17:36,736][DEBUG][logstash.agent ] Setting up metric collection
[2018-04-24T17:17:36,740][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-04-24T17:17:36,761][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-04-24T17:17:36,786][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-04-24T17:17:36,787][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-04-24T17:17:36,795][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-04-24T17:17:36,797][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2018-04-24T17:17:36,813][DEBUG][logstash.monitoringextension.pipelineregisterhook] compiled metrics pipeline config: {:config=>"# [2017] Elasticsearch Incorporated. All Rights Reserved.\n#\n# NOTICE: All information contained herein is, and remains\n# the property of Elasticsearch Incorporated and its suppliers,\n# if any. The intellectual and technical concepts contained\n# herein are proprietary to Elasticsearch Incorporated\n# and its suppliers and may be covered by U.S. and Foreign Patents,\n# patents in process, and are protected by trade secret or copyright law.\n# Dissemination of this information or reproduction of this material\n# is strictly forbidden unless prior written permission is obtained\n# from Elasticsearch Incorporated.\n#\ninput {\n metrics {\n collection_interval => 10\n collection_timeout_interval => 600\n extended_performance_collection => true\n config_collection => true\n }\n}\noutput {\n elasticsearch {\n hosts => [\"https://10.54.52.31:9200\"]\n bulk_path => \"/_xpack/monitoring/_bulk?system_id=logstash&system_api_version=2&interval=1s\"\n manage_template => false\n document_type => \"%{[@metadata][document_type]}\"\n index => \"\"\n sniffing => false\n \n user => \"logstash_writer\"\n password => \"lastmile\"\n \n \n ssl => true\n \n cacert => \"/etc/logstash/cert/LMUWU0438.pem\"\n \n \n \n \n }\n}\n"}
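For readability, here is the compiled monitoring pipeline embedded in the log line above with its escaped newlines expanded (Elastic's proprietary-notice header omitted):

    input {
      metrics {
        collection_interval => 10
        collection_timeout_interval => 600
        extended_performance_collection => true
        config_collection => true
      }
    }
    output {
      elasticsearch {
        hosts => ["https://10.54.52.31:9200"]
        bulk_path => "/_xpack/monitoring/_bulk?system_id=logstash&system_api_version=2&interval=1s"
        manage_template => false
        document_type => "%{[@metadata][document_type]}"
        index => ""
        sniffing => false
        user => "logstash_writer"
        password => "lastmile"
        ssl => true
        cacert => "/etc/logstash/cert/LMUWU0438.pem"
      }
    }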
[2018-04-24T17:17:36,822][DEBUG][logstash.config.sourceloader] Adding source {:source=>"#<LogStash::Monitoring::InternalPipelineSource:0x48640536>"}
[2018-04-24T17:17:36,825][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.2.3"}
[2018-04-24T17:17:36,827][DEBUG][logstash.agent ] Starting agent
[2018-04-24T17:17:36,831][DEBUG][logstash.agent ] Starting puma
[2018-04-24T17:17:36,836][DEBUG][logstash.agent ] Trying to start WebServer {:port=>9600}
[2018-04-24T17:17:36,850][DEBUG][logstash.api.service ] [api-service] start
[2018-04-24T17:17:36,873][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_592c749b-8ada-47ef-9179-47b43f1e4165"
[2018-04-24T17:17:36,873][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:36,873][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:36,889][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://10.54.52.31:9200]
[2018-04-24T17:17:36,889][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_writer"
[2018-04-24T17:17:36,889][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2018-04-24T17:17:36,889][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2018-04-24T17:17:36,893][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@cacert = "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T17:17:36,893][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl = true
[2018-04-24T17:17:36,893][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "elasticsearch_3393f11e-d499-4563-b7fd-05c1ff6e7f0c"
[2018-04-24T17:17:36,893][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2018-04-24T17:17:36,901][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_592c749b-8ada-47ef-9179-47b43f1e4165", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T17:17:36,904][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2018-04-24T17:17:36,904][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "logstash-%{+YYYY.MM.dd}"
[2018-04-24T17:17:36,904][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2018-04-24T17:17:36,904][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2018-04-24T17:17:36,905][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2018-04-24T17:17:36,906][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2018-04-24T17:17:36,906][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2018-04-24T17:17:36,906][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2018-04-24T17:17:36,906][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2018-04-24T17:17:36,906][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2018-04-24T17:17:36,906][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2018-04-24T17:17:36,906][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2018-04-24T17:17:36,906][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2018-04-24T17:17:36,907][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2018-04-24T17:17:36,909][DEBUG][logstash.configmanagement.elasticsearchsource] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-04-24T17:17:36,915][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-04-24T17:17:36,915][INFO ][logstash.configmanagement.elasticsearchsource] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_writer:xxxxxx@10.54.52.31:9200/]}}
[2018-04-24T17:17:36,927][INFO ][logstash.configmanagement.elasticsearchsource] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://logstash_writer:xxxxxx@10.54.52.31:9200/, :path=>"/"}
[2018-04-24T17:17:36,996][WARN ][logstash.configmanagement.elasticsearchsource] Restored connection to ES instance {:url=>"https://logstash_writer:xxxxxx@10.54.52.31:9200/"}
[2018-04-24T17:17:36,999][INFO ][logstash.configmanagement.elasticsearchsource] ES Output version determined {:es_version=>6}
[2018-04-24T17:17:36,999][WARN ][logstash.configmanagement.elasticsearchsource] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-24T17:17:37,028][DEBUG][logstash.agent ] Converging pipelines state {:actions_count=>2}
[2018-04-24T17:17:37,030][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:.monitoring-logstash}
[2018-04-24T17:17:37,224][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_4e336793-fc82-4f39-bff7-fd9785d5700a"
[2018-04-24T17:17:37,224][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:37,224][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:37,225][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@collection_interval = 10
[2018-04-24T17:17:37,225][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@collection_timeout_interval = 600
[2018-04-24T17:17:37,225][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@extended_performance_collection = "true"
[2018-04-24T17:17:37,225][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@config_collection = "true"
[2018-04-24T17:17:37,225][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@id = "cdf1ea97fc5684fe6e369cd88ce9e30114cd1f2f93adfad2101c922d259ef5b4"
[2018-04-24T17:17:37,225][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@enable_metric = true
[2018-04-24T17:17:37,225][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@codec = <LogStash::Codecs::Plain id=>"plain_4e336793-fc82-4f39-bff7-fd9785d5700a", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T17:17:37,225][DEBUG][logstash.inputs.metrics ] config LogStash::Inputs::Metrics/@add_field = {}
[2018-04-24T17:17:37,226][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"elasticsearch", :type=>"output", :class=>LogStash::Outputs::ElasticSearch}
[2018-04-24T17:17:37,234][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_b9ae998c-863a-4b84-8386-85ed7796db4b"
[2018-04-24T17:17:37,234][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:37,234][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:37,237][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch hosts=>[https://10.54.52.31:9200], bulk_path=>"/_xpack/monitoring/_bulk?system_id=logstash&system_api_version=2&interval=1s", manage_template=>false, document_type=>"%{[@metadata][document_type]}", sniffing=>false, user=>"logstash_writer", password=><password>, ssl=>true, cacert=>"/etc/logstash/cert/LMUWU0438.pem", id=>"bbab00ed38755642d2fb5362673e378c4205645a5c0dee2bf5d0df95fdbcd636", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_b9ae998c-863a-4b84-8386-85ed7796db4b", enable_metric=>true, charset=>"UTF-8">, workers=>1, template_name=>"logstash", template_overwrite=>false, doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_initial_interval=>2, retry_max_interval=>64, retry_on_conflict=>1, action=>"index", ssl_certificate_verification=>true, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false>}
[2018-04-24T17:17:37,238][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://10.54.52.31:9200]
[2018-04-24T17:17:37,238][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@bulk_path = "/_xpack/monitoring/_bulk?system_id=logstash&system_api_version=2&interval=1s"
[2018-04-24T17:17:37,239][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = false
[2018-04-24T17:17:37,239][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@document_type = "%{[@metadata][document_type]}"
[2018-04-24T17:17:37,239][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = ""
[2018-04-24T17:17:37,239][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2018-04-24T17:17:37,239][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_writer"
[2018-04-24T17:17:37,239][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2018-04-24T17:17:37,239][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl = true
[2018-04-24T17:17:37,239][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@cacert = "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T17:17:37,239][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "bbab00ed38755642d2fb5362673e378c4205645a5c0dee2bf5d0df95fdbcd636"
[2018-04-24T17:17:37,239][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2018-04-24T17:17:37,239][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_b9ae998c-863a-4b84-8386-85ed7796db4b", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T17:17:37,239][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2018-04-24T17:17:37,240][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2018-04-24T17:17:37,241][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2018-04-24T17:17:37,241][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2018-04-24T17:17:37,241][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2018-04-24T17:17:37,241][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2018-04-24T17:17:37,241][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2018-04-24T17:17:37,241][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2018-04-24T17:17:37,241][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2018-04-24T17:17:37,241][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2018-04-24T17:17:37,241][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2018-04-24T17:17:37,241][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2018-04-24T17:17:37,244][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50}
[2018-04-24T17:17:37,248][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-04-24T17:17:37,260][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_writer:xxxxxx@10.54.52.31:9200/]}}
[2018-04-24T17:17:37,262][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://logstash_writer:xxxxxx@10.54.52.31:9200/, :path=>"/"}
[2018-04-24T17:17:37,283][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://logstash_writer:xxxxxx@10.54.52.31:9200/"}
[2018-04-24T17:17:37,288][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-04-24T17:17:37,289][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-24T17:17:37,291][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://10.54.52.31:9200"]}
[2018-04-24T17:17:37,302][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_741206e1-add7-42ce-9ec8-a390213025ea"
[2018-04-24T17:17:37,302][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:37,302][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:37,304][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://10.54.52.31:9200]
[2018-04-24T17:17:37,304][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_writer"
[2018-04-24T17:17:37,304][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2018-04-24T17:17:37,304][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2018-04-24T17:17:37,304][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@cacert = "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T17:17:37,305][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl = true
[2018-04-24T17:17:37,305][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "elasticsearch_c08f3207-45b4-4a27-bb67-61da6716eda0"
[2018-04-24T17:17:37,305][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2018-04-24T17:17:37,305][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_741206e1-add7-42ce-9ec8-a390213025ea", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T17:17:37,305][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2018-04-24T17:17:37,305][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "logstash-%{+YYYY.MM.dd}"
[2018-04-24T17:17:37,305][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2018-04-24T17:17:37,305][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2018-04-24T17:17:37,305][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2018-04-24T17:17:37,306][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2018-04-24T17:17:37,306][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2018-04-24T17:17:37,306][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2018-04-24T17:17:37,306][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2018-04-24T17:17:37,306][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2018-04-24T17:17:37,306][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2018-04-24T17:17:37,306][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2018-04-24T17:17:37,306][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2018-04-24T17:17:37,306][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2018-04-24T17:17:37,306][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2018-04-24T17:17:37,306][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2018-04-24T17:17:37,306][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2018-04-24T17:17:37,306][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2018-04-24T17:17:37,306][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2018-04-24T17:17:37,307][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2018-04-24T17:17:37,307][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2018-04-24T17:17:37,307][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2018-04-24T17:17:37,307][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2018-04-24T17:17:37,307][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2018-04-24T17:17:37,307][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2018-04-24T17:17:37,307][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2018-04-24T17:17:37,307][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2018-04-24T17:17:37,307][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2018-04-24T17:17:37,308][DEBUG][logstash.licensechecker.licensereader] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-04-24T17:17:37,312][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_writer:xxxxxx@10.54.52.31:9200/]}}
[2018-04-24T17:17:37,312][INFO ][logstash.licensechecker.licensereader] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://logstash_writer:xxxxxx@10.54.52.31:9200/, :path=>"/"}
[2018-04-24T17:17:37,332][WARN ][logstash.licensechecker.licensereader] Restored connection to ES instance {:url=>"https://logstash_writer:xxxxxx@10.54.52.31:9200/"}
[2018-04-24T17:17:37,335][INFO ][logstash.licensechecker.licensereader] ES Output version determined {:es_version=>6}
[2018-04-24T17:17:37,335][WARN ][logstash.licensechecker.licensereader] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-24T17:17:37,346][DEBUG][logstash.licensechecker.licensemanager] updating observers of xpack info change
[2018-04-24T17:17:37,347][DEBUG][logstash.inputs.metrics ] updating licensing state installed:true,
license:{"status"=>"active", "uid"=>"9e6cfba8-7c54-4f36-9ab0-3d4dd63415ee", "type"=>"platinum", "issue_date"=>"2017-07-31T00:00:00.000Z", "issue_date_in_millis"=>1501459200000, "expiry_date"=>"2018-07-31T23:59:59.999Z", "expiry_date_in_millis"=>1533081599999, "max_nodes"=>23, "issued_to"=>"XPO Logistics, Inc. (non-production environments)", "issuer"=>"Aunik Bhattacharjee", "start_date_in_millis"=>1501459200000},
last_updated:}
[2018-04-24T17:17:37,350][INFO ][logstash.pipeline ] Pipeline started succesfully {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x134849f2@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 sleep>"}
[2018-04-24T17:17:37,362][DEBUG][logstash.agent ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:main}
[2018-04-24T17:17:37,371][DEBUG][logstash.inputs.metrics ] Metric: input started
[2018-04-24T17:17:38,375][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"beats", :type=>"input", :class=>LogStash::Inputs::Beats}
[2018-04-24T17:17:38,383][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_369dbb1e-c19c-44b0-82a3-3b5637a219a9"
[2018-04-24T17:17:38,384][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:38,384][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:38,385][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@port = 5044
[2018-04-24T17:17:38,385][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@id = "5537bdbc19a5d80de34ddad7bdfd09082f3df48f7a8695f55459b8ee4dd06cf1"
[2018-04-24T17:17:38,385][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@enable_metric = true
[2018-04-24T17:17:38,385][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@codec = <LogStash::Codecs::Plain id=>"plain_369dbb1e-c19c-44b0-82a3-3b5637a219a9", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T17:17:38,386][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@add_field = {}
[2018-04-24T17:17:38,386][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@host = "0.0.0.0"
[2018-04-24T17:17:38,386][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl = false
[2018-04-24T17:17:38,386][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_certificate_authorities = []
[2018-04-24T17:17:38,387][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_verify_mode = "none"
[2018-04-24T17:17:38,387][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@include_codec_tag = true
[2018-04-24T17:17:38,387][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@ssl_handshake_timeout = 10000
[2018-04-24T17:17:38,387][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@tls_min_version = 1
[2018-04-24T17:17:38,387][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@tls_max_version = 1.2
[2018-04-24T17:17:38,387][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@cipher_suites = ["TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384", "TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384", "TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256", "TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256"]
[2018-04-24T17:17:38,387][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@client_inactivity_timeout = 60
[2018-04-24T17:17:38,388][DEBUG][logstash.inputs.beats ] config LogStash::Inputs::Beats/@executor_threads = 4
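In the beats input dump above, only @port carries a non-default value (port is the plugin's one required setting; host 0.0.0.0, ssl false, the cipher list, and the rest are defaults), so the input block in the managed pipeline is presumably just:

    input {
      beats {
        port => 5044
      }
    }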
[2018-04-24T17:17:38,396][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"mutate", :type=>"filter", :class=>LogStash::Filters::Mutate}
[2018-04-24T17:17:38,401][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_field = ["[beat][hostname]", "[beat][name]"]
[2018-04-24T17:17:38,402][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@id = "9fc06e415faae68f28253c2f1e1835ca4184585e823d2adf3dfdcab75ecf93eb"
[2018-04-24T17:17:38,402][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@enable_metric = true
[2018-04-24T17:17:38,402][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_tag = []
[2018-04-24T17:17:38,402][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_tag = []
[2018-04-24T17:17:38,402][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_field = {}
[2018-04-24T17:17:38,402][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@periodic_flush = false
[2018-04-24T17:17:38,404][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_field = ["keywords", "computer_name", "[event_data][param1]"]
[2018-04-24T17:17:38,405][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@id = "505c54eaf9f7dd705fda815d4f9ce6be2537022d424504b6a722f8d18b174c48"
[2018-04-24T17:17:38,405][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@enable_metric = true
[2018-04-24T17:17:38,405][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_tag = []
[2018-04-24T17:17:38,405][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_tag = []
[2018-04-24T17:17:38,405][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_field = {}
[2018-04-24T17:17:38,405][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@periodic_flush = false
[2018-04-24T17:17:38,422][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"grok", :type=>"filter", :class=>LogStash::Filters::Grok}
[2018-04-24T17:17:38,426][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@patterns_dir = ["/etc/logstash/conf.d/patterns"]
[2018-04-24T17:17:38,426][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@match = {"message"=>"%{GREEDYDATA:PrefixMessage} \\[%{DATESWITHUNDERLINE:logtime}\\] %{GREEDYDATA:SuffixMessage}"}
[2018-04-24T17:17:38,426][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@id = "4037240d9184f97dd40e18d7d0e0c20bf404e642aa17c2a60f3e76cd4aae0415"
[2018-04-24T17:17:38,426][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@enable_metric = true
[2018-04-24T17:17:38,426][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@add_tag = []
[2018-04-24T17:17:38,426][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@remove_tag = []
[2018-04-24T17:17:38,426][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@add_field = {}
[2018-04-24T17:17:38,426][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@remove_field = []
[2018-04-24T17:17:38,426][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@periodic_flush = false
[2018-04-24T17:17:38,426][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@pattern_definitions = {}
[2018-04-24T17:17:38,426][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@patterns_files_glob = "*"
[2018-04-24T17:17:38,426][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@break_on_match = true
[2018-04-24T17:17:38,427][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@named_captures_only = true
[2018-04-24T17:17:38,427][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@keep_empty_captures = false
[2018-04-24T17:17:38,427][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@tag_on_failure = ["_grokparsefailure"]
[2018-04-24T17:17:38,427][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@timeout_millis = 30000
[2018-04-24T17:17:38,427][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@tag_on_timeout = "_groktimeout"
[2018-04-24T17:17:38,427][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@overwrite = []
[2018-04-24T17:17:38,429][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@replace = {"message"=>"%{[logtime]} : %{[PrefixMessage]} : %{[SuffixMessage]}"}
[2018-04-24T17:17:38,429][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@id = "eb363b21a70c9bcc2d19bbfce3717472d315a36ab5ada4220a8147346159a68a"
[2018-04-24T17:17:38,430][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@enable_metric = true
[2018-04-24T17:17:38,430][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_tag = []
[2018-04-24T17:17:38,430][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_tag = []
[2018-04-24T17:17:38,430][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_field = {}
[2018-04-24T17:17:38,430][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_field = []
[2018-04-24T17:17:38,430][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@periodic_flush = false
[2018-04-24T17:17:38,435][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"date", :type=>"filter", :class=>LogStash::Filters::Date}
[2018-04-24T17:17:38,438][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@match = ["logtime", "ISO8601", "yyyy_MM_dd_HH_mm_ss_SSS"]
[2018-04-24T17:17:38,438][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@target = "@timestamp"
[2018-04-24T17:17:38,438][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@timezone = "UTC"
[2018-04-24T17:17:38,438][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@id = "df8923c17f0ee47380e6bc2eef6ac58880bda6d6ca96907434ce09e573055b87"
[2018-04-24T17:17:38,439][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@enable_metric = true
[2018-04-24T17:17:38,439][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@add_tag = []
[2018-04-24T17:17:38,439][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@remove_tag = []
[2018-04-24T17:17:38,439][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@add_field = {}
[2018-04-24T17:17:38,439][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@remove_field = []
[2018-04-24T17:17:38,439][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@periodic_flush = false
[2018-04-24T17:17:38,439][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@tag_on_failure = ["_dateparsefailure"]
[2018-04-24T17:17:38,449][DEBUG][org.logstash.filters.DateFilter] Date filter with format=ISO8601, locale=null, timezone=UTC built as org.logstash.filters.parser.CasualISO8601Parser
[2018-04-24T17:17:38,454][DEBUG][org.logstash.filters.DateFilter] Date filter with format=yyyy_MM_dd_HH_mm_ss_SSS, locale=null, timezone=UTC built as org.logstash.filters.parser.JodaParser
[2018-04-24T17:17:38,457][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_field = ["logtime", "PrefixMessage", "SuffixMessage"]
[2018-04-24T17:17:38,457][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@id = "bbe33a88317edd9a24cee3aa9d9cce34045096614bca501f3c58961a0b8a81f0"
[2018-04-24T17:17:38,457][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@enable_metric = true
[2018-04-24T17:17:38,457][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_tag = []
[2018-04-24T17:17:38,458][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_tag = []
[2018-04-24T17:17:38,458][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_field = {}
[2018-04-24T17:17:38,458][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@periodic_flush = false
[2018-04-24T17:17:38,460][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@patterns_dir = ["/etc/logstash/conf.d/patterns"]
[2018-04-24T17:17:38,460][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@match = {"message"=>"%{GREEDYDATA:LevelMessage} %{TIMESTAMP_ISO8601:logtime}%{GREEDYDATA:SuffixMessage}"}
[2018-04-24T17:17:38,460][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@id = "a4eeaa8b6fc65b97bcf9fae420ec2b32afbfc3c9f145e9a947d202a241be3e8d"
[2018-04-24T17:17:38,460][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@enable_metric = true
[2018-04-24T17:17:38,460][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@add_tag = []
[2018-04-24T17:17:38,460][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@remove_tag = []
[2018-04-24T17:17:38,460][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@add_field = {}
[2018-04-24T17:17:38,460][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@remove_field = []
[2018-04-24T17:17:38,460][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@periodic_flush = false
[2018-04-24T17:17:38,460][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@pattern_definitions = {}
[2018-04-24T17:17:38,460][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@patterns_files_glob = "*"
[2018-04-24T17:17:38,460][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@break_on_match = true
[2018-04-24T17:17:38,461][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@named_captures_only = true
[2018-04-24T17:17:38,461][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@keep_empty_captures = false
[2018-04-24T17:17:38,461][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@tag_on_failure = ["_grokparsefailure"]
[2018-04-24T17:17:38,461][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@timeout_millis = 30000
[2018-04-24T17:17:38,461][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@tag_on_timeout = "_groktimeout"
[2018-04-24T17:17:38,461][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@overwrite = []
[2018-04-24T17:17:38,463][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@match = ["logtime", "ISO8601", "yyyy-MM-dd HH:mm:ss,SSS"]
[2018-04-24T17:17:38,463][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@target = "@timestamp"
[2018-04-24T17:17:38,463][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@id = "83dd44229a8537d7ad3a0c33411f2ef220944d21e1e034b20ba581de58a56578"
[2018-04-24T17:17:38,463][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@enable_metric = true
[2018-04-24T17:17:38,463][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@add_tag = []
[2018-04-24T17:17:38,463][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@remove_tag = []
[2018-04-24T17:17:38,463][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@add_field = {}
[2018-04-24T17:17:38,463][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@remove_field = []
[2018-04-24T17:17:38,463][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@periodic_flush = false
[2018-04-24T17:17:38,463][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@tag_on_failure = ["_dateparsefailure"]
[2018-04-24T17:17:38,464][DEBUG][org.logstash.filters.DateFilter] Date filter with format=ISO8601, locale=null, timezone=null built as org.logstash.filters.parser.CasualISO8601Parser
[2018-04-24T17:17:38,464][DEBUG][org.logstash.filters.DateFilter] Date filter with format=yyyy-MM-dd HH:mm:ss,SSS, locale=null, timezone=null built as org.logstash.filters.parser.JodaParser
[2018-04-24T17:17:38,466][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_field = ["logtime", "LevelMessage", "SuffixMessage"]
[2018-04-24T17:17:38,466][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@id = "94c3fb44fd4485d915b4f47760800ef72d803e08af970b60eb7e2bb6a76f863c"
[2018-04-24T17:17:38,466][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@enable_metric = true
[2018-04-24T17:17:38,466][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_tag = []
[2018-04-24T17:17:38,466][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_tag = []
[2018-04-24T17:17:38,466][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_field = {}
[2018-04-24T17:17:38,467][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@periodic_flush = false
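Taken together, the grok → date → mutate sequence logged above reconstructs to roughly this block (defaults such as break_on_match, timeout_millis, and target are omitted, and any conditional routing around it is not visible in this excerpt):

    filter {
      grok {
        patterns_dir => ["/etc/logstash/conf.d/patterns"]
        match        => { "message" => "%{GREEDYDATA:LevelMessage} %{TIMESTAMP_ISO8601:logtime}%{GREEDYDATA:SuffixMessage}" }
      }
      date {
        match => ["logtime", "ISO8601", "yyyy-MM-dd HH:mm:ss,SSS"]
      }
      mutate {
        remove_field => ["logtime", "LevelMessage", "SuffixMessage"]
      }
    }

One design note: a leading %{GREEDYDATA} in front of %{TIMESTAMP_ISO8601} forces heavy regex backtracking on non-matching lines, which is why the 30000 ms grok timeout above matters; anchoring the pattern or splitting with dissect is the usual mitigation.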
[2018-04-24T17:17:38,469][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@patterns_dir = ["/etc/logstash/conf.d/patterns"]
[2018-04-24T17:17:38,469][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@match = {"message"=>"%{GREEDYDATA:PrefixMessage}/%{GREEDYDATA:PrefixMessageTwo}/%{DATESWITHDOTS:logtimetwo}%{GREEDYDATA:SuffixMessage}"}
[2018-04-24T17:17:38,469][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@id = "f03898c9ae1dcf6c1e43edff832388087a4e508c2af2bea6828e725e91006250"
[2018-04-24T17:17:38,469][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@enable_metric = true
[2018-04-24T17:17:38,469][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@add_tag = []
[2018-04-24T17:17:38,469][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@remove_tag = []
[2018-04-24T17:17:38,469][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@add_field = {}
[2018-04-24T17:17:38,470][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@remove_field = []
[2018-04-24T17:17:38,470][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@periodic_flush = false
[2018-04-24T17:17:38,470][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@pattern_definitions = {}
[2018-04-24T17:17:38,470][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@patterns_files_glob = "*"
[2018-04-24T17:17:38,470][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@break_on_match = true
[2018-04-24T17:17:38,470][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@named_captures_only = true
[2018-04-24T17:17:38,470][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@keep_empty_captures = false
[2018-04-24T17:17:38,470][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@tag_on_failure = ["_grokparsefailure"]
[2018-04-24T17:17:38,470][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@timeout_millis = 30000
[2018-04-24T17:17:38,470][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@tag_on_timeout = "_groktimeout"
[2018-04-24T17:17:38,470][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@overwrite = []
[2018-04-24T17:17:38,472][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@replace = {"message"=>"%{[logtimetwo]} : %{[PrefixMessage]} : %{[PrefixMessageTwo]} : %{[SuffixMessage]}"}
[2018-04-24T17:17:38,472][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@id = "b355a946e1dd18ca9fcd72d7b3967f6f28d003aaab13ca68f98706c886330e67"
[2018-04-24T17:17:38,472][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@enable_metric = true
[2018-04-24T17:17:38,472][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_tag = []
[2018-04-24T17:17:38,472][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_tag = []
[2018-04-24T17:17:38,472][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_field = {}
[2018-04-24T17:17:38,472][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_field = []
[2018-04-24T17:17:38,472][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@periodic_flush = false
[2018-04-24T17:17:38,474][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@match = ["logtimetwo", "ISO8601", "yyyy.MM.dd.HH.mm.ss.SSS"]
[2018-04-24T17:17:38,474][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@target = "@timestamp"
[2018-04-24T17:17:38,474][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@id = "83cfabdc54e1b36eddd9006f6152f2db76d90067ed269fc0ef42a901a33eee01"
[2018-04-24T17:17:38,474][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@enable_metric = true
[2018-04-24T17:17:38,475][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@add_tag = []
[2018-04-24T17:17:38,475][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@remove_tag = []
[2018-04-24T17:17:38,475][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@add_field = {}
[2018-04-24T17:17:38,475][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@remove_field = []
[2018-04-24T17:17:38,475][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@periodic_flush = false
[2018-04-24T17:17:38,475][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@tag_on_failure = ["_dateparsefailure"]
[2018-04-24T17:17:38,475][DEBUG][org.logstash.filters.DateFilter] Date filter with format=ISO8601, locale=null, timezone=null built as org.logstash.filters.parser.CasualISO8601Parser
[2018-04-24T17:17:38,475][DEBUG][org.logstash.filters.DateFilter] Date filter with format=yyyy.MM.dd.HH.mm.ss.SSS, locale=null, timezone=null built as org.logstash.filters.parser.JodaParser
[2018-04-24T17:17:38,480][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_field = ["logtimetwo", "PrefixMessage", "PrefixMessageTwo", "SuffixMessage"]
[2018-04-24T17:17:38,480][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@id = "a8d6a82ac595b101dafb6127a737491e870321c634c846df9c6fcc7fc1ef6fb7"
[2018-04-24T17:17:38,480][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@enable_metric = true
[2018-04-24T17:17:38,480][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_tag = []
[2018-04-24T17:17:38,480][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_tag = []
[2018-04-24T17:17:38,480][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_field = {}
[2018-04-24T17:17:38,480][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@periodic_flush = false
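DATESWITHDOTS is not a stock grok pattern, so it must be defined in the patterns_dir referenced above. A plausible definition plus the chain these debug lines describe follows; the pattern-file line is an assumption (the real definition is not in this log), while the filter parameters are taken verbatim from the log:

    # Hypothetical /etc/logstash/conf.d/patterns entry; the actual
    # DATESWITHDOTS definition is not shown anywhere in this log:
    DATESWITHDOTS %{YEAR}\.%{MONTHNUM}\.%{MONTHDAY}\.%{HOUR}\.%{MINUTE}\.%{SECOND}

    filter {
      grok {
        patterns_dir => ["/etc/logstash/conf.d/patterns"]
        match        => { "message" => "%{GREEDYDATA:PrefixMessage}/%{GREEDYDATA:PrefixMessageTwo}/%{DATESWITHDOTS:logtimetwo}%{GREEDYDATA:SuffixMessage}" }
      }
      mutate {
        replace => { "message" => "%{[logtimetwo]} : %{[PrefixMessage]} : %{[PrefixMessageTwo]} : %{[SuffixMessage]}" }
      }
      date {
        match => ["logtimetwo", "ISO8601", "yyyy.MM.dd.HH.mm.ss.SSS"]
      }
      mutate {
        remove_field => ["logtimetwo", "PrefixMessage", "PrefixMessageTwo", "SuffixMessage"]
      }
    }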
[2018-04-24T17:17:38,485][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@patterns_dir = ["/etc/logstash/conf.d/patterns"]
[2018-04-24T17:17:38,485][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@match = {"message"=>"%{GREEDYDATA:Message}Timestamp : %{TIMESTAMP_ISO8601:logtime}"}
[2018-04-24T17:17:38,485][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@id = "5c5c7de9a254764eb192289530ff73343de85b4b49af0e3f0dff371267d0d0d2"
[2018-04-24T17:17:38,485][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@enable_metric = true
[2018-04-24T17:17:38,485][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@add_tag = []
[2018-04-24T17:17:38,485][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@remove_tag = []
[2018-04-24T17:17:38,485][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@add_field = {}
[2018-04-24T17:17:38,486][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@remove_field = []
[2018-04-24T17:17:38,486][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@periodic_flush = false
[2018-04-24T17:17:38,486][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@pattern_definitions = {}
[2018-04-24T17:17:38,486][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@patterns_files_glob = "*"
[2018-04-24T17:17:38,486][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@break_on_match = true
[2018-04-24T17:17:38,486][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@named_captures_only = true
[2018-04-24T17:17:38,486][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@keep_empty_captures = false
[2018-04-24T17:17:38,486][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@tag_on_failure = ["_grokparsefailure"]
[2018-04-24T17:17:38,486][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@timeout_millis = 30000
[2018-04-24T17:17:38,486][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@tag_on_timeout = "_groktimeout"
[2018-04-24T17:17:38,486][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@overwrite = []
[2018-04-24T17:17:38,489][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@match = ["logtime", "ISO8601", "yyyy-MM-dd HH:mm:ss.SSSS", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSS'Z'"]
[2018-04-24T17:17:38,489][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@target = "@timestamp"
[2018-04-24T17:17:38,489][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@id = "49721ca4126eab2bb4c9cfb46ff71f547fb6ef00419614af28eb7cd247a67c99"
[2018-04-24T17:17:38,489][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@enable_metric = true
[2018-04-24T17:17:38,489][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@add_tag = []
[2018-04-24T17:17:38,489][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@remove_tag = []
[2018-04-24T17:17:38,489][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@add_field = {}
[2018-04-24T17:17:38,489][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@remove_field = []
[2018-04-24T17:17:38,489][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@periodic_flush = false
[2018-04-24T17:17:38,490][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@tag_on_failure = ["_dateparsefailure"]
[2018-04-24T17:17:38,490][DEBUG][org.logstash.filters.DateFilter] Date filter with format=ISO8601, locale=null, timezone=null built as org.logstash.filters.parser.CasualISO8601Parser
[2018-04-24T17:17:38,490][DEBUG][org.logstash.filters.DateFilter] Date filter with format=yyyy-MM-dd HH:mm:ss.SSSS, locale=null, timezone=null built as org.logstash.filters.parser.JodaParser
[2018-04-24T17:17:38,490][DEBUG][org.logstash.filters.DateFilter] Date filter with format=yyyy-MM-dd HH:mm:ss,SSS, locale=null, timezone=null built as org.logstash.filters.parser.JodaParser
[2018-04-24T17:17:38,490][DEBUG][org.logstash.filters.DateFilter] Date filter with format=yyyy-MM-dd'T'HH:mm:ss.SSSSSSS'Z', locale=null, timezone=null built as org.logstash.filters.parser.JodaParser
[2018-04-24T17:17:38,492][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@gsub = ["message", "--------------- Event Log Start Here ---------------\\n", "", "message", "\\n--------------- Event Log End Here ---------------", ""]
[2018-04-24T17:17:38,492][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_field = ["Message"]
[2018-04-24T17:17:38,493][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@id = "ca44f71e8b77fe8918f303e55479a4b1a451e403040e32630df48bddc3819b40"
[2018-04-24T17:17:38,497][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@enable_metric = true
[2018-04-24T17:17:38,497][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_tag = []
[2018-04-24T17:17:38,497][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_tag = []
[2018-04-24T17:17:38,497][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_field = {}
[2018-04-24T17:17:38,497][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@periodic_flush = false
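This chain strips "Event Log Start/End Here" banners and stamps the event from a trailing "Timestamp : …" marker. Reconstructed below; note that in a .conf file the gsub strings are written with a single backslash-n, which the debug printer escapes as \\n:

    filter {
      grok {
        patterns_dir => ["/etc/logstash/conf.d/patterns"]
        match        => { "message" => "%{GREEDYDATA:Message}Timestamp : %{TIMESTAMP_ISO8601:logtime}" }
      }
      date {
        match => ["logtime", "ISO8601", "yyyy-MM-dd HH:mm:ss.SSSS",
                  "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSS'Z'"]
      }
      mutate {
        gsub => [
          "message", "--------------- Event Log Start Here ---------------\n", "",
          "message", "\n--------------- Event Log End Here ---------------", ""
        ]
        remove_field => ["Message"]
      }
    }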
[2018-04-24T17:17:38,501][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@patterns_dir = ["/etc/logstash/conf.d/patterns"]
[2018-04-24T17:17:38,501][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@match = {"message"=>"%{TIMESTAMP_ISO8601:logtime} %{GREEDYDATA:Message}"}
[2018-04-24T17:17:38,501][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@id = "8f467467d51f5218229defdae13f6d26046943a40cd23a9525f45a91d8ff700b"
[2018-04-24T17:17:38,501][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@enable_metric = true
[2018-04-24T17:17:38,502][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@add_tag = []
[2018-04-24T17:17:38,502][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@remove_tag = []
[2018-04-24T17:17:38,502][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@add_field = {}
[2018-04-24T17:17:38,502][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@remove_field = []
[2018-04-24T17:17:38,502][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@periodic_flush = false
[2018-04-24T17:17:38,502][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@pattern_definitions = {}
[2018-04-24T17:17:38,502][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@patterns_files_glob = "*"
[2018-04-24T17:17:38,502][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@break_on_match = true
[2018-04-24T17:17:38,502][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@named_captures_only = true
[2018-04-24T17:17:38,502][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@keep_empty_captures = false
[2018-04-24T17:17:38,502][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@tag_on_failure = ["_grokparsefailure"]
[2018-04-24T17:17:38,502][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@timeout_millis = 30000
[2018-04-24T17:17:38,502][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@tag_on_timeout = "_groktimeout"
[2018-04-24T17:17:38,503][DEBUG][logstash.filters.grok ] config LogStash::Filters::Grok/@overwrite = []
[2018-04-24T17:17:38,506][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@match = ["logtime", "ISO8601", "yyyy-MM-dd HH:mm:ss.SSSS", "MMMM dd, yyyy HH:mm:ss aa", "yyyy-MM-dd hh:mm:ss,SSS", "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSS'Z'"]
[2018-04-24T17:17:38,506][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@target = "@timestamp"
[2018-04-24T17:17:38,506][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@id = "98148cf069a8dc5dbfba3bd5f4247bfceb347f5730b9395ddaa2f493db8c23ef"
[2018-04-24T17:17:38,506][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@enable_metric = true
[2018-04-24T17:17:38,506][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@add_tag = []
[2018-04-24T17:17:38,506][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@remove_tag = []
[2018-04-24T17:17:38,506][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@add_field = {}
[2018-04-24T17:17:38,506][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@remove_field = []
[2018-04-24T17:17:38,506][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@periodic_flush = false
[2018-04-24T17:17:38,506][DEBUG][logstash.filters.date ] config LogStash::Filters::Date/@tag_on_failure = ["_dateparsefailure"]
[2018-04-24T17:17:38,506][DEBUG][org.logstash.filters.DateFilter] Date filter with format=ISO8601, locale=null, timezone=null built as org.logstash.filters.parser.CasualISO8601Parser
[2018-04-24T17:17:38,507][DEBUG][org.logstash.filters.DateFilter] Date filter with format=yyyy-MM-dd HH:mm:ss.SSSS, locale=null, timezone=null built as org.logstash.filters.parser.JodaParser
[2018-04-24T17:17:38,507][DEBUG][org.logstash.filters.DateFilter] Date filter with format=MMMM dd, yyyy HH:mm:ss aa, locale=null, timezone=null built as org.logstash.filters.parser.JodaParser
[2018-04-24T17:17:38,507][DEBUG][org.logstash.filters.DateFilter] Date filter with format=yyyy-MM-dd hh:mm:ss,SSS, locale=null, timezone=null built as org.logstash.filters.parser.JodaParser
[2018-04-24T17:17:38,507][DEBUG][org.logstash.filters.DateFilter] Date filter with format=yyyy-MM-dd HH:mm:ss,SSS, locale=null, timezone=null built as org.logstash.filters.parser.JodaParser
[2018-04-24T17:17:38,507][DEBUG][org.logstash.filters.DateFilter] Date filter with format=yyyy-MM-dd'T'HH:mm:ss.SSSSSSS'Z', locale=null, timezone=null built as org.logstash.filters.parser.JodaParser
[2018-04-24T17:17:38,512][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_field = ["Message"]
[2018-04-24T17:17:38,512][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@id = "6aeeff6c2638eb7e0f31012bdfb27b0e03a6c983b6ed03efe8f17ac2e0f19a20"
[2018-04-24T17:17:38,513][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@enable_metric = true
[2018-04-24T17:17:38,513][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_tag = []
[2018-04-24T17:17:38,513][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_tag = []
[2018-04-24T17:17:38,513][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_field = {}
[2018-04-24T17:17:38,513][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@periodic_flush = false
[2018-04-24T17:17:38,517][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_field = ["logtime"]
[2018-04-24T17:17:38,517][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@id = "78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1"
[2018-04-24T17:17:38,517][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@enable_metric = true
[2018-04-24T17:17:38,517][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_tag = []
[2018-04-24T17:17:38,517][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@remove_tag = []
[2018-04-24T17:17:38,517][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@add_field = {}
[2018-04-24T17:17:38,517][DEBUG][logstash.filters.mutate ] config LogStash::Filters::Mutate/@periodic_flush = false
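The final filter group pairs a timestamp-first grok with the broadest date match in the pipeline, then two cleanup mutates. As one block (whether the two mutates actually sit in the same conditional is not visible here):

    filter {
      grok {
        patterns_dir => ["/etc/logstash/conf.d/patterns"]
        match        => { "message" => "%{TIMESTAMP_ISO8601:logtime} %{GREEDYDATA:Message}" }
      }
      date {
        match => ["logtime", "ISO8601", "yyyy-MM-dd HH:mm:ss.SSSS",
                  "MMMM dd, yyyy HH:mm:ss aa", "yyyy-MM-dd hh:mm:ss,SSS",
                  "yyyy-MM-dd HH:mm:ss,SSS", "yyyy-MM-dd'T'HH:mm:ss.SSSSSSS'Z'"]
      }
      mutate { remove_field => ["Message"] }
      mutate { remove_field => ["logtime"] }
    }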
[2018-04-24T17:17:38,521][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_b72a588b-6100-414c-a5f1-87e20dcd100b"
[2018-04-24T17:17:38,521][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:38,521][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:38,523][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://10.54.52.31:9200]
[2018-04-24T17:17:38,523][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_writer"
[2018-04-24T17:17:38,525][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2018-04-24T17:17:38,525][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@cacert = "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T17:17:38,525][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl = true
[2018-04-24T17:17:38,525][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2018-04-24T17:17:38,525][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "logstash01-%{+YYYY.MM.dd}"
[2018-04-24T17:17:38,525][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "17f56a2dd2dac572b4037d04771e3e1ace454ec24fd5523d664576d9f466ef31"
[2018-04-24T17:17:38,525][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2018-04-24T17:17:38,525][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_b72a588b-6100-414c-a5f1-87e20dcd100b", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2018-04-24T17:17:38,526][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2018-04-24T17:17:38,527][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2018-04-24T17:17:38,527][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2018-04-24T17:17:38,527][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2018-04-24T17:17:38,527][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2018-04-24T17:17:38,527][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2018-04-24T17:17:38,527][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2018-04-24T17:17:38,527][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2018-04-24T17:17:38,527][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
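Only a handful of the parameters dumped above are explicit in the config; the rest (workers, retry intervals, pool sizes, and so on) are plugin defaults. The first output therefore reduces to roughly the block below, with the password shown as a hypothetical ${...} keystore/environment reference since the log redacts it:

    output {
      elasticsearch {
        hosts    => ["https://10.54.52.31:9200"]
        user     => "logstash_writer"
        password => "${ES_PWD}"    # redacted in the log; reference name assumed
        ssl      => true
        cacert   => "/etc/logstash/cert/LMUWU0438.pem"
        index    => "logstash01-%{+YYYY.MM.dd}"
      }
    }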
[2018-04-24T17:17:38,531][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_5a21af8c-8551-4d07-b815-f9bab4ee8d83"
[2018-04-24T17:17:38,531][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:38,531][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:38,532][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://10.54.52.31:9200]
[2018-04-24T17:17:38,532][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_writer"
[2018-04-24T17:17:38,532][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2018-04-24T17:17:38,532][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@cacert = "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T17:17:38,533][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl = true
[2018-04-24T17:17:38,533][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2018-04-24T17:17:38,533][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "lmx-dev"
[2018-04-24T17:17:38,533][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "5ce646a90d89ed12a7f99cb997af29716078b6d0d3d4cf31ff8ae95f6ee2ceae"
[2018-04-24T17:17:38,533][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2018-04-24T17:17:38,533][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_5a21af8c-8551-4d07-b815-f9bab4ee8d83", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T17:17:38,533][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2018-04-24T17:17:38,533][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2018-04-24T17:17:38,533][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2018-04-24T17:17:38,533][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2018-04-24T17:17:38,533][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2018-04-24T17:17:38,533][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2018-04-24T17:17:38,533][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2018-04-24T17:17:38,533][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2018-04-24T17:17:38,534][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2018-04-24T17:17:38,537][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_fab3db4a-28da-4063-85c9-91b4082fd852"
[2018-04-24T17:17:38,538][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:38,538][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:38,540][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://10.54.52.31:9200]
[2018-04-24T17:17:38,540][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_writer"
[2018-04-24T17:17:38,540][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2018-04-24T17:17:38,540][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@cacert = "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T17:17:38,540][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl = true
[2018-04-24T17:17:38,540][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2018-04-24T17:17:38,540][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "brokerage"
[2018-04-24T17:17:38,540][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "9c4441aeea169e568e057a019bb62fb0bb7907cd8dcee0e9d21d9c28a41e50e1"
[2018-04-24T17:17:38,540][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2018-04-24T17:17:38,540][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_fab3db4a-28da-4063-85c9-91b4082fd852", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T17:17:38,540][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2018-04-24T17:17:38,540][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2018-04-24T17:17:38,540][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2018-04-24T17:17:38,541][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2018-04-24T17:17:38,542][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2018-04-24T17:17:38,542][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2018-04-24T17:17:38,542][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2018-04-24T17:17:38,545][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_d369e018-a751-452f-9d86-7c2743baa7c9"
[2018-04-24T17:17:38,545][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:38,545][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:38,547][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://10.54.52.31:9200]
[2018-04-24T17:17:38,547][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_writer"
[2018-04-24T17:17:38,547][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2018-04-24T17:17:38,547][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@cacert = "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T17:17:38,547][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl = true
[2018-04-24T17:17:38,547][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2018-04-24T17:17:38,547][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "xrt_product"
[2018-04-24T17:17:38,547][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "c9a4e5b5f33e1ee98d60373e7d712bab92d78dfbbc32b04402534212a92c7312"
[2018-04-24T17:17:38,547][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2018-04-24T17:17:38,547][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_d369e018-a751-452f-9d86-7c2743baa7c9", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T17:17:38,547][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2018-04-24T17:17:38,547][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2018-04-24T17:17:38,547][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2018-04-24T17:17:38,547][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2018-04-24T17:17:38,548][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2018-04-24T17:17:38,549][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2018-04-24T17:17:38,549][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2018-04-24T17:17:38,549][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2018-04-24T17:17:38,552][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_7bce0052-8181-44d4-ab11-17d3b531013a"
[2018-04-24T17:17:38,552][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:38,552][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:38,553][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://10.54.52.31:9200]
[2018-04-24T17:17:38,553][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_writer"
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@cacert = "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl = true
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "lmx-sta"
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "3630197e44f4ce7f7863661164289fb53b337b76f88f4f4d1181444e12822517"
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_7bce0052-8181-44d4-ab11-17d3b531013a", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2018-04-24T17:17:38,554][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2018-04-24T17:17:38,555][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
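The lmx-dev, brokerage, xrt_product, and lmx-sta outputs above are identical to the first except for @index, which implies conditional routing in the output section. The conditions themselves are not in this excerpt; a hypothetical shape for one branch:

    output {
      if [type] == "brokerage" {    # routing test assumed; not visible in this log
        elasticsearch {
          hosts    => ["https://10.54.52.31:9200"]
          user     => "logstash_writer"
          password => "${ES_PWD}"
          ssl      => true
          cacert   => "/etc/logstash/cert/LMUWU0438.pem"
          index    => "brokerage"
        }
      }
    }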
[2018-04-24T17:17:38,558][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_1c443460-1862-4d97-9efc-9678b93e140d"
[2018-04-24T17:17:38,558][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:38,558][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:38,560][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://10.54.52.31:9200]
[2018-04-24T17:17:38,560][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@user = "logstash_writer"
[2018-04-24T17:17:38,560][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@password = <password>
[2018-04-24T17:17:38,560][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@cacert = "/etc/logstash/cert/LMUWU0438.pem"
[2018-04-24T17:17:38,560][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl = true
[2018-04-24T17:17:38,560][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = false
[2018-04-24T17:17:38,560][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
[2018-04-24T17:17:38,560][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "e81d6c1b2dbd27273d0670c70a10032ebab46c05b58b1646e0c5ef7486889d0e"
[2018-04-24T17:17:38,560][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_1c443460-1862-4d97-9efc-9678b93e140d", enable_metric=>true, charset=>"UTF-8">
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@parent = nil
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@join_field = nil
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@upsert = ""
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@doc_as_upsert = false
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script = ""
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_type = "inline"
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_lang = "painless"
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@script_var_name = "event"
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@scripted_upsert = false
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_initial_interval = 2
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_max_interval = 64
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2018-04-24T17:17:38,561][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing = false
[2018-04-24T17:17:38,562][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2018-04-24T17:17:38,562][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2018-04-24T17:17:38,562][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2018-04-24T17:17:38,562][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2018-04-24T17:17:38,562][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2018-04-24T17:17:38,562][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2018-04-24T17:17:38,562][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2018-04-24T17:17:38,562][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
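The last output differs in two ways: manage_template = false and an index name built from Beats metadata. That is the standard arrangement when Filebeat/Metricbeat load their own index templates and Logstash should not overwrite them:

    output {
      elasticsearch {
        hosts           => ["https://10.54.52.31:9200"]
        user            => "logstash_writer"
        password        => "${ES_PWD}"    # redacted in the log; reference name assumed
        ssl             => true
        cacert          => "/etc/logstash/cert/LMUWU0438.pem"
        manage_template => false
        index           => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
      }
    }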
[2018-04-24T17:17:38,564][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
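The values in the Starting pipeline line ("pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50) match the stock defaults (workers follows the CPU core count), so no pipeline tuning appears to be set in logstash.yml.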
[2018-04-24T17:17:38,565][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-04-24T17:17:38,570][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_writer:xxxxxx@10.54.52.31:9200/]}}
[2018-04-24T17:17:38,570][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://logstash_writer:xxxxxx@10.54.52.31:9200/, :path=>"/"}
[2018-04-24T17:17:38,588][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://logstash_writer:xxxxxx@10.54.52.31:9200/"}
[2018-04-24T17:17:38,591][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-04-24T17:17:38,591][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-24T17:17:38,592][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-04-24T17:17:38,603][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-04-24T17:17:38,612][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2018-04-24T17:17:38,612][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://10.54.52.31:9200"]}
[2018-04-24T17:17:38,612][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-04-24T17:17:38,617][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_writer:xxxxxx@10.54.52.31:9200/]}}
[2018-04-24T17:17:38,617][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://logstash_writer:xxxxxx@10.54.52.31:9200/, :path=>"/"}
[2018-04-24T17:17:38,635][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://logstash_writer:xxxxxx@10.54.52.31:9200/"}
[2018-04-24T17:17:38,639][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-04-24T17:17:38,639][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-24T17:17:38,649][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-04-24T17:17:38,650][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-04-24T17:17:38,654][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2018-04-24T17:17:38,654][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://10.54.52.31:9200"]}
[2018-04-24T17:17:38,655][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-04-24T17:17:38,658][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_writer:xxxxxx@10.54.52.31:9200/]}}
[2018-04-24T17:17:38,658][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://logstash_writer:xxxxxx@10.54.52.31:9200/, :path=>"/"}
[2018-04-24T17:17:38,678][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://logstash_writer:xxxxxx@10.54.52.31:9200/"}
[2018-04-24T17:17:38,682][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-04-24T17:17:38,682][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-24T17:17:38,682][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-04-24T17:17:38,684][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-04-24T17:17:38,687][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2018-04-24T17:17:38,687][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://10.54.52.31:9200"]}
[2018-04-24T17:17:38,688][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-04-24T17:17:38,692][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_writer:xxxxxx@10.54.52.31:9200/]}}
[2018-04-24T17:17:38,692][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://logstash_writer:xxxxxx@10.54.52.31:9200/, :path=>"/"}
[2018-04-24T17:17:38,709][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://logstash_writer:xxxxxx@10.54.52.31:9200/"}
[2018-04-24T17:17:38,711][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-04-24T17:17:38,711][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-24T17:17:38,713][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-04-24T17:17:38,714][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-04-24T17:17:38,718][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2018-04-24T17:17:38,718][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://10.54.52.31:9200"]}
[2018-04-24T17:17:38,718][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-04-24T17:17:38,721][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_writer:xxxxxx@10.54.52.31:9200/]}}
[2018-04-24T17:17:38,721][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://logstash_writer:xxxxxx@10.54.52.31:9200/, :path=>"/"}
[2018-04-24T17:17:38,739][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://logstash_writer:xxxxxx@10.54.52.31:9200/"}
[2018-04-24T17:17:38,742][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-04-24T17:17:38,743][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-24T17:17:38,743][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-04-24T17:17:38,744][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-04-24T17:17:38,751][DEBUG][logstash.outputs.elasticsearch] Found existing Elasticsearch template. Skipping template management {:name=>"logstash"}
[2018-04-24T17:17:38,751][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://10.54.52.31:9200"]}
[2018-04-24T17:17:38,752][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2018-04-24T17:17:38,754][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://logstash_writer:xxxxxx@10.54.52.31:9200/]}}
[2018-04-24T17:17:38,755][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://logstash_writer:xxxxxx@10.54.52.31:9200/, :path=>"/"}
[2018-04-24T17:17:38,774][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://logstash_writer:xxxxxx@10.54.52.31:9200/"}
[2018-04-24T17:17:38,776][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2018-04-24T17:17:38,777][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2018-04-24T17:17:38,777][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://10.54.52.31:9200"]}
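For reference, the repeated initialization blocks above are consistent with an elasticsearch output along the following lines. This is a minimal sketch reconstructed from the logged values, not the actual pipeline config: the host, user, and https:// scheme come from the log, the password is masked exactly as the log masks it, and the cacert path is hypothetical.

output {
  elasticsearch {
    hosts    => ["https://10.54.52.31:9200"]
    user     => "logstash_writer"
    password => "xxxxxx"                      # masked in the log; real value not shown
    ssl      => true                          # implied by the https:// pool URLs above
    cacert   => "/etc/logstash/certs/ca.pem"  # hypothetical path; not visible in this log
    # manage_template defaults to true, which is why each output instance logs
    # "Attempting to install template" and then skips it once the existing
    # "logstash" template is found.
  }
}

Each repetition of the block above is one instance of this output coming up; with manage_template left at its default, every instance re-checks the "logstash" index template, finds it already installed, and skips template management.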
[2018-04-24T17:17:38,783][DEBUG][logstash.filters.grok ] Grok patterns path {:paths=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-patterns-core-4.1.2/patterns", "/usr/share/logstash/patterns/*"]}
[2018-04-24T17:17:38,786][DEBUG][logstash.filters.grok ] Grok patterns path {:paths=>["/etc/logstash/conf.d/patterns"]}
[2018-04-24T17:17:38,786][DEBUG][logstash.filters.grok ] Match data {:match=>{"message"=>"%{GREEDYDATA:PrefixMessage} \\[%{DATESWITHUNDERLINE:logtime}\\] %{GREEDYDATA:SuffixMessage}"}}
[2018-04-24T17:17:38,787][DEBUG][logstash.filters.grok ] regexp: /message {:pattern=>"%{GREEDYDATA:PrefixMessage} \\[%{DATESWITHUNDERLINE:logtime}\\] %{GREEDYDATA:SuffixMessage}"}
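The Match data and regexp lines above pair a custom patterns directory with a DATESWITHUNDERLINE pattern. A minimal sketch of the corresponding grok filter, reconstructed from those two debug lines (the pattern name and patterns_dir are from the log; the pattern file's contents are not shown in this log, so the definition below is only an assumed example of an underscore-separated timestamp):

filter {
  grok {
    patterns_dir => ["/etc/logstash/conf.d/patterns"]
    match => { "message" => "%{GREEDYDATA:PrefixMessage} \[%{DATESWITHUNDERLINE:logtime}\] %{GREEDYDATA:SuffixMessage}" }
  }
}

# /etc/logstash/conf.d/patterns/custom -- hypothetical pattern file contents:
# DATESWITHUNDERLINE %{YEAR}_%{MONTHNUM}_%{MONTHDAY}_%{HOUR}_%{MINUTE}_%{SECOND}

Note that because the leading GREEDYDATA is greedy, logtime binds to the last bracketed token on the line that matches DATESWITHUNDERLINE, with everything before it absorbed into PrefixMessage.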
[2018-04-24T17:17:38,801][DEBUG][logstash.filters.grok ] Adding pattern {"S3_REQUEST_LINE"=>"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})"}
[2018-04-24T17:17:38,801][DEBUG][logstash.filters.grok ] Adding pattern {"S3_ACCESS_LOG"=>"%{WORD:owner} %{NOTSPACE:bucket} \\[%{HTTPDATE:timestamp}\\] %{IP:clientip} %{NOTSPACE:requester} %{NOTSPACE:request_id} %{NOTSPACE:operation} %{NOTSPACE:key} (?:\"%{S3_REQUEST_LINE}\"|-) (?:%{INT:response:int}|-) (?:-|%{NOTSPACE:error_code}) (?:%{INT:bytes:int}|-) (?:%{INT:object_size:int}|-) (?:%{INT:request_time_ms:int}|-) (?:%{INT:turnaround_time_ms:int}|-) (?:%{QS:referrer}|-) (?:\"?%{QS:agent}\"?|-) (?:-|%{NOTSPACE:version_id})"}
[2018-04-24T17:17:38,801][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_URIPATHPARAM"=>"%{URIPATH:path}(?:%{URIPARAM:params})?"}
[2018-04-24T17:17:38,801][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_URI"=>"%{URIPROTO:proto}://(?:%{USER}(?::[^@]*)?@)?(?:%{URIHOST:urihost})?(?:%{ELB_URIPATHPARAM})?"}
[2018-04-24T17:17:38,801][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_REQUEST_LINE"=>"(?:%{WORD:verb} %{ELB_URI:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})"}
[2018-04-24T17:17:38,801][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_ACCESS_LOG"=>"%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:elb} %{IP:clientip}:%{INT:clientport:int} (?:(%{IP:backendip}:?:%{INT:backendport:int})|-) %{NUMBER:request_processing_time:float} %{NUMBER:backend_processing_time:float} %{NUMBER:response_processing_time:float} %{INT:response:int} %{INT:backend_response:int} %{INT:received_bytes:int} %{INT:bytes:int} \"%{ELB_REQUEST_LINE}\""}
[2018-04-24T17:17:38,801][DEBUG][logstash.filters.grok ] Adding pattern {"CLOUDFRONT_ACCESS_LOG"=>"(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}\\t%{TIME})\\t%{WORD:x_edge_location}\\t(?:%{NUMBER:sc_bytes:int}|-)\\t%{IPORHOST:clientip}\\t%{WORD:cs_method}\\t%{HOSTNAME:cs_host}\\t%{NOTSPACE:cs_uri_stem}\\t%{NUMBER:sc_status:int}\\t%{GREEDYDATA:referrer}\\t%{GREEDYDATA:agent}\\t%{GREEDYDATA:cs_uri_query}\\t%{GREEDYDATA:cookies}\\t%{WORD:x_edge_result_type}\\t%{NOTSPACE:x_edge_request_id}\\t%{HOSTNAME:x_host_header}\\t%{URIPROTO:cs_protocol}\\t%{INT:cs_bytes:int}\\t%{GREEDYDATA:time_taken:float}\\t%{GREEDYDATA:x_forwarded_for}\\t%{GREEDYDATA:ssl_protocol}\\t%{GREEDYDATA:ssl_cipher}\\t%{GREEDYDATA:x_edge_response_result_type}"}
[2018-04-24T17:17:38,802][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_TIMESTAMP"=>"%{MONTHDAY}-%{MONTH} %{HOUR}:%{MINUTE}"}
[2018-04-24T17:17:38,802][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_HOST"=>"[a-zA-Z0-9-]+"}
[2018-04-24T17:17:38,802][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_VOLUME"=>"%{USER}"}
[2018-04-24T17:17:38,802][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_DEVICE"=>"%{USER}"}
[2018-04-24T17:17:38,802][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_DEVICEPATH"=>"%{UNIXPATH}"}
[2018-04-24T17:17:38,802][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_CAPACITY"=>"%{INT}{1,3}(,%{INT}{3})*"}
[2018-04-24T17:17:38,802][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_VERSION"=>"%{USER}"}
[2018-04-24T17:17:38,802][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_JOB"=>"%{USER}"}
[2018-04-24T17:17:38,802][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MAX_CAPACITY"=>"User defined maximum volume capacity %{BACULA_CAPACITY} exceeded on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\)"}
[2018-04-24T17:17:38,802][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_END_VOLUME"=>"End of medium on Volume \\\"%{BACULA_VOLUME:volume}\\\" Bytes=%{BACULA_CAPACITY} Blocks=%{BACULA_CAPACITY} at %{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}:%{MINUTE}."}
[2018-04-24T17:17:38,803][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_VOLUME"=>"Created new Volume \\\"%{BACULA_VOLUME:volume}\\\" in catalog."}
[2018-04-24T17:17:38,803][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_LABEL"=>"Labeled new Volume \\\"%{BACULA_VOLUME:volume}\\\" on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\)."}
[2018-04-24T17:17:38,803][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_WROTE_LABEL"=>"Wrote label to prelabeled Volume \\\"%{BACULA_VOLUME:volume}\\\" on device \\\"%{BACULA_DEVICE}\\\" \\(%{BACULA_DEVICEPATH}\\)"}
[2018-04-24T17:17:38,803][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_MOUNT"=>"New volume \\\"%{BACULA_VOLUME:volume}\\\" mounted on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\) at %{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}:%{MINUTE}."}
[2018-04-24T17:17:38,803][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOOPEN"=>"\\s+Cannot open %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,803][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOOPENDIR"=>"\\s+Could not open directory %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,803][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOSTAT"=>"\\s+Could not stat %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,803][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOJOBS"=>"There are no more Jobs associated with Volume \\\"%{BACULA_VOLUME:volume}\\\". Marking it purged."}
[2018-04-24T17:17:38,803][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_ALL_RECORDS_PRUNED"=>"All records pruned from Volume \\\"%{BACULA_VOLUME:volume}\\\"; marking it \\\"Purged\\\""}
[2018-04-24T17:17:38,803][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_BEGIN_PRUNE_JOBS"=>"Begin pruning Jobs older than %{INT} month %{INT} days ."}
[2018-04-24T17:17:38,803][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_BEGIN_PRUNE_FILES"=>"Begin pruning Files."}
[2018-04-24T17:17:38,803][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_PRUNED_JOBS"=>"Pruned %{INT} Jobs* for client %{BACULA_HOST:client} from catalog."}
[2018-04-24T17:17:38,804][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_PRUNED_FILES"=>"Pruned Files from %{INT} Jobs* for client %{BACULA_HOST:client} from catalog."}
[2018-04-24T17:17:38,804][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_ENDPRUNE"=>"End auto prune."}
[2018-04-24T17:17:38,804][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_STARTJOB"=>"Start Backup JobId %{INT}, Job=%{BACULA_JOB:job}"}
[2018-04-24T17:17:38,804][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_STARTRESTORE"=>"Start Restore Job %{BACULA_JOB:job}"}
[2018-04-24T17:17:38,804][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_USEDEVICE"=>"Using Device \\\"%{BACULA_DEVICE:device}\\\""}
[2018-04-24T17:17:38,804][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_DIFF_FS"=>"\\s+%{UNIXPATH} is a different filesystem. Will not descend from %{UNIXPATH} into it."}
[2018-04-24T17:17:38,804][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_JOBEND"=>"Job write elapsed time = %{DATA:elapsed}, Transfer rate = %{NUMBER} (K|M|G)? Bytes/second"}
[2018-04-24T17:17:38,804][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRUNE_JOBS"=>"No Jobs found to prune."}
[2018-04-24T17:17:38,804][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRUNE_FILES"=>"No Files found to prune."}
[2018-04-24T17:17:38,804][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_VOLUME_PREVWRITTEN"=>"Volume \\\"%{BACULA_VOLUME:volume}\\\" previously written, moving to end of data."}
[2018-04-24T17:17:38,805][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_READYAPPEND"=>"Ready to append to end of Volume \\\"%{BACULA_VOLUME:volume}\\\" size=%{INT}"}
[2018-04-24T17:17:38,805][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_CANCELLING"=>"Cancelling duplicate JobId=%{INT}."}
[2018-04-24T17:17:38,805][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MARKCANCEL"=>"JobId %{INT}, Job %{BACULA_JOB:job} marked to be canceled."}
[2018-04-24T17:17:38,805][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_CLIENT_RBJ"=>"shell command: run ClientRunBeforeJob \\\"%{GREEDYDATA:runjob}\\\""}
[2018-04-24T17:17:38,805][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_VSS"=>"(Generate )?VSS (Writer)?"}
[2018-04-24T17:17:38,805][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MAXSTART"=>"Fatal error: Job canceled because max start delay time exceeded."}
[2018-04-24T17:17:38,805][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_DUPLICATE"=>"Fatal error: JobId %{INT:duplicate} already running. Duplicate job not allowed."}
[2018-04-24T17:17:38,805][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOJOBSTAT"=>"Fatal error: No Job status returned from FD."}
[2018-04-24T17:17:38,805][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_FATAL_CONN"=>"Fatal error: bsock.c:133 Unable to connect to (Client: %{BACULA_HOST:client}|Storage daemon) on %{HOSTNAME}:%{POSINT}. ERR=(?<berror>%{GREEDYDATA})"}
[2018-04-24T17:17:38,805][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NO_CONNECT"=>"Warning: bsock.c:127 Could not connect to (Client: %{BACULA_HOST:client}|Storage daemon) on %{HOSTNAME}:%{POSINT}. ERR=(?<berror>%{GREEDYDATA})"}
[2018-04-24T17:17:38,806][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NO_AUTH"=>"Fatal error: Unable to authenticate with File daemon at %{HOSTNAME}. Possible causes:"}
[2018-04-24T17:17:38,806][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOSUIT"=>"No prior or suitable Full backup found in catalog. Doing FULL backup."}
[2018-04-24T17:17:38,806][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRIOR"=>"No prior Full backup Job record found."}
[2018-04-24T17:17:38,806][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_JOB"=>"(Error: )?Bacula %{BACULA_HOST} %{BACULA_VERSION} \\(%{BACULA_VERSION}\\):"}
[2018-04-24T17:17:38,806][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOGLINE"=>"%{BACULA_TIMESTAMP:bts} %{BACULA_HOST:hostname} JobId %{INT:jobid}: (%{BACULA_LOG_MAX_CAPACITY}|%{BACULA_LOG_END_VOLUME}|%{BACULA_LOG_NEW_VOLUME}|%{BACULA_LOG_NEW_LABEL}|%{BACULA_LOG_WROTE_LABEL}|%{BACULA_LOG_NEW_MOUNT}|%{BACULA_LOG_NOOPEN}|%{BACULA_LOG_NOOPENDIR}|%{BACULA_LOG_NOSTAT}|%{BACULA_LOG_NOJOBS}|%{BACULA_LOG_ALL_RECORDS_PRUNED}|%{BACULA_LOG_BEGIN_PRUNE_JOBS}|%{BACULA_LOG_BEGIN_PRUNE_FILES}|%{BACULA_LOG_PRUNED_JOBS}|%{BACULA_LOG_PRUNED_FILES}|%{BACULA_LOG_ENDPRUNE}|%{BACULA_LOG_STARTJOB}|%{BACULA_LOG_STARTRESTORE}|%{BACULA_LOG_USEDEVICE}|%{BACULA_LOG_DIFF_FS}|%{BACULA_LOG_JOBEND}|%{BACULA_LOG_NOPRUNE_JOBS}|%{BACULA_LOG_NOPRUNE_FILES}|%{BACULA_LOG_VOLUME_PREVWRITTEN}|%{BACULA_LOG_READYAPPEND}|%{BACULA_LOG_CANCELLING}|%{BACULA_LOG_MARKCANCEL}|%{BACULA_LOG_CLIENT_RBJ}|%{BACULA_LOG_VSS}|%{BACULA_LOG_MAXSTART}|%{BACULA_LOG_DUPLICATE}|%{BACULA_LOG_NOJOBSTAT}|%{BACULA_LOG_FATAL_CONN}|%{BACULA_LOG_NO_CONNECT}|%{BACULA_LOG_NO_AUTH}|%{BACULA_LOG_NOSUIT}|%{BACULA_LOG_JOB}|%{BACULA_LOG_NOPRIOR})"}
[2018-04-24T17:17:38,806][DEBUG][logstash.filters.grok ] Adding pattern {"BIND9_TIMESTAMP"=>"%{MONTHDAY}[-]%{MONTH}[-]%{YEAR} %{TIME}"}
[2018-04-24T17:17:38,806][DEBUG][logstash.filters.grok ] Adding pattern {"BIND9"=>"%{BIND9_TIMESTAMP:timestamp} queries: %{LOGLEVEL:loglevel}: client %{IP:clientip}#%{POSINT:clientport} \\(%{GREEDYDATA:query}\\): query: %{GREEDYDATA:query} IN %{GREEDYDATA:querytype} \\(%{IP:dns}\\)"}
[2018-04-24T17:17:38,807][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_HTTP"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{INT:trans_depth}\\t%{GREEDYDATA:method}\\t%{GREEDYDATA:domain}\\t%{GREEDYDATA:uri}\\t%{GREEDYDATA:referrer}\\t%{GREEDYDATA:user_agent}\\t%{NUMBER:request_body_len}\\t%{NUMBER:response_body_len}\\t%{GREEDYDATA:status_code}\\t%{GREEDYDATA:status_msg}\\t%{GREEDYDATA:info_code}\\t%{GREEDYDATA:info_msg}\\t%{GREEDYDATA:filename}\\t%{GREEDYDATA:bro_tags}\\t%{GREEDYDATA:username}\\t%{GREEDYDATA:password}\\t%{GREEDYDATA:proxied}\\t%{GREEDYDATA:orig_fuids}\\t%{GREEDYDATA:orig_mime_types}\\t%{GREEDYDATA:resp_fuids}\\t%{GREEDYDATA:resp_mime_types}"}
[2018-04-24T17:17:38,807][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_DNS"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{WORD:proto}\\t%{INT:trans_id}\\t%{GREEDYDATA:query}\\t%{GREEDYDATA:qclass}\\t%{GREEDYDATA:qclass_name}\\t%{GREEDYDATA:qtype}\\t%{GREEDYDATA:qtype_name}\\t%{GREEDYDATA:rcode}\\t%{GREEDYDATA:rcode_name}\\t%{GREEDYDATA:AA}\\t%{GREEDYDATA:TC}\\t%{GREEDYDATA:RD}\\t%{GREEDYDATA:RA}\\t%{GREEDYDATA:Z}\\t%{GREEDYDATA:answers}\\t%{GREEDYDATA:TTLs}\\t%{GREEDYDATA:rejected}"}
[2018-04-24T17:17:38,807][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_CONN"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{WORD:proto}\\t%{GREEDYDATA:service}\\t%{NUMBER:duration}\\t%{NUMBER:orig_bytes}\\t%{NUMBER:resp_bytes}\\t%{GREEDYDATA:conn_state}\\t%{GREEDYDATA:local_orig}\\t%{GREEDYDATA:missed_bytes}\\t%{GREEDYDATA:history}\\t%{GREEDYDATA:orig_pkts}\\t%{GREEDYDATA:orig_ip_bytes}\\t%{GREEDYDATA:resp_pkts}\\t%{GREEDYDATA:resp_ip_bytes}\\t%{GREEDYDATA:tunnel_parents}"}
[2018-04-24T17:17:38,807][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_FILES"=>"%{NUMBER:ts}\\t%{NOTSPACE:fuid}\\t%{IP:tx_hosts}\\t%{IP:rx_hosts}\\t%{NOTSPACE:conn_uids}\\t%{GREEDYDATA:source}\\t%{GREEDYDATA:depth}\\t%{GREEDYDATA:analyzers}\\t%{GREEDYDATA:mime_type}\\t%{GREEDYDATA:filename}\\t%{GREEDYDATA:duration}\\t%{GREEDYDATA:local_orig}\\t%{GREEDYDATA:is_orig}\\t%{GREEDYDATA:seen_bytes}\\t%{GREEDYDATA:total_bytes}\\t%{GREEDYDATA:missing_bytes}\\t%{GREEDYDATA:overflow_bytes}\\t%{GREEDYDATA:timedout}\\t%{GREEDYDATA:parent_fuid}\\t%{GREEDYDATA:md5}\\t%{GREEDYDATA:sha1}\\t%{GREEDYDATA:sha256}\\t%{GREEDYDATA:extracted}"}
[2018-04-24T17:17:38,807][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_MSGID"=>"[0-9A-Za-z]{6}-[0-9A-Za-z]{6}-[0-9A-Za-z]{2}"}
[2018-04-24T17:17:38,807][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_FLAGS"=>"(<=|[-=>*]>|[*]{2}|==)"}
[2018-04-24T17:17:38,808][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_DATE"=>"%{YEAR:exim_year}-%{MONTHNUM:exim_month}-%{MONTHDAY:exim_day} %{TIME:exim_time}"}
[2018-04-24T17:17:38,808][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_PID"=>"\\[%{POSINT}\\]"}
[2018-04-24T17:17:38,808][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_QT"=>"((\\d+y)?(\\d+w)?(\\d+d)?(\\d+h)?(\\d+m)?(\\d+s)?)"}
[2018-04-24T17:17:38,808][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_EXCLUDE_TERMS"=>"(Message is frozen|(Start|End) queue run| Warning: | retry time not reached | no (IP address|host name) found for (IP address|host) | unexpected disconnection while reading SMTP command | no immediate delivery: |another process is handling this message)"}
[2018-04-24T17:17:38,808][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_REMOTE_HOST"=>"(H=(%{NOTSPACE:remote_hostname} )?(\\(%{NOTSPACE:remote_heloname}\\) )?\\[%{IP:remote_host}\\])"}
[2018-04-24T17:17:38,809][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_INTERFACE"=>"(I=\\[%{IP:exim_interface}\\](:%{NUMBER:exim_interface_port}))"}
[2018-04-24T17:17:38,809][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_PROTOCOL"=>"(P=%{NOTSPACE:protocol})"}
[2018-04-24T17:17:38,809][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_MSG_SIZE"=>"(S=%{NUMBER:exim_msg_size})"}
[2018-04-24T17:17:38,809][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_HEADER_ID"=>"(id=%{NOTSPACE:exim_header_id})"}
[2018-04-24T17:17:38,809][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_SUBJECT"=>"(T=%{QS:exim_subject})"}
[2018-04-24T17:17:38,809][DEBUG][logstash.filters.grok ] Adding pattern {"NETSCREENSESSIONLOG"=>"%{SYSLOGTIMESTAMP:date} %{IPORHOST:device} %{IPORHOST}: NetScreen device_id=%{WORD:device_id}%{DATA}: start_time=%{QUOTEDSTRING:start_time} duration=%{INT:duration} policy_id=%{INT:policy_id} service=%{DATA:service} proto=%{INT:proto} src zone=%{WORD:src_zone} dst zone=%{WORD:dst_zone} action=%{WORD:action} sent=%{INT:sent} rcvd=%{INT:rcvd} src=%{IPORHOST:src_ip} dst=%{IPORHOST:dst_ip} src_port=%{INT:src_port} dst_port=%{INT:dst_port} src-xlated ip=%{IPORHOST:src_xlated_ip} port=%{INT:src_xlated_port} dst-xlated ip=%{IPORHOST:dst_xlated_ip} port=%{INT:dst_xlated_port} session_id=%{INT:session_id} reason=%{GREEDYDATA:reason}"}
[2018-04-24T17:17:38,810][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_TAGGED_SYSLOG"=>"^<%{POSINT:syslog_pri}>%{CISCOTIMESTAMP:timestamp}( %{SYSLOGHOST:sysloghost})? ?: %%{CISCOTAG:ciscotag}:"}
[2018-04-24T17:17:38,810][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOTIMESTAMP"=>"%{MONTH} +%{MONTHDAY}(?: %{YEAR})? %{TIME}"}
[2018-04-24T17:17:38,810][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOTAG"=>"[A-Z0-9]+-%{INT}-(?:[A-Z0-9_]+)"}
[2018-04-24T17:17:38,810][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_ACTION"=>"Built|Teardown|Deny|Denied|denied|requested|permitted|denied by ACL|discarded|est-allowed|Dropping|created|deleted"}
[2018-04-24T17:17:38,810][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_REASON"=>"Duplicate TCP SYN|Failed to locate egress interface|Invalid transport field|No matching connection|DNS Response|DNS Query|(?:%{WORD}\\s*)*"}
[2018-04-24T17:17:38,810][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_DIRECTION"=>"Inbound|inbound|Outbound|outbound"}
[2018-04-24T17:17:38,810][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_INTERVAL"=>"first hit|%{INT}-second interval"}
[2018-04-24T17:17:38,810][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_XLATE_TYPE"=>"static|dynamic"}
[2018-04-24T17:17:38,810][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104001"=>"\\((?:Primary|Secondary)\\) Switching to ACTIVE - %{GREEDYDATA:switch_reason}"}
[2018-04-24T17:17:38,810][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104002"=>"\\((?:Primary|Secondary)\\) Switching to STANDBY - %{GREEDYDATA:switch_reason}"}
[2018-04-24T17:17:38,811][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104003"=>"\\((?:Primary|Secondary)\\) Switching to FAILED\\."}
[2018-04-24T17:17:38,811][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104004"=>"\\((?:Primary|Secondary)\\) Switching to OK\\."}
[2018-04-24T17:17:38,811][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105003"=>"\\((?:Primary|Secondary)\\) Monitoring on [Ii]nterface %{GREEDYDATA:interface_name} waiting"}
[2018-04-24T17:17:38,811][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105004"=>"\\((?:Primary|Secondary)\\) Monitoring on [Ii]nterface %{GREEDYDATA:interface_name} normal"}
[2018-04-24T17:17:38,811][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105005"=>"\\((?:Primary|Secondary)\\) Lost Failover communications with mate on [Ii]nterface %{GREEDYDATA:interface_name}"}
[2018-04-24T17:17:38,811][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105008"=>"\\((?:Primary|Secondary)\\) Testing [Ii]nterface %{GREEDYDATA:interface_name}"}
[2018-04-24T17:17:38,811][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105009"=>"\\((?:Primary|Secondary)\\) Testing on [Ii]nterface %{GREEDYDATA:interface_name} (?:Passed|Failed)"}
[2018-04-24T17:17:38,811][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106001"=>"%{CISCO_DIRECTION:direction} %{WORD:protocol} connection %{CISCO_ACTION:action} from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port} flags %{GREEDYDATA:tcp_flags} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,811][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106006_106007_106010"=>"%{CISCO_ACTION:action} %{CISCO_DIRECTION:direction} %{WORD:protocol} (?:from|src) %{IP:src_ip}/%{INT:src_port}(\\(%{DATA:src_fwuser}\\))? (?:to|dst) %{IP:dst_ip}/%{INT:dst_port}(\\(%{DATA:dst_fwuser}\\))? (?:on interface %{DATA:interface}|due to %{CISCO_REASON:reason})"}
[2018-04-24T17:17:38,811][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106014"=>"%{CISCO_ACTION:action} %{CISCO_DIRECTION:direction} %{WORD:protocol} src %{DATA:src_interface}:%{IP:src_ip}(\\(%{DATA:src_fwuser}\\))? dst %{DATA:dst_interface}:%{IP:dst_ip}(\\(%{DATA:dst_fwuser}\\))? \\(type %{INT:icmp_type}, code %{INT:icmp_code}\\)"}
[2018-04-24T17:17:38,812][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106015"=>"%{CISCO_ACTION:action} %{WORD:protocol} \\(%{DATA:policy_id}\\) from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port} flags %{DATA:tcp_flags} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,812][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106021"=>"%{CISCO_ACTION:action} %{WORD:protocol} reverse path check from %{IP:src_ip} to %{IP:dst_ip} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,812][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106023"=>"%{CISCO_ACTION:action}( protocol)? %{WORD:protocol} src %{DATA:src_interface}:%{DATA:src_ip}(/%{INT:src_port})?(\\(%{DATA:src_fwuser}\\))? dst %{DATA:dst_interface}:%{DATA:dst_ip}(/%{INT:dst_port})?(\\(%{DATA:dst_fwuser}\\))?( \\(type %{INT:icmp_type}, code %{INT:icmp_code}\\))? by access-group \"?%{DATA:policy_id}\"? \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,812][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106100_2_3"=>"access-list %{NOTSPACE:policy_id} %{CISCO_ACTION:action} %{WORD:protocol} for user '%{DATA:src_fwuser}' %{DATA:src_interface}/%{IP:src_ip}\\(%{INT:src_port}\\) -> %{DATA:dst_interface}/%{IP:dst_ip}\\(%{INT:dst_port}\\) hit-cnt %{INT:hit_count} %{CISCO_INTERVAL:interval} \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,812][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106100"=>"access-list %{NOTSPACE:policy_id} %{CISCO_ACTION:action} %{WORD:protocol} %{DATA:src_interface}/%{IP:src_ip}\\(%{INT:src_port}\\)(\\(%{DATA:src_fwuser}\\))? -> %{DATA:dst_interface}/%{IP:dst_ip}\\(%{INT:dst_port}\\)(\\(%{DATA:src_fwuser}\\))? hit-cnt %{INT:hit_count} %{CISCO_INTERVAL:interval} \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,812][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW304001"=>"%{IP:src_ip}(\\(%{DATA:src_fwuser}\\))? Accessed URL %{IP:dst_ip}:%{GREEDYDATA:dst_url}"}
[2018-04-24T17:17:38,812][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW110002"=>"%{CISCO_REASON:reason} for %{WORD:protocol} from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,812][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302010"=>"%{INT:connection_count} in use, %{INT:connection_count_max} most used"}
[2018-04-24T17:17:38,812][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302013_302014_302015_302016"=>"%{CISCO_ACTION:action}(?: %{CISCO_DIRECTION:direction})? %{WORD:protocol} connection %{INT:connection_id} for %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port}( \\(%{IP:src_mapped_ip}/%{INT:src_mapped_port}\\))?(\\(%{DATA:src_fwuser}\\))? to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}( \\(%{IP:dst_mapped_ip}/%{INT:dst_mapped_port}\\))?(\\(%{DATA:dst_fwuser}\\))?( duration %{TIME:duration} bytes %{INT:bytes})?(?: %{CISCO_REASON:reason})?( \\(%{DATA:user}\\))?"}
[2018-04-24T17:17:38,813][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302020_302021"=>"%{CISCO_ACTION:action}(?: %{CISCO_DIRECTION:direction})? %{WORD:protocol} connection for faddr %{IP:dst_ip}/%{INT:icmp_seq_num}(?:\\(%{DATA:fwuser}\\))? gaddr %{IP:src_xlated_ip}/%{INT:icmp_code_xlated} laddr %{IP:src_ip}/%{INT:icmp_code}( \\(%{DATA:user}\\))?"}
[2018-04-24T17:17:38,813][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW305011"=>"%{CISCO_ACTION:action} %{CISCO_XLATE_TYPE:xlate_type} %{WORD:protocol} translation from %{DATA:src_interface}:%{IP:src_ip}(/%{INT:src_port})?(\\(%{DATA:src_fwuser}\\))? to %{DATA:src_xlated_interface}:%{IP:src_xlated_ip}/%{DATA:src_xlated_port}"}
[2018-04-24T17:17:38,813][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW313001_313004_313008"=>"%{CISCO_ACTION:action} %{WORD:protocol} type=%{INT:icmp_type}, code=%{INT:icmp_code} from %{IP:src_ip} on interface %{DATA:interface}( to %{IP:dst_ip})?"}
[2018-04-24T17:17:38,813][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW313005"=>"%{CISCO_REASON:reason} for %{WORD:protocol} error message: %{WORD:err_protocol} src %{DATA:err_src_interface}:%{IP:err_src_ip}(\\(%{DATA:err_src_fwuser}\\))? dst %{DATA:err_dst_interface}:%{IP:err_dst_ip}(\\(%{DATA:err_dst_fwuser}\\))? \\(type %{INT:err_icmp_type}, code %{INT:err_icmp_code}\\) on %{DATA:interface} interface\\. Original IP payload: %{WORD:protocol} src %{IP:orig_src_ip}/%{INT:orig_src_port}(\\(%{DATA:orig_src_fwuser}\\))? dst %{IP:orig_dst_ip}/%{INT:orig_dst_port}(\\(%{DATA:orig_dst_fwuser}\\))?"}
[2018-04-24T17:17:38,813][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW321001"=>"Resource '%{WORD:resource_name}' limit of %{POSINT:resource_limit} reached for system"}
[2018-04-24T17:17:38,813][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW402117"=>"%{WORD:protocol}: Received a non-IPSec packet \\(protocol= %{WORD:orig_protocol}\\) from %{IP:src_ip} to %{IP:dst_ip}"}
[2018-04-24T17:17:38,813][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW402119"=>"%{WORD:protocol}: Received an %{WORD:orig_protocol} packet \\(SPI= %{DATA:spi}, sequence number= %{DATA:seq_num}\\) from %{IP:src_ip} \\(user= %{DATA:user}\\) to %{IP:dst_ip} that failed anti-replay checking"}
[2018-04-24T17:17:38,813][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW419001"=>"%{CISCO_ACTION:action} %{WORD:protocol} packet from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}, reason: %{GREEDYDATA:reason}"}
[2018-04-24T17:17:38,813][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW419002"=>"%{CISCO_REASON:reason} from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port} with different initial sequence number"}
[2018-04-24T17:17:38,814][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW500004"=>"%{CISCO_REASON:reason} for protocol=%{WORD:protocol}, from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,814][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW602303_602304"=>"%{WORD:protocol}: An %{CISCO_DIRECTION:direction} %{GREEDYDATA:tunnel_type} SA \\(SPI= %{DATA:spi}\\) between %{IP:src_ip} and %{IP:dst_ip} \\(user= %{DATA:user}\\) has been %{CISCO_ACTION:action}"}
[2018-04-24T17:17:38,814][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW710001_710002_710003_710005_710006"=>"%{WORD:protocol} (?:request|access) %{CISCO_ACTION:action} from %{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,814][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW713172"=>"Group = %{GREEDYDATA:group}, IP = %{IP:src_ip}, Automatic NAT Detection Status:\\s+Remote end\\s*%{DATA:is_remote_natted}\\s*behind a NAT device\\s+This\\s+end\\s*%{DATA:is_local_natted}\\s*behind a NAT device"}
[2018-04-24T17:17:38,814][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW733100"=>"\\[\\s*%{DATA:drop_type}\\s*\\] drop %{DATA:drop_rate_id} exceeded. Current burst rate is %{INT:drop_rate_current_burst} per second, max configured rate is %{INT:drop_rate_max_burst}; Current average rate is %{INT:drop_rate_current_avg} per second, max configured rate is %{INT:drop_rate_max_avg}; Cumulative total count is %{INT:drop_total_count}"}
[2018-04-24T17:17:38,814][DEBUG][logstash.filters.grok ] Adding pattern {"SHOREWALL"=>"(%{SYSLOGTIMESTAMP:timestamp}) (%{WORD:nf_host}) kernel:.*Shorewall:(%{WORD:nf_action1})?:(%{WORD:nf_action2})?.*IN=(%{USERNAME:nf_in_interface})?.*(OUT= *MAC=(%{COMMONMAC:nf_dst_mac}):(%{COMMONMAC:nf_src_mac})?|OUT=%{USERNAME:nf_out_interface}).*SRC=(%{IPV4:nf_src_ip}).*DST=(%{IPV4:nf_dst_ip}).*LEN=(%{WORD:nf_len}).?*TOS=(%{WORD:nf_tos}).?*PREC=(%{WORD:nf_prec}).?*TTL=(%{INT:nf_ttl}).?*ID=(%{INT:nf_id}).?*PROTO=(%{WORD:nf_protocol}).?*SPT=(%{INT:nf_src_port}?.*DPT=%{INT:nf_dst_port}?.*)"}
[2018-04-24T17:17:38,814][DEBUG][logstash.filters.grok ] Adding pattern {"SFW2"=>"((%{SYSLOGTIMESTAMP})|(%{TIMESTAMP_ISO8601}))\\s*%{HOSTNAME}\\s*kernel\\S+\\s*%{NAGIOSTIME}\\s*SFW2\\-INext\\-%{NOTSPACE:nf_action}\\s*IN=%{USERNAME:nf_in_interface}.*OUT=((\\s*%{USERNAME:nf_out_interface})|(\\s*))MAC=((%{COMMONMAC:nf_dst_mac}:%{COMMONMAC:nf_src_mac})|(\\s*)).*SRC=%{IP:nf_src_ip}\\s*DST=%{IP:nf_dst_ip}.*PROTO=%{WORD:nf_protocol}((.*SPT=%{INT:nf_src_port}.*DPT=%{INT:nf_dst_port}.*)|())"}
[2018-04-24T17:17:38,815][DEBUG][logstash.filters.grok ] Adding pattern {"USERNAME"=>"[a-zA-Z0-9._-]+"}
[2018-04-24T17:17:38,815][DEBUG][logstash.filters.grok ] Adding pattern {"USER"=>"%{USERNAME}"}
[2018-04-24T17:17:38,815][DEBUG][logstash.filters.grok ] Adding pattern {"EMAILLOCALPART"=>"[a-zA-Z][a-zA-Z0-9_.+-=:]+"}
[2018-04-24T17:17:38,815][DEBUG][logstash.filters.grok ] Adding pattern {"EMAILADDRESS"=>"%{EMAILLOCALPART}@%{HOSTNAME}"}
[2018-04-24T17:17:38,815][DEBUG][logstash.filters.grok ] Adding pattern {"INT"=>"(?:[+-]?(?:[0-9]+))"}
[2018-04-24T17:17:38,815][DEBUG][logstash.filters.grok ] Adding pattern {"BASE10NUM"=>"(?<![0-9.+-])(?>[+-]?(?:(?:[0-9]+(?:\\.[0-9]+)?)|(?:\\.[0-9]+)))"}
[2018-04-24T17:17:38,815][DEBUG][logstash.filters.grok ] Adding pattern {"NUMBER"=>"(?:%{BASE10NUM})"}
[2018-04-24T17:17:38,815][DEBUG][logstash.filters.grok ] Adding pattern {"BASE16NUM"=>"(?<![0-9A-Fa-f])(?:[+-]?(?:0x)?(?:[0-9A-Fa-f]+))"}
[2018-04-24T17:17:38,815][DEBUG][logstash.filters.grok ] Adding pattern {"BASE16FLOAT"=>"\\b(?<![0-9A-Fa-f.])(?:[+-]?(?:0x)?(?:(?:[0-9A-Fa-f]+(?:\\.[0-9A-Fa-f]*)?)|(?:\\.[0-9A-Fa-f]+)))\\b"}
[2018-04-24T17:17:38,815][DEBUG][logstash.filters.grok ] Adding pattern {"POSINT"=>"\\b(?:[1-9][0-9]*)\\b"}
[2018-04-24T17:17:38,815][DEBUG][logstash.filters.grok ] Adding pattern {"NONNEGINT"=>"\\b(?:[0-9]+)\\b"}
[2018-04-24T17:17:38,815][DEBUG][logstash.filters.grok ] Adding pattern {"WORD"=>"\\b\\w+\\b"}
[2018-04-24T17:17:38,815][DEBUG][logstash.filters.grok ] Adding pattern {"NOTSPACE"=>"\\S+"}
[2018-04-24T17:17:38,816][DEBUG][logstash.filters.grok ] Adding pattern {"SPACE"=>"\\s*"}
[2018-04-24T17:17:38,816][DEBUG][logstash.filters.grok ] Adding pattern {"DATA"=>".*?"}
[2018-04-24T17:17:38,816][DEBUG][logstash.filters.grok ] Adding pattern {"GREEDYDATA"=>".*"}
[2018-04-24T17:17:38,816][DEBUG][logstash.filters.grok ] Adding pattern {"QUOTEDSTRING"=>"(?>(?<!\\\\)(?>\"(?>\\\\.|[^\\\\\"]+)+\"|\"\"|(?>'(?>\\\\.|[^\\\\']+)+')|''|(?>`(?>\\\\.|[^\\\\`]+)+`)|``))"}
[2018-04-24T17:17:38,816][DEBUG][logstash.filters.grok ] Adding pattern {"UUID"=>"[A-Fa-f0-9]{8}-(?:[A-Fa-f0-9]{4}-){3}[A-Fa-f0-9]{12}"}
[2018-04-24T17:17:38,816][DEBUG][logstash.filters.grok ] Adding pattern {"URN"=>"urn:[0-9A-Za-z][0-9A-Za-z-]{0,31}:(?:%[0-9a-fA-F]{2}|[0-9A-Za-z()+,.:=@;$_!*'/?#-])+"}
[2018-04-24T17:17:38,816][DEBUG][logstash.filters.grok ] Adding pattern {"MAC"=>"(?:%{CISCOMAC}|%{WINDOWSMAC}|%{COMMONMAC})"}
[2018-04-24T17:17:38,816][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOMAC"=>"(?:(?:[A-Fa-f0-9]{4}\\.){2}[A-Fa-f0-9]{4})"}
[2018-04-24T17:17:38,816][DEBUG][logstash.filters.grok ] Adding pattern {"WINDOWSMAC"=>"(?:(?:[A-Fa-f0-9]{2}-){5}[A-Fa-f0-9]{2})"}
[2018-04-24T17:17:38,816][DEBUG][logstash.filters.grok ] Adding pattern {"COMMONMAC"=>"(?:(?:[A-Fa-f0-9]{2}:){5}[A-Fa-f0-9]{2})"}
[2018-04-24T17:17:38,817][DEBUG][logstash.filters.grok ] Adding pattern {"IPV6"=>"((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:)))(%.+)?"}
[2018-04-24T17:17:38,817][DEBUG][logstash.filters.grok ] Adding pattern {"IPV4"=>"(?<![0-9])(?:(?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5]))(?![0-9])"}
[2018-04-24T17:17:38,817][DEBUG][logstash.filters.grok ] Adding pattern {"IP"=>"(?:%{IPV6}|%{IPV4})"}
[2018-04-24T17:17:38,817][DEBUG][logstash.filters.grok ] Adding pattern {"HOSTNAME"=>"\\b(?:[0-9A-Za-z][0-9A-Za-z-]{0,62})(?:\\.(?:[0-9A-Za-z][0-9A-Za-z-]{0,62}))*(\\.?|\\b)"}
[2018-04-24T17:17:38,817][DEBUG][logstash.filters.grok ] Adding pattern {"IPORHOST"=>"(?:%{IP}|%{HOSTNAME})"}
[2018-04-24T17:17:38,817][DEBUG][logstash.filters.grok ] Adding pattern {"HOSTPORT"=>"%{IPORHOST}:%{POSINT}"}
[2018-04-24T17:17:38,817][DEBUG][logstash.filters.grok ] Adding pattern {"PATH"=>"(?:%{UNIXPATH}|%{WINPATH})"}
[2018-04-24T17:17:38,817][DEBUG][logstash.filters.grok ] Adding pattern {"UNIXPATH"=>"(/([\\w_%!$@:.,+~-]+|\\\\.)*)+"}
[2018-04-24T17:17:38,817][DEBUG][logstash.filters.grok ] Adding pattern {"TTY"=>"(?:/dev/(pts|tty([pq])?)(\\w+)?/?(?:[0-9]+))"}
[2018-04-24T17:17:38,818][DEBUG][logstash.filters.grok ] Adding pattern {"WINPATH"=>"(?>[A-Za-z]+:|\\\\)(?:\\\\[^\\\\?*]*)+"}
[2018-04-24T17:17:38,818][DEBUG][logstash.filters.grok ] Adding pattern {"URIPROTO"=>"[A-Za-z]([A-Za-z0-9+\\-.]+)+"}
[2018-04-24T17:17:38,818][DEBUG][logstash.filters.grok ] Adding pattern {"URIHOST"=>"%{IPORHOST}(?::%{POSINT:port})?"}
[2018-04-24T17:17:38,818][DEBUG][logstash.filters.grok ] Adding pattern {"URIPATH"=>"(?:/[A-Za-z0-9$.+!*'(){},~:;=@#%&_\\-]*)+"}
[2018-04-24T17:17:38,818][DEBUG][logstash.filters.grok ] Adding pattern {"URIPARAM"=>"\\?[A-Za-z0-9$.+!*'|(){},~@#%&/=:;_?\\-\\[\\]<>]*"}
[2018-04-24T17:17:38,818][DEBUG][logstash.filters.grok ] Adding pattern {"URIPATHPARAM"=>"%{URIPATH}(?:%{URIPARAM})?"}
[2018-04-24T17:17:38,818][DEBUG][logstash.filters.grok ] Adding pattern {"URI"=>"%{URIPROTO}://(?:%{USER}(?::[^@]*)?@)?(?:%{URIHOST})?(?:%{URIPATHPARAM})?"}
[2018-04-24T17:17:38,818][DEBUG][logstash.filters.grok ] Adding pattern {"MONTH"=>"\\b(?:[Jj]an(?:uary|uar)?|[Ff]eb(?:ruary|ruar)?|[Mm](?:a|ä)?r(?:ch|z)?|[Aa]pr(?:il)?|[Mm]a(?:y|i)?|[Jj]un(?:e|i)?|[Jj]ul(?:y)?|[Aa]ug(?:ust)?|[Ss]ep(?:tember)?|[Oo](?:c|k)?t(?:ober)?|[Nn]ov(?:ember)?|[Dd]e(?:c|z)(?:ember)?)\\b"}
[2018-04-24T17:17:38,818][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHNUM"=>"(?:0?[1-9]|1[0-2])"}
[2018-04-24T17:17:38,819][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHNUM2"=>"(?:0[1-9]|1[0-2])"}
[2018-04-24T17:17:38,819][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHDAY"=>"(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])"}
[2018-04-24T17:17:38,819][DEBUG][logstash.filters.grok ] Adding pattern {"DAY"=>"(?:Mon(?:day)?|Tue(?:sday)?|Wed(?:nesday)?|Thu(?:rsday)?|Fri(?:day)?|Sat(?:urday)?|Sun(?:day)?)"}
[2018-04-24T17:17:38,819][DEBUG][logstash.filters.grok ] Adding pattern {"YEAR"=>"(?>\\d\\d){1,2}"}
[2018-04-24T17:17:38,819][DEBUG][logstash.filters.grok ] Adding pattern {"HOUR"=>"(?:2[0123]|[01]?[0-9])"}
[2018-04-24T17:17:38,819][DEBUG][logstash.filters.grok ] Adding pattern {"MINUTE"=>"(?:[0-5][0-9])"}
[2018-04-24T17:17:38,819][DEBUG][logstash.filters.grok ] Adding pattern {"SECOND"=>"(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)"}
[2018-04-24T17:17:38,819][DEBUG][logstash.filters.grok ] Adding pattern {"TIME"=>"(?!<[0-9])%{HOUR}:%{MINUTE}(?::%{SECOND})(?![0-9])"}
[2018-04-24T17:17:38,819][DEBUG][logstash.filters.grok ] Adding pattern {"DATE_US"=>"%{MONTHNUM}[/-]%{MONTHDAY}[/-]%{YEAR}"}
[2018-04-24T17:17:38,819][DEBUG][logstash.filters.grok ] Adding pattern {"DATE_EU"=>"%{MONTHDAY}[./-]%{MONTHNUM}[./-]%{YEAR}"}
[2018-04-24T17:17:38,820][DEBUG][logstash.filters.grok ] Adding pattern {"ISO8601_TIMEZONE"=>"(?:Z|[+-]%{HOUR}(?::?%{MINUTE}))"}
[2018-04-24T17:17:38,820][DEBUG][logstash.filters.grok ] Adding pattern {"ISO8601_SECOND"=>"(?:%{SECOND}|60)"}
[2018-04-24T17:17:38,820][DEBUG][logstash.filters.grok ] Adding pattern {"TIMESTAMP_ISO8601"=>"%{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?"}
[2018-04-24T17:17:38,820][DEBUG][logstash.filters.grok ] Adding pattern {"DATE"=>"%{DATE_US}|%{DATE_EU}"}
[2018-04-24T17:17:38,820][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP"=>"%{DATE}[- ]%{TIME}"}
[2018-04-24T17:17:38,820][DEBUG][logstash.filters.grok ] Adding pattern {"TZ"=>"(?:[APMCE][SD]T|UTC)"}
[2018-04-24T17:17:38,820][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_RFC822"=>"%{DAY} %{MONTH} %{MONTHDAY} %{YEAR} %{TIME} %{TZ}"}
[2018-04-24T17:17:38,820][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_RFC2822"=>"%{DAY}, %{MONTHDAY} %{MONTH} %{YEAR} %{TIME} %{ISO8601_TIMEZONE}"}
[2018-04-24T17:17:38,820][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_OTHER"=>"%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{TZ} %{YEAR}"}
[2018-04-24T17:17:38,820][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_EVENTLOG"=>"%{YEAR}%{MONTHNUM2}%{MONTHDAY}%{HOUR}%{MINUTE}%{SECOND}"}
[2018-04-24T17:17:38,821][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGTIMESTAMP"=>"%{MONTH} +%{MONTHDAY} %{TIME}"}
[2018-04-24T17:17:38,821][DEBUG][logstash.filters.grok ] Adding pattern {"PROG"=>"[\\x21-\\x5a\\x5c\\x5e-\\x7e]+"}
[2018-04-24T17:17:38,821][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGPROG"=>"%{PROG:program}(?:\\[%{POSINT:pid}\\])?"}
[2018-04-24T17:17:38,821][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGHOST"=>"%{IPORHOST}"}
[2018-04-24T17:17:38,821][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGFACILITY"=>"<%{NONNEGINT:facility}.%{NONNEGINT:priority}>"}
[2018-04-24T17:17:38,821][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDATE"=>"%{MONTHDAY}/%{MONTH}/%{YEAR}:%{TIME} %{INT}"}
[2018-04-24T17:17:38,821][DEBUG][logstash.filters.grok ] Adding pattern {"QS"=>"%{QUOTEDSTRING}"}
[2018-04-24T17:17:38,821][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGBASE"=>"%{SYSLOGTIMESTAMP:timestamp} (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource} %{SYSLOGPROG}:"}
[2018-04-24T17:17:38,822][DEBUG][logstash.filters.grok ] Adding pattern {"LOGLEVEL"=>"([Aa]lert|ALERT|[Tt]race|TRACE|[Dd]ebug|DEBUG|[Nn]otice|NOTICE|[Ii]nfo|INFO|[Ww]arn?(?:ing)?|WARN?(?:ING)?|[Ee]rr?(?:or)?|ERR?(?:OR)?|[Cc]rit?(?:ical)?|CRIT?(?:ICAL)?|[Ff]atal|FATAL|[Ss]evere|SEVERE|EMERG(?:ENCY)?|[Ee]merg(?:ency)?)"}
[2018-04-24T17:17:38,822][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYTIME"=>"(?!<[0-9])%{HOUR:haproxy_hour}:%{MINUTE:haproxy_minute}(?::%{SECOND:haproxy_second})(?![0-9])"}
[2018-04-24T17:17:38,822][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYDATE"=>"%{MONTHDAY:haproxy_monthday}/%{MONTH:haproxy_month}/%{YEAR:haproxy_year}:%{HAPROXYTIME:haproxy_time}.%{INT:haproxy_milliseconds}"}
[2018-04-24T17:17:38,822][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYCAPTUREDREQUESTHEADERS"=>"%{DATA:captured_request_headers}"}
[2018-04-24T17:17:38,822][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYCAPTUREDRESPONSEHEADERS"=>"%{DATA:captured_response_headers}"}
[2018-04-24T17:17:38,822][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYHTTPBASE"=>"%{IP:client_ip}:%{INT:client_port} \\[%{HAPROXYDATE:accept_date}\\] %{NOTSPACE:frontend_name} %{NOTSPACE:backend_name}/%{NOTSPACE:server_name} %{INT:time_request}/%{INT:time_queue}/%{INT:time_backend_connect}/%{INT:time_backend_response}/%{NOTSPACE:time_duration} %{INT:http_status_code} %{NOTSPACE:bytes_read} %{DATA:captured_request_cookie} %{DATA:captured_response_cookie} %{NOTSPACE:termination_state} %{INT:actconn}/%{INT:feconn}/%{INT:beconn}/%{INT:srvconn}/%{NOTSPACE:retries} %{INT:srv_queue}/%{INT:backend_queue} (\\{%{HAPROXYCAPTUREDREQUESTHEADERS}\\})?( )?(\\{%{HAPROXYCAPTUREDRESPONSEHEADERS}\\})?( )?\"(<BADREQ>|(%{WORD:http_verb} (%{URIPROTO:http_proto}://)?(?:%{USER:http_user}(?::[^@]*)?@)?(?:%{URIHOST:http_host})?(?:%{URIPATHPARAM:http_request})?( HTTP/%{NUMBER:http_version})?))?\""}
[2018-04-24T17:17:38,823][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYHTTP"=>"(?:%{SYSLOGTIMESTAMP:syslog_timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) %{IPORHOST:syslog_server} %{SYSLOGPROG}: %{HAPROXYHTTPBASE}"}
[2018-04-24T17:17:38,823][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYTCP"=>"(?:%{SYSLOGTIMESTAMP:syslog_timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) %{IPORHOST:syslog_server} %{SYSLOGPROG}: %{IP:client_ip}:%{INT:client_port} \\[%{HAPROXYDATE:accept_date}\\] %{NOTSPACE:frontend_name} %{NOTSPACE:backend_name}/%{NOTSPACE:server_name} %{INT:time_queue}/%{INT:time_backend_connect}/%{NOTSPACE:time_duration} %{NOTSPACE:bytes_read} %{NOTSPACE:termination_state} %{INT:actconn}/%{INT:feconn}/%{INT:beconn}/%{INT:srvconn}/%{NOTSPACE:retries} %{INT:srv_queue}/%{INT:backend_queue}"}
[2018-04-24T17:17:38,823][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDUSER"=>"%{EMAILADDRESS}|%{USER}"}
[2018-04-24T17:17:38,823][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDERROR_DATE"=>"%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{YEAR}"}
[2018-04-24T17:17:38,823][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_COMMONLOG"=>"%{IPORHOST:clientip} %{HTTPDUSER:ident} %{HTTPDUSER:auth} \\[%{HTTPDATE:timestamp}\\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{NUMBER:response} (?:%{NUMBER:bytes}|-)"}
[2018-04-24T17:17:38,823][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_COMBINEDLOG"=>"%{HTTPD_COMMONLOG} %{QS:referrer} %{QS:agent}"}
[2018-04-24T17:17:38,824][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD20_ERRORLOG"=>"\\[%{HTTPDERROR_DATE:timestamp}\\] \\[%{LOGLEVEL:loglevel}\\] (?:\\[client %{IPORHOST:clientip}\\] ){0,1}%{GREEDYDATA:message}"}
[2018-04-24T17:17:38,824][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD24_ERRORLOG"=>"\\[%{HTTPDERROR_DATE:timestamp}\\] \\[%{WORD:module}:%{LOGLEVEL:loglevel}\\] \\[pid %{POSINT:pid}(:tid %{NUMBER:tid})?\\]( \\(%{POSINT:proxy_errorcode}\\)%{DATA:proxy_message}:)?( \\[client %{IPORHOST:clientip}:%{POSINT:clientport}\\])?( %{DATA:errorcode}:)? %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,824][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_ERRORLOG"=>"%{HTTPD20_ERRORLOG}|%{HTTPD24_ERRORLOG}"}
[2018-04-24T17:17:38,824][DEBUG][logstash.filters.grok ] Adding pattern {"COMMONAPACHELOG"=>"%{HTTPD_COMMONLOG}"}
[2018-04-24T17:17:38,824][DEBUG][logstash.filters.grok ] Adding pattern {"COMBINEDAPACHELOG"=>"%{HTTPD_COMBINEDLOG}"}
[2018-04-24T17:17:38,824][DEBUG][logstash.filters.grok ] Adding pattern {"JAVACLASS"=>"(?:[a-zA-Z$_][a-zA-Z$_0-9]*\\.)*[a-zA-Z$_][a-zA-Z$_0-9]*"}
[2018-04-24T17:17:38,825][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAFILE"=>"(?:[A-Za-z0-9_. -]+)"}
[2018-04-24T17:17:38,825][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAMETHOD"=>"(?:(<(?:cl)?init>)|[a-zA-Z$_][a-zA-Z$_0-9]*)"}
[2018-04-24T17:17:38,825][DEBUG][logstash.filters.grok ] Adding pattern {"JAVASTACKTRACEPART"=>"%{SPACE}at %{JAVACLASS:class}\\.%{JAVAMETHOD:method}\\(%{JAVAFILE:file}(?::%{NUMBER:line})?\\)"}
[2018-04-24T17:17:38,825][DEBUG][logstash.filters.grok ] Adding pattern {"JAVATHREAD"=>"(?:[A-Z]{2}-Processor[\\d]+)"}
[2018-04-24T17:17:38,825][DEBUG][logstash.filters.grok ] Adding pattern {"JAVACLASS"=>"(?:[a-zA-Z0-9-]+\\.)+[A-Za-z0-9$]+"}
[2018-04-24T17:17:38,825][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAFILE"=>"(?:[A-Za-z0-9_.-]+)"}
[2018-04-24T17:17:38,825][DEBUG][logstash.filters.grok ] Adding pattern {"JAVALOGMESSAGE"=>"(.*)"}
[2018-04-24T17:17:38,825][DEBUG][logstash.filters.grok ] Adding pattern {"CATALINA_DATESTAMP"=>"%{MONTH} %{MONTHDAY}, 20%{YEAR} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) (?:AM|PM)"}
[2018-04-24T17:17:38,826][DEBUG][logstash.filters.grok ] Adding pattern {"TOMCAT_DATESTAMP"=>"20%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) %{ISO8601_TIMEZONE}"}
[2018-04-24T17:17:38,826][DEBUG][logstash.filters.grok ] Adding pattern {"CATALINALOG"=>"%{CATALINA_DATESTAMP:timestamp} %{JAVACLASS:class} %{JAVALOGMESSAGE:logmessage}"}
[2018-04-24T17:17:38,827][DEBUG][logstash.filters.grok ] Adding pattern {"TOMCATLOG"=>"%{TOMCAT_DATESTAMP:timestamp} \\| %{LOGLEVEL:level} \\| %{JAVACLASS:class} - %{JAVALOGMESSAGE:logmessage}"}
[2018-04-24T17:17:38,827][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW_EVENT"=>"(RT_FLOW_SESSION_CREATE|RT_FLOW_SESSION_CLOSE|RT_FLOW_SESSION_DENY)"}
[2018-04-24T17:17:38,827][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW1"=>"%{RT_FLOW_EVENT:event}: %{GREEDYDATA:close-reason}: %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{IP:nat-src-ip}/%{INT:nat-src-port}->%{IP:nat-dst-ip}/%{INT:nat-dst-port} %{DATA:src-nat-rule-name} %{DATA:dst-nat-rule-name} %{INT:protocol-id} %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} %{INT:session-id} \\d+\\(%{DATA:sent}\\) \\d+\\(%{DATA:received}\\) %{INT:elapsed-time} .*"}
[2018-04-24T17:17:38,827][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW2"=>"%{RT_FLOW_EVENT:event}: session created %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{IP:nat-src-ip}/%{INT:nat-src-port}->%{IP:nat-dst-ip}/%{INT:nat-dst-port} %{DATA:src-nat-rule-name} %{DATA:dst-nat-rule-name} %{INT:protocol-id} %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} %{INT:session-id} .*"}
[2018-04-24T17:17:38,827][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW3"=>"%{RT_FLOW_EVENT:event}: session denied %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{INT:protocol-id}\\(\\d\\) %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} .*"}
[2018-04-24T17:17:38,828][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424PRINTASCII"=>"[!-~]+"}
[2018-04-24T17:17:38,828][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGBASE2"=>"(?:%{SYSLOGTIMESTAMP:timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource}+(?: %{SYSLOGPROG}:|)"}
[2018-04-24T17:17:38,828][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGPAMSESSION"=>"%{SYSLOGBASE} (?=%{GREEDYDATA:message})%{WORD:pam_module}\\(%{DATA:pam_caller}\\): session %{WORD:pam_session_state} for user %{USERNAME:username}(?: by %{GREEDYDATA:pam_by})?"}
[2018-04-24T17:17:38,828][DEBUG][logstash.filters.grok ] Adding pattern {"CRON_ACTION"=>"[A-Z ]+"}
[2018-04-24T17:17:38,828][DEBUG][logstash.filters.grok ] Adding pattern {"CRONLOG"=>"%{SYSLOGBASE} \\(%{USER:user}\\) %{CRON_ACTION:action} \\(%{DATA:message}\\)"}
[2018-04-24T17:17:38,828][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGLINE"=>"%{SYSLOGBASE2} %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,828][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424PRI"=>"<%{NONNEGINT:syslog5424_pri}>"}
[2018-04-24T17:17:38,828][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424SD"=>"\\[%{DATA}\\]+"}
[2018-04-24T17:17:38,829][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424BASE"=>"%{SYSLOG5424PRI}%{NONNEGINT:syslog5424_ver} +(?:%{TIMESTAMP_ISO8601:syslog5424_ts}|-) +(?:%{IPORHOST:syslog5424_host}|-) +(-|%{SYSLOG5424PRINTASCII:syslog5424_app}) +(-|%{SYSLOG5424PRINTASCII:syslog5424_proc}) +(-|%{SYSLOG5424PRINTASCII:syslog5424_msgid}) +(?:%{SYSLOG5424SD:syslog5424_sd}|-|)"}
[2018-04-24T17:17:38,829][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424LINE"=>"%{SYSLOG5424BASE} +%{GREEDYDATA:syslog5424_msg}"}
[2018-04-24T17:17:38,830][DEBUG][logstash.filters.grok ] Adding pattern {"MAVEN_VERSION"=>"(?:(\\d+)\\.)?(?:(\\d+)\\.)?(\\*|\\d+)(?:[.-](RELEASE|SNAPSHOT))?"}
[2018-04-24T17:17:38,830][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVEAUDIT"=>"%{TIMESTAMP_ISO8601:timestamp}:"}
[2018-04-24T17:17:38,830][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVE"=>"., \\[%{TIMESTAMP_ISO8601:timestamp} #%{POSINT:pid}\\]%{SPACE}%{LOGLEVEL:event_level}"}
[2018-04-24T17:17:38,830][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVEAUDIT"=>"%{TIMESTAMP_ISO8601:timestamp}:"}
[2018-04-24T17:17:38,830][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_LOG"=>"%{SYSLOGTIMESTAMP:timestamp} \\[%{WORD:component}\\] %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,831][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_QUERY"=>"\\{ (?<={ ).*(?= } ntoreturn:) \\}"}
[2018-04-24T17:17:38,831][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_SLOWQUERY"=>"%{WORD} %{MONGO_WORDDASH:database}\\.%{MONGO_WORDDASH:collection} %{WORD}: %{MONGO_QUERY:query} %{WORD}:%{NONNEGINT:ntoreturn} %{WORD}:%{NONNEGINT:ntoskip} %{WORD}:%{NONNEGINT:nscanned}.*nreturned:%{NONNEGINT:nreturned}..+ (?<duration>[0-9]+)ms"}
[2018-04-24T17:17:38,831][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_WORDDASH"=>"\\b[\\w-]+\\b"}
[2018-04-24T17:17:38,831][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_SEVERITY"=>"\\w"}
[2018-04-24T17:17:38,831][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_COMPONENT"=>"%{WORD}|-"}
[2018-04-24T17:17:38,831][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_LOG"=>"%{TIMESTAMP_ISO8601:timestamp} %{MONGO3_SEVERITY:severity} %{MONGO3_COMPONENT:component}%{SPACE}(?:\\[%{DATA:context}\\])? %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,831][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOSTIME"=>"\\[%{NUMBER:nagios_epoch}\\]"}
[2018-04-24T17:17:38,831][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_CURRENT_SERVICE_STATE"=>"CURRENT SERVICE STATE"}
[2018-04-24T17:17:38,832][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_CURRENT_HOST_STATE"=>"CURRENT HOST STATE"}
[2018-04-24T17:17:38,832][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_NOTIFICATION"=>"SERVICE NOTIFICATION"}
[2018-04-24T17:17:38,832][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_NOTIFICATION"=>"HOST NOTIFICATION"}
[2018-04-24T17:17:38,832][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_ALERT"=>"SERVICE ALERT"}
[2018-04-24T17:17:38,832][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_ALERT"=>"HOST ALERT"}
[2018-04-24T17:17:38,832][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_FLAPPING_ALERT"=>"SERVICE FLAPPING ALERT"}
[2018-04-24T17:17:38,832][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_FLAPPING_ALERT"=>"HOST FLAPPING ALERT"}
[2018-04-24T17:17:38,832][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_DOWNTIME_ALERT"=>"SERVICE DOWNTIME ALERT"}
[2018-04-24T17:17:38,832][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_DOWNTIME_ALERT"=>"HOST DOWNTIME ALERT"}
[2018-04-24T17:17:38,832][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_PASSIVE_SERVICE_CHECK"=>"PASSIVE SERVICE CHECK"}
[2018-04-24T17:17:38,832][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_PASSIVE_HOST_CHECK"=>"PASSIVE HOST CHECK"}
[2018-04-24T17:17:38,833][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_EVENT_HANDLER"=>"SERVICE EVENT HANDLER"}
[2018-04-24T17:17:38,833][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_EVENT_HANDLER"=>"HOST EVENT HANDLER"}
[2018-04-24T17:17:38,833][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_EXTERNAL_COMMAND"=>"EXTERNAL COMMAND"}
[2018-04-24T17:17:38,833][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_TIMEPERIOD_TRANSITION"=>"TIMEPERIOD TRANSITION"}
[2018-04-24T17:17:38,833][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_SVC_CHECK"=>"DISABLE_SVC_CHECK"}
[2018-04-24T17:17:38,833][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_SVC_CHECK"=>"ENABLE_SVC_CHECK"}
[2018-04-24T17:17:38,833][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_CHECK"=>"DISABLE_HOST_CHECK"}
[2018-04-24T17:17:38,833][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_CHECK"=>"ENABLE_HOST_CHECK"}
[2018-04-24T17:17:38,833][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_PROCESS_SERVICE_CHECK_RESULT"=>"PROCESS_SERVICE_CHECK_RESULT"}
[2018-04-24T17:17:38,833][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_PROCESS_HOST_CHECK_RESULT"=>"PROCESS_HOST_CHECK_RESULT"}
[2018-04-24T17:17:38,833][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_SCHEDULE_SERVICE_DOWNTIME"=>"SCHEDULE_SERVICE_DOWNTIME"}
[2018-04-24T17:17:38,834][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_SCHEDULE_HOST_DOWNTIME"=>"SCHEDULE_HOST_DOWNTIME"}
[2018-04-24T17:17:38,834][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_SVC_NOTIFICATIONS"=>"DISABLE_HOST_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,834][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_SVC_NOTIFICATIONS"=>"ENABLE_HOST_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,834][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_NOTIFICATIONS"=>"DISABLE_HOST_NOTIFICATIONS"}
[2018-04-24T17:17:38,834][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_NOTIFICATIONS"=>"ENABLE_HOST_NOTIFICATIONS"}
[2018-04-24T17:17:38,834][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_SVC_NOTIFICATIONS"=>"DISABLE_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,834][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_SVC_NOTIFICATIONS"=>"ENABLE_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,834][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_WARNING"=>"Warning:%{SPACE}%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,834][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_CURRENT_SERVICE_STATE"=>"%{NAGIOS_TYPE_CURRENT_SERVICE_STATE:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statetype};%{DATA:nagios_statecode};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,834][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_CURRENT_HOST_STATE"=>"%{NAGIOS_TYPE_CURRENT_HOST_STATE:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statetype};%{DATA:nagios_statecode};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,834][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_NOTIFICATION"=>"%{NAGIOS_TYPE_SERVICE_NOTIFICATION:nagios_type}: %{DATA:nagios_notifyname};%{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_contact};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,835][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_NOTIFICATION"=>"%{NAGIOS_TYPE_HOST_NOTIFICATION:nagios_type}: %{DATA:nagios_notifyname};%{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_contact};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,835][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_ALERT"=>"%{NAGIOS_TYPE_SERVICE_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{NUMBER:nagios_attempt};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,835][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_ALERT"=>"%{NAGIOS_TYPE_HOST_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{NUMBER:nagios_attempt};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,835][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_FLAPPING_ALERT"=>"%{NAGIOS_TYPE_SERVICE_FLAPPING_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,835][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_FLAPPING_ALERT"=>"%{NAGIOS_TYPE_HOST_FLAPPING_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,835][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_DOWNTIME_ALERT"=>"%{NAGIOS_TYPE_SERVICE_DOWNTIME_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,835][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_DOWNTIME_ALERT"=>"%{NAGIOS_TYPE_HOST_DOWNTIME_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,836][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_PASSIVE_SERVICE_CHECK"=>"%{NAGIOS_TYPE_PASSIVE_SERVICE_CHECK:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,837][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_PASSIVE_HOST_CHECK"=>"%{NAGIOS_TYPE_PASSIVE_HOST_CHECK:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,837][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_EVENT_HANDLER"=>"%{NAGIOS_TYPE_SERVICE_EVENT_HANDLER:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{DATA:nagios_event_handler_name}"}
[2018-04-24T17:17:38,837][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_EVENT_HANDLER"=>"%{NAGIOS_TYPE_HOST_EVENT_HANDLER:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{DATA:nagios_event_handler_name}"}
[2018-04-24T17:17:38,837][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TIMEPERIOD_TRANSITION"=>"%{NAGIOS_TYPE_TIMEPERIOD_TRANSITION:nagios_type}: %{DATA:nagios_service};%{DATA:nagios_unknown1};%{DATA:nagios_unknown2}"}
[2018-04-24T17:17:38,837][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_SVC_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_SVC_CHECK:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service}"}
[2018-04-24T17:17:38,837][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_CHECK:nagios_command};%{DATA:nagios_hostname}"}
[2018-04-24T17:17:38,837][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_SVC_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_SVC_CHECK:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service}"}
[2018-04-24T17:17:38,837][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_CHECK:nagios_command};%{DATA:nagios_hostname}"}
[2018-04-24T17:17:38,837][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_PROCESS_SERVICE_CHECK_RESULT"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_PROCESS_SERVICE_CHECK_RESULT:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_check_result}"}
[2018-04-24T17:17:38,838][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_PROCESS_HOST_CHECK_RESULT"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_PROCESS_HOST_CHECK_RESULT:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_check_result}"}
[2018-04-24T17:17:38,838][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_SVC_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,838][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,838][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_SVC_NOTIFICATIONS:nagios_command};%{DATA:nagios_hostname};%{GREEDYDATA:nagios_service}"}
[2018-04-24T17:17:38,838][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_SVC_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,838][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,838][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_SVC_NOTIFICATIONS:nagios_command};%{DATA:nagios_hostname};%{GREEDYDATA:nagios_service}"}
[2018-04-24T17:17:38,841][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_SCHEDULE_HOST_DOWNTIME"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_SCHEDULE_HOST_DOWNTIME:nagios_command};%{DATA:nagios_hostname};%{NUMBER:nagios_start_time};%{NUMBER:nagios_end_time};%{NUMBER:nagios_fixed};%{NUMBER:nagios_trigger_id};%{NUMBER:nagios_duration};%{DATA:author};%{DATA:comment}"}
[2018-04-24T17:17:38,841][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOSLOGLINE"=>"%{NAGIOSTIME} (?:%{NAGIOS_WARNING}|%{NAGIOS_CURRENT_SERVICE_STATE}|%{NAGIOS_CURRENT_HOST_STATE}|%{NAGIOS_SERVICE_NOTIFICATION}|%{NAGIOS_HOST_NOTIFICATION}|%{NAGIOS_SERVICE_ALERT}|%{NAGIOS_HOST_ALERT}|%{NAGIOS_SERVICE_FLAPPING_ALERT}|%{NAGIOS_HOST_FLAPPING_ALERT}|%{NAGIOS_SERVICE_DOWNTIME_ALERT}|%{NAGIOS_HOST_DOWNTIME_ALERT}|%{NAGIOS_PASSIVE_SERVICE_CHECK}|%{NAGIOS_PASSIVE_HOST_CHECK}|%{NAGIOS_SERVICE_EVENT_HANDLER}|%{NAGIOS_HOST_EVENT_HANDLER}|%{NAGIOS_TIMEPERIOD_TRANSITION}|%{NAGIOS_EC_LINE_DISABLE_SVC_CHECK}|%{NAGIOS_EC_LINE_ENABLE_SVC_CHECK}|%{NAGIOS_EC_LINE_DISABLE_HOST_CHECK}|%{NAGIOS_EC_LINE_ENABLE_HOST_CHECK}|%{NAGIOS_EC_LINE_PROCESS_HOST_CHECK_RESULT}|%{NAGIOS_EC_LINE_PROCESS_SERVICE_CHECK_RESULT}|%{NAGIOS_EC_LINE_SCHEDULE_HOST_DOWNTIME}|%{NAGIOS_EC_LINE_DISABLE_HOST_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_HOST_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_DISABLE_HOST_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_HOST_NOTIFICATIONS}|%{NAGIOS_EC_LINE_DISABLE_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_SVC_NOTIFICATIONS})"}
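
[annotation] NAGIOSLOGLINE, registered just above, is a single alternation over every NAGIOS_* sub-pattern loaded before it, anchored by NAGIOSTIME; whichever alternative matches determines which nagios_* fields get set. A hedged sketch of typical usage follows — the date step is an assumption, not something this log shows, but NAGIOSTIME does capture the epoch as nagios_epoch, which the date filter's UNIX format can promote to @timestamp.

    filter {
      grok {
        # One top-level pattern; the matching NAGIOS_* branch decides the
        # fields (nagios_type, nagios_hostname, nagios_state, ...).
        match => { "message" => "%{NAGIOSLOGLINE}" }
      }
      # nagios_epoch is a UNIX timestamp in seconds.
      date {
        match => [ "nagios_epoch", "UNIX" ]
      }
    }
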
[2018-04-24T17:17:38,841][DEBUG][logstash.filters.grok ] Adding pattern {"POSTGRESQL"=>"%{DATESTAMP:timestamp} %{TZ} %{DATA:user_id} %{GREEDYDATA:connection_id} %{POSINT:pid}"}
[2018-04-24T17:17:38,841][DEBUG][logstash.filters.grok ] Adding pattern {"RUUID"=>"\\h{32}"}
[2018-04-24T17:17:38,841][DEBUG][logstash.filters.grok ] Adding pattern {"RCONTROLLER"=>"(?<controller>[^#]+)#(?<action>\\w+)"}
[2018-04-24T17:17:38,842][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3HEAD"=>"(?m)Started %{WORD:verb} \"%{URIPATHPARAM:request}\" for %{IPORHOST:clientip} at (?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND} %{ISO8601_TIMEZONE})"}
[2018-04-24T17:17:38,842][DEBUG][logstash.filters.grok ] Adding pattern {"RPROCESSING"=>"\\W*Processing by %{RCONTROLLER} as (?<format>\\S+)(?:\\W*Parameters: {%{DATA:params}}\\W*)?"}
[2018-04-24T17:17:38,842][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3FOOT"=>"Completed %{NUMBER:response}%{DATA} in %{NUMBER:totalms}ms %{RAILS3PROFILE}%{GREEDYDATA}"}
[2018-04-24T17:17:38,842][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3PROFILE"=>"(?:\\(Views: %{NUMBER:viewms}ms \\| ActiveRecord: %{NUMBER:activerecordms}ms|\\(ActiveRecord: %{NUMBER:activerecordms}ms)?"}
[2018-04-24T17:17:38,842][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3"=>"%{RAILS3HEAD}(?:%{RPROCESSING})?(?<context>(?:%{DATA}\\n)*)(?:%{RAILS3FOOT})?"}
[2018-04-24T17:17:38,842][DEBUG][logstash.filters.grok ] Adding pattern {"REDISTIMESTAMP"=>"%{MONTHDAY} %{MONTH} %{TIME}"}
[2018-04-24T17:17:38,842][DEBUG][logstash.filters.grok ] Adding pattern {"REDISLOG"=>"\\[%{POSINT:pid}\\] %{REDISTIMESTAMP:timestamp} \\* "}
[2018-04-24T17:17:38,842][DEBUG][logstash.filters.grok ] Adding pattern {"REDISMONLOG"=>"%{NUMBER:timestamp} \\[%{INT:database} %{IP:client}:%{NUMBER:port}\\] \"%{WORD:command}\"\\s?%{GREEDYDATA:params}"}
[2018-04-24T17:17:38,843][DEBUG][logstash.filters.grok ] Adding pattern {"RUBY_LOGLEVEL"=>"(?:DEBUG|FATAL|ERROR|WARN|INFO)"}
[2018-04-24T17:17:38,843][DEBUG][logstash.filters.grok ] Adding pattern {"RUBY_LOGGER"=>"[DFEWI], \\[%{TIMESTAMP_ISO8601:timestamp} #%{POSINT:pid}\\] *%{RUBY_LOGLEVEL:loglevel} -- +%{DATA:progname}: %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,843][DEBUG][logstash.filters.grok ] Adding pattern {"SQUID3"=>"%{NUMBER:timestamp}\\s+%{NUMBER:duration}\\s%{IP:client_address}\\s%{WORD:cache_result}/%{POSINT:status_code}\\s%{NUMBER:bytes}\\s%{WORD:request_method}\\s%{NOTSPACE:url}\\s(%{NOTSPACE:user}|-)\\s%{WORD:hierarchy_code}/%{IPORHOST:server}\\s%{NOTSPACE:content_type}"}
[2018-04-24T17:17:38,843][DEBUG][logstash.filters.grok ] Adding pattern {"PAYLOAD"=>"[\\s\\S]*"}
[2018-04-24T17:17:38,843][DEBUG][logstash.filters.grok ] Adding pattern {"SPACE"=>"[ ]{1,}"}
[2018-04-24T17:17:38,843][DEBUG][logstash.filters.grok ] Adding pattern {"P_TIMESTAMP"=>"%{MONTH}\\s%{MONTHDAY},\\s%{YEAR}\\s%{TIME}\\s(AM|PM)"}
[2018-04-24T17:17:38,843][DEBUG][logstash.filters.grok ] Adding pattern {"LOGGINGSERVICEPREFIX"=>"[-]{12,18} Event Log Start Here [-]{12,18}\\\\n"}
[2018-04-24T17:17:38,843][DEBUG][logstash.filters.grok ] Adding pattern {"LOGGINGSERVICESUFFIX"=>"\\\\n[-]{12,18} Event Log End Here [-]{12,18}"}
[2018-04-24T17:17:38,843][DEBUG][logstash.filters.grok ] Adding pattern {"XLMLOGGING"=>"[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}:[0-9]{3,7}"}
[2018-04-24T17:17:38,844][DEBUG][logstash.filters.grok ] Adding pattern {"DATESWITHDOTS"=>"[0-9]{4}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{3,7}"}
[2018-04-24T17:17:38,844][DEBUG][logstash.filters.grok ] Adding pattern {"DATESWITHUNDERLINE"=>"[0-9]{4}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,7}"}
[2018-04-24T17:17:38,846][DEBUG][logstash.filters.grok ] replacement_pattern => (?<GREEDYDATA:PrefixMessage>.*)
[2018-04-24T17:17:38,846][DEBUG][logstash.filters.grok ] replacement_pattern => (?<DATESWITHUNDERLINE:logtime>[0-9]{4}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,7})
[2018-04-24T17:17:38,847][DEBUG][logstash.filters.grok ] replacement_pattern => (?<GREEDYDATA:SuffixMessage>.*)
[2018-04-24T17:17:38,847][DEBUG][logstash.filters.grok ] Grok compiled OK {:pattern=>"%{GREEDYDATA:PrefixMessage} \\[%{DATESWITHUNDERLINE:logtime}\\] %{GREEDYDATA:SuffixMessage}", :expanded_pattern=>"(?<GREEDYDATA:PrefixMessage>.*) \\[(?<DATESWITHUNDERLINE:logtime>[0-9]{4}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,7})\\] (?<GREEDYDATA:SuffixMessage>.*)"}
[2018-04-24T17:17:38,849][DEBUG][logstash.filters.grok ] Grok patterns path {:paths=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-patterns-core-4.1.2/patterns", "/usr/share/logstash/patterns/*"]}
[2018-04-24T17:17:38,850][DEBUG][logstash.filters.grok ] Grok patterns path {:paths=>["/etc/logstash/conf.d/patterns"]}
[2018-04-24T17:17:38,850][DEBUG][logstash.filters.grok ] Match data {:match=>{"message"=>"%{GREEDYDATA:LevelMessage} %{TIMESTAMP_ISO8601:logtime}%{GREEDYDATA:SuffixMessage}"}}
[2018-04-24T17:17:38,851][DEBUG][logstash.filters.grok ] regexp: /message {:pattern=>"%{GREEDYDATA:LevelMessage} %{TIMESTAMP_ISO8601:logtime}%{GREEDYDATA:SuffixMessage}"}
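
[annotation] The replacement_pattern / "Grok compiled OK" / "Grok patterns path" / "Match data" lines above are the pipeline's own custom grok setup: custom patterns (LOGGINGSERVICEPREFIX, XLMLOGGING, DATESWITHDOTS, DATESWITHUNDERLINE, ...) loaded from /etc/logstash/conf.d/patterns, one pattern compiled against a bracketed DATESWITHUNDERLINE timestamp, and a match on TIMESTAMP_ISO8601. A reconstruction consistent with this output is sketched below; the pattern file name is hypothetical (only the directory appears in the log), and whether this is one grok filter with two patterns or two separate filters is an assumption.

    # /etc/logstash/conf.d/patterns/custom  (hypothetical file name)
    DATESWITHUNDERLINE [0-9]{4}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,7}

    filter {
      grok {
        # Matches the "Grok patterns path" and "Match data" lines above.
        patterns_dir => ["/etc/logstash/conf.d/patterns"]
        match => { "message" => [
          "%{GREEDYDATA:PrefixMessage} \[%{DATESWITHUNDERLINE:logtime}\] %{GREEDYDATA:SuffixMessage}",
          "%{GREEDYDATA:LevelMessage} %{TIMESTAMP_ISO8601:logtime}%{GREEDYDATA:SuffixMessage}"
        ] }
      }
    }
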
[2018-04-24T17:17:38,851][DEBUG][logstash.filters.grok ] Adding pattern {"S3_REQUEST_LINE"=>"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})"}
[2018-04-24T17:17:38,851][DEBUG][logstash.filters.grok ] Adding pattern {"S3_ACCESS_LOG"=>"%{WORD:owner} %{NOTSPACE:bucket} \\[%{HTTPDATE:timestamp}\\] %{IP:clientip} %{NOTSPACE:requester} %{NOTSPACE:request_id} %{NOTSPACE:operation} %{NOTSPACE:key} (?:\"%{S3_REQUEST_LINE}\"|-) (?:%{INT:response:int}|-) (?:-|%{NOTSPACE:error_code}) (?:%{INT:bytes:int}|-) (?:%{INT:object_size:int}|-) (?:%{INT:request_time_ms:int}|-) (?:%{INT:turnaround_time_ms:int}|-) (?:%{QS:referrer}|-) (?:\"?%{QS:agent}\"?|-) (?:-|%{NOTSPACE:version_id})"}
[2018-04-24T17:17:38,851][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_URIPATHPARAM"=>"%{URIPATH:path}(?:%{URIPARAM:params})?"}
[2018-04-24T17:17:38,851][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_URI"=>"%{URIPROTO:proto}://(?:%{USER}(?::[^@]*)?@)?(?:%{URIHOST:urihost})?(?:%{ELB_URIPATHPARAM})?"}
[2018-04-24T17:17:38,851][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_REQUEST_LINE"=>"(?:%{WORD:verb} %{ELB_URI:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})"}
[2018-04-24T17:17:38,852][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_ACCESS_LOG"=>"%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:elb} %{IP:clientip}:%{INT:clientport:int} (?:(%{IP:backendip}:?:%{INT:backendport:int})|-) %{NUMBER:request_processing_time:float} %{NUMBER:backend_processing_time:float} %{NUMBER:response_processing_time:float} %{INT:response:int} %{INT:backend_response:int} %{INT:received_bytes:int} %{INT:bytes:int} \"%{ELB_REQUEST_LINE}\""}
[2018-04-24T17:17:38,852][DEBUG][logstash.filters.grok ] Adding pattern {"CLOUDFRONT_ACCESS_LOG"=>"(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}\\t%{TIME})\\t%{WORD:x_edge_location}\\t(?:%{NUMBER:sc_bytes:int}|-)\\t%{IPORHOST:clientip}\\t%{WORD:cs_method}\\t%{HOSTNAME:cs_host}\\t%{NOTSPACE:cs_uri_stem}\\t%{NUMBER:sc_status:int}\\t%{GREEDYDATA:referrer}\\t%{GREEDYDATA:agent}\\t%{GREEDYDATA:cs_uri_query}\\t%{GREEDYDATA:cookies}\\t%{WORD:x_edge_result_type}\\t%{NOTSPACE:x_edge_request_id}\\t%{HOSTNAME:x_host_header}\\t%{URIPROTO:cs_protocol}\\t%{INT:cs_bytes:int}\\t%{GREEDYDATA:time_taken:float}\\t%{GREEDYDATA:x_forwarded_for}\\t%{GREEDYDATA:ssl_protocol}\\t%{GREEDYDATA:ssl_cipher}\\t%{GREEDYDATA:x_edge_response_result_type}"}
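
[annotation] The S3, ELB, and CloudFront patterns above use grok's type-coercion suffix (%{INT:response:int}, %{NUMBER:time_taken:float}), which stores the captured value as a number at match time — one reason a separate mutate/convert step (commented out in this pipeline, per the gist title) can be unnecessary. A minimal sketch, assuming ELB access logs as input:

    filter {
      grok {
        # ELB_ACCESS_LOG already emits response, backend_response,
        # received_bytes and bytes as integers and the *_processing_time
        # fields as floats, so downstream range queries work without a
        # mutate { convert => ... } step.
        match => { "message" => "%{ELB_ACCESS_LOG}" }
      }
    }
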
[2018-04-24T17:17:38,852][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_TIMESTAMP"=>"%{MONTHDAY}-%{MONTH} %{HOUR}:%{MINUTE}"}
[2018-04-24T17:17:38,852][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_HOST"=>"[a-zA-Z0-9-]+"}
[2018-04-24T17:17:38,852][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_VOLUME"=>"%{USER}"}
[2018-04-24T17:17:38,852][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_DEVICE"=>"%{USER}"}
[2018-04-24T17:17:38,852][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_DEVICEPATH"=>"%{UNIXPATH}"}
[2018-04-24T17:17:38,852][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_CAPACITY"=>"%{INT}{1,3}(,%{INT}{3})*"}
[2018-04-24T17:17:38,852][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_VERSION"=>"%{USER}"}
[2018-04-24T17:17:38,852][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_JOB"=>"%{USER}"}
[2018-04-24T17:17:38,852][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MAX_CAPACITY"=>"User defined maximum volume capacity %{BACULA_CAPACITY} exceeded on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\)"}
[2018-04-24T17:17:38,853][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_END_VOLUME"=>"End of medium on Volume \\\"%{BACULA_VOLUME:volume}\\\" Bytes=%{BACULA_CAPACITY} Blocks=%{BACULA_CAPACITY} at %{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}:%{MINUTE}."}
[2018-04-24T17:17:38,853][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_VOLUME"=>"Created new Volume \\\"%{BACULA_VOLUME:volume}\\\" in catalog."}
[2018-04-24T17:17:38,853][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_LABEL"=>"Labeled new Volume \\\"%{BACULA_VOLUME:volume}\\\" on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\)."}
[2018-04-24T17:17:38,853][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_WROTE_LABEL"=>"Wrote label to prelabeled Volume \\\"%{BACULA_VOLUME:volume}\\\" on device \\\"%{BACULA_DEVICE}\\\" \\(%{BACULA_DEVICEPATH}\\)"}
[2018-04-24T17:17:38,853][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_MOUNT"=>"New volume \\\"%{BACULA_VOLUME:volume}\\\" mounted on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\) at %{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}:%{MINUTE}."}
[2018-04-24T17:17:38,853][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOOPEN"=>"\\s+Cannot open %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,853][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOOPENDIR"=>"\\s+Could not open directory %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,853][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOSTAT"=>"\\s+Could not stat %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,853][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOJOBS"=>"There are no more Jobs associated with Volume \\\"%{BACULA_VOLUME:volume}\\\". Marking it purged."}
[2018-04-24T17:17:38,853][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_ALL_RECORDS_PRUNED"=>"All records pruned from Volume \\\"%{BACULA_VOLUME:volume}\\\"; marking it \\\"Purged\\\""}
[2018-04-24T17:17:38,853][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_BEGIN_PRUNE_JOBS"=>"Begin pruning Jobs older than %{INT} month %{INT} days ."}
[2018-04-24T17:17:38,853][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_BEGIN_PRUNE_FILES"=>"Begin pruning Files."}
[2018-04-24T17:17:38,853][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_PRUNED_JOBS"=>"Pruned %{INT} Jobs* for client %{BACULA_HOST:client} from catalog."}
[2018-04-24T17:17:38,854][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_PRUNED_FILES"=>"Pruned Files from %{INT} Jobs* for client %{BACULA_HOST:client} from catalog."}
[2018-04-24T17:17:38,854][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_ENDPRUNE"=>"End auto prune."}
[2018-04-24T17:17:38,854][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_STARTJOB"=>"Start Backup JobId %{INT}, Job=%{BACULA_JOB:job}"}
[2018-04-24T17:17:38,854][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_STARTRESTORE"=>"Start Restore Job %{BACULA_JOB:job}"}
[2018-04-24T17:17:38,854][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_USEDEVICE"=>"Using Device \\\"%{BACULA_DEVICE:device}\\\""}
[2018-04-24T17:17:38,854][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_DIFF_FS"=>"\\s+%{UNIXPATH} is a different filesystem. Will not descend from %{UNIXPATH} into it."}
[2018-04-24T17:17:38,854][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_JOBEND"=>"Job write elapsed time = %{DATA:elapsed}, Transfer rate = %{NUMBER} (K|M|G)? Bytes/second"}
[2018-04-24T17:17:38,854][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRUNE_JOBS"=>"No Jobs found to prune."}
[2018-04-24T17:17:38,854][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRUNE_FILES"=>"No Files found to prune."}
[2018-04-24T17:17:38,854][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_VOLUME_PREVWRITTEN"=>"Volume \\\"%{BACULA_VOLUME:volume}\\\" previously written, moving to end of data."}
[2018-04-24T17:17:38,854][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_READYAPPEND"=>"Ready to append to end of Volume \\\"%{BACULA_VOLUME:volume}\\\" size=%{INT}"}
[2018-04-24T17:17:38,854][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_CANCELLING"=>"Cancelling duplicate JobId=%{INT}."}
[2018-04-24T17:17:38,854][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MARKCANCEL"=>"JobId %{INT}, Job %{BACULA_JOB:job} marked to be canceled."}
[2018-04-24T17:17:38,854][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_CLIENT_RBJ"=>"shell command: run ClientRunBeforeJob \\\"%{GREEDYDATA:runjob}\\\""}
[2018-04-24T17:17:38,855][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_VSS"=>"(Generate )?VSS (Writer)?"}
[2018-04-24T17:17:38,855][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MAXSTART"=>"Fatal error: Job canceled because max start delay time exceeded."}
[2018-04-24T17:17:38,855][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_DUPLICATE"=>"Fatal error: JobId %{INT:duplicate} already running. Duplicate job not allowed."}
[2018-04-24T17:17:38,855][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOJOBSTAT"=>"Fatal error: No Job status returned from FD."}
[2018-04-24T17:17:38,855][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_FATAL_CONN"=>"Fatal error: bsock.c:133 Unable to connect to (Client: %{BACULA_HOST:client}|Storage daemon) on %{HOSTNAME}:%{POSINT}. ERR=(?<berror>%{GREEDYDATA})"}
[2018-04-24T17:17:38,855][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NO_CONNECT"=>"Warning: bsock.c:127 Could not connect to (Client: %{BACULA_HOST:client}|Storage daemon) on %{HOSTNAME}:%{POSINT}. ERR=(?<berror>%{GREEDYDATA})"}
[2018-04-24T17:17:38,855][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NO_AUTH"=>"Fatal error: Unable to authenticate with File daemon at %{HOSTNAME}. Possible causes:"}
[2018-04-24T17:17:38,855][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOSUIT"=>"No prior or suitable Full backup found in catalog. Doing FULL backup."}
[2018-04-24T17:17:38,855][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRIOR"=>"No prior Full backup Job record found."}
[2018-04-24T17:17:38,855][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_JOB"=>"(Error: )?Bacula %{BACULA_HOST} %{BACULA_VERSION} \\(%{BACULA_VERSION}\\):"}
[2018-04-24T17:17:38,855][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOGLINE"=>"%{BACULA_TIMESTAMP:bts} %{BACULA_HOST:hostname} JobId %{INT:jobid}: (%{BACULA_LOG_MAX_CAPACITY}|%{BACULA_LOG_END_VOLUME}|%{BACULA_LOG_NEW_VOLUME}|%{BACULA_LOG_NEW_LABEL}|%{BACULA_LOG_WROTE_LABEL}|%{BACULA_LOG_NEW_MOUNT}|%{BACULA_LOG_NOOPEN}|%{BACULA_LOG_NOOPENDIR}|%{BACULA_LOG_NOSTAT}|%{BACULA_LOG_NOJOBS}|%{BACULA_LOG_ALL_RECORDS_PRUNED}|%{BACULA_LOG_BEGIN_PRUNE_JOBS}|%{BACULA_LOG_BEGIN_PRUNE_FILES}|%{BACULA_LOG_PRUNED_JOBS}|%{BACULA_LOG_PRUNED_FILES}|%{BACULA_LOG_ENDPRUNE}|%{BACULA_LOG_STARTJOB}|%{BACULA_LOG_STARTRESTORE}|%{BACULA_LOG_USEDEVICE}|%{BACULA_LOG_DIFF_FS}|%{BACULA_LOG_JOBEND}|%{BACULA_LOG_NOPRUNE_JOBS}|%{BACULA_LOG_NOPRUNE_FILES}|%{BACULA_LOG_VOLUME_PREVWRITTEN}|%{BACULA_LOG_READYAPPEND}|%{BACULA_LOG_CANCELLING}|%{BACULA_LOG_MARKCANCEL}|%{BACULA_LOG_CLIENT_RBJ}|%{BACULA_LOG_VSS}|%{BACULA_LOG_MAXSTART}|%{BACULA_LOG_DUPLICATE}|%{BACULA_LOG_NOJOBSTAT}|%{BACULA_LOG_FATAL_CONN}|%{BACULA_LOG_NO_CONNECT}|%{BACULA_LOG_NO_AUTH}|%{BACULA_LOG_NOSUIT}|%{BACULA_LOG_JOB}|%{BACULA_LOG_NOPRIOR})"}
[2018-04-24T17:17:38,856][DEBUG][logstash.filters.grok ] Adding pattern {"BIND9_TIMESTAMP"=>"%{MONTHDAY}[-]%{MONTH}[-]%{YEAR} %{TIME}"}
[2018-04-24T17:17:38,856][DEBUG][logstash.filters.grok ] Adding pattern {"BIND9"=>"%{BIND9_TIMESTAMP:timestamp} queries: %{LOGLEVEL:loglevel}: client %{IP:clientip}#%{POSINT:clientport} \\(%{GREEDYDATA:query}\\): query: %{GREEDYDATA:query} IN %{GREEDYDATA:querytype} \\(%{IP:dns}\\)"}
[2018-04-24T17:17:38,856][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_HTTP"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{INT:trans_depth}\\t%{GREEDYDATA:method}\\t%{GREEDYDATA:domain}\\t%{GREEDYDATA:uri}\\t%{GREEDYDATA:referrer}\\t%{GREEDYDATA:user_agent}\\t%{NUMBER:request_body_len}\\t%{NUMBER:response_body_len}\\t%{GREEDYDATA:status_code}\\t%{GREEDYDATA:status_msg}\\t%{GREEDYDATA:info_code}\\t%{GREEDYDATA:info_msg}\\t%{GREEDYDATA:filename}\\t%{GREEDYDATA:bro_tags}\\t%{GREEDYDATA:username}\\t%{GREEDYDATA:password}\\t%{GREEDYDATA:proxied}\\t%{GREEDYDATA:orig_fuids}\\t%{GREEDYDATA:orig_mime_types}\\t%{GREEDYDATA:resp_fuids}\\t%{GREEDYDATA:resp_mime_types}"}
[2018-04-24T17:17:38,856][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_DNS"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{WORD:proto}\\t%{INT:trans_id}\\t%{GREEDYDATA:query}\\t%{GREEDYDATA:qclass}\\t%{GREEDYDATA:qclass_name}\\t%{GREEDYDATA:qtype}\\t%{GREEDYDATA:qtype_name}\\t%{GREEDYDATA:rcode}\\t%{GREEDYDATA:rcode_name}\\t%{GREEDYDATA:AA}\\t%{GREEDYDATA:TC}\\t%{GREEDYDATA:RD}\\t%{GREEDYDATA:RA}\\t%{GREEDYDATA:Z}\\t%{GREEDYDATA:answers}\\t%{GREEDYDATA:TTLs}\\t%{GREEDYDATA:rejected}"}
[2018-04-24T17:17:38,856][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_CONN"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{WORD:proto}\\t%{GREEDYDATA:service}\\t%{NUMBER:duration}\\t%{NUMBER:orig_bytes}\\t%{NUMBER:resp_bytes}\\t%{GREEDYDATA:conn_state}\\t%{GREEDYDATA:local_orig}\\t%{GREEDYDATA:missed_bytes}\\t%{GREEDYDATA:history}\\t%{GREEDYDATA:orig_pkts}\\t%{GREEDYDATA:orig_ip_bytes}\\t%{GREEDYDATA:resp_pkts}\\t%{GREEDYDATA:resp_ip_bytes}\\t%{GREEDYDATA:tunnel_parents}"}
[2018-04-24T17:17:38,857][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_FILES"=>"%{NUMBER:ts}\\t%{NOTSPACE:fuid}\\t%{IP:tx_hosts}\\t%{IP:rx_hosts}\\t%{NOTSPACE:conn_uids}\\t%{GREEDYDATA:source}\\t%{GREEDYDATA:depth}\\t%{GREEDYDATA:analyzers}\\t%{GREEDYDATA:mime_type}\\t%{GREEDYDATA:filename}\\t%{GREEDYDATA:duration}\\t%{GREEDYDATA:local_orig}\\t%{GREEDYDATA:is_orig}\\t%{GREEDYDATA:seen_bytes}\\t%{GREEDYDATA:total_bytes}\\t%{GREEDYDATA:missing_bytes}\\t%{GREEDYDATA:overflow_bytes}\\t%{GREEDYDATA:timedout}\\t%{GREEDYDATA:parent_fuid}\\t%{GREEDYDATA:md5}\\t%{GREEDYDATA:sha1}\\t%{GREEDYDATA:sha256}\\t%{GREEDYDATA:extracted}"}
[2018-04-24T17:17:38,857][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_MSGID"=>"[0-9A-Za-z]{6}-[0-9A-Za-z]{6}-[0-9A-Za-z]{2}"}
[2018-04-24T17:17:38,857][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_FLAGS"=>"(<=|[-=>*]>|[*]{2}|==)"}
[2018-04-24T17:17:38,857][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_DATE"=>"%{YEAR:exim_year}-%{MONTHNUM:exim_month}-%{MONTHDAY:exim_day} %{TIME:exim_time}"}
[2018-04-24T17:17:38,857][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_PID"=>"\\[%{POSINT}\\]"}
[2018-04-24T17:17:38,857][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_QT"=>"((\\d+y)?(\\d+w)?(\\d+d)?(\\d+h)?(\\d+m)?(\\d+s)?)"}
[2018-04-24T17:17:38,857][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_EXCLUDE_TERMS"=>"(Message is frozen|(Start|End) queue run| Warning: | retry time not reached | no (IP address|host name) found for (IP address|host) | unexpected disconnection while reading SMTP command | no immediate delivery: |another process is handling this message)"}
[2018-04-24T17:17:38,857][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_REMOTE_HOST"=>"(H=(%{NOTSPACE:remote_hostname} )?(\\(%{NOTSPACE:remote_heloname}\\) )?\\[%{IP:remote_host}\\])"}
[2018-04-24T17:17:38,857][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_INTERFACE"=>"(I=\\[%{IP:exim_interface}\\](:%{NUMBER:exim_interface_port}))"}
[2018-04-24T17:17:38,857][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_PROTOCOL"=>"(P=%{NOTSPACE:protocol})"}
[2018-04-24T17:17:38,857][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_MSG_SIZE"=>"(S=%{NUMBER:exim_msg_size})"}
[2018-04-24T17:17:38,858][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_HEADER_ID"=>"(id=%{NOTSPACE:exim_header_id})"}
[2018-04-24T17:17:38,858][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_SUBJECT"=>"(T=%{QS:exim_subject})"}
[2018-04-24T17:17:38,858][DEBUG][logstash.filters.grok ] Adding pattern {"NETSCREENSESSIONLOG"=>"%{SYSLOGTIMESTAMP:date} %{IPORHOST:device} %{IPORHOST}: NetScreen device_id=%{WORD:device_id}%{DATA}: start_time=%{QUOTEDSTRING:start_time} duration=%{INT:duration} policy_id=%{INT:policy_id} service=%{DATA:service} proto=%{INT:proto} src zone=%{WORD:src_zone} dst zone=%{WORD:dst_zone} action=%{WORD:action} sent=%{INT:sent} rcvd=%{INT:rcvd} src=%{IPORHOST:src_ip} dst=%{IPORHOST:dst_ip} src_port=%{INT:src_port} dst_port=%{INT:dst_port} src-xlated ip=%{IPORHOST:src_xlated_ip} port=%{INT:src_xlated_port} dst-xlated ip=%{IPORHOST:dst_xlated_ip} port=%{INT:dst_xlated_port} session_id=%{INT:session_id} reason=%{GREEDYDATA:reason}"}
[2018-04-24T17:17:38,858][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_TAGGED_SYSLOG"=>"^<%{POSINT:syslog_pri}>%{CISCOTIMESTAMP:timestamp}( %{SYSLOGHOST:sysloghost})? ?: %%{CISCOTAG:ciscotag}:"}
[2018-04-24T17:17:38,858][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOTIMESTAMP"=>"%{MONTH} +%{MONTHDAY}(?: %{YEAR})? %{TIME}"}
[2018-04-24T17:17:38,858][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOTAG"=>"[A-Z0-9]+-%{INT}-(?:[A-Z0-9_]+)"}
[2018-04-24T17:17:38,858][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_ACTION"=>"Built|Teardown|Deny|Denied|denied|requested|permitted|denied by ACL|discarded|est-allowed|Dropping|created|deleted"}
[2018-04-24T17:17:38,858][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_REASON"=>"Duplicate TCP SYN|Failed to locate egress interface|Invalid transport field|No matching connection|DNS Response|DNS Query|(?:%{WORD}\\s*)*"}
[2018-04-24T17:17:38,858][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_DIRECTION"=>"Inbound|inbound|Outbound|outbound"}
[2018-04-24T17:17:38,858][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_INTERVAL"=>"first hit|%{INT}-second interval"}
[2018-04-24T17:17:38,858][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_XLATE_TYPE"=>"static|dynamic"}
[2018-04-24T17:17:38,859][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104001"=>"\\((?:Primary|Secondary)\\) Switching to ACTIVE - %{GREEDYDATA:switch_reason}"}
[2018-04-24T17:17:38,859][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104002"=>"\\((?:Primary|Secondary)\\) Switching to STANDBY - %{GREEDYDATA:switch_reason}"}
[2018-04-24T17:17:38,859][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104003"=>"\\((?:Primary|Secondary)\\) Switching to FAILED\\."}
[2018-04-24T17:17:38,859][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104004"=>"\\((?:Primary|Secondary)\\) Switching to OK\\."}
[2018-04-24T17:17:38,859][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105003"=>"\\((?:Primary|Secondary)\\) Monitoring on [Ii]nterface %{GREEDYDATA:interface_name} waiting"}
[2018-04-24T17:17:38,859][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105004"=>"\\((?:Primary|Secondary)\\) Monitoring on [Ii]nterface %{GREEDYDATA:interface_name} normal"}
[2018-04-24T17:17:38,859][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105005"=>"\\((?:Primary|Secondary)\\) Lost Failover communications with mate on [Ii]nterface %{GREEDYDATA:interface_name}"}
[2018-04-24T17:17:38,859][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105008"=>"\\((?:Primary|Secondary)\\) Testing [Ii]nterface %{GREEDYDATA:interface_name}"}
[2018-04-24T17:17:38,859][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105009"=>"\\((?:Primary|Secondary)\\) Testing on [Ii]nterface %{GREEDYDATA:interface_name} (?:Passed|Failed)"}
[2018-04-24T17:17:38,859][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106001"=>"%{CISCO_DIRECTION:direction} %{WORD:protocol} connection %{CISCO_ACTION:action} from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port} flags %{GREEDYDATA:tcp_flags} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,860][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106006_106007_106010"=>"%{CISCO_ACTION:action} %{CISCO_DIRECTION:direction} %{WORD:protocol} (?:from|src) %{IP:src_ip}/%{INT:src_port}(\\(%{DATA:src_fwuser}\\))? (?:to|dst) %{IP:dst_ip}/%{INT:dst_port}(\\(%{DATA:dst_fwuser}\\))? (?:on interface %{DATA:interface}|due to %{CISCO_REASON:reason})"}
[2018-04-24T17:17:38,860][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106014"=>"%{CISCO_ACTION:action} %{CISCO_DIRECTION:direction} %{WORD:protocol} src %{DATA:src_interface}:%{IP:src_ip}(\\(%{DATA:src_fwuser}\\))? dst %{DATA:dst_interface}:%{IP:dst_ip}(\\(%{DATA:dst_fwuser}\\))? \\(type %{INT:icmp_type}, code %{INT:icmp_code}\\)"}
[2018-04-24T17:17:38,860][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106015"=>"%{CISCO_ACTION:action} %{WORD:protocol} \\(%{DATA:policy_id}\\) from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port} flags %{DATA:tcp_flags} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,860][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106021"=>"%{CISCO_ACTION:action} %{WORD:protocol} reverse path check from %{IP:src_ip} to %{IP:dst_ip} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,860][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106023"=>"%{CISCO_ACTION:action}( protocol)? %{WORD:protocol} src %{DATA:src_interface}:%{DATA:src_ip}(/%{INT:src_port})?(\\(%{DATA:src_fwuser}\\))? dst %{DATA:dst_interface}:%{DATA:dst_ip}(/%{INT:dst_port})?(\\(%{DATA:dst_fwuser}\\))?( \\(type %{INT:icmp_type}, code %{INT:icmp_code}\\))? by access-group \"?%{DATA:policy_id}\"? \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,860][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106100_2_3"=>"access-list %{NOTSPACE:policy_id} %{CISCO_ACTION:action} %{WORD:protocol} for user '%{DATA:src_fwuser}' %{DATA:src_interface}/%{IP:src_ip}\\(%{INT:src_port}\\) -> %{DATA:dst_interface}/%{IP:dst_ip}\\(%{INT:dst_port}\\) hit-cnt %{INT:hit_count} %{CISCO_INTERVAL:interval} \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,860][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106100"=>"access-list %{NOTSPACE:policy_id} %{CISCO_ACTION:action} %{WORD:protocol} %{DATA:src_interface}/%{IP:src_ip}\\(%{INT:src_port}\\)(\\(%{DATA:src_fwuser}\\))? -> %{DATA:dst_interface}/%{IP:dst_ip}\\(%{INT:dst_port}\\)(\\(%{DATA:src_fwuser}\\))? hit-cnt %{INT:hit_count} %{CISCO_INTERVAL:interval} \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,860][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW304001"=>"%{IP:src_ip}(\\(%{DATA:src_fwuser}\\))? Accessed URL %{IP:dst_ip}:%{GREEDYDATA:dst_url}"}
[2018-04-24T17:17:38,860][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW110002"=>"%{CISCO_REASON:reason} for %{WORD:protocol} from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,860][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302010"=>"%{INT:connection_count} in use, %{INT:connection_count_max} most used"}
[2018-04-24T17:17:38,860][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302013_302014_302015_302016"=>"%{CISCO_ACTION:action}(?: %{CISCO_DIRECTION:direction})? %{WORD:protocol} connection %{INT:connection_id} for %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port}( \\(%{IP:src_mapped_ip}/%{INT:src_mapped_port}\\))?(\\(%{DATA:src_fwuser}\\))? to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}( \\(%{IP:dst_mapped_ip}/%{INT:dst_mapped_port}\\))?(\\(%{DATA:dst_fwuser}\\))?( duration %{TIME:duration} bytes %{INT:bytes})?(?: %{CISCO_REASON:reason})?( \\(%{DATA:user}\\))?"}
[2018-04-24T17:17:38,861][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302020_302021"=>"%{CISCO_ACTION:action}(?: %{CISCO_DIRECTION:direction})? %{WORD:protocol} connection for faddr %{IP:dst_ip}/%{INT:icmp_seq_num}(?:\\(%{DATA:fwuser}\\))? gaddr %{IP:src_xlated_ip}/%{INT:icmp_code_xlated} laddr %{IP:src_ip}/%{INT:icmp_code}( \\(%{DATA:user}\\))?"}
[2018-04-24T17:17:38,861][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW305011"=>"%{CISCO_ACTION:action} %{CISCO_XLATE_TYPE:xlate_type} %{WORD:protocol} translation from %{DATA:src_interface}:%{IP:src_ip}(/%{INT:src_port})?(\\(%{DATA:src_fwuser}\\))? to %{DATA:src_xlated_interface}:%{IP:src_xlated_ip}/%{DATA:src_xlated_port}"}
[2018-04-24T17:17:38,861][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW313001_313004_313008"=>"%{CISCO_ACTION:action} %{WORD:protocol} type=%{INT:icmp_type}, code=%{INT:icmp_code} from %{IP:src_ip} on interface %{DATA:interface}( to %{IP:dst_ip})?"}
[2018-04-24T17:17:38,861][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW313005"=>"%{CISCO_REASON:reason} for %{WORD:protocol} error message: %{WORD:err_protocol} src %{DATA:err_src_interface}:%{IP:err_src_ip}(\\(%{DATA:err_src_fwuser}\\))? dst %{DATA:err_dst_interface}:%{IP:err_dst_ip}(\\(%{DATA:err_dst_fwuser}\\))? \\(type %{INT:err_icmp_type}, code %{INT:err_icmp_code}\\) on %{DATA:interface} interface\\. Original IP payload: %{WORD:protocol} src %{IP:orig_src_ip}/%{INT:orig_src_port}(\\(%{DATA:orig_src_fwuser}\\))? dst %{IP:orig_dst_ip}/%{INT:orig_dst_port}(\\(%{DATA:orig_dst_fwuser}\\))?"}
[2018-04-24T17:17:38,861][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW321001"=>"Resource '%{WORD:resource_name}' limit of %{POSINT:resource_limit} reached for system"}
[2018-04-24T17:17:38,861][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW402117"=>"%{WORD:protocol}: Received a non-IPSec packet \\(protocol= %{WORD:orig_protocol}\\) from %{IP:src_ip} to %{IP:dst_ip}"}
[2018-04-24T17:17:38,861][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW402119"=>"%{WORD:protocol}: Received an %{WORD:orig_protocol} packet \\(SPI= %{DATA:spi}, sequence number= %{DATA:seq_num}\\) from %{IP:src_ip} \\(user= %{DATA:user}\\) to %{IP:dst_ip} that failed anti-replay checking"}
[2018-04-24T17:17:38,861][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW419001"=>"%{CISCO_ACTION:action} %{WORD:protocol} packet from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}, reason: %{GREEDYDATA:reason}"}
[2018-04-24T17:17:38,861][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW419002"=>"%{CISCO_REASON:reason} from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port} with different initial sequence number"}
[2018-04-24T17:17:38,861][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW500004"=>"%{CISCO_REASON:reason} for protocol=%{WORD:protocol}, from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,862][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW602303_602304"=>"%{WORD:protocol}: An %{CISCO_DIRECTION:direction} %{GREEDYDATA:tunnel_type} SA \\(SPI= %{DATA:spi}\\) between %{IP:src_ip} and %{IP:dst_ip} \\(user= %{DATA:user}\\) has been %{CISCO_ACTION:action}"}
[2018-04-24T17:17:38,862][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW710001_710002_710003_710005_710006"=>"%{WORD:protocol} (?:request|access) %{CISCO_ACTION:action} from %{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,862][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW713172"=>"Group = %{GREEDYDATA:group}, IP = %{IP:src_ip}, Automatic NAT Detection Status:\\s+Remote end\\s*%{DATA:is_remote_natted}\\s*behind a NAT device\\s+This\\s+end\\s*%{DATA:is_local_natted}\\s*behind a NAT device"}
[2018-04-24T17:17:38,862][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW733100"=>"\\[\\s*%{DATA:drop_type}\\s*\\] drop %{DATA:drop_rate_id} exceeded. Current burst rate is %{INT:drop_rate_current_burst} per second, max configured rate is %{INT:drop_rate_max_burst}; Current average rate is %{INT:drop_rate_current_avg} per second, max configured rate is %{INT:drop_rate_max_avg}; Cumulative total count is %{INT:drop_total_count}"}
[2018-04-24T17:17:38,862][DEBUG][logstash.filters.grok ] Adding pattern {"SHOREWALL"=>"(%{SYSLOGTIMESTAMP:timestamp}) (%{WORD:nf_host}) kernel:.*Shorewall:(%{WORD:nf_action1})?:(%{WORD:nf_action2})?.*IN=(%{USERNAME:nf_in_interface})?.*(OUT= *MAC=(%{COMMONMAC:nf_dst_mac}):(%{COMMONMAC:nf_src_mac})?|OUT=%{USERNAME:nf_out_interface}).*SRC=(%{IPV4:nf_src_ip}).*DST=(%{IPV4:nf_dst_ip}).*LEN=(%{WORD:nf_len}).?*TOS=(%{WORD:nf_tos}).?*PREC=(%{WORD:nf_prec}).?*TTL=(%{INT:nf_ttl}).?*ID=(%{INT:nf_id}).?*PROTO=(%{WORD:nf_protocol}).?*SPT=(%{INT:nf_src_port}?.*DPT=%{INT:nf_dst_port}?.*)"}
[2018-04-24T17:17:38,862][DEBUG][logstash.filters.grok ] Adding pattern {"SFW2"=>"((%{SYSLOGTIMESTAMP})|(%{TIMESTAMP_ISO8601}))\\s*%{HOSTNAME}\\s*kernel\\S+\\s*%{NAGIOSTIME}\\s*SFW2\\-INext\\-%{NOTSPACE:nf_action}\\s*IN=%{USERNAME:nf_in_interface}.*OUT=((\\s*%{USERNAME:nf_out_interface})|(\\s*))MAC=((%{COMMONMAC:nf_dst_mac}:%{COMMONMAC:nf_src_mac})|(\\s*)).*SRC=%{IP:nf_src_ip}\\s*DST=%{IP:nf_dst_ip}.*PROTO=%{WORD:nf_protocol}((.*SPT=%{INT:nf_src_port}.*DPT=%{INT:nf_dst_port}.*)|())"}
[2018-04-24T17:17:38,862][DEBUG][logstash.filters.grok ] Adding pattern {"USERNAME"=>"[a-zA-Z0-9._-]+"}
[2018-04-24T17:17:38,862][DEBUG][logstash.filters.grok ] Adding pattern {"USER"=>"%{USERNAME}"}
[2018-04-24T17:17:38,862][DEBUG][logstash.filters.grok ] Adding pattern {"EMAILLOCALPART"=>"[a-zA-Z][a-zA-Z0-9_.+-=:]+"}
[2018-04-24T17:17:38,862][DEBUG][logstash.filters.grok ] Adding pattern {"EMAILADDRESS"=>"%{EMAILLOCALPART}@%{HOSTNAME}"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"INT"=>"(?:[+-]?(?:[0-9]+))"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"BASE10NUM"=>"(?<![0-9.+-])(?>[+-]?(?:(?:[0-9]+(?:\\.[0-9]+)?)|(?:\\.[0-9]+)))"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"NUMBER"=>"(?:%{BASE10NUM})"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"BASE16NUM"=>"(?<![0-9A-Fa-f])(?:[+-]?(?:0x)?(?:[0-9A-Fa-f]+))"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"BASE16FLOAT"=>"\\b(?<![0-9A-Fa-f.])(?:[+-]?(?:0x)?(?:(?:[0-9A-Fa-f]+(?:\\.[0-9A-Fa-f]*)?)|(?:\\.[0-9A-Fa-f]+)))\\b"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"POSINT"=>"\\b(?:[1-9][0-9]*)\\b"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"NONNEGINT"=>"\\b(?:[0-9]+)\\b"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"WORD"=>"\\b\\w+\\b"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"NOTSPACE"=>"\\S+"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"SPACE"=>"\\s*"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"DATA"=>".*?"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"GREEDYDATA"=>".*"}
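
[annotation] Worth noting at this point in the load: DATA (.*?) is lazy and GREEDYDATA (.*) is greedy, which is why the composite patterns above use DATA between delimiters and GREEDYDATA only for trailing free text. A small illustration (the sample message and field names are invented):

    filter {
      grok {
        # Against "a=1 b=2 c=3":
        #   head = "a=1 b=2", tail = "3"
        # DATA stops at the first point where the rest of the pattern can
        # match (" c="); GREEDYDATA consumes the remainder of the line.
        match => { "message" => "%{DATA:head} c=%{GREEDYDATA:tail}" }
      }
    }
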
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"QUOTEDSTRING"=>"(?>(?<!\\\\)(?>\"(?>\\\\.|[^\\\\\"]+)+\"|\"\"|(?>'(?>\\\\.|[^\\\\']+)+')|''|(?>`(?>\\\\.|[^\\\\`]+)+`)|``))"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"UUID"=>"[A-Fa-f0-9]{8}-(?:[A-Fa-f0-9]{4}-){3}[A-Fa-f0-9]{12}"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"URN"=>"urn:[0-9A-Za-z][0-9A-Za-z-]{0,31}:(?:%[0-9a-fA-F]{2}|[0-9A-Za-z()+,.:=@;$_!*'/?#-])+"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"MAC"=>"(?:%{CISCOMAC}|%{WINDOWSMAC}|%{COMMONMAC})"}
[2018-04-24T17:17:38,863][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOMAC"=>"(?:(?:[A-Fa-f0-9]{4}\\.){2}[A-Fa-f0-9]{4})"}
[2018-04-24T17:17:38,864][DEBUG][logstash.filters.grok ] Adding pattern {"WINDOWSMAC"=>"(?:(?:[A-Fa-f0-9]{2}-){5}[A-Fa-f0-9]{2})"}
[2018-04-24T17:17:38,864][DEBUG][logstash.filters.grok ] Adding pattern {"COMMONMAC"=>"(?:(?:[A-Fa-f0-9]{2}:){5}[A-Fa-f0-9]{2})"}
[2018-04-24T17:17:38,864][DEBUG][logstash.filters.grok ] Adding pattern {"IPV6"=>"((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:)))(%.+)?"}
[2018-04-24T17:17:38,864][DEBUG][logstash.filters.grok ] Adding pattern {"IPV4"=>"(?<![0-9])(?:(?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5]))(?![0-9])"}
[2018-04-24T17:17:38,864][DEBUG][logstash.filters.grok ] Adding pattern {"IP"=>"(?:%{IPV6}|%{IPV4})"}
[2018-04-24T17:17:38,864][DEBUG][logstash.filters.grok ] Adding pattern {"HOSTNAME"=>"\\b(?:[0-9A-Za-z][0-9A-Za-z-]{0,62})(?:\\.(?:[0-9A-Za-z][0-9A-Za-z-]{0,62}))*(\\.?|\\b)"}
[2018-04-24T17:17:38,864][DEBUG][logstash.filters.grok ] Adding pattern {"IPORHOST"=>"(?:%{IP}|%{HOSTNAME})"}
[2018-04-24T17:17:38,864][DEBUG][logstash.filters.grok ] Adding pattern {"HOSTPORT"=>"%{IPORHOST}:%{POSINT}"}
[2018-04-24T17:17:38,864][DEBUG][logstash.filters.grok ] Adding pattern {"PATH"=>"(?:%{UNIXPATH}|%{WINPATH})"}
[2018-04-24T17:17:38,864][DEBUG][logstash.filters.grok ] Adding pattern {"UNIXPATH"=>"(/([\\w_%!$@:.,+~-]+|\\\\.)*)+"}
[2018-04-24T17:17:38,864][DEBUG][logstash.filters.grok ] Adding pattern {"TTY"=>"(?:/dev/(pts|tty([pq])?)(\\w+)?/?(?:[0-9]+))"}
[2018-04-24T17:17:38,864][DEBUG][logstash.filters.grok ] Adding pattern {"WINPATH"=>"(?>[A-Za-z]+:|\\\\)(?:\\\\[^\\\\?*]*)+"}
[2018-04-24T17:17:38,864][DEBUG][logstash.filters.grok ] Adding pattern {"URIPROTO"=>"[A-Za-z]([A-Za-z0-9+\\-.]+)+"}
[2018-04-24T17:17:38,864][DEBUG][logstash.filters.grok ] Adding pattern {"URIHOST"=>"%{IPORHOST}(?::%{POSINT:port})?"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"URIPATH"=>"(?:/[A-Za-z0-9$.+!*'(){},~:;=@#%&_\\-]*)+"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"URIPARAM"=>"\\?[A-Za-z0-9$.+!*'|(){},~@#%&/=:;_?\\-\\[\\]<>]*"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"URIPATHPARAM"=>"%{URIPATH}(?:%{URIPARAM})?"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"URI"=>"%{URIPROTO}://(?:%{USER}(?::[^@]*)?@)?(?:%{URIHOST})?(?:%{URIPATHPARAM})?"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"MONTH"=>"\\b(?:[Jj]an(?:uary|uar)?|[Ff]eb(?:ruary|ruar)?|[Mm](?:a|ä)?r(?:ch|z)?|[Aa]pr(?:il)?|[Mm]a(?:y|i)?|[Jj]un(?:e|i)?|[Jj]ul(?:y)?|[Aa]ug(?:ust)?|[Ss]ep(?:tember)?|[Oo](?:c|k)?t(?:ober)?|[Nn]ov(?:ember)?|[Dd]e(?:c|z)(?:ember)?)\\b"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHNUM"=>"(?:0?[1-9]|1[0-2])"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHNUM2"=>"(?:0[1-9]|1[0-2])"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHDAY"=>"(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"DAY"=>"(?:Mon(?:day)?|Tue(?:sday)?|Wed(?:nesday)?|Thu(?:rsday)?|Fri(?:day)?|Sat(?:urday)?|Sun(?:day)?)"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"YEAR"=>"(?>\\d\\d){1,2}"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"HOUR"=>"(?:2[0123]|[01]?[0-9])"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"MINUTE"=>"(?:[0-5][0-9])"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"SECOND"=>"(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"TIME"=>"(?!<[0-9])%{HOUR}:%{MINUTE}(?::%{SECOND})(?![0-9])"}
[2018-04-24T17:17:38,865][DEBUG][logstash.filters.grok ] Adding pattern {"DATE_US"=>"%{MONTHNUM}[/-]%{MONTHDAY}[/-]%{YEAR}"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"DATE_EU"=>"%{MONTHDAY}[./-]%{MONTHNUM}[./-]%{YEAR}"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"ISO8601_TIMEZONE"=>"(?:Z|[+-]%{HOUR}(?::?%{MINUTE}))"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"ISO8601_SECOND"=>"(?:%{SECOND}|60)"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"TIMESTAMP_ISO8601"=>"%{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"DATE"=>"%{DATE_US}|%{DATE_EU}"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP"=>"%{DATE}[- ]%{TIME}"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"TZ"=>"(?:[APMCE][SD]T|UTC)"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_RFC822"=>"%{DAY} %{MONTH} %{MONTHDAY} %{YEAR} %{TIME} %{TZ}"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_RFC2822"=>"%{DAY}, %{MONTHDAY} %{MONTH} %{YEAR} %{TIME} %{ISO8601_TIMEZONE}"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_OTHER"=>"%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{TZ} %{YEAR}"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_EVENTLOG"=>"%{YEAR}%{MONTHNUM2}%{MONTHDAY}%{HOUR}%{MINUTE}%{SECOND}"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGTIMESTAMP"=>"%{MONTH} +%{MONTHDAY} %{TIME}"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"PROG"=>"[\\x21-\\x5a\\x5c\\x5e-\\x7e]+"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGPROG"=>"%{PROG:program}(?:\\[%{POSINT:pid}\\])?"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGHOST"=>"%{IPORHOST}"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGFACILITY"=>"<%{NONNEGINT:facility}.%{NONNEGINT:priority}>"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDATE"=>"%{MONTHDAY}/%{MONTH}/%{YEAR}:%{TIME} %{INT}"}
[2018-04-24T17:17:38,866][DEBUG][logstash.filters.grok ] Adding pattern {"QS"=>"%{QUOTEDSTRING}"}
[2018-04-24T17:17:38,867][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGBASE"=>"%{SYSLOGTIMESTAMP:timestamp} (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource} %{SYSLOGPROG}:"}
[2018-04-24T17:17:38,867][DEBUG][logstash.filters.grok ] Adding pattern {"LOGLEVEL"=>"([Aa]lert|ALERT|[Tt]race|TRACE|[Dd]ebug|DEBUG|[Nn]otice|NOTICE|[Ii]nfo|INFO|[Ww]arn?(?:ing)?|WARN?(?:ING)?|[Ee]rr?(?:or)?|ERR?(?:OR)?|[Cc]rit?(?:ical)?|CRIT?(?:ICAL)?|[Ff]atal|FATAL|[Ss]evere|SEVERE|EMERG(?:ENCY)?|[Ee]merg(?:ency)?)"}
[2018-04-24T17:17:38,867][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYTIME"=>"(?!<[0-9])%{HOUR:haproxy_hour}:%{MINUTE:haproxy_minute}(?::%{SECOND:haproxy_second})(?![0-9])"}
[2018-04-24T17:17:38,867][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYDATE"=>"%{MONTHDAY:haproxy_monthday}/%{MONTH:haproxy_month}/%{YEAR:haproxy_year}:%{HAPROXYTIME:haproxy_time}.%{INT:haproxy_milliseconds}"}
[2018-04-24T17:17:38,867][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYCAPTUREDREQUESTHEADERS"=>"%{DATA:captured_request_headers}"}
[2018-04-24T17:17:38,867][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYCAPTUREDRESPONSEHEADERS"=>"%{DATA:captured_response_headers}"}
[2018-04-24T17:17:38,867][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYHTTPBASE"=>"%{IP:client_ip}:%{INT:client_port} \\[%{HAPROXYDATE:accept_date}\\] %{NOTSPACE:frontend_name} %{NOTSPACE:backend_name}/%{NOTSPACE:server_name} %{INT:time_request}/%{INT:time_queue}/%{INT:time_backend_connect}/%{INT:time_backend_response}/%{NOTSPACE:time_duration} %{INT:http_status_code} %{NOTSPACE:bytes_read} %{DATA:captured_request_cookie} %{DATA:captured_response_cookie} %{NOTSPACE:termination_state} %{INT:actconn}/%{INT:feconn}/%{INT:beconn}/%{INT:srvconn}/%{NOTSPACE:retries} %{INT:srv_queue}/%{INT:backend_queue} (\\{%{HAPROXYCAPTUREDREQUESTHEADERS}\\})?( )?(\\{%{HAPROXYCAPTUREDRESPONSEHEADERS}\\})?( )?\"(<BADREQ>|(%{WORD:http_verb} (%{URIPROTO:http_proto}://)?(?:%{USER:http_user}(?::[^@]*)?@)?(?:%{URIHOST:http_host})?(?:%{URIPATHPARAM:http_request})?( HTTP/%{NUMBER:http_version})?))?\""}
[2018-04-24T17:17:38,867][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYHTTP"=>"(?:%{SYSLOGTIMESTAMP:syslog_timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) %{IPORHOST:syslog_server} %{SYSLOGPROG}: %{HAPROXYHTTPBASE}"}
[2018-04-24T17:17:38,867][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYTCP"=>"(?:%{SYSLOGTIMESTAMP:syslog_timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) %{IPORHOST:syslog_server} %{SYSLOGPROG}: %{IP:client_ip}:%{INT:client_port} \\[%{HAPROXYDATE:accept_date}\\] %{NOTSPACE:frontend_name} %{NOTSPACE:backend_name}/%{NOTSPACE:server_name} %{INT:time_queue}/%{INT:time_backend_connect}/%{NOTSPACE:time_duration} %{NOTSPACE:bytes_read} %{NOTSPACE:termination_state} %{INT:actconn}/%{INT:feconn}/%{INT:beconn}/%{INT:srvconn}/%{NOTSPACE:retries} %{INT:srv_queue}/%{INT:backend_queue}"}
[2018-04-24T17:17:38,868][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDUSER"=>"%{EMAILADDRESS}|%{USER}"}
[2018-04-24T17:17:38,868][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDERROR_DATE"=>"%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{YEAR}"}
[2018-04-24T17:17:38,868][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_COMMONLOG"=>"%{IPORHOST:clientip} %{HTTPDUSER:ident} %{HTTPDUSER:auth} \\[%{HTTPDATE:timestamp}\\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{NUMBER:response} (?:%{NUMBER:bytes}|-)"}
[2018-04-24T17:17:38,868][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_COMBINEDLOG"=>"%{HTTPD_COMMONLOG} %{QS:referrer} %{QS:agent}"}
[2018-04-24T17:17:38,868][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD20_ERRORLOG"=>"\\[%{HTTPDERROR_DATE:timestamp}\\] \\[%{LOGLEVEL:loglevel}\\] (?:\\[client %{IPORHOST:clientip}\\] ){0,1}%{GREEDYDATA:message}"}
[2018-04-24T17:17:38,868][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD24_ERRORLOG"=>"\\[%{HTTPDERROR_DATE:timestamp}\\] \\[%{WORD:module}:%{LOGLEVEL:loglevel}\\] \\[pid %{POSINT:pid}(:tid %{NUMBER:tid})?\\]( \\(%{POSINT:proxy_errorcode}\\)%{DATA:proxy_message}:)?( \\[client %{IPORHOST:clientip}:%{POSINT:clientport}\\])?( %{DATA:errorcode}:)? %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,868][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_ERRORLOG"=>"%{HTTPD20_ERRORLOG}|%{HTTPD24_ERRORLOG}"}
[2018-04-24T17:17:38,868][DEBUG][logstash.filters.grok ] Adding pattern {"COMMONAPACHELOG"=>"%{HTTPD_COMMONLOG}"}
[2018-04-24T17:17:38,868][DEBUG][logstash.filters.grok ] Adding pattern {"COMBINEDAPACHELOG"=>"%{HTTPD_COMBINEDLOG}"}
[2018-04-24T17:17:38,868][DEBUG][logstash.filters.grok ] Adding pattern {"JAVACLASS"=>"(?:[a-zA-Z$_][a-zA-Z$_0-9]*\\.)*[a-zA-Z$_][a-zA-Z$_0-9]*"}
[2018-04-24T17:17:38,868][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAFILE"=>"(?:[A-Za-z0-9_. -]+)"}
[2018-04-24T17:17:38,869][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAMETHOD"=>"(?:(<(?:cl)?init>)|[a-zA-Z$_][a-zA-Z$_0-9]*)"}
[2018-04-24T17:17:38,869][DEBUG][logstash.filters.grok ] Adding pattern {"JAVASTACKTRACEPART"=>"%{SPACE}at %{JAVACLASS:class}\\.%{JAVAMETHOD:method}\\(%{JAVAFILE:file}(?::%{NUMBER:line})?\\)"}
[2018-04-24T17:17:38,869][DEBUG][logstash.filters.grok ] Adding pattern {"JAVATHREAD"=>"(?:[A-Z]{2}-Processor[\\d]+)"}
[2018-04-24T17:17:38,869][DEBUG][logstash.filters.grok ] Adding pattern {"JAVACLASS"=>"(?:[a-zA-Z0-9-]+\\.)+[A-Za-z0-9$]+"}
[2018-04-24T17:17:38,869][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAFILE"=>"(?:[A-Za-z0-9_.-]+)"}
[2018-04-24T17:17:38,869][DEBUG][logstash.filters.grok ] Adding pattern {"JAVALOGMESSAGE"=>"(.*)"}
[2018-04-24T17:17:38,869][DEBUG][logstash.filters.grok ] Adding pattern {"CATALINA_DATESTAMP"=>"%{MONTH} %{MONTHDAY}, 20%{YEAR} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) (?:AM|PM)"}
[2018-04-24T17:17:38,869][DEBUG][logstash.filters.grok ] Adding pattern {"TOMCAT_DATESTAMP"=>"20%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) %{ISO8601_TIMEZONE}"}
[2018-04-24T17:17:38,869][DEBUG][logstash.filters.grok ] Adding pattern {"CATALINALOG"=>"%{CATALINA_DATESTAMP:timestamp} %{JAVACLASS:class} %{JAVALOGMESSAGE:logmessage}"}
[2018-04-24T17:17:38,869][DEBUG][logstash.filters.grok ] Adding pattern {"TOMCATLOG"=>"%{TOMCAT_DATESTAMP:timestamp} \\| %{LOGLEVEL:level} \\| %{JAVACLASS:class} - %{JAVALOGMESSAGE:logmessage}"}
[2018-04-24T17:17:38,869][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW_EVENT"=>"(RT_FLOW_SESSION_CREATE|RT_FLOW_SESSION_CLOSE|RT_FLOW_SESSION_DENY)"}
[2018-04-24T17:17:38,869][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW1"=>"%{RT_FLOW_EVENT:event}: %{GREEDYDATA:close-reason}: %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{IP:nat-src-ip}/%{INT:nat-src-port}->%{IP:nat-dst-ip}/%{INT:nat-dst-port} %{DATA:src-nat-rule-name} %{DATA:dst-nat-rule-name} %{INT:protocol-id} %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} %{INT:session-id} \\d+\\(%{DATA:sent}\\) \\d+\\(%{DATA:received}\\) %{INT:elapsed-time} .*"}
[2018-04-24T17:17:38,870][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW2"=>"%{RT_FLOW_EVENT:event}: session created %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{IP:nat-src-ip}/%{INT:nat-src-port}->%{IP:nat-dst-ip}/%{INT:nat-dst-port} %{DATA:src-nat-rule-name} %{DATA:dst-nat-rule-name} %{INT:protocol-id} %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} %{INT:session-id} .*"}
[2018-04-24T17:17:38,870][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW3"=>"%{RT_FLOW_EVENT:event}: session denied %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{INT:protocol-id}\\(\\d\\) %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} .*"}
[2018-04-24T17:17:38,870][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424PRINTASCII"=>"[!-~]+"}
[2018-04-24T17:17:38,870][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGBASE2"=>"(?:%{SYSLOGTIMESTAMP:timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource}+(?: %{SYSLOGPROG}:|)"}
[2018-04-24T17:17:38,870][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGPAMSESSION"=>"%{SYSLOGBASE} (?=%{GREEDYDATA:message})%{WORD:pam_module}\\(%{DATA:pam_caller}\\): session %{WORD:pam_session_state} for user %{USERNAME:username}(?: by %{GREEDYDATA:pam_by})?"}
[2018-04-24T17:17:38,870][DEBUG][logstash.filters.grok ] Adding pattern {"CRON_ACTION"=>"[A-Z ]+"}
[2018-04-24T17:17:38,870][DEBUG][logstash.filters.grok ] Adding pattern {"CRONLOG"=>"%{SYSLOGBASE} \\(%{USER:user}\\) %{CRON_ACTION:action} \\(%{DATA:message}\\)"}
[2018-04-24T17:17:38,870][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGLINE"=>"%{SYSLOGBASE2} %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,870][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424PRI"=>"<%{NONNEGINT:syslog5424_pri}>"}
[2018-04-24T17:17:38,870][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424SD"=>"\\[%{DATA}\\]+"}
[2018-04-24T17:17:38,870][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424BASE"=>"%{SYSLOG5424PRI}%{NONNEGINT:syslog5424_ver} +(?:%{TIMESTAMP_ISO8601:syslog5424_ts}|-) +(?:%{IPORHOST:syslog5424_host}|-) +(-|%{SYSLOG5424PRINTASCII:syslog5424_app}) +(-|%{SYSLOG5424PRINTASCII:syslog5424_proc}) +(-|%{SYSLOG5424PRINTASCII:syslog5424_msgid}) +(?:%{SYSLOG5424SD:syslog5424_sd}|-|)"}
[2018-04-24T17:17:38,870][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424LINE"=>"%{SYSLOG5424BASE} +%{GREEDYDATA:syslog5424_msg}"}
[2018-04-24T17:17:38,871][DEBUG][logstash.filters.grok ] Adding pattern {"MAVEN_VERSION"=>"(?:(\\d+)\\.)?(?:(\\d+)\\.)?(\\*|\\d+)(?:[.-](RELEASE|SNAPSHOT))?"}
[2018-04-24T17:17:38,871][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVEAUDIT"=>"%{TIMESTAMP_ISO8601:timestamp}:"}
[2018-04-24T17:17:38,871][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVE"=>"., \\[%{TIMESTAMP_ISO8601:timestamp} #%{POSINT:pid}\\]%{SPACE}%{LOGLEVEL:event_level}"}
[2018-04-24T17:17:38,871][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVEAUDIT"=>"%{TIMESTAMP_ISO8601:timestamp}:"}
[2018-04-24T17:17:38,871][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_LOG"=>"%{SYSLOGTIMESTAMP:timestamp} \\[%{WORD:component}\\] %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,871][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_QUERY"=>"\\{ (?<={ ).*(?= } ntoreturn:) \\}"}
[2018-04-24T17:17:38,871][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_SLOWQUERY"=>"%{WORD} %{MONGO_WORDDASH:database}\\.%{MONGO_WORDDASH:collection} %{WORD}: %{MONGO_QUERY:query} %{WORD}:%{NONNEGINT:ntoreturn} %{WORD}:%{NONNEGINT:ntoskip} %{WORD}:%{NONNEGINT:nscanned}.*nreturned:%{NONNEGINT:nreturned}..+ (?<duration>[0-9]+)ms"}
[2018-04-24T17:17:38,871][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_WORDDASH"=>"\\b[\\w-]+\\b"}
[2018-04-24T17:17:38,871][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_SEVERITY"=>"\\w"}
[2018-04-24T17:17:38,872][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_COMPONENT"=>"%{WORD}|-"}
[2018-04-24T17:17:38,872][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_LOG"=>"%{TIMESTAMP_ISO8601:timestamp} %{MONGO3_SEVERITY:severity} %{MONGO3_COMPONENT:component}%{SPACE}(?:\\[%{DATA:context}\\])? %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,872][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOSTIME"=>"\\[%{NUMBER:nagios_epoch}\\]"}
[2018-04-24T17:17:38,872][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_CURRENT_SERVICE_STATE"=>"CURRENT SERVICE STATE"}
[2018-04-24T17:17:38,872][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_CURRENT_HOST_STATE"=>"CURRENT HOST STATE"}
[2018-04-24T17:17:38,872][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_NOTIFICATION"=>"SERVICE NOTIFICATION"}
[2018-04-24T17:17:38,872][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_NOTIFICATION"=>"HOST NOTIFICATION"}
[2018-04-24T17:17:38,872][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_ALERT"=>"SERVICE ALERT"}
[2018-04-24T17:17:38,872][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_ALERT"=>"HOST ALERT"}
[2018-04-24T17:17:38,872][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_FLAPPING_ALERT"=>"SERVICE FLAPPING ALERT"}
[2018-04-24T17:17:38,872][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_FLAPPING_ALERT"=>"HOST FLAPPING ALERT"}
[2018-04-24T17:17:38,872][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_DOWNTIME_ALERT"=>"SERVICE DOWNTIME ALERT"}
[2018-04-24T17:17:38,872][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_DOWNTIME_ALERT"=>"HOST DOWNTIME ALERT"}
[2018-04-24T17:17:38,872][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_PASSIVE_SERVICE_CHECK"=>"PASSIVE SERVICE CHECK"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_PASSIVE_HOST_CHECK"=>"PASSIVE HOST CHECK"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_EVENT_HANDLER"=>"SERVICE EVENT HANDLER"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_EVENT_HANDLER"=>"HOST EVENT HANDLER"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_EXTERNAL_COMMAND"=>"EXTERNAL COMMAND"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_TIMEPERIOD_TRANSITION"=>"TIMEPERIOD TRANSITION"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_SVC_CHECK"=>"DISABLE_SVC_CHECK"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_SVC_CHECK"=>"ENABLE_SVC_CHECK"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_CHECK"=>"DISABLE_HOST_CHECK"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_CHECK"=>"ENABLE_HOST_CHECK"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_PROCESS_SERVICE_CHECK_RESULT"=>"PROCESS_SERVICE_CHECK_RESULT"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_PROCESS_HOST_CHECK_RESULT"=>"PROCESS_HOST_CHECK_RESULT"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_SCHEDULE_SERVICE_DOWNTIME"=>"SCHEDULE_SERVICE_DOWNTIME"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_SCHEDULE_HOST_DOWNTIME"=>"SCHEDULE_HOST_DOWNTIME"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_SVC_NOTIFICATIONS"=>"DISABLE_HOST_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_SVC_NOTIFICATIONS"=>"ENABLE_HOST_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_NOTIFICATIONS"=>"DISABLE_HOST_NOTIFICATIONS"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_NOTIFICATIONS"=>"ENABLE_HOST_NOTIFICATIONS"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_SVC_NOTIFICATIONS"=>"DISABLE_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_SVC_NOTIFICATIONS"=>"ENABLE_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,873][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_WARNING"=>"Warning:%{SPACE}%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,874][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_CURRENT_SERVICE_STATE"=>"%{NAGIOS_TYPE_CURRENT_SERVICE_STATE:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statetype};%{DATA:nagios_statecode};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,874][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_CURRENT_HOST_STATE"=>"%{NAGIOS_TYPE_CURRENT_HOST_STATE:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statetype};%{DATA:nagios_statecode};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,874][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_NOTIFICATION"=>"%{NAGIOS_TYPE_SERVICE_NOTIFICATION:nagios_type}: %{DATA:nagios_notifyname};%{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_contact};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,874][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_NOTIFICATION"=>"%{NAGIOS_TYPE_HOST_NOTIFICATION:nagios_type}: %{DATA:nagios_notifyname};%{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_contact};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,874][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_ALERT"=>"%{NAGIOS_TYPE_SERVICE_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{NUMBER:nagios_attempt};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,874][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_ALERT"=>"%{NAGIOS_TYPE_HOST_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{NUMBER:nagios_attempt};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,874][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_FLAPPING_ALERT"=>"%{NAGIOS_TYPE_SERVICE_FLAPPING_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,874][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_FLAPPING_ALERT"=>"%{NAGIOS_TYPE_HOST_FLAPPING_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,874][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_DOWNTIME_ALERT"=>"%{NAGIOS_TYPE_SERVICE_DOWNTIME_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,874][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_DOWNTIME_ALERT"=>"%{NAGIOS_TYPE_HOST_DOWNTIME_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,874][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_PASSIVE_SERVICE_CHECK"=>"%{NAGIOS_TYPE_PASSIVE_SERVICE_CHECK:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,875][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_PASSIVE_HOST_CHECK"=>"%{NAGIOS_TYPE_PASSIVE_HOST_CHECK:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,875][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_EVENT_HANDLER"=>"%{NAGIOS_TYPE_SERVICE_EVENT_HANDLER:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{DATA:nagios_event_handler_name}"}
[2018-04-24T17:17:38,875][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_EVENT_HANDLER"=>"%{NAGIOS_TYPE_HOST_EVENT_HANDLER:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{DATA:nagios_event_handler_name}"}
[2018-04-24T17:17:38,875][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TIMEPERIOD_TRANSITION"=>"%{NAGIOS_TYPE_TIMEPERIOD_TRANSITION:nagios_type}: %{DATA:nagios_service};%{DATA:nagios_unknown1};%{DATA:nagios_unknown2}"}
[2018-04-24T17:17:38,875][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_SVC_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_SVC_CHECK:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service}"}
[2018-04-24T17:17:38,875][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_CHECK:nagios_command};%{DATA:nagios_hostname}"}
[2018-04-24T17:17:38,875][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_SVC_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_SVC_CHECK:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service}"}
[2018-04-24T17:17:38,875][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_CHECK:nagios_command};%{DATA:nagios_hostname}"}
[2018-04-24T17:17:38,875][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_PROCESS_SERVICE_CHECK_RESULT"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_PROCESS_SERVICE_CHECK_RESULT:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_check_result}"}
[2018-04-24T17:17:38,875][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_PROCESS_HOST_CHECK_RESULT"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_PROCESS_HOST_CHECK_RESULT:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_check_result}"}
[2018-04-24T17:17:38,875][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_SVC_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,875][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,875][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_SVC_NOTIFICATIONS:nagios_command};%{DATA:nagios_hostname};%{GREEDYDATA:nagios_service}"}
[2018-04-24T17:17:38,876][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_SVC_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,876][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,876][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_SVC_NOTIFICATIONS:nagios_command};%{DATA:nagios_hostname};%{GREEDYDATA:nagios_service}"}
[2018-04-24T17:17:38,876][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_SCHEDULE_HOST_DOWNTIME"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_SCHEDULE_HOST_DOWNTIME:nagios_command};%{DATA:nagios_hostname};%{NUMBER:nagios_start_time};%{NUMBER:nagios_end_time};%{NUMBER:nagios_fixed};%{NUMBER:nagios_trigger_id};%{NUMBER:nagios_duration};%{DATA:author};%{DATA:comment}"}
[2018-04-24T17:17:38,876][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOSLOGLINE"=>"%{NAGIOSTIME} (?:%{NAGIOS_WARNING}|%{NAGIOS_CURRENT_SERVICE_STATE}|%{NAGIOS_CURRENT_HOST_STATE}|%{NAGIOS_SERVICE_NOTIFICATION}|%{NAGIOS_HOST_NOTIFICATION}|%{NAGIOS_SERVICE_ALERT}|%{NAGIOS_HOST_ALERT}|%{NAGIOS_SERVICE_FLAPPING_ALERT}|%{NAGIOS_HOST_FLAPPING_ALERT}|%{NAGIOS_SERVICE_DOWNTIME_ALERT}|%{NAGIOS_HOST_DOWNTIME_ALERT}|%{NAGIOS_PASSIVE_SERVICE_CHECK}|%{NAGIOS_PASSIVE_HOST_CHECK}|%{NAGIOS_SERVICE_EVENT_HANDLER}|%{NAGIOS_HOST_EVENT_HANDLER}|%{NAGIOS_TIMEPERIOD_TRANSITION}|%{NAGIOS_EC_LINE_DISABLE_SVC_CHECK}|%{NAGIOS_EC_LINE_ENABLE_SVC_CHECK}|%{NAGIOS_EC_LINE_DISABLE_HOST_CHECK}|%{NAGIOS_EC_LINE_ENABLE_HOST_CHECK}|%{NAGIOS_EC_LINE_PROCESS_HOST_CHECK_RESULT}|%{NAGIOS_EC_LINE_PROCESS_SERVICE_CHECK_RESULT}|%{NAGIOS_EC_LINE_SCHEDULE_HOST_DOWNTIME}|%{NAGIOS_EC_LINE_DISABLE_HOST_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_HOST_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_DISABLE_HOST_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_HOST_NOTIFICATIONS}|%{NAGIOS_EC_LINE_DISABLE_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_SVC_NOTIFICATIONS})"}
[2018-04-24T17:17:38,876][DEBUG][logstash.filters.grok ] Adding pattern {"POSTGRESQL"=>"%{DATESTAMP:timestamp} %{TZ} %{DATA:user_id} %{GREEDYDATA:connection_id} %{POSINT:pid}"}
[2018-04-24T17:17:38,876][DEBUG][logstash.filters.grok ] Adding pattern {"RUUID"=>"\\h{32}"}
[2018-04-24T17:17:38,876][DEBUG][logstash.filters.grok ] Adding pattern {"RCONTROLLER"=>"(?<controller>[^#]+)#(?<action>\\w+)"}
[2018-04-24T17:17:38,877][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3HEAD"=>"(?m)Started %{WORD:verb} \"%{URIPATHPARAM:request}\" for %{IPORHOST:clientip} at (?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND} %{ISO8601_TIMEZONE})"}
[2018-04-24T17:17:38,877][DEBUG][logstash.filters.grok ] Adding pattern {"RPROCESSING"=>"\\W*Processing by %{RCONTROLLER} as (?<format>\\S+)(?:\\W*Parameters: {%{DATA:params}}\\W*)?"}
[2018-04-24T17:17:38,877][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3FOOT"=>"Completed %{NUMBER:response}%{DATA} in %{NUMBER:totalms}ms %{RAILS3PROFILE}%{GREEDYDATA}"}
[2018-04-24T17:17:38,877][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3PROFILE"=>"(?:\\(Views: %{NUMBER:viewms}ms \\| ActiveRecord: %{NUMBER:activerecordms}ms|\\(ActiveRecord: %{NUMBER:activerecordms}ms)?"}
[2018-04-24T17:17:38,877][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3"=>"%{RAILS3HEAD}(?:%{RPROCESSING})?(?<context>(?:%{DATA}\\n)*)(?:%{RAILS3FOOT})?"}
[2018-04-24T17:17:38,877][DEBUG][logstash.filters.grok ] Adding pattern {"REDISTIMESTAMP"=>"%{MONTHDAY} %{MONTH} %{TIME}"}
[2018-04-24T17:17:38,877][DEBUG][logstash.filters.grok ] Adding pattern {"REDISLOG"=>"\\[%{POSINT:pid}\\] %{REDISTIMESTAMP:timestamp} \\* "}
[2018-04-24T17:17:38,877][DEBUG][logstash.filters.grok ] Adding pattern {"REDISMONLOG"=>"%{NUMBER:timestamp} \\[%{INT:database} %{IP:client}:%{NUMBER:port}\\] \"%{WORD:command}\"\\s?%{GREEDYDATA:params}"}
[2018-04-24T17:17:38,877][DEBUG][logstash.filters.grok ] Adding pattern {"RUBY_LOGLEVEL"=>"(?:DEBUG|FATAL|ERROR|WARN|INFO)"}
[2018-04-24T17:17:38,877][DEBUG][logstash.filters.grok ] Adding pattern {"RUBY_LOGGER"=>"[DFEWI], \\[%{TIMESTAMP_ISO8601:timestamp} #%{POSINT:pid}\\] *%{RUBY_LOGLEVEL:loglevel} -- +%{DATA:progname}: %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,877][DEBUG][logstash.filters.grok ] Adding pattern {"SQUID3"=>"%{NUMBER:timestamp}\\s+%{NUMBER:duration}\\s%{IP:client_address}\\s%{WORD:cache_result}/%{POSINT:status_code}\\s%{NUMBER:bytes}\\s%{WORD:request_method}\\s%{NOTSPACE:url}\\s(%{NOTSPACE:user}|-)\\s%{WORD:hierarchy_code}/%{IPORHOST:server}\\s%{NOTSPACE:content_type}"}
[2018-04-24T17:17:38,878][DEBUG][logstash.filters.grok ] Adding pattern {"PAYLOAD"=>"[\\s\\S]*"}
[2018-04-24T17:17:38,878][DEBUG][logstash.filters.grok ] Adding pattern {"SPACE"=>"[ ]{1,}"}
[2018-04-24T17:17:38,878][DEBUG][logstash.filters.grok ] Adding pattern {"P_TIMESTAMP"=>"%{MONTH}\\s%{MONTHDAY},\\s%{YEAR}\\s%{TIME}\\s(AM|PM)"}
[2018-04-24T17:17:38,878][DEBUG][logstash.filters.grok ] Adding pattern {"LOGGINGSERVICEPREFIX"=>"[-]{12,18} Event Log Start Here [-]{12,18}\\\\n"}
[2018-04-24T17:17:38,878][DEBUG][logstash.filters.grok ] Adding pattern {"LOGGINGSERVICESUFFIX"=>"\\\\n[-]{12,18} Event Log End Here [-]{12,18}"}
[2018-04-24T17:17:38,878][DEBUG][logstash.filters.grok ] Adding pattern {"XLMLOGGING"=>"[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}:[0-9]{3,7}"}
[2018-04-24T17:17:38,878][DEBUG][logstash.filters.grok ] Adding pattern {"DATESWITHDOTS"=>"[0-9]{4}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{3,7}"}
[2018-04-24T17:17:38,878][DEBUG][logstash.filters.grok ] Adding pattern {"DATESWITHUNDERLINE"=>"[0-9]{4}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,7}"}
[2018-04-24T17:17:38,878][DEBUG][logstash.filters.grok ] replacement_pattern => (?<GREEDYDATA:LevelMessage>.*)
[2018-04-24T17:17:38,878][DEBUG][logstash.filters.grok ] replacement_pattern => (?<TIMESTAMP_ISO8601:logtime>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?)
[2018-04-24T17:17:38,878][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?>\d\d){1,2})
[2018-04-24T17:17:38,878][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:0?[1-9]|1[0-2]))
[2018-04-24T17:17:38,879][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9]))
[2018-04-24T17:17:38,879][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:2[0123]|[01]?[0-9]))
[2018-04-24T17:17:38,879][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:[0-5][0-9]))
[2018-04-24T17:17:38,879][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?))
[2018-04-24T17:17:38,879][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:Z|[+-]%{HOUR}(?::?%{MINUTE})))
[2018-04-24T17:17:38,879][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:2[0123]|[01]?[0-9]))
[2018-04-24T17:17:38,879][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:[0-5][0-9]))
[2018-04-24T17:17:38,879][DEBUG][logstash.filters.grok ] replacement_pattern => (?<GREEDYDATA:SuffixMessage>.*)
[2018-04-24T17:17:38,879][DEBUG][logstash.filters.grok ] Grok compiled OK {:pattern=>"%{GREEDYDATA:LevelMessage} %{TIMESTAMP_ISO8601:logtime}%{GREEDYDATA:SuffixMessage}", :expanded_pattern=>"(?<GREEDYDATA:LevelMessage>.*) (?<TIMESTAMP_ISO8601:logtime>(?:(?>\\d\\d){1,2})-(?:(?:0?[1-9]|1[0-2]))-(?:(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9]))[T ](?:(?:2[0123]|[01]?[0-9])):?(?:(?:[0-5][0-9]))(?::?(?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)))?(?:(?:Z|[+-](?:(?:2[0123]|[01]?[0-9]))(?::?(?:(?:[0-5][0-9])))))?)(?<GREEDYDATA:SuffixMessage>.*)"}
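The "Grok compiled OK" entry above shows the first match pattern fully expanded into its Oniguruma regex. A minimal sketch of the grok filter stanza that would drive this compile step, assuming it matches against the default message field — only the pattern and the patterns path are taken from the log, everything else is a guess:

    filter {
      grok {
        # custom patterns directory, per the "Grok patterns path" entry below
        patterns_dir => ["/etc/logstash/conf.d/patterns"]
        match => { "message" => "%{GREEDYDATA:LevelMessage} %{TIMESTAMP_ISO8601:logtime}%{GREEDYDATA:SuffixMessage}" }
      }
    }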
[2018-04-24T17:17:38,881][DEBUG][logstash.filters.grok ] Grok patterns path {:paths=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-patterns-core-4.1.2/patterns", "/usr/share/logstash/patterns/*"]}
[2018-04-24T17:17:38,882][DEBUG][logstash.filters.grok ] Grok patterns path {:paths=>["/etc/logstash/conf.d/patterns"]}
[2018-04-24T17:17:38,882][DEBUG][logstash.filters.grok ] Match data {:match=>{"message"=>"%{GREEDYDATA:PrefixMessage}/%{GREEDYDATA:PrefixMessageTwo}/%{DATESWITHDOTS:logtimetwo}%{GREEDYDATA:SuffixMessage}"}}
[2018-04-24T17:17:38,882][DEBUG][logstash.filters.grok ] regexp: /message {:pattern=>"%{GREEDYDATA:PrefixMessage}/%{GREEDYDATA:PrefixMessageTwo}/%{DATESWITHDOTS:logtimetwo}%{GREEDYDATA:SuffixMessage}"}
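This second match wraps the custom DATESWITHDOTS pattern between GREEDYDATA captures. One detail worth knowing about DATESWITHDOTS as registered above: its dots are unescaped, so in regex terms each one matches any single character, not just a literal period. A standalone check in plain Python (the sample lines are hypothetical):

    import re

    # DATESWITHDOTS exactly as registered above; an unescaped "." matches ANY character
    DATESWITHDOTS = r"[0-9]{4}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{3,7}"

    dotted = "svc/app/2018.04.24.17.17.38.865 payload"   # the intended shape
    loose  = "svc/app/2018x04x24x17x17x38x865 payload"   # also matches, because of the unescaped dots

    print(re.search(DATESWITHDOTS, dotted).group(0))     # -> 2018.04.24.17.17.38.865
    print(re.search(DATESWITHDOTS, loose).group(0))      # -> 2018x04x24x17x17x38x865

    # Escaping the separators (\.) would pin them to literal periods.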
[2018-04-24T17:17:38,883][DEBUG][logstash.filters.grok ] Adding pattern {"S3_REQUEST_LINE"=>"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})"}
[2018-04-24T17:17:38,883][DEBUG][logstash.filters.grok ] Adding pattern {"S3_ACCESS_LOG"=>"%{WORD:owner} %{NOTSPACE:bucket} \\[%{HTTPDATE:timestamp}\\] %{IP:clientip} %{NOTSPACE:requester} %{NOTSPACE:request_id} %{NOTSPACE:operation} %{NOTSPACE:key} (?:\"%{S3_REQUEST_LINE}\"|-) (?:%{INT:response:int}|-) (?:-|%{NOTSPACE:error_code}) (?:%{INT:bytes:int}|-) (?:%{INT:object_size:int}|-) (?:%{INT:request_time_ms:int}|-) (?:%{INT:turnaround_time_ms:int}|-) (?:%{QS:referrer}|-) (?:\"?%{QS:agent}\"?|-) (?:-|%{NOTSPACE:version_id})"}
[2018-04-24T17:17:38,883][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_URIPATHPARAM"=>"%{URIPATH:path}(?:%{URIPARAM:params})?"}
[2018-04-24T17:17:38,883][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_URI"=>"%{URIPROTO:proto}://(?:%{USER}(?::[^@]*)?@)?(?:%{URIHOST:urihost})?(?:%{ELB_URIPATHPARAM})?"}
[2018-04-24T17:17:38,883][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_REQUEST_LINE"=>"(?:%{WORD:verb} %{ELB_URI:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})"}
[2018-04-24T17:17:38,883][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_ACCESS_LOG"=>"%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:elb} %{IP:clientip}:%{INT:clientport:int} (?:(%{IP:backendip}:?:%{INT:backendport:int})|-) %{NUMBER:request_processing_time:float} %{NUMBER:backend_processing_time:float} %{NUMBER:response_processing_time:float} %{INT:response:int} %{INT:backend_response:int} %{INT:received_bytes:int} %{INT:bytes:int} \"%{ELB_REQUEST_LINE}\""}
[2018-04-24T17:17:38,883][DEBUG][logstash.filters.grok ] Adding pattern {"CLOUDFRONT_ACCESS_LOG"=>"(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}\\t%{TIME})\\t%{WORD:x_edge_location}\\t(?:%{NUMBER:sc_bytes:int}|-)\\t%{IPORHOST:clientip}\\t%{WORD:cs_method}\\t%{HOSTNAME:cs_host}\\t%{NOTSPACE:cs_uri_stem}\\t%{NUMBER:sc_status:int}\\t%{GREEDYDATA:referrer}\\t%{GREEDYDATA:agent}\\t%{GREEDYDATA:cs_uri_query}\\t%{GREEDYDATA:cookies}\\t%{WORD:x_edge_result_type}\\t%{NOTSPACE:x_edge_request_id}\\t%{HOSTNAME:x_host_header}\\t%{URIPROTO:cs_protocol}\\t%{INT:cs_bytes:int}\\t%{GREEDYDATA:time_taken:float}\\t%{GREEDYDATA:x_forwarded_for}\\t%{GREEDYDATA:ssl_protocol}\\t%{GREEDYDATA:ssl_cipher}\\t%{GREEDYDATA:x_edge_response_result_type}"}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_TIMESTAMP"=>"%{MONTHDAY}-%{MONTH} %{HOUR}:%{MINUTE}"}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_HOST"=>"[a-zA-Z0-9-]+"}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_VOLUME"=>"%{USER}"}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_DEVICE"=>"%{USER}"}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_DEVICEPATH"=>"%{UNIXPATH}"}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_CAPACITY"=>"%{INT}{1,3}(,%{INT}{3})*"}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_VERSION"=>"%{USER}"}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_JOB"=>"%{USER}"}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MAX_CAPACITY"=>"User defined maximum volume capacity %{BACULA_CAPACITY} exceeded on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\)"}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_END_VOLUME"=>"End of medium on Volume \\\"%{BACULA_VOLUME:volume}\\\" Bytes=%{BACULA_CAPACITY} Blocks=%{BACULA_CAPACITY} at %{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}:%{MINUTE}."}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_VOLUME"=>"Created new Volume \\\"%{BACULA_VOLUME:volume}\\\" in catalog."}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_LABEL"=>"Labeled new Volume \\\"%{BACULA_VOLUME:volume}\\\" on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\)."}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_WROTE_LABEL"=>"Wrote label to prelabeled Volume \\\"%{BACULA_VOLUME:volume}\\\" on device \\\"%{BACULA_DEVICE}\\\" \\(%{BACULA_DEVICEPATH}\\)"}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_MOUNT"=>"New volume \\\"%{BACULA_VOLUME:volume}\\\" mounted on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\) at %{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}:%{MINUTE}."}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOOPEN"=>"\\s+Cannot open %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOOPENDIR"=>"\\s+Could not open directory %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOSTAT"=>"\\s+Could not stat %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOJOBS"=>"There are no more Jobs associated with Volume \\\"%{BACULA_VOLUME:volume}\\\". Marking it purged."}
[2018-04-24T17:17:38,884][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_ALL_RECORDS_PRUNED"=>"All records pruned from Volume \\\"%{BACULA_VOLUME:volume}\\\"; marking it \\\"Purged\\\""}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_BEGIN_PRUNE_JOBS"=>"Begin pruning Jobs older than %{INT} month %{INT} days ."}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_BEGIN_PRUNE_FILES"=>"Begin pruning Files."}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_PRUNED_JOBS"=>"Pruned %{INT} Jobs* for client %{BACULA_HOST:client} from catalog."}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_PRUNED_FILES"=>"Pruned Files from %{INT} Jobs* for client %{BACULA_HOST:client} from catalog."}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_ENDPRUNE"=>"End auto prune."}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_STARTJOB"=>"Start Backup JobId %{INT}, Job=%{BACULA_JOB:job}"}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_STARTRESTORE"=>"Start Restore Job %{BACULA_JOB:job}"}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_USEDEVICE"=>"Using Device \\\"%{BACULA_DEVICE:device}\\\""}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_DIFF_FS"=>"\\s+%{UNIXPATH} is a different filesystem. Will not descend from %{UNIXPATH} into it."}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_JOBEND"=>"Job write elapsed time = %{DATA:elapsed}, Transfer rate = %{NUMBER} (K|M|G)? Bytes/second"}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRUNE_JOBS"=>"No Jobs found to prune."}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRUNE_FILES"=>"No Files found to prune."}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_VOLUME_PREVWRITTEN"=>"Volume \\\"%{BACULA_VOLUME:volume}\\\" previously written, moving to end of data."}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_READYAPPEND"=>"Ready to append to end of Volume \\\"%{BACULA_VOLUME:volume}\\\" size=%{INT}"}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_CANCELLING"=>"Cancelling duplicate JobId=%{INT}."}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MARKCANCEL"=>"JobId %{INT}, Job %{BACULA_JOB:job} marked to be canceled."}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_CLIENT_RBJ"=>"shell command: run ClientRunBeforeJob \\\"%{GREEDYDATA:runjob}\\\""}
[2018-04-24T17:17:38,885][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_VSS"=>"(Generate )?VSS (Writer)?"}
[2018-04-24T17:17:38,886][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MAXSTART"=>"Fatal error: Job canceled because max start delay time exceeded."}
[2018-04-24T17:17:38,886][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_DUPLICATE"=>"Fatal error: JobId %{INT:duplicate} already running. Duplicate job not allowed."}
[2018-04-24T17:17:38,886][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOJOBSTAT"=>"Fatal error: No Job status returned from FD."}
[2018-04-24T17:17:38,886][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_FATAL_CONN"=>"Fatal error: bsock.c:133 Unable to connect to (Client: %{BACULA_HOST:client}|Storage daemon) on %{HOSTNAME}:%{POSINT}. ERR=(?<berror>%{GREEDYDATA})"}
[2018-04-24T17:17:38,886][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NO_CONNECT"=>"Warning: bsock.c:127 Could not connect to (Client: %{BACULA_HOST:client}|Storage daemon) on %{HOSTNAME}:%{POSINT}. ERR=(?<berror>%{GREEDYDATA})"}
[2018-04-24T17:17:38,886][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NO_AUTH"=>"Fatal error: Unable to authenticate with File daemon at %{HOSTNAME}. Possible causes:"}
[2018-04-24T17:17:38,886][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOSUIT"=>"No prior or suitable Full backup found in catalog. Doing FULL backup."}
[2018-04-24T17:17:38,886][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRIOR"=>"No prior Full backup Job record found."}
[2018-04-24T17:17:38,886][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_JOB"=>"(Error: )?Bacula %{BACULA_HOST} %{BACULA_VERSION} \\(%{BACULA_VERSION}\\):"}
[2018-04-24T17:17:38,886][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOGLINE"=>"%{BACULA_TIMESTAMP:bts} %{BACULA_HOST:hostname} JobId %{INT:jobid}: (%{BACULA_LOG_MAX_CAPACITY}|%{BACULA_LOG_END_VOLUME}|%{BACULA_LOG_NEW_VOLUME}|%{BACULA_LOG_NEW_LABEL}|%{BACULA_LOG_WROTE_LABEL}|%{BACULA_LOG_NEW_MOUNT}|%{BACULA_LOG_NOOPEN}|%{BACULA_LOG_NOOPENDIR}|%{BACULA_LOG_NOSTAT}|%{BACULA_LOG_NOJOBS}|%{BACULA_LOG_ALL_RECORDS_PRUNED}|%{BACULA_LOG_BEGIN_PRUNE_JOBS}|%{BACULA_LOG_BEGIN_PRUNE_FILES}|%{BACULA_LOG_PRUNED_JOBS}|%{BACULA_LOG_PRUNED_FILES}|%{BACULA_LOG_ENDPRUNE}|%{BACULA_LOG_STARTJOB}|%{BACULA_LOG_STARTRESTORE}|%{BACULA_LOG_USEDEVICE}|%{BACULA_LOG_DIFF_FS}|%{BACULA_LOG_JOBEND}|%{BACULA_LOG_NOPRUNE_JOBS}|%{BACULA_LOG_NOPRUNE_FILES}|%{BACULA_LOG_VOLUME_PREVWRITTEN}|%{BACULA_LOG_READYAPPEND}|%{BACULA_LOG_CANCELLING}|%{BACULA_LOG_MARKCANCEL}|%{BACULA_LOG_CLIENT_RBJ}|%{BACULA_LOG_VSS}|%{BACULA_LOG_MAXSTART}|%{BACULA_LOG_DUPLICATE}|%{BACULA_LOG_NOJOBSTAT}|%{BACULA_LOG_FATAL_CONN}|%{BACULA_LOG_NO_CONNECT}|%{BACULA_LOG_NO_AUTH}|%{BACULA_LOG_NOSUIT}|%{BACULA_LOG_JOB}|%{BACULA_LOG_NOPRIOR})"}
[2018-04-24T17:17:38,886][DEBUG][logstash.filters.grok ] Adding pattern {"BIND9_TIMESTAMP"=>"%{MONTHDAY}[-]%{MONTH}[-]%{YEAR} %{TIME}"}
[2018-04-24T17:17:38,886][DEBUG][logstash.filters.grok ] Adding pattern {"BIND9"=>"%{BIND9_TIMESTAMP:timestamp} queries: %{LOGLEVEL:loglevel}: client %{IP:clientip}#%{POSINT:clientport} \\(%{GREEDYDATA:query}\\): query: %{GREEDYDATA:query} IN %{GREEDYDATA:querytype} \\(%{IP:dns}\\)"}
[2018-04-24T17:17:38,887][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_HTTP"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{INT:trans_depth}\\t%{GREEDYDATA:method}\\t%{GREEDYDATA:domain}\\t%{GREEDYDATA:uri}\\t%{GREEDYDATA:referrer}\\t%{GREEDYDATA:user_agent}\\t%{NUMBER:request_body_len}\\t%{NUMBER:response_body_len}\\t%{GREEDYDATA:status_code}\\t%{GREEDYDATA:status_msg}\\t%{GREEDYDATA:info_code}\\t%{GREEDYDATA:info_msg}\\t%{GREEDYDATA:filename}\\t%{GREEDYDATA:bro_tags}\\t%{GREEDYDATA:username}\\t%{GREEDYDATA:password}\\t%{GREEDYDATA:proxied}\\t%{GREEDYDATA:orig_fuids}\\t%{GREEDYDATA:orig_mime_types}\\t%{GREEDYDATA:resp_fuids}\\t%{GREEDYDATA:resp_mime_types}"}
[2018-04-24T17:17:38,887][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_DNS"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{WORD:proto}\\t%{INT:trans_id}\\t%{GREEDYDATA:query}\\t%{GREEDYDATA:qclass}\\t%{GREEDYDATA:qclass_name}\\t%{GREEDYDATA:qtype}\\t%{GREEDYDATA:qtype_name}\\t%{GREEDYDATA:rcode}\\t%{GREEDYDATA:rcode_name}\\t%{GREEDYDATA:AA}\\t%{GREEDYDATA:TC}\\t%{GREEDYDATA:RD}\\t%{GREEDYDATA:RA}\\t%{GREEDYDATA:Z}\\t%{GREEDYDATA:answers}\\t%{GREEDYDATA:TTLs}\\t%{GREEDYDATA:rejected}"}
[2018-04-24T17:17:38,887][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_CONN"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{WORD:proto}\\t%{GREEDYDATA:service}\\t%{NUMBER:duration}\\t%{NUMBER:orig_bytes}\\t%{NUMBER:resp_bytes}\\t%{GREEDYDATA:conn_state}\\t%{GREEDYDATA:local_orig}\\t%{GREEDYDATA:missed_bytes}\\t%{GREEDYDATA:history}\\t%{GREEDYDATA:orig_pkts}\\t%{GREEDYDATA:orig_ip_bytes}\\t%{GREEDYDATA:resp_pkts}\\t%{GREEDYDATA:resp_ip_bytes}\\t%{GREEDYDATA:tunnel_parents}"}
[2018-04-24T17:17:38,887][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_FILES"=>"%{NUMBER:ts}\\t%{NOTSPACE:fuid}\\t%{IP:tx_hosts}\\t%{IP:rx_hosts}\\t%{NOTSPACE:conn_uids}\\t%{GREEDYDATA:source}\\t%{GREEDYDATA:depth}\\t%{GREEDYDATA:analyzers}\\t%{GREEDYDATA:mime_type}\\t%{GREEDYDATA:filename}\\t%{GREEDYDATA:duration}\\t%{GREEDYDATA:local_orig}\\t%{GREEDYDATA:is_orig}\\t%{GREEDYDATA:seen_bytes}\\t%{GREEDYDATA:total_bytes}\\t%{GREEDYDATA:missing_bytes}\\t%{GREEDYDATA:overflow_bytes}\\t%{GREEDYDATA:timedout}\\t%{GREEDYDATA:parent_fuid}\\t%{GREEDYDATA:md5}\\t%{GREEDYDATA:sha1}\\t%{GREEDYDATA:sha256}\\t%{GREEDYDATA:extracted}"}
[2018-04-24T17:17:38,887][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_MSGID"=>"[0-9A-Za-z]{6}-[0-9A-Za-z]{6}-[0-9A-Za-z]{2}"}
[2018-04-24T17:17:38,887][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_FLAGS"=>"(<=|[-=>*]>|[*]{2}|==)"}
[2018-04-24T17:17:38,887][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_DATE"=>"%{YEAR:exim_year}-%{MONTHNUM:exim_month}-%{MONTHDAY:exim_day} %{TIME:exim_time}"}
[2018-04-24T17:17:38,888][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_PID"=>"\\[%{POSINT}\\]"}
[2018-04-24T17:17:38,888][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_QT"=>"((\\d+y)?(\\d+w)?(\\d+d)?(\\d+h)?(\\d+m)?(\\d+s)?)"}
[2018-04-24T17:17:38,888][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_EXCLUDE_TERMS"=>"(Message is frozen|(Start|End) queue run| Warning: | retry time not reached | no (IP address|host name) found for (IP address|host) | unexpected disconnection while reading SMTP command | no immediate delivery: |another process is handling this message)"}
[2018-04-24T17:17:38,888][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_REMOTE_HOST"=>"(H=(%{NOTSPACE:remote_hostname} )?(\\(%{NOTSPACE:remote_heloname}\\) )?\\[%{IP:remote_host}\\])"}
[2018-04-24T17:17:38,888][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_INTERFACE"=>"(I=\\[%{IP:exim_interface}\\](:%{NUMBER:exim_interface_port}))"}
[2018-04-24T17:17:38,888][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_PROTOCOL"=>"(P=%{NOTSPACE:protocol})"}
[2018-04-24T17:17:38,888][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_MSG_SIZE"=>"(S=%{NUMBER:exim_msg_size})"}
[2018-04-24T17:17:38,888][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_HEADER_ID"=>"(id=%{NOTSPACE:exim_header_id})"}
[2018-04-24T17:17:38,888][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_SUBJECT"=>"(T=%{QS:exim_subject})"}
[2018-04-24T17:17:38,888][DEBUG][logstash.filters.grok ] Adding pattern {"NETSCREENSESSIONLOG"=>"%{SYSLOGTIMESTAMP:date} %{IPORHOST:device} %{IPORHOST}: NetScreen device_id=%{WORD:device_id}%{DATA}: start_time=%{QUOTEDSTRING:start_time} duration=%{INT:duration} policy_id=%{INT:policy_id} service=%{DATA:service} proto=%{INT:proto} src zone=%{WORD:src_zone} dst zone=%{WORD:dst_zone} action=%{WORD:action} sent=%{INT:sent} rcvd=%{INT:rcvd} src=%{IPORHOST:src_ip} dst=%{IPORHOST:dst_ip} src_port=%{INT:src_port} dst_port=%{INT:dst_port} src-xlated ip=%{IPORHOST:src_xlated_ip} port=%{INT:src_xlated_port} dst-xlated ip=%{IPORHOST:dst_xlated_ip} port=%{INT:dst_xlated_port} session_id=%{INT:session_id} reason=%{GREEDYDATA:reason}"}
[2018-04-24T17:17:38,888][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_TAGGED_SYSLOG"=>"^<%{POSINT:syslog_pri}>%{CISCOTIMESTAMP:timestamp}( %{SYSLOGHOST:sysloghost})? ?: %%{CISCOTAG:ciscotag}:"}
[2018-04-24T17:17:38,888][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOTIMESTAMP"=>"%{MONTH} +%{MONTHDAY}(?: %{YEAR})? %{TIME}"}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOTAG"=>"[A-Z0-9]+-%{INT}-(?:[A-Z0-9_]+)"}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_ACTION"=>"Built|Teardown|Deny|Denied|denied|requested|permitted|denied by ACL|discarded|est-allowed|Dropping|created|deleted"}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_REASON"=>"Duplicate TCP SYN|Failed to locate egress interface|Invalid transport field|No matching connection|DNS Response|DNS Query|(?:%{WORD}\\s*)*"}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_DIRECTION"=>"Inbound|inbound|Outbound|outbound"}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_INTERVAL"=>"first hit|%{INT}-second interval"}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_XLATE_TYPE"=>"static|dynamic"}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104001"=>"\\((?:Primary|Secondary)\\) Switching to ACTIVE - %{GREEDYDATA:switch_reason}"}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104002"=>"\\((?:Primary|Secondary)\\) Switching to STANDBY - %{GREEDYDATA:switch_reason}"}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104003"=>"\\((?:Primary|Secondary)\\) Switching to FAILED\\."}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104004"=>"\\((?:Primary|Secondary)\\) Switching to OK\\."}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105003"=>"\\((?:Primary|Secondary)\\) Monitoring on [Ii]nterface %{GREEDYDATA:interface_name} waiting"}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105004"=>"\\((?:Primary|Secondary)\\) Monitoring on [Ii]nterface %{GREEDYDATA:interface_name} normal"}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105005"=>"\\((?:Primary|Secondary)\\) Lost Failover communications with mate on [Ii]nterface %{GREEDYDATA:interface_name}"}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105008"=>"\\((?:Primary|Secondary)\\) Testing [Ii]nterface %{GREEDYDATA:interface_name}"}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105009"=>"\\((?:Primary|Secondary)\\) Testing on [Ii]nterface %{GREEDYDATA:interface_name} (?:Passed|Failed)"}
[2018-04-24T17:17:38,889][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106001"=>"%{CISCO_DIRECTION:direction} %{WORD:protocol} connection %{CISCO_ACTION:action} from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port} flags %{GREEDYDATA:tcp_flags} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,890][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106006_106007_106010"=>"%{CISCO_ACTION:action} %{CISCO_DIRECTION:direction} %{WORD:protocol} (?:from|src) %{IP:src_ip}/%{INT:src_port}(\\(%{DATA:src_fwuser}\\))? (?:to|dst) %{IP:dst_ip}/%{INT:dst_port}(\\(%{DATA:dst_fwuser}\\))? (?:on interface %{DATA:interface}|due to %{CISCO_REASON:reason})"}
[2018-04-24T17:17:38,890][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106014"=>"%{CISCO_ACTION:action} %{CISCO_DIRECTION:direction} %{WORD:protocol} src %{DATA:src_interface}:%{IP:src_ip}(\\(%{DATA:src_fwuser}\\))? dst %{DATA:dst_interface}:%{IP:dst_ip}(\\(%{DATA:dst_fwuser}\\))? \\(type %{INT:icmp_type}, code %{INT:icmp_code}\\)"}
[2018-04-24T17:17:38,890][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106015"=>"%{CISCO_ACTION:action} %{WORD:protocol} \\(%{DATA:policy_id}\\) from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port} flags %{DATA:tcp_flags} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,890][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106021"=>"%{CISCO_ACTION:action} %{WORD:protocol} reverse path check from %{IP:src_ip} to %{IP:dst_ip} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,890][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106023"=>"%{CISCO_ACTION:action}( protocol)? %{WORD:protocol} src %{DATA:src_interface}:%{DATA:src_ip}(/%{INT:src_port})?(\\(%{DATA:src_fwuser}\\))? dst %{DATA:dst_interface}:%{DATA:dst_ip}(/%{INT:dst_port})?(\\(%{DATA:dst_fwuser}\\))?( \\(type %{INT:icmp_type}, code %{INT:icmp_code}\\))? by access-group \"?%{DATA:policy_id}\"? \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,890][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106100_2_3"=>"access-list %{NOTSPACE:policy_id} %{CISCO_ACTION:action} %{WORD:protocol} for user '%{DATA:src_fwuser}' %{DATA:src_interface}/%{IP:src_ip}\\(%{INT:src_port}\\) -> %{DATA:dst_interface}/%{IP:dst_ip}\\(%{INT:dst_port}\\) hit-cnt %{INT:hit_count} %{CISCO_INTERVAL:interval} \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,890][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106100"=>"access-list %{NOTSPACE:policy_id} %{CISCO_ACTION:action} %{WORD:protocol} %{DATA:src_interface}/%{IP:src_ip}\\(%{INT:src_port}\\)(\\(%{DATA:src_fwuser}\\))? -> %{DATA:dst_interface}/%{IP:dst_ip}\\(%{INT:dst_port}\\)(\\(%{DATA:src_fwuser}\\))? hit-cnt %{INT:hit_count} %{CISCO_INTERVAL:interval} \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,890][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW304001"=>"%{IP:src_ip}(\\(%{DATA:src_fwuser}\\))? Accessed URL %{IP:dst_ip}:%{GREEDYDATA:dst_url}"}
[2018-04-24T17:17:38,890][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW110002"=>"%{CISCO_REASON:reason} for %{WORD:protocol} from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,890][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302010"=>"%{INT:connection_count} in use, %{INT:connection_count_max} most used"}
[2018-04-24T17:17:38,890][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302013_302014_302015_302016"=>"%{CISCO_ACTION:action}(?: %{CISCO_DIRECTION:direction})? %{WORD:protocol} connection %{INT:connection_id} for %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port}( \\(%{IP:src_mapped_ip}/%{INT:src_mapped_port}\\))?(\\(%{DATA:src_fwuser}\\))? to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}( \\(%{IP:dst_mapped_ip}/%{INT:dst_mapped_port}\\))?(\\(%{DATA:dst_fwuser}\\))?( duration %{TIME:duration} bytes %{INT:bytes})?(?: %{CISCO_REASON:reason})?( \\(%{DATA:user}\\))?"}
[2018-04-24T17:17:38,890][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302020_302021"=>"%{CISCO_ACTION:action}(?: %{CISCO_DIRECTION:direction})? %{WORD:protocol} connection for faddr %{IP:dst_ip}/%{INT:icmp_seq_num}(?:\\(%{DATA:fwuser}\\))? gaddr %{IP:src_xlated_ip}/%{INT:icmp_code_xlated} laddr %{IP:src_ip}/%{INT:icmp_code}( \\(%{DATA:user}\\))?"}
[2018-04-24T17:17:38,890][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW305011"=>"%{CISCO_ACTION:action} %{CISCO_XLATE_TYPE:xlate_type} %{WORD:protocol} translation from %{DATA:src_interface}:%{IP:src_ip}(/%{INT:src_port})?(\\(%{DATA:src_fwuser}\\))? to %{DATA:src_xlated_interface}:%{IP:src_xlated_ip}/%{DATA:src_xlated_port}"}
[2018-04-24T17:17:38,890][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW313001_313004_313008"=>"%{CISCO_ACTION:action} %{WORD:protocol} type=%{INT:icmp_type}, code=%{INT:icmp_code} from %{IP:src_ip} on interface %{DATA:interface}( to %{IP:dst_ip})?"}
[2018-04-24T17:17:38,891][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW313005"=>"%{CISCO_REASON:reason} for %{WORD:protocol} error message: %{WORD:err_protocol} src %{DATA:err_src_interface}:%{IP:err_src_ip}(\\(%{DATA:err_src_fwuser}\\))? dst %{DATA:err_dst_interface}:%{IP:err_dst_ip}(\\(%{DATA:err_dst_fwuser}\\))? \\(type %{INT:err_icmp_type}, code %{INT:err_icmp_code}\\) on %{DATA:interface} interface\\. Original IP payload: %{WORD:protocol} src %{IP:orig_src_ip}/%{INT:orig_src_port}(\\(%{DATA:orig_src_fwuser}\\))? dst %{IP:orig_dst_ip}/%{INT:orig_dst_port}(\\(%{DATA:orig_dst_fwuser}\\))?"}
[2018-04-24T17:17:38,891][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW321001"=>"Resource '%{WORD:resource_name}' limit of %{POSINT:resource_limit} reached for system"}
[2018-04-24T17:17:38,891][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW402117"=>"%{WORD:protocol}: Received a non-IPSec packet \\(protocol= %{WORD:orig_protocol}\\) from %{IP:src_ip} to %{IP:dst_ip}"}
[2018-04-24T17:17:38,891][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW402119"=>"%{WORD:protocol}: Received an %{WORD:orig_protocol} packet \\(SPI= %{DATA:spi}, sequence number= %{DATA:seq_num}\\) from %{IP:src_ip} \\(user= %{DATA:user}\\) to %{IP:dst_ip} that failed anti-replay checking"}
[2018-04-24T17:17:38,891][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW419001"=>"%{CISCO_ACTION:action} %{WORD:protocol} packet from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}, reason: %{GREEDYDATA:reason}"}
[2018-04-24T17:17:38,891][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW419002"=>"%{CISCO_REASON:reason} from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port} with different initial sequence number"}
[2018-04-24T17:17:38,891][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW500004"=>"%{CISCO_REASON:reason} for protocol=%{WORD:protocol}, from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,891][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW602303_602304"=>"%{WORD:protocol}: An %{CISCO_DIRECTION:direction} %{GREEDYDATA:tunnel_type} SA \\(SPI= %{DATA:spi}\\) between %{IP:src_ip} and %{IP:dst_ip} \\(user= %{DATA:user}\\) has been %{CISCO_ACTION:action}"}
[2018-04-24T17:17:38,891][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW710001_710002_710003_710005_710006"=>"%{WORD:protocol} (?:request|access) %{CISCO_ACTION:action} from %{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,891][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW713172"=>"Group = %{GREEDYDATA:group}, IP = %{IP:src_ip}, Automatic NAT Detection Status:\\s+Remote end\\s*%{DATA:is_remote_natted}\\s*behind a NAT device\\s+This\\s+end\\s*%{DATA:is_local_natted}\\s*behind a NAT device"}
[2018-04-24T17:17:38,891][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW733100"=>"\\[\\s*%{DATA:drop_type}\\s*\\] drop %{DATA:drop_rate_id} exceeded. Current burst rate is %{INT:drop_rate_current_burst} per second, max configured rate is %{INT:drop_rate_max_burst}; Current average rate is %{INT:drop_rate_current_avg} per second, max configured rate is %{INT:drop_rate_max_avg}; Cumulative total count is %{INT:drop_total_count}"}
[2018-04-24T17:17:38,891][DEBUG][logstash.filters.grok ] Adding pattern {"SHOREWALL"=>"(%{SYSLOGTIMESTAMP:timestamp}) (%{WORD:nf_host}) kernel:.*Shorewall:(%{WORD:nf_action1})?:(%{WORD:nf_action2})?.*IN=(%{USERNAME:nf_in_interface})?.*(OUT= *MAC=(%{COMMONMAC:nf_dst_mac}):(%{COMMONMAC:nf_src_mac})?|OUT=%{USERNAME:nf_out_interface}).*SRC=(%{IPV4:nf_src_ip}).*DST=(%{IPV4:nf_dst_ip}).*LEN=(%{WORD:nf_len}).?*TOS=(%{WORD:nf_tos}).?*PREC=(%{WORD:nf_prec}).?*TTL=(%{INT:nf_ttl}).?*ID=(%{INT:nf_id}).?*PROTO=(%{WORD:nf_protocol}).?*SPT=(%{INT:nf_src_port}?.*DPT=%{INT:nf_dst_port}?.*)"}
[2018-04-24T17:17:38,892][DEBUG][logstash.filters.grok ] Adding pattern {"SFW2"=>"((%{SYSLOGTIMESTAMP})|(%{TIMESTAMP_ISO8601}))\\s*%{HOSTNAME}\\s*kernel\\S+\\s*%{NAGIOSTIME}\\s*SFW2\\-INext\\-%{NOTSPACE:nf_action}\\s*IN=%{USERNAME:nf_in_interface}.*OUT=((\\s*%{USERNAME:nf_out_interface})|(\\s*))MAC=((%{COMMONMAC:nf_dst_mac}:%{COMMONMAC:nf_src_mac})|(\\s*)).*SRC=%{IP:nf_src_ip}\\s*DST=%{IP:nf_dst_ip}.*PROTO=%{WORD:nf_protocol}((.*SPT=%{INT:nf_src_port}.*DPT=%{INT:nf_dst_port}.*)|())"}
[2018-04-24T17:17:38,892][DEBUG][logstash.filters.grok ] Adding pattern {"USERNAME"=>"[a-zA-Z0-9._-]+"}
[2018-04-24T17:17:38,892][DEBUG][logstash.filters.grok ] Adding pattern {"USER"=>"%{USERNAME}"}
[2018-04-24T17:17:38,892][DEBUG][logstash.filters.grok ] Adding pattern {"EMAILLOCALPART"=>"[a-zA-Z][a-zA-Z0-9_.+-=:]+"}
[2018-04-24T17:17:38,892][DEBUG][logstash.filters.grok ] Adding pattern {"EMAILADDRESS"=>"%{EMAILLOCALPART}@%{HOSTNAME}"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"INT"=>"(?:[+-]?(?:[0-9]+))"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"BASE10NUM"=>"(?<![0-9.+-])(?>[+-]?(?:(?:[0-9]+(?:\\.[0-9]+)?)|(?:\\.[0-9]+)))"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"NUMBER"=>"(?:%{BASE10NUM})"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"BASE16NUM"=>"(?<![0-9A-Fa-f])(?:[+-]?(?:0x)?(?:[0-9A-Fa-f]+))"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"BASE16FLOAT"=>"\\b(?<![0-9A-Fa-f.])(?:[+-]?(?:0x)?(?:(?:[0-9A-Fa-f]+(?:\\.[0-9A-Fa-f]*)?)|(?:\\.[0-9A-Fa-f]+)))\\b"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"POSINT"=>"\\b(?:[1-9][0-9]*)\\b"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"NONNEGINT"=>"\\b(?:[0-9]+)\\b"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"WORD"=>"\\b\\w+\\b"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"NOTSPACE"=>"\\S+"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"SPACE"=>"\\s*"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"DATA"=>".*?"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"GREEDYDATA"=>".*"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"QUOTEDSTRING"=>"(?>(?<!\\\\)(?>\"(?>\\\\.|[^\\\\\"]+)+\"|\"\"|(?>'(?>\\\\.|[^\\\\']+)+')|''|(?>`(?>\\\\.|[^\\\\`]+)+`)|``))"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"UUID"=>"[A-Fa-f0-9]{8}-(?:[A-Fa-f0-9]{4}-){3}[A-Fa-f0-9]{12}"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"URN"=>"urn:[0-9A-Za-z][0-9A-Za-z-]{0,31}:(?:%[0-9a-fA-F]{2}|[0-9A-Za-z()+,.:=@;$_!*'/?#-])+"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"MAC"=>"(?:%{CISCOMAC}|%{WINDOWSMAC}|%{COMMONMAC})"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOMAC"=>"(?:(?:[A-Fa-f0-9]{4}\\.){2}[A-Fa-f0-9]{4})"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"WINDOWSMAC"=>"(?:(?:[A-Fa-f0-9]{2}-){5}[A-Fa-f0-9]{2})"}
[2018-04-24T17:17:38,893][DEBUG][logstash.filters.grok ] Adding pattern {"COMMONMAC"=>"(?:(?:[A-Fa-f0-9]{2}:){5}[A-Fa-f0-9]{2})"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"IPV6"=>"((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:)))(%.+)?"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"IPV4"=>"(?<![0-9])(?:(?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5]))(?![0-9])"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"IP"=>"(?:%{IPV6}|%{IPV4})"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"HOSTNAME"=>"\\b(?:[0-9A-Za-z][0-9A-Za-z-]{0,62})(?:\\.(?:[0-9A-Za-z][0-9A-Za-z-]{0,62}))*(\\.?|\\b)"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"IPORHOST"=>"(?:%{IP}|%{HOSTNAME})"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"HOSTPORT"=>"%{IPORHOST}:%{POSINT}"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"PATH"=>"(?:%{UNIXPATH}|%{WINPATH})"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"UNIXPATH"=>"(/([\\w_%!$@:.,+~-]+|\\\\.)*)+"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"TTY"=>"(?:/dev/(pts|tty([pq])?)(\\w+)?/?(?:[0-9]+))"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"WINPATH"=>"(?>[A-Za-z]+:|\\\\)(?:\\\\[^\\\\?*]*)+"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"URIPROTO"=>"[A-Za-z]([A-Za-z0-9+\\-.]+)+"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"URIHOST"=>"%{IPORHOST}(?::%{POSINT:port})?"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"URIPATH"=>"(?:/[A-Za-z0-9$.+!*'(){},~:;=@#%&_\\-]*)+"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"URIPARAM"=>"\\?[A-Za-z0-9$.+!*'|(){},~@#%&/=:;_?\\-\\[\\]<>]*"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"URIPATHPARAM"=>"%{URIPATH}(?:%{URIPARAM})?"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"URI"=>"%{URIPROTO}://(?:%{USER}(?::[^@]*)?@)?(?:%{URIHOST})?(?:%{URIPATHPARAM})?"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"MONTH"=>"\\b(?:[Jj]an(?:uary|uar)?|[Ff]eb(?:ruary|ruar)?|[Mm](?:a|ä)?r(?:ch|z)?|[Aa]pr(?:il)?|[Mm]a(?:y|i)?|[Jj]un(?:e|i)?|[Jj]ul(?:y)?|[Aa]ug(?:ust)?|[Ss]ep(?:tember)?|[Oo](?:c|k)?t(?:ober)?|[Nn]ov(?:ember)?|[Dd]e(?:c|z)(?:ember)?)\\b"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHNUM"=>"(?:0?[1-9]|1[0-2])"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHNUM2"=>"(?:0[1-9]|1[0-2])"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHDAY"=>"(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])"}
[2018-04-24T17:17:38,894][DEBUG][logstash.filters.grok ] Adding pattern {"DAY"=>"(?:Mon(?:day)?|Tue(?:sday)?|Wed(?:nesday)?|Thu(?:rsday)?|Fri(?:day)?|Sat(?:urday)?|Sun(?:day)?)"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"YEAR"=>"(?>\\d\\d){1,2}"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"HOUR"=>"(?:2[0123]|[01]?[0-9])"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"MINUTE"=>"(?:[0-5][0-9])"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"SECOND"=>"(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"TIME"=>"(?!<[0-9])%{HOUR}:%{MINUTE}(?::%{SECOND})(?![0-9])"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"DATE_US"=>"%{MONTHNUM}[/-]%{MONTHDAY}[/-]%{YEAR}"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"DATE_EU"=>"%{MONTHDAY}[./-]%{MONTHNUM}[./-]%{YEAR}"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"ISO8601_TIMEZONE"=>"(?:Z|[+-]%{HOUR}(?::?%{MINUTE}))"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"ISO8601_SECOND"=>"(?:%{SECOND}|60)"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"TIMESTAMP_ISO8601"=>"%{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"DATE"=>"%{DATE_US}|%{DATE_EU}"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP"=>"%{DATE}[- ]%{TIME}"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"TZ"=>"(?:[APMCE][SD]T|UTC)"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_RFC822"=>"%{DAY} %{MONTH} %{MONTHDAY} %{YEAR} %{TIME} %{TZ}"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_RFC2822"=>"%{DAY}, %{MONTHDAY} %{MONTH} %{YEAR} %{TIME} %{ISO8601_TIMEZONE}"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_OTHER"=>"%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{TZ} %{YEAR}"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_EVENTLOG"=>"%{YEAR}%{MONTHNUM2}%{MONTHDAY}%{HOUR}%{MINUTE}%{SECOND}"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGTIMESTAMP"=>"%{MONTH} +%{MONTHDAY} %{TIME}"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"PROG"=>"[\\x21-\\x5a\\x5c\\x5e-\\x7e]+"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGPROG"=>"%{PROG:program}(?:\\[%{POSINT:pid}\\])?"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGHOST"=>"%{IPORHOST}"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGFACILITY"=>"<%{NONNEGINT:facility}.%{NONNEGINT:priority}>"}
[2018-04-24T17:17:38,895][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDATE"=>"%{MONTHDAY}/%{MONTH}/%{YEAR}:%{TIME} %{INT}"}
[2018-04-24T17:17:38,896][DEBUG][logstash.filters.grok ] Adding pattern {"QS"=>"%{QUOTEDSTRING}"}
[2018-04-24T17:17:38,896][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGBASE"=>"%{SYSLOGTIMESTAMP:timestamp} (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource} %{SYSLOGPROG}:"}
[2018-04-24T17:17:38,896][DEBUG][logstash.filters.grok ] Adding pattern {"LOGLEVEL"=>"([Aa]lert|ALERT|[Tt]race|TRACE|[Dd]ebug|DEBUG|[Nn]otice|NOTICE|[Ii]nfo|INFO|[Ww]arn?(?:ing)?|WARN?(?:ING)?|[Ee]rr?(?:or)?|ERR?(?:OR)?|[Cc]rit?(?:ical)?|CRIT?(?:ICAL)?|[Ff]atal|FATAL|[Ss]evere|SEVERE|EMERG(?:ENCY)?|[Ee]merg(?:ency)?)"}
[2018-04-24T17:17:38,897][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYTIME"=>"(?!<[0-9])%{HOUR:haproxy_hour}:%{MINUTE:haproxy_minute}(?::%{SECOND:haproxy_second})(?![0-9])"}
[2018-04-24T17:17:38,897][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYDATE"=>"%{MONTHDAY:haproxy_monthday}/%{MONTH:haproxy_month}/%{YEAR:haproxy_year}:%{HAPROXYTIME:haproxy_time}.%{INT:haproxy_milliseconds}"}
[2018-04-24T17:17:38,897][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYCAPTUREDREQUESTHEADERS"=>"%{DATA:captured_request_headers}"}
[2018-04-24T17:17:38,897][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYCAPTUREDRESPONSEHEADERS"=>"%{DATA:captured_response_headers}"}
[2018-04-24T17:17:38,897][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYHTTPBASE"=>"%{IP:client_ip}:%{INT:client_port} \\[%{HAPROXYDATE:accept_date}\\] %{NOTSPACE:frontend_name} %{NOTSPACE:backend_name}/%{NOTSPACE:server_name} %{INT:time_request}/%{INT:time_queue}/%{INT:time_backend_connect}/%{INT:time_backend_response}/%{NOTSPACE:time_duration} %{INT:http_status_code} %{NOTSPACE:bytes_read} %{DATA:captured_request_cookie} %{DATA:captured_response_cookie} %{NOTSPACE:termination_state} %{INT:actconn}/%{INT:feconn}/%{INT:beconn}/%{INT:srvconn}/%{NOTSPACE:retries} %{INT:srv_queue}/%{INT:backend_queue} (\\{%{HAPROXYCAPTUREDREQUESTHEADERS}\\})?( )?(\\{%{HAPROXYCAPTUREDRESPONSEHEADERS}\\})?( )?\"(<BADREQ>|(%{WORD:http_verb} (%{URIPROTO:http_proto}://)?(?:%{USER:http_user}(?::[^@]*)?@)?(?:%{URIHOST:http_host})?(?:%{URIPATHPARAM:http_request})?( HTTP/%{NUMBER:http_version})?))?\""}
[2018-04-24T17:17:38,897][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYHTTP"=>"(?:%{SYSLOGTIMESTAMP:syslog_timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) %{IPORHOST:syslog_server} %{SYSLOGPROG}: %{HAPROXYHTTPBASE}"}
[2018-04-24T17:17:38,897][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYTCP"=>"(?:%{SYSLOGTIMESTAMP:syslog_timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) %{IPORHOST:syslog_server} %{SYSLOGPROG}: %{IP:client_ip}:%{INT:client_port} \\[%{HAPROXYDATE:accept_date}\\] %{NOTSPACE:frontend_name} %{NOTSPACE:backend_name}/%{NOTSPACE:server_name} %{INT:time_queue}/%{INT:time_backend_connect}/%{NOTSPACE:time_duration} %{NOTSPACE:bytes_read} %{NOTSPACE:termination_state} %{INT:actconn}/%{INT:feconn}/%{INT:beconn}/%{INT:srvconn}/%{NOTSPACE:retries} %{INT:srv_queue}/%{INT:backend_queue}"}
[2018-04-24T17:17:38,898][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDUSER"=>"%{EMAILADDRESS}|%{USER}"}
[2018-04-24T17:17:38,898][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDERROR_DATE"=>"%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{YEAR}"}
[2018-04-24T17:17:38,898][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_COMMONLOG"=>"%{IPORHOST:clientip} %{HTTPDUSER:ident} %{HTTPDUSER:auth} \\[%{HTTPDATE:timestamp}\\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{NUMBER:response} (?:%{NUMBER:bytes}|-)"}
[2018-04-24T17:17:38,898][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_COMBINEDLOG"=>"%{HTTPD_COMMONLOG} %{QS:referrer} %{QS:agent}"}
[2018-04-24T17:17:38,898][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD20_ERRORLOG"=>"\\[%{HTTPDERROR_DATE:timestamp}\\] \\[%{LOGLEVEL:loglevel}\\] (?:\\[client %{IPORHOST:clientip}\\] ){0,1}%{GREEDYDATA:message}"}
[2018-04-24T17:17:38,898][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD24_ERRORLOG"=>"\\[%{HTTPDERROR_DATE:timestamp}\\] \\[%{WORD:module}:%{LOGLEVEL:loglevel}\\] \\[pid %{POSINT:pid}(:tid %{NUMBER:tid})?\\]( \\(%{POSINT:proxy_errorcode}\\)%{DATA:proxy_message}:)?( \\[client %{IPORHOST:clientip}:%{POSINT:clientport}\\])?( %{DATA:errorcode}:)? %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,898][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_ERRORLOG"=>"%{HTTPD20_ERRORLOG}|%{HTTPD24_ERRORLOG}"}
[2018-04-24T17:17:38,898][DEBUG][logstash.filters.grok ] Adding pattern {"COMMONAPACHELOG"=>"%{HTTPD_COMMONLOG}"}
[2018-04-24T17:17:38,898][DEBUG][logstash.filters.grok ] Adding pattern {"COMBINEDAPACHELOG"=>"%{HTTPD_COMBINEDLOG}"}
[2018-04-24T17:17:38,898][DEBUG][logstash.filters.grok ] Adding pattern {"JAVACLASS"=>"(?:[a-zA-Z$_][a-zA-Z$_0-9]*\\.)*[a-zA-Z$_][a-zA-Z$_0-9]*"}
[2018-04-24T17:17:38,898][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAFILE"=>"(?:[A-Za-z0-9_. -]+)"}
[2018-04-24T17:17:38,898][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAMETHOD"=>"(?:(<(?:cl)?init>)|[a-zA-Z$_][a-zA-Z$_0-9]*)"}
[2018-04-24T17:17:38,899][DEBUG][logstash.filters.grok ] Adding pattern {"JAVASTACKTRACEPART"=>"%{SPACE}at %{JAVACLASS:class}\\.%{JAVAMETHOD:method}\\(%{JAVAFILE:file}(?::%{NUMBER:line})?\\)"}
[2018-04-24T17:17:38,899][DEBUG][logstash.filters.grok ] Adding pattern {"JAVATHREAD"=>"(?:[A-Z]{2}-Processor[\\d]+)"}
[2018-04-24T17:17:38,899][DEBUG][logstash.filters.grok ] Adding pattern {"JAVACLASS"=>"(?:[a-zA-Z0-9-]+\\.)+[A-Za-z0-9$]+"}
[2018-04-24T17:17:38,899][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAFILE"=>"(?:[A-Za-z0-9_.-]+)"}
[2018-04-24T17:17:38,899][DEBUG][logstash.filters.grok ] Adding pattern {"JAVALOGMESSAGE"=>"(.*)"}
[2018-04-24T17:17:38,899][DEBUG][logstash.filters.grok ] Adding pattern {"CATALINA_DATESTAMP"=>"%{MONTH} %{MONTHDAY}, 20%{YEAR} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) (?:AM|PM)"}
[2018-04-24T17:17:38,899][DEBUG][logstash.filters.grok ] Adding pattern {"TOMCAT_DATESTAMP"=>"20%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) %{ISO8601_TIMEZONE}"}
[2018-04-24T17:17:38,899][DEBUG][logstash.filters.grok ] Adding pattern {"CATALINALOG"=>"%{CATALINA_DATESTAMP:timestamp} %{JAVACLASS:class} %{JAVALOGMESSAGE:logmessage}"}
[2018-04-24T17:17:38,899][DEBUG][logstash.filters.grok ] Adding pattern {"TOMCATLOG"=>"%{TOMCAT_DATESTAMP:timestamp} \\| %{LOGLEVEL:level} \\| %{JAVACLASS:class} - %{JAVALOGMESSAGE:logmessage}"}
[2018-04-24T17:17:38,899][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW_EVENT"=>"(RT_FLOW_SESSION_CREATE|RT_FLOW_SESSION_CLOSE|RT_FLOW_SESSION_DENY)"}
[2018-04-24T17:17:38,899][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW1"=>"%{RT_FLOW_EVENT:event}: %{GREEDYDATA:close-reason}: %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{IP:nat-src-ip}/%{INT:nat-src-port}->%{IP:nat-dst-ip}/%{INT:nat-dst-port} %{DATA:src-nat-rule-name} %{DATA:dst-nat-rule-name} %{INT:protocol-id} %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} %{INT:session-id} \\d+\\(%{DATA:sent}\\) \\d+\\(%{DATA:received}\\) %{INT:elapsed-time} .*"}
[2018-04-24T17:17:38,899][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW2"=>"%{RT_FLOW_EVENT:event}: session created %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{IP:nat-src-ip}/%{INT:nat-src-port}->%{IP:nat-dst-ip}/%{INT:nat-dst-port} %{DATA:src-nat-rule-name} %{DATA:dst-nat-rule-name} %{INT:protocol-id} %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} %{INT:session-id} .*"}
[2018-04-24T17:17:38,899][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW3"=>"%{RT_FLOW_EVENT:event}: session denied %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{INT:protocol-id}\\(\\d\\) %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} .*"}
[2018-04-24T17:17:38,900][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424PRINTASCII"=>"[!-~]+"}
[2018-04-24T17:17:38,900][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGBASE2"=>"(?:%{SYSLOGTIMESTAMP:timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource}+(?: %{SYSLOGPROG}:|)"}
[2018-04-24T17:17:38,900][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGPAMSESSION"=>"%{SYSLOGBASE} (?=%{GREEDYDATA:message})%{WORD:pam_module}\\(%{DATA:pam_caller}\\): session %{WORD:pam_session_state} for user %{USERNAME:username}(?: by %{GREEDYDATA:pam_by})?"}
[2018-04-24T17:17:38,900][DEBUG][logstash.filters.grok ] Adding pattern {"CRON_ACTION"=>"[A-Z ]+"}
[2018-04-24T17:17:38,900][DEBUG][logstash.filters.grok ] Adding pattern {"CRONLOG"=>"%{SYSLOGBASE} \\(%{USER:user}\\) %{CRON_ACTION:action} \\(%{DATA:message}\\)"}
[2018-04-24T17:17:38,900][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGLINE"=>"%{SYSLOGBASE2} %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,900][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424PRI"=>"<%{NONNEGINT:syslog5424_pri}>"}
[2018-04-24T17:17:38,900][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424SD"=>"\\[%{DATA}\\]+"}
[2018-04-24T17:17:38,900][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424BASE"=>"%{SYSLOG5424PRI}%{NONNEGINT:syslog5424_ver} +(?:%{TIMESTAMP_ISO8601:syslog5424_ts}|-) +(?:%{IPORHOST:syslog5424_host}|-) +(-|%{SYSLOG5424PRINTASCII:syslog5424_app}) +(-|%{SYSLOG5424PRINTASCII:syslog5424_proc}) +(-|%{SYSLOG5424PRINTASCII:syslog5424_msgid}) +(?:%{SYSLOG5424SD:syslog5424_sd}|-|)"}
[2018-04-24T17:17:38,900][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424LINE"=>"%{SYSLOG5424BASE} +%{GREEDYDATA:syslog5424_msg}"}
[2018-04-24T17:17:38,900][DEBUG][logstash.filters.grok ] Adding pattern {"MAVEN_VERSION"=>"(?:(\\d+)\\.)?(?:(\\d+)\\.)?(\\*|\\d+)(?:[.-](RELEASE|SNAPSHOT))?"}
[2018-04-24T17:17:38,900][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVEAUDIT"=>"%{TIMESTAMP_ISO8601:timestamp}:"}
[2018-04-24T17:17:38,901][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVE"=>"., \\[%{TIMESTAMP_ISO8601:timestamp} #%{POSINT:pid}\\]%{SPACE}%{LOGLEVEL:event_level}"}
[2018-04-24T17:17:38,901][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVEAUDIT"=>"%{TIMESTAMP_ISO8601:timestamp}:"}
[2018-04-24T17:17:38,901][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_LOG"=>"%{SYSLOGTIMESTAMP:timestamp} \\[%{WORD:component}\\] %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,901][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_QUERY"=>"\\{ (?<={ ).*(?= } ntoreturn:) \\}"}
[2018-04-24T17:17:38,901][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_SLOWQUERY"=>"%{WORD} %{MONGO_WORDDASH:database}\\.%{MONGO_WORDDASH:collection} %{WORD}: %{MONGO_QUERY:query} %{WORD}:%{NONNEGINT:ntoreturn} %{WORD}:%{NONNEGINT:ntoskip} %{WORD}:%{NONNEGINT:nscanned}.*nreturned:%{NONNEGINT:nreturned}..+ (?<duration>[0-9]+)ms"}
[2018-04-24T17:17:38,901][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_WORDDASH"=>"\\b[\\w-]+\\b"}
[2018-04-24T17:17:38,901][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_SEVERITY"=>"\\w"}
[2018-04-24T17:17:38,901][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_COMPONENT"=>"%{WORD}|-"}
[2018-04-24T17:17:38,901][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_LOG"=>"%{TIMESTAMP_ISO8601:timestamp} %{MONGO3_SEVERITY:severity} %{MONGO3_COMPONENT:component}%{SPACE}(?:\\[%{DATA:context}\\])? %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOSTIME"=>"\\[%{NUMBER:nagios_epoch}\\]"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_CURRENT_SERVICE_STATE"=>"CURRENT SERVICE STATE"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_CURRENT_HOST_STATE"=>"CURRENT HOST STATE"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_NOTIFICATION"=>"SERVICE NOTIFICATION"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_NOTIFICATION"=>"HOST NOTIFICATION"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_ALERT"=>"SERVICE ALERT"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_ALERT"=>"HOST ALERT"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_FLAPPING_ALERT"=>"SERVICE FLAPPING ALERT"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_FLAPPING_ALERT"=>"HOST FLAPPING ALERT"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_DOWNTIME_ALERT"=>"SERVICE DOWNTIME ALERT"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_DOWNTIME_ALERT"=>"HOST DOWNTIME ALERT"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_PASSIVE_SERVICE_CHECK"=>"PASSIVE SERVICE CHECK"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_PASSIVE_HOST_CHECK"=>"PASSIVE HOST CHECK"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_EVENT_HANDLER"=>"SERVICE EVENT HANDLER"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_EVENT_HANDLER"=>"HOST EVENT HANDLER"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_EXTERNAL_COMMAND"=>"EXTERNAL COMMAND"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_TIMEPERIOD_TRANSITION"=>"TIMEPERIOD TRANSITION"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_SVC_CHECK"=>"DISABLE_SVC_CHECK"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_SVC_CHECK"=>"ENABLE_SVC_CHECK"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_CHECK"=>"DISABLE_HOST_CHECK"}
[2018-04-24T17:17:38,902][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_CHECK"=>"ENABLE_HOST_CHECK"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_PROCESS_SERVICE_CHECK_RESULT"=>"PROCESS_SERVICE_CHECK_RESULT"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_PROCESS_HOST_CHECK_RESULT"=>"PROCESS_HOST_CHECK_RESULT"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_SCHEDULE_SERVICE_DOWNTIME"=>"SCHEDULE_SERVICE_DOWNTIME"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_SCHEDULE_HOST_DOWNTIME"=>"SCHEDULE_HOST_DOWNTIME"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_SVC_NOTIFICATIONS"=>"DISABLE_HOST_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_SVC_NOTIFICATIONS"=>"ENABLE_HOST_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_NOTIFICATIONS"=>"DISABLE_HOST_NOTIFICATIONS"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_NOTIFICATIONS"=>"ENABLE_HOST_NOTIFICATIONS"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_SVC_NOTIFICATIONS"=>"DISABLE_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_SVC_NOTIFICATIONS"=>"ENABLE_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_WARNING"=>"Warning:%{SPACE}%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_CURRENT_SERVICE_STATE"=>"%{NAGIOS_TYPE_CURRENT_SERVICE_STATE:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statetype};%{DATA:nagios_statecode};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_CURRENT_HOST_STATE"=>"%{NAGIOS_TYPE_CURRENT_HOST_STATE:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statetype};%{DATA:nagios_statecode};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_NOTIFICATION"=>"%{NAGIOS_TYPE_SERVICE_NOTIFICATION:nagios_type}: %{DATA:nagios_notifyname};%{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_contact};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_NOTIFICATION"=>"%{NAGIOS_TYPE_HOST_NOTIFICATION:nagios_type}: %{DATA:nagios_notifyname};%{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_contact};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_ALERT"=>"%{NAGIOS_TYPE_SERVICE_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{NUMBER:nagios_attempt};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_ALERT"=>"%{NAGIOS_TYPE_HOST_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{NUMBER:nagios_attempt};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,903][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_FLAPPING_ALERT"=>"%{NAGIOS_TYPE_SERVICE_FLAPPING_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,904][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_FLAPPING_ALERT"=>"%{NAGIOS_TYPE_HOST_FLAPPING_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,904][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_DOWNTIME_ALERT"=>"%{NAGIOS_TYPE_SERVICE_DOWNTIME_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,904][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_DOWNTIME_ALERT"=>"%{NAGIOS_TYPE_HOST_DOWNTIME_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,904][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_PASSIVE_SERVICE_CHECK"=>"%{NAGIOS_TYPE_PASSIVE_SERVICE_CHECK:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,904][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_PASSIVE_HOST_CHECK"=>"%{NAGIOS_TYPE_PASSIVE_HOST_CHECK:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,904][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_EVENT_HANDLER"=>"%{NAGIOS_TYPE_SERVICE_EVENT_HANDLER:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{DATA:nagios_event_handler_name}"}
[2018-04-24T17:17:38,904][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_EVENT_HANDLER"=>"%{NAGIOS_TYPE_HOST_EVENT_HANDLER:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{DATA:nagios_event_handler_name}"}
[2018-04-24T17:17:38,904][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TIMEPERIOD_TRANSITION"=>"%{NAGIOS_TYPE_TIMEPERIOD_TRANSITION:nagios_type}: %{DATA:nagios_service};%{DATA:nagios_unknown1};%{DATA:nagios_unknown2}"}
[2018-04-24T17:17:38,904][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_SVC_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_SVC_CHECK:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service}"}
[2018-04-24T17:17:38,904][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_CHECK:nagios_command};%{DATA:nagios_hostname}"}
[2018-04-24T17:17:38,904][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_SVC_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_SVC_CHECK:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service}"}
[2018-04-24T17:17:38,904][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_CHECK:nagios_command};%{DATA:nagios_hostname}"}
[2018-04-24T17:17:38,904][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_PROCESS_SERVICE_CHECK_RESULT"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_PROCESS_SERVICE_CHECK_RESULT:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_check_result}"}
[2018-04-24T17:17:38,904][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_PROCESS_HOST_CHECK_RESULT"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_PROCESS_HOST_CHECK_RESULT:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_check_result}"}
[2018-04-24T17:17:38,905][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_SVC_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,905][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,905][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_SVC_NOTIFICATIONS:nagios_command};%{DATA:nagios_hostname};%{GREEDYDATA:nagios_service}"}
[2018-04-24T17:17:38,905][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_SVC_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,905][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,905][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_SVC_NOTIFICATIONS:nagios_command};%{DATA:nagios_hostname};%{GREEDYDATA:nagios_service}"}
[2018-04-24T17:17:38,905][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_SCHEDULE_HOST_DOWNTIME"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_SCHEDULE_HOST_DOWNTIME:nagios_command};%{DATA:nagios_hostname};%{NUMBER:nagios_start_time};%{NUMBER:nagios_end_time};%{NUMBER:nagios_fixed};%{NUMBER:nagios_trigger_id};%{NUMBER:nagios_duration};%{DATA:author};%{DATA:comment}"}
[2018-04-24T17:17:38,905][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOSLOGLINE"=>"%{NAGIOSTIME} (?:%{NAGIOS_WARNING}|%{NAGIOS_CURRENT_SERVICE_STATE}|%{NAGIOS_CURRENT_HOST_STATE}|%{NAGIOS_SERVICE_NOTIFICATION}|%{NAGIOS_HOST_NOTIFICATION}|%{NAGIOS_SERVICE_ALERT}|%{NAGIOS_HOST_ALERT}|%{NAGIOS_SERVICE_FLAPPING_ALERT}|%{NAGIOS_HOST_FLAPPING_ALERT}|%{NAGIOS_SERVICE_DOWNTIME_ALERT}|%{NAGIOS_HOST_DOWNTIME_ALERT}|%{NAGIOS_PASSIVE_SERVICE_CHECK}|%{NAGIOS_PASSIVE_HOST_CHECK}|%{NAGIOS_SERVICE_EVENT_HANDLER}|%{NAGIOS_HOST_EVENT_HANDLER}|%{NAGIOS_TIMEPERIOD_TRANSITION}|%{NAGIOS_EC_LINE_DISABLE_SVC_CHECK}|%{NAGIOS_EC_LINE_ENABLE_SVC_CHECK}|%{NAGIOS_EC_LINE_DISABLE_HOST_CHECK}|%{NAGIOS_EC_LINE_ENABLE_HOST_CHECK}|%{NAGIOS_EC_LINE_PROCESS_HOST_CHECK_RESULT}|%{NAGIOS_EC_LINE_PROCESS_SERVICE_CHECK_RESULT}|%{NAGIOS_EC_LINE_SCHEDULE_HOST_DOWNTIME}|%{NAGIOS_EC_LINE_DISABLE_HOST_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_HOST_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_DISABLE_HOST_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_HOST_NOTIFICATIONS}|%{NAGIOS_EC_LINE_DISABLE_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_SVC_NOTIFICATIONS})"}
[2018-04-24T17:17:38,905][DEBUG][logstash.filters.grok ] Adding pattern {"POSTGRESQL"=>"%{DATESTAMP:timestamp} %{TZ} %{DATA:user_id} %{GREEDYDATA:connection_id} %{POSINT:pid}"}
[2018-04-24T17:17:38,906][DEBUG][logstash.filters.grok ] Adding pattern {"RUUID"=>"\\h{32}"}
[2018-04-24T17:17:38,906][DEBUG][logstash.filters.grok ] Adding pattern {"RCONTROLLER"=>"(?<controller>[^#]+)#(?<action>\\w+)"}
[2018-04-24T17:17:38,906][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3HEAD"=>"(?m)Started %{WORD:verb} \"%{URIPATHPARAM:request}\" for %{IPORHOST:clientip} at (?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND} %{ISO8601_TIMEZONE})"}
[2018-04-24T17:17:38,906][DEBUG][logstash.filters.grok ] Adding pattern {"RPROCESSING"=>"\\W*Processing by %{RCONTROLLER} as (?<format>\\S+)(?:\\W*Parameters: {%{DATA:params}}\\W*)?"}
[2018-04-24T17:17:38,906][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3FOOT"=>"Completed %{NUMBER:response}%{DATA} in %{NUMBER:totalms}ms %{RAILS3PROFILE}%{GREEDYDATA}"}
[2018-04-24T17:17:38,906][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3PROFILE"=>"(?:\\(Views: %{NUMBER:viewms}ms \\| ActiveRecord: %{NUMBER:activerecordms}ms|\\(ActiveRecord: %{NUMBER:activerecordms}ms)?"}
[2018-04-24T17:17:38,906][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3"=>"%{RAILS3HEAD}(?:%{RPROCESSING})?(?<context>(?:%{DATA}\\n)*)(?:%{RAILS3FOOT})?"}
[2018-04-24T17:17:38,906][DEBUG][logstash.filters.grok ] Adding pattern {"REDISTIMESTAMP"=>"%{MONTHDAY} %{MONTH} %{TIME}"}
[2018-04-24T17:17:38,906][DEBUG][logstash.filters.grok ] Adding pattern {"REDISLOG"=>"\\[%{POSINT:pid}\\] %{REDISTIMESTAMP:timestamp} \\* "}
[2018-04-24T17:17:38,906][DEBUG][logstash.filters.grok ] Adding pattern {"REDISMONLOG"=>"%{NUMBER:timestamp} \\[%{INT:database} %{IP:client}:%{NUMBER:port}\\] \"%{WORD:command}\"\\s?%{GREEDYDATA:params}"}
[2018-04-24T17:17:38,906][DEBUG][logstash.filters.grok ] Adding pattern {"RUBY_LOGLEVEL"=>"(?:DEBUG|FATAL|ERROR|WARN|INFO)"}
[2018-04-24T17:17:38,906][DEBUG][logstash.filters.grok ] Adding pattern {"RUBY_LOGGER"=>"[DFEWI], \\[%{TIMESTAMP_ISO8601:timestamp} #%{POSINT:pid}\\] *%{RUBY_LOGLEVEL:loglevel} -- +%{DATA:progname}: %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,907][DEBUG][logstash.filters.grok ] Adding pattern {"SQUID3"=>"%{NUMBER:timestamp}\\s+%{NUMBER:duration}\\s%{IP:client_address}\\s%{WORD:cache_result}/%{POSINT:status_code}\\s%{NUMBER:bytes}\\s%{WORD:request_method}\\s%{NOTSPACE:url}\\s(%{NOTSPACE:user}|-)\\s%{WORD:hierarchy_code}/%{IPORHOST:server}\\s%{NOTSPACE:content_type}"}
[2018-04-24T17:17:38,907][DEBUG][logstash.filters.grok ] Adding pattern {"PAYLOAD"=>"[\\s\\S]*"}
[2018-04-24T17:17:38,907][DEBUG][logstash.filters.grok ] Adding pattern {"SPACE"=>"[ ]{1,}"}
[2018-04-24T17:17:38,907][DEBUG][logstash.filters.grok ] Adding pattern {"P_TIMESTAMP"=>"%{MONTH}\\s%{MONTHDAY},\\s%{YEAR}\\s%{TIME}\\s(AM|PM)"}
[2018-04-24T17:17:38,907][DEBUG][logstash.filters.grok ] Adding pattern {"LOGGINGSERVICEPREFIX"=>"[-]{12,18} Event Log Start Here [-]{12,18}\\\\n"}
[2018-04-24T17:17:38,907][DEBUG][logstash.filters.grok ] Adding pattern {"LOGGINGSERVICESUFFIX"=>"\\\\n[-]{12,18} Event Log End Here [-]{12,18}"}
[2018-04-24T17:17:38,907][DEBUG][logstash.filters.grok ] Adding pattern {"XLMLOGGING"=>"[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}:[0-9]{3,7}"}
[2018-04-24T17:17:38,907][DEBUG][logstash.filters.grok ] Adding pattern {"DATESWITHDOTS"=>"[0-9]{4}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{3,7}"}
[2018-04-24T17:17:38,907][DEBUG][logstash.filters.grok ] Adding pattern {"DATESWITHUNDERLINE"=>"[0-9]{4}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,7}"}
[2018-04-24T17:17:38,907][DEBUG][logstash.filters.grok ] replacement_pattern => (?<GREEDYDATA:PrefixMessage>.*)
[2018-04-24T17:17:38,908][DEBUG][logstash.filters.grok ] replacement_pattern => (?<GREEDYDATA:PrefixMessageTwo>.*)
[2018-04-24T17:17:38,908][DEBUG][logstash.filters.grok ] replacement_pattern => (?<DATESWITHDOTS:logtimetwo>[0-9]{4}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{3,7})
[2018-04-24T17:17:38,908][DEBUG][logstash.filters.grok ] replacement_pattern => (?<GREEDYDATA:SuffixMessage>.*)
[2018-04-24T17:17:38,908][DEBUG][logstash.filters.grok ] Grok compiled OK {:pattern=>"%{GREEDYDATA:PrefixMessage}/%{GREEDYDATA:PrefixMessageTwo}/%{DATESWITHDOTS:logtimetwo}%{GREEDYDATA:SuffixMessage}", :expanded_pattern=>"(?<GREEDYDATA:PrefixMessage>.*)/(?<GREEDYDATA:PrefixMessageTwo>.*)/(?<DATESWITHDOTS:logtimetwo>[0-9]{4}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{3,7})(?<GREEDYDATA:SuffixMessage>.*)"}
[2018-04-24T17:17:38,909][DEBUG][logstash.filters.grok ] Grok patterns path {:paths=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-patterns-core-4.1.2/patterns", "/usr/share/logstash/patterns/*"]}
[2018-04-24T17:17:38,911][DEBUG][logstash.filters.grok ] Grok patterns path {:paths=>["/etc/logstash/conf.d/patterns"]}
[2018-04-24T17:17:38,911][DEBUG][logstash.filters.grok ] Match data {:match=>{"message"=>"%{GREEDYDATA:Message}Timestamp : %{TIMESTAMP_ISO8601:logtime}"}}
[2018-04-24T17:17:38,911][DEBUG][logstash.filters.grok ] regexp: /message {:pattern=>"%{GREEDYDATA:Message}Timestamp : %{TIMESTAMP_ISO8601:logtime}"}
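The replacement_pattern lines, the "Grok compiled OK" line, and the "Match data" / "regexp" lines above are the only places in this log where the pipeline's own grok configuration is visible. A minimal sketch of the filter block they imply follows; the two pattern strings and the patterns_dir value are copied verbatim from the debug lines above, while the stanza ordering and everything else about the pipeline (inputs, outputs, other options, and the commented-out mutate from the gist title) are assumptions this log does not confirm.

filter {
  grok {
    # Assumed stanza: pattern string copied from the "Grok compiled OK" line above.
    # DATESWITHDOTS is one of the custom patterns registered earlier in this log.
    match => { "message" => "%{GREEDYDATA:PrefixMessage}/%{GREEDYDATA:PrefixMessageTwo}/%{DATESWITHDOTS:logtimetwo}%{GREEDYDATA:SuffixMessage}" }
  }
  grok {
    # Assumed stanza: pattern string copied from the "Match data" line above.
    # patterns_dir copied from the second "Grok patterns path" line; whether it is set
    # on this stanza or the one above is not something this log confirms.
    patterns_dir => ["/etc/logstash/conf.d/patterns"]
    match => { "message" => "%{GREEDYDATA:Message}Timestamp : %{TIMESTAMP_ISO8601:logtime}" }
  }
}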
[2018-04-24T17:17:38,912][DEBUG][logstash.filters.grok ] Adding pattern {"S3_REQUEST_LINE"=>"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})"}
[2018-04-24T17:17:38,912][DEBUG][logstash.filters.grok ] Adding pattern {"S3_ACCESS_LOG"=>"%{WORD:owner} %{NOTSPACE:bucket} \\[%{HTTPDATE:timestamp}\\] %{IP:clientip} %{NOTSPACE:requester} %{NOTSPACE:request_id} %{NOTSPACE:operation} %{NOTSPACE:key} (?:\"%{S3_REQUEST_LINE}\"|-) (?:%{INT:response:int}|-) (?:-|%{NOTSPACE:error_code}) (?:%{INT:bytes:int}|-) (?:%{INT:object_size:int}|-) (?:%{INT:request_time_ms:int}|-) (?:%{INT:turnaround_time_ms:int}|-) (?:%{QS:referrer}|-) (?:\"?%{QS:agent}\"?|-) (?:-|%{NOTSPACE:version_id})"}
[2018-04-24T17:17:38,912][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_URIPATHPARAM"=>"%{URIPATH:path}(?:%{URIPARAM:params})?"}
[2018-04-24T17:17:38,912][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_URI"=>"%{URIPROTO:proto}://(?:%{USER}(?::[^@]*)?@)?(?:%{URIHOST:urihost})?(?:%{ELB_URIPATHPARAM})?"}
[2018-04-24T17:17:38,912][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_REQUEST_LINE"=>"(?:%{WORD:verb} %{ELB_URI:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})"}
[2018-04-24T17:17:38,912][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_ACCESS_LOG"=>"%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:elb} %{IP:clientip}:%{INT:clientport:int} (?:(%{IP:backendip}:?:%{INT:backendport:int})|-) %{NUMBER:request_processing_time:float} %{NUMBER:backend_processing_time:float} %{NUMBER:response_processing_time:float} %{INT:response:int} %{INT:backend_response:int} %{INT:received_bytes:int} %{INT:bytes:int} \"%{ELB_REQUEST_LINE}\""}
[2018-04-24T17:17:38,912][DEBUG][logstash.filters.grok ] Adding pattern {"CLOUDFRONT_ACCESS_LOG"=>"(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}\\t%{TIME})\\t%{WORD:x_edge_location}\\t(?:%{NUMBER:sc_bytes:int}|-)\\t%{IPORHOST:clientip}\\t%{WORD:cs_method}\\t%{HOSTNAME:cs_host}\\t%{NOTSPACE:cs_uri_stem}\\t%{NUMBER:sc_status:int}\\t%{GREEDYDATA:referrer}\\t%{GREEDYDATA:agent}\\t%{GREEDYDATA:cs_uri_query}\\t%{GREEDYDATA:cookies}\\t%{WORD:x_edge_result_type}\\t%{NOTSPACE:x_edge_request_id}\\t%{HOSTNAME:x_host_header}\\t%{URIPROTO:cs_protocol}\\t%{INT:cs_bytes:int}\\t%{GREEDYDATA:time_taken:float}\\t%{GREEDYDATA:x_forwarded_for}\\t%{GREEDYDATA:ssl_protocol}\\t%{GREEDYDATA:ssl_cipher}\\t%{GREEDYDATA:x_edge_response_result_type}"}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_TIMESTAMP"=>"%{MONTHDAY}-%{MONTH} %{HOUR}:%{MINUTE}"}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_HOST"=>"[a-zA-Z0-9-]+"}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_VOLUME"=>"%{USER}"}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_DEVICE"=>"%{USER}"}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_DEVICEPATH"=>"%{UNIXPATH}"}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_CAPACITY"=>"%{INT}{1,3}(,%{INT}{3})*"}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_VERSION"=>"%{USER}"}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_JOB"=>"%{USER}"}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MAX_CAPACITY"=>"User defined maximum volume capacity %{BACULA_CAPACITY} exceeded on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\)"}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_END_VOLUME"=>"End of medium on Volume \\\"%{BACULA_VOLUME:volume}\\\" Bytes=%{BACULA_CAPACITY} Blocks=%{BACULA_CAPACITY} at %{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}:%{MINUTE}."}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_VOLUME"=>"Created new Volume \\\"%{BACULA_VOLUME:volume}\\\" in catalog."}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_LABEL"=>"Labeled new Volume \\\"%{BACULA_VOLUME:volume}\\\" on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\)."}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_WROTE_LABEL"=>"Wrote label to prelabeled Volume \\\"%{BACULA_VOLUME:volume}\\\" on device \\\"%{BACULA_DEVICE}\\\" \\(%{BACULA_DEVICEPATH}\\)"}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_MOUNT"=>"New volume \\\"%{BACULA_VOLUME:volume}\\\" mounted on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\) at %{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}:%{MINUTE}."}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOOPEN"=>"\\s+Cannot open %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOOPENDIR"=>"\\s+Could not open directory %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOSTAT"=>"\\s+Could not stat %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOJOBS"=>"There are no more Jobs associated with Volume \\\"%{BACULA_VOLUME:volume}\\\". Marking it purged."}
[2018-04-24T17:17:38,913][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_ALL_RECORDS_PRUNED"=>"All records pruned from Volume \\\"%{BACULA_VOLUME:volume}\\\"; marking it \\\"Purged\\\""}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_BEGIN_PRUNE_JOBS"=>"Begin pruning Jobs older than %{INT} month %{INT} days ."}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_BEGIN_PRUNE_FILES"=>"Begin pruning Files."}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_PRUNED_JOBS"=>"Pruned %{INT} Jobs* for client %{BACULA_HOST:client} from catalog."}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_PRUNED_FILES"=>"Pruned Files from %{INT} Jobs* for client %{BACULA_HOST:client} from catalog."}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_ENDPRUNE"=>"End auto prune."}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_STARTJOB"=>"Start Backup JobId %{INT}, Job=%{BACULA_JOB:job}"}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_STARTRESTORE"=>"Start Restore Job %{BACULA_JOB:job}"}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_USEDEVICE"=>"Using Device \\\"%{BACULA_DEVICE:device}\\\""}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_DIFF_FS"=>"\\s+%{UNIXPATH} is a different filesystem. Will not descend from %{UNIXPATH} into it."}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_JOBEND"=>"Job write elapsed time = %{DATA:elapsed}, Transfer rate = %{NUMBER} (K|M|G)? Bytes/second"}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRUNE_JOBS"=>"No Jobs found to prune."}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRUNE_FILES"=>"No Files found to prune."}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_VOLUME_PREVWRITTEN"=>"Volume \\\"%{BACULA_VOLUME:volume}\\\" previously written, moving to end of data."}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_READYAPPEND"=>"Ready to append to end of Volume \\\"%{BACULA_VOLUME:volume}\\\" size=%{INT}"}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_CANCELLING"=>"Cancelling duplicate JobId=%{INT}."}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MARKCANCEL"=>"JobId %{INT}, Job %{BACULA_JOB:job} marked to be canceled."}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_CLIENT_RBJ"=>"shell command: run ClientRunBeforeJob \\\"%{GREEDYDATA:runjob}\\\""}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_VSS"=>"(Generate )?VSS (Writer)?"}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MAXSTART"=>"Fatal error: Job canceled because max start delay time exceeded."}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_DUPLICATE"=>"Fatal error: JobId %{INT:duplicate} already running. Duplicate job not allowed."}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOJOBSTAT"=>"Fatal error: No Job status returned from FD."}
[2018-04-24T17:17:38,914][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_FATAL_CONN"=>"Fatal error: bsock.c:133 Unable to connect to (Client: %{BACULA_HOST:client}|Storage daemon) on %{HOSTNAME}:%{POSINT}. ERR=(?<berror>%{GREEDYDATA})"}
[2018-04-24T17:17:38,915][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NO_CONNECT"=>"Warning: bsock.c:127 Could not connect to (Client: %{BACULA_HOST:client}|Storage daemon) on %{HOSTNAME}:%{POSINT}. ERR=(?<berror>%{GREEDYDATA})"}
[2018-04-24T17:17:38,915][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NO_AUTH"=>"Fatal error: Unable to authenticate with File daemon at %{HOSTNAME}. Possible causes:"}
[2018-04-24T17:17:38,915][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOSUIT"=>"No prior or suitable Full backup found in catalog. Doing FULL backup."}
[2018-04-24T17:17:38,915][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRIOR"=>"No prior Full backup Job record found."}
[2018-04-24T17:17:38,915][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_JOB"=>"(Error: )?Bacula %{BACULA_HOST} %{BACULA_VERSION} \\(%{BACULA_VERSION}\\):"}
[2018-04-24T17:17:38,915][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOGLINE"=>"%{BACULA_TIMESTAMP:bts} %{BACULA_HOST:hostname} JobId %{INT:jobid}: (%{BACULA_LOG_MAX_CAPACITY}|%{BACULA_LOG_END_VOLUME}|%{BACULA_LOG_NEW_VOLUME}|%{BACULA_LOG_NEW_LABEL}|%{BACULA_LOG_WROTE_LABEL}|%{BACULA_LOG_NEW_MOUNT}|%{BACULA_LOG_NOOPEN}|%{BACULA_LOG_NOOPENDIR}|%{BACULA_LOG_NOSTAT}|%{BACULA_LOG_NOJOBS}|%{BACULA_LOG_ALL_RECORDS_PRUNED}|%{BACULA_LOG_BEGIN_PRUNE_JOBS}|%{BACULA_LOG_BEGIN_PRUNE_FILES}|%{BACULA_LOG_PRUNED_JOBS}|%{BACULA_LOG_PRUNED_FILES}|%{BACULA_LOG_ENDPRUNE}|%{BACULA_LOG_STARTJOB}|%{BACULA_LOG_STARTRESTORE}|%{BACULA_LOG_USEDEVICE}|%{BACULA_LOG_DIFF_FS}|%{BACULA_LOG_JOBEND}|%{BACULA_LOG_NOPRUNE_JOBS}|%{BACULA_LOG_NOPRUNE_FILES}|%{BACULA_LOG_VOLUME_PREVWRITTEN}|%{BACULA_LOG_READYAPPEND}|%{BACULA_LOG_CANCELLING}|%{BACULA_LOG_MARKCANCEL}|%{BACULA_LOG_CLIENT_RBJ}|%{BACULA_LOG_VSS}|%{BACULA_LOG_MAXSTART}|%{BACULA_LOG_DUPLICATE}|%{BACULA_LOG_NOJOBSTAT}|%{BACULA_LOG_FATAL_CONN}|%{BACULA_LOG_NO_CONNECT}|%{BACULA_LOG_NO_AUTH}|%{BACULA_LOG_NOSUIT}|%{BACULA_LOG_JOB}|%{BACULA_LOG_NOPRIOR})"}
[2018-04-24T17:17:38,915][DEBUG][logstash.filters.grok ] Adding pattern {"BIND9_TIMESTAMP"=>"%{MONTHDAY}[-]%{MONTH}[-]%{YEAR} %{TIME}"}
[2018-04-24T17:17:38,915][DEBUG][logstash.filters.grok ] Adding pattern {"BIND9"=>"%{BIND9_TIMESTAMP:timestamp} queries: %{LOGLEVEL:loglevel}: client %{IP:clientip}#%{POSINT:clientport} \\(%{GREEDYDATA:query}\\): query: %{GREEDYDATA:query} IN %{GREEDYDATA:querytype} \\(%{IP:dns}\\)"}
[2018-04-24T17:17:38,915][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_HTTP"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{INT:trans_depth}\\t%{GREEDYDATA:method}\\t%{GREEDYDATA:domain}\\t%{GREEDYDATA:uri}\\t%{GREEDYDATA:referrer}\\t%{GREEDYDATA:user_agent}\\t%{NUMBER:request_body_len}\\t%{NUMBER:response_body_len}\\t%{GREEDYDATA:status_code}\\t%{GREEDYDATA:status_msg}\\t%{GREEDYDATA:info_code}\\t%{GREEDYDATA:info_msg}\\t%{GREEDYDATA:filename}\\t%{GREEDYDATA:bro_tags}\\t%{GREEDYDATA:username}\\t%{GREEDYDATA:password}\\t%{GREEDYDATA:proxied}\\t%{GREEDYDATA:orig_fuids}\\t%{GREEDYDATA:orig_mime_types}\\t%{GREEDYDATA:resp_fuids}\\t%{GREEDYDATA:resp_mime_types}"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_DNS"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{WORD:proto}\\t%{INT:trans_id}\\t%{GREEDYDATA:query}\\t%{GREEDYDATA:qclass}\\t%{GREEDYDATA:qclass_name}\\t%{GREEDYDATA:qtype}\\t%{GREEDYDATA:qtype_name}\\t%{GREEDYDATA:rcode}\\t%{GREEDYDATA:rcode_name}\\t%{GREEDYDATA:AA}\\t%{GREEDYDATA:TC}\\t%{GREEDYDATA:RD}\\t%{GREEDYDATA:RA}\\t%{GREEDYDATA:Z}\\t%{GREEDYDATA:answers}\\t%{GREEDYDATA:TTLs}\\t%{GREEDYDATA:rejected}"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_CONN"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{WORD:proto}\\t%{GREEDYDATA:service}\\t%{NUMBER:duration}\\t%{NUMBER:orig_bytes}\\t%{NUMBER:resp_bytes}\\t%{GREEDYDATA:conn_state}\\t%{GREEDYDATA:local_orig}\\t%{GREEDYDATA:missed_bytes}\\t%{GREEDYDATA:history}\\t%{GREEDYDATA:orig_pkts}\\t%{GREEDYDATA:orig_ip_bytes}\\t%{GREEDYDATA:resp_pkts}\\t%{GREEDYDATA:resp_ip_bytes}\\t%{GREEDYDATA:tunnel_parents}"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_FILES"=>"%{NUMBER:ts}\\t%{NOTSPACE:fuid}\\t%{IP:tx_hosts}\\t%{IP:rx_hosts}\\t%{NOTSPACE:conn_uids}\\t%{GREEDYDATA:source}\\t%{GREEDYDATA:depth}\\t%{GREEDYDATA:analyzers}\\t%{GREEDYDATA:mime_type}\\t%{GREEDYDATA:filename}\\t%{GREEDYDATA:duration}\\t%{GREEDYDATA:local_orig}\\t%{GREEDYDATA:is_orig}\\t%{GREEDYDATA:seen_bytes}\\t%{GREEDYDATA:total_bytes}\\t%{GREEDYDATA:missing_bytes}\\t%{GREEDYDATA:overflow_bytes}\\t%{GREEDYDATA:timedout}\\t%{GREEDYDATA:parent_fuid}\\t%{GREEDYDATA:md5}\\t%{GREEDYDATA:sha1}\\t%{GREEDYDATA:sha256}\\t%{GREEDYDATA:extracted}"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_MSGID"=>"[0-9A-Za-z]{6}-[0-9A-Za-z]{6}-[0-9A-Za-z]{2}"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_FLAGS"=>"(<=|[-=>*]>|[*]{2}|==)"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_DATE"=>"%{YEAR:exim_year}-%{MONTHNUM:exim_month}-%{MONTHDAY:exim_day} %{TIME:exim_time}"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_PID"=>"\\[%{POSINT}\\]"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_QT"=>"((\\d+y)?(\\d+w)?(\\d+d)?(\\d+h)?(\\d+m)?(\\d+s)?)"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_EXCLUDE_TERMS"=>"(Message is frozen|(Start|End) queue run| Warning: | retry time not reached | no (IP address|host name) found for (IP address|host) | unexpected disconnection while reading SMTP command | no immediate delivery: |another process is handling this message)"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_REMOTE_HOST"=>"(H=(%{NOTSPACE:remote_hostname} )?(\\(%{NOTSPACE:remote_heloname}\\) )?\\[%{IP:remote_host}\\])"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_INTERFACE"=>"(I=\\[%{IP:exim_interface}\\](:%{NUMBER:exim_interface_port}))"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_PROTOCOL"=>"(P=%{NOTSPACE:protocol})"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_MSG_SIZE"=>"(S=%{NUMBER:exim_msg_size})"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_HEADER_ID"=>"(id=%{NOTSPACE:exim_header_id})"}
[2018-04-24T17:17:38,916][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_SUBJECT"=>"(T=%{QS:exim_subject})"}
[2018-04-24T17:17:38,917][DEBUG][logstash.filters.grok ] Adding pattern {"NETSCREENSESSIONLOG"=>"%{SYSLOGTIMESTAMP:date} %{IPORHOST:device} %{IPORHOST}: NetScreen device_id=%{WORD:device_id}%{DATA}: start_time=%{QUOTEDSTRING:start_time} duration=%{INT:duration} policy_id=%{INT:policy_id} service=%{DATA:service} proto=%{INT:proto} src zone=%{WORD:src_zone} dst zone=%{WORD:dst_zone} action=%{WORD:action} sent=%{INT:sent} rcvd=%{INT:rcvd} src=%{IPORHOST:src_ip} dst=%{IPORHOST:dst_ip} src_port=%{INT:src_port} dst_port=%{INT:dst_port} src-xlated ip=%{IPORHOST:src_xlated_ip} port=%{INT:src_xlated_port} dst-xlated ip=%{IPORHOST:dst_xlated_ip} port=%{INT:dst_xlated_port} session_id=%{INT:session_id} reason=%{GREEDYDATA:reason}"}
[2018-04-24T17:17:38,917][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_TAGGED_SYSLOG"=>"^<%{POSINT:syslog_pri}>%{CISCOTIMESTAMP:timestamp}( %{SYSLOGHOST:sysloghost})? ?: %%{CISCOTAG:ciscotag}:"}
[2018-04-24T17:17:38,917][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOTIMESTAMP"=>"%{MONTH} +%{MONTHDAY}(?: %{YEAR})? %{TIME}"}
[2018-04-24T17:17:38,917][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOTAG"=>"[A-Z0-9]+-%{INT}-(?:[A-Z0-9_]+)"}
[2018-04-24T17:17:38,917][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_ACTION"=>"Built|Teardown|Deny|Denied|denied|requested|permitted|denied by ACL|discarded|est-allowed|Dropping|created|deleted"}
[2018-04-24T17:17:38,917][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_REASON"=>"Duplicate TCP SYN|Failed to locate egress interface|Invalid transport field|No matching connection|DNS Response|DNS Query|(?:%{WORD}\\s*)*"}
[2018-04-24T17:17:38,917][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_DIRECTION"=>"Inbound|inbound|Outbound|outbound"}
[2018-04-24T17:17:38,917][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_INTERVAL"=>"first hit|%{INT}-second interval"}
[2018-04-24T17:17:38,917][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_XLATE_TYPE"=>"static|dynamic"}
[2018-04-24T17:17:38,917][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104001"=>"\\((?:Primary|Secondary)\\) Switching to ACTIVE - %{GREEDYDATA:switch_reason}"}
[2018-04-24T17:17:38,917][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104002"=>"\\((?:Primary|Secondary)\\) Switching to STANDBY - %{GREEDYDATA:switch_reason}"}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104003"=>"\\((?:Primary|Secondary)\\) Switching to FAILED\\."}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104004"=>"\\((?:Primary|Secondary)\\) Switching to OK\\."}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105003"=>"\\((?:Primary|Secondary)\\) Monitoring on [Ii]nterface %{GREEDYDATA:interface_name} waiting"}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105004"=>"\\((?:Primary|Secondary)\\) Monitoring on [Ii]nterface %{GREEDYDATA:interface_name} normal"}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105005"=>"\\((?:Primary|Secondary)\\) Lost Failover communications with mate on [Ii]nterface %{GREEDYDATA:interface_name}"}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105008"=>"\\((?:Primary|Secondary)\\) Testing [Ii]nterface %{GREEDYDATA:interface_name}"}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105009"=>"\\((?:Primary|Secondary)\\) Testing on [Ii]nterface %{GREEDYDATA:interface_name} (?:Passed|Failed)"}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106001"=>"%{CISCO_DIRECTION:direction} %{WORD:protocol} connection %{CISCO_ACTION:action} from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port} flags %{GREEDYDATA:tcp_flags} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106006_106007_106010"=>"%{CISCO_ACTION:action} %{CISCO_DIRECTION:direction} %{WORD:protocol} (?:from|src) %{IP:src_ip}/%{INT:src_port}(\\(%{DATA:src_fwuser}\\))? (?:to|dst) %{IP:dst_ip}/%{INT:dst_port}(\\(%{DATA:dst_fwuser}\\))? (?:on interface %{DATA:interface}|due to %{CISCO_REASON:reason})"}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106014"=>"%{CISCO_ACTION:action} %{CISCO_DIRECTION:direction} %{WORD:protocol} src %{DATA:src_interface}:%{IP:src_ip}(\\(%{DATA:src_fwuser}\\))? dst %{DATA:dst_interface}:%{IP:dst_ip}(\\(%{DATA:dst_fwuser}\\))? \\(type %{INT:icmp_type}, code %{INT:icmp_code}\\)"}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106015"=>"%{CISCO_ACTION:action} %{WORD:protocol} \\(%{DATA:policy_id}\\) from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port} flags %{DATA:tcp_flags} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106021"=>"%{CISCO_ACTION:action} %{WORD:protocol} reverse path check from %{IP:src_ip} to %{IP:dst_ip} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106023"=>"%{CISCO_ACTION:action}( protocol)? %{WORD:protocol} src %{DATA:src_interface}:%{DATA:src_ip}(/%{INT:src_port})?(\\(%{DATA:src_fwuser}\\))? dst %{DATA:dst_interface}:%{DATA:dst_ip}(/%{INT:dst_port})?(\\(%{DATA:dst_fwuser}\\))?( \\(type %{INT:icmp_type}, code %{INT:icmp_code}\\))? by access-group \"?%{DATA:policy_id}\"? \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106100_2_3"=>"access-list %{NOTSPACE:policy_id} %{CISCO_ACTION:action} %{WORD:protocol} for user '%{DATA:src_fwuser}' %{DATA:src_interface}/%{IP:src_ip}\\(%{INT:src_port}\\) -> %{DATA:dst_interface}/%{IP:dst_ip}\\(%{INT:dst_port}\\) hit-cnt %{INT:hit_count} %{CISCO_INTERVAL:interval} \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106100"=>"access-list %{NOTSPACE:policy_id} %{CISCO_ACTION:action} %{WORD:protocol} %{DATA:src_interface}/%{IP:src_ip}\\(%{INT:src_port}\\)(\\(%{DATA:src_fwuser}\\))? -> %{DATA:dst_interface}/%{IP:dst_ip}\\(%{INT:dst_port}\\)(\\(%{DATA:src_fwuser}\\))? hit-cnt %{INT:hit_count} %{CISCO_INTERVAL:interval} \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,918][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW304001"=>"%{IP:src_ip}(\\(%{DATA:src_fwuser}\\))? Accessed URL %{IP:dst_ip}:%{GREEDYDATA:dst_url}"}
[2018-04-24T17:17:38,919][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW110002"=>"%{CISCO_REASON:reason} for %{WORD:protocol} from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,919][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302010"=>"%{INT:connection_count} in use, %{INT:connection_count_max} most used"}
[2018-04-24T17:17:38,919][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302013_302014_302015_302016"=>"%{CISCO_ACTION:action}(?: %{CISCO_DIRECTION:direction})? %{WORD:protocol} connection %{INT:connection_id} for %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port}( \\(%{IP:src_mapped_ip}/%{INT:src_mapped_port}\\))?(\\(%{DATA:src_fwuser}\\))? to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}( \\(%{IP:dst_mapped_ip}/%{INT:dst_mapped_port}\\))?(\\(%{DATA:dst_fwuser}\\))?( duration %{TIME:duration} bytes %{INT:bytes})?(?: %{CISCO_REASON:reason})?( \\(%{DATA:user}\\))?"}
[2018-04-24T17:17:38,919][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302020_302021"=>"%{CISCO_ACTION:action}(?: %{CISCO_DIRECTION:direction})? %{WORD:protocol} connection for faddr %{IP:dst_ip}/%{INT:icmp_seq_num}(?:\\(%{DATA:fwuser}\\))? gaddr %{IP:src_xlated_ip}/%{INT:icmp_code_xlated} laddr %{IP:src_ip}/%{INT:icmp_code}( \\(%{DATA:user}\\))?"}
[2018-04-24T17:17:38,919][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW305011"=>"%{CISCO_ACTION:action} %{CISCO_XLATE_TYPE:xlate_type} %{WORD:protocol} translation from %{DATA:src_interface}:%{IP:src_ip}(/%{INT:src_port})?(\\(%{DATA:src_fwuser}\\))? to %{DATA:src_xlated_interface}:%{IP:src_xlated_ip}/%{DATA:src_xlated_port}"}
[2018-04-24T17:17:38,919][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW313001_313004_313008"=>"%{CISCO_ACTION:action} %{WORD:protocol} type=%{INT:icmp_type}, code=%{INT:icmp_code} from %{IP:src_ip} on interface %{DATA:interface}( to %{IP:dst_ip})?"}
[2018-04-24T17:17:38,919][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW313005"=>"%{CISCO_REASON:reason} for %{WORD:protocol} error message: %{WORD:err_protocol} src %{DATA:err_src_interface}:%{IP:err_src_ip}(\\(%{DATA:err_src_fwuser}\\))? dst %{DATA:err_dst_interface}:%{IP:err_dst_ip}(\\(%{DATA:err_dst_fwuser}\\))? \\(type %{INT:err_icmp_type}, code %{INT:err_icmp_code}\\) on %{DATA:interface} interface\\. Original IP payload: %{WORD:protocol} src %{IP:orig_src_ip}/%{INT:orig_src_port}(\\(%{DATA:orig_src_fwuser}\\))? dst %{IP:orig_dst_ip}/%{INT:orig_dst_port}(\\(%{DATA:orig_dst_fwuser}\\))?"}
[2018-04-24T17:17:38,919][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW321001"=>"Resource '%{WORD:resource_name}' limit of %{POSINT:resource_limit} reached for system"}
[2018-04-24T17:17:38,919][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW402117"=>"%{WORD:protocol}: Received a non-IPSec packet \\(protocol= %{WORD:orig_protocol}\\) from %{IP:src_ip} to %{IP:dst_ip}"}
[2018-04-24T17:17:38,919][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW402119"=>"%{WORD:protocol}: Received an %{WORD:orig_protocol} packet \\(SPI= %{DATA:spi}, sequence number= %{DATA:seq_num}\\) from %{IP:src_ip} \\(user= %{DATA:user}\\) to %{IP:dst_ip} that failed anti-replay checking"}
[2018-04-24T17:17:38,919][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW419001"=>"%{CISCO_ACTION:action} %{WORD:protocol} packet from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}, reason: %{GREEDYDATA:reason}"}
[2018-04-24T17:17:38,920][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW419002"=>"%{CISCO_REASON:reason} from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port} with different initial sequence number"}
[2018-04-24T17:17:38,920][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW500004"=>"%{CISCO_REASON:reason} for protocol=%{WORD:protocol}, from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,920][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW602303_602304"=>"%{WORD:protocol}: An %{CISCO_DIRECTION:direction} %{GREEDYDATA:tunnel_type} SA \\(SPI= %{DATA:spi}\\) between %{IP:src_ip} and %{IP:dst_ip} \\(user= %{DATA:user}\\) has been %{CISCO_ACTION:action}"}
[2018-04-24T17:17:38,920][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW710001_710002_710003_710005_710006"=>"%{WORD:protocol} (?:request|access) %{CISCO_ACTION:action} from %{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,920][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW713172"=>"Group = %{GREEDYDATA:group}, IP = %{IP:src_ip}, Automatic NAT Detection Status:\\s+Remote end\\s*%{DATA:is_remote_natted}\\s*behind a NAT device\\s+This\\s+end\\s*%{DATA:is_local_natted}\\s*behind a NAT device"}
[2018-04-24T17:17:38,920][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW733100"=>"\\[\\s*%{DATA:drop_type}\\s*\\] drop %{DATA:drop_rate_id} exceeded. Current burst rate is %{INT:drop_rate_current_burst} per second, max configured rate is %{INT:drop_rate_max_burst}; Current average rate is %{INT:drop_rate_current_avg} per second, max configured rate is %{INT:drop_rate_max_avg}; Cumulative total count is %{INT:drop_total_count}"}
[2018-04-24T17:17:38,920][DEBUG][logstash.filters.grok ] Adding pattern {"SHOREWALL"=>"(%{SYSLOGTIMESTAMP:timestamp}) (%{WORD:nf_host}) kernel:.*Shorewall:(%{WORD:nf_action1})?:(%{WORD:nf_action2})?.*IN=(%{USERNAME:nf_in_interface})?.*(OUT= *MAC=(%{COMMONMAC:nf_dst_mac}):(%{COMMONMAC:nf_src_mac})?|OUT=%{USERNAME:nf_out_interface}).*SRC=(%{IPV4:nf_src_ip}).*DST=(%{IPV4:nf_dst_ip}).*LEN=(%{WORD:nf_len}).?*TOS=(%{WORD:nf_tos}).?*PREC=(%{WORD:nf_prec}).?*TTL=(%{INT:nf_ttl}).?*ID=(%{INT:nf_id}).?*PROTO=(%{WORD:nf_protocol}).?*SPT=(%{INT:nf_src_port}?.*DPT=%{INT:nf_dst_port}?.*)"}
[2018-04-24T17:17:38,920][DEBUG][logstash.filters.grok ] Adding pattern {"SFW2"=>"((%{SYSLOGTIMESTAMP})|(%{TIMESTAMP_ISO8601}))\\s*%{HOSTNAME}\\s*kernel\\S+\\s*%{NAGIOSTIME}\\s*SFW2\\-INext\\-%{NOTSPACE:nf_action}\\s*IN=%{USERNAME:nf_in_interface}.*OUT=((\\s*%{USERNAME:nf_out_interface})|(\\s*))MAC=((%{COMMONMAC:nf_dst_mac}:%{COMMONMAC:nf_src_mac})|(\\s*)).*SRC=%{IP:nf_src_ip}\\s*DST=%{IP:nf_dst_ip}.*PROTO=%{WORD:nf_protocol}((.*SPT=%{INT:nf_src_port}.*DPT=%{INT:nf_dst_port}.*)|())"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"USERNAME"=>"[a-zA-Z0-9._-]+"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"USER"=>"%{USERNAME}"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"EMAILLOCALPART"=>"[a-zA-Z][a-zA-Z0-9_.+-=:]+"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"EMAILADDRESS"=>"%{EMAILLOCALPART}@%{HOSTNAME}"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"INT"=>"(?:[+-]?(?:[0-9]+))"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"BASE10NUM"=>"(?<![0-9.+-])(?>[+-]?(?:(?:[0-9]+(?:\\.[0-9]+)?)|(?:\\.[0-9]+)))"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"NUMBER"=>"(?:%{BASE10NUM})"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"BASE16NUM"=>"(?<![0-9A-Fa-f])(?:[+-]?(?:0x)?(?:[0-9A-Fa-f]+))"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"BASE16FLOAT"=>"\\b(?<![0-9A-Fa-f.])(?:[+-]?(?:0x)?(?:(?:[0-9A-Fa-f]+(?:\\.[0-9A-Fa-f]*)?)|(?:\\.[0-9A-Fa-f]+)))\\b"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"POSINT"=>"\\b(?:[1-9][0-9]*)\\b"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"NONNEGINT"=>"\\b(?:[0-9]+)\\b"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"WORD"=>"\\b\\w+\\b"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"NOTSPACE"=>"\\S+"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"SPACE"=>"\\s*"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"DATA"=>".*?"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"GREEDYDATA"=>".*"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"QUOTEDSTRING"=>"(?>(?<!\\\\)(?>\"(?>\\\\.|[^\\\\\"]+)+\"|\"\"|(?>'(?>\\\\.|[^\\\\']+)+')|''|(?>`(?>\\\\.|[^\\\\`]+)+`)|``))"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"UUID"=>"[A-Fa-f0-9]{8}-(?:[A-Fa-f0-9]{4}-){3}[A-Fa-f0-9]{12}"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"URN"=>"urn:[0-9A-Za-z][0-9A-Za-z-]{0,31}:(?:%[0-9a-fA-F]{2}|[0-9A-Za-z()+,.:=@;$_!*'/?#-])+"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"MAC"=>"(?:%{CISCOMAC}|%{WINDOWSMAC}|%{COMMONMAC})"}
[2018-04-24T17:17:38,921][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOMAC"=>"(?:(?:[A-Fa-f0-9]{4}\\.){2}[A-Fa-f0-9]{4})"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"WINDOWSMAC"=>"(?:(?:[A-Fa-f0-9]{2}-){5}[A-Fa-f0-9]{2})"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"COMMONMAC"=>"(?:(?:[A-Fa-f0-9]{2}:){5}[A-Fa-f0-9]{2})"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"IPV6"=>"((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:)))(%.+)?"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"IPV4"=>"(?<![0-9])(?:(?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5]))(?![0-9])"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"IP"=>"(?:%{IPV6}|%{IPV4})"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"HOSTNAME"=>"\\b(?:[0-9A-Za-z][0-9A-Za-z-]{0,62})(?:\\.(?:[0-9A-Za-z][0-9A-Za-z-]{0,62}))*(\\.?|\\b)"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"IPORHOST"=>"(?:%{IP}|%{HOSTNAME})"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"HOSTPORT"=>"%{IPORHOST}:%{POSINT}"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"PATH"=>"(?:%{UNIXPATH}|%{WINPATH})"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"UNIXPATH"=>"(/([\\w_%!$@:.,+~-]+|\\\\.)*)+"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"TTY"=>"(?:/dev/(pts|tty([pq])?)(\\w+)?/?(?:[0-9]+))"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"WINPATH"=>"(?>[A-Za-z]+:|\\\\)(?:\\\\[^\\\\?*]*)+"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"URIPROTO"=>"[A-Za-z]([A-Za-z0-9+\\-.]+)+"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"URIHOST"=>"%{IPORHOST}(?::%{POSINT:port})?"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"URIPATH"=>"(?:/[A-Za-z0-9$.+!*'(){},~:;=@#%&_\\-]*)+"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"URIPARAM"=>"\\?[A-Za-z0-9$.+!*'|(){},~@#%&/=:;_?\\-\\[\\]<>]*"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"URIPATHPARAM"=>"%{URIPATH}(?:%{URIPARAM})?"}
[2018-04-24T17:17:38,922][DEBUG][logstash.filters.grok ] Adding pattern {"URI"=>"%{URIPROTO}://(?:%{USER}(?::[^@]*)?@)?(?:%{URIHOST})?(?:%{URIPATHPARAM})?"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"MONTH"=>"\\b(?:[Jj]an(?:uary|uar)?|[Ff]eb(?:ruary|ruar)?|[Mm](?:a|ä)?r(?:ch|z)?|[Aa]pr(?:il)?|[Mm]a(?:y|i)?|[Jj]un(?:e|i)?|[Jj]ul(?:y)?|[Aa]ug(?:ust)?|[Ss]ep(?:tember)?|[Oo](?:c|k)?t(?:ober)?|[Nn]ov(?:ember)?|[Dd]e(?:c|z)(?:ember)?)\\b"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHNUM"=>"(?:0?[1-9]|1[0-2])"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHNUM2"=>"(?:0[1-9]|1[0-2])"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHDAY"=>"(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"DAY"=>"(?:Mon(?:day)?|Tue(?:sday)?|Wed(?:nesday)?|Thu(?:rsday)?|Fri(?:day)?|Sat(?:urday)?|Sun(?:day)?)"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"YEAR"=>"(?>\\d\\d){1,2}"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"HOUR"=>"(?:2[0123]|[01]?[0-9])"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"MINUTE"=>"(?:[0-5][0-9])"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"SECOND"=>"(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"TIME"=>"(?!<[0-9])%{HOUR}:%{MINUTE}(?::%{SECOND})(?![0-9])"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"DATE_US"=>"%{MONTHNUM}[/-]%{MONTHDAY}[/-]%{YEAR}"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"DATE_EU"=>"%{MONTHDAY}[./-]%{MONTHNUM}[./-]%{YEAR}"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"ISO8601_TIMEZONE"=>"(?:Z|[+-]%{HOUR}(?::?%{MINUTE}))"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"ISO8601_SECOND"=>"(?:%{SECOND}|60)"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"TIMESTAMP_ISO8601"=>"%{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"DATE"=>"%{DATE_US}|%{DATE_EU}"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP"=>"%{DATE}[- ]%{TIME}"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"TZ"=>"(?:[APMCE][SD]T|UTC)"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_RFC822"=>"%{DAY} %{MONTH} %{MONTHDAY} %{YEAR} %{TIME} %{TZ}"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_RFC2822"=>"%{DAY}, %{MONTHDAY} %{MONTH} %{YEAR} %{TIME} %{ISO8601_TIMEZONE}"}
[2018-04-24T17:17:38,923][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_OTHER"=>"%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{TZ} %{YEAR}"}
[2018-04-24T17:17:38,924][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_EVENTLOG"=>"%{YEAR}%{MONTHNUM2}%{MONTHDAY}%{HOUR}%{MINUTE}%{SECOND}"}
[2018-04-24T17:17:38,924][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGTIMESTAMP"=>"%{MONTH} +%{MONTHDAY} %{TIME}"}
[2018-04-24T17:17:38,924][DEBUG][logstash.filters.grok ] Adding pattern {"PROG"=>"[\\x21-\\x5a\\x5c\\x5e-\\x7e]+"}
[2018-04-24T17:17:38,924][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGPROG"=>"%{PROG:program}(?:\\[%{POSINT:pid}\\])?"}
[2018-04-24T17:17:38,924][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGHOST"=>"%{IPORHOST}"}
[2018-04-24T17:17:38,924][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGFACILITY"=>"<%{NONNEGINT:facility}.%{NONNEGINT:priority}>"}
[2018-04-24T17:17:38,924][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDATE"=>"%{MONTHDAY}/%{MONTH}/%{YEAR}:%{TIME} %{INT}"}
[2018-04-24T17:17:38,924][DEBUG][logstash.filters.grok ] Adding pattern {"QS"=>"%{QUOTEDSTRING}"}
[2018-04-24T17:17:38,924][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGBASE"=>"%{SYSLOGTIMESTAMP:timestamp} (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource} %{SYSLOGPROG}:"}
[2018-04-24T17:17:38,924][DEBUG][logstash.filters.grok ] Adding pattern {"LOGLEVEL"=>"([Aa]lert|ALERT|[Tt]race|TRACE|[Dd]ebug|DEBUG|[Nn]otice|NOTICE|[Ii]nfo|INFO|[Ww]arn?(?:ing)?|WARN?(?:ING)?|[Ee]rr?(?:or)?|ERR?(?:OR)?|[Cc]rit?(?:ical)?|CRIT?(?:ICAL)?|[Ff]atal|FATAL|[Ss]evere|SEVERE|EMERG(?:ENCY)?|[Ee]merg(?:ency)?)"}
[2018-04-24T17:17:38,924][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYTIME"=>"(?!<[0-9])%{HOUR:haproxy_hour}:%{MINUTE:haproxy_minute}(?::%{SECOND:haproxy_second})(?![0-9])"}
[2018-04-24T17:17:38,924][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYDATE"=>"%{MONTHDAY:haproxy_monthday}/%{MONTH:haproxy_month}/%{YEAR:haproxy_year}:%{HAPROXYTIME:haproxy_time}.%{INT:haproxy_milliseconds}"}
[2018-04-24T17:17:38,924][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYCAPTUREDREQUESTHEADERS"=>"%{DATA:captured_request_headers}"}
[2018-04-24T17:17:38,924][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYCAPTUREDRESPONSEHEADERS"=>"%{DATA:captured_response_headers}"}
[2018-04-24T17:17:38,925][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYHTTPBASE"=>"%{IP:client_ip}:%{INT:client_port} \\[%{HAPROXYDATE:accept_date}\\] %{NOTSPACE:frontend_name} %{NOTSPACE:backend_name}/%{NOTSPACE:server_name} %{INT:time_request}/%{INT:time_queue}/%{INT:time_backend_connect}/%{INT:time_backend_response}/%{NOTSPACE:time_duration} %{INT:http_status_code} %{NOTSPACE:bytes_read} %{DATA:captured_request_cookie} %{DATA:captured_response_cookie} %{NOTSPACE:termination_state} %{INT:actconn}/%{INT:feconn}/%{INT:beconn}/%{INT:srvconn}/%{NOTSPACE:retries} %{INT:srv_queue}/%{INT:backend_queue} (\\{%{HAPROXYCAPTUREDREQUESTHEADERS}\\})?( )?(\\{%{HAPROXYCAPTUREDRESPONSEHEADERS}\\})?( )?\"(<BADREQ>|(%{WORD:http_verb} (%{URIPROTO:http_proto}://)?(?:%{USER:http_user}(?::[^@]*)?@)?(?:%{URIHOST:http_host})?(?:%{URIPATHPARAM:http_request})?( HTTP/%{NUMBER:http_version})?))?\""}
[2018-04-24T17:17:38,925][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYHTTP"=>"(?:%{SYSLOGTIMESTAMP:syslog_timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) %{IPORHOST:syslog_server} %{SYSLOGPROG}: %{HAPROXYHTTPBASE}"}
[2018-04-24T17:17:38,925][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYTCP"=>"(?:%{SYSLOGTIMESTAMP:syslog_timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) %{IPORHOST:syslog_server} %{SYSLOGPROG}: %{IP:client_ip}:%{INT:client_port} \\[%{HAPROXYDATE:accept_date}\\] %{NOTSPACE:frontend_name} %{NOTSPACE:backend_name}/%{NOTSPACE:server_name} %{INT:time_queue}/%{INT:time_backend_connect}/%{NOTSPACE:time_duration} %{NOTSPACE:bytes_read} %{NOTSPACE:termination_state} %{INT:actconn}/%{INT:feconn}/%{INT:beconn}/%{INT:srvconn}/%{NOTSPACE:retries} %{INT:srv_queue}/%{INT:backend_queue}"}
[2018-04-24T17:17:38,925][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDUSER"=>"%{EMAILADDRESS}|%{USER}"}
[2018-04-24T17:17:38,925][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDERROR_DATE"=>"%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{YEAR}"}
[2018-04-24T17:17:38,925][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_COMMONLOG"=>"%{IPORHOST:clientip} %{HTTPDUSER:ident} %{HTTPDUSER:auth} \\[%{HTTPDATE:timestamp}\\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{NUMBER:response} (?:%{NUMBER:bytes}|-)"}
[2018-04-24T17:17:38,925][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_COMBINEDLOG"=>"%{HTTPD_COMMONLOG} %{QS:referrer} %{QS:agent}"}
[2018-04-24T17:17:38,925][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD20_ERRORLOG"=>"\\[%{HTTPDERROR_DATE:timestamp}\\] \\[%{LOGLEVEL:loglevel}\\] (?:\\[client %{IPORHOST:clientip}\\] ){0,1}%{GREEDYDATA:message}"}
[2018-04-24T17:17:38,925][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD24_ERRORLOG"=>"\\[%{HTTPDERROR_DATE:timestamp}\\] \\[%{WORD:module}:%{LOGLEVEL:loglevel}\\] \\[pid %{POSINT:pid}(:tid %{NUMBER:tid})?\\]( \\(%{POSINT:proxy_errorcode}\\)%{DATA:proxy_message}:)?( \\[client %{IPORHOST:clientip}:%{POSINT:clientport}\\])?( %{DATA:errorcode}:)? %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,925][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_ERRORLOG"=>"%{HTTPD20_ERRORLOG}|%{HTTPD24_ERRORLOG}"}
[2018-04-24T17:17:38,925][DEBUG][logstash.filters.grok ] Adding pattern {"COMMONAPACHELOG"=>"%{HTTPD_COMMONLOG}"}
[2018-04-24T17:17:38,925][DEBUG][logstash.filters.grok ] Adding pattern {"COMBINEDAPACHELOG"=>"%{HTTPD_COMBINEDLOG}"}
[2018-04-24T17:17:38,925][DEBUG][logstash.filters.grok ] Adding pattern {"JAVACLASS"=>"(?:[a-zA-Z$_][a-zA-Z$_0-9]*\\.)*[a-zA-Z$_][a-zA-Z$_0-9]*"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAFILE"=>"(?:[A-Za-z0-9_. -]+)"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAMETHOD"=>"(?:(<(?:cl)?init>)|[a-zA-Z$_][a-zA-Z$_0-9]*)"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"JAVASTACKTRACEPART"=>"%{SPACE}at %{JAVACLASS:class}\\.%{JAVAMETHOD:method}\\(%{JAVAFILE:file}(?::%{NUMBER:line})?\\)"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"JAVATHREAD"=>"(?:[A-Z]{2}-Processor[\\d]+)"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"JAVACLASS"=>"(?:[a-zA-Z0-9-]+\\.)+[A-Za-z0-9$]+"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAFILE"=>"(?:[A-Za-z0-9_.-]+)"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"JAVALOGMESSAGE"=>"(.*)"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"CATALINA_DATESTAMP"=>"%{MONTH} %{MONTHDAY}, 20%{YEAR} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) (?:AM|PM)"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"TOMCAT_DATESTAMP"=>"20%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) %{ISO8601_TIMEZONE}"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"CATALINALOG"=>"%{CATALINA_DATESTAMP:timestamp} %{JAVACLASS:class} %{JAVALOGMESSAGE:logmessage}"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"TOMCATLOG"=>"%{TOMCAT_DATESTAMP:timestamp} \\| %{LOGLEVEL:level} \\| %{JAVACLASS:class} - %{JAVALOGMESSAGE:logmessage}"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW_EVENT"=>"(RT_FLOW_SESSION_CREATE|RT_FLOW_SESSION_CLOSE|RT_FLOW_SESSION_DENY)"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW1"=>"%{RT_FLOW_EVENT:event}: %{GREEDYDATA:close-reason}: %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{IP:nat-src-ip}/%{INT:nat-src-port}->%{IP:nat-dst-ip}/%{INT:nat-dst-port} %{DATA:src-nat-rule-name} %{DATA:dst-nat-rule-name} %{INT:protocol-id} %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} %{INT:session-id} \\d+\\(%{DATA:sent}\\) \\d+\\(%{DATA:received}\\) %{INT:elapsed-time} .*"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW2"=>"%{RT_FLOW_EVENT:event}: session created %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{IP:nat-src-ip}/%{INT:nat-src-port}->%{IP:nat-dst-ip}/%{INT:nat-dst-port} %{DATA:src-nat-rule-name} %{DATA:dst-nat-rule-name} %{INT:protocol-id} %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} %{INT:session-id} .*"}
[2018-04-24T17:17:38,926][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW3"=>"%{RT_FLOW_EVENT:event}: session denied %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{INT:protocol-id}\\(\\d\\) %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} .*"}
[2018-04-24T17:17:38,927][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424PRINTASCII"=>"[!-~]+"}
[2018-04-24T17:17:38,927][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGBASE2"=>"(?:%{SYSLOGTIMESTAMP:timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource}+(?: %{SYSLOGPROG}:|)"}
[2018-04-24T17:17:38,927][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGPAMSESSION"=>"%{SYSLOGBASE} (?=%{GREEDYDATA:message})%{WORD:pam_module}\\(%{DATA:pam_caller}\\): session %{WORD:pam_session_state} for user %{USERNAME:username}(?: by %{GREEDYDATA:pam_by})?"}
[2018-04-24T17:17:38,927][DEBUG][logstash.filters.grok ] Adding pattern {"CRON_ACTION"=>"[A-Z ]+"}
[2018-04-24T17:17:38,927][DEBUG][logstash.filters.grok ] Adding pattern {"CRONLOG"=>"%{SYSLOGBASE} \\(%{USER:user}\\) %{CRON_ACTION:action} \\(%{DATA:message}\\)"}
[2018-04-24T17:17:38,927][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGLINE"=>"%{SYSLOGBASE2} %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,927][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424PRI"=>"<%{NONNEGINT:syslog5424_pri}>"}
[2018-04-24T17:17:38,927][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424SD"=>"\\[%{DATA}\\]+"}
[2018-04-24T17:17:38,927][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424BASE"=>"%{SYSLOG5424PRI}%{NONNEGINT:syslog5424_ver} +(?:%{TIMESTAMP_ISO8601:syslog5424_ts}|-) +(?:%{IPORHOST:syslog5424_host}|-) +(-|%{SYSLOG5424PRINTASCII:syslog5424_app}) +(-|%{SYSLOG5424PRINTASCII:syslog5424_proc}) +(-|%{SYSLOG5424PRINTASCII:syslog5424_msgid}) +(?:%{SYSLOG5424SD:syslog5424_sd}|-|)"}
[2018-04-24T17:17:38,927][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424LINE"=>"%{SYSLOG5424BASE} +%{GREEDYDATA:syslog5424_msg}"}
[2018-04-24T17:17:38,927][DEBUG][logstash.filters.grok ] Adding pattern {"MAVEN_VERSION"=>"(?:(\\d+)\\.)?(?:(\\d+)\\.)?(\\*|\\d+)(?:[.-](RELEASE|SNAPSHOT))?"}
[2018-04-24T17:17:38,927][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVEAUDIT"=>"%{TIMESTAMP_ISO8601:timestamp}:"}
[2018-04-24T17:17:38,927][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVE"=>"., \\[%{TIMESTAMP_ISO8601:timestamp} #%{POSINT:pid}\\]%{SPACE}%{LOGLEVEL:event_level}"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVEAUDIT"=>"%{TIMESTAMP_ISO8601:timestamp}:"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_LOG"=>"%{SYSLOGTIMESTAMP:timestamp} \\[%{WORD:component}\\] %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_QUERY"=>"\\{ (?<={ ).*(?= } ntoreturn:) \\}"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_SLOWQUERY"=>"%{WORD} %{MONGO_WORDDASH:database}\\.%{MONGO_WORDDASH:collection} %{WORD}: %{MONGO_QUERY:query} %{WORD}:%{NONNEGINT:ntoreturn} %{WORD}:%{NONNEGINT:ntoskip} %{WORD}:%{NONNEGINT:nscanned}.*nreturned:%{NONNEGINT:nreturned}..+ (?<duration>[0-9]+)ms"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_WORDDASH"=>"\\b[\\w-]+\\b"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_SEVERITY"=>"\\w"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_COMPONENT"=>"%{WORD}|-"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_LOG"=>"%{TIMESTAMP_ISO8601:timestamp} %{MONGO3_SEVERITY:severity} %{MONGO3_COMPONENT:component}%{SPACE}(?:\\[%{DATA:context}\\])? %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOSTIME"=>"\\[%{NUMBER:nagios_epoch}\\]"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_CURRENT_SERVICE_STATE"=>"CURRENT SERVICE STATE"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_CURRENT_HOST_STATE"=>"CURRENT HOST STATE"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_NOTIFICATION"=>"SERVICE NOTIFICATION"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_NOTIFICATION"=>"HOST NOTIFICATION"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_ALERT"=>"SERVICE ALERT"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_ALERT"=>"HOST ALERT"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_FLAPPING_ALERT"=>"SERVICE FLAPPING ALERT"}
[2018-04-24T17:17:38,928][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_FLAPPING_ALERT"=>"HOST FLAPPING ALERT"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_DOWNTIME_ALERT"=>"SERVICE DOWNTIME ALERT"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_DOWNTIME_ALERT"=>"HOST DOWNTIME ALERT"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_PASSIVE_SERVICE_CHECK"=>"PASSIVE SERVICE CHECK"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_PASSIVE_HOST_CHECK"=>"PASSIVE HOST CHECK"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_EVENT_HANDLER"=>"SERVICE EVENT HANDLER"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_EVENT_HANDLER"=>"HOST EVENT HANDLER"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_EXTERNAL_COMMAND"=>"EXTERNAL COMMAND"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_TIMEPERIOD_TRANSITION"=>"TIMEPERIOD TRANSITION"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_SVC_CHECK"=>"DISABLE_SVC_CHECK"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_SVC_CHECK"=>"ENABLE_SVC_CHECK"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_CHECK"=>"DISABLE_HOST_CHECK"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_CHECK"=>"ENABLE_HOST_CHECK"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_PROCESS_SERVICE_CHECK_RESULT"=>"PROCESS_SERVICE_CHECK_RESULT"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_PROCESS_HOST_CHECK_RESULT"=>"PROCESS_HOST_CHECK_RESULT"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_SCHEDULE_SERVICE_DOWNTIME"=>"SCHEDULE_SERVICE_DOWNTIME"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_SCHEDULE_HOST_DOWNTIME"=>"SCHEDULE_HOST_DOWNTIME"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_SVC_NOTIFICATIONS"=>"DISABLE_HOST_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_SVC_NOTIFICATIONS"=>"ENABLE_HOST_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_NOTIFICATIONS"=>"DISABLE_HOST_NOTIFICATIONS"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_NOTIFICATIONS"=>"ENABLE_HOST_NOTIFICATIONS"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_SVC_NOTIFICATIONS"=>"DISABLE_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_SVC_NOTIFICATIONS"=>"ENABLE_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_WARNING"=>"Warning:%{SPACE}%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,929][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_CURRENT_SERVICE_STATE"=>"%{NAGIOS_TYPE_CURRENT_SERVICE_STATE:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statetype};%{DATA:nagios_statecode};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_CURRENT_HOST_STATE"=>"%{NAGIOS_TYPE_CURRENT_HOST_STATE:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statetype};%{DATA:nagios_statecode};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_NOTIFICATION"=>"%{NAGIOS_TYPE_SERVICE_NOTIFICATION:nagios_type}: %{DATA:nagios_notifyname};%{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_contact};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_NOTIFICATION"=>"%{NAGIOS_TYPE_HOST_NOTIFICATION:nagios_type}: %{DATA:nagios_notifyname};%{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_contact};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_ALERT"=>"%{NAGIOS_TYPE_SERVICE_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{NUMBER:nagios_attempt};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_ALERT"=>"%{NAGIOS_TYPE_HOST_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{NUMBER:nagios_attempt};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_FLAPPING_ALERT"=>"%{NAGIOS_TYPE_SERVICE_FLAPPING_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_FLAPPING_ALERT"=>"%{NAGIOS_TYPE_HOST_FLAPPING_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_DOWNTIME_ALERT"=>"%{NAGIOS_TYPE_SERVICE_DOWNTIME_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_DOWNTIME_ALERT"=>"%{NAGIOS_TYPE_HOST_DOWNTIME_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_PASSIVE_SERVICE_CHECK"=>"%{NAGIOS_TYPE_PASSIVE_SERVICE_CHECK:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_PASSIVE_HOST_CHECK"=>"%{NAGIOS_TYPE_PASSIVE_HOST_CHECK:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_EVENT_HANDLER"=>"%{NAGIOS_TYPE_SERVICE_EVENT_HANDLER:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{DATA:nagios_event_handler_name}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_EVENT_HANDLER"=>"%{NAGIOS_TYPE_HOST_EVENT_HANDLER:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{DATA:nagios_event_handler_name}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TIMEPERIOD_TRANSITION"=>"%{NAGIOS_TYPE_TIMEPERIOD_TRANSITION:nagios_type}: %{DATA:nagios_service};%{DATA:nagios_unknown1};%{DATA:nagios_unknown2}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_SVC_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_SVC_CHECK:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service}"}
[2018-04-24T17:17:38,930][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_CHECK:nagios_command};%{DATA:nagios_hostname}"}
[2018-04-24T17:17:38,931][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_SVC_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_SVC_CHECK:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service}"}
[2018-04-24T17:17:38,931][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_CHECK:nagios_command};%{DATA:nagios_hostname}"}
[2018-04-24T17:17:38,931][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_PROCESS_SERVICE_CHECK_RESULT"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_PROCESS_SERVICE_CHECK_RESULT:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_check_result}"}
[2018-04-24T17:17:38,931][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_PROCESS_HOST_CHECK_RESULT"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_PROCESS_HOST_CHECK_RESULT:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_check_result}"}
[2018-04-24T17:17:38,931][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_SVC_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,931][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,931][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_SVC_NOTIFICATIONS:nagios_command};%{DATA:nagios_hostname};%{GREEDYDATA:nagios_service}"}
[2018-04-24T17:17:38,931][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_SVC_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,931][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,931][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_SVC_NOTIFICATIONS:nagios_command};%{DATA:nagios_hostname};%{GREEDYDATA:nagios_service}"}
[2018-04-24T17:17:38,931][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_SCHEDULE_HOST_DOWNTIME"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_SCHEDULE_HOST_DOWNTIME:nagios_command};%{DATA:nagios_hostname};%{NUMBER:nagios_start_time};%{NUMBER:nagios_end_time};%{NUMBER:nagios_fixed};%{NUMBER:nagios_trigger_id};%{NUMBER:nagios_duration};%{DATA:author};%{DATA:comment}"}
[2018-04-24T17:17:38,931][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOSLOGLINE"=>"%{NAGIOSTIME} (?:%{NAGIOS_WARNING}|%{NAGIOS_CURRENT_SERVICE_STATE}|%{NAGIOS_CURRENT_HOST_STATE}|%{NAGIOS_SERVICE_NOTIFICATION}|%{NAGIOS_HOST_NOTIFICATION}|%{NAGIOS_SERVICE_ALERT}|%{NAGIOS_HOST_ALERT}|%{NAGIOS_SERVICE_FLAPPING_ALERT}|%{NAGIOS_HOST_FLAPPING_ALERT}|%{NAGIOS_SERVICE_DOWNTIME_ALERT}|%{NAGIOS_HOST_DOWNTIME_ALERT}|%{NAGIOS_PASSIVE_SERVICE_CHECK}|%{NAGIOS_PASSIVE_HOST_CHECK}|%{NAGIOS_SERVICE_EVENT_HANDLER}|%{NAGIOS_HOST_EVENT_HANDLER}|%{NAGIOS_TIMEPERIOD_TRANSITION}|%{NAGIOS_EC_LINE_DISABLE_SVC_CHECK}|%{NAGIOS_EC_LINE_ENABLE_SVC_CHECK}|%{NAGIOS_EC_LINE_DISABLE_HOST_CHECK}|%{NAGIOS_EC_LINE_ENABLE_HOST_CHECK}|%{NAGIOS_EC_LINE_PROCESS_HOST_CHECK_RESULT}|%{NAGIOS_EC_LINE_PROCESS_SERVICE_CHECK_RESULT}|%{NAGIOS_EC_LINE_SCHEDULE_HOST_DOWNTIME}|%{NAGIOS_EC_LINE_DISABLE_HOST_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_HOST_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_DISABLE_HOST_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_HOST_NOTIFICATIONS}|%{NAGIOS_EC_LINE_DISABLE_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_SVC_NOTIFICATIONS})"}
[2018-04-24T17:17:38,932][DEBUG][logstash.filters.grok ] Adding pattern {"POSTGRESQL"=>"%{DATESTAMP:timestamp} %{TZ} %{DATA:user_id} %{GREEDYDATA:connection_id} %{POSINT:pid}"}
[2018-04-24T17:17:38,932][DEBUG][logstash.filters.grok ] Adding pattern {"RUUID"=>"\\h{32}"}
[2018-04-24T17:17:38,932][DEBUG][logstash.filters.grok ] Adding pattern {"RCONTROLLER"=>"(?<controller>[^#]+)#(?<action>\\w+)"}
[2018-04-24T17:17:38,932][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3HEAD"=>"(?m)Started %{WORD:verb} \"%{URIPATHPARAM:request}\" for %{IPORHOST:clientip} at (?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND} %{ISO8601_TIMEZONE})"}
[2018-04-24T17:17:38,932][DEBUG][logstash.filters.grok ] Adding pattern {"RPROCESSING"=>"\\W*Processing by %{RCONTROLLER} as (?<format>\\S+)(?:\\W*Parameters: {%{DATA:params}}\\W*)?"}
[2018-04-24T17:17:38,932][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3FOOT"=>"Completed %{NUMBER:response}%{DATA} in %{NUMBER:totalms}ms %{RAILS3PROFILE}%{GREEDYDATA}"}
[2018-04-24T17:17:38,932][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3PROFILE"=>"(?:\\(Views: %{NUMBER:viewms}ms \\| ActiveRecord: %{NUMBER:activerecordms}ms|\\(ActiveRecord: %{NUMBER:activerecordms}ms)?"}
[2018-04-24T17:17:38,932][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3"=>"%{RAILS3HEAD}(?:%{RPROCESSING})?(?<context>(?:%{DATA}\\n)*)(?:%{RAILS3FOOT})?"}
[2018-04-24T17:17:38,932][DEBUG][logstash.filters.grok ] Adding pattern {"REDISTIMESTAMP"=>"%{MONTHDAY} %{MONTH} %{TIME}"}
[2018-04-24T17:17:38,932][DEBUG][logstash.filters.grok ] Adding pattern {"REDISLOG"=>"\\[%{POSINT:pid}\\] %{REDISTIMESTAMP:timestamp} \\* "}
[2018-04-24T17:17:38,932][DEBUG][logstash.filters.grok ] Adding pattern {"REDISMONLOG"=>"%{NUMBER:timestamp} \\[%{INT:database} %{IP:client}:%{NUMBER:port}\\] \"%{WORD:command}\"\\s?%{GREEDYDATA:params}"}
[2018-04-24T17:17:38,932][DEBUG][logstash.filters.grok ] Adding pattern {"RUBY_LOGLEVEL"=>"(?:DEBUG|FATAL|ERROR|WARN|INFO)"}
[2018-04-24T17:17:38,932][DEBUG][logstash.filters.grok ] Adding pattern {"RUBY_LOGGER"=>"[DFEWI], \\[%{TIMESTAMP_ISO8601:timestamp} #%{POSINT:pid}\\] *%{RUBY_LOGLEVEL:loglevel} -- +%{DATA:progname}: %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,933][DEBUG][logstash.filters.grok ] Adding pattern {"SQUID3"=>"%{NUMBER:timestamp}\\s+%{NUMBER:duration}\\s%{IP:client_address}\\s%{WORD:cache_result}/%{POSINT:status_code}\\s%{NUMBER:bytes}\\s%{WORD:request_method}\\s%{NOTSPACE:url}\\s(%{NOTSPACE:user}|-)\\s%{WORD:hierarchy_code}/%{IPORHOST:server}\\s%{NOTSPACE:content_type}"}
[2018-04-24T17:17:38,933][DEBUG][logstash.filters.grok ] Adding pattern {"PAYLOAD"=>"[\\s\\S]*"}
[2018-04-24T17:17:38,933][DEBUG][logstash.filters.grok ] Adding pattern {"SPACE"=>"[ ]{1,}"}
[2018-04-24T17:17:38,933][DEBUG][logstash.filters.grok ] Adding pattern {"P_TIMESTAMP"=>"%{MONTH}\\s%{MONTHDAY},\\s%{YEAR}\\s%{TIME}\\s(AM|PM)"}
[2018-04-24T17:17:38,933][DEBUG][logstash.filters.grok ] Adding pattern {"LOGGINGSERVICEPREFIX"=>"[-]{12,18} Event Log Start Here [-]{12,18}\\\\n"}
[2018-04-24T17:17:38,933][DEBUG][logstash.filters.grok ] Adding pattern {"LOGGINGSERVICESUFFIX"=>"\\\\n[-]{12,18} Event Log End Here [-]{12,18}"}
[2018-04-24T17:17:38,933][DEBUG][logstash.filters.grok ] Adding pattern {"XLMLOGGING"=>"[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}:[0-9]{3,7}"}
[2018-04-24T17:17:38,933][DEBUG][logstash.filters.grok ] Adding pattern {"DATESWITHDOTS"=>"[0-9]{4}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{3,7}"}
[2018-04-24T17:17:38,933][DEBUG][logstash.filters.grok ] Adding pattern {"DATESWITHUNDERLINE"=>"[0-9]{4}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,7}"}
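The eight patterns registered just above (PAYLOAD, SPACE, P_TIMESTAMP, LOGGINGSERVICEPREFIX, LOGGINGSERVICESUFFIX, XLMLOGGING, DATESWITHDOTS, DATESWITHUNDERLINE) are not shipped in logstash-patterns-core, so they are presumably loaded from a user-supplied patterns directory such as the /usr/share/logstash/patterns/* or /etc/logstash/conf.d/patterns paths that appear in the "Grok patterns path" lines further down; note that this SPACE definition overrides the stock "\s*" registered earlier. As a minimal sketch only, a grok patterns file that would register these definitions could look like the following — the filename is hypothetical, and the regexes are copied from the debug lines above with one level of Ruby string escaping removed (the logger prints each backslash doubled):

    # extra_patterns -- hypothetical file inside a configured patterns_dir;
    # each line is "NAME <regex>", comment lines start with '#'
    PAYLOAD [\s\S]*
    SPACE [ ]{1,}
    P_TIMESTAMP %{MONTH}\s%{MONTHDAY},\s%{YEAR}\s%{TIME}\s(AM|PM)
    LOGGINGSERVICEPREFIX [-]{12,18} Event Log Start Here [-]{12,18}\\n
    LOGGINGSERVICESUFFIX \\n[-]{12,18} Event Log End Here [-]{12,18}
    XLMLOGGING [0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}:[0-9]{3,7}
    DATESWITHDOTS [0-9]{4}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{3,7}
    DATESWITHUNDERLINE [0-9]{4}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,7}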
[2018-04-24T17:17:38,933][DEBUG][logstash.filters.grok ] replacement_pattern => (?<GREEDYDATA:Message>.*)
[2018-04-24T17:17:38,933][DEBUG][logstash.filters.grok ] replacement_pattern => (?<TIMESTAMP_ISO8601:logtime>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?)
[2018-04-24T17:17:38,934][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?>\d\d){1,2})
[2018-04-24T17:17:38,934][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:0?[1-9]|1[0-2]))
[2018-04-24T17:17:38,934][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9]))
[2018-04-24T17:17:38,934][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:2[0123]|[01]?[0-9]))
[2018-04-24T17:17:38,934][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:[0-5][0-9]))
[2018-04-24T17:17:38,934][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?))
[2018-04-24T17:17:38,934][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:Z|[+-]%{HOUR}(?::?%{MINUTE})))
[2018-04-24T17:17:38,934][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:2[0123]|[01]?[0-9]))
[2018-04-24T17:17:38,934][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:[0-5][0-9]))
[2018-04-24T17:17:38,935][DEBUG][logstash.filters.grok ] Grok compiled OK {:pattern=>"%{GREEDYDATA:Message}Timestamp : %{TIMESTAMP_ISO8601:logtime}", :expanded_pattern=>"(?<GREEDYDATA:Message>.*)Timestamp : (?<TIMESTAMP_ISO8601:logtime>(?:(?>\\d\\d){1,2})-(?:(?:0?[1-9]|1[0-2]))-(?:(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9]))[T ](?:(?:2[0123]|[01]?[0-9])):?(?:(?:[0-5][0-9]))(?::?(?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)))?(?:(?:Z|[+-](?:(?:2[0123]|[01]?[0-9]))(?::?(?:(?:[0-5][0-9])))))?)"}
[2018-04-24T17:17:38,937][DEBUG][logstash.filters.grok ] Grok patterns path {:paths=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-patterns-core-4.1.2/patterns", "/usr/share/logstash/patterns/*"]}
[2018-04-24T17:17:38,937][DEBUG][logstash.filters.grok ] Grok patterns path {:paths=>["/etc/logstash/conf.d/patterns"]}
[2018-04-24T17:17:38,937][DEBUG][logstash.filters.grok ] Match data {:match=>{"message"=>"%{TIMESTAMP_ISO8601:logtime} %{GREEDYDATA:Message}"}}
[2018-04-24T17:17:38,938][DEBUG][logstash.filters.grok ] regexp: /message {:pattern=>"%{TIMESTAMP_ISO8601:logtime} %{GREEDYDATA:Message}"}
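The "Grok compiled OK", "Grok patterns path", and "Match data" entries above expose the two grok filters this pipeline is instantiating: one whose pattern ends in "Timestamp : %{TIMESTAMP_ISO8601:logtime}" (compiled at 17:17:38,935) and one that leads with the timestamp and pulls extra patterns from /etc/logstash/conf.d/patterns. Purely as a reconstruction from these debug lines — filter order, tags, and any surrounding conditionals are not visible in the log — the corresponding filter block would look roughly like:

    filter {
      grok {
        # first grok filter: message body first, trailing "Timestamp : <ISO8601>"
        # (this is the pattern reported in the "Grok compiled OK" line above)
        match => { "message" => "%{GREEDYDATA:Message}Timestamp : %{TIMESTAMP_ISO8601:logtime}" }
      }
      grok {
        # second grok filter: leading ISO8601 timestamp, then the rest of the line;
        # the extra directory matches the second "Grok patterns path" line above
        patterns_dir => ["/etc/logstash/conf.d/patterns"]
        match => { "message" => "%{TIMESTAMP_ISO8601:logtime} %{GREEDYDATA:Message}" }
      }
    }

On a match, the captured fields land in Message and logtime, which a later date filter could use to set @timestamp.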
[2018-04-24T17:17:38,938][DEBUG][logstash.filters.grok ] Adding pattern {"S3_REQUEST_LINE"=>"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})"}
[2018-04-24T17:17:38,938][DEBUG][logstash.filters.grok ] Adding pattern {"S3_ACCESS_LOG"=>"%{WORD:owner} %{NOTSPACE:bucket} \\[%{HTTPDATE:timestamp}\\] %{IP:clientip} %{NOTSPACE:requester} %{NOTSPACE:request_id} %{NOTSPACE:operation} %{NOTSPACE:key} (?:\"%{S3_REQUEST_LINE}\"|-) (?:%{INT:response:int}|-) (?:-|%{NOTSPACE:error_code}) (?:%{INT:bytes:int}|-) (?:%{INT:object_size:int}|-) (?:%{INT:request_time_ms:int}|-) (?:%{INT:turnaround_time_ms:int}|-) (?:%{QS:referrer}|-) (?:\"?%{QS:agent}\"?|-) (?:-|%{NOTSPACE:version_id})"}
[2018-04-24T17:17:38,938][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_URIPATHPARAM"=>"%{URIPATH:path}(?:%{URIPARAM:params})?"}
[2018-04-24T17:17:38,938][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_URI"=>"%{URIPROTO:proto}://(?:%{USER}(?::[^@]*)?@)?(?:%{URIHOST:urihost})?(?:%{ELB_URIPATHPARAM})?"}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_REQUEST_LINE"=>"(?:%{WORD:verb} %{ELB_URI:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})"}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"ELB_ACCESS_LOG"=>"%{TIMESTAMP_ISO8601:timestamp} %{NOTSPACE:elb} %{IP:clientip}:%{INT:clientport:int} (?:(%{IP:backendip}:?:%{INT:backendport:int})|-) %{NUMBER:request_processing_time:float} %{NUMBER:backend_processing_time:float} %{NUMBER:response_processing_time:float} %{INT:response:int} %{INT:backend_response:int} %{INT:received_bytes:int} %{INT:bytes:int} \"%{ELB_REQUEST_LINE}\""}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"CLOUDFRONT_ACCESS_LOG"=>"(?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}\\t%{TIME})\\t%{WORD:x_edge_location}\\t(?:%{NUMBER:sc_bytes:int}|-)\\t%{IPORHOST:clientip}\\t%{WORD:cs_method}\\t%{HOSTNAME:cs_host}\\t%{NOTSPACE:cs_uri_stem}\\t%{NUMBER:sc_status:int}\\t%{GREEDYDATA:referrer}\\t%{GREEDYDATA:agent}\\t%{GREEDYDATA:cs_uri_query}\\t%{GREEDYDATA:cookies}\\t%{WORD:x_edge_result_type}\\t%{NOTSPACE:x_edge_request_id}\\t%{HOSTNAME:x_host_header}\\t%{URIPROTO:cs_protocol}\\t%{INT:cs_bytes:int}\\t%{GREEDYDATA:time_taken:float}\\t%{GREEDYDATA:x_forwarded_for}\\t%{GREEDYDATA:ssl_protocol}\\t%{GREEDYDATA:ssl_cipher}\\t%{GREEDYDATA:x_edge_response_result_type}"}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_TIMESTAMP"=>"%{MONTHDAY}-%{MONTH} %{HOUR}:%{MINUTE}"}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_HOST"=>"[a-zA-Z0-9-]+"}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_VOLUME"=>"%{USER}"}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_DEVICE"=>"%{USER}"}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_DEVICEPATH"=>"%{UNIXPATH}"}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_CAPACITY"=>"%{INT}{1,3}(,%{INT}{3})*"}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_VERSION"=>"%{USER}"}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_JOB"=>"%{USER}"}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MAX_CAPACITY"=>"User defined maximum volume capacity %{BACULA_CAPACITY} exceeded on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\)"}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_END_VOLUME"=>"End of medium on Volume \\\"%{BACULA_VOLUME:volume}\\\" Bytes=%{BACULA_CAPACITY} Blocks=%{BACULA_CAPACITY} at %{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}:%{MINUTE}."}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_VOLUME"=>"Created new Volume \\\"%{BACULA_VOLUME:volume}\\\" in catalog."}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_LABEL"=>"Labeled new Volume \\\"%{BACULA_VOLUME:volume}\\\" on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\)."}
[2018-04-24T17:17:38,939][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_WROTE_LABEL"=>"Wrote label to prelabeled Volume \\\"%{BACULA_VOLUME:volume}\\\" on device \\\"%{BACULA_DEVICE}\\\" \\(%{BACULA_DEVICEPATH}\\)"}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NEW_MOUNT"=>"New volume \\\"%{BACULA_VOLUME:volume}\\\" mounted on device \\\"%{BACULA_DEVICE:device}\\\" \\(%{BACULA_DEVICEPATH}\\) at %{MONTHDAY}-%{MONTH}-%{YEAR} %{HOUR}:%{MINUTE}."}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOOPEN"=>"\\s+Cannot open %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOOPENDIR"=>"\\s+Could not open directory %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOSTAT"=>"\\s+Could not stat %{DATA}: ERR=%{GREEDYDATA:berror}"}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOJOBS"=>"There are no more Jobs associated with Volume \\\"%{BACULA_VOLUME:volume}\\\". Marking it purged."}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_ALL_RECORDS_PRUNED"=>"All records pruned from Volume \\\"%{BACULA_VOLUME:volume}\\\"; marking it \\\"Purged\\\""}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_BEGIN_PRUNE_JOBS"=>"Begin pruning Jobs older than %{INT} month %{INT} days ."}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_BEGIN_PRUNE_FILES"=>"Begin pruning Files."}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_PRUNED_JOBS"=>"Pruned %{INT} Jobs* for client %{BACULA_HOST:client} from catalog."}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_PRUNED_FILES"=>"Pruned Files from %{INT} Jobs* for client %{BACULA_HOST:client} from catalog."}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_ENDPRUNE"=>"End auto prune."}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_STARTJOB"=>"Start Backup JobId %{INT}, Job=%{BACULA_JOB:job}"}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_STARTRESTORE"=>"Start Restore Job %{BACULA_JOB:job}"}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_USEDEVICE"=>"Using Device \\\"%{BACULA_DEVICE:device}\\\""}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_DIFF_FS"=>"\\s+%{UNIXPATH} is a different filesystem. Will not descend from %{UNIXPATH} into it."}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_JOBEND"=>"Job write elapsed time = %{DATA:elapsed}, Transfer rate = %{NUMBER} (K|M|G)? Bytes/second"}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRUNE_JOBS"=>"No Jobs found to prune."}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRUNE_FILES"=>"No Files found to prune."}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_VOLUME_PREVWRITTEN"=>"Volume \\\"%{BACULA_VOLUME:volume}\\\" previously written, moving to end of data."}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_READYAPPEND"=>"Ready to append to end of Volume \\\"%{BACULA_VOLUME:volume}\\\" size=%{INT}"}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_CANCELLING"=>"Cancelling duplicate JobId=%{INT}."}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MARKCANCEL"=>"JobId %{INT}, Job %{BACULA_JOB:job} marked to be canceled."}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_CLIENT_RBJ"=>"shell command: run ClientRunBeforeJob \\\"%{GREEDYDATA:runjob}\\\""}
[2018-04-24T17:17:38,940][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_VSS"=>"(Generate )?VSS (Writer)?"}
[2018-04-24T17:17:38,941][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_MAXSTART"=>"Fatal error: Job canceled because max start delay time exceeded."}
[2018-04-24T17:17:38,941][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_DUPLICATE"=>"Fatal error: JobId %{INT:duplicate} already running. Duplicate job not allowed."}
[2018-04-24T17:17:38,941][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOJOBSTAT"=>"Fatal error: No Job status returned from FD."}
[2018-04-24T17:17:38,941][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_FATAL_CONN"=>"Fatal error: bsock.c:133 Unable to connect to (Client: %{BACULA_HOST:client}|Storage daemon) on %{HOSTNAME}:%{POSINT}. ERR=(?<berror>%{GREEDYDATA})"}
[2018-04-24T17:17:38,941][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NO_CONNECT"=>"Warning: bsock.c:127 Could not connect to (Client: %{BACULA_HOST:client}|Storage daemon) on %{HOSTNAME}:%{POSINT}. ERR=(?<berror>%{GREEDYDATA})"}
[2018-04-24T17:17:38,941][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NO_AUTH"=>"Fatal error: Unable to authenticate with File daemon at %{HOSTNAME}. Possible causes:"}
[2018-04-24T17:17:38,941][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOSUIT"=>"No prior or suitable Full backup found in catalog. Doing FULL backup."}
[2018-04-24T17:17:38,941][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_NOPRIOR"=>"No prior Full backup Job record found."}
[2018-04-24T17:17:38,941][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOG_JOB"=>"(Error: )?Bacula %{BACULA_HOST} %{BACULA_VERSION} \\(%{BACULA_VERSION}\\):"}
[2018-04-24T17:17:38,941][DEBUG][logstash.filters.grok ] Adding pattern {"BACULA_LOGLINE"=>"%{BACULA_TIMESTAMP:bts} %{BACULA_HOST:hostname} JobId %{INT:jobid}: (%{BACULA_LOG_MAX_CAPACITY}|%{BACULA_LOG_END_VOLUME}|%{BACULA_LOG_NEW_VOLUME}|%{BACULA_LOG_NEW_LABEL}|%{BACULA_LOG_WROTE_LABEL}|%{BACULA_LOG_NEW_MOUNT}|%{BACULA_LOG_NOOPEN}|%{BACULA_LOG_NOOPENDIR}|%{BACULA_LOG_NOSTAT}|%{BACULA_LOG_NOJOBS}|%{BACULA_LOG_ALL_RECORDS_PRUNED}|%{BACULA_LOG_BEGIN_PRUNE_JOBS}|%{BACULA_LOG_BEGIN_PRUNE_FILES}|%{BACULA_LOG_PRUNED_JOBS}|%{BACULA_LOG_PRUNED_FILES}|%{BACULA_LOG_ENDPRUNE}|%{BACULA_LOG_STARTJOB}|%{BACULA_LOG_STARTRESTORE}|%{BACULA_LOG_USEDEVICE}|%{BACULA_LOG_DIFF_FS}|%{BACULA_LOG_JOBEND}|%{BACULA_LOG_NOPRUNE_JOBS}|%{BACULA_LOG_NOPRUNE_FILES}|%{BACULA_LOG_VOLUME_PREVWRITTEN}|%{BACULA_LOG_READYAPPEND}|%{BACULA_LOG_CANCELLING}|%{BACULA_LOG_MARKCANCEL}|%{BACULA_LOG_CLIENT_RBJ}|%{BACULA_LOG_VSS}|%{BACULA_LOG_MAXSTART}|%{BACULA_LOG_DUPLICATE}|%{BACULA_LOG_NOJOBSTAT}|%{BACULA_LOG_FATAL_CONN}|%{BACULA_LOG_NO_CONNECT}|%{BACULA_LOG_NO_AUTH}|%{BACULA_LOG_NOSUIT}|%{BACULA_LOG_JOB}|%{BACULA_LOG_NOPRIOR})"}
[2018-04-24T17:17:38,941][DEBUG][logstash.filters.grok ] Adding pattern {"BIND9_TIMESTAMP"=>"%{MONTHDAY}[-]%{MONTH}[-]%{YEAR} %{TIME}"}
[2018-04-24T17:17:38,941][DEBUG][logstash.filters.grok ] Adding pattern {"BIND9"=>"%{BIND9_TIMESTAMP:timestamp} queries: %{LOGLEVEL:loglevel}: client %{IP:clientip}#%{POSINT:clientport} \\(%{GREEDYDATA:query}\\): query: %{GREEDYDATA:query} IN %{GREEDYDATA:querytype} \\(%{IP:dns}\\)"}
[2018-04-24T17:17:38,942][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_HTTP"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{INT:trans_depth}\\t%{GREEDYDATA:method}\\t%{GREEDYDATA:domain}\\t%{GREEDYDATA:uri}\\t%{GREEDYDATA:referrer}\\t%{GREEDYDATA:user_agent}\\t%{NUMBER:request_body_len}\\t%{NUMBER:response_body_len}\\t%{GREEDYDATA:status_code}\\t%{GREEDYDATA:status_msg}\\t%{GREEDYDATA:info_code}\\t%{GREEDYDATA:info_msg}\\t%{GREEDYDATA:filename}\\t%{GREEDYDATA:bro_tags}\\t%{GREEDYDATA:username}\\t%{GREEDYDATA:password}\\t%{GREEDYDATA:proxied}\\t%{GREEDYDATA:orig_fuids}\\t%{GREEDYDATA:orig_mime_types}\\t%{GREEDYDATA:resp_fuids}\\t%{GREEDYDATA:resp_mime_types}"}
[2018-04-24T17:17:38,942][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_DNS"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{WORD:proto}\\t%{INT:trans_id}\\t%{GREEDYDATA:query}\\t%{GREEDYDATA:qclass}\\t%{GREEDYDATA:qclass_name}\\t%{GREEDYDATA:qtype}\\t%{GREEDYDATA:qtype_name}\\t%{GREEDYDATA:rcode}\\t%{GREEDYDATA:rcode_name}\\t%{GREEDYDATA:AA}\\t%{GREEDYDATA:TC}\\t%{GREEDYDATA:RD}\\t%{GREEDYDATA:RA}\\t%{GREEDYDATA:Z}\\t%{GREEDYDATA:answers}\\t%{GREEDYDATA:TTLs}\\t%{GREEDYDATA:rejected}"}
[2018-04-24T17:17:38,942][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_CONN"=>"%{NUMBER:ts}\\t%{NOTSPACE:uid}\\t%{IP:orig_h}\\t%{INT:orig_p}\\t%{IP:resp_h}\\t%{INT:resp_p}\\t%{WORD:proto}\\t%{GREEDYDATA:service}\\t%{NUMBER:duration}\\t%{NUMBER:orig_bytes}\\t%{NUMBER:resp_bytes}\\t%{GREEDYDATA:conn_state}\\t%{GREEDYDATA:local_orig}\\t%{GREEDYDATA:missed_bytes}\\t%{GREEDYDATA:history}\\t%{GREEDYDATA:orig_pkts}\\t%{GREEDYDATA:orig_ip_bytes}\\t%{GREEDYDATA:resp_pkts}\\t%{GREEDYDATA:resp_ip_bytes}\\t%{GREEDYDATA:tunnel_parents}"}
[2018-04-24T17:17:38,942][DEBUG][logstash.filters.grok ] Adding pattern {"BRO_FILES"=>"%{NUMBER:ts}\\t%{NOTSPACE:fuid}\\t%{IP:tx_hosts}\\t%{IP:rx_hosts}\\t%{NOTSPACE:conn_uids}\\t%{GREEDYDATA:source}\\t%{GREEDYDATA:depth}\\t%{GREEDYDATA:analyzers}\\t%{GREEDYDATA:mime_type}\\t%{GREEDYDATA:filename}\\t%{GREEDYDATA:duration}\\t%{GREEDYDATA:local_orig}\\t%{GREEDYDATA:is_orig}\\t%{GREEDYDATA:seen_bytes}\\t%{GREEDYDATA:total_bytes}\\t%{GREEDYDATA:missing_bytes}\\t%{GREEDYDATA:overflow_bytes}\\t%{GREEDYDATA:timedout}\\t%{GREEDYDATA:parent_fuid}\\t%{GREEDYDATA:md5}\\t%{GREEDYDATA:sha1}\\t%{GREEDYDATA:sha256}\\t%{GREEDYDATA:extracted}"}
[2018-04-24T17:17:38,942][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_MSGID"=>"[0-9A-Za-z]{6}-[0-9A-Za-z]{6}-[0-9A-Za-z]{2}"}
[2018-04-24T17:17:38,942][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_FLAGS"=>"(<=|[-=>*]>|[*]{2}|==)"}
[2018-04-24T17:17:38,942][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_DATE"=>"%{YEAR:exim_year}-%{MONTHNUM:exim_month}-%{MONTHDAY:exim_day} %{TIME:exim_time}"}
[2018-04-24T17:17:38,942][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_PID"=>"\\[%{POSINT}\\]"}
[2018-04-24T17:17:38,942][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_QT"=>"((\\d+y)?(\\d+w)?(\\d+d)?(\\d+h)?(\\d+m)?(\\d+s)?)"}
[2018-04-24T17:17:38,942][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_EXCLUDE_TERMS"=>"(Message is frozen|(Start|End) queue run| Warning: | retry time not reached | no (IP address|host name) found for (IP address|host) | unexpected disconnection while reading SMTP command | no immediate delivery: |another process is handling this message)"}
[2018-04-24T17:17:38,942][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_REMOTE_HOST"=>"(H=(%{NOTSPACE:remote_hostname} )?(\\(%{NOTSPACE:remote_heloname}\\) )?\\[%{IP:remote_host}\\])"}
[2018-04-24T17:17:38,942][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_INTERFACE"=>"(I=\\[%{IP:exim_interface}\\](:%{NUMBER:exim_interface_port}))"}
[2018-04-24T17:17:38,942][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_PROTOCOL"=>"(P=%{NOTSPACE:protocol})"}
[2018-04-24T17:17:38,942][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_MSG_SIZE"=>"(S=%{NUMBER:exim_msg_size})"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_HEADER_ID"=>"(id=%{NOTSPACE:exim_header_id})"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"EXIM_SUBJECT"=>"(T=%{QS:exim_subject})"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"NETSCREENSESSIONLOG"=>"%{SYSLOGTIMESTAMP:date} %{IPORHOST:device} %{IPORHOST}: NetScreen device_id=%{WORD:device_id}%{DATA}: start_time=%{QUOTEDSTRING:start_time} duration=%{INT:duration} policy_id=%{INT:policy_id} service=%{DATA:service} proto=%{INT:proto} src zone=%{WORD:src_zone} dst zone=%{WORD:dst_zone} action=%{WORD:action} sent=%{INT:sent} rcvd=%{INT:rcvd} src=%{IPORHOST:src_ip} dst=%{IPORHOST:dst_ip} src_port=%{INT:src_port} dst_port=%{INT:dst_port} src-xlated ip=%{IPORHOST:src_xlated_ip} port=%{INT:src_xlated_port} dst-xlated ip=%{IPORHOST:dst_xlated_ip} port=%{INT:dst_xlated_port} session_id=%{INT:session_id} reason=%{GREEDYDATA:reason}"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_TAGGED_SYSLOG"=>"^<%{POSINT:syslog_pri}>%{CISCOTIMESTAMP:timestamp}( %{SYSLOGHOST:sysloghost})? ?: %%{CISCOTAG:ciscotag}:"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOTIMESTAMP"=>"%{MONTH} +%{MONTHDAY}(?: %{YEAR})? %{TIME}"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOTAG"=>"[A-Z0-9]+-%{INT}-(?:[A-Z0-9_]+)"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_ACTION"=>"Built|Teardown|Deny|Denied|denied|requested|permitted|denied by ACL|discarded|est-allowed|Dropping|created|deleted"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_REASON"=>"Duplicate TCP SYN|Failed to locate egress interface|Invalid transport field|No matching connection|DNS Response|DNS Query|(?:%{WORD}\\s*)*"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_DIRECTION"=>"Inbound|inbound|Outbound|outbound"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_INTERVAL"=>"first hit|%{INT}-second interval"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"CISCO_XLATE_TYPE"=>"static|dynamic"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104001"=>"\\((?:Primary|Secondary)\\) Switching to ACTIVE - %{GREEDYDATA:switch_reason}"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104002"=>"\\((?:Primary|Secondary)\\) Switching to STANDBY - %{GREEDYDATA:switch_reason}"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104003"=>"\\((?:Primary|Secondary)\\) Switching to FAILED\\."}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW104004"=>"\\((?:Primary|Secondary)\\) Switching to OK\\."}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105003"=>"\\((?:Primary|Secondary)\\) Monitoring on [Ii]nterface %{GREEDYDATA:interface_name} waiting"}
[2018-04-24T17:17:38,943][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105004"=>"\\((?:Primary|Secondary)\\) Monitoring on [Ii]nterface %{GREEDYDATA:interface_name} normal"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105005"=>"\\((?:Primary|Secondary)\\) Lost Failover communications with mate on [Ii]nterface %{GREEDYDATA:interface_name}"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105008"=>"\\((?:Primary|Secondary)\\) Testing [Ii]nterface %{GREEDYDATA:interface_name}"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW105009"=>"\\((?:Primary|Secondary)\\) Testing on [Ii]nterface %{GREEDYDATA:interface_name} (?:Passed|Failed)"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106001"=>"%{CISCO_DIRECTION:direction} %{WORD:protocol} connection %{CISCO_ACTION:action} from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port} flags %{GREEDYDATA:tcp_flags} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106006_106007_106010"=>"%{CISCO_ACTION:action} %{CISCO_DIRECTION:direction} %{WORD:protocol} (?:from|src) %{IP:src_ip}/%{INT:src_port}(\\(%{DATA:src_fwuser}\\))? (?:to|dst) %{IP:dst_ip}/%{INT:dst_port}(\\(%{DATA:dst_fwuser}\\))? (?:on interface %{DATA:interface}|due to %{CISCO_REASON:reason})"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106014"=>"%{CISCO_ACTION:action} %{CISCO_DIRECTION:direction} %{WORD:protocol} src %{DATA:src_interface}:%{IP:src_ip}(\\(%{DATA:src_fwuser}\\))? dst %{DATA:dst_interface}:%{IP:dst_ip}(\\(%{DATA:dst_fwuser}\\))? \\(type %{INT:icmp_type}, code %{INT:icmp_code}\\)"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106015"=>"%{CISCO_ACTION:action} %{WORD:protocol} \\(%{DATA:policy_id}\\) from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port} flags %{DATA:tcp_flags} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106021"=>"%{CISCO_ACTION:action} %{WORD:protocol} reverse path check from %{IP:src_ip} to %{IP:dst_ip} on interface %{GREEDYDATA:interface}"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106023"=>"%{CISCO_ACTION:action}( protocol)? %{WORD:protocol} src %{DATA:src_interface}:%{DATA:src_ip}(/%{INT:src_port})?(\\(%{DATA:src_fwuser}\\))? dst %{DATA:dst_interface}:%{DATA:dst_ip}(/%{INT:dst_port})?(\\(%{DATA:dst_fwuser}\\))?( \\(type %{INT:icmp_type}, code %{INT:icmp_code}\\))? by access-group \"?%{DATA:policy_id}\"? \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106100_2_3"=>"access-list %{NOTSPACE:policy_id} %{CISCO_ACTION:action} %{WORD:protocol} for user '%{DATA:src_fwuser}' %{DATA:src_interface}/%{IP:src_ip}\\(%{INT:src_port}\\) -> %{DATA:dst_interface}/%{IP:dst_ip}\\(%{INT:dst_port}\\) hit-cnt %{INT:hit_count} %{CISCO_INTERVAL:interval} \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW106100"=>"access-list %{NOTSPACE:policy_id} %{CISCO_ACTION:action} %{WORD:protocol} %{DATA:src_interface}/%{IP:src_ip}\\(%{INT:src_port}\\)(\\(%{DATA:src_fwuser}\\))? -> %{DATA:dst_interface}/%{IP:dst_ip}\\(%{INT:dst_port}\\)(\\(%{DATA:src_fwuser}\\))? hit-cnt %{INT:hit_count} %{CISCO_INTERVAL:interval} \\[%{DATA:hashcode1}, %{DATA:hashcode2}\\]"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW304001"=>"%{IP:src_ip}(\\(%{DATA:src_fwuser}\\))? Accessed URL %{IP:dst_ip}:%{GREEDYDATA:dst_url}"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW110002"=>"%{CISCO_REASON:reason} for %{WORD:protocol} from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302010"=>"%{INT:connection_count} in use, %{INT:connection_count_max} most used"}
[2018-04-24T17:17:38,944][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302013_302014_302015_302016"=>"%{CISCO_ACTION:action}(?: %{CISCO_DIRECTION:direction})? %{WORD:protocol} connection %{INT:connection_id} for %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port}( \\(%{IP:src_mapped_ip}/%{INT:src_mapped_port}\\))?(\\(%{DATA:src_fwuser}\\))? to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}( \\(%{IP:dst_mapped_ip}/%{INT:dst_mapped_port}\\))?(\\(%{DATA:dst_fwuser}\\))?( duration %{TIME:duration} bytes %{INT:bytes})?(?: %{CISCO_REASON:reason})?( \\(%{DATA:user}\\))?"}
[2018-04-24T17:17:38,945][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW302020_302021"=>"%{CISCO_ACTION:action}(?: %{CISCO_DIRECTION:direction})? %{WORD:protocol} connection for faddr %{IP:dst_ip}/%{INT:icmp_seq_num}(?:\\(%{DATA:fwuser}\\))? gaddr %{IP:src_xlated_ip}/%{INT:icmp_code_xlated} laddr %{IP:src_ip}/%{INT:icmp_code}( \\(%{DATA:user}\\))?"}
[2018-04-24T17:17:38,945][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW305011"=>"%{CISCO_ACTION:action} %{CISCO_XLATE_TYPE:xlate_type} %{WORD:protocol} translation from %{DATA:src_interface}:%{IP:src_ip}(/%{INT:src_port})?(\\(%{DATA:src_fwuser}\\))? to %{DATA:src_xlated_interface}:%{IP:src_xlated_ip}/%{DATA:src_xlated_port}"}
[2018-04-24T17:17:38,945][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW313001_313004_313008"=>"%{CISCO_ACTION:action} %{WORD:protocol} type=%{INT:icmp_type}, code=%{INT:icmp_code} from %{IP:src_ip} on interface %{DATA:interface}( to %{IP:dst_ip})?"}
[2018-04-24T17:17:38,945][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW313005"=>"%{CISCO_REASON:reason} for %{WORD:protocol} error message: %{WORD:err_protocol} src %{DATA:err_src_interface}:%{IP:err_src_ip}(\\(%{DATA:err_src_fwuser}\\))? dst %{DATA:err_dst_interface}:%{IP:err_dst_ip}(\\(%{DATA:err_dst_fwuser}\\))? \\(type %{INT:err_icmp_type}, code %{INT:err_icmp_code}\\) on %{DATA:interface} interface\\. Original IP payload: %{WORD:protocol} src %{IP:orig_src_ip}/%{INT:orig_src_port}(\\(%{DATA:orig_src_fwuser}\\))? dst %{IP:orig_dst_ip}/%{INT:orig_dst_port}(\\(%{DATA:orig_dst_fwuser}\\))?"}
[2018-04-24T17:17:38,945][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW321001"=>"Resource '%{WORD:resource_name}' limit of %{POSINT:resource_limit} reached for system"}
[2018-04-24T17:17:38,945][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW402117"=>"%{WORD:protocol}: Received a non-IPSec packet \\(protocol= %{WORD:orig_protocol}\\) from %{IP:src_ip} to %{IP:dst_ip}"}
[2018-04-24T17:17:38,945][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW402119"=>"%{WORD:protocol}: Received an %{WORD:orig_protocol} packet \\(SPI= %{DATA:spi}, sequence number= %{DATA:seq_num}\\) from %{IP:src_ip} \\(user= %{DATA:user}\\) to %{IP:dst_ip} that failed anti-replay checking"}
[2018-04-24T17:17:38,945][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW419001"=>"%{CISCO_ACTION:action} %{WORD:protocol} packet from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}, reason: %{GREEDYDATA:reason}"}
[2018-04-24T17:17:38,945][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW419002"=>"%{CISCO_REASON:reason} from %{DATA:src_interface}:%{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port} with different initial sequence number"}
[2018-04-24T17:17:38,945][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW500004"=>"%{CISCO_REASON:reason} for protocol=%{WORD:protocol}, from %{IP:src_ip}/%{INT:src_port} to %{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,945][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW602303_602304"=>"%{WORD:protocol}: An %{CISCO_DIRECTION:direction} %{GREEDYDATA:tunnel_type} SA \\(SPI= %{DATA:spi}\\) between %{IP:src_ip} and %{IP:dst_ip} \\(user= %{DATA:user}\\) has been %{CISCO_ACTION:action}"}
[2018-04-24T17:17:38,945][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW710001_710002_710003_710005_710006"=>"%{WORD:protocol} (?:request|access) %{CISCO_ACTION:action} from %{IP:src_ip}/%{INT:src_port} to %{DATA:dst_interface}:%{IP:dst_ip}/%{INT:dst_port}"}
[2018-04-24T17:17:38,945][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW713172"=>"Group = %{GREEDYDATA:group}, IP = %{IP:src_ip}, Automatic NAT Detection Status:\\s+Remote end\\s*%{DATA:is_remote_natted}\\s*behind a NAT device\\s+This\\s+end\\s*%{DATA:is_local_natted}\\s*behind a NAT device"}
[2018-04-24T17:17:38,945][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOFW733100"=>"\\[\\s*%{DATA:drop_type}\\s*\\] drop %{DATA:drop_rate_id} exceeded. Current burst rate is %{INT:drop_rate_current_burst} per second, max configured rate is %{INT:drop_rate_max_burst}; Current average rate is %{INT:drop_rate_current_avg} per second, max configured rate is %{INT:drop_rate_max_avg}; Cumulative total count is %{INT:drop_total_count}"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"SHOREWALL"=>"(%{SYSLOGTIMESTAMP:timestamp}) (%{WORD:nf_host}) kernel:.*Shorewall:(%{WORD:nf_action1})?:(%{WORD:nf_action2})?.*IN=(%{USERNAME:nf_in_interface})?.*(OUT= *MAC=(%{COMMONMAC:nf_dst_mac}):(%{COMMONMAC:nf_src_mac})?|OUT=%{USERNAME:nf_out_interface}).*SRC=(%{IPV4:nf_src_ip}).*DST=(%{IPV4:nf_dst_ip}).*LEN=(%{WORD:nf_len}).?*TOS=(%{WORD:nf_tos}).?*PREC=(%{WORD:nf_prec}).?*TTL=(%{INT:nf_ttl}).?*ID=(%{INT:nf_id}).?*PROTO=(%{WORD:nf_protocol}).?*SPT=(%{INT:nf_src_port}?.*DPT=%{INT:nf_dst_port}?.*)"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"SFW2"=>"((%{SYSLOGTIMESTAMP})|(%{TIMESTAMP_ISO8601}))\\s*%{HOSTNAME}\\s*kernel\\S+\\s*%{NAGIOSTIME}\\s*SFW2\\-INext\\-%{NOTSPACE:nf_action}\\s*IN=%{USERNAME:nf_in_interface}.*OUT=((\\s*%{USERNAME:nf_out_interface})|(\\s*))MAC=((%{COMMONMAC:nf_dst_mac}:%{COMMONMAC:nf_src_mac})|(\\s*)).*SRC=%{IP:nf_src_ip}\\s*DST=%{IP:nf_dst_ip}.*PROTO=%{WORD:nf_protocol}((.*SPT=%{INT:nf_src_port}.*DPT=%{INT:nf_dst_port}.*)|())"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"USERNAME"=>"[a-zA-Z0-9._-]+"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"USER"=>"%{USERNAME}"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"EMAILLOCALPART"=>"[a-zA-Z][a-zA-Z0-9_.+-=:]+"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"EMAILADDRESS"=>"%{EMAILLOCALPART}@%{HOSTNAME}"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"INT"=>"(?:[+-]?(?:[0-9]+))"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"BASE10NUM"=>"(?<![0-9.+-])(?>[+-]?(?:(?:[0-9]+(?:\\.[0-9]+)?)|(?:\\.[0-9]+)))"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"NUMBER"=>"(?:%{BASE10NUM})"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"BASE16NUM"=>"(?<![0-9A-Fa-f])(?:[+-]?(?:0x)?(?:[0-9A-Fa-f]+))"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"BASE16FLOAT"=>"\\b(?<![0-9A-Fa-f.])(?:[+-]?(?:0x)?(?:(?:[0-9A-Fa-f]+(?:\\.[0-9A-Fa-f]*)?)|(?:\\.[0-9A-Fa-f]+)))\\b"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"POSINT"=>"\\b(?:[1-9][0-9]*)\\b"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"NONNEGINT"=>"\\b(?:[0-9]+)\\b"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"WORD"=>"\\b\\w+\\b"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"NOTSPACE"=>"\\S+"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"SPACE"=>"\\s*"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"DATA"=>".*?"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"GREEDYDATA"=>".*"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"QUOTEDSTRING"=>"(?>(?<!\\\\)(?>\"(?>\\\\.|[^\\\\\"]+)+\"|\"\"|(?>'(?>\\\\.|[^\\\\']+)+')|''|(?>`(?>\\\\.|[^\\\\`]+)+`)|``))"}
[2018-04-24T17:17:38,946][DEBUG][logstash.filters.grok ] Adding pattern {"UUID"=>"[A-Fa-f0-9]{8}-(?:[A-Fa-f0-9]{4}-){3}[A-Fa-f0-9]{12}"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"URN"=>"urn:[0-9A-Za-z][0-9A-Za-z-]{0,31}:(?:%[0-9a-fA-F]{2}|[0-9A-Za-z()+,.:=@;$_!*'/?#-])+"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"MAC"=>"(?:%{CISCOMAC}|%{WINDOWSMAC}|%{COMMONMAC})"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"CISCOMAC"=>"(?:(?:[A-Fa-f0-9]{4}\\.){2}[A-Fa-f0-9]{4})"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"WINDOWSMAC"=>"(?:(?:[A-Fa-f0-9]{2}-){5}[A-Fa-f0-9]{2})"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"COMMONMAC"=>"(?:(?:[A-Fa-f0-9]{2}:){5}[A-Fa-f0-9]{2})"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"IPV6"=>"((([0-9A-Fa-f]{1,4}:){7}([0-9A-Fa-f]{1,4}|:))|(([0-9A-Fa-f]{1,4}:){6}(:[0-9A-Fa-f]{1,4}|((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){5}(((:[0-9A-Fa-f]{1,4}){1,2})|:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3})|:))|(([0-9A-Fa-f]{1,4}:){4}(((:[0-9A-Fa-f]{1,4}){1,3})|((:[0-9A-Fa-f]{1,4})?:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){3}(((:[0-9A-Fa-f]{1,4}){1,4})|((:[0-9A-Fa-f]{1,4}){0,2}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){2}(((:[0-9A-Fa-f]{1,4}){1,5})|((:[0-9A-Fa-f]{1,4}){0,3}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(([0-9A-Fa-f]{1,4}:){1}(((:[0-9A-Fa-f]{1,4}){1,6})|((:[0-9A-Fa-f]{1,4}){0,4}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:))|(:(((:[0-9A-Fa-f]{1,4}){1,7})|((:[0-9A-Fa-f]{1,4}){0,5}:((25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)(\\.(25[0-5]|2[0-4]\\d|1\\d\\d|[1-9]?\\d)){3}))|:)))(%.+)?"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"IPV4"=>"(?<![0-9])(?:(?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5])[.](?:[0-1]?[0-9]{1,2}|2[0-4][0-9]|25[0-5]))(?![0-9])"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"IP"=>"(?:%{IPV6}|%{IPV4})"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"HOSTNAME"=>"\\b(?:[0-9A-Za-z][0-9A-Za-z-]{0,62})(?:\\.(?:[0-9A-Za-z][0-9A-Za-z-]{0,62}))*(\\.?|\\b)"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"IPORHOST"=>"(?:%{IP}|%{HOSTNAME})"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"HOSTPORT"=>"%{IPORHOST}:%{POSINT}"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"PATH"=>"(?:%{UNIXPATH}|%{WINPATH})"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"UNIXPATH"=>"(/([\\w_%!$@:.,+~-]+|\\\\.)*)+"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"TTY"=>"(?:/dev/(pts|tty([pq])?)(\\w+)?/?(?:[0-9]+))"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"WINPATH"=>"(?>[A-Za-z]+:|\\\\)(?:\\\\[^\\\\?*]*)+"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"URIPROTO"=>"[A-Za-z]([A-Za-z0-9+\\-.]+)+"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"URIHOST"=>"%{IPORHOST}(?::%{POSINT:port})?"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"URIPATH"=>"(?:/[A-Za-z0-9$.+!*'(){},~:;=@#%&_\\-]*)+"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"URIPARAM"=>"\\?[A-Za-z0-9$.+!*'|(){},~@#%&/=:;_?\\-\\[\\]<>]*"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"URIPATHPARAM"=>"%{URIPATH}(?:%{URIPARAM})?"}
[2018-04-24T17:17:38,947][DEBUG][logstash.filters.grok ] Adding pattern {"URI"=>"%{URIPROTO}://(?:%{USER}(?::[^@]*)?@)?(?:%{URIHOST})?(?:%{URIPATHPARAM})?"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"MONTH"=>"\\b(?:[Jj]an(?:uary|uar)?|[Ff]eb(?:ruary|ruar)?|[Mm](?:a|ä)?r(?:ch|z)?|[Aa]pr(?:il)?|[Mm]a(?:y|i)?|[Jj]un(?:e|i)?|[Jj]ul(?:y)?|[Aa]ug(?:ust)?|[Ss]ep(?:tember)?|[Oo](?:c|k)?t(?:ober)?|[Nn]ov(?:ember)?|[Dd]e(?:c|z)(?:ember)?)\\b"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHNUM"=>"(?:0?[1-9]|1[0-2])"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHNUM2"=>"(?:0[1-9]|1[0-2])"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"MONTHDAY"=>"(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9])"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"DAY"=>"(?:Mon(?:day)?|Tue(?:sday)?|Wed(?:nesday)?|Thu(?:rsday)?|Fri(?:day)?|Sat(?:urday)?|Sun(?:day)?)"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"YEAR"=>"(?>\\d\\d){1,2}"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"HOUR"=>"(?:2[0123]|[01]?[0-9])"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"MINUTE"=>"(?:[0-5][0-9])"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"SECOND"=>"(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"TIME"=>"(?!<[0-9])%{HOUR}:%{MINUTE}(?::%{SECOND})(?![0-9])"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"DATE_US"=>"%{MONTHNUM}[/-]%{MONTHDAY}[/-]%{YEAR}"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"DATE_EU"=>"%{MONTHDAY}[./-]%{MONTHNUM}[./-]%{YEAR}"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"ISO8601_TIMEZONE"=>"(?:Z|[+-]%{HOUR}(?::?%{MINUTE}))"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"ISO8601_SECOND"=>"(?:%{SECOND}|60)"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"TIMESTAMP_ISO8601"=>"%{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"DATE"=>"%{DATE_US}|%{DATE_EU}"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP"=>"%{DATE}[- ]%{TIME}"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"TZ"=>"(?:[APMCE][SD]T|UTC)"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_RFC822"=>"%{DAY} %{MONTH} %{MONTHDAY} %{YEAR} %{TIME} %{TZ}"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_RFC2822"=>"%{DAY}, %{MONTHDAY} %{MONTH} %{YEAR} %{TIME} %{ISO8601_TIMEZONE}"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_OTHER"=>"%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{TZ} %{YEAR}"}
[2018-04-24T17:17:38,948][DEBUG][logstash.filters.grok ] Adding pattern {"DATESTAMP_EVENTLOG"=>"%{YEAR}%{MONTHNUM2}%{MONTHDAY}%{HOUR}%{MINUTE}%{SECOND}"}
[2018-04-24T17:17:38,949][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGTIMESTAMP"=>"%{MONTH} +%{MONTHDAY} %{TIME}"}
[2018-04-24T17:17:38,949][DEBUG][logstash.filters.grok ] Adding pattern {"PROG"=>"[\\x21-\\x5a\\x5c\\x5e-\\x7e]+"}
[2018-04-24T17:17:38,949][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGPROG"=>"%{PROG:program}(?:\\[%{POSINT:pid}\\])?"}
[2018-04-24T17:17:38,949][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGHOST"=>"%{IPORHOST}"}
[2018-04-24T17:17:38,949][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGFACILITY"=>"<%{NONNEGINT:facility}.%{NONNEGINT:priority}>"}
[2018-04-24T17:17:38,949][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDATE"=>"%{MONTHDAY}/%{MONTH}/%{YEAR}:%{TIME} %{INT}"}
[2018-04-24T17:17:38,949][DEBUG][logstash.filters.grok ] Adding pattern {"QS"=>"%{QUOTEDSTRING}"}
[2018-04-24T17:17:38,949][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGBASE"=>"%{SYSLOGTIMESTAMP:timestamp} (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource} %{SYSLOGPROG}:"}
[2018-04-24T17:17:38,949][DEBUG][logstash.filters.grok ] Adding pattern {"LOGLEVEL"=>"([Aa]lert|ALERT|[Tt]race|TRACE|[Dd]ebug|DEBUG|[Nn]otice|NOTICE|[Ii]nfo|INFO|[Ww]arn?(?:ing)?|WARN?(?:ING)?|[Ee]rr?(?:or)?|ERR?(?:OR)?|[Cc]rit?(?:ical)?|CRIT?(?:ICAL)?|[Ff]atal|FATAL|[Ss]evere|SEVERE|EMERG(?:ENCY)?|[Ee]merg(?:ency)?)"}
[2018-04-24T17:17:38,949][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYTIME"=>"(?!<[0-9])%{HOUR:haproxy_hour}:%{MINUTE:haproxy_minute}(?::%{SECOND:haproxy_second})(?![0-9])"}
[2018-04-24T17:17:38,949][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYDATE"=>"%{MONTHDAY:haproxy_monthday}/%{MONTH:haproxy_month}/%{YEAR:haproxy_year}:%{HAPROXYTIME:haproxy_time}.%{INT:haproxy_milliseconds}"}
[2018-04-24T17:17:38,949][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYCAPTUREDREQUESTHEADERS"=>"%{DATA:captured_request_headers}"}
[2018-04-24T17:17:38,949][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYCAPTUREDRESPONSEHEADERS"=>"%{DATA:captured_response_headers}"}
[2018-04-24T17:17:38,949][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYHTTPBASE"=>"%{IP:client_ip}:%{INT:client_port} \\[%{HAPROXYDATE:accept_date}\\] %{NOTSPACE:frontend_name} %{NOTSPACE:backend_name}/%{NOTSPACE:server_name} %{INT:time_request}/%{INT:time_queue}/%{INT:time_backend_connect}/%{INT:time_backend_response}/%{NOTSPACE:time_duration} %{INT:http_status_code} %{NOTSPACE:bytes_read} %{DATA:captured_request_cookie} %{DATA:captured_response_cookie} %{NOTSPACE:termination_state} %{INT:actconn}/%{INT:feconn}/%{INT:beconn}/%{INT:srvconn}/%{NOTSPACE:retries} %{INT:srv_queue}/%{INT:backend_queue} (\\{%{HAPROXYCAPTUREDREQUESTHEADERS}\\})?( )?(\\{%{HAPROXYCAPTUREDRESPONSEHEADERS}\\})?( )?\"(<BADREQ>|(%{WORD:http_verb} (%{URIPROTO:http_proto}://)?(?:%{USER:http_user}(?::[^@]*)?@)?(?:%{URIHOST:http_host})?(?:%{URIPATHPARAM:http_request})?( HTTP/%{NUMBER:http_version})?))?\""}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYHTTP"=>"(?:%{SYSLOGTIMESTAMP:syslog_timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) %{IPORHOST:syslog_server} %{SYSLOGPROG}: %{HAPROXYHTTPBASE}"}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"HAPROXYTCP"=>"(?:%{SYSLOGTIMESTAMP:syslog_timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) %{IPORHOST:syslog_server} %{SYSLOGPROG}: %{IP:client_ip}:%{INT:client_port} \\[%{HAPROXYDATE:accept_date}\\] %{NOTSPACE:frontend_name} %{NOTSPACE:backend_name}/%{NOTSPACE:server_name} %{INT:time_queue}/%{INT:time_backend_connect}/%{NOTSPACE:time_duration} %{NOTSPACE:bytes_read} %{NOTSPACE:termination_state} %{INT:actconn}/%{INT:feconn}/%{INT:beconn}/%{INT:srvconn}/%{NOTSPACE:retries} %{INT:srv_queue}/%{INT:backend_queue}"}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDUSER"=>"%{EMAILADDRESS}|%{USER}"}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPDERROR_DATE"=>"%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{YEAR}"}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_COMMONLOG"=>"%{IPORHOST:clientip} %{HTTPDUSER:ident} %{HTTPDUSER:auth} \\[%{HTTPDATE:timestamp}\\] \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{NUMBER:response} (?:%{NUMBER:bytes}|-)"}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_COMBINEDLOG"=>"%{HTTPD_COMMONLOG} %{QS:referrer} %{QS:agent}"}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD20_ERRORLOG"=>"\\[%{HTTPDERROR_DATE:timestamp}\\] \\[%{LOGLEVEL:loglevel}\\] (?:\\[client %{IPORHOST:clientip}\\] ){0,1}%{GREEDYDATA:message}"}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD24_ERRORLOG"=>"\\[%{HTTPDERROR_DATE:timestamp}\\] \\[%{WORD:module}:%{LOGLEVEL:loglevel}\\] \\[pid %{POSINT:pid}(:tid %{NUMBER:tid})?\\]( \\(%{POSINT:proxy_errorcode}\\)%{DATA:proxy_message}:)?( \\[client %{IPORHOST:clientip}:%{POSINT:clientport}\\])?( %{DATA:errorcode}:)? %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"HTTPD_ERRORLOG"=>"%{HTTPD20_ERRORLOG}|%{HTTPD24_ERRORLOG}"}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"COMMONAPACHELOG"=>"%{HTTPD_COMMONLOG}"}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"COMBINEDAPACHELOG"=>"%{HTTPD_COMBINEDLOG}"}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"JAVACLASS"=>"(?:[a-zA-Z$_][a-zA-Z$_0-9]*\\.)*[a-zA-Z$_][a-zA-Z$_0-9]*"}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAFILE"=>"(?:[A-Za-z0-9_. -]+)"}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAMETHOD"=>"(?:(<(?:cl)?init>)|[a-zA-Z$_][a-zA-Z$_0-9]*)"}
[2018-04-24T17:17:38,950][DEBUG][logstash.filters.grok ] Adding pattern {"JAVASTACKTRACEPART"=>"%{SPACE}at %{JAVACLASS:class}\\.%{JAVAMETHOD:method}\\(%{JAVAFILE:file}(?::%{NUMBER:line})?\\)"}
[2018-04-24T17:17:38,951][DEBUG][logstash.filters.grok ] Adding pattern {"JAVATHREAD"=>"(?:[A-Z]{2}-Processor[\\d]+)"}
[2018-04-24T17:17:38,951][DEBUG][logstash.filters.grok ] Adding pattern {"JAVACLASS"=>"(?:[a-zA-Z0-9-]+\\.)+[A-Za-z0-9$]+"}
[2018-04-24T17:17:38,951][DEBUG][logstash.filters.grok ] Adding pattern {"JAVAFILE"=>"(?:[A-Za-z0-9_.-]+)"}
[2018-04-24T17:17:38,951][DEBUG][logstash.filters.grok ] Adding pattern {"JAVALOGMESSAGE"=>"(.*)"}
[2018-04-24T17:17:38,951][DEBUG][logstash.filters.grok ] Adding pattern {"CATALINA_DATESTAMP"=>"%{MONTH} %{MONTHDAY}, 20%{YEAR} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) (?:AM|PM)"}
[2018-04-24T17:17:38,951][DEBUG][logstash.filters.grok ] Adding pattern {"TOMCAT_DATESTAMP"=>"20%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:?%{MINUTE}(?::?%{SECOND}) %{ISO8601_TIMEZONE}"}
[2018-04-24T17:17:38,951][DEBUG][logstash.filters.grok ] Adding pattern {"CATALINALOG"=>"%{CATALINA_DATESTAMP:timestamp} %{JAVACLASS:class} %{JAVALOGMESSAGE:logmessage}"}
[2018-04-24T17:17:38,951][DEBUG][logstash.filters.grok ] Adding pattern {"TOMCATLOG"=>"%{TOMCAT_DATESTAMP:timestamp} \\| %{LOGLEVEL:level} \\| %{JAVACLASS:class} - %{JAVALOGMESSAGE:logmessage}"}
[2018-04-24T17:17:38,951][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW_EVENT"=>"(RT_FLOW_SESSION_CREATE|RT_FLOW_SESSION_CLOSE|RT_FLOW_SESSION_DENY)"}
[2018-04-24T17:17:38,951][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW1"=>"%{RT_FLOW_EVENT:event}: %{GREEDYDATA:close-reason}: %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{IP:nat-src-ip}/%{INT:nat-src-port}->%{IP:nat-dst-ip}/%{INT:nat-dst-port} %{DATA:src-nat-rule-name} %{DATA:dst-nat-rule-name} %{INT:protocol-id} %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} %{INT:session-id} \\d+\\(%{DATA:sent}\\) \\d+\\(%{DATA:received}\\) %{INT:elapsed-time} .*"}
[2018-04-24T17:17:38,951][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW2"=>"%{RT_FLOW_EVENT:event}: session created %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{IP:nat-src-ip}/%{INT:nat-src-port}->%{IP:nat-dst-ip}/%{INT:nat-dst-port} %{DATA:src-nat-rule-name} %{DATA:dst-nat-rule-name} %{INT:protocol-id} %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} %{INT:session-id} .*"}
[2018-04-24T17:17:38,951][DEBUG][logstash.filters.grok ] Adding pattern {"RT_FLOW3"=>"%{RT_FLOW_EVENT:event}: session denied %{IP:src-ip}/%{INT:src-port}->%{IP:dst-ip}/%{INT:dst-port} %{DATA:service} %{INT:protocol-id}\\(\\d\\) %{DATA:policy-name} %{DATA:from-zone} %{DATA:to-zone} .*"}
[2018-04-24T17:17:38,951][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424PRINTASCII"=>"[!-~]+"}
[2018-04-24T17:17:38,951][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGBASE2"=>"(?:%{SYSLOGTIMESTAMP:timestamp}|%{TIMESTAMP_ISO8601:timestamp8601}) (?:%{SYSLOGFACILITY} )?%{SYSLOGHOST:logsource}+(?: %{SYSLOGPROG}:|)"}
[2018-04-24T17:17:38,952][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGPAMSESSION"=>"%{SYSLOGBASE} (?=%{GREEDYDATA:message})%{WORD:pam_module}\\(%{DATA:pam_caller}\\): session %{WORD:pam_session_state} for user %{USERNAME:username}(?: by %{GREEDYDATA:pam_by})?"}
[2018-04-24T17:17:38,952][DEBUG][logstash.filters.grok ] Adding pattern {"CRON_ACTION"=>"[A-Z ]+"}
[2018-04-24T17:17:38,952][DEBUG][logstash.filters.grok ] Adding pattern {"CRONLOG"=>"%{SYSLOGBASE} \\(%{USER:user}\\) %{CRON_ACTION:action} \\(%{DATA:message}\\)"}
[2018-04-24T17:17:38,952][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOGLINE"=>"%{SYSLOGBASE2} %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,952][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424PRI"=>"<%{NONNEGINT:syslog5424_pri}>"}
[2018-04-24T17:17:38,952][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424SD"=>"\\[%{DATA}\\]+"}
[2018-04-24T17:17:38,952][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424BASE"=>"%{SYSLOG5424PRI}%{NONNEGINT:syslog5424_ver} +(?:%{TIMESTAMP_ISO8601:syslog5424_ts}|-) +(?:%{IPORHOST:syslog5424_host}|-) +(-|%{SYSLOG5424PRINTASCII:syslog5424_app}) +(-|%{SYSLOG5424PRINTASCII:syslog5424_proc}) +(-|%{SYSLOG5424PRINTASCII:syslog5424_msgid}) +(?:%{SYSLOG5424SD:syslog5424_sd}|-|)"}
[2018-04-24T17:17:38,952][DEBUG][logstash.filters.grok ] Adding pattern {"SYSLOG5424LINE"=>"%{SYSLOG5424BASE} +%{GREEDYDATA:syslog5424_msg}"}
[2018-04-24T17:17:38,952][DEBUG][logstash.filters.grok ] Adding pattern {"MAVEN_VERSION"=>"(?:(\\d+)\\.)?(?:(\\d+)\\.)?(\\*|\\d+)(?:[.-](RELEASE|SNAPSHOT))?"}
[2018-04-24T17:17:38,952][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVEAUDIT"=>"%{TIMESTAMP_ISO8601:timestamp}:"}
[2018-04-24T17:17:38,952][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVE"=>"., \\[%{TIMESTAMP_ISO8601:timestamp} #%{POSINT:pid}\\]%{SPACE}%{LOGLEVEL:event_level}"}
[2018-04-24T17:17:38,952][DEBUG][logstash.filters.grok ] Adding pattern {"MCOLLECTIVEAUDIT"=>"%{TIMESTAMP_ISO8601:timestamp}:"}
[2018-04-24T17:17:38,952][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_LOG"=>"%{SYSLOGTIMESTAMP:timestamp} \\[%{WORD:component}\\] %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,952][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_QUERY"=>"\\{ (?<={ ).*(?= } ntoreturn:) \\}"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_SLOWQUERY"=>"%{WORD} %{MONGO_WORDDASH:database}\\.%{MONGO_WORDDASH:collection} %{WORD}: %{MONGO_QUERY:query} %{WORD}:%{NONNEGINT:ntoreturn} %{WORD}:%{NONNEGINT:ntoskip} %{WORD}:%{NONNEGINT:nscanned}.*nreturned:%{NONNEGINT:nreturned}..+ (?<duration>[0-9]+)ms"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO_WORDDASH"=>"\\b[\\w-]+\\b"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_SEVERITY"=>"\\w"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_COMPONENT"=>"%{WORD}|-"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"MONGO3_LOG"=>"%{TIMESTAMP_ISO8601:timestamp} %{MONGO3_SEVERITY:severity} %{MONGO3_COMPONENT:component}%{SPACE}(?:\\[%{DATA:context}\\])? %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOSTIME"=>"\\[%{NUMBER:nagios_epoch}\\]"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_CURRENT_SERVICE_STATE"=>"CURRENT SERVICE STATE"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_CURRENT_HOST_STATE"=>"CURRENT HOST STATE"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_NOTIFICATION"=>"SERVICE NOTIFICATION"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_NOTIFICATION"=>"HOST NOTIFICATION"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_ALERT"=>"SERVICE ALERT"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_ALERT"=>"HOST ALERT"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_FLAPPING_ALERT"=>"SERVICE FLAPPING ALERT"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_FLAPPING_ALERT"=>"HOST FLAPPING ALERT"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_DOWNTIME_ALERT"=>"SERVICE DOWNTIME ALERT"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_DOWNTIME_ALERT"=>"HOST DOWNTIME ALERT"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_PASSIVE_SERVICE_CHECK"=>"PASSIVE SERVICE CHECK"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_PASSIVE_HOST_CHECK"=>"PASSIVE HOST CHECK"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_SERVICE_EVENT_HANDLER"=>"SERVICE EVENT HANDLER"}
[2018-04-24T17:17:38,953][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_HOST_EVENT_HANDLER"=>"HOST EVENT HANDLER"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_EXTERNAL_COMMAND"=>"EXTERNAL COMMAND"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TYPE_TIMEPERIOD_TRANSITION"=>"TIMEPERIOD TRANSITION"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_SVC_CHECK"=>"DISABLE_SVC_CHECK"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_SVC_CHECK"=>"ENABLE_SVC_CHECK"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_CHECK"=>"DISABLE_HOST_CHECK"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_CHECK"=>"ENABLE_HOST_CHECK"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_PROCESS_SERVICE_CHECK_RESULT"=>"PROCESS_SERVICE_CHECK_RESULT"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_PROCESS_HOST_CHECK_RESULT"=>"PROCESS_HOST_CHECK_RESULT"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_SCHEDULE_SERVICE_DOWNTIME"=>"SCHEDULE_SERVICE_DOWNTIME"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_SCHEDULE_HOST_DOWNTIME"=>"SCHEDULE_HOST_DOWNTIME"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_SVC_NOTIFICATIONS"=>"DISABLE_HOST_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_SVC_NOTIFICATIONS"=>"ENABLE_HOST_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_HOST_NOTIFICATIONS"=>"DISABLE_HOST_NOTIFICATIONS"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_HOST_NOTIFICATIONS"=>"ENABLE_HOST_NOTIFICATIONS"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_DISABLE_SVC_NOTIFICATIONS"=>"DISABLE_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_ENABLE_SVC_NOTIFICATIONS"=>"ENABLE_SVC_NOTIFICATIONS"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_WARNING"=>"Warning:%{SPACE}%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_CURRENT_SERVICE_STATE"=>"%{NAGIOS_TYPE_CURRENT_SERVICE_STATE:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statetype};%{DATA:nagios_statecode};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_CURRENT_HOST_STATE"=>"%{NAGIOS_TYPE_CURRENT_HOST_STATE:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statetype};%{DATA:nagios_statecode};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_NOTIFICATION"=>"%{NAGIOS_TYPE_SERVICE_NOTIFICATION:nagios_type}: %{DATA:nagios_notifyname};%{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_contact};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_NOTIFICATION"=>"%{NAGIOS_TYPE_HOST_NOTIFICATION:nagios_type}: %{DATA:nagios_notifyname};%{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_contact};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,954][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_ALERT"=>"%{NAGIOS_TYPE_SERVICE_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{NUMBER:nagios_attempt};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_ALERT"=>"%{NAGIOS_TYPE_HOST_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{NUMBER:nagios_attempt};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_FLAPPING_ALERT"=>"%{NAGIOS_TYPE_SERVICE_FLAPPING_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_FLAPPING_ALERT"=>"%{NAGIOS_TYPE_HOST_FLAPPING_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_message}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_DOWNTIME_ALERT"=>"%{NAGIOS_TYPE_SERVICE_DOWNTIME_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_DOWNTIME_ALERT"=>"%{NAGIOS_TYPE_HOST_DOWNTIME_ALERT:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_PASSIVE_SERVICE_CHECK"=>"%{NAGIOS_TYPE_PASSIVE_SERVICE_CHECK:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_PASSIVE_HOST_CHECK"=>"%{NAGIOS_TYPE_PASSIVE_HOST_CHECK:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_comment}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_SERVICE_EVENT_HANDLER"=>"%{NAGIOS_TYPE_SERVICE_EVENT_HANDLER:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{DATA:nagios_event_handler_name}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_HOST_EVENT_HANDLER"=>"%{NAGIOS_TYPE_HOST_EVENT_HANDLER:nagios_type}: %{DATA:nagios_hostname};%{DATA:nagios_state};%{DATA:nagios_statelevel};%{DATA:nagios_event_handler_name}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_TIMEPERIOD_TRANSITION"=>"%{NAGIOS_TYPE_TIMEPERIOD_TRANSITION:nagios_type}: %{DATA:nagios_service};%{DATA:nagios_unknown1};%{DATA:nagios_unknown2}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_SVC_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_SVC_CHECK:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_CHECK:nagios_command};%{DATA:nagios_hostname}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_SVC_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_SVC_CHECK:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_CHECK"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_CHECK:nagios_command};%{DATA:nagios_hostname}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_PROCESS_SERVICE_CHECK_RESULT"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_PROCESS_SERVICE_CHECK_RESULT:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_service};%{DATA:nagios_state};%{GREEDYDATA:nagios_check_result}"}
[2018-04-24T17:17:38,955][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_PROCESS_HOST_CHECK_RESULT"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_PROCESS_HOST_CHECK_RESULT:nagios_command};%{DATA:nagios_hostname};%{DATA:nagios_state};%{GREEDYDATA:nagios_check_result}"}
[2018-04-24T17:17:38,956][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_SVC_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,956][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_HOST_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_HOST_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,956][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_DISABLE_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_DISABLE_SVC_NOTIFICATIONS:nagios_command};%{DATA:nagios_hostname};%{GREEDYDATA:nagios_service}"}
[2018-04-24T17:17:38,956][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_SVC_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,956][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_HOST_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_HOST_NOTIFICATIONS:nagios_command};%{GREEDYDATA:nagios_hostname}"}
[2018-04-24T17:17:38,956][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_ENABLE_SVC_NOTIFICATIONS"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_ENABLE_SVC_NOTIFICATIONS:nagios_command};%{DATA:nagios_hostname};%{GREEDYDATA:nagios_service}"}
[2018-04-24T17:17:38,956][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOS_EC_LINE_SCHEDULE_HOST_DOWNTIME"=>"%{NAGIOS_TYPE_EXTERNAL_COMMAND:nagios_type}: %{NAGIOS_EC_SCHEDULE_HOST_DOWNTIME:nagios_command};%{DATA:nagios_hostname};%{NUMBER:nagios_start_time};%{NUMBER:nagios_end_time};%{NUMBER:nagios_fixed};%{NUMBER:nagios_trigger_id};%{NUMBER:nagios_duration};%{DATA:author};%{DATA:comment}"}
[2018-04-24T17:17:38,956][DEBUG][logstash.filters.grok ] Adding pattern {"NAGIOSLOGLINE"=>"%{NAGIOSTIME} (?:%{NAGIOS_WARNING}|%{NAGIOS_CURRENT_SERVICE_STATE}|%{NAGIOS_CURRENT_HOST_STATE}|%{NAGIOS_SERVICE_NOTIFICATION}|%{NAGIOS_HOST_NOTIFICATION}|%{NAGIOS_SERVICE_ALERT}|%{NAGIOS_HOST_ALERT}|%{NAGIOS_SERVICE_FLAPPING_ALERT}|%{NAGIOS_HOST_FLAPPING_ALERT}|%{NAGIOS_SERVICE_DOWNTIME_ALERT}|%{NAGIOS_HOST_DOWNTIME_ALERT}|%{NAGIOS_PASSIVE_SERVICE_CHECK}|%{NAGIOS_PASSIVE_HOST_CHECK}|%{NAGIOS_SERVICE_EVENT_HANDLER}|%{NAGIOS_HOST_EVENT_HANDLER}|%{NAGIOS_TIMEPERIOD_TRANSITION}|%{NAGIOS_EC_LINE_DISABLE_SVC_CHECK}|%{NAGIOS_EC_LINE_ENABLE_SVC_CHECK}|%{NAGIOS_EC_LINE_DISABLE_HOST_CHECK}|%{NAGIOS_EC_LINE_ENABLE_HOST_CHECK}|%{NAGIOS_EC_LINE_PROCESS_HOST_CHECK_RESULT}|%{NAGIOS_EC_LINE_PROCESS_SERVICE_CHECK_RESULT}|%{NAGIOS_EC_LINE_SCHEDULE_HOST_DOWNTIME}|%{NAGIOS_EC_LINE_DISABLE_HOST_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_HOST_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_DISABLE_HOST_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_HOST_NOTIFICATIONS}|%{NAGIOS_EC_LINE_DISABLE_SVC_NOTIFICATIONS}|%{NAGIOS_EC_LINE_ENABLE_SVC_NOTIFICATIONS})"}
[2018-04-24T17:17:38,956][DEBUG][logstash.filters.grok ] Adding pattern {"POSTGRESQL"=>"%{DATESTAMP:timestamp} %{TZ} %{DATA:user_id} %{GREEDYDATA:connection_id} %{POSINT:pid}"}
[2018-04-24T17:17:38,956][DEBUG][logstash.filters.grok ] Adding pattern {"RUUID"=>"\\h{32}"}
[2018-04-24T17:17:38,956][DEBUG][logstash.filters.grok ] Adding pattern {"RCONTROLLER"=>"(?<controller>[^#]+)#(?<action>\\w+)"}
[2018-04-24T17:17:38,957][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3HEAD"=>"(?m)Started %{WORD:verb} \"%{URIPATHPARAM:request}\" for %{IPORHOST:clientip} at (?<timestamp>%{YEAR}-%{MONTHNUM}-%{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND} %{ISO8601_TIMEZONE})"}
[2018-04-24T17:17:38,957][DEBUG][logstash.filters.grok ] Adding pattern {"RPROCESSING"=>"\\W*Processing by %{RCONTROLLER} as (?<format>\\S+)(?:\\W*Parameters: {%{DATA:params}}\\W*)?"}
[2018-04-24T17:17:38,957][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3FOOT"=>"Completed %{NUMBER:response}%{DATA} in %{NUMBER:totalms}ms %{RAILS3PROFILE}%{GREEDYDATA}"}
[2018-04-24T17:17:38,957][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3PROFILE"=>"(?:\\(Views: %{NUMBER:viewms}ms \\| ActiveRecord: %{NUMBER:activerecordms}ms|\\(ActiveRecord: %{NUMBER:activerecordms}ms)?"}
[2018-04-24T17:17:38,957][DEBUG][logstash.filters.grok ] Adding pattern {"RAILS3"=>"%{RAILS3HEAD}(?:%{RPROCESSING})?(?<context>(?:%{DATA}\\n)*)(?:%{RAILS3FOOT})?"}
[2018-04-24T17:17:38,957][DEBUG][logstash.filters.grok ] Adding pattern {"REDISTIMESTAMP"=>"%{MONTHDAY} %{MONTH} %{TIME}"}
[2018-04-24T17:17:38,957][DEBUG][logstash.filters.grok ] Adding pattern {"REDISLOG"=>"\\[%{POSINT:pid}\\] %{REDISTIMESTAMP:timestamp} \\* "}
[2018-04-24T17:17:38,957][DEBUG][logstash.filters.grok ] Adding pattern {"REDISMONLOG"=>"%{NUMBER:timestamp} \\[%{INT:database} %{IP:client}:%{NUMBER:port}\\] \"%{WORD:command}\"\\s?%{GREEDYDATA:params}"}
[2018-04-24T17:17:38,957][DEBUG][logstash.filters.grok ] Adding pattern {"RUBY_LOGLEVEL"=>"(?:DEBUG|FATAL|ERROR|WARN|INFO)"}
[2018-04-24T17:17:38,957][DEBUG][logstash.filters.grok ] Adding pattern {"RUBY_LOGGER"=>"[DFEWI], \\[%{TIMESTAMP_ISO8601:timestamp} #%{POSINT:pid}\\] *%{RUBY_LOGLEVEL:loglevel} -- +%{DATA:progname}: %{GREEDYDATA:message}"}
[2018-04-24T17:17:38,957][DEBUG][logstash.filters.grok ] Adding pattern {"SQUID3"=>"%{NUMBER:timestamp}\\s+%{NUMBER:duration}\\s%{IP:client_address}\\s%{WORD:cache_result}/%{POSINT:status_code}\\s%{NUMBER:bytes}\\s%{WORD:request_method}\\s%{NOTSPACE:url}\\s(%{NOTSPACE:user}|-)\\s%{WORD:hierarchy_code}/%{IPORHOST:server}\\s%{NOTSPACE:content_type}"}
[2018-04-24T17:17:38,957][DEBUG][logstash.filters.grok ] Adding pattern {"PAYLOAD"=>"[\\s\\S]*"}
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] Adding pattern {"SPACE"=>"[ ]{1,}"}
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] Adding pattern {"P_TIMESTAMP"=>"%{MONTH}\\s%{MONTHDAY},\\s%{YEAR}\\s%{TIME}\\s(AM|PM)"}
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] Adding pattern {"LOGGINGSERVICEPREFIX"=>"[-]{12,18} Event Log Start Here [-]{12,18}\\\\n"}
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] Adding pattern {"LOGGINGSERVICESUFFIX"=>"\\\\n[-]{12,18} Event Log End Here [-]{12,18}"}
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] Adding pattern {"XLMLOGGING"=>"[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}:[0-9]{3,7}"}
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] Adding pattern {"DATESWITHDOTS"=>"[0-9]{4}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{2}.[0-9]{3,7}"}
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] Adding pattern {"DATESWITHUNDERLINE"=>"[0-9]{4}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,2}_[0-9]{1,7}"}
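The last few patterns registered above (LOGGINGSERVICEPREFIX through DATESWITHUNDERLINE) are custom, site-specific entries: each is a raw regex rather than a composition of stock grok patterns. A minimal Java sketch exercising the XLMLOGGING regex, copied verbatim from its registration line, against an invented sample timestamp:

public class XlmLoggingCheck {
    public static void main(String[] args) {
        // XLMLOGGING regex copied verbatim from the "Adding pattern" line above;
        // the sample timestamp is invented purely for illustration.
        String regex = "[0-9]{4}-[0-9]{2}-[0-9]{2} [0-9]{2}:[0-9]{2}:[0-9]{2}:[0-9]{3,7}";
        System.out.println("2018-04-24 17:17:39:123".matches(regex)); // prints: true
    }
}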
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] replacement_pattern => (?<TIMESTAMP_ISO8601:logtime>%{YEAR}-%{MONTHNUM}-%{MONTHDAY}[T ]%{HOUR}:?%{MINUTE}(?::?%{SECOND})?%{ISO8601_TIMEZONE}?)
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?>\d\d){1,2})
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:0?[1-9]|1[0-2]))
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9]))
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:2[0123]|[01]?[0-9]))
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:[0-5][0-9]))
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?))
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:Z|[+-]%{HOUR}(?::?%{MINUTE})))
[2018-04-24T17:17:38,958][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:2[0123]|[01]?[0-9]))
[2018-04-24T17:17:38,959][DEBUG][logstash.filters.grok ] replacement_pattern => (?:(?:[0-5][0-9]))
[2018-04-24T17:17:38,959][DEBUG][logstash.filters.grok ] replacement_pattern => (?<GREEDYDATA:Message>.*)
[2018-04-24T17:17:38,959][DEBUG][logstash.filters.grok ] Grok compiled OK {:pattern=>"%{TIMESTAMP_ISO8601:logtime} %{GREEDYDATA:Message}", :expanded_pattern=>"(?<TIMESTAMP_ISO8601:logtime>(?:(?>\\d\\d){1,2})-(?:(?:0?[1-9]|1[0-2]))-(?:(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9]))[T ](?:(?:2[0123]|[01]?[0-9])):?(?:(?:[0-5][0-9]))(?::?(?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)))?(?:(?:Z|[+-](?:(?:2[0123]|[01]?[0-9]))(?::?(?:(?:[0-5][0-9])))))?) (?<GREEDYDATA:Message>.*)"}
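The "Grok compiled OK" line confirms the filter's pattern, %{TIMESTAMP_ISO8601:logtime} %{GREEDYDATA:Message}, expanded into a plain regex. A minimal Java sketch of what that expanded regex matches (the sample event line is invented, and the Oniguruma-style group names like (?<TIMESTAMP_ISO8601:logtime>...) are renamed here because java.util.regex does not allow ':' in group names):

public class GrokCompiledPatternDemo {
    public static void main(String[] args) {
        // Expanded pattern from the "Grok compiled OK" line above, with the
        // Oniguruma group names renamed to Java-legal ones (no ':').
        String regex = "(?<logtime>(?:(?>\\d\\d){1,2})-(?:(?:0?[1-9]|1[0-2]))-"
                + "(?:(?:(?:0[1-9])|(?:[12][0-9])|(?:3[01])|[1-9]))[T ]"
                + "(?:(?:2[0123]|[01]?[0-9])):?(?:(?:[0-5][0-9]))"
                + "(?::?(?:(?:(?:[0-5]?[0-9]|60)(?:[:.,][0-9]+)?)))?"
                + "(?:(?:Z|[+-](?:(?:2[0123]|[01]?[0-9]))(?::?(?:(?:[0-5][0-9])))))?)"
                + " (?<message>.*)";
        java.util.regex.Matcher m = java.util.regex.Pattern.compile(regex)
                .matcher("2018-04-24 17:17:39 pipeline event"); // invented sample line
        if (m.matches()) {
            System.out.println("logtime = " + m.group("logtime")); // 2018-04-24 17:17:39
            System.out.println("Message = " + m.group("message")); // pipeline event
        }
    }
}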
[2018-04-24T17:17:38,980][DEBUG][io.netty.util.internal.logging.InternalLoggerFactory] Using SLF4J as the default logging framework
[2018-04-24T17:17:39,001][DEBUG][io.netty.util.internal.PlatformDependent0] -Dio.netty.noUnsafe: false
[2018-04-24T17:17:39,001][DEBUG][io.netty.util.internal.PlatformDependent0] Java version: 8
[2018-04-24T17:17:39,002][DEBUG][io.netty.util.internal.PlatformDependent0] sun.misc.Unsafe.theUnsafe: available
[2018-04-24T17:17:39,002][DEBUG][io.netty.util.internal.PlatformDependent0] sun.misc.Unsafe.copyMemory: available
[2018-04-24T17:17:39,003][DEBUG][io.netty.util.internal.PlatformDependent0] java.nio.Buffer.address: available
[2018-04-24T17:17:39,003][DEBUG][io.netty.util.internal.PlatformDependent0] direct buffer constructor: available
[2018-04-24T17:17:39,004][DEBUG][io.netty.util.internal.PlatformDependent0] java.nio.Bits.unaligned: available, true
[2018-04-24T17:17:39,004][DEBUG][io.netty.util.internal.PlatformDependent0] jdk.internal.misc.Unsafe.allocateUninitializedArray(int): unavailable prior to Java9
[2018-04-24T17:17:39,004][DEBUG][io.netty.util.internal.PlatformDependent0] java.nio.DirectByteBuffer.<init>(long, int): available
[2018-04-24T17:17:39,004][DEBUG][io.netty.util.internal.PlatformDependent] sun.misc.Unsafe: available
[2018-04-24T17:17:39,004][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.tmpdir: /tmp (java.io.tmpdir)
[2018-04-24T17:17:39,004][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.bitMode: 64 (sun.arch.data.model)
[2018-04-24T17:17:39,005][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.noPreferDirect: false
[2018-04-24T17:17:39,005][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.maxDirectMemory: 1038876672 bytes
[2018-04-24T17:17:39,005][DEBUG][io.netty.util.internal.PlatformDependent] -Dio.netty.uninitializedArrayAllocationThreshold: -1
[2018-04-24T17:17:39,006][DEBUG][io.netty.util.internal.CleanerJava6] java.nio.ByteBuffer.cleaner(): available
[2018-04-24T17:17:39,008][DEBUG][io.netty.util.internal.NativeLibraryLoader] -Dio.netty.native.workdir: /tmp (io.netty.tmpdir)
[2018-04-24T17:17:39,009][DEBUG][io.netty.util.internal.NativeLibraryLoader] Unable to load the library 'netty_tcnative_linux_x86_64', trying other loading mechanism.
java.lang.UnsatisfiedLinkError: no netty_tcnative_linux_x86_64 in java.library.path
at java.lang.ClassLoader.loadLibrary(Unknown Source) ~[?:1.8.0_144]
at java.lang.Runtime.loadLibrary0(Unknown Source) ~[?:1.8.0_144]
at java.lang.System.loadLibrary(Unknown Source) ~[?:1.8.0_144]
at io.netty.util.internal.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:38) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_144]
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:1.8.0_144]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:1.8.0_144]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:1.8.0_144]
at io.netty.util.internal.NativeLibraryLoader$1.run(NativeLibraryLoader.java:263) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_144]
at io.netty.util.internal.NativeLibraryLoader.loadLibraryByHelper(NativeLibraryLoader.java:255) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.util.internal.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:233) [netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.util.internal.NativeLibraryLoader.load(NativeLibraryLoader.java:124) [netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.util.internal.NativeLibraryLoader.loadFirstAvailable(NativeLibraryLoader.java:85) [netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.handler.ssl.OpenSsl.loadTcNative(OpenSsl.java:421) [netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.handler.ssl.OpenSsl.<clinit>(OpenSsl.java:89) [netty-all-4.1.18.Final.jar:4.1.18.Final]
at java.lang.Class.forName0(Native Method) [?:1.8.0_144]
at java.lang.Class.forName(Unknown Source) [?:1.8.0_144]
at org.jruby.javasupport.JavaSupportImpl.loadJavaClass(JavaSupportImpl.java:154) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaSupportImpl.loadJavaClassVerbose(JavaSupportImpl.java:163) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaClass.forNameVerbose(JavaClass.java:271) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaClass.for_name(JavaClass.java:286) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.Java.get_proxy_class(Java.java:386) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaUtilities.get_proxy_class(JavaUtilities.java:34) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaUtilities$INVOKER$s$1$0$get_proxy_class.call(JavaUtilities$INVOKER$s$1$0$get_proxy_class.gen) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:161) [jruby-complete-9.1.13.0.jar:?]
at uri_3a_classloader_3a_.jruby.java.core_ext.object.invokeOther73:get_proxy_class(uri:classloader:/jruby/java/core_ext/object.rb:49) [jruby-complete-9.1.13.0.jar:?]
at uri_3a_classloader_3a_.jruby.java.core_ext.object.RUBY$block$java_import$2(uri:classloader:/jruby/java/core_ext/object.rb:49) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.CompiledIRBlockBody.yieldDirect(CompiledIRBlockBody.java:156) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.BlockBody.yield(BlockBody.java:114) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.Block.yield(Block.java:165) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray.collect(RubyArray.java:2472) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray.map19(RubyArray.java:2486) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray$INVOKER$i$0$0$map19.call(RubyArray$INVOKER$i$0$0$map19.gen) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:139) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:145) [jruby-complete-9.1.13.0.jar:?]
at uri_3a_classloader_3a_.jruby.java.core_ext.object.invokeOther77:map(uri:classloader:/jruby/java/core_ext/object.rb:36) [jruby-complete-9.1.13.0.jar:?]
at uri_3a_classloader_3a_.jruby.java.core_ext.object.RUBY$method$java_import$0(uri:classloader:/jruby/java/core_ext/object.rb:36) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:100) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:163) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:200) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:338) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:163) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:314) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:144) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:130) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:192) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:318) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:131) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:339) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:83) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:179) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:165) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:200) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:161) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:314) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:132) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:148) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.IRBlockBody.doYield(IRBlockBody.java:186) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.BlockBody.yield(BlockBody.java:116) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.Block.yield(Block.java:165) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray.each(RubyArray.java:1734) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray$INVOKER$i$0$0$each.call(RubyArray$INVOKER$i$0$0$each.gen) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.JavaMethod$JavaMethodZeroBlock.call(JavaMethod.java:498) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:83) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.instructions.CallBase.interpret(CallBase.java:428) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:355) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:83) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:179) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:165) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:200) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:338) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:163) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:314) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:144) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:130) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:192) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:318) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:131) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:339) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:144) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:130) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:192) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:318) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:131) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:339) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:144) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:130) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:192) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:318) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:131) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:339) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:132) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:148) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.Block.call(Block.java:124) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyProc.call(RubyProc.java:289) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyProc.call(RubyProc.java:246) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:104) [jruby-complete-9.1.13.0.jar:?]
at java.lang.Thread.run(Unknown Source) [?:1.8.0_144]
[2018-04-24T17:17:39,014][DEBUG][io.netty.util.internal.NativeLibraryLoader] netty_tcnative_linux_x86_64 cannot be loaded from java.library.path, now trying export to -Dio.netty.native.workdir: /tmp
java.lang.UnsatisfiedLinkError: no netty_tcnative_linux_x86_64 in java.library.path
at java.lang.ClassLoader.loadLibrary(Unknown Source) ~[?:1.8.0_144]
at java.lang.Runtime.loadLibrary0(Unknown Source) ~[?:1.8.0_144]
at java.lang.System.loadLibrary(Unknown Source) ~[?:1.8.0_144]
at io.netty.util.internal.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:38) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.util.internal.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:243) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.util.internal.NativeLibraryLoader.load(NativeLibraryLoader.java:124) [netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.util.internal.NativeLibraryLoader.loadFirstAvailable(NativeLibraryLoader.java:85) [netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.handler.ssl.OpenSsl.loadTcNative(OpenSsl.java:421) [netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.handler.ssl.OpenSsl.<clinit>(OpenSsl.java:89) [netty-all-4.1.18.Final.jar:4.1.18.Final]
at java.lang.Class.forName0(Native Method) [?:1.8.0_144]
at java.lang.Class.forName(Unknown Source) [?:1.8.0_144]
at org.jruby.javasupport.JavaSupportImpl.loadJavaClass(JavaSupportImpl.java:154) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaSupportImpl.loadJavaClassVerbose(JavaSupportImpl.java:163) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaClass.forNameVerbose(JavaClass.java:271) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaClass.for_name(JavaClass.java:286) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.Java.get_proxy_class(Java.java:386) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaUtilities.get_proxy_class(JavaUtilities.java:34) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaUtilities$INVOKER$s$1$0$get_proxy_class.call(JavaUtilities$INVOKER$s$1$0$get_proxy_class.gen) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:161) [jruby-complete-9.1.13.0.jar:?]
at uri_3a_classloader_3a_.jruby.java.core_ext.object.invokeOther73:get_proxy_class(uri:classloader:/jruby/java/core_ext/object.rb:49) [jruby-complete-9.1.13.0.jar:?]
at uri_3a_classloader_3a_.jruby.java.core_ext.object.RUBY$block$java_import$2(uri:classloader:/jruby/java/core_ext/object.rb:49) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.CompiledIRBlockBody.yieldDirect(CompiledIRBlockBody.java:156) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.BlockBody.yield(BlockBody.java:114) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.Block.yield(Block.java:165) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray.collect(RubyArray.java:2472) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray.map19(RubyArray.java:2486) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray$INVOKER$i$0$0$map19.call(RubyArray$INVOKER$i$0$0$map19.gen) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:139) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:145) [jruby-complete-9.1.13.0.jar:?]
at uri_3a_classloader_3a_.jruby.java.core_ext.object.invokeOther77:map(uri:classloader:/jruby/java/core_ext/object.rb:36) [jruby-complete-9.1.13.0.jar:?]
at uri_3a_classloader_3a_.jruby.java.core_ext.object.RUBY$method$java_import$0(uri:classloader:/jruby/java/core_ext/object.rb:36) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:100) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:163) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:200) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:338) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:163) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:314) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:144) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:130) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:192) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:318) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:131) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:339) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:83) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:179) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:165) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:200) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:161) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:314) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:132) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:148) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.IRBlockBody.doYield(IRBlockBody.java:186) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.BlockBody.yield(BlockBody.java:116) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.Block.yield(Block.java:165) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray.each(RubyArray.java:1734) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray$INVOKER$i$0$0$each.call(RubyArray$INVOKER$i$0$0$each.gen) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.JavaMethod$JavaMethodZeroBlock.call(JavaMethod.java:498) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:83) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.instructions.CallBase.interpret(CallBase.java:428) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:355) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:83) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:179) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:165) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:200) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:338) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:163) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:314) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:144) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:130) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:192) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:318) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:131) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:339) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:144) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:130) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:192) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:318) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:131) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:339) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:144) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:130) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:192) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:318) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:131) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:339) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:132) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:148) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.Block.call(Block.java:124) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyProc.call(RubyProc.java:289) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyProc.call(RubyProc.java:246) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:104) [jruby-complete-9.1.13.0.jar:?]
at java.lang.Thread.run(Unknown Source) [?:1.8.0_144]
Suppressed: java.lang.UnsatisfiedLinkError: no netty_tcnative_linux_x86_64 in java.library.path
at java.lang.ClassLoader.loadLibrary(Unknown Source) ~[?:1.8.0_144]
at java.lang.Runtime.loadLibrary0(Unknown Source) ~[?:1.8.0_144]
at java.lang.System.loadLibrary(Unknown Source) ~[?:1.8.0_144]
at io.netty.util.internal.NativeLibraryUtil.loadLibrary(NativeLibraryUtil.java:38) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method) ~[?:1.8.0_144]
at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source) ~[?:1.8.0_144]
at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source) ~[?:1.8.0_144]
at java.lang.reflect.Method.invoke(Unknown Source) ~[?:1.8.0_144]
at io.netty.util.internal.NativeLibraryLoader$1.run(NativeLibraryLoader.java:263) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
at java.security.AccessController.doPrivileged(Native Method) ~[?:1.8.0_144]
at io.netty.util.internal.NativeLibraryLoader.loadLibraryByHelper(NativeLibraryLoader.java:255) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.util.internal.NativeLibraryLoader.loadLibrary(NativeLibraryLoader.java:233) ~[netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.util.internal.NativeLibraryLoader.load(NativeLibraryLoader.java:124) [netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.util.internal.NativeLibraryLoader.loadFirstAvailable(NativeLibraryLoader.java:85) [netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.handler.ssl.OpenSsl.loadTcNative(OpenSsl.java:421) [netty-all-4.1.18.Final.jar:4.1.18.Final]
at io.netty.handler.ssl.OpenSsl.<clinit>(OpenSsl.java:89) [netty-all-4.1.18.Final.jar:4.1.18.Final]
at java.lang.Class.forName0(Native Method) [?:1.8.0_144]
at java.lang.Class.forName(Unknown Source) [?:1.8.0_144]
at org.jruby.javasupport.JavaSupportImpl.loadJavaClass(JavaSupportImpl.java:154) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaSupportImpl.loadJavaClassVerbose(JavaSupportImpl.java:163) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaClass.forNameVerbose(JavaClass.java:271) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaClass.for_name(JavaClass.java:286) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.Java.get_proxy_class(Java.java:386) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaUtilities.get_proxy_class(JavaUtilities.java:34) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.javasupport.JavaUtilities$INVOKER$s$1$0$get_proxy_class.call(JavaUtilities$INVOKER$s$1$0$get_proxy_class.gen) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:161) [jruby-complete-9.1.13.0.jar:?]
at uri_3a_classloader_3a_.jruby.java.core_ext.object.invokeOther73:get_proxy_class(uri:classloader:/jruby/java/core_ext/object.rb:49) [jruby-complete-9.1.13.0.jar:?]
at uri_3a_classloader_3a_.jruby.java.core_ext.object.RUBY$block$java_import$2(uri:classloader:/jruby/java/core_ext/object.rb:49) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.CompiledIRBlockBody.yieldDirect(CompiledIRBlockBody.java:156) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.BlockBody.yield(BlockBody.java:114) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.Block.yield(Block.java:165) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray.collect(RubyArray.java:2472) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray.map19(RubyArray.java:2486) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray$INVOKER$i$0$0$map19.call(RubyArray$INVOKER$i$0$0$map19.gen) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:139) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:145) [jruby-complete-9.1.13.0.jar:?]
at uri_3a_classloader_3a_.jruby.java.core_ext.object.invokeOther77:map(uri:classloader:/jruby/java/core_ext/object.rb:36) [jruby-complete-9.1.13.0.jar:?]
at uri_3a_classloader_3a_.jruby.java.core_ext.object.RUBY$method$java_import$0(uri:classloader:/jruby/java/core_ext/object.rb:36) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.CompiledIRMethod.call(CompiledIRMethod.java:100) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:163) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:200) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:338) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:163) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:314) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:144) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:130) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:192) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:318) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:131) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:339) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:83) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:179) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:165) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:200) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:161) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:314) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:132) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:148) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.IRBlockBody.doYield(IRBlockBody.java:186) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.BlockBody.yield(BlockBody.java:116) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.Block.yield(Block.java:165) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray.each(RubyArray.java:1734) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyArray$INVOKER$i$0$0$each.call(RubyArray$INVOKER$i$0$0$each.gen) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.JavaMethod$JavaMethodZeroBlock.call(JavaMethod.java:498) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.callBlock(CachingCallSite.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:83) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.instructions.CallBase.interpret(CallBase.java:428) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:355) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:83) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:179) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:165) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:200) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:338) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:163) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:314) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:144) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:130) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:192) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:318) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:131) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:339) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:144) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:130) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:192) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:318) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:131) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:339) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.interpret(InterpreterEngine.java:77) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.INTERPRET_METHOD(MixedModeIRMethod.java:144) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.MixedModeIRMethod.call(MixedModeIRMethod.java:130) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.methods.DynamicMethod.call(DynamicMethod.java:192) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.cacheAndCall(CachingCallSite.java:318) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.callsite.CachingCallSite.call(CachingCallSite.java:131) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.InterpreterEngine.processCall(InterpreterEngine.java:339) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.StartupInterpreterEngine.interpret(StartupInterpreterEngine.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.ir.interpreter.Interpreter.INTERPRET_BLOCK(Interpreter.java:132) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.MixedModeIRBlockBody.commonYieldPath(MixedModeIRBlockBody.java:148) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.IRBlockBody.call(IRBlockBody.java:73) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.runtime.Block.call(Block.java:124) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyProc.call(RubyProc.java:289) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.RubyProc.call(RubyProc.java:246) [jruby-complete-9.1.13.0.jar:?]
at org.jruby.internal.runtime.RubyRunnable.run(RubyRunnable.java:104) [jruby-complete-9.1.13.0.jar:?]
at java.lang.Thread.run(Unknown Source) [?:1.8.0_144]
[2018-04-24T17:17:39,032][DEBUG][io.netty.util.internal.NativeLibraryLoader] Successfully loaded the library /tmp/libnetty_tcnative_linux_x86_642637945217917033496.so
[2018-04-24T17:17:39,032][DEBUG][io.netty.handler.ssl.OpenSsl] netty-tcnative using native library: BoringSSL
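The two UnsatisfiedLinkError stack traces above are expected DEBUG-level noise rather than failures: Netty's NativeLibraryLoader first tries java.library.path, then falls back to extracting the bundled .so into the -Dio.netty.native.workdir directory (/tmp), which is the load that succeeds two lines up. A minimal Java sketch of that fallback shape (the extracted path below is illustrative, not the temp-file name Netty actually generates):

public class NativeLoadFallbackSketch {
    public static void main(String[] args) {
        try {
            // Attempt 1: resolve against java.library.path (this is what throws
            // the first UnsatisfiedLinkError in the traces above).
            System.loadLibrary("netty_tcnative_linux_x86_64");
        } catch (UnsatisfiedLinkError fromLibraryPath) {
            // Attempt 2: load an extracted copy from the native workdir; this
            // path is illustrative only.
            System.load("/tmp/libnetty_tcnative_linux_x86_64.so");
        }
    }
}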
[2018-04-24T17:17:39,110][DEBUG][io.netty.util.ResourceLeakDetector] -Dio.netty.leakDetection.level: simple
[2018-04-24T17:17:39,110][DEBUG][io.netty.util.ResourceLeakDetector] -Dio.netty.leakDetection.targetRecords: 4
[2018-04-24T17:17:39,118][DEBUG][io.netty.buffer.AbstractByteBuf] -Dio.netty.buffer.bytebuf.checkAccessible: true
[2018-04-24T17:17:39,120][DEBUG][io.netty.util.ResourceLeakDetectorFactory] Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@73fb013f
[2018-04-24T17:17:39,160][DEBUG][io.netty.util.internal.InternalThreadLocalMap] -Dio.netty.threadLocalMap.stringBuilder.initialSize: 1024
[2018-04-24T17:17:39,160][DEBUG][io.netty.util.internal.InternalThreadLocalMap] -Dio.netty.threadLocalMap.stringBuilder.maxSize: 4096
[2018-04-24T17:17:39,163][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.numHeapArenas: 8
[2018-04-24T17:17:39,163][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.numDirectArenas: 8
[2018-04-24T17:17:39,163][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.pageSize: 8192
[2018-04-24T17:17:39,163][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.maxOrder: 11
[2018-04-24T17:17:39,163][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.chunkSize: 16777216
[2018-04-24T17:17:39,163][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.tinyCacheSize: 512
[2018-04-24T17:17:39,163][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.smallCacheSize: 256
[2018-04-24T17:17:39,163][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.normalCacheSize: 64
[2018-04-24T17:17:39,163][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.maxCachedBufferCapacity: 32768
[2018-04-24T17:17:39,163][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.cacheTrimInterval: 8192
[2018-04-24T17:17:39,163][DEBUG][io.netty.buffer.PooledByteBufAllocator] -Dio.netty.allocator.useCacheForAllThreads: true
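The PooledByteBufAllocator settings above are internally consistent: in Netty the chunk size is the page size shifted left by maxOrder. A one-line Java check of the logged numbers:

public class AllocatorChunkSizeCheck {
    public static void main(String[] args) {
        int pageSize = 8192;  // -Dio.netty.allocator.pageSize logged above
        int maxOrder = 11;    // -Dio.netty.allocator.maxOrder logged above
        // chunkSize = pageSize << maxOrder = 8192 << 11 = 16777216 (16 MiB),
        // matching the -Dio.netty.allocator.chunkSize line above.
        System.out.println(pageSize << maxOrder); // 16777216
    }
}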
[2018-04-24T17:17:39,170][DEBUG][io.netty.buffer.ByteBufUtil] -Dio.netty.allocator.type: pooled
[2018-04-24T17:17:39,170][DEBUG][io.netty.buffer.ByteBufUtil] -Dio.netty.threadLocalDirectBufferSize: 65536
[2018-04-24T17:17:39,170][DEBUG][io.netty.buffer.ByteBufUtil] -Dio.netty.maxThreadLocalCharBufferSize: 16384
[2018-04-24T17:17:39,183][DEBUG][io.netty.util.ResourceLeakDetectorFactory] Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@5557392a
[2018-04-24T17:17:39,183][DEBUG][io.netty.handler.ssl.ReferenceCountedOpenSslContext] ReferenceCountedOpenSslContext supports -Djdk.tls.ephemeralDHKeySize={int}, but got: matched
[2018-04-24T17:17:39,189][DEBUG][io.netty.util.Recycler ] -Dio.netty.recycler.maxCapacityPerThread: 32768
[2018-04-24T17:17:39,189][DEBUG][io.netty.util.Recycler ] -Dio.netty.recycler.maxSharedCapacityFactor: 2
[2018-04-24T17:17:39,189][DEBUG][io.netty.util.Recycler ] -Dio.netty.recycler.linkCapacity: 16
[2018-04-24T17:17:39,189][DEBUG][io.netty.util.Recycler ] -Dio.netty.recycler.ratio: 8
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 => ECDHE-ECDSA-AES128-GCM-SHA256
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256 => ECDHE-ECDSA-AES128-GCM-SHA256
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256 => ECDHE-RSA-AES128-GCM-SHA256
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_RSA_WITH_AES_128_GCM_SHA256 => ECDHE-RSA-AES128-GCM-SHA256
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384 => ECDHE-ECDSA-AES256-GCM-SHA384
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384 => ECDHE-ECDSA-AES256-GCM-SHA384
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_RSA_WITH_AES_256_GCM_SHA384 => ECDHE-RSA-AES256-GCM-SHA384
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_RSA_WITH_AES_256_GCM_SHA384 => ECDHE-RSA-AES256-GCM-SHA384
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256 => ECDHE-ECDSA-CHACHA20-POLY1305
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_CHACHA20_POLY1305_SHA256 => ECDHE-ECDSA-CHACHA20-POLY1305
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 => ECDHE-RSA-CHACHA20-POLY1305
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_RSA_WITH_CHACHA20_POLY1305_SHA256 => ECDHE-RSA-CHACHA20-POLY1305
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_PSK_WITH_CHACHA20_POLY1305_SHA256 => ECDHE-PSK-CHACHA20-POLY1305
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_PSK_WITH_CHACHA20_POLY1305_SHA256 => ECDHE-PSK-CHACHA20-POLY1305
[2018-04-24T17:17:39,203][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA => ECDHE-ECDSA-AES128-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_AES_128_CBC_SHA => ECDHE-ECDSA-AES128-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256 => ECDHE-ECDSA-AES128-SHA256
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_AES_128_CBC_SHA256 => ECDHE-ECDSA-AES128-SHA256
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA => ECDHE-RSA-AES128-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_RSA_WITH_AES_128_CBC_SHA => ECDHE-RSA-AES128-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA256 => ECDHE-RSA-AES128-SHA256
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_RSA_WITH_AES_128_CBC_SHA256 => ECDHE-RSA-AES128-SHA256
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_PSK_WITH_AES_128_CBC_SHA => ECDHE-PSK-AES128-CBC-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_PSK_WITH_AES_128_CBC_SHA => ECDHE-PSK-AES128-CBC-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA => ECDHE-ECDSA-AES256-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_AES_256_CBC_SHA => ECDHE-ECDSA-AES256-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384 => ECDHE-ECDSA-AES256-SHA384
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_ECDSA_WITH_AES_256_CBC_SHA384 => ECDHE-ECDSA-AES256-SHA384
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA => ECDHE-RSA-AES256-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_RSA_WITH_AES_256_CBC_SHA => ECDHE-RSA-AES256-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA384 => ECDHE-RSA-AES256-SHA384
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_RSA_WITH_AES_256_CBC_SHA384 => ECDHE-RSA-AES256-SHA384
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_ECDHE_PSK_WITH_AES_256_CBC_SHA => ECDHE-PSK-AES256-CBC-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_ECDHE_PSK_WITH_AES_256_CBC_SHA => ECDHE-PSK-AES256-CBC-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_RSA_WITH_AES_128_GCM_SHA256 => AES128-GCM-SHA256
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_RSA_WITH_AES_128_GCM_SHA256 => AES128-GCM-SHA256
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_RSA_WITH_AES_256_GCM_SHA384 => AES256-GCM-SHA384
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_RSA_WITH_AES_256_GCM_SHA384 => AES256-GCM-SHA384
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_RSA_WITH_AES_128_CBC_SHA => AES128-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_RSA_WITH_AES_128_CBC_SHA => AES128-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_RSA_WITH_AES_128_CBC_SHA256 => AES128-SHA256
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_RSA_WITH_AES_128_CBC_SHA256 => AES128-SHA256
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_PSK_WITH_AES_128_CBC_SHA => PSK-AES128-CBC-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_PSK_WITH_AES_128_CBC_SHA => PSK-AES128-CBC-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_RSA_WITH_AES_256_CBC_SHA => AES256-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_RSA_WITH_AES_256_CBC_SHA => AES256-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_RSA_WITH_AES_256_CBC_SHA256 => AES256-SHA256
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_RSA_WITH_AES_256_CBC_SHA256 => AES256-SHA256
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_PSK_WITH_AES_256_CBC_SHA => PSK-AES256-CBC-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_PSK_WITH_AES_256_CBC_SHA => PSK-AES256-CBC-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: TLS_RSA_WITH_3DES_EDE_CBC_SHA => DES-CBC3-SHA
[2018-04-24T17:17:39,204][DEBUG][io.netty.handler.ssl.CipherSuiteConverter] Cipher suite mapping: SSL_RSA_WITH_3DES_EDE_CBC_SHA => DES-CBC3-SHA
[2018-04-24T17:17:39,205][DEBUG][io.netty.handler.ssl.OpenSsl] Supported protocols (OpenSSL): [[SSLv2Hello, TLSv1, TLSv1.1, TLSv1.2]]
[2018-04-24T17:17:39,205][DEBUG][io.netty.handler.ssl.OpenSsl] Default cipher suites (OpenSSL): [TLS_ECDHE_ECDSA_WITH_AES_256_GCM_SHA384, TLS_ECDHE_ECDSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_128_GCM_SHA256, TLS_ECDHE_RSA_WITH_AES_128_CBC_SHA, TLS_ECDHE_RSA_WITH_AES_256_CBC_SHA, TLS_RSA_WITH_AES_128_GCM_SHA256, TLS_RSA_WITH_AES_128_CBC_SHA, TLS_RSA_WITH_AES_256_CBC_SHA]
[2018-04-24T17:17:39,206][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
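
The CipherSuiteConverter lines above are Netty's one-time enumeration of JDK-to-OpenSSL cipher-suite names, emitted while the beats input prepares its TLS context before binding to 0.0.0.0:5044. A minimal sketch of a beats input pinning the listener address and a TLS floor follows; it is not recovered from this deployment's config — the ssl flag and certificate paths are placeholders, though every option name is a real logstash-input-beats option:

    input {
      beats {
        host => "0.0.0.0"
        port => 5044
        # Assumption: nothing in this log confirms TLS was enabled on this
        # listener; the paths below are placeholders.
        ssl             => true
        ssl_certificate => "/etc/logstash/certs/logstash.crt"
        ssl_key         => "/etc/logstash/certs/logstash.key"
        tls_min_version => 1.2   # refuse anything below TLSv1.2
      }
    }
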
[2018-04-24T17:17:39,211][DEBUG][io.netty.channel.MultithreadEventLoopGroup] -Dio.netty.eventLoopThreads: 8
[2018-04-24T17:17:39,231][DEBUG][io.netty.channel.nio.NioEventLoop] -Dio.netty.noKeySetOptimization: false
[2018-04-24T17:17:39,231][DEBUG][io.netty.channel.nio.NioEventLoop] -Dio.netty.selectorAutoRebuildThreshold: 512
[2018-04-24T17:17:39,238][DEBUG][io.netty.util.internal.PlatformDependent] org.jctools-core.MpscChunkedArrayQueue: available
[2018-04-24T17:17:39,246][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x3b732c82@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 sleep>"}
[2018-04-24T17:17:39,250][INFO ][logstash.agent ] Pipelines running {:count=>2, :pipelines=>[".monitoring-logstash", "main"]}
[2018-04-24T17:17:39,251][INFO ][logstash.inputs.metrics ] Monitoring License OK
[2018-04-24T17:17:39,252][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2018-04-24T17:17:39,275][DEBUG][io.netty.channel.DefaultChannelId] -Dio.netty.processId: 9921 (auto-detected)
[2018-04-24T17:17:39,277][DEBUG][io.netty.util.NetUtil ] -Djava.net.preferIPv4Stack: true
[2018-04-24T17:17:39,277][DEBUG][io.netty.util.NetUtil ] -Djava.net.preferIPv6Addresses: false
[2018-04-24T17:17:39,278][DEBUG][io.netty.util.NetUtil ] Loopback interface: lo (lo, 0:0:0:0:0:0:0:1%lo)
[2018-04-24T17:17:39,278][DEBUG][io.netty.util.NetUtil ] /proc/sys/net/core/somaxconn: 128
[2018-04-24T17:17:39,279][DEBUG][io.netty.channel.DefaultChannelId] -Dio.netty.machineId: 00:50:56:ff:fe:8b:a9:e3 (auto-detected)
[2018-04-24T17:17:39,401][DEBUG][logstash.pipeline ] 29a8537d7ad3a0c33411f2ef220944d21e1e034b20ba581de58a56578", "id"=>"c869426d64fc5c01242c74002a3d11750ddb19efc81001611910908ef818322f", "from"=>"a4eeaa8b6fc65b97bcf9fae420ec2b32afbfc3c9f145e9a947d202a241be3e8d"}, {"type"=>"plain", "to"=>"94c3fb44fd4485d915b4f47760800ef72d803e08af970b60eb7e2bb6a76f863c", "id"=>"249eceb5d34dc80ef6bb9aa549dffe7f369ec2efbf1a8dd74f346382e52ea9d2", "from"=>"83dd44229a8537d7ad3a0c33411f2ef220944d21e1e034b20ba581de58a56578"}, {"type"=>"plain", "to"=>"b355a946e1dd18ca9fcd72d7b3967f6f28d003aaab13ca68f98706c886330e67", "id"=>"ef89a8ea095ee32158fd1df6d511a9c333a7ff288579334ab4f30f7e1a43b788", "from"=>"f03898c9ae1dcf6c1e43edff832388087a4e508c2af2bea6828e725e91006250"}, {"type"=>"plain", "to"=>"83cfabdc54e1b36eddd9006f6152f2db76d90067ed269fc0ef42a901a33eee01", "id"=>"67d77359874b3d86da27a5856c8d21bcbf526fbe97ee572b79a1bd654ca4f701", "from"=>"b355a946e1dd18ca9fcd72d7b3967f6f28d003aaab13ca68f98706c886330e67"}, {"type"=>"plain", "to"=>"a8d6a82ac595b101dafb6127a737491e870321c634c846df9c6fcc7fc1ef6fb7", "id"=>"ddee1036075a68af6767dd49fd05a953ed66c126e4fff1d98935d299b7862c20", "from"=>"83cfabdc54e1b36eddd9006f6152f2db76d90067ed269fc0ef42a901a33eee01"}, {"type"=>"plain", "to"=>"49721ca4126eab2bb4c9cfb46ff71f547fb6ef00419614af28eb7cd247a67c99", "id"=>"909920dd7285d66c9a232f8c466302bb87dd43cc84a11b284fbe856782b09555", "from"=>"5c5c7de9a254764eb192289530ff73343de85b4b49af0e3f0dff371267d0d0d2"}, {"type"=>"plain", "to"=>"ca44f71e8b77fe8918f303e55479a4b1a451e403040e32630df48bddc3819b40", "id"=>"3c525edb062d10bd718dcf75fd629367ca9f7e0a2b42f0921d2f596293729566", "from"=>"49721ca4126eab2bb4c9cfb46ff71f547fb6ef00419614af28eb7cd247a67c99"}, {"type"=>"plain", "to"=>"98148cf069a8dc5dbfba3bd5f4247bfceb347f5730b9395ddaa2f493db8c23ef", "id"=>"06ab2eac2930ec7f8343d8ecf0ebd52a2f6adbafb5a79863598bc9b399418db2", "from"=>"8f467467d51f5218229defdae13f6d26046943a40cd23a9525f45a91d8ff700b"}, {"type"=>"plain", "to"=>"6aeeff6c2638eb7e0f31012bdfb27b0e03a6c983b6ed03efe8f17ac2e0f19a20", "id"=>"c24d38758fcc8245b9d16f9b32d65e506716fe345b4a687cc587b2ea9b5bd759", "from"=>"98148cf069a8dc5dbfba3bd5f4247bfceb347f5730b9395ddaa2f493db8c23ef"}, {"type"=>"boolean", "to"=>"8f467467d51f5218229defdae13f6d26046943a40cd23a9525f45a91d8ff700b", "id"=>"331ada00bff70929bac2f1bea783fdcc564f280e6a79283055989af592acea88", "from"=>"9bdb9493e780337ddd3c3127e9a569f2c77dd5565028dc50bf9becd07b962d98", "when"=>true}, {"type"=>"boolean", "to"=>"5c5c7de9a254764eb192289530ff73343de85b4b49af0e3f0dff371267d0d0d2", "id"=>"3c3a4b918da9c26e9b58b760ad27946dfa9a455d1c1ed89719dfa8ed9eb42dcd", "from"=>"7c5598bd2fd92c0ee3070eaad61b50291964a7e8c73c23d9eb7773a77c274a0b", "when"=>true}, {"type"=>"boolean", "to"=>"9bdb9493e780337ddd3c3127e9a569f2c77dd5565028dc50bf9becd07b962d98", "id"=>"89840ffd9b3a98afaf57c8e8c41f10778bbec81c50bf490055f9a857f96a7ff9", "from"=>"7c5598bd2fd92c0ee3070eaad61b50291964a7e8c73c23d9eb7773a77c274a0b", "when"=>false}, {"type"=>"boolean", "to"=>"f03898c9ae1dcf6c1e43edff832388087a4e508c2af2bea6828e725e91006250", "id"=>"f0839860e256075cc92d59de0880ab1906ce84964ed0e4fb8a0880c8be548392", "from"=>"9f1e7d773ca7041ae8f0ba237366441dc7f09abf89142ac08a79ddff93f3ed9f", "when"=>true}, {"type"=>"boolean", "to"=>"7c5598bd2fd92c0ee3070eaad61b50291964a7e8c73c23d9eb7773a77c274a0b", "id"=>"d365a10855ef51c646c1807393ae6663c4a625920272e5e94c273ea03198ff04", "from"=>"9f1e7d773ca7041ae8f0ba237366441dc7f09abf89142ac08a79ddff93f3ed9f", "when"=>false}, {"type"=>"boolean", 
"to"=>"a4eeaa8b6fc65b97bcf9fae420ec2b32afbfc3c9f145e9a947d202a241be3e8d", "id"=>"7ba2c0f52802bbc85d0a3bc836ef093a121443edf9d26fe0c62b2e6cd78fcaad", "from"=>"68adeea06b74341e9a64f57ac30e2c53337ff3a200a7964eaf1e2c68b4c72878", "when"=>true}, {"type"=>"boolean", "to"=>"9f1e7d773ca7041ae8f0ba237366441dc7f09abf89142ac08a79ddff93f3ed9f", "id"=>"09142d92e11b59ac1fe74478ea7f5a4438d0aba23c2c3161815751628a527bf9", "from"=>"68adeea06b74341e9a64f57ac30e2c53337ff3a200a7964eaf1e2c68b4c72878", "when"=>false}, {"type"=>"boolean", "to"=>"4037240d9184f97dd40e18d7d0e0c20bf404e642aa17c2a60f3e76cd4aae0415", "id"=>"6dedcf294bf31ccbcaf2f34d9558b9b224090b1b58f7d75a15abe1acf6d37b24", "from"=>"73ce98507f3134adbcd95843d447d1b16e0b9a641ff51836ff82ea0b7ff726fb", "when"=>true}, {"type"=>"boolean", "to"=>"68adeea06b74341e9a64f57ac30e2c53337ff3a200a7964eaf1e2c68b4c72878", "id"=>"bd7d26e842e4b71a3c3172ea34cce368fe917463458d4d106207ef3835f21cdd", "from"=>"73ce98507f3134adbcd95843d447d1b16e0b9a641ff51836ff82ea0b7ff726fb", "when"=>false}, {"type"=>"boolean", "to"=>"505c54eaf9f7dd705fda815d4f9ce6be2537022d424504b6a722f8d18b174c48", "id"=>"08c71b272433c4609f40e04f73c23a1f43bf708c14a4e390dce3cf5593685ffd", "from"=>"53d97b1319d912fae1d0714e0d69f94e8e15135e9c597eb0d41e49d54e58a590", "when"=>true}, {"type"=>"boolean", "to"=>"73ce98507f3134adbcd95843d447d1b16e0b9a641ff51836ff82ea0b7ff726fb", "id"=>"feeb35ff1416efe24e5f7f31823ba1fa56ab5a185b033c5f274731afd908b51e", "from"=>"53d97b1319d912fae1d0714e0d69f94e8e15135e9c597eb0d41e49d54e58a590", "when"=>false}, {"type"=>"plain", "to"=>"53d97b1319d912fae1d0714e0d69f94e8e15135e9c597eb0d41e49d54e58a590", "id"=>"7d9bcc43cef802d2a86dc4926dc11cbf0ab50dad5a9d8d9fca69f466d62818c9", "from"=>"9fc06e415faae68f28253c2f1e1835ca4184585e823d2adf3dfdcab75ecf93eb"}, {"type"=>"plain", "to"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1", "id"=>"2b47edba101b30a101ad6d6d9402770fda6e6769b43902050990162082b96563", "from"=>"505c54eaf9f7dd705fda815d4f9ce6be2537022d424504b6a722f8d18b174c48"}, {"type"=>"plain", "to"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1", "id"=>"642089194e85738ffe362e97028585e9f595107f54384e6288a9d32a771738d3", "from"=>"bbe33a88317edd9a24cee3aa9d9cce34045096614bca501f3c58961a0b8a81f0"}, {"type"=>"plain", "to"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1", "id"=>"1be25b52f462949d110ebee657ae0bd37dd6556b5c9a8fe30da11d75f3bc2571", "from"=>"94c3fb44fd4485d915b4f47760800ef72d803e08af970b60eb7e2bb6a76f863c"}, {"type"=>"plain", "to"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1", "id"=>"352cd0901681c0f53a7faf4e8ad47be79f794e977ba4d9a91d7e6310834bf396", "from"=>"a8d6a82ac595b101dafb6127a737491e870321c634c846df9c6fcc7fc1ef6fb7"}, {"type"=>"plain", "to"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1", "id"=>"42a67a6fe5e5f1d6aea9fcd83768e56d3df37645db9d4c02ba9e0061734a870f", "from"=>"ca44f71e8b77fe8918f303e55479a4b1a451e403040e32630df48bddc3819b40"}, {"type"=>"plain", "to"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1", "id"=>"ab8a48c00a6afd4769628ba4b9690322411e8604a6213d5571bf661360e7efe1", "from"=>"6aeeff6c2638eb7e0f31012bdfb27b0e03a6c983b6ed03efe8f17ac2e0f19a20"}, {"type"=>"boolean", "to"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1", "id"=>"f8a08efee4e7ff924d70791afe8aa819b5b17b108a4b52d86033d8a964e3f097", "from"=>"9bdb9493e780337ddd3c3127e9a569f2c77dd5565028dc50bf9becd07b962d98", "when"=>false}, {"type"=>"plain", 
"to"=>"9fc06e415faae68f28253c2f1e1835ca4184585e823d2adf3dfdcab75ecf93eb", "id"=>"c207255485397504e44c78a2a65799fa0f4ca542862868f54b191f7978b228ab", "from"=>"__QUEUE__"}, {"type"=>"boolean", "to"=>"3630197e44f4ce7f7863661164289fb53b337b76f88f4f4d1181444e12822517", "id"=>"90db76f5e8e7336dbcfbf4497ac47fa52aef25984a10cc0916710a3ab533c216", "from"=>"b46187a1f88a196a953264b887723ce3db24ff79c9e76edf751dfab451dafb9e", "when"=>true}, {"type"=>"boolean", "to"=>"e81d6c1b2dbd27273d0670c70a10032ebab46c05b58b1646e0c5ef7486889d0e", "id"=>"2ded7a5820f58010f7f4c1358d715ff8696550dda32e660d84b80d1391601918", "from"=>"b46187a1f88a196a953264b887723ce3db24ff79c9e76edf751dfab451dafb9e", "when"=>false}, {"type"=>"boolean", "to"=>"c9a4e5b5f33e1ee98d60373e7d712bab92d78dfbbc32b04402534212a92c7312", "id"=>"0fc757ca35936aef31ae37c3ddfd3151fb90d9c32774d7dce5159cc0b1893376", "from"=>"6e57e4e9a80dc4da2af483954484e11321b0184accd03b20d42264d483130b32", "when"=>true}, {"type"=>"boolean", "to"=>"b46187a1f88a196a953264b887723ce3db24ff79c9e76edf751dfab451dafb9e", "id"=>"e41a69bdd2dcdb4bd10a0f8dafded9f738d2fde70449956bb522e268315dd606", "from"=>"6e57e4e9a80dc4da2af483954484e11321b0184accd03b20d42264d483130b32", "when"=>false}, {"type"=>"boolean", "to"=>"9c4441aeea169e568e057a019bb62fb0bb7907cd8dcee0e9d21d9c28a41e50e1", "id"=>"57ac831a6a8d49482520c8ffa77e0c564dec551ac00ac36959bc4c68aa40321b", "from"=>"a08042d8b15518e00c064824a5b6040cf15b322d460fddcaeba4dbcdcebff565", "when"=>true}, {"type"=>"boolean", "to"=>"6e57e4e9a80dc4da2af483954484e11321b0184accd03b20d42264d483130b32", "id"=>"5ec37d9658353f351323e195d53f8e4a68b4282a4be3392b319435c5e4a43a93", "from"=>"a08042d8b15518e00c064824a5b6040cf15b322d460fddcaeba4dbcdcebff565", "when"=>false}, {"type"=>"boolean", "to"=>"5ce646a90d89ed12a7f99cb997af29716078b6d0d3d4cf31ff8ae95f6ee2ceae", "id"=>"c450a99614e62d16b74634babc528b8ad80bdc2135237efefff0fb5deff12e49", "from"=>"c4078407a72cf52dadb5d3ead76d8ed9576e0625e1e635e7a6be4aa1af02d08d", "when"=>true}, {"type"=>"boolean", "to"=>"a08042d8b15518e00c064824a5b6040cf15b322d460fddcaeba4dbcdcebff565", "id"=>"b4fb1ec5b18a7bd2c09e4db7ed1e9055357841930dcd4d812956215812d4e5bb", "from"=>"c4078407a72cf52dadb5d3ead76d8ed9576e0625e1e635e7a6be4aa1af02d08d", "when"=>false}, {"type"=>"boolean", "to"=>"17f56a2dd2dac572b4037d04771e3e1ace454ec24fd5523d664576d9f466ef31", "id"=>"cc45f7cbe31ff668af766f91850826758d5543a23742c7ab5aef28aa1417fff1", "from"=>"6633bea8de95b92117d20f453bb2db9f2be2d2f80c0bd503588b6047d0d2afef", "when"=>true}, {"type"=>"boolean", "to"=>"c4078407a72cf52dadb5d3ead76d8ed9576e0625e1e635e7a6be4aa1af02d08d", "id"=>"6c532b69dc22cc709ede7ebdf78cf2be8b68fb8936feecff2a9a38a8347a4ab5", "from"=>"6633bea8de95b92117d20f453bb2db9f2be2d2f80c0bd503588b6047d0d2afef", "when"=>false}, {"type"=>"plain", "to"=>"6633bea8de95b92117d20f453bb2db9f2be2d2f80c0bd503588b6047d0d2afef", "id"=>"76806ce5c8eaead010e8c34313388724d6ddf1bf7c3299c8f719662ebd42b76c", "from"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1"}]}}}}}
[2018-04-24T17:17:39,406][DEBUG][logstash.pipeline ] 29a8537d7ad3a0c33411f2ef220944d21e1e034b20ba581de58a56578", "id"=>"c869426d64fc5c01242c74002a3d11750ddb19efc81001611910908ef818322f", "from"=>"a4eeaa8b6fc65b97bcf9fae420ec2b32afbfc3c9f145e9a947d202a241be3e8d"}, {"type"=>"plain", "to"=>"94c3fb44fd4485d915b4f47760800ef72d803e08af970b60eb7e2bb6a76f863c", "id"=>"249eceb5d34dc80ef6bb9aa549dffe7f369ec2efbf1a8dd74f346382e52ea9d2", "from"=>"83dd44229a8537d7ad3a0c33411f2ef220944d21e1e034b20ba581de58a56578"}, {"type"=>"plain", "to"=>"b355a946e1dd18ca9fcd72d7b3967f6f28d003aaab13ca68f98706c886330e67", "id"=>"ef89a8ea095ee32158fd1df6d511a9c333a7ff288579334ab4f30f7e1a43b788", "from"=>"f03898c9ae1dcf6c1e43edff832388087a4e508c2af2bea6828e725e91006250"}, {"type"=>"plain", "to"=>"83cfabdc54e1b36eddd9006f6152f2db76d90067ed269fc0ef42a901a33eee01", "id"=>"67d77359874b3d86da27a5856c8d21bcbf526fbe97ee572b79a1bd654ca4f701", "from"=>"b355a946e1dd18ca9fcd72d7b3967f6f28d003aaab13ca68f98706c886330e67"}, {"type"=>"plain", "to"=>"a8d6a82ac595b101dafb6127a737491e870321c634c846df9c6fcc7fc1ef6fb7", "id"=>"ddee1036075a68af6767dd49fd05a953ed66c126e4fff1d98935d299b7862c20", "from"=>"83cfabdc54e1b36eddd9006f6152f2db76d90067ed269fc0ef42a901a33eee01"}, {"type"=>"plain", "to"=>"49721ca4126eab2bb4c9cfb46ff71f547fb6ef00419614af28eb7cd247a67c99", "id"=>"909920dd7285d66c9a232f8c466302bb87dd43cc84a11b284fbe856782b09555", "from"=>"5c5c7de9a254764eb192289530ff73343de85b4b49af0e3f0dff371267d0d0d2"}, {"type"=>"plain", "to"=>"ca44f71e8b77fe8918f303e55479a4b1a451e403040e32630df48bddc3819b40", "id"=>"3c525edb062d10bd718dcf75fd629367ca9f7e0a2b42f0921d2f596293729566", "from"=>"49721ca4126eab2bb4c9cfb46ff71f547fb6ef00419614af28eb7cd247a67c99"}, {"type"=>"plain", "to"=>"98148cf069a8dc5dbfba3bd5f4247bfceb347f5730b9395ddaa2f493db8c23ef", "id"=>"06ab2eac2930ec7f8343d8ecf0ebd52a2f6adbafb5a79863598bc9b399418db2", "from"=>"8f467467d51f5218229defdae13f6d26046943a40cd23a9525f45a91d8ff700b"}, {"type"=>"plain", "to"=>"6aeeff6c2638eb7e0f31012bdfb27b0e03a6c983b6ed03efe8f17ac2e0f19a20", "id"=>"c24d38758fcc8245b9d16f9b32d65e506716fe345b4a687cc587b2ea9b5bd759", "from"=>"98148cf069a8dc5dbfba3bd5f4247bfceb347f5730b9395ddaa2f493db8c23ef"}, {"type"=>"boolean", "to"=>"8f467467d51f5218229defdae13f6d26046943a40cd23a9525f45a91d8ff700b", "id"=>"331ada00bff70929bac2f1bea783fdcc564f280e6a79283055989af592acea88", "from"=>"9bdb9493e780337ddd3c3127e9a569f2c77dd5565028dc50bf9becd07b962d98", "when"=>true}, {"type"=>"boolean", "to"=>"5c5c7de9a254764eb192289530ff73343de85b4b49af0e3f0dff371267d0d0d2", "id"=>"3c3a4b918da9c26e9b58b760ad27946dfa9a455d1c1ed89719dfa8ed9eb42dcd", "from"=>"7c5598bd2fd92c0ee3070eaad61b50291964a7e8c73c23d9eb7773a77c274a0b", "when"=>true}, {"type"=>"boolean", "to"=>"9bdb9493e780337ddd3c3127e9a569f2c77dd5565028dc50bf9becd07b962d98", "id"=>"89840ffd9b3a98afaf57c8e8c41f10778bbec81c50bf490055f9a857f96a7ff9", "from"=>"7c5598bd2fd92c0ee3070eaad61b50291964a7e8c73c23d9eb7773a77c274a0b", "when"=>false}, {"type"=>"boolean", "to"=>"f03898c9ae1dcf6c1e43edff832388087a4e508c2af2bea6828e725e91006250", "id"=>"f0839860e256075cc92d59de0880ab1906ce84964ed0e4fb8a0880c8be548392", "from"=>"9f1e7d773ca7041ae8f0ba237366441dc7f09abf89142ac08a79ddff93f3ed9f", "when"=>true}, {"type"=>"boolean", "to"=>"7c5598bd2fd92c0ee3070eaad61b50291964a7e8c73c23d9eb7773a77c274a0b", "id"=>"d365a10855ef51c646c1807393ae6663c4a625920272e5e94c273ea03198ff04", "from"=>"9f1e7d773ca7041ae8f0ba237366441dc7f09abf89142ac08a79ddff93f3ed9f", "when"=>false}, {"type"=>"boolean", 
"to"=>"a4eeaa8b6fc65b97bcf9fae420ec2b32afbfc3c9f145e9a947d202a241be3e8d", "id"=>"7ba2c0f52802bbc85d0a3bc836ef093a121443edf9d26fe0c62b2e6cd78fcaad", "from"=>"68adeea06b74341e9a64f57ac30e2c53337ff3a200a7964eaf1e2c68b4c72878", "when"=>true}, {"type"=>"boolean", "to"=>"9f1e7d773ca7041ae8f0ba237366441dc7f09abf89142ac08a79ddff93f3ed9f", "id"=>"09142d92e11b59ac1fe74478ea7f5a4438d0aba23c2c3161815751628a527bf9", "from"=>"68adeea06b74341e9a64f57ac30e2c53337ff3a200a7964eaf1e2c68b4c72878", "when"=>false}, {"type"=>"boolean", "to"=>"4037240d9184f97dd40e18d7d0e0c20bf404e642aa17c2a60f3e76cd4aae0415", "id"=>"6dedcf294bf31ccbcaf2f34d9558b9b224090b1b58f7d75a15abe1acf6d37b24", "from"=>"73ce98507f3134adbcd95843d447d1b16e0b9a641ff51836ff82ea0b7ff726fb", "when"=>true}, {"type"=>"boolean", "to"=>"68adeea06b74341e9a64f57ac30e2c53337ff3a200a7964eaf1e2c68b4c72878", "id"=>"bd7d26e842e4b71a3c3172ea34cce368fe917463458d4d106207ef3835f21cdd", "from"=>"73ce98507f3134adbcd95843d447d1b16e0b9a641ff51836ff82ea0b7ff726fb", "when"=>false}, {"type"=>"boolean", "to"=>"505c54eaf9f7dd705fda815d4f9ce6be2537022d424504b6a722f8d18b174c48", "id"=>"08c71b272433c4609f40e04f73c23a1f43bf708c14a4e390dce3cf5593685ffd", "from"=>"53d97b1319d912fae1d0714e0d69f94e8e15135e9c597eb0d41e49d54e58a590", "when"=>true}, {"type"=>"boolean", "to"=>"73ce98507f3134adbcd95843d447d1b16e0b9a641ff51836ff82ea0b7ff726fb", "id"=>"feeb35ff1416efe24e5f7f31823ba1fa56ab5a185b033c5f274731afd908b51e", "from"=>"53d97b1319d912fae1d0714e0d69f94e8e15135e9c597eb0d41e49d54e58a590", "when"=>false}, {"type"=>"plain", "to"=>"53d97b1319d912fae1d0714e0d69f94e8e15135e9c597eb0d41e49d54e58a590", "id"=>"7d9bcc43cef802d2a86dc4926dc11cbf0ab50dad5a9d8d9fca69f466d62818c9", "from"=>"9fc06e415faae68f28253c2f1e1835ca4184585e823d2adf3dfdcab75ecf93eb"}, {"type"=>"plain", "to"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1", "id"=>"2b47edba101b30a101ad6d6d9402770fda6e6769b43902050990162082b96563", "from"=>"505c54eaf9f7dd705fda815d4f9ce6be2537022d424504b6a722f8d18b174c48"}, {"type"=>"plain", "to"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1", "id"=>"642089194e85738ffe362e97028585e9f595107f54384e6288a9d32a771738d3", "from"=>"bbe33a88317edd9a24cee3aa9d9cce34045096614bca501f3c58961a0b8a81f0"}, {"type"=>"plain", "to"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1", "id"=>"1be25b52f462949d110ebee657ae0bd37dd6556b5c9a8fe30da11d75f3bc2571", "from"=>"94c3fb44fd4485d915b4f47760800ef72d803e08af970b60eb7e2bb6a76f863c"}, {"type"=>"plain", "to"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1", "id"=>"352cd0901681c0f53a7faf4e8ad47be79f794e977ba4d9a91d7e6310834bf396", "from"=>"a8d6a82ac595b101dafb6127a737491e870321c634c846df9c6fcc7fc1ef6fb7"}, {"type"=>"plain", "to"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1", "id"=>"42a67a6fe5e5f1d6aea9fcd83768e56d3df37645db9d4c02ba9e0061734a870f", "from"=>"ca44f71e8b77fe8918f303e55479a4b1a451e403040e32630df48bddc3819b40"}, {"type"=>"plain", "to"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1", "id"=>"ab8a48c00a6afd4769628ba4b9690322411e8604a6213d5571bf661360e7efe1", "from"=>"6aeeff6c2638eb7e0f31012bdfb27b0e03a6c983b6ed03efe8f17ac2e0f19a20"}, {"type"=>"boolean", "to"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1", "id"=>"f8a08efee4e7ff924d70791afe8aa819b5b17b108a4b52d86033d8a964e3f097", "from"=>"9bdb9493e780337ddd3c3127e9a569f2c77dd5565028dc50bf9becd07b962d98", "when"=>false}, {"type"=>"plain", 
"to"=>"9fc06e415faae68f28253c2f1e1835ca4184585e823d2adf3dfdcab75ecf93eb", "id"=>"c207255485397504e44c78a2a65799fa0f4ca542862868f54b191f7978b228ab", "from"=>"__QUEUE__"}, {"type"=>"boolean", "to"=>"3630197e44f4ce7f7863661164289fb53b337b76f88f4f4d1181444e12822517", "id"=>"90db76f5e8e7336dbcfbf4497ac47fa52aef25984a10cc0916710a3ab533c216", "from"=>"b46187a1f88a196a953264b887723ce3db24ff79c9e76edf751dfab451dafb9e", "when"=>true}, {"type"=>"boolean", "to"=>"e81d6c1b2dbd27273d0670c70a10032ebab46c05b58b1646e0c5ef7486889d0e", "id"=>"2ded7a5820f58010f7f4c1358d715ff8696550dda32e660d84b80d1391601918", "from"=>"b46187a1f88a196a953264b887723ce3db24ff79c9e76edf751dfab451dafb9e", "when"=>false}, {"type"=>"boolean", "to"=>"c9a4e5b5f33e1ee98d60373e7d712bab92d78dfbbc32b04402534212a92c7312", "id"=>"0fc757ca35936aef31ae37c3ddfd3151fb90d9c32774d7dce5159cc0b1893376", "from"=>"6e57e4e9a80dc4da2af483954484e11321b0184accd03b20d42264d483130b32", "when"=>true}, {"type"=>"boolean", "to"=>"b46187a1f88a196a953264b887723ce3db24ff79c9e76edf751dfab451dafb9e", "id"=>"e41a69bdd2dcdb4bd10a0f8dafded9f738d2fde70449956bb522e268315dd606", "from"=>"6e57e4e9a80dc4da2af483954484e11321b0184accd03b20d42264d483130b32", "when"=>false}, {"type"=>"boolean", "to"=>"9c4441aeea169e568e057a019bb62fb0bb7907cd8dcee0e9d21d9c28a41e50e1", "id"=>"57ac831a6a8d49482520c8ffa77e0c564dec551ac00ac36959bc4c68aa40321b", "from"=>"a08042d8b15518e00c064824a5b6040cf15b322d460fddcaeba4dbcdcebff565", "when"=>true}, {"type"=>"boolean", "to"=>"6e57e4e9a80dc4da2af483954484e11321b0184accd03b20d42264d483130b32", "id"=>"5ec37d9658353f351323e195d53f8e4a68b4282a4be3392b319435c5e4a43a93", "from"=>"a08042d8b15518e00c064824a5b6040cf15b322d460fddcaeba4dbcdcebff565", "when"=>false}, {"type"=>"boolean", "to"=>"5ce646a90d89ed12a7f99cb997af29716078b6d0d3d4cf31ff8ae95f6ee2ceae", "id"=>"c450a99614e62d16b74634babc528b8ad80bdc2135237efefff0fb5deff12e49", "from"=>"c4078407a72cf52dadb5d3ead76d8ed9576e0625e1e635e7a6be4aa1af02d08d", "when"=>true}, {"type"=>"boolean", "to"=>"a08042d8b15518e00c064824a5b6040cf15b322d460fddcaeba4dbcdcebff565", "id"=>"b4fb1ec5b18a7bd2c09e4db7ed1e9055357841930dcd4d812956215812d4e5bb", "from"=>"c4078407a72cf52dadb5d3ead76d8ed9576e0625e1e635e7a6be4aa1af02d08d", "when"=>false}, {"type"=>"boolean", "to"=>"17f56a2dd2dac572b4037d04771e3e1ace454ec24fd5523d664576d9f466ef31", "id"=>"cc45f7cbe31ff668af766f91850826758d5543a23742c7ab5aef28aa1417fff1", "from"=>"6633bea8de95b92117d20f453bb2db9f2be2d2f80c0bd503588b6047d0d2afef", "when"=>true}, {"type"=>"boolean", "to"=>"c4078407a72cf52dadb5d3ead76d8ed9576e0625e1e635e7a6be4aa1af02d08d", "id"=>"6c532b69dc22cc709ede7ebdf78cf2be8b68fb8936feecff2a9a38a8347a4ab5", "from"=>"6633bea8de95b92117d20f453bb2db9f2be2d2f80c0bd503588b6047d0d2afef", "when"=>false}, {"type"=>"plain", "to"=>"6633bea8de95b92117d20f453bb2db9f2be2d2f80c0bd503588b6047d0d2afef", "id"=>"76806ce5c8eaead010e8c34313388724d6ddf1bf7c3299c8f719662ebd42b76c", "from"=>"78267f24b6888d38ac8b6bcfc41ef16fea0550d3986fe5648b72d9df39334ee1"}]}}}}}
[2018-04-24T17:17:39,599][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_369dbb1e-c19c-44b0-82a3-3b5637a219a9"
[2018-04-24T17:17:39,600][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:39,600][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:39,615][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.16.253:62064] Received a new payload
[2018-04-24T17:17:39,616][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.16.253:62064] Sending a new message for the listener, sequence: 1
[2018-04-24T17:17:39,668][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.16.253:62064] Received a new payload
[2018-04-24T17:17:39,668][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.16.253:62064] Sending a new message for the listener, sequence: 1
[2018-04-24T17:17:39,759][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_369dbb1e-c19c-44b0-82a3-3b5637a219a9"
[2018-04-24T17:17:39,760][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:39,760][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:39,761][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"E:\\inetpub\\wwwroot\\dms.xpo.com\\portal\\logs\\dispatchOfficeWeb.log", "offset"=>628777, "input_type"=>"log", "type"=>"log4net_input", "@timestamp"=>2018-04-24T21:16:25.130Z, "beat"=>{"version"=>"5.2.0", "name"=>"DMSWEBSTA02", "hostname"=>"DMSWEBSTA02"}, "message"=>"2018-04-24 17:16:23,744 ERROR [Penchant.dispatchOffice.Common - (257)]: LogException() - 114fed3e-ef45-42cc-8988-ef2fe4805441: A public action method 'Login' was not found on controller 'dispatchOfficeWeb.Controllers.SecurityController'.: Exception occurred while processing URL: '/portal/signin'\nA public action method 'Login' was not found on controller 'dispatchOfficeWeb.Controllers.SecurityController'.\n : at System.Web.Mvc.Controller.HandleUnknownAction(String actionName)\n at System.Web.Mvc.Controller.<>c__DisplayClass22.<BeginExecuteCore>b__1e()\n at System.Web.Mvc.Async.AsyncResultWrapper.<.cctor>b__0(IAsyncResult asyncResult, Action action)\n at System.Web.Mvc.Controller.EndExecuteCore(IAsyncResult asyncResult)\n at System.Web.Mvc.Async.AsyncResultWrapper.WrappedAsyncVoid`1.CallEndDelegate(IAsyncResult asyncResult)\n at System.Web.Mvc.Controller.EndExecute(IAsyncResult asyncResult)\n at System.Web.Mvc.MvcHandler.<BeginProcessRequest>b__5(IAsyncResult asyncResult, ProcessRequestState innerState)\n at System.Web.Mvc.Async.AsyncResultWrapper.WrappedAsyncVoid`1.CallEndDelegate(IAsyncResult asyncResult)\n at System.Web.Mvc.MvcHandler.EndProcessRequest(IAsyncResult asyncResult)\n at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()\n at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()\n at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)", "fields"=>{"env_id"=>"STA", "app_id"=>"portal"}, "host"=>"DMSWEBSTA02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,766][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Received a new payload
[2018-04-24T17:17:39,766][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 1
[2018-04-24T17:17:39,767][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 2
[2018-04-24T17:17:39,767][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 3
[2018-04-24T17:17:39,768][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 4
[2018-04-24T17:17:39,769][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 5
[2018-04-24T17:17:39,776][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,776][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,776][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Received a new payload
[2018-04-24T17:17:39,776][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 1
[2018-04-24T17:17:39,777][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,777][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 2
[2018-04-24T17:17:39,777][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"E:\\inetpub\\wwwroot\\dms.xpo.com\\portal\\logs\\dispatchOfficeWeb.log", "offset"=>628777, "input_type"=>"log", "type"=>"log4net_input", "@timestamp"=>2018-04-24T21:16:25.130Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"2018-04-24 17:16:23,744 ERROR [Penchant.dispatchOffice.Common - (257)]: LogException() - 114fed3e-ef45-42cc-8988-ef2fe4805441: A public action method 'Login' was not found on controller 'dispatchOfficeWeb.Controllers.SecurityController'.: Exception occurred while processing URL: '/portal/signin'\nA public action method 'Login' was not found on controller 'dispatchOfficeWeb.Controllers.SecurityController'.\n : at System.Web.Mvc.Controller.HandleUnknownAction(String actionName)\n at System.Web.Mvc.Controller.<>c__DisplayClass22.<BeginExecuteCore>b__1e()\n at System.Web.Mvc.Async.AsyncResultWrapper.<.cctor>b__0(IAsyncResult asyncResult, Action action)\n at System.Web.Mvc.Controller.EndExecuteCore(IAsyncResult asyncResult)\n at System.Web.Mvc.Async.AsyncResultWrapper.WrappedAsyncVoid`1.CallEndDelegate(IAsyncResult asyncResult)\n at System.Web.Mvc.Controller.EndExecute(IAsyncResult asyncResult)\n at System.Web.Mvc.MvcHandler.<BeginProcessRequest>b__5(IAsyncResult asyncResult, ProcessRequestState innerState)\n at System.Web.Mvc.Async.AsyncResultWrapper.WrappedAsyncVoid`1.CallEndDelegate(IAsyncResult asyncResult)\n at System.Web.Mvc.MvcHandler.EndProcessRequest(IAsyncResult asyncResult)\n at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()\n at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()\n at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)", "fields"=>{"env_id"=>"STA", "app_id"=>"portal"}, "host"=>"DMSWEBSTA02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
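
Between "filter received" and "output received" this event loses [beat][hostname] and [beat][name], and a logtime field is dropped as well; those three removals are exactly what the mutate DEBUG lines above record. A sketch consistent with them, assuming (this part is a guess) that logtime is grok-captured from the log4net line and fed to a date filter before being discarded — only the removed field names are taken from the log:

    filter {
      # Assumed grok: lift the leading "2018-04-24 17:16:23,744 ERROR ..."
      # timestamp and level out of the log4net message.
      grok {
        match => { "message" => "^%{TIMESTAMP_ISO8601:logtime} %{LOGLEVEL:level} %{GREEDYDATA:msg}" }
      }
      date {
        match => ["logtime", "yyyy-MM-dd HH:mm:ss,SSS"]
      }
      # These removals are the ones visible in the DEBUG lines above.
      mutate {
        remove_field => ["[beat][hostname]", "[beat][name]", "logtime"]
      }
    }
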
[2018-04-24T17:17:39,778][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 3
[2018-04-24T17:17:39,778][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 4
[2018-04-24T17:17:39,779][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 5
[2018-04-24T17:17:39,780][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 6
[2018-04-24T17:17:39,780][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 7
[2018-04-24T17:17:39,782][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 8
[2018-04-24T17:17:39,792][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Received a new payload
[2018-04-24T17:17:39,793][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 1
[2018-04-24T17:17:39,794][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 2
[2018-04-24T17:17:39,795][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 3
[2018-04-24T17:17:39,795][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 4
[2018-04-24T17:17:39,796][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 5
[2018-04-24T17:17:39,797][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 6
[2018-04-24T17:17:39,798][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 7
[2018-04-24T17:17:39,798][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 8
[2018-04-24T17:17:39,799][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 9
[2018-04-24T17:17:39,802][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Received a new payload
[2018-04-24T17:17:39,802][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 1
[2018-04-24T17:17:39,803][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 2
[2018-04-24T17:17:39,803][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 3
[2018-04-24T17:17:39,804][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 4
[2018-04-24T17:17:39,805][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 5
[2018-04-24T17:17:39,805][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 6
[2018-04-24T17:17:39,806][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 7
[2018-04-24T17:17:39,807][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 8
[2018-04-24T17:17:39,808][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 9
[2018-04-24T17:17:39,809][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 10
[2018-04-24T17:17:39,815][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Received a new payload
[2018-04-24T17:17:39,815][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 1
[2018-04-24T17:17:39,816][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 2
[2018-04-24T17:17:39,816][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 3
[2018-04-24T17:17:39,817][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 4
[2018-04-24T17:17:39,817][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 5
[2018-04-24T17:17:39,818][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 6
[2018-04-24T17:17:39,818][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 7
[2018-04-24T17:17:39,819][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 8
[2018-04-24T17:17:39,820][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 9
[2018-04-24T17:17:39,820][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 10
[2018-04-24T17:17:39,825][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Received a new payload
[2018-04-24T17:17:39,825][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 1
[2018-04-24T17:17:39,826][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 2
[2018-04-24T17:17:39,826][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 3
[2018-04-24T17:17:39,827][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 4
[2018-04-24T17:17:39,828][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 5
[2018-04-24T17:17:39,829][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 6
[2018-04-24T17:17:39,829][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 7
[2018-04-24T17:17:39,830][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 8
[2018-04-24T17:17:39,831][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 9
[2018-04-24T17:17:39,831][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 10
[2018-04-24T17:17:39,920][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Received a new payload
[2018-04-24T17:17:39,920][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 1
[2018-04-24T17:17:39,924][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 2
[2018-04-24T17:17:39,935][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Received a new payload
[2018-04-24T17:17:39,935][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 1
[2018-04-24T17:17:39,955][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41168893, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.529Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at 
System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 73de1794-821e-43b8-a75f-dab93a2a4fee] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:14:58.7719242Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,955][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41169489, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.529Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235901 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235901 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:14:58.7914708Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,956][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41169953, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : About to read the business process name, Payload : [message : About to read the business process name] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:14:59.7750056Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,956][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41170391, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Found the business process, Payload : [message : Found the business process] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:14:59.7781725Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,956][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41170888, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Auto.ProcessAll] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:14:59.7786024Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,956][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41171432, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:14:59.8177140Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,956][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41171950, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*(), Payload : [message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*()] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:14:59.8180177Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,956][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41172510, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235906 and MessageQueueID: 35185039, Payload : [message : The messageID to process: 35235906 and MessageQueueID: 35185039] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:02.1774592Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,956][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41188018, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235908 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235908 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:04.6872573Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,957][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41188578, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:37.531Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235910 and MessageQueueID: 35185043, Payload : [message : The messageID to process: 35235910 and MessageQueueID: 35185043] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:07.0939056Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,957][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41189142, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:37.531Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185043 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185043 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:07.0976855Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,957][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41195456, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:37.531Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 08a527e0-9387-450c-90a2-780d0423159a] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:07.1184700Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,957][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41195998, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:37.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235910 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235910 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:07.1260302Z\n--------------- Event Log End Here ---------------\n", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,957][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41196052, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.531Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,957][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41196612, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.531Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235890 and MessageQueueID: 35185023, Payload : [message : The messageID to process: 35235890 and MessageQueueID: 35185023] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:09.5274828Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,957][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41197176, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.531Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185023 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185023 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:09.5313906Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,958][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41203490, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.531Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : f901c7a2-981b-47c5-ba6e-040e168aff9c] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:09.5530014Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,958][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41204086, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235890 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235890 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:09.5641623Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,958][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41204646, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235892 and MessageQueueID: 35185025, Payload : [message : The messageID to process: 35235892 and MessageQueueID: 35185025] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:11.9665932Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,958][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41205210, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185025 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185025 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:11.9711233Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,958][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41213048, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : About to read the business process name, Payload : [message : About to read the business process name] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:12.7683197Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
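
The "filter received" entries above show each ETW event entering the filter stage with type "comma_separated_input": apart from the multi-line error events, the message body is a single run of "Key : Value" pairs separated by commas (EventId, Level, Message, Payload, EventName, Timestamp). The pipeline configuration itself is not part of this log, so the following is only a minimal sketch of a kv filter that could split such messages; the split patterns are assumptions inferred from the message layout above, not the actual config:

    filter {
      if [type] == "comma_separated_input" {
        kv {
          source              => "message"
          field_split_pattern => ",\s"    # assumed separator between "Key : Value" pairs
          value_split_pattern => "\s:\s"  # assumed separator between each key and its value
        }
      }
    }

Note that a plain split on ",\s" would also fire on commas inside the Message text itself, so a real pipeline would more likely anchor the known field names with grok or dissect.
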
[2018-04-24T17:17:39,959][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,959][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,959][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,959][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,959][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,959][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,960][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,961][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,962][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,962][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,962][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,962][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,962][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,962][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,963][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,963][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,963][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,963][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,963][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,963][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,963][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,963][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,963][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,963][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,963][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,964][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,964][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,964][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,964][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,964][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,964][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,964][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,964][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,964][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,964][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
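
The two runs of mutate debug lines above record remove_field operations: the first strips [beat][hostname] and [beat][name] from each event (the "output received" entries below accordingly keep only "version" inside "beat"), and the second strips a "logtime" field, which never appears in the events printed here and is presumably a scratch field created earlier in the filter chain. A minimal sketch of mutate stanzas consistent with these lines follows; the real config is not included in this log and may arrange the removals differently:

    filter {
      mutate {
        # matches the "[beat][hostname]" / "[beat][name]" removals logged above
        remove_field => ["[beat][hostname]", "[beat][name]"]
      }
      mutate {
        # matches the "logtime" removals; "logtime" is assumed to be a temporary
        # field added by an earlier filter (e.g. grok plus date) before removal
        remove_field => ["logtime"]
      }
    }
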
[2018-04-24T17:17:39,965][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41168893, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.529Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 73de1794-821e-43b8-a75f-dab93a2a4fee] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:14:58.7719242Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,965][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41169489, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.529Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235901 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235901 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:14:58.7914708Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,966][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 2
[2018-04-24T17:17:39,967][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 3
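
The BeatsHandler lines interleaved below show the other end of the pipeline: Logstash listening on 10.54.52.31:5044 and handing messages (sequence 2, 3, ...) from the Filebeat 5.2.0 client at 10.54.48.24 to the input listener, with the default plain codec tagging every event "beats_input_codec_plain_applied". A minimal beats input consistent with this traffic, sketched with only the port taken from the log and everything else left at defaults:

    input {
      beats {
        port => 5044   # local endpoint seen in the BeatsHandler lines
      }
    }
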
[2018-04-24T17:17:39,965][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41169953, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : About to read the business process name, Payload : [message : About to read the business process name] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:14:59.7750056Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,967][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41170391, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Found the business process, Payload : [message : Found the business process] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:14:59.7781725Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,967][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41170888, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Auto.ProcessAll] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:14:59.7786024Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,967][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 4
[2018-04-24T17:17:39,967][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41171432, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:14:59.8177140Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,968][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41171950, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*(), Payload : [message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*()] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:14:59.8180177Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,968][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41172510, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235906 and MessageQueueID: 35185039, Payload : [message : The messageID to process: 35235906 and MessageQueueID: 35185039] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:02.1774592Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,968][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41188018, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235908 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235908 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:04.6872573Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,968][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41188578, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:37.531Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235910 and MessageQueueID: 35185043, Payload : [message : The messageID to process: 35235910 and MessageQueueID: 35185043] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:07.0939056Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,968][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 5
[2018-04-24T17:17:39,968][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41189142, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:37.531Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185043 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185043 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:07.0976855Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,969][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 6
[2018-04-24T17:17:39,968][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41195456, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:37.531Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 08a527e0-9387-450c-90a2-780d0423159a] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:07.1184700Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,969][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41195998, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:37.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235910 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235910 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:07.1260302Z\n--------------- Event Log End Here ---------------\n", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,969][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41196052, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.531Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,969][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 7
[2018-04-24T17:17:39,969][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41196612, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.531Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235890 and MessageQueueID: 35185023, Payload : [message : The messageID to process: 35235890 and MessageQueueID: 35185023] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:09.5274828Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,969][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41197176, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.531Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185023 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185023 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:09.5313906Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,970][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 8
[2018-04-24T17:17:39,969][DEBUG][logstash.pipeline        ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41203490, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.531Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : f901c7a2-981b-47c5-ba6e-040e168aff9c] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:09.5530014Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,970][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41204086, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235890 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235890 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:09.5641623Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,970][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41204646, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235892 and MessageQueueID: 35185025, Payload : [message : The messageID to process: 35235892 and MessageQueueID: 35185025] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:11.9665932Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,970][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41205210, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185025 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185025 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:11.9711233Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,970][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 9
[2018-04-24T17:17:39,971][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41213048, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : About to read the business process name, Payload : [message : About to read the business process name] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:12.7683197Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,971][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 10
[2018-04-24T17:17:39,979][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Received a new payload
[2018-04-24T17:17:39,979][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 1
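
The BeatsHandler lines above trace the Beats protocol at the transport level: a Filebeat 5.2.0 client (10.54.48.24) delivers a payload to the beats input listening on 10.54.52.31:5044, and each decoded frame is handed to the listener with its sequence number so the batch can be acknowledged. The pipeline configuration itself is not part of this log; the following is only a minimal sketch of an input block consistent with these lines, with the port and codec inferred from the logged endpoint and the "beats_input_codec_plain_applied" tag carried by every event:

    input {
      beats {
        # 5044 matches the "local: 10.54.52.31:5044" endpoint in the BeatsHandler lines
        port => 5044
        # the default plain codec is what tags events with "beats_input_codec_plain_applied"
      }
    }
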
[2018-04-24T17:17:39,977][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"E:\\inetpub\\wwwroot\\dms.xpo.com\\portal\\logs\\dispatchOfficeWeb.log", "offset"=>630318, "input_type"=>"log", "type"=>"log4net_input", "@timestamp"=>2018-04-24T21:17:25.136Z, "beat"=>{"version"=>"5.2.0", "name"=>"DMSWEBSTA02", "hostname"=>"DMSWEBSTA02"}, "message"=>"2018-04-24 17:16:58,128 ERROR [Penchant.dispatchOffice.Common - (174)]: LogException() - 815991dd-902d-484e-8c6f-e8a25add3abf: A public action method 'Login' was not found on controller 'dispatchOfficeWeb.Controllers.SecurityController'.: Exception occurred while processing URL: '/portal/signin'\nA public action method 'Login' was not found on controller 'dispatchOfficeWeb.Controllers.SecurityController'.\n : at System.Web.Mvc.Controller.HandleUnknownAction(String actionName)\n at System.Web.Mvc.Controller.<>c__DisplayClass22.<BeginExecuteCore>b__1e()\n at System.Web.Mvc.Async.AsyncResultWrapper.<.cctor>b__0(IAsyncResult asyncResult, Action action)\n at System.Web.Mvc.Controller.EndExecuteCore(IAsyncResult asyncResult)\n at System.Web.Mvc.Async.AsyncResultWrapper.WrappedAsyncVoid`1.CallEndDelegate(IAsyncResult asyncResult)\n at System.Web.Mvc.Controller.EndExecute(IAsyncResult asyncResult)\n at System.Web.Mvc.MvcHandler.<BeginProcessRequest>b__5(IAsyncResult asyncResult, ProcessRequestState innerState)\n at System.Web.Mvc.Async.AsyncResultWrapper.WrappedAsyncVoid`1.CallEndDelegate(IAsyncResult asyncResult)\n at System.Web.Mvc.MvcHandler.EndProcessRequest(IAsyncResult asyncResult)\n at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()\n at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()\n at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)", "fields"=>{"env_id"=>"STA", "app_id"=>"portal"}, "host"=>"DMSWEBSTA02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,981][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41161455, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.529Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,981][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41162015, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.529Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235901 and MessageQueueID: 35185034, Payload : [message : The messageID to process: 35235901 and MessageQueueID: 35185034] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:14:58.7167551Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,981][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41162579, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.529Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185034 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185034 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:14:58.7426155Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,982][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41173074, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185039 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185039 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:02.1811244Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,982][DEBUG][logstash.pipeline        ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41179388, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 46cf94a5-0fbe-4bc2-9183-962229141d8d] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:02.2013186Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,982][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41179984, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235906 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235906 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:02.2096303Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,982][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41180544, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235908 and MessageQueueID: 35185041, Payload : [message : The messageID to process: 35235908 and MessageQueueID: 35185041] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:04.6508084Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,983][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41181108, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185041 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185041 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:04.6544171Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,983][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41212120, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235892 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235892 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:12.0049729Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,983][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41216460, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.MessageQueue.Business.MessageQueueWatcher] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:14.2897132Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,983][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41227003, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235896 and MessageQueueID: 35185029, Payload : [message : The messageID to process: 35235896 and MessageQueueID: 35185029] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:16.8099489Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,983][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41227567, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185029 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185029 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:16.8138079Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,983][DEBUG][logstash.pipeline        ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41233881, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 557601a3-83b2-4456-996c-e3eece34519e] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:16.8349341Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,984][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41234477, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235896 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235896 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:16.8422219Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,984][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,985][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,985][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,985][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,985][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,985][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,985][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,985][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,985][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,985][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,985][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,985][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,985][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,986][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,986][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,986][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,986][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,986][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,986][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,986][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,986][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,987][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,987][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,987][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,987][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,987][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,987][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,987][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,987][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:39,987][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:39,988][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,988][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,988][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,988][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,988][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,988][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,988][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,989][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,989][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,989][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,989][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,989][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,989][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,989][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:39,989][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
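
The mutate debug lines above show two remove_field passes running against each event: one dropping the Filebeat-supplied [beat][hostname] and [beat][name] subfields (which is why the later "output received" entries carry only "beat"=>{"version"=>"5.2.0"}), and one dropping a logtime field that some earlier, unlogged stage must have created. The following is a minimal filter sketch consistent with this debug output; the field names come from the lines above, but everything else (filter order, whatever grok/date stage populates logtime) is an assumption, not the actual pipeline config:

    filter {
      mutate {
        # corresponds to the "[beat][hostname]" / "[beat][name]" removal entries above
        remove_field => [ "[beat][hostname]", "[beat][name]" ]
      }
      mutate {
        # corresponds to the "logtime" removal entries; logtime itself would have to be
        # added by an earlier stage (e.g. grok) that is not visible in this excerpt
        remove_field => [ "logtime" ]
      }
    }
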
[2018-04-24T17:17:39,989][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"E:\\inetpub\\wwwroot\\dms.xpo.com\\portal\\logs\\dispatchOfficeWeb.log", "offset"=>630318, "input_type"=>"log", "type"=>"log4net_input", "@timestamp"=>2018-04-24T21:17:25.136Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"2018-04-24 17:16:58,128 ERROR [Penchant.dispatchOffice.Common - (174)]: LogException() - 815991dd-902d-484e-8c6f-e8a25add3abf: A public action method 'Login' was not found on controller 'dispatchOfficeWeb.Controllers.SecurityController'.: Exception occurred while processing URL: '/portal/signin'\nA public action method 'Login' was not found on controller 'dispatchOfficeWeb.Controllers.SecurityController'.\n : at System.Web.Mvc.Controller.HandleUnknownAction(String actionName)\n at System.Web.Mvc.Controller.<>c__DisplayClass22.<BeginExecuteCore>b__1e()\n at System.Web.Mvc.Async.AsyncResultWrapper.<.cctor>b__0(IAsyncResult asyncResult, Action action)\n at System.Web.Mvc.Controller.EndExecuteCore(IAsyncResult asyncResult)\n at System.Web.Mvc.Async.AsyncResultWrapper.WrappedAsyncVoid`1.CallEndDelegate(IAsyncResult asyncResult)\n at System.Web.Mvc.Controller.EndExecute(IAsyncResult asyncResult)\n at System.Web.Mvc.MvcHandler.<BeginProcessRequest>b__5(IAsyncResult asyncResult, ProcessRequestState innerState)\n at System.Web.Mvc.Async.AsyncResultWrapper.WrappedAsyncVoid`1.CallEndDelegate(IAsyncResult asyncResult)\n at System.Web.Mvc.MvcHandler.EndProcessRequest(IAsyncResult asyncResult)\n at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()\n at System.Web.HttpApplication.CallHandlerExecutionStep.System.Web.HttpApplication.IExecutionStep.Execute()\n at System.Web.HttpApplication.ExecuteStep(IExecutionStep step, Boolean& completedSynchronously)", "fields"=>{"env_id"=>"STA", "app_id"=>"portal"}, "host"=>"DMSWEBSTA02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,990][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41161455, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.529Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,990][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41162015, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.529Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235901 and MessageQueueID: 35185034, Payload : [message : The messageID to process: 35235901 and MessageQueueID: 35185034] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:14:58.7167551Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,990][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41162579, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.529Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185034 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185034 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:14:58.7426155Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,990][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41173074, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185039 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185039 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:02.1811244Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,990][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 2
[2018-04-24T17:17:39,990][DEBUG][logstash.pipeline        ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41179388, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 46cf94a5-0fbe-4bc2-9183-962229141d8d] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:02.2013186Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,991][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41179984, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235906 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235906 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:02.2096303Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,991][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41180544, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235908 and MessageQueueID: 35185041, Payload : [message : The messageID to process: 35235908 and MessageQueueID: 35185041] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:04.6508084Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,991][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41181108, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185041 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185041 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:04.6544171Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,991][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41212120, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235892 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235892 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:12.0049729Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,991][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 3
[2018-04-24T17:17:39,991][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41216460, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.MessageQueue.Business.MessageQueueWatcher] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:14.2897132Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,991][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41227003, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235896 and MessageQueueID: 35185029, Payload : [message : The messageID to process: 35235896 and MessageQueueID: 35185029] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:16.8099489Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,991][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41227567, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185029 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185029 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:16.8138079Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,992][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 4
[2018-04-24T17:17:39,992][DEBUG][logstash.pipeline        ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41233881, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n   at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n   at System.Data.SqlClient.SqlDataReader.get_MetaData()\n   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n   at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n   at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n   at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n   at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n   at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n   at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 557601a3-83b2-4456-996c-e3eece34519e] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:16.8349341Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,992][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 5
[2018-04-24T17:17:39,992][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41234477, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235896 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235896 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:16.8422219Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:39,993][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 6
[2018-04-24T17:17:39,993][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 7
[2018-04-24T17:17:39,994][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 8
[2018-04-24T17:17:39,994][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 9
[2018-04-24T17:17:39,995][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 10
[2018-04-24T17:17:40,014][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Received a new payload
[2018-04-24T17:17:40,014][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 1
[2018-04-24T17:17:40,015][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 2
[2018-04-24T17:17:40,015][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 3
[2018-04-24T17:17:40,016][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 4
[2018-04-24T17:17:40,016][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 5
[2018-04-24T17:17:40,017][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 6
[2018-04-24T17:17:40,017][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 7
[2018-04-24T17:17:40,018][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 8
[2018-04-24T17:17:40,019][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 9
[2018-04-24T17:17:40,019][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 10
[2018-04-24T17:17:40,021][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Received a new payload
[2018-04-24T17:17:40,021][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 1
[2018-04-24T17:17:40,022][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 10.54.48.24:60658] Sending a new message for the listener, sequence: 2
[2018-04-24T17:17:40,120][DEBUG][logstash.pipeline        ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41211524, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n   at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n   at System.Data.SqlClient.SqlDataReader.get_MetaData()\n   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n   at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n   at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n   at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n   at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n   at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n   at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 094cf1d5-06ff-4a0e-bd99-17a2544fdfa8] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:12.0015450Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,121][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41213512, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : About to read the business process name, Payload : [message : About to read the business process name] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:13.7641724Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,121][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41213950, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Found the business process, Payload : [message : Found the business process] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:13.7675365Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,121][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41214447, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Auto.ProcessAll] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:13.7680716Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,121][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41214991, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:13.8295562Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,122][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41215509, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*(), Payload : [message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*()] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:13.8298663Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,122][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41215947, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Found the business process, Payload : [message : Found the business process] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:14.2891931Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,122][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41217971, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235894 and MessageQueueID: 35185027, Payload : [message : The messageID to process: 35235894 and MessageQueueID: 35185027] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:14.3864598Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,122][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41218535, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185027 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185027 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:14.3902671Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,122][DEBUG][logstash.pipeline        ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41224849, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n   at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n   at System.Data.SqlClient.SqlDataReader.get_MetaData()\n   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n   at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n   at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n   at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n   at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n   at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n   at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : c3dfcd1c-fa81-450b-a0b5-4dd91b3ce16d] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:14.4099942Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,122][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41225445, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235894 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235894 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:14.4134425Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,122][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41225957, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:16.6713610Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,122][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41226443, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*(), Payload : [message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*()] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:16.6716813Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,122][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41235475, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,122][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41236035, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235897 and MessageQueueID: 35185030, Payload : [message : The messageID to process: 35235897 and MessageQueueID: 35185030] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:19.2336228Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,123][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41236599, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185030 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185030 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:19.2374548Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,123][DEBUG][logstash.pipeline        ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41242913, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n   at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n   at System.Data.SqlClient.SqlDataReader.get_MetaData()\n   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n   at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n   at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n   at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n   at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n   at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n   at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 55993834-7203-4e50-ad8a-197d5a329e08] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:19.2583402Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,123][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41243509, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235897 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235897 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:19.2620846Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,123][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41244069, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235898 and MessageQueueID: 35185031, Payload : [message : The messageID to process: 35235898 and MessageQueueID: 35185031] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:21.6488844Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,123][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41244633, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185031 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185031 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:21.6528752Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,123][DEBUG][logstash.pipeline        ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41250947, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n   at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n   at System.Data.SqlClient.SqlDataReader.get_MetaData()\n   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n   at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n   at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n   at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n   at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n   at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n   at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 3b3dacf5-e1d2-4460-a3b1-25e444030b7d] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:21.6778484Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,123][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41251543, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235898 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235898 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:21.6826882Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,124][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41252007, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : About to read the business process name, Payload : [message : About to read the business process name] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:21.7675305Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,124][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41252445, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Found the business process, Payload : [message : Found the business process] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:21.7706919Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,124][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41252942, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Auto.ProcessAll] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:21.7710867Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,124][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41253486, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:21.8110554Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,124][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41254004, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*(), Payload : [message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*()] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:21.8113608Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,124][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41254468, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : About to read the business process name, Payload : [message : About to read the business process name] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:22.7619178Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,124][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41254906, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Found the business process, Payload : [message : Found the business process] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:22.7652135Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,124][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41255403, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Auto.ProcessAll] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:22.7656878Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,124][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41255947, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:22.8066648Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,124][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41256465, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*(), Payload : [message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*()] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:22.8069513Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,124][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41257025, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235900 and MessageQueueID: 35185033, Payload : [message : The messageID to process: 35235900 and MessageQueueID: 35185033] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:24.0748428Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,124][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41257589, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185033 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185033 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:24.0787305Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,125][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41263903, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 02012253-2089-4b2a-acb3-cffbcced1660] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:24.1004029Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,125][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41264499, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235900 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235900 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:24.1043890Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,125][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41264963, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : About to read the business process name, Payload : [message : About to read the business process name] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:24.7616205Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,125][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41265401, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Found the business process, Payload : [message : Found the business process] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:24.7648118Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,125][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41265898, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Auto.ProcessAll] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:24.7652516Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,125][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41266442, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:24.8049902Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,125][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41266960, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*(), Payload : [message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*()] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:24.8052665Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,125][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41267520, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235903 and MessageQueueID: 35185036, Payload : [message : The messageID to process: 35235903 and MessageQueueID: 35185036] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:26.5156273Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,125][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41268084, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185036 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185036 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:26.5203136Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
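Note: the events above arrive with type "comma_separated_input" and a single-line "EventId : ..., Level : ..., Message : ..., Payload : ..." layout. A minimal grok sketch that could split such a line into discrete fields; the pipeline config is not visible in this log, so the pattern and the target field names (event_id, level, log_message, payload) are illustrative assumptions, not the actual filter:

filter {
  if [type] == "comma_separated_input" {
    grok {
      # hypothetical capture of the comma-separated ETW fields seen in "message"
      match => { "message" => "EventId : %{NUMBER:event_id}, Level : %{WORD:level}, Message : %{DATA:log_message}, Payload : %{GREEDYDATA:payload}" }
    }
  }
}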
[2018-04-24T17:17:40,128][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,128][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,129][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,129][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,129][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,129][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,129][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,130][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,130][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,130][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,130][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,130][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,130][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,130][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,130][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,130][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,130][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,130][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,130][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,130][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,131][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,131][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,131][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,131][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,131][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,131][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,131][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,131][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,131][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,131][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,131][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,131][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,131][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,131][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,132][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,132][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,132][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,132][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,132][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,132][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,132][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,132][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,132][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,132][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,132][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,132][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,132][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,132][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,133][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,134][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,134][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,134][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,134][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,134][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,134][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,134][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,134][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,134][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,134][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,134][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,134][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,134][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,134][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,135][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,135][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,135][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,135][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,135][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,135][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,135][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,135][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,135][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
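Note: the debug entries above show "[beat][hostname]" and "[beat][name]" being removed in pairs for every event. A minimal sketch of a mutate filter that would emit exactly these lines, reconstructed only from the removed field names (the actual pipeline config is not shown in this log):

filter {
  mutate {
    # drop the Filebeat host metadata duplicated by the top-level "host" field
    remove_field => [ "[beat][hostname]", "[beat][name]" ]
  }
}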
[2018-04-24T17:17:40,136][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,136][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,137][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,137][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,137][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,137][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,137][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,137][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,137][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
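Note: the "logtime" removals above fit the common Logstash pattern of extracting the embedded "Timestamp : ..." value into a temporary field, promoting it to @timestamp with a date filter, and then discarding the temporary field. A sketch under that assumption; only the field name logtime is taken from the log, the rest is illustrative:

filter {
  date {
    # the embedded timestamps (e.g. 2018-04-24T21:15:26.5571428Z) are ISO8601
    match => [ "logtime", "ISO8601" ]
  }
  mutate {
    remove_field => [ "logtime" ]
  }
}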
[2018-04-24T17:17:40,137][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41187422, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 83b67baf-8ebc-4ce1-93c2-fa5d630776be] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:04.6765810Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,138][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41212584, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : About to read the business process name, Payload : [message : About to read the business process name] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:12.7670853Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,138][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41216898, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Found the business process, Payload : [message : Found the business process] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:14.2925902Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,138][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41217411, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.MessageQueue.Business.MessageQueueWatcher] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:14.2931082Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,138][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41234989, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:17.6649852Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,138][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41235421, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*(), Payload : [message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*()] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:17.6653214Z\n--------------- Event Log End Here ---------------\n", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,138][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41274398, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : e0a87dc4-3bd5-4aa9-a1f7-5def23391fcd] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:26.5528603Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,139][DEBUG][logstash.pipeline ] filter received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41274940, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0", "name"=>"XLM-INT-APP-02", "hostname"=>"XLM-INT-APP-02"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235903 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235903 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:26.5571428Z\n--------------- Event Log End Here ---------------\n", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
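The "filter received" events above carry type => "comma_separated_input", and each message body is a run of "Key : Value" pairs (EventId, Level, Message, Payload, EventName, Timestamp) separated by commas. The filter block that parses them is not visible anywhere in this log; a minimal sketch of one way to split such messages, assuming a kv filter is used (note this naive split would break on commas inside the Message text and on the colons inside the Timestamp value):

filter {
  if [type] == "comma_separated_input" {
    # Split "EventId : 3, Level : Warning, ..." into one field per pair.
    kv {
      source      => "message"
      field_split => ","
      value_split => ":"
      trim_key    => " "
      trim_value  => " "
    }
  }
}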
[2018-04-24T17:17:40,139][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,139][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,139][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,139][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,139][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,139][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,140][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,140][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,140][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,140][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,140][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,140][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,140][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,140][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,140][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,140][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,141][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,141][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,141][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,141][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,141][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,141][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,141][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,141][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,141][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,141][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,141][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,142][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,142][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,142][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,142][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,142][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,142][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,142][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,143][DEBUG][logstash.pipeline        ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41211524, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n   at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n   at System.Data.SqlClient.SqlDataReader.get_MetaData()\n   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n   at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n   at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n   at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n   at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n   at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n   at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 094cf1d5-06ff-4a0e-bd99-17a2544fdfa8] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:12.0015450Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,143][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41213512, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : About to read the business process name, Payload : [message : About to read the business process name] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:13.7641724Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,143][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41213950, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Found the business process, Payload : [message : Found the business process] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:13.7675365Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,143][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41214447, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Auto.ProcessAll] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:13.7680716Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,143][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41214991, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:13.8295562Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,144][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41215509, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*(), Payload : [message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*()] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:13.8298663Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,144][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41215947, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Found the business process, Payload : [message : Found the business process] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:14.2891931Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,144][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41217971, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235894 and MessageQueueID: 35185027, Payload : [message : The messageID to process: 35235894 and MessageQueueID: 35185027] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:14.3864598Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,144][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41218535, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185027 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185027 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:14.3902671Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,144][DEBUG][logstash.pipeline        ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41224849, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n   at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n   at System.Data.SqlClient.SqlDataReader.get_MetaData()\n   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n   at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n   at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n   at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n   at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n   at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n   at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : c3dfcd1c-fa81-450b-a0b5-4dd91b3ce16d] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:14.4099942Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,144][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41225445, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235894 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235894 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:14.4134425Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,144][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41225957, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:16.6713610Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,144][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41226443, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*(), Payload : [message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*()] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:16.6716813Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,145][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41235475, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,145][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41236035, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235897 and MessageQueueID: 35185030, Payload : [message : The messageID to process: 35235897 and MessageQueueID: 35185030] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:19.2336228Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,145][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41236599, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185030 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185030 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:19.2374548Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,145][DEBUG][logstash.pipeline        ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41242913, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n   at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n   at System.Data.SqlClient.SqlDataReader.get_MetaData()\n   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n   at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n   at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n   at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n   at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n   at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n   at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 55993834-7203-4e50-ad8a-197d5a329e08] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:19.2583402Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,145][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41243509, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235897 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235897 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:19.2620846Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,145][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41244069, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235898 and MessageQueueID: 35185031, Payload : [message : The messageID to process: 35235898 and MessageQueueID: 35185031] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:21.6488844Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,145][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41244633, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185031 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185031 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:21.6528752Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,145][DEBUG][logstash.pipeline        ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41250947, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n   at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n   at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n   at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n   at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n   at System.Data.SqlClient.SqlDataReader.get_MetaData()\n   at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n   at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n   at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n   at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n   at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n   at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n   at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n   at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n   at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n   at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n   at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n   at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n   at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n   at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 3b3dacf5-e1d2-4460-a3b1-25e444030b7d] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:21.6778484Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
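In the "output received" events above, the beat object has already shrunk to just version; hostname and name are gone by the time events reach the output stage, which matches the mutate removals logged immediately below. The output block itself never appears in this log; a purely hypothetical sketch of a typical destination (endpoint and index pattern are assumptions, not taken from the log):

output {
  elasticsearch {
    # Assumed host and index naming; neither is shown anywhere in this log.
    hosts => ["localhost:9200"]
    index => "xpolm-etw-%{+YYYY.MM.dd}"
  }
}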
[2018-04-24T17:17:40,146][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,146][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,147][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,147][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,147][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,147][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,147][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,147][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,147][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,147][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,147][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,147][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,147][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,147][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
[2018-04-24T17:17:40,147][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][hostname]"}
[2018-04-24T17:17:40,147][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"[beat][name]"}
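These paired removals target nested Beats metadata, which mutate addresses with the [outer][inner] bracket syntax. A sketch of the corresponding filter, inferred from the two field names being removed:

filter {
  mutate {
    # The top-level "host" field already carries the hostname, so the
    # nested copies under "beat" are redundant and can be dropped.
    remove_field => ["[beat][hostname]", "[beat][name]"]
  }
}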
[2018-04-24T17:17:40,148][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,148][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,148][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,148][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,148][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,148][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,148][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,149][DEBUG][logstash.filters.mutate ] filters/LogStash::Filters::Mutate: removing field {:field=>"logtime"}
[2018-04-24T17:17:40,149][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41251543, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235898 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235898 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:21.6826882Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,149][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41252007, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : About to read the business process name, Payload : [message : About to read the business process name] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:21.7675305Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,149][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41252445, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Found the business process, Payload : [message : Found the business process] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:21.7706919Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,149][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41252942, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Auto.ProcessAll] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:21.7710867Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,149][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41253486, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:21.8110554Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,150][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41254004, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*(), Payload : [message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*()] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:21.8113608Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,150][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41254468, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : About to read the business process name, Payload : [message : About to read the business process name] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:22.7619178Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,150][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41254906, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Found the business process, Payload : [message : Found the business process] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:22.7652135Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,150][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41255403, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Auto.ProcessAll] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:22.7656878Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,150][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41255947, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:22.8066648Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,150][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41256465, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*(), Payload : [message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*()] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:22.8069513Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,150][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41257025, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235900 and MessageQueueID: 35185033, Payload : [message : The messageID to process: 35235900 and MessageQueueID: 35185033] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:24.0748428Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,150][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41257589, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185033 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185033 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:24.0787305Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,151][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41263903, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at 
System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 02012253-2089-4b2a-acb3-cffbcced1660] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:24.1004029Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,151][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41264499, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235900 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235900 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:24.1043890Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,151][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41264963, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : About to read the business process name, Payload : [message : About to read the business process name] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:24.7616205Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,151][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41265401, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Found the business process, Payload : [message : Found the business process] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:24.7648118Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,151][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41265898, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Auto.ProcessAll] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:24.7652516Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,151][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41266442, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:24.8049902Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,151][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41266960, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*(), Payload : [message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*()] [applicationName : XPOLastMile.Host.MSMQ.Core.ProcessUntilEmpty] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:24.8052665Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,151][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41267520, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : The messageID to process: 35235903 and MessageQueueID: 35185036, Payload : [message : The messageID to process: 35235903 and MessageQueueID: 35185036] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:26.5156273Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,152][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41268084, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 1, Level : Verbose, Message : FinanceSyncOrderCost called with MessageQueueID: 35185036 started, Payload : [message : FinanceSyncOrderCost called with MessageQueueID: 35185036 started] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : DebugInfo, Timestamp : 2018-04-24T21:15:26.5203136Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,153][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41187422, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:36.530Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at 
System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : 83b67baf-8ebc-4ce1-93c2-fa5d630776be] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:04.6765810Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,153][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41212584, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : About to read the business process name, Payload : [message : About to read the business process name] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:12.7670853Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,153][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41216898, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Found the business process, Payload : [message : Found the business process] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:14.2925902Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,164][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41217411, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:46.532Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : Executing business process!!!!!!!!!!!!!!!!!] [applicationName : XPOLastMile.MessageQueue.Business.MessageQueueWatcher] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:14.2931082Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,165][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41234989, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!, Payload : [message : FINISHED ---------- Executing business process!!!!!!!!!!!!!!!!!] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:17.6649852Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,165][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41235421, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:47.533Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*(), Payload : [message : Done everything&^**^*&*(*()*()()*((*()*()*()*()*()] [applicationName : Not Provided] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:17.6653214Z\n--------------- Event Log End Here ---------------\n", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,165][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41274398, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"ProviderId : 3ac9cccc-7b35-599a-e4dc-bb08e9ad5a20\nEventId : 4\nKeywords : None\nLevel : Error\nMessage : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: \nOpcode : Info\nTask : 65530\nVersion : 0\nPayload : [message : The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied. / Additional Information: ] [stacktrace : System.Data.SqlClient.SqlException (0x80131904): The parameterized query '(@orderID bigint,@externalSystemID int,@externalParentItemID big' expects the parameter '@createdBy', which was not supplied.\n at System.Data.SqlClient.SqlConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.SqlInternalConnection.OnError(SqlException exception, Boolean breakConnection, Action`1 wrapCloseInAction)\n at System.Data.SqlClient.TdsParser.ThrowExceptionAndWarning(TdsParserStateObject stateObj, Boolean callerHasConnectionLock, Boolean asyncClose)\n at System.Data.SqlClient.TdsParser.TryRun(RunBehavior runBehavior, SqlCommand cmdHandler, SqlDataReader dataStream, BulkCopySimpleResultSet bulkCopyHandler, TdsParserStateObject stateObj, Boolean& dataReady)\n at System.Data.SqlClient.SqlDataReader.TryConsumeMetaData()\n at System.Data.SqlClient.SqlDataReader.get_MetaData()\n at System.Data.SqlClient.SqlCommand.FinishExecuteReader(SqlDataReader ds, RunBehavior runBehavior, String resetOptionsString, Boolean isInternal, Boolean forDescribeParameterEncryption)\n at System.Data.SqlClient.SqlCommand.RunExecuteReaderTds(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, Boolean async, Int32 timeout, Task& task, Boolean asyncWrite, Boolean inRetry, SqlDataReader ds, Boolean describeParameterEncryptionRequest)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method, TaskCompletionSource`1 completion, Int32 timeout, Task& task, Boolean& usedCache, Boolean asyncWrite, Boolean inRetry)\n at System.Data.SqlClient.SqlCommand.RunExecuteReader(CommandBehavior cmdBehavior, RunBehavior runBehavior, Boolean returnStream, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteReader(CommandBehavior behavior, String method)\n at System.Data.SqlClient.SqlCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.<Reader>b__c(DbCommand t, DbCommandInterceptionContext`1 c)\n at System.Data.Entity.Infrastructure.Interception.InternalDispatcher`1.Dispatch[TTarget,TInterceptionContext,TResult](TTarget target, Func`3 operation, TInterceptionContext interceptionContext, Action`3 executing, Action`3 executed)\n at System.Data.Entity.Infrastructure.Interception.DbCommandDispatcher.Reader(DbCommand command, DbCommandInterceptionContext interceptionContext)\n at System.Data.Entity.Internal.InterceptableDbCommand.ExecuteDbDataReader(CommandBehavior behavior)\n at 
System.Data.Common.DbCommand.ExecuteReader(CommandBehavior behavior)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryInternal[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__64()\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteInTransaction[T](Func`1 func, IDbExecutionStrategy executionStrategy, Boolean startLocalTransaction, Boolean releaseConnectionOnSuccess)\n at System.Data.Entity.Core.Objects.ObjectContext.<>c__DisplayClass65`1.<ExecuteStoreQueryReliably>b__63()\n at System.Data.Entity.SqlServer.DefaultSqlExecutionStrategy.Execute[TResult](Func`1 operation)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQueryReliably[TElement](String commandText, String entitySetName, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Core.Objects.ObjectContext.ExecuteStoreQuery[TElement](String commandText, ExecutionOptions executionOptions, Object[] parameters)\n at System.Data.Entity.Internal.InternalContext.<>c__DisplayClass14`1.<ExecuteSqlQuery>b__13()\n at System.Data.Entity.Internal.LazyEnumerator`1.MoveNext()\n at System.Linq.Enumerable.First[TSource](IEnumerable`1 source)\n at XPOLastMile.Repository.FinanceSynchronizer.SettlementEntityRepository.InsertSettlementEntity(SettlementEntity jobCost) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.Repository\\XPOLastMile.Repository.FinanceSynchronizer\\SettlementEntityRepository.cs:line 44\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SettlementEntityInsert(SettlementEntity jobCostAfter, Int64 apTransactionTypeIdVoucher) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 171\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSync.Impl.OrderCostSyncService.SyncOrderCost(Message message, ExecutionContext executionContext, IContext context) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSync\\Impl\\OrderCostSyncService.cs:line 94\n at XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts.ProcessRequest(IContext context, Message message) in e:\\BuildAgent\\work\\BuildAllProjectsINT\\XPOLastMile.FinanceSynchronizer\\XPOLastMile.FinanceSynchronizer.Business\\FinanceSyncOrderCosts.cs:line 24\nClientConnectionId:1a4dc75f-c457-4903-870b-428b5d0182c0\nError Number:8178,State:1,Class:16] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : e0a87dc4-3bd5-4aa9-a1f7-5def23391fcd] \nEventName : ErrorInfo\nTimestamp : 2018-04-24T21:15:26.5528603Z\n--------------- Event Log End Here ---------------\n\n--------------- Event Log Start Here ---------------", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
[2018-04-24T17:17:40,165][DEBUG][logstash.pipeline ] output received {"event"=>{"source"=>"D:\\XLM\\Logging Service\\logs\\XPOLM-ETW-Logs.log", "offset"=>41274940, "input_type"=>"log", "type"=>"comma_separated_input", "@timestamp"=>2018-04-24T21:16:56.534Z, "beat"=>{"version"=>"5.2.0"}, "message"=>"EventId : 3, Level : Warning, Message : Could not process for queue 35235903 - XPOLastMile.Framework.MVC.BooleanResponse, Payload : [message : Could not process for queue 35235903 - XPOLastMile.Framework.MVC.BooleanResponse] [applicationName : XPOLastMile.FinanceSynchronizer.Business.FinanceSyncOrderCosts] [hostName : XLM-INT-APP-02] [currentPrincipal : ] [executingPrincipal : INT\\XPOFramework] [userMessageGuid : ] , EventName : WarningInfo, Timestamp : 2018-04-24T21:15:26.5571428Z\n--------------- Event Log End Here ---------------\n", "fields"=>{"env_id"=>"INT", "app_id"=>"LoggingService"}, "host"=>"XLM-INT-APP-02", "tags"=>["beats_input_codec_plain_applied"], "@version"=>"1"}}
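
Each output-received event above is a single Filebeat multiline block (type comma_separated_input) whose message flattens "key : value" pairs such as EventId, Level, Message, EventName, and Timestamp between the Event Log Start/End markers. The events keep the Beats read time in @timestamp and carry no logtime field, which is consistent with a scratch field being parsed out of the message and then dropped by the mutate shown earlier. A hedged sketch of a grok step that could produce such a scratch field (the pattern and the logtime name are inferences from the debug lines, not the pipeline's actual config):

filter {
  grok {
    # Lift the event's own "Timestamp : ..." value out of the flattened
    # message into the scratch field "logtime", which a later
    # mutate remove_field discards
    match => { "message" => "Timestamp : %{TIMESTAMP_ISO8601:logtime}" }
  }
}
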
[2018-04-24T17:17:41,807][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2018-04-24T17:17:41,807][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2018-04-24T17:17:42,349][DEBUG][logstash.pipeline ] Pushing flush onto pipeline {:pipeline_id=>".monitoring-logstash", :thread=>"#<Thread:0x134849f2@/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:246 sleep>"}
[2018-04-24T17:17:43,077][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@id = "plain_369dbb1e-c19c-44b0-82a3-3b5637a219a9"
[2018-04-24T17:17:43,078][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@enable_metric = true
[2018-04-24T17:17:43,078][DEBUG][logstash.codecs.plain ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2018-04-24T17:17:43,082][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Received a new payload
[2018-04-24T17:17:43,082][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 1
[2018-04-24T17:17:43,083][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 2
[2018-04-24T17:17:43,083][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 3
[2018-04-24T17:17:43,084][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 4
[2018-04-24T17:17:43,094][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 5
[2018-04-24T17:17:43,094][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 6
[2018-04-24T17:17:43,095][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 7
[2018-04-24T17:17:43,095][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 8
[2018-04-24T17:17:43,096][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 9
[2018-04-24T17:17:43,096][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 10
[2018-04-24T17:17:43,097][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 11
[2018-04-24T17:17:43,105][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Received a new payload
[2018-04-24T17:17:43,105][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 1
[2018-04-24T17:17:43,106][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 2
[2018-04-24T17:17:43,106][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 3
[2018-04-24T17:17:43,107][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 4
[2018-04-24T17:17:43,107][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 5
[2018-04-24T17:17:43,108][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 6
[2018-04-24T17:17:43,108][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 7
[2018-04-24T17:17:43,109][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 8
[2018-04-24T17:17:43,109][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 9
[2018-04-24T17:17:43,110][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 10
[2018-04-24T17:17:43,110][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 11
[2018-04-24T17:17:43,111][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 12
[2018-04-24T17:17:43,111][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 13
[2018-04-24T17:17:43,112][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 14
[2018-04-24T17:17:43,113][DEBUG][org.logstash.beats.BeatsHandler] [local: 10.54.52.31:5044, remote: 128.136.75.7:49878] Sending a new message for the listener, sequence: 15
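
The BeatsHandler lines show Filebeat batches arriving on the listener at 10.54.52.31:5044, and every event above carries the beats_input_codec_plain_applied tag that the beats input adds when it applies its default plain codec. A minimal input block consistent with that traffic (the port comes from the log; everything else is left at its defaults):

input {
  beats {
    # Listener address seen in the BeatsHandler lines: 10.54.52.31:5044
    port => 5044
  }
}
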