Indices not created after starting Logstash with the systemctl service

Logstash is not creating indices when I start it with the command below:
sudo systemctl start logstash.service

If I run the command below instead, the indices are created successfully and the data is pulled:
bin/logstash -f /etc/logstash/conf.d/logstash_one.conf

I am currently using version 6.8.23. Any suggestions would be helpful.

Thanks in advance!

You need to share the Logstash logs, and the system logs as well (normally in /var/log/messages or /var/log/syslog).
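On systemd-based systems, the service's own output can also be pulled with journalctl, for example:

sudo journalctl -u logstash.service --since "1 hour ago"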

Did you run logstash as the root user or with sudo before trying to run it as a service? If so, some directories may have the wrong permissions.
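As a sketch, one way to check for that (the paths below are the defaults for a package install; adjust if yours differ):

# list anything under the Logstash data and log directories not owned by the logstash user
sudo find /var/lib/logstash /var/log/logstash -not -user logstash -ls

# if anything turns up, hand ownership back to the service user
sudo chown -R logstash:logstash /var/lib/logstash /var/log/logstash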

Hi Leandrojmp,

Thanks for the reply!

Below are the logs
[2022-05-16T20:57:28,925][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.8.23"}
[2022-05-16T20:58:06,070][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"logstashabc", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2022-05-16T20:58:06,706][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://search-elkdomain-y7muh26u4lz646ile5b6dl5fve.us-east-2.es.amazonaws.com:443/]}}
[2022-05-16T20:58:07,150][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://search-elkdomain-y7muh26u4lz646ile5b6dl5fve.us-east-2.es.amazonaws.com:443/"}
[2022-05-16T20:58:07,222][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2022-05-16T20:58:07,230][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2022-05-16T20:58:07,282][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://search-elkdomain-y7muh26u4lz646ile5b6dl5fve.us-east-2.es.amazonaws.com:443"]}
[2022-05-16T20:58:07,293][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2022-05-16T20:58:07,349][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"index_patterns"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s", "number_of_shards"=>1}, "mappings"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}
[2022-05-16T20:58:07,687][INFO ][logstash.inputs.s3 ] Registering s3 input {:bucket=>"elk-s3-16j1c7e2wi83c-sw28ncguh23b", :region=>"us-east-2"}
[2022-05-16T20:58:08,196][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"logstashabc", :thread=>"#<Thread:0x487d5490 sleep>"}
[2022-05-16T20:58:08,285][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:logstashabc], :non_running_pipelines=>[]}
[2022-05-16T20:58:09,030][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2022-05-16T20:58:11,348][INFO ][logstash.inputs.s3 ] Using default generated file for the sincedb {:filename=>"/var/lib/logstash/plugins/inputs/s3/sincedb_e31c033c7c57b23f56a75840453901ec"}

There are no errors in the Logstash logs.

I ran it with sudo in both scenarios.

Thank you,
Karthik

Can you check the permissions on this file? Is it owned by the root user?
Can you enable debug mode for Logstash and then start it with sudo systemctl start logstash.service, not from the command line?
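One way to flip debug logging on temporarily, assuming the default package paths (remove the added line from logstash.yml again when done):

echo 'log.level: debug' | sudo tee -a /etc/logstash/logstash.yml
sudo systemctl restart logstash.service
sudo tail -f /var/log/logstash/logstash-plain.log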

Hi Rios,

I checked the permissions on the file; it is owned by the logstash user:
-rw-r--r-- 1 logstash logstash 23 May 6 21:09 sincedb_e31c033c7c57b23f56a75840453901ec

I enabled debug mode and ran the systemctl command. Please see the logs below.

[2022-05-17T12:00:36,785][DEBUG][logstash.modules.scaffold] Found module 
[2022-05-17T12:00:36,809][DEBUG][logstash.modules.scaffold] Found module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2022-05-17T12:00:36,810][DEBUG][logstash.plugins.registry] Adding plugin to the registry {:name=>"netflow", :type=>:modules, :class=>#<LogStash::Modules::Scaffold:0x70385eb1 @directory="/usr/share/logstash/modules/netflow/configuration", @module_name="netflow", @kibana_version_parts=["6", "0", "0"]>}
[2022-05-17T12:00:37,319][DEBUG][logstash.runner          ] -------- Logstash Settings (* means modified) ---------
[2022-05-17T12:00:37,321][DEBUG][logstash.runner          ] node.name: "us-east-2.compute.internal"
[2022-05-17T12:00:37,321][DEBUG][logstash.runner          ] path.data: "/var/lib/logstash"
[2022-05-17T12:00:37,322][DEBUG][logstash.runner          ] modules.cli: []
[2022-05-17T12:00:37,323][DEBUG][logstash.runner          ] modules: []
[2022-05-17T12:00:37,323][DEBUG][logstash.runner          ] modules_list: []
[2022-05-17T12:00:37,324][DEBUG][logstash.runner          ] modules_variable_list: []
[2022-05-17T12:00:37,325][DEBUG][logstash.runner          ] modules_setup: false
[2022-05-17T12:00:37,326][DEBUG][logstash.runner          ] config.test_and_exit: false
[2022-05-17T12:00:37,327][DEBUG][logstash.runner          ] config.reload.automatic: false
[2022-05-17T12:00:37,328][DEBUG][logstash.runner          ] config.reload.interval: 3000000000
[2022-05-17T12:00:37,328][DEBUG][logstash.runner          ] config.support_escapes: false
[2022-05-17T12:00:37,330][DEBUG][logstash.runner          ] config.field_reference.parser: "COMPAT"
[2022-05-17T12:00:37,330][DEBUG][logstash.runner          ] metric.collect: true
[2022-05-17T12:00:37,331][DEBUG][logstash.runner          ] pipeline.id: "main"
[2022-05-17T12:00:37,332][DEBUG][logstash.runner          ] pipeline.system: false
[2022-05-17T12:00:37,333][DEBUG][logstash.runner          ] pipeline.workers: 2
[2022-05-17T12:00:37,334][DEBUG][logstash.runner          ] pipeline.output.workers: 1
[2022-05-17T12:00:37,335][DEBUG][logstash.runner          ] pipeline.batch.size: 125
[2022-05-17T12:00:37,335][DEBUG][logstash.runner          ] pipeline.batch.delay: 50
[2022-05-17T12:00:37,336][DEBUG][logstash.runner          ] pipeline.unsafe_shutdown: false
[2022-05-17T12:00:37,337][DEBUG][logstash.runner          ] pipeline.java_execution: false
[2022-05-17T12:00:37,337][DEBUG][logstash.runner          ] pipeline.reloadable: true
[2022-05-17T12:00:37,338][DEBUG][logstash.runner          ] path.plugins: []
[2022-05-17T12:00:37,339][DEBUG][logstash.runner          ] config.debug: false
[2022-05-17T12:00:37,339][DEBUG][logstash.runner          ] *log.level: "debug" (default: "info")
[2022-05-17T12:00:37,340][DEBUG][logstash.runner          ] version: false
[2022-05-17T12:00:37,341][DEBUG][logstash.runner          ] help: false
[2022-05-17T12:00:37,342][DEBUG][logstash.runner          ] log.format: "plain"
[2022-05-17T12:00:37,343][DEBUG][logstash.runner          ] http.host: "127.0.0.1"
[2022-05-17T12:00:37,344][DEBUG][logstash.runner          ] http.port: 1200
[2022-05-17T12:00:37,344][DEBUG][logstash.runner          ] http.environment: "test"
[2022-05-17T12:00:37,345][DEBUG][logstash.runner          ] queue.type: "memory"
[2022-05-17T12:00:37,346][DEBUG][logstash.runner          ] queue.drain: false
[2022-05-17T12:00:37,347][DEBUG][logstash.runner          ] queue.page_capacity: 67108864
[2022-05-17T12:00:37,347][DEBUG][logstash.runner          ] queue.max_bytes: 1073741824
[2022-05-17T12:00:37,348][DEBUG][logstash.runner          ] queue.max_events: 0
[2022-05-17T12:00:37,349][DEBUG][logstash.runner          ] queue.checkpoint.acks: 1024
[2022-05-17T12:00:37,350][DEBUG][logstash.runner          ] queue.checkpoint.writes: 1024
[2022-05-17T12:00:37,351][DEBUG][logstash.runner          ] queue.checkpoint.interval: 1000
[2022-05-17T12:00:37,352][DEBUG][logstash.runner          ] queue.checkpoint.retry: false
[2022-05-17T12:00:37,352][DEBUG][logstash.runner          ] dead_letter_queue.enable: false
[2022-05-17T12:00:37,353][DEBUG][logstash.runner          ] dead_letter_queue.max_bytes: 1073741824
[2022-05-17T12:00:37,354][DEBUG][logstash.runner          ] slowlog.threshold.warn: -1
[2022-05-17T12:00:37,354][DEBUG][logstash.runner          ] slowlog.threshold.info: -1
[2022-05-17T12:00:37,356][DEBUG][logstash.runner          ] slowlog.threshold.debug: -1
[2022-05-17T12:00:37,358][DEBUG][logstash.runner          ] *keystore.file: "/etc/logstash/logstash.keystore" (default: "/usr/share/logstash/config/logstash.keystore")
[2022-05-17T12:00:37,358][DEBUG][logstash.runner          ] path.queue: "/var/lib/logstash/queue"
[2022-05-17T12:00:37,359][DEBUG][logstash.runner          ] path.dead_letter_queue: "/var/lib/logstash/dead_letter_queue"
[2022-05-17T12:00:37,360][DEBUG][logstash.runner          ] *path.settings: "/etc/logstash" (default: "/usr/share/logstash/config")
[2022-05-17T12:00:37,361][DEBUG][logstash.runner          ] *path.logs: "/var/log/logstash" (default: "/usr/share/logstash/logs")
[2022-05-17T12:00:37,361][DEBUG][logstash.runner          ] xpack.management.enabled: false
[2022-05-17T12:00:37,362][DEBUG][logstash.runner          ] xpack.management.logstash.poll_interval: 5000000000
[2022-05-17T12:00:37,363][DEBUG][logstash.runner          ] xpack.management.pipeline.id: ["main"]
[2022-05-17T12:00:37,364][DEBUG][logstash.runner          ] xpack.management.elasticsearch.username: "logstash_system"
[2022-05-17T12:00:37,364][DEBUG][logstash.runner          ] xpack.management.elasticsearch.url: ["https://localhost:9200"]
[2022-05-17T12:00:37,365][DEBUG][logstash.runner          ] xpack.management.elasticsearch.hosts: ["https://localhost:9200"]
[2022-05-17T12:00:37,366][DEBUG][logstash.runner          ] xpack.management.elasticsearch.ssl.verification_mode: "certificate"
[2022-05-17T12:00:37,367][DEBUG][logstash.runner          ] xpack.management.elasticsearch.sniffing: false
[2022-05-17T12:00:37,368][DEBUG][logstash.runner          ] xpack.monitoring.enabled: false
[2022-05-17T12:00:37,369][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.hosts: ["http://localhost:9200"]
[2022-05-17T12:00:37,370][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.url: ["http://localhost:9200"]
[2022-05-17T12:00:37,371][DEBUG][logstash.runner          ] xpack.monitoring.collection.interval: 10000000000
[2022-05-17T12:00:37,371][DEBUG][logstash.runner          ] xpack.monitoring.collection.timeout_interval: 600000000000
[2022-05-17T12:00:37,372][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.username: "logstash_system"
[2022-05-17T12:00:37,373][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.ssl.verification_mode: "certificate"
[2022-05-17T12:00:37,373][DEBUG][logstash.runner          ] xpack.monitoring.elasticsearch.sniffing: false
[2022-05-17T12:00:37,374][DEBUG][logstash.runner          ] xpack.monitoring.collection.pipeline.details.enabled: true
[2022-05-17T12:00:37,375][DEBUG][logstash.runner          ] xpack.monitoring.collection.config.enabled: true
[2022-05-17T12:00:37,376][DEBUG][logstash.runner          ] node.uuid: ""
[2022-05-17T12:00:37,376][DEBUG][logstash.runner          ] --------------- Logstash Settings -------------------
[2022-05-17T12:00:37,494][DEBUG][logstash.config.source.multilocal] Reading pipeline configurations from YAML {:location=>"/etc/logstash/pipelines.yml"}
[2022-05-17T12:00:37,537][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"6.8.23"}
[2022-05-17T12:00:37,575][DEBUG][logstash.agent           ] Setting global FieldReference parsing mode: COMPAT
[2022-05-17T12:00:37,600][DEBUG][logstash.agent           ] Setting up metric collection
[2022-05-17T12:00:37,673][DEBUG][logstash.instrument.periodicpoller.os] Starting {:polling_interval=>5, :polling_timeout=>120}
[2022-05-17T12:00:37,945][DEBUG][logstash.instrument.periodicpoller.jvm] Starting {:polling_interval=>5, :polling_timeout=>120}
[2022-05-17T12:00:38,097][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-05-17T12:00:38,105][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-05-17T12:00:38,131][DEBUG][logstash.instrument.periodicpoller.persistentqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2022-05-17T12:00:38,149][DEBUG][logstash.instrument.periodicpoller.deadletterqueue] Starting {:polling_interval=>5, :polling_timeout=>120}
[2022-05-17T12:00:38,244][DEBUG][logstash.agent           ] Starting agent
[2022-05-17T12:00:38,283][DEBUG][logstash.config.source.multilocal] Reading pipeline configurations from YAML {:location=>"/etc/logstash/pipelines.yml"}
[2022-05-17T12:00:38,383][DEBUG][logstash.config.source.local.configpathloader] Skipping the following files while reading config since they don't match the specified glob pattern {:files=>[]}
[2022-05-17T12:00:38,392][DEBUG][logstash.config.source.local.configpathloader] Reading config file {:config_file=>"/etc/logstash/conf.d/logstash_one_test.conf"}
[2022-05-17T12:00:38,445][DEBUG][logstash.agent           ] Converging pipelines state {:actions_count=>1}
[2022-05-17T12:00:38,460][DEBUG][logstash.agent           ] Executing action {:action=>LogStash::PipelineAction::Create/pipeline_id:logstashabctest}
[2022-05-17T12:00:43,325][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-05-17T12:00:43,327][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-05-17T12:00:46,316][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"multiline", :type=>"codec", :class=>LogStash::Codecs::Multiline}
[2022-05-17T12:00:46,502][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@pattern = "\\[%{TIMESTAMP_ISO8601}"
[2022-05-17T12:00:46,504][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@what = "previous"
[2022-05-17T12:00:46,505][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@id = "d8fe5700-1783-4731-9b1d-55667261a545"
[2022-05-17T12:00:46,506][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@negate = true
[2022-05-17T12:00:46,507][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@enable_metric = true
[2022-05-17T12:00:46,508][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@patterns_dir = []
[2022-05-17T12:00:46,508][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@charset = "UTF-8"
[2022-05-17T12:00:46,509][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@multiline_tag = "multiline"
[2022-05-17T12:00:46,510][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@max_lines = 500
[2022-05-17T12:00:46,511][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@max_bytes = 10485760
[2022-05-17T12:00:46,577][DEBUG][logstash.codecs.multiline] Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-patterns-core-4.1.2/patterns/aws"}
[2022-05-17T12:00:46,594][DEBUG][logstash.codecs.multiline] Grok loading patterns from file 
[2022-05-17T12:00:46,607][DEBUG][logstash.codecs.multiline] Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-patterns-core-4.1.2/patterns/grok-patterns"}
[2022-05-17T12:00:46,620][DEBUG][logstash.codecs.multiline] Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-patterns-core-4.1.2/patterns/haproxy"}
[2022-05-17T12:00:46,622][DEBUG][logstash.codecs.multiline] Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-patterns-core-4.1.2/patterns/httpd"}
[2022-05-17T12:00:46,623][DEBUG][logstash.codecs.multiline] Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-patterns-core-4.1.2/patterns/java"}
[2022-05-17T12:00:46,642][DEBUG][logstash.codecs.multiline] Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-patterns-core-4.1.2/patterns/ruby"}
[2022-05-17T12:01:01,329][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"s3", :type=>"input", :class=>LogStash::Inputs::S3}
[2022-05-17T12:01:01,397][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@pattern = "\\[%{TIMESTAMP_ISO8601}"
[2022-05-17T12:01:01,398][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@what = "previous"
[2022-05-17T12:01:01,398][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@id = "d8fe5700-1783-4731-9b1d-55667261a545"
[2022-05-17T12:01:01,399][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@negate = true
[2022-05-17T12:01:01,399][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@enable_metric = true
[2022-05-17T12:01:01,399][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@patterns_dir = []
[2022-05-17T12:01:01,401][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@charset = "UTF-8"
[2022-05-17T12:01:01,401][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@multiline_tag = "multiline"
[2022-05-17T12:01:01,402][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@max_lines = 500
[2022-05-17T12:01:01,402][DEBUG][logstash.codecs.multiline] config LogStash::Codecs::Multiline/@max_bytes = 10485760
[2022-05-17T12:01:01,411][DEBUG][logstash.codecs.multiline] Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-patterns-core-4.1.2/patterns/aws"}
[2022-05-17T12:01:01,423][DEBUG][logstash.codecs.multiline] Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-patterns-core-4.1.2/patterns/bacula"}
[2022-05-17T12:01:01,426][DEBUG][logstash.codecs.multiline] Grok loading patterns from file {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-patterns-core-4.1.2/patterns/bind"}
[2022-05-17T12:01:01,544][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@access_key_id = "keyid"
[2022-05-17T12:01:01,545][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@bucket = "elk-s3-16j1c7e2wi83c-bucket"
[2022-05-17T12:01:01,565][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@codec = <LogStash::Codecs::Multiline pattern=>"\\[%{TIMESTAMP_ISO8601}", what=>"previous", id=>"d8fe5700-1783-4731-9b1d-55667261a545", negate=>true, enable_metric=>true, charset=>"UTF-8", multiline_tag=>"multiline", max_lines=>500, max_bytes=>10485760>
[2022-05-17T12:01:01,566][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@additional_settings = {"force_path_style"=>"true", "follow_redirects"=>"false"}
[2022-05-17T12:01:01,567][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@prefix = "a360uat/"
[2022-05-17T12:01:01,570][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@secret_access_key = <password>
[2022-05-17T12:01:01,570][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@interval = 60
[2022-05-17T12:01:01,571][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@id = "system-testpath"
[2022-05-17T12:01:01,571][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@region = "us-east-2"
[2022-05-17T12:01:01,572][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@tags = ["systemtestpath"]
[2022-05-17T12:01:01,572][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@enable_metric = true
[2022-05-17T12:01:01,573][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@add_field = {}
[2022-05-17T12:01:01,573][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@role_session_name = "logstash"
[2022-05-17T12:01:01,574][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@sincedb_path = nil
[2022-05-17T12:01:01,574][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@backup_to_bucket = nil
[2022-05-17T12:01:01,574][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@backup_add_prefix = nil
[2022-05-17T12:01:01,575][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@backup_to_dir = nil
[2022-05-17T12:01:01,575][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@delete = false
[2022-05-17T12:01:01,576][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@watch_for_new_files = true
[2022-05-17T12:01:01,577][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@exclude_pattern = nil
[2022-05-17T12:01:01,578][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@temporary_directory = "/tmp/logstash"
[2022-05-17T12:01:01,578][DEBUG][logstash.inputs.s3       ] config LogStash::Inputs::S3/@include_object_properties = false
[2022-05-17T12:01:01,601][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"grok", :type=>"filter", :class=>LogStash::Filters::Grok}
[2022-05-17T12:01:01,623][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@match = {"message"=>"\\[%{TIMESTAMP_ISO8601:Log_timestamp}\\]\\s+%{LOGLEVEL:Log_level}\\s+%{GREEDYDATA:msgbody}"}
[2022-05-17T12:01:01,624][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@add_tag = ["Log_level", "Log_timestamp"]
[2022-05-17T12:01:01,628][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@id = "f27978bf005b01121c64b433231049ec857ccf34909cc383b49b0e469a9e9f2d"
[2022-05-17T12:01:01,628][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@overwrite = ["msgbody"]
[2022-05-17T12:01:01,628][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@enable_metric = true
[2022-05-17T12:01:01,640][DEBUG][logstash.filters.grok    ] config LogStash::Filters::Grok/@tag_on_timeout = "_groktimeout"
[2022-05-17T12:01:01,671][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"date", :type=>"filter", :class=>LogStash::Filters::Date}
[2022-05-17T12:01:01,709][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@match = ["Log_timestamp", "yyyy-MM-dd HH:mm:ss.SSS"]
[2022-05-17T12:01:01,710][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@id = "7d931a15d1078b1d9366dedf673eaee3978ea5c34dfc6a8758b15d3fba811fca"
[2022-05-17T12:01:01,710][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@timezone = "UTC"
[2022-05-17T12:01:01,711][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@target = "Log_timestamp"
[2022-05-17T12:01:01,714][DEBUG][logstash.filters.date    ] config LogStash::Filters::Date/@tag_on_failure = ["_dateparsefailure"]
[2022-05-17T12:01:01,733][DEBUG][org.logstash.filters.DateFilter] Date filter with format=yyyy-MM-dd HH:mm:ss.SSS, locale=null, timezone=UTC built as org.logstash.filters.parser.JodaParser
[2022-05-17T12:01:01,750][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"kv", :type=>"filter", :class=>LogStash::Filters::KV}
[2022-05-17T12:01:01,766][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@transform_value = "capitalize"
[2022-05-17T12:01:01,769][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@transform_key = "capitalize"
[2022-05-17T12:01:01,770][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@include_brackets = false
[2022-05-17T12:01:01,771][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@field_split = "*{},?\\[\\]"
[2022-05-17T12:01:01,771][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@trim_value = "\\s"
[2022-05-17T12:01:01,772][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@trim_key = "\\s"
[2022-05-17T12:01:01,772][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@id = "5b70e2d821ce194c9b6c3945b6fe8901f4e266279af8c20945a427e6853b6c95"
[2022-05-17T12:01:01,801][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@field_split = "*:{},?\\[\\]"
[2022-05-17T12:01:01,801][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@trim_value = "\\s"
[2022-05-17T12:01:01,801][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@trim_key = "\\s"
[2022-05-17T12:01:01,802][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@id = "f7fbeb930316a2d4994da4b916ca7d5cd13c47afe7145c2540399f84785124f4"
[2022-05-17T12:01:01,802][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@enable_metric = true
[2022-05-17T12:01:01,802][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@add_tag = []
[2022-05-17T12:01:01,803][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@remove_tag = []
[2022-05-17T12:01:01,803][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@add_field = {}
[2022-05-17T12:01:01,808][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@value_split = "="
[2022-05-17T12:01:01,809][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@prefix = ""
[2022-05-17T12:01:01,809][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@source = "message"
[2022-05-17T12:01:01,810][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@include_keys = []
[2022-05-17T12:01:01,811][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@whitespace = "lenient"
[2022-05-17T12:01:01,812][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@timeout_millis = 30000
[2022-05-17T12:01:01,812][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@tag_on_timeout = "_kv_filter_timeout"
[2022-05-17T12:01:01,812][DEBUG][logstash.filters.kv      ] config LogStash::Filters::KV/@tag_on_failure = "_kv_filter_error"
[2022-05-17T12:01:01,833][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"ruby", :type=>"filter", :class=>LogStash::Filters::Ruby}
[2022-05-17T12:01:01,850][DEBUG][logstash.filters.ruby    ] config LogStash::Filters::Ruby/@code = "event.set('Logs_processed', Time.now());"
[2022-05-17T12:01:01,851][DEBUG][logstash.filters.ruby    ] config LogStash::Filters::Ruby/@id = "607fd2312c5d33188ca20cb2ad917af10533a68597cf2a1df6f91046fc8e9347"
[2022-05-17T12:01:01,853][DEBUG][logstash.filters.ruby    ] config LogStash::Filters::Ruby/@enable_metric = true
[2022-05-17T12:01:01,853][DEBUG][logstash.filters.ruby    ] config LogStash::Filters::Ruby/@add_tag = []
[2022-05-17T12:01:01,853][DEBUG][logstash.filters.ruby    ] config LogStash::Filters::Ruby/@remove_tag = []
[2022-05-17T12:01:01,854][DEBUG][logstash.filters.ruby    ] config LogStash::Filters::Ruby/@add_field = {}
[2022-05-17T12:01:01,855][DEBUG][logstash.filters.ruby    ] config LogStash::Filters::Ruby/@remove_field = []
[2022-05-17T12:01:01,855][DEBUG][logstash.filters.ruby    ] config LogStash::Filters::Ruby/@periodic_flush = false
[2022-05-17T12:01:01,856][DEBUG][logstash.filters.ruby    ] config LogStash::Filters::Ruby/@script_params = {}
[2022-05-17T12:01:01,857][DEBUG][logstash.filters.ruby    ] config LogStash::Filters::Ruby/@tag_on_exception = "_rubyexception"
[2022-05-17T12:01:01,858][DEBUG][logstash.filters.ruby    ] config LogStash::Filters::Ruby/@tag_with_exception_message = false
[2022-05-17T12:01:01,897][DEBUG][logstash.filters.mutate  ] config LogStash::Filters::Mutate/@gsub = ["message", "\\n", " "]
[2022-05-17T12:01:01,900][DEBUG][logstash.filters.mutate  ] config LogStash::Filters::Mutate/@rename = {"message"=>"Payload"}
[2022-05-17T12:01:01,900][DEBUG][logstash.filters.mutate  ] config LogStash::Filters::Mutate/@id = "25fe73a3d2211094fc427ac0bff0eaf2956f4655a1fc2adade3761b4eccb107a"
[2022-05-17T12:01:01,901][DEBUG][logstash.filters.mutate  ] config LogStash::Filters::Mutate/@enable_metric = true
[2022-05-17T12:01:01,901][DEBUG][logstash.filters.mutate  ] config LogStash::Filters::Mutate/@add_tag = []
[2022-05-17T12:01:01,902][DEBUG][logstash.filters.mutate  ] config LogStash::Filters::Mutate/@remove_tag = []
[2022-05-17T12:01:01,902][DEBUG][logstash.filters.mutate  ] config LogStash::Filters::Mutate/@add_field = {}
[2022-05-17T12:01:01,902][DEBUG][logstash.filters.mutate  ] config LogStash::Filters::Mutate/@remove_field = []
[2022-05-17T12:01:01,903][DEBUG][logstash.filters.mutate  ] config LogStash::Filters::Mutate/@periodic_flush = false
[2022-05-17T12:01:01,916][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"prune", :type=>"filter", :class=>LogStash::Filters::Prune}
[2022-05-17T12:01:01,927][DEBUG][logstash.filters.prune   ] config LogStash::Filters::Prune/@blacklist_names = ["[0-9]+", "unknown_fields"]
[2022-05-17T12:01:01,928][DEBUG][logstash.filters.prune   ] config LogStash::Filters::Prune/@whitelist_names = ["@timestamp", "Logs_processed", "Payload", "Log_timestamp", "App_version", "Correlation_id", "Env", "Flow_name", "host", "Instance_id", "Log_level", "log_thread", "id", "Statuscode", "Type", "Detail", "Message", "App_name"]
[2022-05-17T12:01:01,930][DEBUG][logstash.filters.prune   ] config LogStash::Filters::Prune/@id = "7af85e071256e2cf066be1304a70f4a7062dd9f8aa1315f013286d7c84545733"
[2022-05-17T12:01:01,930][DEBUG][logstash.filters.prune   ] config LogStash::Filters::Prune/@enable_metric = true
[2022-05-17T12:01:01,931][DEBUG][logstash.filters.prune   ] config LogStash::Filters::Prune/@add_tag = []
[2022-05-17T12:01:01,931][DEBUG][logstash.filters.prune   ] config LogStash::Filters::Prune/@remove_tag = []
[2022-05-17T12:01:01,932][DEBUG][logstash.filters.prune   ] config LogStash::Filters::Prune/@add_field = {}
[2022-05-17T12:01:01,932][DEBUG][logstash.filters.prune   ] config LogStash::Filters::Prune/@remove_field = []
[2022-05-17T12:01:01,936][DEBUG][logstash.filters.prune   ] config LogStash::Filters::Prune/@periodic_flush = false
[2022-05-17T12:01:01,936][DEBUG][logstash.filters.prune   ] config LogStash::Filters::Prune/@interpolate = false
[2022-05-17T12:01:01,937][DEBUG][logstash.filters.prune   ] config LogStash::Filters::Prune/@whitelist_values = {}
[2022-05-17T12:01:01,937][DEBUG][logstash.filters.prune   ] config LogStash::Filters::Prune/@blacklist_values = {}
[2022-05-17T12:01:01,940][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"elasticsearch", :type=>"output", :class=>LogStash::Outputs::ElasticSearch}
[2022-05-17T12:01:01,984][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"plain", :type=>"codec", :class=>LogStash::Codecs::Plain}
[2022-05-17T12:01:01,993][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@id = "plain_26ae73c0-60b5-49cb-9374-3ab201a1d5f8"
[2022-05-17T12:01:01,994][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@enable_metric = true
[2022-05-17T12:01:01,994][DEBUG][logstash.codecs.plain    ] config LogStash::Codecs::Plain/@charset = "UTF-8"
[2022-05-17T12:01:02,005][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@index = "logstash-system-pathtest-pipeline-%{+YYYY.MM}"
[2022-05-17T12:01:02,012][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@hosts = [https://search-elkdomain.us-east-2.es.amazonaws.com:443]
[2022-05-17T12:01:02,013][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@id = "23804230db5e9ad4e2682996c9623aebaa551178513aa21d9046b7fc518e201d"
[2022-05-17T12:01:02,014][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@enable_metric = true
[2022-05-17T12:01:02,015][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@codec = <LogStash::Codecs::Plain id=>"plain_26ae73c0-60b5-49cb-9374-3ab201a1d5f8", enable_metric=>true, charset=>"UTF-8">
[2022-05-17T12:01:02,015][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@workers = 1
[2022-05-17T12:01:02,016][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@manage_template = true
[2022-05-17T12:01:02,017][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_name = "logstash"
[2022-05-17T12:01:02,017][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@template_overwrite = false
[2022-05-17T12:01:02,024][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@retry_on_conflict = 1
[2022-05-17T12:01:02,024][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pipeline = nil
[2022-05-17T12:01:02,025][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_enabled = false
[2022-05-17T12:01:02,025][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_rollover_alias = "logstash"
[2022-05-17T12:01:02,026][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ilm_pattern = "{now/d}-000001"
[2022-05-17T12:01:02,027][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@action = "index"
[2022-05-17T12:01:02,028][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@ssl_certificate_verification = true
[2022-05-17T12:01:02,029][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@sniffing_delay = 5
[2022-05-17T12:01:02,029][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@timeout = 60
[2022-05-17T12:01:02,030][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@failure_type_logging_whitelist = []
[2022-05-17T12:01:02,030][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max = 1000
[2022-05-17T12:01:02,031][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@pool_max_per_route = 100
[2022-05-17T12:01:02,031][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@resurrect_delay = 5
[2022-05-17T12:01:02,032][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@validate_after_inactivity = 10000
[2022-05-17T12:01:02,033][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@http_compression = false
[2022-05-17T12:01:02,033][DEBUG][logstash.outputs.elasticsearch] config LogStash::Outputs::ElasticSearch/@custom_headers = {}
[2022-05-17T12:01:02,043][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"stdout", :type=>"output", :class=>LogStash::Outputs::Stdout}
[2022-05-17T12:01:02,052][DEBUG][logstash.plugins.registry] On demand adding plugin to the registry {:name=>"rubydebug", :type=>"codec", :class=>LogStash::Codecs::RubyDebug}
[2022-05-17T12:01:02,056][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@id = "rubydebug_d49363ae-0e1f-4165-a0ca-7b39f167e102"
[2022-05-17T12:01:02,056][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@enable_metric = true
[2022-05-17T12:01:02,057][DEBUG][logstash.codecs.rubydebug] config LogStash::Codecs::RubyDebug/@metadata = false
[2022-05-17T12:01:06,593][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-05-17T12:01:07,529][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-05-17T12:01:14,090][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-05-17T12:01:16,928][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-05-17T12:01:16,988][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@codec = <LogStash::Codecs::RubyDebug id=>"rubydebug_d49363ae-0e1f-4165-a0ca-7b39f167e102", enable_metric=>true, metadata=>false>
[2022-05-17T12:01:16,988][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@id = "99478c19fcfc95c2f009db20708fa8002b07c8a8ddd861892ae2dba70916a427"
[2022-05-17T12:01:16,989][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@enable_metric = true
[2022-05-17T12:01:16,989][DEBUG][logstash.outputs.stdout  ] config LogStash::Outputs::Stdout/@workers = 1
[2022-05-17T12:01:17,047][INFO ][logstash.pipeline        ] Starting pipeline {:pipeline_id=>"logstashabctest", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}
[2022-05-17T12:01:17,146][DEBUG][logstash.outputs.elasticsearch] Normalizing http path {:path=>nil, :normalized=>nil}
[2022-05-17T12:01:17,676][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://search-elkdomain.us-east-2.es.amazonaws.com:443/]}}
[2022-05-17T12:01:17,687][DEBUG][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>https://search-elkdomain.us-east-2.es.amazonaws.com:443/, :path=>"/"}
[2022-05-17T12:01:18,170][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"https://search-elkdomain.us-east-2.es.amazonaws.com:443/"}
[2022-05-17T12:01:18,244][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>7}
[2022-05-17T12:01:18,251][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2022-05-17T12:01:18,295][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://search-elkdomain.us-east-2.es.amazonaws.com:443"]}
[2022-05-17T12:01:18,313][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2022-05-17T12:01:18,812][INFO ][logstash.inputs.s3       ] Registering s3 input {:bucket=>"elk-s3-16j1c7e2wi83c-bucket-sw28ncguh23b", :region=>"us-east-2"}
[2022-05-17T12:01:19,324][INFO ][logstash.pipeline        ] Pipeline started successfully {:pipeline_id=>"logstashabctest", :thread=>"#<Thread:0x48d56ecb run>"}
[2022-05-17T12:01:19,438][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:logstashabctest], :non_running_pipelines=>[]}
[2022-05-17T12:01:19,555][DEBUG][logstash.agent           ] Starting puma
[2022-05-17T12:01:19,580][DEBUG][logstash.agent           ] Trying to start WebServer {:port=>9600}
[2022-05-17T12:01:19,785][DEBUG][logstash.api.service     ] [api-service] start
[2022-05-17T12:01:20,113][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
[2022-05-17T12:01:21,990][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-05-17T12:01:21,996][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}
[2022-05-17T12:01:22,966][DEBUG][logstash.inputs.s3       ] S3 input: Found key {:key=>"a360uat/"}
[2022-05-17T12:01:22,980][DEBUG][logstash.inputs.s3       ] S3 input: Ignoring {:key=>"a360uat/"}
[2022-05-17T12:01:22,980][DEBUG][logstash.inputs.s3       ] S3 input: Found key {:key=>"a360uat/20220301073828_ip-10-248-39-202_1cae47099da34670a36ba3a39f5ac0a3uat"}
[2022-05-17T12:01:23,068][INFO ][logstash.inputs.s3       ] Using default generated file for the sincedb {:filename=>"/var/lib/logstash/plugins/inputs/s3/sincedb_e31c033c7c57b23f56a75840453901ec"}
[2022-05-17T12:01:23,094][DEBUG][logstash.inputs.s3       ] S3 Input: Object Not Modified {:key=>"a360uat/20220301073828_ip-10-248-39-202_1cae47099da34670a36ba3a39f5ac0a3uat"}
[2022-05-17T12:01:23,095][DEBUG][logstash.inputs.s3       ] S3 input: Found key {:key=>"a360uat/20220405064055_ip-10-248-43-211_eee65021c31b4e51bda190a50ff97a30"}
[2022-05-17T12:01:23,097][DEBUG][logstash.inputs.s3       ] S3 Input: Object Not Modified {:key=>"a360uat/20220405064055_ip-10-248-43-211_eee65021c31b4e51bda190a50ff97a30"}
[2022-05-17T12:01:23,104][DEBUG][logstash.inputs.s3       ] S3 input: Found key {:key=>"a360uat/20220405131149_ip-10-248-45-66_1721e685010745b88b074300182c8d88"}
[2022-05-17T12:01:23,104][DEBUG][logstash.inputs.s3       ] S3 Input: Object Not Modified {:key=>"a360uat/20220405131149_ip-10-248-45-66_1721e685010745b88b074300182c8d88"}
[2022-05-17T12:01:24,373][DEBUG][logstash.pipeline        ] Pushing flush onto pipeline {:pipeline_id=>"logstashabctest", :thread=>"#<Thread:0x48d56ecb sleep>"}
[2022-05-17T12:01:27,012][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ParNew"}
[2022-05-17T12:01:27,012][DEBUG][logstash.instrument.periodicpoller.jvm] collector name {:name=>"ConcurrentMarkSweep"}

There are no errors in your logs and Logstash is running.

[2022-05-16T20:58:08,196][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"logstashabc", :thread=>"#<Thread:0x487d5490 sleep>"}
[2022-05-16T20:58:08,285][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:logstashabc], :non_running_pipelines=>[]}
[2022-05-16T20:58:09,030][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

Those lines mean that Logstash started without any problems; it is not clear what the issue is here.

The logs show that Logstash is up and running successfully, but indices are not created and logs are not pulled when I run "sudo systemctl start logstash.service".

If I run the command below, everything works perfectly fine, as I mentioned earlier:
bin/logstash -f /etc/logstash/conf.d/logstash.conf

What happens when you run sudo systemctl start logstash?

How did you run logstash to get the logs you shared before?

There is nothing wrong in the logs you shared, so Logstash is starting without any problems. If you are not getting any data, then that is a different issue.

What is your current issue: is Logstash not starting, or are you not getting data after Logstash has started and is running?

Both commands perform the same, no differences:
sudo systemctl start logstash and sudo systemctl start logstash.service

I enabled debug mode in the logstash.yml file (log.level: debug).

Logstash is starting, but no data is coming in and no indices are created.

Yeah, only this might be an issue: it looks like the S3 data hasn't been pulled.

[2022-05-17T12:01:23,104][DEBUG][logstash.inputs.s3 ] S3 input: Found key {:key=>"a360uat/20220405131149_ip-10-248-45-66_1721e685010745b88b074300182c8d88"}
[2022-05-17T12:01:23,104][DEBUG][logstash.inputs.s3 ] S3 Input: Object Not Modified {:key=>"a360uat/20220405131149_ip-10-248-45-66_1721e685010745b88b074300182c8d88"}

Try to run it as the logstash user, not as root, in debug mode:
su -m logstash -c "bin/logstash -f /etc/logstash/conf.d/logstash.conf"

Please share your config using the Preformatted text option, the </> button.

I tried to run the command you provided, and it asks for a password; I am not sure where to find it.

Can you please let me know where I could find the password for the logstash user?

Use your root password.
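If no password works, a workaround is to switch to the logstash user through sudo instead, which does not prompt for the target user's password (paths assume a package install):

sudo -u logstash /usr/share/logstash/bin/logstash --path.settings /etc/logstash -f /etc/logstash/conf.d/logstash.conf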

Config file:
----------------
input {
  s3 {
    region => "us-east-2"
    access_key_id => "keyid"
    secret_access_key => "accesskey"
    prefix => "test/"
    id => "ee-system-test"
    bucket => "s3-16j1c7e2wi83c-lbifkzb4jqo5"
    interval => "60"
    tags => [ "eesystemtest" ]
    additional_settings => {
      force_path_style => true
      follow_redirects => false
    }
    # The multiline codec joins log lines that don't start with a timestamp onto the previous event
    codec => multiline {
      pattern => "\[%{TIMESTAMP_ISO8601}"
      what => "previous"
      negate => true
    }
  }
}

filter {
  # The grok filter extracts the timestamp and log level from the log line
  grok {
    match => { "message" => "\[%{TIMESTAMP_ISO8601:Log_timestamp}\]\s+%{LOGLEVEL:Log_level}\s+%{GREEDYDATA:msgbody}" }
    overwrite => [ "msgbody" ]
    add_tag => [ "Log_level", "Log_timestamp" ]
  }

  # Convert the string captured by grok into a date field
  date {
    match => [ "Log_timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "Log_timestamp"
    timezone => "UTC"
    # Sample timestamp format from the logs: 2020-02-10 11:06:55.698
  }

  # mutate flattens newlines in the message and renames it to Payload
  mutate {
    gsub => [ "message", "\n", " " ]
    rename => [ "message", "Payload" ]
    #add_field => { "@timestamp" => "Logs_processed" }
  }
}

output {
  elasticsearch {
    hosts => ["https://search-elkdomain:443"]
    #hosts => ["http://localhost:9200"]
    index => "logstash-system-two-pipeline-%{+YYYY.MM}"
  }
  stdout { codec => rubydebug }
}

I am not sure whether a password is set for the logstash and root users. I tried a couple of options, but nothing worked out.

Is there any other alternative I can try to debug the issue?

I tried running the same config file on version 5.6.16 and it worked perfectly fine, without any issues, but 6.8.23 does not work when run with the systemctl service.

The difference is that in 5.6.16 we don't have pipelines.yml; other than that, I didn't see much difference.
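For reference, the stock pipelines.yml on a 6.x package install just points the main pipeline at the conf.d directory, roughly:

cat /etc/logstash/pipelines.yml
- pipeline.id: main
  path.config: "/etc/logstash/conf.d/*.conf"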

Any suggestions/ideas would be helpful.

A few more ideas, as temporary values for testing. Make a backup of your files before you change anything, and keep a record of what you changed and where.

In the s3 input, set:

  1. sincedb_path - point it at another file that the logstash user has rights to ("/var/lib/logstash/plugins/inputs/s3/sincedb_temp.txt"). According to the debug log it is currently nil (Inputs::S3/@sincedb_path = nil), so the default is used: Using default generated file for the sincedb {:filename=>"/var/lib/logstash/plugins/inputs/s3/sincedb_e31c033c7c57b23f56a75840453901ec"}. Be aware that by changing sincedb_path you might get duplicate records; use another temporary index, just for testing.

  2. temporary_directory - the default value is "/tmp/logstash"; change it to a directory that is visible and writable to the logstash user (see the sketch after this list). This might be the issue. This setting controls where Logstash stores the temporary files before processing them.

  3. watch_for_new_files => true - the default value is true, but I would force it to true explicitly.

  4. interval => "60" - set it to 10 seconds.

  5. Check whether there is anything specific in /etc/systemd/system/logstash.service.

  6. Set log.level: "debug" in logstash.yml, clean the logs if possible, run both cases (systemctl and the command line), then compare the logs with a visual diff tool. Pay attention to logstash.runner and the input settings. Later you can try log.level: "trace", but be aware that trace will dump a lot of data. Also read the documentation for the S3 input plugin and check whether any parameter could logically be causing the issue.
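Here is a minimal sketch of items 1-4 in the s3 input, reusing the bucket settings from your config; the sincedb file and the temporary directory below are only example paths and must be writable by the logstash user:

input {
  s3 {
    region => "us-east-2"
    bucket => "s3-16j1c7e2wi83c-lbifkzb4jqo5"
    prefix => "test/"
    sincedb_path => "/var/lib/logstash/plugins/inputs/s3/sincedb_temp.txt"
    temporary_directory => "/var/lib/logstash/s3_tmp"
    watch_for_new_files => true
    interval => "10"
  }
}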

Maaaybe a newer Logstash version. This should be the laaast option.

Hi Rios,

While Logstash was running, I added new files to the S3 bucket, and the indices got created and the data is showing in Kibana as well. (I didn't change any permissions or the logstash.conf file.)

The S3 input plugin is now reading only the new files and skipping the old files, even though the storage class shows as Standard.

Any suggestions for how to read all the log files would be helpful.

Thank you,
Karthik

This is the expected behavior with the s3 input.

Every time you start Logstash, it will check the date of the last file it read from the bucket; this date is stored in a file in the sincedb_path. If you do not set a custom sincedb_path, it will create one in /var/lib/logstash/plugins/inputs/s3 and use it, so the next time Logstash runs it will know where it stopped consuming from the bucket.
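The sincedb file itself just holds the last-modified timestamp of the newest object processed; inspecting it shows something like this (the exact content below is hypothetical):

sudo cat /var/lib/logstash/plugins/inputs/s3/sincedb_e31c033c7c57b23f56a75840453901ec
2022-05-06 21:09:01 UTC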

If you want to reprocess the files in an S3 bucket, you will need to remove the sincedb file; this will tell Logstash to start from the earliest file.
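For example, using the auto-generated sincedb path from the logs above (stop the service first so the file is not rewritten while Logstash is running):

sudo systemctl stop logstash.service
sudo rm /var/lib/logstash/plugins/inputs/s3/sincedb_e31c033c7c57b23f56a75840453901ec
sudo systemctl start logstash.service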