Indices not being created after starting Logstash with the systemctl service

What happens when you run sudo systemctl start logstash?

How did you run Logstash to get the logs you shared before?

There is nothing wrong in the logs you shared, so Logstash is starting without any problems. If you are not getting any data, then that is a different issue.

What is your current issue: is Logstash not starting, or are you not getting data after Logstash has started and is running?

Both commands perform the same, with no difference:
sudo systemctl start logstash and sudo systemctl start logstash.service

I enabled debug mode in the logstash.yml file (log.level: debug).

Logstash is starting, but no data is coming in and no indices are being created.

Yeah, this might be the only issue: it looks like the S3 data hasn't been pulled.

[2022-05-17T12:01:23,104][DEBUG][logstash.inputs.s3 ] S3 input: Found key {:key=>"a360uat/20220405131149_ip-10-248-45-66_1721e685010745b88b074300182c8d88"}
[2022-05-17T12:01:23,104][DEBUG][logstash.inputs.s3 ] S3 Input: Object Not Modified {:key=>"a360uat/20220405131149_ip-10-248-45-66_1721e685010745b88b074300182c8d88"}

Try to run it as the logstash user, not as root, in debug mode:
su -m logstash -c "bin/logstash -f /etc/logstash/conf.d/logstash.conf"
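
The relative bin/logstash path above assumes the Logstash install directory as the working directory; a minimal sketch, assuming the default package layout:

# Run from the install directory so the relative bin/ path resolves
cd /usr/share/logstash
su -m logstash -c "bin/logstash -f /etc/logstash/conf.d/logstash.conf"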

Please share your config using the Preformatted text option, the </> button.

I tried to run the command you provided, but it asks for a password and I am not sure where I could find it.

Can you please let me know where I could find the password for the logstash user?

Use your root password.
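
If no password is set for either account, sudo can switch users without the target user's password; a minimal sketch, assuming the default package install path:

# Run Logstash in the foreground as the logstash user; this needs your own
# sudo rights, not a password for the logstash account.
sudo -u logstash /usr/share/logstash/bin/logstash -f /etc/logstash/conf.d/logstash.conf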

Config file:
----------------
input {
  s3 {
    region => "us-east-2"
    access_key_id => "keyid"
    secret_access_key => "accesskey"
    prefix => "test/"
    id => "ee-system-test"
    bucket => "s3-16j1c7e2wi83c-lbifkzb4jqo5"
    interval => "60"
    tags => [ "eesystemtest" ]
    additional_settings => {
      force_path_style => true
      follow_redirects => false
    }
    # The multiline codec combines multiple log lines when a line doesn't start with the timestamp
    codec => multiline {
      pattern => "\[%{TIMESTAMP_ISO8601}"
      what => "previous"
      negate => true
    }
  }
}

filter {
  # The grok filter extracts the timestamp and log level from the log message
  grok {
    match => { "message" => "\[%{TIMESTAMP_ISO8601:Log_timestamp}\]\s+%{LOGLEVEL:Log_level}\s+%{GREEDYDATA:msgbody}" }
    overwrite => [ "msgbody" ]
    add_tag => [ "Log_level", "Log_timestamp" ]
  }

  # Convert the string result from the grok filter to a date
  date {
    match => [ "Log_timestamp", "yyyy-MM-dd HH:mm:ss.SSS" ]
    target => "Log_timestamp"
    timezone => "UTC"
    # Sample timestamp format from the logs: 2020-02-10 11:06:55.698
  }

  # mutate helps with parsing the payload
  mutate {
    gsub => [ "message", "\n", " " ]
    rename => [ "message", "Payload" ]
    #add_field => { "@timestamp" => "Logs_processed" }
  }
}

output {
  elasticsearch {
    hosts => ["https://search-elkdomain:443"]
    #hosts => ["http://localhost:9200"]
    index => "logstash-system-two-pipeline-%{+YYYY.MM}"
  }
  stdout { codec => rubydebug }
}

I am not sure whether a password is set for the logstash and root users. I tried a couple of options, but nothing worked out.

Is there any other alternative I can try to debug the issue?

I tried to run the same config file on version 5.6.16 and it worked perfectly fine without any issues, but version 6.8.23 does not work when run with the systemctl service.

The difference is that in 5.6.16 we don't have pipelines.yml; other than that, I didn't see much difference.

Any suggestions/ideas would be helpful.

A few more ideas, as temporary values for testing. Make a backup of your files before you change anything, and keep a record of what you changed and where.

In the s3 input, set the following (a combined sketch follows the list):

  1. sincedb_path - set it to another file where the logstash user has rights ("/var/lib/logstash/plugins/inputs/s3/sincedb_temp.txt"). According to the debug log it is nil (Inputs::S3/@sincedb_path = nil), so the default is used: Using default generated file for the sincedb {:filename=>"/var/lib/logstash/plugins/inputs/s3/sincedb_e31c033c7c57b23f56a75840453901ec"}. Be aware that by changing sincedb_path you might get duplicate records; use another temporary index, just for testing.

  2. temporary_directory - The default value is "/tmp/logstash"; change it to a directory visible to the logstash user. This might be the issue.
    It sets the directory where Logstash will store temporary files before processing them.

  3. watch_for_new_files => true - The default value is already true, but I would force it explicitly.

  4. interval => "60" - set it to 10 seconds.

  5. Check whether there is anything specific in /etc/systemd/system/logstash.service.

  6. Set log.level: "debug" in logstash.yml. Clean the logs if possible, run both cases (systemctl and the process directly), then compare the logs with a visual tool. Pay attention to logstash.runner and the input settings. Later you can try log.level: "trace", but be aware that trace will dump a lot of data.
    Read the documentation for the S3 input plugin and check whether any logical parameter is causing the issue.
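
A minimal sketch of the s3 input with items 1-4 applied, reusing the bucket and credentials from the config above; the sincedb and temporary paths are only examples, adjust them to locations the logstash user can write to:

input {
  s3 {
    region => "us-east-2"
    access_key_id => "keyid"
    secret_access_key => "accesskey"
    prefix => "test/"
    bucket => "s3-16j1c7e2wi83c-lbifkzb4jqo5"
    # 1. explicit sincedb file owned by the logstash user (may duplicate records)
    sincedb_path => "/var/lib/logstash/plugins/inputs/s3/sincedb_temp.txt"
    # 2. temporary directory the logstash user can write to, instead of /tmp/logstash
    temporary_directory => "/var/lib/logstash/tmp"
    # 3. force the default explicitly
    watch_for_new_files => true
    # 4. shorter poll interval while testing
    interval => "10"
  }
}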

Maaaybe a newer Logstash version. This should be the laaast option.

Hi Rios,

While Logstash is running, if I add new files to the S3 bucket, then the indices get created and the data shows in Kibana as well. (I didn't change any permissions or the logstash.conf file.)

The S3 input plugin now reads only the new files and skips the old files, even though the storage class shows as Standard.

Any suggestion on how to read all the log files would be helpful.

Thank you,
Karthik

This is the expected behavior with the s3 input.

Every time you start Logstash, it checks the date of the last file it read from the bucket. This date is stored in a file at the sincedb_path; if you do not set a custom sincedb_path, Logstash creates one in /var/lib/logstash/plugins/inputs/s3 and uses it, so the next time it runs it knows where it stopped consuming from the bucket.

If you want to reprocess the files in an S3 bucket, you need to remove the sincedb file; this tells Logstash to start from the earliest file, for example:
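
A minimal sketch, assuming the default sincedb location from the debug log above (the hashed filename will differ per input):

# Stop Logstash so the sincedb is not rewritten while you delete it
sudo systemctl stop logstash

# Remove the auto-generated sincedb for the s3 input
sudo rm /var/lib/logstash/plugins/inputs/s3/sincedb_*

# On the next start, the s3 input begins from the earliest file in the bucket
sudo systemctl start logstash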

After removing the sincedb file, I started seeing the old log files as well. Thank you Leandro!

Hi Leandro,

Upgraded Logstash from 6.8.23 to 7.12.1. I tried to run the same logstash conf file, but got the error below:

> [2022-05-24T23:40:16,451][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
> [2022-05-24T23:40:16,462][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"7.12.1", "jruby.version"=>"jruby 9.2.13.0 (2.5.7) 2020-08-03 9a89c94bcc OpenJDK 64-Bit Server VM 11.0.10+9 on 11.0.10+9 +indy +jit [linux-x86_64]"}
> [2022-05-24T23:40:17,651][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600}
> [2022-05-24T23:40:19,916][INFO ][org.reflections.Reflections] Reflections took 46 ms to scan 1 urls, producing 23 keys and 47 values
> [2022-05-24T23:40:33,533][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://search-domain-y7muh26u4l.us-east-2.es.amazonaws.com:443/]}}
> [2022-05-24T23:40:33,856][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://search-domain-y7muh26u4l.us-east-2.es.amazonaws.com:443/"}
> [2022-05-24T23:40:33,908][INFO ][logstash.outputs.elasticsearch][main] ES Output version determined {:es_version=>7}
> [2022-05-24T23:40:33,913][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
> [2022-05-24T23:40:33,939][ERROR][logstash.outputs.elasticsearch][main] Unable to get license information {:url=>"https://search-domain-y7muh26u4l.us-east-2.es.amazonaws.com:443/", :error_type=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError, :error=>"Got response code '401' contacting Elasticsearch at URL 'https://search-domain-y7muh26u4l.us-east-2.es.amazonaws.com:443/_license'"}
> [2022-05-24T23:40:33,945][WARN ][logstash.outputs.elasticsearch][main] DEPRECATION WARNING: Connecting to an OSS distribution of Elasticsearch using the default distribution of Logstash will stop working in Logstash 8.0.0. Please upgrade to the default distribution of Elasticsearch, or use the OSS distribution of Logstash {:url=>"https://search-domain-y7muh26u4l.us-east-2.es.amazonaws.com:443/"}
> [2022-05-24T23:40:33,963][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://search-bsmhelkdomain-y7muh26u4lz646ile5b6dl5fve.us-east-2.es.amazonaws.com:443"]}
> [2022-05-24T23:40:34,036][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
> [2022-05-24T23:40:34,097][ERROR][logstash.outputs.elasticsearch][main] Failed to install template. {:message=>"Got response code '401' contacting Elasticsearch at URL 'https://search-domain-y7muh26u4l.us-east-2.es.amazonaws.com:443/_xpack'", :class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError", :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:317:in `perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:304:in `block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:399:in `with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:303:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:311:in `block in Pool'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client.rb:197:in `get'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client.rb:418:in `get_xpack_info'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/ilm.rb:57:in `ilm_ready?'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/ilm.rb:28:in `ilm_in_use?'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/template_manager.rb:15:in `install_template'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch.rb:426:in `install_template'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch.rb:274:in `block in register'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:137:in `block in setup_after_successful_connection'"]}
> [2022-05-24T23:40:34,207][INFO ][logstash.javapipeline    ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/etc/logstash/conf.d/logstash.conf"], :thread=>"#<Thread:0x19555fbe run>"}
> [2022-05-24T23:40:34,209][FATAL][logstash.runner          ] An unexpected error occurred! {:error=>#<LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError: LogStash::Outputs::ElasticSearch::HttpClient::Pool::BadResponseCodeError>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client/manticore_adapter.rb:80:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:317:in `perform_request_to_url'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:304:in `block in perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:399:in `with_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:303:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client/pool.rb:311:in `block in Pool'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client.rb:197:in `get'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/http_client.rb:418:in `get_xpack_info'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/ilm.rb:57:in `ilm_ready?'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch/ilm.rb:28:in `ilm_in_use?'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/outputs/elasticsearch.rb:275:in `block in register'", "/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-output-elasticsearch-10.8.6-java/lib/logstash/plugin_mixins/elasticsearch/common.rb:137:in `block in setup_after_successful_connection'"]}
> [2022-05-24T23:40:34,255][FATAL][org.logstash.Logstash    ] Logstash stopped processing because of an error: (SystemExit) exit
> org.jruby.exceptions.SystemExit: (SystemExit) exit
>         at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:747) ~[jruby-complete-9.2.13.0.jar:?]
>         at org.jruby.RubyKernel.exit(org/jruby/RubyKernel.java:710) ~[jruby-complete-9.2.13.0.jar:?]
>         at usr.share.logstash.lib.bootstrap.environment.<main>(/usr/share/logstash/lib/bootstrap/environment.rb:89) ~[?:?]

Can you please provide any suggestions?

Thank you,
Karthik

Which version of Elasticsearch are you using? And which distribution?

You should try to keep Logstash and Elasticsearch in the same version.

Also, it seems that you are using the AWS distribution of Elasticsearch; it won't work with Logstash versions higher than 7.10.x.
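
If you need to stay on the AWS distribution for now, one option is to pin Logstash to the last compatible line; a sketch, assuming an RPM-based install, with the exact version string as an example (7.10.2 was the last OSS release):

# Remove the newer default-distribution Logstash
sudo yum remove logstash

# Install the OSS build at 7.10.x, which does not perform the
# license/_xpack checks that fail against the AWS distribution
sudo yum install logstash-oss-7.10.2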

I am using Elasticsearch version 7.1, and it's the AWS legacy distribution of Elasticsearch.

Currently I have Logstash version 7.12.1 installed.

Just to make sure - is Elasticsearch 7.1 not compatible with Logstash 7.12.1? And is it mandatory to keep Logstash and Elasticsearch on the same version?

Elasticsearch 7.1 is EOL and no longer supported. Please upgrade ASAP.

(This is an automated response from your friendly Elastic bot. Please report this post if you have any suggestions or concerns :elasticheart: )

No, but not every version is compatible with every other version. See this post.
