You didn't answer both my questions. In the future please answer all questions so we're not wasting any time.
It looks like you're capturing the timestamp in two ways: first with (?<timestamp>%{DAY}... in your grok expression (but since you didn't format your configuration as preformatted text we can't tell for sure), and then by adding to the same field with add_field, which turns it into an array. Either way should work fine, but please pick one; don't do both.
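For example (just a sketch — I'm guessing the full pattern from the timestamp format you posted, since your configuration wasn't preformatted), keeping only the grok capture and dropping the add_field would look like this:

filter {
  grok {
    # The named capture alone populates the "timestamp" field;
    # a second write via add_field would turn it into an array.
    match => { "message" => "\[(?<timestamp>%{DAY} %{MONTH} %{MONTHDAY} %{TIME} %{YEAR})\]" }
  }
}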
I don't see any errors in the log file after the startup messages, shown below:
[2017-05-12T08:16:37,191][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-geoip-4.0.4-java/vendor/GeoLite2-City.mmdb"}
[2017-05-12T08:16:37,414][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/1.9/gems/logstash-filter-geoip-4.0.4-java/vendor/GeoLite2-City.mmdb"}
[2017-05-12T08:16:37,510][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>3, "pipeline.batch.size"=>250, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>750}
[2017-05-12T08:16:38,154][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5046"}
[2017-05-12T08:16:38,201][WARN ][logstash.inputs.beats ] Beats input: SSL Certificate will not be used
[2017-05-12T08:16:38,202][WARN ][logstash.inputs.beats ] Beats input: SSL Key will not be used
[2017-05-12T08:16:38,203][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5043"}
[2017-05-12T08:16:38,214][INFO ][logstash.pipeline ] Pipeline main started
[2017-05-12T08:16:38,283][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2017-05-12T08:16:41,127][INFO ][logstash.inputs.s3 ] Using default generated file for the sincedb {:filename=>"/usr/share/logstash/.sincedb_b35047b42086d04ce9bf9fac20401019"}
[2017-05-12T08:16:41,150][INFO ][logstash.inputs.s3 ] Using default generated file for the sincedb {:filename=>"/usr/share/logstash/.sincedb_cf18dec491ac193434632184239c16d8"}
As you suggested, I made changes to the grok pattern. I'm no longer getting _dateparsefailure, but the @timestamp field still isn't being modified.
What do your events look like now? Please copy and paste from the JSON tab in Kibana's Discover panel; I want to see the raw JSON message. The result of a stdout { codec => rubydebug } output on the Logstash side is also fine.
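That is, temporarily adding something like this to your configuration:

output {
  # Dumps each processed event to the console in a readable form
  stdout { codec => rubydebug }
}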
Yeah, from the JSON it looks perfect, but when I look at the Kibana UI rather than the JSON output I see things like this:
Log message timestamp "[Fri May 12 09:59:17.419071 2017]"
and @timestamp May 12th 2017, 15:29:17.880.
The Logstash and web server system clocks are both in UTC.
Does it also convert the timestamp to my browser's local time zone (IST)? I want to see, by default, the same timestamp value that appears in the log message.
Yeah, from the JSON it looks perfect, but when I look at the Kibana UI rather than the JSON output I see things like this:
Then please say that so we don't have to play 20 questions.
Does it also convert the timestamp to my browser's local time zone (IST)?
Yes, Kibana does that by default.
I want to see, by default, the same timestamp value that appears in the log message.
There's a Kibana setting for the timezone adjustment (dateFormat:tz under Management → Advanced Settings). However, Logstash converts the original timestamp to UTC for storage in Elasticsearch, so unless the input is already UTC (which it happens to be in your case) you won't be able to see the original timestamp.
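For reference, that conversion is done by the date filter. Something like this (the format string is my guess based on the timestamp sample you posted) would parse your log's timestamp as UTC:

filter {
  date {
    # "timestamp" is the field populated by your grok capture;
    # the format is guessed from "[Fri May 12 09:59:17.419071 2017]"
    match => [ "timestamp", "EEE MMM dd HH:mm:ss.SSSSSS yyyy" ]
    # Tells the filter what zone the source timestamp is in
    timezone => "UTC"
  }
}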