Hi guys, I need help with this topic. I have to analyze assets_logs from an AWS bucket. I've downloaded them to my local machine, but I run into a problem when I use this command to get @timestamp from the log message. The log example is this one:
and I get this error from the shell: Failed parsing date from field {:field=>"logtimestamp", :value=>"[29/Jul/2016:22:48:53 +0000]", :exception=>java.lang.IllegalArgumentException: Invalid format: "[29/Jul/2016:22:48:53 +0000]", :level=>:warn}
My question is: how can I get the timestamp from the log information? Thanks a lot!
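The parse failure comes from the brackets: the captured value is "[29/Jul/2016:22:48:53 +0000]", but the date pattern only describes what is inside them. A minimal filter sketch (the field name logtimestamp matches the error above; the grok pattern is an assumption about the log format):

```conf
filter {
  grok {
    # \[ and \] consume the brackets so they are not captured into the field
    match => { "message" => "\[%{HTTPDATE:logtimestamp}\]" }
  }
  date {
    # HTTPDATE looks like 29/Jul/2016:22:48:53 +0000
    match => ["logtimestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
  }
}
```

The date filter writes the parsed date to @timestamp by default, so no explicit target is needed.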
Thanks a lot Magnus, but the @timestamp field doesn't get the date from my log information with the configuration mentioned above. How can I get the date from the log information and then add it to the @timestamp field, so Kibana will be able to recognize multiple dates, not just the one from when I created the index?
the @timestamp field, don't get the date from my log information, with the configuration mentioned above
Then I expect that you have an error message about this in your Logstash log. Also, please post an example of the event you get so that we can verify that the logtimestamp field exists and looks as expected.
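For reference, the stdout/rubydebug event dump mentioned here can be enabled with an output block like this (a standard Logstash snippet, shown for convenience):

```conf
output {
  # prints each event as a readable Ruby hash, handy for debugging filters
  stdout { codec => rubydebug }
}
```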
I use this command to run the Logstash configuration file: sudo /opt/logstash/bin/logstash -f /etc/logstash/conf.d
and I'm getting this error using that configuration:
failed to open /home/ubuntu/algunlado/2016-07-22-14-31-54-99D064F24343B5BB: Permission denied - /home/ubuntu/algunlado/2016-07-22-14-31-54-99D064F24343B5BB {:level=>:warn}
Logstash shutdown completed
Errno::EACCES: Permission denied - /home/ubuntu/.sincedb_5d0efa804b3990eaec099d7dedce6fdf.28544.7215.954984
initialize at org/jruby/RubyFile.java:363
new at org/jruby/RubyIO.java:853
atomic_write at /opt/logstash/vendor/bundle/jruby/1.9/gems/filewatch-0.6.3/lib/filewatch/helper.rb:33
_sincedb_write at /opt/logstash/vendor/bundle/jruby/1.9/gems/filewatch-0.6.3/lib/filewatch/tail.rb:234
sincedb_write at /opt/logstash/vendor/bundle/jruby/1.9/gems/filewatch-0.6.3/lib/filewatch/tail.rb:204
teardown at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-file-0.1.10/lib/logstash/inputs/file.rb:151
inputworker at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.1-java/lib/logstash/pipeline.rb:202
synchronize at org/jruby/ext/thread/Mutex.java:149
inputworker at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.1-java/lib/logstash/pipeline.rb:202
start_input at /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-core-1.5.1-java/lib/logstash/pipeline.rb:170
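The Errno::EACCES on the .sincedb file means the user running Logstash cannot write to /home/ubuntu. One possible workaround, sketched here with an assumed path and options, is to point the file input's sincedb at a location that user can write:

```conf
input {
  file {
    path => "/home/ubuntu/algunlado/*"
    start_position => "beginning"
    # assumption: /var/lib/logstash exists and is writable by the Logstash user
    sincedb_path => "/var/lib/logstash/sincedb_assets"
  }
}
```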
but some indices are being created:
yellow open logstash-2016.07.27 5 1 1 0 7.8kb 7.8kb
yellow open logstash-2016.08.22 5 1 521 0 372.9kb 372.9kb
yellow open logstash-2016.07.29 5 1 100 0 94.6kb 94.6kb
yellow open .kibana 1 1 1 0 3kb 3kb
and in Kibana the logtimestamp field isn't being recognized.
Please don't post screenshots. Post text, either from Kibana's JSON tab or from Logstash's stdout { codec => rubydebug } output. Format the text as code with the </> toolbar button.
Now, the _grokparsefailure tag in what you posted indicates that there is no logtimestamp field because the grok filter failed. Address that first.
You should also address this:
failed to open /home/ubuntu/algunlado/2016-07-22-14-31-54-99D064F24343B5BB: Permission denied - /home/ubuntu/algunlado/2016-07-22-14-31-54-99D064F24343B5BB {:level=>:warn}
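The "Permission denied" on the log file itself means the user running Logstash cannot read it. A sketch of the fix, using a stand-in directory so it can be tried anywhere (the real path from the error is /home/ubuntu/algunlado; adjust accordingly on the Logstash host):

```shell
LOGDIR=$(mktemp -d)                 # stand-in for /home/ubuntu/algunlado
touch "$LOGDIR/2016-07-22-sample"   # stand-in for a downloaded log file
chmod -R go-rwx "$LOGDIR"           # simulate the unreadable state
chmod -R a+rX "$LOGDIR"             # the fix: read on files, traverse on dirs
ls -l "$LOGDIR"                     # files should now show r for group/other
```

On the real machine that would be something like sudo chmod -R a+rX /home/ubuntu/algunlado, after checking ownership with ls -l.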