Recently I set up the ELK stack to filter the logs of my Load Balancer. But the timestamp shown in Kibana is not the one from the log files; it is the time at which the file was downloaded to the server. I am not sure what mistake I am making.
We are using AWS and our server logs are stored in S3. Below is the content from the log file:
The logs I am working with are historic logs. As I understand it, Logstash and Kibana should not care about the age of the logs; they are expected to filter data based on the timestamp recorded in each log entry. But that is not the behavior I am seeing. What could be the reason for this?
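My understanding is that a filter along the following lines should pick up the log's own timestamp. This is only a rough sketch, not my exact config: it assumes the log line starts with an ISO8601 timestamp, and the field names (timestamp, rest) are placeholders I chose for illustration.

filter {
  grok {
    # Illustrative pattern: capture the leading ISO8601 timestamp of the line
    match => [ "message", "%{TIMESTAMP_ISO8601:timestamp} %{GREEDYDATA:rest}" ]
  }
  date {
    # Copy the parsed field into @timestamp so Kibana sorts by log time,
    # not by the time the file was read
    match => [ "timestamp", "ISO8601" ]
  }
}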
But it is crashing with an error. I think something is wrong with the debug parameter I am passing.
Using milestone 2 input plugin 'file'. This plugin should be stable, but if you see strange behavior, please let us know! For more information on plugin milestones, see http://logstash.net/docs/1.3.3/plugin-milestones {:level=>:warn}
Exception in thread "LogStash::Runner" org.jruby.exceptions.RaiseException: (InvalidOption) invalid option: --log
at RUBY.complete(jar:file:/home/ubuntu/logstash-1.3.3-flatjar.jar!/META-INF/jruby.home/lib/ruby/1.9/optparse.rb:1542)
at org.jruby.RubyKernel.catch(org/jruby/RubyKernel.java:1284)
at RUBY.complete(jar:file:/home/ubuntu/logstash-1.3.3-flatjar.jar!/META-INF/jruby.home/lib/ruby/1.9/optparse.rb:1540)
at RUBY.parse_in_order(jar:file:/home/ubuntu/logstash-1.3.3-flatjar.jar!/META-INF/jruby.home/lib/ruby/1.9/optparse.rb:1354)
at org.jruby.RubyKernel.catch(org/jruby/RubyKernel.java:1284)
at RUBY.parse_in_order(jar:file:/home/ubuntu/logstash-1.3.3-flatjar.jar!/META-INF/jruby.home/lib/ruby/1.9/optparse.rb:1347)
at RUBY.order!(jar:file:/home/ubuntu/logstash-1.3.3-flatjar.jar!/META-INF/jruby.home/lib/ruby/1.9/optparse.rb:1341)
at RUBY.permute!(jar:file:/home/ubuntu/logstash-1.3.3-flatjar.jar!/META-INF/jruby.home/lib/ruby/1.9/optparse.rb:1432)
at RUBY.parse!(jar:file:/home/ubuntu/logstash-1.3.3-flatjar.jar!/META-INF/jruby.home/lib/ruby/1.9/optparse.rb:1453)
at RUBY.parse(jar:file:/home/ubuntu/logstash-1.3.3-flatjar.jar!/META-INF/jruby.home/lib/ruby/1.9/optparse.rb:1443)
at RUBY.run(file:/home/ubuntu/logstash-1.3.3-flatjar.jar!/logstash/kibana.rb:83)
at logstash.runner.run(logstash/runner.rb:113)
at org.jruby.RubyProc.call(org/jruby/RubyProc.java:271)
at logstash.runner.run(logstash/runner.rb:207)
at logstash.runner.main(logstash/runner.rb:79)
at logstash.runner.(root)(logstash/runner.rb:237)
Can you help me with the correct parameter to pass to generate a log file?
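From the stack trace it looks like --log ends up being parsed by the embedded Kibana web subcommand (kibana.rb), which rejects it. My guess, and it is only a guess, is that the log option belongs to the agent subcommand and therefore has to appear before the "-- web" separator, roughly like the sketch below (paths and file names are placeholders):

# Placeholder paths; --log (or -l) is passed to "agent", not to "web"
java -jar logstash-1.3.3-flatjar.jar agent -f logstash.conf --log /var/log/logstash.log -- web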
Here we see that the _grokparsefailure tag is set, which indicates that your grok expression doesn't match the input string. If grok isn't successful, the timestamp field isn't populated, and then the date filter won't populate @timestamp either. So, fix your grok expression. To save time, don't use an elasticsearch output until you've verified that you get the desired output; use stdout { codec => rubydebug } instead.
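For example, while you are debugging, the output section of a throwaway config could look like the sketch below (the input and filter sections are omitted here and nothing in it is specific to your setup):

output {
  # Print each event with all its fields so you can check @timestamp, tags,
  # and the parsed fields before switching back to the elasticsearch output.
  stdout { codec => rubydebug }
}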
Clearly, a hyphen isn't a number. Look at the patterns that ship with Logstash to see how you can match a hyphen as an alternative to a number. Have a look at the Apache patterns in particular.
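As an illustration only (the field names client and bytes are made up, not your actual fields), the bundled Apache patterns handle this with an alternation that accepts either a number or a literal hyphen:

filter {
  grok {
    # (?:%{NUMBER:bytes}|-) matches a number or a bare "-" for missing values,
    # which is how the stock Apache log patterns deal with the same problem.
    match => [ "message", "%{IP:client} (?:%{NUMBER:bytes}|-)" ]
  }
}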