The issue I'm facing is that I need to manually push some application logs into ES through Logstash. I'm using a separate Logstash instance for this, but where I'm stuck is the timestamp of the logs when they are visualized in Kibana. It shows the timestamp at which the logs were pushed into ES (since Kibana uses the @timestamp field), rather than the timestamp at which the logs were generated. Some googling told me that the @timestamp field is populated by Logstash when it pushes the logs to the ES index. So there are currently 2 apparent solutions for this (found via Google :P):
- Extract the timestamp from the logs (through a grok filter) and put the value into the @timestamp field (I tried several methods, but none of them worked as expected)
- Extract the timestamp from the logs, put it in a custom field (e.g. logtimestamp), push that into ES, and then configure Kibana to order the logs by the logtimestamp field instead of the default @timestamp field (not possible for several reasons)
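For the first option, this is the kind of filter config I've been trying — a minimal sketch only: the ISO8601 pattern and the logts/msg field names are placeholders, since my actual log format differs:

```
filter {
  # Pull the raw timestamp out of the log line into a temporary field.
  # TIMESTAMP_ISO8601 is just an example pattern; adjust to the real format.
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:logts} %{GREEDYDATA:msg}" }
  }
  # Parse logts and overwrite @timestamp with it (the date filter's
  # default target is @timestamp), then drop the temporary field.
  date {
    match => [ "logts", "ISO8601" ]
    target => "@timestamp"
    remove_field => [ "logts" ]
  }
}
```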
So I'm currently stuck on this. Any and all help will be appreciated. Also, I was a bit confused about where to post this (Elasticsearch or Logstash), so if this isn't the right place, let me know. Thanks!