My Filebeat is using the time a log file was created as the timestamp. This is a problem, since it means the messages in the log don't each get a unique time.
What I am trying to do is parse a JSON log so that one of its fields becomes the timestamp, so that I can get correct graphs.
You can use Filebeat to parse the JSON event. See the json docs.
Then, in either Logstash or Elasticsearch (using an Ingest Node pipeline), you can parse msgSubmissionTime with a date filter/processor. From there you can either leave the parsed value in msgSubmissionTime or overwrite the @timestamp field with it.
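As a minimal sketch of the Filebeat side, assuming a Filebeat 5.x/6.x-style log input (the path is hypothetical; option names can differ between versions):

```yaml
filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/myapp/*.log   # hypothetical path, adjust to your logs
  # Decode each line as JSON and put the decoded keys at the
  # top level of the event, so msgSubmissionTime becomes a field.
  json.keys_under_root: true
  # Add an error key to the event if JSON decoding fails.
  json.add_error_key: true
```

With this in place, each event shipped to Logstash or Elasticsearch should carry msgSubmissionTime as a regular field that a date filter/processor can act on.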
I installed Logstash and configured it with the rest of the ELK stack. I am trying to use Grok so that msgSubmissionTime becomes the timestamp. This looks like a simple procedure, but I am pretty confused by the documentation. I would be happy if you could help!
msgSubmissionTime is given in milliseconds, and not all of the incoming JSONs contain msgSubmissionTime.
If it is a problem that some JSONs have this field and some don't, I can change that.
I tried:
filter {
  date {
    match => [ "msgSubmissionTime", "SSS" ]
  }
}
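For a field holding epoch milliseconds, the date filter's UNIX_MS pattern is likely what's needed here rather than SSS (SSS only matches a three-digit milliseconds component inside a formatted date string). A minimal sketch, assuming msgSubmissionTime is a millisecond epoch value:

```ruby
filter {
  date {
    # UNIX_MS parses a Unix epoch timestamp in milliseconds.
    match  => [ "msgSubmissionTime", "UNIX_MS" ]
    # @timestamp is the default target; shown here for clarity.
    target => "@timestamp"
  }
}
```

Events that lack msgSubmissionTime should pass through unchanged, since the date filter skips events where the source field is absent, so the field being present only on some JSONs shouldn't by itself cause failures.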