Changing a field in a JSON event to become the timestamp in Filebeat

Hi,

My Filebeat is reading the time that the log file is created as the timestamp. This is a problem, since not every message in the log gets a unique time.
What I am trying to do is parse a JSON event so that one of its fields becomes the timestamp, so that I can get correct graphs.

Here is an example of such a JSON event:

{
  "msg": "Hello",
  "msgSubmissionTime": "1484752676968"
}

Many thanks to anyone who can help!

You can use Filebeat to parse the JSON event. See the json docs.
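For example, here is a minimal sketch of the Filebeat side, assuming a Filebeat 5.x-style prospector config; the log path is just a placeholder:

filebeat.prospectors:
- input_type: log
  paths:
    - /var/log/app/*.log      # placeholder path, adjust to your logs
  # Decode each line as JSON and lift the decoded fields to the top level
  json.keys_under_root: true
  # Add an error key to the event if JSON decoding fails
  json.add_error_key: true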

And then in either Logstash or Elasticsearch (using Ingest Node) you can parse msgSubmissionTime using a date processor. From there you can either leave it as msgSubmissionTime in the event or overwrite the @timestamp field with the parsed value.
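For the Ingest Node route, a minimal sketch of such a pipeline (the name parse-submission-time is just an example) that parses the millisecond epoch into @timestamp:

PUT _ingest/pipeline/parse-submission-time
{
  "description": "Set @timestamp from msgSubmissionTime (epoch milliseconds)",
  "processors": [
    {
      "date": {
        "field": "msgSubmissionTime",
        "formats": ["UNIX_MS"],
        "target_field": "@timestamp"
      }
    }
  ]
}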


Thanks, @andrewkroh!

Is there a way to do so without Logstash? Is there a way to do this configuration through Kibana or Filebeat?

As Andrew described, it can also be done with Ingest Node, so you don't need Logstash. The configuration must be done directly in Elasticsearch.
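A sketch of pointing Filebeat's Elasticsearch output at such a pipeline, assuming the example pipeline name from above and a placeholder host:

output.elasticsearch:
  hosts: ["localhost:9200"]   # placeholder host
  # Run every event through the ingest pipeline registered in Elasticsearch
  pipeline: "parse-submission-time"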


Hi @andrewkroh and @ruflin,

Thanks for the help until now!

I installed Logstash and connected it to the rest of the ELK stack. I am trying to use Grok to make msgSubmissionTime become the timestamp. This looks like a simple procedure, but I am pretty confused by the documentation. I would be happy if you could help!

msgSubmissionTime is given in milliseconds, and not all of the incoming JSON events contain msgSubmissionTime.
If it will be an issue that some events have this field and some don't, I can change this.

I tried:

filter {
  date {
    match => [ "msgSubmissionTime", "SSS" ]
  }
}

I also tried:

filter {
  mutate {
    convert => { "msgSubmissionTime" => "integer" }
  }
}
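A minimal sketch that may address both points, assuming the Logstash date filter's UNIX_MS format for millisecond epochs and a conditional guard for events that lack the field:

filter {
  # Only run the date filter when the field is present, so events
  # without msgSubmissionTime are not tagged with _dateparsefailure
  if [msgSubmissionTime] {
    date {
      # UNIX_MS parses an epoch value given in milliseconds
      match  => [ "msgSubmissionTime", "UNIX_MS" ]
      # Overwrite @timestamp (this is also the filter's default target)
      target => "@timestamp"
    }
  }
}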
