Logstash JSON parsing of @timestamp

The docs for the Logstash JSON plugin say:

If the parsed data contains a @timestamp field, we will try to use it for the event’s @timestamp, if the parsing fails, the field will be renamed to _@timestamp and the event will be tagged with a _timestampparsefailure.

...but I can't find any info on how they try to parse it or how to affect the parsing.

I'm feeding in a record with @timestamp set to a valid epoch time - and it's failing to parse it:

"@timestamp":"1522458058"

Now, it's about 3 months old (old test data), but it is the correct timestamp for the event - 2018-03-31 01:00:58 am - so why is it getting rejected?
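For what it's worth, the automatic coercion seems to expect an ISO8601 string rather than a bare epoch number, so the same record with the timestamp written like this should be accepted without any failure tags (assuming the event time is UTC):

"@timestamp":"2018-03-31T01:00:58.000Z"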

Is there any way of disabling the timestamp parsing? I'm trying to feed Logstash data it can simply ingest, without it needing to mess around with each record after receiving it.

I get the same problem without the quotes around the number.

Does it work if you parse _@timestamp with a date filter afterwards? I don't know if the json filter is super clever at detecting the date format.

Something like this should work:

if "_timestampparsefailure" in [tags] {
  date {
    match => [ "_@timestamp", "UNIX" ]
    remove_field => [ "_@timestamp" ]
    remove_tag => [ "_timestampparsefailure" ]
  }
}

If the question is just about disabling the parsing (and not caring about the correct timestamp for the event), I don't know, and I don't see a way in the code to do that.

Yeah, that works. It's probably a little more efficient to put the epoch value in a different field and catch it with a date filter directly.
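As a sketch of that approach (assuming here that the producer writes the epoch into a field named ts instead of @timestamp):

filter {
  json {
    source => "message"
  }
  date {
    # "UNIX" matches epoch seconds; use "UNIX_MS" for milliseconds
    match => [ "ts", "UNIX" ]
    remove_field => [ "ts" ]
  }
}

Since ts isn't a reserved field, the json filter never attempts any timestamp coercion, and the date filter sets @timestamp in one pass.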

It is not documented as far as I can see. You would have to dig into the source to see how coercion to a DateTime works.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.