Overwriting json @timestamp

Hello, I am sending JSON logs to Logstash and I want to overwrite the @timestamp field.

To do this, I set the following in my Filebeat configuration:

json.keys_under_root: true
json.overwrite_keys: true

The log entries go to Logstash, but in Kibana the @timestamp field is not overwritten, and I see the error:

@timestamp not overwritten (parse error on 2017-06-02T21:40:59+0000)

It looks like Go's RFC3339 time parser is failing on that timestamp. I was able to recreate the issue; the full error is:

parsing time "2017-06-02T21:40:59+0000" as "2006-01-02T15:04:05Z07:00": cannot parse "+0000" as "Z07:00"

Maybe the time parser should be a bit more robust and try a few more common formats.

For now you will need to use Logstash to handle this.

Thank you!

Do you have any suggestions on how I can fix this with logstash?

One way would be to do the JSON decoding in Logstash. Then apply a date filter.

filter {
  json {
    source => "message"
  }
  date {
     # Add config here for parsing the date.
  }
}

https://www.elastic.co/guide/en/logstash/current/plugins-filters-json.html#plugins-filters-json-source
https://www.elastic.co/guide/en/logstash/current/plugins-filters-date.html#plugins-filters-date-target
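For example, the date block could be filled in roughly like this. This is only a sketch: ts_field is a placeholder for whatever field your timestamp ends up in after JSON decoding, and the explicit Joda-Time pattern is there because a plain Z in Joda should also accept offsets written without a colon, such as +0000:

filter {
  json {
    source => "message"
  }
  date {
    # "ISO8601" covers forms like 2017-06-02T21:40:59+00:00; the explicit
    # pattern should also accept the colon-less offset +0000.
    match  => [ "ts_field", "ISO8601", "yyyy-MM-dd'T'HH:mm:ssZ" ]
    # Write the parsed value into @timestamp (this is also the default target).
    target => "@timestamp"
  }
}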

I added this now, but I still get the same error message:

input {
  beats {
    port => 5445
    codec => "json"
    ssl => true
    ssl_certificate => "/etc/logstash/logstash.crt"
    ssl_key => "/etc/logstash/logstash.key"
  }
}

filter {
  json {
    source => "message"
  }
  date {
    match => [ "timestamp", "ISO8601" ]
  }
}

Try using @timestamp instead of timestamp in the date filter's match.
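That is, something along these lines (only the field name in the match changes; the rest stays as you have it):

date {
  match => [ "@timestamp", "ISO8601" ]
}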

This made no difference.

Remove that line.


Hi, thanks for the suggestion, but this did not make any difference.

Did you disable the JSON parsing on the Beats side?

Yes, tried with and without.

Can this be due to the timestamp being

2017-06-02T21:40:59+0000

And not

2017-06-02T21:40:59+00:00

Which is the correct ISO8601 format?
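If that is the problem, maybe an explicit pattern instead of the ISO8601 keyword would help? Just a guess on my part, assuming the value ends up in @timestamp after decoding, since a Joda-Time Z is supposed to accept offsets without a colon:

date {
  # Guess: explicitly match the colon-less offset form, e.g. +0000.
  match => [ "@timestamp", "yyyy-MM-dd'T'HH:mm:ssZ" ]
}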
