Error parsing timestamp

Hi everyone,

I have a weird log-parsing error that I haven't been able to solve; maybe someone here can assist me.

I have a docker container shipping logs in this (simple?) format:

2021-06-14 08:30:14 ERROR Some message

I am shipping them to Logstash using filebeat with this config:

    filebeat.autodiscover:
      providers:
        - type: docker
          templates:
            - condition:
              config:
                - type: container
                  paths:
                    - /var/lib/docker/containers/${}/*.log

In Logstash my pipeline config looks like this:

filter {
    if [container][name] == "Home-Assistant" {
        grok {
            match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} %{LOGLEVEL:level} %{GREEDYDATA:message}" }
            # (%{DATA:thread}) \[%{DATA:class}\] (?<message>(.|\r|\n)*)
            overwrite => ["message"]
        }
    }
}
Parsing this line fails with "Could not index event to Elasticsearch." and the following error in the Logstash log:

"error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [timestamp] of type [date] in document with id 'c542CXoBwxcEamlAvRaR'. Preview of field's value: '2021-06-14 08:30:14'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2021-06-14 08:30:14] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}

I understand that there is some problem parsing the timestamp but I have no idea why. Any help is appreciated.

P.S. : This line is parsed perfectly fine in grok debugger.

You have told elasticsearch (possibly deliberately, possibly by accident) that [timestamp] should be a date. When elasticsearch parses a date it accepts two formats by default. One is epoch_millis, which expects a long integer; the other is strict_date_optional_time, which is a "generic ISO datetime parser, where the date must include the year at a minimum, and the time (separated by T) is optional. Examples: yyyy-MM-dd'T'HH:mm:ss.SSSZ or yyyy-MM-dd".

"2021-06-14 08:30:14" does not match either of those.
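You can see this with a quick test against a fresh index (the index name and document ids here are purely illustrative). The first request succeeds and dynamic mapping types [timestamp] as a date; the second then fails with the same mapper_parsing_exception, because only the separator differs:

    PUT test/_doc/1
    { "timestamp": "2021-06-14T08:30:14" }

    PUT test/_doc/2
    { "timestamp": "2021-06-14 08:30:14" }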

If you have configured a template to tell elasticsearch that the [timestamp] field is a date, then you could adjust the template to tell it what format to expect.
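As a sketch, an index template along these lines would do it (the template name and index pattern are placeholders; adjust to your setup). Listing the custom format first, with the defaults after it, keeps epoch_millis and ISO timestamps working too:

    PUT _index_template/home-assistant-logs
    {
      "index_patterns": ["home-assistant-*"],
      "template": {
        "mappings": {
          "properties": {
            "timestamp": {
              "type": "date",
              "format": "yyyy-MM-dd HH:mm:ss||strict_date_optional_time||epoch_millis"
            }
          }
        }
      }
    }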

Alternatively you could use a date filter to parse and overwrite the [timestamp] field (using the target option), and then logstash will send the field to elasticsearch as an ISO8601 timestamp, which strict_date_optional_time accepts.
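A minimal date filter for this format might look like this (the field name matches the one your grok pattern extracts):

    filter {
        date {
            match => ["timestamp", "yyyy-MM-dd HH:mm:ss"]
            target => "timestamp"
        }
    }

With target set, the parsed value overwrites [timestamp]; if you omit target, the result goes to @timestamp instead.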
