Date filter cannot parse a date-formatted field

Version: logstash 5.6.3

We have two date filters in our Logstash config.
The first parses the timestamp field we matched in our logs; since we always want the same timestamp format, we use the date filter to normalize it.
The second sets @timestamp to timestamp for each event, if a timestamp was parsed. But this fails once the first date filter has already altered timestamp.

I created a little test case to reproduce the issue.

input { stdin {} }
filter {
  grok {
    match => ["message", "%{TIMESTAMP_ISO8601:timestamp} %{WORD}"]
  }
  # trying to fix timestamp parsing
  date {
    match => [ "timestamp", "ISO8601", "UNIX", "UNIX_MS", "TAI64N", "yyyy-MM-dd HH:mm:ss", "MMM dd YYYY HH:mm:ss", "YYYY/MM/dd HH:mm:ss", "dd/MMM/YYYY:HH:mm:ss Z" ]
    target => "timestamp"
  }
  # if we have a valid timestamp, set it as @timestamp instead of Logstash's own parse time
  date {
    match => [ "timestamp", "ISO8601" ]
    target => "@timestamp"
  }
}
output { stdout { codec => rubydebug } }

# echo '2011-04-19T03:44:01.103Z blabla' | /usr/share/logstash/bin/logstash -f test_logstash.config
WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
{
      "@version" => "1",
          "host" => "test-host",
    "@timestamp" => 2017-10-25T10:06:16.527Z,
       "message" => "2011-04-19T03:44:01.103Z blabla",
     "timestamp" => 2011-04-19T03:44:01.103Z,
          "tags" => [
        [0] "_dateparsefailure"
    ]
}

The immediate reason for the failure is that timestamp is already a Timestamp object (the first date filter converted it), and the date filter can only parse strings. You could use a mutate filter to convert timestamp back to a string.
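A minimal sketch of that conversion, assuming it sits between your two date filters (mutate's convert to "string" simply stringifies the value):

mutate {
  # turn the Timestamp object back into an ISO8601 string
  # so the second date filter can parse it again
  convert => { "timestamp" => "string" }
}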

But I don't understand the difference between your @timestamp and timestamp fields. Why do you want to copy timestamp to @timestamp?

Yes, you are right, we should only use @timestamp.

But if we have both fields, we need two date entries to avoid running into Elasticsearch exceptions like this one:

[2017-10-25T12:25:37,686][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2017.10.25_12", :_type=>"rg-filter", :_routing=>nil}, 2017-10-25T12:25:37.665Z node1.domain.net Updating cluster description to {type=SHARDED, servers=[{address=node1.domain:port, type=SHARD_ROUTER, roundTripTime=1.1 ms, state=CONNECTED}]
], :response=>{"index"=>{"_index"=>"logstash-2017.10.25_12", "_type"=>"rg-filter", "_id"=>"AV9TfsSSUQ4zGMlkXWni", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [timestamp]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"1508934337.665\" is malformed at \"934337.665\""}}}}}

That error means that timestamp has been mapped as something that isn't compatible with the value 1508934337.665. What does that field contain? A timestamp? Then use the date filter to parse it.
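If it is a timestamp, a minimal sketch, assuming the field holds epoch seconds like the 1508934337.665 in your error:

date {
  # the "UNIX" pattern parses seconds since epoch,
  # including a fractional part like .665
  match => [ "timestamp", "UNIX" ]
  # no target set, so the parsed value goes into @timestamp
}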
