_dateparsefailure - Failing to parse timestamp field into @timestamp

Hello,
I've been dealing with this problem for quite some time now, and after reading all the related topics I still haven't found a solution. I am trying to replace the @timestamp field with my log's timestamp, which has the pattern:
2020-04-13T13:51:30,127+0300.
I am sending the logs through a TCP socket from a Maven Log4j2 application.

My logstash configuration is the following:

input {
  tcp {
    port => 12345
    type => "log4j2"
  }
}

filter {
  csv {
    columns => ["timestamp", "severity", "host", "pid", "thread", "app", "message", "uuid"]
    separator => "|"
    skip_empty_columns => true
  }
  date {
    locale => "en"
    timezone => "Europe/Bucharest"
    match => ["timestamp", "yyyy-MM-dd'T'HH:mm:ss,SSSZ"]
    remove_field => ["timestamp"]
  }
}

output {
  elasticsearch {
    hosts => "elasticsearch:9200"
    index => "elk"
  }
}

I keep getting the _dateparsefailure tag in Kibana. I have also tried matching with "ISO8601", but that failed as well. If anyone has an idea about what the problem could be, please help. Thank you!

Hello @rsustic

Would it be possible to temporarily disable the elasticsearch output and use stdout { codec => rubydebug } instead to see the content of the timestamp field?

Yes, of course.

{
        "_index" : "elk",
        "_type" : "doc",
        "_id" : "xFztcnEBEb-YhQoKv5jg",
        "_score" : 1.0,
        "_source" : {
          "type" : "log4j2",
          "@version" : "1",
          "uuid" : " f60480d7-52f7-49a4-bf26-d4e9bd9cce5a",
          "host" : " FRONB100139",
          "severity" : " DEBUG ",
          "thread" : " [main] ",
          "tags" : [
            "_dateparsefailure"
          ],
          "message" : " This is a debug message ",
          "port" : 57872,
          "app" : " LogGenerator.App ",
          "@timestamp" : "2020-04-13T09:45:14.982Z",
          "timestamp" : "2020-04-13T12:45:16,743+0300 ",
          "pid" : " P2904 "
        }
}

I am wondering if the trailing space after the timestamp value is triggering the problem...

Could you try to add the following filter just after the csv filter?

mutate {
  strip => ["timestamp", "pid", "severity", "thread", "app", "message"]
}
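The effect of that trailing space is easy to reproduce outside Logstash. Here is a minimal sketch in Python (just an analogy, not the actual Logstash code path), using strptime's rough equivalent of the `yyyy-MM-dd'T'HH:mm:ss,SSSZ` pattern:

```python
from datetime import datetime

# Python's rough equivalent of yyyy-MM-dd'T'HH:mm:ss,SSSZ
FMT = "%Y-%m-%dT%H:%M:%S,%f%z"

raw = "2020-04-13T12:45:16,743+0300 "  # note the trailing space from the CSV split

try:
    datetime.strptime(raw, FMT)
    parsed = True
except ValueError:
    parsed = False  # the stray space makes the parse fail

# Stripping the whitespace first (what mutate { strip => [...] } does) fixes it
ts = datetime.strptime(raw.strip(), FMT)

print(parsed)          # False
print(ts.isoformat())  # 2020-04-13T12:45:16.743000+03:00
```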

Also, since the timezone offset is already provided in the date string itself, I think you can remove:

locale => "en"
timezone => "Europe/Bucharest"
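To illustrate (again in Python as a stand-in): the `+0300` suffix in the string already fixes the offset, so the parsed instant resolves to the same UTC time without any external timezone hint:

```python
from datetime import datetime, timezone

# The +0300 suffix already carries the offset, so no external
# timezone setting is needed to resolve the instant.
ts = datetime.strptime("2020-04-13T12:45:16,743+0300",
                       "%Y-%m-%dT%H:%M:%S,%f%z")

print(ts.astimezone(timezone.utc).isoformat())
# 2020-04-13T09:45:16.743000+00:00 -- the same UTC instant stored in @timestamp
```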

I haven't had a chance to test it on my system, but it should solve the problem.

Indeed, it solved the problem. Thank you so much!!! I've been dealing with this for God knows how long.


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.