Failed to parse date - Elasticsearch 2.2 / Logstash 2.2

Hi,

I have two log files with different date formats, and two config files in my conf.d dir, one for each log file: 01-file.conf for the first log file and 02-file.conf for the second.

The date field in my first log file looks like 2015-09-26 03:45:12,400, and in 01-file.conf I use this date filter:

date {
  match => ["logtime", "YYYY-MM-dd HH:mm:ss,SSS"]
}

In my second log file, the date field looks like 2015-09-26T03:45:12.400+0100, and in 02-file.conf I use this date filter:

date {
  match => ["logtime", "YYYY-MM-dd'T'HH:mm:ss.SSSZ"]
}

In both config files I first use the same grok pattern, %{TIMESTAMP_ISO8601:logtime}, to extract the date.
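For reference, the grok part looks roughly like this in both configs (the %{GREEDYDATA:msg} part is just a stand-in for the rest of my real pattern):

grok {
  match => ["message", "%{TIMESTAMP_ISO8601:logtime} %{GREEDYDATA:msg}"]
}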

Now, when I run Logstash with 01-file.conf first and 02-file.conf next, everything works fine. But when I run 02-file.conf first and 01-file.conf next, I get the following error:

… "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [logtime]", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"Invalid format: \"2016-02-29 03:58:00,077\" is malformed at \" 03:58:00,077\""}}}}, :level=>:warn}

I don't understand where the error comes from.
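To check exactly what Logstash sends for logtime, I can temporarily dump the events to the console (just a debugging sketch added to my output section):

output {
  stdout { codec => rubydebug }
}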

I use Logstash 2.2, Elasticsearch 2.2, and Kibana 4.4.1 with the default configurations, on a CentOS 7 VM.

Thanks in advance.

EDIT:
I added the following mutate filters before my date filter to change the date format:

01-file.conf

mutate {
  gsub => ["logtime", ",", "."]
}
mutate {
  gsub => ["logtime", " ", "T"]
}
date {
  match => ["logtime", "YYYY-MM-dd'T'HH:mm:ss.SSS"]
}
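
The two gsub calls could also be collapsed into a single mutate with both substitutions; just a variation, the result is the same:

mutate {
  gsub => [
    "logtime", ",", ".",
    "logtime", " ", "T"
  ]
}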

> I have two log files with different date formats, and two config files in my conf.d dir, one for each log file: 01-file.conf for the first log file and 02-file.conf for the second.

And you have conditionals around those filters, right? Because all filters apply to all messages unless wrapped in a conditional.
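Something along these lines, where the type (the names here are only examples, use whatever your inputs actually set) decides which filters run:

filter {
  if [type] == "firstlog" {
    grok {
      match => ["message", "%{TIMESTAMP_ISO8601:logtime} %{GREEDYDATA:msg}"]
    }
    date {
      match => ["logtime", "YYYY-MM-dd HH:mm:ss,SSS"]
    }
  }
}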

> I added the following mutate filters before my date filter to change the date format:

And this resolved the issue?

Since you're running the date filter to parse the timestamp into the @timestamp field, do you even need to keep the logtime field?
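If not, the date filter can delete it for you once the parse succeeds, e.g.:

date {
  match => ["logtime", "YYYY-MM-dd'T'HH:mm:ss.SSS"]
  remove_field => ["logtime"]
}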

Yes, I have conditionals around those filters, for the reason you explained.

Yes, this solved my issue. I don't know why Elasticsearch doesn't like the spaces sometimes ...

Yes, I prefer to keep my logtime field even though I use the @timestamp field in Kibana (both are identical).