In Elastic Stack 6.7 the log file is parsed successfully by Logstash and I can see the output in Elasticsearch. Below is the index info from Elasticsearch: it created an index with 12 documents (the log file has 12 lines).
green open mflow-2019.06.13 JOv2398RQaab2RYu6EehqA 3 1 12 0 151.8kb 75.9kb
However, in Elastic Stack 7.0 I receive the parsing error below in Logstash for the same log line that parsed successfully in 6.7:
[2019-06-13T13:23:17,587][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"mflow-2019.06.13", :_type=>"mflow", :routing=>nil}, #<LogStash::Event:0xba49170>], :response=>{"index"=>{"_index"=>"mflow-2019.06.13", "_type"=>"mflow", "_id"=>"PhoBUWsBDHqLRRnGvp7M", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [log.datetime] of type [date] in document with id 'PhoBUWsBDHqLRRnGvp7M'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2019-06-13 00:00:00,711] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}
None of the documented formats seems to allow a space between the date and time; they all require a literal 'T'. I wonder whether this change resulted in the stricter interpretation. Either way, this is a question about Elasticsearch, not Logstash.
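The space-vs-'T' difference is easy to reproduce with java.time, which Elasticsearch 7.x uses for date parsing. A minimal sketch (class name is mine): the ISO parser, roughly analogous to strict_date_optional_time, rejects the space-separated value from the log, while an explicit pattern with a space and comma parses it.

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;

public class SpaceVsT {
    public static void main(String[] args) {
        String logTs = "2019-06-13 00:00:00,711";  // timestamp from the log line above

        // The ISO-8601 parser requires the 'T' separator, so the space makes it fail
        try {
            LocalDateTime.parse(logTs, DateTimeFormatter.ISO_LOCAL_DATE_TIME);
            System.out.println("ISO parser accepted it");
        } catch (DateTimeParseException e) {
            System.out.println("ISO parser rejected it");
        }

        // An explicit pattern with a space and a comma parses the same value
        DateTimeFormatter custom = DateTimeFormatter.ofPattern("yyyy-MM-dd HH:mm:ss,SSS");
        System.out.println(LocalDateTime.parse(logTs, custom));  // 2019-06-13T00:00:00.711
    }
}
```

So a mapping whose date format includes an explicit `yyyy-MM-dd HH:mm:ss,SSS` entry (or a Logstash date filter that converts the field before indexing) should get past the `strict_date_optional_time||epoch_millis` default.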
In my case, I have even supplied the format in the field definition itself:
[2019-06-18T19:12:32,923][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"vm-autopatrol-status-2019.06.18", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x7e64a599>], :response=>{"index"=>{"_index"=>"vm-autopatrol-status-2019.06.18", "_type"=>"_doc", "_id"=>"PR0BbGsBHgo23YjuR8BO", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [ts] of type [date] in document with id 'PR0BbGsBHgo23YjuR8BO'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2019-06-03T18:00:52.432-04:00] with format [yyyy-MM-dd HH:mm:ss|| date_hour_minute_second_fraction||yyyy-MM-dd ||epoch_millis||epoch_second || date_hour_minute_second_millis || basic_date_time || yyyy-MM-dd'T'HH:mm:ss.SSS'Z' || yyyy-MM-dd'T'HH:mm:ss.SSSZ || MM/dd/yyyy HH:mm:ss a]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}
If you look at the yyyy-MM-dd'T'HH:mm:ss.SSS'Z' format, it should satisfy 2019-06-03T18:00:52.432-04:00. I don't know what the issue is.
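For what it's worth, in java.time (which 7.x uses) 'Z' in single quotes is the literal character Z, not an offset token, so that pattern can only match strings that literally end in Z. A hedged sketch (class name is mine) showing that an offset token such as XXX is what matches the trailing -04:00:

```java
import java.time.LocalDateTime;
import java.time.OffsetDateTime;
import java.time.format.DateTimeFormatter;
import java.time.format.DateTimeParseException;

public class OffsetDemo {
    public static void main(String[] args) {
        String ts = "2019-06-03T18:00:52.432-04:00";  // value from the error above

        // 'Z' in single quotes is a literal Z, so "-04:00" cannot match it
        try {
            LocalDateTime.parse(ts, DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSS'Z'"));
        } catch (DateTimeParseException e) {
            System.out.println("literal-'Z' pattern rejected the value");
        }

        // XXX is the java.time token for an ISO offset like -04:00
        DateTimeFormatter withOffset = DateTimeFormatter.ofPattern("yyyy-MM-dd'T'HH:mm:ss.SSSXXX");
        System.out.println(OffsetDateTime.parse(ts, withOffset));  // 2019-06-03T18:00:52.432-04:00
    }
}
```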
Added the aforementioned format to the mappings; it's still not able to parse:
[2019-06-18T21:03:58,260][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"vm-autopatrol-status-2019.06.18", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x661901bc>], :response=>{"index"=>{"_index"=>"vm-autopatrol-status-2019.06.18", "_type"=>"_doc", "_id"=>"qhpnbGsBWba35rWfTJvI", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [ts] of type [date] in document with id 'qhpnbGsBWba35rWfTJvI'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2019-04-15T16:26:37.716-04:00] with format [yyyy-MM-dd HH:mm:ss|| date_hour_minute_second_fraction||yyyy-MM-dd ||epoch_millis||epoch_second || date_hour_minute_second_millis || basic_date_time || yyyy-MM-dd'T'HH:mm:ss.SSS'Z' || MM/dd/yyyy HH:mm:ss a || yyyy-MM-dd'T'HH:mm:ss.SSSZZ]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"date_time_parse_exception: Failed to parse with all enclosed parsers"}}}}}}
It looks like a stricter interpretation is indeed in place.
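If the stricter java.time rules are the cause, one hedged workaround sketch is to declare the date format with an offset token in the mapping. The index and field names below are taken from the log above; the format list itself is my assumption, not the one from the error:

```json
PUT vm-autopatrol-status-2019.06.18
{
  "mappings": {
    "properties": {
      "ts": {
        "type": "date",
        "format": "yyyy-MM-dd'T'HH:mm:ss.SSSXXX||yyyy-MM-dd HH:mm:ss,SSS||strict_date_optional_time||epoch_millis"
      }
    }
  }
}
```

The XXX token should accept values ending in an offset such as -04:00, while the remaining entries keep the defaults and the space-separated log format working.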