I upgraded ELK from 7.15 to 8.0; the basic configuration is unchanged. But in 8.0 the CSV pipeline suddenly broke during a unit test, failing to process a line that worked perfectly fine in 7.15.
The error is:
"caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [FOB] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}
The field that causes this is definitely not a datetime (it contains the string "FOB"), and it is in the first data line of the CSV file. Comparing the mappings that Logstash created, the field is indeed mapped as a date in 8.0, while it is text in 7.15.
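To convince myself the value really is unparseable as a date, I mocked the check in plain Python. This is only a rough stand-in for Elasticsearch's `strict_date_optional_time||epoch_millis` (ISO-8601 first, then epoch milliseconds), not the real parser:

```python
from datetime import datetime, timezone

def parses_like_es_date(value: str) -> bool:
    # Rough stand-in for strict_date_optional_time||epoch_millis:
    # try ISO-8601 first, then epoch milliseconds.
    try:
        datetime.fromisoformat(value)
        return True
    except ValueError:
        pass
    try:
        datetime.fromtimestamp(int(value) / 1000, tz=timezone.utc)
        return True
    except (ValueError, OverflowError):
        return False

print(parses_like_es_date("FOB"))          # False
print(parses_like_es_date("2022-02-15"))   # True
```

So "FOB" can never satisfy a date mapping; the question is why 8.0 decided the field is a date in the first place.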
Mapping in 7.15:
"seg22": {
  "type": "text",
  "fields": {
    "keyword": {
      "type": "keyword",
      "ignore_above": 256
    }
  }
},
Mapping in 8.0:
"seg22": {
  "type": "date"
},
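As a possible workaround I am looking at pinning the field type with an index template created before the index exists, so dynamic date detection cannot kick in. A sketch (template name and index pattern here are hypothetical, the mapping is the 7.15 one from above):

```json
PUT _index_template/my-csv-template
{
  "index_patterns": ["my-csv-index-*"],
  "template": {
    "mappings": {
      "date_detection": false,
      "properties": {
        "seg22": {
          "type": "text",
          "fields": {
            "keyword": { "type": "keyword", "ignore_above": 256 }
          }
        }
      }
    }
  }
}
```

But I would still like to understand what changed between the versions, since I did not touch any mapping or template myself.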
The csv filter configuration in Logstash is a very simple one with nothing fancy, and nothing was changed during the upgrade. The data file used for the unit test is also identical in both ELK versions.
csv {
  separator => "|"
  skip_header => "true"
  columns => ["seg1","seg2","seg3",...]
}
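For reference, the split itself is trivial; a stand-alone Python check (the column names and sample line are hypothetical, only the `|` separator comes from my config) confirms the value lands in the expected column:

```python
import csv
import io

# Hypothetical three-column sample; the real file has many more "seg" columns.
line = "A123|FOB|2022-01-31\n"
row = next(csv.reader(io.StringIO(line), delimiter="|"))
record = dict(zip(["seg1", "seg22", "seg30"], row))
print(record["seg22"])  # FOB
```

So the parsing side looks fine; the failure happens only when Elasticsearch tries to index the parsed event.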
Any help will be much appreciated.