Hi guys.
Below is my Logstash configuration:
input {
  file {
    path => "/tmp/file.csv"
    type => "myindex"
    start_position => "beginning"
  }
}
filter {
  if ([type] == "myindex") {
    csv {
      skip_header => "true"
      separator => ","
      columns => ["time","f1","f2","f3","f4"]
      remove_field => ["host"]
      remove_field => ["message"]
      remove_field => ["path"]
    }
    mutate { convert => ["f1", "float"] }
    mutate { convert => ["f2", "integer"] }
    mutate { convert => ["f3", "integer"] }
    mutate { add_field => { "eventdate" => "%{+YYYY.MM.dd} %{time}" } }
    date {
      locale => "en"
      match => ["eventdate", "YYYY.MM.dd HH:mm:ss.SSS"]
      target => "eventdate"
    }
  }
}
output {
  if [type] == "myindex" {
    elasticsearch {
      hosts => "myip:myport"
      index => "myindex_%{+yyyy}%{+MM}"
      document_id => "%{+yyyy}%{+MM}%{+dd}%{time}%{f2}"
    }
  }
}
The input { } and output { } sections are standard.
In my filter { }, I define the separator and skip the header row, remove the fields created by Logstash (host, message, path), convert a few fields to the right types, and build eventdate from my %{time} field (which looks like HH:mm:ss.SSS) by prepending the current date.
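For reference, here is roughly what a line of the file looks like and what the filter is supposed to produce from it (the values below are made up for illustration, only the layout matches my real data, and I assume the event is processed on 2019-07-15):
  input line (time,f1,f2,f3,f4):  08:13:46.234,1.5,42,7,val1
  after csv + mutate:             time = "08:13:46.234", f1 = 1.5, f2 = 42, f3 = 7, f4 = "val1"
  after add_field:                eventdate = "2019.07.15 08:13:46.234"
  after the date filter:          eventdate = 2019-07-15T08:13:46.234Z (exact offset depends on the Logstash host timezone)
  document_id in the output:      "2019071508:13:46.23442" (%{+yyyy}%{+MM}%{+dd} + %{time} + %{f2})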
I'm seeing some very strange behavior with this configuration; something seems to go wrong but I can't figure out what or why. I occasionally get errors like the ones below:
[2019-07-15T08:13:46,234][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"20190715D1.4671", :_index=>"myindex_201907", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x22a6398>], :response=>{"index"=>{"_index"=>"myindex_201907", "_type"=>"_doc", "_id"=>"20190715D1.4671", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [eventdate] of type [date] in document with id '20190715D1.vi '", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2019.07.15 D] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}
[2019-07-15T08:41:39,853][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"2019071548:56.000val1", :_index=>"myindex_201907", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x985bc66>], :response=>{"index"=>{"_index"=>"myindex_201907", "_type"=>"_doc", "_id"=>"2019071548:56.000val1", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [eventdate] of type [date] in document with id '2019071548:56.000val1'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2019.07.15 48:56.000] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}
[2019-07-15T08:40:55,321][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"20190715%{time}1", :_index=>"myindex_201907", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x7c4e553>], :response=>{"index"=>{"_index"=>"myindex_201907", "_type"=>"_doc", "_id"=>"20190715%{time}1", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [eventdate] of type [date] in document with id '20190715%{time}1'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2019.07.15 %{time}] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}
[2019-07-15T08:46:25,846][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"20190715val2val3", :_index=>"myindex_201907", :_type=>"_doc", :routing=>nil}, #<LogStash::Event:0x22681dc7>], :response=>{"index"=>{"_index"=>"myindex_201907", "_type"=>"_doc", "_id"=>"20190715val2val3", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [eventdate] of type [date] in document with id '20190715val2val3'", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"failed to parse date field [2019.07.15 val2] with format [strict_date_optional_time||epoch_millis]", "caused_by"=>{"type"=>"date_time_parse_exception", "reason"=>"Failed to parse with all enclosed parsers"}}}}}}
It looks like Logstash sometimes fails to parse my lines correctly, even though I've checked the format of my files several times and they all look fine. The worst part is that with the same config and the same file I can't reproduce a given error: it only happens occasionally and at random. I don't really know what to do here.
Thanks in advance
Guillaume