Hi,
I have a weird issue with Logstash 5.1.2 on RHEL 7.
I rolled out a new Logstash filter for two logfiles. In the first version I made a mistake and deployed it: I accidentally cast a string field to integer, even though the string should be kept as-is.
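I no longer have the broken version, but the mistake was effectively this kind of cast (a sketch, not the full filter; the field name is the one from the error below):

filter {
  mutate {
    # this convert was the accident - serviceName should have stayed a string
    convert => { "serviceName" => "integer" }
  }
}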
Then I stopped Logstash, removed the cast from the filter, and restarted Logstash. After that I deleted all existing docs for these two types.
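The cleanup was something along the lines of a delete-by-query per type (simplified):

POST tux-prod-2018.02.05/vichandler_statistics/_delete_by_query
{
  "query": { "match_all": {} }
}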
But now the field ends up empty, and in the Logstash logs I get the following exception:
[2018-02-05T15:28:44,110][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"tux-prod-2018.02.05", :_type=>"vichandler_statistics", :_routing=>nil}, 2018-02-05T14:28:30.000Z LOGIPRODTUX11 %{message}], :response=>{"index"=>{"_index"=>"tux-prod-2018.02.05", "_type"=>"vichandler_statistics", "_id"=>"AWFmXn2YC4m7XMTfoDJV", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse [serviceName]", "caused_by"=>{"type"=>"number_format_exception", "reason"=>"For input string: \"vic_handler\""}}}}}
In my config, grok is now only extracting the value for serviceName; there is no numeric conversion on that field any longer.
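The relevant part of the grok match is just a plain text capture, something like this (pattern simplified; the fields other than serviceName are made up for illustration):

filter {
  grok {
    # serviceName is captured as text - no :int suffix, no mutate/convert
    match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{WORD:serviceName} %{GREEDYDATA:rest}" }
  }
}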
If I check the mapping with GET tux-prod-2018.02.05/vichandler_statistics/_mapping, I get the following result for serviceName:
"serviceName": {
"type": "text",
"fields": {
"keyword": {
"type": "keyword",
"ignore_above": 256
}
}
What can I do to fix this? An additional restart of Logstash did not help.
In my Elastic dev environment everything works fine, just not in production.