Could not index event to Elasticsearch 400

Hi All,

I have upgraded Filebeat and the ELK stack from 6.2.4 to 6.6.1 and ended up with the error message below in the Logstash log:

"[2019-04-18T17:24:04,887][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2019.04.18", :_type=>"doc", :routing=>nil}, #LogStash::Event:0x5376923c], :response=>{"index"=>{"_index"=>"logstash-2019.04.18", "_type"=>"doc", "_id"=>"uxpLMGoBeNmtE8up8rvR", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [host] tried to parse field [host] as object, but found a concrete value"}}}}"

I have seen the fix below suggested, but the problem is that I have 6 filter .conf files and I am not sure in which file to place it:

    filter {
      mutate {
        remove_field => [ "[host]" ]
      }
    }

Filebeat 6.6 exports a host.* field group out of the box: https://www.elastic.co/guide/en/beats/filebeat/6.6/exported-fields-host-processor.html
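For illustration (the hostname is made up, and other host.* keys are omitted), the mapping conflict looks roughly like this in event output: your old events carried host as a concrete value, while 6.6 events carry an object:

    # event from Filebeat 6.2.4: host is a plain string
    "host" => "web-01"

    # event from Filebeat 6.6: host is an object
    "host" => {
        "name" => "web-01"
    }

The index mapping was created from one shape, so documents arriving in the other shape are rejected with status 400.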

You can either drop the host field in the Filebeat config on the log source, or handle it in your filter .conf files (see the sketch below).
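On the Logstash side, note that Logstash concatenates every .conf file in the pipeline directory into a single pipeline, so the filter can live in any one of your six files. Here is a sketch that converts the old value instead of dropping it; the [host][name] target is my assumption based on the 6.6 host.* schema:

    filter {
      # Events from old Beats send host as a plain string; events from
      # Filebeat 6.6 already have host.name, so only touch events where
      # it is missing.
      if ![host][name] {
        mutate {
          # rename removes the old field first and then writes the
          # sub-field, so the string becomes an object that matches
          # the new host.* mapping
          rename => { "[host]" => "[host][name]" }
        }
      }
    }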

Using a stdout output would let you figure out the culprit field mismatch.
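For example, a temporary output like this (a sketch; add it next to your existing elasticsearch output while debugging) prints every event so you can see what shape the host field arrives in:

    output {
      # dump each event to the console in a readable form
      stdout {
        codec => rubydebug
      }
    }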

Alternatively, create a new index in Elasticsearch with host mapped as an object rather than a string.
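Something along these lines, as a sketch (the index name and keyword type are placeholders; in Elasticsearch 6.x the properties sit under the doc mapping type):

    curl -XPUT "localhost:9200/logstash-2019.04.19" -H 'Content-Type: application/json' -d'
    {
      "mappings": {
        "doc": {
          "properties": {
            "host": {
              "properties": {
                "name": { "type": "keyword" }
              }
            }
          }
        }
      }
    }'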

Hi Olatunde,

I have commented out all the host values in fields.yml in the Filebeat config.
I did it for just one host and it is working fine.

Shall I make the change on all the servers?

Regards
Nandha
