Logstash cannot accept the events

For some time now, Logstash has not been able to accept events from other hosts. It just fills the disk with log entries like:
[2018-07-12T01:14:05,241][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"filebeat-2018.07.11", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x48a8f43>], :response=>{"index"=>{"_index"=>"filebeat-2018.07.11", "_type"=>"doc", "_id"=>"RXGfi2QB4MOEB4eqhmGr", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [host] tried to parse field [host] as object, but found a concrete value"}}}}

What could be the reason?

Known problem with 6.3 of the Elastic stack. Search for "object mapping for [host] tried to parse field [host] as object, but found a concrete value" in the archives for details.
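One way to confirm the conflict (a sketch only; the index name is taken from the error above, and Elasticsearch listening on localhost:9200 is an assumption) is to check how the daily index currently maps the host field:

    # Look at properties.host in the output: if it is mapped as an object
    # (with sub-fields such as host.name), any event that still ships host
    # as a plain string gets rejected with the 400 mapper_parsing_exception
    # shown in the question.
    curl -s 'http://localhost:9200/filebeat-2018.07.11/_mapping?pretty'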

I guess the problem appeared after we added a new node and installed Filebeat 6.3.x.
Yesterday I upgraded all the Filebeat instances in the infrastructure, but the problem is still there.

I found the topic Problem with transfer Filebeat 6.1.3 > Logstash 6.1.3 > Elasticsearch 6.1.3. There they recommend mutating the fields so that the events fit the new index schema.

The recommendations there describe what to do; a rough sketch of that kind of mutate is below.
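This is not the exact config from that topic, only a minimal sketch of the kind of mutate it describes, assuming the rejected events still ship host as a plain string (pre-6.3 style) while the index mapping already expects an object:

    filter {
      # Sketch only: if [host] exists but [host][name] does not, the event
      # carries host as a plain string; move it under [host][name] so it
      # matches the object mapping created by 6.3-style events.
      if [host] and ![host][name] {
        mutate {
          rename => { "[host]" => "[host][name]" }
        }
      }
    }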
