Error parsing file from filebeat

I have 2 web servers already running Filebeat - everything works as expected and I can see the indices. I have now added a third web server. The issue is that I configured Filebeat exactly as on the previous servers, but for some reason I keep seeing this error:

[2019-05-14T12:11:42,085][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2019.05.14", :_type=>"doc", :routing=>nil}, #&lt;LogStash::Event:0x4f0fc4a7&gt;], :response=>{"index"=>{"_index"=>"logstash-2019.05.14", "_type"=>"doc", "_id"=>"R_1BtmoB45ihaa1VbEeD", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse field [agent] of type [text]", "caused_by"=>{"type"=>"illegal_state_exception", "reason"=>"Can't get text on a START_OBJECT at 1:177"}}}}}

Could someone tell me what I may be doing wrong?

Are you using a useragent filter to parse a string field called agent and overwrite it with the parsed agent object? If so, that is not working correctly for your third web server. In Elasticsearch a field cannot sometimes be a string and sometimes an object: once the day's index has mapped agent as text, any event where agent arrives as an object is rejected with a mapper_parsing_exception like the one above.

There should be a more informative message in the Elasticsearch log.

You could try changing the name of the string field that contains the user-agent information so that there is no conflict.
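For example, one common way to avoid the conflict is to have the useragent filter write its parsed object to a separate field instead of overwriting the original string. This is just a sketch, assuming you are using the standard Logstash useragent filter plugin; the source field name agent and the target name user_agent are illustrative and should match your own pipeline:

```
filter {
  useragent {
    # read the raw user-agent string from the existing field
    source => "agent"
    # write the parsed object to a new field so the original
    # string field's mapping is never in conflict
    target => "user_agent"
  }
}
```

With a layout like this, agent stays a plain string everywhere and the structured data lives under user_agent, so the mapping is consistent across all three servers.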
