Filebeat giving "mapper_parsing_exception" error

We have Filebeat on our Linux box, which periodically sends logs to Elasticsearch; from there Kibana is used to display them. In this scenario our application generates two types of logs:

Type 1 (string): `{"data": "some string log"}`
Type 2 (object): `{"data": {"objectKey": "objectValue"}}`

Now, if a type 1 event is sent to Elasticsearch first, the dynamic mapping creates the `data` field as `keyword`, and when Filebeat later tries to send a type 2 event, indexing fails with a mapper_parsing_exception because the types are incompatible. The same thing happens in the reverse order.

Is there any solution to this problem, perhaps something along the lines of dynamic templates? We cannot change the logging format, as the logs come from different applications and can differ. Any help would be really appreciated.

The two event shapes cannot coexist in the same index, no matter what template you use, so you have to send them to two different indices. If the events come from two different log files and can be read by two different prospectors, or if they carry a distinguishing field you can match on, you can define the `index` setting accordingly to route the events to different indices (see the sketch below): https://www.elastic.co/guide/en/beats/filebeat/current/elasticsearch-output.html#index-option-es
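
For example, here is a minimal `filebeat.yml` sketch, assuming the two log types live in separate files and are tagged with a custom field; the paths, the `log_type` field, and the index names are only illustrative:

```yaml
filebeat.prospectors:
  - type: log
    paths:
      - /var/log/app/string-logs/*.log   # type 1: "data" is a string
    fields:
      log_type: string
    fields_under_root: true
  - type: log
    paths:
      - /var/log/app/object-logs/*.log   # type 2: "data" is an object
    fields:
      log_type: object
    fields_under_root: true

output.elasticsearch:
  hosts: ["localhost:9200"]
  # Route each event to its own index based on the custom field,
  # so the conflicting "data" mappings never meet in one index.
  indices:
    - index: "app-string-%{+yyyy.MM.dd}"
      when.equals:
        log_type: "string"
    - index: "app-object-%{+yyyy.MM.dd}"
      when.equals:
        log_type: "object"
```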

Another upcoming option is the rename processor (still a work in progress): https://github.com/elastic/beats/pull/6292
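
Once that lands, the configuration might look roughly like this; this is only a sketch based on the open PR, not a final API, and the target field name `data_obj` is made up:

```yaml
processors:
  # Rename the conflicting field before the event reaches Elasticsearch,
  # so the two payload shapes end up under different field names.
  - rename:
      fields:
        - from: "data"
          to: "data_obj"
      ignore_missing: true
      fail_on_error: false
```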

Alternatively, you can always put Logstash in the middle to do any more complex routing.
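
For instance, a rough Logstash pipeline sketch that routes on the same custom field added in Filebeat (the field name and index names are again illustrative):

```
input {
  beats {
    port => 5044
  }
}

output {
  # Send object-shaped and string-shaped events to separate indices
  if [log_type] == "object" {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "app-object-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "app-string-%{+YYYY.MM.dd}"
    }
  }
}
```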
