Filebeat can't index events with a dot in a field name

Hi all,

I am trying to insert JSON-structured log data into Elasticsearch. Some fields have a dot in their name, and because of this Filebeat can't index the data into Elasticsearch. The error is:

WARN Can not index event (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"class_cast_exception","reason":null}}

The data which I want to insert in Elasticsearch is:

{"fields":{"data.1":1,"data.2":0},"name":"hostname","tags":{"host":"hostname"},"timestamp":99999999}

Can anyone help with this?

Thanks.

What version of ES are you using? Dots in field names are allowed in 1.x and 5.x. They were disallowed in 2.x, but brought back in 2.4 behind a config option.
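For reference, on 2.4 that option was (to the best of my recollection, so treat the exact flag name as an assumption) a JVM system property passed at startup; 5.x needs no flag:

```
# Assumed ES 2.4-only startup flag to re-allow dots in field names
./bin/elasticsearch -Dmapper.allow_dots_in_name=true
```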

You can process the data to remove the dots before the data gets written to ES.
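To illustrate what that processing does (this is just a sketch of the transformation, not any Elastic tool's actual code), expanding a dotted key into nested objects looks like this:

```python
def expand_dots(event):
    """Expand dotted keys like "data.1" into nested dicts: {"data": {"1": ...}}."""
    out = {}
    for key, value in event.items():
        parts = key.split(".")
        node = out
        # walk/create intermediate objects for each dotted segment
        for part in parts[:-1]:
            node = node.setdefault(part, {})
        # recurse into nested objects so inner dotted keys are expanded too
        node[parts[-1]] = expand_dots(value) if isinstance(value, dict) else value
    return out

event = {"fields": {"data.1": 1, "data.2": 0}, "name": "hostname"}
print(expand_dots(event))
# → {'fields': {'data': {'1': 1, '2': 0}}, 'name': 'hostname'}
```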

Ingest Node dot expander processor: https://www.elastic.co/guide/en/elasticsearch/reference/master/dot-expand-processor.html
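For the event you posted, an ingest pipeline using dot_expander might look like this (the pipeline name `dedot` is just an example; the field names come from your sample event):

```
PUT _ingest/pipeline/dedot
{
  "processors": [
    { "dot_expander": { "field": "data.1", "path": "fields" } },
    { "dot_expander": { "field": "data.2", "path": "fields" } }
  ]
}
```

You would then point Filebeat's Elasticsearch output at it via the `pipeline` setting (supported from Filebeat 5.0, if I'm not mistaken): `output.elasticsearch.pipeline: dedot`.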

Logstash de_dot: https://www.elastic.co/guide/en/logstash/current/plugins-filters-de_dot.html#plugins-filters-de_dot-nested
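If you go the Logstash route instead, a de_dot filter for those same fields could look roughly like this (field list is an assumption based on your sample event; without `fields`, the filter scans all fields, which the docs warn is expensive):

```
filter {
  de_dot {
    nested => true
    fields => ["[fields][data.1]", "[fields][data.2]"]
  }
}
```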

Thanks @andrewkroh.

I am using ES 5.1 and Filebeat 5.0. I had already seen the option for ES 2.x, so I think the problem is in Filebeat, which uses a specific template to format the JSON data it receives. That template is used to create the index with a specific mapping in ES.

I am going to check both links; I think the Ingest Node approach may resolve the problem.
