Mapper_parsing_exception Using GelfD and Logstash

We're using GelfD to ship our logs and although nothing has changed, we get these errors:

[2018-08-07T13:50:57,225][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-gelf2kafka-2018.08.07", :_type=>"logs", :_routing=>nil}, 2018-08-07T13:50:56.751Z auto1-task-scheduler.sonic-dev.us-east-1 task scheduler fetched tasks: ], :response=>{"index"=>{"_index"=>"logstash-gelf2kafka-2018.08.07", "_type"=>"logs", "_id"=>"AWUUp6rIWiXbnjb5AibS", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"object mapping for [host] tried to parse field [host] as object, but found a concrete value"}}}}

We run:
Elasticsearch 5.4.2
Logstash 5.4.2

The error is raised by Elasticsearch; Logstash is attempting to insert a string-valued host field, but the Elasticsearch index already has a field called host that contains an object (likely with sub-fields like name, ip, etc.).

There have been some recent conflicts with various plugins and data sources attempting to use the host field in different ways that work fine independently, but clash as above when used together.
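
If it helps to see the clash in isolation, here is a rough reproduction against a throwaway index (the index name test-conflict is made up for the example); the second request is rejected with the same mapper_parsing_exception:

# First document gets host dynamically mapped as an object (the Beats/ECS shape):
curl -XPOST 'localhost:9200/test-conflict/logs' -d '{"host": {"name": "web-01"}}'

# Second document sends host as a plain string (the GELF shape) and fails with
# 400: "object mapping for [host] tried to parse field [host] as object,
# but found a concrete value"
curl -XPOST 'localhost:9200/test-conflict/logs' -d '{"host": "web-01"}'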

There has been an effort to define an "Elastic Common Schema" (ECS) to ensure we don't have these conflicts between plugins; ECS defines host as an object with a variety of sub-fields that provide more information about the host in question.
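
For illustration, an ECS-style host is an object along these lines (the sub-field list is abbreviated and the values are made up here, but host.name, host.ip and host.architecture are ECS fields):

"host": {
  "name": "auto1-task-scheduler",
  "ip": ["10.1.2.3"],
  "architecture": "x86_64"
}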

Do you have other things pushing into the index (e.g., Beats via Elasticsearch Ingest Node)? Does the index's mapping template define host explicitly?
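
You can check how that index currently maps host with something like the following, and look for whether host has a properties block (object) or a plain type (string):

curl -XGET 'localhost:9200/logstash-gelf2kafka-2018.08.07/_mapping?pretty'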

Moving the string-valued host field to a non-clashing field such as source.ip prior to output may be a suitable workaround:

filter {
  mutate {
    rename => { "[host]" => "[source][ip]" }
  }
}

Hi,
Thank you for elaborating. I have seen this in another context, so I went ahead and checked our Elasticsearch mapping for this index. It looks like the mapping defines host as an object whose name sub-field is a string:

"host": {
            "properties": {
              "name": {
                "type": "text",
                "norms": false,
                "fields": {
                  "raw": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              }
            }
}

And there's a beat.hostname field that probably comes from Beats, but again, it's a different field:

"beat": {
            "properties": {
              "hostname": {
                "type": "text",
                "norms": false,
                "fields": {
                  "raw": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              },
              "ip": {
                "type": "text",
                "norms": false,
                "fields": {
                  "raw": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              },
              "name": {
                "type": "text",
                "norms": false,
                "fields": {
                  "raw": {
                    "type": "keyword",
                    "ignore_above": 256
                  }
                }
              },

When I try to rename the field to see what I'm actually getting, it fails as well:

rename => { "host" => "testfield" }

As it turns out, we were confused between the Filebeat object and the host.name object.
We looked into another index that has only Filebeat as an input source, and we have now converted host to host.name:

rename => { "host" => "[host][name]" }
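
For anyone else landing here with the same mixed-input index: a guarded version of that rename, so Filebeat events that already carry an object-shaped host are left untouched, could look roughly like this (the "gelf" type value is an assumption about how the GELF input is tagged in your pipeline):

filter {
  # Only rewrite host for events coming from the GELF input; Filebeat
  # events already ship host as an object and should be left alone.
  if [type] == "gelf" {
    mutate {
      rename => { "host" => "[host][name]" }
    }
  }
}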
