Existing mapping for [field name] must be of type object but found [field type]

Dear All,

I split a list of data into different events. Here's the example data

data source

1.178.179.217
1.179.170.7
1.93.0.224
100.16.243.115
101.187.28.8

After splitting the list into separate events, I applied a template to change the field type of "BadIP" to "ip", as follows:

{
  "template_4": {
    "order": 0,
    "index_patterns": [
      "test"
    ],
    "settings": {},
    "mappings": {
      "doc": {
        "properties": {
          "BadIP": {
            "type": "ip"
          }
        }
      }
    },
    "aliases": {}
  }
}
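
For reference, a template like this is usually loaded and checked roughly like so, assuming the legacy _template API of Elasticsearch 6.x (Kibana Dev Tools syntax; the empty settings and aliases are omitted here):

PUT _template/template_4
{
  "order": 0,
  "index_patterns": ["test"],
  "mappings": {
    "doc": {
      "properties": {
        "BadIP": {
          "type": "ip"
        }
      }
    }
  }
}

GET _template/template_4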

Then I try to use geoip to map those IPs, as follows:

filter {
  if "feed" in [tags] {
    split { field => "message" }
    grok { match => { "message" => "%{IP:BadIP}" } }

    geoip {
      source => "BadIP"
      target => "BadIP.geoip"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    }
    mutate { convert => [ "[geoip][coordinates]", "float" ] }
  }
}
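
After the split and grok, each event should end up roughly like this (illustrative only; the feed tag comes from whatever adds it upstream):

{
  "message": "1.178.179.217",
  "tags": ["feed"],
  "BadIP": "1.178.179.217"
}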

However, it shows the following message, and I don't understand what "Existing mapping for [BadIP] must be of type object but found [ip]" means.

Any idea? Thanks

[2018-09-12T17:52:51,508][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"test", :_type=>"doc", :_routing=>nil},
#<LogStash::Event:0x30c6b249>], :response=>{"index"=>{"_index"=>"test", "_type"=>"doc", "_id"=>"8JMyzWUBA8S-QR9OfTd8", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"Could not
dynamically add mapping for field [BadIP.geoip]. Existing mapping for [BadIP] must be of type object but found [ip]."}}}}

This message is coming back directly from Elasticsearch.

Your current pipeline is attempting to insert, into an existing field, a value that is different from what Elasticsearch expects for that field, and Elasticsearch was unable to coerce the value.

In your case, Elasticsearch is expecting the value of BadIP to be coercible to an ip (that is, a string representing an IPv4 or IPv6 address), but your output is sending an object.

I think the source of this problem is the configuration of your GeoIP filter, where you set the target of the GeoIP lookup to BadIP.geoip. Elasticsearch treats the dot as object nesting, so BadIP.geoip asks it to make BadIP an object with a geoip sub-field, which conflicts with BadIP already being mapped as an ip.
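
One way to avoid the collision is to point the GeoIP target at a field name that does not clash with the existing BadIP mapping. A rough sketch, using BadIP_geoip purely as an illustrative name:

geoip {
  source => "BadIP"
  target => "BadIP_geoip"
  add_field => [ "[BadIP_geoip][coordinates]", "%{[BadIP_geoip][longitude]}" ]
  add_field => [ "[BadIP_geoip][coordinates]", "%{[BadIP_geoip][latitude]}" ]
}
mutate { convert => [ "[BadIP_geoip][coordinates]", "float" ] }

Note that the add_field references then have to point at the same target; in your original configuration they reference [geoip][longitude] and [geoip][latitude] while the lookup result is written to BadIP.geoip, so they would never resolve.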

Hi Yaauie,

Thanks for your reply. I tried changing BadIP.geoip to BadIP_geoip and it works. However, I don't understand why other fields, such as firewall.source.geoip, don't have this problem. Is it because the field BadIP contains the word "IP" and there is a dot after it?

Thanks

No. The problem is that the Elasticsearch index you are inserting into already has a field called BadIP, and that field is an ip in the current mapping. In Elasticsearch, the type of any given field must be consistent across the entire index.
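
A minimal illustration of the two cases (the exact firewall.source mapping is inferred from your field name). BadIP is mapped as a concrete value type, so it cannot also carry sub-fields:

"BadIP": {
  "type": "ip"
}

whereas firewall and firewall.source are objects in your mapping, so a geoip sub-field can sit underneath them without any conflict:

"firewall": {
  "properties": {
    "source": {
      "properties": {
        "geoip": {
          "type": "object"
        }
      }
    }
  }
}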

I see. Thanks a lot.
