[host] is defined as an object in mapping [logs] but this name is already used for a field in other types

Hi,

We started getting this error in the active master log. It is so bad that the log file fills the disk overnight.

I've read about this error, but what I found was people having fields with the same name but different types in the same index. In our case, however, the fields "host" and "logs" are not defined anywhere in our log data, so it seems they are added by Filebeat.

We run Elasticsearch and Logstash 5.6.2, while Filebeat is 6.0.
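To see what type "host" currently has in the index from the error, I think the get-field-mapping API can be used (index and type names are taken from the log below; I haven't run these yet, so treat them as a sketch):

    GET prod-api-2018.12.27/_mapping/logs/field/host
    GET prod-api-2018.12.27/_mapping

The second request should list every mapping type in that index, which would show where the conflicting "host" definition lives.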

[2018-12-27T14:22:35,382][DEBUG][o.e.a.a.i.m.p.TransportPutMappingAction] [es-master0402] failed to put mappings on indices [[[prod-api-2018.12.27/HuMoN-DmQnOEoFxnyROrpQ]]], type [logs]
java.lang.IllegalArgumentException: [host] is defined as an object in mapping [logs] but this name is already used for a field in other types
        at org.elasticsearch.index.mapper.MapperService.checkFieldUniqueness(MapperService.java:570) ~[elasticsearch-5.6.2.jar:5.6.2]
        at org.elasticsearch.index.mapper.MapperService.internalMerge(MapperService.java:394) ~[elasticsearch-5.6.2.jar:5.6.2]
        at org.elasticsearch.index.mapper.MapperService.internalMerge(MapperService.java:336) ~[elasticsearch-5.6.2.jar:5.6.2]
        at org.elasticsearch.index.mapper.MapperService.merge(MapperService.java:268) ~[elasticsearch-5.6.2.jar:5.6.2]
        at org.elasticsearch.cluster.metadata.MetaDataMappingService$PutMappingExecutor.applyRequest(MetaDataMappingService.java:311) ~[elasticsearch-5.6.2.jar:5.6.2]
        at org.elasticsearch.cluster.metadata.MetaDataMappingService$PutMappingExecutor.execute(MetaDataMappingService.java:230) ~[elasticsearch-5.6.2.jar:5.6.2]
        at org.elasticsearch.cluster.service.ClusterService.executeTasks(ClusterService.java:634) ~[elasticsearch-5.6.2.jar:5.6.2]
        at org.elasticsearch.cluster.service.ClusterService.calculateTaskOutputs(ClusterService.java:612) ~[elasticsearch-5.6.2.jar:5.6.2]
        at org.elasticsearch.cluster.service.ClusterService.runTasks(ClusterService.java:571) [elasticsearch-5.6.2.jar:5.6.2]
        at org.elasticsearch.cluster.service.ClusterService$ClusterServiceTaskBatcher.run(ClusterService.java:263) [elasticsearch-5.6.2.jar:5.6.2]
        at org.elasticsearch.cluster.service.TaskBatcher.runIfNotProcessed(TaskBatcher.java:150) [elasticsearch-5.6.2.jar:5.6.2]
        at org.elasticsearch.cluster.service.TaskBatcher$BatchedTask.run(TaskBatcher.java:188) [elasticsearch-5.6.2.jar:5.6.2]
        at org.elasticsearch.common.util.concurrent.ThreadContext$ContextPreservingRunnable.run(ThreadContext.java:569) [elasticsearch-5.6.2.jar:5.6.2]
        at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.runAndClean(PrioritizedEsThreadPoolExecutor.java:247) [elasticsearch-5.6.2.jar:5.6.2]
        at org.elasticsearch.common.util.concurrent.PrioritizedEsThreadPoolExecutor$TieBreakingPrioritizedRunnable.run(PrioritizedEsThreadPoolExecutor.java:210) [elasticsearch-5.6.2.jar:5.6.2]
        at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142) [?:1.8.0_131]
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617) [?:1.8.0_131]
        at java.lang.Thread.run(Thread.java:748) [?:1.8.0_131]

A possible cause may be the following:
We use Redis between Filebeat and Logstash, but recently decided to test whether we can ship directly from Filebeat to Logstash. So I added a new input to the existing Logstash config and configured one Filebeat to send to this input.

This may somehow have caused it, although I don't see how, since the Logstash filters and outputs are the same for events coming from both the Redis input and the new input configured for Filebeat.
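Roughly, the shared pipeline now looks like this (hosts, keys and the index pattern are placeholders, not our exact values):

    input {
      redis {
        host      => "redis.internal"     # original path: Filebeat -> Redis -> Logstash
        data_type => "list"
        key       => "filebeat"
      }
      beats {
        port => 5044                      # new input for the direct Filebeat -> Logstash test
      }
    }

    filter {
      # same filters applied to events from both inputs
    }

    output {
      elasticsearch {
        hosts => ["es-master0402:9200"]
        index => "prod-api-%{+YYYY.MM.dd}"   # same daily index for both paths
      }
    }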

Unless Filebeat maps the "host" field with a different type when it sends data directly to Logstash.
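One way I can think of to verify that would be to temporarily add a stdout output next to the elasticsearch one and compare an event arriving via Redis with one arriving via the Beats input:

    output {
      # temporary, for debugging only: print each event so the structure
      # of the "host" field can be compared between the two paths
      stdout { codec => rubydebug }
    }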

So the problem comes from having an index already created from the Redis input in Logstash and then pushing to the same index from the Beats input. Apparently there is a mismatch in some field types between the two inputs in Logstash and the outputs in Filebeat.
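If that is the case, one possible workaround (untested, just a sketch) would be to tag events from the new input and normalize the conflicting field in the filter before they reach the shared elasticsearch output:

    input {
      beats {
        port => 5044
        tags => ["direct_beats"]          # mark events from the direct Filebeat input
      }
    }

    filter {
      if "direct_beats" in [tags] {
        mutate {
          # drop the conflicting field so both paths produce the same structure;
          # alternatively it could be renamed to match what the Redis path sends
          remove_field => [ "host" ]
        }
      }
    }

Whether dropping or renaming is the right move depends on which shape the existing index already has for "host", so I would check the mapping first.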
