Incorrect mapping assigned in Elasticsearch

I am using Elasticsearch, Filebeat, and Logstash 6.3.2

I have a field called 'MEAS' in the files I'm attempting to index into ES, and it is of type Double.

Here is an example of one of my JSON files, let's call it File_A:

   "COND":  295.0

If Logstash sends this file first, Elasticsearch will create a mapping for it and assign the field the type 'Long'.

So if a subsequent JSON file contains "COND" with a different kind of value, such as this one (let's call it File_B):

   "COND": 306.145

I receive an error from ES: "cannot be converted to Long without data loss".

On the flip side: if Elasticsearch is fed File_B first, the mapping is correct; however, File_A is then rejected because it "cannot convert long to float".
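For what it's worth, the behaviour above is consistent with dynamic mapping pinning a field's type from the first document indexed: a whole number becomes long, a decimal becomes float, and whichever arrives first wins. A rough simulation of that rule in Python (a simplified model for illustration, not Elasticsearch code — the function name is mine):

```python
import json

def infer_es_type(value):
    # Simplified model of Elasticsearch dynamic mapping: JSON integers
    # map to "long", JSON decimals map to a floating-point type.
    if isinstance(value, bool):
        return "boolean"
    if isinstance(value, int):
        return "long"
    if isinstance(value, float):
        return "float"
    return "text"

# If 295.0 is re-serialized without its decimal point somewhere in the
# pipeline (an assumption), it arrives as an integer and pins the field:
first = json.loads('{"COND": 295}')
second = json.loads('{"COND": 306.145}')

print(infer_es_type(first["COND"]))   # prints "long"  -> mapping is pinned
print(infer_es_type(second["COND"]))  # prints "float" -> conflicts with "long"
```

Whichever of these two types the first document produces becomes the mapping for every document after it, which matches the symmetric failures I'm seeing.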

Before switching to Logstash/Filebeat, I was sending data directly from a parser written in C#, using Elasticsearch.NET and the low-level client API, and never once did I encounter this issue. That solution was not sustainable, though; I was basically reinventing the wheel called "Filebeat".

That said, I'm not sure whether my Logstash/Filebeat configurations are to blame. All I need is to index the documents into ES; since the data has already been converted ahead of time, Logstash should not need a filter.
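(For reference, if an explicit conversion in Logstash did turn out to be acceptable after all, I understand a mutate filter can force the field's type — a sketch, assuming the field is named COND as in the examples above:)

```
filter {
  mutate {
    # force COND to float so dynamic mapping never sees an integer
    convert => { "COND" => "float" }
  }
}
```

I'd rather avoid this, since the parser already emits correctly typed values, but I mention it as a possible workaround.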

Here are my configurations:

    filebeat.inputs:
    - type: log
      paths:
        - \\AUTOMATIONDEV\DataParser\OutputJSONSnP\*\*.json
        - \\AUTOMATIONDEV\DataParser\OutputJSON\*\*.json
      json.keys_under_root: true
      json.add_error_key: true

    output.logstash:
      hosts: ["localhost:5051"]


    input {
        beats {
            host => ""
            port => "5051"
            codec => "json_lines"
        }
    }

    output {
        elasticsearch {
            hosts => ["autoelk-01:9200", "autoelk-02:9200", "autoelk-03:9200"]
            index => "%{doc_index}"
            document_type => "%{doc_type}"
            document_id => "%{doc_id}"
        }
    }

Any help would be greatly appreciated.

Would creating an explicit mapping resolve this issue? I have tried turning off coerce when creating the index.
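To be concrete about what I mean by "creating a mapping": something like an index template that maps the field explicitly as double, so dynamic mapping never gets to guess. A sketch — the template name, the index pattern, and the type name "doc" are all placeholders of mine, since my actual index and type names come from %{doc_index} and %{doc_type}:

```
PUT _template/meas_template
{
  "index_patterns": ["my-data-*"],
  "mappings": {
    "doc": {
      "properties": {
        "COND": { "type": "double" }
      }
    }
  }
}
```

Is this the right direction, or is there a way to make the Filebeat/Logstash pipeline preserve the types on its own?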
