Type of field changes - Mapping conflict

Hi,

I am feeding log data through Logstash into Elasticsearch. Without making any change to the data itself or the Logstash config, I am getting a mapping conflict on a new index I created yesterday (because of the same problem).

Logstash filter (the mapping conflict happens on the field "KNX-Wert"):

filter {
  if "knxmonitor" in [tags] {
    csv {
      columns => [ "eventtime", "ms", "Typ", "PA", "GA_KO", "KO-ID", "KNX-Name", "KNX-Wert" ]
      convert => { "KNX-Wert" => "float" }
    }
    date {
      locale => "en"
      match => [ 'eventtime' , 'yyyy-MM-dd HH:mm:ss' ]
    }
    mutate {
      remove_field => [ "message", "offset", "eventtime", "ms" ]
    }
  }
}

Not all messages contain valid float data in KNX-Wert, but I am not interested in those messages and wouldn't mind them being discarded at worst. In any case, they are producing errors in elasticsearch.log:

[2017-03-22T00:00:04,329][DEBUG][o.e.a.b.TransportShardBulkAction] [LAw6q-9] [knx2-2017.03.21][1] failed to execute bulk item (index) index {[knx2-2017.03.21][log][AVrzGUZoG7YEgFcI0c6J], source[{"KNX-Wert":"05:45:22","KNX-Name...
org.elasticsearch.index.mapper.MapperParsingException: failed to parse [KNX-Wert]
	at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:298) ~[elasticsearch-5.2.2.jar:5.2.2]
...
Caused by: java.lang.NumberFormatException: For input string: "05:45:22"
	at sun.misc.FloatingDecimal.readJavaFormatString(FloatingDecimal.java:2043) ~[?:?]
	at sun.misc.FloatingDecimal.parseFloat(FloatingDecimal.java:122) ~[?:?]
	at java.lang.Float.parseFloat(Float.java:451) ~[?:1.8.0_121]
	at org.elasticsearch.common.xcontent.support.AbstractXContentParser.floatValue(AbstractXContentParser.java:174) ~[elasticsearch-5.2.2.jar:5.2.2]
	at org.elasticsearch.index.mapper.NumberFieldMapper$NumberType$2.parse(NumberFieldMapper.java:278) ~[elasticsearch-5.2.2.jar:5.2.2]
	at org.elasticsearch.index.mapper.NumberFieldMapper$NumberType$2.parse(NumberFieldMapper.java:264) ~[elasticsearch-5.2.2.jar:5.2.2]
	at org.elasticsearch.index.mapper.NumberFieldMapper.parseCreateField(NumberFieldMapper.java:1018) ~[elasticsearch-5.2.2.jar:5.2.2]
	at org.elasticsearch.index.mapper.FieldMapper.parse(FieldMapper.java:287) ~[elasticsearch-5.2.2.jar:5.2.2]
...

Is this the reason? If so, how can I tell Elasticsearch to drop these messages instead of forcing them into the fields?

Or, how can I avoid field mappings being changed automatically like this?

Thank you in advance.

It looks like a date/time string mismatch. In the output stage, you could filter the "tags" field for "_dateparsefailure". Hope this helps.

One way is to keep the field KNX-Wert as is and use grok to parse only the numeric values into a different field. It could look like the example below:

filter {
  if "knxmonitor" in [tags] {
    csv {
      columns => [ "eventtime", "ms", "Typ", "PA", "GA_KO", "KO-ID", "KNX-Name", "KNX-Wert" ]
    }

    # Extract numeric values from KNX-Wert into KNX-Wert-float.
    # The NUMBER pattern is anchored so a value like "05:45:22"
    # cannot partially match (it would otherwise capture just "05");
    # the GREEDYDATA fallback keeps non-numeric events from being
    # tagged with _grokparsefailure. The ":float" suffix converts
    # the captured value in Logstash itself.
    grok {
      match => {
        "KNX-Wert" => [
          "^%{NUMBER:KNX-Wert-float:float}$",
          "%{GREEDYDATA}"
        ]
      }
    }

    date {
      locale => "en"
      match => [ "eventtime", "yyyy-MM-dd HH:mm:ss" ]
    }
    mutate {
      remove_field => [ "message", "offset", "eventtime", "ms" ]
    }
  }
}

In Elasticsearch, map KNX-Wert as a string type (keyword or text in 5.x) if you want to keep the field, or use a mutate filter in Logstash to remove it completely. Map KNX-Wert-float as float or double, as you prefer.

Perhaps you can also use a ruby filter to check whether the field contains a valid float and act on that.

@wenpos and @anhlqn - thank you both. I will give it a try as soon as I can get my hands on it.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.