Conflict when creating mapping

Hi,

I have Logstash shipping some CSV-based logs to Elasticsearch, and I want to change the field type of some of them from string to long.

I've deleted all the data and created the mapping below. However, when I restart Logstash, Kibana shows a conflict for the three fields in the mapping (the mapped field type and the actual field type don't match), and the data is still saved as a string.

PUT test-
{
  "mappings": {
    "felcom": {
      "_all":       { "enabled": false },
      "properties": {
        "CNO": { "type": "long" },
        "Lat": { "type": "long" },
        "Lon": { "type": "long" }
      }
    }
  }
}
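As an aside, an unbalanced brace in a hand-typed mapping body can produce confusing errors. A quick sanity check before sending the request (a sketch using Python's standard `json` module; the body below mirrors the mapping above) is to parse it locally:

```python
import json

# Mapping body copied from the request above (field names as in the mapping).
body = """
{
  "mappings": {
    "felcom": {
      "_all": { "enabled": false },
      "properties": {
        "CNO": { "type": "long" },
        "Lat": { "type": "long" },
        "Lon": { "type": "long" }
      }
    }
  }
}
"""

try:
    mapping = json.loads(body)
    # List the mapped field names to confirm the structure parsed as intended.
    print("valid JSON:", sorted(mapping["mappings"]["felcom"]["properties"]))
except json.JSONDecodeError as err:
    print("malformed body:", err)
```

A missing or extra brace is reported immediately with a line and column number, instead of surfacing later as a cryptic cluster response.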

Why isn't this working?

I also tried converting the fields to integers in Logstash. This works for the Lat and Lon fields, but for some reason the CNO field is not converted...

filter {
    csv {
        autodetect_column_names => true
    }
    date {
        match => [ "Date", "yyyy-MM-dd HH:mm:ss" ]
        timezone => "UTC"
        target => "@timestamp"
    }
    mutate {
        convert => { "CN0" => "integer" }
        convert => { "Lat" => "integer" }
        convert => { "Lon" => "integer" }
        remove_field => ["message", "Date"]
    }
}

Test data example

Lat         Lon           CN0
51.799217   -350.738785   31.4

It looks like you are mixing CNO and CN0.

You are right. I copy/pasted it directly from the source, but somehow CNO and CN0 got mixed up along the way.
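For anyone hitting the same thing: `CNO` (ending in the letter O) and `CN0` (ending in the digit zero) are entirely different field names, so a `convert` on one never touches the other. A minimal illustration in Python (the event dict and convert step are made up to mimic the pipeline above):

```python
# Simulated event with the digit-zero field name, as in the CSV header.
event = {"Lat": "51.799217", "Lon": "-350.738785", "CN0": "31.4"}

# Hypothetical convert step that targets the letter-O name by mistake.
field_to_convert = "CNO"  # letter O, not digit zero

if field_to_convert in event:
    event[field_to_convert] = float(event[field_to_convert])
else:
    # The lookup silently misses, exactly like the Logstash mutate did.
    print(f"{field_to_convert!r} not found; fields are {sorted(event)}")
```

The conversion quietly does nothing rather than raising an error, which is why the mismatch was so hard to spot.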

It is working now.

But what was causing the mapping conflict? I suppose it is more efficient to let Elasticsearch decide the field type rather than having Logstash convert them.
