Hi,
I have Logstash shipping some CSV-based logs to Elasticsearch, and I want to change the field type of some of the fields from string to long.
I've deleted all the data and created the mapping below. However, when I restart Logstash, Kibana shows a conflict for the three fields in the mapping (the mapped field type and the actual field type don't match), and the data is still indexed as a string.
PUT test-
{
  "mappings": {
    "felcom": {
      "_all": { "enabled": false },
      "properties": {
        "CNO": { "type": "long" },
        "Lat": { "type": "long" },
        "Lon": { "type": "long" }
      }
    }
  }
}
Why isn't this working?
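For reference, this is how I've been checking what actually ended up in Elasticsearch (assuming the index name really is test-):

GET test-/_mapping
GET test-/_search?size=1

The first request shows the mapping Elasticsearch stored; the second returns one indexed document, where a long field should appear as a bare number rather than a quoted string.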
I also tried converting the fields in Logstash to an integer, and this works for the Lat and Lon fields, but for some reason the CNO field is not converted...
filter {
  csv {
    autodetect_column_names => true
  }
  date {
    match => [ "Date", "yyyy-MM-dd HH:mm:ss" ]
    timezone => "UTC"
    target => "@timestamp"
  }
  mutate {
    convert => {
      "CN0" => "integer"
      "Lat" => "integer"
      "Lon" => "integer"
    }
    remove_field => ["message", "Date"]
  }
}
Test data example:
Lat Lon CN0
51.799217 -350.738785 31.4
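One thing I noticed while writing this up: as far as I know, mutate's integer conversion truncates decimals, so this sample row would index as 51, -350 and 31. If the decimals should be kept, converting to float instead might be closer to what I want (untested sketch):

mutate {
  convert => {
    "CN0" => "float"
    "Lat" => "float"
    "Lon" => "float"
  }
}

with the three fields mapped as double (or float) instead of long in the index mapping.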