Errors with geo_point

Hey,

I'm new to this platform and have been using ELK since last week. I'm running into several problems with geolocation data.

I have a CSV file with the following columns

Name,Adress,Latitude,Longitude
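A data row looks something like this (name and address made up for this post, coordinates in the same format as the real ones):

SomeBeacon,Some Street 1,50.696324,4.0367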

I have created an index (Test) with the following mapping

{
  "Test" : {
    "mappings" : {
      "partial_beacons" : {
        "properties" : {
          "latitude" : {
            "type" : "float"
          },
          "location" : {
            "type" : "geo_point"
          },
          "longitude" : {
            "type" : "float"
          }
        }
      }
    }
  }
}
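For completeness, I created the index with a request along these lines (retyped from memory, so treat it as a sketch rather than the exact command):

curl -XPUT 'localhost:9200/Test' -d '
{
  "mappings": {
    "partial_beacons": {
      "properties": {
        "latitude":  { "type": "float" },
        "location":  { "type": "geo_point" },
        "longitude": { "type": "float" }
      }
    }
  }
}'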

My .conf looks like the following

input {
      file {
          path => "/.../*.csv"
          type => "partial_beacons"
          start_position => "beginning"
      }
}
filter {
    csv {
        columns => ["Name", "Adress", "latitude", "longitude"]
        separator => ","
    }
    mutate {
        add_field => ["location","%{latitude}"]
        add_field => ["location","%{longitude}"]
        convert => {"location" => "float"}
    }
}

output {
    elasticsearch {
        action => "index"
        hosts => "localhost"
        index => "Test"
        workers => 1
    }
    stdout {
        codec => rubydebug
    }
}

I have tried every solution I could find on this forum. What am I doing wrong? (Without the mapping the documents import successfully, but with lat/lon as strings...)

I would like to combine both columns (Latitude & Longitude) into a single field so I can plot the points on a map.

ES, Kibana and Logstash were all downloaded last week, so we are running the latest stable versions.

Cheers!

Thomas

IIRC Elasticsearch requires index names to be lowercase, so I'm a bit surprised you're getting anything at all with your Test index.

Anyway, what do your documents look like in ES? Alternatively, what does your stdout output think they look like?
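For reference, in case it helps: if I remember the docs correctly, a geo_point field accepts any of these shapes in the source document (made-up coordinates), so the goal is to get Logstash to emit one of them:

"location": "50.0,4.0"                      (string, "lat,lon")
"location": [4.0, 50.0]                     (array, [lon, lat], note the reversed order)
"location": { "lat": 50.0, "lon": 4.0 }     (object with lat/lon keys)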

Hi Magnus,

You're right, I actually made a typo and my index is "test" (all lowercase).

With this method, stdout outputs nothing and nothing is imported into ES.

I have tried with the following filter block:

filter {
    csv {
        columns => ["name", "adress", "latitude", "longitude"]
        separator => ","
    }
    mutate {
        convert => {"latitude" => "float"}
        convert => {"longitude" => "float"}
        rename => {
            "longitude" => "[location][lon]"
            "latitude" => "[location][lat]"
        }
    }
}

stdout outputs the following

{
       "message" => "name,adres,50.696324,4.0367",
      "@version" => "1",
    "@timestamp" => "2016-09-20T06:55:59.646Z",
          "host" => "ubuntu",
          "name" => "JohnSmith",
         "adress" => "Unknown",
      "location" => {
        "lon" => "4.0367",
        "lat" => "50.696324"
    }
}

but with the following error

"error"=>{"type"=>"illegal_argument_exception", "reason"=>"[location] is defined as an object in mapping [logs] but this name is already used for a field in other types"}}}, :level=>:warn}

Yet again, nothing is imported into ES.
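Reading that error, my guess is that documents from an earlier run were indexed under a different type ([logs]), where location got dynamically mapped as a plain object, and that this now conflicts with the geo_point field in partial_beacons. I'm wondering whether starting from a freshly created index and pinning the type in the output would avoid it; an untested guess on my side:

output {
    elasticsearch {
        action => "index"
        hosts => "localhost"
        index => "test"
        document_type => "partial_beacons"   # pin the type so every document matches the mapped type
    }
}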

Thanks

Hi,

Further testing led me to the following

mutate {
    convert => {"latitude" => "float"}
    convert => {"longitude" => "float"}
    add_field => ["location", "%{latitude},%{longitude}"]
    convert => {"location" => "float"}
}
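With this, location ends up as a single "lat,lon" string, which, as far as I understand, the geo_point mapping parses on the ES side. In stdout the documents now look roughly like this (same sample row as before):

{
        "name" => "JohnSmith",
      "adress" => "Unknown",
    "location" => "50.696324,4.0367",
    ...
}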

This seems to do the trick. Yet the map doesn't zoom in very far, so the result shows up poorly on Kibana's Tile Map. I've heard you can increase the zoom with options in kibana.yml, but I can't find the right option; my best guess so far is below.
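From skimming the docs (unverified, and the setting names may differ between versions), I think it's the tilemap section, possibly together with a custom tile service URL, since the default tile service may cap the zoom level:

# kibana.yml -- my unverified guess at the relevant settings
tilemap.options.minZoom: 1
tilemap.options.maxZoom: 18

If anyone can confirm, I'd appreciate it.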

Magnus, thank you for your help and time!

Cheers

Thomas
