Logstash load of geo data from lat, long - Kibana not recognising geo


(karijd) #1

Hi,
I am having trouble getting Kibana to recognise my geo data as a proper geo field that it can plot.
I have done a lot of searching for the answer, but I cannot get the logstash.conf file to work. The time column acq is picked up as a time, but the geo data is ignored. I have tried a few variations from other posts, but none of them work. I suspect I need to map the geo field somehow, but the examples I have seen for that don't show the logstash conf file.
Help would be much appreciated!

input {
  file {
    path => "/home/ubuntu/subset.csv"
    type => "core2"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["mmsi","lon","lat","sog","cog","hdg","acq","id","geo"]
    separator => ","
  }

  if [id] == "id" {
    drop {}
  } else {
    mutate {
      convert => { "lon" => "float" }
      convert => { "lat" => "float" }
    }

    mutate {
      rename => {
        "lon" => "[location][lon]"
        "lat" => "[location][lat]"
      }
    }

    mutate {
      remove_field => [ "message", "host", "@timestamp", "@version" ]
    }

    geoip {
      source => "[lat,lon]"
    }
  }
}

output {
  elasticsearch {
    action => "index"
    host => "localhost"
    index => "logstash-1"
    workers => 1
  }
  stdout {
    codec => rubydebug
    #codec => dots
  }
}


(Mark Walkom) #2

You can't run the geoip filter on two fields like that; you need to merge lon and lat into a single field for it to work :)
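For reference, a minimal sketch of that merge, using the field names from the config above and assuming the spare `geo` column can be overwritten:

```
filter {
  # Make sure the coordinates are numeric before combining them
  mutate {
    convert => { "lon" => "float" }
    convert => { "lat" => "float" }
  }

  # Combine the two columns into a single "lat,lon" string field
  mutate {
    replace => { "geo" => "%{lat},%{lon}" }
  }
}
```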


(karijd) #3

Thanks Mark,
I have merged lon and lat, which works and creates a value like 35,-5. But the geoip.location field is empty and the geo_point option is still greyed out in the tile map view. Can you see what I am doing wrong? I have tried a few things but it is still not working... please help :(

input {
  file {
    path => "/home/ubuntu/subset.csv"
    type => "core2"
    start_position => "beginning"
  }
}

filter {
  csv {
    columns => ["mmsi","callsign","imo","vessel","cargotype","activitytype","navigationalstatus","shiptype","maxdraught","vesseltypeindx","length","beam","lon","lat","sog","cog","hdg","acq","id","geo"]
    separator => ","
  }

  if [id] == "id" {
    drop {}
  } else {
    mutate {
      convert => { "lon" => "float" }
      convert => { "lat" => "float" }
    }

    mutate {
      replace => ["geo","%{lat},%{lon}"]
    }

    mutate {
      remove_field => [ "message", "host", "@timestamp", "@version" ]
    }

    geoip {
      source => "geo"
      target => "geoip"
    }
  }
}

output {
  elasticsearch {
    action => "index"
    host => "localhost"
    index => "logstash-1"
    workers => 1
  }
  stdout {
    codec => rubydebug
    #codec => dots
  }
}
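One thing worth checking, along the lines of the mapping hunch in the first post: the geoip filter performs IP-address lookups, so it will not populate geoip.location from a "lat,lon" string, and Kibana only enables the tile map once the field is mapped as geo_point in Elasticsearch. A sketch of an index template that maps the merged field as geo_point — the template name, index pattern, and the use of `_default_` are assumptions here, with the field name `geo` taken from the config above:

```
curl -XPUT 'localhost:9200/_template/geo_template' -d '{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "geo": { "type": "geo_point" }
      }
    }
  }
}'
```

Note that a template only applies to indices created after it is added, so the logstash-1 index would need to be deleted and reindexed for the mapping to take effect. With the field mapped this way, the geoip filter block could likely be dropped entirely.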
