I've trawled through post after post about GeoIP to no avail. I think my problem might lie in a minimal understanding of mappings.
I have a FortiGate firewall sending its logs via syslog to Logstash. This all works great, and GeoIP does find latitudes and longitudes, etc., but I can't for the life of me get it to store them as a geo_point.
I've seen people say you need to change your mappings, but a new index is created each day. Do I need to change mappings every day? I'm guessing not.
Also, I didn't think you could change mappings once an index has been created, and my indices are created automatically as the data comes in. I tried a mutate filter, but that idea fell flat on its face.
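From what I've read, the usual suggestion seems to be an index template, so that every new daily index picks up the geo_point mapping automatically when it's created. Is something like this what people mean? (The logstash-* pattern, template name, and type name here are just my guesses at what it might look like, applied with a PUT to _template/forti_geoip.)

{
  "template": "logstash-*",
  "mappings": {
    "forti_log": {
      "properties": {
        "geoip": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}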
Here's a copy of my filter. I'm at the "throw my hands in the air whilst exclaiming 'it's bloody stupid'" point.
filter {
  if [type] == "forti_log" {
    grok {
      match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]
      overwrite => [ "message" ]
      tag_on_failure => [ "forti_grok_failure" ]
    }
    kv {
      source => "message"
      value_split => "="
      field_split => " "
    }
    mutate {
      add_field => { "temp_time" => "%{date} %{time}" }
      rename => { "type" => "ftg_type" }
      rename => { "subtype" => "ftg_subtype" }
      add_field => { "type" => "forti_log" }
      convert => { "rcvdbyte" => "integer" }
      convert => { "sentbyte" => "integer" }
    }
    date {
      match => [ "temp_time", "yyyy-MM-dd HH:mm:ss" ]
      timezone => "UTC"
      target => "@timestamp"
    }
    mutate {
      # add/remove fields as you see fit
      remove_field => ["syslog_index","syslog5424_pri","path","temp_time","service","date","time","sentpkt","rcvdpkt","log_id","message","poluuid"]
    }
    geoip {
      source => "dstip"
    }
  }
}
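For completeness, my output is just the elasticsearch plugin writing daily indices. If the fix is to have Logstash load a template itself, I'd guess it involves the template options on the output, something like the sketch below (the host, index name, and template path are placeholders, not my real values):

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "forti-%{+YYYY.MM.dd}"
    # hypothetical template file containing the geo_point mapping
    template => "/etc/logstash/forti_template.json"
    template_name => "forti"
    template_overwrite => true
  }
}

Is that the right direction, or do I need to PUT the template into Elasticsearch directly?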