GEOIP woes

I've trawled through post after post about GEOIP to no avail. I think my problem might lie in a minimal understanding of mappings.

I have a fortigate firewall that is sending its logs via syslog to logstash. This all works great and GEOIP does find latitude and longitudes etc, but I can't for the life of me get it to store them as a geo_point.

I've seen people say you need to change your mappings, but a new index is created each day. Do I need to change mappings every day? I'm guessing not.

Also, I didn't think you could change mappings when an index has been created. My indexes are created automatically when the data comes in. I tried a mutate filter but that idea fell flat on its face.

Here's a copy of my filter. I'm at the "throw my hands in the air whilst exclaiming 'it's bloody stupid'" point :slight_smile:

filter {
    if [type] == "forti_log" {

        grok {
            match => ["message", "%{SYSLOG5424PRI:syslog_index}%{GREEDYDATA:message}"]
            overwrite => [ "message" ]
            tag_on_failure => [ "forti_grok_failure" ]
        }

        kv {
            source => "message"
            value_split => "="
            field_split => " "
        }

        mutate {
            add_field => { "temp_time" => "%{date} %{time}" }
            rename => { "type" => "ftg_type" }
            rename => { "subtype" => "ftg_subtype" }
            add_field => { "type" => "forti_log" }
            convert => { "rcvdbyte" => "integer" }
            convert => { "sentbyte" => "integer" }
        }

        date {
            match => [ "temp_time", "yyyy-MM-dd HH:mm:ss" ]
            timezone => "UTC"
            target => "@timestamp"
        }

        mutate {
            # add/remove fields as you see fit.
            remove_field => ["syslog_index","syslog5424_pri","path","temp_time","service","date","time","sentpkt","rcvdpkt","log_id","message","poluuid"]
        }

        geoip {
            source => "dstip"
        }
    }
}
You use an index template to do that. This thread might help.

Thank you for your reply Badger. That thread was very helpful. It led me to this article which really helped me understand how to create and modify a mapping, and apply it to multiple indexes.

Thanks again!

The default logstash index template that is created when using the Elasticsearch output should already contain the proper mapping for geoip.location. If you're using a different output, have clobbered the output's template settings, modified the default template, or changed the target of the geoip filter to a different field (which it doesn't look like you have), then you may run into an issue. You can confirm by getting the output of "GET _template/logstash" in the Kibana dev console.
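For reference, the response to that request should contain a geoip mapping along these lines. This is an abridged sketch, and the exact shape varies by stack version (older templates nest it under a _default_ mapping), but the key thing to check is that geoip.location has type geo_point:

```
GET _template/logstash

# abridged response excerpt:
{
  "logstash": {
    "index_patterns": ["logstash-*"],
    "mappings": {
      "properties": {
        "geoip": {
          "dynamic": true,
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}
```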

I've been using geoip for years and never realized that -- the default index template ensures that the default target of the filter is a geo_point. That's because I almost always have a src and a dst, so I never use the default target of the filter :smiley: And if the field is not called geoip.location then the default template does not help.

It was a good idea, it just does not fit any use case I have ever seen.

I took a look at the logstash template and it did have a default geo_point mapping for geoip. The only departure from the norm in my setup was that I was sending the data to an index called forti-YY-mm-dd. Could that have been why it wasn't using the logstash default template?

Either way, I'm up and running now geo detecting multiple fields into multiple targets :slight_smile:
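For anyone landing here later, detecting multiple fields into multiple targets looks something like this (the srcip/dstip source fields are from my FortiGate logs and the target names are my own choice; adjust to your setup). Note that non-default targets need a matching geo_point mapping in your index template, since the default template only covers geoip.location:

```
geoip {
    source => "srcip"
    target => "src_geoip"
}
geoip {
    source => "dstip"
    target => "dst_geoip"
}
```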

Precisely. The default template only matches "logstash-*" index patterns. If you use a different index pattern then you'll need a new indexing template.
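A minimal sketch of such a template, assuming a forti-* index pattern and the default geoip target, run in the Kibana dev console. The template name "forti" is arbitrary, and depending on your Elasticsearch version you may want the newer _index_template API instead of the legacy _template one shown here:

```
PUT _template/forti
{
  "index_patterns": ["forti-*"],
  "mappings": {
    "properties": {
      "geoip": {
        "dynamic": true,
        "properties": {
          "location": { "type": "geo_point" }
        }
      }
    }
  }
}
```

Any index created after this whose name matches forti-* will pick up the geo_point mapping automatically; existing indices are unaffected and would need to be reindexed.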

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.