Geoip does not generate .location

Good morning,

I'm working with ELK for Palo Alto logs, and I want to move from normal charts to a Map chart.

I've read several tutorials, but it's not working properly.

Logstash filter:

if [SourceAddress] and [SourceAddress] !~ "(^127.0.0.1)|(^10.)|(^172.1[6-9].)|(^172.2[0-9].)|(^172.3[0-1].)|(^192.168.)|(^169.254.)" {
  geoip {
    database => "/etc/logstash/GeoLite2.mmdb"
    source => "SourceAddress"
    target => "SourceGeo"
  }
}

if [DestinationAddress] and [DestinationAddress] !~ "(^127.0.0.1)|(^10.)|(^172.1[6-9].)|(^172.2[0-9].)|(^172.3[0-1].)|(^192.168.)|(^169.254.)" {
  geoip {
    database => "/etc/logstash/GeoLite2.mmdb"
    source => "SourceAddress"
    target => "SourceGeo"
  }
}

It generates location.lat and location.lon, but not the .location field itself.

Any idea why the location is not working correctly?

Both your filters use SourceAddress to populate SourceGeo. It looks like you need to change this to DestinationAddress and DestinationGeo in the second one.

I made a mistake in the copy-paste.

The actual config is the following:

if [SourceAddress] and [SourceAddress] !~ "(^127.0.0.1)|(^10.)|(^172.1[6-9].)|(^172.2[0-9].)|(^172.3[0-1].)|(^192.168.)|(^169.254.)" {
  geoip {
    database => "/etc/logstash/GeoLite2.mmdb"
    source => "SourceAddress"
    target => "SourceGeo"
  }
}
#Geolocate logs that have DestinationAddress and if that DestinationAddress is a non-RFC1918 address
if [DestinationAddress] and [DestinationAddress] !~ "(^127.0.0.1)|(^10.)|(^172.1[6-9].)|(^172.2[0-9].)|(^172.3[0-1].)|(^192.168.)|(^169.254.)" {
  geoip {
    database => "/etc/logstash/GeoLite2.mmdb"
    source => "DestinationAddress"
    target => "DestinationGeo"
  }
}

Have you uploaded an index template that maps these fields correctly as a geo_point?

Hi Christian,

I've deleted the index and regenerated it in Kibana, but I don't know if that is the same thing.

I'm a bit of a newbie with ELK.

Regards

The default index template for the logstash-* indices maps the field geoip.location as a geo_point as shown here. In order for your fields to be recognised as a geo_point, you need to create/update the index template to map your fields the same way.
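You can verify this directly against the cluster. For example (the index name below is a placeholder for yours):

```shell
# Inspect the current field mappings; SourceGeo.location and
# DestinationGeo.location should show "type": "geo_point"
curl -XGET 'http://localhost:9200/your-index/_mapping?pretty'
```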

Hi again,

Maybe it's a stupid question, but where do I have to put this file, and with which permissions?

Many thanks!!

I've installed the template and Logstash is working properly.

[2017-06-27T15:43:53,520][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2017-06-27T15:43:53,528][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x653850cb URL://localhost:9200>]}
[2017-06-27T15:43:53,639][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}

It creates the index pan-traffic, but the geoip value doesn't appear.

Any idea?

I think you need to add some fields to the index template. It looks like:

PUT /_template/index-template-name
{
  "template": "pan-traffic",
  "mappings": {
    "_default_": {
      "properties": {
        "SourceGeo": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}

Copy the default Logstash index template and modify the template field in it to match your index name. Then add mappings for SourceGeo.location and DestinationGeo.location similar to how geoip.location is mapped. You can then either upload this template manually, e.g. using curl, or instruct Logstash to do so for you using the template parameter in the elasticsearch output plugin.
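If you go the Logstash route, the output section could look roughly like this (the file path, template name, and index name here are assumptions for illustration, not taken from your setup):

```conf
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "pan-traffic"
    # Have Logstash manage the custom index template itself
    template => "/etc/logstash/pantraffic-template.json"
    template_name => "pan-traffic"
    template_overwrite => true
  }
}
```

With template_overwrite set to true, Logstash replaces any existing template of that name on startup instead of keeping the old one.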

Many thanks.

  1. I find that the template I've deployed in /etc/logstash/elasticsearch-template.json is not loaded by Logstash:

    {
      "template" : "logstash-*",
      "version": 2,
      "settings" : {
        "index.refresh_interval" : "5s"
      },
      "mappings" : {
        "_default_" : {
          "_all" : {"enabled" : true},
          "dynamic_templates" : [ {
            "message_field" : {
              "match" : "message",
              "match_mapping_type" : "string",
              "mapping" : {
                "type" : "string", "index" : "analyzed", "omit_norms" : true
              }
            }
          }, {
            "string_fields" : {
              "match" : "*",
              "match_mapping_type" : "string",
              "mapping" : {
                "type" : "string", "index" : "analyzed", "omit_norms" : true,
                "fields" : {
                  "raw" : {"type": "string", "index" : "not_analyzed", "ignore_above" : 256}
                }
              }
            }
          } ],
          "properties" : {
            "@version": { "type": "string", "index": "not_analyzed" },
            "geoip" : {
              "type" : "object",
              "dynamic": true,
              "path": "full",
              "properties" : {
                "location" : { "type" : "geo_point", "lat_lon" : true, "geohash" : true }
              }
            },
            "SourceGeo" : {
              "type" : "object",
              "dynamic": true,
              "path": "full",
              "properties" : {
                "location" : { "type" : "geo_point", "lat_lon" : true, "geohash" : true }
              }
            },
            "DestinationGeo" : {
              "type" : "object",
              "dynamic": true,
              "path": "full",
              "properties" : {
                "location" : { "type" : "geo_point", "lat_lon" : true, "geohash" : true }
              }
            }
          }
        }
      }
    }

  2. The log when Logstash starts is the following:

    [2017-06-28T09:22:50,252][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>50001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"_all"=>{"enabled"=>true, "norms"=>false}, "dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword"}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date", "include_in_all"=>false}, "@version"=>{"type"=>"keyword", "include_in_all"=>false}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
    [2017-06-28T09:22:50,256][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>[#<URI::Generic:0x716a7ee0 URL://localhost:9200>]}
    [2017-06-28T09:22:50,374][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/etc/logstash/GeoLite2.mmdb"}
    [2017-06-28T09:22:50,436][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/etc/logstash/GeoLite2.mmdb"}
    [2017-06-28T09:22:50,490][INFO ][logstash.pipeline ] Starting pipeline {"id"=>"main", "pipeline.workers"=>1, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>125}
    [2017-06-28T09:22:50,933][INFO ][logstash.pipeline ] Pipeline main started
    [2017-06-28T09:22:51,059][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}

As you can see, the version is still 50001; I had modified that value in my template precisely so I could tell whether the correct template was loaded into the platform or not.

Command line to get the loaded template:

curl -XGET 'http://localhost:9200/_template/'
{"logstash":{"order":0,"version":50001,"template":"logstash-*","settings":{"index":{"refresh_interval":"5s"}},"mappings":{"_default_":{"dynamic_templates":[{"message_field":{"path_match":"message","mapping":{"norms":false,"type":"text"},"match_mapping_type":"string"}},{"string_fields":{"mapping":{"norms":false,"type":"text","fields":{"keyword":{"type":"keyword"}}},"match_mapping_type":"string","match":"*"}}],"_all":{"norms":false,"enabled":true},"properties":{"@timestamp":{"include_in_all":false,"type":"date"},"geoip":{"dynamic":true,"properties":{"ip":{"type":"ip"},"latitude":{"type":"half_float"},"location":{"type":"geo_point"},"longitude":{"type":"half_float"}}},"@version":{"include_in_all":false,"type":"keyword"}}}},"aliases":{}}}
  1. Where is this default template, or how can I change it?

kindly regards

You need to change this pattern to match your index and store it using a different name.

What I did:

curl -XGET 'localhost:9200/_template?pretty'
{
  "logstash" : {
    "order" : 0,
    "version" : 50001,
    "template" : "logstash-*",
    "settings" : {
      "index" : {
        "refresh_interval" : "5s"
      }
    },
    "mappings" : {
      "_default_" : {
        "dynamic_templates" : [
          {
            "message_field" : {
              "path_match" : "message",
              "mapping" : {
                "norms" : false,
                "type" : "text"
              },
              "match_mapping_type" : "string"
            }
          },
          {
            "string_fields" : {
              "mapping" : {
                "norms" : false,
                "type" : "text",
                "fields" : {
                  "keyword" : {
                    "type" : "keyword"
                  }
                }
              },
              "match_mapping_type" : "string",
              "match" : "*"
            }
          }
        ],
        "_all" : {
          "norms" : false,
          "enabled" : true
        },
        "properties" : {
          "@timestamp" : {
            "include_in_all" : false,
            "type" : "date"
          },
          "geoip" : {
            "dynamic" : true,
            "properties" : {
              "ip" : {
                "type" : "ip"
              },
              "latitude" : {
                "type" : "half_float"
              },
              "location" : {
                "type" : "geo_point"
              },
              "longitude" : {
                "type" : "half_float"
              }
            }
          },
          "@version" : {
            "include_in_all" : false,
            "type" : "keyword"
          }
        }
      }
    },
    "aliases" : { }
  },
  "pan-traffic" : {
    "order" : 0,
    "version" : 2,
    "template" : "pan_traffic",
    "settings" : {
      "index" : {
        "refresh_interval" : "5s"
      }
    },
    "mappings" : {
      "_default_" : {
        "dynamic_templates" : [
          {
            "message_field" : {
              "mapping" : {
                "index" : "analyzed",
                "omit_norms" : true,
                "type" : "string"
              },
              "match_mapping_type" : "string",
              "match" : "message"
            }
          },
          {
            "string_fields" : {
              "mapping" : {
                "index" : "analyzed",
                "omit_norms" : true,
                "type" : "string",
                "fields" : {
                  "raw" : {
                    "ignore_above" : 256,
                    "index" : "not_analyzed",
                    "type" : "string"
                  }
                }
              },
              "match_mapping_type" : "string",
              "match" : "*"
            }
          }
        ],
        "_all" : {
          "enabled" : true
        },
        "properties" : {
          "geoip" : {
            "dynamic" : true,
            "type" : "object",
            "properties" : {
              "ip" : {
                "type" : "ip"
              },
              "latitude" : {
                "type" : "half_float"
              },
              "location" : {
                "type" : "geo_point"
              },
              "longitude" : {
                "type" : "half_float"
              }
            }
          },
          "@version" : {
            "index" : "not_analyzed",
            "type" : "string"
          }
        }
      }
    },
    "aliases" : { }
  }
}

I restarted the ELK stack and added pan-traffic in Kibana.

I searched for GEOIP, and the result is the following:

I uploaded the template with the following command:

curl -XPUT 'http://localhost:9200/_template/pan-traffic' -d@/etc/logstash/pantraffic-template.json

What does pantraffic-template.json look like? Once it is uploaded, I also think you need to delete and recreate your index, as index templates are applied at creation time.

Hi Christian,
{
  "template" : "pan_traffic",
  "version": 2,
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  "mappings" : {
    "_default_" : {
      "_all" : {"enabled" : true},
      "dynamic_templates" : [ {
        "message_field" : {
          "match" : "message",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true
          }
        }
      }, {
        "string_fields" : {
          "match" : "*",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fields" : {
              "raw" : {"type": "string", "index" : "not_analyzed", "ignore_above" : 256}
            }
          }
        }
      } ],
      "properties" : {
        "@version": { "type": "string", "index": "not_analyzed" },
        "geoip" : {
          "type" : "object",
          "dynamic": true,
          "properties" : {
            "ip": { "type": "ip" }, "location" : { "type" : "geo_point" }, "latitude" : { "type" : "half_float" }, "longitude" : { "type" : "half_float" }
          }
        }
      }
    }
  }
}

  1. I've deleted the pan-traffic index in Kibana, but the problem persists.

This template does not seem to contain the mappings required for your fields. You will also need to delete the indices from Elasticsearch and recreate them, not just the index patterns in Kibana.

Hi Christian,

If I understand you correctly, the steps I have to follow are these:

1) curl -XDELETE 'http://localhost:9200/_template/OldTemplateName'
2) Stop Elasticsearch
3) Delete the pan-traffic index in Kibana
4) Stop Kibana
5) Start Elasticsearch
6) curl -XPUT 'http://localhost:9200/_template/pan-traffic' -d@/etc/logstash/pantraffic-template.json
7) Start Kibana
8) Add the new index in Kibana

Is that correct?

I think the following should be sufficient:

  1. Update pantraffic-template.json to contain mappings for your fields.
  2. Push this template to the cluster, which should override the existing one.
  3. Delete the pan-traffic index through the delete index API.
  4. Write data to the pan-traffic index.
  5. Refresh the mappings for the pan-traffic index pattern in Kibana.
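For example, steps 2 and 3 can be done with curl (using the template path and index name from earlier in this thread):

```shell
# 2) Push the updated template; using the same name overrides the existing one
curl -XPUT 'http://localhost:9200/_template/pan-traffic' -d@/etc/logstash/pantraffic-template.json

# 3) Delete the old index so the next write recreates it with the new mappings
curl -XDELETE 'http://localhost:9200/pan-traffic'
```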

What do you mean by this?

It does not seem to contain mappings for SourceGeo.location and DestinationGeo.location, so you need to add those.
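As a sketch, mirroring the existing geoip block in your template, the additions could look like this (placed next to geoip under properties; the exact set of sub-fields is an assumption based on what the geoip filter emits):

```json
"SourceGeo": {
  "dynamic": true,
  "properties": {
    "ip": { "type": "ip" },
    "location": { "type": "geo_point" },
    "latitude": { "type": "half_float" },
    "longitude": { "type": "half_float" }
  }
},
"DestinationGeo": {
  "dynamic": true,
  "properties": {
    "ip": { "type": "ip" },
    "location": { "type": "geo_point" },
    "latitude": { "type": "half_float" },
    "longitude": { "type": "half_float" }
  }
}
```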