_geoip_lookup_failure tag in Kibana


(Jones Thomas) #1

This is my example file:
1. Why is my currentTimestamp indexed as a string field?
2. The geoip lookup fails

and the filter I used :point_down:

filter {
json {
source => "message"
}
json {
source => "LogMsg"
}
date {
match => ["currentTimestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
}
geoip{
source => "LogMsg"
}
}


(Tim Sullivan) #2

FYI, this isn't really a Kibana question; it's about how to configure your Logstash pipeline. You're just using Kibana to inspect the problem. I can offer some limited help, but if you need further advice, the Logstash category is where to ask.

I formatted the filter for you:

filter {
  json {
    source => "message"
  }
  json {
    source => "LogMsg"
  }
  date {
    match => ["currentTimestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
  }
  geoip {
    source => "LogMsg"
  }
}

The filter alone doesn't ensure that the data is mapped the way you want it to be. If you have an elasticsearch output section in your pipeline configuration, you'll want to add a template and template_name as described in the Logstash docs: https://www.elastic.co/guide/en/logstash/current/plugins-outputs-elasticsearch.html#plugins-outputs-elasticsearch-options. In the template you specify, make sure to map currentTimestamp to be a date field.
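As a rough sketch (the index name, template path, and template name below are made up, and the mapping assumes a 5.x-style template; adjust everything to your own setup), the output section and the template could look something like this. The date format string just mirrors the pattern you already use in your date filter:

output {
  elasticsearch {
    hosts              => ["localhost:9200"]
    index              => "mylogs-%{+YYYY.MM.dd}"
    # template file path and name are hypothetical -- point them at your own template
    template           => "/etc/logstash/templates/mylogs-template.json"
    template_name      => "mylogs"
    template_overwrite => true
  }
}

And in the template file, map currentTimestamp as a date:

{
  "template": "mylogs-*",
  "mappings": {
    "_default_": {
      "properties": {
        "currentTimestamp": {
          "type": "date",
          "format": "dd/MMM/yyyy:HH:mm:ss Z"
        }
      }
    }
  }
}

Note that the date filter only copies the parsed value into @timestamp; currentTimestamp itself stays a string unless you map it like this.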

It looks like you're configuring source to be the LogMsg field, which is an object, but source needs to be a string containing the IP address to look up: https://www.elastic.co/guide/en/logstash/current/plugins-filters-geoip.html#plugins-filters-geoip-source. Unless I'm mistaken, your data doesn't contain an IP address at all, so geoip is not what you want. It looks like you already have latitude and longitude data, so you just want to take those fields and map them to a geo_point. See: https://www.elastic.co/guide/en/elasticsearch/reference/5.5/geo-point.html
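For example (the field names [LogMsg][latitude] and [LogMsg][longitude] below are guesses at what your data contains; swap in the real ones), you could drop the geoip filter, combine the coordinates into a single field, and map that field as a geo_point in the same template:

filter {
  mutate {
    # combine the existing coordinates into one object the geo_point mapping understands;
    # the source field names are assumptions -- use whatever LogMsg actually contains
    rename => {
      "[LogMsg][latitude]"  => "[location][lat]"
      "[LogMsg][longitude]" => "[location][lon]"
    }
  }
  mutate {
    # make sure the coordinates are numeric before they reach Elasticsearch
    convert => {
      "[location][lat]" => "float"
      "[location][lon]" => "float"
    }
  }
}

Then add this to the properties in your template:

"location": { "type": "geo_point" }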


(system) #3

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.