_geoip_lookup_failure tag in Kibana

This is my example file:
1- Why is my currentTimestamp a string field?
2- Why does the geoip lookup fail?

and here is the filter I used :point_down:

filter {
json {
source => "message"
}
json {
source => "LogMsg"
}
date {
match => ["currentTimestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
}
geoip{
source => "LogMsg"
}
}

FYI, this really isn't a Kibana question; it's about how to configure your Logstash pipeline. You're just using Kibana to inspect the problem. I can offer some limited help here, but if you need further advice, the Logstash category is the right place to ask.

I formatted the filter for you:

filter {
  json {
    source => "message"
  }
  json {
    source => "LogMsg"
  }
  date {
    match => ["currentTimestamp", "dd/MMM/yyyy:HH:mm:ss Z"]
  }
  geoip {
    source => "LogMsg"
  }
}

The filter alone doesn't ensure that the data is mapped the way you want it to be. If you have an elasticsearch output section in your pipeline configuration, you'll want to add a template and template_name as described in the Logstash docs: Elasticsearch output plugin | Logstash Reference [8.11] | Elastic. In the template you specify, make sure to map currentTimestamp to be a date field.
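
For example, the output section might look something like this. This is only a rough sketch: the host, index name, and template path are placeholders I've made up, not values from your setup.

output {
  elasticsearch {
    hosts              => ["http://localhost:9200"]                       # placeholder host
    index              => "mylogs-%{+YYYY.MM.dd}"                         # placeholder index pattern
    template           => "/etc/logstash/templates/mylogs-template.json"  # placeholder path to your template file
    template_name      => "mylogs"
    template_overwrite => true
  }
}

The template file then declares currentTimestamp as a date. A minimal legacy-style template could look like this (the exact wrapper differs if you use composable index templates, so check the docs for your Elasticsearch version):

{
  "index_patterns": ["mylogs-*"],
  "mappings": {
    "properties": {
      "currentTimestamp": { "type": "date", "format": "dd/MMM/yyyy:HH:mm:ss Z" }
    }
  }
}

The format string just mirrors the pattern you already use in the date filter; adjust it if your timestamps look different.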

It looks like you're configuring source to be the LogMsg field, which is an object, but source needs a string containing the IP address to look up: Geoip filter plugin | Logstash Reference [8.11] | Elastic. Unless I'm mistaken, your data doesn't even contain IP addresses, so geoip is not what you want. It looks like you already have latitude and longitude values, so you just want to take those fields and map them to a geo_point. See: Geo-point datatype | Elasticsearch Reference [5.5] | Elastic
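
As a rough sketch, assuming the parsed LogMsg produces fields named latitude and longitude (I'm guessing at those names), you could combine them into a single field in your filter and map that field as a geo_point in the same template:

filter {
  mutate {
    # "latitude"/"longitude" are assumed field names; rename them into one
    # object that Elasticsearch can index as a geo_point.
    rename => {
      "latitude"  => "[location][lat]"
      "longitude" => "[location][lon]"
    }
    # geo_point expects numeric lat/lon, so coerce string values to floats.
    convert => {
      "[location][lat]" => "float"
      "[location][lon]" => "float"
    }
  }
}

and in the template's mappings:

"location": { "type": "geo_point" }

With that in place you can drop the geoip filter entirely, and Kibana's map visualizations can use the location field directly.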
