Logstash Geoip filter with Packetbeat

Hi everyone,
To start, I'm running Logstash, Elasticsearch, and Kibana 6.3.1, with multiple Beats that are all on 6.3.2. I have loaded the various default Beats index templates as outlined in https://www.elastic.co/guide/en/beats/packetbeat/current/packetbeat-template.html (for Packetbeat as well as Auditbeat, Metricbeat, and Winlogbeat). All of my Beats are using the Logstash output.

#----------------------------- Logstash output --------------------------------
  # The Logstash hosts
  hosts: ["10.x.x.x:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

The issue I am having is that I cannot get the geoip filter to work with Packetbeat. Here is my .conf file that handles the Beats input:

# The # character at the beginning of a line indicates a comment. Use
# comments to describe your configuration.
input {
    beats {
        type => "beats"
        port => 5044
        ssl => false
    }
}

filter {
    geoip {
        source => "[dest][ip]"
        target => "[dest][ip_location]"
    }

    geoip {
        source => "[source][ip]"
        target => "[source][ip_location]"
    }
}

output {
    elasticsearch {
        hosts => ["10.x.x.x:9200", "10.x.x.x:9200", "10.x.x.x:9200"]
        index => "%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
        manage_template => false
    }
}
I'm pretty sure I'm using the geoip filter wrong, but I'm not sure how. If you need more info, please let me know. Thank you for all that you do!

Do you get any errors or do you just not get a geo_point in Kibana? If it is the latter then you need an index template. See this thread.

I get no error, but I set up my template during installation.

Here is the current index setup: https://gist.github.com/jkevan91/2a96da1a46757b489691b334109d8d88

I take it back, I do get an error in Logstash:

[2018-08-10T11:16:10,830][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"packetbeat-6.3.2-2018.08.10", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x681ebb0e>], :response=>{"index"=>{"_index"=>"packetbeat-6.3.2-2018.08.10", "_type"=>"doc", "_id"=>"vl4NJWUBa96CdXgWkW8T", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"parse_exception", "reason"=>"field [lat] missing"}}}}}

OK. So your template defines several fields as geo_point. It would appear your event has one of those fields, but it does not contain a lat/lon pair of numbers. The error message in Elasticsearch may show you more information, or you can dump the event with:
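One common fix is to map only the `location` subfield of the geoip target as geo_point, rather than the whole target object, since the other subfields (city_name, country_name, etc.) are not lat/lon pairs. A minimal sketch of such a template, assuming 6.x single-type mappings under `doc` (the template name and index pattern here are illustrative):

```json
PUT _template/packetbeat-geoip
{
  "index_patterns": ["packetbeat-*"],
  "mappings": {
    "doc": {
      "properties": {
        "dest": {
          "properties": {
            "ip_location": {
              "properties": {
                "location": { "type": "geo_point" }
              }
            }
          }
        },
        "source": {
          "properties": {
            "ip_location": {
              "properties": {
                "location": { "type": "geo_point" }
              }
            }
          }
        }
      }
    }
  }
}
```

With this in place, only `dest.ip_location.location` and `source.ip_location.location` need a lat/lon pair, and the remaining geoip subfields can be indexed as ordinary strings and numbers.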

output { stdout { codec => rubydebug } }

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.