_geoip_parse_failure on logstash config

Hi all,

I have configured Packetbeat, Filebeat, and Metricbeat.

I want to use the GeoIP feature, so I have added a geoip filter to my Logstash config file.

But with every configuration I try, I am getting _geoip_parse_failure.

Please guide.

input {
  beats {
    port => 5044
  }
}

filter {
  if [type] == "flow" {
    geoip {
      source => "[event][source][ip]"
      target => "geoip"
    }
  }
}

output {
  stdout {}
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}


Regarding Logstash questions, you may be better off asking in the Logstash forums.

Can you include the complete log message?

Have you checked the actual event contents? E.g. IPv6 addresses are reported under source.ipv6. Also, why are you using [event]? I think Packetbeat only reports the field under source.ip, not event.source.ip.
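For example, here is a minimal sketch of a filter that tries both fields (assuming Packetbeat 5.x flow events, where the address lands under either source.ip or source.ipv6):

filter {
  if [type] == "flow" {
    # IPv4 addresses are reported under [source][ip]
    if [source][ip] {
      geoip {
        source => "[source][ip]"
        target => "geoip"
      }
    }
    # IPv6 addresses are reported under [source][ipv6]
    else if [source][ipv6] {
      geoip {
        source => "[source][ipv6]"
        target => "geoip"
      }
    }
  }
}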

Running Packetbeat with -d 'publish' will have Packetbeat print the events to be published to its log file.
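Something like this should do it (the -e flag simply redirects logging to the console so you can watch the events directly):

packetbeat -e -d 'publish'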

Hi Steffens,

I tried configuring it without the [event] prefix, but I am still getting the same error in the Logstash logs.

[2017-09-06T10:13:40,480][DEBUG][logstash.pipeline ] output received {"event"=>{"geoip"=>{}, "source"=>{"stats"=>{"net_bytes_total"=>109, "net_packets_total"=>1}, "ip"=>"0.0.0.0"}, "dest"=>{"ip"=>"1.1.1.1"}, "type"=>"flow", "tags"=>["beats_input_raw_event", "_geoip_lookup_failure"], "start_time"=>"2017-09-06T04:44:11.723Z", "@timestamp"=>2017-09-06T04:44:20.000Z, "last_time"=>"2017-09-06T04:44:11.723Z", "flow_id"=>"EAD/////AP////////8AAAEKKASCrBFyWw", "final"=>false, "beat"=>{"hostname"=>"Brackman", "name"=>"Brackman", "version"=>"5.5.1"}, "@version"=>"1", "host"=>"Brackman"}}

And yes, Packetbeat reports the IP as source.ip, so [source][ip] should work, but it's still giving _geoip_parse_failure.

Kindly help.

Well, 0.0.0.0 obviously can't be tied to a geolocation so it's not terribly surprising that the geoip filter fails.
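If you want to keep those events but skip the lookup, a minimal sketch is to guard the filter (the literal check against 0.0.0.0 is just an illustration; private ranges would still slip through):

filter {
  # Only attempt the lookup when the field exists and isn't the unspecified address
  if [type] == "flow" and [source][ip] and [source][ip] != "0.0.0.0" {
    geoip {
      source => "[source][ip]"
      target => "geoip"
    }
  }
}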

Hi Magnus,

I had changed the IP addresses before posting.

Below you can find the log files and screenshots of what I am seeing in Kibana and Graylog. Both logstash.conf files are the same.
Graylog image: [screenshot]

Kibana image: [screenshot]

I am receiving _geoip_parse_failure in the tags in Kibana, while the same config works fine in Graylog.

Logstash Config file:

input {
  beats {
    port => 5044
  }
}

filter {
  if [type] == "log" {
    if [message] =~ "AuthAccept|AuthLogout" {
      grok {
        match => { "message" => "%{DATA:sm_eventID} %{DATA:sm_hostname} \[%{DATA}\] \"%{IPV4:UserIP} %{DATA:sm_username}\" \"%{DATA:sm_agent} %{DATA}\" (?<Greedydata>(.|\r|\n)*)" }
      }
    }
    else {
      drop {}
    }

    geoip {
      source => "UserIP"
      target => "geoip"
    }
  }
}

output {
  stdout {}
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

Please guide.
Please note that some fields are hidden in the screenshots.

The IP address you are trying to look up is a private address, which is why the lookup fails.
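If you want to avoid tagging such events as failures at all, one option is to detect private ranges before the lookup. Here is a sketch using the cidr filter plugin (the private_ip tag name is arbitrary):

filter {
  # Tag events whose UserIP falls into an RFC 1918 or loopback range
  cidr {
    address => ["%{UserIP}"]
    network => ["10.0.0.0/8", "172.16.0.0/12", "192.168.0.0/16", "127.0.0.0/8"]
    add_tag => ["private_ip"]
  }
  # Only attempt the GeoIP lookup for public addresses
  if "private_ip" not in [tags] {
    geoip {
      source => "UserIP"
      target => "geoip"
    }
  }
}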

Hi Christian,

One question: how come the Graylog UI is showing the correct location, then?

I don't know, as I have no experience with Graylog. Since it does not provide any latitude and longitude, which you typically get from a MaxMind lookup, maybe it makes assumptions about where the host is located based on the locale where the software is installed?

Hi Christian,

Thanks for the knowledge sharing :slight_smile:

And if I use a MaxMind lookup, will it enhance my GeoIP results?

Yes, but it cannot map private IPs to a location, as this varies by installation.
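For reference, the Logstash geoip filter already uses a bundled GeoLite2 City database by default; pointing it at a database you download from MaxMind yourself is a one-line change (the path below is just an illustration):

geoip {
  source => "UserIP"
  target => "geoip"
  # Hypothetical path to a separately downloaded MaxMind database
  database => "/etc/logstash/GeoLite2-City.mmdb"
}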
