IP Field contained invalid IP address or hostname with geoip filter

Hi

I am parsing a CSV file with the following configuration. The CSV file has a field that contains an IP address.

The problem is that Logstash gives the following error from the geoip filter even though the IP addresses are valid:

"IP Field contained invalid IP address or hostname"

Did I miss anything in the configuration?
Can't I use the csv field directly in the geoip filter?

input {
  file {
    path => "/Users/duleendra/Dev/ELK/logstash-2.3.2/data/*.csv"
  }
}

filter {
  if [path] =~ "usage" {
    mutate {
      replace => { "type" => "usage" }
    }
    csv {
      columns => [ "oid","user_id","ip","package","source_type","doc_id","digital_type","publication","pub_date","unit_price","gst","total","payment","pay_ref","createon" ]
      separator => ","
    }
    geoip {
      source => "ip"
    }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

Thanks
Duleendra

Well, what does the ip field contain?

Hi Magnus

The ip field contains IP addresses, and they appear to be valid.

If I try the following, it works, but I'm not sure whether there are any performance issues.

grok {
  match => { "message" => "%{IP:clientip}" }
}

geoip {
  source => "clientip"
}

Is there a way to skip the geoip filter if the "ip" field contains an invalid IP address?

Thanks
Duleendra

I am facing a similar error: "IP Field contained invalid IP address or hostname" with the geoip filter.

timestamp=>"2016-08-29T10:36:54.075000+0200", :message=>"IP Field contained invalid IP address or hostname", :field=>"client_ip", :event=>#<LogStash::Event:0x59fc3530 @metadata={}, @accessors=#<LogStash::Util::Accessors:0x3d081077 @store={"message"=>"^C82.103.128.63

My logstash configuration:

# Analyze geo location
if [client_ip] {
  geoip {
    source => "client_ip"
    target => "client_geoip"
    database => "/jsm/logstash/GeoLiteCity.dat"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
  mutate {
    convert => [ "[geoip][coordinates]", "float" ]
  }
  if ("_grokparsefailure" in [tags]) {
    mutate {
      add_tag => [ "geoip_parsefailure" ]
      remove_tag => [ "_grokparsefailure" ]
    }
  }
}

PS: client_ip is matched with IPORHOST, the default Logstash grok pattern.

If client_ip must be an IP address, perhaps you should use the dns filter to look up hostnames and turn them into IP addresses? And if that's not successful and client_ip still contains a hostname, skip the geoip filter?
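
Roughly something like this (untested sketch; the field name is taken from your config above, and the conditional only checks for an IPv4-looking value):

filter {
  # Try to resolve any hostname in client_ip and replace it with the IP address
  dns {
    resolve => ["client_ip"]
    action  => "replace"
  }
  # Only run geoip when the field looks like an IPv4 address
  if [client_ip] =~ /^\d{1,3}(\.\d{1,3}){3}$/ {
    geoip {
      source => "client_ip"
    }
  }
}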