Logstash filter cidr error

Hi,

I was trying out the cidr filter in Logstash.

cidr {
  add_tag => ["matched"]
  address => ["%{clientip}"]
  network => ["10.123.123.0/24", "10.xxx.xxx.0/24", "10.xxx.xxx.0/24", "10.xxx.xxx.0/24"]
}

This gives an error whenever the IP is anything other than the 10.x.x.x ranges, i.e. an external public IP.

The error is:

Invalid IP address, skipping {:address=>"%{clientip}", :event=>#<LogStash::Event:0x3ff5aaeb @metadata_accessors=#<LogStash::Util::Accessors:0x6bbb121a @store={}, @lut={}>, @cancelled=false, @data={...... ......:level=>:warn}

How do I get rid of this error?

This is because the event didn't have a clientip field.

Thanks, Magnus, for the reply.

The clientip field does exist, together with the geoip.ip field, in the same event, and such events are still throwing the error.
How can I bypass such events?

To only run the cidr filter if there's a clientip field:

if [clientip] {
  cidr {
    ...
  }
}
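
Filled in with the cidr filter from your first post, that would look something like this:

if [clientip] {
  cidr {
    add_tag => ["matched"]
    address => ["%{clientip}"]
    network => ["10.123.123.0/24", "10.xxx.xxx.0/24", "10.xxx.xxx.0/24", "10.xxx.xxx.0/24"]
  }
}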

Hi Magnus,

I tried this, but it is not working. Both the clientip and geoip.ip fields exist together in the same event.

So the same error still occurs.

Sorry, I don't understand. What does the [geoip][ip] field have to do with this? What does a failing event look like? Use a stdout { codec => rubydebug } output.
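
If it isn't in your configuration already, a minimal output for that could look like this:

output {
  # Dump each event in full, one field per line, so we can see exactly what the cidr filter receives
  stdout { codec => rubydebug }
}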

Hi Magnus,

The event is:
<157>Nov 30 20:21:43 hostname process: 10.xx.x.x 47.xxx.xx.x ip3.host.domain [30/Nov/2016:20:21:43 +0530] - "POST /url/xxx/url/xx HTTP/1.1" 200 29 "-" "Apache-HttpClient/UNAVAILABLE (java 1.4)" 0.008 0.008 .

This has 3 client IPs, with 1 external IP (which is then the one geoip picks up).

This is the stdout for the event:

Invalid IP address, skipping {:address=>"%{clientip}", :event=>#<LogStash::Event:0x14422092 @metadata_accessors=#<LogStash::Util::Accessors:0x12d16236 @store={}, @lut={}>, @cancelled=false, @data={"message"=>"<157>Nov 30 20:21:43 hostname process: 10.x.x.x 47.x.x.x ip3.host.domain [30/Nov/2016:20:21:43 +0530] - "POST url/x/url/x HTTP/1.1" 200 29 "-" "Apache-HttpClient/UNAVAILABLE (java 1.4)" 0.008 0.008 .", "@version"=>"1", "@timestamp"=>"2017-01-31T09:21:05.662Z", "host"=>"anu", "tags"=>["_grokparsefailure", "weblogs_3ips"], "timestamp"=>"Nov 30 20:21:43", "logsource"=>"hostname", "program"=>"process", "clientip"=>["10.x.x.x", "47.x.x.x", "ip3.host.domain"], "t"=>"30/Nov/2016:20:21:43 +0530", "verb"=>"POST", "request"=>"/apis/jionetwork/v1/checklist_v1.3/", "httpversion"=>"1.1", "response"=>"200", "bytes"=>"29", "ref"=>"-"}, @metadata={}, @accessors=#<LogStash::Util::Accessors:0x169f9c46 @store={"message"=>"<157>Nov 30 20:21:43 SMUMAPI002 nginx: 10.x.x.x 47.x.x.x ip3.host.domain [30/Nov/2016:20:21:43 +0530] - "POST /url/x/url/x HTTP/1.1" 200 29 "-" "Apache-HttpClient/UNAVAILABLE (java 1.4)" 0.008 0.008 .", "@version"=>"1", "@timestamp"=>"2017-01-31T09:21:05.662Z", "host"=>"anu", "tags"=>["_grokparsefailure", "weblogs_3ips"], "timestamp"=>"Nov 30 20:21:43", "logsource"=>hostname", "program"=>"process", "clientip"=>["10.x.x.x", "47.x.x.x", "ip3.host.domain"], "t"=>"30/Nov/2016:20:21:43 +0530", "verb"=>"POST", "request"=>"/apis/jionetwork/v1/checklist_v1.3/", "httpversion"=>"1.1", "response"=>"200", "bytes"=>"29", "ref"=>"-"}, @lut={"host"=>[{"message"=>"<157>Nov 30 20:21:43 hostname process: 10.x.x.x 47.x.x.x ip3.host.domain [30/Nov/2016:20:21:43 +0530] - "POST /apis/jionetwork/v1/checklist_v1.3/ HTTP/1.1" 200 29 "-" "Apache-HttpClient/UNAVAILABLE (java 1.4)" 0.008 0.008 .", "@version"=>"1", "@timestamp"=>"2017-01-31T09:21:05.662Z", "host"=>"anu", "tags"=>["_grokparsefailure", "weblogs_3ips"], "timestamp"=>"Nov 30 20:21:43", "logsource"=>"hostname", "program"=>"process", "clientip"=>["10.x.x.x", "47.x.x.x", "ip3.host.domain"], "t"=>"30/Nov/2016:20:21:43 +0530", "verb"=>"POST", "request"=r"/url/x/url/x", "httpversion"=>"1.1", "response"=>"200", "bytes"=>"29", "ref"=>"-"}, "host"], ...........................repeats many times............................>>, :level=>:warn}
{
    "message" => "<157>Nov 30 20:21:43 hostname process: 10.xx.x.xx 47.x.x.xx ip3.host.domain [30/Nov/2016:20:21:43 +0530] - "POST /url/x/x/url/ HTTP/1.1" 200 29 "-" "Apache-HttpClient/UNAVAILABLE (java 1.4)" 0.008 0.008 .",
    "@version" => "1",
    "@timestamp" => "2017-01-31T09:21:05.662Z",
    "host" => "anu",
    "tags" => [
        [0] "_grokparsefailure",
        [1] "weblogs_3ips"
    ],
    "timestamp" => "Nov 30 20:21:43",
    "logsource" => "hostname",
    "program" => "process",
    "clientip" => [
        [0] "10.x.x.x",
        [1] "47.x.x.x",
        [2] "ip3.host.domain"
    ],
    "t" => "30/Nov/2016:20:21:43 +0530",
    "verb" => "POST",
    "request" => "/url/x/url/x/",
    "httpversion" => "1.1",
    "response" => "200",
    "bytes" => "29",
    "ref" => "-"
}

Which IP address would you like to use with the cidr filter? Please show your full configuration.

Hi Magnus,

The full configuration is:

input { stdin {} }

filter {
  grok {
    match => { "message" => "%{COMMONAPACHELOG}" }
  }

  grok {
    match => { "message" => "%{SYSLOGBASE} %{NUMBER:response} \[%{HTTPDATE}\] %{NUMBER} %{NUMBER} %{NUMBER} \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{IPORHOST:clientip} " }
  }

  cidr {
    add_tag => ["matched"]
    address => ["%{clientip}"]
    network => ["10.123.123.0/24", "10.xxx.xxx.0/24", "10.xxx.xxx.0/24", "10.xxx.xxx.0/24"]
  }

  geoip {
    source => "clientip"
    target => "geoip"
    database => "/etc/logstash/GeoLiteCity.dat"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    add_tag => ["geo"]
  }

  mutate {
    convert => [ "[geoip][coordinates]", "float" ]
  }
}

output { stdout { codec => rubydebug } }

Hi Magnus,

The above configuration works well. But when I add the filter below:

grok {
  match => { "message" => "%{SYSLOGBASE} %{IPORHOST:clientip} %{IPORHOST:clientip} %{IPORHOST:clientip} \[%{HTTPDATE:t}\] %{USER} \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{NUMBER:response} %{NUMBER:bytes} \"%{USER:ref}\" " }
}

it captures 3 client IPs into the same clientip field, which creates the conflict.

How can I fix this?

If you don't want three IP addresses in the clientip field, don't list %{IPORHOST:clientip} three times. Capture the IP addresses to different fields.
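
Since clientip is an array here, and its third element (ip3.host.domain) is a hostname rather than an IP, %{clientip} can't resolve to a single valid address for the cidr filter, hence the warning. A rough sketch of what separate fields could look like, reusing your own pattern (proxyip and hostip are just placeholder names, pick whatever fits your log format):

grok {
  # Capture each of the three addresses into its own field
  match => { "message" => "%{SYSLOGBASE} %{IPORHOST:clientip} %{IPORHOST:proxyip} %{IPORHOST:hostip} \[%{HTTPDATE:t}\] %{USER} \"(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})\" %{NUMBER:response} %{NUMBER:bytes} \"%{USER:ref}\" " }
}

# Run the cidr check only against the field that should hold the internal address
if [clientip] {
  cidr {
    add_tag => ["matched"]
    address => ["%{clientip}"]
    network => ["10.123.123.0/24", "10.xxx.xxx.0/24", "10.xxx.xxx.0/24", "10.xxx.xxx.0/24"]
  }
}

With the addresses separated, you can also point the geoip filter's source at whichever field ends up holding the public IP.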
