GeoIP issue in Logstash conf file

Hi,

I'm getting an error after specifying separate GeoIP databases for GeoIP City and GeoIP Country.

Please find the error message below:

/usr/share/logstash/bin/logstash -t -f /etc/logstash/conf.d/beats.conf

WARNING: Could not find logstash.yml which is typically located in $LS_HOME/config or /etc/logstash. You can specify the path using --path.settings. Continuing using the defaults
Could not find log4j2 configuration at path /usr/share/logstash/config/log4j2.properties. Using default config which logs errors to the console
[WARN ] 2018-07-14 14:10:04.888 [LogStash::Runner] multilocal - Ignoring the 'pipelines.yml' file because modules or command line options are specified
[FATAL] 2018-07-14 14:10:06.369 [LogStash::Runner] runner - The given configuration is invalid. Reason: Expected one of #, => at line 26, column 7 (byte 1207) after filter {
grok {
match => { "message" => '"remote address" %{IP:remote_address} - "remote user" - ["local time" %{HTTPDATE:time}] "Request" "%{GREEDYDATA:request}" "status code" %{INT:http_status_code} "bytes Transfer" %{NOTSPACE:bytes-transfer} "http_refere ""-" "http user agent" "%{DATA:httpuseragent}" "http x forwaded for" "%{DATA:http_x_forwarded_for}""requesttime" "%{DATA:requesttime}" "upstream time" "%{DATA:upstream_time}"'}
match => { "message" => '%{IP:client_ip} %{NOTSPACE:termination_state} %{NOTSPACE:termination_state} [%{HTTPDATE:timestamp}] "%{WORD:verb} %{URIPATHPARAM:request} HTTP/%{NUMBER:httpversion}" %{INT:http_status_code} %{NOTSPACE:bytes_read} %{GREEDYDATA:http_user_agent}'}

add_field => [ "received_at", "%{@timestamp}" ]
add_field => [ "received_from", "%{host}" ]

}
syslog_pri { }
date {
match => [ "syslog_timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
remove_field => ["timestamp"]
}

geoip {
source => "client_ip"
target => "geoip"
database => "/etc/logstash/GeoLite2-Country_20180605/GeoLite2-Country.mmdb" "/etc/logstash/GeoLite2-Country_20180605/GeoLite2-City.mmdb"

[ERROR] 2018-07-14 14:10:06.387 [LogStash::Runner] Logstash - java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

Also, please find my beats.conf file:

input {
beats {
port => 5044
}
}

filter {
grok {
match => { "message" => '"remote address" %{IP:remote_address} - "remote user" - ["local time" %{HTTPDATE:time}] "Request" "%{GREEDYDATA:request}" "status code" %{INT:http_status_code} "bytes Transfer" %{NOTSPACE:bytes-transfer} "http_refere ""-" "http user agent" "%{DATA:httpuseragent}" "http x forwaded for" "%{DATA:http_x_forwarded_for}""requesttime" "%{DATA:requesttime}" "upstream time" "%{DATA:upstream_time}"'}
match => { "message" => '%{IP:client_ip} %{NOTSPACE:termination_state} %{NOTSPACE:termination_state} [%{HTTPDATE:timestamp}] "%{WORD:verb} %{URIPATHPARAM:request} HTTP/%{NUMBER:httpversion}" %{INT:http_status_code} %{NOTSPACE:bytes_read} %{GREEDYDATA:http_user_agent}'}

add_field => [ "received_at", "%{@timestamp}" ]
add_field => [ "received_from", "%{host}" ]

}
syslog_pri { }
date {
match => [ "syslog_timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
remove_field => ["timestamp"]
}

geoip {
source => "client_ip"
target => "geoip"
database => "/etc/logstash/GeoLite2-Country_20180605/GeoLite2-Country.mmdb" "/etc/logstash/GeoLite2-Country_20180605/GeoLite2-City.mmdb"
add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
}
mutate {
convert => [ "[geoip][coordinates]", "float"]
}

}

output {
elasticsearch {
hosts => ["127.0.0.1:9200"]
index => "filebeat"
}
}

database => "/etc/logstash/GeoLite2-Country_20180605/GeoLite2-Country.mmdb" "/etc/logstash/GeoLite2-Country_20180605/GeoLite2-City.mmdb"

I don't think the geoip filter supports more than one database. If it did, the syntax would be this:

database => ["/etc/logstash/GeoLite2-Country_20180605/GeoLite2-Country.mmdb", "/etc/logstash/GeoLite2-Country_20180605/GeoLite2-City.mmdb"]
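For what it's worth, each geoip filter accepts exactly one path in its database option, so a single-database block would look roughly like this (just a sketch, reusing the path from your config):

geoip {
  source   => "client_ip"
  target   => "geoip"
  database => "/etc/logstash/GeoLite2-Country_20180605/GeoLite2-Country.mmdb"
}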

No, it's still throwing an error.

[ERROR] 2018-07-15 12:51:30.923 [LogStash::Runner] geoip - Invalid setting for geoip filter plugin:

filter {
geoip {
# This setting must be a path
# Expected path (one value), got 2 values?
database => ["/etc/logstash/GeoLite2-Country_20180605/GeoLite2-Country.mmdb", "/etc/logstash/GeoLite2-Country_20180605/GeoLite2-City.mmdb"]
...
}
}
[FATAL] 2018-07-15 12:51:30.942 [LogStash::Runner] runner - The given configuration is invalid. Reason: Something is wrong with your configuration.

If it doesn't support two GeoIP databases, please let me know how I can get both country and city in the dashboard.

This is my grok pattern
match => { "message"=> '%{IP:client_ip} %{NOTSPACE:termination_state} %{NOTSPACE:termination_state} [%{HTTPDATE:timestamp}] "%{WORD:verb} %{URIPATHPARAM:request} HTTP/%{NUMBER:httpversion}" %{INT:http_status_code} %{NOTSPACE:bytes_read} %{NOTSPACE:http_referer} "%{NOTSPACE:http_user_agent}" "%{NOTSPACE:http_x_forwarded_for}"request_time=%{BASE10NUM:request_time} upstream_response_time=%{BASE10NUM:upstream_response_time} body_bytes_sent=%{INT:body_bytes_sent} %{WORD:Country} %{WORD:Country_Code} %{WORD:Region_Name} %{WORD:City}'}

Please check and let me know.

Use two geoip filters, one for the country and one for the city?
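Something like this, as a rough sketch (the geoip_country target name is only an illustration; the paths are the ones from your earlier posts):

geoip {
  source   => "client_ip"
  target   => "geoip"
  database => "/etc/logstash/GeoLite2-Country_20180605/GeoLite2-City.mmdb"
}
geoip {
  source   => "client_ip"
  target   => "geoip_country"
  database => "/etc/logstash/GeoLite2-Country_20180605/GeoLite2-Country.mmdb"
}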

Hi,

Thanks for the update. After adding another geoip filter for the city, my Logstash patterns stopped working; it is no longer segregating the logs.

While going through the logs, I found the error below:

[2018-07-16T12:53:28,523][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//127.0.0.1:9200"]}
[2018-07-16T12:53:28,727][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/etc/logstash/GeoLite2-Country_20180605/GeoLite2-Country.mmdb"}
[2018-07-16T12:53:28,764][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/etc/logstash/GeoLite2-Country_20180605/GeoLite2-City.mmdb"}
[2018-07-16T12:53:29,390][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}
[2018-07-16T12:53:29,457][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x2f37ae53 run>"}
[2018-07-16T12:53:29,537][INFO ][org.logstash.beats.Server] Starting server on port: 5044
[2018-07-16T12:53:29,688][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}

[2018-07-16T12:53:30,124][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-07-16T12:53:33,840][ERROR][logstash.pipeline ] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {:pipeline_id=>"main", "exception"=>"undefined method `tr' for 0.0:Float", "backtrace"=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:344:in `convert_float'", "org/jruby/RubyMethod.java:115:in `call'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:309:in `block in convert'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:309:in `block in convert'", "org/jruby/RubyHash.java:1343:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:299:in `convert'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:252:in `filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:145:in `do_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:164:in `block in multi_filter'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:161:in `multi_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:44:in `multi_filter'", "(eval):240:in `block in filter_func'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:443:in `filter_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:422:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:384:in `block in start_workers'"], :thread=>"#<Thread:0x2f37ae53 sleep>"}
[2018-07-16T12:53:33,848][ERROR][logstash.pipeline ] Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash. {:pipeline_id=>"main", "exception"=>"undefined method `tr' for 0.0:Float", "backtrace"=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:344:in `convert_float'", "org/jruby/RubyMethod.java:115:in `call'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:309:in `block in convert'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:309:in `block in convert'", "org/jruby/RubyHash.java:1343:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:299:in `convert'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:252:in `filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:145:in `do_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:164:in `block in multi_filter'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:161:in `multi_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:44:in `multi_filter'", "(eval):240:in `block in filter_func'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:443:in `filter_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:422:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:384:in `block in start_workers'"], :thread=>"#<Thread:0x2f37ae53 sleep>"}
[2018-07-16T12:53:33,975][FATAL][logstash.runner ] An unexpected error occurred! {:error=>#<NoMethodError: undefined method `tr' for 0.0:Float>, :backtrace=>["/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:344:in `convert_float'", "org/jruby/RubyMethod.java:115:in `call'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:309:in `block in convert'", "org/jruby/RubyArray.java:2486:in `map'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:309:in `block in convert'", "org/jruby/RubyHash.java:1343:in `each'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:299:in `convert'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-filter-mutate-3.3.1/lib/logstash/filters/mutate.rb:252:in `filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:145:in `do_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:164:in `block in multi_filter'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/filters/base.rb:161:in `multi_filter'", "/usr/share/logstash/logstash-core/lib/logstash/filter_delegator.rb:44:in `multi_filter'", "(eval):240:in `block in filter_func'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:443:in `filter_batch'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:422:in `worker_loop'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:384:in `block in start_workers'"]}
[2018-07-16T12:53:34,050][ERROR][org.logstash.Logstash ] java.lang.IllegalStateException: Logstash stopped processing because of an error: (SystemExit) exit

Please check and suggest a fix.

It looks like you're trying to convert a field that already contains a float value into a float value and that's apparently not supported.

Are you using target => "geoip" in both filters?
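If both filters write into the same geoip target, the second pair of add_field lines appends new values to a [geoip][coordinates] array whose earlier entries were already converted to floats, and the second mutate then fails on those floats (that is exactly the NoMethodError in your log). A sketch of a safer layout, assuming you keep both lookups (geoip_country is just an illustrative target name): give each lookup its own target and run the convert only once, after both filters:

geoip {
  source    => "client_ip"
  target    => "geoip"
  database  => "/etc/logstash/GeoLite2-Country_20180605/GeoLite2-City.mmdb"
  add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
  add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
}
geoip {
  source   => "client_ip"
  target   => "geoip_country"
  database => "/etc/logstash/GeoLite2-Country_20180605/GeoLite2-Country.mmdb"
}
mutate {
  convert => [ "[geoip][coordinates]", "float" ]
}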

This is how I have put the GeoIP filters in beats.conf:

geoip {
  source => "client_ip"
  target => "geoip"
  database => ["/etc/logstash/GeoLite2-Country_20180605/GeoLite2-Country.mmdb"]
  add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
  add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}"  ]
}
mutate {
  convert => [ "[geoip][coordinates]", "float"]
}

geoip {
source => "client_ip"
target => "geoip"
database => ["/etc/logstash/GeoLite2-Country_20180605/GeoLite2-City.mmdb"]
add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
}
mutate {
convert => [ "[geoip][coordinates]", "float"]
}

Please help me resolve this.

Why are you trying to use both the Country and City DBs? There is nothing in the Country DB that isn't in the City DB.
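Put differently, a single geoip filter pointed at the City database should be all you need: the City lookup already fills in country_name and country_code2 alongside city_name, so both country and city end up under the geoip target. A rough sketch based on your existing config:

geoip {
  source    => "client_ip"
  target    => "geoip"
  database  => "/etc/logstash/GeoLite2-Country_20180605/GeoLite2-City.mmdb"
  add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
  add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
}
mutate {
  convert => [ "[geoip][coordinates]", "float" ]
}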

Hi,

So if I use only the City DB, will it show both city and country in my ELK dashboard?

Yes. As I said... "There is nothing in the Country DB that isn't in the City DB."

Try it for yourself and you will see.

Thanks... It works.
