Problems with geoip configuration

Hi all,

I'm having problems with my geoip configuration. I'm running Kibana 4.1.1 with Logstash 1.5.3 and Apache 2.4.

This is my custom geoip configuration file for Logstash, which I called 12-geoip.conf:

filter {
  if [type] == "apache_access" {
    grok {
      match => { "message" => "%{COMMONAPACHELOG}" }
    }
    geoip {
      source => "clientip"
      target => "geoip.location"
      database => "/etc/logstash/GeoLiteCity.dat"
      add_field => [ "[geoip.location][coordinates]", "%{[geoip.location][longitude]}" ]
      add_field => [ "[geoip.location][coordinates]", "%{[geoip.location][latitude]}" ]
    }
    mutate {
      convert => [ "[geoip.location][coordinates]", "float" ]
    }
  }
}

My Apache configuration, in another file:
filter {
  if [type] == "apache" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
      add_field => [ "received_from", "%{host}" ]
    }
  }
}

My problem is that the "geoip" field doesn't appear in Kibana, while clientip, hostname, etc. do appear.

Part of my logstash-forwarder configuration:

  {
    "paths": [
      "/var/log/apache2/*error.log",
      "/var/log/apache2/*access-ssl.log"
    ],
    "fields": { "type": "apache" }
  },
  {
    "paths": [
      "/var/log/apache2/*access.log"
    ],
    "fields": { "type": "apache_access" }
  }
]
}

And my logstash-forwarder registered events:

2015/09/04 09:07:57.807119 Registrar: processing 5 events
2015/09/04 09:08:45.244783 Registrar: processing 2 events
2015/09/04 09:08:50.238178 Registrar: processing 2 events
2015/09/04 09:09:02.744967 Registrar: processing 1 events

Thanks so much.


You shouldn't be using dots in your field names like that; that will not be supported in ES 2.0.
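For what it's worth, a sketch of the same filter without dots in the field names (same source field and database path as above): the target simply becomes geoip, and sub-fields are referenced with nested-field syntax.

```
geoip {
  source   => "clientip"
  target   => "geoip"
  database => "/etc/logstash/GeoLiteCity.dat"
  add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
  add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
}
mutate {
  convert => [ "[geoip][coordinates]", "float" ]
}
```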

What does the mapping for the field look like in ES?

Yes, it's Spanish.

I think the problem resides in the filter configuration.

geoip {
  source => "clientip"
  target => "geoip.location"
  database => "/etc/logstash/GeoLiteCity.dat"
  add_field => [ "[geoip.location][coordinates]", "%{[geoip.location][longitude]}" ]
  add_field => [ "[geoip.location][coordinates]", "%{[geoip.location][latitude]}" ]
}
mutate {
  convert => [ "[geoip.location][coordinates]", "float" ]
}

I don't see the field in Kibana, but, for example, clientip does appear.

You need to get the mapping for the index and then look at the field.

curl -XGET localhost:9200/indexname/_mapping

The field is geoip; my index is called logstash-*.

Ok, so did you check the mapping for that field?

I don't understand; it should work with this configuration:

filter {
  if [type] == "apache_access" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
    geoip {
      source => "clientip"
      target => "geoip"
      database => "/etc/logstash/GeoLiteCity.dat"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    }
    mutate {
      convert => [ "[geoip][coordinates]", "float" ]
    }
  }
}

I see the client IP in Kibana; my field for geoip is "geoip". I don't know what's wrong.

The "apache_access" _type works correctly; I can see it in Kibana. It was configured in logstash-forwarder.

In Elasticsearch you map a field; this is where you tell it that any data in your geoip field is actually a geo type, which is how KB then knows how to use it to put things on the map.

You need to check that field to make sure it is mapped correctly. curl host:9200/INDEXNAME/_mapping should show you.
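If it helps to automate that check, here is a small standalone sketch (plain Python, no Elasticsearch client; the sample mapping below is hypothetical) that walks a mapping response looking for geo_point fields:

```python
import json

def find_geo_point_fields(mapping, path=""):
    """Recursively collect dotted paths of fields mapped as geo_point."""
    found = []
    for name, body in mapping.get("properties", {}).items():
        full = f"{path}.{name}" if path else name
        if body.get("type") == "geo_point":
            found.append(full)
        # Recurse into object fields, which carry their own "properties".
        found.extend(find_geo_point_fields(body, full))
    return found

# Hypothetical slice of a _mapping response for one type:
mapping = json.loads('''
{"properties": {
    "clientip": {"type": "string"},
    "geoip": {"dynamic": "true",
              "properties": {"location": {"type": "geo_point"}}}}}
''')

print(find_geo_point_fields(mapping))  # -> ['geoip.location']
```

If the list comes back empty for the index you are visualizing, Kibana has nothing to put on a map.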

Warkolm, thanks so much for your help :grimacing:

I ran curl host:9200/INDEXNAME/_mapping with my index and filtered the output. I see the geoip field, but curiously I don't see an ip, clientip, or IP address field.

"geoip":{"dynamic":"true","properties":{"location":{"type":"geo_point"}}}

Originally you mentioned you were using the above field; is it that, or is it geoip?

In the Kibana interface it appears with the name "geoip.location", but with the analyzed column set to false.

At the moment I see the "geoip" field in my index, but I don't see it in Kibana.

filter {
  if [type] == "apache_access" {
    grok {
      pattern => "%{COMBINEDAPACHELOG}"
    }
    geoip {
      source => "clientip"
      target => "geoip"
      database => "/etc/logstash/GeoLiteCity.dat"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    }
    mutate {
      convert => [ "[geoip][coordinates]", "float" ]
    }
  }
}

curl -XGET 10.29.0.71:9200/logstash-*/_mapping

"geoip":{"dynamic":"true","properties":{"location":{"type":"geo_point"}}},

Any ideas?

Thanks

Sorry, I hadn't looked at the specific index for my apache_access, but this is the right one:

{
"apache_access-logstash-2015.09.08" : {
"mappings" : {
"apache_access" : {
"properties" : {
"@timestamp" : {
"type" : "date",
"format" : "dateOptionalTime"
},
"@version" : {
"type" : "string"
},
"agent" : {
"type" : "string"
},
"auth" : {
"type" : "string"
},
"bytes" : {
"type" : "string"
},
"clientip" : {
"type" : "string"
},
"file" : {
"type" : "string"
},
"host" : {
"type" : "string"
},
"httpversion" : {
"type" : "string"
},
"ident" : {
"type" : "string"
},
"message" : {
"type" : "string"
},
"offset" : {
"type" : "string"
},
"referrer" : {
"type" : "string"
},
"request" : {
"type" : "string"
},
"response" : {
"type" : "string"
},
"timestamp" : {
"type" : "string"
},
"type" : {
"type" : "string"
},
"verb" : {
"type" : "string"
}
}
}
}
}
}

And I don't see a geoip or geo_point field. Any ideas?

If that above mapping is the index you are trying to use geo mapping on, it has no geo field.
You'd need to check your Logstash config.

I don't know where I should turn geo mapping on. I checked my Logstash configuration but I don't see any offending line :pensive:; maybe my index configuration is wrong.

When Logstash catches my Apache access log and parses it, it should convert the "clientip" into a geoip location.

COMMONAPACHELOG %{IPORHOST:clientip} %{USER:ident} %{USER:auth} \[%{HTTPDATE:timestamp}\] "(?:%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion})?|%{DATA:rawrequest})" %{NUMBER:response} (?:%{NUMBER:bytes}|-)
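As an aside, the fields that pattern captures can be mimicked with a plain regex; a rough Python sketch follows (the log line below is made up, and the regex is a simplification of the real grok pattern):

```python
import re

# Simplified regex equivalent of COMMONAPACHELOG's main captures.
COMMON_LOG = re.compile(
    r'(?P<clientip>\S+) (?P<ident>\S+) (?P<auth>\S+) '
    r'\[(?P<timestamp>[^\]]+)\] "(?P<verb>\S+) (?P<request>\S+)'
    r'(?: HTTP/(?P<httpversion>[\d.]+))?" '
    r'(?P<response>\d+) (?P<bytes>\d+|-)'
)

# Hypothetical access-log line:
line = ('203.0.113.9 - frank [08/Sep/2015:10:15:32 +0200] '
        '"GET /index.html HTTP/1.1" 200 2326')
m = COMMON_LOG.match(line)
print(m.group("clientip"), m.group("response"))  # 203.0.113.9 200
```

The key point is that clientip only exists for the geoip filter to consume if this parse succeeds; a grok failure means no clientip field and therefore no geoip data.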

{"type":"string","index":"not_analyzed","ignore_above":256}}},"geoip":{"dynamic":"true","properties":{"location":{"type":"geo_point"}}},"host":{"type":"string","norms":

I don't know why the clientip and geoip index mappings appear disabled:

"clientip" : {
  "type" : "string",
  "norms" : {
    "enabled" : false
  }
},
"geoip" : {
  "dynamic" : "true",
  "properties" : {
    "location" : {
      "type" : "geo_point"
    }
  }
}
Can I create my own index? It always appears with the syntax "logstash-date".

In Kibana, geoip.location appears configured as indexed but not analyzed.

Finally I configured my own index.

output {
  elasticsearch {
    host => "marioneto01"
    cluster => "oknels"
    action => "index"
    index => "oknindex-%{+dd.MM.YYYY}"
  }
  stdout { codec => rubydebug }
}

The mappings are:

{
"oknindex-15.09.2015" : {
"mappings" : {
"apache_access" : {
"properties" : {
"@timestamp" : {
"type" : "date",
"format" : "dateOptionalTime"
},
"@version" : {
"type" : "string"
},
"agent" : {
"type" : "string"
},
"auth" : {
"type" : "string"
},
"bytes" : {
"type" : "string"
},
"clientip" : {
"type" : "string"
},
"file" : {
"type" : "string"
},
"host" : {
"type" : "string"
},
"httpversion" : {
"type" : "string"
},
"ident" : {
"type" : "string"
},
"message" : {
"type" : "string"
},
"offset" : {
"type" : "string"
},
"referrer" : {
"type" : "string"
},
"request" : {
"type" : "string"
},
"response" : {
"type" : "string"
},
"timestamp" : {
"type" : "string"
},
"type" : {
"type" : "string"
},
"verb" : {
"type" : "string"
}
}
},
"syslog" : {
"properties" : {
"@timestamp" : {
"type" : "date",
"format" : "dateOptionalTime"
},
"@version" : {
"type" : "string"
},
"file" : {
"type" : "string"
},
"host" : {
"type" : "string"
},
"message" : {
"type" : "string"
},
"offset" : {
"type" : "string"
},
"received_at" : {
"type" : "date",
"format" : "dateOptionalTime"
},
"received_from" : {
"type" : "string"
},
"syslog_facility" : {
"type" : "string"
},
"syslog_facility_code" : {
"type" : "long"
},
"syslog_hostname" : {
"type" : "string"
},
"syslog_message" : {
"type" : "string"
},
"syslog_pid" : {
"type" : "string"
},
"syslog_program" : {
"type" : "string"
},
"syslog_severity" : {
"type" : "string"
},
"syslog_severity_code" : {
"type" : "long"
},
"syslog_timestamp" : {
"type" : "string"
},
"type" : {
"type" : "string"
}
}
}
}
}
}

My apache and apache_access filters in Logstash are configured in different files:

filter {
  if [type] == "apache_error" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
      add_field => [ "received_from", "%{host}" ]
    }
  }
}

filter {
  if [type] == "apache_access" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }
  geoip {
    source => "clientip"
    target => "geoip"
    database => "/etc/logstash/GeoLiteCity.dat"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
  mutate {
    convert => [ "[geoip][coordinates]", "float" ]
  }
}

The geoip location doesn't appear in Kibana.

What does the output look like, the json before it's pushed into ES?
Use stdout with rubydebug to check.
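If it helps, the geoip filter can also be exercised in isolation (a sketch; the type name matches the configs above) by feeding a log line on stdin:

```
input  { stdin { type => "apache_access" } }
output { stdout { codec => rubydebug } }
```

Run Logstash with this plus the filter file, paste an access-log line, and check whether the printed event contains a populated [geoip] field or a _grokparsefailure tag.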

I have debug output configured:

[DEPRECATED] use require 'concurrent' instead of require 'concurrent_ruby'
[2015-09-15 11:25:57.851] WARN -- Concurrent: [DEPRECATED] Java 7 is deprecated, please use Java 8.
Java 7 support is only best effort, it may not work. It will be removed in next release (1.0).
[2015-09-15 11:25:58.577] WARN -- Concurrent::Condition: [DEPRECATED] Will be replaced with Synchronization::Object in v1.0.
called on: /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-lumberjack-1.0.2/lib/logstash/sized_queue_timeout.rb:15:in `initialize'
[2015-09-15 11:25:58.579] WARN -- Concurrent::Condition: [DEPRECATED] Will be replaced with Synchronization::Object in v1.0.
called on: /opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-input-lumberjack-1.0.2/lib/logstash/sized_queue_timeout.rb:16:in `initialize'
sep 15, 2015 11:26:00 AM org.elasticsearch.node.internal.InternalNode
INFORMACIÓN: [logstash-marioneto01-6297-11634] version[1.7.0], pid[6297], build[929b973/2015-07-16T14:31:07Z]
sep 15, 2015 11:26:00 AM org.elasticsearch.node.internal.InternalNode
INFORMACIÓN: [logstash-marioneto01-6297-11634] initializing ...
sep 15, 2015 11:26:01 AM org.elasticsearch.plugins.PluginsService
INFORMACIÓN: [logstash-marioneto01-6297-11634] loaded [], sites []
sep 15, 2015 11:26:02 AM org.elasticsearch.bootstrap.Natives
ADVERTENCIA: JNA not found. native methods will be disabled.
sep 15, 2015 11:26:03 AM org.elasticsearch.node.internal.InternalNode
INFORMACIÓN: [logstash-marioneto01-6297-11634] initialized
sep 15, 2015 11:26:03 AM org.elasticsearch.node.internal.InternalNode start
INFORMACIÓN: [logstash-marioneto01-6297-11634] starting ...
sep 15, 2015 11:26:03 AM org.elasticsearch.transport.TransportService doStart
INFORMACIÓN: [logstash-marioneto01-6297-11634] bound_address {inet[/0:0:0:0:0:0:0:0:9301]}, publish_address {inet[/10.29.0.71:9301]}
sep 15, 2015 11:26:03 AM org.elasticsearch.discovery.DiscoveryService doStart
INFORMACIÓN: [logstash-marioneto01-6297-11634] oknels/8JA1K1caSjW5LdCSXShU9A
sep 15, 2015 11:26:06 AM org.elasticsearch.cluster.service.InternalClusterService$UpdateTask run
INFORMACIÓN: [logstash-marioneto01-6297-11634] detected_master [Gregory Gideon][HWQyrgxoTQieSesu9NPClw][marioneto01][inet[/10.29.0.71:9300]], added {[Gregory Gideon][HWQyrgxoTQieSesu9NPClw][marioneto01][inet[/10.29.0.71:9300]],}, reason: zen-disco-receive(from master [[Gregory Gideon][HWQyrgxoTQieSesu9NPClw][marioneto01][inet[/10.29.0.71:9300]]])
sep 15, 2015 11:26:06 AM org.elasticsearch.node.internal.InternalNode start
INFORMACIÓN: [logstash-marioneto01-6297-11634] started

For now I've configured my own index:

output {
  elasticsearch {
    host => "marioneto01"
    cluster => "oknels"
    action => "index"
    index => "oknindex-%{+dd.MM.YYYY}"
    template => "/opt/logstash/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-1.0.5-java/lib/logstash/outputs/elasticsearch/elasticsearch-oknindex-template.json"
    template_name => "oknindex"
  }
  stdout { codec => rubydebug }
}

I created a template:

{
  "template" : "oknindex",
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  "mappings" : {
    "_default_" : {
      "_all" : { "enabled" : true, "omit_norms" : true },
      "dynamic_templates" : [ {
        "message_field" : {
          "match" : "message",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true
          }
        }
      }, {
        "string_fields" : {
          "match" : "*",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fields" : {
              "raw" : { "type" : "string", "index" : "not_analyzed", "ignore_above" : 256 }
            }
          }
        }
      } ],
      "properties" : {
        "@version" : { "type" : "string", "index" : "not_analyzed" },
        "geoip" : {
          "type" : "object",
          "dynamic" : true,
          "properties" : {
            "location" : { "type" : "geoip" }
          }
        }
      }
    }
  }
}
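Two details in that template are worth double-checking: in Elasticsearch the lat/lon mapping type is geo_point (there is no "geoip" mapping type), and the template field is a pattern matched against index names, so with dated indices like oknindex-15.09.2015 it needs a wildcard. A corrected fragment might look like:

```
{
  "template" : "oknindex-*",
  ...
  "properties" : {
    "geoip" : {
      "type" : "object",
      "dynamic" : true,
      "properties" : {
        "location" : { "type" : "geo_point" }
      }
    }
  }
}
```

A template is only applied when a new index is created, so after changing it the current index would need to be deleted or rolled over before the geo_point mapping shows up.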

But now the fields are empty.

curl '10.29.0.71:9200/okn*/_mapping/?pretty'
{
"oknindex-15.09.2015" : {
"mappings" : {
"apache_access" : {
"properties" : {
"@timestamp" : {
"type" : "date",
"format" : "dateOptionalTime"
},
"@version" : {
"type" : "string"
},
"agent" : {
"type" : "string"
},
"auth" : {
"type" : "string"
},
"bytes" : {
"type" : "string"
},
"clientip" : {
"type" : "string"
},
"file" : {
"type" : "string"
},
"host" : {
"type" : "string"
},
"httpversion" : {
"type" : "string"
},
"ident" : {
"type" : "string"
},
"message" : {
"type" : "string"
},
"offset" : {
"type" : "string"
},
"referrer" : {
"type" : "string"
},
"request" : {
"type" : "string"
},
"response" : {
"type" : "string"
},
"timestamp" : {
"type" : "string"
},
"type" : {
"type" : "string"
},
"verb" : {
"type" : "string"
}
}
},
"syslog" : {
"properties" : {
"@timestamp" : {
"type" : "date",
"format" : "dateOptionalTime"
},
"@version" : {
"type" : "string"
},
"file" : {
"type" : "string"
},
"host" : {
"type" : "string"
},
"message" : {
"type" : "string"
},
"offset" : {
"type" : "string"
},
"received_at" : {
"type" : "date",
"format" : "dateOptionalTime"
},
"received_from" : {
"type" : "string"
},
"syslog_facility" : {
"type" : "string"
},
"syslog_facility_code" : {
"type" : "long"
},
"syslog_hostname" : {
"type" : "string"
},
"syslog_message" : {
"type" : "string"
},
"syslog_pid" : {
"type" : "string"
},
"syslog_program" : {
"type" : "string"
},
"syslog_severity" : {
"type" : "string"
},
"syslog_severity_code" : {
"type" : "long"
},
"syslog_timestamp" : {
"type" : "string"
},
"type" : {
"type" : "string"
}
}
}
}
}
}

But in Kibana I see the fields populated with the correct information, except geoip.