Not able to use geoip in Logstash 2.4

Hi all,
I'm trying to use geoip on an Apache access log with Logstash 2.4, Elasticsearch 2.4, and Kibana 4.6.

My Logstash configuration is:

input {
  file {
    path => "/var/log/httpd/access_log"
    type => "apache"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  geoip {
    source => "clientip"
    target => "geoip"
    database => "/home/elk/logstash-2.4.0/GeoLiteCity.dat"
    #add_field => { "foo_%{somefield}" => "Hello world, from %{host}" }
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
  mutate {
    convert => [ "[geoip][coordinates]", "float" ]
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["192.168.56.200:9200"]
    sniffing => true
    manage_template => false
    index => "apache-geoip-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}

And when Logstash parses an access log line, the output is:

{
    "message" => "xxx.xxx.xxx.xxx [24/Oct/2016:14:46:30 +0900] HTTP/1.1 8197 /images/egovframework/com/cmm/er_logo.jpg 200",
    "@version" => "1",
    "@timestamp" => "2016-10-24T05:46:34.505Z",
    "path" => "/NCIALOG/JBOSS/SMBA/default-host/access_log.2016-10-24",
    "host" => "smba",
    "type" => "jboss_access_log",
    "clientip" => "xxx.xxxx.xxx.xxx",
    "geoip" => {
        "ip" => "xxx.xxx.xxx.xxx",
        "country_code2" => "KR",
        "country_code3" => "KOR",
        "country_name" => "Korea, Republic of",
        "continent_code" => "AS",
        "region_name" => "11",
        "city_name" => "Seoul",
        "latitude" => xx.5985,
        "longitude" => xxx.97829999999999,
        "timezone" => "Asia/Seoul",
        "real_region_name" => "Seoul-t'ukpyolsi",
        "location" => [
            [0] xxx.97829999999999,
            [1] xx.5985
        ],
        "coordinates" => [
            [0] xxx.97829999999999,
            [1] xx.5985
        ]
    }
}

When I create a tile map from this data, I get this error in the Kibana web UI:

"index pattern does not contain any of the following field types: geo_point"

I am not able to see any geo_point field.

Please help me.
Thanks.
Daniel.

The default index template that ships with Logstash assumes that your index names match logstash-*. Since you've renamed your indexes you have to install a matching index template (or just accept the default index name).
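For example, a minimal template along these lines should be enough to get geoip.location mapped as geo_point (an untested sketch for ES 2.x; the apache-geoip-* pattern is taken from your config, so adjust the host and pattern as needed):

curl -XPUT 'http://192.168.56.200:9200/_template/apache-geoip' -d '
{
  "template" : "apache-geoip-*",
  "mappings" : {
    "_default_" : {
      "properties" : {
        "geoip" : {
          "properties" : {
            "location" : { "type" : "geo_point" }
          }
        }
      }
    }
  }
}'

Note that this only pins down geoip.location; all other fields will still be mapped dynamically, and the template only applies to indexes created after it is installed.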

Hi magnusbaeck,

I changed my index name from jboss-access* to logstash-*, but the result is the same.

[grkim@smba ~]$ curl http://192.168.0.50:9200/logstash*/_mapping?pretty
{
  "logstash-2016.10" : {
    "mappings" : {
      "event" : {
        "properties" : {
          "@timestamp" : {
            "type" : "date",
            "format" : "strict_date_optional_time||epoch_millis"
          },
          "@version" : { "type" : "string" },
          "clientip" : { "type" : "string" },
          "geoip" : {
            "properties" : {
              "city_name" : { "type" : "string" },
              "continent_code" : { "type" : "string" },
              "country_code2" : { "type" : "string" },
              "country_code3" : { "type" : "string" },
              "country_name" : { "type" : "string" },
              "ip" : { "type" : "string" },
              "latitude" : { "type" : "double" },
              "location" : { "type" : "double" },
              "longitude" : { "type" : "double" },
              "real_region_name" : { "type" : "string" },
              "region_name" : { "type" : "string" },
              "timezone" : { "type" : "string" }
            }
          },
          "host" : { "type" : "string" },
          "message" : { "type" : "string" },
          "path" : { "type" : "string" },
          "received_at" : {
            "type" : "date",
            "format" : "strict_date_optional_time||epoch_millis"
          },
          "received_from" : { "type" : "string" },
          "syslog_facility" : { "type" : "string" },
          "syslog_facility_code" : { "type" : "long" },
          "syslog_hostname" : { "type" : "string" },
          "syslog_message" : { "type" : "string" },
          "syslog_pid" : { "type" : "string" },
          "syslog_program" : { "type" : "string" },
          "syslog_severity" : { "type" : "string" },
          "syslog_severity_code" : { "type" : "long" },
          "syslog_timestamp" : { "type" : "string" },
          "tags" : { "type" : "string" },
          "type" : { "type" : "string" }
        }
      }
    }
  }
}

Am I missing some configuration?

Thanks.

It is possible that the index template for the logstash-* index is not getting applied, as you have disabled this (manage_template => false) in your Elasticsearch output.
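Something like this in the output should let Logstash install its default template, which maps geoip.location as geo_point for indexes matching logstash-* (a sketch based on your config above; manage_template defaults to true, so you can also just drop the option):

elasticsearch {
  hosts => ["192.168.0.50:9200"]
  manage_template => true              # let Logstash install its default template
  index => "logstash-%{+YYYY.MM.dd}"   # must match the template's logstash-* pattern
}

Keep in mind that templates only take effect when an index is created, so indexes that already exist keep their old mappings.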

Hi Christian,

Thanks for your reply.
Unfortunately it is not working for me :tired_face:
The results are the same.

I removed the other options from my output configuration, like this:

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["192.168.0.50:9200"]
    manage_template => true
    index => "logstash-%{+YYYY.MM.dd}"
  }
}

I can see geoip.location when I execute curl http://192.168.0.50:9200/logstash*/_mapping?pretty, but I still cannot use the map in Kibana.
The error message from Kibana is:

" The "logstash-*" index pattern does not contain any of the following field types: geo_point "

Have you refreshed the field list in Kibana?

Hi magnusbaeck,

I refreshed it. I got a new mapping conflict warning message. I don't know why... and I still have the same situation.

This is my new Logstash configuration for my JBoss access log:

input {
  file {
    path => "/NCIALOG/JBOSS/SMBA/default-host/access_log.*"
    type => "jboss_access_log"
    start_position => "beginning"
  }
}

filter {
  if [type] == "jboss_access_log" {
    grok {
      # 192.168.0.21 [24/Oct/2016:08:34:59 +0900] HTTP/1.1 7130 /boffice/sy/menu/MgrMenuListAx.do 200
      match => { "message" => "%{IP:clientip}" }
    }
    geoip {
      source => "clientip"
      add_tag => ["GeoIP"]
      database => "/home/elk/logstash-2.4.0/GeoLiteCity.dat"
      add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
      add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
    }
    mutate {
      convert => [ "[geoip][coordinates]", "float" ]
    }
  }
}

output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["192.168.0.50:9200"]
    manage_template => true
    index => "logstash-%{+YYYY.MM.dd}"
  }
}

"I got a new mapping conflict warning message. I don't know why..."

Some of your indexes have geoip.location mapped as geo_point and some have it mapped as double.

"and I still have the same situation."

Kibana still complains about there not being any geo_point fields even though everything indicates that geoip.location is a geo_point? Okay, I give up. Ask about this in the Kibana category. Just double-check with the get mapping API that geoip.location really is a geo_point for the most recent index.
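For example, with the hypothetical index name logstash-2016.10.25 (substitute your newest index):

curl 'http://192.168.0.50:9200/logstash-2016.10.25/_mapping/field/geoip.location?pretty'

If that returns "type" : "geo_point" for the newest index but "double" for older ones, that mismatch is exactly what produces the mapping conflict warning.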

Hi magnusbaeck,

Thanks for your help. I'll do that.

Thanks.

Hi, I found the solution.

I deleted all indices in the Elasticsearch data directory and restarted Elasticsearch and Logstash.
Now I can see geo_point and am able to use the map!
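(In case it helps someone else: I believe the same can be done without touching the data directory, by deleting the old indexes through the delete index API, assuming wildcard deletes are allowed on your cluster:

curl -XDELETE 'http://192.168.0.50:9200/logstash-*'

The indexes are then recreated with the correct template the next time Logstash writes to them.)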


Thanks!!