We are running Kibana, Elasticsearch, and Logstash 6.2.2. Based on the IP addresses in our log data, we want to plot the location of site visitors on a map in Kibana.
Our mapping is set up with a field ip of type ip and another field location of type geo_point, and Logstash processes the IP as follows:
geoip {
  source => "ip"
  target => "location"
}
The Logstash output is:
{
  ...
  "location" => {
    "region_name" => "Texas",
    "location" => {
      "lat" => 32.7791,
      "lon" => -96.8028
    },
    "longitude" => -96.8028,
    "region_code" => "TX",
    "country_code3" => "US",
    "postal_code" => "75202",
    "country_code2" => "US",
    "timezone" => "America/Chicago",
    "country_name" => "United States",
    "latitude" => 32.7791,
    "ip" => "63.143.42.246",
    "continent_code" => "NA",
    "dma_code" => 623,
    "city_name" => "Dallas"
  },
...
However, Elasticsearch rejects the document at index time, and Logstash reports:
[2018-03-16T19:31:21,296][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>"access-app02-test_2018-03-16T17:47:11.000Z_2113", :_index=>"access-dev", :_type=>"doc", :_routing=>nil}, #LogStash::Event:0x2e5312a4], :response=>{"index"=>{"_index"=>"access-dev1", "_type"=>"doc", "_id"=>"access-app02-test_2018-03-16T17:47:11.000Z_2113", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"parse_exception", "reason"=>"field must be either [lat], [lon] or [geohash]"}}}}}
I'm not sure what that means: geohash is not a field type as far as I know, and [lat] and [lon] are subfields of [location].
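For reference, my understanding is that a geo_point field only accepts a few input shapes: an object with exactly lat and lon keys, a "lat,lon" string, a geohash string, or a [lon, lat] array. If that is right, then extra keys like region_name and city_name in the object above would be what triggers the error, since they are not [lat], [lon], or [geohash]. Roughly:

```json
{ "location": { "lat": 32.7791, "lon": -96.8028 } }
{ "location": "32.7791,-96.8028" }
{ "location": [ -96.8028, 32.7791 ] }
```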
"mappings": {
"doc": {
"properties": {
"bytesSent": {
"type": "long"
},
"ip": {
"type": "ip"
},
"location": {
"type": "geo_point"
},
...
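For completeness, the (untested) workaround I am considering is to let geoip write to a scratch field and then copy only the coordinate pair into location; the target name geoip below is just my placeholder:

```ruby
geoip {
  source => "ip"
  target => "geoip"   # scratch field holding the full geoip object
}
mutate {
  # move only the lat/lon sub-object into the geo_point field
  rename => { "[geoip][location]" => "[location]" }
  # drop the remaining geoip metadata (applied after rename)
  remove_field => [ "geoip" ]
}
```

But it would be good to know whether that is the intended way to use the filter with a geo_point mapping.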
I might add, at this juncture, that the geoip filter docs are woefully lacking in proper examples. Indeed, their examples (add_field, remove_field) look like a copy-paste error from the mutate plugin.