I am having difficulty getting Logstash (2.2.4) to convert two parsed values, longitude and latitude, into a geo point for Kibana. The Elasticsearch documentation presents an example for geo points in which you write some sort of configuration file (https://www.elastic.co/guide/en/elasticsearch/guide/current/geopoints.html), but I do not know:
- Where does it go? (See my rough guess after this list.)
- Does it override the Elasticsearch template used by Logstash?
- Is there a pre-existing Logstash filter plugin that can take a latitude and longitude and produce a geo point? The only one I am aware of is geoip, but geoip assumes the IP is fixed to a set latitude and longitude, whereas in my case the IP is mobile and reports a new position every few seconds.
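Regarding the first question, my rough guess (which I have not verified) is that the mapping from that page could be applied directly to an index via the API, something like the following; the index name and document type here are just placeholders:

curl -XPUT 'localhost:9200/project-2016.06.01' -d '
{
  "mappings": {
    "logs": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}'

But doing this per index by hand seems to defeat the purpose of the template, which is why I am asking where the configuration is supposed to live.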
My Logstash configuration is as follows:
input {
  file {
    path => "/opt/project/mylog.txt"
  }
}

filter {
  grok {
    add_tag => [ "project", "message1" ]
    match => { "message" => "%{DATE_US:date} %{TIME:time} message1: name:%{DATA:name}, lat:%{DATA:latitude}, lon:%{DATA:longitude}, alt:%{DATA:altitude}" }
  }
  grok {
    add_tag => [ "project", "message2" ]
    match => { "message" => "%{DATE_US:date} %{TIME:time} message2: name:%{DATA:name}, lat:%{DATA:latitude}, lon:%{DATA:longitude}, alt:%{DATA:altitude}, delta:%{DATA:delta}, status:%{WORD:status}" }
  }
  if "message1" in [tags] {
    mutate {
      add_field => {
        "[location][lat]" => "%{latitude}"
        "[location][lon]" => "%{longitude}"
      }
    }
    mutate {
      convert => {
        "[location][lat]" => "float"
        "[location][lon]" => "float"
      }
    }
  }
  mutate {
    convert => {
      "latitude"  => "float"
      "longitude" => "float"
      "altitude"  => "float"
      "delta"     => "float"
    }
  }
}

output {
  if "message1" in [tags] and "project" in [tags] {
    elasticsearch {
      index => "project-%{+YYYY.MM.dd}"
      manage_template => "false"
      template => "/etc/logstash/templates/project-elasticsearch.json"
    }
    file {
      path => "/opt/project/message1.txt"
    }
  }
  else if "message2" in [tags] and "project" in [tags] {
    elasticsearch {
      index => "project-%{+YYYY.MM.dd}"
    }
    file {
      path => "/opt/project/message2.txt"
    }
  }
}
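One alternative I have considered, based on my (unconfirmed) reading that a geo_point field can also be indexed as a single "lat,lon" string, would be to build the location field in one step instead of the nested lat/lon fields:

filter {
  if "message1" in [tags] {
    mutate {
      # combine the parsed values into a single "lat,lon" string
      add_field => { "location" => "%{latitude},%{longitude}" }
    }
  }
}

I do not know whether that works any better with the mapping below, so I have stuck with the nested-field approach for now.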
Here is the template I copied from the Logstash elasticsearch output plugin and modified to tell Elasticsearch that the "location" field should be mapped as a geo point.
{
  "template" : "project-*",
  "settings" : {
    "index.refresh_interval" : "5s"
  },
  "mappings" : {
    "_default_" : {
      "_all" : { "enabled" : true, "omit_norms" : true },
      "dynamic_templates" : [ {
        "message_field" : {
          "match" : "message",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fielddata" : { "format" : "disabled" }
          }
        }
      }, {
        "string_fields" : {
          "match" : "*",
          "match_mapping_type" : "string",
          "mapping" : {
            "type" : "string", "index" : "analyzed", "omit_norms" : true,
            "fielddata" : { "format" : "disabled" },
            "fields" : {
              "raw" : { "type" : "string", "index" : "not_analyzed", "ignore_above" : 256 }
            }
          }
        }
      } ],
      "properties" : {
        "@timestamp" : { "type" : "date" },
        "@version" : { "type" : "string", "index" : "not_analyzed" },
        "geoip" : {
          "dynamic" : true,
          "properties" : {
            "ip" : { "type" : "ip" },
            "location" : { "type" : "geo_point" },
            "latitude" : { "type" : "float" },
            "longitude" : { "type" : "float" }
          }
        },
        "location" : {
          "type" : "geo_point"
        }
      }
    }
  }
}
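I assume I can check whether the template and mapping are actually being picked up with something like the following (the index pattern is just an example), which is how I have been inspecting the results so far:

curl -XGET 'localhost:9200/_template?pretty'
curl -XGET 'localhost:9200/project-*/_mapping?pretty'

At the moment the mapping returned for "location" does not show geo_point, which is what leads me to believe the template is not being applied.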
I would appreciate any insight into this problem, as well as any suggestions for improvement, since I am relatively new to Logstash/Elasticsearch/Kibana.