I have a table that I want to visualize in Kibana.
lead_id | longitude | latitude
When I import it into Elasticsearch using Logstash, how can I map it in Elasticsearch?
Two things:
- Store the longitude and latitude values in a field whose format can be parsed as a geo_point field (the ES documentation has details).
- Make sure the index maps that particular field as a geo_point field.
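For the second point, a minimal sketch using the Elasticsearch 2.x API (the index and type names croos_test and geo_test here simply match the ones used later in this thread; adjust them to your setup):

curl -XPUT 'http://localhost:9200/croos_test' -d '
{
  "mappings": {
    "geo_test": {
      "properties": {
        "location": {
          "type": "geo_point"
        }
      }
    }
  }
}'

The important part is that this mapping exists before the first document is indexed; otherwise dynamic mapping will pick an ordinary type for the field and you would have to reindex.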
My config file:
filter {
  csv {
    columns   => ["lead_id", "first_name", "latitude", "longitude"]
    separator => ","
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts       => ["localhost:9200"]
    index       => "croos_test"
    document_id => "%{lead_id}"
    location    => [ "%{latitude}", "%{longitude}" ]
  }
}
My mapping:
"mappings": {
"geo_test": {
"_timestamp": {
"enabled": true
},
"properties": {
"first_name": {
"type": "string",
"analyzer": "standard"
},
"location": {
"type": "geo_point"
}
}
}
}
How do I make the index map that particular field as a geo_point field?
location=>[ "%{latitude}", "%{longitude}" ]
No, this doesn't work. Use the mutate filter and its add_field option to create new fields. To create an array field I believe the following works:
mutate {
  # geo_point arrays expect [lon, lat] (GeoJSON order), so add longitude first
  add_field => ["location", "%{longitude}"]
  add_field => ["location", "%{latitude}"]
}
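If you prefer the lat/lon object form of a geo_point (which is the shape that appears in the rubydebug output further down this thread), something like the following should also work, assuming the CSV columns are named latitude and longitude as in the config above:

mutate {
  # build a geo_point object: { "lat" => ..., "lon" => ... }
  add_field => {
    "[location][lat]" => "%{latitude}"
    "[location][lon]" => "%{longitude}"
  }
}

The values remain strings in the event; Elasticsearch parses them as coordinates as long as the location field is mapped as geo_point.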
Thanks a lot Magnus.
{
    "message" => "16798207,Yonago,35.433,133.333\r",
    "@version" => "1",
    "@timestamp" => "2016-02-29T09:30:26.379Z",
    "path" => "country1.csv",
    "host" => "Lenovo-PC",
    "type" => "data",
    "id" => "16798207",
    "first_name" => "Croos",
    "location" => {
        "lat" => "35.433",
        "lon" => "133.333"
    }
}
{
    "message" => "16798463,Izumo,35.367,132.767\r",
    "@version" => "1",
    "@timestamp" => "2016-02-29T09:30:26.379Z",
    "path" => "country1.csv",
    "host" => "Lenovo-PC",
    "type" => "data",
    "id" => "16798463",
    "first_name" => "Vinith",
    "location" => {
        "lat" => "35.367",
        "lon" => "132.767"
    }
}
Now Logstash runs successfully. My remaining problem is visualizing this data in Kibana; I will start another thread for that.