Hi, I am new to ELK, so first I want to set up reading a custom JSON file with Logstash. I want to show customer locations on a Kibana map. This is the JSON format I want to parse:
{"id":1,"first_name":"Freeman","last_name":"Jowers","email":"fjowers0@mashable.com","gender":"Male","ip_address":"15.128.77.162","latitude":9.9004655,"longitude":13.0544185,"date":"2017-10-29T17:47:59Z","country":"Nigeria"}
This is the configuration file I have used for Logstash:
filter {
  grok {
    match => ['message', '(?<body>\"id\":.*\"country\":\"[^"]+\")']
    add_field => ["json_body", "{%{body}}"]
  }
  json {
    source => "json_body"
    remove_field => ["message", "body", "json_body"]
  }
  mutate {
    # geo_point arrays are [lon, lat], so longitude must come first
    add_field => ["[geoip][location]", "%{[longitude]}"]
    add_field => ["[geoip][location]", "%{[latitude]}"]
  }
  mutate {
    convert => ["[geoip][location]", "float"]
  }
}
The problem is that in Kibana, geoip.location shows up with type number, but I need it mapped as geo_point. Can anyone tell me how to resolve this issue? I am using ELK 6.2.3.
Thanks & regards
That grok filter seems like a generally bad idea (though it has nothing to do with your problem).
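Instead of grokking the JSON back out of the message and re-parsing it, a json codec on the file input would parse each line directly (a sketch, replacing the grok and json filters; the path is taken from your config):

```
input {
  file {
    path => ["/home/sajith/Desktop/scripts/logstash-data/data-file.json"]
    codec => "json"               # parse each line as a JSON event; no grok needed
    start_position => "beginning"
  }
}
```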
It's the index mappings that determine whether a field can become a geo point. What does your elasticsearch output configuration look like?
Hi @magnusbaeck, thanks for the reply. I have been stuck on this issue for two days. This is the full configuration file:
input {
  file {
    path => ["/home/sajith/Desktop/scripts/logstash-data/data-file.json"]
    type => "json"
    start_position => "beginning"
  }
}
filter {
  grok {
    match => ['message', '(?<body>\"id\":.*\"country\":\"[^"]+\")']
    add_field => ["json_body", "{%{body}}"]
  }
  json {
    source => "json_body"
    remove_field => ["message", "body", "json_body"]
  }
  mutate {
    # geo_point arrays are [lon, lat], so longitude must come first
    add_field => ["[geoip][location]", "%{[longitude]}"]
    add_field => ["[geoip][location]", "%{[latitude]}"]
  }
  mutate {
    convert => ["[geoip][location]", "float"]
  }
}
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "sample"
  }
}
Your example would've worked if you'd used the default index name. When you chose the "sample" index name, the index template that Logstash installs for you didn't apply. I think https://www.elastic.co/blog/geoip-in-the-elastic-stack should explain this.
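For example, keeping the default logstash- prefix in the index name lets the installed template, which maps geoip.location as geo_point, match (a sketch; the exact index name here is just an illustration):

```
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    # names matching logstash-* pick up the index template Logstash installs
    index => "logstash-sample-%{+YYYY.MM.dd}"
  }
}
```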
Hi @magnusbaeck, when I stop creating the custom index it works. Is there any way to create a custom index with the geo_point data type? Anyway, thanks for your help.
Is there any way to create a custom index with the geo_point data type?
Yes, I believe it's explained in the document I linked to earlier.
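Along those lines, a minimal index template can map geoip.location as geo_point for a custom index name (a sketch for Elasticsearch 6.x; the template name and index pattern are assumptions, and "doc" is the default document type Logstash 6.x uses):

```
PUT _template/sample_template
{
  "index_patterns": ["sample*"],
  "mappings": {
    "doc": {
      "properties": {
        "geoip": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}
```

The template must be created before the index is first written to; otherwise delete and re-index so the new mapping applies.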