wajika
(wajika)
July 11, 2020, 2:43am
1
input {
  kafka {
    codec => "json"
    decorate_events => true
    ......
  }
}
filter {
  geoip {
    source => "remote_addr"
    target => "geoip"
    add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
    add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
  }
  mutate {
    convert => {
      "[geoip][coordinates]" => "float"
    }
  }
}
output {
  elasticsearch {
    hosts => ["192.168.10.139:9200","192.168.10.140:9200","192.168.10.141:9200"]
    index => "nginx"
  }
}
I saw someone mention this a long time ago. Is there any way to make a custom index use the default Logstash template?
Are you using the default index name of logstash-YYYY.MM.DD? If not, then the default template, which maps the geo_point for you, is not being used.
You could always borrow from that template to make yours map properly. Unfortunately, you'll have to reindex or wait until the next rollover to see the mapping change.
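The piece worth borrowing from the default template is the geo_point mapping. A minimal sketch of that fragment, assuming the `geoip` target and `coordinates` field used in the config above (the surrounding template body is omitted):

```json
{
  "mappings": {
    "properties": {
      "geoip": {
        "properties": {
          "coordinates": { "type": "geo_point" }
        }
      }
    }
  }
}
```

Without a mapping like this, Elasticsearch will index the converted floats as plain numbers, and geo queries and Kibana maps will not work against the field.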
Badger
July 11, 2020, 5:25pm
2
Take the default template, update the value of index_patterns to match your custom index name, and PUT it into elasticsearch.
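A sketch of those steps in Kibana console syntax, assuming the `nginx` index name from the config above and the legacy `_template` API (7.x and earlier). Fetch the default template, copy its body into the PUT with `index_patterns` changed; only the geo_point part is shown here, the rest of the copied template body goes where indicated:

```
GET _template/logstash

PUT _template/nginx
{
  "index_patterns": ["nginx*"],
  "mappings": {
    "properties": {
      "geoip": {
        "properties": {
          "coordinates": { "type": "geo_point" }
        }
      }
    }
  }
}
```

Note that the GET response wraps the template body in an outer `{"logstash": ...}` object, which has to be stripped before PUTting it back under the new name.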
wajika
(wajika)
July 13, 2020, 5:28am
3
Solved the problem, thank you
system
(system)
Closed
August 10, 2020, 5:28am
4
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.