Hi,
Right now I have my own fields called "lat" and "lon" in the JSON file.
I was trying to get Elasticsearch to recognize them as geo_point data, so I edited my Logstash filter:
filter {
  mutate {
    add_field => { "location" => [[lat],[lon]] }
  }
}
But it turned out that the location field is still not mapped as geo_point.
Is there something wrong with my filter?
Update: Logstash threw an error like this:
[2018-03-23T20:04:37,029][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"locationdata", :_type=>"eventdata", :_routing=>nil}, #<LogStash::Event:0x6bfc6277>], :response=>{"index"=>{"_index"=>"locationdata", "_type"=>"eventdata", "_id"=>"XNFSVWIB-SahT3BDQlYN", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"illegal latitude value [269.9986267089844] for location"}}}}}
{
"lon" => 25,
"@timestamp" => 2018-03-24T00:04:36.649Z,
"host" => "appletekiMacBook-Pro.local",
"@version" => "1",
"lat" => 15,
"location_info" => "united/michigan:[15,25]",
"location" => [
[0] "[\"lat\"]",
[1] "[\"lon\"]"
],
"path" => "/Users/apple/Desktop/Location/location.json"
}
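From the output it looks like "location" ended up holding the literal strings "[\"lat\"]" and "[\"lon\"]" instead of the actual values, so the array syntax in add_field is probably wrong. My current guess (untested, and the nested [location][lat]/[location][lon] layout is just my own assumption) is that I need sprintf field references plus an explicit geo_point mapping, something like:

filter {
  mutate {
    # copy the real values using sprintf field references
    add_field => {
      "[location][lat]" => "%{lat}"
      "[location][lon]" => "%{lon}"
    }
  }
  mutate {
    # geo_point expects numeric lat/lon
    convert => {
      "[location][lat]" => "float"
      "[location][lon]" => "float"
    }
  }
}

plus an index template that maps "location" as type geo_point, since Elasticsearch never infers geo_point from dynamic mapping on its own. Does that look right, or is there a simpler way?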