My data source has Longitude and Latitude in it. I have time-series data of aircraft and want to plot their progress on a map.
I am using Logstash 5.3.0 with an event stream. Below is my configuration. Currently, when the data is ingested into Elasticsearch it still shows up as a string (1st attempt) or numeric (2nd attempt and many after that). I want to get to the point where I can use maps in Kibana.
Doing the convert of longitude and latitude has no effect on the geo_point, because add_field converts them back to strings. You could do a convert on "[location][lat]", but you do not need to: geo_point will work with either strings or floats.
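A minimal filter sketch of what I mean (the source field names `lat` and `lon` are assumptions here; substitute whatever your events actually carry):

```
filter {
  mutate {
    # Build a nested location object. No convert step is needed:
    # geo_point accepts the lat/lon values as strings or floats.
    add_field => {
      "[location][lat]" => "%{lat}"
      "[location][lon]" => "%{lon}"
    }
  }
}
```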
Get rid of the template and create the index using
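The create-index request itself did not survive here; a minimal sketch, assuming the `flightpostest` index and `positional` type from the error below, would be something like:

```
PUT flightpostest
{
  "mappings": {
    "positional": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}
```

The point is that `location` is mapped as geo_point up front, before any document is indexed, so dynamic mapping never gets a chance to guess string or numeric types for it.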
@Badger - I made the changes as per your suggestion and am now getting the following exception:
[2018-03-01T23:17:27,421][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"flightpostest", :_type=>"positional", :_routing=>nil}, #&lt;LogStash::Event:0x7cda5271&gt;], :response=>{"index"=>{"_index"=>"flightpostest", "_type"=>"positional", "_id"=>"AWHj2yweXQkqogeoXm0O", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"[location] is defined as an object in mapping [positional] but this name is already used for a field in other types"}}}}
I'll throw my two cents in here. Instead of expressing my geo_point as an object, I have it configured as a string, example 2 in the reference manual. You may want to give it a shot, if you haven't already; it looks like it may be a lot less work for your pipeline.
Just to be clear, my field names are geoip.location.coordinates, geoip.location.lat, and geoip.location.lon; hence the reason for all the extra brackets... it can be confusing if you've never encountered it before.
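A sketch of that string-format approach with my field names (this assumes [geoip][location][lat] and [geoip][location][lon] already exist on the event):

```
filter {
  mutate {
    # geo_point as a "lat,lon" string - example 2 in the reference manual.
    # The nested path is written with one bracket pair per level.
    add_field => {
      "[geoip][location][coordinates]" => "%{[geoip][location][lat]},%{[geoip][location][lon]}"
    }
  }
}
```

In the index mapping, geoip.location.coordinates is then the field typed as geo_point.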
[2018-03-02T18:34:38,900][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"positional", :_type=>"positional", :_routing=>nil}, #&lt;LogStash::Event:0x76c34393&gt;], :response=>{"index"=>{"_index"=>"positional", "_type"=>"positional", "_id"=>"AWHn_p2lXQkqogeocS_l", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"[location] is defined as an object in mapping [positional] but this name is already used for a field in other types"}}}}
[2018-03-02T19:04:17,379][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"differentname", :_type=>"GPS_EVENT", :_routing=>nil}, #&lt;LogStash::Event:0x228da33f&gt;], :response=>{"index"=>{"_index"=>"differentname", "_type"=>"GPS_EVENT", "_id"=>"AWHoGcAxXQkqogeocayS", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"[location] is defined as an object in mapping [GPS_EVENT] but this name is already used for a field in other types"}}}}
[2018-03-02T19:04:17,881][INFO ][logstash.pipeline ] Pipeline has terminated {:pipeline_id=>"main", :thread=>"#<Thread:0x5495491d run>"}
Alright @Badger - got it working. It bugged the crap out of me that your dummy sample worked but didn't work with my real config. I then did a lot of forum searching on the error message being returned and found this: Errors with geo_point.
From that, I deleted my old index, put my mapping in place, and then ingested, and boom.
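For anyone landing here later, the sequence was roughly the following (index and type names assumed from the errors above, host assumed to be a default local node):

```
# Delete the old index so the conflicting mapping for [location] goes away
curl -XDELETE 'localhost:9200/flightpostest'

# Recreate the index with location mapped as geo_point before ingesting
curl -XPUT 'localhost:9200/flightpostest' -H 'Content-Type: application/json' -d '{
  "mappings": {
    "positional": {
      "properties": {
        "location": { "type": "geo_point" }
      }
    }
  }
}'
```

Deleting the index matters: once a field has been mapped (even wrongly, via dynamic mapping), its type cannot be changed in place, which is why the earlier attempts kept failing.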