Help with Geo_point (longitude and latitude)

I need help with importing latitude and longitude as a geo_point.

My data has longitude and latitude fields.

I have a conf file to pull the data to ES:

mutate {convert => ["latitude", "float"]}
mutate {convert => ["longitude", "float"]}
mutate {rename => {"longitude" => "[location][lon]"}}
mutate {rename => {"latitude" => "[location][lat]"}}

I use Logstash to pull the data into ES. The data gets loaded without any errors, but location.lat and location.lon show up as number, not geo_point:

location.lat number
location.lon number
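
For reference, this is how I check what the fields were actually mapped as (a sketch; ufo2 is my index name):

```
GET ufo2/_mapping
```

The latitude and longitude under location come back as dynamically mapped numbers, not as a geo_point.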

As I understand after reading through a few posts, I need to create a mapping in ES.

How do I do that? Can I add the mapping after the data is already uploaded to ES? My index name is ufo2.

Please help.

Yes. Create the mapping before indexing your first document.

See https://www.elastic.co/guide/en/elasticsearch/reference/6.1/indices-put-mapping.html

Apply the geo point type to your field. https://www.elastic.co/guide/en/elasticsearch/reference/6.1/geo-point.html
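
For example, something like this should work on 6.x (a minimal sketch, assuming the index is called ufo2 and that Logstash indexes with the default doc type; adjust the names to your setup):

```
PUT ufo2
{
  "mappings": {
    "doc": {
      "properties": {
        "location": {
          "type": "geo_point"
        }
      }
    }
  }
}
```

Note that you cannot change the mapping of a field that already exists in the index. If ufo2 already contains dynamically mapped documents, you will need to reindex into a new index that has the geo_point mapping (or delete the index and load the data again).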

Below are the steps I am following to store the geo_point:

a.) Create the index: PUT chicago_crime
b.) Create a mapping with a location field of the geo_point datatype:
PUT chicago_crime/_mapping/my_type
{
  "my_type": {
    "properties": {
      "location": {
        "type": "geo_point"
      }
    }
  }
}
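
To double-check that the mapping was applied, I verify it afterwards with (sketch):

```
GET chicago_crime/_mapping
```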

c.) Logstash: this is the Logstash conf file
input {
  file {
    path => "/home/ppunj/chicago-crime.csv"
    start_position => 'beginning'
    sincedb_path => '/dev/null'
  }
}
filter {
  csv {
    separator => ","
    columns => ["id","case","number","date","block","iucr","primary type","description","location description","arrest","domestic","district","ward","community area","year","latitude","longitude"]
  }
  date {
    match => ["date", "MM/dd/yyyy HH:mm"]
    target => "date"
    locale => "en"
  }
  mutate {convert => ["latitude", "float"]}
  mutate {convert => ["longitude", "float"]}
  mutate {
    rename => {
      "longitude" => "[location][lon]"
      "latitude" => "[location][lat]"
    }
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "chicago_crime"
  }
  stdout {}
}

d.) Sample CSV file:

id,case number,date,block,iucr,primary type,description,location description,arrest,domestic,district,ward,community area,year,latitude,longitude
10000092,HY189866,3/18/2015 19:44,047XX W OHIO ST,041A,BATTERY,AGGRAVATED: HANDGUN,STREET,FALSE,FALSE,11,28,25,2015,41.89139886,-87.74438457
10000094,HY190059,3/18/2015 23:00,066XX S MARSHFIELD AVE,4625,OTHER OFFENSE,PAROLE VIOLATION,STREET,TRUE,FALSE,7,15,67,2015,41.77337153,-87.66531947
10000095,HY190052,3/18/2015 22:45,044XX S LAKE PARK AVE,486,BATTERY,DOMESTIC BATTERY SIMPLE,APARTMENT,FALSE,TRUE,2,4,39,2015,41.81386068,-87.59664284
10000096,HY190054,3/18/2015 22:30,051XX S MICHIGAN AVE,460,BATTERY,SIMPLE,APARTMENT,FALSE,FALSE,2,3,40,2015,41.80080242,-87.62261934
10000097,HY189976,3/18/2015 21:00,047XX W ADAMS ST,031A,ROBBERY,ARMED: HANDGUN,SIDEWALK,FALSE,FALSE,11,28,25,2015,41.87806476,-87.74335401

Error after running Logstash:

[2018-02-21T16:47:43,277][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"chicago_crime", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x4bdf395e>], :response=>{"index"=>{"_index"=>"chicago_crime", "_type"=>"doc", "_id"=>"cYNWumEBzT_mw5xoIQ6O", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"[location] is defined as an object in mapping [doc] but this name is already used for a field in other types"}}}}

What am I missing?

Please format your code, logs, or configuration files using the </> icon as explained in this guide, and not the citation button. It will make your post more readable.

Or use markdown style like:

```
CODE
```

There's a live preview panel for exactly this reason.

Lots of people read these forums, and many of them will simply skip over a post that is difficult to read, because it's just too large an investment of their time to try and follow a wall of badly formatted text.
If your goal is to get an answer to your questions, it's in your interest to make it as easy to read and understand as possible.
Please update your post.

STEP 1

PUT chicago_crime

STEP 2

PUT chicago_crime/_mapping/my_type
{
  "my_type": {
    "properties": {
      "location": {
        "type": "geo_point"
      }
    }
  }
}

STEP 3 - Logstash

input {
  file {
    path => "/home/ppunj/chicago-crime.csv"
    start_position => 'beginning'
    sincedb_path => '/dev/null'
  }
}
filter {
  csv {
    separator => ","
    columns => ["id","case","number","date","block","iucr","primary type","description","location description","arrest","domestic","district","ward","community area","year","latitude","longitude"]
  }
  date {
    match => ["date", "MM/dd/yyyy HH:mm"]
    target => "date"
    locale => "en"
  }
  mutate {convert => ["latitude", "float"]}
  mutate {convert => ["longitude", "float"]}
  mutate {
    rename => {
      "longitude" => "[location][lon]"
      "latitude" => "[location][lat]"
    }
  }
}
output {
  elasticsearch {
    hosts => "http://localhost:9200"
    index => "chicago_crime"
  }
  stdout {}
}

SAMPLE CSV FILE

id,case number,date,block,iucr,primary type,description,location description,arrest,domestic,district,ward,community area,year,latitude,longitude
10000092,HY189866,3/18/2015 19:44,047XX W OHIO ST,041A,BATTERY,AGGRAVATED: HANDGUN,STREET,FALSE,FALSE,11,28,25,2015,41.89139886,-87.74438457
10000094,HY190059,3/18/2015 23:00,066XX S MARSHFIELD AVE,4625,OTHER OFFENSE,PAROLE VIOLATION,STREET,TRUE,FALSE,7,15,67,2015,41.77337153,-87.66531947
10000095,HY190052,3/18/2015 22:45,044XX S LAKE PARK AVE,486,BATTERY,DOMESTIC BATTERY SIMPLE,APARTMENT,FALSE,TRUE,2,4,39,2015,41.81386068,-87.59664284
10000096,HY190054,3/18/2015 22:30,051XX S MICHIGAN AVE,460,BATTERY,SIMPLE,APARTMENT,FALSE,FALSE,2,3,40,2015,41.80080242,-87.62261934
10000097,HY189976,3/18/2015 21:00,047XX W ADAMS ST,031A,ROBBERY,ARMED: HANDGUN,SIDEWALK,FALSE,FALSE,11,28,25,2015,41.87806476,-87.74335401

Error after running Logstash:

[2018-02-21T16:47:43,277][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"chicago_crime", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x4bdf395e>], :response=>{"index"=>{"_index"=>"chicago_crime", "_type"=>"doc", "_id"=>"cYNWumEBzT_mw5xoIQ6O", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"[location] is defined as an object in mapping [doc] but this name is already used for a field in other types"}}}}

What am I missing?

Can you share what the stdout output plugin prints? I'd like to see the documents that are sent to Elasticsearch.
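
For example, a rubydebug codec on the stdout output prints each event in a readable form (a sketch of just the output section; your elasticsearch output can stay as it is):

```
output {
  stdout { codec => rubydebug }
}
```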

I am using only a couple of records to start with, and the error I am getting is:

[2018-02-23T09:49:49,097][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"chicago_crime", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2baeadd7>], :response=>{"index"=>{"_index"=>"chicago_crime", "_type"=>"doc", "_id"=>"c4Mkw2EBzT_mw5xoPw70", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"[location] is defined as an object in mapping [doc] but this name is already used for a field in other types"}}}}

[ppunj@EdgeNode ~]$ ./logstash-6.1.1/bin/logstash -f chicago.conf
Sending Logstash's logs to /home/ppunj/logstash-6.1.1/logs which is now configured via log4j2.properties
[2018-02-23T09:49:27,114][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/home/ppunj/logstash-6.1.1/modules/netflow/configuration"}
[2018-02-23T09:49:27,171][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/home/ppunj/logstash-6.1.1/modules/fb_apache/configuration"}
[2018-02-23T09:49:28,320][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-02-23T09:49:29,748][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.1.1"}
[2018-02-23T09:49:30,954][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-02-23T09:49:43,159][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://localhost:9200/]}}
[2018-02-23T09:49:43,223][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://localhost:9200/, :path=>"/"}
[2018-02-23T09:49:43,780][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2018-02-23T09:49:43,930][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>nil}
[2018-02-23T09:49:43,941][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the type event field won't be used to determine the document _type {:es_version=>6}
[2018-02-23T09:49:43,997][INFO ][logstash.outputs.elasticsearch] Using mapping template from {:path=>nil}
[2018-02-23T09:49:44,064][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2018-02-23T09:49:44,187][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://localhost:9200"]}
[2018-02-23T09:49:44,357][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>500, :thread=>"#<Thread:0x3cee0587 run>"}
[2018-02-23T09:49:45,300][INFO ][logstash.pipeline ] Pipeline started {"pipeline.id"=>"main"}
[2018-02-23T09:49:45,856][INFO ][logstash.agent ] Pipelines running {:count=>1, :pipelines=>["main"]}
[2018-02-23T09:49:48,503][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"chicago_crime", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x7c9e0fce>], :response=>{"index"=>{"_index"=>"chicago_crime", "_type"=>"doc", "_id"=>"coMkw2EBzT_mw5xoPw7t", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"[location] is defined as an object in mapping [doc] but this name is already used for a field in other types"}}}}
2018-02-23T14:49:46.793Z EdgeNode.asotc.com id,case number,date,block,iucr,primary type,description,location description,arrest,domestic,district,ward,community area,year,latitude,longitude
[2018-02-23T09:49:49,097][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"chicago_crime", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x2baeadd7>], :response=>{"index"=>{"_index"=>"chicago_crime", "_type"=>"doc", "_id"=>"c4Mkw2EBzT_mw5xoPw70", "status"=>400, "error"=>{"type"=>"illegal_argument_exception", "reason"=>"[location] is defined as an object in mapping [doc] but this name is already used for a field in other types"}}}}
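
Reading the error again, my guess is that the location mapping I created under my_type conflicts with the doc type that Logstash actually indexes into, since a 6.x index can only have one mapping type. A sketch of what I plan to try next, assuming it is acceptable to delete and recreate the index: put the geo_point mapping under doc when the index is created.

```
DELETE chicago_crime

PUT chicago_crime
{
  "mappings": {
    "doc": {
      "properties": {
        "location": {
          "type": "geo_point"
        }
      }
    }
  }
}
```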

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.