Hi, I'm getting a parse_exception ("geo_point expected") using Logstash 5.0.0 with Elasticsearch 5.0.0.
Here is the log from Logstash:
[2016-11-07T21:55:30,742][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"coptic-20161108", :_type=>"location", :_routing=>nil}, 2016-11-08T02:55:28.897Z HanyiMac27.local -35.21973,149.10198,Cnr Marybirngong Ave,and Georgina Cres,KALEEN, ACT,St Mark,26176,Australia,www.stmarkact.org/,(+61) 402108823], :response=>{"index"=>{"_index"=>"coptic-20161108", "_type"=>"location", "_id"=>"AVhB3KfyFo4-yI3HoAP9", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"parse_exception", "reason"=>"geo_point expected"}}}}}
Here is the Elasticsearch 5.0.0 log (it looks like the geo_point field "coordinates" was created):
[coptic-20161108][0] failed to execute bulk item (index) index {[coptic-20161108][location][AVhB3KfyFo4-yI3HoAP9], source[{"StreetName":"and Georgina Cres","coordinates":[149.10198,-35.21973],"PostalCode":"26176","Website":"www.stmarkact.org/","Latitude":"-35.21973","City":"KALEEN","message":"-35.21973,149.10198,Cnr Marybirngong Ave,and Georgina Cres,KALEEN, ACT,St Mark,26176,Australia,www.stmarkact.org/,(+61) 402108823","type":"location","Longitude":"149.10198","Province":" ACT","path":"/Users/hanyishak/Google Drive/CopticFindData/coptic3.csv","ChurchName":"St Mark","@timestamp":"2016-11-08T02:55:28.897Z","Phone":"(+61) 402108823","@version":"1","host":"HanyiMac27.local","Country":"Australia","StreetNum":"Cnr Marybirngong Ave"}]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse
Here is the Logstash configuration file:
input {
  file {
    path => "/Users/hanyishak/Google Drive/CopticFindData/coptic3.csv"
    type => "location"
    start_position => "beginning"
    user => ["logstash_internal"]
    password => ["changeme"]
  }
}
filter {
  csv {
    columns => [
      "Latitude",
      "Longitude",
      "StreetNum",
      "StreetName",
      "City",
      "Province",
      "ChurchName",
      "PostalCode",
      "Country",
      "Website",
      "Phone"
    ]
    separator => ","
    add_field => [ "[coordinates]", "%{[Longitude]}" ]
    add_field => [ "[coordinates]", "%{[Latitude]}" ]
    user => ["logstash_internal"]
    password => ["changeme"]
  }
  mutate { convert => [ "[coordinates]", "float" ] }
}
output {
  stdout { codec => dots }
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    manage_template => true
    index => "coptic-%{+yyyyMMdd}"
    template => "/Users/hanyishak/Downloads/elasticsearch-5.0.0/logstash-template.json"
    template_name => "copticdata"
    user => ["logstash_internal"]
    password => ["changeme"]
  }
}
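In case it's relevant: the two add_field lines above are my attempt to build the geo_point array form [lon, lat]. An alternative I could try instead (just a sketch, assuming the csv filter emits the Latitude and Longitude fields as shown in the source above) is the single "lat,lon" string form, which geo_point should also accept:

```
filter {
  mutate {
    # hypothetical alternative: "lat,lon" string form instead of the [lon, lat] array
    add_field => { "coordinates" => "%{Latitude},%{Longitude}" }
  }
}
```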
Here is the logstash-template.json file:
{
  "template": "coptic*",
  "order": 1,
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "streetdata": {
      "dynamic_templates": [
        {
          "string_fields": {
            "mapping": {
              "index": "analyzed",
              "omit_norms": true,
              "type": "string",
              "fields": {
                "raw": {
                  "index": "not_analyzed",
                  "type": "string"
                }
              }
            },
            "match_mapping_type": "string",
            "match": "*"
          }
        }
      ],
      "_all": {
        "enabled": false
      },
      "properties": {
        "coordinates": {
          "type": "geo_point"
        }
      }
    }
  }
}
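One thing I notice while re-reading this: the template defines its mappings under the type "streetdata", but my documents are indexed with type "location". I'm not sure whether the coordinates geo_point mapping still applies across types in that case. A variant of the template I could try, trimmed to just the relevant part with the mapping type renamed to match, would be:

```
{
  "template": "coptic*",
  "order": 1,
  "mappings": {
    "location": {
      "properties": {
        "coordinates": {
          "type": "geo_point"
        }
      }
    }
  }
}
```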
The error is the same for every row in the CSV file. Here is the row that produced the log entries above:
-35.21973,149.10198,Cnr Marybirngong Ave,and Georgina Cres,KALEEN, ACT,St Mark,26176,Australia,www.stmarkact.org/,(+61) 402108823
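For reference, these are the geo_point input formats as I understand them from the Elasticsearch docs (note the latitude/longitude order differs between the string and array forms; the indexed source above uses the array form):

```
"coordinates": "-35.21973,149.10198"                    // string form: "lat,lon"
"coordinates": [149.10198, -35.21973]                   // array form: [lon, lat]
"coordinates": { "lat": -35.21973, "lon": 149.10198 }   // object form
```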
Thoughts?