Logstash parse_exception geo_point expected

Hi, I'm getting a parse_exception error ("geo_point expected") using Logstash 5.0.0.
Here is the log from Logstash:

[2016-11-07T21:55:30,742][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"coptic-20161108", :_type=>"location", :_routing=>nil}, 2016-11-08T02:55:28.897Z HanyiMac27.local -35.21973,149.10198,Cnr Marybirngong Ave,and Georgina Cres,KALEEN, ACT,St Mark,26176,Australia,www.stmarkact.org/,(+61) 402108823], :response=>{"index"=>{"_index"=>"coptic-20161108", "_type"=>"location", "_id"=>"AVhB3KfyFo4-yI3HoAP9", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"parse_exception", "reason"=>"geo_point expected"}}}}}

Here is the Elasticsearch 5.0.0 log (it looks like the geo_point field "coordinates" was created):

[coptic-20161108][0] failed to execute bulk item (index) index {[coptic-20161108][location][AVhB3KfyFo4-yI3HoAP9], source[{"StreetName":"and Georgina Cres","coordinates":[149.10198,-35.21973],"PostalCode":"26176","Website":"www.stmarkact.org/","Latitude":"-35.21973","City":"KALEEN","message":"-35.21973,149.10198,Cnr Marybirngong Ave,and Georgina Cres,KALEEN, ACT,St Mark,26176,Australia,www.stmarkact.org/,(+61) 402108823","type":"location","Longitude":"149.10198","Province":" ACT","path":"/Users/hanyishak/Google Drive/CopticFindData/coptic3.csv","ChurchName":"St Mark","@timestamp":"2016-11-08T02:55:28.897Z","Phone":"(+61) 402108823","@version":"1","host":"HanyiMac27.local","Country":"Australia","StreetNum":"Cnr Marybirngong Ave"}]}
org.elasticsearch.index.mapper.MapperParsingException: failed to parse

Here is the Logstash configuration file:
input {
  file {
    path => "/Users/hanyishak/Google Drive/CopticFindData/coptic3.csv"
    type => "location"
    start_position => "beginning"
    user => ["logstash_internal"]
    password => ["changeme"]
  }
}

filter {
  csv {
    columns => [
      "Latitude",
      "Longitude",
      "StreetNum",
      "StreetName",
      "City",
      "Province",
      "ChurchName",
      "PostalCode",
      "Country",
      "Website",
      "Phone"
    ]
    separator => ","
    add_field => [ "[coordinates]", "%{[Longitude]}" ]
    add_field => [ "[coordinates]", "%{[Latitude]}" ]
    user => ["logstash_internal"]
    password => ["changeme"]
  }

  mutate { convert => [ "[coordinates]", "float" ] }
}

output {
  stdout { codec => dots }
  elasticsearch {
    action => "index"
    hosts => ["localhost:9200"]
    manage_template => true
    index => "coptic-%{+yyyyMMdd}"
    template => "/Users/hanyishak/Downloads/elasticsearch-5.0.0/logstash-template.json"
    template_name => "copticdata"
    user => ["logstash_internal"]
    password => ["changeme"]
  }
}

Here is the logstash-template.json file:
{
  "template": "coptic*",
  "order": 1,
  "settings": {
    "number_of_shards": 1
  },
  "mappings": {
    "streetdata": {
      "dynamic_templates": [
        {
          "string_fields": {
            "mapping": {
              "index": "analyzed",
              "omit_norms": true,
              "type": "string",
              "fields": {
                "raw": {
                  "index": "not_analyzed",
                  "type": "string"
                }
              }
            },
            "match_mapping_type": "string",
            "match": "*"
          }
        }
      ],
      "_all": {
        "enabled": false
      },
      "properties": {
        "coordinates": {
          "type": "geo_point"
        }
      }
    }
  }
}

The error messages are the same for all rows in the CSV file. Here is the row corresponding to the log entries above:
-35.21973,149.10198,Cnr Marybirngong Ave,and Georgina Cres,KALEEN, ACT,St Mark,26176,Australia,www.stmarkact.org/,(+61) 402108823

Thoughts?

Please always format logs and configuration files as preformatted text.

You have a geo_point field in your index (named coordinates, it would seem) but the data you're trying to add to the field can't be recognized as a geo_point. Use a stdout { codec => rubydebug } output to see what your events actually look like. Once they look like what you expect, re-enable the elasticsearch output.
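For example, something like this while debugging (just a sketch; re-add your elasticsearch block once the events look right):

output {
  # Temporarily print full events instead of indexing them,
  # so every field (including "coordinates") is visible.
  stdout { codec => rubydebug }
}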

The output looks OK for the first third of the file (which has under 1000 rows).
First record:
{
    "StreetName" => "and Georgina Cres",
    "coordinates" => [
        [0] 149.10198,
        [1] -35.21973
    ],
    "PostalCode" => "26176",
    "Website" => "www.stmarkact.org/",
    "Latitude" => "-35.21973",
    "City" => "KALEEN",
    "message" => "-35.21973,149.10198,Cnr Marybirngong Ave,and Georgina Cres,KALEEN, ACT,St Mark,26176,Australia,www.stmarkact.org/,(+61) 402108823",
    "type" => "location",
    "Longitude" => "149.10198",
    "Province" => " ACT",
    "path" => "/Users/hanyishak/Google Drive/CopticFindData/coptic4.csv",
    "ChurchName" => "St Mark",
    "@timestamp" => 2016-11-09T04:00:36.299Z,
    "Phone" => "(+61) 402108823",
    "@version" => "1",
    "host" => "HanyiMac27.local",
    "Country" => "Australia",
    "StreetNum" => "Cnr Marybirngong Ave"
}

Then about the first quarter of the file's entries (which looked OK) appear again, each with a warning:
[2016-11-08T23:00:38,175][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"coptic-20161109", :_type=>"location", :_routing=>nil}, 2016-11-09T04:00:36.299Z HanyiMac27.local -35.21973,149.10198,Cnr Marybirngong Ave,and Georgina Cres,KALEEN, ACT,St Mark,26176,Australia,www.stmarkact.org/,(+61) 402108823], :response=>{"index"=>{"_index"=>"coptic-20161109", "_type"=>"location", "_id"=>"AVhHPqRl37PS_SwgC3_L", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"parse_exception", "reason"=>"geo_point expected"}}}}}

The log then continues parsing properly from the last 'good' record. This is again followed by a seemingly repeated set of rows that previously looked OK but now produce the above error. This happens once more before the end of the file.
The last row of the input appears neither as parsed nor as an error.

Is what looks like a second pass over the data the execution of the following command from the configuration file?

mutate { convert => [ "[coordinates]", "float" ] }

If so, and this is what is failing, what is wrong with the "coordinates" field? It looks OK to my novice, untrained eye:

"coordinates" => [
    [0] 149.10198,
    [1] -35.21973
]

Thanks again.
P.S. How do I make preformatted text? I am copying text from the macOS TextEdit app.

Send the results to a stdout { codec => rubydebug } output so that we can see the actual results. Once we know exactly what a failing message (and the corresponding line of input) looks like, we can fix the problem.

P.S. How do I make preformatted text? I am copying text from the macOS TextEdit app.

There's a toolbar button for it.

Yes, the items I pasted are from stdout. I will format them better to make that clear when I get to that computer in 8 hours.

Logstash configuration file and Elasticsearch template as above.
Here is the csv input:
-35.21973,149.10198,Cnr Marybirngong Ave,and Georgina Cres,KALEEN, ACT,St Mark,26176,Australia,www.stmarkact.org/,(+61) 402108823

Logstash stdout:
{
    "StreetName" => nil,
    "coordinates" => [
        [0] 31.81498,
        [1] 26.54863
    ],
    "PostalCode" => nil,
    "Website" => nil,
    "Latitude" => "26.54863",
    "City" => "Markaz Akhmim",
    "message" => "26.54863,31.81498,,,Markaz Akhmim,SOHAG,Monastery of St George,,Egypt,,",
    "type" => "location",
    "Longitude" => "31.81498",
    "Province" => "SOHAG",
    "path" => "/Users/hanyishak/Google Drive/CopticFindData/coptic5.csv",
    "ChurchName" => "Monastery of St George",
    "@timestamp" => 2016-11-11T01:55:56.771Z,
    "Phone" => nil,
    "@version" => "1",
    "host" => "HanyiMac27.local",
    "Country" => "Egypt",
    "StreetNum" => nil
}


Then, after about 300 records are processed, this first record is output again with the following:
[2016-11-10T20:55:58,398][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"coptic-20161111", :_type=>"location", :_routing=>nil}, 2016-11-11T01:55:56.771Z HanyiMac27.local 26.54863,31.81498,,,Markaz Akhmim,SOHAG,Monastery of St George,,Egypt,,], :response=>{"index"=>{"_index"=>"coptic-20161111", "_type"=>"location", "_id"=>"AVhRGToXxr2wef7dsxl6", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"parse_exception", "reason"=>"geo_point expected"}}}}}

I am guessing this is a result of:
mutate {convert => [ "[coordinates]", "float"]}

I used something like this:

   mutate {
            add_field => [ "[location]", "%{Longitude}" ]
            add_field => [ "[location]", "%{Latitude}" ]
        }

        mutate {
            rename => [ "[location]", "[geoip][location]" ]
            convert => [ "[geoip][location]", "float" ]
        }
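Note that this pattern assumes the index mapping declares [geoip][location] as a geo_point. The default logstash-* template does that; with a custom template like yours, you'd add something along these lines to the properties section (a sketch, not tested against your setup):

"geoip": {
  "properties": {
    "location": {
      "type": "geo_point"
    }
  }
}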

What are the actual mappings of the index? Not the index template but the index mappings of coptic-20161111.

Here is the result of GET /coptic-20161111/_mapping

{
  "coptic-20161111": {
    "mappings": {
      "streetdata": {
        "_all": {
          "enabled": false
        },
        "dynamic_templates": [
          {
            "string_fields": {
              "match": "*",
              "match_mapping_type": "string",
              "mapping": {
                "fields": {
                  "raw": {
                    "index": "not_analyzed",
                    "type": "string"
                  }
                },
                "index": "analyzed",
                "omit_norms": true,
                "type": "string"
              }
            }
          }
        ],
        "properties": {
          "coordinates": {
            "type": "geo_point"
          }
        }
      }
    }
  }
}

Then I don't know what's wrong. The only geo_point field you have is coordinates, but the coordinates field in your sample message looks okay.
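One thing that might be worth ruling out (an assumption on my part, since your sample row has both values): if any CSV row is missing Latitude or Longitude, the %{...} references in add_field remain as literal strings like "%{[Longitude]}", which can't be parsed as a geo_point. A sketch of a guarded version of your filter:

filter {
  # Only build the geo_point array when both halves are present;
  # if either field is missing, the %{...} references would stay
  # as literal text and fail geo_point parsing.
  if [Latitude] and [Longitude] {
    mutate {
      add_field => [ "[coordinates]", "%{[Longitude]}" ]
      add_field => [ "[coordinates]", "%{[Latitude]}" ]
    }
    mutate {
      convert => [ "[coordinates]", "float" ]
    }
  }
}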

Hi there...
I'm having the same issue with version 5.

I found this:

At the end, someone asks if this was part of 2.x or also included in 5.x.
My guess is that it was not included.

Have you found any solution?

I tried to build the location field (under the name coordinates):

add_field => [ "[geoip_dst][coordinates]", "%{[geoip_dst][longitude]}" ]
add_field => [ "[geoip_dst][coordinates]", "%{[geoip_dst][latitude]}" ]

But got the same error.

I am also having this issue with a new ELK install on the latest version. I have tried all the suggestions and nothing works. The Logstash log throws the following exception whether or not the field is converted to a float.

[2016-12-12T22:32:56,389][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"bro-ssl_log-2016.12.12-xxxxx01", :_type=>"bro-ssl_log", :_routing=>nil}, 2016-12-12T22:06:43.300Z xxxxx01 1481580403.300635 CGsnFI5ZM2ieDVGuk 10.1.201.128 56937 34.192.175.114 443 - - - s.admathhd.com F - - F - - -- - - -], :response=>{"index"=>{"_index"=>"bro-ssl_log-2016.12.12-xxxxx01", "_type"=>"bro-ssl_log", "_id"=>"AVj1Ktst8SJenB4YGIMh", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"parse_exception", "reason"=>"geo_point expected"}}}}}

Anyone have any suggestions?

@rtb, please start a new thread for your question.

Adjusting the template to set "index.mapping.ignore_malformed": true fixed the issue.
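For anyone else hitting this: that setting goes in the settings block of the index template shown earlier in the thread, e.g. (a sketch based on that template; note that ignore_malformed makes Elasticsearch silently skip unparseable values rather than fixing them):

"settings": {
  "number_of_shards": 1,
  "index.mapping.ignore_malformed": true
}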
