Hi all,
I'm hitting my head against a wall with a geo_point issue: Logstash is logging errors such as the following:
[2017-05-03T14:43:22,014][WARN ][logstash.outputs.elasticsearch] Failed action. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"ossec-2017.05.03", :_type=>"ossec", :_routing=>nil}, 2017-05-03T13:43:18.538Z x-1 %{message}], :response=>{"index"=>{"_index"=>"ossec-2017.05.03", "_type"=>"ossec", "_id"=>"AVvOjQf_pg9iTpVc1m9v", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"parse_exception", "reason"=>"geo_point expected"}}}}}
The software versions I am running at the moment are:
- logstash-5.3.2-1 (following upgrade)
- elasticsearch-5.3.2-1 (following upgrade)
An example document being sent to Elasticsearch is the following:
{
    "srcip" => "x.x.x.x",
    "offset" => 14808041,
    "count" => 1,
    "input_type" => "log",
    "rule" => {
        "firedtimes" => 538,
        "PCI_DSS" => [
            [0] "6.5",
            [1] "11.4"
        ],
        "groups" => [
            [0] "web",
            [1] "accesslog",
            [2] "attack"
        ],
        "description" => "Web server 400 error code.",
        "AlertLevel" => 5,
        "sidid" => 31101
    },
    "decoder" => {
        "name" => "web-accesslog"
    },
    "source" => "/var/ossec/logs/alerts/alerts.json",
    "type" => "ossec",
    "url" => "/rss/catalog/notifystock/",
    "full_log" => "xxxxx",
    "tags" => [
        [0] "ossec",
        [1] "xxx",
        [2] "beats_input_codec_json_applied"
    ],
    "@timestamp" => 2017-05-03T13:31:29.000Z,
    "AgentIP" => "x.x.x.x",
    "@version" => "1",
    "beat" => {
        "hostname" => "x-x-01",
        "name" => "x-x-01"
    },
    "host" => "x-x-01",
    "location" => "/var/log/nginx/access.log",
    "AgentID" => "014",
    "id" => "401",
    "GeoLocation" => {
        "timezone" => "Europe/Paris",
        "ip" => "x.x.x.x",
        "latitude" => 48.9394,
        "coordinates" => [
            [0] 2.2367,
            [1] 48.9394
        ],
        "continent_code" => "EU",
        "city_name" => "Argenteuil",
        "country_code2" => "FR",
        "country_name" => "France",
        "country_code3" => "FR",
        "region_name" => "Val d'Oise",
        "postal_code" => "95100",
        "longitude" => 2.2367,
        "region_code" => "95"
    },
    "AgentName" => "x1",
    "fields" => nil
}
And the relevant segment of the mapping for the destination index looks as follows (I can provide the full template / mapping if that helps with diagnosis):
"GeoLocation": {
    "properties": {
        "area_code": {
            "type": "long"
        },
        "city_name": {
            "type": "keyword"
        },
        "continent_code": {
            "type": "text"
        },
        "coordinates": {
            "type": "geo_point"
        },
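In case it matters: the segment above is from the template, and the index that actually exists may not match it. The live mapping can be checked directly (a sketch, assuming a node on localhost:9200 and the index name from the error above; the template name `ossec` is my guess at what it is called here):

```shell
# Fetch the concrete mapping the failing index is really using
curl -s 'http://localhost:9200/ossec-2017.05.03/_mapping?pretty'

# Compare against the template it should have been created from
# (template name "ossec" is assumed)
curl -s 'http://localhost:9200/_template/ossec?pretty'
```

If `GeoLocation.coordinates` in the first response shows anything other than `"type": "geo_point"`, the template was not applied when the daily index was created.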
A few things I have already tried to rectify this:
- Re-indexed the data with a new field name, removing a conflict on the original name at the same time.
- Removed any additional test / unneeded templates which -may- have overlapped with the template for this index.
- Examined the rubydebug / raw JSON document data for the data being shipped.
- Manually indexed the document via the ES API, which produces the same error; against a test template, however, it does not.
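For reference, the manual test in the last point was along these lines (simplified, with values taken from the document above; `geotest` is a throwaway index name and localhost:9200 is assumed):

```shell
# Throwaway index whose mapping matches the relevant segment above (ES 5.x syntax)
curl -s -XPUT 'http://localhost:9200/geotest' -d '{
  "mappings": {
    "ossec": {
      "properties": {
        "GeoLocation": {
          "properties": {
            "coordinates": { "type": "geo_point" }
          }
        }
      }
    }
  }
}'

# Index the coordinates as a [lon, lat] array, exactly as the document above has them
curl -s -XPOST 'http://localhost:9200/geotest/ossec' -d '{
  "GeoLocation": { "coordinates": [2.2367, 48.9394] }
}'
```

Against this throwaway index the array indexes cleanly; the same document into ossec-2017.05.03 fails with the parse_exception above.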
This is some of the geoip configuration within Logstash:
if "" in [srcip] {
    geoip {
        source => "srcip"
        target => "GeoLocation"
        database => "/etc/logstash2/GeoLite2-City.mmdb"
        tag_on_failure => [""]
    }
}
And I also have the following:
rename => [ "[GeoLocation][location]", "[GeoLocation][coordinates]" ]
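For clarity, that rename sits in a mutate filter after the geoip block, roughly like this (the surrounding filter block is reconstructed here, so treat the exact placement as approximate):

```
mutate {
    rename => [ "[GeoLocation][location]", "[GeoLocation][coordinates]" ]
}
```

i.e. the geoip filter's own location field is moved into [GeoLocation][coordinates], which is the field mapped as geo_point above.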
Any advice on this would be really helpful, as I'm unsure where to look next for a resolution. If I've missed any config or data that would help, let me know and I'll try to provide it.
Cheers.
David