Logstash/Elasticsearch - geoip failed to parse from lat/long

I'm using Elasticsearch 6 and Logstash 6, with the input and filter configuration below:

input {
  udp {
    port => 9996
    codec => netflow {
      versions => [5, 9]
    }
    type => netflow
    tags => "checkpoint-flow"
  }
  udp {
    port => 9995
    codec => netflow {
      versions => [5, 9]
    }
    type => netflow
    tags => "cisco-flow"
  }
}
filter {
  if "checkpoint-flow" in [tags] or "cisco-flow" in [tags] {
    geoip {
      source => "[netflow][ipv4_src_addr]"
      target => "[geoip_src]"
      database => "/etc/logstash/GeoIP/GeoLite2-City.mmdb"
    }
    geoip {
      source => "[netflow][ipv4_dst_addr]"
      target => "[geoip_dst]"
      database => "/etc/logstash/GeoIP/GeoLite2-City.mmdb"
    }
    translate {
      field => "[netflow][protocol]"
      destination => "[netflow][protocol]"
      override => "true"
      dictionary => [ 6, "TCP", 17, "UDP", 1, "ICMP", 47, "GRE", 50, "ESP" ]
    }
    mutate {
      add_field => [ "[geoip_src][coordinates]", "%{[geoip_src][longitude]}" ]
      add_field => [ "[geoip_src][coordinates]", "%{[geoip_src][latitude]}" ]
      add_field => [ "[geoip_dst][coordinates]", "%{[geoip_src][longitude]}" ]
      add_field => [ "[geoip_dst][coordinates]", "%{[geoip_src][latitude]}" ]
    }
    mutate {
      convert => [ " [geoip_src][coordinates]", "float" ]
      convert => [ " [geoip_dst][coordinates]", "float" ]
    }
  }
}
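
For reference, here is a stripped-down test pipeline I can feed to bin/logstash -f to see exactly what the geoip and mutate filters produce for a single event. The generator input with a made-up source address is only a stand-in for real NetFlow traffic; the database path is the one from my real config:

input {
  generator {
    # fake event shaped like a decoded netflow record (placeholder address)
    message => '{"netflow":{"ipv4_src_addr":"8.8.8.8"}}'
    codec => json
    count => 1
  }
}
filter {
  geoip {
    source => "[netflow][ipv4_src_addr]"
    target => "[geoip_src]"
    database => "/etc/logstash/GeoIP/GeoLite2-City.mmdb"
  }
  mutate {
    # build the coordinates array as [lon, lat], same order as the real filter
    add_field => [ "[geoip_src][coordinates]", "%{[geoip_src][longitude]}" ]
    add_field => [ "[geoip_src][coordinates]", "%{[geoip_src][latitude]}" ]
  }
  mutate {
    convert => [ "[geoip_src][coordinates]", "float" ]
  }
}
output {
  # dump the whole event so the coordinates values can be inspected
  stdout { codec => rubydebug }
}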

I have also added an index template to Elasticsearch as follows:

curl -XPUT http://localhost:9200/_template/logstash_netflow_template -H 'Content-Type: application/json' -d '
{
  "template" : "checkpoint-9996-*",
  "mappings" : {
    "netflow" : {
      "properties" : {
        "geoip_src" : {
          "properties" : {
            "coordinates" : { "type" : "geo_point" }
          }
        },
        "geoip_dst" : {
          "properties" : {
            "coordinates" : { "type" : "geo_point" }
          }
        }
      }
    }
  }
}'
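
As a sanity check that the template was stored and that the daily index picked up the geo_point mapping, I can query both back (the index name is the one from the error below):

# confirm the template exists
curl -XGET 'http://localhost:9200/_template/logstash_netflow_template?pretty'

# confirm the mapping actually applied to the daily index
curl -XGET 'http://localhost:9200/checkpoint-9996-2017.12.14/_mapping?pretty'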

Now in the Logstash logs I see the following:

tail /var/log/logstash/logstash-plain.log  -n 1
[2017-12-14T14:49:25,491][WARN ][logstash.outputs.elasticsearch] Could not index event to Elasticsearch. {:status=>400, :action=>["index", {:_id=>nil, :_index=>"checkpoint-9996-2017.12.14", :_type=>"doc", :_routing=>nil}, #<LogStash::Event:0x59afbbc0>], :response=>{"index"=>{"_index"=>"checkpoint-9996-2017.12.14", "_type"=>"doc", "_id"=>"-yiAVWABDhLk6o_IYirI", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"illegal latitude value [269.99999983236194] for geoip_dst.coordinates"}}}}}

I'm not sure why I'm getting this error, and my google-fu isn't helping me find the answer. Can anyone point me in the direction of my mistake?

Anyone have any ideas?

Are you actually attempting to send a latitude value of 270 degrees? If you replace your elasticsearch output with stdout { codec => rubydebug }, you can dump the raw event and see what it looks like.
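
Something along these lines, assuming your current output section is just a plain elasticsearch output (the hosts and index settings here are guesses based on your index names, not your actual config):

output {
  # temporarily dump every event to the console instead of indexing it
  stdout { codec => rubydebug }

  # existing elasticsearch output, disabled while debugging (settings are guesses)
  # elasticsearch {
  #   hosts => ["localhost:9200"]
  #   index => "checkpoint-9996-%{+YYYY.MM.dd}"
  # }
}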
