Create geo_point from nested coordinates

I am trying to create a geo_point field from coordinates in JSON data that I pull in with the http_poller input.

The JSON comes in a format like this: multiple JSON documents, each representing an item with its properties and sub-properties.

[
    {
        "1": "a",
        "2": "",
        "3": "",
        "4": {
            "4a": {
                "4a1": ""
            },
            "4b": {
                "4b1": ""
            },
            "4c": {
                "latitude": 1,
                "longitude": -1
            }
        }
    },
    {
        "1": "b",
        "2": "",
        "3": "",
        "4": {
            "4a": {
                "4a1": ""
            },
            "4b": {
                "4b1": ""
            },
            "4c": {
                "latitude": 1,
                "longitude": -1
            }
        }
    }
]

So for my config I have:

input {
  http_poller {
    urls => {
      url => {
        method => get
        url => ""
        headers => {
          Accept => "application/json"
        }
      }
    }
    request_timeout => 60
    interval => 60
    codec => "json"
    metadata_target => "http_poller_metadata"
  }
}

filter {
    mutate {
      add_field => { "[location][lat]" => "%{[4][4c][latitude]}" }
      add_field => { "[location][lon]" => "%{[4][4c][longitude]}" }
    }
    mutate {
      convert => { "[location]" => "geo_point" }
    }
    mutate {
      convert => { "[location][lat]" => "float" }
      convert => { "[location][lon]" => "float" }
    }
  }

output {
  elasticsearch { 
    hosts => ["localhost:9200"]
    manage_template => false
  }
  stdout { codec => rubydebug }
}

The http_poller pulls the data in wonderfully without the mutate. I can also mutate and add a field such as test_field: test_data without issue. But as soon as I touch anything involving nested data, it falls on its face. So I'm trying to figure out where my formatting is off in the filters. Is [4] the true top-level field, or is it the item in the array? Or do I need to split them first?

Since you have an array of objects with lat/lon pairs, I'd assume you'll want to split the array with the split filter.
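A minimal sketch of that, assuming the parsed array ends up in a field I'm calling items here (the name is a placeholder; point it at whatever field actually holds the array):

filter {
  split {
    # "items" is a placeholder; use the field that holds the array of
    # objects once the json codec has parsed the response
    field => "items"
  }
}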

  convert => { "[location]" => "geo_point" }

geo_point isn't a valid type to convert to.
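As far as I know, convert only accepts scalar targets (integer, float, string, boolean); geo_point is an Elasticsearch mapping type, so the typing has to happen in the index mapping while Logstash just supplies numeric lat/lon values. Roughly:

filter {
  mutate {
    # convert supports scalar types only; the geo_point typing itself
    # comes from the Elasticsearch mapping, not from Logstash
    convert => { "[location][lat]" => "float" }
    convert => { "[location][lon]" => "float" }
  }
}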

OK, thanks for the info. So I would need to change the mapping for the index to get the type changed to geo_point. OK... so for the split: this is where I was also unsure. I thought it was the route I needed to go. The split itself seems to work fine, but adding nested fields is where I seem to be failing.

filter {
  split {
    field => "id"
  }
  mutate {
    add_field => { "location" => "test_loc" }
    add_field => { "[location][lat]" => "test_lat" }
    add_field => { "[location][lon]" => "test_lon" }
  }
}

I tried having two mutates, one adding location and the other adding the location lat and lon. Adding just location works, but adding the lat and lon subfields fails.

A sample from the massive error:

"Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.", "exception"=>java.lang.ClassCastException: expecting List or Map, found class com.logstash.bivalues.StringBiValue, "backtrace"=>["com.logstash.Accessors.newCollectionException(Accessors.java:175)", "com.logstash.Accessors.store(Accessors.java:162)",

add_field => { "location" => "test_loc" }

Here you add location as a string value.

add_field => { "[location][lat]" => "test_lat" }

Here you want to add lat as a subfield of location, but you just created location as a string value.
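Instead, drop the string-valued location and create the subfields directly, along these lines:

filter {
  mutate {
    # build the nested fields directly; creating "location" as a plain
    # string first is what makes the subfield writes fail
    add_field => { "[location][lat]" => "test_lat" }
    add_field => { "[location][lon]" => "test_lon" }
  }
}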

OK... so I got a bit further, it looks like. The fields were added successfully, and here's the output from ES.

filter {
  split {
    field => "id"
  }
  mutate {
    add_field => { "[location][lat]" => 0 }
    add_field => { "[location][lon]" => 1 }
  }
}

"location": {
      "lon": 1,
      "lat": 0
    },

but this doesn't...

  mutate {
    add_field => { "[vehicle][location][lat]" => 0 }
    add_field => { "[vehicle][location][lon]" => 1 }
  }

The structure is correct from what I can see, so I'm stumped. Essentially, I want to take the original latitude and longitude values and put them in a geo_point field so I can map them in Kibana.

{
    "id": "8416",
    "is_deleted": false,
    "trip_update": null,
    "vehicle": {
      "trip": {
        "trip_id": "UP-N_UN368_V1_C",
        "route_id": "UP-N",
        "direction_id": null,
        "start_time": "23:35:00",
        "start_date": "20160920",
        "schedule_relationship": 0
      },
      "vehicle": {
        "id": "8416",
        "label": "368",
        "license_plate": null
      },
      "position": {
        "latitude": 41.93284225463867,
        "longitude": -87.67341613769531,
        "bearing": null,
        "odometer": null,
        "speed": null
      },
      "current_stop_sequence": null,
      "stop_id": null,
      "current_status": 2,
      "timestamp": {
        "low": "2016-09-20T06:03:13.000Z",
        "high": 0,
        "unsigned": true
      },
      "congestion_level": null,
      "occupancy_status": null
    },

Which field in ES do you want to have as a geo_point field? How is that field currently mapped?

The location field is what I want to map, but I haven't mapped it yet because I was having trouble with all of the mutate stuff. So right now it's not mapped at all.

This is my first time trying to do my own mapping in an index, and I cannot seem to get the mapping to take. This is what I'm trying, with no success:

PUT _template/logstash/
{
  "mappings": {
    "properties": {
      "vehicle": {
        "location": {
          "type": "geo_point"
        }
      }
    }
  }
}

You need one properties key for each level.

{
  "mappings": {
    "properties": {
      "vehicle": {
        "properties": {
          "location": {
            "type": "geo_point"
          }
        }
      }
    }
  }
}

So... I think I have it...

PUT _template/logstash/
{
  "template": "logstash-*",
  "mappings": {
    "_default_": {
      "properties": {
        "vehicle": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}

{
  "acknowledged": true
}

GET _template/logstash/

{
  "logstash": {
    "order": 0,
    "template": "logstash-*",
    "settings": {},
    "mappings": {
      "_default_": {
        "properties": {
          "vehicle": {
            "properties": {
              "location": {
                "type": "geo_point"
              }
            }
          }
        }
      }
    },
    "aliases": {}
  }
}

It still seems to be failing on the filter, though.

filter {
  split {
    field => "id"
  }
  mutate {
    add_field => { "[vehicle][location][lat]" => "%{vehicle.position.latitude}" }
    add_field => { "[vehicle][location][lon]" => "%{vehicle.position.longitude}" }
  }
  mutate {
    convert => {"[vehicle][location][lat]" => "float"}
    convert => {"[vehicle][location][lon]" => "float"}
  }
}

"2016-09-21T17:17:33.515000-0500", :message=>"Exception in pipelineworker, the pipeline stopped processing new events, please check your filter configuration and restart Logstash.", "exception"=>java.lang.ClassCastException: expecting List or Map, found class com.logstash.bivalues.NullBiValue, "backtrace"=>["com.logstash.Accessors.newCollectionException(Accessors.java:175)", "com.logstash.Accessors.fetch(Accessors.java:139)", "com.logstash.Accessors.findCreateTarget(Accessors.java:93)", "com.logstash.Accessors.set(Accessors.java:25)", "com.logstash.Event.setField(Event.java:156)", "com.logstash.ext.JrubyEventExtLibrary$RubyEvent.ruby_set_field(JrubyEventExtLibrary.java:144)",
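Two things stand out there. First, %{vehicle.position.latitude} is dotted notation, but Logstash sprintf references use bracket syntax even inside %{}, i.e. %{[vehicle][position][latitude]}. Second, the NullBiValue in the stack trace suggests some events carry a null vehicle field, and writing [vehicle][location][lat] into a null parent would blow up exactly like this. A sketch that addresses both, assuming the null parent is indeed the culprit:

filter {
  # skip events that don't actually carry coordinates; a null vehicle
  # field would otherwise trigger the NullBiValue ClassCastException
  if [vehicle][position][latitude] {
    mutate {
      # sprintf field references use bracket notation, not dots
      add_field => { "[vehicle][location][lat]" => "%{[vehicle][position][latitude]}" }
      add_field => { "[vehicle][location][lon]" => "%{[vehicle][position][longitude]}" }
    }
  }
}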

In order to simplify, I moved things around a bit...

filter {
  split {
    field => "id"
  }
  mutate {
    add_field => { "location" => "%{[vehicle][position][latitude]}" }
    add_field => { "location" => "%{[vehicle][position][longitude]}" }
  }
}


{
  "logstash": {
    "order": 0,
    "template": "logstash-*",
    "settings": {},
    "mappings": {
      "_default_": {
        "properties": {
          "location": {
            "type": "geo_point"
          }
        }
      }
    },
    "aliases": {}
  }
}

Here is my error:

{:timestamp=>"2016-09-21T17:51:44.503000-0500", :message=>"Failed action.", :status=>400, :action=>["index", {:_id=>nil, :_index=>"logstash-2016.09.21", :_type=>"logs", :_routing=>nil}, 2016-09-21T22:51:43.541Z %{host} %{message}], :response=>{"index"=>{"_index"=>"logstash-2016.09.21", "_type"=>"logs", "_id"=>"AVdO8plc2ExhXBmCnMs9", "status"=>400, "error"=>{"type"=>"mapper_parsing_exception", "reason"=>"failed to parse", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"illegal latitude value [269.99999983236194] for location"}}}}, :level=>:warn}

Finally got it! Man, what a learning experience. haha

{
  "logstash": {
    "order": 0,
    "template": "logstash-*",
    "settings": {},
    "mappings": {
      "_default_": {
        "properties": {
          "location": {
            "type": "geo_point"
          }
        }
      }
    },
    "aliases": {}
  }
}

filter {
  split {
    field => "id"
  }
  mutate {
    add_field => { "[location][lat]" => "%{[vehicle][position][latitude]}" }
    add_field => { "[location][lon]" => "%{[vehicle][position][longitude]}" }
  }
  mutate {
    convert => { "[location][lat]" => "float" }
    convert => { "[location][lon]" => "float" }
  }
}