Geopoint visualization in Kibana 6.4.2 (Elastic 6.4.2)

I have an index pattern for my metrics that's configured as "filebeat-*" in Kibana

I have a field called qp_points - this is created by reading another field and running a "mutate" filter in Logstash

However, the visualization for this geo_point shows no data

Please see the attached screenshots of the visualization page and the index mapping

Hi @samiujan,

just to make sure I understand correctly - there is no data showing up on your map, right?

Could you check whether the field is actually populated with the expected data? If everything looks fine there, could you open the "Inspect" panel and copy/paste the request and response bodies of the actual request sent to Elasticsearch?
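
One way to run that check (an illustrative sketch, not part of the original reply - it simply returns the stored qp_points values for a few documents):

curl -X GET "localhost:9200/filebeat-*/_search?pretty" -H 'Content-Type: application/json' -d'
{
  "size": 3,
  "_source": ["qp_points"],
  "query": { "match_all": {} }
}'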

Thanks Joe - I have attached screenshots of some of the data

(I couldn't cross it out, but I have removed the Logstash definition, since that one was not correct.)

Also, the following are the request and response from "Inspect":

Request

{
  "aggs": {
    "2": {
      "geohash_grid": {
        "field": "qp_points",
        "precision": 2
      },
      "aggs": {
        "3": {
          "geo_centroid": {
            "field": "qp_points"
          }
        }
      }
    }
  },
  "size": 0,
  "_source": {
    "excludes": []
  },
  "stored_fields": [
    "*"
  ],
  "script_fields": {},
  "docvalue_fields": [
    {
      "field": "@timestamp",
      "format": "date_time"
    }
  ],
  "query": {
    "bool": {
      "must": [
        {
          "match_all": {}
        },
        {
          "range": {
            "@timestamp": {
              "gte": 1574072899078,
              "lte": 1574159299078,
              "format": "epoch_millis"
            }
          }
        },
        {
          "match_phrase": {
            "apikey.keyword": {
              "query": "<apikey_here>"
            }
          }
        },
        {
          "match_phrase": {
            "uripath.keyword": {
              "query": "/search/rgeocode"
            }
          }
        }
      ],
      "filter": [],
      "should": [],
      "must_not": []
    }
  }
}

Response

{
  "took": 47,
  "timed_out": false,
  "_shards": {
    "total": 720,
    "successful": 705,
    "skipped": 695,
    "failed": 0
  },
  "hits": {
    "total": 72438,
    "max_score": 0,
    "hits": []
  },
  "aggregations": {
    "2": {
      "buckets": []
    }
  },
  "status": 200
}

It seems "qp_points" is empty but qp_points.lat and qp_points.lng do exist...

Would renaming "lng" to "lon" help? The geo_point definition seems to use "lon" as the field name, not "lng"...

Right, it should be lon instead of lng for the index field to pick up your data correctly. For reference, there are also other ways to index geo_point data, but lng for longitude is not supported: https://www.elastic.co/guide/en/elasticsearch/reference/current/geo-point.html#geo-point.

It looks like the lng data is simply ignored by the geo_point field, which is why nothing shows up. You will have to re-index your existing data with the correct field names; then it should work.
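
For reference, all of these forms are accepted by a geo_point field (the values here are made up for illustration), while an object using lng is silently dropped:

{ "qp_points": { "lat": 24.8607, "lon": 67.0011 } }    object with lat/lon keys
{ "qp_points": "24.8607,67.0011" }                     string as "lat,lon"
{ "qp_points": [67.0011, 24.8607] }                    array as [lon, lat] (note the reversed order)
{ "qp_points": "tkb8yxpvge5e" }                        geohash string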

Thanks a lot for the help Joe!

Hi

I changed the Logstash definition and am now sending a "lon" field. It does pick up the geo_point correctly and shows it that way in the "Discover" tab, but I still can't see it in the visualization.

Please find attached screenshots of the geo_point datatype and its index mapping.

The following are the request and response:

Request

{
  "aggs": {
    "filter_agg": {
      "filter": {
        "geo_bounding_box": {
          "ignore_unmapped": true,
          "qp_points": {
            "top_left": {
              "lat": 90,
              "lon": -180
            },
            "bottom_right": {
              "lat": -90,
              "lon": 180
            }
          }
        }
      },
      "aggs": {
        "3": {
          "geohash_grid": {
            "field": "qp_points",
            "precision": 2
          },
          "aggs": {
            "4": {
              "geo_centroid": {
                "field": "qp_points"
              }
            }
          }
        }
      }
    }
  },
  "size": 0,
  "_source": {
    "excludes": []
  },
  "stored_fields": [
    "*"
  ],
  "script_fields": {},
  "docvalue_fields": [
    {
      "field": "@timestamp",
      "format": "date_time"
    }
  ],
  "query": {
    "bool": {
      "must": [
        {
          "match_all": {}
        },
        {
          "range": {
            "@timestamp": {
              "gte": 1574165051943,
              "lte": 1574165351943,
              "format": "epoch_millis"
            }
          }
        },
        {
          "match_phrase": {
            "apikey.keyword": {
              "query": "<apikey>"
            }
          }
        }
      ],
      "filter": [],
      "should": [],
      "must_not": []
    }
  }
}

Response

{
  "took": 51,
  "timed_out": false,
  "_shards": {
    "total": 720,
    "successful": 705,
    "skipped": 700,
    "failed": 0
  },
  "hits": {
    "total": 307,
    "max_score": 0,
    "hits": []
  },
  "aggregations": {
    "filter_agg": {
      "3": {
        "buckets": []
      },
      "doc_count": 0
    }
  },
  "status": 200
}
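
A quick way to double-check how qp_points actually ended up mapped (a diagnostic sketch, not from the original thread) is to fetch the field mapping:

curl -X GET "localhost:9200/filebeat-*/_mapping/field/qp_points?pretty"

If the output shows "type": "geo_point", the mapping is correct; sub-fields mapped as keyword or float would mean the index was created before the geo_point mapping existed.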

Should I just use a new field altogether?

Also, Joe, when you say "re-index existing data", I expect I need to run the update_by_query and delete_by_query commands to clean up the existing data fields (I have real-time data coming in from the field).

Hey, update_by_query will make sure the existing data is re-indexed, so that should be fine.

One thing stands out to me - it seems like lon and lat are strings instead of numbers. That doesn't have to be a problem, but maybe they are not indexed correctly because of it. Could you try sending those values specifically as numbers?

I just tried to create such a visualization locally, and it works for me with data looking like this (note the missing quotes around the numbers):
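
(The screenshot is not reproduced here, but a document along these lines - numeric lat and lon, no quotes - is presumably what it showed:)

{
  "qp_points": {
    "lat": 40.12,
    "lon": -71.34
  }
}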

Thanks very much Joe

I will change the sub-fields to floats and test it out

So here is a postmortem of my efforts to correctly show geo_points in my visualization:

  1. I have real-time data coming in through a REST API - this contains a "points" query param
  2. The data is ingested by Filebeat, which sends it to Logstash
  3. Logstash's filter plugin parses and mutates it into an object with two floating-point values
  4. Logstash's output plugin writes it to a filebeat index on ES
  5. ES, Logstash and Kibana are all version 6.4
  6. Filebeat indices are created via a script, and there is a new index every day - e.g. today's is called filebeat-2019.11.20

So I finally did the following in order for the geo_points to show up correctly:

  1. I updated the mapping of my latest filebeat index by sending:

    curl -X PUT "localhost:9200/filebeat-2019.11.20/_mapping/doc?pretty" -H 'Content-Type: application/json' -d'
    {
      "properties": {
        "req_gp": {
          "type": "geo_point"
        }
      }
    }'
    Please note I am using "doc" as the type name instead of "_doc", since Logstash 6.4 auto-creates a type called "doc". The PUT Mapping reference for 6.4 in ES's official docs mentions _doc, and updating the mapping for that type throws an error:

Rejecting mapping update to [<index_name>] final mapping would have more than 1 type: [_doc, doc]
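
A generic way to confirm which type name an index uses (added here for illustration) is to fetch the existing mapping - the top-level key under "mappings" in the output is the type name:

    curl -X GET "localhost:9200/filebeat-2019.11.20/_mapping?pretty"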

  2. Next, I changed the field name in Logstash's filter plugin:

    mutate {
      # "point" arrives as a single "<lat>;<lon>" string - split it into an array
      split => { "point" => ";" }
    }

    mutate {
      # build the new geo_point object from the two parts
      add_field => {
        "[req_gp][lat]" => "%{[point][0]}"
        "[req_gp][lon]" => "%{[point][1]}"
      }
    }

    mutate {
      # lat/lon must be numbers, not strings, for the geo_point mapping to index them
      convert => { "[req_gp][lat]" => "float" }
      convert => { "[req_gp][lon]" => "float" }
    }

I created a new field, since you cannot change the mapping of an existing field in ES.

Also, please note that I created the mapping in the index before sending data to the new field; otherwise ES would have dynamically mapped it as {float, float}, and again, that cannot be changed later.
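
Since a new filebeat index is created every day, every future index also needs this mapping in place before its first document arrives. One way to automate that (a sketch, not what I actually did - the template name and order are illustrative) is an index template:

curl -X PUT "localhost:9200/_template/req_gp_geopoint?pretty" -H 'Content-Type: application/json' -d'
{
  "index_patterns": ["filebeat-*"],
  "order": 1,
  "mappings": {
    "doc": {
      "properties": {
        "req_gp": {
          "type": "geo_point"
        }
      }
    }
  }
}'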

  3. Next, I restarted Logstash

  4. Updated the index pattern in Kibana and made sure only the field itself is listed, with a "geo_point" type

  3. Clicked "Visualize" next to the field name on the filter on the left and it showed up successfully in the Map Coordinates visualization

Later on, I intend to add the new field mapping to previous indices and run update_by_query to move the existing data into the new field in the same indices.
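
That migration could look roughly like this (a sketch only - it assumes the old documents still carry the string qp_points.lat/qp_points.lng sub-fields, and the index name is just an example):

curl -X POST "localhost:9200/filebeat-2019.11.19/_update_by_query?pretty" -H 'Content-Type: application/json' -d'
{
  "query": {
    "exists": { "field": "qp_points.lat" }
  },
  "script": {
    "lang": "painless",
    "source": "ctx._source.req_gp = [\"lat\": Float.parseFloat(ctx._source.qp_points.lat), \"lon\": Float.parseFloat(ctx._source.qp_points.lng)]"
  }
}'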

P.S. Big thanks to Joe Reuter for helping out!
