Logstash, GeoJSON and Kibana

I have successfully (I think) ingested logs using Logstash. I used the geoip filter and all the data looks good and well structured. However, when I look at the index in Kibana, the coordinates are mapped as float. This is what I can observe in the mappings:

"location": {
  "properties": {
    "lat": {
      "type": "float"
    },
    "lon": {
      "type": "float"
    }
  }
},
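
For reference, this excerpt comes from the mapping API in Dev Tools; the index pattern below is only a placeholder for whatever index Logstash is actually writing to:

GET logstash-*/_mapping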

On the Logstash geoip filter documentation page I read:

A `[geoip][location]` field is created if the GeoIP lookup returns a latitude and longitude. The field is stored in [GeoJSON](http://geojson.org/geojson-spec.html) format. Additionally, the default Elasticsearch template provided with the [elasticsearch output](https://www.elastic.co/guide/en/logstash/7.13/plugins-outputs-elasticsearch.html) maps the `[geoip][location]` field to an [Elasticsearch Geo_point datatype](https://www.elastic.co/guide/en/elasticsearch/reference/7.13/geo-point.html).

As this field is a `geo_point` *and* it is still valid GeoJSON, you get the awesomeness of Elasticsearch’s geospatial query, facet and filter functions and the flexibility of having GeoJSON for all other applications (like Kibana’s map visualization).

But that is not what happens in my case; no geo_point is created.

Confirming the issue, I cannot create maps in Kibana using the GeoJSON coordinates from the index I've ingested.

I am using version 7.13.

I kindly ask for advice from more experienced people. Thanks.

Hi @f4d0, welcome to the Elastic forums!

You need your location field mapping to be set as a geo_point type before any indexing. If you let dynamic mapping set it, as you are experiencing, you won't get the correct type.


@f4d0 Hi. Define the mappings explicitly as geo_point/geo_shape before indexing documents into Elasticsearch. Refer to this:

1.) To create Maps visualizations in Kibana, we have to put the mapping and template in place in Elasticsearch before index creation, and then create the index, either manually or by indexing data.

NOTE: Mappings and templates must be defined before index creation. If we create the mapping and template after creating the index, it will not work.

A.) Mapping: Mapping is the process of defining how a document, and the fields it contains, are stored and indexed.

B.) Template: An index template is a way to tell Elasticsearch how to configure an index when it is created. Templates are configured prior to index creation. When an index is created, either manually or through indexing a document, the template settings are used as a basis for creating the index.

2.) Open Kibana, go to "Dev Tools", and execute the request below. It creates a template called geotemplate for plotting geo information on Kibana maps; "test1" is the index name the template applies to, and the mapping sets the field called "location" (inside the field called "myloc") to the geo_point data type:

PUT _template/geotemplate
{
  "index_patterns": [
    "test1"
  ],
  "settings": { },
  "mappings": {
    "properties": {
      "myloc.location": {
        "type": "geo_point"
      }
    }
  }
}
After executing the above request you will see an acknowledgement response like this, which means your request has completed successfully:
{
  "acknowledged" : true
}
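
To verify that the template has been applied, you can create a matching index and check the field mapping; this uses the same example index name (test1), and the response should report the type as geo_point:

PUT test1

GET test1/_mapping/field/myloc.location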

# Geo-point field type
Fields of type geo_point accept latitude-longitude pairs, which can be used:
* to find geo-points within a bounding box, within a certain distance of a central point, or within a polygon or within a geo_shape query.
* to aggregate documents geographically or by distance from a central point.
* to integrate distance into a document’s relevance score.
* to sort documents by distance.
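
As an illustration of the first two points, a query like this works against the example index above once the field is a geo_point (the distance and coordinates here are arbitrary example values):

GET test1/_search
{
  "query": {
    "geo_distance": {
      "distance": "200km",
      "myloc.location": {
        "lat": 41.12,
        "lon": -71.34
      }
    }
  }
}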

First of all, thanks so much for your effort in providing such a complete and detailed explanation. I really appreciate it.

I am aware that I can do it with mappings, I was able to accomplish that before.

My challenge is that I enjoy not having to map my indexes, getting just the result from the KV filter. Additionally, as I mention in my post, Elastic.co states in their online manual that the geoip filter generates that type of data automatically. Or am I reading it wrong?

I work in cyber security incident response consulting, and while I do map logs that are always the same, like EVTX files, firewall logs normally come from different vendors and are exported in different ways, so dynamic mapping is really helpful for tackling different engagements without much effort.

I guess I have to weigh the real need for having the GeoIP point against the effort needed to map new data sets.

Again, thanks so much.

The default template that an elasticsearch output uses maps a field called [geoip] to include a [geoip][location] field, which is a geo_point. [geoip] matches the default target of a geoip filter, so yes, that one gets mapped "automatically".
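
If you want to see exactly what that default template maps, you can fetch it from Dev Tools. I am assuming it was installed under the standard logstash template name; adjust the pattern if yours is different:

GET _template/logstash*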

If you do not set the mapping, no other field will be a geo_point. I understand that dynamic mapping is really helpful; that's why it is there. But remember, you can still use dynamic mapping for everything except your geo_point fields.

Note also that you may not even need to know where in your document structure those geo_points are if you can name them consistently. The dynamic template documentation includes an example that shows how anything that arrives in elasticsearch as a string and whose name starts with "ip" can be mapped as type ip. I have not tested it but I expect that you could create a template that maps any field whose name ends in "location" as a geo_point.
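
Something along these lines should do it (untested, and the template name and index pattern are just examples): any newly seen field whose name ends in "location" gets mapped as a geo_point, while everything else still uses dynamic mapping.

PUT _template/location-as-geopoint
{
  "index_patterns": ["fw-*"],
  "mappings": {
    "dynamic_templates": [
      {
        "locations_as_geopoints": {
          "match": "*location",
          "mapping": {
            "type": "geo_point"
          }
        }
      }
    ]
  }
}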

And remember that when sending a geo_point to elastic you do not have to send it as an array of two floats. You have five options, including a string like "41.12,-71.34". elasticsearch knows how to parse that once the field is mapped as a geo_point.
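
For example, using the test1 index and myloc.location field from the template earlier in this thread, all of these are accepted once the field is mapped as a geo_point (the geohash is just an example value):

POST test1/_doc
{ "myloc": { "location": { "lat": 41.12, "lon": -71.34 } } }

POST test1/_doc
{ "myloc": { "location": "41.12,-71.34" } }

POST test1/_doc
{ "myloc": { "location": [ -71.34, 41.12 ] } }

POST test1/_doc
{ "myloc": { "location": "drm3btev3e86" } }

POST test1/_doc
{ "myloc": { "location": "POINT (-71.34 41.12)" } }

Note that the array form is [lon, lat] while the string form is "lat,lon".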


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.