Geo_point - no compatible types

I am getting the no compatible types message from Kibana and cannot figure out why. I am adding the geo details using Logstash, as per the filter below:

geoip {
  source    => "ipAddress"
  target    => "geometry"
  database  => "/etc/logstash/GeoLiteCity.dat"
  add_field => [ "[geo_point][coordinates]", "%{[geo_point][longitude]}" ]
  add_field => [ "[geo_point][coordinates]", "%{[geo_point][latitude]}" ]
}

However, in Kibana I get the following message:
No Compatible Fields: The "mapdata" index pattern does not contain any of the following field types: geo_point

The mapping shows the location as a double as opposed to a geo_point, which is probably why I am getting the error. However, shouldn't Logstash be passing the location as a geo_point in the first place?

"geo_point": {
"properties": {
"latitude": {
"type": "double"
},
"location": {
"type": "double"
},
"longitude": {
"type": "double"
}
}
},

When I look at a record, in particular the location field, I do see a lat/long pair in the data:

"location": [-0.12999999999999545, 51.5],

Do I need to configure the mapping of the data types before the data is loaded? I am not sure why the type is not geo_point.

The mapping shows the location as a double as opposed to a geo_point, which is probably why I am getting the error. However, shouldn't Logstash be passing the location as a geo_point in the first place?

There is no geo_point type in JSON, which is what Logstash is passing to ES.

You have to modify the mapping if you want the geo_point field to have the geo_point type. This is typically done by modifying the index template that's applied to Logstash's indexes. In the default index template only the geoip field is mapped as geo_point.
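For illustration, here's a minimal sketch of such a template, assuming your events end up in indexes whose names start with mapdata (as the index pattern in your Kibana error suggests). The template name mapdata_geo and the _default_ type are placeholders, so adjust them to whatever your setup actually uses:

PUT /_template/mapdata_geo
{
  "template": "mapdata*",
  "mappings": {
    "_default_": {
      "properties": {
        "geo_point": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}

Note that a template only affects indexes created after it is stored, so an existing index would have to be deleted or reindexed for the new mapping to show up.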

Thanks for responding

So, if I am understanding you correctly, I need to create the mapping before the data is in Elasticsearch? Can it be amended after?

When I do this on an index that has not been created:

PUT /example1/_mapping/mygeodata
{
  "mygeodata": {
    "properties": {
      "geo_point": {
        "properties": {
          "location": { "type": "geo_point" }
        }
      }
    }
  }
}

I get told the index is missing, which makes sense:

{
  "error": "IndexMissingException[[example1] missing]",
  "status": 404
}

When I try to amend the type after the index exists, I get:

"error": "MergeMappingException[Merge failed with failures {[mapper [geo_point.location] of different type, current_type [double], merged_type [geo_point]]}]",

So I thought I would create my own template, but it hasn't made much of a difference, as the data isn't being loaded using this template:

PUT /_template/my_example
{
  "template": "example*",
  "order": 1,
  "settings": {
    "number_of_shards": 4
  },
  "mappings": {
    "properties": {
      "geo_point": {
        "properties": {
          "location": { "type": "geo_point" }
        }
      }
    }
  }
}

So, if I am understanding you correctly, I need to create the mapping before the data is in Elasticsearch? Can it be amended after?

The mapping of an existing field in an index can't be modified. A mapping can be set via an index template (i.e. the mapping is applied automatically when the index is created), explicitly as part of the index creation request, or (I think) after index creation if the field hasn't been mapped yet.
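As a sketch of the second option, reusing your example1 index and mygeodata type names, you would create the index and the mapping in a single request, before any data is loaded:

PUT /example1
{
  "mappings": {
    "mygeodata": {
      "properties": {
        "geo_point": {
          "properties": {
            "location": { "type": "geo_point" }
          }
        }
      }
    }
  }
}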

So I thought I would create my own template, but it hasn't made much of a difference, as the data isn't being loaded using this template.

And after creating this index template you create an index named, say, example-foo and that index doesn't have the geo_point field correctly mapped?
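A quick way to check (example-foo is just a placeholder name matching your example* pattern): confirm the template is stored, create a matching index, and look at the mapping it gets:

GET /_template/my_example

PUT /example-foo

GET /example-foo/_mapping

If location shows up as geo_point here but not in your real index, the index Logstash writes to probably doesn't match the example* pattern.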

I created the index and added the mapping first, and this worked. So, as you suggested, a field's mapping cannot be modified after the index is created; I had thought it could.

My original load from Logstash just loaded the data, and the field was mapped as a double rather than a geo_point, since the dynamic mapping of the JSON input recognised it as a double. So configuring the mapping up front is what worked for me.

PUT /example1/

PUT /example1/_mapping/geodata
{
  "geodata": {
    "properties": {
      "geo_point": {
        "properties": {
          "location": { "type": "geo_point" }
        }
      }
    }
  }
}
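If it helps anyone else, the result can be sanity-checked before reloading the data by fetching the mapping back and confirming that location now has the geo_point type:

GET /example1/_mapping/geodata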

Thanks again