I am trying to create a tile map in Kibana, but I am getting the error: "No Compatible Fields: The prod-* index pattern does not contain any of the following field types: geo_point".
Below is my Logstash config file.
input {
  rabbitmq {
    host      => "rabbitmq1"
    queue     => "Logs"
    heartbeat => 30
    durable   => true
    password  => "xxxx"
    user      => "logstash"
    vhost     => "logs"
    tags      => "prod"
  }
}

filter {
  geoip     { source => "ip" }
  useragent { source => "userAgent" }
}

output {
  if "prod" in [tags] {
    elasticsearch {
      hosts => [ "localhost" ]
      index => "prod-%{+YYYY.MM.dd}"
    }
  }
}
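Since the output writes to prod-* indices rather than the default logstash-* pattern, one variant of the output block I considered points the elasticsearch output at a custom template via its standard template options (the file path below is a placeholder, not my actual setup):

```conf
output {
  if "prod" in [tags] {
    elasticsearch {
      hosts => [ "localhost" ]
      index => "prod-%{+YYYY.MM.dd}"
      # template / template_name / template_overwrite are standard
      # elasticsearch-output options; the path here is hypothetical
      manage_template    => true
      template           => "/etc/logstash/templates/prod-template.json"
      template_name      => "prod"
      template_overwrite => true
    }
  }
}
```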
Below is the output from an Elasticsearch query.
"ip" : "83.110.55.111 ",
"geoip" : {
"region_name" : "Ash Shariqah",
"region_code" : "SH",
"ip" : "83.110.55.111",
"country_name" : "United Arab Emirates",
"country_code2" : "AE",
"latitude" : 25.3573,
"longitude" : 55.4033,
"continent_code" : "AS",
"country_code3" : "AE",
"timezone" : "Asia/Dubai",
"city_name" : "Sharjah",
"location" : {
"lon" : 55.4033,
"lat" : 25.3573
}
},
"logTime" : "03/Apr/2019:04:23:15 +0000",
=======================================================
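To double-check how Elasticsearch actually mapped the field, I also queried the field-mapping API directly (assuming Elasticsearch on the default localhost:9200):

```shell
curl -s 'http://localhost:9200/prod-*/_mapping/field/geoip.location?pretty'
```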
I can see geoip.location in the JSON view in Kibana, but not in the Table view.
I am using the default elasticsearch-template:
{
"template" : "logstash-*",
"version" : 60001,
"settings" : {
"index.refresh_interval" : "5s"
},
"mappings" : {
"_default_" : {
"dynamic_templates" : [ {
"message_field" : {
"path_match" : "message",
"match_mapping_type" : "string",
"mapping" : {
"type" : "text",
"norms" : false
}
}
}, {
"string_fields" : {
"match" : "*",
"match_mapping_type" : "string",
"mapping" : {
"type" : "text", "norms" : false,
"fields" : {
"keyword" : { "type": "keyword", "ignore_above": 256 }
}
}
}
} ],
"properties" : {
"@timestamp": { "type": "date"},
"@version": { "type": "keyword"},
"geoip" : {
"dynamic": true,
"properties" : {
"ip": { "type": "ip" },
"location" : { "type" : "geo_point" },
"latitude" : { "type" : "half_float" },
"longitude" : { "type" : "half_float" }
}
}
}
}
}
}
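Note that the template above only matches indices named logstash-*, while my indices are named prod-*. For reference, a minimal template scoped to prod-* instead (my own sketch of what I think would be needed, installed with PUT _template/prod) would look like:

```json
{
  "template" : "prod-*",
  "mappings" : {
    "_default_" : {
      "properties" : {
        "geoip" : {
          "dynamic" : true,
          "properties" : {
            "ip" : { "type" : "ip" },
            "location" : { "type" : "geo_point" },
            "latitude" : { "type" : "half_float" },
            "longitude" : { "type" : "half_float" }
          }
        }
      }
    }
  }
}
```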
Below are the logs from Logstash.
[2019-04-03T11:35:04,687][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://localhost:9200/"}
[2019-04-03T11:35:04,703][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>6}
[2019-04-03T11:35:04,703][WARN ][logstash.outputs.elasticsearch] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>6}
[2019-04-03T11:35:04,720][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//localhost"]}
[2019-04-03T11:35:04,721][INFO ][logstash.outputs.elasticsearch] Using default mapping template
[2019-04-03T11:35:04,724][INFO ][logstash.outputs.elasticsearch] Attempting to install template {:manage_template=>{"template"=>"logstash-*", "version"=>60001, "settings"=>{"index.refresh_interval"=>"5s"}, "mappings"=>{"_default_"=>{"dynamic_templates"=>[{"message_field"=>{"path_match"=>"message", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false}}}, {"string_fields"=>{"match"=>"*", "match_mapping_type"=>"string", "mapping"=>{"type"=>"text", "norms"=>false, "fields"=>{"keyword"=>{"type"=>"keyword", "ignore_above"=>256}}}}}], "properties"=>{"@timestamp"=>{"type"=>"date"}, "@version"=>{"type"=>"keyword"}, "geoip"=>{"dynamic"=>true, "properties"=>{"ip"=>{"type"=>"ip"}, "location"=>{"type"=>"geo_point"}, "latitude"=>{"type"=>"half_float"}, "longitude"=>{"type"=>"half_float"}}}}}}}}
[2019-04-03T11:35:04,776][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2019-04-03T11:35:05,221][INFO ][logstash.filters.geoip ] Using geoip database {:path=>"/usr/share/logstash/vendor/bundle/jruby/2.5.0/gems/logstash-filter-geoip-5.0.3-java/vendor/GeoLite2-City.mmdb"}
[2019-04-03T11:35:05,530][INFO ][logstash.pipeline ] Pipeline started successfully {:pipeline_id=>"main", :thread=>"#<Thread:0x49818e8c run>"}
[2019-04-03T11:35:05,695][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2019-04-03T11:35:06,143][INFO ][logstash.inputs.rabbitmq ] Connected to RabbitMQ at
[2019-04-03T11:35:06,148][INFO ][logstash.inputs.rabbitmq ] Connected to RabbitMQ at
[2019-04-03T11:35:06,551][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
I've tried the following:
- Deleting and re-creating the index (from Kibana)
- Deleting and re-indexing the index (from Elasticsearch)
- Refreshing the fields for the index pattern in Kibana
Is there anything I am missing here?