Once upon a time I was able to create a region map that showed circles for location hits based on the geoip.ip field. For some reason only the geoip.country_code2 field works now.
Here is my logstash config:
input {
  # this is the actual live log file to monitor
  file {
    path => ["/home/cowrie/cowrie/var/log/cowrie/cowrie.json*"]
    codec => json
    type => "cowrie"
  }
  # this is to send old logs to for reprocessing
  tcp {
    port => 3333
    type => "cowrie"
  }
}

filter {
  if [type] == "cowrie" {
    #json {
    #  source => eventid
    #}
    date {
      match => [ "timestamp", "ISO8601" ]
    }
    kv {
      source => "message"
      value_split => ":"
      field_split => ","
    }
    if [src_ip] {
      mutate {
        add_field => { "src_host" => "%{src_ip}" }
      }
      dns {
        reverse => [ "src_host" ]
        nameserver => [ "10.71.1.1", "1.1.1.1" ]
        action => "replace"
        hit_cache_size => 4096
        hit_cache_ttl => 900
        failed_cache_size => 512
        failed_cache_ttl => 900
      }
      geoip {
        database => "/var/opt/logstash/vendor/geoip/GeoLite2-City.mmdb"
        source => "src_ip"
        target => "geoip"
        add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
        add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
      }
      mutate {
        convert => [ "[geoip][coordinates]", "float" ]
      }
    }
  }
}

output {
  if [type] == "cowrie" {
    elasticsearch {
      hosts => ["localhost:9200"]
    }
    file {
      path => "/var/log/cowrie-logstash.log"
      codec => json
    }
    stdout {
      codec => rubydebug
    }
  }
}
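The geoip block above should produce a [geoip][coordinates] field, and for the map visualization to plot points, Elasticsearch has to map that field (or [geoip][location]) as geo_point rather than float or keyword. One thing I can check (assuming the default logstash-* index naming, since no index option is set in the elasticsearch output) is how the live indices actually map the geoip fields:

```
# Show the mapping of all geoip.* fields in the current indices.
# If nothing under geoip is mapped as geo_point, the map can only
# aggregate on keyword fields such as geoip.country_code2.
curl -s 'http://localhost:9200/logstash-*/_mapping/field/geoip.*?pretty'
```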
Here is what I am doing: I am dumping JSON logs from a Cowrie honeypot into a log file, where Logstash picks them up. With basic JSON parsing I get over 300 available fields, including all of the geoip fields. I believe the region map used to work with the geoip.ip field, but now it only works with geoip.country_code2, and the maps are all colored in by country.
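For reference, a single Cowrie event in cowrie.json looks roughly like this (values here are illustrative, not copied from my logs):

```
{
  "eventid": "cowrie.session.connect",
  "src_ip": "198.51.100.23",
  "src_port": 51432,
  "dst_ip": "10.71.1.50",
  "dst_port": 2222,
  "session": "a1b2c3d4e5f6",
  "timestamp": "2019-06-01T12:34:56.789012Z",
  "sensor": "honeypot01",
  "message": "New connection: 198.51.100.23:51432 (10.71.1.50:2222) [session: a1b2c3d4e5f6]"
}
```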
Does anyone have any ideas where I should look to troubleshoot this?