I hate starting over from scratch, but upgrading from 6.8.3 to 7.3.2 did not go well. So I removed and purged my whole ELK system. I still have all my JSON logs, so I can import them without any issue.
With that said, my fresh install is crashing in the same place the upgrade did, which leads me to believe the problem is in my configuration. I have run out of ideas and need the community's help to determine whether this is a bug and, if so, where.
logstash-stdout: https://pastebin.com/0yRWwH9K
logstash-stderr: https://pastebin.com/1xQEADsd
My config:
input {
  file {
    path => ["/home/cowrie/cowrie/var/log/cowrie/cowrie.json"]
    codec => json
    type => "cowrie"
  }
  tcp {
    port => 3333
    type => "cowrie"
  }
}

filter {
  if [type] == "cowrie" {
    date {
      match => [ "timestamp", "ISO8601" ]
    }
    if [src_ip] {
      mutate {
        add_field => { "src_host" => "%{src_ip}" }
      }
      dns {
        reverse => [ "src_host" ]
        nameserver => [ "10.71.1.1", "1.1.1.1" ]
        action => "replace"
        hit_cache_size => 4096
        hit_cache_ttl => 900
        failed_cache_size => 512
        failed_cache_ttl => 900
      }
      geoip {
        database => "/var/opt/logstash/vendor/geoip/GeoLite2-City.mmdb"
        source => "src_ip"
        target => "geoip"
        add_field => [ "[geoip][coordinates]", "%{[geoip][longitude]}" ]
        add_field => [ "[geoip][coordinates]", "%{[geoip][latitude]}" ]
      }
      mutate {
        convert => [ "[geoip][coordinates]", "float" ]
      }
    }
  }
}

output {
  if [type] == "cowrie" {
    elasticsearch {
      hosts => ["localhost:9200"]
    }
    file {
      path => "/var/log/cowrie-logstash.log"
      codec => json
    }
  }
}
I'm hoping someone can spot why this is happening. Thank you!