Hi,
I am trying to set up the ELK stack following this blog article: https://pawelurbanek.com/elk-nginx-logs-setup
Filebeat is running and sending data to Logstash.
My Logstash configuration looks like this (I removed the SSL part for testing):
input {
  beats {
    port => 5400
  }
}

filter {
  grok {
    match => [ "message", "%{COMBINEDAPACHELOG}+%{GREEDYDATA:extra_fields}" ]
    overwrite => [ "message" ]
  }
  mutate {
    convert => [ "response", "integer" ]
    convert => [ "bytes", "integer" ]
    convert => [ "responsetime", "float" ]
  }
  geoip {
    source => "clientip"
    target => "geoip"
    add_tag => [ "nginx-geoip" ]
  }
  date {
    match => [ "timestamp", "dd/MMM/YYYY:HH:mm:ss Z" ]
    remove_field => [ "timestamp" ]
  }
  useragent {
    source => "agent"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "weblogs-%{+YYYY.MM.dd}"
    document_type => "nginx_logs"
  }
  stdout { codec => rubydebug }
}
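Side note: to rule out the date filter, I checked that its Joda-time pattern matches what nginx writes into the access log. A quick Python equivalent (my own approximation of the pattern, not anything taken from Logstash):

```python
# Approximate the Logstash date-filter pattern "dd/MMM/YYYY:HH:mm:ss Z"
# with the strptime equivalent "%d/%b/%Y:%H:%M:%S %z" and parse a
# typical nginx access-log timestamp.
from datetime import datetime

ts = "25/Feb/2020:17:43:05 +0100"
parsed = datetime.strptime(ts, "%d/%b/%Y:%H:%M:%S %z")
print(parsed.isoformat())  # 2020-02-25T17:43:05+01:00
```

That parses cleanly, so I assume the timestamp format itself is not the problem.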
Elasticsearch is running and accepting input on port 9200, but when I try to discover the data in Kibana, I don't find anything for the configured index pattern.
/var/log/logstash/logstash-plain.log is full of these errors, which I can't interpret:
[2020-02-25T17:43:05,842][ERROR][logstash.filters.useragent][main] Uknown error while parsing user agent data {:exception=>#<TypeError: cannot convert instance of class org.jruby.RubyHash to class java.lang.String>, :field=>"agent", :event=>#LogStash::Event:0x4a5caa09}
It seems no data arrives in Kibana at all.
How do I fix this?
Yours faithfully,
Stefan Malte Schumacher