Hello,
hopefully someone can help me. The problem is that I can't see the data in Kibana that I am shipping into Elasticsearch with Filebeat and Logstash. I think the issue lies between Logstash and Elasticsearch: the fields are created in the index, but they never contain any values. When I print the events from Logstash to the console, all fields are parsed correctly, yet in Kibana there is no data.
Sometimes the grok pattern doesn't match anything. Could that be related? (See the sketch below for how I would check for that.)
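As far as I know, grok tags events it cannot match with _grokparsefailure, so something like this (just a sketch, not part of my actual config) should separate the failures on the console from the events that parsed cleanly:

output {
  if "_grokparsefailure" in [tags] {
    # events the grok pattern could not parse end up here
    stdout { codec => rubydebug }
  } else {
    # cleanly parsed events would be indexed as usual
    elasticsearch {
      hosts => ["https://localhost:9200"]
    }
  }
}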
If you need more information, feel free to ask. I have been struggling with this for days now. Thanks in advance!
Here is my Logstash config:
# Sample Logstash configuration for creating a simple
# Beats -> Logstash -> Elasticsearch pipeline.
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    # Each event is a Docker JSON log line with an nginx access log embedded
    # in its "log" field; this pattern parses both in one go.
    match => { "message" => "\{\"log\"\:\"%{IP:client} \- \- \[%{HTTPDATE:timestamp}\] \\\"%{WORD:method} %{URIPATH:endpoint} HTTP\/%{NUMBER:httpversion}\\\" %{NUMBER:bytes_sent} %{NUMBER:no_one_knows} \\\"%{URI:uri}\\\" \\(?<user_agent>(?>(?>\"(?>\\.|[^\\"]+)+\"|\"\"|(?>'(?>\\.|[^\\']+)+')|''|(?>`(?>\\.|[^\\`]+)+`)|``)))\,\"stream\"\:\"%{GREEDYDATA:stream}\"\,\"time\"\:\"%{TIMESTAMP_ISO8601:timestamp_iso}\"\}"}
    overwrite => [ "message" ]
    add_field => { "client_ip" => "%{client}" }
  }
  mutate {
    # Convert the numeric fields the grok pattern actually captures;
    # "response", "bytes" and "responsetime" are never created above,
    # so converting them was a no-op.
    convert => {
      "bytes_sent"  => "integer"
      "httpversion" => "float"
    }
  }
  geoip {
    # The grok pattern captures the address as "client", not "clientip"
    source => "client"
    add_tag => [ "nginx-geoip" ]
    ecs_compatibility => disabled
  }
  date {
    # Map the nginx timestamp onto @timestamp, then drop the raw field
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
    remove_field => [ "timestamp" ]
  }
}
output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    ssl => true
    ssl_certificate_verification => false
    user => "elastic"
    password => "--sm*83_qkSpIJ5lE_ZV"
    index => "weblogs-2022"
    # "document_type" was removed from the elasticsearch output plugin
    # (mapping types no longer exist as of Elasticsearch 8), so I dropped it.
  }
  stdout { codec => rubydebug }
}
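Since every line is a Docker JSON log that wraps an nginx access line, an alternative I have been considering is to parse the JSON wrapper with the json filter first and then grok only the inner line, which avoids most of the escaping. This is only a sketch: the inner field name "log" and the captured field names mirror my pattern above, and the simplified user_agent capture is an assumption:

filter {
  # parse the Docker JSON wrapper; this creates "log", "stream" and "time"
  json {
    source => "message"
  }
  # grok only the embedded nginx access line, without the JSON escaping
  grok {
    match => { "log" => "%{IP:client} - - \[%{HTTPDATE:timestamp}\] \"%{WORD:method} %{URIPATH:endpoint} HTTP/%{NUMBER:httpversion}\" %{NUMBER:bytes_sent} %{NUMBER:no_one_knows} \"%{URI:uri}\" \"%{GREEDYDATA:user_agent}\"" }
  }
}

The date filter from my config above would then work unchanged on the "timestamp" field.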