Warning: Manual override - there are filters that might not work with multiple worker threads {:pipeline_id=>"exterro", :worker_threads=>3, :filters=>["multiline", "multiline", "multiline", "multiline", "multiline", "multiline", "multiline", "multiline", "multiline", "multiline", "multiline", "multiline", "multiline"], :thread=>"#<Thread:0x2b20cc95 run>"}
I have tried installing both logstash-filter-multiline and logstash-codec-multiline, and my configuration in the filter.conf file is as below:
filter {
  if [type] == "tomcat" {
    multiline {
      pattern => "([a-zA-Z.]+(?:Error|Exception).+)|(^\s+at .+)|(^\s+... \d+ more)|(^\t+)|(^\s*Caused by:.+)"
      what => "previous"
    }
    grok {
      match => { "source" => "%{GREEDYDATA}/%{GREEDYDATA:filename}.log" }
    }
    grok {
      match => { "source" => "%{GREEDYDATA}/%{GREEDYDATA:tenant}.log" }
    }
    mutate {
      lowercase => ["tenant"]
    }
    date {
      match => ["time", "yyyy-MM-dd-HH-mm-ss-SSS"]
      target => "@timestamp"
    }
  }
}
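For reference, my understanding is that the same grouping could also be done with the multiline codec on the input side instead of the multiline filter, which is what the worker-threads warning seems to be about. The sketch below is only an illustration with a plain file input and a made-up log path (my real events arrive via Filebeat), reusing the stack-trace pattern from above:

input {
  file {
    # Hypothetical path for illustration only; my actual input comes from Filebeat.
    path => "/opt/tomcat/logs/*.log"
    type => "tomcat"
    # Group continuation lines with the previous line before the filter stage.
    codec => multiline {
      pattern => "([a-zA-Z.]+(?:Error|Exception).+)|(^\s+at .+)|(^\s+... \d+ more)|(^\t+)|(^\s*Caused by:.+)"
      what => "previous"
    }
  }
}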
and my Elasticsearch output configuration file is like below:
output {
  if [type] == "tomcat" {
    opensearch {
      hosts => ["https://es-url.region.amazonaws.com:443"]
      user => ""
      password => ""
      index => "filebeat-%{tenantId}"
      ecs_compatibility => disabled
      ssl_certificate_verification => false
    }
  }
}
After the configuration was updated, the log output is no longer updating in the Kibana dashboard.
Kindly help me resolve this!
I am using AWS Elasticsearch with Filebeat and Logstash installed on EC2.
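In case it helps narrow this down, I am thinking of temporarily adding a debug output next to the opensearch one to confirm whether any events reach the output stage at all. A minimal sketch (testing only, not part of my real output config):

output {
  if [type] == "tomcat" {
    # Print each event to the Logstash console/log for debugging.
    stdout { codec => rubydebug }
  }
}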