Hi! We're moving our infrastructure to containers, and we would like to use Filebeat to ship logs directly to Elasticsearch instead of going through Logstash.
In the process we would lose our Logstash filtering and parsing capabilities (grok is not supported in Filebeat).
This is our current logstash configuration:
input {
  file {
    path => "/servicename/_logs/servicename.log"
    codec => multiline {
      pattern => "(^[a-zA-Z.]+(?:Error|Exception).+)|(^\s+at .+)|(^\s+... \d+ more)|(^\t+)|(^\s*Caused by:.+)"
      what => "previous"
    }
  }
}

filter {
  if "multiline" not in [tags] {
    json {
      source => "message"
      remove_field => ["[request][body]", "[response][body][response][items]"]
    }
  }
  else {
    grok {
      pattern_definitions => { "APPJSON" => "{.*}" }
      match => { "message" => "%{APPJSON:appjson} %{GREEDYDATA:stack_trace}" }
      remove_field => ["message"]
    }
    json {
      source => "appjson"
      remove_field => ["appjson"]
    }
  }
}

output {
  elasticsearch {
    hosts => ["elasticsearch:9200"]
    index => "logstash-servicename-%{+YYYY.MM.dd}"
    document_type => "logs"
  }
}
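For the multiline part, we think our input section would map roughly onto Filebeat's built-in multiline settings. Here is a sketch of what we have in mind (assuming a Filebeat log input; the pattern is copied from our Logstash config, and `negate: false` with `match: after` should correspond to `what => "previous"`):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /servicename/_logs/servicename.log
    # Lines matching the stack-trace pattern are appended to the previous line,
    # which should be equivalent to the multiline codec with what => "previous".
    multiline.pattern: '(^[a-zA-Z.]+(?:Error|Exception).+)|(^\s+at .+)|(^\s+\.\.\. \d+ more)|(^\t+)|(^\s*Caused by:.+)'
    multiline.negate: false
    multiline.match: after
```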
We don't use any fancy filtering and parsing capabilities.
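For the JSON part, we were wondering whether Filebeat processors could replace the json filter. This is only a sketch of what we imagine (untested; field names copied from our config, and we are not sure drop_fields can reach into the decoded structure the same way remove_field does):

```yaml
processors:
  # Parse the JSON payload in the message field into top-level fields,
  # similar to our json { source => "message" } filter.
  - decode_json_fields:
      fields: ["message"]
      target: ""
      overwrite_keys: true
  # Drop the large bodies we currently remove with remove_field.
  - drop_fields:
      fields: ["request.body", "response.body.response.items"]
      ignore_missing: true
```

The grok step that splits the JSON from the trailing stack trace is the part we have no Filebeat equivalent for.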
Can we apply these filters somehow in filebeat?
Thanks!