Adding more logs to Filebeat to export, and to Logstash on the ELK server to receive

Hello
I have this starter config, which comes from a DigitalOcean tutorial and looks like this:
input {
  beats {
    port => 5044
    ssl => true
    ssl_certificate => "/etc/pki/tls/certs/logstash-forwarder.crt"
    ssl_key => "/etc/pki/tls/private/logstash-forwarder.key"
  }
}

I'd like to add MySQL slow query logs and Tomcat logs, each from a different server, to this config. What would the approach look like, and how do I handle two or more multiline patterns in the input and filter sections?
thanks a lot
Pierre

  • Set the document type on the Filebeat side and it'll get carried over to Logstash. Then add conditional filters to process events of different types in different ways.
  • Do multiline processing as close to the source as possible, i.e. do it in Filebeat and not in Logstash.

Thank you for your comment. Is there an example showing what the approach would look like for each part: "Set the document type on the Filebeat side", "add conditional filters to process events of different types in different ways", and "doing multiline processing in source"?

What would the approach look like for each part? "Set the document type on the Filebeat side",

I believe the Filebeat option is named document_type. See its documentation.
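For illustration, a minimal filebeat.yml sketch (1.x prospector syntax; the paths and type names are just examples). Each prospector sets its own document_type, and Filebeat copies that value into the type field of every event it ships:

    filebeat:
      prospectors:
        -
          paths:
            - /var/log/mysql/mysql-slow.log
          document_type: mysql-slow
        -
          paths:
            - /var/log/tomcat7/catalina.out
          document_type: tomcat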

"add conditional filters to process events of different types in different ways",

https://www.elastic.co/guide/en/logstash/current/event-dependent-configuration.html
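A sketch of what such conditionals could look like on the Logstash side, keyed on the type field set by Filebeat (the filters inside each branch are placeholders just to show the structure):

    filter {
      if [type] == "mysql-slow" {
        # filters that should only run for MySQL slow query events
        mutate { add_tag => ["mysql-slow"] }
      } else if [type] == "tomcat" {
        # filters that should only run for Tomcat events
        mutate { add_tag => ["tomcat"] }
      }
    }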

"doing multiline processing in source" ?

https://www.elastic.co/guide/en/beats/filebeat/current/multiline-examples.html
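As a starting point, multiline can be configured per prospector in filebeat.yml. The patterns below are illustrative (a MySQL slow log entry typically begins with a "# Time:" comment line, and Java stack trace continuation lines begin with whitespace) and will likely need tuning for your actual logs:

    filebeat:
      prospectors:
        -
          paths:
            - /var/log/mysql/mysql-slow.log
          document_type: mysql-slow
          multiline:
            # lines NOT starting with "# Time:" are appended to the previous event
            pattern: '^# Time:'
            negate: true
            match: after
        -
          paths:
            - /var/log/tomcat7/catalina.out
          document_type: tomcat
          multiline:
            # indented continuation lines belong to the previous event
            pattern: '^[[:space:]]'
            negate: false
            match: after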

Thank you so much for your answer; I'll go through it step by step. Just to be sure about the first part: you mean the input_type option, don't you?

you mean the input_type option, don't you?

No, I really mean document_type.

Found it (document_type) here: Filebeat Configuration Options