Multiple Filebeat instances to one Logstash: how to optimize the configuration

I have 10 servers with Filebeat installed on each. Every server monitors 2 applications, for a total of 20 applications.

I have one Logstash server that collects all of these logs, filters them, and passes them to Elasticsearch.

To read one file from one server, I use the Logstash configuration below:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "\[%{TIMESTAMP_ISO8601:timestamp}\]%{SPACE}\[%{DATA:Severity}\]%{SPACE}\[%{DATA:Plugin}\]%{SPACE}\[%{DATA:Servername}\](?<short_message>(.|\r|\n)*)" }
  }
}

output {
  elasticsearch {
    hosts => ["<ESserverip>:9200"]
    index => "groklogs"
  }
  stdout { codec => rubydebug }
}
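
For context, that grok pattern expects lines of the following shape (a hypothetical Elasticsearch-style log line; the values are purely illustrative):

[2019-06-12T10:15:30,123] [ERROR] [o.e.c.m.MetadataCreateIndexService] [node-1] failed to create index

which would produce the fields timestamp, Severity, Plugin, Servername, and short_message.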

And this is the Filebeat configuration:

filebeat.inputs:
  - type: log
    paths:
      - D:\ELK 7.1.0\elasticsearch-7.1.0-windows-x86_64\elasticsearch-7.1.0\logs\*.log

output.logstash:
  hosts: ["<logstaship>:5044"]

Can anyone please give me an example of:

  1. How I should change the above to receive logs from multiple applications on multiple servers.
  2. Whether I should configure multiple ports, and how.
  3. How I should use multiple grok filters.
  4. How I can keep everything in a single Logstash configuration file, or as few files as possible.

What would a typical setup look like? Please help.

Why can't you process input from multiple servers and applications with the existing configuration?

@Badger Because different servers produce different application logs.
I mean, will it work if I write multiple grok statements to cover all the logs? How would I differentiate the applications if the logs themselves don't identify them?
And can I point every Filebeat at a single port, 5044?
Even if I have more than one path on each server?

You can add tags to each input (formerly "prospector") in Filebeat, then use conditionals based on the tags to determine which filters to apply.
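
A minimal sketch of that approach (the application names app1/app2 and their paths are hypothetical): on each server, give every input its own tag in filebeat.yml:

filebeat.inputs:
  - type: log
    paths:
      - D:\app1\logs\*.log      # hypothetical path for the first application
    tags: ["app1"]
  - type: log
    paths:
      - D:\app2\logs\*.log      # hypothetical path for the second application
    tags: ["app2"]

output.logstash:
  hosts: ["<logstaship>:5044"]

All 10 servers can keep this same output section and send to the single beats port 5044. The tags travel with each event, so one Logstash pipeline can branch on them:

input {
  beats {
    port => 5044
  }
}

filter {
  if "app1" in [tags] {
    grok {
      # grok pattern for app1's log format goes here
      match => { "message" => "..." }
    }
  } else if "app2" in [tags] {
    grok {
      # grok pattern for app2's log format goes here
      match => { "message" => "..." }
    }
  }
}

output {
  elasticsearch {
    hosts => ["<ESserverip>:9200"]
    index => "groklogs"
  }
}

Filebeat 7.x also adds the sending host's name (host.name) to each event by default, so you can tell the servers apart even when the log lines themselves carry no server information.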

