Multiple pipelines in Logstash listening on different Beats input ports

Hi,

I am currently using Filebeat to ship my app logs from, say, Server A to a Logstash server that is listening for Beats input on custom port 52001. On the Logstash side, I use grok filters to parse the logs and then send them to Elasticsearch. Now I want to ship the same app logs from Server B, which needs the same grok filters, but I want to keep the two servers separated.
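For context, the Filebeat side on Server A is just the standard Logstash output pointed at the custom port. A minimal sketch, assuming a hypothetical log path and hostname (both are placeholders, not from my actual setup):

    filebeat.inputs:
      - type: log
        paths:
          - /var/log/myapp/*.log    # placeholder app log path

    output.logstash:
      # placeholder hostname; the port matches the beats input on the Logstash side
      hosts: ["logstash-host:52001"]

Server B would use the same output block, only with port 54001.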

On my Logstash host, I use pipelines.yml to point each pipeline at its config file.

Current pipelines.yml:

- pipeline.id: app_log_server_A
  path.config: "/src/apps/logstash/conf.d/server_A.conf"

Current /src/apps/logstash/conf.d/server_A.conf:

input {
  beats {
    port => 52001
  }
}

filter {
  grok {
    .....
    .....
  }
}

output {
  kafka {
    codec => json
    bootstrap_servers => "xxxx"
    topic_id => "xxxxx"
    acks => "1"
  }
}

Proposed pipelines.yml:

- pipeline.id: app_log_server_A
  path.config: "/src/apps/logstash/conf.d/server_A.conf"

- pipeline.id: app_log_server_B
  path.config: "/src/apps/logstash/conf.d/server_B.conf"
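As a side note, pipelines.yml also accepts per-pipeline settings, so the two pipelines could be tuned independently if one server is noisier than the other. A sketch with illustrative numbers (the worker and batch values are examples, not recommendations):

    - pipeline.id: app_log_server_A
      path.config: "/src/apps/logstash/conf.d/server_A.conf"
      pipeline.workers: 2        # example value
      pipeline.batch.size: 125   # example value

    - pipeline.id: app_log_server_B
      path.config: "/src/apps/logstash/conf.d/server_B.conf"
      pipeline.workers: 1        # example value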

Proposed /src/apps/logstash/conf.d/server_B.conf:

input {
  beats {
    port => 54001
  }
}

filter {
  grok {
    .....
    .....
  }
}

output {
  kafka {
    codec => json
    bootstrap_servers => "xxxx"
    topic_id => "xxxxx"
    acks => "1"
  }
}

So I would like to hear from experts whether this approach is correct and whether it is an efficient way to handle pipelines in Logstash. Can one Logstash process run two different configs listening on two different Beats ports? Is this possible? If not, please suggest a better approach.

