Multiple logstash configs with different filters running at the same time

Hello, I've been running into an issue when sending server logs to Logstash:

Java stack-trace server logs sent to Logstash with their own input.conf, filter.conf, and output.conf

and

Cisco ASA firewall logs going to the same Logstash with their own input.conf, filter.conf, and output.conf

When I run both of these at the same time in the same Logstash instance, the data sometimes gets mixed up: I'll see firewall logs in the Java stack-trace index, or vice versa.

Will adding more pipeline workers fix this?

Running the two configurations in separate pipelines might; it depends on how you have configured Logstash. If both sets of config files are loaded into a single pipeline (for example, everything under one path.config glob), Logstash concatenates them into one configuration, so events from every input pass through every filter and every output.

So how do you separate the pipelines? Right now I have two config files, but the input from one is going to both indexes, and I have 4 pipeline workers:

input {
  beats {
    port => 15044
  }
}

filter {
}

output {
  elasticsearch {
    hosts => ["xxxxx:9500"]
    manage_template => false
    index => "devtest-%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}

input {
  beats {
    port => 5045
  }
}

filter {
}

output {
  elasticsearch {
    hosts => ["xxxxxx:9500"]
    manage_template => false
    index => "sdttest-%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
  }
}
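As an aside: if you'd rather keep everything in a single pipeline, a sketch of an alternative is to tag events on each input and route them in the output with conditionals (tags is a common option available on all input plugins; the tag names here are made up for illustration):

input {
  beats {
    port => 15044
    tags => ["java"]
  }
  beats {
    port => 5045
    tags => ["asa"]
  }
}

output {
  if "java" in [tags] {
    elasticsearch {
      hosts => ["xxxxx:9500"]
      index => "devtest-%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }
  } else if "asa" in [tags] {
    elasticsearch {
      hosts => ["xxxxxx:9500"]
      index => "sdttest-%{[@metadata][beat]}-%{[@metadata][version]}-%{+YYYY.MM.dd}"
    }
  }
}

Separate pipelines are usually cleaner, though, since the two streams can't interfere with each other at all.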

Does this help?

Yes. Basically this has to go in pipelines.yml:

- pipeline.id: main
  path.config: "/etc/logstash/conf.d/15044-pipeline.conf"
  pipeline.workers: 2
- pipeline.id: secondary
  path.config: "/etc/logstash/conf.d/5045-test-pipeline.conf"
  queue.type: persisted
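If it helps, one way to confirm both pipelines loaded after a restart is the Logstash node API (assuming the default API port 9600):

curl -s 'localhost:9600/_node/pipelines?pretty'

You should see both main and secondary listed in the response.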

It seems to be working, but I'll be doing more testing on it over the weekend. Thank you!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.