I am trying to use Logstash to import logs from different locations on my local machine simultaneously. I have configured `pipelines.yml` as follows:
```yaml
- pipeline.id: apachelogs
  path.config: "/etc/logstash/conf.d/first-pipeline.conf"
  pipeline.workers: 3
- pipeline.id: squidlogs
  path.config: "/etc/logstash/conf.d/squid_log.conf"
  queue.type: persisted
- pipeline.id: firewalllogs
  path.config: "/etc/logstash/conf.d/firewall.conf"
  queue.type: persisted
- pipeline.id: dhcpd
  path.config: "/etc/logstash/conf.d/dhcplogs.conf"
  queue.type: persisted
```
and I am trying to run it with:

```sh
/usr/share/logstash/bin/logstash -r /etc/logstash/logstash.yml --config.reload.automatic
```
Sometimes it creates indices, but only two indices are created at a time, and one of them doesn't contain proper data.
I've also tried using Filebeat, but Filebeat injects all the logs under a single index.
Now my problems are:

- How can I import logs from those locations simultaneously?
- The imports have to be dynamic, so that whenever new logs arrive they are picked up and indexed automatically.