I am trying to use Logstash to import logs from several different locations on my local machine simultaneously. I have configured pipelines.yml as:
- pipeline.id: apachelogs
- pipeline.id: squidlogs
- pipeline.id: firewalllogs
- pipeline.id: dhcpd
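(Each entry also has a `path.config` line pointing at that pipeline's own config file; the paths below are illustrative, not my real ones:)

```yaml
# Sketch of a complete pipelines.yml -- the path.config values are
# placeholders; substitute the real location of each pipeline's config.
- pipeline.id: apachelogs
  path.config: "/etc/logstash/conf.d/apachelogs.conf"
- pipeline.id: squidlogs
  path.config: "/etc/logstash/conf.d/squidlogs.conf"
- pipeline.id: firewalllogs
  path.config: "/etc/logstash/conf.d/firewalllogs.conf"
- pipeline.id: dhcpd
  path.config: "/etc/logstash/conf.d/dhcpd.conf"
```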
and I am trying to run it as
/usr/share/logstash/bin/logstash -r /etc/logstash/logstash.yml --config.reload.automatic
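(As far as I understand from the docs, `-r` is already shorthand for `--config.reload.automatic`, and pipelines.yml is only read when Logstash is started without a `-f`/`-e` config override, so the invocation should presumably look more like this:)

```shell
# Illustrative invocation: --path.settings points at the directory
# containing logstash.yml and pipelines.yml, so the multiple-pipeline
# definitions are picked up; -r enables automatic config reload.
/usr/share/logstash/bin/logstash --path.settings /etc/logstash -r
```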
Now it sometimes creates indices, but only 2 indices ever show up at a time, and one of them doesn't contain the proper data.
I've also tried Filebeat, but Filebeat puts all the logs under one index.
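(The Filebeat attempt looked roughly like this; the paths and tags are illustrative. With a plain Elasticsearch output like this, every input lands in the same default `filebeat-*` index:)

```yaml
# Rough sketch of the filebeat.yml I tried (paths are placeholders).
filebeat.inputs:
  - type: log
    paths: ["/var/log/apache2/*.log"]
    tags: ["apachelogs"]
  - type: log
    paths: ["/var/log/squid/*.log"]
    tags: ["squidlogs"]

output.elasticsearch:
  hosts: ["localhost:9200"]
```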
Now the problems are:
- How can I import logs from those locations simultaneously?
- The import has to be dynamic, so that whenever new log entries arrive they are picked up and indexed automatically.