I would like to set up multiple logstash servers for load balancing, as mentioned here. Is there a best practice for maintaining the pipeline files (one set per logstash instance) besides configuring one server and copy/pasting the files to the others?
My logstash filters are a constant work in progress, so keeping all the pipelines in sync across all the logstash servers seems pretty arduous!
It really depends on your infrastructure, but you can automate this with ansible, scripts, cron, and git.
For example, you can keep all your pipelines in a git repository, clone that repository on each of your logstash servers, and configure pipelines.yml to point to the config files inside the clone.
Every time you update a pipeline, you then only need to run a git pull on each logstash server, which you can schedule with cron or automate with ansible.
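As a sketch, assuming the repository is cloned to /etc/logstash/pipelines-repo (a hypothetical path) and contains one directory per pipeline, pipelines.yml could reference it like this:

```yaml
# pipelines.yml — each pipeline's path.config points into the shared git clone
# (the clone location and pipeline ids below are assumptions, not fixed names)
- pipeline.id: beats-ingest
  path.config: "/etc/logstash/pipelines-repo/pipelines/beats/*.conf"
- pipeline.id: syslog
  path.config: "/etc/logstash/pipelines-repo/pipelines/syslog/*.conf"
```

Because every server's pipelines.yml points at the same relative layout, the file itself rarely changes; only the repository contents do.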
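A minimal crontab sketch for the pull side, assuming the same hypothetical clone path and that logstash is started with config.reload.automatic enabled so it picks up changed files without a restart:

```
# Pull the latest pipeline configs every 5 minutes.
# Assumes the repo was cloned once beforehand, e.g.:
#   git clone <your-repo-url> /etc/logstash/pipelines-repo
*/5 * * * * cd /etc/logstash/pipelines-repo && git pull --ff-only >> /var/log/pipeline-sync.log 2>&1
```

If you already manage the servers with ansible, the equivalent would be a play using ansible's git module (repo/dest parameters) run against your logstash host group, which also gives you a single place to see whether every server updated successfully.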