I know the config file being used by the Logstash service: /etc/logstash/conf.d/ls.conf.
I want to modify the service to use a forked pipeline, but what I assume is the home directory, /opt/logstash, doesn't contain any logstash.yml or pipelines.yml.
How do I restart the service to use the pipelines.yml I've written?
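For reference, this is roughly what I've put together so far (I'm assuming the settings directory is /etc/logstash, which I'm not sure about; the second pipeline id and config file are placeholders):

```yaml
# pipelines.yml - location assumed to be /etc/logstash/ on a package install
- pipeline.id: main
  path.config: "/etc/logstash/conf.d/ls.conf"
- pipeline.id: forked
  path.config: "/etc/logstash/conf.d/forked.conf"   # hypothetical second pipeline
```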
Working with the Elastic stack and being afraid of breaking changes is not a good match; Elastic has no fear of introducing breaking changes when they think it provides enough benefit. That said, I realize you have been given a task to complete.
What are you trying to do with multiple pipelines? The distributor and collector patterns can be implemented in a single pipeline using if-else blocks in the filter and/or output sections. A forked-path pattern can be implemented using a clone filter plus an if-else based on the type added by the clone filter.
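A rough sketch of that clone-based forked path (illustrative only; the input, filter, and output plugins below are placeholders, and this relies on the classic behaviour where the clone filter writes the clone name into the type field):

```
input {
  # placeholder input; use whatever actually feeds the pipeline
  file { path => "/var/log/example/*.log" }
}

filter {
  # duplicate every event; the copy gets its type field set to "branch_b"
  clone { clones => ["branch_b"] }

  if [type] == "branch_b" {
    mutate { add_tag => ["b"] }   # filters specific to branch B go here
  } else {
    mutate { add_tag => ["a"] }   # filters specific to branch A go here
  }
}

output {
  if [type] == "branch_b" {
    stdout { codec => rubydebug }   # placeholder for output B
  } else {
    stdout { codec => rubydebug }   # placeholder for output A
  }
}
```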
The output-isolator pattern cannot be implemented in old versions.
I'm working with a single input source (file path => "/abc/.*txt") but want to use two different filter plugins for two different outputs; in the future I'm going to downgrade the RabbitMQ output and persist the Kafka output. So I want to implement a forked path.
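Roughly the layout I have in mind, using pipeline-to-pipeline communication (assuming a version new enough to support it; all ids, file names, and plugin settings below are placeholders I made up):

```yaml
# pipelines.yml - one intake pipeline fanning out to two downstream pipelines
- pipeline.id: intake
  path.config: "/etc/logstash/conf.d/intake.conf"
- pipeline.id: rabbit
  path.config: "/etc/logstash/conf.d/rabbit.conf"
- pipeline.id: kafka
  path.config: "/etc/logstash/conf.d/kafka.conf"
```

```
# intake.conf - reads the files and sends every event to both branches
input  { file { path => "/abc/example.txt" } }       # path is illustrative
output { pipeline { send_to => ["rabbit", "kafka"] } }

# kafka.conf - branch-specific filter, then the Kafka output
input  { pipeline { address => "kafka" } }
filter { }                                            # Kafka-specific filter plugin here
output { kafka { topic_id => "events" } }             # settings are placeholders

# rabbit.conf would mirror kafka.conf with the RabbitMQ filter and output
```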
I should upgrade the system since we're doing an architecture overhaul of the downstream application anyway. It's just that I don't know what to check off before upgrading. I think I should read the sincedb and provide the path in the new installation, but I can't find a 'sincedb' anywhere in the system!
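For what it's worth, I see the file input has a sincedb_path option, so maybe I should pin it explicitly before migrating (the location below is just a guess on my part):

```
input {
  file {
    path => "/abc/*.txt"                       # path is illustrative
    # pin the offset file to a known location so it can be carried over;
    # the location below is only an example
    sincedb_path => "/var/lib/logstash/sincedb-abc"
  }
}
```

From what I can tell, the default is a hidden .sincedb_* file under the Logstash data directory (or the Logstash user's home directory on older versions), which might be why I couldn't find it.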