Combining several logstash config files into one... how do you do it?

To date, we have created several mini-config files on our own to satisfy very specific goals. Now we are looking to consolidate these files on a common server and are wondering how to go about it. Is the best practice to combine all of the config statements into one big file (with a bunch of conditional statements) and run a single instance of Logstash with that one big file?! Is there a way to start Logstash with a number of config files? Can you run multiple instances of Logstash, each of which runs a unique config file? Wondering how you all do it. Thx -

Technically, Logstash will combine all files in a directory if passed a wildcard, e.g. -f /path/to/configs/*.conf

It will do so in alphabetical order, so be sure to have them in the order you want.
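
For example, a purely illustrative layout (these file names are just a convention, not anything Logstash requires) uses numeric prefixes so the alphabetical order matches the logical pipeline order:

    /path/to/configs/
      10-inputs.conf     # all input { } blocks
      50-filters.conf    # all filter { } blocks
      90-outputs.conf    # all output { } blocks

    # one Logstash instance reading all of them; quoting the glob lets
    # Logstash expand it rather than the shell
    bin/logstash -f '/path/to/configs/*.conf'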

Of course, you can just concatenate them yourself into a single file, and re-order the blocks as needed. Logstash doesn't care if you have 3 input blocks, 15 filter blocks, and 8 output blocks. It will try to do the right thing in merging them.
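
As a sketch of the concatenation approach (file names again only examples):

    # merge the mini-configs in the order you want, then run one instance
    cat 10-inputs.conf 50-filters.conf 90-outputs.conf > pipeline.conf
    bin/logstash -f pipeline.conf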

I like the idea of having multiple config files dedicated to discrete Input/Filter/Output processing - all logic self-contained in unique config file containers.

What I'm still not understanding is: if Logstash is started with all of these files at once (the *.conf example you used), how will it cope with the multiple Inputs, multiple Filters and especially multiple Output directives from all these different files, each representing a different data type/mapping? Will it always evaluate each Input block, each Filter block and each Output block for each and every data Event?! If so, we will need conditionals to prevent data from going to each and every index and making a mess... Is this what's recommended? Something like this:

    output {
      if [datatype] == "billing" {
        # populate the 'billing' index...
      } else if [datatype] == "orders" {
        # populate the 'orders' index...
      }
      # etc.
    }

Apart from the options that have been mentioned already, you can pass a directory to Logstash and it'll read all files therein.
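
For instance (the path is only an example):

    # no wildcard needed; every file in the directory gets read
    bin/logstash -f /etc/logstash/conf.d/

Note that it reads every file in the directory, not just *.conf, so stray editor backups can sneak into the pipeline.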

Will it always evaluate each Input block, each Filter block and each Output block for each and every data Event?!

Yes.

If so, we will need conditionals to prevent data from going to each and every index and making a mess... Is this what's recommended?

Yes.

Inputs are separate, but yes. Every filter and output will be evaluated for each event, unless you use conditionals to route around them.

Yes. The use of conditionals is the appropriate way to control the flow of streams to various filters and outputs.
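
A minimal sketch of that pattern (the field name, index names, and host are assumptions — adapt them to your data):

    filter {
      # only run billing-specific parsing on billing events
      if [datatype] == "billing" {
        # billing filters here
      }
    }

    output {
      if [datatype] == "billing" {
        elasticsearch {
          hosts => ["localhost:9200"]
          index => "billing"
        }
      } else if [datatype] == "orders" {
        elasticsearch {
          hosts => ["localhost:9200"]
          index => "orders"
        }
      }
    }

Events that match neither condition simply fall through without being indexed anywhere, which is usually what you want while testing.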

Thank you both for the clarification -
