[solved] Multiple Logstash config files

Hi All ,

Please suggest how to use multiple Logstash config files at the same time, either from a single command line or as a service.

Problem Statement- I have multiple Logstash config files (each file is configured for different data) for posting data from different machines in a cluster, which requires opening as many command-line instances as there are config files. So is it possible to run all the config files from a single instance, or anything similar?

Regards,
Prateek

Either put all files in a directory and run Logstash with -f path/to/directory or use multiple -f options that each point to one of the files.
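For example (the directory name below is just a placeholder), put all the individual .conf files into one directory and point Logstash at it:

Command- logstash -f path/to/conf.d/

All config files in that directory are then loaded together.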

Keep in mind that Logstash has a single event pipeline and that all Logstash filters, no matter what file they're defined in, apply to all events unless you wrap the filters in conditionals that exclude them for e.g. certain message types.
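A minimal sketch of that idea (the type value and grok pattern here are only illustrative):

filter {
  if [type] == "apache" {
    grok {
      match => { "message" => "%{COMBINEDAPACHELOG}" }
    }
  }
}

Events whose type field is not "apache" pass through this filter block untouched.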

2 Likes

Hi ,
I have tried that option, but it is picking up the data from the last config file only-

Command- logstash -f Sample1.conf -f Sample2.conf

It is picking up the path from Sample2.conf only. Please suggest if I am missing something.

Regards,
Prateek

Hi,
One other thing you can do is number your files, something like 1-input-foo.config, 2-filter-bar.conf, etc. This will make it clearer which config does what and help you debug the files in the directory.

The logstash config files have different names.

Problem statement - When executed from the command line (logstash -f 1-input-foo.config -f 2-filter-bar.conf), it posts the data for 2-filter-bar.conf only.

Please suggest.

Regards,
Prateek

I have tried the option. Yes, it does post the data, but all the data is getting posted to every index, i.e. Index1 and Index2 both end up with mixed data.
Please suggest what I am missing.

Regards,
Prateek

I don't know what's in your files, but keep in mind that Logstash has a single pipeline. All filters and outputs will apply to all input events unless you use conditionals to select how they apply. If you use multiple configuration files they will effectively be concatenated and treated as a single big file.
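As an illustration (file names, paths and index names below are made up), two files like these:

# 01-first.conf
input  { file { path => "/tmp/first.log" } }
output { elasticsearch { index => "first-index" } }

# 02-second.conf
input  { file { path => "/tmp/second.log" } }
output { elasticsearch { index => "second-index" } }

behave exactly like one big file containing both inputs and both outputs, so events from either log file end up in both indices.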

1 Like

Agreed, this is what is currently happening. All the configs are getting applied to every data set.
So how am I supposed to handle this?

Please suggest. I want each individual Logstash config file to read its own data and apply its filters, groks, etc. correspondingly, not the combined config settings on every data set.

Regards,
Prateek

You need to use conditionals to select which filters and outputs to apply to which events. You can e.g. use the type field and event tags in your conditionals.
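For example, a rough sketch using the type field (paths, type values and index names are placeholders):

input {
  file {
    path => "/data/batch.csv"
    type => "batchdata"
  }
  file {
    path => "/data/machine.log"
    type => "machinedata"
  }
}

output {
  if [type] == "batchdata" {
    elasticsearch { index => "batch-index" }
  } else if [type] == "machinedata" {
    elasticsearch { index => "machine-index" }
  }
}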

1 Like

Thank you for the help.
I added a tag in the file "input" and applied an "if" condition on those tags.
PFB the snippet from the Logstash config file-
input {
  file {
    path => "BatchData\Batch_Raw_Data.csv"
    tags => [ "batchdata" ]
    start_position => "beginning"
  }
}

output {
  if "batchdata" in [tags] {
    elasticsearch {
      action => "index"
      index => "IndexName"
    }
  }
}
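
For completeness, a second config file for another data set could follow the same pattern (path, tag and index name here are placeholders), so each tagged input is routed only to its own index:

input {
  file {
    path => "MachineData\Machine_Raw_Data.csv"
    tags => [ "machinedata" ]
    start_position => "beginning"
  }
}

output {
  if "machinedata" in [tags] {
    elasticsearch {
      action => "index"
      index => "OtherIndexName"
    }
  }
}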

Regards,
Prateek

3 Likes