Please suggest how to use multiple Logstash config files at a time, either on a single command line or as a service.
Problem statement: I have multiple Logstash config files (each one configured for different data) for posting data from different machines in a cluster, which requires opening as many command-line instances as there are config files. Is it possible to run all the config files from a single instance, or something similar?
Either put all the files in a directory and run Logstash with -f path/to/directory, or use multiple -f options that each point to one of the files.
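Both invocations look roughly like this (the paths here are examples; adjust them to your installation):

```
# Point -f at a directory; all files in it are read and
# concatenated in lexical (alphabetical) order:
bin/logstash -f /etc/logstash/conf.d

# Or list the files explicitly with repeated -f options:
bin/logstash -f input.conf -f filter.conf -f output.conf
```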
Keep in mind that Logstash has a single event pipeline, and all Logstash filters, no matter what file they're defined in, apply to all events unless you wrap the filters in conditionals that exclude them for, e.g., certain message types.
Hi,
one other thing you can do is use numbers in your file names, something like 1-input-foo.conf, 2-filter-bar.conf, etc. This makes the configs in the directory clearer to debug.
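For example, a directory laid out like this is read in a predictable order (file names are illustrative):

```
conf.d/
  1-input-foo.conf
  2-filter-bar.conf
  3-output-baz.conf
```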
Problem statement: when executed from the command line (logstash -f 1-input-foo.config -f 2-filter-bar.conf), it posts the data for 2-filter-bar.conf only.
I have tried that option. Yes, it does post the data, but all the data is getting posted to every index, i.e. Index1 and Index2 contain mixed data.
Please suggest what I am missing.
I don't know what's in your files, but keep in mind that Logstash has a single pipeline. All filters and outputs will apply to all input events unless you use conditionals to select how they apply. If you use multiple configuration files they will effectively be concatenated and treated as a single big file.
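To illustrate the concatenation (hypothetical file names and contents): given these two files, Logstash behaves exactly as if it had read them as one file, so the output applies to events from the input even though they are defined in different files:

```
# 1-input.conf
input {
  file { path => "/var/log/app.log" }
}

# 2-output.conf -- applies to ALL events, including those from 1-input.conf
output {
  elasticsearch { index => "app-logs" }
}
```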
Agreed, this is what is currently happening: all the configs are getting applied to all the data.
So how am I supposed to handle this?
Please suggest; I want each individual Logstash config file to read its own data and apply the corresponding filters, groks, etc., not the combined config settings on each data set.
You need to use conditionals to select which filters and outputs apply to which events. You can, for example, use the type field and event tags in your conditionals.
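A sketch of this pattern, assuming two hypothetical inputs and index names (adjust paths, types, and indexes to your setup):

```
input {
  file {
    path          => "/data/batch/Batch_Raw_Data.csv"
    type          => "batchdata"
    start_position => "beginning"
  }
  file {
    path          => "/data/machine/Machine_Log.csv"
    type          => "machinedata"
    start_position => "beginning"
  }
}

filter {
  if [type] == "batchdata" {
    # csv/grok filters for the batch data only
  }
  if [type] == "machinedata" {
    # filters for the machine data only
  }
}

output {
  if [type] == "batchdata" {
    elasticsearch { index => "batch-index" }
  } else if [type] == "machinedata" {
    elasticsearch { index => "machine-index" }
  }
}
```

Each event carries the type set by its input, so the conditionals route it through only the filters and output meant for it, even though everything lives in one pipeline.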
Thank you for the help.
I added a tag in the file input and applied an if condition on those tags.
Below is the snippet from my Logstash config file:
input {
  file {
    # Note: the file input requires an absolute path; forward slashes
    # also work on Windows (e.g. "C:/.../Batch_Raw_Data.csv").
    path => "BatchData/Batch_Raw_Data.csv"
    tags => [ "batchdata" ]
    start_position => "beginning"
  }
}
output {
  if "batchdata" in [tags] {
    elasticsearch {
      action => "index"
      # Elasticsearch index names must be lowercase
      index => "indexname"
    }
  }
}