I want to run multiple config files on one Logstash pipeline.
I want to read dozens of CSV files and send the data to the same number of different indexes.
So I created one config file per CSV file,
intending to route each file's data to its own index within a single pipeline.
However, when I ran several config files under one pipeline, the data got mixed across indexes.
Is there any way to do this other than creating the same number of pipelines?
If you have multiple configuration files for a single pipeline then they are combined. Events are read from all of the inputs, sent through all of the filters, and every event is written to every output. If you want each configuration file to be self-contained then either run it in its own pipeline, or use (for example)
add_field => { "documentType" => "typeOne" }
on the input and then use
if [documentType] == "typeOne" {
[...]
}
in the filter and output sections to make them conditional.
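Putting those pieces together, here is a minimal sketch of one such self-contained config file. The file path, column names, index name, and the `typeOne` value are placeholders; adjust them per file, giving each config its own `documentType` value so the conditionals keep events separated even though Logstash concatenates all the files.

```
input {
  file {
    path => "/data/file_one.csv"
    start_position => "beginning"
    # Tag every event from this input so later stages can filter on it.
    add_field => { "documentType" => "typeOne" }
  }
}

filter {
  if [documentType] == "typeOne" {
    csv {
      separator => ","
      columns => ["col_a", "col_b"]
    }
  }
}

output {
  if [documentType] == "typeOne" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "index-one"
    }
  }
}
```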
Is there a conditional that can route events to different outputs depending on which input file they came from?
Alternatively, if I set up about 200 pipelines, would that cause any problems?