Logstash Setting up multiple config files in one pipeline

Hi.

I want to use multiple config files in a single Logstash pipeline.

I want to read dozens of CSV files and send the data to the same number of different indices.

So I created one config file per CSV file, with the idea that each would feed its own index through a single pipeline.
However, when I put several of these configs under one pipeline configuration, the data got mixed together.

Is there any way to do this other than setting up the same number of pipelines?

If you have multiple configuration files for a single pipeline then they are combined. Events are read from all of the inputs, sent through all of the filters, and every event is written to every output. If you want each configuration file to be self-contained then either run it in its own pipeline, or use (for example)

add_field => { "documentType" => "typeOne" }

on the input and then use

if [documentType] == "typeOne" {
    [...]
}

in the filter and output sections to make them conditional.
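Putting those pieces together, a self-contained fragment for one of the CSV sources might look like the sketch below. The file path, column names, and index name are all hypothetical placeholders; the pattern is tagging events at the input and guarding the filter and output with a conditional:

```
# type_one.conf -- hypothetical fragment for a single CSV source
input {
  file {
    path => "/data/type_one.csv"          # hypothetical path
    start_position => "beginning"
    sincedb_path => "/dev/null"
    add_field => { "documentType" => "typeOne" }
  }
}

filter {
  if [documentType] == "typeOne" {
    csv {
      separator => ","
      columns => ["col_a", "col_b"]       # hypothetical column names
    }
  }
}

output {
  if [documentType] == "typeOne" {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "type-one-index"           # hypothetical index name
    }
  }
}
```

Each of the other config files would use its own `documentType` value, so events from one file never pass through another file's filters or outputs even though all the fragments are concatenated into one pipeline.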

Thank you.

Is there a conditional that can route events to different outputs depending on which input file they came from?
Also, if I set up about 200 pipelines, would that cause any problems?

I ask for your help.

How different is the processing of the different files? How complex is it?

You may not need a pipeline per filetype, but that depends on the processing.
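If you do end up running a pipeline per file type, they are declared in `pipelines.yml`. A minimal sketch with hypothetical ids and paths:

```
# pipelines.yml -- hypothetical pipeline ids and config paths
- pipeline.id: type-one
  path.config: "/etc/logstash/conf.d/type_one.conf"
  pipeline.workers: 1    # small per-file pipelines often need only one worker
- pipeline.id: type-two
  path.config: "/etc/logstash/conf.d/type_two.conf"
  pipeline.workers: 1
```

Keep in mind that each pipeline gets its own worker threads and batch buffers, so with ~200 of them it is worth lowering `pipeline.workers` and watching heap and thread usage.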

The file extension is the same (.csv) for all of them, but they differ in other ways, such as the number of columns and the data types.
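Differences like column count and data types can still be handled in one pipeline by wrapping the `csv` filter in conditionals. A sketch, assuming hypothetical tags set on the inputs and made-up column names:

```
filter {
  # "five-col" / "three-col" tags are hypothetical, added via tags => [...] on each input
  if "five-col" in [tags] {
    csv {
      columns => ["a", "b", "c", "d", "e"]
      convert => { "c" => "integer" }   # per-file data type conversion
    }
  } else if "three-col" in [tags] {
    csv {
      columns => ["a", "b", "c"]
    }
  }
}
```

Whether this stays manageable at dozens of file types, or a pipeline per type is cleaner, depends on how much the filter logic diverges.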

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.