Multiple pipeline configurations not processed in Docker container


I'm using multiple .conf files in /usr/share/logstash/pipelines.

My first question is: do I understand correctly that I can use multiple .conf files in /usr/share/logstash/pipelines, configured as in my logstash.yml, without any further configuration?

At first I had one .conf file, and that was working. Then I added a second .conf file, which is not processed. The first .conf file still works, but the logging output contains:

[1] "multiline_codec_max_lines_reached"

I don't know whether this is what causes the second .conf file not to be processed.
As far as I remember this error wasn't there before I added the second pipeline, but I'm not sure :-).

The first pipeline processes an XML file of 370 KB (which is/was working).
The second pipeline processes an XML file of 240 MB.

My logstash.yml is:
xpack.monitoring.enabled: false
config.reload.automatic: true
path.config: "/usr/share/logstash/pipeline"

First lines of the Logstash log:

logstash_1 | Sending Logstash logs to /usr/share/logstash/logs which is now configured via
logstash_1 | [2019-04-17T08:43:22,225][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
logstash_1 | [2019-04-17T08:43:22,262][INFO ][logstash.setting.writabledirectory] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
logstash_1 | [2019-04-17T08:43:24,648][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
logstash_1 | [2019-04-17T08:43:24,927][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"15b97308-220f-4381-9d37-ee7235f25192", :path=>"/usr/share/logstash/data/uuid"}
logstash_1 | [2019-04-17T08:43:28,683][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.4.2"}
logstash_1 | [2019-04-17T08:45:01,812][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50}

I don't know which information you need to help me with this problem; please ask and I will provide it.

When -f points to a directory it will concatenate all the files in the directory to create the configuration. Events will be read from all the inputs, sent through all of the filters, and events from all inputs will be written to all outputs unless you are using conditionals.
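As a sketch of that behaviour (the file names, paths, and `type` values below are invented for illustration): once the directory is concatenated, every input feeds every output unless you guard the outputs with conditionals, e.g.:

```
# a.conf
input {
  file { path => "/data/first.xml"  type => "first" }
}

# b.conf
input {
  file { path => "/data/second.xml" type => "second" }
}

# out.conf -- without these conditionals, events from BOTH inputs
# would be written to BOTH outputs
output {
  if [type] == "first" {
    elasticsearch { index => "first" }
  } else if [type] == "second" {
    elasticsearch { index => "second" }
  }
}
```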

It sounds like you want the two configuration files to be independent, in which case you need to run them in separate pipelines.
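A minimal sketch of separate pipelines via pipelines.yml (the pipeline ids and file names are illustrative, not from your setup):

```yaml
# /usr/share/logstash/config/pipelines.yml
- pipeline.id: small-xml
  path.config: "/usr/share/logstash/pipeline/first.conf"
- pipeline.id: large-xml
  path.config: "/usr/share/logstash/pipeline/second.conf"
```

Note the warning in your log: pipelines.yml is ignored when modules or command line options are specified, so you would need to drop path.config from logstash.yml (and any -f on the command line) for this file to take effect.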

Separately... the multiline codec has limits on both the number of lines it will combine and the size of the resulting combination. You may need to adjust these upwards if you want to create a very large event.
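Those limits are the codec's max_lines (500 lines by default) and max_bytes (10 MiB by default). A hedged sketch of raising them (the path, pattern, and chosen values here are only examples, not tuned for your data):

```
input {
  file {
    path => "/data/big.xml"          # illustrative path
    codec => multiline {
      pattern => "^<record>"         # illustrative pattern
      negate  => true
      what    => "previous"
      max_lines => 20000             # default is 500
      max_bytes => "300 MiB"         # default is 10 MiB; must exceed the
                                     # event size if the 240 MB file is one event
    }
  }
}
```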

Aha, the concatenation was not clear to me. Thank you for the clarification!
I will create multiple pipelines.

Kind regards.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.