Logstash pipeline StackOverflowError when using large conf file

Hey there,

Is there a limit in Logstash on the size of the configuration file? I am getting a StackOverflowError when Logstash starts with a large conf file (around 400 KB).

Below is an excerpt of the error:

[INFO ] 2023-12-02 11:06:01.398 [[main]-pipeline-manager] javapipeline - Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>16, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2000, "pipeline.sources"=>["/data/test2.conf"], :thread=>"#<Thread:0x7bc25d02 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[FATAL] 2023-12-02 11:06:01.438 [Ruby-0-Thread-11: /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:289] Logstash - uncaught error (in thread Ruby-0-Thread-11: /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:289)
java.lang.StackOverflowError: null
        at java.util.stream.ReduceOps$3ReducingSink.begin(java/util/stream/ReduceOps.java:164) ~[?:?]
        at java.util.stream.Sink$ChainedReference.begin(java/util/stream/Sink.java:253) ~[?:?]
        at java.util.stream.ReferencePipeline$2$1.begin(java/util/stream/ReferencePipeline.java:173) ~[?:?]
        at java.util.stream.Sink$ChainedReference.begin(java/util/stream/Sink.java:253) ~[?

Can anyone indicate how to handle a large conf file? :smile:

Welcome to the community.

What did you put inside? Another app?!
Yes you can: split your logic across several pipelines that each process part of your data.
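A sketch of what that split might look like in pipelines.yml — the pipeline ids and config paths below are hypothetical examples, not from your setup:

```yaml
# pipelines.yml: each entry runs as its own pipeline with its own, smaller config
# (pipeline ids and paths here are made up for illustration)
- pipeline.id: apache-logs
  path.config: "/etc/logstash/conf.d/apache/*.conf"
- pipeline.id: app-events
  path.config: "/etc/logstash/conf.d/app/*.conf"
```

Each pipeline gets its own workers and queue, so one oversized config no longer has to be compiled as a single unit.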

I don't think there is a hard limit; it may be that something in your pipeline is not right, but a 400 KB text file for just one pipeline seems pretty big.

Can you share your configuration in a GitHub gist?

Yeah, I know it is big. I have multiple conf files in a directory that are all being loaded, and in the end it crashes.

Here is a single dummy file (~169 KB) that mimics the crash: Logstash Big config multiple filters · GitHub

Yeah, it crashed for me as well when I ran that pipeline.

The java.lang.StackOverflowError means a thread has exhausted its stack space.
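As a minimal, Logstash-independent illustration of the error: deep recursion in Java exhausts the thread stack the same way, and a larger `-Xss` lets the recursion go deeper before failing. The class and method names below are made up for the demo:

```java
// Minimal demo: exhausting a thread's stack raises StackOverflowError.
public class StackDemo {
    // Unbounded recursion: each call consumes one more stack frame.
    static void recurse() {
        recurse();
    }

    // Returns a fixed message depending on whether the stack overflowed.
    static String run() {
        try {
            recurse();
            return "no overflow";
        } catch (StackOverflowError e) {
            return "caught StackOverflowError";
        }
    }

    public static void main(String[] args) {
        System.out.println(run()); // prints "caught StackOverflowError"
    }
}
```

A very large conf file presumably produces similarly deep nesting when the pipeline is compiled, which is why raising the per-thread stack size helps.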

This GitHub issue has a solution: you need to increase the stack size used by each thread.

For me this pipeline only worked after I set -Xss2M in jvm.options; you may need a higher value.
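For reference, the relevant jvm.options fragment would look something like this (2M is just the value that worked for this pipeline; treat it as a starting point):

```
# config/jvm.options (or /etc/logstash/jvm.options on package installs)
# -Xss sets the stack size for each JVM thread
-Xss2M
```

Alternatively, setting LS_JAVA_OPTS="-Xss2M" in the environment appends the same flag without editing the file.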

Also, depending on what your pipelines look like, maybe you can optimize them, or even use multiple Logstash servers to process your data in parallel.

Thank you! That indeed solved it :heart_eyes:

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.