Input is from S3. The layout of the S3 input in the conf file is:

s3 {
  ....
  bucket => "bucket"
  prefix => "YYYY/MM/DD/hh/"
  .....
}
So every hour I have to create a new conf file with the corresponding prefix so the data gets processed accordingly.
Issue:
Logstash crashes (I assume because there are too many conf files).
Questions:
Is there a way to use a single conf file per day instead of creating multiple files (one per hour)? If I change the prefix to include only the year/month/day, it does not download anything.
How do I prevent Logstash from crashing?
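For reference, this is the kind of single daily conf I have in mind (bucket name and date are placeholders carried over from the snippet above; the other settings, shown there as "....", are unchanged). Since S3 prefixes are plain string matches, I would expect a day-level prefix to list every hourly key beneath it:

```
input {
  s3 {
    # same settings as the hourly conf, only the prefix is
    # shortened to the day level; values are placeholders
    bucket => "bucket"
    prefix => "2023/09/13/"   # expected to cover 2023/09/13/00/ through 2023/09/13/23/
  }
}
```

In practice this variant downloads nothing for me, which is what the first question is about.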
Here is a partial log:
[2023-09-13T14:54:50,913][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000,
"pipeline.sources"=>["/etc/logstash/conf.d/s3_nr_2023091121.conf", "/etc/logstash/conf.d/s3_nr_2023091200.conf", "/etc/logstash/conf.d/s3_nr_2023091300.conf",
"/etc/logstash/conf.d/s3_nr_2023091301.conf", "/etc/logstash/conf.d/s3_nr_2023091302.conf",
"/etc/logstash/conf.d/s3_nr_2023091321.conf", "/etc/logstash/conf.d/s3_nr_2023091322.conf",
"/etc/logstash/conf.d/s3_nr_2023091323.conf", "/etc/logstash/conf.d/s3_nr_2023091400.conf",
"/etc/logstash/conf.d/s3_nr_2023091401.conf", "/etc/logstash/conf.d/s3_nr_2023091402.conf",
"/etc/logstash/conf.d/s3_nr_2023091403.conf", "/etc/logstash/conf.d/s3_nr_2023091404.conf",
"/etc/logstash/conf.d/s3_nr_2023091405.conf", "/etc/logstash/conf.d/s3_nr_2023091406.conf",
"/etc/logstash/conf.d/s3_nr_2023091407.conf", "/etc/logstash/conf.d/s3_nr_2023091408.conf",
"/etc/logstash/conf.d/s3_nr_2023091409.conf", "/etc/logstash/conf.d/s3_nr_2023091410.conf",
"/etc/logstash/conf.d/s3_nr_2023091411.conf", "/etc/logstash/conf.d/s3_nr_2023091412.conf",
"/etc/logstash/conf.d/s3_nr_2023091413.conf", "/etc/logstash/conf.d/s3_nr_2023091414.conf"], :thread=>"#<Thread:0x728c9767@/usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2023-09-13T14:54:50,935][FATAL][org.logstash.Logstash ][main] uncaught error (in thread Ruby-0-Thread-58: /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:289)
java.lang.StackOverflowError: null
at java.util.Spliterators$IteratorSpliterator.estimateSize(java/util/Spliterators.java:1865) ~[?:?]
at java.util.Spliterator.getExactSizeIfKnown(java/util/Spliterator.java:414) ~[?:?]
at java.util.stream.AbstractPipeline.copyInto(java/util/stream/AbstractPipeline.java:508) ~[?:?]
at java.util.stream.AbstractPipeline.wrapAndCopyInto(java/util/stream/AbstractPipeline.java:499) ~[?:?]
at java.util.stream.ReduceOps$ReduceOp.evaluateSequential(java/util/stream/ReduceOps.java:921) ~[?:?]
at java.util.stream.AbstractPipeline.evaluate(java/util/stream/AbstractPipeline.java:234) ~[?:?]
at java.util.stream.ReferencePipeline.collect(java/util/stream/ReferencePipeline.java:682) ~[?:?]
at org.logstash.config.ir.CompiledPipeline$CompiledExecution.compileDependencies(org/logstash/config/ir/CompiledPipeline.java:560) ~[logstash-core.jar:?]
at org.logstash.config.ir.CompiledPipeline$CompiledExecution.flatten(org/logstash/config/ir/CompiledPipeline.java:514) ~[logstash-core.jar:?]
at org.logstash.config.ir.CompiledPipeline$CompiledExecution.filterDataset(org/logstash/config/ir/CompiledPipeline.java:435) ~[logstash-core.jar:?]
at org.logstash.config.ir.CompiledPipeline$CompiledExecution.lambda$compileDependencies$6(org/logstash/config/ir/CompiledPipeline.java:537) ~[logstash-core.jar:?]
at java.util.stream.ReferencePipeline$3$1.accept(java/util/stream/ReferencePipeline.java:197) ~[?:?]