My config (/etc/logstash/conf.d/test.conf):
input {
  s3 {
    access_key_id => ""
    secret_access_key => ""
    endpoint => ""
    bucket => ""
    prefix => ""
    region => ""
    codec => "plain"
    delete => true
    interval => 10
  }
}
output {
  stdout { codec => rubydebug }
}
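For reference, the pipeline config can be syntax-checked before starting the service. This is only a sketch: the binary path assumes a .deb/.rpm package install, so adjust it for your layout.

```shell
# Validate the pipeline configuration and exit without running it
# (/usr/share/logstash is the usual package-install location)
/usr/share/logstash/bin/logstash \
  --path.settings /etc/logstash \
  --config.test_and_exit \
  -f /etc/logstash/conf.d/test.conf
```

If the config parses, Logstash prints "Configuration OK" and exits.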
Logs:
[2025-07-24T11:08:00,301][INFO ][logstash.runner ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2025-07-24T11:08:00,317][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.5.1", "jruby.version"=>"jruby 9.3.8.0 (2.6.8) 2022-09-13 98d69c9461 OpenJDK 64-Bit Server VM 17.0.5+8 on 17.0.5+8 +indy +jit [aarch64-linux]"}
[2025-07-24T11:08:00,319][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2025-07-24T11:08:01,711][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2025-07-24T11:08:02,424][INFO ][org.reflections.Reflections] Reflections took 146 ms to scan 1 urls, producing 125 keys and 438 values
[2025-07-24T11:08:04,468][INFO ][logstash.javapipeline ] Pipeline main is configured with pipeline.ecs_compatibility: v8 setting. All plugins in this pipeline will default to ecs_compatibility => v8 unless explicitly configured otherwise.
[2025-07-24T11:08:04,577][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/conf.d/obs-syslog.conf"], :thread=>"#<Thread:0x69966a2 run>"}
[2025-07-24T11:08:05,321][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.73}
[2025-07-24T11:08:05,339][INFO ][logstash.inputs.s3 ][main] Registering {:bucket=>"lts-retain", :region=>"testcloud-sz-01"}
[2025-07-24T11:08:05,740][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2025-07-24T11:08:05,791][INFO ][logstash.inputs.s3 ][main][d990bfb5c99ebf37edd039d1506f376b2e0d79328693c4682b2354d9b908ee80] Using default generated file for the sincedb {:filename=>"/var/lib/logstash/plugins/inputs/s3/sincedb_0a8776d255fabda9665e2280f192a53f"}
[2025-07-24T11:08:05,830][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2025-07-24T11:08:06,981][INFO ][logstash.inputs.s3 ][main][d990bfb5c99ebf37edd039d1506f376b2e0d79328693c4682b2354d9b908ee80] LogTank_test is updated at 2025-07-24 03:02:31 +0000 and will process in the next cycle
[2025-07-24T11:08:15,971][INFO ][logstash.inputs.s3 ][main][d990bfb5c99ebf37edd039d1506f376b2e0d79328693c4682b2354d9b908ee80] LogTank_test is updated at 2025-07-24 03:02:31 +0000 and will process in the next cycle