Hi,
I want to set up a Logstash pipeline that reads data from Elasticsearch and loads it into S3. For the time being I am creating a test pipeline that reads data from one file and creates an output file from it (a duplicate file), but when I trigger the run it stays in the running state and never produces the output file.
My configuration file looks like this:
input {
  file {
    path => "/Users/ritik.loomba/Downloads/sample.json"
  }
}
output {
  file {
    path => "/Users/ritik.loomba/Downloads/unload.json"
  }
}
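For completeness, here is a variant I am considering, with the file input's `start_position` and `sincedb_path` options set explicitly. This is only a sketch based on my understanding that, by default, the file input tails for new lines and records its read position in a sincedb file, so an already-existing file may never be (re)read; I am not sure whether these settings are the right fix here:

```
input {
  file {
    path => "/Users/ritik.loomba/Downloads/sample.json"
    # Read the file from the top instead of waiting for new lines
    start_position => "beginning"
    # Don't persist the read position, so the file is re-read on every run
    sincedb_path => "/dev/null"
  }
}
output {
  file {
    path => "/Users/ritik.loomba/Downloads/unload.json"
  }
}
```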
The messages I am getting are:
bin/logstash -f config/filetofile.conf
Using bundled JDK: /Users/ritik.loomba/Documents/ELKStack/logstash-8.16.1/jdk.app/Contents/Home
Sending Logstash logs to /Users/ritik.loomba/Documents/ELKStack/logstash-8.16.1/logs which is now configured via log4j2.properties
[2024-12-02T17:55:54,549][INFO ][logstash.runner ] Log4j configuration path used is: /Users/ritik.loomba/Documents/ELKStack/logstash-8.16.1/config/log4j2.properties
[2024-12-02T17:55:54,551][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.16.1", "jruby.version"=>"jruby 9.4.9.0 (3.1.4) 2024-11-04 547c6b150e OpenJDK 64-Bit Server VM 21.0.5+11-LTS on 21.0.5+11-LTS +indy +jit [arm64-darwin]"}
[2024-12-02T17:55:54,552][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-12-02T17:55:54,553][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-12-02T17:55:54,553][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-12-02T17:55:54,564][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2024-12-02T17:55:54,725][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-12-02T17:55:54,793][INFO ][org.reflections.Reflections] Reflections took 41 ms to scan 1 urls, producing 149 keys and 523 values
[2024-12-02T17:55:54,863][INFO ][logstash.codecs.jsonlines] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2024-12-02T17:55:54,869][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-12-02T17:55:54,878][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/Users/ritik.loomba/Documents/ELKStack/logstash-8.16.1/config/filetofile.conf"], :thread=>"#<Thread:0xbba223b /Users/ritik.loomba/Documents/ELKStack/logstash-8.16.1/logstash-core/lib/logstash/java_pipeline.rb:139 run>"}
[2024-12-02T17:55:55,062][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.18}
[2024-12-02T17:55:55,066][INFO ][logstash.inputs.file ][main] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/Users/ritik.loomba/Documents/ELKStack/logstash-8.16.1/data/plugins/inputs/file/.sincedb_0e5b6dc105fb85a0435282afbff2a7b2", :path=>["/Users/ritik.loomba/Downloads/sample.json"]}
[2024-12-02T17:55:55,067][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2024-12-02T17:55:55,068][INFO ][filewatch.observingtail ][main][7b7c651e38eb3178a9477999377170c7f6fcf530140ac74c72974b5ccd08f6f6] START, creating Discoverer, Watch with file and sincedb collections
[2024-12-02T17:55:55,073][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
I am using macOS with Logstash 8.16.1.