I corrected the schedule from
schedule => "*/1 * * * "
to
schedule => " * * * *"
but I still cannot see any activity related to my conf file, logstash-ora-01.conf.
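For context, schedule is an option of the jdbc input, so the relevant block of logstash-ora-01.conf is shaped roughly like the sketch below. The connection string, credentials, driver path and SQL statement are placeholders rather than my real values, and the schedule line is written in the five-field cron form the input's rufus-scheduler expects (fire once a minute):

input {
  jdbc {
    # placeholder connection details, not the real ones
    jdbc_connection_string => "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1"
    jdbc_user => "logstash"
    jdbc_password => "changeme"
    jdbc_driver_library => "/opt/bitnami/logstash/drivers/ojdbc8.jar"
    jdbc_driver_class => "Java::oracle.jdbc.driver.OracleDriver"
    # five-field cron (minute hour day-of-month month day-of-week): run every minute
    schedule => "* * * * *"
    statement => "SELECT * FROM some_table"
  }
}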
Below is the content of /logstash/logs/logstash-plain.log:
[2022-04-06T19:35:30,108][INFO ][logstash.runner ] Log4j configuration path used is: /opt/bitnami/logstash/config/log4j2.properties
[2022-04-06T19:35:30,236][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.17.1", "jruby.version"=>"jruby 9.2.20.1 (2.5.8) 2021-11-30 2a2962fbd1 OpenJDK 64-Bit Server VM 11.0.14+9-LTS on 11.0.14+9-LTS +indy +jit [linux-x86_64]"}
[2022-04-06T19:35:30,240][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1490m, -Xmx1490m, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -Djruby.regexp.interruptible=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true]
[2022-04-06T19:35:33,837][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2022-04-06T19:35:40,684][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2022-04-06T19:35:44,602][INFO ][org.reflections.Reflections] Reflections took 390 ms to scan 1 urls, producing 119 keys and 417 values
[2022-04-06T19:35:50,402][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//127.0.0.1:9200"]}
[2022-04-06T19:35:51,982][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2022-04-06T19:35:53,699][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://127.0.0.1:9200/"}
[2022-04-06T19:35:53,747][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.17.1) {:es_version=>7}
[2022-04-06T19:35:53,761][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2022-04-06T19:35:54,343][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2022-04-06T19:35:54,343][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2022-04-06T19:35:54,678][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2022-04-06T19:35:54,833][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/opt/bitnami/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x1e36bbc3 run>"}
[2022-04-06T19:35:59,164][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>4.32}
[2022-04-06T19:35:59,244][INFO ][logstash.inputs.beats ][main] Starting input listener {:address=>"127.0.0.1:5044"}
[2022-04-06T19:36:00,648][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2022-04-06T19:36:00,689][INFO ][logstash.inputs.tcp ][main][db81e2e8bf758b0503d9fca34679b6dab77009a5c6f695a45b0db02ecdef151c] Starting tcp input listener {:address=>"127.0.0.1:5010", :ssl_enable=>false}
[2022-04-06T19:36:00,719][INFO ][logstash.inputs.http ][main][e952e964aeaee9edee05d91ee7c5223d1aa03199d6f576687d5ceb7f1e476568] Starting http input listener {:address=>"127.0.0.1:8080", :ssl=>"false"}
[2022-04-06T19:36:00,881][INFO ][logstash.inputs.gelf ][main][f13767570670f26c3d74550409a873aed4a860203dfbeea5aa4444e6b85e319c] Starting gelf listener (udp) ... {:address=>"127.0.0.1:12201"}
[2022-04-06T19:36:01,295][INFO ][logstash.inputs.udp ][main][cd19fb57ee267a9ed0bff60433a7fc564b0b5dd7efc1fb1e0f888cd900c4c690] Starting UDP listener {:address=>"127.0.0.1:5000"}
[2022-04-06T19:36:01,480][INFO ][org.logstash.beats.Server][main][b9a69d2add55e97ff229918bb1311d6853cc63de6e4b7c37b3d8685ce1d4abcd] Starting server on port: 5044
[2022-04-06T19:36:01,588][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2022-04-06T19:36:01,744][INFO ][logstash.inputs.udp ][main][cd19fb57ee267a9ed0bff60433a7fc564b0b5dd7efc1fb1e0f888cd900c4c690] UDP listener started {:address=>"127.0.0.1:5000", :receive_buffer_bytes=>"106496", :queue_size=>"2000"}
[2022-04-06T20:34:28,471][WARN ][logstash.runner ] SIGTERM received. Shutting down.
[2022-04-06T20:34:42,902][INFO ][logstash.javapipeline ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2022-04-06T20:34:43,490][INFO ][logstash.runner ] Logstash shut down.
[2022-04-06T20:38:19,184][INFO ][logstash.runner ] Log4j configuration path used is: /opt/bitnami/logstash/config/log4j2.properties
[2022-04-06T20:38:19,353][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"7.17.1", "jruby.version"=>"jruby 9.2.20.1 (2.5.8) 2021-11-30 2a2962fbd1 OpenJDK 64-Bit Server VM 11.0.14+9-LTS on 11.0.14+9-LTS +indy +jit [linux-x86_64]"}
[2022-04-06T20:38:19,366][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1490m, -Xmx1490m, -XX:+UseConcMarkSweepGC, -XX:CMSInitiatingOccupancyFraction=75, -XX:+UseCMSInitiatingOccupancyOnly, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -Djruby.jit.threshold=0, -Djruby.regexp.interruptible=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true]
[2022-04-06T20:38:24,368][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2022-04-06T20:38:40,502][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2022-04-06T20:38:56,298][INFO ][org.reflections.Reflections] Reflections took 834 ms to scan 1 urls, producing 119 keys and 417 values
[2022-04-06T20:39:11,808][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//127.0.0.1:9200"]}
[2022-04-06T20:39:18,647][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://127.0.0.1:9200/]}}
[2022-04-06T20:39:22,008][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://127.0.0.1:9200/"}
[2022-04-06T20:39:22,188][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (7.17.1) {:es_version=>7}
[2022-04-06T20:39:22,217][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>7}
[2022-04-06T20:39:23,800][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2022-04-06T20:39:23,801][INFO ][logstash.outputs.elasticsearch][main] Config is not compliant with data streams. `data_stream => auto` resolved to `false`
[2022-04-06T20:39:26,051][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>7, :ecs_compatibility=>:disabled}
[2022-04-06T20:39:26,361][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/opt/bitnami/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x115757f6 run>"}
[2022-04-06T20:39:41,534][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>14.87}
[2022-04-06T20:39:47,283][INFO ][logstash.inputs.beats ][main] Starting input listener {:address=>"127.0.0.1:5044"}
[2022-04-06T20:39:49,190][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2022-04-06T20:39:49,579][INFO ][logstash.inputs.http ][main][e952e964aeaee9edee05d91ee7c5223d1aa03199d6f576687d5ceb7f1e476568] Starting http input listener {:address=>"127.0.0.1:8080", :ssl=>"false"}
[2022-04-06T20:39:49,836][INFO ][logstash.inputs.tcp ][main][db81e2e8bf758b0503d9fca34679b6dab77009a5c6f695a45b0db02ecdef151c] Starting tcp input listener {:address=>"127.0.0.1:5010", :ssl_enable=>false}
[2022-04-06T20:39:50,002][INFO ][logstash.inputs.gelf ][main][f13767570670f26c3d74550409a873aed4a860203dfbeea5aa4444e6b85e319c] Starting gelf listener (udp) ... {:address=>"127.0.0.1:12201"}
[2022-04-06T20:39:51,589][INFO ][logstash.inputs.udp ][main][cd19fb57ee267a9ed0bff60433a7fc564b0b5dd7efc1fb1e0f888cd900c4c690] Starting UDP listener {:address=>"127.0.0.1:5000"}
[2022-04-06T20:39:51,725][INFO ][org.logstash.beats.Server][main][b9a69d2add55e97ff229918bb1311d6853cc63de6e4b7c37b3d8685ce1d4abcd] Starting server on port: 5044
[2022-04-06T20:39:53,268][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2022-04-06T20:39:54,361][INFO ][logstash.inputs.udp ][main][cd19fb57ee267a9ed0bff60433a7fc564b0b5dd7efc1fb1e0f888cd900c4c690] UDP listener started {:address=>"127.0.0.1:5000", :receive_buffer_bytes=>"106496", :queue_size=>"2000"}
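What stands out to me in these startup lines is that pipeline.sources only lists /opt/bitnami/logstash/pipeline/logstash.conf and that pipelines.yml is reported as ignored, so logstash-ora-01.conf never seems to be loaded at all. For completeness, this is roughly the output section of that file; the index name is a placeholder for the one I keep checking for new documents:

output {
  elasticsearch {
    hosts => ["http://127.0.0.1:9200"]
    # placeholder index name
    index => "ora-01-%{+YYYY.MM.dd}"
  }
}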