Well, I rechecked: the docker-compose.yml apparently hadn't been saved. I saved it and re-created the containers, but I'm still getting the error:
2024-10-28 17:43:47 Using bundled JDK: /usr/share/logstash/jdk
2024-10-28 17:44:01 Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
2024-10-28 17:44:01 [2024-10-28T12:44:01,573][WARN ][deprecation.logstash.settings] The setting `http.host` is a deprecated alias for `api.http.host` and will be removed in a future release of Logstash. Please use api.http.host instead
2024-10-28 17:44:01 [2024-10-28T12:44:01,604][INFO ][logstash.runner ] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
2024-10-28 17:44:01 [2024-10-28T12:44:01,610][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.15.3", "jruby.version"=>"jruby 9.4.8.0 (3.1.4) 2024-07-02 4d41e55a67 OpenJDK 64-Bit Server VM 21.0.4+7-LTS on 21.0.4+7-LTS +indy +jit [aarch64-linux]"}
2024-10-28 17:44:01 [2024-10-28T12:44:01,613][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Dls.cgroup.cpuacct.path.override=/, -Dls.cgroup.cpu.path.override=/, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
2024-10-28 17:44:01 [2024-10-28T12:44:01,620][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
2024-10-28 17:44:01 [2024-10-28T12:44:01,620][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
2024-10-28 17:44:01 [2024-10-28T12:44:01,634][INFO ][logstash.settings ] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
2024-10-28 17:44:01 [2024-10-28T12:44:01,641][INFO ][logstash.settings ] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
2024-10-28 17:44:01 [2024-10-28T12:44:01,928][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"88a629ab-5ed3-4fd6-8b15-1ab86ed3684a", :path=>"/usr/share/logstash/data/uuid"}
2024-10-28 17:44:03 [2024-10-28T12:44:03,177][WARN ][logstash.monitoringextension.pipelineregisterhook] xpack.monitoring.enabled has not been defined, but found elasticsearch configuration. Please explicitly set `xpack.monitoring.enabled: true` in logstash.yml
2024-10-28 17:44:03 [2024-10-28T12:44:03,189][WARN ][deprecation.logstash.monitoringextension.pipelineregisterhook] Internal collectors option for Logstash monitoring is deprecated and targeted for removal in the next major version.
2024-10-28 17:44:03 Please configure Elastic Agent to monitor Logstash. Documentation can be found at:
2024-10-28 17:44:03 https://www.elastic.co/guide/en/logstash/current/monitoring-with-elastic-agent.html
2024-10-28 17:44:04 [2024-10-28T12:44:04,772][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
2024-10-28 17:44:04 [2024-10-28T12:44:04,986][INFO ][logstash.licensechecker.licensereader] Failed to perform request {:message=>"Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused", :exception=>Manticore::SocketException, :cause=>#<Java::OrgApacheHttpConn::HttpHostConnectException: Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused>}
2024-10-28 17:44:04 [2024-10-28T12:44:04,995][WARN ][logstash.licensechecker.licensereader] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"http://elasticsearch:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::SocketException] Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused"}
2024-10-28 17:44:05 [2024-10-28T12:44:05,016][INFO ][logstash.licensechecker.licensereader] Failed to perform request {:message=>"Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused", :exception=>Manticore::SocketException, :cause=>#<Java::OrgApacheHttpConn::HttpHostConnectException: Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused>}
2024-10-28 17:44:05 [2024-10-28T12:44:05,016][WARN ][logstash.licensechecker.licensereader] Marking url as dead. Last error: [LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError] Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::SocketException] Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused {:url=>http://elasticsearch:9200/, :error_message=>"Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::SocketException] Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused", :error_class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError"}
2024-10-28 17:44:05 [2024-10-28T12:44:05,020][WARN ][logstash.licensechecker.licensereader] Attempt to fetch Elasticsearch cluster info failed. Sleeping for 0.02 {:fail_count=>1, :exception=>"Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::SocketException] Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused"}
2024-10-28 17:44:05 [2024-10-28T12:44:05,043][ERROR][logstash.licensechecker.licensereader] Unable to retrieve Elasticsearch cluster info. {:message=>"No Available connections", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError}
2024-10-28 17:44:05 [2024-10-28T12:44:05,055][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"No Available connections"}
2024-10-28 17:44:05 [2024-10-28T12:44:05,114][ERROR][logstash.monitoring.internalpipelinesource] Failed to fetch X-Pack information from Elasticsearch. This is likely due to failure to reach a live Elasticsearch cluster.
2024-10-28 17:44:05 [2024-10-28T12:44:05,396][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
2024-10-28 17:44:06 [2024-10-28T12:44:06,781][INFO ][org.reflections.Reflections] Reflections took 524 ms to scan 1 urls, producing 138 keys and 481 values
2024-10-28 17:44:07 [2024-10-28T12:44:07,458][INFO ][logstash.codecs.jsonlines] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
2024-10-28 17:44:07 [2024-10-28T12:44:07,684][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
2024-10-28 17:44:07 [2024-10-28T12:44:07,721][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://elasticsearch:9200"]}
2024-10-28 17:44:07 [2024-10-28T12:44:07,726][WARN ][logstash.outputs.elasticsearch][main] You have enabled encryption but DISABLED certificate verification, to make sure your data is secure set `ssl_verification_mode => full`
2024-10-28 17:44:07 [2024-10-28T12:44:07,751][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@elasticsearch:9200/]}}
2024-10-28 17:44:07 [2024-10-28T12:44:07,838][INFO ][logstash.outputs.elasticsearch][main] Failed to perform request {:message=>"Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused", :exception=>Manticore::SocketException, :cause=>#<Java::OrgApacheHttpConn::HttpHostConnectException: Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused>}
2024-10-28 17:44:07 [2024-10-28T12:44:07,839][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://elastic:xxxxxx@elasticsearch:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [https://elasticsearch:9200/][Manticore::SocketException] Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused"}
2024-10-28 17:44:07 [2024-10-28T12:44:07,863][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"logstash-%{+YYYY.MM.dd}"}
2024-10-28 17:44:07 [2024-10-28T12:44:07,871][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
2024-10-28 17:44:07 [2024-10-28T12:44:07,918][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>10, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1250, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x14c10b5d /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
2024-10-28 17:44:09 [2024-10-28T12:44:09,092][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>1.17}
2024-10-28 17:44:09 [2024-10-28T12:44:09,224][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
2024-10-28 17:44:09 [2024-10-28T12:44:09,227][INFO ][logstash.inputs.tcp ][main][eea5192a76575cbf28bf59c9e8d08a887ec382c86d2f3d1d04988cee27d6fa25] Starting tcp input listener {:address=>"0.0.0.0:5044", :ssl_enabled=>false}
2024-10-28 17:44:09 [2024-10-28T12:44:09,236][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
2024-10-28 17:44:12 [2024-10-28T12:44:12,947][INFO ][logstash.outputs.elasticsearch][main] Failed to perform request {:message=>"Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused", :exception=>Manticore::SocketException, :cause=>#<Java::OrgApacheHttpConn::HttpHostConnectException: Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused>}
2024-10-28 17:44:12 [2024-10-28T12:44:12,957][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://elastic:xxxxxx@elasticsearch:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [https://elasticsearch:9200/][Manticore::SocketException] Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused"}
2024-10-28 17:44:18 [2024-10-28T12:44:17,992][INFO ][logstash.outputs.elasticsearch][main] Failed to perform request {:message=>"Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused", :exception=>Manticore::SocketException, :cause=>#<Java::OrgApacheHttpConn::HttpHostConnectException: Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused>}
2024-10-28 17:44:18 [2024-10-28T12:44:17,999][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://elastic:xxxxxx@elasticsearch:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [https://elasticsearch:9200/][Manticore::SocketException] Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused"}
2024-10-28 17:44:23 [2024-10-28T12:44:23,020][INFO ][logstash.outputs.elasticsearch][main] Failed to perform request {:message=>"Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused", :exception=>Manticore::SocketException, :cause=>#<Java::OrgApacheHttpConn::HttpHostConnectException: Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused>}
2024-10-28 17:44:23 [2024-10-28T12:44:23,022][WARN ][logstash.outputs.elasticsearch][main] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"https://elastic:xxxxxx@elasticsearch:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [https://elasticsearch:9200/][Manticore::SocketException] Connect to elasticsearch:9200 [elasticsearch/192.168.240.2] failed: Connection refused"}
2024-10-28 17:44:28 [2024-10-28T12:44:28,922][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@elasticsearch:9200/"}
2024-10-28 17:44:28 [2024-10-28T12:44:28,978][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.15.3) {:es_version=>8}
2024-10-28 17:44:28 [2024-10-28T12:44:28,978][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
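Note that the `Connection refused` errors occur only during the first ~25 seconds, and at `12:44:28` Logstash logs `Restored connection to ES instance`, so this looks like a startup-ordering race: Logstash comes up before Elasticsearch is accepting connections. One common way to avoid the noisy retries is to gate the Logstash service on an Elasticsearch healthcheck. Below is a minimal sketch (service names, ports, and the `ELASTIC_PASSWORD` variable are assumptions, not taken from the actual compose file):

```yaml
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.15.3
    healthcheck:
      # Assumed credentials/scheme; adjust to match your security setup.
      # -k skips certificate verification, mirroring the relaxed
      # ssl_verification_mode seen in the Logstash output warning.
      test: ["CMD-SHELL", "curl -sk -u elastic:${ELASTIC_PASSWORD} https://localhost:9200 >/dev/null || exit 1"]
      interval: 10s
      timeout: 5s
      retries: 12

  logstash:
    image: docker.elastic.co/logstash/logstash:8.15.3
    depends_on:
      elasticsearch:
        condition: service_healthy  # start only once ES answers
```

This only suppresses the startup retries; since the connection was eventually restored in your log, the pipeline itself appears to be reaching Elasticsearch once it is up.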