I found the logs. They seem to be located in /usr/share/logstash:
[2024-04-10T14:29:22,482][WARN ][logstash.runner ] SIGTERM received. Shutting down.
[2024-04-10T14:29:22,903][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline terminated {"pipeline.id"=>".monitoring-logstash"}
[2024-04-10T14:29:23,561][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:".monitoring-logstash"}
[2024-04-10T14:29:26,695][INFO ][logstash.javapipeline ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2024-04-10T14:29:27,591][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
[2024-04-10T14:29:27,631][INFO ][logstash.runner ] Logstash shut down.
[2024-04-10T14:29:46,857][INFO ][logstash.runner ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2024-04-10T14:29:46,864][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.13.0", "jruby.version"=>"jruby 9.4.5.0 (3.1.4) 2023-11-02 1abae2700f OpenJDK 64-Bit Server VM 17.0.10+7 on 17.0.10+7 +indy +jit [x86_64-linux]"}
[2024-04-10T14:29:46,872][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms512m, -Xmx512m, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-04-10T14:29:46,875][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-04-10T14:29:46,876][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-04-10T14:29:47,745][WARN ][logstash.monitoringextension.pipelineregisterhook] xpack.monitoring.enabled has not been defined, but found elasticsearch configuration. Please explicitly set `xpack.monitoring.enabled: true` in logstash.yml
[2024-04-10T14:29:48,369][INFO ][logstash.monitoring.internalpipelinesource] Monitoring License OK
[2024-04-10T14:29:48,370][INFO ][logstash.monitoring.internalpipelinesource] Validated license for monitoring. Enabling monitoring pipeline.
[2024-04-10T14:29:48,527][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-04-10T14:29:49,136][INFO ][org.reflections.Reflections] Reflections took 173 ms to scan 1 urls, producing 132 keys and 468 values
[2024-04-10T14:29:49,335][INFO ][logstash.codecs.json ] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2024-04-10T14:29:49,434][INFO ][logstash.javapipeline ] Pipeline `.monitoring-logstash` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-04-10T14:29:49,471][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearchMonitoring", :hosts=>["http://elastic_ip:9200"]}
[2024-04-10T14:29:49,484][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-04-10T14:29:49,486][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic_ip:9200/]}}
[2024-04-10T14:29:49,505][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["//elastic_ip:9200"]}
[2024-04-10T14:29:49,516][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Restored connection to ES instance {:url=>"http://elastic_ip:9200/"}
[2024-04-10T14:29:49,517][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch version determined (8.13.0) {:es_version=>8}
[2024-04-10T14:29:49,517][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2024-04-10T14:29:49,522][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic_ip:9200/]}}
[2024-04-10T14:29:49,541][WARN ][logstash.javapipeline ][.monitoring-logstash] 'pipeline.ordered' is enabled and is likely less efficient, consider disabling if preserving event order is not necessary
[2024-04-10T14:29:49,552][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"http://elastic_ip:9200/"}
[2024-04-10T14:29:49,553][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.13.0) {:es_version=>8}
[2024-04-10T14:29:49,553][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2024-04-10T14:29:49,572][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"security-service-log"}
[2024-04-10T14:29:49,575][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2024-04-10T14:29:49,579][INFO ][logstash.javapipeline ][.monitoring-logstash] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, "pipeline.sources"=>["monitoring pipeline"], :thread=>"#<Thread:0x578fb018 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-04-10T14:29:49,603][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>2, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>250, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x2b95b4bc /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-04-10T14:29:49,606][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2024-04-10T14:29:50,319][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline Java execution initialization time {"seconds"=>0.74}
[2024-04-10T14:29:50,318][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.71}
[2024-04-10T14:29:50,341][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[2024-04-10T14:29:50,364][INFO ][logstash.inputs.tcp ][main] Automatically switching from json to json_lines codec {:plugin=>"tcp"}
[2024-04-10T14:29:50,379][INFO ][logstash.codecs.jsonlines][main] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2024-04-10T14:29:50,505][INFO ][logstash.inputs.tcp ][main][e6d8be9da7f73d52c3bcda26708d79587022fb657449e39a23821cf26fa4b952] Starting tcp input listener {:address=>"0.0.0.0:5000", :ssl_enabled=>false}
[2024-04-10T14:29:50,507][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2024-04-10T14:29:50,527][INFO ][logstash.agent ] Pipelines running {:count=>2, :running_pipelines=>[:".monitoring-logstash", :main], :non_running_pipelines=>[]}
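For what it's worth, the startup itself looks clean: both pipelines (main and .monitoring-logstash) come up and the API endpoint is listening on 9600. Based on the lines above, the pipeline config at /usr/share/logstash/pipeline/logstash.conf probably looks roughly like the sketch below. The port 5000, the json codec, the elastic_ip host and the security-service-log index are taken straight from the log; everything else is my assumption, not the actual file.

input {
  tcp {
    port  => 5000        # matches "Starting tcp input listener {:address=>"0.0.0.0:5000"}"
    codec => json        # Logstash auto-switches this to json_lines for the tcp input
  }
}

output {
  elasticsearch {
    hosts => ["http://elastic_ip:9200"]   # host redacted the same way as in the log
    index => "security-service-log"       # this is why data streams resolve to `false`
  }
}

Two of the warnings map to settings you could add if they bother you: the monitoring one asks for an explicit `xpack.monitoring.enabled: true` in logstash.yml (the log message says so itself), and the ECS one goes away if you give the json codec a target, e.g. `codec => json { target => "[app]" }`, where "[app]" is just a placeholder field name, pick whatever fits your schema.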