Using bundled JDK: /usr/share/logstash/jdk
Sending Logstash logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2024-08-21T17:24:37,796][WARN ][deprecation.logstash.settings] The setting `http.host` is a deprecated alias for `api.http.host` and will be removed in a future release of Logstash. Please use api.http.host instead
[2024-08-21T17:24:37,820][INFO ][logstash.runner ] Log4j configuration path used is: /usr/share/logstash/config/log4j2.properties
[2024-08-21T17:24:37,823][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.15.0", "jruby.version"=>"jruby 9.4.8.0 (3.1.4) 2024-07-02 4d41e55a67 OpenJDK 64-Bit Server VM 21.0.4+7-LTS on 21.0.4+7-LTS +indy +jit [x86_64-linux]"}
[2024-08-21T17:24:37,829][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Dls.cgroup.cpuacct.path.override=/, -Dls.cgroup.cpu.path.override=/, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-08-21T17:24:37,834][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-08-21T17:24:37,835][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-08-21T17:24:37,843][INFO ][logstash.settings ] Creating directory {:setting=>"path.queue", :path=>"/usr/share/logstash/data/queue"}
[2024-08-21T17:24:37,849][INFO ][logstash.settings ] Creating directory {:setting=>"path.dead_letter_queue", :path=>"/usr/share/logstash/data/dead_letter_queue"}
[2024-08-21T17:24:38,243][INFO ][logstash.agent ] No persistent UUID file found. Generating new UUID {:uuid=>"ce8725b0-80ec-474e-951e-0e70c50532c8", :path=>"/usr/share/logstash/data/uuid"}
[2024-08-21T17:24:39,143][WARN ][logstash.monitoringextension.pipelineregisterhook] xpack.monitoring.enabled has not been defined, but found elasticsearch configuration. Please explicitly set `xpack.monitoring.enabled: true` in logstash.yml
[2024-08-21T17:24:39,143][WARN ][deprecation.logstash.monitoringextension.pipelineregisterhook] Internal collectors option for Logstash monitoring is deprecated and targeted for removal in the next major version.
Please configure Elastic Agent to monitor Logstash. Documentation can be found at:
https://www.elastic.co/guide/en/logstash/current/monitoring-with-elastic-agent.html
[2024-08-21T17:24:39,889][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@elasticsearch:9200/]}}
[2024-08-21T17:24:40,066][INFO ][logstash.licensechecker.licensereader] Failed to perform request {:message=>"Connect to elasticsearch:9200 [elasticsearch/172.18.0.2] failed: Connection refused", :exception=>Manticore::SocketException, :cause=>#<Java::OrgApacheHttpConn::HttpHostConnectException: Connect to elasticsearch:9200 [elasticsearch/172.18.0.2] failed: Connection refused>}
[2024-08-21T17:24:40,069][WARN ][logstash.licensechecker.licensereader] Attempted to resurrect connection to dead ES instance, but got an error {:url=>"http://elastic:xxxxxx@elasticsearch:9200/", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError, :message=>"Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::SocketException] Connect to elasticsearch:9200 [elasticsearch/172.18.0.2] failed: Connection refused"}
[2024-08-21T17:24:40,089][INFO ][logstash.licensechecker.licensereader] Failed to perform request {:message=>"Connect to elasticsearch:9200 [elasticsearch/172.18.0.2] failed: Connection refused", :exception=>Manticore::SocketException, :cause=>#<Java::OrgApacheHttpConn::HttpHostConnectException: Connect to elasticsearch:9200 [elasticsearch/172.18.0.2] failed: Connection refused>}
[2024-08-21T17:24:40,091][WARN ][logstash.licensechecker.licensereader] Marking url as dead. Last error: [LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError] Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::SocketException] Connect to elasticsearch:9200 [elasticsearch/172.18.0.2] failed: Connection refused {:url=>http://elastic:xxxxxx@elasticsearch:9200/, :error_message=>"Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::SocketException] Connect to elasticsearch:9200 [elasticsearch/172.18.0.2] failed: Connection refused", :error_class=>"LogStash::Outputs::ElasticSearch::HttpClient::Pool::HostUnreachableError"}
[2024-08-21T17:24:40,097][WARN ][logstash.licensechecker.licensereader] Attempt to fetch Elasticsearch cluster info failed. Sleeping for 0.02 {:fail_count=>1, :exception=>"Elasticsearch Unreachable: [http://elasticsearch:9200/][Manticore::SocketException] Connect to elasticsearch:9200 [elasticsearch/172.18.0.2] failed: Connection refused"}
[2024-08-21T17:24:40,120][ERROR][logstash.licensechecker.licensereader] Unable to retrieve Elasticsearch cluster info. {:message=>"No Available connections", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError}
[2024-08-21T17:24:40,123][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"No Available connections"}
[2024-08-21T17:24:40,155][ERROR][logstash.monitoring.internalpipelinesource] Failed to fetch X-Pack information from Elasticsearch. This is likely due to failure to reach a live Elasticsearch cluster.
[2024-08-21T17:24:40,338][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-08-21T17:24:40,747][INFO ][org.reflections.Reflections] Reflections took 251 ms to scan 1 urls, producing 138 keys and 481 values
[2024-08-21T17:24:41,484][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-08-21T17:24:41,557][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>12, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1500, "pipeline.sources"=>["/usr/share/logstash/pipeline/logstash.conf"], :thread=>"#<Thread:0x855f367 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-08-21T17:24:43,125][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>1.57}
[2024-08-21T17:24:43,142][INFO ][logstash.inputs.beats ][main] Starting input listener {:address=>"0.0.0.0:5044"}
[2024-08-21T17:24:43,169][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2024-08-21T17:24:43,200][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2024-08-21T17:24:43,381][INFO ][org.logstash.beats.Server][main][0710cad67e8f47667bc7612580d5b91f691dd8262a4187d9eca8cf87229d04aa] Starting server on port: 5044
[2024-08-21T17:25:10,146][ERROR][logstash.licensechecker.licensereader] Unable to retrieve Elasticsearch cluster info. {:message=>"No Available connections", :exception=>LogStash::Outputs::ElasticSearch::HttpClient::Pool::NoConnectionAvailableError}
[2024-08-21T17:25:10,147][ERROR][logstash.licensechecker.licensereader] Unable to retrieve license information from license server {:message=>"No Available connections"}
[2024-08-21T17:25:10,197][WARN ][logstash.licensechecker.licensereader] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@elasticsearch:9200/"}
[2024-08-21T17:25:10,217][INFO ][logstash.licensechecker.licensereader] Elasticsearch version determined (8.15.0) {:es_version=>8}
[2024-08-21T17:25:10,218][WARN ][logstash.licensechecker.licensereader] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2024-08-21T17:25:40,170][INFO ][logstash.monitoring.internalpipelinesource] Monitoring License OK
[2024-08-21T17:25:40,170][INFO ][logstash.monitoring.internalpipelinesource] Validated license for monitoring. Enabling monitoring pipeline.
[2024-08-21T17:25:40,429][INFO ][logstash.javapipeline ] Pipeline `.monitoring-logstash` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-08-21T17:25:40,439][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearchMonitoring", :hosts=>["http://elasticsearch:9200"]}
[2024-08-21T17:25:40,444][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elastic:xxxxxx@elasticsearch:9200/]}}
[2024-08-21T17:25:40,462][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Restored connection to ES instance {:url=>"http://elastic:xxxxxx@elasticsearch:9200/"}
[2024-08-21T17:25:40,463][INFO ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Elasticsearch version determined (8.15.0) {:es_version=>8}
[2024-08-21T17:25:40,463][WARN ][logstash.outputs.elasticsearchmonitoring][.monitoring-logstash] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2024-08-21T17:25:40,471][WARN ][logstash.javapipeline ][.monitoring-logstash] 'pipeline.ordered' is enabled and is likely less efficient, consider disabling if preserving event order is not necessary
[2024-08-21T17:25:40,472][INFO ][logstash.javapipeline ][.monitoring-logstash] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>2, "pipeline.sources"=>["monitoring pipeline"], :thread=>"#<Thread:0x2565e3c6 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-08-21T17:25:40,484][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline Java execution initialization time {"seconds"=>0.01}
[2024-08-21T17:25:40,489][INFO ][logstash.javapipeline ][.monitoring-logstash] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[2024-08-21T17:25:40,502][INFO ][logstash.agent ] Pipelines running {:count=>2, :running_pipelines=>[:main, :".monitoring-logstash"], :non_running_pipelines=>[]}
{
"agent" => {
"name" => "f7aac19a2e9b",
"version" => "8.15.0",
"ephemeral_id" => "d7628ec3-b00e-4ce8-8901-dc638801b4c5",
"type" => "filebeat",
"id" => "eed035d7-5d57-41e7-a161-b2dc3f54e579"
},
"@timestamp" => 2024-08-21T17:29:53.615Z,
"@version" => "1",
"input" => {
"type" => "filestream"
},
"ecs" => {
"version" => "8.0.0"
},
"log" => {
"offset" => 54,
"file" => {
"inode" => "6192449488337721",
"device_id" => "83",
"path" => "/usr/share/filebeat/data/log20240801.txt"
}
},
"host" => {
"name" => "f7aac19a2e9b"
},
"event" => {
"original" => "2024-08-08 00:18:57.808 +02:00 [INF] Start DMTService."
},
"tags" => [
[0] "beats_input_codec_plain_applied"
],
"message" => "2024-08-08 00:18:57.808 +02:00 [INF] Start DMTService."
}
{
"agent" => {
"id" => "eed035d7-5d57-41e7-a161-b2dc3f54e579",
"name" => "f7aac19a2e9b",
"ephemeral_id" => "d7628ec3-b00e-4ce8-8901-dc638801b4c5",
"type" => "filebeat",
"version" => "8.15.0"
},
"@timestamp" => 2024-08-21T17:29:53.616Z,
"@version" => "1",
"input" => {
"type" => "filestream"
},
"ecs" => {
"version" => "8.0.0"
},
"log" => {
"offset" => 946,
"file" => {
"inode" => "6192449488337721",
"device_id" => "83",
"path" => "/usr/share/filebeat/data/log20240801.txt"
}
},
"host" => {
"name" => "f7aac19a2e9b"
},
"event" => {
"original" => "2024-08-08 00:18:59.109 +02:00 [DBG] First script: UPDATE im_dmt0 SET actieswitch = 1, [UPDATETIMESTAMP] = getDate() WHERE gguid = 0x6A1D3EE31EC64E59A124297E661CD3BC"
},
"tags" => [
[0] "beats_input_codec_plain_applied"
],
"message" => "2024-08-08 00:18:59.109 +02:00 [DBG] First script: UPDATE im_dmt0 SET actieswitch = 1, [UPDATETIMESTAMP] = getDate() WHERE gguid = 0x6A1D3EE31EC64E59A124297E661CD3BC"
}
{
"agent" => {
"name" => "f7aac19a2e9b",
"version" => "8.15.0",
"ephemeral_id" => "d7628ec3-b00e-4ce8-8901-dc638801b4c5",
"type" => "filebeat",
"id" => "eed035d7-5d57-41e7-a161-b2dc3f54e579"
},
"@timestamp" => 2024-08-21T17:29:53.616Z,
"@version" => "1",
"input" => {
"type" => "filestream"
},
"ecs" => {
"version" => "8.0.0"
},
"log" => {
"offset" => 679,
"file" => {
"inode" => "6192449488337721",
"device_id" => "83",
"path" => "/usr/share/filebeat/data/log20240801.txt"
}
},
"host" => {
"name" => "f7aac19a2e9b"
},
"event" => {
"original" => "2024-08-08 00:18:59.105 +02:00 [DBG] First script: UPDATE im_dmt0 SET actieswitch = 1, [UPDATETIMESTAMP] = getDate() WHERE gguid = 0x3A270AEE44DF49B793A8898DD32C99D8"
},
"tags" => [
[0] "beats_input_codec_plain_applied"
],
"message" => "2024-08-08 00:18:59.105 +02:00 [DBG] First script: UPDATE im_dmt0 SET actieswitch = 1, [UPDATETIMESTAMP] = getDate() WHERE gguid = 0x3A270AEE44DF49B793A8898DD32C99D8"
}
{
"agent" => {
"name" => "f7aac19a2e9b",
"version" => "8.15.0",
"ephemeral_id" => "d7628ec3-b00e-4ce8-8901-dc638801b4c5",
"type" => "filebeat",
"id" => "eed035d7-5d57-41e7-a161-b2dc3f54e579"
},
"@timestamp" => 2024-08-21T17:29:53.616Z,
"@version" => "1",
"input" => {
"type" => "filestream"
},
"ecs" => {
"version" => "8.0.0"
},
"log" => {
"offset" => 2352,
"file" => {
"inode" => "6192449488337721",
"device_id" => "83",
"path" => "/usr/share/filebeat/data/log20240801.txt"
}
},
"host" => {
"name" => "f7aac19a2e9b"
},
"event" => {
"original" => "2024-08-08 00:18:59.118 +02:00 [DBG] First script: UPDATE im_dmt0 SET actieswitch = 1, [UPDATETIMESTAMP] = getDate() WHERE gguid = 0x64A78B7D64CD444CA65AC87D81ABD5A8"
},
"tags" => [
[0] "beats_input_codec_plain_applied"
],
"message" => "2024-08-08 00:18:59.118 +02:00 [DBG] First script: UPDATE im_dmt0 SET actieswitch = 1, [UPDATETIMESTAMP] = getDate() WHERE gguid = 0x64A78B7D64CD444CA65AC87D81ABD5A8"
}
{
"agent" => {
"name" => "f7aac19a2e9b",
"version" => "8.15.0",
"ephemeral_id" => "d7628ec3-b00e-4ce8-8901-dc638801b4c5",
"type" => "filebeat",
"id" => "eed035d7-5d57-41e7-a161-b2dc3f54e579"
},
"@timestamp" => 2024-08-21T17:29:53.612Z,
"@version" => "1",
"input" => {
"type" => "filestream"
},
I extracted all the logs from my container to a text file and copied them here. As far as I could see, there was no critical error.
> It doesn't matter, anything that starts with logs- will match the built-in template.
Ah, OK. My filename is "log20240808", so I'm not sure that will automatically match.
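For what it's worth, the built-in `logs-*-*` index template matches the name of the data stream that events are written to, not the name of the source file Filebeat reads. A minimal sketch of a Logstash elasticsearch output that would land events in a matching data stream (the host and the `dmt` dataset name here are placeholders, not taken from this thread):

```conf
output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    # Data stream naming is type-dataset-namespace => "logs-dmt-default",
    # which matches the built-in "logs-*-*" template regardless of the
    # original file name (e.g. log20240801.txt).
    data_stream           => "true"
    data_stream_type      => "logs"
    data_stream_dataset   => "dmt"
    data_stream_namespace => "default"
  }
}
```

With a setup like this, the file name never enters into template matching; only the resulting data stream name does.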