The log is reproduced below. Apologies if it's too long:
Using bundled JDK: /opt/total/logstash/jdk
Sending Logstash logs to /opt/total/logstash/logs which is now configured via log4j2.properties
[2024-06-20T19:00:56,814][INFO ][logstash.runner ] Log4j configuration path used is: /opt/total/logstash/config/log4j2.properties
[2024-06-20T19:00:56,821][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.6.2", "jruby.version"=>"jruby 9.3.10.0 (2.6.8) 2023-02-01 107b2e6697 OpenJDK 64-Bit Server VM 17.0.6+10 on 17.0.6+10 +indy +jit [x86_64-linux]"}
[2024-06-20T19:00:56,824][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms6g, -Xmx6g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2024-06-20T19:00:57,017][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2024-06-20T19:00:57,630][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-06-20T19:01:05,829][INFO ][org.reflections.Reflections] Reflections took 100 ms to scan 1 urls, producing 127 keys and 444 values
[2024-06-20T19:01:08,847][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch password=><password>, hosts=>[https://xxx.xx.xx.xx:43044, https://xxx.xx.xx.xx:43044, https://xxx.xx.xx.xx:43044, https://xxx.xx.xx.xx:43044], cacert=>"/opt/total/elasticsearch/config/certs/http_ca.crt", index=>"proxy_statslog-%{+YYYY.MM.dd}", id=>"e9dff330011ca662aba25c47bc4235c701b0f294d909e8539e5cc0843f82e666", ssl=>true, user=>"elastic", document_type=>"%{[@metadata][type]}", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_ab56725a-efe9-41a4-b32b-1be433b7a2a6", enable_metric=>true, charset=>"UTF-8">, workers=>1, ssl_certificate_verification=>true, sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false, retry_initial_interval=>2, retry_max_interval=>64, data_stream_type=>"logs", data_stream_dataset=>"generic", data_stream_namespace=>"default", data_stream_sync_fields=>true, data_stream_auto_routing=>true, manage_template=>true, template_overwrite=>false, template_api=>"auto", doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy", dlq_on_failed_indexname_interpolation=>true>}
[2024-06-20T19:01:08,917][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-06-20T19:01:08,928][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://xxx.xx.xx.xx:43044", "https://xxx.xx.xx.xx:43044", "https://xxx.xx.xx.xx:43044", "https://xxx.xx.xx.xx:43044"]}
[2024-06-20T19:01:09,026][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@xxx.xx.xx.xx:43044/, https://elastic:xxxxxx@xxx.xx.xx.xx:43044/, https://elastic:xxxxxx@xxx.xx.xx.xx:43044/, https://elastic:xxxxxx@xxx.xx.xx.xx:43044/]}}
[2024-06-20T19:01:09,308][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@xxx.xx.xx.xx:43044/"}
[2024-06-20T19:01:09,314][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.7.0) {:es_version=>8}
[2024-06-20T19:01:09,315][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2024-06-20T19:01:09,371][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@xxx.xx.xx.xx:43044/"}
[2024-06-20T19:01:09,484][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@xxx.xx.xx.xx:43044/"}
[2024-06-20T19:01:09,566][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@xxx.xx.xx.xx:43044/"}
[2024-06-20T19:01:16,615][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"uat_epf_msg_log"}
[2024-06-20T19:01:16,616][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2024-06-20T19:01:16,617][WARN ][logstash.outputs.elasticsearch][main] Elasticsearch Output configured with `ecs_compatibility => v8`, which resolved to an UNRELEASED preview of version 8.0.0 of the Elastic Common Schema. Once ECS v8 and an updated release of this plugin are publicly available, you will need to update this plugin to resolve this warning.
[2024-06-20T19:01:16,618][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://xxx.xx.xx.xx:43044", "https://xxx.xx.xx.xx:43044", "https://xxx.xx.xx.xx:43044", "https://xxx.xx.xx.xx:43044"]}
[2024-06-20T19:01:16,625][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@xxx.xx.xx.xx:43044/, https://elastic:xxxxxx@xxx.xx.xx.xx:43044/, https://elastic:xxxxxx@xxx.xx.xx.xx:43044/, https://elastic:xxxxxx@xxx.xx.xx.xx:43044/]}}
[2024-06-20T19:01:16,626][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2024-06-20T19:01:16,656][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@xxx.xx.xx.xx:43044/"}
[2024-06-20T19:01:16,660][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.7.0) {:es_version=>8}
[2024-06-20T19:01:16,660][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2024-06-20T19:01:16,708][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@xxx.xx.xx.xx:43044/"}
[2024-06-20T19:01:16,751][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@xxx.xx.xx.xx:43044/"}
[2024-06-20T19:01:16,797][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@xxx.xx.xx.xx:43044/"}
[2024-06-20T19:01:17,215][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"proxy_statslog-%{+YYYY.MM.dd}", "document_type"=>"%{[@metadata][type]}"}
[2024-06-20T19:01:22,115][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2024-06-20T19:01:22,116][WARN ][logstash.outputs.elasticsearch][main] Elasticsearch Output configured with `ecs_compatibility => v8`, which resolved to an UNRELEASED preview of version 8.0.0 of the Elastic Common Schema. Once ECS v8 and an updated release of this plugin are publicly available, you will need to update this plugin to resolve this warning.
[2024-06-20T19:01:22,126][WARN ][logstash.filters.grok ][main] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
[2024-06-20T19:01:22,136][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2024-06-20T19:01:22,216][WARN ][logstash.filters.grok ][main] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
[2024-06-20T19:01:22,246][WARN ][logstash.filters.grok ][main] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
[2024-06-20T19:01:22,291][WARN ][logstash.filters.grok ][main] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
[2024-06-20T19:01:22,317][WARN ][logstash.filters.grok ][main] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
[2024-06-20T19:01:22,387][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/opt/total/logstash/config/mergedlogstash_uat.conf"], :thread=>"#<Thread:0x33c754c4@/opt/total/logstash-8.6.2/logstash-core/lib/logstash/java_pipeline.rb:131 run>"}
[2024-06-20T19:01:28,773][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>6.38}
[2024-06-20T19:01:28,829][INFO ][logstash.inputs.beats ][main] Starting input listener {:address=>"0.0.0.0:5044"}
[2024-06-20T19:01:28,837][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2024-06-20T19:01:28,866][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2024-06-20T19:01:28,967][INFO ][org.logstash.beats.Server][main][d765812c289289352c8eb26386f720f01008f2fd08d2ee7fde1217b9bd7a5920] Starting server on port: 5044
/opt/total/logstash-8.6.2/vendor/bundle/jruby/2.6.0/gems/manticore-0.9.1-java/lib/manticore/client.rb:284: warning: already initialized constant Manticore::Client::HttpPost
/opt/total/logstash-8.6.2/vendor/bundle/jruby/2.6.0/gems/manticore-0.9.1-java/lib/manticore/client.rb:284: warning: already initialized constant Manticore::Client::HttpPost
/opt/total/logstash-8.6.2/vendor/bundle/jruby/2.6.0/gems/manticore-0.9.1-java/lib/manticore/client.rb:536: warning: already initialized constant Manticore::Client::StringEntity
/opt/total/logstash-8.6.2/vendor/bundle/jruby/2.6.0/gems/manticore-0.9.1-java/lib/manticore/client.rb:536: warning: already initialized constant Manticore::Client::StringEntity
/opt/total/logstash-8.6.2/vendor/bundle/jruby/2.6.0/gems/manticore-0.9.1-java/lib/manticore/client.rb:284: warning: already initialized constant Manticore::Client::HttpPost
[2024-06-20T19:01:43,926][INFO ][logstash.outputs.file ][main][085e2f0de34ccc07275ef5b65528a683dac48e8f499fcb3f4de3b8375b51fb24] Opening file {:path=>"/tmp/tbstat.log"}
The redacted pipeline configuration (mergedlogstash_uat.conf, per the `pipeline.sources` line in the log above — not logstash.yml) is as below:
input {
  beats {
    type => beats
    port => 5044
    client_inactivity_timeout => 0
  }
}
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
if [type] == "proxy_statslog" {
  mutate {
    split => ["message", "~|~"]
    add_field => {
      "createdTime" => "%{[message][0]}"
      "type"        => "%{[message][1]}"
      "typeValue"   => "%{[message][2]}"
      "message"     => "%{[message][3]}"
    }
  }
}
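One thing worth noting about the `add_field` block above: `mutate`'s `add_field` does not overwrite a field that already exists — it appends, so `message` and `type` end up as arrays rather than replaced values. A possible restructuring (a sketch only; the `[@metadata][type]` name is chosen here to match the `document_type => "%{[@metadata][type]}"` reference in the elasticsearch output) keeps the split parts in new fields and uses `replace` for the field that already exists:

```
if [type] == "proxy_statslog" {
  mutate {
    split => ["message", "~|~"]
  }
  mutate {
    add_field => {
      "createdTime"       => "%{[message][0]}"
      "[@metadata][type]" => "%{[message][1]}"
      "typeValue"         => "%{[message][2]}"
    }
  }
  mutate {
    # replace overwrites the existing field; add_field on an existing
    # field would append and produce an array instead
    replace => { "message" => "%{[message][3]}" }
  }
}
```

Splitting the work across several `mutate` blocks also makes the order of operations explicit, since operations inside a single `mutate` do not necessarily run in the order written.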
mutate {
  convert => {
    "type"      => "string"
    "typeValue" => "string"
  }
}
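If this `convert` block really sits outside the `if [type] == "proxy_statslog"` conditional in the full file (the redacted sections make that hard to tell), it runs for every event, including events where `type`/`typeValue` were never created; `convert` skips missing fields, but moving it inside the conditional would make the intent explicit. A sketch:

```
if [type] == "proxy_statslog" {
  mutate {
    convert => {
      "type"      => "string"
      "typeValue" => "string"
    }
  }
}
```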
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
- - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - - -
if [type] == "proxy_statslog" {
  file {
    path => "/tmp/zztop.log"
    codec => rubydebug
  }
  elasticsearch {
    hosts => [
      "https://xxx.xx.xx.xx:43044",
      "https://xxx.xx.xx.xx:43044",
      "https://xxx.xx.xx.xx:43044",
      "https://xxx.xx.xx.xx:43044"
    ]
    ssl => true
    cacert => "/opt/total/elasticsearch/config/certs/http_ca.crt"
    user => "elastic"
    password => "xxxxxxxxxxxxxxxxxxxxxx"
    index => "proxy_statslog-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
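Since the log shows an 8.7.0 cluster ("Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type"), the `document_type` setting is effectively ignored and only triggers the deprecation warning at startup. One possible cleanup (a sketch, assuming no consumer still depends on `_type`) is simply to drop it; on ES 8.x every document is indexed as `_doc`, and the type value is still recoverable from the event's own fields:

```
if [type] == "proxy_statslog" {
  elasticsearch {
    hosts    => [ "https://xxx.xx.xx.xx:43044" ]
    ssl      => true
    cacert   => "/opt/total/elasticsearch/config/certs/http_ca.crt"
    user     => "elastic"
    password => "xxxxxxxxxxxxxxxxxxxxxx"
    index    => "proxy_statslog-%{+YYYY.MM.dd}"
    # document_type removed: on ES 8.x every document is _doc and the
    # plugin ignores this setting (see the startup warning in the log)
  }
}
```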
Thanks