The log from my Logstash server:
[2024-06-24T08:06:35,194][INFO ][logstash.runner ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2024-06-24T08:06:35,199][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.11.3", "jruby.version"=>"jruby 9.4.5.0 (3.1.4) 2023-11-02 1abae2700f OpenJDK 64-Bit Server VM 17.0.9+9 on 17.0.9+9 +indy +jit [x86_64-linux]"}
[2024-06-24T08:06:35,201][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2024-06-24T08:06:35,998][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>---, :ssl_enabled=>false}
[2024-06-24T08:06:36,279][INFO ][org.reflections.Reflections] Reflections took 129 ms to scan 1 urls, producing 131 keys and 463 values
[2024-06-24T08:06:36,545][INFO ][logstash.codecs.json ] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2024-06-24T08:06:36,572][INFO ][logstash.javapipeline ] Pipeline `snmp_pipeline` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-06-24T08:06:36,632][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][snmp_pipeline] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been created for key: send_to. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2024-06-24T08:06:36,674][INFO ][logstash.javapipeline ][snmp_pipeline] Starting pipeline {:pipeline_id=>"snmp_pipeline", "pipeline.workers"=>2, "pipeline.batch.size"=>100, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>200, "pipeline.sources"=>["/etc/logstash/conf.d/snmp_pipeline.conf"], :thread=>"#<Thread:0x5f3cae00 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-06-24T08:06:36,690][INFO ][logstash.javapipeline ] Pipeline `rest_pipeline` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-06-24T08:06:36,700][INFO ][logstash.javapipeline ] Pipeline `metrics_pipeline` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-06-24T08:06:36,707][INFO ][logstash.javapipeline ] Pipeline `syslog_pipeline` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-06-24T08:06:36,707][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][rest_pipeline] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been created for key: send_to. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2024-06-24T08:06:36,720][INFO ][logstash.javapipeline ][rest_pipeline] Starting pipeline {:pipeline_id=>"rest_pipeline", "pipeline.workers"=>2, "pipeline.batch.size"=>100, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>200, "pipeline.sources"=>["/etc/logstash/conf.d/rest_pipeline.conf"], :thread=>"#<Thread:0x1956fa00 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-06-24T08:06:36,796][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][syslog_pipeline] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been created for key: send_to. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2024-06-24T08:06:36,814][INFO ][logstash.javapipeline ][syslog_pipeline] Starting pipeline {:pipeline_id=>"syslog_pipeline", "pipeline.workers"=>2, "pipeline.batch.size"=>100, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>200, "pipeline.sources"=>["/etc/logstash/conf.d/syslog_pipeline.conf"], :thread=>"#<Thread:0x3e609d4e /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-06-24T08:06:36,822][WARN ][org.logstash.instrument.metrics.gauge.LazyDelegatingGauge][metrics_pipeline] A gauge metric of an unknown type (org.jruby.specialized.RubyArrayOneObject) has been created for key: send_to. This may result in invalid serialization. It is recommended to log an issue to the responsible developer/development team.
[2024-06-24T08:06:36,827][INFO ][logstash.javapipeline ][metrics_pipeline] Starting pipeline {:pipeline_id=>"metrics_pipeline", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/conf.d/metrics_pipeline.conf"], :thread=>"#<Thread:0xda60d74 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-06-24T08:06:36,902][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "ssl" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Set 'ssl_enabled' instead. If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"ssl", :plugin=><LogStash::Outputs::ElasticSearch password=><password>, id=>"---", user=>"---", ssl=>false, hosts=>[---], data_stream=>"true", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"---", enable_metric=>true, charset=>"UTF-8">, workers=>1, ssl_certificate_verification=>true, ssl_verification_mode=>"full", sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>true, compression_level=>1, retry_initial_interval=>2, retry_max_interval=>64, dlq_on_failed_indexname_interpolation=>true, data_stream_type=>"logs", data_stream_dataset=>"generic", data_stream_namespace=>"default", data_stream_sync_fields=>true, data_stream_auto_routing=>true, manage_template=>true, template_overwrite=>false, template_api=>"auto", doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy">}
[2024-06-24T08:06:36,919][INFO ][logstash.javapipeline ] Pipeline `output_pipeline` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-06-24T08:06:36,948][INFO ][logstash.outputs.elasticsearch][output_pipeline] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["---"]}
[2024-06-24T08:06:37,055][INFO ][logstash.outputs.elasticsearch][output_pipeline] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[---]}}
[2024-06-24T08:06:37,237][WARN ][logstash.outputs.elasticsearch][output_pipeline] Restored connection to ES instance {:url=>"---"}
[2024-06-24T08:06:37,237][INFO ][logstash.outputs.elasticsearch][output_pipeline] Elasticsearch version determined (8.11.3) {:es_version=>8}
[2024-06-24T08:06:37,238][WARN ][logstash.outputs.elasticsearch][output_pipeline] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2024-06-24T08:06:37,251][INFO ][logstash.outputs.elasticsearch][output_pipeline] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"---"}
[2024-06-24T08:06:37,252][INFO ][logstash.outputs.elasticsearch][output_pipeline] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2024-06-24T08:06:37,260][INFO ][logstash.outputs.elasticsearch][output_pipeline] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["---"]}
[2024-06-24T08:06:37,264][INFO ][logstash.outputs.elasticsearch][output_pipeline] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[---]}}
[2024-06-24T08:06:37,276][INFO ][logstash.outputs.elasticsearch][output_pipeline] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2024-06-24T08:06:37,301][WARN ][logstash.outputs.elasticsearch][output_pipeline] Restored connection to ES instance {:url=>"---"}
[2024-06-24T08:06:37,302][INFO ][logstash.outputs.elasticsearch][output_pipeline] Elasticsearch version determined (8.11.3) {:es_version=>8}
[2024-06-24T08:06:37,302][WARN ][logstash.outputs.elasticsearch][output_pipeline] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2024-06-24T08:06:37,395][INFO ][logstash.javapipeline ][output_pipeline] Starting pipeline {:pipeline_id=>"output_pipeline", "pipeline.workers"=>5, "pipeline.batch.size"=>200, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/etc/logstash/conf.d/output_pipeline.conf"], :thread=>"#<Thread:0x1cdaae41 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-06-24T08:06:37,458][INFO ][logstash.javapipeline ][metrics_pipeline] Pipeline Java execution initialization time {"seconds"=>0.63}
[2024-06-24T08:06:37,459][INFO ][logstash.javapipeline ][rest_pipeline] Pipeline Java execution initialization time {"seconds"=>0.74}
[2024-06-24T08:06:37,460][INFO ][logstash.javapipeline ][syslog_pipeline] Pipeline Java execution initialization time {"seconds"=>0.65}
[2024-06-24T08:06:37,461][INFO ][logstash.javapipeline ][snmp_pipeline] Pipeline Java execution initialization time {"seconds"=>0.79}
[2024-06-24T08:06:37,509][WARN ][logstash.filters.grok ][syslog_pipeline] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
[2024-06-24T08:06:37,536][INFO ][logstash.inputs.snmptrap ][snmp_pipeline] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2024-06-24T08:06:37,548][INFO ][logstash.javapipeline ][snmp_pipeline] Pipeline started {"pipeline.id"=>"snmp_pipeline"}
[2024-06-24T08:06:37,569][INFO ][logstash.codecs.json ][rest_pipeline] ECS compatibility is enabled but `target` option was not specified. This may cause fields to be set at the top-level of the event where they are likely to clash with the Elastic Common Schema. It is recommended to set the `target` option to avoid potential schema conflicts (if your data is ECS compliant or non-conflicting, feel free to ignore this message)
[2024-06-24T08:06:37,569][INFO ][logstash.inputs.snmptrap ][snmp_pipeline][logstash-snmp] It's a Trap! {:Port=>---, :Community=>["public"], :Host=>"---"}
[2024-06-24T08:06:37,576][INFO ][logstash.inputs.beats ][metrics_pipeline] Starting input listener {:address=>"---"}
[2024-06-24T08:06:37,676][INFO ][logstash.javapipeline ][syslog_pipeline] Pipeline started {"pipeline.id"=>"syslog_pipeline"}
[2024-06-24T08:06:37,743][INFO ][logstash.inputs.syslog ][syslog_pipeline][logstash-syslog] Starting syslog udp listener {:address=>"---"}
[2024-06-24T08:06:37,748][INFO ][logstash.inputs.syslog ][syslog_pipeline][logstash-syslog] Starting syslog tcp listener {:address=>"---"}
[2024-06-24T08:06:37,809][INFO ][logstash.javapipeline ][output_pipeline] Pipeline Java execution initialization time {"seconds"=>0.41}
[2024-06-24T08:06:37,840][INFO ][logstash.javapipeline ][output_pipeline] Pipeline started {"pipeline.id"=>"output_pipeline"}
[2024-06-24T08:06:37,847][INFO ][logstash.inputs.http ][rest_pipeline][---] Starting http input listener {:address=>"---", :ssl=>"false"}
[2024-06-24T08:06:37,847][INFO ][logstash.javapipeline ][rest_pipeline] Pipeline started {"pipeline.id"=>"rest_pipeline"}
[2024-06-24T08:06:38,031][INFO ][logstash.javapipeline ][metrics_pipeline] Pipeline started {"pipeline.id"=>"metrics_pipeline"}
[2024-06-24T08:06:38,034][INFO ][org.logstash.beats.Server][metrics_pipeline][---] Starting server on port: ---
[2024-06-24T08:06:38,047][INFO ][logstash.agent ] Pipelines running {:count=>5, :running_pipelines=>[:snmp_pipeline, :rest_pipeline, :metrics_pipeline, :syslog_pipeline, :output_pipeline], :non_running_pipelines=>[]}
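Two things stand out in the startup log before the packet trace: the repeated ECS warnings about a missing `target` option (on the json codec in `rest_pipeline` and on the snmptrap input in `snmp_pipeline`), and the deprecated `ssl` setting on the Elasticsearch output. Neither should stop events from flowing, but for reference this is roughly the shape of config those warnings point at; all hosts, index names and values below are placeholders, since the real ones are redacted in the log, and the json codec is assumed to sit on the http input only because both appear in `rest_pipeline`.

# rest_pipeline.conf (sketch, placeholder values)
input {
  http {
    # ...existing listener options...
    codec => json { target => "[document]" }   # an explicit target silences the ECS warning
  }
}

# output_pipeline.conf (sketch, placeholder values)
output {
  elasticsearch {
    hosts       => ["http://es.example:9200"]  # placeholder, real hosts are redacted
    index       => "my-index-%{+YYYY.MM.dd}"   # placeholder; an explicit index disables data streams
    ssl_enabled => false                       # replaces the deprecated `ssl` option
  }
}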
The tcpdump capture on port 1062 while the SNMP traps were being sent:
timestamp IP myIP > logstashIP: UDP, length 85
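So the trace shows a single UDP datagram going from my host towards the Logstash host on port 1062, which happens to be the snmptrap input's default port (the actual host and port in the "It's a Trap!" line above are redacted). For reference, a snmptrap input listening on that port would look roughly like this; the listen address and target are placeholders, not my actual values:

# snmp_pipeline.conf (sketch, placeholder values)
input {
  snmptrap {
    host      => "0.0.0.0"      # listen address (placeholder)
    port      => 1062           # the port the traps were captured on with tcpdump
    community => ["public"]
    target    => "[snmp]"       # an explicit target silences the ECS warning
  }
}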