@leandrojmp
This is my pipelines.yml (/etc/logstash/pipelines.yml):
# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
# https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html
- pipeline.id: pipeline1
  path.config: "/etc/logstash/conf.d/pipeline1.conf"
#- pipeline.id: main
#  path.config: "/etc/logstash/conf.d/*.conf"
And this is my pipeline config (/etc/logstash/conf.d/pipeline1.conf):
input {
  beats {
    port => 5070
  }
}

filter {
  grok {
    patterns_dir => ["/etc/logstash/conf.d/testeod"]
    match => { "message" => "%{custom_exception:dmerror}" }
  }
}

output {
  stdout {}
  elasticsearch {
    index => "eodtest"
    hosts => ["https://10.20.30.29:9200"]
    cacert => '/etc/logstash/certs/http_ca.crt'
    user => "elastic"
    password => "P@ssw0rd"
  }
}
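For reference, the grok filter loads custom patterns from the patterns_dir above. A patterns file in that directory is plain text, one `NAME regex` definition per line — the regex below is only a hypothetical placeholder to show the format, not my real custom_exception pattern:

```
# /etc/logstash/conf.d/testeod/patterns  (hypothetical example)
# Format: PATTERN_NAME<space>regex
custom_exception (?:[A-Za-z0-9_.]+Exception\b.*)
```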
Also, this is my logstash-plain.log:
[2023-11-15T12:03:20,590][INFO ][logstash.runner ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2023-11-15T12:03:20,593][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.10.4", "jruby.version"=>"jruby 9.4.2.0 (3.1.0) 2023-03-08 90d2913fda OpenJDK 64-Bit Server VM 17.0.8+7 on 17.0.8+7 +indy +jit [x86_64-linux]"}
[2023-11-15T12:03:20,594][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2023-11-15T12:03:20,816][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2023-11-15T12:03:21,593][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2023-11-15T12:03:21,858][INFO ][org.reflections.Reflections] Reflections took 159 ms to scan 1 urls, producing 132 keys and 464 values
[2023-11-15T12:03:22,718][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "cacert" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Set 'ssl_certificate_authorities' instead. If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"cacert", :plugin=><LogStash::Outputs::ElasticSearch index=>"eodtest", password=><password>, id=>"1d705bebc465c6f2954dfc0731576a2b1c889f4746ae5d1dec61206e4d3ddf62", user=>"elastic", hosts=>[https://10.20.30.29:9200], cacert=>"/etc/logstash/certs/http_ca.crt", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_9f5d7b97-4451-448c-a504-a6a19b08fe2b", enable_metric=>true, charset=>"UTF-8">, workers=>1, ssl_certificate_verification=>true, ssl_verification_mode=>"full", sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false, retry_initial_interval=>2, retry_max_interval=>64, dlq_on_failed_indexname_interpolation=>true, data_stream_type=>"logs", data_stream_dataset=>"generic", data_stream_namespace=>"default", data_stream_sync_fields=>true, data_stream_auto_routing=>true, manage_template=>true, template_overwrite=>false, template_api=>"auto", doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy">}
[2023-11-15T12:03:22,733][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2023-11-15T12:03:22,743][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://10.20.30.29:9200"]}
[2023-11-15T12:03:22,805][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@10.20.30.29:9200/]}}
[2023-11-15T12:03:22,959][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@10.20.30.29:9200/"}
[2023-11-15T12:03:22,963][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.10.4) {:es_version=>8}
[2023-11-15T12:03:22,964][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2023-11-15T12:03:22,970][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"eodtest"}
[2023-11-15T12:03:22,970][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2023-11-15T12:03:22,971][WARN ][logstash.filters.grok ][main] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
[2023-11-15T12:03:22,982][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2023-11-15T12:03:23,023][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/etc/logstash/conf.d/pipeline1.conf"], :thread=>"#<Thread:0x532b3ff5 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2023-11-15T12:03:23,680][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.66}
[2023-11-15T12:03:23,683][INFO ][logstash.inputs.beats ][main] Starting input listener {:address=>"0.0.0.0:5070"}
[2023-11-15T12:03:23,720][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2023-11-15T12:03:23,736][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2023-11-15T12:03:23,785][INFO ][org.logstash.beats.Server][main][355b5961c5bd22351a4cf5494bc9b254124e7179d0d5df19c80c7a9c27b9bd90] Starting server on port: 5070
[2023-11-15T15:31:20,244][WARN ][logstash.runner ] SIGINT received. Shutting down.
[2023-11-15T15:31:23,429][INFO ][logstash.javapipeline ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2023-11-15T15:31:23,474][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
[2023-11-15T15:31:23,485][INFO ][logstash.runner ] Logstash shut down.
[2023-11-15T17:38:44,924][INFO ][logstash.runner ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2023-11-15T17:38:44,925][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.10.4", "jruby.version"=>"jruby 9.4.2.0 (3.1.0) 2023-03-08 90d2913fda OpenJDK 64-Bit Server VM 17.0.8+7 on 17.0.8+7 +indy +jit [x86_64-linux]"}
[2023-11-15T17:38:44,926][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx5g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED]
[2023-11-15T17:38:45,044][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2023-11-15T17:38:45,353][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2023-11-15T17:38:45,596][INFO ][org.reflections.Reflections] Reflections took 112 ms to scan 1 urls, producing 132 keys and 464 values
[2023-11-15T17:38:45,895][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "cacert" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Set 'ssl_certificate_authorities' instead. If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"cacert", :plugin=><LogStash::Outputs::ElasticSearch index=>"eodtest", password=><password>, id=>"1d705bebc465c6f2954dfc0731576a2b1c889f4746ae5d1dec61206e4d3ddf62", user=>"elastic", hosts=>[https://10.20.30.29:9200], cacert=>"/etc/logstash/certs/http_ca.crt", enable_metric=>true, codec=><LogStash::Codecs::Plain id=>"plain_cde43c7b-c4fe-47a3-ab8b-14dec1be9c95", enable_metric=>true, charset=>"UTF-8">, workers=>1, ssl_certificate_verification=>true, ssl_verification_mode=>"full", sniffing=>false, sniffing_delay=>5, timeout=>60, pool_max=>1000, pool_max_per_route=>100, resurrect_delay=>5, validate_after_inactivity=>10000, http_compression=>false, retry_initial_interval=>2, retry_max_interval=>64, dlq_on_failed_indexname_interpolation=>true, data_stream_type=>"logs", data_stream_dataset=>"generic", data_stream_namespace=>"default", data_stream_sync_fields=>true, data_stream_auto_routing=>true, manage_template=>true, template_overwrite=>false, template_api=>"auto", doc_as_upsert=>false, script_type=>"inline", script_lang=>"painless", script_var_name=>"event", scripted_upsert=>false, retry_on_conflict=>1, ilm_enabled=>"auto", ilm_pattern=>"{now/d}-000001", ilm_policy=>"logstash-policy">}
[2023-11-15T17:38:45,910][INFO ][logstash.javapipeline ] Pipeline `main` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2023-11-15T17:38:45,922][INFO ][logstash.outputs.elasticsearch][main] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://10.20.30.29:9200"]}
[2023-11-15T17:38:45,987][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://elastic:xxxxxx@10.20.30.29:9200/]}}
[2023-11-15T17:38:46,140][WARN ][logstash.outputs.elasticsearch][main] Restored connection to ES instance {:url=>"https://elastic:xxxxxx@10.20.30.29:9200/"}
[2023-11-15T17:38:46,144][INFO ][logstash.outputs.elasticsearch][main] Elasticsearch version determined (8.10.4) {:es_version=>8}
[2023-11-15T17:38:46,144][WARN ][logstash.outputs.elasticsearch][main] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2023-11-15T17:38:46,152][INFO ][logstash.outputs.elasticsearch][main] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>"eodtest"}
[2023-11-15T17:38:46,152][INFO ][logstash.outputs.elasticsearch][main] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2023-11-15T17:38:46,153][WARN ][logstash.filters.grok ][main] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
[2023-11-15T17:38:46,164][INFO ][logstash.outputs.elasticsearch][main] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2023-11-15T17:38:46,197][INFO ][logstash.javapipeline ][main] Starting pipeline {:pipeline_id=>"main", "pipeline.workers"=>8, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>1000, "pipeline.sources"=>["/etc/logstash/conf.d/pipeline1.conf"], :thread=>"#<Thread:0x51aeeb4c /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2023-11-15T17:38:46,631][INFO ][logstash.javapipeline ][main] Pipeline Java execution initialization time {"seconds"=>0.43}
[2023-11-15T17:38:46,634][INFO ][logstash.inputs.beats ][main] Starting input listener {:address=>"0.0.0.0:5070"}
[2023-11-15T17:38:46,638][INFO ][logstash.javapipeline ][main] Pipeline started {"pipeline.id"=>"main"}
[2023-11-15T17:38:46,652][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:main], :non_running_pipelines=>[]}
[2023-11-15T17:38:46,682][INFO ][org.logstash.beats.Server][main][355b5961c5bd22351a4cf5494bc9b254124e7179d0d5df19c80c7a9c27b9bd90] Starting server on port: 5070
[2023-11-15T17:41:42,084][WARN ][logstash.runner ] SIGINT received. Shutting down.
[2023-11-15T17:41:47,087][WARN ][logstash.runner ] Received shutdown signal, but pipeline is still waiting for in-flight events
to be processed. Sending another ^C will force quit Logstash, but this may cause
data loss.
[2023-11-15T17:41:49,268][INFO ][logstash.javapipeline ][main] Pipeline terminated {"pipeline.id"=>"main"}
[2023-11-15T17:41:50,236][INFO ][logstash.pipelinesregistry] Removed pipeline from registry successfully {:pipeline_id=>:main}
[2023-11-15T17:41:50,240][INFO ][logstash.runner ] Logstash shut down.
And this is my Logstash service status:
[root@Lstash ~]# systemctl status logstash.service
● logstash.service - logstash
Loaded: loaded (/usr/lib/systemd/system/logstash.service; enabled; vendor preset: disabled)
Active: active (running) since Wed 2023-11-15 18:01:49 +0330; 6s ago
Main PID: 157499 (java)
Tasks: 31 (limit: 100904)
Memory: 610.9M
CGroup: /system.slice/logstash.service
└─157499 /usr/share/logstash/jdk/bin/java -Xms1g -Xmx5g -Djava.awt.headless=true -Dfile.encoding=UTF-8 -Djruby.compile.invokedynamic=true -XX:+HeapDumpOnOutOfMemoryError -Djava.security.egd=file:/dev/urandom -Dlog4j2.isThread>
Nov 15 18:01:49 Lstash systemd[1]: Started logstash.
Nov 15 18:01:49 Lstash logstash[157499]: Using bundled JDK: /usr/share/logstash/jdk
I think my logstash-plain.log file is not being updated by the service: systemd shows Logstash running since 18:01:49, but the last entry in the log is the shutdown at 17:41:50 from my earlier manual run.
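One thing I noticed in the log is the WARN "Ignoring the 'pipelines.yml' file because modules or command line options are specified", so the service may be passing explicit options on the command line. If it helps, these are the checks I can run — assuming the RPM-package default paths shown in my systemctl output:

```shell
# Show the exact command line the service runs (ExecStart):
systemctl cat logstash.service | grep -i ExecStart
# Check where Logstash is configured to write its logs:
grep -E '^\s*path\.logs' /etc/logstash/logstash.yml
# Watch the file the running service should be appending to:
tail -f /var/log/logstash/logstash-plain.log
```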