Thank you for your response.
Yes, I always used `chown -R logstash:logstash`. I didn't realize running as root would turn into such an issue. I had already tried deleting the sincedb and changing ownership to logstash before I made the post, without success.
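For reference, this is roughly what I ran; the directory choices are my best guess at what matters here, and the sincedb path is the one Logstash generated for me (see the log below):

```
# Hand ownership of the data and log directories back to the logstash user
sudo chown -R logstash:logstash /var/lib/logstash /var/log/logstash

# Delete the auto-generated sincedb so the file input re-reads the files
sudo rm /var/lib/logstash/plugins/inputs/file/.sincedb_*
```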
I've tried to remedy the situation by uninstalling Logstash, `rm -rf`-ing all four directories, and then reinstalling Logstash.
Since then I've only edited or created files in /etc/logstash and then started it using systemd, only to get the same result: no errors in Logstash, but also no events in Elasticsearch.
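In case the exact steps matter, this is essentially how I start it and what I check afterwards; the curl call uses Logstash's node stats API on port 9600, which the log below shows is enabled:

```
# Start Logstash via systemd and check that it stays up
sudo systemctl start logstash
sudo systemctl status logstash

# Follow the journal for startup errors
sudo journalctl -u logstash -f

# Ask the Logstash API whether the pipeline is actually processing events
curl -s 'localhost:9600/_node/stats/pipelines?pretty'
```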
I'll post the Logstash log:
[2024-11-08T14:45:50,868][INFO ][logstash.runner ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2024-11-08T14:45:50,872][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"8.15.3", "jruby.version"=>"jruby 9.4.8.0 (3.1.4) 2024-07-02 4d41e55a67 OpenJDK 64-Bit Server VM 21.0.4+7-LTS on 21.0.4+7-LTS +indy +jit [x86_64-linux]"}
[2024-11-08T14:45:50,874][INFO ][logstash.runner ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-11-08T14:45:50,876][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-11-08T14:45:50,876][INFO ][logstash.runner ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-11-08T14:45:51,395][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-11-08T14:45:51,829][INFO ][org.reflections.Reflections] Reflections took 77 ms to scan 1 urls, producing 138 keys and 481 values
[2024-11-08T14:45:52,076][INFO ][logstash.javapipeline ] Pipeline `dns` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-11-08T14:45:52,086][INFO ][logstash.outputs.elasticsearch][dns] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://host1:9200", "https://host2:9200", "https://host3:9200"]}
[2024-11-08T14:45:52,203][INFO ][logstash.outputs.elasticsearch][dns] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://host1:9200/, https://host2:9200/, https://host3:9200/]}}
[2024-11-08T14:45:52,385][WARN ][logstash.outputs.elasticsearch][dns] Restored connection to ES instance {:url=>"https://host1:9200/"}
[2024-11-08T14:45:52,385][INFO ][logstash.outputs.elasticsearch][dns] Elasticsearch version determined (8.15.0) {:es_version=>8}
[2024-11-08T14:45:52,386][WARN ][logstash.outputs.elasticsearch][dns] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2024-11-08T14:45:52,427][WARN ][logstash.outputs.elasticsearch][dns] Restored connection to ES instance {:url=>"https://host2:9200/"}
[2024-11-08T14:45:52,470][WARN ][logstash.outputs.elasticsearch][dns] Restored connection to ES instance {:url=>"https://host3:9200/"}
[2024-11-08T14:45:52,478][INFO ][logstash.outputs.elasticsearch][dns] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>["logs-bind9.dns-default"]}
[2024-11-08T14:45:52,479][INFO ][logstash.outputs.elasticsearch][dns] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2024-11-08T14:45:52,480][WARN ][logstash.filters.grok ][dns] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
[2024-11-08T14:45:52,485][INFO ][logstash.outputs.elasticsearch][dns] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2024-11-08T14:45:52,582][INFO ][logstash.javapipeline ][dns] Starting pipeline {:pipeline_id=>"dns", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/conf.d/dns.conf"], :thread=>"#<Thread:0x13581ca2 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-11-08T14:45:53,043][INFO ][logstash.javapipeline ][dns] Pipeline Java execution initialization time {"seconds"=>0.46}
[2024-11-08T14:45:53,053][INFO ][logstash.inputs.file ][dns] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2", :path=>["/var/log/named/*.log"]}
[2024-11-08T14:45:53,055][INFO ][logstash.javapipeline ][dns] Pipeline started {"pipeline.id"=>"dns"}
[2024-11-08T14:45:53,057][INFO ][filewatch.observingtail ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] START, creating Discoverer, Watch with file and sincedb collections
[2024-11-08T14:45:53,067][INFO ][logstash.agent ] Pipelines running {:count=>1, :running_pipelines=>[:dns], :non_running_pipelines=>[]}
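For completeness, here's the shape of my /etc/logstash/conf.d/dns.conf, sketched from what the log above reports (the path, hosts, and index are exactly as logged; the grok pattern and credentials are omitted):

```
input {
  file {
    path => "/var/log/named/*.log"
    # no sincedb_path set, so Logstash generates one under /var/lib/logstash
  }
}

filter {
  grok {
    match => { "message" => "..." }   # actual pattern omitted here
  }
}

output {
  elasticsearch {
    hosts => ["https://host1:9200", "https://host2:9200", "https://host3:9200"]
    index => "logs-bind9.dns-default"
    # ssl/user/password settings omitted
  }
}
```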