Permission problem when starting logstash with systemd on suse

Hello all,

I'm having trouble getting a Logstash instance to send events to Elasticsearch. I've troubleshot this quite far and narrowed it down to a permissions-related problem, but let me describe the setup first:

I installed Logstash 8.15.3 from the RPM package with zypper on a SUSE host. I set up a pipeline in conf.d and referenced it in pipelines.yml. When I start Logstash (as root) from the command line, using the same command as the ExecStart of the systemd unit (/usr/share/logstash/bin/logstash --path.settings /etc/logstash), the pipeline runs and Elasticsearch receives events. When I start it with systemd, the pipeline also runs, but Elasticsearch gets no events.

It must have something to do with the permissions of the logstash user/group, because if I replace the logstash user with root in the logstash.service file, it works.

I've chowned all relevant directories to logstash:logstash:
/usr/share/logstash
/var/lib/logstash
/etc/logstash
/var/log/logstash

The logstash user/group also has read access to the logfile Logstash is supposed to read, and no permission errors appear in the logs.

Even when running with --log.level debug I get no warnings or errors: Logstash connects to Elasticsearch and runs the pipeline, but does not transfer events. I confirmed this with a tcpdump on the Elasticsearch node it connects to. Running curl localhost:9600/_node/pipelines?pretty shows the pipeline running.

Checking /proc/[PID]/fd does not show any files in other directories being used by Logstash.
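For anyone following along, the open-file check above can be scripted; the service name "logstash" and the use of systemctl to find the PID are assumptions for a standard RPM install:

```shell
# List the files the running Logstash service has open,
# filtering out sockets and pipes.
pid=$(systemctl show -p MainPID --value logstash)
ls -l "/proc/$pid/fd" 2>/dev/null | grep -v -e socket: -e pipe:
```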

I'll paste my configs as reply to this post.

Thank you in advance!

EDIT: SELinux is not running. AppArmor is, but even if I turn AppArmor off I get the same behaviour.

logstash.yml

path.data: /var/lib/logstash
path.logs: /var/log/logstash
#log.level: debug

pipelines.yml

# This file is where you define your pipelines. You can define multiple.
# For more information on multiple pipelines, see the documentation:
#   https://www.elastic.co/guide/en/logstash/current/multiple-pipelines.html

#- pipeline.id: main
#  path.config: "/etc/logstash/conf.d/*.conf"

- pipeline.id: dns
  path.config: "/etc/logstash/conf.d/dns.conf"

dns.conf

input {
  file {
    path => ["/var/log/named/*.log"]
  }
}
filter {
  grok {
    patterns_dir => ["/etc/logstash/custom_patterns"]
    match => {
      "message" => [
         "%{BIND9_DATE:EventTime} queries: %{LOGLEVEL:LogLevel}: client %{DATA:ClientID} %{IP:ClientIP}[#]%{NUMBER:ClientPort} \(%{DNS:Hostname}\)[:] query: %{DNS:QueryName} %{WORD:QueryClass} %{WORD:QueryType} %{DATA:QueryFlags} \(%{IP:LocalAddress}\)",
         "%{BIND9_DATE:EventTime} query-errors: %{LOGLEVEL:LogLevel}: client %{DATA:ClientID} %{IP:ClientIP}[#]%{NUMBER:ClientPort} \(%{DNS:Hostname}\)[:] query failed \(%{WORD:Reason}\) %{GREEDYDATA:ErrorMsg}"
       ]
    }
  }
  date {
    match => ["EventTime", "dd-MMM-yyyy HH:mm:ss.SSS"]
    timezone => "Europe/Berlin"
  }
  mutate {
    remove_field => ["message", "Hostname", "log", "EventTime", "ClientID"]
    rename => {"LocalAddress" => "[host][ip]"}
    rename => {"ClientIP" => "[source][ip]"}
    rename => {"ClientPort" => "[source][port]"}
    rename => {"QueryClass" => "[dns][question][class]"}
    rename => {"QueryType" => "[dns][question][type]"}
    rename => {"QueryName" => "[dns][question][name]"}
    rename => {"QueryFlags" => "[dns][header_flags]"}
    rename => {"LogLevel" => "[event][type]"}

  }
}
output {
  elasticsearch {
    hosts => ["ES-host1"]
    index => ["logs-bind9.dns-default"]
    action => "create"
    ca_trusted_fingerprint => "b6cedf964fe0c3c39178a2cf7de2541b406674fc80d810c764232cc024b4b4a9"
    api_key => "[API Key]"
  }
}

logstash.service

[Unit]
Description=logstash

[Service]
Type=simple
User=logstash
Group=logstash
# Load env vars from /etc/default/ and /etc/sysconfig/ if they exist.
# Prefixing the path with '-' makes it try to load, but if the file doesn't
# exist, it continues onward.
EnvironmentFile=-/etc/default/logstash
EnvironmentFile=-/etc/sysconfig/logstash
ExecStart=/usr/share/logstash/bin/logstash "--path.settings" "/etc/logstash"
Restart=always
WorkingDirectory=/
Nice=19
LimitNOFILE=16384

# When stopping, how long to wait before giving up and sending SIGKILL?
# Keep in mind that SIGKILL on a process can cause data loss.
TimeoutStopSec=infinity

[Install]
WantedBy=multi-user.target

I don't think there's much use in pasting the Logstash log; it shows no warnings or errors.

This is what caused the permissions issue: you should never run Logstash as root. It will create some files with the wrong ownership, and when you later try to run it as a service it will not work.

But it seems you have already fixed that.

Did you change the permissions on everything inside /var/lib/logstash?

Also, since you have a file input: did you remove the sincedb file created when you ran Logstash as root?

Maybe it has already read your files and won't read anything until new lines appear.

Not everything here needs to be owned by the logstash user.

/usr/share/logstash is owned by root, and the logstash user has execute permissions.

/etc/logstash should also be owned by root, with read permissions for the logstash user.

/var/log/logstash should be owned by the logstash user; if I'm not mistaken, the ownership is logstash:root.

/var/lib/logstash is owned by the logstash user, as is everything inside it.

I would recommend that you stop Logstash, remove everything inside /var/lib/logstash, and start it again.

This will remove any leftovers that may exist, as well as the sincedb files created by the first run.
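A sketch of that cleanup, assuming the standard package paths. Note that sincedb files are hidden dotfiles (.sincedb_*), so a plain rm -rf /var/lib/logstash/* would miss them; find -delete removes them too:

```shell
# Stop the service, wipe the state directory (including hidden
# .sincedb_* files), restore ownership, and restart.
systemctl stop logstash
find /var/lib/logstash -mindepth 1 -delete
chown -R logstash:logstash /var/lib/logstash
systemctl start logstash
```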

Thank you for your response.

Yes, I always used chown -R logstash:logstash. I didn't realize running as root would cause such an issue. I had already tried deleting the sincedb and moving the permissions to logstash before making the post, without success.

I tried to remedy the situation by uninstalling Logstash, rm -rf-ing all four directories, and then reinstalling Logstash.
Since then I have only edited or created files in /etc/logstash and then started it using systemd, only to get the same result: no errors in Logstash, but also no events in Elasticsearch.

I'll post the logstash log:

[2024-11-08T14:45:50,868][INFO ][logstash.runner          ] Log4j configuration path used is: /etc/logstash/log4j2.properties
[2024-11-08T14:45:50,872][INFO ][logstash.runner          ] Starting Logstash {"logstash.version"=>"8.15.3", "jruby.version"=>"jruby 9.4.8.0 (3.1.4) 2024-07-02 4d41e55a67 OpenJDK 64-Bit Server VM 21.0.4+7-LTS on 21.0.4+7-LTS +indy +jit [x86_64-linux]"}
[2024-11-08T14:45:50,874][INFO ][logstash.runner          ] JVM bootstrap flags: [-Xms1g, -Xmx1g, -Djava.awt.headless=true, -Dfile.encoding=UTF-8, -Djruby.compile.invokedynamic=true, -XX:+HeapDumpOnOutOfMemoryError, -Djava.security.egd=file:/dev/urandom, -Dlog4j2.isThreadContextMapInheritable=true, -Dlogstash.jackson.stream-read-constraints.max-string-length=200000000, -Dlogstash.jackson.stream-read-constraints.max-number-length=10000, -Djruby.regexp.interruptible=true, -Djdk.io.File.enableADS=true, --add-exports=jdk.compiler/com.sun.tools.javac.api=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.file=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.parser=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.tree=ALL-UNNAMED, --add-exports=jdk.compiler/com.sun.tools.javac.util=ALL-UNNAMED, --add-opens=java.base/java.security=ALL-UNNAMED, --add-opens=java.base/java.io=ALL-UNNAMED, --add-opens=java.base/java.nio.channels=ALL-UNNAMED, --add-opens=java.base/sun.nio.ch=ALL-UNNAMED, --add-opens=java.management/sun.management=ALL-UNNAMED, -Dio.netty.allocator.maxOrder=11]
[2024-11-08T14:45:50,876][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-string-length` configured to `200000000`
[2024-11-08T14:45:50,876][INFO ][logstash.runner          ] Jackson default value override `logstash.jackson.stream-read-constraints.max-number-length` configured to `10000`
[2024-11-08T14:45:51,395][INFO ][logstash.agent           ] Successfully started Logstash API endpoint {:port=>9600, :ssl_enabled=>false}
[2024-11-08T14:45:51,829][INFO ][org.reflections.Reflections] Reflections took 77 ms to scan 1 urls, producing 138 keys and 481 values
[2024-11-08T14:45:52,076][INFO ][logstash.javapipeline    ] Pipeline `dns` is configured with `pipeline.ecs_compatibility: v8` setting. All plugins in this pipeline will default to `ecs_compatibility => v8` unless explicitly configured otherwise.
[2024-11-08T14:45:52,086][INFO ][logstash.outputs.elasticsearch][dns] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["https://host1:9200", "https://host2:9200", "https://host3:9200"]}
[2024-11-08T14:45:52,203][INFO ][logstash.outputs.elasticsearch][dns] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[https://host1:9200/, https://host2:9200/, https://host3:9200/]}}
[2024-11-08T14:45:52,385][WARN ][logstash.outputs.elasticsearch][dns] Restored connection to ES instance {:url=>"https://host1:9200/"}
[2024-11-08T14:45:52,385][INFO ][logstash.outputs.elasticsearch][dns] Elasticsearch version determined (8.15.0) {:es_version=>8}
[2024-11-08T14:45:52,386][WARN ][logstash.outputs.elasticsearch][dns] Detected a 6.x and above cluster: the `type` event field won't be used to determine the document _type {:es_version=>8}
[2024-11-08T14:45:52,427][WARN ][logstash.outputs.elasticsearch][dns] Restored connection to ES instance {:url=>"https://host2:9200/"}
[2024-11-08T14:45:52,470][WARN ][logstash.outputs.elasticsearch][dns] Restored connection to ES instance {:url=>"https://host3:9200/"}
[2024-11-08T14:45:52,478][INFO ][logstash.outputs.elasticsearch][dns] Not eligible for data streams because config contains one or more settings that are not compatible with data streams: {"index"=>["logs-bind9.dns-default"]}
[2024-11-08T14:45:52,479][INFO ][logstash.outputs.elasticsearch][dns] Data streams auto configuration (`data_stream => auto` or unset) resolved to `false`
[2024-11-08T14:45:52,480][WARN ][logstash.filters.grok    ][dns] ECS v8 support is a preview of the unreleased ECS v8, and uses the v1 patterns. When Version 8 of the Elastic Common Schema becomes available, this plugin will need to be updated
[2024-11-08T14:45:52,485][INFO ][logstash.outputs.elasticsearch][dns] Using a default mapping template {:es_version=>8, :ecs_compatibility=>:v8}
[2024-11-08T14:45:52,582][INFO ][logstash.javapipeline    ][dns] Starting pipeline {:pipeline_id=>"dns", "pipeline.workers"=>4, "pipeline.batch.size"=>125, "pipeline.batch.delay"=>50, "pipeline.max_inflight"=>500, "pipeline.sources"=>["/etc/logstash/conf.d/dns.conf"], :thread=>"#<Thread:0x13581ca2 /usr/share/logstash/logstash-core/lib/logstash/java_pipeline.rb:134 run>"}
[2024-11-08T14:45:53,043][INFO ][logstash.javapipeline    ][dns] Pipeline Java execution initialization time {"seconds"=>0.46}
[2024-11-08T14:45:53,053][INFO ][logstash.inputs.file     ][dns] No sincedb_path set, generating one based on the "path" setting {:sincedb_path=>"/var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2", :path=>["/var/log/named/*.log"]}
[2024-11-08T14:45:53,055][INFO ][logstash.javapipeline    ][dns] Pipeline started {"pipeline.id"=>"dns"}
[2024-11-08T14:45:53,057][INFO ][filewatch.observingtail  ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] START, creating Discoverer, Watch with file and sincedb collections
[2024-11-08T14:45:53,067][INFO ][logstash.agent           ] Pipelines running {:count=>1, :running_pipelines=>[:dns], :non_running_pipelines=>[]}

You could try enabling log.level trace and see what filewatch has to say.

Thank you for the suggestion.

I cannot spot anything out of the ordinary in the log:

[2024-11-08T15:41:04,134][INFO ][filewatch.observingtail  ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] START, creating Discoverer, Watch with file and sincedb collections
[2024-11-08T15:41:04,136][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] open: reading from /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2
[2024-11-08T15:41:04,136][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] open: count of keys read: 0
[2024-11-08T15:41:04,137][TRACE][filewatch.discoverer     ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] discover_files {:count=>0}
[2024-11-08T15:41:05,138][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] writing sincedb (delta since last write = 1731076865)
[2024-11-08T15:41:05,141][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2 (time = 2024-11-08 15:41:05 +0100)
[2024-11-08T15:41:05,141][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] non_atomic_write:  {:time=>2024-11-08 15:41:05.138553 +0100}
[2024-11-08T15:41:18,152][TRACE][filewatch.discoverer     ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] discover_files {:count=>0}
[2024-11-08T15:41:20,154][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] writing sincedb (delta since last write = 15)
[2024-11-08T15:41:20,154][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2 (time = 2024-11-08 15:41:20 +0100)
[2024-11-08T15:41:20,154][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] non_atomic_write:  {:time=>2024-11-08 15:41:20.153877 +0100}
[2024-11-08T15:41:33,165][TRACE][filewatch.discoverer     ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] discover_files {:count=>0}
[2024-11-08T15:41:35,167][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] writing sincedb (delta since last write = 15)
[2024-11-08T15:41:35,167][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2 (time = 2024-11-08 15:41:35 +0100)
[2024-11-08T15:41:35,168][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] non_atomic_write:  {:time=>2024-11-08 15:41:35.167246 +0100}
[2024-11-08T15:41:48,176][TRACE][filewatch.discoverer     ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] discover_files {:count=>0}
[2024-11-08T15:41:50,178][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] writing sincedb (delta since last write = 15)
[2024-11-08T15:41:50,178][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2 (time = 2024-11-08 15:41:50 +0100)
[2024-11-08T15:41:50,178][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] non_atomic_write:  {:time=>2024-11-08 15:41:50.178024 +0100}
[2024-11-08T15:42:03,193][TRACE][filewatch.discoverer     ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] discover_files {:count=>0}
[2024-11-08T15:42:05,196][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] writing sincedb (delta since last write = 15)
[2024-11-08T15:42:05,197][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2 (time = 2024-11-08 15:42:05 +0100)
[2024-11-08T15:42:05,197][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] non_atomic_write:  {:time=>2024-11-08 15:42:05.194961 +0100}
kra-mail-dns-1:/etc/logstash # cat /var/log/logstash/logstash-plain.log | grep filewatch
[2024-11-08T14:45:53,057][INFO ][filewatch.observingtail  ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] START, creating Discoverer, Watch with file and sincedb collections
[2024-11-08T15:40:50,970][INFO ][filewatch.observingtail  ] QUIT - closing all files and shutting down.
[2024-11-08T15:41:04,134][INFO ][filewatch.observingtail  ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] START, creating Discoverer, Watch with file and sincedb collections
[2024-11-08T15:41:04,136][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] open: reading from /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2
[2024-11-08T15:41:04,136][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] open: count of keys read: 0
[2024-11-08T15:41:04,137][TRACE][filewatch.discoverer     ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] discover_files {:count=>0}
[2024-11-08T15:41:05,138][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] writing sincedb (delta since last write = 1731076865)
[2024-11-08T15:41:05,141][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2 (time = 2024-11-08 15:41:05 +0100)
[2024-11-08T15:41:05,141][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] non_atomic_write:  {:time=>2024-11-08 15:41:05.138553 +0100}
[2024-11-08T15:41:18,152][TRACE][filewatch.discoverer     ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] discover_files {:count=>0}
[2024-11-08T15:41:20,154][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] writing sincedb (delta since last write = 15)
[2024-11-08T15:41:20,154][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2 (time = 2024-11-08 15:41:20 +0100)
[2024-11-08T15:41:20,154][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] non_atomic_write:  {:time=>2024-11-08 15:41:20.153877 +0100}
[2024-11-08T15:41:33,165][TRACE][filewatch.discoverer     ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] discover_files {:count=>0}
[2024-11-08T15:41:35,167][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] writing sincedb (delta since last write = 15)
[2024-11-08T15:41:35,167][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2 (time = 2024-11-08 15:41:35 +0100)
[2024-11-08T15:41:35,168][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] non_atomic_write:  {:time=>2024-11-08 15:41:35.167246 +0100}
[2024-11-08T15:41:48,176][TRACE][filewatch.discoverer     ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] discover_files {:count=>0}
[2024-11-08T15:41:50,178][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] writing sincedb (delta since last write = 15)
[2024-11-08T15:41:50,178][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2 (time = 2024-11-08 15:41:50 +0100)
[2024-11-08T15:41:50,178][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] non_atomic_write:  {:time=>2024-11-08 15:41:50.178024 +0100}
[2024-11-08T15:42:03,193][TRACE][filewatch.discoverer     ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] discover_files {:count=>0}
[2024-11-08T15:42:05,196][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] writing sincedb (delta since last write = 15)
[2024-11-08T15:42:05,197][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2 (time = 2024-11-08 15:42:05 +0100)
[2024-11-08T15:42:05,197][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] non_atomic_write:  {:time=>2024-11-08 15:42:05.194961 +0100}
[2024-11-08T15:42:18,206][TRACE][filewatch.discoverer     ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] discover_files {:count=>0}
[2024-11-08T15:42:20,208][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] writing sincedb (delta since last write = 15)
[2024-11-08T15:42:20,208][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2 (time = 2024-11-08 15:42:20 +0100)
[2024-11-08T15:42:20,208][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] non_atomic_write:  {:time=>2024-11-08 15:42:20.207801 +0100}
[2024-11-08T15:42:33,217][TRACE][filewatch.discoverer     ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] discover_files {:count=>0}
[2024-11-08T15:42:35,219][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] writing sincedb (delta since last write = 15)
[2024-11-08T15:42:35,219][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2 (time = 2024-11-08 15:42:35 +0100)
[2024-11-08T15:42:35,220][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] non_atomic_write:  {:time=>2024-11-08 15:42:35.218953 +0100}
[2024-11-08T15:42:48,228][TRACE][filewatch.discoverer     ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] discover_files {:count=>0}
[2024-11-08T15:42:50,230][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] writing sincedb (delta since last write = 15)
[2024-11-08T15:42:50,230][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2 (time = 2024-11-08 15:42:50 +0100)
[2024-11-08T15:42:50,230][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] non_atomic_write:  {:time=>2024-11-08 15:42:50.229782 +0100}
[2024-11-08T15:43:03,239][TRACE][filewatch.discoverer     ][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] discover_files {:count=>0}
[2024-11-08T15:43:05,241][DEBUG][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] writing sincedb (delta since last write = 15)
[2024-11-08T15:43:05,241][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] sincedb_write: /var/lib/logstash/plugins/inputs/file/.sincedb_459170b707e4f3c36a6187dd2b219ee2 (time = 2024-11-08 15:43:05 +0100)
[2024-11-08T15:43:05,242][TRACE][filewatch.sincedbcollection][dns][573890e0da80b9c8cd03abf71ed673e77b66f6632f010420f685c8593c9ee85c] non_atomic_write:  {:time=>2024-11-08 15:43:05.240981 +0100}

Keep in mind that I have a tcpdump running on the Elasticsearch hosts and can see Logstash disconnecting on shutdown and reconnecting on startup, but after that no more traffic flows.

The log about discovering files always shows discover_files {:count=>0}.

What are the permissions for the path /var/log/named and its files? I'm not sure whether Logstash will complain if it cannot read the files; it may just silently ignore them.

I had hoped it would throw an error or at least a warning.
I know the file in /var/log/named that Logstash is supposed to read has read permission set for others; I think the folder does not. Before the reinstall I tried setting the path explicitly instead of using the wildcard (like /var/log/named/server_name.log), as well as making a copy of the logfile in /etc/logstash and giving logstash ownership, neither of which had any effect. But that was before the reinstall.
I'm off for a week now, so I'll have to get back with results on the 18th.
Thank you so far!

OK, I mulled this over during my break and managed to get it working today.
It was indeed the permissions on the folder: execute permission is needed to traverse directories, read permission alone is not enough.
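To illustrate the point (a sketch, not the exact commands from the host): on a directory, the read bit only allows listing names, while the execute bit controls traversal, i.e. opening anything inside by path. Traversal can be granted without making the directory listable:

```shell
# Demonstration in a scratch directory: mode 711 permits traversal
# (others can open files inside by exact path) but not listing.
tmp=$(mktemp -d)
chmod 711 "$tmp"
stat -c '%A' "$tmp"    # prints drwx--x--x

# Applied to the real paths, something like this would have sufficed:
# chmod o+x /var/log/named        # allow traversal into the directory
# chmod o+r /var/log/named/*.log  # the files themselves still need read
```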

Edit: I confirmed this by editing /etc/passwd (giving the logstash user a login shell) and then doing su logstash to test the permissions as the logstash user.

I had previously ruled out this possibility because I had tried a copy of the logfile with logstash given ownership, but it seems the result of that test was confounded by other permission issues from the previous installation and was therefore not conclusive.

(I feel like an error message for "permission denied" would be helpful)

Thank you again for your help!