Logstash DLQ: sincedb never gets created

Hello,

I'm trying to configure dead letter queue on a setup which was installed by another team a few years ago.

Documents refused by Elasticsearch do go to the DLQ.
The DLQ pipeline ingests them and sends them, correctly formatted, to the index created to receive DLQ events for analysis.

My problem is that at each restart of Logstash, the events are re-sent to Elasticsearch, even though I set both commit_offsets and clean_consumed to true.
The sincedb file never appears, whatever combination of settings I try.

I strongly suspect that this setup was first run with Logstash as root and later as the logstash user.
My suspicion comes from the fact that running Logstash as root to test my pipeline did create the sincedb.
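To check for leftovers from a root run, I use a small helper along these lines (the function name is mine, and the path/user in the commented call are specific to my setup):

```shell
# List files under a directory tree NOT owned by the given user.
# Leftovers from an earlier run as root would show up here.
find_foreign_files() {
  local dir="$1" user="$2"
  find "$dir" ! -user "$user"
}

# For my setup (path and user name are specific to this host):
# find_foreign_files /home/sites/logstash/data/dead_letter_queue logstash
```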

We are running Logstash 8.4.2.

Our settings are the following:

/etc/logstash/logstash.yml

path.data: /var/lib/logstash
#pipeline.id: filestream-apigee
pipeline.ordered: auto
pipeline.ecs_compatibility: v8
config.reload.automatic: false
config.reload.interval: 60s
config.debug: true
log.level: debug
path.logs: /home/elasticsearch/log/logstash
xpack.management.elasticsearch.ssl.certificate_authority: "/etc/elasticsearch/certs/ca/ca.crt"
dead_letter_queue.enable: true
path.dead_letter_queue: "/home/sites/logstash/data/dead_letter_queue"

/etc/logstash/pipelines.yml

- pipeline.id: "filestream-apigee"
  path.config: "/etc/logstash/conf.d/filestream-apigee.conf"

- pipeline.id: "dead-letter-queue"
  path.config: "/etc/logstash/conf.d/dead-letter-queue.conf"

/etc/logstash/conf.d/dead-letter-queue.conf

input {
  dead_letter_queue {
    path => "/home/sites/logstash/data/dead_letter_queue/"
    pipeline_id => "filestream-apigee"
    commit_offsets => true
    clean_consumed => true
    sincedb_path => "/home/sites/logstash/data/dead_letter_queue/filestream-apigee/.sincedb_manually_configured"
  }
}
filter {
  prune {
    add_field => {
      "[_dlq][entry_time]" => "%{[@metadata][dead_letter_queue][entry_time]}"
      "[_dlq][plugin_id]" => "%{[@metadata][dead_letter_queue][plugin_id]}"
      "[_dlq][plugin_type]" => "%{[@metadata][dead_letter_queue][plugin_type]}"
      "[_dlq][reason]" => "%{[@metadata][dead_letter_queue][reason]}"
    }
    whitelist_names => [ "@timestamp", "dlq", "startTime", "endTime", "issuer", "tags", "input", "correlationId", "log", "fields", "host", "agent" ]
  }
  date {
    match => [ "[_dlq][entry_time]" , "ISO8601", "yyyy-MM-dd'T'HH:mm:ss.SSS" ]
    target => "@timestamp"
    timezone => "Europe/Brussels"
  }
}
output {
  elasticsearch {
    hosts => ["https://192.168.1.2:9200"]
    index => "dead-letter-queue"
    cacert => "/etc/elasticsearch/certs/ca/ca.crt"
    action => "create"
    user => "logstash_writer"
    password => "logstash_writer_pwd"
    ssl => true
  }
  stdout {
     codec => rubydebug
  }
}

Here are the permissions on the DLQ folder (within which the sincedb should be created):

dr-xr-xr-x. 19 root     root     4096 Jun 10 15:44 /
drwxr-xr-x. 25 root     root     4096 Jan 16  2024 /home
drwxr-x---   6 root     sites    4096 Mar 25  2024 /home/sites
drwxrwx---   3 logstash logstash 4096 Sep 24 15:17 /home/sites/logstash
drwxr-x---   3 logstash logstash 4096 Sep 24 15:17 /home/sites/logstash/data
drwxr-x---   4 logstash logstash 4096 Nov  7 23:37 /home/sites/logstash/data/dead_letter_queue
drwxr-xr-x 2 logstash logstash 4096 Nov  7 23:52 /home/sites/logstash/data/dead_letter_queue/filestream-apigee
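To rule out a plain permission problem, I also probe whether the service user can actually create a file where the sincedb should land. This is just a sketch (helper name is mine; the path and the availability of sudo are assumptions from my setup):

```shell
# Try to create and immediately remove a marker file in a directory,
# reporting whether the directory is writable for the calling user.
probe_writable() {
  local dir="$1"
  if touch "$dir/.sincedb_write_test" 2>/dev/null; then
    rm -f "$dir/.sincedb_write_test"
    echo "writable"
  else
    echo "not writable"
  fi
}

# Run it as the logstash service user (sudo assumed available on my host):
# sudo -u logstash bash -c "$(declare -f probe_writable); probe_writable /home/sites/logstash/data/dead_letter_queue/filestream-apigee"
```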

I did try setting commit_offsets to false with clean_consumed set to true. As expected, I got a configuration error saying that both must be true, which tells me the version I'm running knows about these settings.

I searched through the logs but did not find any warning or error.
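For completeness, here is how the process owner can be double-checked (a sketch; the systemd unit name "logstash" in the commented call is specific to my host):

```shell
# Print the owning user of a given PID; used to confirm which user
# the Logstash service actually runs as.
pid_owner() {
  ps -o user= -p "$1" | tr -d ' '
}

# On my host, the PID comes from systemd (unit name is my setup's):
# pid_owner "$(systemctl show -p MainPID --value logstash)"
```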

What else can I check?

Thank you in advance.