Logstash jdbc sqlite file in readonly directory

Hi all,
I need your help with a Logstash pipeline.
We have a third-party application which stores its logs in a SQLite .db file (Windows Server).
We want to integrate these logs into our stack.
What we have done so far:
Mounted the directory read-only (ro) from the Windows server on our Logstash server (Ubuntu 18)
Wrote a pipeline using the jdbc input plugin
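For reference, the read-only mount can be expressed as an /etc/fstab entry like this (server name, share name, mount point, and credentials file are placeholders, not our real values):

    //winserver/applogs  /mnt/applogs  cifs  ro,credentials=/etc/cifs-creds,vers=3.0  0  0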

The pipeline:

input {
    jdbc {
     jdbc_driver_library => "/etc/logstash/driver/sqlite-jdbc-3.18.0.jar"
     jdbc_driver_class => "org.sqlite.JDBC"
     jdbc_connection_string => "jdbc:sqlite:/path/to/File.db"
     jdbc_user => ""
     use_column_value => false
     schedule => "* * * * *"
     statement => "SELECT * FROM Event"
    }
}

output {
      stdout {}
}

The output is just for debugging for now; later it will be Elasticsearch.
Now if I run Logstash, I get the following error:

[ERROR] 2020-07-28 08:23:00.819 [Ruby-0-Thread-16: :1] jdbc - Java::OrgSqlite::SQLiteException: [SQLITE_CANTOPEN]  Unable to open the database file (unable to open database file): SELECT * FROM Event

If I copy the .db file into another directory (not the mounted ro dir), it works fine.
I did this and compared the hashes of the db file before and after a successful run. The db file itself does not get modified, so I was wondering why it's not working with a "read-only file".

But after the execution in a rw (read-write) dir I found the following 2 new files:

File.db-shm
File.db-wal

So I guess that's the reason why it's not working in the mounted ro directory.
Does anyone know how to deal with this? Can I change the directory of the -shm/-wal files to a different path?
I also tried creating a new folder with a symlink to the .db file in it, but I get the same error.
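The side-file behaviour can be reproduced outside Logstash. Here is a small sketch using Python's sqlite3 module (paths and table name are throwaway stand-ins): even a plain SELECT against a WAL-mode database creates the -shm/-wal files, while opening the file with SQLite's documented immutable=1 URI parameter does not.

```python
import os
import sqlite3
import tempfile

# Create a throwaway WAL-mode database (stand-in for the vendor's File.db).
tmpdir = tempfile.mkdtemp()
db = os.path.join(tmpdir, "File.db")

con = sqlite3.connect(db)
con.execute("PRAGMA journal_mode=WAL")  # WAL mode is persisted in the file
con.execute("CREATE TABLE Event (id INTEGER, msg TEXT)")
con.commit()
con.close()  # a clean close of the last connection removes -wal/-shm

# A plain read-only query still creates the -shm/-wal side files:
reader = sqlite3.connect(db)
rows = reader.execute("SELECT * FROM Event").fetchall()
wal_during_read = os.path.exists(db + "-wal")
reader.close()

# Opening with SQLite's immutable=1 URI parameter skips them entirely:
ro = sqlite3.connect(f"file:{db}?immutable=1", uri=True)
rows_ro = ro.execute("SELECT * FROM Event").fetchall()
wal_after_immutable = os.path.exists(db + "-wal")
ro.close()
```

Note that immutable=1 tells SQLite the file cannot change at all while open, which is only safe if the Windows application is not writing to it at the same time.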

Logstash-Version: 7.6.1

PS: We don't want to make the Windows share writable.
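One option that looks promising is passing SQLite's documented immutable=1 URI parameter through the JDBC connection string, so no -shm/-wal files are needed at all. Whether the Xerial driver forwards file: URIs and their query parameters depends on the driver version, and I haven't verified it with 3.18.0, so treat this as a sketch:

     # hypothetical: SQLite URI filename with immutable=1, so SQLite treats
     # the file as unchangeable and never creates -shm/-wal side files
     jdbc_connection_string => "jdbc:sqlite:file:/path/to/File.db?immutable=1"

If that works, it is only safe because the share is mounted ro; immutable=1 assumes nothing else modifies the file while it is open.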


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.