JDBC SQLite integration with logstash

Hi there,

I have a bunch of SQLite files, each with multiple tables, that I want to integrate into ES. Everything is installed on Debian using the deb packages. Integration is done via JDBC with the following input configuration, either with multiple jdbc blocks (one per table) or multiple Logstash configs (one per database and table):

jdbc {
  jdbc_connection_string => "jdbc:sqlite:importDB.db"
  jdbc_user => ""
  #jdbc_password => ""
  jdbc_driver_library => "sqlite-jdbc-3.18.0.jar"
  jdbc_driver_class => "org.sqlite.JDBC"
  #jdbc_page_size => 1000
  #jdbc_paging_enabled => "true"
  statement => "SELECT * from table1"
  type => "table1"
  add_field => {

Either way, indexing only a small amount of data into ES works fine, but indexing all tables of a single database (around 16 MB SQLite file size) in a single Logstash run already fails. What I receive is:

/tmp/sqlite-3.18.0-9b0a648a-822b-4970-a629-b51e9bedea72-libsqlitejdbc.so (No such file or directory)
/tmp/sqlite-3.18.0-a3167768-1e44-40bd-ab01-24365391abcd-libsqlitejdbc.so (No such file or directory)
/tmp/sqlite-3.18.0-cb6c9ce1-588c-4ecf-bcde-20d3686c55f6-libsqlitejdbc.so (No such file or directory)
/tmp/sqlite-3.18.0-325a371f-c4f3-46a9-a59d-246843f423ae-libsqlitejdbc.so (No such file or directory)
/tmp/sqlite-3.18.0-681bb254-463d-4bec-ace8-9670a6487bce-libsqlitejdbc.so (No such file or directory)
/tmp/sqlite-3.18.0-557c5b95-078e-42b0-bfb7-4c4eaed451be-libsqlitejdbc.so (No such file or directory)
10:44:51.429 [[main]<jdbc] WARN logstash.inputs.jdbc - Failed test_connection.
10:44:51.434 [[main]<jdbc] WARN logstash.inputs.jdbc - Failed test_connection.
10:44:51.430 [[main]<jdbc] WARN logstash.inputs.jdbc - Failed test_connection.
10:44:51.433 [[main]<jdbc] WARN logstash.inputs.jdbc - Failed test_connection.
10:44:51.439 [[main]<jdbc] ERROR logstash.pipeline - A plugin had an unrecoverable error. Will restart this plugin.

Any help is highly appreciated.

Kind regards,

I can't tell for sure from your post: do you get this error immediately on LS startup or after some X number of rows are read?

I assume you are using xerial/sqlite-jdbc

In this part of the README https://github.com/xerial/sqlite-jdbc#configuration it's stated that "sqlite-jdbc extracts a native library for your OS".

Can you verify that the sqlite-3.18.0-<uuid>-libsqlitejdbc.so files can be created? Perhaps change the java.io.tmpdir JVM setting to a folder that you control.

Look at the jvm.options file in the LS config folder.

Thanks for your quick reply. I get the following list of files when executing Logstash with multiple configs:

-rw-r--r-- 1 user user 0 Jun 26 12:18 sqlite-3.18.0-07c98248-c11c-4dea-8744-184840c2a610-libsqlitejdbc.so.lck
-rwxr--r-- 1 user user 950952 Jun 26 12:18 sqlite-3.18.0-28434bcb-d95d-4d2f-82a2-d7abcc0c2639-libsqlitejdbc.so
-rw-r--r-- 1 user user 0 Jun 26 12:18 sqlite-3.18.0-28434bcb-d95d-4d2f-82a2-d7abcc0c2639-libsqlitejdbc.so.lck
-rw-r--r-- 1 user user 0 Jun 26 12:18 sqlite-3.18.0-350dcc30-755d-4e98-97d2-4f88724b3001-libsqlitejdbc.so.lck
-rw-r--r-- 1 user user 0 Jun 26 12:18 sqlite-3.18.0-5ce7b7ed-6716-41d7-b304-933af10f9770-libsqlitejdbc.so.lck
-rw-r--r-- 1 user user 0 Jun 26 12:18 sqlite-3.18.0-5fc79974-7d57-4977-8698-854683f86ffc-libsqlitejdbc.so.lck
-rw-r--r-- 1 user user 0 Jun 26 12:18 sqlite-3.18.0-65f3a2fc-fef1-47bf-8ecc-af4a36de7a79-libsqlitejdbc.so.lck
-rw-r--r-- 1 user user 0 Jun 26 12:18 sqlite-3.18.0-b48413ad-7f4e-4a1d-bbf0-d86fcd743f53-libsqlitejdbc.so.lck
-rw-r--r-- 1 user user 0 Jun 26 12:18 sqlite-3.18.0-f1bf26bd-90cf-4b57-9055-1c4853a1a358-libsqlitejdbc.so.lck
-rwxr--r-- 1 user user 950952 Jun 26 12:18 sqlite-3.18.0-fd31daba-bec2-49ba-a5d4-e6cf7f663f32-libsqlitejdbc.so
-rw-r--r-- 1 user user 0 Jun 26 12:18 sqlite-3.18.0-fd31daba-bec2-49ba-a5d4-e6cf7f663f32-libsqlitejdbc.so.lck

It seems that some lock files are generated although the corresponding .so file is missing.

From this line of code https://github.com/xerial/sqlite-jdbc/blob/master/src/main/java/org/sqlite/SQLiteJDBCLoader.java#L62, I suspect that the last-started LS instance is wiping the .so files of the other LS instances. The check "return name.startsWith(searchPattern) && !name.endsWith(".lck");" skips the .lck lock files themselves, and the searchPattern matches any file starting with the name and version.

OK, so how do you mitigate this? I think you will need a separate tmp folder per LS instance.
In <LS>/config/jvm.options, uncomment and set -Djava.io.tmpdir=/tmp/logstash-1.
Make sure each LS install is sequentially numbered, and create and set permissions on those folders.
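As a sketch of that setup (the paths and the instance count here are examples, not taken from an actual install):

```shell
# Create one private tmp folder per Logstash instance (paths are examples)
for i in 1 2; do
  mkdir -p "/tmp/logstash-$i"
  chmod 700 "/tmp/logstash-$i"   # only the Logstash user should write here
done

# Each instance then gets its own line in <LS>/config/jvm.options, e.g.:
#   -Djava.io.tmpdir=/tmp/logstash-1   (instance 1)
#   -Djava.io.tmpdir=/tmp/logstash-2   (instance 2)
```

With separate tmp folders, each instance extracts and cleans up only its own libsqlitejdbc.so copies, so they can no longer delete each other's files.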

Thanks again. I am pretty new to Logstash, so this might sound a bit dumb. I am aware that I can start multiple instances of Logstash, but I only start it once with -f "*.logstash". Does this invoke multiple instances of LS? How can I number those instances as suggested above? Is this a bug, and will it be fixed in the future?

Kind regards,

Ahh, I understand now. You have multiple JDBC inputs in one running LS instance. These are messing with each other. I was foxed by your phrase "multiple jdbc blocks (per table) or multiple logstash configs (per database and table in database):". I assumed you have multiple LS instances running.

How did you install LS?

LS is installed via Debian's package system

I'm checking with other LS devs how you can get multiple LS installs via Debian packaging.

It does not seem possible to install the deb package a second time into a different folder with a different service name.

Maybe multiple Docker containers with one LS in each?
Maybe different machines?

Unfortunately multiple instances of LS won't be possible, as I'd like to add roughly 20 databases with around 10 tables each every 15 minutes. I fear this won't work. Will this be fixed in upcoming versions to truly support multiple JDBC sources per config, or do you think the sqlite input plugin would work for my use case? First experiments there also showed some problems...

Kind regards, and again thanks for your help

This multiple JDBC input per LS instance problem seems to be restricted to the SQLite technology only.

Do you have a choice over the db technology? Could Apache Derby files work?

Thanks for your continuous support. I regret that I have no choice regarding the db technology. I tried using the sqlite input plugin without jdbc, but there I receive the following error message:

Error: Missing Valuefier handling for full class name=org.jruby.RubyObject, simple name=RubyObject

Is this somehow related to the jdbc problems or can this problem be solved somehow?

My very last alternative would be to convert the sqlite data into csv tables and import those text files into logstash, but I would prefer using a direct sqlite connection if possible. Any idea how to fix the RubyObject issue above?
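For reference, the CSV route can be sketched with the sqlite3 command-line tool; the database path and table below are made up purely for the demonstration:

```shell
# Build a tiny demo database, then export a table to CSV.
DB=/tmp/importdb-demo.db
rm -f "$DB"
sqlite3 "$DB" "CREATE TABLE table1 (id INTEGER, name TEXT);
               INSERT INTO table1 VALUES (1, 'foo'), (2, 'bar');"

# -header emits the column names, -csv switches the output mode
sqlite3 -header -csv "$DB" "SELECT * FROM table1;" > /tmp/table1.csv
```

The resulting CSV file can then be picked up with Logstash's file input and csv filter instead of the jdbc input.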

Kind regards,

After some investigation: unfortunately logstash-input-sqlite uses the same JDBC driver under the hood (an older version, though it still includes the cleanup). Meaning that even if the "Missing Valuefier handling for full class name=org.jruby.RubyObject, simple name=RubyObject" bug were fixed, you would still get the .so file deletion issue.

BTW, the "Missing Valuefier" bug is because the input has not been updated to convert some JDBC datatypes to datatypes that are acceptable to the newish Java Event (each acceptable datatype has a corresponding Valuefier).

If you are considering exporting to CSV then you should also consider exporting to one or more Apache Derby DB files.

That said, we have not verified that multiple JDBC inputs work satisfactorily on all supported DB technologies. We simply do not have enough time to do this - we rely on the help of others (e.g. you) and offer this kind of help in return.

Good news. There is a way to run multiple LS instances under Debian, my fellow developer Aaron explains how in this discuss question [SOLVED] Multiple Instances for Logstash.

Multiple instances of Logstash are not an option. Is there any other way to do this?

Kind regards,

Multiple LS instances on one machine, or exporting the SQLite data to another DB technology, are the only options you have, sorry.

OK. Thanks for your continuous assistance. I'll see what I can do with CSV files. However, would it make sense to file a bug report? And if so, where could I do that?

Kind regards,

You may create a GitHub issue at https://github.com/logstash-plugins/logstash-input-jdbc/issues
Please supply as much info as possible.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.