Logstash sqlite plugin can't output to Elasticsearch

Hello,
I really need to store my logs in a database because I will be modifying the data, so I tried using the sqlite plugin for Logstash.

Here's my config file:

input {
  sqlite {
    path => "C:/Users/Library/Documents/logstash-2.4.0/tmp/DB_withweblogs.db"
    type => "weblogs"
  }
}

output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "test-%{+YYYY.MM.dd}"
  }
  stdout {}
}

Here's the error caused by the elasticsearch output:

Cannot serialize instance of: Sequel::JDBC::Database {:class=>"JrJackson::ParseError", :backtrace=>["com/jrjackson/JrJacksonBase.java:76:in `generate'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/jrjackson-0.3.9-java/lib/jrjackson/jrjackson.rb:60:in `dump'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/multi_json-1.12.1/lib/multi_json/adapters/jr_jackson.rb:20:in `dump'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/multi_json-1.12.1/lib/multi_json/adapter.rb:25:in `dump'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/multi_json-1.12.1/lib/multi_json.rb:139:in `dump'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.1.0/lib/elasticsearch/api/utils.rb:99:in `__bulkify'", "org/jruby/RubyArray.java:2414:in `map'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.1.0/lib/elasticsearch/api/utils.rb:89:in `__bulkify'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.1.0/lib/elasticsearch/api/actions/bulk.rb:88:in `bulk'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:53:in `non_threadsafe_bulk'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'", "org/jruby/ext/thread/Mutex.java:149:in `synchronize'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:172:in `safe_bulk'",
"C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:101:in `submit'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:86:in `retrying_submit'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:29:in `multi_receive'", "org/jruby/RubyArray.java:1653:in `each_slice'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:28:in `multi_receive'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:130:in `worker_multi_receive'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:114:in `multi_receive'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:301:in `output_batch'", "org/jruby/RubyHash.java:1342:in `each'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:301:in `output_batch'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:232:in `worker_loop'", "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:201:in `start_workers'"], :level=>:warn}
SIGINT received. Shutting down the agent. {:level=>:warn}
stopping pipeline {:id=>"main"}

How can I read the data from my sqlite database (which works) and export it to Elasticsearch?

Thank you

I also tried exporting to a file first, planning to import it into ES afterwards, but I get the same error.

Here's the different .conf file:

input {
  sqlite {
    path => "C:/Users/Library/Documents/logstash-2.4.0/tmp/DB_withweblogs.db"
    type => "weblogs"
  }
}

output {
  file {
    path => "C:/Users/Library/Documents/logstash-2.4.0/test.txt"
  }
  stdout {}
}

Error:

C:\Users\Library\Documents\logstash-2.4.0>bin\logstash -f test.conf
Settings: Default pipeline workers: 1
Pipeline main started
LogStash::Json::GeneratorError: Cannot serialize instance of: Sequel::JDBC::Database
jruby_dump at C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/json.rb:53
to_json at C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-event-2.4.0-java/lib/logstash/event.rb:145
encode at C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-codec-json_lines-2.1.3/lib/logstash/codecs/json_lines.rb:48
receive at C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-file-2.2.5/lib/logstash/outputs/file.rb:129
multi_receive at C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/outputs/base.rb:109
each at org/jruby/RubyArray.java:1613
multi_receive at C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/outputs/base.rb:109
worker_multi_receive at C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:130
multi_receive at C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:114
output_batch at C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:301
each at org/jruby/RubyHash.java:1342
output_batch at C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:301
worker_loop at C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:232
start_workers at C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:201

Does anyone know why the output stage fails?

Has anyone used this plugin before and gotten it to work?

Thanks to the person who suggested the fix in this GitHub issue: https://github.com/logstash-plugins/logstash-input-sqlite/issues/5

I found the plugin file under this path:
C:\Users\Library\Documents\logstash-2.4.0\vendor\bundle\jruby\1.9\gems\logstash-input-sqlite-2.0.4\lib\logstash\inputs

and, as he suggested, changed the @db variable to @path on line 160.
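For reference, here's a sketch of that one-line change in sqlite.rb (based on the logstash-input-sqlite 2.0.4 install above; the exact line number and field assignment may differ in other versions). The input was attaching the live Sequel::JDBC::Database handle to every event, and that object is what the JSON serializer chokes on; storing the plain path string makes the event serializable:

```ruby
# lib/logstash/inputs/sqlite.rb (plugin version 2.0.4), around line 160

# Before: the event carries the live database handle,
# which JrJackson cannot serialize to JSON
event["db"] = @db

# After: store the plain path string instead
event["db"] = @path
```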

Now it works

Here's my .conf file:

input {
  sqlite {
    path => "C:/Users/Library/Documents/logstash-2.4.0/tmp/DROIT_BIBLIO_withweblogs.db"
    type => "weblogs"
  }
}

output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "test"
  }
  stdout {}
}

I verified with:

curl -XGET 'localhost:9200/test/_search?pretty&q=response:200'

(note the colon: Lucene query-string syntax is field:value, so q=response=200 would not match on the response field)
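As a side note, a workaround that might avoid patching the plugin entirely (I haven't tried it) would be to drop the offending field with a mutate filter before the output. This assumes the input stores the database handle in an event field named db, as the plugin source suggests:

```
filter {
  mutate {
    remove_field => ["db"]
  }
}
```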