Hello,
I need to store my logs in a database because I will be modifying the data, so I tried the sqlite input plugin for Logstash.
Here's my config file:
input {
  sqlite {
    path => "C:/Users/Library/Documents/logstash-2.4.0/tmp/DB_withweblogs.db"
    type => "weblogs"
  }
}
output {
  elasticsearch {
    hosts => [ "localhost:9200" ]
    index => "test-%{+YYYY.MM.dd}"
  }
  stdout {}
}
Here's the error, which is raised by the elasticsearch output:
Cannot serialize instance of: Sequel::JDBC::Database {:class=>"JrJackson::ParseError", :backtrace=>[
  "com/jrjackson/JrJacksonBase.java:76:in `generate'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/jrjackson-0.3.9-java/lib/jrjackson/jrjackson.rb:60:in `dump'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/multi_json-1.12.1/lib/multi_json/adapters/jr_jackson.rb:20:in `dump'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/multi_json-1.12.1/lib/multi_json/adapter.rb:25:in `dump'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/multi_json-1.12.1/lib/multi_json.rb:139:in `dump'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.1.0/lib/elasticsearch/api/utils.rb:99:in `__bulkify'",
  "org/jruby/RubyArray.java:2414:in `map'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.1.0/lib/elasticsearch/api/utils.rb:89:in `__bulkify'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/elasticsearch-api-1.1.0/lib/elasticsearch/api/actions/bulk.rb:88:in `bulk'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:53:in `non_threadsafe_bulk'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'",
  "org/jruby/ext/thread/Mutex.java:149:in `synchronize'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/http_client.rb:38:in `bulk'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:172:in `safe_bulk'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:101:in `submit'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:86:in `retrying_submit'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:29:in `multi_receive'",
  "org/jruby/RubyArray.java:1653:in `each_slice'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-output-elasticsearch-2.7.1-java/lib/logstash/outputs/elasticsearch/common.rb:28:in `multi_receive'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:130:in `worker_multi_receive'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/output_delegator.rb:114:in `multi_receive'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:301:in `output_batch'",
  "org/jruby/RubyHash.java:1342:in `each'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:301:in `output_batch'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:232:in `worker_loop'",
  "C:/Users/Library/Documents/logstash-2.4.0/vendor/bundle/jruby/1.9/gems/logstash-core-2.4.0-java/lib/logstash/pipeline.rb:201:in `start_workers'"
], :level=>:warn}
SIGINT received. Shutting down the agent. {:level=>:warn}
stopping pipeline {:id=>"main"}
How can I take the data from my SQLite database (reading it seems to work) and export it to Elasticsearch?
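I've also seen the jdbc input plugin mentioned as an alternative to the sqlite input. A minimal sketch of what I think that would look like with the SQLite JDBC driver (the driver jar path and the table name are assumptions on my part; jdbc_user is required by the plugin even though SQLite ignores it):

input {
  jdbc {
    # Path to the SQLite JDBC driver jar, downloaded separately (location is an assumption)
    jdbc_driver_library => "C:/Users/Library/Documents/sqlite-jdbc-3.8.11.2.jar"
    jdbc_driver_class => "org.sqlite.JDBC"
    jdbc_connection_string => "jdbc:sqlite:C:/Users/Library/Documents/logstash-2.4.0/tmp/DB_withweblogs.db"
    jdbc_user => ""  # required setting, unused by SQLite
    # Table name is a guess; replace with the actual table in DB_withweblogs.db
    statement => "SELECT * FROM weblogs"
    type => "weblogs"
  }
}

Would that be the recommended route, or is there a way to make the sqlite input work?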
Thank you