Data from MongoDB using Logstash via JDBC

I am trying to create a MongoDB-to-ELK connection: read data from MongoDB and insert it into Elasticsearch. I decided to use JDBC because I can filter on a date field and retrieve only new data every few minutes.

Here is my JDBC input setup. It makes the connection to my sample database, but then it errors. I think it has something to do with a null field in Mongo, but I do not know how to fix it. I have googled a lot but cannot find a solution.

     jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/mongojdbc1.6.jar"
     jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
     jdbc_connection_string => "jdbc:mongodb://tst908:27072/Jobs"
    jdbc_user => ""
    jdbc_password => ""
    statement => "db.getCollection('jobs').find({})"
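For context, those settings sit inside a `jdbc` input block of a full pipeline. A minimal sketch of how that could look; the `schedule` and the Elasticsearch `output` (hosts, index name) are my assumptions to illustrate the "every few minutes" polling, not part of the original config:

    input {
      jdbc {
        jdbc_driver_library => "/usr/share/logstash/logstash-core/lib/jars/mongojdbc1.6.jar"
        jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
        jdbc_connection_string => "jdbc:mongodb://tst908:27072/Jobs"
        jdbc_user => ""
        jdbc_password => ""
        # assumed: poll every five minutes (cron syntax)
        schedule => "*/5 * * * *"
        statement => "db.getCollection('jobs').find({})"
      }
    }
    output {
      # assumed: local Elasticsearch and index name
      elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "jobs"
      }
    }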


[INFO ] 2020-06-04 12:05:31.310 [[main]<jdbc] jdbc - (0.694380s) db.getCollection('jobs').find({})
[WARN ] 2020-06-04 12:05:31.375 [[main]<jdbc] jdbc - Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::OrgLogstash::MissingConverterException: Missing Converter handling for full class name=org.bson.types.ObjectId, simple name=ObjectId>}
[INFO ] 2020-06-04 12:05:31.943 [LogStash::Runner] runner - Logstash shut down.

I used this and it worked:

db.getCollection('jobs').find({},{'_id': false})
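The error occurs because Logstash has no value converter for the BSON `ObjectId` type, so excluding `_id` in the projection sidesteps it. If you need to keep the id, one possible alternative (my assumption: it requires MongoDB 4.0+ for `$toString`, and that the driver accepts aggregation statements) is to copy it into a plain string field under a different name, since Elasticsearch will not accept `_id` as a document field:

    db.getCollection('jobs').aggregate([
      { $addFields: { 'mongo_id': { $toString: '$_id' } } },
      { $project: { '_id': false } }
    ])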

Where did you get the "mongojdbc1.6.jar"?

I do not remember where, but a quick search turns it up.

Thanks! I also found a newer version (2.0) at (check the repo file)

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.