[Logstash] [JDBC] [mongoDB] MissingConverterException for date fields

Hello!

I'm trying to connect MongoDB to Logstash, but when a collection has date fields, the Logstash logs show the error below and no index or data is created.

I'm using Logstash version 6.7.0

[WARN ][logstash.inputs.jdbc     ] Exception when executing JDBC query {:exception=>#<Sequel::DatabaseError: Java::OrgLogstash::MissingConverterException: Missing Converter handling for full class name=java.util.Date, simple name=Date>}

In the input section of my logstash.conf file I'm using the JDBC input plugin to connect to my MongoDB instance.
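Roughly, the input block looks like this (the driver library path, driver class and connection string are placeholders; the exact values depend on which MongoDB JDBC driver is in use):

input {
  jdbc {
    # placeholder path to the MongoDB JDBC driver jar
    jdbc_driver_library => "/path/to/mongodb-jdbc-driver.jar"
    # placeholder driver class, depends on the driver
    jdbc_driver_class => "com.dbschema.MongoJdbcDriver"
    # placeholder connection string, depends on the driver
    jdbc_connection_string => "jdbc:mongodb://localhost:27017/mydb"
    jdbc_user => ""
    schedule => "* * * * *"
    statement => "db.guides.find({}, {_id: false})"
  }
}
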
The query I'm running retrieves documents like the following example:
Query:

db.guides.find({}, {_id: false}) // exclude _id

Collection data example:

{ "_id" : someid, 
  "organization" : "central", 
  "course" : "central/2000-k20000", 
  "slug" : "colegio/mumuki-guia-fundamentos-primeros-programas", 
  "updated_at" : ISODate("2019-03-29T20:03:39.403Z"), 
  "created_at" : ISODate("2019-03-29T20:03:39.194Z"), 
  "language" : { "name" : "gobstones" }, 
  "name" : "Primeros Programas", 
  "parent" : {  "type" : "Lesson", 
                "name" : "Primeros Programas", 
                "position" : 1, 
                "chapter" : { "id" : 97, 
                              "name" : "Fundamentos" 
                            } 
              } 
}

The thing is that the "created_at" and "updated_at" fields contain dates, and either Logstash or the JDBC plugin (I'm not sure which) fails with the error above.

However, when I run the query like this:

db.guides.find({}, {_id: false, created_at: false, updated_at: false}) // exclude _id and the date fields

The query runs correctly, the data reaches Elasticsearch and Kibana, and the indices get created.
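
If it helps, I would be fine with keeping the dates as plain strings. In the mongo shell I can convert them with $dateToString, something like the sketch below, although I have no idea whether the JDBC driver would pass an aggregation through:

db.guides.aggregate([
  { $project: {
      _id: false,
      organization: true,
      course: true,
      slug: true,
      language: true,
      name: true,
      parent: true,
      // render the dates as ISO-8601 strings instead of java.util.Date
      created_at: { $dateToString: { format: "%Y-%m-%dT%H:%M:%S.%LZ", date: "$created_at" } },
      updated_at: { $dateToString: { format: "%Y-%m-%dT%H:%M:%S.%LZ", date: "$updated_at" } }
  } }
])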

Any ideas about this?

Note: the MongoDB driver and connector I am using are these.
