Logstash jdbc input timestamp parse issue

We were previously on Logstash 2.3.4; after updating to 5.4.1 we are facing this issue. Please resolve ASAP and provide a fix. This is a major issue in production.
Logstash version: 5.4.1

Date field format: dd:mm:yy hh:mm:ss:sssss
exception=>#<Sequel::DatabaseError: Java::JavaSql::SQLException: Cannot convert value '2016-12-30 08:21:04.074000' from column to TIMESTAMP

Query: select * from Table;

Logstash config file:

input {
  jdbc {
    type => "xyz"
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://server/schema?"
    jdbc_user => "user"
    jdbc_password => "pwd"
    lowercase_column_names => "false"
    schedule => "* * * * * *"
    statement_filepath => "sqlfilepath"
  }
}

I'm not sure what's going on here, but does it help if you convert the timestamp column to a varchar in your select statement?
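Something like this is what I have in mind, purely as a sketch. The id and event_time column names are made up, so substitute your real ones, and if you keep statement_filepath, put the modified SELECT into that file instead of using an inline statement:

input {
  jdbc {
    jdbc_driver_class => "com.mysql.jdbc.Driver"
    jdbc_connection_string => "jdbc:mysql://server/schema?"
    jdbc_user => "user"
    jdbc_password => "pwd"
    # Cast the TIMESTAMP column to a string in SQL so the JDBC driver never
    # has to build a java.sql.Timestamp from it; the value arrives as text.
    statement => "select id, cast(event_time as char) as event_time from Table"
  }
}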

Please resolve ASAP and provide a fix. This is a major issue in production.

This forum is not an official support channel with an SLA. Many of the people responding to questions here are volunteers with no connection to Elastic and generally don't care whether you're in a state of emergency. If you want SLA-based support Elastic offers paid options.

If we convert the timestamp field to varchar, will the Elasticsearch histogram still recognize it as a timestamp field?

Yes, we can deal with that. Right now I'm seeing it as a debugging aid.
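And if the string column ends up being more than a debugging aid, a date filter can parse it back into a proper timestamp so date histograms keep working. A rough sketch, assuming the cast column from above and that the values look like the one in your error message:

filter {
  date {
    # Parse strings like "2016-12-30 08:21:04.074000" back into @timestamp.
    # Field name and pattern are assumptions based on the error message above.
    match => ["event_time", "yyyy-MM-dd HH:mm:ss.SSSSSS"]
    target => "@timestamp"
  }
}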

We are not using Kibana; we use the Java Elasticsearch API. Also, the indexed time will be different from the actual posted time. At the Kibana level that could be adjusted, but how do we achieve this from the Java client API? Is there any mechanism to add a timestamp with the current zone offset and push it to Elasticsearch?
