Logstash JDBC missing many documents

I am moving search from a Postgres full-text index to Elasticsearch, using the Logstash JDBC input to load documents into the index. I index about 3 million documents a year, roughly 3K to 10K documents a day. Elasticsearch has improved search performance by better than a factor of 10.
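For reference, my pipeline looks roughly like this minimal sketch (connection string, table, and column names here are placeholders, not my real values):

```conf
input {
  jdbc {
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"  # placeholder
    jdbc_user => "logstash"
    jdbc_driver_class => "org.postgresql.Driver"
    # Incremental load keyed on a timestamp column (names are illustrative)
    statement => "SELECT id, title, body, updated_at FROM documents WHERE updated_at > :sql_last_value"
    tracking_column => "updated_at"
    tracking_column_type => "timestamp"
    schedule => "*/5 * * * *"
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "documents"
    # Use the Postgres primary key as the ES _id so re-runs upsert
    # instead of duplicating documents
    document_id => "%{id}"
  }
}
```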

Roughly 5% of all documents in Postgres never make it through the Logstash JDBC input. I strongly suspect the cause is escaping of text fields, specifically single or double quotes in the text; this is a very common problem when converting from one database to another.

I have not found a way to track down the missing documents: no errors or exceptions are being logged, even though I have tried every log level. I need the exceptions logged so I can verify the actual error and then find a solution.
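To at least identify which documents are missing, I can diff the primary keys in Postgres against the `_id` values in the index. A small sketch of the idea, assuming the two ID lists have been exported separately (e.g. `SELECT id FROM documents` via `psql \copy`, and a scroll over the index for the `_id` field):

```python
def find_missing_ids(source_ids, indexed_ids):
    """Return the sorted list of IDs present in the source but not in the index."""
    return sorted(set(source_ids) - set(indexed_ids))

if __name__ == "__main__":
    pg_ids = [1, 2, 3, 4, 5]   # e.g. exported from Postgres
    es_ids = [1, 2, 4]         # e.g. _id values scrolled from Elasticsearch
    print(find_missing_ids(pg_ids, es_ids))  # -> [3, 5]
```

Inspecting the actual rows behind the missing IDs (do they all contain quotes?) would confirm or rule out the escaping hypothesis.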

How can I get these errors to be logged?
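One thing I am considering (not yet verified to surface these particular failures): enabling Logstash's dead letter queue in `logstash.yml`, which captures events that the Elasticsearch output could not index, along with the reason for the failure:

```conf
# logstash.yml
dead_letter_queue.enable: true
log.level: debug
```

The captured events can then be read back with a separate pipeline using the `dead_letter_queue` input plugin (path shown is the default on many installs and may differ):

```conf
input {
  dead_letter_queue {
    path => "/var/lib/logstash/dead_letter_queue"  # adjust to your install
    commit_offsets => true
  }
}
output {
  stdout { codec => rubydebug { metadata => true } }
}
```

My understanding is that the DLQ only records documents rejected by Elasticsearch itself (e.g. mapping conflicts), so if the documents are being dropped earlier, on the JDBC input side, this may not catch them.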