Logstash - reading from MySQL and loading into an Elasticsearch index

We are using Logstash with the JDBC connector to query a MySQL database and load the results into an Elasticsearch index. The index uses nested fields. When we load approximately 1,000 records, some documents have multiple values for the nested component; for those documents only a couple of the nested records are inserted, but I don't get any error. If I load one of the affected documents individually, i.e. one document at a time, all of its nested records load correctly. A sketch of this kind of pipeline is shown below.
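For reference, a minimal sketch of the kind of pipeline being described. The connection details, the claims/claimid field names, and the use of the aggregate filter to rebuild the nested array from joined rows are placeholders and assumptions, not our exact config (memberpk is the real document key from the output below):

    input {
      jdbc {
        jdbc_driver_library => "/path/to/mysql-connector-java.jar"
        jdbc_driver_class => "com.mysql.jdbc.Driver"
        jdbc_connection_string => "jdbc:mysql://dbhost:3306/proddb"
        jdbc_user => "user"
        jdbc_password => "password"
        # One row per member/claim pair; the SQL join flattens the nested data
        statement => "SELECT m.memberpk, c.claimid FROM member m JOIN claim c ON c.memberpk = m.memberpk ORDER BY m.memberpk"
      }
    }
    filter {
      # Rebuild the nested array from the flattened rows. The aggregate
      # filter keeps state per task_id, so the documentation requires
      # pipeline.workers: 1 for it to see a member's rows in order.
      aggregate {
        task_id => "%{memberpk}"
        code => "
          map['memberpk'] ||= event.get('memberpk')
          map['claims'] ||= []
          map['claims'] << { 'claimid' => event.get('claimid') }
          event.cancel
        "
        push_previous_map_as_event => true
        timeout => 5
      }
    }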

To see the errors in the logs I added stdout { codec => "rubydebug" } to the config and set log.level: "debug" in logstash.yml. With these settings, loading just 999 records creates a 12 GB log file that is difficult to open. Is there any way to capture only the errors from Elasticsearch when inserting values into the index?
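One option that may help, assuming Logstash 5.4 or later: the dead letter queue. When enabled, events that the elasticsearch output fails to index (400/404 responses from the bulk API) are written to the queue instead of being dropped, so only the failures are captured. A sketch of the settings in logstash.yml (the path is a placeholder):

    dead_letter_queue.enable: true
    path.dead_letter_queue: "/var/lib/logstash/dlq"

The failed events can then be inspected with a small separate pipeline using the dead_letter_queue input plugin:

    input {
      dead_letter_queue {
        path => "/var/lib/logstash/dlq"
        commit_offsets => true
      }
    }
    output {
      stdout { codec => rubydebug }
    }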
This is the output section of the config file:
output {
  elasticsearch {
    document_id => "%{memberpk}"
    document_type => "member"
    index => "prodtest7"
    codec => "json"
    hosts => "https://endpointurl.com:443"
  }
  stdout { codec => "rubydebug" }
}

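As an alternative to raising log.level to debug globally, the level can be raised for just the elasticsearch output so the rest of the pipeline stays quiet. A sketch, adding two lines to the existing log4j2.properties (the esout logger name is arbitrary; the logger path is the plugin's class name):

    logger.esout.name = logstash.outputs.elasticsearch
    logger.esout.level = debug

The same per-logger change can also be made at runtime through Logstash's _node/logging API, which avoids a restart.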
Please advise.
