I have about 2 million records, of which 40 did not get indexed. 11 of them produce a CSV parse failure, even though these rows look just like the other rows.
For the remaining 29, I cannot track which rows they are; there is no information about them in either the Elasticsearch logs or the Logstash logs.
I know Elasticsearch hasn't deleted them, since docs.deleted on localhost:9200/_cat/indices is 0.
The Logstash logs only gave me the CSV parse failure for those 11 rows.
My CSV uses `##` as the separator.
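One way to narrow down which rows fail is to scan the source file outside Logstash and flag any row whose `##`-separated field count differs from the header's. This is a minimal sketch; the filename `data.csv` is a placeholder, and it assumes every valid row has the same number of fields as the header (it will not catch rows that are malformed in other ways, e.g. bad quoting or encoding):

```python
# Sketch: find rows whose "##"-separated field count differs from the
# header row. The separator and the filename "data.csv" are assumptions.
SEP = "##"

def find_bad_rows(lines, sep=SEP):
    """Return (line_number, field_count) for each row whose field count
    differs from the first (header) row. Line numbers are 1-based."""
    it = iter(lines)
    try:
        header = next(it)
    except StopIteration:
        return []
    expected = header.rstrip("\r\n").count(sep) + 1
    bad = []
    for lineno, line in enumerate(it, start=2):
        n = line.rstrip("\r\n").count(sep) + 1
        if n != expected:
            bad.append((lineno, n))
    return bad

if __name__ == "__main__":
    with open("data.csv", encoding="utf-8") as f:
        for lineno, n in find_bad_rows(f):
            print(f"line {lineno}: {n} fields (expected header count differs)")
```

Comparing the line numbers this reports against the 11 rows Logstash complains about might also reveal whether the other 29 rows are structurally off in the same way.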