Losing data while indexing from MySQL to Elasticsearch via Logstash

I am facing an issue while indexing data via Logstash.

Case: I can index my whole dataset of 150,000 rows from the MySQL server when I use the default document ID. But when I set the document ID myself, built from 3 fields of the table so that it is unique across the entire dataset, some of my data does not get indexed: I end up with a lower count and some records are missing. I also queried my table on the same 3 fields used to create the document_id and could not find any duplicate rows in the database, and Logstash is not reporting any errors.
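
For reference, the pipeline looks roughly like this (a sketch; the connection settings, table, and field names below are placeholders, not the real schema):

```
input {
  jdbc {
    jdbc_connection_string => "jdbc:mysql://localhost:3306/mydb"
    jdbc_user => "user"
    jdbc_password => "password"
    jdbc_driver_library => "/path/to/mysql-connector-j.jar"
    jdbc_driver_class => "com.mysql.cj.jdbc.Driver"
    statement => "SELECT * FROM orders"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "orders"
    # composite document ID built from three columns of the table
    document_id => "%{customer_id}%{order_id}%{line_no}"
  }
}
```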

I have tried everything I can think of to trace why I am losing data: lots of googling and solutions from the internet, but no luck.

Please help me figure out how to trace which records are not being indexed, or what the cause of the error is.

Hi Rajesh,

Loss of data might be due to a bad network or some other issue.
Since you have 150,000 records, it may not be feasible to check which record failed.
I think the only option is to delete the index and reload the data.
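
If you do want to trace it before reloading, one option (a sketch, with hypothetical field names and a local file path) is to build the document ID once in a filter, index with it, and also write every generated ID to a flat file. Any ID that appears more than once in the file marks a row that silently overwrote an earlier document instead of failing with an error:

```
filter {
  # build the composite ID once so the same value is indexed and logged
  mutate {
    add_field => { "[@metadata][doc_id]" => "%{customer_id}-%{order_id}-%{line_no}" }
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "orders"
    document_id => "%{[@metadata][doc_id]}"
  }
  # log each generated ID; repeats can then be found offline with:
  #   sort /tmp/doc_ids.log | uniq -d
  file {
    path => "/tmp/doc_ids.log"
    codec => line { format => "%{[@metadata][doc_id]}" }
  }
}
```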

Regards,
Balu

Thanks for the reply, sir. I have resolved the issue: the document ID I was creating produced duplicate IDs, so later rows overwrote earlier documents instead of being indexed as new ones, which is why the count was lower.
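
For anyone who hits the same thing: concatenating fields without a separator is a common way to get such collisions, since for example key values (1, 23, 4) and (12, 3, 4) both produce the ID "1234". A sketch of a safer approach (field names are hypothetical) using Logstash's fingerprint filter to hash the three fields into one stable ID:

```
filter {
  # hash the three key columns into one collision-resistant ID;
  # concatenate_sources => true makes all three fields feed a single hash
  fingerprint {
    source => ["customer_id", "order_id", "line_no"]
    concatenate_sources => true
    method => "SHA256"
    target => "[@metadata][doc_id]"
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "orders"
    document_id => "%{[@metadata][doc_id]}"
  }
}
```

A plain separator, e.g. `document_id => "%{customer_id}-%{order_id}-%{line_no}"`, also works, as long as none of the fields can themselves contain the separator character.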

Cheers!! Happy you resolved the issue.
