Hello Support team,
I have a query regarding uploading CSV files into Elasticsearch using Logstash.
Following this tutorial https://www.bmc.com/blogs/elasticsearch-load-csv-logstash/
I was able to import our CSV files into Elasticsearch successfully. However, I have noticed that if I rerun the Logstash command, the same CSV data is uploaded again, creating duplicate entries. Is there any way to skip data that already exists in Elasticsearch?
Is deleting the file after each upload completes the only possible way? Please suggest the best approach.
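For context, my pipeline config follows the pattern from that blog post, roughly like this (the file path, column names, and index name below are placeholders, not my actual values):

```conf
input {
  file {
    path => "/path/to/data.csv"
    start_position => "beginning"
    # As in the tutorial, sincedb_path is /dev/null, so Logstash does not
    # remember what it already read and re-ingests the whole file each run.
    sincedb_path => "/dev/null"
  }
}
filter {
  csv {
    separator => ","
    columns => ["col1", "col2", "col3"]
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "my-csv-index"
  }
}
```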