I am new to Logstash and Elasticsearch, and I have a question.
I have a CSV file whose data changes every 3 hours. Is it possible to use Logstash to load the fresh data into an Elasticsearch index each time and delete all of the index's old data?
Yes, that is possible. I think the easiest way would be to index into a new index every time you run an import, and have an alias point to the latest index once indexing has finished (the alias switch cannot be done with Logstash itself, however).
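As a rough sketch of that approach: the pipeline below writes each import into a fresh, timestamped index. The file path, column names, host, and index prefix are all assumptions you would replace with your own.

```conf
# Hypothetical Logstash pipeline: each run indexes the CSV into a new,
# time-stamped index instead of overwriting a fixed one.
input {
  file {
    path => "/path/to/data.csv"       # assumed location of your CSV
    start_position => "beginning"
    sincedb_path => "/dev/null"       # re-read the whole file on every run
  }
}
filter {
  csv {
    columns => ["col1", "col2"]       # assumed column names
  }
}
output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "mydata-%{+yyyy.MM.dd.HH}"   # new index per import
  }
}
```

Once indexing has finished, the alias can be switched to the new index atomically with the `_aliases` API (index and alias names here are again just placeholders):

```shell
curl -XPOST 'localhost:9200/_aliases' -H 'Content-Type: application/json' -d '{
  "actions": [
    { "remove": { "index": "mydata-*",             "alias": "mydata" } },
    { "add":    { "index": "mydata-2024.01.01.12", "alias": "mydata" } }
  ]
}'
```

Your queries then always go through the `mydata` alias, so readers never see a half-imported index.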
Hope this helps.
Thank you for your reply.
But in my scenario the index name should be fixed.
Is there any other way?
You would need to delete that index manually before starting Logstash, but then this approach would work as well.
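A minimal sketch of that delete-then-reload cycle, e.g. run from cron every 3 hours (the index name, host, and config path are assumptions):

```shell
# Drop the old index entirely; a 404 here just means it did not exist yet.
curl -XDELETE 'localhost:9200/mydata'

# Re-index the fresh CSV into the same, fixed index name.
logstash -f /path/to/csv-pipeline.conf
```

Note that between the delete and the end of the import, queries against the index will see partial (or no) data, which is exactly the gap the alias approach avoids.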