I want to delete all data from Elasticsearch that is more than one week old. I have tried the command below:
curl -XDELETE 'http://localhost:9200/logstash_2015_09_12'. But the data for that day is still displayed in Kibana. The index pattern on my settings page in Kibana is logstash-*. Am I missing anything?
Also, please suggest the best way to set up a job that removes all data more than one week old, assuming the job runs once a day.
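One way to sketch such a daily job is a small script that computes which daily indices fall outside the retention window and issues a DELETE for each. This assumes the default Logstash daily naming pattern logstash-YYYY.MM.DD; the host, retention, and look-back values here are illustrative, not prescriptive:

```python
from datetime import date, timedelta
from urllib.request import Request, urlopen

def indices_older_than(today, retention_days, pattern="logstash-%Y.%m.%d", max_lookback=30):
    """Return daily index names older than the retention window.

    Scans back up to max_lookback days, assuming one index per UTC day
    named with the default Logstash pattern, e.g. logstash-2015.09.05.
    """
    return [
        (today - timedelta(days=d)).strftime(pattern)
        for d in range(retention_days + 1, max_lookback + 1)
    ]

def delete_indices(host, names):
    # One HTTP DELETE per index; an already-deleted index returns 404,
    # which urlopen raises as an error, so we just log and continue.
    for name in names:
        req = Request("http://%s/%s" % (host, name), method="DELETE")
        try:
            urlopen(req).close()
        except Exception as exc:
            print("skipping %s: %s" % (name, exc))

if __name__ == "__main__":
    stale = indices_older_than(date.today(), retention_days=7)
    delete_indices("localhost:9200", stale)
```

Run once a day from cron. Note the index name uses a hyphen and dots (logstash-2015.09.12), matching the logstash-* pattern, rather than underscores.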
One possible reason you still see data for 12 September is that Logstash creates index datestamps based on UTC time. Depending on your offset from UTC, some or much of what Kibana displays for that date in your local time zone may actually live in an adjacent day's index.
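A toy illustration of that offset effect (the UTC+5:30 viewer here is a hypothetical example): an event a viewer in that zone sees early on the 12th was actually written to the previous day's index, because the index name follows the event's UTC date.

```python
from datetime import datetime, timedelta, timezone

# A viewer in UTC+5:30 (hypothetical offset) sees this event on the 12th:
local_tz = timezone(timedelta(hours=5, minutes=30))
event_local = datetime(2015, 9, 12, 2, 0, tzinfo=local_tz)

# But Logstash names the index after the event's *UTC* date, which is
# still the 11th (2015-09-11 20:30 UTC), so deleting the index for the
# 12th does not remove this event:
event_utc = event_local.astimezone(timezone.utc)
index_name = event_utc.strftime("logstash-%Y.%m.%d")
print(index_name)  # logstash-2015.09.11
```

So to clear everything Kibana shows for a local date, you may need to delete the neighbouring day's index as well.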
I have around eight fields in my queries with ELK: bank name, transaction ID, timestamp, response code, transaction type, notes, transaction channel, and transaction amount. In Kibana I have used filters such as exists:bankname and exists:transactionid in different dashboards.
I also have different pie charts based on transaction channel (for example: mobile, internet, ATM), transaction type, and response code (success/failure).
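For context, the filters and pie charts described above correspond roughly to the query bodies below. This is only a sketch of what Kibana sends on your behalf; the field names (bankname, transactionchannel) are taken from the list above and may not match your actual mapping:

```python
import json

# Kibana's "exists:bankname" filter becomes an exists clause inside a
# bool filter (non-scoring, so it is cacheable):
exists_filter = {
    "query": {"bool": {"filter": [{"exists": {"field": "bankname"}}]}}
}

# Each pie chart is a terms aggregation on the sliced field, e.g.
# transaction channel (mobile / internet / ATM):
channel_pie = {
    "size": 0,  # skip returning hits; only the bucket counts are needed
    "aggs": {"by_channel": {"terms": {"field": "transactionchannel"}}},
}

print(json.dumps(channel_pie, indent=2))
```

Seeing the raw aggregations can help when profiling which dashboard panel is slow.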
Can you please suggest whether I need to tune anything in Elasticsearch so that the dashboards perform well with a large amount of data?