We have all our log file entries in a logs-* index and would like to tail it, pushing selected special/error entries into another index, issues-*.
How can we query Elasticsearch for all logs-* entries since time X, where X is the last time we queried?
I imagine we'd need to persist the timestamp of the last query somewhere (e.g. a file) and use it in an ES query.
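A minimal sketch of that idea, using only the standard library: the checkpoint is kept in a local file (the file name, the `@timestamp` field, and the default epoch value are assumptions), and each run builds a range query body you would send to the `_search` API against logs-*.

```python
from datetime import datetime, timezone
from pathlib import Path

# Hypothetical checkpoint file holding the timestamp of the previous run.
CHECKPOINT = Path("last_query_time.txt")

def load_last_query_time(default="1970-01-01T00:00:00Z"):
    """Read the previous run's timestamp, or a default on the first run."""
    return CHECKPOINT.read_text().strip() if CHECKPOINT.exists() else default

def save_last_query_time(ts):
    """Persist the checkpoint for the next run."""
    CHECKPOINT.write_text(ts)

def build_query(since):
    """Range query on @timestamp: everything strictly newer than the checkpoint."""
    return {
        "query": {"range": {"@timestamp": {"gt": since}}},
        "sort": [{"@timestamp": "asc"}],
    }

# On each run: build the body, send it to logs-*/_search with your HTTP
# client of choice, then advance the checkpoint to "now".
since = load_last_query_time()
body = build_query(since)
now = datetime.now(timezone.utc).strftime("%Y-%m-%dT%H:%M:%SZ")
save_last_query_time(now)
```

Using `gt` (not `gte`) avoids re-reading the document that sits exactly on the checkpoint, though with coarse timestamps you may prefer `gte` plus de-duplication by `_id`.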
Of course there are workarounds: we could direct Filebeat to send the log file entries to both logs-* and issues-*, or clone certain events within Logstash and route the clones to issues-*.
That creates a view of the data I want, but it does not copy those docs into a separate index unless I rerun it daily/hourly for the previous day/hour. I really want to copy each document once, and only once, into a different index.
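One way to get a one-time copy rather than a view is the `_reindex` API with the same checkpoint-based range filter, so each pass only touches documents newer than the last run. A sketch of the request body; the `@timestamp` and `log.level` fields, the error criterion, and the destination index name are assumptions (note that `dest.index` must be a concrete index, not a wildcard like issues-*):

```python
def build_reindex_body(since, source_index="logs-*", dest_index="issues-000001"):
    """Body for POST _reindex: copy only docs newer than the checkpoint
    that also match the selection criteria, so each doc is copied once."""
    return {
        "source": {
            "index": source_index,
            "query": {
                "bool": {
                    "filter": [
                        {"range": {"@timestamp": {"gt": since}}},
                        # Hypothetical criterion for "special/error" entries.
                        {"term": {"log.level": "error"}},
                    ]
                }
            },
        },
        "dest": {"index": dest_index},
    }
```

After each successful `_reindex` call you would advance the persisted checkpoint, exactly as with the search-based approach.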