How to "tail" an ElasticSearch index in LogStash

We have all our log file entries in a logs-* index and would like to tail it, picking out certain special/error entries and pushing them into another index, issues-*.

  • How can we query Elasticsearch for everything in logs-* since time X, where X is the last time we queried?

I imagine we'd need to persist the time we last queried somewhere (e.g. in a file) and use that timestamp in an ES query.
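Something like the following is roughly what I have in mind (just a sketch, assuming a standard @timestamp field; the timestamp value would be read from the persisted file):

GET logs-*/_search
{
    "size": 1000,
    "query": {
        "range": {
            "@timestamp": { "gt": "2024-05-01T10:00:00Z" }
        }
    },
    "sort": [
        { "@timestamp": "asc" }
    ]
}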

Of course there are workarounds: we could have Filebeat send the log file entries to both logs-* and issues-*. We could also clone certain events we are sending to logs-* within Logstash and send those to issues-*.
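In Logstash that clone workaround could be as simple as a conditional second output; a sketch, assuming the level ends up in a [log][level] field and a local Elasticsearch (with conditional outputs a clone filter isn't even strictly needed):

output {
    # every event goes to the daily logs-* index
    elasticsearch {
        hosts => ["http://localhost:9200"]
        index => "logs-%{+YYYY.MM.dd}"
    }
    # error events are additionally written to issues-*
    if [log][level] == "ERROR" {
        elasticsearch {
            hosts => ["http://localhost:9200"]
            index => "issues-%{+YYYY.MM.dd}"
        }
    }
}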

I just feel there should be a way to tail ES.

Hi Marc,

I think you could also use a filtered alias: Aliases API | Elasticsearch Guide [8.11] | Elastic

POST /_aliases
{
    "actions" : [
        {
            "add" : {
                 "index" : "logs-*",
                 "alias" : "issues",
                 "filter" : { "term" : { "log.level" : "ERROR" } }
            }
        }
    ]
}

Then you could query this alias as:
GET issues/_search

Best regards
Wolfram

That creates a view of the data I want, but it does not let me copy those docs into a separate index unless I do it daily/hourly for the previous day/hour. I really want to copy documents once and only once into a different index.
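For reference, I assume that periodic copy would boil down to something like a scheduled _reindex with a time window (the field names, window, and destination index here are only an example):

POST _reindex
{
    "source": {
        "index": "logs-*",
        "query": {
            "bool": {
                "filter": [
                    { "term": { "log.level": "ERROR" } },
                    { "range": { "@timestamp": { "gte": "now-1h" } } }
                ]
            }
        }
    },
    "dest": { "index": "issues-000001" }
}

But then I have to schedule it myself and make sure the windows never overlap or skip documents, which is exactly the bookkeeping I was hoping to avoid.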
