How to process logs from a certain period of time? (I use the elasticsearch input plugin)

In my pipeline I receive logs from a server with the elasticsearch input plugin, and I want to process not all of the logs, but for example only the last 2 months, or, preferably, from a certain date to a certain date.
The logs already have a separate @timestamp field when I receive them, so I tried to use this:

input {
    elasticsearch {
        hosts =>
        index =>
        ...and so on...
    }
}

filter {
    if [@timestamp] >= "May 31, 2021 @ 23:59:56.672" {
      # grok and other stuff.
    }
}

but this doesn't work.
I know about ignore_older in the file input plugin, but that doesn't work for me.
I couldn't find anything with Google either.
If you could give me some advice or suggestions, that would be awesome.
Thank you.

There is an example of testing the age of an event here.

Thank you, I tried that, but the pipeline still processes all the logs; I'm probably doing something wrong.
I tried to parse only the last 2 days' logs with this:

input {
    elasticsearch {
        hosts =>
        index =>
        ...and so on...
    }
}

filter {
    ruby { code => 'event.set("[@metadata][age]", Time.now.to_f - event.timestamp.to_f)' }
    mutate { convert => { "[@metadata][age]" => "integer" } }
    if 86400 < [@metadata][age] {
        # grok and other conditions.
    }
}
The events will still go through the pipeline unless you drop them. That condition means events will only go through the grok filter if they are more than one day old. Perhaps what you want is

if [@metadata][age] > 172800 { drop {} }    # 172800 seconds = 2 days
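
For the from-a-certain-date-to-a-certain-date case in the original question, a similar ruby filter can compare the event timestamp against fixed bounds and drop everything outside the window. A minimal sketch; the dates and the [@metadata][in_range] field name are illustrative, not anything standard:

filter {
    ruby {
        # Mark whether the event falls inside an illustrative window
        # (2021-04-01 up to, but not including, 2021-06-01, UTC).
        code => '
            t = event.timestamp.to_f
            event.set("[@metadata][in_range]",
                t >= Time.utc(2021, 4, 1).to_f && t < Time.utc(2021, 6, 1).to_f)
        '
    }
    if ![@metadata][in_range] { drop {} }
}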

Thank you, it works. I thought that if I skipped some logs, logstash would "jump" to the logs I want and the whole process would be faster, but it looks like it's still more or less the same speed;
even though it drops logs from a certain period, it still checks every event, and that takes a lot of time when there are a lot of logs.
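
That is expected with a filter-side drop: every document is still fetched from Elasticsearch and run through the pipeline before it is dropped. To avoid pulling the unwanted documents at all, the elasticsearch input plugin accepts a query option, so the time range can be applied server-side. A minimal sketch, with placeholder hosts and index:

input {
    elasticsearch {
        hosts => ["localhost:9200"]
        index => "my-index-*"
        # Only fetch documents whose @timestamp is within the last 2 days;
        # a fixed window with explicit "gte"/"lte" dates works the same way.
        query => '{ "query": { "range": { "@timestamp": { "gte": "now-2d" } } } }'
    }
}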
