Skip reading historical data in Logstash while parsing logs

As the grok parsing failed, @timestamp could not be updated from the data in the log entry, which is why you see the default value (the time the event was processed).

I would also recommend moving the date and age filters to just after the grok filter. That way you can drop old events before they go through the geoip and useragent processing, as in the sketch below.
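A minimal sketch of that ordering (the ELB grok pattern, the field names, and the one-day threshold are assumptions; substitute whatever your actual pipeline uses):

```
filter {
  grok {
    # Parse the raw ELB access log line; ELB_ACCESS_LOG is the stock
    # pattern shipped with Logstash, swap in your own if you use a custom one
    match => { "message" => "%{ELB_ACCESS_LOG}" }
  }

  # Set @timestamp from the timestamp captured by grok, right after parsing
  date {
    match => [ "timestamp", "ISO8601" ]
  }

  # Compute the event age in seconds (stored in [@metadata][age] by default)
  # and drop anything older than one day before the expensive filters run
  age {}
  if [@metadata][age] > 86400 {
    drop {}
  }

  # Heavier enrichment only runs on events that survived the drop
  geoip {
    source => "clientip"
  }
  useragent {
    source => "agent"
  }
}
```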


OK, I got it, but why is it reading data that is three months old?

{
         "token" => "HFboNZI168vuLmSL1521536284629",
      "@version" => "1",
          "tags" => [
        [0] "_grokparsefailure",
        [1] "_geoip_lookup_failure"
    ],
          "type" => "elb_access",
       "message" => "2018-01-22T14:22:55.661636Z awseb-e-m-AWSEBLoa-1EJARVYCNV2RH x.x.x.x:57784 - -1 -1 -1 504 0 0 0 \"GET https://xxxxxxx:443/ HTTP/1.1\" \"Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.100 Safari/537.36\" ECDHE-RSA-AES128-GCM-SHA256 TLSv1.2\n",
    "@timestamp" => 2018-04-24T11:43:27.472Z
}

Please see the message field; this event is from January.

That is probably because the s3 input plugin reads all files in the bucket, which likely contains older data. I have not used the s3 plugin myself, but it looks like you can exclude files from being read using the exclude_pattern option, along the lines of the example below.
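Something like this might work (the bucket name, region, and the date regex are placeholders; exclude_pattern is a regex matched against the S3 object keys, so adjust it to your bucket's naming scheme):

```
input {
  s3 {
    bucket => "my-elb-logs"    # placeholder bucket name
    region => "us-east-1"      # placeholder region
    # Skip any object whose key matches this regex, e.g. files from
    # January through March 2018
    exclude_pattern => "2018-0[1-3]"
  }
}
```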


OK, thank you. I will try it.

@Christian_Dahlqvist

Thanks for your help and time. I solved the issues, and the age filter is now working great :smiley:
