I am using Logstash to read from an Elasticsearch index.
I have to write custom correlation logic in ruby filter code, which accumulates 100 records, does some processing on the batch, and then moves on to the next records.
For some unknown reason, when the event count reaches 1000, the pipeline stops without any errors.
I am using class variables (Ruby's closest equivalent of static variables) to maintain the processed event counts.
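For reference, this is my understanding of the ruby filter's variable lifecycle, as a minimal sketch with placeholder names (not my actual config): init runs once at pipeline startup, code runs once per event, and @@ class variables are shared across all events.

filter {
  ruby {
    # init runs once when the pipeline starts
    init => "@@batch = []; @@seen = 0"
    # code runs once per incoming event
    code => "
      @@seen += 1
      @@batch << event.to_hash
      event.set('seen_so_far', @@seen)  # placeholder field name
    "
  }
}

As far as I know, with the default pipeline.workers the code block runs on several worker threads at once, so a shared @@ array is accessed concurrently unless pipeline.workers is set to 1.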
Sample ruby filter code:
require 'json'

# Total count fetched earlier via the Elasticsearch count API;
# `response` comes from an HTTP call made before this snippet
@total_records = JSON.parse(response.body)["count"]
logger.info("@total_records: #{@total_records}")
@counter ||= 0

# Class variable acting as a shared buffer across events
@@event_array ||= []

# Each incoming event is buffered and counted
# (elided from my original snippet, but implied by the logic below)
@@event_array << event
@counter += 1

if @@event_array.size >= 100 || @counter >= @total_records
  ## business logic on the accumulated batch
  # Clear the array after processing the batch
  @@event_array.clear
else
  # sha256 is computed elsewhere in my filter
  logger.info("event cancelled: counter #{@counter} total #{@total_records} #{sha256}")
  event.cancel
end