Hi. I have an issue where duplicate data is sent to the target through filebeat > logstash > target.
I couldn't figure out what the issue was on the server, because it was terminated by AWS autoscaling. So I'm thinking of adding a filter that compares a field in the event, record_date, to @timestamp, and if it's older than 30 minutes, drops the event so it won't be sent to the target.
Is this something that needs to be done with Ruby code, or is it as simple as an "if" conditional?
Hi,
Can you share your config, Logstash version, and operating system? Having more info on your architecture would also help narrow down where the issue might be.
My filebeat data file: {"function":"log","newRecordData":{"record_time":"2016-06-06 11:53:07","name": "test"}}
I was thinking there might be a way to check "record_time" and, if it's older than 30 minutes, ignore the line.
I've seen examples from others using a filter along with Ruby code, but their examples just add a tag, and their Ruby code handles a different case. Maybe what I'm trying to do is as simple as an "if" in the output.
I'm wondering if there's something like this?
if [function] == "log" && event['newRecordData']['record_time'] < [current_time - 30mins]
You need a ruby filter for the timestamp math. Parse the record_time field into a DateTime object and subtract it from DateTime.now. I believe the result is the time difference expressed as a fraction of a day, so multiply that by 24*60 and you'll get the difference in minutes.
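A minimal sketch of that math in plain Ruby, assuming record_time uses the "YYYY-MM-DD HH:MM:SS" format from your sample event (the stale? helper name and the 30-minute threshold are just illustrative):

```ruby
require 'date'

# Decide whether an event's record_time is older than max_age_minutes.
def stale?(record_time_str, now: DateTime.now, max_age_minutes: 30)
  record_time = DateTime.parse(record_time_str)
  # Subtracting two DateTimes yields the difference as a fraction of a day;
  # multiplying by 24 * 60 converts that fraction to minutes.
  age_minutes = (now - record_time) * 24 * 60
  age_minutes > max_age_minutes
end
```

Inside Logstash this logic would go in a ruby filter's code option, calling event.cancel to drop stale events. Note that how you read the field from the event (event['newRecordData']['record_time'] vs. event.get('[newRecordData][record_time]')) depends on your Logstash version, so check the Event API docs for the release you're running.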