Hi, it's unclear to me whether this data is accumulated since Filebeat started or whether it's a real-time snapshot. Will the metrics be cleared if Filebeat is restarted? Could you point me to the code that generates these metrics?
I'm trying to check whether there are any failed or dropped events in Filebeat by looking at output.events.dropped. If Filebeat is restarted, will the number of dropped events be reset? I'm asking because I want to run a monitoring script against Filebeat (a rough sketch of the check I have in mind is at the end of this post) and I don't want it to keep failing on this condition.
By the way, is it even possible to have dropped or failed events when the output is Logstash? Doesn't Filebeat guarantee at-least-once delivery?
Thanks in advance!
"output": {
"events": {
"acked": 0,
"active": 0,
"batches": 0,
"dropped": 0,
"duplicates": 0,
"failed": 0,
"total": 0
},
"read": {
"bytes": 0,
"errors": 0
},
"type": "logstash",
"write": {
"bytes": 0,
"errors": 0
}
},
"pipeline": {
"clients": 1,
"events": {
"active": 944,
"dropped": 0,
"failed": 0,
"filtered": 2,
"published": 944,
"retry": 0,
"total": 946
},
"queue": {
"acked": 0
}
}
},
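For context, this is roughly the check I'm planning to run. It's only a sketch: it assumes the HTTP monitoring endpoint is enabled (http.enabled: true in filebeat.yml, default port 5066) and that the snippet above sits under the "libbeat" key in the /stats response.

import json
import sys
import urllib.request

# Assumed setup in filebeat.yml (not shown in my config above):
#   http.enabled: true
#   http.host: localhost
#   http.port: 5066
STATS_URL = "http://localhost:5066/stats"


def main():
    # Fetch the current metrics snapshot from Filebeat's stats endpoint.
    with urllib.request.urlopen(STATS_URL, timeout=5) as resp:
        stats = json.load(resp)

    # Assumption: the "output" block I pasted above lives under "libbeat".
    output_events = stats["libbeat"]["output"]["events"]
    dropped = output_events.get("dropped", 0)
    failed = output_events.get("failed", 0)

    # Fail the check if any events were dropped or failed.
    if dropped or failed:
        print(f"ALERT: dropped={dropped} failed={failed}")
        sys.exit(1)

    print(f"OK: acked={output_events.get('acked', 0)}")
    sys.exit(0)


if __name__ == "__main__":
    main()

My worry is exactly the counter semantics here: if these values are cumulative since startup, a single past drop would make this check fail forever, but if they reset on restart, I could miss drops that happened before a restart.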