Logstash Drop Filter

Hello guys,
I am new to Logstash and was trying to drop a few events that do not match a criterion. Logstash does drop those events, but it still keeps sending an acknowledgement about each dropped event, which is annoying. Can I drop these events without any of these messages reaching Kafka?

I am attaching my conf file below:

input {
  beats {
    port => 6001
  }
}

filter {
  kv {
    source => "message"
    field_split => ","
    value_split => ":"
    remove_char_key => "[]"
    remove_char_value => "[]"
  }

  mutate {
    convert => {
      "Severity" => "integer"
      "Class" => "integer"
      "ExpireTime" => "integer"
      "Tally" => "integer"
    }
  }

  if ([Manager] == "ConnectionWatch" or [Manager] == "OMNIbus Self Monitoring @AGG_P") {
    drop {}
  }
}

output {
  kafka {
    codec => json
    bootstrap_servers => "10.21.0.110:9092,10.21.0.110:9093"
    topic_id => "new_data"
  }
}
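As a side note, the drop conditional above can also be written with the `in` operator against an array of values, which is easier to extend if you need to drop events from more Managers later. This is a sketch of the same logic, assuming the `[Manager]` field has already been populated by the kv filter (so the conditional must stay after kv in the filter block):

```
filter {
  # same effect as the two == comparisons, expressed as array membership
  if [Manager] in ["ConnectionWatch", "OMNIbus Self Monitoring @AGG_P"] {
    drop {}
  }
}
```

Either form cancels the event entirely, so a dropped event itself never reaches the kafka output.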

Logstash does drop those events, but it still keeps sending an acknowledgement about each dropped event, which is annoying.

Exactly what do you mean by that?

I mean that when I use a filter to drop an event in Logstash, so that it does not go to Kafka when certain conditions match, it is indeed dropping the event at the Logstash level itself. But for the data it has dropped, Logstash sends an acknowledgement message to Kafka saying that it dropped some events due to the filters. I don't want Logstash to send those acknowledgement messages.

Please save everyone's time by just pasting what it is that you don't want to see. Hard evidence is always more useful than descriptions.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.