Field-based filter


I'm trying to use Logstash to filter events based on event ID. The event log format looks like this:

"field_name":"field_value", "field2_name" : "field2_value", "event_id":1234,.........

There may be multiple events, and I need to filter a few of them based on a field value.

I'm expecting a conditional like this:

if [event_id] == 1234 {
  aggregate { ... }
}

Is this possible without grok? I find grok patterns difficult to write because the events may come in multiple formats, so I thought a straightforward field check would be simpler.

Will this be possible?


If your events are JSON you could use a json filter to parse them, and then a conditional like that would work. Can you provide more details?
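For example, a minimal pipeline sketch along those lines (the field name follows the sample event above; the aggregate settings are placeholders, not from this thread):

```conf
filter {
  # Parse the whole message as JSON so each key becomes an event field.
  json {
    source => "message"
  }

  # Once event_id is a real field, a plain conditional works without grok.
  if [event_id] == 1234 {
    aggregate {
      task_id => "%{event_id}"    # placeholder task key
      code    => "map['count'] ||= 0; map['count'] += 1"
      timeout => 120
    }
  }
}
```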

Thanks for the reply.

I'll give JSON a try. What if my events are in CSV format?

Try this
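For CSV input, a csv filter sketch along these lines should work (the column names and order here are assumptions; adjust them to your actual layout):

```conf
filter {
  csv {
    separator => ","
    columns   => ["field_name", "field2_name", "event_id"]  # assumed column order
  }

  # csv parses everything as strings, so convert before a numeric comparison.
  mutate {
    convert => { "event_id" => "integer" }
  }

  if [event_id] == 1234 {
    # ... aggregate or other filters ...
  }
}
```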

Thanks. I was able to handle this with the kv plugin. In my case I created a nested field in Logstash, similar to this thread: Best Way to create nested field
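For reference, a kv filter sketch for key/value events like the sample above (the target name is an assumption; it nests every parsed key under that field):

```conf
filter {
  kv {
    source      => "message"
    field_split => ","       # pairs are separated by commas
    value_split => ":"       # "key":"value" pairs
    trim_key    => "\" "     # strip quotes and spaces around keys
    trim_value  => "\" "     # strip quotes and spaces around values
    target      => "parsed"  # assumed name; creates the nested field
  }
}
```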

Now I need to show this nested data in Kibana. As far as I can tell, Kibana 5.6.13 doesn't support nested fields. Earlier I used copy_to through Java code, and I need the same copy_to approach in Logstash. Put simply: how can I create copy_to-style fields in Logstash?
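A rough equivalent of copy_to inside Logstash is the mutate filter's copy option, which duplicates one field into another; a minimal sketch with assumed field names (check that your Logstash version's mutate filter supports copy):

```conf
filter {
  mutate {
    # Copy a nested field into a flat, top-level field so Kibana can use it,
    # similar to Elasticsearch's copy_to. Field names here are examples only.
    copy => { "[parsed][event_id]" => "event_id_flat" }
  }
}
```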

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.