Scaling with many custom log patterns

Hi Community,

I'm wondering how we would write a filter for logs from a firewall like Cisco ASA, which has thousands of distinct message types.
Currently the Filebeat module for Cisco supports about 150 Cisco ASA log IDs.
If, for example, we wanted to parse all Cisco ASA log messages (all 2290 message IDs) in Logstash, how would this work?
Do we handle them one after another with dissect and grok, or is there another solution?
Or maybe I'm on the wrong track and it's not that complicated?
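To make the question concrete, is the idea something like the sketch below? This is just a hypothetical pipeline fragment, assuming the stock `CISCOFW106023` pattern that ships with logstash-patterns-core; the field names under `[cisco][asa]` are my own choice for illustration:

```
filter {
  # First pass: extract only the severity and message ID from the
  # syslog header, e.g. "%ASA-6-106023: ..."
  grok {
    match => { "message" => "%ASA-%{INT:[cisco][asa][severity]}-%{INT:[cisco][asa][message_id]}:" }
  }

  # Then branch per message ID and apply a dedicated grok pattern.
  # With 2290 IDs this would mean thousands of conditionals like this one:
  if [cisco][asa][message_id] == "106023" {
    grok {
      match => { "message" => "%{CISCOFW106023}" }
    }
  }
}
```

If so, my worry is that this does not scale: one conditional plus one grok per message ID seems hard to maintain and potentially slow.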