I recently tried to use the severity overrides functionality for a SIEM rule, but it doesn't seem to work as expected. While trying to find out what's going wrong, I'd like to point out several things:
When editing a duplicated SIEM rule, the severity and the risk score are reset every time to 'Low' and '50'. This is very annoying and has resulted in wrong configurations several times. The configured severity and risk score should be saved and preserved.
My goal was to lower the severity when DNS to the Internet is detected and the outgoing DNS attempt has been denied on one of our firewalls (panw).
For your second issue, I think you have stumbled onto something interesting: you are putting your event.type of denied into an array and using that as part of your override checks. It looks like severity override and risk score override do not work with arrays at the moment; they only work with regular scalar values. But I do think we want array support as a feature, since parts of ECS strongly encourage arrays and out-of-the-box searching works with arrays too.
For now, though, those overrides only work with regular scalar values. When we make them work within arrays, we will also have to solve the extra interesting problems, such as duplicates in the array, or even two different mapping matches within the same array (we would choose the higher severity if more than one is found).
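To make the difference concrete, here is a minimal Python sketch of the behaviour described above. The matching logic is an assumption for illustration only, not Kibana's actual implementation; the field value and mapping mirror the event.type of denied case from this thread.

```python
# Hypothetical sketch: why a scalar-only override check misses array values,
# and how an array-aware check (highest severity wins) could behave.
# This is NOT the real Kibana code, just an illustration of the idea above.

SEVERITY_ORDER = {"low": 0, "medium": 1, "high": 2, "critical": 3}

def match_override_scalar(event_value, mapping):
    """Scalar-only matching: an array value never equals a mapped scalar."""
    for value, severity in mapping:
        if event_value == value:  # a list never equals a plain string
            return severity
    return None

def match_override_array(event_value, mapping):
    """Array-aware matching: check each element, keep the highest severity."""
    values = event_value if isinstance(event_value, list) else [event_value]
    matched = [sev for val, sev in mapping for v in values if v == val]
    return max(matched, key=SEVERITY_ORDER.get, default=None)

mapping = [("denied", "low"), ("allowed", "high")]

# A scalar field value matches either way.
assert match_override_scalar("denied", mapping) == "low"
# An array value (as seen in the panw data) silently matches nothing
# under scalar-only checking, so the override never fires.
assert match_override_scalar(["denied", "denied"], mapping) is None
# Array-aware matching tolerates duplicates and, with two different
# matches in the same array, picks the higher severity.
assert match_override_array(["denied", "denied"], mapping) == "low"
assert match_override_array(["denied", "allowed"], mapping) == "high"
```

This also shows why duplicate values in the array are harmless once matching is array-aware: they simply produce the same severity twice.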
I wrote this up here if you want to follow the progress of it:
Thanks @Frank_Hassanabad That was more or less what I suspected. So one issue is left: there shouldn't be two denied values in event.type in panw data. Shall I create a GH issue for that?
Grtz
Yeah, I would. That sounds like it might be a Beats issue? I'm not familiar with the panw data set. But if you create it, post it here for any passers-by, and I will also socialize it in a few other spots to see if we can figure out whether there is a duplicate issue or whether it is hopefully solved in the upcoming release.