Custom Rules not working

Hello team!

If possible, could you help us with setting up custom rules for SIEM?

We had the following problem:

The main idea is to create a rule that would notify us whenever one of our employees visits the Facebook site.

We'd like to create our own signal rules and are facing issues with even the simplest one.

But when we create the rule, it simply does not fire. Please help us figure it out.

Attaching the exported detection rule and the JSON of a document from one of the watchguard-* indices.

Hi

How did you index the dstname field? Are you using keyword or text? Can you share your mapping?
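For reference, a minimal sketch of what a keyword mapping for dstname might look like (the surrounding mapping body is hypothetical; only the dstname field is taken from this thread):

```json
{
  "mappings": {
    "properties": {
      "dstname": { "type": "keyword" }
    }
  }
}
```

The distinction matters because a keyword field is matched exactly, while a text field is analyzed and tokenized, which can make exact-value KQL queries in detection rules behave unexpectedly.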

Regards

Hi @Yuriy_Tsarenko, welcome to our community!

I want to ensure that you are aware of Elastic Common Schema (ECS).

The Elastic SIEM/Security app, including its detection rules, signals, and detection alerts, requires your data to be indexed in an ECS-compliant format. ECS is an open source, community-developed schema that specifies field names and Elasticsearch data types for each field, and provides descriptions and example usage.

The easiest way to get your data in an ECS-compliant format is to use an Elastic-supplied Beats module (e.g., a Filebeat module) or Elastic Agent integration, which will ingest and index your data in an ECS-compliant format. Elastic provides a growing list of these integrations, which you can find on our Integrations page.

If you're using a custom data ingestion method (beat, Logstash, Ingest node pipeline), or one provided by a third-party, then you may need to convert your data so that it is in an ECS-compliant format before you can use the SIEM/security app. This can be done by creating your own beat/module, or your own Logstash configuration for each data source, which will convert your data to ECS during the ingestion process.

General guidelines for creating ECS-compliant data:

  1. Each indexed document (e.g., your log, event, etc.) MUST have the @timestamp field.
  2. Your index mapping template must specify the Elasticsearch field data type for each field as defined by ECS. For example, your @timestamp field must be of the date field data type. This ensures that there will not be any mapping conflicts in your indices.
  3. The original fields from your log/event SHOULD be copied/renamed/converted to the corresponding ECS-defined field name and data type.
  4. Additional ECS fields, such as the ECS Categorization fields, SHOULD be populated for each log/event, to allow proper inclusion of your data into dashboards and detection rules.
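To illustrate the guidelines above, a minimal ECS-compliant event for a web visit might look like the following (all field values here are made up for illustration; only the field names and types come from ECS):

```json
{
  "@timestamp": "2021-03-01T12:00:00.000Z",
  "event": {
    "kind": "event",
    "category": ["network"],
    "type": ["connection"]
  },
  "source": { "ip": "10.0.0.15" },
  "destination": { "domain": "www.facebook.com" },
  "host": { "name": "employee-laptop-01" }
}
```

Note how the original log's destination hostname is copied into the ECS field destination.domain, and the categorization fields under event.* are populated.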

A list of the specific ECS fields used by the SIEM/Security app is provided in this reference.

I am guessing that your Elasticsearch index mapping for host may not be compliant with ECS. Your document is using host to hold an IP address, but ECS defines host as a field set object with multiple host.* fields defined here.
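In other words, ECS expects host to be mapped as an object along these lines (a sketch showing only a few of the host.* fields), and a flat host field holding an IP address would conflict with it:

```json
{
  "mappings": {
    "properties": {
      "host": {
        "properties": {
          "name": { "type": "keyword" },
          "ip":   { "type": "ip" },
          "hostname": { "type": "keyword" }
        }
      }
    }
  }
}
```

Once a field is mapped one way in an index, conflicting documents are rejected or coerced, so this kind of mismatch is worth ruling out early.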

Sorry for the information dump, but we've found that non-ECS-compliant data is a common root cause for users who experience problems getting their SIEM/Security app rules/signals to work.

Please let us know if this is helpful.


Hello and thank you for the response.

We are aware that ECS is needed for SIEM to work with the predefined rules. But what if we would like to use custom rules?
Following this manual https://www.elastic.co/guide/en/security/current/rules-ui-create.html
"Custom query: Query-based rule, which searches the defined indices and creates an alert when a document matches the rule’s query."
We simply need to run a KQL query against a specific index.
So, as I understand it, we do not need ECS for custom detection rules. Here is what we're doing:

  • define which index to search and add it to Kibana → Stack Management → Advanced Settings → securitysolution:defaultIndex (the index has no relation to ECS)
  • go to Discover and check that our KQL query works
  • create a new detection rule with the tested KQL query, entering the index pattern where we would like to run the query
  • add a schedule and actions
  • save and enable the rule
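For reference, the Facebook rule from the original post could be expressed as a KQL query along these lines (a sketch, assuming the WatchGuard logs store the destination hostname in a field named dstname, as mentioned earlier in the thread):

```
dstname : "www.facebook.com" or dstname : "facebook.com"
```

Whether this matches depends on how dstname is mapped and what exact values the documents contain, which is why sharing the mapping and a sample document helps.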

Please correct me if there is a mistake.
Thank you in advance

You need to use ECS for your mapping when creating custom rules. I had a similar problem too.


Hello, and many thanks for the response.
To be 100% sure I understood everything correctly, please answer the following:
We have a schema for naming fields and giving them mappings (ECS). And SIEM works only with indices that follow ECS.
In other words, if we have a custom field with a name that is not described in ECS, we cannot use it in SIEM at all, not only for predefined rules but for any SIEM analytics?

Can you please share the rule and a sample document as JSON here instead of posting a screenshot, so anyone can try to reproduce the issue?

Hi @Yuriy_Tsarenko

We have a schema for naming fields and giving them mappings (ECS). And SIEM works only with indices that follow ECS.

Correct. The Elastic SIEM/Security App requires that all data must be ECS-compliant.
The app works best and most fully when the data has been fully converted to ECS format.

In other words, if we have a custom field with a name that is not described in ECS, we cannot use it in SIEM at all, not only for predefined rules but for any SIEM analytics?

Not correct. ECS is a permissive schema. If your events have additional data that cannot be mapped to ECS, you can simply add them to your events, using custom field names, and your data can still be ECS-compliant.

However, any custom fields you add must not conflict with ECS-defined fields.
AND if you use any ECS-defined fields, your Elasticsearch field data type mapping for them must not conflict with those specified by ECS.
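For example, a document can carry ECS fields and vendor-specific custom fields side by side (a sketch; the watchguard.* namespace and its field names are assumptions for illustration, not an official integration):

```json
{
  "@timestamp": "2021-03-01T12:00:00.000Z",
  "destination": { "domain": "www.facebook.com" },
  "watchguard": {
    "policy": "HTTP-proxy",
    "disposition": "Allow"
  }
}
```

Nesting custom fields under a vendor-specific namespace like this keeps them from colliding with current or future ECS-defined fields.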

From your screenshot above, it is clear that your data has not been mapped to ECS at all. Further, it appears that your data is NOT ECS-compatible, as it may have a mapping for host that conflicts with the ECS use of host.*, which would cause your rule not to work.

As suggested above, please feel free to attach an actual log sample (please ensure no private information is included), as well as the index mappings for the index into which your WatchGuard logs are being indexed.

Finally, can you tell us what method you are using to ingest your data? e.g., Logstash?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.