Set Elastic Security rules on syslog


I'm having trouble displaying syslog events in Kibana.

I have a Stormshield firewall sending syslog events from different hosts to my Elastic server:

2628  2703 3.429885364 →   Syslog 806 USER.INFO: 1 2021-10-29T16:33:30+02:00 TADEN asqd - - - \357\273\277id=firewall time="2021-10-29 16:33:30" fw="TADEN" tz=+0200 startime="2021-10-29 16:33:29" pri=5 confid=01 slotlevel=2 ruleid=55 rulename="17b8311aa81_10e" srcif="Ethernet5" srcifname="DMZ-OPENVPN" ipproto=tcp dstif="Ethernet0" dstifname="internet" proto=ssl src= srcport=61564 srcportname=port-tcp-sup srcname=P-WINSAV srcmac=00:ee:69:0e:e2:37 dst= dstport=443 dstportname=https dstcontinent="eu" dstcountry="fr" dstiprep="microsoftauth,o365common,office365,officeonline" modsrc= modsrcport=23064 origdst= origdstport=443 ipv=4 sent=1350 rcvd=7445 duration=0.27 action=pass logtype="connection"

The logs are successfully retrieved and displayed in Kibana (Observability > Logs > Stream).

But I can't figure out how to make them appear among the alerts in the Elastic Security tab.

My questions are:

  1. How can I configure a syslog detection rule so that these events appear in the detected alerts?

  2. How can I add the hosts monitored by my Stormshield firewall to the hosts in Kibana?

They already have all the services (Filebeat, Packetbeat) running, but they are not indexed by Elastic, so my guess is that I need to index them; I can't find how to do that.

Having a cluster with all my hosts monitored by Stormshield would also be really good.

For example, I would like the IP address to be part of the hosts.

Thank you very much for your help!

There are a couple of things needed for logs to appear in the SIEM. First, the index the logs are in must be included in the SIEM configuration. To check that, go to Stack Management > Kibana Advanced Settings and search for "siem". Then, I believe, the other thing is that each document must contain the event.module and event.dataset fields.
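As a sketch of that first point: the advanced setting is a comma-separated list of index patterns that the Security app reads from (it is named `securitySolution:defaultIndex` in recent Kibana versions, `siem:defaultIndex` in older ones). Adding your own syslog index pattern to the list is what makes those documents visible to the app; the last pattern below is a placeholder for whatever index your Stormshield logs land in:

```
# Stack Management > Advanced Settings > securitySolution:defaultIndex
# (hypothetical example value; "stormshield-syslog-*" is a placeholder)
apm-*-transaction*, auditbeat-*, endgame-*, filebeat-*, logs-*, packetbeat-*, winlogbeat-*, stormshield-syslog-*
```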

Hi @Limoelou, thanks for your post.

I want to ensure that you are aware of Elastic Common Schema (ECS).

The Elastic SIEM/Security app, including its detection rules, signals, and detection alerts, requires your data to be indexed in an ECS-compliant format. ECS is an open source, community-developed schema that specifies field names and Elasticsearch data types for each field, and provides descriptions and example usage.

The easiest way to get your data into an ECS-compliant format is to use an Elastic-supplied Beats module (e.g., Filebeat) or Elastic Agent integration, which will ingest and index your data in an ECS-compliant format. Elastic provides a growing list of these integrations that you can find on our Integrations page.

We don't seem to have an integration with Stormshield, so you may need to convert your data to an ECS-compliant format before you can use the SIEM/Security app. This can be done by creating your own Beat, Logstash pipeline, or Elasticsearch ingest node pipeline, which will convert your data to ECS during the ingestion process. You can check out an experimental tool called ECS Mapper to help you create these.
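To make the conversion concrete, here is a minimal Python sketch of the kind of key=value parsing and renaming such a pipeline would do for the Stormshield log format shown above. The field mapping is a hypothetical starting point (only a handful of ECS equivalents are filled in; extend it for the full format), and unmapped keys are parked under a made-up `stormshield.*` namespace:

```python
import re

# Hypothetical Stormshield-to-ECS field mapping; extend as needed.
STORMSHIELD_TO_ECS = {
    "src": "source.ip",
    "srcport": "source.port",
    "srcmac": "source.mac",
    "dst": "destination.ip",
    "dstport": "destination.port",
    "ipproto": "network.transport",
    "fw": "observer.name",
    "action": "event.action",
}

# Matches key=value and key="quoted value" pairs.
KV_RE = re.compile(r'(\w+)=("([^"]*)"|\S*)')

def stormshield_to_ecs(message: str) -> dict:
    """Parse key=value pairs and rename known keys to ECS field names."""
    doc = {}
    for key, raw, quoted in KV_RE.findall(message):
        value = quoted if raw.startswith('"') else raw
        if not value:  # skip empty fields such as src=
            continue
        ecs_key = STORMSHIELD_TO_ECS.get(key, f"stormshield.{key}")
        doc[ecs_key] = value
    return doc

sample = 'id=firewall fw="TADEN" ipproto=tcp src= srcport=61564 action=pass'
print(stormshield_to_ecs(sample))
```

In a real deployment the same logic would live in an ingest pipeline (e.g., a `kv` processor followed by `rename` processors) rather than in application code.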

General guidelines for creating ECS-compliant data:

  1. Each indexed document (e.g., your log, event, etc.) MUST have the @timestamp field.
  2. Your index mapping template must specify the Elasticsearch field data type for each field as defined by ECS. For example, your @timestamp field must be of the date field data type. This ensures that there will not be any mapping conflicts in your indices.
  3. The original fields from your log/event SHOULD be copied/renamed/converted to the corresponding ECS-defined field name and data type.
  4. Additional ECS fields, such as the ECS Categorization fields SHOULD be populated for each log/event, to allow proper inclusion of your data into dashboards and detection rules.
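As an illustration of points 1 and 2, a minimal index template for such an index could pin the ECS data types like this (the template name and index pattern are placeholders, and the field list is deliberately incomplete):

```json
PUT _index_template/stormshield-syslog
{
  "index_patterns": ["stormshield-syslog-*"],
  "template": {
    "mappings": {
      "properties": {
        "@timestamp":  { "type": "date" },
        "source":      { "properties": { "ip":   { "type": "ip" },
                                         "port": { "type": "long" } } },
        "destination": { "properties": { "ip":   { "type": "ip" },
                                         "port": { "type": "long" } } },
        "event":       { "properties": { "action":   { "type": "keyword" },
                                         "category": { "type": "keyword" },
                                         "kind":     { "type": "keyword" } } }
      }
    }
  }
}
```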

Here's a simple graphic that I created to help get this point across.

Hope this helps!

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.