Detect Horizontal Port Scan

Hello everyone. From the logs that I have stored in Elasticsearch from a firewall, I need to detect a type of attack called a "horizontal port scan", which is defined as follows:

A single source IP address that connects to N different destination IP addresses, all on the same destination port, within a specified time window.

Source IP ----> N Destinations ---> Same Port
|<---------------------- 2 hours ---------------------->|

The question can be formulated as follows:

Which source IP address has connected to 20 different destination IP addresses on the same destination port in the last 2 hours?

The names of the fields are:

srcip (source IP, type ip), dstip (destination IP, type ip), and port (destination port, type integer)
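For reference, this question can be expressed directly as an Elasticsearch aggregation: filter the last 2 hours, group by source IP and port, and keep only buckets where the distinct-destination count reaches 20. This is only a sketch — the index pattern `firewall-*` is an assumption, and the `cardinality` aggregation is approximate by design:

```json
GET firewall-*/_search
{
  "size": 0,
  "query": {
    "range": { "@timestamp": { "gte": "now-2h" } }
  },
  "aggs": {
    "by_source": {
      "terms": { "field": "srcip", "size": 1000 },
      "aggs": {
        "by_port": {
          "terms": { "field": "port", "size": 100 },
          "aggs": {
            "unique_destinations": { "cardinality": { "field": "dstip" } },
            "min_20_destinations": {
              "bucket_selector": {
                "buckets_path": { "dsts": "unique_destinations" },
                "script": "params.dsts >= 20"
              }
            }
          }
        }
      }
    }
  }
}
```

Any srcip/port bucket that survives the `bucket_selector` is a candidate horizontal port scan.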

Thank you so much

Hi @pmorenosi, welcome to our Community!

Glad to see you are trying out the detection rules within the Elastic SIEM/Security solution.

Source IP ----> N Destinations ---> Same Port
|<---------------------- 2 hours ---------------------->|

The question can be formulated as follows:
Which source IP address has connected to 20 different destination IP addresses on the same destination port in the last 2 hours?

One idea is to create your own rule using the Threshold rule type, available as of version 7.12. It has the capabilities to do exactly what you're looking for.
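As a sketch of what such a Threshold rule could look like via the Kibana detection rules API, here are the relevant fields (the index pattern, query, and rule metadata are placeholder assumptions — adjust them to your environment):

```json
{
  "name": "Horizontal port scan (sketch)",
  "type": "threshold",
  "index": ["logs-*"],
  "language": "kuery",
  "query": "source.ip: * and destination.ip: * and destination.port: *",
  "threshold": {
    "field": ["source.ip", "destination.port"],
    "value": 1,
    "cardinality": [
      { "field": "destination.ip", "value": 20 }
    ]
  },
  "from": "now-2h",
  "interval": "1h",
  "risk_score": 47,
  "severity": "medium",
  "description": "Single source IP hitting 20+ destinations on the same port within 2 hours."
}
```

The `threshold.field` array groups events by source IP and destination port, and the `cardinality` clause fires only when that group contains at least 20 distinct destination IPs within the rule's lookback window.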

The names of the fields are:
srcip (source IP, type ip), dstip (destination IP, type ip), and port (destination port, type integer)

You'll notice that my example above used different field names than yours. The fields I used are defined by Elastic Common Schema (ECS).

The Elastic SIEM/Security app, including its detection rules, signals, and detection alerts, requires your data to be indexed in an ECS-compliant format. ECS is an open source, community-developed schema that specifies field names and Elasticsearch data types for each field, and provides descriptions and example usage.

The easiest way to get your data into an ECS-compliant format is to use an Elastic-supplied Beats module (e.g., a Filebeat module) or Elastic Agent integration, which will ingest and index your data in an ECS-compliant format. Elastic provides a growing list of these integrations that you can find on our Integrations page.

What kind of firewall logs are you working with? There are integrations already created for a number of firewalls, such as Barracuda, Cisco, Check Point, Palo Alto, and more.

General guidelines for creating ECS-compliant data:

  1. Each indexed document (e.g., your log, event, etc.) MUST have the @timestamp field.
  2. Your index mapping template must specify the Elasticsearch field data type for each field as defined by ECS. For example, your @timestamp field must be of the date field data type. This ensures that there will not be any mapping conflicts in your indices.
  3. The original fields from your log/event SHOULD be copied/renamed/converted to the corresponding ECS-defined field name and data type.
  4. Additional ECS fields, such as the ECS Categorization fields SHOULD be populated for each log/event, to allow proper inclusion of your data into dashboards and detection rules.
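
To make the guidelines above concrete, here is a sketch of an Elasticsearch ingest pipeline that renames your custom firewall fields to their ECS equivalents and populates the ECS categorization fields. The pipeline name and categorization values are illustrative assumptions:

```json
PUT _ingest/pipeline/firewall-ecs
{
  "description": "Rename custom firewall fields to ECS (sketch)",
  "processors": [
    { "rename": { "field": "srcip", "target_field": "source.ip" } },
    { "rename": { "field": "dstip", "target_field": "destination.ip" } },
    { "rename": { "field": "port",  "target_field": "destination.port" } },
    { "set": { "field": "event.category", "value": ["network"] } },
    { "set": { "field": "event.type", "value": ["connection"] } },
    { "set": { "field": "event.kind", "value": "event" } }
  ]
}
```

You would then reference this pipeline at ingest time (e.g., via the index's `index.default_pipeline` setting), so documents land in Elasticsearch already ECS-compliant.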

Here's a simple graphic that I created to help get this point across.

Please let us know if this helps.

