Elastic Security Prebuilt Rules Error

Hello everyone, I am facing an issue with all prebuilt rules in Elastic Security: when I enable the rules, they fail with the following error.

An error occurred during rule execution: message: "verification_exception
	Root causes:
		verification_exception: Found 5 problems
line 1:1: Unknown column [event.category]
line 1:45: Unknown column [event.type]
line 2:3: Unknown column [process.name]
line 3:7: Unknown column [process.parent.name]
line 4:4: Unknown column [process.args]"

When I check my logs, I can't find these column names. "process.name" does not exist in my logs (collected via winlogbeat).
Even if I find a way to map these fields, some events won't have them; for example, a "process terminated" event doesn't include a process name. I also don't know how to do the mapping.
How should I solve this?

Hi @spazzrabbit,

As you've identified, it sounds like your winlogbeat data is missing some required mappings. Since Elastic Security rules require ECS fields, the errors you're seeing are due to those missing mappings.
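For context, the fields named in the error are all standard ECS fields; a typical prebuilt process rule queries something like the following (illustrative EQL only, not any specific rule — the process names shown are made up):

```eql
// Illustrative only: a prebuilt-style EQL query over ECS fields.
// If event.category, event.type, process.name, process.parent.name, or
// process.args are not mapped in the target indices, rule execution fails
// with the verification_exception shown above.
process where event.category == "process" and event.type == "start" and
  process.name : "powershell.exe" and
  process.parent.name : "cmd.exe" and
  process.args : "-enc*"
```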

You mentioned you're using winlogbeat, which should contain these fields by default. Can I ask how you set up winlogbeat?

Most of these mappings come from the index templates that are created via e.g. `winlogbeat setup`; it seems possible that those templates were not created before winlogbeat was started. Ingest pipelines are another aspect to examine, as those populate much of the winlogbeat data itself.
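In case it helps, here is a minimal sketch of that setup flow (assuming Elasticsearch is reachable at localhost:9200 and the credentials shown are placeholders — adjust to your environment):

```yaml
# winlogbeat.yml - temporarily point the output at Elasticsearch so that
# `winlogbeat setup` can install the index template and ingest pipelines.
# Only one output may be enabled at a time, so comment out output.logstash.
output.elasticsearch:
  hosts: ["localhost:9200"]
  username: "elastic"
  password: "changeme"

# Then, from the winlogbeat install directory, run:
#   .\winlogbeat.exe setup --index-management   # loads the index template
#   .\winlogbeat.exe setup --pipelines          # loads the ingest pipelines
# and switch the output back to Logstash afterwards.
```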

In general, this looks pretty similar to No event.category in Winlogbeat; I would suggest looking there for ideas as well.

Hello @RylandHerrick ,

I've successfully created the categories with the guides you provided (I set up winlogbeat.yml with --pipelines pointing directly to Elasticsearch, then changed it back to Logstash).
But I couldn't configure Logstash to fill those fields.
You can see my current Logstash input/output config below.
I configured an index template, and it creates indices like "winlogbeat-100001".

input {
   beats {
     port => 5044
     add_field => {"[@metadata][source]" => "winlogbeat"}
     ssl => false
   }
}
output {
   if [@metadata][source] == "winlogbeat" {
      elasticsearch {
         hosts => ["172.25.95.100:9200"]
         manage_template => false
         user => "elastic"
         password => "MyPassword"
         index => "winlogbeat"
         document_type => "%{[@metadata][type]}"
       }
     }
}

When I change this file to the following, it does not create or write to any index, and I think Logstash constantly restarts. I also wonder whether I need to delete the 02-beats-input file, since there is already an input in this file.

input {
  beats {
    port => 5044
  }
}
output {
  if [@metadata][pipeline] {
    elasticsearch {
      hosts => ["172.25.95.100:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      action => "create"
      pipeline => "%{[@metadata][pipeline]}"
      user => "elastic"
      password => "MyPassword"
    }
  } else {
    elasticsearch {
      hosts => hosts => ["172.25.95.100:9200"]
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      action => "create"
      user => "elastic"
      password => "MyPassword"
    }
  }
}
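One thing worth noting in the config above, assuming it was pasted verbatim: the second `elasticsearch` block contains a duplicated `hosts =>` token, which is a Logstash syntax error and would make the pipeline fail to start (consistent with the service restart loop). A corrected sketch of that block:

```
  } else {
    elasticsearch {
      hosts => ["172.25.95.100:9200"]   # single hosts option, not "hosts => hosts =>"
      manage_template => false
      index => "%{[@metadata][beat]}-%{[@metadata][version]}"
      action => "create"
      user => "elastic"
      password => "MyPassword"
    }
  }
```

Also relevant to the 02-beats-input question: by default Logstash concatenates every file in /etc/logstash/conf.d into one pipeline, so a second `beats { port => 5044 }` input in another file would try to bind the same port twice and fail at startup.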

Thank you for help

@spazzrabbit I'm not as familiar with logstash (or beats, really); this board centers around the Security Solution itself. You might have better luck cross-posting in the beats board, but I'll also see if I can find a logstash expert to weigh in.

That being said, if logstash is configured with the above as you've stated, and your stack has been restarted to respect those changes, then there should be some output somewhere indicating what's happening. What do you see in the elasticsearch logs? The fact that the logstash service "constantly restarts" might be indicative of an error or misconfiguration somewhere.

With more information, we can try to narrow down the issue. Please share:

  1. The steps you've taken to configure winlogbeat/logstash (in order) (e.g. winlogbeat setup --pipelines)
  2. The current configuration of winlogbeat/logstash
  3. Output from winlogbeat/logstash/elasticsearch

@RylandHerrick

  1. I set up winlogbeat as a service. While working on the rules, I changed the configuration and disabled the Logstash output in order to create the pipelines. I then ran `winlogbeat setup --pipelines`, and after that changed my output back to Logstash.

  2. Here is my current configuration

###################### Winlogbeat Configuration Example ########################

# This file is an example configuration file highlighting only the most common
# options. The winlogbeat.reference.yml file from the same directory contains
# all the supported options with more comments. You can use it as a reference.
#
# You can find the full configuration reference here:
# https://www.elastic.co/guide/en/beats/winlogbeat/index.html

# ======================== Winlogbeat specific options =========================

# event_logs specifies a list of event logs to monitor as well as any
# accompanying options. The YAML data type of event_logs is a list of
# dictionaries.
#
# The supported keys are name, id, xml_query, tags, fields, fields_under_root,
# forwarded, ignore_older, level, event_id, provider, and include_xml.
# The xml_query key requires an id and must not be used with the name,
# ignore_older, level, event_id, or provider keys. Please visit the
# documentation for the complete details of each option.
# https://go.es.io/WinlogbeatConfig

winlogbeat.event_logs:
  - name: Application
    ignore_older: 72h
    
  - name: Microsoft-Windows-DNS-Client/Operational
    ignore_older: 72h
    
  - name: Microsoft-Windows-DNS-Server/Analytical
    ignore_older: 72h
    
  - name: Windows PowerShell
    event_id: 400, 403, 600, 800

  - name: Microsoft-Windows-PowerShell/Operational
    event_id: 4103, 4104, 4105, 4106
  
  - name: System

  - name: Security
          
  - name: Microsoft-Windows-Sysmon/Operational

# ====================== Elasticsearch template settings =======================

setup.template.settings:
  index.number_of_shards: 3
  #index.codec: best_compression
  #_source.enabled: false


# ================================== General ===================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:

# The tags of the shipper are included in their field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
#fields:
#  env: staging

# ================================= Dashboards =================================
# These settings control loading the sample dashboards to the Kibana index. Loading
# the dashboards is disabled by default and can be enabled either by setting the
# options here or by using the `setup` command.
#setup.dashboards.enabled: false

# The URL from where to download the dashboard archive. By default, this URL
# has a value that is computed based on the Beat name and version. For released
# versions, this URL points to the dashboard archive on the artifacts.elastic.co
# website.
#setup.dashboards.url:

# =================================== Kibana ===================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify an additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  #host: "localhost:5601"

  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:

# =============================== Elastic Cloud ================================

# These settings simplify using Winlogbeat with the Elastic Cloud (https://cloud.elastic.co/).

# The cloud.id setting overwrites the `output.elasticsearch.hosts` and
# `setup.kibana.host` options.
# You can find the `cloud.id` in the Elastic Cloud web UI.
#cloud.id:

# The cloud.auth setting overwrites the `output.elasticsearch.username` and
# `output.elasticsearch.password` settings. The format is `<user>:<pass>`.
#cloud.auth:

# ================================== Outputs ===================================

# Configure what output to use when sending the data collected by the beat.

# ---------------------------- Elasticsearch Output ----------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
 # hosts: ["172.**.**.**:9200"]

  # Protocol - either `http` (default) or `https`.
  #protocol: "https"

  # Authentication credentials - either API key or username/password.
  #api_key: "id:api_key"
  #username: elastic
  #password: mypassword

  # Pipeline to route events to security, sysmon, or powershell pipelines.
  #pipeline: "winlogbeat-%{[agent.version]}-routing"

# ------------------------------ Logstash Output -------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["172.**.**.**:5044"]

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

# ================================= Processors =================================
processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~

# ================================== Logging ===================================

# Sets log level. The default log level is info.
# Available log levels are: error, warning, info, debug
#logging.level: debug

# At debug level, you can selectively enable logging only for some components.
# To enable all selectors, use ["*"]. Examples of other selectors are "beat",
# "publisher", "service".
#logging.selectors: ["*"]

# ============================= X-Pack Monitoring ==============================
# Winlogbeat can export internal metrics to a central Elasticsearch monitoring
# cluster.  This requires xpack monitoring to be enabled in Elasticsearch.  The
# reporting is disabled by default.

# Set to true to enable the monitoring reporter.
#monitoring.enabled: false

# Sets the UUID of the Elasticsearch cluster under which monitoring data for this
# Winlogbeat instance will appear in the Stack Monitoring UI. If output.elasticsearch
# is enabled, the UUID is derived from the Elasticsearch cluster referenced by output.elasticsearch.
#monitoring.cluster_uuid:

# Uncomment to send the metrics to Elasticsearch. Most settings from the
# Elasticsearch outputs are accepted here as well.
# Note that the settings should point to your Elasticsearch *monitoring* cluster.
# Any setting that is not set is automatically inherited from the Elasticsearch
# output configuration, so if you have the Elasticsearch output configured such
# that it is pointing to your Elasticsearch monitoring cluster, you can simply
# uncomment the following line.
#monitoring.elasticsearch:

# ============================== Instrumentation ===============================

# Instrumentation support for the winlogbeat.
#instrumentation:
    # Set to true to enable instrumentation of winlogbeat.
    #enabled: false

    # Environment in which winlogbeat is running on (eg: staging, production, etc.)
    #environment: ""

    # APM Server hosts to report instrumentation results to.
    #hosts:
    #  - http://localhost:8200

    # API Key for the APM Server(s).
    # If api_key is set then secret_token will be ignored.
    #api_key:

    # Secret token for the APM Server(s).
    #secret_token:


# ================================= Migration ==================================

# This allows to enable 6.7 migration aliases
#migration.6_to_7.enabled: true

  3. I can't get any output from Elasticsearch yet, but I'll add the winlogbeat output in the comments ASAP.

Here is my winlogbeat output:


{"@timestamp":"2024-07-02T06:14:57.368Z","@metadata":{"beat":"winlogbeat","type":"_doc","version":"8.12.2"},"ecs":{"version":"8.0.0"},"agent":{"id":"5b29ed52-3822-4c30-84e5-d2ad3d2c94cb","name":"DESKTOP-AQ6N264","type":"winlogbeat","version":"8.12.2","ephemeral_id":"e8c68337-c4c5-4243-b79e-93a85a9da8bf"},"log":{"level":"information"},"message":"Process Create:\nRuleName: -\nUtcTime: 2024-07-02 06:14:57.367\nProcessGuid: {85b70ee6-9ae1-6683-8703-000000002100}\nProcessId: 2104\nImage: C:\\Windows\\System32\\conhost.exe\nFileVersion: 10.0.19041.4355 (WinBuild.160101.0800)\nDescription: Console Window Host\nProduct: Microsoft® Windows® Operating System\nCompany: Microsoft Corporation\nOriginalFileName: CONHOST.EXE\nCommandLine: \\??\\C:\\Windows\\system32\\conhost.exe 0xffffffff -ForceV1\nCurrentDirectory: C:\\Windows\nUser: DESKTOP-AQ6N264\\Test\nLogonGuid: {85b70ee6-8e97-6683-b189-0a0000000000}\nLogonId: 0xA89B1\nTerminalSessionId: 2\nIntegrityLevel: High\nHashes: MD5=0F568F6C821565AB9FF45C7457953789,SHA256=CC0A60CD15FA21E54615E46CD0F10CFBE86F496DC64D14B31D9F3B415D120EE1,IMPHASH=0F64302D3280DE299F4C51A78746F606\nParentProcessGuid: {85b70ee6-9ae1-6683-8603-000000002100}\nParentProcessId: 6536\nParentImage: C:\\Users\\Test\\AppData\\Local\\Temp\\1301460b-456a-4606-876b-92cf36163032\\idaq.exe\nParentCommandLine: \"C:\\Users\\Test\\AppData\\Local\\Temp\\1301460b-456a-4606-876b-92cf36163032\\idaq.exe\" -t -w 3600000 -4 1.1.1.1 \nParentUser: DESKTOP-AQ6N264\\Test","host":{"mac":["00-15-5D-"],"hostname":"desktop-aq6n264","architecture":"x86_64","os":{"name":"Windows 10 Pro","kernel":"10.0.19041.4522 
(WinBuild.160101.0800)","build":"19045.4529","type":"windows","platform":"windows","version":"10.0","family":"windows"},"id":"85b70ee6-5b58-414e-8c6c-ee259a6d7b95","name":"desktop-aq6n264","ip":["fe80","172.25."]},"winlog":{"opcode":"Info","record_id":273333,"event_data":{"ParentCommandLine":"\"C:\\Users\\Test\\AppData\\Local\\Temp\\1301460b-456a-4606-876b-92cf36163032\\idaq.exe\" -t -w 3600000 -4 1.1.1.1 ","LogonGuid":"{85b70ee6-8e97-6683-b189-0a0000000000}","LogonId":"0xa89b1","Product":"Microsoft® Windows® Operating System","OriginalFileName":"CONHOST.EXE","UtcTime":"2024-07-02 06:14:57.367","Image":"C:\\Windows\\System32\\conhost.exe","ParentImage":"C:\\Users\\Test\\AppData\\Local\\Temp\\1301460b-456a-4606-876b-92cf36163032\\idaq.exe","ProcessGuid":"{85b70ee6-9ae1-6683-8703-000000002100}","FileVersion":"10.0.19041.4355 (WinBuild.160101.0800)","CurrentDirectory":"C:\\Windows","RuleName":"-","Hashes":"MD5=0F568F6C821565AB9FF45C7457953789,SHA256=CC0A60CD15FA21E54615E46CD0F10CFBE86F496DC64D14B31D9F3B415D120EE1,IMPHASH=0F64302D3280DE299F4C51A78746F606","ParentUser":"DESKTOP-AQ6N264\\Test","Description":"Console Window Host","User":"DESKTOP-AQ6N264\\Test","Company":"Microsoft Corporation","CommandLine":"\\??\\C:\\Windows\\system32\\conhost.exe 0xffffffff -ForceV1","TerminalSessionId":"2","ParentProcessGuid":"{85b70ee6-9ae1-6683-8603-000000002100}","IntegrityLevel":"High","ParentProcessId":"6536","ProcessId":"2104"},"provider_name":"Microsoft-Windows-Sysmon","task":"Process Create (rule: ProcessCreate)","user":{"type":"User","identifier":"S-1-5-18","domain":"NT AUTHORITY","name":"SYSTEM"},"channel":"Microsoft-Windows-Sysmon/Operational","computer_name":"DESKTOP-AQ6N264","process":{"pid":3736,"thread":{"id":5132}},"event_id":"1","provider_guid":"{5770385f-c22a-43e0-bf4c-06f5698ffbd9}","version":5,"api":"wineventlog"},"event":{"code":"1","kind":"event","provider":"Microsoft-Windows-Sysmon","action":"Process Create (rule: 
ProcessCreate)","created":"2024-07-02T06:14:58.418Z"}}

These are my ingest pipelines.

There is a warning message in Logstash; it seems that there is something wrong with my pipeline name:

illegal_argument_exception", "reason"=>"pipeline with id [pipeline] does not exist

For those who have the same issue:

You need to define these pipelines in winlogbeat.yml under the output.logstash section:

pipelines:
    - pipeline: "winlogbeat-8.12.2-powershell"
      when.equals:
        log.name: "Microsoft-Windows-PowerShell/Operational"
    - pipeline: "winlogbeat-8.12.2-powershell_operational"
      when.equals:
        log.name: "Microsoft-Windows-PowerShell/Admin"
    - pipeline: "winlogbeat-8.12.2-routing"
      when.equals:
        log.name: "Routing"
    - pipeline: "winlogbeat-8.12.2-security"
      when.equals:
        log.name: "Security"
    - pipeline: "winlogbeat-8.12.2-sysmon"
      when.equals:
        log.name: "Microsoft-Windows-Sysmon/Operational"

You also need to map these pipelines on the Logstash server in the output config file, as follows:

input {
  beats {
    port => 5044
  }
}

filter {
  if [log][name] == "Microsoft-Windows-PowerShell/Operational" {
    mutate {
      add_field => { "[@metadata][pipeline]" => "winlogbeat-8.12.2-powershell_operational" }
    }
  }
  else if [log][name] == "Microsoft-Windows-PowerShell/Admin" {
    mutate {
      add_field => { "[@metadata][pipeline]" => "winlogbeat-8.12.2-powershell" }
    }
  }
  else if [log][name] == "Routing" {
    mutate {
      add_field => { "[@metadata][pipeline]" => "winlogbeat-8.12.2-routing" }
    }
  }
  else if [log][name] == "Security" {
    mutate {
      add_field => { "[@metadata][pipeline]" => "winlogbeat-8.12.2-security" }
    }
  }
  else if [log][name] == "Microsoft-Windows-Sysmon/Operational" {
    mutate {
      add_field => { "[@metadata][pipeline]" => "winlogbeat-8.12.2-sysmon" }
    }
  }
  else {
    mutate {
      add_field => { "[@metadata][pipeline]" => "winlogbeat-8.12.2-sysmon" }
    }
  }
}
output {
  elasticsearch {
        hosts => ["172.***.***.***:9200"]
        manage_template => false
        index => "winlogbeat-%{+YYYY.MM.dd}"
        action => "create"
        pipeline => "%{[@metadata][pipeline]}"
        user => "elastic"
        password => "mypassword"
  }
}

You can change the else block as you wish. If an event doesn't match any condition in the "if" blocks, it falls through to the else block, so you should put something there.
It works successfully at the moment.

Thank you for guidance @RylandHerrick !


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.