Trouble with Index Patterns

Hi - I'm having a bit of trouble with the SIEM + Beats configuration regarding index mappings. I've stood up 7.2 with the latest Beats and followed the documentation to a T. When I open the SIEM app, I can see most of my data and receive no errors, but some areas are still not populating, such as hosts, unique IPs, authentications, and uncommon processes. I'm sure there are others, but I'm not seeing them at the moment. However, when I go to search through my data using Discover, I see some fields were not properly indexed.

Looking at the index patterns in Kibana, I can see I have the following error:

"Mapping conflict
5 fields are defined as several types (string, integer, etc) across the indices that match this pattern. You may still be able to use these conflict fields in parts of Kibana, but they will be unavailable for functions that require Kibana to know their type. Correcting this issue will require reindexing your data."

I am not able to use those fields as they are not indexed.

When I look at the Index Management in Elasticsearch I see I have two sets of indices for each log shipper. For instance:

winlogbeat-2019.07.04 and winlogbeat-7.2.0-2019.07.04-000001

The same for the other shippers.

If I delete the *-000001 index, I am able to use the fields in the Discover app, but the SIEM breaks and tells me that my fielddata is not mapped properly.
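For reference, the two index sets can be listed side by side in the Kibana Console (the pattern below assumes the default Winlogbeat index names):

```
GET /_cat/indices/winlogbeat-*?v&h=index,docs.count,store.size
```

This makes it easy to see which of the duplicate indices is actually receiving documents.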

If anyone can help, I'm more than happy to share logs and whatnot.

Thanks!

Starting with version 7.0, Beats switched to using Index Lifecycle Management (ILM) by default. This means the ILM policy decides when to roll over the indices: Beats create indices of the form winlogbeat-7.2.0-2019.07.04-000001 and a write alias of the form winlogbeat-7.2.0.

You can check this with GET /_aliases in the Console; this is how it looks for me:

GET /_aliases
...
  "auditbeat-7.3.0-2019.07.05-000001" : {
    "aliases" : {
      "auditbeat-7.3.0" : {
        "is_write_index" : true
      }
    }
  },
...
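If you want to confirm that ILM is managing a -000001 index, the ILM explain API reports the policy and current phase for it (the index name below is just the example from above; substitute your own):

```
GET /winlogbeat-7.2.0-2019.07.04-000001/_ilm/explain
```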

I'm curious about that winlogbeat-2019.07.04 index; Beats haven't created indices in that form for a long time. Perhaps you have a custom index configuration on some of your Winlogbeat instances?

Thanks.

This is my only VM running Winlogbeat for now. I pulled the latest Beats from the site, which was 7.2.0.

I didn't change much in the config file. I added log sources and configured it to push to Logstash instead of Elasticsearch. Then I loaded the index templates manually as described in the docs.

Here's the config, if it's helpful:

winlogbeat.event_logs:

  - name: Application
    ignore_older: 72h

  - name: System

  - name: Microsoft-windows-PowerShell/Operational
    ignore_older: 60m
    event_id: 4103, 4104

  - name: Windows PowerShell
    event_id: 400,600
    ignore_older: 60m

  - name: Microsoft-Windows-WMI-Activity/Operational
    event_id: 5857,5858,5859,5860,5861

  - name: Security
    processors:
      - script:
          lang: javascript
          id: security
          file: ${path.home}/module/security/config/winlogbeat-security.js

  - name: Microsoft-Windows-Sysmon/Operational
    processors:
      - script:
          lang: javascript
          id: sysmon
          file: ${path.home}/module/sysmon/config/winlogbeat-sysmon.js

#==================== Elasticsearch template settings ==========================

setup.template.settings:
  index.number_of_shards: 1
  #index.codec: best_compression
  #_source.enabled: false

#================================ General =====================================

# The name of the shipper that publishes the network data. It can be used to group
# all the transactions sent by a single shipper in the web interface.
#name:

# The tags of the shipper are included in their own field with each
# transaction published.
#tags: ["service-X", "web-tier"]

# Optional fields that you can specify to add additional information to the
# output.
#fields:
#  env: staging

#============================== Dashboards =====================================
# These settings control loading the sample dashboards to the Kibana index. Loading
# the dashboards is disabled by default and can be enabled either by setting the
# options here or by using the setup command.
#setup.dashboards.enabled: false

# The URL from where to download the dashboards archive. By default this URL
# has a value which is computed based on the Beat name and version. For released
# versions, this URL points to the dashboard archive on the artifacts.elastic.co
# website.
setup.dashboards.url:

#============================== Kibana =====================================

# Starting with Beats version 6.0.0, the dashboards are loaded via the Kibana API.
# This requires a Kibana endpoint configuration.
setup.kibana:

  # Kibana Host
  # Scheme and port can be left out and will be set to the default (http and 5601)
  # In case you specify an additional path, the scheme is required: http://localhost:5601/path
  # IPv6 addresses should always be defined as: https://[2001:db8::1]:5601
  host: "192.168.1.232:5601"

  # Kibana Space ID
  # ID of the Kibana Space into which the dashboards should be loaded. By default,
  # the Default Space will be used.
  #space.id:

#============================= Elastic Cloud ==================================

# These settings simplify using winlogbeat with the Elastic Cloud (https://cloud.elastic.co/).

# The cloud.id setting overwrites the output.elasticsearch.hosts and
# setup.kibana.host options.
# You can find the cloud.id in the Elastic Cloud web UI.
#cloud.id:

# The cloud.auth setting overwrites the output.elasticsearch.username and
# output.elasticsearch.password settings. The format is <user>:<pass>.
#cloud.auth:

#================================ Outputs =====================================

# Configure what output to use when sending the data collected by the beat.

#-------------------------- Elasticsearch output ------------------------------
#output.elasticsearch:
  # Array of hosts to connect to.
  #hosts: ["localhost:9200"]

  # Optional protocol and basic auth credentials.
  #protocol: "https"
  #username: "elastic"
  #password: "changeme"

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: "192.168.1.232:5044"

  # Optional SSL. By default is off.
  # List of root certificates for HTTPS server verifications
  #ssl.certificate_authorities: ["/etc/pki/root/ca.pem"]

  # Certificate for SSL client authentication
  #ssl.certificate: "/etc/pki/client/cert.pem"

  # Client Certificate Key
  #ssl.key: "/etc/pki/client/cert.key"

#================================ Processors =====================================

# Configure processors to enhance or manipulate events generated by the beat.

processors:
  - add_host_metadata: ~
  - add_cloud_metadata: ~

#================================ Logging =====================================

# Sets log level. The default log level is info.
# Available log levels are: error, warning, info, debug
#logging.level: debug

# At debug level, you can selectively enable logging only for some components.
# To enable all selectors use ["*"]. Examples of other selectors are "beat",
# "publish", "service".
#logging.selectors: ["*"]

#============================== Xpack Monitoring ===============================
# winlogbeat can export internal metrics to a central Elasticsearch monitoring
# cluster. This requires xpack monitoring to be enabled in Elasticsearch. The
# reporting is disabled by default.

# Set to true to enable the monitoring reporter.
#monitoring.enabled: false

# Uncomment to send the metrics to Elasticsearch. Most settings from the
# Elasticsearch output are accepted here as well.
# Note that the settings should point to your Elasticsearch monitoring cluster.
# Any setting that is not set is automatically inherited from the Elasticsearch
# output configuration, so if you have the Elasticsearch output configured such
# that it is pointing to your Elasticsearch monitoring cluster, you can simply
# uncomment the following line.
#monitoring.elasticsearch:

#================================= Migration ==================================

# This allows to enable 6.7 migration aliases
#migration.6_to_7.enabled: true

Can you post your LS config as well, please?

Also, please make sure to load the template in ES. When you run Winlogbeat against ES, that happens automatically, but when you send data via Logstash, you need to do it manually before you create any indices. See here: https://www.elastic.co/guide/en/beats/filebeat/7.1/filebeat-template.html#load-template-manually

I think that index form is coming from Logstash. I used a very basic config I've been using for a while. Is this the problem?

input {
  beats {
    port => 5044
  }
}

output {
  elasticsearch {
    hosts => "192.168.1.232:9200"
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}
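If you want to see exactly what values those %{[@metadata]...} fields carry at runtime, a temporary stdout output with the rubydebug codec's metadata option will print them alongside each event (a debugging sketch only, not something to leave in production):

```
output {
  stdout { codec => rubydebug { metadata => true } }
}
```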

To load the index templates and set up index management, I performed the following on the Windows box:

.\winlogbeat.exe setup --index-management -E output.logstash.enabled=false -E 'output.elasticsearch.hosts=["192.168.1.232:9200"]'

Invoke-RestMethod -Method Delete "http://192.168.1.232:9200/winlogbeat-*"

.\winlogbeat.exe setup --dashboards

.\winlogbeat.exe setup -e -E output.logstash.enabled=false `
  -E output.elasticsearch.hosts=['192.168.1.232:9200'] `
  -E setup.kibana.host=192.168.1.232:5601

Restart the service

Hmm, that looks good. Can you post the output of GET /_aliases?

GET /_alias

{
  "winlogbeat-2019.07.05" : {
    "aliases" : { }
  },
  ".kibana_1" : {
    "aliases" : {
      ".kibana" : { }
    }
  },
  "packetbeat-2019.07.05" : {
    "aliases" : { }
  },
  "winlogbeat-7.2.0-2019.07.05-000001" : {
    "aliases" : {
      "winlogbeat-7.2.0" : {
        "is_write_index" : true
      }
    }
  },
  "packetbeat-7.2.0-2019.07.05-000001" : {
    "aliases" : {
      "packetbeat-7.2.0" : {
        "is_write_index" : true
      }
    }
  },
  ".kibana_task_manager" : {
    "aliases" : { }
  }
}

GET /_cat/aliases

.kibana .kibana_1 - - -
winlogbeat-7.2.0 winlogbeat-7.2.0-2019.07.05-000001 - - -
packetbeat-7.2.0 packetbeat-7.2.0-2019.07.05-000001 - - -

(I deleted all of the indices prior to today to start fresh and ran into the same issues.)

Here are the SIEM errors when I try to fix some of this, if it's helpful. Looking at the dashboards, nothing is populating for hosts, users, source and destination IPs, authentications, or processes.

[illegal_argument_exception] Fielddata is disabled on text fields by default. Set fielddata=true on [host.name] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead.

[illegal_argument_exception] Fielddata is disabled on text fields by default. Set fielddata=true on [user.name] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead.

[illegal_argument_exception] Fielddata is disabled on text fields by default. Set fielddata=true on [process.name] in order to load fielddata in memory by uninverting the inverted index. Note that this can however use significant memory. Alternatively use a keyword field instead.
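Those errors usually mean a field was indexed as text (the dynamic-mapping default) rather than keyword as the Winlogbeat template defines it. You can compare how each index mapped one of the offending fields in the Console:

```
GET /winlogbeat-*/_mapping/field/host.name
```

If the -000001 index shows keyword and the date-only index shows text, the date-only index was created without the Winlogbeat template.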

I don't understand what creates this one: winlogbeat-2019.07.05. To my knowledge, Beats shouldn't create that one. Can you post a document from it?

Sure - How do I do that?

You can run this in the Console:

GET /winlogbeat-2019.07.05/_search

{
  "took" : 956,
  "timed_out" : false,
  "_shards" : {
    "total" : 1,
    "successful" : 1,
    "skipped" : 0,
    "failed" : 0
  },
  "hits" : {
    "total" : {
      "value" : 4771,
      "relation" : "eq"
    },
    "max_score" : 1.0,
    "hits" : [
      {
        "_index" : "winlogbeat-2019.07.05",
        "_type" : "_doc",
        "_id" : "zAQswmsBZ6MzYYU7K4pi",
        "_score" : 1.0,
        "_source" : {
          "@timestamp" : "2019-07-05T12:46:40.124Z",
          "event" : {
            "kind" : "event",
            "created" : "2019-07-05T12:46:41.966Z",
            "action" : "Filtering Platform Connection",
            "code" : 5156
          },
          "log" : {
            "level" : "information"
          },
          "@version" : "1",
          "agent" : {
            "version" : "7.2.0",
            "hostname" : "DESKTOP-P8TO48Q",
            "id" : "85f27c56-585d-4ab2-ae5a-e0969327543b",
            "ephemeral_id" : "60018923-4f3c-456e-acfb-87744101c314",
            "type" : "winlogbeat"
          },
          "ecs" : {
            "version" : "1.0.0"
          },
          "message" : """
The Windows Filtering Platform has permitted a connection.

Application Information:
Process ID: 7092
Application Name: \device\harddiskvolume2\windows\system32\svchost.exe

Network Information:
Direction: Inbound
Source Address: 127.0.0.1
Source Port: 51247
Destination Address: 239.255.255.250
Destination Port: 1900
Protocol: 17

Filter Information:
Filter Run-Time ID: 65886
Layer Name: Receive/Accept
Layer Run-Time ID: 44
""",
          "winlog" : {
            "provider_guid" : "{54849625-5478-4994-a5ba-3e3b0328c30d}",
            "version" : 1,
            "keywords" : [
              "Audit Success"
            ],
            "event_id" : 5156,
            "opcode" : "Info",
            "process" : {
              "thread" : {
                "id" : 6412
              },
              "pid" : 4
            },
            "computer_name" : "DESKTOP-P8TO48Q",
            "record_id" : 86840,
            "provider_name" : "Microsoft-Windows-Security-Auditing",
            "task" : "Filtering Platform Connection",
            "channel" : "Security",
            "event_data" : {
              "RemoteMachineID" : "S-1-0-0",
              "FilterRTID" : "65886",
              "RemoteUserID" : "S-1-0-0",
              "LayerRTID" : "44",
              "DestAddress" : "239.255.255.250",
              "SourceAddress" : "127.0.0.1",
              "DestPort" : "1900",
              "Direction" : "%%14592",
              "SourcePort" : "51247",
              "LayerName" : "%%14610",
              "Application" : """\device\harddiskvolume2\windows\system32\svchost.exe""",
              "Protocol" : "17",
              "ProcessID" : "7092"
            },
            "api" : "wineventlog"
          },
          "host" : {
            "name" : "DESKTOP-P8TO48Q",
            "hostname" : "DESKTOP-P8TO48Q",
            "os" : {
              "version" : "10.0",
              "family" : "windows",
              "name" : "Windows 10 Pro",
              "kernel" : "10.0.18362.175 (WinBuild.160101.0800)",
              "build" : "18362.175",
              "platform" : "windows"
            },
            "architecture" : "x86_64",
            "id" : "809fae3d-7e3c-48fc-8d6f-a8d35a1ecbf6"
          },
          "tags" : [
            "beats_input_codec_plain_applied"
          ]
        }
      },

Ah, I think I see what's going on. Can you try using this in the LS config:

  index => "%{[@metadata][beat]}-%{[@metadata][version]}"

And make sure to drop all indices and re-run setup to start over.
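For completeness, a full output block using that index setting would look like this (hosts value taken from the config posted above). Writing to %{[@metadata][beat]}-%{[@metadata][version]} sends events to the ILM write alias (e.g. winlogbeat-7.2.0), so they land in the rollover indices that carry the correct template, instead of creating a second, untemplated set of date-only indices:

```
output {
  elasticsearch {
    hosts => "192.168.1.232:9200"
    index => "%{[@metadata][beat]}-%{[@metadata][version]}"
  }
}
```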


That was it. Thank you so much!

