Yet Another Elastic SIEM Not Showing Hosts

I started playing with Elastic a few months ago, and very recently started trying to get Elastic SIEM to work. I have not been able to get it to recognize anything, and the only host showing up is the Elastic node itself from Filebeat logging. I have three Windows hosts with Winlogbeat installed, sending logs via Logstash, but they are not appearing. I'm sure I'm missing something obvious, and I have read other posts here, but I can't see it.

I originally didn't have the Winlogbeat template imported, so I did that, deleted the previous indexes, and created new ones with the correct template. Below is a snippet from the index mapping JSON-

    "host": {
              "properties": {
                "architecture": {
                  "type": "keyword",
                  "ignore_above": 1024
                },
    			...
                ,
                "name": {
                  "type": "keyword",
                  "ignore_above": 1024
                },
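
While I was at it, I also double-checked from dev tools that the template itself is loaded and that its index_patterns cover my indices; something along these lines should list it (the exact template name depends on the Winlogbeat version):

    GET _template/winlogbeat*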

Index name is " winlogbeat-2020.07.15-000001" and I confirmed that the SIEM indexes contain "winlogbeat-*". Inspecting host events and checking the "Response" tab starts with the below, everything below this snippet just returns ' "doc_count": 0 ':

    {
      "took": 17,
      "timed_out": false,
      "_shards": {
        "total": 93,
        "successful": 93,
        "skipped": 91,
        "failed": 0
      },
      "hits": {
        "max_score": null,
        "hits": []
      },
      ...

Hi @brian_m. How long has Winlogbeat been running on the hosts? Did you try expanding the time period to increase the chance of catching events?

I deleted the old indexes at the start of this, so as of right now I have a few hours short of a full day of data. The previous data was not mapped correctly, and I didn't feel like re-indexing it all. I have gone into the SIEM view and changed it to "past 7 days", and still no Winlogbeat hosts show up.

Hello everyone,

I have the same problem here using the same architecture: winlogbeat -> logstash -> elastic cloud, but the data still does not show up in the SIEM app.

A lot of the time, data won't show up because of indexing issues. After inspecting the query, have you run the same query through dev tools to see if the data shows up there?

Alright, I'm mediocre at best when it comes to the dev tools and raw queries, so bear with me.

Taking the query from the "All Hosts" panel on the Hosts section of the SIEM and running it through dev tools (I've cut a bunch out for space reasons, but hopefully left enough to confirm which query I'm using; it's copied straight from the SIEM Inspect window)-

    POST /winlogbeat-*/_search
    {
      "aggregations": {
        "host_count": {
          "cardinality": {
            "field": "host.name"
          }
        },
        ...
      },
      "size": 0,
      "track_total_hits": false
    }

and when running that, I get the correct response of 3 hosts. I tried re-running it against filebeat-* and got this error:

Text fields are not optimised for operations that require per-document field data like aggregations and sorting, so these operations are disabled by default. Please use a keyword field instead. Alternatively, set fielddata=true on [host.name] in order to load field data by uninverting the inverted index. Note that this can use significant memory.
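
If I understand that error right, it is what you get when the index was created before the template existed and dynamic mapping kicked in: by default a string field then ends up as text with a .keyword sub-field, roughly like this, instead of the plain keyword mapping from the Beats template:

    "host": {
      "properties": {
        "name": {
          "type": "text",
          "fields": {
            "keyword": {
              "type": "keyword",
              "ignore_above": 256
            }
          }
        }
      }
    }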

I confirmed the Filebeat mapping template was imported, deleted the Filebeat indexes and template, re-imported the template, and tried again with the same result. So there is something going on there, but I felt it was unrelated to the Winlogbeat SIEM issue, so I removed filebeat-* from the SIEM indexes list. After that, the SIEM just showed the "Welcome to SIEM. Let's get you started." page. Going through the "Add data with Beats" > "Winlogbeat" process showed that it saw data, but apparently it still isn't recognizing it.

Hi @brian_m,

I did resolve my case, which is very similar to yours, by exporting the .json template from the Winlogbeat agent and uploading it to Elasticsearch (Elastic Cloud).

Basically, I followed this documentation (to export and manually upload the template): https://www.elastic.co/guide/en/beats/winlogbeat/current/winlogbeat-template.html#load-template-manually-alternate.

I needed to write a script to upload the template correctly to Elasticsearch; here it is:

    # Template endpoint on the Elastic Cloud cluster
    # (credentials are passed separately via -Credential below)
    $root = 'https://url.com:9243/_template/winlogbeat-7.8.0'
    $user = "user"
    $pass = "pass"
    $secpasswd = ConvertTo-SecureString $pass -AsPlainText -Force
    $credential = New-Object System.Management.Automation.PSCredential($user, $secpasswd)

    # PUT the exported template JSON into Elasticsearch
    $result = Invoke-RestMethod -Credential $credential -Method Put -ContentType "application/json" -InFile winlogbeat.template.json -Uri $root
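
For completeness, the winlogbeat.template.json in that call comes from the export step in the same doc; from a PowerShell prompt in the Winlogbeat install directory it is roughly (the --es.version should match your stack version):

    .\winlogbeat.exe export template --es.version 7.8.0 | Out-File -Encoding UTF8 winlogbeat.template.json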

I put all of this in a PowerShell script and executed it from the Winlogbeat installation directory. After that, I needed to delete my "old" indices and recreate the index pattern, and voilà, everything is showing up in the SIEM app.

Regards,
Rafael Martins

I appreciate the script (and I am definitely storing it for future use); however, I'm not sure the mapping is the issue here. Running the query in dev tools shows:

    "winlog_module" : {
      "doc_count" : 3803680,
      "mwsysmon_operational_event_count" : {
        "doc_count" : 0
      },
      "security_event_count" : {
        "doc_count" : 3802467
      }

but in the "Inspect Host Events" response window, it shows

    "winlog_module": {
      "meta": {},
      "doc_count": 0,
      "mwsysmon_operational_event_count": {
        "meta": {},
        "doc_count": 0
      },
      "security_event_count": {
        "meta": {},
        "doc_count": 0
      }

This is copying the query in "Request" and running it in dev tools with

    POST /winlogbeat-*/_search

(Also, for reference to anyone following along, I resolved the filebeat issue, and it was entirely user error. I typoed the index name so the template wasn't applying...)

This sounds similar to this issue here maybe?

You can use those instructions to check your mappings. What can happen is that if you have a beat constantly pushing data and you don't shut it down before re-doing your mappings, that beat will usually "win" by pushing a new piece of data into the system, and then you get default dynamic mappings all over again, which are mostly text fields, which is not what we want.

When re-doing mappings you have to (rough dev tools sketch after the list):

  • Ensure everything is shut down and nothing gets back in
  • Remove your mappings
  • Push your explicit Beats mappings
  • Turn your Beats back on
  • Check your mappings one more time to make sure the right parts are still keyword and not text
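
Roughly, in dev tools terms (just a sketch; adjust index and template names to your setup):

    # 1) Stop Winlogbeat / the Logstash pipeline so nothing writes during the change.
    # 2) Remove the wrongly mapped indices:
    DELETE winlogbeat-*

    # 3) Reload the explicit Winlogbeat template (the manual export/PUT shown
    #    earlier in this thread works when the beat ships through Logstash).
    # 4) Restart the beats, then re-check the mapping before trusting new data.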

According to the mappings on the index, host.name is a keyword. It also returns results, not just in Discover, but also when I use the same query that SIEM is using (as far as I understand it). I know Filebeat failed because host.name was not a keyword there, and when I fixed that, it started working. This is the mapping for the index (not the template, but the index itself, from Kibana "Index Management" > "Indices", then selecting the index and viewing "Mappings"):

"name": {
              "type": "keyword",
              "ignore_above": 1024
            },
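
For reference, the same check from dev tools looks like this (using the index name from earlier; abbreviated response when the template applied correctly):

    GET winlogbeat-2020.07.15-000001/_mapping/field/host.name

    {
      "winlogbeat-2020.07.15-000001": {
        "mappings": {
          "host.name": {
            "full_name": "host.name",
            "mapping": {
              "name": {
                "type": "keyword",
                "ignore_above": 1024
              }
            }
          }
        }
      }
    }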

I have solved the issue. And it is the stupidest thing. The "Elasticsearch indices" setting for the SIEM was

    auditbeat-*, endgame-*, filebeat-*, packetbeat-*; winlogbeat-*; meraki-*, suricata-*

The key point is the two semicolons around winlogbeat-*, trying very hard to blend in. I replaced those with commas as intended, and it seems to be working.
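
For anyone else hitting this: the value lives in Kibana Advanced Settings (the siem:defaultIndex setting in this version, if I have the name right), and it has to be strictly comma-separated, e.g.:

    auditbeat-*, endgame-*, filebeat-*, packetbeat-*, winlogbeat-*, meraki-*, suricata-*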
