Winlogbeat and User sessions (parsing fields from message)


(Matthieu Robin) #1

Hello,

I have installed Winlogbeat on my Windows server, and it works fine; I'm getting all the data.

Unfortunately, I would like to count user logons and logoffs, and there are no corresponding fields.
All the information is stored in the "message" field:

Is there a way to parse the "message" field to extract "Account Name" and put it into a new field?

Or maybe there is another way to get this data?

Thanks a lot for your help.


(Christian Dahlqvist) #2

When events need to be processed, parsed and/or enriched, it makes sense to send them through Logstash for that processing rather than directly to Elasticsearch.


(Matthieu Robin) #3

Hello Christian,

Thanks for your answer.
It goes to Logstash. Unfortunately, I can't find good information on how to parse this data :frowning:. Can you help me?
Thanks in advance


(Christian Dahlqvist) #4

I think you should be able to parse it through a combination of grok, kv and possibly mutate filters. Use a grok filter to separate out the key-value style list in the middle. Then use a kv filter to parse the fields, possibly after having improved the formatting of the extracted part using the mutate filter.
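For reference, that combination might look something like the sketch below. It is untested; the grok anchors, the `subject_kv` field name, and the assumption that key/value pairs in the raw message are separated by tabs and line breaks are all guesses based on the 4634 message layout quoted later in this thread, so verify against a real event first.

```
filter {
  # Isolate the key/value section between "Subject:" and "Logon Type:".
  # (?m) lets GREEDYDATA match across the message's line breaks.
  grok {
    match => { "message" => "(?m)Subject:%{GREEDYDATA:subject_kv}Logon Type:" }
  }

  # Split the section into fields: pairs are separated by tabs and line
  # breaks, keys and values by ":". Keys like "Account Name" keep their
  # spaces, so rename them to friendlier field names afterwards.
  kv {
    source      => "subject_kv"
    field_split => "\r\n\t"
    value_split => ":"
  }

  mutate {
    rename => { "Account Name"   => "account_name"
                "Account Domain" => "account_domain" }
  }
}
```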


(Andrew Kroh) #5

I would probably apply a grok filter based on event ID.

There are many ways to do this.

filter {
  # Flatten the multi-line event message onto a single line so the grok
  # pattern below doesn't have to deal with line breaks.
  mutate {
    gsub => [
      "message", "\r\n", " ",
      "message", "\n", " "
    ]
  }

  # 4634 is "An account was logged off".
  if [event_id] == 4634 {
    grok {
      match => { "message" => "%{GREEDYDATA:msg} Subject:.*Security ID:\s*%{NOTSPACE:security_id}\s*Account Name:\s*%{GREEDYDATA:account_name}\s*Account Domain:\s*%{NOTSPACE:account_domain}" }
    }
  }
}

That's not tested, but it's a start. You can debug grok filters with https://grokdebug.herokuapp.com/.

At some point this will be easier. :slight_smile: See https://github.com/elastic/beats/issues/1053


(Matthieu Robin) #6

Wow! Thanks a lot! It works fine!
I have customized it for logon/logoff events. It's perfect! Thanks!
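A customization along those lines might simply add the logon event to the conditional (4624 is "An account was successfully logged on"). The sketch below reuses Andrew's untested pattern and is likewise unverified; note one caveat about what "Subject" means in a 4624 event.

```
filter {
  # 4624 = logon, 4634 = logoff.
  if [event_id] == 4624 or [event_id] == 4634 {
    grok {
      # Caveat: in a 4624 event the "Subject" section identifies the account
      # that *requested* the logon; the account that actually logged on is in
      # the "New Logon" section further down, so the pattern may need to be
      # anchored there instead for logon counts.
      match => { "message" => "%{GREEDYDATA:msg} Subject:.*Security ID:\s*%{NOTSPACE:security_id}\s*Account Name:\s*%{GREEDYDATA:account_name}\s*Account Domain:\s*%{NOTSPACE:account_domain}" }
    }
  }
}
```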


(Dave) #7

So I have been sitting here banging my head against the wall for the better part of three or four hours now. I have the grok filter mentioned above working and parsing the additional fields out of my Windows event logs. Logstash is outputting to my ES cluster, and all of the messages are making it there; however, the fields that grok parses out are not showing up in Kibana.

If I run Logstash with a stdout output, I can see the fields.

Here is a capture of the Logstash stdout; the new fields are at the bottom: security_id, account_name, and account_domain (redacted data, of course :wink:).

"message" => "An account was logged off. Subject: Security ID: S-1-5-21-XXXXXXXXXX-XXXXXXXXXX-XXXXXXXXXX-XXXX Account Name: myuser Account Domain: MYDOMAIN Logon ID: 0x7c2d10fe Logon Type: 3 This event is generated when a logon session is destroyed. It may be positively correlated with a logon event using the Logon ID value. Logon IDs are only unique between reboots on the same computer.",
"@version" => "1",
"@timestamp" => "2016-03-04T20:52:36.434Z",
"beat" => {
"hostname" => "MY-SERVER",
"name" => "MY-SERVER"
},
"category" => "Logoff",
"computer_name" => "MY-SERVER.MY.DOMAIN",
"count" => 1,
"event_id" => 4634,
"level" => "Information",
"log_name" => "Security",
"record_number" => "2135288570",
"source_name" => "Microsoft-Windows-Security-Auditing",
"tags" => [
[0] "windows",
[1] "server",
[2] "exchnage",
[3] "beats_input_codec_plain_applied"
],
"type" => "wineventlog",
"host" => "MYSERVER",
"msg" => "An account was logged off. ",
"security_id" => "S-1-5-21-XXXXXXXXXX-XXXXXXXXXX-XXXXXXXXXX-XXXX",
"account_name" => "myuser ",
"account_domain" => "MYDOMAIN"

And my Logstash config:

input {
  beats {
    port => "5044"
    #type => "wincli-log"
  }
}

filter {
  mutate {
    gsub => [
      "message", "\r\n", " ",
      "message", "\n", " ",
      "message", "\t", " "
    ]
  }
  grok {
    match => { "message" => "%{GREEDYDATA:msg} Subject:.*Security ID:\s*%{NOTSPACE:security_id}\s*Account Name:\s*%{GREEDYDATA:account_name}\s*Account Domain:\s*%{NOTSPACE:account_domain}" }
  }
}

output {
  elasticsearch { hosts => ["elastic1:9200", "elastic2:9200", "elastic3:9200", "elastic4:9200", "elastic5:9200", "elastic6:9200"] }
  stdout { codec => rubydebug }
}

(Andrew Kroh) #8

@dloyd, if you query the enriched documents directly from Elasticsearch using curl or Sense, do you see the added fields?


(Dave) #9

Hi Andrew thanks for the reply!

I figured out the issue and can only blame the user (me) for the
problems I was experiencing. :slight_smile:

David


(Andrew Kroh) #10

@matthieurobin, @dloyd,

An update on this... Winlogbeat has been enhanced to report this data as a field so you will no longer need to grok the message. You can try the feature by using the development build. It will be released with v5. Screenshot here: Reporting Windows Security Events in Kibana
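Once those structured fields arrive, a config can prefer them and keep the grok only as a fallback for older agents. This is a hedged sketch: the `event_data` field name is an assumption about how the v5 development build namespaces the new fields, so inspect a real document from your own cluster before relying on it.

```
filter {
  # If Winlogbeat already shipped structured event data (v5+), use it as-is;
  # otherwise fall back to grokking the flattened message.
  # NOTE: the [event_data] field name is an assumption -- confirm it against
  # an actual v5 document.
  if ![event_data] {
    mutate {
      gsub => [ "message", "\r\n", " ", "message", "\n", " " ]
    }
    grok {
      match => { "message" => "%{GREEDYDATA:msg} Subject:.*Account Name:\s*%{GREEDYDATA:account_name}\s*Account Domain:\s*%{NOTSPACE:account_domain}" }
    }
  }
}
```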

