How to simulate output from Winlogbeat to Logstash for testing


I am trying to write some Logstash rules to capture logon/logoff events on Windows machines.
To extract security events from the Windows Event Viewer, I run something like this in PowerShell:

$starttime = (get-date).addhours(-1)
$data = Get-WinEvent -FilterHashtable @{logname="Security"; starttime=$starttime} | ConvertTo-Json -depth 3
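
Note that `ConvertTo-Json` pretty-prints each event across many lines, which is what forces the multiline handling later. A variant that converts each event separately with `-Compress`, so each record becomes one JSON object per line (the output path here is only an example), sidesteps that problem:

```powershell
$starttime = (Get-Date).AddHours(-1)
Get-WinEvent -FilterHashtable @{logname="Security"; starttime=$starttime} |
    ForEach-Object { ConvertTo-Json -InputObject $_ -Depth 3 -Compress } |
    Out-File -Encoding utf8 C:\temp\security.ndjson
```

A file in this one-object-per-line shape can be consumed by a plain `json` codec on the Logstash input, with no multiline codec needed.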

I can write $data to a file, but I am having difficulty streaming it into Logstash for testing. $data is JSON that looks like this:

    {
        "Id":  4624,
        "Version":  2,
        "Qualifiers":  null,
        "Level":  0,
        "Task":  12544,
        "Opcode":  0,
        "Keywords":  -9214364837600034816,
        "RecordId":  626044,
        "ProviderName":  "Microsoft-Windows-Security-Auditing",
        "ProviderId":  "54849625-5478-4994-aa-3e3b0328c30d",
        "LogName":  "Security",
        "ProcessId":  660,
        "ThreadId":  3964,
        "MachineName":  "XXXX.XXXX",
        "UserId":  null,
        "TimeCreated":  "\/Date(1540658683600)\/",
        "ActivityId":  "53e729eb-65f9-0003-e9-e753f965d401",
        "RelatedActivityId":  null,
        "ContainerLog":  "security",
        "MatchedQueryIds":  [ ... ],
        "Bookmark":  { ... },
        "LevelDisplayName":  "Information",
        "OpcodeDisplayName":  "Info",
        "TaskDisplayName":  "Logon",
        "KeywordsDisplayNames":  [
            "Audit Success"
        ],
        "Properties":  [
            { "Value":  "S-1-5-18" },
            ...
        ]
    }

How can I pipe this into Logstash for testing purposes?
I have looked at the multiline codec and the file input plugin, but I don't manage to properly parse the JSON in my test event.

I am running Logstash 5.6.12. Sadly, I didn't manage to get around this issue with the hints in Indexing JSON files from a local directory to elastic.

Thank you for any hint!

Why don't you just add a new field or a tag in Winlogbeat?

You can then divert that data anywhere you like with Logstash for testing.

See here

Once you learn how to divert data with fields or tags, you can move on to locating the data you're actually looking for in more detail. Hope it helps you.

The problem I have is that I can't manage to properly parse the JSON content of a record coming from the Windows Event Viewer (the equivalent of the messages file on Linux).
I am trying to do:

input {
        stdin {
                add_field => {"[fields][document_type]" => "cerntssec"}
                add_field => {"logstash_source" => "logstashps"}
                codec => multiline {
                        pattern => "^{"
                        negate => true
                        what => "previous"
                }
        }
}

Or using the file input plugin, where I put the JSON in a file (/tmp/sec2 in this case):

input {
        file {
                path => "/tmp/sec2"
                sincedb_path => "/dev/null"
                delimiter => "§¶¶§"
                add_field => {"[fields][document_type]" => "cerntssec"}
                add_field => {"logstash_source" => "logstash-qaicapps"}
        }
}



filter {
        json {
                source => "message"
                remove_field => ["message"]
        }
}
So basically, I can't manage to parse JSON content that arrives spread over multiple lines.
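
As a workaround, the pretty-printed $data could be flattened to one JSON object per line before it ever reaches Logstash, so the plain json codec can parse it. A minimal Python sketch of that idea (the file paths and field values are only illustrative, not from a real event):

```python
import json

def to_ndjson(pretty_json_text: str) -> str:
    """Convert pretty-printed JSON (a single object, or an array of
    event records as produced by ConvertTo-Json) into one compact
    JSON object per line, which Logstash's json codec can read."""
    data = json.loads(pretty_json_text)
    events = data if isinstance(data, list) else [data]
    return "\n".join(json.dumps(e, separators=(",", ":")) for e in events)

# Illustrative input: two fake security-event records.
sample = """
[
    { "Id": 4624, "LogName": "Security" },
    { "Id": 4634, "LogName": "Security" }
]
"""
print(to_ndjson(sample))
```

The resulting file can then be fed to the file input (or piped to stdin) without any multiline codec.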

Thank you

I have limited time to respond. A couple of quick questions:

Why use multiline? Is this data not pure JSON? If it is JSON, use the

codec => json

on input.
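
For example, a minimal test pipeline (assuming one JSON object per line arriving on stdin; the rubydebug output is just for inspecting the parsed event):

```
input {
        stdin {
                codec => json
        }
}
output {
        stdout { codec => rubydebug }
}
```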

The json codec will parse the data into key/value pairs.

After that, you can evaluate the data based on field content, etc.

see here Accessing event data and fields | Logstash Reference [8.11] | Elastic
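
For instance, once the json codec has parsed the event, you could route on a field value like this (field names taken from the sample event above; the tag name is just an example):

```
filter {
        if [Id] == 4624 {
                mutate { add_tag => ["logon_success"] }
        }
}
```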

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.