Winlogbeat via Logstash into Elasticsearch

Hi,

I have winlogbeat on a Windows machine and it's working fine so far. At the moment all logs are going into the same index in Elasticsearch. Yes, I have searched and found topics with that question, but I was unable to solve it.
I want Logstash to save the winlogbeat documents into a different index.

I have this at the moment:

output {
#  file {
#    path => "/tmp/out.log"
#  }
  elasticsearch { hosts => ["localhost:9200"]
     hosts => "localhost:9200"
     index => "logstash-%{+YYYY.MM.dd}"
  }
     index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
     document_type => "%{[@metadata][type]}"
  }
#  if [@metadata][cthostmeta] == "ELKSTACK" {
#    file {
#       codec => rubydebug { metadata => true}
#       path => "/tmp/cthost_out.log"
#    }
#   }
}

Thanks for your help.

What does your input section in Logstash look like? You can tag the beats input events with a winlogbeat tag and then use that in the output section with a conditional.

Hi jsvd,

That sounds great. The input in Logstash looks like this:

syslog {
    port => 5514
    type => "syslog"
}
udp {
    type => "pfsense"
    port => 5140
}
beats {
    type => "log"
    port => 5044
}
udp {
    type => "syslog"
    port => 5515
}

Can you give an example of how I have to do that?

Looking at the winlogbeat docs, it seems that if you don't set the type, then it will be filled in by winlogbeat with either wineventlog or eventlogging: https://www.elastic.co/guide/en/beats/winlogbeat/current/exported-fields-common.html

OK, but this doesn't help me create different indices, does it?
I have a Windows Server 2016 which is sending the events.

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["192.168.0.XX:5044"]
output {
  if [type] == "winlogbeat" {
    elasticsearch { index => "logstash-winlogbeat-%{YYYY.mm.dd}"
  } else {
    elasticsearch { } # normal flow
}

I get the following error.

[2017-06-20T18:55:07,147][ERROR][logstash.agent           ] Cannot create pipeline {:reason=>"Expected one of #, => at line 33, column 17 (byte 534) after output {\n#  file {\n#    path => \"/tmp/out.log\"\n#  }\n if [type] == \"winlogbeat\" {\n    elasticsearch { index => \"logstash-winlogbeat-%{YYYY.mm.dd}\"\n  } else {\n  elasticsearch "}


> output {
> #  file {
> #    path => "/tmp/out.log"
> #  }
>  if [type] == "winlogbeat" {
>     elasticsearch { index => "logstash-winlogbeat-%{YYYY.mm.dd}"
>   } else {
>   elasticsearch { hosts => ["localhost:9200"]
>      hosts => "localhost:9200"
>      index => "logstash-%{+YYYY.MM.dd}"
>    }
> # }
> #  if [@metadata][cthostmeta] == "ELKSTACK" {
> #    file {
> #       codec => rubydebug { metadata => true}
> #       path => "/tmp/cthost_out.log"
> #    }
> #   }
> }

Somewhere I have a mistake... I have tried different variants...

Both elasticsearch plugins are missing a }.

elasticsearch { index => "logstash-winlogbeat-%{YYYY.mm.dd}"

should be

elasticsearch { index => "logstash-winlogbeat-%{YYYY.mm.dd}" }

Also,
elasticsearch { hosts => ["localhost:9200"]

 hosts => "localhost:9200"
 index => "logstash-%{+YYYY.MM.dd}"

}

has hosts twice.
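Putting those two fixes together, that output section would look something like this (keeping the same hosts and index names from your snippet; note also that the date in the index name needs a leading + and uppercase MM for months, i.e. %{+YYYY.MM.dd}, otherwise Logstash treats it as a field reference):

```
output {
  if [type] == "winlogbeat" {
    elasticsearch {
      hosts => "localhost:9200"
      index => "logstash-winlogbeat-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => "localhost:9200"
      index => "logstash-%{+YYYY.MM.dd}"
    }
  }
}
```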

Thanks, now I'm back online.

Hmm, but it didn't create a new index for winlogbeat. If I do GET /_cat/indices?v in Kibana, I only see my logstash indices.
They are coming in now like this:

t  _index    logstash-2017.06.20
#  _score    -
t  _type     wineventlog

I probably also have a problem with one field. How should I go about fixing that?
Mapping conflict field:
event_data.param1 conflict

Update:
I can't create a new index pattern in Kibana with: logstash-winlogbeat-*

These seem to be problems with how Elasticsearch is configured and the data already in the indices.
Can you post the error messages somewhere?

Also, you can confirm which events are going where by putting a stdout { codec => rubydebug } before an elasticsearch { .. } block; that will print the events going to that elasticsearch output.

I have put that before the elasticsearch output.

Sorry, I'm new to the whole Elasticsearch stack. Which logs do you mean?

I also found the following:

[2017-06-21T13:51:17,545][INFO ][logstash.inputs.beats ] Beats inputs: Starting input listener {:address=>"0.0.0.0:5044"}

EDIT:
Where can I see the result from stdout now?

You can place the stdout block in different places to see the events that reach that section. Example:

input {
  beats { }
  tcp {}
}
output {
  stdout { codec => rubydebug } # option 1: here all events will be logged
  if [type] == "winlogbeat" {
    stdout { codec => rubydebug } # option 2: here only events with that type will be logged
    elasticsearch { index => "logstash-winlogbeat-%{+YYYY.MM.dd}" }
  } else {
    elasticsearch { index => "logstash-%{+YYYY.MM.dd}" } # normal flow
  }
}

If you don't see events being printed to stdout, it either means data is not arriving at the inputs, or your conditional is not being met (e.g. type isn't really winlogbeat).

I understand that. My problem is that I don't know where I can look at the stdout output.

I have an installation on CentOS 7 based on this tutorial: https://www.howtoforge.com/tutorial/how-to-install-elastic-stack-on-centos-7/

I probably also found a problem. I set things up with the certificate, so maybe there is now a problem with that.

Ah, I didn't realise you were using the rpm packages. Then you can replace

stdout { codec => rubydebug }

with something like

file { path => '/tmp/logstash.output.debug.txt' codec => rubydebug }

and watch that file for the debug output.

Ahhh, so now I got it.

You wanted to see this:

"record_number" => "50939",
   "event_data" => {
       "Workstation" => "XXXXXLP",
            "Status" => "0x0",
       "PackageName" => "MICROSOFT_AUTHENTICATION_PACKAGE_V1_0",
    "TargetUserName" => "XXXXX"
},
      "message" => "Es wurde versucht, die Anmeldeinformationen für ein Konto zu überprüfen.\n\nAuthentifizierungspaket:\tMICROSOFT_AUTHENTICATION_PACKAGE_V1_0\nAnmeldekonto:\tMarc.Walter\nArbeitsstation:\tHISPEEDLP\nFehlercode:\t0x0",
         "type" => "wineventlog",
       "opcode" => "Info",
         "tags" => [
    [0] "beats_input_codec_plain_applied"
],
    "thread_id" => 180,
   "@timestamp" => 2017-06-22T20:08:34.398Z,
         "task" => "Überprüfung der Anmeldeinformationen",
     "event_id" => 4776,
"provider_guid" => "{54849625-5478-4994-A5BA-3E3B0328C30D}",
     "@version" => "1",
         "beat" => {
    "hostname" => "XXXXXX",
        "name" => "XXXXXX",
     "version" => "5.4.0"
},
  "activity_id" => "{13ED4890-D0D8-0003-9C48-ED13D8D0D201}",
         "host" => "XXXXXXXX",
  "source_name" => "Microsoft-Windows-Security-Auditing"
}
{
   "process_id" => 760,
"computer_name" => "XXXXXXX",
     "keywords" => [
    [0] "Überwachung erfolgreich"
],
        "level" => "Informationen",
     "log_name" => "Security",
    "@metadata" => {
    "beat" => "winlogbeat",
    "type" => "wineventlog"
},
"record_number" => "50940",

OK, so we confirmed that data from winlogbeat has type == "wineventlog". Now you can write the proper conditional, like:

output {
  if [type] == "wineventlog" {
    elasticsearch { index => "logstash-eventlog-%{+YYYY.MM.dd}" }
  } else {
    elasticsearch { index => "logstash-%{+YYYY.MM.dd}" }
  }
}
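After restarting Logstash and letting some Windows events through, you should be able to confirm the new index exists with the same _cat request as before, e.g. in Kibana's Dev Tools (the eventlog index name here assumes the config above):

```
GET /_cat/indices/logstash-eventlog-*?v
```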

I have added it. It now looks like this:

I can't create the index pattern logstash-winlogbeat-* in Kibana.

output {
#  file {
#    path => "/tmp/out.log"
#  }
 if [type] == "wineventlog" {
    elasticsearch { index => "logstash-winlogbeat-%{YYYY.mm.dd}" }
  } else {
  elasticsearch {
     hosts => "localhost:9200"
     index => "logstash-%{+YYYY.MM.dd}"
   }
 }
#  if [@metadata][cthostmeta] == "ELKSTACK" {
#    file {
#       codec => rubydebug { metadata => true}
#       path => "/tmp/logstash_out.log"
#    }
#   }
}

Kibana shows now today as following:

Unfortunately I can't create the index pattern, because I already have a logstash-* index?

It also shows me the eventlog indices in the searches panel: No matching indices found: [index_not_found_exception] no such index, with { resource.type="index_or_alias" & resource.id="logstash-eventlog-" & index_uuid="na" & index="logstash-eventlog-" }
But I receive this message there. When I look at the indices via Kibana, I can't see any indices like logstash-eventlog-*.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.