Shipping Logs to Logstash not Working

I thought I would take a moment today to see how I could ship Windows logs from endpoints to Elastic. I set up a basic Winlogbeat config, shown below:

winlogbeat.event_logs:
  - name: Application
    #ignore_older: 24h
  - name: Security
    #ignore_older: 24h
  - name: System
    #ignore_older: 24h
  - name: Windows PowerShell
    #ignore_older: 24h

output.logstash:
  enabled: true
  hosts: [""]
  index: winlogbeat*

I then set up a very basic Logstash config on my remote Logstash/Elasticsearch instance:

input {
  beats { port => 905 }
} #close input block

filter {

} #close filter block

output {
  # stdout { codec => rubydebug }
  elasticsearch { hosts => [""] index => "winlogbeat*" }

} #close output block

I imported the template like this:

[root@HOST ~]# curl -XPUT 'localhost:9200/_template/winlogbeat*' -d@./winlogbeat.template.json
{"acknowledged":true}

GET _cat/templates
contianmenttemplate_1 containment-* 0 
winlogbeat*           winlogbeat-*  0 
logstash              logstash-*    0 50001
template_1            te*           0 
filebeat              filebeat-*    0 

I checked for the index:

GET _cat/indices
yellow open winlogbeat       _a3GZ2snRjWCCyquIWfpOw 5 1        0  0    810b    810b

I only see output when I use the stdout output; nothing ever reaches Elasticsearch. What did I miss?

I resolved this by redoing all the steps, this time specifying an explicit index name and not using a * in the template creation.
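For anyone following along, that template step, with the wildcard dropped from the template name (host and file path as in the earlier post), would look like:

```shell
# Register the index template under a plain name; the wildcard belongs
# in the template body's "template" field, not in the template's name.
curl -XPUT 'localhost:9200/_template/winlogbeat' -d@./winlogbeat.template.json
```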

See the documentation for the index setting. You were literally using winlogbeat* as the index name, rather than a daily index pattern.
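In other words (sketching against the config above, with the hosts value left blank as before), the elasticsearch output should use Logstash's date sprintf syntax so events land in daily indices:

```conf
output {
  elasticsearch {
    hosts => [""]
    # expands per event to a daily index such as winlogbeat-2017.01.15,
    # instead of writing everything to one literal index named "winlogbeat*"
    index => "winlogbeat-%{+YYYY.MM.dd}"
  }
}
```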

And for the Logstash config to use with Beats, see Setting Up Logstash [For Use with Beats].
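The Beats example from that guide looks roughly like this (values are the documented defaults; adjust hosts for a remote cluster):

```conf
input {
  beats {
    port => 5044   # the conventional Beats port
  }
}
output {
  elasticsearch {
    hosts => "localhost:9200"
    manage_template => false   # template was already loaded manually with curl
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    document_type => "%{[@metadata][type]}"
  }
}
```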

I realized that mistake after a short time and corrected it, and then it started working! What do you know about enterprise deployments of Winlogbeat? Say I was interested in replacing Snare or the Splunk forwarder?


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.