Unable to index winlogbeat using logstash

I am trying to set up Winlogbeat to ship data to a Logstash server, which forwards to a second Logstash server and then to Elasticsearch. When I add index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}" to the elasticsearch output plugin, the metadata fields are not accessible. I am guessing the problem is that Logstash removes the metadata fields. I found this troubleshooting tip, but I don't understand how to apply it: https://www.elastic.co/guide/en/beats/winlogbeat/current/metadata-missing.html
Winlogbeat configuration:

winlogbeat.event_logs:
  - name: Application
    ignore_older: 72h
  - name: System
  - name: Security
  - name: Windows PowerShell
  - name: Microsoft-Windows-PowerShell/Operational
  - name: Microsoft-Windows-Windows Defender/Operational
  - name: Microsoft-Windows-Windows Firewall With Advanced Security/Firewall
  - name: ForwardedEvents
    tags: [forwarded]

setup.template.enabled: false

output.logstash:
  hosts: ["192.168.243.146:5044"]
  index: winlogbeat

#output.elasticsearch:
#  hosts: ["https://192.168.243.143:9200"]
#  username: "elastic"
#  password: "l1vEHXlK_zcwB*94rF4q"
#  index: winlogbeat
#  ssl.certificate_authorities: ['C:\Program Files\winlogbeat-8.4.1-windows-x86_64\http_ca.crt']


processors:
  - add_host_metadata:
      when.not.contains.tags: forwarded
  - add_cloud_metadata: ~

Logstash 1 configuration (Logstash to Logstash):

input {
  beats {
    port => 5044
  }
}

output {
  lumberjack {
    hosts => ["192.168.243.145"]
    port => 5044
    ssl_certificate => 'C:\Users\Administrator\Desktop\logstash-8.4.2-windows-x86_64\logstash-8.4.2\config\certs\lumberjack.cert'
    codec => json
  }
  stdout { codec => rubydebug { metadata => true } }
}

Logstash 2 configuration (Logstash to Elasticsearch):

input {
  beats {
    port => 5044
    ssl => true
    codec => json
    ssl_certificate => 'C:\Users\Administrator\Desktop\logstash-8.4.2-windows-x86_64\logstash-8.4.2\config\certs\lumberjack.cert'
    ssl_key => 'C:\Users\Administrator\Desktop\logstash-8.4.2-windows-x86_64\logstash-8.4.2\config\certs\lumberjack.key'
  }
}

output {
  elasticsearch {
    hosts => ["https://192.168.243.143:9200"]
    ssl => "true"
    cacert => 'C:\Users\Administrator\Desktop\logstash-8.4.2-windows-x86_64\logstash-8.4.2\config\certs\http_ca.crt'
    user => "elastic"
    password => "l1vEHXlK_zcwB*94rF4q"
    #pipeline => "%{[@metadata][pipeline]}"
    manage_template => false
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
    #document_type => "%{[@metadata][type]}"
  }
  stdout { codec => rubydebug { metadata => true } }
}

Index that is created in Kibana:
[screenshot of the resulting index in Kibana]

As explained in the documentation, you need to create a new field in your first Logstash that holds the value of the @metadata field you want to use.

To preserve @metadata fields, use the Logstash mutate filter with the rename setting to rename the fields to non-internal fields.

You need something like this in your first Logstash:

filter {
    mutate {
        rename => {
            "[@metadata][beat]" => "[index_name]"
        }
    }
}

Then in your second Logstash you will use:

index => "%{[index_name]}-%{+YYYY.MM.dd}"

If you want to use other @metadata fields, you will need to rename each of them to a new field as well.
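For example, if you also need the Beat version and the ingest pipeline name downstream, you could rename those fields in the same mutate filter. This is just a sketch; the target field names (index_name, beat_version, pipeline_name) are illustrative choices, not required names:

```
filter {
  mutate {
    rename => {
      "[@metadata][beat]" => "[index_name]"
      "[@metadata][version]" => "[beat_version]"
      "[@metadata][pipeline]" => "[pipeline_name]"
    }
  }
}
```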

Thank you, that worked out nicely! What else do I need to do in my config to get the exact same setup as if I were to connect Winlogbeat directly to Elasticsearch? I've loaded the template and pipelines into Elasticsearch so far, but I'm not sure why it is not working. Again, appreciate the help!
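One likely missing piece (a sketch, not verified against your setup): when Winlogbeat writes directly to Elasticsearch, it selects the ingest pipeline via [@metadata][pipeline], which is lost across the Logstash-to-Logstash hop for the same reason as the beat name. Assuming you apply the same rename trick in the first Logstash (e.g. renaming [@metadata][pipeline] to a hypothetical field called pipeline_name), the second Logstash's elasticsearch output could then reference it:

```
output {
  elasticsearch {
    hosts => ["https://192.168.243.143:9200"]
    # ...same ssl/user/password/index settings as above...
    # pipeline_name is a hypothetical field produced by renaming
    # [@metadata][pipeline] in the first Logstash.
    pipeline => "%{[pipeline_name]}"
  }
}
```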

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.