Windows Logs via Elastic Agent

Hello,

I am facing an issue with Windows logs when using Elastic Agent: the logs are not being sent to Kibana. However, Winlogbeat works correctly and is able to send the logs.
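
For comparison, a Winlogbeat setup along these lines ships the same Security and Application channels without problems (the host and certificate path below are placeholders, not my real values):

# winlogbeat.yml (simplified): read the Security and Application event logs
# and send them to Logstash over TLS.
winlogbeat.event_logs:
  - name: Security
    ignore_older: 72h
  - name: Application
    ignore_older: 24h

output.logstash:
  hosts: ["xxxxxx:5044"]
  ssl:
    enabled: true
    certificate_authorities:
      - "C:/Program Files/Winlogbeat/certs/ca.crt"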

This is my elastic-agent configuration:

agent:
  logging:
    level: debug
  monitoring:
    enabled: false
    logs: false
    metrics: false
outputs:
  default:
    type: logstash
    hosts: ["xxxxxx:5044"]
    ssl:
      enabled: true
      certificate_authorities:
        - "C:/Program Files/Elastic/Agent/certs/ca.crt"
inputs:
  - type: winlog
    id: winlog-security
    use_output: default
    data_stream:
      dataset: windows.security
      namespace: default
    name: Security
    ignore_older: 72h
    api: eventlog
    processors:
      - add_fields:
          target: ""
          fields:
            app_type: "windows-ad"
      - add_host_metadata: {}
      - add_process_metadata: {}
        # event_id: 4624,4625,4768,4769,4776,4740

  - type: winlog
    id: winlog-application
    use_output: default
    data_stream:
      dataset: windows.application
      namespace: default
    name: Application
    ignore_older: 24h
    api: eventlog
    processors:
      - add_fields:
          target: ""
          fields:
            app_type: "windows-ad"
      - add_host_metadata: {}
      - add_process_metadata: {}

Is there any suggestion or solution, please?

What does your Logstash configuration look like?

Your output is Logstash, so you need to check whether that configuration is correct and whether there are any errors; check the Logstash logs.
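
One way to narrow it down is to temporarily add a stdout output alongside your existing output, so you can see whether the winlog events reach Logstash at all (a minimal debugging sketch, to be removed once you are done):

output {
  # Temporary debug output: prints every event it receives, including
  # @metadata, to the Logstash console/log. If the winlog events never
  # show up here, they are not leaving the agent; if they do show up,
  # the problem is on the Elasticsearch side of the pipeline.
  stdout {
    codec => rubydebug { metadata => true }
  }
}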

I'm also not sure the indentation of your YAML file is correct; bad indentation alone can prevent an input from being loaded.

Is your agent running? What do you have in its logs?

Thank you for your response.

The Elastic Agent is running and its log is clean, no errors or warnings!

The same goes for Logstash!

You need to share the Logstash configuration and the logs as well; it is not possible to provide any insight without this information.

The ELK stack itself is working just fine. In the same Elastic Agent configuration I have two types of inputs, one filestream and one winlog: the filestream input works and its data successfully reaches Kibana, while nothing arrives for the Windows logs!
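
For reference, the filestream input in the same file, the one that does reach Kibana, is roughly along these lines (the ids, dataset, and paths here are placeholders, not the real ones):

  - type: filestream
    id: filestream-app-logs
    use_output: default
    data_stream:
      namespace: default
    streams:
      - id: app-logs
        data_stream:
          dataset: app.logs
        paths:
          - "C:/app/logs/*.log"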

This is my Logstash configuration:

input {
  beats {
    port => 5044
    add_field => {
      "source_type" => "elastic-agent"
    }
    ssl_enabled => true
    ssl_key => "/distrib/elk/latest/logstash/config/certs/logstash.key"
    ssl_certificate => "/distrib/elk/latest/logstash/config/certs/logstash.crt"
    #ssl_certificate_authorities => ["/distrib/elk/latest/logstash/config/certs/ca.crt"]
    #ssl_verify_mode => "force_peer" 
  }
}


output {
  elasticsearch {
    hosts => ["xxxxxxx:9200"]
    user => "user"
    password => "*************"
    ssl_enabled => true
    ssl_certificate_authorities => ["/distrib/elk/latest/logstash/config/certs/ca.crt"]
    index => "%{[@metadata][alias]}"
  }
}

Your Logstash configuration does not follow the recommended configuration from the documentation.

First, when using Elastic Agent you should use the elastic_agent input and set enrich => none on it. Second, do not set an index option in the output; the events will be written into data streams, so you only need data_stream => true.

You would need something like this:

input {
  elastic_agent {
    port => 5044
    enrich => none
    ssl_enabled => true
    ssl_certificate_authorities => ["<ca_path>"]
    ssl_certificate => "<server_cert_path>"
    ssl_key => "<server_cert_key_in_pkcs8>"
    ssl_client_authentication => "required"
  }
}

output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    data_stream => "true"
    user => "username"
    password => "password"
    data_stream => true
    ssl_enabled => true
    ssl_certificate_authorities => "<elasticsearch_ca_path>"
  }
}

This works fine for me with Fleet-managed agents; it should work for standalone agents as well.

Thank you.

I will try this and let you know.

When I set data_stream to true, even the filestream inputs are not indexed into Kibana anymore!

What could be the reason?