Standalone Elastic Agent Winlog System Input

I am trying to collect System logs from Windows Servers with a Standalone Elastic Agent. While the necessary System logs are picked up and indexed into Elasticsearch correctly with Winlogbeat, Elastic Agent causes some issues.

Running agent from: elastic-agent-9.1.3-windows-x86_64
Basic content of elastic-agent.yml file:

outputs:
  default:
  type: elasticsearch
  hosts: ["elasticsearch-address"]
  api_key: "elastic-agent-api-key"
  preset: balanced

agent:
  monitoring:
    enabled: false
    use_output: default
    logs: false
    metrics: false
    traces: false
    namespace: default

inputs:
  - id: windows-event-log
    name: System
    type: winlog
    use_output: default
    meta:
      package:
        name: winlog
        version: 2.4.0
    data_stream:
      namespace: default
    streams:
      - name: System
        data_stream:
          dataset: system.system
          type: logs
        condition: '${host.platform} == "windows"'
        event_id: 6006

agent.logging.level: debug

agent.logging.to_stderr: true

providers:
  agent:
    enabled: true
  host:
    enabled: true

I can see that it picks up System data, and an index named logs-system.system-default appears in Elasticsearch, but without a single document. When checking the elastic-agent event log there are errors containing the following message:

{"type":"document_parsing_exception","reason":"[1:539] Cannot write to a field alias [host.hostname]."}, dropping event!","component":{"binary":"filebeat","dataset":"elastic_agent.filebeat","id":"winlog-default","type":"winlog"},"log":{"source":"winlog-default"},"ecs.version":"1.6.0","log.logger":"elasticsearch.elasticsearch","log.origin":{"file.line":535,"file.name":"elasticsearch/client.go"

and all event data goes under "raw_index". How can I solve this issue? Am I missing something in the elastic-agent.yml configuration?

Hi @leandrojmp (or anyone else from the team),

Could someone please take a look at this?

I cannot determine whether it's a misconfiguration on my end or a deeper issue. I've already reviewed the documentation and scanned through several issues, but haven't found a clear solution yet.

If someone from the team could help clarify or guide me in the right direction, I’d really appreciate it!

Thanks in advance :folded_hands:

Have you tried without the condition?

Yes, I have tried without the condition and also without the providers section. I got the same error message.

Hi @s.buksa

What version of the stack are you on?

"Cannot write to a field alias [host.hostname]." This is the issue; all the docs are being dropped.

Have you created your own template or edited the default one? host.hostname should not be set as an alias type, AFAIK. So when host.hostname comes in as a field and the mapped type for that field is alias, the document mapping will have a collision and the document will fail to be written.
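For illustration, here is a minimal sketch of the collision being described (a hypothetical index, not your actual template): in Elasticsearch, an alias field only resolves at search time, so any document carrying a concrete value for it is rejected.

```console
# Hypothetical index: host.hostname is declared as an alias of host.name
PUT /alias-demo
{
  "mappings": {
    "properties": {
      "host": {
        "properties": {
          "name":     { "type": "keyword" },
          "hostname": { "type": "alias", "path": "host.name" }
        }
      }
    }
  }
}

# This write fails with: "Cannot write to a field alias [host.hostname]"
POST /alias-demo/_doc
{
  "host": { "hostname": "my-windows-server" }
}
```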

To be precise, is that an index or a data stream?

Can you get the mapping for that index and share it, specifically the mapping for host.hostname? It should be something like:

GET .ds-logs-system.system-default-2025.09.07-000341
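If the full response is too noisy, it can be narrowed with `filter_path` (the backing-index name here is just the example above; yours will differ):

```console
GET .ds-logs-system.system-default-2025.09.07-000341/_mapping?filter_path=**.host.properties.hostname
```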

And I am confused: you said there was no data, or is this a different problem? Let's fix one thing at a time.

Hi @stephenb

Elastic Stack version: 8.18.3

I haven’t created my own template or edited any existing ones.

The logs-system.system-default index appeared in Elasticsearch after I ran the Elastic Agent, but it doesn't contain any documents. There is no data stream associated with logs-system.system-default

Regarding the raw_index error logged by Elastic Agent: it appears in the file elastic-agent/data/elastic-agent-*/logs/events/elastic-agent-event-log-*.ndjson.

These logs show that the data is picked up successfully, but fails during indexing due to: "Cannot write to a field alias [host.hostname]"

So let's do one thing at a time.

First, I would not use Agent 9.1.3 with stack 8.18; use an agent with the same version as the stack, and especially not a "future" version.
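A quick way to double-check the stack version from Dev Tools, to compare against what `elastic-agent version` reports on the server:

```console
# version.number in the response is the Elasticsearch version;
# the agent should not be newer than this
GET /
```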

That does not make sense... Show me...

Run these and show the results

GET _cat/indices/*system.system*?v

GET _data_stream/logs-system.system-default

Ohh, super important: did you actually add the System Integration in Kibana?
Otherwise the mapping / parsing / ingest pipeline will not work!
You have to actually install it.
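If you prefer not to click through the UI, the integration package can also be installed via the Fleet API. This is a sketch, assuming an 8.x Kibana; adjust the package version to what your stack offers. Recent Dev Tools versions can proxy to Kibana with the `kbn:` prefix; otherwise send the request to the Kibana host directly with a `kbn-xsrf` header.

```console
# Installs the "system" integration assets (index templates, ingest pipelines, etc.)
POST kbn:/api/fleet/epm/packages/system/2.5.4
```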

Here is a configuration that I have running; this one is from Fleet-managed, but it should be the same.

Here is one I generated for standalone.
You can generate these through the UI: just create a blank policy, add the integration, then add a standalone agent...

inputs:
  - id: winlog-system-ec05beac-9137-41ed-98fa-38d0d30a8697
    name: system-logs-only
    revision: 2
    type: winlog
    use_output: default
    meta:
      package:
        name: system
        version: 2.5.4
    data_stream:
      namespace: default
    package_policy_id: ec05beac-9137-41ed-98fa-38d0d30a8697
    streams:
      - id: winlog-system.system-ec05beac-9137-41ed-98fa-38d0d30a8697
        name: System
        data_stream:
          dataset: system.system
          type: logs
        condition: ${host.platform} == 'windows'
        ignore_older: 72h

BTW, your outputs above do not look correctly indented.

outputs:
  default:
    api_key: <REDACTED>
    hosts:
    - https://mydeployment12345.us-west1.gcp.cloud.es.io:443
    preset: balanced
    type: elasticsearch

The output indentation in the configuration is correct - I believe it was lost here during formatting.
Thanks a lot for the rest of the information. :folded_hands: I'll double check everything tomorrow.


You can do this from the UI

@6mil Welcome to the community.

Assuming your response was not AI generated...

It's a good question / clarification.

There is a difference, and often confusion, between:

Windows system logs
vs
Windows event logs

Your response is about event logs, but I think the original poster is interested in system logs.

Let's see...