Hi,
I have tested Filebeat with a single filebeat.yml file and it works perfectly with my processors.
After splitting the configuration into a separate input YAML file, the processors stop working. I have tried a lot of things, and nothing works.
I need the processors to live in their own input file because we need to do different things with each file.
Can someone help me?
filebeat.yml
# Maintained by Chef.
#
filebeat.config.inputs:
  enabled: true
  path: inputs.d/*.yml

output.kafka:
  hosts:
    - kafka1
    - kafka2
    - kafka3
  topic: '%{[@metadata.raw_index]}'
  partition.round_robin:
    reachable_only: false
  required_acks: 1
  compression: gzip
  max_message_bytes: 1000000
  version: 0.10.0.0
  workers: 4
  keep_alive: 60
  channel_buffer_size: 10000
  bulk_max_size: 10000
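For reference, the working single-file setup looked roughly like this (a simplified sketch from memory; the real file used the same input and processors shown further below):

# Sketch of the previous single-file layout (input defined directly in filebeat.yml)
filebeat.inputs:
  - type: log
    paths:
      - /var/event/event*.log
    json.keys_under_root: true
    json.add_error_key: true
    processors:
      - rename:
          ignore_missing: true
          fields:
            - from: "host.name"
              to: "hostname"
      - drop_fields:
          fields: ["host", "input", "agent", "ecs", "log"]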
And the input file in inputs.d/.yml is like this:
# aaa input file.
# Maintained by Chef.
- type: log
  paths:
    - /var/event/event*.log
  exclude_files:
    - \.bz2$
    - \.gz$
  index: "blahblah"
  json.keys_under_root: true
  json.add_error_key: true
  encoding: 'utf-8'
  processors:
    - rename:
        ignore_missing: true
        fields:
          - from: "host.name"
            to: "hostname"
          - from: "message"
            to: "body"
    - drop_fields:
        fields: ["host", "input", "agent", "ecs", "log"]
    - rename:
        fields:
          - from: "hostname"
            to: "host"
I don't see anything strange in the configuration, and I'm following the documentation. The only thing I can think of is that, when inputs are defined in separate files, the fields Filebeat adds (like host.name) are not populated yet at the point where the input-level processors run, and are only added afterwards? But I can't find anything about that in the documentation.
BTW, the error we get is
"error": {
"message": "Failed to rename fields in processor: could not fetch value for key: hostname, Error: key not found"
},
but even if I trim down our processors, the drop_fields one fails too.
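One thing I plan to try next is making the processors tolerant of missing fields, just to confirm whether host.name simply does not exist yet when the input-level processors run. A rough sketch of that test (ignore_missing on drop_fields may need a newer Filebeat version, so treat that option as an assumption):

# Trimmed-down processors for the inputs.d file, tolerant of missing fields
processors:
  - rename:
      ignore_missing: true
      fail_on_error: false
      fields:
        - from: "host.name"
          to: "hostname"
  - drop_fields:
      # ignore_missing here is only available in more recent Filebeat versions
      ignore_missing: true
      fields: ["host", "input", "agent", "ecs", "log"]

If that version stops erroring, it would at least confirm that the fields are missing at this stage rather than the processors block being ignored.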
Thanks!