Hi!
I was in the process of upgrading my Filebeat instances from 5.6 to 6.0, but it broke my Logstash pipeline. The reason is, I think, a bug in how the `fields` option from the YAML config file gets merged into events.
My configuration is as follows:
filebeat.prospectors:
- input_type: log
  paths:
    - /xxx/ansible_events/events.log
  fields:
    type: ansible_events
- input_type: log
  paths:
    - /xxx/xxx/log/main.log
  include_lines: ['^\[info\]'] # include info log
  exclude_lines: ['\)$', 'GET /users ', 'GET /ping '] # exclude lines that don't have the Sent information, and also /users and /ping
  fields:
    type: cs_main
- input_type: log
  paths:
    - /var/log/syslog
    - /var/log/auth.log
    - /var/log/kern.log
  fields:
    type: syslog

name: filebeat
fields:
  env: production
  datacenter: yyyy
  group: zzzz

output.logstash:
  hosts: ["x.x.x.x:5044"]

path.data: /xxx/elastic_beats/filebeat/data
path.logs: /xxxx/elastic_beats/filebeat/log
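For reference, with this config the top-level `fields` block and each prospector's own `fields` are merged per event, so in 5.6 an event from events.log would end up with roughly this under `fields` (a sketch built from the keys above):

```yaml
fields:
  env: production
  datacenter: yyyy
  group: zzzz
  type: ansible_events
```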
I do all the sorting on the `fields.type` field. Everything worked fine in 5.6.4: the files syslog, auth.log and kern.log would have `fields.type` set to `syslog`, events.log would have it set to `ansible_events`, and main.log to `cs_main`.
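For context, the Logstash side branches on that field with conditionals along these lines (a simplified sketch, not my exact pipeline; the tags are placeholders):

```
filter {
  if [fields][type] == "ansible_events" {
    mutate { add_tag => ["ansible"] }
  } else if [fields][type] == "cs_main" {
    mutate { add_tag => ["app"] }
  } else if [fields][type] == "syslog" {
    mutate { add_tag => ["system"] }
  }
}
```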
After upgrading to 6.0.0, all my events had `fields.type` set to `syslog`. A quick debugging session showed that Filebeat was applying the `fields` of the last declared prospector to every event. To prove it, I moved a section from the top to the bottom, and all events then got that section's `fields.type`.
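For what it's worth, the symptom (every event inheriting the last prospector's `fields`) is exactly what you would see if one shared map were reused across prospector configs instead of being copied per prospector. A minimal Python sketch of that failure mode, purely illustrative and not Filebeat's actual code:

```python
# Illustrative only: simulates building per-prospector settings.
# Reusing one shared dict (the suspected bug) makes every prospector
# end up with whatever value was written last.

def build_buggy(prospectors):
    shared = {}  # one map reused for every prospector (the bug)
    built = []
    for p in prospectors:
        shared.update(p["fields"])
        built.append(shared)  # every entry references the same dict
    return built

def build_fixed(prospectors):
    # fresh copy per prospector, so each keeps its own fields
    return [dict(p["fields"]) for p in prospectors]

prospectors = [
    {"fields": {"type": "ansible_events"}},
    {"fields": {"type": "cs_main"}},
    {"fields": {"type": "syslog"}},
]

print([p["type"] for p in build_buggy(prospectors)])
# every prospector sees the last declared type: syslog
print([p["type"] for p in build_fixed(prospectors)])
# each prospector keeps its own type
```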