Ahh, take a look at this post. Now it makes more sense.
With Filebeat, the whole log line gets shipped as the `message` field and is only parsed by the ingest pipeline on the Elasticsearch side. The parsed fields are therefore not yet available to the `drop_event` processor on the harvester side, so it cannot find the field and never fires, and that is probably what is producing those error logs.
You will need to use a different approach.
For example `exclude_lines`, or a `drop_event` processor with a regexp on the raw `message` field; see the sketch below.
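Here is a minimal sketch of the `exclude_lines` approach (the input type and path are just placeholders for your setup). It matches against the raw line at the harvester, before any parsing happens:

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/nginx/access.log   # placeholder path
    # Lines matching any of these regexps are dropped at the harvester,
    # before the event is ever sent to Elasticsearch.
    exclude_lines: [' 200 ']
```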
NOTE: I got this to work in the nginx.yml:

```yaml
- module: nginx
  # Access logs
  access:
    enabled: true
    # Set custom paths for the log files. If left empty,
    # Filebeat will choose the paths depending on your OS.
    var.paths: ["/Users/sbrown/workspace/sample-data/nginx/nginx.log"]
    input:
      processors:
        - add_locale: ~
        - drop_event.when.regexp.message: " 200 "
```
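In case the compact dotted key looks odd: it should be equivalent to the standard nested processor syntax from the Filebeat docs, which reads like this:

```yaml
input:
  processors:
    - add_locale: ~
    # Drop the whole event when message matches the regexp
    - drop_event:
        when:
          regexp:
            message: " 200 "
```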
BTW I had to add the `add_locale` processor explicitly. It seems the module adds it automatically, but it needs to be defined explicitly once you add another processor; perhaps that is a minor bug.