After upgrading to ELK 8.1, I noticed that every event has an "event.original" field containing all of the log data. This is highly unwanted; how can I prevent this field from being sent by Filebeat?
I tried doing it on Filebeat level using processors:
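The original config was not included; an attempt along these lines would typically use the `drop_fields` processor (a sketch, assuming the field is targeted directly):

```yaml
processors:
  - drop_fields:
      fields: ["event.original"]
      ignore_missing: true
```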
and via remove_field in Logstash.
Neither worked; the field is still visible.
Are you using any filebeat module?
event.original is normally created by the ingest pipeline that some of the Filebeat modules use when parsing the original message. It does not exist in your original event, so you won't be able to remove it in Filebeat or in Logstash; you would need to check the ingest pipeline for the module you are using and remove the field there.
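For illustration, removing the field in an ingest pipeline is done with a `remove` processor. The pipeline name here is a placeholder; the actual name depends on the module in use:

```json
PUT _ingest/pipeline/my-module-pipeline
{
  "processors": [
    {
      "remove": {
        "field": "event.original",
        "ignore_missing": true
      }
    }
  ]
}
```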
My whole filebeat config is as below:
```yaml
- type: filestream
- type: filestream
```
After Filebeat, Logstash parses the logs and sends them to Elasticsearch. Do you mean that the field is added at the Logstash level? How can I delete it?
What is your logstash pipeline? Please share it.
Logstash normally won't add any field unless explicitly configured in the pipeline. However, I'm not running version 8.x, and there were some changes regarding the ECS fields, so I'm not sure whether this is being added by Logstash or not.
How did you try to remove it in Logstash?
Did you have something like this?
```
remove_field => ["[event][original]"]
```
I tried to delete the field like below:

```
remove_field => ["event.original"]
```

and this didn't work.
This solves the issue for me.
Ah yes, this is confusing sometimes.
In Filebeat and Elasticsearch you reference nested fields using dot notation, like event.original, but in Logstash you need to use the [top][nested] syntax, so it should be [event][original].
Using event.original in Logstash would make it try to work with a field with that literal name, with the dot in the name.
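Putting it together, the working filter uses the bracket notation inside a mutate block:

```
filter {
  mutate {
    remove_field => ["[event][original]"]
  }
}
```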
Thank you for a very fast response on a Friday afternoon! Much appreciated!
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.