Hello,
I'm trying to send a big JSON file through Filebeat to Elasticsearch, but the documents have too many fields, so I want to drop a lot of them.
I saw in the docs that I can do this with a Filebeat processor, so I did something like this:
processors:
  - drop_fields:
      when:
        or:
          - regexp:
              thefield_name_randomstring: "test."
          - regexp:
              thefield_name_itsthebest: "test."
      fields: ["thefield_name_itsthebest", "thefield_name_randomstring"]
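To make the intent of that config concrete, here is a small Python sketch of what it does (the event and field values are made-up examples, not real Filebeat output):

```python
import re

def drop_fields(event, fields, pattern):
    """Mimic the drop_fields processor above: if either field's value
    matches the regexp, drop both listed fields from the event."""
    if any(re.search(pattern, str(event.get(f, ""))) for f in fields):
        return {k: v for k, v in event.items() if k not in fields}
    return event

event = {
    "message": "hello",
    "thefield_name_randomstring": "test1",
    "thefield_name_itsthebest": "other",
}
cleaned = drop_fields(
    event,
    ["thefield_name_itsthebest", "thefield_name_randomstring"],
    r"test.",
)
# "test1" matches "test.", so both fields are dropped
```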
This works fine, but since I want to drop a lot of fields, I wanted to use a regex to match them, something like this:
processors:
  - drop_fields:
      when:
        regexp:
          thefield_name_: ""
      fields: ["thefield_name_*"]
The goal is to drop every field whose name starts with "thefield_name_", regardless of the field's value. But this does not work.
Is there a way to do this? Or am I going about it the wrong way and shouldn't be using Filebeat processors for this at all?
Note that I can't use Logstash.
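To be explicit about the behaviour I'm after, here is a sketch of it in Python (the prefix and event are made-up examples; this matches on field *names*, not values):

```python
def drop_by_prefix(event, prefix):
    """Drop every field whose name starts with the given prefix,
    regardless of the field's value."""
    return {k: v for k, v in event.items() if not k.startswith(prefix)}

event = {
    "message": "hello",
    "thefield_name_a": 1,
    "thefield_name_b": 2,
}
cleaned = drop_by_prefix(event, "thefield_name_")
# → {"message": "hello"}
```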
Thank you !