Drop multiple fields with regex on field names using filebeat


I'm trying to send a big JSON file through Filebeat to Elasticsearch, but I have too many fields so I want to drop a lot of them.

I saw in the docs that I can do this with a Filebeat processor, and I did something like this:


```yaml
processors:
  - drop_fields:
      when:
        or:
          - regexp:
              thefield_name_randomstring: "test."
          - regexp:
              thefield_name_itsthebest: "test."
      fields: ["thefield_name_itsthebest", "thefield_name_randomstring"]
```

It works fine, but since I want to drop a lot of fields, I wanted to use a regex to match them all, so something like this:


```yaml
processors:
  - drop_fields:
      when:
        regexp:
          thefield_name_: ""
      fields: ["thefield_name_*"]
```

That way I could drop all field names starting with "thefield_name_", and I don't care about the values of the fields. But it does not work :frowning:

Is there a way to do this? Or am I going about it the wrong way and shouldn't use Filebeat processors for that?

Note that I can't use Logstash :frowning:

Thank you !

I don't think you can select the fields to drop via regex. There is also an include_fields processor, which drops all fields but the configured ones.
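For illustration, an `include_fields` config would look something like the sketch below (the field names are made up; you'd list the fields you actually want to keep):

```yaml
processors:
  - include_fields:
      # Keep only these fields; everything else is dropped
      # (a few metadata fields like @timestamp are always kept).
      fields: ["host", "message", "mycustom.keepme"]
```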

Hm ok thank you..

The thing is that I want to keep something like 250 fields and drop 300, so this is not great haha. I'm guessing this is a common problem, so how is it usually solved? A better mapping? Using Logstash instead of Filebeat?

Wow, that's quite a number of fields. The prune filter in logstash seems to support regular expressions. Feel free to open an enhancement request for beats to provide the same functionality.
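For reference, a minimal Logstash prune filter using a regex on field names (assuming the prefix from the earlier posts) could look like this sketch:

```
filter {
  prune {
    # Drop every field whose name matches this regex.
    blacklist_names => ["^thefield_name_"]
  }
}
```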

Yeah i know :frowning:

Ok! Thanks very much for the answers

This topic was automatically closed after 21 days. New replies are no longer allowed.