How to use the Filebeat copy_fields processor with data computed later (received NetFlow data plus computed geo data)?

I'm using Filebeat with the netflow module, so I'm receiving NetFlow data and inserting it into Elasticsearch. All NetFlow fields are described here:

https://www.elastic.co/guide/en/beats/filebeat/current/exported-fields-netflow.html

However, when we analyze the data in Kibana, some other fields are available, like geo.<other_fields> (destination.geo.country_iso_code, source.geo.country_iso_code, etc.). When are these values computed and inserted into the Elasticsearch index? I think they are not in the received data.

Continuing ...

In my analysis I need to identify an external address. So I have processors that compare the received values with my network address ranges and then copy the right value to a new field:

- copy_fields:
    when:
      network:
        netflow.source_ipv4_address: ['XXX.XXX.XXX.0/24', 'YYY.YYY.YYY.0/24']
    fields:
      - from: netflow.destination_ipv4_address
        to: netflow.external_address
    ignore_missing: true
    fail_on_error: false
    ...

I would also like to copy other data in the same way, like destination.geo.country_iso_code, destination.geo.city_name, and destination.as.organization.name.

But if I try to create a processor like the one above for those fields, the values are never copied.
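For reference, this is the kind of processor I tried. The target field name netflow.external_country is hypothetical, and the source field is the ECS geo field that shows up in Kibana; this sketch never produces a value, presumably because the geo fields don't exist yet at the time Filebeat processors run:

```yaml
- copy_fields:
    when:
      network:
        netflow.source_ipv4_address: ['XXX.XXX.XXX.0/24', 'YYY.YYY.YYY.0/24']
    fields:
      # destination.geo.* is not present in the event at this stage,
      # so there is nothing for copy_fields to copy.
      - from: destination.geo.country_iso_code
        to: netflow.external_country
    ignore_missing: true
    fail_on_error: false
```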

NetFlow is a protocol that does not export information like that (AS names, cities, and countries involved). I think that data is computed after the flows are received.
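If I understand correctly, this enrichment happens on the Elasticsearch side, in the ingest pipeline installed by the Filebeat netflow module, using a geoip processor. A rough sketch of what I believe that step looks like (this is my assumption, not the module's exact pipeline definition):

```json
{
  "processors": [
    {
      "geoip": {
        "field": "destination.ip",
        "target_field": "destination.geo",
        "ignore_missing": true
      }
    }
  ]
}
```

If that is right, it would explain why the field is empty in my Filebeat processor: Filebeat processors run before the event reaches the ingest pipeline.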

Is there some way to copy that computed data to a new field, the same way I copy the data received by Filebeat (the addresses, as in the processor above)?
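In case it helps frame the question: I imagine something like the following would work if placed in an ingest pipeline after the geoip processor has run. The field name netflow.external_country is hypothetical, and I'm assuming the set processor's copy_from option is available here:

```json
{
  "set": {
    "if": "ctx.destination?.geo?.country_iso_code != null",
    "field": "netflow.external_country",
    "copy_from": "destination.geo.country_iso_code",
    "ignore_failure": true
  }
}
```

But I don't know whether customizing the module's pipeline is the intended approach, or whether there is a way to do it from Filebeat itself.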

Thanks!!