Hello, is it possible to send the same event to multiple indices in the same Elasticsearch cluster?
Example filebeat.yml:
filebeat.config.inputs:
  enabled: true
  path: ${path.config}/inputs/*.yml
  reload:
    enabled: true
    period: 10s

processors:
  - decode_json_fields:
      fields: ["metadata.country"]

output:
  elasticsearch:
    hosts: ["https://xxx:9200"]
    indices:
      - index: "filebeat-7.12.0-app1"
        when.equals:
          component: "app1"
      - index: "filebeat-usa"
        when.equals:
          metadata.country: "usa"
And this is the Filebeat input:
- type: log
  paths:
    - /var/log/apps/aaa.json
  fields:
    component: app1
    team: xxx
  fields_under_root: true
  scan_frequency: 10s
  json.keys_under_root: true
  json.add_error_key: true
Test command:
echo '{"outer":"value","metadata.country":"usa","inner":"{\"data\":\"value\"}"}' >> /var/log/apps/aaa.json
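For reference, here is roughly what that appended line parses to (a plain `json.loads` sketch, not Filebeat itself; with `json.keys_under_root: true` Filebeat would lift these keys to the top level of the event):

```python
import json

# The line appended by the echo command above (the shell \" escapes
# end up as plain " inside the file, so this is valid JSON).
line = '{"outer":"value","metadata.country":"usa","inner":"{\\"data\\":\\"value\\"}"}'

event = json.loads(line)
print(event["metadata.country"])   # → usa
print(event["inner"])              # inner is itself a JSON-encoded string
print(json.loads(event["inner"]))  # → {'data': 'value'}
```

Note that `metadata.country` arrives as a single flat key whose value is the plain string "usa", not nested JSON.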
The expected result is the same event in both indices: filebeat-7.12.0-app1 (first match, via the component field) and filebeat-usa (second match, via the processors and the decoded metadata.country field).
The actual result: the event goes only to the first matching index, which is correct according to the docs. So, is it possible to send the same event to multiple indices without Logstash?
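To make the observed behavior concrete, here is a minimal sketch (hypothetical illustration, not Filebeat code) of how `output.elasticsearch.indices` rules behave per the docs: conditions are evaluated in order and the first match decides the index, so a single event is never routed to two indices:

```python
def route(event, rules, default_index):
    """Return the single target index chosen for an event."""
    for rule in rules:
        field, expected = rule["when_equals"]
        if event.get(field) == expected:
            # First match wins; later rules are never consulted.
            return rule["index"]
    return default_index

# The two rules from the filebeat.yml above.
rules = [
    {"index": "filebeat-7.12.0-app1", "when_equals": ("component", "app1")},
    {"index": "filebeat-usa", "when_equals": ("metadata.country", "usa")},
]

# The test event satisfies BOTH conditions, yet only one index is chosen.
event = {"component": "app1", "metadata.country": "usa", "outer": "value"}
print(route(event, rules, "filebeat-default"))  # → filebeat-7.12.0-app1
```

This is exactly why the event shows up only in filebeat-7.12.0-app1: the `component: "app1"` condition matches first and the filebeat-usa rule is skipped.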
I know it's possible with Logstash, but I don't want to introduce another component into the infrastructure.