Filebeat index "eslog-%{[elasticsearch.cluster.name]}-%{[fileset.name]}-%{+yyyy.MM.dd}" does not work

output.elasticsearch:
  # Array of hosts to connect to.
  hosts: ["localhost:9200"]
  index: "eslog-%{[elasticsearch.cluster.name]}-%{[fileset.name]}-%{+yyyy.MM.dd}"

I found that the above configuration causes errors because of elasticsearch.cluster.name. How should I configure it?

The errors are as follows:

2021-03-31T12:51:21.609+0800 ERROR [publisher_pipeline_output] pipeline/output.go:180 failed to publish events: temporary bulk send failure

Hi @asasas234,

The variables used in the index name, such as %{[elasticsearch.cluster.name]} or %{[fileset.name]}, need to exist in the published events.

Do your events contain these fields?

You can use the add_fields processor to add custom fields.
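
For example, a minimal sketch of the add_fields processor (the cluster name "my-cluster" here is just a placeholder; target: '' writes the field at the root of the event):

processors:
  - add_fields:
      target: ''
      fields:
        elasticsearch.cluster.name: 'my-cluster'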

In any case, the default indices tend to be fine for most use cases. Is there a reason why you are using a custom index?

@jsoriano Thanks for your reply. I confirmed that elasticsearch.cluster.name exists, as I was able to look it up in ES. When I started with an index name that did not include elasticsearch.cluster.name, the insert succeeded, and when I then looked up the inserted data in ES, it did include elasticsearch.cluster.name.

But do all events contain these fields? It might happen that the errors are produced by events that don't contain these fields, even if other events do.

You can try to use indices, and configure a different index for events that don't contain this field, something like this:

output.elasticsearch:
  hosts: ["localhost:9200"]
  indices:
    - index: "eslog-%{[elasticsearch.cluster.name]}-%{[fileset.name]}-%{+yyyy.MM.dd}"
      when.has_fields: ['elasticsearch.cluster.name']
    - index: "otherlog-%{[fileset.name]}-%{+yyyy.MM.dd}"
      when.not.has_fields: ['elasticsearch.cluster.name']
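
Alternatively, if I remember the behavior correctly, the top-level index setting acts as a fallback when none of the indices rules match, so an equivalent sketch would be:

output.elasticsearch:
  hosts: ["localhost:9200"]
  index: "otherlog-%{[fileset.name]}-%{+yyyy.MM.dd}"
  indices:
    - index: "eslog-%{[elasticsearch.cluster.name]}-%{[fileset.name]}-%{+yyyy.MM.dd}"
      when.has_fields: ['elasticsearch.cluster.name']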

@jsoriano Thanks, I'll try it at some point. In theory this should not happen, because I use the official elasticsearch module and have only enabled server.log collection; the rest I have disabled.