Elasticsearch module issue with field limit

Hi,

I use Metricbeat to monitor Elasticsearch; Metricbeat runs in a Docker container. This is my metricbeat.yml:

metricbeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    # Reload module configs as they change:
    reload.enabled: false

processors:
  - add_cloud_metadata: ~
  - add_docker_metadata: ~

metricbeat.modules:
- module: elasticsearch
  xpack.enabled: true
  period: 10s
  hosts: ["https://ELASTIC1:9200", "https://ELASTIC2:9200", "https://ELASTIC3:9200"]
  username: '${ELASTICSEARCH_USERNAME:}'
  password: '${ELASTICSEARCH_PASSWORD:}'
  ssl.certificate_authorities: [ "/certs/ca.crt" ]

output.elasticsearch:
  hosts: '${ELASTICSEARCH_HOSTS:}'
  username: '${ELASTICSEARCH_USERNAME:}'
  password: '${ELASTICSEARCH_PASSWORD:}'
  ssl.certificate_authorities: [ "/certs/ca.crt" ]
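
For context, the container is started roughly like this (image tag, mount paths, and credential values below are placeholders from memory, not the exact values):

# roughly how the Metricbeat container is started (tag and paths are placeholders)
docker run -d --name metricbeat \
  -v "$PWD/metricbeat.yml:/usr/share/metricbeat/metricbeat.yml:ro" \
  -v "$PWD/certs:/certs:ro" \
  -e ELASTICSEARCH_HOSTS='["https://ELASTIC1:9200","https://ELASTIC2:9200","https://ELASTIC3:9200"]' \
  -e ELASTICSEARCH_USERNAME=... \
  -e ELASTICSEARCH_PASSWORD=... \
  docker.elastic.co/beats/metricbeat:7.x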

It worked fine for a few days, but after a restart of the Docker container I get the following error:

Truncated output:
elasticsearch/client.go:405 Cannot index event publisher.Event{Content:beat.Event{Timestamp:time.Time{wall:0xc042085d74feeae2, ext:386406978238, loc:(*time.Location)(0x55c975d62540)}, Meta:{"index":".monitoring-es-7-mb"}, Fields:{"agent":{"ephemeral_id ....... Cache:publisher.EventCache{m:common.MapStr(nil)}} (status=400): {"type":"mapper_parsing_exception","reason":"failed to parse","caused_by":{"type":"illegal_argument_exception","reason":"Limit of total fields [1000] has been exceeded while adding new fields [656]"}}

This error appears roughly every 5 seconds. I have also deleted all .monitoring-es-7 indices (roughly as shown below), but the issue still appears.
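
The deletion was done approximately like this (exact index pattern from memory; host and credentials as in the config above):

# delete the Metricbeat monitoring indices (pattern from memory)
curl --cacert /certs/ca.crt -u "$ELASTICSEARCH_USERNAME:$ELASTICSEARCH_PASSWORD" \
  -X DELETE "https://ELASTIC1:9200/.monitoring-es-7*"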

How can I fix this?
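
Would raising index.mapping.total_fields.limit on the monitoring index be the right approach, or does that just hide the underlying problem? A sketch of what I have in mind (untested; index name taken from the error message above):

# untested sketch: raise the total fields limit on the monitoring index
curl --cacert /certs/ca.crt -u "$ELASTICSEARCH_USERNAME:$ELASTICSEARCH_PASSWORD" \
  -X PUT "https://ELASTIC1:9200/.monitoring-es-7-mb/_settings" \
  -H 'Content-Type: application/json' \
  -d '{ "index.mapping.total_fields.limit": 2000 }'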

Best Regards