Today, I'm ingesting logs from Kubernetes pods using the Elastic Agent Kubernetes integration. It works fine.
Now I have the following use case:
Some specific log records need to be ingested into another index, so I tried to override the `data_stream.dataset` field in the ingest pipeline `logs-kubernetes.container_logs@custom`.
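For reference, this is roughly the custom pipeline I tried. The condition and dataset value are just illustrations of my setup (I match on a hypothetical `kubernetes.labels.app` value), not a working solution:

```json
PUT _ingest/pipeline/logs-kubernetes.container_logs@custom
{
  "processors": [
    {
      "set": {
        "if": "ctx.kubernetes?.labels?.app == 'my_test_app'",
        "field": "data_stream.dataset",
        "value": "my_test_app"
      }
    }
  ]
}
```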
The issue is that indexing then fails with the following mapping exception: `org.elasticsearch.index.mapper.MapperParsingException: failed to parse field [data_stream.dataset] of type [constant_keyword] in document with id 'X3MRDYUBU7-AR-97HDwE'. Preview of field's value: 'my_test_app'`
How could this be implemented?