Ingest pipeline - Override datastream

Today, I'm ingesting logs from Kubernetes container pods using the Elastic Agent Kubernetes integration, and it works fine.

Now I have the following use case:
Some specific log records need to be ingested into another index, so I tried to override the data_stream.dataset field in the logs-kubernetes.container_logs@custom ingest pipeline.
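For reference, here is a minimal sketch of the kind of override I mean; the if condition is only illustrative, not my actual logic:

```json
PUT _ingest/pipeline/logs-kubernetes.container_logs@custom
{
  "processors": [
    {
      "set": {
        "field": "data_stream.dataset",
        "value": "my_test_app",
        "if": "ctx.kubernetes?.labels?.app == 'my-test-app'"
      }
    }
  ]
}
```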

The issue is the following mapping exception:

```
org.elasticsearch.index.mapper.MapperParsingException: failed to parse field [data_stream.dataset] of type [constant_keyword] in document with id 'X3MRDYUBU7-AR-97HDwE'. Preview of field's value: 'my_test_app'
```

How could this be implemented?

Hi Jeff,

Welcome to the community!

constant_keyword is a specialization of the keyword field type for the case where all documents in an index have the same value for the field.

It will not accept multiple values.
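As a minimal sketch of why your write fails (the index name here is just an example): with a constant_keyword mapping, the field's value is fixed for the whole index, so a document carrying a different value is rejected with exactly the exception you saw:

```json
PUT my-test-index
{
  "mappings": {
    "properties": {
      "data_stream": {
        "properties": {
          "dataset": {
            "type": "constant_keyword",
            "value": "kubernetes.container_logs"
          }
        }
      }
    }
  }
}

# This document conflicts with the constant value above,
# so indexing it fails with a mapper_parsing_exception:
POST my-test-index/_doc
{
  "data_stream": { "dataset": "my_test_app" }
}
```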

And you cannot override the target index from within an integration.

Routing documents based on values within the documents themselves is not possible with Elastic Agent; you would need Logstash in between to do that for you.
This is because the Elastic Agent uses an API key, and that API key only allows it to write to a specific data stream.
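If you do put Logstash in between, a minimal sketch of that routing could look like the following; the condition, label names, and dataset value are assumptions for illustration:

```
input {
  # Receive events forwarded by Elastic Agent.
  elastic_agent {
    port => 5044
  }
}

filter {
  # Hypothetical condition: adjust to whatever identifies
  # the records that should go elsewhere.
  if [kubernetes][labels][app] == "my-test-app" {
    mutate {
      replace => { "[data_stream][dataset]" => "my_test_app" }
    }
  }
}

output {
  elasticsearch {
    hosts => ["https://localhost:9200"]
    # Credentials omitted. With data_stream enabled, the output
    # routes each event using its data_stream.* fields, so the
    # records matched above land in a different data stream.
    data_stream => "true"
  }
}
```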

Ok, I understand, thanks for the reply.
