Configuration of transform data with custom pipeline

Hi all,

I am currently using Elastic APM to monitor my application, and I noticed that the transaction data created by the APM agent is automatically stored in an Elasticsearch index. I want to enrich every document before it is written into the index. I found an official tutorial showing how to do this, but I cannot find the configuration option in my UI.

Tutorial Link: Tutorial: Transform data with custom ingest pipelines | Fleet and Elastic Agent Guide [8.7] | Elastic
Elastic Stack version 8.8
Currently, I have step 1 done and am stuck on step 2, adding the ingest pipeline.

I cannot find the data stream to add the ingest pipeline to.

Hi, @JasonREC.

That tutorial is designed for the System integration. Take a look at our APM ingest pipelines docs instead. Then try following the steps in this custom APM ingest pipeline tutorial.
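Per those docs, APM custom pipeline names follow the `<data-stream-type>-apm@custom` convention (for example `traces-apm@custom`), which APM's built-in pipelines call automatically if present. A minimal sketch of creating one (the `set` processor and field name here are only illustrative):

```
PUT _ingest/pipeline/traces-apm@custom
{
  "description": "Custom processing for APM trace documents",
  "processors": [
    {
      "set": {
        "field": "labels.custom_note",
        "value": "processed-by-custom-pipeline"
      }
    }
  ]
}
```

No index template or data stream change is needed for this; the `@custom` pipeline is picked up by name.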

Hi, @bmorelli25 ,

Thanks for the reply!

I read both of the documents you provided and set up a pipeline for testing, like below:

The above ingest pipeline with an append processor successfully adds the field "processor_test" and its value to the incoming data.
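For reference, an append-processor test pipeline along these lines (the pipeline name and appended value are placeholders, since the actual setup was shown in a screenshot) would look like:

```
PUT _ingest/pipeline/my_test_pipeline
{
  "processors": [
    {
      "append": {
        "field": "processor_test",
        "value": ["test-value"]
      }
    }
  ]
}
```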

Then I tried to use the enrich processor, but somehow it does not work.

Here is my setup.

Source index:

Enrich processor:
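An enrich processor needs three pieces that are easy to miss: a source index, an enrich policy, and an execution of that policy before the processor can use it. A sketch of the whole chain, assuming the field and enrichment values described below (index name, policy name, and enrich fields here are placeholders; `limit_lookup` and `labels_database_name` come from the thread):

```
# 1. A document in the lookup source index
PUT lookup-index/_doc/1
{
  "labels_database_name": "example_db",
  "db_owner": "team-a",
  "db_tier": "gold"
}

# 2. An enrich policy that matches on labels_database_name
PUT _enrich/policy/limit_lookup_policy
{
  "match": {
    "indices": "lookup-index",
    "match_field": "labels_database_name",
    "enrich_fields": ["db_owner", "db_tier"]
  }
}

# 3. The policy must be executed before it can be used,
#    and re-executed whenever the source index changes
POST _enrich/policy/limit_lookup_policy/_execute

# 4. The pipeline with the enrich processor
PUT _ingest/pipeline/limit_lookup
{
  "processors": [
    {
      "enrich": {
        "policy_name": "limit_lookup_policy",
        "field": "labels_database_name",
        "target_field": "db_info"
      }
    }
  ]
}
```

A common reason an enrich processor "does nothing" is skipping or forgetting to re-run step 3: the processor matches against the enrich index built at execute time, not against the live source index.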

Lastly, I applied the ingest pipeline to the APM custom pipeline:

PUT _ingest/pipeline/traces-apm@custom
{
  "processors": [
    {
      "pipeline": {
        "name": "limit_lookup"
      }
    }
  ]
}
Basically, whenever the labels_database_name of an incoming document matches the labels_database_name in the source index, I would like the processor to add those couple of fields to the document before it is written into the index.

Here is what my APM index data looks like; the incoming data has a field called labels_database_name, like below:
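For what it's worth, when the enrich lookup matches, the resulting document gains the processor's target field containing the matched enrich document. Assuming the illustrative field names above, the stored document would look roughly like:

```
{
  "labels_database_name": "example_db",
  "db_info": {
    "labels_database_name": "example_db",
    "db_owner": "team-a",
    "db_tier": "gold"
  }
}
```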

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.