Get Task from index.default_pipeline

I have an index whose settings configure index.default_pipeline to point to an ingest pipeline, so the pipeline runs automatically on every document indexed into it.


PUT search-d
{
  "settings": {
    "index.default_pipeline": "e5-small-ingest-pipeline",
    ...
  }
}

The referenced pipeline is defined as:

PUT _ingest/pipeline/e5-small-ingest-pipeline
{
  "description": "e5-small ingest pipeline",
  "processors": [
    {
      "inference": {
        "model_id": ".multilingual-e5-small_linux-x86_64",
        "input_output": [
          {
            "input_field": "text",
            "output_field": "text_embedding"
          }
        ]
      }
    }
  ],
  "on_failure": [
    {
      "set": {
        "description": "Index document to 'failed-<index>'",
        "field": "_index",
        "value": "failed-{{{_index}}}"
      }
    },
    {
      "set": {
        "description": "Set error message",
        "field": "ingest.failure",
        "value": "{{_ingest.on_failure_message}}"
      }
    }
  ]
}

Is it possible to retrieve the task, or otherwise determine whether this ingest pipeline is being executed?
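One way to check whether the pipeline is running at all is the node ingest statistics, which report per-pipeline document, failure, and timing counters; a minimal sketch (the filter_path parameter only trims the response):

# Per-pipeline counters: count, time_in_millis, current, failed
GET _nodes/stats/ingest?filter_path=nodes.*.ingest.pipelines.e5-small-ingest-pipeline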

The pipeline is failing, and even with the on_failure handler, the failed-&lt;index&gt; index is not being created.

The handler is creating the failure field within the document itself, but it is not redirecting the document to the failed index as it should. How should this be configured?
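One way to debug the handler is to run the pipeline against a sample document with the simulate API and inspect what the processors and the on_failure block actually produce; a sketch, assuming a minimal document with a text field (adding ?verbose=true shows the result of each processor in turn):

POST _ingest/pipeline/e5-small-ingest-pipeline/_simulate
{
  "docs": [
    {
      "_index": "search-d",
      "_source": {
        "text": "sample text"
      }
    }
  ]
}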

The failure message written to ingest.failure is: inference process queue is full. Unable to execute command
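For reference, that error indicates the inference queue of the trained model deployment is saturated. The trained model stats API shows the deployment state, and, assuming there is spare ML capacity, raising the number of allocations is one way to add inference throughput; a sketch:

GET _ml/trained_models/.multilingual-e5-small_linux-x86_64/_stats

# Assumes the deployment update API (8.6+) and available ML node capacity
POST _ml/trained_models/.multilingual-e5-small_linux-x86_64/deployment/_update
{
  "number_of_allocations": 2
}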

If you are using 8.12.0 or newer, you can set list_executed_pipelines to true in your bulk request to get the list of pipelines that were executed (Bulk API | Elasticsearch Guide [8.15] | Elastic).
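A minimal sketch of such a request, with a hypothetical document body; each item in the response then lists the pipelines that ran, including the index's default pipeline:

POST _bulk?list_executed_pipelines=true
{ "index": { "_index": "search-d" } }
{ "text": "some text to embed" }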

Sorry, I'm confused. I don't pass the pipeline in the request; it is configured internally on my index. I only insert the document, and the pipeline is triggered automatically. I am inserting documents via Logstash.

Oh, OK. I did not realize Logstash was involved. My suggestion won't work with Logstash; it only works if you're calling the _bulk API directly.