How to use an ingest pipeline with the Logstash Elasticsearch output update feature

I am using the Logstash Elasticsearch output to publish data to Elasticsearch. A request record and a response record are merged into a single document. This configuration works with no issues.

elasticsearch {
  hosts => [ "localhost:9200" ]
  index => "transactions"
  action => "update"
  doc_as_upsert => true
  document_id => "%{tid}"
  script => '
    if (ctx._source.transaction == "request") {
      ctx._source.status = params.event.get("status");
    } else if (ctx._source.transaction == "response") {
      ctx._source.api = params.event.get("api");
    }
  '
}

Now I am trying to add a new field to the above record update using ingest pipelines.

PUT _ingest/pipeline/ingest_pipe2
{
  "description" : "describe pipeline",
  "processors" : [
    {
      "set" : {
        "field": "api-test",
        "value": "new"
      }
    }
  ]
}
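Before wiring the pipeline into Logstash, you can confirm it behaves as expected with the ingest simulate API (the sample document below is hypothetical):

POST _ingest/pipeline/ingest_pipe2/_simulate
{
  "docs": [
    {
      "_source": {
        "transaction": "request",
        "status": "200"
      }
    }
  ]
}

The response should show each simulated document with the added "api-test": "new" field in its _source.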

This adds a new field to the incoming event, and it works fine with the following configuration.

elasticsearch {
  hosts => [ "localhost:9200" ]
  index => "transactions"
  pipeline => "ingest_pipe2"  
}

The problem is that the Logstash update and the ingest pipeline don't work together:

elasticsearch {
  hosts => [ "localhost:9200" ]
  index => "transactions"
  pipeline => "ingest_pipe2"
  action => "update"
  doc_as_upsert => true
  document_id => "%{tid}"
  script => '
    if (ctx._source.transaction == "request") {
      ctx._source.status = params.event.get("status");
    } else if (ctx._source.transaction == "response") {
      ctx._source.api = params.event.get("api");
    }
  '
}

I believe ingest pipelines only intercept index operations (even when they are part of a bulk request), not update operations, so I do not think what you are trying to do is possible. You may need to add the field through the update script instead.
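As a sketch of that workaround, the extra field could be set directly in the Painless update script rather than in the pipeline (the field name api-test and value are taken from the pipeline definition above; the rest follows the original config):

elasticsearch {
  hosts => [ "localhost:9200" ]
  index => "transactions"
  action => "update"
  doc_as_upsert => true
  document_id => "%{tid}"
  script => '
    // Same merge logic as the original update script
    if (ctx._source.transaction == "request") {
      ctx._source.status = params.event.get("status");
    } else if (ctx._source.transaction == "response") {
      ctx._source.api = params.event.get("api");
    }
    // Set the field the ingest pipeline would have added
    ctx._source["api-test"] = "new";
  '
}

This keeps everything in a single update request, so no pipeline is needed for the new field.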
