Applying ingest node processors in a specific order

Hi all,

I'm sending data from a CSV file to Elasticsearch via Logstash. Everything works except for an IP address field that contains some values that don't conform to the IPv4 format, so I'm using a "gsub" processor to replace those values with a default IP address, followed by a "geoip" processor so I can locate the addresses in a Kibana visualization. I want to run these two processors together in an ingest node pipeline rather than with Logstash filter plugins (I tried the "gsub" filter in my Logstash config file and everything was fine, but I want an ingest node pipeline because later I will be sending other data from a Hive table instead of a CSV file). I think the problem is that the "geoip" processor is applied before the "gsub" processor.
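For reference, this is roughly how I point Logstash at the pipeline (a minimal sketch; the host and index name here are placeholders, not my real ones):

output {
  elasticsearch {
    hosts    => ["localhost:9200"]   # placeholder host
    index    => "my_index"           # placeholder index name
    pipeline => "mypipeline"         # run the ingest node pipeline on each event
  }
}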

How can I apply the "gsub" processor before the "geoip" processor?

Here is my pipeline:

{
  "mypipeline": {
    "processors": [
      {
        "gsub": {
          "field": "ip_address",
          "pattern": "\\(na|-1\\)",
          "replacement": "0.0.0.0"
        }
      },
      {
        "geoip": {
          "field": "ip_address"
        }
      }
    ]
  }
}

And this is the error when sending the data:

[INFO ][logstash.outputs.elasticsearch] retrying failed action with response code: 500 ({"type"=>"exception", "reason"=>"java.lang.IllegalArgumentException: java.lang.IllegalArgumentException: 'na' is not an IP string literal.", "caused_by"=>{"type"=>"illegal_argument_exception", "reason"=>"java.lang.IllegalArgumentException: 'na' is not an IP string literal.", "caused_by" =>{"type"=>"illegal_argument_exception", "reason"=>"'na' is not an IP string literal."}}, "header"=>{"processor_type"=>"geoip"}})

Processors in a pipeline are executed in the order you define them, not in a random order.

What does a sample of the data that causes this look like?
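You can also check what each processor does to a document with the simulate API. A quick sketch, using one of the bad values from your error:

POST _ingest/pipeline/mypipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "ip_address": "na"
      }
    }
  ]
}

The response shows the document after the whole pipeline runs; add ?verbose=true to see the output of each processor step by step.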

Sorry, I was escaping the parentheses in my configuration and I don't know why. It worked when I changed the pattern to "(na|-1)".
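For anyone landing here later, this is the full pipeline I ended up with, recreated with PUT (same processors as above, just the unescaped pattern; the description is only a label I added):

PUT _ingest/pipeline/mypipeline
{
  "description": "Replace invalid IPs with a default, then geoip lookup",
  "processors": [
    {
      "gsub": {
        "field": "ip_address",
        "pattern": "(na|-1)",
        "replacement": "0.0.0.0"
      }
    },
    {
      "geoip": {
        "field": "ip_address"
      }
    }
  ]
}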


Not a problem, thanks for sharing the solution :smiley:
