Conditionals in pipelines

Hi,

We are in the process of recreating our Logstash pipelines in Kibana/ELK, using Elasticsearch ingest pipelines instead of Logstash. However, the lack of documentation on how to do this in Kibana is overwhelming.

See the attached picture. What is the syntax for providing a simple 'this field contains that value' conditional here? I searched for this question, but prior questions on this (like Condition format for pipelines in Kibana) got no answers.

Since at least somebody designed this, there should be an answer somewhere though :wink:

Does somebody know? Whatever I come up with, I keep getting compile errors (yes, I also tried without parentheses).

However, when I compare it with a pipeline from somewhere in the docs, mine seems rather similar. What am I doing wrong?

[image]

Does this document help you?

No, sorry.
I've seen this page and even imported one of the pipelines to see how it is shown in Kibana.
It just doesn't work with my own pipeline.

Regards

OK, there is a contains function for string values. Have you tried this?

And what was the result of your script? Error message or undesired result?
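For example, a contains condition can be tested with a simulate call like this (a minimal sketch; the field name and value here are just placeholders, not taken from your pipeline):

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "set": {
          "if": "ctx.message != null && ctx.message.contains('API_CALL')",
          "field": "result",
          "value": true
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "eventType: API_CALL"
      }
    },
    {
      "_source": {
        "message": "eventType: OTHER"
      }
    }
  ]
}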

If the script.painless.regex.enabled cluster setting is enabled, you can use regular expressions in your if condition scripts. For supported syntax, see Painless regular expressions.

Check that?

It could also be this if you are trying to match http only.

"if": "ctx.url?.scheme =~ /^http[^s]/"
should be
"if": "ctx.url?.scheme == 'http'"

You can run these in Dev Tools to test whether they work or not.

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "set": {
          "if": "ctx.url =~ /^http[^s]/",
          "field": "result",
          "value": true
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "url": "http://www.google.com"
      }
    },
    {
      "_source": {
        "url": "https://www.google.com"
      }
    }
  ]
}

Thanks for the replies.
What I am trying to do is simply match the value of a field. I have a field 'event-type'. This field can contain the string 'API_CALL'. I want the dissect from my OP to run only when event-type == 'API_CALL'.

The funny thing is, I have a few processors in the pipeline already. When I save, it's OK and it processes the document I feed it. When I go to Dev Tools and do GET _ingest/pipeline/pipelinename I get the pipeline in return. But when I then do PUT _ingest/pipeline/my-pipeline with the exact JSON I just got returned, I get a parse exception. Funny.

I already found out I somehow seem to need to use that (horrible) Painless stuff. Apparently I need to use ctx as the source and then elaborate on that. However, just ctx.event-type == API_CALL still gives a compile error.

I tried aaron_nimocks' suggestion:

POST _ingest/pipeline/AAB_AGL/_simulate
{
  "AAB_AGL" : {
    "processors" : [
      {
        "dissect" : {
          "field" : "message",
          "pattern" : "ts: %{ts} | logLevel: %{log-level} | appId: %{app-id} | thread: %{thread-id} | SID: %{session-id} | TN: %{transaction-id} | clientIp: %{client-ip} | userId: %{user-id} | apiType: %{api-type} | api: %{api-url} | platform: %{platform} | eventType: %{event-type} | %{additional-data}"
        }
      },
      {
        "trim" : {
          "field" : "app-id",
          "ignore_failure" : true
        }
      },
      {
        "trim" : {
          "field" : "client-ip",
          "ignore_failure" : true
        }
      },
      {
        "trim" : {
          "field" : "api-type",
          "ignore_failure" : true
        }
      },
      {
        "trim" : {
          "field" : "api-url",
          "ignore_failure" : true
        }
      },
      {
        "dissect" : {
          "field" : "additional-data",
          "pattern" : "message: %{ms-url}|%{ms-result-code}|%{ms-result}|%{execution-time}",
          "if": "ctx.event-type == API_CALL",
          "value": true
        }
      }
    ]
  }
}

But this results in:

{
  "error" : {
    "root_cause" : [
      {
        "type" : "parse_exception",
        "reason" : "[docs] required property is missing",
        "property_name" : "docs"
      }
    ],
    "type" : "parse_exception",
    "reason" : "[docs] required property is missing",
    "property_name" : "docs"
  },
  "status" : 400
}

Feel free to try.
This is an anonymised document I test with.

{
  "_source": {
  "@timestamp": "2022-01-20T12:56:45.262Z",
  "@metadata": {
    "beat": "filebeat",
    "type": "_doc",
    "version": "7.16.2"
  },
  "fields": {
    "environment": "production"
  },
  "agent": {
    "hostname": "server1.prdl.itv.local",
    "ephemeral_id": "866496c4-e379-421e-930c-1ade47f5105c",
    "id": "a86414cc-90f1-4b71-9723-d115033864d7",
    "name": "server1.prdl.itv.local",
    "type": "filebeat",
    "version": "7.16.2"
  },
  "ecs": {
    "version": "1.12.0"
  },
  "message": "ts: 2022-01-20 13:56:44.299 | logLevel: INFO | appId: AGL | thread: 111309 | SID: 1e7ad1c0-e99a-7af6-edd4-a3384bd19247 | TN: a39ffed3-7120-6313-ab67-045ee0ef6f20 | clientIp: 127.0.0.1 | userId: 0000000 | apiType: NANO | api: POST /100/1.2.0/A/nld/stb/kpn/API-1/MS-1 | platform: stb | eventType: API_CALL | message:  http://newservername:8080/new-api-name/b2b/tokens?channel=stb&lang=nld|20X|ACN_200|5",
  "log": {
    "file": {
      "path": "/product/AGL/agl-core/logs/agl.log"
    },
    "offset": 2884772107
  },
  "tags": [
    "avs6",
    "api-log",
    "apigateway",
    "asd"
  ],
  "input": {
    "type": "log"
  }
}
}

Tried this one. Again a compile error.

ctx.event-type.contains('API_CALL')

The error you are seeing is because you are trying to _simulate the pipeline, which requires docs to sample from.

Change it to POST _ingest/pipeline/AAB_AGL/ if you are trying to save the pipeline instead of simulating it.


That gives a 405

{
  "error" : "Incorrect HTTP method for uri [/_ingest/pipeline/AAB_AGL?pretty=true] and method [POST], allowed: [PUT, GET, DELETE]",
  "status" : 405
}

And after changing it to PUT,
I get:

{
  "error" : {
    "root_cause" : [
      {
        "type" : "parse_exception",
        "reason" : "[processors] required property is missing",
        "property_name" : "processors"
      }
    ],
    "type" : "parse_exception",
    "reason" : "[processors] required property is missing",
    "property_name" : "processors"
  },
  "status" : 400
}

Sorry. Here's a full example if this helps.

PUT _ingest/pipeline/my-pipeline
{
  "processors": [
    {
      "set": {
        "description": "If 'url.scheme' is 'http', set 'url.insecure' to true",
        "if": "ctx.url =~ /^http[^s]/",
        "field": "result",
        "value": true
      }
    }
  ]
}
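For clarity, the two request shapes that came up in this thread are different: _simulate takes an inline "pipeline" object plus a "docs" array to run it against, while PUT _ingest/pipeline/<name> takes "processors" at the top level and no docs. Roughly (a sketch with a placeholder set processor):

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      { "set": { "field": "result", "value": true } }
    ]
  },
  "docs": [
    { "_source": { "message": "sample" } }
  ]
}

PUT _ingest/pipeline/AAB_AGL
{
  "processors": [
    { "set": { "field": "result", "value": true } }
  ]
}

That is why the earlier attempts failed: the simulate call had no "docs" array, and the PUT body had the processors wrapped in an "AAB_AGL" object instead of a top-level "processors" array.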

OK,

Finally got this syntax right.
There is a f***ing lot of documentation on ALL kinds of complex stuff with Elastic, but I finally needed an external blog post from two years ago to find my solution. Please @elastic, make parts of your documentation MORE Kibana-oriented AND more beginner friendly.

OK, the proper syntax for the conditional field in my case is:

ctx['event-type'] == 'API_CALL'

This did the trick.
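For anyone finding this later: the square brackets are needed because the field name contains a hyphen, so ctx.event-type is parsed as a subtraction rather than a field access, and the value has to be a quoted string. Putting it together with the dissect processor from above, a simulate call looks roughly like this (a trimmed sketch; here additional-data is fed in directly instead of being produced by the first dissect):

POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "dissect": {
          "field": "additional-data",
          "pattern": "message: %{ms-url}|%{ms-result-code}|%{ms-result}|%{execution-time}",
          "if": "ctx['event-type'] == 'API_CALL'"
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "event-type": "API_CALL",
        "additional-data": "message: http://newservername:8080/new-api-name/b2b/tokens?channel=stb&lang=nld|20X|ACN_200|5"
      }
    },
    {
      "_source": {
        "event-type": "OTHER",
        "additional-data": "message: something else"
      }
    }
  ]
}

The second doc shows the condition doing its job: the dissect is skipped instead of failing on a message that does not match the pattern.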

Thanks for bearing with me.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.