Grok pattern for a unique value inside a field

I need to create a new field status_code using an ingest pipeline: when the status inside the message field is 200, status_code should be set to "successful", and when the status is 502, 404, or 402, it must be set to "failed".

I figured out how to create the fields, but I was unable to extract the exact value of status from the logs. Sample log:

{action:show,count:208,duration:6.38ms,status:200}

How do I write a grok pattern to pick out just the status value from these logs? The 200 logs have a different structure from the 404 logs (and each 404 has a different log structure as well), so I am unable to define a common pattern.
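Since grok patterns are not anchored by default, one way to capture only the status, regardless of the rest of the line, is to match on the `status:` key itself. A sketch of such a grok processor (the `NUMBER` pattern is a standard built-in grok pattern; the `message` field name is taken from the sample above):

```json
{
  "grok": {
    "field": "message",
    "patterns": ["status:%{NUMBER:status}"]
  }
}
```

This captures the digits following `status:` into a `status` field and ignores everything else in the line, so the surrounding log structure does not need to be common across 200 and 404 logs.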

You can use a KV processor to parse the data. Then I added the code from your other thread. This should work if the field names are the same.

** I see this is in the Logstash category, but I believe you are still looking for an ingest pipeline solution. If you need it done in Logstash, that can all be done there as well.

POST /_ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "kv": {
          "field": "message",
          "field_split": ",",
          "value_split": ":",
          "trim_key": "{",
          "trim_value": "}"
        }
      },
      {
        "set": {
          "if": "ctx.status == '200'",
          "field": "status_code",
          "value": "successful"
        }
      },
      {
        "set": {
          "if": "ctx.status == '404' || ctx.status == '502' || ctx.status == '402'",
          "field": "status_code",
          "value": "failed"
        }
      },
      {
        "remove": {
          "field": "message"
        }
      }
    ]
  },
  "docs": [
    {
      "_index": "index",
      "_id": "id",
      "_source": {
        "message": "{action:show,count:208,duration:6.38ms,status:200}"
      }
    },
    {
      "_index": "index",
      "_id": "id",
      "_source": {
        "message": "{action:show,count:208,duration:6.38ms,status:502}"
      }
    }
  ]
}
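To see what the kv and set processors above do to a sample message, here is a plain-Python re-implementation of the same logic (an illustration only, not Elasticsearch code; the splitting mirrors the `field_split`, `value_split`, `trim_key`, and `trim_value` options):

```python
def classify(message: str) -> dict:
    """Mimic the ingest pipeline: kv parse, then set status_code."""
    doc = {}
    # kv processor: split pairs on ",", keys from values on the first ":",
    # trimming "{" from keys and "}" from values
    for pair in message.split(","):
        key, _, value = pair.partition(":")
        doc[key.strip("{")] = value.strip("}")
    # set processors: map status to status_code
    # (402 included per the original question)
    if doc.get("status") == "200":
        doc["status_code"] = "successful"
    elif doc.get("status") in ("404", "502", "402"):
        doc["status_code"] = "failed"
    return doc

print(classify("{action:show,count:208,duration:6.38ms,status:200}"))
# {'action': 'show', 'count': '208', 'duration': '6.38ms',
#  'status': '200', 'status_code': 'successful'}
```

Note that kv parsing yields string values, which is why the pipeline conditions compare `ctx.status` against the string `'200'` rather than the number `200`.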

Hi,
Thanks a lot for your suggestion and time, but this was not working as expected. I used the JSON processor to resolve this issue.


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.