Logstash KV filter creating a field with an array rather than individual fields and values

Hi all,
I'm working through a Logstash configuration that parses Cisco ISE log entries. I'm using multiline to collate entries into records, plus a grok filter coupled with a kv filter, which is mostly having the desired effect. There is potentially a large number of fields being processed by the kv filter (around 100).

However, the log messages contain repeated fields with the same name, e.g. Step and StepData. Instead of producing individual fields, the kv filter is collating them into a list. Example from the original log messages:

StepData=3= TACACS.User, StepData=4= DEVICE.Device Type, StepData=6=SS_XX_User_PAP, 

And I'm using the following filter:

filter {
  if [type] == "ise" {
    kv {
      source => "KVpairs"
      field_split_pattern => ","
    }
  }
}

They are being populated in Elasticsearch as (for example):

    " StepData": [
      "3= TACACS.User",
      "4= DEVICE.Device Type",
      "6=SS_XX_User_PAP",
    ],

The problem is that I would like multiple fields of the form "StepData=3": value, "StepData=4": value, rather than a single "StepData" field holding an array of "3= value", "4= another value". Is this possible in this context?

Thanks in advance.
Neil

It is certainly possible. This thread is about a slightly different question, but should give you an idea of how to do it.
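In case it helps, here is a minimal sketch of one way to do it with a ruby filter placed after the kv filter: it fans the StepData array out into one field per entry. The target field names (StepData_3, etc.) and the split-on-first-"=" parsing are assumptions based on your sample data, so adjust as needed.

ruby {
  code => '
    steps = event.get("StepData")
    if steps.is_a?(Array)
      steps.each do |entry|
        # split only on the first "=", e.g. "3= TACACS.User" -> "3" and " TACACS.User"
        idx, val = entry.to_s.split("=", 2)
        # write e.g. StepData_3 => "TACACS.User" (field naming here is an assumption)
        event.set("StepData_#{idx.to_s.strip}", val.to_s.strip)
      end
      # optionally drop the original array field
      event.remove("StepData")
    end
  '
}

Separately, the leading spaces you can see in the key (" StepData") and values come from splitting on "," alone; the kv filter's trim_key and trim_value options should be able to strip those.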

Thanks @Badger. I'll see what it gives me when I get a chance. Cheers.
