Logstash - How to dynamically parse log values

Hi, I have this sample document:

[Thread-13][2023-09-15 09:32:35][INFO]:{'[Sub]0-BaseTransformer]': '0.0004', '[Sub]1-NGINX Feature Extractor Service]': '0.0135', '[Dataloader][#0.-PutToQueue]': '0.0005', '[Sub][#1.EMA_FPS|CURRENT_FPS]': '183.77|69.58', '[Sub][#2.FRAMEID': 143, '[Sub][#3.DataNum': 2, '[Sub][#4.QueueSize': 0}

How can I dynamically parse and extract the value of each subfield inside the curly braces?

For example: 0.0004, 183.77|69.58, 0.0135, etc.

I have tried many filters, but none of them work.

Here is one that doesn't work:

input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "\[%{DATA:thread}\]\[%{TIMESTAMP_ISO8601:timestamp}\]\[%{WORD:log_level}\]:%{GREEDYDATA:log_data}" }
  }
  
  kv {
    source => "log_data"
    field_split => ","
    value_split => ":"
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "test"
  }
}
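Outside Logstash, a quick plain-Ruby sketch shows why that kv configuration (split on `,`, then on `:`) mangles this data; the string below is a shortened version of the sample's log_data:

```ruby
# Shortened version of the log_data string captured by the grok above.
log_data = "{'[Sub]0-BaseTransformer]': '0.0004', '[Dataloader][#0.-PutToQueue]': '0.0005'}"

# Naive comma/colon split, as the kv filter is configured.
pairs = log_data.split(",").map { |pair| pair.split(":", 2).map(&:strip) }.to_h
pairs.each { |k, v| puts "#{k.inspect} => #{v.inspect}" }
# The keys keep the surrounding braces and single quotes,
# e.g. "{'[Sub]0-BaseTransformer]'", so the field names are unusable.
```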

Looks like JSON

Hi there, can you tell me if I'm right or wrong?

You could start with

    dissect { mapping => { "message" => "[%{thread}][%{[@metadata][ts]}][%{loglevel}]:%{[@metadata][restOfLine]}" } }

    mutate { gsub => [ "[@metadata][ts]", " ", "T" ] }
    date { match => [ "[@metadata][ts]", "ISO8601" ] }

    mutate { gsub => [ "[@metadata][restOfLine]", "'", '"', "[@metadata][restOfLine]", "\[", "{", "[@metadata][restOfLine]", "]", "}" ] }
    json { source => "[@metadata][restOfLine]" }

which will get you

                       "{Sub}{#2.FRAMEID" => 143,
           "{Dataloader}{#0.-PutToQueue}" => "0.0005",
                     "{Sub}{#4.QueueSize" => 0,
                       "{Sub}{#3.DataNum" => 2

etc.
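For what it's worth, the gsub+json steps can be checked outside Logstash with a few lines of plain Ruby (a sketch using a shortened version of the restOfLine string from the sample):

```ruby
require 'json'

# Shortened version of the [@metadata][restOfLine] string.
rest = "{'[Sub][#2.FRAMEID': 143, '[Sub][#1.EMA_FPS|CURRENT_FPS]': '183.77|69.58'}"

# Same substitutions as the mutate+gsub: single quotes become double quotes,
# [ becomes { and ] becomes }, which turns the Python-style dict into valid JSON.
fixed = rest.gsub("'", '"').gsub("[", "{").gsub("]", "}")
parsed = JSON.parse(fixed)
# parsed now has keys like "{Sub}{#2.FRAMEID" and "{Sub}{#1.EMA_FPS|CURRENT_FPS}"
```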

Hi, thanks for your response. By the way, how can I split the value of these into two?
For example:

EMA_FPS : 183.77
CURRENT_FPS : 69.58

    ruby {
        code => '
            # Iterate over every field on the event; any field whose name
            # and value both contain a pipe gets split into one new field
            # per name/value pair.
            event.to_hash.each { |k, v|
                if k =~ /\|/ and v.to_s =~ /\|/
                    # Strip everything up to the last dot and the trailing brace,
                    # e.g. "{Sub}{#1.EMA_FPS|CURRENT_FPS}" -> "EMA_FPS|CURRENT_FPS",
                    # then split both the names and the values on the pipe.
                    k = k.sub(/.*\./, "").sub(/}$/, "").split(/\|/)
                    v = v.split(/\|/)

                    k.each_index { |x|
                        event.set(k[x], v[x])
                    }
                end
            }
        '
    }

works for that example, but has no error handling and is fairly fragile with respect to the data format.
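The splitting logic itself can be tried in plain Ruby without an event object (same regexes as the filter above, applied to the field name the json step produces):

```ruby
# Field name and value as produced by the earlier gsub+json steps.
k = "{Sub}{#1.EMA_FPS|CURRENT_FPS}"
v = "183.77|69.58"

# Drop everything up to the last dot and the trailing brace,
# then split both the name list and the value list on the pipe.
names  = k.sub(/.*\./, "").sub(/}$/, "").split(/\|/)
values = v.split(/\|/)
fields = names.zip(values).to_h
# => {"EMA_FPS"=>"183.77", "CURRENT_FPS"=>"69.58"}
```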

Thank you Badger, one final question:
is there any way I can replace '{' and '}' with '[' and ']' in the final output of the log? It's just for a nicer look.
For example:

Original key-value output: "{Dataloader}{#0.-PutToQueue}" => "0.0005"

Modified ({ } replaced with [ ]): "[Dataloader][#0.-PutToQueue]" => "0.0005"

You could, using a mutate+gsub to reverse the effects of the second and third triplets in this filter.

mutate { gsub => [ "[@metadata][restOfLine]", "'", '"', "[@metadata][restOfLine]", "\[", "{", "[@metadata][restOfLine]", "]", "}" ] }

But I recommended this filter because if you have field names like "[foo][bar" then logstash may object when you try to reference them in some ways.

If you do it as the last filter then it may well work. I do not think elasticsearch will object to unbalanced square brackets in a field name.
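On a plain hash, that final rename pass could look like this (a sketch; inside Logstash it would sit in a ruby filter iterating event.to_hash, and the field name here is taken from the output shown earlier):

```ruby
# Field names as produced by the earlier gsub+json steps.
fields = { "{Dataloader}{#0.-PutToQueue}" => "0.0005" }

# Reverse the second and third gsub triplets: { -> [ and } -> ].
renamed = fields.map { |k, v| [k.gsub("{", "[").gsub("}", "]"), v] }.to_h
# => {"[Dataloader][#0.-PutToQueue]"=>"0.0005"}
```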

Ah, I see your point. You mean I need to write another filter to reverse the effects of the second and third triplets in that filter?

And actually, I changed the log format a little bit:

[Thread-13][2023-09-15 09:32:35][INFO]:{'[Sub][0-BaseTransformer]': '0.0004', '[Sub][1-NGINX Feature Extractor Service]': '0.0135', '[Dataloader][#0.-PutToQueue]': '0.0005', '[Sub][#1.EMA_FPS|CURRENT_FPS]': '183.77|69.58', '[Sub][#2.FRAMEID]': 143, '[Sub][#3.DataNum]': 2, '[Sub][#4.QueueSize]': 0}

For example, I changed:

[Sub]#2.FRAMEID] => [Sub][#2.FRAMEID]

to make it more consistent.
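With the brackets balanced, the same gsub+json steps now produce balanced field names too (a quick plain-Ruby check on one entry of the new format):

```ruby
require 'json'

# One entry from the corrected log format, with matching brackets.
rest = "{'[Sub][#2.FRAMEID]': 143}"
parsed = JSON.parse(rest.gsub("'", '"').gsub("[", "{").gsub("]", "}"))
# => {"{Sub}{#2.FRAMEID}"=>143}  (the trailing brace is no longer missing)
```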