Best approach to parse key value inside groups

Hi,

We have started using Logstash intensively, and now we have a very specific log line to parse:

2017-06-16 13:17:59.156 [26] [Statistics] (INFO) category1:subcategory1[load=7%, physicalfree=11487488, swapfree=2490368, pagereads=21, pagewrites=0], category2:subcategory1[private=1417484, virtual=2271376, used=1305457, committed=1317048, reserved=1392324], category2:subcategory2[time=15%, count0=3389, count1=3155

So basically I would use the kv plugin to parse the key/value pairs, but then I need to add two fields (category & subcategory).

Is there any simple way to do this?

Thanks for your help!

Use grok to parse out the fields ahead of the kv list, including category and subcategory, and then apply the kv filter to the appropriate field.
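
A minimal sketch of that approach, assuming the remainder of the line is captured into a field called statistics (the field name, the exact grok pattern, and the per-group field cat1_sub1_kv are assumptions, not part of the original pipeline). Each bracketed group would still need to be isolated into its own field before the kv filter is pointed at it:

filter {
    grok {
        # Strip the timestamp/pid/level prefix and keep the rest in 'statistics'
        match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} \[%{NUMBER:pid}\] \[Statistics\] \(%{LOGLEVEL:level}\) %{GREEDYDATA:statistics}" }
    }
    # Example for a single group: assumes a field 'cat1_sub1_kv' holding
    # "load=7%, physicalfree=11487488, swapfree=2490368, ..."
    kv {
        source      => "cat1_sub1_kv"
        field_split => ", "        # comma and space are both treated as delimiters
        value_split => "="
        prefix      => "category1_subcategory1_"
    }
}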

Thanks!

In the end I ended up with a ruby filter:

ruby {
    code => "
        # Split the statistics field into its category blocks, e.g.
        # 'category1:subcategory1[load=7%, physicalfree=11487488, ...'
        statistics = event.get('statistics').split('], ')
        statistics.each { |block|
            category, kv = block.split('[')
            if !kv.nil? and !category.nil?
                # Each block holds a comma-separated list of key=value pairs
                kv.tr(']', '').split(',').each { |indicator|
                    key, value = indicator.split('=')
                    if !value.nil? and !key.nil?
                        # Field name = category_subcategory_key
                        key = category.tr(':', '_') + '_' + key.tr(' ', '').tr(':', '_')
                        value = value.tr('%', '')
                        # Store purely numeric values as integers, everything else as strings
                        if value =~ /\A\d+\z/
                            event.set(key, value.to_i)
                        else
                            event.set(key, value)
                        end
                    end
                }
            end
        }
    "
}

It's not perfect, but it creates every field automatically and detects int vs. string values. Also, to keep every value in a single document, I've concatenated category + subcategory + key as the field name, as illustrated below.
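
For the sample line above, the first group would produce fields roughly like this (names and integer conversion follow the concatenation and regex logic in the script):

category1_subcategory1_load => 7
category1_subcategory1_physicalfree => 11487488
category1_subcategory1_swapfree => 2490368
category1_subcategory1_pagereads => 21
category1_subcategory1_pagewrites => 0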
