Kaspersky Grok pattern that works fine in the Grok Debugger is not parsing logs as expected

I have created a Grok pattern to parse Kaspersky logs.
The pattern works fine in the Grok Debugger in Dev Tools, but the logs are not being parsed into the expected fields, and I receive the error shown below.

Sample Data:
"2025-02-13T17:07:57+05:30 test.local WSEE @cee:{"type":"12344","timestamp":"21312312312321","host_name":"","user_name":"","time":"12738163876","EventContextType":"0","rtid":{"id_0":"2672288272","id_1":"12345","id_2":"12345","id_3":"123","id_4":"123","id_5":"123","id_6":"123","id_7":"12","id_8":"12","id_9":"123","id_10":"123"},"task_type":"1234","task_name":"KSN Usage","objectName":"C:\\TEST_HOME_DIREcTORY\\87678677\\Pending\\UPM_Profile\\AppData\\Roaming\\Tracker Software\\TestEditor\\3.0\\History.dat","requestType":"1","error":"-2738376278"}"

Grok Pattern:
%{TIMESTAMP_ISO8601:timestamp} %{HOSTNAME:hostname} %{WORD:application} %{DATA:garbage}:{\"type\":\"%{DATA:type}\",\"timestamp\":\"%{INT:timestampAV}\",\"host_name\":\"\",\"user_name\":\"\",\"time\":\"%{INT:timestampAV2}\",\"EventContextType\":\"%{DATA:EventContextType}\",\"rtid\":{\"id_0\":\"%{DATA:id_0}\",\"id_1\":\"%{DATA:id_1}\",\"id_2\":\"%{DATA:id_}\",\"id_3\":\"%{DATA:id_3}\",\"id_4\":\"%{DATA:id_4}\",\"id_5\":\"%{DATA:id_5}\",\"id_6\":\"%{DATA:id_6}\",\"id_7\":\"%{DATA:id_7}\",\"id_8\":\"%{DATA:id_8}\",\"id_9\":\"%{DATA:id_9}\",\"id_10\":\"%{DATA:id_10}\"},\"task_type\":\"%{DATA:task_type}\",\"task_name\":\"%{DATA:task_name}\",\"objectName\":\"%{DATA:objectName}\",\"requestType\":\"%{DATA:requestType}\",\"error\":\"%{DATA:error}\"}

Output (from Grok Debugger):

{
"task_name": "KSN Usage",
"timestampAV2": "12738163876",
"type": "12344",
"error": "-2738376278",
"hostname": "test.local",
"timestamp": "2025-02-13T17:07:57+05:30",
"id_5": "123",
"id_4": "123",
"id_7": "12",
"id_6": "123",
"id_9": "123",
"requestType": "1",
"id_": "12345",
"id_8": "12",
"EventContextType": "0",
"timestampAV": "21312312312321",
"garbage": "@cee",
"application": "WSEE",
"id_10": "123",
"objectName": "C:\\\\TEST_HOME_DIREcTORY\\\\87678677\\\\Pending\\\\UPM_Profile\\\\AppData\\\\Roaming\\\\Tracker Software\\\\TestEditor\\\\3.0\\\\History.dat",
"id_1": "12345",
"task_type": "1234",
"id_0": "2672288272",
"id_3": "123"
}

Error message received when the logs are parsed:

"error": {
"message": "Provided Grok expressions do not match field value: [2025-02-13T17:07:57+05:30 test.local WSEE @cee:{"type":"12344","timestamp":"21312312312321","host_name":"","user_name":"","time":"12738163876","EventContextType":"0","rtid":{"id_0":"2672288272","id_1":"12345","id_2":"12345","id_3":"123","id_4":"123","id_5":"123","id_6":"123","id_7":"12","id_8":"12","id_9":"123","id_10":"123"},"task_type":"1234","task_name":"KSN Usage","objectName":"C:\\TEST_HOME_DIREcTORY\\87678677\\Pending\\UPM_Profile\\AppData\\Roaming\\Tracker Software\\TestEditor\\3.0\\History.dat","requestType":"1","error":"-2738376278"}]"

Hoping to receive an answer from the Elastic community to fix this error.
Thanks in advance.

Do not use grok to parse JSON data, as it is sensitive to the order in which the fields appear. Instead, use grok only to extract the full JSON part of the payload into a single field, and then run a JSON processor on that field.
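In ingest-pipeline terms, that advice could look roughly like the sketch below. The pipeline name `kaspersky-logs` is a placeholder, and `add_to_root` is one option for merging the parsed JSON keys into the document root (a `target_field` could be used instead):

```json
PUT _ingest/pipeline/kaspersky-logs
{
  "processors": [
    {
      "grok": {
        "field": "message",
        "patterns": [
          "%{TIMESTAMP_ISO8601:timestamp} %{HOSTNAME:hostname} %{WORD:application} %{DATA:garbage}:%{GREEDYDATA:json_payload}"
        ]
      }
    },
    {
      "json": {
        "field": "json_payload",
        "add_to_root": true
      }
    }
  ]
}
```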


Hi Christian,

Thank you, we were able to parse the log using a custom pipeline with a grok and a JSON processor, as per your instruction.

  1. A grok processor to extract the JSON part:

Field: message
%{TIMESTAMP_ISO8601:timestamp} %{HOSTNAME:hostname} %{WORD:application} %{DATA:garbage}:%{GREEDYDATA:json_payload}

  2. A JSON processor:
     Field: json_payload
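For reference, the two processors above can be exercised against a sample event with the `_simulate` API in Dev Tools. The pipeline body below mirrors the steps described; the `_source.message` value is an abbreviated version of the sample data, kept short for readability:

```json
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": [
            "%{TIMESTAMP_ISO8601:timestamp} %{HOSTNAME:hostname} %{WORD:application} %{DATA:garbage}:%{GREEDYDATA:json_payload}"
          ]
        }
      },
      {
        "json": {
          "field": "json_payload"
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "2025-02-13T17:07:57+05:30 test.local WSEE @cee:{\"type\":\"12344\",\"task_name\":\"KSN Usage\"}"
      }
    }
  ]
}
```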

Unfortunately, this only works while testing the pipeline; the documents are still not getting indexed appropriately, even after mapping every single field in the index mapping.

Kindly help us.
Thank you, sir.

Please share the errors you are getting now, after the change you made.