Logstash Filter JSON syslog message field is not getting parsed

I cannot parse the incoming syslog as JSON; the message field is not getting parsed. I tried the json filter with add_field and also with mutate, but no luck. I used grok to parse specific fields, but the message field itself contains keys and values. How do I parse the message field below into JSON?

My conf file:

input {
  file {
    path => "/opt/log/sample/*.txt"
    codec => "plain"  # { format => "%{message}" }
  }
}

filter {
  # mutate { gsub => [ "message", "(\")", "" ] }
  mutate { gsub => [ "message", "(\\")", "" ] }

  json {
    source => "message"
  }
}

output {
  file {
    path => "/opt/log/out/out.txt"
    codec => json_lines
  }
  stdout {}
}

Grok pattern: %{TIME:timestamp} %{HOST:host} %{GREEDYDATA:message}

Grok output:

{
  "timestamp": [
    [
      "18:11:58"
    ]
  ],
  "host": [
    [
      "myhost.aco.mydomain.net"
    ]
  ],
  "message": [
    [
      "{destinationPort:90,exception:-,totalByteUsage:0,sourcePort:160,extension:.com\\\\/,contentTypeHeader:-,callout:0,scheme:http,reportingGroup:0,requestMethod:GET,privateIp:-,sAction:Allowed,sourceIpAddress:10.10.10.10,description:-,categoryName:News,sandBoxDecoded:-,urlLogId:0,responseCode:0,sandboxResult:-,computerName:-,totalByteCount:0,audit:0,host:www.local.com,action:Allowed,useTime:0,upstreamByteUsage:0,uriPath:\\\\/,computerMacAddress:00:00:00:00:00:00,direction:0,myboss:myhost,malware:0,ipAddress:10.10.10.10,userAgent:-,publicIp:-,url:http:\\\\/\\\\/www.local.com\\\\/,logTime:2022-07-12,referrerUrl:-,mde:-,sha256Sum:-,macAddress:00:00:00:00:00:00,filename:-,uriQuery:-,filteringGroupName:Default Catch All,downstreamByteUsage:0,cncFlag:0,location:-,time:18:11:57,username:*10.10.10.10}""
    ]
  ]
}

Hello @Giridharan_C

Welcome to the Elastic Community!

We can use grok together with the kv filter here. Try the below:


filter {
  grok {
    match => { "message" => "%{TIME:timestamp} %{DATA:host} {%{GREEDYDATA:messages}}" }
  }

  kv {
    source => "messages"
    field_split => ","
    value_split => ":"
  }
}

Keep us posted on how it goes! Thanks.


Use a kv filter to parse the message field.

    mutate { gsub => [ "message", "[{}]", "" ] }
    kv { source => "message" field_split => "," value_split => ":" }

which will produce

               "time" => "18:11:57",
      "sandboxResult" => "-",
            "cncFlag" => "0",
     "reportingGroup" => "0",
           "uriQuery" => "-",
...
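Outside Logstash, the behaviour of the kv filter with these settings can be sketched in Python (a minimal illustration, not the actual kv plugin; the `kv_parse` helper is hypothetical):

```python
# Sketch of kv with field_split => "," and value_split => ":" --
# split the string into pairs on commas, then split each pair into
# key/value at the FIRST colon only, so a value such as "18:11:57"
# keeps its remaining colons.
def kv_parse(message, field_split=",", value_split=":"):
    fields = {}
    for pair in message.split(field_split):
        key, sep, value = pair.partition(value_split)
        if sep:  # skip fragments that have no key/value separator
            fields[key] = value
    return fields

sample = "time:18:11:57,sandboxResult:-,cncFlag:0"
print(kv_parse(sample))
# {'time': '18:11:57', 'sandboxResult': '-', 'cncFlag': '0'}
```

Note that splitting key from value at the first colon is what keeps timestamps intact; a naive `pair.split(":")` would break them apart.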

Thanks @Badger @sudhagar_ramesh.

I am now able to parse and extract the JSON fields.

Below is my filter:

filter {
  mutate { gsub => [ "message", "(\\")", "" ] }
  mutate { gsub => [ "message", "[{}]", "" ] }

  grok {
    match => { "message" => "%{TIME:timestamp} %{GREEDYDATA:message_json}" }
  }

  kv { source => "message_json" field_split => "," value_split => ":" }

  mutate {
    rename => { "host" => "message_host" }
    remove_field => [ "message", "message_json" ]
  }
}

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.