Grok filter with custom pattern

I am trying to aggregate external logs into an Elasticsearch cluster running in Kubernetes.

Syslogs get published to a Logstash instance, which forwards them to an Elasticsearch instance. The syslogs have the pattern Timestamp Serial_Number:Process Level Message

Sample:
Oct 11 21:59:02 e3ed8a4e-530f-4658-b39d-049f38cda310:systemd DEBUG waiting for reply from 10.3.1.204:123 (10.3.1.204).

Using https://grokdebug.herokuapp.com/ I was able to create the following filter:

filter {
    grok {
        match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} (?<host>[A-Fa-f0-9]{8}-(?:[A-Fa-f0-9]{4}-){3}[A-Fa-f0-9]{12}):(?<process>(?<=:).*) %{LOGLEVEL:loglevel} %{GREEDYDATA:body}" }
    }
    mutate {
        add_field => { "process" => "%{process}" }
        add_field => { "loglevel" => "%{loglevel}" }
    }
    date {
        match => [ "logdate", "MMM dd HH:mm:ss" ]
    }
}

But the newly added fields process & loglevel don't contain the actual values.

They show up as process: %{process} and loglevel: %{loglevel} in Kibana.

What is the correct filter pattern I need to use here?

That suggests your grok pattern is not matching, although it works for me for the sample data you showed. When grok does not match an event, the %{process} and %{loglevel} sprintf references in your mutate filter cannot be resolved, so the literal strings are added as the field values instead.

Remove the mutate filter. For events where grok does match, it will just convert

  "loglevel" => "DEBUG",

into an array

  "loglevel" => [
    [0] "DEBUG",
    [1] "DEBUG"
],
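
For your sample line, with the mutate removed, the grok filter alone should produce roughly this (rubydebug output; metadata fields such as @timestamp, @version, and message are omitted here):

    {
        "timestamp" => "Oct 11 21:59:02",
             "host" => "e3ed8a4e-530f-4658-b39d-049f38cda310",
          "process" => "systemd",
         "loglevel" => "DEBUG",
             "body" => "waiting for reply from 10.3.1.204:123 (10.3.1.204)."
    }

grok already names the captures, so there is nothing for mutate to add.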

Can you elaborate on how the filter block would look with your suggestion?

filter {
    grok {
        match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} (?<host>[A-Fa-f0-9]{8}-(?:[A-Fa-f0-9]{4}-){3}[A-Fa-f0-9]{12}):(?<process>(?<=:).*) %{LOGLEVEL:loglevel} %{GREEDYDATA:body}" }
    }
    date {
        # The grok filter captures the timestamp into "timestamp", not "logdate".
        # "MMM  d" (two spaces) covers single-digit days in syslog timestamps.
        match => [ "timestamp", "MMM dd HH:mm:ss", "MMM  d HH:mm:ss" ]
    }
}
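
If it still fails on the live data, a quick way to verify is a standalone pipeline (a minimal sketch; the stdin and stdout plugins ship with a standard Logstash install, and test.conf is a hypothetical filename):

    input { stdin {} }

    filter {
        grok {
            match => { "message" => "%{SYSLOGTIMESTAMP:timestamp} (?<host>[A-Fa-f0-9]{8}-(?:[A-Fa-f0-9]{4}-){3}[A-Fa-f0-9]{12}):(?<process>(?<=:).*) %{LOGLEVEL:loglevel} %{GREEDYDATA:body}" }
        }
    }

    # Print every event with all of its fields so grok results are visible.
    output { stdout { codec => rubydebug } }

Run it with bin/logstash -f test.conf and paste a real log line; if the output carries a _grokparsefailure tag, the live lines differ from your sample (for example, extra characters before the timestamp). As an aside, the UUID regex in the host capture can be replaced with the stock %{UUID} grok pattern, which is defined identically in Logstash's default patterns.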
