GROK pattern for syslogs

"Hello.
I have several sources of syslogs that I want to filter with logstash grok, and I have some issues and questions about this syslog event and how to use grok.

{"@version":"1","message":"<44> Sep 14 09:01:09 172.24.4.202 FFI:  port 1-Excessive undersized/giant packets. See help.","host":"1.2.3.4","type":"syslog","@timestamp":"2021-09-14T07:01:07.962Z"}

So far I have created this grok pattern:

%{INT:version}\S+%{NONNEGINT:syslog-priority:int}\S+%{GREEDYDATA:message}



Results:

{
"version": [
[
"1"
]
],
"syslog": [
[
"44"
]
],
"message": [
[
" Sep 14 09:01:09 172.24.4.202 FFI: port 1-Excessive undersized/giant packets. See help.","host":"172.24.4.202","type":"syslog","@timestamp":"2021-09-14T07:01:07.962Z"}"
]
]
}




My issue is that I want to parse "@timestamp", "type" and "host", and then leave the rest as a GREEDYDATA message.
I have tried with the timestamp first:

%{INT:version}\S+%{NONNEGINT:syslog-priority:int}\S+%{GREEDYDATA:message}%{TIMESTAMP_ISO8601:SyslogTimestamp}



Results:

{
"version": [
[
"1"
]
],
"syslog": [
[
"44"
]
],
"message": [
[
" Sep 14 09:01:09 172.24.4.202 FFI: port 1-Excessive undersized/giant packets. See help.","host":"172.24.4.202","type":"syslog","@timestamp":"20"
]
],
"SyslogTimestamp": [
[
"21-09-14T07:01:07.962Z"
]
],
"YEAR": [
[
"21"
]
]
}

It looks like the grok pattern cannot parse the YEAR field correctly, because the leading "20" is missing.
Any ideas on what I am missing?

//Christer

Hi,

Use %{DATA:message} instead of %{GREEDYDATA:message}.
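For example, with the pattern from the question, swapping in DATA might look like this (a sketch; DATA is lazy, so it stops at the first position where TIMESTAMP_ISO8601 can match, i.e. at "2021", whereas GREEDYDATA backtracks from the end and lets YEAR match just "21"):

```
%{INT:version}\S+%{NONNEGINT:syslog-priority:int}\S+%{DATA:message}%{TIMESTAMP_ISO8601:SyslogTimestamp}
```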

Cad.

Ok, there are a couple of things here that I think need to be mentioned in terms of grok:
Grok is essentially an abstraction layer over regex, so it works in the same fashion.

You can check the patterns here.

I'm assuming your actual log message is the message field, not the full json, correct?
Since the log line being handed to logstash is syslog, the timestamp field is generated by logstash, and the host field is whatever machine logstash is running on. They're sort of like wrappers supplied by logstash during parsing, to prepare your logs for elasticsearch when the data is converted to JSON.

If you don't want to keep the entire log line as the message, but instead want to parcel out the data, you want to use overwrite. I've provided an example of how you could parse your log line below:

filter {
  grok {
    match => { "message" => "^\<%{NUMBER:priority}\> %{SYSLOGTIMESTAMP:syslog_ts} %{IPV4:ip} %{GREEDYDATA:message}$" }
    overwrite => ["message"]
  }
}
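If the goal is also for @timestamp to reflect the time in the log line rather than the ingest time, a date filter could follow the grok, along these lines (a sketch; the timezone is an assumption, since syslog timestamps carry neither a year nor a zone):

```
filter {
  date {
    # the two patterns cover two-digit and space-padded single-digit days
    match => ["syslog_ts", "MMM dd HH:mm:ss", "MMM  d HH:mm:ss"]
    # assumption: the source logs in this timezone; without a year in the
    # pattern, the date filter assumes the current year
    timezone => "Europe/Stockholm"
  }
}
```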

Your resulting message might look like this:

{
  "priority": [
    [
      "44"
    ]
  ],
  "syslog_ts": [
    [
      "Sep 14 09:01:09"
    ]
  ],
  "ip": [
    [
      "172.24.4.202"
    ]
  ],
  "message": [
    [
      "FFI:  port 1-Excessive undersized/giant packets. See help."
    ]
  ]
}

Thank you for your replies; both are valid for my parsing of the logs.
Now the events look good when I forward them to cert.

Sent from Yahoo Mail for iPhone