Create Grok filter date

Hi everyone,

I'm trying to parse the date from this line:

Wed Nov 27 07:31:40 CET 2019;User123;NEWPORT.WEBSERVICE.EXAMPLE

with this grok filter:

	filter {
		grok {
			match => { "message" => ["%{GREEDYDATA:date};%{USER:user};%{GREEDYDATA:webservice}"] }
		}
		date {
			match => ["date", "EEE MMM dd HH:mm:ss z yyyy"]
		}
	}
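As a quick illustration (in Python, not Logstash), the grok pattern above effectively splits the message on semicolons into three fields:

```python
# Illustration only: a rough Python equivalent of what the grok pattern captures.
message = "Wed Nov 27 07:31:40 CET 2019;User123;NEWPORT.WEBSERVICE.EXAMPLE"
date, user, webservice = message.split(";")
print(date)        # "Wed Nov 27 07:31:40 CET 2019"
print(user)        # "User123"
print(webservice)  # "NEWPORT.WEBSERVICE.EXAMPLE"
```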

The parsing works, but not for the date field. I get this result:

{
      "@version" => "1",
          "user" => "User123",
          "type" => "txt",
          "date" => "Wed Nov 27 07:31:40 CET 2019",
    "webservice" => "NEWPORT.WEBSERVICE.EXAMPLE",
          "host" => "host12",
          "tags" => [
        [0] "_dateparsefailure"
    ],
          "path" => "a/logs/7.txt",
       "message" => "Wed Nov 27 07:31:40 CET 2019;User123;NEWPORT.WEBSERVICE.EXAMPLE",
    "@timestamp" => 2020-03-05T12:17:08.007Z
}

Does anyone know if the problem comes from the date parsing?

Thank you

I have also tried this grok:

grok { 
	match => {"message" => "%{SYSLOGTIMESTAMP:syslog}%{SPACE}%{DATA:zone}%{YEAR:syslog};%{USER:user};%{GREEDYDATA:webservice}"} 
}

The output now looks like this, but I can't convert it into a timestamp:

  "syslog" => [
         [0] "Nov 22 12:37:51",
         [1] "2019"
    ]

Hi there,

the problem is the following (reported in the date filter documentation):

z — time zone names. Time zone names (z) cannot be parsed.

So, what you can do is use a mutate filter with gsub to replace the timezone name with a numeric offset that Logstash can recognize, like:

filter {
  grok { 
    match => {"message" => ["%{GREEDYDATA:date};%{USER:user};%{GREEDYDATA:webservice}"] } 
  }

  mutate {
    gsub => [
      "date", "CET", "+0100"
    ]
  }

  date {
    match => ["date", "EEE MMM dd HH:mm:ss Z yyyy"]
  }
}

This way CET will be replaced with +0100, which Logstash recognizes and correctly parses with the capital Z.
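To sanity-check the idea outside Logstash, here is a minimal Python sketch of the same replace-then-parse approach. It assumes CET maps to +0100; a plain string replacement like this won't cover summer timestamps labelled CEST (+0200), so those would need their own gsub entry.

```python
from datetime import datetime

# Minimal sketch (Python, not Logstash): replace the unparseable timezone
# *name* with its numeric offset, then parse. Assumes CET == +0100 (no DST).
raw = "Wed Nov 27 07:31:40 CET 2019"
fixed = raw.replace("CET", "+0100")
dt = datetime.strptime(fixed, "%a %b %d %H:%M:%S %z %Y")
print(dt.isoformat())  # 2019-11-27T07:31:40+01:00
```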

Let me know if it works :slight_smile:


@Fabio-sama Thank you ! I fixed the issue with gsub, good idea. :+1:

No problem :slight_smile:
