_grokparsefailure because of time match

Hi,

I am facing an issue while transferring data with a converted time. In the Logstash output everything looks fine, but in Kibana I can see _grokparsefailure. After a few tests I found out that when I delete the "date" block from the configuration file, everything works. So the issue must be in that block.

Config File:

input {
  file {
    path => ["C:/ELK/LogFiles/Web/test.log"]
    start_position => "beginning"
    type => "rest_log"
  }
}
filter {
  grok {
    match => ["message","%{IP:client_ip}%{SPACE}-%{SPACE}-%{SPACE}[%{HTTPDATE:apache_timestamp}]%{SPACE}"%{WORD:request_method}%{SPACE}%{NOTSPACE:request_url}%{SPACE}%{NOTSPACE:http_version}"%{SPACE}%{NUMBER:response_code}%{SPACE}%{NOTSPACE:bytes}%{SPACE}%{NOTSPACE}D:%{NOTSPACE:responsetime_ms}%{SPACE}%{GREEDYDATA:user_string}""]
  }
  date {
    match => [ "apache_timestamp" , "dd/MMM/yyyy:HH:mm:ss +0100", "dd/MMM/yyyy:HH:mm:ss Z", "ISO8601" ]
    target => "@timestamp"
  }
  mutate {
    convert => {
      "bytes" => "integer"
      "responsetime_ms" => "integer"
    }
  }
}
output {
  stdout { codec => rubydebug }
  elasticsearch {
    hosts => ["http://localhost:9200"]
    index => "my_index"
  }
}

Log Example:
11.12.13.14 - - [12/Nov/2019:21:40:22 +0100] "GET //user/login HTTP/1.1" 404 208 "D:75"

Logstash Output:

{
    "request_url" => "/user/login",
    "response_code" => "404",
    "http_version" => "HTTP/1.1",
    "request_method" => "GET",
    "responsetime_ms" => 75,
    "@timestamp" => 2019-11-12T20:40:30.000Z,
    "path" => "C:/ELK/LogFiles/Web/test.log",
    "host" => "CSTRL0047685567",
    "message" => "11.12.13.14 - - [12/Nov/2019:21:40:30 +0100] "GET /user/login HTTP/1.1" 404 208 "D:75"",
    "client_ip" => "11.12.13.14",
    "bytes" => 208,
    "apache_timestamp" => "12/Nov/2019:21:40:30 +0100",
    "@version" => "1"
}

Thanks for your help!

Try using

    grok{ match => { "message" => "%{IP:client_ip}%{SPACE}-%{SPACE}-%{SPACE}\[%{HTTPDATE:apache_timestamp}\]%{SPACE}\"%{WORD:request_method}%{SPACE}%{NOTSPACE:request_url}%{SPACE}%{NOTSPACE:http_version}\"%{SPACE}%{NUMBER:response_code}%{SPACE}%{NOTSPACE:bytes:int}%{SPACE}%{NOTSPACE}D:%{NOTSPACE:responsetime_ms:int}%{SPACE}%{GREEDYDATA:user_string}\"" } }
    date { match => [ "apache_timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ] }
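If it helps to see why the escaping matters: here is a rough Python approximation of that grok pattern as a plain regex. The simplifications are my own, not what grok actually compiles to (%{NOTSPACE} as \S+, %{SPACE} as \s*, a fixed IPv4 shape for %{IP}). The unescaped [ and " in the original pattern are metacharacters in the regex and string delimiters in the config file, which is what broke the match:

```python
import re

# Hypothetical simplification of the corrected grok pattern as a raw regex.
# Note the escaped \[ \] and the quote characters matched literally.
pattern = re.compile(
    r'(?P<client_ip>\d+\.\d+\.\d+\.\d+)\s*-\s*-\s*'
    r'\[(?P<apache_timestamp>[^\]]+)\]\s*'
    r'"(?P<request_method>\w+)\s*(?P<request_url>\S+)\s*(?P<http_version>\S+)"\s*'
    r'(?P<response_code>\d+)\s*(?P<bytes>\S+)\s*'
    r'"D:(?P<responsetime_ms>\d+)"'
)

line = '11.12.13.14 - - [12/Nov/2019:21:40:22 +0100] "GET //user/login HTTP/1.1" 404 208 "D:75"'
m = pattern.search(line)
print(m.group('apache_timestamp'))  # 12/Nov/2019:21:40:22 +0100
```

Without the escapes, the `[` would open a character class and the inner `"` would terminate the config string early, so grok never sees the pattern you intended.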

Thanks for your reply.

Two issues:

  • Same error in Kibana -> _grokparsefailure (in cmd everything looks good)
  • The time wasn't parsed properly: "apache_timestamp" => "12/Nov/2019:21:40:12 +0100"

I can't explain that. Today I opened this config file in VSCode and the code highlighting looks strange to me:

I tried to find the issue in the regex, but without success.

Sorry, I have to correct myself. With this configuration I am getting the same issue (the Logstash output is OK, but _grokparsefailure in Kibana):

    grok{ match => { "message" => "%{IP:client_ip}%{SPACE}-%{SPACE}-%{SPACE}\[%{HTTPDATE:apache_timestamp}\]%{SPACE}\"%{WORD:request_method}%{SPACE}%{NOTSPACE:request_url}%{SPACE}%{NOTSPACE:http_version}\"%{SPACE}%{NUMBER:response_code}%{SPACE}%{NOTSPACE:bytes:int}%{SPACE}%{NOTSPACE}D:%{NOTSPACE:responsetime_ms:int}%{SPACE}%{GREEDYDATA:user_string}\"" } }
    date {
      match => [ "apache_timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
      target => "@timestamp"
    }

OK, I found a workaround for the issue.

I changed the target for apache_timestamp:

    date {
      match => [ "apache_timestamp" , "dd/MMM/yyyy:HH:mm:ss +0100", "dd/MMM/yyyy:HH:mm:ss Z", "ISO8601" ]
      target => "@my_timestamp"
    }

Then in Kibana I selected "my_timestamp" as the time field when creating the index pattern.
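For anyone curious whether the date pattern itself handles the offset: the Joda-style pattern "dd/MMM/yyyy:HH:mm:ss Z" corresponds roughly to strptime's "%d/%b/%Y:%H:%M:%S %z". A quick sanity check (a standalone Python sketch, not Logstash internals) shows the +0100 offset converting to UTC as expected:

```python
from datetime import datetime, timezone

# Rough strptime equivalent of the Joda pattern "dd/MMM/yyyy:HH:mm:ss Z"
ts = datetime.strptime("12/Nov/2019:21:40:22 +0100", "%d/%b/%Y:%H:%M:%S %z")

# Normalize to UTC, which is how Logstash stores timestamps
utc = ts.astimezone(timezone.utc)
print(utc.isoformat())  # 2019-11-12T20:40:22+00:00
```

So the "dd/MMM/yyyy:HH:mm:ss Z" pattern alone should be enough; the extra "+0100" entry in the match array is not a valid format and can be dropped.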

@Badger: Thanks for your support!