Help required for grok filter creation [windows platform]

Hi!
I am new to Elasticsearch and am using version 6.2.3.
I want to create a grok pattern for the following log format:
[Tue May 22 2018 15:16:56:325] [10.144.244.48] [CM] [12836] [Debug] [Starting Connection Manager event loop]
[Tue May 22 2018 15:16:56:325] [10.144.244.48] [CM] [12836] [INFO] [System Information fetching is DISABLED]
[Tue May 22 2018 15:16:56:328] [10.144.244.48] [CM] [12836] [Debug] [Adding timer for disconnected host: 10.144.244.48]
[Tue May 22 2018 15:16:56:329] [10.144.244.48] [CM] [12836] [INFO] [Initializing CSD process as Primary Server.]

I created the following Logstash configuration with a grok filter:
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "[%{DATESTAMP:time_stamp}] %{SPACE} [%{IP:server_ip}] %{SPACE} [%{WORD:process_name}] %{SPACE} [%{NUMBER:process_id}] %{SPACE} [%{LOGLEVEL:log_level}] %{SPACE} [%{GREEDYDATA:log_message}]" }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

But the events always come out tagged with a grok parse failure:
"input_type" => "log",
"tags" => [
[0] "beats_input_codec_plain_applied",
[1] "_grokparsefailure"

What is wrong with my grok filter?

Thanks & Regards
Siddharth

I see two issues. First, all of the opening and closing square brackets have to be escaped, since they are not introducing character classes. Second, %{SPACE} in the stock grok patterns is \s* (zero or more whitespace characters), so writing " %{SPACE} " with a literal space on each side requires at least two consecutive spaces, while your log has only a single space between fields. I think you want "%{SPACE}" with no surrounding spaces. Try

match => { "message" => "\[%{DATA:time_stamp}\]%{SPACE}\[%{IPV4:server_ip}\]%{SPACE}\[%{WORD:process_name}\]%{SPACE}\[%{NUMBER:process_id}\]%{SPACE}\[%{LOGLEVEL:log_level}\]%{SPACE}\[%{GREEDYDATA:log_message}\]" }
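If you want to sanity-check both fixes outside Logstash, here is a rough Python sketch of the equivalent regex. The translation is approximate: DATA becomes `.*?`, SPACE becomes `\s*`, and the IPv4, word, number, and loglevel patterns are simplified stand-ins for the stock grok definitions, not the exact ones Logstash ships.

```python
import re

# " %{SPACE} " compiles to " \s* ": two literal spaces plus an optional
# run of whitespace, so it can never match the single space in the log.
assert re.fullmatch(r" \s* ", " ") is None       # one space: no match
assert re.fullmatch(r" \s* ", "  ") is not None  # two spaces: matches
assert re.fullmatch(r"\s*", " ") is not None     # bare %{SPACE}: matches

# Rough equivalent of the corrected grok expression, with the square
# brackets escaped and no literal spaces around \s*.
pattern = re.compile(
    r"\[(?P<time_stamp>.*?)\]\s*"
    r"\[(?P<server_ip>(?:\d{1,3}\.){3}\d{1,3})\]\s*"
    r"\[(?P<process_name>\w+)\]\s*"
    r"\[(?P<process_id>\d+)\]\s*"
    r"\[(?P<log_level>[A-Za-z]+)\]\s*"
    r"\[(?P<log_message>.*)\]"
)

line = ("[Tue May 22 2018 15:16:56:325] [10.144.244.48] [CM] "
        "[12836] [Debug] [Starting Connection Manager event loop]")
print(pattern.match(line).groupdict())
```

Running this prints every captured field, which is a quick way to confirm the escaped brackets line up with your sample lines before restarting Logstash.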

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.