Hi!
I am new to Elasticsearch and am using Elasticsearch 6.2.3.
I want to create a grok pattern for the following log format:
[Tue May 22 2018 15:16:56:325] [10.144.244.48] [CM] [12836] [Debug] [Starting Connection Manager event loop]
[Tue May 22 2018 15:16:56:325] [10.144.244.48] [CM] [12836] [INFO] [System Information fetching is DISABLED]
[Tue May 22 2018 15:16:56:328] [10.144.244.48] [CM] [12836] [Debug] [Adding timer for disconnected host: 10.144.244.48]
[Tue May 22 2018 15:16:56:329] [10.144.244.48] [CM] [12836] [INFO] [Initializing CSD process as Primary Server.]
I created the following grok filter:
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => "[%{DATESTAMP:time_stamp}] %{SPACE} [%{IP:server_ip}] %{SPACE} [%{WORD:process_name}] %{SPACE} [%{NUMBER:process_id}] %{SPACE} [%{LOGLEVEL:log_level}] %{SPACE} [%{GREEDYDATA:log_message}]" }
  }
}
output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}
But every event always comes through tagged with a grok parse failure:
"input_type" => "log",
"tags" => [
[0] "beats_input_codec_plain_applied",
[1] "_grokparsefailure"
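While debugging, I tried an equivalent plain regex in Python with the literal square brackets escaped. This is only a rough sketch of what I intended the grok pattern to do (the escaped-bracket regex is my own guess at the problem, not grok syntax), and it does match a sample line:

```python
import re

# One sample line copied from the log above
line = ("[Tue May 22 2018 15:16:56:325] [10.144.244.48] [CM] [12836] "
        "[Debug] [Starting Connection Manager event loop]")

# Plain-regex version of the intended pattern, with the literal
# brackets escaped (my guess at what grok needs as well)
pattern = (
    r"\[(?P<time_stamp>[^\]]+)\]\s+"          # bracketed timestamp
    r"\[(?P<server_ip>\d{1,3}(?:\.\d{1,3}){3})\]\s+"  # IPv4 address
    r"\[(?P<process_name>\w+)\]\s+"           # e.g. CM
    r"\[(?P<process_id>\d+)\]\s+"             # e.g. 12836
    r"\[(?P<log_level>\w+)\]\s+"              # e.g. Debug / INFO
    r"\[(?P<log_message>.*)\]"                # rest of the line
)

m = re.match(pattern, line)
print(m.groupdict() if m else "no match")
```

So the escaped version matches fine in plain regex, which makes me suspect the unescaped [ ] in my grok pattern, or maybe that %{DATESTAMP} does not fit my timestamp format.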
What is wrong with my grok filter?
Thanks & Regards
Siddharth