Problem with grok filter in Logstash

Hello,
I have the latest version of the ELK stack.

On the target server I have Filebeat:

filebeat version 6.4.2

On this server I receive messages like this:

2018-10-31 10:12:09,679 INFO [Pop3SSLServer-119] [ip=xxx.xx.xx.xxx;oip=xxx.xx.xx.xx;] security - cmd=Auth; account=xxx@xx.xx.xx; protocol=pop3;
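For reference, here is a plain-regex sketch (Python) of the fields I want to extract. The field names mirror my grok captures, but the regex itself and the sample values are only illustrative, not the actual grok expansion:

```python
import re

# Sample line with placeholder values (the real IPs and account are masked above).
LINE = ("2018-10-31 10:12:09,679 INFO [Pop3SSLServer-119] "
        "[ip=203.0.113.10;oip=203.0.113.20;] security - "
        "cmd=Auth; account=user@example.com; protocol=pop3;")

# Hand-written equivalent of the grok captures (illustrative only).
pattern = re.compile(
    r"(?P<date>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) "
    r"(?P<log_level>\w+) "
    r"\[(?P<service>[^\]]+)\] "
    r"\[ip=(?P<server_ip>[^;]+);oip=(?P<client_ip>[^;]+);\] "
    r"security - cmd=(?P<command>\w+); "
    r"(?P<username_type>\w+)=(?P<username>[^;]+); "
    r"protocol=(?P<protocol>\w+);"
)

m = pattern.match(LINE)
print(m.groupdict())  # all nine fields extracted
```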

I use the following grok expression:

%{TIMESTAMP_ISO8601:date} %{WORD:log_level} \[(?:%{DATA:service})\] \[?(ip=%{IPORHOST:server_ip};)(oip=%{IPORHOST:client_ip};)\] security - cmd=%{WORD:command}; (%{WORD:username_type}=%{DATA:username}(; protocol=%{WORD:protocol})?;)?

I tested it at http://grokdebug.herokuapp.com/ and everything looks good.

But I think I made a mistake in the Logstash config.

cat filebeat-input.conf
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:date} %{WORD:log_level} \[(?:%{DATA:service})\] \[?(ip=%{IPORHOST:server_ip};)(oip=%{IPORHOST:client_ip};)\] security - cmd=%{WORD:command}; (%{WORD:username_type}=%{DATA:username}(; protocol=%{WORD:protocol})?;)? %{DATA:message}" }
    overwrite => [ "message" ]
  }
}
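When the grok match fails, Logstash does not drop the event; it adds a _grokparsefailure tag, so in the stdout rubydebug output a failed event looks roughly like this (illustrative fragment, values abbreviated):

```
{
    "message" => "2018-10-31 10:12:09,679 INFO  [Pop3SSLServer-119] ...",
    "tags" => [
        [0] "_grokparsefailure"
    ]
}
```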

cat output-elasticsearch.conf
output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
    index => "%{[@metadata][beat]}-%{+YYYY.MM.dd}"
  }
}

Everything works except the grok filter. Can somebody help me?

I also tried this:

cat filebeat-input.conf
input {
  beats {
    port => 5044
  }
}

filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:date} %{WORD:log_level} \[(?:%{DATA:service})\] \[?(ip=%{IPORHOST:server_ip};)(oip=%{IPORHOST:client_ip};)\] security - cmd=%{WORD:command}; (%{WORD:username_type}=%{DATA:username}(; protocol=%{WORD:protocol})?;)?" ]
  }
}

and many other variants, but it always fails.

One more test

cat logstash-filter.conf
input { stdin { } }

filter {
  grok {
    match => [ "message", "%{TIMESTAMP_ISO8601:date} %{WORD:log_level} \[(?:%{DATA:service})\] \[?(ip=%{IPORHOST:server_ip};)(oip=%{IPORHOST:client_ip};)\] security - cmd=%{WORD:command}; (%{WORD:username_type}=%{DATA:username}(; protocol=%{WORD:protocol})?;)?" ]
    add_field => { "date" => "%{date}" }
  }
}

output {
  elasticsearch { hosts => ["localhost:9200"] }
  stdout { codec => rubydebug }
}

Then I run:

bin/logstash -f logstash-filter.conf

Then I pasted this line:

2018-10-31 11:39:06,699 INFO [Pop3SSLServer-116] [ip=xxx.xx.xx.xx;oip=xxx.xx.xx.xx;] security - cmd=Auth; account=xxx@xxx.xxx.xx; protocol=pop3;

and received:

{
    "protocol" => "pop3",
    "message" => "2018-10-31 11:36:52,702 INFO [Pop3SSLServer-124] [ip=xxx.xx.xx.xx;oip=xxx.xx.xx.xx;] security - cmd=Auth; account=xxxxx@xxx.xx.xx; protocol=pop3;",
    "server_ip" => "xxx.xx.xx.xx",
    "client_ip" => "xxx.xx.xx.xx",
    "username" => "xxxxxx@xxx.xx.xx",
    "@version" => "1",
    "@timestamp" => 2018-10-31T09:40:29.205Z,
    "log_level" => "INFO",
    "date" => [
        [0] "2018-10-31 11:36:52,702",
        [1] "2018-10-31 11:36:52,702"
    ],
    "service" => "Pop3SSLServer-124",
    "command" => "Auth",
    "host" => "elk",
    "username_type" => "account"
}

But with the production config it fails. Why?

@MrSnaKe
What do the Logstash logs say?

Hello,
The problem was an additional space between INFO and [Pop3SSLServer-124].
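To illustrate the failure mode with a simplified Python regex (my own simplification, not the real grok expansion): a single literal space after the log level cannot absorb a second space, while \s+ can:

```python
import re

# A single literal space between the level and the bracketed service
# (as in "%{WORD:log_level} [...") breaks on a double space; writing
# "%{WORD:log_level}\s+" in the grok pattern tolerates it.
strict = re.compile(r"(\w+) \[(\S+)\]")     # one literal space
lenient = re.compile(r"(\w+)\s+\[(\S+)\]")  # one or more whitespace chars

one_space = "INFO [Pop3SSLServer-124]"
two_spaces = "INFO  [Pop3SSLServer-124]"

print(bool(strict.match(one_space)))    # True
print(bool(strict.match(two_spaces)))   # False - the extra space breaks it
print(bool(lenient.match(two_spaces)))  # True
```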

The problem is fixed now.

Thanks to all for the help.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.