Grok filter is not working properly

Hello Guys,
I have Filebeat 7.1 installed on a Debian server. Filebeat sends data from files on this server to another server running Logstash 7.6. Here are the config files:

filebeat.yml:

#=========================== Filebeat inputs =============================

filebeat.inputs:

  - type: log
    # Change to true to enable this input configuration.
    enabled: true
    paths:
      - /root/code/cigol/logs/server.log
    json.keys_under_root: true
    json.overwrite_keys: true
    json.add_error_key: true
    force_close_files: true
    fields:
      env: dev
      type: voiceserver.log

  - type: log
    enabled: true
    paths:
      - /usr/local/freeswitch/log/freeswitch.log
    force_close_files: true
    fields:
      env: dev
      type: freeswitch.log

processors:

  - drop_fields:
      fields: ["agent.ephemeral_id", "time", "agent.hostname", "agent.id", "agent.type", "agent.version", "ecs.version", "input.type", "log.offset", "@version", "fields.env", "tags"]

#----------------------------- Logstash output --------------------------------
output.logstash:
  hosts: ["35.171.202.75:5044"]
logstash.conf:

input.conf
input {
  beats {
    port => 5044
  }
}

filter.conf

filter {
  if [fields][env] == "dev" {
    if [source] == "/root/code/cigol/logs/server.log" {
      json {
        source => "message"
      }
    }
  } else if [source] == "/usr/local/freeswitch/log/freeswitch.log" {
    grok {
      match => { "message" => "%{NOTSPACE:uuid} %{TIMESTAMP_ISO8601:date} [%{LOGLEVEL:loglevel}] %{GREEDYDATA:message}" }
      remove_field => ["message"]
    }
  }
}

output.conf

output {
  elasticsearch {
    hosts => ["127.0.0.1:9200"]
    index => "%{[fields][type]}-%{+YYYY.MM.dd}"
  }
  stdout { codec => rubydebug }
}

Application log format:

79110982-6d35-4b80-9be7-6ec9772313f9 2020-04-21 14:25:55.001130 [DEBUG] switch_core_state_machine.c:749 (sofia/3clogic_external/3001@freeswitch-registrar-10x.i3clogic.com:5505) State DESTROY

Kibana output:

message 79110982-6d35-4b80-9be7-6ec9772313f9 2020-04-21 14:25:55.001130 [DEBUG] mod_sofia.c:364 sofia/3clogic_external/3001@freeswitch-registrar-10x.i3clogic.com:5505 SOFIA DESTROY

I want to segregate the message as below:

"UUID" = 79110982-6d35-4b80-9be7-6ec9772313f9
"date" = 2020-04-21 14:25:55.001130
"loglevel" = DEBUG
"message" = switch_core_state_machine.c:749 (sofia/3clogic_external/3001@freeswitch-registrar-10x.i3clogic.com:5505) State DESTROY
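The desired split can be sketched as a plain regular expression. This is a Python illustration of what the grok pattern must ultimately match (grok patterns compile down to regexes); the field names and the hand-expanded date pattern are my assumptions, not part of the original config. Note that the literal square brackets around the log level must be escaped, because an unescaped [ starts a character class:

```python
import re

# Hand-expanded equivalent of the intended grok pattern. The square
# brackets around the log level are escaped (\[ \]) so they match the
# literal characters in the log line.
LINE_RE = re.compile(
    r"^(?P<uuid>\S+) "
    r"(?P<date>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+) "
    r"\[(?P<loglevel>[A-Z]+)\] "
    r"(?P<message>.*)$"
)

sample = (
    "79110982-6d35-4b80-9be7-6ec9772313f9 2020-04-21 14:25:55.001130 "
    "[DEBUG] switch_core_state_machine.c:749 "
    "(sofia/3clogic_external/3001@freeswitch-registrar-10x.i3clogic.com:5505) "
    "State DESTROY"
)

m = LINE_RE.match(sample)
print(m.group("uuid"))      # 79110982-6d35-4b80-9be7-6ec9772313f9
print(m.group("loglevel"))  # DEBUG
```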

Please help me with this.

I do not believe that the timestamp format you have is ISO8601. See this for advice on how to develop a grok pattern.

Still getting the same output in Kibana:

filter {
  if [fields][env] == "dev" {
    if [source] == "/root/code/cigol/logs/server.log" {
      json {
        source => "message"
      }
    }
  } else if [source] == "/usr/local/freeswitch/log/freeswitch.log" {
    grok {
      pattern_definitions => { "MYDATETIME" => "%{YEAR}/%{MONTHNUM}/%{MONTHDAY} %{TIME}" }
      match => { "message" => "%{NOTSPACE:uuid} ^%{MYDATETIME:time} [%{LOGLEVEL:loglevel}] %{GREEDYDATA:msg}" }
      remove_field => ["message"]
    }
  }
}

I am new to Logstash, please help me.

The ^ anchors the pattern to the start of the line. You need to remove it, or move it to before %{NOTSPACE:uuid}.
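The effect of a misplaced ^ can be seen with a plain regex (a Python illustration; the sample line here is shortened and hypothetical):

```python
import re

line = "abc-123 2020-04-21 18:50:48.541132 [DEBUG] some message"

# ^ in the middle of a pattern asserts start-of-string at that point,
# which can never be true after earlier parts of the pattern have
# already consumed characters -- so the match always fails.
assert re.search(r"\S+ ^2020", line) is None

# Anchored at the front, the same idea matches as intended.
assert re.search(r"^\S+ 2020", line) is not None
```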

Still getting the same output:
message 043e5c71-30a7-484c-9616-1f0dd3712875 2020-04-21 18:50:48.541132 [DEBUG] switch_core_session.c:1726 Session 146 (sofia/3clogic_external/3001@freeswitch-registrar-10x.i3clogic.com:5505) Locked, Waiting on external entities

filter {
  if [fields][env] == "dev" {
    if [source] == "/root/code/cigol/logs/server.log" {
      json {
        source => "message"
      }
    }
  } else if [source] == "/usr/local/freeswitch/log/freeswitch.log" {
    grok {
      pattern_definitions => { "MYDATETIME" => "%{YEAR}/%{MONTHNUM}/%{MONTHDAY} %{TIME}" }
      match => { "message" => "^%{NOTSPACE:uuid} %{MYDATETIME:time} [%{LOGLEVEL:loglevel}] %{GREEDYDATA:msg}" }
      remove_field => ["message"]
    }
  }
}

I suggest you follow that link I posted above and build your pattern in the way it describes.

Can you please elaborate on what exactly I am doing wrong?

You are trying to write a pattern that matches the entire line. That is not the best approach. A better approach is described in the post I linked to.

I also tried

match => { "message" => "%{NOTSPACE:uuid}" }

but no luck

The grok pattern below parses properly in https://grokdebug.herokuapp.com, but unfortunately the same result is not reflected in Kibana:

%{NOTSPACE:uuid} %{TIMESTAMP_ISO8601:date} [%{LOGLEVEL:loglevel}] %{GREEDYDATA:message}
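One difference worth checking (my assumption, not confirmed in the thread): an online debugger can be forgiving about literal square brackets, but in the regex that grok compiles to, an unescaped [ opens a character class, so [%{LOGLEVEL:loglevel}] and \[%{LOGLEVEL:loglevel}\] behave very differently. A simplified Python sketch of the two behaviors:

```python
import re

text = "[DEBUG]"

# Escaped: \[ and \] match the literal brackets, capturing DEBUG.
m = re.search(r"\[([A-Z]+)\]", text)
assert m and m.group(1) == "DEBUG"

# Unescaped: [(DEBUG)] is a character class matching ONE character
# from the set {(, D, E, B, U, G, )} -- not the bracketed word.
m2 = re.search(r"[(DEBUG)]", text)
assert m2.group(0) == "D"
```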

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.