Can't Get Custom Grok Filter To Work

Hi, I am new to Elasticsearch and have started using Logstash to parse unstructured logs. As far as I understood from the documentation, a grok filter should split each log line into fields, but it doesn't work for me: every log just gets uploaded in a single "message" field. How can I use the grok filter to separate the logs into different fields?
Here is my logstash.conf file.

input {
    file {
            path => "/home/amo/qwe.txt"
            start_position => "beginning"
            ignore_older => 0
    }
}
filter {
    if [message] =~ /":  103]"/ or [message] =~ /":   93]"/ or [message] =~ /":   89]"/ {
            grok {
                    patterns_dir => ["/etc/logstash/conf.d/patterns"]
                    match => { "message" => "(?<timestamp>(\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3})) (?<Message_Type>(\w{4,5}))\s+\[(?<Source>[\w\.]*)\s*:(\s){2,3}(?<Message_Code>\d{2,3})\] (?<Message>[\w\s]*):\s*(?<NPC_Event_Bytes>[\dA-F ]*)" }
            }
    }
}
output {
    stdout {
            codec => rubydebug
    }
    elasticsearch {
            hosts => ["localhost:9200"]
    }
}

Here are some of the logs I'm trying to parse.

2018-03-11 17:33:27,177 WARN  [tr.com.xxx.tcconnector.npc.NPCClientActor                   :  103] Incoming message from NPC is: 02 AE 01 52 FF 03
2018-03-11 17:33:27,677 WARN  [tr.com.xxx.tcconnector.npc.NPCClientActor                   :  103] Incoming message from NPC is: 02 AE 04 37 00 00 00 9F 03
2018-03-11 17:33:27,760 WARN  [tr.com.xxx.tcconnector.npc.NPCClientActor                   :   93] Command CommandWithResponseNIRT to be sent to NPC is:	02 20 02 79 53 0A 03
2018-03-11 17:33:27,777 WARN  [tr.com.xxx.tcconnector.npc.NPCClientActor                   :  103] Incoming message from NPC is: 02 60 06 79 53 05 50 00 A7 BC 03
2018-03-11 17:47:36,387 WARN  [tr.com.xxx.tcconnector.npc.NPCClientActor                   :   89] Command CommandNIRT to be sent to NPC is:	02 20 02 83 56 F5 03
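
To make it concrete, this is roughly what I expect the first line above to be split into, based on my pattern (so I may be reading it wrong):

{
    "timestamp": "2018-03-11 17:33:27,177",
    "Message_Type": "WARN",
    "Source": "tr.com.xxx.tcconnector.npc.NPCClientActor",
    "Message_Code": "103",
    "Message": "Incoming message from NPC is",
    "NPC_Event_Bytes": "02 AE 01 52 FF 03"
}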

I had posted a fuller version of what I wanted to do, but no one replied. I would very much appreciate some insight into what I am doing wrong and how to fix it. Thank you.

By the way, I tried the grok pattern in both the herokuapp grok debugger and the Kibana grok debugger, and it worked for each of the log lines above. However, Logstash doesn't seem to apply it: when I commented out the filter plugin, the output didn't change. Also, when I added an

else {
     drop {}
}

after the if block, this time nothing was uploaded at all.
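
I also thought about checking whether grok itself is failing on the events, since as far as I understand it adds a "_grokparsefailure" tag when a match fails. Something like this is what I had in mind (not sure if it is the right way to debug this):

output {
    # only print events where the grok match failed
    if "_grokparsefailure" in [tags] {
            stdout {
                    codec => rubydebug
            }
    }
}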

I am changing the filename each time I rerun Logstash so that it reads the file from the beginning.
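
(I have also read that pointing sincedb_path at /dev/null should make the file input re-read the file on every run, roughly like the sketch below, but I am not sure I am using it correctly.)

input {
    file {
            path => "/home/amo/qwe.txt"
            start_position => "beginning"
            # supposedly this stops Logstash from remembering how far it has already read
            sincedb_path => "/dev/null"
    }
}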

I removed the if part completely and just left the grok part. It still won't parse. I tried using custom patterns defined in a patterns directory and changed my regular expression accordingly (tested it in the Kibana grok debugger), still no change. The documentation says that grok will add the fields named in the match setting, but it doesn't. Do I have to explicitly define them? If so, how? Please help me.
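
For what it's worth, this is roughly what my custom-pattern attempt looked like (simplified; the NPC_BYTES name is just something I made up, and I only swapped it in for the last capture group):

# /etc/logstash/conf.d/patterns/npc -- one "NAME regex" pair per line
NPC_BYTES [\dA-F ]+

filter {
    grok {
            patterns_dir => ["/etc/logstash/conf.d/patterns"]
            # same expression as before, but the bytes at the end use the custom pattern
            match => { "message" => "(?<timestamp>\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2},\d{3}) (?<Message_Type>\w{4,5})\s+\[(?<Source>[\w\.]*)\s*:\s{2,3}(?<Message_Code>\d{2,3})\] (?<Message>[\w\s]*):\s*%{NPC_BYTES:NPC_Event_Bytes}" }
    }
}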