Multiple GROK filters not applying to log message

alerting

(Ashok ) #1

I have created two grok filters to match my log messages, but only the first filter is applied; the second filter is not applied to any of the log messages.

Each filter parses the logs on its own: I created the first filter for the loglevel and the second one for the fields I need.

match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} (\[%{WORD:loglevel}\]) %{GREEDYDATA}" }

match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} (\[%{WORD:loglevel}\]) %{DATA} - %{DATA:method} processing time for transactionId : %{WORD:transactionid} documentType : %{WORD:document_type} is %{INT:duration:int}" }
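For context, if both patterns above are given to a single grok filter, that would explain the symptom: grok stops at the first pattern that matches (break_on_match defaults to true), so the generic pattern shadows the detailed one. A sketch of that arrangement (the field name document_type is assumed, since grok field names cannot contain spaces):

```
filter {
  grok {
    # break_on_match => true (the default): grok stops at the first
    # matching pattern, so the generic pattern below "wins" on every
    # line and the detailed pattern is never tried.
    match => { "message" => [
      "%{TIMESTAMP_ISO8601:timestamp} \[%{WORD:loglevel}\] %{GREEDYDATA}",
      "%{TIMESTAMP_ISO8601:timestamp} \[%{WORD:loglevel}\] %{DATA} - %{DATA:method} processing time for transactionId : %{WORD:transactionid} documentType : %{WORD:document_type} is %{INT:duration:int}"
    ] }
  }
}
```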

below are my log messages

2018-11-16 10:09:58.079 [INFO] from application in pool-3-thread-8 - Converting Document Authentication Response for transactionId : N9xnFCkW5Lpxa9tj to legacy api version : 1.1
2018-11-16 10:09:58.079 [INFO] from application in pool-3-thread-8 - LegacyDocumentAuthenticationResponseConversion processing time for transactionId : N9xnFCkW5Lpxa9tj documentType : License is 0 msec
2018-11-16 10:09:58.079 [INFO] from application in pool-3-thread-8 - Encrypting and Inserting Transaction Data into DB for transactionId : N9xnFCkW5Lpxa9tj
2018-11-16 10:09:58.080 [INFO] from application in pool-3-thread-8 - Inserting headshot into DB for transactionId : N9xnFCkW5Lpxa9tj
2018-11-16 10:09:58.187 [INFO] from application in pool-3-thread-8 - updateTransaction : Number of Documents modified in db for transactionId : 0
2018-11-16 10:09:58.187 [INFO] from application in pool-3-thread-8 - updateTransaction : merchantId : 5dce6959-b2c2-4890-bdf3-3e0d36662c4c inserted true for transactionId :N9xnFCkW5Lpxa9tj
2018-11-16 10:09:58.187 [INFO] from application in pool-3-thread-8 - Updating Accounting Data for transactionId : N9xnFCkW5Lpxa9tj
2018-11-16 10:09:58.188 [INFO] from application in pool-3-thread-8 - updateAccounts : Number of Documents modified in db for transactionId : 0
2018-11-16 10:09:58.188 [INFO] from application in pool-3-thread-8 - updateAccounts : requestType : Authentication inserted true for transactionId :N9xnFCkW5Lpxa9tj
2018-11-16 10:09:58.188 [INFO] from application in pool-3-thread-8 - Accounting Data updated for transactionId : N9xnFCkW5Lpxa9tj, requestType : Authentication
2018-11-16 10:09:58.188 [INFO] from application in pool-3-thread-8 - TransactionDataEncryptionAndInsertion processing time for transactionId : N9xnFCkW5pxa9tj documentType : License is 109 msec
2018-11-16 10:09:58.188 [INFO] from application in pool-3-thread-8 - AuthenticateDocument processing time for transactionId : N9xnFCkW5Lpxa9tj documentType : Licensemerchant : 5dce6959-b2c2-4890-bdf3-3e0d36662c4c is 23389 msec

(Tek Chand) #2

@ashok9177, your first grok filter is generic, and the part it matches is common to all of your logs. Please try changing the order of your grok filters; it should work then.

Thanks.


(Ashok ) #3

Even if I change the order, only the first filter works; the second filter is still not applied.


(Tek Chand) #4

@ashok9177, can you please try the patterns below:

match => { "message" => [ "%{TIMESTAMP_ISO8601:timestamp} \[%{WORD:loglevel}\] %{DATA} - %{DATA:method} processing time for transactionId : %{WORD:transactionid} documentType : %{WORD:document_type} is %{INT:duration} %{GREEDYDATA}", "%{TIMESTAMP_ISO8601:timestamp} \[%{WORD:loglevel}\] %{GREEDYDATA:message}" ] }
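A side note on the fallback pattern above: it captures into message, a field that already exists on the event, so grok would turn message into an array of old and new values unless it is told to overwrite. A sketch of the full filter block with that addition (field names as in the thread, document_type assumed):

```
filter {
  grok {
    # Most specific pattern first; the generic fallback last.
    match => { "message" => [
      "%{TIMESTAMP_ISO8601:timestamp} \[%{WORD:loglevel}\] %{DATA} - %{DATA:method} processing time for transactionId : %{WORD:transactionid} documentType : %{WORD:document_type} is %{INT:duration} %{GREEDYDATA}",
      "%{TIMESTAMP_ISO8601:timestamp} \[%{WORD:loglevel}\] %{GREEDYDATA:message}"
    ] }
    # Replace the original message instead of appending to it.
    overwrite => [ "message" ]
  }
}
```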

Please restart logstash service after making the above changes.

I hope the above filter works for you.

Thanks.


(Ashok ) #5

Thanks for your reply. May I know the cause? I have to write 10 grok filters.


(Tek Chand) #6

@ashok9177, did it work for you?

If yes, then you can add multiple grok patterns for a single message, as I added two in the filter above. Separate the patterns with a comma. One more thing: keep the pattern below last:

%{TIMESTAMP_ISO8601:timestamp} \[%{WORD:loglevel}\] %{GREEDYDATA:message}

If we use multiple match statements in a single grok filter, then all logs get parsed by the first pattern. I faced the same issue earlier.
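The advice above can be sketched for the 10-pattern case like this (only three patterns shown; the detailed ones come first, comma-separated, with the generic fallback last — the second pattern is derived from one of the sample log lines earlier in the thread):

```
filter {
  grok {
    match => { "message" => [
      "%{TIMESTAMP_ISO8601:timestamp} \[%{WORD:loglevel}\] %{DATA} - %{DATA:method} processing time for transactionId : %{WORD:transactionid} documentType : %{WORD:document_type} is %{INT:duration} %{GREEDYDATA}",
      "%{TIMESTAMP_ISO8601:timestamp} \[%{WORD:loglevel}\] %{DATA} - Inserting headshot into DB for transactionId : %{WORD:transactionid}",
      "%{TIMESTAMP_ISO8601:timestamp} \[%{WORD:loglevel}\] %{GREEDYDATA:message}"
    ] }
  }
}
```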

Thanks.


(Ashok ) #7

Yes, your filter is working for me, but adding 10 patterns on a single line doesn't look good. Do you have any idea why the separate filters are not working now?

The same approach worked for me before.


(Tek Chand) #8

@ashok9177, are all these logs coming from a single log file path?

If they are coming from different file paths, we can use a different grok filter for each log file.

I am also using the same approach at my end, and it is not creating any performance issues for the Logstash node.

I also tried multiple match statements in a single grok filter, but that did not work at my end. If it was working earlier at your end, then I don't know how it was working.

Thanks.


(Tek Chand) #9

@ashok9177,

Your filter above has a small mistake, which I corrected in my filter: you used parentheses () around the loglevel field. What is the purpose of those ()?

Thanks.


(Ashok ) #10

What is the purpose of those ()?

I don't know; I copied it from some website, but as a single filter it is working. Also, can you please suggest how to exclude the api keyword from the loglevel position? Some log messages have an api keyword there; below is an example:

2018-11-16 12:24:00,129 [api] [MainThread ] [INFO ] Saving to /home/ubuntu/temp/caffe_demos_uploads/9e59effd-9534-4934-b3d7-c9182a07d507image.jpg.


(Tek Chand) #11

@ashok9177

You can match and discard anything using escaped regex tokens (\) as shown below:

%{TIMESTAMP_ISO8601:timestamp}\,\d+\s\[\w+\]

Thanks.


(Ashok ) #12

But I want to pick the words INFO, ERROR, and DEBUG, and exclude other words like api.


(Ashok ) #13

If the logs are coming from different servers, and each server's logs have different field names, how do I write an if condition in the grok filter?
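For reference, routing logs from different servers to different grok filters is usually done with a Logstash conditional around the grok filter rather than inside grok itself. A sketch, with the field name (type) and its values assumed purely for illustration:

```
filter {
  # "type" must be a field that actually distinguishes the two
  # sources in your events (other options: "host", the file path, ...).
  if [type] == "app-server" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} \[%{WORD:loglevel}\] %{GREEDYDATA:logmessage}" }
    }
  } else if [type] == "api-server" {
    grok {
      match => { "message" => "%{TIMESTAMP_ISO8601:timestamp} \[\w+\] \[\w+\s\] \[%{WORD:loglevel}\s\] %{GREEDYDATA:logmessage}" }
    }
  }
}
```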


(Tek Chand) #14

@ashok9177

%{TIMESTAMP_ISO8601:timestamp}\,\d+\s\[\w+\]

The above pattern shows how you can include or discard any field or word. It was only for your reference.

%{TIMESTAMP_ISO8601:timestamp}\,\d+\s\[\w+\]\s\[\w+\s\]\s\[%{WORD:loglevel}

You can extend the above filter as per your logs and requirements.
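One possible extension is sketched below as a complete grok filter for the api-style log line quoted earlier (the field name logmessage is assumed):

```
filter {
  grok {
    # Matches lines such as:
    #   2018-11-16 12:24:00,129 [api] [MainThread ] [INFO ] Saving to ...
    # The [api] and [MainThread ] brackets are consumed with \w+ and not
    # captured, so only the third bracket's word lands in loglevel.
    match => { "message" => "%{TIMESTAMP_ISO8601:timestamp}\,\d+\s\[\w+\]\s\[\w+\s\]\s\[%{WORD:loglevel}\s*\]\s%{GREEDYDATA:logmessage}" }
  }
}
```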

Thanks.