Logstash Grok Filter _tags vs _fields


(Pravin Kumar) #1

Dear All,

I have a few queries, as follows:

  • The requirement is to remove the tag/field Bwpayload and to add message, program and path as tags in case of _grokparsefailure. I am confused about whether to use remove_tag or remove_field. The sad part is that neither is working: I can still see the Bwpayload tag with values in Kibana.

  • Also, which timestamp format will the date plugin point to [the one generated internally by Logstash or the one from my logs]? I am facing a date mismatch error when it is not commented out.

2017 Feb 14 17:31:28:389 GMT +5 - this is from my logs, shown in Kibana under the timestamp field.

February 14th 2017, 17:31:28.455 - seen in Kibana under the @timestamp field.

  • In case of _grokparsefailure I would like to add a few fields, as given in the config below. These fields come from my pattern and event. Is this the right procedure to add them?

Kindly guide,

My filter config looks like this:

filter {
grok {
patterns_dir => "ELK/logstash-5.1.1/patterns/ingdevbw"
match => { "message" => "%{BWLOG}" }
remove_tag => [ "%{Bwpayload}" ]
}

date {

match => [ "timestamp" , "YYYY MMM DD HH:mm:ss:SSS" ]

#  remove_field => [ "timestamp" ]

}

if "_grokparsefailure" in [tags] {
drop {
add_field => {
parsefail_path => "%{path}"
parsefail_prog => "%{program}"
parsefail_message => "%{@message}"
}
}
}
}

and my pattern is as below:

Logstash pattern configuration

BWTIME %{YEAR} %{MONTH} %{MONTHDAY} %{TIME} GMT +%{INT}
BWLOG %{BWTIME:timestamp} %{PROG:program} %{WORD:loglevel} [%{USER:auth}] %{GREEDYDATA:Bwpayload}


(Magnus Bäck) #2

I am confused about whether to use remove_tag or remove_field.

Bwpayload is a field so you should use remove_field.

The sad part is that neither is working.

That's because you're using %{Bwpayload} which expands to the contents of the Bwpayload field, and that's not what you want. Just use Bwpayload.
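A corrected grok block, based on the config above, would look something like this (a sketch):

```
grok {
  patterns_dir => "ELK/logstash-5.1.1/patterns/ingdevbw"
  match => { "message" => "%{BWLOG}" }
  # Plain field name: remove_field => [ "%{Bwpayload}" ] would instead try
  # to remove a field named after the *contents* of Bwpayload.
  remove_field => [ "Bwpayload" ]
}
```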

Also, which timestamp format will the date plugin point to [the one generated internally by Logstash or the one from my logs]? I am facing a date mismatch error when it is not commented out.

I don't understand this sentence, but if the date filter fails it'll log details about the failure. In this case the problem is that your date pattern doesn't match the contents of the timestamp string.
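For reference, a date filter that matches the log's own timestamp (2017 Feb 14 17:31:28:389 GMT +5) might be sketched as below. Note that in the Joda-Time syntax the date filter uses, dd is day-of-month while DD is day-of-year, and the trailing "GMT +5" has to be handled separately since "+5" is not an offset the parser accepts. The timezone choice is an assumption:

```
# Strip the "GMT +5" suffix before parsing (assumption: the offset is
# constant for these logs).
mutate {
  gsub => [ "timestamp", " GMT \+\d+$", "" ]
}
date {
  # yyyy = year, MMM = month name, dd = day of month (not DD, day of year)
  match => [ "timestamp", "yyyy MMM dd HH:mm:ss:SSS" ]
  timezone => "Etc/GMT-5"   # Etc/GMT-5 means UTC+5 (POSIX sign convention)
}
```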

In case of _grokparsefailure I would like to add a few fields, as given in the config below. These fields come from my pattern and event. Is this the right procedure to add them?

Using add_field with the drop filter doesn't make any sense since drop deletes events. Use a mutate filter instead.
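That is, something along these lines (a sketch; field names taken from the config above):

```
if "_grokparsefailure" in [tags] {
  mutate {
    add_field => {
      "parsefail_path"    => "%{path}"
      "parsefail_message" => "%{message}"
    }
  }
}
```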


(Pravin Kumar) #3

Dear Magnus,

Thanks for the reply,

Bwpayload is a field so you should use remove_field.
That's because you're using %{Bwpayload} which expands to the contents of the Bwpayload field, and that's not what you want. Just use Bwpayload.

I used them as you suggested. Working fine now.

I don't understand this sentence, but if the date filter fails it'll log details about the failure. In this case the problem is that your date pattern doesn't match the contents of the timestamp string.

In Kibana I have two timestamps, as highlighted in the JSON output below, whereas the date plugin in my config does not match either of them. My question is: for which timestamp should I form the date { match => [ field, formats... ] } pattern in the config?

In case of _grokparsefailure I would like to add a few fields, as given in the config below. These fields come from my pattern and event. Is this the right procedure to add them?
Using add_field with the drop filter doesn't make any sense since drop deletes events. Use a mutate filter instead.

I used mutate, but only the parsefail_path field gets the desired output; the other two fields do not get the actual values and instead contain the literal names mentioned in the config (highlighted below in the output). I also tried parsefail_prog => [ "program" ].
Also, I don't want the other tags; is it okay if I use remove_field in the same mutate plugin?

Kindly help.

Kibana JSON Output:

{
"_index": "logstash-2017.02.15",
"_type": "esbbwlog",
"_id": "AVpBhE3RooOYGdG3tn-Y",
"_score": null,
"_source": {
"path": "D:/logs/TestService-TestService.log",
"@timestamp": "2017-02-15T11:24:34.113Z",
"parsefail_prog": "program",
"@version": "1",
"host": "SDINCNB00010",
"message": "2017 Feb 15 16:54:32:910 GMT +5 TestService-TestService User [User] - \r\n ",
"type": "esbbwlog",
"parsefail_path": "D:/logs/TestService-TestService.log",
"parsefail_message": "message",
"tags": [
"multiline",
"_grokparsefailure"
]
},
"fields": {
"@timestamp": [
** 1487157874113**
** ]**
},
}

My Config File:

filter {
grok {
patterns_dir => "ELK/logstash-5.1.1/patterns/ingdevbw"
match => { "message" => "%{BWLOG}" }
remove_field => [ "Bwpayload" ]

}
date {
match => [ "timestamp" , "YYYY MMM DD HH:mm:ss:SSS" ]
remove_field => [ "timestamp" ]
}

if "_grokparsefailure" in [tags] {
mutate {
add_field => {
parsefail_path => "%{path}"
parsefail_prog => "%{program}"
parsefail_message => "%{@message}"

}
}
}
}


(Magnus Bäck) #4

In Kibana i have two timestamp, as highlighted in below json output

You can ignore the one under fields. Focus on the one under _source.

whereas the date plugin in config is not matching the same.

Since the grok filter that extracts the timestamp to a field of its own doesn't work the date filter won't work either. Focus on one thing at a time.

I used mutate, but only the parsefail_path field gets the desired output; the other two fields do not get the actual values and instead contain the literal names mentioned in the config.

parsefail_message has the wrong contents because the name of the field is message and not @message, i.e. add_field => { "parsefail_message" => "%{message}" } is the correct syntax.

parsefail_prog has the wrong value because the program field only exists when the grok filter succeeds.
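Putting both corrections together, the conditional might look like:

```
if "_grokparsefailure" in [tags] {
  mutate {
    add_field => {
      "parsefail_path"    => "%{path}"
      # message, not @message, is the field grok matched against
      "parsefail_message" => "%{message}"
      # no parsefail_prog here: program is only set when grok succeeds
    }
  }
}
```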

Also, I don't want the other tags; is it okay if I use remove_field in the same mutate plugin?

Sure.
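For example, a single mutate filter can add fields and remove tags and fields in one go (the tag name below is illustrative, taken from the output above):

```
mutate {
  add_field    => { "parsefail_message" => "%{message}" }
  remove_field => [ "Bwpayload" ]   # removes a field
  remove_tag   => [ "multiline" ]   # removes an entry from [tags]
}
```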


(Pravin Kumar) #5

Thanks Magnus


(system) #6

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.