Hi there,

I'm trying to use a grok pattern here, and as you can see, my pattern already matches the log,

but when I look in Kibana, I don't know why the tags field always shows _grokparsefailure.

This is the code in my pipeline:

if [kubernetes][namespace] in ["ff-rr"] {
  grok {
    match => ["message", "%{WORD:type}:\s+(?<data_customer>%{WORD}\s+%{WORD}):\s+%{GREEDYDATA:bodydata}"]
  }
  json {
    source => "bodydata"
    skip_on_invalid_json => true
  }
}

What should I do? No character needs escaping in my grok pattern.


Please share the result message you are getting in Kibana.

Go to Discover, find a document that has this tag, expand it, then go to the JSON tab and copy the content.
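You can also test the grok filter in isolation with a minimal stdin-to-stdout pipeline, outside any conditional. This is a sketch; the file name and the invocation are assumptions, adjust them to your install:

```
# test.conf - minimal sketch to verify the grok pattern alone
# (run with something like: bin/logstash -f test.conf,
#  then paste one log line on stdin and inspect the output)
input { stdin {} }
filter {
  grok {
    match => ["message", "%{WORD:type}:\s+(?<data_customer>%{WORD}\s+%{WORD}):\s+%{GREEDYDATA:bodydata}"]
  }
}
output { stdout { codec => rubydebug } }
```

If the pattern matches there but fails in your real pipeline, the problem is likely the conditional or the input, not the pattern.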

Here is the value:

"message": "Response: LINK CUSTOMER: {\"responseCode\":\"xx\",\"responseDesc\":\"Blokir \\/ Tidak \",\"data\":null}",

You need to share the entire document you have in Elasticsearch.

It's production data. I need some time to mask it, or you can tell me exactly which parts you want to see.

Share the kubernetes fields, and also share your entire Logstash pipeline.

It is pretty hard to troubleshoot without more context, such as the entire pipeline and the actual output.

From what you shared, I see no issue with the grok filter you are using; it worked for me. So something else may be wrong in your pipeline, or maybe the event is not matching the conditional you are using.
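For reference, against the message you shared, the pattern should break the line down roughly like this (my annotation, a sketch, with the JSON body abbreviated):

```
# Input: Response: LINK CUSTOMER: {"responseCode":"xx","responseDesc":"Blokir \/ Tidak ","data":null}
#
# %{WORD:type}                         -> type:          "Response"
# (?<data_customer>%{WORD}\s+%{WORD})  -> data_customer: "LINK CUSTOMER"
# %{GREEDYDATA:bodydata}               -> bodydata:      "{\"responseCode\":\"xx\", ...}"
```

So the json filter on bodydata should then get a valid JSON string to parse.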

What is the source of data?

I don't know why this happens. When I change this line

if [kubernetes][namespace] in ["ff-rr"] {

to

if [kubernetes][namespace] == "ff-rr" {

it seems to work. I tested the first version in my local environment before and it worked well, but when I applied it in production it did not work. Is there any difference between them?

Yes, you cannot use in to test membership of a list with a single member. It is parsed as a field reference, so Logstash is testing the (non-existent) field ["ff-rr"], and the expression evaluates to false. It is very hard to fix this in the parser. It is tracked here.
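As a workaround, either compare directly with == when you have a single value, or give the list more than one member so it is parsed as a list literal. A sketch ("some-other-ns" is a made-up placeholder, not from your setup):

```
# Works: direct equality against a single value
if [kubernetes][namespace] == "ff-rr" {
  # filters here
}

# Should also work: with two or more members the right-hand side
# is parsed as a list, so 'in' behaves as expected
if [kubernetes][namespace] in ["ff-rr", "some-other-ns"] {
  # filters here
}
```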

OK, thanks for the info. I'll remember that.