_grokparsefailure - Success in Grok Debugger - Log with Double quotes JSON

I am facing issues while trying to parse this log. I have tried multiple options, but nothing seems to work. There is no error message in the Logstash logs, except for the _grokparsefailure tag on the records. Can you please help with what I am missing here?

Auto route reason : {"opportunityId":"0042f00000Kttt","accountNumber":"999999999999999","agId":"0051H0001111IWYqTTT","division":"US","errorCode":"EC-109","errorName":"GetDetails Not Found","errorDescription":"SiteDetails Not Found"}
filter {
    json {
        source => "message"
        add_field => {
            "region" => "us-east-1"
        }
    }
    if [message] =~ /Auto route reason/ {
        grok {
            match => { "message" => '%{CISCO_REASON} : {"opportunityId":"%{DATA:opportunityId}",%{GREEDYDATA:Greedymessage}}' }
        }
    }
}
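Since the message is a text prefix followed by a JSON object, one alternative approach (a sketch; the field name `reason_json` is something I made up, and the prefix is taken from the sample log above) is to let grok capture just the JSON substring into its own field and then hand that field to the json filter, instead of running json on the whole message:

```
filter {
  if [message] =~ /Auto route reason/ {
    # Capture everything after the prefix (the JSON object) into reason_json
    grok {
      match => { "message" => "Auto route reason : %{GREEDYDATA:reason_json}" }
    }
    # Parse the captured JSON; opportunityId, errorName, etc. become
    # top-level fields on the event
    json {
      source => "reason_json"
      remove_field => ["reason_json"]
    }
  }
}
```

This way the json filter does the field extraction, so you don't need a grok pattern that mirrors the JSON structure key by key.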

What is your structured data supposed to look like?

Hi Tim,

Thanks for your reply!

If by structured data you mean the output, I need the values of each of the fields from the log, like opportunityId, errorName, etc., as separate fields with their respective values.

Thanks,
Athul M

Any help is appreciated!!!

I found this source of definitions for the grok syntax that should help you: https://github.com/logstash-plugins/logstash-patterns-core/blob/master/patterns/ecs-v1/grok-patterns. There are a lot of patterns there that will look more suitable than GREEDYDATA for getting the individual fields out of the logs.
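For example (an untested sketch; the key names are taken from the sample log in the original post), the JSON keys could be matched individually with DATA instead of one GREEDYDATA:

```
grok {
  # Matches each JSON key in the order it appears in the sample log
  match => {
    "message" => '%{CISCO_REASON} : {"opportunityId":"%{DATA:opportunityId}","accountNumber":"%{DATA:accountNumber}","agId":"%{DATA:agId}","division":"%{DATA:division}","errorCode":"%{DATA:errorCode}","errorName":"%{DATA:errorName}","errorDescription":"%{DATA:errorDescription}"}'
  }
}
```

Note that this breaks if the key order ever changes, so parsing the JSON part with the json filter is generally more robust than matching it field by field.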

Can you explain more what this means? Is the data created in Elasticsearch but it has the wrong structure? Or does it only work when the log message matches /Auto route reason/?

When is it working? When is it not working? How do you define working?

Thanks again for the response!

I have gone through different patterns and tried them as well. In the example above I used GREEDYDATA for the Greedymessage field, as I at least want to see if any field is created correctly.

And with respect to this "No error message in the logs, except for this tag in the records."

So the records with the content 'Auto route reason' do get created correctly. However, the problem is that the grok pattern does not parse those records, and the individual fields like opportunityId or Greedymessage are not created on them.

When I use the Grok Debugger tool to test the grok pattern, it works for me, so I expect the pattern to work in the pipeline as well.
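One thing worth checking: the json filter in the config above runs on the whole message, which is not valid JSON for these lines, so (by default) it should be tagging those events with _jsonparsefailure. A temporary stdout output makes it easy to see all the tags and what the event actually looks like when it reaches grok (a debugging sketch):

```
output {
  # Print full events to the Logstash console while debugging
  stdout { codec => rubydebug }
}
```

If the message shown there contains extra content (for example a timestamp prefix or escaped quotes) that was not in the sample pasted into the Grok Debugger, the pattern can fail in the pipeline even though it passes in the debugger.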

Hope that clears the doubts.


Hm, I am not sure of situations where the tests on your grok string can pass in the Grok Debugger but not work in your pipeline.

You may want to try posting this question under Logstash.

Thanks, will do.