Logstash config for grok (always get [0] "_grokparsefailure")

Hi,
I am facing a problem parsing a JSON string in my log file.
Actually, I want to parse the JSON string which is embedded as data={....}, as you can see in the log entry below.

What I am using:

Elastic search : elasticsearch-7.7.1-windows-x86_64
logstash : logstash-7.8.0
Kibana : kibana-7.2.0-windows-x86_64

My log file entries look like this:

2020-06-30 17:41:47.521 INFO 9848 --- [http-nio-8080-exec-6] c.java.controllers.TestController : data={"ihqc":"66","masterJob":"chicago","merge":"Yes","jobs":["966","965"],"totalPages":"5","type":"jon","user":"marcel"}
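In other words, I want to extract the payload after data= and parse it as JSON. Roughly, in Python terms (just an illustration, not part of the pipeline):

```python
import json
import re

line = ('2020-06-30 17:41:47.521  INFO 9848 --- [http-nio-8080-exec-6] '
        'c.java.controllers.TestController : '
        'data={"ihqc":"66","masterJob":"chicago","merge":"Yes",'
        '"jobs":["966","965"],"totalPages":"5","type":"jon","user":"marcel"}')

# grab everything after "data=" and parse it as JSON
payload = re.search(r'data=(\{.*\})', line).group(1)
parsed = json.loads(payload)
print(parsed["user"], parsed["type"])  # marcel jon
```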

My logstash.conf file is:

input {
  file {
    path => "C:/elk/spring-boot-elk.log"
  }
}

filter {
  grok {
    match => ["%{TIMESTAMP_ISO8601:date}%{GREEDYDATA}data=%{GREEDYDATA:request}"]
    add_tag => ["stacktrace"]
  }
  json {
    source => "request"
    target => "parsedJson"
    remove_field => ["request"]
  }
  mutate {
    add_field => {
      "user" => "%{[parsedJson][user]}"
      "type" => "%{[parsedJson][type]}"
    }
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

My logstash console shows _grokparsefailure for every log entry in the file:

{
    "@timestamp" => 2020-06-30T15:41:47.584Z,
          "path" => "C:/elk/spring-boot-elk.log",
          "host" => "*****",
      "@version" => "1",
       "message" => "2020-06-30 17:41:46.298  INFO 9848 --- [http-nio-8080-exec-4] c.javainuse.controllers.TestController   : data={\"ihqc\":\"44\",\"masterJob\":\"chicago\",\"merge\":\"No\",\"jobs\":[\"966\",\"965\"],\"totalPages\":\"2\",\"type\":\"jon\",\"user\":\"Stefen\"}\r",
          "type" => "java",
          "tags" => [
        [0] "_grokparsefailure"
    ]
}

Would anybody help me sort out this problem?
regards,
Naeem

I find it hard to believe that logstash even runs this:

It should be

grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:date}%{GREEDYDATA}data=%{GREEDYDATA:request}" }
    add_tag => ["stacktrace"]
}
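You can sanity-check that pattern outside of logstash. Here is a rough Python equivalent (the TIMESTAMP_ISO8601 stand-in is simplified, for illustration only):

```python
import re

# simplified stand-in for grok's TIMESTAMP_ISO8601 pattern (illustration only)
ts = r'\d{4}-\d{2}-\d{2} \d{2}:\d{2}:\d{2}\.\d+'
pattern = re.compile('(?P<date>' + ts + ').*data=(?P<request>.*)')

message = ('2020-06-30 20:02:49.312  INFO 9848 --- [http-nio-8080-exec-3] '
           'c.javainuse.controllers.TestController   : '
           'data={"user":"Ali","type":"jon"}\r')

m = pattern.match(message)
# GREEDYDATA captures everything after data=, including the trailing \r
print(repr(m.group('request')))  # '{"user":"Ali","type":"jon"}\r'
```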

You will need to remove the \r from request or else you will get a _jsonparsefailure. You will need either

mutate { gsub => [ "request", "\\r$", "" ] }

or

mutate { gsub => [ "request", "\r$", "" ] }

Probably the second one.
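The \r comes from Windows CRLF line endings: the file input splits on \n and leaves the carriage return at the end of each line, and the json filter then chokes on it. The same substitution, illustrated in Python (not the actual filter):

```python
import json
import re

# the file input on a CRLF (Windows) file leaves a trailing \r on each line
request = '{"user":"Stefen","type":"jon"}\r'

# equivalent of: mutate { gsub => [ "request", "\r$", "" ] }
clean = re.sub(r'\r$', '', request)
parsed = json.loads(clean)
print(parsed["user"])  # Stefen
```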

Hi,
first of all, thank you for the answer. I have changed the grok as per your suggestion.

But I am still getting the error in the logstash console:

{
      "@version" => "1",
          "tags" => [
        [0] "_grokparsefailure"
    ],
       "message" => "2020-06-30 20:02:49.312  INFO 9848 --- [http-nio-8080-exec-3] c.javainuse.controllers.TestController   : data={\"ihqc\":\"44\",\"masterJob\":\"chicago\",\"merge\":\"No\",\"jobs\":[\"966\",\"965\"],\"totalPages\":\"4\",\"type\":\"jon\",\"user\":\"Ali\"}\r",
          "host" => "*****",
          "type" => "java",
    "@timestamp" => 2020-06-30T18:02:50.073Z,
          "path" => "C:/elk/spring-boot-elk.log"
}

My logstash.conf now looks like this:

input {
  file {
    path => "C:/elk/spring-boot-elk.log"
  }
}

filter {
  grok {
    match => { "message" => "%{TIMESTAMP_ISO8601:date}%{GREEDYDATA}data=%{GREEDYDATA:request}" }
    add_tag => ["stacktrace"]
  }
  # strip the trailing \r before the json filter runs (and before request is removed)
  mutate { gsub => [ "request", "\r$", "" ] }
  json {
    source => "request"
    target => "parsedJson"
    remove_field => ["request"]
  }
  mutate {
    add_field => {
      "user" => "%{[parsedJson][user]}"
      "type" => "%{[parsedJson][type]}"
    }
  }
}

output {
  stdout {
    codec => rubydebug
  }
  elasticsearch {
    hosts => ["localhost:9200"]
  }
}

I cannot explain why that grok would fail.

Thanks...
I was editing your proposed solution into the wrong conf file... Now it's working.
regards,
Naeem

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.