Getting _grokparsefailure in identical docs

Hey guys,

I have a problem: I use grok to parse documents for specific data, but I am getting a _grokparsefailure on some of them. When I compare those documents with ones that parse the way I want, I can't see any difference.

So my first question is: is there a way to see why I am getting the parsing error? And my second question: is it possible to tell grok to parse only a specific field?

I hope you can help me!
Best regards,

If you want help with a grok filter then show us the filter and the event that does not match. Either the output from `output { stdout { codec => rubydebug } }` or copy an event from the JSON tab in Kibana.
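For reference, a minimal pipeline sketch for surfacing failed events (the input/filter contents here are illustrative; `_grokparsefailure` is grok's default failure tag):

	filter {
	  grok {
	    # example pattern only -- replace with your own match
	    match => { "message" => "%{TIMESTAMP_ISO8601:ts} %{GREEDYDATA:rest}" }
	    # events that do not match the pattern get this tag (the default)
	    tag_on_failure => ["_grokparsefailure"]
	  }
	}
	output {
	  # print every event, including its tags, in a readable form
	  stdout { codec => rubydebug }
	}

Events carrying the failure tag in the rubydebug output are the ones to compare against your pattern.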

Hey @Badger,

Thanks for your reply!

This is my filter

	 match => {"ActiveUser" => "UserName=(?<user>[a-zA-Z0-9_.\\/-]+\\{1,2}[a-zA-Z0-9_.\\/-]+)"}
	 id => "grok_activeUser"

And here is an example message (JSON):

  {
    "_index": "log-2018.07.12",
    "_type": "doc",
    "_id": "dfwZjWQBULn4xN3N8A74",
    "_version": 1,
    "_score": null,
    "_source": {
      "@version": "1",
      "type": "log",
      "database": "{DatabaseServerName=myServer, UserName=myLovelyDomain\\MyBestUser}Something",
      "level": "DEBUG",
      "application_version": "123456",
      "database_server_name": "{DatabaseServerName=myServer, UserName=myLovelyDomain\\MyBestUser}SomethingOther",
      "@timestamp": "2018-07-12T06:07:24.810Z",
      "is_online": "{DatabaseServerName=myServer, UserName=myLovelyDomain\\MyBestUser}AnotherSomethingOther",
      "machine_name": "Mycomputer",
      "host": "",
      "logmessage": "MyLogMessage",
      "application_name": "MyApplication",
      "tags": [ ... ],
      "timestamp": "2018-07-12T06:07:14"
    },
    "fields": {
      "@timestamp": [ ... ],
      "timestamp": [ ... ]
    },
    "sort": [ ... ]
  }

That event doesn't have a field named ActiveUser, so a grok filter that attempts to parse that field will naturally fail.
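In this event the `UserName=` text lives in fields such as `database`, so pointing the same pattern at one of those fields should work. A sketch (the choice of `database` as the source field is an assumption based on the sample event):

	filter {
	  grok {
	    # parse the "database" field instead of the non-existent "ActiveUser"
	    match => { "database" => "UserName=(?<user>[a-zA-Z0-9_.\\/-]+\\{1,2}[a-zA-Z0-9_.\\/-]+)" }
	    id => "grok_activeUser"
	  }
	}

This also covers the second question: grok only parses whichever field you name on the left side of `match`.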

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.