Grok parse failure in Logstash

I am using Filebeat to push files to Kafka and then consuming them with Logstash. Below is the Filebeat output:

{
  "@timestamp": "2017-02-10T05:07:06.895Z",
  "beat": {
    "hostname": "mypc",
    "name": "mybeat",
    "version": "5.0.0"
  },
  "fields": {
    "logtype": "my_logfile"
  },
  "input_type": "log",
  "message": "192.168.24.208 [08/Feb/2017:07:10:57 +0000] \'http-www-8080-www-5\' 1841 677 241 42C8FGTR467FDSMKH7523DFV73B652C6F \'POST /module/submodule/action?trigger=e7s2\u0026locale=en_US\u0026llt=\u0026agentId=\u0026TKN_EXCHG=9190B0C7941A2784F4DE94F1572732EF HTTP/1.1\' \'gzip, deflate\' - gzip \'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/54.0.2840.99 Safari/537.36\'",
  "offset": 5685,
  "source": "/logfiles/mylog.2017-02-08.txt",
  "type": "logfile"
}

In Logstash, I am applying a grok filter like this:

grok {
  match => { "message" => "{GREEDYDATA:msg}" }
}

But I am getting "tags":["_grokparsefailure"] in Logstash. Why is this happening, and how can I fix it?

Thank you.

It should be %{GREEDYDATA:msg}, not {GREEDYDATA:msg} — grok pattern references need the leading %. That's of course a pretty useless grok filter, since it just copies the whole message into msg, but maybe you're just playing around.
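For reference, here is the corrected filter, plus a sketch that actually extracts something from the sample message above. The field names (client_ip, timestamp, msg) are my own choices, not anything your config requires; IP, HTTPDATE, and GREEDYDATA are patterns that ship with Logstash:

grok {
  # Minimal fix: just capture everything into "msg"
  match => { "message" => "%{GREEDYDATA:msg}" }
}

grok {
  # Sketch: pull the client IP and bracketed timestamp out of the
  # sample line, leaving the rest in "msg"
  match => { "message" => "%{IP:client_ip} \[%{HTTPDATE:timestamp}\] %{GREEDYDATA:msg}" }
}

You would use one or the other, not both. Test patterns incrementally (e.g. with the Grok Debugger) before extending them to cover the whole line.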

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.