Logstash grok parse failure

Hi All,
I tried to import a modified combined Apache log into a Logstash instance, and here is a sample row:

10.10.10.10 54338 - [29/May/2017:16:21:34 +0200] "GET /test.html HTTP/1.1" 200 682 8763 "https://mysite.com/test" "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.96 Safari/537.36" "JSESSIONID=1010101; testid=asajdhasd "

I configured a custom grok pattern like this:

%{IPORHOST:clientip} (?:%{DATA:ident}|-) (?:%{DATA:auth}|-) \[%{HTTPDATE:timestamp}\] \"%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion}|-)\" %{NUMBER:response} (?:%{NUMBER:bytes}|-) (?:%{NUMBER:timespent}|-) \"%{NOTSPACE:referrer}\" \"%{DATA:agent}\" \"%{DATA:cookies}\"
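
For context, the pattern sits in my pipeline roughly like this (a minimal sketch; the input path below is a placeholder, not my real one):

input {
  file {
    # placeholder path for the modified combined log
    path => "/var/log/apache/access_modified.log"
    start_position => "beginning"
  }
}

filter {
  grok {
    match => {
      "message" => "%{IPORHOST:clientip} (?:%{DATA:ident}|-) (?:%{DATA:auth}|-) \[%{HTTPDATE:timestamp}\] \"%{WORD:verb} %{NOTSPACE:request}(?: HTTP/%{NUMBER:httpversion}|-)\" %{NUMBER:response} (?:%{NUMBER:bytes}|-) (?:%{NUMBER:timespent}|-) \"%{NOTSPACE:referrer}\" \"%{DATA:agent}\" \"%{DATA:cookies}\""
    }
  }
}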

On the Elasticsearch side all the fields look fine, but every row is tagged with "_grokparsefailure".

Has anyone seen this error before?

Thanks,
Marcello

Yes, it means grok got an unexpected piece of data that it didn't know what to do with.
Given the data you posted and the grok pattern, everything seems fine.
Could it be that not all of the data follows this pattern?
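
One quick way to check is to dump the events that fail to parse so you can see the raw lines, e.g. with a temporary output section like this (just a debugging sketch):

output {
  if "_grokparsefailure" in [tags] {
    # print unparsed events, including the original message field
    stdout { codec => rubydebug }
  }
}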

I resolved it by running dos2unix on the input file; the Windows-style line endings (trailing carriage returns) were apparently what grok was tripping over. Now there are no more entries tagged "_grokparsefailure". We would also like to move any entries tagged "_grokparsefailure" to a dedicated file rather than indexing them into Elasticsearch.
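
Something along these lines in the output section should do it (the file path, hosts, and index name below are placeholders):

output {
  if "_grokparsefailure" in [tags] {
    # keep unparsed lines out of the index and write them to a dedicated file
    file {
      path => "/var/log/logstash/grok_failures.log"
    }
  } else {
    elasticsearch {
      hosts => ["localhost:9200"]
      index => "apache-logs"
    }
  }
}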

Marcello
