Grok filter timeouts and 'value too large' message

Hi, I'm having problems parsing custom Apache logs. I've built custom grok patterns using the Grok Debugger website. There they match instantly, but when I put them into Logstash they constantly time out or throw the 'value too large' error message.

The logs do not have a fixed layout in terms of spacing, so I am using \s*? to match that. Would that cause an issue in Logstash that would not show up on the debugger site?
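For reference, the behaviour of a lazy \s*? can be reproduced outside Logstash. A minimal Python sketch (using Python's re module rather than the Oniguruma engine grok actually uses, and a made-up request line, so this is an illustration of lazy vs. greedy whitespace matching, not the poster's exact data):

```python
import re

# Hypothetical log fragment with variable-width whitespace.
line = "GET   /index.html    HTTP/1.1"

# Lazy \s*? first tries to match zero characters, then expands one
# character at a time on every failure - extra backtracking work.
lazy = re.search(r"GET\s*?(/\S+)\s*?HTTP", line)

# Greedy \s+ consumes the whole whitespace run in one step and only
# gives characters back if the rest of the pattern cannot match.
greedy = re.search(r"GET\s+(/\S+)\s+HTTP", line)

# Both extract the same field; the greedy form does far less retrying,
# which matters when a long pattern contains many such gaps.
print(lazy.group(1), greedy.group(1))
```

On a line that matches, both forms succeed; the cost difference shows up on lines that *almost* match, where each lazy quantifier multiplies the positions the engine must retry.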

Sample pattern:

%{APRODLOGSTART} %{INT}\s*?(-|\+)\s*?P%{INT}\s*?%{INT}b\s*?%{INT}b(%{INT:[system][apache][ratio]}%)\s*?%{INT}us\s*?%{HTTPACTION:[system][apache][action]}\s*?%{URIPATH:[system][apache][url]} %{HTTPVER:[system][apache][http_ver]}\s*?%{UNIXPATH:[system][apache][path]}\s*?"%{NOTSPACE:[system][apache][useragent]}"\s*?"%{NOTSPACE:[system][apache][origin]}"\s*?U:%{NOTSPACE:[system][apache][unique_id]} SecIn:%{INT:[system][apache][secin]} SecOut:%{INT:[system][apache][secout]}\s*?%{TLS:[system][apache][tls_ver]}? %{CIPHER:[system][apache][ciphers]}?
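A hedged sketch of how the filter block might be restructured to reduce timeouts. The field names and the custom patterns (APRODLOGSTART, HTTPACTION, TLS, CIPHER) are from the post above; the \A anchor, the [-+] character class, the greedy \s* quantifiers, and the explicit timeout_millis are assumptions about what helps here, not a verified fix:

```
filter {
  grok {
    # \A anchors the pattern to the start of the event, so a non-matching
    # line fails once instead of being retried at every character offset.
    # Greedy \s* replaces lazy \s*? to cut down on backtracking.
    match => {
      "message" => "\A%{APRODLOGSTART} %{INT}\s*[-+]\s*P%{INT}\s*%{INT}b\s*%{INT}b(%{INT:[system][apache][ratio]}%)\s*..."
    }
    # Fail fast instead of holding a worker thread (default is 30000 ms).
    timeout_millis => 10000
  }
}
```

The rest of the pattern (elided as ... above) would carry the same substitution of \s* for \s*? throughout.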

I have more complex patterns running on firewall logs without a problem; it's only the Apache logs that are causing trouble.

Any help would be appreciated!

I've fixed this.

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.