maar
May 21, 2021, 10:57am
1
Grok parses successfully when HAProxy writes a line to /var/log/haproxy.log similar to:
May 21 08:25:56 ha haproxy[5089]: 12.3.45.67:89012 [21/May/2021:08:25:56.055] www-https~ wss/wssnode website.domain.com 1/1/1/1/111 111 111 - - ---- 11111/11111/11111/111/0 0/0 "GET /ws/site/V3L235F/d88r3567pssllp/ HTTP/1.1"
But when there's a `-:port` instead of `ip_address:port`, for example:
May 21 08:25:56 ha haproxy[5089]: -:89012 [21/May/2021:08:25:56.055] www-https~ wss/wssnode website.domain.com 1/1/1/1/111 111 111 - - ---- 11111/11111/11111/111/0 0/0 "GET /ws/site/V3L235F/d88r3567pssllp/ HTTP/1.1"
I have an error:
Provided Grok expressions do not match field value: May 21 08:25:56 ha haproxy[5089]: -:89012 [21/May/2021:08:25:56.055] www-https~ wss/wssnode website.domain.com 1/1/1/1/111 111 111 - - ---- 11111/11111/11111/111/0 0/0 "GET /ws/site/V3L235F/d88r3567pssllp/ HTTP/1.1
Here's my /usr/share/filebeat/module/haproxy/log/pipeline.json.
I was trying to resolve this by adding a new pattern to the grok processor on the `message` field, and by editing the grok pattern for the `source.address` field, without success.
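For reference, the usual way to tolerate the missing address is to turn the client-address match into an alternation that also accepts a literal `-`. A minimal sketch of such a grok processor, with the rest of the pattern abbreviated (`source.port` and `haproxy.rest` are placeholder field names here, not the module's actual layout):
```
{
  "grok": {
    "field": "message",
    "patterns": [
      "%{SYSLOGTIMESTAMP} %{HOSTNAME} %{WORD}\\[%{NUMBER:process.pid}\\]: (?:%{IP:source.address}|-):%{POSINT:source.port} %{GREEDYDATA:haproxy.rest}"
    ]
  }
}
```
The same `(?:%{IP:source.address}|-)` alternation can be dropped into the full pattern wherever the client address is matched.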
Is this a recurring issue for you? Is there any special config that causes the `-` instead of an IP address? This should be an easy update to the ingest pipeline. Can you open an issue on GitHub for tracking?
maar
May 24, 2021, 7:47am
3
Yes.
No, it's the default config. HAProxy just doesn't log an IP address on every line.
Actually, I already have the correct Grok expression that works in the debugger, but it still doesn't work in Kibana. Maybe something else is causing the error.
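One thing that might be worth ruling out first (an assumption, not a confirmed cause): Filebeat only loads a module's ingest pipeline into Elasticsearch during setup, so edits to the local pipeline.json aren't necessarily what's actually running. The registered pipelines can be listed from Kibana Dev Tools:
```
GET _ingest/pipeline
```
If the haproxy log pipeline in the result (its id includes the Filebeat version) still shows the old pattern, re-loading it with `filebeat setup --pipelines --modules haproxy` (or setting `filebeat.overwrite_pipelines: true` in filebeat.yml) picks up the edited pipeline.json.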
I'll create an issue on GitHub.
Thanks for your help.
Also, what version of Filebeat and Elasticsearch are you using?
maar
May 24, 2021, 1:24pm
5
Filebeat 7.6.1
Elasticsearch & Kibana 7.7.1
opened 01:22PM - 24 May 21 UTC
I created a Grok pattern which works in the Kibana Grok Debugger dev tool:
Sample data:
```
[May 22 02:22:22 server1 haproxy[5089]: -:22222 [22/May/2021:02:22:22.222] www-https~ myapp/node2 site.domain.com 0/0/0/18/18 200 200 - - ---- 222/222/2/0/0 0/0 \"OPTIONS /api/v2/app/ HTTP/1.1\"]
```
Grok pattern:
```
%{HAPROXY_LOG_DATE:haproxy.logdate} %{NOTSPACE:haproxy.host} %{NOTSPACE:process.name[pid]}[%{NUMBER:process.pid:long}] (%{IP:source.address}|-):%{POSINT:source} %{HAPROXY_DATE:haproxy.request_date} %{NOTSPACE:haproxy.frontend_name} %{NOTSPACE:haproxy.backend_name}/%{NOTSPACE:haproxy.server_name} %{NOTSPACE:haproxy.http.captured.request.headers} %{NUMBER:haproxy.http.request.time_wait_ms:long}/%{NUMBER:haproxy.total_waiting_time_ms:long}/%{NUMBER:haproxy.connection_wait_time_ms:long}/%{NUMBER:haproxy.http.request.time_wait_without_data_ms:long}/%{NUMBER:temp.duration:long} %{NUMBER:http.response.status_code:long} %{NUMBER:haproxy.bytes_read:long} %{NOTSPACE:haproxy.http.request.captured_cookie} %{NOTSPACE:haproxy.http.response.captured_cookie} %{NOTSPACE:haproxy.termination_state} %{NUMBER:haproxy.connections.active:long}/%{NUMBER:haproxy.connections.frontend:long}/%{NUMBER:haproxy.connections.backend:long}/%{NUMBER:haproxy.connections.server:long}/%{NUMBER:haproxy.connections.retries:long} %{NUMBER:haproxy.server_queue:long}/%{NUMBER:haproxy.backend_queue:long} \\\"%{NOTSPACE:haproxy.http.request.method} %{NOTSPACE:haproxy.http.request.captured_headers} %{NOTSPACE:haproxy.http.response.captured_headers}\\\"
```
Custom patterns:
```
HAPROXY_LOG_DATE %{MONTH} %{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND}
HAPROXY_DATE \[%{MONTHDAY}[/-]%{MONTH}[/-]%{YEAR}:%{HOUR}:%{MINUTE}:%{SECOND}\]
```
Structured data:
```
{
  "process": {
    "name[pid]": "haproxy[5089]"
  },
  "temp": {
    "duration": 18
  },
  "haproxy": {
    "server_name": "node2",
    "total_waiting_time_ms": 0,
    "termination_state": "----",
    "connection_wait_time_ms": 0,
    "bytes_read": 200,
    "backend_queue": 0,
    "backend_name": "myapp",
    "logdate": "May 22 02:22:22",
    "host": "server1",
    "request_date": "[22/May/2021:02:22:22.222]",
    "http": {
      "request": {
        "captured_cookie": "-",
        "time_wait_without_data_ms": 18,
        "captured_headers": "/api/v2/app/",
        "method": "OPTIONS",
        "time_wait_ms": 0
      },
      "response": {
        "captured_cookie": "-",
        "captured_headers": "HTTP/1.1"
      },
      "captured": {
        "request": {
          "headers": "site.domain.com"
        }
      }
    },
    "frontend_name": "www-https~",
    "server_queue": 0,
    "connections": {
      "server": 0,
      "retries": 0,
      "active": 222,
      "backend": 2,
      "frontend": 222
    }
  },
  "http": {
    "response": {
      "status_code": 200
    }
  },
  "source": "22222"
}
```
At first I thought that something else was causing the problem: [discuss.elastic.co](https://discuss.elastic.co/t/cant-parse-haproxy-logs-without-ip-address-in-grok/273654)
But the pattern is working in the debugger, and I don't know why it doesn't work with Filebeat and Elasticsearch.
I have an error in Elastic:
```
Provided Grok expressions do not match field value:
[May 22 02:22:22 server1 haproxy[5089]: -:22222 [22/May/2021:02:22:22.222] www-https~ myapp/node2 site.domain.com 0/0/0/18/18 200 200 - - ---- 222/222/2/0/0 0/0 \"OPTIONS /api/v2/app/ HTTP/1.1\"]
```
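To narrow down whether the mismatch comes from the pattern itself or from something else in the pipeline, the same sample line can be run through the ingest simulate API. A minimal sketch (the pattern is trimmed to the prefix in question, with `haproxy.rest` as a placeholder; the full pattern and both custom definitions above go in the same places):
```
POST _ingest/pipeline/_simulate
{
  "pipeline": {
    "processors": [
      {
        "grok": {
          "field": "message",
          "patterns": [
            "%{HAPROXY_LOG_DATE:haproxy.logdate} %{NOTSPACE:haproxy.host} %{WORD:process.name}\\[%{NUMBER:process.pid}\\]: (?:%{IP:source.address}|-):%{POSINT:source.port} %{GREEDYDATA:haproxy.rest}"
          ],
          "pattern_definitions": {
            "HAPROXY_LOG_DATE": "%{MONTH} %{MONTHDAY} %{HOUR}:%{MINUTE}:%{SECOND}"
          }
        }
      }
    ]
  },
  "docs": [
    {
      "_source": {
        "message": "May 22 02:22:22 server1 haproxy[5089]: -:22222 [22/May/2021:02:22:22.222] www-https~ myapp/node2 site.domain.com 0/0/0/18/18 200 200 - - ---- 222/222/2/0/0 0/0 \"OPTIONS /api/v2/app/ HTTP/1.1\""
      }
    }
  ]
}
```
If this parses but the real pipeline still fails, the difference is likely elsewhere (escaping in pipeline.json, processor order, or which pipeline Filebeat actually installed) rather than in the pattern itself.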
That's my current config file: [gist.github.com](https://gist.github.com/maarsaks/c7eb65a4d5e64681d6dd33c5898438b5)
- Version: Filebeat 7.6.1; Elasticsearch & Kibana 7.7.1
- Operating System: Debian Buster
- Discuss Forum URL: [discuss.elastic.co](https://discuss.elastic.co/t/cant-parse-haproxy-logs-without-ip-address-in-grok/273654)
maar
May 25, 2021, 9:48am
7
Thanks. I've replied under the issue discussion.
maar
June 9, 2021, 2:35pm
8
I've created a repo showing how to reproduce the error in ~5 minutes - make sure you've completed all the steps from README.md.
github.com/maarsaks/elk-docker-compose
system
(system)
Closed
July 7, 2021, 4:36pm
9
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.