Hi,
I am new to Filebeat, so apologies if this is a basic question. I have been trying to load HAProxy logs into Elasticsearch for quite some time without much luck.
HAProxy version: 1.8.27
Filebeat version: 7.17.8
Elasticsearch version: 7.17.8
My HAProxy config is as follows.
global
    log 127.0.0.1 local0
    daemon
    maxconn 2048

defaults
    log global
    timeout connect 500000ms
    timeout client 86400s
    timeout server 86400s

frontend front-https-servers
    mode tcp
    option tcplog
    bind *:443
    capture request header Host len 64
    default_backend back-https-servers

listen stats
    bind :1936
    mode http
    stats enable
    stats realm Haproxy\ Statistics
    stats uri /

backend back-https-servers
    mode tcp
    capture response header Content-Length len 64
    balance leastconn
    stick-table type ip size 1m expire 8h
    stick on src
    option ssl-hello-chk
    server server1 X.X.X.X:443 check
    server server2 X.X.X.X:443 backup

frontend front-ssh-servers
    mode tcp
    option tcplog
    bind *:22
    default_backend back-ssh-servers
    timeout client 8h

backend back-ssh-servers
    mode tcp
    balance leastconn
    stick-table type ip size 1m expire 8h
    stick on src
    server server1 X.X.X.X:22 check
    server server2 X.X.X.X:22 check
    server server3 X.X.X.X:22 backup
The Filebeat haproxy module configuration is as follows.
- module: haproxy
  log:
    enabled: true
    var.input: "file"
    var.paths: ["/var/log/haproxy.log"]
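In case the setup steps matter: I enabled the module and loaded its ingest pipeline with the standard commands from the Filebeat docs, nothing custom beyond the config above.

# enable the module and load its ingest pipeline into Elasticsearch
filebeat modules enable haproxy
filebeat setup --pipelines --modules haproxy
# sanity-check the configuration
filebeat test config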
A sample log entry looks like this (TCP log format, since both frontends set option tcplog).
2023-01-24T20:29:16-08:00 127.0.0.1 haproxy[2284158]: X.X.X.X:43206 [24/Jan/2023:20:29:14.061] front-ssh-servers back-ssh-servers/server1 1/0/2565 5617 -- 2/1/0/0/0 0/0
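To take Filebeat out of the picture while debugging, I replay that exact line through the module's ingest pipeline with the simulate API in Kibana Dev Tools; the simulate response shows which processor rejects the line. The pipeline name below is my guess based on the usual filebeat-&lt;version&gt;-&lt;module&gt;-&lt;fileset&gt;-pipeline naming, so it may need adjusting.

# pipeline name assumes the default naming convention;
# run GET _ingest/pipeline to find the exact name if it differs
POST _ingest/pipeline/filebeat-7.17.8-haproxy-log-pipeline/_simulate
{
  "docs": [
    {
      "_source": {
        "message": "2023-01-24T20:29:16-08:00 127.0.0.1 haproxy[2284158]: X.X.X.X:43206 [24/Jan/2023:20:29:14.061] front-ssh-servers back-ssh-servers/server1 1/0/2565 5617 -- 2/1/0/0/0 0/0"
      }
    }
  ]
}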
For every log entry, the Kibana dashboard shows the error "error.message: Provided Grok expressions do not match field value". I have been troubleshooting with different log formats and request/response header captures, without any luck.
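In case it helps, this is how I pull one of the failing documents to see the error alongside the raw message field.

# index pattern assumes the default Filebeat index naming; adjust if yours differs
GET filebeat-7.17.8-*/_search
{
  "size": 1,
  "_source": ["message", "error.message"],
  "query": {
    "exists": { "field": "error.message" }
  }
}

Appreciate any help in this regard. TIA.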