Issue with ingest of HAProxy logs into Kibana

I signed up today for the free trial of Elasticsearch; I am mostly interested in using hosted Kibana with Elasticsearch. I am attempting to import some of my company's production data from an HAProxy deployment. Below is the JSON output of the error.
I have successfully pushed the data using Filebeat per the instructions.
It looks like the data is reaching Elasticsearch and Kibana, but it is failing to be parsed. I assume there is something unexpected about the format of my HAProxy logs, but I am not an expert with Grok, and I am struggling to get the data imported successfully.
Here is the error that I believe is the key: "Provided Grok expressions do not match field value"

I have attempted different Grok patterns, but I can't get any of them to work using the Grok Debugger.
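For anyone following along, a pattern along these lines should match the raw message shown below (the field names are placeholders of my own, not the ones the Filebeat module uses, and I have not verified this against the module's pipeline, so treat it as a debugging aid rather than a fix):

  %{SYSLOGTIMESTAMP:timestamp} %{SYSLOGHOST:host} haproxy\[%{POSINT:pid}\]: %{IP:client_ip}:%{POSINT:client_port} \[%{DATA:accept_date}\] %{NOTSPACE:frontend} %{NOTSPACE:backend}/%{NOTSPACE:server} %{INT:tq}/%{INT:tw}/%{INT:tc}/%{INT:tr}/%{NOTSPACE:tt} %{INT:status_code} %{NOTSPACE:bytes_read} %{DATA:req_cookie} %{DATA:res_cookie} %{NOTSPACE:termination_state} %{INT:actconn}/%{INT:feconn}/%{INT:beconn}/%{INT:srvconn}/%{NOTSPACE:retries} %{INT:srv_queue}/%{INT:backend_queue} "%{DATA:http_request}"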
Any thoughts on how to further isolate this issue are greatly appreciated, as I have to get this up and running for a demo in the middle of next week.

Thanks
Tony Alvino
{ "_index": "filebeat-6.5.4-2019.01.17", "_type": "doc", "_id": "g5fBXWgBxRkBRBUJmqI1", "_version": 1, "_score": null, "_source": { "offset": 7102335, "prospector": { "type": "log" }, "source": "/home/deepcompute/haproxy/haproxy.04-01-2019.log", "message": "Jan 4 10:51:03 hz-prestaging-haproxy haproxy[4050]: 54.228.16.3:55764 [04/Jan/2019:10:51:03.023] staging staging/<NOSRV> 0/-1/-1/-1/0 302 142 - - LR-- 1/1/0/0/0 0/0 "GET /login HTTP/1.1"", "fileset": { "module": "haproxy", "name": "log" }, "error": { "message": "Provided Grok expressions do not match field value: [Jan 4 10:51:03 hz-prestaging-haproxy haproxy[4050]: 54.228.16.3:55764 [04/Jan/2019:10:51:03.023] staging staging/<NOSRV> 0/-1/-1/-1/0 302 142 - - LR-- 1/1/0/0/0 0/0 \"GET /login HTTP/1.1\"]" }, "input": { "type": "log" }, "@timestamp": "2019-01-17T21:39:50.114Z", "beat": { "hostname": "tony-servers.nferx.com", "name": "tony-servers.nferx.com", "version": "6.5.4" }, "host": { "os": { "codename": "xenial", "family": "debian", "version": "16.04.5 LTS (Xenial Xerus)", "platform": "ubuntu" }, "containerized": false, "name": "tony-servers.nferx.com", "id": "51b7f45441b340d2ba60cbf34e685e5c", "architecture": "x86_64" } }, "fields": { "@timestamp": [ "2019-01-17T21:39:50.114Z" ] }, "sort": [ 1547761190114 ] }

I am guessing at this point that this is my problem:

My current HAProxy configuration is set to:
  defaults
    log global
    mode http
    option httplog
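My read (an assumption on my part, not something I have found stated in the docs) is that the module's Grok expects the captured-header blocks that HAProxy only emits when capture directives are configured, so the two line shapes differ roughly like this (lines abbreviated, capture values invented for illustration):

  # What my logs contain today, with option httplog only:
  ... 1/1/0/0/0 0/0 "GET /login HTTP/1.1"

  # What the pipeline appears to expect, with header captures enabled:
  ... 1/1/0/0/0 0/0 {example.com} {1234} "GET /login HTTP/1.1"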

I don't know if I can go and update all of my HAProxy nodes to include the header captures in the log per this config (a fuller sketch of where these lines would go is below):

  capture request header Host len 15
  capture response header Content-length len 9
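
If I do go that route, my understanding from the HAProxy docs is that capture directives are only valid in a frontend or listen section, not in defaults, so on each node it would look something like this ("staging" is the frontend name from my log line; the bind and default_backend lines are stand-ins for whatever each node actually uses):

  frontend staging
    bind *:80
    capture request header Host len 15
    capture response header Content-length len 9
    default_backend staging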

It would also mean none of my past data could be parsed, so I will investigate what options I might have.

Using sed, I have forced the headers (just empty values) into the log entries, and now I am able to successfully load the data into Kibana.
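For anyone who hits the same wall, the one-liner was along these lines (the filename is mine; this assumes the HTTP request is the first double-quoted segment on each line, and that the pipeline wants one empty {} block for the request headers and one for the response headers):

  # Inject empty capture blocks just before the quoted HTTP request;
  # sed replaces only the first match per line by default.
  sed 's/ "/ {} {} "/' haproxy.04-01-2019.log > haproxy.04-01-2019.fixed.log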

This can be closed.
