How to parse HAProxy logs using Logstash and Filebeat

I have set up Logstash and Filebeat on my ECK stack so that I can view Kubernetes logs in Kibana/Elasticsearch. I followed this article to set it up: https://medium.com/@raphaeldelio/deploy-logstash-and-filebeat-on-kubernetes-with-eck-ssl-and-filebeat-d9f616737390

I have an HAProxy service on Kubernetes, and I want to parse the message field into separate fields so it can be visualized and analysed.

Log example:
(screenshot of a raw HAProxy log message)

I've come across Filebeat modules (though I'm not 100% sure this is what I need): https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-modules.html . Anyway, I've added the "haproxy" module, as you can see in my Filebeat config below, but it didn't change the output of the log.

filebeat-configmap.yml

apiVersion: v1
kind: ConfigMap
metadata:
  name: filebeat-config
  labels:
    app: filebeat
data:
  filebeat.yml: |-
    filebeat.modules:
      - module: haproxy
        log:
          enabled: true
          
    filebeat.autodiscover:
      providers:
        - type: kubernetes
          node: ${NODE_NAME}
          hints.enabled: true
          hints.default_config:
            type: container
            paths:
              - /var/log/containers/*${data.kubernetes.container.id}.log
    processors:
      - add_cloud_metadata:
      - add_host_metadata:
    output.logstash:
      hosts: '${LOGSTASH_URL}'

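For reference, a standalone haproxy module configuration (outside of autodiscover) would look something like the sketch below. As far as I understand, the module defaults to listening for syslog input, so reading from files might need `var.input` and `var.paths` set explicitly; the log path here is just a guess for my setup, not something from my cluster:

```yaml
filebeat.modules:
  - module: haproxy
    log:
      enabled: true
      # read from files instead of the default syslog input
      var.input: file
      # hypothetical path; adjust to wherever HAProxy actually writes logs
      var.paths:
        - /var/log/haproxy.log
```

I'm not sure how (or whether) this interacts with the autodiscover hints above, which is part of what I'm asking.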
logstash-configmap.yml

apiVersion: v1
kind: ConfigMap
metadata:
  name: logstash-configmap
data:
  logstash.yml: |
    http.host: "0.0.0.0"
    path.config: /usr/share/logstash/pipeline
  logstash.conf: |
    # all input will come from filebeat, no local logs
    input {
      beats {
        port => 5044
      }
    }
    filter {
      grok {
        match => { "message" => "%{TIMESTAMP_ISO8601:Date}  %{LOGLEVEL:Level} %{INT:ProcessID} --- \[%{DATA:ThreadName}\] %{JAVACLASS:Class} : %{GREEDYDATA:LogMessage}" }
      }
    }
    output {
      elasticsearch {
        index => "logstash-%{[@metadata][beat]}"
        hosts => [ "${ES_HOSTS}" ]
        user => "${ES_USER}"
        password => "${ES_PASSWORD}"
        cacert => '/etc/logstash/certificates/ca.crt'
      }
    }
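I've also noticed that Logstash ships built-in grok patterns for HAProxy (`HAPROXYHTTP` and `HAPROXYTCP`), so I'm guessing a filter along these lines might work instead of my current grok (untested sketch):

```conf
filter {
  grok {
    # HAPROXYHTTP / HAPROXYTCP are built-in Logstash grok patterns;
    # try the HTTP format first, then fall back to the TCP format
    match => { "message" => ["%{HAPROXYHTTP}", "%{HAPROXYTCP}"] }
  }
}
```

My current grok pattern above looks like it was written for Java/Spring Boot logs, so I suspect it never matches the HAProxy lines at all.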

Apparently this module is supposed to create an HAProxy dashboard in Kibana as well, but it hasn't.

I'm still struggling with this; can someone point me in the right direction?
