I have successfully sent HAProxy logs to Elasticsearch using Filebeat.
The next step I am trying is to resolve destination.ip and source.ip to names,
but I don't think I am using the dns processor correctly: I am not getting the source.hostname field.
I do get the source.ip field, and the DNS servers are correct; I can resolve the IP to a name from a prompt on the same machine.
What am I doing wrong?
Here is my haproxy.yml file:
- module: haproxy
# All logs
# Set which input to use between syslog (default) or file.
# Set custom paths for the log files. If left empty,
# Filebeat will choose the paths depending on your OS.
nameservers: ['10.59.240.246', '10.167.17.40']
What version of Filebeat? I'm pretty sure the source.ip field doesn't exist yet at that point, since most of the processing happens in the Elasticsearch ingest pipeline, not in Filebeat.
Yes, source.ip exists; I can see it in Discover.
Yes, but it's being created in Elasticsearch, after the event leaves Filebeat, so the Filebeat processor won't work.
No, it is not being created in Elasticsearch.
This is part of the metrics that the HAProxy log contains,
and I want to convert it, or add source.name, since source.ip comes from the log file.
You misunderstand: the log line from HAProxy isn't parsed into individual fields until it reaches Elasticsearch. Here is the grok processor in the ingest pipeline that does the parsing: beats/pipeline.yml at v7.17.1 · elastic/beats · GitHub.
source.ip isn't created until that pipeline runs, so when you add the processor to Filebeat it fails because the
source.ip field doesn't exist yet.
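For reference, this is roughly how a dns processor is configured in filebeat.yml (a sketch, using the field names and nameserver addresses from this thread) — and it fails here precisely because source.ip is not yet a field when the processor runs inside Filebeat:

```yaml
# filebeat.yml (sketch) -- this will NOT work for module-parsed fields,
# because source.ip is only created later by the Elasticsearch ingest
# pipeline, not inside Filebeat.
processors:
  - dns:
      type: reverse                       # reverse-resolve an IP to a name
      fields:
        source.ip: source.hostname        # source field -> target field
      nameservers: ['10.59.240.246', '10.167.17.40']
```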
Ohhh, now I see what you mean.
How do I convert that YAML file into an ingest pipeline?
Is there a simple command that converts the whole thing?
What do you mean? When you enable the module and run the setup command, the pipeline gets loaded into Elasticsearch for you.
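Concretely, that usually looks like this (a sketch, assuming Filebeat 7.x on the host and that filebeat.yml already points at your Elasticsearch cluster):

```shell
# Enable the haproxy module (activates modules.d/haproxy.yml)
filebeat modules enable haproxy

# Load the module's ingest pipelines into Elasticsearch
filebeat setup --pipelines --modules haproxy
```

After that, the module's pipeline (including the grok parsing that creates source.ip) lives in Elasticsearch, so there is nothing to convert by hand.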