Hi,
I'm trying to ship Elasticsearch logs using Filebeat's autodiscover feature with the container input, but I don't know how to split the different types of ES logs properly.
For nginx I can split the logs easily just by using the stream field, but for Elasticsearch stream always equals stdout for all four log types; the actual field to split on is log.type.
One example of a Docker log line:
{ "log": { "type": "server", "timestamp": "20xx-04-11T18:20:54,430Z", "level": "DEBUG", "component": "o.e.a.a.c.n.t.c.TransportCancelTasksAction", "cluster.name": "some-cluster", "node.name": "some-xxxxx", "message": "Removing ban for the parent [xxxxxx:xxxx] on the node [xxxxx-xxxx]", "cluster.uuid": "xxx-xxxx", "node.id": "xxxx-xxxx" }, "stream":"stdout", "time":"20xx-04-11T18:11:54.43434Z" }
Here is my (admittedly clumsy) config file:
```yaml
filebeat.autodiscover:
  providers:
    - type: docker
      templates:
        # Elasticsearch services
        - condition.contains:
            docker.container.image: elasticsearch
          config:
            - module: elasticsearch
              server:
                input:
                  type: container
                  paths:
                    - /var/lib/docker/containers/${data.docker.container.id}/*.log
              gc:
                input:
                  type: container
                  paths:
                    - /var/lib/docker/containers/${data.docker.container.id}/*.log
              audit:
                input:
                  type: container
                  paths:
                    - /var/lib/docker/containers/${data.docker.container.id}/*.log
              slowlog:
                input:
                  type: container
                  paths:
                    - /var/lib/docker/containers/${data.docker.container.id}/*.log
              deprecation:
                input:
                  type: container
                  paths:
                    - /var/lib/docker/containers/${data.docker.container.id}/*.log
```
With this config Filebeat can ship all of the log types through the proper pipelines, but it leads to lots of weird duplicates: every fileset reads the same container log file, so each line is ingested once per fileset, and the gc pipeline knows nothing about slowlog and vice versa.
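One workaround I've been sketching to avoid the duplicates (untested, and assuming the container lines are JSON with a log.type field, as in the example above) is to make each fileset's input drop everything except its own type. decode_json_fields and drop_event are standard Filebeat processors; the exact layout here is my guess:

```yaml
# Sketch: keep only server-type lines in the server fileset.
# The same drop_event pattern would be repeated for gc, audit,
# slowlog and deprecation, each matching its own log.type value.
server:
  input:
    type: container
    paths:
      - /var/lib/docker/containers/${data.docker.container.id}/*.log
    processors:
      # ES logs its lines as JSON, so decode them to expose log.type
      - decode_json_fields:
          fields: ["message"]
          target: ""
          overwrite_keys: true
      - drop_event:
          when:
            not:
              equals:
                log.type: "server"
```

But this still means reading the same file five times, which doesn't feel right either.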
How can I split the different log types the way the stream field lets me do for nginx? (I could add Logstash as middleware and route to different pipelines manually depending on log.type, but that looks so weird.)
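For reference, the Logstash-less alternative I was considering is Filebeat's conditional pipeline routing in the Elasticsearch output, with a single container input. The pipeline IDs below are placeholders, not the real module pipeline names, which I'd still have to look up:

```yaml
# Sketch only: route events to different ingest pipelines by log.type.
# Pipeline names are hypothetical; log.type assumes the JSON has been
# decoded (e.g. with a decode_json_fields processor on the input).
output.elasticsearch:
  hosts: ["localhost:9200"]
  pipelines:
    - pipeline: "my-es-server-pipeline"   # placeholder name
      when.equals:
        log.type: "server"
    - pipeline: "my-es-gc-pipeline"       # placeholder name
      when.equals:
        log.type: "gc"
```

Is something like this the intended approach, or does the elasticsearch module have a built-in way to do this?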
Best regards.