Getting logs from Elasticsearch running in a container

We run Elasticsearch in Docker and have multiple Elasticsearch containers. We have a separate monitoring cluster set up, and we were able to use a Metricbeat container to collect monitoring data and send it to the monitoring cluster.

Now we are trying to configure a Filebeat container to collect, ingest, and ship the Elasticsearch logs to the monitoring cluster, but we can't figure out the best way to do this.

Since our Elasticsearch runs inside containers, the log4j2 config points everything at the console, and Docker writes it all to /var/lib/docker/containers//.log. If you use the `container` or `docker` Filebeat input to collect the container logs, they land in a filebeat index and you can view them in the Logs tab in Kibana, but they do not show up in the Monitoring tab. The collected logs are also not ingested through the pipelines the way the elasticsearch module in Filebeat would ingest them.
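For reference, the input config we're using looks roughly like this (a sketch; the exact paths in our setup may differ):

```yaml
filebeat.inputs:
  - type: container
    # Default Docker json-file log location; adjust for your host
    paths:
      - /var/lib/docker/containers/*/*.log
```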

Any suggestions?

Hi @jlim0930

Thanks for raising this question. We'll want to have a look at your config files, but my guess is that you'll need to specifically tell the Docker modules to use the ingest pipelines which are typically used by the Elasticsearch module. You can see those pipelines here.
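As a starting point, one way to do that is to point your input at the module's pipeline explicitly via the `pipeline` option. This is only a sketch under assumptions: the pipeline name below is a placeholder, so check the actual ingest pipeline names installed in your cluster (e.g. via `GET _ingest/pipeline`) and substitute accordingly:

```yaml
filebeat.inputs:
  - type: container
    paths:
      - /var/lib/docker/containers/*/*.log
    # Placeholder pipeline name -- replace with the Elasticsearch module
    # pipeline actually installed in your monitoring cluster
    pipeline: filebeat-<version>-elasticsearch-server-pipeline
```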

If you want to post your config files, we can try to help you put something together, but what I've posted above may give you a path to get started.

It is possible to use the Filebeat elasticsearch module to collect logs from Elasticsearch Docker containers. Please see https://www.elastic.co/guide/en/beats/filebeat/current/configuration-autodiscover.html#_docker_2. Specifically, look for "If you are using modules" on that page.
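Putting that together, a minimal autodiscover config along the lines of that documentation page might look like this (a sketch, assuming the default Docker json-file log path and that your Elasticsearch image name contains "elasticsearch"):

```yaml
filebeat.autodiscover:
  providers:
    - type: docker
      templates:
        # Match only Elasticsearch containers (condition is an assumption;
        # adjust to however your images/labels are named)
        - condition:
            contains:
              docker.container.image: elasticsearch
          config:
            - module: elasticsearch
              server:
                input:
                  type: container
                  paths:
                    - /var/lib/docker/containers/${data.docker.container.id}/*.log
```

With the module handling the input, the events should be parsed by the module's ingest pipelines rather than landing as raw container logs.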

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.