We are running Elasticsearch in Docker containers and have multiple Elasticsearch containers. We have a separate monitoring cluster set up, and we were able to use a Metricbeat container to collect monitoring data and send it to the monitoring cluster.
Now we are trying to configure a Filebeat container to collect the Elasticsearch logs, ingest them, and send them to the monitoring cluster, but we can't figure out the best way to do this.
Since our Elasticsearch is running inside containers, the log4j2 config points everything to the console, and Docker writes it all to /var/lib/docker/containers/*/*.log. If you use a container (or docker) filebeat.input to collect the container logs, they get collected into a filebeat index and you can view them in the Logs tab in Kibana, but they do not show up in the Monitoring tab. Also, the collected logs are not ingested with the pipelines the way the elasticsearch module in Filebeat would ingest them.
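For illustration, the plain container input we tried looks roughly like this (the paths and output hosts are placeholders, not our exact config):

```yaml
# filebeat.yml -- minimal sketch of the container input described above;
# host and path values are illustrative placeholders
filebeat.inputs:
  - type: container
    paths:
      - /var/lib/docker/containers/*/*.log

output.elasticsearch:
  # the separate monitoring cluster (placeholder address)
  hosts: ["monitoring-cluster:9200"]
```

This collects the log lines, but as plain container output, so none of the elasticsearch module parsing gets applied.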
Thanks for raising this question. We'll want to have a look at your config files, but my guess is that you'll need to specifically tell the Docker modules to use the ingest pipelines which are typically used by the Elasticsearch module. You can see those pipelines here.
If you want to post your config files, we can try to help you put something together that works, but what I've posted above may give you a path to get started.
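As a rough sketch of that idea, you could load the module's pipelines into the monitoring cluster and then point the container input at the matching pipeline. The pipeline name below is illustrative; module pipeline names embed the Filebeat version, so check GET _ingest/pipeline on your cluster for the exact name:

```yaml
# First load the elasticsearch module pipelines into the monitoring cluster:
#   filebeat setup --pipelines --modules elasticsearch
filebeat.inputs:
  - type: container
    paths:
      - /var/lib/docker/containers/*/*.log
    # Illustrative pipeline name -- verify the exact name (it includes the
    # Filebeat version) with: GET _ingest/pipeline/filebeat-*
    pipeline: filebeat-7.9.0-elasticsearch-server-pipeline
```

Whether the parsed logs then appear in the Monitoring tab also depends on the documents matching what Stack Monitoring expects, so treat this as a starting point rather than a complete solution.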