We are running Elasticsearch in Docker containers (multiple Elasticsearch containers). We have a separate monitoring cluster set up, and we were able to use a Metricbeat container to collect monitoring data and send it to the monitoring cluster.
Now we are trying to configure a Filebeat container to collect the Elasticsearch logs, ingest them, and send them to the monitoring cluster, but we can't figure out the best way to do this.
Since our Elasticsearch runs inside containers, the log4j2 config points everything to the console, and Docker writes it all to /var/lib/docker/containers/*/*.log. If you use the container or docker filebeat.input to collect the container logs, they get collected into a filebeat index and you can view them in the Logs tab in Kibana, but they do not show up in the Monitoring tab. The collected logs are also not run through the ingest pipelines the way the elasticsearch module in Filebeat would process them.
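One approach we have been considering is Filebeat's hints-based autodiscover, so that the elasticsearch module (and its ingest pipelines) is applied to the container logs instead of the plain container input. This is only a minimal sketch, assuming Filebeat runs in its own container with the Docker log directory and socket mounted, and the monitoring cluster host name below is a placeholder:

```yaml
# filebeat.yml — sketch, not a tested config.
# Assumes /var/lib/docker/containers and /var/run/docker.sock are
# mounted into the Filebeat container.
filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true
      # Fallback for containers without co.elastic.logs/* labels:
      hints.default_config:
        type: container
        paths:
          - /var/lib/docker/containers/${data.container.id}/*.log

output.elasticsearch:
  # Placeholder host — point this at the monitoring cluster.
  hosts: ["https://monitoring-cluster:9200"]
```

The Elasticsearch containers would then be labeled so the hints provider picks the module, e.g. starting them with `--label co.elastic.logs/module=elasticsearch`, which should route the logs through the module's ingest pipelines. We have not verified that this alone makes them appear in the Monitoring tab.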
Any suggestions?