How do I send logs to Elasticsearch, which is set up on a VM, from Filebeat running in Docker?

This is my filebeat.yml:

output:
  elasticsearch:
    enabled: true
    hosts:
      - http://<my_external_host_having_elasticsearch_instance>:9200

    # ssl
    # certificate_authorities:
    #   - /etc/pki/tls/certs/logstash-beats.crt
    timeout: 15

filebeat:
  prospectors:
    -
      paths:
        - /var/log/vmware-vmsvc.log
        - /var/log/auth.log
      document_type: syslog
    -
      paths:
        - "/var/log/nginx/*.log"
      document_type: nginx-access

For starters, it will depend on the version of the Elastic Stack you are using. For example, looking at your config file: in recent versions of the Elastic Stack, "prospectors" is deprecated in favor of "inputs". Secondly, under "output", you will need to provide an Elasticsearch username and password if security is enabled on your cluster.
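As a rough sketch, your config translated to the newer syntax might look like the following (the host placeholder is from your original file; the `log_type` field name is just an example, since `document_type` was removed in newer versions in favor of custom fields):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/vmware-vmsvc.log
      - /var/log/auth.log
    fields:
      log_type: syslog        # document_type was removed; use a custom field
  - type: log
    paths:
      - "/var/log/nginx/*.log"
    fields:
      log_type: nginx-access

output.elasticsearch:
  hosts: ["http://<my_external_host_having_elasticsearch_instance>:9200"]
  # username: "elastic"       # only needed if security is enabled
  # password: "changeme"
```

Check this against the reference file for your exact Filebeat version, as option names have shifted between releases.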

Here are a few suggestions:

  1. In non-swarm mode, here is an example of how to run the Filebeat Docker container: https://www.elastic.co/guide/en/beats/filebeat/current/running-on-docker.html
  2. Ensure you bind mount volumes (the locations of the logs that are to be shipped to Elasticsearch) as part of your docker run command
  3. Look at the configuration reference file at https://www.elastic.co/guide/en/beats/filebeat/current/filebeat-reference-yml.html and configure Filebeat for your scenario
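Putting points 1 and 2 together, a minimal sketch of the docker run command (the config path, mounted log directory, and image tag are assumptions to adapt to your setup):

```shell
# Run Filebeat in Docker (non-swarm). Bind mount:
#  - your own filebeat.yml (read-only) over the image's default config
#  - the host log directories that filebeat.yml references
docker run -d \
  --name=filebeat \
  --user=root \
  -v /path/to/filebeat.yml:/usr/share/filebeat/filebeat.yml:ro \
  -v /var/log:/var/log:ro \
  docker.elastic.co/beats/filebeat:8.13.0
```

Running as root and mounting the log paths read-only is a common pattern when harvesting host logs; make sure the paths you mount match the `paths` entries in your filebeat.yml.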
