Sending stdout and stderr to different Elasticsearch hosts and Kibana

Hi,
I have a requirement to send stdout and stderr to different Elasticsearch and Kibana hosts, so I have deployed two Filebeat containers. Here are my filebeat.yml files:
1. filebeat.stdout.yml

filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true

processors:
- add_cloud_metadata: ~

filebeat.inputs:
- type: container
  enabled: true

  # Paths for container logs that should be crawled and fetched.
  paths:
    - /var/lib/docker/containers/*/*.log

  # Configure stream to filter to a specific stream: stdout, stderr or all (default)
  stream: stdout
    
output.elasticsearch:
  hosts: '${ELASTICSEARCH_HOSTS:<elastic_ip1>:9200}'

2. filebeat.stderr.yml
filebeat.config:
  modules:
    path: ${path.config}/modules.d/*.yml
    reload.enabled: false

filebeat.autodiscover:
  providers:
    - type: docker
      hints.enabled: true

processors:
- add_cloud_metadata: ~

filebeat.inputs:
- type: container
  enabled: true

  # Paths for container logs that should be crawled and fetched.
  paths:
    - /var/lib/docker/containers/*/*.log

  # Configure stream to filter to a specific stream: stdout, stderr or all (default)
  stream: stderr
    
output.elasticsearch:
  hosts: '${ELASTICSEARCH_HOSTS:<elastic_ip2>:9200}'
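
For reference, this is roughly how the two containers are deployed. It is a minimal docker-compose sketch, assuming each container mounts its own config file as filebeat.yml; the image tag, service names, and host paths are illustrative, not an exact setup:

version: "3"
services:
  filebeat-stdout:
    image: docker.elastic.co/beats/filebeat:7.17.0
    user: root
    volumes:
      # Each container gets its own config, mounted read-only as filebeat.yml
      - ./filebeat.stdout.yml:/usr/share/filebeat/filebeat.yml:ro
      # Container log files that the container input crawls
      - /var/lib/docker/containers:/var/lib/docker/containers:ro
      # Docker socket, needed by the docker autodiscover provider
      - /var/run/docker.sock:/var/run/docker.sock:ro
  filebeat-stderr:
    image: docker.elastic.co/beats/filebeat:7.17.0
    user: root
    volumes:
      - ./filebeat.stderr.yml:/usr/share/filebeat/filebeat.yml:ro
      - /var/lib/docker/containers:/var/lib/docker/containers:ro
      - /var/run/docker.sock:/var/run/docker.sock:ro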

Is the above method right? I do not see my container logs in either Kibana instance.
Any help would be appreciated. I am a total newbie to Elasticsearch.
Thanks

Welcome to our community! :smiley:

Do you mean different Elasticsearch clusters?

Yes, I want to send the streams to different Elasticsearch clusters. elastic_ip1 is the host where I want to send the stdout stream, and elastic_ip2 is the host where I want to send the stderr stream.

That looks right from what I can tell. What do your docker configs look like?

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.