Exiting: error unpacking config data: more than one namespace configured accessing 'output' (source:'filebeat.yml')

When I enable both the Elasticsearch output and the Logstash output in filebeat.yml, I get this error:

Exiting: error unpacking config data: more than one namespace configured accessing 'output' (source:'filebeat.yml')

Please help me solve this problem!

Here is my filebeat.yml file:


```yaml
filebeat.inputs:
- type: log
  enabled: true
  paths:
    - /var/log/*.log
    - D:\Git\finance.api\FinanceAPI\logs*.log
    #- c:\programdata\elasticsearch\logs*

filebeat.config.modules:
  path: ${path.config}/modules.d/*.yml
  reload.enabled: false

setup.template.settings:
  index.number_of_shards: 1

setup.kibana:
  host: "localhost:5601"

output.elasticsearch:
  hosts: ["localhost:9200"]
  username: "elastic"
  password: "changeme"

output.logstash:
  hosts: ["localhost:5044"]
```

You cannot have two different outputs in the same Filebeat instance.

Please consider using Logstash as a forwarder, or running another Filebeat instance.
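For example, the "another Filebeat" approach means running two instances, each with its own config file containing a single output (the file names below are illustrative, not from this thread; each instance is started with `filebeat -c <config> --path.data <dir>` so their registries don't clash):

```yaml
# filebeat-to-es.yml - first instance, ships directly to Elasticsearch
output.elasticsearch:
  hosts: ["localhost:9200"]
```

```yaml
# filebeat-to-ls.yml - second instance, ships to Logstash
output.logstash:
  hosts: ["localhost:5044"]
```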


Thank you for your reply! Now I have only one output, the Logstash output.
Can you tell me how to use Logstash as a forwarder, or how to run another instance of Filebeat?

Hi, can you mark the topic as solved, to make it easier for the community to find?

Can't you just use Logstash as your only output, and let Logstash forward to Elasticsearch?
Do you need to run two separate pipelines?
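For reference, if two separate pipelines really were needed, Logstash supports declaring them in `config/pipelines.yml` (a sketch; the pipeline ids and paths below are illustrative):

```yaml
# config/pipelines.yml - one entry per independent pipeline
- pipeline.id: beats-pipeline
  path.config: "/etc/logstash/conf.d/first-pipeline.config"
- pipeline.id: other-pipeline
  path.config: "/etc/logstash/conf.d/second-pipeline.config"
```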


Yeah, sure!

Can you tell me how to run two separate pipelines? I'm new to this technology, so I can't figure out the solution on my own.

Here is my first-pipeline.config file:

```
# Beats -> Logstash -> Elasticsearch pipeline.
input {
  beats {
    port => 5044
  }
}
filter {
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}
output {
  elasticsearch {
    hosts => ["http://localhost:9200"]
    user => "test"
    password => "test123"
  }
}
```

I don't understand why you need Filebeat to forward to both Logstash and Elasticsearch, when you already push logs to Elasticsearch through Logstash.

I'm actually trying to understand whether you need to send the same log twice to Elasticsearch:

data --> filebeat --> logstash --> elastic
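With that flow, the original error goes away as soon as filebeat.yml keeps a single active output namespace, e.g. by commenting out the Elasticsearch section (a sketch based on the config posted above):

```yaml
# Only one output namespace may be enabled at a time.
#output.elasticsearch:
#  hosts: ["localhost:9200"]

output.logstash:
  hosts: ["localhost:5044"]
```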


I have multiple .NET Core projects that produce logs, and I want to show those logs in a Kibana dashboard. So I use Filebeat to send the logs to Logstash, and Logstash then sends them to Elasticsearch:

log files => filebeat => logstash => elasticsearch

This is what I want to do.

OK, so you don't really need two separate pipelines. If you do need to keep the projects apart, you can always tag the logs depending on their source.

Tag example:

```yaml
# Foo o365
- type: log
  enabled: true
  paths:
    - "/var/log/o365.log"
  encoding: utf-8
  tags: ["foo365"]
```
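Those tags travel with each event, so on the Logstash side you can route the projects to different indices with a conditional (a sketch; the index names are illustrative):

```
output {
  if "foo365" in [tags] {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "foo365-%{+YYYY.MM.dd}"
    }
  } else {
    elasticsearch {
      hosts => ["http://localhost:9200"]
      index => "other-logs-%{+YYYY.MM.dd}"
    }
  }
}
```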


This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.