I'm playing around with Filebeat and I want to send logs to different outputs in an intelligent way. That is, I have a file at /path/to/file1 whose logs I want to go to Elasticsearch, and another file at /directory/to/file2 whose logs I want to ship to Kafka.
I have tried using config_dir in my main Filebeat configuration file. Here is the config I used.
filebeatFile2.yml, located in the directory /etc/filebeat/conf.d:
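(For readers: the original config was not attached; the following is only an illustrative sketch of this kind of setup in the old Filebeat 1.x syntax, with placeholder paths and hosts, showing a prospector in a conf.d file alongside an output section that one might hope applies only to it.)

```yaml
# Illustrative sketch only -- paths and hosts are placeholders.
# A conf.d file loaded via config_dir can declare prospectors,
# but any output section in it is not picked up per-file.
filebeat:
  prospectors:
    - paths:
        - /path/to/file1
      input_type: log

output:
  elasticsearch:
    hosts: ["localhost:9200"]
```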
The issue with this configuration is that output.elasticsearch is not taken into account; in fact it is completely ignored, so the logs of both file1 and file2 are sent to Kafka.
So here is my question: can somebody help me achieve this, please?
The output configuration in Beats is global. As of now, Filebeat does not support any advanced event routing; that is normally handled by Logstash. Alternatively, you can run two Filebeat instances (make sure filebeat.registry_file is different for each).
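For anyone landing on this later, here is a sketch of the Logstash approach under the assumption that each Filebeat prospector tags its events and Logstash routes on that tag (all paths, hosts, field names, and the Kafka topic are placeholders, not settings from the original posts):

```yaml
# Filebeat side (sketch, 1.x syntax): tag each prospector, send everything to Logstash.
filebeat:
  prospectors:
    - paths: ["/path/to/file1"]
      fields: {route: elasticsearch}
    - paths: ["/directory/to/file2"]
      fields: {route: kafka}
output:
  logstash:
    hosts: ["localhost:5044"]
```

```
# Logstash side (sketch): conditional routing on the custom field.
input { beats { port => 5044 } }
output {
  if [fields][route] == "elasticsearch" {
    elasticsearch { hosts => ["localhost:9200"] }
  } else {
    kafka {
      topic_id          => "filebeat"
      bootstrap_servers => "localhost:9092"
    }
  }
}
```

The two-instance alternative amounts to two full Filebeat configs, each with its own output and its own registry_file, started separately:

```shell
# Sketch: two independent instances, each with its own config and registry path.
filebeat -c /etc/filebeat/filebeat-es.yml
filebeat -c /etc/filebeat/filebeat-kafka.yml
```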
Many thanks, I will give that a try. I'm sure this will work very well.
Pardon me if I posted in the wrong area (feel free to direct me to the right place), but my problem may be related to your response.
I have been trying to route our logs to a central syslog-ng server, but found only these output options: Elasticsearch, Logstash, Kafka, Console. I tried to improvise by entering my syslog-ng server address under the Logstash hosts setting while the Elasticsearch output was still configured.
I keep getting this error in my logs:
Connecting error publishing events (retrying): Get http://localhost:9200: dial tcp [::1]:9200: getsockopt: connection refused
Please share ideas on how I can route to my syslog-ng server.
@laibon your question is totally unrelated; please create your own topic. Filebeat does not have a syslog output.
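(For readers searching for the same thing: one common workaround is to ship from Filebeat to Logstash and let Logstash forward to syslog-ng via the community logstash-output-syslog plugin. A sketch, with a placeholder hostname and port:)

```
# Sketch only -- host and port are placeholders for your syslog-ng server.
input { beats { port => 5044 } }
output {
  syslog {
    host     => "syslog-ng.example.com"
    port     => 514
    protocol => "udp"
    facility => "user-level"
    severity => "informational"
  }
}
```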
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.