Fluentd doesn't send logs to Elasticsearch

I have a problem sending logs to Elasticsearch. I configured the conf file on the server, but in Elastic I can't see any logs, even in the Discover menu.

Hi @gurban.suleyman,

Welcome to the community! Are you using the fluentd elasticsearch plugin? Can you share any errors you are seeing in the logs?

If you are using the fluentd elasticsearch plugin I would recommend seeing if any of the tips in the troubleshooting guide help you identify the issue as well.

Hi @carly.richmond
I fixed the issue, but I have a question. I have been searching for about a month without finding a solution.
Here it is:
our client sends logs to Elasticsearch with type kafka2, but now they also want to send logs with type elasticsearch. I configured the elasticsearch output and it was working, but when I add the kafka configuration to td-agent.conf it stops working, or vice versa.
Do you have any solution for configuring multiple source or match types in td-agent? I want both kafka and elasticsearch to work in td-agent.

Hi @gurban.suleyman,

It would be great to know what the solution to your issue was in case others encounter the same problem.

In terms of your new issue (I'm assuming you haven't posted a new topic for it), can you confirm what the inputs and outputs are? If I understand correctly, you want to send logs from both Kafka and an Elasticsearch cluster to Elasticsearch? Or is it the other way around?

Kafka sends logs to Kafdrop, and additionally the logs must be sent to Elasticsearch. So, is it possible to have two different sources and matches in the same td-agent.conf file?

Have you tried using the copy output? There is an example in the docs for file and elasticsearch output that you could try to adapt.
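For instance, a minimal sketch of what that copy output could look like (the match pattern, broker address, host, and topic name here are placeholders, not taken from your config):

```
<match your.tag>
  @type copy
  <store>
    @type elasticsearch
    host localhost
    port 9200
    logstash_format true
  </store>
  <store>
    @type kafka2
    # placeholder broker address
    brokers localhost:9092
    default_topic example-log
    <format>
      @type json
    </format>
  </store>
</match>
```

Each `<store>` receives a copy of every event that matches, so one source can feed both outputs.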

Here is an example of what I wrote in td-agent.conf; we want both kinds of logs to be sent:

```
<source>
  @type tail
  path /path/path/example*
  pos_file /etc/td-agent/test.log.pos
  format none
  tag test
</source>

<match *>
  @type kafka2
  use_event_time true

  <buffer>
    @type file
    path /etc/td-agent/buffer
    flush_interval 3s
  </buffer>

  <format>
    @type json
  </format>

  # topic settings
  topic_key example-log
  default_topic example-log
  required_acks -1
  compression_codec gzip
  get_kafka_client_log true
  @log_level trace
</match>

<source>
  @type tail
  path /another_path/log/*
  pos_file /etc/td-agent/test.log.pos
  format none
  tag test
</source>

<match *>
  @type elasticsearch
  user elastic
  password ""
  scheme http
  ssl_verify false
  host "ip address"
  port "port"
  index_name test
  logstash_format true
  logstash_prefix test
  include_timestamp true
</match>
```
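One thing worth noting about the config above: fluentd routes each event to the first `<match>` whose pattern fits its tag, so with two `<match *>` blocks the second (elasticsearch) block never receives events. Both sources also share the same `pos_file`, which each tail source needs to own exclusively. A sketch of one way to fix this, giving each source a distinct tag and pos_file (the tag names and pos_file paths below are placeholders I chose for illustration):

```
<source>
  @type tail
  path /path/path/example*
  pos_file /etc/td-agent/kafka.log.pos
  format none
  tag kafka.log
</source>

<source>
  @type tail
  path /another_path/log/*
  pos_file /etc/td-agent/es.log.pos
  format none
  tag es.log
</source>

<match kafka.log>
  @type kafka2
  # kafka2 settings as in the config above
</match>

<match es.log>
  @type elasticsearch
  # elasticsearch settings as in the config above
</match>
```

With distinct tags, both outputs run side by side in the same td-agent.conf.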

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.