Filebeat => Logstash Shipper => Kafka => Logstash Parser => Elasticsearch => Kibana

Hi, I'm really confused about the following architecture pipeline.

My goal is to send transport logs through Filebeat => Logstash => Kafka => Logstash Parser => Elasticsearch => Kibana.

I would also like to set up individual servers for each processing stage, since I have 600 client servers ingesting about 1 TB per day. Right now I'm blocked at the Kafka configuration, so I'll go one step at a time to explain my issue.

Filebeat => Logstash server with the following configuration (success):

#----------------------------- Logstash output --------------------------------
output.logstash:
  # The Logstash hosts
  hosts: ["10.146.134.15:5044"]
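
For completeness, the input side of my filebeat.yml is just a standard log input pointing at the transport logs, roughly like this (the path below is a placeholder, not my real one, and on Filebeat 5.x this section is filebeat.prospectors rather than filebeat.inputs):

filebeat.inputs:
- type: log
  enabled: true
  paths:
    # placeholder path - replace with the actual transport log location
    - /var/log/transport/*.log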

Logstash => Kafka with the following configuration:

input {
  beats {
    port => 5044
    type => "syslog"
    #ssl => true
    #ssl_certificate => "/etc/pki/tls/certs/filebeat.crt"
    #ssl_key => "/etc/pki/tls/private/filebeat.key"
  }
}

output {
  kafka {
    bootstrap_servers => "10.146.134.17:9092"
    # topic_id expects a single string, not an array
    topic_id => "beats"
  }
}
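
One thing I'm unsure about here: as far as I know the kafka output defaults to the plain codec, so only a text rendering of the event reaches Kafka. If the full event with its Beats fields should survive the trip, I believe the output needs an explicit json codec, along these lines:

output {
  kafka {
    bootstrap_servers => "10.146.134.17:9092"
    topic_id => "beats"
    # assumption: json codec so the whole event is preserved through Kafka
    codec => json
  }
}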

For Kafka to Logstash, I'm not sure how to tell Kafka to send the logs on to Logstash, but I have set up the Logstash parser server. The Kafka server itself runs perfectly, without any error messages.
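
From what I understand, Kafka doesn't push anything to Logstash; the kafka input on the parser side pulls from the topic as a consumer. To verify that part in isolation, a throwaway pipeline that just consumes the topic and prints to stdout should show the events, something like this (the group_id is just a name I picked for testing):

input {
  kafka {
    bootstrap_servers => "10.146.134.17:9092"
    topics => ["beats"]
    group_id => "kafka-to-logstash-test"
  }
}
output {
  # print consumed events to the console to confirm Kafka -> Logstash works
  stdout { codec => rubydebug }
}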

And finally, my Logstash parser to Elasticsearch configuration:

input {
  kafka {
    bootstrap_servers => "10.146.134.17:9092"
    topics => ["beats"]
  }
}
output {
  elasticsearch {
    hosts => ["10.146.134.12:8888"]
    index => "elasticse"
  }
}
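
Given the volume (600 servers, roughly 1 TB/day), I assume the kafka input on the parser will also need a consumer group, several consumer threads, and a codec matching whatever the shipper writes, and a date-based index probably makes more sense than a single fixed one. A sketch of what I mean (the group_id, thread count, and index pattern are assumptions, not tested settings):

input {
  kafka {
    bootstrap_servers => "10.146.134.17:9092"
    topics => ["beats"]
    # consumer group so several parser nodes can share the topic
    group_id => "logstash-parsers"
    consumer_threads => 4
    # must match the codec used on the shipper's kafka output
    codec => json
  }
}
output {
  elasticsearch {
    # 8888 is what I have configured; the Elasticsearch default HTTP port is 9200
    hosts => ["10.146.134.12:8888"]
    # daily index instead of one fixed index
    index => "transport-%{+YYYY.MM.dd}"
  }
}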

Not sure where I went wrong, but any input would help me move forward.
