When I start Filebeat, the logs are shipped from Filebeat to the Kafka topic (I can read them with a Kafka consumer), and Logstash picks them up as input, but I cannot see the messages in the Kibana index (logjmeter-2018.07.13).
NOTE:
1. The index mentioned above already exists; I am only trying to append my messages to it, but that is not working.
2. If I specify a new index name instead, the index gets created, the messages flow in, and I can see them in Kibana.
ASK:
- How can I confirm that the data is being read by Logstash? (I know to look under /var/log/logstash, but what log output should I expect?)
- How can I check whether Logstash is sending the messages to Elasticsearch?
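One idea I am considering for both questions (a sketch, not part of my current config): temporarily add a stdout output with the rubydebug codec alongside the elasticsearch output, so every event Logstash emits is printed and I can see whether events make it through the pipeline at all:

```
output {
  # Temporary debug output: prints each event as a structured hash
  stdout { codec => rubydebug }
}
```

If events appear on stdout but still not in the index, the problem would be between Logstash and Elasticsearch rather than between Kafka and Logstash.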
Following are my configurations:
Filebeat (6.2.1):
output.kafka:
  enabled: true
  hosts: ["16.202.68.51:9094","16.202.68.60:9094","16.202.68.62:9094"]
  topic: perf2-jmeter-new
Kafka (2.11-1.1.0) carries the messages.
Logstash (6.2.4):
input {
  kafka {
    bootstrap_servers => '16.202.68.62:9094,16.202.68.51:9094,16.202.68.60:9094'
    topics => ["perf2-jmeter-new"]
    type => "perf2-jmeter"
    consumer_threads => 3
    group_id => "logjmeter"
    decorate_events => 'true'
    enable_auto_commit => 'true'
    auto_offset_reset => 'earliest'
    poll_timeout_ms => "600"
    fetch_max_wait_ms => "600"
    auto_commit_interval_ms => "5000"
  }
}
output {
  if [type] == "perf2-jmeter" or [headers][content_type] == "perf2-jmeter" {
    elasticsearch {
      hosts => ["16.202.66.36:9200","16.202.66.40:9200","16.202.66.42:9200","16.202.66.56:9200","16.202.66.58:9200"]
      index => "logjmeter-%{+YYYY.MM.dd}"
      user => "elastic"
      password => "changeme"
    }
  }
}
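To check whether any documents are reaching Elasticsearch at all, I assume I can query the index document count directly over the REST API (host and credentials taken from the config above) and compare the count before and after sending messages:

```
curl -u elastic:changeme 'http://16.202.66.36:9200/logjmeter-*/_count?pretty'
```

If the count does not increase while Filebeat is sending, that would point to the conditional in the output block (or the index itself) rather than the Kafka side.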