Hey all, looking for some help on where to look next:
I'm playing with a pipeline in which Filebeat pushes log data onto a Kafka topic, and Logstash reads from that topic, does some filtering, and ultimately pushes to Elasticsearch, but I've hit a problem. The relevant bits of config are below for reference.
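On the shipping side it's essentially the stock Kafka output (broker address and topic name here are placeholders, not my real values):

```
# filebeat.yml (relevant section) -- broker and topic are placeholders
output.kafka:
  hosts: ["localhost:9092"]
  topic: "applogs"
```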
Upon initial deployment things seemed to be working: after starting Filebeat on my appliance, a slew of old log entries was pushed, and Logstash read them from the Kafka topic and output them (no filtering at this stage) to a file for debugging. Then I went to start iterating on the Logstash filtering, SIGTERM'd Logstash, and since then I've been unable to get it to read anything off the topic after restarting.
So basically it all worked, I killed Logstash, and now I can't get it to read anything off the topic, even when it's left running and I generate additional log entries.
I can verify that everything is going into Kafka, because if I manually spin up a console consumer I can see the newer log entries (command below for reference)...
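For completeness, that check was just the stock console consumer, along these lines (again, broker and topic are placeholders):

```
# Read the whole topic from the start using the new consumer
bin/kafka-console-consumer.sh --new-consumer \
  --bootstrap-server localhost:9092 \
  --topic applogs --from-beginning
```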
Versions:
Filebeat: 5.2
Logstash: 5.2
Kafka: 0.10.0.1
I'm not seeing Kafka or Logstash log anything untoward, even after increasing the logging levels. I'm looking for any guidance or suggestions on settings to tweak. Should I set up some consumer group configuration on the Kafka side? (I haven't yet; it's just a simple topic.) I've tried changing the auto_offset_reset option in the Logstash Kafka input and not had any luck (input config below for reference).
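In case it helps, the input side of my Logstash config currently looks roughly like this (broker address, topic name, and debug file path are placeholders; group_id is just the plugin's default, since I haven't configured a custom group):

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"  # placeholder broker
    topics            => ["applogs"]       # placeholder topic name
    group_id          => "logstash"        # plugin default; no custom group configured
    auto_offset_reset => "earliest"        # tried "earliest" and "latest", no change
  }
}
output {
  # Debug only for now; no filter block yet
  file { path => "/tmp/kafka-debug.log" }
}
```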
Would welcome any suggestions for things to check... I feel like I'm missing something obvious.
Thanks,
Greg