Kafka message frequency

I would like to consume messages from Kafka with Logstash.
Is it possible for Logstash to always read the last message that exists on a Kafka topic, even if it was sent a while ago?
So far I see I can configure a pipeline which will read all current messages that arrive in Kafka.
If I stop and restart the pipeline, it will only wait for new messages; the old ones are forgotten.
What I would like is to continuously read the last message that is on the topic.
Is that possible?

Is this question related to your previous one with version numbers and ActiveMQ? If so, then in another post I showed you that you don't need Kafka.

Indeed, in my last post it was a use case where I wanted to poll Kafka (not needed now).
For future use cases, is it possible to poll a Kafka topic via Logstash?
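For reference, a minimal kafka input pipeline looks something like this (the broker address, topic name, and group id below are placeholders). With a stable `group_id`, the input resumes from the last committed offset on restart; `auto_offset_reset` only applies when no committed offset exists yet:

```
input {
  kafka {
    bootstrap_servers => "localhost:9092"   # placeholder broker address
    topics            => ["my-topic"]       # placeholder topic name
    group_id          => "logstash-demo"    # offsets are committed under this group
    auto_offset_reset => "earliest"         # used only if no committed offset exists
  }
}

output {
  stdout { codec => rubydebug }
}
```

Note that this gives you continuous consumption from the last committed position, not "always re-read the last message" — that behaviour is not something the input offers out of the box.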

If you think of Logstash as an unbounded stream processing system then you are on the right track.

Any attempt to move away from that idea usually means that you must go to extra lengths to implement it, or accept that it is too difficult to do so.

The kafka input will always try to read from the last consumer offset it knows about, and that offset is stored in Kafka by the input's client library. Kafka itself does not have a concept of the "last message", because it is up to consumers to track where they are, offset-wise.
If a consumer is at offset 100, it does not know whether it has read the last message or is only halfway through the available messages.
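As a toy illustration (plain Python, no Kafka client; all names here are invented for the sketch, not the Kafka API), a consumer's offset alone cannot tell it whether it has reached the end of the log — it needs the log end offset for comparison:

```python
# Toy model of one Kafka partition log and a consumer offset.
# Names are invented for illustration; this is not the Kafka API.

log = ["msg-%d" % i for i in range(150)]  # partition holding 150 messages

def has_more(consumer_offset, log_end_offset):
    """A consumer can only answer "am I caught up?" by comparing
    its own offset against the partition's end offset."""
    return consumer_offset < log_end_offset

print(has_more(100, len(log)))  # True: 50 messages still unread
print(has_more(150, len(log)))  # False: fully caught up

# So "read the last message" means: fetch the end offset,
# seek to end_offset - 1, and poll one record.
last_offset = len(log) - 1
print(log[last_offset])  # the most recent message
```

A real Kafka consumer exposes the same idea through its end-offsets / seek-to-end operations, which is why the client, not the broker, is responsible for knowing where "the end" is.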

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.