Kafka and Logstash

I have read many articles about Logstash, Kafka, and their integration.

I know that Kafka is a scalable producer/consumer message queue, and that with Logstash I can gather, normalize, enrich, and index data from different sources.

On the other hand, I can also gather, normalize, and enrich data across different topics in Kafka, and then Elasticsearch can consume the relevant topics and index them as well.

In my case, I have a streaming source that produces information continuously, and I currently use Elasticsearch with Kibana. The problem is that we sometimes get a burst of information that the Elasticsearch cluster cannot index in time, so I want to change the architecture. Note that my data is already structured and does not need any normalization or enrichment.
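
For context, the change I am considering is putting Kafka in front of Elasticsearch as a buffer. Below is a minimal sketch of the producer side, assuming the kafka-python client; the broker address, topic name, and record fields are placeholders, not my real setup:

```python
import json
import time

from kafka import KafkaProducer  # pip install kafka-python

# The streaming source publishes each already-structured record to a Kafka topic.
# Kafka absorbs bursts, so the indexing side can drain the topic at its own pace.
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",  # placeholder broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
    acks="all",                          # wait for replication before acknowledging
)

def publish(record: dict) -> None:
    """Send one structured record to the (made-up) 'events' topic."""
    producer.send("events", value=record)

if __name__ == "__main__":
    # Stand-in for the continuous source.
    for i in range(10):
        publish({"id": i, "timestamp": time.time(), "value": 42})
    producer.flush()  # block until all buffered records are sent
```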

My Questions:

  1. My thought is that Logstash is a perfect tool for normalization and enrichment, and it is very easy to use. We can do the same jobs with Kafka, but in a somewhat harder way. Is that correct?
  2. I know that Logstash has a persistent queue feature, and Kafka works that way by nature. My thought is that Kafka is better and more resilient than Logstash. In my case, do you agree?
  3. I know that it is possible to integrate Kafka and Logstash easily, but I think Logstash would make the system more complicated and is not a useful component in my case, since my data is already normalized and does not need enrichment (see the sketch after this list). Do you agree?
  4. I found that Apache NiFi is a very easy tool for indexing and inspecting the flow of data, but I have performance concerns. Do you think that using NiFi decreases the ingestion rate compared to writing the pipeline myself from scratch?
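
To make question 3 concrete, here is a minimal sketch of the direct Kafka to Elasticsearch path I have in mind, without Logstash in between. It assumes kafka-python and the official elasticsearch Python client; the topic, index, consumer group, and addresses are made up:

```python
import json

from elasticsearch import Elasticsearch, helpers  # pip install elasticsearch
from kafka import KafkaConsumer                    # pip install kafka-python

es = Elasticsearch("http://localhost:9200")        # placeholder cluster address

consumer = KafkaConsumer(
    "events",                                      # made-up topic name
    bootstrap_servers="localhost:9092",
    group_id="es-indexer",                         # made-up consumer group
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    enable_auto_commit=False,                      # commit only after a successful bulk
    auto_offset_reset="earliest",
)

while True:
    # Pull up to 1000 records at a time; during a burst the backlog simply
    # grows in Kafka instead of overwhelming the Elasticsearch cluster.
    batches = consumer.poll(timeout_ms=1000, max_records=1000)
    actions = [
        {"_index": "events", "_source": record.value}
        for records in batches.values()
        for record in records
    ]
    if actions:
        helpers.bulk(es, actions)                  # index the whole batch in one request
        consumer.commit()                          # mark the batch as processed
```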
  1. Yes.
  2. Depends on your definition of resilience; Logstash isn't clustered, for example.
  3. Depends on what you do in question 1, really.
  4. I have no experience with NiFi, so I cannot comment.