Find logstash compatibility with kafka

Hello,

I upgraded Logstash from 2.4 to 7.5 and also Kafka from 0.10 to 2.5. I'm facing a few issues when pushing logs through Logstash.

issue background information:

  1. I'm using the same Logstash config files in /etc/logstash/conf.d/*.conf, and I defined each config file in the pipelines file /etc/logstash/pipelines.yml
  2. I'm using logstash-integration-kafka and sending the messages to the Kafka brokers defined in the output section
  3. When I start Logstash 7.5, I can see in the Logstash logs that it started properly without any issues, and the service is fine, but it is sending empty messages to Kafka (I have a Kafka UI tool running on the Kafka side). I also validated the Logstash config in verbose mode and didn't find any errors.
  4. To double-check the Kafka side, I'm running Logstash on two hosts, one with the old 2.4 and one with 7.5. I can see the messages from Logstash 2.4 arriving in the Kafka topics, but Logstash 7.5 is sending empty messages to the topics.
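
For reference, the multi-pipeline setup described in (1) would look roughly like this in pipelines.yml (the pipeline ids and paths here are illustrative, not my actual ones):

```
- pipeline.id: app-logs
  path.config: "/etc/logstash/conf.d/app.conf"
- pipeline.id: syslog
  path.config: "/etc/logstash/conf.d/syslog.conf"
```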

questions:

  1. My logstash-integration-kafka plugin is at version 10.0.0. Is this compatible with Kafka 2.5? The plugin documentation says to check the Kafka documentation, but the Kafka documentation page is not up to date, so I'm checking here.

You'll find on https://www.elastic.co/guide/en/logstash/current/plugins-integrations-kafka.html (which currently covers Logstash 7.8) that logstash-integration-kafka is at version 10.2.0 and uses Kafka client 2.4.

On the Logstash 7.5 version of that page, it says that logstash-integration-kafka version 10.0.0 uses Kafka client 2.1.0.

I think you'll need to share your configs for further assistance. In particular, it will be important to know whether you have followed the upgrade documentation at https://kafka.apache.org/25/documentation.html#upgrade, particularly with regard to settings like log.message.format.version. Note also any changes related to the deprecation of ZooKeeper, which might change how your consumer group offsets are stored.
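
For example, during a rolling upgrade the broker's server.properties typically pins these settings until all clients have been upgraded (the values below are illustrative; check them against your own cluster):

```
inter.broker.protocol.version=2.5
log.message.format.version=0.10.2
```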

Bear in mind also that if you're only producing the occasional message, the buffering/batching settings may be getting in the way of seeing messages flushed to the broker or received on the consumer. The pages for the Kafka input and output plugins describe the applicable settings in much more detail.
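
As a sketch, the producer-side batching knobs on the kafka output look like this (the values shown are the plugin defaults, not recommendations, and the broker/topic names are placeholders):

```
output {
  kafka {
    bootstrap_servers => "broker1:9092"
    topic_id => "logs"
    batch_size => 16384   # bytes to buffer per partition before a send
    linger_ms => 0        # how long to wait for more records before flushing
  }
}
```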

Hope that helps,
Cameron

Hello Cameron,

I updated my Logstash config with just one extra line in the kafka output section. The default codec is plain (I hadn't specified a codec in the previous Logstash 2.4 config):

codec => "json"

Without the json codec, Logstash was sending empty messages to the Kafka broker; after adding the codec I got the desired output in the Kafka topics.
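
For anyone hitting the same problem, the fix in context looks roughly like this (broker and topic names are placeholders):

```
output {
  kafka {
    bootstrap_servers => "broker1:9092"
    topic_id => "logs"
    codec => "json"   # without this, 7.x was producing empty message bodies
  }
}
```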

This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.