Hi All,
The application team is doing "GZIP and encoding with UTF-8" and sending their logs to Kafka topics. Now I want to read those logs through a Logstash pipeline. Could you please guide me on this?
Thanks
Hi Team,
Please guide me on the above topic.
Thanks
What have you tried? You didn't provide any context.
You need to use the Kafka input to read the messages from Kafka; it will read compressed messages by default.
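A minimal sketch of such an input (broker addresses, topic, and group names here are placeholders, not values from this thread):

input {
  kafka {
    bootstrap_servers => "broker1:9092,broker2:9092"
    topics => ["my_topic"]
    group_id => "my_consumer_group"
    codec => "plain"   # no gzip codec needed when compression is handled by Kafka itself
  }
}

With Kafka's transport-level compression, the consumer client decompresses messages transparently, so Logstash only ever sees plain text.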
Hi Leandrojmp,
Here I am providing my Logstash configuration:
input {
  kafka {
    bootstrap_servers => "A,B,C"
    topics => "XX_XX_XX"
    group_id => "XX_XX_XX"
    codec => gzip_lines { charset => "UTF-8" }
    security_protocol => "SASL_SSL"
    sasl_mechanism => "SCRAM-SHA-512"
    sasl_jaas_config => "org.apache.kafka.common.security.scram.ScramLoginModule required username='XX' password='XXXXXX';"
    ssl_endpoint_identification_algorithm => ""
  }
}

filter {
  ruby { code => 'event.set("decoded", Base64.decode64(event.get("message")).force_encoding("UTF-8"))' }
}
While using the above configuration, I am getting the below error:
[ERROR] 2024-01-03 05:22:51.005 [kafka-input-worker-logstash-0] Logstash - uncaught exception (in thread kafka-input-worker-logstash-0)
org.jruby.exceptions.StandardError: (Error) not in gzip format
at org.jruby.ext.zlib.JZlibRubyGzipReader.initialize(org/jruby/ext/zlib/JZlibRubyGzipReader.java:141) ~[jruby.jar:?]
at org.jruby.ext.zlib.JZlibRubyGzipReader.new(org/jruby/ext/zlib/JZlibRubyGzipReader.java:85) ~[jruby.jar:?]
at org.jruby.ext.zlib.JZlibRubyGzipReader.new(org/jruby/ext/zlib/JZlibRubyGzipReader.java:76) ~[jruby.jar:?]
at usr.share.logstash.vendor.bundle.jruby.$2_dot_6_dot_0.gems.logstash_minus_codec_minus_gzip_lines_minus_3_dot_0_dot_4.lib.logstash.codecs.gzip_lines.decode(/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-codec-gzip_lines-3.0.4/lib/logstash/codecs/gzip_lines.rb:35) ~[?:?]
at usr.share.logstash.vendor.bundle.jruby.$2_dot_6_dot_0.gems.logstash_minus_integration_minus_kafka_minus_11_dot_2_dot_1_minus_java.lib.logstash.inputs.kafka.handle_record(/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-integration-kafka-11.2.1-java/lib/logstash/inputs/kafka.rb:359) ~[?:?]
at usr.share.logstash.vendor.bundle.jruby.$2_dot_6_dot_0.gems.logstash_minus_integration_minus_kafka_minus_11_dot_2_dot_1_minus_java.lib.logstash.inputs.kafka.thread_runner(/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-integration-kafka-11.2.1-java/lib/logstash/inputs/kafka.rb:329) ~[?:?]
at usr.share.logstash.vendor.bundle.jruby.$2_dot_6_dot_0.gems.logstash_minus_integration_minus_kafka_minus_11_dot_2_dot_1_minus_java.lib.logstash.inputs.kafka.thread_runner(/usr/share/logstash/vendor/bundle/jruby/2.6.0/gems/logstash-integration-kafka-11.2.1-java/lib/logstash/inputs/kafka.rb:329) ~[?:?]
This error means that the message is not in gzip format.
Have you tried to remove this line?
codec => gzip_lines {charset => "UTF-8"}
It is not clear what you mean by "GZIP and encoding with UTF-8". Compression is normally handled by Kafka itself, and you do not need to do anything else, as the messages are decompressed by default when consumed.
How are your documents being written into Kafka? What tool is being used?
It seems that you are compressing your documents and writing the compressed document into Kafka, is that right?
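To illustrate what the error implies, here is a minimal Ruby sketch. It assumes, as the ruby filter in the posted config suggests, that the producer gzips each log line and then base64-encodes it before writing to Kafka; the sample line and variable names are illustrative, not from the thread:

```ruby
require "zlib"
require "base64"

# Assumption: producer gzips each line, then base64-encodes it.
original   = "2024-01-03 application log line"
compressed = Zlib.gzip(original)                 # raw gzip bytes, starting with magic 0x1f 0x8b
encoded    = Base64.strict_encode64(compressed)  # what would actually land on the topic

# The gzip_lines codec would fail on `encoded` with "not in gzip format",
# because base64 text does not start with the gzip magic bytes.
# The consumer must base64-decode FIRST, then gunzip:
decoded = Zlib.gunzip(Base64.strict_decode64(encoded)).force_encoding("UTF-8")
puts decoded
```

If the messages are only gzipped (no base64 layer), the analogous fix would be to drop the gzip_lines codec and decompress the raw message bytes in a ruby filter with Zlib.gunzip instead.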
This topic was automatically closed 28 days after the last reply. New replies are no longer allowed.