16:51:47.339 [[main]<kafka] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka version : 0.10.0.1
16:51:47.341 [[main]<kafka] INFO org.apache.kafka.common.utils.AppInfoParser - Kafka commitId : a7a17cdec9eaa6c5
16:51:47.513 [Ruby-0-Thread-16: C:/Users/A/test/logstash-5.1.1/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.1.0/lib/logstash/inputs/kafka.rb:225] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator - Discovered coordinator 192.168.31.54:9092 (id: 2147483646 rack: null) for group logstash.
16:51:47.519 [Ruby-0-Thread-16: C:/Users/A/test/logstash-5.1.1/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.1.0/lib/logstash/inputs/kafka.rb:225] INFO org.apache.kafka.clients.consumer.internals.ConsumerCoordinator - Revoking previously assigned partitions [] for group logstash
16:51:47.523 [Ruby-0-Thread-16: C:/Users/A/test/logstash-5.1.1/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.1.0/lib/logstash/inputs/kafka.rb:225] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator - (Re-)joining group logstash
16:51:47.556 [Ruby-0-Thread-16: C:/Users/A/test/logstash-5.1.1/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.1.0/lib/logstash/inputs/kafka.rb:225] INFO org.apache.kafka.clients.consumer.internals.AbstractCoordinator - Successfully joined group logstash with generation 5
16:51:47.561 [Ruby-0-Thread-16: C:/Users/A/test/logstash-5.1.1/vendor/bundle/jruby/1.9/gems/logstash-input-kafka-5.1.0/lib/logstash/inputs/kafka.rb:225] INFO org.apache.kafka.clients.consumer.internals.ConsumerCoordinator - Setting newly assigned partitions [test-beat33-0] for group logstash
Despite seeing all these INFO messages, I cannot see any of my logs in the console. When I check the Kafka server with kafka-console-consumer, I can see my logs there.
Is there any third-party tutorial or documentation that I can use? Is there something wrong with Logstash, or is there some usage information that is not publicly documented?
Which console are you referring to? Your Logstash console?
Could not find log4j2 configuration at path /Users/A/test/logstash-5.1.1/config/log4j2.properties. Using default config which logs to console
You can try starting Logstash like this:

..\logstash-5.1.1\bin\logstash -f test.conf --path.settings=dir_name_where_logstash.yml_is_located --config.debug --log.level debug

After that you can see the logs in the Logstash logs directory.
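For reference, a minimal test.conf for this kind of check might look like the sketch below. The broker address, topic, and group name are taken from the log output above; the stdout output with the rubydebug codec simply prints each event to the console so you can confirm data is flowing:

input {
  kafka {
    # broker, topic, and group taken from the log output above
    bootstrap_servers => "192.168.31.54:9092"
    topics => ["test-beat33"]
    group_id => "logstash"
  }
}

output {
  # print every event to the console for debugging
  stdout { codec => rubydebug }
}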
I know what the problem is. All data that arrived in Kafka before Logstash started is automatically ignored by Logstash; only data that arrives after Logstash has started is processed.
It looks like the default Logstash configuration will result in data loss if the Logstash server goes down or needs to be restarted in a production environment.
Is there any way to solve this problem? It defeats the purpose of using Kafka.
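This behaviour comes from the Kafka consumer's auto.offset.reset setting: when the consumer group has no committed offsets yet, it defaults to "latest", so anything already sitting in the topic is skipped. A sketch of a config that picks up existing messages is shown below; the group name "logstash-replay" is just an illustrative value, any group_id without committed offsets would behave the same way:

input {
  kafka {
    bootstrap_servers => "192.168.31.54:9092"
    topics => ["test-beat33"]
    # hypothetical new group_id that has no committed offsets yet
    group_id => "logstash-replay"
    # start from the beginning of the topic when no committed offsets exist
    auto_offset_reset => "earliest"
  }
}

output {
  stdout { codec => rubydebug }
}

Note that auto_offset_reset only applies when the group has no committed offsets; once the group has been running, a restart resumes from the last committed offset, so messages that arrive while Logstash is down are picked up rather than lost.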