Logs Not Appearing in Kibana After Instance Refresh Activity

Our log management pipeline operates as follows: Beats -> MSK (Kafka) -> Logstash (hosted on EC2) -> Elasticsearch -> Kibana.

We roll out updates to the Logstash servers using the AWS Instance Refresh activity. A baseline server serves purely as a mirror: changes are applied to it first and then replicated to the two Logstash servers on which the Logstash services are actively running.

Recently, after a routine instance refresh, logs stopped appearing in Kibana. Interestingly, a new index was created successfully, but no logs are being ingested into it. The refresh was carried out as per our usual process, yet the following error appeared in the Logstash logs:

[2025-01-08T05:27:59,214][ERROR][logstash.javapipeline    ][prod-pipeline][6867f9988635ef86f462be78937e21cfa494002570c4caf5402438fa369bab1d] A plugin had an unrecoverable error. Will restart this plugin.
  Pipeline_id:easyjetholidays-pipeline
  Plugin: <LogStash::Inputs::Kafka codec=><LogStash::Codecs::JSON id=>"json_927c647a-2671-4fda-b74f-f20f93a3877a", enable_metric=>true, charset=>"UTF-8">, group_id=>"holidays_applog_grp_prod", topics=>["easyjet_holidays_prod_app_topic", "easyjet_holidays_prod_metrics_topic"], ssl_truststore_location=>"/usr/share/lc_certificates/kafka.client.truststore.jks", ssl_truststore_password=><password>, consumer_threads=>3, security_protocol=>"SSL", id=>"6867f9988635ef86f462be78937e21cfa494002570c4caf5402438fa369bab1d", bootstrap_servers=>"b------9094", enable_metric=>true, schema_registry_validation=>"auto", auto_commit_interval_ms=>5000, check_crcs=>true, client_dns_lookup=>"default", client_id=>"logstash", connections_max_idle_ms=>540000, enable_auto_commit=>true, fetch_max_bytes=>52428800, fetch_max_wait_ms=>500, heartbeat_interval_ms=>3000, isolation_level=>"read_uncommitted", key_deserializer_class=>"org.apache.kafka.common.serialization.StringDeserializer", max_poll_interval_ms=>300000, max_partition_fetch_bytes=>1048576, max_poll_records=>500, metadata_max_age_ms=>300000, receive_buffer_bytes=>32768, reconnect_backoff_ms=>50, request_timeout_ms=>40000, retry_backoff_ms=>100, send_buffer_bytes=>131072, session_timeout_ms=>10000, value_deserializer_class=>"org.apache.kafka.common.serialization.StringDeserializer", poll_timeout_ms=>100, ssl_endpoint_identification_algorithm=>"https", sasl_mechanism=>"GSSAPI", decorate_events=>"none">
  Error: Failed to construct kafka consumer
  Exception: Java::OrgApacheKafkaCommon::KafkaException
  Stack: org.apache.kafka.clients.consumer.KafkaConsumer.<init>(org/apache/kafka/clients/consumer/KafkaConsumer.java:825)
org.apache.kafka.clients.consumer.KafkaConsumer.<init>(org/apache/kafka/clients/consumer/KafkaConsumer.java:666)
org.apache.kafka.clients.consumer.KafkaConsumer.<init>(org/apache/kafka/clients/consumer/KafkaConsumer.java:646)
jdk.internal.reflect.GeneratedConstructorAccessor171.newInstance(jdk/internal/reflect/GeneratedConstructorAccessor171)
jdk.internal.reflect.DelegatingConstructorAccessorImpl.newInstance(jdk/internal/reflect/DelegatingConstructorAccessorImpl.java:45)
java.lang.reflect.Constructor.newInstance(java/lang/reflect/Constructor.java:490)
org.jruby.javasupport.JavaConstructor.newInstanceDirect(org/jruby/javasupport/JavaConstructor.java:285)
org.jruby.RubyClass.newInstance(org/jruby/RubyClass.java:918)
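
For reference, the kafka input in our pipeline configuration matches the plugin settings shown in the dump above. Below is a trimmed sketch of it; the truststore password and full broker list are redacted, and the elasticsearch output is simplified with a placeholder host and index pattern, since those are not shown in the error:

input {
  kafka {
    bootstrap_servers        => "b------9094"            # broker list redacted, as in the log above
    topics                   => ["easyjet_holidays_prod_app_topic", "easyjet_holidays_prod_metrics_topic"]
    group_id                 => "holidays_applog_grp_prod"
    client_id                => "logstash"
    consumer_threads         => 3
    codec                    => json { charset => "UTF-8" }
    security_protocol        => "SSL"
    ssl_truststore_location  => "/usr/share/lc_certificates/kafka.client.truststore.jks"
    ssl_truststore_password  => "${KAFKA_TRUSTSTORE_PASSWORD}"   # actual value redacted; placeholder variable
  }
}

output {
  elasticsearch {
    hosts => ["https://elasticsearch.internal:9200"]      # placeholder host
    index => "app-logs-%{+YYYY.MM.dd}"                    # placeholder index pattern
  }
}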

Could anyone please assist in identifying and resolving this issue? Your support would be greatly appreciated.