Filebeat is not sending logs to Kafka

I have Filebeat running in a GCP cluster and I am trying to send the logs to my Kafka cluster in our DC. I can see in the Filebeat logs that it was able to make a connection to Kafka. I configured the Kafka output in filebeat.yml using the broker's IP address, since GCP is not able to resolve the Kafka hostname.
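For reference, the Kafka output section looks roughly like this (the IP, port, topic, and log path below are placeholders, not my real values):

```yaml
filebeat.inputs:
  - type: log
    paths:
      - /var/log/app/*.log   # placeholder path

output.kafka:
  # Broker addressed by IP because GCP cannot resolve the DC hostname
  hosts: ["10.0.0.10:9092"]  # placeholder IP
  topic: "app-logs"          # placeholder topic
  required_acks: 1
```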

I also updated the Kafka server.properties with advertised.listeners set to the IP address, to make sure Filebeat can connect to it. However, no logs show up in Kafka. There are no errors in the Filebeat logs either, and I can see that Filebeat is harvesting logs.
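The relevant broker settings are roughly the following (again with a placeholder IP):

```properties
# Bind on all interfaces
listeners=PLAINTEXT://0.0.0.0:9092
# Address handed back to clients; must be reachable from Filebeat in GCP
advertised.listeners=PLAINTEXT://10.0.0.10:9092
```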

I also tried running the same setup with Docker locally on my Mac, and there I was able to see the logs in Kafka.

To add: it is harvesting the file, but the monitoring metrics show a failed count:

{"active":3311},"harvester":{"open_files":42,"running":71}},"libbeat":{"config":{"module":{"running":0}},"output":{"events":{"active":3200,"batches":1,"failed":1600,"total":1600}},"outputs":{"kafka":{"bytes_read":1000,"bytes_write":215}},"pipeline":{"clients":1,"events":{"active":3201,"retry":1600}}},"registrar":{"states":{"current":30}},"system":{"load":{"1":2.32,"15":0.87,"5":1.85,"norm":{"1":1.16,"15":0.435,"5":0.925}}}},"ecs.version":"1.6.0"}}
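In case it helps, this is the logging section I added to filebeat.yml to try to surface any hidden output errors (a sketch; the selector names may need adjusting for your Filebeat version):

```yaml
# Temporary debug logging while troubleshooting the Kafka output
logging.level: debug
logging.selectors: ["kafka", "publish"]
```

Even with this enabled, I do not see an explicit error explaining the failed/retry counts above.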