Access Kafka producer server through Python script on GCP

I have established a successful connection between a Kafka producer and consumer on a Google Cloud Platform cluster by running:

$ cd /usr/lib/kafka
$ bin/kafka-console-producer.sh --broker-list \
PLAINTEXT://[project-name]-w-0.c.[cluster-id].internal:9092 --topic test

and executing in a new shell:

$ cd /usr/lib/kafka
$ bin/kafka-console-consumer.sh --bootstrap-server \
PLAINTEXT://[project-name]-w-0.c.[cluster-id].internal:9092 --topic test

Now I want to send messages to the Kafka server using the following Python script:

from kafka import KafkaProducer

topic = 'test'
producer = KafkaProducer(bootstrap_servers='PLAINTEXT://[project-name]-w-0.c.[cluster-id].internal:9092')

producer.send(topic, b"Test test test")
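One detail worth checking: the kafka-python documentation describes `bootstrap_servers` as plain `host[:port]` strings (the security protocol is a separate `security_protocol` parameter), so the `PLAINTEXT://` prefix may itself prevent the client from resolving the broker. A minimal, hypothetical helper to strip such a prefix (the function name is my own, not part of kafka-python):

```python
from urllib.parse import urlparse

def normalize_bootstrap(server):
    # kafka-python expects 'host:port' entries without a protocol
    # scheme; strip one (e.g. 'PLAINTEXT://') if it is present.
    # Hypothetical helper for illustration only.
    if "://" in server:
        return urlparse(server).netloc
    return server

normalize_bootstrap("PLAINTEXT://broker-host.internal:9092")
```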

However, this results in a KafkaTimeoutError:

"Failed to update metadata after %.1f secs." % (max_wait,))
kafka.errors.KafkaTimeoutError: KafkaTimeoutError: Failed to update metadata after 60.0 secs.
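To see what is actually failing underneath this generic timeout, kafka-python's own logging can be turned up before the producer is created; a small sketch:

```python
import logging

# kafka-python logs its connection and metadata attempts to the
# 'kafka' logger; DEBUG level surfaces DNS and connection errors
# that are otherwise hidden behind the KafkaTimeoutError.
logging.basicConfig(level=logging.DEBUG)
logging.getLogger("kafka").setLevel(logging.DEBUG)
```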

Looking around online suggested that I should consider:

  • uncommenting listeners=... and advertised.listeners=... in the /usr/lib/kafka/config/server.properties file.

However, listeners=PLAINTEXT://:9092 does not work, and this post suggests setting PLAINTEXT://<external-ip>:9092 instead.
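For reference, this is a sketch of what the relevant server.properties lines might look like, assuming the internal worker hostname from above (placeholder values, not verified on this cluster):

```properties
# Bind on all interfaces on port 9092.
listeners=PLAINTEXT://0.0.0.0:9092
# Hostname returned to clients in metadata responses; it must be
# resolvable and reachable from wherever the client runs.
advertised.listeners=PLAINTEXT://[project-name]-w-0.c.[cluster-id].internal:9092
```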

So I started wondering whether I need to access the Kafka server through an external (static) IP address of the GCP cluster. To that end, we have set up a firewall rule to open the port and allow HTTPS access to the cluster, but I am unsure whether this is overkill for the problem.

I definitely need some guidance on connecting successfully to the Kafka server from the Python script.
