Kafka was unable to update metadata after some time

I have a Kafka environment with 3 brokers and 1 ZooKeeper node. I have pushed more than 20K messages to my topic, and Apache Storm is processing the data that the producer adds to the topic.

A few hours later, when I try to produce messages to Kafka, it throws the following exception:

    org.apache.kafka.common.errors.TimeoutException: Failed to update metadata after 60000 ms.

After restarting the Kafka server, everything works fine again, but in production I cannot restart my server every time, so can someone help me figure out my issue?

My producer configuration is as follows:

    prodProperties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "list of broker");
    prodProperties.put(ProducerConfig.ACKS_CONFIG, "1");
    prodProperties.put(ProducerConfig.RETRIES_CONFIG, "3");
    prodProperties.put(ProducerConfig.LINGER_MS_CONFIG, 5);
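
For completeness, a minimal runnable version of this producer might look like the sketch below. The broker addresses, topic name, and String serializers are assumptions on my part; the serializer settings are mandatory for the client and are not shown in the snippet above.

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerConfig;
    import org.apache.kafka.clients.producer.ProducerRecord;
    import org.apache.kafka.common.serialization.StringSerializer;

    public class ProducerExample {
        public static void main(String[] args) {
            Properties prodProperties = new Properties();
            // Assumed broker addresses -- replace with the real broker list.
            prodProperties.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG,
                    "broker1:9092,broker2:9092,broker3:9092");
            prodProperties.put(ProducerConfig.ACKS_CONFIG, "1");
            prodProperties.put(ProducerConfig.RETRIES_CONFIG, "3");
            prodProperties.put(ProducerConfig.LINGER_MS_CONFIG, 5);
            // Serializers are required by KafkaProducer; String is assumed here.
            prodProperties.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG,
                    StringSerializer.class.getName());
            prodProperties.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG,
                    StringSerializer.class.getName());

            try (KafkaProducer<String, String> producer = new KafkaProducer<>(prodProperties)) {
                // send() blocks for up to max.block.ms (default 60000 ms) while
                // waiting for metadata or for free buffer space; that is where
                // the TimeoutException above is thrown.
                producer.send(new ProducerRecord<>("my-topic", "key", "value"));
            }
        }
    }
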
1 answer

Although tuning a Kafka producer is a fairly complex topic, I would guess that your producer is trying to produce records faster than your Kafka cluster can take them in.

The producer has a buffer.memory setting that determines how much memory the producer can use to buffer records before it blocks. The default value is 33554432 (32 MB).

If you increase the producer's buffer memory, the blocking should go away. Try different values, for example 100 MB.
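
A minimal sketch of that change, assuming the prodProperties object from the question (the 100 MB value is just the example figure above):

    // Raise the producer buffer from the 32 MB default to 100 MB. When the
    // buffer is full, send() blocks for up to max.block.ms (default 60000 ms)
    // before throwing a TimeoutException, so more headroom reduces blocking.
    prodProperties.put(ProducerConfig.BUFFER_MEMORY_CONFIG, 100L * 1024 * 1024);

Keep in mind that a larger buffer only absorbs bursts; if the producer is consistently faster than the cluster, the buffer will eventually fill up at any size.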
