I am currently working on a setup with 6 Kafka brokers. Data is being pushed into my topic from two producers at a rate of about 4000 messages per second, and I have 5 consumers for this topic working as a group. What should be the ideal number of partitions for my Kafka topic?

Please also feel free to tell me if any change is required on the broker/consumer/producer side.
In general, more partitions means more throughput. However, there are other considerations too, such as the limits of the hardware you are running on, whether you are using compression, etc. There is good information from Confluent here that gives you insight into the rough calculation you can use to arrive at a number of partitions.
Moreover, on the consumer side, each partition is assigned to at most one consumer within a group, so with 5 consumers you need at least 5 partitions (ideally a multiple of 5 for even assignment), otherwise some consumers will sit idle.

So the best way is to measure and benchmark for your own use case.
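To make the rough calculation concrete, here is a small sketch of the guideline Confluent describes: take the target throughput divided by the measured per-partition throughput on both the producer and consumer side, and keep the larger, also making sure you have at least as many partitions as consumers in the group. The per-partition throughput numbers below are illustrative assumptions, not measurements; you would substitute figures from your own benchmarks.

```python
import math

def estimate_partitions(target_msgs_per_sec: float,
                        producer_msgs_per_sec_per_partition: float,
                        consumer_msgs_per_sec_per_partition: float,
                        num_consumers: int) -> int:
    """Rough partition count: max(t/p, t/c), floored at the consumer count."""
    by_producer = target_msgs_per_sec / producer_msgs_per_sec_per_partition
    by_consumer = target_msgs_per_sec / consumer_msgs_per_sec_per_partition
    # Each partition is consumed by at most one consumer in a group,
    # so we also need at least as many partitions as consumers.
    return max(math.ceil(by_producer), math.ceil(by_consumer), num_consumers)

# Assumed benchmarks: 4000 msgs/s target, a single partition sustains
# 2000 msgs/s on the producer side and 1000 msgs/s on the consumer side,
# with 5 consumers in the group.
print(estimate_partitions(4000, 2000, 1000, 5))  # -> 5
```

Here the consumer parallelism constraint (5 consumers) dominates the throughput estimates, which is why benchmarking both sides matters before picking a number.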