FYI, our cluster is a secured Kafka cluster with SASL_PLAINTEXT and SASL_SSL listeners, and the consumer in question connects over SASL_PLAINTEXT.

The producer is written in Python and produces messages with gzip compression:

"compression_type": "gzip"

The producer is fine and is producing messages; it uses the confluent-kafka library.
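
For reference, the producer setup is roughly the following sketch (broker address, credentials, mechanism, and topic name are placeholders, not our real values; note that confluent-kafka natively uses dotted keys such as compression.type rather than compression_type):

from confluent_kafka import Producer

# Placeholder broker address, credentials, and SASL mechanism.
conf = {
    "bootstrap.servers": "broker1:9093",
    "security.protocol": "SASL_PLAINTEXT",
    "sasl.mechanism": "PLAIN",
    "sasl.username": "user",
    "sasl.password": "password",
    "compression.type": "gzip",  # messages are gzip-compressed by the producer
}

producer = Producer(conf)
producer.produce("my-topic", value=b"payload")  # "my-topic" is a placeholder
producer.flush()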

The consumer uses the aiokafka library. When we start the consumer, it does not consume any messages, and we observe the following in the broker logs:

closing connection (org.apache.kafka.common.network.Selector)
org.apache.kafka.common.network.InvalidReceiveException: Invalid receive (size = 369295617 larger than 524288)
    at org.apache.kafka.common.network.NetworkReceive.readFrom(NetworkReceive.java:105)
    at org.apache.kafka.common.security.authenticator.SaslServerAuthenticator.authenticate(SaslServerAuthenticator.java:257)
    at org.apache.kafka.common.network.KafkaChannel.prepare(KafkaChannel.java:181)
    at org.apache.kafka.common.network.Selector.pollSelectionKeys(Selector.java:543)
    at org.apache.kafka.common.network.Selector.poll(Selector.java:481)
    at kafka.network.Processor.poll(SocketServer.scala:1144)
    at kafka.network.Processor.run(SocketServer.scala:1047)
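
For completeness, the consumer setup is roughly the following sketch (again, the topic, broker address, credentials, SASL mechanism, and group id are placeholders):

import asyncio
from aiokafka import AIOKafkaConsumer

async def consume():
    # Placeholder topic, broker address, credentials, and group id.
    consumer = AIOKafkaConsumer(
        "my-topic",
        bootstrap_servers="broker1:9093",
        security_protocol="SASL_PLAINTEXT",
        sasl_mechanism="PLAIN",
        sasl_plain_username="user",
        sasl_plain_password="password",
        group_id="my-group",
    )
    await consumer.start()
    try:
        async for msg in consumer:
            print(msg.topic, msg.partition, msg.offset)
    finally:
        await consumer.stop()

asyncio.run(consume())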

What is causing this? Any help is appreciated. Thanks.

Should I increase socket.request.max.bytes, as suggested in several Stack Overflow answers? I don't think that is the right fix, though, since it doesn't explain why the issue occurs in the first place.
