Kafka DLQ configuration for JSON messages


I have an SCDF stream with two processors. The goal of the stream is to read JSON from the source topic, process/parse it, and write it into Postgres (jsonb) via JDBC. Often the JSON is so large that processing fails with the following error:

org.apache.kafka.common.errors.RecordTooLargeException: The message is 4277122 bytes when serialized which is larger than the maximum request size you have configured with the max.request.size configuration.

Increasing max.request.size does resolve the error, but my main task is to implement DLQ functionality.
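
For reference, the size increase can be applied per producer binding with a property along these lines (the binding name "output" and the 5 MB value are only illustrative assumptions; the broker/topic-side max.message.bytes limit may also need raising):

spring.cloud.stream.kafka.bindings.output.producer.configuration.max.request.size=5242880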

I tried to configure a Kafka DLQ so that the failing message lands in the DLQ topic. These are the configurations I applied:

spring.cloud.stream.kafka.bindings.input.consumer.enableDlq=true
spring.cloud.stream.kafka.bindings.input.consumer.dlqName=error.dlq
spring.cloud.stream.kafka.bindings.input.consumer.auto-commit-offset=true
spring.cloud.stream.kafka.bindings.input.consumer.auto-commit-on-error=true
spring.cloud.stream.kafka.bindings.input.consumer.dlqProducerProperties.configuration.key.serializer=org.apache.kafka.common.serialization.StringSerializer
spring.cloud.stream.kafka.bindings.input.consumer.dlqProducerProperties.configuration.value.serializer=org.apache.kafka.common.serialization.StringSerializer
spring.cloud.stream.kafka.streams.binder.deserializationExceptionHandler=sendToDlq
spring.cloud.stream.bindings.input.consumer.max-attempts=1
spring.cloud.stream.bindings.input.content-type=application/json
spring.cloud.stream.bindings.input.consumer.headerMode=headers
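
When deploying through SCDF, properties like these are normally passed as deployment properties prefixed with app.<app-name>; for example (the app name "jdbc-processor" is a hypothetical placeholder):

app.jdbc-processor.spring.cloud.stream.kafka.bindings.input.consumer.enableDlq=true
app.jdbc-processor.spring.cloud.stream.kafka.bindings.input.consumer.dlqName=error.dlq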

After running the SCDF stream, the DLQ topic is created in Kafka, but the record that fails with the RecordTooLargeException still does not move to the DLQ topic.
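
The DLQ topic contents can be inspected with the Kafka console consumer; a command along these lines (broker address is illustrative) confirms the topic exists but stays empty:

kafka-console-consumer.sh --bootstrap-server localhost:9092 --topic error.dlq --from-beginning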

Is any configuration missing or entered wrong? I have gone through other similar questions but could not solve the issue.
