Spring Boot Kafka - custom headers not getting deserialized properly


I am reading data from Confluent Kafka Cloud.

I am using a batch listener, and I have not configured a deserializer explicitly.

My message payload and the standard Kafka headers are deserialized properly, but the headers under "kafka_batchConvertedHeaders" are not; these contain the custom headers added by the producer.

In the Confluent Cloud dashboard the custom headers look correct, but in the listener the header values show up as raw byte array objects.

e.g. header 1 = [B@3cdba1db, header 2 = [B@4ab06ebe
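
For illustration only (not my exact code), a batch listener of this shape shows the behaviour; the topic name, group id, payload type, and header names are placeholders, and the batch attribute assumes Spring Kafka 2.8+ (on older versions the batch flag is set on the container factory instead):

    import java.util.List;
    import java.util.Map;

    import org.springframework.kafka.annotation.KafkaListener;
    import org.springframework.kafka.support.KafkaHeaders;
    import org.springframework.messaging.handler.annotation.Header;
    import org.springframework.stereotype.Component;

    @Component
    public class MyBatchListener {

        // Batch listener; "my-topic" and "my-group" are placeholders.
        @KafkaListener(topics = "my-topic", groupId = "my-group", batch = "true")
        public void listen(List<String> payloads,
                @Header(KafkaHeaders.BATCH_CONVERTED_HEADERS) List<Map<String, Object>> headers) {
            // One map of headers per record in the batch; the custom header values
            // arrive as byte[] and therefore print as [B@... via toString().
            headers.forEach(recordHeaders -> System.out.println(recordHeaders.get("header 1")));
        }
    }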


1 Answer


You need to show your code and configuration. If the source of the data is not Spring, or you don't have Jackson on the class path of the sender or receiver, you will need to properly configure the header mapper to convert byte[] to String.

See https://docs.spring.io/spring-kafka/docs/current/reference/html/#headers

The SimpleKafkaHeaderMapper maps raw headers as byte[], with configuration options for conversion to String values. ...
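
As a rough sketch of such a configuration (not a definitive implementation; the header names, generic types, and bean wiring are placeholders and may need adjusting for your Spring Kafka version): both SimpleKafkaHeaderMapper and the Jackson-based DefaultKafkaHeaderMapper inherit setRawMappedHeaders(Map<String, Boolean>), where a value of true maps the inbound raw byte[] header value to a String. The mapper is then set on the batch message converter used by the listener container factory:

    import java.util.Map;

    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;
    import org.springframework.kafka.config.ConcurrentKafkaListenerContainerFactory;
    import org.springframework.kafka.core.ConsumerFactory;
    import org.springframework.kafka.support.SimpleKafkaHeaderMapper;
    import org.springframework.kafka.support.converter.BatchMessagingMessageConverter;

    @Configuration
    public class KafkaHeaderMappingConfig {

        @Bean
        public ConcurrentKafkaListenerContainerFactory<String, String> kafkaListenerContainerFactory(
                ConsumerFactory<String, String> consumerFactory) {

            ConcurrentKafkaListenerContainerFactory<String, String> factory =
                    new ConcurrentKafkaListenerContainerFactory<>();
            factory.setConsumerFactory(consumerFactory);
            factory.setBatchListener(true);

            // Map the producer's raw byte[] values for these headers to String on the
            // inbound side; "header 1" and "header 2" are placeholders for the custom
            // header names shown in the question.
            SimpleKafkaHeaderMapper headerMapper = new SimpleKafkaHeaderMapper();
            headerMapper.setRawMappedHeaders(Map.of("header 1", true, "header 2", true));

            BatchMessagingMessageConverter converter = new BatchMessagingMessageConverter();
            converter.setHeaderMapper(headerMapper);
            factory.setMessageConverter(converter);

            return factory;
        }
    }

Note that defining a bean named kafkaListenerContainerFactory replaces the one Spring Boot auto-configures, so any other factory settings you rely on would need to be carried over here as well.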