Since Debezium 2.0 removed io.confluent.connect.avro.AvroConverter, I tried to inject it directly, but it keeps giving me the following error:
Here is my simplified deployment:
apiVersion: apps/v1
kind: Deployment
metadata:
  name: debezium
spec:
  template:
    spec:
      initContainers:
        - name: download-kafka-connect-plugins
          image: alpine:3.16.2
          command:
            - sh
          args:
            - -c
            - |
              wget https://packages.confluent.io/maven/io/confluent/kafka-connect-avro-converter/7.2.2/kafka-connect-avro-converter-7.2.2.jar
              cp kafka-connect-avro-converter-7.2.2.jar /kafka/connect-plugins
          volumeMounts:
            - name: kafka-connect-plugins
              mountPath: /kafka/connect-plugins
      volumes:
        - name: kafka-connect-plugins
          emptyDir: {}
      containers:
        - name: debezium-intelipost
          image: debezium/connect:2.0.0.Final
          volumeMounts:
            - name: kafka-connect-plugins
              mountPath: /kafka/connect-plugins
          env:
            - name: KAFKA_CONNECT_PLUGINS_DIR
              value: /kafka/connect-plugins
I also tried to inject the kafka-schema-registry-client and many other jars, but I could not make it work. Has anyone had a similar issue?
Jonathan.
Roughly: you need to download the converter from https://www.confluent.io/hub/confluentinc/kafka-connect-avro-converter, and you also need to add two of its dependency jars: guava and failureaccess. Place all of these libraries in KAFKA_CONNECT_PLUGINS_DIR. This will solve your problem. You do not need to change your image from debezium/connect to debezium/connect-base.
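As a sketch, the initContainer script from the question could be extended along these lines. The exact guava and failureaccess versions below are assumptions; pick the versions that match the converter release you download.

```shell
#!/bin/sh
# Sketch of an extended initContainer script (jar versions are assumptions):
# fetch the Avro converter plus the guava and failureaccess jars it needs,
# all into the directory that KAFKA_CONNECT_PLUGINS_DIR points at.
PLUGIN_DIR=${PLUGIN_DIR:-/kafka/connect-plugins}

JARS="
https://packages.confluent.io/maven/io/confluent/kafka-connect-avro-converter/7.2.2/kafka-connect-avro-converter-7.2.2.jar
https://repo1.maven.org/maven2/com/google/guava/guava/31.1-jre/guava-31.1-jre.jar
https://repo1.maven.org/maven2/com/google/guava/failureaccess/1.0.1/failureaccess-1.0.1.jar
"

for url in $JARS; do
  # -P drops each jar directly into the plugin directory
  wget -P "$PLUGIN_DIR" "$url" || echo "could not download $url"
done
```

With the jars in place, keep the rest of the Deployment as-is: the KAFKA_CONNECT_PLUGINS_DIR env var already tells the debezium/connect image where to pick them up.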