How to configure Kafka for ClickHouse without an error being thrown?


I have two Kafka clusters. The first, Kafka-A, uses the SASL SCRAM-SHA-256 mechanism for authentication; the other, Kafka-B, has no authentication configured.

To be able to connect to Kafka-A from ClickHouse, I configured config.xml as shown below.

My config.xml configuration:
<kafka>
    <security_protocol>sasl_plaintext</security_protocol>
    <sasl_mechanism>SCRAM-SHA-256</sasl_mechanism>
    <sasl_username>xxx</sasl_username>
    <sasl_password>xxx</sasl_password>
    <debug>all</debug>
    <auto_offset_reset>latest</auto_offset_reset>
    <compression_type>snappy</compression_type> 
</kafka>

At this point I found that I can't connect to Kafka-B using a Kafka engine table. When I try, an error occurs with the following message:

StorageKafka (xxx): [rdk:FAIL] [thrd:sasl_plaintext://xxx/bootstrap]: sasl_plaintext://xxx/bootstrap: SASL SCRAM-SHA-256 mechanism handshake failed: Broker: Request not valid in current SASL state: broker's supported mechanisms: (after 3ms in state AUTH_HANDSHAKE, 4 identical error(s) suppressed)


It seems that when connecting to Kafka-B, ClickHouse also uses SASL authentication, which leads to the error being thrown, since the Kafka-B brokers are not configured with authentication.

How can I configure ClickHouse correctly so that it can connect to both Kafka clusters?


There is 1 answer below.

Denny Crane

CH allows you to define a Kafka config for each topic.

Use the topic name in the name of the XML section:

<kafka_mytopic>
    <security_protocol>....
    ....
</kafka_mytopic>
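
Applied to the setup in the question, a minimal sketch could look like the following. The topic names topic_a (consumed from Kafka-A) and topic_b (consumed from Kafka-B) are hypothetical placeholders; the idea is to keep only cluster-agnostic settings in the global <kafka> section and to scope the SASL settings to the topic that lives on Kafka-A:

<!-- Global settings: applied to every Kafka engine table -->
<kafka>
    <debug>all</debug>
    <auto_offset_reset>latest</auto_offset_reset>
    <compression_type>snappy</compression_type>
</kafka>

<!-- SASL settings only for the topic consumed from Kafka-A ("topic_a" is a placeholder) -->
<kafka_topic_a>
    <security_protocol>sasl_plaintext</security_protocol>
    <sasl_mechanism>SCRAM-SHA-256</sasl_mechanism>
    <sasl_username>xxx</sasl_username>
    <sasl_password>xxx</sasl_password>
</kafka_topic_a>

<!-- No section is needed for the Kafka-B topic: with no SASL settings inherited,
     librdkafka falls back to its default plaintext security protocol -->

With the SASL settings scoped to the Kafka-A topic, the Kafka engine table that reads from Kafka-B no longer attempts a SASL handshake, so the error should disappear. Depending on the ClickHouse version, a nested <kafka_topic><name>...</name></kafka_topic> form may also be available, which helps when the topic name is not a valid XML element name.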