Couchbase Kafka Source connector (kerberos)

connector

#1

Hi There,

I’m using the Couchbase Kafka source connector to pull documents from Couchbase and publish them to Kafka topics.

Following are my configurations:

$KAFKA_HOME/config/connect-standalone.properties:

bootstrap.servers=uwd5-xxx-01.xxx.local:6667
key.converter=org.apache.kafka.connect.json.JsonConverter
value.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=true
value.converter.schemas.enable=true
internal.key.converter=org.apache.kafka.connect.json.JsonConverter
internal.value.converter=org.apache.kafka.connect.json.JsonConverter
internal.key.converter.schemas.enable=false
internal.value.converter.schemas.enable=false
offset.storage.file.filename=/tmp/new.connect.offsets
offset.flush.interval.ms=10000
sasl.kerberos.service.name=kafka
sasl.jaas.config=/etc/kafka/kafka-connect-couchbase-3.3.0/client-jaas.conf
security.protocol=PLAINTEXTSASL
consumer.security.protocol=PLAINTEXTSASL
producer.security.protocol=PLAINTEXTSASL
producer.sasl.jaas.config=/etc/kafka/kafka-connect-couchbase-3.3.0/client-jaas.conf
producer.sasl.kerberos.service.name=kafka
consumer.sasl.jaas.config=/etc/kafka/kafka-connect-couchbase-3.3.0/client-jaas.conf
consumer.sasl.kerberos.service.name=kafka
consumer.bootstrap.servers=uwd5-xx-01.xxx.local:6667
producer.bootstrap.servers=uwd5-xx-01.xxx.local:6667

quickstart-couchbase-source.properties:

name=test-couchbase-source
connector.class=com.couchbase.connect.kafka.CouchbaseSourceConnector
tasks.max=2
topic.name=test-default
connection.cluster_address=xxxx-xxxxn-01
connection.timeout.ms=2000
connection.bucket=sample_qa
connection.username=admin
connection.password=xxx
use_snapshots=false
dcp.message.converter.class=com.couchbase.connect.kafka.handler.source.DefaultSchemaSourceHandler
event.filter.class=com.couchbase.connect.kafka.filter.AllPassFilter
couchbase.stream_from=SAVED_OFFSET_OR_BEGINNING

But when I run the connector, it shows the following error:

[2018-06-19 20:58:39,336] INFO Reflections took 5125 ms to scan 76 urls, producing 4621 keys and 32888 values (org.reflections.Reflections:229)
[2018-06-19 20:59:54,627] INFO Poll returns 1 result(s) (com.couchbase.connect.kafka.CouchbaseSourceTask:175)
[2018-06-19 21:00:01,131] ERROR Failed to flush WorkerSourceTask{id=test-couchbase-source-1}, timed out while waiting for producer to flush outstanding 1 messages (org.apache.kafka.connect.runtime.WorkerSourceTask:289)
[2018-06-19 21:00:01,131] ERROR Failed to commit offsets for WorkerSourceTask{id=test-couchbase-source-1} (org.apache.kafka.connect.runtime.SourceTaskOffsetCommitter:109)
^C[2018-06-19 21:00:08,481] INFO Kafka Connect stopping (org.apache.kafka.connect.runtime.Connect:68)

It seems like the Connect producer is not getting authenticated and is unable to send messages to the target Kafka cluster broker. Please note that my target Kafka cluster is kerberized and listening on port 6667.

Any help would be appreciated.

Regards


#2

Hi Ranjit,

I’m not familiar with kerberized deployments myself, but this makes me wonder if it’s a problem with how the SASL config properties are specified: as far as I can tell, sasl.jaas.config expects the JAAS login configuration inline as the property value, not a path to a file.

Thanks,
David
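For reference, there are two common ways to supply Kerberos credentials to a Kafka client: an external JAAS file passed to the JVM via a system property, or the inline sasl.jaas.config property. A rough sketch of the two (the path, keytab, and principal here are placeholders, not values from a real deployment):

Option 1 — external JAAS file, passed to the JVM before starting Connect:

export KAFKA_OPTS="-Djava.security.auth.login.config=/etc/kafka/client-jaas.conf"

Option 2 — inline JAAS configuration as the property value itself:

sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
  useKeyTab=true \
  storeKey=true \
  keyTab="/etc/security/keytabs/kafka_client.keytab" \
  principal="connect@EXAMPLE.COM";

With option 2, the property value is the JAAS login-module stanza (ending in a semicolon), which is why pointing it at a file path would not authenticate.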


#3

OK, I followed this document and the issue got resolved:

https://docs.confluent.io/current/kafka/authentication_sasl_gssapi.html#kafka-connect

So basically, in addition to having the configuration in the JAAS file, I ended up using the format below:
sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
  useKeyTab=true \
  storeKey=true \
  keyTab="/etc/security/keytabs/kafka_client.keytab" \
  principal="connect@EXAMPLE.COM";
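Since the worker config in post #1 also pointed the producer.- and consumer.-prefixed SASL properties at a file path, the same inline format would presumably be applied there as well — a sketch, reusing the keytab and principal from above:

producer.sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
  useKeyTab=true \
  storeKey=true \
  keyTab="/etc/security/keytabs/kafka_client.keytab" \
  principal="connect@EXAMPLE.COM";
consumer.sasl.jaas.config=com.sun.security.auth.module.Krb5LoginModule required \
  useKeyTab=true \
  storeKey=true \
  keyTab="/etc/security/keytabs/kafka_client.keytab" \
  principal="connect@EXAMPLE.COM";

The prefixed properties configure the embedded producer (which writes source records to the topic) and consumer separately from the worker's own client settings.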