Hi,
I’m using Kafka Connect 2.5.0 with kafka-connect-couchbase version 3.4.x.
My bucket holds about 100 million documents, and I need to stream them out to a Kafka topic.
When the connector reaches an offset of around 700k, it always fails with this error:
com.couchbase.client.deps.io.netty.handler.codec.compression.DecompressionException: Offset exceeds size of chunk
Here’s my config:
{
  "name": "cb-connector",
  "config": {
    "connector.class": "com.couchbase.connect.kafka.CouchbaseSourceConnector",
    "connection.password": "password",
    "tasks.max": "1",
    "couchbase.compression": "ENABLED",
    "transforms.channel.static.field": "channel",
    "connection.timeout.ms": "2000",
    "connection.username": "username",
    "couchbase.stream_from": "SAVED_OFFSET_OR_BEGINNING",
    "couchbase.flow_control_buffer": "128m",
    "dcp.message.converter.class":
    "transforms.channel.type": "org.apache.kafka.connect.transforms.InsertField$Value",
    "connection.bucket": "myBucket",
    "event.filter.class": "com.couchbase.connect.kafka.filter.AllPassFilter",
    "name": "cb-connector",
    "topic.name": "myTopic",
    "couchbase.persistence_polling_interval": "100ms",
    "connection.cluster_address": "xx.xx.xx.xx"
  }
}
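In case it matters, I register the connector through the standard Kafka Connect REST API, roughly like this (the worker host/port and the file name are placeholders for my setup):

# POST the JSON config above to the Connect worker's REST endpoint
curl -X POST http://localhost:8083/connectors \
  -H "Content-Type: application/json" \
  -d @cb-connector.json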
Which config setting should I change to handle this kind of issue?
Thanks