CancellationException occurs when connecting to a bucket

Hi, I have installed Couchbase Analytics DP4 on my local machine. I have a bucket named reporting, with a user of the same name that has Admin privileges and the password 123456.

I went through following steps on Analytics Query Editor:

  1. CREATE BUCKET reporting WITH {
    "name": "reporting",
    "nodes": ["localhost"]
    };

  2. CREATE SHADOW DATASET movements ON reporting WHERE type = "Person";

  3. CONNECT BUCKET reporting WITH { "password": "123456", "timeout": 2500 };
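For reference, statements like these can also be submitted over HTTP. The sketch below targets the /query/service path on port 8095 that appears in the log output later in this thread; the exact endpoint and form encoding are assumptions about the DP4 REST interface, and build_request only constructs the request so it can be inspected without a live node.

```python
# Sketch (assumptions noted above): build a POST request carrying one
# Analytics statement, as one might send to the Analytics HTTP service.
import urllib.parse
import urllib.request

ANALYTICS_URL = "http://localhost:8095/query/service"  # assumed endpoint


def build_request(statement: str) -> urllib.request.Request:
    """Construct (but do not send) a request for a single statement."""
    body = urllib.parse.urlencode({"statement": statement}).encode("utf-8")
    return urllib.request.Request(
        ANALYTICS_URL,
        data=body,
        headers={"Content-Type": "application/x-www-form-urlencoded"},
    )


statements = [
    'CREATE BUCKET reporting WITH { "name": "reporting" };',
    'CREATE SHADOW DATASET movements ON reporting WHERE type = "Person";',
    'CONNECT BUCKET reporting WITH { "password": "123456", "timeout": 2500 };',
]
requests = [build_request(s) for s in statements]
# urllib.request.urlopen(req) would execute each against a running node.
```
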

The first two DDL statements work fine, but the third one gives the following error:

[
  {
    "code": 1,
    "msg": "CancellationException",
    "query_from_user": "CONNECT BUCKET `reporting` WITH { \"password\": \"123456\", \"timeout\": 2500 };"
  }
]

Please help.

Hi there krishan.jangid,
The timeout you specified was not long enough for the connection operation to complete, so the connection attempt was cancelled.

Can you try connecting with the default timeout? Just omit the timeout parameter and see if the statement succeeds.
The default timeout is 60000 ms, i.e. 1 minute.
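To make the suggestion concrete, here is a small sketch (Python is used only to build the statement string; the timeout value is in milliseconds):

```python
# Build a CONNECT statement that passes the default timeout explicitly.
# 60000 ms = 60 s = 1 minute; omitting "timeout" should have the same effect.
DEFAULT_TIMEOUT_MS = 60 * 1000
statement = f'CONNECT BUCKET reporting WITH {{ "timeout": {DEFAULT_TIMEOUT_MS} }};'
print(statement)  # CONNECT BUCKET reporting WITH { "timeout": 60000 };
```
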

Let us know if you can connect successfully then.

If that doesn’t work, then we might need to look at the log file to determine the problem.
~Abdullah.

krishan.jangid,
We also noticed the following:
You’re using the DP4 Couchbase Analytics, and we assume that the Analytics service runs on the same Couchbase cluster as the data service.
In that case, you shouldn’t need to specify "nodes" when creating the bucket, and you don’t need to specify a password when connecting.
If our assumptions are correct, then the statements become:

CREATE BUCKET reporting;
CREATE SHADOW DATASET movements ON reporting WHERE type = "Person";
CONNECT BUCKET reporting;

However, if your data is in a different Couchbase cluster, then you’re doing it correctly.

Cheers,
Abdullah.

Hi Abdullah, I followed the steps you mentioned but am still getting the same error.

I extracted the relevant part of the log file (the part that made sense to me :) ) instead of pasting the entire content. Please take a look at this:

2018-01-23T12:16:55.379+05:30 INFO CBAS.dataset.DatasetPartitionWriter [Executor-18:7c797d9e05a157f75cdd5e8040daac33] open(0)
2018-01-23T12:16:55.381+05:30 INFO CBAS.work.WorkQueue [Worker:ClusterController] Executing: RegisterResultPartitionLocation: JobId@JID:13 ResultSetId@RSID:0 Partition@0 NPartitions@1 ResultPartitionLocation@127.0.0.1:57789 OrderedResult@true EmptyResult@false
2018-01-23T12:16:55.382+05:30 INFO CBAS.dataset.DatasetPartitionWriter [Executor-18:7c797d9e05a157f75cdd5e8040daac33] close(0)
2018-01-23T12:16:55.387+05:30 INFO CBAS.work.WorkQueue [Worker:ClusterController] Executing: ReportResultPartitionWriteCompletion: JobId@JID:13 ResultSetId@RSID:0 Partition@0
2018-01-23T12:16:55.387+05:30 INFO CBAS.work.WorkQueue [Worker:7c797d9e05a157f75cdd5e8040daac33] Executing: NotifyTaskCompleteWork:TAID:TID:ANID:ODID:6:0:0:0
2018-01-23T12:16:55.388+05:30 INFO CBAS.work.WorkQueue [Worker:ClusterController] Executing: TaskComplete: [7c797d9e05a157f75cdd5e8040daac33[JID:13:TAID:TID:ANID:ODID:6:0:0:0]
2018-01-23T12:16:55.388+05:30 INFO CBAS.executor.JobExecutor [Worker:ClusterController] Runnable TC roots: [], inProgressTaskClusters: []
2018-01-23T12:16:55.388+05:30 INFO CBAS.work.WorkQueue [Worker:ClusterController] Executing: JobCleanup: JobId@JID:13 Status@TERMINATED
2018-01-23T12:16:55.388+05:30 INFO CBAS.work.JobCleanupWork [Worker:ClusterController] Cleanup for JobRun with id: JID:13
2018-01-23T12:16:55.388+05:30 INFO CBAS.work.WorkQueue [Worker:7c797d9e05a157f75cdd5e8040daac33] Executing: CleanupJoblet
2018-01-23T12:16:55.388+05:30 INFO CBAS.work.CleanupJobletWork [Worker:7c797d9e05a157f75cdd5e8040daac33] Cleaning up after job: JID:13
2018-01-23T12:16:55.388+05:30 WARN CBAS.nc.Joblet [Worker:7c797d9e05a157f75cdd5e8040daac33] Freeing leaked 524288 bytes
2018-01-23T12:16:55.389+05:30 INFO CBAS.work.WorkQueue [Worker:ClusterController] Executing: JobletCleanupNotification
2018-01-23T12:16:55.389+05:30 INFO CBAS.active.ActiveNotificationHandler [Worker:ClusterController] Getting notified of job finish for JobId: JID:13
2018-01-23T12:16:55.389+05:30 INFO CBAS.active.ActiveNotificationHandler [Worker:ClusterController] NO NEED TO NOTIFY JOB FINISH!
2018-01-23T12:16:55.390+05:30 INFO CBAS.work.WorkQueue [Worker:7c797d9e05a157f75cdd5e8040daac33] Executing: ApplicationMessage: nodeId: 7c797d9e05a157f75cdd5e8040daac33
2018-01-23T12:16:55.390+05:30 INFO CBAS.messaging.NCMessageBroker [Worker:7c797d9e05a157f75cdd5e8040daac33] Received message: ExecuteStatementResponseMessage(id=23): 54 characters
2018-01-23T12:16:55.391+05:30 INFO CBAS.work.WorkQueue [Worker:ClusterController] Executing: GetResultPartitionLocations: JobId@JID:13 ResultSetId@RSID:0 Known@null
2018-01-23T12:16:55.393+05:30 INFO CBAS.dataset.DatasetPartitionReader [Executor-16:7c797d9e05a157f75cdd5e8040daac33] Result Reader read + 32768 bytes
2018-01-23T12:16:55.394+05:30 INFO CBAS.dataset.DatasetPartitionReader [Executor-16:7c797d9e05a157f75cdd5e8040daac33] result reading successful(JID:13:RSID:0:0)
2018-01-23T12:17:08.018+05:30 INFO CBAS.auth.AuthorizationProvider [HttpExecutor(port:8095)-8] user authenticated as: {"@class":".User","id":"Administrator","roles":null,"name":null,"domain":"ADMIN","permissions":{"cluster.bucket[reporting].analytics!manage":true,"cluster.analytics!read":true,"cluster.settings!write":true,"cluster.bucket[reporting].data.docs!read":true,"cluster.settings!read":true},"analyticsReader":true,"readOnlyAdmin":true,"admin":true}
2018-01-23T12:17:08.018+05:30 INFO CBAS.server.QueryServiceServlet [HttpExecutor(port:8095)-8] {"host":"localhost:8091","path":"/query/service","statement":"CONNECT BUCKET reporting;","pretty":false,"mode":null,"clientContextID":"bd22182a-4628-4ebb-aa2a-9daacd0f59c2","format":null}
2018-01-23T12:17:08.019+05:30 INFO CBAS.work.WorkQueue [Worker:ClusterController] Executing: ApplicationMessage: nodeID: 7c797d9e05a157f75cdd5e8040daac33
2018-01-23T12:17:08.020+05:30 INFO CBAS.messaging.CCMessageBroker [Executor-2:ClusterController] Received message: ExecuteStatementRequestMessage(id=24, from=7c797d9e05a157f75cdd5e8040daac33): CONNECT BUCKET reporting;;
2018-01-23T12:17:08.030+05:30 INFO CBAS.active.ActiveNotificationHandler [Executor-2:ClusterController] getActiveEntityListener(EntityId entityId) was called with entity Default.reporting(CouchbaseMetadataExtension)
2018-01-23T12:17:08.030+05:30 INFO CBAS.active.ActiveNotificationHandler [Executor-2:ClusterController] Listener found: null
2018-01-23T12:17:08.082+05:30 INFO CBAS.active.ActiveNotificationHandler [Executor-2:ClusterController] registerListener(IActiveEntityEventsListener listener) was called for the entity Default.reporting(CouchbaseMetadataExtension)
2018-01-23T12:17:08.082+05:30 INFO CBAS.active.ActiveEntityEventsListener [Executor-2:ClusterController] State of Default.reporting(CouchbaseMetadataExtension)is being set to STARTING from STOPPED
2018-01-23T12:17:08.096+05:30 INFO CBAS.work.WorkQueue [Worker:ClusterController] Executing: GetNodeControllersInfo
2018-01-23T12:17:08.327+05:30 INFO CBAS.util.EventLoopProvider [Executor-2:ClusterController] Initializing Couchbase Environment
2018-01-23T12:17:08.435+05:30 ERRO CBAS.conductor.HttpStreamingConfigProvider [Executor-2:ClusterController] Adding a config node 127.0.0.1:8091:8091
2018-01-23T12:17:08.453+05:30 INFO CBAS.dcp.Client [Executor-2:ClusterController] Connecting to seed nodes and bootstrapping bucket reporting.
2018-01-23T12:17:08.488+05:30 INFO CBAS.conductor.HttpStreamingConfigProvider [Executor-2:ClusterController] Getting bucket config from 127.0.0.1:8091
2018-01-23T12:17:08.497+05:30 INFO CBAS.auth.AuthorizationProvider [Executor-2:ClusterController] Getting credentials for 127.0.0.1:8091
2018-01-23T12:17:18.400+05:30 WARN CBAS.conductor.HttpStreamingConfigProvider [Executor-2:ClusterController] Failed getting bucket config
java.util.concurrent.CancellationException: null
	at com.couchbase.client.deps.io.netty.util.concurrent.DefaultPromise.cancel(...)(Unknown Source) ~[core-io-1.5.1.jar:?]
2018-01-23T12:17:18.401+05:30 WARN CBAS.adapter.CouchbaseConnectorFactory [Executor-2:ClusterController] Failed getting the number of vbuckets
java.util.concurrent.CancellationException: null
	at com.couchbase.client.deps.io.netty.util.concurrent.DefaultPromise.cancel(...)(Unknown Source) ~[core-io-1.5.1.jar:?]
2018-01-23T12:17:18.402+05:30 WARN CBAS.lang.ConnectBucketStatement [Executor-2:ClusterController] Failed to connect bucket Bucket:Default.reporting
org.apache.hyracks.algebricks.common.exceptions.AlgebricksException: java.util.concurrent.CancellationException
	at com.couchbase.analytics.lang.BucketDatasource.buildDatasourceScanRuntime(BucketDatasource.java:140) ~[cbas-connector-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
	at org.apache.asterix.metadata.declared.MetadataProvider.getScannerRuntime(MetadataProvider.java:380) ~[asterix-metadata-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
	at org.apache.hyracks.algebricks.core.algebra.operators.physical.DataSourceScanPOperator.contributeRuntimeOperator(DataSourceScanPOperator.java:112) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
	at org.apache.hyracks.algebricks.core.algebra.operators.logical.AbstractLogicalOperator.contributeRuntimeOperator(AbstractLogicalOperator.java:166) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
	at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:96) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
	at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
	at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
	at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
	at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
	at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
	at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
	at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
	at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compilePlan(PlanCompiler.java:59) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
	at org.apache.hyracks.algebricks.compiler.api.HeuristicCompilerFactoryBuilder$1$1.createJob(HeuristicCompilerFactoryBuilder.java:107) ~[algebricks-compiler-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]

Thank you…
Is the data node part of the same cluster?

If the data nodes are in the same cluster, then please:

  1. Drop the bucket and define it again without the "nodes" parameter.
  2. Try the CONNECT statement without the WITH clause; don’t pass the password parameter.

CREATE BUCKET reporting;
CREATE SHADOW DATASET movements ON reporting WHERE type = "Person";
CONNECT BUCKET reporting;

If the data nodes are in a different cluster, then is the other cluster a Spock or Watson cluster?

The data node is on the same cluster. I did exactly what you said: dropped the dataset, dropped the bucket, and then recreated them with the queries you posted.
I am using a Windows machine.

Interesting… The logs you posted show the statement used to connect the bucket as:
CONNECT BUCKET reporting WITH {"password":"123456"};

Apologies, I posted the wrong portion of the log file. I’ve updated it in the previous comment.

I’ve started facing another problem now: the Analytics query service is returning error 500. I’ve tried refreshing the web console page and restarting the Couchbase server through Services, but nothing seems to work.
Error: Received error 500 when contacting the analytics service on this node. You can try refreshing the browser, or connecting to another node (if available).

Please take a look at the trailing logs:

152:stderr:2018-01-23T12:47:13.671+05:30 INFO CBAS.nc.NodeControllerService [ShutdownHook-7c797d9e05a157f75cdd5e8040daac33] Stopping NodeControllerService
,
186:stderr:2018-01-23T12:47:13.672+05:30 WARN CBAS.active.ActiveManager [ShutdownHook-7c797d9e05a157f75cdd5e8040daac33] Shutting down ActiveManager on node 7c797d9e05a157f75cdd5e8040daac33
,
190:stderr:2018-01-23T12:47:13.716+05:30 WARN CBAS.active.ActiveManager [ShutdownHook-7c797d9e05a157f75cdd5e8040daac33] Shutdown ActiveManager on node 7c797d9e05a157f75cdd5e8040daac33 complete
,
176:stderr:2018-01-23T12:47:13.717+05:30 WARN CBAS.dataset.DatasetPartitionManager [Executor-5:7c797d9e05a157f75cdd5e8040daac33] Result cleaner thread interrupted, shutting down.
,
187:stderr:2018-01-23T12:47:13.768+05:30 INFO CBAS.bootstrap.NCApplication [ShutdownHook-7c797d9e05a157f75cdd5e8040daac33] Stopping Asterix node controller: 7c797d9e05a157f75cdd5e8040daac33
,
[goport(c:/Program Files/Couchbase/Server Analytics/bin/cbas.exe)] 2018/01/23 12:47:14 Timeout while flushing stderr
2:ok,
[goport(c:/Program Files/Couchbase/Server Analytics/bin/cbas.exe)] 2018/01/23 12:47:14 Timeout while flushing stderr
2018-01-23T12:48:19.143+05:30 INFO CBAS.cbas ++++++++++++ starting cbas main ++++++++++++
2018-01-23T12:48:19.144+05:30 INFO CBAS.cbas uuid: 7c797d9e05a157f75cdd5e8040daac33
2018-01-23T12:48:20.965+05:30 INFO CBAS.cbas setting java.home to bundled jre (c:/Program Files/Couchbase/Server Analytics/lib/cbas\runtime)
2018/01/23 13:53:57 revrpc: Got error (EOF) and will retry in 1s
2018/01/23 13:53:57 revrpc: Got error (EOF) and will retry in 1s
[goport(c:/Program Files/Couchbase/Server Analytics/bin/cbas.exe)] 2018/01/23 13:53:57 killing (-9) child...
2:ok,
2018-01-23T13:57:35.352+05:30 INFO CBAS.cbas ++++++++++++ starting cbas main ++++++++++++
2018-01-23T13:57:35.353+05:30 INFO CBAS.cbas uuid: 7c797d9e05a157f75cdd5e8040daac33
2018-01-23T13:57:43.247+05:30 INFO CBAS.cbas setting java.home to bundled jre (c:/Program Files/Couchbase/Server Analytics/lib/cbas\runtime)

Got it. For some unclear (probably environment-specific) reason, the connection is timing out.
We will get back to you on this as well as on the new issue.

In the meantime, could you help us by running cbcollect_info [1] and attaching the output to an issue on issues.couchbase.com?

[1] http://docs.couchbase.com/prerelease/analytics-dp4/troubleshoot.html

I’ve noticed that http://localhost:8095 is not accessible (I guess this port is used by the Analytics service). Previously, when I had newly installed the server, it was accessible and led to a blank page. Now the page shows a connection error. The firewall is disabled. Hope this helps.
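As a quick way to confirm whether anything is listening on the Analytics port without a browser, a generic sketch (nothing here is Couchbase-specific; 8095 is just the port mentioned above):

```python
# Check whether a TCP port is accepting connections.
import socket


def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


# Example: is_port_open("localhost", 8095)
```
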

Thank you krishan.jangid,
It would be very helpful to get the complete cbcollect_info output. A lot of the information there can be helpful for us to determine what went wrong for both issues.

If you can, try the Docker image, if only to see whether it is an environment issue: http://docs.couchbase.com/prerelease/analytics-dp4/quick-start.html

One other option is to do a fresh installation and try again (although it would be nice to get the logs first).

Finally, what version of Windows are you running? We would like to give this a try ourselves.

@amoudi, I installed the same version on another system (with the same configuration), and the same problem (error 500) occurs there as well. I am using Windows 10 Pro (64-bit) on both systems. I requested a JIRA account yesterday and got an invite this morning. I’ve created an issue with the same title as this thread and attached the cbcollect_info output there.

I’m seeing the same connection error:

[
  {
    "code": 1,
    "msg": "CancellationException",
    "query_from_user": "CONNECT BUCKET beerBucket;"
  }
]

I am running through the tutorial at http://docs.couchbase.com/prerelease/analytics-dp4/primer-beer.html.
Whenever I try to run CONNECT BUCKET beerBucket; I get the above error.
This is the error I see in the analytics.log file:

2018-04-05T14:10:18.390-06:00 WARN CBAS.apache.asterix [HttpExecutor(port:8095)-13] org.apache.hyracks.algebricks.common.exceptions.AlgebricksException: java.util.concurrent.CancellationException
org.apache.hyracks.api.exceptions.HyracksDataException: org.apache.hyracks.algebricks.common.exceptions.AlgebricksException: java.util.concurrent.CancellationException
    at org.apache.hyracks.api.exceptions.HyracksDataException.create(HyracksDataException.java:48) ~[hyracks-api-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at com.couchbase.analytics.lang.ConnectBucketStatement.doConnect(ConnectBucketStatement.java:503) ~[cbas-connector-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at com.couchbase.analytics.metadata.BucketEventsListener.doConnect(BucketEventsListener.java:255) ~[cbas-connector-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at com.couchbase.analytics.metadata.BucketEventsListener.doStart(BucketEventsListener.java:233) ~[cbas-connector-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.asterix.app.active.ActiveEntityEventsListener.start(ActiveEntityEventsListener.java:378) ~[asterix-app-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at com.couchbase.analytics.lang.ConnectBucketStatement.connect(ConnectBucketStatement.java:387) ~[cbas-connector-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at com.couchbase.analytics.lang.ConnectBucketStatement.handle(ConnectBucketStatement.java:337) ~[cbas-connector-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.asterix.app.translator.QueryTranslator.compileAndExecute(QueryTranslator.java:401) ~[asterix-app-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.asterix.app.message.ExecuteStatementRequestMessage.handle(ExecuteStatementRequestMessage.java:125) ~[asterix-app-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.asterix.messaging.CCMessageBroker.receivedMessage(CCMessageBroker.java:65) ~[asterix-app-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.control.cc.work.ApplicationMessageWork$1.run(ApplicationMessageWork.java:59) ~[hyracks-control-cc-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149) [?:1.8.0_152]
    at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624) [?:1.8.0_152]
    at java.lang.Thread.run(Thread.java:748) [?:1.8.0_152]
Caused by: org.apache.hyracks.algebricks.common.exceptions.AlgebricksException: java.util.concurrent.CancellationException
    at com.couchbase.analytics.lang.BucketDatasource.buildDatasourceScanRuntime(BucketDatasource.java:140) ~[cbas-connector-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.asterix.metadata.declared.MetadataProvider.getScannerRuntime(MetadataProvider.java:380) ~[asterix-metadata-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.algebra.operators.physical.DataSourceScanPOperator.contributeRuntimeOperator(DataSourceScanPOperator.java:112) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.algebra.operators.logical.AbstractLogicalOperator.contributeRuntimeOperator(AbstractLogicalOperator.java:166) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:96) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compilePlan(PlanCompiler.java:59) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.compiler.api.HeuristicCompilerFactoryBuilder$1$1.createJob(HeuristicCompilerFactoryBuilder.java:107) ~[algebricks-compiler-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.asterix.api.common.APIFramework.compileQuery(APIFramework.java:355) ~[asterix-app-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.asterix.app.translator.QueryTranslator.rewriteCompileQuery(QueryTranslator.java:1868) ~[asterix-app-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at com.couchbase.analytics.lang.ConnectBucketStatement.doConnect(ConnectBucketStatement.java:487) ~[cbas-connector-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    ... 12 more
Caused by: org.apache.hyracks.api.exceptions.HyracksDataException: java.util.concurrent.CancellationException
    at org.apache.hyracks.api.exceptions.HyracksDataException.create(HyracksDataException.java:48) ~[hyracks-api-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at com.couchbase.analytics.adapter.CouchbaseConnectorFactory.fetchNumOfBucketPartitions(CouchbaseConnectorFactory.java:208) ~[cbas-connector-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at com.couchbase.analytics.adapter.CouchbaseConnectorFactory.configure(CouchbaseConnectorFactory.java:138) ~[cbas-connector-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.asterix.external.adapter.factory.GenericAdapterFactory.configure(GenericAdapterFactory.java:148) ~[asterix-external-data-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.asterix.external.provider.AdapterFactoryProvider.getAdapterFactory(AdapterFactoryProvider.java:49) ~[asterix-external-data-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at com.couchbase.analytics.lang.BucketDatasource.buildDatasourceScanRuntime(BucketDatasource.java:132) ~[cbas-connector-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.asterix.metadata.declared.MetadataProvider.getScannerRuntime(MetadataProvider.java:380) ~[asterix-metadata-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.algebra.operators.physical.DataSourceScanPOperator.contributeRuntimeOperator(DataSourceScanPOperator.java:112) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.algebra.operators.logical.AbstractLogicalOperator.contributeRuntimeOperator(AbstractLogicalOperator.java:166) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:96) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compileOpRef(PlanCompiler.java:83) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.core.jobgen.impl.PlanCompiler.compilePlan(PlanCompiler.java:59) ~[algebricks-core-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.hyracks.algebricks.compiler.api.HeuristicCompilerFactoryBuilder$1$1.createJob(HeuristicCompilerFactoryBuilder.java:107) ~[algebricks-compiler-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.asterix.api.common.APIFramework.compileQuery(APIFramework.java:355) ~[asterix-app-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at org.apache.asterix.app.translator.QueryTranslator.rewriteCompileQuery(QueryTranslator.java:1868) ~[asterix-app-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    at com.couchbase.analytics.lang.ConnectBucketStatement.doConnect(ConnectBucketStatement.java:487) ~[cbas-connector-1.0.0-cbas-dp4.jar:1.0.0-cbas-dp4]
    ... 12 more
Caused by: java.util.concurrent.CancellationException
    at com.couchbase.client.deps.io.netty.util.concurrent.DefaultPromise.cancel(...)(Unknown Source) ~[core-io-1.5.1.jar:?]

This appears to be relevant.

2018-04-05T14:25:40.089-06:00 WARN CBAS.conductor.HttpStreamingConfigProvider [Executor-6:ClusterController] Failed getting bucket config
java.util.concurrent.CancellationException: null
    at com.couchbase.client.deps.io.netty.util.concurrent.DefaultPromise.cancel(...)(Unknown Source) ~[core-io-1.5.1.jar:?]
2018-04-05T14:25:40.089-06:00 WARN CBAS.adapter.CouchbaseConnectorFactory [Executor-6:ClusterController] Failed getting the number of vbuckets
java.util.concurrent.CancellationException: null
    at com.couchbase.client.deps.io.netty.util.concurrent.DefaultPromise.cancel(...)(Unknown Source) ~[core-io-1.5.1.jar:?]

@thayerw
This problem has been fixed starting with Couchbase Server 5.5. The release is not out yet, but there is a developer preview (DP) that you can try.

Check the Pre-release versions tab at
https://www.couchbase.com/downloads

Let us know if the problem still persists,
Abdullah.

It’s working! Thank you.