Can't enable DCP connection on Spark Connector

connections
spark

#1

Hi,

I'm having trouble enabling a DCP connection.
Each time, I receive the error message "The DCP service is not enabled or no node in the cluster supports it."

I set System.setProperty("com.couchbase.dcpEnabled", "true") before creating my SparkContext.

I also tried putting it in my configuration setup, and passing it as a Java option on the Spark master and the Spark worker.
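For the Java options attempt, it was along these lines (a sketch; the jar name is a placeholder, and the --conf keys are the standard Spark settings for passing JVM options):

```shell
# Pass the DCP flag as a JVM option to both the driver and the executors,
# since System.setProperty in the driver JVM does not propagate to
# remote executor JVMs.
bin/spark-submit \
  --master spark://spark-master:7077 \
  --conf "spark.driver.extraJavaOptions=-Dcom.couchbase.dcpEnabled=true" \
  --conf "spark.executor.extraJavaOptions=-Dcom.couchbase.dcpEnabled=true" \
  myApp-assembly.jar
```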

I have tried in 2 different environments:
Test -> OSX, Spark 1.6.0 (Scala 2.10), sbt + Assembly, Couchbase Server 4.0 Community.
Prod -> Mesos 0.25, Spark 1.6.0, Docker executor, Couchbase Server 4.0 Community on Debian.

I can only get DCP working when I set the master to "local[*]"; it doesn't work when I set the master address to "spark://spark-master:7077" (whether it is a local address or a remote one).

Any advice on how to enable DCP when using remote servers?

Thanks.


#4

Any chance you can turn the log level up for com.couchbase and file an issue attaching the logs? I don’t see any reason it shouldn’t work.


#5

Sure, I will file the issue tomorrow morning with the logs.

I found a workaround; the issue may have been due to a bad practice on my part, as I'm not a Spark expert yet.

I wasn't setting the Spark master at all in the SparkConf in my code, only in the spark-submit command:

new SparkConf().setAppName("myApp")
bin/spark-submit --master spark://spark-master:7077

After posting on the forum, I tried setting it to local as in the examples:

new SparkConf().setMaster("local[*]").setAppName("myApp")
bin/spark-submit --master spark://spark-master:7077

This trick does work in client mode; I haven't tested it in cluster mode yet.

But each time I remove the .setMaster("local[*]"), or try to replace it with the address of my master as .setMaster("spark://spark-master:7077"), the DCP error comes back.
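To summarize the two configurations (a sketch; only the master URL and app name come from my setup above):

```scala
import org.apache.spark.{SparkConf, SparkContext}

// Works (client mode): the master is hard-coded to local[*] in the conf,
// even though spark-submit is given spark://spark-master:7077.
val working = new SparkConf()
  .setMaster("local[*]")
  .setAppName("myApp")

// Fails with the DCP error: either no setMaster() call at all, or the
// real master address set explicitly in the conf.
val failing = new SparkConf()
  .setMaster("spark://spark-master:7077")
  .setAppName("myApp")

val sc = new SparkContext(working)
```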