Issue connecting Spark to Couchbase

I am trying to write the following Spark dataframe into Couchbase:

    import com.couchbase.spark.sql._
    import com.couchbase.spark._
    import org.apache.spark.{SparkConf, SparkContext}
    import org.apache.spark.sql.hive.HiveContext

    object Test {
      case class Person(name: String, relation: String, uid: String)

      def main(args: Array[String]) {
        val conf = new SparkConf()
        conf.set("spark.couchbase.nodes", "")
        conf.set("spark.couchbase.username", "Administrator")
        conf.set("com.couchbase.password", "xxx")
        conf.set("com.couchbase.bucket.xx", "xx")
        conf.set("com.couchbase.connectTimeout", "50000")

        val sc = new SparkContext(conf)
        val hiveCtx = new HiveContext(sc)

        import hiveCtx.implicits._

        // The snippet was truncated here; the closing parentheses, toDF()
        // and the write call below complete it for illustration.
        val people = sc.parallelize(Seq(
          Person("user::michael", "Michael", "27"),
          Person("user::tom", "Tom", "33")
        )).toDF()

        val couchbaseConf = scala.collection.immutable.Map("bucket" -> "xx", "idField" -> "uid")

        people.write.couchbase(couchbaseConf)
      }
    }




I submit the job with this spark-submit command:

spark-submit --class Test  --master yarn-cluster --deploy-mode cluster --queue default --jars spark-connector_2.10-1.2.0.jar,couchbase-client-1.4.12.jar,core-io-1.3.3.jar,java-client-2.4.2.jar test_2.10-1.0.jar

but it fails with the following error:

java.lang.NoClassDefFoundError: rx/functions/Func2

Any information which could help me is highly appreciated.

@rezaliii the Couchbase SDK also depends on RxJava, which you need to include in your jar list as well. I would recommend using --packages instead of --jars and just providing a Maven coordinate:

  --packages                  Comma-separated list of maven coordinates of jars to include
                              on the driver and executor classpaths. Will search the local
                              maven repo, then maven central and any additional remote
                              repositories given by --repositories. The format for the
                              coordinates should be groupId:artifactId:version.