Using the Spark connector from spark-shell

Hello,

I am trying to run the Couchbase Spark connector from spark-shell, following the instructions here:

http://developer.couchbase.com/documentation/server/current/connectors/spark-1.2/spark-shell.html

However, it didn’t work correctly.

$ spark-shell --packages com.couchbase.client:spark-connector_2.10:1.2.0 --conf "spark.couchbase.bucket.travel-sample="
Ivy Default Cache set to: /home/makoto/.ivy2/cache
The jars for the packages stored in: /home/makoto/.ivy2/jars
:: loading settings :: url = jar:file:/usr/local/oss/spark-1.6.1-bin-hadoop2.6/lib/spark-assembly-1.6.1-hadoop2.6.0.jar!/org/apache/ivy/core/settings/ivysettings.xml
com.couchbase.client#spark-connector_2.10 added as a dependency
:: resolving dependencies :: org.apache.spark#spark-submit-parent;1.0
confs: [default]
found com.couchbase.client#spark-connector_2.10;1.2.0 in list
found com.couchbase.client#java-client;2.2.7 in list
found com.couchbase.client#core-io;1.2.8 in list
found io.reactivex#rxjava;1.0.17 in list
found io.reactivex#rxscala_2.10;0.25.1 in list
:: resolution report :: resolve 323ms :: artifacts dl 11ms
:: modules in use:
com.couchbase.client#core-io;1.2.8 from list in [default]
com.couchbase.client#java-client;2.2.7 from list in [default]
com.couchbase.client#spark-connector_2.10;1.2.0 from list in [default]
io.reactivex#rxjava;1.0.17 from list in [default]
io.reactivex#rxscala_2.10;0.25.1 from list in [default]
---------------------------------------------------------------------
|                  |            modules            ||   artifacts   |
|       conf       | number| search|dwnlded|evicted|| number|dwnlded|
---------------------------------------------------------------------
|      default     |   5   |   0   |   0   |   0   ||   5   |   0   |
---------------------------------------------------------------------
:: retrieving :: org.apache.spark#spark-submit-parent
confs: [default]
0 artifacts copied, 5 already retrieved (0kB/11ms)
16/06/13 05:48:45 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 1.6.1
      /_/

Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.8.0_65)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context available as sc.
16/06/13 05:48:53 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/06/13 05:48:53 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/06/13 05:48:58 WARN ObjectStore: Version information not found in metastore. hive.metastore.schema.verification is not enabled so recording the schema version 1.2.0
16/06/13 05:48:58 WARN ObjectStore: Failed to get database default, returning NoSuchObjectException
16/06/13 05:48:59 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
16/06/13 05:49:00 WARN Connection: BoneCP specified but not present in CLASSPATH (or one of dependencies)
SQL context available as sqlContext.

scala> val airlines = sqlContext.read.couchbase(schemaFilter = org.apache.spark.sql.sources.EqualTo("type", "airline"))
<console>:25: error: value couchbase is not a member of org.apache.spark.sql.DataFrameReader
       val airlines = sqlContext.read.couchbase(schemaFilter = org.apache.spark.sql.sources.EqualTo("type", "airline"))
                                      ^

scala>

The bucket travel-sample exists in my Couchbase Server 4.5 Beta. Is something wrong with my environment?

The following import is necessary before calling sqlContext.read.couchbase:

import com.couchbase.spark.sql._

It worked fine!
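
For reference, here is the complete sequence that works in my spark-shell session (the printSchema/show calls at the end are just my own sanity checks, not from the linked doc):

import com.couchbase.spark.sql._

// The import adds couchbase() to DataFrameReader via an implicit conversion.
val airlines = sqlContext.read.couchbase(
  schemaFilter = org.apache.spark.sql.sources.EqualTo("type", "airline"))

// Standard DataFrame calls to inspect the inferred schema and a few rows.
airlines.printSchema()
airlines.show(5)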

Glad you were able to figure it out - and thanks for posting the missing import! That should help others if they try to Google the same error message. -Will

@webber it’s actually in the doc you linked, but it’s a little hidden in the middle. Great you got it working!
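
A note for anyone who hits the same error in a standalone application rather than in spark-shell: there you don’t get the implicit sqlContext, so the bucket configuration goes directly on the SparkConf. A minimal sketch, assuming a local cluster with travel-sample loaded (the com.couchbase.* keys are the connector’s config properties; verify them against the 1.2 docs for your version):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext
import com.couchbase.spark.sql._

object AirlinesExample {
  def main(args: Array[String]): Unit = {
    // Point the connector at the cluster and open the bucket.
    // Adjust the node address for a non-local cluster.
    val conf = new SparkConf()
      .setAppName("airlinesExample")
      .setMaster("local[*]")
      .set("com.couchbase.nodes", "127.0.0.1")
      .set("com.couchbase.bucket.travel-sample", "")

    val sc = new SparkContext(conf)
    val sqlContext = new SQLContext(sc)

    // Same DataFrame read as in the shell; the import above supplies couchbase().
    val airlines = sqlContext.read.couchbase(
      schemaFilter = org.apache.spark.sql.sources.EqualTo("type", "airline"))
    airlines.show(5)
  }
}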