How to insert a big JSON file into Couchbase using the Spark connector

I have a big JSON file that I need to insert/upsert into Couchbase. I went through the docs but could not find an example of this.

Here is a simple example of upserting JSON data to Couchbase Server:

Beyond the example above, you need to convert your JSON data into a JsonObject.
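For reference, a minimal sketch of a key/value upsert with the Spark connector's RDD API might look like the following. This assumes connector 3.x with the Scala SDK's `JsonObject`; the `Upsert` case class and `couchbaseUpsert` helper names should be checked against the docs for the connector version you are on, and the document ID `"user::1"` is just an example:

```scala
import com.couchbase.spark._
import com.couchbase.spark.kv.Upsert
import com.couchbase.client.scala.json.JsonObject

// Build JsonObject document bodies paired with document IDs
val docs = Seq(
  Upsert("user::1", JsonObject.create.put("name", "user1").put("age", 30))
)

// Upsert into the implicit bucket configured on the SparkSession;
// collect() forces the lazy RDD so the writes actually execute
spark.sparkContext.couchbaseUpsert(docs).collect()
```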

There is a similar example (and commentary on it) in the Getting Started guide.

I hope this helps.

Hi @YoshiyukiKono,
Thanks for your response. I got what I needed by looking at the DataFrame persistence example.

Below is the sample code I'm using. Please let me know if there is a better way to do this.

import com.couchbase.spark._
import org.apache.spark.sql.{SaveMode, SparkSession}
import org.apache.spark.sql.functions.col


val saveMode = SaveMode.Overwrite
val spark = SparkSession
  .builder()
  .master("local[*]")
  .appName("Spark SQL")
  .config("spark.couchbase.connectionString", "127.0.0.1")
  .config("spark.couchbase.username", "Administrator")
  .config("spark.couchbase.password", "password")
  .config("spark.couchbase.implicitBucket", "travel-sample")
  .getOrCreate()

val mydf = spark.read.json("/path/to/my.json")

// Rename the "_id" field to __META_ID so the connector uses it as the document key,
// keeping the remaining fields as the document body
val df_to_insert = mydf.withColumn("__META_ID", col("_id")).drop("_id")

df_to_insert.write.mode(saveMode).format("couchbase.kv").save()
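One way to sanity-check the write is to read the documents back through the connector's query-based DataFrame source. This is a sketch, assuming connector 3.x and a primary index on the bucket (the `couchbase.query` format name is the connector's query data source):

```scala
// Read the bucket back as a DataFrame via N1QL; requires a primary index
val readBack = spark.read
  .format("couchbase.query")
  .load()

// Inspect a few of the upserted documents
readBack.show(5)
```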