Import a huge JSON File


#1

Hello,
I'm trying to import a huge JSON file with a lot of JSON objects into Couchbase 4.1. A single object works with:

cbdocloader  -n localhost:Port -u username -p passwd -b bucketName UTF8OBJSONFile.json  

The database works fine and I can import a single JSON object, but not a huge file with a lot of objects…

I read that I can make a .zip from the JSON, but that doesn't work either.

Does anyone have an idea?

Best regards,
Andy


#2

Hey Andy -
Are you getting an error message? How big is the file you’re trying to import? What OS?
-Will


#3

Hey, I'm trying it with a 9 MB file, but later it has to work with a 10 GB file.
I'm on Windows 7.


#4

Summoning @anil since this is his topic area. So no error message, just slow? Or is this blowing up?


#5

Sorry, yes, it shows this error if I use a file with more than one JSON object:

2016-03-01 19:00:58,342: w0 Fail to read json file with error:Extra data: line 2 column 1 - line 242 column 1 (char 2705 - 1442900)
bucket: UTF8OB.json, msgs transferred…
: total | last | per sec
byte : 0 | 0 | 0.0
done


#6

I’ve used the zip approach when the zip is a collection of docs, one JSON object per file. I don’t think cbdocloader supports one giant file with tons of JSON objects in it; that’s what the error is telling you.
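To get from one big file to the layout described above, a small script can split it up. This is a minimal sketch, not an official tool: it assumes the source file's top level is a JSON array of objects (a 10 GB file would need a streaming parser instead, since `json.load` reads everything into memory), and the `doc_%07d.json` naming is just an illustration.

```python
import json
import zipfile

def split_to_zip(src_path, zip_path):
    """Split a JSON file containing an array of objects into a zip of
    one-object-per-file .json docs (the layout cbdocloader accepts)."""
    with open(src_path, encoding="utf-8") as f:
        docs = json.load(f)  # assumes the top level is a JSON array
    with zipfile.ZipFile(zip_path, "w", zipfile.ZIP_DEFLATED) as zf:
        for i, doc in enumerate(docs):
            zf.writestr("doc_%07d.json" % i, json.dumps(doc))
    return len(docs)

# Small demo file standing in for the real export
with open("sample.json", "w", encoding="utf-8") as f:
    json.dump([{"id": i, "name": "doc%d" % i} for i in range(5)], f)

n = split_to_zip("sample.json", "docs.zip")
print("wrote %d docs" % n)
```

The resulting archive should then load with the same command as before, e.g. `cbdocloader -n localhost:Port -u username -p passwd -b bucketName docs.zip`.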


#7

I understand that loading a huge number of records at once isn't optimal, but I would expect cbdocloader to load, say, an array of 1K or 10K objects at a time. What approach do other people use when the data to be imported runs to millions of documents? Litter them across the file system? That doesn't seem optimal at all.
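For the millions-of-documents case, one alternative to scattering files is to stream the source and insert in batches yourself. Here is a hedged sketch using only the standard library: `iter_docs` is a hypothetical helper that walks a file of many concatenated JSON objects (the layout that produces the "Extra data" error in post #5) via `json.JSONDecoder.raw_decode`, without holding the whole file in memory; the actual bulk insert would go where the comment is, e.g. via the Couchbase SDK.

```python
import json

def iter_docs(path, bufsize=1 << 20):
    """Yield JSON objects one at a time from a file holding many
    concatenated objects, without reading the whole file into memory."""
    decoder = json.JSONDecoder()
    buf = ""
    with open(path, encoding="utf-8") as f:
        while True:
            chunk = f.read(bufsize)
            buf += chunk
            while True:
                buf = buf.lstrip()
                if not buf:
                    break
                try:
                    doc, end = decoder.raw_decode(buf)
                except ValueError:
                    break  # object spans the buffer edge; read more
                yield doc
                buf = buf[end:]
            if not chunk:  # end of file
                break

# Demo: three concatenated objects, grouped into batches of 2
with open("many.json", "w", encoding="utf-8") as f:
    for i in range(3):
        f.write(json.dumps({"id": i}) + "\n")

batch, batches = [], []
for doc in iter_docs("many.json"):
    batch.append(doc)
    if len(batch) == 2:
        batches.append(batch)  # here you'd bulk-insert the batch
        batch = []
if batch:
    batches.append(batch)
print(batches)
```

The batch size (2 in the demo, more realistically 1K–10K as suggested above) bounds memory use regardless of the total file size.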