Regarding Memory Consumption


I recently did some evaluations of CouchDB. I found that memory consumption is quite high for view construction (map & reduce) as well as for importing a large JSON document into CouchDB. I evaluated the view construction function on an Ubuntu system (4 cores, Intel® Xeon® CPU E3-1240 v5 @ 3.50GHz). Here are the results:
(1) four hundred 100 KB datasets consumed around 683 MB of memory;
(2) one 80 MB dataset consumed around 2.5 GB of memory;
(3) four 80 MB datasets consumed around 10 GB of memory.
It seems that memory consumption is dozens of times the size of the original JSON dataset (roughly 17–31× in these runs). If we used a 1 GB dataset, CouchDB would run out of memory. Does anyone know why memory consumption is so high? Many thanks!
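For context, the views in question are ordinary CouchDB map/reduce functions: during indexing the server runs every document through the JavaScript map function and stores the emitted key/value rows. Below is a minimal local simulation of such a view, just to illustrate the kind of workload being measured. The `emit` stub, the document shape, and the field names are hypothetical stand-ins for the server-side view runtime, not the actual CouchDB internals; `reduceSum` mimics the behavior of CouchDB's built-in `_sum` reduce.

```javascript
// Minimal local simulation of a CouchDB map/reduce view (Node.js).
// In CouchDB, `emit` is provided by the view server; here it is a stub
// that collects rows so the map function can be exercised standalone.

const rows = [];
function emit(key, value) {
  rows.push({ key, value });
}

// A typical map function: index matching documents by a field,
// emitting one numeric value per document.
function map(doc) {
  if (doc.type === "measurement") {
    emit(doc.sensor, doc.bytes);
  }
}

// Equivalent of CouchDB's built-in `_sum` reduce: sum the emitted values.
function reduceSum(values) {
  return values.reduce((a, b) => a + b, 0);
}

// Hypothetical sample documents standing in for the imported JSON dataset.
const docs = [
  { _id: "d1", type: "measurement", sensor: "s1", bytes: 100 },
  { _id: "d2", type: "measurement", sensor: "s2", bytes: 200 },
  { _id: "d3", type: "other" },
];

docs.forEach(map);
const total = reduceSum(rows.map((r) => r.value));
console.log(rows.length, total); // prints "2 300"
```

During real indexing, every document in the database passes through this map step, and the emitted rows are written into the view's B-tree index, which is where the memory pressure described above shows up.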

I don’t know if you meant to say Couchbase Server - note that Couchbase != CouchDB :slight_smile: