When rereduce is triggered

I am playing with the Couchbase beer-sample bucket. I use views to get a list of cities in a state.
My map/reduce functions work quite well, but I wonder when rereduce is triggered.
It looks like, after adding a new document to the bucket, the whole view is processed again from the beginning. Even if I remove the rereduce part of the reducer, the cities list is still updated.
My map and reduce functions are below:

function (doc, meta) {
  if (doc.state != "" && doc.state != null) {
    emit(doc.state, doc.city);
  }
}

function (key, values, rereduce) {
  var result;
  if (rereduce) {
    //result = values[1];
    //result.push(values[0]);
  }
  else {
    result = values;
  }
  return result;
}

Hello,

In fact, rereduce happens in two cases:

  • when you have multiple nodes in your cluster: since the system does the reduce on each node, the reduce has to be done again when all the results are merged
  • when the array of values is too big; “too big” is something the system manages internally based on the size of the array (the data and the number of entries)

The important part is that you do not control this: just be sure you write the reduce code to handle both rereduce=true and rereduce=false.
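For example, a reduce that keeps building the city list in both modes might look like this sketch (a minimal, untested sketch: on rereduce, the values array holds the arrays returned by earlier reduce calls, so they must be flattened rather than returned as-is):

function (key, values, rereduce) {
  if (rereduce) {
    // values is an array of arrays produced by previous reduce calls:
    // flatten them back into a single list of cities.
    var merged = [];
    for (var i = 0; i < values.length; i++) {
      merged = merged.concat(values[i]);
    }
    return merged;
  } else {
    // values is the list of cities emitted by the map function.
    return values;
  }
}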

Documentation:
http://docs.couchbase.com/couchbase-manual-2.2/#handling-rereduce

Regards
Tug
@tgrall

One more question. Let's assume that I have 1 node. If I add a new record to the bucket, will the view be processed from scratch? What if I have 1,000,000 docs? Will they all be processed each time?

Thanks

OK, I understand. But is there any way to force the rereduce process? It would be very helpful while testing. Otherwise we can only check whether the rereduce part works after going to production.

No, this is done incrementally.

The only simple way is to create a 2-3 node cluster.
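If you only want to exercise the rereduce branch before going to production, one workaround is to copy the reduce logic into a plain Node.js script and call it by hand with rereduce set to false and then true (this is just a hypothetical test harness, not a Couchbase feature):

// reduce-test.js - standalone check of the reduce logic (hypothetical harness)
function reduceCities(key, values, rereduce) {
  if (rereduce) {
    var merged = [];
    for (var i = 0; i < values.length; i++) {
      merged = merged.concat(values[i]);
    }
    return merged;
  }
  return values;
}

// first-pass reduce: values come straight from the map output
var pass1 = reduceCities(null, ["Austin", "Dallas"], false);
var pass2 = reduceCities(null, ["Houston"], false);

// rereduce: values are the results of previous reduce calls
console.log(reduceCities(null, [pass1, pass2], true));
// -> [ 'Austin', 'Dallas', 'Houston' ]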

@tgrall

Regarding the second trigger of the rereduce, “the size of the array (the data and number of entries)”:
I ran into the issue below:
My map emits only one value, which is pretty big. In this case, if the system thinks it is too big, it triggers the rereduce over and over, and Couchbase eventually eats up all my CPU and RAM.

What if I really need that big value? Is there a way to bypass this limit?