Await Streaming rows


I have a script like the following:

var q = N1qlQuery.fromString('SELECT * FROM `travel-sample`');
var req = bucket.query(q);
req.on('row', async function(row) {
  await process(row);
});

process is a function that returns a Promise and takes a long time to fulfill.

Currently, the row event fires as frequently and as fast as possible, so I get no benefit from streaming the rows: I end up buffering all the rows (plus all the pending promises) in memory.
I want the next row event to wait for my async function.


I think that even without promises, with plain nested callbacks, we would still end up buffering all the rows, because a new row arrives before we finish consuming the previous one.


Help please!!! :pensive:


Hey @sockerman2016,

Sorry for the delay. The Node.js SDK will accept data from the server at the fastest rate possible, which is then bubbled up to the application via the row events. If you wish to process the rows sequentially, you would need to queue the rows locally in the application and then dispatch your promise functions from that queue one at a time. There is actually a built-in capability related to this: subscribing to the rows event, which will emit an array of all the rows once they have been retrieved. Note that using this event does have the negative side effect of forcing you to wait for all the rows to be fetched before you can start your promises (whereas by listening to the row event and doing the aggregation yourself, you can start your work as soon as the first row arrives).

Cheers, Brett