Bulk insert, with unknown number of documents [N1Ql]

Hi,
Is there a way I can bulk insert multiple documents into Couchbase using N1QL without knowing the number of documents I am going to insert?

Below is the scenario.
I have an array, say:

Array = [{"key": "abc", "data": {"name": "xyz", "age": 21}}, {"key": "abc1", "data": {"name": "xyz", "age": 21}}, {"key": "abc2", "data": {"name": "xyz", "age": 21}}, {"key": "abc3", "data": {"name": "xyz", "age": 21}}, ........ {"key": "abc100000", "data": {"name": "xyz", "age": 21}}]

I can have a million records like this in the array, where key represents the document key and data represents the data to be stored in that document.

I have tried:

INSERT INTO bucket
(KEY, VALUE)
VALUES (key1, doc1), (key2, doc2), ...;

but this is only useful when I know the set of documents in advance.
In the above scenario I don't know the number of elements in the array.

INSERT INTO default (KEY _key, VALUE _data)
SELECT d.`key` AS _key, d.data AS _data
FROM [{"key": "abc", "data": {"name": "xyz", "age": 21}},
      {"key": "abc1", "data": {"name": "xyz", "age": 21}},
      {"key": "abc2", "data": {"name": "xyz", "age": 21}},
      {"key": "abc3", "data": {"name": "xyz", "age": 21}},
      {"key": "abc100000", "data": {"name": "xyz", "age": 21}}] AS d;
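A variation (a sketch, not part of the original answer): instead of embedding the literal array in the statement text, the array can be supplied as a named parameter from the client at execution time, so the same statement works for any array length. `$docs` here is an assumed parameter name:

```sql
INSERT INTO default (KEY _key, VALUE _data)
SELECT d.`key` AS _key, d.data AS _data
FROM $docs AS d;
```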

 OR

INSERT INTO default (KEY, VALUE)
VALUES ("001", {"array": [{"key": "abc", "data": {"name": "xyz", "age": 21}},
                          {"key": "abc1", "data": {"name": "xyz", "age": 21}},
                          {"key": "abc2", "data": {"name": "xyz", "age": 21}},
                          {"key": "abc3", "data": {"name": "xyz", "age": 21}},
                          {"key": "abc100000", "data": {"name": "xyz", "age": 21}}]});

INSERT INTO default (KEY _key, VALUE _data)
SELECT d.`key` AS _key, d.data AS _data
FROM default AS d1 USE KEYS ["001"]
UNNEST d1.array AS d;
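If the staging-document approach above is used, the temporary document can be removed once the bulk insert completes (a small follow-up sketch; "001" is the staging key from the example):

```sql
DELETE FROM default USE KEYS ["001"];
```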

If the array has too many elements, the best option is to use the SDKs/clients: loop through the array and insert each element as a separate document, or batch them n at a time.
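The SDK batching suggestion above can be sketched as below. This is a minimal illustration, not a complete client: the array shape follows the question, and the actual Couchbase write call (e.g. `collection.upsert` in the Python SDK) is left as a commented placeholder since it needs a live cluster connection.

```python
# Client-side batching sketch: split the array into fixed-size batches and
# insert each element as its own document, using its "key" as the doc key.

def chunks(items, n):
    """Yield successive n-sized batches from a list."""
    for i in range(0, len(items), n):
        yield items[i:i + n]

# Small stand-in for the million-element array from the question.
docs = [{"key": "abc%d" % i, "data": {"name": "xyz", "age": 21}}
        for i in range(10)]

for batch in chunks(docs, 4):
    for d in batch:
        # collection.upsert(d["key"], d["data"])  # real SDK insert goes here
        pass
```

Batching keeps memory bounded and lets you retry a failed batch without replaying the whole array.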