Indexing service crashes when indexing an array of items

I am trying to build an index over an array of items on a document. The array, called visitCharges, has 50 objects in it. Each object is about 600 bytes with 7 fields.
I am adding this index based on the guidance in the article linked below. The index service exits with code 1 while trying to build this index (full error below). I am running this on a dev box with about 20,000 documents; production has up to 1,000 items in the array. Is this a known bug or limitation?

CB version 4.5.1 on Windows 10

Making the most of your arrays

Here is the query I am trying to cover with this index:

select charge
from cloud_med charges
unnest charges.visitCharges charge
where charges.type = 'charges' and charges.visitNumber = 'visit-id' and tenantName = 'tenant1'
AND ANY c IN charges.visitCharges SATISFIES c.hcpcs >= '12031' END
and charge.hcpcs == '12031'
order by charge.hcpcs
limit 100

Index definition:

CREATE INDEX idx_hcpcs_non_dis ON cloud_med(array (c.hcpcs) for c in visitCharges end, visitCharges, visitNumber, tenantName) WHERE (type = "charges")

Full error:

Service 'indexer' exited with status 1. Restarting. Messages: [FDB INFO] Forestdb opened database file c:\Program Files\Couchbase\Server\var\lib\couchbase\data@2i\cloud_med_idx-charge_tenantName_visitNumber_5440202956595155568_0.index\data.fdb.0
[FDB INFO] Forestdb opened database file c:\Program Files\Couchbase\Server\var\lib\couchbase\data@2i\cloud_med_idx_cpt_dis_6298577560406935950_0.index\data.fdb.0
[FDB INFO] Forestdb opened database file c:\Program Files\Couchbase\Server\var\lib\couchbase\data@2i\cloud_med_idx_admitDate_8853683722607819875_0.index\data.fdb.0
[FDB INFO] Forestdb opened database file c:\Program Files\Couchbase\Server\var\lib\couchbase\data@2i\cloud_med_#primary_9357026671000851690_0.index\data.fdb.0
[goport] 2016/10/21 08:10:11 c:/Program Files/Couchbase/Server/bin/indexer.exe terminated: exit status 2

@prasad @deepkaran.salooja

One thing to note is that if I add the DISTINCT keyword before ARRAY, the index builds successfully.
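For reference, the variant that does build (per the note above) would look something like this. This is a sketch based on the original definition, not a confirmed fix from the Couchbase team:

```sql
-- Same keyspace and keys as the failing definition, but with DISTINCT,
-- which deduplicates the array entries emitted per document.
CREATE INDEX idx_hcpcs_dis ON cloud_med(
  DISTINCT ARRAY c.hcpcs FOR c IN visitCharges END,
  visitCharges, visitNumber, tenantName)
WHERE (type = "charges");
```

Since each document here carries up to 1,000 array items, DISTINCT can sharply reduce the number of index entries when hcpcs values repeat within a document, which may be why this variant avoids whatever limit the non-DISTINCT build is hitting.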

@DrGarbinsky, can you share your indexer.log file from the index service node?


It means you have too many duplicate entries. Still, the indexer should not crash; it should reject the limit gracefully.

I cannot access the log from the link. Can you gzip it and attach it here, or upload it to S3?

curl -X PUT -T indexer.log