Array index consistency in sub-document mutations

I'm trying to perform sub-document mutations on arrays, but I'm afraid that paths containing a fixed index can lead to unwanted changes.

I have a structure like this:
```json
{
  "age": "101",
  "name": "testName",
  "testArray": [
    {
      "id": "x0",
      "obj1Field": "obj1FieldValue",
      "obj1Field2": {
        "field1": "fv1",
        "field2": "fv2"
      }
    },
    {
      "id": "x2",
      "obj1Field": "testValue",
      "obj1Field2": {
        "field1": "fv3",
        "field2": "fv4"
      }
    },
    {
      "id": "x1",
      "obj1Field": "testValue2",
      "obj1Field2": {
        "field1": "fv5",
        "field2": "fv6"
      }
    }
  ],
  "testInnerObj": {
    "field1": "value1",
    "field2": 4,
    "innerArray": [
```
I want to use MutateIn in the Java SDK to remove elements from the arrays (testArray, for example).
I can use a path like "testArray[1]" in a remove or upsert MutateInSpec.

How can I be sure it will always point to the same object in the array?
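To illustrate the concern: an array path like "testArray[1]" is purely positional, so if another mutation removes an earlier element between your read and your write, the same path points at a different object. A minimal stand-alone sketch with plain Java collections (no Couchbase calls):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

public class IndexShiftDemo {
    public static void main(String[] args) {
        // Ids in document order, mirroring testArray: x0, x2, x1
        List<String> ids = new ArrayList<>(Arrays.asList("x0", "x2", "x1"));

        // You read the document and decide index 2 holds "x1".
        String target = ids.get(2);          // "x1"

        // Meanwhile another client removes element 0 ("x0").
        ids.remove(0);

        // "testArray[2]" is now out of bounds, and "testArray[1]"
        // (the old position of "x2") now holds "x1".
        System.out.println(ids);             // [x2, x1]
        System.out.println(ids.get(1).equals(target)); // true
    }
}
```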

Is there any way to filter array elements by id?

Let's say I will receive an object like this:

```json
{
  "operation": "REMOVE",
  "objectId": "x1",
  "path": "testArray"
}
```

Can I retrieve a sub-document part like "testArray", operate on it as if it were a separate document, modify it freely, and then write it back with a sub-document operation so the changes are applied to the proper objects?

Please advise how I can use sub-document operations in this case to ensure correct removal.
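One common pattern for this (a sketch, not necessarily the only answer): fetch the array and the document's CAS value, find the target's current index by its "id" field on the client, then issue the sub-document remove guarded by that CAS so the write fails if the document changed in between. The index-finding part is plain Java and shown below; the surrounding Couchbase calls are only described in comments, since they need a live cluster:

```java
import java.util.ArrayList;
import java.util.List;
import java.util.Map;

public class FindByIdDemo {
    /**
     * Returns the current index of the element whose "id" field matches,
     * or -1 if absent. Against Couchbase you would first fetch "testArray"
     * (e.g. with a lookupIn get), compute the index like this, then send
     * MutateInSpec.remove("testArray[" + idx + "]") with the CAS value
     * from the fetch, retrying on a CAS mismatch.
     */
    static int findIndexById(List<Map<String, Object>> array, String id) {
        for (int i = 0; i < array.size(); i++) {
            if (id.equals(array.get(i).get("id"))) {
                return i;
            }
        }
        return -1;
    }

    public static void main(String[] args) {
        List<Map<String, Object>> testArray = new ArrayList<>();
        testArray.add(Map.of("id", "x0", "obj1Field", "obj1FieldValue"));
        testArray.add(Map.of("id", "x2", "obj1Field", "testValue"));
        testArray.add(Map.of("id", "x1", "obj1Field", "testValue2"));

        int idx = findIndexById(testArray, "x1");
        System.out.println("testArray[" + idx + "]"); // testArray[2]
    }
}
```

The CAS check is what makes the positional path safe: if another client reorders or shrinks the array after your read, the guarded mutation is rejected instead of hitting the wrong element.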

I saw I could use N1QL's ARRAY … (CASE WHEN … THEN …) FOR … END construct.
But that would require an UPDATE statement, which I've read always performs a full-document operation, right?
Can I combine those two approaches somehow?

Hi @DominikS
Is it possible to adjust your data model so that instead of an array you have a map, something like this:

  "age": "101",
  "name": "testName",
  "test": {
    "x0": {
      "obj1Field": "obj1FieldValue",
      "obj1Field2": {
        "field1": "fv1",
        "field2": "fv2"

Now you can delete "test.x0".
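The advantage of the map-keyed model is that the sub-document path names the element directly, so it is immune to reordering and concurrent removals. A small stand-alone sketch with plain Java maps (the real call would be a MutateIn remove on the "test.x0" path, as above):

```java
import java.util.HashMap;
import java.util.Map;

public class MapModelDemo {
    public static void main(String[] args) {
        // Map keyed by id instead of an array: the path is stable.
        Map<String, Map<String, String>> test = new HashMap<>();
        test.put("x0", Map.of("obj1Field", "obj1FieldValue"));
        test.put("x1", Map.of("obj1Field", "testValue2"));

        // Equivalent of removing the "test.x0" path: keyed removal
        // is unaffected by how many other entries exist or their order.
        test.remove("x0");

        System.out.println(test.containsKey("x0")); // false
        System.out.println(test.containsKey("x1")); // true
    }
}
```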


Thank you @graham.pople for the quick reply.
This looks great.
The only thing I worry about is how to use and translate such a structure to GraphQL and MapStruct.

Do you have any suggestion on that?

Hey @DominikS
Sorry, I don’t have enough experience with those technologies to speak to them.