Q: Modifying one document, but the Couchbase Lite LiveQuery listener gets more than one change

Hi,

Environment:
Couchbase Server: 6.0
Sync Gateway: 2.1
Couchbase Lite: 2.1.2

My question:
When I update a document through the Sync Gateway REST API, the Couchbase Lite listener gets more than one change.
My application scenario: Couchbase Lite instance A posts a document via the Sync Gateway REST API, and Couchbase Lite instance D listens for changes. I use a QueryChangeListener with one WHERE condition. When D gets a change, it fetches the document and sends it to the printer.
Now when I update one document, the printer always prints more than once, and the Android logcat shows many change notifications.
How can I resolve this?
Sample graph: (image not included)
I also use a database-level change listener. When the Android device goes into sleep mode, the sync thread sometimes gets killed by the system, so I save a heartbeat document in the local database. If I don't get a result from the database change listener for over 5 minutes, I restart the sync. But when I put the heartbeat document, the Android logcat also shows two changes.
Can a database-level change listener not be combined with a query-level LiveQuery?
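My restart logic is roughly like this (a simplified sketch of my watchdog, not the Couchbase Lite API; the names here are just illustrative):

```java
import java.time.Duration;
import java.time.Instant;

// Simplified sketch of the heartbeat watchdog: if the database change
// listener has not reported the heartbeat document for more than
// 5 minutes, assume the sync thread was killed and signal that the
// replicator should be restarted.
public class SyncWatchdog {
    private static final Duration TIMEOUT = Duration.ofMinutes(5);
    private Instant lastHeartbeatSeen;

    public SyncWatchdog(Instant start) {
        this.lastHeartbeatSeen = start;
    }

    // Called from the change listener whenever the heartbeat doc arrives.
    public synchronized void onHeartbeat(Instant when) {
        lastHeartbeatSeen = when;
    }

    // Polled periodically; true means the sync should be restarted.
    public synchronized boolean shouldRestartSync(Instant now) {
        return Duration.between(lastHeartbeatSeen, now).compareTo(TIMEOUT) > 0;
    }
}
```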
Best regards

Hi, are you saying this happens every time? Can you post a code sample?

Also, I'm wondering why you post the document through the REST API instead of writing it to Couchbase Lite and letting the normal sync process handle it?

Hi @hod.greeley,
thanks for your reply.
Not every time; it doesn't happen often.
I have 5 Android devices in one channel. One device uses a LiveQuery to listen for documents and checks a print-flag property to decide whether to print. For example, with Android devices A, B, C, D, and E: A posts a document via the Sync Gateway REST API, and E runs a LiveQuery on the document. When E detects that the document's print flag is 0, it prints it. When printing succeeds, E receives a callback and sets the print flag to 1 via the SG REST API, meaning the document has been printed.
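The print decision on E is roughly this (a simplified sketch of the flag check, not my exact code):

```java
// Simplified sketch of E's print logic: print only while the
// document's print flag is 0. After the printer callback, the flag
// is set to 1 via the SG REST API, so any extra revision that still
// carries flag 0 will trigger another print.
public class PrintDecision {
    public static final int NOT_PRINTED = 0;
    public static final int PRINTED = 1;

    // True when the order should be sent to the printer.
    public static boolean shouldPrint(int printFlag) {
        return printFlag == NOT_PRINTED;
    }
}
```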
I've pasted the Sync Gateway logs below.
A normal run looks like this:

[root@k8s-master1 ~]# kubectl logs sync-gateway-import-68788b6b47-n6wr7 | grep Order.290fb86d-b9ef-4627-9049-649ace4840ca
2019-03-15T12:00:04.322Z [INF] CRUD: Doc "Order.290fb86d-b9ef-4627-9049-649ace4840ca" in channels "{733d2f51}"
2019-03-15T12:00:04.322Z [INF] CRUD: Stored doc "Order.290fb86d-b9ef-4627-9049-649ace4840ca" / "1-bbf72401d5fa78ede6f3efaf00fb951b"
2019-03-15T12:00:04.325Z [INF] Cache: Received #1000 after 2ms ("Order.290fb86d-b9ef-4627-9049-649ace4840ca" / "1-bbf72401d5fa78ede6f3efaf00fb951b")
2019-03-15T12:00:04.350Z [INF] SyncMsg: [5d123124] #239: Type:rev Id:Order.290fb86d-b9ef-4627-9049-649ace4840ca Rev:1-20a7bc9845545fede2625f992475f6df02ca6d3f Sequence:1001 User:733d2f51
2019-03-15T12:00:04.352Z [INF] SyncMsg: [5fc78829] #322: Type:rev Id:Order.290fb86d-b9ef-4627-9049-649ace4840ca Rev:1-20a7bc9845545fede2625f992475f6df02ca6d3f Sequence:1001 User:733d2f51
2019-03-15T12:00:04.492Z [INF] HTTP: #453: GET /kitchen/Order.290fb86d-b9ef-4627-9049-649ace4840ca (as 733d2f51)
2019-03-15T12:00:04.520Z [INF] HTTP: #454: PUT /kitchen/Order.290fb86d-b9ef-4627-9049-649ace4840ca (as 733d2f51)
2019-03-15T12:00:04.522Z [INF] CRUD: Stored doc "Order.290fb86d-b9ef-4627-9049-649ace4840ca" / "2-eff59b29cd0dd510da5d2f7aac31edd5"
2019-03-15T12:00:04.524Z [INF] Cache: Received #1003 after 2ms ("Order.290fb86d-b9ef-4627-9049-649ace4840ca" / "2-eff59b29cd0dd510da5d2f7aac31edd5")


The E Android device prints 2 times with the SG logs below:

[root@k8s-master1 ~]# kubectl logs sync-gateway-import-68788b6b47-n6wr7 | grep Order.adc6b289-6966-48e7-937c-26a9599da226
2019-03-15T11:59:55.134Z [INF] CRUD: Doc "Order.adc6b289-6966-48e7-937c-26a9599da226" in channels "{733d2f51}"
2019-03-15T11:59:55.134Z [INF] CRUD: Stored doc "Order.adc6b289-6966-48e7-937c-26a9599da226" / "1-2a542c92146677f434f9d49d3059390a"
2019-03-15T11:59:55.136Z [INF] Cache: Received #992 after 1ms ("Order.adc6b289-6966-48e7-937c-26a9599da226" / "1-2a542c92146677f434f9d49d3059390a")
2019-03-15T11:59:55.307Z [INF] HTTP: #448: GET /kitchen/Order.adc6b289-6966-48e7-937c-26a9599da226 (as 733d2f51)
2019-03-15T11:59:55.341Z [INF] HTTP: #449: PUT /kitchen/Order.adc6b289-6966-48e7-937c-26a9599da226 (as 733d2f51)
2019-03-15T11:59:55.343Z [INF] CRUD: Stored doc "Order.adc6b289-6966-48e7-937c-26a9599da226" / "2-c45f624afa211a19b85be3e47d390254"
2019-03-15T11:59:55.344Z [INF] Cache: Received #996 after 1ms ("Order.adc6b289-6966-48e7-937c-26a9599da226" / "2-c45f624afa211a19b85be3e47d390254")
2019-03-15T11:59:55.349Z [INF] SyncMsg: [5d123124] #230: Type:rev Id:Order.adc6b289-6966-48e7-937c-26a9599da226 Rev:1-3771c3ee29b64b16c97207891fbf3b96039e003e Sequence:993 User:733d2f51
2019-03-15T11:59:55.350Z [INF] SyncMsg: [5fc78829] #313: Type:rev Id:Order.adc6b289-6966-48e7-937c-26a9599da226 Rev:1-3771c3ee29b64b16c97207891fbf3b96039e003e Sequence:993 User:733d2f51
2019-03-15T11:59:55.367Z [INF] SyncMsg: [38b178d] #235: Type:rev Id:Order.adc6b289-6966-48e7-937c-26a9599da226 Rev:1-3771c3ee29b64b16c97207891fbf3b96039e003e Sequence:993 User:733d2f51
2019-03-15T11:59:55.785Z [INF] SyncMsg: [5fc78829] #316: Type:rev Id:Order.adc6b289-6966-48e7-937c-26a9599da226 Rev:3-abc9fe8454398d60e199295ad5f2a8fe009d4f84 Sequence:997 User:733d2f51
2019-03-15T11:59:55.789Z [INF] CRUD: Stored doc "Order.adc6b289-6966-48e7-937c-26a9599da226" / "3-abc9fe8454398d60e199295ad5f2a8fe009d4f84"
2019-03-15T11:59:55.790Z [INF] Cache: Received #997 after 2ms ("Order.adc6b289-6966-48e7-937c-26a9599da226" / "3-abc9fe8454398d60e199295ad5f2a8fe009d4f84")
2019-03-15T11:59:55.797Z [INF] SyncMsg: [5d123124] #233: Type:rev Id:Order.adc6b289-6966-48e7-937c-26a9599da226 Rev:3-abc9fe8454398d60e199295ad5f2a8fe009d4f84 Sequence:997 User:733d2f51
2019-03-15T11:59:55.815Z [INF] SyncMsg: [38b178d] #238: Type:rev Id:Order.adc6b289-6966-48e7-937c-26a9599da226 Rev:3-abc9fe8454398d60e199295ad5f2a8fe009d4f84 Sequence:997 User:733d2f51
2019-03-15T11:59:55.958Z [INF] HTTP: #450: GET /kitchen/Order.adc6b289-6966-48e7-937c-26a9599da226 (as 733d2f51)
2019-03-15T11:59:55.991Z [INF] HTTP: #451: PUT /kitchen/Order.adc6b289-6966-48e7-937c-26a9599da226 (as 733d2f51)
2019-03-15T11:59:55.993Z [INF] CRUD: Stored doc "Order.adc6b289-6966-48e7-937c-26a9599da226" / "4-a02b1a46b9ad940c61864d8deafeb932"
2019-03-15T11:59:55.995Z [INF] Cache: Received #998 after 2ms ("Order.adc6b289-6966-48e7-937c-26a9599da226" / "4-a02b1a46b9ad940c61864d8deafeb932")
2019-03-15T12:18:06.888Z [INF] HTTP: #456: GET /kitchen/Order.adc6b289-6966-48e7-937c-26a9599da226?rev=2-c45f624afa211a19b85be3e47d390254 (as 733d2f51)
2019-03-15T12:19:19.319Z [INF] HTTP: #457: GET /kitchen/Order.adc6b289-6966-48e7-937c-26a9599da226?rev=3-abc9fe8454398d60e199295ad5f2a8fe009d4f84 (as 733d2f51)


These logs show 4 revisions, while the normal logs show only 2, and this scenario doesn't happen often. When I inspected all the logs, I found that 2 of the revisions were in conflict. I don't know if my description is clear. How does the conflict occur?

thanks!

angular

Flagging @jens @adamf to take a look at this.

The dual update of the document sounds like the result of a conflict. If a device pulls a change from the server that conflicts with what it has locally, it will resolve it by choosing one side or the other, then commit a new revision and push that. So two revisions are created.

Conflicts happen when a document is changed in different ways by two clients (or a client and the server) before they’ve had time to replicate the first change.
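Given that, one way to keep the printing device from acting on the extra change notifications is to deduplicate on the client by document ID, so an order is only ever handed to the printer once no matter how many revisions arrive. A rough sketch (plain Java, a hypothetical helper, not a Couchbase Lite API):

```java
import java.util.HashSet;
import java.util.Set;

// Hypothetical client-side guard: even if a conflict produces extra
// revisions with the print flag still 0, each order id is only handed
// to the printer once.
public class PrintGuard {
    private final Set<String> printedIds = new HashSet<>();

    // Returns true exactly once per document id while the flag is 0.
    public synchronized boolean shouldPrint(String docId, int printFlag) {
        if (printFlag != 0) {
            return false; // server already marked it printed
        }
        return printedIds.add(docId); // false on repeat notifications
    }
}
```

The set only lives in memory here; a real app would also want to persist it (or re-check the flag) so a restart doesn't cause a reprint.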

Hi @hod.greeley @jens,

thanks for your reply. As you mentioned, the issue was in my code. When many devices share one channel and one SG user, documents are prone to conflicts. I've now fixed the issue.

thanks again.

angular