Troubleshoot a few items not syncing

What I don’t understand is why the Documents folder shows as in sync, but the remote machine shows as out of sync (in both directions).

Unfortunate timing, here is a later screenshot

‘touch’-ing a file in the “Out of sync items” list causes the file to be synced across both devices. I have just touched one on dirac and the number of unsynced items has dropped from 792 to 791, but the file doesn’t appear as the “Latest change” in the “Documents” folder, despite the access time being updated across both devices.

You should query the db API for one of the files on both sides to see if there is anything obvious. Also, check the logs.

Can you give me an example of the curl command to use? The documentation is a bit sparse. I have tried:

curl -X POST -H "X-API-Key: <API Key>" -d '{"file": "<filename>"}' http://127.0.0.1:8384/rest/db/file/

and I get a 404

A simple search goes a long way: /rest/db/file is a GET endpoint that takes the folder and file as query parameters.

So, I think there is somehow still a reference to an old Syncthing instance? A number of the documents reference BGRPGB2, which isn’t one of the current instances. This may not be the cause, but I’m not sure why it is happening.

The documents which are “out of sync” have a global availability of null, which I am guessing is the reason they cannot sync, even though the files are actually present on both machines.

So the next question is…

Why do these files have an availability of null?

This post mentions "availability": null, but says it is expected??

I guess posting the same query from both sides might show something interesting.

Please post the query results and a list of currently active devices (i.e. the first “block” of their IDs).

Active devices

  • DZGK2SA - geo-fun
  • O3ET7QT - dirac
  • 7TUHXYK - vance-X230 (Not sharing the documents folder)
  • BGRPGB2 - Unknown, maybe a previous installation

Example of a document that isn’t syncing

From geo-fun

curl "http://127.0.0.1:8384/rest/db/file?folder=cmpcj-twatt&file=20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" \
  -H "X-API-Key: xxxxxxxxx"
{
  "availability": null,
  "global": {
    "deleted": false,
    "ignored": false,
    "invalid": false,
    "localFlags": 0,
    "modified": "2009-03-08T17:23:26Z",
    "modifiedBy": "BGRPGB2",
    "mustRescan": false,
    "name": "20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai",
    "noPermissions": false,
    "numBlocks": 10,
    "permissions": "0644",
    "sequence": 1675496,
    "size": 1300419,
    "type": "FILE",
    "version": [
      "BGRPGB2:1"
    ]
  },
  "local": {
    "deleted": false,
    "ignored": false,
    "invalid": false,
    "localFlags": 0,
    "modified": "2009-03-08T17:23:26Z",
    "modifiedBy": "BGRPGB2",
    "mustRescan": false,
    "name": "20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai",
    "noPermissions": false,
    "numBlocks": 10,
    "permissions": "0644",
    "sequence": 1675496,
    "size": 1300419,
    "type": "FILE",
    "version": [
      "BGRPGB2:1"
    ]
  }
}

From dirac

curl "http://127.0.0.1:8384/rest/db/file?folder=cmpcj-twatt&file=20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" \
  -H "X-API-Key: xxxxxxxx"
{
  "availability": null,
  "global": {
    "deleted": false,
    "ignored": false,
    "invalid": false,
    "localFlags": 0,
    "modified": "2009-03-08T17:23:26Z",
    "modifiedBy": "O3ET7QT",
    "mustRescan": false,
    "name": "20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai",
    "noPermissions": false,
    "numBlocks": 10,
    "permissions": "0644",
    "sequence": 632068,
    "size": 1300419,
    "type": "FILE",
    "version": [
      "O3ET7QT:1"
    ]
  },
  "local": {
    "deleted": false,
    "ignored": false,
    "invalid": false,
    "localFlags": 0,
    "modified": "2009-03-08T17:23:26Z",
    "modifiedBy": "O3ET7QT",
    "mustRescan": false,
    "name": "20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai",
    "noPermissions": false,
    "numBlocks": 10,
    "permissions": "0644",
    "sequence": 632068,
    "size": 1300419,
    "type": "FILE",
    "version": [
      "O3ET7QT:1"
    ]
  }
}
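Putting the two results side by side makes the disagreement concrete. A quick check, with the "global" version-vector values copied from the two responses above:

```shell
# "global".version as reported by each device (copied from the
# query results above):
geo_fun_global="BGRPGB2:1"
dirac_global="O3ET7QT:1"

# The two devices disagree about what the global version even is:
if [ "$geo_fun_global" != "$dirac_global" ]; then
  echo "global version vectors disagree: $geo_fun_global vs $dirac_global"
fi
```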

Analysis

It looks like DZGK2SA (geo-fun) thinks the latest version belongs to BGRPGB2 (the old install), while O3ET7QT (dirac) thinks it has the latest version itself. Maybe this conflicting state is causing "availability": null for the file?

I am not sure how geo-fun has a reference to the old install. I thought I had cleared out all of the state on both machines while Syncthing was shut down, then restarted both. The other machine isn’t an introducer, so the old reference should not have propagated that way.

Can you suggest how I can resolve this?

The weird thing here is that they disagree on the version vector of the global file. Whether or not the key in the version vector is an old device isn’t really relevant (though it might be a pointer to how we got here). What should happen is that one of these file infos “wins” (despite being equal, there is always a winner), which triggers a conflict that is then silently resolved. Here, however, either they didn’t send the info or there was a problem “accounting” it. This comes up way too often, but is still rare enough that I haven’t noticed any pattern…

Can you start Syncthing on both sides with -reset-deltas, please? If it comes up again we can start debugging :smiley:

The modification timestamps on the two are identical, which is the deciding factor when picking who wins the conflict. I wonder if the logic for when the timestamps are the same is somehow flawed.

Or it just simply assumes the file is not changed (even if the version vectors are different), because size, mtime, et al. match.

Starting both with -reset-deltas has made no difference, still 791 files out of sync.

The log files on both sides show:

[O3ET7] 16:12:30 INFO: Reinitializing delta index IDs

but when the scanning settles we are back at the same point.

Regarding @AudriusButkevicius’s comment: there must be something a little random, or a race condition, otherwise the whole filesystem would be in this state. As I said in the earlier post, I rsync -a’d the files across before starting Syncthing, so the timestamps and file permissions should be identical on both sides.

Hmmm, thinking further: it is deterministic, since the same 791 files are not synced, so it can’t really be a race condition or anything else random.

It uses the device ID as a tie-breaker, which should provide consistent ordering on all devices.
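As an illustration only (this is not Syncthing’s actual code, and which ID its real comparison rules would pick is a separate question), any fixed ordering of device IDs gives the same winner on every machine, which is what makes the tie-break deterministic:

```shell
# With identical mtimes, fall back to comparing the two device IDs;
# lexical order (forced to the C locale) is the same everywhere, so
# every device picks the same winner.
winner=$(printf '%s\n%s\n' "BGRPGB2" "O3ET7QT" | LC_ALL=C sort | tail -n 1)
echo "$winner"
```

Here "O3ET7QT" sorts after "BGRPGB2", so every machine applying this rule to the same two IDs would agree.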

This is nice - as in it’s reproducible so there’s a chance at finding the cause :slight_smile:

Can you run Syncthing with -reset-deltas and the STTRACE=model,db env var set, and save the logging output? The output will be huge, so don’t run it as a system service; just redirect stdout to a file.

Each log file is 1.5 GB, so it’s probably best I pull stuff out at this end. I will pick a file that hasn’t synced and post the relevant sections. Is there anything specific to look out for?

Ideally, if you don’t mind, make the entire thing available. Otherwise, if I don’t mess it up, the following grep should contain the info I want (or at least a good starting point): grep -P '(*nameOfAnOutOfSyncFile*|INFO)' (obviously replacing the part within asterisks). That should have a more sensible size (still large, though).

I cut these from the logs; I will have a go with the grep command in a moment. I would prefer to work it out without sharing the whole logs, but if we can’t, I will look at sharing them.

From dirac

[O3ET7] 2020/01/13 16:45:07.945348 lowlevel.go:548: DEBUG: need folder="cmpcj-twatt" device=DZGK2SA-3E7QVQM-GPUIX7Y-AFFZMWU-WEO53Q5-2FLBZRL-MCEUDBA-YNFLDAF name="20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" have=false invalid=false haveV={[]} globalV={[{O3ET7QT 1}]} globalDev=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4
...
[O3ET7] 2020/01/13 16:45:08.973715 lowlevel.go:548: DEBUG: need folder="cmpcj-twatt" device=DZGK2SA-3E7QVQM-GPUIX7Y-AFFZMWU-WEO53Q5-2FLBZRL-MCEUDBA-YNFLDAF name="20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" have=false invalid=false haveV={[]} globalV={[{O3ET7QT 1}]} globalDev=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4
...
[O3ET7] 2020/01/13 16:46:08.537569 lowlevel.go:548: DEBUG: need folder="cmpcj-twatt" device=DZGK2SA-3E7QVQM-GPUIX7Y-AFFZMWU-WEO53Q5-2FLBZRL-MCEUDBA-YNFLDAF name="20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" have=false invalid=false haveV={[]} globalV={[{O3ET7QT 1}]} globalDev=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4
...
[O3ET7] 2020/01/13 16:48:47.577064 lowlevel.go:548: DEBUG: need folder="cmpcj-twatt" device=DZGK2SA-3E7QVQM-GPUIX7Y-AFFZMWU-WEO53Q5-2FLBZRL-MCEUDBA-YNFLDAF name="20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" have=false invalid=false haveV={[]} globalV={[{O3ET7QT 1}]} globalDev=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4
...
[O3ET7] 2020/01/13 16:51:58.522881 lowlevel.go:548: DEBUG: need folder="cmpcj-twatt" device=DZGK2SA-3E7QVQM-GPUIX7Y-AFFZMWU-WEO53Q5-2FLBZRL-MCEUDBA-YNFLDAF name="20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" have=false invalid=false haveV={[]} globalV={[{O3ET7QT 1}]} globalDev=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4
...
[O3ET7] 2020/01/13 16:55:24.806060 lowlevel.go:548: DEBUG: need folder="cmpcj-twatt" device=DZGK2SA-3E7QVQM-GPUIX7Y-AFFZMWU-WEO53Q5-2FLBZRL-MCEUDBA-YNFLDAF name="20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" have=false invalid=false haveV={[]} globalV={[{O3ET7QT 1}]} globalDev=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4

From geo-fun

[DZGK2] 2020/01/13 16:46:01.054449 lowlevel.go:548: DEBUG: need folder="cmpcj-twatt" device=O3ET7QT-RNUQHPX-APW6PNS-QPAVQWY-PU43DGW-CG455OH-XWBVLQB-NLDTSAA name="20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" have=true invalid=false haveV={[{O3ET7QT 1}]} globalV={[{BGRPGB2 1}]} globalDev=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4
...
[DZGK2] 2020/01/13 16:46:06.823728 transactions.go:267: DEBUG: remove from global; folder="cmpcj-twatt" device=O3ET7QT-RNUQHPX-APW6PNS-QPAVQWY-PU43DGW-CG455OH-XWBVLQB-NLDTSAA file="20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai"
...
[DZGK2] 2020/01/13 16:46:07.996147 lowlevel.go:548: DEBUG: need folder="cmpcj-twatt" device=O3ET7QT-RNUQHPX-APW6PNS-QPAVQWY-PU43DGW-CG455OH-XWBVLQB-NLDTSAA name="20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" have=false invalid=false haveV={[]} globalV={[{BGRPGB2 1}]} globalDev=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4
...
[DZGK2] 2020/01/13 16:48:47.743876 lowlevel.go:75: DEBUG: insert; folder="cmpcj-twatt" device=O3ET7QT-RNUQHPX-APW6PNS-QPAVQWY-PU43DGW-CG455OH-XWBVLQB-NLDTSAA File{Name:"20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai", Sequence:632068, Permissions:0644, ModTime:2009-03-08 17:23:26 +0000 GMT, Version:{[{O3ET7QT 1}]}, Length:1300419, Deleted:false, Invalid:false, LocalFlags:0x0, NoPermissions:false, BlockSize:131072, Blocks:[Block{0/131072/3987058061/17b4374edae4abb683d8b56a4a3c0dd63a463d63bc23b2fa215571b08107d0e3} Block{131072/131072/3501716483/36a714145f5211f9bc38f9c4964e4108b4c898d46cdf338248fa96477f6a0cf4} Block{262144/131072/41453113/cee5e9e56e7e722b6090732142f1c84e00682ff2e539b1fcf966f80071095194} Block{393216/131072/3755594029/f3a418fb094538821c5e3f74cc2d4e5f08ba0097ae452875781fd025b669cfac} Block{524288/131072/985730212/52d4be8a0257f454c4ecce3121507b622340e71276e3b80f763bb5e4918b0165} Block{655360/131072/853818378/23e9b5dc8413a03bba4979bb1d62563dbf3332e06be6b1e94a7ebe8d760755f1} Block{786432/131072/999585817/7c087a5c9137cf76c842220f6a254e1966e6b730ea34ca8e19d8d87da66681f2} Block{917504/131072/1633936707/316d3dc7358210d9562d70c818dc43e22ecd2103c4e28b608244f5ec6a037eb9} Block{1048576/131072/588696647/649a73dc56f4f2c8f09af063e1f97c3041fe9e939e8cc500c225e3f02b869908} Block{1179648/120771/1384364102/34a61b3f75811733163cc6fbffd9c557688f6ee7d83074ef49e79bcbd6deb113}]}
[DZGK2] 2020/01/13 16:48:47.743927 transactions.go:133: DEBUG: update global; folder="cmpcj-twatt" device=O3ET7QT-RNUQHPX-APW6PNS-QPAVQWY-PU43DGW-CG455OH-XWBVLQB-NLDTSAA file="20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" version={[{O3ET7QT 1}]} invalid=false
[DZGK2] 2020/01/13 16:48:47.744120 transactions.go:177: DEBUG: new global for "20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" after update: {{{[{BGRPGB2 1}]}, 7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4}, {{[{O3ET7QT 1}]}, O3ET7QT-RNUQHPX-APW6PNS-QPAVQWY-PU43DGW-CG455OH-XWBVLQB-NLDTSAA}}
...
[DZGK2] 2020/01/13 16:48:49.524230 lowlevel.go:548: DEBUG: need folder="cmpcj-twatt" device=O3ET7QT-RNUQHPX-APW6PNS-QPAVQWY-PU43DGW-CG455OH-XWBVLQB-NLDTSAA name="20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" have=true invalid=false haveV={[{O3ET7QT 1}]} globalV={[{BGRPGB2 1}]} globalDev=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4
...
[DZGK2] 2020/01/13 16:55:09.698525 lowlevel.go:548: DEBUG: need folder="cmpcj-twatt" device=O3ET7QT-RNUQHPX-APW6PNS-QPAVQWY-PU43DGW-CG455OH-XWBVLQB-NLDTSAA name="20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" have=true invalid=false haveV={[{O3ET7QT 1}]} globalV={[{BGRPGB2 1}]} globalDev=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4
...
[DZGK2] 2020/01/13 16:55:18.079197 lowlevel.go:548: DEBUG: need folder="cmpcj-twatt" device=O3ET7QT-RNUQHPX-APW6PNS-QPAVQWY-PU43DGW-CG455OH-XWBVLQB-NLDTSAA name="20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" have=true invalid=false haveV={[{O3ET7QT 1}]} globalV={[{BGRPGB2 1}]} globalDev=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4
...
[DZGK2] 2020/01/13 16:55:45.126982 lowlevel.go:548: DEBUG: need folder="cmpcj-twatt" device=O3ET7QT-RNUQHPX-APW6PNS-QPAVQWY-PU43DGW-CG455OH-XWBVLQB-NLDTSAA name="20-29_Interests/21_Art/21.03_Flowers/Daffodil.ai" have=true invalid=false haveV={[{O3ET7QT 1}]} globalV={[{BGRPGB2 1}]} globalDev=7777777-777777N-7777777-777777N-7777777-777777N-7777777-77777Q4

My version of grep doesn’t have -P. Using -E gives lines with the file as well as any line containing INFO, and there is a lot of noise from filenames containing INFO. The result is only 27k, though…
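Assuming the INFO lines all carry the level with a trailing colon, as in the line quoted earlier, matching " INFO: " instead of bare INFO should cut most of the filename noise. A sketch, using hypothetical stand-in lines for the real 1.5 GB log:

```shell
# Hypothetical sample lines standing in for the real trace output:
cat > sample.log <<'EOF'
[O3ET7] 16:12:30 INFO: Reinitializing delta index IDs
[O3ET7] 2020/01/13 16:45:07 lowlevel.go:548: DEBUG: need name="21.03_Flowers/Daffodil.ai" have=false
[O3ET7] 2020/01/13 16:45:08 lowlevel.go:548: DEBUG: need name="notes/INFO-dump.txt" have=false
EOF

# " INFO: " (space, level, colon, space) matches real INFO log lines
# but not filenames that merely contain the string INFO, so only the
# first two lines count here:
grep -cE 'Daffodil\.ai| INFO: ' sample.log
```

On the real log, point the command at the trace file and drop -c to see the matching lines themselves.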