Folders "Up to Date" but stuck in "Syncing" with empty "Out of Sync Items"

I have had this problem a couple of times, but still have no idea what the cause may be. Right now, I have two folders that appear to be fully in sync and are marked “Up to Date”, yet on one device, the other device is stuck in a “Syncing” state with an empty list of “Out of Sync Items”.

This is how the folder state looks on both devices:

and then on Device A, Device B is stuck in this “Syncing” state:

but on Device B, there is no such problem with Device A:

I have a feeling that this happens when similar/same files are created or deleted simultaneously on both sides. I have not been able to reproduce the problem in my test environment though. I have also been unable to find any clues in the logs.

However, the thing is that the folder itself does appear to be fully synced. If I make further changes, they also seem to be synced properly. It is only Syncthing itself that gets stuck in this state. The workaround is, of course, to either remove and re-add the folder, or wipe out the database. I would like to find the cause of the issue, if possible.

Can anyone help me diagnose this more thoroughly?


Empty lists on the remotes - that’s a new one.

What Syncthing versions are involved?

One step is to run the index check on both sides (stindex -mode idxck) to check whether there is something wrong with the stored file information.

Without a way to reproduce, or debug logs from when it happens (probably model and db, which means lots and lots of logging, easily GBs), it’s virtually impossible to say what happened. In the usual logs, panics or unclean shutdowns would be of interest, but I assume those would have counted as the clues you didn’t find.

Before you do that: recalculating metadata by repairing the database (STRECHECKDBEVERY=1s) or by dropping the delta indexes (-reset-deltas) are less drastic measures that may already resolve the problem.

I have had STRECHECKDBEVERY=1s set on my devices for some time already, so I can say for sure that it does not help here. I would like to find the culprit, so I will abstain from using -reset-deltas or similar for now. As I said, the folder keeps syncing properly, regardless of the stuck items :upside_down_face:.

I have this problem on two different sets of devices (A-B and A-C). Each time a different folder is involved, but I did not bother to include screenshots from the other one, as it looks exactly the same. The two devices listed in this thread both run v1.10.0-rc.2 (Windows, x86-32). The other one is on v1.9.0 (Android, ARM), but I had seen this issue in the past too, so I do not believe that this is a “new” regression.

I will run stindex -mode idxck and report back later. I should also still have some of the older log files, which may include useful information, so I will try to check them again. They are just the normal ones though, so no debug options there.

I’d really advise against that. Maybe you have little enough data or fast enough storage that it’s not a noticeable impediment, but those checks are heavy. Plus, as you say, you bulldoze over everything every time.

However, the fact itself is interesting: it means that despite recalculating metadata, the discrepancy between the metadata (showing “Syncing”) and the actual info in the database (the empty list) persists. That should be impossible, as recalculating and creating the list in the UI do the same thing. Can you check that for the folder in question on Device A, you did indeed get a log line saying “Stored folder metadata for … is … old; recalculating”? (That should happen due to STRECHECKDBEVERY.)
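The argument above can be sketched in code: if the cached folder summary and the “Out of Sync Items” list are both derived from the same per-file records, they cannot disagree. This is a hypothetical simplification in Go (the names `fileRecord`, `recalc`, `listNeeded` are illustrative, not Syncthing’s actual data structures):

```go
package main

import "fmt"

// fileRecord is a stand-in for one per-file entry in the database.
type fileRecord struct {
	name   string
	needed bool // true if this device still needs to sync the file
}

// folderMeta is a stand-in for the cached folder summary that drives
// the "Syncing" state in the UI.
type folderMeta struct {
	neededCount int
}

// recalc rebuilds the summary from the records, as the periodic
// metadata recheck is described to do (simplified sketch).
func recalc(files []fileRecord) folderMeta {
	m := folderMeta{}
	for _, f := range files {
		if f.needed {
			m.neededCount++
		}
	}
	return m
}

// listNeeded builds the "Out of Sync Items" list from the same records.
func listNeeded(files []fileRecord) []string {
	var out []string
	for _, f := range files {
		if f.needed {
			out = append(out, f.name)
		}
	}
	return out
}

func main() {
	files := []fileRecord{{"a.txt", false}, {"b.txt", true}}
	// Both values derive from the same records, so they must agree;
	// the bug discussed here is that in practice they did not.
	fmt.Println(recalc(files).neededCount == len(listNeeded(files))) // true
}
```

In this model, a nonzero cached `neededCount` alongside an empty list can only happen if the summary is stale or computed from different records, which is exactly the discrepancy being discussed.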


Hmm, so the files had been stuck like that for at least a few days until yesterday morning, when I created this thread. Then, during the day, I had to reboot the operating system on Device A several times, and now it seems that Syncthing has managed to fix the problem by itself. There are no files stuck in “Syncing” in either folder pair anymore.

I also managed to check the log files, and realised that I actually had the model debug facility enabled on Device B, but the older log files had already been overwritten, and the only remaining ones were those created yesterday. In the log files on Device A, with no debugging options enabled, I did not find anything peculiar that could be related to the stuck files.

I am pretty sure that the issue will come back sometime in the future, so I will write back when this happens.

I know, I know, but the setting helped fix some other problems in the past, so I just left it like that. I should probably disable it, although I have not experienced that big of a performance impact because of it. By “enough data”, do you mean that it uses more bandwidth? The devices in question are connected under the same WLAN, so this is not a problem here, fortunately.

I have 32 folders on Device A right now, so yes, there are a lot of “Stored folder metadata for … is … old; recalculating” in the log files.

I’m having the same problem. I’ve been using Syncthing for a few years on a couple of computers, including Android. All versions are updated automatically, and all of my settings are out of the box; nothing was changed manually.

Since this problem is new, I would suggest it’s related to an update. All versions are 1.9.0.

I had exactly the same issue on two remote devices. On the remote end I had tried rescanning, touching the folder, and STRECHECKDBEVERY, and eventually renamed the remote device’s index folder, and still no joy.

In the end, I renamed the receive-only device’s index folder and let it reindex everything. I know it’s not the ideal solution; I had also tried STRECHECKDBEVERY and it made no difference, but after blowing away the local index I think it’s fine now. Certainly, when I open the remote device’s out-of-sync items, I’m now getting items, where before it was blank like tomasz described.

As a thought, could it be that over the various upgrades, the index database simply gets messed up internally, and a wipe and restart clears out any issues?


The issue has come back, this time with 14 deleted items stuck in “Syncing”. My observation is that the problem seems to happen when there are more than two devices involved. The problematic folder is shared between three machines.

I am still unable to reliably reproduce the problem, but until now I have only tried testing with two-instance configurations. I am going to try more experiments with 3 instances of Syncthing and see if anything comes of it.


I had this same problem. :frowning: Running v1.10.0 to sync my desktop on four Linux systems. Of the four, one system showed “Up to Date” for the folder, but it was stuck in “Syncing” for 27 items with an empty “Out of Sync Items” list. This persisted over multiple days, through rescans and (unrelated) hardware reboots.

Originally, I had some systems on v1.9 and some on v1.10, so I upgraded all of them to v1.10.0. The problem persisted.

Finally, I ran ‘syncthing -verbose -reset-database’, which gave me the output: “INFO: Successfully reset database - it will be rebuilt after next start.”

And lo and behold, after restarting Syncthing, it reported ‘Scanning’ followed by ‘Syncing’ for each of the other remote devices and finally (after a short while) ‘Up to Date’.

F.y.i., my systems have 1,490 desktop files in 69 sub-folders, totalling ~549 MiB.

Also noted: while it compared the sync status of the adjacent systems, it reported ‘syncing files’ for all 1,490 individual files. A little disconcerting, as I don’t think it was really syncing any physical files, only comparing file indexes.

It worked for me. YMMV.

I have done a lot of stress testing with a 3-folder configuration, including nested folders, etc., and I must say that I am simply unable to reproduce the issue. Syncthing has so far always managed to sync all the changes with no problems. I should also mention that I am running with Max Conflicts set to 0.

Anyhow, I have now run stindex on the databases from two devices. This is not the same situation as the one described in the first post, as now the “Out of Sync Items” are actually listed, but the devices are still stuck in “Syncing”, so there is that.

This is how the situation looks. I am attaching screenshots of the GUI and the stindex output for one of the out-of-sync files.

Device A

[device] F:4 D:1 N:"folder1/021a4c1d-45ae-4228-8137-56940eda6cf5.ics" V:File{Name:"folder1/021a4c1d-45ae-4228-8137-56940eda6cf5.ics", Sequence:238356, Permissions:0644, ModTime:2020-10-09 19:58:27.4205179 +0900 KST, Version:{[{D4DZUGP 1602241107}]}, Length:0, Deleted:true, Invalid:false, LocalFlags:0x0, NoPermissions:true, BlockSize:131072, Blocks:[], BlocksHash:}
[device] F:4 D:1 N:"folder2/021a4c1d-45ae-4228-8137-56940eda6cf5.ics" V:File{Name:"folder2/021a4c1d-45ae-4228-8137-56940eda6cf5.ics", Sequence:238286, Permissions:0644, ModTime:2020-10-09 19:22:58.4205179 +0900 KST, Version:{[{D4DZUGP 1602239437}]}, Length:311, Deleted:false, Invalid:false, LocalFlags:0x0, NoPermissions:true, BlockSize:131072, Blocks:[Block{0/311/669670638/5ef8052e22ad2b64bd967f73d562c1019bc5688f1c4836702644238320dc7e64}], BlocksHash:e5b434b0f76d0c1f06f5de2f4534c790c8804b1a1b44eb5c9262d8f9fa96ebcf}
[device] F:4 D:3 N:"folder1/021a4c1d-45ae-4228-8137-56940eda6cf5.ics" V:File{Name:"folder1/021a4c1d-45ae-4228-8137-56940eda6cf5.ics", Sequence:236952, Permissions:0644, ModTime:2020-10-09 19:58:27.4205179 +0900 KST, Version:{[{D4DZUGP 1602241107}]}, Length:0, Deleted:true, Invalid:false, LocalFlags:0x0, NoPermissions:true, BlockSize:131072, Blocks:[], BlocksHash:}
[device] F:4 D:3 N:"folder2/021a4c1d-45ae-4228-8137-56940eda6cf5.ics" V:File{Name:"folder2/021a4c1d-45ae-4228-8137-56940eda6cf5.ics", Sequence:236878, Permissions:0644, ModTime:2020-10-09 19:22:58.4205179 +0900 KST, Version:{[{D4DZUGP 1602239437}]}, Length:311, Deleted:false, Invalid:false, LocalFlags:0x0, NoPermissions:true, BlockSize:131072, Blocks:[Block{0/311/669670638/5ef8052e22ad2b64bd967f73d562c1019bc5688f1c4836702644238320dc7e64}], BlocksHash:e5b434b0f76d0c1f06f5de2f4534c790c8804b1a1b44eb5c9262d8f9fa96ebcf}
[device] F:12 D:1 N:"021a4c1d-45ae-4228-8137-56940eda6cf5.ics" V:File{Name:"021a4c1d-45ae-4228-8137-56940eda6cf5.ics", Sequence:149984, Permissions:0644, ModTime:2020-10-09 19:57:31.4205179 +0900 KST, Version:{[{D4DZUGP 1602241051}]}, Length:0, Deleted:true, Invalid:false, LocalFlags:0x0, NoPermissions:true, BlockSize:131072, Blocks:[], BlocksHash:}

Device B

[device] F:1 D:4 N:"021a4c1d-45ae-4228-8137-56940eda6cf5.ics" V:File{Name:"021a4c1d-45ae-4228-8137-56940eda6cf5.ics", Sequence:149984, Permissions:0644, ModTime:2020-10-09 19:57:31.4205179 +0900 KST, Version:{[{D4DZUGP 1602241051}]}, Length:0, Deleted:true, Invalid:false, LocalFlags:0x0, NoPermissions:true, BlockSize:131072, Blocks:[], BlocksHash:}
[device] F:1 D:5 N:"021a4c1d-45ae-4228-8137-56940eda6cf5.ics" V:File{Name:"021a4c1d-45ae-4228-8137-56940eda6cf5.ics", Sequence:198, Permissions:0644, ModTime:2020-10-09 19:58:36.4205179 +0900 KST, Version:{[{7LDIBZ4 1602241116}]}, Length:0, Deleted:true, Invalid:false, LocalFlags:0x0, NoPermissions:true, BlockSize:131072, Blocks:[], BlocksHash:}
[global] F:1 N:"021a4c1d-45ae-4228-8137-56940eda6cf5.ics" V:{{{[{7LDIBZ4 1602241116}]}, {7LDIBZ4}, {}}, {{[{D4DZUGP 1602241051}]}, {D4DZUGP}, {}}}
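For what it’s worth, the global entry lists two separate single-counter versions, {7LDIBZ4 1602241116} and {D4DZUGP 1602241051}, i.e. each device independently recorded its own change. Here is a minimal sketch of version-vector comparison showing why neither version supersedes the other; this is a generic vector-clock comparison for illustration, not Syncthing’s actual implementation:

```go
package main

import "fmt"

// Counter is one {deviceID, clock} pair in a version vector. The IDs and
// values used below are taken from the stindex output above.
type Counter struct {
	ID    string
	Value uint64
}

type Vector []Counter

// get returns the counter value for a device ID, 0 if absent.
func (v Vector) get(id string) uint64 {
	for _, c := range v {
		if c.ID == id {
			return c.Value
		}
	}
	return 0
}

// Compare returns "equal", "greater", "lesser" or "concurrent".
func Compare(a, b Vector) string {
	var aBigger, bBigger bool
	seen := map[string]bool{}
	for _, c := range append(append(Vector{}, a...), b...) {
		if seen[c.ID] {
			continue
		}
		seen[c.ID] = true
		av, bv := a.get(c.ID), b.get(c.ID)
		if av > bv {
			aBigger = true
		} else if av < bv {
			bBigger = true
		}
	}
	switch {
	case aBigger && bBigger:
		return "concurrent"
	case aBigger:
		return "greater"
	case bBigger:
		return "lesser"
	default:
		return "equal"
	}
}

func main() {
	// The two versions from the global entry: each vector has a counter
	// the other lacks, so neither dominates.
	versionA := Vector{{"D4DZUGP", 1602241051}}
	versionB := Vector{{"7LDIBZ4", 1602241116}}
	fmt.Println(Compare(versionA, versionB)) // concurrent
}
```

Two concurrent versions of the same (deleted) file would be consistent with the theory from the first post that simultaneous changes on both sides trigger the problem.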

Is there anything useful in the stindex output?

I think that I have managed to reproduce the issue.