Shared Folder 'Out of Sync' and Other Issues

Hello:

Following on from my ‘Too Many Open Files’ thread: I’m seeing some curious behaviour in the Syncthing setup between my two NAS units.

Each unit is set to share a single shared folder, and has the same file exclusions. They are connected to each other on the LAN. The issues I’m seeing are:

  • One unit is reporting the shared folder to be ‘Out of Sync’. I can’t see from the documentation that this is a recognised reported state;
  • The Global State is not the same between the two units;
  • The two units do not agree on completion state: each unit’s own local reporting differs from what the remote device reports for it (a way to compare these figures outside the GUI is sketched after this list);
  • ‘Out of Sync Items’ is reported far too low for one device (NAS02) - and this device doesn’t appear to be pulling many files (about 4 files in 5 days?). NAS01 is pulling files just fine.
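
For anyone wanting to compare these figures outside the GUI, Syncthing’s REST API exposes the same per-folder database summary. A minimal sketch, assuming the default GUI port; the hostnames, API keys and folder ID are placeholders:

    # Ask each unit for its view of the folder's database. The API key is
    # shown under Actions > Settings > GUI on each unit.
    curl -s -H "X-API-Key: <NAS01-api-key>" \
      "http://nas01:8384/rest/db/status?folder=<folder-id>"
    curl -s -H "X-API-Key: <NAS02-api-key>" \
      "http://nas02:8384/rest/db/status?folder=<folder-id>"
    # Once both sides have exchanged index updates, the globalFiles and
    # globalBytes fields should match; a persistent mismatch points at
    # differing ignore patterns or missing index data.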

Here are screenshots of the two units:

Any thoughts about where I should start looking to get things back on track?

Many thanks,

Pants.

You can check the failed items. The states can disagree due to ignore patterns.
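
If the GUI’s failed items dialog is awkward to work with, the same information rides on the event stream. A sketch using the REST events API; the address and API key are placeholders, and the server-side event filter may only exist on recent builds:

    # Fetch recent FolderErrors events - the per-file pull errors, the
    # same events that later show up in the verbose log in this thread.
    curl -s -H "X-API-Key: <api-key>" \
      "http://localhost:8384/rest/events?events=FolderErrors&limit=10"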

Hi Audrius:

Thanks for replying.

There was 1 failed item, caused when I restarted Syncthing on NAS01.

But shouldn’t Global State still be consistent between machines? And the completion state of each machine should be reported consistently across GUIs too?

Thanks,

Pants.

Is it making progress in general? I think the global state can still be influenced by ignore patterns that were mismatched in the past.

In general it should be consistent once syncing has finished. During sync they can differ, as changes first need to propagate before they can be displayed. Possibly everything in the main Syncthing process is just fine, and once syncing finishes the UI will be consistent as well.

However, there are indeed a few things that look wrong in your UI: NAS2 has more folders in the local state than in the global state, which I think should not be possible. Then there are no out of sync items on NAS2 despite NAS1 reporting that it has out of sync items (though that might also be transitional). Another thing that confuses me is the low CPU utilization while handling a huge amount of data: is this just incorrect reporting (i.e. is the CPU in fact under high load)?
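
To see what each side currently thinks of the other, the remote completion is also queryable directly; a sketch, with device and folder IDs as placeholders:

    # On NAS01: how complete does it consider NAS02 to be (run the
    # mirror-image query against NAS02 as well)?
    curl -s -H "X-API-Key: <api-key>" \
      "http://nas01:8384/rest/db/completion?device=<NAS02-device-id>&folder=<folder-id>"
    # The percentage should approach 100 on both sides once index updates
    # have propagated; a value stuck low suggests missing index data
    # rather than slow transfers.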

Hello:

NAS01 is making fine progress - but NAS02 is making very slow progress - less than 1 file per day.

I’ll re-check them - but they should have exactly the same ignore patterns (set up to ignore extended attribute files the Synology creates as it scans files).

No - it really is running that low.

Here’s an excerpt from the verbose log on NAS02:

[OVPYX] 2018/01/15 16:24:01.499883 model.go:1331: DEBUG: model@0xc42007e240 REQ(in): GSXHK4J-*******-*******-*******-*******-*******-*******-*******: "bvkmk-*****" / "file/path/1.html" o=0 s=4904 t=false
[OVPYX] 2018/01/15 16:25:27.099922 model.go:1331: DEBUG: model@0xc42007e240 REQ(in): GSXHK4J-*******-*******-*******-*******-*******-*******-*******: "bvkmk-*****" / "file/path/2.adv" o=0 s=21993 t=false
[OVPYX] 2018/01/15 16:25:31.223313 model.go:1331: DEBUG: model@0xc42007e240 REQ(in): GSXHK4J-*******-*******-*******-*******-*******-*******-*******: "bvkmk-*****" / "file/path/3/11479" o=0 s=4250 t=false

Does that indicate anything about what it’s actually doing at the moment?

Thanks,

Pants.

This is just answering requests. I think if you check disk usage, disks will be fully utilized and thrashing, as I suspect it’s scanning and supplying data at the same time.
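
A quick way to check that, assuming iostat (from sysstat) is available on the DSM shell; if not, the raw counters in /proc/diskstats tell the same story:

    # Extended per-device stats every 5 seconds; %util near 100 means the
    # disks are saturated, %util near 0 means Syncthing is not I/O bound.
    iostat -x 5
    # Fallback if iostat is not installed on the NAS:
    cat /proc/diskstats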

Hello:

Thanks for your thoughts.

No - it doesn’t appear that way - disk utilisation is very low.

Does file pulling wait until the scan is complete, or does it run concurrently? I’m wondering whether the scan has stalled for some reason. Is it worth looking down that path, do you think?

Thanks,

Pants.

How do you know that there is low disk activity?

You can enable scanner logging to see what it’s doing.
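
A sketch of enabling it, assuming Syncthing is started from a shell; on recent builds the debug facility can also be toggled at runtime over REST (API key is a placeholder):

    # Restart with scanner debug logging:
    STTRACE=scanner syncthing
    # Or toggle it without a restart:
    curl -s -X POST -H "X-API-Key: <api-key>" \
      "http://localhost:8384/rest/system/debug?enable=scanner"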

Hello:

Simples!

Thanks - will try that now.

Thanks,

Pants.

Hello:

Ok - it looked like the scanner completed - but for some reason Syncthing thinks NAS02 is only 1 file behind. I know this isn’t true - I can see some top-level folders (with many GB of data) in the shared folder on NAS01 which haven’t replicated to NAS02.

Here’s what the log revealed:

[OVPYX] 2018/01/15 23:45:12.811214 folder.go:123: INFO: Completed initial scan of readwrite folder "NAS" (bvkmk-*****)
[OVPYX] 2018/01/15 23:45:12.811293 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now idle
[OVPYX] 2018/01/15 23:45:12.811535 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now syncing
[OVPYX] 2018/01/15 23:45:46.835518 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:45:47.864196 rwfolder.go:1748: INFO: Puller (folder "NAS" (bvkmk-*****), file "file/path/1/.DS_Store"): finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:45:47.864364 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:46:12.666126 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:46:12.767074 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:46:36.697754 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:46:36.801728 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:46:36.801818 rwfolder.go:294: INFO: Folder "NAS" (bvkmk-*****) isn't making progress. Pausing puller for 1m0s.
[OVPYX] 2018/01/15 23:46:36.802155 verboseservice.go:48: VERBOSE: FolderErrors events.Event{SubscriptionID:80, GlobalID:80, Time:time.Time{wall:0xbe8f6d372fc8d81f, ext:22895691818283, loc:(*time.Location)(0x119f800)}, Type:2097152, Data:map[string]interface {}{"errors":[]model.fileError{model.fileError{Path:"file/path/1/.DS_Store", Err:"finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later"}}, "folder":"bvkmk-*****"}}
[OVPYX] 2018/01/15 23:46:36.802218 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now idle
[OVPYX] 2018/01/15 23:46:36.802243 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now syncing
[OVPYX] 2018/01/15 23:47:01.153150 verboseservice.go:48: VERBOSE: Summary for folder "bvkmk-*****" is map[localFiles:4783351 globalDirectories:1231435 localBytes:36687065420323 globalBytes:44808573788174 needBytes:6148 needDirectories:0 localDirectories:1243622 inSyncBytes:44808573782026 state:syncing needDeletes:0 globalDeleted:433070 needFiles:1 globalFiles:5231442 sequence:16931096 inSyncFiles:5231441 localDeleted:373986 localSymlinks:14048 version:16931096 globalSymlinks:14048 needSymlinks:0]
[OVPYX] 2018/01/15 23:47:06.569122 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:47:06.669925 rwfolder.go:1748: INFO: Puller (folder "NAS" (bvkmk-*****), file "file/path/1/.DS_Store"): finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:47:06.670424 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:47:32.868678 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:47:33.040854 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:48:00.039678 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:48:00.222843 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:48:00.222949 verboseservice.go:48: VERBOSE: FolderErrors events.Event{SubscriptionID:90, GlobalID:90, Time:time.Time{wall:0xbe8f6d4c0d4829c9, ext:22979112959711, loc:(*time.Location)(0x119f800)}, Type:2097152, Data:map[string]interface {}{"folder":"bvkmk-*****", "errors":[]model.fileError{model.fileError{Path:"file/path/1/.DS_Store", Err:"finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later"}}}}
[OVPYX] 2018/01/15 23:48:00.222976 rwfolder.go:294: INFO: Folder "NAS" (bvkmk-*****) isn't making progress. Pausing puller for 1m0s.
[OVPYX] 2018/01/15 23:48:00.223029 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now idle
[OVPYX] 2018/01/15 23:48:00.223177 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now syncing
[OVPYX] 2018/01/15 23:48:25.585517 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:48:25.740217 rwfolder.go:1748: INFO: Puller (folder "NAS" (bvkmk-*****), file "file/path/1/.DS_Store"): finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:48:25.740320 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:48:51.692604 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:48:51.937545 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:49:18.996140 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:49:19.131341 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:49:19.131464 verboseservice.go:48: VERBOSE: FolderErrors events.Event{SubscriptionID:99, GlobalID:99, Time:time.Time{wall:0xbe8f6d5fc7d3e4b3, ext:23058021453769, loc:(*time.Location)(0x119f800)}, Type:2097152, Data:map[string]interface {}{"folder":"bvkmk-*****", "errors":[]model.fileError{model.fileError{Path:"file/path/1/.DS_Store", Err:"finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later"}}}}
[OVPYX] 2018/01/15 23:49:19.131493 rwfolder.go:294: INFO: Folder "NAS" (bvkmk-*****) isn't making progress. Pausing puller for 1m0s.
[OVPYX] 2018/01/15 23:49:19.131619 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now idle
[OVPYX] 2018/01/15 23:49:19.131733 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now syncing
[OVPYX] 2018/01/15 23:49:45.235340 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:49:45.405802 rwfolder.go:1748: INFO: Puller (folder "NAS" (bvkmk-*****), file "file/path/1/.DS_Store"): finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:49:45.406144 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:50:12.504596 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:50:12.677974 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:50:38.830219 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:50:38.941417 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:50:38.941532 verboseservice.go:48: VERBOSE: FolderErrors events.Event{SubscriptionID:108, GlobalID:108, Time:time.Time{wall:0xbe8f6d73b81ca743, ext:23137831528547, loc:(*time.Location)(0x119f800)}, Type:2097152, Data:map[string]interface {}{"errors":[]model.fileError{model.fileError{Path:"file/path/1/.DS_Store", Err:"finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later"}}, "folder":"bvkmk-*****"}}
[OVPYX] 2018/01/15 23:50:38.941737 rwfolder.go:294: INFO: Folder "NAS" (bvkmk-*****) isn't making progress. Pausing puller for 2m0s.
[OVPYX] 2018/01/15 23:50:38.941855 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now idle
[OVPYX] 2018/01/15 23:50:38.943439 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now syncing
[OVPYX] 2018/01/15 23:51:04.295652 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:51:04.577233 rwfolder.go:1748: INFO: Puller (folder "NAS" (bvkmk-*****), file "file/path/1/.DS_Store"): finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:51:04.577578 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:51:30.835442 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:51:31.017902 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:51:56.507589 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:51:56.612000 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:51:56.612096 verboseservice.go:48: VERBOSE: FolderErrors events.Event{SubscriptionID:117, GlobalID:117, Time:time.Time{wall:0xbe8f6d87247a26fc, ext:23215502111772, loc:(*time.Location)(0x119f800)}, Type:2097152, Data:map[string]interface {}{"folder":"bvkmk-*****", "errors":[]model.fileError{model.fileError{Path:"file/path/1/.DS_Store", Err:"finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later"}}}}
[OVPYX] 2018/01/15 23:51:56.612252 rwfolder.go:294: INFO: Folder "NAS" (bvkmk-*****) isn't making progress. Pausing puller for 4m0s.
[OVPYX] 2018/01/15 23:51:56.612435 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now idle
[OVPYX] 2018/01/15 23:55:20.150882 verboseservice.go:48: VERBOSE: Completion for folder "bvkmk-*****" on device GSXHK4J-*******-*******-*******-*******-*******-*******-******* is 21.73096117185034%
[OVPYX] 2018/01/15 23:55:56.612710 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now syncing
[OVPYX] 2018/01/15 23:56:21.378949 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:56:21.487711 rwfolder.go:1748: INFO: Puller (folder "NAS" (bvkmk-*****), file "file/path/1/.DS_Store"): finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:56:21.487845 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:56:45.435036 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:56:45.542141 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:57:09.497611 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/15 23:57:09.726808 rwfolder.go:294: INFO: Folder "NAS" (bvkmk-*****) isn't making progress. Pausing puller for 8m0s.
[OVPYX] 2018/01/15 23:57:09.726883 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/15 23:57:09.726986 verboseservice.go:48: VERBOSE: FolderErrors events.Event{SubscriptionID:127, GlobalID:127, Time:time.Time{wall:0xbe8f6dd56b51db87, ext:23528616911537, loc:(*time.Location)(0x119f800)}, Type:2097152, Data:map[string]interface {}{"folder":"bvkmk-*****", "errors":[]model.fileError{model.fileError{Path:"file/path/1/.DS_Store", Err:"finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later"}}}}
[OVPYX] 2018/01/15 23:57:09.727014 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now idle
[OVPYX] 2018/01/15 23:57:33.427328 verboseservice.go:48: VERBOSE: Summary for folder "bvkmk-*****" is map[inSyncFiles:5231441 inSyncBytes:44808573782026 globalDeleted:433070 localSymlinks:14048 needSymlinks:0 needFiles:1 localFiles:4783351 state:idle localBytes:36687065420323 needBytes:6148 globalDirectories:1231435 globalBytes:44808573788174 version:16931096 sequence:16931096 needDirectories:0 needDeletes:0 globalSymlinks:14048 localDeleted:373986 globalFiles:5231442 localDirectories:1243622]
[OVPYX] 2018/01/15 23:57:40.929135 structs.go:42: INFO: Removing NAT port mapping: external TCP address 12.34.56.78:23215, NAT UPnP_BTHomeHub5.0B-1_5CDC965E12BF/WANDevice/urn:upnp-org:serviceId:WANIPConn1/urn:schemas-upnp-org:service:WANIPConnection:1/http://192.168.1.254:52826/ctl/IPConn is no longer available.
[OVPYX] 2018/01/15 23:57:40.929198 structs.go:32: INFO: New NAT port mapping: external TCP address 12.34.56.78:38501 to local address 0.0.0.0:22000.
[OVPYX] 2018/01/15 23:57:41.528901 structs.go:42: INFO: Removing NAT port mapping: external TCP address 12.34.56.78:41172, NAT UPnP_BTHomeHub5.0B-1_5CDC965E12BF/WANDevice/urn:upnp-org:serviceId:WANPPPConn1/urn:schemas-upnp-org:service:WANPPPConnection:1/http://192.168.1.254:52826/ctl/PPPConn is no longer available.
[OVPYX] 2018/01/15 23:57:41.528962 structs.go:32: INFO: New NAT port mapping: external TCP address 12.34.56.78:44675 to local address 0.0.0.0:22000.
[OVPYX] 2018/01/15 23:57:51.537456 verboseservice.go:48: VERBOSE: Listen address tcp://0.0.0.0:22000 resolution has changed: lan addresses: [tcp://0.0.0.0:22000] wan addresses: [tcp://0.0.0.0:22000 tcp://0.0.0.0:38501 tcp://0.0.0.0:38501 tcp://0.0.0.0:44675 tcp://0.0.0.0:44675]
[OVPYX] 2018/01/16 00:04:40.706351 verboseservice.go:48: VERBOSE: Completion for folder "bvkmk-*****" on device GSXHK4J-*******-*******-*******-*******-*******-*******-******* is 21.73096117185034%
[OVPYX] 2018/01/16 00:05:09.727246 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now syncing
...
[OVPYX] 2018/01/16 06:22:03.239678 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/16 06:22:03.760696 rwfolder.go:1748: INFO: Puller (folder "NAS" (bvkmk-*****), file "file/path/1/.DS_Store"): finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/16 06:22:03.760916 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/16 06:22:27.421125 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/16 06:22:27.520540 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/16 06:22:51.168289 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/16 06:22:51.280270 rwfolder.go:294: INFO: Folder "NAS" (bvkmk-*****) isn't making progress. Pausing puller for 1h4m0s.
[OVPYX] 2018/01/16 06:22:51.280325 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/16 06:22:51.280601 verboseservice.go:48: VERBOSE: FolderErrors events.Event{SubscriptionID:228, GlobalID:228, Time:time.Time{wall:0xbe8f846ed0b443e6, ext:46670170375942, loc:(*time.Location)(0x119f800)}, Type:2097152, Data:map[string]interface {}{"folder":"bvkmk-*****", "errors":[]model.fileError{model.fileError{Path:"file/path/1/.DS_Store", Err:"finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later"}}}}
[OVPYX] 2018/01/16 06:22:51.280639 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now idle
[OVPYX] 2018/01/16 06:23:14.600525 verboseservice.go:48: VERBOSE: Summary for folder "bvkmk-*****" is map[globalSymlinks:14048 globalFiles:5231442 globalDirectories:1231435 globalDeleted:433070 needDeletes:0 localDirectories:1243622 needSymlinks:0 localDeleted:373986 localSymlinks:14048 inSyncBytes:44808573782026 localBytes:36687065420323 state:idle needFiles:1 needBytes:6148 version:16931096 sequence:16931096 globalBytes:44808573788174 localFiles:4783351 inSyncFiles:5231441 needDirectories:0]
[OVPYX] 2018/01/16 06:29:46.315631 verboseservice.go:48: VERBOSE: Completion for folder "bvkmk-*****" on device GSXHK4J-*******-*******-*******-*******-*******-*******-******* is 21.73096117185034%
[OVPYX] 2018/01/16 06:29:50.738714 structs.go:42: INFO: Removing NAT port mapping: external TCP address 12.34.56.78:44675, NAT UPnP_BTHomeHub5.0B-1_5CDC965E12BF/WANDevice/urn:upnp-org:serviceId:WANPPPConn1/urn:schemas-upnp-org:service:WANPPPConnection:1/http://192.168.1.254:52826/ctl/PPPConn is no longer available.
[OVPYX] 2018/01/16 06:29:50.738766 structs.go:32: INFO: New NAT port mapping: external TCP address 12.34.56.78:41172 to local address 0.0.0.0:22000.
[OVPYX] 2018/01/16 06:29:51.082722 structs.go:42: INFO: Removing NAT port mapping: external TCP address 12.34.56.78:38501, NAT UPnP_BTHomeHub5.0B-1_5CDC965E12BF/WANDevice/urn:upnp-org:serviceId:WANIPConn1/urn:schemas-upnp-org:service:WANIPConnection:1/http://192.168.1.254:52826/ctl/IPConn is no longer available.
[OVPYX] 2018/01/16 06:29:51.082771 structs.go:32: INFO: New NAT port mapping: external TCP address 12.34.56.78:23215 to local address 0.0.0.0:22000.
[OVPYX] 2018/01/16 06:30:01.094422 verboseservice.go:48: VERBOSE: Listen address tcp://0.0.0.0:22000 resolution has changed: lan addresses: [tcp://0.0.0.0:22000] wan addresses: [tcp://0.0.0.0:22000 tcp://0.0.0.0:23215 tcp://0.0.0.0:23215 tcp://0.0.0.0:41172 tcp://0.0.0.0:41172]
[OVPYX] 2018/01/16 07:00:01.002261 structs.go:42: INFO: Removing NAT port mapping: external TCP address 12.34.56.78:23215, NAT UPnP_BTHomeHub5.0B-1_5CDC965E12BF/WANDevice/urn:upnp-org:serviceId:WANIPConn1/urn:schemas-upnp-org:service:WANIPConnection:1/http://192.168.1.254:52826/ctl/IPConn is no longer available.
[OVPYX] 2018/01/16 07:00:01.002322 structs.go:32: INFO: New NAT port mapping: external TCP address 12.34.56.78:38501 to local address 0.0.0.0:22000.
[OVPYX] 2018/01/16 07:00:01.593645 structs.go:42: INFO: Removing NAT port mapping: external TCP address 12.34.56.78:41172, NAT UPnP_BTHomeHub5.0B-1_5CDC965E12BF/WANDevice/urn:upnp-org:serviceId:WANPPPConn1/urn:schemas-upnp-org:service:WANPPPConnection:1/http://192.168.1.254:52826/ctl/PPPConn is no longer available.
[OVPYX] 2018/01/16 07:00:01.593694 structs.go:32: INFO: New NAT port mapping: external TCP address 12.34.56.78:44675 to local address 0.0.0.0:22000.
[OVPYX] 2018/01/16 07:00:11.602024 verboseservice.go:48: VERBOSE: Listen address tcp://0.0.0.0:22000 resolution has changed: lan addresses: [tcp://0.0.0.0:22000] wan addresses: [tcp://0.0.0.0:22000 tcp://0.0.0.0:38501 tcp://0.0.0.0:38501 tcp://0.0.0.0:44675 tcp://0.0.0.0:44675]
[OVPYX] 2018/01/16 07:26:51.280793 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now syncing
[OVPYX] 2018/01/16 07:27:15.236716 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/16 07:27:15.817436 rwfolder.go:1748: INFO: Puller (folder "NAS" (bvkmk-*****), file "file/path/1/.DS_Store"): finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/16 07:27:15.817629 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/16 07:27:39.477422 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/16 07:27:39.573854 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/16 07:28:03.215867 verboseservice.go:48: VERBOSE: Started syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file)
[OVPYX] 2018/01/16 07:28:03.333601 verboseservice.go:48: VERBOSE: Finished syncing "bvkmk-*****" / "file/path/1/.DS_Store" (update file): pull: peers who had this file went away, or the file has changed while syncing. will retry later
[OVPYX] 2018/01/16 07:28:03.333879 rwfolder.go:294: INFO: Folder "NAS" (bvkmk-*****) isn't making progress. Pausing puller for 1h4m0s.
[OVPYX] 2018/01/16 07:28:03.334006 verboseservice.go:48: VERBOSE: FolderErrors events.Event{SubscriptionID:241, GlobalID:241, Time:time.Time{wall:0xbe8f8840d3e5ef33, ext:50582223962667, loc:(*time.Location)(0x119f800)}, Type:2097152, Data:map[string]interface {}{"folder":"bvkmk-*****", "errors":[]model.fileError{model.fileError{Path:"file/path/1/.DS_Store", Err:"finisher: pull: peers who had this file went away, or the file has changed while syncing. will retry later"}}}}
[OVPYX] 2018/01/16 07:28:03.334037 verboseservice.go:48: VERBOSE: Folder "bvkmk-*****" is now idle
[OVPYX] 2018/01/16 07:28:26.668128 verboseservice.go:48: VERBOSE: Summary for folder "bvkmk-*****" is map[globalSymlinks:14048 inSyncBytes:44808573782026 version:16931096 localBytes:36687065420323 needDirectories:0 needDeletes:0 globalDirectories:1231435 needSymlinks:0 localDeleted:373986 localSymlinks:14048 sequence:16931096 localFiles:4783351 localDirectories:1243622 needFiles:1 inSyncFiles:5231441 state:idle globalDeleted:433070 needBytes:6148 globalFiles:5231442 globalBytes:44808573788174]
[OVPYX] 2018/01/16 07:30:11.108626 structs.go:42: INFO: Removing NAT port mapping: external TCP address 12.34.56.78:38501, NAT UPnP_BTHomeHub5.0B-1_5CDC965E12BF/WANDevice/urn:upnp-org:serviceId:WANIPConn1/urn:schemas-upnp-org:service:WANIPConnection:1/http://192.168.1.254:52826/ctl/IPConn is no longer available.
[OVPYX] 2018/01/16 07:30:11.108683 structs.go:32: INFO: New NAT port mapping: external TCP address 12.34.56.78:23215 to local address 0.0.0.0:22000.
[OVPYX] 2018/01/16 07:30:11.455273 structs.go:42: INFO: Removing NAT port mapping: external TCP address 12.34.56.78:44675, NAT UPnP_BTHomeHub5.0B-1_5CDC965E12BF/WANDevice/urn:upnp-org:serviceId:WANPPPConn1/urn:schemas-upnp-org:service:WANPPPConnection:1/http://192.168.1.254:52826/ctl/PPPConn is no longer available.
[OVPYX] 2018/01/16 07:30:11.455733 structs.go:32: INFO: New NAT port mapping: external TCP address 12.34.56.78:41172 to local address 0.0.0.0:22000.
[OVPYX] 2018/01/16 07:30:21.464456 verboseservice.go:48: VERBOSE: Listen address tcp://0.0.0.0:22000 resolution has changed: lan addresses: [tcp://0.0.0.0:22000] wan addresses: [tcp://0.0.0.0:22000 tcp://0.0.0.0:23215 tcp://0.0.0.0:23215 tcp://0.0.0.0:41172 tcp://0.0.0.0:41172]
[OVPYX] 2018/01/16 07:35:01.714054 verboseservice.go:48: VERBOSE: Completion for folder "bvkmk-*****" on device GSXHK4J-*******-*******-*******-*******-*******-*******-******* is 21.73096117185034%

Does this suggest that NAS02 isn’t receiving index updates from NAS01? Any thoughts about where to go digging next?

Many thanks,

Pants.

I wouldn’t worry too much about that failed .DS_Store file; it might just not exist anymore on NAS01, but as NAS01 is in a very long syncing operation, it hasn’t noticed that in a scan yet.

As for why NAS02 is showing as (almost) in sync while it is clearly not (and sitting almost idle): missing index updates from NAS01 could be the cause. If NAS01 is so overloaded that it doesn’t send them, that would explain it, but it seems unlikely given the screenshots. You can enable model debug logging on NAS01 and look for “sendIndexes for” (or “Sending indexes for”) messages.
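
A sketch of what that looks like in practice, assuming Syncthing runs from a shell on NAS01; the case-insensitive grep covers both message variants:

    # Enable model debugging and watch for outgoing index batches:
    STTRACE=model syncthing 2>&1 | grep -iE "sendindexes|sending indexes"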

What also strikes me as odd: you said there are many GB of data not in sync and, in the first post, that NAS01 is pulling files just fine; however, on the screenshot the total data transferred is <1 GB.

Do you have rate limiting enabled or something like that?

Hello:

Yep - will do that and report back.

No - no rate limiting. The only special things I have set are as follows (roughly how these fit together at launch is sketched after the list):

  • Exclusion list (same on both units) for extended attributes files;
  • Running with STNOUPGRADE (I’m running the build that Jakob made to resolve ‘Too Many Open Files’, and it kept wanting to upgrade to the latest release);
  • Verbose logging;
  • STTRACE=scanner;
  • Turned off Global Discovery (both units are on the same LAN).
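
Roughly how that comes together at launch - the log file path is illustrative, and on the Synology the package’s own start script wraps this:

    export STNOUPGRADE=1      # stay on Jakob's custom build
    export STTRACE=scanner    # scanner debug logging
    syncthing -verbose -logfile=/volume1/@appstore/syncthing/syncthing.log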

Thanks,

Pants.

Hello:

I’ve been running with model logging since my last post, and I’ve seen the following:

  • NAS01 didn’t send any indexes for quite a while - but then sent a bunch of them just recently, about 36 hours after Syncthing was started;
  • NAS02 responded promptly and synced a bunch of files - but they appeared to be 99.9% metadata files;
  • There were a few ‘update files’ scattered about - NAS02 is now 13 files more up-to-date than it was before. However, it now seems to be focussed on pulling the same .DS_Store file as was noted previously.

I’m happy to upload the logs to the FTP that Jakob provided me if that would be useful; they contain confidential file names, so I can’t post them publicly.

Any thoughts?!

Thanks,

Pants.

That, together with the screenshots, points to NAS02 being in sync according to what it knows, and NAS01 pulling stuff from NAS02. This pulling takes a long time and produces some sync conflicts (or a lot), and once there are enough conflicts they are saved to the db and sent to NAS02. As NAS01 is stuck syncing (i.e. doesn’t scan), the “.DS_Store” issue does not resolve, but that doesn’t hurt. That the remote device completion for NAS02 as displayed on NAS01 is very low might also be due to conflicts: the conflicts detected in NAS01’s pulling iterations are sent, but if NAS01 “loses” a conflict it will only get resolved in the next puller iteration - and that is a long while off (when NAS01 is more or less in sync for the first time). This might be optimized by checking for conflicts when computing remote device completions, but I don’t know whether it is worth the effort. Edit: I actually checked this and it is wrong: winning conflicts are not wrongly reported as needed.
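
If you want to see whether conflicts are actually piling up, Syncthing writes conflict copies alongside the originals with a recognisable name; a sketch, with the folder root as a placeholder:

    # Conflict copies are named <file>.sync-conflict-<date>-<time>[.<ext>]:
    find /volume1/NAS -name "*.sync-conflict-*" | head -20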

What remains unclear is your earlier statement that NAS01 has data that NAS02 does not: That should sync in any case as it should have been picked up in the scan before the current pull operation. You could grep the logs to check whether these unsynced files are mentioned at any point.
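
A sketch of that check; “MissingTopFolder” stands in for the real top-level folder name, and the log path is illustrative:

    # Count mentions of the unsynced folder across the collected logs:
    grep -ci "MissingTopFolder" /volume1/@appstore/syncthing/syncthing.log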

All in all I believe that syncing is on track, it just takes a lot of time as it is a lot of data.

Hi Simon:

Many thanks for your thoughts.

I’ve checked for this, and can’t find any reference to the top-level folder name (inside the Shared Folder) which is on NAS01 and not on NAS02, at least in the log generated in the last two days.

I understand - but it’s definitely a different behaviour that I’m seeing now compared with what I was seeing with previous builds (which were failing on another issue). Is it safe to downgrade the Syncthing executable back to one of those recent builds, and see if pulling starts going again on NAS02?

Thanks,

Pants.

Part of my reasoning above about conflicts was incorrect (see the edit), so there is even more that is unclear. I don’t have any more ideas right now about what the cause may be or what to check.

I believe the relevant fix for you was https://github.com/syncthing/syncthing/issues/4657, which is part of the v0.14.44-rc2 (and rc1) release. Syncing progress is also preserved, so you should be able to safely go back to the most recent candidate release. However, I’d suggest waiting for @calmh’s opinion, as he provided you those builds.

Thanks Simon - much appreciated.

@calmh - Sorry to bother you again: could you please let me know if it should be safe to downgrade from 0.14.43-rc1 -> 0.14.41 to debug this issue?

Thanks,

Pants.

Not calmh here, but I apparently didn’t understand what you wanted to do before you pinged him, so chiming in all the same again:
Going back to 0.14.41 will (most likely) bring back the ‘too many open files’ issue, as it is only fixed in 0.14.44(-rc.X).

As far as I understand, syncing is currently happening on NAS01, so I don’t see the need to downgrade. Sure, the status indication seems off, and NAS02 not pulling while it is missing data is not OK, but downgrading to 0.14.41 will not help find the cause of that. There is still a big chance that once NAS01 has gone through one full pull iteration (which hasn’t happened yet, I assume?), syncing and status indication will become smooth(er).