Hi, I’m trying to set up file sync for a very large number of files (upwards of 1 million, maybe 2 million). While the program seems capable of doing this, it also causes a lot of thrashing on the hard drive. Is there any way to reduce this? I have it set up to sync each folder on the drive as its own separate “job”, but would it be better to sync the entire main folder (1 TB) as a single job?
Depends on what you want. First of all, try running Syncthing with a lower I/O priority, and make sure it finishes the initial scans. Then you might want to increase the rescan interval, as scanning a lot of files every 60 seconds (the default) can thrash the disk.
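On Linux, for example, lowering the I/O priority can look like this (a sketch; `ionice` only has an effect with an I/O scheduler that supports priorities, and the equivalent knobs differ on other platforms):

```shell
# Sketch: start Syncthing with idle-class I/O priority and minimal
# CPU priority so its scans yield to other disk and CPU activity.
# Assumes a Linux host with ionice(1) and nice(1) available.
ionice -c 3 nice -n 19 syncthing
```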
If you have multiple folders there could be parallel scans happening too…
Given how large your dataset is, you might want to set the rescan interval to 0 and use the syncthing-inotify utility instead.
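For reference, the rescan interval is the `rescanIntervalS` attribute on each `folder` element in Syncthing’s `config.xml` (the folder id and path below are placeholders):

```xml
<!-- config.xml fragment (folder id/path are placeholders):
     rescanIntervalS="0" disables periodic rescans entirely,
     leaving change detection to the inotify helper. -->
<folder id="main" path="/data/main" rescanIntervalS="0">
</folder>
```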
I should have specified I’m on Windows; I don’t know if inotify has a Windows equivalent.
syncthing-inotify also has a Windows release.
Even with inotify and the rescan interval disabled, it still thrashes the disk whenever inotify triggers a rescan of a folder.