I use Syncthing to synchronize downloaded files across several computers. Some of those downloads are via HTTP, some via FTP, and some via BitTorrent. Unfortunately, this causes issues with Syncthing: it starts syncing files as soon as they begin downloading, then keeps re-syncing them as they change, until the downloads are finished.
This results in:
1. constant re-hashing as the files download
2. bandwidth being used to upload the files while they are still downloading
3. disk access time being consumed by #1 and #2
Now, it’s certainly not a critical bug, but during a burst of downloads the downloading computer can easily end up thrashing: it’s simultaneously downloading dozens of files, constantly rehashing them, and constantly uploading them, while the other computers repeatedly hit a “file has changed” error as they try to sync those same files.
A workaround, which is the feature I am proposing, would be a “lazy hashing” option. Essentially, with “lazy hashing” enabled, you enter a time period (5 seconds, 30 seconds, 5 minutes, whatever), and Syncthing skips hashing any file that has been modified more recently than that. In other words, it skips (and ideally queues) any added or changed file until it has sat unmodified for that time period.
This avoids breaking the normal flow of how Syncthing works, while accounting for files that are constantly changing (due to downloading, streaming a capture to disk, etc.). Anyone who doesn’t want it simply doesn’t enable the “lazy hashing” feature, and it’s no skin off their back.