By the time we’ve got this far - a partial download of a big file - I can see that it’s saturating disk bandwidth checking what’s already been downloaded. I’m not 100% sure that’s the only problem, though, or it wouldn’t also be an issue when I turn my machine on in the morning: at that point I’ve got gigabytes of files waiting on servers to be downloaded, but nothing locally that needs scanning before the download can start.
Anyway. I can live with it. I can’t speak Go, so I’m not going to fix it myself. I would have hoped that forcing it to scan one folder at a time wouldn’t require heavy lifting like a scan queue - just a mutex (or equivalent) on an object that says “I’m busy scanning a folder, don’t start a new one”, with all the other folder processes waiting until it’s free (see the sketch below). I don’t care what order folders are scanned in; I just don’t want all 20 to kick off at once and kill performance for the next 6 hours when they could be done in a tenth of the time running one at a time. But not knowing the code base, such hopes are worth less than nothing.
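For what it’s worth, this is the sort of thing I had in mind - a rough sketch only, not based on the actual Syncthing code base, and all the names here are made up for illustration:

```go
package main

import (
	"fmt"
	"sync"
	"time"
)

// scanGate serializes folder scans: only one goroutine may hold it at a time.
// (Hypothetical name; the real code base is structured differently.)
var scanGate sync.Mutex

// scanFolder stands in for scanning one folder; the real work would be
// walking the folder and hashing files, which is what hammers the disk.
func scanFolder(name string) {
	scanGate.Lock()         // block here while any other folder is scanning
	defer scanGate.Unlock() // release so the next waiting folder can start

	fmt.Println("scanning", name)
	time.Sleep(500 * time.Millisecond) // placeholder for the actual disk work
	fmt.Println("done with", name)
}

func main() {
	folders := []string{"photos", "music", "work", "backups"}

	var wg sync.WaitGroup
	for _, f := range folders {
		wg.Add(1)
		go func(folder string) {
			defer wg.Done()
			// All folders still kick off at once, but the scans themselves
			// run strictly one at a time.
			scanFolder(folder)
		}(f)
	}
	wg.Wait()
}
```

One nice thing about a plain mutex is that the “wait and retry at random” part I mentioned isn’t even needed - waiters just queue up on the lock and each folder runs as soon as the previous one finishes. Whether that fits how the actual folder scanners are wired up, I obviously can’t say.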
If help reproducing the problem is worth having, though, I gladly offer it.