Idea: Offline sync with local files

Hello,

I have the following case: I have one huge folder that is already synced across multiple devices, but it causes problems due to its size.

Now I have split it up into 9 smaller folders. I stopped sharing the old folder and added the new ones, but I kept the old folder on disk so ST can still access its files.

Basically, what happens now is that ST copies all matching files from the huge folder into the 9 smaller folders.

My feature request is about this point: apparently the copy process only runs when my client is connected to another device that has the new folders/files.

But I think the following could be done and would be clever: while connected, sync the list of files and hashes; while disconnected, still copy the files that already exist locally.

We do not need the remote devices to be connected for this process once we have the list of files we need, as long as the data already exists somewhere locally.
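To make the idea concrete, here is a rough sketch in Go of what I mean (everything here is hypothetical and simplified, it is not Syncthing's actual code): while a peer is connected we cache the list of wanted files and their hashes; later, without any connection, we look for matching content in the old folder and copy it into the new one.

```go
// Hypothetical sketch (not Syncthing code): given a cached list of wanted
// files and their content hashes, copy matches from a folder we already
// have locally, without any peer being connected.
package main

import (
	"crypto/sha256"
	"encoding/hex"
	"io"
	"log"
	"os"
	"path/filepath"
)

// Wanted is one entry from the file list cached while a peer was connected.
type Wanted struct {
	RelPath string // path inside the new folder
	Hash    string // hex SHA-256 of the whole file (simplification; real sync tools track blocks)
}

// hashFile returns the hex SHA-256 of a file's contents.
func hashFile(path string) (string, error) {
	f, err := os.Open(path)
	if err != nil {
		return "", err
	}
	defer f.Close()
	h := sha256.New()
	if _, err := io.Copy(h, f); err != nil {
		return "", err
	}
	return hex.EncodeToString(h.Sum(nil)), nil
}

// indexLocal maps content hash -> path for every file under root.
func indexLocal(root string) (map[string]string, error) {
	idx := make(map[string]string)
	err := filepath.Walk(root, func(p string, info os.FileInfo, err error) error {
		if err != nil || info.IsDir() {
			return err
		}
		h, herr := hashFile(p)
		if herr != nil {
			return herr
		}
		idx[h] = p
		return nil
	})
	return idx, err
}

// copyFile copies src to dst, creating parent directories as needed
// (reads the whole file into memory, which is fine for a sketch).
func copyFile(src, dst string) error {
	if err := os.MkdirAll(filepath.Dir(dst), 0o755); err != nil {
		return err
	}
	data, err := os.ReadFile(src)
	if err != nil {
		return err
	}
	return os.WriteFile(dst, data, 0o644)
}

func main() {
	oldFolder := "/data/huge"  // the old folder, still on disk
	newFolder := "/data/part1" // one of the nine new folders
	wanted := []Wanted{}       // filled from the metadata cached while connected

	idx, err := indexLocal(oldFolder)
	if err != nil {
		log.Fatal(err)
	}
	for _, w := range wanted {
		if src, ok := idx[w.Hash]; ok {
			if err := copyFile(src, filepath.Join(newFolder, w.RelPath)); err != nil {
				log.Println("copy failed:", err)
			}
		}
	}
}
```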

This would speed up and simplify my use case enormously.

Thanks for looking at my idea!

Greetings Fred

This means running a sync loop that consumes CPU and I/O while not connected to anyone, which 99.99% of the time does nothing.

We already have plenty of people unhappy about CPU/I/O usage when doing nothing (which kills batteries), so sadly I don’t think this is viable.

You should just do a one-off rsync between the local folders.
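Something along these lines should do it for a one-off copy (paths are placeholders; try it with `--dry-run` first): `rsync -a /path/to/huge/sub1/ /path/to/new1/`, once per new folder.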

My observation some time ago was that it kept copying even after a disconnect, but maybe that was only for files it had already started while connected; I don’t remember… (or maybe it no longer does this with the most recent version)

If we have all the data we need locally, there is no need to be connected to any other device. Data will be copied locally.

Oh, actually that’s not true. We will indeed need to know that the file is available somewhere to start syncing it, and by definition we don’t have it ourselves until we’ve done so. So we need some sort of connection to somebody who has it, even if we’re not going to use that connection to transfer any data.

I guess this could be improved but it seems rather niche… In your case with a large copy operation to perform it’s noticeable. For most devices that are already mostly in sync it isn’t.

But don’t you download a list of metadata from that other device (like file names and hashes)? If so, you could just store that list temporarily and copy the files locally even if the other device has disconnected in the meantime. Right?

Yes, we could. We have the metadata. It’s just that currently, if we don’t have a file and it’s not available from any peers, we don’t start syncing it. We don’t look at the list of required blocks until we actually start syncing it. That relationship would need to be inverted for this special case.
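To illustrate that inversion (purely a sketch, not Syncthing’s actual internals): first learn which blocks exist locally, then decide per file whether it can be started without any connected peer.

```go
// Purely illustrative sketch, not Syncthing's actual internals.
package sketch

// BlockHash identifies one block of a file.
type BlockHash string

// LocalBlocks would be built by scanning the data we already have on disk.
type LocalBlocks map[BlockHash]bool

// canAssembleLocally reports whether every block the wanted file needs is
// already present somewhere on this device.
func canAssembleLocally(need []BlockHash, have LocalBlocks) bool {
	for _, b := range need {
		if !have[b] {
			return false
		}
	}
	return true
}

// Today (roughly): a file is only started once some connected peer has it,
// and the required-blocks list is consulted after that.
// Proposed: also start when canAssembleLocally is true, pulling the blocks
// from local files instead of the network.
```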

This is orthogonal to the ticket (https://github.com/syncthing/syncthing/issues/6123) that I just closed (“don’t use CPU when not needed”), as in 99.99% of these cases we would probably bounce around using CPU just to check local files, only to realise there is nothing we can do, draining the battery and upsetting people because their CPU fans kick in.

It seems you can’t make everyone happy.