Considering I spend most of my days in code, not being able to sync my work across machines pains me a great deal. Judging by many comments out there, I am not the only one. So, I have been thinking about the problem and came up with a rather simple idea. Before I start hacking away, I was curious what your opinions are.
The idea is rather simple. Git treats the local repo (.git) like a database. Rather than trying to figure out the details of how to sync databases, why not treat it as a single item that needs comparing/synchronizing.
So, the proposal is something like:
- SyncThing should keep a list of well-known directories that should get special treatment. The user can add to this list.
- In these directories, any other settings for ignoring files etc. are disabled, so each directory is either synced completely or not at all.
- When detecting changes, a change of any kind inside the tree flags the whole tree for syncing.
- To sync the “special” item, the receiver will first make a (hard) copy of the tree. Then all the changes will be synced into the copy. When the whole tree is finished, the receiver will unlink the old version and place the new one in its place.
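To make the last step concrete, here is a minimal sketch of the receiver-side copy/sync/swap flow in Python. Everything here is hypothetical: `atomic_replace_tree` and the `apply_changes` callback are placeholder names standing in for SyncThing's actual transfer logic, and the `.sync-tmp-` staging prefix is just an assumption for illustration.

```python
import os
import shutil
import tempfile

def atomic_replace_tree(target, apply_changes):
    """Build the new version of `target` next to it, then swap it in.

    `apply_changes` is a placeholder for "sync all incoming changes
    into the copy"; the live tree is never modified directly.
    """
    parent = os.path.dirname(os.path.abspath(target))
    # 1. Hard copy of the current tree, staged in a sibling temp dir
    #    (same filesystem, so the final renames stay cheap/atomic).
    staging = tempfile.mkdtemp(prefix=".sync-tmp-", dir=parent)
    work = os.path.join(staging, "tree")
    shutil.copytree(target, work)
    # 2. Apply all incoming changes to the copy only.
    apply_changes(work)
    # 3. Swap: move the old tree aside, put the new one in its place,
    #    then unlink the old version along with the staging dir.
    old = os.path.join(staging, "old")
    os.rename(target, old)
    os.rename(work, target)
    shutil.rmtree(staging)
```

Staging inside the same parent directory matters: `os.rename` is only atomic within one filesystem, which is what lets the whole tree appear to change in a single step.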
There are obviously some issues that need to be solved.
- Aborted transfers would have to be dropped or optimised in some non-trivial way.
- Various conflicts are possible if any of the files in the old version of the directory (the one that is removed) have changed locally in the meantime.
- I am sure there are more problems.
For the purpose of this conversation, let’s leave aside comments about the internals of SyncThing and/or how hard this would be to implement in SyncThing. These problems will surely need to be addressed, but let’s not get into the weeds before we decide if the approach has merit.
So, any thoughts, comments, questions on the idea? Has something like this been tried before? Do you know any reason this would not work? Is another way being considered?
Thanks all and keep up the great work on SyncThing.