i have all my git repos synced with syncthing, it works well. BT Sync and ownCloud have problems here.
if there are 100,000 small files in the repo, it may take some time to first sync…
Why do BTSync and Dropbox have problems but Syncthing doesn't?
I've been doing this with SugarSync for years and it works great. Want to try it with Syncthing at some point, but in my experience the only problem with SS is that if the object files get too large, SS delays syncing them.
You can, and it'll sync just fine, but you shouldn't.
A Git repository is not like a collection of documents. In a collection of documents, each file is mostly independent so it's easy to "merge" when one file has changed on one device and another file on another device. A Git repository on the other hand is made up of a bunch of files that should be internally consistent with each other. "Merging" two Git repositories that have diverged cannot be done by just copying in the newest files from each - it must be done by understanding the contents, doing an actual merge with a new commit, etc.
So no, don't. Use a central Git repository and push/pull from it, like Linus intended. :)
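For the single-user, multi-machine case discussed below, the central-repository workflow is just a push on one machine and a pull on the other. A minimal sketch (the remote name `origin` and branch name `main` are placeholders for whatever your repo uses):

```shell
# On the desktop: push your work to the central repository
git push origin main

# On the laptop: fetch and fast-forward to the same state
git pull --ff-only origin main
```

The `--ff-only` flag makes the pull fail loudly instead of silently creating a merge commit if the two machines have diverged.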
I think you're imagining multiple people collaborating on a single synced project. However, I think Philippe is talking about a different scenario: single user, multiple machines. At least that's what I'm looking into.
For example, I'm in the middle of implementing a feature on my desktop PC, and my project is a big uncompilable mess. However, I need to leave for a week now, and all I can take with me is my laptop. Would be nice to continue where I left off, right? Without either messing up commit history, or wasting time cleaning up the project, or syncing it manually, or, better yet, putting it on a thumb drive.
My main concern was that the repository could potentially get messed up if something were to go wrong during a sync, but it looks like git is implemented in such a way that things need to go really wrong (as in "omg my computer is on fire" wrong) for it to become corrupted, due to the fact that it rarely modifies its files and mostly adds new ones instead.
I've had git repositories mess up in a big way when there's just 1 of me using Syncthing. I never dug properly into why, but it did happen surprisingly regularly…
Just make a WIP commit (make it to a temporary branch if you're that worried), and sync using the normal git mechanisms.
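The WIP-branch approach might look like this in practice (the branch name `wip` is just an example):

```shell
# Desktop: park unfinished work on a temporary branch and push it
git checkout -b wip
git add -A
git commit -m "WIP: snapshot before travelling"
git push -u origin wip

# Laptop: pick up exactly where you left off
git fetch origin
git checkout wip

# Later, to drop the WIP commit while keeping the changes in the working tree:
git reset --soft HEAD~1
```

This keeps the messy snapshot off your main branch, and the WIP commit can be squashed or discarded once the feature is done.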
That's odd; I've read multiple reports of people using git with Dropbox / Google Drive / whatever for years, with no issues. I'll try anyway, and will leave a reply if something goes wrong.
Over in #git on IRC we specifically advise people not to put git repos in Dropbox because of reports of corruption. It seems Dropbox does not handle lots of small fast-changing files well.
Well, gotta know the exact circumstances, to be sure. I mean, I can imagine somebody trying to work on some synced project on two machines simultaneously, and running into problems because of that.
If I'm careful, the sync process should always be a one-way overwrite. If it's not, in fact, one-way, then how can I trust Dropbox / Syncthing / etc. with anything at all?
I'm talking about single users. I don't know the details and what exactly Dropbox was getting wrong, but something wasn't right. When you have a single repo in Dropbox, you don't have the multiple redundancy that git normally gives you, so once that one copy gets corrupted you're screwed.
I've enabled staggered versioning for 90 days on my server; that should help, I guess.
All my git repositories usually have a proper remote, unless I'm in the very early stages of a dev process, so it wouldn't be a huge tragedy if my repo were to go corrupt.
There are things which I care about more than I care about my projects, though, and it would be a real shame if they were to disappear forever. This is why I'm actually trying to sync everything now, so that I have a copy of my precious data on several machines, just in case. And now I'm reading all the horror stories about sync clients "not handling many small files too well" and "swallowing documents from time to time".
Maybe I'll also set up a cron job on my intermediate computer, and take a snapshot of the synced folder every night or so.
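A nightly snapshot like that could be as simple as a dated tar archive driven by cron. A sketch, with made-up paths (the synced folder is assumed to be `~/Sync`):

```shell
# Nightly snapshot of the synced folder as a dated tar archive.
# Example crontab entry (runs at 03:00 every night):
#   0 3 * * * /usr/local/bin/snapshot-sync.sh
mkdir -p "$HOME/snapshots"
tar -czf "$HOME/snapshots/sync-$(date +%F).tar.gz" -C "$HOME" Sync
```

Unlike file versioning, this preserves a consistent point-in-time copy of the whole folder, so a sync gone wrong can be rolled back as a unit.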
To be clear, I was specifically talking about Dropbox before. While I have seen git repo corruption with Syncthing, I'm not sure what caused that.
They do use similar algorithms (block hashing), AFAIK, so the same issues apply to both, I'd assume.
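For anyone unfamiliar with the idea: block hashing splits each file into chunks and hashes them, so only changed chunks need to be transferred. A toy illustration of the principle (the 128 KiB block size is arbitrary; real tools use their own chunk sizes and hashing schemes):

```shell
# Split a file into 128 KiB blocks and hash each block.
# Two versions of the file need transfer only for blocks whose hashes differ.
split -b 131072 bigfile block_
sha256sum block_* > block-hashes.txt
```

This is why a repo full of many small, rapidly-changing files is a hard case: nearly every change touches a different file, so there is little unchanged data to skip.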
Here's an interesting read: When Git on Dropbox conflicts - No problem
I'd probably summarize it as "if you have to ask, the answer is no".
There are safe ways to sync git repos for sure, but you really need to understand how it works and know exactly what you're doing. If not, you're just playing fast and loose with your repo.
I don't understand why you'd want to do this with git. One of the objectives of git is to be decentralized. Just make a branch called uncompilable_mess and then clone the repo on your laptop.
That way, you donāt have to worry. Git is designed to do this.
Okay, say I'm in a hurry, and I have 5 projects with uncommitted changes. One of them has a new config file with sensitive information in it (e.g. an API key), which isn't in .gitignore yet, and since I don't have time to cherry-pick, I just git add -A everything and push it to a public repo. This is just one harmful situation which could've been avoided if I were using a synchronization tool (not to mention that it's also automatic, so I can't just forget to, say, commit 1 project out of 5).
Seems that the topic has been pretty active recently.
Just for you all to know, everything has been running smoothly since I asked for help. I've never had any problem so far for my usage and in my setting, which is a bit specific:
Generally speaking I'd advise not to push/pull/commit git repos while Syncthing is running, in order to reduce the synchronizations required. I've got no objective reason for this, but it just seems more logical and clean.
My 2 cents.
Thanks for chiming in. That's what I decided to do myself too. This is essentially a semi-automatic rsync, where a single button does everything.
Just an extra tuppence-worth.
I'm a single user synchronising a (Linux) desktop computer with a (Windows) laptop. I'm sometimes jumping between different branches in a repo quite quickly, which sometimes means a lot of files changing very rapidly.
Mostly Syncthing seems to cope really well with this, but I recently had a number of old (deleted, local) branches resurrect themselves. I don't know if this was Syncthing or not, but I have added .git to my .stignore just in case.
What I lose is having any purely local branches synchronised between the two machines, but I don't really need that, and I don't like having to try to remember which branches I had or hadn't previously deleted!
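For reference, excluding the Git metadata from a Syncthing folder as described above just takes a `.git` line in the `.stignore` file at the folder root:

```shell
# From the root of the Syncthing-synced folder:
# tell Syncthing to ignore the Git metadata directory
echo '.git' >> .stignore
```

With `.git` ignored, each machine keeps its own local branches, stashes, and history, and the working files are the only thing Syncthing touches.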
This topic was automatically closed 29 days after the last reply. New replies are no longer allowed.