Is putting a Git workspace in a synced folder really a good idea?

Thank you for that :kissing_heart:. It really is reassuring to know that it’s working somehow. I’ll give it a try and report back here if it makes sense.

I haven’t tried this with Syncthing, but would recommend caution. In other systems (Dropbox, Copy) I have ended up with multiple conflicting copies of files (conflicting-copy-of-foo.txt). This isn’t too bad for the files you are working on, but when it happens in your .git folder, the resulting mess can take hours to sort out.

My belt-and-braces approach is to create a branch for each computer you work on, e.g. laptop, chromebook, linode. Have a central repo somewhere like a VPS, or Bitbucket where you can have free private repos. When I’m working on my laptop, I use the laptop branch and commit and push to my central repo. You can automate this via cron, if you wish. If I move to my chromebook, I do a git pull and then merge the laptop branch into the chromebook branch.
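For concreteness, here’s roughly what that routine looks like (the branch and remote names are just examples):

```
# On the laptop: work happens on the laptop branch, then goes to the central repo.
git checkout laptop
git commit -am "end of session"   # this commit+push pair is what you'd put in cron
git push origin laptop

# Later, on the chromebook: fetch everything, then fold the laptop work in.
git checkout chromebook
git fetch origin
git merge origin/laptop
```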

This is less hassle than it sounds and has saved me time in the long run.

1 Like

I have all my git repos synced with Syncthing, and it works well. BT Sync and ownCloud have problems here.

If there are 100,000 small files in the repo, the first sync may take some time…

1 Like

Why do BTSync and Dropbox have problems but Syncthing doesn’t?

1 Like

I’ve been doing this with SugarSync for years and it works great. I want to try it with Syncthing at some point, but in my experience with SugarSync, the only problem is when the object files get too large, as it delays syncing them.

You can, and it’ll sync just fine, but you shouldn’t.

A Git repository is not like a collection of documents. In a collection of documents, each file is mostly independent, so it’s easy to “merge” when one file has changed on one device and another file on another device. A Git repository, on the other hand, is made up of a bunch of files that must be internally consistent with each other. “Merging” two Git repositories that have diverged cannot be done by just copying in the newest files from each; it must be done by understanding the contents, doing an actual merge with a new commit, and so on.

So no, don’t. Use a central Git repository and push/pull from it, like Linus intended. :)
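A minimal sketch of that setup, with the host and paths as placeholders:

```
# On the VPS (a free private Bitbucket repo works just as well):
git init --bare /srv/git/project.git

# On each machine:
git remote add origin user@vps.example.com:/srv/git/project.git
git push -u origin master

# On any other machine, to pick up the changes:
git pull origin master
```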

3 Likes

I think you’re imagining multiple people collaborating on a single synced project. However, I think Philippe is talking about a different scenario: single user, multiple machines. At least that’s what I’m looking into.

For example, I’m in the middle of implementing a feature on my desktop PC, and my project is a big uncompilable mess. However, I need to leave for a week now, and all I can take with me is my laptop. It would be nice to continue where I left off, right? Without messing up the commit history, wasting time cleaning up the project, syncing it manually or, better yet, putting it on a thumb drive.

My main concern was that the repository could potentially get messed up if something were to go wrong during a sync, but it looks like git is implemented in such a way that things need to go really wrong (as in “omg my computer is on fire” wrong) for it to become corrupted, since it rarely modifies its files and mostly adds new ones instead.
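And if a sync ever does go wrong, git can check itself, so at least you’d know right away:

```
# Verify that every object in the repository is intact and reachable.
git fsck --full
# Index or worktree damage would surface here too.
git status
```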

2 Likes

I’ve had git repositories mess up in a big way when it’s just me using Syncthing. I never dug properly into why, but it happened surprisingly regularly…

Just make a WIP commit (make it to a temporary branch if you’re that worried), and sync using the normal git mechanisms.
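Something along these lines (the branch name is just an example):

```
# On the desktop, park the mess on a temporary branch:
git checkout -b wip/desktop
git add -A && git commit -m "WIP: mid-feature, does not compile"
git push -u origin wip/desktop

# On the laptop, pick it up and keep going:
git fetch origin
git checkout wip/desktop

# When the feature is finally done, drop the WIP commit but keep the changes:
git reset --soft HEAD~1
```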

That’s odd; I’ve read multiple reports of people using git with Dropbox / Google Drive / whatever for years, with no issues. I’ll try anyway and will leave a reply if something goes wrong.

2 Likes

Over in #git on IRC we specifically advise people not to put git repos in Dropbox because of reports of corruption. It seems Dropbox does not handle lots of small fast-changing files well.

Well, you’d have to know the exact circumstances to be sure. I mean, I can imagine somebody trying to work on a synced project on two machines simultaneously and running into problems because of that.

If I’m careful, the sync process should always be a one-way overwrite. If it’s not, in fact, one-way, then how can I trust Dropbox / Syncthing / etc. with anything at all?

2 Likes

I’m talking about single users. I don’t know the details of what exactly Dropbox was getting wrong, but something wasn’t right. When you have a single repo in Dropbox, you don’t have the redundancy of multiple clones that git normally gives you, so once that one copy gets corrupted, you’re screwed.

I’ve enabled staggered versioning for 90 days on my server, that should help, I guess.
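(For reference, that’s the staggered file versioning option on the folder; in config.xml it looks roughly like this, with the 90 days expressed in seconds:)

```
<folder id="default" path="/home/user/Sync">
    <versioning type="staggered">
        <param key="maxAge" val="7776000" />     <!-- 90 * 86400 seconds -->
        <param key="cleanInterval" val="3600" /> <!-- check for old versions hourly -->
    </versioning>
</folder>
```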

All my git repositories usually have a proper remote, unless I’m in the very early stages of development, so it wouldn’t be a huge tragedy if my repo were to become corrupted.

There are things I care about more than my projects, though, and it would be a real shame if they were to disappear forever. That’s why I’m actually trying to sync everything now, so that I have a copy of my precious data on several machines, just in case. And now I’m reading all the horror stories about sync clients “not handling many small files too well” and “swallowing documents from time to time” :smile:

Maybe I’ll also set up a cron job on my intermediate computer and take a snapshot of the synced folder every night or so.
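Something as simple as this crontab entry should do; the paths are hypothetical, and note that % has to be escaped in crontab:

```
# Every night at 03:00, archive the synced folder to a dated tarball.
0 3 * * * tar -czf /backup/sync-$(date +\%F).tar.gz -C /home/user Sync
```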

1 Like

To be clear, I was specifically talking about Dropbox before. While I have seen git repo corruption with Syncthing, I’m not sure what caused it.

They use similar algorithms (block hashing), AFAIK, so I’d assume the same issues apply to both.

Here’s an interesting read: When Git on Dropbox conflicts – No problem

1 Like

I’d probably summarize it to “if you have to ask, the answer is no”.

There are safe ways to sync git repos for sure, but you really need to understand how it works and know exactly what you’re doing. If not, you’re just playing fast and loose with your repo.

I don’t understand why you’d want to do this with git. One of the objectives of git is to be decentralized. Just make a branch called uncompilable_mess and then clone the repo on your laptop.

That way, you don’t have to worry. Git is designed to do this.
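Roughly (the remote URL is a placeholder):

```
# On the desktop:
git checkout -b uncompilable_mess
git add -A && git commit -m "WIP"
git push -u origin uncompilable_mess

# On the laptop:
git clone user@server:project.git
cd project
git checkout uncompilable_mess
```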

Okay, say, I’m in a hurry, and I have 5 projects with uncommitted changes. One of them has a new config file with sensitive information in it (e.g. an API key), which isn’t in .gitignore yet, and since I don’t have time to cherry-pick, I just git add -A everything and push it to a public repo. This is just one harmful situation which could’ve been avoided if I were using a synchronization tool (not to mention that it’s also automatic, so I can’t just forget to, say, commit 1 project out of 5).
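Sure, a dry run would catch it (the filename here is hypothetical), but that’s exactly the kind of manual step that gets skipped in a hurry:

```
git add -An                               # --dry-run: lists what would be staged
echo "config/api_keys.yml" >> .gitignore  # fence off the new secret first
git add -A && git commit -m "WIP" && git push
```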

1 Like

Seems that the topic has been pretty active recently :smile:

Just so you all know, everything has been running smoothly since I asked for help here. I’ve never had any problem so far, though my usage and setup are a bit specific:

  • The git repo is not very active (< 2 commits/pushes/pulls a day)
  • I’m the only user of the repo
  • Syncthing is NOT always running in the background; out of caution, I only fire it up when I have to sync my computers before heading out or after getting back

Generally speaking, I’d advise not pushing/pulling/committing in a git repo while Syncthing is running, to reduce the number of synchronizations required. I’ve got no objective reason for this, but it just seems more logical and clean.
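If you want to be strict about it, Syncthing’s REST API can pause and resume syncing around the git operations; a sketch, assuming the default GUI address and your API key from Settings → GUI:

```
APIKEY="..."   # your actual key goes here
ST="http://localhost:8384"

curl -X POST -H "X-API-Key: $APIKEY" "$ST/rest/system/pause"    # pause all devices
git pull && git push                                            # do the git work
curl -X POST -H "X-API-Key: $APIKEY" "$ST/rest/system/resume"   # resume syncing
```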

My 2 cents.

2 Likes

Thanks for chiming in. That’s what I decided to do myself too. This is essentially a semi-automatic rsync, where a single button does everything :slight_smile:

2 Likes