Could you use Syncthing to sync source code from work to home devices? Certainly. Could you use an rPi to do it? For sure.
First and foremost: Don’t break any laws, nor any rules your employer may have for using the network or for source code security. At every job I’ve held in the last decade-plus, what you’re describing would certainly be a termination offense, if not also a source of serious legal entanglements.
Option One: Use Syncthing between your work and home machines, using any required firewall rules, or a mesh VPN, or Relays. I would expect a Relay to be both the easiest and the slowest of those options.
Option Two: Have a traveling device, such as an rPi, running Syncthing. At work, the rPi syncs with your work machine. You bring it home, and it syncs with your home machines.
I work only on my own machines, and the code is already there. I basically just want to use it to propagate uncommitted changes between machines (because I don’t want to commit incomplete changesets). I work from home 100% of the time, but I often move between rooms, so sometimes I work on the laptop and sometimes on the desktop. The rPi with Syncthing would stay at home, only on the local network.
I just wanted to make sure that the sync would work OK even between 3 devices, where one (the server) would always be online, because it sounds like it’s designed to work only machine to machine, without a server.
Neither Syncthing nor the underlying Block Exchange Protocol has a client-server architecture – it’s all peer-to-peer. You can configure Syncthing for a hub-and-spoke connection topology, but that’s extra work for no gain in most cases.
It’s not clear to me how the rPi fits into your use case unless it’s for additional redundancy.
Your use case is exactly what Syncthing is intended for.
I would set up Syncthing on all three machines and have them mutually share the same folders – three-way on each machine. Syncthing will figure out what needs to be copied to keep them all in sync.
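If it helps to picture the three-way sharing, the resulting folder stanza in Syncthing’s config.xml looks roughly like this on each machine (the folder id, label, path, and device IDs below are placeholders – in practice you’d set this up through the web GUI by adding the devices and sharing the folder with each of them):

```xml
<folder id="src-code" label="Source" path="/home/me/code" type="sendreceive">
  <!-- The same folder, shared with all three devices, on every machine -->
  <device id="DESKTOP-DEVICE-ID"/>
  <device id="LAPTOP-DEVICE-ID"/>
  <device id="RPI-DEVICE-ID"/>
</folder>
```

With this in place on all three machines, any pair that can reach each other will exchange changes, so the rPi keeps the other two in sync even when they are never online at the same time.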
I always turn on File Versioning (Simple or Staggered) when I’m sharing as a “just in case” fallback, but don’t depend on Syncthing file versioning for backup! At least make sure your file system has automatic/periodic snapshots turned on, but best case is an offline and/or remote device that gets updated with periodic (versioned) backups.
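For reference, versioning is configured per shared folder; a sketch of the Staggered variant, assuming the same placeholder folder as above (the parameter values are just examples, both are in seconds):

```xml
<folder id="src-code" path="/home/me/code" type="sendreceive">
  <versioning type="staggered">
    <!-- how often Syncthing checks for old versions to clean out -->
    <param key="cleanInterval" val="3600"/>
    <!-- keep versions for at most 30 days -->
    <param key="maxAge" val="2592000"/>
  </versioning>
</folder>
```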
This makes a lot of sense to me. If there are times where both PCs are not able to sync for whatever reason, having a third peer (like the rPi) that’s always available to sync can be a great solution for your use case.
One caveat I would note: I would advise using a storage medium other than a microSD card or USB flash drive for Syncthing’s storage and database.
The frequency of write/delete/rewrite will wear out the flash memory of a microSD card or flash drive and cause data corruption (which may then be quickly and efficiently propagated to all your devices – hence, the need for periodic versioned backups!).
Use an external storage medium made for frequent write/rewrite cycles (a conventional magnetic-recording hard drive or a solid-state drive).
@mopani @chaos thank you very much guys for the insights. I will look into the storage options as I’m planning to buy a Zero 2 W. If it’s easy to use different storage media there, then I’ll go ahead and order the rPi. Appreciate the help!
I’m not deeply versed in the various Raspberry Pi models, but I recommend researching the hashing performance of the various models. Some may do much better than others, and for the initial scan of a data corpus, hashing performance is likely the biggest bottleneck.
I did some research, and actually the rPi 5 with an SSD HAT might be better for me, as it solves the issue of where to put the external drive. It will also be better suited for the hashing, and if I want to do a bigger project in the future, it will work.
For a Raspberry Pi that is expected to run 24/7, this is a nice addition: the Raspberry Pi UPS HAT from PiShop.us, which also includes a battery-backed Real-Time Clock.
Using a GPIO header with extra-long pins allows it to fit inside most standard RPi cases and leaves space for heat sinks on the critical chips.
Wow, that’s pretty nice. Unfortunately I would have to have backup power for infrastructure as well, which I currently don’t. This would drive the cost up pretty fast. But thanks for the tip!
I should have clarified: if you have frequent power outages (like a lot of places I work with!), the UPS HAT is useful because it allows an orderly shutdown of the RPi operating system and avoids filesystem corruption.
Yeah, I can see the use case. Where I live we have maybe one outage a year, so it’s not really suitable for me. But it’s great that the little rPi has so many “extensions”!
Exactly what does the quoted sentence mean? You might be better off finding a Git workflow with suitable named branches and multiple remotes. Syncthing is great, but so is Git. The path to least pain often lies in choosing the right tool for the right job, and for source code that’s most often a version control system.
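One such Git workflow, sketched with two local repositories standing in for two machines (the paths, remote name, and branch name here are all made up for the sketch; in real use the remote would be the other machine or a private server):

```shell
set -e
rm -rf /tmp/desktop.git /tmp/laptop

# A bare repo standing in for the always-reachable machine (or a private server)
git init --bare /tmp/desktop.git

# The "laptop" working copy
git init /tmp/laptop
cd /tmp/laptop
git -c user.name=me -c user.email=me@example.com commit --allow-empty -m "base"
git remote add desktop /tmp/desktop.git

# Park unfinished work on a named WIP branch instead of syncing raw files
git checkout -b wip/refactor
echo "half-done change" > notes.txt
git add notes.txt
git -c user.name=me -c user.email=me@example.com commit -m "WIP: refactor in progress"

# Main stays clean; the other machine fetches wip/refactor to continue the work
git push desktop wip/refactor
```

The other machine then runs the equivalent of git fetch and checks out the WIP branch, and you can squash or rewrite those commits before anything lands on a real branch.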
I would advise against using syncthing for non-bare active Git repositories. The index will get sync-conflicts, often. One could sweep that problem under the carpet by configuring to ignore them, but I would not trust that all objects are reachable in a repository where concurrent write operations happen from multiple hosts.
Please remember that “concurrent” might mean quickly looking something up on the desktop while having left a laptop suspended, with unsynchronized changes, in a bag after a trip.
If you decide to mirror checked-out Git repos with Syncthing, please make sure to configure staggered versioning so you have backups for when the data corruption kicks in.
I know because, for reasons, I actually have a few Git checkouts in Syncthing-managed folders. Running find / -name 'index.sync-conflict*' | wc -l currently gives me 15 matches, and I do clean up occasionally.
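A sketch of that audit-and-clean routine, using a throwaway directory as a stand-in for a real synced folder (the path and the conflict file name are fabricated for the demo):

```shell
set -e
# Demo setup: fake a synced checkout containing one leftover conflict copy
rm -rf /tmp/synced
mkdir -p /tmp/synced/.git
touch /tmp/synced/.git/index.sync-conflict-20240101-120000-ABCDEF1

# Audit: list Syncthing's conflict copies of Git's index file
find /tmp/synced -name 'index.sync-conflict*'

# After inspecting them (the index is regenerated by Git, so these
# copies carry no irreplaceable data), they can be removed:
find /tmp/synced -name 'index.sync-conflict*' -delete
```

Running the audit step on a schedule is a cheap way to notice that concurrent writes are happening before any real corruption shows up.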