Haven’t been here in a while, because Syncthing has been so awesome and stable I haven’t had any problems with it. With the 2.0 upgrade, however, I’m having trouble with one of my machines.
The machine in question is a Raspberry Pi 3B with 1 GB of RAM. It keeps getting killed by systemd (syncthing.service: A process of this unit has been killed by the OOM killer.). It’s syncing maybe 1TB of data in multiple folders.
I’ve changed maxFolderConcurrency to 1 to try to get it to process only one folder at a time, but it doesn’t seem to have helped. It’s in a cycle of getting killed by systemd, then restarting, filling up memory, and getting killed again.
Are there any other settings I can change to make it use less memory?
You could also look into reducing the scan interval or disabling folder watchers to cut down on memory spikes. On a Pi 3B, every bit of RAM saved helps.
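For reference, here's a sketch of where those knobs live in config.xml (the folder id and path are hypothetical placeholders, and the config file location varies by platform and version). `rescanIntervalS` and `fsWatcherEnabled` are per-folder attributes, while `maxFolderConcurrency` goes under `<options>`:

```xml
<!-- Hypothetical excerpt from Syncthing's config.xml -->
<!-- Longer rescan interval and no filesystem watcher, to reduce memory spikes -->
<folder id="abcd1-efgh2" path="/data/photos" type="sendreceive"
        rescanIntervalS="86400" fsWatcherEnabled="false">
</folder>
<options>
    <!-- Scan/sync at most one folder at a time -->
    <maxFolderConcurrency>1</maxFolderConcurrency>
</options>
```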
I’m not really sure about that. I’m testing v1.30.0 and v2.0.1 with the exact same data set right now, and v2.0.1 actually uses less RAM while scanning (250 MB vs 300 MB), though more when idle (200 MB vs 130 MB). My test folder is only about 27 GB though, which is obviously way smaller than yours, and I’m only measuring RAM while scanning or idle with a single folder, not while transferring or receiving data, so the situation may be different there.
Well, it was just a guess. Nothing changed on this machine; I wouldn’t even have upgraded it, except Arch batched it in an update. I have a similar machine (a Le Potato) with 2 GB of RAM, and that one upgraded with no problems.
At this point, I’ve resorted to commenting out all the folders in config.xml except one, saving the file, restarting Syncthing, letting it scan and sync. When it’s done, I shut it down, go back into config.xml, uncomment another folder, start it again, and wait. So far, that’s keeping it up, but this is going to take forever.
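An alternative to hand-editing config.xml would be to script the same one-folder-at-a-time cycle through Syncthing's REST API, which lets you pause and resume folders without restarting. A rough sketch, assuming all folders start out paused; the folder IDs and API key are placeholders (the key is shown in the GUI under Settings):

```shell
#!/bin/sh
# Sketch: sync folders one at a time by unpausing them in turn via the REST API.
# APIKEY and the folder IDs below are placeholders -- substitute your own.
APIKEY="your-api-key"
API="http://localhost:8384/rest"

for id in folder-one folder-two folder-three; do
    # Unpause just this folder...
    curl -s -X PATCH -H "X-API-Key: $APIKEY" \
        -d '{"paused": false}' "$API/config/folders/$id"
    # ...then poll until it reports 100% completion (rough string check
    # against the JSON from /rest/db/completion).
    until curl -s -H "X-API-Key: $APIKEY" \
            "$API/db/completion?folder=$id" | grep -q '"completion": *100'; do
        sleep 60
    done
    # Pause it again before moving on to the next folder.
    curl -s -X PATCH -H "X-API-Key: $APIKEY" \
        -d '{"paused": true}' "$API/config/folders/$id"
done
```

This avoids the edit/restart round trips, though it still serializes the work the same way.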
Arch Linux doesn’t use RAM compression by default, so configuring zram can help quite a bit when there’s not enough physical RAM. It works transparently alongside swapping to disk, and is also much faster.
To check if zram is currently in use:
zramctl --output-all
(For more details, check out the several earlier threads covering zram.)
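If zram isn't set up yet, one common way to configure it on Arch is systemd's zram-generator. A minimal sketch of /etc/systemd/zram-generator.conf (the size expression and algorithm here are just illustrative choices):

```ini
# /etc/systemd/zram-generator.conf -- minimal sketch
[zram0]
# Size the zram device to half of physical RAM.
zram-size = ram / 2
compression-algorithm = zstd
```

After writing the file, the device is created on the next boot, or immediately with `systemctl start systemd-zram-setup@zram0.service`.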
Yeah; I’m using the AUR package zramswap. That command reports this:
NAME DISKSIZE DATA COMPR ALGORITHM STREAMS ZERO-PAGES TOTAL MEM-LIMIT MEM-USED MIGRATED COMP-RATIO MOUNTPOINT
/dev/zram0 444.7M 217.9M 103.1M lzo-rle 0 107.1M 0B 233.4M 587.7K 2.0337 [SWAP]
This Pi 3B has served me well as a Syncthing server for many years. It’s possible at this point it just doesn’t have enough RAM. Possibly something to note for people upgrading to 2.0. If I’d been using Raspbian, I wouldn’t have wound up upgrading, because Syncthing’s apt repo makes upgrading to 2.x an explicit choice by publishing it under a different path. Arch just threw 2.0 into their repos, which is what screwed me up, so I blame them (and myself for not paying attention to what was getting upgraded).
The output above looks pretty good, but it’s a bit surprising that the compression ratio is only barely above 2:1. On my NAS, it’s almost 4:1, and over 3:1 on my laptop.
I usually set DISKSIZE to almost equal the amount of physical RAM. On your RPi3, zramswap capped it at 444.7M. Raising it probably wouldn’t make much difference, though, since only about 50% of the allotted disk size is being used. (The Linux kernel docs for the zram module recommend no more than 2x physical RAM.)
Most of the time it’s not an issue, but it was a bit of a pain to crawl out of dependency hell after discovering that a commercial software package wasn’t compatible with an upgraded package – ah, the downside to a rolling distro.
If staying on the Syncthing 1.x branch works for you and you’d like to keep using the RPi3, one option is to roll back the update. There’s a good chance the package file is still in /var/cache/pacman/pkg; if not, it can be downloaded from the Arch Linux Archive.
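A sketch of what the rollback looks like (the exact package filename will differ; check the cache directory, or https://archive.archlinux.org/packages/s/syncthing/ if it's gone). Pinning the package afterwards keeps the next -Syu from re-upgrading it:

```shell
# Reinstall the cached 1.x package (filename is illustrative --
# match whatever version/architecture is actually in the cache)
sudo pacman -U /var/cache/pacman/pkg/syncthing-1.30.0-1-aarch64.pkg.tar.zst

# Then pin it by adding this line to /etc/pacman.conf:
#   IgnorePkg = syncthing
```

With IgnorePkg set, pacman will warn that the package is being skipped on each system upgrade rather than silently pulling in 2.x again.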
Yup; that’s exactly what I did. I think the 2.x branch is too resource-intensive for these small embedded devices. I tried until today to get it to work, but no matter what I did, I kept running out of RAM and not getting syncing to happen reliably.
Moving back to 1.30 works well. Maybe as the 2.x branch matures, optimizations will happen that’ll let me try it again on this device. I do have the 2.x branch working on a Libre Computer Renegade, but that has 4GB RAM.