Huge RAM usage (2GB+) on Synology DS1513+

Hi everybody,

I am currently looking into using Syncthing to sync the home folders of two office NASes, since the built-in service (Cloud Station) can’t do that.

When I start the initial folder scan, everything looks normal at first: memory usage slowly climbs to about 500-700 MB, which is fine since I have 2 GB. But a little while later, memory usage suddenly jumps to over 2 GB, which makes the whole NAS unresponsive (swapping, I presume), so I have to kill the process to free up the RAM and start over. I already changed the number of hashers to 1. That helped a lot, but the first folder scan still won’t complete.

Does anybody know why? I’m running DSM 5.2-5967 Update 2 with the SynoCommunity package. I just installed it, then clicked auto-upgrade to get to v0.14.15. That part worked like a treat…

The folder size is 1.12 TB, by the way. As far as I’ve read, memory usage should be independent of folder size? It does seem to be, most of the time.

Thanks!

(I’ve also got an htop screenshot, but I can’t upload it here.)

Syncing homes with Syncthing is a bad idea. Syncthing doesn’t sync ownership, so everything it creates on disk will be owned by the syncthing user, not the user the home directory belongs to.

As for the memory usage: I could imagine the indexes and LevelDB using that much RAM when scanning that many files.

There are a few things which incur extra memory cost. In the advanced settings, disable scan progress and set the progress emitter interval to 0.

Also, cacheIgnoredFiles (an advanced setting) should be off. That is the default, but it might have been enabled at some point historically.
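
Both of these are in the Advanced settings dialog, or directly in config.xml. Roughly like this in a v0.14-era config; the folder id and path are placeholders, and it’s worth double-checking the exact element names and values against your version:

```xml
<configuration>
  <folder id="homes" path="/volume1/homes">
    <!-- scan progress reporting; a negative value should disable it outright -->
    <scanProgressIntervalS>-1</scanProgressIntervalS>
  </folder>
  <options>
    <!-- the progress emitter interval mentioned above -->
    <progressUpdateIntervalS>0</progressUpdateIntervalS>
    <!-- false is the default -->
    <cacheIgnoredFiles>false</cacheIgnoredFiles>
  </options>
</configuration>
```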

Memory usage peaks when there are many files to send index entries for or to sync, which looks like it might be the case above (scanning not completed, possibly preparing a large index to send to NAS200). It’s surprisingly high for the amount of data in the system so far, though. The scanProgress setting that Audrius mentioned might have something to do with it, if this is the initial scan.

Hi, thanks for looking at this.

cacheIgnoredFiles was always off, and I don’t think scanProgress is the issue, as the peaking seems to happen as soon as the actual syncing starts. I have now managed to get it to scan the whole folder overnight, and once finished it used ~700 MB of RAM. The trick was to pause synchronization while scanning and to set hashers, copiers, and pullers to 1.
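
In config.xml, that amounts to something like this per folder, if I’ve read the v0.14 config format right (id and path are just my setup):

```xml
<folder id="homes" path="/volume1/homes">
  <!-- 0 means the automatic default; 1 trades speed for less CPU and RAM -->
  <hashers>1</hashers>
  <copiers>1</copiers>
  <pullers>1</pullers>
</folder>
```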

Unfortunately, two things are currently still show-stoppers:

  1. It seems to detect only half the data. The homes folder shows 1.12 TB on the NAS, but Syncthing reports it as ~700 GB. I do have some ignores set up, but only for #recycle and Synology’s @eaDir folders (see the sketch after this list), and I wouldn’t think those are that big. I will double-check, though.

  2. As soon as I unpause syncing (with both folders completely scanned), RAM usage again peaks at over 2 GB within ~30 seconds, which forces me to kill Syncthing.
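
For completeness, the ignores from point 1 look roughly like this in the folder’s .stignore (the (?d) prefix, as I understand it, lets Syncthing delete the ignored items if they would otherwise block a directory deletion):

```
// Synology housekeeping folders, at any level
(?d)#recycle
(?d)@eaDir
```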

PS: Right now it finally seems to at least sync, using >90% of the available RAM. Not ideal, but a start. The difference in folder size might be because the folders actually are out of sync: it just now started showing a couple of GB of out-of-sync items. I’d still like to reduce RAM usage, though. But if all of this behaviour is to be expected, then that’s alright; I guess I’ll have to go back to BTSync then (even though that one just stops syncing at some stage… and Cloud Station Sync doesn’t sync the homes folders. But that’s all a different topic.)

Thanks!!

I’m not entirely sure what your Syncthing is doing, but I have roughly the same amount of data and my memory usage hovers around 50-150 MB… Is this one of those funky builds with a large RAM page size, or a normal GitHub release?

Edit: No, that appears to be the WD My Cloud NAS thing. Curiously, though, there is this super old issue with someone else on a Synology seeing about the same:

Whereas, as mentioned, this is how it typically looks for me:

Wow!! Only 50 MB? At 800 GB folder size? No way. I noticed, though, that memory usage also seems to jump up as soon as I open the GUI. If I close it and leave it for a while, memory goes down to ~100-300 MB, and as soon as I go back to the GUI, it jumps back up to 500-700 MB (this is with the first scan finished and only one copier, puller, and hasher).

So it kind of seemed to work, but still: 50 MB would be awesome! I have fallen back to using Cloud Station Sync, which unfortunately won’t let me limit its bandwidth usage. I’d love to set up one instance of Syncthing per user to sync their home folders. With 50 MB of RAM per instance, that might be doable.

Yes, it’s the SynoCommunity Syncthing package on a DS1513+ with DSM 5.2. It just auto-upgraded to v0.14.16. I even managed to get inotify running via an Upstart job (roughly as sketched below). Bummer; it would be so good to use this.
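
The Upstart job is roughly the following, from memory; the paths, user handling, and start event are specific to my box, so take it as a sketch only:

```
# /etc/init/syncthing-inotify.conf
description "syncthing-inotify"
start on syno.network.ready
stop on runlevel [06]
respawn
# runs as root like this; drop privileges via setuid/su depending on your Upstart version
exec /usr/local/syncthing/bin/syncthing-inotify
```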

I just looked, but unfortunately I couldn’t find any logs. Is there a way to activate extra debug logging? I might give it another go to generate some data, if that would be of interest.

See if you can run one of the official releases from GitHub.
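
Regarding the debug-logging question: you can turn on per-subsystem debug output with the STTRACE environment variable when starting Syncthing by hand (the facility names are listed in `syncthing -help`; the paths below are just placeholders):

```sh
# stop the package's instance first, then run manually with debug output
STTRACE=scanner,model,db syncthing -home=/path/to/config -logfile=/tmp/syncthing-debug.log
```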

The auto-upgrade should have downloaded the official version from GitHub. AFAIK the Syno package uses the official binaries and only exists to provide an easy way to create the user and set up the config (port, home dir, …).

That’s what I thought, too. But yes, I can download a GitHub release, replace the binary, and test it.
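
Something like this, I suppose; the DS1513+ is an Intel Atom box, so linux-amd64 should be the right build, but the install path is my guess at where the SynoCommunity package keeps the binary:

```sh
# fetch the official build and swap it in while the package is stopped
wget https://github.com/syncthing/syncthing/releases/download/v0.14.16/syncthing-linux-amd64-v0.14.16.tar.gz
tar xzf syncthing-linux-amd64-v0.14.16.tar.gz
cp syncthing-linux-amd64-v0.14.16/syncthing /usr/local/syncthing/bin/syncthing
```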
