advice on max amount of shared folders

Hello, in the past few days I’ve been playing extensively with Syncthing to see if I can implement it in my work environment, and I find it really interesting! The way I can transparently manage devices, folders, and permission types is just great. Thank you for developing such a great tool!

Coming to my question: I’m looking for a smart solution to share folders in a very granular way with different people across multiple sites all over the world, and I’m wondering whether creating roughly 20k to 30k unique share points (folder IDs, basically) would become too complex for the system to handle. Would I hit any specific limitation that you are already aware of?

Thanks in advance for your feedback. Francesco

That’s an impressive number for sure :melting_face:. This hopefully isn’t going to be handled by a single device, is it? The largest folder count currently reported in the usage statistics is 536, which is already a lot.

The Web GUI won’t handle such a large number of folders, that’s for sure. A few hundred is probably the maximum it can operate with without massive lag. You will need to find other ways to interact with Syncthing if you add that many folders (e.g. through the command line or by editing config.xml directly).
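For instance, one non-GUI way to inspect the folder list is the REST API. A minimal sketch using only Python’s standard library (the address and API key below are placeholders you’d replace with your own; `/rest/config/folders` is Syncthing’s config endpoint returning all configured folders):

```python
import json
import urllib.request

# Placeholders: use your own GUI address and the API key from Actions > Settings.
BASE_URL = "http://localhost:8384"
API_KEY = "your-api-key-here"

def list_folders(base_url=BASE_URL, api_key=API_KEY):
    """Fetch all configured folders from a running Syncthing instance."""
    req = urllib.request.Request(
        f"{base_url}/rest/config/folders",
        headers={"X-API-Key": api_key},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())

def folder_ids(folders):
    """Extract just the folder IDs from the parsed JSON response."""
    return [f["id"] for f in folders]
```

With thousands of folders this scales far better than scrolling the Web GUI, since you only transfer and parse JSON.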


Actually yes, meaning that on average each device would handle at least 4–5k share points, and the main server which serves all the other devices would definitely host all of the roughly 30k. Folder management will only be done via the API.

I work in computer graphics, and in the current project we decided that each version of a 3D asset gets a set of folders, each of which contains certain data. What I’m envisioning is basically a system where I can expose each version on its own and sync it once the result is approved. We have a range of 400 assets, each of which will have at least 3–4 versions published, with at least 3 folders to share each time. So getting to 4k share points is very easy… most likely I’ll have an even greater number to handle :slight_smile:
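As a rough sketch of that arithmetic, assuming a hypothetical `asset_version_part` naming scheme (the asset names, folder kinds, and paths below are all invented for illustration), generating the folder configurations could look like:

```python
# Illustrative numbers from the post: 400 assets, 4 versions, 3 folders each.
ASSETS = [f"asset{n:03d}" for n in range(400)]
VERSIONS = range(1, 5)               # versions v1..v4
PARTS = ["model", "texture", "rig"]  # hypothetical per-version folder kinds

def make_folder_config(asset, version, part, root="/data/shares"):
    """Build a minimal dict for one folder entry (real Syncthing folder
    configs carry many more options; these are just the basics)."""
    fid = f"{asset}_v{version}_{part}"
    return {"id": fid, "label": fid, "path": f"{root}/{asset}/v{version}/{part}"}

configs = [
    make_folder_config(a, v, p)
    for a in ASSETS for v in VERSIONS for p in PARTS
]
print(len(configs))  # 400 * 4 * 3 = 4800 share points
```

Each generated entry could then be pushed to Syncthing via the REST API, which keeps the whole structure reproducible from the asset database.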

Seems like I have to run some tests and see what happens!


Please report back once you’ve done your testing :slightly_smiling_face:. With this number of folders, you will likely be the pioneer here!

You may notice that the max number of shared folders reported is now 1.4k; that’s my first test :slight_smile:

So far I can say that:

  • the API calls to fetch all the folders are pretty fast
  • the Web GUI is a bit slow but still responsive (I’m just using it to verify the feedback rate)

I’m now sharing all the data on this device with a second device; let’s see if anything breaks. I’ll keep reporting here :slight_smile:


It does sound to me like having a single folder and moving content in/out of it might be a better option.


Impressive to hear that the GUI is still usable. Can you share more specific information on the hardware and the operating system configuration? Also, which browser do you use to operate the Web GUI, and do you access it locally or remotely?

On a side note, your number of folders has already been recorded in the usage statistics :innocent:.

Hello Audrius, I’m trying to picture your suggestion but I don’t get how it could work…

Just to provide you with a bit more context I would like to satisfy the following needs:

  • I have a central place where all the contributions from each device are stored
  • each time some work is produced, it needs to be delivered to the central place and from there distributed in read-only mode, to avoid potential user mistakes
  • each share point may be distributed to an undefined number of devices, depending on the work setup, which may change based on production needs
  • when a new version of some work is produced, it deprecates the previous version, so newcomers to the project may not need all the share points but just a subset of them.

I’m super curious to hear your idea, and I keep thinking about it.

Hey Tomasz86, I’m using my own home PC (which is getting quite old now) and it still performs reasonably well. My PC is:

  • an Intel i7-4770 with 32 GB of RAM
  • using Google Chrome Version 108.0.5359.125 (Official Build) (64-bit)

Let me give you some statistics:

  • Chrome uses 15% of my CPU and roughly 3 GB of RAM when localhost:8384 is open
  • I currently have 1592 folders as share points
  • it takes roughly 50 seconds to scan the whole set
  • I can still perform any action via the GUI (there’s a 2–3 second lag when I hit a button)

When I perform any operation via the API, the high number of share points requires a little additional time compared to when there is only a handful of them; for example, getting the list of all the share points via Python using the REST API takes in the range of 5 seconds.

Creating 1000 new share points takes roughly the same amount of time. And restarting the system (after creating all those new entries) also takes a very short amount of time.
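For reference, batch-creating folders like this might be sketched as follows, assuming the REST API’s `PUT /rest/config/folders/<id>` endpoint (which creates or replaces a single folder entry); the address, API key, IDs, and paths are all placeholders:

```python
import json
import urllib.request

BASE_URL = "http://localhost:8384"  # placeholder address
API_KEY = "your-api-key-here"       # placeholder key

def put_folder(cfg, base_url=BASE_URL, api_key=API_KEY):
    """Create or replace one folder entry via PUT /rest/config/folders/<id>."""
    req = urllib.request.Request(
        f"{base_url}/rest/config/folders/{cfg['id']}",
        data=json.dumps(cfg).encode(),
        headers={"X-API-Key": api_key, "Content-Type": "application/json"},
        method="PUT",
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

def batch_configs(n, root="/data/shares"):
    """Generate n minimal folder configurations with illustrative IDs/paths."""
    return [{"id": f"share{i:04d}", "label": f"share{i:04d}",
             "path": f"{root}/share{i:04d}"} for i in range(n)]

# Against a running instance (not executed here):
# for cfg in batch_configs(1000):
#     put_folder(cfg)
```

One request per folder keeps each change small; wrapping the loop in timing code would reproduce the measurements above.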

So far I’m pretty happy with what I’m getting!

This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.