Millions of files

I am currently using another BitTorrent-based sync system, but their support is becoming more and more lacking. I’m wondering how this product would handle a system that has millions of files to index? We wouldn’t be constantly syncing millions of files - the need is to index millions of files but then sync probably only a few thousand a day. Does anybody have any experience with this?

These files range from a few KB to hundreds of GB, totaling somewhere around 10+ TB. The syncing would go from computer to computer, with several Synology NAS devices also operating as fully synced nodes, so that there is redundancy should any one full node go down.

Generally, setups of this size do exist: https://data.syncthing.net/

However, I doubt that a Synology NAS can handle such an amount of data performance-wise - hashing 10+ TB of data is hardly going to go down well.
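
For a rough sense of the cost, here’s a back-of-the-envelope estimate of the initial scan. The 100 MB/s hash rate is an assumption for a low-powered NAS CPU; Syncthing reports its actual measured hash performance in the log at startup:

```go
package main

import "fmt"

func main() {
	const totalTB = 10.0   // total data to index
	const hashMBps = 100.0 // assumed sustained hash rate on a weak NAS CPU

	totalMB := totalTB * 1e6 // decimal TB -> MB
	seconds := totalMB / hashMBps
	fmt.Printf("Initial hash of %.0f TB at %.0f MB/s: about %.0f hours\n",
		totalTB, hashMBps, seconds/3600)
}
```

That works out to roughly a day of continuous hashing just for the first scan, before any syncing happens - and longer still if the CPU is shared with other NAS workloads.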

Hello:

I’m running a very similar setup - Syncthing on a number of desktops, with two Synology NAS units keeping ‘long-term’ copies. I currently have about 4 million files across ~40TB.

I find the setup works well - but with some caveats:

  • You’ll need plenty of RAM. My units (with the maximum possible 6GB) struggle - though they do generally work;
  • Ideally put the Syncthing database on a small pair of mirrored SSDs rather than the main data array, otherwise you’ll clobber your NAS’ performance;
  • Be patient. With a low-powered CPU and large database, you can be waiting many minutes for operations in the UI to complete;
  • There’ll be a small amount of manual setup tweaking required - e.g. letting the file watcher use more kernel memory by raising the inotify watch limit (see the sketch after this list). It’s not a really simple install.
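
On the watcher point, here’s a Linux-only sketch for checking whether the kernel’s inotify watch limit can cover a folder tree - Syncthing’s watcher needs roughly one inotify watch per directory, and each watch consumes a bit of kernel memory. The /volume1/data path is just an example; substitute your own folder:

```go
package main

import (
	"fmt"
	"io/fs"
	"os"
	"path/filepath"
	"strconv"
	"strings"
)

func main() {
	root := "/volume1/data" // example Synology share; adjust to your folder

	// Count directories: the watcher needs about one inotify watch each.
	dirs := 0
	filepath.WalkDir(root, func(path string, d fs.DirEntry, err error) error {
		if err == nil && d.IsDir() {
			dirs++
		}
		return nil
	})

	// Current per-user watch limit exposed by the kernel.
	raw, err := os.ReadFile("/proc/sys/fs/inotify/max_user_watches")
	if err != nil {
		fmt.Println("could not read inotify limit:", err)
		return
	}
	limit, _ := strconv.Atoi(strings.TrimSpace(string(raw)))

	fmt.Printf("directories: %d, inotify watch limit: %d\n", dirs, limit)
	if dirs > limit {
		fmt.Println("raise fs.inotify.max_user_watches (sysctl) before enabling watching")
	}
}
```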

Other than those points, I’m delighted with how such a setup works.

Moisie stated it well and pointed out the important things.

You need RAM and time.

We have several similar setups running. Some have a few TB more or less, some a million files more or fewer, but it’s all working fine.

Impatience is my vulnerability…
