I recently bought a Synology DS218play NAS, partly because I could run Syncthing on it. However, it also runs Plex, and with large folders getting scanned, performance is utterly poor, thereby ruining my viewing pleasure.
I’m thinking of setting the scan interval to zero, and “scheduling” scans overnight. Does this seem like a reasonable approach?
The files are changed on the computers, not the NAS. So I assume that if one is changed, the watcher there will send it (soon) to the NAS. And if a watcher is set up on the NAS, it would then (soon) send it on to the other synced folders. I could turn off the watcher on the NAS for folders that contain large files, and they would sync overnight after the scan finishes (or I could throttle the bandwidth).
Is my logic sound?
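For reference, "scan interval zero, watcher off" corresponds to two per-folder settings, which can be changed in the GUI under the folder's Edit > Advanced tab or seen as attributes on the folder element in Syncthing's config.xml. This is just a sketch; the folder ID and path here are made-up examples:

```xml
<!-- Fragment of Syncthing's config.xml; "media" and the path are example values. -->
<folder id="media" label="Media" path="/volume1/media"
        type="sendreceive" rescanIntervalS="0" fsWatcherEnabled="false">
    ...
</folder>
```

With both disabled, the folder is only rescanned when something explicitly asks for it.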
I assume that you have installed Syncthing through the Package Center. In that case it makes more sense to start and stop the service at the desired times in DSM using the Task Scheduler in the Control Panel. To do this:
Create > Scheduled Task > Service > Select Syncthing …
That way, no special intervention, scripts, etc. are required.
You don’t need the watcher enabled on the Synology if nothing is ever changed there locally. Changes received from other devices are always propagated to third parties; that has nothing to do with scanning, AFAIK.
However, stopping the service completely would be unwise, as that would prevent exactly what’s described above: receiving and propagating changes from other nodes.
If scanning during the night doesn’t bother you, check out the REST API to trigger an immediate scan on your desired schedule.
What I described was more of a question: why should a service run and register changes when it isn’t supposed to do anything? That would only waste resources. There’s no reason the watcher should be running. With P2P, the devices can connect to each other and synchronize without the NAS, so it makes no sense.
So triggering a sync via the API, so that a sync then takes place, amounts to the same thing as starting the service. Besides, that approach requires scripts, which I also find cumbersome.
Thanks for the info. I was going to leap straight down the path of using the API, but it’s simpler to start by scheduling Syncthing and seeing how well that fits my needs. I was already concerned about the (easy-to-forget) dependency on scripts (although I centralise them in Node-RED).
Currently I have the syncs set up as hub-and-spoke with the NAS as the hub, so the computers would not sync until that night. I now realise this is a bit naive, and will also look at making it a mesh, so the computers sync soon and the NAS syncs overnight.
Well, just keep in mind that the NAS will not even receive changes from other devices while the service is stopped completely. They therefore need to be online during your scheduled window for any synchronization to happen at all. Syncthing is designed to run continuously, so limiting its running time takes away its chance to do whatever it needs to do. Consider, for example, what happens when the scheduled window isn’t even long enough to finish syncing.
Utilizing the API is just one command (e.g. curl) to run from the scheduler. No need for scripts really.
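A minimal sketch of that one command, assuming Syncthing’s GUI/API listens on the default localhost:8384. The API key and folder ID below are placeholders; the real key is shown under Actions > Settings in the GUI, and the folder ID in the folder’s panel:

```shell
# Trigger an immediate rescan of one folder via Syncthing's REST API.
# "media" and the API key are placeholder values; substitute your own.
curl -X POST \
     -H "X-API-Key: abcdef1234567890" \
     "http://localhost:8384/rest/db/scan?folder=media"
```

Leaving off the `folder` parameter rescans all folders, so a single scheduled task could cover everything.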
That is clear and correct.
I have 3 Synologys that don’t run 24/7, sometimes only for a few hours a day. Syncthing starts with them right away, and they synchronize whenever they are running. It’s basically the same. Even if the watchers were active, indexing and synchronization would still have to take place.
This topic was automatically closed 30 days after the last reply. New replies are no longer allowed.