I am currently using staggered file versioning and have recently run into some disk space limits.
Searching for large files, I noticed that the versions of a single large file take up a significant portion of the disk space. I wanted to permanently delete those old versions without disabling file versioning entirely, but failed to find a way to do so in the Syncthing UI. So my questions are:
1. Is it possible to exclude specific files, or files matching some (size/name) criterion, from versioning, and I just did not find the option? If yes, how?
2. Is it possible to remove old versions of specific files through the Syncthing UI? If yes, how?
3. If the answer to 2. is no, can I just remove the old versions using rm, or will that mess up some internal database?
There are no options to control versioning like that. You would basically need to write your own external versioning script to achieve what you want here. The GUI is also limited and doesn’t offer any option to clean out versions manually, either all of them or one by one (I think there is a long-standing issue for that on GitHub). However, yes, you can simply delete the versioned files from disk with no problems.
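To make the external-versioning route concrete, here is a minimal sketch of such a script. Syncthing’s external versioning runs a configured command with `%FOLDER_PATH%` and `%FILE_PATH%` substituted and expects the command to move the file out of the way; everything else here (the `MAX_SIZE` threshold, the timestamp suffix, the `.stversions`-style layout) is my own assumption, not a Syncthing default:

```python
#!/usr/bin/env python3
"""Sketch of an external versioner for Syncthing.

Configure the folder's versioning type as "external" with a command like:
    /path/to/versioner.py %FOLDER_PATH% %FILE_PATH%
Syncthing invokes it when a file is about to be replaced or deleted and
expects the file to be gone afterwards.
"""
import os
import shutil
import sys
import time

# Assumption for illustration: files above this size are not kept as versions.
MAX_SIZE = 100 * 1024 * 1024


def version(folder_path: str, file_path: str) -> None:
    """Move the file into a versions folder, or just delete it if too large."""
    src = os.path.join(folder_path, file_path)
    if os.path.getsize(src) > MAX_SIZE:
        os.remove(src)  # too big: drop it instead of keeping a version
        return
    stamp = time.strftime("%Y%m%d-%H%M%S")
    dst = os.path.join(folder_path, ".stversions", file_path + "~" + stamp)
    os.makedirs(os.path.dirname(dst), exist_ok=True)
    shutil.move(src, dst)


if __name__ == "__main__":
    version(sys.argv[1], sys.argv[2])
```

The same idea extends to name-based criteria: match `file_path` against a pattern before deciding whether to keep a version.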
I see in the source that the versioner also seems to respect the copyRangeMethod setting, which in theory means that on a modern filesystem the data can potentially be “cloned” at the block level (which I suppose corresponds to what I know as “reflink” on Unix, though there seem to be similar concepts on other OSes), right?
Assuming that OP’s large files only change a few bytes between versions, there is a possibility that the many large version files share the same data blocks to a large extent, correct?
Maybe it is worth checking whether that method is active (as it would mean that deleting the versioned files frees little space), and a potential solution to the problem may be to switch to a filesystem that supports it (albeit a bit drastic).
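For anyone wanting to check: copyRangeMethod is an advanced per-folder option, visible in the advanced configuration GUI or in config.xml. A rough sketch of what the config fragment looks like (the folder id and path are placeholders, and whether cloning actually happens still depends on the underlying filesystem supporting it):

```xml
<folder id="default" path="/data/sync">
    <!-- "standard" (the default) copies data normally; "ioctl" attempts
         FICLONERANGE-style block cloning on Linux filesystems such as
         Btrfs or XFS. Verify the exact value names against your
         Syncthing version's documentation. -->
    <copyRangeMethod>ioctl</copyRangeMethod>
</folder>
```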