Dupe files

Hi, I have some trouble when renaming and moving files. For example, I place a file named A.txt in the folder of a “slave”. Then, in the master (read only) folder, I place the same file under a different name, like Abis.txt. The sync process copies this file to the slave although it’s already there, so I end up with an A.txt and an Abis.txt that are identical (same hash). I suggest scanning the destination folder before syncing to check whether the potential new file is already present. Normally I wouldn’t mind scanning for duplicates and removing them afterwards, but in this case, since it generates a high volume of traffic, I would prefer to avoid it. Thanks!

Blocks will be reused when we are aware of them. By default we scan for changes every minute, soon to be event driven instead. If you’ve changed the scan interval and added a file that has not yet been scanned, it won’t be reused when pulling… if that’s what you mean. But this should be fairly unusual.
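Not Syncthing’s actual implementation, just a minimal Go sketch of the block-reuse idea described above: split files into fixed-size blocks, hash them, and count how many blocks of the “incoming” file already exist locally and therefore would not need to be transferred. The 128 KiB block size and the file names from the example above are assumptions for illustration; in the real protocol the block list of the incoming file comes from the remote index rather than from reading it locally.

```go
package main

import (
	"crypto/sha256"
	"fmt"
	"os"
)

// Assumed block size for this sketch.
const blockSize = 128 * 1024

// hashBlocks splits a file into fixed-size blocks and returns a set of block hashes.
func hashBlocks(path string) (map[[32]byte]bool, error) {
	data, err := os.ReadFile(path)
	if err != nil {
		return nil, err
	}
	index := make(map[[32]byte]bool)
	for off := 0; off < len(data); off += blockSize {
		end := off + blockSize
		if end > len(data) {
			end = len(data)
		}
		index[sha256.Sum256(data[off:end])] = true
	}
	return index, nil
}

func main() {
	// Index the blocks of the file that already exists locally (A.txt),
	// then check how many blocks of the "new" file (Abis.txt) are already
	// available locally and would not need to be transferred.
	local, err := hashBlocks("A.txt")
	if err != nil {
		fmt.Println("scan failed:", err)
		return
	}
	incoming, err := hashBlocks("Abis.txt")
	if err != nil {
		fmt.Println("scan failed:", err)
		return
	}
	reused := 0
	for h := range incoming {
		if local[h] {
			reused++
		}
	}
	fmt.Printf("%d of %d blocks could be reused locally\n", reused, len(incoming))
}
```

The catch described above is the same as in the sketch: the local index only knows about blocks that have been scanned at least once, so a file added between scans can’t be reused yet.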

Thanks for your answer. I’m not sure whether this situation applies to me, as I’m “playing” with Syncthing to discover it, and I must confess my directories are not really “stable” because I add and remove files to see what happens. So if I understand you correctly, when a file is already present, or partially present, the parts already there should be reused to reduce the transfer volume, even if the file has another name?

Yep, as long as Syncthing has had the opportunity to scan it once.
