Syncing folders with constant changes and big files

Continuing the discussion from One-Time synchronization with constant file changes:

I have a very similar scenario, but I don’t care much about the two phases Edelf mentioned; I just want all files to be synced to the second machine.

The problem in my scenario: we have some large files (>1 GB) that are constantly changing. Even so, I want them synced to the other machine so that we have a failover system in case the first one fails.

Currently “37 Objects, ~4.69 GiB” won’t sync, because the files keep getting modified during transfer, over and over again.

Is there any way or mode to enable transfer of such changing files?

Maybe making a snapshot copy before syncing, or something along those lines?
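On a copy-on-write filesystem the snapshot idea can be made near-atomic. A hedged sketch, assuming btrfs and assuming the live data already sits in a subvolume (the paths here are hypothetical, not from the original post):

```shell
# Assumption: the live folder is a btrfs subvolume at /data/live (hypothetical path).
# A read-only snapshot is created near-instantly, so the big files inside it stop
# changing and the sync tool can watch the snapshot instead of the live folder.
btrfs subvolume delete /data/sync-snap 2>/dev/null  # drop the previous snapshot, if any
btrfs subvolume snapshot -r /data/live /data/sync-snap
```

LVM snapshots or ZFS `zfs snapshot` would serve the same purpose; the key point is that the snapshot is taken atomically, unlike a plain `cp`.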

I have also thought about combined solutions, such as using a cronjob to copy the folder to a secondary folder every so often and syncing that one instead, but that causes other problems: deletions no longer get synced, and it adds complexity in general.
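For what it’s worth, the “deletions do not get synced” part of the cronjob approach is solvable: mirroring with `rsync --delete` also removes files from the staging copy once they disappear from the live folder. A minimal sketch, assuming `rsync` is available and using throwaway directories in place of the real paths:

```shell
# SRC stands in for the live folder; STAGE is the staging copy that the
# sync tool would actually watch. Both are temp dirs for this demo.
SRC=$(mktemp -d)
STAGE=$(mktemp -d)

echo "data" > "$SRC/keep.bin"
echo "data" > "$SRC/gone.bin"

# First cron run: mirror SRC into STAGE.
rsync -a --delete "$SRC/" "$STAGE/"

# A file is deleted from the live folder between runs...
rm "$SRC/gone.bin"

# ...and the next run propagates the deletion, because --delete removes
# files from STAGE that no longer exist in SRC.
rsync -a --delete "$SRC/" "$STAGE/"

ls "$STAGE"
```

This doesn’t fix the atomicity problem for files that change mid-copy, but it does keep the staging folder an exact mirror, deletions included.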

Any ideas?

Greetings!

It won’t work: syncing is not an atomic process, and neither is a copy, so whatever your file is, it will most likely end up corrupt, with different parts of the file coming from different points in time.

