Syncing growing video file - hash errors

I am trying to use Syncthing to do a one way copy of growing video .ts files from 20 laptop recorders to a single server over a LAN. It works sporadically while the files are being recorded and logs hash errors because the file changed while it was generating the hash. I am looking for a way to have it sync more regularly without having to stop or segment the recording.

I have workflow reasons for needing to copy growing .ts or .mkv files. I am open to education on the technical difficulties of this use case. I am guessing industries such as the security industry have tricks to make something like this work. I am receptive to alternate tools or services that will help.

Continuously syncing/copying growing files is easy, but not using the method implemented by Syncthing. I think you need to use a different tool, possibly one tailored for the purpose.

Thanks Calmh. Can anyone point out search terms I can google, or the job title of a consultant who specializes in this type of storage transfer?

Syncthing could work with this if you set up some kind of a script that would periodically (e.g. once per hour) create a copy of the video file in a separate folder that would be then synced. In other words, don’t sync the main video file, only the copied version that doesn’t change and thus can be hashed in full. Obviously, doing so would require twice as much space on the disk.
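The snapshot idea above can be sketched in a few lines of Python. This is a minimal illustration, not a tested workflow: `snapshot` is a hypothetical helper name, and the copy itself still reads a file that is changing, but the resulting copy stops changing once it's written, so Syncthing can hash it in full. You'd run it hourly with Task Scheduler or cron.

```python
import shutil
from pathlib import Path

def snapshot(src: Path, dest_dir: Path) -> Path:
    """Copy the current state of a growing file into a separate
    folder watched by Syncthing. The copy no longer grows after
    it's made, so it can be hashed and synced without errors."""
    dest_dir.mkdir(parents=True, exist_ok=True)
    dest = dest_dir / src.name
    # copy2 preserves timestamps; the destination is a plain,
    # static file once the copy finishes.
    shutil.copy2(src, dest)
    return dest
```

Each run overwrites the previous snapshot, so Syncthing rescans and re-syncs the whole file every time, which is part of why this needs twice the disk space and isn't cheap on bandwidth either.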

Thanks tomasz86. That is an idea. Sounds sort of like Windows' Volume Shadow Copy feature for backups. I'll have to look down that path and see what is available.

Are the video files on the server being used by something or someone while they are being updated, or is the goal to have a duplicate copy of what’s on the laptop recorders?

During the recording, a video editor will be reading (but not writing) the growing .ts files on the server. At the end of each day we stop the recorders and intend to have an identical duplicate copy on the server.

Even if you get Syncthing to work somehow, it’s going to be wildly inefficient because the entire file gets rewritten on the destination every time it gets updated. I think you should find something else that just streams the file there concurrently somehow.

Thanks Calmh. I have started looking for alternate services and methods for this use case. I just have to find the right search terms to get some direction.

In that case, it sounds like the --append option in rsync might be a viable solution. It can only be safely used on files that only grow in size and when the receiving end isn’t being modified by some other process.

The caveat is that rsync isn't a continuous sync tool, so it must be run by a script in a loop or as a scheduled task. It'll still be slower than a network file share, but since --append doesn't rehash the data that's already been transferred, it's considerably faster than the default rsync mode, Syncthing, rclone, etc.
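The append-only transfer that rsync --append performs can be illustrated with a short Python sketch. This is a simplified model, not a replacement for rsync: `append_copy` is a hypothetical helper, and as noted above it's only safe when the source strictly grows and nothing else modifies the destination.

```python
import os

def append_copy(src: str, dest: str) -> int:
    """Copy only the bytes of src that extend beyond the current
    size of dest -- the same idea as rsync --append. Previously
    transferred data is never re-read or re-hashed."""
    start = os.path.getsize(dest) if os.path.exists(dest) else 0
    with open(src, "rb") as fin, open(dest, "ab") as fout:
        # Skip the portion already present on the receiving end,
        # then append everything written since the last run.
        fin.seek(start)
        while chunk := fin.read(1 << 20):
            fout.write(chunk)
    return os.path.getsize(dest)
```

Run in a loop or scheduled task, each pass only moves the bytes recorded since the previous pass, which is why this approach scales to large growing files where whole-file rehashing does not.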

Since the laptop recorders and server are all on a LAN, have you considered using a distributed filesystem or something along those lines?

What OS is on the laptops?


Laptops are currently Windows 10. I have not seriously considered a distributed file system. We hope to migrate the workflow to sync with the cloud during phase 2 of our project so I will have to plan for that as I implement this.

Would it make a difference if was enabled (and supported by the OS and the file system, of course)?


I think it will still effectively not work. The file is changing as it's being scanned, so it will most likely error out 99% of the time.

Solved - or moved to another forum.

Thank you to those who read and responded. I will be exploring some other tools for this specific task and using Syncthing for more typical scenarios.

I am all ears when it comes to further comment but I think I should consider this post solved.
