How to handle continuously updated files

Hello,

I need some help adjusting the settings for better performance. I have a few files that are continuously updated, like log files, that I do want to back up, but I don’t need them to be up to the second; being 1-5 minutes old is fine.

So I’m looking to see if there is a way to say/configure “don’t back up a file more than once every X seconds”. “Fs Watcher Delay” doesn’t really help: since the file is updated every second, every update is just delayed by 10 seconds, and I still end up with a sync for every change.
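For reference, in case it helps to see where these knobs live: they sit on the folder element in Syncthing’s config.xml (the same settings are under the folder’s advanced options in the GUI). This is only a trimmed sketch; the folder id and path are placeholders and other attributes and child elements are omitted:

```xml
<!-- Sketch only: id/path are placeholders, other attributes omitted. -->
<folder id="logs" path="/var/log/myapp" type="sendonly"
        rescanIntervalS="3600"
        fsWatcherEnabled="true"
        fsWatcherDelayS="10">
</folder>
```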

Something that I need to note is that I do use the inotify watcher.

Thank you.

I suspect that if the files are in fact updated every second, they will never sync properly, as the actual contents when sent will not match the previously hashed contents. Apart from that, this might be a case where you want to use periodic scans rather than notifications.
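If you go the periodic-scan route, a minimal sketch of that folder configuration would be turning the watcher off and setting a rescan interval in the range you can live with (again, id and path are placeholders and other attributes are omitted):

```xml
<!-- Sketch only: watcher disabled, rescan every 5 minutes. -->
<folder id="logs" path="/var/log/myapp"
        fsWatcherEnabled="false"
        rescanIntervalS="300">
</folder>
```

With the watcher disabled, the folder is only rescanned every rescanIntervalS seconds (300 s = 5 minutes here), which matches the 1-5 minute staleness you said is acceptable.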


What Jakob wrote is the main problem regardless; still, I’d like to correct the notion that raising the watcher delay just shifts each change by 10 seconds and you still end up with a sync per change:

That’s not how it works; it’s actually the contrary. Incoming events (file changes) are aggregated, and if there are repeat events for a single file, the scan is delayed further. That means your file that changes every second ends up being scanned only every 60 s.
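To illustrate the aggregation idea, here is a toy sketch only, not Syncthing’s actual implementation (the scheduling details differ); the names PendingScan, BASE_DELAY and MAX_DELAY are made up for illustration, and the 10 s delay and 60 s cap are just the numbers mentioned in this thread:

```python
BASE_DELAY = 10   # seconds; stands in for the watcher delay setting
MAX_DELAY = 60    # assumed cap for files that keep changing

class PendingScan:
    """Toy 'aggregate and back off' scheduler for a single file."""

    def __init__(self):
        self.first_event = None  # when the current burst of changes started
        self.due_at = None       # when the next scan is scheduled

    def on_change(self, now):
        """A filesystem event arrived for the file."""
        if self.first_event is None:
            self.first_event = now
            self.due_at = now + BASE_DELAY
        else:
            # Repeat event: push the scan back, but never past the cap.
            self.due_at = min(now + BASE_DELAY, self.first_event + MAX_DELAY)

    def maybe_scan(self, now):
        """Run the scan once the (repeatedly extended) deadline passes."""
        if self.due_at is not None and now >= self.due_at:
            print(f"scan at t={now}s")
            self.first_event = None
            self.due_at = None

# Simulate a file that changes once per second for three minutes:
# it gets scanned roughly once per minute, not once per second.
scan = PendingScan()
for t in range(180):
    scan.on_change(t)
    scan.maybe_scan(t)
```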
