Downloading large files in one folder prevents pushing changes to other folders

I have had this problem for a while, but I am not sure whether this is an actual defect or “by design”.

There are multiple folders shared between PC1 and PC2. One of them consists mainly of large files, 1.5 GB and bigger. When such a large file is being uploaded from PC2 to PC1, changes taking place in other folders on PC1 are not pushed to PC2.

As you can see in the screenshot, the upload rate is basically stuck at 0 B/s. As soon as I pause the folder with the large files, all the queued “Out of Sync Items” are pushed from PC1 to PC2 almost immediately.

Is this normal, and if not, what can be done about this? PC2 itself has its upload speed limited to 25 KB/s (on a 10Mb/1Mb connection), while PC1 is unlimited (on a 100Mb/10Mb connection). There is no other significant network traffic taking place on the two PCs.

I have checked the logs, but there does not seem to be any suspicious activity there. What kind of debugging options should I try to enable in order to investigate this problem?

If this is a single folder with subfolders, then this is by design, as I think it works its way through the files one by one. When it hits the large file, it hashes it before moving on.

I had a similar issue, and I broke the folders down into separate jobs so that a 300+ GB file didn’t stall other files.

If this was all in the same folder, I would understand, but no, in this case one folder is blocking 8 other folders. The direction is also different, as the folder with large files uploads data from PC2 to PC1, while the changes in all those other folders are pushed from PC1 to PC2.

Given the very low upload limit, what I can imagine happening is that the data upload from PC2 to PC1 prevents PC2 from even requesting data from PC1. And 25 KB/s seems unreasonable anyway: ~20 GB / 25 KB/s ≈ 800,000 s, i.e. roughly 9 days.

Yeah, but what should I do about this?

Without any limit, Syncthing maxes out the slow 10Mb/1Mb ADSL connection, which makes using the Internet for other tasks impossible. I have tried different limits like 50KB/s or 75KB/s, but it had negative consequences on things like remote desktop, which was noticeably slower, hence the 25KB/s limit.

Why would the upload limit affect its downloading capability though?

It needs to request the data it downloads, and those requests are an “upload”. Anyway, that’s just a pretty baseless theory. Enable model debug logging to get more info on what’s actually going on.
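For reference, the model facility can be toggled at runtime in the GUI (Actions > Logs > Debugging Facilities) or over the REST API. A minimal sketch, assuming a local GUI on 127.0.0.1:8384 and a placeholder API key:

```python
import requests

API_KEY = "abc123"  # placeholder; the real key is shown under Actions > Settings
BASE = "http://127.0.0.1:8384"

# POST /rest/system/debug?enable=model turns the facility on at runtime.
resp = requests.post(
    f"{BASE}/rest/system/debug",
    params={"enable": "model"},
    headers={"X-API-Key": API_KEY},
)
resp.raise_for_status()
print("model debugging enabled")
```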

Regardless: if you have such limited bandwidth, is it really practical to sync that amount of data?

Thank you. I have enabled model debug logging on both machines. I will let it run for a while and then check the logs.

It is very likely not, but I need to transfer the data anyway. It does not matter if it takes a few months to do it :slight_smile:. I do not want to have the changes in other folders delayed by that time though.

Just for the record, this is how the transfer looks on PC1 after pausing and resuming the problematic folder.

It seems to go like this for a moment, and then the upload rate stalls.

It’s not baseless: downloads require requests to go out, so a limit in either direction will hurt performance both ways.

During the last few hours, I have tested setting 25 KB/s, 50 KB/s, 75 KB/s, 100 KB/s, and finally an unlimited upload rate on PC2, and only with the unlimited setting does there seem to be any downloading activity taking place. With any other upload rate limit imposed, downloading stalls.

I have also checked the logs with model debugging enabled, but I am not seeing anything suspicious there. I would share them, but there are thousands of file names listed there, so I will need to edit them heavily first.

This is how the situation looks with no bandwidth limits set.

Of course, in this state, the upload is maxed out, so any other tasks like remote desktop, etc. are impossible.

That was indeed a bad choice of words; what I meant is that I haven’t even done a back-of-the-envelope estimation of the size of a download request. However, that doesn’t actually matter: there are 64 MiB of data being uploaded, which is plenty to contend with and drown out other BEP messages. It might even make sense to prioritize “non-data” transfers, but I think that’s not possible in the current scheme (the limiter just sees bytes); I haven’t checked, though.

Do you or anyone else have access to the remote machine within the next months? Transferring the data by mail or courier seems like a much better approach here.

If it has to go over the “wire”, I suggest using rsync/unison/… with a rate limit for this huge transfer, and Syncthing with another limit for the less prohibitively sized rest.

Maybe Syncthing’s compression (set to “All Data” in the device settings) also helps a little.
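For reference, the same setting can also be changed over the config REST API (Syncthing v1.12 or newer). A minimal sketch, where the device ID and API key are placeholders:

```python
import requests

API_KEY = "abc123"         # placeholder
BASE = "http://127.0.0.1:8384"
DEVICE_ID = "AAAAAAA-..."  # placeholder: the remote device's ID

# "always" corresponds to "All Data" in the GUI; the other values are
# "metadata" (the default) and "never".
resp = requests.patch(
    f"{BASE}/rest/config/devices/{DEVICE_ID}",
    json={"compression": "always"},
    headers={"X-API-Key": API_KEY},
)
resp.raise_for_status()
```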

Write a script that limits the upload rate during business hours and allows full speed at night and on weekends. See REST API — Syncthing v1 documentation.
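A minimal sketch of such a scheduler against the config REST API (Syncthing v1.12 or newer); the API key, hours, and limits below are placeholders to adapt:

```python
import datetime
import time

import requests

API_KEY = "abc123"  # placeholder; the real key is under Actions > Settings
BASE = "http://127.0.0.1:8384"
DAY_LIMIT_KBPS = 25    # cap during office hours
NIGHT_LIMIT_KBPS = 0   # 0 = unlimited

def set_send_limit(kbps: int) -> None:
    # PATCH only the one option; the rest of the config stays untouched.
    resp = requests.patch(
        f"{BASE}/rest/config/options",
        json={"maxSendKbps": kbps},
        headers={"X-API-Key": API_KEY},
    )
    resp.raise_for_status()

while True:
    now = datetime.datetime.now()
    business_hours = now.weekday() < 5 and 8 <= now.hour < 18
    set_send_limit(DAY_LIMIT_KBPS if business_hours else NIGHT_LIMIT_KBPS)
    time.sleep(15 * 60)  # re-check every 15 minutes
```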

Or copy it onto an SD card and send it by mail, then copy the files on the other side and rescan in Syncthing. Or use QoS software that checks whether the internet connection is needed elsewhere and limits Syncthing’s upload. Or pay an intern to pause/start Syncthing :slight_smile:

The remote computer is turned on 24/7 and used for office work during the day. I have not tried other ways to send the data, but using Syncthing is just convenient, as there are many variables: an unstable connection between the two machines (they are located in different countries), a dynamic IP changing every 24 hours, PCs restarting due to updates, etc. Syncthing takes care of all of that.

I have now tried to set bandwidth limits with QoS Group Policies in Windows, and they seem to be working.

The upload rate is not great, but at least it is not stalled, as is the case when the limit is set in Syncthing itself.

Thank you for the tips.

Will the compression make any difference though? I normally have it disabled for all data, and I am especially not a fan of enabling it on this specific computer, as it has just a dual-core Intel Celeron CPU.

Writing a script is a possibility, although the hours are not set in stone, so I was rather thinking of something like automatically limiting or pausing Syncthing during specific activities (see the sketch below).

Sending the data in a physical form is out of the question, but as I mentioned above, I have just enabled QoS in Windows and it seems to be doing the job.
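For reference, pausing just the problematic folder can be scripted against the same config REST API, for example triggered when a remote desktop session starts. A minimal sketch, where the folder ID and API key are placeholders:

```python
import requests

API_KEY = "abc123"         # placeholder
BASE = "http://127.0.0.1:8384"
FOLDER_ID = "large-files"  # placeholder: the problematic folder's ID

def set_paused(paused: bool) -> None:
    # PATCH only the "paused" flag of this one folder.
    resp = requests.patch(
        f"{BASE}/rest/config/folders/{FOLDER_ID}",
        json={"paused": paused},
        headers={"X-API-Key": API_KEY},
    )
    resp.raise_for_status()

set_paused(True)   # pause while the uplink is needed elsewhere
set_paused(False)  # resume the big transfer afterwards
```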

I am just writing to confirm that setting QoS in Windows does indeed “fix” the problem. I say “fix”, because there are some caveats (see point 3 below).

To sum up,

  1. Setting the upload rate limit in Syncthing on PC2 blocks the upload of changes from PC1 to PC2 as long as the large files are being transferred from PC2 to PC1.

  2. Setting the upload rate limit externally makes Syncthing operate as if the network connection were slower than it really is. This way, Syncthing still maxes out the artificially limited bandwidth, but at the same time lets the changes from PC1 be pushed to PC2.

  3. Although QoS in Windows should limit only the upload speed, in reality both upload and download in Syncthing seem to oscillate around 25 KB/s. This is not great, but better than nothing, I guess. With the problematic folder paused, Syncthing does download at full speed (regardless of the QoS settings).

All in all, I am probably going to use this approach, as I do want the changes to be pushed, even with such a limited upload rate.

I believe 3 is just a milder variant of your original problem: by limiting the upload rate while having lots of data to upload, you limit the number of “requests for downloads” being uploaded. With the internal rate limit it’s worse, because whole “upload packets” contend for a slot to be uploaded, while with an external limit all packets try to squeeze through the limit at once, and requests are small and thus more likely to “squeeze” through.

That’s why, again, it would be advantageous to split the huge transfer from the rest. If you want to use Syncthing (can’t blame you for that :wink: ), you can set up a second instance of Syncthing where you share only the large folder. Then you will get the full download speed on the other instance, the one without the large folder.
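For reference, a second instance only needs its own home directory and GUI address. A minimal sketch (the path and port are examples, and syncthing is assumed to be on the PATH):

```python
import subprocess

# Each instance needs its own home directory (config + database) and
# its own GUI address; the defaults would clash with the first instance.
subprocess.Popen([
    "syncthing",
    "--home", "/home/user/.config/syncthing-big",  # example path
    "--gui-address", "127.0.0.1:8385",             # first instance uses 8384
    "--no-browser",
])
```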


Thank you for the detailed explanation. I will probably keep things like this for now and see how it goes. The changes pushed from PC1 to PC2 are usually either small files or just renaming, copying, and moving of already existing files and folders, so they do not require that much bandwidth.

I will try running a second Syncthing instance if really necessary, but right now I would prefer to keep everything as simple as possible.

Also, just in case someone in a similar situation stumbles upon this thread, these sources explain how to set up QoS in Windows:

https://docs.microsoft.com/windows-server/networking/technologies/qos/qos-policy-top
https://support.microsoft.com/help/2733528/policy-based-qos-not-working-in-windows-7-clients
