Scanning sticks at same place

I am having trouble with Syncthing stalling on its initial scan to assess what data is in a folder, and it stalls at the same point on two different systems. The folder is large: ~1.7 TB of data spread across around 33 files, synced between two computers which are both running Windows 7 64-bit.

On both computers, remote and local, the data is held on an external USB drive (identical model drives on each end). Everything seems to go well with the scanning until they reach 57%, then they stall. The data’s integrity and readability have been confirmed as OK, and plenty of system resources remain, but the scan never progresses past that point even if left for 24 hours.

Three of the files within the folder are 488 GB each; I’m not sure if there is a limit to the size of file Syncthing can handle? I tried removing those large files to see if the initial scan would complete, but it didn’t; it stalled at a different percentage that time.

I should note that I manually duplicated the data between the two folders before setting up Syncthing, in order to minimise the bandwidth consumed by the initial sync. But bandwidth doesn’t appear to be the issue; it is the initial scan, where presumably it is creating hashes of the files.

Any suggestions, or pointers on where to look to troubleshoot? Thanks for your time and help, all.

Check Syncthing’s log. Syncthing logs to stdout: if you’re running it directly, check its output. If you’re running it through a service manager (systemd, etc), check to see where the service manager writes the logs.
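If you’re starting the .exe from a Command Prompt, ordinary redirection will capture that output to a file. A minimal example (the filename here is just my choice, not anything Syncthing-specific):

    rem run Syncthing and capture stdout/stderr to a file
    syncthing.exe > syncthing-console.log 2>&1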

Thanks for the reply, I will check the log and report back.

Hi canton7, I just twigged that the stdout reference was for Linux; both my PCs are running Win7 64-bit. Any pointers on where I can find or enable a log? I checked the program directory but could not see one.

Log Files on Windows

I just found the following in the FAQ:

On Windows Syncthing by default also creates syncthing.log in Syncthing’s home directory (check -help to see where that is). Command line option -logfile can be used to specify a user-defined logfile.

I couldn’t see the default log in the home directory; permissions might be restricting its creation, so I will try executing with a switch to specify a user-defined one.
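Going by the -logfile option quoted from the FAQ above, I’m planning to run it with something like this (the path is just my choice of a user-writable location):

    rem write the log to an explicit, user-writable location
    syncthing.exe -logfile="C:\Temp\syncthing.log"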

How are you starting Syncthing?

It will log to somewhere in %LOCALAPPDATA%\Syncthing by default (e.g. C:\Users\<You>\AppData\Local\Syncthing). However, if you’re running it with SyncTrayzor, Syncthing-GTK, NSSM, etc., it probably won’t: this is why I asked how you’re starting it.
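A quick way to check from a Command Prompt whether the default log is there:

    rem list Syncthing's default data directory and look for syncthing.log
    dir "%LOCALAPPDATA%\Syncthing"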

…will currently not work; wait for a release with that merged (or build your own) to get support for up to 1220 GiB per file.


Very helpful, thank you. I located the log in %LOCALAPPDATA%\Syncthing.

I’m not aware that I’m using SyncTrayzor, Syncthing-GTK or NSSM.

On both systems I put in place the start-at-boot Scheduled Tasks as advised by the documentation, but in recent attempts I have manually restarted / shut down the service using the buttons in the Web GUI, and then manually run the .exe with Administrator privileges.
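For reference, the boot task I created was along these lines (the task name and install path are mine, and the exact flags the docs recommend may differ; run from an elevated prompt):

    rem register a task that launches Syncthing at system start
    schtasks /Create /TN "Syncthing" /SC ONSTART /TR "C:\Tools\Syncthing\syncthing.exe -no-console -no-browser"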

Alex, I’m just going to check your link, thank you.

Hi Alex,

I’m not a developer myself, but I assume from that thread that there may be a ~130 GB file-size limit currently in place. Am I correct in thinking that is the case for the current code/build?

If that is the case, then this is great to know and not an issue in my use-case… I’m looking to sync some drive images, and I can easily configure my backup software to split the images across smaller files, say 100 GB apiece.

Thanks for your time and help all.

I did not calculate it myself, but yes, it says ~130 GB is currently the limit for a single file.
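For what it’s worth, the numbers are consistent with Syncthing’s fixed 128 KiB block size and a cap on the number of blocks per file — my own back-of-the-envelope reading of that thread, not an official figure:

    128 KiB x  1,000,000 blocks =   131,072,000,000 bytes ≈ 131 GB   (current limit)
    128 KiB x 10,000,000 blocks = 1,310,720,000,000 bytes ≈ 1220 GiB (with the linked change)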

Splitting is probably the best way to get it working if that is a possibility for you.


Absolutely, in this instance splitting files is not a problem. I can tell my backup software (which is the only thing that will be saving files in that folder) to split files every 100 GB; it has the option to do that.

Thank you for helping me get to the bottom of that quickly Alex!

Will it be merged in the next Syncthing version?

I hope so! 🙂 Just to let you all know, I managed to sync 1.7 TB of data between two physical locations OK using Syncthing. I limited my file sizes to 100 GB apiece, and I did an initial manual copy/sync of the data while the drives were local to each other at the start, thus reducing bandwidth. I believe it took Syncthing around 15 hours to do the initial scan / hashing of the 1.7 TB of data on both computers. I will be monitoring how the syncing goes from this point onward.

Every time a file is changed, it will have to rehash 100GB, so a smaller size might be more optimal.
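As a rough worked example using the figures from this thread (15 hours to hash 1.7 TB, i.e. about 31 MB/s on this hardware):

    1.7 TB in 15 h ≈ 1,700,000 MB / 54,000 s  ≈ 31 MB/s
    100 GB rehash  ≈ 100,000 MB  / 31 MB/s    ≈ 3,200 s ≈ 54 minutes per changed file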

Hi Audrius, thanks for that info. I am lucky in my case, though, as those 100 GB files will not be changing at all. The folder contains incremental backups, so the only thing that will happen is that every couple of days a new, substantially smaller file (around 1–2 GB) is created in that folder, and that will obviously be synced to the off-site location.

If you’re interested: what I am striving to achieve is a low-cost, self-sufficient, one-way backup sync off-site which protects against the likes of ransomware that maliciously encrypts data. You can see my method and thinking in this thread: https://superuser.com/questions/1041714/one-way-backup-sync-between-two-locations-to-thwart-ransomware

I will therefore be experimenting with Syncthing’s Master Folder concept to see what I can do with that.

Thanks for your time and help.

Given the above, I currently have my rescan interval set to 86400 seconds / 1 day; I don’t know what others would recommend? As mentioned, there might only be an addition of a new file (~1–2 GB) every 2 days.

I think it is the ignoreDelete function that may be incredibly useful to me for this scenario. I love the versatility of Syncthing and the ability to tweak all these settings.
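For anyone following along, both settings end up in Syncthing’s config.xml (they can also be changed via the folder’s advanced settings in the Web GUI). A sketch of what I mean — the folder id and path are just examples, and element/attribute names may differ between Syncthing versions:

    <!-- excerpt from config.xml: rescan once a day, ignore deletes received from other devices -->
    <folder id="backups" path="E:\Backups" rescanIntervalS="86400">
        <ignoreDelete>true</ignoreDelete>
    </folder>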

I would reduce this. If the files haven’t changed, the scan won’t be as resource-heavy. If you set the scan to something like 1 hour, your transfer will happen close to the backup being made. Currently it could be up to a day later, and by then the next day’s backup may already exist.

I am assuming the backup isn’t happening on a hard schedule if files are only made 1 or 2 days apart.

Also, I haven’t read your post above… and no one else has said it yet, but: Syncthing is not a backup utility. Please make sure you understand its shortcomings in this area before you rely on it.


Thank you for your input; I think I will shorten it as you suggest. I wasn’t sure if it would entirely redo the 15-hour scan, but I don’t think it does.

I know Syncthing isn’t a backup utility; I pay for software (Macrium Reflect) to handle that aspect and create incremental disk images for me. I use Syncthing to sync those disk images to an off-site location. I believe I can configure Syncthing in such a way that any accidental / malicious modification of old disk images on the source/local drive is prevented from replicating to the off-site location, by means of the ignoreDelete setting and Master folder attributes etc. I will be doing a fair amount of testing to confirm my thinking.

I appreciate everyone’s time and input, thank you.

