Panic: leveldb/table: corruption on data-block

Certainly looks like we’re running into some kind of database bug, perhaps related to the amount of data. I’ll whip something up to simulate an index for that amount of data / files…

For what it’s worth, I can’t reproduce this by writing indexes for 500 x 7 GB files and then changing / listing them from the database. The database also doesn’t grow beyond about 50 .ldb files when doing this…

Can I send you the .ldb files? Or is there some tool I can use to see what’s going on inside them?

There’s a tool to list the indexes, but it kind of assumes that the database is not corrupt. This is a library we use, so I’m not really sure how to troubleshoot it without being able to reproduce the problem… You could pack up the index directory and send a link to me in PM; perhaps something is extractable from it that points to the root cause, or it will at least let me build a more accurate simulation so I can reproduce it myself.

PM sent


Guys… this keeps happening. Here’s the latest panic log: http://pastebin.com/3dgDHhEL Please help?

My best guess at this point is some sort of hardware or system problem, to be honest. Someone else was running funky software to cache writes to SSDs that apparently caused problems. Perhaps some antivirus or other tool does something similar. (The index I got in PM was fine on my computer.)

No antivirus, clean install of Windows 8.1 on a new SSD. No services or background processes except Syncthing itself and Plex Media Server.

Can you run this test program? It’s by the database author, and it runs a lot of reads and writes in parallel against the database. In practice this tests two things: the code for bugs, and the hardware for problems. I’m running it now on a few machines and haven’t seen any issues so far. But the database does checksum all its data, so memory problems will show up as the error in the topic. I’d be curious to see how it goes on your machine.

https://nym.se/t/dbstress.exe

Should probably run it both as is and with -enableblockcache -enablebufferpool -enablecompression to better simulate what Syncthing does. In either case, let it run for a day or so or until it exits with an error…

It is interesting that I am getting this error frequently on Arm Linux. This is probably the 4th time in a couple of months. I had to delete the database to start ST properly. The worst part is that one does not know when this happens, so ST fails in the background and one never knows about it.

Also, deleting the index is not a sound solution, for a couple of reasons. The biggest issue is that re-indexing is very CPU intensive, so the battery drain is not cool on any mobile device. The second is that deleting the index will confuse the sync on both ends.

Why can’t ST back this up properly, so it can use the backup when there is an error?

@kahun so is this a problem on the mobile or on your Arm Linux box (Raspberry Pi?)?

Unfortunately this isn’t 100% reproducible. At this point, all we can do is make an archive of the index folder as soon as possible after Syncthing throws the panic, but before restarting Syncthing, since a restart might repair or truncate the database. This archive can then, perhaps, be useful for analysis.

It is Arm Linux on Android via chroot. I think I have had it on another, similarly set up device. I am not sure about the x64 devices; I only have 2 of those, but more Android devices.

Got some fresh info… There seem to be two new variables to this: resume from sleep, and SyncTrayzor.

I had another panic just now, after the system had gone to sleep. On wake, SyncTrayzor crashed; I tried restarting, and it crashed again. This is the SyncTrayzor log after the 1st crash, when the host woke up: http://a246d0346c3c5161.paste.se/ This is the SyncTrayzor log after the 2nd crash: http://42b08ff760e25c15.paste.se/ And this is the corresponding Syncthing log, including the panic: http://748666e5da3afce9.paste.se/

After all of that, SyncTrayzor starts but crashes immediately. Syncthing (without SyncTrayzor) starts, chews a bit on some .ldb files, and panics time after time.

Edit: after this last crash, ST can’t repair the database. I’m reverting to an older copy of the index in order to avoid having to re-index 4 TB of files, which would take half a week :frowning:

Mind opening a SyncTrayzor issue with that log?

Edit: Actually I think I’ve already got an issue for that. From an initial read, it looks like Syncthing’s crashing out, and SyncTrayzor can’t handle it.

There’s not a lot else I can do there except be more graceful.

If SyncTrayzor is a factor in causing the corruption, I think it could only be because it’s occasionally telling Syncthing that a file’s changed. Other than that, it doesn’t have anything to do with Syncthing and its database.

Discussion is in

It looks like SyncTrayzor isn’t at fault - it’s giving Syncthing 10 minutes to start, and Syncthing’s not responding after this time.


@calmh

Here is a panic log from a recent occurrence of the corruption:

Panic at 2015-05-17T17:46:00-05:00
panic: leveldb/table: corruption on data-block (pos=310114): checksum mismatch, want=0x748d7ef2 got=0x67140e7a [file=000107.ldb]

goroutine 1 [running]:
github.com/syncthing/syncthing/internal/db.ldbGenericReplace(0x10ab4620, 0x10afe448, 0x4, 0x8, 0x1213e320, 0x20, 0x20, 0x0, 0x0, 0x0, ...)
	/go/src/github.com/syncthing/syncthing/internal/db/leveldb.go:283 +0xf8c
github.com/syncthing/syncthing/internal/db.ldbReplace(0x10ab4620, 0x10afe448, 0x4, 0x8, 0x1213e320, 0x20, 0x20, 0x0, 0x0, 0x0, ...)
	/go/src/github.com/syncthing/syncthing/internal/db/leveldb.go:314 +0x7c
github.com/syncthing/syncthing/internal/db.(*FileSet).Replace(0x10dad300, 0x4b4f88eb, 0x74abdbbc, 0xeb63e937, 0x6917a53a, 0x775ba0d3, 0x40dae28d, 0x1370dd53, 0xd80b10b6, 0x0, ...)
	/go/src/github.com/syncthing/syncthing/internal/db/set.go:80 +0x364
github.com/syncthing/syncthing/internal/model.(*Model).Index(0x10c853b0, 0x4b4f88eb, 0x74abdbbc, 0xeb63e937, 0x6917a53a, 0x775ba0d3, 0x40dae28d, 0x1370dd53, 0xd80b10b6, 0x10a0ae60, ...)
	/go/src/github.com/syncthing/syncthing/internal/model/model.go:482 +0xc90
main.syncthingMain()
	/go/src/github.com/syncthing/syncthing/cmd/syncthing/main.go:511 +0x1dd0
main.main()
	/go/src/github.com/syncthing/syncthing/cmd/syncthing/main.go:266 +0x1c98

goroutine 5 [syscall]:
os/signal.loop()
	/usr/local/go/src/os/signal/signal_unix.go:21 +0x1c
created by os/signal.init·1
	/usr/local/go/src/os/signal/signal_unix.go:27 +0x40

goroutine 7 [chan receive]:
main.trackCPUUsage()
	/go/src/github.com/syncthing/syncthing/cmd/syncthing/gui_unix.go:24 +0x120
created by main.init·2
	/go/src/github.com/syncthing/syncthing/cmd/syncthing/gui_unix.go:17 +0x34

goroutine 9 [select]:
github.com/thejerf/suture.(*Supervisor).Serve(0x10adc180)
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/thejerf/suture/suture.go:412 +0xd60
created by github.com/thejerf/suture.(*Supervisor).ServeBackground
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/thejerf/suture/suture.go:373 +0x3c

goroutine 10 [select]:
main.(*verboseSvc).Serve(0x10a1c6d0)
	/go/src/github.com/syncthing/syncthing/cmd/syncthing/verbose.go:38 +0x248
github.com/thejerf/suture.func·007()
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/thejerf/suture/suture.go:516 +0x114
created by github.com/thejerf/suture.(*Supervisor).runService
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/thejerf/suture/suture.go:519 +0x140

goroutine 11 [chan receive]:
github.com/syncthing/syncthing/internal/config.(*Wrapper).Serve(0x10a5c400)
	/go/src/github.com/syncthing/syncthing/internal/config/wrapper.go:84 +0x54
created by github.com/syncthing/syncthing/internal/config.Wrap
	/go/src/github.com/syncthing/syncthing/internal/config/wrapper.go:59 +0x2ac

goroutine 12 [select]:
github.com/syndtr/goleveldb/leveldb/util.(*BufferPool).drain(0x10a0c2c0)
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/syndtr/goleveldb/leveldb/util/buffer_pool.go:205 +0x1d4
created by github.com/syndtr/goleveldb/leveldb/util.NewBufferPool
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/syndtr/goleveldb/leveldb/util/buffer_pool.go:236 +0x23c

goroutine 13 [select]:
github.com/syndtr/goleveldb/leveldb.(*DB).compactionError(0x10ab4620)
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/syndtr/goleveldb/leveldb/db_compaction.go:153 +0x254
created by github.com/syndtr/goleveldb/leveldb.openDB
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/syndtr/goleveldb/leveldb/db.go:122 +0x82c

goroutine 14 [select]:
github.com/syndtr/goleveldb/leveldb.(*DB).mpoolDrain(0x10ab4620)
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/syndtr/goleveldb/leveldb/db_state.go:73 +0x124
created by github.com/syndtr/goleveldb/leveldb.openDB
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/syndtr/goleveldb/leveldb/db.go:122 +0x854

goroutine 20 [select]:
github.com/syncthing/syncthing/internal/ignore.(*Matcher).clean(0x11194ba0, 0x61714000, 0x68c)
	/go/src/github.com/syncthing/syncthing/internal/ignore/ignore.go:161 +0x1e4
created by github.com/syncthing/syncthing/internal/ignore.New
	/go/src/github.com/syncthing/syncthing/internal/ignore/ignore.go:53 +0x1a0

goroutine 16 [select]:
github.com/syndtr/goleveldb/leveldb.(*DB).mCompaction(0x10ab4620)
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/syndtr/goleveldb/leveldb/db_compaction.go:759 +0x248
created by github.com/syndtr/goleveldb/leveldb.openDB
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/syndtr/goleveldb/leveldb/db.go:122 +0x8d4

goroutine 17 [select]:
github.com/syndtr/goleveldb/leveldb.(*DB).jWriter(0x10ab4620)
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/syndtr/goleveldb/leveldb/db_write.go:37 +0x17c
created by github.com/syndtr/goleveldb/leveldb.openDB
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/syndtr/goleveldb/leveldb/db.go:122 +0x8fc

goroutine 19 [select]:
github.com/syncthing/syncthing/internal/model.(*ProgressEmitter).Serve(0x10dacd40)
	/go/src/github.com/syncthing/syncthing/internal/model/progressemitter.go:49 +0x73c
created by github.com/syncthing/syncthing/internal/model.NewModel
	/go/src/github.com/syncthing/syncthing/internal/model/model.go:125 +0x8c0

goroutine 33 [IO wait]:
net.(*pollDesc).Wait(0x11a6cdf8, 0x72, 0x0, 0x0)
	/usr/local/go/src/net/fd_poll_runtime.go:84 +0x3c
net.(*pollDesc).WaitRead(0x11a6cdf8, 0x0, 0x0)
	/usr/local/go/src/net/fd_poll_runtime.go:89 +0x38
net.(*netFD).accept(0x11a6cdc0, 0x0, 0xb6f5dcb8, 0x10a0a61c)
	/usr/local/go/src/net/fd_unix.go:419 +0x390
net.(*TCPListener).AcceptTCP(0x10a1c8d0, 0x11a6e03c, 0x0, 0x0)
	/usr/local/go/src/net/tcpsock_posix.go:234 +0x50
net.(*TCPListener).Accept(0x10a1c8d0, 0x0, 0x0, 0x0, 0x0)
	/usr/local/go/src/net/tcpsock_posix.go:244 +0x3c
main.(*DowngradingListener).Accept(0x10afe4b0, 0x0, 0x0, 0x0, 0x0)
	/go/src/github.com/syncthing/syncthing/cmd/syncthing/tls.go:99 +0x64
net/http.(*Server).Serve(0x11a720c0, 0xb6f7c968, 0x10afe4b0, 0x0, 0x0)
	/usr/local/go/src/net/http/server.go:1728 +0x98
main.(*apiSvc).Serve(0x10dabbc0)
	/go/src/github.com/syncthing/syncthing/cmd/syncthing/gui.go:161 +0x183c
github.com/thejerf/suture.func·007()
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/thejerf/suture/suture.go:516 +0x114
created by github.com/thejerf/suture.(*Supervisor).runService
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/thejerf/suture/suture.go:519 +0x140

goroutine 34 [select]:
github.com/syncthing/syncthing/internal/events.(*Subscription).Poll(0x10a0a530, 0xf8475800, 0xd, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, 0x0, ...)
	/go/src/github.com/syncthing/syncthing/internal/events/events.go:178 +0x2a8
github.com/syncthing/syncthing/internal/events.(*BufferedSubscription).pollingLoop(0x11a7c000)
	/go/src/github.com/syncthing/syncthing/internal/events/events.go:215 +0x40
created by github.com/syncthing/syncthing/internal/events.NewBufferedSubscription
	/go/src/github.com/syncthing/syncthing/internal/events/events.go:209 +0x2a8

goroutine 35 [select]:
github.com/thejerf/suture.(*Supervisor).Serve(0x10e11100)
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/thejerf/suture/suture.go:412 +0xd60
created by github.com/thejerf/suture.(*Supervisor).ServeBackground
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/thejerf/suture/suture.go:373 +0x3c

goroutine 36 [select]:
main.(*folderSummarySvc).listenForUpdates(0x10dabb80)
	/go/src/github.com/syncthing/syncthing/cmd/syncthing/summarysvc.go:67 +0x510
main.*folderSummarySvc.(main.listenForUpdates)·fm()
	/go/src/github.com/syncthing/syncthing/cmd/syncthing/summarysvc.go:47 +0x24
main.serviceFunc.Serve(0x10cb5110)
	/go/src/github.com/syncthing/syncthing/cmd/syncthing/summarysvc.go:202 +0x24
github.com/thejerf/suture.func·007()
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/thejerf/suture/suture.go:516 +0x114
created by github.com/thejerf/suture.(*Supervisor).runService
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/thejerf/suture/suture.go:519 +0x140

goroutine 37 [select]:
main.(*folderSummarySvc).calculateSummaries(0x10dabb80)
	/go/src/github.com/syncthing/syncthing/cmd/syncthing/summarysvc.go:115 +0x290
main.*folderSummarySvc.(main.calculateSummaries)·fm()
	/go/src/github.com/syncthing/syncthing/cmd/syncthing/summarysvc.go:48 +0x24
main.serviceFunc.Serve(0x10cb5118)
	/go/src/github.com/syncthing/syncthing/cmd/syncthing/summarysvc.go:202 +0x24
github.com/thejerf/suture.func·007()
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/thejerf/suture/suture.go:516 +0x114
created by github.com/thejerf/suture.(*Supervisor).runService
	/go/src/github.com/syncthing/syncthing/Godeps/_workspace/src/github.com/thejerf/suture/suture.go:519 +0x140

I moved 20 posts to a new topic: Beta test of Syncthing build with alternate database backend (“boltdb”)