Devices disconnected from each other

Hello,

My devices haven't been able to see each other since the last update (both on v2.0.12).

The source device (seedbox): [screenshot]

The receiving device (receive only): [screenshot]

I’ve changed nothing in the config and no errors are showing in the receiver’s logs. (I’ve redacted the IPs and names for security purposes)

```
2025-12-19 10:47:48 INF syncthing v2.0.12 "Hafnium Hornet" (go1.25.4 linux-amd64) docker@github.syncthing.net 2025-11-24 15:49:42 UTC [noupgrade] (log.pkg=main)
2025-12-19 10:47:48 INF Calculated our device ID (device=[REDACTED] log.pkg=syncthing)
2025-12-19 10:47:48 INF Overall rate limit in use (send="is unlimited" recv="is unlimited" log.pkg=connections)
2025-12-19 10:47:48 INF Using discovery mechanism (identity="global discovery server https://discovery-lookup.syncthing.net/v2/?noannounce" log.pkg=discover)
2025-12-19 10:47:48 INF Using discovery mechanism (identity="global discovery server https://discovery-announce-v4.syncthing.net/v2/?nolookup" log.pkg=discover)
2025-12-19 10:47:48 INF Using discovery mechanism (identity="global discovery server https://discovery-announce-v6.syncthing.net/v2/?nolookup" log.pkg=discover)
2025-12-19 10:47:48 INF Relay listener starting (id=dynamic+https://relays.syncthing.net/endpoint log.pkg=connections)
2025-12-19 10:47:48 INF TCP listener starting (address="[::]:22000" log.pkg=connections)
2025-12-19 10:47:48 INF Using discovery mechanism (identity="IPv4 local broadcast discovery on port 21027" log.pkg=discover)
2025-12-19 10:47:48 INF QUIC listener starting (address="[::]:22000" log.pkg=connections)
2025-12-19 10:47:48 INF Using discovery mechanism (identity="IPv6 local multicast discovery on address [ff12::8384]:21027" log.pkg=discover)
2025-12-19 10:47:48 INF GUI and API listening (address="[::]:8384" log.pkg=api)
2025-12-19 10:47:48 INF Access the GUI via the following URL: http://127.0.0.1:8384/ (log.pkg=api)
2025-12-19 10:47:48 INF Loaded configuration (name=[redacted] log.pkg=syncthing)
2025-12-19 10:47:48 INF Loaded peer device configuration (device=DCY54P6 name=[redacted] address="[dynamic]" log.pkg=syncthing)
2025-12-19 10:47:48 INF Ready to synchronize (...TRUNCATED...)
2025-12-19 10:47:49 INF Measured hashing performance (perf="1805.55 MB/s" log.pkg=syncthing)
2025-12-19 10:47:57 INF Joined relay (uri=relay://212.20.112.112:22067 log.pkg=relay/client)
2025-12-19 10:48:17 INF Detected NAT type (uri=quic://0.0.0.0:22000 type="Port restricted NAT" log.pkg=connections)
2025-12-19 10:48:17 INF Resolved external address (uri=quic://0.0.0.0:22000 address=quic://[REDACTED]:22000 via=stun.miwifi.com:3478 log.pkg=connections)
2025-12-19 10:48:18 INF Detected NAT services (count=0 log.pkg=nat)
```

I've checked some other topics, but most involve old versions or firewall issues. On my side, nothing has changed. The "last seen" date is the day I updated them.

Any idea why they suddenly cannot see each other, please? Thanks for your help :slight_smile:

It looks like there are some discovery errors on the receiver device. Click on the blue "Discovery 3/5" link.
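If you prefer the command line, the same discovery failures should also be visible through the REST API. A minimal sketch, assuming the GUI listens on 127.0.0.1:8384 as in the log above; the API key is a placeholder (Actions → Settings → General), and the exact field name may vary between versions:

```
# Query the local Syncthing instance for its system status and pull out
# the per-mechanism discovery errors (field name may differ by version).
curl -sS -H "X-API-Key: YOUR_API_KEY" http://127.0.0.1:8384/rest/system/status \
  | jq '.discoveryStatus'
```

Timeout-style errors listed there correspond to what the GUI shows behind the Discovery link.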

Thanks for your reply. I do indeed see some timeouts and IPv6 messages.

Indeed, when I try to reach the URL from the container there is a timeout (meanwhile the host gets a 404). I'll investigate this, thanks for the hint!

Edit: weird, the container can reach other URLs, but these ones end in a timeout.
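For anyone debugging something similar, this is roughly how the comparison can be made (a sketch only: `syncthing` is a placeholder container name, and it assumes curl is available inside the container image):

```
# From the host: a plain GET against the lookup endpoint answers quickly
# (a 404 here is fine, it just proves the host can reach the server).
curl -sS -o /dev/null -w '%{http_code}\n' --max-time 10 \
  https://discovery-lookup.syncthing.net/v2/

# From inside the container: the same request, which in my case hung
# until the timeout instead of returning a status code.
podman exec syncthing curl -sS -o /dev/null -w '%{http_code}\n' --max-time 10 \
  https://discovery-lookup.syncthing.net/v2/
```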


OK, I don't know why the container is unable to connect to the Syncthing URLs; it ends up in a timeout with ``curl: (56) OpenSSL SSL_read: OpenSSL/3.5.4: error:0A000126:SSL routines::unexpected eof while reading, errno 0``.

I've worked around the issue by configuring the remote device's IP address on the receiver instead of "dynamic".
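For reference, the same change can also be scripted against the config REST API instead of clicking through Edit Device → Advanced → Addresses. A sketch only: device ID, API key, and address are placeholders, so double-check the endpoint against the docs for your version:

```
# Replace the remote device's "dynamic" address with an explicit one
# on the receiver. DEVICE_ID, YOUR_API_KEY and the tcp:// address
# below are placeholders.
curl -sS -X PATCH \
  -H "X-API-Key: YOUR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"addresses": ["tcp://203.0.113.10:22000"]}' \
  http://127.0.0.1:8384/rest/config/devices/DEVICE_ID
```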

OK, I've found the actual root cause. Some other containers had this weird issue too.

It's a regression in pasta: Podman containers do not reach certain hosts · Issue #27765 · containers/podman · GitHub

Downgrading the package solved the issue.

From 2025_12_10.d04c480-1 to passt-2025_09_19.623dbf6-1 in my case (Arch-based server).
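In case it helps, one way to do the downgrade on an Arch-based system is to reinstall the older package from the pacman cache (a sketch; the cached file name is an example and depends on your architecture and what is actually in the cache):

```
# Check which passt/pasta version is currently installed.
pacman -Q passt

# Reinstall the known-good version from the local package cache
# (file name is an example; adjust to what your cache contains).
sudo pacman -U /var/cache/pacman/pkg/passt-2025_09_19.623dbf6-1-x86_64.pkg.tar.zst

# Restart the affected container so it picks up the downgraded pasta binary
# ("syncthing" is a placeholder container name).
podman restart syncthing
```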

I hope this helps anybody who runs into the same problem!
