Girder
Ngl, I miss the early 2020s for speed testing. Used to just run a couple of speedtest-cli passes over a few hours, pick a consistent server, and that was that. Tbh, the numbers you got back then on OpenVPN UDP were real. They sucked, but you knew exactly what you were getting: a 50% drop, maybe 60% on a bad day. It was predictable chaos.

Nowadays with WireGuard and these 'Lightway' or 'NordLynx' protocols, every test feels like marketing. You see someone post a screenshot with 90% of their base speed and you just know they tested for like 30 seconds at 3am on a server nobody uses. The methodology is dead. Nobody talks about bufferbloat anymore, or testing during peak hours, or even doing a sustained 10-minute iperf3 transfer to see if the tunnel chokes. They just wanna see the big number.

I was digging through old forum archives and found my notes from testing my self-hosted Algo setup vs PIA back in like 2022. Had columns for latency under load, jitter during torrents, even how quickly speeds recovered after a protocol switch. That was the stuff. You'd actually learn smth.

Atm, if I see another 'I got 950 Mbps on WireGuard!' post with no context I'm gonna lose it. Where's the consistency? The 24-hour graph? The real-world download test from a crappy HTTP server?

Kinda makes me wanna go back and re-test all the old protocols just for nostalgia. IPsec IKEv2, good ol' OpenVPN TCP with its stable 10 Mbps ceiling. You knew where you stood. These new tests are all flash, no substance. Anyone else feel this way, or am I just being a grumpy old head about it?
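For anyone who wants to bring the old-school methodology back: the point was never the peak number, it was the spread across repeated runs. Here's a rough Python sketch of the kind of summary I used to keep in my notes. The throughput figures are made-up placeholders (not real measurements of any provider), and `summarize_runs` is just a name I picked; feed it whatever your speedtest-cli or iperf3 loop collects over the day.

```python
import statistics

def summarize_runs(mbps_samples):
    """Summarize repeated speed-test runs.

    The consistency (coefficient of variation) tells you more about a
    tunnel than the single best screenshot-worthy number ever will.
    """
    mean = statistics.mean(mbps_samples)
    stdev = statistics.stdev(mbps_samples)
    return {
        "mean_mbps": round(mean, 1),
        "worst_mbps": min(mbps_samples),
        "stdev_mbps": round(stdev, 1),
        # lower CV = more consistent tunnel across the day
        "cv_pct": round(100 * stdev / mean, 1),
    }

# Hypothetical hourly samples over a day -- illustrative only.
# A flashy-but-spiky tunnel vs. a slow-but-steady one:
spiky_tunnel = [910, 870, 450, 880, 905, 520, 895]
steady_tunnel = [95, 92, 90, 94, 91, 93, 92]

print(summarize_runs(spiky_tunnel))
print(summarize_runs(steady_tunnel))
```

Run something like this against 24 hours of hourly samples and the "950 Mbps!" tunnel that craters to 450 during peak hours stops looking so impressive next to the boring one that never moves.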