Been around the block with proxy speed testing, and honestly most of what I see is "click a bunch of speed tests, pick the fastest, call it a day." RIP to the people actually trying to optimize for scraping or automation.

Here's the thing: I want a method that reflects real-world use, not some synthetic speed test. How do you test proxy speed when you're actually crawling or automating at scale? Do you spin up a bunch of concurrent sessions and measure how long a typical page takes to load? Or run a sustained throughput test over a few hours? I've seen guys just ping their proxies and call it a day, but that's not enough if you want to squeeze real performance out of your pool.

Would love to hear what works for people who actually live in the trenches, not some fake benchmark that's useless the moment you start crawling. Anyone got a legit methodology that isn't just "ping the IP and look at the ms"?
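For context, here's roughly what I mean by the "spin up sessions and time page loads" idea: a stdlib-only Python sketch that fires concurrent requests through a proxy, records time-to-first-byte and total download time per request, and reports percentiles instead of averages (p95 matters more than the mean when you're crawling). `PROXY_URL` and `TARGET_URL` are placeholders, not real endpoints, and the worker/request counts are just example numbers.

```python
import time
import statistics
import urllib.request
from concurrent.futures import ThreadPoolExecutor

PROXY_URL = "http://user:pass@proxy.example.com:8080"  # placeholder proxy
TARGET_URL = "https://example.com/"                    # a page you actually crawl

def timed_fetch(url, proxy_url, timeout=15):
    """Fetch url through proxy_url; return (ttfb, total) in seconds."""
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy_url, "https": proxy_url})
    )
    start = time.monotonic()
    with opener.open(url, timeout=timeout) as resp:
        resp.read(1)                       # first byte has arrived
        ttfb = time.monotonic() - start
        resp.read()                        # drain the rest of the body
    return ttfb, time.monotonic() - start

def summarize(samples):
    """Percentile summary: p50/p95/max say more than an average does."""
    qs = statistics.quantiles(samples, n=20)  # cut points in 5% steps
    return {
        "p50": statistics.median(samples),
        "p95": qs[18],                     # 19th of 19 cut points = 95th pct
        "max": max(samples),
    }

def run_benchmark(n=50, workers=10):
    """Fire n requests through the proxy with `workers` concurrent sessions."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(
            lambda _: timed_fetch(TARGET_URL, PROXY_URL), range(n)))
    return {
        "ttfb":  summarize([r[0] for r in results]),
        "total": summarize([r[1] for r in results]),
    }
```

To approximate the "sustained test over a few hours" angle, you'd just call `run_benchmark()` in a loop every few minutes and watch whether the percentiles drift as the session ages. No retries or error counting here to keep it short, but in practice you'd want failure rate tracked right alongside the timings.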