Gonna dump some messy thoughts here. Building a proxy pool isn't magic; it's about stacking the deck in your favor.

I started with 50 residential proxies, mostly from cheap providers, and cycled through about 10 per day. Within the first week, my scraping success rate jumped from 70% to 90%. Why? Each IP was refreshing every hour, and I kept rotating to avoid detection. After two weeks, I doubled the pool to 100 proxies. The success rate held steady around 92%, but some proxies started dropping out, so I cleaned up and cut the dead weight.

It's all about numbers. I aim for at least 20 proxies per target site, so there's a buffer if some get flagged or banned. I also keep a blacklist of IPs that gave me trouble and rotate them out fast.

The key is continuous monitoring: track each IP's performance and add new proxies weekly. No fancy API, just a spreadsheet, a rotating schedule, and a lot of small adjustments. If you're serious about scraping or avoiding detection, a pool of 100 good residentials is the sweet spot: speed, high success, less spam detection. The actual build is just mixing the IPs, tracking them, and never putting all your eggs in one basket.
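The rotate-track-blacklist loop I run out of a spreadsheet could be sketched in code. Here's a minimal Python sketch under my own assumptions: the `ProxyPool` class name, the random rotation, and the `fail_limit` of 3 consecutive failures are illustrative choices, not some standard library or tool.

```python
import random
from collections import defaultdict

class ProxyPool:
    """Minimal rotating pool: picks proxies at random, tracks
    per-proxy failures, and blacklists the dead weight fast."""

    def __init__(self, proxies, fail_limit=3):
        self.active = list(proxies)       # proxies currently in rotation
        self.blacklist = set()            # IPs that gave trouble
        self.failures = defaultdict(int)  # consecutive failure count per proxy
        self.fail_limit = fail_limit

    def get(self):
        # Random rotation so no single IP carries all the load.
        if not self.active:
            raise RuntimeError("pool exhausted -- time to add fresh proxies")
        return random.choice(self.active)

    def report(self, proxy, ok):
        # Continuous monitoring: a success resets the counter,
        # too many consecutive failures retires the IP.
        if ok:
            self.failures[proxy] = 0
            return
        self.failures[proxy] += 1
        if self.failures[proxy] >= self.fail_limit:
            self.active.remove(proxy)
            self.blacklist.add(proxy)

# Example: a 20-proxy buffer for one target site (addresses made up).
pool = ProxyPool([f"10.0.0.{i}:8080" for i in range(1, 21)])
proxy = pool.get()
pool.report(proxy, ok=True)
```

The weekly top-up from the notes would just be appending fresh entries to `pool.active`; the spreadsheet is effectively the `failures` dict made visible.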