why is my proxy authentication failing every single time when i try to plug a custom residential pool into scrapebox? setup works fine in other tools, lmao.
right, context: i built my own pool from a couple of providers, about 5k ips mixed residential and datacenter, all authenticated in ip:port:user:pass format. works perfectly in puppeteer scripts and a few python scrapers, but when i feed the same list into scrapebox's proxy manager it just sits there spinning. connection tests fail and the logs are useless - just 'proxy error'. no timeout setting seems to fix it.
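for reference, this is roughly how my python scrapers consume the same list - each line is ip:port:user:pass, converted to a standard user:pass@host:port proxy url. quick sketch, the ip and credentials below are made up:

```python
def to_proxy_url(line: str, scheme: str = "http") -> str:
    """Convert an ip:port:user:pass line into a scheme://user:pass@ip:port proxy URL."""
    # maxsplit=3 so a password containing ':' stays intact
    ip, port, user, password = line.strip().split(":", 3)
    return f"{scheme}://{user}:{password}@{ip}:{port}"

# made-up example entry
print(to_proxy_url("203.0.113.7:8080:myuser:mypass"))
# -> http://myuser:mypass@203.0.113.7:8080
```

that url form works fine with the usual python http clients, so the creds themselves are good - scrapebox just doesn't seem to like the raw list.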
i'm on three coffees and this is killing my morning workflow. data or it didn't happen, but i have none because the tool won't even start. anyone else fought this specific integration? google's core updates are mostly a game of footprint whack-a-mole for smart operators, but this feels like a basic config issue i'm missing.