So what actually works better for managing a ton of proxies on a custom Python scraper: IP whitelist or user:pass authentication?

Alright, I was building this new push angle finder and kept having to pause everything every time my home IP changed on me. Yeah, I'm that guy who hasn't sprung for a static IP yet. Updating the whitelist across three different proxy dashboards was taking forever, and sometimes my scraping would just die for an hour because the ISP reset something.

Last week I switched the whole setup over to username/password auth with rotating credentials pulled from my main proxy provider's API, and man, it's night and day. No more dashboards, just a script that fetches fresh proxies and credentials per session. It's been running solid for 48 hours straight, pulling data without a single blip in the connection logs. Definitely should have done this months ago.

Anyone else make that switch lately, or are you still riding with whitelists for security?
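For anyone wondering what the per-session setup looks like in code, here's a minimal sketch with `requests`. The provider API URL, the auth header, and the JSON field names (`host`, `port`, `user`, `pass`) are all placeholders for whatever your provider actually exposes — adjust to their docs:

```python
import requests

# Hypothetical endpoint and key -- swap in your provider's real API.
PROXY_API_URL = "https://provider.example/api/v1/proxies"
API_KEY = "your-api-key"


def fetch_proxy_pool(session=None):
    """Pull a fresh list of user:pass proxy entries from the provider.

    Assumes the API returns JSON like:
    [{"host": "1.2.3.4", "port": 8080, "user": "abc", "pass": "xyz"}, ...]
    """
    s = session or requests.Session()
    resp = s.get(
        PROXY_API_URL,
        headers={"Authorization": f"Bearer {API_KEY}"},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()


def proxy_url(entry):
    """Build a requests-style proxy URL with credentials embedded."""
    return "http://{u}:{p}@{h}:{port}".format(
        u=entry["user"], p=entry["pass"],
        h=entry["host"], port=entry["port"],
    )


def scrape(url, entry):
    """One request through one proxy -- no whitelist, auth travels per call."""
    p = proxy_url(entry)
    return requests.get(url, proxies={"http": p, "https": p}, timeout=15)
```

The nice part is there's no state to sync anywhere: each session just grabs an entry from `fetch_proxy_pool()` and the credentials ride along in the proxy URL, so an IP change on your end doesn't matter at all.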