No, no they really aren't, but I was thinking of the "scraping industry" in the sense that it's an established thing. Getting hosting in smaller datacenters is simple enough, though you may need to manage your own hardware or VMs. Many will help you get your own IP ranges and an ASN, and that goes a long way if you don't want to get lumped in with the bad bots.
This obviously varies, but having an ASN in our case means that we can deal with you, contact you, and assume that you're better than random bot number 817.
Thank you for speaking some sense. As a site operator who's been inundated with junk traffic over the past month or so, well in excess of 99% of which has had to be blocked, I'd say the scrapers have brought this upon themselves.
I actually do let quite a few known, "good" scrapers scrape my stuff. They identify themselves, they make it clear what they do, and they respect conventions like robots.txt.
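For anyone wondering, the bar really is that low. Here's a minimal sketch of what I mean by identifying yourself and respecting robots.txt, using only Python's standard library; the crawler name, contact URL, and target site are made-up placeholders:

    # A polite fetch: descriptive User-Agent, robots.txt check, crawl-delay honored.
    from urllib import robotparser, request
    import time

    USER_AGENT = "ExampleCrawler/1.0 (+https://example.com/crawler-info)"  # hypothetical
    TARGET = "https://example.org/some/page"                               # hypothetical

    rp = robotparser.RobotFileParser()
    rp.set_url("https://example.org/robots.txt")
    rp.read()

    if rp.can_fetch(USER_AGENT, TARGET):
        # Respect any Crawl-delay directive the site publishes.
        delay = rp.crawl_delay(USER_AGENT)
        if delay:
            time.sleep(delay)
        req = request.Request(TARGET, headers={"User-Agent": USER_AGENT})
        with request.urlopen(req) as resp:
            body = resp.read()
    else:
        # Disallowed by robots.txt: skip the URL instead of hammering it.
        pass

Scrapers that do roughly this get whitelisted; the ones sending random browser user agents from residential IPs get blocked.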
These residential proxies have been abused by scrapers that use random legit-looking user agents and absolutely hammer websites. What is it with these scrapers just not understanding consent? It's gross.
Try scraping any of the major players, e.g. Amazon, without a residential proxy; it won't work. I appreciate that you're offering to abide by crawling etiquette (e.g. robots.txt), but no major app supports that any more.
You're thinking about the case of big AI companies crawling your blog. I'm talking about a small startup trying to do traditional indexing and needing to run through residential proxies to make it work.
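And to be concrete, the proxy side of it is mechanically trivial; the only thing being paid for is the residential IP reputation. A rough sketch with Python's stdlib, where the gateway host, credentials, and crawler name are all hypothetical:

    # Route requests through a (hypothetical) residential proxy gateway.
    from urllib import request

    proxy = request.ProxyHandler({
        "http": "http://user:pass@gw.residential-provider.example:8000",
        "https": "http://user:pass@gw.residential-provider.example:8000",
    })
    opener = request.build_opener(proxy)
    # Still identify the crawler honestly, even behind the proxy.
    opener.addheaders = [("User-Agent", "ExampleCrawler/1.0 (+https://example.com/crawler-info)")]
    with opener.open("https://www.example.org/") as resp:
        html = resp.read()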