
Came across this phrase “robotic process automation” (RPA) <https://www.theregister.co.uk/2020/05/05/sba_rpa_ban/>, which seems to be just another name for “web-scraping”. The site in question also offers APIs that can be used more directly to make requests and receive info. The difference is that scraping works through a GUI designed for human users, with human response times and data rates, so it creates a lot more overhead on the servers when driven automatically with a large volume of requests. Some outfits try to claim that using such technology on their site is a violation of their terms of service. As one of the reader comments points out, if a company is banned from using a scraper bot that batches its requests, and has to resort to hiring an army of humans to enter the same data, each creating their own separate login session, the result can be an even worse load on the servers.
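
To make the load comparison concrete, here is a minimal sketch in Python using the requests library. Everything about the site is hypothetical (the base URL, the login form, the /api/v1/loans endpoint and the loan-status field are all invented for illustration), but it shows both contrasts: pulling one value out of a whole human-oriented page versus asking a compact API for the same thing, and reusing one authenticated session for a batch of requests versus paying the login and session-setup cost on every lookup.

```python
# Hypothetical sketch: GUI scraping vs. direct API access, and a batched
# session vs. one login per request. None of these URLs or fields are real.
import requests

BASE = "https://example.gov"   # invented site


def status_via_gui_scrape(session, loan_id):
    # "RPA"/scraping path: download the whole human-oriented page (markup,
    # scripts, styling included) and dig one field out of it afterwards.
    page = session.get(f"{BASE}/loans/{loan_id}/status-page", timeout=10)
    page.raise_for_status()
    marker = 'id="loan-status">'            # crude extraction of the value
    start = page.text.index(marker) + len(marker)
    return page.text[start:page.text.index("<", start)].strip()


def status_via_api(session, loan_id):
    # API path: one small JSON response carrying just the data.
    resp = session.get(f"{BASE}/api/v1/loans/{loan_id}", timeout=10)
    resp.raise_for_status()
    return resp.json()["status"]


loan_ids = ["1001", "1002", "1003"]

# Batched bot: one login, one session, reused for every request.
with requests.Session() as s:
    s.post(f"{BASE}/login", data={"user": "bot", "password": "secret"})
    for lid in loan_ids:
        print(lid, status_via_api(s, lid))

# The "army of humans" equivalent: every lookup arrives in its own login
# session, so the server repeats the authentication and session-setup work
# each time, on top of serving a full GUI page rather than compact JSON.
for lid in loan_ids:
    with requests.Session() as s:
        s.post(f"{BASE}/login", data={"user": "human", "password": "secret"})
        print(lid, status_via_gui_scrape(s, lid))
```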