After a year, I've finally found the time to publish a new, more powerful version of my web automation API. V1 couldn't handle dynamically generated DOM elements (i.e., content loaded via JavaScript); after thorough research and an architectural redesign, it now can.
For this new version I designed my own simple, flexible, distributed architecture so the system can easily scale with the load.
I almost forgot: the new API also includes proxies in its plans. They rotate dynamically to provide maximum availability for the end user.
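To give a feel for how an API like this is consumed, here is a minimal Python sketch of assembling a request through RapidAPI. The `X-RapidAPI-Key` and `X-RapidAPI-Host` headers follow RapidAPI's standard convention, but the endpoint path (`/scrape`) and the `url` query parameter are hypothetical names for illustration, not taken from the actual ScrapingMonkey documentation.

```python
def build_scrape_request(api_key: str, target_url: str) -> dict:
    """Assemble the pieces of a RapidAPI request without sending it.

    The endpoint path and parameter name below are assumptions
    made for this sketch; check the API docs for the real ones.
    """
    return {
        # Assumed endpoint path:
        "url": "https://scrapingmonkey.p.rapidapi.com/scrape",
        "headers": {
            # Standard RapidAPI authentication headers:
            "X-RapidAPI-Key": api_key,
            "X-RapidAPI-Host": "scrapingmonkey.p.rapidapi.com",
        },
        # Assumed parameter name for the page to scrape:
        "params": {"url": target_url},
    }

req = build_scrape_request("YOUR_KEY", "https://example.com")
# Sending it would then be a one-liner, e.g. with the requests library:
#   requests.get(req["url"], headers=req["headers"], params=req["params"])
```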
My main goals for this version were:

- support for dynamically generated (JS-rendered) content
- a distributed architecture that scales with the load
- rotating proxies included in the plans
Wish me luck! :D
And by the way, go check it out on RapidAPI Hub: https://rapidapi.com/onipot/api/scrapingmonkey/