API to check crawl rules from robots.txt files
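A minimal sketch of such a check, using Python's standard-library `urllib.robotparser`. The robots.txt body, the bot name `MyBot`, and the `example.com` URLs are hypothetical placeholders for illustration:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; in a real API this would be fetched
# from https://<host>/robots.txt before checking URLs against it.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# can_fetch() applies the parsed rules for a given user agent and URL.
print(parser.can_fetch("MyBot", "https://example.com/index.html"))  # True
print(parser.can_fetch("MyBot", "https://example.com/private/x"))   # False
```

An API endpoint wrapping this would typically accept a user agent and a target URL, fetch and cache the site's robots.txt, and return the allow/deny decision.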