We’ll check robots.txt first. If crawling is allowed, we’ll map up to 500 unique pages from the same host.
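
As a rough sketch of that flow, here is a minimal Python example using only the standard library. The starting URL, the `USER_AGENT` string, and the `LinkParser` helper are all hypothetical names introduced for illustration, not part of any stated implementation; real crawlers would also want rate limiting and better error handling.

```python
import urllib.robotparser
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse, urldefrag
from urllib.request import urlopen

START_URL = "https://example.com/"  # hypothetical starting point
MAX_PAGES = 500                     # cap on unique same-host pages
USER_AGENT = "*"                    # assumption: generic agent string


class LinkParser(HTMLParser):
    """Collects href values from <a> tags."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def crawl(start_url: str, max_pages: int = MAX_PAGES) -> set[str]:
    parts = urlparse(start_url)
    host = parts.netloc

    # Step 1: fetch and parse robots.txt before anything else.
    rp = urllib.robotparser.RobotFileParser()
    rp.set_url(f"{parts.scheme}://{host}/robots.txt")
    try:
        rp.read()
    except OSError:
        return set()  # assumption: treat an unreachable robots.txt as a stop signal
    if not rp.can_fetch(USER_AGENT, start_url):
        return set()  # crawling disallowed; do nothing

    # Step 2: breadth-first crawl, capped at max_pages unique same-host URLs.
    seen: set[str] = set()
    queue: deque[str] = deque([start_url])
    while queue and len(seen) < max_pages:
        url, _ = urldefrag(queue.popleft())  # drop #fragments so URLs dedupe
        if url in seen or urlparse(url).netloc != host:
            continue
        if not rp.can_fetch(USER_AGENT, url):
            continue  # respect per-path disallow rules
        seen.add(url)
        try:
            with urlopen(url, timeout=10) as resp:
                html = resp.read().decode("utf-8", errors="replace")
        except OSError:
            continue  # skip unreachable pages
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            queue.append(urljoin(url, href))  # resolve relative links
    return seen
```

The breadth-first queue plus the `seen` set is what enforces the two constraints in one place: the host check keeps the crawl on the same site, and the `len(seen) < max_pages` guard stops the walk at 500 unique pages.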