Enter a website URL to fetch its robots.txt, parse the crawler groups, detect Sitemap directives, and validate the linked sitemap XML files.
The analyzer checks whether important bots can crawl the root path, counts sitemap URLs, finds duplicate or external URLs, and performs a sampled broken-link check on sitemap entries.
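The robots.txt side of this can be sketched with Python's standard-library `urllib.robotparser`, which handles both the crawler groups and Sitemap directives. The function name `analyze_robots`, the bot list, and the sample robots.txt below are illustrative assumptions, not the tool's actual implementation:

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt content; in the real tool this would be
# fetched from <site>/robots.txt over HTTP.
ROBOTS_TXT = """\
User-agent: *
Disallow: /private/

User-agent: BadBot
Disallow: /

Sitemap: https://example.com/sitemap.xml
"""

# Hypothetical list of "important" crawlers to check.
IMPORTANT_BOTS = ["Googlebot", "Bingbot", "BadBot"]

def analyze_robots(text: str) -> dict:
    """Parse robots.txt text: collect Sitemap URLs and test root-path access."""
    parser = RobotFileParser()
    parser.parse(text.splitlines())
    return {
        # site_maps() returns the Sitemap directive URLs (None if there are none).
        "sitemaps": parser.site_maps() or [],
        # can_fetch() applies the matching group's Allow/Disallow rules.
        "root_crawlable": {bot: parser.can_fetch(bot, "/") for bot in IMPORTANT_BOTS},
    }
```

With the sample file above, `analyze_robots(ROBOTS_TXT)` reports one sitemap URL, root access allowed for Googlebot and Bingbot, and denied for BadBot.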
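The sitemap checks (URL count, duplicates, external hosts) can be sketched by parsing the sitemap XML with the standard-library `xml.etree.ElementTree`. The function `audit_sitemap` and the sample document are assumptions for illustration; the broken-link sampling step is omitted here since it needs live HTTP requests:

```python
import xml.etree.ElementTree as ET
from collections import Counter
from urllib.parse import urlparse

# Namespace used by the sitemap protocol (sitemaps.org).
SITEMAP_NS = "{http://www.sitemaps.org/schemas/sitemap/0.9}"

# Illustrative sitemap with one duplicate and one external URL.
SAMPLE_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/about</loc></url>
  <url><loc>https://example.com/about</loc></url>
  <url><loc>https://cdn.other.net/asset</loc></url>
</urlset>"""

def audit_sitemap(xml_text: str, site_host: str) -> dict:
    """Count sitemap <loc> entries, flag duplicates and external-host URLs."""
    root = ET.fromstring(xml_text)
    locs = [el.text.strip() for el in root.iter(f"{SITEMAP_NS}loc") if el.text]
    counts = Counter(locs)
    return {
        "url_count": len(locs),
        "duplicates": sorted(u for u, n in counts.items() if n > 1),
        "external": sorted({u for u in locs if urlparse(u).hostname != site_host}),
    }
```

For the sampled broken-link check, a real implementation would pick a random subset of the collected URLs and issue HEAD requests, treating 4xx/5xx responses as broken.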