The Robots.txt Generator creates a properly formatted robots.txt file that tells search engines which pages to crawl and which to avoid.
Robots.txt directives:
- User-agent: names the crawler a group of rules applies to (* matches all crawlers)
- Disallow: blocks crawling of a URL path
- Allow: re-permits a path inside an otherwise disallowed directory
- Sitemap: points crawlers to your XML sitemap
- Crawl-delay: asks the crawler to wait between requests (supported by Bing, ignored by Google)
Robots.txt is the first file search engine crawlers check when they visit your site. Use it to block crawling of admin pages, thank-you pages, duplicate content, and other pages you do not want crawled. Note that blocking a page in robots.txt does not guarantee it stays out of search results; to keep a page out of the index, use a noindex meta tag instead.
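As a minimal sketch, a generated robots.txt for a site like the one described above might look like this (the paths and domain are placeholders for your own URLs):

User-agent: *
Disallow: /admin/
Disallow: /thank-you/

Sitemap: https://www.example.com/sitemap.xml

Here the User-agent line starts a rule group that applies to all crawlers, each Disallow line blocks one path prefix, and Sitemap is a standalone directive that applies regardless of the rule groups.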