The Robots.txt Checker lets you fetch and view the contents of the robots.txt file for your website or any other site. This helpful tool reveals the contents of your robots.txt file (if your site has one) and shows you at a glance which pages your file allows or disallows. In layman's terms, this simple file tells search engine robots, or spiders, which pages they may crawl on behalf of Google and other engines and which pages are off limits.
If certain pages contain sensitive company data, or if you want to keep crawlers out of administrative directories on a large site, you can tell search engine robots to ignore them. Use this tool to gather this valuable information about your site and gain more control over your search engine efforts.
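As a rough sketch of what such a checker does under the hood, Python's standard library ships a robots.txt parser. The rules below are hypothetical examples, not taken from any real site; a live checker would load the file from a URL instead of a string.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules for illustration:
# block the /private/ directory, allow everything else.
rules = """\
User-agent: *
Disallow: /private/
Allow: /
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Ask whether a generic crawler ("*") may fetch each page.
print(parser.can_fetch("*", "https://example.com/private/data.html"))  # False
print(parser.can_fetch("*", "https://example.com/index.html"))         # True
```

To check a real site, you would call `parser.set_url("https://example.com/robots.txt")` followed by `parser.read()` instead of `parser.parse(...)`, which fetches and parses the live file over the network.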