Posted to Ben Finklea's blog on October 22nd, 2008

Validating Your robots.txt File

Well-behaved spiders, like the ones Google and Yahoo use, consult your robots.txt file to decide which pages to crawl. This is an excellent way to steer the search engines toward the pages you want to appear in the search results (like content pages) and away from pages that aren't as important (like administrative pages).
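As a sketch of what this looks like in practice, a minimal robots.txt along these lines (the /admin/ path is just an illustration) allows everything except an administrative section:

```
User-agent: *
Disallow: /admin/
Allow: /
```

The `User-agent: *` line applies the rules to all spiders; you can add separate sections for specific crawlers if you need finer control.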

Be sure to validate your robots.txt file by entering its full URL into a robots.txt validator. Once your file has been validated, you can be confident that the search engines you are targeting will find your optimized pages.
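You can also sanity-check your rules yourself. As one hedged example, Python's standard-library `urllib.robotparser` will parse a robots.txt and report which URLs a given spider may fetch (the rules and URLs below are hypothetical):

```python
import urllib.robotparser

# Hypothetical robots.txt: block the admin section, allow everything else.
rules = """User-agent: *
Disallow: /admin/
Allow: /""".splitlines()

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules)

# A content page should be crawlable; an admin page should not be.
print(parser.can_fetch("*", "http://example.com/content/article"))  # True
print(parser.can_fetch("*", "http://example.com/admin/settings"))   # False
```

Checking a few representative URLs this way catches the most common mistake: a Disallow rule that accidentally matches the content pages you actually want indexed.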