New commands for robots.txt

Robots.txt files have been around almost since the dawn of the Internet, and certainly since search engines began crawling sites, yet they have changed very little in that time. Now there are new commands available to add to your robots.txt file besides the usual "Allow" and "Disallow" directives. Sitemaps.org has announced these new commands, which are explained in detail in this fine article at Market Position. You can now use your robots.txt file to tell a spider where your sitemap lives, and to slow down the rate at which spiders crawl your site.
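For illustration, here is a minimal sketch of a robots.txt file using both new directives alongside the familiar ones. The sitemap URL, the disallowed path, and the ten-second delay are placeholders, and note that Crawl-delay is honored by some spiders but not all:

    # Standard directives: apply to all crawlers
    User-agent: *
    Disallow: /private/

    # New: ask this crawler to wait 10 seconds between requests
    # (respected by some spiders, e.g. Yahoo! and MSN, but not all)
    Crawl-delay: 10

    # New: tell crawlers where the sitemap lives (full URL required;
    # this line sits outside any User-agent group)
    Sitemap: http://www.example.com/sitemap.xml

The Sitemap line can appear anywhere in the file and you can list more than one, which is handy for large sites that split their sitemaps into several files.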
