Monday, February 7, 2011

What is a robots.txt file?

Robots.txt is a plain text file that tells search engine crawlers which areas of your website should not be crawled or indexed. Note that it is a voluntary convention (the Robots Exclusion Protocol), not an enforcement mechanism, so well-behaved crawlers honor it while others may ignore it.
This file must be placed in the root directory of your website (often the /www directory on your server) so that crawlers can find it at http://yoursite.com/robots.txt.
The sample below tells search engines not to index pages in the specified folders.
User-agent: *
Disallow: /cgi-bin/
Disallow: /images
Disallow: /secret
Disallow: /members
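You can check how a crawler would interpret these rules using Python's standard-library robots.txt parser. This is a small sketch, not part of the original post; the example paths are made up for illustration.

```python
from urllib import robotparser

# The same rules as the sample above, fed to the parser directly
# instead of being fetched from a live site.
rules = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /images
Disallow: /secret
Disallow: /members
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# Paths under a disallowed folder are blocked for all user agents ("*").
print(parser.can_fetch("*", "/cgi-bin/script.pl"))  # False
print(parser.can_fetch("*", "/secret/page.html"))   # False

# Anything not listed remains crawlable.
print(parser.can_fetch("*", "/index.html"))         # True
```

The same check works against a live site by calling `parser.set_url("http://yoursite.com/robots.txt")` followed by `parser.read()`.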