Tuesday, December 01, 2009

Why Pages Disallowed in robots.txt Still Appear in Google

By Craig Buckler

robots.txt is a useful file that sits in your website’s root and controls which parts of your site search engine robots may crawl. One of the most useful declarations is “Disallow”: it stops search engines from accessing private or irrelevant sections of your website, e.g.:
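A minimal sketch, assuming you want to keep robots out of hypothetical /junk/ and /temp/ directories:

    User-agent: *
    Disallow: /junk/
    Disallow: /temp/

The “User-agent: *” line applies the rules to every compliant crawler, and each “Disallow” line tells them not to request URLs under that path.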