robots.txt is a useful file that sits in your website's root and controls how search engines crawl and index your pages. One of the most useful directives is "Disallow": it stops search engines from accessing private or irrelevant sections of your website, e.g.
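A minimal example might look like the following (the directory names here are purely illustrative):

    # applies to every crawler
    User-agent: *
    # keep private or irrelevant sections out of the crawl
    Disallow: /admin/
    Disallow: /scripts/

Each Disallow line names a path that crawlers obeying the file should not request.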