SitePoint is always publishing info that's relevant to anyone with an online presence. Today they posted an article on the robots.txt file, a simple text file that every website should have in its root directory and that anyone can create with Notepad. It tells user-agents (search engine spiders) how to crawl your site, and it even lets you block spiders from wasting your bandwidth. Of course, there are plenty of spiders with nothing but ill intent for your site, and those will simply ignore robots.txt, but the article gives sample files and a table of the legitimate spiders that will abide by your wishes. Check it out.
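To give you the flavor, here's a minimal robots.txt sketch using the standard User-agent and Disallow directives — the paths and the "HungryBot" name are just made up for illustration:

```
# Well-behaved crawlers: crawl everything except these directories
User-agent: *
Disallow: /cgi-bin/
Disallow: /tmp/

# Block one hypothetical bandwidth-hungry crawler from the whole site
User-agent: HungryBot
Disallow: /
```

Drop a file like this at the root of your site (e.g. yoursite.com/robots.txt) and compliant spiders will check it before crawling anything.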