I recently created a robots.txt file for this site and the yet-to-be-launched Niche Site 1. Graywolf’s video is a nice primer on the problem of search engines crawling your blog, finding too much duplicate content, and lowering your PageRank. He’s a fan of WordPress and made the video specifically for it.
There are a few nice pages for the nitty-gritty details. First is the official Web Robots page. Second is the sample robots.txt file on the SEO page at WordPress. Third is the robots.txt analysis tool under Google’s Webmaster Tools. This is probably just the tip of the iceberg, but I found these very useful for beginners like us.
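To give a feel for what such a file looks like, here is a minimal sketch of a WordPress-oriented robots.txt in the spirit of the sample file mentioned above; the specific paths are my own assumptions, not a copy of that sample, so adjust them to your install:

```
# Apply these rules to all crawlers
User-agent: *
# Keep crawlers out of WordPress internals
Disallow: /wp-admin/
Disallow: /wp-includes/
# Cut down duplicate content from archive and feed URLs
Disallow: /trackback/
Disallow: /feed/
```

Dropping this file at the root of the site (e.g. example.com/robots.txt) is all it takes; Google’s robots.txt analysis tool can then confirm which URLs are blocked.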