Question: I have a question about robots.txt. What is robots.txt used for? I looked at the one on your site and I mostly get it, but not totally.
Answer: Robots.txt has two primary uses. First and foremost, it tells Google (and other search engines) which parts of your site you do NOT want crawled. So if you have content on your website that you do not want crawled, indicate that in robots.txt. Note, however, that blocking a page in robots.txt does not guarantee it stays out of the Google index; a blocked page can still be indexed if other sites link to it. If you really want a page kept out of the index, add the ROBOTS META TAG to each and every such page (and do NOT block it in robots.txt, since Google must crawl the page to see the tag) –
<meta name="robots" content="noindex, nofollow">
Second, robots.txt is used to point to your XML sitemap. The XML sitemap is an OPTIONAL but helpful file that helps Google and Bing find the content on your website. So a best practice is to reference your XML sitemap in robots.txt with a Sitemap: line. That's about it for robots.txt!
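Putting the two uses together, a minimal robots.txt might look like the sketch below (the /private/ directory and the sitemap URL are just placeholders; swap in your own paths):

```
# Rules apply to all crawlers
User-agent: *
# Keep crawlers out of a section you don't want crawled
Disallow: /private/
# Point search engines to your XML sitemap
Sitemap: https://www.example.com/sitemap.xml
```

The file lives at the root of your domain (e.g. example.com/robots.txt), and you can check how Google reads it using the robots.txt report in Google Search Console.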
Got a question? Click this link to email Jason or call 888-993-1122. Dr. Jason McDonald is founder and Senior SEO / Social Media Director of the JM Internet Group. He teaches the SEO training classes for the group, and therefore provides most of the SEO tips for this blog. His goal with this blog is to provide an easy 'one-stop shop' for the busy marketer looking for tips, tricks, and secrets on how to get to the top of Google and Bing for free using proven SEO tactics. When not dreaming up SEO tips, Dr. McDonald lives in the San Francisco Bay Area with his wife, two dogs, a cat, four iguanas, and twelve children (just kidding). Really, you read down to the bio on this guy? Enjoy the blog.