Question: I have both a member and a non-member side to my website. Googlebot was so efficient at pulling from the members-only side that I asked my web provider to disallow Google's bots (my Google Alerts were letting me know it was happening). Was that a bad idea, since it limits the number of pages I am having indexed? (156 pages)
Answer: It depends. Use robots.txt primarily to 1) exclude parts of your website you want to keep out of search results, and 2) tell Google where your sitemap.xml file is located. If you do NOT want non-members to be able to find your MEMBERS-only subsections through the Google search engine, then EXCLUDE those pages via robots.txt. If you don't care, or if you think those pages might help you pull in new members, then let Google index them.
It’s really up to you. Use robots.txt just as a guide to Google as to what you DO want it to index publicly and what you do NOT want publicly indexed.
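To make this concrete, here is a minimal robots.txt sketch. The /members/ path and the sitemap URL are assumptions for illustration; substitute the actual paths on your site.

```
# robots.txt — placed at the root of your domain
# Assumed example: member-only pages live under /members/

User-agent: *
Disallow: /members/

# Tell crawlers where your sitemap lives (assumed URL)
Sitemap: https://www.example.com/sitemap.xml
```

One caveat worth knowing: robots.txt is a publicly readable request, not access control. It asks well-behaved crawlers like Googlebot to stay out, but it does not password-protect the member pages themselves, so truly private content should also sit behind a login.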
Got a question? Click this link to email Jason or call 888-993-1122. Dr. Jason McDonald is founder and Senior SEO / Social Media Director of the JM Internet Group. He teaches the SEO training classes for the group, and therefore provides most of the SEO tips for this blog. His goal with this blog is to provide an easy 'one-stop shop' for the busy marketer looking for tips, tricks, and secrets on how to get to the top of Google and Bing for free using proven SEO tactics. When not dreaming up SEO tips, Dr. McDonald lives in the San Francisco Bay Area with his wife, two dogs, a cat, four iguanas, and twelve children (just kidding). Really, you read all the way down to the bio on this guy? Enjoy the blog.