Blocking Complicated URLs with Robots.txt

If you have a large website, you might have some content that you do not want the search engines to index — perhaps for duplicate-content reasons, or because you simply don't want someone casually stumbling across it. You know you can use robots.txt, but what if you need to block thousands of pages, or block only certain files within a folder? This article will explain some of the more advanced uses of robots.txt. You will even learn how to block dynamic pages!
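As a taste of what's covered, here is a minimal sketch of the kind of pattern-based rules involved. Note that the wildcard (`*`) and end-of-URL anchor (`$`) are extensions honored by major crawlers such as Googlebot and Bingbot, not part of the original robots.txt standard, and the paths below are purely hypothetical examples:

```
User-agent: *
# Block every URL containing a query string (dynamic pages)
Disallow: /*?

# Block only PDF files, anywhere on the site
Disallow: /*.pdf$

# Block one file type inside a specific folder, leaving the rest crawlable
Disallow: /reports/*.csv$
```

Because thousands of dynamic URLs often share a common pattern, a single wildcard rule like these can replace an unmanageably long list of individual `Disallow` lines.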