December 10, 2009

ImamDirektoriya asked:

How do I make a good robots.txt for phpLD? Is this a good way to protect my directory from duplicate content:

Quote:
User-agent: *
Disallow: /*submit.php?c
Disallow: /*?p
Disallow: /*?s
Disallow: /*p=d
Disallow: /*p=h

User-agent: Mediapartners-Google
# An empty Disallow line is needed to make this record valid; it lets the AdSense crawler fetch everything
Disallow:

David replied: Duplicate content is usually not an issue unless you are doing something that produces poor-quality content, such as article spinning. If spiders are hitting your site too hard, it is probably better to block the offending spiders by IP address or hostname than to try to handle it in robots.txt. Google generally knows which pages are best to index, so you don’t need a special robots.txt.
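
As a rough sketch of the IP/hostname blocking David describes, rules like the following could go in the .htaccess file of an Apache-hosted phpLD install (Apache 2.2 Order/Deny syntax; the IP address, hostname, and user-agent string below are placeholders for whichever spider is misbehaving):

# Allow everyone by default, then deny the troublesome spiders
Order Allow,Deny
Allow from all

# Block by IP address (placeholder from the documentation range)
Deny from 192.0.2.15

# Block by hostname (requires a reverse-DNS lookup on each request)
Deny from crawler.example.net

# Block by user-agent string via mod_setenvif
SetEnvIfNoCase User-Agent "HungrySpider" bad_bot
Deny from env=bad_bot

Blocking at the server level has the advantage that the spider cannot simply ignore the rule, which is always a risk with robots.txt.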