
Serving different robots.txt per domain with a single codebase

I just found myself with the task of excluding search robots from one site of a multisite Drupal installation while keeping the other site fully crawlable. Since multisites by definition share a single code base, there is only one robots.txt for all sites in a multisite installation. Eli suggested a great way of overcoming this problem: a simple domain-based rewrite rule. It takes less than a minute to implement and works smoothly \o/

# In .htaccess (mod_rewrite must be enabled); replace the
# placeholder hostnames with your sites' actual domains.
RewriteEngine On

RewriteCond %{HTTP_HOST} ^(www\.)?domainone\.example$ [NC]
RewriteRule ^robots\.txt$   robots_for_domainone.txt [L]

RewriteCond %{HTTP_HOST} ^(www\.)?domaintwo\.example$ [NC]
RewriteRule ^robots\.txt$   robots_for_domaintwo.txt [L]
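For completeness, the two target files could look like this, assuming domainone is the site to hide from search robots and domaintwo should stay fully crawlable (the filenames follow the rewrite rules above; adjust to your setup):

# robots_for_domainone.txt — block all crawlers
User-agent: *
Disallow: /

# robots_for_domaintwo.txt — allow everything
User-agent: *
Disallow:

You can verify the rewrite with curl against each hostname, e.g. curl -s http://domainone.example/robots.txt, and check that the expected file is returned.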
