Serving different robots.txt per domain with a single codebase

I just found myself with the task of excluding search robots from one site of a multisite Drupal installation while keeping the other site fully crawlable. Since multisites by definition share a single code base, there is only one robots.txt for all sites in the installation. Eli suggested a great way around this: a simple domain-based rewrite rule. It takes less than a minute to implement and works smoothly \o/

# Serve a per-domain robots.txt (requires mod_rewrite, e.g. in .htaccess)
RewriteEngine On

RewriteCond %{HTTP_HOST}    domainone\.com$ [NC]
RewriteRule ^robots\.txt$    robots_for_domainone.txt [L]

RewriteCond %{HTTP_HOST}    domaintwo\.com$ [NC]
RewriteRule ^robots\.txt$    robots_for_domaintwo.txt [L]
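
For completeness, the file served to the site that should be excluded (robots_for_domainone.txt in the rules above) could be as simple as the standard disallow-everything robots.txt:

User-agent: *
Disallow: /

The file for the other domain would then just contain the usual rules for the fully crawlable site.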
