Modify robots.txt Using the RobotsTxt Module

The robots.txt file tells cooperating web crawlers which directories and files they should not access, and it plays a significant role in search engine optimization and site performance. Drupal ships with a standard robots.txt file that blocks crawlers from a set of default paths.

Things get tricky when you need to modify robots.txt for one specific website within a multi-site setup, because the file sits in the docroot and is shared across every site in that setup.

The solution is to use the RobotsTxt module. The module dynamically generates a robots.txt file that can be modified directly from Drupal's administration section, so each site in a multi-site setup can serve its own version.

In this video, I’ll show you how to set up the module.


drush dl robotstxt
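Downloading the module is only the first step: it also needs to be enabled, and the static robots.txt that ships with Drupal has to be removed (or renamed) so the module's dynamically generated version can be served instead. A sketch of those steps, assuming Drush 8-style commands (matching `drush dl` above) and a standard docroot:

```shell
# Enable the RobotsTxt module (Drush 8 syntax).
drush en robotstxt -y

# Rename the stock file so Drupal routes /robots.txt to the module's
# dynamic version. Keeping a .bak copy preserves the defaults for reference.
mv robots.txt robots.txt.bak
```

If the physical file is left in place, the web server will keep serving it and any changes made in the module's settings page will never appear.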


/**
 * Implements hook_robotstxt().
 */
function hook_robotstxt() {
  return array(
    'Disallow: /custom-search',
    'Disallow: /custom-listing',
  );
}
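With an implementation of this hook enabled in a custom module (and caches cleared), the returned lines are appended to the content configured on the module's settings page, so the served /robots.txt would include something like:

```
Disallow: /custom-search
Disallow: /custom-listing
```

This makes the hook a good fit for rules that belong to a specific module's routes, while site-wide rules stay editable in the administration UI.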