About the RobotsTxt module
The Drupal RobotsTxt module is great when you are running multiple Drupal sites from a single code base (multisite) and you need a different robots.txt file for each one. RobotsTxt generates the robots.txt file for each site and lets you edit it on a site-by-site basis from within the Drupal admin interface.
Volacci uses this module to make changes to the default robots.txt file, which is not fully optimized for SEO out of the box.
Before installing the RobotsTxt module, you’ll need to delete or rename the existing robots.txt file in the root of your Drupal installation. The module will not work properly until this is done.
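If you prefer to script the rename, here is a minimal sketch in Python. The scratch directory and file contents below are placeholders standing in for your Drupal docroot and its shipped robots.txt; renaming (rather than deleting) keeps a backup you can restore later.

```python
# Rename the stock robots.txt out of the way so the RobotsTxt module's
# dynamically generated file can be served instead.
from pathlib import Path
import tempfile

docroot = Path(tempfile.mkdtemp())  # stand-in for your Drupal root
stock = docroot / "robots.txt"
stock.write_text("User-agent: *\nDisallow: /admin/\n")  # placeholder contents

backup = docroot / "robots.txt.bak"  # rename rather than delete
stock.rename(backup)
print(backup.exists(), stock.exists())  # → True False
```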
- Install and Enable the RobotsTxt module on your server. (See this section for more instructions on installing modules.)
- Go to the Extend page: Click Manage > Extend (Coffee: “extend”) or visit https://yourDrupalSite.dev/admin/modules.
- Select the checkbox next to RobotsTxt and click the Install button at the bottom of the page.
If necessary, give yourself permissions to use the RobotsTxt module.
- Click Manage > People > Permissions (Coffee: “perm”) or visit https://yourDrupalSite.dev/admin/people/permissions.
- Select the appropriate checkboxes for "Administer robots.txt".
- Click the Save permissions button at the bottom of the page.
Adding the XML Sitemap to your robots.txt file using the RobotsTxt Module
Note: If you do not use the RobotsTxt module, you’ll need to do things the old-fashioned way. Skip to the next section for information on how to make changes to your robots.txt file.
- Click Manage > Configuration > Search and metadata > Robotstxt (Coffee: “robots”) or go to https://yourDrupalSite.dev/admin/config/search/robotstxt in your browser.
- Put your cursor within the Contents of robots.txt window and scroll to the bottom of it.
- On a new line at the bottom of the field, add the Sitemap directive that points to your XML sitemap.
- Click the Save configuration button at the bottom of the page.
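The line to add is a Sitemap directive pointing at your sitemap’s URL. Assuming the default sitemap path of /sitemap.xml (substitute your own domain, and a different path if your sitemap module uses one):

```
Sitemap: https://yourDrupalSite.dev/sitemap.xml
```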
Adding the XML Sitemap to your robots.txt file without a module
- Download your robots.txt file. One way to do that is to visit https://yourDrupalSite.dev/robots.txt in your browser and select File > Save Page As... from the browser’s menu.
- Using a text editor like Notepad or TextEdit, open your robots.txt file.
Avoid word processing programs when editing this file; they add invisible markup that makes the file unusable by crawlers.
- Add the Sitemap directive for your XML sitemap to the bottom of your robots.txt file and save the file.
- Here’s what Volacci’s robots.txt file looks like.
Note: Always use the https version of your site, because serving pages over http can hurt your SEO rankings. Contact your web developer or hosting company to make sure your site has an SSL certificate and that your site defaults to the https protocol.
- Upload the new file to the root level of your Drupal site, replacing your existing robots.txt file. If you don’t have FTP access, ask your developer or hosting company to help you.
- Verify that you did it properly by visiting https://yourDrupalSite.dev/robots.txt, refreshing the page, and looking for your changes.
That’s it! Now any search engine can find the location of your XML sitemap by reading your robots.txt file.
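As a sanity check, the Sitemap directive is easy to pick out programmatically. A minimal sketch in Python — the sample robots.txt body below is hypothetical; in practice you would paste in the contents of your own file:

```python
# Extract every sitemap URL declared in a robots.txt body.
robots_txt = """\
User-agent: *
Disallow: /admin/
Sitemap: https://yourDrupalSite.dev/sitemap.xml
"""

def sitemap_urls(body: str) -> list[str]:
    """Return the URL from every 'Sitemap:' directive (case-insensitive)."""
    urls = []
    for line in body.splitlines():
        key, sep, value = line.partition(":")
        if sep and key.strip().lower() == "sitemap":
            urls.append(value.strip())
    return urls

print(sitemap_urls(robots_txt))  # → ['https://yourDrupalSite.dev/sitemap.xml']
```

If the list comes back empty, the directive never made it into the file (or a word processor mangled it) and crawlers won’t see your sitemap.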
Did you like this Drupal RobotsTxt module walk-through?
Please tell your friends!