Robots.txt
5-15 min · Intermediate
Drupal ships with a sensible default robots.txt file in its web root. Customizing it usually means installing the RobotsTxt module or editing the file manually.
Prerequisites
- Drupal Admin Access
- FTP/SFTP Access (for manual methods)
Easy · Recommended
Using the RobotsTxt Module
Manage robots.txt from the admin UI without touching files.
Step 1: Install Module
1. Install the RobotsTxt module via Composer: composer require drupal/robotstxt
2. Enable it in Extend (or from the command line, as sketched below).
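If you manage the site over SSH, both steps can also be done with Drush. This sketch assumes Drush is installed and uses the module's machine name, robotstxt (from the drupal/robotstxt package):

  # Download the module (same as step 1)
  composer require drupal/robotstxt
  # Enable it and rebuild caches
  drush en robotstxt -y
  drush cr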
Step 2: Configure
1. Go to Configuration > Search and Metadata > RobotsTxt (admin/config/search/robotstxt).
2. Edit the content area.
3. Save configuration.
4. Note: You must delete the physical robots.txt file from the Drupal root for the module to take over; as long as the file exists, the web server serves it directly and the module's output is never reached. A command-line sketch follows.
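A minimal sketch of the file removal, assuming a typical layout where the docroot is /var/www/html (a placeholder — adjust the path to your install):

  cd /var/www/html               # hypothetical docroot; yours may be web/ or docroot/
  mv robots.txt robots.txt.bak   # rename rather than delete, so you keep a backup
  drush cr                       # rebuild caches so /robots.txt routes to the module

If your project uses Drupal's Composer scaffolding, be aware that a composer update can silently re-create robots.txt; the drupal-scaffold file-mapping setting in composer.json can be used to exclude it.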
Best Practices
Do
- Block /admin and /user paths
- Allow /core/*.css and /core/*.js so Google can render your pages (see the sample rules after this list)
Don't
- Block CSS/JS files that affect layout (Google needs to 'see' the page)
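For reference, here is roughly what those practices look like as directives. Drupal's shipped robots.txt already contains equivalents of all of these, so treat this as an illustration rather than a drop-in file:

  User-agent: *
  # Keep core CSS/JS crawlable so Google can render the page
  Allow: /core/*.css$
  Allow: /core/*.js$
  # Keep back-end paths away from crawlers
  Disallow: /admin/
  Disallow: /user/

Note that robots.txt controls crawling, not indexing; blocked pages can still appear in search results if they are linked from elsewhere.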
Verification Checklist
- Visit yoursite.com/robots.txt
- Ensure it reflects your changes (the curl check below covers both points)
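Both checks can be run from a terminal; replace yoursite.com with your actual domain:

  curl -s https://yoursite.com/robots.txt

If the output still shows the old content after saving in the module, the physical robots.txt likely still exists in the docroot, or a Drupal or CDN cache is serving a stale copy.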
Pro Tips
- Drupal's default robots.txt is quite good. Only change it if you have specific needs.