Robots.txt
5-15 min · Intermediate
The robots.txt file tells search engine bots which parts of your site they can and cannot crawl. It is a powerful tool: a single mistake can de-index your entire site.
Prerequisites
- WordPress admin access
- Yoast SEO or Rank Math (recommended for safe editing)
Using Yoast SEO (Virtual File)
Easy · Recommended
The safest way to edit robots.txt without FTP.
Step 1: Access File Editor
1. Go to Yoast SEO > Tools
2. Click "File editor"
3. The top section is robots.txt
Step 2: Create/Edit File
1. If no file exists, click "Create robots.txt"
2. Add your directives
3. Click "Save changes"
Best Practices
Do
- Allow access to /wp-admin/admin-ajax.php
- Allow access to CSS and JS resource files
- Include your Sitemap location
- Test changes in Google Search Console (GSC) immediately; a quick local spot-check is sketched after these lists
Don't
- Block your entire site with "Disallow: /" during development and then forget to remove it at launch
- Use robots.txt to hide sensitive data (the file is publicly visible)
- Block Google from crawling CSS/JS (hurts mobile ranking)
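Before saving new rules, you can sanity-check a draft locally with Python's built-in robots.txt parser. The sketch below uses a placeholder domain and placeholder asset URLs; note that Python's parser applies rules in first-match order, while Google uses longest-match precedence, so results for fine-grained Allow exceptions (such as admin-ajax.php) can differ.

from urllib import robotparser

# Draft rules to test before saving them in Yoast's file editor.
draft = """\
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
"""

parser = robotparser.RobotFileParser()
parser.parse(draft.splitlines())

# Placeholder URLs: swap in real theme/plugin assets from your site.
for url in (
    "https://yoursite.com/",
    "https://yoursite.com/wp-content/themes/example/style.css",
    "https://yoursite.com/wp-includes/js/jquery/jquery.min.js",
):
    print(url, "->", "allowed" if parser.can_fetch("Googlebot", url) else "BLOCKED")

If all three URLs print "allowed", the draft does not block normal pages or front-end assets.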
Verification Checklist
- File loads at yoursite.com/robots.txt
- Does NOT contain "Disallow: /" (unless intentional)
- Sitemap location is specified at the bottom
- Google Search Console Robots Tester shows "Allowed" for homepage (the sketch below automates a rough version of these checks)
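A short script can run a rough version of this checklist against the live file. This is a minimal sketch assuming Python 3 and a placeholder domain; it is not a substitute for the GSC tester.

import urllib.request
from urllib import robotparser

SITE = "https://yoursite.com"  # Placeholder: replace with your real domain.

# 1. File loads at /robots.txt
with urllib.request.urlopen(f"{SITE}/robots.txt") as response:
    body = response.read().decode("utf-8")
    print("HTTP status:", response.status)

lines = [line.strip() for line in body.splitlines()]

# 2. No blanket "Disallow: /" rule (exact-line check; adjust for unusual spacing)
print("Blanket disallow present:", "Disallow: /" in lines)

# 3. A Sitemap location is declared
print("Sitemap declared:", any(line.lower().startswith("sitemap:") for line in lines))

# 4. Homepage is crawlable for Googlebot (rough stand-in for the GSC tester)
parser = robotparser.RobotFileParser()
parser.parse(body.splitlines())
print("Homepage allowed:", parser.can_fetch("Googlebot", f"{SITE}/"))

If any check fails, fix the file and re-test before relying on search console reports.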
Pro Tips
- Use the standard structure:
  User-agent: *
  Disallow: /wp-admin/
  Allow: /wp-admin/admin-ajax.php
  Sitemap: https://yoursite.com/sitemap_index.xml
Common Issues & Fixes
Problem: Site de-indexed after launch
Solution: Check Settings > Reading and ensure "Discourage search engines" is UNCHECKED. When that box is checked, WordPress adds a Disallow: / rule.
Problem: Robots.txt not updating
Solution: A server-side cache (Nginx/Varnish) often caches this file aggressively. Purge the server cache; the header check sketched below can show whether a cached copy is still being served.
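To see whether a cache layer is still serving the old copy, inspect the response headers. A minimal sketch with a placeholder domain; which headers appear (for example X-Cache) depends on your server or CDN setup.

import urllib.request

# Placeholder URL: replace with your own domain.
with urllib.request.urlopen("https://yoursite.com/robots.txt") as response:
    # A non-zero Age or an "X-Cache: HIT" style header usually means a cache
    # answered the request instead of the origin server.
    for header in ("Cache-Control", "Age", "X-Cache", "Last-Modified", "Expires"):
        print(f"{header}: {response.headers.get(header, '(not set)')}")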