Robots.txt Generator
Create SEO-friendly robots.txt files to control crawler access
Robots.txt Generator Help
Create SEO-friendly robots.txt files to control crawler access for Google, Bing, and other search engine bots. A properly configured robots.txt file helps search engines understand which pages to crawl and index.
What is robots.txt?
The robots.txt file is a text file placed in your website's root directory that tells search engine crawlers which pages or sections of your site they should or shouldn't access. It's part of the Robots Exclusion Protocol (REP).
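As a quick illustration, here is a minimal robots.txt; the blocked path and the example.com sitemap URL are placeholders:

```
# Applies to all crawlers
User-agent: *
# Keep crawlers out of /private/
Disallow: /private/

# Point crawlers at the sitemap (placeholder URL)
Sitemap: https://example.com/sitemap.xml
```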
How to Use This Tool
- Choose a User-Agent (* for all crawlers, or specific like Googlebot)
- Enter paths to disallow (e.g., /admin/, /private/)
- Optionally add paths to explicitly allow
- Add your sitemap URL for better indexing
- Set a crawl delay if needed (optional; note that some major crawlers, including Googlebot, ignore Crawl-delay)
- Click "Generate Robots.txt" (a sample of the output appears below this list)
- Copy or download the generated file
- Upload it to your website's root directory
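For example, running the steps above with placeholder values (Googlebot as the user-agent, /admin/ disallowed, /admin/public/ allowed, a 10-second crawl delay, and an example.com sitemap) would produce output along these lines:

```
User-agent: Googlebot
Disallow: /admin/
Allow: /admin/public/
Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```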
Common User-Agents
* - All crawlers
Googlebot - Google's web crawler
Googlebot-Image - Google's image crawler
Bingbot - Bing's web crawler
Slurp - Yahoo's crawler
DuckDuckBot - DuckDuckGo's crawler
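Each crawler follows the group whose User-agent line matches it most specifically, and falls back to the * group otherwise. A sketch with placeholder paths:

```
# Googlebot-Image only: keep images out of Google Images
User-agent: Googlebot-Image
Disallow: /photos/

# All other crawlers: only /tmp/ is off limits
User-agent: *
Disallow: /tmp/
```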
Common Patterns
Block Admin Area
```
User-agent: *
Disallow: /admin/
Disallow: /wp-admin/
```
Allow All with Sitemap
```
User-agent: *
Disallow:

Sitemap: https://example.com/sitemap.xml
```
Block Everything
```
User-agent: *
Disallow: /
```
Best Practices
- Always include a sitemap reference
- Place robots.txt in your root directory (example.com/robots.txt)
- Test your robots.txt using Google Search Console
- Use specific paths rather than blocking entire sections when possible
- Remember: robots.txt doesn't guarantee privacy (use proper authentication)
- Keep it simple; overcomplicated rules can confuse crawlers
Common Paths to Block
/admin/ - Administrative areas
/wp-admin/ - WordPress admin (except admin-ajax.php)
/cgi-bin/ - CGI scripts
/tmp/ - Temporary files
/private/ - Private directories
/*.pdf$ - PDF files (if you don't want them indexed)
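Combined into one file, those rules might look like the sketch below; the Allow line carves out WordPress's admin-ajax.php as noted above. The * and $ wildcards are understood by major crawlers like Googlebot and Bingbot, but they were not part of the original REP, so some older bots may ignore them.

```
User-agent: *
Disallow: /admin/
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php
Disallow: /cgi-bin/
Disallow: /tmp/
Disallow: /private/
Disallow: /*.pdf$
```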
⚠️ Important Note
Blocking pages in robots.txt doesn't prevent them from appearing in search results if they're linked from other sites. For true privacy, use password protection or a noindex meta tag (`<meta name="robots" content="noindex">`). Keep in mind that crawlers can only see a noindex tag on pages they're allowed to fetch, so don't block a page in robots.txt if you're relying on noindex to keep it out of the index.
💡 Pro Tip
After uploading your robots.txt, verify it's accessible at yourdomain.com/robots.txt and test it using the robots.txt report in Google Search Console (which replaced the older robots.txt Tester tool).
Usage Limits
| Plan | Daily Limit | Best For |
|---|---|---|