Robots.txt Generator
Generate robots.txt files visually with user-agent rules, path controls, AI bot blocking, and URL testing.
About Robots.txt Generator
The robots.txt file tells web crawlers which pages or sections of your website they may or may not crawl. It uses the Robots Exclusion Protocol with User-agent, Disallow, Allow, Crawl-delay, and Sitemap directives. A properly configured robots.txt file helps you manage crawl budget, keep crawlers out of private areas, and control AI data collection.
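A minimal file using each of these directives might look like the following; the paths and sitemap URL are placeholders, and note that not all crawlers honor Crawl-delay (Google, for example, ignores it):

    # Illustrative example; paths and sitemap URL are placeholders
    User-agent: *
    Disallow: /admin/
    Allow: /admin/public/
    Crawl-delay: 10

    Sitemap: https://www.example.com/sitemap.xml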
How to Use
Choose a quick template or build custom rules:

- Add user-agent rules for specific bots, or target all bots with *.
- For each bot, add Disallow paths to block and Allow paths to permit.
- Add your sitemap URL.
- Use the URL tester to verify whether a specific URL would be allowed or blocked.
- Copy or download the generated robots.txt and upload it to the root of your site, so it is served at /robots.txt. A sample of the generated output is shown below.
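For instance, blocking a directory while permitting one of its subpaths could produce output like this; the paths are hypothetical, and the allow/block outcomes shown in the comments follow Google's longest-match rule, where the most specific matching rule wins:

    User-agent: *
    Disallow: /private/
    Allow: /private/downloads/

    Sitemap: https://www.example.com/sitemap.xml

    # Hypothetical URL tester results:
    # /private/notes.html       -> blocked  (matches Disallow: /private/)
    # /private/downloads/a.pdf  -> allowed  (longer Allow rule wins)
    # /blog/post                -> allowed  (no matching rule)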
Common Use Cases
- Blocking AI web scrapers (GPTBot, CCBot) from training on your content (see the sketch after this list)
- Protecting admin, login, and private pages from search engines
- Configuring crawl budget by blocking duplicate/thin content pages
- Setting up robots.txt for WordPress, e-commerce, or custom sites
- Testing and validating existing robots.txt rules against specific URLs
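As a concrete sketch of the first use case, the following file blocks the two AI crawlers named above while leaving the site open to all other bots; GPTBot and CCBot are the user-agent tokens these crawlers advertise:

    # Block AI training crawlers
    User-agent: GPTBot
    Disallow: /

    User-agent: CCBot
    Disallow: /

    # Allow everything else
    User-agent: *
    Disallow:

Keep in mind that robots.txt is advisory: compliant crawlers honor it, but it is not an access control mechanism.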