robots.txt Generator

Generate robots.txt files for your website. Configure user agents, allow and disallow rules, crawl delays, and sitemap URLs.

Rule Group 1
User-agent: *
Disallow:

About robots.txt

The robots.txt file tells search engine crawlers which pages or sections of your site should or should not be crawled. It sits at the root of your domain (e.g. example.com/robots.txt) and follows the Robots Exclusion Protocol (RFC 9309). Properly configuring robots.txt helps control crawl budget, steer crawlers away from low-value areas and guide search engines to your most important content. Note that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism and should not be relied on to protect sensitive content.
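As an illustration, a simple robots.txt might block one directory while re-allowing a public sub-path and declaring a sitemap (the paths and URL below are placeholders):

```
# Applies to all crawlers
User-agent: *
Disallow: /admin/
Allow: /admin/public/

# Sitemap declarations are independent of user-agent groups
Sitemap: https://example.com/sitemap.xml
```

An empty `Disallow:` line means nothing is blocked; `Disallow: /` blocks the entire site for that user agent.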

This generator supports multiple user-agent groups, allow and disallow rules, crawl-delay directives and sitemap declarations. You can also use presets to quickly block AI training bots like GPTBot and CCBot. All processing happens entirely in your browser.
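For example, the AI-bot preset described above would produce output along these lines, with separate groups for each blocked bot and a catch-all group for everyone else (the sitemap URL is a placeholder):

```
# Block AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

# All other crawlers: full access
User-agent: *
Crawl-delay: 10
Disallow:

Sitemap: https://example.com/sitemap.xml
```

Note that `Crawl-delay` is not part of the standard and is honored by some crawlers (such as Bing and Yandex) but ignored by Google, which manages crawl rate through Search Console instead.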