Robots.txt Generator
Create a robots.txt file to control how search engine crawlers access your site. Manage Allow/Disallow rules easily.
The bot you want to give instructions to.
Disallow (Block)
Allow (Unblock)
Optional: Path to your XML sitemap.
Optional: Seconds between requests (supported by some bots).
Generated robots.txt
Robots.txt Generator
The Robots.txt Generator allows you to easily create the robots.txt file for your website. This file gives instructions to web robots (also known as crawlers or spiders) about which parts of your site they should or should not crawl.
What is robots.txt?
The Robots Exclusion Protocol (REP), or robots.txt, is a text file residing in the root directory of your website. It tells search engine crawlers which pages or files the crawler can or can't request from your site.
Key Directives
- User-agent: Identifies which crawler the rules apply to. An asterisk (*) means the rules apply to all bots.
- Disallow: Tells a user-agent not to crawl a particular URL or path.
- Allow: Tells a user-agent that it can crawl a specific URL or path (often used to override a Disallow rule for a sub-path).
- Sitemap: Points crawlers to the location of your XML sitemap.
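Putting these directives together, a minimal robots.txt might look like the following (the domain and paths are illustrative; Crawl-delay is honored by only some bots):

```
User-agent: *
Disallow: /admin/
Allow: /admin/public/
Crawl-delay: 10

Sitemap: https://www.example.com/sitemap.xml
```

This blocks all compliant crawlers from the /admin/ section, while the Allow rule carves out /admin/public/ as crawlable.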
Why use it?
Using a robots.txt file helps you manage crawl budget by preventing bots from wasting time on unimportant pages (such as admin panels, duplicate content, or temporary files). Keep in mind that robots.txt is advisory: well-behaved crawlers honor it, but it is not an access-control mechanism and should not be relied on to hide sensitive content. Used correctly, it is a crucial part of technical SEO.
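To check how a given robots.txt would be interpreted before deploying it, you can parse it with Python's standard-library `urllib.robotparser` (the rules, domain, and paths below are illustrative, not part of the generator itself):

```python
from urllib.robotparser import RobotFileParser

# Illustrative rules; in practice you would fetch them from
# https://www.example.com/robots.txt
rules = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A path under the disallowed /admin/ prefix is blocked
print(parser.can_fetch("*", "https://www.example.com/admin/settings"))  # False

# A path with no matching Disallow rule is crawlable
print(parser.can_fetch("*", "https://www.example.com/blog/post-1"))  # True
```

This is a quick sanity check for the rules you generate; individual search engines may resolve overlapping Allow/Disallow rules slightly differently than Python's parser does.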