Create optimized robots.txt and XML sitemap files to help search engines crawl and index your website effectively.

Robots.txt Generator

Default policy: choose whether all robots are Allowed or Refused by default.

Global Crawl-delay: No delay (default), or 5, 10, 20, 60, or 120 seconds. Specifies a delay between consecutive requests for all user-agents. Note: Googlebot does not support this directive.

Sitemap URLs: enter full URLs to your sitemaps, one per line (e.g., https://example.com/sitemap.xml).

Host directive: specify your preferred domain (e.g., example.com). Note: this directive is deprecated by Google.

Globally restricted directories: paths to disallow for all user-agents, one per line. Each path is relative to the site root and should end with a trailing slash "/" (e.g., /private/, /admin/).

Global comments: added at the top of the generated robots.txt file, one per line; each line is prefixed with "#".

Specific search robots control: set each crawler individually to Default, Allowed, or Refused. Supported robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, and MSN PicSearch.

Generated robots.txt: copy and paste the output into a robots.txt file in your website's root directory.

Sitemap Generator

Website base URL: the base URL for your website (e.g., https://example.com). It is used to resolve relative URLs.

Generated sitemap XML: copy and paste the output into your sitemap.xml file or directly into your sitemap management system.
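As a sketch of what the robots.txt generator might produce — assuming example.com and hypothetical /private/ and /admin/ paths — the directives above combine like this:

```text
# Example comment added via Global Comments
User-agent: *
Crawl-delay: 10
Disallow: /private/
Disallow: /admin/

# A specific robot set to Refused is blocked entirely
User-agent: Baiduspider
Disallow: /

Sitemap: https://example.com/sitemap.xml
```

Note that `Disallow: /` refuses a robot completely, while an empty `Disallow:` (or no rule) allows it; the Sitemap line may appear anywhere in the file.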
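Likewise, a minimal sitemap following the sitemaps.org 0.9 protocol — with an illustrative URL and optional lastmod/changefreq/priority fields — looks like:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2024-01-01</lastmod>
    <changefreq>weekly</changefreq>
    <priority>1.0</priority>
  </url>
</urlset>
```

Only `<loc>` is required per URL; the other three child elements are optional hints to crawlers.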
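The sitemap generator's "resolve relative URLs against the base URL" step can be sketched with Python's standard library (`resolve_urls` is a hypothetical helper, not part of the tool):

```python
from urllib.parse import urljoin

def resolve_urls(base_url, paths):
    # urljoin leaves absolute URLs unchanged and resolves
    # relative paths against the given base URL.
    return [urljoin(base_url, p) for p in paths]

print(resolve_urls("https://example.com/",
                   ["about.html", "/blog/", "https://other.org/page"]))
```

Each resolved URL would then become one `<loc>` entry in the generated sitemap.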