Free Robots.txt Generator

About Free Robots.txt Generator

 

A robots.txt file should always live in the site's root folder. It contains a set of instructions websites use to tell search engines which pages should and should not be crawled. Robots.txt files guide crawler access, but they should not be used to keep pages out of Google's index.

It's also essential that the file is named exactly robots.txt. The name is case-sensitive, so get that right or it won't work.

If you can see your robots.txt file with the content you added, you're ready to test the markup.

Here is an example of what a robots.txt file should look like:

 

User-agent: Googlebot

Disallow: /clients/

Disallow: /not-for-google

User-agent: *

Disallow: /archive/

Disallow: /support/

 

In the robots.txt file, we declare User-agent, Allow, Disallow, and Sitemap directives for search engines such as Google, Bing, and Yandex.
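As an illustration, a file combining these directives might look like the following (the domain, sitemap URL, and paths are placeholders, not values from any real site):

```
User-agent: *
Disallow: /admin/
Allow: /

Crawl-delay: 10

Sitemap: https://example.com/sitemap.xml
```

Note that Crawl-delay is a non-standard directive: some crawlers honor it, while Google ignores it.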

Search engines such as Google use website crawlers, or robots, that review all the content on your website.

There may be parts of your website, such as the admin page, that you do not want crawled or shown in user search results. You can add these pages to the file so they are explicitly ignored.
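To check how a crawler would interpret such rules, you can use Python's standard-library urllib.robotparser. This is a minimal sketch; the rules and URLs below are hypothetical examples, not output of this generator:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content blocking an admin area
rules = """\
User-agent: *
Disallow: /admin/
Allow: /
"""

parser = RobotFileParser()
# parse() accepts the file's lines instead of fetching a live URL
parser.parse(rules.splitlines())

# Ask whether a given user agent may fetch a given URL
print(parser.can_fetch("*", "https://example.com/admin/login"))  # False
print(parser.can_fetch("*", "https://example.com/blog/post"))    # True
```

The same parser can also load a live file via `set_url(...)` followed by `read()`, which is how polite crawlers typically consume robots.txt.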

robots.txt is part of the Robots Exclusion Protocol, a standard websites use to tell visiting web crawlers and other web robots which portions of the site they are allowed to visit. This website will easily generate the file for you from your list of pages to be excluded.