Robots.txt Generator

Tool options:

  - Default - All Robots are:
  - Crawl-Delay:
  - Sitemap: (leave blank if you don't have one)
  - Search Robots: Google, Google Image, Google Mobile, MSN Search, Yahoo, Yahoo MM, Yahoo Blogs, Ask/Teoma, GigaBlast, DMOZ Checker, Nutch, Alexa/Wayback, Baidu, Naver, MSN PicSearch
  - Restricted Directories: (the path is relative to root and must contain a trailing slash "/")



Now, create a 'robots.txt' file in your site's root directory, copy the generated text above, and paste it into that file.
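For reference, a generated file typically looks something like the following. The user agents, delay value, and paths here are only illustrative, not output from the tool:

```
User-agent: *
Disallow: /cgi-bin/
Crawl-delay: 10

User-agent: Baiduspider
Disallow: /

Sitemap: https://www.example.com/sitemap.xml
```

Each `User-agent` line starts a block of rules for one robot (`*` means all robots), and an empty `Disallow:` value allows everything.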


About Robots.txt Generator

About the Robots.txt Tool Available at Nice SEO Tools

The robots.txt generator makes site owners' lives simpler and hassle-free by performing a complex task in just a few clicks: it generates a Google-friendly robots.txt file. Its user-friendly interface lets you choose what should be covered by the robots.txt file and what should not. Using the Nice SEO Tools robots.txt generator, website or blog owners can tell robots which files and directories under the site's root may be crawled by Googlebot. You can even grant a specific robot access to your site's index while blocking others from doing the same.

In a sense, robots.txt is the opposite of a sitemap: a sitemap lists the pages to be covered, while robots.txt lists what should be excluded, which is why correct robots.txt syntax matters so much for any site. Every time a search engine crawls a website, it first looks for a robots.txt file at the domain root. Once the file is found, the crawler reads it and identifies the files and directories that are blocked.
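To see how a crawler interprets such a file, Python's standard-library `urllib.robotparser` can be used as a quick sanity check. The rules below are only a sample, not output from the tool:

```python
from urllib.robotparser import RobotFileParser

# A sample robots.txt, as a crawler would fetch it from the domain root.
rules = """\
User-agent: *
Disallow: /private/
Crawl-delay: 10
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# Pages outside the disallowed directory may be fetched...
print(parser.can_fetch("MyBot", "https://example.com/index.html"))        # True
# ...while anything under /private/ is blocked for all user agents.
print(parser.can_fetch("MyBot", "https://example.com/private/data.html"))  # False
# The declared crawl delay is also exposed to the crawler.
print(parser.crawl_delay("MyBot"))                                         # 10
```

This mirrors exactly what a well-behaved search robot does: locate the file at the root, parse it, and check each URL against the rules before fetching.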

How to use robots.txt generator?

With our robots.txt generator tool you can create a robots.txt file for your website by following a few easy steps:

  1. By default, all robots are allowed to access your site's files; here you can select the robots you wish to allow or deny.
  2. Choose a crawl delay, which tells crawlers how long to wait between requests; you can pick a delay between 10 and 100 seconds. The default is no delay.
  3. If a sitemap already exists for the website, paste its URL into the text field.
  4. A full list of search robots is given; select the ones you want to crawl your site, or refuse the robots you do not want crawling your files.
  5. The last step is to restrict the directories that should stay out of the crawl.
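Under the hood, a generator following these steps essentially just concatenates directives. The sketch below illustrates the idea; the function name and parameters are hypothetical, not the tool's actual code:

```python
def build_robots_txt(default_allow=True, crawl_delay=None,
                     sitemap=None, denied_agents=(), restricted_dirs=()):
    """Assemble a robots.txt string from form-style inputs.

    All parameter names are illustrative. Paths in restricted_dirs are
    relative to the site root and should end with a trailing slash.
    """
    lines = ["User-agent: *",
             # An empty Disallow allows everything; "/" blocks the whole site.
             "Disallow: " + ("" if default_allow else "/")]
    for path in restricted_dirs:
        lines.append(f"Disallow: {path}")
    if crawl_delay is not None:
        lines.append(f"Crawl-delay: {crawl_delay}")
    # One extra block per robot that is refused outright.
    for agent in denied_agents:
        lines += ["", f"User-agent: {agent}", "Disallow: /"]
    if sitemap:
        lines += ["", f"Sitemap: {sitemap}"]
    return "\n".join(lines) + "\n"

print(build_robots_txt(crawl_delay=10,
                       restricted_dirs=["/cgi-bin/"],
                       denied_agents=["Baiduspider"],
                       sitemap="https://www.example.com/sitemap.xml"))
```

Copying the printed text into a robots.txt file at the root directory, as described above, is all a tool like this ultimately automates.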

With our free robots.txt generator tool you can create a new robots.txt file or edit the one your website already has. To edit an existing file, pre-populate the tool by pasting the file's URL into the content box and clicking Add. Then use the generator to create Allow or Disallow directives for selected content on your site.

In the niceseotools robots.txt file creator, Yahoo, Google, and many other search engines can be targeted according to your criteria. To add directives for a single crawler, click on its user agent to select that bot. When you click Add Directive, a custom section is appended to the listing, containing all the regular directives plus the new custom one. Finally, once you have finished creating your Googlebot-ready robots.txt file with the help of our generator, simply upload it to your site's root directory. If you want to explore the tool before relying on it, feel free to try it out and create a sample robots.txt file.