
Robots.txt Generator

Overview


Description

Robots.txt is a text file web administrators create to instruct search engine robots on how to crawl and index pages that are part of their website.
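
For example, a simple robots.txt file might look like this (the directory names here are just placeholders):

User-agent: *
Disallow: /admin/
Allow: /

Sitemap: http://www.example.com/sitemap.xml

This tells every robot to stay out of the /admin/ directory, allows everything else, and points crawlers to the site's sitemap.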

Did you know that text files are very powerful, and a single text file could destroy your entire website? The error is damaging, and the way it happens is pretty simple to understand:

  • The sneaky text file that is ruining your life could tell search engines not to crawl your website (see the example after this list).
  • If search engines can't crawl your website, then your pages (that you've worked so hard on) won't appear in any search results.
  • If your website can't be found on search engines, nobody will know about your website.
  • As a result, business will suffer.
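
The damage described above usually comes down to two short lines. A robots.txt file containing only the following tells every search engine robot to stay away from every page of your site:

User-agent: *
Disallow: /

Note how close this is to a harmless file: Disallow: with no path allows everything, while Disallow: / blocks everything.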

Don't worry. We won't let you or your website suffer. The best way to avoid this issue is to use our FREE robots.txt generator. Our tool is designed to generate a proper robots.txt file for your website.

Key Features and Benefits

  • The generated robots.txt file can be uploaded directly to your site's root directory.
  • The new file will tell Google and other search engines which of your website's pages or directories should and should not show up in searches.
  • The tool gives you recommendations when you add a new directive, file, or file path to a new or existing robots.txt file.

Tool

Wouldn't you like a FREE and easy way to create a new robots.txt file, or edit an existing file, for your website?



Enter Domain URL (optional):
Allow/Disallow

Allow: Allow crawling of a particular path.
Disallow: Disallow crawling of a particular path.

User agent

A user-agent is a specific search engine robot. The Web Robots Database lists many common bots. You can set an entry to apply to a specific bot (by listing the name) or you can set it to apply to all bots (by listing All).

Directory or File

A directory, a single page, or a pattern that the rule should apply to.

XML Sitemap URL(s)
(Enter your sitemap URLs, one per line)

A sitemap is a file where you can list the web pages of your site to tell Google and other search engines about the organization of your site content. For example: http://www.xyz.com/sitemap.xml
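
Each sitemap URL entered here would typically appear in the generated file as a Sitemap directive, for example:

Sitemap: http://www.xyz.com/sitemap.xml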





When using our tool, you can specify which search engines you want to include in your personalized criteria.

Upon generating your new robots.txt file, Google and any other specified search engines will be directed as to which pages or directories of your website should or should not be shown in searches.

How to use?

What is robots.txt?
Robots.txt is a text file web administrators create to instruct search engine robots on how to crawl and index pages that are part of their website.
What is User-Agent?
A user-agent is a specific search engine robot. The Web Robots Database lists many common bots. You can set an entry to apply to a specific bot (by listing the name) or you can set it to apply to all bots (by listing an asterisk).
For Example:
An entry that applies to all bots looks like this:
User-Agent: *
What is Disallow and Allow?
Allow: Allow crawling of a particular path
Disallow: Disallow crawling of a particular path.
You can list a specific URL or a pattern. The entry should begin with a forward slash (/).
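
For instance, major crawlers such as Googlebot and Bingbot also understand the * wildcard and the $ end-of-URL anchor (support varies by robot), so a pattern entry can block an entire file type:

User-agent: *
Disallow: /*.pdf$

This blocks every URL ending in .pdf for all robots that support the pattern syntax.
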
How do I use the Robots.txt Generator Tool?
Simply enter your website domain name into the tool. If your website already contains a robots.txt file, you can import it into the tool and then edit the existing directives or add new ones.
If your site doesn't have a robots.txt file, you can generate one by adding new directives.
Use the Robots.txt Generator Tool in 3 Easy Steps:

Step 1: Enter your website domain name.

Step 2: Click on the "Import Robots.txt" button. If a robots.txt file is present, the tool will fetch and display its content; to update the information, add new directives or modify existing ones. If no file is present, create a new one by adding directives.

Example of adding a new directive

Allow: Allow crawling of a particular path

Ex:

Input: Allow/Disallow is "Allow", User agent is "All", Directory or File is "/lxrmarketplace"

Output:

User-agent: *
Allow: /lxrmarketplace

That means URLs whose paths begin with /lxrmarketplace will be crawled by all robots.

Disallow: Disallow crawling of a particular path

Ex:

Input: Allow/Disallow is "Disallow", User agent is "Googlebot", Directory or File is "/xyz"

Output:

User-agent: Googlebot
Disallow: /xyz

That means URLs whose paths begin with /xyz won't be crawled by Googlebot.

Step 3: Click on the "Get Result" button to view the output. You can also download the results into a text file by clicking the "Download" button.
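
For instance, combining the two example inputs above (plus a sitemap URL, if you entered one) would produce a downloaded file along these lines:

User-agent: *
Allow: /lxrmarketplace

User-agent: Googlebot
Disallow: /xyz

Sitemap: http://www.xyz.com/sitemap.xml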


