
Setting Up a Robots.txt File

Magento Go lets you use a robots.txt file to control which pages of your site are indexed by search engines. This article explains what robots.txt files do, and how to generate and customize one from the Admin panel of your Magento Go store.

What Is a robots.txt File?

Robots.txt, often pronounced “robots dot text,” is a text file that search engine robots check whenever they visit your web site. Search engine robots, or search engine “bots,” are automated programs that crawl and index web content, including your store. The robots.txt file is the de facto standard for controlling which information on a web site gets indexed.

When a search engine robot visits your web site, it first looks for a robots.txt file. If one is found, the robot follows the instructions in the file. For example, the following instructions tell all visiting search engine robots not to visit or index any pages on your web site. The first line, “User-agent: *”, tells the robot that the rules that follow apply to all robots. The second line, “Disallow: /”, tells the robot not to visit any page on the site.


Example: robots.txt
User-agent: *
Disallow: /

It should be noted that some robots completely ignore the rules defined in a robots.txt file, especially harmful bots that scour the web looking for vulnerabilities or contact information. When configuring robots.txt, however, we’re only concerned with search engine robots, the ones that follow the rules and play nice! With that said, let’s take a look at how you can use robots.txt in your Magento Go store.
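To see how a rule-following bot interprets these instructions, here is a short sketch using Python’s standard urllib.robotparser module. The store URL is a placeholder, not a real address:

```python
# Sketch: how a compliant bot evaluates the "block everything" rules
# shown above, using Python's standard library robots.txt parser.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

# With "Disallow: /", no page may be crawled by any bot.
print(parser.can_fetch("*", "http://example-store.com/"))          # False
print(parser.can_fetch("*", "http://example-store.com/catalog/"))  # False
```

A compliant search engine bot performs essentially this check for every URL before requesting it.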

How to Generate a robots.txt File

Your Magento Go store already has a robots.txt file that is set to “INDEX, FOLLOW.” This default setting directs all search engine bots to index all pages and follow links. It is generally in your best interest to have your content indexed by search engines—especially your product pages.

To modify the instructions in your robots.txt file from the Admin panel of your Magento Go store, go to System > Configuration > Design > Search Engine Robots. The Default Robots option can be set to one of the following:

  • NOINDEX, FOLLOW: Pages are not indexed, but search engine bots are allowed to follow links from applicable pages.*
  • INDEX, NOFOLLOW: Pages are indexed, but search engine bots do not follow links.
  • NOINDEX, NOFOLLOW: Pages are not indexed, and search engine bots do not follow links.

Note: Applicable pages are those that are not excluded using the “Disallow:” directive of robots.txt.
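For background, these directive values follow the standard robots meta tag convention used across the web. Purely as an illustration (this is not necessarily the exact markup Magento Go generates), a page carrying the NOINDEX, FOLLOW directive would include a tag like this in its HTML head:

Example: robots meta tag (illustrative only)
<meta name="robots" content="NOINDEX,FOLLOW"/>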

Creating a Custom robots.txt File

If you need more control over which pages and directories search engine bots can index and follow, do the following:

  1. From the Admin panel, select System > Configuration > Design.
  2. Click to expand the Search Engine Robots section.
  3. Set Default Robots to Custom Instructions.
  4. Complete the Edit custom instructions of robots.txt File field as follows:
    1. Specify whether the instructions apply to all bots or only to specific ones.
      • To apply to all bots, type:
        User-agent: *
      • To apply to specific bots, type the bot name, such as:
        User-agent: googlebot
    2. List the files or directories that you want to prevent the bots from indexing by listing them in the following pattern:
      • To disallow a directory, type:
        Disallow: /folder/
      • To disallow a file, type:
        Disallow: /folder/filename.html
  5. Set Reset to Default to No.
  6. Click the Save Changes button to save the new configuration.
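The custom instructions entered in step 4 can combine several User-agent groups in one file. A hypothetical example (the bot name and paths are placeholders) that applies one rule to all bots and stricter rules only to Googlebot:

Example: combined custom instructions
User-agent: *
Disallow: /customer/

User-agent: googlebot
Disallow: /customer/
Disallow: /media/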

Example

Imagine that you have a store with a set of pages and folders that you don’t want any search engine bots to index.

The pages and folders to exclude are:

  • /terms-of-service.html
  • /special-offers.html
  • /referral-discounts.html
  • /customer/
  • /review/
  • /media/

Here’s what the instructions would look like:

Example Robots.txt

User-agent: *
Disallow: /terms-of-service.html
Disallow: /special-offers.html
Disallow: /referral-discounts.html
Disallow: /customer/
Disallow: /review/
Disallow: /media/
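If you want to sanity-check a rule set like this before saving it, one approach is to run it through Python’s standard urllib.robotparser locally. The store URL below is a placeholder:

```python
# Sketch: verify the example rules behave as intended before saving
# them to the store, using Python's standard robots.txt parser.
from urllib import robotparser

rules = """\
User-agent: *
Disallow: /terms-of-service.html
Disallow: /special-offers.html
Disallow: /referral-discounts.html
Disallow: /customer/
Disallow: /review/
Disallow: /media/
"""

parser = robotparser.RobotFileParser()
parser.parse(rules.splitlines())

base = "http://example-store.com"  # placeholder store URL

# Pages under a disallowed folder are blocked; other pages remain crawlable.
print(parser.can_fetch("*", base + "/customer/account/"))   # False
print(parser.can_fetch("*", base + "/some-product.html"))   # True
```

Note that a Disallow rule for a folder such as /customer/ covers every path beneath it.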

And that’s it! Make sure to save your changes so that they take effect. To preview your robots.txt file, go to [your-store-name].gostorego.com/robots.txt and verify that it reflects the customizations you made.

See Also:

Setting Up a Google Sitemap
