Robots.txt

Introduction

This tutorial shows you how to create a robots.txt file: a set of rules for web search crawlers that tells search engines such as Google which parts of your website and webpages they may visit and index.

Instructions

  1. Open Notepad, SimpleText, or any other plain-text editor and create a new file named "robots.txt". (Do not use a word processor; robots.txt must be plain text.)

  2. Insert these lines into the new document, replacing the example folder paths with paths from your own site.
    User-agent: *
    Disallow: /some_private_folder/
    Allow: /a_public_folder/
  3. Copy the line "Disallow: /some_private_folder/" as many times as you need, replacing "/some_private_folder/" with each folder you do not want the robot crawler to go into.

  4. Copy the line "Allow: /a_public_folder/" as many times as you need, replacing "/a_public_folder/" with each folder you want the robot crawler to go into.

  5. Upload the finished robots.txt file to the root of your website (for example, http://www.yoursite.com/robots.txt). Crawlers only look for the file at that location.
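Before uploading, you can sanity-check your rules locally. The sketch below uses Python's standard-library `urllib.robotparser` to parse the example rules from step 2 and ask which URLs a crawler would be permitted to fetch; the domain `example.com` and the page names are placeholders, not part of the tutorial.

```python
from urllib import robotparser

# The example rules from step 2 of the tutorial.
rules = """\
User-agent: *
Disallow: /some_private_folder/
Allow: /a_public_folder/
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# can_fetch(user_agent, url) reports whether the rules permit a crawl.
print(rp.can_fetch("*", "https://example.com/some_private_folder/page.html"))  # disallowed folder
print(rp.can_fetch("*", "https://example.com/a_public_folder/page.html"))      # allowed folder
```

Running this prints False for the private folder and True for the public one, confirming the rules behave as intended before you put them live.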

© 2002-2009 Some Rights Reserved