Introduction
This tutorial shows you how to create your own rules for the web search
robot crawlers that index your website and its pages for search engines
such as Google.
Instructions
- Open Notepad, SimpleText, or any other plain text editor and create
a new file named "robots.txt". (The file is later uploaded to the root
folder of your website so that crawlers can find it.)
- Insert these lines into the new document, replacing the example
folder paths with paths from your own site.
User-agent: *
Disallow: /some_private_folder/
Allow: /a_public_folder/
- Copy the line "Disallow: /some_private_folder/" as
many times as you need and replace the "/some_private_folder/"
with your own folders that you do not want the robot crawler to
go into.
- Copy the line "Allow: /a_public_folder/" as many times
as you need and replace the "/a_public_folder/" with
your own folders that you want the robot crawler to go into.
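Putting the steps together, a finished robots.txt for a site with two private folders and two public folders might look like the lines below. The folder names are placeholders only, not folders from this tutorial; substitute your own.

User-agent: *
# example folders only - replace with your own
Disallow: /cgi-bin/
Disallow: /drafts/
Allow: /blog/
Allow: /images/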
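If you want to double-check your rules before uploading the file, Python's standard urllib.robotparser module can evaluate them for you. The sketch below is only an illustration: it parses the example rules from this tutorial and reports whether a crawler may fetch a page inside each folder. The crawler name "Googlebot" and the page names are made up for the test.

from urllib.robotparser import RobotFileParser

# The example rules from this tutorial; in practice you would load
# the contents of your own robots.txt file instead.
rules = [
    "User-agent: *",
    "Disallow: /some_private_folder/",
    "Allow: /a_public_folder/",
]

parser = RobotFileParser()
parser.parse(rules)

# can_fetch() reports whether a crawler with the given name may
# request the given path under the parsed rules. The paths below
# are hypothetical pages used only to exercise the rules.
print(parser.can_fetch("Googlebot", "/some_private_folder/page.html"))  # False
print(parser.can_fetch("Googlebot", "/a_public_folder/page.html"))      # True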