Documentation
CHAMELEON: V1.0
System Requirements
- Unix, Linux, or NT operating system
- Perl 5
- Ability to use htaccess
- Ability to run CGI scripts on your host outside of the
cgi-bin
Preliminaries
- Determine the path to PERL 5 on your web
server host. Note that some web hosting companies run both PERL 4 and PERL 5.
Make ABSOLUTELY sure you are not setting this up under PERL 4. Ask your
administrator if you are not sure.
- Download the tarfile for this program and
save it to your desktop.
- Unpack the tar archive on your desktop using a
program that unpacks UNIX TAR ARCHIVES. If you don't have such a program, download
WINZIP FREE from SHAREWARE.COM.
- After you have unpacked the TAR archive you
will have a collection of folders and files on your desktop. Now you have to do some
basic editing of each of these files (or at least some of them). Use a text editor
such as WordPad, Notepad, BBEdit, SimpleText, or TeachText to edit the files. These
are NOT WORD PROCESSOR DOCUMENTS, they are just simple TEXT files, so don't save them as
word processor documents or save them with extensions such as .txt or they will NOT WORK.
Note that there may be some files inside of folders which are "blank".
This is normal.
Preparing the CGI scripts
Define Path To PERL 5
The first step is to open up each and every
file that has a .cgi extension and edit line number one of each script. Each of the
CGI scripts is written in perl 5. For your scripts to run they must know where perl 5 is
installed on your web server. The path to perl 5 is defined to a CGI script in the first
line of the file. In each of the CGI scripts the first line of code looks something like
this:
#!/usr/bin/perl
If the path to perl 5 on your web server is
different from /usr/bin/perl you must edit the first line of each cgi script to reflect
the correct path. If the path to perl 5 is the same no changes are necessary. If you do
not know the path to perl 5 ask the webmaster or system administrator at your server site.
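If you have shell access, the shebang edit described above can be scripted rather than done file by file. A minimal sketch, assuming GNU sed and that your host's perl 5 lives at /usr/local/bin/perl (an example path only, substitute your own):

```shell
# Replace line 1 of every .cgi script with the correct perl 5 path.
# /usr/local/bin/perl is an ASSUMED example path -- use the path your host reports.
for f in *.cgi; do
  sed -i '1s|^#!.*|#!/usr/local/bin/perl|' "$f"
done
```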
File Locations
The following files will be uploaded into a
SPECIAL HTML DIRECTORY which htaccess will control:
- index.htm
- normal.html
- robot.list
- robot.html
File Access Permissions
File access permissions must be set correctly
for this program to run. The table below lists the permissions for each file, which are
set with the unix command chmod. You must set the access permissions for each of
these files.
CHMOD 755:
- index.htm

CHMOD 777:
- normal.html
- robot.list
- robot.html
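From a shell (or via your FTP client's permissions dialog), the settings in the table above amount to:

```shell
# The CGI script must be executable by the web server:
chmod 755 index.htm
# The pages and the robot list get full permissions per the table above:
chmod 777 normal.html robot.list robot.html
```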
What each file does
- normal.html is the webpage displayed to the world at large
- robot.html is the webpage displayed to any spider indexing your site
- index.htm is the CGI script which is DISGUISED as a webpage. This script feeds either
robot.html or normal.html to the person or robot requesting index.htm
- note that robot.html is the page whose content
most spiders will actually index for your site
- index.htm is the page you will submit to the
search engines.
Using htaccess to disguise CGIs as webpages
- The dot.htaccess file should be renamed to
.htaccess and placed inside the same directory as your webpages. This file tells your
web server that any file with the extension .htm is ACTUALLY a CGI script. Any file with
the extension .html is still treated as a normal webpage.
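The exact contents of dot.htaccess ship with the program, but on an Apache server the directives that produce this behavior look something like the following sketch (an assumed illustration, not the shipped file):

```apache
# Treat .htm files as CGI scripts; .html files remain plain pages.
Options +ExecCGI
AddHandler cgi-script .htm
```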
- How it works
- When the file index.htm is requested by a
websurfer or a robot, the CGI SCRIPT contained inside index.htm is actually EXECUTED.
index.htm then figures out who is requesting the webpage. If it is a robot then the
webpage robot.html is served. If the request is made by anyone else the script displays
the file normal.html
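The decision index.htm makes can be pictured with a short perl 5 sketch. This is illustrative only, NOT the shipped CHAMELEON source; it assumes robot.list holds one user-agent substring per line:

```perl
#!/usr/bin/perl
# Illustrative sketch only -- not the actual CHAMELEON code.
# Serve robot.html to known spiders, normal.html to everyone else,
# based on the HTTP_USER_AGENT the web server passes to the script.

my $agent = $ENV{'HTTP_USER_AGENT'} || '';
my $page  = 'normal.html';

open(LIST, 'robot.list') or die "cannot open robot.list: $!";
while (my $bot = <LIST>) {
    chomp $bot;
    next unless $bot;
    # Case-insensitive substring match against the visitor's user agent.
    if (index(lc $agent, lc $bot) >= 0) {
        $page = 'robot.html';
        last;
    }
}
close LIST;

print "Content-type: text/html\n\n";
open(PAGE, $page) or die "cannot open $page: $!";
print while <PAGE>;
close PAGE;
```

Real spiders can also be identified by reverse DNS lookup of the requesting IP; a user-agent check like the one sketched here is the simplest approach.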