Cloaking is showing search engines different content than human visitors see. To use cloaking effectively you need to detect whether a visitor is a searchbot.
In this post I will try to explain the process of implementing a cloaking script.
User agent or IP-based detection?
I will always recommend IP-based search engine detection: user-agent-based cloaking is too easy for search engines to detect, since they can simply re-crawl your pages with a normal browser user agent and compare the results. It is very important to have an up-to-date IP database! I only work with the best IP database providers and always re-synchronise to keep up to date.
Search engine IP databases
Depending on your programming knowledge you can use several IP database providers that always have an up-to-date database. I've only listed the ones I can recommend. These list every engine by name and bot purpose.
Fantomaster offers several ways to help you detect search engine spiders. The SpiderSpy service provides both database access and scripting help to make it easy to keep your database up to date.
Price: $258 per year
IP-delivery.com offers a quicker script than Fantomaster and a guarantee of staying up to date, but at almost four times the price I've never tried it. Some friends swear it is the best.
Price: $995 per year
- Reverse DNS lookup
You can also check every IP address that visits you and see whether it belongs to a Google domain. Google has published an explanation of how to verify its crawlers: do a reverse DNS lookup on the IP, check that the hostname ends in googlebot.com or google.com, then do a forward DNS lookup on that hostname and confirm it resolves back to the same IP.
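A rough sketch of that check in PHP (the function name is my own, and a real cloaking script should cache the result, since doing two DNS queries on every request is slow):

<?php
// Sketch: verify that an IP really belongs to Googlebot using
// forward-confirmed reverse DNS. Function name is my own invention.
function is_googlebot($ip)
{
    // Reverse lookup: gethostbyaddr() returns the IP unchanged (or false) on failure.
    $host = gethostbyaddr($ip);
    if ($host === false || $host === $ip) {
        return false;
    }
    // The hostname must end in googlebot.com or google.com.
    if (!preg_match('/\.(googlebot|google)\.com$/i', $host)) {
        return false;
    }
    // Forward lookup: the hostname must resolve back to the original IP.
    return gethostbyname($host) === $ip;
}
?>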
Using default cloaking scripts
Fantomaster and IP-delivery.com provide free scripts with their service, and you can easily wrap them in a function that returns a simple yes or no. You can check whether the visitor is a search engine, and you can also detect each search engine separately; then you can program the different content you want to show them.
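As an illustration only (the providers' bundled scripts look different), a wrapper around a locally exported spider IP list could be as simple as this; the file name and function name are assumptions:

<?php
// Sketch: look up the visitor's IP in a locally exported spider IP list
// (one IP per line). File name and function name are assumptions.
function is_searchbot($ip)
{
    $spiderIps = file('spider_ips.txt', FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
    if ($spiderIps === false) {
        return false; // play it safe: treat the visitor as human if the list is missing
    }
    return in_array($ip, $spiderIps, true);
}

if (is_searchbot($_SERVER['REMOTE_ADDR'])) {
    include 'content_for_spiders.php';   // the version you want indexed
} else {
    include 'content_for_visitors.php';  // the version human visitors see
}
?>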
My favourite way to use Fantomaster SpiderSpy is by turning it into a RewriteMap for mod_rewrite on Apache servers. Use the following steps to generate and use the RewriteMap:
- Create an empty text file that PHP is allowed to write to (for instance rewritemap.txt).
- Customize a PHP script that downloads the SpiderSpy database and writes it into that file, along the lines of the sketch below.
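This is only a sketch (saved here as update_rewritemap.php, a name of my own choosing); the real download URL, credentials and export format come from your SpiderSpy account, so adjust those parts:

<?php
// update_rewritemap.php (hypothetical name): rebuild the RewriteMap text file
// from the spider IP database. The download URL and export format are assumptions.
$source = 'https://example.com/your-spiderspy-export.txt'; // replace with your account's download URL
$target = '/var/www/example.com/rewritemap.txt';           // the file created in the previous step

// Requires allow_url_fopen; assumes one IP or hostname per line in the export.
$entries = file($source, FILE_IGNORE_NEW_LINES | FILE_SKIP_EMPTY_LINES);
if ($entries === false) {
    exit("Could not download the spider database\n");
}

$map = '';
foreach ($entries as $entry) {
    // RewriteMap text format is "key value" per line: every spider IP/hostname maps to "spider".
    $map .= trim($entry) . " spider\n";
}
file_put_contents($target, $map);
?>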
- After running this script you should have a text file with IP addresses and hostnames, each followed by "spider". This text file must be declared in your httpd.conf (server-wide) or vhost.conf (just for this domain) before you can use it in a RewriteCond; the problem with shared hosting is that you probably cannot edit these files yourself. The conf file should contain a line like the following:
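The map name (allBots) and the path below are assumptions; point txt: at the file your script writes:

RewriteMap allBots txt:/var/www/example.com/rewritemap.txt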
- Then you can reference allBots from a rewrite condition in your .htaccess file:
RewriteEngine On
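# Sketch: the lookup default and the target URL are assumptions, adjust to your setup.
# If the visitor's IP is a key in the allBots map (value "spider"), rewrite the request to otherurl.php.
RewriteCond %{REQUEST_URI} !^/otherurl\.php$
RewriteCond ${allBots:%{REMOTE_ADDR}|human} ^spider$
RewriteRule .* /otherurl.php [L]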
The rule above redirects any request from a searchbot to otherurl.php.
- Update the list every 12 hours by running the PHP script you generated from a cronjob.
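A crontab entry along these lines (the script path is an assumption) refreshes the map twice a day:

0 */12 * * * php /var/www/example.com/update_rewritemap.php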
- And you’re done!
You can probably think of many ways to use cloaking to your advantage.
Whatever you use cloaking for, use it wisely. My tip of the day: always keep your IP database up to date! If less legitimate uses of cloaking are detected, they will be punished and your site will probably be removed from the index.