VOLUME 3b. MAY 9, 1999 - Advertising on the Internet:
Search Engines: Rating - Popularity test and compromises.
By Ian Clayton
It pays to be popular. Search engines keep tabs on who links to whom on the Internet and use this to rate sites. If high-profile sites like CNN link to you, that is good for business and for SE listing. But if no one links to you, you could be in trouble.
This article looks at which SEs use popularity screening and how they use it to rate a site. We also look at related issues and the compromises involved in getting good ratings.
Popularity boosts relevancy, but no popularity may mean no listing:
Some SEs give a relevancy boost to a popular site, putting it ahead of other sites that may be similar in all other respects (AltaVista, Excite, Google and Infoseek boost the relevancy rating based on popularity).
Some SEs also use the popularity index to decide whether a site should be included in their index at all (Excite, Yahoo, HotBot, Snap, MSN and Lycos). If a site is new and has no links to it, it could be in big trouble with these SEs. In that case you will need to bolster your links by submitting your site to others, particularly well-known sites like USA Today, CNN and AOL. Links from these and other major portals will improve SE listing; in fact, they are becoming as important as the SEs themselves. (Barbados.org gets approx. 50% of its business from SE referrals; the other 50% comes from other referrals and direct hits (bookmarks). A year ago SEs accounted for 80% of referrals. We will look at this trend in a future issue.)
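For the technically minded reader, here is a rough sketch of the idea, written in Python. It is not any SE's actual formula; the one-link threshold, the weight and the logarithm are our own inventions, chosen only to show how a single popularity count can both gate the listing and boost the relevancy score:

# Illustrative only: a toy ranker that (a) skips sites with too few
# inbound links and (b) boosts relevancy in proportion to popularity.
import math

MIN_LINKS_TO_INDEX = 1      # assumed threshold: unlinked sites are not indexed
POPULARITY_WEIGHT = 0.25    # assumed weight of popularity vs. keyword relevancy

def score(base_relevancy, inbound_links):
    """Return None if the site would not be indexed, else a boosted score."""
    if inbound_links < MIN_LINKS_TO_INDEX:
        return None                                  # no links -> no listing
    boost = POPULARITY_WEIGHT * math.log(1 + inbound_links)
    return base_relevancy * (1 + boost)

# Two sites with identical keyword relevancy: the better-linked one ranks higher.
print(score(0.80, 120))   # popular site
print(score(0.80, 3))     # little-known site
print(score(0.80, 0))     # unlinked new site -> None (not indexed)

The point to notice is that two sites with identical keyword relevancy separate purely on their link counts, and the unlinked site never makes it into the index at all.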
SEs are also beginning to measure which of the pages they list are clicked on most often (direct hits). Sites with the most direct hits get a relevancy boost. (HotBot currently uses this measurement to rate pages.) There is nothing you can do about this other than make the title and description of the page compelling. But you have to be in the top 20 of the list or direct hits will not count, so you need to review your keywords, content and popularity first in order to get listed that high. (See the previous article on keywords.)
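The same kind of sketch shows why direct hits only help pages that are already listed high. In this toy example (the top-20 cutoff and the boost per click are our own assumptions, not HotBot's real numbers) only results inside the top 20 accumulate clicks at all:

# Illustrative only: re-rank the top 20 results by recorded clicks ("direct hits").
TOP_N = 20           # assumed cutoff: clicks are only counted for listed results
CLICK_WEIGHT = 0.01  # assumed relevancy boost per recorded click

def rerank(results, clicks):
    """results: list of (url, relevancy); clicks: dict of url -> click count."""
    top, rest = results[:TOP_N], results[TOP_N:]
    boosted = [(url, rel + CLICK_WEIGHT * clicks.get(url, 0)) for url, rel in top]
    boosted.sort(key=lambda pair: pair[1], reverse=True)
    return boosted + rest   # pages outside the top 20 never see a boost

results = [("barbados.org/hotels", 0.74), ("example.com/hotels", 0.75)]
print(rerank(results, {"barbados.org/hotels": 15}))

A page that never reaches the top 20 collects no clicks, which is why keywords and popularity have to be right before direct hits can do anything for you.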
Very popular sites get special attention and may get the deep crawl treatment. Deep crawlers follow the links on the submitted page and index other pages of the site. The technique also evaluates the size and content of the site. Large sites with relevant content will do better than smaller sites, all other things being equal, but deep crawls will also pick up duplicate pages, and sometimes too many pages, and that is not good. (AltaVista, Yahoo, HotBot, Snap, MSN and Northern Light do deep crawls. When their databases become full they will purge sites to make room for new ones.)
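Conceptually, a deep crawl is nothing more mysterious than this: start from the submitted page, pull out its links, and keep indexing pages on the same site until some limit is reached. The short Python sketch below (the 25-page limit, the same-host rule and the use of the standard library are our own simplifications) gives the flavour:

# Illustrative only: a tiny same-site "deep crawl" starting from a submitted page.
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def deep_crawl(start_url, max_pages=25):
    host = urlparse(start_url).netloc
    queue, seen = [start_url], set()
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen or urlparse(url).netloc != host:
            continue                      # stay on the submitted site
        seen.add(url)
        try:
            html = urlopen(url, timeout=10).read().decode("utf-8", "replace")
        except OSError:
            continue
        parser = LinkCollector()
        parser.feed(html)
        queue.extend(urljoin(url, link) for link in parser.links)
    return seen                           # the pages the crawler would index

# print(deep_crawl("http://barbados.org/"))

A real crawler also handles robots.txt, duplicate URLs and much more, but the max_pages limit is why very large sites can end up only partly indexed.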
Duplicating pages:
It is sometimes useful to copy and link the same content (slightly modified) to different channels. You might, for example, run a promotion for different travel groups and want to measure the effect of each group. We did this with the Barbados Offers Summer Savings (BOSS) program in 1998. It allowed us to measure immediately the response from our advertisements with Travelocity, AltaVista, etc. It is a useful technique, but one that must be handled with great care, as many robots know that a duplicate page is a "clever dick's" way of boosting relevancy. Some SEs now react by refusing to index pages that are the same or substantially the same, and if the robot thinks the site is spamming they may drop it altogether. It may not be entirely fair, but those are the rules they go by.
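A robot does not need anything clever to catch this. The sketch below compares two pages by the overlap of their five-word sequences; the shingle size and the 90% cutoff are arbitrary figures for illustration, but a "slightly modified" copy sails well past any such threshold:

# Illustrative only: flag two pages as near-duplicates by word-shingle overlap.
def shingles(text, size=5):
    words = text.lower().split()
    return {tuple(words[i:i + size]) for i in range(max(1, len(words) - size + 1))}

def substantially_same(page_a, page_b, threshold=0.9):
    a, b = shingles(page_a), shingles(page_b)
    if not a or not b:
        return False
    overlap = len(a & b) / min(len(a), len(b))
    return overlap >= threshold     # above the cutoff, index only one copy

original = "Barbados Offers Summer Savings on hotels, villas and apartments island wide."
copy = original + " (Travelocity)"
print(substantially_same(original, copy))   # True: a robot would treat these as duplicates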
Large, popular sites have another problem: too many pages get indexed. This happened to barbados.org with many SEs in the early days. Some SEs, like Excite, dealt with the problem by limiting the number of pages any site could have in their index (Excite has a maximum of 25 pages and, it seems, gives preference to sites not already indexed). Others have rules that limit any site to a single page within the top ratings. None of these rules are absolute, and they are constantly being amended.
Changes help boost relevancy: If two pages have an equal rating, some SEs will favour the more current one. Some SEs also monitor how often pages change (change frequency). Pages that change will be visited more frequently and indexed again. (AltaVista and Infoseek measure change frequency.) This is both good and bad: you have to keep the pages current, but if you already have the best rating you may not want to change things.
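One way to picture change-frequency monitoring: the robot keeps a fingerprint of each page from its last visit, and pages whose fingerprints keep changing get revisited sooner. The intervals and the MD5 fingerprint below are our own assumptions, not how AltaVista or Infoseek actually schedule their revisits:

# Illustrative only: revisit pages that change often more frequently.
import hashlib

class ChangeTracker:
    def __init__(self):
        self.fingerprints = {}   # url -> hash of page content at last visit
        self.interval_days = {}  # url -> how long to wait before the next visit

    def record_visit(self, url, content):
        digest = hashlib.md5(content.encode("utf-8")).hexdigest()
        previous = self.fingerprints.get(url)
        self.fingerprints[url] = digest
        if previous is None:
            self.interval_days[url] = 14                                    # first visit: middling interval
        elif previous != digest:
            self.interval_days[url] = max(2, self.interval_days[url] // 2)  # changed: come back sooner
        else:
            self.interval_days[url] = min(30, self.interval_days[url] * 2)  # stable: wait longer
        return self.interval_days[url]

tracker = ChangeTracker()
tracker.record_visit("barbados.org/specials", "summer rates")          # 14 days
print(tracker.record_visit("barbados.org/specials", "autumn rates"))   # 7 days: changed, so revisit sooner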
Remember, however, that on the Internet change is inevitable and necessary.
TO BE CONTINUED:
Next week we will look at Hypertext, JavaScript, Frames and Databases and their effect on SE ratings.
PREVIOUS:
INTERNET MARKETING NEWSLETTER AND TUTORIAL
--------------------------------------------------------------------------------------
Internet Marketing - Intro (Mar 8)
Vol 1. Travel on the Net (Mar 12)
Vol 2a. Promotion on the Net (Mar 21)
Vol 2b. Features of the Net (Mar 29)
Vol 2c. The New Marketing Mix (Apr 8)
Vol 3. Internet Advertising - SE (Apr 20)
Vol 3a. Keywords (May 2)