Dotcomology by Stone Evans, Joseph Costa - HTML preview


3. Secrets of Winning Traffic through Search Engines


When I first started selling online, the first major problem I ran into was bringing customers to my door. I put banner ads on other sites, organized reciprocal links and joined web rings. Those methods all worked to some extent, but what really did it for me, what turned my business from a small earner into a major money-grabber, was figuring out how to use search engines.

Sure, I’d submitted my sites to the major search engines as soon as I’d finished building them, but I didn’t really pay them much attention. After all, I figured search engines are just for people who are looking for information; they’re not really good for commercial sites.

Boy, was I wrong!

One day, I sat down and checked out which sites were popping up first in the categories that suited my businesses. I found that all the top-ranked sites were my biggest competitors. And when I say biggest, I mean these guys were in a whole other league. They had incomes that were ten or twenty times the size of mine. No wonder they had top billing at Yahoo! and Google! And then it clicked. Search engines don’t list sites by size, they list them by relevance. These sites weren’t listed first because they were big; they were big because they were listed first!

That was when I began to optimize my pages and think about metatags and keywords. As my sites rose through the listings, my traffic went through the roof. And not just any old traffic! The people that came to my sites from search engines hadn’t just clicked on a banner by accident or followed a link from curiosity, they’d actually been looking for a site like mine. My sales ratio went up like a rocket. I’d created my own big break.

In this chapter, we are going to discuss proven strategies of search engine optimization: how to optimize your site, submit your pages and pick up the targeted traffic you need to make cash. This is one of the most important chapters in the whole book, so it’s crucial that you read it carefully.

Let’s start with search engines.

3.1 How Search Engines Work

Internet search engines are special sites on the Internet that are designed to help people find information stored on other sites. There are differences in the ways various search engines work, but they all perform three basic tasks:

  • They search the Internet (or select pieces of the Internet) based on important words.
  • They keep an index of the words they find, and where they find them.
  • They allow users to look for words or combinations of words found in that index.

Early search engines held an index of a few hundred thousand pages and documents, and received maybe one or two thousand inquiries each day. Today, a top search engine will index hundreds of millions of pages, and respond to tens of millions of queries per day.

Spidering

Before a search engine can tell you where a file or document is, it must be found. To find information on the hundreds of millions of web pages that exist, a search engine employs special software robots, called spiders, to build lists of the words found on websites.

When a spider is building its lists, the process is called crawling.

In order to build and maintain a useful list of words, a search engine's spiders have to look at a lot of pages. How does any spider start its travels over the web? The usual starting points are lists of heavily used servers and very popular pages. The spider will begin with a popular site, indexing the words on its pages and following every link found within the site. In this way, the spidering system quickly begins to travel, spreading out across the most widely used portions of the web.
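The crawl-from-popular-seeds process described above amounts to a breadth-first traversal of the link graph. Here is a minimal sketch over a tiny in-memory "web"; the site names and links are invented for illustration, and a real spider would fetch live pages over HTTP and respect robots.txt.

```python
from collections import deque

# A toy "web": page URL -> (words on the page, links out of the page).
# Purely illustrative data; a real spider would fetch live pages.
WEB = {
    "popular.com": (["flowers", "roses"], ["shop.com", "blog.com"]),
    "shop.com":    (["roses", "cheap"],   ["popular.com"]),
    "blog.com":    (["flowers", "tips"],  []),
}

def crawl(seed):
    """Breadth-first crawl starting from a heavily used seed page."""
    seen, queue, word_lists = set(), deque([seed]), {}
    while queue:
        url = queue.popleft()
        if url in seen or url not in WEB:
            continue
        seen.add(url)
        words, links = WEB[url]
        word_lists[url] = words   # record the words found on this page
        queue.extend(links)       # follow every link found within the site
    return word_lists

pages = crawl("popular.com")
```

Starting from the one popular seed, the spider reaches every page the seed links to, directly or indirectly, which is exactly how the spidering system "spreads out" across the web.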

Indexing

Once the spiders have completed the task of finding information on web pages, the search engine must store the information in a way that makes it useful. There are two key components involved in making the gathered data accessible to users:

  • The information stored with the data
  • The method by which the information is indexed

In the simplest case, a search engine could just store the word and the URL where it was found. In reality, this would make for an engine of limited use, since there would be no way of telling whether the word was used in an important or a trivial way on the page, whether the word was used once or many times or whether the page contained links to other pages containing the word. In other words, there would be no way of building the ranking list that tries to present the most useful pages at the top of the list of search results.

To make for more useful results, most search engines store more than just the word and URL. An engine might store the number of times that the word appears on a page. The engine might assign a weight to each entry, with increasing values assigned to words as they appear near the top of the document, in sub-headings, in links, in the meta tags or in the title of the page. Each commercial search engine has a different formula for assigning weight to the words in its index. This is one of the reasons that a search for the same word on different search engines will produce different lists, with the pages presented in different orders.
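A minimal version of such a weighted index might look like the sketch below. The specific weights (title words counting three times a body word) are an assumption for illustration, not any engine's real formula.

```python
def build_index(pages):
    """Inverted index: word -> {url: weight}.
    Title words get a higher (assumed) weight than body words."""
    index = {}

    def add(word, url, weight):
        entry = index.setdefault(word, {})
        entry[url] = entry.get(url, 0) + weight

    for url, page in pages.items():
        for word in page["title"].lower().split():
            add(word, url, 3)   # assumed extra weight for title words
        for word in page["body"].lower().split():
            add(word, url, 1)   # each body occurrence adds 1

    return index

# Invented pages for illustration.
pages = {
    "a.com": {"title": "cheap roses", "body": "roses delivered fast"},
    "b.com": {"title": "garden tips", "body": "roses need sun"},
}
index = build_index(pages)
```

Here "roses" scores 4 on a.com (title plus body) but only 1 on b.com (body only), which is the kind of distinction a bare word-and-URL list cannot make.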

An index has a single purpose: It allows information to be found as quickly as possible. There are quite a few ways for an index to be built, but one of the most effective ways is to build a hash table. In hashing, a formula is applied to attach a numerical value to each word.

The formula is designed to evenly distribute the entries across a predetermined number of divisions. This numerical distribution is different from the distribution of words across the alphabet, and that is the key to a hash table's effectiveness.
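The idea can be demonstrated with a toy hash formula. The polynomial string hash below is a classic textbook choice, used here only to show how words spread across buckets in a way unrelated to their alphabetical order.

```python
def bucket(word, num_buckets=8):
    """Toy hash: attach a numerical value to each word, then
    fold it into a fixed number of divisions (buckets)."""
    value = 0
    for ch in word:
        value = value * 31 + ord(ch)   # classic polynomial string hash
    return value % num_buckets

words = ["apple", "apricot", "avocado", "banana", "berry"]
buckets = {w: bucket(w) for w in words}
# Words that sit together alphabetically can land in very different buckets.
```

Because the lookup jumps straight to a bucket instead of scanning an alphabetical list, retrieval time stays roughly constant no matter how large the index grows.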

The Search Engine Program

The search engine software or program is the final part. When a person requests a search on a keyword or phrase, the search engine software searches the index for relevant information. The software then provides a report back to the searcher with the most relevant web pages listed first.
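The final lookup step boils down to fetching the index entry for each query word and sorting the matching pages by combined score. This is a toy sketch with an invented index; real engines combine many more signals.

```python
# Toy index: word -> {url: weight}, as the spider/indexer might build it.
INDEX = {
    "roses":  {"a.com": 4, "b.com": 1},
    "cheap":  {"a.com": 3},
    "garden": {"b.com": 3},
}

def search(query):
    """Sum per-word weights for each page; most relevant pages first."""
    scores = {}
    for word in query.lower().split():
        for url, weight in INDEX.get(word, {}).items():
            scores[url] = scores.get(url, 0) + weight
    return sorted(scores, key=scores.get, reverse=True)

results = search("cheap roses")
```

For the query "cheap roses", a.com scores 4 + 3 = 7 against b.com's 1, so a.com is reported back to the searcher first.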

3.2 Top Search Engines

Now that we’ve studied how search engines work, the next step in any Internet marketing or search engine optimization campaign is to know exactly which search engines to target. This section discusses some of the top search engines today.

Google

Google has increased in popularity tenfold over the past several years. It went from beta testing to becoming the Internet’s largest index of web pages in a very short time. Its spider, affectionately named "Googlebot", crawls the web and provides updates to Google’s index about once a month.

Google.com began as an academic search engine. Google has, by far, one of the best algorithms for ranking the pages returned for a query, probably one of the main reasons it has become so popular over the years. Google has several methods which determine page rank in returned searches.

Yahoo!

Yahoo! is one of the oldest web directories and portals on the Internet today, and the site went live in August of 1994.

Yahoo! is also one of the largest traffic generators around, as far as web directories and search engines go. Unfortunately, however, it is also one of the most difficult to get listed in, unless of course you pay to submit your site. Even if you pay it doesn't guarantee you will get listed.

Either way, if you suggest a URL, it is "reviewed" by a Yahoo! editor and, if approved, it will appear in the next index update.

AltaVista

Many who have access to web logs may have seen a spider named "scooter" accessing their pages. Scooter used to be AltaVista's robot. However, since the Feb 2001 site update, a newer form of Scooter is now crawling the web.

It will usually take several months for AltaVista to index your entire site. Unlike Google, AltaVista will only crawl and index 1 link deep, so it can take a long time to index your entire site, depending on how large your site is.

Inktomi

Inktomi's popularity grew years ago as it powered the secondary search database behind Yahoo!. Its spiders are named "Slurp", and different versions of Slurp crawl the web many times throughout the month, as Inktomi powers many sites' search results. There isn't much more to Inktomi than that. Slurp puts heavy weight on title and description tags, and will rarely deep-crawl a site; it usually only spiders pages that are submitted to its index.

Inktomi provides results to a number of sites. Some of these are America Online, MSN, Hotbot, Looksmart, About, Goto, CNet, Geocities, NBCi, ICQ and many more.

Lycos

Lycos is one of the oldest search engines on the Internet today, next to AltaVista and Yahoo!. Its spider, named "T-Rex", crawls the web and provides updates to the Lycos index from time to time. The FAST crawler provides results for Lycos in addition to its own database.

The Lycos crawler does not weigh META tags too heavily. Instead, it relies on its own ranking algorithm to rank pages returned in results. The URL, META title, text headings, and word frequency are just a few of the methods Lycos uses to rank pages. Lycos does support pages with Frame content. However, any page that isn't at least 75 words in content is not indexed.

Excite

Excite has been around the web for many years now. Much more a portal than simply a search engine, Excite used to be a fairly popular search engine, until companies such as Google started dominating the search engine market. Recently, Excite stopped accepting submissions of URLs, and appears to no longer spider. To get into the Excite search results, you need to be listed with either Yahoo! or Inktomi.


Looksmart

Getting listed with Looksmart could mean getting a good amount of traffic to your site. Looksmart's results appear in many search engines, including AltaVista, MSN, CNN, and many others.

Looksmart has two options to submit your site. If your site is generally non-business related, you can submit your site to Zeal (Looksmart's sister site), or if you are a business, you can pay a fee to have your site listed. Either method will get you listed in Looksmart and its partner sites if you are approved.

Once your site is submitted and approved, it will take up to about 7 days for your site to be listed on Looksmart and its partner sites.

3.3 Search Engine PageRanking Algorithms

A search engine's main job is to provide the results that best satisfy a user's query. If it presents a result that the user visits and finds has nothing to do with their query, there is a very good chance that the user won't use that search engine again. That is one reason most search engines now pay little or no attention to the Meta description tags.

Meta description and keyword tags are hidden attributes added to the head of your document which are supposed to annotate and describe it. Since users never see this information, these tags have routinely been stuffed with invalid keywords, or left with descriptions that no longer match the document's contents, and search engines have learned to distrust them.

Most Search Engine page-ranking algorithms rank pages based on the following aspects:

  • Content of the website
  • Representation of content, keywords, and links on websites
  • Location and number of inward and outward links on websites
  • Relevancy of search terms as compared to the websites

Given below is a brief description of the page-ranking algorithms of some of the most popular search engines.

Google

You can submit your website to Google at the following URL: www.google.com/addurl.html. Submitting your site only makes Google aware that your page exists; it is quite possible that your pages will get crawled even if you have not submitted them. It is advisable to submit the home page along with some inside pages, in case the home page proves too slow to load or crawl. The pages that are submitted should link to the rest of your pages. Google indexes the full text that is visible on any page it crawls. It generally does not index the metatags (keywords or descriptions).

When Google lists your page in the search results, the description that is displayed is the extract of text that is around the first line where the search word appears on the page. It may thus be a good idea to write a good description of the page and build it around the most likely search term(s) and place that near the top of your page. You should remember that one sure way of getting your site listed and indexed is if there are several links that point to your site and such links appear on web pages that in turn have several other links pointing to them. The term “link popularity” is used for this. It analyzes links of the pages that it has visited and this “link analysis” helps to determine the ranking of the page.

Google uses a proprietary PageRank algorithm for determining relevance and ranking of pages in the search results. Location and frequency of the search term on your web page are no doubt factors in ranking; however, off-the-page factors such as link analysis are more important. Generally, Google provides search results based on relevancy, meaning that it returns a list of pages ranked by the number of other web pages linking to each page, as well as other mathematical algorithms.
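The publicly described outline of PageRank, links treated as votes and iterated until the scores settle, can be sketched as below. This is the textbook simplification, not Google's proprietary implementation, and the site names and link graph are invented.

```python
def pagerank(links, damping=0.85, iterations=50):
    """Textbook PageRank: each page's rank is spread evenly over
    the pages it links to, iterated until the values stabilize."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for p, outs in links.items():
            for q in outs:
                new[q] += damping * rank[p] / len(outs)
        rank = new
    return rank

# Invented link graph: b.com is linked to by both other pages.
links = {
    "a.com": ["b.com"],
    "b.com": ["c.com"],
    "c.com": ["b.com"],
}
ranks = pagerank(links)
```

In this graph b.com ends up ranked highest because two pages link to it, which matches the chapter's point: pages are big because they are listed first, and they are listed first because other pages point at them.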

Yahoo!

Yahoo! offers a human-powered directory and offers its results to visitors. The directory is supplemented by a web page index created by crawling. The directory is an important channel in the area of search engine marketing. It is popular and is used extensively by people to locate sources of information. Moreover, the directory is a valuable boost to your site for crawling and ranking in other search engines, as the directory provides a high-quality link to your website.

When a visitor is looking for information on relevant sites, she could either browse through the hierarchy of directories and sub-directories or search for an appropriate directory through a search interface. As your site can be listed in just one category, the choice of category is generally an important step: choose the category that your target visitor is most likely to select when searching.

Select your target keywords and find out which categories relate to those keywords. For non-commercial sites, the Yahoo! Express submission is recommended rather than the Standard submission option.

The results page in your chosen category will list your site in one of two possible sections (for most categories). One section, called "Most Popular Sites", appears on top, while the second section contains the remaining listings in alphabetical order.

Yahoo! does not reveal how it includes certain sites in the “Most Popular Sites” list. However, link analysis and click-throughs are likely to be factors. You cannot pay to be included in this section. Certain sites with sunglasses shown next to their name or an “@” symbol shown at the end of the name reflect that Yahoo! considers those sites to be excellent.

Inktomi (MSN Search, AOL Search, Hotbot)

Inktomi is a search engine that does not offer its search services through its own site, but through Partner sites – prominent ones being MSN Search, AOL Search, HotBot and others.

Inktomi, through its crawler, creates three different indexes. The “Best of the Web” index has around 110 million pages that it indexes on the web and considers high in link analysis. The next set of around 390 million pages is indexed as “Rest of the Web”, considered lower in link analysis. The third index is of paid inclusion. It also offers specialized regional indexes as well as targeted news, multimedia and directory indexes. It avoids duplication of the same page in more than one index. Link crawling and paid inclusion are the two most effective ways to get covered by crawling. For bulk submissions to its paid program, it offers IndexConnect (for 1000 or more pages). Again, there is a cost-per-click basis with a monthly minimum.

Ranking at Inktomi is determined by a combination of factors including HTML links, keywords and description tags near the top of the page or in the title tag. If the search string matches with what is found at these places on the page, the ranking is higher. Link analysis and analysis of clickthroughs are other important criteria that it adopts.

AltaVista

AltaVista will accept free listings through its “addurl” link, but it also has paid inclusion features. Generally, their crawler may visit every four weeks. Paid inclusion may be desirable if you have a new website or web pages or if you alter your web pages frequently, and you do not wish to wait until the next cycle of crawling. There is an “Express Paid” inclusion service of self-service type for up to 500 pages at a time. This service will enable weekly crawling. Their bulk program called “Trusted Feed” will enable the pages to be directly linked to their index. Pricing for “Trusted Feed” is on a cost-per-click model with a monthly minimum. In this program you can submit the Meta data, descriptions and keywords directly to the index. Nevertheless, the engine will check whether the destination page has the same Meta data or not and could levy a penalty for spam.

AltaVista’s ranking policies are a combination of various factors. The frequency and positioning of keywords and descriptions is important, as are title tags or words that appear near the top of the page. AltaVista applies link analysis to determine relevancy and page ranking. It levies penalty on spamming and does not recognize invisible or tiny text, keyword stuffing, identical pages, mirror sites, or quick meta refresh tags.

3.4 Keywords — Optimizing Your Site to Get Top Billing at Search Engines

When a user enters a search term, also known as a “keyword”, into a search engine, the engine runs through the billions of pages in the database and awards each one a “relevancy score”. The higher your score, the higher your listing. If your site doesn’t contain the keyword used by the searcher, the only score it’s going to get is a big, fat zero. Your first task then is to make sure you know which keywords are most relevant for each of your sites.

There are three ways to figure out your keywords:

Ask Your Competitors

This is the cheapest way to find many of the most important keywords. Simply log on to a search engine (AltaVista is good, Google is better) and carry out a search for sites like yours. Open the top site, and once the home page has downloaded, click on “View” in your browser, and then “Source”. That will reveal all the HTML used to build the web page, including all the keywords that have been specially inserted.

Some of those keywords will be relevant to your site. Others, of course, won’t be, and there will be lots of keywords you wouldn’t have thought of yourself, “vitamins” for example. You can repeat the process on other sites, using different keywords, and build up a pretty long list.
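If you want to automate the View Source step, Python's standard-library HTML parser can pull the keywords meta tag out of a page's source. The HTML string below is a made-up example; in practice you would feed in the source you viewed on a competitor's page.

```python
from html.parser import HTMLParser

class KeywordExtractor(HTMLParser):
    """Collects the content of any <meta name="keywords"> tag."""
    def __init__(self):
        super().__init__()
        self.keywords = []

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "keywords":
            content = attrs.get("content", "")
            self.keywords += [k.strip() for k in content.split(",")]

# Made-up competitor page source, for illustration only.
source = '''<html><head>
<title>Cheap Roses Online</title>
<meta name="keywords" content="roses, cheap bouquets, flowers">
</head><body>...</body></html>'''

parser = KeywordExtractor()
parser.feed(source)
```

Run against several competitor pages, a script like this builds your keyword list far faster than copying from View Source by hand.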

Ask the Pay-Per-Clicks

Pay-per-click sites actually let you see how popular a keyword is. They’re not being kind; they’re trying to make money. The more webmasters bid on those keywords, the higher the bids are going to rise, and the more money the pay-per-clicks are going to make. FindWhat, for example, has a Keyword Center. Other pay-per-click sites offer similar features. One of the most popular keyword discovery tools, however, was provided by former PPC giant Overture. You can play around with this free keyword selector tool at: www.inventory.overture.com/d/searchinventory/suggestion/

Use a Specialized Tool

Not too surprisingly, a number of companies have popped up to supply specific keyword services for a fee. The best of these is WordTracker.com. They’re not bargain basement, but you get what you pay for. They’ll give you all the keywords you need and in my experience, they’re a sound investment.

GoogleFight.com is another useful tool to see whether one keyword is more popular than another. The site compares two keywords and tells you which is more popular. It’s free and has a limited use, but it’s fun to play with.

As you make up your list of keywords, bear in mind that it’s also worth looking at key phrases. It’s quite possible that a user looking to buy flowers online might search for “red roses” or “cheap bouquets” as well as just “flowers”. Key phrases are often overlooked by competitors, so you’ve got a pretty good chance of getting a high placement with the right combination.

Don’t worry too much about the competition though. Some people will tell you that you’re better off trying to find keywords that no one else has thought of, and others will tell you to throw in keywords that are only slightly relevant to your business.

In my experience, that’s a waste of time. If your competitors are using certain keywords, it’s because they know they work. And if any of your visitors found your site using irrelevant keywords, you're not going to sell them anything. Don’t try to reinvent the wheel here: just try to figure out the most popular keywords and the best key phrases to put on your site.

Whichever of these methods you use — and I tend to use more than one — you should end up with a pretty comprehensive list of keywords that you can stick into your website. The next question then, is how do you use them? When a search engine assigns relevancy to a site, it looks for the keywords in a number of specific areas.

Title Tag

The title tag is written in the <HEAD> section of the web page, between the <TITLE> and </TITLE> tags. It’s usually the line listed in the search results as well. For example, the New York Times’ title tag at the time of this writing is “The New York Times - Breaking News, World News & Multimedia”. Of course, they will change this tag from time to time.

The title tag is usually between 50 and 80 characters including spaces. Different search engines have different limits so you want to make sure that your most important words are near the beginning of the title.

The rest of the title is made up of keywords and phrases but in fact, you don’t want to put in too many keywords here. Just place one keyword as the second or third word in the title. Too many, and your site could be seen as spamming.
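A few lines of script can sanity-check a proposed title against the guidelines above. The 50-80 character range and the keyword-near-the-front rule are this chapter's rules of thumb, not a fixed standard.

```python
def check_title(title, keyword):
    """Apply the chapter's rules of thumb to a proposed title tag."""
    problems = []
    if not 50 <= len(title) <= 80:
        problems.append("length %d outside the 50-80 character range" % len(title))
    words = title.lower().split()
    if keyword.lower() not in words[:3]:
        problems.append("keyword not among the first three words")
    return problems

# Hypothetical title for a flower shop, with "roses" as the keyword.
title = "Cheap Roses Delivered Overnight - Fresh Bouquets From Our Family Farm"
issues = check_title(title, "roses")
```

This example title passes both checks: it is 69 characters long and places the keyword second, ahead of the cut-off points the stricter engines apply.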

You can also list more keywords in the <META Keywords> and <META description> sections of the <HEAD> area, but because these areas have been so abused in the past, a number of search engines today will skip right past these meta tags and go straight to the web copy.

3.5 Web Copy for SEO

The search engines will scan the text on a web page to see if your site is relevant to the search term. That means that in effect, your web copy is going to have to do two things: persuade a customer to buy, and persuade a search engine it’s relevant.

When you write your copy, aim for about 500 words a page and work in 4-8 keywords. You’ll have to balance a smooth text flow against getting in all the keywords you need to be listed.

You can also consider adding text-only pages such as how-to articles, tips or tutorials to your site. Throw in some keywords and they can turn up in search engines and create opportunities for link exchanges.

So there are a few ways you can try to improve the position of your site in a search engine. More important than where you put the keywords, though, is choosing the right keywords. That’s not really a huge challenge, as your competitors are likely to have done the job for you.

Of course, even if you do get everything right, it doesn’t mean you’re going to shoot straight to the top of Google. One of the criteria for relevancy is how long you’ve been online, so success on the search engines won’t come overnight. The sooner you start submitting though, the sooner you can start to rise.

3.6 Submitting to Search Engines

Submitting sites to search engines is much easier than submitting them to directories or pay-per-clicks. In fact, you only have to submit the home page. The search engine’s “spider” (a neat little software program) will then follow all the links from the home page and include your other pages. Spidering actually increases your relevancy score more than hand-submitting your internal pages yourself.

The disadvantage of spidering is that it can be slow. Google has the best spider but even they can take up to a month to index all your pages. For other search engines you can wait three times as long.

Say What?! Is all of this search engine talk confusing to you? Do you want to make money online, but don’t want to figure it all out by yourself? You’re not alone. Click here for a turn-key money-making system that you can start earning from within the next 24 hours.

3.7 Search Directories – The Benefits of Browsing

Search directories differ from search engines by providing a range of categories for users to browse. Rather than enter a keyword into a search box, users click through categories and sub-categories narrowing down their options.

You could say that search engines are like going straight up to the sales assistant and asking what they have in evening wear; search directories are like browsing through the store and seeing what catches the eye.

How you make your site catch the eye in a directory is actually pretty similar to standing out in a search engine. It’s all about relevancy — a mixture of keywords and links.

3.7.1 Submitting to Search Directories

Submitting your site to a search directory is a little tougher than submitting to a search engine. Directories don’t have spiders. They rely on humans. When you submit your site to Yahoo! or any of the other directories, you’ll have to complete a form that will include your “URL”, “Page Title”, “Keywords” and a “Page Description”.

Your keywords and title will play some role in your ranking, but for the description, it’s much better to put a hard sell that will attract users. There’s no point having a link at the top of a category if no one wants to click on it.

Bear in mind that because each submission to a directory is checked by a human editor, it can take quite a while for your site to be approved and listed. Some sites do have express services, but these are pretty pricey (Yahoo! wants $299, or $600 for adult sites!), and if they decide your site isn’t suitable for a category, you don’t get your money back. It’s usually worth the wait.

3.8 Pay-Per-Click — Buying Status

Pay-per-click programs (PPCs) allow you to buy a prime position in a search engine by selecting the price you wish to pay for each visitor you receive. This can place you exactly where you want to be in the listing, or let you decide precisely how much you want to spend on advertising.

The big advantage of PPCs is that you don’t have to worry about messing with keywords or links or any of that. You can just figure out how much you want to pay for a keyword and buy your position. In addition, you only pay for people who actually click on your link (with banner ads, you often have to pay whenever someone sees one). And you can also get cheap visitors: bids usually start at around five cents per click. The top three bids, though, are often promoted across a network of sites, so there can be big bonuses for bidding high.

This is how most pay-per-click programs work:

  • You create your page title, description and link as you want it to appear in the search results.
  • You enter the keywords and phrases that will prompt your listing to appear.
  • You enter your keyword bid (the amount you are willing to pay for each click to your site).
  • Your keyword bid is compared to that of other bidders for the same keyword. The results are returned to the user with the highest bid appearing first.
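The listing order the steps above describe is just a sort by bid, which makes it easy to simulate. The advertiser names and bid amounts below are invented for illustration.

```python
# Invented bids (dollars per click) for one keyword, say "roses".
bids = {
    "a.com": 0.25,
    "b.com": 0.40,
    "c.com": 0.05,
}

def ppc_listing(bids):
    """Return the advertisers for a keyword, highest bid first."""
    return sorted(bids, key=bids.get, reverse=True)

listing = ppc_listing(bids)
```

Here b.com buys the top spot at 40 cents a click, while c.com still gets listed, just further down, at the five-cent minimum.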

3.8.1 Show Me the Money!

With PPCs, the name of the game is profit. You need to be careful not to get so carried away with the ranking that your promotion cuts into your revenues.

This is essential! There’s no point in being top if you’re out of business in a month. You have to figure out what you can afford and keep to it. Base your decision on your visitor-to-sales ratio (the number of visitors on average that it takes to generate a sale) and your net profit per sale.

So for example, if you get a sale from every tenth visitor, and you net a profit of $20 from each sale, then you can’t pay more than $2 for each click without operating at a loss (unless you have an effective back-end sales campaign set up, as discussed earlier). In practice, you might make one sale for every 100 or so clicks and pay perhaps 15 or 20 cents for each visitor, depending on your market.
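The break-even arithmetic in that example is simple enough to script: the most you can pay per click is your net profit per sale divided by the number of visitors it takes to make one sale, ignoring any back-end revenue.

```python
def max_cpc(net_profit_per_sale, visitors_per_sale):
    """Break-even cost per click, before any back-end sales."""
    return net_profit_per_sale / visitors_per_sale

# The chapter's example: $20 profit, one sale per 10 visitors -> $2.00 max.
print(max_cpc(20.0, 10))    # 2.0
# The more typical case: one sale per 100 or so clicks -> 20 cents max.
print(max_cpc(20.0, 100))   # 0.2
```

Any bid above this figure loses money on every sale, no matter how good the ranking it buys.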

It’s absolutely crucial for you to know your visitor-to-sales ratio.

It’s also important to keep that ratio as low as possible (fewer visitors needed per sale), and that means only bidding on relevant keywords. If you pay for visitors who are looking for something completely different than the services you’re offering, you’re just throwing your money away. They aren’t