
Google & Inktomi Optimization
The search engine environment continues to evolve rapidly, easily outpacing the ability of consumers and SEO practitioners to adapt to the new landscape. With Inktomi ascending to a level of importance that until recently was held solely by Google, SEO practitioners need to rethink several strategies, tactics and perhaps even the ethics of their techniques. Assuming this debate will unfold over the coming months, how does an “ethical SEO firm” work to optimize websites for two remarkably different search engines without falling back on old-fashioned spammy tactics such as leader pages or portal sites? Recently, another SEO unrelated to StepForth told me that he was starting to re-optimize his websites to meet what he thought were Inktomi’s standards as a way of beating his competition to what looks to be the new main driver. That shouldn’t be necessary if you are careful and follow all the “best practices” developed over the years.

The answer to our puzzle is less than obvious, but it lies in the typical behaviors of the two search tools. While there are a number of similarities between the two engines, most notably in the behaviors of their spiders, there are also significant differences in the way each engine treats websites. For the most part, Google and Inktomi place the greatest weight on radically different site elements when determining eventual site placement. For Google, strong and relevant link popularity is still one of the most important factors in achieving strong placements. For Inktomi, titles, meta tags and text are the most important factors in getting good rankings. Both engines consider the number and arrangement of keywords, incoming links, and the anchor text used in links (though Google puts far more weight on anchor text than Inktomi tends to). That seems to be where the similarities end, and it is the point where SEO tactics need revision. Once Inktomi is adopted as Yahoo’s main listing provider, Google and Inktomi will drive relatively similar levels of search engine traffic. Each will be as important as the other, with the caveat that Inktomi powers two of the big three while Google will only power itself.

2004 – The Year of the Spider-Monkey
The first important factor to consider is how each spider works.

Entry to Inktomi Does Not Mean Full-Indexing
Getting your site spidered by Inktomi’s bot “Slurp” is essential. Like “Google-bot”, “Slurp” will follow every link it comes across, reading and recording all information. A major difference between Google and Inktomi is that, when Google spiders a new site, there is a good chance of getting placements for an internal page without paying for that specific page to appear in the index. As far as we can tell, that inexpensive rule of thumb does not apply to Inktomi. While it is entirely possible to get entire sites indexed by Inktomi, we have yet to determine whether Inktomi will allow all pages within a site to achieve placements without paying for those pages to appear in the search engine results pages (SERPs). Remember, Inktomi is a paid-inclusion service which charges webmasters an admission fee based on the number of pages in a site they wish to have spidered. From the information we have gathered, Slurp will follow each link in a site and, if provided a clear path, will spider every page in the site, but pages that are paid for during submission will be spidered far more frequently and will appear in the indexes months before non-paid pages. We noted this when examining how many pages Inktomi lists from newer clients versus older clients: the older the site, the more pages appear in Inktomi’s database and on SERPs of search engines using the Inktomi database. (This is assuming the webmaster only paid for inclusion of their INDEX page.) Based on Inktomi’s pricing, an average-sized site of 50 pages could cost up to $1,289 per year to have each page added to the paid-inclusion database, so it is safe to assume that most small-business webmasters won’t want to pay that much.
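To put that paid-inclusion arithmetic in perspective, here is a minimal sketch; the $39 first-URL and $25 additional-URL annual rates are assumptions used for illustration only, since actual Inktomi pricing varied by reseller and program:

```python
# A minimal sketch of the paid-inclusion arithmetic described above.
# The rates below are illustrative assumptions, not official Inktomi pricing.
FIRST_URL_FEE = 39.00       # assumed annual fee for the first URL submitted
ADDITIONAL_URL_FEE = 25.00  # assumed annual fee for each additional URL

def annual_inclusion_cost(total_urls: int) -> float:
    """Estimate the yearly paid-inclusion bill for a site with `total_urls` pages."""
    if total_urls <= 0:
        return 0.0
    return FIRST_URL_FEE + ADDITIONAL_URL_FEE * (total_urls - 1)

# An index page plus 50 internal pages (51 URLs in total):
print(f"${annual_inclusion_cost(51):,.2f} per year")  # -> $1,289.00 per year
```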

Google’s Gonna Get You
Google-bot is like the Borg in Star Trek: if you exist on the web and have a link coming to your site from another site in Google’s index, Google-bot will find you and assimilate all your information. As the best known and most prolific spider on the web, Google-bot and its cousin Fresh-bot visit sites extremely frequently. This means that most websites with effective links will get into Google’s database without needing to be submitted manually. As Google currently does not have a paid-inclusion model, every page in a site can be expected to appear somewhere on Google-produced SERPs. By providing a way to find each page in the site (effective internal links), website designers should see their sites appearing in Google’s database within two months of publishing.

We Now Serve Two Masters; Google and Inktomi
OK, that said, how do we optimize for both without risking placements at one over the other? The basic answer is to give each of them what they want. For almost a year, much of the SEO industry focused on linking strategies in order to please Google’s PageRank. Such heavy reliance on linking is likely one of the reasons Google re-ordered its algorithm in November. Relevant incoming links are still extremely important but can no longer be considered the “clincher” strategy for our clients. Getting back to the basics of site optimization and remembering the lessons learned over the past 12 months should produce Top 10 placements. SEOs and webmasters should spend a lot of time thinking about titles, tags and text as well as about linking strategies (both internal and external). Keyword arrangement and densities are back on the table and need to be examined by SEOs and their clients as the new backbone of effective site optimization. While the addition of a text-based sitemap has always been considered an SEO best practice, it should now be considered an essential practice. The same goes for unique titles and tags on each page of a site. Another essential practice SEOs will have to start harping on is to only work with sites that have unique, original content. I am willing to bet that within 12 months, Inktomi introduces a rule against duplicate content as a means of controlling both the SEO industry and the affiliate marketing industry. Sites with duplicate content are either mirrors, portals or affiliates, none of which should be necessary for the hard-working SEO. While there are exceptional circumstances where duplicate content is needed, more often than not dupe content is a waste of bandwidth and will impede an SEO campaign more than it will help.
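As a practical way to audit the “unique titles and tags on every page” rule, here is a small sketch; the ./site folder and the regex-based title extraction are illustrative stand-ins for a real crawl or HTML parser:

```python
import re
from pathlib import Path

def extract_title(html: str) -> str:
    """Pull the contents of the <title> tag, or an empty string if none is found."""
    match = re.search(r"<title>(.*?)</title>", html, re.IGNORECASE | re.DOTALL)
    return match.group(1).strip() if match else ""

def find_duplicate_titles(folder: str) -> dict:
    """Map each title to the pages that use it, keeping only reused or missing titles."""
    titles = {}
    for page in Path(folder).glob("*.html"):
        title = extract_title(page.read_text(errors="ignore"))
        titles.setdefault(title, []).append(page.name)
    return {t: pages for t, pages in titles.items() if len(pages) > 1 or not t}

# Example: flag pages sharing a title (or missing one) in a local copy of the site.
for title, pages in find_duplicate_titles("./site").items():
    print(f"{title or '(missing title)'!r} is used by: {', '.join(pages)}")
```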

One Last Tip
The last tip for this article: don’t be afraid to pass higher costs on to your clients, because if a client wants those placements soon, paid inclusion of internal pages will be expected. When one really examines the costs of paid inclusion, they are not terribly different from other advertising costs, with one major exception. Most paid advertising is regionally based (or is prohibitively expensive for smaller businesses). Search engine advertising is, by nature, international exposure, and that is worth paying for.

Author Bio:
Jim Hedger is the SEO Manager of StepForth Search Engine Placement Inc. Based in Victoria, BC, Canada, StepForth is the result of the consolidation of BraveArt Website Management, Promotion Experts, and Phoenix Creative Works, and has provided professional search engine placement and management services since 1997. http://www.stepforth.com/ Tel – 250-385-1190 Toll Free – 877-385-5526 Fax – 250-385-1198

Introduction
Here’s something that is fast to read and does the job: the 10 do’s and don’ts of SEO. Five techniques you should always use to push your site to the top of the search engine results pages (SERPs) and keep it there, and five things you should always avoid doing to protect your site from a possible penalty or from being banned altogether.

List of the 5 Do’s
Do Number One:

Take all the time it takes to do careful research of all your keywords and key phrases for your site, based on the products or services you are trying to sell. Proper keyword research can only be done using Wordtracker, the industry standard when it comes to professional keyword research. Trying to optimize a site without knowing your real keywords is like driving a car at night with no headlights! Some will tell you they use Overture’s free suggestion tool. Although that tool can help you to a limited degree, you should always use Wordtracker for the best results.

Do Number Two:
Make sure you write a short title tag describing what each page of your site is about, and make sure they are all different. Search engines use the information contained in that title tag, compare it to the text on that page and rank the page accordingly. The short description in your title tags will also help your users. The idea here is to keep it as short and descriptive as possible. If you are creating a new page about ‘durable red widgets’ then call that page ‘durable red widgets’. Avoid the temptation of creating title tags that are longer than 30 characters, since they might have a dilution effect on your rankings in certain search engines.

Do Number Three:
Write the main text on your page using the same keywords contained in your title tag. If you are working on a page with a title called ‘New houses in Baltimore’ then be sure that those important keywords are repeated at least two or three times in the main body of your text, without sounding repetitive. A well-designed and carefully written page will read right and will not sound like you are repeating yourself. Search engines will rank your page higher if they see a keyword repeated a few times on a page, and this will help them ‘build a theme’ throughout your site.
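A quick way to sanity-check that the title keywords actually appear a few times in the body copy is sketched below; the file name and phrase are placeholders borrowed from the example above:

```python
import re

def keyword_count(text: str, phrase: str) -> int:
    """Count whole-phrase, case-insensitive occurrences of `phrase` in `text`."""
    pattern = re.compile(r"\b" + re.escape(phrase) + r"\b", re.IGNORECASE)
    return len(pattern.findall(text))

# The file name and phrase below are placeholders taken from the example in the text.
body = open("new-houses-in-baltimore.html", errors="ignore").read()
phrase = "new houses in Baltimore"
print(f"'{phrase}' appears {keyword_count(body, phrase)} time(s); two or three natural mentions is the goal.")
```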

Do Number Four:
Make a complete sitemap of your site, which will help both your users and the search engines at the same time. Having a well-designed sitemap will ensure that each page of your site gets properly indexed by Google and the other search engines. It is important to call that file sitemap.html and not site-map.html or other variations. Additionally, make sure that your sitemap.html file is directly accessible from your homepage and that it uses link text. Link text is always a lot better than a picture or graphic, since search engines won’t be able to read them.
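As a sketch of the text-link sitemap idea, the snippet below writes a plain sitemap.html from a list of pages; the page names and titles are made-up examples:

```python
# A minimal sketch that writes a plain, text-link sitemap.html.
# The pages listed here are made-up examples; substitute your own URLs and titles.
pages = [
    ("index.html", "Home"),
    ("durable-red-widgets.html", "Durable Red Widgets"),
    ("contact.html", "Contact Us"),
]

links = "\n".join(f'  <li><a href="{url}">{title}</a></li>' for url, title in pages)
html = f"<html><head><title>Site Map</title></head><body>\n<ul>\n{links}\n</ul>\n</body></html>\n"

with open("sitemap.html", "w") as f:  # the article recommends this exact file name
    f.write(html)
```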

Do Number Five:
To increase your link popularity, participate in a link exchange program. Even in the aftermath of ‘Update Florida’, link popularity in Google is more important than ever. All else being equal, the higher your PageRank, the higher your rankings. Increasing the number of links that point to your site will help you in the results pages. You should only link to sites that are in the same field as your site, and stay away from bad ‘neighbourhoods’ and so-called link farms or ‘free-for-all’ pages.

List of the Five Don’ts
Don’t Number One:

Don’t ever use cloaking mechanisms or software that needs to know the IP address of a search engine spider, or anything similar. Cloaking is based on the idea of serving a unique, optimized page to the search engines while serving a completely different page to the ‘real’ users. Today, most major search engines prohibit the use of such techniques and you risk having your site penalized or banned altogether. Always play it safe and the search engines will treat you right.

Don’t Number Two:
Submit your website once to the search engines and then wait for at least 6 weeks! Don’t use software that automatically submits your site on a weekly or monthly basis, since that might penalize you in the long run. Today’s modern search engines use automated crawlers or spiders to regularly index your site, so you don’t need to submit more than once. In the case of DMOZ (the Open Directory Project or ODP), you should always wait 8 to 12 weeks, since DMOZ relies only on volunteers to review and index your site. If your site still isn’t listed after 12 weeks, write them a friendly email explaining your problem and that should do it in most cases.

Don’t Number Three:
Don’t entrust your site to people who will submit it to ‘thousands of engines’. There aren’t that many search engines in the first place. There are only a handful of serious search properties you should submit to, and they are used by 99% of the people looking for information. Don’t waste your time or your money; only work with the serious search engines everybody uses.

Don’t Number Four:
Don’t develop your site using Flash technology or similar techniques that the search engines cannot read. As far as your rankings in the search engines go, the best way to develop a site is to use standard technology such as HTML. Text written in HTML is proven technology that all search engines have recognized since the beginning of the Internet. Using the right technology will always help your site attain a good position in the SERPs.

Don’t Number Five:
Don’t deal with so-called SEO experts that promise you Number One or first-page rankings in some engines. There is no such thing as guaranteed number one placement. Ask for referrals and don’t be afraid to ask exactly what techniques your would-be SEO firm uses to achieve a good positioning for your site. Additionally, ask them to put everything in writing, before you sign on the dotted line, and before you give them some of your hard-earned cash.

Author:
Serge Thibodeau of Rank For Sales

Google Update Florida Solutions and Fixes
Over a month and a half ago I wrote an article about Google’s latest update dubbed Update Florida. I told our subscribers about how they might see their rankings drop in Google for prominent keywords for no apparent reason. Today I’m going to tell you what you need to do in order to fix your site and help restore your rankings.

First, let’s recap the main effects of the Google Update Florida as well as my original predictions of the root causes. Then I’ll explain in more detail the root causes as I see them.

Here’s the skinny on the latest Google Dance (from: 11/18/2003)
• Called “Update Florida” because, like the last Presidential election, it was full of controversy and left tons of people upset.

• Many people who had worked hard to build up solid and reputable backlinks appear to have been punished for no obvious reason, even though their inbound links contain their main keyword phrase and the linking pages have content related to their site in the title.

• Rankings have dropped for main targeted keywords but have not dropped for more specific keyword phrases. The drops are not limited to search engine optimization firms; they affect web sites of every kind and variety.

• Results appear to have been repopulated from data that looks to be about 6 months old.

Original Google Update Florida Predictions and Explanations
• Probable: Google is either testing an updated algorithm or they are not factoring in the backlinks created in the past 6 months.

• Not Probable: SEO techniques that could be considered SPAM to Google are being filtered out in an effort to provide more relevant results. I’m doubtful on this one since the results are from about 6 months ago and many of them are not very relevant.

• Probable: Google is doing yet another deep crawl, which we haven’t seen in about 6 months, and they want to build a fresher, more relevant search base from sites currently out there that the Fresh Crawl GoogleBots may not be picking up.

• Not Probable: Many sites that have dropped off will never come back to the top because they have been penalized. As far as we know Google doesn’t penalize sites, they simply list them or delist them. Just like God, there is no gray area with Google, it’s either black or white. If you’re considered a black hat then your site will not show up anywhere in Google’s index, not even for the most descriptive keyword searches.

And the Verdict is In!

Let’s start from the beginning with the first prediction and move down from there.

Prediction no.1
Google is either testing an updated algorithm or they are not factoring in the backlinks created in the past 6 months.

Verdict: I was right and wrong on this one.

I was wrong about Google not factoring in backlinks created in the past 6 months. I was right about them testing new filters. Google has not dumped their old algorithm by any means. Instead they have created new filters to penalize sites whose PageRank was artificially inflated due to less than reputable linking practices. What are less than reputable linking practices, you ask? These include…

• Exact same phrase, or a high percentage of the exact same phrase, in the text of backlinks (text links) and in alt text (image links). This catches folks caught up in link farms because they don’t rotate the text used to link to their sites.
Solution: Create a ‘link to us’ page that offers at least 5 different links using different keyword phrases. This way you spread out the concentration of link text across the general theme of your site.

• Links from sites that are of a different subject matter than the site being linked to. When there is no mention of your subject matter in the body, title, keywords and description of the page linking to your site.

• Links from sites that don’t show up in search results for the keywords used in the links. It is widely accepted that Google has implemented a LocalRank technology that determines whether or not your backlinks are from sites that rank for keywords similar to those used in the backlink. If so, then you’re fine; if not, you’re going to have problems getting to the top even if you have a high PageRank.

Prediction no.2
Not Probable: SEO techniques that could be considered SPAM to Google are being filtered out in an effort to provide more relevant results. I’m doubtful on this one since the results are from about 6 months ago and many of them are not very relevant.

Verdict: I was right on this one

Google is not penalizing sites for being overly optimized or for having keywords in the title, description and keywords tags of the page, or for having them in the heading tags or anywhere else. The penalization only comes into effect when backlinks are overly optimized and not from other industry resource sites. The results appear to be old only because the sites that ranked well earlier weren’t part of the linking scams, and they returned to the top for that very reason.

Prediction no.3
Probable: Google is doing yet another deep crawl, which we haven’t seen in about 6 months, and they want to build a fresher, more relevant search base from sites currently out there that the Fresh Crawl GoogleBots may not be picking up.

Verdict: I was right again.

Yes, Google did another deep crawl, refreshed the backlinks of sites and updated their PageRank about 2 weeks later. But this deep crawl is done on an almost continual basis now and doesn’t affect the rankings like many thought it did. Rankings were affected due to the implementation of new filters, and that’s the only reason why.

Prediction no.4
Not Probable: Many sites that have dropped off will never come back to the top because they have been penalized. As far as we know Google doesn’t penalize sites, they simply list them or delist them. Just like God, there is no gray area with Google, it’s either black or white. If you’re considered a black hat then your site will not show up anywhere in Google’s index, not even for the most descriptive keyword searches.

Verdict: I was wrong.

Well I guess nobody’s perfect. Google is not necessarily “penalizing” sites but they are “filtering” sites out based on whether or not certain filters are triggered by the site for a particular keyword search. While a site may show up for a very detailed search or an off topic search, it may not rank well for a highly competitive (typically commercial) term if they have tripped a filter for that term.

In order to understand my predictions and conclusions correctly I need to explain a few more things that have bubbled up from the Google Update Florida and how they affect search results that are delivered today on Google.

Not only did Google implement filters, but they also implemented new features which make it seem like the old algorithm was thrown out and they started over from scratch. While many will tell you Google isn’t as relevant as it once was, I will contend that they’re changing to remain the most relevant search engine out there. Here’s a rundown on the new features.

• Google Implemented Stemming Technologies: Stemming means taking a root word and determining all variations of that word. Now a search for “game sites” may return sites optimized for “gaming sites”, “gamer sites” or “gamed sites”. This increases the number of results delivered for highly targeted keywords and increases the dependency on solid natural optimization.

You can disable stemming by adding the “+” sign in front of each word you want to disable stemming for.

• Google Implemented Plural Searches: This means that a search for “knitting needle” and “knitting needles” will return the same results, thus increasing the competition again since more results are returned for all keyword searches.

• Implementation of LocalRank: LocalRank is a technology that looks at the first (x) number of search results. It can be any number the search engine specifies but is typically around 100. After it looks at those results it determines whether or not any of those sites have linked to you and then ranks sites based on how many “popular” sites for a specific search term have linked to you. (A simplified sketch of this re-ranking idea appears just after this list.) This is why it’s critical to have someone help you with your link popularity campaign who understands the intricacies of linking and can provide advice that will not hurt you in the long run. Short-term link popularity plans from unrelated sites will do nothing to help your cause. For more information on LocalRank read this forum at WebmasterWorld.

• Internal Links Discounted: Links from within your site to particular pages in your site do not count as much as they once did. While this doesn’t mean you need to change your site link structure it is worth noting.

• Artificial PageRank Deflated: Sites that have more than one link from a particular site are experiencing the law of diminishing returns. No longer are 100 links from a single site weighted as 100 individual links. This just makes sense. I mean, if site A has 100 links from a single site and site B has 20 links from 20 individual sites, I can guarantee you that the 20 links to site B will count more than the 100 links to site A.
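To make the LocalRank idea mentioned above concrete, here is a highly simplified sketch; the scoring is invented for illustration and is not Google’s actual formula:

```python
# A highly simplified sketch of LocalRank-style re-ranking.
# The scoring below is invented for illustration; Google's actual computation is not public.

def local_rank(results, links_to):
    """Re-order an initial result list by how many of the other top results link to each page.

    results  -- URLs in their original ranked order (e.g. the top 100 for a query)
    links_to -- mapping of URL -> set of URLs that the page links out to
    """
    scored = []
    for position, url in enumerate(results):
        votes = sum(1 for other in results
                    if other != url and url in links_to.get(other, set()))
        # Sort by votes first; ties keep their original order via -position.
        scored.append((votes, -position, url))
    return [url for votes, neg_pos, url in sorted(scored, reverse=True)]

# Tiny worked example with three hypothetical pages:
results = ["a.example", "b.example", "c.example"]
links_to = {"a.example": {"c.example"}, "b.example": {"c.example"}, "c.example": set()}
print(local_rank(results, links_to))  # c.example moves up: two other top results link to it
```

The law of diminishing returns on repeated links from one domain can be illustrated the same way; the 0.5 decay factor below is purely an assumption, not a known Google constant:

```python
# A toy illustration of diminishing returns on repeated links from one domain.
# The 0.5 decay factor is an arbitrary assumption used only to show the shape of the effect.

def domain_link_value(link_count: int, decay: float = 0.5) -> float:
    """Value of `link_count` links from a single domain: 1 + 0.5 + 0.25 + ..."""
    return sum(decay ** i for i in range(link_count))

site_a = domain_link_value(100)        # 100 links, all from one site
site_b = 20 * domain_link_value(1)     # 20 links from 20 different sites
print(f"Site A: {site_a:.2f}  Site B: {site_b:.2f}")  # Site B wins by a wide margin
```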

In Conclusion
Well I’ll conclude my ramblings with a recap. Google has made many changes and has implemented several filters as well as new algorithm features to ensure it has the most reliable search results set on the internet. In order to climb your way back to the top you need to understand LocalRank, PageRank changes, Proper Link Reputation and Link Popularity, and be aware of anti-spam measures that need to be taken. It all boils down to common sense. Make your site user friendly and easy to navigate, encourage others to link to you by giving them some form of incentive, don’t use the exact same phrase in your backlinks, use good titles that explain what each page is about and keep it simple. By following these rules you can weather any search engine algorithm change and remain at the top with a lot less stress.

Author Bio:
Jason Dowdell is the founder and CEO of http://www.GlobalPromoter.com, a search engine optimization and marketing firm specializing in educating and empowering customer websites. Jason is also the founder of TurboPromoter.com, a web-based seo/sem project management suite comprised of professional seo tools, in-depth tutorials and an integrated help system.

Pay Per Click Search Engine Advertising
So you’ve decided to give pay-per-click search engine advertising a try? That’s a good move, because PPC advertising is one of the most affordable marketing options available to small businesses.

But like all advertising, you need a good strategy to get your money’s worth. I find that too many people running their first PPC campaign make mistakes that can quickly turn expensive.

In this article I’ll offer some basic advice about bidding and keyword selection to help you run a smart PPC campaign.

Just What Can You Pay Per Click?
The most important thing to know before starting your PPC campaign is how much you can afford to bid for a keyword. High traffic keywords on Overture and Google – the leading PPC providers – can cost $5.00 per click for a top ranking. Can you afford that?

Consider this: the typical e-commerce site converts about 2% of its visitors. That means you need to bring 50 visitors to your site before you make a sale. At $5.00 per click, you’ll spend $250 to generate one sale. Ouch!
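The break-even arithmetic generalises easily; the bid and conversion rate below simply restate the example numbers above:

```python
def cost_per_sale(cost_per_click: float, conversion_rate: float) -> float:
    """Spend needed to generate one sale at a given bid and conversion rate."""
    visitors_per_sale = 1 / conversion_rate
    return cost_per_click * visitors_per_sale

# The example above: $5.00 per click at a 2% conversion rate.
print(f"${cost_per_sale(5.00, 0.02):.2f} per sale")  # -> $250.00 per sale
```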

Keep in mind that you usually want one of the top 3 listings for a keyword. These are the listings distributed to most of the PPC engine’s partner sites. For example, a #3 ranking on Overture will place your listing on Yahoo, MSN and Alta Vista. A #7 listing won’t appear on any of these search engines.

So you’re caught in a catch-22: you want a high PPC ranking to get traffic, but the top rankings for popular words are too expensive.

Cast Your Net Broadly
The solution is to cast your net broadly, targeting a large number of less popular keywords. These words are usually less expensive and, taken as a group, can give you a considerable volume of traffic.

For example, suppose you run a ski resort. The keyword ‘ski vacation’ currently receives over 60,000 searches per month. That’s great, but it costs $5.01 per click for the top ranking. Instead of competing head-to-head for that keyword, you would be better off choosing ‘ski trip’ (4,771 monthly searches at $0.57 per click for the top spot) and ‘ski lodge’ (4,244 monthly searches at $0.55 per click for the top spot).

By targeting a number of these less popular keywords, we get nearly the same traffic as if we had targeted ‘ski vacation,’ but at a fraction of the cost.
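Running the ski-resort bids above through a quick comparison makes the point; the 1,000-click block is an arbitrary yardstick chosen for illustration:

```python
# Rough cost comparison using the top-bid prices from the ski-resort example above.
# Pricing an identical block of 1,000 clicks on each term; the comparison is illustrative only.
CLICKS = 1_000

top_bids = {"ski vacation": 5.01, "ski trip": 0.57, "ski lodge": 0.55}

for keyword, bid in top_bids.items():
    print(f"{keyword:13s} 1,000 clicks at ${bid:.2f} -> ${CLICKS * bid:,.2f}")

# Splitting the same 1,000 clicks across the two cheaper terms:
blended = (CLICKS / 2) * top_bids["ski trip"] + (CLICKS / 2) * top_bids["ski lodge"]
print(f"Blended niche portfolio: ${blended:,.2f} vs ${CLICKS * top_bids['ski vacation']:,.2f} for the head term")
```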

Note that this is the opposite of the strategy you typically use in your search engine optimization campaign. In an SEO campaign, you focus on perhaps a half dozen high traffic words. That’s because it takes a lot of hard work to earn a top listing.

In contrast, it’s relatively easy to create a new PPC listing. Since you don’t pay unless someone clicks on your listing, there’s no added cost for doing this, so targeting a large number of keywords makes sense.

The word ‘ski chalet’ only receives 930 searches per month. So what? At $0.52 per click, it’s worth adding to your PPC campaign.

It’s common for PPC advertisers to target dozens of keywords. I’ve managed PPC campaigns for clients using over 1,000 words.

Smart PPC Management
The downside of this approach is that it can be hard to manage such a large number of keywords. You’ll want to track your listings, making sure your rankings haven’t dropped. Plus, you’ll want to know which keywords are sending you traffic and converting visitors into customers.

Many businesses also use PPC bid management software like Bid Rank or GoToast to manage their listings. These software packages track your listings and can adjust your bid if you drop in the rankings.

Many companies also outsource the management of their PPC campaigns. Most SEOs now offer PPC management services. These options cost money, but they usually pay for themselves by running your campaigns more efficiently.

Keep in mind that you don’t have to use a software package or a consultant to start your PPC campaign. But you do need to know what sort of cost per click you can afford. If you decide that $2.00 per click is your maximum bid, then stick with it. Don’t get into an emotional bidding war if you lose a top ranking. It’s much smarter to look for new and cheaper keywords. Cast your net broadly and you’ll save money.

Author Bio:
Christine Churchill is President of KeyRelevance.com a full service search engine marketing firm. She is also on the Board of Directors of the Search Engine Marketing Professional Organization (SEMPO) and serves as co-chair of the SEMPO Technical Committee.

PageRank: Meet Hilltop
Based on Atul Gupta’s great article he recently wrote on the Hilltop algorithm, I did a bit of research on my own and came up with this article. Atul Gupta is the founder of SEO Rank Ltd. and, as he explained it in his article, the Hilltop algorithm played a fairly large role in Google’s November 16 update, dubbed “Update Florida”.

In my continuing series on the effects of the Google “Florida Update”, in my previous article I discussed how the OOP (Over Optimization Penalty) could in some cases have been applied to certain sites that could have in fact been overly optimized for some of their main keywords. Researching and reading on the Hilltop algorithm, I found out that it isn’t even new – it dates back to early 2001.

As you might expect, and as is always the case, Google remains very silent on any of this, so my analysis is based on many observations and some testing, using the Google.com search engine. But before delving into how all of this may affect your positioning in Google, let me explain what the “Hilltop” algorithm is all about and how it works in Google.

For those of you who may be new to search engine algorithms, I suggest you read up on Google’s PageRank algorithm as a primer, and also “The Anatomy of a Large-Scale Hypertextual Web Search Engine”, written by Sergey Brin and Larry Page, the co-founders of Google.
In its most basic form, the Google PageRank algorithm determines the importance and the relevance of a website by the number of links pointing to it. Following this principle, as an example, Google would rank a page higher if it has 100 links pointing to it, when compared to another page with only 10 links. So far, so good, and this principle makes a lot of sense when you think of it.

Definition Of The Hilltop Algorithm
In contrast to PageRank, Google’s Hilltop algorithm determines the relevance and importance of a specific web page determined by the search query or keyword used in the search box.

In its simplest form, instead of relying only on the PageRank value to find “authoritative pages”, it would be more useful if that “PR value” were made relevant to the topic or subject of that same page.

In this way, computing links from documents that are relevant to the specific topic of a web page would be of greater value to a searcher. In 1999 and 2000, when the Hilltop algorithm was being developed by engineer Krishna Bharat and others at Google, they called such relevant documents “expert documents”, and links from these expert documents to the target documents determined their “score of authority”. Again, it does make a lot of sense.

For more in-depth information on this important topic, read the Hilltop Paper that was written by Krishna Bharat himself and is available from the University of Toronto’s computer science department.

Using The Hilltop Algorithm To Define Related Sites
Google also uses the Hilltop algorithm to better define how a site is related to another, such as in the case of affiliate sites or similar properties. The Hilltop algorithm is in fact Google’s technology and ‘ammunition’ in detecting sites that use heavy cross-linking or similar strategies!

As a side note, Google’s Hilltop algorithm bases some of its computations mostly on “expert documents”, as noted above.

Hilltop also requires that it can easily locate at least 2 expert documents voting for the same Web page. If Hilltop cannot find a minimum of 2 such “expert documents”, it returns nothing at all. What this really means is that Hilltop refuses to pass on arbitrary values to the rest of Google’s ranking formula and simply does not apply for the search term or keyword used in the search box by the user.
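As a very rough illustration of that two-expert requirement, here is a small sketch; the scoring is invented for the example, and the real formula is described in Bharat’s paper:

```python
# A very rough sketch of the "at least two expert documents" rule described above.
# The scoring here is invented for illustration; the real formula is in Bharat's paper.

def hilltop_scores(expert_links):
    """expert_links maps each expert document to the set of target pages it points to.

    A target only receives a score if at least two experts vote for it;
    otherwise nothing is returned and ranking falls back to the regular algorithm.
    """
    votes = {}
    for expert, targets in expert_links.items():
        for target in targets:
            votes.setdefault(target, set()).add(expert)
    return {t: len(experts) for t, experts in votes.items() if len(experts) >= 2}

experts = {
    "expert-1.example": {"site-a.example", "site-b.example"},
    "expert-2.example": {"site-a.example"},
}
print(hilltop_scores(experts))  # only site-a.example qualifies; site-b.example gets nothing
```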

So, What’s In Store For Hilltop In 2004?
Since we are only at the beginning of the year, some of you may ask: “That’s all really cool, but what will happen to websites in 2004, in the aftermath of ‘Hurricane Florida’?” That’s a great question, and many articles have been written on this topic in the last six to seven weeks.

Today, as in the past, many search engines have stopped valuing certain ranking factors that are subject to abuse by webmasters or site owners, such as the keywords meta tag. For that reason, and since its very beginning, Google has ignored the keywords meta tag altogether.

In contrast, visible sections of a website are less subject to “spam-dexing” (search engine spam), since these ‘visible pages’ (!) need to make good sense to the average human “real” visitor.

The Reasons Behind A New Algorithm At Google
Since the inception of the Google search engine in 1998, the PageRank algorithm has been pretty much the benchmark used at Google to determine search relevance and importance. However, there is a fundamental design weakness and certain limitations involved in the PageRank algorithm system and Google has known about it for quite some time now.

PageRank’s ‘intrinsic value’ is simply not tied to specific search terms or keywords, and therefore a relatively high-PR web page that contained only a passing reference to an off-topic search term or keyword phrase often got a high ranking for that search phrase. This is exactly what Google is trying to eliminate with its Hilltop algorithm. Google always tries as best as it can to make its search engine as relevant as possible.

Coming back to Krishna Bharat, he filed for the Hilltop patent in January of 2001, with Google as an assignee. Thus, Google recognized the important improvements this new algorithm could bring to their search ranking features when combined with their existing PageRank algorithm.

Google’s Hilltop algorithm could now work in conjunction with its older technology (PR). It is my observation that Hilltop could have gone through many improvements from its original year 2000 design before the current implementation, notably the one that Google started to deploy on or around November 16, 2003, at the very beginning of its November update (Florida update).

In the past two years, I think that Hilltop has been “fine-tuned” by Google and now represents a serious contender to the PageRank algorithm, originally developed by Google co-founders Sergey Brin and Larry Page, back in early 1998.

Hilltop And Google’s Massive Index Of Over 3.3 Billion Pages
Since its very beginning, Google has basically operated most of its search engine on about ten thousand Pentium servers (some call them inexpensive personal computers), distributed across several major data centers around the planet. That is basically how Google has built its hardware technology, from the ground up.

Coming back to the Hilltop algorithm, when we consider how roughly 10,000 servers can have the dynamic processing ‘intelligence’ to rapidly determine and locate “expert documents” among hundreds of thousands of different ‘topical’ Web pages, it is clear that Google’s Hilltop algorithm is at work on a formidable task.

From what I can see and from what I know of search engines, since November 16, Google is now running a form of batch processing (similar to the mid-seventies days of computing, using bulky mainframe computers the size of large refrigerators, except that today, those 10,000 servers replace those mainframes) of frequent keywords, key phrases and search terms. Google then stores these results in its massive database, ready to be used as soon as a searcher makes a query using those search terms.

How Google does this is very simple: it has immediate access to the most popular and frequent keywords and search terms used daily, collected in real time from actual searches made by everyday users, as well as from the actual keywords and key phrases used in its AdWords PPC (pay-per-click) ad program.

It is my observation that Google has apparently set an arbitrary threshold on the number of searches a real-life keyword needs to receive before it triggers the Hilltop algorithm and is sent to a temporary buffer for later batch processing in its complex system.

Looking back to the ‘old days of the monthly dances’, it would appear that Google’s Hilltop algorithm operates on the combined total of most popular search terms used once a month, hence the old “Google dance effect”, prior to November 16, 2003.

Additionally, and this is something I have noticed even before the Florida update, incremental and smaller bits of batch processing are likely being done more frequently by Google on certain search terms that increase in popularity much faster, such as a major news event, for example when the US captured Saddam Hussein in December 2003. Such short-term events or news would qualify for the short-term “buffer” and would be processed as such by Hilltop.

More ‘standard’ and ordinary results for the longer term would be timed in with the 10,000 servers about once a month, which again, would make perfect sense. Search terms that do not qualify to kick in the Hilltop algo continue to show you the old Google ranking.

Conclusion
In concluding this topic, as Atul Gupta and I have written in some of our previous articles, webmasters and site owners need to think ‘out of the box’ if they want to thrive and continue to have sites that return favourable ROIs. As always, link popularity is even more important now than ever before.

Additionally, try to get a listing in as many directories as possible, beginning with DMOZ (the Open Directory Project). Avoid FFA (Free for All) or link farms in every respect. Those are a thing of the past and might even get you penalized.

If your budget allows it, get into a good PPC ad program, such as AdWords or Overture. You might also want to consider some good paid inclusion search engines that deliver real value for your investment.

Note that since January 15 (and as expected), Yahoo has completely dropped Google’s listings, so you may also want to look at the possibility of a paid listing in Yahoo as a safety measure. Yahoo is now taking its results from Inktomi, which is also in the Yahoo family of search properties, since Yahoo bought Inktomi last year.

Author:
Serge Thibodeau of Rank For Sales

Your Google Description
When it comes to describing your site, Google assembles what is known as a snippet description to display in its search results. Sometimes it’s a good description – one that prompts potential visitors to click on your link. Other times, it isn’t. Take the case in point where the following page (ranked at #1) in a keyword search for “scuba dive” “entices” the potential site visitor by listing the various PADI locations from around the world …

PADI – The way the world learns to dive
PADI Americas – English, PADI Canada – English, PADI Europe – English, PADI Nordic – English, PADI International Limited – English, PADI Japan – English, PADI Asia …
Description: The largest and most recognized diving organization around the world with courses ranging from Snorkeling…
Category: Recreation > Outdoors > … > Dive Organizations > Training Agencies www.padi.com/ – 9k – Dec 27, 2003 – Cached – Similar pages

Oops! …oh, well – at least their Description, taken from their editor-assigned ODP directory description, is relevant – but their snippet leaves something to be desired.

Can the snippet entice users to click?
Can the snippet be changed to entice users to click on your listing?

Of course, this is important because potential site visitors are judging whether to click or not based in part on those snippets. So, how can one go about changing Google’s snippet advantageously? Let’s take a look and see.

For starters, we’ve found that Google actually pulls the snippet description from several different places on your Web page. Let’s think about this for a minute. If we could determine where Google is pulling our description, perhaps we might be able to change that wording to “produce” a description that more accurately describes our page.

Where is Google pulling the snippet description?

Currently Google is pulling the snippet from any one or combination of the following areas:

1. META description tag (although Google doesn’t use contents to determine relevancy).
2. First ALT text found on the page.
3. First text found on the page (which may be a heading tag, body text, etc.).
4. Additional heading tags on the page.
5. Additional body text found on the page.
6. Additional ALT text on the page.
7. Navigation bar on the left-hand side of the page (which is rarely a relevant description of a site!).
8. Copyright information at the bottom of the page.
9. Wherever the keyword phrase is found.
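To see which of these areas your own page would surrender first, a small extraction sketch like the one below can help; it uses simple regexes that assume double-quoted attributes, so treat it as illustrative rather than a faithful model of Google’s parser:

```python
import re

# Illustrative only: pull out the page elements Google is said to draw snippets from.
# The regexes assume double-quoted HTML attributes; a real tool would use an HTML parser.
def snippet_sources(html: str) -> dict:
    """Return the meta description, first ALT text and first heading found in `html`."""
    def first(pattern):
        m = re.search(pattern, html, re.IGNORECASE | re.DOTALL)
        return m.group(1).strip() if m else None

    return {
        "meta_description": first(r'<meta\s+name="description"\s+content="(.*?)"'),
        "first_alt_text":   first(r'<img[^>]*\balt="(.*?)"'),
        "first_heading":    first(r"<h[1-6][^>]*>(.*?)</h[1-6]>"),
    }

page = open("index.html", errors="ignore").read()  # a local copy of the page to inspect
for source, text in snippet_sources(page).items():
    print(f"{source}: {text!r}")
```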

Important Note

One thing that’s very important to note is that the snippet is determined by the search term. In other words, if you search for your company’s name, you’ll get a different description than what you would get if you search for a keyword phrase that is relevant for your site. Generally, Google appears to be pulling the description from areas of the page that surround the usage of that particular keyword phrase. The obvious question is, Is it the first usage of the keyword phrase? Usually, but not always.

Another Important Note

Since most people aren’t going to be searching for the name of your business, don’t try to change your Google snippet description based on a search for your company name. Instead, search for the most important keyword phrase for each important page of your site, and then make changes accordingly.

Let’s look at some examples
If you search for “search engine seminars” (no quotes) at Google, you’ll find these results:

Search Engine Seminars–your path to success on the Web!
… Search Engine Seminars. Is your Web site achieving the success that you want, or that it deserves? … At our Search Engine Seminars . . . you learn by doing. …
www.searchengineworkshops.com/articles/search-engine-seminars.html – 8k – Cached – Similar pages

Here’s the first text on the page:

Search Engine Seminars

Is your Web site achieving the success that you want, or that it deserves? Are you getting any traffic? Is that traffic converting to sales? Have you considered attending a search engine seminar to learn how to take a struggling Web site and bring it to the top of the rankings?

Search engine seminars, conducted by Search Engine Workshops, are held at various locations across the globe. These seminars are totally different than attending a large search engine conference, where you listen to a speaker discuss theories from the front of the room.

At our Search Engine Seminars . . . you learn by doing

And, here’s the section of that page, which shows the META description tag:

Search Engine Seminars–your path to success on the Web!

The META description tag is obviously not being used as the snippet description for this page under the keyword phrase, “search engine seminars.” Could it be because the plural version of the keyword phrase, which is what we searched for, isn’t found in the META description tag? Possibly.

So where is the snippet being pulled from?
Here’s the snippet description again:

Search Engine Seminars. Is your Web site achieving the success that you want, or that it deserves? … At our Search Engine Seminars . . . you learn by doing. …

In this example, the snippet appears to be pulled from the first heading tag (“Search Engine Seminars” at the top of the page), followed by the first sentence in the body text, followed by the next heading tag (“At our Search Engine Seminars . . . you learn by doing . . .”). Notice that the second heading tag is not the second instance of the usage of the keyword phrase. In the second paragraph of the body text, the keyword phrase is used as a hyperlink.

So what am I going to do with this knowledge?
In this example, nothing, because the description accurately describes the Web page. I’m not going to change a thing.

If the snippet description of your page accurately describes the page, leave it alone!
(Continued in Part 2. For the complete article, write to robin@searchengineworkshops.com)

(Writer’s Note: This article offers tips for changing your Google description in order to increase the click throughs to your site. However, this has nothing to do with trying to increase your page’s search engine rankings.)

Author Bio:
Robin Nobles with Search Engine Workshops teaches SEO strategies the “stress free” way through hands-on, search engine marketing workshops in locations across the globe and online search engine marketing courses (http://www.onlinewebtraining.com). Visit the World Resource Center, a new networking community for search engine marketers. (http://www.sew-wrc.com)

Search Engine Innovation For 2004
After being blind-sided by the Google Florida update, many webmasters and SEOs were left reeling from the results. The message is clear: you can’t rely on just one search engine for all of your traffic. You must use all your wits to emerge victorious from the search engine wars. Google is important, but it is not everything. Keep your eyes and ears open to new opportunities and old standbys: other search engines and directories, paid placement and pay-per-click, newsletters, and even more traditional channels.

Wait To Change
So were you an innocent bystander caught in the onslaught of sites dumped in the Google Florida update? Many people lost their hard-earned rankings, even though they did nothing “wrong”. Many websites that followed Google’s rules for optimization to the letter were still caught up in the carnage. Unfortunately, many businesses were devastated by these changes, especially heading into the holiday months.

What to do? As difficult as it may have been to make sense of Google’s changes, for many, the simplest course of action was to simply do nothing. While perhaps contrary to a normal “it’s broken so I need to fix it” approach, for many webmasters “do nothing” has proven to be the correct course of action. Since the update, many sites that were exiled to search engine Siberia have returned to nearly their former ranking, shaken but intact. From all appearances, Google simply changed their algorithm and may not have gotten it quite right. Additional “tweaks” subsequent to the Florida update seem to have brought some sanity back to their results.

Who Will Stay Tops In The Search Engines?
You never know who will become the leader in search engines. It was only a few years ago that directories were the major force – until the upstart search engine Google came along. Google got its start about five years ago and hasn’t looked back. As long as Google provides good results for its users, it is in a good position to stay on top. However, with MSN working on the creation of its own search engine and Yahoo’s acquisition of Overture (which includes AllTheWeb and AltaVista), things could get interesting in 2004. Microsoft is always a force to be reckoned with, and Yahoo certainly has the tools to become a major competitor to Google.

Inktomi’s New Role
Inktomi may play an important role in this growth since it is now owned by Yahoo. Keep an eye on this engine: it provides secondary results for MSN and will probably replace Google in supplying primary results in Yahoo. Inktomi’s importance may also increase in MSN once the Microsoft property stops using LookSmart for its primary results.

To see which pages you have listed in Inktomi, use the Inktomi Pure Search function from Positiontech. Inktomi often adds a few free pages to its databases. Check first to see which pages you may already have in their database for free before using Paid Inclusion for your most important pages.

Other Ways To Promote Your Website
Keep your eye on search engine news. Google was an up-and-coming engine a few years ago; you never know what will happen in the industry, so stay on your toes. Continue to promote your website through links in topical directory listings. Search for websites that contain topics related to yours, and link when it “makes sense”. Don’t forget traditional means of marketing your website: print ads, brochures, magazine articles and more may help to make a difference. One of the best ways to promote yourself online and increase your link popularity is to write articles on your subject. Find websites that accept free content and submit your ezine, articles or newsletters to those websites to build your link popularity. Newsletters, forums, FAQs, blogs and tips on your subject are all viable means to inform your visitors and bring in new traffic to your website. Don’t forget to archive your newsletters and articles on your website, which works to build your site size and increase link popularity through your authoritative knowledge of your subject. Not a writer? Consider working with a copywriter to help build your good content.

Paid Inclusion And Pay Per Click
If you haven’t ventured into using Paid Inclusion or PPC services, consider using them to help balance the changes in your traffic. Use a Paid Inclusion subscription for your most important pages, or submit dynamically generated pages that aren’t being picked up by the search engine robots so they will appear regularly in the search engine database. You can start your PPC bidding in small doses. Look for some of the secondary, smaller terms that don’t cost as much but will still bring in traffic your competitors may miss. Take a look at some of the smaller PPC engines available out there; a little traffic from a lot of places can add up.

For more information on choosing keyword phrases, read our article Finding Targeted Keyword Phrases Your Competitors Miss.

Content, Content, Content
The biggest mistake I see webmasters make is creating a website with little content. Don’t rely on a few paragraphs of text with optimization to convince search engine robots to stick around. A skeleton website does not make a good impression on anyone. Build the content of your website. Google’s new algorithm may be a sign of search engine robots getting a little smarter when it comes to understanding what your website content is about. Build information that will keep your visitors at your website. Become an authority on your subject so other websites will naturally link to you because your information is invaluable. Remember, Google is interested in serving those who use its search capabilities, just as you should be interested in serving your visitors. Give your visitors as much real content as you are able to; they will thank you with return visits.

And In The End…
In the end, the information you give is often equal to the response you receive. Make the effort to become an authority site on your subject. Building the groundwork of your website with quality information and broadening your methods of marketing will help sustain you during the upcoming search engine wars.

Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing, a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.

Keyword Analysis & Ranking: The Value of Brainstorming
Conducting a good keyword analysis is the first step in an effective search engine marketing campaign. If you don’t choose good keywords, all efforts to boost your ranking will be wasted.

And the very first step within your keyword analysis is brainstorming. Before you begin focusing on keyword popularity, you need to think outside the box and develop a good list of words.

The Rules of Brainstorming
Brainstorming is a process for generating the broadest possible list of keywords for your web site, without yet judging whether they are good words. The cardinal rule of brainstorming is not to pass judgment on the ideas as they come up. Doing so can shut off our thinking, and at this stage in the process you want to encourage as much lateral thinking as possible.

When we get to the detailed analysis phase we’ll establish a detailed keyword ranking based on each word’s search popularity and relevance. At this point we’ll reject many of the words on our brainstorming list, but early on we want to be open to new ideas. Too early a focus on ranking keywords can block creative thought.

An Outside Perspective
I try to accomplish two things when I brainstorm keywords with clients. First, I’m trying to learn the language of their industry. Second, I’m trying to make sure they aren’t trapped by their own industry jargon.

Let’s talk about the second point first. It’s the classic marketing conundrum: the purpose of your marketing group is to keep the company focused on its customers and understand how they think. Yet a group of people who work in the same office, day in and day out, inevitably develop their own jargon. It’s easy for them to assume their customers speak the same language, when they may not.

A good example of this is a keyword review I recently performed on a site that sold CD-ROM devices. The list of keywords used included the terms “CD copier,” “CD duplication,” “CD replication.” Yet this list missed the term I and most other consumers use: “CD burner,” as in “Hey, Bob, could you burn a CD for me?”

Disconnects like this are more common than you might expect. Sometimes the causes are generational.

For example, ten years ago if you asked a 9-year-old what a razor was, they’d have told you it was something dad uses to shave his face. Today there’s a high probability you’ll be told a razor is a scooter (try it – I did with my 9-year-old and her friends and was told 100% of the time that a razor was a scooter!).

I once worked with a manager who insisted on referring to his product – which tracked web site uptime – as “accessibility” monitoring. The trouble is, “accessibility” is a well-established term in the web design world for ensuring your web site is readable by people with disabilities.

A good brainstorming process helps spot issues like this. As a client, it’s important to be open to a fresh perspective from outside your company.

Learning Your Lingo
The other thing I try to accomplish during brainstorming is to learn my client’s unique language. While there is sometimes a disconnect between the company’s language and their customers’ language, at other times this jargon is used by everyone in the industry – including the searchers you’re trying to target.

This is especially true if your company offers professional services. In these cases your search terms may be highly specialized. For example, a CPA specializing in corporate tax preparation might use terms like “apportionment” or “franchise tax.”

Understanding the appropriateness of these terms is important, because while these highly focused terms may not receive a huge volume of traffic, they are often your most relevant search terms. Highly relevant search terms are the ones that generate the most business.

When working with clients, the brainstorming process is an important education process for me where I learn the language of their profession.

Oftentimes the jargon you and your customers use may need to be amended slightly to reflect the way those same customers use a search engine. This is most true when you’re talking about acronyms.

Let’s face it, Americans are acronym crazy. Every industry has its unique set of acronyms. If you work on enough projects, sooner or later you’ll see the same acronyms used to mean very different things.

Take one simple example: does “ADA” mean “Americans with Disabilities Act” or “American Dental Association,” or perhaps the “American Diabetes Association?” Search for ADA in Google and you’ll turn up hits on each of these definitions.

In my experience, people who search for acronyms get very unsatisfying results, and quickly refine their search by adding additional words. So a searcher might change their query to “ADA compliance” to get more relevant results.

As an SEO, it’s important for me to understand what additional words might be coupled with your industry’s acronyms to create this type of refined search.

This understanding of how people use a search engine is one of the things a search engine marketing specialist brings to the table. During the brainstorming process, it’s common for us to expand some of the search terms suggested by the client to reflect search behavior.

Sources of Brainstorming
All of this is good, but how do you get past writer’s block when building your keyword list?

One of the best steps is to read your existing web site and your sales collaterals. A careful reading will often pick up alternate phrasing of common terms, and these should go into your list.
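
If you want to go beyond eyeballing the copy, a short script can tally which two-word phrases actually appear most often in your site’s text. The sketch below is a minimal example, assuming you have your HTML pages saved in a local folder; the folder name and the stop-word list are placeholders you would adapt to your own site.

```python
# keyword_harvest.py -- a minimal sketch for mining your own site copy
# for candidate keyword phrases. The "site_copy" folder and stop-word
# list are illustrative placeholders.
import glob
import re
from collections import Counter
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects the visible text from an HTML page, ignoring scripts and styles."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = False
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip = True
    def handle_endtag(self, tag):
        if tag in ("script", "style"):
            self._skip = False
    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

STOP_WORDS = {"the", "and", "for", "with", "your", "our", "you", "that", "this", "are"}

counts = Counter()
for path in glob.glob("site_copy/*.html"):   # hypothetical folder of saved pages
    parser = TextExtractor()
    with open(path, encoding="utf-8", errors="ignore") as page:
        parser.feed(page.read())
    words = re.findall(r"[a-z]+", " ".join(parser.chunks).lower())
    # Count two-word phrases, since single words are usually too generic.
    for first, second in zip(words, words[1:]):
        if first not in STOP_WORDS and second not in STOP_WORDS:
            counts[f"{first} {second}"] += 1

for phrase, n in counts.most_common(30):
    print(f"{n:4d}  {phrase}")
```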

If this works for your web site and collaterals, it also works for your competitors’ web sites. I’m not recommending that you lift another site’s list of keywords intact, but seeing how other people in the industry phrase things can be useful in opening up your own thinking. Industry trade magazines and product reviews are great sources, too.

And don’t forget the thesaurus. If you’re stuck building a list of “real estate” words, a simple thesaurus will tell you to think of “property” and “realty” words. Programs like WordTracker can also help uncover related terms that you may not have considered.
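
Once the thesaurus has given you your root words, the expansion itself can be done mechanically. The sketch below simply combines roots, qualifiers, and locations into candidate phrases; every word list in it is illustrative, and a tool like WordTracker would still be needed to tell you which combinations people actually search for.

```python
# A small sketch of mechanically expanding a seed keyword list.
# All of the word lists below are illustrative, not recommendations.
from itertools import product

roots = ["real estate", "property", "realty"]          # from the thesaurus
qualifiers = ["", "listings", "agent", "for sale"]      # common add-ons
locations = ["", "dallas", "fort worth"]                # hypothetical markets

candidates = set()
for root, qual, loc in product(roots, qualifiers, locations):
    phrase = " ".join(part for part in (loc, root, qual) if part)
    candidates.add(phrase)

for phrase in sorted(candidates):
    print(phrase)
```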

Finally, don’t forget that your customers are a great source of keywords. This can be both a source of alternate terminology, and another check against the jargon trap. If you can’t run a focus group or conduct customer interviews, then talk to the people who interact with your customers. That means talking to your sales force, your user support staff, or anyone who has regular customer contact.

Brainstorming is all about thinking outside the box. The more effort you make to interact with new people inside and outside of your industry, the better job you’ll do breaking down the barriers to creative thinking, and the better set of keywords you’ll have for your web site.

Author Bio:
Christine Churchill is President of KeyRelevance.com, a full-service search engine marketing firm. She is also on the Board of Directors of the Search Engine Marketing Professional Organization (SEMPO) and serves as co-chair of the SEMPO Technical Committee.

Creating your ads for AdWords
Two of the most important factors of any Pay Per Click (PPC) campaign are creating successful ads and deciding how much to pay per click. There are many PPC options out there to choose from; I am going to focus on the two most popular, Google AdWords and Overture.

Creating your ads for AdWords
Creating your ad copy is the single most important part of any ad campaign. You want your ad to stand out amongst the others and scream out ‘click me!’ If your ad looks and says the same thing as everyone else’s, users will simply pass it by.

Before creating your ads you need to determine your target market and keyword selections. If your company focuses on a specific market niche, try to target your ads to that niche. Properly targeted ads will almost always outperform those directed at a general audience.

When creating your first ad, be sure to fit your main keywords into either the title or the beginning of the body text. Draw attention with call-to-action phrases and words that provoke enthusiasm and response: “Save on DVDs,” “Get cheap stereos,” “Join now for a 20% discount,” and so on. Just be cautious: if you advertise something that you don’t offer, Google will pull your ad. If your ad says you have something for free, you had better have something for free listed on your landing page! Always be sure to follow Google’s Guidelines.
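
One practical wrinkle is that AdWords enforces strict character limits on each part of a text ad. The sketch below is a quick length check; the limits shown (a 25-character headline and two 35-character description lines) are my assumption based on the classic text-ad format, so confirm them against Google’s current guidelines, and treat the sample ad copy as purely illustrative.

```python
# A quick sanity check that ad copy fits within AdWords' length limits.
# The limits below are an assumption based on the classic text-ad format;
# verify them against Google's current editorial guidelines.
LIMITS = {"headline": 25, "line1": 35, "line2": 35, "display_url": 35}

ad = {
    "headline": "Save on DVDs",
    "line1": "Huge selection of new releases.",
    "line2": "Join now for a 20% discount.",
    "display_url": "www.example.com",
}

for field, text in ad.items():
    over = len(text) - LIMITS[field]
    status = "OK" if over <= 0 else f"{over} characters too long"
    print(f"{field:12s} ({len(text):2d}/{LIMITS[field]}): {status}")
```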

Once you are happy with your first ad, create three more ads that are radically different from the first. After three or four days, take a look at how your ads are doing. (If you are using less frequently searched terms, you may have to wait one to two weeks for meaningful results.) Check the click-through rate (CTR) of each ad. In most cases one of the four will be outperforming the rest. If so, delete the poorly performing ads and create three new ads that closely resemble the successful one, each with subtle differences in the title and body text.

Again, wait three or four days to see which of the ads is outperforming the rest. If you again notice that one stands out, repeat the process. Eventually you will end up with four quality ads that are performing equally. Once the ads have leveled out, continue to keep an eye on them; I recommend checking daily. If one begins to slip, slightly tweak the wording. You must always keep an eye on your ads if you wish for them to continue performing well.
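
The comparison itself is nothing more than dividing clicks by impressions for each ad and keeping the winner. Here is a minimal sketch, with made-up numbers for four hypothetical ads:

```python
# A minimal sketch of the CTR comparison described above.
# The impression and click counts are invented for illustration.
ads = {
    "Ad A": {"impressions": 4200, "clicks": 38},
    "Ad B": {"impressions": 3900, "clicks": 81},
    "Ad C": {"impressions": 4050, "clicks": 29},
    "Ad D": {"impressions": 4100, "clicks": 44},
}

def ctr(stats):
    return stats["clicks"] / stats["impressions"]

best = max(ads, key=lambda name: ctr(ads[name]))
for name, stats in sorted(ads.items(), key=lambda kv: ctr(kv[1]), reverse=True):
    print(f"{name}: {ctr(stats):.2%} CTR")
print(f"Keep {best}; rewrite the others as close variations of it.")
```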

Determining your Max Cost Per Click with AdWords
With AdWords, when you enter your max CPC, Google shows you an estimated average position for each keyword. (The position predictions are based on historical data from previous advertisers and are not 100% accurate, but they will give you an idea of what to expect.)

Unfortunately there is no way to see what the competition is paying, so in most cases it’s a bit of a guessing game in the beginning. I suggest starting out with a max CPC slightly higher than you normally would; this will give you a slightly higher ranking and increase your chances of accumulating clicks. If your ad performs really well, your rank will increase. As you begin to establish a good click-through rate (CTR), you can adjust your max CPC to reflect the position you wish to obtain. (See part one of this article to find out how Google ranks ads.)

Creating your ads for Overture

With Overture, writing the perfect ad is slightly different than with AdWords. Overture only allows you to create one ad per keyword, which takes away the option of trying out various ads and going with the obvious winner; however, the basics of creating your initial ad remain virtually the same. After you have selected your target market and main keywords, write a specific ad targeting each individual keyword, and be sure to include the keyword in the title or at the beginning of the main body text, along with a call-to-action phrase or something that is sure to draw attention. Check the status of your ads on a weekly basis, keep an eye on your click-through rate, and regularly tweak poorly performing ads.

Determining your Max Cost Per Click with Overture
Deciding how much to spend on Overture is simple: take a look at what the competition is spending, and outbid them. With Overture you should always try to be in the top three if you wish to have your ad distributed among partner sites (Yahoo, Lycos, MSN, etc.). If the number one spot is currently paying 25 cents per click, you need only bid 26 cents to grab it. If you want the number one spot but are also willing to pay more, you can bid 40 cents and will still be charged only 26 cents: one penny above the competition. Keep in mind, though, that if someone else increases their bid, your actual cost will also increase, up to the max CPC you have entered.
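
The arithmetic is worth seeing laid out, because your bid and your actual cost are not the same thing. The sketch below assumes a simple one-cent-above-the-next-bid auction of the kind described above; the competitor bids are hypothetical.

```python
# A worked sketch of Overture-style bid arithmetic: you set a max CPC,
# but you are only charged one cent more than the bid directly beneath
# yours. The bids here are hypothetical.
def actual_cpc(my_max, competitor_bids):
    """Return (position, cost per click) under a one-cent-above-the-next-bid rule."""
    beaten = [b for b in competitor_bids if b < my_max]
    position = len(competitor_bids) - len(beaten) + 1
    cost = (max(beaten) + 0.01) if beaten else my_max
    return position, min(cost, my_max)

competitors = [0.25, 0.18, 0.12]          # current bids on the keyword
for my_max in (0.26, 0.40):
    pos, cpc = actual_cpc(my_max, competitors)
    print(f"Max bid ${my_max:.2f} -> position {pos}, actual cost ${cpc:.2f} per click")
```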

Summary
Managing an AdWords or Overture PPC campaign can be confusing at first, but it doesn’t take long to get a handle on what works. Creating a highly successful ad the first time around with either AdWords or Overture is a rare occurrence, but with a bit of regular maintenance and a well-targeted campaign it won’t take long to start seeing results.

Author Bio:
Aaron Turpen is the proprietor of Aaronz WebWorkz, a full-service online company catering to small and home-based businesses. Aaronz WebWorkz offers a wide variety of services including Web development, newsletter publishing, consultation, and more.

Banned from Google?
Many web site owners are under the false impression that if they can’t get their web site listed on a search engine, then they must be banned from that engine. Banned from Google? How will my site ever get seen? Google, the world’s most popular search engine, handles over 60 percent of all searches. Losing that level of exposure is life-threatening for a web site. There are several indicators web site owners should be aware of in order to troubleshoot the position of their site. For instance, I have submitted hundreds of sites, and after two or three weeks the home page will usually show up in the search engine listings. Some sites can take twice as long. A rule of thumb that I go by: if you can’t find your home page on a search engine a month after the original submission date, something may be wrong with your site.

If you think your site might be banned, then you have to ask yourself some fundamental questions.

1. Is your domain newly purchased, with NO previous owners? If the site is new, then there should be no reason to worry about being banned. If after a month your site is still not listed on the engine, I would consider having a web site optimization company optimize your home page.

2. Are you employing any TRICKS? Did you do anything that could have resulted in your site being banned? In other words, are you trying to trick the engine into falsely ranking your site higher than it should actually be? Are you link farming, using doorway pages unwisely, using hidden text, cloaked pages, deceptive redirects, plagiarized content, etc.? If so, these tricks need to be corrected before resubmitting the site to a search engine. (Please see my article on Spamming the Engine and The Ultimate Cost for more detailed information.)

3. Did you purchase this domain from a previous owner, and is it possible that the previous owner got the domain banned from the engine? The previous owner may have tried to trick the engine into falsely ranking the domain and subsequently gotten it banned. If this is a possibility, then you need to troubleshoot further.

Are there a large number of broken links on your home page and throughout your web site? If so, these will definitely have to be fixed before an engine or directory will include the site in its index. If you are not sure about broken links, you can purchase software that will scan your site, or have a search engine promotion company check your web site for broken links.
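
If you’d rather not buy software, a rough check can be scripted with nothing but Python’s standard library. The sketch below fetches one page, collects its links, and reports any that fail to respond; it is deliberately simplistic (no crawling beyond the first page, no politeness delays), and the start URL is a placeholder.

```python
# A bare-bones broken-link check using only the standard library.
# Real link-checking tools handle far more cases; this is only a sketch.
import urllib.request
import urllib.error
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []
    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

start_url = "http://www.yourdomain.com/"   # replace with your home page
page = urllib.request.urlopen(start_url, timeout=10).read().decode("utf-8", "ignore")
collector = LinkCollector()
collector.feed(page)

for href in collector.links:
    url = urljoin(start_url, href)
    if not url.startswith("http"):
        continue                           # skip mailto:, javascript:, etc.
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            status = resp.status
    except (urllib.error.HTTPError, urllib.error.URLError, OSError) as exc:
        status = exc
    print(f"{status}  {url}")
```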

4. Is your site optimized for loading? Is your web site full of graphics and taking a long time to load? If dial-up users are not waiting for your pages to load, then neither will the search engine spiders. An industry consensus and a good rule of thumb is to keep each page to about 30KB.
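
A quick way to measure yourself against that guideline is to download the page and look at its raw size. The sketch below checks only the HTML itself, not images or style sheets, and the URL is a placeholder.

```python
# A quick check of raw page weight against the ~30KB rule of thumb
# mentioned above. Only the HTML is measured, not images or CSS.
import urllib.request

url = "http://www.yourdomain.com/"        # replace with the page to test
html = urllib.request.urlopen(url, timeout=10).read()
size_kb = len(html) / 1024
verdict = "within" if size_kb <= 30 else "over"
print(f"{url} is {size_kb:.1f} KB of HTML ({verdict} the 30 KB guideline)")
```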

5. Are you using your robots.txt file properly, if at all? Try entering your site into your web browser like this: http://www.yourdomain.com/robots.txt. If a 404 error (“The page cannot be found”) is displayed, then you’re not using one. If content comes up, be sure you understand what you’re looking at and that you are not blocking spiders from reaching your home page.
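
Python’s standard robotparser can run the same check for you. The sketch below reports whether a robots.txt file exists and whether it would block a generic crawler from your home page; the domain is a placeholder.

```python
# A short sketch of the robots.txt check described above, using the
# standard library's robotparser. The domain is a placeholder.
import urllib.request
import urllib.error
from urllib.robotparser import RobotFileParser

site = "http://www.yourdomain.com"        # replace with your domain
robots_url = site + "/robots.txt"

try:
    with urllib.request.urlopen(robots_url, timeout=10):
        print(f"Found {robots_url}")
except urllib.error.HTTPError as exc:
    print(f"No robots.txt found (HTTP {exc.code}); nothing is being blocked by it")

parser = RobotFileParser(robots_url)
parser.read()                              # a missing file is treated as allow-all
if parser.can_fetch("*", site + "/"):
    print("Home page is crawlable for a generic user-agent")
else:
    print("Warning: robots.txt is blocking your home page")
```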

If you think there is a good possibility your site is actually banned due to unethical marketing and promotion practices, then do the following to verify.

* If a couple of months have passed since the first submission of your home page and your domain returns zero results on a search engine results page (SERP), that is a very good indication that your domain has been cleaned out of the index.

* You may also use the Google toolbar (http://www.toolbar.google.com) to check the “page rank” or PR value of your site. A zero page rank (PR0) doesn’t necessarily mean you’re banned, but it is another possible indication, and further troubleshooting will be required.

* Try to get a couple of web sites to link to yours and resubmit the pages your links appear on to Google. Wait a few weeks, then search for the site that is linking to yours using your keywords and domain together. If you can locate those submitted pages and your domain still doesn’t come up, there is a good chance your site may be banned.

* Another option is to look for your listing in the Google version of the Open Directory Project (directory.google.com). Verify that your site is even listed in the ODP. If the search comes up blank, there is a chance you may have been banned or penalized.

After you have corrected all possible errors and your site is still not listed on Google, you can try writing to Google at search-quality@google.com. Be sure to ask about the specific domains. Don’t expect a quick response, and try to wait at least a couple of weeks before asking for an update. Google is very busy with hundreds of these types of requests, and you wouldn’t want them to put you at the bottom of the pile for being too annoying. By all means don’t try to sneak any tricks past the engineering team at Google; they’ve seen them all, and you’ll definitely hurt your chances of ever getting indexed again. If you do not have the time to do the legwork, hire a reputable search engine optimization company and let them handle your domain’s predicament.

Author Name:
Alex Skorohodov of KosmosCentral