Introduction
By now, if you market on the web, you know all about the Florida update, and have likely heard of the Austin update.

But in case you haven’t – a little reminder: Back in November 2003 we began to notice a fairly major change occurring in the Google index. In fact, if we go back to May 2003 we can discern hints of the changes (targeting affiliate sites and so forth). Back then no one knew what was coming our way.

Recently, another fairly major update, named Austin, occurred and even more sites were removed from the index. So many people are asking why. I thought I would throw in my two cents.

Florida marked a major change in thinking at Google, an attempt to return the engine to supremacy. In doing so, however, Google harmed a lot of sites that were playing fairly and by the rules. This collateral damage caused many to surmise that Google was broken and wouldn’t be fixed.

We developed our own theory
We developed our own theory fairly early on with regard to Florida – that Google had implemented a context-based algorithm. Indications are still there that this is the case. We predicted then that results would improve, and this seems to be happening.

You see, the next thing Google had to do once they had filtered out some sites, was to recalculate back links and PageRank. This happened in December. We were noticing some PageRank fluctuations on our own site during this time. It has since leveled out.

So once Florida happened, Google had to recalculate PageRank to adjust for the sites that were filtered. Since those sites were removed, their influence on back links was removed as well.

Google Austin Update
Now January comes – the Austin update. And what does Google do? They refine the same algorithm that caused the Florida update. Nothing major – more a tweak to allow some of the innocent sites that were previously dropped back into the SERPs. These are the results we are seeing now – some semblance of where Google will be in the next couple of months.

I don’t see this as the end though. Likely, in February another back link check will occur and those sites which were removed in January will lose their link influence in February. Then in March we should see a more stable index. True to Google’s nature – it will have been 90 days since the Florida update.

I bring up this figure because traditionally after a major Google change, it takes about 90 days for the Google index to stabilize and start returning more relevant results.

Be patient
Therefore, if you are a small site whose results have just now started to come back, try to be just a little more patient. My feeling is that in the next couple of months you should recover more of what was lost in the last two – unless your site was a target. Targets include affiliate sites and sites that didn’t offer any useful content or information.

If your site is well constructed, easily spiderable, and provides lots of useful information that is not too promotional, you should start seeing a rise in results. You may want to increase the amount of content you have, but be sure that it is informational in nature. Take a look at the average site ranking for your key phrases – how many pages do those sites have? You will probably want to fall somewhere in that range. Also, while you are analyzing your competitors, take a look at what kind of back links they have. Perhaps you can get links from those sites as well, as long as they are related to your industry. Remember that if your PageRank and site size are comparable to those currently ranking, you should also be considered an authority on the topic.

Summary
So as we begin February, remember that the Google ride isn’t over. We have just entered the phase where we are coming down the last hill near the end of the roller coaster ride. The end is near, but it’s not here yet.

Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists

Introduction
This article takes an in-depth look at some of the innovations introduced at ExactSeek.com, including, most recently, the implementation of a unique ranking algorithm that factors in Alexa traffic data.

Search engines basically cater to two audiences – webmasters and web searchers. Although there is an obvious overlap between the two groups, the latter dominates in numbers and most search engines bend over backwards to keep web searchers happy by continually tweaking search result algorithms. Webmasters, on the other hand, are more or less left to fend for themselves.

ExactSeek was founded 2 years ago on the premise that webmasters/site owners are the driving force on the Web – not in numbers, but in terms of ideas, development, enthusiasm and innovation. The goal for ExactSeek, then and now, was to build a major engine that catered to this audience and web searchers more or less equally. Part 1 of this article deals with ExactSeek’s focus on webmasters.

Free Site Listings
ExactSeek’s belief in the importance of the webmaster audience to search engine growth and traffic resulted in the rapid introduction of three types of free site listings, each geared to helping webmasters give their websites maximum exposure within the search engine.

Here’s a short explanation of each:

Standard listings: As the name implies, this is just a basic website listing in the ExactSeek database, consisting of a Site Name (taken from the Title tag), a description (taken from the Meta Description tag) and a Site Link (the website URL).

These basic listings show up numbered 1 to 10, etc. in ExactSeek’s search result pages. Any webmaster can obtain a Standard Listing simply by filling out the form at: http://www.exactseek.com/add.html

Enhanced listings: Not strictly a website listing – really more of an enhancement to the standard listing. In brief, an enhanced listing consists of a small icon residing next to a website’s standard listing which, when clicked, opens a 150×150 JavaScript window with additional information about that website. The cool thing about the enhanced listing is that webmasters can add up to 800 characters of text or HTML code to the content of the window, allowing them to:

– Embed audio, video or Flash.
– Include logo, product or service images.
– Include links to secondary site pages.
– Use font enhancements (size, color, style).
– Include more detailed information about their websites.

Simple examples can be seen at:
http://www.exactseek.com/enhanced.html

Priority listings: Introduced for those webmasters who wanted something in return for adding an ExactSeek search form to their websites. These listings appear in the first 10 to 100 ExactSeek search results, highlighted against a light blue background. Visit the link below to see an actual example:
http://exactseek.com/cgi-bin/search.cgi?term=perl

Obtaining a priority listing is a simple 3 step process outlined at:
http://www.exactseek.com/add_priority.html

The three types of listings noted above offer any halfway knowledgeable webmaster a means to maximize site exposure with minimal effort. Best of all, any site meeting ExactSeek’s submission guidelines is added within 24 hours. No pointless weeks or months of waiting: in 24 hours, websites are either indexed or not. If not, the most common reasons are:

1. The site submitted lacked a Title tag. The ExactSeek crawler ignores sites without Title tags.

2. The site submitted was pornographic or contained illegal content.

3. The site submitted was dynamically generated and its URL contained non-standard characters like question marks (?), ampersands (&), equal signs (=) percent symbols (%) and/or plus signs (+).

4. The submitting webmaster failed to respond to the submission verification email message sent out by ExactSeek. Or, as is becoming more and more common, the webmaster failed to receive the message due to spam and ISP filtering and, thus, could not confirm submission. Webmasters using AOL, Hotmail and Yahoo email addresses may soon find it impossible to have their websites added to any search engine using a verification system.

The Do-It-Yourself Approach
One of the most irritating things about many search engines is that it can take weeks or even months for free website submissions to be indexed. And once sites have been added, it can take weeks for changes to content and/or tags to be re-indexed and for webmasters to see if those changes had a positive effect on site ranking.

ExactSeek opted to put a measure of control back in webmaster hands by introducing a do-it-yourself approach. Early on, two simple online tools were made available which allowed webmasters to quickly check the positioning of any website in the ExactSeek database for any keyword(s) relevant to that site and then, if necessary, do something about it.

(a.) Site Ranking Tool
Tracks the top 10,000 site rankings for all keywords in the ExactSeek database, allowing a webmaster to find his site ranking for any keyword(s) relevant to his website.

(b.) Web Crawler Tool
Allows webmasters to schedule recrawls of their websites as often as once per week. Self-scheduled recrawls and instant site ranking checks provide webmasters with a quick read on which optimization strategies work and which don’t.

Both of the above tools can be found with additional explanation on the following page:

http://www.exactseek.com/srank.html

Do-It-Yourself – The Next Step
More recently, ExactSeek implemented a simple free membership system that takes the do-it-yourself approach one step further. Any webmaster who has successfully submitted a website to ExactSeek is automatically a member and can obtain a Member ID and Password for login purposes by using the appropriate form at the ExactSeek Member Login page.

After logging in, webmasters can access a Member Account Manager which allows them to edit, enhance, delete and/or recrawl their site listings as often as they like. Revolutionary? Maybe not, but a big step away from total reliance on search engine scheduling.

Part 2 of this article will look at how traffic data from Alexa Internet was incorporated into the new ExactSeek ranking algorithm, the algorithm’s impact on delivering quality search results, and how ExactSeek’s new paid inclusion program stacks up against other paid inclusion and PPC programs.

Author Bio:
Mel Strocen (c) 2003. Mel Strocen is CEO of the Jayde Online Network of websites. The Jayde network currently consists of 12 websites, including ExactSeek.com (http://www.exactseek.com) and SiteProNews.com.

Optimizing Dynamic Sites
Dynamic content is delivered to the Web browser in a different form than it exists on the server, while static content is stored on the Web server in the same format that is delivered to the Web browser. Dynamic site pages are generated from a database “on the fly” as users request them.

You can often tell when you are looking at a dynamically generated page, because dynamic URLs usually contain a “query string” beginning with a question mark (?), while static URLs do not. There are exceptions to this rule, however, which we shall discuss below.
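To make the distinction concrete: a URL like http://example.com/store.asp?cat=7&item=42 is almost certainly dynamic, while http://example.com/store/widgets.html looks static. As a minimal illustration (the URLs and function name here are hypothetical), a few lines of Python can flag likely dynamic URLs by checking for a query string:

    from urllib.parse import urlparse

    def looks_dynamic(url: str) -> bool:
        """Heuristic: a URL carrying a query string is probably built on the fly."""
        return bool(urlparse(url).query)

    print(looks_dynamic("http://example.com/store.asp?cat=7&item=42"))  # True
    print(looks_dynamic("http://example.com/store/widgets.html"))       # False

As noted above, this is only a heuristic – some dynamic pages are served at static-looking URLs, and vice versa.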

Search engines have a hard time with dynamic URLs. Dynamic URLs may cause search engines to mistake a small site for a very large one because an unlimited number of URLs can be used to provide essentially the same content. This can cause search engine spiders to avoid such sites for fear of falling into “dynamic spider traps,” crawling through thousands of URLs when only a few are needed to represent the available content. Here’s how three popular search engines handle this:

* The FAST search engine will crawl and index dynamic URLs as quickly and easily as static ones (at this time).
* AltaVista doesn’t crawl dynamic URLs at all, but it will index each dynamic URL that you take the time to submit individually.
* But these two search engines are relatively insignificant. Google will crawl dynamic URLs at about a third of the speed and depth at which it indexes static pages, and will barely crawl at all if there are session IDs in the query strings, because it will soon discover that multiple URLs lead to the same page and regard the site as being full of duplicate content.

Another challenge dynamic sites throw at search engines is serving up different core content at the same URL. This might result when a site has content that may be viewed at the same URL in multiple languages, depending on the browser settings, or content, such as on a news site, which changes every few minutes.

Search engines want to be accurate: they want visitors to a particular URL to see the same content the spider saw. They also want to be comprehensive, and they vie with each other to have the largest database. Thus, they have billions of pages to index and typically can only visit each URL once every few weeks or so (although Google is pretty good at recognizing content that changes frequently, and spidering it more often). So if a search engine indexes your English content at a given URL, it will probably not index your Spanish content at the same URL during the same indexing period.

The Solution
The solution is to give each search engine unique core content at a unique URL, and ensure that all visitors see the same core content. There are three main ways of achieving this.

1) Use static URLs to reference dynamic content. If a search engine sees a static URL, it is more likely to index the content at that URL than if it found the same content at a dynamic URL. There are several ways of turning dynamic URLs into static URLs, despite the fact that you are serving dynamic content. Your method will depend upon your server and other factors. A friend of mine had the following experience after implementing this solution for a client:

“For the last year, since rewriting the dynamic URLs, my client’s site has been riding high in the rankings for thousands of search terms. Before the URL rewriting, Google had indexed just about 3,000 pages in the course of 18 months; in the first week of using URL rewriting, Google was grabbing 3,000 pages per day from the 500,000-item database it had previously barely touched. By the end of the first 2 months of using URL rewriting, Google had indexed over 200,000 pages from the site.”
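The exact mechanism depends on your server (Apache’s mod_rewrite is the usual choice; see the links below), but the core idea can be sketched in a few lines of Python. This is an illustrative sketch only – the path pattern and parameter names are hypothetical, not any real site’s scheme:

    import re

    # A crawler-friendly path such as /products/blue-widget-42.html
    # secretly carries the parameters the application really needs.
    PATTERN = re.compile(r"^/products/(?P<slug>[a-z0-9-]+)-(?P<id>\d+)\.html$")

    def rewrite(path: str):
        """Recover the dynamic query parameters hidden behind a static URL."""
        match = PATTERN.match(path)
        if match is None:
            return None  # not one of ours; serve as-is or return a 404
        return {"item_id": match.group("id"), "slug": match.group("slug")}

    print(rewrite("/products/blue-widget-42.html"))
    # {'item_id': '42', 'slug': 'blue-widget'}

The rewriting layer maps the static path onto the same script that previously answered a query-string URL, so spiders see one clean URL per item while the application stays fully dynamic.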

The following sites offer instructions for two popular servers:
* Apache:
* ASP:

A good step-by-step tutorial can be found at fantomaster.com. The article links are on the right hand side. There are four articles in the series.

Here are some examples of sites that have implemented one of these approaches:
* Yahoo.com (yes, Yahoo!)
* Epinions.com
* Dooyoo.co.uk
* Pricerunner.com

URL rewriting is a very common practice. Not only is it exceptionally powerful in terms of search engine optimization, but it is also superb for usability and marketing in general. A shorter, more logical-seeming URL is far easier for people to pass on in an email, link to from their homepage, or spell out to a friend on the telephone. Shorter URLs are good business.

Solution 2
2) Link to dynamic URLs from static URL pages. The above solution is elegant, but may be difficult for some sites to implement. Fortunately, there is a simple workaround for smaller sites.

One method search engines use to crawl dynamic content while avoiding dynamic spider traps is to follow links to dynamic URLs from static URLs. If your site isn’t too large, you could build a static site map page consisting of links to dynamic URLs. The search engines should crawl those links, but will probably go no further.
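A minimal sketch of that idea, assuming a hypothetical product database: generate the site map page once as plain HTML, upload it as a static file, and let the spiders follow its links into your dynamic pages.

    # Dynamic URLs pulled from a (hypothetical) product database.
    dynamic_urls = [
        ("Blue Widget", "http://example.com/store.asp?item=42"),
        ("Red Widget", "http://example.com/store.asp?item=43"),
    ]

    links = "\n".join(
        f'<li><a href="{url}">{name}</a></li>' for name, url in dynamic_urls
    )
    page = f"<html><body><h1>Site Map</h1><ul>\n{links}\n</ul></body></html>"

    # Save as a static page (e.g. sitemap.html) and link to it from your homepage.
    with open("sitemap.html", "w") as f:
        f.write(page)

Regenerate the file whenever the catalogue changes, and keep it small enough that spiders will crawl every link on it.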

An even more effective technique is to get other sites to link to your dynamic pages. If those sites have good Google PageRank, your dynamic pages will not only be indexed, but the likelihood of their achieving a high ranking for the keywords on them will increase significantly.

Solution 3
3) Pay for inclusion? AltaVista, Ask Jeeves/TEOMA, FAST and Inktomi offer pay-per-inclusion (PPI) programs. You pay $25/page (or so) to ensure that that page is spidered frequently (Inktomi spiders every 48 hours for that price). This will garner some traffic, but since Google now accounts for over 70% of all search engine traffic and continues to grow stronger all the time, don’t throw too much money into this solution unless you have deep pockets. If your site is huge, the cost could be prohibitive.

Paying to have your pages spidered does not guarantee that they will rank well, so they must be optimized properly. Frequent spidering does, however, let you experiment with optimization and see your results within a day or two.

Search engines, including those with PPI options, want their databases to be as large as possible. So if you don’t pay for inclusion, and instead implement one of the solutions discussed above, your pages will probably be indexed anyway. On the other hand, if you pay for some of your pages to be spidered, there’s a good chance the ones you don’t pay for won’t be.

Summary
To summarize:
1. Search engines have problems indexing dynamic content.
2. If possible, use static URLs to reference dynamic content.
3. Otherwise, try to link to your dynamic URLs from static pages.
4. If your budget allows, consider using paid-inclusion programs.

Author Bio:
Rick Archer owns and operates SearchSummit, a one-man SEO company. His clients include many site owners who used to employ large SEO firms, but found that a personal, direct approach was preferable. Visit his website at http://www.searchsummit.com.

Usability & SEO
Today, I will steer slightly away from the subject of search engine optimization and instead present what should be the most important considerations when designing a new site or completely rebuilding an existing one.

While usability features are primarily intended for your end users, you should also think of the search engine robots (spiders) that will regularly come to visit your site, in an effort to index and refresh its pages.

To make a site that is friendly both to your users and to the search engines, you have to think in terms of usability. In order of priority, the five most important usability features to implement in your website are:

1. Ease of navigation
2. Simplicity and intuitiveness
3. Clearly indicated menus and links
4. A well-designed site map
5. Clean, uncluttered design

Ease of navigation
Your site should be designed so that it is easy to navigate, no matter where you happen to be in the site. Always include a ‘Back to our Homepage’ button or link, either at the top or at the bottom of the page. If your visitors get lost, as some of us do at times, clicking on the Home button or link will at least bring them back to the beginning of your site.

It’s also a good idea to place a ‘Back to the Top’ button at the bottom of each page. Also, when you are building or ‘renovating’ a website, always include a contact button or link, either at the top or bottom of all your pages. The contact page is often the only way visitors can reach you, whether by email, by phone or in person.

Make it simple and intuitive
Did you ever try a new software package or test a new application for the first time and find it easy to use and friendly? You seem to know in advance what effect every button or menu function will have, even before you click on it. Your website should have the same ‘look and feel’. Make it easy for your visitors, take every step necessary to make their experience as enjoyable as possible, and they will come back for more.

People today are always in a rush, and they have less and less time to figure out how something works. They came to your site because they typed a keyword or two into a search engine, and they expect to find what they are looking for quickly. Make sure they find it fast and they are far more likely to become repeat buyers.

Clearly indicated menus and links
I still sometimes come across websites that seem to have been designed for the person who originally built them and nobody else. What may seem self-explanatory to you may not be so simple for some users to figure out. Having a button called ‘Knowledge Base’ may not mean much to some people; if that button were simply called ‘Help’, it would probably serve them better.

A well-designed site map
A good site map has two functions: it helps your users find what they are looking for, and it helps the search engines index all of your site more thoroughly. It’s always a good idea to group your site’s pages under topics. If you are selling both physical products and services on your site, for instance, it might be a good idea to place them in two different groups. Avoid the temptation of cramming all your pages together just to get the job done faster.

Take the time to design a good site map where people will easily find the page or section that interests them. Also, make sure your site map is linked directly from your homepage. This last step is important, since one of the first things a search engine robot looks for when it arrives at a website is the site map link. Finally, make sure that your site map link is a text link. Remember that most search engines cannot read or understand JavaScript or image maps.

Clean, uncluttered design
For some people, nothing is more frustrating than arriving at a site and not having a clue what to do next or where to go. They usually leave as fast as they came, too! For that reason, avoid putting heavily animated graphics, background music or Flash movies in any part of your site. What’s more, most search engines today penalize websites that make heavy use of animated graphics or Flash movies, since they cannot read or understand them.

A clean and uncluttered design usually wins hands down. The simplest and ‘cleanest’ websites are usually the ones that are visited the most, since people know they will find what they are looking for, and they usually come back often too.

Conclusion
You are investing a lot of time, energy, resources and money in your website. Maximize that investment by keeping visitors on your site for as long as you possibly can. Once they have bought from you, and if they liked the experience, statistics show they will come back.

Give them a reason or two to come back. Offer them a free weekly or monthly newsletter in which you talk about the subjects that interest them. Another way to create a lot of interest is to regularly write ‘How To’ articles, similar to this one. Most people are interested in learning more, and the Internet is the perfect place for it.

Make your website THEIR focal point of interest by offering your visitors what truly interests them and you will largely increase your chances of success.

Author:
Serge Thibodeau of Rank For Sales

Major search engine submission
Getting your web site listed in the major search engines is an absolute necessity. Why? Because 85% of all people on the Internet use them when searching for information. When someone types your keyword or keyword phrase into a major search engine, up pops your web site – or at least you hope so.

The easiest and fastest free way to get listed in any of the major search engines is to get links from other related sites pointing to yours. Once you get your web site listed in one major search engine or directory, it will generally appear in the others as well.

Besides working on your link building strategies, you can also submit your web site to the major search engines.

Here’s how to submit to the major search engines, listed in alphabetical order:

AllTheWeb
1. AllTheWeb – owned by Overture, which also provides results for Lycos.

Free Submission
http://www.alltheweb.com/add_url.php
• Only submit your homepage and a couple of inside pages.
• It takes about 6 weeks before your pages appear.
• There is no guarantee of getting listed.

Paid Submission
• AlltheWeb doesn’t have a direct paid inclusion program but uses Lycos Insite Select http://insite.lycos.com/inclusion/searchenginesubmit.asp?co=undefined Price $35.00
• It takes 72 hours to get listed. Your pages will be visited every 2 days for 1 year. This is a good option if you want a new site to quickly get listed.

Altavista
2. Altavista – owned by Overture
http://www.altavista.com/addurl/new

Free Submission
• This gives you a basic submission.
• You can add up to 5 URLs to be considered for inclusion in 4-6 weeks.
• Submit your homepage and one or two other pages.
• There is no guarantee of inclusion.

Paid Submission
AltaVista Express Inclusion
https://www.infospider.com/av/app/signup
• Submit your homepage.
• Your pages will be listed within 2 weeks and revisited regularly.
• Price $39.00 for one URL for a 6 month subscription.
• Guarantee: Your pages could be dropped if you don’t renew your subscription.

Google
3. Google
http://www.google.com/addurl.html
Google is the top choice for searchers. Its search engine results also appear in Yahoo and AOL.

Free Submission
• It is free to submit but there is no guarantee of a listing.
• Submit your homepage and a couple of other pages.
• It takes about a month for your pages to appear.

The best way to get listed in Google is to build up a lot of related inbound links – links from other sites that point to yours. This will vastly improve your search engine rankings, and your site will probably show up more quickly.

InfoSpace & Inktomi
4. Infospace
http://www.infospace.com/info/submit.htm

• Free to submit; however, you need to fill out a form to get listed.

5. Inktomi – it was recently purchased by Yahoo, so it may provide results for Yahoo in the future.
http://www.inktomi.com/products/web_search/submit.html

Free Submission
As with all other major search engines, building links to your site is the best way to go if you want to get listed for free.

Paid Submission
• Inktomi’s search engine has an extensive network of Web search partners, including MSN, About, HotBot, Overture.
• It refreshes your site every 48 hours, to keep your content up to date in the index.
• Submit your homepage.
• Your site will be listed in 2 days and will be revisited regularly for up to one year.
• Price $39.00/year. Each URL thereafter costs $25.00/year.
• Guarantee: Your pages may be dropped if you don’t renew your subscription.

Teoma
6. Teoma – provides results for the popular web site Ask Jeeves
http://ask.ineedhits.com
Ask Jeeves owns the Teoma search engine. Teoma partners include: Ask.com, Teoma.com, Metacrawler, Excite, iLor, MySearch.com, MyWay.com, Ixquick, Mamma.com, Hotbot.com, Search123.com.

Free Submission
Like the other major search engines, get links from other web sites to point to yours and the search engines will eventually find you through their linking structure.

Paid Submission
• Ask Jeeves Site Submit (http://ask.ineedhits.com)
• Submit your homepage
• Your site will be listed within one week.
• 1st URL $30.00
• 2 – 1000 URLs $18.00 each
• Guarantee: Your pages may be dropped if you don’t renew your subscription.

Tips
1. Just submitting your web site to the major search engines is not sufficient to get a good ranking. Make sure your web site is correctly optimized before you do your submissions. Read “How to Use Keywords to Optimize Your Site for Search Engines”.

2. Getting massive amounts of traffic doesn’t guarantee sales. Your web site copy must pull your visitors through the page for them to take action. Get others to read your copy and ask for their opinions.

3. To achieve higher rankings (and therefore more traffic) in the search engines, you need to continually refine your web site. Monitor your web site rankings and if necessary, adjust the keywords in your meta tags and web site copy.

4. Search engines should be only one of your marketing strategies. One week you could be at the top of a search engine, the next week your site may disappear. Therefore don’t just rely on traffic from the search engines, but use other methods to drive traffic to your site.

Now go and submit your site to these major search engines. When you succeed in getting a top listing, as a result of taking action on this article, drop me a thank you note – I’ll be cheering for you!

Part 2 of this article will explain how to get listed in the Major Search Engine Directories.

Author Name: Herman Drost
Company: ISiteBuild.com

Author Bio:
Herman Drost is the author of the new ebook “101 Highly Effective Strategies to Promote Your Web Site” a powerful guide for attracting 1000s of visitors to your web site. Subscribe to his ‘Marketing Tips’ newsletter for more original articles at: subscribe@isitebuild.com. Read more of his in-depth articles at: www.isitebuild.com/articles

How to Get Your Web Site Listed In The Search Engine Directories
Submitting your web site to the search engine directories will give you a great boost in traffic. A listing in a large directory like the ODP (Open Directory Project) enables your web site to get indexed for free by major search engines such as AskJeeves, AOL Search, Google, and Lycos. Directories will also help increase your page ranking. However, getting listed in the search engine directories is more difficult than getting listed in the major search engines. Preparation and patience are needed.

In Part 1 we covered “How to Get Your Web Site Listed in the Major Search Engines”

So what’s the difference between the major search engines and major search engine directories?

Search engines are often confused with search directories, but they are not the same.

A search engine is a web-based software tool that enables the user to locate sites and pages on the web based on the information they contain.

Search directories use people, so you are dependent on these editors to write a description of your website that covers the keyword phrases you need. Therefore, before submitting your site to a directory, think carefully about the description that you enter.

How to prepare your web site for search engine directory submissions.

1. Research the best keywords – use the Overture suggestion tool or the Wordtracker tool to find the keywords best suited to your site. There’s no point entering words in a directory that don’t relate to your site or are seldom searched for. Include these keywords in your page content and in each of your meta tags.

2. Evaluate your site – search directories are more particular than search engines about which web sites they will list. Therefore make sure your site has no broken links (a quick automated check is sketched after this list), includes relevant content-based pages, loads quickly (within 10 seconds), is easy to navigate, and has cross-browser compatibility.

3. Prepare your title – if you are a business, enter your business name, e.g. Drost Web Design. If you don’t have a business name, use the name of your site, e.g. iSiteBuild.com. Don’t use a long title; a long title may help you get listed in the search engines, but not in the directories.

4. Write a description – remember that human beings are looking at your description, so don’t create an ad. Create a well-written description of about 25 words that clearly explains what your site has to offer.

5. Look professional – your web site should have a professional appearance. Don’t just put up a bunch of affiliate links to other sites or a one-page sales letter.

6. Submit your site – navigate through the directory to find the best category for your web site. Don’t just pick what you think is the most popular category; choose the most targeted one. An easy way to find out is to enter your search terms in the directory.

7. Be patient – it can take months or sometimes even years to get listed. This is because there are thousands of sites being submitted every day. If your site has not been listed after 3 months, then resubmit.

8. Guarantee – there is no guarantee of getting listed, as the categories may already be full. There may also be a backlog of sites to be reviewed. Therefore, concentrate your efforts on other marketing methods as well.
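On point 2 above, broken links are one of the easiest problems to catch automatically. Here is a minimal sketch using only Python’s standard library; the page list is hypothetical:

    from urllib.request import Request, urlopen
    from urllib.error import HTTPError, URLError

    # Hypothetical pages to verify before submitting to a directory.
    pages = [
        "http://example.com/",
        "http://example.com/articles.html",
    ]

    for url in pages:
        try:
            status = urlopen(Request(url, method="HEAD"), timeout=10).status
            print(url, "->", status)
        except (HTTPError, URLError) as err:
            print(url, "-> broken:", err)

A real check would also walk the links found on each page, but even this much will catch dead pages before a directory editor does.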

In Part 3 of this article we’ll discuss the detailed requirements for submitting your web site to each Major Search Engine Directory in order to attract thousands of targeted visitors.

Author Bio:
Herman Drost is the author of the new ebook “101 Highly Effective Strategies to Promote Your Web Site” a powerful guide for attracting 1000s of visitors to your web site. Subscribe to his ‘Marketing Tips’ newsletter for more original articles at: subscribe@isitebuild.com. Read more of his in-depth articles at: www.isitebuild.com/articles

Getting more Google backlinks
Search engines have struggled over the years to provide relevant, high-quality results for their users. At the same time, they have faced the “attack of the webmasters” – those determined to use every trick or tool to get their web sites to the top of the listings.

It can become a webmaster obsession (or maybe an addiction?) to “win the game” of search engine ranking.

Another option is to part with cash for a Search Marketing Company to do the work for you. This will not be instant, guaranteed or cheap as it takes time and resources.

This has forced the engines to change how they rank sites. The most important change to the rules concerns links, or link popularity – the number of web sites pointing to your site.

With Google this also means the quality of the sites linking to you. There are several ways to get links to your site:

• Exchange or reciprocal – This is when web site owners agree to place links on each other’s site.

• Affiliate Program – Having your own affiliate program is sometimes overlooked as a form of linking, but it can be not only a good form of linking but also a source of real revenue.

• Paid Linking – this is when you can pay for links in directories.

• Free Linking without return links. This is the area I want to focus on.

Some Internet marketing experts will tell you it’s not possible to get free links to your site, but there are ways that work. I will show you that they are effective and, more importantly, that Google loves them.

Articles – Writing and submitting articles to other site owners is an effective way to get links to your site.

It’s not as effective as it once was but publishers are still looking for quality, unbiased information. All you have to do is make sure you have a resource box at the end of each article, which points to your site.

Writing tips & resources
Check out Jim Edwards’ book for help on how to write articles here:

http://www.turnwordsintotraffic.htm

It has a great template which you can use. Just fill in the blank templates to format your message.

If your spelling or grammar is not the best, then a low-cost editing option might fit your needs:

http://www.yourbestwork.com/

With articles, I highly recommend writing an article and leaving it at least a day, then looking over it again before allowing it to become public.

That gives you a chance to catch any obvious spelling and grammar mistakes. Once you have your article finished, you need to find places to submit it. I have a list here:

http://www.digitalawol.com/esub.htm

But maybe you have a very specific market you want to reach?

Where can I post my articles?
Here is a step-by-step way of using Google to find good prospects in a specific market. Say you produce products or sell e-books on “weight”:

1. Search in Google, but think of the big picture. Instead of using “weight”, search for “health”, which brings up around 181,000,000 results.

2. Scroll to the bottom of the results page and click the “Search within results” link.

3. Enter “submit article” in the text box, with the quote marks, which will look for exact matches to the search term.

4. This brings up about 55,800 results, which should allow you to find at least 1,000 sites that would accept your articles. Once you have covered those, you can start again with different keywords.

Also, when you find a suitable site that accepts articles, create a folder within your bookmarks (your Favourites list in Internet Explorer) and add the site’s submission page to it, for ease of returning.

Getting free links
Other sites offer free links pages where you can place a link. A word of caution here: some sites use different programming languages in their design. They may list your site, but Google and other engines have trouble reading the resulting page, and the link may not actually point to your site but to a redirect file which then forwards to it.

The addresses of the links on the page may be something like this:

http://thesite.com/default.asp?m=gs&SiteID=39647 or

http://thesite.com/redirect?=1234

So they don’t actually count as links to your site. Look at other links on the site for a clue here, and check it out before wasting your time placing a link.

On a side note here, it may still bring you some traffic if the site has large amounts of visitors.
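If you want to check a links page in bulk, the test is mechanical: does each anchor’s href point directly at your domain, or at a redirect script? A rough sketch using Python’s standard library (the domain and HTML here are hypothetical):

    from html.parser import HTMLParser
    from urllib.parse import urlparse

    MY_DOMAIN = "example.com"  # substitute your own site's domain

    class LinkAudit(HTMLParser):
        """Flag anchors that point at redirect scripts instead of your domain."""
        def handle_starttag(self, tag, attrs):
            if tag != "a":
                return
            href = dict(attrs).get("href", "")
            label = "direct" if MY_DOMAIN in urlparse(href).netloc else "redirect?"
            print(label, href)

    LinkAudit().feed(
        '<a href="http://thesite.com/redirect?=1234">my site</a>'
        '<a href="http://www.example.com/">my site</a>'
    )

Anything flagged “redirect?” is unlikely to pass link popularity to you, though as noted above it may still send some visitors.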

Getting links from discussion forums
Another way to get links is via discussion forums. Again, you have to check that your links will count towards your site. The forum should also be related to your subject, so it’s best if it has your keywords in its name.

For example, I post regular help at Duncan Carver’s Marketing-Stratergy.org. When I post, I always use a signature like this at the end of each message:

——
Paul J Easton
Internet Marketing and Promotion from DigitalAwol.com

Custom IE (Internet Explorer) Toolbars from Createtoolbar.com
—–

And the most important words above are linked to my sites (that’s everything before the “from” on each line).

Because the forum is very active with new posts, the Google spider, which lists the pages in the search engine, returns very often (Google loves fresh content).

A small problem is that the pages created have a question mark in their URLs, like the examples above. All I need to do in this case is submit the page I posted, to ensure that Google finds and indexes it.

Within a couple of days, Google had listed my link back to my site. You can see this by going to Google and pasting:

“link:http://www.createtoolbar.com” into the search box (without the quotes), then viewing the first link!

This is a great win-win. I helped someone else out and I got links to both my sites!

Hope this helps.

Author:
Paul Easton

The Foundation
In my recent article “The Future of Search Engine Technology”, I looked at a number of developments that might improve search technology in the future. I strongly believe that we are witnessing the infancy of search engine technology, but I wanted to hear what others had to say. Today, we start a series of interviews with prominent experts, insiders and search engine developers to hear their thoughts on the future.

If you’ve been online for any length of time and work in any industry connected with the Internet, you will have heard of Robert Scoble. The Microsoft employee maintains a daily blog (when he’s not working to get the word out about Microsoft’s new “Longhorn” operating system) where he gives his thoughts on all things Microsoft, while also casting a critical eye on the competition. Scoble does a great job of keeping a distinct line between what’s “official” and what’s simply his opinion.

I was fortunate enough to catch him taking a sabbatical from his blog and asked him his thoughts on the future of search engine technology. Scoble did ask me to note that the following represents his personal opinions, not Microsoft-vetted opinions.

Search Engine Technology 1
[Andy Beal] Robert, tell me about the search engine technology being developed that you are most excited about?

[Robert Scoble] That depends on whether you’re talking about Internet searching or searching on your local hard drive. If we’re talking about the local hard drive, searching for files is still awful and getting worse.

[AB] Why do you say that?

[RS] Because hard drives keep getting bigger (a 60GB drive at Fry’s Electronics is $60 now — in three years we predict it’ll be $20 and you’ll see 500GB drives for less than $100). It’s easier to create files now than it is to find them.

Today search tools like X1 are most interesting because they index your hard drive and make it easy to search for email and files on your local drives. Microsoft Research has been working on a tool called “Stuff I’ve Seen” too, which is also quite interesting (both let you search email as well as files on your hard drive). But, these tools don’t go far enough. First, they are bolted on top of the operating system. So, while they are indexing, your system often sees slowdowns. They can’t design those to work properly with the operating system and with other applications that might need processor time.

Plus, to really make search work well search engines need metadata and metadata that’s added by the system keeping track of your usage of files, as well as letting application developers add metadata into the system itself. In a lot of ways, weblogs are adding metadata to websites. When a weblog like mine links to a web site, we usually add some more details about that site. We might say it’s a “cool site” for instance. Well, Google puts those words into its engine. That’s metadata. (Technically metadata is “data about data”). Now if you search for “cool site” you’ll be more likely to find the site I just linked to. So, you can see how Google’s engine is helped by metadata. But, we haven’t been able to apply those lessons to the thousands of files on your hard drive. That will change.

[AB] Can you explain the problems faced with searching hard drives and what Microsoft is working on to help?

[RS] What if we did the same thing on your hard drive [as Google]? For instance, look at pictures. When I take pictures off of my Nikon, they have some metadata (for instance, inside the file is the date it was taken, along with the exposure information), but that metadata isn’t useful for most human searches. For instance, what if I wanted to search for “my wedding photos”? Neither X1 nor Windows XP’s built-in search would find your wedding photos. Why? Because they have useless names like DSC0001.jpg and there’s no metadata that says they are wedding photos.

Let’s go forward a couple of years to the next version of Windows, code-named Longhorn. In Longhorn we’re building a new file storage system, code-named WinFS. With WinFS searching and metadata will be part of the operating system. For instance, you could just start typing in an address bar “W” and “E” and “D” and “D” and anything that started with WEDD would come up to the top. For instance, your wedding documents, spreadsheets, and photos.

But, WinFS goes further than X1 and other file search tools do today. It lets you (and developers of apps you’ll use) add metadata to your files. So, even if you don’t change the name of your files, you might click on one of the faces in a picture application and get prompted to type a name and occasion. So, you would click on your cousin Joe’s face, type in “Joe Smith” and “Wedding.”

Now whenever you search for wedding stuff, that photo will come up. And that’s just the start. If you imported a group of photos into a wedding album, you’d be adding metadata for the search engine to use. In other words, you’ll see a much nicer system for searching your local hard drive.

Search Engine Technology 2
[AB] It looks like Microsoft has things mapped out for offline searches, but can they compete with Internet search engines?

[RS] Now, if we’re talking about the Internet, then Google has done an awesome job so far. I use Google dozens of times a day. Will MSN [search] be able to deliver more relevant results than Google? I don’t know. Certainly that’s not the case today. Will that change tomorrow? I’m waiting to see what the brains at MSN do.

One thing I do see is that in Longhorn, search will be nicer for customers. Google is working on making its toolbar the best possible experience. We’re working on a whole raft of things too. I’m very excited about the future of search, no matter which way things go.

[AB] Let’s look beyond the next couple of years. What new developments in search do you see happening in the next 3-5 years?

[RS] For Internet searches, I see social behavior analysis tools like Technorati becoming far more important. Why? Because people want different ways to see potentially relevant results. Google took us a long way toward that future, as Google’s results are strongly influenced by how many inbound links a site has. But now let’s go further, even further than Technorati has gone. Let’s identify who really is keeping the market up to date in a certain field and give him/her more weight.

I also see that search engines that search just specific types of content (like Feedster) are going to be more important (Feedster only searches RSS and Atom syndication feeds).

Oh, and users are going to demand new ways of exporting searches. Google showed us that with News Alerts. Enter in a search term, like “Microsoft” and get emailed anytime a news source mentions Microsoft. Feedster goes further than that. There you can build an RSS feed from a search term. I have several of those coming into my RSS News Aggregator and find they are invaluable for watching what the weblogs are saying about your product, company, or market. For instance, one of my terms I built a feed for is “WinFS” — I’ll be watching to see how many people link to this article and if any of you have something interesting to say I’ll even link back.

[AB] Let’s look at your “wish list”. Assuming there were no restrictions in technology, what new feature would you like to see introduced to search engines?

[RS] I want to see far better tools for searching photos — and connecting relationships between all types of files and photos. For instance, why can’t I just drag a name from my contact list to associate that name with a face in a photo? Wouldn’t that help searching later on? In just 18 months I’ve taken 7400 photos. But I can’t search any of them very well today without doing a lot of renaming and other work.

[AB] What impact do you see social networking having on the future of search engine technology?

[RS] We’re already seeing an impact over on Feedster and Technorati. It’s hard to tell what’ll come in the future, but what would happen if everyone in the world had a weblog and was a member of Google’s Orkut? Would that change how I’d search? Well, for one, it’d make me even more likely to search for people on services that linked together social spaces and weblogs. Heck, I can’t remember my brother’s email address, but Google finds his weblog (and I can send him an email there).

One other thing I’ll be watching is how Longhorn’s WinFS gets used by application developers to build new kinds of social systems. Today if you look at contacts, for instance, they are locked up in Outlook, or another personal information management program like ECCO. But, contacts in Outlook can’t be used by other applications (particularly now because virus writers used the APIs in Outlook to send fake emails to all contacts in Outlook, so Microsoft turned those features off).

[AB] WinFS changes that. How?

[RS] By putting a “contacts” file type into the OS itself, rather than forcing applications developers to come up with their own contacts methodology.

What if ALL applications, not just Outlook, could use that new file type? What if we could associate that file type to social software services like Friendster, Tribe, Yahoo’s personals, or Google’s Orkut? Would that radically change how you would keep track of your contacts? Would that make contacts radically more useful? I think it would.

Already we’re seeing systems like Plaxo keep track of contacts, but Plaxo is still unaware that I’ve entered my data into Google’s Orkut and Friendster. Why couldn’t I make a system that’d associate the data in all my social software systems? Including Outlook?

Search Engine Technology 3
[AB] Do you foresee any problems with the WinFS approach?

[RS] Developers distrust Microsoft’s intentions here. They also don’t want to open up their own applications to their competitors. If you were a developer at AOL, for instance, do you see opening up your contact system with, say, Yahoo or Google or Microsoft? That’s scary stuff for all of us.

But, if the industry works together on common WinFS schemas (not just for contacts either, but other types of data too), we’ll come away with some really great new capabilities. It really will take getting developers excited about WinFS’s promise and getting them to lose their fears about opening up their data types.

[AB] Do you foresee a time when commercial search results (product/services) will be separated from informational search results (white papers/educational sites)? And do you think all commercial listings will eventually be paid only?

[RS] I don’t see the system changing from the Google-style results of today. Searchers just want to see relevant results. Paid-only searches won’t bring the most relevant results.

[AB] What makes you say that?

[RS] Because I often find the best information on weblogs. Webloggers are never going to be able to afford to pay to be listed on search engines.

Commercial-only listings might be seen on cell phones or PDAs, though. If I were doing a cell phone service for restaurants in Seattle, for instance, I might be more likely to list just member sites. But, thinking about it, I still don’t see such a system becoming popular enough without listing every restaurant in some way.

[AB] Speaking of cell phones. How do you see search engine technology impacting our use of PDAs and Cell phones?

[RS] I’m not sure search engine technology itself will impact them, but the mixture of speech recognition with search engines might change things a lot. When I’m using my cell phone I don’t want to look at sites that have a lot to read (I’ll save those for later, when I’m in front of a computer or my Tablet PC); instead, I want to find the closest Starbucks, look up movie listings, or find a nice place to have a steak dinner. And now that cell phones report e911 data, the system knows approximately where you’re located, so it can give you just one or two Starbucks rather than all of the ones in Seattle.

[AB] If search engine users gave up a little of their privacy and allowed their search habits to be monitored, would this allow the search engines to provide better, customized results?

[RS] Yes. I already give Google the ability to watch my search terms (I use the Google Toolbar). But, it always must be a choice. People really hate it when you don’t have strict privacy policies that are easy to understand and they hate it if you don’t give them a choice to not report anything.

[AB] Robert, you’ve certainly opened our eyes to the future of search engine technology, is there anything else you would like to add?

[RS] To echo what I said above, I hope the industry sees the opportunities that Longhorn’s WinFS opens up. We can either work together and share data with each other, or we can be afraid and keep data to ourselves. It’ll be an interesting time to watch in the next three years.

Many thanks to Robert Scoble, Microsoft employee and blogger extraordinaire. Please be sure to visit SearchEngineLowdown.com as we continue to highlight thoughts and views on the future of search engine technology.

Author Bio:
Andy Beal is Vice President of Search Marketing for WebSourced, Inc and KeywordRanking.com, global leaders in professional search engine marketing. Highly respected as a source of search engine marketing advice, Andy has had articles published around the world and is a repeat speaker at Danny Sullivan’s Search Engine Strategies conferences. Clients include Real.com, Alaska Air, Peopleclick, Monica Lewinsky and NBC. You can reach Andy at andy@keywordranking.com and view his daily SEO blog at www.searchenginelowdown.com.

Search Engine Marketing
Search engine marketing success comes from good research. Remember how homework at school often required some research on your part to complete? It is much the same scenario for search engine marketing: you need to apply yourself by researching in order to understand your competition and target audience. Your visitors need to relate to you, understand your message, and know what you want them to do.

Know Your Industry
You must spend time understanding the industry you are trying to make a profit from. If you sell widgets, know everything you can about widgets: where they come from, how they are typically sold, why people need them, etc. You can learn a great deal from your competitors. Research using industry communications such as online magazines, forums, newsletters and blogs. Read articles written by industry leaders and keep up on the latest news about your industry. The more you know, the better your basis for building an authoritative website.

What’s Your Point?
Ok, so you’ve got this product you want to sell online. What are you saying to your audience? Make sure you write in a clear, concise way so your message does not get lost in the process. Just because everyone in your company understands your message doesn’t mean the visitors to your site will. Have a few test scenarios set up; ask a few objective readers what they understand from your website and see just how much your message is getting across. You’d be surprised that what may be obvious to you is not necessarily obvious to your website visitors. If you are having trouble creating a clear message for your website, consider hiring a copywriter to convey what you want to say.

Know Your Target Audience
Who buys your product and why? Who needs the information you have on your website? Who would you like to have visit your website that isn’t already there? Who is visiting you? Are they professionals who understand your technical terms, or visitors of varying levels who all need the same information from you? What do you offer that a certain market would want from you? Take the time to get a good look at what is out there and how your competitors are presenting their information to online visitors. Use your log statistics reports to track who comes to your website. Get familiar with the keyword terms they use in the search engines to find you. Research the domains that visit you most. Find out why visitors click away from certain pages before going deeper into the content of your website. Is it a lack of information? Too many choices to click on? Is the language used or instructions given easy to understand? Don’t give your visitors a reason to leave before they understand your message.

Know Your Competition
Take a good look at your top competition and see what they are offering online. Even websites that are not direct competitors may give you ideas about what to offer your visitors. Think of it this way: someone put effort into creating those websites. Visit your competitors’ websites. Search the major search engines for your most important keyword phrases and see whether your competitors show up in the top thirty results. Learn what you can and decide whether what they are doing is something you should be doing too. It’s always good to know if your competitors are using SEO, paid inclusion, PPC, link building and other means to rank well.
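Once you have noted where everyone ranks, a short script can keep that comparison honest over time. The sketch below is purely illustrative: it assumes you have pasted the top-thirty result URLs for one phrase into a list (by hand, or exported from whatever rank-checking tool you use), and it simply reports the best position each competitor holds. All the domains shown are hypothetical.

    from urllib.parse import urlparse

    # Top-30 result URLs for one keyword phrase, gathered by hand or
    # exported from a rank checker -- hypothetical example data.
    serp_urls = [
        "http://www.widgetworld.example.com/products.html",
        "http://www.acme-widgets.example.com/",
        # ... the remaining results ...
    ]

    competitors = {"widgetworld.example.com", "acme-widgets.example.com"}

    def competitor_positions(urls, domains):
        """Map each competitor domain to its best (lowest) position."""
        found = {}
        for position, url in enumerate(urls, start=1):
            host = urlparse(url).netloc.lower()
            if host.startswith("www."):
                host = host[4:]
            if host in domains and host not in found:
                found[host] = position
        return found

    for domain, pos in sorted(competitor_positions(serp_urls, competitors).items()):
        print(pos, domain)

Run it once a month against fresh results and you have a crude but serviceable record of who is gaining and who is slipping for each phrase.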

Make Your Website Accessible
There’s nothing worse than muddling through a website looking for what you want and clicking around so much that you finally give up. Use easy navigation and make sure your information or products are easily accessible to your visitors. Write text that is easily understood so your message gets across readily. Give your visitors plenty of written information. There’s no such thing as “too much text” when it comes to search engine robots “understanding” your web pages, and what’s good for the visitor is often good for the search engine robots.

Do The Work
Research is the cornerstone of your success. The more you know about your subject, the better you will be able to inform your visitors, and by informing your visitors you build trust and interest in learning more about your website. Do the math – get searching!

Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing, a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.

Microsoft’s Search Engine Discussion
As a follow-up to our look at what Microsoft’s new search tool could look like, Rob Sullivan, our Head of Organic Search, and I locked ourselves in an office and tried to tackle some big questions about what will happen when Microsoft enters the search industry. We suspect these questions have been on a few people’s minds.

Q: Given what Microsoft is working on for Search, what do you see Microsoft doing between now and the release of Longhorn?

Rob: Version 1.0 of search will be out by the end of the year; Bill Gates has already stated this. It will look much the same as everybody else’s, however; nothing too different or radical. They will be playing catch-up for the time being, offering similar features to Google, Yahoo and the rest, trying to get MSN competitive with other portals.

(Note: as a follow-up to this, you can see a beta MSN search portal at beta.search.msn.com. Hmmm... notice any similarities to Google’s layout in the results?)

Their deal with Overture lasts until 2005 so we won’t see too much change with that. Likely after their current contract expires they will go with yearly renewals only. Nothing too long term, and with the option to bail. This is for when they are ready to launch their own PPC.

By next year, I wouldn’t be surprised to see Microsoft buying some PPC outlet, like a LookSmart or Enhance.

Gord: I think Rob’s right. Microsoft has to start building a name for their own search product, so they’ll introduce it on the MSN portal. It will be introduced with a lot of hype, but little in the way of functional advantages. Ideally, Microsoft will be able to add one or two features that would give it a distinct user advantage over Google and Yahoo.

Another important thing to remember is that Microsoft is probably trying to recoup their investment in search as soon as possible. I don’t think they are prepared to sink hundreds of millions of dollars in a search black hole without a return for years. They’ve played this game in the past to capture market share and I don’t think they want to go there again. That’s why you’ll see them hang onto an agreement with Overture for the foreseeable future.

Q: Why Buy a PPC Outlet?

Rob: Buying a PPC provider is quicker than building one. The current PPC suppliers have already developed the back-end systems, such as the database, reporting features, and so on. Also, in some cases (particularly LookSmart) the price is right. By buying a PPC supplier they can quickly monetize the purchase. After all, look at how much money Yahoo made this quarter alone, the first full quarter since the Overture purchase, because of that PPC supplier.

Gord: We have to remember that Microsoft will be throwing a lot of resources at developing their own search technologies. I agree with Rob. It makes more sense to buy an existing PPC provider and get access to the technology. The one caveat here is the Overture/Yahoo portfolio of patents, which currently has some PPC portals paying them license fees. Microsoft will be looking to steer around this. And this brings us back to Google. Google AdWords uses a system sufficiently different from Overture’s that it doesn’t infringe on their patent. Could this be one of the reasons Microsoft was originally looking at Google?

Bottom line: Microsoft will want their own paid search channel as soon as possible, but will be looking for short cuts to get there.

Q: Why won’t Microsoft hold off on unveiling their search until Longhorn?

Rob: The problem will be winning people over. Microsoft’s results are not the best right now, and MSN search doesn’t have a very good reputation. It has traditionally been confusing for most people, what with sponsored listings, organic listings, web directory listings, featured sites and so on. They’re already changing their default search to something with more reliable results, but Microsoft will have to overcome this perception of poor quality to convince people to use their search. Also, as they roll out new search features, they will keep adjusting their results pages, showing more or fewer ads and more or fewer free listings, to find the right balance in monetizing the paid listings. And all the while, they have to build their market share.

One way they could attempt to win people over would be a side-by-side comparison of results with other sources. If they could prove to more advanced web users (the early adopters, so to speak) that their results are at least comparable to, and more than likely superior to, other providers’, they could start winning people over. They will have to do this before Longhorn comes out. Microsoft has to convince people that MSN search provides quality. If they can’t win people over with MSN search, they won’t likely get them to use Longhorn’s search capabilities.

Gord: Microsoft has never built a search brand. Right now, Google owns that outright. Even if Longhorn ships with its default set to Microsoft search, people need a sufficient reason not to change it. It remains to be seen what the antitrust overlords will do to prevent Microsoft from monopolizing search (as they did with browsers), but we can be sure the user will have to have at least the appearance of a choice.

Microsoft simply can’t allow Google to continue to own the search business outright for the next few years. They have to start establishing their own brand identity. For the next two years, the search fight will be fought on two fronts: through the portals (Google vs. MSN vs. Yahoo) and through toolbars and desktop search apps. Microsoft doesn’t have to win these outright, but they have to put up a good fight to build a position as a contender. If they can do that, they can eventually crush the competition through OS integration with Longhorn.

Q: Of all the functionality we looked at in the last NetProfit, how much will ship with Longhorn?

Rob: The WinFS part is a given; that has to happen. They need to incorporate the XML features of the new file system, with its ability to easily perform local searching. It will also give people a glimpse of what MSN Search could become in the coming years, i.e. being able to search for a song by genre, performer, songwriter, and so on. Once people learn the power of the WinFS search feature they will likely come to rely on it for web search too. This is key to Microsoft search: if people don’t buy into the power of WinFS search, they won’t buy into Microsoft search.

They will have the Avalon desktop because of the cool factor. Microsoft hasn’t changed the desktop since Windows 95. While they’ve added a ton of great features since then, they haven’t improved on the desktop itself. Having a desktop that can render in 3D definitely contributes to the cool factor.

In addition, I see the subscription portion of MSN being more tightly integrated into the new OS. The ability to stream audio and video to subscribers will be a huge money maker for Microsoft. The subscription service will offer more features than the regular MSN portal.

Gord: In addition to what Rob has mentioned, I believe Microsoft will also introduce an iterative search interface that will allow the user to tweak and filter their search and have the results update in real time. I believe this functionality is in complete alignment with how people actually search, and it will be a big factor in winning over users.

Microsoft’s Search Engine Discussion cont’d
Q: How about Implicit Query?

Rob: Not right off the bat. That will likely be wrapped into later enhancements. The ability to anticipate and aid in queries means the computer has to know more about the user. It will have to monitor the user both locally and on the web to see how they search, what they search for, which results they picked and so on. Until the computer understands the user, Implicit Query won’t work.

Gord: I’m not so sure. Microsoft is testing Implicit Query pretty aggressively now, and I wouldn’t be at all surprised to see some version of it included in the first release of Longhorn. Add to that the fact that the marketing possibilities of Implicit Query are staggering; I’m sure Microsoft is fully aware of that.

Q: How much of a selling feature will Search be in the Longhorn release?

Rob: It won’t be the major feature; Microsoft will push the desktop features and functionality more than the web search feature, because of the convenience factor. The ability to pull information out of formats such as sound and graphics files will be appealing to users. I think that initially many people may even change their default search preference to a site like Google, especially if MSN search doesn’t perform. This is why they HAVE to get search right early on.

Gord: Right now, all Microsoft’s talk about the search functionality they’re developing is about its application on the desktop, not on the web. This leads me to think that they want to nail it as a localized search tool first, and then extend it to the web. For this reason, I think there will be a lot of marketing about ease of use and the ability to work more intuitively with your files without having to become a librarian. The web search implications will be rolled out later, as Microsoft moves their search focus online.

Q: What’s Google going to do?

Rob: Google has to get into the portal business. They need a built in traffic stream that’s looking for other features over and above search. That’s why they’re having an IPO – to raise money. Think about this for a second. You already have a hugely popular, highly profitable web property. Estimates are that it takes in a billion dollars a year in revenue with anywhere between $250 and $500 million in profit. Why would you IPO? Because you need to raise money. And with that money, they’re going to buy AOL or Netscape.

Why buy AOL or Netscape? Either of these properties would give them a dedicated customer base, a portal, the ODP, a chat program (ICQ), movie listings, mapping capabilities and so on. It puts them on a somewhat equal footing with the MSN and Yahoo offerings. Not to mention that Time Warner has had nothing but headaches with the AOL branch since the two companies merged.

Gord: Google has to make a bold move forward in search functionality. They came out so far ahead of any other search engine in 1998 that it took the competition five years to catch up. Now that competition has caught up, and Google hasn’t been able to raise the bar significantly since.

Google has to jump ahead of the pack again, and based on past experience, that jump on functionality will be squarely focused on providing web users with a better search experience. While I like Rob’s portal theory, I think such a move would split Google’s focus in so many areas that they’d lose sight of their core competency, which is search. In my mind, Google only has one slim chance to win the upcoming war with Microsoft, and that’s by continually focusing on providing the best search tool on the web.

Q: What impact will Yahoo dropping Google have?

Rob: Google will lose a substantial chunk of traffic, obviously, but the switch won’t have much of an impact on Yahoo’s quality of search results, and Yahoo will still have a third of web users afterwards. They can replace Google with Inktomi, or a mix of AltaVista, FAST and Inktomi. Yahoo will likely build new features and test them on AltaVista; when they’re satisfied the features do what they want, and are useful, they will implement them on Yahoo. The average Yahoo user won’t notice much difference between the day before they dropped Google and the day after.

As far as search functionality goes, they’re refining their semantic processing. They have had a year since they bought Inktomi, and Overture has had AllTheWeb and AltaVista for six months. It isn’t inconceivable that they have a huge amount of research and development going on to build a search product capable of replacing Google. (By the way, semantic search will also be a feature of MSN search, so it’s safe to assume that Yahoo will be developing it as well.)

Gord: The loss of Yahoo has been looming for ages, and I would hope Google has a Plan B, as well as a C, D and E. Really, the Yahoo arrangement has been a bonus for Google from the beginning. The fact is, Google still owns a huge chunk of search traffic outright, and they have to concentrate on this before anything else. If we’ve learned one thing in the past few years, it’s that you can’t depend on partnership arrangements for your success.

Google will be finding ways to build search traffic directly to Google. The launch of the Google Toolbar was the first move in this direction, and a brilliant one, in my opinion. I think toolbars and other desktop search apps will be the next hot battleground of the search industry.

Rob: The downside to that is people have to agree to download from Google. With Microsoft, it will all be built in. Again, a huge competitive advantage to Microsoft.

The thing is, if Google does do something revolutionary, and offers it as a free download, all Microsoft has to do is build it and release it as a patch or service pack to get it implemented.

Gord: And that’s why I think Microsoft can’t be beat in the long run.

Rob: I think Google can get back on top, and applying the semantics database is just the first step. They have to get that working first. I think they are close and getting closer all the time, but they still have some tweaking to do. Once they get it fixed they have to get in front and then stay in front, so the competition is always aiming at a moving target. Even then, I don’t think superior search alone is enough to keep them in front. They have to offer more.

Google will keep trying to introduce ways to make search more intuitive and useful. For example, the slide-show search at Google Labs is kind of cool, if they can find a way to make it more useful to people.

Q: What about Yahoo?

Rob: Yahoo is in a unique place right now. They have nothing to prove and a solid customer base. Anything they do is an improvement, so they have nowhere to go but up.

I think they’ll have to go into semantic search, like Google and MSN. The first rollout of a pure Yahoo search will be vanilla organic search, but they’ll be changing that as time goes on. By this fall they’re going to want to have something out before MSN. MSN is the key in Yahoo’s formula: Yahoo wants to be ahead of MSN, and Google wants to be ahead of everybody else.

Gord: I wouldn’t want to be Yahoo right now. I can’t see how they’ll win in the long run. The one area that’s interesting is the Yahoo shopping search engine. Perhaps Yahoo will eventually have to concede the general search contest, and become a niche player providing specialized search functionality. But they’re not going to go quietly. It’s going to be a huge battle in the coming few years.

Microsoft’s Search Engine Discussion – More Q&As
Q: Does Yahoo offer anything unique or superior?

Rob: They’re just relying on their brand. They really don’t have any features that set them apart. That’s not to say they won’t develop such products, but with Overture making so much money for them, at least in the short term they don’t need to innovate to stay in the game. Once they realize they are getting left behind (or at least merely maintaining the status quo) they will invest more in organic search R&D. Whether it will be too little, too late is anyone’s guess at that point.

Gord: Overture may provide another key to survival for Yahoo. In addition to the revenue Rob mentioned, Overture has always been ahead of Google in providing better management and reporting tools to PPC marketers. I think it would be a smart move on Yahoo’s part to build this advantage aggressively over the next two years. We know search marketing is a hot market, and if Overture can build some loyalty, it will give them some high ground to defend their position from.

Q: Does Yahoo have a chance?

Rob: Microsoft’s marketing power is huge, as is their budget. Yahoo is also a big company, but they can’t compete head to head with Microsoft. MSN is going to crush Google and then aim for Yahoo; they’ll be competing portal to portal. MSN has to beat Google at search first. Once that happens (and I do believe it will, provided they can get that crucial version 1 search right), MSN will set its sights squarely on Yahoo.

Gord: As I mentioned before, I don’t think Yahoo can win the search battle head to head with Microsoft or Google. They’re going to have to rely on their strengths in other areas to survive.

Q: Does Google have a chance?

Rob: Not really, and that’s why Google needs a portal. They can’t compete on search alone. If it’s not a portal, then Google has to offer other unique features.

Gord: The only way you can compete on search alone is to be best of breed, and Google isn’t the clear leader in search any more. To be honest, Google impressed the hell out of me in the beginning, but I don’t think they’ve done anything noteworthy in a long time. I think they’re showing the stress fractures inevitable in a young company that suddenly has to compete in the big leagues. I don’t think they’re up to taking on Microsoft head to head. I can’t think of many companies that could.

Rob: But if Google is building in semantics and a constantly improving relevancy-feedback mechanism, they should continue to improve. After all, they are already collecting data on the number of abandoned searches, the number of returned results, the average clicked-on position and so on. It shouldn’t be too difficult to integrate this data into the ranking algorithms to make them better (if they haven’t done so already). Remember that if this is Applied Semantics technology being applied by Google, the software is supposed to “learn” based on which results were picked or not picked, and to refine the results for the next time.
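(To make that idea concrete, here is a toy sketch of how click feedback could nudge rankings. To be clear, this is our illustration, not Google’s actual algorithm; the documents, scores, blend weight and click-through rates are all made up.)

    def rerank(results, click_stats, weight=0.3):
        """Blend base relevance with observed click-through rate (CTR).

        results: list of (doc_id, base_score), scores in [0, 1].
        click_stats: doc_id -> CTR in [0, 1]; unseen docs default to 0.
        Returns (doc_id, blended_score) pairs, best first.
        """
        blended = {
            doc: (1 - weight) * base + weight * click_stats.get(doc, 0.0)
            for doc, base in results
        }
        return sorted(blended.items(), key=lambda kv: kv[1], reverse=True)

    # Hypothetical data: page-c ranks third on base relevance, but
    # searchers click it far more often than the pages above it.
    results = [("page-a", 0.90), ("page-b", 0.85), ("page-c", 0.80)]
    clicks = {"page-c": 0.60, "page-b": 0.10}

    for doc, score in rerank(results, clicks):
        print(doc, round(score, 3))

With these numbers, page-c jumps from third to first (0.74 vs. 0.63 and 0.625), which is exactly the kind of self-correction a relevancy-feedback loop is meant to produce.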

Q: Does anyone else have a chance?

Rob: They’ll become niche players, fighting over the scraps that are left. And there will always be an element of the population that is anti big business. Many people adopt Linux because they are anti-Microsoft; I think you will see similar sentiment toward the search engines if they become too commercial. Already we are seeing open source search engines, and other smaller engines trying to compete.

Gord: It may seem ironic, but the biggest potential losers in this will be the ones that go head to head with Microsoft, namely Yahoo and Google. The smaller players will probably benefit. The profile of search will grow, along with the profitability. The small players will be able to move quicker to identify niches and capitalize on them. And they’ll probably be able to strike new partnerships, providing specialized services to the big three in the upcoming battle. As niche players, I think the future looks good for some of these services.

Q: When Microsoft enters search, will the industry grow?

Rob: Yes, search volumes will continue to grow. Markets outside of North America and Europe are growing faster than anywhere else. In addition, broadband usage is growing, hardware prices continue to come down, and more people are getting hooked up to the ’Net. There will be a point where internet growth tapers off and search plateaus, but I think that is many years away.

Gord: I think we’ll see incremental growth in search as a whole, with a possible jump coming with OS integration. But I see exponential growth in the commercialization of search, and therefore the revenue generated. Implicit Query and other ‘push’ search channels will change marketing all over again. Search, in its eventual evolved and converged form, will be one of the biggest marketing channels, bar none, in 5 to 6 years.

Q: What will Microsoft do on the paid search side with the release of Longhorn?

Rob: I think this is where Implicit Query kicks in, with the sponsored results shown first. Consider this: how much would one ad across the toolbar in an application be worth to an advertiser? That advertiser essentially has a captive audience. We’ve talked about this before: the application “watching” what you are doing and offering help, by way of sponsored (or other search) listings, appearing conveniently in the application you are using. Another place for listings could be the desktop sidebar (another of Longhorn’s new features). It is also built on XML, so it should be flexible enough to display “best picks” listings, whether paid or organic. Combine this with Longhorn’s ability to learn from you and refine what it provides, and you have a powerful new advertising medium.

Gord: It’s a different pricing model and a different delivery technique. It would be very easy for Microsoft to serve up as much advertising as they want, but they have to know where they start irritating people. It will be a whole new paradigm, and it remains to be seen how people respond to it.

That said, Implicit Query changes all the rules for advertising. It introduces huge considerations for privacy and online intrusiveness. We’re getting to the online equivalent of the personalized ad message that we saw in the movie Minority Report.

Rob: But Implicit Query and the results returned don’t have to be that obvious. Microsoft has already experimented with inconspicuous helpers. Remember Smart Tags? They are still used by Microsoft applications; they appear as a little box with an “i” in it. There could be a time when Longhorn recognizes a phrase and associates it with a smart tag linked to a search result that provides more information via organic results or paid search listings. This type of system opens the door to many types of web marketing we haven’t considered before.

Conclusion
Microsoft, Yahoo and Google will take search in totally new directions in the next few years. That means search marketing will also change dramatically. The medium will become more powerful than ever, probably prompting new investigations and concerns by consumer groups, and new legislation by government. I hope the questions we posed and tried to answer help clear up a very fuzzy future just a little bit. One thing is for sure. The way you search today will be significantly different from the way you search in 2006.

Author: Gord Hotchkiss