
Web Moves Blog


Archive for the 'Search Industry News' Category

The Foundation
In my recent article, “The Future of Search Engine Technology”, I looked at a number of developments that could improve search technology in the years ahead. I strongly believe that we are witnessing the infancy of search engine technology, but I wanted to hear what others had to say. Today, we start a series of interviews with prominent experts, insiders and search engine developers to hear their thoughts on the future.

If you’ve been online for any length of time and work in any industry connected with the Internet, you will have heard of Robert Scoble. The Microsoft employee maintains a daily blog (when he’s not working to get the word out about Microsoft’s new “Longhorn” operating system) where he gives his thoughts on all things Microsoft, while also casting a critical eye on the competition. Scoble does a great job of keeping a distinct line between what’s “official” and what’s simply his opinion.

I was fortunate enough to catch him taking a sabbatical from his blog and asked him his thoughts on the future of search engine technology. Scoble did ask me to note that the following represents his personal opinions, not Microsoft-vetted opinions.

Search Engine Technology 1
[Andy Beal] Robert, tell me about the search engine technology being developed that you are most excited about?

[Robert Scoble] That depends on whether you’re talking about Internet searching or searching on your local hard drive. If we’re talking about the local hard drive, searching for files there is still awful and getting worse.

[AB] Why do you say that?

[RS] Because hard drives keep getting bigger (a 60GB drive at Fry’s Electronics is $60 now — in three years we predict it’ll be $20 and you’ll see 500GB drives for less than $100). It’s easier to create files now than it is to find them.

Today, search tools like X1 are most interesting because they index your hard drive and make it easy to search for email and files on your local drives. Microsoft Research has been working on a tool called “Stuff I’ve Seen” too, which is also quite interesting (both let you search email as well as files on your hard drive). But these tools don’t go far enough. First, they are bolted on top of the operating system, so while they are indexing, your system often sees slowdowns. Bolted-on tools can’t be designed to work properly with the operating system and with other applications that might need processor time.

Plus, to really make search work well, search engines need metadata: metadata added by the system as it keeps track of how you use your files, and metadata that application developers can add into the system itself. In a lot of ways, weblogs are adding metadata to websites. When a weblog like mine links to a web site, we usually add some more details about that site. We might say it’s a “cool site”, for instance. Well, Google puts those words into its engine. That’s metadata. (Technically, metadata is “data about data”.) Now if you search for “cool site” you’ll be more likely to find the site I just linked to. So, you can see how Google’s engine is helped by metadata. But we haven’t been able to apply those lessons to the thousands of files on your hard drive. That will change.
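
To make the anchor-text idea concrete, here is a minimal Python sketch (my own illustration, not Google’s actual code) of how the words someone uses in a link can be credited to the page being linked to, so the target becomes findable by terms it never uses itself:

```python
from collections import defaultdict

# Toy inverted index: term -> set of URLs.
index = defaultdict(set)

def index_page(url, body_text):
    # Ordinary indexing: a page is findable by its own words.
    for term in body_text.lower().split():
        index[term].add(url)

def index_link(anchor_text, target_url):
    # The "metadata" effect Scoble describes: the words in a link
    # are credited to the page being linked to.
    for term in anchor_text.lower().split():
        index[term].add(target_url)

index_page("http://example.com/widgets", "widgets for sale")
index_link("cool site", "http://example.com/widgets")

print(index["cool"])  # {'http://example.com/widgets'} - found via the linker's words
```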

[AB] Can you explain the problems faced with searching hard drives and what Microsoft is working on to help?

[RS] What if we did the same thing on your hard drive [as Google]? For instance, look at pictures. When I take pictures off of my Nikon, they have some metadata (for instance, inside the file is the date it was taken, along with the exposure information), but that metadata isn’t useful for most human searches. For instance, how about if I wanted to search for “my wedding photos”? Neither X1 nor Windows XP’s built-in search would find your wedding photos. Why? Because they have useless names like DSC0001.jpg and there’s no metadata that says they are wedding photos.

Let’s go forward a couple of years to the next version of Windows, code-named Longhorn. In Longhorn we’re building a new file storage system, code-named WinFS. With WinFS searching and metadata will be part of the operating system. For instance, you could just start typing in an address bar “W” and “E” and “D” and “D” and anything that started with WEDD would come up to the top. For instance, your wedding documents, spreadsheets, and photos.

But, WinFS goes further than X1 and other file search tools do today. It lets you (and developers of apps you’ll use) add metadata to your files. So, even if you don’t change the name of your files, you might click on one of the faces in a picture application and get prompted to type a name and occasion. So, you would click on your cousin Joe’s face, type in “Joe Smith” and “Wedding.”

Now whenever you search for wedding stuff, that photo will come up. And that’s just the start. If you imported a group of photos into a wedding album, you’d be adding metadata for the search engine to use. In other words, you’ll see a much nicer system for searching your local hard drive.
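
WinFS as described here never shipped, so what follows is only a toy Python sketch of the behavior the interview describes, under its assumptions: files keep their useless camera-assigned names, but user-added tags make them findable, and a few typed characters match on prefixes of names and tags alike.

```python
# Hypothetical file store: each file carries user- or app-supplied tags.
files = [
    {"name": "DSC0001.jpg", "tags": ["wedding", "Joe Smith"]},
    {"name": "DSC0002.jpg", "tags": ["wedding"]},
    {"name": "budget.xls",  "tags": ["household"]},
]

def search(prefix):
    # Incremental search: typing "w", "e", "d", "d" narrows as you go.
    prefix = prefix.lower()
    return [f["name"] for f in files
            if f["name"].lower().startswith(prefix)
            or any(tag.lower().startswith(prefix) for tag in f["tags"])]

print(search("wedd"))  # ['DSC0001.jpg', 'DSC0002.jpg'] - despite the useless names
```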

Search Engine Technology 2
[AB] It looks like Microsoft has things mapped out for offline searches, but can they compete with Internet search engines?

[RS] Now, if we’re talking about the Internet, then Google has done an awesome job so far. I use Google dozens of times a day. Will MSN [search] be able to deliver more relevant results than Google? I don’t know. Certainly that’s not the case today. Will that change tomorrow? I’m waiting to see what the brains at MSN do.

One thing I do see is that in Longhorn, search will be nicer for customers. Google is working on making its toolbar the best possible experience. We’re working on a whole raft of things too. I’m very excited about the future of search, no matter which way things go.

[AB] Let’s look beyond the next couple of years. What new developments in search do you see happening in the next 3-5 years?

[RS] For Internet searches, I see social behavior analysis tools like Technorati becoming far more important. Why? Because people want different ways to see potentially relevant results. Google took us a long way toward that future, as Google’s results are strongly influenced by how many inbound links a site has. But now let’s go further, even further than Technorati has gone. Let’s identify who really is keeping the market up to date on a certain field and give him or her more weight.

I also see that search engines that search just specific types of content (like Feedster) are going to be more important (Feedster only searches RSS and Atom syndication feeds).

Oh, and users are going to demand new ways of exporting searches. Google showed us that with News Alerts. Enter a search term, like “Microsoft”, and get emailed anytime a news source mentions Microsoft. Feedster goes further than that. There you can build an RSS feed from a search term. I have several of those coming into my RSS news aggregator and find they are invaluable for watching what the weblogs are saying about your product, company, or market. For instance, one of the terms I built a feed for is “WinFS” — I’ll be watching to see how many people link to this article, and if any of you have something interesting to say I’ll even link back.

[AB] Let’s look at your “wish list”. Assuming there were no restrictions in technology, what new feature would you like to see introduced to search engines?

[RS] I want to see far better tools for searching photos — and connecting relationships between all types of files and photos. For instance, why can’t I just drag a name from my contact list to associate that name with a face in a photo? Wouldn’t that help searching later on? In just 18 months I’ve taken 7400 photos. But I can’t search any of them very well today without doing a lot of renaming and other work.

[AB] What impact do you see social networking having on the future of search engine technology?

[RS] We’re already seeing an impact over on Feedster and Technorati. It’s hard to tell what’ll come in the future, but what would happen if everyone in the world had a weblog and was a member of Google’s Orkut? Would that change how I’d search? Well, for one, it’d make me even more likely to search for people on services that linked together social spaces and weblogs. Heck, I can’t remember my brother’s email address, but Google finds his weblog (and I can send him an email there).

One other thing I’ll be watching is how Longhorn’s WinFS gets used by application developers to build new kinds of social systems. Today if you look at contacts, for instance, they are locked up in Outlook, or another personal information management program like ECCO. But, contacts in Outlook can’t be used by other applications (particularly now because virus writers used the APIs in Outlook to send fake emails to all contacts in Outlook, so Microsoft turned those features off).

[AB] WinFS changes that. How?

[RS] By putting a “contacts” file type into the OS itself, rather than forcing application developers to come up with their own contacts methodology.

What if ALL applications, not just Outlook, could use that new file type? What if we could associate that file type to social software services like Friendster, Tribe, Yahoo’s personals, or Google’s Orkut? Would that radically change how you would keep track of your contacts? Would that make contacts radically more useful? I think it would.

Already we’re seeing systems like Plaxo keep track of contacts, but Plaxo is still unaware that I’ve entered my data into Google’s Orkut and Friendster. Why couldn’t I make a system that’d associate the data in all my social software systems? Including Outlook?

Search Engine Technology 3
[AB] Do you foresee any problems with the WinFS approach?

[RS] Developers distrust Microsoft’s intentions here. They also don’t want to open up their own applications to their competitors. If you were a developer at AOL, for instance, would you open up your contact system to, say, Yahoo or Google or Microsoft? That’s scary stuff for all of us.

But, if the industry works together on common WinFS schemas (not just for contacts either, but other types of data too), we’ll come away with some really great new capabilities. It really will take getting developers excited about WinFS’s promise and getting them to lose their fears about opening up their data types.

[AB] Do you foresee a time when commercial search results (product/services) will be separated from informational search results (white papers/educational sites)? And do you think all commercial listings will eventually be paid only?

[RS] I don’t see the system changing from the Google-style results of today. Searchers just want to see relevant results. Paid-only searches won’t bring the most relevant results.

[AB] What makes you say that?

[RS] Because I often find the best information on weblogs. Webloggers are never going to be able to afford to pay to be listed on search engines.

Commercial-only listings might be seen on cell phones or PDAs, though. If I were doing a cell phone service for restaurants in Seattle, for instance, I might be more likely to list just member sites. But, thinking about it, I still don’t see such a system becoming popular enough without listing every restaurant in some way.

[AB] Speaking of cell phones, how do you see search engine technology impacting our use of PDAs and cell phones?

[RS] I’m not sure if search engine technology will impact it, but the mixture of speech recognition with search engines might change it a lot. When I’m using my cell phone I don’t want to look at sites that have a lot to read (I’ll save those for later when I’m in front of a computer or my Tablet PC); instead, I want to find the closest Starbucks, look up movie listings, or find a nice place to have a steak dinner. And now cell phones report e911 data, which means the cell phone system knows approximately where you’re located, so it can give you just one or two Starbucks rather than all of the ones in Seattle.
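
As a rough illustration of that location filtering, here is a small Python sketch that returns only the one or two stores nearest a phone’s approximate position. The store names and coordinates are hypothetical, and real e911-based services involve far more than a distance sort.

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points on Earth, in kilometres.
    lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
    a = sin((lat2 - lat1) / 2) ** 2 \
        + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
    return 2 * 6371 * asin(sqrt(a))

# Hypothetical Seattle-area store locations.
stores = [("Pike Place", 47.6097, -122.3422),
          ("Capitol Hill", 47.6228, -122.3207),
          ("Ballard", 47.6687, -122.3847)]

def nearest(lat, lon, n=2):
    # Return just the n closest stores, not every store in the city.
    return sorted(stores, key=lambda s: haversine_km(lat, lon, s[1], s[2]))[:n]

print(nearest(47.61, -122.33))
```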

[AB] If search engine users gave up a little of their privacy and allowed their search habits to be monitored, would this allow the search engines to provide better, customized results?

[RS] Yes. I already give Google the ability to watch my search terms (I use the Google Toolbar). But, it always must be a choice. People really hate it when you don’t have strict privacy policies that are easy to understand and they hate it if you don’t give them a choice to not report anything.

[AB] Robert, you’ve certainly opened our eyes to the future of search engine technology, is there anything else you would like to add?

[RS] To echo what I said above, I hope the industry sees the opportunities that Longhorn’s WinFS opens up. We can either work together and share data with each other, or we can be afraid and keep data to ourselves. It’ll be an interesting time to watch in the next three years.

Many thanks to Robert Scoble, Microsoft employee and blogger extraordinaire. Please be sure to visit SearchEngineLowdown.com as we continue to highlight thoughts and views on the future of search engine technology.

Author Bio:
Andy Beal is Vice President of Search Marketing for WebSourced, Inc and KeywordRanking.com, global leaders in professional search engine marketing. Highly respected as a source of search engine marketing advice, Andy has had articles published around the world and is a repeat speaker at Danny Sullivan’s Search Engine Strategies conferences. Clients include Real.com, Alaska Air, Peopleclick, Monica Lewinsky and NBC. You can reach Andy at andy@keywordranking.com and view his daily SEO blog at www.searchenginelowdown.com.

Microsoft’s Search Engine Discussion
As a follow-up to our look at what Microsoft’s new search tool could look like, Rob Sullivan, our Head of Organic Search, and I locked ourselves in an office and tried to tackle some big questions about what will happen when Microsoft enters the search industry. We suspect these questions have been on a few people’s minds.

Q: Given what Microsoft is working on for Search, what do you see Microsoft doing between now and the release of Longhorn?

Rob: Version 1.0 of search will be out by the end of the year; Bill Gates has already stated this. It will look much the same as everybody else’s, however; nothing too different or radical. They will be playing catch-up for the time being, offering similar features to Google, Yahoo, etc., and trying to get MSN competitive with the other portals.

(Note: as a follow-up to this, you can see a beta MSN search portal at beta.search.msn.com. Hmmm... notice any similarities in the results to Google’s layout?)

Their deal with Overture lasts until 2005 so we won’t see too much change with that. Likely after their current contract expires they will go with yearly renewals only. Nothing too long term, and with the option to bail. This is for when they are ready to launch their own PPC.

By next year, I wouldn’t be surprised to see Microsoft buying some PPC outlet, like a LookSmart or Enhance.

Gord: I think Rob’s right. Microsoft has to start building a name for their own search product, so they’ll introduce it on the MSN portal. It will be introduced with a lot of hype, but little in the way of functional advantages. Ideally, Microsoft will be able to add one or two features that would give it a distinct user advantage over Google and Yahoo.

Another important thing to remember is that Microsoft is probably trying to recoup their investment in search as soon as possible. I don’t think they are prepared to sink hundreds of millions of dollars in a search black hole without a return for years. They’ve played this game in the past to capture market share and I don’t think they want to go there again. That’s why you’ll see them hang onto an agreement with Overture for the foreseeable future.

Q: Why Buy a PPC Outlet?

Rob: Buying a PPC provider is quicker than building one. The current PPC suppliers have already developed the back-end systems, such as the database, reporting features, and so on. Also, in some cases (particularly LookSmart) the price is right. At least by buying a PPC supplier they can quickly monetize the purchase. After all, look at how much money Yahoo made this quarter alone, the first full quarter since the Overture purchase, because of that PPC supplier.

Gord: We have to remember that Microsoft will be throwing a lot of resources at developing their own search technologies. I agree with Rob. It makes more sense to buy an existing PPC provider and get access to the technology. The one caveat here is the Overture/Yahoo portfolio of patents, which currently has some PPC portals paying them license fees. Microsoft will be looking to steer around this. And this brings us back to Google. Google AdWords uses a system sufficiently different from Overture that it doesn’t infringe on their patent. Could this be one of the reasons Microsoft was originally looking at Google?

Bottom line: Microsoft will want their own paid search channel as soon as possible, but will be looking for short cuts to get there.

Q: Why won’t Microsoft hold off on unveiling their search until Longhorn?

Rob: The problem will be winning people over. Microsoft’s results are not the best right now, and MSN search doesn’t have a very good reputation. It has traditionally been confusing for most people, what with sponsored listings, organic listings, web directory listings, featured sites and so on. They’re already changing their default search to something that has more reliable results, but Microsoft will have to overcome this perception of poor quality to convince people to use its search. Also, as they roll out new search features, they will continue to change their results pages, displaying more or fewer ads and more or fewer free listings, to find the right balance in monetizing the paid listings. And during the process, they have to build their market share.

One way they could attempt to win people over would be a side-by-side comparison of results with other sources. If they could prove to more advanced web users (the early adopters, so to speak) that their results are at least comparable, and more than likely superior, to those of other providers, they can start winning people over. They will have to do this specifically before Longhorn comes out. Microsoft has to convince people that MSN search provides quality. If they can’t win people over with MSN search, they won’t likely get them to use Longhorn’s search capabilities.

Gord: Microsoft has never built a search brand. Right now, Google owns that outright. Even if Longhorn ships with its default set to Microsoft search, people have to have sufficient reason not to change it. It remains to be seen what the antitrust overlords will do to prevent Microsoft from monopolizing search (like they did with browsers), but we can be sure that the user will have to at least have the appearance of a choice.

Microsoft simply can’t allow Google to continue to own the search business outright for the next few years. They have to start establishing their own brand identity. For the next two years, the search fight will be fought on two fronts: through the portals (Google vs. MSN vs. Yahoo) and through toolbars and desktop search apps. Microsoft doesn’t have to win these outright, but they have to put up a good fight to build a position as a contender. If they can do that, they can eventually crush the competition through OS integration with Longhorn.

Q: Of all the functionality we looked at in the last NetProfit, how much will ship with Longhorn?

Rob: The WinFS part is a given; that has to happen. They need to incorporate the XML features of the new file system, with its ability to easily perform local searching. It will also give people the ability to see what MSN Search could become in the coming years, i.e. being able to search for a song based on genre, performer, songwriter, and so on. Once people learn of the power of the WinFS search feature, they will likely come to rely on it for web search too. This is key to Microsoft search: if people don’t buy into the power of WinFS search, they won’t buy into Microsoft search.

They will have the Avalon desktop because of the cool factor. Microsoft hasn’t changed the desktop since Windows 95. While they’ve added a ton of great features since then, they haven’t improved on the desktop. Having a desktop with the ability to render in 3D definitely contributes to the cool factor.

In addition, I see the subscription portion of MSN being more tightly integrated into the new OS. The ability to stream audio and video to subscribers will be a huge money maker for Microsoft. The subscription service will offer more features than the regular MSN portal.

Gord: In addition to what Rob has mentioned, I believe Microsoft will also introduce an iterative search interface that will allow the user to tweak and filter their search and have the results update in real time. I believe this functionality is in complete alignment with how people actually search and will be a big factor in winning over users.

Microsoft’s Search Engine Discussion cont’d
Q: How about Implicit Query?

Rob: Not right off the bat. That will likely be wrapped into further enhancements. Also, the ability to anticipate and aid in queries means that the computer has to know more about the user. Therefore it will have to monitor the user both locally and on the web to see how they search, what they search for, what results they picked and so on. Until the computer understands the user, Implicit Query won’t work.

Gord: I’m not so sure. Microsoft is testing Implicit Query pretty aggressively now, and I wouldn’t be at all surprised to see some version of it included in the first release of Longhorn. That, and the fact that the marketing possibilities of Implicit Query are staggering. I’m sure Microsoft is fully aware of that.

Q: How much of a selling feature will Search be in the Longhorn release?

Rob: It won’t be the major feature; Microsoft will be pushing the desktop features and functionality more than the web search feature, because of convenience. The ability to get information from formats such as sound and graphic files will be appealing to users. I think that initially many people may even change their default search preferences to a site like Google, especially if MSN search doesn’t perform. This is why they HAVE to get search right early on.

Gord: Right now, all Microsoft’s talk about the search functionality they’re developing is about its application on the desktop, not on the web. This leads me to think that they want to nail it as a localized search tool first, and then extend it to the web. For this reason, I think there will be a lot of marketing about ease of use and the ability to work more intuitively with your files without having to become a librarian. The web search implications will be rolled out later, as Microsoft moves their search focus online.

Q: What’s Google going to do?

Rob: Google has to get into the portal business. They need a built-in traffic stream that’s looking for other features over and above search. That’s why they’re having an IPO – to raise money. Think about this for a second. You already have a hugely popular, highly profitable web property. Estimates are that it takes in a billion dollars a year in revenue, with anywhere between $250 and $500 million in profit. Why would you IPO? Because you need to raise money. And with that money, they’re going to buy AOL or Netscape.

Why buy AOL or Netscape? Well, either of these properties would give them a dedicated customer base, a portal, the ODP, a chat program (ICQ), movie listings, mapping capabilities and so on. It puts them on a somewhat equal footing with the MSN and Yahoo offerings. Not to mention that Time Warner has had nothing but headaches with the AOL branch since the two companies merged.

Gord: Google has to make a bold move forward in search functionality. They came out so far ahead of any search engine in 1998 that it took the competition five years to catch up. Now that competition has caught up, and Google hasn’t been able to raise the bar that significantly since.

Google has to jump ahead of the pack again, and based on past experience, that jump on functionality will be squarely focused on providing web users with a better search experience. While I like Rob’s portal theory, I think such a move would split Google’s focus in so many areas that they’d lose sight of their core competency, which is search. In my mind, Google only has one slim chance to win the upcoming war with Microsoft, and that’s by continually focusing on providing the best search tool on the web.

Q: What impact will Yahoo dropping Google have?

Rob: Google will lose a substantial chunk of traffic, obviously, but it won’t have much of an impact on Yahoo regarding quality of search results. Yahoo will still have 1/3 of web users after the switch. They can replace Google with Inktomi, or with a mix of AltaVista, FAST and Inktomi. How Yahoo will work is this: they will build features and test them on AltaVista. When they’re satisfied that the features do what they want, and are useful, they will implement them on Yahoo. The average Yahoo user won’t likely notice much of a difference from the day before they dropped Google to the day after.

As far as search functionality goes, they’re refining their semantic processing. They have had a year since they bought Inktomi, and Overture has had AlltheWeb and AltaVista for six months. It isn’t inconceivable that they have a huge amount of research and development going on to make a search product capable of replacing Google. (By the way, semantic search will also be a feature of MSN search, so it’s safe to assume that Yahoo will be developing it as well.)

Gord: The loss of Yahoo has been looming for ages, and I would hope Google has a Plan B, as well as a C, D and E. Really, the Yahoo arrangement has been a bonus for Google from the beginning. The fact is, Google still owns a huge chunk of search traffic outright, and they have to concentrate on this before anything else. If we’ve learned one thing in the past few years, it’s that you can’t depend on partnership arrangements for your success.

Google will be finding ways to build search traffic directly to Google. The launch of the Google Toolbar was the first move in this direction, and a brilliant one, in my opinion. I think toolbars and other desktop search apps will be the next hot battleground of the search industry.

Rob: The downside to that is that people have to agree to download it from Google. With Microsoft, it will all be built in. Again, a huge competitive advantage for Microsoft.

The thing is, if Google does do something revolutionary, and offers it as a free download, all Microsoft has to do is build it and release it as a patch or service pack to get it implemented.

Gord: And that’s why I think Microsoft can’t be beat in the long run.

Rob: I think Google can get back on top, and the application of the semantics db is just the first step. They have to get that working first. I think they are close and getting closer all the time, but they still have some tweaking to do. Once they do get it fixed they have to get in front and then stay in front, so the competition is always aiming at a moving target. Even then, I don’t think superior search is enough to keep them in front. They have to offer more.

Google will keep trying to introduce ways to make search more intuitive and useful. For example, the slide show search at Google labs is kind of cool, if they find a way to make it more applicable to people.

Q: What about Yahoo?

Rob: Yahoo is in a unique place right now. They have nothing to prove and a solid customer base. Anything they do is an improvement, so they have nowhere to go but up.

I think they’ll have to go into semantic search, like Google and MSN. The first roll-out of a pure Yahoo search will be vanilla organic search, but they’ll be changing that as time goes on. By this fall they’re going to want to have something out before MSN. MSN is the key in Yahoo’s formula: Yahoo wants to be ahead of MSN, and Google wants to be ahead of everybody else.

Gord: I wouldn’t want to be Yahoo right now. I can’t see how they’ll win in the long run. The one area that’s interesting is the Yahoo shopping search engine. Perhaps Yahoo will eventually have to concede the general search contest, and become a niche player providing specialized search functionality. But they’re not going to go quietly. It’s going to be a huge battle in the coming few years.

Microsoft’s Search Engine Discussion – More Q&As
Q: Does Yahoo offer anything unique or superior?

Rob: They’re just relying on their brand. They really don’t have any features that set them apart. That’s not to say that they won’t develop these products, but I think with Overture making so much money for them that, at least in the short term, they don’t need to innovate to stay in the game. Once they realize that they are getting left behind (or at least simply maintaining the status quo), they will invest more into organic search R&D. Whether it will be too little, too late is anyone’s guess at this point.

Gord: Overture may provide another key to survival for Yahoo. In addition to the revenue Rob mentioned, Overture has always been ahead of Google in providing better management and reporting tools to PPC marketers. I think it would be a smart move on Yahoo’s part to build this advantage aggressively over the next two years. We know search marketing is a hot market, and if Overture can build some loyalty, it will give them some high ground to defend their position from.

Q: Does Yahoo have a chance?

Rob: Microsoft’s marketing power is huge – as is their budget. Yahoo is also a big company, but they can’t compete head to head with Microsoft. MSN is going to crush Google and then aim for Yahoo. They’ll be competing portal to portal. MSN has to beat Google at search first. Once that does happen (and I do believe it will, provided they can get that crucial version 1 search right), MSN will set its sights squarely on Yahoo.

Gord: As I mentioned before, I don’t think Yahoo can win the search battle head to head with Microsoft or Google. They’re going to have to rely on their strengths in other areas to survive.

Q: Does Google have a chance?

Rob: Not really, and that’s why Google needs a portal. They can’t compete on search alone. If it’s not a portal, then Google has to offer other unique features.

Gord: The only way you can compete on search is to be best of breed. And Google isn’t the clear leader in search any more. To be honest, Google impressed the hell out of me in the beginning, but I don’t think they’ve done anything noteworthy in a long time. I think they’re showing the stress fractures inevitable in a young company that suddenly has to compete in the big leagues. I don’t think they’re up to taking on Microsoft head to head. I can’t think of many companies that could.

Rob: But if Google is building in semantics and a constantly improving relevancy feedback mechanism, they should continue to improve. After all, they are already collecting data on the number of abandoned searches, the number of returned results, the average clicked on position and so on. It shouldn’t be too difficult to integrate this data into the ranking algorithms to make them better (if they haven’t done so already). Remember that if this is Applied Semantics technology being applied by Google, then the software is supposed to ‘learn’ based on what results were picked or not picked. It is supposed to be able to refine results for the next time.

Q: Does anyone else have a chance?

Rob: They’ll become niche players, fighting over the scraps that are left. And there will always be an element of the population that is anti-big-business. Many people adopt Linux because they are anti-Microsoft; I think you will see similar sentiments toward the search engines if they become too commercial. Already we are seeing signs of open source search engines, and other smaller engines trying to compete.

Gord: It may seem ironic, but the biggest potential losers in this will be the ones that go head to head with Microsoft, namely Yahoo and Google. The smaller players will probably benefit. The profile of search will grow, along with the profitability. The small players will be able to move quicker to identify niches and capitalize on them. And they’ll probably be able to strike new partnerships, providing specialized services to the big 3 in the upcoming battle. As niche players, I think the future looks good for some of these services.

Q: When Microsoft enters search, will the industry grow?

Rob: Yes, search volumes will continue to grow, so we should see search continue to grow. After all, markets outside of North America and Europe are growing faster than anywhere else. In addition, broadband usage is growing, hardware prices continue to come down, and more people are getting hooked up to the ’Net. There will be a point where Internet growth tapers off and search plateaus, but I think that is many years away.

Gord: I think we’ll see incremental growth in search as a whole, with a possible jump coming with OS integration. But I see exponential growth in the commercialization of search, and therefore the revenue generated. Implicit Query and other ‘push’ search channels will change marketing all over again. Search, in its eventual evolved and converged form, will be one of the biggest marketing channels, bar none, in 5 to 6 years.

Q: What will Microsoft do on the paid search side with the release of Longhorn?

Rob: I think this is where Implicit Query kicks in, and the sponsored results will be shown first. Consider this: how much would one ad across the toolbar in an application be worth to an advertiser? That advertiser essentially has a captive audience. We’ve talked about this before: the application “watching” what you are doing and offering help, by way of sponsored (or other search) listings, appearing conveniently in the application you are using. Another resource for listings could be the desktop sidebar (another of Longhorn’s new features). It is also built on XML, so it should be flexible enough to display “best picks” listings, whether paid or organic. Combine this with Longhorn’s ability to learn from you and refine its ability to provide what you want, and you have a powerful new advertising medium.

Gord: It’s a different pricing model and a different delivery technique. It would be very easy for Microsoft to serve up as much advertising as they want, but they have to know where they start irritating people. It will be a whole new paradigm, and it remains to be seen how people respond to it.

That said, Implicit Query changes all the rules for advertising. It introduces huge considerations for privacy and online intrusiveness. We’re getting to the online equivalent of the personalized ad message that we saw in the movie Minority Report.

Rob: But Implicit Query and the results returned don’t have to be that obvious. Microsoft has already experimented with inconspicuous helpers. Remember Smart Tags? They are still used by Microsoft applications. They appear as a little box with an “i” in it. There could be a time when Longhorn recognizes a phrase and associates it with a smart tag, which is linked to a search result that provides more information via organic results or paid search listings. This type of system opens the door to many different types of web marketing we haven’t considered before.

Conclusion
Microsoft, Yahoo and Google will take search in totally new directions in the next few years. That means search marketing will also change dramatically. The medium will become more powerful than ever, probably prompting new investigations and concerns by consumer groups, and new legislation by government. I hope the questions we posed and tried to answer help clear up a very fuzzy future just a little bit. One thing is for sure. The way you search today will be significantly different from the way you search in 2006.

Author: Gord Hotchkiss

The Hilltop Algorithm
Few events on the Internet have stirred as much controversy and confusion as Google’s most recent change, nicknamed Florida: an update that knocked thousands of companies out of their top positions in search engine results.

There has been no shortage of accusations about Google’s intentions. The most common charge leveled against the search engine giant is that it is trying to increase revenues by forcing companies to buy keywords.

In the midst of this firestorm, we decided to do a little sleuth work to find out what really happened. What we found was that an algorithm named Hilltop was responsible for shaking up the entire online community.

A bit of background first
Google uses PageRank and other technology to drive its search engine. Here is a summary of PageRank from http://www.google.com/corporate/tech.html:

PAGE RANK DEFINED: Condensed version
PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page’s value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves “important” weigh more heavily and help to make other pages “important.”
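
To make the “link as vote” idea concrete, here is a minimal power-iteration sketch of PageRank in Python. It is a simplified model of the published algorithm only; Google’s production formula folds in many more signals.

```python
def pagerank(links, d=0.85, iters=30):
    """links: dict mapping each page to the list of pages it links to."""
    pages = set(links) | {t for outs in links.values() for t in outs}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}  # baseline share for every page
        for page, outs in links.items():
            for target in outs:
                # A link is a vote, and the voter's own importance is
                # split evenly among everything it links to.
                new[target] += d * rank[page] / len(outs)
        rank = new
    return rank

# A and B both vote for C; C votes for A. C becomes "important",
# and its single vote in turn lifts A well above B.
print(pagerank({"A": ["C"], "B": ["C"], "C": ["A"]}))
```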

Google knew that there were two problems with PageRank: it had difficulty filtering out spam sites and it did an inadequate job differentiating between sites that had relevant and irrelevant information.

Krishna Bharat, at Compaq Systems Research Center, and George A. Mihaila, a professor of Computer Science at the University of Toronto, came up with a new algorithm to fix the problem: Hilltop.

A VERY simplified explanation of how Hilltop works.

• Hilltop counts the number of meaningful (related to the topic) hyperlinks coming into a content-rich Web site.

• Web sites with numerous meaningful links and volumes of pages with relevant information are called “authority sites”.

• Authority sites enjoy higher rankings on the assumption that they are of more value.

• Web sites with few hyperlinks, unrelated links, or links from MLM and affiliate programs intended to inflate PageRank are demoted.

Here is what the Hilltop Algorithm looks like:

Old Google Ranking Formula = {(1-d)+a (RS)} * {(1-e)+b (PR * fb)}
New Google Ranking Formula = {(1-d)+a (RS)} * {(1-e)+b (PR * fb)} * {(1-f)+c (LS)}
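
Google has never defined the variables in these widely circulated formulas, so the reading below is an assumption, not a confirmed fact: RS as an on-page relevance score, PR as PageRank, fb as a scaling “factor base”, LS as Hilltop’s local/expert score, and d, e, f and a, b, c as damping constants and weights. With that caveat, a short Python sketch shows what the multiplicative structure would imply: a page with no expert votes keeps only a fraction of its old score.

```python
# Hedged illustration only: variable meanings are assumptions, and the
# constants below are arbitrary placeholders, not Google's values.
def old_score(RS, PR, fb, d=0.5, e=0.5, a=1.0, b=1.0):
    return ((1 - d) + a * RS) * ((1 - e) + b * (PR * fb))

def new_score(RS, PR, fb, LS, d=0.5, e=0.5, f=0.5, a=1.0, b=1.0, c=1.0):
    # The Hilltop term multiplies in: with LS == 0 (no expert votes),
    # only the baseline (1 - f) share of the old score survives.
    return old_score(RS, PR, fb, d, e, a, b) * ((1 - f) + c * LS)

print(old_score(RS=0.8, PR=0.6, fb=1.0))   # ~1.43
print(new_score(0.8, 0.6, 1.0, LS=0.0))    # ~0.715 - demoted, no expert links
print(new_score(0.8, 0.6, 1.0, LS=0.9))    # ~2.002 - boosted by expert links
```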

Google quickly found that the Hilltop algorithm also had flaws. So Google created a two-step search process that combines PageRank technology and the Hilltop algorithm.

So how do companies continue to enjoy top rankings on Google?

Getting top rankings on Google depends on meeting the criteria of both PageRank and Hilltop. Therefore, companies have to receive a significant number of votes, or quality links, from authority sites (PageRank) and have meaningful, relevant hypertext links (Hilltop) pointing to their web pages. Just as it has always been, content is critical, and linking is now more important than ever.

Two Conclusions can be drawn from our research
• Google is not trying to force companies to buy keywords; it is simply trying to improve its algorithm.

• The search engine optimization methodology and strategies used to achieve top placements just became significantly more complicated with the addition of Hilltop.

With Google handling 75 to 85 percent of all search requests, companies doing business on the Web must retain top rankings in the search engines in order to survive. And that means making sure that your firm satisfies the criteria of the two algorithms.

Author Bio:
Search Engine Optimization Inc. specializes in getting companies top rankings on all search engines through organic searches. Contact us at 1-877-736-0006 to learn how we can help your company enjoy the benefits that come from top placements.

Google’s Austin Update
Some of the websites that weren’t hit too hard in Google’s Florida update (November 2003) got hit very hard on or around January 23. Google’s latest update is called “Austin”, and these update names are beginning to sound like elections!

Depending on the industry you happen to be in, you could have been hit less or harder; it depends on a whole number of factors, and no two situations are quite the same. One of our clients, whose site until recently was optimized by another SEO firm, was completely devastated to find his site was gone from the face of the earth. Things such as FFA (Free For All) link farms, invisible text (!) and stuffed meta tags that run 20 screens long, filled with useless spam, will have your site penalized, or even banned, faster than you can blink an eye.

Google’s new search algorithm is getting ‘smarter’ and better. Part of this is due to the Hilltop algorithm, which I wrote about at the beginning of January. In combination with the PageRank algorithm that Google has been using since 1998, these two algorithms work in tandem to produce what are supposed to be better, more relevant search results. That last claim remains to be tested, as in some cases I have found that the results produced are not always relevant. But Google is continuously tweaking and fine-tuning its technology to make it better, and the next few weeks should see more improvements.

Google’s new ‘weapons of mass destruction’
What is really clear at this point is Google’s unique way of finding sites that attempt to use spam techniques, or certain methods that are forbidden by most major search engines, such as cloaking. In fact, to rectify this situation, on January 25 Google changed the IP addresses of most of its data centers all over the world.

What is really peculiar about this update is that the majority of the sites we have been optimizing for the past 12 months have actually gone up in their rankings. Still, we are hearing and reading reports in the industry that some websites (in no particular industry) have seen their PR (PageRank) drop one point for no apparent reason. Sites that for the past two years had a PR of 6 dropped to 5, with no real change in the number of inbound links pointing to them.

This indicates that Google has changed the formula that weighs inbound links when computing the PR value a site should have, relative to its competing sites. It is also believed that Google now looks at the ‘theme’ of a site and at the relevancy of its links to better calculate PR. Sites that get their links from others in the same industry will benefit more than before, while sites that get the majority of their links from sites that have nothing to do with them could see their PageRank value go down, as some did on January 23.

It is clear now that Google is making some significant improvements and some major changes to its search algorithm, again! Get used to this, as my feeling is that it will continue for the next several months. It is my belief that sites that haven’t gone through any significant loss in their rankings since November could very well fall into the “victims category” if they happen to use any of the forbidden techniques described earlier.

Spring clean up time is here already
For almost as long as I can remember, most major search engines have frowned upon sites that use spam techniques or ‘tricks’ that are not recommended, in an attempt to get away with ranking higher. If you suspect your site falls into this category, the time has come to “clean up” and to start producing good, relevant content. From the beginning, sites that produce great and superior content have always benefited, and these quality sites are usually not affected by any major update.

Content that is fresh and regularly updated always helps in getting better rankings. On the Web, content is always king. It’s not just the quantity, but also the quality of the content that is important. Additionally, Google’s Hilltop algorithm tends to detect sites that are authoritative in their field and will usually rank them higher, along with a higher PR value. Hilltop looks at a site’s theme and at who links to it, the value of those links, the anchor text surrounding those links, and the text in the links themselves.

Remove from your site any meta tags that are irrelevant and replace them with ones that truly relate to the products and services you want to sell, or whatever it is you are trying to promote on your site. Also, look at your title tags: are they really indicative of what each page is all about?

Look at the body of your text. Is it carefully written, and does it flow well for any human visitor who tries to read it, or does it sound like you are repeating the same keywords over and over, dozens of times?

Build and write your website the best you can, using only the approved techniques recommended in the terms of use of Google and the others, and your site should never ‘fall off a cliff’ as some have done in the past two months.

Conclusion
For the SEO community, the next two to three months should be interesting, both as close observers (which most of us are) and from a ranking standpoint. It is now highly expected that Google will proceed with its long-expected IPO. Some think this is the reason it is making such major changes to its search technology. Additionally, more speculation and ‘conspiracy theories’ have been rampant in many SEO forums; for the most part, I believe they are unfounded.

Whatever the reasons for these important changes, one thing is clear: Google is working hard to improve the quality of its search results, and this can only be achieved by trying new algorithms, tweaking existing technology, changing IP addresses in data centers, etc.

Don’t unfasten your seat belts just now – the ‘Hurricane’ may not be over yet!

Author:
Serge Thibodeau of Rank For Sales

The Future Of Search Engine Technology
By now you have probably read numerous articles predicting ‘what will happen in 2004’ or asking ‘can MSN take on Google?’. While it is always worthwhile to look ahead and consider what may happen this year in the search engine industry, what about the things that we can’t quite yet predict? Instead of looking at what will happen this year, perhaps we should look at what must happen in the search engine space if Google, Yahoo and MSN are truly to revolutionize search and enhance the user experience.

Overcoming The Lack Of Relevant Search Results
Even today, conducting a search on any of the major search engines can be classified as an ‘enter your query and hope for the best’ experience. Google’s ‘I’m Feeling Lucky’ button, while designed to take you directly to the number one result, could ironically describe its entire search process. Enter your desired search words into any of the search engines and you often end up crossing your fingers, hoping that they display the type of results you were looking for. Since the recent ‘Florida’ and ‘Austin’ updates, complaints that Google, in particular, is displaying less relevant results have escalated (although mostly from those who lost important positioning they had assumed was theirs to keep).

There is, of course, evidence that the search engines are trying to enhance their search results so that they can better anticipate the intentions of the searcher. Search for ‘pizza Chicago’ at Yahoo, and you’ll see that the top results include names, addresses, telephone numbers and even directions to pizza restaurants in Chicago, a great improvement on previous results. Even when you take everyone’s favorite search term example, ‘windows’, you can see that the search engines are at least trying to determine your intent. While Yahoo and Google still display search results saturated with links discussing Microsoft’s pervasive operating system, enter your search over at Ask Jeeves and the chirpy English butler will ask you whether you meant ‘Microsoft Windows’ or ‘Windows made out of glass’.

Future Search Engine Technology
Smaller search engines have also materialized over the past few weeks, each offering to improve the user experience. Grokker offers an interface that groups search results graphically, improving the way search results are segmented and displayed. Eurekster combines the social networking elements used by sites such as Friendster and provides results that can be filtered based upon what members of your group are searching for. While all of these are interesting and provide a glimpse of the future of search, it will not be the small companies that change the way we search. With Google about to get an influx of cash from its upcoming IPO, Yahoo re-vamping Inktomi and Overture, and Microsoft finally jumping into the search arena, it will be these search engine powerhouses that enhance our search experience and take search engine technology to the next level.

So what is this next level? What technology is it that I speak of, that will revolutionize the way we receive our search engine results? I believe that the search results we receive in just a couple of years from now could make current search engine technology look as archaic and cumbersome as picking up a Yellow Pages book is today. However, in order to achieve this new search nirvana we, as consumers, must quell our fears and trepidations surrounding the protection of our privacy. In order for the search engines to develop technology that will be intuitive and anticipate our every need, we must first relinquish at least some of the privacy that we currently hold so dear. Let’s take a look at some of the ways that search technology could improve and you’ll soon get the idea why it will require us to cooperate with the search engine providers.

‘Windows’ or ‘windows’?
If you desire to be able to enter a term as ambiguous as ‘windows’ and expect to see relevant results, you’ll first need to give up some personal information to the search engines. Google, Yahoo, MSN and Ask already have the means to collect an astonishing amount of information from us through our use of their toolbars. Don’t panic; they currently allow you to disable this information gathering, and even if you do allow it, the data is collected anonymously. However, with the technology already in place, why not unleash its full potential?

Let’s say I let Google track my online activities, allowing it to monitor the web sites I view and keep a log of all of the search queries I enter. This type of information could greatly improve the relevancy of the results displayed to me. For example, two years from now, I could search for ‘home improvement’ on Google. I then find the listing for Lowes.com and visit the site. While I am at their web site, I look at a number of different pages, but I spend a lot of time in the ‘house windows’ section, exploring the different styles and prices. Why not let Google capture all of that useful information? Then, when I go back to Google the following day and search for ‘windows’, it would know that glass windows are more likely to be the type of product I am seeking. Google would simply have remembered my previous searches, read the HTML and meta data located on the Lowes.com pages, and used this to identify the intent of my new search for windows.
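
A toy Python sketch of that personalization step, assuming a logged browsing history (every name, page and term below is hypothetical):

```python
# Terms harvested from pages the user recently visited, e.g. Lowes.com.
history_terms = {"home", "improvement", "house", "windows", "glass",
                 "styles", "prices"}

# Candidate results for the ambiguous query "windows".
results = [
    {"url": "microsoft.com/windows", "terms": {"windows", "operating",
                                               "system", "software"}},
    {"url": "lowes.com/windows",     "terms": {"windows", "glass", "house",
                                               "styles", "prices"}},
]

def rerank(results, history_terms):
    # Score each result by its overlap with the user's history, so the
    # "glass windows" page outranks the operating system for this user.
    return sorted(results, key=lambda r: len(r["terms"] & history_terms),
                  reverse=True)

for r in rerank(results, history_terms):
    print(r["url"])  # lowes.com/windows first
```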

While I would have to give up some of my privacy, wouldn’t it be worth it if I could save myself time and energy by having search engine results more relevant to my desire?

You’ve Got Search In Your Mail
Another area with great potential for improving search engine results will likely be developed by Google. You may have heard the rumors that Google is getting set to launch an email client that many expect will be a free service similar to Yahoo Mail or Hotmail. Currently, Yahoo does an adequate job of making search available to all of its email customers. Each page within Yahoo Mail has a search box that makes it easy for you to conduct a search that might be sparked by an email you receive. But why not take it one step further?

Google has the technology to really take advantage of search within email. Why else would it even consider entering this arena? Imagine that, in order to use a free Google email account, you allow Google to provide advertisements and track your email activities. Google could change the way that search results and ads are displayed to free email users. For example, let’s say you receive an email from your brother, the content of which, among other things, gloats about the brand new P4 desktop computer he just purchased from Dell. As part of the interface you use to read that email, Google magically displays paid search advertising for desktop computers, including a link that will take you directly to the appropriate page on Dell.com. This information would be quite beneficial to you, as you may be interested in seeing how you too can become the proud owner of a P4 computer. It is fantastic targeted advertising for Dell, as they know that if you click on the listing, they are halfway to converting you into another satisfied customer.

This idea is much closer to reality than you may think. Google already has the advertisers, with its AdWords service boasting 150,000 users eager to spend their advertising dollars. It also has the technology to determine which results to show you within your email interface. Google’s AdSense can provide the contextual ad technology that would scan an email’s content to determine which ads are the most relevant to display. With this technology in place, a simple provision within any Google email terms and conditions would give the world’s largest search engine the necessary permission to serve up relevant ads to all users of its free email service.
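
AdSense’s real matching technology is proprietary, so the following is only a bag-of-words Python sketch of the general idea: scan the email’s content and pick the ad whose keywords overlap it best. The advertisers and keywords are hypothetical.

```python
ads = [
    {"advertiser": "Dell",      "keywords": {"desktop", "computer", "p4"}},
    {"advertiser": "Pizza Co.", "keywords": {"pizza", "delivery", "chicago"}},
]

def best_ad(email_text):
    # Naive contextual matching: count keyword overlap with the email body.
    words = set(email_text.lower().split())
    return max(ads, key=lambda ad: len(ad["keywords"] & words))

email = "You should see the brand new P4 desktop computer I bought from Dell"
print(best_ad(email)["advertiser"])  # Dell
```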

We could be offered the option of paying a monthly premium in order to not have ads shown when we read our email, but if they are relevant to the content of a received message, why would we want to block them?

From Desktop to Internet
Another development in search engine technology that I can see happening would come from Microsoft’s new Longhorn operating system. While I must confess that I am not au fait with the intricate workings of this project, I do know that it will likely use the search technology that MSN is developing.

Imagine an operating system that monitors all of your activities — with your permission, of course. Every file, every image, word document, mp3, even e-books could be monitored by your computer as it endeavors to anticipate your every need. Not only could an integrated search engine allow you to search files located on your hard drive, but it could also use the information it has collected from these files to make your online search experience even more enjoyable.

It is quite possible that Longhorn or a future OS (Microsoft, Linux or Mac) could become intelligent enough to know that, after you listen to one of your favorite songs by the 80’s rock band Heart, your subsequent online search for ‘heart’ is more likely to originate from a desire to view the band’s fan site than from a pressing need to visit the web site of the American Heart Association. Your all-encompassing search engine would perhaps be a realization of the friendly Ask Jeeves butler, ready to anticipate your every need.

To Search Where No-one Has Searched Before
When you think about the future of search, it is easy to get excited. Millions (if not billions) of dollars are going to be filling the coffers of the largest search engine providers. They have some of the smartest people in the world working to develop the next great ‘thing’, which will enhance the user experience and serve up better, more relevant search results. Search engine technology is still most definitely in its infancy; how it grows will very much depend upon how much information and privacy the average search engine user is willing to give up. Personally, if I can view search results that more closely match my desired results, I’m willing to give up the name of my favorite pet, my place of birth and my mother’s maiden name!

Author Bio:
Andy Beal is Vice President of Search Marketing for WebSourced, Inc and KeywordRanking.com, global leaders in professional search engine marketing. Highly respected as a source of search engine marketing advice, Andy has had articles published around the world and is a repeat speaker at Danny Sullivan’s Search Engine Strategies conferences. Clients include Real.com, Alaska Air, Peopleclick, Monica Lewinsky and NBC. You can reach Andy at andy@keywordranking.com and view his daily SEO blog at www.searchenginelowdown.com.

PageRank: Meet Hilltop
Based on Atul Gupta's excellent recent article on the Hilltop algorithm, I did a bit of research of my own and came up with this article. Atul Gupta is the founder of SEO Rank Ltd. and, as he explained in his article, the Hilltop algorithm played a fairly large role in Google's November 16 update, dubbed "Update Florida".

In my continuing series on the effects of the Google "Florida" update, my previous article discussed how the OOP (Over Optimization Penalty) could in some cases have been applied to sites that were in fact overly optimized for some of their main keywords. Researching the Hilltop algorithm, I found out that it isn't even new: it dates back to early 2001.

As you might expect, and as is always the case, Google remains very silent on any of this, so my analysis is based on many observations and some testing, using the Google.com search engine. But before delving into how all of this may affect your positioning in Google, let me explain what the “Hilltop” algorithm is all about and how it works in Google.

For those of you who may be new to search engine algorithms, I suggest you read up on Google's PageRank algorithm as a primer, along with "The Anatomy of a Large-Scale Hypertextual Web Search Engine", written by Sergey Brin and Larry Page, the co-founders of Google.
In its most basic form, the Google PageRank algorithm determines the importance and relevance of a web page by the number of links pointing to it. Following this principle, and all else being equal, Google would rank a page with 100 links pointing to it higher than a page with only 10. So far, so good; this principle makes a lot of sense when you think about it.
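For readers who want to see the principle in code, here is a compact power-iteration sketch of PageRank. Note that in the full algorithm a link from an important page counts for more than a link from an obscure one; the raw link counts above are the simplified view. The damping factor and example graph are standard textbook choices, not Google's production values.

```python
# Compact power-iteration sketch of PageRank.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:            # dangling page: spread its rank evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / n
            else:                       # split this page's rank among its links
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share
        rank = new_rank
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
print(pagerank(web))  # page "c", with two in-links, ends up ranked highest
```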

Definition Of The Hilltop Algorithm
In contrast to PageRank, Google's Hilltop algorithm determines the relevance and importance of a specific web page as a function of the search query or keyword typed into the search box.

In its simplest form, the idea is this: instead of relying only on the PageRank value to find "authoritative pages", it would be more useful if that value also reflected the topic or subject of the page.

Computed this way, links from documents that are relevant to a specific topic would be of greater value to a searcher. In 1999 and 2000, when the Hilltop algorithm was being developed by Krishna Bharat (who later joined Google) and George Mihaila, they called such relevant documents "expert documents", and links from these expert documents to the target documents determined the targets' "score of authority". Again, it makes a lot of sense.

For more in-depth information on this important topic, read the Hilltop paper, co-authored by Krishna Bharat, which is available from the University of Toronto's computer science department.

Using The Hilltop Algorithm To Define Related Sites
Google also uses the Hilltop algorithm to better determine how one site is related to another, as in the case of affiliate sites or similar properties. Hilltop is, in fact, Google's technology and 'ammunition' for detecting sites that use heavy cross-linking or similar strategies!

As a side note, Google's Hilltop algorithm bases its computations primarily on "expert documents", as noted above.

Hilltop also requires that it can locate at least two expert documents voting for the same web page. If it cannot find a minimum of two such "expert documents", it returns nothing at all. In practice, this means Hilltop refuses to pass any value on to the rest of Google's ranking formula, and it simply does not apply to the search term the user typed into the search box.
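Here is a toy rendering of that behaviour: authority flows only from topic-relevant expert documents, and a target page needs votes from at least two distinct experts or Hilltop contributes nothing. The scoring is a stand-in; Bharat's paper has the real formulas.

```python
# Toy sketch of Hilltop's minimum-experts rule. The data and the
# stand-in scoring are invented for illustration.

def hilltop_score(query, experts, target_url, min_experts=2):
    """experts: list of {"topics": [...], "links": [...]} documents."""
    query_terms = set(query.lower().split())
    voters = [e for e in experts
              if query_terms & set(e["topics"]) and target_url in e["links"]]
    if len(voters) < min_experts:
        return 0.0  # Hilltop passes nothing on to the rest of the ranking
    return float(len(voters))  # stand-in for the paper's authority score

experts = [
    {"topics": ["photography", "cameras"], "links": ["camerareviews.example"]},
    {"topics": ["cameras", "gear"], "links": ["camerareviews.example"]},
    {"topics": ["cooking"], "links": ["camerareviews.example"]},  # off-topic: no vote
]
print(hilltop_score("digital cameras", experts, "camerareviews.example"))  # 2.0
```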

So, What’s In Store For Hilltop In 2004?
Since we are only at the beginning of the year, some of you may ask: "That's all really cool, but what will happen to websites in 2004, in the aftermath of Hurricane Florida?" That's a great question, and many articles have been written on the topic in the last six to seven weeks.

Over the years, many search engines have stopped valuing ranking factors that are subject to abuse by webmasters and site owners, such as the keywords meta tag. For that reason, Google has ignored the keywords meta tag since its very beginnings.

In contrast, the visible sections of a website are less subject to "spamdexing" (search engine spam), since visible pages need to make good sense to the average human visitor.

The Reasons Behind A New Algorithm At Google
Since the inception of the Google search engine in 1998, the PageRank algorithm has been pretty much the benchmark Google uses to determine search relevance and importance. However, the PageRank system has a fundamental design weakness and certain limitations, and Google has known about them for quite some time.

PageRank's 'intrinsic value' is query-independent: it is not tied to any particular search term or keyword. As a result, a relatively high-PR web page that contained only a passing, off-topic reference to a search phrase often got a high ranking for that phrase. This is exactly what Google is trying to eliminate with its Hilltop algorithm. Google always tries, as best it can, to make its search engine as relevant as possible.

Coming back to Krishna Bharat: he filed for the Hilltop patent in January 2001, with Google as an assignee. Google clearly recognized the important improvements this new algorithm could bring to its search ranking when combined with the existing PageRank algorithm.

Google's Hilltop algorithm could now work in conjunction with its older technology (PageRank). It is my observation that Hilltop has probably gone through many improvements since its original 2000 design, notably in the implementation Google started to deploy on or around November 16, 2003, at the very beginning of its November update (the Florida update).

In the past two years, I think that Hilltop has been “fine-tuned” by Google and now represents a serious contender to the PageRank algorithm, originally developed by Google co-founders Sergey Brin and Larry Page, back in early 1998.

Hilltop And Google’s Massive Index Of Over 3.3 Billion Pages
Since its very beginning, Google has run most of its search engine on roughly ten thousand commodity Pentium servers (some call them inexpensive personal computers), distributed across major data centers around the world. That is how Google built its hardware platform, from the ground up.

Coming back to the Hilltop algorithm: when you consider how some 10,000 servers can have the processing 'intelligence' to rapidly locate "expert documents" among hundreds of thousands of topical web pages, it is clear that a task this formidable is where Google's Hilltop algorithm is at work.

From what I can see, and from what I know of search engines, since November 16 Google has been running a form of batch processing of frequent keywords, key phrases and search terms (similar to mid-seventies computing on bulky mainframes the size of large refrigerators, except that today those 10,000 servers replace the mainframes). Google then stores the results in its massive database, ready to be served as soon as a searcher queries those terms.

How Google does this is simple: it has immediate, real-time access to the most popular keywords and search terms used daily, collected from actual searches by everyday users as well as from the keywords and key phrases used in its AdWords PPC (pay-per-click) ad program.

It is my observation that Google has apparently set a threshold on the number of real-life searches a keyword must receive before it triggers the Hilltop algorithm; once a keyword crosses that threshold, it is sent to a temporary buffer for later batch processing within the larger system.
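A small sketch of the buffering behaviour speculated about here: a query enters the batch-processing queue only once its search volume crosses some threshold. The threshold value and the queue itself are purely illustrative, since Google has published nothing about such a mechanism.

```python
# Illustrative sketch of a popularity threshold gating batch processing.
from collections import Counter

THRESHOLD = 1000          # hypothetical daily-search cutoff
search_counts = Counter()
batch_buffer = set()

def record_search(term):
    """Tally a live search; queue the term for batch ranking once popular."""
    search_counts[term] += 1
    if search_counts[term] >= THRESHOLD and term not in batch_buffer:
        batch_buffer.add(term)   # precompute its results later, offline

for _ in range(1000):
    record_search("saddam hussein captured")
record_search("obscure long tail phrase")
print(batch_buffer)  # only the popular term is queued for batch processing
```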

Looking back at the 'old days of the monthly dances', it would appear that, prior to November 16, 2003, this batch processing ran on the combined total of the most popular search terms about once a month, hence the old "Google dance" effect.

Additionally, and this is something I noticed even before the Florida update, smaller increments of batch processing are likely being done more frequently on search terms whose popularity spikes quickly, such as a major news event; for example, the US capture of Saddam Hussein in December 2003. Such short-term events would qualify for the short-term "buffer" and be processed as such by Hilltop.

More standard, ordinary results for the longer term would be processed by the 10,000 servers about once a month, which again makes perfect sense. Search terms that do not qualify to trigger the Hilltop algorithm continue to show the old Google rankings.

Conclusion
To conclude: as Atul Gupta and I have written in previous articles, webmasters and site owners need to think 'outside the box' if they want to thrive and continue to run sites that return a favourable ROI. As always, link popularity is even more important now than ever before.

Additionally, try to get a listing in as many directories as possible, beginning with DMOZ (the Open Directory Project). Avoid FFA (Free For All) links and link farms entirely. Those are a thing of the past and might even get you penalized.

If your budget allows it, get into a good PPC ad program, such as AdWords or Overture. You might also want to consider some good paid inclusion search engines that deliver real value for your investment.

Note that since January 15 (and as expected), Yahoo has completely dropped Google's listings, so you may also want to consider a paid listing in Yahoo as a safety measure. Yahoo now takes its results from Inktomi, which has been part of the Yahoo family of search properties since Yahoo bought Inktomi last year.

Author:
Serge Thibodeau of Rank For Sales

Search Engine Innovation For 2004
After being blindsided by the Google Florida update, many webmasters and SEOs were left reeling. The message is clear: you can't rely on just one search engine for all of your traffic. You must use all your wits to emerge victorious from the search engine wars. Google is important, but it is not everything. Keep your eyes and ears open to new opportunities and old standbys: other search engines and directories, paid placement and pay-per-click, newsletters, and even more traditional channels.

Wait To Change
So were you an innocent bystander caught in the onslaught of sites dumped in the Google Florida update? Many people lost their hard-earned rankings even though they did nothing "wrong". Many websites that followed Google's optimization rules to the letter were still caught up in the carnage. Unfortunately, many businesses were devastated by these changes, especially heading into the holiday months.

What to do? As difficult as it may have been to make sense of Google’s changes, for many, the simplest course of action was to simply do nothing. While perhaps contrary to a normal “it’s broken so I need to fix it” approach, for many webmasters “do nothing” has proven to be the correct course of action. Since the update, many sites that were exiled to search engine Siberia have returned to nearly their former ranking, shaken but intact. From all appearances, Google simply changed their algorithm and may not have gotten it quite right. Additional “tweaks” subsequent to the Florida update seem to have brought some sanity back to their results.

Who Will Stay Tops In The Search Engines?
You never know who will become the leader in search. It was only a few years ago that directories were the major force, until the upstart search engine Google came along. Google got its start about five years ago and hasn't looked back. As long as Google provides good results for its users, it is in a good position to stay on top. However, with MSN working on its own search engine and Yahoo's acquisition of Overture (which includes AllTheWeb and AltaVista), things could get interesting in 2004. Microsoft is always a force to be reckoned with, and Yahoo certainly has the tools to become a major competitor to Google.

Inktomi’s New Role
Inktomi may play an important role in this shift since it is now owned by Yahoo. Keep an eye on this engine: it provides secondary results for MSN and will probably replace Google in supplying primary results to Yahoo. Inktomi's importance to MSN may also increase once the Microsoft property stops using LookSmart for its primary results.

To see which of your pages are listed in Inktomi, use the Inktomi Pure Search function from Positiontech. Inktomi often adds a few free pages to its database, so check which pages you may already have there for free before using Paid Inclusion for your most important pages.

Other Ways To Promote Your Website
Keep your eye on search engine news: Google was an up-and-coming engine just a few years ago, and you never know what will happen in the industry, so stay on your toes. Continue to promote your website through links in topical directory listings. Search for websites on topics related to yours, and link when it "makes sense". Don't forget traditional means of marketing your website: print ads, brochures, magazine articles and more can all make a difference.

One of the best ways to promote yourself online and increase your link popularity is to write articles on your subject. Find websites that accept free content and submit your ezine articles or newsletters to build your link popularity. Newsletters, forums, FAQs, blogs and tips on your subject are all viable means of informing your visitors and bringing new traffic to your website. Don't forget to archive your newsletters and articles on your website; this builds your site's size and increases link popularity through your authoritative knowledge of your subject. Not a writer? Consider working with a copywriter to help build good content.

Paid Inclusion And Pay Per Click
If you haven't ventured into Paid Inclusion or PPC services, consider using them to help balance the changes in your traffic. Use a Paid Inclusion subscription for your most important pages, or submit dynamically generated pages that aren't being picked up by the search engine robots so they appear regularly in the search engine database. You can start your PPC bidding in small doses: look for secondary terms that don't cost as much but will still bring in traffic your competitors may miss. Take a look at some of the smaller PPC engines out there; a little traffic from a lot of places can add up.

For more information on choosing keyword phrases, read our article Finding Targeted Keyword Phrases Your Competitors Miss.

Content, Content, Content
The biggest mistake I see webmasters make is creating a website with little content. Don't rely on a few optimized paragraphs of text to convince search engine robots to stick around; a skeleton website does not make a good impression on anyone. Build out your website's content. Google's new algorithm may be a sign that search engine robots are getting a little smarter at understanding what your content is about. Build information that will keep visitors at your website, and become an authority on your subject so other websites naturally link to you because your information is invaluable. Remember, Google is interested in serving those who use its search capabilities, just as you should be interested in serving your visitors. Give your visitors as much real content as you are able; they will thank you with return visits.

And In The End…
In the end, the information you give is often equal to the response you receive. Make the effort to become an authority site on your subject. Building the groundwork of your website with quality information, and broadening your marketing methods, will help sustain you through the search engine wars to come.

Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing, a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.

Searching for Dominance: What Will Microsoft Search Look Like?
It’s the beginning of the year, so you’ll see dozens of ‘Year in Review’ and ‘Predictions for the Coming Year’ articles about the search engine industry. I have to admit, I was going to jump on the bandwagon myself, but as I started looking at what I would say, one thing dominated the future landscape for the next 4 years to such an extent that it made all the other developments pale in comparison. This tsunami of change will shape and affect every corner of the business. Search as we know it will be swept away because of it, and all the search providers we know will scramble to readjust and find their place in the new landscape. When Microsoft enters search, all else will become a footnote in the history of the web.

So, forget Yahoo and Google. Sorry Looksmart and Ask Jeeves, you’ve been pushed off the front page. Today, the spotlight is on Microsoft, and how they will likely change the face of web search. In this column, I won’t be talking about industry impact. Instead, with the help of our Organic Search wiz, Rob Sullivan, I’m looking at the promise of Microsoft’s research itself, and what the tool may actually look like.

MS Search: It's All About Indexing
First, Microsoft is looking to solve a long-standing desktop irritation. And when they find the answer, it will change the indexing of file information forever.

The current way of finding files on your computer leaves a lot to be desired. There has never been a single system that effectively searches content across multiple file formats. To solve that problem, Microsoft is looking to combine three different technologies. To ensure compatibility, Microsoft will keep its NTFS file system, and will combine it with the indexing capabilities of a SQL Server relational database and the file-labeling potential of XML. The new system is called WinFS.

WinFS
The problem with current file systems is that they are hierarchical. A file occupies one single place within a nested pyramid of folders. But people don't tend to think that way: a file may be relevant in a number of different ways, depending on the context in which you're looking for it.

The other problem with hierarchical systems is that they need a librarian. Someone has to establish and organize the hierarchy, and usually that organization is set up in anticipation of the context in which you'll need to re-access the information.

I know there are people out there who are diligent about filing away every single document in a well-organized file system, but for the 99% that make up the rest of us, our hard drives are a vast junkyard of old files, spreadsheets and emails. More often than not, we desperately turn to Microsoft's find-file application to track down that elusive bit of information we're looking for.

A further problem is that there is no good way to quickly search a number of different file formats for a scrap of information that may be hidden in any one of them.

Microsoft's new WinFS will work on top of the current NTFS structure, but it will introduce a dramatic new way of indexing files and their contents. XML tags will be used to send relevant information to a SQL database. It will bridge the current gap between structured, indexable data stored in a database and previously un-indexable data stored in unstructured formats such as Word documents, webpages and email messages. It also allows users to add 'metadata', identifying tags, to existing files. For instance, a picture file could include information about the subject of the picture, or a sound file could include information about the audio captured.
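To see the WinFS idea in miniature, here is a sketch in which file metadata lands in a relational store and becomes queryable regardless of the file's format or folder. The schema, tags and file names are invented for illustration, and sqlite3 stands in for the SQL Server engine WinFS was to use.

```python
# Miniature sketch of the WinFS idea: metadata in a relational store.
# Schema, tags and paths are invented; sqlite3 stands in for SQL Server.
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE files (path TEXT, kind TEXT)")
db.execute("CREATE TABLE tags (path TEXT, key TEXT, value TEXT)")

def add_file(path, kind, **metadata):
    """Register a file plus user- or application-supplied metadata tags."""
    db.execute("INSERT INTO files VALUES (?, ?)", (path, kind))
    db.executemany("INSERT INTO tags VALUES (?, ?, ?)",
                   [(path, k, v) for k, v in metadata.items()])

add_file("C:/photos/IMG_0042.jpg", "image", subject="golden gate bridge")
add_file("C:/audio/clip01.wav", "sound", captured="board meeting, jan 5")

# Finding a photo becomes a metadata query, not a filename hunt.
rows = db.execute("""SELECT f.path FROM files f JOIN tags t ON f.path = t.path
                     WHERE t.key = 'subject' AND t.value = 'golden gate bridge'
                  """).fetchall()
print(rows)  # [('C:/photos/IMG_0042.jpg',)]
```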

Stuff I Have Seen
A Microsoft research team has been working on a prototype application called SIS, or Stuff I've Seen. Although its focus is helping users find files and information on their desktop, its implications for web search could be dramatic. It pulls information from multiple file formats, including emails and webpages, and records it in a single index. The user can then search through the index with a powerful interface that allows several filters to be applied at once, turning search into a real-time, iterative process in which the user quickly narrows down to the most relevant results.

Implicit Query
'Stuff I've Seen' gives the user a powerful tool for finding files and information on their desktop. Implicit Query (the link goes to an interesting PowerPoint presentation prepared by the Microsoft Research team) goes one step further by continually searching and retrieving information based on what the user is doing. As the program tracks user behaviour, it refines its model of what is important and relevant to the user, and filters the search results accordingly. This is an extension of Microsoft's Lumiere research, which modeled the Bayesian logic behind the current automated-assistance functionality.

In one example, a Microsoft researcher was typing an email to a colleague about an upcoming conference. As she typed, Implicit Query brought up presentations, slides and documents prepared for the conference in its results panel. In another instance, she was preparing an email to a colleague about a broken link on her group's website; before she had finished, she was shown an unopened email that contained the fix.
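A toy version of that loop might look like the sketch below: as the user types, the system extracts salient terms and silently runs them against a local index. The crude stopword filter stands in for the Bayesian models Microsoft actually researched; the index and draft text are invented.

```python
# Toy sketch of an implicit-query loop over a local index.
import re

STOPWORDS = {"the", "a", "an", "to", "about", "is", "i", "you", "for", "in"}

def implicit_query(draft_text, index):
    """index maps a document name to its set of content words."""
    terms = [w for w in re.findall(r"[a-z]+", draft_text.lower())
             if w not in STOPWORDS]
    scored = [(len(set(terms) & words), doc) for doc, words in index.items()]
    return [doc for hits, doc in sorted(scored, reverse=True) if hits > 0]

index = {
    "conference_slides.ppt": {"conference", "slides", "keynote", "agenda"},
    "budget.xls": {"budget", "quarterly", "forecast"},
}
draft = "Writing to you about the upcoming conference agenda"
print(implicit_query(draft, index))  # the slides surface while the user types
```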

Memory Landmarks
A third Microsoft project doesn't hold nearly the same promise for web search, but it would make an interesting add-on feature. Memory Landmarks adds historical markers to a list of chronological search results. For example, if you were searching for articles about the capture of Saddam Hussein, you could sort the list by date, and Memory Landmarks would indicate where on the list the capture itself took place.
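A small sketch of that idea: weave known events into a date-sorted result list so the user can navigate by memory. The dates, titles and events here are examples only.

```python
# Sketch of Memory Landmarks: annotate a chronological result list
# with known events. All data is illustrative.
from datetime import date

landmarks = [(date(2003, 12, 13), "Saddam Hussein captured")]

results = [
    ("Article on the hunt", date(2003, 11, 2)),
    ("Analysis piece", date(2003, 12, 20)),
    ("Retrospective", date(2004, 1, 10)),
]

def annotate(results, landmarks):
    """Weave landmark markers into a date-sorted result list."""
    timeline = sorted(results, key=lambda r: r[1])
    pending = sorted(landmarks)
    out = []
    for title, d in timeline:
        while pending and pending[0][0] <= d:
            out.append(f"--- {pending.pop(0)[1]} ---")
        out.append(f"{d}  {title}")
    return out

print("\n".join(annotate(results, landmarks)))
# The landmark line appears between the November and December results.
```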

What Will MS Search Look Like
I think the above prototype applications give us some real clues as to what Microsoft Search will look like. As Microsoft works on the new Longhorn OS, we have to remember:

• As Microsoft works on ways to index and search files locally, it’s a logical extension to apply the new technology to web search.
• Longhorn's Indigo makes a major move away from object-oriented programming towards web services. There will be a much richer and deeper exchange of information between your local computer and web-service sites, which allows for much greater localization in search tools.
• Microsoft has a long history of incorporating what were once 3rd-party stand-alone applications into its own applications and operating systems. It has already identified search as one of the key activities people do online.
• Microsoft's ASI (Adaptive Systems and Interaction) research department is working to make Microsoft's systems more intuitive and intelligent by letting them learn how the user works and adapt accordingly.
• Microsoft is working on desktop applications that will dramatically change how people launch searches for information.

Given all this, here is what I believe Microsoft Search will eventually look like.

Microsoft will use WinFS as the basis for eventually indexing every document on the web. Remember, because it's integrated at the OS level, it will be native to every Microsoft IIS server on the internet. It gets around the current problem of the 'invisible' web by allowing web publishers to include metadata for quicker indexing. Its SQL foundation will make indexing of database-driven information quick and transparent, as was shown when legal publisher LexisNexis allowed Microsoft to index a portion of its huge database.

This common indexing procedure will erase the dividing line between desktop searches and web searches. The entire web will be accessible from the Microsoft search sidebar. What’s more, the next evolution of ‘Stuff I’ve Seen’ and ‘Implicit Query’ will monitor what you’re working on and provide suggested information sources and files from both your desktop and the web.

If a user wants to launch a manual search, the current trial-and-error method (try a search, check the results, refine the query, and try again if you don't find what you're looking for) will become much quicker and more powerful, with an interface that allows real-time updating of results as filters are applied and parameters are tweaked. I'm willing to bet that Microsoft will also unveil leading-edge natural-language query technology that mines web data based on interpreted concepts rather than the structured queries most search engines use today. By the time Microsoft Search is unveiled, I believe a more intuitive search interface will be standard on all the leading search portals.

Search functionality will eventually be integrated into every Microsoft application, much as the ubiquitous Office Assistant is now (I can't tell you how much I hate that damned paper clip... until I need him). Microsoft will be able to capitalize on this by selling sponsored search suggestions, offered through the implicit-query channel. For instance, if you're writing an email about an upcoming business trip to New York, the Microsoft search pane will offer airfare and hotel specials, as well as suggestions of things to do while in New York.

Microsoft will be able to monitor everything you do. The more you do on your desktop, the more Microsoft and its applications will learn about your preferences and priorities. Their ASI research will allow them to adapt search functionality and personalize it just for you. So the search results you see won’t be the same as everyone else’s. Personalization will move beyond just geographic location to take into account the types of sites you tend to visit, business priorities, your typical workday activities and even your lifestyle interests. Big Brother lives, and his name is Microsoft!

Finally, Microsoft’s Indigo feature in Longhorn will remove the distinction between server side tasks and client side tasks. Therefore, local indexes will be utilized whenever possible to increase search performance and the options for personalization. The line between your desktop and the internet will become more and more indistinct as time goes on.

Implications For The Real World
Today, I just wanted to focus on what Microsoft’s search could look like. In writing this, I kept asking the same question, ‘Boy, I wonder what this means for Google?’ Obviously, the gang at Google is very aware of the impending threat of Microsoft search. So, in the next NetProfit, I’m going to ask our head of Organic Search, Rob Sullivan, to join me for a little brainstorming. We started chatting today by the water cooler and he has some very interesting theories. Stay tuned!

Author:
Gord Hotchkiss

My Predictions for the Search Engine Industry for 2004
The year 2003 was, generally speaking, a good year for the search engine industry. While some got hit hard by the Florida update in November, the balance of the year was good.

I predict 2004 will be even better, although you should expect some major changes. 2004 will be a year of consolidation, if there is such a thing in this rapidly changing industry. While in 2003 some search companies, such as LookSmart and Espotting, had more than their share of problems (especially LookSmart), I predict there will also be a few newcomers.

2004 will be a year where you can expect news to hit this industry on a daily basis. In 2003, hardly a day went by that there wasn’t something happening at one company or another. Expect 2004 to be even busier.

Additionally, 2004 will be a year where the stakes get higher, much higher, and not just for Google! The level of competition among search engines will toughen in the race to land what I call 'targeted eyeballs'.

Will it be GOO, GOG, GLE, GGG or simply GO?
As you must have read or heard by now, 2004 is also expected to be the long-awaited year in which Google transforms itself into a public company, with all the advantages and pitfalls that entails. Google won't have a choice but to go public in 2004, because of a little-known US SEC rule that forces American companies to issue quarterly financial statements once their number of employee-shareholders surpasses a certain threshold.

An estimated 700 to 800 employees are current shareholders, in effect forcing Google to go public whether it likes the idea or not. At least by going public it will raise billions in new capital that it can pour into its massive, continuing research and development programs at the Googleplex.

Many are expecting the five-year-old company to make its IPO sometime in late March or early April. People here at Rank for $ales are starting to take bets on what its stock ticker symbol will be. On the New York Stock Exchange, a company cannot have more than three letters in its stock symbol. Personally, I think it might be GO, although that could cause some confusion with Infoseek 😉

The Yahoo Factor
Whatever happens in the Google stables, one major player that cannot be neglected is Yahoo. Just yesterday, in fact, the CEO of a very large Canadian company and client of mine asked me what I think of Yahoo, and what it might do to fight back against the ever-powerful Googlemania.

I strongly believe the Sunnyvale, California company should be considered a 'sleeping giant', but not for long. Having bought and 'digested' Overture and AllTheWeb in 2003, Yahoo is the number one search directory in the world and the number two search engine (although I prefer to call it a search directory) on the Web. Yahoo also currently owns AltaVista.

In 2004, there is no question that pay-per-click (PPC) will be an extremely popular derivative of search, and one that will keep growing well into 2005 and 2006. Overture and Google (through its AdWords program) currently run the two best PPC search properties.

New Search Engines to be born in 2004
After having successfully developed and deployed my paid inclusion search engine Global Business Listing ( http://www.globalbusinesslisting.com/ ) in 2003, in 2004 I will begin the development of my new PPC search engine Net Globe Media ( http://www.netglobemedia.com/ ), which should be completed some time in the third quarter.

Although slightly different, Net Globe Media will operate on principles similar to those Google currently uses for its AdWords program, and will resemble some of the features Overture offers.

What will distinguish Net Globe Media from its competition will be its drastically lower bid prices and the way advertisers can rotate their ads. There will also be a tool to help you place the best bids on your ads while lowering their cost, plus a few additional value-added features.

Where I See The SEO Industry As A Whole
After the Florida update and the many sites that were penalized in the process, I strongly believe the SEO industry will endure its largest shakeout ever in 2004. The search engine optimization firms that are serious, that have consistently produced good results for their clients, and that constantly keep up with the many changes in technology and new developments are the ones that will thrive in 2004.

The others, the ones that resort to unethical practices or to techniques forbidden by the search engines, will disappear. Google and many other major search engines are constantly developing new technology and algorithms to detect that sort of thing, and search engine spammers will get caught at their own game. As a result, the SEOs and SEM firms that remain after this shakeout will be stronger than ever, and will represent the best marketing investment a company can make for its online advertising programs.

In Conclusion
2004 should be an interesting year, both for the search engines themselves and for the SEO community as a whole. There will be no place for second best. Companies and businesses that have their sites optimized professionally will benefit greatly. Companies, site owners and webmasters that continue to produce good content for their users will continue to be viewed positively by search properties such as Google, and will continue to produce higher-than-average ROI.

The ones that don't, well, they could end up on the sidelines, way out of the results pages.

Happy New Year to all of you!

Author:
Serge Thibodeau of Rank For Sales

Looksmart In The New Year
A number of changes are around the corner for LookSmart in 2004, and some of the new features are already rolling out. Below is an overview.

Beginning in January 2004, MSN will be removing the Web Directory Sites section of its search results, and will no longer have a direct partnership agreement with LookSmart. Current LookListings may still appear on MSN through the distribution partnership with Inktomi, but paid inclusion traffic will be reduced at that time. The LookListings will also continue to provide traffic from more than 85 other LookSmart partners such as Lycos, About.com, Road Runner, InfoSpace and CNET. Looksmart is negotiating with other portals and ISPs for additional distribution of LookListings.

Looksmart recently launched Sponsored Listings, a bid-for-placement keyword product. Sponsored Listings are fully integrated with LookListings, so you can use your existing login and password, get started with just a few simple clicks, and monitor the performance of all of your paid inclusion and pay-for-placement listings together in the LookSmart Advertiser Center.

LookListings now has other important new features designed to give you greater control and flexibility. Starting December 16th, you can pinpoint customers with specific keywords, or reach a broader audience using relevancy-based inclusion targeting.

Create Keyword-Targeted Listings
If you provided Relevancy Keywords for your LookListings campaigns, you will now see those Relevancy Keywords displayed as new keyword-targeted listings in your account. These new listings will appear in search results when users search for the associated keyword across the LookSmart network. If you did not provide Relevancy Keywords for your campaign, you may now access your account and add keywords to your existing LookListings campaigns to improve the precision of your targeting.

Combine Targeting Options In One Campaign
You may combine two different targeting options within the same LookListings campaign. With “Inclusion Targeting” your listings will continue to be displayed whenever they are relevant across the LookSmart network, just as they were previously. Now, by adding “Keyword Targeting,” your listings will only appear for searches that match the specific keywords you choose. Use both targeting options together to tailor messages to specific groups of customers.

You now have the ability to bid for higher placement on your LookListings keywords. Take advantage of the new "Max CPC" feature with auto-discounting: set the CPC ceiling you are willing to pay for any given keyword, and the actual CPC you are billed will be discounted to the minimum value necessary to maintain your desired position in the search results. This means you will never pay more than you need to show up where you want.
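The billing rule is easy to express in a few lines. The sketch below assumes a one-cent increment over the next-highest bid, which is an illustration only; LookSmart's exact increments and tie-breaking rules aren't documented here.

```python
# Sketch of Max CPC auto-discounting: pay just enough to stay ahead of
# the bidder below you, never more than your ceiling. Amounts are in
# cents; the one-cent increment is an assumption for illustration.

def billed_cpc_cents(my_max_cents, next_highest_cents, increment=1):
    """Bill the minimum of your ceiling and one increment above the next bid."""
    return min(my_max_cents, next_highest_cents + increment)

# You set a 75-cent ceiling; the advertiser below you bids 40 cents.
print(billed_cpc_cents(75, 40))  # 41 -- you keep your position at a discount
```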

No Fees To Create Or Update Listings
It is now free to create new listings, add keywords, and update your campaigns.

It will be interesting to see what new announcements Looksmart makes in the New Year prior to their deal with MSN ending on January 14th.

Author Bio:
Jason Lane, Senior Search Marketing Strategist at www.searchengineposition.com