Ask us a question!

Web Moves Blog

Web Moves News and Information

Archive for the 'Search Industry News' Category

Failure of the SEO system
I see people daily preaching that web site owners need to pay $7,500.00 or more for a marketing consultant who also practices search engine optimization. The reason given is that any SEO can get you top rankings, but rankings alone will not help you sell your products. They preach that searchers have no trust; they say that getting a #1 position for your relevant keywords will only bring visitors, not sales.

To an extent this is true. But the fact is, this is said so you will pay the fee they ask. They know their service, they know their prospect’s mindset, and they do a fine job of turning you into a new customer.

You know your products or services better than they possibly could. You know your customers better than they ever could. This stands true of any industry or business model. Make that work for you!

Instead of shelling out the dinero, why not sit down and lay out the groundwork for a plan of action based on what you know about your business and its customers? What do they want to see, how much detail do they want, and above all, what does it take to close the sale?

Once you know all of this, the rest is merely laying it all out on the web site according to your plan. It sounds easy, but it is not; it takes hard work on your part. But you are not relying on someone who claims they can research your business and become expert enough in a matter of days to sell your product line.

Do not fall for the hype
Once you have a web site that you are confident will convert searchers into sales, it is time to make sure searchers are finding your web site. These same marketing consultants will tell you that you need them for this as well. They will tell you to concentrate on the lesser search terms that bring in smaller amounts of traffic, claiming this method delivers better quality traffic because it targets the visitor better. Bull! The fact is, these marketers know that they do not have the ability or the means to target the more competitive search terms and keywords. They use this method so that they can show you good results on the lesser keywords, keeping you ignorant and happy in your newfound traffic. I say ignorant, because they will not tell you that you could in fact have two to three times the targeted traffic if you used a true professional.

Professionalism
There are different definitions of a marketing professional, but for this purpose we are talking about search engine optimization professionals. There are many who call themselves professionals, but who will not go after the competitive keywords and phrases. They are the ones who will tell you it is better to be happy with less traffic, and thus, less sales. They do not practice search engine optimization, because they do not know how, or are not willing to do that much work. The true SEO will make sure that you get the absolute most that you possibly can for your money spent.

There are some well-published, so-called professional SEOs out there who are excellent at promoting themselves. They can make you think their way is God’s way, with an almost religious zeal and grand speeches about what is ‘ethical’.

In summary: watch out for self-hype, do not assume you have to pay someone $7,500.00 or more only to get less than the best rankings, and do some research on the marketer you choose to use. Google searches are often an excellent way of seeing the real story behind a marketer.

Author: William Cross

What to do with Yahoo!
Chances are, if you have a website and were previously paying for an Altavista listing you received something like this:

“AltaVista’s Paid Inclusion Program has changed. A New sign-up is necessary for existing AV express inclusion customers to become Overture Site Match members!

NOTE: YOUR ALTAVISTA MEMBERSHIP DOES NOT TRANSFER OVER TO THE NEW OVERTURE SITE MATCH PROGRAM. YOU MUST SIGN-UP YOUR URLs TO THE NEW OVERTURE SITE MATCH PROGRAM.”

This is an email which was sent to Altavista customers explaining to them that Altavista would no longer be accepting fees for inclusion.

Further to this, you’ve probably also noticed lots of press about Overture’s new Site Match program, in which they say that you can pay to be included in the new Yahoo! database. In fact, the above notice makes it seem that you must invest in Overture Site Match to remain in Altavista. You may also have heard that you need to pay for Site Match to get into Yahoo!

Also, just to mix things up a little more, you may have heard about the free Yahoo! submission option. While it offers no guarantee of if or when your site will be indexed, it is still a way into the index.

So the question I’m sure many people have is this: What the heck is the best way to get into Yahoo! (and by extension, Altavista and Alltheweb) and get found? Is it worth $29-$49 per URL plus 15-30 cents per click to be found in the index? The simple answer is it depends.

Before we get into that, let’s outline the various options currently there to help a website get listed in Yahoo!

The Options
Right now, there are multiple ways to have your pages listed in Yahoo! and most of them cost money.

From the top of a search results page down, let’s look at what the different listings mean.

Along the top and right of the screen are Overture sponsored listings. These are PPC listings in which the highest bids win the top spots. These are the easiest to identify and the simplest to explain.

Once you get into the “Web Results” however, it starts to get a little more convoluted.

Within web results you will find 3 or 4 different types of included results, depending on the competitiveness of the search. In this mass of results you will find Yahoo! directory listings, Yahoo! free listings via the Yahoo! Slurp crawler, Overture Site Match, and Overture Xchange. So what’s the difference?

Well, the Yahoo! directory hasn’t changed since the update last week; it’s still a US$300 submission fee with no guarantee of inclusion. And even if you are included there is no guarantee of rankings. Free listings are just that, listings gathered by either submitting via Yahoo! free submit, or having Yahoo! Slurp find your site another way (i.e. a link from another site).

Overture Site Match is a paid program whereby you pay a URL submission fee to get indexed followed by a cost per click on top of that for every click on your listing. Again, with no guarantee of ranking, only a guarantee of indexing.

Finally, there is Overture’s Xchange, an XML feed similar to the Inktomi XML feed. You have to request a rep from Overture to help you set this up. Again, there is no guarantee of rankings, only a guarantee of indexing.

So, what is the best way to get into Yahoo!?
It depends on the current state of your site in Yahoo! among other things. Is your site currently indexed in the engine? And if so, how many pages are there? So let us start there.

Go to Yahoo! and do a search for your site. You may have to try a couple of times to see it. Try the search with and without the “www”, but make sure you include the “.com” or “.ca” (or whatever) extension your site uses. If your site is indexed in Yahoo! then you should see your listing come up. The search may also work with just the main domain, without the “www” and extension.

Now, to see how many pages are currently indexed for your site, look for a “More pages from this site” link near the bottom right of your listing. Click on this to see how many site pages are indexed. Again, this may take some experimenting. When I initially did a search for “searchengineposition.com”, it showed only one page; however, a search for “searchengineposition” followed by a click on the “more pages” link returned in excess of 1,500. So be patient and experiment.

Chances are good that your site has already been indexed in Yahoo! which leads to the question (which I’m hoping you already know the answer to) “should I pay for Overture’s Site Match program?”

My recommendation is probably not. If you are already in Yahoo! as a result of the free crawl, then why pay for listings? In this case, if most (or all) of your site is indexed then paying for listings won’t influence the rankings.

However, if you find that there are many pages that aren’t indexed, but you feel should be, then perhaps you could consider paying for these pages to be listed. But not until you read on.

What if some (or all) of your site isn’t in Yahoo?
So let us now say that your site isn’t in Yahoo! or pages which you feel are really good aren’t getting indexed. What then? Should you pay for listings in Yahoo! or will it even help?

Again, I’d have to say probably not.

Before rushing out to pay more money, first take a look at some possible reasons why your site may not be getting picked up by the Yahoo! crawler.

Check your log files to see if the crawler has been visiting your site. If you use a program like Webtrends you can create a custom profile which will analyze your logs specifically for spiders, or if you like, you can narrow the analysis down to just the Yahoo! Slurp crawler. This specific profile might be handy for my next point.

If the spider has visited, and you have the profile specific to this spider, take a look at what pages it has been requesting. Also, look at the paths through the site. Is the spider going deep enough into the site to get to your really important content? Or is it simply requesting one page at a time and not going any deeper than 2 or 3 clicks in? Is it merely requesting the index page over and over, or is it trying to crawl the whole site? Try and see if you can figure out what barriers (if any) the spider is coming up against. Remember that just because other spiders can crawl your site completely doesn’t mean that this one can.
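To show what such a spider-specific report boils down to, here is a minimal sketch in Python (the log file name and the Apache “combined” log format are assumptions; a log-analysis package like Webtrends is not required):

```python
import re
from collections import Counter

# Matches the request and user-agent fields of an Apache "combined" log line.
LOG_LINE = re.compile(
    r'"(?:GET|HEAD) (?P<path>\S+) HTTP/[\d.]+" \d{3} \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def slurp_requests(log_path):
    """Return every URL path requested by Yahoo! Slurp, in order."""
    paths = []
    with open(log_path) as log:
        for line in log:
            m = LOG_LINE.search(line)
            if m and "Slurp" in m.group("agent"):
                paths.append(m.group("path"))
    return paths

def depth_histogram(paths):
    """Count requests by crawl depth: "/" is depth 0, "/a/b.html" is depth 2."""
    return Counter(p.rstrip("/").count("/") for p in paths)
```

If the histogram never gets past depth 2 or 3, the spider is not reaching your deeper content, which is exactly the barrier described above.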

Check your robots.txt file. Sometimes this is a huge barrier if it’s not coded properly. A good indicator of this will be your log analysis report. Is this file the only one being requested, or is the spider requesting a page, then the robots.txt file, and stopping there? If this is the case, chances are the spider has somehow been excluded from indexing the site.
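Python’s standard library includes a robots.txt parser, which makes for a quick sanity check; the rules below are an invented example of the kind of misconfiguration that shuts a single spider out:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content; "Slurp" is Yahoo!'s crawler name.
robots_txt = """\
User-agent: Slurp
Disallow: /

User-agent: *
Disallow: /cgi-bin/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# The first rule silently locks Yahoo! Slurp out of the entire site,
# while every other spider can still crawl everything but /cgi-bin/.
print(parser.can_fetch("Slurp", "/index.html"))      # False
print(parser.can_fetch("Googlebot", "/index.html"))  # True
```

Run the same check for every spider you care about before assuming the file is harmless.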

If you think there is a barrier, work towards fixing it; this may help get your site re-crawled and re-indexed by Yahoo!

Talk to your hosting provider to see if perhaps they have banned any IPs from accessing the server. Sometimes, if a hosting provider sees a spike in traffic from a particular IP or range of IPs, they will exclude it from accessing the server for security reasons. Perhaps they inadvertently banned Yahoo!’s Slurp from accessing your site.

So what now?
If after your analysis you can’t see any reason why you may not be indexed in Yahoo! then perhaps performing a free submission is all you need to do. Try the free submit and see what happens. With searchengineposition.com, for example, the site is re-cached every couple of days, so you should see Slurp visiting your site fairly soon after submission. I know Yahoo! says that they can’t guarantee if or when your site will get crawled, but my gut tells me that it won’t take any longer than a free submit to Google. After all, their crosshairs are set on Google, so they want to at least maintain the same level of service.

Summary
I guess what I’m trying to say here is: be sure that there isn’t some physical barrier to accessing the site before rushing out to pay for an Overture inclusion. Because if you pay for inclusion but still have barriers to indexing, your site still won’t get indexed and you will be out $29-$49 per URL.

So before going out and spending a ton of money on the assumption that it will get you into Yahoo! be sure that your site can be indexed. And if it is already indexed in Yahoo! then paying won’t influence your rankings anyway, so save your money and try other proper optimization efforts to positively influence your rankings.

Also remember that in the coming weeks your Yahoo! listing should translate to equivalent Altavista and Alltheweb listings (and to a lesser extent, a way into MSN), so even if you do disappear from Altavista or Alltheweb, be patient as you should come back relatively soon. Remember that these engines need an index to survive, and if the index is a centralized Yahoo! index all the better. In other words, you shouldn’t be in just Altavista and not Yahoo! (or vice versa).

On a more positive note, previously a site owner had to pay for an Inktomi inclusion to get into MSN. Now with the free submit into Yahoo! you should be able to get into MSN for a lot less money. Sure you are still at the mercy of MSN’s ranking algorithms, but there is a good chance that if you are already in Yahoo! that you will be in MSN as well, whether you’ve paid the listing fee or not.

Also, while I haven’t touched on it here, Overture is offering click tracking tools on their paid inclusion listings, including Site Match. Therefore there may be benefit in paying for clicks if you think that the data collected is worthwhile. We are still analyzing the click tracking data, so we are unable to say at this point how worthwhile it is, but we will keep you posted.

Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists

Introduction
Back in November, when the Google Dance began, Barry Lloyd of makemetop.co.uk wrote an article entitled “Been Gazumped by Google?” GAZUMPED! What a wonderfully descriptive term. In fact, it succinctly describes what happened to us when our website went from #1 to oblivion a few months ago.

We were gazumped, swindled, cheated out of what was rightfully ours. Okay, who’s to say what search engine position rightfully belongs to anyone. I mean, let’s face it, getting to the top of the search engines for coveted keywords is really just a crap shoot, isn’t it?

Or is it?

The SEO Cycle
We’ve been at this internet game for a long time, a decade in fact. Dinosaur days! We don’t pretend to be search engine experts; it’s not our forte. Nevertheless, we have been around long enough to see some patterns emerge.

Here’s a recurring pattern we’ve observed in Search Engine Optimization:

1) The SEO experts research and make claims of what works to get a website to the top of the search engines.

2) It works for a time, and websites implementing their tactics move up the ranks.

3) Eventually, the algorithms change, and websites gain or lose rankings.

4) SEO experts make adjustments, then back to step one in the cycle again.

Around and around we go like mice on the optimization exercise wheel. Furthermore, we do not foresee any chance of stepping off the optimization wheel for years to come. From our experience, as long as search engines exist, the algorithms will always change, and the SEO experts will always have a job.

But, is Search Engine Optimization really necessary?

To Optimize or Not
If you’ve set up a flow of traffic to your website outside of search engines, there really is no need to optimize. However, if you want search engine traffic, you’ll need to step onto the wheel. Furthermore, you need to know there’s some pretty fierce competition riding the same wheel. But, be forewarned. Once you step onto the wheel, it will become part of your regular marketing exercise.

Is search engine traffic worth stepping onto the endless wheel?

Marketing experts tell us all the reasons why search engine traffic is targeted. But let’s put that aside and talk about real-world experience for a minute. In the past 10 years, our websites have received traffic from every imaginable source. But you want to know about search engine traffic, right? We’ll give it to you straight.

Search engines have sent us some of our best customers. A significant number have stayed with us for years, and a surprising number have made purchases on their first visit – direct from the search engines.

Now, with this in mind, perhaps you can understand why we felt gazumped after losing a significant amount of search engine traffic.

So, what exactly happened to us?

Dancing With Google
For years, we have been building a professional community where home-based entrepreneurs can network with one another and find the resources they need. As a clearinghouse of information, it has been a real task to organize the thousands of pages of information and make it user-friendly.

We had been #1 on Google for years for the coveted term “home business,” among other terms. As such, we enjoyed traffic that brought us a significant number of customers and subscribers. Traffic and sales grew consistently for those years. Our Alexa rating showed the continued growth by going from a ranking of 30,000 to almost 15,000 in a year.

Enter the ‘Florida’ Update.

In November, we were still #1 on Google. Traffic and sales were good.

In December, it appeared the Update was good for us. We remained #1 and sales were at an all-time high. “We’re getting through this just fine,” we gloated. Nothing to worry about.

But, then things began to change – drastically!

Gazump #1
Around January 3rd, our traffic plummeted to an all-time, two-year low
( http://www.alexa.com/data/details/traffic_details?q=&url=http://www.homebusinessonline.com/ ).

After a quick perusal of our traffic logs, it became obvious Google was playing with our rankings. You know, playing the dance floor. Google may have come to the dance with us as her date, but she left with a different partner.

A little research confirmed that not only had we dropped in our rankings for our coveted search term, but we were nowhere to be found in even the top 1,000. What? From #1 to oblivion? How could that be? The date was over… Gazumped by Google!

Gazump #2
Some weeks later, typing ‘link:www.ourURL’ in the Google toolbar brought another startling revelation. While our inbound links before the dance were around 800
(http://www.linkpopularity.com/linkpop.cgi?url=http%3A%2F%2Fwww.homebusinessonline.com),
Google now listed only 69. Now, that doesn’t even make sense after almost a decade online, especially when typing in our URL returns 2,620 links
(http://www.google.com/search?q=%22homebusinessonline.com%22).
Gazumped by Google!

Gazump #3
But, why stop there? Let’s add a little salt to the wound, shall we? It also appeared that Google was only spidering a few, select parts of our website. No more deep crawls where all the meaty content can be found. Gazumped by Google!

So, where from here?

Lessons Learned
After two months of a substantial drop in traffic and sales, we are not out of business. Why not? For one, we have not counted on Google as our sole source of traffic. We are still receiving decent traffic, but we could be doing better. I’m afraid we had been gliding on the coattails of Google for too long. Sometimes you just need a good kick in the pants. This wake-up call has been good for us. Like multiple streams of income, we will up the ante on multiple streams of traffic. That way, when one traffic source dries up, it will not affect us significantly.

It’s time to make some well-needed changes to our website. It could be better. A lot better. We’re going to stop talking about them, hunker down, and make those changes.

We believe this dance is not over yet, and Google may decide to return to the dance floor. Considering ‘Gazump #2’ above, it only makes sense that there are still plenty of kinks to be ironed out.

We’ll get our traffic back and in a big way. If not through Google, certainly through other traffic sources. There’s more than one way to reach the summit.

The Very Best Strategy
The fact is you could spend hours upon hours in search engine optimization, and while we believe that optimizing your website is important, it shouldn’t be the main focus. We’re of the opinion that websites should first be built for customers and prospective customers, and second for search engines. Take a look at your website from your customer’s perspective. Does it do what you want it to do?

Go ahead and hop on the optimization wheel. Get it turning, but don’t become obsessed with it. If you have the resources, consider hiring an SEO specialist to help. Be sure you have plenty of traffic sources; don’t rely on just one or two.

Finally, anything online, including the search engines, is driven by the market’s opinion. When the dance is over, if Google doesn’t provide the very best search results, the market will go elsewhere. For this reason, no matter what might happen to us personally through the Google Dance, the end result will be better search results – if not at Google, then at another search engine.

Either way, the market will determine who wins in the end.

Author Bio:
Seasoned entrepreneurs, Dave and Heidi Perry have developed half a dozen businesses and are founders of HomeBusinessOnline.com and PrettyGreat.com. Known for their unique insights and straight-shooter style, Heidi and Dave are editors of HomeBizBytes. Receive a free issue at http://www.HomeBusinessOnline.com/nsl.htm?sya

Who Cares? Webmasters Want “Referred Traffic”
“Searches Performed” – Empty Lies, Damned Lies and Statistics!

Yahoo abruptly quit using Google as a search partner recently in a surprise move that has the search industry now scrambling for statistics to analyze and numbers to bandy about. I’d like to share some rarely discussed statistics and numbers with you here. First the numbers and stats from the press, then I’ll share a few of my own. Here are the stats that are getting the most attention for the Yahoo search story.

Share of searches performed by U.S. users
(source: comScore Media Metrix)

Google = 35%
Yahoo = 28%
AOL (powered by Google) = 16%
MSN = 15%
ALL Others = 6%

Charts and analysis of this statistical lie can be seen at:

http://www.searchenginewatch.com/reports/article.php/2156431

As an SEO specialist, I don’t care if Yahoo and MSN together get almost 40 percent of all searches performed as long as Google delivers nearly triple the REFERRED traffic of either of those also-rans.

I love those numbers presented by comScore. The problem is that they have nothing to do with search engine REFERRED TRAFFIC to webmasters. I did a small study of client traffic stats last year and found in EVERY case that Google delivered over 70% of referred traffic to client sites; one client gets nearly 90% of his referred traffic from Google!
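As a rough sketch of how such a study can be tallied from raw log data, here is the idea in Python (the referrer strings and engine hostname keywords below are illustrative assumptions, not figures from the study):

```python
from collections import Counter
from urllib.parse import urlparse

# Engines of the day; a referrer counts toward the first engine whose
# name appears in its hostname.
ENGINES = ("google", "yahoo", "msn", "aol", "altavista", "alltheweb")

def search_referral_share(referrers):
    """Percent of search-engine-referred visits coming from each engine."""
    counts = Counter()
    for ref in referrers:
        host = urlparse(ref).hostname or ""
        for engine in ENGINES:
            if engine in host:
                counts[engine] += 1
                break
    total = sum(counts.values())
    if not total:
        return {}
    return {engine: round(100.0 * n / total, 1) for engine, n in counts.items()}

# Hypothetical referrers pulled from an access log:
refs = (
    ["http://www.google.com/search?q=widgets"] * 7
    + ["http://search.yahoo.com/search?p=widgets"] * 2
    + ["http://search.msn.com/results.aspx?q=widgets"]
    + ["http://www.example.com/links.html"]  # not a search engine; ignored
)
print(search_referral_share(refs))  # google gets a 70% share here
```

Run over a real month of logs, a tally like this is what separates “searches performed” from traffic actually delivered.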

Those included some new clients as well as several I’d been working with for up to a couple of years. This indicates to me that it’s not the work I do that favors Google, and that the result is similar across many types of sites, optimized or not. My article discussing these numbers was picked up in half a dozen places and debated in a forum or two, because it seems shocking to imagine Google dominating at that level.

http://searchengineoptimism.com/Google_refers_70_percent.html

If you scroll down the page at that SearchEngineWatch link above, you’ll see another chart that reflects Google’s reach and, guess what? It’s actually closer to 70%, due to the fact that Google powers AOL and, up until last week, powered Yahoo. Now I expect the loss of Yahoo’s 28% will make those numbers fluctuate a bit in coming months as searchers decide whether they like the results they get from Yahoo search without Google-powered results.

Now that Yahoo search will no longer be contributing to Google’s 70% of referred traffic, I suspect the numbers will differ from last year’s when webmasters look at their traffic stats at the end of next month. I’ll look forward to THOSE numbers!

Still, I’ll wager that if you look at your client traffic stats for search engine referrals, delivered traffic from Google will FAR outdo traffic from any other search engine for some time to come. When that changes, then it will start to matter. Until it changes, who cares even if Yahoo and MSN search get double their current “searches performed” when the referred traffic they deliver is just a fraction of that?

Who cares if the competitors get 27% of searches performed if they deliver only 5% of referred traffic? What do people do when they get results at Yahoo and MSN search? They must stay there, follow paid links, or give up their search and go shopping, since they aren’t clicking through from those organic results to the top ranking sites!

I have multiple top ranking terms at MSN and Yahoo for several clients that get trickles of traffic from both of those sites, EVEN THOUGH those very same search phrases deliver dramatically higher traffic from Google – and in cases where they rank lower at Google! Puzzling, eh?

Google delivers traffic. The others don’t deliver at even half the rate that Google does. So I simply don’t care that nearly a third of searches are done elsewhere. I am going to work on ranking well for the search engine that DELIVERS VISITORS from organic search.

I’ll pay for traffic from the others if necessary, since they don’t deliver even on TOP RANKING searches. I believe that is because the search results at MSN and Yahoo and their search partners have too many flashing, blinking, prominently placed paid ads dominating the SERPs.

Yahoo and MSN may get lots of searchers searching, but if those searchers don’t click through on those top ranking organic results – what earthly good does it do to rank well in organic results at those search engines?

I suspect that Yahoo will profit nicely from Overture, and since they appear to be so highly profit-driven (yes, I agree that is a good thing for business, but bad for search) then the results will be profit driven too. Paid results will dominate at both Yahoo and MSN and they will continue to deliver far less organic search referred traffic than does Google.

Until I see some changes in referred traffic, I’ll bet some serious money that Google will continue to deliver over 50% of all referred search traffic to everyone due to the emphasis on relevance above profit.

The impending Google IPO makes me nervous about all of this because Google will have to do as the others do and emphasize paid results on the SERP’s at a much higher level than they do now in order to keep investors happy. Investor pressure.

Those two clearly marked sponsored ads at the top of the page and the clearly boxed AdWords ads along the right will be charming memories in short order. We’ll see paid links grow to dominate the Google SERPs, and it wouldn’t surprise me if they started running banner ads in addition.

Statistics from the webmaster perspective show Google sending nearly triple the traffic of all other search engines combined. That’s the only statistic that webmasters care about.

Author Bio:
Mike Banks Valentine is a Search Engine Optimization specialist practicing ethical small business SEO Search Engine Placement, Optimization, Marketing http://SearchEngineOptimism.com/SEO_articles.html http://SEOptimism.com

27 Feb 2004

Ask Jeeves Interview

Introduction – Ask Jeeves Interview
If you plan on attending Jupiter Media’s Search Engine Strategies conference in New York on March 1st through 4th, you’ll no doubt hear a lot of buzz surrounding the future of search engine technology. With Yahoo recently switching to a new and improved Inktomi index, Google testing localized search and MSN promising to enter the fray sometime in the next twelve months, you can bet that the search engines we know today will be much improved over the next couple of years.

While the spotlight may be on Google, Yahoo and MSN, Ask Jeeves has quietly improved their search engine to ensure a user experience that is second to none. While Ask could comfortably rest on their laurels, they know that the competitive world of search is constantly changing, and that in order to continue their success they need to remain at the cutting edge of search engine technology.

After being fortunate enough to sit down with Microsoft’s Robert Scoble and discuss his thoughts on search engine technology development, I caught up with Ask Jeeves’ vice president of products, Jim Lanzone and asked him his thoughts on what the future of search might hold.

Starting things off
[Andy Beal] Thanks for taking the time to answer some questions. Let’s start with what you see happening in the future?

[Jim Lanzone] Unfortunately, we can’t talk publicly about the most exciting search technologies we’re building, because they are proprietary to us and we wouldn’t want Google copying us now would we? We have some very tasty special sauce we’ll be launching over the course of the next few quarters that will make our search results perceivably better than the competition’s.

[AB] Without giving too much away to your competitors, can you give any hints as to what might be upcoming from Ask Jeeves?

[JL] In the area of things we can talk about, we are very excited about our work on both the search technology side and the search experience side. They are equally important to helping people find what they need.

Regarding the user experience, we’ve had a lot of success with Smart Search the past year, and you can expect to see us continue to pursue that strategy. Smart Search is more of an ideology here than a brand name. It means giving the user smarter results in a more intuitive way, and what that means differs depending on what kind of search you’re doing.

[AB] What new developments in search do you see happening in the next 3-5 years?

[JL] Because accessing information is such an integral part of our lives, I believe your interaction with search will change dramatically in the next 3-5 years. You will be able to access search databases from other sources than the keyboard (with voice recognition technology, maybe), and on different platforms (such as the GPS in your car).

[AB] GPS (global positioning system)? How do you see GPS and search interacting?

[JL] For example, a GPS with search capabilities could tell you where to find the best local pizza restaurant or nearest medical clinic in a neighborhood you visit. Of course, in order for that to happen, local search capabilities will have to vastly improve, as will voice recognition technology.

[AB] Apart from GPS, do you see search having an impact on any other consumer products?

[JL] Search is the #1 activity on the Web, and there’s no reason why the utility of search or the Internet should be restricted to your PC or Mac. I believe a device will come along and have the same impact on search as the iPod did for music. Cell phones will probably adapt more to this device, ultimately, than the other way around, due to usability issues, and the user’s desire to carry only one device. Standing on a street corner and using this device, you will search for a local restaurant, or a cab company, through the Internet. Instead of going to the cab company’s website, you will click a link and initiate a phone call. The search engine will be compensated for the call (this is the traditional Yellow Pages model of “metered calling”) rather than the click.

More of the interview
[AB] What if cost wasn’t an issue? Any dream product?

[JL] If cost were no issue, we’d also like to see an Ask Jeeves-enabled PDA in every user’s hand!

[AB] Companies such as Eurekster are betting that social networking is the future of quality search engine results, what are your thoughts?

[JL] In terms of the social networking devices being developed by other companies, there are two types we’re seeing get attention. The first is the kind being used by the likes of Friendster and Tribe.net, where social networks are being used to help people find a job or a gardener or a date. The potential problem with this is the “reverse network effect”, whereby the more the network grows, the less useful the recommendations are by those in the network. For example, how much more useful is it to me, versus the yellow pages or a search engine, to be recommended a contractor by my friend’s cousin’s neighbor? Now imagine if that’s how I’m finding a date for next Friday night?

Meanwhile, with something like Eurekster, the “social networking search engine”, you may face the same problem. At what point are these results more useful than those given by our “normal” engine, which is already getting smarter and smarter about whom it serves certain results to, and when? So, in the end, we believe that social networking as defined and utilized by Teoma is the best-of-breed approach in this area, and the most effective growth will be built on its foundation.

[AB] What makes Teoma the “best of breed”?

[JL] Our Teoma technology is predicated on social networking theory, as originally pursued by the Clever team at IBM in the mid-90’s. Teoma was the first (and is still the only) search technology that can identify the Web graph’s expert hubs and authorities in real time.

[AB] What is Teoma doing that the IBM team couldn’t do?

[JL] The Clever team identified that it was a better mousetrap for producing relevant search results, but thought it would take a server farm the size of the state of Texas to produce them in real time. Teoma does it in a split second. Others questioned whether the technology would scale past a 50-million-document index; we’re now at 2 billion. Remember that Teoma is a much younger technology than our competitors’, so in some ways we’re only now starting to see its power. And as it grows, social networking will continue to be at the heart of what makes Teoma different and special.
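The “expert hubs and authorities” idea Lang credits to IBM’s Clever team is the hubs-and-authorities (HITS) analysis. As a rough illustration of the iteration – the toy link graph below is made up, and this is in no way Teoma’s proprietary real-time implementation – the mutual reinforcement between hubs and authorities can be sketched like this:

```python
# Minimal sketch of the hubs-and-authorities (HITS) iteration behind
# the Clever project. The toy link graph is illustrative only.

def hits(links, iterations=50):
    """links: dict mapping page -> list of pages it links to."""
    pages = set(links) | {p for targets in links.values() for p in targets}
    hub = {p: 1.0 for p in pages}
    auth = {p: 1.0 for p in pages}
    for _ in range(iterations):
        # A good authority is linked to by good hubs.
        auth = {p: sum(hub[q] for q in links if p in links[q]) for p in pages}
        # A good hub links to good authorities.
        hub = {p: sum(auth[t] for t in links.get(p, [])) for p in pages}
        # Normalize so the scores stay bounded across iterations.
        a_norm = sum(v * v for v in auth.values()) ** 0.5 or 1.0
        h_norm = sum(v * v for v in hub.values()) ** 0.5 or 1.0
        auth = {p: v / a_norm for p, v in auth.items()}
        hub = {p: v / h_norm for p, v in hub.items()}
    return hub, auth

graph = {"portal": ["news", "docs"], "blog": ["news"], "news": [], "docs": []}
hub, auth = hits(graph)
# "news" emerges as the strongest authority: two hubs point to it.
```

The scaling problem Lang describes is visible even here: each iteration touches every link in the graph, which is why doing this over the whole Web in real time was considered impractical in the mid-90s.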

[AB] Do you foresee a time when commercial search results (product/services) will be separated from informational search results (white papers/educational sites)?

[JL] Yes, similar to Yellow vs. White Pages. But since index search is already separate from P4P links, this is a much more important prediction for the future of paid inclusion. The future of paid inclusion is more likely to be in separate, possibly 100% paid indexes, than it is the current mix of paid and unpaid links, and structured and unstructured data. It’s better for monetization, better for relevance, and probably better for the FTC. When you think about it, this is already happening with a site like Shopping.com, which is basically product search with a 100% paid index. Same thing with the Yellow Pages. I could see this model extended to jobs, airfares, and even adult sites.

[AB] We’ve talked a little about providing more relevant search results. If search engine users gave up a little of their privacy and allowed their search habits to be monitored, would this allow the search engines to provide better, customized results?

[JL] Some search engine users are already giving up their privacy willingly, for example with the latest Google 2.0 toolbar. The reason why Google wants this information is because the answer to your question is a resounding “yes”! Even more important than results customized for individuals, however, which will have some utility but not as much as some may think, are results customized for groups of individuals who exhibit similar characteristics. For example, those who frequently visit certain sites. Moreover, search engines can use this information to track the quality of their competitors’ results, because these toolbars can – if users allow them to – track their usage on other sites.

[AB] Jim, I appreciate you taking the time to answer these questions. What last thing would you like readers to know about Ask Jeeves?

[JL] We’d just like to add that we’re very proud of the service Ask Jeeves has become over the past 18 months. It is now a world-class search site, featuring world-class search results thanks to Teoma and a world-class search experience thanks to Smart Search. 2004 will be an exciting year for us.

Conclusion
Anyone connected with the search engine industry probably shares my excitement that the future holds some great advances in technology. Search engine users are going to be in for a thrill as Google finally faces some legitimate challenges from a host of search engine companies, both large and small. You can keep up with the latest search engine news and developments at Search Engine Watch or by visiting my blog, www.SearchEngineLowdown.com.

Author Bio:
Andy Beal is Vice President of Search Marketing for WebSourced, Inc and KeywordRanking.com, global leaders in professional search engine marketing. Highly respected as a source of search engine marketing advice, Andy has had articles published around the world and is a repeat speaker at Danny Sullivan’s Search Engine Strategies conferences. Clients include Real.com, Alaska Air, Peopleclick, Monica Lewinsky and NBC. You can reach Andy at andy@keywordranking.com and view his daily SEO blog at www.searchenginelowdown.com.

The New & Improved Yahoo
On February 18th, Yahoo implemented serious changes to its search directory, in an effort to better compete against Google and, to a lesser degree, MSN.

Since the 18th, I’ve carried out extensive testing with Yahoo and, so far, I like the new features implemented on their search property. I think the changes represent some positive improvements, both in the quality of the results and in their relevancy. Now that Yahoo uses a ‘spider robot’ that crawls the whole Web, much as Google does, we need to change Yahoo’s category from search directory to true search engine.

If you would like a peek ‘under the hood’, Yahoo has a new crawler known as Yahoo Slurp, similar to the Inktomi Slurp bot many webmasters have been seeing in their logs over the past few months. Yahoo Slurp works very much like GoogleBot (Google’s crawler). Yahoo Slurp is designed to follow every text link found in a website.

Some of the changes implemented
Among the positive changes I have noticed in Yahoo:
• For sites with direct RSS feeds, a link to the feed URL is now available with this new implementation. If required, you can even add the RSS feed to your My Yahoo page with a click of the mouse.

• Like Google, a cached version of each Web page is now available.

• AltaVista and AllTheWeb continue to use separate search functionality, using an independent search index. (Note: AllTheWeb and AltaVista are both owned by Yahoo.)

Note that a customizable and uncluttered (read no ads) entry point to Yahoo’s various search databases has been available for some time at www.search.yahoo.com.

I’ve also observed that Yahoo seems to analyze much the same page content and features as Inktomi did, although the results displayed today are somewhat different. Important criteria such as keyword frequency and keyword density, and features such as keyword-rich title tags and key-phrase placement, appear to be significant factors in Yahoo’s current algorithm.
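Keyword frequency and density are easy to measure for your own pages. As a rough illustration – the function and sample text below are my own, not anything Yahoo or Inktomi ever published – the two figures relate like this:

```python
# Illustrative sketch: counting keyword frequency and density for a page.
# This is a generic measurement, not Yahoo's or Inktomi's actual scoring.
import re
from collections import Counter

def keyword_density(text, keyword):
    """Return (frequency, density %) of a single-word keyword in the text."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0, 0.0
    count = Counter(words)[keyword.lower()]
    return count, 100.0 * count / len(words)

body = "Yahoo search results improve as the Yahoo crawler indexes more pages."
count, density = keyword_density(body, "yahoo")
# "yahoo" appears 2 times in 11 words, a density of roughly 18%
```

A check like this is mainly useful for spotting extremes: a density near zero suggests the page never mentions its target phrase, while an unnaturally high one reads as keyword stuffing.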

We have also seen some sporadic Google results popping up from time to time. This is very likely a temporary measure to ensure search-continuity, as Yahoo engineers continue to develop and continuously improve their search algorithm.

Recommendations to site owners
Remember that all search engine crawlers, including Slurp, are incapable of following hyperlinks embedded in a frame or in an image or graphic, and certainly cannot crawl a site that uses Flash as its only means of presenting information.

As I wrote in my article on the importance of usability two weeks ago, making certain that your site is search engine-friendly will usually help it rank higher in the search engines, including the new Yahoo.

If any of you missed that article, you can follow this link to read it: usability & SEO.

If you want your site to rank high in any search engine, and if your site includes frames, remember that the correct use of the tag is very important. For more on how to optimize a site using frames, just follow this link: optimizing framed sites

For sites that are database-driven, Yahoo Slurp will try its best to follow dynamic links, but as a precaution, Yahoo is advising site owners and webmasters to create static pages with text links pointing to any parts of their sites that contain dynamic content. For more on the correct way to optimize dynamic links, go to: optimize dynamic links.

Finally, just like Google, Yahoo’s algorithm treats the description meta tag as a critical signal. For maximum benefit, site owners and webmasters are encouraged to write keywords into the title tag, description tag and body copy of each individual page on their sites. As always, quality and relevancy yield the best dividends.

Stay on-topic for maximum efficiency in the results pages. Remember that Yahoo, like most of the major search engines, looks at the ‘theme’ of each individual page and uses its algorithm to determine that page’s topic. If the keywords in the title tag reflect the keywords used in the body copy, the page will usually earn a higher ranking than one whose main keywords or key phrases are placed inconsistently.
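The title-versus-body consistency described above can be approximated with a crude overlap check. The function, stopword list and sample pages below are purely illustrative – this is not Yahoo’s actual theming algorithm:

```python
# Illustrative sketch: how well do the title tag's keywords match the
# body copy? A stand-in for the "theme consistency" idea, not Yahoo's code.
import re

STOPWORDS = {"the", "a", "an", "and", "or", "of", "in", "for", "to", "on", "our"}

def words(text):
    return {w for w in re.findall(r"[a-z0-9]+", text.lower())
            if w not in STOPWORDS}

def title_body_overlap(title, body):
    """Fraction of meaningful title words that also appear in the body copy."""
    t = words(title)
    if not t:
        return 0.0
    return len(t & words(body)) / len(t)

on_topic = title_body_overlap(
    "Discount Hiking Boots", "Our discount hiking boots ship free worldwide.")
off_topic = title_body_overlap(
    "Discount Hiking Boots", "Read our latest company news and press releases.")
# on_topic -> 1.0 (every title word appears in the body), off_topic -> 0.0
```

A page scoring near 0.0 on a check like this is exactly the “inconsistent placement” case the article warns about: the title promises one topic and the copy delivers another.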

Conclusion
Yahoo, like any other search engine, is interested in one thing: quality, relevant results for its users. If your website delivers its information in a consistent and clean manner, it should do well in the search results pages. All the recommended SEO techniques and procedures taught over the past two to three years still hold, for Yahoo and the others alike.

Building your site in the most logical way, making it search engine-friendly and creating great content are three of the best ways to significantly improve your rankings. If you can add new content every day, your site will be perceived as well maintained. Search engines prefer sites that consistently feature fresh, new content over websites that don’t change much and are somewhat dated.

Here’s a tip of the hat to Yahoo for a job well done!

Author Name: Serge Thibodeau
Company: Rank For Sales
Email: info@rankforsales.com

19 Feb 2004

Yahoo Changes Search

Yahoo no longer using Google
I am glad I am not a betting man, or I am sure I would have big guys named Vinnie and Freddie knocking on my door looking for their money. I have been trying to guess when Yahoo! was going to change results providers virtually since the time they bought Inktomi over a year ago. I even made a couple predictions, but in the end I was wrong.

So, what is all the fuss about?
Well, for one, they didn’t use pure Inktomi results. In effect, they didn’t replace Google with Inktomi; they replaced Google with a Yahoo! brand of search. The Yahoo! search is new and uniquely different from everyone else’s.

So what does this mean to search marketers?

Well, while you used to have to pay to be listed in the Yahoo! directory, it doesn’t appear that you need to pay to get indexed by Inktomi. In fact, if you check your visitor logs, you’ve likely noticed a lot of activity from either Inktomi’s Slurp or Yahoo!’s Slurp spiders.

So how do you know if you’ve been picked up in Yahoo?

If you’d like to see how many pages are indexed, use the advanced search options (http://search.yahoo.com/web/advanced). Enter two double quotes (“”) in one of the “Show results with” boxes, then your domain name in the “only search in this domain/site” box, and you will see which of your pages are in the new Yahoo! index.

What does all this mean?
Well, it’s hard to say what the impact of Yahoo!’s organic product will be. We already know that Yahoo! can account for about 1/3 of most sites’ traffic. But how Yahoo! will handle crawling and indexing is another story. Since they just switched over, we will have to wait and see.

We do know that, at this time, there is no place to add a URL to their index, so we must assume that, at least for the time being, paid inclusion is required to be indexed in Yahoo! Alternatively, we may find that a submission to AltaVista or AllTheWeb helps a site get indexed. Although this is doubtful, it is not out of the realm of possibility.

One speculation is that Yahoo! will use a submission strategy similar to the one AltaVista has employed: you could submit for free, with no guarantee of if, or when, the site would get indexed, or you could pay for guaranteed inclusion. In that case, the submission fee would definitely be worth it, considering the roughly 30% reach that Yahoo! currently has.

So what does this mean for Google?
Aside from the obvious – that Google’s reach has been cut by about 1/3 – not much. Google is still striving to have the best search out there. Whether Yahoo! is a part of that or not shouldn’t really concern them. Sure, it is a hit to the bottom line, but Google is large enough that they will survive, at least until the rumored IPO.

Now what Google has to do is stay ahead of the pack. They have developed a reputation for the best search, and they have to maintain it. The fact that Yahoo! has entered the algorithmic search market only after 10 years on the web helps Google, in that Yahoo! has some catching up to do. But the fact that Yahoo! can afford to catch up could be a concern.

They also have to consider that Yahoo! is only the first of the “big players” to foray into the highly competitive algorithmic search market. MSN has yet to unveil its entry, which is also expected later this year.

But for now, Google only has to consider Yahoo!’s impact. While it will be significant in terms of the search volume lost, it is too early to determine what the true impact will be. Launching algorithmic search is one thing; convincing people to use it is another.

Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists

17 Feb 2004

Lycos Leaving Search

Lycos Leaving Search
Few internet properties were making news back in 1998, and Lycos was one of them. Once a powerhouse in internet search (and one of the hottest properties on the web), Lycos was bought by the multinational Terra Networks to become TerraLycos.

Since that day in October 2002, Lycos seems to have lost focus on what it was – a really good search engine. Instead, it moved into many other internet markets and expanded even further globally. In fact, even today Lycos is more popular in some countries than any other search engine or portal.

If you were to review their press releases from 2000 and 2001, you would see that they were acquiring properties back then that are considered hot now: job-finding sites, travel sites, music services and more.

While other players like Yahoo and MSN were coasting (and Google was sneaking up from behind) Lycos was building its world internet presence.

Lycos Lost Its Appeal
But something happened. Lycos lost its appeal to the majority of internet users. While they were still hugely popular in Europe and other places around the world, in North America they were all but forgotten.

Around the time of the tech stock crash, Lycos faded, much like many other big internet names, and never recovered in the US.

More recently, they have cut staff drastically and have put up “for lease” signs in their California offices. It was only a matter of time before Lycos search disappeared.

Today Lycos announced that it is getting out of search. Instead, they are going to focus on the latest internet craze – social networking. Social networking is the new gathering place – a place to go hang out and meet new people, or maybe even meet that special someone.

Many other players are getting into social networking – such as Google – which led to rumors that Google will incorporate social networking into search.

But for Lycos, this could be their last attempt at some kind of web supremacy. They realized that they could not compete in search. In fact, they were so far behind that it would have taken years just to catch up to where search is today. So they chose a new route.

Hopefully, for the company’s sake, this new path turns out to be the right choice. They are getting into social networking in its infancy. As long as they can capitalize on its current popularity and make it work, they may stick around for a few more years.

Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists

Introduction
By now, if you market on the web, you know all about the Florida update, and have likely heard of the Austin update.

But in case you haven’t – a little reminder: Back in November 2003 we began to notice a fairly major change occurring in the Google index. In fact, if we go back to May 2003 we can discern hints of the changes (targeting affiliate sites and so forth). Back then no one knew what was coming our way.

Recently, another fairly major update, named Austin, occurred and even more sites were removed from the index. So many people are asking why. I thought I would throw in my two cents.

Florida was a major change in thinking for Google. They are attempting to return the engine to supremacy. In doing so, however, they harmed a lot of sites that were playing fairly and by the rules. This collateral damage caused many to surmise that Google was broken once again and wouldn’t be fixed.

We developed our own theory
We developed our own theory fairly early on with regard to Florida – that Google had implemented a context-based algorithm. Indications are still there that this is the case. We predicted then that results would improve, and this seems to be happening.

You see, the next thing Google had to do, once they had filtered out some sites, was recalculate back links and PageRank. This happened in December. We noticed some PageRank fluctuations on our own site during this time; it has since leveled out.

So once Florida happened, Google had to recalculate PageRank to adjust for those sites which were filtered. Since the site was removed – its influence on back links was also removed.

Google Austin Update
Now January comes – the Austin update. And what does Google do? They perform a refinement of the same algorithm that caused the Florida update. Nothing major – more of a tweak to readmit to the SERPs some of the innocent sites that were previously dropped. These are the results we are seeing now – some semblance of where Google will be in the next couple of months.

I don’t see this as the end though. Likely, in February another back link check will occur and those sites which were removed in January will lose their link influence in February. Then in March we should see a more stable index. True to Google’s nature – it will have been 90 days since the Florida update.

I bring up this figure because traditionally after a major Google change, it takes about 90 days for the Google index to stabilize and start returning more relevant results.

Be patient
Therefore, if you run a small site whose results have just now started to come back, try to be just a little more patient. My feeling is that over the next couple of months you should recover more of what was lost in the last two – unless yours was a targeted site, such as an affiliate site or one that didn’t offer any useful content or information.

If your site is well constructed, easily spiderable and provides lots of useful information that is not too promotional, you should start seeing your results rise. You may want to increase the amount of content you have, but be sure that it is informational in nature. Take a look at the average site ranking for your key phrases – how many pages do those sites have? You will probably want to fall somewhere in that range. Also, while you are analyzing your competitors, take a look at what kind of back links they have. Perhaps you can get links from those sites as well, as long as they are related to your industry. Remember that if your PageRank and site size are comparable to those currently ranking, you should also be considered an authority on the topic.

Summary
So as we begin February, remember that the Google ride isn’t over. We have just entered the phase where we are coming down the last hill near the end of the roller coaster ride. The end is near, but it’s not here yet.

Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists

Introduction
This article takes an in-depth look at some of the innovations introduced at ExactSeek.com, including, most recently, the implementation of a unique ranking algorithm that factors in Alexa traffic data.

Search engines basically cater to two audiences – webmasters and web searchers. Although there is an obvious overlap between the two groups, the latter dominates in numbers and most search engines bend over backwards to keep web searchers happy by continually tweaking search result algorithms. Webmasters, on the other hand, are more or less left to fend for themselves.

ExactSeek was founded two years ago on the premise that webmasters/site owners are the driving force on the Web – not in numbers, but in terms of ideas, development, enthusiasm and innovation. The goal for ExactSeek, then and now, was to build a major engine that catered to this audience and to web searchers more or less equally. Part 1 of this article deals with ExactSeek’s focus on webmasters.

Free Site Listings
ExactSeek’s belief in the importance of the webmaster audience to search engine growth and traffic resulted in the rapid introduction of three types of free site listings, each geared to helping webmasters give their websites maximum exposure within the search engine.

Here’s a short explanation of each:

Standard listings: As the name implies, this is just a basic website listing in the ExactSeek database, consisting of a Site Name (taken from the Title tag), a Description (taken from the Meta Description tag) and a Site Link (the website URL).

These basic listings show up numbered 1 to 10, etc. in ExactSeek’s search result pages. Any webmaster can obtain a Standard Listing simply by filling out the form at: http://www.exactseek.com/add.html

Enhanced listings: Not strictly a website listing – really more of an enhancement to the standard listing. In brief, an enhanced listing is a small icon next to a website’s standard listing which, when clicked, opens a 150×150 javascript window with additional information about that website. The cool thing about the enhanced listing is that webmasters can add up to 800 characters of text or HTML code to the window’s content, allowing them to:

– Embed audio, video or Flash.
– Include logo, product or service images.
– Include links to secondary site pages.
– Use font enhancements (size, color, style).
– Include more detailed information about their websites.

Simple examples can be seen at:
http://www.exactseek.com/enhanced.html

Priority listings: Introduced for those webmasters who wanted something in return for adding an ExactSeek search form to their websites. These listings appear in the first 10 to 100 ExactSeek search results, highlighted against a light blue background. Visit the link below to see an actual example:
http://exactseek.com/cgi-bin/search.cgi?term=perl

Obtaining a priority listing is a simple 3 step process outlined at:
http://www.exactseek.com/add_priority.html

The three types of listings noted above offer any halfway knowledgeable webmaster a means to maximize site exposure with minimal effort. Best of all, any site meeting ExactSeek’s submission guidelines is added within 24 hours – no pointless weeks or months of waiting. Within 24 hours, websites are either indexed or not. If not, the most common reasons are:

1. The site submitted lacked a Title tag. The ExactSeek crawler ignores sites without Title tags.

2. The site submitted was pornographic or contained illegal content.

3. The site submitted was dynamically generated and its URL contained non-standard characters such as question marks (?), ampersands (&), equal signs (=), percent symbols (%) and/or plus signs (+).

4. The submitting webmaster failed to respond to the submission verification email message sent out by ExactSeek. Or, as is becoming more and more common, the webmaster failed to receive the message due to sp@m and ISP filtering and, thus, could not confirm submission. Webmasters using AOL, Hotmail and Yahoo email addresses may soon find it impossible to have their websites added to any search engine using a verification system.
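Reason 3 above is easy to check before you submit. The small sketch below is my own rough heuristic, not ExactSeek’s actual validation code – it simply flags the characters their guidelines list:

```python
# Rough pre-submission check for the "non-standard characters" rule
# (reason 3 above). Illustrative only; not ExactSeek's actual validator.
SUSPECT = set("?&=%+")

def is_crawler_friendly(url):
    """True if the URL contains none of the characters that dynamically
    generated URLs typically carry and that crawlers of this era may skip."""
    return not (SUSPECT & set(url))

print(is_crawler_friendly("http://example.com/widgets/blue.html"))   # True
print(is_crawler_friendly("http://example.com/cgi-bin/show?cat=3"))  # False
```

A URL that fails a check like this is a candidate for the static text-link pages that engines of this period recommended as entry points into dynamic content.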

The Do-It-Yourself Approach
One of the most irritating things about many search engines is that it can take weeks or even months for free website submissions to be indexed. And once sites have been added, it can take weeks for changes to content and/or tags to be re-indexed and for webmasters to see if those changes had a positive effect on site ranking.

ExactSeek opted to put a measure of control back in webmaster hands by introducing a do-it-yourself approach. Early on, two simple online tools were made available which allowed webmasters to quickly check the positioning of any website in the ExactSeek database for any keyword(s) relevant to that site and then, if necessary, do something about it.

(a.) Site Ranking Tool
Tracks the top 10,000 site rankings for all keywords in the ExactSeek database, allowing a webmaster to find his site ranking for any keyword(s) relevant to his website.

(b.) Web Crawler Tool
Allows webmasters to schedule recrawls of their websites as often as once per week. Self-scheduled recrawls and instant site ranking checks provide webmasters with a quick read on which optimization strategies work and which don’t.

Both of the above tools can be found with additional explanation on the following page:

http://www.exactseek.com/srank.html

Do-It-Yourself – The Next Step
More recently, ExactSeek implemented a simple free membership system that takes the do-it-yourself approach one step further. Any webmaster who has successfully submitted a website to ExactSeek is automatically a member and can obtain a Member ID and Password for login purposes by using the appropriate form at the ExactSeek Member Login page.

After logging in, webmasters can access a Member Account Manager which allows them to edit, enhance, delete and/or recrawl their site listings as often as they like. Revolutionary? Maybe not, but a big step away from total reliance on search engine scheduling.

Part 2 of this article will look at how traffic data from Alexa Internet was incorporated into the new ExactSeek ranking algorithm, the algorithm’s impact on delivering quality search results and at how ExactSeek’s new paid inclusion program stacks up against other paid inclusion and PPC programs.

Author Bio:
Mel Strocen (c) 2003. Mel Strocen is CEO of the Jayde Online Network of websites. The Jayde network currently consists of 12 websites, including ExactSeek.com (http://www.exactseek.com) and SiteProNews.com.