Web Moves Blog

Web Moves News and Information

Archive for the 'Search Industry News' Category

Teoma Search Engine
One search engine that really caught my attention at the Search Engine Strategies conference in Chicago was Teoma. Teoma is the search engine that powers Ask Jeeves, also known as the Ask.com search site. Teoma has made several improvements to its engine over the past year or so that have drastically increased its index and the quality of the results it generates. It has a unique method of ranking sites that it likes to call “authority”. In addition, Teoma offers a toolbar and many of the advanced search features that other search engines have.

In the upcoming pages I will cover the following:

– Teoma’s History and Background
– Teoma’s Search Technology
– Teoma’s Features and Advanced Search Functions

Teoma’s History and Background
Teoma’s founder and vice president of research and development is Apostolos Gerasoulis, Ph.D., a professor of computer science at Rutgers University. Professor Gerasoulis, along with his team of computer scientists, founded Teoma in April of 2000. Their single goal was to build a search engine that looked “at the Web in terms of subject-specific communities.” Teoma had the advantage of joining the search engine wars late in the game. This allowed them to look at what the existing engines did right and, more importantly, what they did wrong. Teoma.com was not officially launched until a year later, in April 2001.

On September 18, 2001, Ask Jeeves, Inc. acquired Teoma, paying over 1.5 million dollars. On January 9, 2002, Ask Jeeves announced the successful integration of the Teoma search technology into Ask Jeeves. After the integration, Ask Jeeves reported a 25% increase in user satisfaction and a 15% decrease in site abandonment. In 2002, Nielsen//NetRatings reported that Teoma had grown 175%, making Teoma the third most popular search engine in the United States.

Teoma 2.0 was released on January 21, 2003. The new version boasts improvements to overall search result relevancy, additions to search tools and more advanced search functions. Teoma currently powers Ask Jeeves primary organic results and receives its paid search results from Google AdWords.

Teoma’s Search Technology
Teoma adds a new layer of “authority” to search results through something they call “Subject-Specific Popularity.” Google’s PageRank, simply explained, ranks pages based on the quality and the number of inbound links to a site. Teoma ranks sites based on related communities of sites that are “organically organized” and link to each other. It then determines which sites are most relevant based on an authority factor, and that is where Subject-Specific Popularity comes into play. Subject-Specific Popularity determines the authority of a site based on the number of pages on the same subject that link to it. Teoma provides a nice analogy of why this is important. They write, “picture yourself in your garage, in front of the opened hood of your severely out-of-commission pick-up truck. You need help with this major repair, and you can either ask your uncle, who owns two cars but has never held a wrench in his life and happens to be visiting (similar to using other leading search technologies) or you could phone your best friend, who has a degree in applied mechanics and builds automobiles from the ground up in his spare time (similar to Subject-Specific Popularity). The choice is quite clear.”
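Teoma has not published the details of its algorithm, so the sketch below is only a toy illustration of the general idea: raw link popularity credits every inbound link, while a subject-specific count only credits links coming from pages on the same topic. The pages, topics and link graph here are invented for the example.

from collections import defaultdict

# (source_page, target_page) link graph and a topic label per page -- all made up.
links = [
    ("truck-repair-blog", "engine-guide"),
    ("mechanics-forum", "engine-guide"),
    ("celebrity-gossip", "engine-guide"),   # off-topic link
    ("truck-repair-blog", "gossip-column"),
]
topics = {
    "truck-repair-blog": "auto-repair",
    "mechanics-forum": "auto-repair",
    "celebrity-gossip": "entertainment",
    "engine-guide": "auto-repair",
    "gossip-column": "entertainment",
}

raw_popularity = defaultdict(int)       # counts every inbound link
subject_popularity = defaultdict(int)   # counts only same-topic inbound links

for source, target in links:
    raw_popularity[target] += 1
    if topics[source] == topics[target]:
        subject_popularity[target] += 1

print(raw_popularity["engine-guide"])      # 3 links from anywhere
print(subject_popularity["engine-guide"])  # 2 links from the auto-repair community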

When Teoma 2.0 was released it provided improved relevancy, more accurate communities, spell checking, “Dynamic Descriptions”, more advanced search tools and an expanded index. Ask Jeeves reports an increased “user pick-rate” of 22% and a site abandonment decrease of 28% since the upgrade. In addition, Teoma received a relevancy grade of “A” from Search Engine Watch, adding it to the elite group of search engines that includes Google, Yahoo and MSN. By improving its analysis of “Communities,” Teoma was able to increase the relevancy of pages by better evaluating authoritative pages. In addition, the “refine” search option found on Teoma.com enables searchers to easily narrow down their search results. Many search engines have Web-based spell checking; Teoma added this in its 2.0 version. Teoma 2.0 added other enhancements and features as well, and increased its index by over 500 million URLs.

Teoma’s Features and Advanced Search Functions
Teoma provides numerous methods to both refine your search and locate subject-specific resources. Conduct a search on search technology and you will find on the right side a “Refine” option that presents you with useful “refinements” to your search query. In addition, you will find a collection of “Resources” directly below the refinements option that allows you to easily locate “expert” sites on search technology. You will also see that Search Engine Watch is one of those results. At the top you will see the sponsored listings, provided by Google, with the Web page results directly below. Google is the first result for that keyword, and we all know why.

In addition, Ask.com allows you to search for anything and attempts to provide exactly what you are looking for without the use of any “tabs”. For example, I did a search on picture of chair at Ask Jeeves and guess what? It gave me actual pictures of chairs (this chair is a nice one). Ask Jeeves calls this technology Natural Language Processing (NLP). For a complete listing of technologies and features provided by Ask Jeeves please visit: http://sp.ask.com/docs/about/tech.html. For advanced search tips from Teoma, please visit: http://sp.teoma.com/docs/teoma/about/advsearchtips.html.

Teoma has come a long way since April 2000, achieving its rank as the number three search technology. Its unique method of providing relevant and accurate results makes Teoma and its partners stand apart from the other search engine technologies. As its features and index improve, Teoma will continue to succeed by adding partners and building a more satisfied user base. It will be interesting to follow Teoma and Ask Jeeves over the coming year and see how they compete with the other major search engines.

Sources: Teoma.com, Ask.com, and About.com

Author Bio:
Barry Schwartz is the President of RustyBrick, Inc., a Web services firm specializing in customized online technology that helps companies decrease costs and increase sales. Barry is a leading expert in the search engine optimization community. He has written and contributed many articles to the SEO community, including publications through SEMPO (Search Engine Marketing Professionals Organization). Barry also gives regular seminars covering the complete spectrum of search engine marketing technology and methods.

Search Engine Strategies Conference 2003
Search Engine Strategies Conference and Expo is an event that brings together the most talented and famous Search Engine Marketing professionals in the world. This event is held several times a year all over the world, and is hosted by JupiterMedia and SearchEngineWatch.com. Danny Sullivan is the name behind the show, and people come to this event for many reasons: to learn about search engine optimization and marketing, to hear about advancements in the search engine industry, to meet prospects and attract new business, and simply to meet people in the industry face to face and have a good time.

I was privileged to be able to attend the full three-day conference in Chicago that took place from December 9, 2003 through December 11, 2003. I have compiled a detailed review of each day of the Search Engine Strategies (SES) conference at The Search Engine Roundtable Weblog. Overall, I recommend the conference to all those in the Web design, Internet marketing, advertising, search engine optimization and search engine marketing fields.

In the upcoming pages I will cover the following:

– Overview of the Search Engine Strategies Conference
– Topics at the Search Engine Strategies Conference: What You Can Expect to Learn
– Advances in the Search Engine Industry and the RustyBrick Perspective
– Networking Opportunities and Exhibitors at Chicago
– The Search Engine Elite and Sometimes Wacky
– Search Engine Strategies Chicago Conference Wrap-Up

Overview of the Search Engine Strategies Conference
The Search Engine Strategies conference first took place on November 18, 1999 in San Francisco, California. Danny Sullivan told me “To my knowledge, it was the first search engine conference of this type.” The first show attracted about 250 attendees and 9 exhibitors. The whole seminar took place over one day between 8am and 6pm. Some of the past speakers still presenting today include Danny Sullivan, Shari Thurow, and Dana Todd. Mr. Sullivan said that this event was “great because I [Danny Sullivan] felt both ‘sides’ saw each other much less as enemies but instead as real people.” Danny went on to explain that at this time, except for Overture, there was no one really in the paid listings market. Now, both the organic and paid people were in one room together and communicated face to face. Check out the first SES agenda at http://www.jupiterevents.com/sew/sf99/sew-agenda.html.

The following year JupiterMedia hosted 4 SES events, in New York, London, San Francisco and Dallas. The shows drew between 300 and 600 attendees each. New York and San Francisco were the larger events. Check out the SES 2000 archives at http://www.jupiterevents.com/seminar-archive2000.html.

The 2001 season for the SES conferences was a little different. They moved the New York show to Boston and added Denmark to the list of locations. The shows on average attracted between 300 and 700 attendees, and exhibitors ranged from 5 to 25 depending on the location. For the 2001 archives visit http://www.jupiterevents.com/seminar-archive2001.html.

In 2002 SES added more shows, making the total number of shows 7 for that year. They added Australia and Singapore to the current list of places to hold the event. Denmark moved to Germany and San Francisco moved to San Jose, California. They had over 30 exhibitors at Boston and San Jose and close to 900 attendees. Check the 2002 season out at http://www.jupiterevents.com/seminar-archive.html.

In 2003 they ran 6 shows and moved Dallas to Chicago – the conference this article will cover. The Boston show was the first to ever break 1,000 attendees, and San Jose had over 1,500 attendees and over 45 exhibitors.

In 2004 they will be moving the Boston show back to New York, and they are expecting to break the 2,000-attendee barrier. Danny Sullivan told me “In the US, we’re also now at a four day, three or four-track format.” He said, “There’s that much content to cover.” As you can imagine this short but exciting history has been filled with success and opportunity for all those who are involved.

The Chicago 2003 conference was held at the McCormick Place Convention Center, and both the convention center and the attached Hyatt hotel are very nice. The Hyatt McCormick Place Hotel was very well equipped for technical people, with high-speed Internet connections in each room and wireless access in the restaurant. The wireless did not work on my Apple PowerBook, but I think that was due to the Web site where you pay for access and not the actual network itself. Many people complained about the “hike” from the hotel to the convention center, but I did not find it to be bad at all. The walk was about five minutes and was all indoors over a nice indoor bridge. The first day I had to walk back and forth several times to see if my room was available. Needless to say, the Hyatt was sold out the first night. Why Chicago was selected as the location for a December conference, I do not know. It was cold, rainy and windy during my stay, but I rarely left the hotel. Overall, I am satisfied with both the hotel and the convention center.

Frank Fazio from JupiterMedia reported to me at the conference that 1,200 people had signed up: 900 of those were conference attendees, meaning JupiterMedia received payment for them to attend the conference sessions, and 300 were exhibitor-only attendees. The numbers exceeded the projections, and everyone from JupiterMedia was extremely delighted.

Topics at the Search Engine Strategies Conference: What You Can Expect to Learn
The Search Engine Strategies Conference provided a tremendous amount of information to attendees. If you are new to the search engine optimization (SEO) and search engine marketing (SEM) field, you will learn a great deal in just three days. If you are an intermediate SEO/SEM, you will still learn a lot, and at the same time reinforce your current skills. But if you are an advanced SEO/SEM you will gain in other ways. The conference has something to offer to everyone in the Internet field. The next paragraphs will review some of the tracks I attended at this conference. For a more detailed and less organized review of the conference please visit my review of Day One, Day Two and Day Three. For a complete conference itinerary please visit http://www.jupiterevents.com/sew/fall03/glance.html. The remainder of this section covers a selection of sessions I attended during the three-day conference. I will not go into the session materials in detail; I will only give an overview of each track and mention anything discussed that stood out.

Shari Thurow, a name synonymous with search engine optimization, was the single speaker for the Search Engine Friendly Design track. This track focused on how to design a site’s code, layout and navigation in a fashion that is good for both your Web visitors and search engine spiders. The session’s outline included the definition of a search engine friendly web site design, the three search engine essentials and design considerations. The search engine essentials include something Shari calls the text component, the link component and the popularity component. This track is a must-see for all ‘newbies’ to the field. Shari does an outstanding job of explaining the basics to designers and programmers of any level. Shari Thurow wrote a very well organized book that goes into more detail than what she discusses in the track. I recommend that you read the book before going to the track. The book, Search Engine Visibility (ISBN: 0735712565), comes in both print and electronic versions and can be purchased at http://www.searchenginesbook.com/ or through your favorite online bookstore.

Andy Beal of KeywordRanking.com, Chris Copeland of Outrider Search Marketing and Dan Theis of SEO Research Lab presented the Search Term Research track. This track covered one of the most important areas of search engine marketing and optimization: how best to determine which keywords one should purchase and/or optimize a site for. Andy Beal discussed the importance of keyword research, how to actually select the keywords, the keyword research process, and the available tools to conduct keyword research, and gave very nice examples throughout the presentation. Andy also discussed some of the faults of the keyword research tools and some overall strategic decisions that need to be made during this process. Chris Copeland focused more on what needs to be done after your keyword research. Do you use organic optimization or paid advertisements? How do click-through rates differ when you tailor your description or use your corporate name in the anchor text? Dan Theis took the presentation one step further by discussing the false self-satisfaction of ranking well for a specific keyword but (1) not getting click-throughs on that keyword or (2) not converting sales on that keyword. He also discussed other common mistakes of search term researchers and how to avoid them.

Heather Lloyd-Martin of SuccessWorks and Jill Whalen of HighRankings.com presented the Writing for Search Engines track. Balancing one’s keyword density and readability within the content of your pages can be a hard task. Heather and Jill both explained how you can achieve a nice balance of both and rank well. Both explained that you should not fixate on percentages such as keyword density (keyword occurrences divided by total words). You should first focus on making the page easy to read for the user and then look for ways to work your keyword phrases into the content. Both gave examples of how you can add these keyword phrases to your page copy. Heather was much more fluid and perky on the stage than Jill. Both had very informative presentations, but Heather claimed the prize for her speech delivery.
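For readers who want to check their own copy, a minimal sketch of that keyword-density percentage (occurrences of the phrase divided by total words) might look like this. The sample copy and phrase are made up, and keep in mind the speakers’ point was precisely not to chase this number.

def keyword_density(text, phrase):
    """Occurrences of the phrase divided by total words, as a percentage."""
    words = text.lower().split()
    target = phrase.lower().split()
    hits = sum(
        words[i:i + len(target)] == target
        for i in range(len(words) - len(target) + 1)
    )
    return 100.0 * hits / len(words) if words else 0.0

copy = "Our search engine marketing guide explains search engine basics for beginners."
print(round(keyword_density(copy, "search engine"), 1))  # 2 hits in 11 words -> 18.2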

The final session for day one was the evening forum with Danny Sullivan. This session was an open-ended discussion where you could bring up any search engine related topic and Danny would talk about it. The first topic of discussion was obviously the Google Florida Update, and Danny discussed it for a few minutes. He then began talking about the industry itself, where it was in the past and where it will be in the future. This was a very interesting and enlightening session, and it gave the attendees the opportunity to be proactive in the conversation.

The morning keynote address for day two was given by Danny Sullivan. He discussed the Google Florida Update and the search engine industry revolution. Danny discussed his theory of “Invisible Tabs” and showed real-life examples of how the search engines today are moving in that direction. It was basically a more organized presentation of what was discussed the night before at the Evening Forum with Danny Sullivan. The session was extremely interesting and thought provoking. For any of you expert SEOs, I recommend attending the keynote address at the next conference.

The Optimizing Flash and Non-HTML Content track was designed to teach us how to take Flash or other non-HTML content and optimize it for search engines. The speakers for this track included Gregory Markel from Infuse Creative, Shari Thurow from GrantasticDesigns and Karen Howe from AOL Audio Video. Gregory’s focus was on Flash content and how you cannot optimize it for search engines. The point of this track was to discuss how you should optimize Flash documents or implement “workarounds”. Gregory gave ideas but quickly dismissed each one, saying that he tried it and it didn’t work. Shari gave a nice presentation as she has in the past; Shari’s focus was on optimizing PDF documents. She eloquently gave examples of how you should optimize your page’s content and the actual PDF document to rank well in search engines. Again, her book covers most of this discussion. Karen presented the most interesting component of this track, optimizing rich multimedia content. AOL recently purchased Karen’s company, a search service that is designed specifically for rich media content. She explained the advances of this technology and how rich media designers often leave out the Meta data that is so crucial in determining the relevancy of the content. Overall, this track was disappointing – maybe the title should be changed to “Workarounds For Flash and Non-HTML Content”. There was no discussion about how to use the “noscript” or “noframes” tag. When the question was brought up during the Q&A session, Shari said that you should not use the tag – make an HTML-equivalent site. The session was about optimizing for non-HTML content, not about realizing that you cannot optimize for non-HTML content.

One of the best tracks at the conference had to be “Link Building,” presented by Paul Gardi from Teoma/Ask Jeeves, Mike Grehan from iProspect, Eric Ward from EricWard.com and Marissa Mayer from Google. Paul Gardi explained how link popularity works on Teoma, which happens to be very different from how PageRank works on Google. Mike Grehan then gave an enlightening overview of the science behind Teoma’s “link equity” structure. Eric Ward gave a nice presentation on how to get good quality links and then discussed some philosophical points on PageRank that were very interesting. Marissa Mayer discussed how Google’s PageRank works with a nice one-liner, “links are proxies for human judgment of page value”. She explained that PageRank is not the only component of how Google ranks a page and should not be the single most sought-after goal for a search engine optimizer. Make sure to catch this session at the next conference; it’s a must-see.

One of the new tracks, Getting Local, covers one of the hottest topics today in the search engine industry. The speakers for this track included Dick Larkin from TransWestern Publishing, Cheryle Pingel from Range Online Media, Stacy Williams from Prominent Placement, Richard Holden from Google and John Ellis from Overture. Getting Local was more focused on the paid side of search engine marketing than the organic side. The Google and Overture representatives both gave demonstrations of how their pay-per-click models work with the local component. Very impressive stuff, and it is a safe bet that the technology to target local customers will continue to grow and be enhanced. The other two speakers discussed how they pick keywords to target local traffic to their sites. Overall this presentation went well, and you should expect more information at the next conference on Getting Local.

Meet the Crawlers has always been and will always be one of the most attractive sessions at the conference for a search engine optimizer. Where else can you get representatives from the top search engines all in a single room? The representatives for the search engines included Jon Glick from Yahoo! Search, Steve Gemignani from Looksmart, Craig Nevill-Manning from Google and Paul Gardi from Ask Jeeves. Each speaker went over some of the new and exciting features that were added to their engines. Yahoo! showed off its new SmartSort feature in the Yahoo! Shopping portal. Yahoo! also pointed out that Inktomi inclusion could possibly also get you listed in the Yahoo! Shopping portal. Looksmart is working on an interesting new method to crawl updated pages or fresh content. Instead of crawling pages at set intervals like many crawlers do, it will look at an individual page and determine, based on how often that individual page is updated, how often to send its robot back to that page. Google now lets you type in UPS, FedEx, airline reservation, patent and other numbers and will automatically give you the information you seek. Paul Gardi again discussed Teoma’s unique method of determining link popularity, but he also showed how Danny Sullivan’s invisible tabs theory is making its way into Ask Jeeves.

Overall you can expect to learn a great deal from the Search Engine Strategies conference. There tend to be a lot more sessions for novice SEOs/SEMs, but there is something for all levels at this conference. Most of the basic information you can easily learn by reading Shari Thurow’s Search Engine Visibility book, but the conference does give you the ability to hear from the best. For a more detailed review of the conference, please visit the Search Engine Roundtable Weblog and the posts for Day One, Day Two and Day Three.

Advances in the Search Engine Industry and the RustyBrick Perspective
The search engine industry is a fast-moving and dynamic market. Search engines are constantly updating and upgrading their algorithms and indexes to achieve a competitive edge. Search engine marketers spend several hours each day trying to stay up-to-date with these changes in order to provide their clients with better results. Most of these dynamic changes remain transparent to the average Web searcher, which happens to be part of the goal.

Most of the people reading this article are well aware of the Google Updates, also known as the Google Dance. The Google Dance is a time period during which Google updates its index across its datacenters. Normally you will see a change in the search engine results page, where pages are added, removed, bumped up and bumped down. All search engines have an update process of this kind, during which the results change periodically. Each update is the search engine’s attempt to provide better quality results for the searcher. Better quality results for the searcher translate into a larger searcher base. Marketers look to advertise on search engines that have a large user base.

As covered by Danny Sullivan, search engines are trying to provide the most relevant results possible without the need for the searcher to specify exactly what he or she is looking for. For example, if you want to do a search for a picture of a map, the search engine should know to return pictures when you type in “map”. Currently, if you conduct a search in Google on “map” it returns MapQuest as the first result. The user then has to know enough to click on the “images” tab to specify that he or she wants map images. Google does tell you to try an image search if you type in “map picture”, but Ask Jeeves actually gives you pictures as part of your results. And Teoma is one of the search technologies that is moving toward that area of search. Anticipating what the searcher really wants and making that transparent to the end user will prove to be the killer search engine application.

Another area where you can expect advances is local search. When searching for a locally specific keyword phrase, the search engines will try to provide the best possible matching results. They do this through a combination of teaming up with the yellow page directories and using some sort of geo-filtering technology. Yahoo might come the closest to providing those results, but then again, you need a tab to get them in the format you are looking for.
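None of the engines has published how its geo-filtering works, so the following is only a toy sketch of the general idea: take directory-style listings (the kind a yellow pages partner might supply) and keep the ones whose location matches the place implied by the query. All names and data are invented for illustration.

# Hypothetical yellow-pages-style listings; a real feed would be far richer.
listings = [
    {"name": "Smith Plumbing", "city": "Chicago", "category": "plumber"},
    {"name": "Lakeside Plumbing", "city": "Milwaukee", "category": "plumber"},
    {"name": "Windy City Pizza", "city": "Chicago", "category": "restaurant"},
]

def local_results(category, city):
    """Geo-filter: keep listings in the requested category and city."""
    return [
        listing for listing in listings
        if listing["category"] == category
        and listing["city"].lower() == city.lower()
    ]

print(local_results("plumber", "chicago"))  # only the Chicago plumber survives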

With Microsoft racing to compete in the search engine market, and with all the consolidation among Yahoo, Overture, Inktomi, and AltaVista, we can expect a lot of excitement in the upcoming year. Will MSN compete? The common analogy is the Netscape and Internet Explorer history. However, others still feel MSN is too far behind and that it is unlikely they can catch up. Others believe that since Microsoft has such control over the PC market, if they build search into the operating system they can easily win over the market. Danny Sullivan argues against that by pointing out that search is already built into the Windows operating system, so why aren’t people using it? I feel that Bill Gates will focus on search and promote it so heavily in the operating system that people will be locked into using MSN.

Networking Opportunities and Exhibitors at Chicago
As with many conferences, some people attend just to add new prospects to their lead lists and hopefully close new sales. I personally met two people who came up to me out of the blue and told me straight out that they were there simply to drive some business. Most people attending the conference, though, are there to learn about the industry and the technology supporting it.

This section will not get into why people network. That is more for a business-oriented article. However, it will discuss whom you can expect to meet. The types of people who attend this conference include search engine marketers, search engine optimizers, Web designers, Web developers, marketing departments, Webmasters and even top-level management. If you have something to offer any of the types of people above, then you have a reason to network. I shared a cab ride to the airport with an individual looking to invest in a company that is working on some type of specialized search service. This conference provided him with the contacts to reach out to and ask these confidential and important questions.

The exhibitors at the conference included search engines, optimization and marketing firms, Web analytics companies, specialized search engine tool vendors, yellow page companies and affiliate companies. Exhibitors had their own goals for being at the conference. I personally did not spend more than five minutes in the exhibitor hall, but from what I saw, the hall was crowded with potential customers.

The Search Engine Elite and Sometimes Wacky
The Search Engine Strategies conference is the one conference that brings together almost all of the search engine elite into one event. Experts from around the world, from popular search engines and from next door gather at this conference to share their knowledge with others. They also come to hang out and relax with their colleagues in person, with a beer in hand. Many have the opportunity to meet their favorite forum poster, article author, blog poster, or search engine representative face-to-face for the first time.

Some of the most popular names at the Chicago conference include (in alphabetical order): Bruce Clay, Scottie Claiborne, Barbara Coll, Mike Grehan, Detlev Johnson, Heather Lloyd-Martin, Chris Sherman, Danny Sullivan, Shari Thurow, Eric Ward, and Jill Whalen. You also get to meet representatives from Google, Yahoo, Teoma, Ask Jeeves, Marketleap, Microsoft, Looksmart, About.com, Did-It, JimWorld, Commission Junction, Lycos, SearchEngineWatch.com, and Position Tech.

HighRankings.com is hosting pictures documenting some of the more relaxed and even wacky moments from the search engine elites. Visit Scottie Claiborne’s gallery at http://www.highrankings.com/scottie/chicago-ses/ and Mike Grehan’s gallery at http://www.highrankings.com/scottie/chicago-mike/. Please notify me if you know of other pictures of this event. I would like to apologize in advance if I missed any names in the list above.

Search Engine Strategies Chicago Conference Wrap-Up
Search Engine Strategies Conference and Expo is an event that brings together the most talented and famous Search Engine Marketing professionals in the world. This event is held several times a year all over the world and is hosted by JupiterMedia and SearchEngineWatch.com. There are many reasons to come to the show: to learn about search engine optimization and marketing, to hear about advances in the search engine industry, to meet prospects and attract new business, and simply to meet people in the industry face to face and have a good time.

I was privileged to be able to attend the full three-day conference in Chicago that took place from December 9 through December 11, 2003. I have compiled a detailed review of each day of the Search Engine Strategies (SES) conference at The Search Engine Roundtable Weblog. Overall, I recommend the conference to all those in the Web design, Internet marketing, search engine optimization and search engine marketing fields. I am looking forward to the New York show and hope to see you all there.

Author Bio:
Barry Schwartz is the President of RustyBrick, Inc., a Web services firm specializing in customized online technology that helps companies decrease costs and increase sales. Barry is a leading expert in the search engine optimization community. He has written and contributed many articles to the SEO community, including publications through SEMPO (Search Engine Marketing Professionals Organization). Barry also gives regular seminars covering the complete spectrum of search engine marketing technology and methods.

Kanoodle Ramping Up To Compete
Kanoodle has announced that they plan to introduce a service similar to Google’s in which advertisers can pay to have ads placed on category pages related to the products they sell.

In an effort to compete with Google and Overture’s targeted sponsored listings, Kanoodle has hired former employees of Sprinks (now owned by Google) to help build the new product.

As mentioned above, the new advertising type will be based on pre-defined categories (similar to Google’s targeted AdWords). The difference is that Google uses technology it has purchased (Applied Semantics) to help determine which ads should be placed on which pages, while Kanoodle’s listings will be based on the categories themselves. These listings will be displayed on pages that relate to the categories, so advertisers will be able to choose which categories their ads fall into.
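To make the contrast concrete, here is a toy sketch (not either company’s actual system) of the two targeting styles: category-based placement looks up ads by a pre-assigned page category, while semantic placement tries to infer the topic from the page text itself. All categories, ads and text are invented.

# Hypothetical advertiser inventory, keyed by pre-defined category.
ads_by_category = {
    "travel": ["Cheap Flights to Europe", "Discount Hotel Bookings"],
    "finance": ["Low-Rate Mortgages"],
}

def category_based_ads(page_category):
    """Category-style placement: the page carries a category label chosen up front."""
    return ads_by_category.get(page_category, [])

def naive_semantic_ads(page_text):
    """Crude stand-in for semantic matching: pick the category whose name appears in the text."""
    for category, ads in ads_by_category.items():
        if category in page_text.lower():
            return ads
    return []

print(category_based_ads("travel"))
print(naive_semantic_ads("Our travel guide covers budget airlines and hostels."))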

In addition to the new advertising product, Kanoodle has also announced that they have secured additional venture financing to help with their expansion plans.

Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists

Yahoo Switching To Inktomi?
I took a leap a few months ago, predicting when Yahoo! would change search results providers. I thought my theory was sound, but apparently Yahoo! seems to have a few different ideas.

I figured, based on their history of rolling out new features a few days after announcing their quarterly earnings, that they would roll out with Inktomi results in early October (Tuesday, October 14th to be exact). There was a reason for the date, but you will have to read the other article to find out what that is.

While I was a little off on the date (OK way off, as it still hasn’t happened yet), it is still obvious that this is going to happen some time soon. Which brings me to the point of all this speculation, and a question for you – are you ready for the switch?

Think about this for a second before you answer. Consider the implications of having (or not having) listings in Inktomi right now. If you do not have an Inktomi listing, not only are you alienating 1/3 of your possible web traffic (via Microsoft’s MSN), but you also have the potential of losing another 1/3 of your traffic once Yahoo! does make the change. That means you could be losing 66% of your total traffic. Wouldn’t this justify the paid inclusion fees?

If you do have Inktomi listings, consider what they are. Do they accurately describe your site? Do a search on MSN for your keywords. Do they come up? How compelling is the description? Would you click on your listing if you were looking for your product, or are your competitors’ listings more compelling? Keep in mind that MSN has its own ranking algorithm, so at the moment there is no way to influence rankings up or down. But just take a minute to look at the results, put yourself in your customer’s shoes and think about which result you would click on.

One more thing to think about: we have seen an incredible increase in traffic from the Inktomi crawler, also known as Slurp, and it has been incredibly active during the last 3 or 4 months. This can only mean one thing: Yahoo! is gearing up for the switch by building the database. Which leads to the question: are you being found by Slurp?

You can easily find out by viewing your log file reports. Most log analysis software can distinguish between browsers and spiders. You will want to look under the “spiders” section of the report to see if Slurp has been there, and how often. Based on our data, Slurp should be one of your top 3 spiders over the past few months.
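If you would rather check the raw logs yourself, a minimal sketch of what those log analyzers are doing might look like the following. It assumes an Apache/NCSA combined log format where the user agent is the last quoted field, and it matches any agent string containing “Slurp”; the path is hypothetical, so adjust it and the pattern for your own server.

import re
from collections import Counter

def spider_hits(log_path, pattern="Slurp"):
    """Count crawler hits per day by scanning the user-agent field of each log line."""
    hits = Counter()
    agent_re = re.compile(r'"([^"]*)"\s*$')  # last quoted field = user agent
    with open(log_path, errors="ignore") as log:
        for line in log:
            match = agent_re.search(line)
            if not match or pattern.lower() not in match.group(1).lower():
                continue
            # Pull the day out of the [10/Dec/2003:13:55:36 -0700] timestamp, if present.
            day = line.split("[", 1)[1].split(":", 1)[0] if "[" in line else "unknown"
            hits[day] += 1
    return hits

# Example call (path is hypothetical):
# print(spider_hits("/var/log/apache/access.log"))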

If it hasn’t been there, you may want to consider paying for inclusion on a few pages just to get the spider visiting. Otherwise you do risk losing a lot of potential traffic when the change occurs.

On a final note, it has been almost a year since Yahoo! bought Inktomi. December 23rd was the day we brought the news to you. Wouldn’t it be poetic if Yahoo! made the change on that date – one year later? After all, it is a Tuesday.

Author:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists

How To Survive Search Engine Upheavals
During the month of November many website entrepreneurs were shaken when Google made its infamous “Florida” update, wiping out search positions for several high-ranking websites, which disappeared from the “top 100” overnight. While all the details behind the Google changes are still not known, it is possible for website owners to learn something from this experience and take steps to ensure that they will not be adversely affected by future upheavals of this sort. I have compiled a list of six steps that can be taken to bring about steady, long-term success regardless of search engine fluctuations.

1. Stick to the Basics: Instead of trying to outsmart the search engines, build a website that is both search engine friendly and stocked with useful content for your end-users. In the long run, you will be rewarded for such efforts and your website will achieve its best possible position even in the face of changes in search engine algorithms.

2. Don’t Over-Optimize Your Site: Write your website’s text naturally. Sure, your main keywords have to be included on your site, in your headlines, in your opening paragraph and in some of the links on your site, but don’t overdo it. It is thought that Google applied a filter in its latest update which penalized sites with keyword densities higher than would normally be expected.

3. Don’t Indulge in a “Linking Frenzy”: During the past year, as more people became aware of Google’s PageRank system, there was a feverish effort to trade links and get links from sites with high PageRank. The importance of PageRank in determining search results on Google is uncertain in light of the new updates, and it doesn’t even figure on other search engines, which may come to play a more important role in the future. (Google currently accounts for around 75% of search engine activity, but should Yahoo switch to Inktomi in the coming year, Google’s share will decline.)

As I mentioned in a recent article, you should pursue link exchanges if those links can generate traffic or if they can provide a valuable resource for your web visitors. Don’t worry about PageRank, and don’t rely on linking alone to boost your site’s standing in search results. The traffic that you get from good links will make a steady contribution to your website’s success.

4. Diversify Your Promotion Efforts: Take a look at your log files and website statistics and see where your traffic is coming from. Instead of trying to compete for a heavily contested keyword or keyword phrase, see if there are variations that you can use to reach your target audience.

Diversification also means not relying on search alone. Participating in forums and online discussions and publishing your articles on other websites and in ezines will also bring steady streams of traffic that will not be affected by search engine updates.

5. Find a Niche: One of the biggest difficulties faced by Internet entrepreneurs is that they are selling products or services that are no different from those of several thousand competitors. If you sell something that is specialized or unique, then your task of achieving prominence in search results will be eased considerably. Even if you do not have something unique, you should consider a niche in which you can market your product. Suppose you are a web designer. If you try to compete for the keyword phrase “web design” you will find it difficult to succeed. But if you compete for “Web Design, Iowa” (assuming that you live in Iowa!), your chances are much better.

6. Get Ready to Pay for Your Advertising: Let’s face it, if there are thousands of websites competing for a particular search phrase, only 50 of them can make it onto the first five pages of most major search engines. While building a well-optimized site is still one of the most cost-effective measures a site owner can take, in some heavily contested fields it may sometimes be necessary to participate in pay-per-click advertising in order to stand out from the crowd.

Take a look at your website and your business activity, and see where you can make modifications that will improve your prospects for long-term success. If you do so, you will not be troubled by the next update of Google or any other search engine.

Author Bio:
Donald Nelson is a web developer, editor and social worker. He has been promoting web sites since 1995 and now runs A1-Optimization (http://www.a1-optimization.com), a company that provides low-cost search engine optimization and submission services. He can be reached at support@a1-optimization.com.

Google’s Florida Update
It’s been almost one month since Google started rolling out the Florida update and millions of listings were dropped from the results. In that time, hundreds of search engine marketers and thousands of website owners have dealt with the loss with all the classic signs of bereavement: at first, denial, then anger, gradually changing to acceptance and finally, healing. We’re moving on, understanding that we’re just part of the never ending circle of Google.

I too moved through the process, starting by wondering what the heck Google was doing, then trying to guess, often shaking my head in bewilderment, and then trying to look at it from Google’s perspective. It seems the rest of the SEM industry is doing the same. With a few exceptions, there were no outright attacks on Google at any of the sessions during the recently ended Search Engine Strategies show in Chicago. With the benefit of hindsight, I can see why Google handled it the way they did. The jury is still out on whether it was the right way.

First, An Update

As we mentioned in the last column, we expected this latest update to be a work in progress. Google confirmed this at the SES show. In fact, during one of the sessions, a site owner who had been dropped was very pleasantly surprised to see his site back in when the results were put up on the big screen. I pointed out a few instances where the relevancy of results was suffering badly from the broad exclusion of commercial sites. I expected Google to tweak the algorithm to allow for some of these sites to come back in. That seems to be exactly what is happening. For many of these searches, we’re seeing previously excluded commercial sites beginning to come back, making the results more relevant to the searcher. Changes seem to be happening almost daily. On the more competitive searches, the new rankings do seem to be catching a lot of the previous spam.

Yet Another Hypothesis About Why
A few weeks ago, I said this was a filter aimed at aggressively optimized and affiliate sites. After several hours of team research and speaking to others in the industry, I’m beginning to think this is part 1 of a major change in how Google will rank sites. Danny Sullivan put forward his theory that Google is now using two algorithms, a new more sophisticated one on the more competitive searches and the old one on the less competitive searches.

There has been some discussion about the possible role Google’s recent purchase of Applied Semantics may be playing here. At Enquiro, we had a few people point to this as a place to start looking. Rob Sullivan dug a little deeper and came up with an interesting theory. I’ve taken it and run a little further. We have no proof that this could be happening, but certain things do start to make sense when you look at it from this perspective. Besides, this industry thrives on speculation, so why not throw a little more into the mix?

The Applied Semantics Concept Server used language patterns, including semantics and ontology, to try both to determine the real meaning of the words on a website page and to anticipate what people are looking for. It tries to interpret concepts based on the use of words, their proximity and the patterns they occur in. What if Florida was Google’s first attempt to introduce this concept to its search engine?

The other unique aspect of Concept Server is that it can refine results on an ongoing basis as it becomes ‘smarter’. It starts by feeding concepts or results that it feels match the searcher’s intentions. If the response isn’t positive, it will try to do a better job next time.

Search engines already monitor the relevancy of their search results by looking at the click rate on each results page. If the search engine is doing its job well, there should be a heavy click through rate on the first page, and the clicks should be fairly consistently spread around the results shown. This indicates that all the results were relevant and the searcher didn’t have to go any further.
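As a rough sketch of that click-distribution check (the numbers and the measure are invented for illustration; real engines presumably use far richer signals), you could ask what fraction of the listed results received any clicks at all:

def click_spread(clicks_by_position):
    """Fraction of listed results that received at least one click."""
    if not clicks_by_position:
        return 0.0
    return sum(c > 0 for c in clicks_by_position) / len(clicks_by_position)

healthy = [40, 22, 15, 9, 6, 4, 2, 1, 1, 1]  # clicks spread around the first page
skewed = [95, 0, 0, 1, 0, 0, 0, 0, 0, 0]     # nearly everyone dives for one result

print(click_spread(healthy))  # 1.0 -> the whole page looks relevant
print(click_spread(skewed))   # 0.2 -> most results are being ignored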

What if Google is combining the artificial intelligence of the Applied Semantics Concept Server with this monitoring of click-throughs? In this case, Google’s algorithm isn’t applied universally to every set of search results. Rather, the various factors that make up the algorithm can be adjusted on the fly, delivering results that improve with each search. The more popular the search term, the more searches are conducted and the faster the results will improve. As Google monitors more searches, the Concept Server will start to notice patterns between similar concepts and the type of results chosen by the searcher. With every search, Google will be better able to anticipate what the searcher is looking for, even if the query isn’t right on target.

Presenting Our Case
With that in mind, let’s look back at what’s happened in the past 4 weeks.

Danny Sullivan theorized that the new algorithm wasn’t being applied in every case because of the processor horsepower required. This makes sense with our theory, as these ‘smarter’ queries would put a significantly higher workload on a server than the old searches.

It would also explain why the most popular searches looked much more relevant right at the beginning, with the less popular searches taking a week or two to improve in relevance. It also would explain why it looks like search results are changing almost daily.

It also makes sense that the new algorithm wouldn’t be applied to most single words, as it would be harder for the Applied Semantics process to work on single-word searches.

Finally, there’s the question of why commercial sites seemed to be hit hardest. I believe Google knew it had to move quickly to clean up spam, so they started with a crackdown on the most likely culprits, knowing they would be throwing out some of the good with the bad. They also knew that the algorithms would gradually adjust the thresholds, letting the borderline sites back in as the monitoring showed that relevancy had to be improved.

If We’re Right, What Does It Mean?
Well, for one thing, it means the Google Dance is a thing of the past. Changes in results will happen fluidly and consistently, based on ongoing relevancy monitoring of click-throughs. It’s almost as if Google has taken a page from Direct Hit’s book and given it a Google twist. Direct Hit was the one-time search engine wunderkind that used searcher click-throughs to determine relevance. Apparently its back-end technology still plays a part in determining results on Ask Jeeves and Teoma.

Secondly, it would mean that individual rankings will move much more frequently and reliance on specific keyphrases will become less important.

Thirdly, a change like this will take a while to fully roll out, so Google will continue to take us on a roller coaster ride for the foreseeable future.

Lastly, this would be the first major step forward in search engine technology in quite some time, and that’s probably the biggest reason why we think we might be on to something.

If Google Did It, Why?
Consider Google’s position. They’re still moving towards an IPO. They knew there was a significant problem with the relevance and integrity of their search results. And they know that the 800 pound gorilla, Microsoft, is rattling the cage, waiting to go head to head with them. It’s a battle they had no hope of winning as long as their results were filled with spam. But if they could unveil a major technological improvement that put Google back far ahead of the crowd in terms of the quality of their search results, they might have a fighting chance.

Now to the question of timing. Why now? It’s pretty obvious. Google is still in an untouchable position when it comes to search engines. They enjoy an 80%-plus market share. They could afford a little short-term turmoil if they knew it would settle down in a couple of weeks. And as the battle with Microsoft looms larger, every week becomes vital. For Google advertisers, the timing might have been disastrous, but the damage to Google would have become dramatically more significant by waiting until the new year.

But Why The Big Secret?
If there was anger towards Google at the Search Engine Strategies conference, it was mainly due to the lack of communication about the Florida update. Many felt that Google owed it to their advertisers, many of which saw their organic results wiped out overnight, to communicate exactly what it was they were doing.

I have to admit, I was squarely in this camp, until I started thinking about this theoretical new roll out of a substantially different ranking mechanism.

If Google had warned us prior to the update, would it have accomplished anything? More likely, it would have just caused a furor of changes on websites, as optimizers, affiliates and site owners tried to avoid being dropped out of the results. In the end, it’s likely the same sites would have been dropped, only to see some of them come back in a few weeks as the algorithms adjusted. Perhaps Google was doing the SEM industry a favor, saving us from hours of futile work.

‘But what about after the update?’ respond the naysayers. ‘Why did Google not just come out and tell us what they were doing, rather than force us to guess?’ This point is a little more valid. Officially, Google’s line was that it was just another algorithm change. If our theory is correct, this statement is true in substance but grossly understated in scope. GoogleGuy gave a few more hints on WebmasterWorld, but remained pretty tight-lipped and virtually disappeared from the forums during the worst of the turmoil.

From a strict customer relations perspective, this could have been handled much better on Google’s part. But we can’t forget the Microsoft factor here. You have to know the gang from Redmond has Google under a high-powered microscope right now. With the battle looming, every day and week Google can keep Microsoft in the dark about their intentions could make the crucial difference between surviving and suffering the same fate as Netscape.

Crunch Time For Google
I have no idea if we’re right. All I know is that the pieces seemed to fall into place neatly around this theory. And I do know that small moves on Google’s part are not nearly enough to give them a hope of surviving Microsoft’s onslaught. They have to make big moves, and make them soon. This could be the first of them.

PS… Other Cool Things From Google
Although not nearly of the same potential scope as an artificially intelligent, self adapting algorithm, Google did unveil some new features in Chicago that are pretty high on the ‘neat’ scale. First of all, type a UPS waybill number into Google and you’ll be taken to the UPS tracking site. You can also do the same to track down patent numbers and other highly referenced items. By the way, if you want to see what other ideas Google is playing with, try Google Labs.

Of much more importance from a marketing perspective is Google’s recent featuring of Froogle results, tied into the Google results. If you have a US-based e-commerce website that allows consumers to purchase online, you can have your site included in Froogle for free! This is a tremendous opportunity and we’ll be working on this for our clients.

Author:
Gord Hotchkiss

Google’s Shakeup
Monday, December 16, 2003

It has been exactly one month since Google introduced its infamous Florida Update. As the Florida Update has brought about the largest and most comprehensive shake-up of Google’s listings ever, it has generated a great deal of interest around the world and genuine panic for the literally millions of webmasters whose sites have been adversely affected by the change. The following article is a compilation of our writings about the Florida Update since November 17th, the first Monday after the shift.

Monday November 17, 2003 – Happy Monday Morning Folks… DON’T PANIC!

Have you taken a look at Google today? Yes, what you are seeing is real. Google is showing totally different listings on the search engine returns pages today. Actually, this weirdness started sometime on Friday night or Saturday morning. Most of our clients have not been affected, and the only one we have seen affected has had his rankings rise dramatically. Our site has been affected though, rather badly at that. From the #6 spot under the phrase “Search Engine Placement”, the happy-go-lucky StepForth site has dropped past the fifth page of returns.

I think this is a temporary thing. We last saw such a massive shake-up six months ago and the listings went back to normal after a few days. This sort of shake-up generally indicates that Google is re-ordering their entire database of spidered sites. Something big is happening at Google but we’re not sure what it is. What we do know is that since a major update of Google’s database started on Friday night, the search returns have been extremely buggy with long-term Top10 pages dropping from existence, recorded back links decreasing or disappearing for many sites, and more than the usual amount of spam appearing in the Top10. We have also noted the disappearance of one of their major servers (www-sj.google.com).

Google has been delivering questionable returns for several months now, with spam and duplicate listings often making it into the Top10. The last time their listings were this upset was in October 2002, when Google tried introducing Blog entries and news releases into its general listings. Within two weeks, the listings had been restored to a shaky state of “normal”, but that marked the beginning of strange and often spammy entries into the Top10. This month’s update is being referred to in the SEO community as the “Florida Update” and has a lot of SEO practitioners scratching their heads. Our current advice is to wait it out for at least two weeks and see what Google does next.

December 3, 2003 – Google’s Florida Update
The impact of Google’s Florida Update has not been fully realized yet, but it appears the damage will be extensive considering the reports we are getting from some clients. Literally hundreds of thousands, if not millions, of websites have seemingly disappeared from Google’s listings, most of which enjoyed a Top10 placement before the massive update that started on November 16th. Like most retailers, ecommerce sites that have faded from the listings needed a good Christmas season to remain viable into the next year, and many of them staked their sales plans on their previously strong placements at Google. The fallout will be noticeable, particularly among small businesses where advertising options are limited by small business budgets. Small businesses, however, will not be the only companies facing an uncertain future because of the Florida Update. When the SEO community starts receiving calls from the mainstream media and people who are not clients, asking what is wrong with Google, one knows that Google itself has a problem that goes far beyond their data centers. As one of the pioneers of the web, Lee Roberts of The Web Doctor, points out, “It was word-of-mouth that generated their popularity because people could find what they were looking for. Now, we only find sites with less quality content and less sites that offer what we want.”

The Florida Update encompasses the most substantial changes to Google’s famed ranking algorithm in the young company’s history. There are several theories as to why Google forced this update. Some say that Google is trying to force small businesses to join their highly profitable AdWords program by making such a comprehensive update just before the Christmas shopping season. Others say that Google has always used the weeks around the US Thanksgiving holiday to make changes in the hope that the sudden decrease in traffic over what is often a 4-day weekend will give their engineers enough time to introduce a new algorithm (and fix any minor errors) without causing massive disruptions for their normal users. A third theory (the one I lean towards) states that Google was simply tired of being gamed by the growing cadre of less ethical players in the SEO sector and has simply changed the rules overnight by applying this new algorithm. Whatever the reason, the damage is being done and now advertisers and web users want to know what to expect next. Unfortunately, that is not an easy question to answer, as Google does not comment on any changes to their algorithm; therefore the only thing we can do is offer experienced and educated guesses.

I suspect that the folks at Google know they have a major problem on their hands and are working to fix it. We have seen MAJOR spider activity from Google-Bot in the past 48 hours and see evidence that another Google-Dance is currently underway. We have seen updates to the algorithm in the past. The most recent happened earlier this summer, and the one before that was in October 2002. Each time Google augmented its algorithm with a new feature or filter, massive dislocation was temporarily felt across the commercial web. Both times, however, Google began producing relevant results within a matter of weeks. The new filters added in this update were too comprehensive and penalized sites that Google couldn’t have been targeting on purpose. Again, I suggest that Google’s engineering staff knows this, and if they don’t, their customer relations and PR departments are most certainly telling them. I expect to see parts of this filter retained and applied to the formula that eventually evolves into their new algorithm, but I simply can’t see Google keeping this algorithm, continuing to serve up spam, and throwing its hard-earned reputation out the window. Regardless of the number of MBAs they have on staff, Google’s brain trust is simply too smart for that.

Google is not in the business of driving websites out of business. Google exists to make money by providing the most relevant listings possible, a goal it is clearly not achieving. As Lee Roberts stated above, Google was built on (and, implicitly, can be brought down by) word-of-mouth advertising, a fact that cannot be lost on the management at Google. Google is not in the extortion business and has, in fact, built its reputation on being above reproach in the separation of paid advertising (AdWords) from the general free listings. I have a difficult time accepting the theory that Google is simply trying to increase AdWords revenues, or increase its own perceived value before issuing the expected IPO next quarter. In reality, what I think we are seeing is Google trying to reclaim its power when it comes to choosing how it will rank websites. Think about this update as a pendulum. Before the update happened, the pendulum had swung to one extreme where, with enough hard work, some could make Google do almost anything they wanted it to. Now, with the application of the Florida Update, Google has pushed the pendulum back to the other extreme. Eventually, and based on past observation, the pendulum will find its way back to the middle. As for those of us adversely affected by the Florida Update, StepForth's best advice is to continue making minor changes to your site as normal. We do not advise a full re-optimization until the SEO sector has an honest handle on what is happening at Google; such a task would not likely produce strong results before the end of the purchasing season anyway. As a wise and wonderful person recently told me, "… you can't push a rope."

December 10, 2003 – The Florida Update Update
It is now almost four weeks since Google engineers applied the filter that has become known as the Florida Update. It has been a busy month for the SEO community and a very difficult month for businesses dependent on strong Google listings. It has also been an extremely good month for one of Google's main rivals, Yahoo! While there has not been a wholesale change in the Google listings, we have seen some limited movement at Google. Please note, this is only observation and limited analysis. Nobody in the SEO world can claim to really understand what is happening at Google until after the next major Google-Dance. What we can do is relay information and our observations as they come along, and that is what is intended in this article.

For the past 24 days, the SEO community has been trying to analyze the Florida Update and make sense of what is happening at the Mountain View GooglePlex. We have read and heard opinions ranging from the profit motive (forcing online retailers to sign up with AdWords) to the conspiratorial (Google trying to delete SEO-fueled campaigns). Our view continues to favour good intentions from Google, as they have championed clean search results for years, but in light of the way Google has treated webmasters and SEO companies who followed Google's guidelines, our faith in Google's best practices is quickly waning.

To quickly reiterate, we believe that Google is trying to weed out sites that abused SEO techniques such as massive link-building campaigns and keyword-enriched titles and anchor text. Google cast a net that was far too wide and caught a lot of completely innocent webmasters involved in business sectors that attract extremely aggressive marketers. In this case, it appears to be entire sectors being punished for the sins of a few. Some sectors that were hit particularly hard include real estate sites, travel and tourism sites, and, ironically, search engine placement sites. Along with the millions of others affected by the Florida Update, we saw our own listing slip from the #6 position on Google to somewhere below the #1000 position. We have since noticed our site bounce back to the #48 and #50 positions on Google, where it currently sits. Recently, we've noted a major increase in Google spider activity and a good deal of "bounce" for listings across Google's various datacenters.

We believe another Google-Dance is on right now and that the engineers at the GooglePlex are working to restore some sense of relevance to the listings. We have seen many sites that were injured by the Florida Update bounce back up and vanish again, often within the same 24-hour period. Clearly something is happening in the background that is at times flushing over into the foreground. Again, this is evidence that Google is trying to fix a broken tool, and they had better fix it soon! Search engine users are looking for other information sources, and Google may see a decline in webmasters using its services if those users and webmasters do not have faith in Google's ranking technology. As stated before, Google's update hit some folks who were playing by the rules. This is likely the first time the phrase "collateral damage" can be used in a truly honest and meaningful way. As a measure of the impact of the Florida Update, the Scroogle website from Daniel Brandt (of GoogleWatch.Com fame) has placed #7 on Alexa's Movers & Shakers list. I think this indicates how deeply people need to know what's happening at Google. Scroogle had a user increase of 710% over the past 7 days.

Many, if not most, of the sites affected by the Florida Update meet the guidelines mentioned above. About fourteen months ago, Google announced it would begin a massive spam deletion campaign, a factor leading to last year's October update. At that time, Google re-published its SEO guidelines, thus forcing a number of SEO firms, including StepForth, to re-tool our promotion techniques to meet the new "rules" as spelled out by Google. When the world's largest search engine said it would take action against websites that deviate from simple, written rules, SEO firms sat up and listened. The problem for us is that we and our clients changed to suit Google's rules, but Google itself has not. Currently, the Top10 under almost any keyword phrase is bound to bring up some spam and several irrelevant results. We do not believe this would have happened if Google had enforced the rules it asked SEOs to follow. Meanwhile, down the road in Sunnyvale, California, Google's #1 rival, Yahoo!, is reaping the benefits of the early Christmas gift Google has given them. Yahoo!, owner of Overture, Alta Vista, AlltheWeb, and Inktomi, has recently surpassed Google as the world's most popular website, according to Alexa's popular monitoring service. With searchers starting to look for information at other search tools, Yahoo! appears to be a clear winner. Ironically, Yahoo! continues to draw much of its listings from Google, but industry rumour has Yahoo! switching fully to Inktomi early in the New Year. Yahoo!'s stable of search tools, patents and technologies, along with Overture's Content Match contextual distribution system, make Yahoo! an attractive alternative to Google.

As for businesses depending on Google, we are deeply concerned about the effect this update will have on their bottom lines, especially as Christmas sales are so important. Google will never be able to bring back lost time, and it is doubtful that anything Google does at this point will help salvage the season for online retailers. Advertisers who NEED placement today should immediately consider using AdWords, since Google is still the largest and most used search tool in the world. Even though it is not working properly, it still drives the vast majority of search engine traffic and is still an essential place to be listed and found. Good luck to all advertisers on Google. We expect either to have firm answers by mid-January or to see Google revert to an algorithm that does not produce spam on almost every search.

December 16, 2003 – The Florida Update View from Today
Not much has changed in the past week. We noticed that Google seemed to stabilize last week, but this week the returns pages seem to be bouncing all over the place again. A couple of developments have occurred that are worth noting, though. At last week's Search Engine Strategies Conference in Chicago, WebProNews Webmaster Garrett French elicited a brief acknowledgement and apology from Google's Senior Research Scientist, Craig Nevill-Manning. When pressed about the effect of Google's algorithm switch on small businesses, Nevill-Manning said, "I apologize for the roller coaster. We're aware that changes in the algorithm affect people's livelihoods. We don't make changes lightly." As Garrett said, "that's good to know", but if you think your post-Christmas expense bill will be a shock, imagine what the small online retailers will face in January and February.

Many of the sites that disappeared in mid-November have reappeared, but not in the prominent placements they had enjoyed before Florida was applied. StepForth's site, for instance, was oscillating between the #2 and #6 spot under the phrase "search engine placement". The site dropped out of the Top1000 in mid-November and has since resurfaced at positions #42, #48 and #52 over the past week. There is still a great deal of spam and irrelevant listings appearing in the Top10 under most commercial keyword phrases. We continue to see Top10 listings that break several anti-spam rules posted by Google. A search of various datacenters and ranking-servers shows that this trend is not expected to end in the coming two weeks.

Finally, we are starting to see search engine users moving away from Google. This is more than the wishful thinking that was floating about at the end of November; numbers from the website statistics company Alexa show that both Yahoo and MSN are increasing in user popularity while Google is showing slight declines. We think this means that users are perceiving problems in Google's listings, both out of personal frustration and because the "experts" are noting the problems at Google with increasing frequency. Whatever the cause, Page, Brin and Co. must be aware of the symptoms and should be feeling some unease as searchers, businesses and webmasters are suffering.

Author Bio:
Jim Hedger is the SEO Manager of StepForth Search Engine Placement Inc. Based in Victoria, BC, Canada, StepForth is the result of the consolidation of BraveArt Website Management, Promotion Experts, and Phoenix Creative Works, and has provided professional search engine placement and management services since 1997. http://www.stepforth.com/ Tel – 250-385-1190 Toll Free – 877-385-5526 Fax – 250-385-1198

Google Sells Christmas
Please note: at any given time, Google may turn this new filter off and revert to its old algorithm. I do not see the current algorithm as something that will stand the test of time. I believe it is a makeshift algorithm until Google can introduce some of the web search personalization and clustering technology that it obtained when it purchased Kaltix. I am not a Google engineer, and many of my statements in this document may eventually prove false. This is Google as I currently understand it.

Cat and Mouse
Recently Google had another "dance." With this most recent change they did more than the usual PageRank update. For years, search engine optimization firms and spammers have chased after the search engines. Each time a search engine introduced a new feature, thousands of people would look for ways to exploit it. Perhaps Google has the solution.

PageRank
Prior to the web, a good measure of the quality of a research paper was how many other research papers referred to it. Google brought this idea to the web: PageRank organizes the web based on an empirical link analysis of the entire web. The original paper, "The PageRank Citation Ranking: Bringing Order to the Web," has itself been cited in hundreds of other documents.
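To make the idea concrete, here is a minimal sketch (in Python) of the iterative "random surfer" calculation described in that paper. The toy link graph and the damping factor of 0.85 are illustrative assumptions, not Google's actual data:

def pagerank(links, damping=0.85, iterations=50):
    # links maps each page to the list of pages it links to
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}  # start with equal scores
    for _ in range(iterations):
        new_rank = {p: (1.0 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            if not outlinks:  # a page with no outbound links spreads its score evenly
                for p in pages:
                    new_rank[p] += damping * rank[page] / len(pages)
            else:
                share = damping * rank[page] / len(outlinks)
                for target in outlinks:
                    new_rank[target] += share  # each outbound link passes on a share of the score
        rank = new_rank
    return rank

# A toy web of four pages "voting" for one another with links.
toy_web = {"A": ["B", "C"], "B": ["C"], "C": ["A"], "D": ["C"]}
print(pagerank(toy_web))  # "C" earns the highest score because the most votes point at it

The details of Google's production system were never made public, but the principle is the same: a link is a citation, and heavily cited pages rank higher.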

Problems with PageRank
By linking to a website, a site owner casts a vote for that site. The problem with this measure of relevancy is that a vote is not always a vote, and people can vote off topic. People used free-for-all link exchanges to artificially enhance their rankings; others signed guest books. Both of these methods faded in recent times as search engines grew wise.

Some web pages are naturally link-heavy. Weblogs, for example, have a large number of incoming and outgoing links. Many weblog software programs allow users to archive each post as its own page. If a small group of popular writers reference each other, multi-thousand-page networks suddenly arise.

Articles dating back over a year have claimed that Google PageRank is dead. Just like the ugly spam that fills your inbox every day, people want to get something for nothing from search engines.

Recent PageRank and Ranking Manipulation
Some of the common techniques for improving site relevancy (and degrading search results) were:

• abusive reciprocal linking – two sites linking to each other exclusively for the sake of ranking improvements, common among sites selling Viagra
• comment spam – people (or software) post comments on weblogs and point them at their website with optimized links
• selling of PageRank – people can sell PageRank to a completely unrelated site; in fact, there are entire networks which have this as a business model (http://www.pradnetwork.com/)

While Google has fought hard to keep its relevancy, a drastic change was necessary. In the past search engines rated web pages on inbound links, keyword density, and keyword proximity.

Optimized inbound links would be links which have your exact key phrase in them. Keyword density would be the number of times a keyword appears on the page (and how they appear) divided by the total number of words. Keyword proximity is how close keywords appear to one another on the page.
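As a rough illustration of these two measures (a hypothetical calculation, not Google's actual formula), the Python sketch below computes keyword density and the closest word-distance between two keywords in a block of page copy:

import re

def keyword_density(text, keyword):
    # occurrences of the keyword divided by the total number of words
    words = re.findall(r"[a-z0-9]+", text.lower())
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words) if words else 0.0

def keyword_proximity(text, keyword_a, keyword_b):
    # smallest number of words separating the two keywords (None if either is missing)
    words = re.findall(r"[a-z0-9]+", text.lower())
    pos_a = [i for i, w in enumerate(words) if w == keyword_a.lower()]
    pos_b = [i for i, w in enumerate(words) if w == keyword_b.lower()]
    if not pos_a or not pos_b:
        return None
    return min(abs(a - b) for a in pos_a for b in pos_b)

copy = "Search engine marketing tips: our search engine marketing guide covers search basics."
print(keyword_density(copy, "search"))             # 3 of 12 words = 0.25
print(keyword_proximity(copy, "search", "engine")) # adjacent words, so a distance of 1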

The rule to high rankings was to simply place the phrases you wanted to list well for in your page copy multiple times, in your title, and in your inbound links.

The New Filter
This latest algorithmic update measures keyword proximity and other signs of artificial rank boosting and eliminates pages that trip the filter. It is not certain exactly which factors are considered, but it is believed that a high ratio of overly optimized text links, coupled with high keyword density and close keyword proximity, will trip the filter.
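Nobody outside Google knows the real thresholds, but as a purely speculative sketch of the kind of combination described above, a trip check might weigh the optimized-link ratio, keyword density, and keyword proximity together. Every number below is invented for illustration:

def trips_filter(optimized_link_ratio, keyword_density, avg_keyword_proximity,
                 link_threshold=0.8, density_threshold=0.08, proximity_threshold=2):
    # Hypothetical check: trip only when all three over-optimization signals run high.
    # optimized_link_ratio  -- share of inbound links whose anchor text is the exact phrase
    # keyword_density       -- keyword occurrences divided by total words on the page
    # avg_keyword_proximity -- average word-distance between the phrase's keywords
    return (optimized_link_ratio > link_threshold
            and keyword_density > density_threshold
            and avg_keyword_proximity <= proximity_threshold)

print(trips_filter(0.9, 0.12, 1))  # True: exact-match anchors plus dense, tightly packed copy
print(trips_filter(0.3, 0.03, 5))  # False: varied links and looser copy pass untouched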

My Website
If you place a hyphen between the keywords you are searching for, Google will act as if the entire phrase is just one word. This is most easily observed by using the page highlighting feature on the Google toolbar.

If you search for "search-engine-marketing" you will see my site (Search Marketing Info) listed at about #7. If you take the dashes out, you will see that my site tripped the filter and will not be listed in the search results for terms such as "search marketing" or "search engine marketing." The dashes make the phrase look like one word to Google, which is why the keyword trip does not occur. If you pull the dashes out, though, the page goes over the limits and trips the filter.

Please note: Google recently closed the hyphen-between-words loophole. For a while, searching for keywordA keywordB -blabla will still work. For a three-word query you would need to add a -bla -fla extension to the end of the search. Google will eventually fix this too, though.
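For readers who want to run the comparison themselves, here is a small sketch that builds the query variants discussed above using Google's standard search URL; the "-zzqx" exclusion words are nonsense terms chosen only because they are unlikely to appear on any page:

from urllib.parse import quote_plus

phrase = "search engine marketing"          # the phrase you rank for
hyphenated = phrase.replace(" ", "-")       # treated as a single "word" while the loophole lasted

normal_query = "http://www.google.com/search?q=" + quote_plus(phrase)
dash_query = "http://www.google.com/search?q=" + quote_plus(hyphenated)

# the post-fix workaround: append one nonsense exclusion per extra word in the phrase
exclusions = " ".join("-zzqx" + str(i) for i in range(len(phrase.split()) - 1))
workaround_query = "http://www.google.com/search?q=" + quote_plus(phrase + " " + exclusions)

print(normal_query)      # filtered results
print(dash_query)        # pre-filter results, while the dash loophole still worked
print(workaround_query)  # the "-bla -fla" style workaround described above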

The Penalty
Tripping the filter does not cause your site to be assessed any spam penalties. Simply put, your site just will not show for the specific search that trips the filter.

Only the specific page is penalized for the specific search. Your PageRank and all other pages in the site go unchanged.

The Test Website
note: a, b, and c refer to different keywords

One of my clients had a website that was listing in the top ten for the phrase A B C. He was also in the top ten for A B and B C. When Google performed this update, he was removed from all of those top rankings. His site was the perfect test site to discover the limits of this filter: his only inbound link was a single PR6 link from DMOZ (the Open Directory Project) with none of the keywords in the link.

Being interested in getting my client's site back on top, I began to test. In some areas I combined B C into BC. In other areas I removed and/or distributed the keywords differently. Sure enough, Google quickly re-indexed his website. I then searched for A B and B C. He was quickly listing in the top 5 for both of these competitive phrases. I searched for A B C, and he was still over the filter limit. This all but confirmed the key-phrase filter idea in my mind.

I changed his website again and am anxiously anticipating a Google re-index of the page to verify that we recapture his key phrase just below the trip limit!

Check to See if Your Site Got Filtered
Scroogle.org provides a search box that shows sites filtered out of Google's results: type in your search term to see whether your site was filtered.

How to Bypass the Filter
The filter aims to catch highly optimized pages. The way to bypass it, then, is to not appear highly optimized. When Google searches for A B C, it wants to find A B C maybe a few times (not many), but it also wants to see A B, B C, A, B, and C sprinkled throughout the text where possible. The overall density of your keywords should remain rather low.
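A minimal (and entirely speculative) self-check along those lines might count how often the full phrase appears versus its shorter fragments before a page goes live. The thresholds to aim for are guesses; the goal is simply one or two full-phrase hits with the fragments sprinkled throughout:

import re

def phrase_variant_counts(text, phrase):
    # count the full phrase and every contiguous fragment of it in the page copy
    words = re.findall(r"[a-z0-9]+", text.lower())
    terms = phrase.lower().split()
    counts = {}
    for size in range(len(terms), 0, -1):
        for start in range(len(terms) - size + 1):
            fragment = terms[start:start + size]
            counts[" ".join(fragment)] = sum(
                1 for i in range(len(words) - size + 1)
                if words[i:i + size] == fragment
            )
    return counts

copy = ("Our search engine marketing guide explains search basics, "
        "engine quirks, and marketing ideas for engine marketing budgets.")
print(phrase_variant_counts(copy, "search engine marketing"))
# {'search engine marketing': 1, 'search engine': 1, 'engine marketing': 2, 'search': 2, 'engine': 3, 'marketing': 3}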

How the Filter Hurts Google
Right now the filter does appear to be introducing more spam on certain searches. The overall relevancy of most Google searches is still rather high, though. Where I see a true problem with this new algorithm is that, in a few months, if cloaking software professionals adjust to it, they will be able to tear holes in Google.

The problem with the current algorithm is that it is looking to match randomized, un-optimized page and article structure. People can mix their words up some, but you frequently end up using some words together naturally. A cloaking script does not have to make sense to a reader. A cloaking script can write text which is more randomly organized than you or I can, because it only has to follow mathematical breakdowns; it does not also have to read well to a human eye.

In addition, many of the highly optimized sites that were appearing at the top are no longer there to protect the searcher from the really heavy spam which may rise (and in some cases already has).

We shall see if they soon add better spam filters (which will surely be required once cloakers adjust to the new algorithm).

How the Filter Helps Google
This filter helps Google in two main ways.

• Manipulating search results is much harder for a novice SEO or webmaster. Selling PageRank to an off-topic site may actually cause that site to trip the spam limits, so PageRank is no longer as easy to abuse.
• Many highly optimized sites recently tripped the filter and lost their distribution. Might these people be buying AdWords for Christmas? Ho ho ho!

What Google is Looking For
The general principle behind the filter is the idea that non-commercial sites are not usually over-optimized, while many commercial sites are. When Google serves up sites, it is looking to deliver the most relevant websites, not the most highly optimized ones.

The Eyes of a Webmaster
Our eyes are our own source of filtration. When we see our sites disappear for some of the best keywords, we may get angry. Sometimes the results look bad while Google is playing with the filters. What truly matters to Google is not whether the sites are optimized, but whether the results are relevant. Relevant results for their test subjects and the average web surfer are all that matter. Like it or not, if spam does not sneak in, this algorithm change may actually help Google.

Why Change Now
While some of the commercial searches have been degraded, often relevant results have filled the places once occupied by highly optimized websites. The biggest change is that these optimized websites have lost distribution right before the holiday spending season – ouch.

If you look closely at the Google search engine results pages now, you will see that the Google AdWords boxes have grown a bit. Signs of things to come? Does Google want commercial search listings to appear only in ads? Now that Google has so many websites dependent upon it, it can change the face of the internet overnight, and people have to play by Google's rules. With the IPO nearing, Google may be looking to show solid results that will improve its opening stock price.

Yahoo
Yahoo has purchased Inktomi and has been branding Yahoo search heavily (although right now it is still powered by Google). If Google's search results degrade, Yahoo will likely switch to Inktomi and try to steal market share back from Google. It is sure to be a dog fight.

Author:
Aaron Wall of Search Marketing Info.

Google & Trademarks
Google is now involved in a long running legal issue which is sure to stir up some emotion over web advertising.

I’m not talking about the Florida Update, although it did stir a lot of emotion and even more lost sleep than the average Google Dance. I’m referring to the issue of copyrighted or trademarked phrases which could be purchased for advertising purposes.

Google has gone down this path before, as most PPC players have, but in this case they have chosen to ask the court's guidance in determining whether a generic term (in this case "American Blind") can be protected under trademark law.

The questions here are many. For example, should a competitor be able to profit from the use of the term “American Blind” even though it is protected by trademark? Should a company like Michigan based “American Blind & Wallpaper Factory” be able to have the power to block online advertising for such phrases, or even phrases similarly worded?

Should companies like Kimberly-Clark (maker of the Kleenex brand) be able to stop advertisers from using the word "Kleenex" in their ads, even though it is so generic a term that the general public commonly uses it to refer to facial tissue?

Legally, Kimberly-Clark should be able to protect its product and prevent its competitors from profiting from using the brand. In fact, outside of the web, many companies have successfully sued competitors and others who have used trademarked phrases or images and profited from them.

It is an interesting case, given that Google is involved, when not so long ago they quietly fought to keep the name "Google" out of the dictionary. Google didn't like the idea of formally becoming part of the English language (as in, by using the search engine to check up on someone, you "Googled" them), so their lawyers quietly asked the site that proposed the change to have the proposal removed.

It almost appears as if there is some kind of double standard here. If Google doesn't want to become a household term for using a search engine, then there shouldn't really be an issue here: they should simply not allow competitors to use trademarked phrases in their ads.

Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists

Google's Update – Future Website Ranking
The recent shakeup in Google’s search results, which set the SEO (search engine optimization) community buzzing and saw tens of thousands of webmasters watch their site ranking plummet, was in many ways inevitable. Almost all SEO companies and most savvy webmasters had a fairly good handle on what Google considered important. And since SEO, by definition, is the art of manipulating website ranking (not always with the best interests of searchers in mind), it was only a matter of time until Google decided to make some changes.

If you’ve been asleep at the SEO switch, here are a few links to articles and forums that have focused on the recent changes at Google:

Articles:
Site Pro News
Search Engine Guide
Search Engine Journal

Forums:
Webmaster World
JimWorld
SearchGuild

To date, most of the commentary has been predictable, ranging from the critical and analytical to the speculative.

Here’s a typical example from one of our SiteProNews readers:

“I’m not sure what has happened to Google’s vaunted algorithm, but searches are now returning unrelated junk results as early as the second page and even first page listings are a random collection of internal pages (not index pages) from minor players in my industry (mostly re-sellers) vaguely related to my highly-focused keyword search queries.”

So, what is Google trying to accomplish? As one author put it, Google has a “democratic” vision of the Web. Unfortunately for Google and the other major search engines, those with a grasp of SEO techniques were beginning to tarnish that vision by stacking the search result deck in favor of their websites.

Search Engine Optimization Or Ranking Manipulation?
Author and search engine expert Barry Lloyd commented as follows: "Google has seen their search engine results manipulated by SEOs to a significant extent over the past few years. Their reliance on PageRank™ to grade the authority of pages has led to the wholesale trading and buying of links with the primary purpose of influencing rankings on Google rather than for natural linking reasons."

Given Google's dominance of search and how important ranking well in Google is to millions of websites, attempts at rank manipulation shouldn't come as a surprise to anyone. For many, achieving a high site ranking is more important than the hard work it takes to legitimately earn a good ranking.

The Problem with Current Site Ranking Methods
There will always be those who are more interested in the end result than in how they get there, and site ranking that is based on site content (links, keywords, etc.) and interpreted by ranking algorithms will always be subject to manipulation. Why? Because, for now, crawlers and algorithms lack the intelligence to make informed judgements on site quality.

A short while ago, author Mike Banks Valentine published an article entitled "SEO Mercilessly Murdered by Copywriters!". The article rightly pointed out SEO's focus on making text and page structure "crawler friendly". Other SEO authors have written at great length about the need for "text, text, text" in page body content as well as in Meta, Heading, ALT, and Link tags. They are all correct, and yet they are all missing (or ignoring) the point, which is that the tail is wagging the dog. Search engines are determining what is relevant, not the people using those engines. Searchers are relegated to the role of engine critics and webmasters to being students of SEO.

SEO manipulation will continue and thrive as long as search engines base their algorithms on page and link analysis. The rules may change, but the game will remain the same.

Therein lies the problem with all current search engine ranking algorithms. SEOs will always attempt to position their sites at the top of search engine results whether their sites deserve to be there or not, and search engines will continue to tweak their algorithms in an attempt to eliminate SEO loopholes. If there is a solution to this ongoing battle of vested interests, it won't come from improving page content analysis.

Incorporating User Popularity Into Ranking Algorithms
The future of quality search results lies in harnessing the opinions of the Internet masses – in other words, in tying search results and site ranking to user popularity. Google's "democratic" vision of the Web will never be achieved by manipulating algorithm criteria based on content. It will only be achieved by factoring in what is important to people, and people will always remain the best judge of what that is. The true challenge for search engines in the future is how to incorporate web searcher input and preferences into their ranking algorithms.

Website ranking based on user popularity – the measurement of searcher visits to a site, pages viewed, time viewed, etc. – will be far less subject to manipulation and will ensure a more satisfying search experience. Why? Because websites that receive the kiss of approval from 10,000, 100,000 or a million-plus surfers a month are unlikely to disappoint new visitors. Although some websites might achieve temporary spikes in popularity through link exchanges, inflated or false claims, email marketing, pyramid schemes, etc., these spikes would be almost impossible to sustain over the long term. As Lincoln said, "You can fool some of the people all the time. You can fool all the people some of the time. But you can't fool all the people all the time." Any effective ranking system based on surfer input will inevitably be superior to current systems.
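As a sketch of how such a user-popularity score might be assembled from the measurements mentioned above (visits, pages viewed, time viewed), here is one hypothetical weighting; neither the weights nor the log scaling are taken from ExactSeek or Alexa:

import math

def user_popularity_score(monthly_visits, pages_per_visit, avg_minutes_on_site,
                          w_visits=0.6, w_depth=0.25, w_time=0.15):
    # Blend the three usage signals with illustrative weights that sum to 1.
    # Visits are log-scaled so a million-visitor site does not drown out everything else.
    visit_signal = math.log10(monthly_visits + 1) / 7.0   # roughly 1.0 at ten million visits
    depth_signal = min(pages_per_visit / 10.0, 1.0)       # cap at 10 pages per visit
    time_signal = min(avg_minutes_on_site / 15.0, 1.0)    # cap at 15 minutes
    return w_visits * visit_signal + w_depth * depth_signal + w_time * time_signal

# A heavily visited site with engaged users scores well above one riding a short-lived spike.
print(user_popularity_score(1000000, 6.0, 8.0))
print(user_popularity_score(50000, 1.2, 0.5))

A scheme like this would still need the safeguards discussed below, since any single data source can be gamed, but it shows how usage signals could be folded into a ranking formula.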

To date, none of the major search engines have shown a serious interest in incorporating user popularity into their ranking algorithms. As of this writing, ExactSeek is the only search engine that has implemented a site ranking algorithm based on user popularity.

Resistance to change, however, is not the only reason user data hasn't made its way into ranking algorithms. ExactSeek's new ranking algorithm was made possible only as a result of its partner arrangement with Alexa Internet, one of the oldest and largest aggregators of user data on the Web. Alexa has been collecting user data through its toolbar (downloaded over 10 million times) since 1997 and is currently the only web entity with a large enough user base to measure site popularity and evaluate user preferences in a meaningful way.

The Challenges Facing User Popularity Based Ranking
1. The Collection Of User Data: In order for web user data to play a significant role in search results and site ranking, it would need to be gathered in sufficient volume and detail to accurately reflect web user interests and choices. The surfing preferences of a few million toolbar users would be meaningless when applied to a search engine database of billions of web pages. Even Alexa, with its huge store of user data, is only able to rank 3 to 4 million websites with any degree of accuracy.

2. Privacy: The collection of user data obviously has privacy implications. Privacy concerns have become more of an issue in recent years and could hinder any attempt to collect user data on a large scale. The surfing public would need to cooperate in such an endeavor and be persuaded of the benefits.

3. Interest: Web search continues to grow in popularity with more than 80% of Internet users relying on search engines to find what they need. However, with the exception of site owners who have a vested interest in site ranking, most web searchers have not expressed any serious dissatisfaction with the overall quality of search results delivered by the major engines. Harnessing the cooperation and active participation of this latter and much larger group would be difficult, if not impossible.

The future of web search and website ranking belongs in the hands of all Internet users, but whether it ends up there depends on how willing they are to participate in that future.

Author Bio:
Mel Strocen is CEO of the Jayde Online Network of websites. The Jayde network currently consists of 12 websites, including ExactSeek.com (http://www.exactseek.com) and SiteProNews.com (http://www.sitepronews.com). Mel Strocen (c) 2003