December 9, 2003

Google Florida Update

November 16, 2003 will be a date to remember in search engine history. It will be remembered as the day that Google raised the bar on search engines. It is the day that they changed the rules as far as ranking pages and returning search results. As people will learn, in the days and weeks to come, it was also the day that Google introduced a new term to the SEO handbook: Ontology.

Ontology is “an explicit formal specification of how to represent the objects, concepts and other entities that are assumed to exist in some area of interest and the relationships that hold among them.” In other words, ontology is a means whereby words are combined to give meaning to a body of text. It is no longer the word combinations themselves that are important; what is important is what those combinations say about the text. It’s a new way of indexing the web.

It means applying a context to the page or site and qualifying it based on the context.

No longer is Google simply looking for x number of keywords or keyword combinations. It has harnessed the ability to look at what a page means by applying ontological rules to the page content to determine its context.

Now, this may sound far-fetched. After all, how can you make a computer, which normally just looks for sequences of words or numbers with no deeper interrelationships, understand that there is a deeper meaning? In other words, how do you teach a computer to think?

This is where Applied Semantics comes in. This company has devised a way to apply ontological rules to a web page to determine its context. By applying rules (such as synonymy, antonymy, similarity, relationships and many others) to the content of the page, the software can classify a page (or site) into various groupings, or taxonomies.
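
To make that idea a little more concrete, here is a deliberately simplified toy sketch of context classification: score a page against hand-built sets of related terms and pick the best-matching taxonomy. This is only my own illustration of the general principle, not Applied Semantics’ actual technology, and the topic lists are invented.

```python
# Toy illustration of ontology-style classification. This is NOT Applied
# Semantics' actual technology; it only shows the general idea of scoring a
# page against groups of related terms instead of counting a single keyword.

TAXONOMIES = {
    "travel/hotels": {"hotel", "rooms", "booking", "resort", "suite", "check-in"},
    "finance/loans": {"loan", "mortgage", "interest", "refinance", "credit", "lender"},
}

def classify(page_text):
    words = set(page_text.lower().split())
    scores = {topic: len(words & terms) for topic, terms in TAXONOMIES.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unclassified"

print(classify("Book a hotel suite with free check-in and resort spa access"))
# -> travel/hotels
```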

Now, I know you are saying, “Wait, a computer can’t think. Not only that, but if it could, it would need incredible processing power, or it would work extremely slowly.”

This is where Kaltix comes in. Kaltix is a company that was formed by Stanford researchers to improve the calculation of PageRank by making it faster. At least, that’s what the press release said.

So, if you combine the speed-harnessing abilities of Kaltix with the “understanding” of Applied Semantics, what do you have? A very fast, very smart search engine.

And guess what? Google owns them both. Do you see where I’m going here?

If not, let me clarify. Google purchased Applied Semantics back in April of this year. I would guess that they had been keeping an eye out for a company like this for some time; they have been looking at improving their results for a year or more now, ever since people learned how to manipulate PageRank.

But they soon found, likely in May, that the Applied Semantics software wasn’t as efficient as they would have liked. It was at this time that we noticed a slight change in the SERPs after a deep crawl.

So they went on a hunt, looking for someone to make it faster. And they found Kaltix. Remember, Kaltix developed a way to make the PageRank algorithm faster. Well, why couldn’t they apply the Kaltix technology to the Applied Semantics ontology software? After all, PageRank is an algorithm, and so is the ontology software (albeit likely a much more complex one). In September, the Kaltix deal closed and Google now has the ability and speed it needs to take the next step.

Remember what Larry Page said about the perfect search engine? “The perfect search engine would understand exactly what you mean and give back exactly what you want.” (From the corporate section of the Google website – http://www.google.com/corporate/today.html).

The key words here are “understand” and “give back exactly what you want.” Now remember what ontology is. Ontology is used to capture the core relationships between concepts supporting the characterization of text in terms of meanings. In other words, understanding what a body of content is about.

Larry Page wants a search engine that understands and Applied Semantics has software which does just that.

So what’s next? Now that Google has a search engine that thinks, can things get better? Well, if you have been watching the recent Florida update news, you will know that in many cases Google results appear to have actually gotten worse since that day in November, at least in some people’s eyes. So I guess now is the time to point out that, according to Applied Semantics’ website (which, by the way, was recently changed to exclude most of the information I am sharing with you today; more proof?), the software is designed to learn. It can understand when the wrong results are displayed (by monitoring which results are chosen) and it can adapt.

Therefore, one would assume that search results should get better over time. How long will that be? No one can tell, but I would think it should not take more than a few months.

In the end, despite the short term pains, I guess change is a good thing.

Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists

Why You Should Always Avoid Cloaking
When building, implementing or optimizing a website, you should avoid any form of cloaking at all costs, since those techniques are strongly forbidden by most major search engines today, and with very good reason. However, before I start explaining why you should always avoid cloaking, let me give you a brief introduction to the rather complex world of cloaking.

By definition, cloaking is a technique of delivering a certain set of pages to search engine crawlers, while at the same time delivering a completely different set of pages to your human visitors. There are many ways of doing this. Although some will tell you that the most effective way to cloak is with the use of IP (address) delivery software, the jury is still out on this and opinions vary widely. The subject of cloaking has always been and still is a very controversial topic.

Sophisticated cloaking software that makes use of IP delivery consists of complex programs whose inner workings fall well beyond the scope of this article. However, let me give you a simplified explanation of how these cloaking programs work.

The Idea Behind Cloaking
When a search engine spider or human visitor requests a specific Web page, cloaking programs compare the IP address of the requesting party to a database of known IP addresses belonging to specific search engine spiders. If the IP address matches one on the list, the program serves the spider a particular page that was written specifically for the search engines. This page is normally optimized to rank well on that particular search engine, in order to achieve high visibility.

On the other hand, if that IP address does not match one in the database’s list, the requesting entity is ‘assumed’ to be a human visitor and is served the actual ‘real’ page or is simply re-directed to another page appropriate for human visitors (!)
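
To illustrate the mechanism just described (certainly not to recommend it), here is a minimal sketch of what IP-delivery logic boils down to; the spider addresses and file names are invented placeholders.

```python
# Minimal sketch of the IP-delivery logic described above, shown only to
# illustrate the mechanism (and why it is fragile). The spider addresses
# and file names are invented placeholders.

KNOWN_SPIDER_IPS = {"64.68.82.1", "216.239.46.5"}   # must be kept up to date

def page_for(remote_ip):
    if remote_ip in KNOWN_SPIDER_IPS:
        return "optimized-for-spiders.html"   # keyword-stuffed version
    return "real-page-for-humans.html"        # what people actually see

print(page_for("64.68.82.1"))   # a known spider gets the cloaked page
print(page_for("10.0.0.7"))     # everyone else gets the real page
```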

Since most major search engines today constantly change their spiders’ IP addresses in an effort to defeat this silly cat-and-mouse chase, you can imagine that if these databases are not regularly updated, they become obsolete very quickly, in effect making such cloaking programs completely useless in no time!

As a rule, most major search engines today HATE any type of cloaking software and they categorically forbid them in their terms of use. One such strong opponent to any form of cloaking has always been Google. Today, Google will take strong action against any ‘cloaking website’ that it discovers. Such actions range from temporarily penalizing that site to outright banning it permanently from its index.

The simple reason for all this is that search engines rightly feel that they should always be delivered the same Web page that any human visitor would normally see. Many search engines also feel (and I agree with them) that webmasters making use of cloaking software or programs are in effect ‘spamming the index’. While some people may not agree with this, the search engines’ point of view does have some good merit and, if you want your site indexed by them, you don’t have any other option but to follow their rules.

Conclusion
Always build a great site that offers a lot of valuable content to ALL your visitors and to the search engines, and make certain the search engines get the same pages your actual visitors get. If you do that, and follow the rest of the advice offered on this website, your site should rank well. Avoid cloaking at all costs; it’s never an option.

Remember that all search engines today are only concerned with two important things: the relevancy and quality of their search results. Also, you should always remember that it is THEIR search engine and only they make the rules. So, in light of all this, it would be wise for you to completely forget about any kind of cloaking, unless you are willing to take the chance of having your site penalized or banned permanently from the major search engines.

Author:
Serge Thibodeau of Rank For Sales

Every Search Engine Robot Needs Validation
Your website is ready. Your content is in place, you have optimized your pages. What is the last thing you should do before uploading your hard work? Validate. It is surprising how many people do not validate the source code of their web pages before putting them online.

Search engine robots are automated programs that traverse the web, indexing page content and following links. Robots are basic, and robots are definitely not smart. Robots have the functionality of early generation browsers: they don’t understand frames; they can’t do client-side image maps; many types of dynamic pages are beyond them; they know nothing of JavaScript. Robots can’t really interact with your pages: they can’t click on buttons, and they can’t enter passwords. In fact, they can only do the simplest of things on your website: look at text and follow links. Your human visitors need clear, easy-to-understand content and navigation on your pages; search engine robots need that same kind of clarity.

Looking at what your visitors and the robots need, you can easily see how making your website “search engine friendly” also makes it visitor friendly.

For example, one project I worked on had many validation problems. Because of the huge number of errors generated by problems in the source code, the search engine robots were unable to index the web page, and in particular, a section of text with keyword phrases identified specifically for this page. Ironically, human users had problems with the page as well. Since humans are smart, they could work around the problem, but the robots could not. Fixing the source code corrected the situation for human and automated visitors.

There are several tools available to check your HTML code. One of the easiest to use is published by the W3C (http://validator.w3.org/). While you’re there, you can also validate your CSS code at W3C’s page for CSS (http://jigsaw.w3.org/css-validator/). The reports will tell you what source code needs to be fixed on your web page. One extra or unclosed tag can cause problems. With valid code, you make things easier for your human visitors, and search engine robots can travel through your website and index your pages without source code errors stopping them in their tracks. How many times have you visited a website, only to find something broken when going through the web pages? Too many to count, I’m sure. Validating your pages makes it easier for your website to get noticed.
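
If you want a quick, automated sanity check before running a page through the W3C validator, even a few lines of script can catch the most common problem mentioned above, an extra or unclosed tag. This is only a rough sketch, not a substitute for the real validator.

```python
# Rough tag-balance check: flags extra or unclosed tags, the kind of error
# that can stop a robot in its tracks. Not a substitute for the W3C validator.
from html.parser import HTMLParser

VOID_TAGS = {"br", "img", "meta", "link", "input", "hr", "area", "base", "col"}

class TagChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack, self.errors = [], []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID_TAGS:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if self.stack and self.stack[-1] == tag:
            self.stack.pop()
        else:
            self.errors.append("unexpected </%s>" % tag)

checker = TagChecker()
checker.feed("<html><body><p>Hello<div>world</p></div></body></html>")
print("errors:", checker.errors)        # the mis-nested </p> sets off a cascade
print("still open:", checker.stack)     # tags never closed
```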

As I said before, what works for your website visitors works for the search engine robots. Usability is the key for both your human visitors and automated robots. Why not provide the best chance for optimum viewing by both?

Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing (http://www.searchinnovation.com), a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.

Google’s “Florida Dance”
As most people have noticed by now, the last Google update, dubbed “Florida” has generated a lot of uproar amongst the business community and the Internet economy. This latest update is a clear example of the power of Google and what it can mean to a large number of commercial websites that depend on their rankings for the success of their business model and their viability.

There are many old and well-established commercial websites that have lost considerable rankings and that suffered a staggering loss of traffic in the process.

To the best of my knowledge, this is the largest and most important algorithm change in the 5-year history of Google. Since November 15, Google has started to impose what some call the Over Optimization Penalty, or OOP. It’s still a bit too early to speculate what effect the OOP will have on some sites, but it does suggest to me that it may be a form of static penalty, meaning whatever modifications you make now will probably not get your site back into the top results, at least not until the next Samba!

Therefore, you aren’t likely to see affected sites return to the top for their target phrases until Google releases the OOP from the sites it was applied to. My feeling is that nothing will change until the next Google dance, which probably won’t happen until December 15 or a bit later.

On December 2nd, somebody came to me with a website that a previous SEO, a competitor of mine, seemed to have “overly optimized” for certain important keywords. Although this article is certainly NOT meant to help anyone spam the search engines in any way, the following are techniques that can be used as a last recourse if you feel some of the pages on your site have been affected in this way.

Fine-Tuning A Website To The OOP
As is most apparent to many, Google is still experimenting with its new algorithm and continues to adjust as we are heading into the busy Christmas and Holiday season. One thing is certain: there appears to be no stability in the new algorithm.

For that reason, I would just sit and wait to see what the December update brings before committing to any changes whatsoever.

Detecting If The OOP Has Been Applied To A Particular Page
You can determine whether your site has had any OOP penalty applied by simply entering your keywords in a Google search, but you will need to add a “-” exclusion for a unique string of characters. Example:

your main keyword or key phrase -blahblahblah

The -blahblahblah string of text doesn’t matter one bit; it’s simply a string of nonsense characters. You can type in anything you wish and it will work (!)

What is important is that you put the “-” character in front of it; this exclusion appears to bypass the new spam filter which causes the OOP penalty.

After running this simple test, if you see your page reappear under these conditions, it’s highly likely the OOP is in effect for that particular keyword or key phrase on that page.
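
If you want to run this comparison quickly for several phrases, the two searches can be generated in a couple of lines; the nonsense exclusion string is arbitrary, exactly as described above.

```python
# Build the two searches described above for a quick side-by-side check:
# the normal query, and the same query with an arbitrary nonsense exclusion
# that (at the time of writing) appeared to bypass the new filter.
from urllib.parse import quote_plus

def oop_check_urls(phrase, nonsense="blahblahblah"):
    base = "http://www.google.com/search?q="
    filtered = base + quote_plus(phrase)
    unfiltered = base + quote_plus("%s -%s" % (phrase, nonsense))
    return filtered, unfiltered

for url in oop_check_urls("construction building materials"):
    print(url)
```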

Note that there is a possibility Google might modify how this type of search works, in an effort to prevent people from seeing the results without the penalty filter. However, at the time I wrote this, this search technique enabled me to detect which web pages had been affected by the OOP, so it is a fairly accurate test.

More On The OOP Penalty
The OOP penalty appears to be mostly related to certain incoming links pointing at the page that gets penalized. On-site absolute links might also trigger the OOP. In one particular case where I saw a site receive an OOP ranking penalty, there were incoming text links that matched exactly the keywords for which I made the search. Thus, if the anchor text of your incoming links matches exactly the text in your page title, your headline tags or your body text, it is my observation that the OOP penalty could be triggered by Google’s new algorithm.

Of course, it is extremely premature at this point to speculate on the exact formula Google is using to achieve this, but if a rather large number of your incoming links match exactly with the keyword or the key phrase used in the search box, it is in fact possible to trigger the over-optimization penalty in my view.

The AdWords Conspiracy Debate
All of this can seem a bit strange, a lot of opinions and ideas have circulated in forums and articles since this started in mid-November, and many (including me) have discovered that the OOP penalty doesn’t seem to be applied consistently across the Web. As a result, some people have suggested that Google might be at the centre of a conspiracy to force companies and site owners to spend a lot more money on its AdWords PPC campaigns. Personally, I do NOT subscribe to that theory, and I don’t think it would be in Google’s best long-term interest to do that.

However, the OOP penalty does in fact appear to be applied mostly to higher-traffic keywords and key phrases, the sort that commercial websites use the most. In retrospect, I guess this does give some credibility to those who think there might be a conspiracy behind all of this, for what it’s worth.

Make no mistake about this. It is in fact the biggest and most drastic algorithm change I have ever seen in the short 5-year history of Google. If a few pages on your site seem to suffer from the OOP penalty, let’s look at some ways to get out of it.

Some Solutions To The OOP Problem
For the most part, the OOP penalty comes from incoming links that perfectly match the keywords or key phrases in a specific title tag or headline on the affected page. There are a couple of ways to repair this. The first would be to reduce the number of incoming links to your affected pages.

Another good alternative would be to modify the title or headline of your affected page (or pages), so that you have fewer occurrences of the exact keyword or key phrase that originally triggered the OOP penalty. For example let’s say the key phrase in your incoming link that carries the penalty is Construction & Building Materials and you have Construction & Building Materials in your title tag, as well as in your description tag and in your H1’s or H2’s. I would change the title tag to be ‘Special Construction and Building Supplies’ and see what that does.

You should do the same for the body text of that page. Ideally, you want to achieve a ‘good balance’ that Google would consider spam-free and not artificial. It is my thinking that too many incoming links containing your exact key phrases, combined with too much on-page optimization and exact-matching text in the target areas (the title tag, the description tag, the H1 and H2 headline tags, etc.), will likely trigger the OOP filter.
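
One crude way to spot this kind of exact-match pile-up is simply to count how many of the prime on-page areas repeat the incoming anchor phrase verbatim. The threshold and the example values below are my own rough rule of thumb, not Google’s formula.

```python
# Crude over-optimization check: count how many prime on-page areas repeat
# the incoming anchor-text phrase verbatim. The threshold and example values
# are a rough rule of thumb, not Google's formula.

def exact_match_count(anchor_phrase, page_fields):
    phrase = anchor_phrase.lower()
    return sum(phrase in text.lower() for text in page_fields.values())

page = {
    "title":       "Construction & Building Materials | Acme Supply",
    "description": "Construction & Building Materials at wholesale prices.",
    "h1":          "Construction & Building Materials",
    "body_intro":  "We stock special construction and building supplies.",
}

hits = exact_match_count("Construction & Building Materials", page)
print("%d of %d key areas repeat the exact anchor phrase" % (hits, len(page)))
if hits >= 3:
    print("Consider rewording the title and headlines, as suggested above.")
```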

Conclusion
In light of all that has happened with update Florida, and with the drastic changes that Google has implemented to its PageRank algorithm, you should expect to see more changes to this algorithm, perhaps even before December 15, at which time some observers think Google might start dancing again.

As I have written in previous articles on this, and as I was interviewed and quoted by the New York Post on November 27, I still believe that if your site has suffered little or no OOP penalty, you should stay put and not do anything for the time being.

This article is simply meant to show a few techniques I have used to limit damage already done to a site by an SEO firm that had overly optimized it prior to the critical November 15 date. If your site was optimized using only Google-accepted techniques, according to their Terms of Use agreement, your site should not have suffered any OOP penalties whatsoever.

Author:
Serge Thibodeau of Rank For Sales

Where On Earth Is Your Web Site?
You’ve just finished congratulating your marketing team. After six months of concentrated effort you can now actually find your own company web site within the search engines. Everyone is busy handshaking and back patting when a voice from the back of the room rises above the din. “Yeah, this is great! Can’t wait until we can find ourselves on wireless devices.” All conversation comes to an abrupt halt. Eyes widen. Everyone turns to the fresh-faced intern standing in the corner with a can of V8 juice in one hand and a Palm device in the other. You, the Department Manager, barely managing to control your voice, not to mention your temper, ask the now nearly panic-frozen intern, “What do you mean, find ourselves on wireless? We just spent thousands on our web site visibility campaign!” “Well…” explains the sheepish intern, “there is no GPS or GIS locational data within our source code. Without it, most wireless appliances won’t be able to access our site.”

Guess what? The intern is absolutely correct. Anyone interested in selling goods and services via the Internet will soon be required to have some form of geographic location data coded into their web pages. There are approximately 200 satellites currently orbiting the Earth (even NASA won’t confirm the exact number), some of them in geosynchronous or geostationary orbit high above your head. The Global Positioning System (GPS) is the name given to the mechanism of providing satellite ephemerides (“orbits”) data to the general public, under the auspices of the International Earth Rotation Service Terrestrial Reference Frame (ITRF). Sounds like Star Wars, doesn’t it? It’s pretty close. The NAVSTAR GPS system is a satellite-based radio-navigation system developed and operated by the U.S. Department of Defense (DOD). The NAVSTAR system permits land, sea, and airborne users to determine their three-dimensional position and velocity, 24 hours a day, in all weather, anywhere in the world, with amazing precision. http://igscb.jpl.nasa.gov/

Wireless devices, WAP, cellular, satphones and a whole host of newly emerging appliances, and indeed new software applications, will all utilize some form of GPS or, more likely, GIS data retrieval. GIS stands for Geographic Information System and relies on exact latitude and longitude coordinates for location purposes. Several car manufacturers currently utilize GPS for on-board driver assistance, and the marine and trucking industries have been using it for years. Obviously your web site is a stable beast. It sits on a server somewhere and doesn’t move much, so at first glance it seems quite implausible that you’ll need GIS locational data within your source code. On the contrary. One thing your web site represents is your business’s physical location(s), and if people are going to try to find your services and products, shouldn’t you, at the very least, tell them where you are and how to get there?

Let’s look at it from the other end of the spectrum: the end user approach. Let’s say you’re vacationing in a new city for the first time. Once you get settled into your hotel room, what’s the first thing you want to find? Restaurants? Bank machines? Stores? So you pull out your hand-held wireless device, log onto the web and search for “Italian Food in San Francisco.” Five hundred results come back, so you click the new “location” feature on your hand-held (which knows exactly where you are) and ten Italian restaurants, which were smart enough to code their web sites with GIS data, light up on the screen. Guess which restaurants didn’t get selected? The other four hundred and ninety. Starting to get the picture?

How does this affect you and your web site marketing?
GIS latitude and longitude coordinates will soon be a must-have on every web site operator’s and web developer’s list, and an absolute necessity for anyone wishing to trade goods and services via the Internet. This data may relate to the physical location of the web site, to where the site is being served from (if applicable), or to where the actual business represented by the site is physically located. There may be multiple locations and multiple pages involved: if, for example, you have a franchise with several locations, each location will probably need a page of its own with the correct corresponding location data. If you run a home-based business, I doubt the coordinates of your living room are going to be necessary, but you should provide the latitude and longitude of the closest city or town. Large corporations such as banks may want to code the exact location of every automated teller machine across the country. Industry standards and the methods of serving out this data are still in the development phases, but it’s a safe bet that there are plenty of people working on the solutions right now, and given the speed of technology, implementation will probably come sooner rather than later. Give yourself an edge. Find out where in the world your web site is…before your web site is nowhere to be found.
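
There was no single standard for this at the time of writing, but one informal convention already circulating was to embed coordinates in meta tags (the GeoURL-style “ICBM” tag and the “geo.position”/“geo.region” tags). Treat the exact tag names in the sketch below as an assumption about those emerging conventions rather than a settled standard, and the coordinates as example values only.

```python
# Sketch of embedding coordinates in a page using informal conventions of the
# era (GeoURL-style "ICBM" and "geo.*" meta tags). The tag names are an
# assumption about emerging practice, not a settled standard; the coordinates
# are just an example (downtown San Francisco).

def geo_meta_tags(lat, lon, region, placename):
    return "\n".join([
        '<meta name="ICBM" content="%s, %s">' % (lat, lon),
        '<meta name="geo.position" content="%s;%s">' % (lat, lon),
        '<meta name="geo.region" content="%s">' % region,
        '<meta name="geo.placename" content="%s">' % placename,
    ])

print(geo_meta_tags(37.7749, -122.4194, "US-CA", "San Francisco"))
```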

Author Bio:
Robert McCourty is a founding partner and the Marketing Director of Metamend Software and Design Ltd., a cutting edge search engine optimization (SEO) and web site promotion and marketing company. Scores of Metamend Client web sites rank near or on top of the search engines for their respective search terms.

The Google Update Uproar
‘Be careful what you wish for, it may come true’

While at Ad-Tech, I lamented the clogging of Google’s results with spam-filled sites. I asked Google to clean up its index. Although I’m sure there’s no connection between the two (unless Sergey and Larry are paying a lot more attention to me than I thought), Google responded just a few weeks later with the Florida update. And boy, have they responded big time!

If you haven’t ventured into an SEO forum for a while, you might not have heard of the Florida update. It’s Google’s latest dance, and it’s a doozy. It appears that Google is trying to single-handedly shut down the entire affiliate industry.

The scene is awash with guessing and speculation. Was it a Google mistake? A plot to force advertisers to move to AdWords for placement? Barry Lloyd did a good job of trying to bring sense to the mayhem. I’d like to jump in with some further research we’ve done and my own thoughts of what’s happening with the Google index.

A Florida Guide
First of all, the Florida update was rolled out November 16th. It appears to be a new filter that is applied to commercially oriented searches, triggered by certain words in the query. The filter clears out many of the sites that previously populated the top 100. In several tests, we found the filter generally removes 50 to 98% of the previously listed sites, with the average seeming to be 72%. Yes, that’s right: 72% of the sites that used to be in Google are nowhere to be seen!

Who’s Missing?
The target is pretty clear. It’s affiliate sites with domains that contain the keywords, and with a network of keyword links pointing back to the home page of the site. The filter is remarkably effective in removing the affiliate clutter. Unfortunately, legitimate commercial sites with lower PageRank are being removed as well. There seems to be a PageRank threshold above which sites are no longer affected by the filter. We’ve seen most sites with PageRank 6 or above go through unscathed.

And the Secret Word Is…
The filter also appears to be activated only when search queries contain certain words. For example, a search for ‘Calgary Web Design Firms’ activated the filter and cleared out 84% of the sites, while a search for ‘Calgary Database Development’ didn’t activate it. Search volumes are roughly equivalent for both phrases. The filter seems to be activated by a database of phrase matches, and doesn’t appear to be affected by stemming. For example, ‘Panasonic fax machines’ activates the filter, but none of these words as a single search phrase does. ‘Fax machines’ activates the filter, but ‘Panasonic machines’ doesn’t.

Also, it seems that only a few single word searches activate the filter. We found that jewelry, watches, clothing, swimwear, shelving, loans and apartments all activated the filter. Other terms that you would think would be bigger targets for spam, including sex, cash, porn, genealogy, MP3, gambling and casino don’t activate the filter. Obviously, when you look at these words, Google is more concerned with commercialization than spam.

Volume, Volume, Volume
Another factor in whether or not the filter is tripped seems to be search volume. Any commercial searches with volumes over 200 per month (as determined by Overture’s search term suggestion tool) seemed to trip the filter. Searches under that threshold seemed to remain unfiltered. For example, a search for ‘Oregon whitewater rafting’ (about 215 searches last month) activated the filter, while a search for ‘Washington whitewater rafting’ (about 37 searches last month) didn’t.

What is Google Thinking?
Obviously, given the deliberate nature of the implementation, this isn’t a hiccup or a mistake by Google. This was a well thought out addition to the algorithm. And in the most competitive searches, it produces much better results than did the ‘pre-Florida’ index. If you search for ‘New York Hotels’ today, you’ll find almost all of the affiliate clutter gone.

Where the problem occurs is in the less competitive searches, where there’s not a sufficient number of PageRank 6 or higher sites to fill the vacuum caused by the filter. If you do a search now for most phrases, you’ll find the results are made up mainly of directories and irrelevant information sites. In cleaning house, Google has swept away many sites that should have stuck. As an example, visit Scroogle.org and search for ‘Calgary web design firms’. Scroogle is from the deliciously twisted minds of Google Watch, and gives a graphic representation of the bloodshed resulting from Florida. In the pre-Florida results, the top 10 (all of which were wiped out by the filter) included 6 Calgary-based web designers and 1 in Vancouver (two of the remaining results were additional pages from these firms). The other result was a directory called postcards-usa.com with a page of design firms from around North America. Eight of the 10 results were directly relevant, one was somewhat relevant and one was of questionable relevancy for the geographically specific search.

In the filtered results, there is not one web design firm from Calgary. The top 4 listings are directory site pages, two of which are not even specific to Calgary. Rankings 5 and 6 belong to Amazon.com pages selling a book on web design (nothing to do with Calgary other than a reader review from someone who lives there). Rankings 7 and 8 go to pages about evolt.org, a non-profit organization of web designers, and a profile of a Calgary-based member. Listing 9 goes to the web design page of an abysmal web directory, again not specific to any region. And listing 10 goes to an obvious link farm. Of the 10 results, none were relevant.

Google’s Next Move?
Pulling out the crystal ball, which in hindsight was amazingly accurate 2 weeks ago, here’s what I think will happen. The Florida filter will not be revoked, but it will be tweaked. It’s doing an amazing job on the ultra-competitive searches, but the algorithm will be loosened to allow an inflow of previously filtered sites, bringing relevancy back to the less competitive searches. Hopefully, the sites finding their way back into the index will be better-quality legitimate commercial sites and not affiliate knock-offs. Google has to move quickly to fix the relevancy of these searches, because it can’t afford another blow to the quality of its search results.

I really don’t believe that Google purposely implemented the filter to drive advertisers to AdWords, but that is certainly a likely side effect. The most dramatic impact will be the devastation of the affiliate industry. Just 3 short weeks ago I listened to 4 major internet marketers say they didn’t bother with organic SEO because their affiliate partners did it for them. Those days are over. If Google was targeting anyone with Florida, it was affiliate sites. A number of forum posts indicated that Google was taking aim at SEO. I don’t believe so. I think Google is trying to wipe out bad SEO and affiliate programs and unfortunately there are a number of innocent bystanders who got hit in the crossfire. But every indication from Google itself (both from posts to forums and in replies to help requests) seems to indicate that Florida is a work in progress.

Author:
Gord Hotchkiss

Higher Google Adwords Clickthrough Rates
1. Target your ads to the right Audience.
You do this by selecting keywords and phrases which are relevant to your product or service. Avoid keywords that are too general because although they generate a large number of impressions, they often generate very few clicks. To improve your CTR, use more descriptive phrases so that your ads will only appear to prospective customers searching for what you have to offer.

2. Use the correct keyword matching option(s).
Google offers four different methods of targeting your ads by keywords: Broad Match, Phrase Match, Exact Match and Negative keyword. By applying the most focused matching options to your keywords, you can reach more targeted prospects, improve your CTR, reduce your cost-per-click and increase your return on investment.
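
As a quick reference, the four options are expressed in the way you enter the keyword itself; here is a small sketch of the notation, with example keywords only.

```python
# The four matching options, written the way they are entered in the keyword
# list (example keywords only).
match_examples = {
    "broad":    "home equity loans",      # searches containing these words
    "phrase":   '"home equity loans"',    # searches containing this exact phrase
    "exact":    "[home equity loans]",    # this exact search only
    "negative": "-free",                  # blocks any search containing "free"
}
for match_type, keyword in match_examples.items():
    print("%-9s %s" % (match_type, keyword))
```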

3. Target your ads by location and language.
When creating your AdWords campaign, target your ads by location so that you maximise your sales and improve your CTR. Reach the right audience by selecting the languages and countries that you want to target.

4. Use your main keywords in the Title or Body Text of your ad.
By using your keywords in the title or body text of your ad, you will make it stand out from your competitors’ ads and grab the eye of your prospective customers.

5. Create different Ad Groups for different search phrases/keywords.
This will allow you to refine your ads and test them for relevance and therefore maximise your clickthrough rates. For example, if your service offers loans, you can create different ad groups for home equity loans (and all other phrases that incorporate this phrase), consolidation loans, student loans and so on.

6. Calculate what you can afford to pay for each clickthrough.
You will find that more focused keywords and search phrases have a higher conversion ratio than other more general keywords. It’s a good strategy to pay more for clicks from keywords or phrases with a high conversion ratio than from the more general keyword groups.
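
As a rough example of the arithmetic (all figures invented): if a focused phrase converts 2% of clicks and a sale is worth $40 in profit, the break-even bid is $0.80, and a sensible maximum bid is some fraction of that.

```python
# Rough break-even bid calculation with invented example figures.
profit_per_sale = 40.00   # what one sale is worth to you
conversion_rate = 0.02    # 2 of every 100 clicks buy

breakeven_cpc = profit_per_sale * conversion_rate
print("Break-even cost per click: $%.2f" % breakeven_cpc)

# Leave yourself a margin: bid only a fraction of break-even.
target_margin = 0.5
print("Suggested maximum bid:     $%.2f" % (breakeven_cpc * target_margin))
```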

7. Use highly targeted Keywords and search phrases.
Be specific when selecting keywords and search phrases for your campaign. General keywords will be more expensive and will result in lower clickthrough rates. If you’re bidding on general keywords that are relevant to your site, consider using the Exact Match and Phrase Match keyword matching options in order to increase your CTR.

8. Test and monitor your ads to get the best clickthrough rates. Refine and fine-tune your ad to maximise click throughs. With Google you can do this in real time. You can do this by creating different ads for each ad group and then checking which ads have the best clickthrough rates.

9. Give Google users a compelling reason to click on your ad link. The easiest way to do this is to provide something of value for free. You can also achieve this if you tailor each keyword to your offer and use relevant terms/ words in both the title and the ad body. Use a different ad for each keyword group or search term. This increases relevance and the likelihood that Google users will click on it.

Author Bio:
Copyright. Ben Chapi owns Venister Home business and Affiliate Program Classifieds at http://www.venister.org.

Optimizing Framed Websites
A number of months ago, I wrote an article on the proper techniques that should be used to successfully optimize a site that makes use of frame technology. I still regularly get questions about sites that were designed using frames, asking me the best way to optimize them to improve their visibility in the search engines. Every now and then, I also get a few emails from site owners who are desperate because their framed site is nowhere to be found in the engines. In response to all of this, I think an extended version of my first article is necessary.

For some reason, even today, there is still a lot of controversy over whether or not to use frame technology when building a new website. This controversy is certainly well founded, since there still seem to be many site designers using frames, even though that’s not a very good idea anymore. In this section, you will learn all you need to know to get a framed web site properly listed and optimized for the major search engines.

To be fair, let’s just say that a framed website can sometimes be easier to update, and many web designers decide to use frame technology for this very reason. Frames can arguably be even more useful for maintaining very big web sites that have a lot of content spread over 500 or more pages, although I don’t necessarily agree with that assumption.

However, as you are about to find out, framed websites usually create many problems and complications for most major search engines in use today. This article will show you the pitfalls and the mistakes that some webmasters make in implementing their framed sites instead of building “search engine-friendly” websites. It will also show you the right way to do it, if you still want to go ahead with this idea.

What Constitutes A Framed Web Site?
The easiest and surest way to determine whether a site is framed is to check whether its left-hand navigation menu remains stationary while the information in the centre of the page scrolls up or down. Furthermore, some framed sites include a company logo or image at the top, or sometimes at the bottom, that also remains fixed while the rest of the page moves in either direction. Some might also include links or navigation buttons in that same stationary section. When you see any or all of these characteristics, you are dealing with a framed website.

Today, anywhere on the Internet or in good technical books, you can read a lot about search engine optimization (SEO) that basically says using frames on a website is like putting a curse on it, simply because most major search engines will not navigate the frames, or because it is simply impossible for them to crawl or spider the site. These same people will tell you that such a framed website will never get indexed or optimized properly.

That statement is actually both true and false at the same time. It is false if frames are used correctly. But, by the same token, it’s perfectly true if the frames are used improperly and in a search engine-unfriendly way. Happily, there are ways around these limitations.

When implemented correctly, a framed site can be ‘almost’ as good as a site without frames and can hopefully deliver results that could be considered reasonable, as compared to what the site delivered previously. However, if your site isn’t designed yet, I would still suggest you do it with search engine-friendly technology, in other words, stay away from frames!

Why Do Search Engines Hate Framed Websites?
In this section, I will explain the main reasons why most framed web sites fail to get indexed properly on the major crawler-based search engines such as Google. When we look at the HTML code of a typical framed web site, we usually see the TITLE tag, the META tags, and then a FRAMESET tag. The major search engines’ crawlers and spiders, such as Googlebot and Freshbot, are programmed to ignore certain HTML code and instead focus specifically on indexing the actual body text of a page.

But with a typical framed website, there is simply no body text for the search engine’s crawler to index, because the text is located on the other pages of the site, what we call the inner pages.

If you want to know what search engines see when they get to a site using frames, this is what they see:
“You are currently using a web browser that does not support frames. Please update your browser today in order to view this page.”

This is almost the equivalent of putting a sign on your door that says: “If you cannot read this sign, it’s because we are permanently closed for business”. Yet, this is exactly what search engines see when they get to a framed website!

Here is a little test that will prove what I just said is true: while at your keyboard, try a search on Google for the following query: “does not support frames” and you will actually discover about 2,800,000 web sites are found by Google! That is a lot of wasted Web pages that could bring in a LOT of sales.

To fix that problem, use that otherwise wasted space to include important keywords and key phrases, along with some body text that is rich in sales copy; that will make a big difference in helping these framed sites rank higher in the search engines.

How To Properly Use The Noframes Tag
There is an HTML tag called the noframes tag which, when properly applied, offers exactly what the search engine spiders need: the important text data they require to properly include that page in their index.

If a person really has to use a framed site for whatever reason, then it’s very important to use the noframes tag correctly. Of course, all these steps will be useful only if the information on the site’s homepage is carefully written and makes strong use of your important keywords and key phrases. Again, nothing beats careful research and analysis of the ‘right’ keywords and key phrases, using the professional facilities of Wordtracker.
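
As an illustration, a frameset homepage whose noframes section carries real, descriptive, keyword-rich copy (instead of a “your browser does not support frames” message) might look like the sketch below; the file names and copy are placeholders.

```python
# Illustrative frameset homepage with a noframes section that carries real,
# keyword-rich copy instead of a "your browser does not support frames"
# message. File names and copy are placeholders.
homepage = """<html>
<head>
  <title>Construction and Building Supplies - Acme Supply</title>
  <meta name="description" content="Wholesale construction and building supplies.">
</head>
<frameset cols="200,*">
  <frame src="menu.html" name="menu">
  <frame src="main.html" name="content">
  <noframes>
    <body>
      <h1>Construction and Building Supplies</h1>
      <p>Acme Supply stocks lumber, concrete, roofing and hardware for
         contractors and homeowners. Browse our catalogue or contact us.</p>
      <a href="main.html">Enter the site</a> | <a href="sitemap.html">Site map</a>
    </body>
  </noframes>
</frameset>
</html>"""

with open("index.html", "w") as f:
    f.write(homepage)
```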

Always remember that if you optimize a site for the wrong keywords, you will have a site that will rank high, but for search terms that have nothing to do with what your site offers! You will then receive enquiries and questions on products or services that your company doesn’t even offer! In the search industry, relevancy and quality are king. The more relevant your site is, the better success you will get, and the higher your ROI will be. It’s just as simple as that.

More Framed Website Topics
So far, all the above information takes care of just the site’s homepage. Now, as can be expected, the rest of the pages on your site also need to be indexed correctly.

Most site designers and web programmers who elect to use frames do so mainly for ease of navigation, although the same results can be achieved with other, more search engine-friendly technology, such as tables. You simply include your navigational links inside the tables and it’s done!

One thing that is really important is to ensure that all the inner pages of your site get optimized the same way, even though you don’t have to deal with the noframes tag there. Once all those inner pages are optimized appropriately, after one or two monthly Google updates (dances), you should start seeing your website properly positioned, ideally close to the top of the SERPs (Search Engine Results Pages) in Google and most major search engines in use today.

Conclusion
Framed websites are made of old, outdated technology and are certainly not the best. They have serious limitations, as far as search engines are concerned. When building a completely new site, or when re-designing an old one, it’s always best to stay away from frame technology.

However, if you really have to, optimizing a framed site can be done successfully if the above examples and techniques are followed carefully. Remember that the key to all of this is the no-frames tag. If carefully implemented, the no-frames tag can successfully remove many of the barriers that plague framed websites.

Author:
Serge Thibodeau of Rank For Sales

Affiliates & Adwords
I’ll have to admit that I didn’t believe all the hype about Google Adwords for the longest time. Then I caught myself clicking on some of the Adwords ads instead of the search results. That made me think.

Five bucks. What’s five bucks? Lunch at McDonalds. I could skip that for a day. So I surfed Clickbank for a while, found something new and really targeted, popped a few keywords in and wrote the ad. Basically, I used the sledgehammer method of marketing. Fifteen minutes later, I was getting clicks. Then I went to bed, of course. A few things I have learned. If I’m on the computer at one in the morning, I must hide my credit card from myself. Or maybe I was wrong?

I woke up the next day and discovered a few things. One, all of these gurus I decided not to listen to weren’t lying. I had made sales. In fact, I made more instant affiliate sales in 12 hours than I ever did before. Another, I had a lot more to learn. But I was hooked. So I bought a few ebooks to learn more and went to work. Why the hell had I waited so long?

Adwords is a crash course in direct marketing. You learn quickly what people want and what keywords they type in to find it. And you can just as quickly lose money if you don’t learn the medium. Google Adwords takes all the guesswork out of optimizing webpages. If you have the money, you can buy all the traffic you want. The key, though, was to buy sales as cheaply as possible. These are the basics. After the basics, Adwords gets complex. It comes down to fine tuning. Fine tune your keywords correctly and you will receive instantly targeted visitors who are ready to purchase specific products.

Clickthrough Quantity
A lot of internet marketing comes down to keywords. Adwords is no different. You must learn to think like your customers. What keywords would they use to find the product you are selling? Sometimes you just can’t think of them. This is where keyword tools come into play. I have listed a few free keyword tools that will help at the link in the resource box.

If the keywords you are using are not performing well, there may be a few ways to fix this. One is to fine-tune the keyword. Instead of using “e-mail” to sell a spam blocker, you can use the Adwords keyword suggestion tool to find phrases such as “e-mail filter” and “junk e-mail.” Then delete “e-mail.” Obviously, not everyone who uses “e-mail” in their search terms is looking for a spam blocker. In fact, very few will be, so don’t destroy your click-through rate by using terms that are too general.

You can also fine-tune the keyword by using brackets, quotation marks, or negative keywords. Brackets around [e-mail filter] will allow your ad to be shown only when searchers type just “e-mail filter” in Google’s search box, not “e-mail” and not “free e-mail filter.” Using quotes will allow your ad to be shown for both “e-mail filter” and “free e-mail filter”, but not “filter e-mail”. Using a negative keyword, like “e-mail -filter”, will allow your ad to be shown for searches containing “e-mail” that do not contain “filter.” You can find more details here: https://adwords.google.com/select/tips.html.

Another problem may be that your customer just does not see your ad. Simply creating a new ad that contains the keyword that hasn’t been producing well may improve your click-through rate. This works because Google highlights the keywords that the searcher used in your ad. You can also try capitalizing power words in your ad to get it seen. If you know the keyword should be working but isn’t, try this first. If this doesn’t help, consider narrowing the keyword or dropping it. Don’t be scared to.

Use 300 keywords instead of 30. The more keywords you use the better. You will receive customers looking for specific products and you will have a better chance of getting keywords that cost five cents instead of five dollars.

It may just come down to good old ad copy. An Adwords ad is the haiku of advertisements. You must pack a punch in three lines, four if you count the URL. Go for the benefits, not the features, of the product. Use active verbs that make your surfers want to click instead of static nouns.

Clickthrough Quality
You can get all the clicks from Adwords that you can imagine, and a 20% click-through rate, and still not get any sales. All this means is that you are throwing money down an internet hole. Since you are paying for every click, remember that all clicks are not the same.

If you are trying to sell a product, you definitely want to get rid of freebie hunters first. How do you do this? Add a price to the ad. Or, if the ad copy on the product’s page is killer, you can use something along the lines of “Low-priced”. In other words, use a phrase that tells the surfer that he will be getting out his credit card.

Another way to convert more clicks through Adwords comes down to keywords again. More specific keywords will convert more surfers into customers. If you are an affiliate for a motorcycle helmet company, you don’t want to use “motorcycle” as a keyword. You want to use “motorcycle helmet.” Otherwise, you’ll pay for customers who are searching for jackets, exhaust, or pictures.

There are a few types of affiliate programs where you wouldn’t mind freebie hunters. One is a site that uses cookies. Think of the last time you purchased an ebook. Did you buy it when you saw the first ad? Probably not. You waited. But after reading about the product all over the net, you decided to get out your credit card. I discovered these types of customers after I deleted the Adwords campaign that actually brought in the sales. The site’s cookie credited me with the sales weeks after my ad brought them to the site.
Another example is programs that pay for leads. Some companies will pay you if the surfer fills out a form. They depend on their professional follow-up to get the sale. But the point is, you have made cash and the surfer didn’t even have to pay. You just have to get him to sign up for info.

Set A Budget
Figure out how much you want to spend a day and stick to it. A good start is $5.00 a day. Also set your maximum cost per click. This will keep you from spending way too much. For some products I have the CPC set at $0.05. On others, I have it set at $0.50. It depends on how many clicks you actually convert into sales.

You must also set a cutoff point. Decide how much money you can spend on a new campaign without any sales before you drop it. It could be 300 clicks or $20 spent. Whatever you set it at, you must stick to it.

Tracking
Google Adwords allows you to set up campaigns, followed by ad groups, followed by individual ads. This structure is great for testing your ads. And believe me, you have to test and you have to track. A campaign can be set up for one product. Then you can set up an ad group for each feature of the product. One ad group may focus on the product’s ease of use. Another may focus on the low price. And in each ad group you can have multiple ads. Using this system and Adwords’ tracking features, you can track which ads work best for clickthroughs.

But you still have to track your conversions. Commission Junction is great in that you can send each ad to a link. Then when you go to their site you can discover which ads are producing the most sales.

But whatever you do, know your average sale, know your average price per click and know how many clicks it takes for you to make a sale. Then figure your return on investment. You may be spending more than you are making. It’s easy to get caught up in the process and forget you are trying to make money. Don’t get into bidding wars. And always test something new. Nobody’s perfect.
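
To make that arithmetic concrete, here is a worked example with invented numbers:

```python
# Worked return-on-investment example with invented numbers.
avg_sale_commission = 25.00   # what you earn per sale
avg_cpc             = 0.15    # average price you pay per click
clicks_per_sale     = 100     # clicks it takes, on average, to make one sale

cost_per_sale = avg_cpc * clicks_per_sale
roi = (avg_sale_commission - cost_per_sale) / cost_per_sale

print("Cost per sale: $%.2f" % cost_per_sale)   # $15.00
print("ROI: %.0f%%" % (roi * 100))              # about a 67% return on ad spend
```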

Following The Rules
In my little experiment, I also ran into a few rules that Google wants you to follow; I found them by breaking them, which is the best way. Here is the link to the FAQ: https://adwords.google.com/select/faq/index.html. One, you can’t use trademarks as keywords. Two, you must get a 0.5% clickthrough rate or Adwords will slow your campaign down. Also, if you are an affiliate, you must at least put “aff” at the end of your ad.

In summary, Adwords can get you instant sales, but you must study the system, fine-tune your campaigns, and track your results.

Author Bio:
Stephan Miller is a freelance programmer and writer. For more Adwords resources, visit the page below http://www.profit-ware.com/resources/googleadwords.htm

Bayesian Spam Filter & Google
On November 26, Stephen Lynch, a journalist at the New York Post, picked up the phone and interviewed me about an article I had written the previous day. The article was about the current November Google update “dance”, dubbed “Florida”.

The following day, Mr. Lynch’s article was published in the New York Post, offering his comments and explaining, without getting technical, some of the negative effects such an update can have on the average site owner or webmaster.

As the latest “Florida” monthly Google update ‘dance’ has shown us, a lot can ride on having a website highly ranked on the Internet’s number one search engine, Google. If your search rankings drop as precipitously as some did, and without warning, it can spell a devastating blow to an online store or commercial website.

In the last 10 days, a lot of articles have also been written by some of my colleagues, some in the SEO field and some, like Seth Finkelstein, who are more in favour of the free flow of information that the Internet can provide.

In this article, I will attempt to describe some of the spam-filtering techniques that Google is reported to be using during this Florida “dance”. This spam-filtering technology is based on the Bayesian algorithm.

The Inner-Workings of a Spam Filter for a Search Engine
For quite a long time now, Google’s search results have been under attack by search-engine spammers that continuously attempt to manipulate search results, in the end cluttering the search engines’ databases with irrelevant information.

With the ever-growing popularity of Google, and as it tries to handle more and more searching all over the Web, the temptation to foul the search results has become attractive to certain spammers, leading to substantial degradation in the quality and relevance of Google’s search results. Since Google is mostly concerned with quality search results that are relevant, it is now cracking down on these unscrupulous spammers with new spam-filtering algorithms, using Bayesian filtering technology.

At the end of October 2003, Google deployed its new Bayesian anti-spam algorithm, which appeared to make its search results crash whenever a previously identified spam site would normally have been displayed. In fact, the search results were completely aborted when such a spam-intended site was encountered. See “Google Spam Filtering Gone Bad” by Seth Finkelstein for more technical information on how this spam elimination algorithm works at Google.

The First Shoe That Fell
On or around November 5th, this spam problem was in fact reduced significantly as these new Bayesian anti-spam filters “kicked in”. Although not perfect, the new Bayesian spam-filtering technology seemed to have worked, albeit with crashes in some cases.

On or about November 15th, 2003, Google started “dancing” as it does every month, performing its extensive monthly deep crawl of the Web and indexing more than 3.5 billion pages. This update had some rather strange results, in a way reminding some observers of a previous major algorithm change done in April of 2003, dubbed update Dominick, where similarly unpredictable results could be noted across the Web.

It was generally observed that many ‘old’ and high-ranking sites, some of which were highly regarded as ‘authoritative’ and were certainly not spammers in any way, fell sharply in the rankings or disappeared entirely from Google’s search results.

Since then, there have been many explanations, some not very scientific, attempting to account for an event that some have categorized as “serious”. For an example of some of the best of these explanations, read the article that Barry Lloyd wrote: “Been Gazumped by Google? Trying to Make Sense of the ‘Florida’ Update!”.

More on the Bayesian Spam Filter
My research and the observations I have made in this matter point to the Bayesian spam filter that Google started to implement in late October. A “Bayesian spam filter” is an algorithm used to estimate the probability, or likelihood, that certain content detected by Google is in fact spam. In its most basic form, the Bayesian spam filter determines whether something “looks spammy” or whether, on the other hand, it is relevant content that will truly help the user.
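
In its simplest, Paul Graham-style form (see the references at the end of this article), the calculation looks roughly like the sketch below. The token probabilities are invented, and Google’s actual implementation is of course not public; this only illustrates the general Bayesian approach.

```python
# Simplest Graham-style Bayesian combination: merge per-token spam
# probabilities into one page-level score. Token probabilities are invented;
# Google's actual filter is not public, so this only illustrates the idea.
from math import prod

def spam_probability(token_probs):
    """Combine individual P(spam|token) values into one overall score."""
    token_probs = list(token_probs)
    p_spam = prod(token_probs)
    p_ham = prod(1 - p for p in token_probs)
    return p_spam / (p_spam + p_ham)

# Tokens from a hypothetical doorway page, with invented probabilities.
tokens = {"cheap": 0.91, "viagra": 0.99, "click": 0.80, "here": 0.55}
print("P(spam) = %.4f" % spam_probability(tokens.values()))
```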

To a certain degree, the Bayesian algorithm has proven efficient in the war against spam in the search engines. Being ‘bombarded’ by spam as much as Google has been for the past couple of years, it has no choice but to implement such anti-spam safeguards to protect the quality and relevancy of its search results.

However, it is the general feeling in the SEO community that, unfortunately, the current implementation of the Bayesian algorithm seems to have extreme and unpredictable consequences that were practically impossible to foresee.

At the outset, one of the problems with estimating the probability or likelihood that certain content contains spam is that, given very large datasets such as the entire Internet, many “false success stories” (false positives) can and will occur. It is exactly these false positives that are at the centre of the current problem.

Since this whole event began to unwind, many people have noted in tests and evaluations that making the search more selective, for example by excluding an irrelevant string, tends to deactivate the new results algorithm, which in turn effectively shuts down the newly implemented Bayesian anti-spam solution at Google.

One More Observation
While we are still on the subject of the new filter, but getting away from spam-related issues, as a side note: while doing some testing with the new Florida update, I noticed that Google is now ‘stemming’. To my knowledge, it’s the first time that Google has offered such an important search feature. How does stemming work? Well, for example, if you search for ‘reliability testing in appliances’, Google would also suggest ‘reliable testing in appliances’.

To a certain degree, variants of your search terms will be highlighted in the snippet of text that Google provides with each result. The new stemming feature is something that will certainly help a lot of people with their searching for information. Again, Google tries to make its searches as relevant as they can be, and this new stemming feature seems like a continuation of those efforts.

Conclusion
In retrospect, and in re-evaluating all the events that have happened during this major dance, it is clear that Google is still experimenting with its newly implemented algorithm and that many important adjustments will need to be made to it to make it more efficient.

With spam being a growing problem day by day, today’s modern search engines have no choice but to implement better and more ‘intelligent’ spam-filtering algorithms that can tell the difference between what is spam and what isn’t.

The next 30 days can be viewed by some as being critical in the proper ‘fine-tuning’ and deployment of this new breed of application in the war against spam. How the major search engines do it will be crucial for some commercial websites or online storefronts that rely solely on their Google rankings for the bulk of their sales.

In light of all this, perhaps some companies in this position would be well advised to evaluate alternatives such as PPC and paid inclusion marketing programs as complements. At any rate, it is my guess that search will continue to be an important and growing part of online marketing, locally, nationally and globally.

______________
References:

1) An anticensorware investigation by Seth Finkelstein
http://sethf.com/anticensorware/general/google-spam.php

2) Better Bayesian filtering by Paul Graham
http://www.paulgraham.com/better.html

Author:
Serge Thibodeau of Rank For Sales