Buzzwords vs Effective SEO Keywords
Ever see a website that seems to speak a foreign language…in English? We encounter many SEO client websites that rely on buzzwords in the page copy to get the word out about their product. The problem is that visitors may not be familiar with those terms, which means optimizing with buzzwords may not be the best way to gain traffic. If your prospective visitors are not searching for those terms, how do they find your website?
Start With The Obvious
You really need to know your industry. Study your prospective visitors and know who your target audience is. If your prospective visitors are highly technical and work and talk in “buzzword speak”, no problem. But if you also want to attract prospective visitors who are not immersed in the terminology used in your business, you must compensate by optimizing with a wider array of targeted keywords.
How Do I Find All Those Keywords?
Start researching. Yes, it’s going to take a little work on your part to take a close look at what keywords you may be missing out on. Keep track of the other terms prospective visitors may use to find your website. Most log statistics programs have a report showing the keywords searchers used to find your site, so your server logs are a good way to get a better picture of how visitors are finding you. Use Overture’s keyword tool (http://inventory.overture.com/d/searchinventory/suggestion/) or Wordtracker (http://www.wordtracker.com) and note the words used on your competitors’ websites.
Using these, or similar tools, type in your buzzwords and see what variations come up. Competitor websites may use slightly different language than you do when writing copy for their pages. Visit their websites and learn all you can about how many ways your business can get its message across. Read online articles; visit business newsgroups and forums. Find research information through industry websites and companies that specialize in producing reports about your industry.
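If you would rather mine your raw server logs directly than rely on a log statistics report, the short Python sketch below shows one way to pull search phrases out of the referrer field of a combined-format access log. The file name, the query-string keys and the regular expression are assumptions about a typical setup, so adjust them to match your own server.

import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Combined log format: the referrer is the second-to-last quoted field.
REFERRER_RE = re.compile(r'"([^"]*)" "[^"]*"$')
# Query-string keys the major engines use for the search phrase (assumed list).
QUERY_KEYS = ("q", "p", "query")

def search_phrases(log_path):
    counts = Counter()
    with open(log_path, encoding="utf-8", errors="replace") as log:
        for line in log:
            match = REFERRER_RE.search(line.strip())
            if not match:
                continue
            referrer = urlparse(match.group(1))
            params = parse_qs(referrer.query)
            for key in QUERY_KEYS:
                if key in params:
                    counts[params[key][0].lower()] += 1
                    break
    return counts

if __name__ == "__main__":
    # "access.log" is a placeholder path for your own log file.
    for phrase, hits in search_phrases("access.log").most_common(20):
        print(f"{hits:5d}  {phrase}")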
Help Search Engine Robots Do Their Job
Search engine robots are just automated programs. Their concept and execution are relatively simple: search engine robots “read” the text on your pages by going through the source code of your web pages. If the majority of the words in your source code are buzzwords, that is the information that will be taken back to the search engine database.
It’s Obvious (the “DUH” factor)
Ok, so it’s obvious to you what your industry buzzwords are. But don’t discount the simpler versions of those catchy words. Focus also on some lesser used terms and make a list of additional keywords you might be able to add. Clear, precise copy that catches the visitor’s attention and tells your story is generally more effective in the long run.
Compromise – Mix SEO Keywords & Buzzwords
You don’t want to change the copy on your webpages? This is often a problem with business websites. Once you have your keyword list of other-than-obvious words, work at fitting them into the page text carefully. You want them to make sense in the context of the web page. Use these new keywords only as often as “makes sense,” so they do not sound spammy. Read your copy out loud, or have a colleague read it, to get a sense of how it might sound to a website visitor.
The Bottom Line
It should be easy enough to see how those extra keywords are producing for you. Keep track of your log reports and see if those new terms start showing up in your reports. Test a variety of keywords, then test again to see if visitors are staying on your website, moving through your individual web pages, or clicking away. Create specific pages using those keywords as a test scenario. The information you need should be available to you in your log statistics reports for visited web pages.
Don’t let business jargon get in the way of getting your message across to your audience. Yes, buzzwords may sound cutting edge, but the bottom line is, traffic and sales are what you really want to show for your hard work.
Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing, a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.
Google Sells Christmas
Please note: at any given time Google may turn this new filter off and revert back to its old algorithm. I do not see this current algorithm as something that will stand the test of time. I believe it is a makeshift algorithm until Google can introduce some of the web search personalization and clustering technology it obtained when it purchased Kaltix. I am not a Google engineer, and many of my statements in this document may eventually prove false. This is Google as I understand it.
Cat and Mouse
Recently Google has had another “dance.” With this most recent change they did more than the usual PageRank update. For years search engine optimization firms and spammers have chased after the search engines. Each time a search engine introduced a new feature, thousands of people would look for ways to exploit it. Perhaps Google has the solution.
PageRank
Prior to the web, a good measure of the quality of a research paper was how many other research papers referred to it. Google brought this idea to the web. Google PageRank organizes the web based on an empirical link analysis of the entire web. The original document on PageRank, titled “The PageRank Citation Ranking: Bringing Order to the Web,” has been cited in hundreds of other documents.
Problems with PageRank
By linking to another website, a site owner is casting a vote for that site. The problem with this measure of relevancy is that a vote is not always a vote, and people can vote off topic. Some people used free-for-all link exchanges to artificially enhance their rankings; others signed guest books. Both of these methods faded in recent times as search engines grew wise.
Some web pages are naturally link heavy. Weblogs, for example, have a large number of incoming and outgoing links. Many weblog software programs allow users to archive each post as its own page. If a small group of popular writers reference each other, multi-thousand-page link networks suddenly arise.
Articles dating back over a year have claimed that Google PageRank is dead. Just like the ugly spam that fills your inbox everyday, people want to get something for nothing from search engines.
Recent PageRank and Ranking Manipulation
Some of the common techniques for improving site relevancy (and degrading search results) were:
• abusive reciprocal linking – two sites linking to each other exclusively for the sake of ranking improvements – common among sites selling Viagra
• comment spam – people (or software) post comments on weblogs and point optimized links back at their website
• selling of PageRank – people can sell PageRank to a completely unrelated site; in fact, there are entire networks which have this as a business model (http://www.pradnetwork.com/)
While Google has fought hard to keep its relevancy, a drastic change was necessary. In the past search engines rated web pages on inbound links, keyword density, and keyword proximity.
Optimized inbound links would be links which have your exact key phrase in them. Keyword density would be the number of times a keyword appears on the page (and how they appear) divided by the total number of words. Keyword proximity is how close keywords appear to one another on the page.
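To make those two measures concrete, here is a small Python sketch that computes them for a block of page copy. The tokenizer and the “smallest distance in words” definition of proximity are simplifying assumptions for illustration, not the formulas any search engine actually uses.

import re

def tokenize(text):
    return re.findall(r"[a-z0-9]+", text.lower())

def keyword_density(text, keyword):
    # Occurrences of the keyword divided by the total word count.
    words = tokenize(text)
    if not words:
        return 0.0
    hits = sum(1 for w in words if w == keyword.lower())
    return hits / len(words)

def keyword_proximity(text, term_a, term_b):
    # Smallest distance, in words, between any occurrence of the two terms.
    words = tokenize(text)
    pos_a = [i for i, w in enumerate(words) if w == term_a.lower()]
    pos_b = [i for i, w in enumerate(words) if w == term_b.lower()]
    if not pos_a or not pos_b:
        return None
    return min(abs(a - b) for a in pos_a for b in pos_b)

page = "search engine marketing tips: our search engine marketing guide ..."
print(keyword_density(page, "search"))                 # 2 of 9 words, about 0.22
print(keyword_proximity(page, "search", "marketing"))  # 2 – the terms sit two words apart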
The rule to high rankings was to simply place the phrases you wanted to list well for in your page copy multiple times, in your title, and in your inbound links.
The New Filter
This latest algorithmic update measures keyword proximity and other signs of artificial rank boosting to eliminate pages which trip the filter. It is not certain exactly which factors the filter considers, but it is believed that a high ratio of overly optimized text links, coupled with high keyword density and close keyword proximity, will trip the filter.
My Website
If you place a hyphen (-) between the keywords you are searching for, Google will act as if the entire phrase is just one word. This is most easily observed by using the page highlighting feature on the Google toolbar.
If you search for “search-engine-marketing” you will see my site (Search Marketing Info) listed at around #7. If you take the dashes out, you will see that my site tripped the filter and will not be listed in the search results for terms such as “search marketing” or “search engine marketing.” The dashes make it seem as if it is all one word to Google, which is why the keyword trip does not occur. If you pull the dashes out, though, I go over the limits and trip the filter.
Please note: Google recently fixed the hyphen-between-words loophole. For a while, searching for keywordA keywordB -blabla will still work. For a three word query you would need to add a -bla -fla extension to the end of the search. Google will eventually fix this too, though.
The Penalty
If your site trips the filter, it will not be assessed any spam penalties. Simply put, your site just will not show for the specific search that trips the filter.
Only the specific page is penalized for the specific search. Your PageRank and all other pages in the site go unchanged.
The Test Website
Note: A, B, and C refer to different keywords.
One of my clients had a website that was listing in the top ten for the phrase A B C. He also was in the top ten for A B and B C. When Google performed this update he was removed from all the various top rankings. His site was the perfect test site to discover the limits of this filter. His only inbound link was a single PR6 link from DMOZ (the Open Directory Project) with none of the keywords in the link.
Being interested in getting my client’s site back on top again, I began to test. In some areas I combined B C into BC. In other areas I removed and/or distributed the keywords differently. Sure enough, Google came and quickly re-indexed his website. I then searched for A B and B C. He was quickly listing in the top 5 for both of these competitive phrases. I searched for A B C, and he was still over the filter limit. This all but confirmed the key phrase filter idea in my mind.
I changed his website again and am anxiously anticipating a Google re-index of the page to verify that we recapture his key phrase just below the trip limit!
Check to See if Your Site Got Filtered
Scroogle.org offers a comparison tool that shows sites that were filtered out. Type your search term in there to see if you were filtered.
How to Bypass the Filter
The filter aims to catch highly optimized pages. The way to bypass it then is to not appear highly optimized. When Google searches for A B C, it wants to find A B C maybe a few times (not many), but it also wants to see A B, B C, A, B, and C sprinkled throughout the text where possible. The overall keyword density of your keywords should remain rather low.
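As a rough way to audit your own copy against that advice, the small Python sketch below counts how often the full phrase and each of its sub-phrases appear in a block of text. The sample copy and any thresholds you judge the counts against are your own call; nothing here reflects Google’s real limits.

import re

def phrase_counts(text, keywords):
    words = re.findall(r"[a-z0-9]+", text.lower())
    keywords = [k.lower() for k in keywords]
    counts = {}
    # Full phrase, two-word pairs, and single words.
    variants = [keywords] + [keywords[i:i + 2] for i in range(len(keywords) - 1)]
    variants += [[k] for k in keywords]
    for variant in variants:
        n = len(variant)
        hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == variant)
        counts[" ".join(variant)] = hits
    return counts

copy = ("Our search engine marketing guide covers engine marketing basics, "
        "search tips, and general marketing advice.")
for phrase, hits in phrase_counts(copy, ["search", "engine", "marketing"]).items():
    print(f"{phrase!r}: {hits}")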
How the Filter Hurts Google
Right now the filter does appear to be letting more spam in on certain searches. The overall relevancy of most Google searches is still rather high, though. Where I see a true problem with this new algorithm is that in a few months, if cloaking software professionals adjust to it, they will be able to tear holes in Google.
The problem with the current algorithm is that it is looking to match randomized, unoptimized page and article structure. People can mix their words up some, but you frequently end up using some words together naturally. A cloaking script does not have to make sense to a reader. A cloaking script can write text which is more randomly organized than you or I can, because it only has to follow mathematical breakdowns. It does not have to read well to a human eye.
In addition, many of the highly optimized sites that were appearing at the top are no longer there to protect the searcher from the real heavy spam which may rise (and in some cases already has).
We shall see if they soon add better spam filters (which will surely be required once cloakers adjust to the new algorithm).
How the Filter Helps Google
This filter helps Google in two main ways.
• Manipulating search results is much harder for a novice SEO or webmaster. Selling PageRank to an off-topic site may actually cause the site to trip the spam limits, so PageRank is no longer as easy to abuse.
• Many highly optimized sites recently tripped the filter and lost their distribution. Might these people be buying AdWords for Christmas? ho ho ho!
What Google is Looking For
The general principle behind the filter is the idea that non-commercial sites are not usually over optimized, while many commercial sites are. When Google serves up sites, it is looking to deliver the most relevant websites, not the most highly optimized ones.
The Eyes of a Webmaster
Our eyes are our own source of filtration. When we see our sites disappear for some of the best keywords we may get angry. Sometimes the results look bad while Google is playing with the filters. What truly matters to Google is not whether the sites are optimized, but whether the results are relevant. Relevant results for their test subjects and the average web surfer are all that matter. Like it or not, if spam does not sneak in, this algorithm change may actually help Google.
Why Change Now
While some of the commercial searches have been degraded, often relevant results have filled the places once occupied by highly optimized websites. The biggest change is that these optimized websites have lost distribution right before the holiday spending season – ouch.
If you look closely at the Google search engine results pages now you will see that the Google AdWords boxes have grown a bit. Signs of things to come? Does Google want commercial search listings to appear only in ads? Now that Google has so many websites dependent upon it, it can change the face of the internet overnight, and people have to play by Google’s rules. So as the IPO nears, Google may be looking for some cash to show solid results, which will improve its opening stock price.
Yahoo
Yahoo has purchased Inktomi and has been branding Yahoo Search heavily (although right now it is still powered by Google). Ideally, if Google’s search results degrade, Yahoo will switch to Inktomi and steal market share back from Google. It is sure to be a dogfight.
Author:
Aaron Wall of Search Marketing Info.
Google’s Update – Future Website Ranking
The recent shakeup in Google’s search results, which set the SEO (search engine optimization) community buzzing and saw tens of thousands of webmasters watch their site ranking plummet, was in many ways inevitable. Almost all SEO companies and most savvy webmasters had a fairly good handle on what Google considered important. And since SEO, by definition, is the art of manipulating website ranking (not always with the best interests of searchers in mind), it was only a matter of time until Google decided to make some changes.
If you’ve been asleep at the SEO switch, here are a few links to articles and forums that have focused on the recent changes at Google:
Articles:
Site Pro News
Search Engine Guide
Search Engine Journal
Forums:
Webmaster World
JimWorld
SearchGuild
To date, most of the commentary has been predictable, ranging from the critical and analytical to the speculative.
Here’s a typical example from one of our SiteProNews readers:
“I’m not sure what has happened to Google’s vaunted algorithm, but searches are now returning unrelated junk results as early as the second page and even first page listings are a random collection of internal pages (not index pages) from minor players in my industry (mostly re-sellers) vaguely related to my highly-focused keyword search queries.”
So, what is Google trying to accomplish? As one author put it, Google has a “democratic” vision of the Web. Unfortunately for Google and the other major search engines, those with a grasp of SEO techniques were beginning to tarnish that vision by stacking the search result deck in favor of their websites.
Search Engine Optimization Or Ranking Manipulation?
Author and search engine expert Barry Lloyd commented as follows: “Google has seen their search engine results manipulated by SEOs to a significant extent over the past few years. Their reliance on PageRank™ to grade the authority of pages has led to the wholesale trading and buying of links with the primary purpose of influencing rankings on Google rather than for natural linking reasons.”
Given Google’s dominance of search and how important ranking well in Google is to millions of websites, attempts at rank manipulation shouldn’t come as a surprise to anyone. For many, achieving a high site ranking is more important than the hard work it takes to legitimately earn a good ranking.
The Problem with Current Site Ranking Methods
There will always be those who are more interested in the end result than in how they get there, and site ranking that is based on site content (links, keywords, etc.) and interpreted by ranking algorithms will always be subject to manipulation. Why? Because, for now, crawlers and algorithms lack the intelligence to make informed judgements on site quality.
A short while ago, author Mike Banks Valentine published an article entitled “SEO Mercilessly Murdered by Copywriters!”. The article rightly pointed out SEO’s focus on making text and page structure “crawler friendly”. Other SEO authors have written at great length about the need for “text, text, text” in page body content as well as in Meta, Heading, ALT, and Link tags. They are all correct, and yet they are all missing (or ignoring) the point, which is that the “tail is wagging the dog”. Search engines are determining what is relevant, not the people using those engines. Searchers are relegated to the role of engine critics and webmasters to being students of SEO.
SEO manipulation will continue and thrive as long as search engines base their algorithms on page and link analysis. The rules may change, but the game will remain the same.
Therein lies the problem with all current search engine ranking algorithms. SEOs will always attempt to position their sites at the top of search engine results whether their sites deserve to be there or not, and search engines will continue to tweak their algorithms in an attempt to eliminate SEO loopholes. If there is a solution to this ongoing battle of vested interests, it won’t come from improving page content analysis.
Incorporating User Popularity Into Ranking Algorithms
The future of quality search results lies in harnessing the opinions of the Internet masses – in other words, by tying search results and site ranking to User Popularity. Google’s “democratic” vision of the Web will never be achieved by manipulating algorithm criteria based on content. It will only be achieved by factoring in what is important to people, and people will always remain the best judge of what that is. The true challenge for search engines in the future is how to incorporate web searcher input and preferences into their ranking algorithms.
Website ranking based on user popularity – the measurement of searcher visits to a site, pages viewed, time viewed, etc. – will be far less subject to manipulation and will ensure a more satisfying search experience. Why? Because web sites that receive the kiss of approval from 10,000, 100,000 or a million plus surfers a month are unlikely to disappoint new visitors. Although some websites might achieve temporary spikes in popularity through link exchanges, inflated or false claims, email marketing, pyramid schemes, etc., these spikes would be almost impossible to sustain over the long-term. As Lincoln said “You can fool some of the people all the time. You can fool all the people some of the time. But you can’t fool all the people all the time.” Any effective ranking system based on surfer input will inevitably be superior to current systems.
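Purely as a toy illustration of the idea, the Python sketch below folds those same signals – visits, pages viewed and time on site – into a single popularity score. The weights, caps and log scaling are invented for the example and are not any engine’s actual formula.

from math import log10

def popularity_score(monthly_visits, pages_per_visit, minutes_per_visit,
                     w_visits=0.6, w_depth=0.25, w_time=0.15):
    # Log-scale visits so a million-visit site doesn't swamp everything else.
    visit_signal = log10(monthly_visits + 1) / 7.0    # roughly 1.0 at ten million visits
    depth_signal = min(pages_per_visit / 10.0, 1.0)   # cap at 10 pages per visit
    time_signal = min(minutes_per_visit / 10.0, 1.0)  # cap at 10 minutes per visit
    return w_visits * visit_signal + w_depth * depth_signal + w_time * time_signal

print(popularity_score(1_000_000, 4.2, 3.5))  # a well-visited, sticky site scores high
print(popularity_score(800, 1.1, 0.4))        # a temporary spike has little to sustain it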
To date, none of the major search engines have shown a serious interest in incorporating user popularity into their ranking algorithms. As of this writing, ExactSeek is the only search engine that has implemented a site ranking algorithm based on user popularity.
Resistance to change, however, is not the only reason user data hasn’t made its way into ranking algorithms. ExactSeek’s new ranking algorithm was made possible only as a result of its partner arrangement with Alexa Internet, one of the oldest and largest aggregators of user data on the Web. Alexa has been collecting user data through its toolbar (downloaded over 10 million times) since 1997 and is currently the only web entity with a large enough user base to measure site popularity and evaluate user preferences in a meaningful way.
The Challenges Facing User Popularity Based Ranking
1. The Collection Of User Data: In order for web user data to play a significant role in search results and site ranking, it would need to be gathered in sufficient volume and detail to accurately reflect web user interests and choices. The surfing preferences of a few million toolbar users would be meaningless when applied to a search engine database of billions of web pages. Even Alexa, with its huge store of user data, is only able to rank 3 to 4 million websites with any degree of accuracy.
2. Privacy: The collection of user data obviously has privacy implications. Privacy concerns have become more of an issue in recent years and could hinder any attempt to collect user data on a large scale. The surfing public would need to cooperate in such an endeavor and be persuaded of the benefits.
3. Interest: Web search continues to grow in popularity with more than 80% of Internet users relying on search engines to find what they need. However, with the exception of site owners who have a vested interest in site ranking, most web searchers have not expressed any serious dissatisfaction with the overall quality of search results delivered by the major engines. Harnessing the cooperation and active participation of this latter and much larger group would be difficult, if not impossible.
The future of web search and website ranking belongs in the hands of all Internet users, but whether it ends up there depends on how willing they are to participate in that future.
Author Bio:
Mel Strocen is CEO of the Jayde Online Network of websites. The Jayde network currently consists of 12 websites, including ExactSeek.com (http://www.exactseek.com) and SiteProNews.com (http://www.sitepronews.com). Mel Strocen (c) 2003
Google & Trademarks
Google is now involved in a long-running legal issue which is sure to stir up some emotion over web advertising.
I’m not talking about the Florida Update, although it did stir a lot of emotion and even more lost sleep than the average Google Dance. I’m referring to the issue of copyrighted or trademarked phrases which could be purchased for advertising purposes.
Google has gone down this path before, as most PPC players have, but in this case they have chosen to ask the courts for guidance in determining if a generic term (in this case “American Blind”) can be protected under trademark law.
The questions here are many. For example, should a competitor be able to profit from the use of the term “American Blind” even though it is protected by trademark? Should a company like Michigan based “American Blind & Wallpaper Factory” be able to have the power to block online advertising for such phrases, or even phrases similarly worded?
Should companies like Kimberly-Clark (maker of the Kleenex brand) be able to stop advertisers from using the word “Kleenex” in their ads, even though it is so generic a term that the general public commonly uses it to refer to any facial tissue?
Legally, Kimberly-Clark should be able to protect its product and prevent its competitors from profiting from using the brand. In fact, outside of the web, many companies have successfully sued competitors and others who have used trademarked phrases or images and profited from them.
It is an interesting case, given that Google is involved, when not so long ago they quietly fought to keep the name “Google” out of the dictionary. Google didn’t like the idea of formally becoming part of the English language (as in, by using the search engine to check up on someone, you “Googled” them), so their lawyers quietly asked the site that proposed the change to withdraw the proposal.
It almost appears as if there is some kind of double standard here. If Google doesn’t want to become a household term for using a search engine, then there shouldn’t really be an issue here: they should simply not allow competitors to use trademarked phrases in their ads.
Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists
Google Florida Update
November 16, 2003 will be a date to remember in search engine history. It will be remembered as the day that Google raised the bar on search engines. It is the day that they changed the rules as far as ranking pages and returning search results. As people will learn, in the days and weeks to come, it was also the day that Google introduced a new term to the SEO handbook: Ontology.
Ontology is “an explicit formal specification of how to represent the objects, concepts and other entities that are assumed to exist in some area of interest and the relationships that hold among them.” In other words, ontology is a means whereby words are combined to give meaning to a body of text. No longer are the word combinations themselves what is important. What is important is what the combinations say about the text. It’s a new way of indexing the web.
It means applying a context to the page or site and qualifying it based on the context.
No longer is Google simply looking for x number of keywords or keyword combinations. It has harnessed the ability to look at what a page means by applying ontological rules to the page content to determine its context.
Now, this may sound far fetched. After all, how can you make a computer, which is used to looking for sequences of words or numbers with no deeper interrelationships, understand that there is a deeper meaning? In other words, how do you teach a computer to think?
This is where Applied Semantics comes in. This company has devised a way to apply ontological rules to a web page to determine its context. By applying rules (such as synonymy, antonymy, similarity, relationships and many others) to the content of the page, the software can classify a page (or site) into various groupings, or taxonomies.
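To give a feel for what “classifying a page into taxonomies” might look like in the simplest possible terms, here is a deliberately tiny Python sketch that scores a page against a few hand-made topic buckets by counting related words. The buckets and word lists are invented for illustration; Applied Semantics’ real software is vastly more sophisticated than this.

import re

TAXONOMY = {
    "travel/hotels": {"hotel", "hotels", "lodging", "accommodation", "resort", "suite"},
    "finance/loans": {"loan", "loans", "mortgage", "lender", "interest", "refinance"},
    "computing/web design": {"html", "css", "design", "website", "stylesheet", "layout"},
}

def classify(text):
    words = set(re.findall(r"[a-z]+", text.lower()))
    scores = {topic: len(words & vocab) for topic, vocab in TAXONOMY.items()}
    best = max(scores, key=scores.get)
    return (best if scores[best] > 0 else "unclassified"), scores

page = "Book a downtown hotel suite with resort-style lodging and free breakfast."
print(classify(page))   # ('travel/hotels', {...}) – four of the travel terms match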
Now, I know you are saying, “Wait – a computer can’t think. Not only that, but if it could, it would have to have incredible processing power, or work extremely slowly.”
This is where Kaltix comes in. Kaltix is a company which was formed by Google employees to improve the calculation of PageRank by making it faster. At least that’s what the press release said.
So, if you combine the speed harnessing abilities of Kaltix with the “understanding” of Applied Semantics what do you have? A very fast, very smart search engine.
And guess what? Google owns them both. Do you see where I’m going here?
If not, let me clarify. Google purchased Applied Semantics back in April of this year. I would guess that they had their eye out for something like this company for some time. They have been looking at improving results for a year or more now, ever since people learned how to manipulate PageRank.
But they soon found, likely in May, that the Applied Semantics software wasn’t as efficient as they would have liked. It was at this time we noticed a slight change in the SERPs after a deep crawl.
So they went on a hunt, looking for someone to make it faster. And they found Kaltix. Remember Kaltix developed a way to make the PageRank algorithm faster. Well why couldn’t they apply the Kaltix technology to the Applied Semantics ontology software? After all PageRank is an algorithm, and so is the ontology software (albeit likely much more complex). In September, the Kaltix deal closed and Google now has the ability and speed it needs to take the next step.
Remember what Larry Page said about the perfect search engine? “The perfect search engine would understand exactly what you mean and give back exactly what you want.” (From the corporate section of the Google website – http://www.google.com/corporate/today.html).
The key words here are “understand” and “give back exactly what you want.” Now remember what ontology is. Ontology is used to capture the core relationships between concepts supporting the characterization of text in terms of meanings. In other words, understanding what a body of content is about.
Larry Page wants a search engine that understands and Applied Semantics has software which does just that.
So what’s next? Now that Google has a search engine that thinks, can things get better? Well, if you have been watching the recent Florida update news, you will know that in many cases Google results appear to have actually gotten worse since that day in November – at least in some people’s eyes. So I guess now is the time to point out that, according to Applied Semantics’ website (which, by the way, was recently changed to exclude most of the information I am sharing with you today – more proof?), the software is designed to learn. It can understand when the wrong results are displayed (by monitoring what results are chosen) and it can adapt.
Therefore, one would assume that search results should get better over time. How long will that be? No one can tell, but I would think it should not take more than a few months.
In the end, despite the short term pains, I guess change is a good thing.
Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists
Why You Should Always Avoid Cloaking
When building, implementing or optimizing a website, you should always avoid at all costs any form of cloaking, since those techniques are strictly forbidden by most major search engines today, for very good reasons. However, before I start explaining why you should always avoid cloaking, let me give you a brief introduction to the rather complex world of cloaking.
By definition, cloaking is a technique of delivering a certain set of pages to search engine crawlers, while at the same time delivering a completely different set of pages to your human visitors. There are many ways of doing this. Although some will tell you that the most effective way to cloak is with the use of IP (address) delivery software, the jury is still out on this and opinions vary widely. The subject of cloaking has always been and still is a very controversial topic.
Sophisticated cloaking software that makes use of IP delivery to cloak pages is a complex program, and explaining its inner workings properly would fall well beyond the scope of this article. However, let me give you a simplified explanation of how these cloaking programs work.
The Idea Behind Cloaking
When a search engine spider or human visitor requests a specific Web page, cloaking programs compare the IP address of the requesting party to a database of known IP addresses from specific search engine spiders. If the IP address matches one on the list, the program serves the spider a particular page that was written specifically for that search engine. This page is normally optimized to rank well for that particular search engine, in order to achieve high visibility.
On the other hand, if that IP address does not match one in the database’s list, the requesting entity is ‘assumed’ to be a human visitor and is served the actual ‘real’ page or is simply re-directed to another page appropriate for human visitors (!)
Since most modern search engines today constantly change their IP addresses in an effort to defeat this silly ‘cat-and-mouse’ chase, as you can imagine, if these databases are not regularly updated they become obsolete very quickly, in effect making such cloaking programs completely useless in no time!
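Purely to make that mechanism concrete – and emphatically not as something to deploy – here is a bare-bones Python illustration of IP delivery. The spider addresses are placeholders from a reserved test range, which is exactly the weakness just described: any real list goes stale quickly.

# Placeholder addresses from the RFC 5737 documentation range – not real spiders.
KNOWN_SPIDER_IPS = {"192.0.2.10", "192.0.2.11"}

def page_for(visitor_ip):
    if visitor_ip in KNOWN_SPIDER_IPS:
        return "spider_optimized_page.html"   # page written for the crawler
    return "real_page.html"                   # page shown to human visitors

print(page_for("192.0.2.10"))    # a listed "spider" gets the optimized page
print(page_for("203.0.113.5"))   # everyone else gets the real page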
As a rule, most major search engines today HATE any type of cloaking software and they categorically forbid them in their terms of use. One such strong opponent to any form of cloaking has always been Google. Today, Google will take strong action against any ‘cloaking website’ that it discovers. Such actions range from temporarily penalizing that site to outright banning it permanently from its index.
The simple reason for all this is that search engines rightly feel that they should always be delivered the same Web page that any human visitor would normally see. Many search engines also feel (and I agree with them) that webmasters making use of cloaking software or programs are in effect ‘spamming the index’. While some people may not agree with this, the search engines’ point of view does have some good merit and, if you want your site indexed by them, you don’t have any other option but to follow their rules.
Conclusion
Always build a great site that has a lot of valuable content for ALL your visitors and the search engines, and make certain the search engines get the same pages your actual visitors get. If you follow the rest of the advice offered on this website, your site should rank well. Avoid cloaking at all costs; it is never an option.
Remember that all search engines today are only concerned with two important things: the relevancy and quality of their search results. Also, you should always remember that it is THEIR search engine and only they make the rules. So, in light of all this, it would be wise for you to completely forget about any kind of cloaking, unless you are willing to take the chance of having your site penalized or banned permanently from the major search engines.
Author:
Serge Thibodeau of Rank For Sales
Every Search Engine Robot Needs Validation
Your website is ready. Your content is in place, you have optimized your pages. What is the last thing you should do before uploading your hard work? Validate. It is surprising how many people do not validate the source code of their web pages before putting them online.
Search engine robots are automated programs that traverse the web, indexing page content and following links. Robots are basic, and robots are definitely not smart. Robots have the functionality of early generation browsers: they don’t understand frames; they can’t do client-side image maps; many types of dynamic pages are beyond them; they know nothing of JavaScript. Robots can’t really interact with your pages: they can’t click on buttons, and they can’t enter passwords. In fact, they can only do the simplest of things on your website: look at text and follow links. Your human visitors need clear, easy-to-understand content and navigation on your pages; search engine robots need that same kind of clarity.
Looking at what your visitors and the robots need, you can easily see how making your website “search engine friendly” also makes it visitor friendly.
For example, one project I worked on had many validation problems. Because of the huge number of errors generated by problems in the source code, the search engine robots were unable to index the web page, and in particular, a section of text with keyword phrases identified specifically for this page. Ironically, human users had problems with the page as well. Since humans are smart, they could work around the problem, but the robots could not. Fixing the source code corrected the situation for human and automated visitors.
There are several tools available to check your HTML code. One of the easiest to use is published by the W3C (http://validator.w3.org/). While you’re there, you can also validate your CSS code at W3C’s page for CSS (http://jigsaw.w3.org/css-validator/). The reports will tell you what source code needs to be fixed on your web page. One extra or unclosed tag can cause problems. With valid code, human visitors and search engine robots can travel through your website and index your pages without source code errors stopping them in their tracks. How many times have you visited a website, only to find something broken when going through the web pages? Too many to count, I’m sure. Validating your pages makes it easier for your website to get noticed.
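The W3C validator is the tool to rely on, but as a quick local sanity check for one of the most common problems it reports – mismatched or unclosed tags – here is a small Python sketch built on the standard library’s HTML parser. The list of void elements and the sample markup are illustrative assumptions.

from html.parser import HTMLParser

# Elements that never take a closing tag in HTML.
VOID = {"area", "base", "br", "col", "embed", "hr", "img", "input",
        "link", "meta", "param", "source", "track", "wbr"}

class TagBalanceChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.stack, self.problems = [], []

    def handle_starttag(self, tag, attrs):
        if tag not in VOID:
            self.stack.append(tag)

    def handle_endtag(self, tag):
        if tag in self.stack:
            # Anything still open above the matching start tag was never closed.
            while self.stack[-1] != tag:
                self.problems.append(f"unclosed tag: <{self.stack.pop()}>")
            self.stack.pop()
        else:
            self.problems.append(f"stray closing tag: </{tag}>")

    def handle_startendtag(self, tag, attrs):
        pass  # self-closing tags are balanced by definition

    def report(self):
        return self.problems + [f"unclosed tag: <{t}>" for t in self.stack]

checker = TagBalanceChecker()
checker.feed("<html><body><p>Hello <b>world</p></body></html>")
print(checker.report())   # ['unclosed tag: <b>']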
As I said before, what works for your website visitors works for the search engine robots. Usability is the key for both your human visitors and automated robots. Why not provide the best chance for optimum viewing by both?
Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing (http://www.searchinnovation.com), a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.
Google’s “Florida Dance”
As most people have noticed by now, the last Google update, dubbed “Florida,” has generated a lot of uproar amongst the business community and the Internet economy. This latest update is a clear example of the power of Google and what it can mean to a large number of commercial websites that depend on their rankings for the success of their business model and their viability.
There are many old and well-established commercial websites that have lost considerable rankings and that suffered a staggering loss of traffic in the process.
To the best of my knowledge, this is the largest and most important algorithm change in the 5-year history of Google. Since November 15, Google has started to impose what some call the Over Optimization Penalty, or OOP. It’s still a bit too early to speculate what effect the OOP will have on some sites, but it appears to me that it may be a form of static penalty, meaning whatever modifications you make now will probably not get your site back into the top results, at least not until the next Samba!
Therefore, you aren’t likely to see affected sites return to the top for their target phrases until Google releases the OOP from the sites it was applied to. My feeling is that nothing will be done until the next Google dance, which probably won’t happen until December 15 or a bit later.
On December 2nd, somebody came to me with a website that a previous SEO, a competitor of mine, seemed to have “overly optimized” for certain important keywords. Although this article is certainly NOT meant to help spam the search engines in any way, the following are techniques that can be used as a last recourse if you feel some of the pages on your site have been affected in such a way.
Fine-Tuning A Website To The OOP
As is most apparent to many, Google is still experimenting with its new algorithm and continues to adjust as we are heading into the busy Christmas and Holiday season. One thing is certain: there appears to be no stability in the new algorithm.
For that reason, I would just sit and wait to see what the December update brings us before committing to any changes whatsoever.
Detecting If The OOP Has Been Applied To A Particular Page
You can determine if your site has had any OOP penalty applied by simply entering your keywords in a Google search, but you will need to add an exclusion (-) for a unique string of characters. Example:
Your main keyword or key phrase -blahblahblah
The -blahblahblah string of text doesn’t matter one bit – it’s simply a string of nonsense characters; you can type in anything you wish and it will work (!)
What is important is that you include the “-” character and the nonsense string, which tells Google to exclude that term and, in the process, appears to bypass the new spam filter that causes the OOP penalty.
After running this simple test, if you see your page reappearing back under these conditions, it’s highly likely the OOP is in effect for that particular keyword or key phrase on that page.
Note that there is a possibility Google might modify how this type of search works, in an effort to prevent people from seeing the results without the penalty filter. However, at the time I wrote this, the technique enabled me to detect which penalized web pages had been affected by the OOP, so it is a fairly accurate test.
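As a small convenience, the Python sketch below builds the two search URLs for the test just described – the normal query and the same query with a nonsense exclusion term – so you can compare the result pages by hand. The nonsense token and the URL format are assumptions; any string that appears nowhere on the web will do.

from urllib.parse import urlencode

def comparison_urls(key_phrase, nonsense="blahblahblah"):
    normal = "https://www.google.com/search?" + urlencode({"q": key_phrase})
    excluded = "https://www.google.com/search?" + urlencode({"q": f"{key_phrase} -{nonsense}"})
    return normal, excluded

normal, excluded = comparison_urls("construction building materials")
print("Normal search:   ", normal)
print("Exclusion search:", excluded)
# If your page shows up in the second result set but not the first,
# the OOP filter is likely in effect for that phrase.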
More On The OOP Penalty
The OOP penalty is mostly related to certain incoming links to the page that is penalized. Also, on-site absolute links might trigger the OOP. In a particular case where I have seen a site receive an OOP ranking penalty, there were incoming text links that matched exactly the keywords for which I made the search. Thus, if the incoming links match exactly the text in the title of your page, your headline tags or your body text, it is my observation that the OOP penalty could be triggered by Google’s new algorithm.
Of course, it is extremely premature at this point to speculate on the exact formula Google is using to achieve this, but if a rather large number of your incoming links match exactly with the keyword or the key phrase used in the search box, it is in fact possible to trigger the over-optimization penalty in my view.
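If you want to put a rough number on that observation, the Python sketch below computes what share of a page’s inbound links use the exact target phrase as their anchor text. The anchor list is something you would gather yourself from a backlink report, and the 50% threshold in the example is an arbitrary illustration, not a figure Google has published.

def exact_match_ratio(anchor_texts, key_phrase):
    # Fraction of inbound link anchors that match the key phrase exactly (case-insensitive).
    if not anchor_texts:
        return 0.0
    exact = sum(1 for a in anchor_texts if a.strip().lower() == key_phrase.lower())
    return exact / len(anchor_texts)

anchors = ["construction & building materials", "Construction & Building Materials",
           "click here", "ACME Supplies", "construction & building materials"]
ratio = exact_match_ratio(anchors, "construction & building materials")
print(f"{ratio:.0%} of inbound links are exact-match")   # 60%
if ratio > 0.5:
    print("High exact-match ratio: consider varying your link text.")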
The Ad Word Conspiracy Debate
Since all of this can seem a bit strange, since a lot of opinions and ideas have been circulated in forums and articles since all of this started in mid-November, and especially since many (including me) have discovered that the OOP penalty doesn’t seem to be consistently applied across the Web, some people have suggested Google might be at the center of a conspiracy to force companies and site owners to spend a lot more money on its AdWords PPC campaigns. Personally, I do NOT subscribe to that theory, and I don’t think it would be in Google’s best long-term interest to do that.
However, the OOP penalty does appear in fact to be applied mostly to higher traffic keywords and key phrases, the sort that commercial websites use the most. In retrospect, this does give some credibility I guess to those that think there might be a conspiracy theory to all of this, for what it’s worth.
Make no mistake about this: it is in fact the biggest and most drastic algorithm change I have ever seen in the short 5-year history of Google. If a few pages on your site seem to suffer from the OOP penalty, let’s look at some ways to get out of it.
Some Solutions To The OOP Problem
For the most part, the OOP penalty comes from incoming links that perfectly match the keywords or key phrases in the title tag or headline of the affected page. There are a couple of ways to repair this. The first would be to reduce the quantity of incoming links to your affected pages.
Another good alternative would be to modify the title or headline of your affected page (or pages), so that you have fewer occurrences of the exact keyword or key phrase that originally triggered the OOP penalty. For example let’s say the key phrase in your incoming link that carries the penalty is Construction & Building Materials and you have Construction & Building Materials in your title tag, as well as in your description tag and in your H1’s or H2’s. I would change the title tag to be ‘Special Construction and Building Supplies’ and see what that does.
You should also do the same for the body text of that page. In an ideal situation, it is imperative to try to achieve a ‘good balance’ that Google would consider spam-free and not artificial. It is my thinking that too many incoming links with your exact key phrases in them, combined with too much on-page optimization and exact-matching text located within the target areas (the title tag and headline tags such as H1s and H2s, etc.), will likely trigger the OOP filter.
Conclusion
In light of all that has happened with update Florida, and with the drastic changes that Google has massively implemented in its PageRank algorithm, you should expect to see more upcoming changes to this algorithm, perhaps even before December 15, at which time some observers think Google might start dancing again.
As I have written in previous articles on this and as I was interviewed and quoted by the New York Post on November 27, I still believe that if your site has had little or almost no OOP penalties, in that case I would stay put and not do anything for the time being.
This article is simply meant to show a few techniques I have used to limit ‘some incurred damages’ already done to a site by an SEO firm that had overly optimized it prior to the critical November 15 date. If your site was optimized using only Google-accepted techniques, according to their Terms of Use agreement, your site should not have suffered any OOP penalties whatsoever.
Author:
Serge Thibodeau of Rank For Sales
Where On Earth Is Your Web Site?
You’ve just finished congratulating your marketing team. After six months of concentrated effort you can now actually find your own company web site within the search engines. Everyone is busy handshaking and back patting when a voice from the back of the room rises above the din. “Yeah, this is great! Can’t wait until we can find ourselves on wireless devices.” All conversation comes to an abrupt halt. Eyes widen. Everyone turns to the fresh-faced intern standing in the corner with a can of V8 juice in one hand and a PALM device in the other. You, being the Department Manager, barely managing to control your voice, not to mention your temper, ask the now nearly frozen-with-panic intern, “What do you mean, find ourselves on wireless? We just spent thousands on our web site visibility campaign!” “Well…” explains the sheepish intern, “there is no GPS or GIS locational data within our source code. Without it, most wireless appliances won’t be able to access our site.”
Guess what? The intern is absolutely correct. Anyone interested in selling goods and services via the Internet will soon be required to have some form of geographic location data coded into their web pages. There are approximately 200 satellites currently orbiting the Earth (even NASA won’t confirm the exact number). Some are in geosynchronous or geostationary orbit 27,000 miles above your head. The Global Positioning System (GPS) is the name given to the mechanism of providing satellite ephemerides (“orbits”) data to the general public, under the auspices of the International Earth Rotation Service Terrestrial Reference Frame (ITRF). Sounds like Star Wars, doesn’t it? It’s pretty close. The NAVSTAR GPS system is a satellite-based radio-navigation system developed and operated by the U.S. Department of Defense (DOD). The NAVSTAR system permits land, sea, and airborne users to determine their three-dimensional position and velocity, 24 hours a day, in all weather, anywhere in the world, with amazing precision. http://igscb.jpl.nasa.gov/
Wireless devices, WAP, cellular, SATphones and a whole host of newly emerging appliances, and indeed new software applications, will all utilize some form of GPS or, more likely, GIS data retrieval. GIS stands for Geographic Information System and relies on exact latitude and longitude coordinates for location purposes. Several car manufacturers currently utilize GPS for on-board driver assistance, and the marine and trucking industries have been using it for years. Obviously your web site is a stable beast. It sits on a server somewhere and doesn’t move much, so at first glance it seems quite implausible that you’ll need GIS locational data within your source code. On the contrary. One aspect your web site represents is your business’s physical location(s), and if people are going to try to find your services and products, shouldn’t you, at the very least, tell them where they are and how to get there?
Let’s look at it from the other end of the spectrum: the end user approach. Let’s say you’re vacationing in a new city for the first time. Once you get settled into your hotel room, what’s the first thing you want to find? Restaurants? Bank machines? Stores? So you pull out your hand-held wireless device, log onto the web and search for “Italian Food in San Francisco.” Five hundred results come back, so you click the new “location” feature on your hand-held (which knows exactly where you are) and ten Italian restaurants, which were smart enough to code their web sites with GIS data, light up on the screen. Guess which restaurants didn’t get selected? The other four hundred and ninety. Starting to get the picture?
How does this affect you and your web site marketing?
GIS latitude and longitude coordinates will soon be a must-have on every web site operator’s and web developer’s list, and an absolute necessity for anyone wishing to trade goods and services via the Internet. This data may relate to the physical location of the web site, where the site is being served from (if applicable), or where the actual business represented by the site is physically located. There may be multiple locations and multiple pages to code; if, for example, you have a franchise with several locations, each location will probably need a page of its own with the correct corresponding location data. If you run a home-based business, I doubt the coordinates of your living room are going to be necessary, but you should provide the latitude and longitude of the closest city or town. Large corporations such as banks may want to code the exact location of every automated teller machine across the country. Industry standards and the methods of serving out this data are still in the development phases, but it’s a safe bet to assume there are plenty of people working on the solutions right now, and given the speed of technology, implementation will probably come much sooner than later. Give yourself an edge. Find out where in the world your web site is…before your web site is nowhere to be found.
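One way this kind of location data has been embedded in page source is through the informal “geo” and ICBM meta-tag conventions; the Python sketch below generates them for a given coordinate pair. Whether any particular wireless service or search engine actually reads these tags is an assumption to verify against whatever standards eventually emerge.

def geo_meta_tags(latitude, longitude, region=None, placename=None):
    # Emit the informal geo/ICBM meta tags for a page's <head> section.
    tags = [
        f'<meta name="geo.position" content="{latitude};{longitude}">',
        f'<meta name="ICBM" content="{latitude}, {longitude}">',
    ]
    if region:
        tags.append(f'<meta name="geo.region" content="{region}">')
    if placename:
        tags.append(f'<meta name="geo.placename" content="{placename}">')
    return "\n".join(tags)

# Example: a business located in downtown San Francisco.
print(geo_meta_tags(37.7749, -122.4194, region="US-CA", placename="San Francisco"))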
Author Bio:
Robert McCourty is a founding partner and the Marketing Director of Metamend Software and Design Ltd., a cutting edge search engine optimization (SEO) and web site promotion and marketing company. Scores of Metamend Client web sites rank near or on top of the search engines for their respective search terms.
The Google Update Uproar
‘Be careful what you wish for, it may come true’
While at Ad-Tech, I lamented the clogging of Google’s results with spam-filled sites. I asked Google to clean up its index. Although I’m sure there’s no connection between the two (unless Sergey and Larry are paying a lot more attention to me than I thought), Google responded just a few weeks later with the Florida update. And boy, have they responded big time!
If you haven’t ventured into an SEO forum for a while, you might not have heard of the Florida update. It’s Google’s latest dance, and it’s a doozy. It appears that Google is trying to single-handedly shut down the entire affiliate industry.
The scene is awash with guessing and speculation. Was it a Google mistake? A plot to force advertisers to move to AdWords for placement? Barry Lloyd did a good job of trying to bring sense to the mayhem. I’d like to jump in with some further research we’ve done and my own thoughts of what’s happening with the Google index.
A Florida Guide
First of all, the Florida update was rolled out November 16th. It appears to be a new filter that is applied to commercially based searches, triggered by certain words in the query. The filter clears out many of the sites that previously populated the top 100. In several tests, we found the filter generally removes 50 to 98% of the previously listed sites, with the average seeming to be 72%. Yes, that’s right: 72% of the sites that used to be in Google are nowhere to be seen!
Who’s Missing?
The target is pretty clear. It’s affiliate sites, with domains that contain the keywords, and with a network of keyword links pointing back to the home page of the site. The filter is remarkably effective in removing the affiliate clutter. Unfortunately, legitimate commercial sites with lower PageRank are being removed as well. There seems to be a PageRank threshold above which sites are no longer affected by the filter. We’ve seen most sites with PageRank 6 or above go through unscathed.
And the Secret Word Is…
The filter also appears to be activated only when search queries contain certain words. For example, a search for ‘Calgary Web Design Firms’ activated the filter and cleared out 84% of the sites, while a search for ‘Calgary Database Development’ didn’t activate it. Search volumes are roughly equivalent for both phrases. The filter seems to be activated by a database of phrase matches, and doesn’t appear to be affected by stemming. For example, ‘Panasonic fax machines’ activates the filter, but none of these words as a single search phrase does. ‘Fax machines’ activates the filter, but ‘Panasonic machines’ doesn’t.
Also, it seems that only a few single word searches activate the filter. We found that jewelry, watches, clothing, swimwear, shelving, loans and apartments all activated the filter. Other terms that you would think would be bigger targets for spam, including sex, cash, porn, genealogy, MP3, gambling and casino don’t activate the filter. Obviously, when you look at these words, Google is more concerned with commercialization than spam.
Volume, Volume, Volume
Another factor in whether the filter is tripped or not seems to be search volume. Any commercial searches with volumes over 200 per month (as determined by Overture’s search term suggestion tool) seemed to trip the filter. Searches under that threshold seemed to remain unfiltered. For example, a search for ‘Oregon whitewater rafting’ (about 215 searches last month) activated the filter, while a search for ‘Washington whitewater rafting’ (about 37 searches last month) didn’t.
What is Google Thinking?
Obviously, given the deliberate nature of the implementation, this isn’t a hiccup or a mistake by Google. This was a well thought out addition to the algorithm. And in the most competitive searches, it produces much better results than did the ‘pre-Florida’ index. If you search for ‘New York Hotels’ today, you’ll find almost all of the affiliate clutter gone.
Where the problem occurs is in the less competitive searches, where there’s not a sufficient number of PageRank 6 or higher sites to fill the vacuum caused by the filter. If you do a search now for most phrases you’ll find the results are made up of mainly directory and irrelevant information sites. In cleaning house, Google has swept away many sites that should have stuck. As an example, visit Scroogle.org and search for ‘Calgary web design firms’. Scroogle is from the deliciously twisted minds of Google Watch, and gives graphic representation of the bloodshed resulting from Florida. In the pre-Florida results, the top 10 (all of which were wiped out by the filter) included 6 Calgary based web designers and 1 in Vancouver (two of the remaining results were additional pages from these firms). The other result was a directory called postcards-usa.com with a page of design firms from around North America. Eight of the 10 results were directly relevant, one was somewhat relevant and one was of questionable relevancy for the geographically specific search.
In the filtered results, there is not one web design firm from Calgary. The top 4 listings are directory site pages, two of which are not even specific to Calgary. Ranking 5 and 6 belong to Amazon.com pages selling a book on web design (nothing to do with Calgary other than a reader review from someone who lives there). Rankings 7 and 8 go to pages about evolt.org, a non profit organization of web designers, and a profile on a Calgary based member. Listing 9 goes to the web design page of an abysmal web directory, again not specific to any region. And listing 10 goes to an obvious link farm. Of the 10 results, none of them were relevant.
Google’s Next Move?
Pulling out the crystal ball, which in hindsight was amazingly accurate 2 weeks ago, here’s what I think will happen. The Florida filter will not be revoked, but it will be tweaked. It’s doing an amazing job on the ultra competitive searches, but the algorithm will be loosened to allow inflow of previously filtered sites to bring relevancy back to the less competitive searches. Hopefully, the sites finding their way back into the index will be better quality legitimate commercial sites and not affiliate knock offs. Google has to move quickly to fix the relevancy for these searches, because they can’t afford another blow to the quality of their search results.
I really don’t believe that Google purposely implemented the filter to drive advertisers to AdWords, but that is certainly a likely side effect. The most dramatic impact will be the devastation of the affiliate industry. Just 3 short weeks ago I listened to 4 major internet marketers say they didn’t bother with organic SEO because their affiliate partners did it for them. Those days are over. If Google was targeting anyone with Florida, it was affiliate sites. A number of forum posts indicated that Google was taking aim at SEO. I don’t believe so. I think Google is trying to wipe out bad SEO and affiliate programs and unfortunately there are a number of innocent bystanders who got hit in the crossfire. But every indication from Google itself (both from posts to forums and in replies to help requests) seems to indicate that Florida is a work in progress.
Author:
Gord Hotchkiss