
Archive for the 'Google' Category

SEO Among the Big Three

The fundamentals of SEO apply regardless of which search engine you want to rank in, but each of the major ones has some finer points, so let’s get right to them.

Yahoo!

You may have heard that participating in Google AdWords will get your site penalized in other search engines, but Yahoo! insists that this is not the case. They have nothing to gain by dropping sites that are relevant, and if they did, they’d be cutting off their nose to spite their face. What does matter is the age of your domain, because a longer track record ups a site’s relevancy. It’s only one signal, and one you can’t do much about, but it’s something to keep in mind: SEO is a long-term idea.

Yahoo! suggests registering domains for more than one year at a time. It gives your site a long-term focus and keeps you from accidentally losing a domain because you missed the renewal notice email. They also suggest buying the same domain name both with and without dashes, which keeps you from losing so-called type-in traffic. That said, SEO pundits agree that the more dashes a domain has, the harder it is to type and the more spam-like it looks, so don’t get carried away.

As with Google, relevant inbound links from high-quality pages are gold with Yahoo!. This is one of the hardest parts of SEO, but it’s one where there really aren’t many shortcuts. You want links from sites that belong to the same general neighborhood of topics as yours. If you have a site that sells organic flour, a link from a fishing tackle site isn’t going to help you much, if at all. Be careful when giving out your own links, too: if you link to a lot of sites that are or could be penalized, you could be hurting yourself by association. If you sell text links, check out every site that buys from you to make sure you’re not endorsing spam, porn, or other content that search engines frown upon.

Make use of Yahoo Site Explorer to see how many pages are indexed and to track the inbound links to your site. The first screen shot shows the results of the analysis of one site. As you can see, there are 503 pages on the site, and 5,838 inlinks, each of which you can explore further. To maximize crawling of your site and indexing of pages, publishing fresh, high quality content is the key.

[Screen shot: Yahoo Site Explorer results for one site]

Bing

There have been some case studies comparing what Bing, Yahoo!, and Google look at when ranking sites. When ranking for a keyword phrase, both Bing and Google look at the title tag 100% of the time. Keyword prominence carries a little more weight with Bing than with Google, while Google favors link density and link prominence more than Bing does. Bing evaluates H1 tags, while Google does not, and Google considers the meta keywords and description tags, while Bing does not. What this boils down to for Bing is that having an older domain and having inbound links from sites that include your primary keyword in their title tags (another way of saying relevant inbound links) are the keys to optimization. As with the other major search engines, link building should be a regular, steady part of your SEO effort.
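
To make those tag placements concrete, here is a minimal sketch of a page head and heading. The shop and keywords are invented, and the per-engine notes reflect the case studies above rather than anything official:

  <head>
    <!-- Read by both Bing and Google when ranking for a keyword phrase -->
    <title>Organic Flour | Stone-Ground Whole Wheat</title>
    <!-- Per the case studies: considered by Google, ignored by Bing -->
    <meta name="description" content="Stone-ground organic flour, milled fresh weekly.">
    <meta name="keywords" content="organic flour, whole wheat flour, stone-ground">
  </head>
  <body>
    <!-- Per the case studies: evaluated by Bing, not by Google -->
    <h1>Stone-Ground Organic Flour</h1>
    ...
  </body>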

With Bing, it’s easier to compete for broad terms. Keyword searches on Bing produce Quick Tabs that offer variations on the parent keyword, which has the effect of surfacing websites that rank for those keyword combinations. The intended result is for content-rich sites to convert better than sites with less relevant content. This multi-threaded SERP design brings up more pages associated with the primary keyword than a single-thread SERP list would. Bing also removes duplicate results from its categorized result lists, which allows lower-ranked pages to be shown in the categorized results.

The Bing screen shot shows the results for a search on “video cameras.” To the left is a column of subcategories, and results from those subcategories are listed below the main search results. While there are some differences in SEO for Bing, the relatively new search engine isn’t a game changer when it comes to SEO.

[Screen shot: Bing results for “video cameras”]

Google

It sometimes seems as if SEO is synonymous with “SEO for Google,” since Google is the top search engine. And it also seems that when it comes to SEO for Google, a lot of the conventional wisdom has to do with not displeasing the Google search engine gods by doing things like cloaking, buying links, etc.

The positive steps toward SEO with Google include keywords in content and in tags, good inbound links, good outbound links (to a lesser extent), site age, and top level domain (with .gov, .edu, and .org getting the most props). Negative factors include all-image or Flash content, affiliate sites with little content, keyword stuffing, and stealing content from other sites. It isn’t so much that Google wants to seek out and destroy sites that buy links; it wants the sites with relevant, fresh content to have a shot at the top, and with some sites trying to game the system and get there dishonestly, Google has to find a way to deal with those sites without hurting the good ones.

In fact, Google wants users to report sites that are trying to cheat their way to the top of the search engines. In the screen shot, you can see a copy of the form found at https://www.google.com/webmasters/tools/spamreport?pli=1 for reporting deceptive practices. You have to be signed in to your Google account to use it; Google wants to get away from anonymous spam reports.

[Screen shot: Google spam report form]

Google Branding

Just over a year ago, Google made some changes to its algorithm that appeared to give weight to “brands.” Now, “brand” is a short word loaded with a lot of meaning. When the change took place, plenty of people ran around in circles waving their arms in the air, convinced that big, rich corporations were going to dominate the rankings and that the whole purpose of organic search, bringing users relevant results that could come from anyone, anywhere, as long as they were right on target, would be lost forever.

Fortunately, the passage of time has let people make some more considered analyses of that change. To get up to speed, let’s take a quick look at some of the ancient history (dating back to 2003) in search engine technology.


Enhancing Search Results with Multimedia

Education specialists will tell you that some of us learn more readily from reading, some from hearing, some from video, and some from doing. It only makes sense that when you conduct an internet search, you avail yourself of all the options when it comes to the information out there on the subject you’re interested in. That’s one reason why multimedia searching is a good idea.

Another reason is that sometimes the best information is something other than a written web page. For example, if you found a web page describing a newly discovered piece of music by one of the great composers of the past, the description would no doubt be interesting, but you could appreciate the discovery far better if someone had posted a recording of the piece being performed.

Likewise with video. While it can be thrilling to read a news story about a very close speed skating finish at the Olympics, you can experience it better if you watch a video of it.

When it comes to using multimedia to improve your search engine rankings, there are a number of ways to do it. You can post your own video content, or you can embed relevant video onto your page.
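
For instance, embedding a YouTube clip is just a matter of dropping the player markup into your page. Here is a minimal sketch; the video ID is a placeholder, and the real snippet comes from the video’s own “Embed” option:

  <iframe width="560" height="315"
          src="http://www.youtube.com/embed/VIDEO_ID"
          frameborder="0" allowfullscreen></iframe>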


Google Enhanced Local Listings

Depending on who’s doing the talking, Google’s new Enhanced Local Listings (currently available only in Houston, Texas and San Jose, California) are either a boon to small businesses with small advertising budgets or the end of organic search as we know it. Here is what all the heated discussion is about.

The Google Lat Long Blog describes it as “a new ads feature in local search that allows business owners to enhance their listings.” Apparently, it is not a matter of “buying position” on the SERP: “When the listing shows up in your Google.com or Google Maps search results, the enhancement also appears alongside it.”

The “enhancement,” as you can see in the screen shot, is a little yellow flag, and your listing will also have “Sponsored” next to it. The flag is something you can click to go to the business’s website, pictures, menu, or coupon. In the screen shot, the flag takes you to the company’s website.

[Screen shot: enhanced local listing with its sponsored yellow flag]

This service costs $25 per month and, according to Google, it does not affect your search engine rankings; the flag only shows up beside your listing where it would appear anyway. So what are the pluses and minuses of this program? Forums have been thrashing it out for the past couple of days. The two main arguments are: 1) it could help small businesses that don’t have $1,000 or so a month for an effective AdWords account, and 2) just you wait: this will eventually turn into purchasing rank, and organic search as we know it on Google will be over. Let’s take the arguments in turn.

OK, if you’re a small business, you probably don’t have the money or the time to plan and execute a competitive ad plan on AdWords, and the return on investment of a smaller AdWords account isn’t always worth the trouble or cost. But even a small business can fork over $25 to highlight its ad and include a link to a coupon, menu, photos, or just the website. Maybe it would be worth running for a month during a special promotion. There’s little question that Google’s bottom line will benefit: according to BIA/Kelsey, a Chantilly, Va. firm that tracks local advertising, small business advertising is a $29 billion market. Those $25 flags will add up fast if Google rolls the program out nationwide.

Google says it is thinking about taking the program national, but doesn’t know exactly when. Businesses in San Jose and Houston can get an enhanced listing, with its little yellow flag and link, for free for the rest of February; in March, it will start costing $25 per month. Businesses that don’t rank won’t really get anything for their $25, so local SEO will still be important.

If you’re a small business that has claimed the local Google listing for your site, then whether you buy an enhanced listing or not, you get something called dashboard analytics, which tells you where visitors go when they hit your local maps-based listing: to your website, to directions to your place, or to your Google Maps record.

There are plenty of people, though, who see this as a slippery slope to paid rankings. Some of them believe that as the program grows, it will become competitive (like AdWords), and the position will be determined by the highest bids. Others think that if the program is nationwide, everyone will pay $25 per month and the enhanced listings will no longer stand out in a sea of little yellow flags.

Then there are those who believe that now that Google has its big toe in the door, there will eventually be a more complex structure of prices and what they get you, and after people get used to it, paying for real estate on page 1 of the SERPs will have worked its way into the mix without anyone thinking it’s any big deal. Boom: the end of Google’s organic search results. With AdWords already putting a price on 10 to 20% of the space on each SERP, Google will eventually want the other 80 to 90% monetized too. Other conjectures about Google’s nefarious plan include offering something like number one placement with the purchase of a “premium” package.

The most likely scenario is that the program will roll out nationwide, everyone will pay $25 for a flag, and therefore nobody’s listing will stand out (except, ironically, maybe the oddball who ranks and doesn’t buy the enhanced listing). Pay for rank? I don’t know. It seems as if Google has an awful lot to lose by doing that. I suppose it’s possible that Google thinks it’s so big and dominates search so thoroughly that it could pretty much do what it wants and get away with it.

But if that were the case, it would make the time ripe for a new or open-source search engine to bust out due to its simple interface and truly organic listings. Or else Google could separate their search into something like “New Coke” (where businesses can buy rank) and “Classic Coke” (where the results are pure and ads are either gone or could be turned off). That would allow those who actually care about relevance and quality to do their thing while the ones that were OK with buying ranking could have their own search universe too.

February 15, 2010

Google Buzz

If you’ve checked your Gmail account in the past few days, then you’ve probably been taken straight to a screen asking if you want to set up Google Buzz. If you’re worried about a complex set-up procedure, stop worrying: it’s extraordinarily easy, because you’re walked through every step and it’s all very clear.

But before you get all gung-ho about what should be a new and easy way to tie together contacts and share social networking information, you should know up front that Google Buzz feels very “wide open,” and that’s because, well, it is. Google has already taken some heat from a blogger who found out the hard way that Buzz was revealing her current social networking information to her ex-husband because he is, for better or worse, one of her “most frequent” contacts. Your most frequent contacts are automatically allowed access to your Google Reader unless you explicitly block them, and “frequent” contacts include people who email you frequently, whether you want them to or not.


Personalized Search

In December 2009, Google rolled out Personalized Search for users who are not signed in, in over 40 languages. Personalized search has been around for a while for signed-in users who have Web History enabled. It allows Google to fine-tune your search results based on your past searches and on the sites you’ve clicked in the past; this is how Google tries to optimize searches for you when your search terms have more than one meaning.

For example, Googling the term “blackberry” while signed in with Web History on gets the results you see in the first screen shot. From my search history, it’s clear that I’m much more likely to be looking for information about the BlackBerry PDA than about the actual fruit. My mom, on the other hand, does a lot of baking and jam-making, and based on her history of looking up recipes, she would probably end up with results about the fruit rather than the electronic device.

[Screen shot: personalized results for “blackberry”]

So now that personalized search is on offer for signed out users, exactly what does that mean?

It means that personalized search can use an anonymous cookie in your browser to base your search results on 180 days of search activity linked to that cookie. This is completely separate from your Web History and your Google account. When personalized results are available while you’re signed out, you’ll see a View Customizations link in the top right corner of the results page, as in the second screen shot. Click it and you can see how the results were customized and, if you want, turn this type of customization off, as in the third screen shot.

[Screen shot: the View Customizations link]

[Screen shot: turning off signed-out customization]

Now, there are obviously computers where lots of different people search, so the browser cookie might be influenced by multiple people’s search activity. To protect the privacy of the non-signed-in users, you can’t actually view the specific search activity upon which the signed-out personalized search is based. Plus, you can turn off personalized search settings for signed-out personalized search altogether if you want.

As for signed-in personalized search, you can clear the history upon which your personalized results are based at any time to protect your privacy. That way, if you stay signed in and someone else wants to know what you’ve been searching on, they won’t be able to do it through your personalized search history. Of course they could still go through your web history, so if you’re concerned, you should use your browser to clear your web history.

If you’re a webmaster, you might be wondering whether personalized search, adopted on a massive scale, will affect your ability to reach the people you want to visit your website. The answer, unsatisfactory as it may be, is “Yes, and no.”

Consider this: a user searches for, say, “pith helmets,” and visits the top result and the last two results on the first page of listings. Those three websites are then added to the user’s personalized search data. The next time the user searches for “pith helmets,” the two sites from the bottom of the first page will rank higher than they would in an organic, starting-from-scratch search.

But what if the user finds another search result, say the seventh one on the new results page, bookmarks it, and goes there directly from then on instead of searching for pith helmets at all? Then, a few months later, he thinks that maybe there’s a better pith helmet out there, so he does another search. This time, the site he bookmarked shows up as the top result. What gives?

Even though that site didn’t do any special SEO, there it is, right at the top of our pith-helmet-loving searcher’s results. Other searchers, however, will find other pith helmet dealers at the top of their results pages.

So how do you, the webmaster, change your SEO strategy? Or do you need to change it at all? If you’re using legitimate, white-hat ways of getting traffic to your site, then your site is more likely to bubble to the top of the personalized search results. A first-time searcher might find your site perched at No. 1 simply because he has already visited it a bunch of times through other routes, even if your site wouldn’t be tops in an organic search.

There’s nothing really that you should change in terms of your SEO strategy. Keep doing the on-site SEO as you have been, and you’ll probably do fine. But there’s nothing wrong with using off-site SEO to build up your brand and keep your visitors happy. Again, it boils down to having great content that people will find compelling and that will make them want to come back. This helps you whether the game is organic search or personalized search.

The takeaway is this: Google personalized search isn’t so much revolutionary as evolutionary. It’s not going to take all the search engine results and shuffle them massively. It may mean that sites that focus on the mechanics of SEO without focusing on great content could lose some ranking, but even that seems unlikely. Personalized search will change things up a little on an individual basis, but it by no means throws out the concept of organic search results based on SEO.

Google AdWords

AdWords calculates a Quality Score for every one of your keywords. The Quality Score is a measure of how relevant the keyword is to the text in your ad and to users’ search queries, and it is determined by a variety of factors. The Quality Score for a given keyword updates frequently to reflect its performance. Generally speaking, the higher the Quality Score, the higher the position at which your keyword triggers ads, and the lower the CPC (cost per click). Sounds like you want your Quality Scores to be as high as possible, right?
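
As a rough illustration, under the simplified model Google has described (the exact formula and weights are not public), ad position is determined by something like:

  Ad Rank = maximum CPC bid × Quality Score

So an advertiser bidding $1.00 with a Quality Score of 8 (Ad Rank 8.0) sits above one bidding $1.50 with a Quality Score of 4 (Ad Rank 6.0), despite bidding less.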


What Does Googlebot See?

Something that can be very helpful when you are designing and refining your website is knowing what it “looks like” to the bots that crawl the web and index your pages. If your site doesn’t have the information that the bots need to know what your content and graphics are all about, then they can’t do a very good job indexing your pages.

If you use Firefox, you can download and install the User Agent Switcher add-on (you’ll have to restart Firefox once it’s installed). Then, in Firefox, go to Tools, then User Agent Switcher, then Options, then Options again. In the User Agent Switcher window that comes up, select User Agents and click “Add.”

In the Description box, type something like “Google Bot” and in the User Agent box, paste this:

Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)

In the App Name box type Googlebot, then click OK. Now, any time you want to view one of your pages as if you were the Google bot, you go to Tools, User Agent Switcher, Googlebot.

You might have to block cookies to view some sites, and you can do this in Tools, Options, Privacy, Exceptions (then add the URL).

Another thing you can do is use a text browser like Lynx to get a rough idea of how your site looks to Google. Google Webmaster Tools has a feature that helps here too. On the Webmaster Tools dashboard, click the “+” sign by the “Labs” link in the left-hand column and you’ll see an option called “Fetch as Googlebot,” as in the first screen shot. Click it, and it will fetch your site (or whatever URL you enter) as the Googlebot sees it.

[Screen shot: the Fetch as Googlebot option in Webmaster Tools]

As the second screen shot shows, you’ll see the HTML source, just like what you get when you click “View Source” in your browser. You’ll get a response code, like 200, which means everything is peachy, or 301, which means “permanent redirect.” You’ll also see what kind of server your website is on and any CSS files or scripts that are called upon and included.

[Screen shot: Googlebot’s view of a page]

One caveat: it doesn’t always work with PDF files, though Google insists it’s working on fixing the problem, and if your site looks OK in your browser, chances are it looks OK to the Googlebot (even if it’s a PDF).

If you run a lot of scripts or have lots of layers on your site, this can be particularly handy. If your site is mostly simple HTML, your normal web browser will give you a pretty good idea of what Google sees.

What Googlebot Sees As It Crawls Your Site

When the Googlebot crawls your site, it uses computer algorithms to determine which sites to crawl, how often to crawl them, and how many pages to get from each site. It starts with a list of URLs from earlier crawls and with sitemap data. The bot notes changes to existing sites, new sites, and dead links for the Google index. When Googlebot processes each page, it takes in content tags and things like ALT attributes and title tags. Googlebot can process a lot of content types, but not all: it cannot process the contents of some dynamic pages or rich media files.
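
As a quick hypothetical example (the filenames and text are invented), descriptive markup like this gives the bot indexable text even though it can’t “see” the image:

  <img src="flour-mill.jpg" alt="Stone mill grinding organic whole wheat flour">
  <a href="/recipes/" title="Whole wheat bread recipes">Recipes</a>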

There has been plenty of talk about how to handle Flash on your site. Googlebot doesn’t cope well with Flash content or with links contained within Flash elements. Google has made no secret of its dislike for Flash, saying that it is too user-unfriendly and doesn’t render on devices like PDAs and phones.

You do have some options, however, such as replacing Flash elements with something more accessible, like CSS/DHTML. Web design using “progressive enhancement,” where the site’s design is layered yet connected, allows all users, including the search bots, to access content and functions. Amazon’s “Create Your Own Ring” tool for designing engagement rings is a good example of this type of functionality. There is also sIFR (Scalable Inman Flash Replacement), a replacement technique that uses JavaScript, CSS, and Flash to display any font in existence, even if it isn’t on the user’s computer, as long as the user can display Flash. sIFR is now officially approved by Google.

Google says that the bottom line is to show your users and Googlebot the same thing. Otherwise your site could look suspicious to the search algorithms. This rule takes care of a lot of potential problems, like the use of JavaScript redirects, cloaking, doorway pages, and hidden text, which Google strongly dislikes.

Google support engineers say that Google looks at the content inside “noscript” tags, but that content should accurately reflect the Flash-based content it stands in for; otherwise, Googlebot may think you’re cloaking.
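
Here’s a hypothetical sketch of what that looks like, assuming the SWFObject library for the embed (the filenames and wording are invented). The point is that the fallback text repeats what the Flash movie actually shows:

  <div id="promo">Flash promo loads here.</div>
  <script type="text/javascript">
    // Replaces the div above with the Flash movie
    swfobject.embedSWF("promo.swf", "promo", "400", "300", "9.0.0");
  </script>
  <noscript>
    <!-- Mirrors the text inside promo.swf; diverging from it can look like cloaking -->
    <p>Spring sale: all stone-ground flours 20 percent off through March.</p>
  </noscript>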

According to Google engineer Matt Cutts, it’s difficult to pull text from a Flash file, but Google can do a fair job of it using the Search Engine SDK tool from Adobe/Macromedia. Most search engines are expected to make that tool the standard for pulling text out of Flash. If you regularly use Flash, you might consider getting the tool yourself and seeing what kind of text it pulls out of your graphics. Google may even work with Adobe on updates to the tool.

Domain Penalties II

In Part I, we talked about how to determine whether your site has been banned or penalized by Google and what to do about it. This part delves further into Google penalty folklore, and into how the search engine is constantly changing and evolving to counter the nefarious workarounds people develop to game the system.

In August and September 2009, Google made changes that demote a penalized site by 50 places in the rankings. At the same time, variation in anchor text grew even more important than it had been in previous years. The “rules” of building natural anchor text change a lot; that doesn’t mean you should stop using natural anchor text. More on that later.

Here are five things you probably shouldn’t spend much, if any, time worrying about anymore:

  1. Alexa Rank is tilted enough toward online marketers that it doesn’t tell nearly enough of your web traffic story to be worth much.
  2. Google back link data can be dicey: the random sample Google returns might happen to surface your most spam-laden link. Don’t make major decisions based on it. You want the kind of links that come along with good content, like the little page widget in the screen shot.
  3. Google cache date is overrated: great pages can be returned with no cache date set, so it’s irrelevant too.
  4. Google PageRank can be randomized to throw SEO experts off. Apparently Matt Cutts has confirmed this, as can be seen here.
  5. Precise anchor texts are officially “out,” since anchor text filters have been in effect for quite some time now.

DNS Level Penalty, Part II

The so-called “minus 50 penalty” is a filter in Google that operates at the domain, page, and keyword level. In other words, pages drop by 50 positions in the rankings because of over-optimization of the keywords the page has been linked with, either internally or externally.

What does this mean? Un-optimize your keywords?

Actually, yes.

Since the moods of Google change fairly rapidly, the things that worked last year may not work now. If you’re hit with a penalty, all you can do is fix the problem, suck it up, and move on starting today. Various webmasters have said that it takes 60 days to get rid of the penalty, so the sooner you deal with it the better.

Something else you must do is change up your anchor texts. Don’t use one hot keyword for all your links, and don’t just vary them between singular and plural. Use everything from natural language phrases to pieces of keyword phrases to misspellings and typos. What you don’t want is to overdo the linking with the hot keywords and phrases. Write your anchor texts as if you don’t care about squeezing every last bit of juice out of a particular keyword or phrase.
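
For instance (a hypothetical page and link set), a natural-looking set of inbound links might mix anchors like these rather than repeating one money phrase:

  <a href="http://example.com/flour/">organic flour</a>
  <a href="http://example.com/flour/">buy stone-ground flour online</a>
  <a href="http://example.com/flour/">this little mill I order from</a>
  <a href="http://example.com/flour/">example.com</a>
  <a href="http://example.com/flour/">organic fluor</a> <!-- even the odd typo -->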

When it comes to anchor text variety, your best bet is to check out what your best competitors are doing, because there’s no exact number of times a keyword can be used to anchor links. The key is not to be too far out of line compared with your competition.

Page level penalties are becoming more common, whereas penalties used to be applied mostly at the domain level. In many cases, page level penalties are hitting the home pages of sites. Key phrase specific penalties are becoming more common too. These happen when there are easily detected paid links pointing to a page with exact anchor text for one key phrase. The problem is that the page can continue to get search traffic for some phrases, just not the one you want.

Apparently the reason Google does it this way is that sometimes Google susses out paid links, and sometimes it doesn’t. If the algorithm is going to hand out penalties for paid links, it needs to avoid messing things up too badly for a site when it thinks there are paid links but there aren’t. Therefore, it limits the penalty to one specific page and one specific key phrase, so that an entire site isn’t demoted in the event of a mistakenly applied penalty. Of course, the best thing to do if you have paid links is to get rid of them and wait for your site to be crawled again.

These page level and key phrase specific penalties are sometimes hard to detect, but there are some things you can do that might give you clues that your site is on the receiving end of a page level key phrase specific penalty.

  • If you have paid links from a link network that would be easy for Google to pick up, get rid of them. If you haven’t been penalized yet, you will be. It’s just a matter of time.
  • If your paid links use identical anchor text and point to the exact same page, again, get rid of them.
  • If the page linked to doesn’t have a good ranking for the target key phrase, you may have been hit with a “minus 30” penalty or worse.
  • If the page ranks OK for similar key phrases, then you could be the target of a key phrase level penalty. Test your ranking across several similar key phrases.

In the long term you’re always better off using above board link practices, such as the following:

  • Don’t purchase links for PageRank. Seriously, why would you do this? A sudden PageRank drop is mostly useful as a signal that a site has been banned. The ego-stroking that a high PageRank gets you isn’t what gets you traffic, search engine ranking, or conversions. Make yourself not care about it.
  • Don’t put links on duplicate content pages such as article sites and article directories, and don’t add links to ancient pages without changing their content.
  • Put some no-follow links on relevant pages (see the sketch after this list). A no-follow link still passes along relevancy and trust, even if PageRank juice isn’t passed along. Not that you would care, because you know not to care about PageRank, right?
  • Links on relevant pages and in appropriate context are what you need. Simple, but true.
  • Domain trust is very important, both for your domain and for domains linking to you. Domain trust has to do with domain age and with that site’s links. If your page has a link from a page that links to bad neighborhood sites, then by the mysterious Google transitive property, you may look sleazy by association.
  • Conversely, “juicy” pages pass value to you if you can get a link on them. Juicy pages are ones that rank for a keyword or phrase that is important to you, regardless of PageRank or other metrics.
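
For reference, here’s what a no-follow link looks like in markup (the URL and anchor text are hypothetical):

  <!-- rel="nofollow" tells search engines not to pass PageRank through this link -->
  <a href="http://example.com/sponsor/" rel="nofollow">our sponsor</a>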

The moral of this long story is that while domain level penalties appear to have peaked, you have to watch out for page level and keyword specific penalties. While Google’s intentions here were probably honorable, these penalties can be harder to figure out than domain level problems, particularly keyword and key phrase level penalties. If you have paid links, shady links, or happen to link to a nice-looking site that itself has questionable link issues, fix these things now. It isn’t always easy to find out which of the sites you link to themselves link to porn or other dodgy sites, but it’s worth checking. The bottom line is that if you shun all questionable practices, build links organically, and continue to provide fresh, relevant content with natural-sounding anchor text for links, your site will bubble upward and is almost certain to resist getting penalized.

Guide to Google Penalties

Has your website’s search engine ranking dropped drastically for no apparent reason? It’s possible that your site has been hit by a domain level penalty from Google’s web spam team. A domain level penalty means your whole site has been demoted in the search engine rankings, not just certain pages of it. The bad news is that this can be difficult to pin down: while there are tools to help you figure out if your site has been penalized, a degree of speculation is involved.

When your site’s Google ranking takes a sudden hit, you can pretty safely conclude that you have done something wrong in the opinion of Google. Google isn’t all that specific about what happens when it “catches” you doing something wrong, or exactly how it penalizes you. But in general, these mistakes fall into a few categories, and there are ways to reach a reasonable conclusion about which sin you’ve committed.

Here is how to figure out if you’ve been hit with a site level penalty.

  • Is your site still indexed?
  • If not, you may have been banned. If you think this is the case, verify it with Webmaster Central, fix the problem you think caused you to be banned, and file a re-inclusion request.
  • If your site is still indexed, find out if it still ranks for its domain name.
  • If not, you may have been hit with a penalty for keyword stuffing, manipulative linking, or cloaking. Get rid of your bad outbound links – particularly any paid links. Then go to Webmaster Central, beg forgiveness, and submit a re-inclusion request.
  • If your site does still rank for its domain name, find out if it still ranks highly when you search on five or six unique terms in your title tag.
  • If it doesn’t, then your links have probably been cleansed of any value they had. If Google finds out about attempts to pass link love, it penalizes them. Fix the problem, plead your case with Webmaster Central, ask for re-inclusion, and next time get your links honestly.
  • If your site still ranks highly when you search on several unique terms in your title tag, then you’re probably not penalized at all, but just lost ranking naturally. Maybe some spam comments got through, or maybe it’s time to review your on-page SEO.

Google Penalties Explained

Cataloging the various Google penalties people have come across is like trying to herd cats. Whatever relationship exists between the punishment and the crime either isn’t transparent or isn’t applied evenly across the board. Being banned is, of course, as bad as it gets, since your site suddenly can’t be found at all. This usually happens only when some kind of serious deception is going on on the site. But there are a few activities that have been pinned down as sending the Google gods into a frenzy.

Domain level redundancy is one thing that can get you penalized. This is what happens when webmasters clone sites: they point the Domain Name System (DNS) records for several domains at the same directory, which makes each domain display exactly the same site.

If you duplicate the same content over several pages or sites, or if someone copies your site or content, Google will ding you for content redundancy. If you think someone else is ripping off your content, check Copyscape and see if you can track it down.

Purchasing a number of domains, each of which targets a separate keyword, is now considered punishable as well.

If Google decides you’re a link seller, your links will rapidly come to mean zilch. And if your links don’t pass on PageRank juice, they’re worth less than the pixels they’re written on, even when you use them on your own site.

There is a certain amount of mystery surrounding the Google “sandbox” theory. Some people believe that young sites are penalized, though you don’t see this penalty unless you try to SEO the heck out of a site within the first few months of its existence. Apparently there is a certain amount of dues-paying your site has to do before your SEO starts getting respect from the big guys.

Does your site support (or appear to support) porn, gambling, or “male enhancement” sites? Well, duh. Of course you’re going to get penalized. Same with spam.

If your site could be perceived as a threat to national security, say you sell fake IDs or something, it will be penalized. If a third party hijacks your search engine rankings by means of cloaking and proxies, your site unfortunately gets hit with a penalty; this one is mostly Google’s fault. The same is true if an affiliate uses content from your site, or if a competitor intentionally links to you from “bad neighborhood” (porn, gambling, etc.) sites.

And, of course, there’s the fact that the Google algorithm isn’t perfect. An algorithm quirk can have the same effect as a penalty if it makes your site unreachable.

If you want to find out whether your site has been penalized, we have a free penalty checker tool at http://tools.lilengine.com/penaltychecker/. Click the button below to give it a try.

[Button: Penalty Checker]

You just input the address of the site you want to check, and you’ll get a result like either screen shot one (when a site appears to have been banned or penalized) or screen shot two (when it is not believed to have been penalized).

[Screen shot: a domain level penalty detected]

[Screen shot: no penalty detected]

If you suspect your site has been penalized, there are a number of steps you can take to find out whether it has, and why. Once you fix the problems, you can request reinstatement in the Google index and eventually get back into its good graces.