has 13+ years of experience in web development, ecommerce, and internet marketing. He has been actively involved in the internet marketing efforts of more than 100 websites in some of the most competitive industries online. John comes up with truly off-the-wall ideas and has pioneered some completely unique marketing methods and campaigns. John is active in every aspect of the work we do: link sourcing, website analytics, conversion optimization, PPC management, CMS, CRM, database management, hosting solutions, site optimization, social media, local search, and content marketing. He is our conductor and idea man, and has a reputation as a brutally honest straight shooter. He has been in the trenches directly and understands what motivates a site owner. His driven personality works to the client's benefit, as his passion fuels his desire for your success. His aggressive approach is motivating, his intuition for internet marketing is finely tuned, and his knack for link building is unparalleled. He has been published in books and numerous international trade magazines, been featured in the Wall Street Journal, sat on the boards of trade associations, and served as a spokesperson at several internet marketing industry events for Fortune 100 corporations including MSN, Microsoft, eBay, and Amazon. John is addicted to Peet's coffee, loves travel and golf, and is a workaholic except on Sundays during Steelers games.
PageRank: Meet Hilltop
Based on Atul Gupta's recent article on the Hilltop algorithm, I did a bit of research of my own and came up with this article. Atul Gupta is the founder of SEO Rank Ltd. and, as he explained in his article, the Hilltop algorithm played a fairly large role in Google's November 16 update, dubbed "Update Florida".
This piece continues my series on the effects of the Google "Florida" update. In my previous article, I discussed how the OOP (Over Optimization Penalty) could in some cases have been applied to sites that had in fact been overly optimized for some of their main keywords. Researching the Hilltop algorithm, I found out that it isn't even new: it dates back to early 2001.
As you might expect, and as is always the case, Google remains very silent on all of this, so my analysis is based on many observations and some testing using the Google.com search engine. But before delving into how all of this may affect your positioning in Google, let me explain what the Hilltop algorithm is all about and how it works in Google.
For those of you who may be new to search engine algorithms, I suggest you read up on Google's PageRank algorithm as a primer, along with "The Anatomy of a Large-Scale Hypertextual Web Search Engine", written by Sergey Brin and Larry Page, the co-founders of Google.
In its most basic form, the Google PageRank algorithm determines the importance and relevance of a web page by the number of links pointing to it. Following this principle, Google would, for example, rank a page with 100 links pointing to it higher than a page with only 10. So far, so good; this principle makes a lot of sense when you think about it.
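For reference, the Brin and Page paper mentioned above defines PageRank with a short formula. Here, T1 through Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor usually set to 0.85:

PR(A) = (1 - d) + d * ( PR(T1)/C(T1) + ... + PR(Tn)/C(Tn) )

In other words, a link counts for more when the linking page has a high PageRank of its own, and for less when that page links out to many other pages.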
Definition Of The Hilltop Algorithm
In contrast to PageRank, Google's Hilltop algorithm determines the relevance and importance of a specific web page according to the search query or keyword entered in the search box.
In its simplest form, instead of relying only on the PageRank value to find "authoritative pages", it is more useful when that PR value is also relevant to the topic or subject of the page being ranked.
In this way, links from documents that are relevant to a specific topic carry greater value for a searcher. In 1999 and 2000, when the Hilltop algorithm was being developed by engineer Krishna Bharat and others at Google, they called such relevant documents "expert documents", and links from these expert documents to a target document determined its "score of authority". Again, it makes a lot of sense.
For more in-depth information on this important topic, read the Hilltop Paper that was written by Krishna Bharat himself and is available from the University of Toronto’s computer science department.
Using The Hilltop Algorithm To Define Related Sites
Google also uses the Hilltop algorithm to better determine how one site is related to another, as in the case of affiliate sites or similar properties. The Hilltop algorithm is, in fact, Google's technology and ammunition for detecting sites that use heavy cross-linking or similar strategies!
As a side note, Google's Hilltop algorithm bases most of its computations on "expert documents", as noted above.
Hilltop also requires that it can easily locate at least 2 expert documents voting for the same web page. If it cannot find a minimum of 2 such expert documents, it returns no results at all. What this really means is that Hilltop refuses to pass any values on to the rest of Google's ranking formula, and it simply does not apply for the search term or keyword the user entered in the search box.
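To make the two-expert requirement concrete, here is a minimal JavaScript sketch of the behavior just described. The data structure and function names are my own illustrative assumptions, not Google's actual code:

// Each expert document is assumed to carry the list of target
// pages it links to for the current query.
function hilltopCandidates(expertDocs) {
    var votes = {};  // target URL -> number of experts voting for it
    for (var i = 0; i < expertDocs.length; i++) {
        var targets = expertDocs[i].targets;
        for (var j = 0; j < targets.length; j++) {
            votes[targets[j]] = (votes[targets[j]] || 0) + 1;
        }
    }
    var candidates = [];
    for (var url in votes) {
        if (votes[url] >= 2) {  // the minimum of 2 expert votes
            candidates.push(url);
        }
    }
    return candidates;  // empty list: Hilltop passes nothing to the ranking formula
}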
So, What’s In Store For Hilltop In 2004?
Since we are only at the beginning of the year, some of you may ask: "That's all really cool, but what will happen to websites in 2004, in the aftermath of 'Hurricane Florida'?" That's a great question, and many articles have been written on this topic in the last six to seven weeks.
Over the years, many search engines have stopped valuing ranking factors that were subject to abuse by webmasters and site owners, such as the keywords meta tag. For that very reason, Google has ignored the keywords meta tag since its beginnings.
In contrast, the visible sections of a website are less subject to "spam-dexing" (search engine spam), since visible pages need to make good sense to the average human visitor.
The Reasons Behind A New Algorithm At Google
Since the inception of the Google search engine in 1998, the PageRank algorithm has been pretty much the benchmark Google uses to determine search relevance and importance. However, the PageRank system has a fundamental design weakness and certain limitations, and Google has known about them for quite some time now.
PageRank's intrinsic value is simply not tied to specific search terms or keywords, so a web page with a relatively high PR that contained only a passing reference to an off-topic search term often got a high ranking for that phrase anyway. This is exactly what Google is trying to eliminate with its Hilltop algorithm. Google always tries as best it can to make its search engine as relevant as possible.
Coming back to Krishna Bharat: he filed for the Hilltop patent in January 2001, with Google as the assignee. Google clearly recognized the important improvements this new algorithm could bring to its search rankings when combined with the existing PageRank algorithm.
Google's Hilltop algorithm could now work in conjunction with its older technology (PageRank). It is my observation that Hilltop has likely gone through many improvements since its original 2000 design, notably in the implementation Google began deploying on or around November 16, 2003, at the very beginning of its November ("Florida") update.
In the past two years, I think that Hilltop has been “fine-tuned” by Google and now represents a serious contender to the PageRank algorithm, originally developed by Google co-founders Sergey Brin and Larry Page, back in early 1998.
Hilltop And Google’s Massive Index Of Over 3.3 Billion Pages
Since its very beginning, Google has operated most of its search engine on roughly ten thousand Pentium servers (some call them inexpensive personal computers) distributed across major data centers around the world. That is basically how Google has built its hardware technology, from the ground up.
Coming back to the Hilltop algorithm: when you consider that those 10,000 or so servers have the processing power to rapidly locate "expert documents" among hundreds of thousands of different, topical web pages, it becomes clear that this is exactly the kind of formidable task Hilltop is performing.
From what I can see, and from what I know of search engines, Google has been running a form of batch processing on frequent keywords, key phrases, and search terms since November 16 (similar to mid-seventies computing on bulky mainframes the size of large refrigerators, except that today those 10,000 servers replace the mainframes). Google then stores these results in its massive database, ready to be used as soon as a searcher makes a query using those search terms.
How Google does this is very simple: it has immediate, real-time access to the most popular and frequent keywords and search terms used daily, collected from actual searches by everyday users as well as from the keywords and key phrases used in its AdWords PPC (pay-per-click) ad program.
It is my observation that Google has apparently set an arbitrary threshold on the number of times a keyword must actually be searched for before it triggers the Hilltop algorithm, at which point the term is sent to a temporary buffer for later batch processing in its complex system.
Looking back at the old days of the monthly dances, it would appear that prior to November 16, 2003, Google's Hilltop algorithm operated on the combined total of the most popular search terms about once a month, hence the old "Google dance" effect.
Additionally, and this is something I noticed even before the Florida update, smaller incremental batches are likely processed more frequently for search terms that spike in popularity, such as a major news event (for example, the US capture of Saddam Hussein in December 2003). Such short-term events or news would qualify for the short-term buffer and be processed by Hilltop accordingly.
More standard, ordinary results for the longer term would be timed in with the 10,000 servers about once a month, which again would make perfect sense. Search terms that do not qualify to kick in the Hilltop algo continue to show you the old Google rankings.
Conclusion
To conclude: as Atul Gupta and I have written in some of our previous articles, webmasters and site owners need to think outside the box if they want to thrive and continue to run sites that return a favourable ROI. As always, link popularity is even more important now than ever before.
Additionally, try to get listed in as many directories as possible, beginning with DMOZ (the Open Directory Project). Avoid FFA (Free For All) links pages and link farms in every respect; those are a thing of the past and might even get you penalized.
If your budget allows it, get into a good PPC ad program, such as AdWords or Overture. You might also want to consider some good paid inclusion search engines that deliver real value for your investment.
Note that since January 15 (and as expected), Yahoo has completely dropped Google's listings, so you may also want to consider a paid listing in Yahoo as a safety measure. Yahoo now takes its results from Inktomi, which has been in the Yahoo family of search properties since Yahoo bought Inktomi last year.
Author:
Serge Thibodeau of Rank For Sales
Your Google Description
When it comes to describing your site, Google assembles what is known as a snippet description to display in its search results. Sometimes it's a good description, one that prompts potential visitors to click on your link. Other times, it isn't. Take the case in point where the following page, ranked #1 in a keyword search for "scuba dive", "entices" the potential site visitor by listing the various PADI locations from around the world …
PADI – The way the world learns to dive
PADI Americas – English, PADI Canada – English, PADI Europe – English, PADI Nordic – English, PADI International Limited – English, PADI Japan – English, PADI Asia …
Description: The largest and most recognized diving organization around the world with courses ranging from Snorkeling…
Category: Recreation > Outdoors > … > Dive Organizations > Training Agencies www.padi.com/ – 9k – Dec 27, 2003 – Cached – Similar pages
Oops! Oh, well… at least their Description, taken from their editor-assigned ODP directory description, is relevant, but their snippet leaves something to be desired.
Can the snippet be changed to entice users to click on your listing?
Of course, this is important because potential site visitors are judging whether to click or not based in part on those snippets. So, how can one go about changing Google’s snippet advantageously? Let’s take a look and see.
For starters, we’ve found that Google actually pulls the snippet description from several different places on your Web page. Let’s think about this for a minute. If we could determine where Google is pulling our description, perhaps we might be able to change that wording to “produce” a description that more accurately describes our page.
Where is Google pulling the snippet description?
Currently Google is pulling the snippet from any one or combination of the following areas:
1. META description tag (although Google doesn't use its contents to determine relevancy).
2. First ALT text found on the page.
3. First text found on the page (which may be a heading tag, body text, etc.).
4. Additional heading tags on the page.
5. Additional body text found on the page.
6. Additional ALT text on the page.
7. Navigation bar on the left-hand side of the page (which is rarely a relevant description of a site!).
8. Copyright information at the bottom of the page.
9. Wherever the keyword phrase is found.
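To picture where those areas live in a page's code, here is a skeletal, purely hypothetical HTML page with some of the most common snippet sources marked (the numbers match the list above; "Acme" and the file names are invented):

<html>
<head>
  <!-- 1. META description tag -->
  <meta name="description" content="Learn to scuba dive with Acme.">
</head>
<body>
  <!-- 2. First ALT text found on the page -->
  <img src="logo.gif" alt="Acme scuba diving lessons logo">
  <!-- 3. First text on the page (here, a heading tag) -->
  <h1>Scuba Diving Lessons</h1>
  <!-- 5. Body text surrounding the keyword phrase -->
  <p>Our scuba diving lessons are taught by certified instructors.</p>
  <!-- 8. Copyright information at the bottom of the page -->
  <p>Copyright 2004 Acme Scuba</p>
</body>
</html>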
Important Note
One thing that's very important to note is that the snippet is determined by the search term. In other words, if you search for your company's name, you'll get a different description than if you search for a keyword phrase that is relevant to your site. Generally, Google appears to pull the description from the areas of the page surrounding the usage of that particular keyword phrase. The obvious question is: is it the first usage of the keyword phrase? Usually, but not always.
Another Important Note
Since most people aren’t going to be searching for the name of your business, don’t try to change your Google snippet description based on a search for your company name. Instead, search for the most important keyword phrase for each important page of your site, and then make changes accordingly.
Let’s look at some examples
If you search for “search engine seminars” (no quotes) at Google, you’ll find these results:
Search Engine Seminars–your path to success on the Web!
… Search Engine Seminars. Is your Web site achieving the success that you want, or that it deserves? … At our Search Engine Seminars . . . you learn by doing. …
www.searchengineworkshops.com/articles/search-engine-seminars.html – 8k – Cached – Similar pages
Here’s the first text on the page:
Search Engine Seminars
Is your Web site achieving the success that you want, or that it deserves? Are you getting any traffic? Is that traffic converting to sales? Have you considered attending a search engine seminar to learn how to take a struggling Web site and bring it to the top of the rankings?
Search engine seminars, conducted by Search Engine Workshops, are held at various locations across the globe. These seminars are totally different than attending a large search engine conference, where you listen to a speaker discuss theories from the front of the room.
At our Search Engine Seminars . . . you learn by doing
And, here’s the section of that page, which shows the META description tag:
<meta name="description" content="Search Engine Seminars–your path to success on the Web!">
The META description tag is obviously not being used as the snippet description for this page under the keyword phrase, “search engine seminars.” Could it be because the plural version of the keyword phrase, which is what we searched for, isn’t found in the META description tag? Possibly.
So where is the snippet being pulled from?
Here’s the snippet description again:
Search Engine Seminars. Is your Web site achieving the success that you want, or that it deserves? … At our Search Engine Seminars . . . you learn by doing. …
In this example, the snippet appears to be pulled from the first heading tag (“Search Engine Seminars” at the top of the page), followed by the first sentence in the body text, followed by the next heading tag (“At our Search Engine Seminars . . . you learn by doing . . .”). Notice that the second heading tag is not the second instance of the usage of the keyword phrase. In the second paragraph of the body text, the keyword phrase is used as a hyperlink.
So what am I going to do with this knowledge?
In this example, nothing, because the description accurately describes the Web page. I’m not going to change a thing.
If the snippet description of your page accurately describes the page, leave it alone!
(Continued in Part 2. For the complete article, write to robin@searchengineworkshops.com)
(Writer’s Note: This article offers tips for changing your Google description in order to increase the click throughs to your site. However, this has nothing to do with trying to increase your page’s search engine rankings.)
Author Bio:
Robin Nobles with Search Engine Workshops teaches SEO strategies the “stress free” way through hands-on, search engine marketing workshops in locations across the globe and online search engine marketing courses (http://www.onlinewebtraining.com). Visit the World Resource Center, a new networking community for search engine marketers. (http://www.sew-wrc.com)
Search Engine Innovation For 2004
After being blindsided by the Google Florida update, many webmasters and SEOs were left reeling. The message is clear: you can't rely on just one search engine for all of your traffic. You must use all your wits to emerge victorious from the search engine wars. Google is important, but it is not everything. Keep your eyes and ears open to new opportunities and old standbys: other search engines and directories, paid placement and pay-per-click, newsletters, and even more traditional channels.
Wait To Change
So were you an innocent bystander caught in the onslaught of sites dumped in the Google Florida update? Many people lost their hard-earned rankings even though they did nothing "wrong"; many websites that followed Google's optimization rules to the letter were still caught up in the carnage. Unfortunately, many businesses were devastated by these changes, especially heading into the holiday months.
What to do? As difficult as it may have been to make sense of Google's changes, for many the simplest course of action was to do nothing at all. While perhaps contrary to the normal "it's broken, so I need to fix it" approach, for many webmasters "do nothing" has proven to be the correct course of action. Since the update, many sites that were exiled to search engine Siberia have returned to nearly their former rankings, shaken but intact. From all appearances, Google simply changed its algorithm and may not have gotten it quite right; additional tweaks since the Florida update seem to have brought some sanity back to the results.
Who Will Stay Tops In The Search Engines?
You never know who will become the leader in search engines. It was only a few years ago that directories were the major force, until the upstart search engine Google came along. Google got its start about five years ago and hasn't looked back. As long as Google provides good results for its users, it is in a good position to stay on top. However, with MSN working on the creation of its own search engine and Yahoo's acquisition of Overture (which includes AllTheWeb and AltaVista), things could get interesting in 2004. Microsoft is always a force to be reckoned with, and Yahoo certainly has the tools to become a major competitor to Google.
Inktomi’s New Role
Inktomi may play an important role in this shake-up now that it is owned by Yahoo. Keep an eye on this engine: it provides secondary results for MSN and will probably replace Google in supplying primary results for Yahoo. Inktomi's importance to MSN may also increase once the Microsoft property stops using LookSmart for its primary results.
To see which pages you have listed in Inktomi, use the Inktomi Pure Search function from PositionTech. Inktomi often adds a few free pages to its database, so check which pages you may already have in there for free before using Paid Inclusion for your most important pages.
Other Ways To Promote Your Website
Keep your eye on search engine news. Google was an up-and-coming engine just a few years ago; you never know what will happen in the industry, so stay on your toes. Continue to promote your website through links in topical directory listings. Search for websites that cover topics related to yours, and link when it makes sense. Don't forget traditional means of marketing your website: print ads, brochures, magazine articles and more may help make a difference. One of the best ways to promote yourself online and increase your link popularity is to write articles on your subject. Find websites that accept free content and submit your ezine articles or newsletters to build your link popularity. Newsletters, forums, FAQs, blogs and tips on your subject are all viable means to inform your visitors and bring new traffic to your website. Don't forget to archive your newsletters and articles on your website; this builds your site's size and increases link popularity through your authoritative knowledge of your subject. Not a writer? Consider working with a copywriter to help build good content.
Paid Inclusion And Pay Per Click
If you haven't ventured into Paid Inclusion or PPC services, consider using them to help balance out changes in your traffic. Use a Paid Inclusion subscription for your most important pages, or submit dynamically generated pages that aren't being picked up by the search engine robots so they appear regularly in the search engine database. You can start your PPC bidding in small doses: look for secondary, smaller terms that don't cost as much but will still bring in traffic your competitors may miss. Also take a look at some of the smaller PPC engines out there; a little traffic from a lot of places can add up.
For more information on choosing keyword phrases, read our article Finding Targeted Keyword Phrases Your Competitors Miss.
Content, Content, Content
The biggest mistake I see webmasters make is creating a website with little content. Don't rely on a few optimized paragraphs of text to convince search engine robots to stick around; a skeleton website does not make a good impression on anyone. Build up the content of your website. Google's new algorithm may be a sign of search engine robots getting a little smarter about understanding what your website content is about. Build information that will keep visitors at your website, and become an authority on your subject so other websites naturally link to you because your information is invaluable. Remember, Google is interested in serving those who use its search capabilities, just as you should be interested in serving your visitors. Give your visitors as much real content as you can; they will thank you with return visits.
And In The End…
In the end, the information you give is often equal to the response you receive. Make the effort to become an authority site on your subject. Building the groundwork of your website with quality information and broadening your marketing methods will help sustain you through the search engine wars to come.
Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing, a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.
Keyword Analysis & Ranking: The Value of Brainstorming
Conducting a good keyword analysis is the first step in an effective search engine marketing campaign. If you don't choose good keywords, all efforts to boost your ranking will be wasted.
And the very first step within your keyword analysis is brainstorming. Before you begin focusing on keyword popularity, you need to think outside the box and develop a good list of words.
The Rules of Brainstorming
Brainstorming is a process for generating the broadest possible list of keywords for your web site, without yet judging whether they are good words. The cardinal rule of brainstorming is not to pass judgment on ideas as they come up. Doing so can shut off your thinking, and at this stage in the process you want to encourage as much lateral thinking as possible.
When we get to the detailed analysis phase, we'll establish a keyword ranking based on each word's search popularity and relevance. At that point we'll reject many of the words on our brainstorming list, but early on we want to be open to new ideas. Too early a focus on ranking keywords can block creative thought.
An Outside Perspective
I try to accomplish two things when I brainstorm keywords with clients. First, I’m trying to learn the language of their industry. Second, I’m trying to make sure they aren’t trapped by their own industry jargon.
Let’s talk about the second point first. It’s the classic marketing conundrum: the purpose of your marketing group is to keep the company focused on its customers and understand how they think. Yet a group of people who work in the same office, day in and day out, inevitably develop their own jargon. It’s easy for them to assume their customers speak the same language, when they may not.
A good example of this is a keyword review I recently performed for a site that sold CD-ROM devices. The list of keywords included the terms "CD copier," "CD duplication," and "CD replication." Yet the list missed the term I and most other consumers use: "CD burner," as in "Hey, Bob, could you burn a CD for me?"
Disconnects like this are more common than you might expect. Sometimes the causes are generational.
For example, ten years ago, if you asked a 9-year-old what a razor was, they'd have told you it's something dad uses to shave his face. Today there's a high probability you'll be told a razor is a scooter. (Try it: I did with my 9-year-old and her friends, and was told 100% of the time that a razor was a scooter!)
I once worked with a manager who insisted on referring to his product (which tracked web site uptime) as "accessibility" monitoring. The trouble is, "accessibility" is a well-established term in the web design world for ensuring your web site is readable by people with disabilities.
A good brainstorming process helps spot issues like this. As a client, it’s important to be open to a fresh perspective from outside your company.
Learning Your Lingo
The other thing I try to accomplish during brainstorming is to learn my client’s unique language. While there sometimes is a disconnect between the company’s language and their customer’s language, at other times this jargon is used by everyone in the industry – including the searchers you’re trying to target.
This is especially true if your company offers professional services. In these cases your search terms may be highly specialized. For example, a CPA specializing in corporate tax preparation might use terms like “apportionment” or “franchise tax.”
Understanding the appropriateness of these terms is important, because while these highly focused terms may not receive a huge volume of traffic, they are often your most relevant search terms. Highly relevant search terms are the ones that generate the most business.
When I work with clients, brainstorming is an important education process for me: it's where I learn the language of their profession.
Oftentimes the jargon you and your customers use may need to be amended slightly to reflect the way those same customers use a search engine. This is most true when you're talking about acronyms.
Let's face it, Americans are acronym crazy. Every industry has its unique set of acronyms. If you work on enough projects, sooner or later you'll see the same acronyms used to mean very different things.
Take one simple example: does “ADA” mean “Americans with Disabilities Act” or “American Dental Association,” or perhaps the “American Diabetes Association?” Search for ADA in Google and you’ll turn up hits on each of these definitions.
In my experience, people who search for acronyms get very unsatisfying results, and quickly refine their search by adding additional words. So a searcher might change their query to "ADA compliance" to get more relevant results.
As an SEO, it’s important for me to understand what additional words might be coupled with your industry’s acronyms to create this type of refined search.
This understanding of how people use a search engine is one of the things a search engine marketing specialist brings to the table. During the brainstorming process, it’s common for us to expand some of the search terms suggested by the client to reflect search behavior.
Sources of Brainstorming
All of this is good, but how do you get past writer’s block when building your keyword list?
One of the best steps is to read your existing web site and your sales collaterals. A careful reading will often pick up alternate phrasing of common terms, and these should go into your list.
If this works for your web site and collaterals, it also works for your competitors' web sites. I'm not recommending that you lift another site's list of keywords intact, but seeing how other people in the industry phrase things can be useful in opening up your own thinking. Industry trade magazines and product reviews are also great sources of fresh phrasing.
And don’t forget the thesaurus. If you’re stuck building a list of “real estate” words, a simple thesaurus will tell you to think of “property” and “realty” words. Programs like WordTracker can also help uncover related terms that you may not have considered.
Finally, don’t forget that your customers are a great source of keywords. This can be both a source of alternate terminology, and another check against the jargon trap. If you can’t run a focus group or conduct customer interviews, then talk to the people who interact with your customers. That means talking to your sales force, your user support staff, or anyone who has regular customer contact.
Brainstorming is all about thinking outside the box. The more effort you make to interact with new people inside and outside of your industry, the better job you’ll do breaking down the barriers to creative thinking, and the better set of keywords you’ll have for your web site.
Author Bio:
Christine Churchill is President of KeyRelevance.com a full service search engine marketing firm. She is also on the Board of Directors of the Search Engine Marketing Professional Organization (SEMPO) and serves as co-chair of the SEMPO Technical Committee.
Creating your ads for AdWords
Two of the most important factors in any Pay Per Click (PPC) campaign are creating successful ads and deciding how much to pay per click. There are many PPC options out there to choose from; I am going to focus on the two most popular, Google AdWords and Overture.
Creating your ads for AdWords
Creating your ad copy is the single most important part of any ad campaign. You want your ad to stand out from the others and scream 'click me!' If your ad looks like and says the same thing as everyone else's, users will simply pass it by.
Before creating your ads you need to determine your target market and keyword selections. If your company focuses on a specific market niche, target your ads to that niche. Properly targeted ads will almost always out-perform those directed at a general audience.
When creating your first ad, be sure to fit your main keywords into either the title or the beginning of the body text. Draw attention with call-to-action phrases and words that provoke enthusiasm and response: "Save on DVDs," "Get cheap stereos," "Join now for a 20% discount," and so on. Just be cautious: if you advertise something that you don't offer, Google will pull your ad. If your ad says you have something for free, you'd better have something for free listed on your landing page! Always be sure to follow Google's guidelines.
Once you are happy with your first ad, create 3 more ads that are radically different from it. After 3 or 4 days, take a look at how your ads are doing. (If you are using less frequently searched terms, you may have to wait 1-2 weeks for meaningful results.) Check the click-through rate (CTR) of each ad. In most cases, one of the 4 will be out-performing the rest. If so, delete the poorly performing ads and create 3 new ads that closely resemble the successful one, each with subtle differences in the title and body text.
Again, wait 3 or 4 days to see which ad outperforms the rest. If one again stands out, repeat the process. Eventually you will end up with 4 quality ads performing equally well. Once the ads have leveled out, continue to keep an eye on them (I recommend daily); if one begins to slip, slightly tweak the wording. You must always watch your ads if you want them to keep performing well.
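Click-through rate is simply clicks divided by impressions. If you want to compare ads outside the AdWords interface, here is a throwaway JavaScript sketch; the ad names and numbers are invented for illustration:

// Hypothetical stats for 4 test ads: clicks and ad impressions
var ads = [
    { name: "Ad A", clicks: 42, impressions: 2100 },  // 2.00% CTR
    { name: "Ad B", clicks: 18, impressions: 2050 },  // 0.88% CTR
    { name: "Ad C", clicks: 57, impressions: 1900 },  // 3.00% CTR
    { name: "Ad D", clicks: 21, impressions: 2200 }   // 0.95% CTR
];
var best = ads[0];
for (var i = 1; i < ads.length; i++) {
    if (ads[i].clicks / ads[i].impressions > best.clicks / best.impressions) {
        best = ads[i];
    }
}
// "Ad C" wins here; delete the others and write 3 close variations on it
alert(best.name + ": " + (100 * best.clicks / best.impressions).toFixed(2) + "% CTR");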
Determining your Max Cost Per Click with AdWords
With AdWords, when you enter your MAX CPC, Google will show you an estimate of your average position for each keyword. (The position predictions provided by Google are based on historical data from previous advertisers and are not 100% accurate, but they will give you an idea of what to expect.)
Unfortunately there is no way to see what the competition is paying, so in most cases it's a bit of a duck hunt in the beginning. I suggest starting out with a MAX CPC slightly higher than you normally would; this will give you a slightly higher ranking and increase your chances of accumulating clicks. If your ad performs really well, your rank will increase. As you establish a good click-through rate (CTR), you can adjust your max CPC to reflect the position you wish to hold. (See part one of this article to find out how Google ranks ads.)
Creating your ads for Overture
With Overture, writing the perfect ad is slightly different than with AdWords. Overture only allows you to create one ad per keyword, which takes away the option of trying out various ads and going with the obvious winner; however, the basis for creating your initial ad remains virtually the same. After you have selected your target market and main keywords, write a specific ad targeting each individual keyword, and be sure to include the keyword in the title or the beginning of the main body text, along with a call-to-action phrase or something else sure to draw attention. Remember to check the status of your ads on a weekly basis: keep an eye on your click-through rate and regularly tweak poorly performing ads.
Determining your Max Cost Per Click with Overture
Deciding how much to spend on Overture is simple: take a look at what the competition is spending, and outbid them. With Overture you should always try to be in the top 3 if you want your ad dispersed among partner sites (Yahoo, Lycos, MSN, etc.). If the number 1 spot is currently paying 25 cents per click, you need only bid 26 cents to grab the number 1 spot. If you want the number one spot but are willing to pay more, you can bid 40 cents and will still only be charged 26 cents, one penny above the competition. Keep in mind, though, that if someone else increases their bid, your actual cost will also increase, up to the max CPC you have entered.
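The Overture pricing rule from the example above fits in a couple of lines of JavaScript. Working in whole cents sidesteps floating-point rounding; the numbers below are the ones from the example:

// You pay one cent above the next-highest bid,
// never more than the max CPC you entered.
function actualCostCents(yourMaxBidCents, nextHighestBidCents) {
    return Math.min(yourMaxBidCents, nextHighestBidCents + 1);
}
alert(actualCostCents(40, 25)); // 26 -- the #1 spot costs you 26 cents
alert(actualCostCents(26, 25)); // 26 -- the minimum bid that takes #1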
Summary
Managing an AdWords or Overture PPC campaign can be confusing at first, but it doesn’t take long to get a handle on what works. Creating a highly successful ad the first time around with either AdWords or Overture is a rare occurrence, but with a bit of regular maintenance and a well targeted campaign it won’t take long to start seeing results.
Author Bio:
Aaron Turpen is the proprietor of Aaronz WebWorkz, a full-service online company catering to small and home-based businesses. Aaronz WebWorkz offers a wide variety of services including Web development, newsletter publishing, consultation, and more.
Banned from Google?
Many web site owners are under the false impression that if they can't get their web site listed on a search engine, they must be banned from that engine. Banned from Google? How will my site ever get seen? Google, the world's most popular search engine, handles over 60 percent of all search engine queries; losing that level of exposure is life-threatening for a web site. There are many indicators web site owners should be aware of in order to troubleshoot the position of their site. For instance, I have submitted hundreds of sites, and after two or three weeks the home page will usually show up in the search engine listings; some sites can take twice as long. A rule of thumb I go by: if you can't find your home page on a search engine a month after the original submission date, something may be wrong with your site.
If you think your site might be banned, then you have to ask yourself some fundamental questions.
1. Is your domain newly purchased, with NO previous owners? If the site is new, then there should be no reason to worry. If after a month your site is still not listed on the engine, consider having a web site optimization company optimize your home page.
2. Are you employing any TRICKS? Did you do anything that could have resulted in your site being banned? In other words, are you trying to trick the engine into indexing your site higher than it should actually rank? Are you link farming, using doorway pages unwisely, using hidden text, cloaked pages, redirects, plagiarized content, etc.? Please see my article "Spamming the Engine and The Ultimate Cost" for more detailed information. If so, these tricks need to be corrected before resubmitting the site to a search engine.
3. Did you purchase this domain from a previous owner, and is it possible that the previous owner got the domain banned from the engine? The previous owner may have tried to trick the engine into falsely indexing the domain and subsequently gotten it banned. If this is a possibility, then you need to troubleshoot further.
Are there a large number of broken links on your home page and throughout your web site? If so, these will definitely have to be fixed before an engine or directory will include the site in its index. If you are not sure about broken links, you can purchase software that will scan your site, or have a search engine promotion company check your web site for broken links.
4. Is your site optimized for loading? Is your web site full of graphics and taking a long time to load? If dial-up users won't wait for your pages to load, then neither will the search engine spiders. An industry consensus and a good rule of thumb is to keep each page to about 30KB.
5. Are you using your robots.txt file properly, if at all? Try entering your site into your web browser like this: http://www.yourdomain.com/robots.txt. If a 404 error ("The page cannot be found") is displayed, then you're not using one. If content comes up, be sure you understand what you're looking at and that you are not blocking spiders from even reaching your home page; a minimal example follows.
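For reference, a minimal robots.txt that lets every spider crawl the whole site is just two lines. An empty Disallow line means "block nothing"; by contrast, "Disallow: /" would block your entire site from all compliant spiders:

User-agent: *
Disallow: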
If you think there is a good possibility your site has actually been banned due to unethical marketing and promotion practices, do the following to verify.
* If a couple of months have passed since the first submission of your home page and you find zero results in the Search Engine Results Pages (SERPs), that is a very good indication that your domain has been cleaned out of the index.
* You may also use the Google Toolbar (http://www.toolbar.google.com) to check the PageRank (PR) value of your site. A zero PageRank (PR0) doesn't necessarily mean you're banned, but it is another possible indication, and further troubleshooting will be required.
* Try to get a couple of web sites to link to yours, and re-submit the pages your links are on to Google. Wait a few weeks, then search for the site that links to yours using your keywords and domain together. If you can locate those submitted pages but your domain still doesn't come up, there is a good chance your site may be banned.
* Another alternative is to look for your listing in the Google version of the Open Directory Project (directory.google.com). Verify that your site is even listed in the ODP. If the search comes up blank, there is a chance you may have been banned or penalized.
After you have corrected all possible errors and your site is still not listed on Google, you can try writing to Google at search-quality@google.com. Be sure to ask about the specific domains. Don't expect a quick response, and wait at least a couple of weeks before asking for an update; Google is very busy with hundreds of these types of requests, and you wouldn't want them to put you at the bottom of the pile for being too annoying. By all means don't try to sneak any tricks past the engineering team at Google: they've seen them all, and you'll definitely risk your chances of ever getting indexed again. If you don't have the time to do the legwork, hire a reputable search engine optimization company and let them handle your domain's predicament.
Author Name:
Alex Skorohodov of KosmosCentral
JavaScript Links
JavaScript is a wonderful technology, but it’s invisible to all of the search engines. If you use JavaScript to control your site’s navigation, spiders may have serious problems crawling your site.
Links contained in your JavaScript code are likely to be ignored by search engine spiders. That’s especially true if your script builds links by combining several script variables into a fully formed URL.
For example, suppose you have the following script that sends the browser to a specific page in your site:
function goToPage(page) {
    // Append a tracking code to the URL, then send the browser there.
    // Note: trackingCode is a global variable defined elsewhere on the page.
    window.location = "http://www.mysite.com" + page
        + "?tracking=" + trackingCode;
}
This script uses a function called goToPage() to add a tracking code onto the end of the URL before sending visitors to the page.
I've seen sites where every link on every page ran through a JavaScript function such as this one. In some cases the JavaScript is used to include a tracking code; in other cases it's used to send users to different domains based on the page. But in all of these cases, the site's home page is the only page listed in the search engines.
None of the spiders includes a JavaScript parsing engine that would allow it to interpret this type of link. Even if a spider could interpret the script, it would be difficult to simulate the different mouse clicks that trigger goToPage() with different values of the page variable, and the spider has no idea what value to use for trackingCode.
Spiders will either ignore the contents of your SCRIPT tag, or else read the script content as if it were visible text.
As a rule of thumb, it’s best to avoid JavaScript navigation.
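If you need the tracking code, one search-engine-friendly compromise is to give every link a real HREF for spiders to follow, and let JavaScript intercept the click in ordinary browsers. This is just a sketch, reusing the hypothetical goToPage() function and site from above:

<!-- The plain HREF gives spiders a real link to crawl. Browsers with
     JavaScript run goToPage() instead and pick up the tracking code;
     "return false" stops them from also following the HREF. -->
<a href="/products.html" onclick="goToPage('/products.html'); return false;">Our Products</a>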
Can Spiders Crawl Your Web Site?
Fundamental to ensuring your web site appears in the search engine listings is making sure that search engine spiders can successfully crawl it. After all, if a spider can't reach your pages, the search engine can't include them in its search listings.
Unfortunately many web sites use technologies or architectures that make them hostile to search engine spiders. A search engine spider is really just an automated web browser that must interpret your page’s underlying HTML code, just like a regular browser does.
But search engine spiders are surprisingly unsophisticated web browsers. The most advanced spiders are arguably the equivalent of a version 2.0 web browser. That means the spider can’t understand many web technologies and can’t read parts of your web page. That’s especially damaging if those parts include some or all of your page’s links. If a spider can’t read your links, then it won’t crawl your site.
As a search engine marketing consultant, I’m often asked to evaluate new web sites soon after their launch. Search engine optimization is often neglected during the design process, when designers are focused on navigation, usability, and branding. As a result, many sites launch with built-in problems, and it’s much harder to correct these issues once the site is complete.
Yet often it's only when their site fails to appear in the search engine listings that many companies call in an SEO.
That's a shame, because for small businesses, search engines are by far the most important source of traffic. Fully 85% of Internet users find sites through search engines. A web site that isn't friendly to search engines loses much of its value.
In this article, I’ll give an overview of the issues that can keep a search engine spider from indexing your site. This list is by no means exhaustive, but it will highlight the most common issues that will keep spiders from crawling your pages.
DHTML Menus
DHTML drop-down menus are extremely popular for site navigation. Unfortunately, they’re also hostile to search engine spiders, since the spiders again have problems finding links in the JavaScript code used to create these menus.
DHTML menus have the added problem that their code is often placed in external JavaScript files. While there are good reasons to put your script code into external files, some spiders won’t fetch these pure JavaScript files.
If you use DHTML menus on your site and want to see what effect they have on search engines, try turning JavaScript off in your browser. The drop-down portions of your menus will disappear, and there's a chance the top-level menus will disappear too. Yikes! Suddenly most of the pages in your site are unreachable. And that's exactly how they look to the search engines.
Query Strings
If you have a database-driven site that uses server-side technologies such as ASP, PHP, Cold Fusion, or JSP, there’s a good chance your URLs include a query string. For example, you might have a URL like this one:
http://www.mysite.com/catalog.asp?item=320&category=23
That’s a problem, because many search engine spiders won’t follow links that include a query string. This is true even if the page that the link points to contains nothing but standard HTML. The URL itself is a barrier to the spider.
Why? Most search engines have made a conscious design decision not to follow query string links, because such links require additional record keeping by the spider. Spiders keep a list of all the pages they've crawled and try to avoid indexing the same page twice in a single crawl. They do this by comparing every new URL to the list of URLs they've already seen.
Now, suppose the spider sees a URL like this one in your site:
http://www.mysite.com/catalog.asp?category=23&item=320
This URL leads to the same page as our first query string URL, even though the URLs are not identical (notice that the name/value pairs in the query string are in a different order).
To recognize that this URL leads to the same page, the spider would have to decompose the query string and store each name/value pair. Then, each time it sees a URL with the same root page, it would need to compare the name/value pairs of its query string to all the previous ones it has on file.
Keep in mind that our example query string is fairly short. I’ve seen query strings that are 200 characters long, and reference a dozen different name/value pairs.
So indexing query string pages can mean a great deal of extra work for the spider.
Some spiders, such as Googlebot, will handle URLs with a limited number of name/value pairs in the query string. Other spiders will ignore all URLs containing query strings.
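To illustrate the record keeping involved, here is a rough JavaScript sketch, not any engine's actual code, of one way a spider could normalize query strings so the two example URLs above compare as equal:

// Sort the name/value pairs so differently-ordered query strings
// reduce to the same canonical URL.
function canonicalize(url) {
    var parts = url.split("?");
    if (parts.length < 2) return url;   // no query string to normalize
    var pairs = parts[1].split("&");
    pairs.sort();                       // order the pairs alphabetically
    return parts[0] + "?" + pairs.join("&");
}
// Prints http://www.mysite.com/catalog.asp?category=23&item=320,
// the same result both example URLs reduce to.
alert(canonicalize("http://www.mysite.com/catalog.asp?item=320&category=23"));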
Flash
Flash is cool; in fact, it's much cooler than HTML. It's dynamic and cutting-edge. Unfortunately, search engine spiders use trailing-edge technology. Remember: a search engine spider is roughly equivalent to a version 2.0 web browser. Spiders simply can't interpret newer technologies such as Flash.
So even though that Flash animation may amaze your visitors, it’s invisible to the search engines. If you’re using Flash to add a bit of spice to your site, but most of your pages are written in standard HTML, this shouldn’t be a problem.
But if you’ve created your entire site using Flash, you’ve got a serious problem getting your site into the engines.
Frames
Did I mention that search engine spiders are low-tech? That's right: they're so low-tech they don't understand frames either. If you use frames, a search engine will be able to crawl your home page, which contains the FRAMESET tags, but it won't be able to reach the individual FRAME pages that make up the rest of your site.
In this case, at least, you can work around the problem by including a NOFRAMES section on your home page. This section of your page will be invisible to anyone using a frames-capable browser, but allows you to place content that is visible to the search engines and other frame-blind browsers.
If you do include a NOFRAMES section, be sure to put real content in there. At a minimum you should include standard hypertext links (A HREF) pointing to your individual frame pages.
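Here is a bare-bones sketch of a framed home page done right; the file names are hypothetical:

<frameset cols="20%,80%">
  <frame src="nav.html">
  <frame src="main.html">
  <noframes>
    <body>
      <!-- Visible to spiders and frame-blind browsers only -->
      <p>Welcome to our site. Start with our <a href="nav.html">site map</a>
      or go straight to the <a href="main.html">main content</a>.</p>
    </body>
  </noframes>
</frameset>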
It’s surprising how often people include a NOFRAMES section that simply says “This site requires frames. Please upgrade your browser.” If you’d like to try an experiment, run a query on Google for the phrase “requires frames.” You’ll see somewhere in the neighborhood of 160,000 pages returned, all of which include the text “this site requires frames.” Each of these sites has limited search engine visibility.
With www or without www?
The address of my web site is www.keyrelevance.com, but can people reach it if they leave off the initial "www"? For most server configurations the answer is "yes," but for some the answer is "no." Make sure your site is reachable both with and without the www.
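The fix depends on your server configuration. On an Apache server with mod_rewrite enabled, for example, three lines in an .htaccess file (with your own domain substituted) will redirect bare-domain requests to the www version. Treat this as a sketch to adapt, not a universal recipe:

RewriteEngine On
RewriteCond %{HTTP_HOST} ^keyrelevance\.com$ [NC]
RewriteRule ^(.*)$ http://www.keyrelevance.com/$1 [R=301,L]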
This list presents some of the most common reasons why a search engine may not be indexing your site. Other factors, such as the way you structure the hierarchy of your web pages, will also affect how much of your site a spider will crawl.
Each of these problems has a solution, and in future articles I’ll expand on each one to help you get more of your pages indexed.
If you’re currently redesigning your web site, I’d encourage you to consider these issues before the site goes live. While each of these search engine barriers can be removed, it’s better to start with a search engine friendly design than to fix hundreds of pages after launch.
Author Bio:
Christine Churchill is President of KeyRelevance.com a full service search engine marketing firm. She is also on the Board of Directors of the Search Engine Marketing Professional Organization (SEMPO) and serves as co-chair of the SEMPO Technical Committee.
Top Tips To Pay-Per-Click Search Engine Success
Pay-per-click (PPC) search engines are a highly effective and inexpensive way to attract targeted traffic to your web site. With the most popular PPC search engines (www.overture.com and www.adwords.google.com) you can often buy this search engine traffic for 5 or 10 cents per click. With some of the less popular PPC search engines, it is possible to bid on keywords for as little as $0.01 per click. That means you can get 500 visitors to your website for just $5!
Below are some strategies that you can use to consistently get a steady flow of traffic to your website and increase your sales.
Focus On One Item Or Product That You Want To Promote.
Promoting your entire list of programs or products at once will be more expensive and bring you less targeted traffic. Instead, choose one product and promote that; you can promote other products or programs separately. Each click-through should direct your visitor to the specific program or product that you're selling. This ensures less competition in bidding for very focused keywords and a higher likelihood of reaching genuinely interested customers. Avoid directing your customers to a web page that is likely to confuse them or where they'll have to start looking for the advertised product; most customers simply do not have the time.
Find Out What The Traffic Is Like For Your Desired Keywords
Most pay-per-click search engines have a page where you can see what was searched for in the previous month, and how many times. Remember, a keyword that was popular in June may not be as popular in July or August, so these reports should be used only as a guide. I have found that any relevant keyword attracting more than 50 searches per month is worth bidding on.
Strategize Before Placing Your Bid
Now that you know what you want to promote and which keywords to use, you need an effective strategy to make your campaign profitable. As a general guide, avoid bids that are too high unless you have a huge advertising budget and your product converts well. Instead, aim to bid on less popular, low-cost, relevant keywords; they produce a better return than the popular ones.
For The More Popular Keywords, Bid For A Lower & Less Expensive Position
When you search on a keyword, most PPC search engine results will also show you what each advertiser is paying for their position. Remember, you don't always have to be first. If one advertiser is paying $0.25 for the first position and the second advertiser is paying $0.12, you can get third place for only $0.11. A good tip is to aim for the first page, even if you're in position 15! Most surfers quickly scroll through the entire page, and they do not necessarily click on the first or second listing.
Research has shown that listings at the top and bottom of the page are more likely to be noticed than those in the middle. With Google Adwords, increasing either your maximum cost-per-click (CPC) or your ad’s clickthrough rate (CTR) will improve your ad’s position.
Write Interesting Ad Copy
The headline and copy of your text are your one shot at getting leads and customers interested. You want to tell them what you are promoting and why they should visit your site. Remember, the goal is interested customers (good leads), not just anybody. Give your potential customers an added incentive by giving away something FREE! People love free stuff! You can also include deadline and discount information when applicable; with the Google AdWords program, you can do this in real time.
Only Bid On Keywords That Are Relevant To Your Campaign
This point cannot be over-emphasized. All the pay-per-click search engines have rules about how relevant keyword bids must be: generally, you cannot bid on keywords that have nothing to do with what they link to. In any case, bidding on words that are not related to your promotion is a bad strategy. It may result in clicks that you pay for but that do not bring interested customers or leads.
Bid On As Many Relevant Keywords As You Can
You catch more fish with a net than you do with a single line. You can often get great, targeted clicks for only a penny or less if you bid on less popular but relevant keywords. If you take the time to bid on a few words a day, you'll have a powerful, inexpensive campaign running within a few days. Don't forget to try more than one PPC search engine as well.
Track Your Keywords And Your Entire Marketing Campaign
You can do this by checking your reports. Each pay-per-click service offers them, and they will tell you how well each keyword is driving traffic. You can then fine-tune your efforts by changing the value of your bids or adding and deleting keywords. Regularly check your keyword positions and do a lookup on your keywords: if you have picked popular words, you will notice that your bids change position as others bid on them too. Expect results that match your effort.
If you take a little extra time to apply these points, you should see good results within a short period of time, depending on your efforts.
It has been said that pay-per-click search engines (like all Internet marketing efforts) are like fishing: you must cast hundreds of times to catch a few fish. So be patient, and remember that pay-per-click advertising, like ezine advertising, is an effective approach that can increase sales at a very low cost.
Author Bio:
Ben Chapi owns Venister Home business and Affiliate Program Classifieds at http://www.venister.org. He is also webmaster for http://www.best-debt-consolidation-loan.co.uk and http://www.home-equity-loan.org.uk/
What Is PR?
First of all, PR stands for PageRank, one of Google's ways of determining the importance of a website. To view the PR of any website, download the Google toolbar; the PR is displayed as a green bar in its center. Why is PR so important? PR helps determine how high you will be listed for your keywords, but it is no sure thing: even with a PR of 10/10 you might not get top placement for your keywords if you do not use correct search engine optimization. The techniques I am about to describe are free, so don't worry if you have a zero budget.
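For the curious, the published PageRank formula is PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)), where T1..Tn are the pages linking to A, C(T) is the number of links going out of T, and d is a damping factor, usually 0.85. Here is a toy Python version run on a made-up three-page web; keep in mind the toolbar's green 0-to-10 bar is a coarse, roughly logarithmic rescaling of raw values like these:

    # Toy PageRank calculation on an invented three-page link graph.
    # links maps each page to the pages it links out to.
    links = {
        "A": ["B", "C"],
        "B": ["C"],
        "C": ["A"],
    }

    d = 0.85  # damping factor from the original PageRank paper
    pr = {page: 1.0 for page in links}  # initial guess

    for _ in range(50):  # iterate until the values settle
        pr = {
            page: (1 - d) + d * sum(
                pr[src] / len(links[src])   # each linker passes on a share
                for src, outs in links.items()
                if page in outs             # ...of its own PageRank
            )
            for page in links
        }

    for page, score in sorted(pr.items()):
        print(f"{page}: {score:.3f}")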
PageRank Building Technique #1
The first PR building technique I am going to talk about is directory listings. Most web directories have a PR of only 1 or 2 on the pages where you will actually be listed, but many have a PR of 6 or 7; DMOZ, for example, has great PR that can really boost your website's PR. In addition to boosting your PR, web directories also bring in some decent traffic, especially if you get listed in the right category. Picking the most relevant category matters more than anything if you want targeted visitors: if you ran a webmaster website, you would not want to be listed under a general Internet category, where the people clicking the link are not interested in your website. Finally, if the most relevant category includes one of your keywords, the listing can also boost your search engine position for that keyword.
PageRank Building Technique #2
The second-best PR building technique is reciprocal links. Reciprocal linking is when one site links to another and vice versa, and it benefits both websites as long as they are not competitors. Reciprocal links on link pages do not bring much traffic, but the link pages themselves normally have a PR of 0 to 6.
Finding reciprocal link partners can be hard. The trick to landing one is to customize your email and offer something that other websites don't. For example, offer to email all of your opt-in users once every other month to advertise your reciprocal link partner, and ask if they would do the same for you. Main page link exchanges seem to help PR the most and bring the most visitors, which is the best of both worlds.
The best reciprocal link partners, in my opinion, are forums. If you have a big website with tons of pages, I suggest emailing forum webmasters and asking for a full-website reciprocal link exchange: ask them to add code to their forum so that your website shows up at the bottom of every single page, while you do the same for them. You have the advantage here, because a forum adds another page or two every day, while a typical website adds roughly one page a week, giving you the chance to raise your PageRank by leaps and bounds. I got a link from a forum to my website and it raised my PageRank from 4 to 6!
PageRank Building Technique #3
That brings me to my last useful PR building technique: forum signatures! This technique is only useful if you participate in forums at least occasionally; if you do not, you could take a few minutes out of every day to reply to people's posts. On some forums this technique does not work because signatures are disallowed. If they are allowed, put no more than two of your links and a short description in your signature, and keep it to a maximum of four lines; any longer and the administrator may ban you from using a signature, or people will get annoyed and think you are spamming. One link from a forum will do little on its own, because forum posts normally have a PR of 0, though the more controversial topics occasionally rate higher. A single PR 0 link is worth almost nothing, but hundreds of them can help raise your PR and bring traffic from forum members interested in your website's description.
Author Bio:
Aaron Turpen is the proprietor of Aaronz WebWorkz, a full-service online company catering to small and home-based businesses. Aaronz WebWorkz offers a wide variety of services including Web development, newsletter publishing, consultation, and more.
Will WebPosition Get My Site Banned from Google?
In mid-November of 2003, Google seriously revamped their ranking algorithm. As a result, many sites were dropped from their index or fell dramatically in rank. This infuriated many Web site owners at the height of the holiday buying season. Since that time, many accusations have been thrown at Google as to why this happened. Some say it's a plot to encourage people to buy AdWords listings. Others have even theorized that WebPosition is somehow to blame. Still others cite more traditional causes.
As soon as Google changed their algorithm, many WebPosition Gold customers whose sites had dropped contacted me demanding an explanation. They wanted to make sure their sites were not dropped because they had used WebPosition Gold. I reassured them that this was not the case. I went on to explain that many thousands of sites were dropped that don’t even use WebPosition Gold. Many of our customers even saw their rank increase. In addition, most of the time the site had not actually been banned from the index. It had simply dropped in rank.
In this article, I will attempt to dispel many of the pervasive myths regarding WebPosition Gold and Google. I've used WebPosition for years on my own site and for clients, and I've also helped provide technical support to others using the product. Having been on both sides of the fence, I feel uniquely qualified to address the most common questions that tend to come up:
Will running automated Reporter Missions on Google get my site banned?
No. Despite repeated rumors, when running a Reporter Mission, WebPosition Gold does not pass personal information such as your name, address, email, Web site URL, or domain name to Google. Instead, it conducts queries as a normal browser would and then examines the results offline. As a result, Google cannot determine whether you are running a query relating to a specific domain. The only information passed to Google is your IP address, and in most cases your Web site's IP address is different from the IP address your ISP (Internet Service Provider) assigns to your connection. So how can Google connect the two? Simply put, it can't.
Google states on their FAQ page that they do not recommend running automated queries against their service because it uses up server resources. Yet most businesses find it impractical not to measure their search engine rankings at least occasionally, and checking rankings by hand in Internet Explorer would yield the same number of queries on Google for the same keyword list anyway. Most businesses optimizing their Web sites therefore use some kind of automated tool to monitor their progress and measure their visibility. Having worked as a search engine marketer for many years, I've found that the best policy is simply to be sensitive to the needs of the search engines: avoid being "abusive" in your practices, whether in your optimization strategies, your submissions, or your rank management.
Therefore, when using WebPosition, I often recommend the following strategies:
1. Avoid excessive numbers of queries if you choose to check your rankings on Google. Most people do not have time to improve their rankings on hundreds of keywords, so there's no need to rank-check hundreds of keywords if you cannot act on that many rankings anyway. While your site won't be banned for excessive queries, Google could block the IP address you use to connect if it found your query volume excessive. This is true regardless of what tool you use, even a browser.
It has been my experience that a blocked IP is extremely rare, even among consultants conducting rank checks for dozens of clients. Presumably, Google would not want to accidentally block an IP that generates a large volume of queries simply because it's shared by many different users. Even so, it's always a good idea to practice a little common sense.
2. If you choose to run queries, try to run most of your queries at night and during off-peak periods, which is something Google has suggested in the past. This is when many of their servers are presumably standing idle, waiting to handle the increased volume during peak periods. The WebPosition Scheduler makes this easy to do.
3. Do not run your queries more often than is really necessary. Since Google normally doesn’t update their entire index more than once a month, there’s limited benefit to checking your rankings more often than that.
4. As an alternative to Google, consider checking your Google rankings using Yahoo Web Matches or another Google "clone" engine in the Reporter. Although these rankings can vary slightly from Google.com, they're normally close enough to give you a very good idea of your actual Google rankings without checking Google directly.
5. With WebPosition Gold 2, you can also use the "Be courteous to the search engines" feature on the Options tab of the Reporter so you don't query their service so quickly. This gives you added peace of mind not found in many other automated tools, assuming you don't mind your missions taking longer to run. The Submitter has a similar feature to submit randomly at various time intervals.
Can I use WebPosition Gold to get my competitors banned from Google?
No. If running automated queries on Google with WebPosition Gold could get a site banned, you could use it to get your competitors banned from Google. However, this is not the case.
Google even verifies this on their web site. They don't specifically name WebPosition Gold, but they do state that there is nothing you can do to get your competitors banned from Google. For more information, please see the "Google Facts and Fiction" document on Google's site.
Will over-submitting my site get me banned?
No. Many people think that Google will ban your site if your submissions exceed the recommended daily limits. If that were the case, we could over-submit our competitors' sites and easily get them banned from Google.
Google is very clear on this and states outright that over-submitting will not get you banned. Even so, some of your submissions might be ignored or discarded if they break the rules. Therefore, I recommend using the "Slow Submit" option in WebPosition Gold's Submitter and staying within WebPosition's recommended daily limits. Some people argue that manual submissions are best, but manual submissions can't warn you if you inadvertently over-submit, make a typo in your submission, or forget what you submitted and when.
For achieving top rankings and staying indexed long-term, the best submission technique may be not to submit at all. Instead, try to establish third-party links to your Web site and wait for Google's spider to find you on its own. WebPosition's Page Critic offers numerous strategies for doing this.
Will Doorway or Entrance pages get me banned from Google?
That depends on whether the pages contain spam. If your definition of a doorway page is a page full of irrelevant or duplicate content and excessive keyword use, then yes, you could find your site banned. That is how Google often defines a doorway page, which is why the term has developed a negative connotation over the years.
If your optimized page is nothing more than an extension of your main web site that happens to contain search-engine-friendly content, then you'll be fine. In fact, you'll be rewarded for the effort with top rankings. The key is not whether you label a page a doorway, entrance, optimized, informational, or "whatever" page; the key is whether the page contains quality, relevant content that gives the search engine what it wants to see.
Google mentions that they discourage the use of "doorway" pages because they fear webmasters will optimize for keywords that are not relevant to a page's content. This is a legitimate fear, as Google is in the business of providing relevant results to its visitors. However, if you create pages that contain what Google is looking for, then Google will not penalize those pages or view them differently from any other page on your site.
With this in mind, here are a few of my tips on creating Google-friendly pages:
1. Always Include Relevant Content – Make sure that the content on each of your pages is relevant to your site. Many sites have various resources on a number of different topics. This is fine, as long as the overall theme for your Web site is solid. I would also suggest that you organize your related content into individual directories. Some businesses find it beneficial to organize each sub-theme of their site into a separate domain so they can cross-link the domains. If you do this, make sure you have links from other sites as well.
2. Avoid Duplicate Content – Create each page with unique content. If you are targeting different search engines for the same keyword, you may end up with very similar content on certain pages. If so, you can create a robots.txt file to tell each search engine's crawler not to index a page or directory that was created for another search engine; a sample robots.txt appears after this list. See the October 2000 issue (http://www.marketposition.com/mp-1000.htm#THREE) of MarketPosition for more information on creating a robots.txt file.
3. Avoid Keyword Stuffing – Creating pages that excessively repeat your keyword phrase is definitely not a good idea. This almost always will throw up a red flag to the search engine and is one of the most common forms of "spamming." How many keywords is too many? See WebPosition's Page Critic for up-to-date, specific recommendations regarding how many words and keywords are recommended in each area of your page.
4. Design Good Looking Pages – Although Google cannot tell if your page is aesthetically pleasing, it is recommended that you create pages that look good and fit the theme of your Web site. This will definitely increase the click through rate from the arrival page to the rest of your Web site.
5. Avoid Using Hidden Image Links – Many site owners think they can fool Google by including transparent 1×1 pixel image links on their home page that point to their optimized pages. These are very small images contained in a hyperlink that are not visible to the naked eye. This can get your page dropped from Google’s index.
6. Avoid using links that have the same color as the background on your page – Many site owners try to hide the links on their home page by making the text color the same as the background color of the page. As with the scenario above, this can also get your page banned from Google.
7. Avoid Using JavaScript Redirection Techniques – Many Web site owners have used JavaScript to redirect visitors to another page while still allowing Google to crawl the page containing the JavaScript code. This worked for a while, but Google eventually caught on. Other forms of redirection, such as IP cloaking, are also frowned upon by Google.
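As promised in point 2, here is a minimal robots.txt example. The directory names are hypothetical; the file belongs in your site's root directory, and each record tells the named crawler which paths to stay out of:

    # robots.txt - keep Google's crawler out of pages written for other engines
    User-agent: Googlebot
    Disallow: /altavista-pages/

    # Keep all other crawlers out of the Google-specific pages
    User-agent: *
    Disallow: /google-pages/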
In Summary:
The rules regarding each search engine change routinely. That's why WebPosition's Page Critic is updated monthly to keep pace. As a search engine marketer, it's critical that you keep informed as to the latest search engine rules and strategies.
It’s also important to understand that WebPosition Gold is only a tool. When used properly, it will not get you banned or blocked, and will in fact improve your rankings dramatically. However, as with any tool, you can choose to ignore its recommendations and to go your own way. For example, you can use a hammer to build a fine house, or you can take that same hammer to knock a bunch of holes in someone?s wall. Ultimately, this call is up to you, the user of the tool.
Author Bio:
This article is copyrighted and has been reprinted with permission from Matt Paolini. Matt Paolini is a Webmaster/Tech Support Specialist for FirstPlace Software, the makers of WebPosition Gold. He’s also an experienced freelance Search Engine Optimization Specialist and Cold Fusion/ASP.NET/SQL Server developer/designer. For more information on his services, please visit http://www.webtemplatestore.net/ or send him an email at webmaster@webtemplatestore.net