Higher Google Adwords Clickthrough Rates
1. Target your ads to the right Audience.
You do this by selecting keywords and phrases which are relevant to your product or service. Avoid keywords that are too general because although they generate a large number of impressions, they often generate very few clicks. To improve your CTR, use more descriptive phrases so that your ads will only appear to prospective customers searching for what you have to offer.
2. Use the correct keyword matching option(s).
Google offers four different methods of targeting your ads by keywords: Broad Match, Phrase Match, Exact Match and Negative keyword. By applying the most focused matching options to your keywords, you can reach more targeted prospects, improve your CTR, reduce your cost-per-click and increase your return on investment.
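To make that concrete with a purely hypothetical keyword: broad match is entered as plain text (home equity loan) and can show your ad for searches containing those words in any order; phrase match is wrapped in quotation marks (“home equity loan”) and shows your ad only when the search contains that exact phrase; exact match is wrapped in square brackets ([home equity loan]) and shows your ad only when that phrase is the entire search; and a negative keyword is prefixed with a minus sign (for example -free) to stop your ad from showing on any search containing that word.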
3. Target your ads by location and language.
When creating your AdWords campaign, target your ads by location so that you maximise your sales and improve your CTR. Target the right audience by selecting the languages and countries that you want to reach.
4. Use your main keywords in the Title or Body Text of your ad.
By using your keywords in the title or body text of your ad, your ad will stand out from your competitors’ and grab the eye of your prospective customers.
5. Create different Ad Groups for different search phrases/keywords.
This will allow you to refine your ads and test them for relevance, and therefore maximise your clickthrough rates. For example, if your service offers loans, you can create different ad groups for home equity loans (and all other phrases that incorporate that term), consolidation loans, student loans and so on.
6. Calculate what you can afford to pay for each clickthrough.
You will find that more focused keywords and search phrases have a higher conversion ratio than other more general keywords. It’s a good strategy to pay more for clicks from keywords or phrases with a high conversion ratio than from the more general keyword groups.
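As a hypothetical illustration of that calculation: if each sale is worth $50 in profit and a focused phrase converts one visitor in fifty (a 2% conversion ratio), a click on that phrase is worth up to $50 × 0.02 = $1.00, so bids much above a dollar lose money; a general keyword converting only one visitor in five hundred would justify no more than about $0.10 per click.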
7. Use highly targeted Keywords and search phrases.
Be specific when selecting keywords and search phrases for your campaign. General keywords will be more expensive and will result in lower clickthrough rates. If you’re bidding on general keywords that are relevant to your site, consider using the exact match and phrase match keyword matching options in order to increase your CTR.
8. Test and monitor your ads to get the best clickthrough rates. Refine and fine-tune your ads to maximise clickthroughs; with Google you can do this in real time. Create different ads for each ad group and then check which ads have the best clickthrough rates.
9. Give Google users a compelling reason to click on your ad link. The easiest way to do this is to provide something of value for free. You can also achieve this by tailoring each keyword to your offer and using relevant terms and words in both the title and the ad body. Use a different ad for each keyword group or search term. This increases relevance and the likelihood that Google users will click on it.
Author Bio:
Copyright. Ben Chapi owns Venister Home business and Affiliate Program Classifieds at http://www.venister.org.
Optimizing Framed Websites
A number of months ago, I wrote an article on the proper techniques that should be used to successfully optimize a site that makes use of frame technology. I still regularly get questions about sites that were designed using frames, asking me the best way to optimize them to improve their visibility in the search engines. Every now and then, I also get emails from site owners who are desperate because their framed site is nowhere to be found in the engines. In response to all of this, I think an extended version of my first article is necessary.
For some reason, even today, there is still a lot of controversy over whether or not to use frame technology when building a new website. This controversy is certainly well founded, since there are apparently still many site designers who use frames, even though it’s no longer a very good idea. In this section, you will learn all you need to know to get any framed web site properly listed and optimized for the major search engines.
To be fair, let’s just say that a framed website can sometimes be easier to update, and many web designers decide to use frame technology for this very reason. Frames can arguably be even more useful for maintaining very big web sites with content spread over 500 or more pages, although I don’t necessarily agree with that assumption.
However, as you are about to find out, framed websites usually create many problems and complications for most major search engines in use today. This article will show you the pitfalls and the mistakes that some Webmasters make when implementing their framed sites instead of building “search engine-friendly” websites. It will also show you the right way to do it, if you still want to go ahead with this idea.
What Constitutes A Framed Web Site?
The easiest and surest way to tell whether a site is framed is to check whether its left-hand navigation menu remains stationary while the information in the centre of the page scrolls up or down. Some framed sites also include a company logo or image at the top or bottom that remains fixed while the rest of the page moves, and some include links or navigation buttons in that same stationary section. When you see any or all of these characteristics, you are dealing with a framed website.
Today, anywhere on the Internet or in good technical books, you can read a lot about search engine optimization (SEO) that basically says using frames on a website is like casting a bad spell on it, simply because most major search engines will not navigate the frames, or find it simply impossible to crawl or spider them. In such a case, these same people will tell you that a framed website will never get indexed or optimized properly.
That statement is actually both true and false at the same time. It is false if the frames are used correctly. But, by the same token, it’s perfectly true if the frames are used improperly and are search engine-unfriendly. Happily, there are ways around these limitations.
When implemented correctly, a framed site can be almost as good as a site without frames and can deliver results that could be considered reasonable, compared to what the site delivered previously. However, if your site isn’t designed yet, I would still suggest you build it with search engine-friendly technology; in other words, stay away from frames!
Why Do Search Engines Hate Framed Websites?
In this section, I will explain the main reasons why most framed web sites fail to get indexed properly on the major crawler-based search engines such as Google. When we look at the HTML code of a typical framed web site, we usually see the TITLE tag, the META tags, and then a FRAMESET tag. The major search engines’ crawlers and spiders, such as Googlebot and Freshbot, are programmed to ignore certain HTML code and instead focus specifically on indexing the actual body text on a page.
But with any typical framed website, there is simply no body text on that page for the search engine’s crawler to index, because the text is located on the other pages of the site, what we call the inner pages.
If you want to know what search engines see when they get to a site using frames, this is what they see:
“You are currently using a web browser that does not support frames. Please update your browser today in order to view this page.”
This is almost the equivalent of putting a sign on your door that says: “If you cannot read this sign, it’s because we are permanently closed for business.” Yet, this is exactly what search engines see when they get to a framed website!
Here is a little test that will prove what I just said is true: while at your keyboard, try a search on Google for the following query: “does not support frames” and you will discover that Google finds about 2,800,000 web pages! That is a lot of wasted Web pages that could bring in a LOT of sales.
To fix that problem, use that otherwise wasted space to include important keywords and key phrases, along with some body text that is rich in sales copy. Doing so can make a big difference in helping framed sites rank higher in the search engines.
How To Properly Use The Noframes Tag
There is an HTML tag called the noframes tag which, when properly applied, effectively offers what the search engine spiders need: the important text data they require to properly include that page in their index.
If a person really has to use a framed site for whatever reason, then it’s very important to use the noframes tag correctly. Of course, all these steps will be useful only if the information on the site’s homepage is carefully written and makes strong use of your important keywords and key phrases. Again, nothing beats the careful research and analysis of the ‘right’ keywords and key phrases, using the professional facilities of Wordtracker.
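As a minimal sketch of the idea (the site name, file names and wording below are placeholders, not a recommendation of specific copy), a framed homepage that uses the noframes tag correctly might look something like this:

```html
<html>
<head>
  <title>Acme Widgets - Discount Blue Widgets Shipped Worldwide</title>
  <meta name="description" content="Discount blue widgets with free worldwide shipping.">
</head>
<frameset cols="200,*">
  <frame src="menu.html" name="menu">
  <frame src="main.html" name="content">
  <noframes>
  <body>
    <h1>Discount Blue Widgets from Acme Widgets</h1>
    <p>Acme Widgets sells discount blue widgets with free worldwide shipping.
       Browse the <a href="main.html">widget catalogue</a> or read our
       <a href="ordering.html">ordering guide</a>.</p>
  </body>
  </noframes>
</frameset>
</html>
```

The keyword-rich text inside the noframes section gives the spiders something to index, and the plain links give them a path to the inner pages, instead of the usual “your browser does not support frames” dead end.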
Always remember that if you optimize a site for the wrong keywords, you will have a site that will rank high, but for search terms that have nothing to do with what your site offers! You will then receive enquiries and questions on products or services that your company doesn’t even offer! In the search industry, relevancy and quality are king. The more relevant your site is, the better success you will get, and the higher your ROI will be. It’s just as simple as that.
More Framed Website Topics
So far, all the above information takes care of just the site’s homepage. As can be expected, the rest of the pages on your site also need to be indexed correctly.
Most site designers and web programmers who elect to use frames do so mainly for ease of navigation, although the same results can be achieved with other, more search engine-friendly technology, such as tables. You simply include your navigational links inside the tables and it’s done!
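For illustration only (the page names are hypothetical), a table-based layout carrying the same navigation might be as simple as:

```html
<table width="100%">
  <tr>
    <td width="200" valign="top">
      <a href="index.html">Home</a><br>
      <a href="products.html">Products</a><br>
      <a href="contact.html">Contact Us</a>
    </td>
    <td valign="top">
      <!-- the page content goes here, on the same crawlable page -->
    </td>
  </tr>
</table>
```

Every page carries its own navigation and its own body text, so there is nothing for a spider to get stuck on.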
One thing that is really important is to ensure that all the inner pages of your site get optimized the same way, even though you don’t have to deal with the noframes tag on those pages. Once all those inner pages are optimized appropriately, after one or two Google monthly updates (dances), you should start seeing your website properly positioned, ideally close to the top of the SERPs (Search Engine Results Pages) in Google and most major search engines in use today.
Conclusion
Framed websites are built on old, outdated technology and are certainly not the best choice. They have serious limitations as far as search engines are concerned. When building a completely new site, or when re-designing an old one, it’s always best to stay away from frame technology.
However, if you really have to use frames, optimizing a framed site can be done successfully if the above examples and techniques are followed carefully. Remember that the key to all of this is the noframes tag. If carefully implemented, the noframes tag can successfully remove many of the barriers that plague framed websites.
Author:
Serge Thibodeau of Rank For Sales
Affiliates & Adwords
I’ll have to admit that I didn’t believe all the hype about Google Adwords for the longest time. Then I caught myself clicking on some of the Adwords ads instead of the search results. That made me think.
Five bucks. What’s five bucks? Lunch at McDonalds. I could skip that for a day. So I surfed Clickbank for a while, found something new and really targeted, popped a few keywords in and wrote the ad. Basically, I used the sledgehammer method of marketing. Fifteen minutes later, I was getting clicks. Then I went to bed, of course. A few things I have learned. If I’m on the computer at one in the morning, I must hide my credit card from myself. Or maybe I was wrong?
I woke up the next day and discovered a few things. One, all of these gurus I decided not to listen to weren’t lying. I had made sales. In fact, I made more instant affiliate sales in 12 hours than I ever did before. Another, I had a lot more to learn. But I was hooked. So I bought a few ebooks to learn more and went to work. Why the hell had I waited so long?
Adwords is a crash course in direct marketing. You learn quickly what people want and what keywords they type in to find it. And you can just as quickly lose money if you don’t learn the medium. Google Adwords takes all the guesswork out of optimizing webpages. If you have the money, you can buy all the traffic you want. The key, though, was to buy sales as cheaply as possible. These are the basics. After the basics, Adwords gets complex. It comes down to fine tuning. Fine tune your keywords correctly and you will receive instantly targeted visitors who are ready to purchase specific products.
Clickthrough Quantity
A lot of internet marketing comes down to keywords. AdWords is no different. You must learn to think like your customers. What keywords would they use to find the product you are selling? Sometimes you just can’t think of them. This is where keyword tools come into play. I have listed a few free keyword tools that will help at the link in the resource box.
If the keywords you are using are not performing well, there may be a few ways to fix this. One is to fine tune the keyword. Instead of using “e-mail” to sell a spam blocker, you can use the Adwords keyword suggestion tool to find such words as e-mail filter, junk e-mail, and such. Then delete “e-mail.” Obviously, everyone who uses “e-mail” in their search terms isn’t looking for a spam blocker. In fact, very few will be, so don’t destroy your click through rate by using terms that are too general.
You can also fine tune the keyword by using brackets, quotation marks, or negative keywords. Brackets around [e-mail filter] will allow your ad to be shown only when searchers type just “e-mail filter” in Google’s search box, not “e-mail” and not “free e-mail filter.” Using quotes will allow your ad to be shown for both “e-mail filter” and “free e-mail filter”, but not “filter e-mail”. Using a negative keyword, written with a leading “-” (for example the keyword “e-mail” combined with “-filter”), will allow your ad to be shown for searches containing “e-mail” that do not contain “filter.” You can find more details here: https://adwords.google.com/select/tips.html.
Another problem may be that your customer just does not see your ad. Simply creating a new ad that contains the keyword that hasn’t been producing well may improve your click through rate. This works because Google highlights the keywords that the searcher used in your ad. You can also try capitalizing power words in your ad to get it seen. If you know the keyword should be working but isn’t, try this first. If this doesn’t help, consider narrowing the keyword or dropping it. Don’t be scared to.
Use 300 keywords instead of 30. The more keywords you use, the better. You will receive customers looking for specific products and you will have a better chance of getting keywords that cost five cents instead of five dollars.
It may just come down to good old ad copy. An AdWords ad is the haiku of advertisements. You must pack a punch in three lines, four if you count the URL. Go for the benefits, not the features, of the product. Use active verbs that make your surfers want to click instead of static nouns.
Clickthrough Quality
You can get all the clicks from AdWords that you can imagine, even a 20% click through rate, and still not get any sales. All this means is that you are throwing money down an internet hole. When you are paying for every click, not all clicks are equal.
If you are trying to sell a product, you definitely want to get rid of freebie hunters first. How do you do this? Add a price to the ad. Or, if the ad copy on the product’s page is killer, you can use something along the lines of “Low-priced”. In other words, use a phrase that tells the surfer that he will be getting out his credit card.
Another way to convert more clicks through AdWords comes down to keywords again. More specific keywords will convert more surfers into customers. If you are an affiliate for a motorcycle helmet company, you don’t want to use “motorcycle” as a keyword. You want to use “motorcycle helmet.” Otherwise, you’ll pay for customers who are searching for jackets, exhaust pipes, and pictures.
There are a few types of affiliate programs where you wouldn’t mind freebie hunters. One is a program whose site uses cookies. Think of the last time you purchased an ebook. Did you buy it when you saw the first ad? Probably not. You waited. But after reading about the product all over the net, you decided to get out your credit card. I discovered these types of customers after I deleted the campaign from AdWords that actually brought in the sales. The site’s cookie credited me with the sales weeks after my ad brought them to the site.
Another example is programs that pay for leads. Some companies will pay you if the surfer fills out a form. They depend on their professional follow-up to get the sale. But the point is, you have made cash and the surfer didn’t even have to pay. You just have to get him to sign up for info.
Set A Budget
Figure out how much you want to spend a day and stick to it. A good start is $5.00 a day. Also set your maximum cost per click. This will keep you from spending way too much. On some products I have the CPC set at $0.05. On others, I have it set at $0.50. It depends on how many clicks you actually convert into sales.
You must also set a cutoff point. Decide how much money you can spend on a new campaign without any sales before you drop it. It could be 300 clicks or $20 spent. Whatever you set it at, you must stick to it.
Tracking
Google AdWords allows you to set up campaigns, followed by ad groups, followed by individual ads. This structure is great for testing your ads. And believe me, you have to test and you have to track. A campaign can be set up for one product. Then you can set up an ad group for each feature of the product. One ad group may focus on the product’s ease of use. Another may focus on the low price. And in each ad group you can have multiple ads. Using this system and AdWords’ tracking features, you can track which ads work best for clickthroughs.
But you still have to track your conversions. Commission Junction is great in that you can give each ad its own tracking link. Then when you go to their site, you can discover which ads are producing the most sales.
But whatever you do, know your average sale, know your average price per click and know how many clicks it takes for you to make a sale. Then figure your return on investment. You may be spending more than you are making. It’s easy to get caught up in the process and forget you are trying to make money. Don’t get into bidding wars. And always test something new. Nobody’s perfect.
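A hypothetical illustration of that arithmetic: if your average commission is $20, your average cost per click is $0.10 and it takes 100 clicks to produce a sale, each sale costs $10 in clicks and returns $20, a healthy margin; if it takes 250 clicks, you are paying $25 for every $20 sale and losing money on each conversion.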
Following The Rules
In my little experiment, I also ran into a few of the rules that Google wants you to follow, by breaking them (the best way to learn). Here is the link to the FAQ: https://adwords.google.com/select/faq/index.html. One, you can’t use trademarks as keywords. Two, you must maintain a 0.5% clickthrough rate or AdWords will slow your campaign down. Also, if you are an affiliate, you must at least put “aff” at the end of your ad.
In summary, Adwords can get you instant sales, but you must study the system, fine-tune your campaigns, and track your results.
Author Bio:
Stephan Miller is a freelance programmer, and writer. For more Adwords resources, visit the page below http://www.profit-ware.com/resources/googleadwords.htm
Bayesian Spam Filter & Google
On November 26, Stephen Lynch, a journalist at the New York Post, called me for a telephone interview about an article I had written the previous day. The article related to the current November Google update “dance”, dubbed “Florida”.
The following day, Mr. Lynch wrote his article, which was published in the New York Post, offering his comments and, without being technical, explaining some of the negative effects such an update can have on the average site owner or Webmaster.
As the latest “Florida” monthly Google update ‘dance’ has shown us, having a website highly ranked on the Internet’s number one search engine is no guarantee: if your search rankings drop as precipitously, and with as little warning, as some did, it can spell a devastating blow to online stores and certain commercial websites.
In the last 10 days, a lot of articles have also been written by some of my colleagues, some in the SEO field and some, like Seth Finkelstein, who are more in favour of the free flow of information that the Internet can provide.
In this article, I will attempt to describe some of the spam-filtering techniques that Google is reported to be using during this Florida “dance”. This spam-filtering technology is based on the Bayesian algorithm.
The Inner-Workings of a Spam Filter for a Search Engine
For quite a long time now, Google’s search results have been under attack by search-engine spammers who continuously attempt to manipulate search results, in the end cluttering the search engine’s database with irrelevant information.
With the ever-growing popularity of Google, and as it tries to handle more and more searches all over the Web, the temptation to foul the search results has become attractive to certain spammers, leading to substantial degradation in the quality and relevance of Google’s search results. Since Google is mostly concerned with quality search results that are relevant, it is now cracking down on these unscrupulous spammers with new spam-filtering algorithms, using Bayesian filtering technology.
At the end of October 2003, Google deployed its new Bayesian anti-spamming algorithm, which appeared to make its search results fail whenever a previously identified spam site would normally have been displayed. In fact, the search results were completely aborted when such a spam-intended site was encountered. See “Google Spam Filtering Gone Bad” by Seth Finkelstein for more technical information on how this spam elimination algorithm works at Google.
The First Shoe That Fell
On or around November 5th, this spam problem was in fact reduced significantly, as a result of these new Bayesian anti-spam filters “kicking in”. Although not perfect, the new Bayesian spam-filtering technology seemed to have worked, albeit with some failures in certain cases.
On or about November 15th, 2003, Google, as it does every month, started “dancing”, performing its extensive monthly deep crawl of the Web, indexing more than 3.5 billion pages. This update had some rather strange results, in a way reminding some observers of a previous major algorithm change done in April of 2003, dubbed update Dominic, where similarly unpredictable results could be noted across the Web.
It was generally observed that many ‘old’ and high-ranking sites, some of which were highly regarded as ‘authoritative’ and were certainly not spammers in any way, fell sharply in their rankings or disappeared entirely from Google’s search results.
Since then, there have been many explanations, some not too scientific, that attempted to make sense of this event that some have categorized as “serious”. For an example of one of the best of these explanations, read an article that Barry Lloyd wrote: “Been Gazumped by Google? Trying to Make Sense of the ‘Florida’ Update!”.
More on the Bayesian Spam Filter
Part of my research and the observations I have made in this matter point to the Bayesian spam filter that Google started to implement in late October. A “Bayesian spam filter” is a complex algorithm used to estimate the probability or likelihood that certain content or material detected by Google is in fact spam. In its most basic form, the Bayesian spam filter determines whether something “looks spammy” or whether, on the other hand, it is relevant content that will truly help the user.
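Stripped of the details, and simplifying heavily, the textbook form of the technique works on conditional probabilities: P(spam | feature) = P(feature | spam) × P(spam) / P(feature). If a particular footprint, say a long run of repeated keywords, shows up in 90% of pages already known to be spam but in only 1% of legitimate pages, Bayes’ theorem says a page exhibiting that footprint is overwhelmingly likely to be spam, and the filter multiplies many such feature-level estimates together into one overall score. Which signals Google actually weighs, and how, has of course never been made public; the above is only the generic form of Bayesian filtering.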
To a certain degree, the Bayesian algorithm has proven efficient in the war against spam in the search engines. Being ‘bombarded’ by spam as much as Google has been for the past couple of years, it has no choice but to implement such anti-spam safeguards to protect the quality and relevancy of its search results.
However, it is the general feeling in the SEO community that, unfortunately, the current Bayesian algorithm implementation seems to have extreme and unpredictable consequences that were practically impossible to foresee.
At the outset, one of the problems with estimating the probability or likelihood that certain content contains spam is that, given very large datasets such as the entire Internet, many “false success stories” (false positives) can and will occur. It is exactly these false positives that are at the centre of the current problem.
Since this whole event began to unfold, many people have noted in tests and evaluations that making the search more selective, for example by excluding an irrelevant string from the query, tends to deactivate the new search results algorithm, which in turn effectively shuts down the newly implemented Bayesian anti-spam solution at Google.
One More Observation
While we are still on the subject of the new filter, but getting away from spam-related issues for a moment, I did notice while doing some testing with the new Florida update that Google is now ‘stemming’. To my knowledge, it’s the first time that Google has offered such an important search feature. How does stemming work? Well, for example, if you search for ‘reliability testing in appliances’, Google will also match variants such as ‘reliable testing in appliances’.
To a certain degree, variants of your search terms will be highlighted in the snippet of text that Google provides with each result. The new stemming feature is something that will certainly help a lot of people with their searching for information. Again, Google tries to make its searches as relevant as they can be, and this new stemming feature seems like a continuation of those efforts.
Conclusion
In retrospect, and in re-evaluating all the events that have happened during this major dance, it is clear that Google is still experimenting with its newly implemented algorithm and that many important adjustments will need to be made to it to make it more efficient.
With spam being a growing problem day by day, today’s modern search engines have no choice other than to implement better and more ‘intelligent’ spam-filtering algorithms that can tell the difference between what is spam and what isn’t.
The next 30 days can be viewed by some as being critical in the proper ‘fine-tuning’ and deployment of this new breed of application in the war against spam. How the major search engines do it will be crucial for some commercial websites or online storefronts that rely solely on their Google rankings for the bulk of their sales.
In light of all this, perhaps some companies in this position would be well advised to evaluate other alternatives, such as PPC and paid inclusion marketing programs, as complements. At any rate, it is my guess that search will continue to be an important and growing part of online marketing, locally, nationally and globally.
______________
References:
1) An anticensorware investigation by Seth Finkelstein
http://sethf.com/anticensorware/general/google-spam.php
2) Better Bayesian filtering by Paul Graham
http://www.paulgraham.com/better.html
Author:
Serge Thibodeau of Rank For Sales
Dynamic URLs & Search Engines
Most Web sites with a large number of pages are developed using dynamic server-side Web technologies such as PHP: Hypertext Preprocessor (.php), Java Server Pages (.jsp), Active Server Pages (.asp), ColdFusion (.cfm) and Perl. These technologies provide programmers with the tools to build sites so that adding products or pages does not require extensive HTML work. In reality, all high volume sites must use one of these technologies in order to maximize efficiency and stay profitable.
The power and flexibility provided by these technologies is outstanding. There are, however, issues that need to be addressed in regards to the way search engines crawl a dynamically driven Web site. These issues do not have to do with the pages that are generated, but with the URLs these technologies generate.
In order to explain this search engine optimization technique, let me give you some perspective with a client of RustyBrick, Inc. RustyBrick designed and developed a custom e-commerce site for an undergarment and intimate apparel shop named Freshpair.com. Freshpair.com is built with PHP, a dynamic scripting language that helps facilitate the day-to-day operations of the site. After the site was built and running, Freshpair.com participated in many online advertising campaigns, including Overture and Google AdWords.
At some point, after sales leveled off and the company was looking for ways to increase revenues, Freshpair came to me at RustyBrick and asked what they could do. Freshpair.com’s COO and I discussed many options, but one thing that stood out was that Freshpair.com was not to be found in the normal search results. We quickly devised a plan on what steps to take in order to make Freshpair.com “search engine friendly”.
We will get back to Freshpair.com shortly; but first let’s continue to discuss the importance of dynamically generated URLs. In order to better understand what a search engine sees, let us take a sample URL and dissect it.
A simple standard URL would look something like: http://www.freshpair.com/underwear.html.
A complex URL would look something like: http://www.freshpair.com/catalog.php?formid=4&id=8&brand=&brasize=&ion=women.
The first thing you notice is the .php extension, and you might think that the .php extension is causing the issue. That is not the case. The next things you will notice are the question marks, equal signs and ampersands within the URL. Are those causing the issues? Kind of… These question marks, equal signs and ampersands are what are commonly referred to as “stop characters” in search engine optimization terms. They are named stop characters because they signal to search engines to stop crawling past a certain point, limiting the number of pages crawled on your site.
Let me present another example of one of the pages from our corporate Web site. The following URL is a page that contains the full RustyBrick client list. The URL reads as follows:
http://www.rustybrick.com/portfolio_client_list_all.php.
We also have enabled the Web visitor to sort the client list by industry. If a Web visitor would like to see all clients that fall within the industry of IT & Communication services they would be shown a URL that reads http://www.rustybrick.com/portfolio_client_list.php?industry=4.
We again see the question marks and equal signs. Now the Web visitor wants to view the client list by the Retail and Wholesale service industry and clicks on that link. The URL now reads
http://www.rustybrick.com/portfolio_client_list.php?industry=6.
As you can see, the URLs are exactly the same up until the last digit, where the numbers come in. So if RustyBrick served 200 industries, there would be 200 URLs that are identical except for that final number.
Now put yourself in the shoes of a search engine. The search engine wants to put only unique pages into its index. The full client list contains the same information that the industry-specific URLs contain, and search engines do not want repetitive information in their index. Search engines decide to combat this issue by “pruning off” the URLs after a specific number of variable strings (i.e. ?, =, &).
For example, the URL http://www.rustybrick.com/portfolio_client_list.php?industry=6 might be pruned down to http://www.rustybrick.com/portfolio_client_list.php by the search engine in order to limit the amount of repeated content.
In a case like Freshpair.com, where there are numerous methods of finding the same product and a virtually unlimited number of pages, how do we get the search engine to find each product and each method of finding that product? Search engines want to keep the number of pages that a site contains to a minimum in order to (1) eliminate duplicate search results with the same content and (2) make the crawling of the pages efficient.
The solution we came up with was to program a Mod_Rewrite on the URLs to remove the stop characters from the URLs. We modified a URL that once looked like http://www.freshpair.com/catalog.php?formid=5&query=bra&ion=women to something more like http://www.freshpair.com/catalog_section_women_id_8.html. We replaced all stop characters with underscores and more friendly URL characters and names. Today, Google has indexed over 21,000 pages on Freshpair.com and sales have increased tremendously due to the Mod_Rewrite and other search engine optimization techniques applied to the site.
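To illustrate the mechanics (the pattern and parameter names here are hypothetical, not Freshpair.com’s actual rules), a Mod_Rewrite directive placed in an .htaccess file might look like this:

```apache
RewriteEngine On
# Map /catalog_section_women_id_8.html onto /catalog.php?section=women&id=8
RewriteRule ^catalog_section_([a-z]+)_id_([0-9]+)\.html$ /catalog.php?section=$1&id=$2 [L]
```

The search engine spiders see only plain .html addresses with no stop characters, while the server quietly runs the same PHP script behind the scenes.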
For more information on Mod_Rewrite please visit the Apache module mod_rewrite page at: http://httpd.apache.org/docs/mod/mod_rewrite.html
Author Bio:
Barry Schwartz is the President of RustyBrick, Inc., a Web services firm specializing in customized online technology that helps companies decrease costs and increase sales. Barry is a leading expert in the search engine optimization community. Barry has written and contributed many articles to the SEO community, by publishing in SEMPO (Search Engine Marketing Professionals Organization). Barry also gives regular seminars covering the complete spectrum of search engine marketing technology and methods.
Google’s Next Big Move
(Will your website be ready, or will you be playing catch-up six months too late?)
November 2003 might go down in history as the month that Google shook a lot of smug webmasters and search engine optimization (SEO) specialists from the apple tree. But more than likely, it was just a precursor of the BIG shakeup to come.
Google touts its secret PageRank algorithm highly. Although PageRank is just one factor in choosing what sites appear on a specific search, it is the main way that Google determines the “importance” of a website.
In recent months, SEO specialists have become expert at manipulating PageRank, particularly through link exchanges.
There is nothing wrong with links. They make the Web a web rather than a series of isolated islands. However, PageRank relies on the naturally “democratic” nature of the web, whereby webmasters link to sites they feel are important for their visitors. Google rightly sees link exchanges designed to boost PageRank as stuffing the ballot box.
I was not surprised to see Google try to counter all the SEO efforts. In fact, I have been arguing the case with many non-believing SEO specialists over the past couple of months. But I was surprised to see the clumsy way in which Google chose to do it.
Google targeted specific search terms, including many of the most competitive and commercial terms. Many websites lost top positions in five or six terms, but maintained their positions in several others. This had never happened before. Give credit to Barry Lloyd of www.SearchEngineGuide.com for cleverly uncovering the process.
For Google, this shakeup is just a temporary fix. It will have to make much bigger changes if it is serious about harnessing the “democratic” nature of the Web and neutralizing the artificial results of so many link exchanges.
Here are a few techniques Google might use (remember to think like a search engine):
1. Google might start valuing inbound links within paragraphs much higher than links that stand on their own. (For all we know, Google is already doing this.) Such links are much less likely to be the product of a link exchange, and therefore more likely to be genuine “democratic” votes.
2. Google might look at the concentration of inbound links across a website. If most inbound links point to the home page, that is another possible indicator of a link exchange, or at least that the site’s content is not important enough to draw inbound links (and it is content that Google wants to deliver to its searchers).
3. Google might take a sample of inbound links to a domain, and check to see how many are reciprocated back to the linking domains. If a high percentage are reciprocated, Google might reduce the site’s PageRank accordingly. Or it might set a cut-point, dropping from its index any website with too many of its inbound links reciprocated.
4. Google might start valuing outbound links more highly. Two pages with 100 inbound links are, in theory, valued equally, even if one has 20 outbound links and the other has none. But why should Google send its searchers down a dead-end street, when the information highway is paved just as smoothly on a major thoroughfare?
5. Google might weigh a website’s outbound link concentration. A website with most outbound links concentrated on just a few pages is more likely to be a “link-exchanger” than a site with links spread out across its pages.
Google might use a combination of these techniques and ones not mentioned here. We cannot predict the exact algorithm, nor can we assume that it will remain constant. What we can do is to prepare our websites to look and act like a website would on a “democratic” Web as Google would see it.
For Google to hold its own against upstart search engines, it must deliver on its PageRank promise: that its results reflect the “democratic” nature of the Web. Its algorithm must prod webmasters to give links on their own merit. That won’t be easy or even completely possible. And people will always find ways to turn Google’s algorithm to their advantage. But the techniques above can send the Internet a long way back to where Google promises it will be.
The time is now to start preparing your website for the changes to come.
Author Bio:
David Leonhardt is an online and offline publicity specialist who believes in getting in front of the ball, rather than chasing it downhill. To get your website optimized, email him at info@thehappyguy.com. Pick up a copy of Don’t Get Banned By The Search Engines or of Get In The News.
Search Engine Marketing Myth
You’ve seen the ads: Guaranteed #1 Ranking! There are no guarantees in search engine marketing and website promotion. If anyone tells you different, you should check quickly to make sure they don’t have their hand in your wallet.
Suppose you sell widgets. You want to sell more widgets, and the way to do that is to make sure that more people know about widgets, and that you are the place to buy their widgets. You might decide to buy a half-page ad in a national magazine to tell your story. When you place that ad, you are “guaranteed” your position.
With a magazine advertisement, you know what the magazine’s circulation is, who reads it, and which page will feature your ad. The magazine can guarantee all that, because they own the medium.
Search engine marketing is qualitatively different. When you work with a search engine marketing firm to promote your website, they cannot guarantee where your listing will appear. Certainly there are types of online ads where there are guarantees in place: banner ads priced at “cost per thousand impressions”, pop-up ads, and so forth. These are like traditional media buys, where you are working directly with the owner of the medium where the ads appear, but this is not search engine marketing.
Even so-called pay-per-click search engines cannot guarantee your position. In Google AdWords, for example, it is not just the price you pay for a given keyword that determines where you will rank. They also bring in other factors, including how often your ad is clicked on, to determine which ad will be listed first. Just throwing money at them will not necessarily get you into the #1 spot.
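As a rough illustration, using made-up numbers and the commonly reported approximation that AdWords ranked ads by bid multiplied by clickthrough rate: an advertiser bidding $0.40 with a 3% CTR scores 0.40 × 0.03 = 0.012, while one bidding $1.00 with a 1% CTR scores only 1.00 × 0.01 = 0.010, so the bigger spender still ends up in second place.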
The bottom line is this: search engine marketing professionals do not own the search engines. They can tell you that you will achieve #1 ranking on a given search engine, or they can tell you that the moon is made of green cheese, but there is no way they can make either of those happen. When you tell Time magazine you want your ad to be on the back cover, and you pay them enough money, they will guarantee you the back cover. If you tell your search engine marketing people you want to be #1 in AllTheWeb, they cannot guarantee you that result. They can recommend changes to your site that will increase the likelihood of your ranking higher, but that is a long way from a guarantee. If you don’t control the medium, you can’t guarantee the result. Since your search engine consultant doesn’t control the search engine, there is no way they can guarantee your position.
The ranking algorithms of the search engines are a closely-guarded secret. The search engine wants to give top ranking to the site that is the best match to an individual visitor’s search query, not to the site that was able to “beat” the system. That is where the value of real search engine marketing comes in.
While the search engine marketing person cannot guarantee you a position, what they can do is to apply years of experience to tell you what has worked in the past, and to help you make it work today. In many ways, “organic” search engine optimization is really a matter of editing web pages or whole sites to make them the most search engine friendly they can be. Making sure that a given page has just the right combination of keywords, title, links, and so on, is really at its base simply a matter of making that page the best web page it can possibly be. The page that will rank the best in the search engines is also the page that will make the most sense to the human visitor. Rather than relying on tricks to try to make the page rank high, it is a matter of just making the page the most focused and on-message that it can be. The bad news is that this doesn’t guarantee which position in the search engine rankings that page will occupy on a given day. The good news is that the page will always rank well.
The search engines change their algorithms from time to time. If today’s rule, for example, is that just the right combination of text in the title tag will make or break a site, and you know this is true, then all you have to do is to tweak the title tag to fit within that rule, and you will automatically rank very high. Today.
Suppose that tomorrow, however, that rule is changed. Suppose that now the most important factor that the search engines use to rank a site is the content of the META Description tag. All the work you went to yesterday to fix the title is now useless. All of your attention is now focused on fixing that description tag.
Clearly, over time the focus of the search engines will vary. The best way to deal with this is to not deal with it! This means that rather than tweaking a site one way today and another way tomorrow, the best way to approach optimizing a page or a whole site is to not try to beat the system. Instead of trying to “psych-out” the search engines, why not add value to the site? A “common sense” approach to search engine optimization, looking for long term results, is the way to go. When you try to help a site rank better by making it the best it can be, everybody wins.
Author Bio:
Dale Goetsch is the Technical Consultant for Search Innovation Marketing, a Search Engine Promotion company serving small businesses and non-profits. He has over twelve years experience in software development. Along with programming in Perl, JavaScript, ASP and VB, he is a technical writer and editor, with an emphasis on making technical subjects accessible to non-technical readers.
Search Engine Traffic
Most Internet marketing methods are risky and many will not have any effect on traffic to a web site. Some online marketers will sell you anything from banner impressions, to mass email campaigns (spam), to popup ads. All these marketing tools can work, but they are also extremely risky. Some people I know find pop-ups and spam so annoying that they will never purchase anything from a business that uses them. These plans are probably not the best customer acquisition strategies; more likely they are a total waste of money. So why would anyone risk money on marketing strategies that probably will not increase traffic to your website? Why not concentrate on what does work? The search engines.
Have you ever been contacted by online marketers who promise to deliver a “ton of traffic” to your website? I get these emails every day. Here’s a quote from one I used to get 10 times a day (until I automatically filtered it to trash):
“Hi I visited www.metamend.com, and noticed that you’re not listed on some search engines! I think we can offer you a service which can help you increase traffic and the number of visitors to your website.
I would like to introduce you to thispromotioncompany.com. We offer a unique technology that will submit your website to over 300,000 search engines and directories every month.
You’ll be surprised by the low cost, and by how effective this website promotion method can be.
To find out more about thispromotioncompany and the cost for submitting your website to over 300,000 search engines and directories, visit www.thispromotioncompany.com. (…)”
Have you ever received one of these and wondered why they were contacting you? First off, how did they find your web site? What search engines are they referring to? If they really could deliver on their promise, they would have so much repeat and word-of-mouth business that they wouldn’t have time to be calling or emailing you. Lastly, how many people actually believe that there are 300,000 search engines?
While it’s true you need traffic from the search engines, you don’t need to use spam techniques to get it. You need real results, and not false hope.
Increase Web Site Traffic… Naturally
It is true that the best way to obtain lots of targeted traffic (customers) is to acquire it based on relevance, via the search engines. Various studies show that anywhere from 83% to 92% of first time visitors to a web site find it through the search engines. That’s an incredible statistic. If those customers are not arriving as a result of a relevant query, they will be disappointed. They may be disappointed with the search result, but more likely, they will be disappointed with your web site.
In the online world that’s your first impression. We all know how important a first impression is. You can never get a bad one back, and a good one will carry you a long way. You have to make sure that the search engines are sending you visitors that are looking for your products or services. If your web site matches their interests, they will remember it, and come back, even if they do not make a purchase on that visit. If they find it irrelevant, they may have subconsciously formed a negative opinion of your online business, through no fault of your own.
Search engines provide a continuous stream of targeted visitors to your website, and for the most part, it’s free of charge. Some engines do charge a listing fee, but most do not. The only thing the search engine asks is that each web site operator makes an effort to provide relevant and good information to web surfers for a particular search phrase. If a web site does so, the search engines will reward the site with increased good quality traffic.
Search engine traffic is a win-win situation for any online business. It doesn’t take much to improve most web sites’ search engine traffic; it just takes optimization. Did you know that as of January 1, 2002, there were 160,000,000 domain hosts in use worldwide? Did you know that 88% of the web pages worldwide are not indexed by the largest search engines? 88% of web pages are not optimized. How can any business survive on the Internet if it is not optimized for the search engines, and thus can never be found? How can it exist if over 83% of first time visitors never find its web site? How much more money could a web site operator earn if they ensured their web site was even partially visible?
If you have an informative web site, the search engines want to send you lots of customers. That’s because the more web surfers find what they want, the more they’ll use a particular search engine and recommend it to their friends. The search engine also benefits, as it becomes known as a resource that gets its clients – the searchers – to their destinations quickly and efficiently. The more people recognize how well the engine works as a resource, the more it gets recommended, and used. As the popularity increases, so do the engine’s revenues from advertising.
What Does Your Web Site Need To Receive Traffic From The Search Engines?
Small web sites with only 1 or 2 pages set themselves up for failure, simply because they usually don’t have enough content of interest. There are of course exceptions, where the 1 or 2 pages are each as long as a book. But these are awfully frustrating to read, and no one will be satisfied with them. Most often 1 or 2 page sites are simply too short to provide any useful information, so the search engines don’t take them seriously. Among other factors, the search engines examine how deep a site is. The more meaningful content present, the more weighty the site is viewed as, and the more importance it is given.
If you are wondering about whether to bother, ask yourself this: Why does your company have a web site? What does the company do with it? Think about it. Most companies today have web sites, and most market the web sites to facilitate customer acquisition, to increase their customer base, and to improve customer retention rates.
There are a number of reasons for having web sites. Many companies use theirs to enhance their customer service. Using a web site as a marketing vehicle is a great way for a company “to put the word out” about products, services, or offerings.
Most importantly, remember that your web site is an online resource that your clients can use to find answers to frequently asked questions, “how to” tips, and to educate themselves. When including content on a web site, always remember that the knowledge shared may be common to you, but it’s likely that you are an expert in the eyes of your clients. People visit your web site for your product or service, but also for information. If they find useful, relevant, information, they will keep coming back, and will likely make purchases. People like to buy from experts.
Instead of thinking of your web site as nothing more than an online billboard or business card, think of it as an online menu, that lets people get an idea of what it is you do, and how you do it. Develop a content rich website, optimize it, and let the search engines increase your website traffic, naturally. If you optimize each major web page within your site, you will increase the rankings in the search engine results and therefore receive targeted traffic for each of those pages.
Doing each of the above – ensuring relevant content is present, and optimizing the pages – will ensure that the search engines have what they need so they can do their work. It will also ensure that they can send you targeted traffic (customers), so that you can get that 83% of first time visitors your online business needs to survive.
Author Bio:
Richard Zwicky is a founder and the CEO of Metamend Software, a Victoria, B.C. based firm whose cutting edge Search Engine Optimization software has been recognized around the world as a leader in its field. Employing a staff of 10, the firm’s business comes from around the world, with clients from every continent. Most recently the company was recognized for their geo-locational, or GIS technology, which correlates online businesses with their physical locations, as well as their cutting edge advances in contextual search algorithms.
Where Is The Search Industry Headed?
With all that has happened in the search industry in the last 9 to 10 months, one cannot ignore the fact that this industry is in for a lot of changes. Whether or not Google becomes a public company is almost irrelevant: the industry is facing major changes on its own.
For example, Yahoo is now the world’s second largest search property. Having acquired Overture a few months ago, it is now trying to battle Google on a level playing field. Competition will be fierce. Expect more mergers, buyouts and acquisitions in the coming months.
For example, late on November 20, Yahoo made a firm proposal to acquire a Beijing-based Web company for about $120 million in cash and stock. The company, with the unusual name of 3721, would, if acquired, in effect give Yahoo a new business selling domain names in China. Yahoo would still maintain its strong search position in that country, which many consider a rapidly growing market for Internet companies. Domain names? Well, let’s call this a diversification away from Yahoo’s normal search operations. Still, such moves will become less uncommon in the near future.
Will the search market continue to grow in 2004? Yes, very much so. Depending on whom you talk to, estimates range anywhere from 10 to 30% growth. Some even expect higher numbers than that. What’s important is analyzing the trend. I am one of those who think this growth trend is definitely on the increase, and I expect it to continue well into 2004.
The Google ‘Wildcard’
A few weeks ago, Google stirred up a lot of news when it was widely reported in the press that it would probably come out with an IPO (Initial Public Offering) in February or March of 2004, effectively becoming a public company and joining its Wall Street rivals such as Yahoo, Overture and even LookSmart for that matter. On top of all that, Google hinted that, if there is an IPO, it would probably be of the auction type; in other words, it would probably bypass the large investment bankers, a move some observers have called “uncommon”.
There were even some reports and articles in the press hinting that Bill Gates and Microsoft were in talks with Google, discussing a possible merger or acquisition of the Number One search engine. Bill Gates then categorically denied those allegations on November 17. The situation is in fact getting a bit cloudy.
Speaking of Microsoft, it’s no secret that Microsoft has been very busy lately, quietly developing its own search engine in the background. It even has a beta version already online in the United Kingdom, France, Italy and Spain. It’s only a question of time before Microsoft comes out with a full-fledged search engine that will probably make Google even more nervous than it already is.
Then again, will it simply integrate its secret search engine as a component of its long-talked-about new operating system, Longhorn, slated for late 2004 or early 2005? As we witnessed after 1995, when Netscape became a public company, it saw its Navigator browser overtaken by Microsoft’s Internet Explorer browser, which Microsoft had conveniently integrated into its Windows operating system. Could something similar happen again with Longhorn, this time with Google as the victim? Again, only time will tell.
2004 And Beyond
Developments similar to what we are currently seeing will be extremely important to determine the exact course of action the search industry will probably take next. It promises to be very exciting. One thing is certain: the stakes are getting much higher as time goes by. Expect to see more PPC and PFP (Pay-for-Performance) advertising models. The field of search engines is certainly transforming certain parts of the advertising industry as we used to know it less than ten years ago.
It is my prediction that the following twelve months will be extremely critical to many of the largest and even the smallest players in the search industry. Coming back to Google, there are some who think it could raise between $10 and $15 billion with an IPO. It is estimated that privately owned Google’s revenues will be somewhere between $700 million and $950 million for fiscal 2003. Others are wondering what Google would do with all that new money in its coffers. If all that money were used to continue the development of its current search engine and initiate new R&D into other search-related technology, then many think it would indeed be a good thing.
Conclusion
There is no conclusion to any of this, as nobody can safely predict the outcome. However, buckle up your safety belts, since I think we could hit a few air pockets in the coming months.
Expect Google to continue to ‘fine-tune’ its PageRank algorithm in the coming weeks and months. Additionally, expect some of the other major search engines to do the same, in the never-ending battle for quality and relevancy in the search results.
What IS important, and what I can safely say here, is that, more than ever, companies, businesses, website owners and Webmasters alike should continue to optimize their sites as much as they can. The ones that do will continue to reap the benefits of their efforts and should be amply rewarded in the search results and, hopefully, in the conversion rates of their websites.
Author:
Serge Thibodeau of Rank For Sales
Revising Your Site
In September, Smartads went through what so many have gone through before: I changed the entire site. I knew what it meant. It means that the search engine placements you’re probably so proud of are probably going to be lost in the process.
Sad but true. See, search engines are getting smarter; not only are they getting smarter, but they keep up with what you do. When you change a design, you typically change it for a couple of reasons:
1) You don’t like the design anymore
2) You’ve learned new tips on how to optimize your site better for search engine placements.
3) You’re revising your services
4) You bought an old domain name
Whatever it may be, you’re changing something. Now it won’t happen right away but eventually, if your site already has a good search engine ranking, then search engines will find out what you’ve done.
But search engines aren’t human? That’s correct, but they were built by humans. Most search engine programmers realize that eventually something changes, and since they want their search engine to be the best, they try to recognize your changes right away. But there’s only one problem.
What do they do with the listings you have right now? Well, let’s say for instance you are attempting to change everything for the better. Now let’s also say that you HAD a page title called “good rank”. Search engines previously knew, and still think, that your page is all about “good rank”. Moving along, your new page information is about “Getting a better rank”. Your old placements are now wrong in the eyes of the search engines. They now realize that something has changed. So what are they going to do?
Dump your old listings and start over! Ouch is right. So why would you ever want to change your site if that’s the case?
Well, here’s some food for thought: yes, you may lose your existing rankings, but your roots are growing. Your new pages are growing bigger and better than before; it just takes a small transitional period to take effect.
For instance, within Google, Smartads had over 580 pages found when looking at all the links to Smartads. Since we changed the whole site, there was a two-day period where none of them were listed anymore. Normally, I would have gotten scared right away if I hadn’t known that my site had changed.
Not four days later, not only were my pages back, but the old rotten pages that weren’t being used anymore got dumped and the many new pages got listed. All in all, around 680 pages are now listed, a jump of around 100 pages. Cause and effect.
Here’s another example for you. Most of us know about link popularity within Google. You’ve put all your effort into promoting your main page. Within all of the links that you’ve had placed on other sites, you gave your link the title “Great Ranking Services”. Once Google saw your link, it started looking to see whether the information on your site is relevant to “Great Ranking Services” and rewarded you appropriately.
Moving along a little, you’ve not only changed your site but you’ve also changed your link title to better fit the new content on your main page. The new link you now add on other sites reads something like this; “Boost Your Search Engine Rankings”.
Before you changed your site, you got a GREAT rank for “Great Ranking Services” when people searched online.
Here’s the catch. Now that you’re promoting a new site, a new page, new content within your page, the old content relevance goes straight down the drain. Search engines have recognized that everything has changed and are not paying any attention to your old links anymore. Yes, they still help your link popularity, but they don’t help your content relevance.
See, search engines are consistently trying to improve their content relevance. That’s what people want. If they search “gidgets & widgets”, they expect to get the best results for “gidgets & widgets”.
In Conclusion:
If you plan on changing your site, be prepared to suffer for a small amount of time. Prepare yourself for the worst because the best is yet to arise and prevail.
I understand that we all want better search engine rankings; just be ready to do whatever it takes, even if it means losing your old search engine placements for some newer, fresher ones.
I hope you enjoyed my article, for more of my articles, simply follow the link below!
Best of luck to you!
Author Bio:
Martin R. Lemieux is the President of Smartads: Affordable Web Site Design & Web Site Marketing. Also visit Web Site Awards & Webmasters Playground, and Food for your mind & Entrepreneur Traits. Read over 200 articles on advertising!