
Archive for the 'Grow Your Traffic' Category

Finding Good Keywords
First of all, what type of keywords should you optimize for?
A single word? A whole phrase? The real question is what people are actually searching for. Here is a breakdown of the most common search lengths.

Two-word searches: 29%
One-word searches: 25%
Three-word searches: 24%
Four-word searches: 12%
Five-word searches: 5%
Six-word searches: 2%
Seven-word searches: 1%

What about active verbs? Think about it: if someone types in "buy", "find" or similar words, wouldn't that indicate the person is about to make a move instead of just surfing? Although you could go to a top-100 search terms site, why not create your own top 100 active search terms by feeding these active verbs into a keyword tool?
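For illustration, here is a minimal sketch (the verb and product lists are hypothetical) of how you might generate that candidate list of "active" phrases before running them through a keyword tool:

```python
# Illustrative sketch: combine "active" verbs with your own product terms to
# build a candidate list you can then run through a keyword tool such as
# Overture's suggestion tool or Wordtracker.
active_verbs = ["buy", "find", "order", "download", "compare"]   # hypothetical list
seed_terms = ["accounting software", "home decorations"]          # your own products

candidates = [f"{verb} {term}" for verb in active_verbs for term in seed_terms]

for phrase in candidates:
    print(phrase)   # e.g. "buy accounting software", "find home decorations", ...
```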

People talk to their computers. They will use “I” in their search engine queries sometimes. “How do I find…?” I know I do. I also hold my computer personally responsible when I can’t find something or Windows locks up.

People put .com in their searches. Type .com into the Overture keyword tool and see how many websites people know by name but are still attached to reaching through a search engine.

Which sites have a really bad interface? I can name a few, and some big ones. You can find them by typing "registration", "signup", "login" or "join" into a keyword tool. How about "purchase"? If you are doing affiliate marketing, you may try these and take advantage of webmasters who should really be called webminors.

Although a little off-topic, I read somewhere that sponsoring cars in NASCAR races is one of the most effective ways to advertise a product. NASCAR fans are very loyal. This got me thinking.

Fans are definitely an endless source of revenue. What would they be searching for? A few of the words they may type in are "posters", "zine" and the like. Or just type in a celebrity name plus "fan" and look for keywords in the pages that Google returns.

Then there are terms that are standard on many sites. When I was looking for places to submit my software, I just plugged “submit pad” into Google. If you are looking for Clickbank affiliates, you might type the product name and “hop.”

The point is that the customers you are looking for aren't a computer and they aren't a dictionary. So you can't be either when you search for terms to optimize. You've got to think like someone in need of your product. I bet if you gave it a little time, you could come up with more examples.

Author Bio:
Stephan Miller is a freelance programmer and writer. For more Adwords resources, visit the page below http://www.profit-ware.com/resources/googleadwords.htm

Google Paid Placement Review
With the recent 'Florida' update at Google and its impact on the SERPs, particularly for commercial search phrases, it has become imperative that we examine the paid advertising scene at Google. For scores of merchants, paid advertising will now be the only way to salvage their sales, as well as their future revenues. This will be a two-part article, with the first part focusing on Google's paid advertising space and the second on the best practices for using it. However, before we start on Google's AdWords and AdSense programs, a word or two on the online advertising industry's performance in 2003.

Online Advertising Industry: According to the Interactive Advertising Bureau (IAB), online advertising sales have been robust this year, with three consecutive quarters of growth seen for the first time in the last two years. Let's first look at the yearly sales figures for the last three years. Online advertising peaked in 2000 at $8 billion in revenue (the dot-com boom) before declining to $7.1 billion in 2001 and $6 billion in 2002. In the first three quarters of this year, sales are estimated at a healthy $5.07 billion (up 13% from the same period last year). The breakdown of the quarterly figures is as follows:

• First-quarter sales were $1.63 billion
• Second-quarter sales are pegged at $1.69 billion (up 14% from the same quarter last year)
• Third-quarter sales are estimated at $1.75 billion (up 20% from the same quarter last year)

From these figures, the emerging pattern is that keyword-based text ads are gaining popularity.

According to the IAB, of the $6 billion figure for 2002, 15.4% (roughly $1 billion) can be attributed to search-term-related sales. This is up 200% from 2001. The contribution of search-term-related ads is estimated at 31% of the second-quarter figure; no breakdown is yet available for the third quarter. Financial firm Salomon Smith Barney expects the market, estimated at $1.4 billion in 2003, to grow 30 to 35 percent per year, reaching $5 billion by 2008.

This advertising space is dominated by two major players (obviously): Google and Overture.

Google Adwords
Google has two services for its paid advertisers: AdWords and AdSense. Let's start with AdWords. This is a program in which Google lets advertisers bid for a chunk of the real estate on the search engine results pages for specific queries. The ads appear as "sponsored listings/premium listings" next to the organic search results. Along with getting exposure on Google's own site, these ads are also syndicated to Google's partner sites, such as America Online, Ask Jeeves, AT&T Worldnet, CompuServe, EarthLink, Netscape, Sympatico and others.

Currently, Google is thought to have approximately 150,000 monthly advertisers for its paid placement services. AdWords Account Statistics: every AdWords account can hold up to 25 campaigns; each campaign can have up to 100 ad groups, and each ad group can hold up to 750 keywords. The overall limit on any account is 2,000 keywords. AdWords offers four kinds of keyphrase matching: broad, phrase, exact and negative.
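To make the four match types concrete, here is a simplified, illustrative sketch of how they decide whether a query triggers an ad; Google's real matching logic (especially broad match) is far more sophisticated than this:

```python
# Illustrative sketch only: a simplified take on how the four AdWords match
# types decide whether an ad is eligible for a given search query.
def matches(query: str, keyword: str, match_type: str, negatives=()) -> bool:
    q_words, kw_words = query.lower().split(), keyword.lower().split()
    if any(neg.lower() in q_words for neg in negatives):   # negative keywords block the ad
        return False
    if match_type == "exact":        # query must be exactly the keyword
        return q_words == kw_words
    if match_type == "phrase":       # keyword must appear as a contiguous phrase
        return " " + " ".join(kw_words) + " " in " " + " ".join(q_words) + " "
    if match_type == "broad":        # all keyword words appear, in any order
        return all(w in q_words for w in kw_words)
    raise ValueError(match_type)

print(matches("cheap home equity loans", "home loans", "broad", negatives=["free"]))  # True
print(matches("home equity loans", "home loans", "phrase"))                           # False
```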

Recently, Google has introduced expanded matching for its broad matches; more on this later. AdWords lets you target your ads to 14 specific languages, 250 countries or 200 states/regions within the US. Google claims 99% accuracy for the IP-tracking system it uses to deliver ads to the target audience. Recent Features in AdWords: Google added three new features to AdWords in October. Those are:

• Expanded Matching
• Conversion Tracking &
• Increased Click Through Threshold.

Conversion Tracking
Google has introduced a conversion tracking feature for its advertisers, who can now track the conversions resulting from their advertising traffic. Conversion tracking helps advertisers quantify the ROI they are achieving with their campaigns. The feature works by setting a cookie on the user's computer whenever someone clicks on an advertisement. If that user later reaches the conversion page, the cookie is matched and Google records a successful conversion.
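As a rough illustration of that click-cookie mechanism (this is not Google's actual implementation, and the cookie name is invented), the flow looks roughly like this:

```python
# Toy illustration of the mechanism described above: a cookie is set at click
# time, and the conversion page records a conversion only if it is present.
conversions = []

def on_ad_click(ad_id: str, cookies: dict) -> None:
    cookies["gclick"] = ad_id          # hypothetical cookie name

def on_conversion_page(cookies: dict) -> None:
    ad_id = cookies.get("gclick")
    if ad_id is not None:              # the visit originated from an ad click
        conversions.append(ad_id)

cookies = {}
on_ad_click("campaign-42/ad-7", cookies)
on_conversion_page(cookies)
print(conversions)                     # ['campaign-42/ad-7']
```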

Expanded Matching
As part of its broad matching of keywords, Google has introduced a 'smart' feature called expanded matching. With expanded matching, the AdWords system automatically runs ads on highly relevant keywords, including synonyms, related phrases, misspellings and plurals, even if they aren't in the original list of keywords you submitted to Google. For example, if your keyword is 'software development', Google's system will try to guess alternate searches on which to display your ads, such as 'software solutions', 'software services' or 'technical solutions'. Over time, it will monitor the click-through rates (CTR) for these searches and 'learn' how relevant they are for you, which helps it make expanded matches more specific. Also, by mining search queries, it can develop fresh combinations of search terms relevant to your business.

This theme-based AdWords tuning is in sync with Google's trend of trying to understand 'available content' and 'user queries' and make 'intelligent matches' between the two. In an effort to let you know which keywords it will use to broaden your exposure, Google has for the first time put out an AdWords keyword suggestion tool: Google Adwords tool

This tool highlights Google's view of which other terms it 'understands' to be related to your business. (Tip: this tool works well for broad searches.)

Increased Click Through Threshold
The increased click-through threshold is designed to help ads that may have struggled for traffic due to poor search relevance, including those shown as contextual ads. Google now evaluates each account's performance after every 1,000 ad impressions. If the CTR for the account falls below the minimum required CTR (which varies by ad position, geography and other factors, but is 0.5% for the top spot and slightly lower for each subsequent position), the ads will only be displayed occasionally for the underperforming keywords.
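A simplified sketch of that evaluation might look like the following; the 0.5% top-position figure comes from the description above, while the per-position step-down is purely an assumption for illustration:

```python
# Simplified sketch of the evaluation described above; the minimum-CTR
# schedule below the top position is an assumption, not Google's real one.
def min_required_ctr(position: int) -> float:
    return max(0.005 - 0.001 * (position - 1), 0.001)

def evaluate(keyword_stats):
    """keyword_stats: list of (keyword, impressions, clicks, avg_position)."""
    flagged = []
    for keyword, impressions, clicks, position in keyword_stats:
        if impressions < 1000:                 # evaluated every 1,000 impressions
            continue
        ctr = clicks / impressions
        if ctr < min_required_ctr(position):
            flagged.append((keyword, ctr))     # will only be shown occasionally
    return flagged

stats = [("home equity loans", 1200, 9, 1), ("loans", 5000, 10, 3)]
print(evaluate(stats))   # [('loans', 0.002)]
```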

Google Adsense
Google AdSense is an offshoot of the Google content-targeted advertising program launched in early March 2003. That early program let large websites integrate Google AdWords ads into their pages; each deal was negotiated individually with Google, and sites with fewer than 20 million page views couldn't participate. The Google AdSense program democratizes the content-based text ad display process: even small merchants and bloggers with only a few thousand page views per month can now apply, and the program lets Web publishers apply online.

After Google vets a site for popularity and quality (a process the company estimates will take two to three days), accepted applicants download a string of HTML code to insert on the Web pages on which they wish to carry text-link ads. Google draws the listings from its base of 150,000 advertisers, using its algorithmic search technology to scan the content page and match it with relevant links, which are displayed either as skyscraper ads on the right side of the page or as banners at the top. Some of Google's larger partner networks are HowStuffWorks, Mac Publishing (including Macworld.com, MacCentral, JavaWorld and LinuxWorld), New York Post Online Edition, Reed Business Information (including Variety.com and Manufacturing.net) and U.S. News & World Report online. All the rules that apply to AdWords listings also apply to AdSense with respect to CTR, positioning and so on.

Author Name: Vikas Maholta
Company: Mosaic Services
Email: sunny@mosaic-service.com

After the Google Update
By now, probably just about anyone who has a website has noticed that Google made some drastic and major changes to its search algorithm when it began the regularly scheduled monthly update of its massive database of about 3.5 billion pages. This one was dubbed the 'Florida dance'; others prefer to call it the 'devastating Florida hurricane'.

Whatever you want to call it, hundreds of articles, theories, wild guesses and plain gossip have circulated since Google implemented its new 'spam filter' around November 15, 2003. There were even some wild allegations in the press of a Google conspiracy to maximize the sales of its AdWords PPC program.

On November 26, Stephen Lynch of the New York Post called me and conducted a phone interview to better understand how small and medium-size businesses can cope with this disaster, and published his findings the next day.

I have written a number of articles offering my analysis, both on Google's new spam filter and on the way the Florida update behaved.

If your site suffered some major hurricane damage of the F-5 category, or if you just lost a few positions on some of your keywords or search terms, this article will help you repair a lot of the damage the ‘Google Gods’ have caused to your website.

It is hoped you will profit from my advice. Since November 26, I have been 'repairing' many sites that received the over-optimization penalty, or OOP. If your site did get the OOP, I will offer some advice on how you can fix that and prevent it from happening again down the road. Additionally, if you are already participating in a link exchange program, or if you're contemplating one, read the section of this article titled 'The new linking strategy'.

Removing the Rubble After Hurricane Florida
By now, it's very clear what Google has implemented in its new algorithm and how to avoid falling victim to it, either now or later. One thing is certain: the 'good times are over', and from now on only the best-optimized sites, i.e. the ones Google approves of, will maintain the rankings they had prior to the Florida update.

First, you need to carefully determine whether the OOP was applied to all of your site's pages or just a few of them. So far, in 90% of the sites I have seen that received the OOP, only some of the pages were affected. The other 10% will have to be completely revamped from a search engine optimization point of view.

Note that all the sites we have worked on that received the OOP were sites our competitors had optimized before the owners came to see us, understandably upset that their sites could no longer be found for their previous search terms.
You can determine whether a page on your site has received the OOP simply by entering your keywords in a Google search and excluding a unique string of characters with the "-" operator. Example:

your main keyword or key phrase -blahblahblah

The -blahblahblah string of text doesn't matter one bit; it's simply a string of nonsense characters, and you can type anything you wish and it will work. What is important is the "-" character, which appears to make Google bypass the new spam filter that causes the OOP penalty.

After running this simple test, if you see your page reappear under these conditions, it's highly likely the OOP is in effect for that particular keyword or key phrase on that page.

Note that there is a possibility Google might modify how this type of search works, in an effort to prevent people from seeing the results without the penalty filter. However, at the time of writing, this search technique enabled me to detect which penalized pages had been affected by the OOP, so it is a fairly accurate test.
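If you want to script the test, a small helper can build the two queries for you; this is illustrative only and simply constructs the URLs, so you still compare the results by hand:

```python
# Builds the normal query and the "-exclusion" query described above so you
# can compare where your page ranks in each.
from urllib.parse import quote_plus

def oop_test_urls(phrase: str, junk: str = "blahblahblah") -> tuple:
    base = "http://www.google.com/search?q="
    normal = base + quote_plus(phrase)
    filter_bypassed = base + quote_plus(f"{phrase} -{junk}")
    return normal, filter_bypassed

for url in oop_test_urls("home decorations"):
    print(url)
```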

Repairing Your OOP Links
The reason your site received the OOP is simply that Google's new anti-spam filter penalizes Web pages when their exact keywords appear in the links pointing to those pages.

Here's an example: if your site sells home decorations and you have links pointing to it with the keywords 'home decorations' in them, you will have to re-edit those links with other keywords that still fit your site.

In the above example, instead of using 'home decorations' in the link to that page, use 'decorations for the home' or 'home & interior improvements'. An alternative would be to leave the link unchanged but instead change the title of the page, the description tags, the H1s or H2s and the headlines to the new keywords 'decorations for the home' or 'home & interior improvements'.

Personally, I think simply changing the keywords in the link text is faster and less complicated, but you are free to use whichever approach you feel most comfortable with. Either way, it will have the same desired effect. Remember that any changes you make will not take full effect until the next Google 'dance', which some believe might happen around December 19 or later. It is only about 7 to 10 days after that that you will see your rankings improve again.
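To hunt for the exact-match anchors that trigger this filter, a minimal sketch like the one below can help; the regex-based parsing and the suggested replacement phrase are simplifications for illustration:

```python
# Minimal sketch: scan the HTML of a page that links to you and flag anchors
# whose text exactly matches the keyword you are trying to rank for.
import re

def exact_match_anchors(html: str, keyword: str):
    anchors = re.findall(r"<a\b[^>]*>(.*?)</a>", html, flags=re.IGNORECASE | re.DOTALL)
    return [a for a in anchors if a.strip().lower() == keyword.lower()]

page = '<p>See <a href="http://example.com/">home decorations</a> for ideas.</p>'
hits = exact_match_anchors(page, "home decorations")
if hits:
    print("Exact-match anchors found:", hits)
    print("Consider a variation such as 'decorations for the home'.")
```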

The New Linking Strategy
If you already practice link exchanges with other websites, commonly known as 'reciprocal link exchange programs', or if you have been thinking about starting, then this section is for you. Since Google counts a link to your site as a 'vote', links from sites with a PageRank of 4 or higher can benefit your site's link popularity, and hence your rankings in the Google search engine.

Since the advent of the Florida update, Google is more selective and analyzes the links pointing to and from your website more carefully. On top of making certain you don't use your exact keywords in the text of links from other sites, you should also link to sites that are relevant to yours. If you sell accounting software, linking to sites that offer spreadsheet programs or accounting templates will certainly give both your site and theirs a boost in your respective rankings.

On the other hand, if you link to a hotel or a casino, that link will certainly not be counted as a valid link, since it has nothing in common with your site. Before requesting or accepting a link, ask yourself if that link will actually benefit your users. If so, then go ahead and do it. Otherwise, your time and efforts can be better used elsewhere.

Conclusion
The Florida update is certainly the biggest and most important change Google has ever made to its search technology in its short five-year history.

However, it probably won't be the last. Whatever the sceptics say, one can only hope that the real reason Google made those changes to its search algorithm is to improve the quality and relevancy of its results.

However, I would add that it is also an attempt to remove all the unwanted 'spam-dexing' and the large amount of search engine clutter Google had in its massive database of over 3.5 billion pages prior to November 15.

Granted, the amount of work or expense some websites will need to incur to 'repair' this damage can be significant in certain cases. However, it is imperative that the repairs be done correctly, in accordance with the recommendations in this article. If these corrections are implemented in the right way, then by the end of December these sites should enjoy many of the rankings they held before the passage of the 'hurricane'.

Author:
Serge Thibodeau of Rank For Sales

Buying & Selling Google PageRank
There seems to be a lot of buzz about websites that are willing to shell out big bucks for links on sites with high Google PageRank (PR). And many webmasters and webmistresses are going crazy to cash in on the big windfall.

What is Google PageRank? Simply put, PageRank is a bit like link popularity, but there are many factors involved in its calculation.

You can look at the very brief "official" Google information on PageRank at Google's Technology Page, or visit The Handy Dandy Google PageRank Figurin' Guide for a more technical explanation and hypothesis.

To find out the PageRank of all websites, including your own, download the free Google Toolbar.

The Basic Premise of “Buying” PageRank is This:

People assume that the higher the PageRank, the more valuable the link. Therefore, if they buy some links on the highest ranking sites, they will receive the most benefit.

There are a few problems with this theory. Before you buy and before you sell any links with the hope of gaining PR, consider the following:

1. Where is Your Link Coming From?: Let's say you are being offered a link from a site whose home page has a PageRank of 9. Well, that's all fine and dandy, but if you are not receiving your link from the home page, you need to know the PageRank of the precise page that is linking to you.

Generally, Google indexes links coming from specific pages that have a PageRank of 4 or higher.

2. Number of Outgoing Links on the Page: If there are many outgoing links on the particular page, the effect on the pages it links to will be minimized.

Sure, a high-ranking page can transfer some of its PageRank to another page by linking to it. This is true, but the effect is diluted, as the page only has so much PR to transfer.

Therefore, it is very possible that getting a single, solitary link from a page with a PageRank of 5 might be more effective than receiving a link from a page with a PR of 7 that carries numerous links.
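As a rough, back-of-the-envelope illustration of that dilution (using the commonly cited simplified PageRank formula; toolbar PR is logarithmic, so treat these numbers as illustrative only):

```python
# Rough illustration of how much "vote" a linking page passes along:
# approximately d * PR(page) / number_of_outgoing_links, with damping d = 0.85.
def link_contribution(page_pr: float, outgoing_links: int, d: float = 0.85) -> float:
    return d * page_pr / outgoing_links

print(link_contribution(5, 1))    # a PR 5 page with a single link -> 4.25
print(link_contribution(7, 60))   # a PR 7 page with 60 links      -> ~0.10
```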

Before you buy: Always ask how many other links will be included on the page.

3. Long Pages & Placement of Your Link: Google only indexes approximately 750-1,000 words per page. If your link is at the bottom of a lengthy page, it's not likely to be indexed or to affect your PageRank at all. To check how far down a page Google has indexed, enter:

cache:url.com

in the search box.

4. Let's Be Serious: Google PageRank is Not the Key to Search Engine Traffic: Sure, it's nice to be able to boast about your great Google PageRank, but even with a PageRank as low as 2 or 3 you can get a steady stream of search engine traffic. In fact, if you're targeting very specific keyword phrases in your optimization, as you should be, PageRank is not such a big factor.

5. Where Does the Selling Site Get Its Backward Links From?: Perhaps they have a good PageRank, but they could be using deceptive techniques and just haven't been caught yet. If you see a site with backward links from a ton of FFA (free-for-all) link pages, guestbooks and the like, steer clear. These people will not keep their PageRank for long.

Don't get me wrong: building your PageRank is useful and desirable. But there really is no reason to shell out hundreds, or even thousands, of dollars to artificially boost your ranking, particularly if the opportunity is not as good as it sounds.

For a more sound approach to building link popularity, PageRank and your online business, read Increase Your Link Popularity:
Building a Link Popularity Strategy

Author Bio:
Alice Seba is the owner and editor of http://www.internetbasedmoms.com, an Internet Marketing & Networking resource site for work at home moms and other entrepreneurs.

Buzzwords vs Effective SEO Keywords
Ever see a website that seems to speak a foreign language…in English? We encounter many SEO client websites that rely on buzzwords in the page copy to get the word out about their product. The problem lies with visitors who may not be familiar with those terms. This means optimizing with buzzwords may not be the best way to gain traffic. If your prospective visitors are not searching for those terms, how do they find your website?

Start With The Obvious
You really need to know your industry. Study your prospective visitors–who your target audience is. If your prospective visitors are highly technical and work and talk in “buzzword speak”, no problem. But if you also want to attract prospective visitors who may not be immersed in the terminology used in your business, you must compensate by optimizing with a wider array of targeted keywords.

How Do I Find All Those Keywords?
Start researching. Yes, it’s going to take a little work on your part to take a close look at what keywords you may be missing out on. Keep account of prospective website visitors who may use other terms to find your website. Track the keywords used by visitors through your log reports. Most log statistics programs have a report showing the keywords used by searchers to find your website. Using your server logs or log statistics program for keyword information is a good way to get a better picture of how visitors are finding your website. Use Overture’s keyword tool (http://inventory.overture.com/d/searchinventory/suggestion/) or Wordtracker (http://www.wordtracker.com) and note the words used on your competitors’ websites.
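If you prefer to pull those phrases straight from your raw access log, a minimal sketch along these lines works; it assumes a standard "combined" log format and engines that pass the query in a "q" or "p" parameter (as Google and Overture did), so adjust as needed:

```python
# Illustrative sketch: count the search phrases found in the referrer field
# of a combined-format access log.
import collections
import re
from urllib.parse import urlparse, parse_qs

referrer_re = re.compile(r'"[^"]*" \d+ \S+ "(?P<ref>[^"]*)"')

def keyword_counts(log_lines):
    counts = collections.Counter()
    for line in log_lines:
        m = referrer_re.search(line)
        if not m or m.group("ref") in ("", "-"):
            continue
        params = parse_qs(urlparse(m.group("ref")).query)
        for key in ("q", "p"):                       # common search-query parameters
            for phrase in params.get(key, []):
                counts[phrase.lower()] += 1
    return counts

sample = ['1.2.3.4 - - [01/Dec/2003:10:00:00] "GET / HTTP/1.1" 200 512 '
          '"http://www.google.com/search?q=accounting+templates" "Mozilla"']
print(keyword_counts(sample))   # Counter({'accounting templates': 1})
```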

Using these, or similar tools, type in your buzzwords and see what variations come up. Competitor websites may use a slightly different language than you when writing copy for their pages. Visit their websites and learn all you can about how many ways your business can get its message across. Read online articles; visit business newsgroups and forums. Find research information through industry websites and companies that specialize in producing reports about your industry.

Help Search Engine Robots Do Their Job
Search engine robots are just automated programs. Their concept and execution is relatively simple: search engine robots “read” the text on your pages by going through the source code of your web pages. If the majority of the words in your source code text are buzzwords, this is the information that will be taken back to the search engine database.

It’s Obvious (the “DUH” factor)
Ok, so it’s obvious to you what your industry buzzwords are. But don’t discount the simpler versions of those catchy words. Focus also on some lesser used terms and make a list of additional keywords you might be able to add. Clear, precise copy that catches the visitor’s attention and tells your story is generally more effective in the long run.

Compromise – Mix SEO Keywords & Buzzwords
You don’t want to change the copy on your webpages? This is often a problem with business websites. Once you have your keyword list of other-than-obvious words, work at fitting them into the page text carefully. You want them to make sense with the context of the web page. Use these new keywords as many times as “makes sense” so they do not sound spammy. Read your copy out loud or have a colleague read your copy to get a sense of how it might sound to a website visitor.

The Bottom Line
It should be easy enough to see how those extra keywords are producing for you. Keep track of your log reports and see if those new terms start showing up in your reports. Test a variety of keywords, then test again to see if visitors are staying on your website, moving through your individual web pages, or clicking away. Create specific pages using those keywords as a test scenario. The information you need should be available to you in your log statistics reports for visited web pages.

Don’t let business jargon get in the way of getting your message across to your audience. Yes, buzzwords may sound cutting edge, but the bottom line is, traffic and sales are what you really want to show for your hard work.

Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing, a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.

Every Search Engine Robot Needs Validation
Your website is ready. Your content is in place, you have optimized your pages. What is the last thing you should do before uploading your hard work? Validate. It is surprising how many people do not validate the source code of their web pages before putting them online.

Search engine robots are automated programs that traverse the web, indexing page content and following links. Robots are basic, and robots are definitely not smart. Robots have the functionality of early generation browsers: they don’t understand frames; they can’t do client-side image maps; many types of dynamic pages are beyond them; they know nothing of JavaScript. Robots can’t really interact with your pages: they can’t click on buttons, and they can’t enter passwords. In fact, they can only do the simplest of things on your website: look at text and follow links. Your human visitors need clear, easy-to-understand content and navigation on your pages; search engine robots need that same kind of clarity.

Looking at what your visitors and the robots need, you can easily see how making your website "search engine friendly" also makes it visitor friendly.

For example, one project I worked on had many validation problems. Because of the huge number of errors generated by problems in the source code, the search engine robots were unable to index the web page, and in particular, a section of text with keyword phrases identified specifically for this page. Ironically, human users had problems with the page as well. Since humans are smart, they could work around the problem, but the robots could not. Fixing the source code corrected the situation for human and automated visitors.

There are several tools available to check your HTML code. One of the easiest to use is published by the W3C (http://validator.w3.org/). While you're there, you can also validate your CSS at W3C's CSS validator (http://jigsaw.w3.org/css-validator/). The reports will tell you what source code needs to be fixed on your web page; one extra or unclosed tag can cause problems. With valid code, you make things easier for your human visitors, and search engine robots can travel through your website and index your pages without source code errors stopping them in their tracks. How many times have you visited a website only to find something broken when going through the pages? Too many to count, I'm sure. Validating your pages makes it easier for your website to get noticed.
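If you have many pages to check, you can also script the W3C validator; the sketch below assumes its "check?uri=" interface and its X-W3C-Validator-Status / X-W3C-Validator-Errors response headers, which may change over time:

```python
# Quick loop over a list of pages, asking the W3C validator for each one.
import requests

def validate(url: str) -> None:
    resp = requests.get("http://validator.w3.org/check", params={"uri": url})
    status = resp.headers.get("X-W3C-Validator-Status", "Unknown")
    errors = resp.headers.get("X-W3C-Validator-Errors", "?")
    print(f"{url}: {status} ({errors} errors)")

for page in ["http://www.example.com/", "http://www.example.com/products.html"]:
    validate(page)
```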

As I said before, what works for your website visitors works for the search engine robots. Usability is the key for both your human visitors and automated robots. Why not provide the best chance for optimum viewing by both?

Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing (http://www.searchinnovation.com), a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.

Why You Should Always Avoid Cloaking
When building, implementing or optimizing a website, you should avoid any form of cloaking at all costs, since those techniques are strictly forbidden by most major search engines today, for very good reasons. However, before I explain why you should always avoid cloaking, let me give you a brief introduction to the rather complex world of cloaking.

By definition, cloaking is a technique of delivering a certain set of pages to search engine crawlers, while at the same time delivering a completely different set of pages to your human visitors. There are many ways of doing this. Although some will tell you that the most effective way to cloak is with the use of IP (address) delivery software, the jury is still out on this and opinions vary widely. The subject of cloaking has always been and still is a very controversial topic.

Sophisticated cloaking software that uses IP delivery to cloak pages consists of complex programs, and explaining their workings properly would fall well beyond the scope of this article. However, let me give you a simplified explanation of how these cloaking programs work.

The Idea Behind Cloaking
When a search engine spider or human visitor requests a specific Web page, cloaking programs compare the IP address of the requesting party against a database of known IP addresses belonging to search engine spiders. If the IP address matches one on the list, the program serves the spider a particular page written specifically for that search engine. This page is normally optimized to rank well on that engine, in order to achieve high visibility.

On the other hand, if the IP address does not match one in the database, the requesting entity is 'assumed' to be a human visitor and is served the actual 'real' page, or is simply redirected to another page appropriate for human visitors (!)
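Purely to illustrate what such programs do (and, again, this is the behaviour you should avoid; the IP entries below are invented), the core IP-delivery logic boils down to something like this:

```python
# Stripped-down sketch of the IP-delivery logic described above. Real cloaking
# packages maintain large, frequently updated spider IP databases.
KNOWN_SPIDER_IPS = {"66.249.64.1", "66.249.64.2"}   # hypothetical entries

def page_for(request_ip: str) -> str:
    if request_ip in KNOWN_SPIDER_IPS:
        return "spider_optimized.html"    # page written for the search engine
    return "real_page.html"               # page shown to human visitors

print(page_for("66.249.64.1"))   # spider_optimized.html
print(page_for("203.0.113.9"))   # real_page.html
```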

Since most modern search engines constantly change their IP addresses in an effort to defeat this silly cat-and-mouse chase, you can imagine that if these databases are not regularly updated they become obsolete very quickly, in effect making such cloaking programs completely useless in no time!

As a rule, most major search engines today HATE any type of cloaking software and categorically forbid it in their terms of use. One strong opponent of any form of cloaking has always been Google. Today, Google will take strong action against any 'cloaking website' it discovers; such actions range from temporarily penalizing the site to banning it outright and permanently from its index.

The simple reason for all this is that search engines rightly feel that they should always be delivered the same Web page that any human visitor would normally see. Many search engines also feel (and I agree with them) that webmasters making use of cloaking software or programs are in effect ‘spamming the index’. While some people may not agree with this, the search engines’ point of view does have some good merit and, if you want your site indexed by them, you don’t have any other option but to follow their rules.

Conclusion
Always build a great site with a lot of valuable content for ALL your visitors and the search engines, and make certain the search engines get the same pages your actual visitors get. If you then follow the rest of the advice offered on this website, your site should always rank high. Avoid cloaking at all costs; it is simply never an option.

Remember that all search engines today are only concerned with two important things: the relevancy and quality of their search results. Also, you should always remember that it is THEIR search engine and only they make the rules. So, in light of all this, it would be wise for you to completely forget about any kind of cloaking, unless you are willing to take the chance of having your site penalized or banned permanently from the major search engines.

Author:
Serge Thibodeau of Rank For Sales

Where On Earth Is Your Web Site?
You've just finished congratulating your marketing team. After six months of concentrated effort you can now actually find your own company web site within the search engines. Everyone is busy handshaking and back patting when a voice from the back of the room rises above the din. "Yeah, this is great! Can't wait until we can find ourselves on wireless devices." All conversation comes to an abrupt halt. Eyes widen. Everyone turns to the fresh-faced intern standing in the corner with a can of V8 juice in one hand and a Palm device in the other. You, being the Department Manager, barely managing to control your voice, not to mention your temper, ask the now nearly panic-frozen intern, "What do you mean, find ourselves on wireless? We just spent thousands on our web site visibility campaign!" "Well…" explains the sheepish intern, "there is no GPS or GIS locational data within our source code. Without it, most wireless appliances won't be able to access our site."

Guess what? The intern is absolutely correct. Anyone interested in selling goods and services via the Internet will soon be required to have some form of geographic location data coded into their web pages. There are approximately 200 satellites currently orbiting the Earth (even NASA won't confirm the exact number); some are in geosynchronous or geostationary orbit 27,000 miles above your head. The Global Positioning System (GPS) is the name given to the mechanism of providing satellite ephemerides ("orbits") data to the general public, under the auspices of the International Earth Rotation Service Terrestrial Reference Frame (ITRF). Sounds like Star Wars, doesn't it? It's pretty close. The NAVSTAR GPS system is a satellite-based radio-navigation system developed and operated by the U.S. Department of Defense (DOD). The NAVSTAR system permits land, sea and airborne users to determine their three-dimensional position and velocity, 24 hours a day, in all weather, anywhere in the world, with amazing precision. http://igscb.jpl.nasa.gov/

Wireless devices, WAP, cellular, satphones and a whole host of newly emerging appliances, and indeed new software applications, will all utilize some form of GPS or, more likely, GIS data retrieval. GIS stands for Geographic Information System and relies on exact latitude and longitude coordinates for location purposes. Several car manufacturers currently utilize GPS for on-board driver assistance, and the marine and trucking industries have been using it for years. Obviously, your web site is a stable beast; it sits on a server somewhere and doesn't move much, so at first glance it seems quite implausible that you'll need GIS locational data within your source code. On the contrary: one aspect your web site represents is your business's physical location(s), and if people are going to try to find your services and products, shouldn't you at the very least tell them where your business is and how to get there?

Let's look at it from the other end of the spectrum: the end user's approach. Say you're vacationing in a new city for the first time. Once you get settled into your hotel room, what's the first thing you want to find? Restaurants? Bank machines? Stores? So you pull out your hand-held wireless device, log onto the web and search for "Italian food in San Francisco." Five hundred results come back, so you click the new "location" feature on your hand-held (which knows exactly where you are) and ten Italian restaurants, the ones smart enough to code their web sites with GIS data, light up on the screen. Guess which restaurants didn't get selected? The other four hundred and ninety. Starting to get the picture?

How does this affect you and your web site marketing?
GIS latitude and longitude coordinates will soon be a must-have on every web site operator's and web developer's list, and an absolute necessity for anyone wishing to trade goods and services via the Internet. This data may relate to the physical location of the web site, where the site is being served from (if applicable), or where the actual business represented by the site is physically located. There may be multiple locations and multiple pages to code; if, for example, you have a franchise with several locations, each location will probably need a page of its own with the correct corresponding location data. If you run a home-based business, I doubt the coordinates of your living room will be necessary, but you should provide the latitude and longitude of the closest city or town. Large corporations such as banks may want to code the exact location of every automated teller machine across the country. Industry standards and the methods of serving out this data are still in the development phase, but it's a safe bet that plenty of people are working on the solutions right now and, given the speed of technology, implementation will probably come much sooner than later. Give yourself an edge. Find out where in the world your web site is… before your web site is nowhere to be found.
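One informal convention already in use at the time of writing is the "ICBM"/"geo.position" meta-tag scheme popularized by GeoURL; treat the exact tag names below as an assumption and watch for whatever standard eventually emerges:

```python
# Generates location meta tags following the informal GeoURL-style convention.
def geo_meta_tags(lat: float, lon: float, region: str, placename: str) -> str:
    return "\n".join([
        f'<meta name="ICBM" content="{lat}, {lon}">',
        f'<meta name="geo.position" content="{lat};{lon}">',
        f'<meta name="geo.region" content="{region}">',
        f'<meta name="geo.placename" content="{placename}">',
    ])

# Coordinates below are illustrative, not a real business address.
print(geo_meta_tags(37.7749, -122.4194, "US-CA", "San Francisco"))
```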

Author Bio:
Robert McCourty is a founding partner and the Marketing Director of Metamend Software and Design Ltd., a cutting edge search engine optimization (SEO) and web site promotion and marketing company. Scores of Metamend Client web sites rank near or on top of the search engines for their respective search terms.

Higher Google Adwords Clickthrough Rates
1. Target your ads to the right Audience.
You do this by selecting keywords and phrases which are relevant to your product or service. Avoid keywords that are too general because although they generate a large number of impressions, they often generate very few clicks. To improve your CTR, use more descriptive phrases so that your ads will only appear to prospective customers searching for what you have to offer.

2. Use the correct keyword matching option(s).
Google offers four different methods of targeting your ads by keywords: Broad Match, Phrase Match, Exact Match and Negative keyword. By applying the most focused matching options to your keywords, you can reach more targeted prospects, improve your CTR, reduce your cost-per-click and increase your return on investment.

3. Target your ads by location and language.
When creating your AdWords campaign, target your ads by location so that you maximise your sales and improve your CTR. Target the right audience by selecting the languages and countries that you want to reach.

4. Use your main keywords in the Title or Body Text of your ad.
By using your keywords in the title or ad body text of your ad, it will stand out from your competitors and grab the eye of your prospective customers.

5. Create different Ad Groups for different search phrases/keywords.
This will allow you to refine your ads and test them for relevance and therefore maximise your clickthrough rates. For example, if your service offers loans, you can create different ad groups for home equity loans (and all other phrases that incorporate this phrase), consolidation loans, student loans and so on.

6. Calculate what you can afford to pay for each clickthrough.
You will find that more focused keywords and search phrases have a higher conversion ratio than more general keywords. It's a good strategy to pay more for clicks from keywords or phrases with a high conversion ratio than for clicks from the more general keyword groups.
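A common rule-of-thumb calculation (the numbers below are hypothetical) is that the most you can afford per click is roughly your conversion rate times your profit per sale, scaled by how much of that profit you want to keep:

```python
# Rule-of-thumb estimate of an affordable cost-per-click.
def max_affordable_cpc(conversion_rate: float, profit_per_sale: float,
                       profit_share_kept: float = 0.5) -> float:
    return conversion_rate * profit_per_sale * (1 - profit_share_kept)

# A focused phrase converting at 4% on a $50-profit product:
print(round(max_affordable_cpc(0.04, 50.0), 2))   # 1.0  -> bid up to about $1.00
# A general keyword converting at only 0.5%:
print(round(max_affordable_cpc(0.005, 50.0), 2))  # 0.12 -> bid no more than about $0.12
```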

7. Use highly targeted Keywords and search phrases.
Be specific when selecting keywords and search phrases for your campaign. General keywords will be more expensive and will result in lower clickthrough rates. If you’re bidding on general keywords that are relevant to your site consider using the exact match and the Phrase match keyword matching options in order to increase your CTR.

8. Test and monitor your ads to get the best clickthrough rates. Refine and fine-tune your ads to maximise click throughs; with Google you can do this in real time. Create different ads for each ad group and then check which ads have the best clickthrough rates.

9. Give Google users a compelling reason to click on your ad link. The easiest way to do this is to provide something of value for free. You can also achieve this if you tailor each keyword to your offer and use relevant terms/ words in both the title and the ad body. Use a different ad for each keyword group or search term. This increases relevance and the likelihood that Google users will click on it.

Author Bio:
Copyright. Ben Chapi owns Venister Home business and Affiliate Program Classifieds at http://www.venister.org.

Optimizing Framed Websites
A number of months ago, I wrote an article on the proper techniques for successfully optimizing a site that makes use of frame technology. I still regularly get questions about sites designed using frames, asking me the best way to optimize them to improve their visibility in the search engines. Every now and then, I also get emails from site owners who are desperate because their framed site is nowhere to be found in the engines. In response to all of this, I think an extended version of my first article is necessary.

For some reason, even today there is still a lot of controversy over whether or not to use frame technology when building a new website. This controversy is certainly well founded, since there still seem to be many site designers who use it, even though it is no longer a good idea. In this section, you will learn all you need to know to get any framed web site properly listed and optimized for the major search engines.

To be fair, let's just say that a framed website can 'sometimes' make necessary updates easier, and many web designers decide to use frame technology for this very reason. Frames can be even more useful for maintaining very big web sites with content spread over 500 or more pages, although I don't necessarily agree with that assumption.

However, as you are about to find out, framed websites usually create many problems and complications for most major search engines in use today. This article will show you the pitfalls and mistakes some webmasters make in implementing framed sites instead of "search engine-friendly" websites. It will also show you the right way to do it, if you still want to go ahead with the idea.

What Constitutes A Framed Web Site?
The easiest and surest way to determine whether a site is framed is to check whether its left-hand navigation menu remains stationary while the information in the centre of the page scrolls up or down. Furthermore, some framed designs include a company logo or image at the top, or sometimes at the bottom, that also remains fixed while the rest of the page moves. Some designs might also include links or navigation buttons in that same stationary section. When you see any or all of these characteristics, you are dealing with a framed website.

Today, anywhere on the Internet or in good technical books, you can read a lot about search engine optimization (SEO) that basically says using frames on a website is like casting a bad spell on it, simply because most major search engines will not navigate the frames; it is simply impossible for them to crawl or spider such pages. In that case, these same people will tell you, a framed website will never get indexed or optimized properly.

That statement is actually both true and false at the same time. It IS false if frames are used correctly. But, by the same token, it is perfectly true if frames are used improperly and in a search engine-unfriendly way. Happily, there are ways around these limitations.

When implemented correctly, a framed site can be ‘almost’ as good as a site without frames and can hopefully deliver results that could be considered reasonable, as compared to what the site delivered previously. However, if your site isn’t designed yet, I would still suggest you do it with search engine-friendly technology, in other words, stay away from frames!

Why Do Search Engines Hate Framed Websites?
In this section, I will explain the main reasons why most framed web sites fail to get indexed properly on the major crawler-based search engines such as Google. When we look at the HTML code of a typical framed web site, we usually see the TITLE tag, the META tags and then a FRAMESET tag. The major search engines' crawlers and spiders, such as Googlebot and Freshbot, are programmed to ignore certain HTML code and instead focus specifically on indexing the actual body text of a page.

But with a typical framed website, there is simply no body text for the search engine's crawler to work with, because the text is located on the other pages of the site, what we call the inner pages.

If you want to know what search engines see when they get to a site using frames, this is what they see:
“You are currently using a web browser that does not support frames. Please update your browser today in order to view this page.”

This is almost the equivalent of putting a sign on your door that says: "If you cannot read this sign, it's because we are permanently closed for business". Yet this is exactly what search engines see when they get to a framed website!

Here is a little test that will prove what I just said: at your keyboard, try a search on Google for the query "does not support frames" and you will discover that Google finds about 2,800,000 web sites! That is a lot of wasted Web pages that could bring in a LOT of sales.

To fix that problem, use that otherwise wasted space to include important keywords and key phrases, adding body text that is rich in sales copy; doing so will make a big difference in helping these framed sites rank higher in the search engines.

How To Properly Use The Noframes Tag
There is an HTML tag called the noframes tag which, when properly applied, effectively offers what the search engine spiders need: the important text data they require to properly include that page in their index.

If you really have to use a framed site for whatever reason, then it is very important to use the noframes tag correctly. Evidently, all these steps will be useful only if the information on the site's homepage is carefully written and makes strong use of your important keywords and key phrases. Again, nothing beats careful research and analysis of the 'right' keywords and key phrases, using the professional facilities of Wordtracker.
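As a quick sanity check, a small script can tell you whether a frameset page carries a keyword-rich noframes section or nothing crawlable at all; the sample page embedded below is purely illustrative:

```python
# Checks a framed page for a <noframes> section and reports roughly how much
# crawlable text it contains.
import re

SAMPLE_PAGE = """
<html><head><title>Home Decorations | Example Store</title></head>
<frameset cols="20%,80%">
  <frame src="menu.html"><frame src="main.html">
  <noframes><body>
    <h1>Home decorations and interior accents</h1>
    <p>Browse our <a href="main.html">catalogue of decorations for the home</a>.</p>
  </body></noframes>
</frameset></html>
"""

def noframes_check(html: str) -> str:
    if not re.search(r"<frameset\b", html, re.IGNORECASE):
        return "Not a framed page."
    m = re.search(r"<noframes>(.*?)</noframes>", html, re.IGNORECASE | re.DOTALL)
    if not m:
        return "Framed page with NO noframes content: spiders see nothing."
    text = re.sub(r"<[^>]+>", " ", m.group(1))
    return f"noframes section found with about {len(text.split())} words of crawlable text."

print(noframes_check(SAMPLE_PAGE))
```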

Always remember that if you optimize a site for the wrong keywords, you will have a site that ranks high, but for search terms that have nothing to do with what your site offers! You will then receive enquiries and questions about products or services that your company doesn't even offer. In the search industry, relevancy and quality are king: the more relevant your site is, the better your results and the higher your ROI will be. It's just as simple as that.

More Framed Website Topics
So far, all the above information takes care of just the site's homepage. As you would expect, the rest of the pages on your site also need to be indexed correctly.

Most site designers and web programmers who elect to use frames do so mainly for ease of navigation, although the same results can be achieved with other, more search engine-friendly technology, such as tables. You simply include your navigational links inside the tables and it's done!

One thing that is really important is to ensure that all the inner pages of your site get optimized the same way, even though you don't have to deal with the noframes tag there. Once all those inner pages are optimized appropriately, after one or two Google monthly updates ('dances') you should start seeing your website properly positioned, ideally close to the top of the SERPs (search engine results pages) in Google and most major search engines in use today.

Conclusion
Framed websites are built on old, outdated technology and are certainly not the best choice. They have serious limitations as far as search engines are concerned. When building a completely new site, or when redesigning an old one, it's always best to stay away from frame technology.

However, if you really have to, optimizing a framed site can be done successfully if the examples and techniques above are followed carefully. Remember that the key to all of this is the noframes tag. If carefully implemented, the noframes tag can successfully remove many of the barriers that plague framed websites.

Author:
Serge Thibodeau of Rank For Sales