
John Wieber

Partner

has 13+ years of experience in web development, ecommerce, and internet marketing. He has been actively involved in the internet marketing efforts of more than 100 websites in some of the most competitive industries online. John comes up with truly off-the-wall ideas, and has pioneered some completely unique marketing methods and campaigns. John is active in every single aspect of the work we do: link sourcing, website analytics, conversion optimization, PPC management, CMS, CRM, database management, hosting solutions, site optimization, social media, local search, content marketing. He is our conductor and idea man, and has a reputation for being a brutally honest straight shooter. He has been in the trenches directly and understands what motivates a site owner. His driven personality works to the client's benefit, as his passion fuels his desire for your success. His aggressive approach is motivating, his intuition for internet marketing is finely tuned, and his knack for link building is unparalleled. He has been published in books and numerous international trade magazines, been featured in the Wall Street Journal, sat on boards of trade associations, and has been a spokesperson for Fortune 100 corporations including MSN, Microsoft, eBay and Amazon at several internet marketing industry events. John is addicted to Peet's coffee, loves travel and golf, and is a workaholic except on Sundays during Steelers games.

Web Moves Blog

Web Moves News and Information

Blog Posts by John

The Hilltop Algorithm
Few events on the Internet have stirred as much controversy and confusion as Google’s most recent change, nicknamed Florida, a move that knocked thousands of companies out of their top positions in search engine results.

There has been no shortage of accusations about Google’s intentions. The most common charge leveled against the search engine giant is that it is trying to increase revenues by forcing companies to buy keywords.

In the midst of this firestorm, we decided to do a little sleuth work to find out what really happened. What we found was that an algorithm named Hilltop was responsible for shaking up the entire online community.

A bit of background first
Google uses PageRank and other technology to drive its search engine. Here is a summary of PageRank from http://www.google.com/corporate/tech.html:

PAGE RANK DEFINED: Condensed version
PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page’s value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves “important” weigh more heavily and help to make other pages “important.”
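
To make the "voting" idea concrete, here is a minimal power-iteration sketch of PageRank. The tiny link graph, the damping factor of 0.85 and the iteration count are illustrative assumptions, not Google's actual implementation.

```python
# Minimal power-iteration sketch of the PageRank "voting" idea.
# The link graph, damping factor and iteration count are illustrative
# assumptions, not Google's actual implementation.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to (every page appears as a key)."""
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue
            share = damping * rank[page] / len(outlinks)  # each link casts an equal share of the vote
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

# A link from A to B counts as a vote by A for B; votes from "important" pages weigh more.
example_links = {"A": ["B"], "B": ["A", "C"], "C": ["A"]}
print(pagerank(example_links))
```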

Google knew that there were two problems with PageRank: it had difficulty filtering out spam sites and it did an inadequate job differentiating between sites that had relevant and irrelevant information.

Krishna Bharat at Compaq Systems Research Center and George A. Mihaila, a professor of Computer Science at the University of Toronto, came up with a new algorithm to fix the problem: Hilltop.

How Hilltop works
What follows is a VERY simplified explanation:

• Hilltop counts the number of meaningful (related to the topic) hyperlinks coming into a content-rich Web site.

• Web sites with numerous meaningful links and volumes of pages with relevant information are called “authority sites”.

• Authority sites enjoy higher rankings on the assumption that they are of more value.

• Web sites with few inbound hyperlinks, or whose links are unrelated to their topic, come from MLM affiliates, or use affiliate programs to inflate PageRank, are demoted.

Here is what the Hilltop Algorithm looks like:

Old Google Ranking Formula = {(1-d)+a (RS)} * {(1-e)+b (PR * fb)}
New Google Ranking Formula = {(1-d)+a (RS)} * {(1-e)+b (PR * fb)} * {(1-f)+c (LS)}
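
Reading LS as some kind of link or topical score for inbound links (the article does not define the variables), a purely hypothetical sketch of what such a term might count could look like this; the shared-keyword test is a stand-in, not Hilltop's actual method:

```python
# Hypothetical sketch of an "LS"-style term: count inbound links coming from
# pages that share the target's topic. The shared-keyword test below is a
# stand-in for whatever topical test Hilltop actually applies.
def local_score(target_keywords, inbound_page_keywords):
    """inbound_page_keywords is a list of keyword sets, one per linking page."""
    target = set(target_keywords)
    return sum(1 for keywords in inbound_page_keywords if target & set(keywords))

inbound = [{"mortgage", "rates"}, {"golf", "clubs"}, {"mortgage", "refinance"}]
print(local_score({"mortgage", "loans"}, inbound))  # -> 2 topically related inbound links
```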

Google quickly found that the Hilltop algorithm also had flaws. So Google created a two-step search process that combines PageRank technology and the Hilltop algorithm.

So how do companies continue to enjoy top rankings on Google?

Getting top rankings on Google depends on meeting the criteria of both PageRank and Hilltop. Therefore, companies have to receive a significant number of votes, or quality links, from authority sites (PageRank) and have meaningful, relevant hypertext links (Hilltop) pointing to their web pages. Just as it has always been, content is critical, and linking is now more important than ever.

Two Conclusions can be drawn from our research
• Google is not trying to force companies to buy keywords; it is simply trying to improve its algorithm.

• The search engine optimization methodology and strategies used to achieve top placements just became significantly more complicated with the addition of Hilltop.

With Google handling 75 to 85 percent of all search requests, companies doing business on the Web must retain top rankings in the search engines in order to survive. And that means making sure that your firm satisfies the criteria of the two algorithms.

Author Bio:
Search Engine Optimization Inc. specializes in getting companies top rankings on all search engines through organic searches. Contact us at 1-877-736-0006 to learn how we can help your company enjoy the benefits that come from top placements.

Google’s Austin Update
Some of the websites that weren’t hit too hard in Google’s Florida update (November 2003) got hit really hard on or around January 23. Google’s latest update is called “Austin”, and these update names are starting to sound like elections.

Depending on the industry you happen to be in, you could have been hit less or harder; it depends on a whole number of factors, and no two situations are exactly the same. One of our clients, whose site until recently had been optimized by another SEO firm, was completely devastated to find that his site was gone from the face of the earth. Things such as FFA (Free For All) link farms, invisible text (!) and stuffed meta tags that run 20 screens long, filled with useless spam, will get your site penalized, or even banned, faster than you can blink an eye.

Google’s new search algorithm is getting ‘smarter’ and better. Part of this is due to the Hilltop algorithm, which I wrote about at the beginning of January. In combination with the PageRank algorithm that Google has been using since 1998, the two algorithms work in tandem to produce what are supposed to be better, more relevant search results. That last claim remains to be tested, as in some cases I have found that the results produced are not always relevant. But Google is continuously tweaking and ‘fine-tuning’ its technology to make it better, and the next few weeks should see more improvements.

Google’s new ‘weapons of mass destruction’
What is really clear at this point is Google’s improved ability to find sites that use spam techniques or methods forbidden by most major search engines, such as cloaking. In fact, to help rectify this situation, on January 25 Google changed the IP addresses of most of its data centers all over the world.

What is really peculiar about this update is that the majority of the sites we have been optimizing for the past 12 months have actually gone up in their rankings. Still, we are hearing and reading reports in the industry that some websites (in no particular industry) have seen their PR (PageRank) drop one point for no apparent reason. Sites that for the past two years had a PR of 6 dropped to 5, with no real change in the number of links pointing to them.

This indicates that Google has changed the way links are counted when computing the PR value a site should have in relation to its competing sites. It is also believed that Google now looks at the ‘theme’ of a site and at the relevancy of its links to better calculate PR. Sites whose links come from others in the same industry will benefit more than before, while sites which get the majority of their links from sites that have nothing to do with them could see their PageRank value go down, as some did on January 23rd.

It is clear now that Google is making some significant improvements and some major changes to its search algorithm, again! Get used to this, as my feeling is that it will continue for the next several months. It is my belief that sites that haven’t gone through any significant loss in their rankings since November could very well fall into the “victims” category if they happen to use any of the forbidden techniques described earlier.

Spring clean up time is here already
For almost as long as I can remember, the major search engines have frowned upon sites that use spam techniques or ‘tricks’ that are not recommended, in an attempt to ‘get away with it’ and possibly rank higher. If you suspect your site falls into this category, the time has come to “clean up” and start producing good, relevant content. From the beginning, sites that produce great, superior content have always benefited, and these quality sites are usually not affected in any major update.

Content that is fresh and regularly updated always helps in getting better rankings. On the Web, content is always king. It’s not just the quantity, but also the quality of the content that is important. Additionally, Google’s Hilltop algorithm has a tendency to detect sites that are authoritative in their field and will usually rank them higher, along with a higher PR value. Hilltop looks at a site’s topic or theme, at who links to the site, at the value of those links and the anchor text surrounding them, as well as at the text in the links themselves.

Remove from your site any meta tags that are irrelevant and replace them with ones that truly relate to the products and services you want to sell, or to whatever it is you are trying to promote on your site. Also, look at your title tags: are they really indicative of what each page is all about?
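
As a small, purely hypothetical example (the product and wording are invented), a cleaned-up head section might look something like this, with tags that describe only what the page actually offers:

```html
<!-- Hypothetical head section for a page selling cedar garden sheds -->
<head>
  <title>Cedar Garden Sheds - Sizes and Prices</title>
  <meta name="description" content="Pre-built cedar garden sheds in 8x10 and 10x12 sizes, delivered and installed.">
  <meta name="keywords" content="cedar garden sheds, garden sheds, storage sheds">
</head>
```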

Look at the body of your text. Is it carefully written, and does it ‘flow well’ for any ‘human visitor’ who tries to read it, or does it sound like you are repeating the same keywords over and over dozens of times?

Build and write your website the best you can, using only the approved techniques recommended in the terms of use of Google and the other engines, and your site should never ‘fall off a cliff’ as some have done in the past two months.

Conclusion
For the SEO community, the next two to three months should be interesting, both as close observers (which most of us are) and from a ranking standpoint. It is now widely expected that Google will go ahead with its long-anticipated IPO. Some think this is the reason it is making such major changes to its search technology. Additionally, speculation and ‘conspiracy theories’ have been rampant in many SEO forums; for the most part, I believe they are unfounded.

Whatever the reasons for these important changes, one thing is clear: Google is working hard to improve the quality of its search results, and this can only be achieved by trying new algorithms, tweaking existing technology, changing IP addresses in data centers, and so on.

Don’t unfasten your seat belts just now – the ‘Hurricane’ may not be over yet!

Author:
Serge Thibodeau of Rank For Sales

Time To Think Summer
Today, I’m sitting at my desk looking out the window at the snow. We have a blizzard warning up as I type this Marketing Monitor. We are to expect between 8 and 10 cm (3 to 4 inches) on top of the same amount of snow we’ve received daily for the past 3 or 4 days.

On top of all that the temperature is below zero and has been for longer than normal. Yet despite the cold I’m thinking summer.

Not because I hate the snow. In fact, I kinda like it. No, I’m thinking summer because some of our clients’ busiest seasons online are in the summertime. So I thought I’d give some pointers here.

Remember that even if it’s snowing and cold where you are, summer is just a few short months away. Now is the time to start planning and budgeting for your summer search marketing campaign. Whether you are going to be running a PPC campaign, or strictly organic SEO, you have to begin preparing.

If you are a seasonal business and rely on summer sales, you should start researching key phrases for the summer. Consider what you sell, and what your competitors sell.

Be sure to review your visitor logs from last year to see what brought people to your site. Of course, with the many changes in search over the past few months, they won’t all be relevant phrases, but perhaps some will help you come up with ideas for additional phrases you can market.

Assess your current website’s condition. Does it look dated? Perhaps it needs a facelift. A simple redesign can give a site a fresh look and perhaps drive more traffic. Sometimes a good indicator of what your site needs is your competitors’ sites. Those who aggressively market on the web have developed a formula for maximizing return on their web investment, so take some pointers from them.

If you do consider a redesign, please consider the implications. We wrote an article on this subject quite a few years ago, but the same rules still apply (you can read this article here). If you are working with an SEO company, above all consult with them before you do anything. They can help you minimize the impact of your new site. If you determine that a redesign is in order, now is a good time to do it, particularly if your key business occurs in the summer. If it’s done properly, the impact can be minimal and you will likely recover by spring.

Consider the current web environment. Again, many techniques and tactics used last year don’t work on today’s engines. Therefore you may want to revisit your optimization.

If you perform these simple tasks now, in preparation for your summer season you can help your site reach its potential without negatively impacting your online visibility.

Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com

Search Engine Positioning Specialists

SEO vs PPC
Back in 1997 when I started getting web sites to the top of the search engines it wasn’t even called “Search Engine Optimization”. In fact, there wasn’t a name for what I did much less a multi-billion dollar industry. I realized back then that search engines were the only place to find what you were looking for on the web. They were a phone book of sorts with about 4 billion listings that you could sort through in less than 1 second with the push of a search button.

Now the search engine game is very different yet very much the same. You have the media bombarding you about pay per click, sponsored listings, featured listings, PPC, CPA, and you don’t hear very much about natural search engine optimization anymore. You don’t know if there even is such a thing, because most companies that perform search engine optimization of web sites are small and can’t compete with the large advertising budgets of the major search engines like Google and Overture. But the deeper you dig, the more you realize the algorithms haven’t changed that much and getting to the top can be made simple. This leads us to two questions.

1. Does search engine optimization work anymore?
2. What kind of traffic can I expect to see from both methods?

We’ll start with the first question.

Does search engine optimization work anymore?
The answer is a resounding ABSOLUTELY! Natural search engine placement and optimization is not dead at all. In fact, the industry is growing at a tremendous rate. Competition for arguably the most competitive phrase on Google, “search engine optimization”, has drastically increased. As of the writing of this article there are over 1,620,000 results when you type in “search engine optimization”, compared to only 560,000 two months ago.

Okay you say, that makes sense but give me more proof.

GlobalPromoter.com does no advertising other than natural search engine optimization and natural search engine placement in the major search engines. We spend $0 on pay-per-click campaigns or any other method of advertising, yet our traffic rivals that of our competitors and we average an impressive number of account signups on a daily basis. Our visitor-to-purchaser ratio is upwards of 4.5%, which is incredible in any industry. Why do we have such a high purchase/click ratio? Because people are looking for us; we’re not looking for them. When a user goes to Google, types in “search engine optimization tools” and finds us on the first page, they know we know what we’re doing and are compelled to click, if only out of awe.

The secret of the search engines is the ability to be found, not the other way around. That’s why the natural search engine listings in Google outperform the AdWords listings. Users know that a site listed in the Sponsored Matches section, or on the right side of the results, means a business or individual is paying every time someone clicks through to their site. That equates to advertising, which is no different from radio, television, newspapers or magazines. That’s a company “pushing” its product onto the consumer.

But when a user finds a site in the web matches section, they have more confidence. That site didn’t pay to get there; it is there because Google’s or Yahoo’s or AOL’s algorithm decided it was the most appropriate match for the search, based on the entire site’s content. This is “pull” demand, meaning the user is looking for us instead of us looking for them. If you can get on the pull side of advertising, you’ll experience much higher purchase/click rates from every visitor to your web site.

On to question 2.

What kind of traffic can I expect to see from both methods?
This question needs to be answered in two parts. First let’s look at the ppc method.

PPC search engine listings will give you as much traffic as there is demand for a given keyword or keyword phrase. Meaning, if there are 500,000 searches a month and your listing is appealing, you can expect to receive approximately 2 – 5% of those searchers. Let’s say you get an incredibly high click-through rate of 5%. That means you have 0.05 * 500,000 = 25,000 visitors at your disposal. But if a keyword has 500,000 searches in a month, it’s fairly competitive, and it could easily cost $1.00 per click to be in the top 3 positions for that keyword. So if you are paying $1.00 per visitor and you get 25,000 visitors, you have paid $25,000 for the traffic that one keyword generates for your site.
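
Here is the same back-of-the-envelope math in a few lines of Python. The search volume, click-through rate and cost per click are the article's illustrative figures, not real campaign data, and the $50-profit-per-sale break-even check is an added assumption for context.

```python
# Reproducing the PPC estimate from the paragraph above.
monthly_searches = 500000
click_through_rate = 0.05   # optimistic 5% CTR
cost_per_click = 1.00       # dollars; assumed top-3 bid on a competitive term

visitors = monthly_searches * click_through_rate   # 25,000 visitors
monthly_spend = visitors * cost_per_click          # $25,000

# Added assumption: $50 profit per sale. How many sales just cover the spend?
profit_per_sale = 50.0
breakeven_sales = monthly_spend / profit_per_sale
breakeven_conversion = breakeven_sales / visitors
print(f"{visitors:,.0f} visitors cost ${monthly_spend:,.0f}; "
      f"{breakeven_sales:.0f} sales ({breakeven_conversion:.1%}) just to break even")
```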

I think you can see how risky and expensive PPC can be. Unless you know you can convert visitors into sales and your profit margin on the items you’re selling is incredibly high, caveat emptor (buyer beware).

On the flip side, when your site shows up in the natural rankings you don’t pay a single cent for any of the traffic it generates. This means you have more money for developing your site, tweaking marketing tactics, making your product better, etc…

As for the old argument that you won’t get as much traffic from natural placements as from PPC listings, that’s a myth. Several of our customers receive over 50,000 visitors a month on average from natural placements in the major search engines. In fact, when we optimize a client’s web site, one of their goals is to decrease the amount of money they are currently spending on PPC advertising. After the completion of the optimization plan, 75% of our clients completely abandon their PPC programs. This leads us to a general comparison of PPC vs. natural rankings.

Advantages of Search Engine Optimization
1. An up-front fixed cost vs. fluctuating costs that can skyrocket with PPC advertising.

2. Long-term listings and rankings with natural placement vs. showing up only as long as your bank account has money.

3. Natural rankings have higher click-through ratios than PPC listings, because natural rankings are pull demand rather than push demand.

In Conclusion
I’m going to close this article with an analogy. Most of you have been camping before and remember at least one cold night when you couldn’t get a fire started. So you went and got some lighter fluid and squirted it on the dry oak or whatever wood you used. Then you threw a match into the fire and began to warm up next to the fire.

The lighter fluid is akin to PPC advertising. When the lighter fluid is squirted on the fire, the flames shoot high and bright and then vanish. Just like PPC advertising, it’s short term: as soon as the money is gone, so is your exposure. Natural rankings, on the other hand, are like the solid oak in the fire. The oak will burn for hours and hours and keep you warm much longer than lighter fluid alone. Like the oak, natural search engine optimization campaigns last in excess of 6 months instead of one day. And if you learn the secrets of good web site optimization, you can stoke the fire and make it last even longer at no added cost. Of course, it takes a little longer to get the oak branches to light up, but once you get them going they will last for a long time.

Many people see the solution to their search engine marketing campaign in pay per clicks because they’re easily set up and effective almost immediately. However, those that understand the principle of laying a solid foundation and building upon it can understand the long term benefits of natural search engine placement. It may take longer to get the same results but it will cost much less in the long run.

Author Bio:
Jason Dowdell is the founder and CEO of http://www.GlobalPromoter.com, a search engine optimization and marketing firm specializing in educating and empowering customer websites. Jason is also the founder of TurboPromoter.com, a web-based seo/sem project management suite comprised of professional seo tools, in-depth tutorials and an integrated help system.

28 Jan 2004

Site Rankings Gone?

As you may already know, a good part of my job is researching how the organic search engines work. Trying to figure out how the algorithms work in ranking pages is crucial to our day to day operations. Occasionally, we come across sites which seem to defy explanation – they have proper optimization, good internal linking and so on, yet seem to be getting penalized by engines such as Google. Today, I’m going to explain how we began researching a particular problem, in hopes that if it happens to you, you will know what to do.

The first indication that there was a problem with a site was when the PageRank in the Google toolbar disappeared, seemingly overnight. This happened soon after a new URL was put up on an existing site. We assumed, as is usual, that Google hadn’t been able to associate the new URL with the “old” content. That is, Google was still expecting to see the old URL associated with the content. We advised the client that it would likely take a few weeks to re-associate the new URL with the site.

When sufficient time passed without progress, we had to dig deeper to see what the issues were. As I mentioned above, everything looked ok. Optimization was in place, and there wasn’t any over optimization happening. Internal linking was good, and there was good use of a properly constructed site map. So we had to dig deeper – going beyond on-the-page factors to see if we could figure out what else was causing the problem.

The first thing I looked for…
The first thing I looked for was the existence of a robots.txt file. In many cases, an improperly coded robots.txt file will exclude some, or all, search engine spiders from indexing a site. In this case a robots.txt was not being used, so I ruled this out.
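
For reference, the difference between a robots.txt that shuts every spider out and one that allows full crawling is a single character; a hypothetical example:

```text
# Blocks all compliant spiders from the entire site:
User-agent: *
Disallow: /

# Allows all compliant spiders to crawl everything:
User-agent: *
Disallow:
```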

I then checked to see if there were robots meta tags in the HTML. These tags do the same job as the robots.txt file; that is, they tell the spiders which pages they can and cannot index, but on a page-by-page basis rather than the site-wide basis of robots.txt. Again, an improperly coded robots meta tag can exclude part or all of a site from getting indexed. Again, this was not the case. Although this site does use a meta robots tag, it was coded properly. In fact, the same tag existed on the “old” site and wasn’t an issue then.
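
Again for reference, here is a hypothetical pair of meta robots tags, one that blocks indexing and one that permits it (the permissive form is also the default behaviour when the tag is absent):

```html
<!-- Blocks indexing of the page and following of its links: -->
<meta name="robots" content="noindex, nofollow">

<!-- Explicitly allows both (the default behaviour): -->
<meta name="robots" content="index, follow">
```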

So I then checked the log files to see if the spiders had been visiting the site, and they had been there on a regular basis; as recently as a few days ago, as a matter of fact.

Seeing that everything was coded properly, and that spiders had been visiting the site, piqued my interest. How is it that spiders are able to see the site (as indicated by their visits), yet the site is not showing up in the index and still has a PageRank of 0, months after the change?

Some more digging…
So I did some more digging. I checked Google for the old URL. Upon viewing the cached version of the old URL, a theory began to form.

The cached pages are actually the current content of the new site. In other words, Google was somehow associating the old URL with the new site. So I did some more checking. I did a whois lookup and found that the old URL was still registered. So I decided to ping it, and found that it resolved to a new IP address, yet when I tried to connect to it using my web browser it came up as a 404 (page not found) error.

I pinged the new site; its IP address is different, but it is the IP address that the site had when it used the “old” URL. This still doesn’t explain why the new site has no PageRank or indexed pages while the “old” URL is showing pages from the new site, but it does give me some clues.
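
The same kind of check (what does each hostname resolve to, and what status does it return?) can be scripted. A rough sketch using hypothetical domains, not the client's actual URLs:

```python
# Rough diagnostic sketch: resolve each hostname and fetch its front page.
import socket
import urllib.request
import urllib.error

for host in ("www.old-example.com", "www.new-example.com"):  # hypothetical domains
    ip = socket.gethostbyname(host)                          # what the name resolves to
    try:
        status = urllib.request.urlopen(f"http://{host}/", timeout=10).status
    except urllib.error.HTTPError as err:
        status = err.code                                    # e.g. 404 on the old URL
    print(host, ip, status)
```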

We already know that, in order to save time, most search engines do not perform a DNS query when they visit a site. They tend to try to connect directly to the site via IP. If they don’t reach the site via IP, they then perform a DNS query to find the site’s IP.

In the case of this site, Google hasn’t needed to perform a DNS query as, from their point of view, the “old” site still exists. They can connect via IP to the site and are associating the “new” site to the “old” URL.

This also explains why the “new” site is showing a PageRank of 0 with no pages indexed: Google has also resolved the “new” site to the same IP, which it thinks belongs to the “old” URL. Once it visits the new site and realizes that the new and old sites are identical, it gives preference to the “old” site because it predates the new one.

Confused yet?
Let me put it in other terms. Since the “old” site has been around longer, it has built up a reputation on the web. When the client replaced the URL, they wiped out that reputation. But no one told Google that the old site was gone. Had Google performed a DNS query, it would have found that the old site had in fact moved; but since it found a site with the same content at the same IP, it assumed that this was the site with the reputation.

Along comes a new site with the exact same content and no reputation, and of course the first thing Google assumes is that the site owner is trying to spam the engine, so it penalizes the new site. Hence the lack of indexed pages and the PageRank of 0.

To resolve this issue we will try a variety of things. First will be either a 301 redirect (approved by Google to help spiders understand that a site has moved) or another on-the-page redirect, such as a meta refresh or a hyperlink on the “old” URL. These efforts help reinforce to Google that the “old” site has been replaced by the “new” site.
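
On an Apache server (an assumption; the directive below is standard mod_alias syntax and the domain is hypothetical), the 301 option can be as simple as one line in the old site's .htaccess file:

```apache
# Permanently redirect everything on the old domain to the new one:
Redirect 301 / http://www.new-example.com/
```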

If this doesn’t work, our next step will be to request that the site be removed from the index. This is a last resort, as we would rather the engine figure it out on its own. If we find that Google still can’t work out that there is a new site, we will definitely request the URL removal.

In addition, to try and help speed things along, we need to ensure that all other links, such as ODP directory listings, point to the correct URL and not the old domain. This will reinforce to the search engines that the “old” site no longer exists and that the “new” site is actually a valid site that isn’t spamming the engines.

Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists

The Future Of Search Engine Technology
By now you have probably read numerous articles predicting ‘What will happen in 2004’ or ‘Can MSN take on Google’. While it is always worthwhile to look ahead and consider what may happen this year in the search engine industry, what about the things that we can’t quite yet predict? Instead of looking at what will happen this year, perhaps we should look at what must happen in the search engine space if Google, Yahoo and MSN are truly able to revolutionize search and enhance the user experience.

Overcoming The Lack Of Relevant Search Results
Even today, conducting a search on any of the major search engines can be classified as an ‘enter your query and hope for the best’ experience. Google’s ‘I’m Feeling Lucky’ button, while designed to take you directly to the number one result, could ironically describe its entire search process. Enter your desired search words into any of the search engines and you often end up crossing your fingers, hoping they display the type of results you were looking for. Since the recent ‘Florida’ and ‘Austin’ updates, complaints that Google, in particular, is displaying less relevant results have escalated (although mostly from those who lost important positioning that they had assumed was theirs to keep).

There is, of course, evidence that the search engines are trying to enhance their search results- so that they can better anticipate the intentions of the searcher. Search for ‘pizza Chicago’ at Yahoo, and you’ll see that the top results include names, addresses, telephone numbers and even directions to pizza restaurants in Chicago, a great improvement on previous results. Even when you take everyone’s favorite search term example, ‘windows’, you can see that the search engines are at least trying to determine your intent. While Yahoo and Google still display search results saturated with links discussing Microsoft’s pervasive operating system, enter your search over at Ask Jeeves and the chirpy English butler will ask you if you meant ‘Microsoft Windows’ or ‘Windows made out of glass’.

Future Search Engine Technology
Smaller search engines have also materialized over the past few weeks, each offering to improve the user experience. Grokker offers an interface that groups search results graphically, improving the way search results are segmented and displayed. Eurekster combines the social networking elements used by sites such as Friendster with results that can be filtered based upon what members of your group are searching for. While all of these are interesting and provide a glimpse of the future of search, it will not be the small companies that change the way we search. With Google about to get an influx of cash from its upcoming IPO, Yahoo re-vamping Inktomi and Overture, and Microsoft finally jumping into the search arena, it will be these search engine powerhouses that enhance our search experience and take search engine technology to the next level.

So what is this next level? What technology is it that I speak of, that will revolutionize the way we receive our search engine results? I believe that the search results we receive in just a couple of years from now could make current search engine technology look as archaic and cumbersome as picking up a Yellow Pages book is today. However, in order to achieve this new search nirvana we, as consumers, must quell our fears and trepidations surrounding the protection of our privacy. In order for the search engines to develop technology that will be intuitive and anticipate our every need, we must first relinquish at least some of the privacy that we currently hold so dear. Let’s take a look at some of the ways that search technology could improve and you’ll soon get the idea why it will require us to cooperate with the search engine providers.

‘Windows’ or ‘windows’?
If you desire to be able to enter a term as ambiguous as ‘windows’ and expect to see relevant results, you’ll first need to give up some personal information to the search engines. Google, Yahoo, MSN and Ask already have the means to collect an astonishing amount of information from us, by our use of their toolbars. Don’t panic, they currently allow you to disable this information gathering, and even if you do allow it, it is collected anonymously. However, with the technology already in place, why not unleash its full potential?

Let’s say I let Google track my online activities, allowing it to monitor the web sites I view and keep a log of all of the search queries I enter. This type of information could greatly improve the relevancy of the results displayed to me. For example, two years from now, I could search for ‘home improvement’ on Google. I then find the listing for Lowes.com and visit the site. While I am at their web site, I look at a number of different pages, but I spend a lot of time in the ‘house windows’ section, exploring the different styles and prices. Why not let Google capture all of that useful information? Then, when I go back to Google the following day and search for ‘windows’, it would know that glass windows are more likely to be the type of product I am seeking. Google would simply have remembered my previous searches, read the HTML and meta data on the Lowes.com pages, and used this to identify the intent of my new search for windows.
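
As a toy illustration of the idea (everything here is invented: the interpretations, their keyword lists and the browsing history), biasing an ambiguous query toward whichever interpretation overlaps most with recently viewed pages could be as simple as:

```python
# Toy query disambiguation: pick the interpretation sharing the most terms
# with the user's recent browsing history.
def disambiguate(candidates, history_terms):
    """candidates maps an interpretation name to a set of descriptive terms."""
    history = set(history_terms)
    return max(candidates, key=lambda name: len(candidates[name] & history))

candidates = {
    "microsoft windows": {"windows", "operating", "system", "software", "xp"},
    "house windows": {"windows", "glass", "home", "improvement", "frames"},
}
recent_history = ["home", "improvement", "glass", "frames", "prices"]
print(disambiguate(candidates, recent_history))  # -> house windows
```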

While I would have to give up some of my privacy, wouldn’t it be worth it if I could save myself time and energy by having search engine results more relevant to my desire?

You’ve Got Search In Your Mail
Another area with great potential for improving search engine results will likely be developed by Google. You may have heard the rumors that Google is getting set to launch an email client that many expect will be a free service similar to Yahoo Mail or Hotmail. Currently, Yahoo does an adequate job of making search available to all of its email customers. Each page within Yahoo Mail has a search box that makes it easy for you to conduct a search that might be sparked by an email you receive. But why not take it one step further?

Google has the technology to really take advantage of search within email. Why else would it even consider entering this arena? Imagine that, in order to use a free Google email account, you allow Google to provide advertisements and track your email activities. Google could change the way that search results and ads are displayed to free email users. For example, let’s say you receive an email from your brother, the content of which, among other things, gloats about the brand new P4 desktop computer that he just purchased from Dell. As part of the interface you use to read that email, Google magically displays paid search advertising for desktop computers, including a link that will take you directly to the appropriate page on Dell.com. This information could be quite beneficial to you, as you may be interested in seeing how you too can be a proud owner of a P4 computer. It is also fantastic targeted advertising for Dell, as they know that if you click on the listing, they are halfway to converting you into another satisfied customer.

This idea is much closer to reality than you may think. Google already has the advertisers, with its AdWords service boasting 150,000 users eager to spend their advertising dollars. It also has the technology to determine which results to show you within your email interface. Google’s AdSense can provide the contextual ad technology that would scan an email’s content to determine which ads are the most relevant to display. With this technology in place, a simple provision within any Google Email Terms & Conditions would give the world’s largest search engine the necessary permission to serve up relevant ads to all users of its free email service.
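
In spirit, contextual matching of this kind can be sketched in a few lines. The ads, keywords and message below are invented for illustration and have nothing to do with AdSense's real (and far more sophisticated) matching:

```python
# Toy contextual ad matcher: score each ad by keyword overlap with the message.
def pick_ads(message, ads, top_n=2):
    words = set(message.lower().split())
    ranked = sorted(ads, key=lambda ad: len(set(ad["keywords"]) & words), reverse=True)
    return [ad["advertiser"] for ad in ranked[:top_n]]

ads = [
    {"advertiser": "computer retailer", "keywords": ["desktop", "computer", "p4"]},
    {"advertiser": "travel agency", "keywords": ["flights", "hotels", "vacation"]},
]
email_text = "my brand new P4 desktop computer arrived from the store today"
print(pick_ads(email_text, ads))  # -> ['computer retailer', 'travel agency']
```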

We could be offered the option of paying a monthly premium in order to not have ads shown when we read our email, but if they are relevant to the content of a received message, why would we want to block them?

From Desktop to Internet
Another development in search engine technology that I can see happening would come from the development of Microsoft’s new Longhorn operating system. While I must confess that I am not au fait with the intricate workings of this project, I do know that it will likely use the search technology that MSN is developing.

Imagine an operating system that monitors all of your activities — with your permission, of course. Every file, every image, Word document, MP3, even e-books could be monitored by your computer as it endeavors to anticipate your every need. Not only could an integrated search engine allow you to search files located on your hard drive, but it could also use the information it has collected from these files to make your online search experience even more enjoyable.

It is quite possible that Longhorn or a future OS (Microsoft, Linux or Mac) could become intelligent enough to know that, after you listen to one of your favorite songs by the 80s rock band Heart, your subsequent online search for ‘heart’ is more likely to stem from a desire to view the band’s fan site than from a pressing need to visit the web site of the American Heart Association. Your all-encompassing search engine would perhaps be a realization of the Ask Jeeves friendly butler, ready to anticipate your every need.

To Search Where No-one Has Searched Before
When you think about the future of search, it is easy to get excited. Millions (if not billions) of dollars are going to be filling the coffers of the largest search engine providers. They have some of the smartest people in the world working to develop the next great ‘thing’, which will enhance the user experience and serve up better, more relevant search results. Search engine technology is still most definitely in its infancy; how it grows will very much depend upon how much information and privacy the average search engine user is willing to give up. Personally, if I can view search results that more closely match my desired results, I’m willing to give up the name of my favorite pet, my place of birth and my mother’s maiden name!

Author Bio:
Andy Beal is Vice President of Search Marketing for WebSourced, Inc and KeywordRanking.com, global leaders in professional search engine marketing. Highly respected as a source of search engine marketing advice, Andy has had articles published around the world and is a repeat speaker at Danny Sullivan’s Search Engine Strategies conferences. Clients include Real.com, Alaska Air, Peopleclick, Monica Lewinsky and NBC. You can reach Andy at andy@keywordranking.com and view his daily SEO blog at www.searchenginelowdown.com.

Google & Inktomi Optimization
The search engine environment continues to evolve rapidly, easily outpacing the ability of consumers and SEO practitioners to adapt to the new landscape. With the ascension of Inktomi to the level of importance that until recently was held solely by Google, SEO practitioners need to rethink several strategies and tactics, and perhaps even the ethics of technique. Assuming this debate will unfold over the coming months, how does an “ethical SEO firm” optimize websites for two markedly different search engines without falling back on old-fashioned spammy tactics such as leader pages or portal sites? Recently, another SEO unrelated to StepForth told me that he was starting to re-optimize his websites to meet what he thought were Inktomi’s standards, as a way of beating his competition to what looks to be the new main driver. That shouldn’t be necessary if you are careful and follow all the “best practices” developed over the years.

The answer to our puzzle is less than obvious, but it lies in the typical behaviors of the two search tools. While there are a number of similarities between the two engines, most notably in the behaviors of their spiders, there are also significant differences in the way each engine treats websites. For the most part, Google and Inktomi place the greatest weight on radically different site elements when determining eventual site placement. For Google, strong and relevant link popularity is still one of the most important factors in achieving strong placements. For Inktomi, titles, meta tags and text are the most important factors in getting good rankings. Both engines consider the number and arrangement of keywords, incoming links, and the anchor text used in links (though Google puts far more weight on anchor text than Inktomi tends to). That seems to be where the similarities end, and the point where SEO tactics need revision. Once Inktomi is adopted as Yahoo’s main listing provider, Google and Inktomi will drive relatively similar levels of search engine traffic. Each will be as important as the other, with the caveat that Inktomi powers two of the big three while Google will only power itself.

2004 – The Year of the Spider-Monkey
The first important factor to think about is how each spider works.

Entry to Inktomi Does Not Mean Full-Indexing
Getting your site spidered by Inktomi’s bot “Slurp” is essential. Like “Google-bot”, “Slurp” will follow every link it comes across, reading and recording all information. A major difference between Google and Inktomi is that, when Google spiders a new site, there is a good chance of getting placements for an internal page without paying for that specific page to appear in the index. As far as we can tell, that inexpensive rule of thumb does not apply to Inktomi. While it is entirely possible to get entire sites indexed by Inktomi, we have yet to determine whether Inktomi will allow all pages within a site to achieve placements without paying for those pages to appear in the search engine results pages (SERPs). Remember, Inktomi is a paid-inclusion service which charges webmasters an admission fee based on the number of pages in a site they wish to have spidered. From the information we have gathered, Slurp will follow each link in a site and, if provided a clear path, will spider every page in the site, but pages that are paid for during the submission will be spidered far more frequently and will appear in the indexes months before non-paid pages. We noted this when examining how many pages Inktomi lists from newer clients versus how many from older clients. We have noticed that the older the site, the more pages appear in Inktomi’s database and on the SERPs of search engines using the Inktomi database. (This is assuming the webmaster only paid for inclusion of their INDEX page.) Based on Inktomi’s pricing, an average-sized site of 50 pages could cost up to $1289 per year to have each page added to the paid-inclusion database, so it is safe to assume that most small-business webmasters won’t want to pay that much.
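
For a rough sense of where a figure like that comes from, here is the arithmetic with placeholder fees; the per-URL prices below are assumptions chosen to reproduce the $1289 figure, not confirmed Inktomi rates.

```python
# Back-of-the-envelope paid-inclusion cost, with assumed per-URL fees.
first_url_fee = 39       # assumed annual fee for the first URL
additional_url_fee = 25  # assumed annual fee for each additional URL
internal_pages = 50      # pages beyond the index page

annual_cost = first_url_fee + additional_url_fee * internal_pages
print(f"${annual_cost} per year")  # -> $1289
```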

Google’s Gonna Get You
Google-bot is like the Borg in Star Trek. If you exist on the web and have a link coming to your site from another site in Google’s index, Google-bot will find you and assimilate all your information. As the best known and most prolific spider on the web, Google-bot and its cousin Fresh-bot visit sites extremely frequently. This means that most websites with effective links will get into Google’s database without needing to manually submit the site. As Google currently does not have a paid-inclusion model, every page in a site can be expected to appear somewhere on Google produced SERPs. By providing a way of finding each page in the site (effective internal links), website designers should see their sites appearing in Google’s database within two months of publishing.

We Now Serve Two Masters; Google and Inktomi
OK, that said, how do we optimize for both without risking placements at one over the other? The basic answer is to give each of them what it wants. For almost a year, much of the SEO industry focused on linking strategies in order to please Google’s PageRank. Such heavy reliance on linking is likely one of the reasons Google re-ordered its algorithm in November. Relevant incoming links are still extremely important but can no longer be considered the “clincher” strategy for our clients. Getting back to the basics of site optimization and remembering the lessons learned over the past 12 months should produce Top 10 placements. SEOs and webmasters should spend a lot of time thinking about titles, tags and text as well as thinking about linking strategies (both internal and external). Keyword arrangement and densities are back on the table and need to be examined by SEOs and their clients as the new backbone of effective site optimization. While the addition of a text-based sitemap has always been considered an SEO best practice, it should now be considered an essential practice. The same goes for unique titles and tags on each page of a site. Another essential practice SEOs will have to start harping on is to only work with sites that have unique, original content. I am willing to bet that within 12 months, Inktomi introduces a rule against duplicate content as a means of controlling both the SEO industry and the affiliate marketing industry. Sites with duplicate content are either mirrors, portals or affiliates, none of which should be necessary for the hard-working SEO. While there are exceptional circumstances where duplicate content is needed, more often than not dupe content is a waste of bandwidth and will impede an SEO campaign more than it will help.
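
Since keyword densities are "back on the table", a quick way to sanity-check a page is to count phrase occurrences against total word count. A minimal sketch (the sample text is invented, and any target density range would be an assumption, not a published rule):

```python
# Minimal keyword-density check for a page's visible text.
def keyword_density(text, phrase):
    words = text.lower().split()
    target = phrase.lower().split()
    hits = sum(
        1 for i in range(len(words) - len(target) + 1)
        if words[i:i + len(target)] == target
    )
    return 100.0 * hits * len(target) / max(len(words), 1)

page_text = ("Durable red widgets built to last. Our red widgets ship anywhere "
             "in North America, and every widget carries a five-year warranty.")
print(f"{keyword_density(page_text, 'red widgets'):.1f}% of the page text")
```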

One Last Tip
The last tip for this article is: don’t be afraid to pass higher costs on to clients, because if a client wants those placements soon, paid inclusion of internal pages will be expected. When one really examines the costs of paid inclusion, it is not terribly different from other advertising costs, with one major exception: most paid advertising is regionally based (or is prohibitively expensive for smaller businesses), while search engine advertising is, by nature, international exposure, and that is worth paying for.

Author Bio:
Jim Hedger is the SEO Manager of StepForth Search Engine Placement Inc. Based in Victoria, BC, Canada, StepForth is the result of the consolidation of BraveArt Website Management, Promotion Experts, and Phoenix Creative Works, and has provided professional search engine placement and management services since 1997. http://www.stepforth.com/ Tel – 250-385-1190 Toll Free – 877-385-5526 Fax – 250-385-1198

Introduction
Here’s something that is fast to read and does the job: the 10 do’s and don’ts of SEO. Five techniques you should always use to push your site to the top of the search engine results pages (SERPs) and keep it there, and five things you should always avoid doing, to protect your site from a possible penalty or from being banned altogether.

List of the 5 Do’s
Do Number One:

Take all the time it takes to do careful research on all your keywords and key phrases for your site, covering the products or services you are trying to sell. Proper keyword research can only be done using Wordtracker, the industry standard when it comes to professional keyword research. Trying to optimize a site without knowing your real keywords is like driving a car at night with no headlights! Some will tell you they use Overture’s free suggestion tool. Although that tool can help you to a limited degree, you should always use Wordtracker for the best results.

Do Number Two:
Make sure you write a short, descriptive title tag stating what each page of your site is all about, and make sure they are all different. Search engines use the information contained in the title tag, compare it to the text on that page, and rank the page accordingly. The short description in your title tags will also help your users. The idea here is to keep it as short and descriptive as possible. If you are creating a new page about ‘durable red widgets’, then call that page ‘durable red widgets’. Avoid the temptation of creating title tags longer than 30 characters, since they might have a dilution effect on your rankings in certain search engines.
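
Following that advice, hypothetical titles for two pages of the ‘durable red widgets’ site might look like this, each one short, descriptive, and different:

```html
<title>Durable Red Widgets</title>        <!-- product page -->
<title>Red Widget Ordering Info</title>   <!-- ordering page -->
```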

Do Number Three:
Write the main text on your page using the same keywords contained in your title tag. If you are working on a page with a title of ‘New houses in Baltimore’, then be sure that those important keywords are repeated at least two or three times in the main body of your text, without sounding repetitive. A well-designed and carefully written page will ‘read right’ and will not sound like you are repeating yourself. Search engines will rank your page higher if they see a keyword repeated a few times on a page, and this will help them ‘build a theme’ throughout your site.

Do Number Four:
Make a complete sitemap of your site, which will help both your users and the search engines at the same time. Having a well-designed sitemap will ensure that each page of your site gets properly indexed by Google and the other search engines. It is important to call that file sitemap.html and not site-map.html or other variations. Additionally, make sure that your sitemap.html file is directly accessible from your homepage and that it uses link text. Link text is always a lot better than a picture or graphic, since search engines won’t be able to read them.
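
For what it's worth, the "link text" recommendation just means a plain HTML text link from the homepage, something like this hypothetical snippet rather than an image button:

```html
<!-- Plain text link from the homepage to the sitemap: -->
<a href="sitemap.html">Site Map</a>
```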

Do Number Five:
To increase your link popularity, participate in a link exchange program. Even in the aftermath of ‘Update Florida’, link popularity in Google is more important than ever. All else being equal, the higher your PageRank, the higher your rankings. Increasing the number of links that point to your site will help you in the results pages. You should only link to sites that are in the same field as your site, and stay away from bad ‘neighbourhoods’ such as so-called link farms or ‘free-for-all’ (FFA) pages.

List of the Five Don’ts
Don’t Number One:

Don’t ever use cloaking mechanisms or software that need to know the IP address of a search engine spider or anything similar. Cloaking is based on the idea of serving a unique, optimized page for the search engines, while serving a completely different page to the ‘real’ users. Today, most major search engines prohibit the use of such techniques and you risk your site being penalized or banned altogether. Always play it safe and the search engines will treat you right.

Don’t Number Two:
Submit your website once to the search engines and then wait for at least 6 weeks! Don’t use software that automatically submits your sites on a weekly or monthly basis, since it might penalize you in the long run. Today’s modern search engines use automated crawlers or spiders to regularly index your site, so you don’t need to submit more than once. In the case of DMOZ (the Open Directory Project, or ODP), you should always wait 8 to 12 weeks, since DMOZ relies only on volunteers to review and list your site. If your site still isn’t listed after 12 weeks, write them a friendly email explaining the problem, and that should do it in most cases.

Don’t Number Three:
Don’t entrust your site to people who will submit it to ‘thousands of engines’. There aren’t that many search engines in the first place. There are only a handful of serious search properties you should submit to, and they are used by 99% of the people looking for information. Don’t waste your time or your money; work only with the serious search engines everybody uses.

Don’t Number Four:
Don’t develop your site using Flash technology or similar techniques that the search engines cannot read. As far as your rankings in the search engines go, the best way to develop a site is using standard technology, such as HTML. Text written in HTML is proven technology that all search engines have recognized and supported since the beginning of the Internet. Using the right technology will always help your site attain a good position in the SERPs.

Don’t Number Five:
Don’t deal with so-called SEO experts who promise you number one or first-page rankings in some engines. There is no such thing as guaranteed number one placement. Ask for referrals, and don’t be afraid to ask exactly what techniques your would-be SEO firm uses to achieve good positioning for your site. Additionally, ask them to put everything in writing before you sign on the dotted line and before you hand over any of your hard-earned cash.

Author:
Serge Thibodeau of Rank For Sales

Google Update Florida Solutions and Fixes
Over a month and a half ago I wrote an article about Google’s latest update dubbed Update Florida. I told our subscribers about how they might see their rankings drop in Google for prominent keywords for no apparent reason. Today I’m going to tell you what you need to do in order to fix your site and help restore your rankings.

First, let’s recap the main effects of the Google Update Florida as well as my original predictions of the root causes. Then I’ll explain in more detail the root causes as I see them.

Here’s the skinny on the latest Google Dance (from: 11/18/2003)
• Called “Update Florida” because, like the last Presidential election, it was full of controversy and left tons of upset people.

• Many people who had worked hard to build up solid and reputable backlinks appear to have been punished for no obvious reason, even though their inbound links contain their main keyword phrase and the linking pages have content related to their site in the page title.

• Rankings have dropped for main targeted keywords but not for more specific keyword phrases. The drops are not limited to search engine optimization firms; they affect web sites of every kind and variety.

• Results appear to have been repopulated from data that looks to be about 6 months old.

Original Google Update Florida Predictions and Explanations
• Probable: Google is either testing an updated algorithm or they are not factoring in the backlinks created in the past 6 months.

• Not Probable: SEO techniques that could be considered SPAM to Google are being filtered out in an effort to provide more relevant results. I’m doubtful on this one since the results are from about 6 months ago and many of them are not very relevant.

• Probable: Google is doing yet another deep crawl, which we haven’t seen in about 6 months, and they want to build a fresher, more relevant search base from sites currently out there that the Fresh Crawl GoogleBots may not be picking up.

• Not Probable: Many sites that have dropped off will never come back to the top because they have been penalized. As far as we know, Google doesn’t penalize sites; it simply lists them or delists them. Just like God, there is no gray area with Google; it’s either black or white. If you’re considered a black hat, then your site will not show up anywhere in Google’s index, not even for the most descriptive keyword searches.

And the Verdict is In!

Let’s start from the beginning with the first prediction and move down from there.

Prediction no.1
Google is either testing an updated algorithm or they are not factoring in the backlinks created in the past 6 months.

Verdict: I was right and wrong on this one.

I was wrong about Google not factoring in backlinks created in the past 6 months. I was right about them testing new filters. Google has not dumped its old algorithm by any means. Instead, it has created new filters to penalize sites whose PageRank was artificially inflated due to less-than-reputable linking practices. What are less-than-reputable linking practices, you ask? These include…

• The exact same phrase, or a high percentage of the exact same phrase, appearing in the text of backlinks (text links) and in the alt text (image links). This catches folks caught up in link farms, because they don’t rotate the text used to link to their sites.
Solution: Create a ‘link to us’ page that offers at least 5 different links using different keyword phrases (see the sketch after this list). This way you spread the concentration of anchor text across the general theme of your site.

• Links from sites whose subject matter differs from that of the site being linked to, i.e. when there is no mention of your subject matter in the body, title, keywords or description of the page linking to your site.

• Links from sites that don’t show up in search results for the keywords used in the links. It is widely accepted that Google has implemented a LocalRank technology that determines whether or not your backlinks come from sites that themselves rank for the keywords used in the backlink. If so, you’re fine; if not, you’re going to have problems getting to the top even if you have a high PageRank.
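
Here is a hypothetical ‘link to us’ page along the lines suggested above, rotating the anchor text across the site’s general theme instead of repeating one phrase (the domain and phrases are invented):

```html
<!-- Suggested links for partners, each with different anchor text -->
<a href="http://www.example.com/">Example Widget Company</a>
<a href="http://www.example.com/red/">durable red widgets</a>
<a href="http://www.example.com/blue/">blue widget catalog</a>
<a href="http://www.example.com/guide/">widget buying guide</a>
<a href="http://www.example.com/shipping/">widget shipping information</a>
```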

Prediction no.2
Not Probable: SEO techniques that could be considered SPAM to Google are being filtered out in an effort to provide more relevant results. I’m doubtful on this one since the results are from about 6 months ago and many of them are not very relevant.

Verdict: I was right on this one.

Google is not penalizing sites for being overly optimized, or for having keywords in the title, description, and keywords of the page, in the heading tags, or anywhere else. The penalty only kicks in when backlinks are overly optimized and don’t come from other industry resource sites. The results appear to be old only because the sites that ranked well earlier weren’t part of the linking schemes, and they returned to the top for that very reason.

Prediction no.3
Probable: Google is doing yet another deep crawl (we haven’t seen one in about 6 months) and wants to build a fresher, more relevant search base from sites currently out there that the Fresh Crawl GoogleBots may not be picking up.

Verdict: I was right again.

Yes, Google did another deep crawl, refreshed the backlinks of sites, and updated their PageRank about 2 weeks later. But this deep crawl is done on an almost continual basis now and doesn’t affect the rankings the way many thought it did. Rankings were affected by the implementation of new filters, and that’s the only reason why.

Prediction no.4
Not Probable: Many sites that have dropped off will never come back to the top because they have been penalized. As far as we know, Google doesn’t penalize sites; it simply lists them or delists them. Like God, Google has no gray area; it’s either black or white. If you’re considered a black hat, your site will not show up anywhere in Google’s index, not even for the most descriptive keyword searches.

Verdict: I was wrong.

Well, I guess nobody’s perfect. Google is not necessarily “penalizing” sites, but it is “filtering” them out based on whether certain filters are triggered by a site for a particular keyword search. While a site may show up for a very detailed search or an off-topic search, it may not rank well for a highly competitive (typically commercial) term if it has tripped a filter for that term.

To put my predictions and conclusions in context, I need to explain a few more things that have bubbled up from Google Update Florida and how they affect the search results Google delivers today.

Not only did Google implement filters, but it also rolled out new features, which makes it seem like the old algorithm was thrown out and Google started over from scratch. While many will tell you Google isn’t as relevant as it once was, I would contend that it’s changing to remain the most relevant search engine out there. Here’s a rundown of the new features.

• Google Implemented Stemming Technologies: Stemming means taking a root word and matching all of the variations the engine derives from it. Now a search for “game sites” may return sites optimized for “gaming sites,” “gamer sites,” or “gamed sites.” This increases the number of results delivered for highly targeted keywords and increases the dependency on solid, natural optimization.

You can disable stemming by adding the “+” sign in front of each word you want to disable stemming for.

• Google Implemented Plural Searches: This means that a search for “knitting needle” and a search for “knitting needles” will return the same results, again increasing the competition since more results are returned for every keyword search.

• Implementation of LocalRank: LocalRank is a technology that looks at the first (x) search results. The number can be anything the search engine specifies but is typically around 100. After looking at those results, it determines whether any of those sites have linked to you and then ranks sites based on how many “popular” sites for a specific search term have linked to you. This is why it’s critical to get help with your link popularity campaign from someone who understands the intricacies of linking and can give advice that won’t hurt you in the long run; short-term link popularity schemes built on unrelated sites will do nothing to help your cause. For more information on LocalRank, read this forum thread at WebmasterWorld, and see the sketch after this list.

• Internal Links Discounted: Links from within your site to particular pages of your site do not count as much as they once did. While this doesn’t mean you need to change your site’s link structure, it is worth noting.

• Artificial PageRank Deflated: Sites that have more than one link from a particular site are seeing the law of diminishing returns. 100 links from a single site are no longer weighted as 100 individual links, and that just makes sense: if site A has 100 links from a single site and site B has 20 links from 20 individual sites, I can guarantee you that the 20 links to site B will count for more than the 100 links to site A.
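Google has never published how LocalRank works or exactly how repeat links from one domain are dampened, so treat the following as a toy illustration only. This Python sketch combines the last two ideas from the list above: only votes from sites already in the result pool count, and a second, tenth, or hundredth link from the same domain adds less and less. The log-based weight, the domain names, and the numbers are all invented for the example.

import math
from collections import defaultdict

def local_rerank(top_results, links):
    """Toy re-ranking in the spirit of the LocalRank idea described above.

    top_results -- domains returned by the initial query (the "first (x)
                   results" mentioned in the article)
    links       -- (source_domain, target_domain) pairs; only votes from
                   domains that are themselves in the result pool count
    """
    pool = set(top_results)
    per_pair = defaultdict(int)                 # collapse repeat links per domain pair
    for src, dst in links:
        if src in pool and dst in pool and src != dst:
            per_pair[(src, dst)] += 1

    scores = defaultdict(float)
    for (src, dst), n in per_pair.items():
        # Illustrative diminishing-returns weight: the first link from a
        # domain counts fully, the hundredth adds almost nothing extra.
        scores[dst] += 1 + math.log(n)
    return sorted(top_results, key=lambda d: -scores[d])

# 100 links from a single pool site vs. 20 links from 20 different pool sites.
voters = [f"voter{i:02d}.com" for i in range(20)]
pool = ["site-a.com", "site-b.com", "site-c.com"] + voters
links = [("site-b.com", "site-a.com")] * 100 + [(v, "site-c.com") for v in voters]
print(local_rerank(pool, links)[:3])   # site-c.com now outranks site-a.com

With these made-up numbers, twenty single links from twenty pool sites outweigh a hundred links from one site, which is exactly the behavior the last bullet describes.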

In Conclusion
Well, I’ll conclude my ramblings with a recap. Google has made many changes and has implemented several filters, as well as new algorithm features, to ensure it has the most reliable search results on the internet. To climb your way back to the top you need to understand LocalRank, the PageRank changes, proper link reputation and link popularity, and the anti-spam measures you need to take. It all boils down to common sense: make your site user friendly and easy to navigate, encourage others to link to you by giving them some form of incentive, don’t use the exact same phrase in every backlink, use good titles that explain what each page is about, and keep it simple. By following these rules you can weather any search engine algorithm change and stay at the top with a lot less stress.

Author Bio:
Jason Dowdell is the founder and CEO of http://www.GlobalPromoter.com, a search engine optimization and marketing firm specializing in educating and empowering customer websites. Jason is also the founder of TurboPromoter.com, a web-based SEO/SEM project management suite comprised of professional SEO tools, in-depth tutorials, and an integrated help system.

Pay Per Click Search Engine Advertising
So you’ve decided to give pay-per-click search engine advertising a try? That’s a good move, because PPC advertising is one of the most affordable marketing options available to small businesses.

But like all advertising, you need a good strategy to get your money’s worth. I find that too many people running their first PPC campaign make mistakes that can quickly turn expensive.

In this article I’ll offer some basic advice about bidding and keyword selection to help you run a smart PPC campaign.

Just What Can You Pay Per Click?
The most important thing to know before starting your PPC campaign is how much you can afford to bid for a keyword. High-traffic keywords on Overture and Google, the leading PPC providers, can cost $5.00 per click for a top ranking. Can you afford that?

Consider this: the typical e-commerce site converts about 2% of its visitors. That means you need to bring 50 visitors to your site before you make a sale. At $5.00 per click, you’ll spend $250 to generate one sale. Ouch!
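If you want to run that math for your own site, it is simple enough to script. Here is a small Python sketch; the $100 profit-per-sale figure is an assumption added purely for illustration.

def cost_per_sale(cpc, conversion_rate):
    """How many visitors one sale takes, and what those clicks cost."""
    visitors_per_sale = 1 / conversion_rate
    return visitors_per_sale, visitors_per_sale * cpc

def max_affordable_cpc(profit_per_sale, conversion_rate):
    """Break-even bid: the most you can pay per click without losing money."""
    return profit_per_sale * conversion_rate

visitors, cost = cost_per_sale(cpc=5.00, conversion_rate=0.02)
print(f"{visitors:.0f} visitors per sale, costing ${cost:.2f}")                          # 50 visitors, $250.00
print(f"Break-even bid at $100 profit per sale: ${max_affordable_cpc(100, 0.02):.2f}")   # $2.00

At a 2% conversion rate and $100 of profit per sale, anything over $2.00 per click loses money, which is why you need to know your own conversion rate and margin before you place a single bid.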

Keep in mind that you usually want one of the top 3 listings for a keyword. These are the listings distributed to most of the PPC engine’s partner sites. For example, a #3 ranking on Overture will place your listing on Yahoo, MSN and Alta Vista. A #7 listing won’t appear on any of these search engines.

So you’re caught in a catch-22: you want a high PPC ranking to get traffic, but the top rankings for popular words are too expensive.

Cast Your Net Broadly
The solution is to cast your net broadly, targeting a large number of less popular keywords. These words are usually less expensive and, taken as a group, can give you a considerable volume of traffic.

For example, suppose you run a ski resort. The keyword ‘ski vacation’ currently receives over 60,000 searches per month. That’s great, but it costs $5.01 per click for the top ranking. Instead of competing head-to-head for that keyword, you would be better off choosing ‘ski trip’ (4,771 monthly searches at $0.57 per click for the top spot) and ‘ski lodge’ (4,244 monthly searches at $0.55 per click for the top spot).

By targeting a number of these less popular keywords, we can pick up a meaningful share of that traffic at a fraction of the cost per click.

Note that this is the opposite of the strategy you typically use in a search engine optimization campaign. In an SEO campaign, you focus on perhaps a half dozen high-traffic words, because it takes a lot of hard work to earn a top listing.

In contrast, it’s relatively easy to create a new PPC listing. Since you don’t pay unless someone clicks on your listing, there’s no added cost for doing this, so targeting a large number of keywords makes sense.

The phrase ‘ski chalet’ only receives 930 searches per month. So what? At $0.52 per click, it’s worth adding to your PPC campaign.
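To make the cost difference concrete, here is a quick calculation in Python using the figures quoted above (Overture-style search counts and top bids as reported at the time of writing); the "cost per 1,000 clicks" framing is mine, not a number from any PPC provider.

# Search counts and top-spot bids quoted earlier in this article.
keywords = {
    "ski vacation": {"searches": 60000, "top_bid": 5.01},
    "ski trip":     {"searches": 4771,  "top_bid": 0.57},
    "ski lodge":    {"searches": 4244,  "top_bid": 0.55},
    "ski chalet":   {"searches": 930,   "top_bid": 0.52},
}

def portfolio_summary(terms):
    """Total monthly searches and the volume-weighted cost per click for a group of terms."""
    searches = sum(keywords[t]["searches"] for t in terms)
    spend = sum(keywords[t]["searches"] * keywords[t]["top_bid"] for t in terms)
    return searches, spend / searches

for label, terms in [("head term", ["ski vacation"]),
                     ("broad portfolio", ["ski trip", "ski lodge", "ski chalet"])]:
    searches, avg_cpc = portfolio_summary(terms)
    print(f"{label}: {searches:,} searches/month at about ${avg_cpc:.2f} per click "
          f"(${avg_cpc * 1000:,.2f} per 1,000 clicks)")

The broad portfolio buys clicks for roughly a ninth of the head term's price, which is the whole argument for casting your net broadly.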

It’s common for PPC advertisers to target dozens of keywords. I’ve managed PPC campaigns for clients using over 1,000 words.

Smart PPC Management
The downside of this approach is that it can be hard to manage such a large number of keywords. You’ll want to track your listings, making sure your rankings haven’t dropped. Plus, you’ll want to know which keywords are sending you traffic and converting visitors into customers.

Many businesses also use PPC bid management software like BidRank or GoToast to manage their listings. These packages track your listings and can adjust your bid if you drop in the rankings.

Many companies also outsource the management of their PPC campaigns. Most SEOs now offer PPC management services. These options cost money, but they usually pay for themselves by running your campaigns more efficiently.

Keep in mind that you don’t have to use a software package or a consultant to start your PPC campaign. But you do need to know what sort of cost per click you can afford. If you decide that $2.00 per click is your maximum bid, then stick with it. Don’t get into an emotional bidding war if you lose a top ranking. It’s much smarter to look for new and cheaper keywords. Cast your net broadly and you’ll save money.
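Whether you use a tool or a spreadsheet, the bid rule itself is simple: chase position only while the price stays under your ceiling. Here is a generic Python sketch of that rule; it is not how BidRank or GoToast actually work, and the ranks, bids, and $0.05 increment are invented for the example.

def next_bid(current_rank, target_rank, competitor_bid, max_bid, increment=0.05):
    """Decide the next bid for one keyword without breaking the ceiling."""
    if current_rank <= target_rank:
        return None, "holding position"
    proposed = round(competitor_bid + increment, 2)
    if proposed <= max_bid:
        return proposed, f"raising bid to ${proposed:.2f}"
    return None, "ceiling reached; look for cheaper keywords instead"

print(next_bid(current_rank=5, target_rank=3, competitor_bid=1.80, max_bid=2.00))
# (1.85, 'raising bid to $1.85')
print(next_bid(current_rank=5, target_rank=3, competitor_bid=2.10, max_bid=2.00))
# (None, 'ceiling reached; look for cheaper keywords instead')

The point is the last branch: when the competition prices a keyword past your break-even number, the software (or you) should walk away rather than keep bidding.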

Author Bio:
Christine Churchill is President of KeyRelevance.com, a full-service search engine marketing firm. She is also on the Board of Directors of the Search Engine Marketing Professional Organization (SEMPO) and serves as co-chair of the SEMPO Technical Committee.