
Wording Up Your Website
Are you losing business because of your website? More and more customers are logging on to the Web to decide where to spend their money because it is quick and convenient, and they can jump from site to site instead of walking from store to store.

Web-savvy customers don’t need to be patient. Studies have shown that you need to engage a potential customer very quickly by giving them easy, fast access to the information they need; otherwise they will simply move on to the next site.

Appealing design and speedy functionality are important but they don’t ensure that your site is well structured (intuitive) or well written (clear).

Write First
The real message on most websites is in the writing, so it makes sense that the writing should determine the structure. Unfortunately, this is not usually the case. Most businesses choose the structure and design of their site first and then try to fit the writing around that structure. This flies in the face of common sense. When you speak to someone, you structure your speech around your message; you don’t decide on a structure and then change the message to suit it. So you need to plan what you want to say before you create the site. Maybe even write the whole thing first and then use the message to determine the structure.

When deciding what to write, think about what your customer wants to know rather than what you want to say. It’s a subtle difference, but it is the key to engaging a potential customer.

Most customers will want to know the basics:
– What do you do?
– What benefit can you offer them?
– Why should they choose your service or product?
– What does it cost?
– How can they contact you?
– Where are you located?

Brevity & Clarity
Your website has to communicate a lot of information and, to make matters worse, you have limited space in which to do it.

Ideally, your customer won’t have to scroll on any page (all your information will fit in a single window) and that single view will need to contain more than just words. The design and navigation elements take up about a third of a window, and you should leave a bit of room for white space (you don’t want to overwhelm the customer). As a rule of thumb you should expect to have about half the window free for text.

How are you going to fit all your information into such a confined space? This is where writing skills come in – choose your words very carefully.

A website can be an extremely powerful piece of marketing collateral. You can reach millions of potential customers for as little as a few hundred dollars. Unfortunately, your competitors are all doing the same thing – it’s a level playing field, but there are a lot of players.

It is important that your message is structured and well thought out, otherwise your site will be a mess and no one will bother to read about your business. If your message is clear, your site will be simple and easy to use. It’s all in the words.

Author Bio:
Glenn Murray heads copywriting studio Divine Write. He can be contacted on Sydney +612 4334 6222 or at glenn@divinewrite.com. Visit www.divinewrite.com for further details.

10 Tips for Tech-Writers
Tech-writing is a tricky business. It’s not a very high profile industry, so there’s not much support around. Follow these 10 quick tips, and you’ll be well on your way to a satisfying tech-writing career.

1) Follow a sensible career path:
STEP 1: Start in a team
STEP 2: Stay only just long enough
STEP 3: Manage yourself
STEP 4: Manage a team
STEP 5: Go contracting (depends on the market)

2) Knowledge is your lifeblood – learn the politics of your company. Know who knows what. Find someone who consistently gives you timely, reliable, technically accurate answers, and get their name tattooed on your shoulder! Every company has at least one. And they may not be in the project manager/product manager/customer/programmer roles. They are generally the people who’ve used the product in the real world, and dealt with real world customers.

3) Communicate WITH, not AT. Tech-writers don’t have enough power to get away with communicating at.

4) Track stuff (e.g., take a spreadsheet printout and write it up on the board).

5) Develop good product and domain knowledge – The more you can figure out for yourself, the better off you’ll be (and the more respect you’ll get from the techies).

6) Find out who your users are, what they are trying to do, what they are having trouble doing, and how they want to be helped. Then provide this assistance. Help the user do what they are trying to do. Don’t just tell them what the product can do – a help system is only helpful if it addresses the users’ needs.

7) Treat everyone as a customer. Then manage their expectations and your commitments. Always ensure they know what you’re doing. Tell them when you’ll be finished. And pull out all stops to meet your deadline.

8) Provide a surrogate user-testing mechanism for the development team by giving them usability feedback.

9) Work as hard as required to get good quality doco finished on time and to budget – this is how you’ll get the satisfaction out of work that you need.

10) Have fun with it.

Don’t become jaded and cynical by the high-tech, harsh, uncaring IT world. Use your smarts, and make the most of the resources provided. Most importantly of all, make work satisfaction your number 1 goal. It’s the best way to stay happy and get ahead.

Author Bio:
Glenn Murray heads copywriting studio Divine Write. He can be contacted on Sydney +612 4334 6222 or at glenn@divinewrite.com. Visit www.divinewrite.com for further details.

Increase Web Site Visitors For Relevant Terms Using Your Less Relevant Listings
Completely by accident, and through no effort of my own besides getting my site spidered by Yahoo’s Slurp search engine spider and Google’s Googlebot search engine spider, I’ve seen an increase in the visitors to my web site.

Suddenly, I was ranking on the first page of Google, Yahoo and AOL for phrases like “picture of William Hung” and “william hung she bangs”.

The good news is that when the Yahoo Slurp spider came to the site and indexed my front page last Saturday (and no, I didn’t submit my site OR pay for inclusion), my results were in the search engine by Sunday, which is when I started getting clicks from Yahoo for this unusual term.

The bad news is that William Hung, apart from being an example of the power of free marketing, has NOTHING to do with my site!

So how does one fix this situation? How do you turn an accidental good ranking into several good rankings for other terms? Or say you’ve got great rankings for a relevant, but fairly insignificant term? The solution is the same.

Just give the spider more of what it wants.

Google, Yahoo – and apparently their search partners – love that page. So the first thing I did was improve the page in question. At the time, there was no picture of William Hung on the page that was ranking so well. It’s important that you satisfy web site visitors who accidentally happen upon your site.

It only happens once in a while, but sometimes the person who comes by is also interested in your regular content.

The next thing I did was create other pages I believed the two search engines would like. Since the articles here primarily offer resources to webmasters who want to learn how to increase the number of visitors to their sites, all I had to do was tweak some of the wording to the liking of both those visitors and the search engine spiders, and write one more article.

How did I know what the search engines would like, and how that would increase the number of people who would come to the site? Well, I found out that some guarded information about the latest optimization tips for Yahoo and Google was available on the Web FREE. (If you want to know what that resource is, just download the updated Free Google Optimization Guide.)

Once I had the pages optimized to my satisfaction, all I had to do was link some of those pages to the front of my site (where the article was then located) and wait for the increase in traffic. Google has spidered me every day like clockwork ever since I made a tiny change to my website in January that increased my site rank from zero to five.

Of course, I then had to figure out if my secret technique to baiting the Google spider was going to work with Yahoo’s new search engine spider.

Not only did it work, but it worked faster than leading the Google spider to a specific page of my site! Yahoo’s spider is not back at my site on a daily basis yet, but it does come regularly, finding new pages and reindexing the area with the most content.

So there you have it. If your site has been indexed well for a term you didn’t really want, first, make the page as relevant as possible to the visitors who end up at your site.

Next, give the search engine spider “food” to eat that is more relevant to your site. It doesn’t hurt to optimize this page. You can now download the new optimization guide for FREE if you’re one of the next 100 subscribers. Only 1000 copies will be available for download, so get yours now.

Third and last, get the spider to come on back to your site. If you know how to bait a search engine spider, great! You’ll get your results faster, and since the spider finds the link on its own, you may get better placement. Or you can submit your site to one of Inktomi’s partners, and/or Google, and wait the 6-8 weeks it takes to be included.

Author Bio:
Subscribe to Free Traffic Secrets to find out how to get the Googlebot spider sooner at ftdsecrets-subscribe@topica.com or visit http://www.freetrafficdirectory.com to get more time-tested information on getting Free Quality Traffic to Your Site.

SEO’s Relationship With Website Architecture
Search engine optimization for today’s search engine robots requires that sites be well-designed and easy-to-navigate. To a great degree, organic search engine optimization is simply an extension of best practices in web page design. SEO’s relationship with web design is a natural one. By making sites simple and easily accessible, you are providing the easiest path for the search engine robots to index your site, at the same time that you are creating the optimum experience for your human visitors.

This approach ties well into the notion of long-term search engine marketing success. Rather than trying to “psych out” the ever-changing search engine algorithms, build pages that have good text and good links. No matter what the search engines are looking for this month or next, they will always reward good content and simple navigation.

Search Engine Robots
Search engine robots are automated programs that go out on the World Wide Web and visit web pages. They read the text on a page and click through links in order to travel from page to page. What this really means is that they “read” or collect information from the source code of each page. Depending on the search engine, the robots typically pick up the title and meta description. The robots then go on to the body text of the page in the source code. They also pay attention to certain tags such as headings and alt text. Search engine robots have capabilities like first-generation browsers at best: no scripting, no frames, no Flash. When designing, think simple.

Search Engine Friendly Design
Creating search engine friendly design is relatively easy. Cut out all the bells and whistles and stick to simple architecture. Search engine robots “understand” text on the page and hyperlinks, especially text links. The relationship of SEO and web design makes sense when you start with good design techniques for your visitor. The easier the navigation and the more text on the page, the better it is not only for the visitor but also for the search engine robots.
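
As a rough illustration, here is a minimal sketch of a page built along those lines (the business, file names and wording are all hypothetical):

    <html>
    <head>
      <title>Handmade Oak Furniture | Smith Woodworks</title>
      <meta name="description" content="Handmade oak tables, chairs and bookcases, built to order.">
    </head>
    <body>
      <h1>Handmade Oak Furniture</h1>
      <p>Plain body text describing what you offer, written around the
         keyword phrases your visitors actually search for.</p>
      <!-- Simple text links: easy for visitors, easy for robots -->
      <a href="tables.html">Oak Tables</a>
      <a href="chairs.html">Oak Chairs</a>
      <a href="contact.html">Contact Us</a>
    </body>
    </html>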

Obstacles For Indexing Web Pages
Search engine robots cannot “choose” from drop-down lists, click a submit button, or follow JavaScript links like a human visitor. In addition, the extra code necessary to script your pages or create those lists can trip up the search engine robots while they index your web page. Long JavaScript blocks in your source code mean the search engine robots must wade through all that code before finally reaching the text that will appear on your page.

Offload your JavaScript and CSS code for quicker access to your source code by the search engine robots, and faster loading time for your online visitors. Some search engine robots have difficulty with dynamically-generated pages, especially those with URLs that contain long querystrings. Some search engines, such as Google, index a portion of dynamically generated pages, but not all search engines do. Frames cause problems with indexing and are generally best left out of design for optimum indexing. Web pages built entirely in Flash can present another set of problems for indexing.
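
To illustrate the offloading advice above, here is a minimal sketch (the file names are placeholders): the script and style sheet live in external files, so the robots reach your body text almost immediately.

    <head>
      <title>Handmade Oak Furniture | Smith Woodworks</title>
      <!-- Scripts and styles moved out of the page source into external files -->
      <script type="text/javascript" src="/js/menu.js"></script>
      <link rel="stylesheet" type="text/css" href="/css/site.css">
    </head>
    <!-- The body now begins with indexable text instead of
         hundreds of lines of JavaScript -->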

Depth Of Directories
Search engine robots may have difficulty reaching deeper pages in a website. Aim to keep your most important pages no more than one or two “clicks” away from your home page. Keep your pages closer to the root instead of in deeply-nested subdirectories. In this way you will be assured the optimum indexing of your web pages. Just as your website visitor may become lost and frustrated in too many clicks away from your homepage, the robots may also give up after multiple clicks away from the root of your site.

Solutions And Helpful Techniques
If there are so many problems with indexing, how will you ever make it work?

The use of static pages is the easiest way to ensure you will be indexed by the search engine robots. If you must use dynamically-generated pages, there are techniques you can use to improve the chances of their being indexed. Use your web server’s rewrite capabilities to create simple URLs from complex ones. Use fixed landing pages including real content, which in turn will list the links to your dynamic pages. If you must use querystrings in your page addresses, make them as short as possible, and avoid the use of “session id” values.
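
For instance, a fixed landing page with real content can funnel both visitors and robots into a dynamic catalog. The URLs below are hypothetical; the point is that each querystring is short and carries no session id:

    <h1>Oak Furniture Catalog</h1>
    <p>Browse our most popular categories of handmade oak furniture.</p>
    <!-- Short, session-free querystrings are far easier to index -->
    <a href="/catalog.asp?cat=12">Oak Tables</a>
    <a href="/catalog.asp?cat=14">Oak Chairs</a>
    <a href="/catalog.asp?cat=17">Oak Bookcases</a>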

When using Flash to dress up your pages, use it for an important message or two, but avoid building entire pages with that technology. Make sure that the search engine robots can reach all of the important text content on your pages. You want your message to get across to your human visitors as well: give them enough information about your product to interest them in taking the next step and purchasing it.

If you must use frames, be sure to optimize the “no frames” section of your pages. Robots can’t index framed pages, so they rely on the no frames text to understand what your site is about. Include JavaScript code to reload the full frameset when a framed page is opened directly from the search engine results page.
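
Here is a sketch of an optimized “no frames” section (the content and file names are illustrative only):

    <frameset cols="20%,80%">
      <frame src="nav.html" name="nav">
      <frame src="main.html" name="main">
      <noframes>
        <body>
          <h1>Handmade Oak Furniture</h1>
          <p>Real, keyword-rich text describing the site, plus the same
             text links the framed navigation offers.</p>
          <a href="main.html">Enter the site</a>
        </body>
      </noframes>
    </frameset>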

Got imagemaps and mouseover links? Make sure your pages include text links that duplicate those images, and always include a link back to your homepage.

Use a sitemap to present all your web pages to the search engine robots, especially your deeper pages. Use text hyperlinks on the sitemap page, with a sentence or two describing each page listed, using a few of your keyword phrases in the text.
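
A simple sitemap page might look like this sketch (page names and descriptions are hypothetical):

    <h1>Site Map</h1>
    <!-- One text link per page, each with a short keyword-rich description -->
    <a href="tables.html">Oak Tables</a> – Handmade oak dining and coffee tables, built to order.<br>
    <a href="chairs.html">Oak Chairs</a> – Matching oak dining chairs and rockers.<br>
    <a href="workshop.html">Our Workshop</a> – How our handmade oak furniture is built.<br>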

Remember that the search engine robots “read” the text on your web page. The more that your content is on-topic and includes a reasonable amount of keyword-rich text, the more the search engine robot will “understand” what the page is about. This information is then taken back to the search engine database to eventually become part of the data you see in the search engine results.

Last of all, it is very important to validate your pages. Errors from programming code and malformed HTML can keep the search engine robots from indexing your web pages. Keep your coding clean.

Check List For Success
– Include plenty of good content in text on your web pages
– Incorporate easy-to-follow text navigation
– Serve up dynamically generated pages as simply as possible
– Offload JavaScript and other non-text code (style sheets, etc.) to external files
– Add a sitemap for optimum indexing of pages
– Validate your pages using the World Wide Web Consortium’s validation tool, or another HTML validator

On Your Way To Indexed Pages
The best way to assure that your pages will be indexed is to keep them simple.

This type of architecture not only helps the search engine robots, but makes it easier for your website visitors to move throughout your site. Don’t forget to provide plenty of good content on your pages. The search engine robots and your visitors will reward you with return visits.

Resources
To learn more about how to work around optimization problems with JavaScript, dynamically-generated pages, Frames and Flash, read the following articles:

Optimizing Pages with JavaScript and Style Sheets for Search Engines
http://www.searchinnovation.com/optimizing-pages-with-javascript.asp

Optimizing Dynamic Pages (Part I)
http://www.searchinnovation.com/optimize-dynamic-pages-1.asp

Optimizing Dynamic Pages (Part II)
http://www.searchinnovation.com/optimize-dynamic-pages-2.asp

Optimizing Frames for Search Engines
http://www.searchinnovation.com/optimizing-frames-for-search-engines.asp

HTML validation tool
http://validator.w3.org/

Stylesheet validation tool
http://jigsaw.w3.org/css-validator/

Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing, a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.

How much should you be expected to pay?
This is obviously a tricky question. All I can say is that a link from a site with a PR of 4 is worth less than one from a site with a PR of 5, and even less than one from a site with a PR of 6.

That said, the ‘monthly leasing’ quotes I received from some of these companies range from a low of about $15 a month to a high of about $70 (too high, I think) for a PR 4. In the PR 5 category, costs ranged from $40 at the low end to about $110 at the highest. In the PR 6 and 7 categories, prices were all over the map, ranging from $150 to $250 for a PR 6, all the way up to $450 for a PR 7.

Note that these figures are in US currency and that prices can vary widely. I’ve even heard (but I don’t believe it) that some sites with PR of 4 are almost giving them away. Again, it is buyer beware. You should obviously do your due diligence on any of this before you decide to go with one company or another.

Is it better to stay in the same field?
I have been saying this for a long time and I will repeat it: Links are not all equal. A link from a site dealing in real estate pointing to another site in real estate will be given higher ranking privileges, not just by Google but also by some of the other search engines.

If your business is involved in home renovations and improvements, try leasing links from sites that will help your users, such as home decorators, lumber yards and building contractors. Some may not agree with me on this, but staying in the same field you are in will go a long way in substantially improving your rankings.

Conclusion
Leasing monthly links should be treated like any other business function. You do it to increase traffic at your site and to help your users, not just to increase your rankings with the search engines. Targeted traffic is not everything; you also need a site that converts well.

Converting well means that visitors become buyers and eventually repeat customers that come back to your website time and time again. If leasing links from high-ranking sites in your industry does it for you, you have accomplished one of your goals: to increase the size of your business, at the same time making it become more successful, independently of how search engines will rank your site.

Since we know that the number and the quality of inbound links to a site helps it in its rankings, leasing links thus accomplishes two main functions for businesses that engage in it.

Buying Links
The subject of buying one-way links is a controversial one at best. A lot has been said about it in the past, especially now after Google’s Florida and Austin updates. Depending on whom you talk to, you are likely to get opinions that range from ‘don’t do it’ to ‘it’s now OK to do it’. So what’s the average site owner or small business to do?

Look at it from a business perspective. If some search engines look at it from an advertiser’s point of view, then maybe that’s how you should look at it too. In the last two months, we have had (and continue to have) an increasing number of clients and people contacting us to buy one-way links from other sites.

So the question is: Should you do it? Is it ‘safe’ to do it at this juncture?

Back to the basics of linking
Before I answer that question, let’s go back to basics. Why do we need links in the first place? Can’t we just have a site with no outside links that will still rank high?

Here’s how Google ranks Web pages:

Google ranks websites on merit. One of the algorithms that does just that is the PageRank™ algorithm, named after Google co-founder Larry Page. In its most basic form, PageRank calculates the number and quality of incoming links to any given website.

As far as Google is concerned, a link from site A to site B is viewed as a ‘vote’, which means that if 300 sites link to site B, that site must be more important than if it only had 5 or 10 inbound links.
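
For reference, the simplified formula published in Page and Brin’s original paper captures this voting idea. Here T_1 through T_n are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor, commonly cited as 0.85:

    PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)} \right)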

As a result, the more links pointing to a given website, the better that site is usually perceived in terms of quality of content. Over the past 3 or 4 years, a great number of webmasters and site owners have participated in what are called “reciprocal link exchanges”, which, as the name implies, work just as advertised: “I will put a link from my site to your site, if you do the same”.

In essence, there is nothing wrong with that. However, since Google’s last two major updates, in November and January, reciprocal links don’t seem to have the same impact they once had, at least not from a ranking point of view.

Since then, it would appear that one-way links (from A to B and not B to A also) are receiving better rankings.

This reopens the debate on whether buying links is ethical. Look at this analogy: if a website or portal carries a number of advertisers with links on its homepage, all pointing directly to those sites, isn’t this link buying? It sure looks like it to me.

Viewed in this manner, I think it’s safe to say that buying links is now an acceptable practice in the Internet age.

So how do we go about doing it?
You should approach the subject of buying links the same way you would approach any other business proposal or transaction: with careful planning and attention to the small details. You should sit down and plan carefully the way you will do this. You probably have some sites in mind you would like to contact in an effort to see if they will ‘lease’ you some links.

I prefer the term leasing instead of buying since, in these uncertain economic times, nothing lasts forever. A few years ago, what would have happened if you purchased a ‘lifetime link’ from a site like Enron? It’s pretty clear today that this would have been a bad idea.

In the last few weeks, I have been approached by a number of companies with websites that said they might be interested in leasing some links. Some of them have a PR (PageRank) value of between 4 and 6, with a few at PR 7. I personally believe that this trend will most likely increase.

Author:
Serge Thibodeau of Rank For Sales

Introduction
In search marketing, there are many more questions than answers, particularly when it comes to how people search. We know how we search and we assume everybody searches in a similar way. Also, because searching has become such an intuitive function, we tend not to really give the actual search process much thought. If many of us actually looked at what we did in a search process, we’d probably surprise ourselves.

At Enquiro, we decided to try to peel back the shroud that covers common search behavior. We wanted to see just how people searched, and to ask them what went through their minds during the search process. It was a fascinating study, and it resulted in a 30-page white paper which you can download from www.enquiro.com. Today, I want to cover just a few of the things we uncovered.

What We Did
First, we decided that we’d invite people to visit our computer lab, give them a couple of typical search scenarios, and record their actual search behavior. Then we sat down with them, reviewed their search activity and asked them questions about how they interacted with the search results. We recorded their comments and then compiled them for analysis. We had 24 participants, with a fairly broad representation from different age groups, income and education levels, and backgrounds.

There’s No Such Thing as an Average Searcher
First, the assumption that everyone searches in a similar way quickly proved to be false. We saw marked differences in the way people searched. These differences could be due to gender (yes, men do search differently than women), age, education or experience with the internet. In some cases, the differences were dramatic and could have a major impact on an advertiser’s search strategies. For instance, women tended to scan all organic results and read titles and descriptions more carefully than men. An organic listing in the number 8 position on Google might not have been seen by almost half the men in the group, but would have been seen by the majority of the women. This is just one example of how one search marketing strategy won’t fit all prospective customers.

How They Saw a Search Results Page
One of the foremost questions on our minds was how people interact with a search results page. Do they scan all the listings, or just a few? How important is position versus the title and description? Do all users see sponsored listings?

It seems that people have already mentally divided the results page from their favorite engine into sections. These sections tend to be sponsored listings (in some cases, both at the top and along the side of the page), above the fold organic results (free listings that appear without the user having to scroll down), below the fold organic results (free listings that require scrolling down) and other features (such as Google’s news and shopping feed results, just above the organic results). Not all these sections are treated equally by the user. Some, particularly the sponsored listings, are often skipped over by many users (over half the group) to go directly to the organic listings. Depending on the type of searcher and what they find in the organic results, they may or may not come back to sponsored listings after looking at the organic ones.

The above the fold organic results proved to be the prime real estate on the search engine results page, with all users making sure they looked at the top few listings. The eyes started to drop off as we moved to below the fold organic results and the sponsored results, with only 16.6% of users saying they check out sponsored listings, regardless of what they find in the organic results. 50% of users said they’d check out sponsored listings if they didn’t find anything relevant in organic results.

Searching is more Complex than we Imagined
As marketers, we tend to think of the search process as a linear one. A person searches, chooses a result, visits a site, and hopefully converts. In reality, the typical search pattern is quite different.

A typical search is a circular and complex process, with multiple interactions with sites and search engine results pages. The average online research interaction can involve 5 to 6 different queries and interactions with 15 to 20 different sites. Often, the actual contents of a search results page can cause the searcher to take the search in a totally different direction, launching a new query that is at best somewhat divergent from the original purpose of the search. Dead ends are common and the browser back button is used extensively to navigate through the search process. For this reason, the search engine results page is actually used as a navigation aid in negotiating the online research interaction, as people continually refer back to it and launch another online exploration from this starting point.

While it is difficult to strategize for, search marketers have to understand that a search interaction is a complex process and that the searcher’s mindset evolves as they move through it.

Building the Search Query: The Funnel Approach
Over 70% of participants indicated they like to start with a generic, inclusive keyphrase and narrow it down from there. Reasons for this included:

* Not wanting to exclude potential quality sites by being too exclusive in the original search
* By being broader, the searcher may find other options to help take the search in new directions by looking at the results
* Being able to judge relevancy of the original findings and selectively increase relevancy by adding qualifying keyphrases
* It’s easier and quicker to type in a broad, short phrase at the beginning

In this type of search pattern, looking at search volumes and typical conversion metrics can be misleading to many marketers.

For many searchers, the search becomes increasingly specific as they go through the searching process. As this happens, the chances of the searcher finding results that could lead to a conversion becomes greater and greater as the search progresses. However, the direction the search takes can be determined by the results found in the early, generic searches. For instance, in one case where a participant was looking for information on cruises, the searcher didn’t start out looking for either a Panama Canal Cruise or a Princess Cruise, but results found early in the search process led her to refine her search query in these directions. If awareness of these options hadn’t been introduced early in the search process, she would have never refined her search in these directions, leading to a likely conversion for Princess for a Panama Canal Cruise.

Introduction of Brand
This iterative search process introduces the opportunity for a multiple touchpoint approach to search marketing, introducing brand early in the search process and then reintroducing it throughout. Obviously, this works better if the consumer is already familiar with the brand, with the advertiser having built brand equity through other online and offline channels.

The Anonymity Threshold
In watching the participants’ interactions with a site, we also found that another common trait appears, particularly with the deliberate researchers. We have called it the Anonymity Threshold.

In general, people feel they are relatively anonymous when they are browsing online. And when people are gathering information about a purchasing decision, most prefer to remain anonymous. They don’t want to be exposed to sales pitches at this point, because they’re not ready to engage in the purchase process. They haven’t narrowed down their list of options yet.

In looking at the cruise example used in the buying funnel, it wasn’t until the searcher had found the right destination, type of cruise and cruise line that they were ready to engage in the purchase process. For this reason, they were resistant to purchase-oriented incentives (i.e. discounts) until the very end.

The internet has become very popular as a research tool during the information gathering process because it appears to offer the ability to remain anonymous. Through search engines, you can gather a lot of information quickly, and you don’t have to enter into a situation where you surrender your anonymity until you choose to. We believe this is the reason there is a significant drop-off between people willing to use the Internet to research a purchase decision and people willing to use it to purchase online. This drop-off has been identified by a number of ecommerce studies. The purchase requires people to cross the anonymity threshold, and they’re not prepared to do that. They know that once they surrender contact information, they will likely be contacted by the vendor and engaged in a purchase transaction. The consumer wants to do this according to their timing, not the vendor’s.

An interesting example of a violation of the anonymity threshold was presented by the use of online real time, real person sales chat tools such as HumanClick and Groopz. At first glance, these tools seemed a great answer to the impersonal nature of the Internet. You could watch visitors navigate through your site and if they wished, they could click on a button and initiate a real time chat with a sales person. As long as vendors stayed on this side of the fence, and let the visitor initiate the session, there was no problem. The challenge came when the vendor ‘pushed’ a chat window to visitors, offering assistance. Almost without exception, the visitors left immediately. We, along with a few other vendors we talked to, found that the minute we walked over the threshold and made visitors aware that they were being watched, they quickly left our site.

People won’t cross the threshold until they have no option. Given the choice between getting information while remaining anonymous and getting it by registering, people will always choose the former. This creates a bit of a dilemma for the marketer, because generally the key metric is measured against acquired or converted visitors. Almost every definition of an acquisition or conversion requires the visitor to cross the anonymity threshold. Because of the visitor’s reluctance to cross it, the site owner may be building significant brand equity or trust with the visitor but is not given credit for it.

In order to entice people to purchase online, the web vendor has to offer at least one significant advantage, whether it’s price, selection or convenience. If all things are equal or even close to equal, people will tend to avoid entering into a purchase process online.

In looking at most search marketing strategies, the emphasis is put on encouraging the purchase, while most people using search engines are more interested in anonymously gathering information. I believe there’s a potential disconnect here that more search marketers have to give some serious thought to.

More to Come
In this article, I’ve just looked at some of the findings from the study. In the next Net Profit, I’ll be looking at the 4 identified types of searchers, and what caught their attention in the search listings. And if you just can’t wait to get all the goodies, please download the full white paper at www.enquiro.com/research.asp.

Yes, despite what I said before, you will have to step over the anonymity threshold long enough to give us your email and name. Ironic, certainly, but like I said at the beginning, I don’t have all the answers. Hopefully we’ve thrown a few more at you.

Author:
Gord Hotchkiss

Introduction
In case you haven’t noticed yet, Yahoo has been hard at work lately. If you missed it, Yahoo now has its own toolbar. And, in a fashion similar to the Google toolbar, Yahoo has just introduced Web Rank™. Web Rank happens to be Yahoo’s new search engine algorithm, as well as the name it gave its new toolbar.

The Yahoo Web Rank toolbar works a bit like the Google PageRank toolbar and is a technical measurement of a particular URL’s popularity. If you download and install the Web Rank feature on the Yahoo Companion Toolbar, an icon will display the Web Rank value of each URL that you visit.

Just like Google’s, Web Rank values range from 1 to 10, with a higher number indicating higher link popularity. So it is hoped (!) that a site with a higher Web Rank will offer more information and content – at least, that’s how it should work.

Note that Web Rank is only in beta release, which means Yahoo is still experimenting with it, and modifications to the way it works are possible until the final version is released sometime in the next few weeks.

To help determine a site’s Web Rank value, Yahoo’s Companion Toolbar collects anonymous URL data about sites visited by toolbar users who have enabled the Web Rank feature and then sends that information back to Yahoo. According to Yahoo, the new toolbar does not collect personal identity information about you, such as your name, phone number, physical address or email address.

The anonymous URL data is sent to Yahoo’s servers and the Web Rank value is returned to the Companion Toolbar as one single measurement of the popularity of the Web page or URL you are currently visiting.

You will see a small Yahoo icon on your Companion Toolbar, displaying the Web Rank value of the site you are currently visiting. The value will be between 1 and 10.

Yahoo’s Web Rank numbers are calculated using a sophisticated scoring formula developed in Yahoo’s labs that provides a measure of the popularity of the Web page or Web site that you are viewing.

Installing Yahoo’s new Web Rank toolbar is easy. When you install Yahoo’s Companion Toolbar, you will have the option of enabling the Web Rank feature. If you want to use it, choose the “Install with Web Rank” button on the configuration panel and the Web Rank feature will be enabled during the installation of your new toolbar. It took me less than two minutes to do the whole thing.

Once you have it installed on your machine, should you ever change your mind, you can turn it off. Doing this is easy: from the Yahoo Companion Toolbar, just click the “Toolbar Settings” button and uncheck “Enable Web Rank.” Once Web Rank has been disabled, the Companion Toolbar will stop collecting anonymous URL data about the sites you visit.

If you are running your toolbar on a corporate network and need to disable it there, block http://cpn.yahoo-webrank.net at your firewall or proxy server. This will stop the Web Rank feature from sending any anonymous URL data back to Yahoo’s servers.

Do you really need the Yahoo Companion Toolbar?
You don’t need to install the Companion Toolbar if you don’t want to; the Web Rank feature is opt-in only. If you do decide to install it, you will be asked whether you would like to enable the Web Rank feature in the toolbar.

If you choose to enable it, a toolbar icon will display the Web Rank value for the URL that you are currently visiting, if that value is in fact available.

Web Rank’s cool new features
I had time to experiment with Yahoo’s new toolbar and discovered some cool new features. One of them is what Yahoo calls “Recent Searches”. This feature is a pull-down menu of your last 30 search terms through Yahoo Companion. At the time I tested it, 12 query terms were directly visible and I could access an additional 18 with the scroll bar.

According to Yahoo, no information is ever sent back to any of their servers about any of your recent searches. If, for privacy reasons or otherwise, you want to clear your recent searches, you can do so at any time by selecting ‘Clear Recent Searches’ under the “Toolbar Settings” button. That button has a small picture of a pencil on it, to help you find it.

The Recent Searches feature can be disabled entirely by deselecting the menu item called ‘Enable Recent Searches’ through the “Toolbar Settings” menu. This action will delete all the recent search terms in the client and disable the feature.

Other features and benefits to searchers
Yahoo’s new Web Rank algorithm has even more features. It can also provide the following important benefits to its users:

1) It will help Yahoo identify critical new trends and usage habits in Internet activity, in an effort to better enhance the quality and relevance of the products and services people are searching on.

2) It will also help Yahoo correctly identify new websites faster and then add these newer sites to its search index, resulting in more relevant and significantly improved overall search results.

Features and benefits to webmasters and site owners
For site owners and webmasters, the new Yahoo Web Rank algorithm will also deliver the following features and benefits:

1) Yahoo’s Web Rank algorithm delivers a valuable indicator of how popular your site is perceived to be by other site visitors and webmasters.

2) Additionally, Yahoo’s new Web Rank algorithm will help alert Yahoo’s Slurp (Yahoo’s search crawler, or spider robot) to the existence of a particular website or Web page, and direct its spider to visit that website or Web page for inclusion in its search index, if it isn’t already in it.

Conclusion
As I wrote in my last article on the new Yahoo, it is now clear that Yahoo wants not only to conserve its lead position over Google, but also to improve on it. The next few months should be exciting, as the industry witnesses other new developments, whether in new search engine algorithms, new toolbars or the like.

This industry is advancing at break-neck speed, and I predict that the search engine industry will be a $30 billion-plus industry in 2005-2006. Judging by the speed at which the Internet is growing, it may well be over $100 billion by 2010.

Author:
Serge Thibodeau of Rank For Sales

Interview With Eurekster CEO
If you keep tabs on the latest search engine news, you’ll no doubt realize that we are getting closer to a merging of search engine technology and social networking. While Google may have created the sizzle with its recent launch of Orkut along with rumors that it may one day roll it into its search engine, there is a company already making headway with social searching.

Although Eurekster may technically still be in Beta testing (launched in January of this year), there is no doubt that the offspring of SLI Systems and RealContacts is making major advancements in combining social networking with search engine technology.

Eurekster makes use of its own SearchMemory™ technology, which remembers the sites a user finds useful and presents them higher in the results the next time they search. Then Eurekster lets a user and their friends share their searches and sites, so when they do a “hotel” search, for example, they’ll see the hotel sites their friends also found useful, moved up in the results and marked with an icon.

I had an opportunity to discuss the future of social searching with Eurekster CEO Grant Ryan, and to find out what Eurekster is doing to get a step ahead of Google, Yahoo and MSN.

[Andy Beal] Tell me about the search engine technology being developed by Eurekster?

[Grant Ryan] The Internet is a huge place – how do we know what is interesting out there? Word of mouth is the most common way for new ideas to spread and the “What’s Hot” function of Eurekster allows users to see what is going on with their contacts without seeing exactly who does what. This has already worked in interesting ways. There was an earthquake in my home town and someone immediately did a search for that to find out how big it was. Two people I know who were overseas at the time saw this in the “Recent Searches” area of Eurekster and rang home to check that everything was ok. It is a great way to share information with your contacts.

We’ve also shown how search engines can now also remember that you or your friends liked one particular search result over the thousands of others, and deliver it on top of the results for all future searches performed by your network of contacts.

[AB] What new developments in search do you see happening in the next couple of years?

[GR] We think that personalization will be the main area of improvement. Search technology has evolved from computers deciding what is relevant (e.g. Infoseek, AltaVista), to paid editors deciding what is relevant (e.g. Yahoo, LookSmart), to webmasters deciding what is relevant through link analysis (e.g. Google, Teoma).

The next logical step is that users decide what is relevant based on their knowledge and experiences. Search engines that learn and adapt results based on your behavior, giving personalized results is the next big opportunity and challenge.

Another big opportunity is local search – this is a form of personalization – delivering search results based on one’s location. This is, to some extent, like merging the yellow pages with search. This has a lot of potential commercially, especially since there still are greater numbers of yellow page advertisers out there than search engine advertisers.

[AB] What impact do you see social networking having on the future of search engine technology?

[GR] Word of mouth or social networking is the most commonly used method for filtering information in everyday life. We use it every day to get recommendations for doctors, lawyers, places to stay on holiday, and so on. As the quantity of information explodes, word of mouth information filtering will become even more important. It is inevitable that this natural social process will be used to filter information on the Internet and search engines are the logical place to start.

The reason that social networks are important for information filtering is that there are billions of people in the world with different views about what is important and interesting. One of the ways we choose people with whom to associate is based on the fact that we either enjoy something about their perspectives of the world or share similar views. In either case, this is a useful way to help work out what is likely to be more relevant to you.

[AB] Do you foresee a time when commercial search results (product/services) will be separated from informational search results (white papers/educational sites)?

[GR] Yes it may head that way. I can naturally see that there will be more tabs on search engines to allow users to focus only on products or just on educational information, etc. Most users simply want to type in a search query and have results appear — so I suspect they will continue to be mixed by default.

[AB] How do you see search engine technology impacting our use of portable technology such as PDAs and Cell phones?

[GR] I would be surprised if PDAs and cell phones are ever used as a primary means of searching, given their small screen sizes. Mobile search engines of the future are likely to take into account your precise location when serving results, as you are more likely to be looking for directions, local news, sport etc.

[AB] If search engine users gave up a little of their privacy and allowed their search habits to be monitored, would this allow the search engines to provide better, customized results?

[GR] Yes – if users want truly customized services then the provider has to know something about their preferences. The level of service you can get from a travel agent or investment advisor would be severely limited if you had to start from scratch every time you needed something. Most search engines assume that everyone typing in a term is looking for the same thing and give them exactly the same results!

[AB] Grant, tell us what Eurekster is doing to personalize the search experience?

[GR] At Eurekster we have developed a way to learn from your past search history and that of your contacts in a way to provide personalized and more relevant search results. There are strong incentives for search engines to keep their promises on privacy given there is more value in keeping a long term quality relationship, compared to the negative publicity and loss of customer trust.

[AB] How can Eurekster compete with Google or Yahoo?

[GR] I have been involved in the search business for over 6 years now and every year have read articles about how the search wars have been won (different companies over time e.g. AltaVista/Infoseek, Yahoo, Inktomi, Google). It is inevitable that companies will continue to come up with new technologies that offer consumers greater choice and new improvements. That is what we are doing at Eurekster, so have a play and tell us what you think — and what features you want us to add next.

Author Name: Andy Beal
Company: KeywordRanking.com
Email: andy@keywordranking.com

Author Bio:
Andy Beal is Vice President of Search Marketing for WebSourced, Inc and KeywordRanking.com, global leaders in professional search engine marketing. Highly respected as a source of search engine marketing advice, Andy has had articles published around the world and is a repeat speaker at Danny Sullivan’s Search Engine Strategies conferences. Clients include Real.com, Alaska Air, Peopleclick, Monica Lewinsky and NBC. You can reach Andy at andy@keywordranking.com and view his daily SEO blog at www.searchenginelowdown.com.

Don’t Drop Winning Ad Campaigns!
Newcomers to pay-per-click advertising are often stunned at how quickly they can achieve success marketing affiliate products. Don’t make the beginner’s mistake of dropping a winning ad campaign.

Here’s a fairly common mistake that is made by the newbie to the pay-per-click scene.

So you’ve started a Google AdWords campaign, and you’ve registered as an affiliate with several different products, and you’ve found at least one product that’s making you money.

Let’s say you’ve found a toy company that pays you 10% of each sale to promote their products (just for a tangible example, we’ll refer to it as ‘Toy Company #1’).

You’ve played around, tweaked your ads, and you’re getting a little success. Bravo! Things are running in the black, and you’re making steady profits.

But wait, you’re still combing the Internet, looking for new and better opportunities when bam, all of a sudden you find Toy Company #2. Wow, Toy Company #2 looks really great. It pays a 20% commission on all sales, and the landing page seems easier and more intuitive than the landing page for Toy Company #1.

Being the wise and quick-to-evolve pay-per-clicker you are, you swiftly join the affiliate program for Toy Company #2 and set up an ad campaign for it, using many of the same ads and keywords that you’ve had such great success with in promoting Toy Company #1.

Surprise! Toy Company #2 is a great find! It converts much better than Toy Company #1 and moreover it pays much more per sale! You’re ecstatic! Greater success!

And then comes the mistake. Thinking ruefully about what a sucker you’ve been all along for promoting Toy Company #1 instead of Toy Company #2, you pull the plug on your ad campaign for Toy Company #1.

It’s only human nature to want to do this. After all, Toy Company #1 wasn’t performing as well as #2 is. You aren’t seeing nearly as high a profit margin.

But the hard truth of the matter is this: Toy Company #1 was making you steady money! It wasn’t running in the red. It wasn’t a drain on your budget or resources. It was a winner.

Profit is profit. Every little bit adds up. Sure, Toy Company #1 wasn’t making you much money, but even a little bit of profit is more than you had. Imagine if you found seven more companies that performed like it – at the end of the month, all those ‘little’ profits would add up to a ‘big’ overall profit.

The best solution is to keep the ad campaign that’s already working, and add new campaigns that show even better profits to it.

But what about the fact that you’re now competing with yourself? On the surface it doesn’t seem to make sense to promote two different products that are aimed at the same audience.

The answer to this objection is simple. Why shouldn’t you compete with yourself? That way, no matter which choice the consumer makes, you win. Some of your audience, for whatever reason, just isn’t going to buy products from Toy Company #2. Some of them will only buy from Toy Company #1. Don’t you want to make profits from those people too?

The lesson is simple. Don’t drop winning ad campaigns!

Author Bio:
Daniel Brough is the founder of AdWord Wizards, a free mentoring program designed to teach anyone how to profit from pay-per-click search engines. Want to start a profitable AdWords campaign in less than 30 minutes? Come to http://www.adwordwizard.com and sign up for this free program.

Traditional Search Engine Optimization
First, I will define what traditional SEO is. Search engine optimization is the process of fine-tuning (otherwise referred to as optimizing) your web site to reflect specific keywords and phrases that are relevant to your business, so that you attract visitors who are searching for those words.

This optimization relates to a variety of elements, not only on your web site’s home page but on its sub pages as well. Those elements can include things like title tags, meta tags, alt tags, link structure, link popularity and the content of the site itself. Once your web site is properly optimized, the goal is to make sure that each of the top crawler-based search engines finds your site and adds as many pages as possible to its index. These engines will usually start with your home page and then work their way to other pages of the site.
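
To make those elements concrete, here is a minimal sketch showing where each one lives in a page (the business, file names and wording are hypothetical):

    <html>
    <head>
      <!-- Title tag: reflects the page's target keyword phrase -->
      <title>Discount Ski Equipment | Alpine Outlet</title>
      <!-- Meta tags -->
      <meta name="description" content="Discount skis, boots and bindings from leading brands.">
      <meta name="keywords" content="discount ski equipment, discount skis, ski boots">
    </head>
    <body>
      <!-- Alt tag on images -->
      <img src="skis.jpg" alt="Discount downhill skis">
      <!-- Link structure: plain text links to the optimized sub pages -->
      <a href="boots.html">Discount Ski Boots</a>
      <p>Keyword-relevant site content goes here.</p>
    </body>
    </html>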

This is where another important aspect of SEO comes into play – making sure the sub pages or at least the main areas of your site are accessible from the home page. It can be assumed that if many of your pages are optimized for different keywords and can be found in the search engines, they will draw traffic to your site.

Search engine optimization also involves making sure your site is listed in the major directories, such as Yahoo and the Open Directory, to name a few. I won’t go into detail about proper directory submissions here, as that is another topic of its own, but I will say that it is important to make sure you are listed in the right category and have a title and description that reflect the keywords you are trying to target.

Other elements of SEO include monitoring your positioning in the search engines, making adjustments as necessary to your site to stay in top position and submitting to new engines or directories that come along as well.

Pay Per Click
Now I will define what PPC is, using Overture as an example. PPC, or pay per click, is a service in which an advertiser selects specific keywords or phrases and then creates a listing that will show up when someone searches for that phrase. The advertiser selects an amount they are willing to pay for each click on their listing, which results in a visit to their site – thus the term ‘pay per click’. At Overture, you can bid anywhere from $0.10 up to $50.00 for each click.
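
As a quick worked example (the numbers are hypothetical): if you bid $0.25 per click on a phrase and your listing draws 400 clicks in a month, your bill is 400 × $0.25 = $100 – whether or not any of those visitors buy.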

If other advertisers have selected the same keyword or phrase as yours, you are competing against them for the highest position. Whoever is willing to spend the most shows up first, with the others following in order.

What makes PPC attractive in the case of Overture is that they distribute their paid listings to other partner sites. In fact, if you bid in one of the top three positions at Overture, your listings will also show up at some of the leading search engines including Alta Vista, HotBot, Infospace, iWon, Lycos, MSN, Yahoo and others. They also show up in several meta crawlers and other minor search engines.

Therefore if you bid for top placement at Overture, you will show up at these partner sites as well. There are many other pay per click programs including Google AdWords, ah-ha.com, FindWhat, Sprinks and Looksmart to name a few. All function in a similar manner.

What PPC allows one to do, then, is to instantly ‘pay’ their way to the top, whereas traditional SEO takes a lot of time and effort.

The overriding question then is how valuable is PPC and should it replace standard search engine optimization techniques?

Pros and Cons of Pay Per Click
First of all, you must understand that PPC will never help improve your positioning in the regular editorial search results. Paid listings almost always appear in a “Sponsored” or “Featured” area, usually at the top or side of the regular search results.

While it is nice to ‘show up first’, there are still a lot of end users, myself included, who do not click on the “paid” listings but rather search through the regular editorial search results.

Pay per click will not optimize your site so that it reflects who you are and what you offer. If your site is poorly optimized before you begin to advertise it with PPC, it will still be poorly optimized afterwards. Another downfall is that when you stop paying for PPC, it disappears and so does the traffic it brings!

On the other hand, a well executed SEO plan will improve your position in the regular editorial search results. You will have a finely tuned web site that reflects who you are and what you offer. You are not held hostage to having to continuously throw money at search engines to maintain your listings.

In most cases, once you have good positioning in the regular search results, you will continually receive free traffic. It may require some minor adjustments from time to time but all in all brings in consistent free traffic so long as people are searching for your products or services.

I am not saying that PPC is bad and you shouldn’t do it. If you have a budget that goes beyond a traditional SEO campaign, then by all means utilize PPC as well. You gain instant exposure and only pay for actual visitors to your site.

But never, ever abandon SEO in favor of PPC! SEO should be the foundation of your online marketing strategy because, in the long run, the benefits of SEO will far outweigh anything else you do.

Author Bio:
David Wallace is CEO and founder of SearchRank, an original search engine optimization and marketing firm providing keyword analysis, organic search engine optimization, link popularity enhancement, pay per click management, search engine friendly web design and ongoing campaign maintenance.