Web Moves Blog

Web Moves News and Information

Archive for the 'Grow Your Traffic' Category

How to Get Your Web Site Listed In The Search Engine Directories
Submitting your web site to the search engine directories will give you a great boost in traffic. A listing in a large search engine directory like the ODP (Open Directory Project) enables your web site to get indexed for free by the major search engines such as Ask Jeeves, AOL Search, Google, and Lycos. Directories will also help increase your page ranking. However, getting listed in the search engine directories is more difficult than getting listed in the major search engines. Preparation and patience are needed.

In Part 1 we covered “How to Get Your Web Site Listed in the Major Search Engines”.

So what’s the difference between the major search engines and major search engine directories?

Search engines are often confused with search directories, but they are not the same thing.

A search engine is a web-based software tool that enables the user to locate sites and pages on the web based on the information they contain.

Search directories are compiled by human editors, so you are dependent on those editors to write a description of your website that covers the keyword phrases you need. Therefore, before submitting your site to a directory, think carefully about the description that you enter.

How to prepare your web site for search engine directory submissions

1. Research the best keywords – use the Overture suggestion tool or Wordtracker to find the keywords best suited to your site. There is no point entering words in a directory that don’t relate to your site or are rarely searched for. Include these keywords in your page content and in each of your meta tags.

2. Evaluate your site – search directories are more particular than search engines about which web sites they will list. Therefore, make sure your site has no broken links, includes relevant content-based pages, loads quickly (within 10 seconds), is easy to navigate, and works across browsers (a link-checking sketch follows this list).

3. Prepare your title – if you are a business, enter your business name, e.g. Drost Web Design. If you don’t have a business name, use the name of your site, e.g. iSiteBuild.com. Don’t use a long title; that may help you get listed in the search engines, but not in the directories.

4. Write a description – remember that human beings will be reading your description, so don’t create an ad. Create a well-written description of about 25 words that clearly explains what your site has to offer.

5. Look professional – your web site should have a professional appearance. Don’t just put up a bunch of affiliate links to other sites or a one-page sales letter.

6. Submit your site – navigate through the directory to find the best category for your web site. Don’t just pick what you think is the most popular category; pick the most targeted one. An easy way to find it is to enter your search terms in the directory.

7. Be patient – it can take months or sometimes even years to get listed. This is because there are thousands of sites being submitted every day. If your site has not been listed after 3 months, then resubmit.

8. Guarantee – there is no guarantee you will get listed, as the categories may already be filled. There may also be a backlog of sites waiting to be reviewed. Therefore, concentrate your efforts on other marketing methods as well.
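
A quick note on point 2 above: if your site has more than a handful of pages, checking every link by hand is tedious. Here is a minimal sketch of an automated broken-link check, written in Python with only the standard library (the URL at the bottom is a placeholder for one of your own pages):

    import urllib.request
    from html.parser import HTMLParser
    from urllib.parse import urljoin

    class LinkCollector(HTMLParser):
        # Collects the href attribute of every <a> tag on the page.
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self.links += [v for k, v in attrs if k == "href" and v]

    def check_links(page_url):
        # Fetch the page, then try each link and report any that fail.
        html = urllib.request.urlopen(page_url).read().decode("utf-8", "replace")
        collector = LinkCollector()
        collector.feed(html)
        for link in collector.links:
            absolute = urljoin(page_url, link)  # resolve relative links
            try:
                urllib.request.urlopen(absolute)
            except Exception as err:
                print("BROKEN:", absolute, "-", err)

    check_links("http://www.example.com/")  # placeholder: your own page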

In Part 3 of this article we’ll discuss the detailed requirements for submitting your web site to each Major Search Engine Directory in order to attract thousands of targeted visitors.

Author Bio:
Herman Drost is the author of the new ebook “101 Highly Effective Strategies to Promote Your Web Site” a powerful guide for attracting 1000s of visitors to your web site. Subscribe to his ‘Marketing Tips’ newsletter for more original articles at: subscribe@isitebuild.com. Read more of his in-depth articles at: www.isitebuild.com/articles

Getting more Google backlinks
Search engines have struggled over the years to provide relevant, good results for their users. At the same time, they have faced the “attack of the webmasters” from those determined to use every trick or tool to get their web site to the top of the listings.

It can become a webmaster obsession (or maybe an addiction?) to “win the game” of search engine ranking.

Another option is to part with cash and have a search marketing company do the work for you. This will not be instant, guaranteed or cheap, as it takes time and resources.

This has forced the engines to change how they rank sites. The most important change to the rules concerns links, or link popularity, meaning the number of web sites pointing to your site.

With Google this also means the quality of the sites linking to you. There are several ways to get links to your site:

• Exchange or reciprocal – This is when web site owners agree to place links on each other’s site.

• Affiliate Program – having your own affiliate program is sometimes overlooked as a form of linking, but it can be not only a good source of links but also of real revenue.

• Paid Linking – this is when you can pay for links in directories.

• Free Linking – getting links without having to provide return links. This is the area I want to focus on.

Some Internet marketing experts will tell you it’s not possible to get free links to your site, but there are ways that work. I will show you they are effective and, more importantly, that Google loves them.

Articles – Writing and submitting articles to other site owners is an effective way to get links to your site.

It’s not as effective as it once was but publishers are still looking for quality, unbiased information. All you have to do is make sure you have a resource box at the end of each article, which points to your site.

Writing tips & resources
Check out Jim Edwards’ book for help on how to write articles here:

http://www.turnwordsintotraffic.htm

It has great templates you can use: just fill in the blanks to format your message.

If your spelling or grammar is not the best, then a low-cost editing option might fit your needs:

http://www.yourbestwork.com/

With articles, I highly recommend writing an article and leaving it at least a day, then looking over it again before allowing it to become public.

It gives you a chance to remove any obvious spelling and grammar mistakes. Once you have your article finished, you need to find places to submit it. I have a list here:

http://www.digitalawol.com/esub.htm

But maybe you have a very specific market you want to reach?

Where can I post my articles?
Here is a step-by-step way of using Google to find good results in a specific market. Say you produce products or sell e-books on “weight”.

1. Search in Google, but think of the big picture. Instead of using “weight”, search for “health”, which brings up around 181,000,000 results.

2. Scroll to the bottom of the results page and click the “search in results” link

3. Enter “submit article” in the text box, with the quote marks, which will look for exact matches to the search term.

4. This brings up about 55,800 results, which should allow you to find at least 1,000 sites that would accept your articles. Once you have covered those, you can start again with different key words.
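
As a shortcut, you can often collapse steps 1 through 3 into a single search, since Google treats multiple terms as an AND and the quote marks still force an exact phrase match:

    health "submit article"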

Also, when you find a suitable site that accepts articles, create a folder within your bookmarks (your favourites list in Internet Explorer) and add it as a submit link, for ease of returning.

Getting free links
Other sites offer free link pages where you can place a link. A word of caution here: some sites use different programming languages in their design. They may list your site, but Google and other engines have trouble reading the resulting page, and the link may not actually go to you but to a redirect file which then forwards to your site.

The addresses of the links on the page may be something like this:

http://thesite.com/default.asp?m=gs&SiteID=39647 or

http://thesite.com/redirect?=1234

So they don’t actually count as a link to your site. Look at the other links on the site for a clue here. Check it out first before wasting your time placing a link.
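
One way to speed that check up is to look at the href targets in the page’s HTML directly. The rough sketch below (Python, standard library; the URL is a placeholder and the regex is deliberately crude) fetches a links page and flags hrefs that look like redirect scripts rather than direct links:

    import re
    import urllib.request

    def classify_links(page_url):
        # Hrefs containing a query string or the word 'redirect' usually
        # resolve through a script, so spiders may not credit them as links.
        html = urllib.request.urlopen(page_url).read().decode("utf-8", "replace")
        for href in re.findall(r'href="([^"]+)"', html, re.IGNORECASE):
            if "?" in href or "redirect" in href.lower():
                print("SUSPECT:", href)
            else:
                print("direct: ", href)

    classify_links("http://thesite.com/links.html")  # placeholder URL

These patterns are only heuristics (a question mark does not always mean a redirect), but they are a good prompt to look closer before you spend time getting listed.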

On a side note here, it may still bring you some traffic if the site has large amounts of visitors.

Getting links from discussion forums
Another way to get links is via discussion forums. Again, you have to check that your links will count towards your site. The forum should also be related to your site’s subject, so it is best if it has your key words in its name.

For example, I post regular help at Duncan Carver’s Marketing-Stratergy.org. When I do post, I always use a signature like this at the end of each message:

——
Paul J Easton
Internet Marketing and Promotion from DigitalAwol.com

Custom IE (Internet Explorer) Toolbars from Createtoolbar.com
——

And the most important words above are linked to my site (that’s everything before the “from”).

Because the forum is very active with new posts, the Google spider, which lists the pages in the search engine, returns very often (it loves fresh new content).

A small problem is that the pages created have a question mark in their URLs, like the examples above. All I need to do in this case is submit the page I posted on, to ensure that Google finds and indexes it.

Within a couple of days, Google listed my link back to my site. You can see this by going to Google and pasting:

link:http://www.createtoolbar.com (without the quotes) into the search box, then viewing the first link!

This is a great win-win. I helped someone else out and I got links to both my sites!

Hope this helps.

Author:
Paul Easton

Search Engine Marketing
Search engine marketing success comes from good research. Remember how homework from school often required some research on your part to complete? It is much the same scenario for search engine marketing: you need to apply yourself by researching in order to understand your competition and target audience. Your visitors need to relate to you, understand your message, and know what you want them to do.

Know Your Industry
You must spend time understanding the industry you are trying to make a profit from. If you sell widgets, know everything you can about widgets: where they come from, how they are typically sold, why people need them, etc. You can learn a great deal from your competitors. Research using industry communications such as online magazines, forums, newsletters and blogs. Read articles written by industry leaders and keep up on the latest news about your industry. The more you know, the better your basis for building an authoritative website.

What’s Your Point?
Ok, so you’ve got this product you want to sell online. What are you saying to your audience? Make sure you write in a clear, concise way so your message does not get lost in the process. Just because everyone in your company understands your message doesn’t mean the visitors to your site will. Have a few test scenarios set up; ask a few objective readers what they understand from your website and see just how much your message is getting across. You’d be surprised that what may be obvious to you is not necessarily obvious to your website visitors. If you are having trouble creating a clear message for your website, consider hiring a copywriter to convey what you want to say.

Know Your Target Audience
Who buys your product and why? Who needs the information you have on your website? Who would you like to have visit your website that isn’t already there? Who is visiting you? Are they professionals who understand your technical terms or visitors of varying levels who all need the same information from you? What do you offer that a certain market would want from you? Take the time to get a good look at what is out there and how your competitors are presenting their information to online visitors. Use your log statistics reports to track who comes to your website. Get familiar with the keyword terms they are using in the search engines to find you. Research the domains that most visit you. Find out why visitors are clicking away from certain pages before going deeper into the content of your website. Is it a lack of information? Too many choices to click on? Is the language used or instructions given easy to understand? Don’t give your visitors a reason to leave before they understand your message.

Know Your Competition
Take a good look at your top competition and see what they are offering online. Even looking at websites that are not direct competitors may give you an idea of what to offer your visitors. Think of it this way: someone put effort into creating those websites. Visit your competitors’ websites. Search in the major search engines for your most important keyword phrases and see if your competitors show up in the top thirty search engine results. Learn what you can and see if what they are doing is something you should be doing. It’s always good to know if your competitors are using SEO, Paid Inclusion, PPC, Link Building and other means to rank well.

Make Your Website Accessible
There’s nothing worse than muddling through a website looking for what you want and clicking so much you finally give up. Use easy navigation and make sure your information or products are easily accessible to your visitors. Create written text that is easily understood in order to get your message across readily. Give your visitors plenty of written information. There’s no such thing as “too much text” when it comes to search engine robots “understanding” your web pages. What’s good for the visitor is often good for the search engine robots.

Do The Work
Research is the cornerstone to your success. The more you know about your subject, the better you will be able to inform your visitors. By informing your visitors you build trust and interest in learning more about your website. Do the math – get searching!

Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing, a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.

02 Feb 2004

Time To Think Summer
Today, I’m sitting at my desk looking out the window at the snow. We have a blizzard warning up as I type this Marketing Monitor. We are to expect between 8 and 10 cm (3 to 4 inches) on top of the same amount of snow we’ve received daily for the past 3 or 4 days.

On top of all that, the temperature is below zero and has been for longer than normal. Yet despite the cold, I’m thinking summer.

Not because I hate the snow; in fact, I kinda like it. No, I’m thinking summer because some of our clients’ busiest seasons online are in the summer. So I thought I’d give some pointers here.

Remember that even if it’s snowing and cold where you are, summer is just a few short months away. Now is the time to start planning and budgeting for your summer search marketing campaign. Whether you are going to be running a PPC campaign or strictly organic SEO, you have to begin preparing.

If you are a seasonal business and rely on summer sales, you should start researching key phrases for the summer. Consider what you sell, and what your competitors sell.

Be sure to review your visitor logs from last year to see what brought people to your site. Of course, with the many changes in search over the past few months, they won’t all be relevant phrases, but perhaps some will help you come up with ideas for additional phrases you can market.

Assess your current website’s condition. Does it look dated? Perhaps it needs a face lift. A simple redesign of a site can bring a fresh look and perhaps drive more traffic. Sometimes a good indicator of what your site needs is your competitors’ sites. Those who aggressively market on the web have developed a formula for maximizing return on their web investment, so take some pointers from them.

If you do consider a redesign, please consider the implications. We wrote an article quite a few years ago on this subject, but the same rules still apply (you can read this article here). If you are working with an SEO company, above all consult with them before you do anything. They can help you minimize the impact of your new site. If you determine that a redesign is in order, now is a good time to do it, particularly if your key business occurs in the summer. If it’s done properly, the impact can be minimal and you will likely recover by spring.

Consider the current web environment. Again, many techniques and tactics used last year don’t work on today’s engines. Therefore you may want to revisit your optimization.

If you perform these simple tasks now, in preparation for your summer season, you can help your site reach its potential without negatively impacting your online visibility.

Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com

Search Engine Positioning Specialists

SEO vs PPC
Back in 1997 when I started getting web sites to the top of the search engines it wasn’t even called “Search Engine Optimization”. In fact, there wasn’t a name for what I did much less a multi-billion dollar industry. I realized back then that search engines were the only place to find what you were looking for on the web. They were a phone book of sorts with about 4 billion listings that you could sort through in less than 1 second with the push of a search button.

Now the search engine game is very different yet very much the same. You have the media bombarding you with pay per click, sponsored listings, featured listings and CPA, and you don’t hear very much about natural search engine optimization anymore. You don’t know if there even is such a thing, because most companies that perform search engine optimization of web sites are small and can’t compete with the large advertising budgets of the major search engines like Google and Overture. But the deeper you dig, the more you realize the algorithms haven’t changed that much, and getting to the top can be made simple. This leads us to two questions.

1. Does search engine optimization work anymore?
2. What kind of traffic can I expect to see from both methods?

We’ll start with the first question.

Does search engine optimization work anymore?
The answer is a resounding ABSOLUTELY! Natural search engine placement and optimization is not dead at all. In fact, the industry is growing at a tremendous rate. The competition over arguably the most competitive phrase on Google, “search engine optimization”, has drastically increased. As of the writing of this article there are over 1,620,000 results when you type in search engine optimization, compared to only 560,000 two months ago.

Okay you say, that makes sense but give me more proof.

GlobalPromoter.com does no advertising other than natural search engine optimization and natural search engine placement in the major search engines. We spend $0 on pay per click campaigns or any other method of advertising, yet our traffic rivals that of our competitors and we average an incredible number of account signups on a daily basis. Our visitor-to-purchaser ratio is upwards of 4.5%, which is incredible in any industry. Why do we have such a high purchase/click ratio? Because people are looking for us; we’re not looking for them. When a user goes to Google, types in “search engine optimization tools” and finds us on the first page, they know we know what we’re doing and are compelled to click, if only out of awe.

The secret of the search engines is the ability to be found, not the other way around. That’s why the natural search engine listings in Google outperform the AdWords listings. Users know that a site listed in the Sponsored matches section, or on the right side of the results, means a business or individual is paying every time someone clicks on their site. That equates to advertising, which is no different than radio, television, newspapers or magazines. That’s a company “pushing” its product onto the consumer.

But when a user finds a site in the web matches section, they have more confidence. The site didn’t pay to get there; it is there because Google’s or Yahoo’s or AOL’s algorithm said it is the most appropriate for the search, based on the entire site’s content. This is “pull” demand, meaning the user is looking for us instead of us looking for them. If you can get on the pull side of advertising, then you’ll experience much higher purchase/click rates from every visitor to your web site.

On to question 2.

What kind of traffic can I expect to see from both methods?
This question needs to be answered in two parts. First let’s look at the ppc method.

PPC search engine listings will give you as much traffic as there is demand for a given keyword or keyword phrase. Meaning, if there are 500,000 searches a month and your listing is appealing, you can expect to receive approximately 2 – 5% of those searches. Let’s say you get an incredibly high click-through rate of 5%. That means you have .05 * 500,000 = 25,000 visitors at your disposal. But if a keyword has 500,000 searches in a month, then it’s fairly competitive and it could easily cost $1.00 per click to be in the top 3 positions for that keyword. So if you are paying $1.00 per visitor and you had 25,000 visitors, then you paid $25,000 for the traffic that one keyword generated for your site.
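
To make the arithmetic explicit, here is the same calculation as a small Python sketch, using the illustrative numbers from the paragraph above:

    searches_per_month = 500_000
    click_through_rate = 0.05   # the unusually high 5% assumed above
    cost_per_click = 1.00       # dollars, for a top-3 position

    visitors = searches_per_month * click_through_rate
    monthly_cost = visitors * cost_per_click

    print(f"{visitors:,.0f} visitors at ${monthly_cost:,.2f} per month")
    # prints: 25,000 visitors at $25,000.00 per month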

I think you can see how risky and expensive PPC can be. Unless you know you can convert visitors into sales and your profit margin on the items you’re selling is incredibly high, then Caveat Emptor (buyer beware).

On the flip side, when your site shows up in the natural rankings you don’t pay a single cent for any of the traffic it generates. This means you have more money for developing your site, tweaking marketing tactics, making your product better, etc…

As for the old argument that you won’t get as much traffic from natural placements as from PPC listings, that’s a myth. Several of our customers receive over 50,000 visitors a month on average from natural placements in the major search engines. In fact, when we optimize a client’s web site, one of their goals is to decrease the amount of money they are currently spending on PPC advertising. After the completion of the optimization plan, 75% of our clients completely abandon their PPC programs. This leads us to a general comparison of PPC vs. natural rankings.

Advantages of Search Engine Optimization
1. An up-front fixed cost vs. fluctuating costs that can skyrocket with PPC advertising.

2. Long-term listings and rankings with natural placement vs. showing up only as long as your bank account has money.

3. Natural rankings have higher click-through ratios than PPC listings, because natural rankings are pull demand vs. push demand.

In Conclusion
I’m going to close this article with an analogy. Most of you have been camping before and remember at least one cold night when you couldn’t get a fire started. So you went and got some lighter fluid and squirted it on the dry oak or whatever wood you used. Then you threw a match into the fire and began to warm up next to the fire.

The lighter fluid is akin to PPC advertising. When the lighter fluid is squirted on the fire, the flames shoot high and bright and then vanish. Just like PPC advertising, it’s short term: as soon as the money is gone, so is your exposure. Natural rankings, on the other hand, are like the solid oak used in the fire. The oak will burn for hours and hours and keep you warm much longer than lighter fluid alone. Like the oak, natural search engine optimization campaigns last in excess of 6 months instead of one day. And if you learn the secrets of good web site optimization, you can stoke the fire and make it last even longer at no added cost. Of course it takes a little longer to get the oak branches to light up, but once you get them going they will last for a long time.

Many people see the solution to their search engine marketing campaign in pay per click because it is easily set up and effective almost immediately. However, those who understand the principle of laying a solid foundation and building upon it can appreciate the long-term benefits of natural search engine placement. It may take longer to get the same results, but it will cost much less in the long run.

Author Bio:
Jason Dowdell is the founder and CEO of http://www.GlobalPromoter.com, a search engine optimization and marketing firm specializing in educating and empowering customer websites. Jason is also the founder of TurboPromoter.com, a web-based seo/sem project management suite comprised of professional seo tools, in-depth tutorials and an integrated help system.

28 Jan 2004

Site Rankings Gone?
As you may already know, a good part of my job is researching how the organic search engines work. Trying to figure out how the algorithms work in ranking pages is crucial to our day to day operations. Occasionally, we come across sites which seem to defy explanation – they have proper optimization, good internal linking and so on, yet seem to be getting penalized by engines such as Google. Today, I’m going to explain how we began researching a particular problem, in hopes that if it happens to you, you will know what to do.

The first indication that there was a problem with a site came when the PageRank in the Google toolbar disappeared, seemingly overnight. This happened soon after a new URL was put up on an existing site. We assumed, as is usual, that Google hadn’t been able to associate the new URL with the “old” content. That is, Google was still expecting to see the old URL associated with the content. We advised the client that it would likely take a few weeks to re-associate the new URL with the site.

When sufficient time passed without progress, we had to dig deeper to see what the issues were. As I mentioned above, everything looked ok. Optimization was in place, and there wasn’t any over optimization happening. Internal linking was good, and there was good use of a properly constructed site map. So we had to dig deeper – going beyond on-the-page factors to see if we could figure out what else was causing the problem.

The first thing I looked for…
The first thing I looked for was the existence of a robots.txt file. In many cases, an improperly coded robots file will exclude some, or all, search engine spiders from indexing. In this case a robots.txt was not being used, so I ruled this out.
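
For readers who want to check their own sites: robots.txt syntax is simple, and a single character decides everything. The first example below allows every robot to index the whole site; the second shuts every compliant robot out of the entire site:

    # robots.txt allowing all robots to index everything
    User-agent: *
    Disallow:

    # robots.txt excluding ALL robots from the ENTIRE site
    User-agent: *
    Disallow: /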

I then checked to see if there were robots Meta tags in the head of the HTML. These tags do the same job as the robots.txt file; that is, they tell the spiders which pages they can and cannot index, but on a page-by-page basis rather than site-wide as in the robots.txt. Again, an improperly coded robots Meta tag can exclude part or all of a site from getting indexed. Again, this was not the case: although this site does use a Meta robots tag, it was coded properly. In fact, the same tag existed on the “old” site and wasn’t an issue then.
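
For reference, a robots Meta tag belongs in the head section of a page and looks like one of these (the first allows indexing, the second excludes the page):

    <meta name="robots" content="index,follow">
    <meta name="robots" content="noindex,nofollow">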

So I then checked the log files to see if the spiders had been visiting the site, and they had been there on a regular basis. As recently as a few days ago, as a matter of fact.

Seeing that everything was coded properly, and that spiders had been visiting the site, piqued my interest. How is it that spiders are able to see the site (as indicated by their visits), yet the site is not showing up in the index and still has a PageRank of 0, months after the change?

Some more digging…
So I did some more digging. I checked Google for the old URL. Upon viewing the cached version of the old URL, a theory began to form.

The cached pages are actually the current content of the new site. In other words, Google was somehow associating the old URL with the new site. So I did some more checking. I did a whois lookup and found that the old URL was still registered. So I decided to ping it, and found that it resolved to a new IP address; yet when I tried to connect to it using my web browser, it came up as a 404 (page not found) error.
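
You can reproduce that kind of check yourself. Here is a rough Python sketch (standard library only; the domain names are hypothetical stand-ins for the old and new URLs) that resolves a hostname to an IP address and reports the HTTP status of its homepage, much like the ping-and-browser test described above:

    import socket
    import urllib.request
    from urllib.error import HTTPError

    def diagnose(hostname):
        # Resolve the hostname, then report the homepage's HTTP status.
        ip = socket.gethostbyname(hostname)
        print(hostname, "resolves to", ip)
        try:
            response = urllib.request.urlopen("http://" + hostname + "/")
            print("HTTP status:", response.getcode())
        except HTTPError as err:
            print("HTTP status:", err.code)  # e.g. 404 for the 'old' URL

    diagnose("old-domain.example")  # hypothetical old URL
    diagnose("new-domain.example")  # hypothetical new URL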

I pinged the new site and the IP address is different, but it is the IP address that the site had when it had the “old” URL. This still doesn’t explain why the new site has no PageRank or indexed pages and the “old” URL is showing pages from the new site, but it does give me some clues.

We already know that in order to save time most search engines do not perform a DNS query when they visit a site. They tend to try and connect directly to the site via IP. If they don’t get a site via IP they then perform a DNS query to find the IP of the site.

In the case of this site, Google hasn’t needed to perform a DNS query as, from their point of view, the “old” site still exists. They can connect via IP to the site and are associating the “new” site to the “old” URL.

This also explains why the “new” site is showing a PageRank of 0 with no pages indexed: Google has also resolved the “new” site to the same IP, which it thinks belongs to the “old” URL. Once it visits the new site and realizes that the new and old sites are identical, it gives preference to the “old” site, because it pre-exists the new site.

Confused yet?
Let me put it in other terms. Since the “old” site had been around longer, it had built up a reputation on the web. When the client replaced the URL, they wiped out that reputation. But no one told Google that the old site was gone. Had Google performed a DNS query, it would have found that the old site had in fact moved; but since it found a site with the same content at the same IP, it assumed that this was the site with the reputation.

Along comes a new site with the exact same content and no reputation, and of course the first thing Google assumes is that the site owner is trying to spam the engine, so it penalizes the new site. Hence the lack of indexed pages and the PageRank of 0.

To resolve this issue we will try a variety of things. The first will be either a 301 redirect (approved by Google to help spiders understand that a site has moved) or another on-the-page redirect, such as a Meta refresh or a hyperlink on the “old” URL. These different efforts help signal to Google that the “old” site has been replaced by the “new” site.
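
To illustrate the first option: on an Apache server, a site-wide 301 can be a single line in the old domain’s .htaccess file (the target domain below is hypothetical):

    Redirect 301 / http://www.new-domain.example/

This tells browsers and spiders alike that everything at the old address has moved permanently.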

If this doesn’t work, our next step will be to request that the old site be removed from the index. This is a last resort, as we would rather the engine figure it out on its own. If we find that Google still can’t figure out that there is a new site, we will definitely request the URL removal.

In addition, to try and help speed things along, we need to ensure that all other links, such as ODP directory listings, point to the correct URL and not the old domain. This will reinforce to the search engines that the “old” site no longer exists and that the “new” site is actually a valid site that isn’t spamming the engines.

Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists

Google & Inktomi Optimization
The search engine environment continues to evolve rapidly, easily outpacing the ability of consumers and SEO practitioners to adapt to the new landscape. With the ascension of Inktomi to the level of importance that until recently was held solely by Google, SEO practitioners need to rethink several strategies, tactics and perhaps even the ethics of technique. Assuming this debate will unfold over the coming months, how does an “ethical SEO firm” optimize websites for two remarkably different search engines without falling back on old-fashioned spammy tactics like leader pages or portal sites? Recently, another SEO unrelated to StepForth told me that he was starting to re-optimize his websites to meet what he thought were Inktomi’s standards, as a way of beating his competition to what looks to be the new main driver. That shouldn’t be necessary if you are careful and follow all the “best practices” developed over the years.

The answer to our puzzle is less than obvious, but it lies in the typical behaviors of the two search tools. While there are a number of similarities between the two engines, most notably in the behaviors of their spiders, there are also significant differences in the way each engine treats websites. For the most part, Google and Inktomi place the greatest weight on radically different site elements when determining eventual site placement. For Google, strong and relevant link popularity is still one of the most important factors in achieving strong placements. For Inktomi, titles, meta tags and text are the most important factors in getting good rankings. Both engines consider the number and arrangement of keywords, incoming links, and the anchor text used in links (though Google puts far more weight on anchor text than Inktomi tends to). That seems to be where the similarities end, and the point where SEO tactics need revision. Once Inktomi is adopted as Yahoo’s main listing provider, both Google and Inktomi will drive relatively similar levels of search engine traffic. Each will be as important as the other, with the caveat that Inktomi powers two of the big three while Google will only power itself.

2004 – The Year of the Spider-Monkey
The first important factor to think about is how each spider works.

Entry to Inktomi Does Not Mean Full-Indexing
Getting your site spidered by Inktomi’s bot “Slurp” is essential. Like “Google-bot”, “Slurp” will follow every link it comes across, reading and recording all information. A major difference between Google and Inktomi is that, when Google spiders a new site, there is a good chance of getting placements for an internal page without paying for that specific page to appear in the index. As far as we can tell, that inexpensive rule of thumb does not apply to Inktomi. While it is entirely possible to get entire sites indexed by Inktomi, we have yet to determine if Inktomi will allow all pages within a site to achieve placements without paying for those pages to appear in the search engine results pages (SERPs). Remember, Inktomi is a paid-inclusion service which charges webmasters an admission fee based on the number of pages in a site they wish to have spidered. From the information we have gathered, Slurp will follow each link in a site and, if provided a clear path, will spider every page in the site; but pages that are paid for during the submission will be spidered far more frequently and will appear in the indexes months before non-paid pages. We noted this when examining how many pages Inktomi lists from newer clients versus how many from older clients. We have noticed that the older the site, the more pages appear in Inktomi’s database and on SERPs of search engines using the Inktomi database (this assumes the webmaster only paid for inclusion of their INDEX page). Based on Inktomi’s pricing, an average-sized site of 50 pages could cost up to $1289 per year to have each page added to the paid-inclusion database, so it is safe to assume that most small-business webmasters won’t want to pay that much.

Google’s Gonna Get You
Google-bot is like the Borg in Star Trek. If you exist on the web and have a link coming to your site from another site in Google’s index, Google-bot will find you and assimilate all your information. As the best known and most prolific spider on the web, Google-bot and its cousin Fresh-bot visit sites extremely frequently. This means that most websites with effective links will get into Google’s database without needing to manually submit the site. As Google currently does not have a paid-inclusion model, every page in a site can be expected to appear somewhere on Google produced SERPs. By providing a way of finding each page in the site (effective internal links), website designers should see their sites appearing in Google’s database within two months of publishing.

We Now Serve Two Masters; Google and Inktomi
OK, that said, how do we optimize for both without risking placements at one over the other? The basic answer is to give each of them what they want. For almost a year, much of the SEO industry focused on linking strategies in order to please Google’s PageRank. Such heavy reliance on linking is likely one of the reasons Google re-ordered its algorithm in November. Relevant incoming links are still extremely important but can no longer be considered the “clincher” strategy for our clients. Getting back to the basics of site optimization and remembering the lessons learned over the past 12 months should produce Top 10 placements. SEOs and webmasters should spend a lot of time thinking about titles, tags and text, as well as about linking strategies (both internal and external). Keyword arrangement and densities are back on the table and need to be examined by SEOs and their clients as the new backbone of effective site optimization. While the addition of a text-based sitemap has always been considered an SEO best practice, it should now be considered an essential practice. The same goes for unique titles and tags on each page of a site. Another essential practice SEOs will have to start harping on is to work only with sites that have unique, original content. I am willing to bet that within 12 months, Inktomi introduces a rule against duplicate content as a means of controlling both the SEO industry and the affiliate marketing industry. Sites with duplicate content are either mirrors, portals or affiliates, none of which should be necessary for the hard-working SEO. While there are exceptional circumstances where duplicate content is needed, more often than not dupe-content is a waste of bandwidth and will impede an SEO campaign more than it will help.

One Last Tip
The last tip for this article is: don’t be afraid to pass higher costs on to your clients, because if a client wants those placements soon, paid inclusion of internal pages will be expected. When one really examines the costs of paid inclusion, it is not terribly different from other advertising costs, with one major exception: most paid advertising is regionally based (or is prohibitively expensive for smaller businesses). Search engine advertising is, by nature, international exposure, and that is worth paying for.

Author Bio:
Jim Hedger is the SEO Manager of StepForth Search Engine Placement Inc. Based in Victoria, BC, Canada, StepForth is the result of the consolidation of BraveArt Website Management, Promotion Experts, and Phoenix Creative Works, and has provided professional search engine placement and management services since 1997. http://www.stepforth.com/ Tel – 250-385-1190 Toll Free – 877-385-5526 Fax – 250-385-1198

Introduction
Here’s something that is fast to read and does the job: the 10 do’s and don’ts of SEO. Five techniques you should always use to push your site to the top of the search engine results pages (SERPs) and keep it there, and five things you should always avoid doing, to protect your site from a possible penalty or from being banned altogether.

List of the Five Do’s
Do Number One:

Take all the time it takes to do careful research of all your keywords and key phrases for your site, based on the products or services you are trying to sell. Proper keyword research can only be done using Wordtracker, the industry standard when it comes to professional keyword research. Trying to optimize a site without knowing your real keywords is like driving a car at night with no headlights! Some will tell you they use Overture’s free suggestion tool. Although that tool can help you to a limited degree, you should always use Wordtracker for the best results.

Do Number Two:
Make sure you write a short, descriptive title tag saying what each page of your site is all about, and make sure they are all different. Search engines use the information contained in the title tag, compare it to the text on that page and rank the page accordingly. The short description in your title tags will also help your users. The idea here is to keep it as short and descriptive as possible. If you are creating a new page about ‘durable red widgets’, then call that page ‘durable red widgets’. Avoid the temptation of creating title tags longer than 30 characters, since they might have a dilution effect on your rankings in certain search engines.
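
To make that concrete, the entire title tag for such a page could be as short as this:

    <title>Durable Red Widgets</title>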

Do Number Three:
Write the main text on your page using the same keywords contained in your title tag. If you are working on a page titled ‘New houses in Baltimore’, then be sure those important keywords are repeated at least two or three times in the main body of your text, without sounding repetitive. A well-designed and carefully written page will ‘read right’ and will not sound like you are repeating yourself. Search engines will rank your page higher if they see a keyword repeated a few times on a page, and it will help them ‘build a theme’ throughout your site.

Do Number Four:
Make a complete sitemap of your site, which will help both your users and the search engines at the same time. Having a well-designed sitemap will ensure that each page of your site gets properly indexed by Google and the other search engines. It is important to call that file sitemap.html and not site-map.html or other variations. Additionally, make sure that your sitemap.html file is directly accessible from your homepage and that the link to it uses link text. Link text is always a lot better than a picture or graphic, since search engines won’t be able to read them.
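
In practice, that means a plain text link on the homepage rather than a graphic button, something like:

    <a href="sitemap.html">Site Map</a>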

Do Number Five:
To increase your link popularity, participate in a link exchange program. Even in the aftermath of ‘Update Florida’, link popularity in Google is more important than ever. All else being equal, the higher your PageRank, the higher your rankings. Increasing the number of links that point to your site will help you in the results pages. You should only link to sites that are in the same field as your site, and stay away from bad ‘neighbourhoods’ and from so-called link farms or ‘free-for-all’ pages.

List of the Five Don’ts
Don’t Number One:

Don’t ever use cloaking mechanisms or software that needs to know the IP address of a search engine spider, or anything similar. Cloaking is based on the idea of serving a unique, optimized page to the search engines while serving a completely different page to the ‘real’ users. Today, most major search engines prohibit the use of such techniques, and you risk having your site penalized or banned altogether. Always play it safe and the search engines will treat you right.

Don’t Number Two:
Submit your website once to the search engines and then wait for at least 6 weeks. Don’t use software that automatically resubmits your sites on a weekly or monthly basis, since it might penalize you in the long run. Today’s modern search engines use automated crawlers or spiders to regularly index your site, so you don’t need to submit more than once. In the case of DMOZ (the Open Directory Project, or ODP), you should always wait 8 to 12 weeks, since DMOZ relies only on volunteers to review and index your site. If your site still isn’t listed after 12 weeks, write them a friendly email explaining your problem and that should do it in most cases.

Don’t Number Three:
Don’t entrust your site to people who will submit it to ‘thousands of engines’. There aren’t that many search engines in the first place. There are only a handful of serious search properties you should submit to, and they are used by 99% of the people looking for information. Don’t waste your time or your money; only work with the serious search engines everybody uses.

Don’t Number Four:
Don’t develop your site using Flash technology or similar techniques that the search engines cannot read. As far as your rankings in the search engines go, the best way to develop a site is using standard technology, such as HTML. Text written in HTML is proven technology that all search engines have long recognized and approved, since the beginning of the Internet. Using the right technology will always help your site attain a good position in the SERPs.

Don’t Number Five:
Don’t deal with so-called SEO experts who promise you Number One or first-page rankings in some engines. There is no such thing as guaranteed number-one placement. Ask for referrals, and don’t be afraid to ask exactly what techniques your would-be SEO firm uses to achieve good positioning for your site. Additionally, ask them to put everything in writing before you sign on the dotted line and before you give them any of your hard-earned cash.

Author:
Serge Thibodeau of Rank For Sales

Google Update Florida Solutions and Fixes
Over a month and a half ago I wrote an article about Google’s latest update, dubbed Update Florida. I told our subscribers how they might see their rankings drop in Google for prominent keywords for no apparent reason. Today I’m going to tell you what you need to do in order to fix your site and help restore your rankings.

First, let’s recap the main effects of the Google Update Florida as well as my original predictions of the root causes. Then I’ll explain in more detail the root causes as I see them.

Here’s the skinny on the latest Google Dance (from: 11/18/2003)
• Called “Update Florida” because, similar to the last Presidential election, it was full of controversy and left tons of upset people.

• Many people who had worked hard to build up solid and reputable backlinks appear to have been punished for no obvious reason, even though their inbound links contain their main keyword phrase and the linking pages have content related to their site in their titles.

• Rankings have dropped for main targeted keywords but have not dropped for more specific keyword phrases. The drops affect not just search engine optimization firms but web sites of every kind and variety.

• Results appear to have been repopulated from data that looks to be about 6 months old.

Original Google Update Florida Predictions and Explanations
• Probable: Google is either testing an updated algorithm or they are not factoring in the backlinks created in the past 6 months.

• Not Probable: SEO techniques that could be considered SPAM to Google are being filtered out in an effort to provide more relevant results. I’m doubtful on this one since the results are from about 6 months ago and many of them are not very relevant.

• Probable: Google is doing yet another deep crawl (we haven’t seen one in about 6 months), and they want to build a fresher, more relevant search base from sites currently out there that the Fresh Crawl GoogleBots may not be picking up.

• Not Probable: Many sites that have dropped off will never come back to the top because they have been penalized. As far as we know Google doesn’t penalize sites, they simply list them or delist them. Just like God, there is no gray area with Google, it’s either black or white. If you’re considered a black hat then your site will not show up anywhere in Google’s index, not even for the most descriptive keyword searches.

And the Verdict is In!

Let’s start from the beginning with the first prediction and move down from there.

Prediction no.1
Google is either testing an updated algorithm or they are not factoring in the backlinks created in the past 6 months.

Verdict: I was right and wrong on this one.

I was wrong about Google not factoring in backlinks created in the past 6 months. I was right about them testing new filters. Google has not dumped their old algorithm by any means. Instead they have created new filters to penalize sites whose PageRank was artificially inflated due to less-than-reputable linking practices. What are less-than-reputable linking practices, you ask? These include…

• The exact same phrase, or a high percentage of the exact same phrase, in the text of backlinks (text links) and in the alt text (image links). This catches folks caught up in link farms, because they don’t rotate the text used to link to their sites.
Solution: Create a ‘link to us’ page that offers at least 5 different links using different keyword phrases. This way you spread the concentration of link text across the general theme of your site.

• Links from sites that are of a different subject matter than the site being linked to; that is, when there is no mention of your subject matter in the body, title, keywords or description of the page linking to your site.

• Links from sites that don’t show up in search results for the keywords used in the links. It is widely accepted that Google has implemented a LocalRank technology that determines whether or not your backlinks are from sites that rank for keywords similar to those used in the backlink. If so, then you’re fine; if not, then you’re going to have problems getting to the top even if you have a high PageRank.

Prediction no.2
Not Probable: SEO techniques that could be considered SPAM to Google are being filtered out in an effort to provide more relevant results. I’m doubtful on this one since the results are from about 6 months ago and many of them are not very relevant.

Verdict: I was right on this one

Google is not penalizing sites for being overly optimized, or for having keywords in the title, description and keywords of the page, or in the heading tags or anywhere else. The penalization only comes into effect when backlinks are overly optimized and not from other industry resource sites. The results appear to be old only because the sites that ranked well earlier weren’t part of the linking scams, and they returned to the top for that very reason.

Prediction no.3
Probable: Google is doing yet another deep crawl (we haven’t seen one in about 6 months), and they want to build a fresher, more relevant search base from sites currently out there that the Fresh Crawl GoogleBots may not be picking up.

Verdict: I was right again.

Yes, Google did another deep crawl, refreshed the backlinks of sites and updated their PageRank about 2 weeks later. But this deep crawl is done on an almost continual basis now and doesn’t affect the rankings like many thought it did. Rankings were affected due to the implementation of new filters, and that’s the only reason why.

Prediction no.4
Not Probable: Many sites that have dropped off will never come back to the top because they have been penalized. As far as we know Google doesn’t penalize sites, they simply list them or delist them. Just like God, there is no gray area with Google, it’s either black or white. If you’re considered a black hat then your site will not show up anywhere in Google’s index, not even for the most descriptive keyword searches.

Verdict: I was wrong.

Well, I guess nobody’s perfect. Google is not necessarily “penalizing” sites, but it is “filtering” sites out based on whether or not certain filters are triggered by the site for a particular keyword search. While a site may show up for a very detailed search or an off-topic search, it may not rank well for a highly competitive (typically commercial) term if it has tripped a filter for that term.

In order to understand my predictions and conclusions correctly I need to explain a few more things that have bubbled up from the Google Update Florida and how they affect search results that are delivered today on Google.

Not only did Google implement filters, but they also implemented new features which make it seem like the old algorithm was thrown out and they started over from scratch. While many will tell you Google isn’t as relevant as it once was, I will contend that they’re changing to remain the most relevant search engine out there. Here’s a rundown of the new features.

• Google Implemented Stemming Technologies: Stemming means taking a root word and determining all variations of that word. Now a search for “game sites” may return sites optimized for “gaming sites”, “gamer sites” or “gamed sites”. This increases the number of results delivered for highly targeted keywords and increases the dependency on solid natural optimization.

You can disable stemming by adding a “+” sign in front of each word you want to disable stemming for; for example, searching for +game sites forces Google to match “game” exactly.

• Google Implemented Plural Searches: This means that a search for “knitting needle” and a search for “knitting needles” will return the same results, increasing the competition again, since more results are returned for all keyword searches.

• Implementation of LocalRank: LocalRank is a technology that looks at the first (x) number of search results. It can be any number the search engine specifies, but is typically around 100. After it looks at those results, it determines whether or not any of those sites have linked to you, and then ranks sites based on how many “popular” sites for a specific search term have linked to you. This is why it’s critical to have someone help you with your link popularity campaign who understands the intricacies of linking and can provide advice that will not hurt you in the long run. Short-term link popularity plans built on unrelated sites will do nothing to help your cause. For more information on LocalRank, read this forum at WebmasterWorld.

• Internal Links Discounted: Links from within your site to particular pages in your site do not count as much as they once did. While this doesn’t mean you need to change your site link structure it is worth noting.

• Artificial PageRank Deflated: Sites that have more than one link from a particular site are experiencing the law of diminishing returns. No longer are 100 links from a single site weighted as 100 individual links. This just makes sense. I mean, if site A has 100 links from a single site and site B has 20 links from 20 individual sites, I can guarantee you that the 20 links to site B will count more than the 100 links to site A.

In Conclusion
Well, I’ll conclude my ramblings with a recap. Google has made many changes and has implemented several filters as well as new algorithm features to ensure it has the most reliable search result set on the internet. In order to climb your way back to the top, you need to understand LocalRank, PageRank changes, proper link reputation and link popularity, and be aware of the anti-spam measures that need to be taken. It all boils down to common sense: make your site user-friendly and easy to navigate, encourage others to link to you by giving them some form of incentive, don’t use the exact same phrase in your backlinks, use good titles that explain what each page is about, and keep it simple. By following these rules you can weather any search engine algorithm change and remain at the top with a lot less stress.

Author Bio:
Jason Dowdell is the founder and CEO of http://www.GlobalPromoter.com, a search engine optimization and marketing firm specializing in educating and empowering customer websites. Jason is also the founder of TurboPromoter.com, a web-based seo/sem project management suite comprised of professional seo tools, in-depth tutorials and an integrated help system.

Pay Per Click Search Engine Advertising
So you’ve decided to give pay-per-click search engine advertising a try? That’s a good move, because PPC advertising is one of the most affordable marketing options available to small businesses.

But like all advertising, you need a good strategy to get your money’s worth. I find that too many people running their first PPC campaign make mistakes that can quickly turn expensive.

In this article I’ll offer some basic advice about bidding and keyword selection to help you run a smart PPC campaign.

Just What Can You Pay Per Click?
The most important thing to know before starting your PPC campaign is how much you can afford to bid on a keyword. High-traffic keywords on Overture and Google – the leading PPC providers – can cost $5.00 per click for a top ranking. Can you afford that?

Consider this: the typical e-commerce site converts about 2% of its visitors. That means you need to bring 50 visitors to your site before you make a sale. At $5.00 per click, you’ll spend $250 to generate one sale. Ouch!
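
You can run that arithmetic backwards to find your own bidding ceiling. Here is a small Python sketch (the conversion rate and profit figure are placeholders to replace with your own numbers):

    conversion_rate = 0.02    # typical e-commerce: 2% of visitors buy
    profit_per_sale = 40.00   # placeholder: your profit on one sale

    visitors_per_sale = 1 / conversion_rate               # 50 visitors
    break_even_bid = profit_per_sale / visitors_per_sale  # dollars per click

    print(f"You need {visitors_per_sale:.0f} visitors to make one sale.")
    print(f"Any bid above ${break_even_bid:.2f} per click loses money.")
    # prints: Any bid above $0.80 per click loses money.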

Keep in mind that you usually want one of the top 3 listings for a keyword. These are the listings distributed to most of the PPC engine’s partner sites. For example, a #3 ranking on Overture will place your listing on Yahoo, MSN and Alta Vista. A #7 listing won’t appear on any of these search engines.

So you’re caught in a catch-22: you want a high PPC ranking to get traffic, but the top rankings for popular words are too expensive.

Cast Your Net Broadly
The solution is to cast your net broadly, targeting a large number of less popular keywords. These words are usually less expensive and, taken as a group, can give you a considerable volume of traffic.

For example, suppose you run a ski resort. The keyword ‘ski vacation’ currently receives over 60,000 searches per month. That’s great, but it costs $5.01 per click for the top ranking. Instead of competing head-to-head for that keyword, you would be better off choosing ‘ski trip’ (4,771 monthly searches at $0.57 per click for the top spot) and ‘ski lodge’ (4,244 monthly searches at $0.55 per click for the top spot).

By targeting a number of these less popular keywords, you get nearly the same traffic as if you had targeted ‘ski vacation’, but at a fraction of the cost.

Note that this is the opposite of the strategy you typically use in a search engine optimization campaign. In an SEO campaign, you focus on perhaps a half dozen high-traffic words. That’s because it takes a lot of hard work to earn a top listing.

In contrast, it’s relatively easy to create a new PPC listing. Since you don’t pay unless someone clicks on your listing, there’s no added cost for doing this, so targeting a large number of keywords makes sense.

The phrase ‘ski chalet’ only receives 930 searches per month. So what? At $0.52 per click, it’s worth adding to your PPC campaign.

It’s common for PPC advertisers to target dozens of keywords. I’ve managed PPC campaigns for clients using over 1,000 words.

Smart PPC Management
The downside of this approach is that it can be hard to manage such a large number of keywords. You’ll want to track your listings, making sure your rankings haven’t dropped. Plus, you’ll want to know which keywords are sending you traffic and converting visitors into customers.

Many businesses also use PPC bid management software like Bid Rank or GoToast to manage their listings. These software packages track your listings and can adjust your bid if you drop in the rankings.

Many companies also outsource the management of their PPC campaigns. Most SEOs now offer PPC management services. These options cost money, but they usually pay for themselves by running your campaigns more efficiently.

Keep in mind that you don’t have to use a software package or a consultant to start your PPC campaign. But you do need to know what sort of cost per click you can afford. If you decide that $2.00 per click is your maximum bid, then stick with it. Don’t get into an emotional bidding war if you lose a top ranking. It’s much smarter to look for new and cheaper keywords. Cast your net broadly and you’ll save money.

Author Bio:
Christine Churchill is President of KeyRelevance.com a full service search engine marketing firm. She is also on the Board of Directors of the Search Engine Marketing Professional Organization (SEMPO) and serves as co-chair of the SEMPO Technical Committee.