
John Wieber

Partner

John has 13+ years of experience in web development, ecommerce, and internet marketing. He has been actively involved in the internet marketing efforts of more than 100 websites in some of the most competitive industries online. John comes up with truly off-the-wall ideas, and has pioneered some completely unique marketing methods and campaigns. John is active in every single aspect of the work we do: link sourcing, website analytics, conversion optimization, PPC management, CMS, CRM, database management, hosting solutions, site optimization, social media, local search, and content marketing. He is our conductor and idea man, and has a reputation for being a brutally honest straight shooter. He has been in the trenches directly and understands what motivates a site owner. His driven personality works to the client's benefit, as his passion fuels his desire for your success. His aggressive approach is motivating, his intuition for internet marketing is finely tuned, and his knack for link building is unparalleled. He has been published in books and numerous international trade magazines, been featured in the Wall Street Journal, sat on boards of trade associations, and served as a spokesperson for Fortune 100 corporations including MSN, Microsoft, eBay and Amazon at several internet marketing industry events. John is addicted to Peet's coffee, loves travel and golf, and is a workaholic except on Sundays during Steelers games.

Blog Posts by John

Making Sure Your Site Is Crawler-friendly
I couldn’t find any “meaty” questions for this week’s newsletter, so I thought I’d just talk generally about what makes a site “crawler-friendly.” I used to call this “search-engine-friendly” but my friend Mike Grehan convinced me that the more accurate phrase was “crawler-friendly” because it’s the search engine crawlers (or spiders) that your site needs to buddy-up to, as opposed to the search engine itself.

So, how do you make sure your site is on good terms with the crawlers? Well, it always helps to first buy it a few drinks. But, since that’s not usually possible, your next-best bet is to design your site with the crawlers in mind. The search engine spiders are primitive beings, and although they are constantly being improved, for best results you should always choose simplicity over complexity.

What this means is that cutting-edge designs are generally not the best way to go. Interestingly enough, your site visitors may agree. Even though we SEO geeks have cable modems and DSL, our site visitors probably don’t. Slow-loading Flash sites, for example, may stop the search engine spiders right in their tracks. There’s nothing of interest on the average Flash site to a search engine spider anyway, so they’re certainly not going to wait for it to download!

Besides Flash, there are a number of “helpful” features being thrown into site designs these days that can sadly be the kiss of death to its overall spiderability. For instance, sites that require a session ID to track visitors may never receive any visitors to begin with — at least not from the search engines. If your site or shopping cart requires session IDs, check Google right now to see if your pages are indexed. (Do an allinurl:yourdomainhere.com in Google’s search box and see what shows up.) If you see that Google only has one or two pages indexed, your session IDs may be the culprit. There are workarounds for this, as I have seen many sites that use session IDs get indexed; however, the average programmer/designer may not even know this is a problem.
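
For illustration, a session-tracked URL often looks something like this (the domain and parameter name are invented):

http://www.yourdomainhere.com/products.asp?item=42&sessionid=8FK3D92XQ7

and the index check mentioned above is simply a query typed into Google's search box:

allinurl:yourdomainhere.com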

Another source of grief when it comes to getting your pages thoroughly crawled is the use of the exact same Title tag on every page of your site. This sometimes happens because of Webmaster laziness, but often it's done because a default Title tag is automatically pulled up through a content management system (CMS). If you have this problem, it's well worth taking the time to fix it.

Most CMS's have workarounds where you can add a unique Title tag as opposed to pulling up the same one for each page. Usually the programmers simply never realized it was important, so it was never done. The cool thing is that with dynamically generated pages you can often set your templates to pull a particular sentence from each page and plug it into your Title field. A nice little "trick" is to make sure each page has a headline at the top that utilizes your most important keyword phrases. Once you've got that, you can set your CMS to pull it out and use it for your Titles also.
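
As a hypothetical illustration (the wording is made up), a page whose headline is marked up like this:

<h1>Handmade Leather Dog Collars</h1>

could have its template configured to reuse that same text in the Title tag:

<title>Handmade Leather Dog Collars</title>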

Another reason I’ve seen for pages not being crawled is because they are set to require a cookie when a visitor gets to the page. Well guess what, folks? Spiders don’t eat cookies! (Sure, they like beer, but they hate cookies!) No, you don’t have to remove your cookies to get crawled. Just don’t force-feed them to anyone and everyone. As long as they’re not required, your pages should be crawled just fine.

What about the use of JavaScript? We've often heard that JavaScript is unfriendly to the crawlers. This is partly true and partly false. Nearly every site I look at these days uses some sort of JavaScript within the code. It's certainly not bad in and of itself. As a rule of thumb, if you're using JavaScript for mouseover effects and that sort of thing, just check to make sure that the HTML code for the links also uses the traditional <a> tag. As long as that's there, you'll most likely be fine. For extra insurance, you can place any JavaScript links into the <noscript> tag, put text links at the bottom of your pages, and create a visible link to a sitemap page which contains links to all your other important pages. It's definitely not overkill to do *all* of those things!
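
Here is a rough sketch of what that looks like in practice (the file and function names are made up): the mouseover effect lives in the JavaScript attributes, but the link itself is still a plain <a> tag that a crawler can follow, with a <noscript> fallback thrown in for good measure.

<a href="products.html" onmouseover="swapImage('navProducts')" onmouseout="restoreImage('navProducts')">Our Products</a>
<noscript><a href="products.html">Our Products</a></noscript>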

There are plenty more things you can worry about where your site’s crawlability is concerned, but those are the main ones I’ve been seeing lately. One day, I’m sure that any type of page under the sun will be crawler-friendly, but for now, we’ve still gotta give our little arachnid friends some help.

One tool I use to help me view any potential crawler problems is the Lynx browser tool. Generally, if your pages can be viewed and clicked through in a Lynx browser (which came before our graphical browsers of today), then a search engine spider should also be able to make its way around. That isn’t written in stone, but it’s at least one way of discovering potential problems that you may be having. It’s not foolproof, however. I just checked my forum in the Lynx browser and it shows a blank page, yet the forum gets spidered and indexed by the search engines without a problem.
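
If you want to try this yourself, Lynx can also dump a page as plain text from the command line, which shows roughly what a text-only visitor (or a simple crawler) has to work with (the URL is a placeholder):

lynx -dump http://www.example.com/

The output is the page's text followed by a numbered list of its links; if your important links don't show up in that list, consider it a red flag worth investigating.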

This is a good time to remind you that when you think your site isn’t getting spidered completely, check out lots of things before jumping to any conclusions.

Jill

Author Bio:
Jill Whalen of High Rankings is an internationally recognized search engine marketing consultant and editor of the free weekly email newsletter, the High Rankings Advisor.

She specializes in search engine optimization, SEO consultations and seminars. Jill’s handbook, “The Nitty-gritty of Writing for the Search Engines” teaches business owners how and where to place relevant keyword phrases on their Web sites so that they make sense to users and gain high rankings in the major search engines.

Search Engine To Improve
Wouldn’t it be nice if the search engines could comprehend our impressions of search results and adjust their databases accordingly? Properly optimized web pages would show up well in contextual searches and be rewarded with favorable reviews and listings. Pages which were spam or which had content that did not properly match the query would get negative responses and be pushed down in the search results.

Well, this reality is much closer than you might think.

To date, most webmasters and search engine marketers have ignored or overlooked the importance of traffic as part of a search engine algorithm, and thus have not taken it into consideration as part of their search engine optimization strategy. However, that might soon change as search engines explore new methods to improve their search result offerings. Teoma and Alexa already employ traffic as a factor in the presentation of their search results. Teoma incorporated the technology used by Direct Hit, the first engine to use click-through tracking and stickiness measurement as part of its ranking algorithm. More about Alexa below.

How Can Traffic Be A Factor?
Click popularity sorting algorithms track how many users click on a link and stickiness measurement calculates how long they stay at a website. Properly used and combined, this data can make it possible for users, via passive feedback, to help search engines organize and present relevant search results.
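
To make the idea concrete, here is a purely hypothetical illustration (no engine publishes its actual formula): suppose result A is clicked by 40 of every 100 searchers, who then stay an average of 120 seconds, while result B is clicked by only 10 of every 100 and is abandoned within 5 seconds. A simple combined score such as click-through rate multiplied by average time on page would give A a score of 0.40 x 120 = 48 against 0.10 x 5 = 0.5 for B, pushing A well up the results.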

Click popularity is calculated by measuring the number of clicks each web site receives from a search engine’s results page. The theory is that the more often the search result is clicked, the more popular the web site must be. For many engines the click through calculation ends there. But for the search engines that have enabled toolbars, the possibilities are enormous.

Stickiness measurement is a really great idea in theory, the premise being that a user will click the first result and either spend time reading a relevant web page or click the back button and look at the next result. The longer a user spends on each page, the more relevant it must be. This measurement goes a long way toward fixing the problem of "spoofing" click popularity results. A great example of a search engine that uses this type of data in its algorithms is Alexa.

Alexa’s algorithm is different from the other search engines. Their click popularity algorithm collects traffic pattern data from their own site, partner sites, and also from their own toolbar. Alexa combines three distinct concepts: link popularity, click popularity and click depth. Its directory ranks related links based on popularity, so if your web site is popular, your site will be well placed in Alexa.

The Alexa toolbar doesn't just allow searches; it also reports on people's Internet navigation patterns. It records where people who use the Alexa toolbar go. For example, their technology is able to build a profile of which web sites are popular in the context of which search topic, and display the results sorted according to overall popularity on the Internet.

For example, a user clicks a link to a "financial planner", but the web site's content is an "online casino". They curse for a moment, sigh, click back to the search results, and look at the next result; the first web site gets a low score. The next result is on topic, and they read 4 or 5 pages of content. This pattern is clearly identifiable and is used by Alexa to help sort results by popularity. The theory is that the more page views a web page gets, the more useful a resource it must be. To see this in action, follow this link today –

http://www.alexa.com/data/details/traffic_details?q=&url=http://www.metamend.com/

– look at the traffic details chart, and then click the “Go to site now” button. Repeat the procedure again tomorrow and you should see a spike in user traffic. This shows how Alexa ranks a web site for a single day.

What Can I Do To Score Higher With Click Popularity Algorithms?
Since the scores that generate search engine rankings are based on numerous factors, there's no magic formula to improve your site's placement. It's a combination of things. Optimizing your content, structure and meta tags, and increasing keyword density won't directly change how your site performs in click-tracking systems, but it will help your web site's stickiness measurement by ensuring that the content is relevant to the search query. This relevance will help it move up the rankings and thus improve its click popularity score.

Search Engines Can Use The Click Through Strategy To Improve Results
Search engines need to keep an eye on new technologies and innovative techniques to improve the quality of their search results. Their business model is based on providing highly relevant results to a query quickly and efficiently. If they deliver inaccurate results too often, searchers will go elsewhere to find a more reliable information resource. The proper and carefully balanced application of usage data, such as that collected by Alexa, combined with a comprehensive ranking algorithm, could be employed to improve the quality of search results for web searchers.

Such a ranking formula would certainly cause some waves within the search engine community and with good reason. It would turn existing search engine results on their head by demonstrating that search results need not be passive. Public feedback to previous search results could be factored into improving future search results.

Is any search engine employing such a ranking formula? The answer is yes. ExactSeek recently announced it had implemented such a system, making it the first search engine to integrate direct customer feedback into its results. ExactSeek still places an emphasis on content and quality of optimization, so a well-optimized web site that meets their guidelines will perform well. What this customer feedback system will do is validate the entire process, automatically letting the search engine know how well received a search result is. Popular results will get extended views, whereas unpopular results will be pushed down in ranking.

ExactSeek has recently entered into a variety of technology alliances, including the creation of an ExactSeek Meta Tag awarded solely to web sites that meet their quality-of-optimization standards. Cumulatively, these alliances combine to dramatically improve their search results.

ExactSeek’s innovative approach to ranking search results could be the beginning of a trend among search engines to incorporate traffic data into their ranking algorithms. The searching public will likely have the last word, but webmasters and search engine marketers should take notice that the winds of change are once again blowing on the search engine playing field.

Author Bio:
Richard Zwicky is a founder and the CEO of Metamend Software, a Victoria, B.C. based firm whose cutting edge Search Engine Optimization software has been recognized around the world as a leader in its field. Employing a staff of 10, the firm’s business comes from around the world, with clients from every continent. Most recently the company was recognized for their geo-locational, or GIS technology, which correlates online businesses with their physical locations, as well as their cutting edge advances in contextual search algorithms.

Google Search Engine Related Question
Hello Jill,

First off, thank you for your informative and straightforward newsletter.

I see Google in a different light than you. [Last week you said] “As long as Google is still around and still focused on relevancy…”

Google is more concerned with inbound links than relevancy and they are ripe for misrepresentation if you have the money. If you search for “gas scooters” on Google you will see that the same company owns 9 of the top 10 spots. The same company pays for their text links at a very large, PageRank 6 site then interlinks all of the sites. If this sounds like sour grapes, it’s because it is. You can’t compete against money and brute force if you are small.

Thanks again,

Mike

Jill’s Response
When I first read Mike’s email, I basically thought of it as just another complaint from someone looking for a scapegoat because he couldn’t get high rankings himself. However, I received several similar emails from other subscribers over the past week. They were all concerned about their own Web sites’ ability to be ranked highly when there were companies dominating the search engine results pages (SERPs) through apparently deceptive means.

So, are the bad guys really winning at Google? Unfortunately, for many keyword phrases, it appears that they are.

I looked at the results for the gas scooter phrase Mike mentioned and yes indeed, I was aghast at the tricks being used on many of the top sites. I found every trick in the book, in fact. After doing a bit of digging, I even found some subtle clues that make me think I know which "SEO firm" is helping them.

Among the things I found were high-PageRanked sites, cross-linked with other high-PageRanked sites of the same genre, with numerous keyword phrases in the hyperlinks. Checking backlinks on these sites was also interesting because I found pages that were cloaking, pages that were stuffing image alt attribute tags, and other things that I thought were passé in Google. Apparently they're not passé at all.

Surfing the sites with a Lynx browser was very revealing because JavaScript doesn’t work with Lynx, so I saw lots of stuff I wasn’t supposed to see.

The bigger question that I have to ask, however, is how does the person looking for gas scooters feel about these results at Google? Are they getting what they want? Are they happy with the results? Are they relevant to their query?

At first glance it appears that they are.

This may certainly be all that Google cares about. I don’t know — I’m not Google. Perhaps Google really does only care that the results are relevant and the means used to place them there are of no consequence. We know that their first priority is the user. As long as they’re happy, Google is happy. I guess they are happy…?

The thing is, I’ve always kind of thought that they did care. Maybe I’m wrong. Or maybe they’re just a little bit mixed up right now. Or maybe there’s just no way for Google to spot these techniques and we should all just use them. Okay, I wouldn’t actually go that far! Many of the techniques being used on these sites have been penalized in the past. I’ve seen that sort of cross-linking thing get sites PR0’d (penalized) many times. Perhaps Google’s PR0 penalty thingamabob is just broken?

Come to think of it, I remember that I wrote something similar way back in April of this year regarding the SERP for the phrase “email marketing consultant.” If it was just a bug, surely it would be fixed by now — over 6 months later. A quick check shows that nope, Google is still full of sites using deceptive optimization techniques.

So what's up, Google? Tell me it isn't so. Tell me that you still believe in good vs. evil and that the ends don't justify the means. Cuz right now, you're telling me that they do, and that makes me sad.

As to Mike and the others who have to compete in this space, all I can say is keep at it. You can keep making your sites better and better. You can continue to build up high-quality backlinks to your site. I took a quick look at Mike’s site, and Google is not showing any backlinks. He does have links, but they’re not yet considered high-enough quality to count for much. I didn’t notice any high-quality directory links pointing to Mike’s site, which might make a difference. Keep at it. Build up a great resource site all about your gas scooters. Invite others to write articles about their gas-scootering experiences, and whatever else you can think of.

We have no control over Google’s rankings. If they choose to let deceptive sites win, eventually the overall quality of Google will deteriorate. I’m sure they don’t want that to happen. They have a very tough job having to fight spammers every single day. The thing is, when they err on the side of penalizing too many sites, then everyone is up in arms because they got caught up in spam filters by mistake.

I’m confident that eventually Google will find a happy middle ground and someday be able to automatically tell the difference between sites that got there because they are truly the most relevant, and sites that deceptively appear to be the most relevant. Could be a long wait though.

If you believe that another site is abusing Google’s quality guidelines, feel free to report it to them here. Don’t expect any miracles, however.

Author Bio:
Jill Whalen of High Rankings is an internationally recognized search engine marketing consultant and editor of the free weekly email newsletter, the High Rankings Advisor.

She specializes in search engine optimization, SEO consultations and seminars. Jill’s handbook, “The Nitty-gritty of Writing for the Search Engines” teaches business owners how and where to place relevant keyword phrases on their Web sites so that they make sense to users and gain high rankings in the major search engines.

Robots Exclusion Protocol
Back in the spring of 2003, I wrote an article on the Robots exclusion protocol that generated a lot of emails and many questions. I am still getting them, so it seems that a more extended article is warranted. Often referred to as the ‘Robots.txt file’, the Robots exclusion protocol can be a very important part of your search engine optimization program, but it needs to be carefully implemented to be successful.

If used incorrectly, this small and 'innocent looking' text file can cause a lot of problems. It can even get your site excluded from the search engines' databases if it's not written correctly. In this extended article, I will show you how to write it correctly and make the Robots exclusion protocol an important part of your SEO efforts in attaining good visibility in the major search engines.

How The Robots Exclusion Protocol Works
Some of you may ask what it is and why we need it. In a nutshell, as its name implies, the Robots exclusion protocol is used by Webmasters and site owners to prevent search engine crawlers (or spiders) from indexing certain parts of their Web sites. This could be for a number of reasons, such as sensitive corporate information, semi-confidential data, information that needs to stay private, or to prevent certain programs or scripts from being indexed, etc.

A search engine crawler or spider is a Web 'robot' and will normally follow the robots.txt file (Robots exclusion protocol) if it is present in the root directory of a Website. The robots.txt exclusion protocol was developed at the end of 1993 and still remains the Internet's standard today for controlling how search engine spiders access a particular website.

While the robots.txt file can be used to prevent access to certain parts of a web site, if not correctly implemented it can also prevent access to the whole site! On more than one occasion, I have found the robots exclusion protocol (Robots.txt file) to be the main culprit behind a site not being listed in certain search engines. If it isn't written correctly, it can cause all kinds of problems and, worst of all, you will probably never find out about it just by looking at your actual HTML code.

When a client asks me to analyse a website that has been online for about a year and still isn't listed in certain engines, the first place I look is the robots.txt file. Once it has been corrected and the site's most important keywords have been optimized, the rankings usually go up within the next thirty days or so.

How To Correctly Write The Robots.txt File
As the name implies, the 'Disallow' command in a robots.txt file instructs the search engine's robots to "disallow reading", but that certainly does not mean "disallow indexing". In other words, a disallowed resource may be listed in a search engine's index, even if the search engine follows the protocol. On the other hand, an allowed resource, such as many of the public (HTML) files of a website, can be prevented from being indexed if the Robots.txt file isn't carefully written for the search engines to understand.

The most obvious demonstration of this is the Google search engine. Google can add files to its index without reading them, merely by considering links to those files. In theory, Google can build an index of an entire Web site without ever visiting that site or ever retrieving its robots.txt file.

In so doing, it is not violating the robots.txt protocol, because it's not reading any disallowed resources; it is simply reading other web sites' links to those resources, which Google constantly uses for its PageRank algorithm, among other things.

Contrary to popular belief, a website does not necessarily need to be 'read' by a robot in order to be indexed. As for how the robots.txt file can be used to keep a search engine from listing a particular resource in its index: in practice, most search engines have placed their own interpretation on the robots.txt file, treating a disallowed resource as something they should not add to their index.

Most modern search engines today interpret a resource being disallowed by the robots.txt file as meaning they should not add it to their index. Conversely, if it's already in their index, placed there by previous crawling activity, they would normally remove it. This last point is important and worth remembering.

The inadequacies and limitations of the robots exclusion protocol are indicative of what can sometimes be a bigger problem: it is impossible to prevent any directly accessible resource on a site from being linked to by external sites, be they partner sites, affiliates, competitors' websites or search engines.

Even with a robots.txt file in place, there is no legal or technical reason why anyone must honor it, least of all humans creating links, for whom the standard was never written. In itself this may not seem like a big problem, but there are many instances when a site owner would rather keep a particular page off the Web entirely. In such cases, the robots.txt file will, to a certain degree, help the site owner achieve his or her goal.

What Is Recommended
Since most websites change often and new content is constantly created or updated, it is strongly recommended that the Robots.txt file on your website be re-evaluated at least once a month. If necessary, it only takes a minute or two to edit this small file and make the required changes. Never assume that 'it must be OK, so I don't need to bother with it'. Take a few minutes and look at the way it's written. Ask yourself these questions:

1. Did I add some sensitive files recently?
2. Are there new sections I don’t want indexed?
3. Is there a section I want indexed but isn’t?

As a rule of thumb, before uploading a file or a group of files containing sensitive information that you don't want indexed by the search engines, edit your Robots.txt file first. Make sure you place those files in a separate directory. You could name it private_files or private_content, and add each of those directories to your Robots exclusion file to prevent the spiders from indexing any of them.
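
Using the directory names suggested above, the corresponding record in the Robots.txt file would look like this (it applies to all compliant robots):

User-agent: *
Disallow: /private_files/
Disallow: /private_content/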

Also, if you have public files in a separate directory that you do want indexed, and they have been on your server for more than a month and are still not indexed, have a look at your Robots.txt file to make certain there are no errors in any of its commands.

Examples Of A Properly Written Robots.txt File
In the following example, I will show you how to properly write or edit a Robots.txt file. First, never use a word processor to write or edit these files. Today's modern word processors use special formatting and characters that will not be understood by the search robots and could lead to problems or, worse, could cause them to ignore the Robots file completely.

Use a simple 'pure vanilla text' editor of the ASCII type or any text editor of the Unix variety. Personally, I always use the Notepad editor that comes with any Windows operating system. Make certain you save it as 'robots.txt' (all in lower case). Remember that most Web servers today run Unix or Linux and are case sensitive.

Here is a carefully written Robots.txt file:

User-agent: Titan
Disallow: /

User-agent: EmailCollector
Disallow: /

User-agent: EmailSiphon
Disallow: /

User-agent: EmailWolf
Disallow: /

User-agent: ExtractorPro
Disallow: /

The user-agent is the name of the robot you want to disallow. In this example, I have chosen to disallow Titan, EmailCollector, EmailSiphon, EmailWolf and ExtractorPro. Note that many of these robots belong to spam organizations or companies attempting to collect email addresses from websites, addresses that will probably be used for spam. Those unwanted robots take up unnecessary Internet bandwidth and slow down your Web server in the process. (Now you know how they usually get your email address.) It is my experience that most of those email collectors usually obey the Robots.txt protocol.
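
As a side note, if you also want to state explicitly that every other well-behaved robot is welcome to crawl the entire site, you can add a catch-all record with an empty Disallow value, which means 'nothing is disallowed':

User-agent: *
Disallow: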

Conclusion
Properly implementing the Robots exclusion protocol is a simple process that takes very little time. When used as intended, it can ensure that the files you want indexed on your website will be indexed. It will also tell the search robots where they are not welcome, so you can concentrate on managing your online business in the safest way possible, away from 'inquisitive minds'.

Author:
Serge Thibodeau of Rank For Sales

Improve Link Popularity
There are many different ways to improve link popularity for your Website. One of the most effective ways to improve your site’s link popularity is by writing articles for publication on other related Web sites and in other related newsletters or Ezines.

In this article, I will be sharing 10 techniques I use when I am writing articles to improve link popularity for one of my Websites.

Let’s use this article as an example. I have two goals in mind as I start writing this article, a primary goal and a secondary goal.

My primary goal is to inform you, my reader, about how to improve link popularity for your Website.
My secondary goal is to improve link popularity for one of my own Websites.

If, after having read this article, you can say to yourself that this article was informative and it gave you one or more ideas about how to improve link popularity for your own Website, then I will have achieved my primary goal.

If, after having written this article, I can say to myself that this article was written well enough, and was informative enough, to persuade Webmasters and newsletter or Ezine publishers to reprint this article on their Website or in their newsletter or Ezine, then I will have achieved my secondary goal.

If I do achieve my primary goal, my secondary goal will be achieved as well – you will have been informed and the other publishers will realize that and will reprint this article in order to inform their readers as well.

Now, on to achieving my primary goal, informing you…

How To Improve Link Popularity By Writing Articles
1. Before writing your article, have definite goals in mind. Ask yourself what you want to accomplish by writing this article.

Since you are now reading an article about how to improve link popularity for your Website, that should be one of your goals. I’ll leave you to decide what other goals you want to achieve by writing your article.

2. Have well thought out ideas for the beginning, middle and ending of your article.

For this article, the purpose of my beginning was to inform you about an effective way to improve link popularity for your Website.

The purpose of the middle of this article is to inform you about how you can use this effective way to improve link popularity for your own Website.

The purpose of the ending of this article will be to provide you with a way to learn more on this topic and, to achieve my secondary goal, to improve link popularity for my own Website.

3. Include your most important keyword in the title of your article. You will notice that I have used the keyword phrase "improve link popularity" in the title of this article. This will improve the odds that your article will become indexed and listed by search engines. (A sample Title tag follows this list.)

4. Use your important keyword throughout the content of your article. Again, you will notice that I have used the keyword phrase “improve link popularity” several times throughout the content of this article. Again, this is for the purpose of getting this article indexed and listed by search engines.

5. Write your article as if you were speaking directly to your best friend about a topic that both you and she are passionate about. Don’t try to impress her with your knowledge of the topic. Simply provide her with your thoughts on the subject, using easy-to-understand words and phrases.

6. After you have written your article, leave it alone for a day or two before coming back to it. Using a fresh pair of eyes, read your article to see if the content flows in a logical, easy-to-understand manner. Then proofread your article for typos. Make any necessary corrections.

7. After you have written and proofread your article, publish it on your own Website. Include a resource box at the bottom of your Web page that informs Webmasters and newsletter or Ezine publishers that they are free to reprint your article. This is, after all, the focus of this article on how to improve link popularity for your Website.

8. If you publish a newsletter or Ezine, also publish your article in it as well. If it is a lengthy article, publish part of the article in your newsletter or Ezine and provide a link to the complete article that is published on your Website.

9. Include a resource box at the end of your article that provides a little information about you and an invitation for your readers to visit your Website. Some readers will go to your Website, read your other articles, and choose to reprint some of those as well. This is a definite bonus when you are writing an article to improve link popularity for your Website. (A sample resource box also follows this list.)

10. Submit your article to Websites that archive articles that are available for reprinting on other Websites and in other newsletters or Ezines. My favorite Website for archiving my articles and for obtaining articles written about how to improve link popularity is ArticleCity.com, located at http://www.articlecity.com/. Other archives can be located by searching on “free articles for reprint” or something similar.
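
As promised above, here are two quick illustrations for points 3 and 9 (the title wording, author name, and URL are all invented). A keyword-bearing Title tag for a page carrying this article might read:

<title>How To Improve Link Popularity By Writing Articles</title>

and a typical resource box is simply a short paragraph with a keyword-rich text link back to your site:

<p>Jane Doe writes weekly tips on how to <a href="http://www.example.com/">improve link popularity</a> for small-business Websites, and invites you to visit her site for more free articles.</p>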

Summary
It took me about two hours to write this article. When you are passionate about the topic you are writing about, the words just seem to flow from your brain to your article.

If you are passionate about the topic of your article, you will be able to write an informative article which will be reprinted by Webmasters and newsletter or Ezine publishers and that will help to improve link popularity for your Website.

Author Bio:
Bob designs and builds his own Websites. His most recent site is Link Popularity IQ: http://www.lpiq.com/. Bob invites you to test your knowledge of link popularity by taking his free Link Popularity IQ Test: http://www.lpiq.com/test.

Gasp! It Costs HOW Much??
If you’ve ever gone shopping for professional yet affordable search engine optimization, the sticker shock may have caused you to reconsider that ‘$99 search engine submission special‘ you saw advertised. After all, it’s a field of expertise with no regulatory body and no simple answer to the question: Just how much should an online business pay a search engine optimization company?
I know how much we charge at eTrafficJams.com and our clients say it’s worth every cent. But there are plenty of other SEO services out there that may quote you rates much higher or much lower.

So it’s up to you to examine your priorities. Is a bargain basement package more attractive than increasing your rankings, sales, and revenues via up-to-date, effective SEO techniques that may cost a little more?

Are momentarily high rankings more important than a guarantee of ‘clean’ search engine optimization that will keep you competitive over the long haul?

Are you willing to sacrifice effective SEO strategies in exchange for a discount solution?

Search engine optimization is an incredibly time-consuming task that requires specialized skills and constant research to stay on top of the changing SEO landscape. If the fee is tantalizingly low, well, like they say, ‘if it walks like a duck…’

On the other hand, if you opt for quality SEO, here’s what your money should get you. This will explain why professional optimization is not cheap:

1. Keyword phrase identification. Finding that handful of perfect terms capable of turning a sale is the first order of business and it gobbles up vast amounts of an SEO expert’s time. And here’s a surprise: coming up with a list of 100 keywords that apply in general is far easier than developing a short list of the most relevant, hardest-working ones.

2. Web design consultation. If your site layout isn’t spider-friendly, no amount of SEO finesse will solve your traffic problems. So a search engine expert also has to be up on web design issues and know how to fix the trouble spots.

3. SEO copywriting. It isn’t enough to drive herds of traffic to your site. The masses need to be informed and sold once they get there. That’s where SEO copywriting comes in. (Bargain optimizers always scrimp on this one.)

4. HTML optimization. Keyword phrases also play a starring role in your HTML tags. Knowing which tags are hot and which are not (in the eyes of the engines) is the purview of a truly clued-in search expert.

5. Site registration. This does not mean submitting to Google and the Open Directory Project, then going for lunch. SEO pros know which engines and directories are the movers and shakers, which minor ones are becoming forces to be reckoned with, and where your site stands in the rankings of each. A full-time job in itself.

6. Link partnership building. Have you tried it yet? It's not easy locating quality sites that are related to, but not competing with, yours and that are willing to give up valuable real estate for your link. It takes solid research, smooth negotiating skills, and a large investment of time to develop a base of reputable, relevant link partners that will make the search bots take notice.

7. Monitor traffic. Notice I said ‘monitor traffic’ not just ‘monitor rankings’. Since conversions are the bottom line for most e-businesses, a good search engine optimization company will include comprehensive ROI tracking so you can see exactly what your SEO dollar is producing for you.

If you sacrifice just one of these strategies, you could be jeopardizing the effectiveness of all the others. And many SEO services sacrifice more than just one.

Show Me the Money

Okay, let’s talk numbers. My research shows that you can expect to pay between $3,000 and $7,000 for search engine optimization that includes the above vital steps, depending on how many pages are optimized.

So now you know the difference between professional, affordable search engine optimization and the ‘$99 to submit your site to 3000 search engines’ type of SEO promotion. The price tag is vastly different. So is the product.

Author Bio:
Michael Pedone is founder of eTrafficJams.com, a professional search engine optimization company that specializes in getting targeted, eager-to-buy traffic to your site! Want some tips on how to improve your web site’s optimization? Call 877-785-9977 or click to request a free search engine optimization analysis.

Search On Keyword Phrases In The Search Engines
Using your expanded list of keyword phrases, search for those terms in the search engine databases. Note the number of search engine results: the more results, typically the more competitive the term. Compare the number of search results for plural versions as opposed to singular versions of your keywords in each engine. Note the descriptions that the search engine results bring up – are there any keyword phrases there that might apply to your website? Don't forget the ads Google displays in its search results; study the ads that come up with your search terms as well. While you are searching on your keyword phrases, check your competitors' rankings, along with the new keyword phrase variations you come up with through the Overture and WordTracker keyword tools.

Add Keywords Reflecting Your Local Cities And State
You can also target local areas by including them in your title/meta tags and in the text of your web pages. List only the cities and state you reside in and/or provide services to. You never know who will be looking for a local producer of "blue widgets" in your city or state, and some people prefer to work with a local company. Adding that kind of specific information, even on your contact page, can pull in traffic your local competitors are missing.
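
As a made-up illustration, the contact page of a widget maker in Victoria, B.C. might carry tags such as:

<title>Blue Widgets in Victoria, BC - Acme Widget Company</title>
<meta name="description" content="Acme Widget Company manufactures blue widgets for customers in Victoria and throughout British Columbia.">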

Check Your Site Statistics
Last but certainly not least, check your search engine stats program or raw server logs to see what terms your visitors are using to find your website. There may be combinations of words your visitors are using that you have not thought of, or that may not be in the content of your pages.

Incorporate Keyword Phrases Into Content Of Your Web Pages
Once you have your list of varied keyword phrases, work them into your web pages. Incorporating these terms should "make sense"; in other words, the pages should read well and not sound "spammy". Most of all, the phrases should realistically be part of the content of the page, not placed there only because you need them in the content. Have another person read your copy to see if it sounds reasonable to them.

Finding Targeted Keyword Phrases
Finding keyword phrases your competition is missing is easier than you might think. Combinations of two and three word phrases are often overlooked by your competitors when vying for the top competitive terms. This missed opportunity may be a benefit to you to overcome your competition in the search engine rankings.

Think Like A Searcher – Study Your Target Audience
Really look at the audience you want to bring to your website. Are there terms you might not ordinarily use, or that your competitors use, that would work for a small portion of visitors? Remember that single words tend to be more competitive. Find two and three word phrases that would work for a searcher looking for your website topic. If your visitors usually search on “vertical widgets”, look at “horizontal widgets” as well. Dig deep to find terms that might not be obvious to you. Be sure to focus your terms on the actual topic of your website, and terms that people would really search for. Have another person compile a list of keyword phrases used to find your website or product. You’d be surprised at the number of variations two minds can come up with instead of one. Think like a searcher – not a website owner.

View Your Competitor’s Source Code And Content For Keyword Phrases
Viewing your competitor's source code is very easy and a good way to see what keyword phrases (if any) they are using. Using your browser, view the source code of their page. If the competitor's website is optimized, the title and meta tags should contain the same keywords or variations of keyword phrases. Look over the web page content as well for keyword phrases worked into the text, image alt text, headings and hyperlinks of the pages. If their pages are not optimized, you may gain an even bigger edge on the competition by optimizing your own web pages.
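
When you view the source, the spots worth scanning for keyword phrases look something like this (the phrases shown are invented):

<title>Vertical Widgets and Widget Accessories</title>
<meta name="keywords" content="vertical widgets, widget accessories">
<meta name="description" content="Vertical widgets and widget accessories for the home workshop.">
<h2>Vertical Widgets</h2>
<img src="widget.gif" alt="vertical widget">
<a href="widgets.html">vertical widget catalog</a>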

Using Keyword Tools To Find Variations Of Keyword Phrases
The Overture Suggestion Tool will provide keyword variations. You can find the tool at http://web.archive.org/web/20040605170523/http://inventory.overture.com/d/searchinventory/suggestion/

Clicking on the suggestion tool link will bring up a window that allows you to search for terms and variations of terms. Begin with your list and see how many variations come up in the results. You might be surprised at the popularity of some of the search variations you see. Be sure to add your new keyword phrases to your list.

WordTracker is a keyword tool as well; you can purchase a yearly subscription or even a one-day subscription. Learn more about it here: http://web.archive.org/web/20040605170523/http://www.wordtracker.com/

Keyword Variations Make A Difference
Don’t miss out on the keywords your competitors might miss. Those extra keywords could translate into profits and increased viewing of your website by visitors who might otherwise not find you.

Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing (http://www.searchinnovation.com/), a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.

Free Webmaster Tools
When working with my workshop or online SEO students, I’m often told of a really neat tool that I’ve never heard of before.

So, I thought it would be a great idea to share these tools with you, just like they were shared with me.

A special thanks goes to all of you who told me about these unique tools. And, if anyone else has a “cool tool” to share, be sure to let me know. I’ll publish a part 2 of this article!

Customer Focus Calculator (“We We Monitor”!)
http://web.archive.org/web/20040605170440/http://www.futurenowinc.com/wewe.htm
This has got to be one of my favorite cool tools. It analyzes a Web page and lets you know if the Web page is focused on YOU or on your customers. After all, shouldn’t our Web pages be written for our customers . . . not ourselves?

Search Engine Optimization Tools
http://web.archive.org/web/20040605170440/http://www.webconfs.com/
This site has three different tools that are really cool. One is a Search Engine Spider Simulator, which displays exactly what a spider would see when it visits your page. Another free tool is the Google Backlink Checker, which lets you know who's linking to your site and what link text they're using. The last tool is the Similar Page Checker. We all know that "duplicate content" is a no-no. This neat tool will determine the similarity between two pages.

Marketleap Search Engine Marketing Tools
http://web.archive.org/web/20040605170440/http://www.marketleap.com/services/freetools/default.htm
My favorite tool on this page is the Link Popularity Check. Enter your URL and up to three comparison URLs, and the site will let you know what your link popularity is in AllTheWeb, AltaVista, Google, Inktomi, and MSN. It also compares your site’s link popularity to other URLs you listed as well as other sites on the Net. The Search Engine Saturation tool determines how many pages a given search engine has in its index for your Web site. You can compare your site to five other sites with this tool. Finally, the Keyword Verification Tool checks to see if your page is in the top three pages of a search engine’s results for a certain keyword or keyword phrase.

Website Security Report
http://web.archive.org/web/20040605170440/http://security.computer-concierge.net/security/website_security_report.asp
This very cool tool lets you find out how secure your Web site is. When I first used it, it found more holes in the security of my Web site than the law allows, and I promptly changed to a different hosting company!

Your Computer’s Security
http://web.archive.org/web/20040605170440/http://grc.com/default.htm
Gibson Research Corporation offers a really neat (and free) tool for checking the security of your computer: ShieldsUp. Scroll down the page to the Hot Spots area, where you'll find the link to ShieldsUp. Be sure to go through each of the checks in the ShieldsUp section (File Sharing, Common Ports, etc.).

More Free Webmaster Tools
Reverse DNS Lookup
http://web.archive.org/web/20040605170440/http://remote.12dt.com/rns/
This service converts an IP address to the host name by searching domain name service tables.

Doctor HTML
http://web.archive.org/web/20040605170440/http://www2.imagiware.com/RxHTML/
Click on Single Page, and the free service will analyze your Web page based on document structure, image syntax, table analysis, browser support, hyperlink verification, and much more.

GifBot by NetMechanic
http://web.archive.org/web/20040605170440/http://www.netmechanic.com/accelerate.htm
GifBot will compress your GIF, JPEG, and animated GIF images by decreasing the file size without sacrificing the quality of the image. So, your pages load faster!

Bobby Online
http://web.archive.org/web/20040605170440/http://bobby.watchfire.com/bobby/html/en/index.jsp
This online service allows you to test Web pages to help expose and repair barriers to accessibility and encourage compliance with existing accessibility guidelines. It’s very important to make sure your Web pages are accessible for those who surf the Internet with disabilities.

AnyBrowser.com Screen Size Tester
http://web.archive.org/web/20040605170440/http://anybrowser.com/ScreenSizeTest.html
Have you ever been to a Web site and not been able to scroll to the bottom of the page? The scroll bar is there, but you can't get down that far to scroll. It's not a pleasant experience. At the Screen Size Tester, you can test your Web pages in different screen resolutions so that you can be assured that all of your visitors can view your site in its entirety.

Anti-Spam and Security Fix for FormMail.pl Script
http://web.archive.org/web/20040605170440/http://mailvalley.com/formmail/
FormMail, one of the most-used Perl scripts on the Web, is designed to send data entered into a Web form to an e-mail address. This script can be exploited by a malicious user who could use FormMail as a spam server. If you use this script, spammers may be able to use it to send spam freely using your server's resources. This site offers a patched version that prevents the script from being used by spammers.

Comprehensive Guide to htaccess
http://web.archive.org/web/20040605170440/http://www.javascriptkit.com/howto/htaccess.shtml
Though not a “tool” per se, this comprehensive tutorial about how to use the htaccess file is excellent.

Top Ten Web Design Mistakes for 2002
http://web.archive.org/web/20040605170440/http://www.useit.com/alertbox/20021223.html
Though also not a tool, this handy list is something that we should all be aware of. For example, don’t use horizontal scrolling!

Again, thanks to all of you who shared these wonderfully helpful tools with me. If I missed any (which I’m sure I did), please let me know. I’ll send out a part 2 of this article.

Author Bio:
Robin Nobles with Search Engine Workshops teaches SEO strategies the “stress free” way through hands-on, search engine marketing workshops in locations across the globe and online search engine marketing courses. Visit the World Resource Center, a new networking community for search engine marketers. (http://www.sew-wrc.com/)

Introduction
ExactSeek.com has teamed up with Alexa to provide a brand new search engine ranking algorithm that ranks sites based on content relevancy and Alexa user popularity data.

This unique ranking system not only provides relevant search results, but it also enables website owners to increase their site’s ranking in ExactSeek simply by increasing their Alexa ranking.

ExactSeek.com, a Jayde Online company, is a search engine and directory that indexes over 25,000 websites each day. It currently indexes over 2 million websites and is expected to exceed 5 million by the end of the year.

Alexa Internet, an Amazon.com company, provides the Alexa Toolbar — an invaluable tool for the Internet marketer. This little browser plug-in provides a wealth of information, including site statistics, traffic data and contact information for every site you visit. Best of all, it won’t cost you a dime.

You can download and install the Alexa toolbar here:
Alexa Toolbar

I recently had the opportunity to interview Jayde Online's CEO, Mel Strocen. The Jayde Online Network includes ExactSeek.com, GoArticles.com, SiteProNews.com, AllBusinessNews.com, EzineHub.com and FreeWebSubmission.com. Jayde Online Inc. has been internet-focused from its inception in 1998 and primarily involved in the publication of email newsletters and the development of niche and general search engines.

Here’s what Mel had to say about the new ExactSeek search engine ranking algorithm…

The Interview
Shelley: Mel, thank you for taking time out of your busy schedule to talk with me.

Mel: Shelley, I appreciate the opportunity to provide some insight on recent developments at ExactSeek and our new site ranking algorithm.

Shelley: Now that ExactSeek has teamed up with Alexa, how exactly will websites be ranked?

Mel: The main factors in determining website ranking will be page content relevancy and the site traffic data provided by Alexa Internet, with the emphasis being on the former versus the latter. It will be several weeks, or possibly even longer, before we finalize the exact weight we give Alexa data. What this means is that for the short term webmasters might find their website ranking changing fairly frequently. Think of it as ExactSeek's version of the "Google Dance" 🙂

Shelley: Prior to the new partnership, ExactSeek ranked sites according to keywords displayed within the Title, Meta Description and Meta Keywords tags. Will these tags still play a role in ranking a website? Or, will sites be strictly ranked according to content and their Alexa ranking?

Mel: Yes, Title and Meta tags will continue to play a role, but not to the extent they did previously. The information from these tags will be shown in the actual search result listings and will also be compared to page content as an additional check of overall relevancy.

Shelley: As I’m sure you’re aware, some “experts” may make a case that Alexa rankings can be manipulated. How would you respond to this analysis?

Mel: As mentioned, Alexa traffic data will be an integral part of the ExactSeek ranking algorithm but not the dominant factor, that being page content. Essentially, we’ve opted to emphasize user popularity over link popularity. Either one can be manipulated by savvy webmasters or SEO experts but, in our view, user popularity is less subject to manipulation over the long term because it is more difficult to manipulate the surfing public than it is to manipulate SE crawlers. The bottom line is that people will always be able to better evaluate content than search engine spiders.

Shelley: Will an increase in traffic from other sources affect how a site ranks in ExactSeek?

Mel: Yes, the good news for webmasters is that regardless of what they do to promote their websites, their efforts will result in a better ranking on ExactSeek. So, increased traffic from any source, be it ezine advertising, PPC campaigns, search engine marketing, etc., will help boost site ranking in our search engine.

With other search engines, webmasters have been forced to learn what is important to each engine and tailor their sites accordingly. The beauty of ExactSeek is that webmasters can focus on making their sites relevant to people, not our search engine.

Shelley: How does ExactSeek’s new ranking algorithm compare with other search engines?

Mel: With the exception of Google, the major search engines have offered little in the way of innovation. Factoring user popularity into a ranking algorithm is ground-breaking. User popularity is a far more reliable indicator of where websites should rank and gives the surfing public some input on the search results they see.

Shelley: How will this new ranking system benefit website owners?

Mel: I’ve already mentioned the most obvious ways that webmasters will benefit as a result of our use of Alexa traffic stats. Indirectly, they will also benefit as searchers recognize the importance of user popularity in delivering quality search results. More search traffic for us will translate into more traffic for webmasters who have listed their websites in ExactSeek. In addition, we will be introducing other ranking factors specifically geared to webmasters which will give them an opportunity to boost site ranking in some very unique ways never before employed by any search engine.

Shelley: Is ExactSeek.com the only search engine that will be utilizing this new ranking system?

Mel: No, the new ranking system will be implemented on 6 search engines, those being ExactSeek, Aesop.com, OnSeek.com, SitesOnDisplay.com, MaxPromo.com and Best-SearchEngine.com. Four of these are using the new ranking algorithm now and the remaining two will be within a week or two.

Shelley: What are your submission policies? Should a webmaster submit just their main page or can they submit additional pages? How often should they submit and how?

Mel: We prefer that webmasters submit primary URLs, but additional pages can be submitted if they have unique content and title and meta tags that reflect that unique
content. Once a URL has been incorporated into the ExactSeek database, follow-up submissions are unnecessary. All URLs in our database are recrawled monthly. We also provide webmasters the option of scheduling their own recrawls with our recrawl tool (http://web.archive.org/web/20040605184955/http://www.exactseek.com/srank.html). Using this tool, a webmaster can have sites recrawled on a weekly basis.

Site submissions to ExactSeek can be made at:
http://web.archive.org/web/20040605184955/http://www.exactseek.com/add.html

Shelley: It’s been a pleasure talking with you, Mel. Thank you for allowing me to interview you.

Mel: My pleasure, Shelley. Thank you.

Conclusion
Since Alexa site rankings will play a major role in determining how well your site ranks in ExactSeek, I highly recommend that you download the Alexa toolbar — not only to
track your site’s status, but also to locate quality sites for joint ventures and link partners. It’s a win-win proposition no matter how you look at it.

Author Bio:
Shelley Lowery is the publisher of Etips — Web Design, Internet Marketing and Ecommerce Solutions. Visit Web-Source.net to sign up for a free subscription and receive a free copy of Shelley’s highly acclaimed ebook, “Killer Internet Marketing Strategies”.

Accepted Ethical Practices
Unfortunately, the marketplace today is still littered with many unscrupulous search engine optimization firms that claim to be experts in the field. During my day-to-day job as an SEO, I often encounter so-called search engine marketing companies, SEO firms or individuals that always promise you the world but fail to deliver any worthwhile results or any serious ROI.

They often use techniques forbidden by the search engines, such as invisible text, invisible links, one-pixel links, and doorway or gateway pages. You would think that after all these years they would have learned something worthwhile, but apparently that is not the case for some of them. Let’s not generalize here: most SEO firms do a great job of professionally positioning websites in the major search engines and provide a valuable service to the business community.

However, some of them don’t, and these are the ones you have to watch out for. For the sake of clarity, this article explains the right, “acceptable” techniques recommended by the major search engines, the techniques that will make your site stand out from your competitors without the risk of getting your site penalized or banned altogether. Beware of people who promise you otherwise.

In this section we look at industry-accepted, standard SEO practices and technologies that are either recommended by the major search engines or widely adopted across the industry in the cause of better, more relevant searches for users. Briefly, here are the accepted and recommended techniques and technologies that are sanctioned by the engines and thoroughly applied at Rank for $ales:

* The proper use of the title tag
* The proper use of the description tag
* The proper use of the robots.txt file
* The proper use of the keywords tag
* The proper use of the H1, H2 and H3 tags
* Good body text, keyword-rich marketing copy
* The proper use of a sitemap, helping your users
* Submitting once to the major search engines
* The proper use of internal links to your inward pages
* The proper use of text links, recommended over picture links

Remember that Google PageRank (PR) is an important measure that helps determine where your site will rank in Google and other search engines. Google relies heavily on its complex algorithm to calculate a site’s PageRank, which directly influences all of the above. There are two ways that Rank for $ales strongly recommends to achieve high PR: put a lot of informative content on your site, and join a reciprocal link exchange program. Nothing beats a good reciprocal link exchange program for increasing PageRank in Google.

The Proper Use Of The Title Tag
Correctly using the title tag is as important as ever. Now more than ever, the major search engines give tremendous weight to the title tag. On top of informing your site visitors what the page is all about, it also helps the search engines understand the topic of the page and what they should expect to see on it.

What’s more, the title tag is a great place to write the important keyword or key phrase that you are optimizing that page for! If your page’s main topic is ‘blue widgets’, then ‘Blue Widgets’ should appear in the title tag. As is usually the case in SEO, keeping things as simple as possible goes a long way. If you are using a W3C standards-compliant HTML editor, as is strongly recommended, the TITLE tag is always located inside the HEAD section, and that is where you need to write it, along with your main keyword or key phrase.
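For instance, a page targeting ‘blue widgets’ might use a title along these lines (the wording below is purely illustrative, not taken from any real site):

<head>
<!-- Illustrative example only: the main key phrase appears in the TITLE tag -->
<title>Blue Widgets - Affordable Blue Widgets for Every Project</title>
</head>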

The Proper Use Of The Description Tag
When you do a search on Google, below the title of each page you will see a short description, usually only a line or two, that further informs you of what to expect from that page. To be effective, always write a short but descriptive sentence or two in your description tag. It is also a good idea to use your main keyword or key phrase at least once, as some search engines still look at the contents of certain META tags (the description tag is one of the meta tags).

On a W3C standards-compliant HTML editor, the description tag is always located under the TITLE tag. Here is how it looks:
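The snippet below is a representative sketch; the description wording is a placeholder, not taken from the original article.

<head>
<title>Blue Widgets - Affordable Blue Widgets for Every Project</title>
<!-- Illustrative example: a short, descriptive sentence containing the main key phrase -->
<meta name="description" content="Blue widgets in every size and finish, with fast shipping and a money-back guarantee.">
</head>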

The Proper Use Of The Keywords Tag
In the early days of the Internet, placing your keywords in the meta keywords tag was almost a certain recipe for success. However, due to increasing abuse and spam by many unscrupulous webmasters and site owners, today the only major search engine that still looks at it is Inktomi. Since it only takes a minute or two to write, you might as well do so, as it won’t hurt your rankings. Just don’t spend too much time on it or over-emphasize its importance, as the tag is almost dead.

On a W3C standards-compliant HTML editor, the keywords tag is always located under the description tag. Here is how it looks:
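Again, the snippet below is only a sketch; the keyword list is a placeholder for illustration.

<head>
<title>Blue Widgets - Affordable Blue Widgets for Every Project</title>
<meta name="description" content="Blue widgets in every size and finish, with fast shipping and a money-back guarantee.">
<!-- Illustrative example: a short, non-repetitive keywords list -->
<meta name="keywords" content="blue widgets, widget accessories, buy widgets online">
</head>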

The Proper Use Of The H1, H2 And H3 Tags
After the TITLE tag, if there is one place where the search engines will reward you, it is the use of the H1 and H2 tags. Simply put, H1 and H2 tags are your main headers, used when you need to place a headline or write the main topic of the page for your visitors to read. Placing that headline in H1 and H2 tags (which browsers render in bold by default) signals the search engines that whatever is in there is really important.

I almost invariably make strong use of H1’s and H2’s, and the sites I optimize usually rank very well in the major search engines.
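As a rough sketch, a page’s headline and subheadings might be marked up like this (the copy is a placeholder):

<h1>Blue Widgets for Every Budget</h1>
<p>Opening marketing copy about blue widgets goes here...</p>
<h2>Why Choose Our Blue Widgets?</h2>
<p>Supporting copy for this subtopic goes here...</p>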

Good Body Text And Keyword-Rich Marketing Copy
In real estate, it is ‘Location, Location, Location’. In SEO, it is ‘Content, Content, Content’. The more text and content you have on a given page, the more material the search engines have to work with and the higher your page will rank. It’s as simple as that. Beware of anybody who tells you differently.

For that reason, stay away from Flash technology or site designs that make heavy use of graphics. Always remember that search engines are ‘picture blind’ and cannot see any photographs, pictures or graphics on your pages! Write good, keyword-rich sales copy that reads well, both to your site visitors AND to the search engines. The more content you have, the better.

The Proper Use Of A Sitemap, Helping Your Users
The sitemap is still the search engine’s ‘best friend’. When designing your sitemap, always make certain that all your files are listed in it and linked from it. Search engines today are programmed to look for sitemaps, and if you have one, you increase the chances that all your pages will be spidered (or crawled) by the search engines.

Ideally, all your site’s files should be in the search engine’s database for the best visibility on the Web, and the proper use of a sitemap will greatly increase that visibility. You should always name the file sitemap.html (all one word; avoid using a hyphen or underscore), and be sure to link your sitemap from your homepage.
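A bare-bones sitemap.html might look like the sketch below; the page names are hypothetical:

<html>
<head>
<title>Site Map</title>
</head>
<body>
<h1>Site Map</h1>
<!-- Plain text links to every page on the site -->
<ul>
<li><a href="index.html">Home</a></li>
<li><a href="blue-widgets.html">Blue Widgets</a></li>
<li><a href="green-widgets.html">Green Widgets</a></li>
<li><a href="red-widgets.html">Red Widgets</a></li>
<li><a href="contact.html">Contact Us</a></li>
</ul>
</body>
</html>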

Submitting Once To The Major Search Engines
Never submit your site more than once to the search engines. Today, most modern search engines, such as Google, AltaVista and many others, have automated crawlers (or spiders) that visit the Web, index new or updated files, and include them in their databases. It is also strongly recommended that you submit your site to the Open Directory Project (ODP), also known as DMOZ. Google uses that directory for its main index of categories, and being listed in it will usually benefit your site. The ODP is a free directory.

The Proper Use Of Links To Your Inward Pages
One important technique that some webmasters still neglect is inter-linking throughout the site. Your inner pages are as important as your homepage or your products or services pages. All of these pages can act as legitimate ‘doorway’ pages that the search engines like to see and will usually reward you for.

If you have a page on blue widgets, it should also link to your green and red widget pages, as they are all related. Following these simple techniques will usually give you some ‘good mileage’ as far as visibility is concerned.
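As a sketch, the blue widgets page might cross-link its related pages like this (the file names are hypothetical):

<p>Looking for another color? See our
<!-- Plain text links between related inner pages -->
<a href="green-widgets.html">green widgets</a> and
<a href="red-widgets.html">red widgets</a> pages as well.</p>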

The Proper Use Of Text Links
Whenever possible, use text links rather than graphic or JavaScript-based links that the search engines cannot see. If you really have to use graphic or JavaScript-based hyperlinks to the rest of your site, make sure you include a ‘bottom-of-page’ footer text-link menu that the search engines will love, so they can properly index your pages in their databases. If you have a large site composed of many sections, divide those sections into smaller ones, using ‘families’ of footer menus that differ from one section to another. It’s a bit more work, but you significantly increase your chances that the engines will reward your site in the rankings.
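A simple bottom-of-page footer menu might look like the sketch below, assuming hypothetical page names:

<!-- Footer text-link menu: plain links the crawlers can follow even if the main navigation uses graphics -->
<p>
<a href="index.html">Home</a> |
<a href="products.html">Products</a> |
<a href="about.html">About Us</a> |
<a href="sitemap.html">Site Map</a> |
<a href="contact.html">Contact</a>
</p>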

Conclusion
As I have said so many times before, search engine optimization and positioning is a science as well as an art. What really counts are the results you will get from your efforts. If you carefully implement all the tips and techniques that are in this article, your website should rank well in the search engines and your Web-based business should be a success.

Be extremely wary of anybody who tells you otherwise. Also, bear in mind that it can usually take anywhere from 60 to 90 days, and sometimes more, for a site to rank well. Count on a minimum of 2 to 3 monthly Google updates, which puts you at least three months out in most cases. I have even seen delays of 4 to 6 months after a site was launched. What can speed things up here is a reciprocal link exchange program.

Author:
Serge Thibodeau of Rank For Sales