
Search Engine Optimization Basics
This is the latest article in the “Back to Basics” series. Previous articles covered the importance of search engine marketing (SEM), effective keyword research, title tag formats, Meta tag use, and submissions. In this article, we take a look at changes you can make to the content of your site to further improve search engine positioning.

Over the past few months, search engine optimization (SEO) has become more mainstream, with many companies considering this form of marketing for the first time. The amount of information on the topic of SEO has increased dramatically, with many new authors stepping forward to pen guides that explain how to optimize a website. Yet even with this increased awareness, it still amazes me how many business owners believe that tweaking titles or adding keywords to Meta tags is all that is needed to increase search engine visibility.

Optimizing Your Page Content
In previous articles, I have endeavored to provide a beginner’s guide to making these changes; now it’s time to turn our attention to perhaps the most important aspect of any SEO campaign: optimizing your page content. The only problem with this topic is, where do I start? There are so many changes that can be made to a web page’s content that I could easily fill ten articles on the subject, so you can see my dilemma in trying to condense my advice into one single piece. But that is what we shall do; after all, this is a ‘Back to Basics’ series.

So, where do we start? What is the most important change a Webmaster can make to a page in order to improve search engine positioning? To find the answer, we simply go back to the very first article in this series, where we discussed effective keyword research. When researching your industry, competitors and most requested search terms, you identified the keywords that are the most regularly used by your target audience. You’ve used them in your title and Meta tags, but their most important use is on the actual page content, the text you display on the pages you are trying to get positioned.

Include Your Targeted Search Terms
So many times, I have seen web sites that fail to mention any of the search terms they are trying to achieve rankings for. They’ll have lots of graphics and may also have good levels of text on the page, yet the company still fails to include the exact phrase that is important to them. For example, if you’re trying to achieve rankings for the term ‘desktop computer supplies,’ make sure your content has that exact phrase present in it. It is of little benefit to say something along the lines of, ‘The best selection of accessories for your home computer’ when trying to target ‘desktop computer supplies.’ While you may pick up points for having text that is on the same theme, you won’t achieve your best search engine rankings unless you include liberal occurrences of the exact phrase you are trying to target.

Checking Keyword Density
Your next question is likely to be ‘How often should I mention each search term?’ A well-optimized page should include at least 250 words of text. Within that text, aim for a frequency of 5-15% for the term you are trying to target. Not sure how to calculate search term frequency? Check out www.searchengineworld.com/cgi-bin/kwda.cgi, a great little tool that will show you the keyword density of each one-, two- and three-word phrase on any page within your web site. Make sure that you place your most important search terms in text located towards the top of your page, and try not to target more than 5 phrases within any block of text (the more phrases you try to target, the more text you need to achieve a high frequency).
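
If you’d rather script the calculation than rely on an online tool, keyword density is easy to compute yourself. Here is a minimal Python sketch; the page filename and target phrase are just examples:

```python
import re

def keyword_density(text, phrase):
    """Return the frequency of `phrase` as a percentage of all words in `text`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    # Count non-overlapping occurrences of the phrase in the word stream.
    hits, i = 0, 0
    while i <= len(words) - len(phrase_words):
        if words[i:i + len(phrase_words)] == phrase_words:
            hits += 1
            i += len(phrase_words)
        else:
            i += 1
    # Express the matched words as a share of all words on the page.
    return 100.0 * hits * len(phrase_words) / len(words)

page_text = open("page.txt").read()  # hypothetical: your page copy as plain text
print(f"{keyword_density(page_text, 'desktop computer supplies'):.1f}%")
```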

Also look for opportunities to make links out of search terms located within your page text. In the example of ‘desktop computer supplies,’ consider making one of the occurrences of this phrase a hyperlink to the most relevant page within your website; it will give you a little push in your ranking efforts.

The Impact Of Keyword Proximity
If you’re unable to include the exact phrase within your page text, which can often happen when the targeted search term does not fit into normal syntax, try at least to keep the words in close proximity. For example, you could use ‘Discounted supplies for desktop computers.’ While it is not as valuable as including the exact phrase, it at least contains the targeted words, albeit in a different order. The search engines, while preferring to display pages that match search terms exactly, have shown a propensity to display web pages that have the targeted words in close proximity, if not the exact order in which they were searched.
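
A rough way to see the idea in action: the Python sketch below checks whether all of the targeted words appear within a small window of text, even when the exact phrase is absent. The sample sentence and the window size are arbitrary choices:

```python
import re

def words_within_window(text, terms, window=6):
    """True if all `terms` occur together within some `window`-word span."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    terms = [t.lower() for t in terms]
    for i in range(len(words)):
        span = set(words[i:i + window])
        if all(t in span for t in terms):
            return True
    return False

# The exact phrase "desktop computer supplies" is absent, but the words are close.
print(words_within_window(
    "Discounted supplies for desktop computers, shipped overnight.",
    ["desktop", "computers", "supplies"]))  # True
```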

Search Terms Should Be Pervasive
While the paragraphs of text within your web page offer the best opportunity to include search terms, make sure you don’t miss the many other opportunities scattered among your content. For example, look at the text contained within the headings of each page and make sure they contain the most relevant search term for your content. Also, consider the navigation menu that you use and look for instances where you can include a relevant search term. How about the text you use under each product description? I’ve seen websites where the most dominant two-word phrase on a product page was ‘Sale Price.’ Ouch!

As you can see, the text you use on each page is vitally important when trying to achieve better search engine positioning. However, adding keywords to your content is not enough to get your web site to the coveted ‘#1’ position. There are many other factors that need to be considered, including many that don’t involve the content on the page, but as we are looking at the page content, here are a few quick tips:

- Don’t bury your keyword-rich content at the bottom of the page. The search engines consider where the text is located on a page when determining your site’s relevancy. Google will believe that text pushed to the bottom of your site, in a small font, can’t be that relevant to your business.

- Don’t overdo things. While having no search terms in your text is disastrous, having too many could have an equally negative impact. Stick to your 5-15% frequency.

- Remember the user experience. While your SEO efforts will help improve your search engine rankings, don’t sacrifice the usability of your web site. Ensure that it is easy to navigate and that all of your keyword-rich text still makes sense to the average visitor.

- Add one or two targeted search terms to the ALT tags of any image that links to another page within your website. Search engines have shown they consider ALT tag text when the image contains a link to another page.

- Don’t go overboard with the use of ‘H1’ tags or bolded text. While they can help improve your search engine positioning, less is more.

Walk Before You Run
Hopefully, the above advice will assist you in modifying your most important pages to increase search engine visibility. When you feel you have made all the basic changes to the text of your site, you’ll find many articles that discuss fine-tuning your page layout and content. Search engine optimization is an ongoing process, and you’ll no doubt drive yourself crazy if you try to optimize every single aspect of your web site. Simply remember to keep your site relevant and make sure you have covered all the basics before advancing to more complex techniques.

Author Bio:
Andy Beal is Vice President of Search Marketing for WebSourced, Inc and KeywordRanking.com, global leaders in professional search engine marketing. Highly respected as a source of search engine marketing advice, Andy has had articles published around the world and is a repeat speaker at Danny Sullivan’s Search Engine Strategies conferences. Clients include Real.com, Alaska Air, Peopleclick, Monica Lewinsky and NBC. You can reach Andy at andy@keywordranking.com and view his daily SEO blog at www.searchenginelowdown.com.

Why Isn’t My Website In The Search Engines?
If your site isn’t found in the search engines, it is probably because the robots couldn’t deal with it. It could be something as simple as the robot not being able to find the site, or a more complicated issue involving the robot being unable to crawl the site or figure out what your pages are about.

Submitting your site to the major search engines will help deal with the “can’t find it” problem. Even having links pointing back to your site can be enough to attract the search engine robots. Google, for example, suggests that you may not have to submit your pages at all; it will find your site if at least one other site on the web links back to yours.

If the robots can find your site but can’t make sense of it, then you may need to look at the content and technology used on your pages. Frames, Flash, dynamically generated pages, and invalid HTML source code can cause problems when the search engine robot tries to access your web pages. While some search engines are beginning to be able to index dynamically generated pages and Flash (e.g. Google and AllTheWeb), use of these technologies can hinder your ability to be indexed by the search engine robots.

Text in images cannot be read by the search engine robots. Using ALT image text is an important way to help the robots “read” your images. Websites with extensive images rely heavily on ALT text to present their content.
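
If you want a quick audit of which images on a page are missing ALT text, a few lines of scripting will flag them. A minimal Python sketch, assuming a local copy of the page saved as index.html:

```python
from html.parser import HTMLParser

class AltAudit(HTMLParser):
    """Collect the src of every <img> tag that has no (or empty) alt text."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):
                self.missing.append(attrs.get("src", "(no src)"))

audit = AltAudit()
audit.feed(open("index.html").read())  # hypothetical local copy of your page
for src in audit.missing:
    print("Missing alt text:", src)
```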

How Do I Get The Most Out Of Indexing?
If you know what to “feed” the spidering robots, you will help yourself with search engine ranking.

Having a website full of good content is the major factor. Search engines exist to serve their visitors, not to rank your website. You need to present your site in the way that will be most useful to the search engine visitor. Each search engine has its own idea of what is important in a page, but they all value text highly. Making sure that the text on your pages includes your most important keyword phrases will help the search engine evaluate the content of those pages.

Making sure that you have good title and meta tags will further assist the search engines in understanding what your page is about. If the text on the page is about widgets, the title is about widgets, and the meta tags are about widgets, the search engine will have a pretty good idea that you are all about widgets. When their visitors search for widgets, the search engines know to list your site in the results.

A sitemap page is a very good way of giving the search engine robot every opportunity to reach your website pages. Since robots follow the links on your web pages, make sure that at least your most important pages are included in the sitemap; you may even want to include all your pages there, depending on the size of your site. Be sure to add a link to the sitemap page from each page on your site.
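
Building that sitemap page can even be automated. Here is a minimal Python sketch that writes a plain HTML sitemap from a list of page URLs and titles; the pages and the output filename are hypothetical:

```python
# Hypothetical list of (url, title) pairs for the pages you want crawled.
pages = [
    ("/index.html", "Home"),
    ("/products.html", "Products"),
    ("/widgets.html", "Widgets"),
]

# One plain text link per page: exactly the kind of link robots follow best.
items = "\n".join(f'<li><a href="{url}">{title}</a></li>' for url, title in pages)
html = (
    "<html><head><title>Site Map</title></head><body>\n"
    f"<ul>\n{items}\n</ul>\n"
    "</body></html>"
)

with open("sitemap.html", "w") as f:
    f.write(html)
```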

Another important consideration is that of keeping all of your pages within a small number of “clicks” from your top page. Many robots will not follow links more than two or three levels deep, so if your “widgets” page can only be reached from your home page by following multiple links (e.g. home page >> about us page >> products page >> widgets page), the robot may not crawl deep enough to get to the widgets page.

Testing Your Website For Search Engine Robot Accessibility
To get an idea just what the search engine robot “sees” on your page, you can look at the Sim Spider tool. You may be surprised at how different your site looks to the robot. You can find this tool at http://www.searchengineworld.com/cgi-bin/sim_spider.cgi

You will see text and ALT image text show up in the results. If your entire website is built in Flash, you will see nothing at all because robots don’t understand Flash movies.
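
If you’d like a rough, do-it-yourself version of the same idea, the Python sketch below strips a page down to approximately what a robot keeps: visible text, ALT text and link URLs. The URL is a placeholder, and a real crawler’s parsing is considerably more involved:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class RobotView(HTMLParser):
    """Rough approximation of what a crawler keeps from a page."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self.skip = 0  # depth inside <script>/<style> blocks

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1
        attrs = dict(attrs)
        if tag == "img" and attrs.get("alt"):
            self.chunks.append(f"[ALT: {attrs['alt']}]")
        if tag == "a" and attrs.get("href"):
            self.chunks.append(f"[LINK: {attrs['href']}]")

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip and data.strip():
            self.chunks.append(data.strip())

viewer = RobotView()
viewer.feed(urlopen("http://www.example.com/").read().decode("utf-8", "ignore"))
print("\n".join(viewer.chunks))
```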

The Bottom Line
When it comes to search engine robots, think simply. Lots of good content and text, hyperlinks the robots can follow, optimization of your pages, topical links pointing back to your site and a sitemap will help ensure the best results when the robots come visiting.

Resources

SpiderSpotting – Search Engine Watch

Robotstxt.org
Protocols for setting up a robots.txt file.

Spider-Food
Tutorials, forums and articles about Search Engine spiders and Search Engine Marketing.

SpiderHunter.com
Articles and resources about tracking Search Engine spiders.

Sims Search Engine Simulator
Search Engine World has a spider that simulates what the Search Engine robots read from your website.

Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing (http://www.searchinnovation.com), a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.

Search Engine Robots
Automated search engine robots, sometimes called “spiders” or “crawlers”, are the seekers of web pages. How do they work? What is it they really do? Why are they important?

You’d think with all the fuss about indexing web pages to add to search engine databases, that robots would be great and powerful beings. Wrong. Search engine robots have only basic functionality like that of early browsers in terms of what they can understand in a web page. Like early browsers, robots just can’t do certain things. Robots don’t understand frames, Flash movies, images or JavaScript. They can’t enter password protected areas and they can’t click all those buttons you have on your website. They can be stopped cold while indexing a dynamically generated URL and slowed to a stop with JavaScript navigation.

How Do Search Engine Robots Work?
Think of search engine robots as automated data retrieval programs, traveling the web to find information and links.

When you submit a web page to a search engine at the “Submit a URL” page, the new URL is added to the robot’s queue of websites to visit on its next foray out onto the web. Even if you don’t directly submit a page, many robots will find your site because of links from other sites that point back to yours. This is one of the reasons why it is important to build your link popularity and to get links from other topical sites back to yours.

When arriving at your website, the automated robots first check to see if you have a robots.txt file. This file is used to tell robots which areas of your site are off-limits to them. Typically these may be directories containing only binaries or other files the robot doesn’t need to concern itself with.
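
Python’s standard library happens to include a parser for this file, which makes it easy to check what a given robot is allowed to fetch. A small sketch; the domain and path are placeholders:

```python
from urllib.robotparser import RobotFileParser

rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# Would Google's robot be allowed to fetch this page?
print(rp.can_fetch("Googlebot", "http://www.example.com/private/report.html"))
```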

Robots collect links from each page they visit and later follow those links through to other pages. The entire World Wide Web is made up of links; the original idea was that you could follow links from one place to another. This is how robots get around.

The “smarts” about indexing pages online comes from the search engine engineers, who devise the methods used to evaluate the information the search engine robots retrieve. When introduced into the search engine database, the information is available for searchers querying the search engine. When a search engine user enters their query into the search engine, there are a number of quick calculations done to make sure that the search engine presents just the right set of results to give their visitor the most relevant response to their query.

You can see which pages on your site the search engine robots have visited by looking at your server logs or the results from your log statistics program. Identifying the robots will show you when they visited your website, which pages they visited and how often they visit. Some robots are readily identifiable by their user agent names, like Google’s “Googlebot”; others are a bit more obscure, like Inktomi’s “Slurp”. Still other robots listed in your logs may be ones you cannot readily identify; some may even appear to be human-powered browsers.

Along with identifying individual robots and counting their visits, the statistics can also show you aggressive, bandwidth-grabbing robots or robots you may not want visiting your website. In the resources section at the end of this article, you will find sites that list the names and IP addresses of search engine robots to help you identify them.
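
As an illustration, here is a minimal Python sketch that tallies requests from a few well-known robots by scanning the user-agent field of a combined-format access log. The log filename and the list of robot names are just examples:

```python
import re
from collections import Counter

# User-agent substrings for a few well-known robots (extend as needed).
BOTS = ["Googlebot", "Slurp", "Scooter", "msnbot"]

# In the Combined Log Format, the user agent is the last quoted field.
ua_field = re.compile(r'"([^"]*)"$')

counts = Counter()
with open("access.log") as log:  # hypothetical raw server log
    for line in log:
        m = ua_field.search(line.strip())
        if not m:
            continue
        for bot in BOTS:
            if bot.lower() in m.group(1).lower():
                counts[bot] += 1

for bot, n in counts.most_common():
    print(f"{bot}: {n} requests")
```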

How Do They Read The Pages On Your Website?
When the search engine robot visits your page, it looks at the visible text on the page, the content of the various tags in your page’s source code (title tag, meta tags, etc.), and the hyperlinks on your page. From the words and the links that the robot finds, the search engine decides what your page is about. There are many factors used to figure out what “matters” and each search engine has its own algorithm in order to evaluate and process the information. Depending on how the robot is set up through the search engine, the information is indexed and then delivered to the search engine’s database.

The information delivered to the databases then becomes part of the search engine and directory ranking process. When the search engine visitor submits their query, the search engine digs through its database to give the final listing that is displayed on the results page.

The search engine databases update at varying times. Once you are in the search engine databases, the robots keep visiting you periodically, to pick up any changes to your pages, and to make sure they have the latest info. The number of times you are visited depends on how the search engine sets up its visits, which can vary per search engine.

Sometimes visiting robots are unable to access the website they are visiting. If your site is down, or you are experiencing huge amounts of traffic, the robot may not be able to access your site. When this happens, the website may not be re-indexed, depending on the frequency of the robot visits to your website. In most cases, robots that cannot access your pages will try again later, hoping that your site will be accessible then.
Even directories that serve secondary search results use these robot-gathered results as content for their websites.

In fact, robots aren’t just used by the search engines. There are robots that check for new content for databases, re-visit old content for databases, check to see if links have changed, download entire websites for viewing, and more.

Because indexing can be delayed or missed this way, reading your log files (either raw logs from your server or through your log statistics program) and tracking your search engine listings help you keep track of the indexing of your website pages.

Resources
SpiderSpotting – Search Engine Watch

Robotstxt.org
Protocols for setting up a robots.txt file.

Spider-Food
Tutorials, forums and articles about Search Engine spiders and Search Engine Marketing.

SpiderHunter.com
Articles and resources about tracking Search Engine spiders.

Sims Search Engine Simulator
Search Engine World has a spider that simulates what the Search Engine robots read from your website.

Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing (http://www.searchinnovation.com), a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.

Server Log Files
You have invested considerable amounts of money in search engine optimization for your website. You might even participate in a PPC (Pay-per-Click) campaign or other forms of advertising. Yet you would like to know just who visited your site, when and how they arrived, and where they went while they were there. Maybe you are also asking yourself the eternal question: Did they buy from me? If not, why did they leave without buying anything?

After being in this business for more than 6 years, I still can’t understand why so few site owners, business administrators and even Webmasters look at their server logs often and, most importantly, closely.

Few seem to realize that there is a ‘diamond mine’ of information residing in those log files. All it takes is to retrieve and analyze them at least once a week. If you can look at them every day, even better.

A Primer On Server Log Files
First, let’s rewind to the mid-90’s, when the Internet was still relatively young, commercially speaking. Back then, most site owners and Webmasters were mainly interested in the number of ‘hits’ their site got from one day to the next. Even today, I still get asked: “How many hits did I get today?” Talking with some of these clients, I realize they need a little primer article, such as this one, to help them better understand the importance of professional analytics and the measurement of Web traffic, site demographics, geo-location data, statistics, etc.

Today, modern traffic analysis tools and analytical software can yield far more than this rudimentary information. The more information business administrators have at their fingertips, the more they can use it to their advantage over competitors. If some executives within your organization would like to know exactly how visitors behave once they are inside your site, a good analytical package can now give them access to that kind of information.

Additionally, one of the most important features of most of the good packages I have seen is the ‘link referrer’ information. As its name implies, the link referrer information will clearly identify the search engine that brought you that visitor, or if it’s a link from one of your affiliate’s sites or a site that links back to you. What’s more, when properly configured, the search engine referrer log will even tell you which specific keywords or key phrases they entered to find your site.
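
Pulling those keywords out yourself is straightforward if your server writes combined-format logs, where the referrer appears just before the user agent as a quoted field. A rough Python sketch; the log filename and the q=/p= parameter names are assumptions that vary by search engine:

```python
import re
from collections import Counter
from urllib.parse import urlparse, parse_qs

# Combined Log Format ends with: "referrer" "user agent"
ref_field = re.compile(r'"([^"]*)" "[^"]*"$')

keywords = Counter()
with open("access.log") as log:  # hypothetical raw server log
    for line in log:
        m = ref_field.search(line.strip())
        if not m:
            continue
        ref = urlparse(m.group(1))
        if "google." in ref.netloc or "search" in ref.netloc:
            query = parse_qs(ref.query)
            # Many engines pass the search terms as q= (or p= for Yahoo).
            for term in query.get("q", []) + query.get("p", []):
                keywords[term.lower()] += 1

for term, n in keywords.most_common(20):
    print(f"{n:4d}  {term}")
```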

The Importance Of Error Logs
Another often-overlooked and neglected area is the error log. It’s an excellent way to discover problems with your website before a visitor mentions them to you. A good example is finding ‘error 404’ or ‘file not found’ entries in your error logs. Combined with the referrer information, they let you discover which page caused the error, which makes the problem a lot simpler to correct.
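
A few lines of scripting can pull those 404 entries, along with the referring pages, straight out of a combined-format access log. A minimal Python sketch; the log filename is hypothetical:

```python
import re

# Combined Log Format: "request" status size "referrer" "user agent"
line_re = re.compile(r'"(?:GET|POST) ([^ ]+) [^"]*" (\d{3}) \S+ "([^"]*)"')

with open("access.log") as log:  # hypothetical raw server log
    for line in log:
        m = line_re.search(line)
        if m and m.group(2) == "404":
            print(f"404 for {m.group(1)}  (linked from: {m.group(3) or '-'})")
```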

Most popular Web hosting packages on the market today provide some kind of server log files that you can analyze and evaluate. Ask your Web hosting provider for that information.

Conclusion
Frequent and regular server log file analysis and monitoring can indeed be a valuable source of important website information, if done correctly. It can mean all the difference between discovering that a site has little or no ROI and one that truly achieves the best overall value for any size of business. There are very simple and inexpensive server log analysis programs available today; some are even free. As with anything, you can also purchase expensive software suites more suited to a larger enterprise. It all depends on your needs and budget.

Author:
Serge Thibodeau of Rank For Sales

Organic SEO: Patience For Long Term Ranking Results
When does long term SEO show ranking results? It takes time for optimization to produce targeted traffic to your website. Organic SEO requires time to take effect, just as it takes time for your web pages to start showing up in the search engine results.

Clients regularly ask me about the timing of a search engine optimization campaign and when those results will be seen in the search engine listings. A long term marketing campaign based on search engine optimization takes time: patience is the name of the game.

Optimization Timeframe
SEO’s timeframe depends on a number of factors. Part of this involves the accuracy of your keyword phrase choices: is the keyword phrase one your visitors would use to find your product or website? If your keyword phrases are targeted to your audience, you will gain optimum results. Did you use Paid Inclusion and/or PPC services? The best recipe for success combines SEO, Paid Inclusion and PPC services. If you use organic SEO only, without Paid Inclusion or PPC, it takes more time to achieve results.

Paid Results
When you use Paid Inclusion or PPC (Pay-Per-Click) bidding, your results show up sooner than traditional SEO. Paid Inclusion submissions state the time-frame in which your page will be indexed by the search engine robots when you sign up for services. PPC bidding results show up as soon as searchers start clicking on your PPC ads. This type of search engine marketing requires an annual budget to renew Paid Inclusion submissions and payment per month for PPC click-through costs. If you are paying too much for your PPC services, organic SEO combined with PPC often helps to keep the prices down for the paid service. By generating additional targeted traffic on those costly terms you may be able to bring the bidding prices down in your PPC campaign or even eliminate some keyword bidding.

The timeline given for paid submissions means the search engines are generating income through this process. Paying for results also gives you a guarantee the listings will be relatively stable in the database.

Paid Inclusion submissions will always take precedence over free submissions because the company makes money from Paid Inclusion. For this reason, most search engines process free submissions over a longer period of time than paid submissions. When using SEO without the paid submission options, the process is the same, but the optimized pages take longer to be processed into the search engine databases.

Organic SEO
Organic SEO works differently. The best reason to use organic SEO is that it is a low-cost method of promoting your website. It can take three to six months to see the full results of optimizing your website, especially if you are only using organic optimization. The plus of an organic approach is that once you optimize your pages, the main part of the work is done. You may tweak your keywords and text here and there, but unless you completely re-design your pages, you have what you need in place to begin drawing in targeted traffic. Continue checking your ranking status and reading your log statistics, especially for new keywords visitors are using to find your website.

When using free submissions, expect a three to six month wait before most of the long term results show in the search engine listings. If you build a link popularity program and have links pointing back to your website, the search engine robots will find your website through those links, eliminating the need for free submissions. Look at it this way: you pay once for basic optimization, and over time the results improve to their optimum level. You don’t have to keep paying for this service because, unless the search engine databases drop your free submission pages (which is not common these days), you will be visible to search engine users when they search on your targeted keyword phrases. Over time you should see a progression in your ranking, depending on how competitive your keyword phrases are.

Budget SEO
What if you can’t afford Paid Inclusion or PPC services? Organic SEO is a great way to increase targeted traffic to your website over time. If you do not have a budget for Paid Inclusion submissions and PPC programs, organic SEO will give you good results if you are willing to wait instead of gaining immediate results. Combine organic SEO with plenty of good content and a solid link building program for optimum results. Remember, the search engine listings may entice visitors to come to your website, but you must give them a reason to stay once they arrive. Build your content to keep your new visitors at your website.

Patience Pays Off
Organic SEO is “common sense” promotion. Not a lot of fancy bells and whistles, and it takes time. The addition of good navigation, good content with your keyword phrases throughout the pages and topical sites pointing links back to your website equals long term success. Practice patience when going organic for your SEO campaign. It may take time but it will be worth the long term results you reap.

Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing (http://www.searchinnovation.com), a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.

Making Sure Your Site Is Crawler-friendly
I couldn’t find any “meaty” questions for this week’s newsletter, so I thought I’d just talk generally about what makes a site “crawler-friendly.” I used to call this “search-engine-friendly” but my friend Mike Grehan convinced me that the more accurate phrase was “crawler-friendly” because it’s the search engine crawlers (or spiders) that your site needs to buddy-up to, as opposed to the search engine itself.

So, how do you make sure your site is on good terms with the crawlers? Well, it always helps to first buy it a few drinks. But, since that’s not usually possible, your next-best bet is to design your site with the crawlers in mind. The search engine spiders are primitive beings, and although they are constantly being improved, for best results you should always choose simplicity over complexity.

What this means is that cutting-edge designs are generally not the best way to go. Interestingly enough, your site visitors may agree. Even though we SEO geeks have cable modems and DSL, our site visitors probably don’t. Slow-loading Flash sites, for example, may stop the search engine spiders right in their tracks. There’s nothing of interest on the average Flash site to a search engine spider anyway, so they’re certainly not going to wait for it to download!

Besides Flash, there are a number of “helpful” features being thrown into site designs these days that can sadly be the kiss of death to its overall spiderability. For instance, sites that require a session ID to track visitors may never receive any visitors to begin with — at least not from the search engines. If your site or shopping cart requires session IDs, check Google right now to see if your pages are indexed. (Do an allinurl:yourdomainhere.com in Google’s search box and see what shows up.) If you see that Google only has one or two pages indexed, your session IDs may be the culprit. There are workarounds for this, as I have seen many sites that use session IDs get indexed; however, the average programmer/designer may not even know this is a problem.

Another source of grief when it comes to getting your pages thoroughly crawled is using the exact same Title tag on every page of your site. This sometimes happens because of Webmaster laziness, but often it’s because a default Title tag is automatically pulled up through a content management system (CMS). If you have this problem, it’s well worth taking the time to fix it.

Most CMS’s have workarounds where you can add a unique Title tag as opposed to pulling up the same one for each page. Usually the programmers simply never realized it was important, so it was never done. The cool thing is that with dynamically generated pages you can often set your templates to pull a particular sentence from each page and plug it into your Title field. A nice little “trick” is to make sure each page has a headline at the top of the page that is utilizing your most important keyword phrases. Once you’ve got that, you can set your CMS to pull it out and use it for your Titles also.
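
To illustrate the trick, here is a rough Python sketch that lifts the first headline out of a page and reuses it as the Title. The filenames and fallback text are hypothetical, and a real CMS would do this in its template layer rather than as a post-processing step:

```python
import re

def title_from_headline(html, fallback="Widgets | Example Co."):
    """Replace the page's <title> text with its first <h1> headline."""
    m = re.search(r"<h1[^>]*>(.*?)</h1>", html, re.IGNORECASE | re.DOTALL)
    headline = re.sub(r"<[^>]+>", "", m.group(1)).strip() if m else fallback
    return re.sub(r"<title>.*?</title>", f"<title>{headline}</title>",
                  html, flags=re.IGNORECASE | re.DOTALL)

page = open("widgets.html").read()  # hypothetical CMS template output
open("widgets.html", "w").write(title_from_headline(page))
```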

Another reason I’ve seen for pages not being crawled is because they are set to require a cookie when a visitor gets to the page. Well guess what, folks? Spiders don’t eat cookies! (Sure, they like beer, but they hate cookies!) No, you don’t have to remove your cookies to get crawled. Just don’t force-feed them to anyone and everyone. As long as they’re not required, your pages should be crawled just fine.

What about the use of JavaScript? We’ve often heard that JavaScript is unfriendly to the crawlers. This is partly true and partly false. Nearly every site I look at these days uses some sort of JavaScript within the code. It’s certainly not bad in and of itself. As a rule of thumb, if you’re using JavaScript for mouseover effects and that sort of thing, just check to make sure that the HTML code for the links also uses the traditional <a> tag. As long as that’s there, you’ll most likely be fine. For extra insurance, you can place any JavaScript links into the <noscript> tag, put text links at the bottom of your pages, and create a visible link to a sitemap page which contains links to all your other important pages. It’s definitely not overkill to do *all* of those things!

There are plenty more things you can worry about where your site’s crawlability is concerned, but those are the main ones I’ve been seeing lately. One day, I’m sure that any type of page under the sun will be crawler-friendly, but for now, we’ve still gotta give our little arachnid friends some help.

One tool I use to help me view any potential crawler problems is the Lynx browser tool. Generally, if your pages can be viewed and clicked through in a Lynx browser (which came before our graphical browsers of today), then a search engine spider should also be able to make its way around. That isn’t written in stone, but it’s at least one way of discovering potential problems that you may be having. It’s not foolproof, however. I just checked my forum in the Lynx browser and it shows a blank page, yet the forum gets spidered and indexed by the search engines without a problem.

This is a good time to remind you that when you think your site isn’t getting spidered completely, check out lots of things before jumping to any conclusions.

Jill

Author Bio:
Jill Whalen of High Rankings is an internationally recognized search engine marketing consultant and editor of the free weekly email newsletter, the High Rankings Advisor.

She specializes in search engine optimization, SEO consultations and seminars. Jill’s handbook, “The Nitty-gritty of Writing for the Search Engines” teaches business owners how and where to place relevant keyword phrases on their Web sites so that they make sense to users and gain high rankings in the major search engines.

Improve Link Popularity
There are many different ways to improve link popularity for your Website. One of the most effective ways to improve your site’s link popularity is by writing articles for publication on other related Web sites and in other related newsletters or Ezines.

In this article, I will be sharing 10 techniques I use when I am writing articles to improve link popularity for one of my Websites.

Let’s use this article as an example. I have two goals in mind as I start writing this article, a primary goal and a secondary goal.

My primary goal is to inform you, my reader, about how to improve link popularity for your Website.
My secondary goal is to improve link popularity for one of my own Websites.

If, after having read this article, you can say to yourself that this article was informative and it gave you one or more ideas about how to improve link popularity for your own Website, then I will have achieved my primary goal.

If, after having written this article, I can say to myself that this article was written well enough, and was informative enough, to persuade Webmasters and newsletter or Ezine publishers to reprint this article on their Website or in their newsletter or Ezine, then I will have achieved my secondary goal.

If I do achieve my primary goal, my secondary goal will be achieved as well – you will have been informed and the other publishers will realize that and will reprint this article in order to inform their readers as well.

Now, on to achieving my primary goal, informing you…

How To Improve Link Popularity By Writing Articles
1. Before writing your article, have definite goals in mind. Ask yourself what you want to accomplish by writing this article.

Since you are now reading an article about how to improve link popularity for your Website, that should be one of your goals. I’ll leave you to decide what other goals you want to achieve by writing your article.

2. Have well thought out ideas for the beginning, middle and ending of your article.

For this article, the purpose of my beginning was to inform you about an effective way to improve link popularity for your Website.

The purpose of the middle of this article is to inform you about how you can use this effective way to improve link popularity for your own Website.

The purpose of the ending of this article will be to provide you with a way to learn more on this topic and, to achieve my secondary goal, to improve link popularity for my own Website.

3. Include your most important keyword in the title of your article. You will notice that I have used the keyword phrase “improve link popularity” in the title of this article. This will improve the odds that your article will become indexed and listed by search engines.

4. Use your important keyword throughout the content of your article. Again, you will notice that I have used the keyword phrase “improve link popularity” several times throughout the content of this article. Again, this is for the purpose of getting this article indexed and listed by search engines.

5. Write your article as if you were speaking directly to your best friend about a topic that both you and she are passionate about. Don’t try to impress her with your knowledge of the topic. Simply provide her with your thoughts on the subject, using easy-to-understand words and phrases.

6. After you have written your article, leave it alone for a day or two before coming back to it. Using a fresh pair of eyes, read your article to see if the content flows in a logical, easy-to-understand manner. Then proofread your article for typos. Make any necessary corrections.

7. After you have written and proofread your article, publish it on your own Website. Include a resource box at the bottom of your Web page that informs Webmasters and newsletter or Ezine publishers that they are free to reprint your article. This is, after all, the focus of this article on how to improve link popularity for your Website.

8. If you publish a newsletter or Ezine, also publish your article in it as well. If it is a lengthy article, publish part of the article in your newsletter or Ezine and provide a link to the complete article that is published on your Website.

9. Include a resource box at the end of your article that provides a little information about you and an invitation for your readers to visit your Website. Some readers will go to your Website, read your other articles, and choose to reprint some of those as well. This is a definite bonus when you are writing an article to improve link popularity for your Website.

10. Submit your article to Websites that archive articles that are available for reprinting on other Websites and in other newsletters or Ezines. My favorite Website for archiving my articles and for obtaining articles written about how to improve link popularity is ArticleCity.com, located at http://www.articlecity.com/. Other archives can be located by searching on “free articles for reprint” or something similar.

Summary
It took me about two hours to write this article. When you are passionate about the topic you are writing about, the words just seem to flow from your brain to your article.

If you are passionate about the topic of your article, you will be able to write an informative article which will be reprinted by Webmasters and newsletter or Ezine publishers and that will help to improve link popularity for your Website.

Author Bio:
Bob designs and builds his own Websites. His most recent site is Link Popularity IQ: http://www.lpiq.com/. Bob invites you to test your knowledge of link popularity by taking his free Link Popularity IQ Test: http://www.lpiq.com/test.

Gasp! It Costs HOW Much??
If you’ve ever gone shopping for professional yet affordable search engine optimization, the sticker shock may have caused you to reconsider that ‘$99 search engine submission special‘ you saw advertised. After all, it’s a field of expertise with no regulatory body and no simple answer to the question: Just how much should an online business pay a search engine optimization company?

I know how much we charge at eTrafficJams.com and our clients say it’s worth every cent. But there are plenty of other SEO services out there that may quote you rates much higher or much lower.

So it’s up to you to examine your priorities. Is a bargain basement package more attractive than increasing your rankings, sales, and revenues via up-to-date, effective SEO techniques that may cost a little more?

Are momentarily high rankings more important than a guarantee of ‘clean’ search engine optimization that will keep you competitive over the long haul?

Are you willing to sacrifice effective SEO strategies in exchange for a discount solution?

Search engine optimization is an incredibly time-consuming task that requires specialized skills and constant research to stay on top of the changing SEO landscape. If the fee is tantalizingly low, well, like they say, ‘if it walks like a duck…’

On the other hand, if you opt for quality SEO, here’s what your money should get you. This will explain why professional optimization is not cheap:

1. Keyword phrase identification. Finding that handful of perfect terms capable of turning a sale is the first order of business and it gobbles up vast amounts of an SEO expert’s time. And here’s a surprise: coming up with a list of 100 keywords that apply in general is far easier than developing a short list of the most relevant, hardest-working ones.

2. Web design consultation. If your site layout isn’t spider-friendly, no amount of SEO finesse will solve your traffic problems. So a search engine expert also has to be up on web design issues and know how to fix the trouble spots.

3. SEO copywriting. It isn’t enough to drive herds of traffic to your site. The masses need to be informed and sold once they get there. That’s where SEO copywriting comes in. (Bargain optimizers always scrimp on this one.)

4. HTML optimization. Keyword phrases also play a starring role in your HTML tags. Knowing which tags are hot and which are not (in the eyes of the engines) is the purview of a truly clued-in search expert.

5. Site registration. This does not mean submitting to Google and the Open Directory Project, then going for lunch. SEO pros know which engines and directories are the movers and shakers, which minor ones are becoming forces to be reckoned with, and where your site stands in the rankings of each. A full-time job in itself.

6. Link partnership building. Have you tried it yet? It’s not easy locating quality sites that are related to, but not competing with, yours and are willing to give up valuable real estate for your link. It takes solid research, smooth negotiating skills, and a large investment of time to develop a base of reputable, relevant link partners that will make the search bots take notice.

7. Monitor traffic. Notice I said ‘monitor traffic’ not just ‘monitor rankings’. Since conversions are the bottom line for most e-businesses, a good search engine optimization company will include comprehensive ROI tracking so you can see exactly what your SEO dollar is producing for you.

If you sacrifice just one of these strategies, you could be jeopardizing the effectiveness of all the others. And many SEO services sacrifice more than just one.

Show Me the Money

Okay, let’s talk numbers. My research shows that you can expect to pay between $3,000 and $7,000 for search engine optimization that includes the above vital steps, depending on how many pages are optimized.

So now you know the difference between professional, affordable search engine optimization and the ‘$99 to submit your site to 3000 search engines’ type of SEO promotion. The price tag is vastly different. So is the product.

Author Bio:
Michael Pedone is founder of eTrafficJams.com, a professional search engine optimization company that specializes in getting targeted, eager-to-buy traffic to your site! Want some tips on how to improve your web site’s optimization? Call 877-785-9977 or click to request a free search engine optimization analysis.

Search On Keyword Phrases In The Search Engines
Using your expanded list of keyword phrases, search for those terms in the search engine databases. Note the number of search engine results: the more results, typically the more competitive the term. Compare the number of results for plural versions of your keywords as opposed to singular versions in each engine. Note the descriptions that the search engine results bring up: are there any keyword phrases there that might apply to your website? Don’t forget the ads Google displays in its search results; study the ads that come up with your search terms as well. While you are searching on your keyword phrases, check your competitors’ rankings, along with the new keyword phrase variations you come up with through the Overture Keyword and WordTracker tools.

Add Keywords Reflecting Your Local Cities And State
You can also target local areas by including them in your title/meta tags and in the text of your web pages. List only the cities and state you reside in and/or provide services to. You never know who will be looking for a local producer of “blue widgets” in their city or state, and some people prefer to work with a local company. Adding those types of specifics, even on your contact page with your local information, can pull in traffic your local competitors are missing.

Check Your Site Statistics
Last but certainly not least, check your search engine stats program or raw server logs to see what terms your visitors are using to find your website. There may be combinations of words your visitors are using that you have not thought of, or that may not be in the content of your pages.

Incorporate Keyword Phrases Into Content Of Your Web Pages
Once you have your list of varied keyword phrases, work them into your web pages. Incorporating these terms should “make sense”; in other words, the text should read well and not sound “spammy”. Most of all, the phrases should realistically be part of the content of the page, not placed there only because you need them in the content. Have another person read your copy to see if it sounds reasonable to them.

Finding Targeted Keyword Phrases
Finding keyword phrases your competition is missing is easier than you might think. Combinations of two and three word phrases are often overlooked by competitors vying for the top competitive terms. This missed opportunity can help you overcome your competition in the search engine rankings.

Think Like A Searcher – Study Your Target Audience
Really look at the audience you want to bring to your website. Are there terms you might not ordinarily use, or that your competitors use, that would work for a small portion of visitors? Remember that single words tend to be more competitive. Find two and three word phrases that would work for a searcher looking for your website topic. If your visitors usually search on “vertical widgets”, look at “horizontal widgets” as well. Dig deep to find terms that might not be obvious to you. Be sure to focus your terms on the actual topic of your website, and terms that people would really search for. Have another person compile a list of keyword phrases used to find your website or product. You’d be surprised at the number of variations two minds can come up with instead of one. Think like a searcher – not a website owner.

View Your Competitor’s Source Code And Content For Keyword Phrases
Viewing your competitor’s source code is very easy and a good way to see what keyword phrases (if any) they are using. Using your browser, view the source code of their page. The title and meta tags should contain the same keywords, or variations of them, if the competitor’s website is optimized. Look over the web page content as well for keyword phrases worked into the text, image alt text, headings and hyperlinks of the pages. If their pages are not optimized, you may gain an even bigger edge on the competition by optimizing your web pages.
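
If you’d like to automate the peek, a short script can pull the title and keyword/description meta tags from any page. A minimal Python sketch using only the standard library; the URL is a placeholder:

```python
from html.parser import HTMLParser
from urllib.request import urlopen

class TagPeek(HTMLParser):
    """Pull the <title> and keywords/description meta tags out of a page."""
    def __init__(self):
        super().__init__()
        self.in_title = False
        self.title = ""
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self.in_title = True
        elif tag == "meta" and (attrs.get("name") or "").lower() in ("keywords", "description"):
            self.meta[attrs["name"].lower()] = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self.in_title = False

    def handle_data(self, data):
        if self.in_title:
            self.title += data

peek = TagPeek()
peek.feed(urlopen("http://www.example.com/").read().decode("utf-8", "ignore"))
print("Title:", peek.title.strip())
for name, content in peek.meta.items():
    print(f"{name}: {content}")
```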

Using Keyword Tools To Find Variations Of Keyword Phrases
The Overture Suggestion Tool will provide keyword variations. You can find the tool at http://web.archive.org/web/20040605170523/http://inventory.overture.com/d/searchinventory/suggestion/

Clicking on the suggestion tool link will bring up a window that allows you to search for terms and variations of terms. Begin with your list and see how many variations come up in the results. You might be surprised at the popularity of some of the search variations you see. Be sure to add your new keyword phrases to your list.

WordTracker is a keyword tool as well; you can purchase a yearly subscription or even a one-day subscription. Learn more about it here: http://web.archive.org/web/20040605170523/http://www.wordtracker.com/

Keyword Variations Make A Difference
Don’t miss out on the keywords your competitors might miss. Those extra keywords could translate into profits and increased viewing of your website by visitors who might otherwise not find you.

Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing (http://www.searchinnovation.com/), a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.

Free Webmaster Tools
When working with my workshop or online SEO students, I’m often told of a really neat tool that I’ve never heard of before.

So, I thought it would be a great idea to share these tools with you, just like they were shared with me.

A special thanks goes to all of you who told me about these unique tools. And, if anyone else has a “cool tool” to share, be sure to let me know. I’ll publish a part 2 of this article!

Customer Focus Calculator (“We We Monitor”!)
http://web.archive.org/web/20040605170440/http://www.futurenowinc.com/wewe.htm
This has got to be one of my favorite cool tools. It analyzes a Web page and lets you know if the Web page is focused on YOU or on your customers. After all, shouldn’t our Web pages be written for our customers . . . not ourselves?

Search Engine Optimization Tools
http://web.archive.org/web/20040605170440/http://www.webconfs.com/
This site has three different tools that are really cool. One is a Search Engine Spider Simulator, which displays exactly what a spider would see when it visits your page. Another free tool is the Google Backlink Checker, which lets you know who’s linking to your site and what link text they’re using. The last tool is the Similar Page Checker. We all know that “duplicate content” is a no-no; this neat tool will determine the similarity between two pages.

Marketleap Search Engine Marketing Tools
http://web.archive.org/web/20040605170440/http://www.marketleap.com/services/freetools/default.htm
My favorite tool on this page is the Link Popularity Check. Enter your URL and up to three comparison URLs, and the site will let you know what your link popularity is in AllTheWeb, AltaVista, Google, Inktomi, and MSN. It also compares your site’s link popularity to other URLs you listed as well as other sites on the Net. The Search Engine Saturation tool determines how many pages a given search engine has in its index for your Web site. You can compare your site to five other sites with this tool. Finally, the Keyword Verification Tool checks to see if your page is in the top three pages of a search engine’s results for a certain keyword or keyword phrase.

Website Security Report
http://web.archive.org/web/20040605170440/http://security.computer-concierge.net/security/website_security_report.asp
This very cool tool lets you find out how secure your Web site is. When I first used it, it found more holes in the security of my Web site than the law allows, and I promptly changed to a different hosting company!

Your Computer’s Security
http://web.archive.org/web/20040605170440/http://grc.com/default.htm
Gibson Research Corporation offers a really neat (and free) tool for checking the security of your computer: ShieldsUp. Scroll down on the page to the Hot Spots area, where you’ll find the link to ShieldsUp. Be sure to go through each of the checks in the ShieldsUp section (File Sharing, Common Ports, etc.).

More Free Webmaster Tools
Reverse DNS Lookup
http://web.archive.org/web/20040605170440/http://remote.12dt.com/rns/
This service converts an IP address to the host name by searching domain name service tables.

Doctor HTML
http://web.archive.org/web/20040605170440/http://www2.imagiware.com/RxHTML/
Click on Single Page, and the free service will analyze your Web page based on document structure, image syntax, table analysis, browser support, hyperlink verification, and much more.

GifBot by NetMechanic
http://web.archive.org/web/20040605170440/http://www.netmechanic.com/accelerate.htm
GifBot will compress your GIF, JPEG, and animated GIF images by decreasing the file size without sacrificing the quality of the image. So, your pages load faster!

Bobby Online
http://web.archive.org/web/20040605170440/http://bobby.watchfire.com/bobby/html/en/index.jsp
This online service allows you to test Web pages to help expose and repair barriers to accessibility and encourage compliance with existing accessibility guidelines. It’s very important to make sure your Web pages are accessible for those who surf the Internet with disabilities.

AnyBrowser.com Screen Size Tester
http://web.archive.org/web/20040605170440/http://anybrowser.com/ScreenSizeTest.html
Have you ever been to a Web site and not been able to scroll to the bottom of the page? The scroll bar is there, but you can’t get down that far to scroll. It’s not a pleasant experience. At the Screen Size Tester, you can test your Web pages in different screen resolutions so that you can be assured that all of your visitors can view your site in its entirety.

Anti-Spam and Security Fix for FormMail.pl Script
http://web.archive.org/web/20040605170440/http://mailvalley.com/formmail/
FormMail, one of the most-used Perl scripts on the Web, is designed to send data entered into a Web form to an e-mail address. The script can be exploited by a malicious user who could use FormMail as a spam server; if you use it, spammers may be able to send spam freely using your server’s resources. This site offers a patched version that prevents the script from being used by spammers.

Comprehensive Guide to htaccess
http://web.archive.org/web/20040605170440/http://www.javascriptkit.com/howto/htaccess.shtml
Though not a “tool” per se, this comprehensive tutorial about how to use the htaccess file is excellent.

Top Ten Web Design Mistakes for 2002
http://web.archive.org/web/20040605170440/http://www.useit.com/alertbox/20021223.html
Though also not a tool, this handy list is something that we should all be aware of. For example, don’t use horizontal scrolling!

Again, thanks to all of you who shared these wonderfully helpful tools with me. If I missed any (which I’m sure I did), please let me know. I’ll send out a part 2 of this article.

Author Bio:
Robin Nobles with Search Engine Workshops teaches SEO strategies the “stress free” way through hands-on, search engine marketing workshops in locations across the globe and online search engine marketing courses. Visit the World Resource Center, a new networking community for search engine marketers. (http://www.sew-wrc.com/)