Web Moves Blog

13 Feb 2004

Get Organized – Adopt A Naming Convention

Introduction
Building a successful SEO (Search Engine Optimization) campaign requires a lot of time and hard work. Search engines are constantly changing their algorithms and it’s up to you to make the necessary adjustments to accommodate these changes. Keeping track of all of your optimized pages can be a daunting task. However, you can avoid unnecessary confusion by organizing your optimized pages in a streamlined fashion. Although not common practice, this is one of the most important steps in any successful SEO campaign.

What do I mean by “organized?” Simply, that you should develop a clear plan on how your pages will be named and where they will be situated on your web site. You need to be able to easily identify and track what pages have been indexed by what engine and what pages need to be updated. One way to achieve this is to adopt a “naming convention”.

Example 1.
Your company web site sells widgets. You have a list of 5 of your most important keywords and you’ve optimized these keywords for 4 search engines. That’s a total of 20 optimized pages. You have a robots.txt file set up to prevent search engine ‘A’ from indexing pages that are intended for search engine ‘B’ and so on.

Let’s examine the drawbacks to this naming convention:

Keyword          Page Name            Engine
widgets          widgets.htm          Google
blue widgets     bluewidgets.htm      Google
red widgets      redwidgets.htm       Google
black widgets    blackwidgets.htm     Google
purple widgets   purplewidgets.htm    Google
widgets          widgets2.htm         MSN
blue widgets     bluewidgets2.htm     MSN
red widgets      redwidgets2.htm      MSN
black widgets    blackwidgets2.htm    MSN
purple widgets   purplewidgets2.htm   MSN
widgets          widgets3.htm         AltaVista
blue widgets     bluewidgets3.htm     AltaVista
red widgets      redwidgets3.htm      AltaVista
black widgets    blackwidgets3.htm    AltaVista
purple widgets   purplewidgets3.htm   AltaVista
widgets          widgets4.htm         Hotbot
blue widgets     bluewidgets4.htm     Hotbot
red widgets      redwidgets4.htm      Hotbot
black widgets    blackwidgets4.htm    Hotbot
purple widgets   purplewidgets4.htm   Hotbot

1. The words in your page names are not very distinct. This is important because a search engine cannot determine if bluewidgets.htm is made up of two distinct words “blue” and “widgets.” You need to find a way to separate these keywords in the page name or you will not get credit for the keyword in the file name.

2. Your page names are not easily identifiable. When you run a Reporter mission, you will see your pages indexed with the number appended to the keyword phrase in the file name. At first glance, this doesn’t tell you the engine for which the page is optimized. You need to be as descriptive as possible.

3. Using a robots.txt file can diminish your exposure across all of the search engines. I explain this in the next section.

Now, let’s take a look at how we can modify our page names to get credit for the keywords, make it easy to identify which engine each page targets, and gain maximum exposure.

Example 2.
Below, you’ll see an example of how I have added hyphens to separate keywords in the page name. Also, I’ve appended an engine indicator to the file name, so it will be easy to distinguish which page is optimized for which engine.

Keyword          Page Name               Engine
widgets          widgets.htm             Google
blue widgets     blue-widgets-gg.htm     Google
red widgets      red-widgets-gg.htm      Google
black widgets    black-widgets-gg.htm    Google
purple widgets   purple-widgets-gg.htm   Google
widgets          widgets-ms.htm          MSN
blue widgets     blue-widgets-ms.htm     MSN
red widgets      red-widgets-ms.htm      MSN
black widgets    black-widgets-ms.htm    MSN
purple widgets   purple-widgets-ms.htm   MSN
widgets          widgets-av.htm          AltaVista
blue widgets     blue-widgets-av.htm     AltaVista
red widgets      red-widgets-av.htm      AltaVista
black widgets    black-widgets-av.htm    AltaVista
purple widgets   purple-widgets-av.htm   AltaVista
widgets          widgets-hb.htm          Hotbot
blue widgets     blue-widgets-hb.htm     Hotbot
red widgets      red-widgets-hb.htm      Hotbot
black widgets    black-widgets-hb.htm    Hotbot
purple widgets   purple-widgets-hb.htm   Hotbot

I use abbreviations such as “gg” for Google, “ms” for MSN, and so on. You don’t have to use my abbreviations. However, make sure the naming convention that you implement is consistent. That’s the most important thing.
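To keep the convention consistent, you could even generate the page names with a short script instead of typing them out by hand. Here’s a minimal sketch in Python; the abbreviations and keyword list are simply the examples from this article, so substitute your own:

```python
# Sketch: build hyphenated, engine-tagged page names.
# The abbreviations below are just the examples used in this article.
ENGINE_ABBREVIATIONS = {
    "Google": "gg",
    "MSN": "ms",
    "AltaVista": "av",
    "Hotbot": "hb",
}

def page_name(keyword: str, engine: str) -> str:
    """Build a name like 'blue-widgets-gg.htm' from a keyword phrase."""
    slug = "-".join(keyword.lower().split())  # hyphenate the keyword words
    return f"{slug}-{ENGINE_ABBREVIATIONS[engine]}.htm"

keywords = ["widgets", "blue widgets", "red widgets",
            "black widgets", "purple widgets"]
for engine in ENGINE_ABBREVIATIONS:
    for kw in keywords:
        print(f"{kw:15} {page_name(kw, engine):25} {engine}")
```

Running it prints a tracking table much like the one above, which you can paste into a spreadsheet to record what each engine has indexed.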

Tip: Please be careful when creating an “engine indicator.” Do not spell out the entire engine name in your filename. For instance, avoid naming your page like this:

blue-widgets-google.htm

Although it has not been proven, Google and other crawlers could potentially flag such a page as a doorway page, because it may look as though you created it specifically to rank high on that engine.

You might be thinking, “I’ve created a robots.txt file, so I don’t have to worry about search engine ‘A’ indexing pages that are intended for search engine ‘B.’” Yes, that is correct. However, if you use a robots.txt file for this purpose, you could be cheating yourself out of maximum exposure across all of the search engines.

If you do not use a robots.txt file, you will notice that search engine ‘A’ will index pages optimized for search engine ‘B.’ This is exactly what you want. To do this safely, however, you must be careful that the pages do not contain such similar content that they could be flagged as spam.

It is completely possible to optimize several different pages that target the same keyword, and create content so unique that you will not be flagged for spam. As I mentioned, this will maximize your exposure across all of the search engines, while allowing you to increase the overall unique content of your site.

I can’t tell you how many times engine ‘A’ has picked up pages that I’ve optimized for engine ‘B’ and ranked the ‘B’ pages higher than those I specifically optimized for ‘A.’ So, if at all possible, use a robots.txt file only to protect your confidential content from being indexed.
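If you follow that advice, a robots.txt file in your root directory might look something like the fragment below. The `/private/` directory name is just a placeholder for wherever you keep confidential content:

```
# robots.txt — block all crawlers from confidential content only,
# leaving every optimized page open to every engine.
User-agent: *
Disallow: /private/
```

Because there is no Disallow rule for your optimized pages, every engine remains free to crawl and index all of them.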

One Final Tip
Try to avoid creating sub directories solely for the purpose of storing optimized pages for a specific search engine. Storing all of your optimized pages in your root directory gives you a better chance at higher rankings because most crawlers give more weight to pages found in the root directory. In this case, it is better to sacrifice the organization and shoot for the higher rankings.

Author Bio:
This article is copyrighted and has been reprinted with permission from Matt Paolini. Matt Paolini is a Webmaster/Tech Support Specialist for FirstPlace Software, the makers of WebPosition Gold. He’s also an experienced freelance Search Engine Optimization Specialist and Cold Fusion/ASP.NET/SQL Server developer/designer. For more information on his services, please visit http://www.webtemplatestore.net/ or send him an email at webmaster@webtemplatestore.net.