has 13+ years' experience in web development, ecommerce, and internet marketing. He has been actively involved in the internet marketing efforts of more than 100 websites in some of the most competitive industries online. John comes up with truly off-the-wall ideas, and has pioneered some completely unique marketing methods and campaigns. John is active in every single aspect of the work we do: link sourcing, website analytics, conversion optimization, PPC management, CMS, CRM, database management, hosting solutions, site optimization, social media, local search, content marketing. He is our conductor and idea man, and has a reputation of being a brutally honest straight shooter. He has been in the trenches directly and understands what motivates a site owner. His driven personality works to the client's benefit, as his passion fuels his desire for your success. His aggressive approach is motivating, his intuition for internet marketing is fine-tuned, and his knack for link building is unparalleled. He has been published in books and numerous international trade magazines, featured in the Wall Street Journal, sat on boards of trade associations, and has been a spokesperson for Fortune 100 corporations including MSN, Microsoft, eBay and Amazon at several internet marketing industry events. John is addicted to Peet's coffee, loves travel and golf, and is a workaholic except on Sunday during Steelers games.
Optimization Of A Website For Your Local Market
Having a site globally optimized for certain keywords is great, if you happen to have a business that sells its products or services all over the planet. But what if your industry is a local one, or if the products you sell only cater to a specific city or county? Does it make sense to compete with hundreds or even thousands of other companies like yours when you are really dealing with a smaller, local market?
In the same way that the good old Yellow Pages helped local and small businesses thrive before the advent of the Internet, today a website properly positioned for a specific city or local market can do the same, only better and faster! Having your site optimized for your local market can bring you a lot of sales, not to mention repeat business spread over many years. In this article, I will show you how to concentrate your SEO efforts on just one specific geographic location or city.
Your Geo-Title Tags
Can you think of a better place to start than your title tags? If the title tag for an important products page is ‘Black Widgets’ and you only sell those black widgets in the city of Toronto, renaming that title tag ‘Black Widgets for Toronto’ or ‘Toronto Black Widgets’ would, by itself, drive a lot more sales to your site, provided people type ‘Toronto’ either before or after the keywords ‘Black Widgets’. If you are based in San Francisco and only want to sell those black widgets in that city, you would use ‘San Francisco Black Widgets’, and so on and so forth.
Going a step further, if you are only interested in a specific district, include that district in your title tag. A good example would be ‘North Miami Black Widgets’. You wouldn’t rank as highly if people typed only ‘Miami Black Widgets’ into Google, but you would probably beat most of your outside competitors when they typed ‘North Miami Black Widgets’, as your page would likely appear at or near the top of the results in Google, or just about any other major search engine for that matter. The whole idea here is relevancy: for a person living in North Miami, ‘North Miami’ is more relevant than just ‘Miami’.
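In HTML, a geo-title like the North Miami example above is just an ordinary title tag. A minimal sketch (the page markup and company name are hypothetical):

```html
<head>
  <title>North Miami Black Widgets</title>
</head>
```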
Your “Geo-Headline” Or Descriptive Header Text
If you are optimizing your site for your local market, placing a ‘Geo-Headline’ in bold H1 or H2 tags will also help you a great deal. If your market is strictly the city of Calgary, you could write a Geo-Headline that would look something like this:
“City of Calgary Black Widgets” or “Black Widgets made for Calgary”. Either one of these would drive a lot of targeted traffic to your site, since they are Geographically (Geo) specific.
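In markup, such a Geo-Headline is simply a bold heading tag, for instance (a hypothetical sketch of the two Calgary examples above):

```html
<h1>City of Calgary Black Widgets</h1>
<h2>Black Widgets Made for Calgary</h2>
```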
Today more than ever, the major search engines are using complex algorithms and sophisticated ranking formulas that take geo-physical location under consideration, further increasing your chances of a better ranking for a specific local market.
Your Body Text And “Geo-Sales” Copy
The same principles need to be applied to the body of your text for maximum local visibility in the search engines. If your local market is the city of Edmonton, Alberta, instead of writing ‘Visit our green widgets page and check our great prices’, you could write: ‘Our green widgets have been formulated to withstand Edmonton’s harshest winters and are Albertans’ first choice when it comes to after-sale service and warranty’.
Such strong geo-location copywriting is certain to help your site rank well in the search engines. If you read carefully, you'll notice I included Edmonton and Alberta as two separate but very relevant geographic locations, just as many searchers would type them into the search box.
Use my “Geo-Linking” Strategy To The Fullest
If you really want to go all the way with the ‘Geo-Thinking’ I have described so far, then consider using my Geo-Linking strategy! What is Geo-Linking? It is simply using the name of your city or local market in the text links from one page of your site to another. Here are a few examples:
1- Please click here to view our New York City Red Widgets
2- Click here to examine our Washington apartments
3- Compare our low prices on our Quebec City rental cars
4- Take a virtual tour of our Madison County City Hall
5- Read Denver-West’s real estate history since 1940
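In HTML, these geo-links are plain text links with the location in the anchor text. A sketch of the first and third examples (the URLs are hypothetical):

```html
<a href="/new-york-city-red-widgets.html">New York City Red Widgets</a>
<a href="/quebec-city-rental-cars.html">Compare our low prices on our Quebec City rental cars</a>
```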
Conclusion
Optimizing a website for a local market or just a single city can be as simple and as straightforward as this. These are the basics and if you stick to them, you should do great. Always try to put yourself in the ‘shoes’ of your average searcher.
If you are selling real estate in just the eastern tip of a large city, use the many examples I gave you in this article. In less time than it takes to have a Yellow Pages ad approved, printed and distributed in your city, your website should start generating strong, targeted local sales from prospects searching for YOUR local products or services.
Author:
Serge Thibodeau of Rank For Sales
Google Adsense
If you’ve been looking for an easy way to increase your website’s revenue, Google AdSense may be your answer.
Google AdSense is a phenomenal new advertising revenue program that is taking the Internet by storm. It was specifically designed to enable content rich sites to increase their advertising revenue simply by displaying Google AdWords ads.
Google AdSense makes selling advertising space easy, as they handle everything for you. With access to a database of 100,000 advertisers, you’ll never have to worry about finding advertisers for your website again.
The concept is simple. If you have a quality site that provides content, such as articles, you may qualify to display Google AdWords text ads on your web pages. If your site is approved, you will receive a portion of the pay-per-click payment.
The great thing about Google’s sophisticated advertising system is that the ads that display on your web pages are relevant to your content. They scan each page to determine what your page is about and display the ads accordingly. This will increase your click-through rate considerably, as relevant text ads combined with quality content are highly effective.
Although the AdSense program is free to apply to, your site will be reviewed and must be approved before you can begin displaying the ads.
As there have been many sites turned down, you’ll need to ensure your site meets Google’s criteria prior to applying. You can find the guidelines at the following web address: http://web.archive.org/web/20040605173906/https://www.google.com/adsense/policies
They’re basically looking for quality sites offering content. The keywords here are “quality and content.” If your site is under construction, loaded with advertising and/or broken images and links, don’t even waste your time applying, as you won’t be accepted. However, if you have a quality site and provide your visitors with content, you will most likely be accepted.
The more content pages you have, the more advertising revenue you can make. It’s really that simple. If you don’t have your own content, there is a wealth of content available on the Internet completely free. Subscribe to any of the following article announcement groups to receive new article submissions each day:
Article Announce – All types of articles
AABusiness – Business oriented articles, including: Business, Ecommerce, Sales, Networking, Business Communication, Internet Marketing, Promotion, etc.
AAInternet – Internet oriented articles, including: Ebooks, Ezines, Search Engines, Web Design, Web Development, Web Sites, etc.
AAHome – Home and Family oriented articles, including: Parenting, Relationships, Cooking, Recipes, Crafts, Gardening, Home, etc.
AAHealth – Health and Fitness oriented articles, including both physical and emotional health.
AAGeneral – General Interest oriented articles.
With your subscription to any of the above groups, you’ll not only receive new article submissions delivered to your email, but you’ll also have access to the archives which contain thousands of quality articles.
Visit the following web address for further information: http://web.archive.org/web/20040605173906/http://www.web-source.net/articlesub.htm
Once you’ve located articles that are relevant to your website, simply create a page for each. For example, if you find ten appropriate articles, you can create ten new content pages in which you can display Google AdWords ads. Not only will you be adding valuable content for your visitors, but you’ll also increase your website’s traffic and revenue. It’s a win-win deal no matter how you look at it.
Once your site has been approved, you simply select the style of ads you’d like to display and paste the code into your web pages where you’d like the ads to display.
Although all of the ads are text ads, there are currently four layout options to select from.
Although all four ad styles can be effective, the “skyscraper” ads placed toward the top right side of your page or the “Leaderboard” ads placed at the top of the page generally will provide a higher click-through rate than the other two styles.
However, each page is different and will produce different results. For this reason, you may want to test your pages and display the style that produces the highest click-through rate.
Once you begin displaying the ads on your website, you can visit Google and log in to your account to see how the ads are performing. You can view the number of impressions, clicks, click-through rate and your earnings.
If you’re concerned about displaying ads that may compete with products or services you’re promoting, Google offers an option to filter out unwanted ads. In addition, you can display the ads along with any affiliate programs you may be promoting, as long as the ads don’t look like Google ads.
Google AdSense is the hottest advertising revenue program on the Internet. Isn’t it about time you got paid for your content?
Visit Google AdSense for further information:
https://www.google.com/adsense
Author Bio:
Shelley Lowery is the publisher of Etips — Web Design, Internet Marketing and Ecommerce Solutions. Visit Web-Source.net to sign up for a free subscription and receive a free copy of Shelley’s highly acclaimed ebook, “Killer Internet Marketing Strategies.”
Pursuit of high search engine ranking has other marketing benefits.
If you run an on-line business or operate a website in support of your traditional small business, you have probably desired at some time to improve your ranking in search engine results. Wouldn’t we all like to be ranked #1 in Google for a term related to our business? Of course we would. The traffic could be staggering, and your business would gain considerable exposure.
Of course it is not that easy to do, since everyone else in the world who has a website wants the same thing. You’ve probably read articles that tell you all about the tips and tricks of climbing the Google ladder, and you’ve also read articles that tell you to forget about it. They say it’s simply too hard and will require too much effort to build your website up to that point.
In this writer’s opinion, they are both wrong. You should pursue a high ranking in Google with all of your available resources, but not necessarily with the goal of hitting the top of the search engine mountain. You’ll see why after a quick review of the two most commonly mentioned tips to increasing your ranking in Google.
Building Relevant Inbound Links
When somebody else’s website links to yours, they are essentially ‘voting’ for your website. They are telling visitors to their website that, in their opinion, it would be worthwhile to visit your site as well, as it may be of interest to them. Google counts relevant inbound links in your favor, believing that the number of people willing to give your site a ‘vote of confidence’ through a link is an indication that your site is worthwhile. Building relevant inbound links can be accomplished through direct communication with other website owners, or through the practice of article writing.
Articles (like this one) carry a resource box at the end that contains a link back to the author’s website. If your article seems worthwhile to another business website owner, they may post it for their readers. In doing so they generate an automatic link to your site as their way of saying ‘thanks for the information’.
You’ll notice that I keep using the word ‘relevant’. Links from non-relevant sites or from pages that contain only links and no valuable content may actually downgrade your site ranking in Google, as it may be perceived as ‘SPAM’.
Establishing Free, Relevant Content
Your site won’t rank very well if it only contains one page filled with sales pitches for your product or service. The Internet is about information, and to improve your ranking you need to offer free information on your chosen topic to your visitors. Some people do this in the form of articles from themselves or other authors. The more content that you have, the more likely people are to find something valuable on your site, hence the more likely you are to rank well in search engine results.
Which brings us back to why you should pursue a high Google ranking. The reason lies in the tactics mentioned above. Simply put, the process of establishing links to your website from other quality sites related to your topic, and the fact that your site has quality content on it, will greatly improve your chances of making sales to your customers through your website, regardless of whether or not you rank well on Google. Your efforts will be well placed if you focus on links and content. A number of quality inbound links can generate significant, highly targeted traffic to your website even without a high ranking search engine listing.
Conclusion
The effects of pursuing inbound links and relevant content on your website’s traffic will be positive, regardless of your ranking on Google or any other search engine. So go ahead and try to drive your site as high as possible in the Google ranking. You will certainly build a solid foundation for your website along the way.
Author Bio:
Will Dylan is the Author of ‘Small Business Big Marketing’ a powerful e-book for small businesses available through his website http://www.marketingyoursmallbusiness.com/. Will also offers article and news release writing services. You can contact Will at askwill@marketingyoursmallbusiness.com
Banner Exchanges – Are They Useful?
For those who don’t know, a banner exchange is a service that offers to show your banner ad on other sites if you show their ads on yours. They often have thousands of sites in their networks, so you can get your banner shown in a lot of places. They will offer an “exchange” rate – the most common is probably 2:1. This means that for every two banners you show on your site, they will show one of yours on other people’s sites. You can get better exchange rates, such as 4:3 or 5:4.
On the surface, they seem like a good idea and, because they are free to join, there is no real need to be critical. However, it’s important that you have a good idea of the kind of response that you are likely to receive.
For a new site, they are pretty pointless. If you take into account that you are likely to receive at most a 1% click-through rate on your banner (and it will usually be much less), they are not going to drive much traffic to your site. Using the above exchange rate of 2:1, you would need to show 200 banners on your site to get one new visitor: showing 200 banners earns you 100 showings of yours, and at a 1% click-through rate, that’s one visitor. Say you get 50,000 pageviews per month on your site; you would get your banner shown 25,000 times on other sites and gain 250 new visitors. This looks OK, but a 1% click-through rate is often hard to achieve, so treat this as a maximum.
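The arithmetic above is easy to parameterize. Here is a minimal sketch (the function name and defaults are mine, using the article's 2:1 exchange rate and 1% click-through rate):

```python
def banner_exchange_visitors(pageviews, exchange_ratio=2.0, ctr=0.01):
    """Estimate monthly visitors from a banner exchange: the impressions
    you earn (pageviews / exchange ratio) times the click-through rate."""
    impressions_earned = pageviews / exchange_ratio
    return impressions_earned * ctr

print(banner_exchange_visitors(200))      # 200 pageviews -> 1.0 visitor
print(banner_exchange_visitors(50_000))   # 50,000 pageviews -> 250.0 visitors
```

Remember that 1% is a best case, so treat these numbers as an upper bound rather than a forecast.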
The other issue is that banner exchanges are often not very well targeted. This means that your banner may not be shown on sites that match yours. Therefore, the visitors you do get may just be surfers rather than potential clients. Most banner exchanges these days do offer some form of targeting, but it’s never going to be as targeted as it would be if you simply approached a site and offered to exchange banners at a 1:1 rate.
Having said all that, they do have some good points. Firstly, they are free which is always a good thing. Secondly, they will get you a bit of extra traffic assuming you already have some. Thirdly, they do offer some value in terms of branding – even though only one person in a hundred may click on the banner, many more will actually see it and, if you have a good banner, remember your name.
To sum up, Banner Exchanges are of value as part of your overall marketing strategy but are generally not the way to “get your site going”.
Author Bio:
Sean Burns is the author of the WebmastersReference.com Newsletter – http://www.webmastersreference.com/newsletter. More than five years of experience in site design, marketing, income generation, search engine optimization and more is passed on to subscribers – hype free. Sign up today to get real information of real value to webmasters.
Pay Per Click Advertising
If you are using pay per click advertising, I don’t need to tell you that it can get very expensive if you attract a lot of unnecessary click-throughs. In this article I will explain how to screen your visitors before they click through, and how to apply that screening to your pay per click advertising campaign.
How It Works
To minimize the amount of unnecessary click-throughs, we are going to talk about a screening technique that is used in copywriting. A good copywriter has the ability to screen the serious individuals from the test pilots before the sale is initially made.
By using this screening technique you will dramatically decrease the number of refunds you would otherwise have to issue. In this case, you need to be specific about your product or service without giving too many details; this will eliminate unnecessary click-throughs.
When it comes time to develop an ad that best describes your offer, you need to use precise wording. If you use any ambiguous words, phrases or statements in your ads, you will confuse the viewer, making them either click through or leave. Keep in mind that every click-through is costing you money, so you need to make sure that you are targeting your market and that each of your visitors is qualified.
Applying The Headline
When placing a pay per click advertisement, there are two things you need to pay attention to: the headline and the description. The headline is used to grab the reader's attention, build curiosity and force them to read on. The difficult part is that pay per click ads only allow you a limited number of characters, usually up to 50. Your attention-grabbing headline will end up being only three or four words. You need to make your headline jump out at the viewer, but at the same time, you need to be specific.
One of the biggest mistakes I often see is that people use their business name as the headline of their pay per click advertisement. A business name is not going to grab attention or motivate anyone to read the description. For example, let me ask you which headline would grab your attention and motivate you to read the description: “Elites Marketing” or “Earn $47 – $270 Per Sale”? Do you see how much more specific the second headline is?
Applying The Description
As far as the description goes, you have a little more to work with, unless you are using Google’s AdWords. Google’s AdWords gives you two lines, and each line allows only up to 35 characters. You will need to be as specific and descriptive as you can. The description is crucial, and it will determine whether or not your visitor clicks through.
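Those limits are easy to trip over by accident, so it can help to sanity-check ad copy against them before submitting. A minimal sketch, using the 50-character headline and two 35-character description lines described above (the function name is mine):

```python
def check_adwords_ad(headline, line1, line2):
    """Return a list of limit violations for an ad: a ~50-character
    headline and two 35-character description lines, as described above."""
    problems = []
    if len(headline) > 50:
        problems.append(f"headline is {len(headline)} chars (max 50)")
    for i, line in enumerate((line1, line2), start=1):
        if len(line) > 35:
            problems.append(f"description line {i} is {len(line)} chars (max 35)")
    return problems

# An empty list means the ad fits within the limits.
print(check_adwords_ad("Earn $47 - $270 Per Sale",
                       "Snowball in cash by promoting",
                       "info-marketing products. Join Free!"))
```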
Let me give you another example. Which of these descriptions is precise in wording and descriptive enough to screen your visitor: “You can join our Two Tier Associate Program at no cost or obligation”, or “Snowball in cash by promoting info-marketing products. Join for Free!”? I hope you picked the second description!
The first description, “You can join our Two Tier Associate Program at no cost or obligation”, is vague and wide open. It does not describe what kind of product or service visitors would be promoting, or what kind of associate program is being offered: pay per lead, pay per click, pay per sale, or two tier. You don’t want to use a description that is too vague; that is how you get a lot of unnecessary click-throughs.
On the other hand, the second description, “Snowball in cash by promoting info-marketing products. Join for Free!”, is clear and concise. Even though the description did not say what kind of associate program it was, the headline made it clear: “Earn $47 – $270 Per Sale.” Moreover, I was able to tell my visitors that they would be promoting information marketing products and that joining was free. I was also able to hit them with a couple of psychological triggers: “Snowball” and “Cash”.
Conclusion
To screen your visitors more effectively, you need to choose keywords that are relevant to your product or service and that target your market. If you select keywords or phrases that are too general, you will still get a lot of unnecessary click-throughs. You can only screen so much, so don’t select inappropriate keywords or phrases when starting your pay per click advertising campaign. Take your time and brainstorm for the keywords and phrases that best describe your product or service.
Author Bio:
Rich Hamilton, Jr is the CEO/President of http://www.elitesmarketing.com/. You can start earning cash today by joining our FREE Two Tier Associate Program and make $45 – $270 per sale http://www.elitesmarketing.com/assoc/.
Big Site? Make The Most Of It On Google – Intro.
For many people a five or six page web site is all they need or want, but for others, selling services and products on the internet, a hundred-page site is barely adequate. If you’re one of those companies, here are some tips on making the most of your site on Google and other deep-crawling search engines.
One of the sites we manage is 520 pages packed with content and informative articles. It has some 10 to 12 levels of pages in its structure, and we became aware that Google had indexed only 126 of those 520 pages. What was going on?
Maximising Each Page
We worked diligently with the web site owner to optimise each page, ensuring it had a unique title and that the page content was rich in the KEYWORDS for that topic. For instance, if you’re trying to get onto Google with ‘content management systems’ and the phrase ‘content management systems’ does not appear on the page in HTML text, then you won’t hit the top 1000! Similarly, if the phrase ‘content management system’ appears often, you will still fail, because Google sees ‘system’ and ‘systems’ as two totally separate words.
Remember, each page has a TITLE tag, META description, META keywords and ALT tags. If you, or the designer, simply duplicated another page as a template during the design, all your pages will have the same attributes as far as the spider is concerned.
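As a sketch, unique per-page attributes might look like this (all values here are hypothetical examples, not the actual site's markup):

```html
<head>
  <title>Content Management Systems for Small Business</title>
  <meta name="description" content="Hosted content management systems for small business web sites.">
  <meta name="keywords" content="content management systems, cms">
</head>
<body>
  <img src="cms-screenshot.jpg" alt="Content management system admin screen">
</body>
```

Every page should carry its own values here, rather than the template's.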
Spider Depth
Most spiders do not index below level 3, and therefore they may never find what could be very important pages. In addition, we noticed with Google PageRank that the index page was 5/10, a level 2 page on the same site was 4/10, and a level 3 page was 3/10. Presumably, pages beyond level 3 are considered so insignificant that the spider has been programmed to ignore them.
In addition, the spider was stopping dead at drop-down menus and graphic links it could not move beyond. Spiders essentially follow HTML text links, and that’s about it. If you stick to that rule, you won’t go too far wrong.
Our challenge was therefore to bring every page, no matter where it sat in the site, to at least a level 3 position, without changing the structure of the site itself, so the spider would index it and the page ranking would be higher. This would give us 520 marketable, optimised pages rather than 126.
The solution was quite simple: a site map. We simply spent a few hours setting up a site map with a link from the index page (making the site map level 2) and then an HTML text link to every page on the site, making every page at least level 3.
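The site-map approach above can be sketched as a small script that emits one plain HTML text link per page, so every page sits one click from the site map (the URLs, titles and function name here are all made up for illustration):

```python
def build_sitemap(pages):
    """Build a simple HTML site map body: one plain HTML text link per
    page, so spiders can reach every page one click from the index."""
    links = "\n".join(
        f'<li><a href="{url}">{title}</a></li>' for url, title in pages
    )
    return f"<ul>\n{links}\n</ul>"

pages = [
    ("/articles/cms-basics.html", "Content Management Basics"),
    ("/articles/spider-depth.html", "How Deep Do Spiders Crawl?"),
]
print(build_sitemap(pages))
```

The point is that the links are ordinary HTML text links, with no JavaScript menus or image maps for a spider to choke on.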
The next time the site was spidered by Google, there it was: 520 content-rich, optimised pages and an increase in traffic of 1000%.
Big sites, make the most of them, don’t keep your content hidden under a bushel!!
Author Bio:
John Saxon is technical director of site-pro limited, a site offering free tips, tools and articles for web site optimisation.
To Gogle, Or Not To Gogle?
A while back I was posting an article submitted by one of our regular authors on LilEngine.com and I did it a bit faster than I normally would. I was in a hurry to catch an appointment and was already running late. While posting the article I tripped on something that would change my view on mistakes forever.
In my haste I made a mistake in the article’s title: yes, a misspelling, just as I have purposely misspelt ‘Google Search Engine’ as ‘Gogle Search Engine’ in the title of this article to clue you in on its content. Weirdly enough, the mistake in the title slipped by unnoticed, and the article eventually got pushed off the homepage and into our archives.
If you follow the course of your content pages after posting them, they usually go into hiding for a few days and then resurface with varying placement, depending on their content and other variables. Every now and again, some page on your site will attract large amounts of traffic compared to your other pages. Such a page inevitably grabs your attention, and this is what happened to me.
I noticed a large increase in the daily traffic to http://web.archive.org/web/20040605194628/http://www.webmoves.net/blog// and started analyzing the logs to find the culprit. I describe it as a large increase in traffic, as opposed to a spike, because the gain did not suddenly appear and then disappear. It was a stable increase, funneled to our site by the Gogle Search Engine :-). It turned out that the page responsible for this flood was the very page with the mistake, and it was showing up as #1 in Google for the keyword misspelling.
Finding Common Misspellings
It is pretty easy to come up with misspellings for your targeted keywords; however, incorporating them into your content may not be as easy. With a little imagination you can come up with several methods to keep your content legitimate for both your users and the search engines.
Using the Overture Keyword Suggestion Tool and Google search results, you can decide which misspellings get the most searches and which are highly competitive, and hence which ones would be worth your while to optimize for.
Here’s how you do it.
Use the Overture Keyword Suggestion Tool to see how many searches there are for the misspelling. If that number is satisfactory, do a Gogle search for the misspelling and see how many results come back. If that number is too high, there may be too much competition for the keyword, and you might want to try another.
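Generating candidate misspellings to feed into that research can be automated. Here is a minimal sketch; dropped letters and adjacent-letter swaps are just two common classes of typo, and the function name is mine:

```python
def misspellings(word):
    """Generate simple one-character misspellings: single-letter
    deletions (e.g. 'google' -> 'gogle') and adjacent-letter
    transpositions (e.g. 'google' -> 'googel')."""
    deletions = {word[:i] + word[i + 1:] for i in range(len(word))}
    swaps = {word[:i] + word[i + 1] + word[i] + word[i + 2:]
             for i in range(len(word) - 1)}
    # Drop the correct spelling, which a transposition of a doubled
    # letter can reproduce.
    return sorted((deletions | swaps) - {word})

print(misspellings("google"))
```

Each candidate would then be checked for search volume and competition as described above.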
Summary
People will always make mistakes, and those mistakes will include misspellings. If you can reach out further to your target market, in a tasteful manner, by incorporating words they may misspell while trying to find your site, then gearing some pages of your website toward misspellings is worth considering when optimizing your site.
Author: A Duncan
www.lilengine.com
SEO Defined
The true definition of Search Engine Optimization (SEO) can be stated as a highly specialized process of building a successful website. We say successful because if a commercial website cannot be found in the major search engines, it is not successful; it just isn’t doing its job. SEO directly addresses the need for a website to attract new and targeted visitors, who in turn will convert into buying customers.
Fact: about 90% of all new visitors to a web site find it using major search engines such as Google, or search directories such as Yahoo!. SEO professionals and specialists alike are accustomed to acquiring top-level search engine results for their clients. Priority placement and top rankings are a daily, ongoing effort when you are a professional SEO specialist. The SEO field as a whole needs to keep up with ever-changing technology, and also has to be familiar with other web site marketing techniques such as banner ads, pay-per-click programs and even some offline advertising options.
After careful industry study, every survey has taught us that each web site competes in a different, segmented market. Some markets are less competitive and others more so, but each and every website targets very different customers in very different markets. Some have a more local scope or orientation; others target national or global markets. In Rank for $ales’s own experience, the classic “one size fits all” promotional packages usually amount to “one size fits none”. We have been involved in the professional Search Engine Optimization field since 1997.
Our initial approach is to custom-design a unique Internet marketing strategy that will work for all our clients’ sales goals, market segments and their advertising budgets. Rank for Sales works as an important team inside our client’s sales operations. We diligently work with every department and all concerned to reach their common goals.
Traditional advertising in the brick-and-mortar version of any business model is a proven method of promotion. It has worked since the roaring twenties and continues to work today. The Internet is certainly no exception, but achieving a high return on investment (ROI) for your online advertising dollar can be a very expensive trial-and-error proposition if not done correctly. A good professional SEO firm can help you choose the most effective programs for your needs, the programs that will work best for your company.
Author:
Serge Thibodeau of Rank For Sales
Introduction
With the Robots.txt protocol, a webmaster or web site owner can really protect himself if it is done correctly. Today, web domain names are certainly plentiful on the Internet. There exists a multitude of sites on just about any subject anybody can think of. Most sites offer good content that is of value to most people and can certainly help with just about any query. However, like in the real world, what you see is not always what you get.
There are a lot of sites out there that are spamming the engines. Spam is best defined as search engine results that have nothing to do with the keywords or key phrases that were used in the search. Enter any good SEO forum today and most daily spam threads point to hidden text, keyword stuffing in the meta tags, doorway pages and cloaking issues. Thanks to newer and more powerful search engine algorithms, the domain networks that spam the engines are increasingly being penalized or banned altogether.
The inherent risk of getting a web site banned on the basis of spam increases proportionately if it appears to have duplicate listings or duplicate content. Rank for $ales does not recommend machine-generated pages because such pages tend to generate spam. Most of those so-called ‘page generators’ were never designed to be search engine-friendly, and no attention was given to the engines when they were built.
One major drawback of these ‘machines’ is that once a page is ‘optimized’ for a single keyword or key phrase, first-level and at times second-level keywords tend to flood the results with listings that will most assuredly look like 100% spam. Stay away from any of those so-called ‘automated page generators’. A good optimization process starts with content written entirely by a human! That way, you can be certain that each page of your site ends up absolutely unique.
How Do Search Engines Deal With Duplicate Content?
Modern crawler-based search engines now have sophisticated and powerful algorithms specifically designed to catch sites that spam the engines, especially ones that make use of duplicate domains. To be sure, there are perfectly legitimate reasons for a web site to run more than one domain. However, as the following example will clearly demonstrate, that is not always the case.
Consider a practical example in which three identical web sites, all owned and operated by the same company, make evident use of duplicate content. Google, Alta-Vista and most other crawler-based search engines have noticed and indexed all three domains. In this scenario, the right thing to do is to use individual IP addresses and implement a server redirect command (a 301 redirect). An alternative would be to at least provide unique folders or sub-directories and use the Robots.txt exclusion protocol to disallow two of the three affected domains.
That way the search engines would not index the two duplicate sites. In such cases, the Robots.txt exclusion protocol should always be used; it is in fact your best ‘insurance’ against getting your site penalized or banned. Since that was not done in this example, we will look at the duplicate content and assess where the risk of a penalty is highest. We will describe the indexing of these three sites as site one (the main, primary domain), site two and site three.
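As a sketch of the Robots.txt alternative described above, each of the two duplicate domains (site two and site three) could serve its own exclusion file that keeps all compliant spiders out entirely; the syntax is the standard exclusion protocol, but the scenario itself is hypothetical:

```
# Robots.txt served at the root of each duplicate domain
# (site two and site three). The primary domain keeps its
# normal Robots.txt and remains fully crawlable.
User-agent: *
Disallow: /
```

Because `Disallow: /` blocks every path, compliant engines will drop or never index the duplicate domains, leaving only site one in the results.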
The four major crawler-based engines analyzed were Google, Teoma, Fast and Alta-Vista. All three domain names point to the same IP address, which made it simpler to use Fast’s Internet Protocol filter to confirm that there were no more than three affected domains in this example. However, all three web sites are directed to the same IP address AND the same content folder! That scenario makes them exact duplicates, raising all the duplicate content flags in all four engines analyzed.
Even though all three sites share the same Robots.txt file, the hosting arrangement and the syntax in that Robots.txt file do nothing effective to help this duplicate content problem. Major spider-based search engines that rely heavily on hypertext to compute relevancy and importance, as most do today, are best at discovering and dealing with sites that have duplicate content issues. As a direct result, a webmaster runs a large risk of a duplicate content penalty in these engines, because their algorithms make it a simple task to analyse, sort out and finally reject duplicate content web sites.
If a ‘spam technician’ discovers duplicate listings, chances are very good they will take action against the offending sites. The chances increase further when a person, often a competitor, files a spam complaint alleging that a certain site is ‘spam-dexing’ the engines. To be sure, pages built on duplicate content can improperly ‘populate’ a search query, with the end result of unfairly dominating most search results.
Marketing Analysis And PPC “Landing” Pages
In order to better analyse specific online marketing campaigns or surveys, some companies at times operate duplicate sites or PPC (Pay-per-Click) landing pages. In such cases it is important not to neglect the Robots.txt exclusion protocol to manage your duplicate sites. Disallow spiders from crawling duplicate sites by writing the right syntax into the Robots.txt file. Your index count will certainly decrease, but that is the right thing to do, and you are actually doing the search engines a service. In such a case, a webmaster need not worry about impending penalties from the engines.
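The Robots.txt syntax for such landing pages can be checked before deployment. The short sketch below uses Python’s standard `urllib.robotparser` module; the `/landing-pages/` directory and the user agent are assumptions made purely for illustration:

```python
# Verify that a Robots.txt rule really blocks compliant spiders
# from a hypothetical /landing-pages/ directory of duplicate
# PPC pages, while leaving the rest of the site crawlable.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /landing-pages/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Duplicate landing pages are excluded for every crawler...
print(parser.can_fetch("Googlebot", "/landing-pages/offer-a.html"))  # False
# ...but normal content remains crawlable.
print(parser.can_fetch("Googlebot", "/products/index.html"))         # True
```

A check like this only models what a well-behaved spider will do; it does not, of course, stop a non-compliant crawler from fetching the pages anyway.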
If these businesses or their marketing departments are running marketing tests or surveys, there is usually more than one domain that could potentially appear in the actual results pages of the engines. In such cases, I strongly recommend rewriting all the content and making certain that no real duplicate content gets indexed. One way to achieve that is to use some form of meta refresh tag or JavaScript solution to direct visitors to the most recent versions of the pages while the webmasters get the Robots.txt exclusion protocol written correctly.
The JavaScript would effectively indicate where it is intended to redirect, assuring it can put the final document in its proper place. A ‘301 server redirect’ command is always the best thing to use in these cases and constitutes the best insurance against any penalties, as it informs the search engines that the affected document(s) have moved permanently.
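On an Apache server, such a permanent redirect can be declared with a single mod_alias directive; the sketch below assumes Apache hosting, and the domain name is hypothetical:

```
# .htaccess (or virtual host) of each duplicate domain:
# permanently (301) redirect every request to the primary domain,
# telling the engines the documents have moved for good.
Redirect permanent / http://www.primary-domain.example/
```

Unlike a meta refresh or JavaScript redirect, the 301 status is sent in the HTTP response itself, so every major crawler understands it.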
Author:
Serge Thibodeau of Rank For Sales
Content Management Software & SEO
A lot has been said about Content Management Systems (CMS) in the past. Since search engine marketing (SEM) and search engine optimization (SEO) are now such a growing part of the main marketing objectives for companies of all sizes, extra attention and caution must be exercised before purchasing any such CMS software or programs.
When people today are looking to buy a certain product or service, or need information on a specific subject, most use the power, speed and flexibility of Internet search engines. However, in the last two years, some companies have been developing their web sites and creating new or additional content with some of those so-called CMS packages. Some of the CMS programs available today may in fact negatively affect the visibility of your web site in the SERPs (Search Engine Results Pages).
The main reason is that some of them were not designed with search engines in mind. They were designed for what they are supposed to do: help manage the contents of certain documents! That said, how can a company that has made extensive use of such CMS software reasonably assure itself that its site will do well in the search engines?
A well-designed CMS solution that gave good attention to search engines at design time can contribute to SEO work by standardizing a site’s presentation, its labelling of content and, to a certain degree, its structure, all of which are crucial to most search engines when evaluating how a particular page or site section should rank.
If a CMS package was truly designed with search engines in mind, it can be helpful for updating and maintaining certain dynamic sites. But even with such features present, there can still be areas where it negatively affects some of the best SEO strategies, and Rank for $ales advises caution in such instances. Some CMS software packages cost a lot of money, so be careful when shopping around and don’t be afraid to ask pointed questions, especially as they pertain to search engine visibility.
Whatever CMS software package you are analyzing, make certain it does not create its own title tags, which are almost always off-topic. A good example of this would be a steel products manufacturer using a CMS program that writes title tags such as: Page 1, Page 2, Page 3, etc. Such title tags are completely useless to search engines. In this particular example, if the CMS program is well-designed for the engines, it will let YOU write informative and useful title tags such as: steel staircases, steel shelves, steel accessories, etc.
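In plain HTML, the difference is simply who writes the head of each page. For the (hypothetical) steel manufacturer above, the contrast looks like this:

```
<head>
  <!-- CMS-generated, useless to the engines: -->
  <!-- <title>Page 2</title> -->

  <!-- Hand-written, descriptive and keyword-relevant: -->
  <title>Steel Staircases</title>
</head>
```

A CMS that is friendly to the engines simply exposes that title field to you instead of filling it in with a page number.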
This detail may seem negligible to some people, but it is very significant information for the engines and will make a drastic improvement in the results. An additional benefit of such standard SEO techniques is that they make maintaining the site much easier and faster. One problem with some CMS vendors is that they don’t see any of this as their concern; some don’t even identify it as a problem. It would be almost effortless for a CMS designer or vendor to make its product search engine-friendly, as all the technology and the proper resources are already there. Part of the problem stems from the fact that the real needs are not being communicated effectively by most end-users and customers of these CMS programs.
PPC And Paid For Search
Today, the PPC and paid-for-search industry is growing rapidly. It offers businesses and companies a results-driven, value-added complement to search engine marketing techniques. PPC (Pay-per-Click) services allow web sites to bid on keywords and key phrases to be posted as sponsored links next to results pages. Companies can also pay to have their sites included in search engine catalogues. Using such a formula, dynamic web sites don’t have to make many modifications to their basic structure and can still yield a significant ROI.
As stated previously, there can be benefits and advantages to using certain, selected CMS products. However, when it comes to today’s major search engines, make certain you choose a product that is truly ‘search engine compliant’. Additionally, always remember that some of them might still pose problems as far as SEO is concerned. In light of all this, it is clear that CMS vendors, search engines, web site owners and professional search engine optimization companies must work together to make the whole process as rewarding as it can be, and in the most diligent fashion.
Author:
Serge Thibodeau of Rank For Sales