has 13+ years of experience in web development, ecommerce, and internet marketing. He has been actively involved in the internet marketing efforts of more than 100 websites in some of the most competitive industries online. John comes up with truly off-the-wall ideas and has pioneered some completely unique marketing methods and campaigns. John is active in every single aspect of the work we do: link sourcing, website analytics, conversion optimization, PPC management, CMS, CRM, database management, hosting solutions, site optimization, social media, local search, and content marketing. He is our conductor and idea man, and has a reputation for being a brutally honest straight shooter. He has been in the trenches directly and understands what motivates a site owner. His driven personality works to the client’s benefit, as his passion fuels his desire for your success. His aggressive approach is motivating, his intuition for internet marketing is finely tuned, and his knack for link building is unparalleled. He has been published in books and numerous international trade magazines, been featured in the Wall Street Journal, sat on the boards of trade associations, and has been a spokesperson for Fortune 100 corporations including MSN, Microsoft, eBay and Amazon at several internet marketing industry events. John is addicted to Peet’s coffee, loves travel and golf, and is a workaholic except on Sundays during Steelers games.
Introduction
For many, the value of a website can be measured in visitors; for others, it is the amount of revenue the site generates. Regardless of how you measure the success of your website, you are going to have to bring in visitors who are looking for the information, content, and/or products you provide. So how do you do this?
For many businesses, the yellow pages are a first choice for promotion and marketing. A standard yellow page ad runs $1,200 or more per year for a smaller ad and serves only a local market. You also have to consider the cost of ad development, which can often run into the thousands of dollars.
While this may be an essential form of advertising for many businesses which serve only local areas, it is certainly not the only one.
On the Internet there are many forms of advertising that have been, and are being, used to bring traffic to websites. From banner ads and spam email to PPC and natural search engine listings, there are countless methods for promoting your website online. So how do you choose which marketing tactic to use?
One thing to consider is that over 80% of all Internet traffic comes from search engines, with Google currently responsible for the vast majority of that. With such an overwhelming amount of traffic coming from a single, identifiable source, it makes sense to put a lot of weight on what traffic from this source can mean for your website and for your business.
In this article we will explore the difference between sites that should consider search engine placement a viable part of their marketing strategy and those that would benefit little from top placements. We will also look at ways to ensure that you are maximizing the effect, and the potential return on investment, of your search engine placement campaign should you choose to go that route.
Who Should Consider Search Engine Placement
An ethical search engine placement firm will tell a client honestly whether search engine optimization will benefit their website. Of course, not every company is ethical, and besides, how do you know whether it will benefit your website until you have already undergone search engine optimization? By that point you have paid the firm and they have done their job, whether it helped you or not.
There are a few things you should consider before you apply search engine placement tactics to your website or hire a search engine placement firm. I include “apply search engine placement tactics to your website” because, even though doing it yourself may be “free,” it can detract from the overall visual appeal of your site if not done correctly, and it is a very time-consuming process both to learn and to perform.
Questions You Should Ask Yourself
What do I stand to gain from higher search engine traffic? Search engine optimization has a built-in cost: either time and resources if you do it yourself, or money if you hire a search engine placement firm. You have to ensure that what you hope to accomplish will be at least equal to, and preferably greater than, the cost in time and money it will take to succeed. For more information, read the section below, “So You’re Going To Market Your Website On The Search Engines – Now What?”, for tips on establishing whether there is truly enough traffic to be had to make your efforts worthwhile.
Can I compete? The short answer to this question is almost always “yes,” but there are many factors to consider. While any site, with enough work, can rank well, you have to consider whether the effort will be worth it. For example, if you own a small computer store in Michigan that repairs computers and troubleshoots software issues, it is theoretically possible for you to rank your website well for the term “Microsoft Windows”. To do so would require an ENORMOUS amount of both time and money. So you have to ask: is it really worth investing years of time and money in this one ranking? The answer is “probably not,” but that doesn’t mean search engine placement would not benefit you, just that those keywords are not worth targeting. There are suggestions for choosing the right keywords below.
Do I make money from my website traffic? This question isn’t the end-all-be-all, but it’s certainly relevant if you’re considering hiring a search engine placement firm to optimize your site. If you plan on going it alone, you may be willing to put in the effort for no monetary return, simply for the “fun” of it. If you’re planning on spending your hard-earned money on a search engine placement firm, however, you have to make sure it is in your economic best interest, whether the return comes from direct product/service sales or from the sale of advertising on your site.
If, after answering these questions, you have determined that search engine placement is indeed a good choice for your marketing strategy, your next task is determining exactly which tactics will produce the greatest return on investment for your efforts.
So You’re Going To Market Your Website On The Search Engines – Now What?
Now that you have determined that search engine placement is an avenue of marketing that can produce beneficial results for your site and your business, you have to decide on a “plan of attack”. Many SEO firms will “help” you determine keywords to target, and some will even build links for you from “valuable” sites. In many cases they may be entirely truthful, but how do you know?
Choosing the keywords to target is probably the most crucial step of the entire search engine optimization process; it will determine the success or failure of your promotions. Even if you attain all the top placements you targeted, if you targeted the wrong terms those rankings will produce little or no results. So how do you determine for yourself which keywords to target?
Without getting into anything too technical there are a couple of great resources out there to help you isolate the keywords that you should target.
The Overture Search Term Suggestion Tool
Advantages – The Overture Search Term Suggestion Tool is free and produces a large number of results for related searches. Disadvantages – There are two main disadvantages to the Overture Search Term Suggestion Tool for determining your keywords. The first is that it puts everything in singular terms and corrects misspellings, when you may want to know how many people searched for a misspelled term. If you run a shopping site that sells gifts, you will not be able to determine whether the most-searched phrase was “gift” or “gifts”. The second major disadvantage is that it doesn’t give you alternative search phrases that are related but that you might not have thought to punch in (“presents” for example). Which brings us to the second tool.
WordTracker
Advantages – WordTracker addresses all the disadvantages noted about the Overture Search Term Suggestion Tool. It differentiates between singular and plural and will allow you to search for misspellings. Further, it searches a thesaurus and will make a number of suggestions for other terms you may not have thought of but which may be related to your industry. It will then analyze the variety of terms that you have searched and chosen and actually make recommendations on which keywords to target based on the number of competing pages and the specific search engine you are targeting. It will also give you a predicted number of searches per day for each phrase.
Disadvantages – The only real disadvantage to WordTracker is that it has a cost. There is a free trial on the site which you can use though the results it produces are far lower in numbers and it does not give you information on all of the major search engines. Certainly worth checking into even if you only try the demo mode (not a download – this is an online resource).
So You Now Have A List Of Possible Keyword Phrases – Now What?
The next step is to determine whether you can compete with those currently holding the top positions and, more importantly, whether it will be economical to do so. The first place to look when you are trying to determine this is the search engines themselves. Let’s assume for a second that you have determined that there are a good number of searches for your product and that the main keyword phrase you would like to target is “acne cures”. The next step is to run a search for “acne cures”. Most people are interested primarily in their Google rankings and so you would run that search on Google producing the following results:
http://www.google.com/search?hl=en&lr=&ie=UTF-8&oe=UTF-8&q=acne+cures.
The site currently holding the #4 position is www.approvedcures.com. Take a look at the site. There are two major things that you will first be looking for:
Is this a large site with a lot of content related to the search phrase? In this case the answer is “yes”. They have a very large site and all of it is related to the topic the search was for.
How many incoming links do they have? You will now want to find out what sort of link popularity you will be competing with. Links are not the end-all-be-all of search engine optimization; in fact they are just one of a large number of deciding factors. However, links help determine your Google PageRank, and that PageRank is a final multiplier in determining your position. Because the number of incoming links is easily determined, it is something you should look into. To determine the number of links, simply enter “link:www.domain-in-question.com” into the Google search bar. In this case it would be http://www.google.com/search?hl=en&lr=&ie=UTF-8&oe=UTF-8&c2coff=1&q=link%3Awww.approvedcures.com. They do not have a large number of incoming links, so this is not going to be a significant factor in determining their ranking.
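If you are sizing up several competitors at once, a tiny script can build these “link:” queries for you. This is just a convenience sketch (the domain list is hypothetical); it constructs the same URLs you would otherwise type by hand:

```python
from urllib.parse import quote_plus

# Hypothetical list of competing domains whose link popularity you want to check.
competitors = [
    "www.approvedcures.com",
    "www.example-competitor.com",
]

for domain in competitors:
    # Builds the same "link:" query you would type into Google's search bar.
    query = quote_plus(f"link:{domain}")
    print(f"http://www.google.com/search?q={query}")
```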
And Now You’re Ready For Search Engine Placement
At this point you should have a pretty good idea as to whether search engine placement is a viable choice for your website promotions, which keywords you should target, and what competition you will be facing.
Assuming that your website needs to be optimized, you will now be faced with the choice of doing it yourself or hiring a search engine placement firm.
If you will be doing your own optimization I would highly recommend reading an article by the CEO of StepForth Search Engine Placement entitled “A 10 Minute Search Engine Optimization”. It can be found on the StepForth website at http://news.stepforth.com/2003-news/ten-minute-optimization.shtml.
If you will be hiring a search engine placement firm, I would recommend first submitting your site for a free website review from our homepage at http://www.stepforth.com. There is no obligation, and it will give you a very good idea of which areas need to be addressed.
Author Bio:
Dave Davies is the marketing manager for StepForth Search Engine Placement Inc. Visit them online at http://www.stepforth.com or email him at dave@stepforth.com.
Search Engine Optimization Basics
This is the latest article in the “Back to Basics” series. Previous articles have covered the importance of search engine marketing (SEM), effective keyword research, title tag formats, Meta tag use, and submissions. In this article, we take a look at changes you can make to the content of your site to further improve search engine positioning.
Over the past few months, search engine optimization (SEO) has become more mainstream, with many companies considering this form of marketing for the first time. The amount of information on the topic of SEO has increased dramatically, with many new authors stepping forward to pen guides that explain how to optimize a website. Yet even with this increased awareness, I am still amazed by the number of business owners who believe that tweaking titles or adding keywords to Meta tags is all that is needed to increase search engine visibility.
Optimizing Your Page Content
In previous articles, I have endeavored to provide a beginner’s guide to making these changes; now it’s time to turn our attention to perhaps one of the most important aspects of any SEO campaign, optimizing your page content. The only problem with this topic is, where do I start? There are so many changes that can be made to a web page’s content that I could easily fill ten articles on the subject, so you can see my dilemma in trying to condense my advice into just one single piece. But that is what we shall do; after all, this is a ‘Back to Basics’ series.
So, where do we start? What is the most important change a Webmaster can make to a page in order to improve search engine positioning? To find the answer, we simply go back to the very first article in this series, where we discussed effective keyword research. When researching your industry, competitors and most requested search terms, you identified the keywords that are the most regularly used by your target audience. You’ve used them in your title and Meta tags, but their most important use is on the actual page content, the text you display on the pages you are trying to get positioned.
Include Your Targeted Search Terms
So many times, I have seen web sites that fail to mention any of the search terms they are trying to achieve rankings for. They’ll have lots of graphics and may also have good levels of text on the page, yet the company still fails to include the exact phrase that is important to them. For example, if you’re trying to achieve rankings for the term ‘desktop computer supplies,’ make sure your content has that exact phrase present in it. It is of little benefit to say something along the lines of, ‘The best selection of accessories for your home computer’ when trying to target ‘desktop computer supplies.’ While you may pick up points for having text that is on the same theme, you won’t achieve your best search engine rankings unless you include liberal occurrences of the exact phrase you are trying to target.
Checking Keyword Density
Your next question is likely to be, ‘How often should I mention each search term?’ A well-optimized page should include at least 250 words of text. Within that text, aim for a frequency of 5-15% for the term you are trying to target. Not sure how to calculate search term frequency? Check out www.searchengineworld.com/cgi-bin/kwda.cgi, a great little tool that will show you the keyword density of each one-, two- and three-word phrase on any page within your web site. Make sure that you place your most important search terms in text located towards the top of your page, and try not to target more than five phrases within any block of text (the more phrases you try to target, the more text you need to achieve a high frequency).
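If you would rather check the numbers yourself, keyword density is simply the number of words accounted for by occurrences of the phrase divided by the total word count. Here is a minimal sketch of that calculation (the sample text and phrase are made up for illustration):

```python
import re

def keyword_density(text: str, phrase: str) -> float:
    """Return the percentage of words accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words or not phrase_words:
        return 0.0
    # Count how many times the phrase appears as a consecutive run of words.
    hits = sum(
        words[i:i + len(phrase_words)] == phrase_words
        for i in range(len(words) - len(phrase_words) + 1)
    )
    return 100.0 * hits * len(phrase_words) / len(words)

sample = ("We stock desktop computer supplies for home and office. "
          "Our desktop computer supplies ship the same day.")
print(round(keyword_density(sample, "desktop computer supplies"), 1))  # about 35% for this tiny sample
```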
Also look for opportunities to make links out of search terms located within your page text. In the example of ‘desktop computer supplies,’ consider making one of the occurrences of this phrase a hyperlink to the most relevant page within your website; it will give you a little push in your ranking efforts.
The Impact Of Keyword Proximity
If you’re unable to include the exact phrase within your page text, which can often happen when the targeted search term doesn’t occur in normal syntax, try at least to keep the words in close proximity. For example, you could use ‘Discounted supplies for desktop computers.’ While it is not as valuable as including the exact phrase, it at least contains the targeted words, albeit in a different order. The search engines, while preferring to display pages that match search terms exactly, have shown a propensity to display web pages that have the targeted words within close proximity, if not in the exact order they were searched.
Search Terms Should Be Pervasive
While the paragraphs of text within your web page offer the best opportunity to include search terms, make sure you don’t miss the many other opportunities scattered among your content. For example, look at the text contained within the headings of each page and make sure they contain the most relevant search term for your content. Also, consider the navigation menu that you use and look for instances where you can include a relevant search term. How about the text you use under each product description? I’ve seen websites where the most dominant two-word phrase on a product page was ‘Sale Price.’ Ouch!
As you can see, the text you use on each page is vitally important when trying to achieve better search engine positioning. However, adding keywords to your content is not enough to get your web site to the coveted ‘#1’ position. There are many other factors that need to be considered, including many that don’t involve the content on the page, but as we are looking at the page content, here are a few quick tips:
- Don’t bury your keyword-rich content at the bottom of the page. The search engines consider where the text is located on a page when determining your site’s relevancy. Google will believe that text pushed to the bottom of your site, in a small font, can’t be that relevant to your business.
- Don’t overdo things. While having no search terms in your text is disastrous, having too many could have an equally negative impact. Stick to your 5-15% frequency.
- Remember the user experience. While your SEO efforts will help improve your search engine rankings, don’t sacrifice the usability of your web site. Ensure that it is easy to navigate and that all of your keyword-rich text still makes sense to the average visitor.
- Add one or two targeted search terms to the ALT tags of any image that links to another page within your website. Search engines have shown they consider ALT tag text when the image contains a link to another page.
- Don’t go overboard with the use of ‘H1’ tags or bolded text. While they can help improve your search engine positioning, less is more.
Walk Before You Run
Hopefully, the above advice will assist you in modifying your most important pages to increase search engine visibility. When you feel you have made all the basic changes to the text of your site, you’ll find many articles that discuss fine-tuning your page layout and content. Search engine optimization is a continued process and you’ll no doubt drive yourself crazy if you try to optimize every single aspect of your web site. Simply remember to keep your site relevant and make sure you have covered all the basics before advancing to more complex techniques.
Author Bio:
Andy Beal is Vice President of Search Marketing for WebSourced, Inc and KeywordRanking.com, global leaders in professional search engine marketing. Highly respected as a source of search engine marketing advice, Andy has had articles published around the world and is a repeat speaker at Danny Sullivan’s Search Engine Strategies conferences. Clients include Real.com, Alaska Air, Peopleclick, Monica Lewinsky and NBC. You can reach Andy at andy@keywordranking.com and view his daily SEO blog at www.searchenginelowdown.com.
Domain Name Dilemma
It’s a fact. When it comes to Google ranking you don’t have to be a lot better to beat out the competition. So let’s take another dip into the “every little bit helps” pool.
Now believe it or not there are those who like to debate the merits of using dashes or underscores in domain names. Some assert dashes are better. Some have an ongoing love affair with underscores. Others are certain there is no difference.
I agree you do get a bit of a bounce in Google if you do this right, but it’s only marginal. Still, let’s end this debate once and for all and PROVE which is better. To sort this out we need to conduct a study, using Google search results to test whether Google treats dashes and underscores the same or differently. The guinea pig search term I picked is “affordable search engine placement”. There’s nothing special about it; it’s more or less your run-of-the-mill multi-word search term. So let’s get searching.
First, to set a benchmark I cast the broadest net possible by doing a simple search using
affordable search engine placement
Google returned these results:
Searched the web for affordable search engine placement.
Results 1 – 10 of about 78,600
That simply says there were 78,600 pages indexed by Google for ANY of those keywords.
Next I searched on the same phrase only this time I separated the words by dashes like this:
affordable-search-engine-placement
Google turned up these results:
Searched the web for affordable-search-engine-placement.
Results 1 – 10 of about 1,160.
As you can see our term with dashes gave considerably fewer results than the one without.
Then I searched on the same words separated by underscores:
affordable_search_engine_placement
For this one Google didn’t find much:
Searched the web for affordable_search_engine_placement.
Results 1 – 4 of about 6.
Whoa! Next to no pages with underscores, right?
Finally I searched for
“affordable search engine placement”
Note the quotes. Using quotes limits the search results to one specific phrase, just as if you were doing an advanced search for that exact phrase.
In this case Google returned these results:
Searched the web for “affordable search engine placement”.
Results 1 – 10 of about 1,160.
Huh, exactly the same number of pages as with the keyword phrase with dashes. Okay so what do we got?
Let’s see. The first search returns what you could call a free-for-all of listings with any of the words in the keyword phrase. That’s why there are so many search results.
SIDEBAR: Reality check time. This is how most people search. In fact, I saw a stat that said only 3% use the advanced search features Google provides. The dramatically bigger number of resulting SERPs explains why it is sometimes harder to rank high: you are going up against a whole bunch more pages, some unrelated to what is being searched for. So it takes more juice, i.e. on-page optimization, internal links and maybe even inbound links, to come out on top.
Now, our study also showed the phrase with underscores (which Google treats as any other CHARACTER) produced negligible results. As in next to none. Meanwhile the keyword phrase with dashes and the exact phrase search turned up the same number of listings. At this point you should be wondering, “Why is that?” Glad you asked. Even if you didn’t, let me explain. Oh, and since this is important, engage your brain NOW.
The reason for this apparent match of search results is Google uses the dash to separate the words in the phrase. Programmers call this a “delimiter”. In essence Google sees it as a space or separator between the words. Or in other words Google treats the dash as a spacer. Yet Google does NOT treat the underscore as a delimiter. Again to Google it’s just another character.
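You can see the same behavior in a toy tokenizer. The sketch below splits a name on characters treated as word separators (dash, dot, slash, space) while leaving the underscore alone, which mirrors the delimiter idea described above; it is an illustration only, not Google’s actual code:

```python
import re

# Characters treated as word separators; note that "_" is deliberately absent,
# mirroring an indexer that sees the underscore as just another character.
DELIMITERS = r"[-./ ]+"

def tokenize(name: str) -> list[str]:
    return [token for token in re.split(DELIMITERS, name) if token]

print(tokenize("affordable-search-engine-placement"))
# ['affordable', 'search', 'engine', 'placement']  -> four separate words

print(tokenize("affordable_search_engine_placement"))
# ['affordable_search_engine_placement']           -> one opaque token
```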
The actual search results prove this. Had Google treated the dash and underscore alike, the number of SERPs returned for
affordable_search_engine_placement
or
affordable-search-engine-placement
would be identical. But as you saw they are not. Not even close.
So the answer to the original question of which is better, dashes or underscores, is obvious, isn’t it? You want to go with dashes in your domain names, folder names, file names, etc. Using dashes to separate the words will give you the biggest Google impact (whatever that impact may be), since Google can parse the different words, while using underscores won’t help one iota.
Look, this isn’t theory or speculation. It’s fact. You can repeat the same searches with any keyword phrase you want and you’ll get the same results. In any case, let’s be real. Don’t expect some kind of massive boost from this dash trick. Sure, it can help a tad as part of an overall optimization scheme, but whether or not you use dashes in a domain, folder or file name is not what gets you top Google listings. Content and links are.
Still, this study settles the debate about dashes and underscores, giving you yet another little thing you can do to rank well.
Author Bio:
How much is more traffic worth to your business? Take John Gergye’s Search Engine Quiz and get a special report “Coming Out On Top” with 49 tools that make it easy to get more traffic. http://www.traffic-test-tube.com/search-engine-quiz.shtml
Keyword Ownership
Have you ever gotten one of those silly emails that offers to let you own a keyword? Silly question. How many such emails do you get every day?
A number of such services regularly email me offering ownership of premium keywords for $300/year. They say that anyone can type the keyword I bought into the address bar of Internet Explorer, instead of typing in a URL, and they will be sent directly to my site. In total, it seems that about 2% of Internet users worldwide have enabled one version or another of this system, spread across a few competing services.
Data shows that between 4% and 7% of search queries are performed by entering something in the address bar. By default for IE users, these searches are automatically routed to MSN Search. Many of us, however, have installed so much software over time that, unknowingly, some of it has re-routed these queries to other search portals, such as iGetNet. This often happens if you’ve installed file-sharing software; we have all heard and read about how many extra ‘features’ come with programs like Kazaa. This means that your default search from the address bar may no longer be MSN and may have been rerouted elsewhere, but the basic principle still applies. Of the queries that are actually run from an address bar, at least half are unintentional, instigated by people mistyping the desired URL. This means that between 2% and 4% of Internet users actually search via their address bar.
So how exactly do these address bars work? There are many of these companies offering this kind of service, with each one of them selling the very same keywords to different and sometimes competing companies. To make things worse, the keywords you might buy will only work with the issuing company’s proprietary address bar plug-in. Then, to actually offer search capabilities from the address bar, each of these service providers needs to get individual Internet users to download and install their plug-in, and remember to run searches from the address bar.
How effective can a marketing strategy of this nature be when the various tools are not interchangeable, there are numerous competitors selling the same key words to different companies, and you are targeting only a small fraction of Internet users? If your ad is being displayed because it’s similar to the search query, are you paying for irrelevant results? This can happen. If there is not a perfect match to a search query, the next closest match may be displayed.
Competing with these companies is any search engine that offers its own toolbar. You can download a toolbar from any number of engines and run searches on any keyword or phrase quickly and easily. You then get the search engine’s selection of closest matches from all the web sites it has indexed. They offer more than just one choice, and they don’t cost anything.
Who Started This?
Started in 1998, Realnames was the first company that tied searching via the address bar to a web browser. At the time, it was touted as a value added solution for businesses around the world who were attempting to get their products found quickly, but didn’t want customers to have to wade through a sea of Web addresses to reach their destination.
In part, it was deemed necessary because so few web site operators were search engine savvy, and fewer still knew anything about search engine optimization and promotion. What the Realnames solution did was allow a web site operator to buy a keyword, and then when any user of Internet Explorer would type that keyword into the IE address toolbar, they would get directed to the web site that owned the keyword.
The company hoped to profit from businesses that wanted to reach Internet users who would type keywords into their browser’s address bar instead of remembering the URL or going through a standard search interface.
Unfortunately for the company, the service was entirely dependent on Microsoft; and when Microsoft stopped supporting the technology in May 2002, the company was forced to close. The reason it was so totally dependent was simple. Unlike the new companies on the market today, Realnames did not depend on an end user downloading and installing a plugin, instead it was essentially integrated into Internet Explorer by Microsoft. Therefore everyone who used IE automatically had the plugin.
The Legal Question
Each of the companies offering these services has a policy designed to ensure that a web site only buys keywords related to their content, and their review process is designed to keep cybersquatters from hijacking popular names and products. Unfortunately, there is no way to guarantee that any one of these keyword ownership services adheres to any naming standard, or even ensures that any purchaser has the legal right to any of the terms they are buying. This means that the rights to copyrighted material like “Pepsi” or generic words like “business” could end up in the hands of the first buyer. While Pepsi is a well known brand name, there are millions of copyrighted and trademark protected terms, covered in multiple jurisdictions. It would not be cost effective or practical for these services to police copyright and trademark infringement.
In the summer of 1999, the U.S. Court of Appeals for the Ninth Circuit denied Playboy’s request for an injunction barring a search engine from selling advertising based on the terms playboy and playmate. In the precedent-setting ruling regarding keyword advertising, Judge Stotler of the United States District Court in Santa Ana, California, dismissed a lawsuit brought by Playboy Enterprises against the search engine Excite, Inc. and Netscape. The ruling limited the online rights of trademark holders, as it recognized that a trademark may be used without authorization by search engines in advertising sales practices.
Playboy claimed that the search engines were displaying paid banner ads from pornographic web sites whenever “playboy” or “playmate” were used as a search term. As the owner of the trademarks for both terms, Playboy argued that the use of its trademarks for a third party sales scheme was trademark infringement and branding dilution.
In the ruling dismissing Playboy’s case, the Judge found that Excite had not used the trademarks “playboy” and “playmate” in an unlawful manner. This was because Excite had not used the trademarked words to identify Excite’s own goods or services, and therefore trademark infringement laws did not apply. It was further determined that even if there was trademark usage, there was no infringement, because there was no evidence that consumers confused Playboy products with the services of Excite or Netscape.
What About Within Meta Tags?
Is it illegal to use trademarked terms in your Meta tags? Sometimes. The problem occurs with how and why you are using the terms. Web sites that use the tags in a deceptive manner have lost legal battles. However, legitimate reasons to use the terms have resulted in successful defenses.
In another case involving Playboy, the company was able to prove trademark infringement based on the use of its trademarks in the Meta tags, URLs and content of a web site. Playboy filed the case against web site operators who had stuffed their web pages with the words Playboy and Playmate hundreds of times. Furthermore, the defendants were also using the terms Playboy and Playmate in their site names, URLs, and slogans. In this case the Judge ruled for Playboy, as there was a clear case of trademark infringement.
In a separate case, Playboy vs. Terri Welles, the court refused Playboy’s request. The reason was simple: Terri Welles was Playboy’s 1981 Playmate of the Year. She had used the terms “Playmate” and “Playboy” on her web pages and within her Meta tags, and the Court felt she had a legitimate right to use them to accurately describe herself and to ensure that the search engines could catalog her web site properly within their databases. Playboy’s appeal was dismissed on Feb. 1, 2002.
In Summary
It is clear that if you have a legitimate reason to use a trademarked word or phrase on your web site, you can. You may also rent ownership of a keyword from one of the keyword ownership companies. Be careful, though: it is still possible that you may get sued.
Does the technology work? Yes, but only for some of the roughly 2-3% of Internet users worldwide who have installed one of the various competing plugins that enable this type of searching. I stress a fraction, because you would need to buy the keywords from each individual vendor to ensure reaching all of them.
Author Bio:
Richard Zwicky is a founder and the CEO of Metamend Software, a Victoria, B.C. based firm whose cutting edge Search Engine Optimization software has been recognized around the world as a leader in its field. Employing a staff of 10, the firm’s business comes from around the world, with clients from every continent. Most recently the company was recognized for their geo-locational, or GIS technology, which correlates online businesses with their physical locations, as well as their cutting edge advances in contextual search algorithms.
New Legal Guidelines
The marketing environment online has been changing over time to reflect new needs and to remove new problems. E-mail may no longer be the “killer app” it was, what with the evolving changes taking place.
With ISPs filtering email at ever-increasing rates as consumers complain about the volumes of junk e-mail (SPAM or UCE) they’re receiving; with spammers getting more and more aggressive (and ingenious) with their tactics; and with consumers continually complaining to politicians to “do something about it;” life for the newsletter publisher is no longer simple.
It used to be that accepting signups and sending your newsletter was the EASY part – putting it together and getting subscribers to find you was the hard part. Not any more.
Now you have to deal with a myriad of laws – laws which may or may not apply to you, which vary by location, and laws which you may be completely unaware of.
Many states in the United States have laws which prohibit certain types of email marketing, usually based on how the email address was acquired and what the contents of the email actually are. Of all the states in the Union, California has the most stringent laws.
In addition to this, many member countries of the European Union have passed or are in the process of passing similar legislation against unsolicited commercial email.
“How does this affect me?” the newsletter publisher asks. Well, the way these laws are written, you could be in violation even if your entire list of subscribers is opt-in. How?
Since California’s law is the most stringent and since estimates say that 20% of Internet traffic around the world originates, passes through, or is served from that state, we’ll look at the law there. Most laws in other places are not as strict, but many countries in the EU are working on laws that will be similar. Plus California is commonly known as a “test zone” for laws in the United States.
The law defines an “unsolicited commercial email advertisement” as being any email sent without the “direct consent” of or without a “preexisting or current business relationship” with the receiver. Interestingly, the receiver doesn’t have to be a California resident because the law states that if the email originates or has been sent “within, from and to” the state of California, it is covered. So if your server is located in California and you send email through it that someone doesn’t like, you could be subject to the law – even if you’re a resident of New York and the receiver is in Washington!
The other crux of this law is the definition of “direct consent.” It is defined as “...the recipient has expressly consented to receive e-mail advertisements from the advertiser, either in response to a clear and conspicuous request for the consent or at the recipient’s own initiative.”
The penalties for violating this law are immense: $1,000 for each offending e-mail and up to $1,000,000 per incident, plus actual damages and attorney’s fees.
The upside to all of this is that the law, as written, is full of holes. There are myriad ways to stay out of trouble with it, but there is definitely an increased risk to email marketers. After all, a new law makes it easier for those with complaints to force legal action, which means your chance of ending up in court is higher than it was before.
My personal opinion is that as soon as this law is used (it goes into effect on January 1, 2004 in California), it will be challenged on constitutional grounds and will probably fail to hold up because of the ambiguous wording of much of the legislation.
It is still a good idea, however, for the e-zine publisher to make sure their email collection techniques are on the up-and-up: use double opt-in if possible, collect names as well as email addresses, provide VERY easy unsubscribe options (links in every email are best), and don’t abuse your list.
Most of us are following good guidelines and have nothing much to worry about. Just make sure you aren’t setting yourself up for anything.
Author Bio:
Aaron Turpen is the proprietor of Aaronz WebWorkz, a full-service online company catering to small and home-based businesses. Aaronz WebWorkz offers a wide variety of services including Web development, newsletter publishing, consultation, and more.
Why Isn’t My Website In The Search Engines?
If your site isn’t found in the search engines, it is probably because the robots couldn’t deal with it. It could be something as simple as the robots not being able to find the site, or it may be a more complicated issue involving the robots not being able to crawl the site or figure out what your pages are all about.
Submitting your site to the major search engines will help deal with the “can’t find it” problem. Even having links pointing back to your site can be enough to attract the search engine robots. Google, for example, suggests that you may not have to submit your pages at all; it will find your site if you have a link pointing back to it from at least one other site on the web.
If the robots can find your site but can’t make sense of it, then you may need to look at the content and technology used on your pages. Frames, Flash, dynamically generated pages, and invalid HTML source code can cause problems when the search engine robot tries to access your web pages. While some search engines are beginning to be able to index dynamically generated pages and Flash (e.g. Google and AllTheWeb), use of these technologies can hinder your ability to be indexed by the search engine robots.
Text in images cannot be read by the search engine robots. Using ALT image text is an important way to help the robots “read” your images. Websites with extensive images rely heavily on ALT text to present their content.
How Do I Get The Most Out Of Indexing?
If you know what to “feed” the spidering robots you will help yourself with search engine ranking.
Having a website full of good content is the major factor. Search engines exist to serve their visitors, not to rank your website. You need to be sure to present yourself in your site in the way that will be most useful to the search engine visitor. Each search engine has its own idea of what is important in a page, but they all value text highly. Making sure that the text on your pages includes your most important keyword phrases will help the search engine evaluate the content of those pages.
Making sure that you have good title and meta tags will further assist the search engines in understanding what your page is about. If the text on the page is about widgets, the title is about widgets, and the meta tags are about widgets, the search engine will have a pretty good idea that you are all about widgets. When their visitors search for widgets, the search engines know to list your site in the results.
A sitemap page is a very good way of giving the search engine robot every opportunity to reach your website pages. Since robots click through the links of your web pages, make sure that at least your most important pages are included in the sitemap; you may even want to include all your pages there, depending on the size of your site. Be sure to add a link to the sitemap page from each page on your site.
Another important consideration is that of keeping all of your pages within a small number of “clicks” from your top page. Many robots will not follow links more than two or three levels deep, so if your “widgets” page can only be reached from your home page by following multiple links (e.g. home page >> about us page >> products page >> widgets page), the robot may not crawl deep enough to get to the widgets page.
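If you want to sanity-check this on your own site, you can model your pages as a small link graph and compute how many clicks each page is from the home page. Below is a minimal sketch; the page names and links are hypothetical, and in practice you would build the graph from your actual navigation:

```python
from collections import deque

# Hypothetical site structure: page -> pages it links to.
links = {
    "home":     ["about", "products", "sitemap"],
    "about":    ["home"],
    "products": ["widgets", "gadgets"],
    "widgets":  ["home"],
    "gadgets":  ["home"],
    "sitemap":  ["about", "products", "widgets", "gadgets"],
}

def click_depths(start: str) -> dict[str, int]:
    """Breadth-first search: number of clicks needed to reach each page from `start`."""
    depths = {start: 0}
    queue = deque([start])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

for page, depth in sorted(click_depths("home").items(), key=lambda item: item[1]):
    print(f"{page}: {depth} click(s) from the home page")
```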
Testing Your Website For Search Engine Robot Accessibility

To get an idea of just what the search engine robot “sees” on your page, you can look at the Sim Spider tool. You may be surprised at how different your site looks to the robot. You can find this tool at http://www.searchengineworld.com/cgi-bin/sim_spider.cgi
You will see text and ALT image text show up in the results. If your entire website is built in Flash, you will see nothing at all because robots don’t understand Flash movies.
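For a rough sense of what such a simulator does, the sketch below uses Python’s standard html.parser to pull out only the pieces a text-oriented robot can read: visible text, the title, and ALT text on images. It is a simplification for illustration, not a reproduction of any particular search engine’s crawler:

```python
from html.parser import HTMLParser

class SpiderView(HTMLParser):
    """Collects the text a simple, text-only robot could read from a page."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0  # depth inside <script>/<style>, which robots ignore

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
        elif tag == "img":
            alt = dict(attrs).get("alt")
            if alt:
                self.chunks.append(f"[image: {alt}]")

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip and data.strip():
            self.chunks.append(data.strip())

page = """<html><head><title>Widget Supplies</title></head>
<body><h1>Desktop computer supplies</h1>
<img src="logo.gif" alt="Acme desktop computer supplies">
<script>var ignored = true;</script>
<p>We stock keyboards, mice and cables.</p></body></html>"""

spider = SpiderView()
spider.feed(page)
print(" | ".join(spider.chunks))
```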
The Bottom Line
When it comes to search engine robots, think simply. Lots of good content and text, hyperlinks the robots can follow, optimization of your pages, topical links pointing back to your site, and a sitemap will help ensure the best results when the robots come visiting.
Resources
SpiderSpotting – Search Engine Watch
Robotstxt.org
Protocols for setting up a robots.txt file.
Spider-Food
Tutorials, forums and articles about Search Engine spiders and Search Engine Marketing.
SpiderHunter.com
Articles and resources about tracking Search Engine spiders.
Sims Search Engine Simulator
Search Engine World has a spider that simulates what the Search Engine robots read from your website.
Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing (http://www.searchinnovation.com), a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.
Google APIs
Lately, I’ve been getting a few questions about the Google APIs. Although they are mostly intended for Web developers, search engine implementations, or Internet applications that query the Google database, the Google APIs could be of some help to you, depending on your general level of knowledge of Web technology and your programming skills.
This article is meant only as an introduction to Google’s APIs. If the need warrants it, I would be pleased to write a more detailed and in-depth article on them at a later date.
What Is A Google API?
API stands for ‘Application Programming Interface’. As its name implies, the Google API is an interface that queries the Google database to help programmers in the development of their applications.
At this point, it is important to remember that all of Google’s APIs are available only in beta version, which means they are still in their initial trial release and a few adjustments to some of them may still be required, although I must honestly say that I am quite pleased with what I have seen so far.
The Google APIs consist basically of specialized Web programs and scripts that enable Internet application developers to better find and process information on the Web. In essence, the Google APIs can be used as an added resource in their applications.
How Can Google APIs Be Used Effectively?
In the real world, application programmers, developers and integrators write software that connects remotely to the Google APIs. All data communications are executed via the ‘Simple Object Access Protocol’ (SOAP), an industry-standard Web services protocol. SOAP is an XML-based technology designed to make it easy to exchange information entered into a Web application.
Google’s API gives developers easy access to Google’s web search database, empowering them to develop software that can query billions of Web documents, constantly refreshed by Google’s automated crawlers. Programmers can send search queries to Google’s vast index of more than three billion pages and have results delivered to them as structured data that is simple to analyse and work with.
Additionally, the Google APIs can seamlessly access data in the Google cache and provide spell-checking of words. The Google APIs in fact implement the standardized search syntax used on many of Google’s search properties.
Use Of Google’s APIs In The Real World
Here are a few examples of real-life Web applications that could make effective use of the Google APIs; obviously, there can be many more.
- Querying the Web using non-HTML interfaces, such as complex industrial software or command-line interfaces used in certain Unix applications
- Processing specialized market information, research and analysis of discrepancies in certain data types used in various industries
- Initiating regularly scheduled search requests to track the Internet for new and updated information on a specific subject
Currently, Google issues each programmer or developer who registers to use the APIs a set limit of one thousand queries per day. Some think that number could increase in the future, but for now, that is the imposed limit at the ‘Googleplex’.
Google’s API is an experimental program that is provided free to anybody who agrees to its terms. As of now, the available resources to fully implement and support the program are rather limited, which explains why there is a 1,000-queries-a-day limit on all accounts.
Registering For Your Google API
In order to be able to use and implement any Google APIs, you must first agree to the terms and promise to abide by Google’s rules concerning its APIs. You will also have to create a Google ‘API account’. Once you have done all that, Google will email you a personal license key to use with your APIs.
Remember that when you build an application, you must integrate your license key into your code. That way, every time your Web application makes a request or queries Google’s database, your license key is part of the query.
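As a rough illustration of what that looks like in practice, here is a minimal Python sketch that sends a doGoogleSearch request over SOAP with the license key included in the request body. The endpoint, method name and parameter list follow the beta documentation of the time and may have changed, so treat this as a sketch rather than a working client; substitute your own license key.

```python
import urllib.request

GOOGLE_SOAP_ENDPOINT = "http://api.google.com/search/beta2"  # beta endpoint of the time
LICENSE_KEY = "YOUR-LICENSE-KEY-HERE"  # the personal key Google emails you

def do_google_search(query: str, max_results: int = 10) -> bytes:
    """Send a doGoogleSearch SOAP request; the license key travels with every query."""
    envelope = f"""<?xml version="1.0" encoding="UTF-8"?>
<SOAP-ENV:Envelope xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
                   xmlns:xsi="http://www.w3.org/1999/XMLSchema-instance"
                   xmlns:xsd="http://www.w3.org/1999/XMLSchema">
  <SOAP-ENV:Body>
    <ns1:doGoogleSearch xmlns:ns1="urn:GoogleSearch">
      <key xsi:type="xsd:string">{LICENSE_KEY}</key>
      <q xsi:type="xsd:string">{query}</q>
      <start xsi:type="xsd:int">0</start>
      <maxResults xsi:type="xsd:int">{max_results}</maxResults>
      <filter xsi:type="xsd:boolean">true</filter>
      <restrict xsi:type="xsd:string"></restrict>
      <safeSearch xsi:type="xsd:boolean">false</safeSearch>
      <lr xsi:type="xsd:string"></lr>
      <ie xsi:type="xsd:string">latin1</ie>
      <oe xsi:type="xsd:string">latin1</oe>
    </ns1:doGoogleSearch>
  </SOAP-ENV:Body>
</SOAP-ENV:Envelope>"""
    request = urllib.request.Request(
        GOOGLE_SOAP_ENDPOINT,
        data=envelope.encode("utf-8"),
        headers={"Content-Type": "text/xml; charset=utf-8",
                 "SOAPAction": "urn:GoogleSearchAction"},
    )
    with urllib.request.urlopen(request) as response:
        return response.read()  # raw SOAP response XML containing the structured results
```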
More On The Google API Account
Creating an API account with Google is simple. All you need to do is to go to http://www.google.com/apis/ and simply follow the easy instructions on your screen. You will be required to provide your email address and a password. Currently, Google rules and regulations forbid you from having more than one account.
One word of caution about the Google APIs: remember that you cannot create or develop any industrial application or commercial querying program or software using Google’s API technology without first obtaining valid written consent from Google.
Reference: Google Inc., http://www.google.com/apis/
Author:
Serge Thibodeau of Rank For Sales
Search Engine Robots
Automated search engine robots, sometimes called “spiders” or “crawlers”, are the seekers of web pages. How do they work? What is it they really do? Why are they important?
You’d think with all the fuss about indexing web pages to add to search engine databases, that robots would be great and powerful beings. Wrong. Search engine robots have only basic functionality like that of early browsers in terms of what they can understand in a web page. Like early browsers, robots just can’t do certain things. Robots don’t understand frames, Flash movies, images or JavaScript. They can’t enter password protected areas and they can’t click all those buttons you have on your website. They can be stopped cold while indexing a dynamically generated URL and slowed to a stop with JavaScript navigation.
How Do Search Engine Robots Work?
Think of search engine robots as automated data retrieval programs, traveling the web to find information and links.
When you submit a web page to a search engine at the “Submit a URL” page, the new URL is added to the robot’s queue of websites to visit on its next foray out onto the web. Even if you don’t directly submit a page, many robots will find your site because of links from other sites that point back to yours. This is one of the reasons why it is important to build your link popularity and to get links from other topical sites back to yours.
When arriving at your website, the automated robots first check to see if you have a robots.txt file. This file is used to tell robots which areas of your site are off-limits to them. Typically these may be directories containing only binaries or other files the robot doesn’t need to concern itself with.
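As a quick illustration, the sketch below shows a small robots.txt (with made-up directory names) and uses Python’s standard urllib.robotparser to check whether a robot that honors the file would be allowed to fetch a given URL:

```python
import urllib.robotparser

# A small example robots.txt; the directory names are made up for illustration.
example_robots_txt = """\
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(example_robots_txt.splitlines())

# A well-behaved robot checks the rules before requesting a page.
print(parser.can_fetch("*", "http://www.example.com/products.html"))   # True
print(parser.can_fetch("*", "http://www.example.com/private/report"))  # False
```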
Robots collect links from each page they visit, and later follow those links through to other pages. In this way, they essentially follow the links from one page to another. The entire World Wide Web is made up of links, the original idea being that you could follow links from one place to another. This is how robots get around.
The “smarts” about indexing pages online comes from the search engine engineers, who devise the methods used to evaluate the information the search engine robots retrieve. When introduced into the search engine database, the information is available for searchers querying the search engine. When a search engine user enters their query into the search engine, there are a number of quick calculations done to make sure that the search engine presents just the right set of results to give their visitor the most relevant response to their query.
You can see which pages on your site the search engine robots have visited by looking at your server logs or the results from your log statistics program. Identifying the robots will show you when they visited your website, which pages they visited and how often they visit. Some robots are readily identifiable by their user agent names, like Google’s “Googlebot”; others are a bit more obscure, like Inktomi’s “Slurp”. Still other robots listed in your logs may be ones you cannot readily identify; some of them may even appear to be human-powered browsers.
Along with identifying individual robots and counting the number of their visits, the statistics can also show you aggressive, bandwidth-grabbing robots or robots you may not want visiting your website. In the resources section at the end of this article, you will find sites that list names and IP addresses of search engine robots to help you identify them.
How Do They Read The Pages On Your Website?
When the search engine robot visits your page, it looks at the visible text on the page, the content of the various tags in your page’s source code (title tag, meta tags, etc.), and the hyperlinks on your page. From the words and the links that the robot finds, the search engine decides what your page is about. There are many factors used to figure out what “matters” and each search engine has its own algorithm in order to evaluate and process the information. Depending on how the robot is set up through the search engine, the information is indexed and then delivered to the search engine’s database.
The information delivered to the databases then becomes part of the search engine and directory ranking process. When the search engine visitor submits their query, the search engine digs through its database to give the final listing that is displayed on the results page.
The search engine databases update at varying times. Once you are in the search engine databases, the robots keep visiting you periodically, to pick up any changes to your pages, and to make sure they have the latest info. The number of times you are visited depends on how the search engine sets up its visits, which can vary per search engine.
Sometimes visiting robots are unable to access the website they are visiting. If your site is down, or you are experiencing huge amounts of traffic, the robot may not be able to access your site. When this happens, the website may not be re-indexed, depending on the frequency of the robot visits to your website. In most cases, robots that cannot access your pages will try again later, hoping that your site will be accessible then.
Many search engine robots are unidentifiable when you look through your web logs; they may be visiting you, but the logs state that it is someone using a Microsoft browser, etc. Some robots identify themselves by the actual search engine’s name (Googlebot) or a variation of it (Scooter = AltaVista).
Even directories that provide secondary search results use the robot-gathered results as content for their websites.
In fact, robots aren’t just used by the search engines. There are robots that check for new content for databases, re-visit old content for databases, check to see if links have changed, download entire websites for viewing, and more.
Because a robot that cannot access your site may skip re-indexing it, reading your log files (either raw logs from your server or through your log statistics program) and tracking your search engine listings helps you keep track of the indexing of your website pages.
Resources
SpiderSpotting – Search Engine Watch
Robotstxt.org
Protocols for setting up a robots.txt file.
Spider-Food
Tutorials, forums and articles about Search Engine spiders and Search Engine Marketing.
SpiderHunter.com
Articles and resources about tracking Search Engine spiders.
Sims Search Engine Simulator
Search Engine World has a spider that simulates what the Search Engine robots read from your website.
Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing (http://www.searchinnovation.com), a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.
Organic SEO: Patience For Long Term Ranking Results
When does long term SEO show ranking results? Optimization takes time to produce targeted traffic to your website: organic SEO needs time to take effect, just as your web pages take time to start showing up in the search engine results.
Clients regularly ask me about the timing of a search engine optimization campaign and when those results will be seen in the search engine listings. A long term marketing campaign based on search engine optimization takes time: patience is the name of the game.
Optimization Timeframe
SEO's timeframe depends on a number of factors. Part of this involves the accuracy of your keyword phrase choices: is the keyword phrase one your visitors would actually use to find your product or website? If your keyword phrases are targeted to your audience, you will gain optimum results. Another factor is whether you use Paid Inclusion and/or PPC services. The most effective approach combines SEO, Paid Inclusion and PPC. If you rely on organic SEO alone, without Paid Inclusion or PPC, it takes more time to achieve results.
Paid Results
When you use Paid Inclusion or PPC (Pay-Per-Click) bidding, your results show up sooner than with traditional SEO. Paid Inclusion programs state the timeframe in which your page will be indexed by the search engine robots when you sign up for the service. PPC results show up as soon as searchers start clicking on your ads. This type of search engine marketing requires an annual budget to renew Paid Inclusion submissions and a monthly budget for PPC click-through costs. If you are paying too much for PPC, combining it with organic SEO often helps keep the cost of the paid service down: by generating additional targeted traffic on those costly terms, you may be able to lower your bids or even eliminate bidding on some keywords.
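To see why extra organic traffic can ease PPC costs, a back-of-the-envelope calculation helps. The figures in the snippet below are invented purely for illustration; only the arithmetic matters.

    def blended_cost_per_visitor(ppc_clicks, cost_per_click, organic_visitors):
        """Average acquisition cost once free organic traffic is added to paid clicks."""
        total_cost = ppc_clicks * cost_per_click
        total_visitors = ppc_clicks + organic_visitors
        return total_cost / total_visitors

    # Hypothetical numbers: 1,000 paid clicks at $0.75 each, before and after
    # organic optimization starts bringing in 1,500 extra visitors on the same terms.
    ppc_only = blended_cost_per_visitor(1000, 0.75, 0)         # $0.75 per visitor
    with_organic = blended_cost_per_visitor(1000, 0.75, 1500)  # $0.30 per visitor
    print(f"PPC only: ${ppc_only:.2f} per visitor")
    print(f"PPC + organic: ${with_organic:.2f} per visitor")

The same total spend is spread across far more visitors once the organic listings start pulling their weight, which is the leverage that lets you trim or drop bids on the most expensive terms.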
Search engines can commit to a timeline for paid submissions because they generate income through the process. Paying for results also gives you some assurance that your listings will remain relatively stable in the database.
Paid Inclusion submissions will always take precedence over free submissions because the search engine makes money from Paid Inclusion. For this reason, most search engines process free submissions over a longer period of time than paid ones. When you use SEO without the paid submission options, the process is the same, but the optimized pages take longer to make their way into the search engine databases.
Organic SEO
Organic SEO works differently. The best reason to use it is that it is a low-cost method of promoting your website. It can take three to six months to see the full results of optimizing your website, especially if organic optimization is all you use. The plus of an organic approach is that once you optimize your pages, the main part of the work is done. You may tweak your keywords and text here and there, but unless you completely redesign your pages, you have what you need in place to begin drawing in targeted traffic. Keep checking your ranking status and reading your log statistics, especially for new keywords visitors are using to find your website.
When using free submissions, expect a three- to six-month wait before most of the long term results show in the search engine listings. If you run a link popularity program and have links pointing back to your website, the search engine robots will find your site through those links, eliminating the need for free submissions. Look at it this way: you pay once for basic optimization, and over time the results improve to their optimum level. You do not have to keep paying for the service because, unless a search engine drops your free-submission pages from its database (which does not happen often these days), you will remain visible to search engine users when they search on your targeted keyword phrases. Over time you should see a progression in your ranking, depending on how competitive your keyword phrases are.
Budget SEO
What if you cannot afford Paid Inclusion or PPC services? Organic SEO is a great way to increase targeted traffic to your website over time. If you do not have a budget for Paid Inclusion submissions and PPC programs, organic SEO will still give you good results, provided you are willing to wait for them rather than expecting immediate gains. Combine organic SEO with plenty of good content and a solid link building program for optimum results. Remember, the search engine listings may entice visitors to come to your website, but you must give them a reason to stay once they arrive. Build your content to keep your new visitors at your website.
Patience Pays Off
Organic SEO is "common sense" promotion. There are not a lot of fancy bells and whistles, and it takes time. Good navigation, good content with your keyword phrases throughout the pages, and topical sites linking back to your website add up to long term success. Practice patience when going organic with your SEO campaign. It may take time, but the long term results you reap will be worth it.
Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing (http://www.searchinnovation.com), a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.
Server Log Files
You have invested considerable amounts of money in search engine optimization for your website. You might even participate in a PPC (Pay-Per-Click) campaign or other forms of advertising. Yet you would like to know just who visited your site, when and how they arrived, and where they went once inside. Maybe you are also asking yourself the eternal question: did they buy from me? If not, why did they leave without buying anything?
After being in this business for more than six years, I still cannot understand why so few site owners, business administrators and even Webmasters look at their server logs regularly and, most importantly, closely.
Few seem to realize that there is a 'diamond mine' of information residing in those log files. All it takes is to retrieve them and analyze them at least once a week. If you can look at them every day, so much the better.
A Primer On Server Log Files
First, let's rewind to 1994-1995. In the mid-90s, when the Internet was still relatively young commercially, most site owners and Webmasters were interested mainly in the number of 'hits' their site got from one day to the next. Even today, I still get asked: "How many hits did I get today?" Talking with some of these clients, I realize they need a little primer article, such as this one, to help them better understand the importance of professional analytics and measurement of Web traffic, site demographics, geo-location data, statistics, and so on.
Today, modern traffic analysis tools and analytical software can yield far more than this rudimentary information. The more information a business administrator has at his or her fingertips, the more it can be used to gain an advantage over competitors, with many other benefits as an added bonus. If executives within your organization would like to know exactly how visitors behave once they are inside your site, a good analytical package can now give them access to that kind of information.
Additionally, one of the most important features of the good packages I have seen is the 'link referrer' report. As its name implies, the referrer information clearly identifies the search engine that brought you a visitor, or whether the visit came from one of your affiliates' sites or another site that links back to you. What's more, when properly configured, the search engine referrer log will even tell you which specific keywords or key phrases the visitor entered to find your site.
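Pulling those keyword phrases out of your logs yourself is straightforward whenever the referring engine includes the query in the referrer URL, as most engines did. The sketch below assumes a combined-format access.log and an illustrative list of query-string parameter names; adjust both to match your own logs.

    import re
    from collections import Counter
    from urllib.parse import urlparse, parse_qs

    # Query-string parameters some engines use for the search terms
    # (illustrative: "q" for Google/MSN, "p" for Yahoo).
    QUERY_PARAMS = ("q", "p", "query")

    # The referrer is the second-to-last quoted field in a combined log line.
    REFERRER_PATTERN = re.compile(r'"(https?://[^"]+)" "[^"]*"\s*$')

    def keywords_from_referrers(log_path):
        """Count the search phrases found in search engine referrer URLs."""
        phrases = Counter()
        with open(log_path, encoding="utf-8", errors="replace") as log:
            for line in log:
                match = REFERRER_PATTERN.search(line)
                if not match:
                    continue
                params = parse_qs(urlparse(match.group(1)).query)
                for name in QUERY_PARAMS:
                    if name in params:
                        phrases[params[name][0].lower()] += 1
                        break
        return phrases

    if __name__ == "__main__":
        for phrase, count in keywords_from_referrers("access.log").most_common(20):
            print(f"{count:4d}  {phrase}")

A report like this is where new, unexpected keyword phrases tend to surface first, well before they show up in any ranking check.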
The Importance Of Error Logs
Another often-overlooked and neglected area is the error log. It is an excellent way to discover problems with your website before a visitor mentions them to you. A good example is finding 'error 404' or 'file not found' entries in your error logs. Combined with the referrer information, you can also discover which page contains the link that caused the error, which makes the problem much simpler to correct.
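The same 404-plus-referrer report can also be pulled from a combined-format access log, where each request records the status code and the referring page side by side. A minimal sketch follows, assuming a file named access.log; the field layout may differ on your server.

    import re
    from collections import defaultdict

    # Combined log format: ... "GET /path HTTP/1.x" status size "referrer" "user-agent"
    LINE_PATTERN = re.compile(
        r'"(?:GET|HEAD|POST)\s+(?P<path>\S+)[^"]*"\s+(?P<status>\d{3})\s+\S+\s+'
        r'"(?P<referrer>[^"]*)"'
    )

    def broken_links(log_path):
        """Map each missing URL (404) to the pages that linked to it."""
        report = defaultdict(set)
        with open(log_path, encoding="utf-8", errors="replace") as log:
            for line in log:
                match = LINE_PATTERN.search(line)
                if match and match.group("status") == "404":
                    referrer = match.group("referrer")
                    if referrer in ("", "-"):
                        referrer = "(no referrer)"
                    report[match.group("path")].add(referrer)
        return report

    if __name__ == "__main__":
        for missing, sources in sorted(broken_links("access.log").items()):
            print(missing)
            for source in sorted(sources):
                print(f"    linked from: {source}")

When the referring page is one of your own, you can fix the broken link directly; when it is an external site, you know exactly which URL to redirect or restore.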
Many of today's most popular Web hosting packages provide some form of server log files that you can analyze and evaluate. Ask your Web hosting provider for that information.
Conclusion
Frequent, regular server log file analysis and monitoring can indeed be a very valuable source of important website information, if done correctly. It can mean the difference between discovering that a site has little or no ROI and running one that truly achieves the best overall value for a business of any size. There are very simple and inexpensive server log analysis programs available today; some are even free. As with anything, you can also purchase expensive software suites better suited to a larger enterprise. It all depends on your needs and budget.
Author:
Serge Thibodeau of Rank For Sales