Yahoo no longer using Google
I am glad I am not a betting man, or I am sure I would have big guys named Vinnie and Freddie knocking on my door looking for their money. I have been trying to guess when Yahoo! was going to change results providers virtually since the time they bought Inktomi over a year ago. I even made a couple predictions, but in the end I was wrong.
So, what is all the fuss about?
Well, for one, they didn't use pure Inktomi results. In effect, they didn't replace Google with Inktomi; they replaced Google with a Yahoo! brand of search. The new Yahoo! search is uniquely its own, distinct from everyone else's.
So what does this mean to search marketers?
Well, while you used to have to pay to be listed in the Yahoo! directory, it doesn't appear that you need to pay to get indexed by Inktomi. In fact, if you check your visitor logs, you've likely noticed a lot of activity from either Inktomi's Slurp or Yahoo!'s Slurp spiders.
So how do you know if you’ve been picked up in Yahoo?
If you'd like to see how many pages are indexed, you can use the advanced search options (http://search.yahoo.com/web/advanced). Put two sets of double quotes ("") in one of the "Show results with" boxes, enter your domain name in the "only search in this domain/site" box, and you will see which pages are in the new Yahoo! index.
What does all this mean?
Well, it’s hard to say what the impact of Yahoo!s organic product will be. We already know that they can account for about 1/3 of most sites’ traffic. But how Yahoo! will deal with crawling and indexing is another story. Since they just switched over, we will have to see what happens.
We do know that, at this time, there is no place to add a URL to their index, so we must assume that, at least for the time being, paid inclusion is required to be indexed in Yahoo!. Or we may find that a submission to AltaVista or AlltheWeb helps a site get indexed. Although this is doubtful, it is not out of the realm of possibility.
One speculation is that Yahoo! will use a submission strategy similar to the one AltaVista has employed: you can submit for free, with no guarantee of if, or when, the site will get indexed, or you can pay for guaranteed inclusion. In that case, the submission fee is definitely worth it considering the roughly 30% reach that Yahoo! currently has.
So what does this mean for Google?
Aside from the obvious – that Google's reach has just been cut by roughly a third – not much. Google is still striving to have the best search out there. Whether Yahoo! is a part of that or not shouldn't really concern them. Sure, it is a hit to the bottom line, but Google is large enough that it will survive, at least until the rumored IPO.
Now what Google has to do is stay ahead of the pack. They have developed a reputation for the best search, and they have to maintain that reputation. The fact that Yahoo! has entered the algorithmic search market after 10 years on the web does help Google, in that Yahoo! has some catching up to do. But the fact that Yahoo! can afford to catch up could be a concern.
They also have to consider that Yahoo! is only the first of the "big players" to foray into the highly competitive algorithmic search market. MSN has yet to unveil its entry, which is also expected later this year.
But for now they only have to consider Yahoo!'s impact. While it will be significant in terms of search volume lost, it is too early to determine what the true impact will be. Launching algorithmic search is one thing; convincing people to use it is another.
Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists
Lycos Leaving Search
Few internet properties were making news in 1998, and Lycos was one of them. Once a powerhouse in internet search (and one of the hottest properties on the web), Lycos was bought by the multinational Terra Networks to become Terra Lycos.
Since that day in October 2002, Lycos seemed to lose focus on what it was – a really good search engine. Instead, it moved into many other internet markets and expanded even further globally. In fact, even today Lycos is more popular in some countries than any other search engine or portal.
If you were to review their press releases from 2000 and 2001, you would see that they were acquiring properties back then that are considered hot now: job-finding sites, travel sites, music services and more.
While other players like Yahoo and MSN were coasting (and Google was sneaking up from behind) Lycos was building its world internet presence.
Lycos Lost Its Appeal
But something happened. Lycos lost its appeal to the majority of internet users. While they were still hugely popular in Europe and other places around the world, in North America they were all but forgotten.
Around the time of the tech stock crash, Lycos faded, much like many other big internet names, and never recovered in the US.
More recently, they have cut staff drastically and have put up “for lease” signs in their California offices. It was only a matter of time before Lycos search disappeared.
Today Lycos announced that it is getting out of search. Instead they are going to focus on the latest internet craze – social networking. Social networking is the new gathering place – A place to go hang out and meet new people, or maybe even meet that special someone.
Many other players – such as Google – are getting into social networking, which has led to rumors that Google will incorporate social networking into search.
But for Lycos, this could be their last attempt at some kind of web supremacy. They realized that they could not compete in search. In fact, they were so far behind that it would have taken years just to catch up to where search is today. So they chose a new route.
Hopefully, for the company's sake, this new path turns out to be the right choice for them. They are getting into social networking in its infancy. As long as they can capitalize on its current popularity and make it work, they may stick around for a few more years.
Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists
Even though Google has been tweaking and fine-tuning its search engine algorithms almost daily since the Florida update last November, one thing remains constant: there are things in SEO that you need to do and others that you absolutely need to avoid.
Here are my 10 commandments of rigorous SEO techniques that I practice. As fast and furious as Google ‘dances’ have been lately, these 10 commandments have, to this day, kept my clients out of trouble while helping them to continue enjoying priority rankings in the results pages.
I strongly believe that if you practice 'safe SEO' techniques such as the ones described in these 10 commandments, then you should get equally positive results.
SEO Commandment Number One
Thou shall not write invisible text
It continues to surprise me that some people are still trying to get away with this! Just last week, we had to 'clean up' a mess left by somebody else who had inserted lots of invisible text on most of the pages of a site. As a result, and as might be expected, the site got banned by Google. We had to remove the invisible text, properly optimize the site for its main keywords using Wordtracker, and resubmit it to Google and the other search engines.
Just to be certain, I also wrote an email to Google advising them of the ‘clean up’ we conducted, with a thank you note for their consideration in re-instating the site back in their index.
SEO Commandment Number Two
Thou shall always use Wordtracker
If you are really serious about getting your site higher in the rankings, don’t waste your time with tools that are just not accurate enough. The only serious tool by today’s standards in the SEO industry is Wordtracker. Optimize a site with the wrong keywords and you will end up with a site that ranks high, but for products or services your company may not even sell! Correctly identifying your ‘right’ keywords and search terms using Wordtracker is the ONLY way you will be successful at targeting your real audience, and your real prospects.
SEO Commandment Number Three
Thou shall add fresh content every day
It's a known and proven fact; I see it every day. Sites that continually add new content and new pages ultimately get the best rankings. What's more, when search engine spiders visit your site and find newly added or updated content, they will come back more often, and the more often they come, usually the higher and better your rankings. Remember that the search engines are always on the lookout for fresh new 'meat'. Surprise them every day with new content and watch your rankings go higher.
Also, and this is a very well-kept secret in the SEO industry: with all else being equal, the greater the number of pages in a given website, the higher the PR (PageRank). In combination with the number of links pointing to your site, Google's PageRank algorithm uses the number of actual pages in a given site to calculate its final PR value. This is recalculated once a month during every major Google dance. Contact me if you need to know more about it.
SEO Commandment Number Four
Thou shall never link to bad neighbourhoods
You can't control who links to you, and, for the most part, no search engine will ever penalize your site if someone links to you, since they know you have no control over it. But you do have control over the sites YOU link to. If you happen to link to a site which some would call 'a bad neighbourhood', you might get penalized by some engines for linking to it. Examples would be sites that have been penalized in the past (or still are) for prohibited techniques such as invisible text, invisible links, gateway (or doorway) pages or, in some cases, sites that are really nothing more than a bunch of heavily cross-linked affiliate sites.
If you feel suspicious about a site before linking to it, it helps to do a little investigating of your own. Search for who is linking to them. Are those sites penalized too? I like to think of it as 'linker beware': proceed cautiously before you link, just to keep safe.
SEO Commandment Number Five
Thou shall not spam the meta tags
It's a known fact that most search engines today don't even care to look at meta tags anymore, since they were spammed and abused so much in the past. The only search engine today that places some importance on meta tags is Inktomi. Since Yahoo is about to drop Google's search results any day now, it will exclusively use results produced by its own search property, Inktomi. If you do write meta tags, just use the main keywords that actually appear on that particular page. If the page is about blue and red widgets, then that is all that should be written in the meta tags. Don't 'stuff' them with useless words that have nothing to do with the actual text written on that page.
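As a simple illustration, the meta tags for a hypothetical page about blue and red widgets might look something like this (the company name and wording are invented for the example):

    <meta name="keywords" content="blue widgets, red widgets">
    <meta name="description" content="Hand-made blue and red widgets from Acme Widget Co.">

Nothing more is needed; every word in those tags also appears in the visible text of the page.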
SEO Commandment Number Six
Thou shall not use Doorway or Gateway pages
You should never even think of using doorway or gateway pages, since they are frowned upon by most search engines. If you implement any kind of software or program that needs to know the IP address of a search engine robot, you are making use of spam tactics that will have your site penalized or banned. Gateway pages serve different pages to search engines than they do to real human visitors, which is a big No-No in the SEO industry!
Some people suggest 'trusted feeds', XML feeds or similar variations. Depending on the exact nature of such implementations, these techniques can be acceptable if certain basic conditions are met. But in almost all the cases I have seen, doorway (or gateway) pages are a bad idea and you should always stay away from them. To give you an example of how much the search engines hate doorway pages, on January 25 Google changed most of the IP addresses in many of its data centers as a countermeasure against this sort of practice.
SEO Commandment Number Seven
Thou shall build the best site you can
Last week, I wrote an article about the importance of usability in any website. If new visitors arrive at your site in good numbers but then leave just as fast, there is something very wrong with your site! High rankings won't compensate for a site with poor usability. Does your site take too long to download? Did you know that even today, about 80% of people connect to the Internet using slow dialup modem phone lines? This is especially true in rural areas and in many parts of Europe.
Did you know that if your menu system isn't clear, or if your homepage fails to give the slightest clue as to what your website has to offer, visitors will leave faster than you can say goodbye?
SEO Commandment Number Eight
Thou shall write compelling sales copy and body text
Writing for the Web is a bit different from writing for print. If you want visibility in the search engines, you will have to include your important keywords a few times at strategic locations in your pages, writing text that flows well but at the same time is rich in keywords and key phrases. Without sounding repetitive, a normal page of about 750 words should contain the keywords for that page two to three times. Also, as stated in the following commandment, make certain you write a proper title tag that is descriptive of that page, and be sure to include your keywords in the title tag.
SEO Commandment Number Nine
Thou shall always write good title tags
This is one of the most neglected activities in SEO, and yet it is still one of the most critical because it can make such a big difference! A well-written title tag has two functions: it helps your users know what the page is all about before they click on your link in the search engine results pages (SERPs), and it helps the search engines better index your page.
To get a better idea of what I mean, type any meaningful keyword or key phrase into Google or Yahoo and look at the first 10 results. Note that most (if not all) of these sites have a good description of what the page offers, all in the title tag. The title tag is the underlined snippet of text that you actually see written for each result. It is also what you see at the top of your browser window.
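To make this concrete, here is a minimal sketch of the kind of title tag I am describing, for a hypothetical blue-widgets page (the wording is only an illustration, not a formula):

    <head>
    <!-- The title tag: the underlined text searchers see in the results pages -->
    <title>Blue Widgets - Hand-Made Blue Widgets from Acme Widget Co.</title>
    </head>

A short, descriptive title like this tells both the searcher and the search engine what the page is about before the page is even opened.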
SEO Commandment Number Ten
Thou shall always keep up to date in SEO news
More than ever today, the search engine optimization (SEO) industry is advancing at a breathtaking pace. Just witness what has happened over the past year and you get the feeling that it will continue to accelerate.
Whether or not Google goes public, the search industry today is one of the fastest growing industries and it is revolutionizing the way companies advertise their products and the way consumers buy them.
Subscribe to good SEO newsletters, read search engine industry portal sites such as this one, and stay current with the news in this rapidly growing industry, and you will be certain to remain one step ahead of your competition.
Author:
Serge Thibodeau of Rank For Sales
Introduction
Building a successful SEO (Search Engine Optimization) campaign requires a lot of time and hard work. Search engines are constantly changing their algorithms and it’s up to you to make the necessary adjustments to accommodate these changes. Keeping track of all of your optimized pages can be a daunting task. However, you can avoid unnecessary confusion by organizing your optimized pages in a streamlined fashion. Although not common practice, this is one of the most important steps in any successful SEO campaign.
What do I mean by “organized?” Simply, that you should develop a clear plan on how your pages will be named and where they will be situated on your web site. You need to be able to easily identify and track what pages have been indexed by what engine and what pages need to be updated. One way to achieve this is to adopt a “naming convention”.
Example 1.
Your company web site sells widgets. You have a list of 5 of your most important keywords and you’ve optimized these keywords for 4 search engines. That’s a total of 20 optimized pages. You have a robots.txt file set up to prevent search engine ‘A’ from indexing pages that are intended for search engine ‘B’ and so on.
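For illustration only, a robots.txt file set up along those lines might look something like the sketch below (the spider names and file names are just examples of the idea, not a recommendation):

    # Keep Google's crawler away from the MSN-targeted copies
    User-agent: Googlebot
    Disallow: /widgets2.htm
    Disallow: /bluewidgets2.htm

    # Keep MSN's crawler away from the Google-targeted copies
    User-agent: msnbot
    Disallow: /widgets.htm
    Disallow: /bluewidgets.htm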
Let’s examine the drawbacks to this naming convention:
Keyword Page Name Engine
widgets widgets.htm Google
blue widgets bluewidgets.htm Google
red widgets redwidgets.htm Google
black widgets blackwidgets.htm Google
purple widgets purplewidgets.htm Google
widgets widgets2.htm MSN
blue widgets bluewidgets2.htm MSN
red widgets redwidgets2.htm MSN
black widgets blackwidgets2.htm MSN
purple widgets purplewidgets2.htm MSN
widgets widgets3.htm AltaVista
blue widgets bluewidgets3.htm AltaVista
red widgets redwidgets3.htm AltaVista
black widgets blackwidgets3.htm AltaVista
purple widgets purplewidgets3.htm AltaVista
widgets widgets4.htm Hotbot
blue widgets bluewidgets4.htm Hotbot
red widgets redwidgets4.htm Hotbot
black widgets blackwidgets4.htm Hotbot
purple widgets purplewidgets4.htm Hotbot
1. The words in your page names are not very distinct. This is important because a search engine cannot determine if bluewidgets.htm is made up of two distinct words “blue” and “widgets.” You need to find a way to separate these keywords in the page name or you will not get credit for the keyword in the file name.
2. Your page names are not easily identifiable. When you run a Reporter mission, you will see your pages indexed with the number appended to the keyword phrase in the file name. At first glance, this doesn’t tell you the engine for which the page is optimized. You need to be as descriptive as possible.
3. Using a robots.txt file in this way can diminish your exposure across all of the search engines. I explain this in the next section.
Now, let’s take a look how we can modify our page names in order to get credit for the keywords, and allow you to easily identify them in the corresponding search engine while gaining maximum exposure.
Example 2.
Below, you’ll see an example of how I have added hyphens to separate keywords in the page name. Also, I’ve appended an engine indicator to the file name, so it will be easy to distinguish what page is optimized for which engine.
Keyword Page Name Engine
widgets widgets.htm Google
blue widgets blue-widgets-gg.htm Google
red widgets red-widgets-gg.htm Google
black widgets black-widgets-gg.htm Google
purple widgets purple-widgets-gg.htm Google
widgets widgets-ms.htm MSN
blue widgets blue-widgets-ms.htm MSN
red widgets red-widgets-ms.htm MSN
black widgets black-widgets-ms.htm MSN
purple widgets purple-widgets-ms.htm MSN
widgets widgets-av.htm AltaVista
blue widgets blue-widgets-av.htm AltaVista
red widgets red-widgets-av.htm AltaVista
black widgets black-widgets-av.htm AltaVista
purple widgets purple-widgets-av.htm AltaVista
widgets widgets-hb.htm Hotbot
blue widgets blue-widgets-hb.htm Hotbot
red widgets red-widgets-hb.htm Hotbot
black widgets black-widgets-hb.htm Hotbot
purple widgets purple-widgets-hb.htm Hotbot
I use abbreviations such as "gg" for Google, "ms" for MSN, and so on. You don't have to use my abbreviations; however, make sure the naming convention that you implement is consistent. That's the most important thing.
Tip: Please be careful when creating an “engine indicator.” Do not spell out the entire engine name in your filename. For instance, avoid naming your page like this:
blue-widgets-google.htm
Although it has not been proven, Google and other crawlers could potentially flag such a page as a doorway page because it looks as though it was created specifically to rank high on that engine.
You might be thinking, "I've created a robots.txt file, so I don't have to worry about search engine 'A' indexing pages that are intended for search engine 'B.'" Yes, that is correct. However, if you use a robots.txt file for this purpose, you could be cheating yourself out of maximum exposure across all of the search engines.
If you do not use a robots.txt file, you will notice that search engine 'A' will index pages optimized for search engine 'B.' This is exactly what you want. To pull this off, though, you must be very careful, because you do not want similar content that could be flagged as spam.
It is completely possible to optimize several different pages that target the same keyword, and create content so unique that you will not be flagged for spam. As I mentioned, this will maximize your exposure across all of the search engines, while allowing you to increase the overall unique content of your site.
I can't tell you how many times engine 'A' has picked up pages that I've optimized for engine 'B' and ranked the 'B' pages higher than those I specifically optimized for 'A.' So, if at all possible, only use a robots.txt file to protect your confidential content from being indexed.
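Used that way, a minimal robots.txt sketch might look like this (assuming a hypothetical /private/ directory that holds the confidential pages):

    # Block all well-behaved crawlers from the confidential area only
    User-agent: *
    Disallow: /private/

Everything outside /private/ remains open to every engine, which is exactly the maximum-exposure situation described above.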
One final Tip
Try to avoid creating sub directories solely for the purpose of storing optimized pages for a specific search engine. Storing all of your optimized pages in your root directory gives you a better chance at higher rankings because most crawlers give more weight to pages found in the root directory. In this case, it is better to sacrifice the organization and shoot for the higher rankings.
Author Bio:
This article is copyrighted and has been reprinted with permission from Matt Paolini. Matt Paolini is a Webmaster/Tech Support Specialist for FirstPlace Software, the makers of WebPosition Gold. He’s also an experienced freelance Search Engine Optimization Specialist and Cold Fusion/ASP.NET/SQL Server developer/designer. For more information on his services, please visit http://www.webtemplatestore.net/ or send him an email at webmaster@webtemplatestore.net
Introduction
By now, if you market on the web, you know all about the Florida update, and have likely heard of the Austin update.
But in case you haven’t – a little reminder: Back in November 2003 we began to notice a fairly major change occurring in the Google index. In fact, if we go back to May 2003 we can discern hints of the changes (targeting affiliate sites and so forth). Back then no one knew what was coming our way.
Recently, another fairly major update, named Austin, occurred and even more sites were removed from the index. So many people are asking why. I thought I would throw in my two cents.
Florida was a major change in thinking for Google. They are attempting to return the engine to supremacy. In doing so, however, they did harm to a lot of sites that were playing fairly and by the rules. This collateral damage caused many to surmise that Google was once again broken and once again wouldn't be fixed.
We developed our own theory
We developed our own theory fairly early on with regard to Florida – that Google had implemented a context-based algorithm. Indications are still there that this is the case. We hinted then that results would improve, and this seems to be happening.
You see, the next thing Google had to do, once they had filtered out some sites, was to recalculate back links and PageRank. This happened in December. We noticed some PageRank fluctuations on our own site during this time. It has since leveled out.
So once Florida happened, Google had to recalculate PageRank to adjust for the sites which were filtered. Since those sites were removed, their influence on back links was also removed.
Google Austin Update
Now January comes – the Austin update. And what does Google do? They perform a refinement of the same algorithm which caused the Florida update. Nothing major – more like a tweak to allow some of the innocent sites which were previously dropped back into the SERPs. These are the results we are seeing now – some semblance of where Google will be in the next couple of months.
I don’t see this as the end though. Likely, in February another back link check will occur and those sites which were removed in January will lose their link influence in February. Then in March we should see a more stable index. True to Google’s nature – it will have been 90 days since the Florida update.
I bring up this figure because traditionally after a major Google change, it takes about 90 days for the Google index to stabilize and start returning more relevant results.
Be patient
Therefore, if you are a small site whose results have just now started to come back, try and be just a little more patient. My feeling is that in the next couple months you should start to recover some more of what was lost in the last two months. That is unless you were a site which was a target. These include affiliate sites, or sites which didn’t offer any useful content or information.
If your site is well constructed, easily spiderable and provides lots of useful information that is not too promotional, you should start seeing a rise in results. You may want to increase the amount of content you have, but be sure that it is informational in nature. Take a look at the average site ranking for your key phrases – how many pages do those sites have? You will probably want to fall somewhere in that range. Also, while you are analyzing your competitors, take a look at what kind of back links they have. Perhaps you can get links from those sites as well, as long as they are related to your industry. Remember that if your PageRank and site size are comparable to those of the sites currently ranking, you should also be considered an authority on the topic.
Summary
So as we begin February, remember that the Google ride isn’t over. We have just entered the phase where we are coming down the last hill near the end of the roller coaster ride. The end is near, but it’s not here yet.
Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists
Introduction
This article takes an in-depth look at some of the innovations introduced at ExactSeek.com, including, most recently, the implementation of a unique ranking algorithm that factors in Alexa traffic data.
Search engines basically cater to two audiences – webmasters and web searchers. Although there is an obvious overlap between the two groups, the latter dominates in numbers and most search engines bend over backwards to keep web searchers happy by continually tweaking search result algorithms. Webmasters, on the other hand, are more or less left to fend for themselves.
ExactSeek was founded 2 years ago on the premise that webmasters/site owners are the driving force on the Web – not in numbers, but in terms of ideas, development, enthusiasm and innovation. The goal for ExactSeek, then and now, was to build a major engine that catered to this audience and to web searchers more or less equally. Part 1 of this article deals with ExactSeek's focus on webmasters.
Free Site Listings
ExactSeek’s belief in the importance of the webmaster audience to search engine growth and traffic resulted in the rapid introduction of three types of free site listings, each geared to helping webmasters give their websites maximum exposure within the search engine.
Here’s a short explanation of each:
Standard listings: As the name implies, this site listing is just a basic website listing in the ExactSeek database, comprised of Site Name (taken from the Title tag), description (taken from the Meta Description tag) and Site Link (the website URL).
These basic listings show up numbered 1 to 10, etc. in ExactSeek’s search result pages. Any webmaster can obtain a Standard Listing simply by filling out the form at: http://www.exactseek.com/add.html
Enhanced listings: Not strictly a website listing – really more of an enhancement to the standard website listing. In brief, an enhanced listing consists of a small icon residing next to a website's standard listing which, when clicked, opens a 150×150 JavaScript window with additional information about that website. The cool thing about the enhanced listing is that webmasters can add up to 800 characters of text or HTML code to the content of the window, allowing them to:
– Embed audio, video or Flash.
– Include logo, product or service images.
– Include links to secondary site pages.
– Use font enhancements (size, color, style).
– Include more detailed information about their websites.
Simple examples can be seen at:
http://www.exactseek.com/enhanced.html
Priority listings: Introduced for those webmasters who wanted something in return for adding an ExactSeek search form to their websites. These listings appear in the first 10 to 100 ExactSeek search results, highlighted against a light blue background. Visit the link below to see an actual example:
http://exactseek.com/cgi-bin/search.cgi?term=perl
Obtaining a priority listing is a simple 3 step process outlined at:
http://www.exactseek.com/add_priority.html
The three types of listings noted above offer any halfway knowledgeable webmaster a means to maximize site exposure with minimal effort. Best of all, any site meeting ExactSeek's submission guidelines is added within 24 hours. No pointless weeks or months of waiting. Within 24 hours, websites are either indexed or not. If not, the most common reasons are:
1. The site submitted lacked a Title tag. The ExactSeek crawler ignores sites without Title tags.
2. The site submitted was pornographic or contained illegal content.
3. The site submitted was dynamically generated and its URL contained non-standard characters like question marks (?), ampersands (&), equal signs (=), percent symbols (%) and/or plus signs (+).
4. The submitting webmaster failed to respond to the submission verification email message sent out by ExactSeek. Or, as is becoming more and more common, the webmaster failed to receive the message due to sp@m and ISP filtering and, thus, could not confirm submission. Webmasters using AOL, Hotmail and Yahoo email addresses may soon find it impossible to have their websites added to any search engine using a verification system.
The Do-It-Yourself Approach
One of the most irritating things about many search engines is that it can take weeks or even months for free website submissions to be indexed. And once sites have been added, it can take weeks for changes to content and/or tags to be re-indexed and for webmasters to see if those changes had a positive effect on site ranking.
ExactSeek opted to put a measure of control back in webmaster hands by introducing a do-it-yourself approach. Early on, two simple online tools were made available which allowed webmasters to quickly check the positioning of any website in the ExactSeek database for any keyword(s) relevant to that site and then, if necessary, do something about it.
(a.) Site Ranking Tool
Tracks the top 10,000 site rankings for all keywords in the ExactSeek database, allowing a webmaster to find his site ranking for any keyword(s) relevant to his website.
(b.) Web Crawler Tool
Allows webmasters to schedule recrawls of their websites as often as once per week. Self-scheduled recrawls and instant site ranking checks provide webmasters with a quick read on which optimization strategies work and which don’t.
Both of the above tools can be found with additional explanation on the following page:
http://www.exactseek.com/srank.html
Do-It-Yourself – The Next Step
More recently, ExactSeek implemented a simple free membership system that takes the do-it-yourself approach one step further. Any webmaster who has successfully submitted a website to ExactSeek is automatically a member and can obtain a Member ID and Password for login purposes by using the appropriate form at the ExactSeek Member Login page.
After logging in, webmasters can access a Member Account Manager which allows them to edit, enhance, delete and/or recrawl their site listings as often as they like. Revolutionary? Maybe not, but a big step away from total reliance on search engine scheduling.
Part 2 of this article will look at how traffic data from Alexa Internet was incorporated into the new ExactSeek ranking algorithm, the algorithm’s impact on delivering quality search results and at how ExactSeek’s new paid inclusion program stacks up against other paid inclusion and PPC programs.
Author Bio:
Mel Strocen (c) 2003. Mel Strocen is CEO of the Jayde Online Network of websites. The Jayde network currently consists of 12 websites, including ExactSeek.com (http://www.exactseek.com) and SiteProNews.com.
Optimizing Dynamic Sites
Dynamic content is delivered to the Web browser in a different form than it exists on the server, while static content is stored on the Web server in the same format that is delivered to the Web browser. Dynamic site pages are generated from a database “on the fly” as users request them.
You can often tell when you are looking at a dynamically generated page, because dynamic URLs contain one or more query strings, beginning with a question mark (?) – for example, products.php?id=42&color=blue – while static URLs do not. There are exceptions to this rule, which we shall discuss below.
Search engines have a hard time with dynamic URLs. Dynamic URLs may cause search engines to mistake a small site for a very large one because an unlimited number of URLs can be used to provide essentially the same content. This can cause search engine spiders to avoid such sites for fear of falling into “dynamic spider traps,” crawling through thousands of URLs when only a few are needed to represent the available content. Here’s how three popular search engines handle this:
* The FAST search engine will crawl and index dynamic URLs as quickly and easily as static ones (at this time).
* AltaVista doesn’t crawl dynamic URLs at all, but it will index each dynamic URL that you take the time to submit individually.
* But these two search engines are relatively insignificant. Google will crawl dynamic URLs at about a third of the speed and depth at which it indexes static pages. It will barely crawl them at all if there are session IDs in the query strings, because it will soon discover that multiple URLs lead to the same page and regard the site as being full of duplicate content.
Another challenge dynamic sites throw at search engines is serving up different core content at the same URL. This might result when a site has content that may be viewed at the same URL in multiple languages, depending on the browser settings, or content, such as on a news site, which changes every few minutes.
Search engines want to be accurate they want visitors to a particular URL to see the same content the spider saw. They also want to be comprehensive. They vie with each other to have the largest database. Thus, they have billions of pages to index and typically can only visit each URL once every few weeks or so (although Google is pretty good at recognizing content that changes frequently, and spidering it more often). So if a search engine indexes your English content at a given URL, it will probably not index your Spanish content at the same URL during the same indexing period.
The Solution
The solution is to give each search engine unique core content at a unique URL, and ensure that all visitors see the same core content. There are three main ways of achieving this.
1) Use static URLs to reference dynamic content. If a search engine sees a static URL, it is more likely to index the content at that URL than if it found the same content at a dynamic URL. There are several ways of turning dynamic URLs into static URLs, despite the fact that you are serving dynamic content. Your method will depend upon your server and other factors. A friend of mine had the following experience after implementing this solution for a client:
"For the last year, since rewriting the dynamic URLs, my client's site has been riding high in the rankings for thousands of search terms. Before the URL rewriting, Google had indexed just about 3,000 pages in the course of 18 months. In the first week of using URL rewriting, Google was grabbing 3,000 pages per day from the 500,000-item database it had previously barely touched. By the end of the first 2 months of using URL rewriting, Google had indexed over 200,000 pages from the site."
The following sites offer instructions for two popular servers:
* Apache:
* ASP:
A good step-by-step tutorial can be found at fantomaster.com. The article links are on the right hand side. There are four articles in the series.
Here are some examples of sites that have implemented one of these
approaches:
* Yahoo.com (yes, Yahoo!)
* Epinions.com
* Dooyoo.co.uk
* Pricerunner.com
URL rewriting is a very common practice. Not only is it exceptionally powerful in terms of search engine optimization, but it is also superb for usability and marketing in general. A shorter, more logical-seeming URL is far easier for people to pass on in an email, link to from their homepage, or spell out to a friend on the telephone. Shorter URLs are good business.
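As a rough sketch of what URL rewriting can look like on an Apache server (assuming mod_rewrite is enabled; the products.php script, pattern and parameter names are invented for the example):

    # .htaccess: serve static-looking widget URLs from the dynamic products.php script
    RewriteEngine On
    RewriteRule ^widgets/([a-z]+)-([0-9]+)\.html$ products.php?color=$1&id=$2 [L]

With a rule like this, a visitor or spider requesting /widgets/blue-47.html is silently served the output of products.php?color=blue&id=47, so the short static URL is the only one anybody ever sees.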
Solution 2
2) Link to dynamic URLs from static URL pages. The above solutions are elegant, but may be difficult for some sites to implement. Fortunately, there is a simple work around for smaller sites.
One method search engines use to crawl dynamic content while avoiding dynamic spider traps is to follow links to dynamic URLs from static URLs. If your site isn’t too large, you could build a static site map page consisting of links to dynamic URLs. The search engines should crawl those links, but will probably go no further.
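A static site map of that kind might be nothing more than a plain page of links, along these lines (the file names and query strings are invented for the example):

    <!-- sitemap.html: a static page whose only job is to link to the dynamic URLs -->
    <ul>
    <li><a href="/products.php?id=1">Blue widgets</a></li>
    <li><a href="/products.php?id=2">Red widgets</a></li>
    <li><a href="/products.php?id=3">Black widgets</a></li>
    </ul>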
An even more effective technique would be to get other sites to link to your dynamic pages. If these sites have good Google PageRank, your dynamic pages will not only be indexed, but the likelihood of their achieving a high ranking for the key words on them will increase significantly.
Solution 3
3) Pay for inclusion? AltaVista, Ask Jeeves/Teoma, FAST and Inktomi offer paid-inclusion (PPI) programs. You pay $25/page (or so) to ensure that the page is spidered frequently (Inktomi spiders every 48 hours for that price). This will garner some traffic, but since Google now accounts for over 70% of all search engine traffic and continues to grow stronger all the time, don't throw too much money into this solution unless you have deep pockets. If your site is huge, the cost could be prohibitive.

Paying to have your pages spidered does not guarantee that they will rank well, so they must still be optimized properly. Frequent spidering does let you experiment with optimization and see your results within a day or two.

Search engines, including those with PPI options, want their databases to be as large as possible. So if you don't pay for inclusion, and instead implement one of the solutions discussed above, your pages will probably be indexed anyway. On the other hand, if you pay for some of your pages to be spidered, there's a good chance the ones you don't pay for won't be.
Summary
To summarize:
1. Search engines have problems indexing dynamic content.
2. If possible, use static URLs to reference dynamic content.
3. Otherwise, try to link to your dynamic URLs from static pages.
4. If your budget allows, consider using paid-inclusion programs.
Author Bio:
Rick Archer owns and operates SearchSummit, a one-man SEO company. His clients include many site owners who used to employ large SEO firms, but found that a personal, direct approach was preferable. Visit his website at http://www.searchsummit.com.
Usability & SEO
Today, I will slightly steer away from the subject of search engine optimization and instead present what should be the most important considerations when designing or completely rebuilding an existing site.
While usability features are primarily intended for your end users, you should also think of the search engine robots (spiders) that will regularly come to visit your site, in an effort to index and refresh its pages.
To make a site that is both friendly to your users and to the search engines, you always have to think in terms of usability. In order of priority, the five most important usability features you should always implement in your website are:
1. Ease of navigation
2. Simplicity and intuitiveness
3. Clearly indicated menus and links
4. A well-designed site map
5. Clean, uncluttered design
Ease of navigation
Your site should be designed so that it's easy to navigate, no matter where you happen to be in the site. Always include a 'Back to our Homepage' button or link, either at the top or at the bottom of the page. If your visitors get lost, as some of us do at times, clicking on the Home button or link will at least bring them back to the beginning of your site.
It's also a good idea to place a 'Back to the Top' button at the bottom of the page, to help visitors get back to the top. Also, when you are building or 'renovating' a website, always include a contact button or link, either at the top or bottom of all your pages. The contact page is often the only way visitors can reach you, whether by email, by phone or in person.
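For what it's worth, both of these links are trivial to add with plain HTML anchors (illustrative markup only; the homepage file name is assumed):

    <!-- Near the top of the page -->
    <a name="top"></a>

    <!-- At the bottom of the page -->
    <a href="#top">Back to the Top</a> | <a href="index.html">Back to our Homepage</a>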
Make it simple and intuitive
Did you ever try a new software package or test a new program application for the first time and find it to be easy to use and friendly? You seem to know in advance what effect every button or menu function will have, even before you click on it. Your website should have the same 'look and feel'. Make it easy for your visitors, take every step necessary to make their experience as enjoyable as possible, and they will come back for more.
People today are always in a rush and they have less and less time to figure out how something works. They came to your site because they typed a keyword or two in a search engine and they expect to rapidly find on your site what it is they are looking for. Make sure they find it fast and they will certainly be repeat buyers.
Clearly indicated menus and links
I still sometimes come across websites that seem to have been designed for the person who originally built them and nobody else. What may seem self-explanatory to you may not be so simple for some users to figure out. Having a button called 'Knowledge Base' may not mean much to some people; if that button were simply called 'Help', it would probably serve them better.
A well-designed site map
A good site map has two functions: it helps your users find what they are looking for, and it also helps the search engines better index all of your site. It's always a good idea to group your site's pages under topics. If you are selling both physical products and actual services on your site, it might be a good idea to place them in two different groups. Avoid the temptation of cramming all your pages together just to make it go faster.
Take the time to design a good site map where people will easily find the page or section that interests them. Also, make sure your site map is linked directly from your homepage. This last step is important, since one of the first things a search engine robot looks for when it arrives at a website is the site map link. Also, make sure that your site map link is a text link. Remember that most search engines cannot read or understand JavaScript or image maps.
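The difference is simply between a plain text link, which any spider can follow, and a script-driven one, which many cannot (illustrative markup only; the file and function names are made up):

    <!-- Spider-friendly: a plain text link to the site map -->
    <a href="sitemap.html">Site Map</a>

    <!-- Spider-unfriendly: most robots will not follow a JavaScript-only link -->
    <a href="javascript:openSitemap()">Site Map</a>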
Clean, uncluttered design
For some people, nothing is more frustrating than arriving at a site and not having a clue what to do next or where to go. They usually leave as fast as they came! For that reason, avoid putting heavily animated graphics, background music or Flash movies in any part of your site. What's more, most search engines today penalize websites that make heavy use of animated graphics or Flash movies, since they cannot read or understand them.
A clean and uncluttered design usually wins hands down. The simplest and 'cleanest' websites are usually the ones that are visited the most, since many people seem to know they will find what they are looking for there, and they usually come back often too.
Conclusion
You are investing a lot of time, energy, resources and money in your website. Make sure you maximize that investment by keeping visitors on your site for as long as you possibly can. Once they have bought from you, and if they liked the experience, statistics show they will come back.
Give them a reason or two to come back. Offer them a free weekly or monthly newsletter in which you talk about the subjects that interest them. Another way to create a lot of interest is to regularly write 'How To' articles, similar to this one. Most people are interested in learning more, and the Internet is the perfect place for this.
Make your website THEIR focal point of interest by offering your visitors what truly interests them, and you will greatly increase your chances of success.
Author:
Serge Thibodeau of Rank For Sales
Major search engine submission
Getting your web site listed in the major search engines is an absolute necessity. Why? Because 85% of all people on the Internet use them when searching for information. When they type a keyword or keyword phrase into a major search engine, up pops your web site – or at least you hope so.
The easiest, fastest (and free) way to get listed in any of the major search engines is to get links from other related sites that point to your site. Once you get your web site listed in one major search engine or directory, your site will start to appear in the others.
Besides working on your link building strategies, you can also submit your web site to the major search engines.
Here’s how to submit to the major search engines, listed in alphabetical order:
AllTheWeb
1. AllTheWeb – owned by Overture, which also provides results for Lycos.
Free Submission
http://www.alltheweb.com/add_url.php
• Only submit your homepage and a couple of inside pages.
• It takes about 6 weeks before your pages appear.
• There is no guarantee to get listed.
Paid Submission
• AlltheWeb doesn’t have a direct paid inclusion program but uses Lycos Insite Select http://insite.lycos.com/inclusion/searchenginesubmit.asp?co=undefined Price $35.00
• It takes 72 hours to get listed. Your pages will be visited every 2 days for 1 year. This is a good option if you want a new site to quickly get listed.
Altavista
2. Altavista – owned by Overture
http://www.altavista.com/addurl/new
Free Submission
• This gives you a basic submission.
• You can add up to 5 URLS to be considered for inclusion in 4-6 weeks.
• Submit your homepage and one or 2 other pages.
• There is no guarantee of inclusion.
Paid Submission
AltaVista Express Inclusion
https://www.infospider.com/av/app/signup
• Submit your homepage.
• Your pages will be listed within 2 weeks and revisited regularly.
• Price $39.00 for one URL for a 6 month subscription.
• Guarantee: Your pages could be dropped if you don’t renew your subscription.
Google
3. Google
http://www.google.com/addurl.html
Google is the top choice for searchers. Its search engine results also appear in Yahoo and AOL.
Free Submission
• It is free to submit but there is no guarantee of a listing.
• Submit your homepage and a couple of other pages.
• It takes about a month for your pages to appear.
The best way to get listed in Google is to find a lot of related inbound links – links from other sites that point to yours. This will vastly improve your search engine rankings and your site will probably show up more quickly.
InfoSpace & Inktomi
4. Infospace
http://www.infospace.com/info/submit.htm
• Free to submit; however, you need to fill out a form to get listed.
5. Inktomi – it was recently purchased by Yahoo, so it may provide results for Yahoo in the future.
http://www.inktomi.com/products/web_search/submit.html
Free Submission
As with all other major search engines, building links to your site is the best way to go if you want to get listed for free.
Paid Submission
• Inktomi’s search engine has an extensive network of Web search partners, including MSN, About, HotBot, Overture.
• It refreshes your site every 48 hours, to keep your content up to date in the index.
• Submit your homepage.
• Your site will be listed in 2 days and will be regularly revisited up to one year.
• Price $39.00/year. Each URL thereafter costs $25.00/year.
• Guarantee: Your pages may be dropped if you don’t renew your subscription.
Teoma
6. Teoma – provides results for the popular web site Ask Jeeves
http://ask.ineedhits.com
Ask Jeeves owns the Teoma search engine. Teoma partners include:
Ask.com, Teoma.com, Metacrawler, Excite, iLor, MySearch.com,
MyWay.com, Ixquick, Mamma.com, Hotbot.com, Search123.com.
Free Submission
Like the other major search engines, get links from other web sites to point to yours and the search engines will eventually find you through their linking structure.
Paid Submission
• Ask Jeeves Site Submit (http://ask.ineedhits.com)
• Submit your homepage
• Your site will be listed within one week.
• 1st URL $30.00
• 2 – 1000 URLs $18.00 each
• Guarantee: Your pages may be dropped if you don’t renew your subscription.
Tips
1. Just submitting your web site to the major search engines is not sufficient to get a good ranking. Make sure your web site is correctly optimized before you do your submissions. Read How to Use Keywords to Optimize Your Site for Search Engines
2. Getting massive amounts of traffic doesn’t guarantee sales. Your web site copy must pull your visitors through the page for them to take action. Get others to read your copy and ask for their opinions.
3. To achieve higher rankings (and therefore more traffic) in the search engines, you need to continually refine your web site. Monitor your web site rankings and if necessary, adjust the keywords in your meta tags and web site copy.
4. Search engines should be only one of your marketing strategies. One week you could be at the top of a search engine, the next week your site may disappear. Therefore don’t just rely on traffic from the search engines, but use other methods to drive traffic to your site.
Now go and submit your site to these major search engines. When you succeed in getting a top listing, as a result of taking action on this article, drop me a thank you note – I’ll be cheering for you!
Part 2 of this article will explain
How to get listed in the Major Search Engine Directories
Author Name: Herman Drost
Company: ISiteBuild.com
Author Bio:
Herman Drost is the author of the new ebook “101 Highly Effective Strategies to Promote Your Web Site” a powerful guide for attracting 1000s of visitors to your web site. Subscribe to his ‘Marketing Tips’ newsletter for more original articles at: subscribe@isitebuild.com. Read more of his in-depth articles at: www.isitebuild.com/articles
How to Get Your Web Site Listed In The Search Engine Directories
Submitting your web site to the search engine directories will give you a great boost in traffic. A listing in a large search engine directory like the ODP (Open Directory Project) enables your web site to get indexed for free by major search engines such as AskJeeves, AOL Search, Google, and Lycos. Directories will also help increase your page ranking. However, getting listed in the search engine directories is more difficult than getting listed in the major search engines. Preparation and patience are needed.
In Part 1 we covered “How to Get Your Web Site Listed in the Major Search Engines”
So what’s the difference between the major search engines and major search engine directories?
Search engines are often confused with search directories, but the two are not the same.
A search engine is a web-based software tool that enables the user to locate sites and pages on the web based on the information they contain.
Search directories use human editors, so you are dependent on these editors to write a description of your website that covers the keyword phrases you need. Therefore, before submitting your site to a directory, think carefully about the description that you enter.
How to prepare your web site for search engine directory submissions.
1. Research the best keywords – use the Overture suggestion tool or Wordtracker to find the keywords best suited to your site. There is no point entering words in a directory that don't relate to your site or are rarely searched for. Include these keywords in your page content and in each of your meta tags.
2. Evaluate your site – search directories are more particular than search engines about which web sites they allow to be listed. Therefore make sure your site has no broken links, includes relevant content-based pages, loads quickly (within 10 seconds), is easy to navigate, and works across browsers.
3. Prepare your title – if you are a business, enter your business name, e.g. Drost Web Design. If you don't have a business name, use the name of your site, e.g. iSiteBuild.com. Don't use a long title. A long title may help you get listed in the search engines, but not in the directories.
4. Write a description – remember that human beings will be reading your description, so don't turn it into an ad. Create a well-written description of about 25 words that clearly explains what your site has to offer.
5. Look professional – your web site should have a professional appearance. Don’t just put up a bunch of affiliate links to other sites or just have a one page sales letter.
6. Submit your site – navigate through the directory to find the best category for your web site. Don't just pick what you think is the most popular category; pick the most targeted one. An easy way to find it is to enter your search terms in the directory.
7. Be patient – it can take months, or sometimes even years, to get listed. This is because thousands of sites are submitted every day. If your site has not been listed after 3 months, resubmit.
8. Guarantee – there is no guarantee you will get listed, as the categories may already be filled. There may also be a backlog of sites waiting to be reviewed. Therefore, concentrate some of your efforts on other marketing methods.
In Part 3 of this article we’ll discuss the detailed requirements for submitting your web site to each Major Search Engine Directory in order to attract thousands of targeted visitors.
Author Bio:
Herman Drost is the author of the new ebook “101 Highly Effective Strategies to Promote Your Web Site” a powerful guide for attracting 1000s of visitors to your web site. Subscribe to his ‘Marketing Tips’ newsletter for more original articles at: subscribe@isitebuild.com. Read more of his in-depth articles at: www.isitebuild.com/articles