Getting more Google backlinks
Search engines have struggled over the years to provide relevant, high-quality results for their users. At the same time, they have faced the “attack of the webmasters” from those determined to use every trick or tool to get their website to the top of the listings.
It can become a webmaster obsession (or maybe an addiction?) to “win the game” of search engine ranking.
Another option is to part with cash and have a search marketing company do the work for you. This will not be instant, guaranteed, or cheap, as it takes time and resources.
This has forced the engines to change how they rank sites. The most important change to the rules concerns links, or link popularity, meaning the number of websites pointing to your site.
With Google this also means the quality of the sites linking to you. There are several ways to get links to your site:
• Exchange or reciprocal – This is when web site owners agree to place links on each other’s sites.
• Affiliate Program – Having your own affiliate program is sometimes overlooked as a form of linking, but it can be not only a good source of links but also of real revenue.
• Paid Linking – This is when you pay for links in directories.
• Free Linking without return links – This is the area I want to focus on.
Some Internet marketing experts will tell you it’s not possible to get free links to your site, but there are ways that work. I will show you that they are effective and, more importantly, that Google loves them.
Articles – Writing and submitting articles to other site owners is an effective way to get links to your site.
It’s not as effective as it once was, but publishers are still looking for quality, unbiased information. All you have to do is make sure you have a resource box at the end of each article that points to your site.
Writing tips & resources
Check out Jim Edwards’ book on how to write articles here:
http://www.turnwordsintotraffic.htm
It has great templates you can use: just fill in the blanks to format your message.
If your spelling or grammar is not the best, then a low-cost editing option might fit your needs:
http://www.yourbestwork.com/
With articles, I highly recommend writing the article, leaving it for at least a day, and then looking over it again before allowing it to become public.
This gives you a chance to catch any obvious spelling and grammar mistakes. Once you have your article finished, you need to find places to submit it. I have a list here:
http://www.digitalawol.com/esub.htm
But maybe you have a very specific market you want to reach?
Where can I post my articles?
Here is a step-by-step way of using Google to find good results in a specific market. Say you produce products or sell e-books on “weight”.
1. Search in Google, but think of the big picture. Instead of using “weight”, search for “health”, which brings up around 181,000,000 results.
2. Scroll to the bottom of the results page and click the “Search within results” link.
3. Enter “submit article” in the text box, with the quote marks, which will look for exact matches to the search term.
4. This brings up about 55,800 results, which should allow you to find at least 1,000 sites that would accept your articles. Once you have covered those, you can start again with different keywords (see the sketch below).
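If you have several topics and phrases to try, you can generate the query list ahead of time. Here is a minimal Python sketch; the seed terms are only examples, not recommendations:

# Build "search within results"-style queries: a broad topic term
# combined with an exact-match phrase in quotes.
seed_topics = ["health", "fitness", "nutrition"]          # assumed examples
phrases = ['"submit article"', '"article submission"']    # assumed examples

queries = [f"{topic} {phrase}" for topic in seed_topics for phrase in phrases]
for q in queries:
    print(q)   # e.g. health "submit article"

Paste each line into Google and work through the results one phrase at a time.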
Also, when you find a suitable site that accepts articles, create a folder within your bookmarks (add it to your Favourites list in Internet Explorer) and save the submission link there for ease of returning.
Getting free links
Other sites offer free links where you can place a link. A word of caution here: some sites use different programming languages in their design. They may list your site, but Google and other engines have trouble reading the resulting page, and the link may not actually point to you but to a redirect file, which then forwards to your site.
The addresses of the links on the page may be something like this:
http://thesite.com/default.asp?m=gs&SiteID=39647 or
http://thesite.com/redirect?=1234
So they don’t actually count as links to your site. Look at other links on the site for a clue here. Check it out first before wasting your time placing a link.
On a side note, such a link may still bring you some traffic if the site gets a large number of visitors.
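If you want to vet a links page in bulk, a rough script can flag suspicious hrefs for you. The sketch below is only an illustration in Python: the URL and the patterns it looks for are hypothetical, and reading the page source by hand works just as well.

# Collect every href on a links page and flag redirect-style URLs,
# which search engines may not count as links to your site.
from urllib.request import urlopen
from html.parser import HTMLParser

class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

html = urlopen("http://thesite.com/links.html").read().decode("utf-8", "ignore")
parser = LinkCollector()
parser.feed(html)

for href in parser.links:
    # Redirect scripts usually pass the target through a script with a
    # query string, so "redirect" or an ID parameter is a warning sign.
    suspect = "redirect" in href or "SiteID=" in href
    print(("SUSPECT " if suspect else "direct  ") + href)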
Getting links from discussion forums
Another way to get links is via discussion forums. Again, you have to check that your links will count towards your site. Also, the forum must be relevant to your site’s topic, so it is best if it has your keywords in its name.
For example, I regularly post help at Duncan Carver’s Marketing-Stratergy.org. When I post, I always use a signature like this at the end of each message:
——
Paul J Easton
Internet Marketing and Promotion from DigitalAwol.com
Custom IE (Internet Explorer) Toolbars from Createtoolbar.com
—–
And the most important words above are linked to my sites (that’s everything before each “from”).
Because the forum is very active with new posts, the Google spider, which indexes pages for the search engine, returns very often (it loves fresh content).
A small problem is that the pages created have a question mark in their URLs, like the examples above. All I need to do in this case is submit the page I posted on, to ensure that Google finds and indexes it.
Within a couple of days, Google had listed my link back to my site. You can see this by going to Google, pasting link:http://www.createtoolbar.com into the search box, and viewing the first link!
This is a great win-win. I helped someone else out and I got links to both my sites!
Hope this helps.
Author:
Paul Easton
The Foundation
In my recent article “The Future of Search Engine Technology”, I looked at a number of developments that could improve search technology in the future. I strongly believe that we are witnessing the infancy of search engine technology, but I wanted to hear what others had to say. Today, we start a series of interviews with prominent experts, insiders and search engine developers to hear their thoughts on the future.
If you’ve been online for any length of time and work in any industry connected with the Internet, you will have heard of Robert Scoble. The Microsoft employee maintains a daily blog (when he’s not working to get the word out about Microsoft’s new “Longhorn” operating system) where he gives his thoughts on all things Microsoft, while also casting a critical eye on the competition. Scoble does a great job of keeping a distinct line between what’s “official” and what’s simply his opinion.
I was fortunate enough to catch him taking a sabbatical from his blog and asked him his thoughts on the future of search engine technology. Scoble did ask me to note that the following represents his personal opinions, not Microsoft-vetted opinions.
Search Engine Technology 1
[Andy Beal] Robert, tell me about the search engine technology being developed that you are most excited about?
[Robert Scoble] That depends on whether you’re talking about Internet searching, or searching on your local hard drive. If we’re talking about your local hard drive, searching for files on your local hard drive is still awful and getting worse.
[AB] Why do you say that?
[RS] Because hard drives keep getting bigger (a 60GB drive at Fry’s Electronics is $60 now — in three years we predict it’ll be $20 and you’ll see 500GB drives for less than $100). It’s easier to create files now than it is to find them.
Today, search tools like X1 are the most interesting because they index your hard drive and make it easy to search for email and files on your local drives. Microsoft Research has been working on a tool called “Stuff I’ve Seen” too, which is also quite interesting (both let you search email as well as files on your hard drive). But these tools don’t go far enough. First, they are bolted on top of the operating system, so while they are indexing, your system often sees slowdowns. Bolted on like that, they can’t be designed to work properly with the operating system and with other applications that might need processor time.
Plus, to really make search work well search engines need metadata and metadata that’s added by the system keeping track of your usage of files, as well as letting application developers add metadata into the system itself. In a lot of ways, weblogs are adding metadata to websites. When a weblog like mine links to a web site, we usually add some more details about that site. We might say it’s a “cool site” for instance. Well, Google puts those words into its engine. That’s metadata. (Technically metadata is “data about data”). Now if you search for “cool site” you’ll be more likely to find the site I just linked to. So, you can see how Google’s engine is helped by metadata. But, we haven’t been able to apply those lessons to the thousands of files on your hard drive. That will change.
[AB] Can you explain the problems faced with searching hard drives and what Microsoft is working on to help?
[RS] What if we did the same thing on your hard drive [as Google]? For instance, look at pictures. When I take pictures off of my Nikon, they have some metadata (for instance, inside the file is the date it was taken, along with the exposure information), but that metadata isn’t useful for most human searches. For instance, how about if I wanted to search for “my wedding photos”? Neither X1 nor Windows XP’s built-in search would find your wedding photos. Why? Because they have useless names like DSC0001.jpg and there’s no metadata that says they are wedding photos.
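You can see the gap Scoble describes by dumping what a camera actually records. A small sketch, assuming the Pillow imaging library (my choice; no tool is named in the interview) and the hypothetical file name from his example:

# Print the EXIF metadata stored inside a photo. Dates and exposure
# settings are there; "wedding" is not.
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("DSC0001.jpg")          # hypothetical camera file name
for tag_id, value in img.getexif().items():
    name = TAGS.get(tag_id, tag_id)      # e.g. DateTime, Model, ExposureTime
    print(name, value)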
Let’s go forward a couple of years to the next version of Windows, code-named Longhorn. In Longhorn we’re building a new file storage system, code-named WinFS. With WinFS searching and metadata will be part of the operating system. For instance, you could just start typing in an address bar “W” and “E” and “D” and “D” and anything that started with WEDD would come up to the top. For instance, your wedding documents, spreadsheets, and photos.
But, WinFS goes further than X1 and other file search tools do today. It lets you (and developers of apps you’ll use) add metadata to your files. So, even if you don’t change the name of your files, you might click on one of the faces in a picture application and get prompted to type a name and occasion. So, you would click on your cousin Joe’s face, type in “Joe Smith” and “Wedding.”
Now whenever you search for wedding stuff, that photo will come up. And that’s just the start. If you imported a group of photos into a wedding album, you’d be adding metadata for the search engine to use. In other words, you’ll see a much nicer system for searching your local hard drive.
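As a toy illustration of that tagging idea (this is not WinFS code, just a sketch of the behavior described, with made-up file names and tags):

# Files keep their unhelpful camera names; user-supplied tags carry the
# meaning, and a prefix search over the tags finds "wedd..." instantly.
tags = {
    "DSC0001.jpg": ["wedding", "Joe Smith"],
    "DSC0002.jpg": ["vacation"],
    "budget.xls":  ["wedding", "spreadsheet"],
}

def search(prefix):
    prefix = prefix.lower()
    return sorted(f for f, ts in tags.items()
                  if any(t.lower().startswith(prefix) for t in ts))

print(search("wedd"))   # ['DSC0001.jpg', 'budget.xls']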
Search Engine Technology 2
[AB] It looks like Microsoft has things mapped out for offline searches, but can they compete with Internet search engines?
[RS] Now, if we’re talking about the Internet, then Google has done an awesome job so far. I use Google dozens of times a day. Will MSN [search] be able to deliver more relevant results than Google? I don’t know. Certainly that’s not the case today. Will that change tomorrow? I’m waiting to see what the brains at MSN do.
One thing I do see is that in Longhorn, search will be nicer for customers. Google is working on making its toolbar the best possible experience. We’re working on a whole raft of things too. I’m very excited about the future of search, no matter which way things go.
[AB] Let’s look beyond the next couple of years. What new developments in search do you see happening in the next 3-5 years?
[RS] For Internet searches, I see social behavior analysis tools like Technorati becoming far more important. Why? Because people want different ways to see potentially relevant results. Google took us a long way toward that future, as Google’s results are strongly influenced by how many inbound links a site has. But now, let’s go further, even further than Technorati has gone. Let’s identify who really is keeping the market up to date in a certain field and give him/her more weight.
I also see that search engines that search just specific types of content (like Feedster) are going to be more important (Feedster only searches RSS and Atom syndication feeds).
Oh, and users are going to demand new ways of exporting searches. Google showed us that with News Alerts. Enter in a search term, like “Microsoft” and get emailed anytime a news source mentions Microsoft. Feedster goes further than that. There you can build an RSS feed from a search term. I have several of those coming into my RSS News Aggregator and find they are invaluable for watching what the weblogs are saying about your product, company, or market. For instance, one of my terms I built a feed for is “WinFS” — I’ll be watching to see how many people link to this article and if any of you have something interesting to say I’ll even link back.
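Consuming such a search feed takes only a few lines. A sketch using the feedparser library (my assumption; the interview names no tools) and a placeholder feed URL:

# Poll an RSS feed built from a search term and print the latest hits.
import feedparser

feed = feedparser.parse("http://example.com/search.rss?q=WinFS")  # placeholder URL
for entry in feed.entries[:5]:
    print(entry.title, "-", entry.link)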
[AB] Let’s look at your “wish list”. Assuming there were no restrictions in technology, what new feature would you like to see introduced to search engines?
[RS] I want to see far better tools for searching photos — and connecting relationships between all types of files and photos. For instance, why can’t I just drag a name from my contact list to associate that name with a face in a photo? Wouldn’t that help searching later on? In just 18 months I’ve taken 7400 photos. But I can’t search any of them very well today without doing a lot of renaming and other work.
[AB] What impact do you see social networking having on the future of search engine technology?
[RS] We’re already seeing an impact over on Feedster and Technorati. It’s hard to tell what’ll come in the future, but what would happen if everyone in the world had a weblog and was a member of Google’s Orkut? Would that change how I’d search? Well, for one, it’d make me even more likely to search for people on services that linked together social spaces and weblogs. Heck, I can’t remember my brother’s email address, but Google finds his weblog (and I can send him an email there).
One other thing I’ll be watching is how Longhorn’s WinFS gets used by application developers to build new kinds of social systems. Today if you look at contacts, for instance, they are locked up in Outlook, or another personal information management program like ECCO. But, contacts in Outlook can’t be used by other applications (particularly now because virus writers used the APIs in Outlook to send fake emails to all contacts in Outlook, so Microsoft turned those features off).
[AB] WinFS changes that. How?
[RS] By putting a “contacts” file type into the OS itself, rather than forcing applications developers to come up with their own contacts methodology.
What if ALL applications, not just Outlook, could use that new file type? What if we could associate that file type to social software services like Friendster, Tribe, Yahoo’s personals, or Google’s Orkut? Would that radically change how you would keep track of your contacts? Would that make contacts radically more useful? I think it would.
Already we’re seeing systems like Plaxo keep track of contacts, but Plaxo is still unaware that I’ve entered my data into Google’s Orkut and Friendster. Why couldn’t I make a system that’d associate the data in all my social software systems? Including Outlook?
Search Engine Technology 3
[AB] Do you foresee any problems with the WinFS approach?
[RS] Developers distrust Microsoft’s intentions here. They also don’t want to open up their own applications to their competitors. If you were a developer at AOL, for instance, do you see opening up your contact system with, say, Yahoo or Google or Microsoft? That’s scary stuff for all of us.
But, if the industry works together on common WinFS schemas (not just for contacts either, but other types of data too), we’ll come away with some really great new capabilities. It really will take getting developers excited about WinFS’s promise and getting them to lose their fears about opening up their data types.
[AB] Do you foresee a time when commercial search results (product/services) will be separated from informational search results (white papers/educational sites)? And do you think all commercial listings will eventually be paid only?
[RS] I don’t see the system changing from the Google-style results of today. Searchers just want to see relevant results. Paid-only searches won’t bring the most relevant results.
[AB] What makes you say that?
[RS] Because I often find the best information on weblogs. Webloggers are never going to be able to afford to pay to be listed on search engines.
Commercial-only listings might be seen on cell phones or PDAs, though. If I were doing a cell phone service for restaurants in Seattle, for instance, I might be more likely to list just member sites. But, thinking about it, I still don’t see such a system becoming popular enough without listing every restaurant in some way.
[AB] Speaking of cell phones: how do you see search engine technology impacting our use of PDAs and cell phones?
[RS] Not sure if search engine technology will impact it, but the mixture of speech recognition with search engines might change it a lot. When I’m using my cell phone I don’t want to look at sites that have a lot to read (I’ll save those for later when I’m in front of a computer or my Tablet PC); instead, I want to find the closest Starbucks. Look up movie listings. Find a nice place to have a steak dinner. And now cell phones are reporting E911 data, meaning the cell phone system knows approximately where you’re located, so it can give you just one or two Starbucks, rather than all of the ones in Seattle.
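The “closest Starbucks” filtering Scoble describes is straightforward once the phone’s approximate position is known. A back-of-the-envelope sketch with invented names and coordinates:

import math

def haversine_km(lat1, lon1, lat2, lon2):
    # Great-circle distance between two points, in kilometres.
    r = 6371.0
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

shops = [("Starbucks Pike Place", 47.610, -122.342),   # invented data
         ("Starbucks 4th Ave",    47.606, -122.335)]
phone = (47.608, -122.340)   # approximate position from the network

nearest = min(shops, key=lambda s: haversine_km(phone[0], phone[1], s[1], s[2]))
print(nearest[0])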
[AB] If search engine users gave up a little of their privacy and allowed their search habits to be monitored, would this allow the search engines to provide better, customized results?
[RS] Yes. I already give Google the ability to watch my search terms (I use the Google Toolbar). But, it always must be a choice. People really hate it when you don’t have strict privacy policies that are easy to understand and they hate it if you don’t give them a choice to not report anything.
[AB] Robert, you’ve certainly opened our eyes to the future of search engine technology, is there anything else you would like to add?
[RS] To echo what I said above, I hope the industry sees the opportunities that Longhorn’s WinFS opens up. We can either work together and share data with each other, or we can be afraid and keep data to ourselves. It’ll be an interesting time to watch in the next three years.
Many thanks to Robert Scoble, Microsoft employee and blogger extraordinaire. Please be sure to visit SearchEngineLowdown.com as we continue to highlight thoughts and views on the future of search engine technology.
Author Bio:
Andy Beal is Vice President of Search Marketing for WebSourced, Inc and KeywordRanking.com, global leaders in professional search engine marketing. Highly respected as a source of search engine marketing advice, Andy has had articles published around the world and is a repeat speaker at Danny Sullivan’s Search Engine Strategies conferences. Clients include Real.com, Alaska Air, Peopleclick, Monica Lewinsky and NBC. You can reach Andy at andy@keywordranking.com and view his daily SEO blog at www.searchenginelowdown.com.
Search Engine Marketing
Search engine marketing success comes from good research. By researching your competition and target audience, your optimization efforts will succeed. Remember how homework at school often required some research to complete? It is much the same with search engine marketing: you need to apply yourself and research in order to understand your competition and target audience. Your visitors need to relate to you, understand your message, and know what you want them to do.
Know Your Industry
You must spend time understanding the industry you are trying to make a profit from. If you sell widgets, know everything you can about widgets: where they come from, how they are typically sold, why people need them, etc. You can learn a great deal from your competitors. Research using industry communications such as online magazines, forums, newsletters and blogs. Read articles written by industry leaders and keep up on the latest news about your industry. The more you know, the better your basis for building an authoritative website.
What’s Your Point?
Ok, so you’ve got this product you want to sell online. What are you saying to your audience? Make sure you write in a clear, concise way so your message does not get lost in the process. Just because everyone in your company understands your message doesn’t mean the visitors to your site will. Have a few test scenarios set up: ask a few objective readers what they understand from your website and see just how much of your message is getting across. You’d be surprised that what may be obvious to you is not necessarily obvious to your website visitors. If you are having trouble creating a clear message for your website, consider hiring a copywriter to convey what you want to say.
Know Your Target Audience
Who buys your product and why? Who needs the information you have on your website? Who would you like to have visit your website that isn’t already there? Who is visiting you? Are they professionals who understand your technical terms, or visitors of varying levels who all need the same information from you? What do you offer that a certain market would want from you? Take the time to get a good look at what is out there and how your competitors are presenting their information to online visitors. Use your log statistics reports to track who comes to your website. Get familiar with the keyword terms they use in the search engines to find you. Research the domains that visit you most often. Find out why visitors click away from certain pages before going deeper into your website’s content. Is it a lack of information? Too many choices to click on? Is the language used, or are the instructions given, easy to understand? Don’t give your visitors a reason to leave before they understand your message.
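One way to get familiar with those keyword terms is to pull them straight out of your referrer logs. A minimal Python sketch, assuming referrer URLs in the style the engines used at the time; the sample strings are hypothetical:

# Extract search terms from referrer URLs: Google passed the query as
# "q", Yahoo as "p".
from urllib.parse import urlparse, parse_qs

referrers = [
    "http://www.google.com/search?q=blue+widgets&hl=en",
    "http://search.yahoo.com/search?p=widget+reviews",
]

for ref in referrers:
    params = parse_qs(urlparse(ref).query)
    for term in params.get("q", []) + params.get("p", []):
        print(term)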
Know Your Competition
Take a good look at your top competition and see what they are offering online. Even looking at websites that are not direct competitors may give you an idea of what to offer your visitors. Think of it this way: someone put effort into creating those websites. Visit your competitors’ websites. Search the major search engines for your most important keyword phrases and see if your competitors show up in the top thirty results. Learn what you can and see if what they are doing is something you should be doing. It’s always good to know if your competitors are using SEO, paid inclusion, PPC, link building and other means to rank well.
Make Your Website Accessible
There’s nothing worse than muddling through a website looking for what you want and clicking so much that you finally give up. Use easy navigation and make sure your information or products are easily accessible to your visitors. Write text that is easily understood in order to get your message across readily. Give your visitors plenty of written information. There’s no such thing as “too much text” when it comes to search engine robots “understanding” your web pages. What’s good for the visitor is often good for the search engine robots.
Do The Work
Research is the cornerstone to your success. The more you know about your subject, the better you will be able to inform your visitors. By informing your visitors you build trust and interest in learning more about your website. Do the math – get searching!
Author Bio:
Daria Goetsch is the founder and Search Engine Marketing Consultant for Search Innovation Marketing, a Search Engine Promotion company serving small businesses. Besides running her own company, Daria is an associate of WebMama.com, an Internet web marketing strategies company. She has specialized in search engine optimization since 1998, including three years as the Search Engine Specialist for O’Reilly & Associates, a technical book publishing company.
Microsoft’s Search Engine Discussion
As a follow-up to our look at what Microsoft’s new search tool could look like, Rob Sullivan, our Head of Organic Search, and I locked ourselves in an office and tried to tackle some big questions about what will happen when Microsoft enters the search industry. We suspect these questions have been on a few people’s minds.
Q: Given what Microsoft is working on for Search, what do you see Microsoft doing between now and the release of Longhorn?
Rob: Version 1.0 of search will be out by the end of the year; Bill Gates has already stated this. It will look much the same as everybody else’s, however; nothing too different or radical. They will be playing catch-up for the time being, offering similar features to Google, Yahoo, etc., and trying to get MSN competitive with other portals.
(Note: as a follow-up to this, you can see a beta MSN search portal at beta.search.msn.com. Hmmm... notice any similarities in the results to Google’s layout?)
Their deal with Overture lasts until 2005, so we won’t see too much change there. After their current contract expires, they will likely go with yearly renewals only: nothing too long term, and with the option to bail. This is for when they are ready to launch their own PPC.
By next year, I wouldn’t be surprised to see Microsoft buying some PPC outlet, like a LookSmart or Enhance.
Gord: I think Rob’s right. Microsoft has to start building a name for their own search product, so they’ll introduce it on the MSN portal. It will be introduced with a lot of hype, but little in the way of functional advantages. Ideally, Microsoft will be able to add one or two features that would give it a distinct user advantage over Google and Yahoo.
Another important thing to remember is that Microsoft is probably trying to recoup their investment in search as soon as possible. I don’t think they are prepared to sink hundreds of millions of dollars in a search black hole without a return for years. They’ve played this game in the past to capture market share and I don’t think they want to go there again. That’s why you’ll see them hang onto an agreement with Overture for the foreseeable future.
Q: Why Buy a PPC Outlet?
Rob: Buying a PPC provider is quicker than building one. The current PPC suppliers already have developed the back end systems, such as the database, reporting features, and so on. Also, in some cases (particularly Looksmart) the price is right. At least by buying a PPC supplier they can quickly monetize the purchase. After all, look at how much money Yahoo made this quarter alone, the first full quarter since the Overture purchase, because of the PPC supplier.
Gord: We have to remember that Microsoft will be throwing a lot of resources at developing their own search technologies. I agree with Rob. It makes more sense to buy an existing PPC provider and get access to the technology. The one caveat here is the Overture/Yahoo portfolio of patents, which currently has some PPC portals paying them license fees. Microsoft will be looking to steer around this. And this brings us back to Google. Google AdWords uses a system sufficiently different from Overture that it doesn’t infringe on their patent. Could this be one of the reasons Microsoft was originally looking at Google?
Bottom line: Microsoft will want their own paid search channel as soon as possible, but will be looking for short cuts to get there.
Q: Why won’t Microsoft hold off on unveiling their search until Longhorn?
Rob: The problem will be winning people over. Microsoft’s results are not the best right now, and MSN search doesn’t have a very good reputation. It has traditionally been confusing for most people, what with sponsored listings, organic listings, web directory listings, featured sites and so on. They’re already changing their default search to something with more reliable results, but Microsoft will have to overcome this perception of bad quality to convince people to use its search. Also, as they roll out new search features, they will continue to change their results pages, displaying more or fewer ads and more or fewer free listings, to find the right balance in monetizing the paid listings. And during the process, they have to build their market share.
One way they could attempt to win people over would be a side-by-side comparison of results with other sources. If they could prove to more advanced web users (the early adopters, so to speak) that their results are at least comparable, and more than likely superior, to other providers’, they could start winning people over. They will have to do this before Longhorn comes out. Microsoft has to convince people that MSN search provides quality. If they can’t win people over with MSN search, they won’t likely get them to use Longhorn’s search capabilities.
Gord: Microsoft has never built a search brand. Right now, Google owns that outright. Even if Longhorn ships with its default set to Microsoft search, people have to have sufficient reason not to change it. It remains to be seen what the antitrust overlords will do to prevent Microsoft from monopolizing search (as they did with browsers), but we can be sure that the user will have to at least have the appearance of a choice.
Microsoft simply can’t allow Google to continue to own the search business outright for the next few years. They have to start establishing their own brand identity. For the next two years, the search fight will be fought on two fronts: through the portals (Google vs. MSN vs. Yahoo) and through toolbars and desktop search apps. Microsoft doesn’t have to win these outright, but they have to put up a good fight to build a position as a contender. If they can do that, they can eventually crush the competition through OS integration with Longhorn.
Q: Of all the functionality we looked at in the last NetProfit, how much will ship with Longhorn?
Rob: The WinFS part is a given; that has to happen. They need to incorporate the XML features of the new file system with its ability to easily perform local searching. It will also give people the ability to see what MSN Search could become in the coming years, i.e. being able to search for a song by genre, performer, songwriter, and so on. Once people learn the power of the WinFS search feature, they will likely come to rely on it for web search. This is key to Microsoft search: if people don’t buy into the power of WinFS search, they won’t buy into Microsoft search.
They will have the Avalon desktop because of the cool factor. Microsoft hasn’t changed the desktop since Windows 95. While they’ve added a ton of great features since then, they haven’t improved on the desktop. Having a desktop with the ability to render in 3D definitely contributes to the cool factor.
In addition, I see the subscription portion of MSN being more tightly integrated into the new OS. The ability to stream audio and video to subscribers will be a huge money maker for Microsoft. The subscription service will offer more features than the regular MSN portal.
Gord: In addition to what Rob has mentioned, I believe Microsoft will also introduce an iterative search interface, that will allow the user to tweak and filter their search and have the results update in real time. I believe this functionality is in complete alignment with how people actually search and will be a big factor in winning over users.
Microsofts Search Engine Discussion cont’d
Q: How about Implicit Query?
Rob: Not right off the bat. That will likely be wrapped into further enhancements. Also, the ability to anticipate and aid in queries means that the computer has to know more about the user. Therefore it will have to monitor the user both locally and on the web to see how they search, what they search for, which results they picked and so on. Until the computer can understand the user, Implicit Query won’t work.
Gord: I’m not so sure. Microsoft is testing Implicit Query pretty aggressively now, and I wouldn’t be at all surprised to see some version of it included in the first release of Longhorn. That, and the fact that the marketing possibilities of Implicit Query are staggering. I’m sure Microsoft is fully aware of that.
Q: How much of a selling feature will Search be in the Longhorn release?
Rob: It won’t be the major feature; Microsoft will be pushing the desktop features and functionality more than the web search feature, because of convenience. The ability to get information from formats such as sound and graphic files will be appealing to users. I think that initially many people may even change their default search preferences to a site like Google, especially if MSN search doesn’t perform. This is why they HAVE to get search right early on.
Gord: Right now, all Microsoft’s talk about the search functionality they’re developing is about its application on the desktop, not on the web. This leads me to think that they want to nail it as a localized search tool first, and then extend it to the web. For this reason, I think there will be a lot of marketing about ease of use and the ability to work more intuitively with your files without having to become a librarian. The web search implications will be rolled out later, as Microsoft moves their search focus online.
Q: What’s Google going to do?
Rob: Google has to get into the portal business. They need a built in traffic stream that’s looking for other features over and above search. That’s why they’re having an IPO – to raise money. Think about this for a second. You already have a hugely popular, highly profitable web property. Estimates are that it takes in a billion dollars a year in revenue with anywhere between $250 and $500 million in profit. Why would you IPO? Because you need to raise money. And with that money, they’re going to buy AOL or Netscape.
Why buy AOL or Netscape? Well, either of these properties would give them a dedicated customer base, a portal, the ODP, a chat program (ICQ), movie listings, mapping capabilities and so on. It puts them on a somewhat equal footing with the MSN and Yahoo offerings. Not to mention that Time Warner has had nothing but headaches with the AOL branch since the two companies merged.
Gord: Google has to make a bold move forward in search functionality. They came out so far ahead of any search engine in 1998 that once the competition caught on, it took them 5 years to catch up. Now, that competition has caught up, and Google hasn’t been able to raise the bar that significantly since.
Google has to jump ahead of the pack again, and based on past experience, that jump on functionality will be squarely focused on providing web users with a better search experience. While I like Rob’s portal theory, I think such a move would split Google’s focus in so many areas that they’d lose sight of their core competency, which is search. In my mind, Google only has one slim chance to win the upcoming war with Microsoft, and that’s by continually focusing on providing the best search tool on the web.
Q: What impact will Yahoo dropping Google have?
Rob: Google will lose a substantial chunk of traffic, obviously, but it won’t have much of an impact on the quality of Yahoo’s search results. Yahoo will still have a third of web users after the switch. They can replace Google with Inktomi, or a mix of AltaVista, FAST and Inktomi. Yahoo will build features and test them on AltaVista; when they’re satisfied that the features do what they want, and are useful, they will implement them on Yahoo. The average Yahoo user won’t likely notice much of a difference from the day before they dropped Google to the day after.
As far as search functionality goes, they’re refining their semantic processing. They have had a year since they bought Inktomi, and Overture has had AlltheWeb and AltaVista for six months. It isn’t inconceivable that they have a huge amount of research and development going on to make a search product capable of replacing Google. (By the way, semantic search will also be a feature of MSN search, so it’s safe to assume that Yahoo will be developing it as well.)
Gord: The loss of Yahoo has been looming for ages, and I would hope Google has a Plan B, as well as a C, D and E. Really, the Yahoo arrangement has been a bonus for Google from the beginning. The fact is, Google still owns a huge chunk of search traffic outright, and they have to concentrate on this before anything else. If we’ve learned one thing in the past few years, it’s that you can’t depend on partnership arrangements for your success.
Google will be finding ways to build search traffic directly to Google. The launch of the Google Toolbar was the first move in this direction, and a brilliant one, in my opinion. I think toolbars and other desktop search apps will be the next hot battleground of the search industry.
Rob: The down point to that is people have to agree to download from Google. With Microsoft, it will all be built in. Again, a huge competitive advantage to Microsoft.
The thing is, if Google does do something revolutionary, and offers it as a free download, all Microsoft has to do is build it and release it as a patch or service pack to get it implemented.
Gord: And that’s why I think Microsoft can’t be beat in the long run.
Rob: I think Google can get back on top, and the application of the semantics db is just the first step. They have to get that working first. I think they are close and getting closer all the time, but they still have some tweaking to do. Once they do get it fixed, they have to get in front and then stay in front, so the competition is always aiming at a moving target. Even then, I don’t think superior search is enough to keep them in front. They have to offer more.
Google will keep trying to introduce ways to make search more intuitive and useful. For example, the slide show search at Google labs is kind of cool, if they find a way to make it more applicable to people.
Q: What about Yahoo?
Rob: Yahoo is in the most unique place right now. They have nothing to prove and a solid customer base. Anything they do is an improvement, so they have nowhere to go but up.
I think they’ll have to go into semantic search, like Google and MSN. The first roll-out of a pure Yahoo search will be vanilla organic search, but they’ll be changing that as time goes on. By this fall they’re going to want to have something out before MSN. MSN is the key in Yahoo’s formula: Yahoo will want to be ahead of MSN, and Google wants to be ahead of everybody else.
Gord: I wouldn’t want to be Yahoo right now. I can’t see how they’ll win in the long run. The one area that’s interesting is the Yahoo shopping search engine. Perhaps Yahoo will eventually have to concede the general search contest, and become a niche player providing specialized search functionality. But they’re not going to go quietly. It’s going to be a huge battle in the coming few years.
Microsoft’s Search Engine Discussion – More Q&As
Q: Does Yahoo offer anything unique or superior?
Rob: They’re just relying on their brand. They really don’t have any features that set them apart. That’s not to say that they won’t develop these products, but I think with Overture making so much money for them, at least in the short term, they don’t need to innovate to stay in the game. Once they realize that they are getting left behind (or at least are simply status quo), they will invest more into organic search R&D. Whether it will be too little, too late is anyone’s guess at that point.
Gord: Overture may provide another key to survival for Yahoo. In addition to the revenue Rob mentioned, Overture has always been ahead of Google in providing better management and reporting tools to PPC marketers. I think it would be a smart move on Yahoo’s part to build this advantage aggressively over the next two years. We know search marketing is a hot market, and if Overture can build some loyalty, it will give them some high ground to defend their position from.
Q: Does Yahoo have a chance?
Rob: Microsoft’s marketing power is huge, as is their budget. Yahoo is also a big company, but they can’t compete head to head with Microsoft. MSN is going to crush Google and then aim for Yahoo. They’ll be competing portal to portal. MSN has to beat Google at search first. Once that does happen (and I do believe it will, provided they can get that crucial version 1 search right), MSN will set its sights squarely on Yahoo.
Gord: As I mentioned before, I don’t think Yahoo can win the search battle head to head with Microsoft or Google. They’re going to have to rely on their strengths in other areas to survive.
Q: Does Google have a chance?
Rob: Not really, and that’s why Google needs a portal. They can’t compete on search alone. If it’s not a portal, then Google has to offer other unique features.
Gord: The only way you can compete on search is to be best of breed. And Google isn’t the clear leader in search any more. To be honest, Google impressed the hell out of me in the beginning, but I don’t think they’ve done anything noteworthy in a long time. I think they’re showing the stress fractures inevitable in a young company that suddenly has to compete in the big leagues. I don’t think they’re up to taking on Microsoft head to head. I can’t think of many companies that could.
Rob: But if Google is building in semantics and a constantly improving relevancy feedback mechanism, they should continue to improve. After all, they are already collecting data on the number of abandoned searches, the number of returned results, the average clicked on position and so on. It shouldn’t be too difficult to integrate this data into the ranking algorithms to make them better (if they haven’t done so already). Remember that if this is Applied Semantics technology being applied by Google, then the software is supposed to ‘learn’ based on what results were picked or not picked. It is supposed to be able to refine results for the next time.
Q: Does anyone else have a chance?
Rob: They’ll become niche players, fighting over the scraps that are left. And there will always be an element of the population who are anti-big business. Many people adopt Linux because they are anti-Microsoft; I think you will see similar sentiments towards the search engines if they become too commercial. Already we are seeing signs of open source search engines, and other smaller engines trying to compete.
Gord: It may seem ironic, but the biggest potential losers in this will be the ones that go head to head with Microsoft, namely Yahoo and Google. The smaller players will probably benefit. The profile of search will grow, along with the profitability. The small players will be able to move quicker to identify niches and capitalize on them. And they’ll probably be able to strike new partnerships, providing specialized services to the big three in the upcoming battle. As niche players, I think the future looks good for some of these services.
Q: When Microsoft enters search, will the industry grow?
Rob: Yes, search volumes will continue to grow, so we should see search continue to grow. After all, markets outside of North America and Europe are growing faster than anywhere else. In addition, broadband usage is growing, hardware prices continue to come down, and more people are getting hooked up to the ’Net. There will be a point where Internet growth tapers off and search plateaus, but I think that is many years away.
Gord: I think we’ll see incremental growth in search as a whole, with a possible jump coming with OS integration. But I see exponential growth in the commercialization of search, and therefore the revenue generated. Implicit Query and other ‘push’ search channels will change marketing all over again. Search, in its eventual evolved and converged form, will be one of the biggest marketing channels, bar none, in 5 to 6 years.
Q: What will Microsoft do on the paid search side with the release of Longhorn?
Rob: I think this is where Implicit Query kicks in, and the sponsored results will be shown first. Consider this: how much would one ad across the toolbar in an application be worth to an advertiser? That advertiser essentially has a captive audience. We’ve talked about this before: the application “watching” what you are doing and offering help, by way of sponsored (or other search) listings appearing conveniently in the application you are using. Another resource for listings could be the desktop sidebar (another of Longhorn’s new features). It is also built on XML, so it should be flexible enough to display “best picks” listings, whether paid or organic. Combine this with Longhorn’s ability to learn from you and refine its ability to provide what you want, and you have a powerful new advertising medium.
Gord: It’s a different pricing model and a different delivery technique. It would be very easy for Microsoft to serve up as much advertising as they want, but they have to know where they start irritating people. It will be a whole new paradigm, and it remains to be seen how people respond to it.
That said, Implicit Query changes all the rules for advertising. It introduces huge considerations for privacy and online intrusiveness. We’re getting to the online equivalent of the personalized ad message that we saw in the movie Minority Report.
Rob: But Implicit Query and the results returned don’t have to be that obvious. Microsoft has already experimented with inconspicuous helpers. Remember Smart Tags? They are still used by Microsoft applications; they appear as a little box with an “i” in it. There could come a time when Longhorn recognizes a phrase and associates it with a smart tag linked to a search result, which provides more information via organic results or paid search listings. This type of system opens the door to many kinds of web marketing we haven’t considered before.
Conclusion
Microsoft, Yahoo and Google will take search in totally new directions in the next few years. That means search marketing will also change dramatically. The medium will become more powerful than ever, probably prompting new investigations and concerns by consumer groups, and new legislation by government. I hope the questions we posed and tried to answer help clear up a very fuzzy future just a little bit. One thing is for sure. The way you search today will be significantly different from the way you search in 2006.
Author: Gord Hotchkiss
The Hilltop Algorithm
Few events on the Internet have stirred as much controversy and confusion as Google’s most recent change, nicknamed Florida – an update that knocked thousands of companies out of their top positions in search engine results.
There has been no shortage of accusations about Google’s intentions. The most common charge leveled against the search engine giant is that it is trying to increase revenues by forcing companies to buy keywords.
In the midst of this firestorm, we decided to do a little sleuth work to find out what really happened. What we found was that an algorithm named Hilltop was responsible for shaking up the entire online community.
A bit of background first
Google uses PageRank and other technology to drive its search engine. Here is a summary of PageRank from http://www.google.com/corporate/tech.html:
PAGE RANK DEFINED: Condensed version
PageRank relies on the uniquely democratic nature of the web by using its vast link structure as an indicator of an individual page’s value. In essence, Google interprets a link from page A to page B as a vote, by page A, for page B. But, Google looks at more than the sheer volume of votes, or links a page receives; it also analyzes the page that casts the vote. Votes cast by pages that are themselves “important” weigh more heavily and help to make other pages “important.”
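To make the “votes” idea concrete, here is the textbook power-iteration form of PageRank as a short Python sketch: the published algorithm in miniature, not Google’s production code.

# Each page shares its current score across its outbound links, so a
# "vote" from an important page is worth more. d is the damping factor.
def pagerank(links, d=0.85, iters=50):
    pages = list(links)                      # links: {page: [linked pages]}
    n = len(pages)
    ranks = {p: 1.0 / n for p in pages}
    for _ in range(iters):
        new = {p: (1 - d) / n for p in pages}
        for p, outs in links.items():
            if outs:
                share = ranks[p] / len(outs)
                for q in outs:
                    new[q] = new.get(q, 0.0) + d * share
        ranks = new
    return ranks

web = {"A": ["B"], "B": ["A", "C"], "C": ["A"]}
print(pagerank(web))   # A collects the most votes and ranks highest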
Google knew that there were two problems with PageRank: it had difficulty filtering out spam sites and it did an inadequate job differentiating between sites that had relevant and irrelevant information.
Krishna Bharat at Compaq Systems Research Center and George A. Mihaila, a professor of Computer Science at the University of Toronto, came up with a new algorithm to fix the problem: Hilltop.
A VERY simplified explanation of how Hilltop works
• Hilltop counts the number of meaningful (related to the topic) hyperlinks coming into a content-rich Web site.
• Web sites with numerous meaningful links and volumes of pages with relevant information are called “authority sites”.
• Authority sites enjoy higher rankings on the assumption that they are of more value.
• Web sites with few hyperlinks, unrelated links, or MLM and affiliate-program links used to inflate PageRank are demoted.
Here is what the Hilltop Algorithm looks like:
Old Google Ranking Formula = {(1-d)+a (RS)} * {(1-e)+b (PR * fb)}
New Google Ranking Formula = {(1-d)+a (RS)} * {(1-e)+b (PR * fb)} * {(1-f)+c (LS)}
Google quickly found that the Hilltop algorithm also had flaws. So Google created a two-step search process that combines PageRank technology and the Hilltop algorithm.
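As a toy sketch of such a two-step combination (the weighting and data below are invented; Google’s actual formula is not public), one might boost a PageRank score by a count of on-topic inbound links, which is the “authority” notion at the heart of Hilltop:

# score = PageRank * (1 + number of inbound links from same-topic pages)
pageranks = {"A": 0.5, "B": 0.3, "C": 0.2}               # invented values
inbound = {"A": ["B"], "B": ["A", "C"], "C": []}
topics = {"A": "health", "B": "health", "C": "autos"}

def on_topic_links(page, query_topic):
    return sum(1 for src in inbound.get(page, [])
               if topics.get(src) == query_topic)

def score(page, query_topic="health"):
    return pageranks[page] * (1 + on_topic_links(page, query_topic))

for p in sorted(pageranks, key=score, reverse=True):
    print(p, round(score(p), 2))   # A 1.0, B 0.6, C 0.2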
So how do companies continue to enjoy top rankings on Google?
Getting top rankings on Google depends on meeting the criteria of both PageRank and Hilltop. Therefore, companies have to receive significant numbers of votes, or quality links, from authority sites (PageRank) and have meaningful, relevant hypertext links (Hilltop) pointing to their web pages. Just as it always has been, content is critical, and linking is now more important than ever.
Two conclusions can be drawn from our research:
• Google is not trying to force companies to buy keywords; it is simply trying to improve its algorithm.
• The search engine optimization methodology and strategies used to achieve top placements just became significantly more complicated with the addition of Hilltop.
With Google handling 75 to 85 percent of all search requests, companies doing business on the Web must retain top rankings in the search engines in order to survive. And that means making sure that your firm satisfies the criteria of the two algorithms.
Author Bio:
Search Engine Optimization Inc. specializes in getting companies top rankings on all search engines through organic searches. Contact us at 1-877-736-0006 to learn how we can help your company enjoy the benefits that come from top placements.
Google’s Austin Update
Some of the websites that weren’t hit too hard in Google’s Florida update (November 2003) got hit very hard on or around January 23. Google’s latest update is called “Austin”, and the names are beginning to sound like elections.
Depending on the industry you happen to be in, you could have been hit less, or harder; it depends on a whole number of factors, and no two situations are usually the same. One of our clients, whose site until recently was optimized by another SEO firm, was completely devastated to realize his site was gone from the face of the earth. Things such as FFA (Free For All) link farms, invisible text (!) and stuffed meta tags that are 20 screens long, filled with useless spam, will have your site penalized, or even banned, faster than you can blink an eye.
Google’s new search algorithm is getting “smarter” and better. Part of this is due to the Hilltop algorithm, which I wrote about at the beginning of January. In combination with the PageRank algorithm that Google has been using since 1998, these two algorithms work in tandem in what is supposed to produce better, more relevant search results. That statement remains to be tested, as in some cases I have found that the results produced are not always relevant. But Google is continuously tweaking and fine-tuning its technology to make it better, and the next few weeks should see more improvements.
Google’s new ‘weapons of mass destruction’
What is really clear at this point is that Google has a unique way of finding sites that attempt to use spam techniques, or methods forbidden by most major search engines, such as cloaking. In fact, to rectify this situation, on January 25 Google changed the IP addresses of most of its data centers all over the world.
What is really peculiar about this update is that the majority of the sites we have been optimizing for the past 12 months have actually gone up in their rankings. Still, we are hearing and reading reports in the industry that some websites (in no particular industry) have seen their PR (PageRank) drop one point for no apparent reason. Sites that for the past two years had a PR of 6 dropped to 5, with no real change in the number of inbound links pointing to them.
This indicates that Google has changed the formula used to compute the PR value a site should have relative to its competing sites, including how the links it counts are weighed. It is also believed that Google now looks at the “theme” of a site and at the relevancy of its links to better calculate PR. Sites receiving links from sites in the same industry will benefit more than before, while those that get the majority of their links from sites that have nothing to do with them could see their PageRank value go down, as some did on January 23rd.
It is clear now that Google is making some significant improvements and some major changes to its search algorithm, again! Get used to this, as my feeling is that it will continue for the next several months. It is my belief that sites that haven’t yet suffered any significant loss in rankings since November could very well fall into the “victims category” if they happen to use any of the forbidden techniques described earlier.
Spring clean-up time is here already
For almost as long as I can remember, most major search engines have frowned upon sites that use spam techniques or “tricks” that are not recommended, in an attempt to get away with it and possibly rank higher. If you suspect your site falls into this category, the time has come to “clean up” and to start producing good, relevant content. From the beginning, sites that have produced great, superior content have always benefited, and these quality sites are usually not affected by any major update.
Content that is fresh and regularly updated always helps in getting better rankings. On the Web, content is always king. It’s not just the quantity but also the quality of the content that is important. Additionally, Google’s Hilltop algorithm has a tendency to detect sites that are authoritative in their field and will usually rank them higher, along with a higher PR value. Hilltop looks at the topic or theme of a site and at who links to it, the value of those links, and the anchor text in and around those links.
Remove from your site any meta tags that are irrelevant and replace them with ones that truly relate to the products and services you want to sell, or whatever it is you are trying to promote on your site. Also, look at your title tags: are they really indicative of what each page is about?
Look at the body of your text. Is it carefully written and does it ‘flow well’ to any ‘human visitor’ that tries to read it or does it sound like you are repeating the same keywords over and over dozens of times?
Build and write your website the best you can, using only approved techniques recommended in the terms of use of Google and the others and your site should never ‘fall off a cliff’ as some have done in the past two months.
Conclusion
For the SEO community, the next two to three months should be interesting, both as close observers (which most of us are) and from a ranking standpoint. It is now highly expected that Google will go ahead with its long-expected IPO. Some think this is the reason it is making such major changes to its search technology. Additionally, more speculation and “conspiracy theories” have been rampant in many SEO forums which, for the most part, are unfounded in my view.
Whatever the reasons for these important changes, one thing is clear: Google is working hard to improve the quality of its search results, and this can only be achieved by trying new algorithms, tweaking existing technology, changing IP addresses in data centers, and so on.
Don’t unfasten your seat belts just now – the ‘Hurricane’ may not be over yet!
Author:
Serge Thibodeau of Rank For Sales
Time To Think Summer
Today, I’m sitting at my desk looking out the window at the snow. We have a blizzard warning up as I type this Marketing Monitor. We are to expect between 8 and 10 cm (about 3 inches) on top of the same amount of snow we’ve received daily for the past 3 or 4 days.
On top of all that, the temperature is below zero and has been for longer than normal. Yet despite the cold, I’m thinking summer.
Not because I hate the snow. In fact, I kinda like it. No, I’m thinking summer because some of our clients’ busiest seasons online are in the summer. So I thought I’d give some pointers here.
Remember that even if it”s snowing and cold where you are, summer is just a few short months away. Now is the time to start planning and budgeting for your summer search marketing campaign. Whether you are going to be running a PPC campaign, or strictly organic SEO, you have to begin preparing.
If you are a seasonal business and rely on summer sales, you should start researching key phrases for the summer. Consider what you sell, and what your competitors sell.
Be sure to review your visitor logs from last year to see what brought people to your site. Of course, with the many changes in search over the past few months, not all of those phrases will still be relevant, but some may help you come up with ideas for additional phrases you can market.
Assess your current website’s condition. Does it look dated? Perhaps it needs a facelift. A simple redesign can give a site a fresh look and perhaps drive more traffic. Sometimes a good indicator of what your site needs is your competitors’ sites: those who aggressively market on the web have developed a formula for maximizing return on their web investment, so take some pointers from them.
If you do consider a redesign, please consider the implications. We wrote an article quite a few years ago on this subject, but the same rules still apply (you can read this article here). If you are working with an SEO company, above all consult with them before you do anything; they can help you minimize the impact of your new site. If you determine that a redesign is in order, now is a good time to do it – particularly if your key business occurs in the summer. If it’s done properly, the impact can be minimal and you will likely recover by spring.
Consider the current web environment. Again, many techniques and tactics used last year don’t work on today’s engines. Therefore you may want to revisit your optimization.
If you perform these simple tasks now, in preparation for your summer season, you can help your site reach its potential without negatively impacting your online visibility.
Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists
SEO vs PPC
Back in 1997, when I started getting web sites to the top of the search engines, it wasn’t even called “Search Engine Optimization”. In fact, there wasn’t a name for what I did, much less a multi-billion dollar industry. I realized back then that search engines were the only place to find what you were looking for on the web. They were a phone book of sorts, with about 4 billion listings that you could sort through in less than a second with the push of a search button.
Now the search engine game is very different, yet very much the same. The media bombards you with pay per click, sponsored listings, featured listings and CPA offers, and you don’t hear very much about natural search engine optimization anymore. You might not know there even is such a thing, because most companies that perform search engine optimization are small and can’t compete with the large advertising budgets of major search engines like Google and Overture. But the deeper you dig, the more you realize the algorithms haven’t changed that much, and getting to the top can be made simple. This leads us to two questions.
1. Does search engine optimization work anymore?
2. What kind of traffic can I expect to see from both methods?
We’ll start with the first question.
Does search engine optimization work anymore?
The answer is a resounding ABSOLUTELY! Natural search engine placement and optimization is not dead at all. In fact, the industry is growing at a tremendous rate. The competition for arguably the most competitive phrase on Google, “search engine optimization”, has drastically increased. As of the writing of this article there are over 1,620,000 results when you type in search engine optimization, compared to only 560,000 two months ago.
Okay you say, that makes sense but give me more proof.
GlobalPromoter.com does no advertising other than natural search engine optimization and natural search engine placement in the major search engines. We spend $0 on pay per click campaigns or any other method of advertising, yet our traffic rivals that of our competitors and we average an incredible number of account signups on a daily basis. Our visitor-to-purchaser ratio is upwards of 4.5%, which is incredible in any industry. Why do we have such a high purchase-per-click ratio? Because people are looking for us; we’re not looking for them. When a user goes to Google, types in “search engine optimization tools” and finds us on the first page, they know we know what we’re doing, and they are compelled to click if only out of awe.
The secret of the search engines is being found, not the other way around. That’s why the natural search engine listings in Google outperform the AdWords listings. Users know that sites listed in the Sponsored Matches section, or on the right side of the results, mean a business or individual is paying every time someone clicks through to their site. That equates to advertising, which is no different from radio, television, newspapers or magazines: a company “pushing” its product onto the consumer.
But when a user finds a site in the web matches section, they have more confidence. That site didn’t pay to get there; it is there because Google’s or Yahoo’s or AOL’s algorithm decided it was the most appropriate for the search, based on the entire site’s content. This is “pull” demand, meaning the user is looking for us instead of us looking for them. If you can get on the pull side of advertising, you’ll experience much higher purchase-per-click rates on every visitor to your web site.
On to question 2.
What kind of traffic can I expect to see from both methods?
This question needs to be answered in two parts. First let’s look at the ppc method.
PPC search engine listings will give you as much traffic as there is demand for a given keyword or keyword phrase. If there are 500,000 searches a month and your listing is appealing, you can expect to receive approximately 2–5% of those searches. Let’s say you get an incredibly high click-through rate of 5%. That means you have 0.05 * 500,000 = 25,000 potential visitors. But if a keyword gets 500,000 searches in a month, it’s fairly competitive, and it could easily cost $1.00 per click to be in the top 3 positions for that keyword. So if you are paying $1.00 per visitor and you had 25,000 visitors, you paid $25,000 for the traffic that one keyword generated for your site.
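To make that arithmetic concrete, here is a minimal sketch in Python. The search volume, click-through rate and cost per click are the illustrative figures from the paragraph above, not data from any real campaign:

# Rough monthly cost estimate for a single PPC keyword,
# using the illustrative figures from the article.
monthly_searches = 500_000   # searches per month for the keyword
click_through_rate = 0.05    # an optimistic 5% CTR on your listing
cost_per_click = 1.00        # dollars per click for a top-3 position

visitors = monthly_searches * click_through_rate   # 25,000 visitors
monthly_cost = visitors * cost_per_click           # $25,000

print(f"Visitors per month: {visitors:,.0f}")
print(f"Monthly PPC spend:  ${monthly_cost:,.2f}")

Change any one of the three inputs and the spend scales linearly, which is exactly why competitive keywords get expensive so quickly.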
I think you can see how risky and expensive PPC can be. Unless you know you can convert visitors into sales and your profit margin on the items you’re selling is incredibly high, then caveat emptor (buyer beware).
On the flip side, when your site shows up in the natural rankings you don’t pay a single cent for any of the traffic it generates. This means you have more money for developing your site, tweaking marketing tactics, making your product better, etc…
As for the old argument that you won’t get as much traffic from natural placements as from PPC listings, that’s a myth. Several of our customers receive over 50,000 visitors a month on average from natural placements in the major search engines. In fact, when we optimize a client’s web site, one of their goals is to decrease the amount of money they are currently spending on PPC advertising. After the completion of the optimization plan, 75% of our clients completely abandon their PPC programs. This leads us to a general comparison of PPC vs. natural rankings.
Advantages of Search Engine Optimization
1. Up-front fixed cost vs. fluctuating costs that can skyrocket with PPC advertising.
2. Long-term listings and rankings with natural placement vs. showing up only as long as your bank account has money.
3. Natural rankings have higher click-through ratios than PPC listings, because natural rankings are pull demand vs. push demand.
In Conclusion
I’m going to close this article with an analogy. Most of you have been camping before and remember at least one cold night when you couldn’t get a fire started. So you went and got some lighter fluid and squirted it on the dry oak or whatever wood you used. Then you threw in a match and began to warm up next to the fire.
The lighter fluid is akin to PPC advertising: when the lighter fluid is squirted on the fire, the flames shoot high and bright and then vanish. Just like PPC advertising, it’s short term, because as soon as the money is gone, so is your exposure. Natural rankings, by contrast, are like the solid oak used in the fire. The oak will burn for hours and hours and keep you warm much longer than lighter fluid alone. Like the oak, natural search engine optimization campaigns last in excess of 6 months instead of one day. And if you learn the secrets of good web site optimization, you can stoke the fire and make it last even longer at no added cost. Of course, it takes a little longer to get the oak branches to light, but once you get them going they will last for a long time.
Many people see pay per click as the solution to their search engine marketing campaign because it is easily set up and effective almost immediately. However, those who understand the principle of laying a solid foundation and building upon it will appreciate the long-term benefits of natural search engine placement. It may take longer to get the same results, but it will cost much less in the long run.
Author Bio:
Jason Dowdell is the founder and CEO of http://www.GlobalPromoter.com, a search engine optimization and marketing firm specializing in educating and empowering customer websites. Jason is also the founder of TurboPromoter.com, a web-based seo/sem project management suite comprised of professional seo tools, in-depth tutorials and an integrated help system.
Site Rankings Gone
As you may already know, a good part of my job is researching how the organic search engines work. Trying to figure out how the algorithms work in ranking pages is crucial to our day to day operations. Occasionally, we come across sites which seem to defy explanation – they have proper optimization, good internal linking and so on, yet seem to be getting penalized by engines such as Google. Today, I’m going to explain how we began researching a particular problem, in hopes that if it happens to you, you will know what to do.
The first indication that there was a problem with the site was when the PageRank in the Google toolbar disappeared, seemingly overnight. This happened soon after a new URL was put up on an existing site. We assumed, as is usual, that Google hadn’t yet been able to associate the new URL with the “old” content – that is, Google was still expecting to see the old URL associated with the content. We advised the client that it would likely take a few weeks to re-associate the new URL with the site.
When sufficient time passed without progress, we had to dig deeper to see what the issues were. As I mentioned above, everything looked fine: optimization was in place without any over-optimization, internal linking was good, and there was good use of a properly constructed site map. So we went beyond on-the-page factors to see if we could figure out what else was causing the problem.
The first thing I looked for…
The first thing I looked for was the existence of a robots.txt file. In many cases, an improperly coded robots file will exclude some, or all, search engine spiders from indexing a site. In this case a robots.txt was not being used, so I ruled this out.
I then checked to see if there were robots meta tags in the head of the HTML. These tags do the same job as the robots.txt file – they tell the spiders which pages they can and cannot index – but on a page-by-page basis rather than site-wide. Again, an improperly coded robots meta tag can exclude part or all of a site from getting indexed, and again, this was not the case. Although this site does use a meta robots tag, it was coded properly; in fact, the same tag existed on the “old” site and wasn’t an issue then.
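If you want to run these same two checks on your own site, here is a rough sketch in Python. The site URL is a placeholder, and the meta tag test is a deliberately crude string match rather than a full HTML parse:

# Check the two robots exclusion mechanisms discussed above.
# The URL is a placeholder; substitute the site being diagnosed.
import urllib.request
from urllib.error import HTTPError, URLError

site = "http://www.example.com"

# 1. Is there a robots.txt, and what does it exclude?
try:
    with urllib.request.urlopen(site + "/robots.txt") as resp:
        print("robots.txt found:")
        print(resp.read().decode("utf-8", errors="replace"))
except (HTTPError, URLError):
    print("No robots.txt found, so nothing is excluded at the site level.")

# 2. Does the home page carry a robots meta tag?
with urllib.request.urlopen(site) as resp:
    html = resp.read().decode("utf-8", errors="replace").lower()
if 'name="robots"' in html:
    print("Robots meta tag present - check it for 'noindex' or 'nofollow'.")
else:
    print("No robots meta tag on this page.")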
So I then checked the log files to see if the spiders had been visiting the site, and they had been there on a regular basis – as recently as a few days ago, as a matter of fact.
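Verifying spider visits yourself is a matter of scanning your access logs for the engines’ user-agent strings. A minimal sketch, assuming a common Apache- or NGINX-style log file; the file name and the user-agent signatures are illustrative:

# Scan an access log for visits from search engine spiders.
# The log path and user-agent signatures are illustrative assumptions.
spider_signatures = ("Googlebot", "Slurp", "msnbot")

with open("access.log", encoding="utf-8", errors="replace") as log:
    for line in log:
        if any(sig in line for sig in spider_signatures):
            print(line.rstrip())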
Seeing that everything was coded properly and that spiders had been visiting the site piqued my interest. How is it that spiders are able to see the site (as indicated by their visits), yet the site is not showing up in the index and still has a PageRank of 0, months after the change?
Some more digging…
So I did some more digging. I checked Google for the old URL. Upon viewing the cached version of the old URL, a theory began to form.
The cached pages were actually the current content of the new site. In other words, Google was somehow associating the old URL with the new site. So I did some more checking. I did a whois lookup and found that the old URL was still registered. I then pinged it and found that it resolved to a new IP address, yet when I tried to connect to it using my web browser, it came up as a 404 (page not found) error.
I pinged the new site, and its IP address is different – but it is the IP address the site had when it used the “old” URL. This still doesn’t explain why the new site has no PageRank or indexed pages while the “old” URL is showing pages from the new site, but it does give me some clues.
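This kind of resolution mismatch is easy to check for yourself. A small sketch using Python’s standard socket module; both hostnames are placeholders for the “old” and “new” domains:

# Compare what the "old" and "new" hostnames resolve to.
import socket

old_host = "www.old-example.com"   # placeholder for the original domain
new_host = "www.new-example.com"   # placeholder for the replacement domain

old_ip = socket.gethostbyname(old_host)
new_ip = socket.gethostbyname(new_host)

print(old_host, "->", old_ip)
print(new_host, "->", new_ip)

if old_ip == new_ip:
    print("Both names resolve to the same server: an engine that caches")
    print("IP addresses could tie the new content to the old URL.")
else:
    print("The names resolve to different servers.")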
We already know that, to save time, most search engines do not perform a DNS query every time they visit a site. They tend to connect directly to the site via its IP address, and only if they can’t reach the site that way do they perform a DNS query to find its current IP.
In the case of this site, Google hasn’t needed to perform a DNS query because, from its point of view, the “old” site still exists. It can connect to the site via IP and is associating the “new” content with the “old” URL.
This also explains why the “new” site is showing a PageRank of 0 with no pages indexed. Google has also resolved the “new” site to the same IP, which it thinks belongs to the “old” URL. Once it visits the new site and realizes that the new and old sites are identical, it gives preference to the “old” site, because it pre-exists the new one.
Confused yet?
Let me put it in other terms. Since the “old” site had been around longer, it had built up a reputation on the web. When the client replaced the URL, they wiped out that reputation – but no one told Google that the old site was gone. Had Google performed a DNS query, it would have found that the old site had in fact moved; but since it found a site with the same content at the same IP, it assumed that this was the site with the reputation.
Along comes a new site with the exact same content and no reputation, and of course the first thing Google assumes is that the site owner is trying to spam the engine, so it penalizes the new site. Hence the lack of indexed pages and the PageRank value of 0.
To resolve this issue we will try a variety of things. First will be either a 301 redirect (approved by Google to help spiders understand that a site has moved) or another on-the-page redirect, such as a meta refresh or a hyperlink on the “old” URL. These efforts help signal to Google that the “old” site has been replaced by the “new” one.
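If you set up the 301, it is worth confirming that the server really answers with a 301 status rather than a 302 or a meta refresh, since engines treat these differently. A minimal sketch; the URL is a placeholder for the “old” domain:

# Verify that the old URL answers with a true 301 pointing at the new site.
import urllib.request
import urllib.error

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, req, fp, code, msg, headers, newurl):
        # Returning None makes urllib raise HTTPError instead of following.
        return None

opener = urllib.request.build_opener(NoRedirect)
try:
    opener.open("http://www.old-example.com/")
    print("No redirect returned - the 301 does not appear to be in place.")
except urllib.error.HTTPError as e:
    print("Status:", e.code)                        # expect 301
    print("Location:", e.headers.get("Location"))   # expect the new URL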
If this doesn’t work, our next step will be to request that the old site be removed from the index. This is a last resort, as we would rather the engine figure it out on its own; but if we find that Google still can’t recognize that there is a new site, we will definitely request the URL removal.
In addition, to try and help speed things along, we need to ensure that all other links, such as ODP directory listings, point to the correct URL and not the old domain. This will reinforce to the search engines that the “old” site no longer exists and that the “new” site is actually a valid site that isn’t spamming the engines.
Author Bio:
Rob Sullivan
Production Manager
Searchengineposition.com
Search Engine Positioning Specialists
The Future Of Search Engine Technology
By now you have probably read numerous articles predicting ‘What will happen in 2004’ or ‘Can MSN take on Google’. While it is always worthwhile to look ahead and consider what may happen this year in the search engine industry, what about the things that we can’t quite yet predict? Instead of looking at what will happen this year, perhaps we should look at what must happen in the search engine space if Google, Yahoo and MSN are truly to revolutionize search and enhance the user experience.
Overcoming The Lack Of Relevant Search Results
Even today, conducting a search on any of the major search engines can be classified as an ‘enter your query and hope for the best’ experience. Google’s ‘I’m Feeling Lucky’ button, while designed to take you directly to the number one result, could ironically describe its entire search process. Enter your desired search words into any of the search engines and you often end up crossing your fingers and hoping that they display the type of results you were looking for. Since the recent ‘Florida’ and ‘Austin’ updates, complaints that Google, in particular, is displaying less relevant results have escalated (although mostly from those who lost important positioning they had assumed was their right to maintain).
There is, of course, evidence that the search engines are trying to enhance their search results so that they can better anticipate the intentions of the searcher. Search for ‘pizza Chicago’ at Yahoo, and you’ll see that the top results include names, addresses, telephone numbers and even directions to pizza restaurants in Chicago – a great improvement on previous results. Even with everyone’s favorite example search term, ‘windows’, you can see that the search engines are at least trying to determine your intent. While Yahoo and Google still display search results saturated with links discussing Microsoft’s pervasive operating system, enter your search over at Ask Jeeves and the chirpy English butler will ask whether you meant ‘Microsoft Windows’ or ‘Windows made out of glass’.
Future Search Engine Technology
Smaller search engines have also materialized over the past few weeks, each offering to improve the user experience. Grokker offers an interface that groups search results graphically, improving the way results are segmented and displayed. Eurekster combines the social networking elements used by sites such as Friendster and provides results that can be filtered based upon what members of your group are searching for. While all of these are interesting and provide a glimpse of the future of search, it will not be the small companies that change the way we search. With Google about to get an influx of cash from its upcoming IPO, Yahoo revamping Inktomi and Overture, and Microsoft finally jumping into the search arena, it will be these search engine powerhouses that enhance our search experience and take search engine technology to the next level.
So what is this next level? What technology is it that I speak of, that will revolutionize the way we receive our search engine results? I believe that the search results we receive in just a couple of years from now could make current search engine technology look as archaic and cumbersome as picking up a Yellow Pages book is today. However, in order to achieve this new search nirvana we, as consumers, must quell our fears and trepidations surrounding the protection of our privacy. In order for the search engines to develop technology that will be intuitive and anticipate our every need, we must first relinquish at least some of the privacy that we currently hold so dear. Let’s take a look at some of the ways that search technology could improve and you’ll soon get the idea why it will require us to cooperate with the search engine providers.
‘Windows’ or ‘windows’?
If you want to be able to enter a term as ambiguous as ‘windows’ and still see relevant results, you’ll first need to give up some personal information to the search engines. Google, Yahoo, MSN and Ask already have the means to collect an astonishing amount of information from us through our use of their toolbars. Don’t panic: they currently allow you to disable this information gathering, and even when you allow it, the data is collected anonymously. However, with the technology already in place, why not unleash its full potential?
Let’s say I let Google track my online activities, allowing it to monitor the web sites I view and keep a log of all of the search queries I enter. This type of information could greatly improve the relevancy of the results displayed to me. For example, two years from now, I could search for ‘home improvement’ on Google. I then find the listing for Lowes.com and visit the site. While I am at their web site, I look at a number of different pages, but I spend a lot of time in the ‘house windows’ section, exploring the different styles and prices. Why not let Google capture all of that useful information? Then, when I go back to Google the following day and search for ‘windows’, it would know that glass windows are more likely to be the type of product I am seeking. Google would simply have remembered my previous searches, read the HTML and meta data on the Lowes.com pages, and used this to identify the intent of my new search for windows.
While I would have to give up some of my privacy, wouldn’t it be worth it if I could save myself time and energy by having search engine results more relevant to my desire?
You’ve Got Search In Your Mail
Another area with great potential for improving search engine results will likely be developed by Google. You may have heard the rumors that Google is getting set to launch an email client that many expect will be a free service similar to Yahoo Mail or Hotmail. Currently, Yahoo does an adequate job of making search available to all of its email customers. Each page within Yahoo Mail has a search box that makes it easy for you to conduct a search that might be sparked by an email you receive. But why not take it one step further?
Google has the technology to really take advantage of search within email. Why else would it even consider entering this arena? Imagine that, in order to use a free Google email account, you allow Google to provide advertisements and track your email activities. Google could change the way that search results and ads are displayed to free email users. For example, let’s say you receive an email from your brother which, among other things, gloats about the brand new P4 desktop computer he just purchased from Dell. As part of the interface you use to read that email, Google magically displays paid search advertising for desktop computers, including a link that will take you directly to the appropriate page on Dell.com. This information would be quite beneficial to you, as you may be interested in seeing how you too can be a proud owner of a P4 computer. It is also fantastic targeted advertising for Dell, who know that if you click on the listing, they are halfway to converting you into another satisfied customer.
This idea is so much closer to reality than you may think. Google already has the advertisers with its AdWords service boasting 150,000 users, eager to spend their advertising dollars. It also has the technology to determine which results to show you within your email interface. Google’s AdSense can provide the contextual ad technology that would scan an email’s content to determine which ads are the most relevant to display. With this technology in place, a simple provision within any Google Email Terms & Conditions would give the world’s largest search engine the necessary permission to serve up relevant ads to all users of its free email service.
We could be offered the option of paying a monthly premium in order to not have ads shown when we read our email, but if they are relevant to the content of a received message, why would we want to block them?
From Desktop to Internet
Another development in search engine technology that I can see happening would come from Microsoft’s new Longhorn operating system. While I must confess that I am not au fait with the intricate workings of this project, I do know that it will likely use the search technology that MSN is developing.
Imagine an operating system that monitors all of your activities — with your permission, of course. Every file, every image, word document, mp3, even e-books could be monitored by your computer as it endeavors to anticipate your every need. Not only could an integrated search engine allow you to search files located on your hard drive, but it could also use the information it has collected from these files to make your online search experience even more enjoyable.
It is quite possible that Longhorn or a future OS (Microsoft, Linux or Mac) could become intelligent enough to know that, after you listen to one of your favorite songs by the 80’s rock band Heart, your subsequent online search for ‘heart’ is more likely to stem from a desire to view the band’s fan site than from a pressing need to visit the web site of the American Heart Association. Your all-encompassing search engine would perhaps be a realization of the friendly Ask Jeeves butler, ready to anticipate your every need.
To Search Where No-one Has Searched Before
When you think about the future of search, it is easy to get excited. Millions (if not billions) of dollars are going to be filling the coffers of the largest search engine providers. They have some of the smartest people in the world working to develop the next great ‘thing’, which will enhance the user experience and serve up better, more relevant search results. Search engine technology is still most definitely in its infancy; how it grows will very much depend upon how much information and privacy the average search engine user is willing to give up. Personally, if I can view search results that more closely match my desired results, I’m willing to give up the name of my favorite pet, my place of birth and my mother’s maiden name!
Author Bio:
Andy Beal is Vice President of Search Marketing for WebSourced, Inc and KeywordRanking.com, global leaders in professional search engine marketing. Highly respected as a source of search engine marketing advice, Andy has had articles published around the world and is a repeat speaker at Danny Sullivan’s Search Engine Strategies conferences. Clients include Real.com, Alaska Air, Peopleclick, Monica Lewinsky and NBC. You can reach Andy at andy@keywordranking.com and view his daily SEO blog at www.searchenginelowdown.com.