JavaScript Links
JavaScript is a wonderful technology, but it’s invisible to all of the search engines. If you use JavaScript to control your site’s navigation, spiders may have serious problems crawling your site.
Links contained in your JavaScript code are likely to be ignored by search engine spiders. That’s especially true if your script builds links by combining several script variables into a fully formed URL.
For example, suppose you have the following script that sends the browser to a specific page in your site:
function goToPage(page) {
  // Append the visitor's tracking code, then send the browser to the page.
  window.location = "http://www.mysite.com" + page
    + "?tracking=" + trackingCode;
}
This script uses a function called goToPage() to add a tracking code onto the end of the URL before sending visitors to the page.
I’ve seen sites where every link on every page ran through a JavaScript function like this one. In some cases the JavaScript adds a tracking code; in others it sends users to different domains depending on the page. But in every one of these cases, the site’s home page was the only page listed in the search engines.
None of the spiders includes a JavaScript parsing engine that would allow it to interpret this type of link. And even if a spider could interpret the script, it would be hard-pressed to simulate the different mouse clicks that trigger execution of the goToPage() function with different values of the page variable, and it has no idea what value to use for trackingCode.
Spiders will either ignore the contents of your SCRIPT tag or read the script content as if it were visible text.
As a rule of thumb, it’s best to avoid JavaScript navigation.
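If you must keep the tracking behavior, one spider-friendly pattern is to put a real URL in the link’s HREF and let JavaScript take over only for regular visitors. A minimal sketch based on the goToPage() example above (the page name is hypothetical):

<A HREF="http://www.mysite.com/products.html"
   onclick="goToPage('/products.html'); return false;">Products</A>

A spider reads the plain HREF and crawls the page, while a JavaScript-capable browser runs goToPage() instead and picks up the tracking code.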
Can Spiders Crawl Your Web Site?
Fundamental to ensuring your web site appears in the search engine listings is making sure that search engine spiders can successfully crawl it. After all, if a spider can’t reach your pages, the search engine can’t include them in its listings.
Unfortunately many web sites use technologies or architectures that make them hostile to search engine spiders. A search engine spider is really just an automated web browser that must interpret your page’s underlying HTML code, just like a regular browser does.
But search engine spiders are surprisingly unsophisticated web browsers. The most advanced spiders are arguably the equivalent of a version 2.0 web browser. That means the spider can’t understand many web technologies and can’t read parts of your web page. That’s especially damaging if those parts include some or all of your page’s links. If a spider can’t read your links, then it won’t crawl your site.
As a search engine marketing consultant, I’m often asked to evaluate new web sites soon after their launch. Search engine optimization is often neglected during the design process, when designers are focused on navigation, usability, and branding. As a result, many sites launch with built-in problems, and it’s much harder to correct these issues once the site is complete.
Yet often it’s only when their site fails to appear in the search engine listings that companies call in an SEO.
That’s a shame, because for small businesses, search engines are by far the most important source of traffic. Fully 85% of Internet users find sites through search engines. A web site that isn’t friendly to search engines loses much of its value.
In this article, I’ll give an overview of the issues that can keep a search engine spider from indexing your site. This list is by no means exhaustive, but it will highlight the most common issues that will keep spiders from crawling your pages.
DHTML Menus
DHTML drop-down menus are extremely popular for site navigation. Unfortunately, they’re also hostile to search engine spiders, since the spiders again have problems finding links in the JavaScript code used to create these menus.
DHTML menus have the added problem that their code is often placed in external JavaScript files. While there are good reasons to put your script code into external files, some spiders won’t fetch these pure JavaScript files.
If you use DHTML menus on your site and want to see what effect they have on search engines, try turning JavaScript off in your browser. The drop-down part of your menus will disappear, and there’s a chance the top-level menus will disappear too. Yikes! Suddenly most of the pages in your site are unreachable. And that’s the way they are to the search engines.
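One workaround is to repeat your menu destinations as ordinary text links that don’t depend on JavaScript, for example in the page footer. A minimal sketch (the page names are hypothetical):

<!-- Plain HTML links duplicating the DHTML menu entries -->
<P>
<A HREF="products.html">Products</A> |
<A HREF="services.html">Services</A> |
<A HREF="contact.html">Contact Us</A>
</P>

With JavaScript turned off, these links remain visible and crawlable, so spiders can still reach every page the menus point to.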
Query Strings
If you have a database-driven site that uses server-side technologies such as ASP, PHP, Cold Fusion, or JSP, there’s a good chance your URLs include a query string. For example, you might have a URL like this one:
http://www.mysite.com/catalog.asp?item=320&category=23
That’s a problem, because many search engine spiders won’t follow links that include a query string. This is true even if the page that the link points to contains nothing but standard HTML. The URL itself is a barrier to the spider.
Why? Most search engines have made a conscious design decision not to follow query string links, because such links require additional record keeping by the spider. Spiders keep a list of all the pages they’ve crawled and try to avoid indexing the same page twice in a single crawl. They do this by comparing each new URL to the list of URLs they’ve already seen.
Now, suppose the spider sees a URL like this one in your site:
http://www.mysite.com/catalog.asp?category=23&item=320
This URL leads to the same page as our first query string URL, even though the two URLs are not identical (notice that the name/value pairs in the query string are in a different order).
To recognize that this URL leads to the same page, the spider would have to decompose the query string and store each name/value pair. Then, each time it saw a URL with the same root page, it would need to compare the name/value pairs of that URL’s query string to all the previous ones it has on file.
Keep in mind that our example query string is fairly short. I’ve seen query strings that are 200 characters long, and reference a dozen different name/value pairs.
So indexing query string pages can mean a great deal of extra work for the spider.
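To make that extra work concrete, here is a minimal sketch (hypothetical, and not any actual spider’s code) of the canonicalization step a spider would need before it could compare query string URLs:

// Sort the name/value pairs so that catalog.asp?item=320&category=23
// and catalog.asp?category=23&item=320 produce the same key in the
// spider's list of already-seen URLs.
function canonicalizeURL(url) {
  var parts = url.split("?");
  if (parts.length < 2) return url;        // no query string
  var pairs = parts[1].split("&").sort();  // put the pairs in order
  return parts[0] + "?" + pairs.join("&");
}

Multiply that bookkeeping by millions of URLs and it’s easy to see why many spiders simply skip query string links.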
Some spiders, such as Googlebot, will handle URLs with a limited number of name/value pairs in the query string. Other spiders will ignore all URLs containing query strings.
Flash
Flash is cool; in fact, it’s much cooler than HTML. It’s dynamic and cutting-edge. Unfortunately, search engine spiders use trailing-edge technology. Remember: a search engine spider is roughly equivalent to a version 2.0 web browser. Spiders simply can’t interpret newer technologies such as Flash.
So even though that Flash animation may amaze your visitors, it’s invisible to the search engines. If you’re using Flash to add a bit of spice to your site, but most of your pages are written in standard HTML, this shouldn’t be a problem.
But if you’ve created your entire site using Flash, you’ve got a serious problem getting your site into the engines.
Frames
Did I mention that search engine spiders are low-tech? That’s right, they’re so low-tech they don’t understand frames either. If you use frames, a search engine will be able to crawl your home page, which contains the FRAMESET tags, but it won’t follow the individual FRAME tags to reach the pages that make up the rest of your site.
In this case, at least, you can work around the problem by including a NOFRAMES section on your home page. This section of your page will be invisible to anyone using a frames-capable browser, but allows you to place content that is visible to the search engines and other frame-blind browsers.
If you do include a NOFRAMES section, be sure to put real content in there. At a minimum you should include standard hypertext links (A HREF) pointing to your individual frame pages.
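A minimal sketch of such a NOFRAMES section (the site and page names are hypothetical):

<NOFRAMES>
<H1>Acme Widgets</H1>
<P>Welcome to Acme Widgets. Browse our pages directly:</P>
<A HREF="catalog.html">Product Catalog</A>
<A HREF="about.html">About Us</A>
<A HREF="contact.html">Contact Us</A>
</NOFRAMES>

The headline and short description give the engines some text to index, and the links give the spider a path into your individual frame pages.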
It’s surprising how often people include a NOFRAMES section that simply says “This site requires frames. Please upgrade your browser.” If you’d like to try an experiment, run a query on Google for the phrase “requires frames.” You’ll see somewhere in the neighborhood of 160,000 pages returned, all of which include the text “this site requires frames.” Each of these sites has limited search engine visibility.
With www or without www?
The address of my web site is www.keyrelevance.com, but can people reach it if they leave off the initial “www”? For most server configurations the answer is “yes,” but for some the answer is “no.” Make sure your site is reachable both with and without the www.
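On an Apache server, one common fix is a redirect that answers both names and sends the bare domain to the www form. A minimal sketch, assuming mod_rewrite is enabled (placed in your .htaccess file or server configuration):

# Redirect keyrelevance.com/... to www.keyrelevance.com/...
RewriteEngine On
RewriteCond %{HTTP_HOST} ^keyrelevance\.com$ [NC]
RewriteRule ^(.*)$ http://www.keyrelevance.com/$1 [R=301,L]

Either form of the address then works, and the permanent (301) redirect tells search engines which one is canonical.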
This list presents some of the most common reasons why a search engine may not be indexing your site. Other factors, such as the way you structure the hierarchy of your web pages, will also affect how much of your site a spider will crawl.
Each of these problems has a solution, and in future articles I’ll expand on each one to help you get more of your pages indexed.
If you’re currently redesigning your web site, I’d encourage you to consider these issues before the site goes live. While each of these search engine barriers can be removed, it’s better to start with a search engine friendly design than to fix hundreds of pages after launch.
Author Bio:
Christine Churchill is President of KeyRelevance.com, a full-service search engine marketing firm. She is also on the Board of Directors of the Search Engine Marketing Professional Organization (SEMPO) and serves as co-chair of the SEMPO Technical Committee.
Top Tips To Pay-Per-Click Search Engine Success
Pay-per-click (PPC) search engines are a highly effective and inexpensive way to attract targeted traffic to your web site. With the most popular PPC search engines (www.overture.com and www.adwords.google.com) you can often buy this search engine traffic for 5 or 10 cents per click. With some of the less popular PPC search engines, it is possible to bid on keywords for as little as $0.01 per click! This means you can get 500 visitors to your website for just $5!
Below are some strategies that you can use to consistently get a steady flow of traffic to your website and increase your sales.
Focus On One Item Or Product That You Want To Promote
Promoting your entire list of programs or products will be more expensive and get you less targeted traffic. Instead, choose a product and promote that. You can promote other products or programs separately. Each click through should direct your visitor to the specific program or product that you’re selling. This will ensure that you will have less competition in bidding for very focused keywords and a higher likelihood of reaching really interested customers. Avoid directing your customers to a web page that is likely to confuse them or where they’ll have to start looking for the advertised product. Most customers simply do not have the time.
Find Out What The Traffic Is Like For Your Desired Keywords
Most pay-per-click search engines have a page where you can see which keywords were searched for, and how many times, in the previous month. Remember that a keyword popular in June may not be as popular in July or August, so use these reports only as a guide. I have found that any relevant keyword attracting more than 50 searches per month is worth bidding on.
Strategize Before Placing Your Bid
Now that you know what you want to promote and what keywords to use, you need an effective strategy to make your campaign profitable. As a general guide, avoid bids that are too high unless you have a huge advertising budget and your product converts well. Instead, aim to bid for less popular low-cost, relevant keywords. They produce a better return than the popular ones.
For The More Popular Keywords, Bid For A Lower & Less Expensive Position
When you search on a keyword, most PPC search engine results will also show you what each advertiser is paying for their position. Remember you don’t always have to be first. If one advertiser is paying $0.25 for the first position and the second advertiser is paying $0.12, you can get third place for only $0.11. A good tip is to aim for the first page, even if you’re in position 15! Most surfers quickly scroll through the entire page and they do not necessarily click on the first or second listing!
Research has shown that listings at the top and bottom of the page are more likely to be noticed than those in the middle. With Google AdWords, increasing either your maximum cost-per-click (CPC) or your ad’s clickthrough rate (CTR) will improve your ad’s position.
Write Interesting Ad Copy
The headline and copy of your ad are your one shot at getting leads and customers interested. You want to tell them what you are promoting and why they should visit your site. Remember, the goal is interested customers (good leads), not just anybody. Give your potential customers an added incentive by giving away something FREE! People love free stuff! You can also include deadline and discount information when applicable. With the Google AdWords program, you can do this in real time.
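For instance, a hypothetical AdWords-style text ad built on these points (the offer and URL are invented) might read:

Free SEO Checklist
Get more visitors from the engines.
Download the free checklist today!
www.yoursite.com/seo-checklist

The headline names the offer, the copy delivers the free incentive, and the display URL points to the specific page being promoted.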
Only Bid On Keywords That Are Relevant To Your Campaign
This point cannot be over-emphasized. All the pay-per-click search engines have rules about how relevant keyword bids must be. Generally, you cannot bid on keywords that have nothing to do with the page they link to. In any case, bidding on words that are not related to your promotion is a bad strategy. It may result in clicks that you pay for but that do not bring interested customers or leads.
Bid On As Many Relevant Keywords As You Can
You catch more fish with a net than you do with a single line. You can often get great, targeted clicks for only a penny or less if you bid on less popular but relevant keywords. If you take the time to bid on a few words a day, you’ll have a powerful, inexpensive campaign running within a few days. Don’t forget to try it out at more than one PPC Search Engine as well.
Track Your Keywords And Your Entire Marketing Campaign
You can do this by checking your reports. Each pay-per-click service offers them, and they will tell you how well each keyword is driving traffic. You can then fine-tune your efforts by changing the value of your bids or by adding and deleting keywords. Regularly check your keyword positions and do a lookup on your keywords. If you have picked popular words, you will notice that your bids change position as others bid on them as well. Expect results that match your effort.
If you take a little extra time to apply these points, you should see good results within a short period of time, depending on your efforts.
Someone has said that pay-per-click search engines (like all Internet marketing efforts) are like fishing. You must cast hundreds of times to get a few fish. So be patient and remember that using pay-per-click advertising, like ezine advertising, is an effective approach that can increase sales at a very low cost.
Author Bio:
Ben Chapi owns Venister Home business and Affiliate Program Classifieds at http://www.venister.org. He is also webmaster for http://www.best-debt-consolidation-loan.co.uk and http://www.home-equity-loan.org.uk/
What Is PR?
First of all, PR stands for PageRank, one of Google’s ways of determining the importance of a website. To view the PR of websites, download the Google toolbar; PR is displayed as a green bar in the center. Why is PR so important? PR helps determine how high you will be listed for your keywords, but PR alone is not a sure thing. Even with a PR of 10/10 you still might not get top placement for your keywords if you do not use correct search engine optimization. The techniques I am about to describe are free, so don’t worry if you have a zero budget.
PageRank Building Technique #1
The first PR building technique I am going to talk about is directory listings. Most web directories have a PR of 1 or 2 on the pages where you will be listed, but many have a PR of 6 or 7; DMOZ, for example, has great PR that can really boost your website’s PR. In addition to boosting your PR, web directories also bring in some decent traffic, especially if you get listed in the right category. Picking the most relevant category is more important than anything if you want targeted visitors. For example, if you had a webmaster website, you would not want to be listed under a general Internet category where the people who click the link aren’t even interested in your website. Besides bringing more targeted visitors, a relevant category that includes one of your keywords can also boost your search engine position for that keyword.
PageRank Building Strategy #2
The second-best PR building technique is reciprocal links. Reciprocal linking is when one site links to another and vice versa. This benefits both websites as long as they are not competitors. Reciprocal links on link pages do not bring much traffic, but link pages normally have a PR of 0 to 6.
Finding reciprocal link partners can be hard. The trick to landing one is to customize your email and offer something that other websites don’t. For example, offer to email all of your opt-in users once every other month to advertise your reciprocal link partner, and see if they will do the same for you. Main-page link exchanges seem to help PR the most and bring the most visitors, which gives you the best of both worlds.
The best reciprocal link partners, in my opinion, are forums. If you have a big website with tons of pages, I suggest emailing forum webmasters and asking them for a full-website reciprocal link exchange: ask them to put a code in their forum so your link shows up at the bottom of every single page of the forum, while you do the same for them. You have the advantage here, because a forum gains another page or two a day, while most other websites add roughly one page a week, giving you the chance to raise your PageRank by leaps and bounds. I got a link from a forum to my website and it raised my PageRank from 4 to 6!
PageRank Building Tip #3
That brings me to my last useful PR building technique: forum signatures! This technique is only useful if you participate in forums at least occasionally. If you do not, you could take a few minutes out of every day to reply to people’s posts. On some forums this technique does not work because they disallow signatures. If they do allow them, put no more than two of your links and a short description in your signature. Keep the description to a maximum of four lines; anything longer and the administrator may ban you from using a signature, or people will get annoyed and think you are spamming. One link from a forum will do you little good, because posts on forums normally have a PR of 0, though the more controversial topics occasionally have higher PR. A single PR-zero link does nothing, but hundreds of them can help raise your PR and bring traffic from forum members interested in your website’s description.
Author Bio:
Aaron Turpen is the proprietor of Aaronz WebWorkz, a full-service online company catering to small and home-based businesses. Aaronz WebWorkz offers a wide variety of services including Web development, newsletter publishing, consultation, and more.
Will WebPosition Get My Site Banned from Google?
In mid-November of 2003, Google seriously revamped their ranking algorithm. As a result, many sites were dropped from their index or fell dramatically in rank. This infuriated many Web site owners at the height of the holiday buying season. Since that time, many accusations have been thrown at Google as to why this happened. Some say it’s a plot to encourage people to buy AdWords listings. Others have even theorized WebPosition is somehow to blame. Still others cite more traditional causes.
As soon as Google changed their algorithm, many WebPosition Gold customers whose sites had dropped contacted me demanding an explanation. They wanted to make sure their sites were not dropped because they had used WebPosition Gold. I reassured them that this was not the case. I went on to explain that many thousands of sites were dropped that don’t even use WebPosition Gold. Many of our customers even saw their rank increase. In addition, most of the time the site had not actually been banned from the index. It had simply dropped in rank.
In this article, I will attempt to dispel many of the pervasive myths regarding WebPosition Gold and Google. I’ve used WebPosition for years on my own site and for clients. I’ve also helped provide technical support to others using the product. Therefore, I’ve been on both sides of the fence, and thereby feel uniquely qualified to address the most common questions that tend to come up:
Will running automated Reporter Missions on Google get my site banned?
No. Despite repeated rumors, when running a Reporter Mission, WebPosition Gold does not pass personal information, such as your name, address, email, Web site URL or domain name, to Google. Instead, it conducts queries as a normal browser would, and then examines the results offline. Google therefore cannot determine that you’re running a query relating to a specific domain. The only information passed to Google is your IP address. In most cases, your Web site’s IP address is different from the IP address of your ISP (Internet Service Provider). So, how can Google connect the two? Simply put, it can’t.
Google states on their FAQ page that they do not recommend running automated queries against their service because it uses server resources. Yet most businesses find it impractical not to measure their search engine rankings at least occasionally, and it’s hardly reasonable to check rankings by hand in Internet Explorer, which for the same keyword list would send the same number of queries to Google anyway. So most businesses optimizing their Web sites end up using some kind of automated tool to monitor their progress and measure their visibility. Working as a search engine marketer myself for many years, I’ve found that the best policy is simply to be sensitive to the needs of the search engines. Avoid being “abusive” in your practices, whether in your optimization strategies, your submissions, or your rank management.
Therefore, when using WebPosition, I often recommend the following strategies:
1. Avoid excessive numbers of queries if you choose to check your rankings on Google. Most people do not have time to improve their rankings on hundreds of keywords, so there’s no need to rank-check hundreds of keywords you’ll never have the time to act on. While your site won’t be banned for excessive queries, Google could block the IP address you use to connect to Google if it found your query volume excessive. This is true regardless of what tool you use, even if it’s a browser.
It has been my experience that a blocked IP is extremely rare, even among consultants conducting rank checks for dozens of clients. Presumably, Google would not want to accidentally block an IP that generates a large volume of queries simply because it’s shared by many different users. Even so, it’s always a good idea to practice a little common sense.
2. If you choose to run queries, try to run most of your queries at night and during off-peak periods, which is something Google has suggested in the past. This is when many of their servers are presumably standing idle, waiting to handle the increased volume during peak periods. The WebPosition Scheduler makes this easy to do.
3. Do not run your queries more often than is really necessary. Since Google normally doesn’t update their entire index more than once a month, there’s limited benefit to checking your rankings more often than that.
4. As an alternative to Google, consider checking your Google rankings using Yahoo Web Matches or another Google “clone” engine in the Reporter. Although these rankings can vary slightly from Google.com, they’re normally close enough to give you a very good idea of your actual Google rankings without checking Google directly.
5. With WebPosition Gold 2, you can also use the “Be courteous to the search engines” feature on the Options tab of the Reporter so you don’t query their service so quickly. This gives you added peace of mind not found in many other automated tools, assuming you don’t mind your missions taking longer to run. The Submitter has a similar feature to submit randomly at various time intervals.
Can I use WebPosition Gold to get my competitors banned from Google?
No. If running automated queries on Google with WebPosition Gold could get a site banned, you could use it to get your competitors banned from Google. This is not the case.
Google even verifies this on their web site. They don’t specifically name WebPosition Gold, but they do state that there is nothing you can do to get your competitors banned from Google. For more information, please see the “Google Facts and Fiction” document at Google’s site.
Will over submitting my site get me banned?
No. Many people think that Google will ban your site if your submissions exceed the recommended daily limits. If this were the case, we could over-submit our competitors’ sites and easily get them banned from Google.
Google is very clear on this and even states that over-submitting will not get you banned. Even so, some of your submissions might still be ignored or discarded if they break the rules. Therefore, I recommend using the “Slow Submit” option in WebPosition Gold’s Submitter and staying within WebPosition’s recommended daily limits. Some people argue that manual submissions are best. However, manual submissions can’t warn you if you inadvertently over-submit, make a typo in your submission, or forget what you submitted and when.
For achieving top rankings, and staying indexed long-term, the best submission technique may be to not submit at all. Instead, try to establish third-party links to your Web site and wait for Google’s spider to find you on its own. WebPosition’s Page Critic offers numerous strategies for doing this.
Will Doorway or Entrance pages get me banned from Google?
That depends on whether these pages contain spam. If your definition of a doorway page is a page full of irrelevant or duplicate content, and excessive keyword use, then yes, you could find your site banned. That’s how Google often defines a doorway page. Consequently, the term doorway has developed a negative connotation over the years.
If your optimized page is nothing more than an extension of your main web site that happens to contain search engine friendly content, then you’ll be fine. In fact, you’ll be rewarded for the effort through top rankings. The key is not whether you label a page a doorway, entrance, optimized, informational, or “whatever” page. The key is whether the page contains quality, relevant content that provides the search engine with what it wants to see.
Google mentions that they discourage the use of “doorway” pages because they fear that webmasters will optimize for keywords that are not relevant to the page’s content. This is a legitimate fear, as they are in the business of providing relevant results to their visitors. However, if you create pages that contain what Google is looking for, then obviously Google will not penalize those pages or view them differently from any other page on your site.
With this in mind, here are a few of my tips on creating Google-friendly pages:
1. Always Include Relevant Content – Make sure that the content on each of your pages is relevant to your site. Many sites have various resources on a number of different topics. This is fine, as long as the overall theme for your Web site is solid. I would also suggest that you organize your related content into individual directories. Some businesses find it beneficial to organize each sub-theme of their site into a separate domain so they can cross-link the domains. If you do this, make sure you have links from other sites as well.
2. Avoid Duplicate Content – Create each page with unique content. If you are targeting different search engines for the same keyword, you may find that you have some very similar content on certain pages. If this is the case, you can always create a robots.txt file to tell each search engine crawler not to index a page or directory that was created for another search engine (see the sketch after this list). See the October 2000 issue (http://www.marketposition.com/mp-1000.htm#THREE) of MarketPosition for more information on creating a robots.txt file.
3. Avoid Keyword Stuffing – Creating pages that excessively repeat your keyword phrase is definitely not a good idea. This almost always throws up a red flag to the search engine and is one of the most common forms of “spamming.” How many keywords is too many? See WebPosition’s Page Critic for up-to-date, specific recommendations regarding how many words and keywords are recommended in each area of your page.
4. Design Good Looking Pages – Although Google cannot tell if your page is aesthetically pleasing, it is recommended that you create pages that look good and fit the theme of your Web site. This will definitely increase the click through rate from the arrival page to the rest of your Web site.
5. Avoid Using Hidden Image Links – Many site owners think they can fool Google by including transparent 1×1 pixel image links on their home page that point to their optimized pages. These are very small images contained in a hyperlink that are not visible to the naked eye. This can get your page dropped from Google’s index.
6. Avoid using links that have the same color as the background on your page – Many site owners try to hide the links on their home page by making the text color the same as the background color of the page. As with the scenario above, this can also get your page banned from Google.
7. Avoid Using JavaScript Redirection Techniques – Many Web site owners have used JavaScript to redirect a user to another page while letting Google crawl the page that contains the JavaScript code. This did work for a while, but Google eventually caught on. Other forms of redirection, like IP cloaking, are also frowned upon by Google.
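As promised in tip 2, here is a minimal robots.txt sketch (the directory names are hypothetical) that keeps one engine’s crawler away from pages written for another engine:

# robots.txt - placed in the root of your site
# Keep Google's crawler out of pages written for other engines
User-agent: Googlebot
Disallow: /altavista-pages/

# Keep all other crawlers out of the pages written for Google
User-agent: *
Disallow: /google-pages/

Each crawler obeys only the record that matches its own user-agent (falling back to the * record otherwise), so each engine indexes only the copy meant for it.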
In Summary:
The rules regarding each search engine change routinely. That’s why WebPosition’s Page Critic is updated monthly to keep pace. As a search engine marketer, it’s critical that you keep informed as to the latest search engine rules and strategies.
It’s also important to understand that WebPosition Gold is only a tool. When used properly, it will not get you banned or blocked, and it can in fact improve your rankings dramatically. However, as with any tool, you can choose to ignore its recommendations and go your own way. You can use a hammer to build a fine house, or you can take that same hammer and knock a bunch of holes in someone’s wall. Ultimately, the call is up to you, the user of the tool.
Author Bio:
This article is copyrighted and has been reprinted with permission from Matt Paolini. Matt Paolini is a Webmaster/Tech Support Specialist for FirstPlace Software, the makers of WebPosition Gold. He’s also an experienced freelance Search Engine Optimization Specialist and Cold Fusion/ASP.NET/SQL Server developer/designer. For more information on his services, please visit http://www.webtemplatestore.net/ or send him an email at webmaster@webtemplatestore.net
Searching for Dominance: What Will Microsoft Search Look Like?
It’s the beginning of the year, so you’ll see dozens of ‘Year in Review’ and ‘Predictions for the Coming Year’ articles about the search engine industry. I have to admit, I was going to jump on the bandwagon myself, but as I started looking at what I would say, one thing dominated the future landscape for the next 4 years to such an extent that it made all the other developments pale in comparison. This tsunami of change will shape and affect every corner of the business. Search as we know it will be swept away because of it, and all the search providers we know will scramble to readjust and find their place in the new landscape. When Microsoft enters search, all else will become a footnote in the history of the web.
So, forget Yahoo and Google. Sorry Looksmart and Ask Jeeves, you’ve been pushed off the front page. Today, the spotlight is on Microsoft, and how they will likely change the face of web search. In this column, I won’t be talking about industry impact. Instead, with the help of our Organic Search wiz, Rob Sullivan, I’m looking at the promise of Microsoft’s research itself, and what the tool may actually look like.
MS Search... It’s All About Indexing
First, Microsoft is looking to solve a long-standing desktop irritation, and when they find the answer, it will change the indexing of file information forever.
The current way of finding files on your computer leaves a lot to be desired. There has been no single system that effectively searches content from multiple file formats. To solve that problem, Microsoft is looking to employ three different technologies. First, to ensure compatibility, Microsoft will continue to use their NTFS File Structure system. They will combine it with the indexing capabilities of a SQL server relational database and the file labeling potential of XML. The new system is called WinFS.
WinFS
The problem with current file systems is that they are hierarchical. Files occupy one single place within a nested pyramid of file folders. But people don’t tend to think that way. A file may be relevant in a number of different ways, depending on the context in which you’re looking for it.
The other problem with hierarchical systems is that they need a librarian. Someone has to establish and organize the hierarchy, and usually this organization is established in anticipation of the context in which you’ll have to re-access the information.
I know there are people out there who are diligent about filing away every single document in a well organized file system, but for the 99% that make up the rest of us, our hard drives are a vast junk yard of old files, spreadsheets and emails. More often than not, we desperately use Microsoft’s find file application to try to track down that elusive bit of information we’re looking for.
A further problem is that there is no good way to quickly search a number of different file formats for a scrap of information that may be hidden in one of them.
Microsoft’s new WinFS will work on top of the current NTFS structure, but it will introduce a dramatically new way of indexing files and their contents. XML tags will be used to send relevant information to a SQL database, bridging the current gap between structured data stored in a database, which is indexable, and data stored in unstructured formats such as Word documents, webpages and email messages, which has been un-indexable. WinFS also allows users to add ‘metadata’: identifying tags attached to existing files. For instance, a picture file could include information about the subject of the picture, or a sound file could include information about the audio captured.
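As a purely hypothetical illustration (this is the idea, not actual WinFS syntax), the metadata attached to a picture file might look something like this in XML:

<!-- Hypothetical sketch only; not a real WinFS schema -->
<picture file="vacation012.jpg">
  <subject>Sunset over the harbor</subject>
  <location>San Francisco</location>
  <taken>2003-08-14</taken>
</picture>

Tags like these are what would let the SQL-backed index answer a question such as “show me every photo taken in San Francisco” without caring which folder the file sits in.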
Stuff I Have Seen
A Microsoft research team has been working on a prototype application called SIS, or Stuff I’ve Seen. Although its focus is to help users find files and information on their desktop, its implications for web search functionality could be dramatic. It pulls information from multiple file formats, including emails and webpages, and records them in a single index. This lets the user search through them with a powerful interface that can apply several filters at the same time. The search becomes a real-time iterative process, allowing the user to quickly narrow the results down to the most relevant findings.
Implicit Query
‘Stuff I’ve Seen’ gives the user a powerful tool to find files and information on their desktop. Implicit Query (the link goes to an interesting PowerPoint presentation prepared by the Microsoft Research team) goes one step further by continually searching and retrieving information based on what the user is doing. As the program tracks user behaviour, it refines its model of what is important and relevant to the user and filters the search results accordingly. This is an extension of Microsoft’s Lumiere research, which modeled the Bayesian logic behind the current automated assistance functionality.
In one example, a Microsoft researcher was typing an email to a colleague about an upcoming conference. As she typed, Implicit Query brought up presentations, slides and documents prepared for the conference in its results panel. In another instance, she was preparing an email to a colleague about a broken link in her group’s website. Before she had finished, she was shown an unopened email that contained the fix.
Memory Landmarks
A third Microsoft project doesn’t hold nearly the same promise for web search, but it would make an interesting add on feature. Memory Landmarks can add historical remarks to a list of chronological search results. For example, if you were searching for articles regarding the capture of Saddam Hussein, you could sort the list by date and Memory Landmarks would indicate where on the list the capture took place.
What Will MS Search Look Like?
I think the above prototype applications give us some real clues as to what Microsoft Search will look like. As Microsoft works on the new Longhorn OS, we have to remember:
• As Microsoft works on ways to index and search files locally, it’s a logical extension to apply the new technology to web search.
• Longhorn’s Indigo makes a major move away from object oriented programming towards web services. There will be a much richer and deeper exchange of information between your local computer and web service sites. This allows for much greater localization in search tools.
• Microsoft has a long history of incorporating what were third-party stand-alone applications into their applications and operating systems. They have already identified search as one of the key activities people do online.
• Microsoft’s ASI (Adaptive Systems and Interaction) research department is working to make their systems more intuitive and intelligent by letting them learn how the user works and adapt accordingly.
• Microsoft is working on desktop applications that will dramatically change how people launch searches for information.
Given all this, here is what I believe Microsoft Search will eventually look like.
Microsoft will use WinFS as the basis for eventually indexing every document on the web. Remember, because it’s integrated at the OS level, it will be native to every Microsoft IIS server on the internet. It gets around the current problem of the ‘invisible’ web by allowing web publishers to include metadata that enables quicker indexing. Its SQL foundation will make the indexing of database-driven information quick and transparent, as was shown when legal publisher LexisNexis allowed Microsoft to index a portion of their huge database.
This common indexing procedure will erase the dividing line between desktop searches and web searches. The entire web will be accessible from the Microsoft search sidebar. What’s more, the next evolution of ‘Stuff I’ve Seen’ and ‘Implicit Query’ will monitor what you’re working on and provide suggested information sources and files from both your desktop and the web.
If a user wants to launch a manual search, the current trial and error method of search (try a search, check the results, refine the query and try again if you don’t find what you’re looking for) will become much quicker and more powerful with an interface that allows for real time updating of results as filters are applied and parameters are tweaked. I’m willing to bet that Microsoft will also unveil leading edge natural language query technology that will mine web data based on interpreted concepts and not the current structured query method used on most search engines. By the time Microsoft Search is unveiled, I believe a more intuitive search interface will be standard on all the leading search portals.
Search functionality will eventually be integrated into every Microsoft application, much as the ubiquitous Office Assistant (I can’t tell you how much I hate that damned paper clip... until I need him) is now. Microsoft will be able to capitalize on this by selling sponsored search suggestions that will also be offered via the implicit query channel. For instance, if you’re writing an email about an upcoming business trip to New York, the Microsoft search pane will offer airfare and hotel specials, as well as suggestions of things to do while in New York.
Microsoft will be able to monitor everything you do. The more you do on your desktop, the more Microsoft and its applications will learn about your preferences and priorities. Their ASI research will allow them to adapt search functionality and personalize it just for you. So the search results you see won’t be the same as everyone else’s. Personalization will move beyond just geographic location to take into account the types of sites you tend to visit, business priorities, your typical workday activities and even your lifestyle interests. Big Brother lives, and his name is Microsoft!
Finally, Microsoft’s Indigo feature in Longhorn will remove the distinction between server side tasks and client side tasks. Therefore, local indexes will be utilized whenever possible to increase search performance and the options for personalization. The line between your desktop and the internet will become more and more indistinct as time goes on.
Implications For The Real World
Today, I just wanted to focus on what Microsoft’s search could look like. In writing this, I kept asking the same question, ‘Boy, I wonder what this means for Google?’ Obviously, the gang at Google is very aware of the impending threat of Microsoft search. So, in the next NetProfit, I’m going to ask our head of Organic Search, Rob Sullivan, to join me for a little brainstorming. We started chatting today by the water cooler and he has some very interesting theories. Stay tuned!
Author:
Gord Hotchkiss
My Predictions for the Search Engine Industry for 2004
The year 2003 was, generally speaking, a good year for the search engine industry. While some sites were hit hard by the Florida update in November, the balance of the year was good.
I predict 2004 will be even better, although you should expect some major changes. 2004 will be a year of consolidation, if there is such a thing in this rapidly changing industry. While in 2003 some search companies such as Looksmart and Espotting had more than their share of problems, especially Looksmart, I predict there will be a few newcomers.
2004 will be a year where you can expect news to hit this industry on a daily basis. In 2003, hardly a day went by that there wasn’t something happening at one company or another. Expect 2004 to be even busier.
Additionally, 2004 will be a year where the stakes get higher, much higher, and not just for Google! The level of competition among search engines will toughen in the race to land what I call ‘targeted eyeballs’.
Will it be GOO, GOG, GLE, GGG or simply GO?
As you must have read or heard by now, 2004 is also the long-expected year in which Google is supposed to transform itself into a public company, with all the advantages and pitfalls that this entails. Google won’t have a choice but to go public in 2004. The reason is a little-known SEC rule in the US that forces American companies to issue quarterly statements once their number of employee-shareholders surpasses a certain threshold.
It is estimated that 700 to 800 such employees are current shareholders, in effect forcing Google to become public whether it likes the idea or not. At least by going public, it will receive many billions in new money that it will be able to use in its massive, continuing research and development programs at the Googleplex.
Many are expecting the five-year-old company to make an IPO sometime in late March or early April. People here at Rank for $ales are starting to take bets as to what its stock ticker symbol will be. On the New York Stock Exchange, a company cannot have more than three letters in its stock symbol. Personally, I think it might be GO, although that could cause some confusion with Infoseek 😉
The Yahoo Factor
Whatever happens in the Google stables, one major player that cannot be neglected is Yahoo. Just yesterday, in fact, the CEO of a very large Canadian company and client of mine asked me what I think of Yahoo and what it might do to fight back against the ever-powerful Googlemania.
I strongly believe the Sunnyvale, California company should be considered the ‘sleeping giant’, but not for long. After having bought and ‘digested’ Overture and AllTheWeb in 2003, Yahoo is the number one search directory in the world and the number two search engine (although I prefer to call it a search directory) on the Web. Yahoo is also the current owner of AltaVista.
In 2004, there is no question that pay-per-click (PPC) will be an extremely popular derivative of search, and one that will grow well into 2005 and 2006. Overture and Google (through its AdWords program) currently have the two best PPC search properties.
New Search Engines to be born in 2004
After having successfully developed and deployed my paid-inclusion search engine Global Business Listing ( http://www.globalbusinesslisting.com/ ) in 2003, in 2004 I will begin the development of my new PPC search engine Net Globe Media ( http://www.netglobemedia.com/ ), which should be completed sometime in the third quarter.
Although slightly different, Net Globe Media will operate on principles similar to those Google currently uses for its AdWords program, and it will offer some features similar to Overture’s.
What will distinguish Net Globe Media from its competition will be its drastically lower bid prices and the way advertisers can rotate their ads. There will also be a tool to help you make the best bids on your ads while lowering their cost, plus a few additional value-added features.
Where I See the SEO Industry as a Whole
After the Florida update and the many sites that were penalized in the process, I strongly believe the SEO industry will endure its largest shakeout ever in 2004. The search engine optimization firms that are serious, that have always consistently produced good results for their clients, and that constantly keep up with the many changes in technology and new developments are the ones that will thrive in 2004.
The others, the ones that resort to unethical practices or to techniques forbidden by the search engines, will disappear. Google and many other major search engines are constantly developing new technology and algorithms to detect that sort of thing, and search engine spammers will get caught at their own game. As a result, I think the SEO and SEM firms that remain after this shakeout will be stronger than ever, and will represent the best marketing investment a company can make today for its online advertising programs.
In Conclusion
2004 should be an interesting year, both for the search engines themselves and for the SEO community as a whole. There will be no place for second best. Companies and businesses that have their sites professionally optimized will benefit greatly. Companies, site owners and webmasters that continue to produce good content for their users will continue to be viewed positively by search properties such as Google, and will continue to produce a higher-than-average ROI.
The ones that don’t, well, they could end up on the sidelines, way out of the results pages.
Happy New Year to all of you!
Author:
Serge Thibodeau of Rank For Sales
How To Get Your Visitors To Create Content For Your Website
An ongoing challenge for webmasters today is to provide fresh content that gives visitors a reason to return to their site. Unless you have a full-time staff dedicated to creating regular content, the time involved can be crippling.
Wouldn’t it be great if someone else would write timely, relevant content for you? Sure, but what are the odds of that happening? Well, many webmasters are already enjoying this phenomenon, and I’m happy to count myself among them.
Set it up. Whether your site has a catalog of products or a collection of articles, you can design your pages to allow visitors to post reviews of whatever is featured on the page. They can share their experiences with items they’ve bought or post comments on the information in your site. Don’t confuse this powerful tool with a discussion forum. You create the topic of each page, and encourage visitors to post updates with the latest information in this area.
The more information you provide on your site, the better service you are providing to your visitors; but they aren’t the only ones who benefit from including reviews. It’s also a boon to you as a webmaster. Even if you never wrote another line of new copy, your pages will continue to grow with relevant content. Granted you still have to spend a little time reading the reviews and moderating what appears on your site, but the time required is much less than you would spend writing original content.
Why Someone Else Will Create Content For You.
While the advantages to the webmaster and reader may be obvious, you are probably wondering what incentives exist for the reviewer. A few lucky souls have enough free time to share their opinions online for the mere satisfaction of seeing their own words, but most will need something more tangible before they invest the time to write a useful review. The onus is on you, the webmaster, to create an appropriate incentive. The importance of link popularity in search engine rankings provides a powerful clue. At AffiliateScreen.com, I allow my visitors to post reviews of their experiences with online affiliate programs. At the bottom of their review, they have the option of including a link to another page that supports what they’ve written (or they can simply include their affiliate link for the program). AffiliateScreen.com gives them the additional credibility of their expertise appearing on a third-party site.
Look at this from the reviewer’s perspective. Here’s a unique opportunity to gain an external link pointing to the reviewer’s website, but this is far more valuable than a random reciprocal link. Search engines are determining link popularity by examining both the quantity and relevance of links. The page with their link is loaded with keywords related to the product or service, and the reviewer can include additional relevant keywords in their post. The reviewer actually has a hand in creating the page that will link to their site. This is extremely powerful! Can you begin to see how anyone looking to increase traffic to their site would jump at the opportunity to create content for your website?
Now you may have some concerns that allowing anyone to add content and links to your site is inviting your competitors to steal your traffic. But if your site has detailed product reviews, anyone who leaves to visit a competitor won’t spend much time there before realizing that your site has useful, original content they can’t find anywhere else. If that doesn’t put your mind at ease, there’s nothing to stop you from including a disclaimer when someone submits a review. Explain in this disclaimer that direct references to your competitors will be removed to keep the reviews informative and to avoid marketing hype. You are the webmaster, after all.
You can draw even more traffic by allowing visitors to rate the usefulness of each review. Many reviewers will encourage people to visit your site so they can read and rate the review. More important than just being a gimmick to increase traffic, rating reviews allows you to sort them by quality and expose your average visitors to the best possible content first.
Promote It!
Once you’ve got the code in place, it’s important to publicize this new feature of your website. If you have experience or know someone with experience writing press releases, you may be able to garner attention from news and niche media. Another highly effective form of publicity is to mention the new feature above the fold on the main page of your site.
If you have a newsletter, promote the review process in your next issue. In fact, as you begin to accumulate reviews, they can serve as great additional content for your newsletter. Simply include the best reviews in each issue. As with the reviews on your website, you are providing valuable content for your audience, giving your reviewer great exposure, and saving time for yourself. It’s a win-win-win situation.
Author Bio:
Clay Mabbitt writes articles about online income opportunities. He is the founder of a community of Internet entrepreneurs sharing knowledge and experience at http://www.affiliatescreen.com
Getting A Better Search Engine Rank For All Of Your Pages!
In one of my previous articles, I discussed how to market your web site link twice. It detailed how to promote not only www.yoursite.com but also your site without the www, like this: http://yoursite.com.
This article is about promoting ALL of your pages within your marketing campaign. Most of us typically promote only the main page of our sites, e.g. www.yoursite.com. The truth is, your site is much more than just the first page, right? So let’s condition ourselves to promote everything available within your site…
Search engine marketing is crucial for all companies that want to succeed online. At one point or another, you will hear how “optimizing your site for search engines is crucial.” They aren’t fooling you; it is a crucial marketing tactic. But what is also crucial is learning how to use different tools to boost your search engine placements once you’ve optimized your site for the web.
So let’s talk about the marketing tactics available to you and implement strategies on how to promote all your pages within them.
Link Exchanges
Link popularity has become a norm for most small companies to pursue in their daily promoting activities. Here’s the problem: most companies that perform link exchanges daily fail to use them to their advantage. For example, let’s say you perform approximately 10 link exchanges daily, and for each one you submit the same link, “www.yoursite.com”. What you’ll want to start doing instead is submitting 10 different links within your site.
Ex Link Exchanges:
Link #1: http://yoursite.com
Link #2: http://yoursite.com/resources
Link #3: http://www.yoursite.com/services
And so on…
A good strategy would be to open up “Note Pad” and create all the links you want to promote.
1) Add each link
2) Assign an appropriate title to each link
3) Create an appropriate description for each link
Now all you have to do is to copy and paste each link when performing your daily link exchanges.
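A hypothetical sketch of what one entry in that file might look like (the page, title, and description are invented):

Link: http://www.yoursite.com/services
Title: Affordable Web Site Marketing Services
Description: Search engine marketing and web site promotion services for small businesses.

Keep one such block per page you want to promote, and rotate through them as you trade links.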
Writing Articles
Do you write articles to promote your site? If you do, then you have probably created a “Resource Box” at the end of each article, right? Good; let’s change the resource box a little.
My site hosts hundreds of marketing articles, and the one thing I notice time and time again is that each author (no disrespect to any of the wonderful authors) typically adds only the main page of their web site within their “resource box”.
Ok, let’s say you write articles about “Search Engine Marketing”. At the end of the article, add a link to a page on your site that talks about “Search Engine Marketing” or something similar.
Ex. http://www.yoursite.com/search_engine_marketing.html
So with that in mind, try revising all your articles to point to specific pages on your web site.
Directory Submissions
Submitting your site to directories will give your site long-lasting traffic. Not all directories will accept links to pages other than your main page. That’s OK, though; there are literally hundreds of other directories that will allow you to submit any page you want.
Experiment with this and try to vary your links when you’re submitting to directories.
I WOULD NOT RECOMMEND using anything other than your main page for directories like the ODP (Open Directory Project) or Yahoo.
Wrap Up
So now you have three proven marketing strategies for improving your search engine placements for all of your pages instead of only your main page. Be creative with this and look for more ways to apply the same tactic.
Author Bio:
Martin R. Lemieux, Smartads – President
Affordable Web Site Design & Web Site Marketing
Web Site Awards & Webmasters Playground
Food for your mind & Entrepreneur Traits
Read over 200 articles on advertising!
Finding Good Keywords
First of all, what type of keywords should you optimize for?
A single word? A whole sentence? The real question is: what are people searching for? Here is a breakdown of the most popular types of searches.
Two word searches 29%
One word searches 25%
Three word searches 24%
Four word searches 12%
Five word searches 5%
Six word searches 2%
Seven word searches 1%
What about active verbs? Think about it. If someone types in “buy”, “find”, or similar words, wouldn't that be an indication that the person is about to make a move instead of just surfing? Although you could go to a top-100 search terms site, why not create your own top 100 active search terms by putting these active verbs into a keyword tool?
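To see how quickly such a list adds up, here's a minimal sketch in JavaScript (run it with Node.js) that pairs a few active verbs with product terms; the product terms are hypothetical placeholders for your own:
// Active verbs that signal buying intent.
var verbs = ["buy", "find", "purchase", "order", "download"];
// Placeholder product terms; substitute terms from your own niche.
var products = ["web hosting", "seo software", "marketing ebook"];
// Combine every verb with every product term.
var candidates = [];
for (var i = 0; i < verbs.length; i++) {
  for (var j = 0; j < products.length; j++) {
    candidates.push(verbs[i] + " " + products[j]);
  }
}
console.log(candidates.join("\n")); // 15 candidate search terms
Feed the resulting list into a keyword tool to see which combinations people actually search for.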
People talk to their computers. They will use “I” in their search engine queries sometimes. “How do I find…?” I know I do. I also hold my computer personally responsible when I can’t find something or Windows locks up.
People put .com in their searches. Type .com into the Overture keyword tool and see how many websites people know by name but still are attached to using a search engine.
Which sites have a really bad interface? I can name a few, a few big ones. You can find these by typing “registration”, “signup”, “login”, or “join” into a keyword tool. How about “purchase”? If you are doing affiliate marketing, you may want to try these and take advantage of webmasters who should be called webminors.
Although a little bit off-topic, I read somewhere that sponsoring cars for NASCAR races is one of the most effective ways to advertise a product. NASCAR fans are very loyal. This got me thinking.
Fans are definitely an endless source of revenue. What would they be searching for? A few of the words they may type in are “posters”, “zine”, and others. Or just type in a celebrity name plus “fan” and look for keywords in the pages that Google returns.
Then there are terms that are standard on many sites. When I was looking for places to submit my software, I just plugged “submit pad” into Google. If you are looking for Clickbank affiliates, you might type the product name and “hop.”
The point is that the customers you are looking for aren't a computer, and they aren't a dictionary. So you can't be either when you search for terms to optimize. You've got to think like someone in need of your product. I bet if you gave it a little time, you could come up with more examples.
Author Bio:
Stephan Miller is a freelance programmer and writer. For more AdWords resources, visit http://www.profit-ware.com/resources/googleadwords.htm
Google Paid Placement Review
With the recent ‘Florida’ update at Google and its impact on the SERPs, particularly for commercial search phrases, it has become imperative that we examine the paid advertising scenario at Google. For scores of merchants, paid advertising will now be the only way to salvage their sales and their future revenues. This is a two-part article, with the first part focusing on the Google paid advertising space and the second part on best practices. Before we start on Google's AdWords and AdSense programs, however, a word or two on the online advertising industry's performance in 2003.
Online Advertising Industry: According to the Interactive Advertising Bureau (IAB), online advertising sales have been robust this year, with three consecutive quarters of growth for the first time in the last two years. Let's first look at the yearly sales figures for the last three years. Online advertising peaked in 2000 at $8 billion in revenue (the dot-com boom) before declining to $7.1 billion in 2001 and $6 billion in 2002. In the first three quarters of this year, sales have been estimated at a healthy $5.07 billion (up 13% from the same period last year). The breakdown of the quarterly sales figures is as follows:
• First quarter sales were an estimated $1.63 billion.
• Second quarter sales are pegged at $1.69 billion (up 14% from the same quarter last year).
• Third quarter sales are estimated at $1.75 billion (up 20% from the same quarter last year).
From these figures, the emerging pattern is that keyword-based text ads are gaining popularity.
According to the IAB, out of the $6 billion figure for 2002, 15.4% (roughly $1 billion) can be attributed to search-term-related sales, up 200% from the 2001 figures. The contribution of search-term-related ads has been estimated at 31% of the second-quarter figures; no breakdown of the third-quarter figures is available, however. The financial firm Salomon Smith Barney expects the market, estimated at $1.4 billion in 2003, to grow 30 to 35 percent per year, reaching $5 billion by 2008.
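A quick sanity check on that projection: $1.4 billion compounding at 30% per year for five years works out to 1.4 × 1.30^5 ≈ 1.4 × 3.71 ≈ $5.2 billion, so the $5 billion estimate for 2008 sits right at the low end of the stated growth range.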
This advertising space is dominated by two major players (obviously): Google and Overture.
Google AdWords
Google has two services for its paid advertisers, AdWords and AdSense. Let's start with AdWords. This is a program in which Google lets advertisers bid for a chunk of the real estate on the search engine results pages for specific queries. The ads appear as “sponsored listings/premium listings” next to the organic search results. Along with getting exposure on Google's own site, these ads are also syndicated to Google's partner sites, such as America Online, Inc., Ask Jeeves, AT&T Worldnet, CompuServe, EarthLink, Inc., Netscape, and Sympatico Inc.
Currently, Google is reported to have approximately 150,000 monthly advertisers for its paid placement services.
AdWords Account Statistics
The following limits apply to every AdWords account: you can set up 25 campaigns per account; each campaign can have up to 100 ad groups; and each ad group can hold up to 750 keywords. Note, however, that the overall limit on any account is 2,000 keywords, so you can never use the full per-ad-group capacity across an entire account. AdWords offers four kinds of keyphrase matching: broad, phrase, exact, and negative.
Recently, Google introduced expanded matching for its broad matches; more on this later. AdWords lets you target your ads to 14 languages, 250 countries, or 200 states/regions in the US. Google claims 99% accuracy for the IP tracking system it uses to deliver ads to the target audience.
Recent Features in AdWords
Google added three new features to AdWords in October:
• Expanded Matching
• Conversion Tracking
• Increased Click-Through Threshold
Conversion Tracking
Google has introduced a conversion tracking feature for its advertisers, who can now track the conversions resulting from their advertising traffic. Conversion tracking helps advertisers quantify the ROI they are achieving with their campaigns. The feature works by placing a cookie on the user's computer whenever someone clicks on an advertisement. If the user later reaches the conversion page, the cookie is matched, and Google records a successful conversion.
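To make the mechanism concrete, here is a simplified sketch of cookie-based conversion tracking in JavaScript. This is not Google's actual code; the “clickid” parameter, cookie name, and tracking URL are hypothetical assumptions for illustration only:
// Landing page: if the visitor arrived via an ad click (hypothetical
// "clickid" query parameter), remember it in a cookie.
function recordAdClick() {
  var match = window.location.search.match(/[?&]clickid=([^&]+)/);
  if (match) {
    document.cookie = "ad_clickid=" + match[1] + "; path=/";
  }
}
// Conversion ("thank you") page: if the cookie is present, report a
// successful conversion by requesting a 1x1 tracking image.
function reportConversion() {
  var match = document.cookie.match(/(?:^|; )ad_clickid=([^;]+)/);
  if (match) {
    new Image().src = "http://tracker.example.com/convert?clickid=" + match[1];
  }
}
Call recordAdClick() on every landing page and reportConversion() on the conversion page; only visitors who clicked an ad and then converted will carry the matching cookie.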
Expanded Matching
As a part of its broad matching of keywords, Google has introduced a ‘smart’ feature called expanded matching. With expanded matching, the AdWords system automatically runs your ads on highly relevant keywords, including synonyms, related phrases, misspellings, and plurals, even if they aren't in the original list of keywords you submitted to Google. For example, if your keyword is ‘software development’, the Google system will try to guess alternate searches on which to display your ads, such as ‘software solutions’, ‘software services’, or ‘technical solutions’. Over time, it will monitor the click-through rates (CTR) for these searches and ‘learn’ their relevance for you, which helps it make the expanded searches more specific. Also, by mining its search queries, it can develop fresh combinations of search terms relevant to your business.
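As a toy illustration of the kind of expansion involved (Google's actual system is proprietary and far more sophisticated), here is a sketch that generates naive plural and transposition variants of a seed keyword:
// Generate naive variants of a keyword: a plural form plus simple
// adjacent-letter transpositions standing in for common misspellings.
function naiveVariants(keyword) {
  var variants = [keyword + "s"]; // naive plural
  for (var i = 0; i < keyword.length - 1; i++) {
    variants.push(
      keyword.slice(0, i) + keyword[i + 1] + keyword[i] + keyword.slice(i + 2)
    );
  }
  return variants;
}
console.log(naiveVariants("software")); // "softwares", "osftware", "sfotware", ...
Expanded matching would then keep only the variants that real users actually search for and that earn clicks, which is where the CTR feedback described above comes in.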
This theme-based AdWords tuning is in sync with Google's trend of trying to understand ‘available content’ and ‘user queries’ and make ‘intelligent matches’ between the two. To let you know which keywords it will use to broaden your exposure, Google has for the first time released an AdWords keyword suggestion tool: the Google AdWords tool.
This tool highlights Google's view of what other terms it ‘understands’ to be related to your business. (Tip: this tool works well for broad searches.)
Increased Click Through Threshold
The increased click-through threshold is designed to help ads that may have struggled for traffic due to poor search relevance, including those running as contextual ads. To evaluate account performance, Google now evaluates each account after every 1,000 ad impressions. If the CTR for the account falls below a minimum required CTR (which varies by ad position, geography, etc., but is 0.5% for the top spot and slightly lower for each subsequent position), the ads will only be displayed occasionally for the underperforming keywords.
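As a rough illustration of how such a threshold check might work (the 0.5% top-spot figure comes from the text above; the evaluation logic and per-position reduction are simplified assumptions, not Google's actual algorithm):
// Decide whether a keyword is underperforming after a batch of
// impressions. The 0.5% top-spot minimum is from the text above; the
// per-position reduction is an assumed placeholder.
function isUnderPerforming(clicks, impressions, position) {
  if (impressions < 1000) {
    return false; // not enough data to evaluate yet
  }
  var minimumCtr = 0.005 - (position - 1) * 0.0005;
  return clicks / impressions < minimumCtr;
}
// Example: 3 clicks in 1,000 impressions at position 1 is a 0.3% CTR,
// below the 0.5% minimum, so the keyword would be flagged.
console.log(isUnderPerforming(3, 1000, 1)); // true
A flagged keyword's ads would then be shown only occasionally until its CTR recovers.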
Google Adsense
Google AdSense is an offshoot of the Google content-targeted advertising program that launched in early March 2003. That early program let large websites integrate Google AdWords ads into their pages; each deal was independently negotiated with Google, and sites with fewer than 20 million page views couldn't participate. The Google AdSense program democratizes the content-based text ad display process: even small merchants and bloggers with only a few thousand page views per month can now apply to the program online.
After Google vets a site for popularity and quality (a process the company estimates takes two to three days), accepted applicants download a string of HTML code to insert on the Web pages where they wish to carry text-link ads. Google draws the listings from its base of 150,000 advertisers, using its algorithmic search technology to scan the content of the page and match it up with relevant links, which are displayed either as skyscraper ads on the right side of the page or as banners at the top. Some of Google's large partner networks are: HowStuffWorks; Mac Publishing (includes Macworld.com, MacCentral, JavaWorld, and LinuxWorld); New York Post Online Edition; Reed Business Information (includes Variety.com and Manufacturing.net); and U.S. News & World Report online. All the rules that apply to AdWords listings also apply to AdSense with respect to CTR, positioning, etc.
Author Name: Vikas Maholta
Company: Mosaic Services
Email: sunny@mosaic-service.com