Ask us a question!

Web Moves Blog

Web Moves News and Information

Archive for the 'Bing' Category

While several companies formed a group to try to block the Google-ITA deal, claiming it is yet another step toward a Google monopoly of the airline ticket market, Bing has decided not to complain, but to fight. Having purchased Farecast, a predictive engine for flight prices, about a year ago, Bing has now teamed up with KAYAK, one of the most popular travel search engines. The deal seems beneficial for both sides – after all, KAYAK is probably also worried about Google acquiring ITA, despite its talk of "welcoming Google as a competitor".

In its announcement, Bing flatters KAYAK, calling its new partner "a leading innovator in travel search" and promising a "more comprehensive travel search experience". The deal should benefit people who want to plan and book travel via Bing. Although this looks like an attempt to stop users from leaving Bing, the move is really a counter-step to Google entering travel search. As KAYAK CTO and co-founder Paul English put it, "Google is not winning every niche it enters" – but it can still affect the market heavily.

So, there is nothing left but to wish good luck to both Bing and KAYAK in their struggle.

According to the latest StatCounter data, Google has dropped below 90% of global search engine market share for the first time since July 2009. The reported figure of 89.94% is still a major headache for its competitors, Yahoo and Bing, which combine for just over 8% of global search. In the European market the domination is even greater – Google holds about 94% market share.

Although Bing surpassed Yahoo globally in January, in the US market Yahoo! is still the number two search engine, with a 9.74% share compared to Bing's 9.03%. Google has dropped below 80% once again, at 79.63%.

In Asia, Baidu has once again beaten Bing for the number three spot (Yahoo! is second). It must be noted, however, that StatCounter only considers English searches, so the results have to be viewed with care. For example, in Russia Google is reported as the market leader with 52%, with Yandex at 46%, and in the Czech Republic the picture looks even brighter for Google, which beats local Seznam 79% to 19%. Of course, when native-language searches are considered, both Yandex and Seznam are more popular than Google in their local markets.

Even so, in China Baidu is a clear number one, with almost 70% of the market (compared to Google's 29%), and in South Korea Naver is back to an absolute majority (55.15%), with both Google and Daum losing ground (31.7% and 7.85% respectively).

I woke up this morning thinking further about my statement yesterday that Microsoft should buy Twitter. I really think that if Google does not buy Twitter and it lands in the hands of Microsoft, it could become a great equalizer: Bing's real-time search results would be exclusive and therefore, at the very least, very different from Google's. Bing needs to do something – it is floundering, as many companies do when they are not really committed to being the best.

On the other hand, if Facebook buys Twitter, Google has a much bigger problem: potential elimination from real-time search. Facebook is the most visited website in the world. That is great, but its problem is that its visitors are not interested in buying anything; they do not click on ads, they do not convert into dollars, and this is becoming a problem for Facebook's future. It is the old-school internet business model on steroids: build it, make it cool and free, get traffic, and with traffic all your problems will be solved.

Now, if you roll Twitter into Facebook, you do not get any better profit generation, but you hold all the cards in real-time search. Facebook could place extraordinary value on this real-time data and begin to charge search engines massive fees to access its websites and data. If the search engines refuse to pay these outrageous fees, Facebook can begin to build its own search engine. Even if its algorithm were not very robust to begin with, having the real-time data from Facebook and Twitter would ensure phenomenal real-time information (that would not be found anywhere else), used VERY effectively. It is a fact that no one is really tweeting or posting on Facebook about the spammy Viagra website they found on page one of Google, nor the insurance website they found on Bing. Facebook could therefore quickly put a serious dent in spam, carve out a place in search, and provide itself with a very bright future for profitability – plus a serious chunk of what Google and Bing currently have.

As an internet marketing professional, I really do not care who does what. I do not own the game; I just play by the rules set forth by people far smarter and wealthier than I am. I must say, though, I would really like to see Twitter in the hands of Facebook or Microsoft. Let's see what Google is really made of…

Bringing the most relevant results to the user is the quest of every search engine. Fighting spam is one aspect of this; the other is personalization – showing the results that would be most interesting to the SPECIFIC searcher. Hence localization, hence search history…

Bing has recently followed Google down that path, applying city-based localization to query results in the US. It will now give additional weight to local businesses, especially service providers. This is another step forward: Bing's local sites in various countries have been showing different results for quite a while already, but for big countries such as the United States this might not be enough – so an additional refinement is now applied, based on the city you are in. It must be noted that the results are not entirely different – local businesses are simply given some "extra points" by the ranking algorithm.

Another aspect is using your past search queries in the results. Bing (as Google has done for some time already) tries to "learn your preferences" based on the searches you conduct and the results you pick from the presented list. Those picks are stored in your search history and shown more frequently (or higher) in the result list when a similar query is submitted.
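To make the idea concrete, here is a toy sketch (not Bing's actual algorithm – the function, field names and boost values are all illustrative) of how the two signals above could nudge a ranking: results from domains you have clicked before, or from your own city, get a small score bonus before the final sort.

```python
# Toy personalization re-ranker: base relevance score plus small
# bonuses for search-history and locality signals.

def personalize(results, clicked_domains, user_city):
    """results: list of dicts with 'domain', 'city', 'score' keys."""
    ranked = []
    for r in results:
        score = r["score"]
        if r["domain"] in clicked_domains:   # search-history boost
            score += 0.2
        if r.get("city") == user_city:       # city-based locality boost
            score += 0.1
        ranked.append({**r, "score": score})
    # Highest adjusted score first
    return sorted(ranked, key=lambda r: r["score"], reverse=True)

results = [
    {"domain": "bigchain.com", "city": None, "score": 1.0},
    {"domain": "localshop.com", "city": "Boston", "score": 0.95},
]
top = personalize(results, clicked_domains={"localshop.com"}, user_city="Boston")[0]
print(top["domain"])  # the local, previously clicked shop now outranks the chain
```

The point of the sketch is the last line: the slightly lower-scoring local result wins once the personalization bonuses are added, which is exactly the "extra points" behavior described above.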

It seems that the "those who bought this also liked that" feature, used by many online stores and other websites, is now entering the search engine world.

I was in the shower this morning, considering the impact that Twitter and Facebook have had on Google's search results. After reading Rand's test results on Twitter links versus traditional text links in ranking pages within Google's search results, it is clear that Google places significant weight on links from Twitter and Facebook. Based on this, one would assume that if Facebook and Twitter no longer permitted Googlebot to access their websites, Google's algorithm would have to be seriously adjusted. It would probably end up pushing Google's results back to displaying only yesterday's news and information, instead of the real-time results currently based on the linking patterns Googlebot gets from Twitter and Facebook.

I was just reading a blog post on Search Engine Land about Twitter being acquired by someone – whether Google, Microsoft or Facebook. I find the concept interesting: whoever buys Twitter will have the most up-to-date real-time content online. I believe the acquisition of Twitter must be made by Microsoft. This would give Microsoft its first leg up on Google in search. Microsoft could probably license access to Twitter to Google for hundreds of millions of dollars.

The facts are quite simple: without Twitter and Facebook links, Google is sort of screwed. Unfortunately, Google's recent behavior has created a bit of industry anger toward its online business practices. I think this is why Groupon did not sell to Google.

I wanted to mention that this blog post was written with the assistance of Dragon NaturallySpeaking. If you have hesitated to use speech-recognition software, I would say now is the time to give it a try. I think this software will make it much easier for me to blog from this point forward.

While I'm giving NaturallySpeaking a plug, I may as well mention the really cool viral marketing tool they've built on their website. It's called Fingers of Fire.

I am really getting tired of Google presenting information and blog posts from 2007. The authority Google gives these old blog posts and news items makes its results for particular topics just STINK.

So I jump to another search engine – Bing, or, for today, Blekko. Both engines tend to do a better job of weeding old content out of their results, which is great. But… and this is a big BUT…

What is with Bing and Blekko showing websites from every English-speaking country? A search on Blekko for "promotional mugs" presents results from all over the world, and, although not quite as bad, the same thing happens with Bing.

Which search engineers decided it was a good idea to present these international results for a US search query? This seems to me the most basic part of a relevancy algorithm.

I can offer some free tips to the engineers at Blekko and Bing:

1.) if the domain ends in a UK country-code TLD (such as .co.uk), those results should be shown to people searching in the United Kingdom.

2.) if the domain ends in an Australian country-code TLD (such as .com.au), those results should be shown to people searching in Australia.

3.) if my IP address is based in the United States, please only show me websites whose IP address is in the US. (Apply the same theory to whatever country the search query originates from.)
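The three tips above boil down to a very simple filter. Here is a rough sketch (hypothetical – not any engine's real code, and the TLD table is deliberately tiny): keep a result only if its country-code TLD, or failing that the geolocated country of its server, matches the searcher's country.

```python
# Map country-code TLD suffixes to ISO country codes.
CCTLD_COUNTRY = {".co.uk": "GB", ".uk": "GB", ".com.au": "AU", ".au": "AU"}

def result_country(domain, server_country=None):
    """Infer a result's country from its TLD, else from its server."""
    for tld, country in CCTLD_COUNTRY.items():
        if domain.endswith(tld):
            return country
    # Generic TLD (.com, .org, ...): fall back to the server's
    # geolocated country, e.g. from a GeoIP lookup of its IP address.
    return server_country

def filter_by_country(results, searcher_country):
    """Drop results whose inferred country differs from the searcher's."""
    return [r for r in results
            if result_country(r["domain"], r.get("server_country")) == searcher_country]

results = [
    {"domain": "mugs.co.uk"},
    {"domain": "mugs.com.au"},
    {"domain": "promomugs.com", "server_country": "US"},
]
print(filter_by_country(results, "US"))  # only the US-hosted .com survives
```

A real engine would treat country as a ranking signal rather than a hard filter (some cross-border results are genuinely relevant), but even this crude version would have kept the Australian mug shops out of a US query.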

It is really sad that the entire internet community is looking for alternatives to Google, and this is the best competition we can come up with?

No wonder Google is taking over… Ask and Bing are very anxious to prove to the world that they can beat Google – even in smaller things, like Image Search, which Bing has been enhancing constantly over the last several months, or in a Search Engine Jeopardy contest run by Stephen Wolfram. Well, it seems Google's competitors still have some work to do, as the search industry leader was victorious once again.

The SE Jeopardy test consisted of Jeopardy questions, randomly selected from a database of around 200,000, that were fed as queries into various search engines. The developers then looked at the number of correct answers that appeared somewhere on the search results page, and at the number of correct answers that were included in the page each engine presented as its top result.
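The methodology is easy to picture in code. Here is a sketch (the function names and data shapes are my own, not the actual test harness): for each clue, check whether the correct answer appears anywhere on the first results page, and whether it appears in the top result specifically.

```python
def score_engine(clues, search):
    """clues: list of (question, answer) pairs.
    search(question) -> list of result-page texts, first element = top result.
    Returns (pct answer anywhere on page, pct answer in top result)."""
    anywhere = top = 0
    for question, answer in clues:
        pages = search(question)
        # Did the answer show up anywhere on the first page of results?
        if any(answer.lower() in p.lower() for p in pages):
            anywhere += 1
        # Did it show up in the single top-ranked result?
        if pages and answer.lower() in pages[0].lower():
            top += 1
    n = len(clues)
    return 100 * anywhere / n, 100 * top / n

# Toy stand-in for a real engine's first page of results:
fake_search = lambda q: ["Mount Everest is the highest peak.", "List of mountains."]
clues = [("Highest mountain on Earth?", "Mount Everest")]
print(score_engine(clues, fake_search))
```

The two percentages this returns correspond to the two rows of results reported below; the gap between them is what shows whether an engine ranks its best answer first.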

The results were as follows:

Percentage of correct answers appearing somewhere on the first page: Google – 69%; Ask – 68%; Bing – 63%; Yandex – 62%; Blekko – 58%; Wikipedia – 23%.

Percentage of correct answers appearing in the top result of the page: Google – 66%; Bing – 65%; Yandex – 58%; Ask – 51%; Blekko – 40%; Wikipedia – 29%.

Obviously Wikipedia didn't stand much of a chance, as it was a single website competing against "the whole internet". Still, it must be noted that only about one-third of Jeopardy answers are already in Wikipedia…

As for the search engines – Google has beaten the competition, although the margins are not that big. Based on these results, Ask and Blekko need to do a better job of listing the most relevant link at the top (see how their percentages dropped between the first page and the top result). And Bing is "almost there" – just a fraction behind Google.

Yandex's numbers were very impressive, as it is basically a local Russian search engine. If the test had been done in Russian (or at least based on the Russian Jeopardy analogue, "Svoya Igra", which includes fewer questions about American culture and history), Yandex would probably have beaten Google – exactly as it does in the Russian search engine market.

In summary, nobody can beat Google at providing relevant information. Not just yet. So when you want to know "what is" something – don't ask and don't bing. Google it!

It seems that Bing has decided to combat Google first of all in the Images section. Merely a month after introducing its Image Categorization Panel, Bing is now applying a new look to its Image Search page.

It now shows the day's top 20 most-searched images – people, events, places, animals, etc. Yesterday, Squirrel Appreciation Day brought "Cute Squirrel" to the top of the list. Today, "National Geographic Photo Contest" occupies the number one Bing Image Search spot, with "Todd Palin" (6) beating "Sasha Obama" (10) and "Chelsea Clinton Wedding".

Of course, once a search is performed the top list disappears – but it can easily be retrieved via the "Browse top image searches" link just below the categorized search tabs bar.

The option is currently available on the US, Australian and Canadian Bing websites.

Everybody knows that Yahoo US has teamed up with Bing in order to fight Google in the North American search market. In other parts of the world, however, strange things are happening.

Since the start of 2011, Yahoo and Bing have also been a joint force in Australia, Mexico and Brazil. In the UK, however, the deal has not been sealed yet. And although people are saying it is only a matter of time – noticing that certain Yahoo search results look identical to Bing's, and speculating about "two different indexes" – it remains to be seen whether Yahoo UK will be powered by Bing in the end. Why not, anyway? Where else would Yahoo go? To Google? Well, yes!

Yahoo! Japan, for example, has partnered with Google. The deal (Google US will supply the technology for Yahoo! Japan) was recently approved by the FTC (Fair Trade Commission), the body responsible for preventing market monopolization. And although the approval is not permanent – the FTC stated it will monitor the combined team's activity closely – it was a major blow to both Microsoft and local search engines. Yahoo US was not very happy either, but was unable to stop the move, as it only owns about 30% of Yahoo! Japan.

Bing Webmaster Tools

If we are going to understand the latest nuances of online searching, we need to know which fundamentals we're dealing with. If we are going to fine-tune and adjust our Web-based lives, we need to know something about the tools we're planning to use.

We'll start with "the release of the updated Bing Webmaster Tools," announced in July on the Bing Webmaster Center blog.

According to the information provided at the time, “After the Bing launch, we reached out to the webmaster and SEO communities to see how we could improve the Webmaster Tools. Your feedback was very consistent: you wanted more transparency to see how Bing crawls and indexes your sites, more control over your content in the Bing Index, and more information to help you optimize your sites for Bing.” That’s the word from Anthony M. Garcia, Senior Product Manager for Bing Webmaster Tools.

New and Improved

The goal, according to the Webmaster Center, was to change everything – or, in the staff's words, to "hit the reset button and rebuild the tools from the ground up." The latest tool was announced by Garcia in the same forum on Dec. 14. He wrote that a new Inbound Links feature was released to give "registered site owners the ability to easily & intuitively retrieve data about links to their sites."

Bing apparently got a lot of feedback from SEO professionals and site owners about "the importance of this data to better understand how their sites are ranked in Bing."

Users can now learn details about the number of inbound links over time, with details on URL and anchor text. In addition, users can export this link data so it can be studied offline.

Garcia wrote, “It is important to note that the count of inbound links will be based on content stored in the Bing index vs. a complete, comprehensive count of links between every page on the Internet.” This could be a crucial distinction for many site owners and other industry professionals.

Building Blocks

New top-floor tools such as the just-announced Inbound Links feature have to be supported by a strong, well-constructed foundation and lower floors. That was the goal when Bing "hit the reset button" a few months ago. Developers were aiming for a simpler, more intuitive experience.

The focus, according to Bing, was on "three key areas: crawl, index and traffic." The new work included Index Explorer and Submit URLs, which provide more information about Bing's crawling and indexing but also give the user improved control. Garcia wrote:

Index Explorer gives you unprecedented access to browse through the Bing index in order to verify which of your directories and pages have been included. Submit URLs gives you the ability to signal which URLs Bing should add to the index. Other new features include: Crawl Issues to view details on redirects, malware, and exclusions encountered while crawling sites; and Block URLs to prevent specific URLs from appearing in Bing search engine results pages.

In addition, the new tools take advantage of Microsoft Silverlight 4 to deliver rich charting functionality that will help you quickly analyze up to six months of crawling, indexing, and traffic data. That means more transparency and more control to help you make decisions, which optimize your sites for Bing.

At this point we should have a better-than-average idea of what the developers and managers intended with the new Bing Webmaster Tools. But if we don’t understand the overall changes made we might find that we’re a bit confused about the latest innovation. Let’s take a look back at some foundation material.

Back Then

In previous incarnations, Bing was Live Search, Windows Live Search and MSN Search. The new "decision engine" falls under the broader category of Web search engine and was introduced to the general public in San Diego in May 2009, going online a few days later.

That first product included such changes as search suggestions listed as queries are entered. According to the company, Bing also included a list of related searches based on Powerset semantic technology (Microsoft bought Powerset in 2008). Bing was also introduced as the muscle behind Yahoo! Search.

To get to the roots of Bing's predecessors you have to go back an entire lifetime (in Internet terms), to 1998 and 1999. At that time the company offered MSN Search, gradually adding such improvements as self-built search results that could be updated regularly. Serious upgrades were rolled out in 2004 and 2005, with results made available to other search portals.

In 2006, the world was introduced to Windows Live Search, the replacement for MSN Search. This iteration used tabs such as Web, images, news and desktop. It was at this point that Microsoft stopped using Picsearch to provide images within the service; in-house developers created their own image-search mechanisms.

A year later the company effectively changed the playing field by moving its search products away from Windows Live services. The new product became Live Search, under the company's Platform and Systems division. It is also important to understand that Live Search was integrated with Microsoft adCenter at this time.

There’s More

Some of the key elements in the transformation of Microsoft search tools (from Windows Live Search and MSN Search to Bing) include:

  • 2008 – Separate search categories/tools for books and academic subjects were discontinued. These were integrated into the overall search program
  • 2008 – Live Search Macros discontinued
  • 2009 – Live Product Upload discontinued
  • 2009 – MSN QnA lived briefly between February and May

It’s also essential for industry watchers to understand the significance of the change from the “Live” moniker. This decision was made by a company that was concerned about brand image and brand awareness. Bing was born.

So we've come from the last century and MSN/Live Search to Bing in about 10 or 11 years. Where does that leave us? Well, in 2009 Microsoft and Yahoo! signed an agreement to replace Yahoo!'s search tools with Bing. The contract is intended to last 10 years.

Not only has Bing become the power behind another brand, it has resurrected Microsoft's share of the search market. Industry figures show the company's share was declining until Bing gave it the adrenaline rush it needed. In October 2010 Bing was in the top five search engines by volume.

Painful Birth?

Was the transition from Live to Bing painful for users? It would be necessary to survey a majority of those folks to get an accurate picture of what the change meant. Microsoft and the people behind Bing introduced the change as good news.

In July, Garcia wrote, “We have good news for all the veteran users of the Bing Webmaster Tools. Your existing Webmaster Center accounts have been automatically upgraded to the new tools. This means that starting today, you’re already a registered user of the new Bing Webmaster Tools. There’s no need to create a new account, change ownership verification codes, or re-enter site data. If you don’t have a current account, you can easily sign-up and register your sites to begin using the new tools.”

A few veterans have rated Bing Webmaster Tools as good, if not the best. There were some early problems, though, with Bing reporting sites targeted at the United States as being United Kingdom sites. The problem seemed to lie with specific tags containing the letters "gb". This was written about on SEORankings by its Seattle-based founder, Wesley LeFebvre.

At the time, LeFebvre asked whether some of these sites were suffering in organic rankings because of the mistaken information. He also questioned the validity of Bing, Google and Yahoo! local rankings in light of it, and set out to find whether other organic search engines use this tag.
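For readers wondering what kind of tag could carry a "gb" signal, here is a hypothetical example (the source does not say exactly which tag was at fault): a language/region declaration like this one embeds the "gb" region code and could plausibly be read by a crawler as a United Kingdom signal, even on a site targeting US visitors.

```html
<!-- Hypothetical illustration: a region-coded language pragma.
     The "gb" in "en-gb" marks British English and could be taken
     as a UK-targeting hint by a geo-classifier. -->
<meta http-equiv="content-language" content="en-gb">
```

If a US site shipped a template with a tag like this left over, a ranking penalty in US local results would be exactly the kind of side effect LeFebvre was asking about.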

In October, Steve Tullis wrote on the Webmaster Center blog “In keeping with our themes of rapid change and responding to input and feedback from the webmaster community, we are working on future feature planning for the Bing webmaster tools and we would like to hear your thoughts in a few areas.”

Tullis discussed Crawl-Delay and asked several specific questions about support and control in this area. He asked users and webmasters if they preferred to have crawl-delay supported in robots.txt, in addition to several other pointed questions.
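For context, crawl-delay is conventionally expressed as a per-crawler directive in robots.txt, which is what Tullis was asking about. A minimal sketch (the paths and delay value are made up for illustration; support and interpretation of Crawl-delay vary by engine):

```
# robots.txt sketch: ask Bing's crawler to pace its requests,
# while leaving other crawlers on default behavior.
User-agent: bingbot
Crawl-delay: 10

User-agent: *
Disallow: /private/
```

Putting the knob in robots.txt keeps it under the site owner's direct control, which is presumably why Bing was polling webmasters on whether to keep supporting it there versus only in the Webmaster Tools UI.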

The door to assistance with Bing Webmaster Tools is the Bing Webmaster Center Help page. While that may seem obvious, it doesn't hurt to make contact with the source somewhere in your search for information about new versions, upgrades and additions. Judging from initial reports from both the company and a few users, the new tools are workable and the Inbound Links addition is absolutely essential.

It’s a bit early to try any detailed comparisons on Bing Webmaster Tools and its Inbound Links structure. But rest assured there are plenty of people out there who will be providing useful reports.