Our world, especially its online part, is all about speed. That’s why Google became so popular so quickly: the sheer number of search results shown, and the speed at which those results were gathered and presented to the user, did the trick.
Nowadays, of course, many search engines are almost as quick as Google, the difference in the speed of producing results measured in milliseconds. What users now want from their search engine is convenience and reliability. While questions have been raised about Google’s supposedly “biased” results, the search giant’s UI had never been an issue. That is, until now. The revised Google Image Search has a problem that can slow down entering a query.
Normally, when you type a query term into the search box and get the results, you do not use your mouse. You continue typing to narrow the search, or press Shift-Backspace to erase the previous entry and start over. So far so good. In Google Image Search, however, the query box seems to lose focus once the pointer is moved, deliberately or accidentally. So when you decide to alter your search, you have to use the mouse (one of the speedy typist’s worst nightmares) to activate the query box again.
Google has reported that it is aware of the problem and the resulting “frustration”, and is “looking for ways” to solve it.
The UK shopping market may be smaller than the US one, but Google wants to make an impact on it as well, introducing Google Nearby Shops UK. Now, when you query for a product via UK Google Product Search, the shops selling it appear map-style below your results, helping you locate the item closest to you.
Of course, if you are a merchant and want to take advantage of this feature and be listed in those results, you need to have your shop’s URL linked to both Google Places and Google Merchant Center.
For both buyers and sellers, this combination of Google Maps and Google Product Search can be very useful this holiday season.
You might have noticed that since mid-December, Google has been labeling certain websites with a “this site may be compromised” notice that appears in the search results under the website’s link. According to Google’s Matt Cutts, this is actually done to help webmasters by alerting them that their website has probably been hacked. The old procedure of banning sites from the Google search index and notifying the owners via Google Webmaster Tools proved “too slow”, as not many site owners check their Webmaster Tools notifications on a regular basis.
As a hacked website does not usually present an immediate threat to the visitor (if malware is detected, Google Search shows the more aggressive “This site may be harmful to your computer” message), the “this site may be compromised” notice is aimed mainly at owners who constantly monitor their website’s appearance in Google search, urging them to pay immediate attention to the problem.
If we are going to understand the latest nuance of online searching we need to know just which fundamentals we’re dealing with. If we are going to understand how to fine-tune and adjust our Web-based lives we need to know something about the tools we’re planning to use.
We’ll start with the announcement of “the release of the updated Bing Webmaster Tools,” made in July on www.bing.com.
According to the information provided at the time, “After the Bing launch, we reached out to the webmaster and SEO communities to see how we could improve the Webmaster Tools. Your feedback was very consistent: you wanted more transparency to see how Bing crawls and indexes your sites, more control over your content in the Bing Index, and more information to help you optimize your sites for Bing.” That’s the word from Anthony M. Garcia, Senior Product Manager for Bing Webmaster Tools.
New and Improved
The goal, according to the Webmaster Center, was to change everything, or, in the staff’s words, to “hit the reset button and rebuild the tools from the ground up.” The latest tool was announced by Garcia in the same forum on Dec. 14. He wrote that a new Inbound Links feature was released to provide “registered site owners the ability to easily & intuitively retrieve data about links to their sites.”
Bing apparently got a lot of feedback from SEO pros and website owners about “the importance of this data to better understand how their sites are ranked in Bing.”
Users can now learn details about the number of inbound links over time, with details on URL and anchor text. In addition, users can export this link data so it can be studied offline.
Garcia wrote, “It is important to note that the count of inbound links will be based on content stored in the Bing index vs. a complete, comprehensive count of links between every page on the Internet.” This could be a crucial distinction for many site owners and other industry professionals.
Building Blocks
New top-floor tools such as the just-announced Inbound Links item have to be supported by a strong and well-constructed foundation and lower floors. That was the goal when Bing “hit the reset button” a few months ago. Developers were aiming for a simpler experience that was also more intuitive.
The focus, according to Bing, was on “three key areas: crawl, index and traffic.” The new work included Index Explorer and Submit URLs, which basically provide more information about Bing’s crawls and indexes but also give the user improved control. Garcia wrote:
Index Explorer gives you unprecedented access to browse through the Bing index in order to verify which of your directories and pages have been included. Submit URLs gives you the ability to signal which URLs Bing should add to the index. Other new features include: Crawl Issues to view details on redirects, malware, and exclusions encountered while crawling sites; and Block URLs to prevent specific URLs from appearing in Bing search engine results pages.
In addition, the new tools take advantage of Microsoft Silverlight 4 to deliver rich charting functionality that will help you quickly analyze up to six months of crawling, indexing, and traffic data. That means more transparency and more control to help you make decisions, which optimize your sites for Bing.
At this point we should have a better-than-average idea of what the developers and managers intended with the new Bing Webmaster Tools. But if we don’t understand the overall changes made we might find that we’re a bit confused about the latest innovation. Let’s take a look back at some foundation material.
Back Then
In previous incarnations, Bing was Live Search, Windows Live Search and MSN Search. This new “decision engine” came under the broader category of Web search engine and was introduced to the general public in San Diego in May 2009. It was online a few days later.
That first product included such changes as search suggestions listed as queries are entered. According to information from the company, Bing also included a list of related searches based on Powerset semantic technology. (Microsoft bought Powerset in 2008.) Bing was also introduced as the muscle behind Yahoo! Search.
To get to the roots of Bing’s predecessors you have to go back an entire lifetime (in Internet terms) to 1998 and 1999. At that time, the company offered MSN Search and gradually added such improvements as self-constructed search-engine results that could be updated on a regular basis. Serious upgrades were rolled out in 2004 and 2005, with results made available to other search portals.
In 2006, the world was introduced to Windows Live Search, the replacement for MSN Search. This new iteration used such tabs as Web, images, news, desktop etc. It was at this point that Microsoft no longer used Picsearch to provide images within the service. In-house developers created their own image-search mechanisms.
A year later the company effectively changed the playing field by moving the search products away from Windows Live services. The new product became Live Search, under the Platform and Systems division of the company. It’s important to also understand that Live Search was integrated with Microsoft adCenter at this time.
There’s More
The transformation of Microsoft search tools (from MSN Search and Windows Live Search to Bing) involved several key elements, from technology to branding.
It’s also essential for industry watchers to understand the significance of the change from the “Live” moniker. This decision was made by a company that was concerned about brand image and brand awareness. Bing was born.
So, we’ve come from the last century and MSN/Live Search to Bing, in about 10 or 11 years. Where does that leave us? Well, in 2009 Microsoft and Yahoo! signed an agreement to replace Yahoo! search tools with Bing. The contract is intended to last 10 years.
Not only has Bing become the power behind another brand, it has also resurrected Microsoft’s share of the search market. Industry figures show that the company’s market share was declining until Bing gave it the adrenaline rush it needed. In October 2010, Bing was among the top five search engines by volume.
Painful Birth?
Was the transition from Live to Bing painful for users? It would be necessary to survey a majority of those folks to get an accurate picture of what the change meant. Microsoft and the people behind Bing introduced the change as good news.
In July, Garcia wrote, “We have good news for all the veteran users of the Bing Webmaster Tools. Your existing Webmaster Center accounts have been automatically upgraded to the new tools. This means that starting today, you’re already a registered user of the new Bing Webmaster Tools. There’s no need to create a new account, change ownership verification codes, or re-enter site data. If you don’t have a current account, you can easily sign-up and register your sites to begin using the new tools.”
A few veterans have classified Bing Webmaster Tools as good, if not the best. There were some early problems with Bing reporting websites targeted at the United States as being United Kingdom sites. The problem seemed to lie with specific tags containing the letters “gb”. Seattle-based founder Wesley LeFebvre wrote about this on SEORankings.
At the time, LeFebvre asked if some of these sites were suffering in the organic-ranking category because of the mistaken information. He also questioned the validity of Bing, Google and Yahoo! local rankings in light of this information. LeFebvre set a goal of finding out if other organic search engines use this tag.
In October, Steve Tullis wrote on the Webmaster Center blog: “In keeping with our themes of rapid change and responding to input and feedback from the webmaster community, we are working on future feature planning for the Bing webmaster tools and we would like to hear your thoughts in a few areas.”
Tullis discussed Crawl-Delay and asked several specific questions about support and control in this area. He asked users and webmasters if they preferred to have crawl-delay supported in robots.txt, in addition to several other pointed questions.
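For reference, the crawl-delay question concerns a nonstandard directive that sites already place in robots.txt on a per-user-agent basis. A typical entry asking Bing’s crawler of the time (msnbot) to pause between requests might look like this (the five-second value is just an illustration):

```
User-agent: msnbot
Crawl-delay: 5
```

The debate Tullis raised was essentially whether Bing should keep honoring this robots.txt convention or move such throttling controls into the Webmaster Tools interface itself.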
The door to getting assistance with Bing Webmaster Tools is on the Bing Webmaster Center Help page. While that may seem obvious, it doesn’t hurt to make contact with the source somewhere in your search for information about new versions, upgrades and additions. Judging from initial reports from both the company and a few users, the new tools are workable and the Inbound Links addition is absolutely essential.
It’s a bit early to try any detailed comparisons on Bing Webmaster Tools and its Inbound Links structure. But rest assured there are plenty of people out there who will be providing useful reports.
With Google introducing its new tool, the Google Books Ngram Viewer, several days ago, many were enthusiastic about it being the ultimate feature for etymological research. After all, the Ngram Viewer allows users to search millions of books (Google Books, of course) and then check, track, and analyze the appearances of any word across many centuries.
Users were enthusiastic at first, but it turned out that the tool is far from perfect. According to recent reviews, there are many problems and inaccuracies in Ngram Viewer reports, both expected and unexpected. A very basic issue is OCR (Optical Character Recognition). Even for modern books and fonts, occasional mistakes occur; the best OCR programs report an error margin of just below 1% for recognized text. For books from the 16th and 17th centuries, with their ornate fonts, this margin is sure to be higher. One example is the archaic long “s” being confused with “f” on numerous occasions.
Another problem concerns first occurrences: Google Books Ngram Viewer does not take the development of language over time into account, so you have to search several historical forms of a word to find its actual first usage. And then there are reprints. Many Google books are labeled with the year of their reprint instead of the year of the original manuscript, making the search produce more hits for “recent” years.
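The variant-forms workaround can be sketched in a few lines: merge per-year counts across every historical spelling of a word before looking for the earliest hit. The spellings and counts below are invented sample data for illustration, not real Ngram Viewer output.

```python
from collections import defaultdict

def merge_variants(counts_per_form):
    """Sum per-year counts across all spelling variants of one word."""
    merged = defaultdict(int)
    for form, yearly in counts_per_form.items():
        for year, count in yearly.items():
            merged[year] += count
    return dict(merged)

# Invented sample data: a modern spelling plus two archaic forms.
sample = {
    "music":   {1750: 120, 1800: 400},
    "musick":  {1650: 30, 1750: 80},
    "musique": {1600: 5, 1650: 12},
}

merged = merge_variants(sample)
first_attested = min(merged)  # earliest year any variant appears
```

Searching only “music” here would place the first appearance at 1750; merging the variants pushes it back to 1600, which is exactly the kind of correction the Ngram Viewer does not perform for you.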
Overall, Google Books Ngram Viewer is not bad. It is just not as reliable as one might think. Suitable for occasional queries, it cannot be considered a reliable tool for serious academic research.
After Google announced its intention to acquire the renowned flight data provider ITA (the offer reportedly stands at 700 million USD), several serious questions arose. Wouldn’t it be too big a step towards monopoly? What are the benefits for customers? And why on earth is Google buying ITA at all?
Well, the answer to the last question is pretty obvious. In recent years, Google seems to be entering every available niche in the market. Long gone are the times when Google was just a search engine. Google Maps, Google News, Google SketchUp: more and more services are provided by the enterprise, and some people are already asking whether Google is a search engine or your ultimate competitor.
However, the Google-ITA deal has now encountered serious opposition of its own. A group of businesses, organized as FairSearch.org, has banded together to prevent the deal. With ITA serving about two thirds of airline ticketing and related travel websites, the ultimate “danger”, according to the FairSearch claim, is that Google will eventually start selling tickets directly while controlling the flow of data to potential competitors.
Google, of course, claims that its intention is purely to improve the service, making the flight data Google offers more reliable while continuing to redirect searchers to other websites that sell flight tickets.
While the consequences of this case remain to be seen, one thing is clear: more and more businesses (including giants like Microsoft and Expedia) are concerned about Google taking over.
Bing has introduced a new feature in Bing Image Search. Called “Instant Answer”, it is a small bar of “tabs” that appears just below the search term box. Each tab suggests a more specific query, which is especially helpful with ambiguous search terms.
For example, when you type “heat” as the Bing Image Search term, “Instant Answer” will suggest narrowing your search to “Miami Heat”, “Heat Energy”, “Heat Wave” and so on. The term “star” will feature shortcut tabs for filtering images of “Patrick Star”, “Star Wars”, “Star of David” and more.
This feature resembles the familiar “related searches” suggestions in web search and saves time: instead of retyping the whole query, a single mouse click narrows it. “Instant Answer” is, however, currently only available on the US Bing website.
Google has recently announced that Google Instant Mobile is now available “globally”. This means the tool has been released in all countries that have Google Mobile access (there are several dozen of those) and supports almost thirty languages.
The product, like desktop Google Instant, is integrated into the search features of any Android browser (built in for Android OS 2.2 and up) and uses algorithms that deliver dynamic search results faster.
Although this release was expected (shortly after releasing Google Instant Mobile in English, the company announced that international support was on its way), nobody anticipated it would happen so quickly. The roll-out took Google slightly over a month, an impressive figure considering the complexity of the product. Of course, this simply means Google had been working on the globalization of Google Instant Mobile in parallel with the product itself. No wonder: Google has always emphasized the importance of international marketing and global support.
Telemarketing was never part of Google’s business strategy. The idea was to spread information, become recognizable, and make clients come to Google asking for services. That’s where the sales managers stepped in, offering a variety of products and bargain deals and impressing the customer.
However, according to the latest news, this has recently changed. Several hundred telemarketers are now employed by the company, their task being to sell the Google Boost and Google Tags services to local businesses in several US markets.
It seems the Rubicon has been crossed, and Google now “recognizes” that some niches and companies are best addressed directly rather than through advertising. The next question, which will probably be answered within a few months, is whether Google will expand its own telemarketing group (reportedly about 300 employees at present) or try to purchase an established sales force.
After all, with Groupon turning down “the 6 billion offer”, Google has some money to spare…
Everybody knows that since the introduction of the new Caffeine index, Google has been able to update its site indexing within hours, even minutes. Long gone are the days when we were shown an outdated snapshot of website content that was no longer present on the page. Images were a different case, though: Google Image Search indexing was still lagging behind, and it could take a month or so for a recently added image to appear in Google Image Search results.
It appears Google, in its effort to produce real-time results, is now addressing this “problem”. Several users who follow Google image indexing closely have reported a major improvement in this area, stating that new images are being indexed more promptly, appearing in Google Image Search results with a delay of only several days. This is yet more proof that non-textual content is becoming increasingly popular among users, making it essential for proper SEO.