John has 13+ years of experience in web development, ecommerce, and internet marketing. He has been actively involved in the internet marketing efforts of more than 100 websites in some of the most competitive industries online. John comes up with truly off-the-wall ideas and has pioneered some completely unique marketing methods and campaigns. He is active in every single aspect of the work we do: link sourcing, website analytics, conversion optimization, PPC management, CMS, CRM, database management, hosting solutions, site optimization, social media, local search, and content marketing. He is our conductor and idea man, and has a reputation for being a brutally honest straight shooter. He has been in the trenches directly and understands what motivates a site owner. His driven personality works to the client's benefit, as his passion fuels his desire for your success. His aggressive approach is motivating, his intuition for internet marketing is finely tuned, and his knack for link building is unparalleled. He has been published in books and numerous international trade magazines, been featured in the Wall Street Journal, sat on the boards of trade associations, and served as a spokesperson for Fortune 100 corporations including MSN, Microsoft, eBay, and Amazon at several internet marketing industry events. John is addicted to Peet's coffee, loves travel and golf, and is a workaholic except on Sundays during Steelers games.
The UK shopping market may be smaller than the US one, but Google wants to make an impact on it as well, introducing Google Nearby Shops UK. Now, when you search for a certain product via UK Google Product Search, the shops selling it appear map-style below your results, helping you locate the item closest to your location.
Of course, if you are a merchant and want to take advantage of this feature and be listed in those results, you need to have your shop's URL linked to both Google Places and Google Merchant Center.
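For merchants new to Google Merchant Center, the listing data comes from a product feed. The snippet below is only a minimal sketch of what such a feed can look like, using the Google Base RSS namespace; the shop name, URLs, and item are made up, and the exact required attributes should be checked against Google's own Merchant Center documentation.

<?xml version="1.0"?>
<rss version="2.0" xmlns:g="http://base.google.com/ns/1.0">
  <channel>
    <!-- Illustrative feed only: shop name, URLs, and item are examples -->
    <title>Example Shop</title>
    <link>http://www.example-shop.co.uk</link>
    <description>Product feed for Google Merchant Center</description>
    <item>
      <title>Blue Widget</title>
      <link>http://www.example-shop.co.uk/blue-widget</link>
      <description>A blue widget, available for in-store pickup.</description>
      <g:id>BW-001</g:id>
      <g:price>9.99 GBP</g:price>
      <g:condition>new</g:condition>
    </item>
  </channel>
</rss>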
For both buyers and sellers, this combination of Google Maps and Google Products can be very useful this holiday season.
You might have noticed that, starting mid-December, Google has been labeling certain websites with a “this site may be compromised” notice that appears in the search results under the website’s link. According to Google’s Matt Cutts, this is actually done to help webmasters by alerting them that their website has probably been hacked. The previous procedure of banning sites from Google’s search index and notifying the owners via Google Webmaster Tools had proved “too slow,” as not many site owners check their Webmaster Tools notifications on a regular basis.
As a hacked website does not usually present an immediate threat to the visitor (if malware is detected, Google Search shows the more aggressive “This site may be harmful to your computer” message), the “this site may be compromised” notice is aimed mainly at owners who constantly monitor their website’s appearance in Google search, urging them to pay immediate attention to the problem.
If we are going to understand the latest nuance of online searching, we need to know just which fundamentals we’re dealing with. If we are going to understand how to fine-tune and adjust our Web-based lives, we need to know something about the tools we’re planning to use.
We’ll start with “the release of the updated Bing Webmaster Tools,” announced in July on www.bing.com.
According to the information provided at the time, “After the Bing launch, we reached out to the webmaster and SEO communities to see how we could improve the Webmaster Tools. Your feedback was very consistent: you wanted more transparency to see how Bing crawls and indexes your sites, more control over your content in the Bing Index, and more information to help you optimize your sites for Bing.” That’s the word from Anthony M. Garcia, Senior Product Manager for Bing Webmaster Tools.
New and Improved
The goal, according to the Webmaster Center, was to change everything, or, in the staff’s words, “hit the reset button and rebuild the tools from the ground up.” The latest tool was announced by Garcia in the same forum on Dec. 14. He wrote that a new Inbound Links feature was released to provide “registered site owners the ability to easily & intuitively retrieve data about links to their sites.”
Bing apparently got a lot of feedback from SEO pros and website owners about “the importance of this data to better understand how their sites are ranked in Bing.”
Users can now learn details about the number of inbound links over time, with details on URL and anchor text. In addition, users can export this link data so it can be studied offline.
Garcia wrote, “It is important to note that the count of inbound links will be based on content stored in the Bing index vs. a complete, comprehensive count of links between every page on the Internet.” This could be a crucial distinction for many site owners and other industry professionals.
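Because the data can be exported, it is easy to run your own offline analysis. As a purely illustrative sketch in Python – the file name and column header below are assumptions, not Bing's actual export schema – you could tally the most common anchor texts pointing at your site:

import csv
from collections import Counter

# Tally anchor texts from an exported inbound-links file.
# NOTE: "inbound_links.csv" and the "Anchor Text" column header are
# assumptions for illustration; check the headers in your actual export.
anchors = Counter()
with open("inbound_links.csv", newline="") as f:
    for row in csv.DictReader(f):
        anchors[row["Anchor Text"].strip().lower()] += 1

for anchor, count in anchors.most_common(10):
    print(count, anchor)

A skew toward one exact-match anchor, for example, would show up immediately in a list like this.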
Building Blocks
New top-floor tools such as the just-announced Inbound Links item have to be supported by a strong and well-constructed foundation and lower floors. That was the goal when Bing “hit the reset button” a few months ago. Developers were aiming for a simpler experience that was also more intuitive.
The focus, according to Bing, was on “three key areas: crawl, index and traffic.” The new work included Index Explorer and Submit URLs, which basically provide more information about Bing’s crawls and indexes but also give the user improved control. Garcia wrote:
Index Explorer gives you unprecedented access to browse through the Bing index in order to verify which of your directories and pages have been included. Submit URLs gives you the ability to signal which URLs Bing should add to the index. Other new features include: Crawl Issues to view details on redirects, malware, and exclusions encountered while crawling sites; and Block URLs to prevent specific URLs from appearing in Bing search engine results pages.
In addition, the new tools take advantage of Microsoft Silverlight 4 to deliver rich charting functionality that will help you quickly analyze up to six months of crawling, indexing, and traffic data. That means more transparency and more control to help you make decisions, which optimize your sites for Bing.
At this point we should have a better-than-average idea of what the developers and managers intended with the new Bing Webmaster Tools. But if we don’t understand the overall changes made we might find that we’re a bit confused about the latest innovation. Let’s take a look back at some foundation material.
Back Then
In previous incarnations, Bing was Live Search, Windows Live Search and MSN Search. This new “decision engine” came under the broader category of Web search engine and was introduced to the general public in San Diego in May 2009. It was online a few days later.
That first product included such changes as search suggestions listed as queries are entered. According to information from the company, Bing also included a list of related searches based on Powerset semantic technology. (Microsoft bought Powerset in 2008.) Bing was also introduced as the muscle behind Yahoo! Search.
To get to the roots of Bing’s predecessors you have to go back an entire lifetime (in Internet terms) to 1998 and 1999. At that time, the company offered MSN Search and gradually added such improvements as self-constructed search-engine results that could be updated on a regular basis. Serious upgrades were rolled out in 2004 and 2005, with results made available to other search portals.
In 2006, the world was introduced to Windows Live Search, the replacement for MSN Search. This new iteration used tabs such as Web, images, news, and desktop. It was at this point that Microsoft stopped using Picsearch to provide images within the service; in-house developers created their own image-search mechanisms.
A year later the company effectively changed the playing field by moving the search products away from Windows Live services. The new product became Live Search, under the Platform and Systems division of the company. It’s important to also understand that Live Search was integrated with Microsoft adCenter at this time.
There’s More
Several key elements marked the transformation of Microsoft search tools from Windows Live Search and MSN Search to Bing.
It’s also essential for industry watchers to understand the significance of the change from the “Live” moniker. This decision was made by a company that was concerned about brand image and brand awareness. Bing was born.
So, we’ve come from the last century and MSN/Live Search to Bing, in about 10 or 11 years. Where does that leave us? Well, in 2009 Microsoft and Yahoo! signed an agreement to replace Yahoo! search tools with Bing. The contract is intended to last 10 years.
Not only has Bing become the power behind another brand, it has also resurrected Microsoft’s share of the search market. Industry figures show that the big company’s market share was declining until Bing gave it the needed adrenaline rush. In October 2010, Bing was in the top five among search engines by volume.
Painful Birth?
Was the transition from Live to Bing painful for users? It would be necessary to survey a majority of those folks to get an accurate picture of what the change meant. Microsoft and the people behind Bing introduced the change as good news.
In July, Garcia wrote, “We have good news for all the veteran users of the Bing Webmaster Tools. Your existing Webmaster Center accounts have been automatically upgraded to the new tools. This means that starting today, you’re already a registered user of the new Bing Webmaster Tools. There’s no need to create a new account, change ownership verification codes, or re-enter site data. If you don’t have a current account, you can easily sign-up and register your sites to begin using the new tools.”
A few veterans have classified Bing Webmaster Tools as good, if not the best. There were some early problems with Bing reporting websites targeted at the United States as United Kingdom sites. The problem seemed to involve specific tags that have the letters “gb” in them. This was written about on SEORankings by its Seattle-based founder, Wesley LeFebvre.
At the time, LeFebvre asked if some of these sites were suffering in the organic-ranking category because of the mistaken information. He also questioned the validity of Bing, Google and Yahoo! local rankings in light of this information. LeFebvre set a goal of finding out if other organic search engines use this tag.
In October, Steve Tullis wrote on the Webmaster Center blog, “In keeping with our themes of rapid change and responding to input and feedback from the webmaster community, we are working on future feature planning for the Bing webmaster tools and we would like to hear your thoughts in a few areas.”
Tullis discussed Crawl-Delay and asked several specific questions about support and control in this area. He asked users and webmasters if they preferred to have crawl-delay supported in robots.txt, in addition to several other pointed questions.
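For context, crawl-delay is a directive that site owners can already place in robots.txt to ask a crawler to slow down. A minimal example for Microsoft's crawler might look like the following; the user-agent and the delay value are illustrative, so check Bing's documentation for what it currently honors:

# robots.txt -- example only; the 5-second value is arbitrary
User-agent: msnbot
Crawl-delay: 5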
The door to getting assistance with Bing Webmaster Tools is on the Bing Webmaster Center Help page. While that may seem obvious, it doesn’t hurt to make contact with the source somewhere in your search for information about new versions, upgrades and additions. Judging from initial reports from both the company and a few users, the new tools are workable and the Inbound Links addition is absolutely essential.
It’s a bit early to try any detailed comparisons on Bing Webmaster Tools and its Inbound Links structure. But rest assured there are plenty of people out there who will be providing useful reports.
With Google introducing its new tool, the Google Books Ngram Viewer, several days ago, many were enthusiastic about it being the ultimate feature for etymological research. After all, the Ngram Viewer allows users to search millions of books (Google Books, of course) and then check, track, and analyze the appearances of any word across many centuries.
Users were enthusiastic at first, but it turned out that the tool is far from perfect. According to recent reviews, there are many problems and inaccuracies in Ngram Viewer reports – both expected and unexpected. A very basic issue is OCR – Optical Character Recognition. Even for modern books and fonts, occasional mistakes occur; the best OCR programs report an error margin of just below 1% for recognized text. For books from the 16th and 17th centuries, with their artistic fonts, this margin is sure to be higher. One example is the letter “s” being confused with “f” on numerous occasions.
Another problem concerns finding a word’s first occurrence: since the Google Books Ngram Viewer does not take into account the development of language over time, you have to research the several forms of a word used throughout the centuries to find its actual first usage. And then there are the reprints. Many Google books are labeled with the year of their printing instead of the year of the original manuscript, making searches produce more hits for “recent” years.
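One practical workaround for the spelling problem: the Ngram Viewer charts comma-separated terms as separate lines, so you can compare historical variants in a single query. For example (the variant pair here is just an illustration):

musick, music

Each spelling gets its own curve, which at least makes the shift between forms visible, even if the tool will not merge them for you.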
Overall, Google Books Ngram Viewer is not bad. It is just not as reliable as one might think. Suitable for occasional queries, it cannot be considered a reliable tool for serious academic research.
After Google announced its intention to acquire the renowned flight data provider ITA (the offer reportedly stands at 700 million USD), several serious questions arose. Wouldn’t it be too big a step towards monopoly? What are the benefits for the customers? And why the hell is Google buying ITA at all?
Well, the answer to the last question is pretty obvious. In recent years, Google seems to enter every niche available in the market. Long gone are the times when Google was just a search engine. Google Maps, Google News, Google SketchUp – more and more services are provided by the enterprise, and some people are already asking: is Google a search engine or your ultimate competitor?
However, the Google-ITA offer has now encountered serious opposition of its own. A group of businesses, organized as FairSearch.org, has banded together to prevent the deal. With ITA serving about two-thirds of airline ticketing and satellite websites, the ultimate “danger” – according to the FairSearch claim – is that Google will eventually start selling tickets directly while controlling the flow of data to potential competitors.
Google, of course, claims that the intention is purely to improve the service, making the flight data Google offers more reliable while continuing to redirect searchers to other websites that sell flight tickets.
While the consequences of this case remain to be seen, one thing is clear: more and more businesses (including giants like Microsoft and Expedia) are concerned about Google taking over.
Bing has introduced a new feature for Bing Image Search. It is called “Instant Answer” and is essentially a small bar of “tabs” that appears just below the search term box. Each tab suggests a more specific query, which is especially helpful in the case of ambiguous search terms.
For example, when you type “heat” as the Bing Image Search term, “Instant Answer” will suggest narrowing your search to “Miami Heat”, “Heat Energy”, “Heat Wave”, etc. The term “star” will feature shortcut tabs for filtering images of “Patrick Star”, “Star Wars”, “Star of David”, and more.
This feature resembles the familiar “related searches” suggestions for web search and saves time, as we do not need to retype the whole query – a simple click of the mouse does it instead. “Instant Answer” is, however, currently only available on the US Bing website.
Google has recently announced that Google Instant Mobile is now available “globally”. This means the tool has been released for all countries that have Google Mobile access (there are several dozen of those) and supports almost thirty languages.
The product, like its desktop counterpart, is integrated into search in the Android browser (built in for Android OS 2.2 and up) and features various algorithms that allow faster, dynamic search results.
Although this release was expected (shortly after releasing Google Instant Mobile in English, the company had announced that international support was on its way), nobody anticipated that it would happen so quickly. The roll-out took Google slightly over one month – an incredible figure, considering the complexity of the product. Of course, this simply means that Google had been working on the globalization of Google Instant Mobile simultaneously with the product itself. That is no wonder – Google has always emphasized the importance of international marketing and global support.
Telemarketing was never a part of Google’s business strategy. The idea was to spread the information, become recognizable, and make the clients come to them asking for services. That’s where the sales managers stepped in, offering a variety of products and bargain deals and impressing the customer.
However, according to the latest news, this has recently changed. Several hundred telemarketers are now employed by the company, their task being to sell the Google Boost and Google Tags services to local businesses in several US markets.
It seems the Rubicon has been crossed and Google has now “recognized” that some niches and companies should be addressed directly rather than through advertising. The next question, which will probably be answered in a few months’ time, is whether the company will expand its own telemarketing group (reportedly about 300 employees at the moment) or try to purchase an established sales force.
After all, with Groupon turning down “the $6 billion offer”, Google has some money to spare…
Everybody knows that since the introduction of the new Caffeine index, Google has been able to update its site indexing within hours and even minutes. Long gone are the days when we were shown some outdated “caption” of website content that turned out to be no longer present on the page. The case was different with images, though, as Google Image Search indexing was still lagging behind: it could take a month or so for a recently added image to appear in Google Image Search results.
It appears Google, in its efforts to produce real-time results, is now addressing this “problem”. Several users who follow Google image indexing closely have reported a major improvement in this field, stating that new images are being indexed more promptly, appearing in Google Image Search results with a delay of only several days. This is yet more proof that non-textual content is becoming more and more popular among users, making it essential for proper SEO.
Twitter LOVES Twitter spam. Is there an official name for Twitter spam? Twam? (OK, I will trademark that.)
Why do I say this? Because I found it shockingly difficult today to find a tool to bulk unfollow on Twitter. I have been in quite a dog fight today. In theory, what I am trying to do should be simple: I would like to bulk remove spammers and defunct Twitter members that a client is following due to the use of one of those auto-follow programs.
This Twitter account is currently following 20K members and has about 20K people following it. I don’t want to close the account, as they would lose many loyal followers; my guess is about 5K. So we need to remove all the junk. We started out by trying to remove them manually within the Twitter interface; 20 minutes of work removed about 200, and I quickly figured out this was not going to work. So I decided to look for some tools online that could assist me with the removal of most of the people currently being followed.
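On paper, this is a few lines of script against the Twitter API. Here is a rough sketch of the idea in Python using the tweepy library – the credentials are placeholders, the "unfollow everyone who doesn't follow back" rule is just one possible cleanup policy, and (as I was about to learn) rate limits and Twitter's terms get in the way long before 20K removals:

import tweepy

# Placeholder credentials -- you would register your own app with Twitter.
auth = tweepy.OAuthHandler("CONSUMER_KEY", "CONSUMER_SECRET")
auth.set_access_token("ACCESS_TOKEN", "ACCESS_TOKEN_SECRET")
api = tweepy.API(auth)

# Page through the full ID lists; a 20K account needs cursors, since the
# API returns at most a few thousand IDs per call.
following = set(tweepy.Cursor(api.friends_ids).items())
followers = set(tweepy.Cursor(api.followers_ids).items())

# Example policy: unfollow everyone who does not follow back.
for user_id in following - followers:
    api.destroy_friendship(user_id)  # subject to strict API rate limits

That is the theory. In practice, I went looking for ready-made tools.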
I found plenty of tools to remove Twitter followers and, of course, tried the free ones first – with zero success; it’s possible that these free tools are not being properly maintained. Time to spend some money to get the right tool. The first paid tool would not get past the API call to Twitter after 30 minutes. I was fast becoming frustrated, but luckily I found a nice blog post listing 23 tools. Here’s that list for your reference:
http://www.aboutonlinetips.com/mass-bulk-follow-unfollow-tools/
And here is a summary of what I found checking these out:
http://www.socialoomph.com/ Did not try this one, as nowhere did it mention bulk removal of followed Twitter members.
http://www.jdwashere.com/twiping/index.html Twiping looked really promising; I quickly dropped $9 on it and downloaded the application. Except all it did was lock up my computer – multiple times, on multiple Twitter accounts. Really annoying.
http://www.theunfollowed.com/home.php Looks really easy: log into Twitter and unfollow in bulk – perfect. But after authenticating the Twitter account, it kicks back to http://www.theunfollowed.com/home.php, and when I click on bulk, nothing happens – locked up. Tried this in Firefox, Chrome, and IE… same thing, it just locks up.
http://www.tweemaid.com/ Tweemaid is another tool that claims to wipe the list clean, but after three browsers, all I got was a dead page at the end.
http://joeldrapper.com/untwollow/ Untwollow only removes one Twitter following at a time (I think – I did not confirm the removal, as I did not have time to do this one at a time)… yet another lost cause.
http://friendorfollow.com/ This tool seems to function, but does not do what I am looking to do. It just shows you who you’re following that is not following you back – possibly useful, but it will not work for this task.
http://www.huitter.com/ Apparently does not exist any longer.
http://dossy.org/twitter/karma/ This also looked promising, but it’s never a good sign when the page says that if it does not work in 5 minutes, you should reload the page and try again… guess what, that was me. Moving on.
http://www.mycleenr.com/ Apparently out of business, but they were kind enough to recommend two other services: FriendOrFollow, which I tried with no success, and:
Refollow http://www.refollow.com/refollow/index.html The verdict is still out on this one: I paid them $20 and am waiting 24 hours for a confirmation of payment? (Certainly not giving me the warm fuzzies; they may as well have sent something via the US Post Office.)
http://www.buzzom.com/ Does not even mention on the website how to begin…
http://twitoria.com This looked good, except it only appears to list the users that have not been active on Twitter in some time; it does not appear to give you an opportunity to go nuclear on them.
http://tweetblocker.com/ This website appears to be either down or gone… I vote for gone.
Needless to say, the list goes on; I tried probably 5-6 others whose names I cannot even find.
I did find two that work!
http://untweeps.com was fairly simple to log in to. It apparently only permits you to unfollow people who have not tweeted in a specific time range – in the last 30 days, for instance – although if you sign up for $2.00 you can remove just about everyone. If you’re patient with the loading time, you can remove about 700-1000 at a time (secret: put 0 in the box for time since last tweet).
http://tweepi.com also works, but will only permit you to remove 20 at a time (or 40 at a time if you tweet about how wonderful they are).
In order for these two services to work, you need to download one of these two Firefox checkbox plug-ins:
https://addons.mozilla.org/en-US/firefox/addon/2393/ I tried this one with Firefox 3.6.13 and it would not work.
https://addons.mozilla.org/en-US/firefox/addon/9740/ This one specifically said it would not work with the latest browsers. So I downloaded Firefox 3.0, and this utility worked great.
For either tool, simply highlight the entire section you want to check, then right-click and select Check. With all the boxes checked, click Remove.
In finally finding these two websites that actually appear to work, I discovered something interesting: Twitter is making bulk unfollowing VERY difficult. Its API terms and conditions actually state that a “Check All” button is a violation of the TOS (terms of service). In effect, what Twitter is saying is that it is very interested in keeping up inflated numbers of members, followers, and activity – to such a degree that bulk unfollowing is becoming basically impossible.
I suppose that Twitter, in a half-hearted way, likes Twitter spammers, auto-bots, and deadbeat users, because if it really wanted to eliminate these users, offering the opportunity to bulk remove people would not be such a mighty task.
I could understand it if Twitter took a very hard-line approach to all API applications, but many of the auto-follow applications work flawlessly; it seems only the bulk removal of followed people is discouraged.
I hope my 4 hours of lost time can save you some if you find yourself needing a Twitter bulk unfollow or removal tool.