Earlier this year Google began including AMP listings in its mobile search results. AMP, short for Accelerated Mobile Pages, is a specification for creating slimmed-down pages for mobile devices. It's a restricted subset of standard HTML. In addition, Google caches AMP pages on its own CDN to provide the fastest retrieval possible. Any AMP pages appearing in Google Search results are linked to these cached copies.
You may be asking yourself why we need such a thing when mobile phones are already capable of displaying “ordinary” websites. It’s a good question, and admittedly it was my first question when I learned of AMP. The AMP project says its purpose is to give mobile users a better, faster web experience. But what’s wrong with the current user experience on mobile phones? Is it really bad enough to warrant an entirely new web page specification when we already have HTML5?
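For the curious, here is a minimal sketch of what an AMP page looks like, based on the markup the AMP project requires (the URLs and filenames are placeholders, and the mandatory `amp-boilerplate` style block is elided for brevity):

```html
<!doctype html>
<html ⚡>
<head>
  <meta charset="utf-8">
  <!-- The AMP runtime; arbitrary custom JavaScript is not allowed -->
  <script async src="https://cdn.ampproject.org/v0.js"></script>
  <title>Hello AMP</title>
  <!-- Points back to the regular (non-AMP) version of the page -->
  <link rel="canonical" href="https://example.com/article.html">
  <meta name="viewport" content="width=device-width,minimum-scale=1">
  <!-- AMP also mandates a <style amp-boilerplate> block here, elided for brevity -->
</head>
<body>
  <h1>Hello, AMP</h1>
  <!-- Plain <img> is restricted; AMP supplies <amp-img> with explicit dimensions -->
  <amp-img src="photo.jpg" width="640" height="360" layout="responsive"></amp-img>
</body>
</html>
```

The explicit width and height on every image are part of how AMP avoids layout jank while the page loads.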
A federal judge has allowed a lawsuit against Google to move forward, clearing the first hurdle in actually bringing the suit to trial. e-ventures Worldwide LLC, a search engine optimization company, alleges that Google Inc. has improperly censored search results for “anti-competitive, economic” reasons. (more…)
As confirmed on the Webmaster Tools blog yesterday, Google is rolling out a change to its algorithm, this time aimed at penalizing something called “Doorway Pages.”
“For example, searchers might get a list of results that all go to the same site. So if a user clicks on one result, doesn’t like it, and then tries the next result in the search results page and is taken to that same site that they didn’t like, that’s a really frustrating experience.”
Basically, over time, in an effort to maximize their search footprint, a number of sites have built doorway campaigns by creating many different pages on one site, registering a number of different domains, or a combination of the two.
If you are wondering about any of your current campaigns, make sure they don’t fall into any of these categories: (more…)
On December 1, 2014, Mozilla rolled out version 34 of its popular open-source browser, Firefox. This may not sound like a big deal, as updates to Firefox come pretty frequently, but what was included in this update may very well be.
The default search provider has been switched to Yahoo in the United States.
Since 2004, Mozilla has had an agreement with Google making it the default search provider on the Firefox home page and in the search box in the top right corner in almost all countries. This has been Mozilla's main source of revenue for the past ten years, which has made it dependent on Google. (more…)
UPDATE 10/19: Google has confirmed the rollout of Penguin 3.0
I’ve only got one anecdote to support my suspicion, but a client who was a victim of Negative SEO lost all of their top 10 rankings, and as of today they are all back 100%!
Backstory: This client hired us in December 2013 to figure out why their rankings and organic traffic had dropped overnight on October 5, 2013 (the date of the last Penguin update). We analyzed their backlinks using a variety of tools, and it didn't take us long to discover they had been a victim of negative SEO. We found that over a period of three months, random spammy links pointing at their site had been added to the tune of about 20,000 links a month! They had never done their own link building, nor hired an SEO. Apart from these spam links, they had a very small link profile consisting of just a few natural links.
Because the rankings and traffic did drop right in line with the Penguin update, and there were no warnings in Webmaster Tools of a manual penalty, we knew this was an algorithmic penalty associated with these links.
It took us about three months to discover and disavow these links, and the reason why is interesting. The links were not showing up on the spam pages every time; they would only occasionally appear upon a refresh. Each refresh of these pages served about 100 random links, and each refresh served a new 100. So any link checker would find the links sometimes, but not at other times. We had no choice but to keep running link checkers (three different ones, in fact) about once a week. Each time, we would discover new links that we hadn't yet disavowed. It was a brilliantly sneaky black-hat SEO attack.
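That "re-crawl and collect whatever is new" approach can be sketched in a few lines. This is an illustrative sketch, not the actual tooling we used; the URLs below are made up, and a real run would fetch the live spam pages on each pass:

```python
from html.parser import HTMLParser

class LinkExtractor(HTMLParser):
    """Collects the href of every <a> tag encountered in a page."""
    def __init__(self):
        super().__init__()
        self.links = set()

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.add(value)

def new_links(html, seen):
    """Return links in `html` not already in `seen`, and add them to `seen`."""
    parser = LinkExtractor()
    parser.feed(html)
    fresh = parser.links - seen
    seen |= fresh
    return fresh

# Each weekly crawl of a rotating spam page surfaces only the links
# not caught on a previous pass.
seen = set()
snapshot1 = '<a href="http://spam1.example/">x</a>'
snapshot2 = '<a href="http://spam1.example/">x</a><a href="http://spam2.example/">y</a>'
print(new_links(snapshot1, seen))  # only spam1
print(new_links(snapshot2, seen))  # only spam2 (spam1 already seen)
```

Running something like this against each fresh snapshot of a rotating spam page gradually accumulates the full set of links to disavow.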
By around February, we were confident we had disavowed about 90% of the spam links. But no recovery.
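For reference, Google's disavow file is a plain text list with one entry per line: a `domain:` entry disavows every link from that domain, a bare URL disavows a single page, and `#` starts a comment. The domains below are placeholders:

```text
# Spam links discovered over several crawls; site owners never responded
domain:spam-directory.example
domain:link-farm.example
# Disavow a single page rather than a whole domain
http://mostly-legit.example/spam-page.html
```

In a negative SEO case like this one, the `domain:` form is usually safer, since the spam pages rotate and individual URLs keep changing.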
As we have all learned since then, a site must wait on Google to update the Penguin algorithm before any changes in rankings or recovery can take place. Unfair? Absolutely. Especially in the case of Negative SEO, which is real and does work. And in this case it took Google over a year to update Penguin. Imagine us reassuring this client that as soon as Google runs their update, their rankings will be back. “When will that be?” “Uh, Google won’t tell anyone, should be within 3 more months”.
In the meantime, we worked hard on content marketing, blogging for the client, getting high quality content published on their site and blog, building out Google Local pages, hoping that when the update hit, their site would be stronger than ever.
SUCCESS! ALL of their previous important keywords are back to the top 10, as of today October 18, 2014. Some of them even higher!
We are torn between being thrilled and being sickened that Google can allow something like this to happen.
January 2013 – Last spring Google posted about responsive web design on its official Webmaster Central blog, and though the flavor of the article was fairly mild, it made very clear that Google's “commitment to accessibility” includes a very important message to web designers: “Mark up one set of content, making it viewable on any device.”
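In practice, that one-set-of-content approach boils down to a viewport meta tag plus CSS media queries that reflow the same markup at different widths. A minimal sketch (the class name and breakpoint are illustrative):

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
<style>
  .sidebar { float: right; width: 30%; }
  /* On narrow screens, the same markup stacks into a single column */
  @media (max-width: 600px) {
    .sidebar { float: none; width: 100%; }
  }
</style>
```

The point Google is making is that there is one URL and one set of content; only the presentation changes per device.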
If you work in the field of SEO, you probably understand that Google controls everything. We constantly bend to its will and try to outthink it at every turn. Just as space travel is unpredictable because we haven’t yet experienced much of it, SEO is also a largely new frontier and we seldom know what to expect from our environment. Our environment, of course, is Google. But what if Google didn’t exist? Where would we look for sites? How would we get links? Your brain is probably boiling over with great ideas right now, and that’s the point of this whole thing—if we eliminate Google from the equation entirely, those paths that we come up with are almost completely organic.
Google is extremely popular with both the general public and with SEO professionals, but it often locks us inside of a box. At some point we’re not exploring the web on our own, and instead we are relying on an algorithm and some web spiders to explore for us. We can break out of this box and choose our own destination in a natural, organic way. Considering the question “what if Google didn’t exist?” is a great way to answer the question “where can I get more links?”
Playing “what if?” is a fun, but sometimes dangerous, game. It’s easy to get stuck in the mire of negativity and use “what if?” to fuel your own pessimistic fire. Used correctly, however, “what if?” can be a great catalyst for ideas and innovation. For example, think about the popular post-apocalyptic genre of fiction, where a shovel might become the protagonist’s best weapon, best tool and best friend. Similarly, in a world without Google, a message board buried somewhere inside a mediocre site with low domain authority might become an excellent research tool. After all, if all of these people are willing to brave an underwhelming site just to talk to each other and share about a topic, that means they’re passionate about it. Passion leads to great info, great leads on new sites and useful links. Google does exist, of course, but thinking outside of that box produces some interesting results.
In addition to bringing you the most relevant results, search engines often compete over presenting the most up-to-date pages to the searcher. That’s why Google has those time-related filters on the left, just below the “type” filters. Although you might assume recent content matters only to news-seekers, Google thinks otherwise. Long before the recent “Panda” update to its indexing algorithm, which has been talked about all over the world during the last week, Google made numerous adjustments to its ranking rationale, with frequently updated websites getting “bonuses” in search engine placements.
Yet another step in the same direction was taken several days ago, although no official announcement has been made. It seems Twitter is getting more credit within Google, which has decided to present recent tweets in its search results. The results also show the user’s picture. More important, though, is the fact that the link in the tweet is included in the SERPs, making it a valuable inbound link for the featured website.
It has to be noted that the above only applies to recently posted tweets (the exact window could not be determined, but from my testing it is probably several hours; after that, the results return to the usual “join Twitter to follow”). If you want to see those results, by the way, it is advisable to include the word “Twitter” in your search query.
While several companies established the Fairsearch.org group to try to prevent the Google-ITA deal, claiming it is yet another step toward a Google monopoly in the airline ticket market, Bing has decided not to complain but to fight. In addition to purchasing Farecast, a predictive engine for flight prices, about a year ago, Bing is now teaming up with one of the popular travel search engines, KAYAK. The deal seems beneficial for both sides; after all, KAYAK is probably also worried about Google acquiring ITA, despite its talk of “welcoming Google as a competitor”.
In its announcement, Bing flatters KAYAK, calling its new partner “a leading innovator in travel search” and promising a “more comprehensive travel search experience”. The deal should benefit people who want to plan and book via Bing. Although this looks like an attempt to stop people from leaving Bing, the move is actually a counter-step to Google entering travel search. Of course, although “Google is not winning every niche it enters”, as KAYAK CTO and co-founder Paul English put it, it can affect the market heavily.
So, there is nothing left but to wish good luck to both Bing and KAYAK in their struggle.
According to the latest StatCounter data, Google has dropped below 90% of search engine market share for the first time since July 2009. The reported figure of 89.94%, though, is still a major headache for its competitors, Yahoo and Bing, which combine for just over 8% of global search. In the European market the domination is even greater: Google has about 94% market share.
Although Bing surpassed Yahoo globally in January, in the US market Yahoo! is still the number two search engine, with a 9.74% share compared to Bing’s 9.03%. Google has dropped below 80% once again, with 79.63%.
In Asia, Baidu has once again beaten Bing for the number three spot (Yahoo! is second). It must be noted, however, that StatCounter only considers English-language searches, so the results have to be viewed with care. For example, in Russia Google is reported as the market leader with 52%, with Yandex at 46%, and in the Czech Republic the picture looks even brighter for Google, which beats local Seznam 79% to 19%. Of course, when native-language searches are considered, both Yandex and Seznam are more popular than Google in their local markets.
But even so, in China Baidu is a clear number one with almost 70% of the market (compared to Google’s 29%), and in South Korea Naver is back to an absolute majority (55.15%), with both Google and the recently launched Daum losing ground (31.7% and 7.85% respectively).