Earlier this year Google began including AMP listings in its mobile search results. AMP, short for Accelerated Mobile Pages, is a specification for creating slimmed-down pages for mobile devices; it is a restricted subset of standard HTML. In addition, Google caches AMP pages on its own CDN to provide the fastest retrieval possible. Any AMP pages appearing in Google Search results are linked to these cached copies.
You may be asking yourself why we need such a thing when mobile phones are already capable of displaying “ordinary” websites. It’s a good question, and admittedly it was my first question when I learned of AMP. The AMP Project says its purpose is to give mobile users a better, faster web experience. But what’s wrong with the current user experience on mobile phones? Is it really bad enough to warrant an entirely new web page specification when we already have HTML5?
A federal judge has allowed a lawsuit against Google to move forward, clearing the first hurdle in actually bringing the suit to trial. The company e-ventures Worldwide LLC (a search engine optimization company) alleges that Google Inc. has improperly censored search results for “anti-competitive, economic” reasons.
As confirmed over on the Webmaster Tools blog yesterday, Google is implementing a new change to their algorithm, this time focusing new penalties on something called “Doorway Pages.”
“For example, searchers might get a list of results that all go to the same site. So if a user clicks on one result, doesn’t like it, and then tries the next result in the search results page and is taken to that same site that they didn’t like, that’s a really frustrating experience.”
Basically, over time, in an effort to maximize their search footprint, a number of online sites have created doorway campaigns by creating many different pages on a single site, by registering a number of different domains, or by a combination of the two.
If you are wondering about any of your current campaigns, make sure they don’t fall into any of these categories:
For a while we have known that speed is an important factor in search ranking, but over the last week an interesting notification has been spotted in testing. As reported on SearchEngineLand, Google has been testing a bright red “Slow” warning in search engine results pages (SERPs) for sites that are slower than normal. This way users can be warned that clicking the link will result in a slow page load.
As you may know, we recently re-branded our company, and along with creating a new name, website, and logo, we also had to change all of our social media pages and URLs to reflect the new company name.
Thankfully, these days most of the popular social media sites have special tools designed just for this purpose, and I will walk you through a few of them.
UPDATE 10/19: Google has confirmed the rollout of Penguin 3.0
I’ve only got one anecdote to support my suspicion, but a client of ours who was a victim of negative SEO lost all of their top-10 rankings, and as of today they are 100% back!
Backstory: this client hired us in December 2013 to figure out why their rankings and organic traffic had dropped overnight on October 5, 2013 (the date of the previous Penguin update). We analyzed their backlinks using a variety of tools, and it didn’t take us long to discover they had been a victim of negative SEO. We found that over a period of 3 months, random spammy links had been pointed at their site, to the tune of about 20,000 links a month! They had never done their own link building nor hired an SEO. Other than these spam links, they had a very small link profile consisting of just a handful of natural links.
Because the rankings and traffic did drop right in line with the Penguin update, and there were no warnings in Webmaster Tools of a manual penalty, we knew this was an algorithmic penalty associated with these links.
It took us about 3 months to discover and disavow these links, and the reason why is interesting. The links were not showing up on the page every time; they would only occasionally appear upon a refresh. Each refresh of these spam pages produced about 100 random links, and each refresh showed a new 100 links. So any link checker would find the links sometimes, but not at other times. We had no choice but to keep running a link checker (3 different link checkers, as a matter of fact) about once a week. Each time, we would discover new links that we hadn’t disavowed yet. It was a brilliant, sneaky black-hat SEO attack.
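The accumulate-across-refreshes routine we ran can be sketched in a few lines of Python. This is a minimal illustration, not our actual tooling: the HTML snippets and spam domains are made up, and a real run would fetch live pages on a schedule rather than parse literal strings. The output uses the `domain:` entry format that Google’s disavow file accepts:

```python
import re
from urllib.parse import urlparse

def extract_links(html):
    """Pull absolute href targets out of raw HTML (crude regex sketch)."""
    return re.findall(r'href="(https?://[^"]+)"', html)

def accumulate(seen, html):
    """Record links not seen on earlier refreshes; return the newly found ones."""
    new = [u for u in extract_links(html) if u not in seen]
    seen.update(new)
    return new

def disavow_lines(seen):
    """Collapse all accumulated URLs into one disavow entry per domain."""
    domains = sorted({urlparse(u).netloc for u in seen})
    return [f"domain:{d}" for d in domains]

# Two simulated refreshes of the same spam page, each showing different links
seen = set()
accumulate(seen, '<a href="http://spam-one.example/p1">x</a>')
accumulate(seen, '<a href="http://spam-one.example/p2">y</a>'
                 '<a href="http://spam-two.example/q">z</a>')
print(disavow_lines(seen))
# -> ['domain:spam-one.example', 'domain:spam-two.example']
```

Because each refresh shows a different random sample of links, the `seen` set only converges after many passes, which is why the real cleanup took weekly runs over months.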
By around February, we were confident we had disavowed about 90% of the spam links. But no recovery.
As we have all learned since then, a site must wait for Google to update the Penguin algorithm before any change in rankings or recovery can take place. Unfair? Absolutely. Especially in the case of negative SEO, which is real and does work. And in this case it took Google over a year to update Penguin. Imagine us reassuring this client that as soon as Google runs its update, their rankings will be back. “When will that be?” “Uh, Google won’t tell anyone; it should be within 3 more months.”
In the meantime, we worked hard on content marketing, blogging for the client, getting high quality content published on their site and blog, building out Google Local pages, hoping that when the update hit, their site would be stronger than ever.
SUCCESS! ALL of their previously important keywords are back in the top 10 as of today, October 18, 2014. Some of them even higher!
We are torn between being thrilled and being sickened that Google can allow something like this to happen.
With Google and other search engines increasingly trying to ‘humanize’ their algorithms so that their ranking of websites reflects the preferences of real users, there has never been a more important time for website owners to make sure that their content is fully up to scratch.
It is no secret to anybody that Google is a bit of a bookworm: its head of webspam, Matt Cutts, frequently uses his Webmaster Central vlog to point out that, while intelligent use of images, infographics, videos, and audio files helps a website’s ranking, the search engine is still hungriest of all for text.
Content should be viewed as the bricks and mortar of your site. If you view your website in the same way as your house, of course you want it to look pretty and stand out on your street, but there’s no point doing that if it’s made of sponge bricks and uses porridge as mortar. Nobody would want to visit such a house more than once, and it would fail in its purpose. Aesthetic features on your site should be there to accentuate its solid foundation of content – not in place of it.
AdWords by Google is a proven way to generate traffic, leads, and sales. It is a tool that some marketers use on a regular basis to drive business goals. While a few use it to great effect, many more strike out and, in the process, throw a lot of money out the window. Having success with AdWords calls for the same approach you take to generating visibility for your organic content — optimize. However, the actual steps to optimization are a bit different in this arena.
1. Learn to Bid and Budget
Understanding how to bid and budget your funds is crucial to AdWords success. Spending $20 a day may appear to be the way to go for cost-conscious advertisers, but a budget that low will also limit your ability to profit. Likewise, bidding low initially can save you some money, but it might not get you very many clicks. While you don’t want to spend beyond your budget, you also don’t want to limit your potential, so learning how to manage your funds in accordance with the ad platform is key.
January 2013 – Last spring Google posted about Responsive Web Design on its official Webmaster Central blog, and though the flavor of the article was fairly mild, they made it very clear that their “commitment to accessibility” includes a very important message to web designers: “Mark up one set of content, making it viewable on any device.”
I’m sure by now everyone has seen that Google allows you to set up authorship credit for the content that you create. Credit is given by a picture of the author along with a link to the author’s Google Plus page as well as a link that allows you to read more posts by the author.
Setting this up for a blog with only one author posting content is pretty straightforward, and there is a lot of good information available on how to do it. The problem comes when you have a website or blog with multiple authors posting content. Unfortunately, the steps that work in the single-author case do not work when there are multiple authors, and you would end up with the wrong author receiving credit for the content.
After a fair amount of searching, I was still not satisfied with any of the answers I had found on how to set up authorship credit when there is more than one author. A lot of the posts I found contained old, outdated information or steps that honestly are not necessary. Finally, after piecing together a bunch of information, I found a solution that was surprisingly even easier than I had expected! In the next couple of sections I will cover how to configure author credit for both scenarios (single author or multiple authors) in WordPress.