
Web Moves Blog

Web Moves News and Information

Archive for the 'Search Engine Optimization' Category

Stay calm, it’s still working…

This from Phil Leahy

I normally check every site I visit to see its Google PageRank and Alexa rank. From Thursday the 6th of October I noticed they were not showing up on my Alexa toolbar or SeoQuake toolbar. I did some research via Twitter to see what was going on, only to see people saying Google had just stopped its PR feature!
So I visited Google to do some research and see what was happening. The problem isn’t that Google has stopped its PageRank feature, or that there was another update; the problem had to do with something else.
The Recent Google PR Server Change
The reason for the recent change in Google PR is that Google just changed its PR server, and as a result its query URL changed too.
The old query URL is:
http://toolbarqueries.google.com/search?client=navclient-auto&features=Rank&ch=8f3b58e04&q=info:[URLHERE]
The new query URL is:
http://toolbarqueries.google.com/tbr?client=navclient-auto&features=Rank&ch=8f3b58e04&q=info:[URLHERE]
What that means is that any software addon such as Alexa or SeoQuake, or any website or PR tool that tries to use the old server to check PR, will stop working.
So nothing has happened to your PageRank, and your site won’t be affected. The only thing that has happened is a change in the Google PageRank query URL, which will soon be fixed in most SEO tools and websites; as a result you will soon be able to check your Google PageRank again via your chosen tool.
If you want to check your Google PageRank now (to make sure everything is okay), you can install the new Google Toolbar on Mozilla Firefox (version 4 or under) or Internet Explorer and check it there. To change the settings in SeoQuake: open Preferences, click the “Parameters” tab, double-click “Google Pagerank”, and replace http://toolbarqueries.google.com/search?client with http://toolbarqueries.google.com/tbr?client.
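For tool builders, the fix is essentially a one-line string rewrite. Here is a minimal sketch, assuming your tool stores the query URL as a template string; the function name and the idea of patching a saved template are my own illustration, not part of any official API:

```python
# Sketch: updating a stored Google Toolbar PageRank query URL from the old
# /search endpoint to the new /tbr endpoint.

OLD_ENDPOINT = "http://toolbarqueries.google.com/search?client"
NEW_ENDPOINT = "http://toolbarqueries.google.com/tbr?client"

def update_pr_query_url(template: str) -> str:
    """Rewrite an old-style PR query template to use the new endpoint."""
    return template.replace(OLD_ENDPOINT, NEW_ENDPOINT)

old = ("http://toolbarqueries.google.com/search?client=navclient-auto"
      "&features=Rank&ch=8f3b58e04&q=info:[URLHERE]")
print(update_pr_query_url(old))
```

Any tool that builds its query from such a template only needs this one substitution to start resolving PR again.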

google page rank missing

I noticed last night that Google’s visible PageRank had disappeared from several tools I use. I looked around a little bit but could not find any tool that was working. I think the SEO community has been expecting this for quite some time, as people have put far too much emphasis on Google’s visible PageRank, to the point where an entire economy (the text link industry) has developed around a website’s PageRank and the value you can extract from selling links.

I think the guys from Majestic SEO and SEOMoz are both quite pleased to see this change, as they both have built into their link spider the ability to score the link value of a website based upon somewhat similar characteristics to that of Google’s Page rank.

I guess we will see how this all plays out over the coming days, weeks, months or years. I think the removal of visible PageRank has been long overdue.


A pretty nice bonus for brands within natural search results today: sitelinks are now going VERY wide, across many valuable pages within your domain.

Check this out:

Google New Sitelinks Format


I think this is a GREAT move by Google, as in most cases it will stop people from marketing themselves around someone else’s brand. It will clearly give someone looking for a particular brand a better user experience.


Just noticed something new in Google Webmaster Tools called “Instant Preview”; it appears to be tied to Fetch as Googlebot. It also appears somewhat flawed, as it does not crawl or display Flash (which makes our website look bad). Still, it is yet another way to keep an eye on your website for errors and crawlability. I think it is fairly heavily tied to your website’s load time, as Googlebot appears to get only a limited amount of time to grab a snapshot of your website.

Google Webmaster Tools Instant Previews

 

It showed us as having 26 errors on this page, which seems like a lot considering our website is fairly tight (not as tight as our clients’ websites).

One last interesting note: our page actually appears rock solid in the Google Search preview, yet not in this Instant Preview… hmmm.

 

My rant of the day is that I’m struggling to understand why Google is not either penalizing websites that do not provide a mobile version of their site, or rewarding websites that do.

This weekend I found myself annoyed yet again with Google’s search results while searching on my phone. I am really tired of surfing the web from my mobile device only to be presented with ten of the slowest-loading, fat-bellied, ad-riddled authority websites on earth. I want to see mobile-friendly results when searching on my mobile and find what I need FAST. Waiting for these bloated, outdated sites to load delivers an awful user experience, and user experience is what Google is supposed to be making its number one priority. Even within Google News results, you find that these traditional, ad-heavy websites are the ones that often appear. It’s maddening, sluggish, and I end up giving up before I find what I was searching for.

We know Googlebot can tell the difference between a mobile website and a traditional website. I realize that in the beginning the mobile search results will suffer, as many websites have had no encouragement to create a mobile version. But I think the time has come. If you would like to see mobile search results on your desktop, all you need to do is go to http://www.google.com/m and you can see what the search results look like on a mobile device.

google mobile serps


Google, we all appreciate your efforts to take over the world with your Android OS, Chrome OS, Chrome browser and Google Radio, among the many other market segments you’re actively jumping into with both feet. But I think you need to circle the wagons here and make sure your core business is tidy before you keep moving into new markets. You never know: someone may be cooking up a mobile browser (like http://www.skyfire.com/), or better yet a mobile search engine, in the background.

Why isn’t Google delivering mobile friendly results?

Cindy Krum discussed mobile search results over at SEOMoz, talking about Google keeping two separate indexes for mobile and traditional search. She makes the point about how bad the results are when Google only provides mobile results. But I am not sure the results are any better when you are presented with several really big, slow websites that take forever to load and are nearly impossible to navigate.

Whether Google needs to maintain a separate index or just a difference in the algorithm is debatable, but I think this should be changed sooner rather than later. Once mobile websites are rewarded in mobile search, obviously more companies would make their sites mobile friendly. Google need not exclude non-mobile results, but could simply tweak its mobile algorithm: for any query, run through the top 20-30 results, which are traditionally fairly relevant, and if any of these results have a mobile website, show those websites higher. No rocket science required, just a little common sense.
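The tweak suggested above can be sketched in a few lines. This is just an illustration of the idea, assuming a hypothetical result structure with a has_mobile flag; a real system would have to detect mobile-friendliness by crawling each result:

```python
# Sketch of the simple re-ranking idea: take the top results (already fairly
# relevant) and promote the ones that offer a mobile version, keeping the
# original relevance order within each group. The dict structure and the
# "has_mobile" flag are hypothetical, for illustration only.

def rerank_for_mobile(results):
    """Stable re-rank: mobile-friendly results first, original order kept."""
    mobile = [r for r in results if r.get("has_mobile")]
    desktop = [r for r in results if not r.get("has_mobile")]
    return mobile + desktop

serp = [
    {"url": "http://bloated-authority.example", "has_mobile": False},
    {"url": "http://lean-site.example", "has_mobile": True},
    {"url": "http://news-portal.example", "has_mobile": False},
]
for result in rerank_for_mobile(serp):
    print(result["url"])
```

Because the sort is stable within each group, relevance ordering is preserved; the only change is that mobile-friendly sites float to the top.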

It is odd that Google is not rewarding the online businesses that have embraced mobile search, some of which even provide a great mobile website with the appropriate search engine optimization for mobile. I really thought that by May 2011 Google’s mobile search would have come further than this.

If there is a key to getting good reviews from customers and clients it is the commitment to providing excellent service and quality products. Put this in place and you build a solid foundation that will take care of reviews and testimonials.
But, in this online world, how do you make sure that others will find these reviews and that your business will benefit from them? You have to make sure your users and visitors understand that reviews must be in the correct format and submitted in the proper manner.
That said, you – the business owner – must be responsible for putting the process in place so that reviewing your services and products is as easy as, well, telling others about the experience. The customer shouldn’t have to be a “techie,” and he or she shouldn’t have to wade through half a dozen steps to get that review ready for others to read.
So what does that mean?
In the current online world it means finding out what Google is doing and what Google is not doing. Here’s the answer: Google supports specific microformats. If you want your reviews to get into the Google ocean you will have to get them into the correct format.
How does that happen? Well, it doesn’t just happen.
First of all, determine whether you are indeed a viable ecommerce site that could benefit from client/customer reviews. Of course you are! Note that this particular path is relevant for businesses listed with Google Places. Then you must make sure that the business name, address and telephone number are on your pages – in the correct hCard format!
What is hCard? Here’s the layperson’s definition (if there really is one) from Wikipedia: “hCard is a microformat for publishing the contact details of people, companies, organizations and places in (X)HTML, Atom, RSS or arbitrary XML.”
For nearly two years, Google has been using the hCard, hReview and hProduct microformats in its search results. More recently, the company has begun using these microformats for local search results. As we see it, having this code in place allows businesses to direct customers to the business’s own Web site to leave a review, rather than sending them to Google Maps to search for the business, find the listing, choose a review link, log in, and so on.
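For illustration, here is a minimal hCard sketch for a business listing. The business name, address and phone number are placeholders; the class names (vcard, fn, org, adr, tel) come from the hCard microformat itself.

```html
<!-- Minimal hCard sketch; all the business details below are fictional. -->
<div class="vcard">
  <span class="fn org">Acme Web Design</span>
  <div class="adr">
    <span class="street-address">123 Main Street</span>,
    <span class="locality">Springfield</span>,
    <span class="region">IL</span>
    <span class="postal-code">62701</span>
  </div>
  <span class="tel">555-0123</span>
</div>
```

Wrapping the details you already publish in these classes is usually all it takes; the visible page can look exactly the same.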

Going Places
Use the correct format and Google will recognize the review and put it on your Places page. Think of it as user-generated content that goes directly to the place where it will do the most good. Once you set up the instructions – in the correct format – the reviews will end up where they should be. It is no longer necessary to rely on so-called authority sites; the information can be drawn from your own business Web site.
We understand that Google presents this in two distinct ways: individual reviews and aggregate reviews. The first is used to format pages that will display a single review. If you plan to use a page for a number of reviews or for summaries of several reviews you should use the aggregate method.
Here’s how Google explains it: “You can mark up either individual reviews (for example, an editor’s review of a product), or aggregate review information—for example, the average rating for a local business or the total number of user reviews submitted.”
If the page will have an editor’s review or an expert review that should stand as individual content but also has a collection of user reviews you should simply choose a format. Again, we understand that Google will work this way: “Use the individual Review format to mark up the editor’s review OR you can use the Review-aggregate format to summarize the set of user reviews. If a page contains both types of markup, Google will use the Review-aggregate markup for display.”
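As an illustration of the aggregate case, here is a sketch of hReview-aggregate markup summarizing a set of user reviews. The business name and the numbers are placeholders, and you should check Google’s current rich snippets documentation for the exact property names it supports:

```html
<!-- hReview-aggregate sketch; the name, rating and count are placeholders. -->
<div class="hreview-aggregate">
  <span class="item"><span class="fn">Acme Web Design</span></span>
  Rated <span class="rating">4.5</span> out of 5,
  based on <span class="count">27</span> reviews.
</div>
```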
When you start to use this process to enhance your business presence make sure you and your Web site folks understand the concept of properties, as defined by Google for this purpose. The online behemoth explains that reviews “contain a number of different properties which you can label using microdata, microformats or RDFa markup.”

A good example presented at: Google Microformats

Hreview Individual

hreview aggregate

An interesting discovery today: a search on www.google.com.au showed some different-looking SERPs. I noticed Google has moved the URL to the upper left-hand side, just below the website’s title tag and before the description.

Google Moves URL Location in Search

It used to be below the description tag on the lower left hand side.

Google Moves URL Location in Search Old

It is quite interesting how different the SERPs look simply from moving the placement of the URL. I am not exactly sure why they would make this move, but it is clear that having sitelinks becomes more important than ever.

It appears that about a year ago people at WebmasterWorld had seen this within AdWords: http://www.webmasterworld.com/google_adwords/4142468.htm

In addition to bringing you the most relevant results, search engines often compete over presenting the most up-to-date pages to the searcher. That’s why Google has those time-related filters on the left, just below the “type” filters. One might think recent content matters only to news-seekers, but Google thinks otherwise. Long before the recent “Panda” update to its indexing algorithm, which has been talked about all over the world during the last week, Google made numerous adjustments to its ranking rationale, with frequently updated websites getting “bonuses” in SE placements.

Yet another step in the same direction was taken several days ago, although no official announcement has been made. It seems Twitter is getting more credit within Google, which has decided to present recent tweets in the search results. In addition, the results also show the user’s picture. More important is the fact that the link in the tweet is included in the SERPs, making it a valuable inbound link for the featured website.

It has to be noted that the above only applies to recently posted tweets (the exact window could not be determined, but from my testing it is probably several hours), after which the results return to the usual “join Twitter to follow” snippet. If you want to see those results, by the way, it is advisable to include the word “Twitter” in your search query.

Google shows a tweet

With the recent update to Google’s algorithm (quickly dubbed the “Farmer” update, as it seriously affects the so-called “content farms”) and Blekko’s removal of twenty well-known websites from its results, it seems that fighting spam is the hottest issue in the search engine market.

Indeed, when we face a certain enemy, it is very advisable to learn as much about it as we can. So what is this “spam”? The answer is clear – something annoying and useless. The first occurrence of spam is said to have happened in the 19th century, when many honorable English gentlemen received an urgent telegram with advertising content.

When we are talking about search results, however, spam is not easily defined. Usually it means irrelevant pages that happen to contain a keyword. But that was handled a while ago; the search algorithms are far more advanced than they were 10 years ago, when one could fill a page with meaningless phrases and get a high SE ranking.

The problem has shifted to well-written content (grammatically, that is) which provides little useful information. It keeps repeating the same things again and again, so while it looks like a “normal article” to the bot/spider, for a human being it is simply a waste of time. That’s what “content farm” means – a website with constantly generated and frequently updated content which has little value in it. That’s what Blekko and Google are fighting. The problem is that it is technically very hard to distinguish between “useful” and “useless” content – even for a human, let alone an indexing bot.

Numerous websites have reported that Google is ignoring their page title tags, replacing them with something “equivalent.” The discussion started on the WebmasterWorld forum, with several upset webmasters claiming that Google was showing page titles different from what is included in the page HTML.

Well, instead of being upset, I would rather try to understand the issue. Not a big fan of Google myself, I still recognize that whatever they do, they do for a reason. Matt Cutts says that “We (Google – J.S.) reserve the right to try to figure out what’s a better title.” Of course, one can shout “who are you to determine a better title for my page,” but the answer is pretty simple – they are GOOGLE, the world’s number one search engine. And with the spam issue so hot, their quest to fight “crawler-fooling techniques” is understandable.

So, when does Google try to find an alternative title for a page? According to Google’s John Mueller, this happens when “the titles are particularly short, shared across large parts of the site or appear to be mostly a collection of keywords.” Those titles are regarded as “inappropriate” by the search engine that will try to replace them with “other text on the page”.
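Out of curiosity, here is a rough sketch of checks along the lines John Mueller describes: titles that are very short, duplicated across pages, or just keyword lists. The thresholds and the comma-count keyword test are my own illustrative guesses, not Google’s actual rules:

```python
# Flag title tags that look short, duplicated across the site, or like a
# bare keyword list. Thresholds here are illustrative guesses only.
from collections import Counter

def weak_titles(pages):
    """Return URLs whose <title> looks short, duplicated, or keyword-stuffed.

    `pages` is a list of (url, title) tuples.
    """
    counts = Counter(title for _, title in pages)
    flagged = []
    for url, title in pages:
        too_short = len(title) < 10
        duplicated = counts[title] > 1
        keyword_list = title.count(",") >= 3  # e.g. "seo, links, rank, traffic"
        if too_short or duplicated or keyword_list:
            flagged.append(url)
    return flagged

pages = [
    ("/home", "Home"),
    ("/about", "Web Moves | SEO and Web Design Services"),
    ("/blog", "Web Moves | SEO and Web Design Services"),
    ("/spam", "seo, links, rank, traffic, cheap"),
    ("/contact", "Contact Web Moves for a Free SEO Quote"),
]
print(weak_titles(pages))
```

Running a pass like this over your own site is a quick way to spot the titles Google might consider replacing, long before it does.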

Some might see this as a violation of rights and yet another step towards global domination by the greedy Google. What I see is the basic principle of SEO: offer solid, interesting content on your website and you will rank high. A page title should definitely be remarkable and unique, giving the user the most important information about the page. So instead of complaining about Google’s policy, go and check your page titles. If they are good, I am sure Google won’t touch them.