Website drops in the SERPs - Possible penalty? Block Googlebot
Well, in the past, most rapid drops in a website's search engine rankings were caused by off-site factors: bad links, too many links too fast, too many exact-match anchor text links, too many footer or site-wide links, etc. Recently we are seeing more keyword-specific or page-level penalties that are turning out to be caused by on-site factors. Google is starting to look carefully at websites. They are becoming picky about internal link structure and placement of navigation, content quality and placement, and general on-site over-optimization is becoming a BAD thing.
We were trying to sort out why a particular website we were working on dropped hard in the rankings. The backlink profile was not bad, and the links were not built overnight. The website was old enough (dating back to 2006) and had been growing organically for several years. We tweaked a few things on site and off site to no avail. So we decided to try something outside of the box: we blocked Googlebot. Why block Googlebot? It let us see whether the problem was off site or on site. Lo and behold, within 2 weeks we had lost a ton of long-tail traffic, but we recovered all the keyword search rankings that had been dropping. Which showed us that the backlinks alone were fine, and strong enough to carry the website in the SERPs without any content :-).
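For anyone wanting to run the same test, blocking Googlebot is typically done with a robots.txt rule along these lines (a minimal sketch; this blocks crawling of the whole site for Googlebot only, and the effect on rankings takes time to show, as it did in our case):

```text
# robots.txt — block only Google's crawler; other bots are unaffected
User-agent: Googlebot
Disallow: /
```

Note that a robots.txt block stops crawling, not necessarily indexing, so already-indexed pages may linger in the results for a while. Remember to remove the rule once the test is done.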
Now we know we have some on-site work to do.