21st Century Renaissance Man

Jan 11, 2013

Penguin 2.0 and How It Didn’t Change Everything

Panic in the SERPs
        The recent update to Google’s search algorithm, a continuation of the early 2012 Penguin update, has certainly ruffled a few waterproof, insulating feathers since its late-May introduction. Like its predecessor, Penguin 2.0 focused on increasing the quality of search results by penalizing sites for spam, keyword stuffing, and other “black-hat” SEO techniques. As with previous updates, many sites that had been on the front page of the search engine result pages (SERPs) found themselves kicked down to the second, fourth, even the seventh page. This has understandably caused a somewhat panicked reaction among those whose job it is to keep their clients on the front page as much as possible. Some have even bemoaned the death of “old SEO” or hailed the coming, Age-of-Aquarius-style, of a “new SEO” in the form of a quickly trending buzzword: “content marketing”. What we’ll show you in this post is that content marketing is really just the same as regular marketing, and Penguin 2.0 won’t be changing the business as much as some would like to think.
What is this Penguin 2.0 business?
        Simply speaking, Penguin 2.0 focuses on the same goal as the first update in its line: ensuring that websites offering low-quality user experiences don’t rank as high as those with a more user-centric outlook. Rather than broadening the amount of information the search engine’s algorithm takes into account, Matt Cutts and his team deepened it. In essence, Penguin 2.0 attempts to see beyond shallow facades of “content” in an ongoing attempt to merge the previously separate worlds of what is attractive to search engines and what is attractive to the users of those search engines.
        In one of these attempts to get a deeper grasp on what makes a site user-centric, the new update takes the penalization of duplicate content a little farther. Rather than just counting pattern matches, Google wants its algorithm to identify pages whose content is similar to one another, finding content that, while it may be “unique” to a computer, would look and feel remarkably similar to a user. Sites may have taken a hit from this aspect of the update due to content that has become outdated, or pages whose purpose has been outmoded or duplicated elsewhere on the site. One step toward recovery is an audit of what needs to be removed, consolidated, or redirected with a permanent 301.
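To illustrate the last of those options, here is a minimal sketch of a permanent 301 redirect in an nginx server configuration. The paths are hypothetical, standing in for an outdated page being consolidated into a current one:

```nginx
# Permanently redirect an outdated page to its consolidated replacement.
# A 301 tells search engines the content has moved for good, so the old
# URL's ranking signals pass to the new one instead of counting as a
# duplicate. Paths here (/old-services/, /services/) are examples only.
location = /old-services/ {
    return 301 /services/;
}
```

Apache users can achieve the same thing with a `Redirect permanent` directive in an `.htaccess` file; either way, the point is that the engine, and the user, land on a single canonical page.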
        Other factors that have been a focus of this update include: quick fixes and band-aid solutions applied in place of lasting, best-practice changes, which now lead to penalizations; sites diverse in focus being placed lower in priority than sites more tightly aimed in their subject matter; frequent changes in SEO strategy being treated by Google as a signal of over-optimization; and even slow sites suffering in the SERPs as Google emphasizes better UX.
So What Should We Change?
        It depends on what is meant by ‘we’. If by ‘we’ you mean a site that has been struck by the changes Google made in the past month, then there are a few things you could be doing. Planning your content around expertise, focusing on user experience, and making sure your website runs fast and clean are all good measures to stay ahead of Penguin and future updates. If by ‘we’ you mean the SEO business as a whole… not a lot. All the strategies of good SEO right now are still applicable and will be for a long time. A strong, easy-to-use website design, natural link-building, and a steady stream of content are all still very, very relevant. Google’s Matt Cutts says these changes are meant to improve the average Google user’s experience, so the search-engine giant is going to want to emphasize websites that meet that high standard. Show that somebody’s home and that somebody cares about the user, and you’ll see your SEO strategies fly.
