It has been just over a month now since Penguin 2.0 was officially rolled out. Although it is the fourth Penguin-related launch that Google has released (which is why some are calling it Penguin 4.0), the two releases in between were simply data refreshes, whereas this release is an updated algorithm — so it is only the second update likely to make a significant impact on web spam, indexing, and Google's methodology.
Much of this algorithm update is aimed at spammy links and queries, including advertorials: adverts or articles created to look like independent, unbiased reviews of products or services, when in reality they are created by, or paid for by, the owners of those products. Advertorials are not a problem in themselves, but they must be declared as such, so readers know they are looking at a heavily biased review that should not be taken at face value.
Matt Cutts, head of Google's Webspam team, sums up advertorials and their issues very well in the following video:
As mentioned in the video above, it has always been Google's policy that a paid advert or review should not pass PageRank at all. When advertorials are not labelled as such, they violate this rule because they appear genuine; paid content should be clearly labelled, and Penguin 2.0 will address this.
Spam always has been, and always will be, an issue that plagues the internet, but Google are continually fighting a winning battle against it, and this update only strengthens their position. Hacked sites are also an issue, and Google have targeted those with Penguin 2.0 too, aiming to provide website owners with better information and faster, more precise notifications through Google Webmaster Tools. Google will also analyse hacked sites much more efficiently than before, making detection more effective. On the subject of Webmaster Tools, Google are planning improvements there as well, to give users more information: more assistance, and links to help when diagnosing a site problem. These are small changes intended to refine the huge amount of information already available within the tools.
If you have been hanging around Black Hat forums and discussing ways to get a one-up on Google with some shady spam or SEO tactics, then Matt Cutts promises ‘A more eventful summer for you’. This can only mean good things for the rest of us in the White Hat world.
Authority detection is also set to improve now that Penguin 2.0 is live. In the medical field, for example, searches will surface reputable companies and representatives from that sector. Google is aiming to give more weight to sites run by reputable authorities, which, in turn, will give more accurate results to the end user.
Now that Penguin has been active for over a month, a significant amount of spam and misrepresented sites have been removed from listings, and some people who (whether intentionally or not) have a lot of spammy backlinks from link farms and the like have been very negatively impacted by the update, essentially dropping off the face of Google. For everyone else, this is a good thing, as it shows that Black Hat is not the way forward: even if you do find a way to get around the system, when the system catches up with you, your website can be rendered almost entirely useless.