The U.K.-based business used reverse engineering to identify the factors of Penguin 2.1 that impact search results. MathSight’s data show that websites gaining or losing traffic after Penguin 2.1 had links from pages whose body copy sat at the extremes of three readability measures: the proportion of rare words, the number of words per sentence, and the number of syllables per word.
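MathSight has not published its formulas, but the three signals it names are standard text statistics. As a hypothetical sketch (the common-word list and thresholds here are stand-ins, not MathSight's actual methodology), they could be computed like this:

```python
import re

# Tiny stand-in for a real word-frequency list; MathSight's actual
# definition of "rare" is not public.
COMMON_WORDS = {
    "the", "a", "an", "and", "or", "of", "to", "in", "is", "it",
    "that", "for", "on", "with", "as", "was", "are", "this", "be", "by",
}

def syllable_count(word: str) -> int:
    """Rough heuristic: count runs of vowels, minimum one per word."""
    return max(1, len(re.findall(r"[aeiouy]+", word.lower())))

def readability_features(text: str) -> dict:
    """Compute the three readability measures named in the article."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    rare = [w for w in words if w.lower() not in COMMON_WORDS]
    return {
        "rare_word_ratio": len(rare) / len(words),
        "words_per_sentence": len(words) / len(sentences),
        "syllables_per_word": sum(syllable_count(w) for w in words) / len(words),
    }

feats = readability_features("The cat sat on the mat. It was happy.")
print(feats)
```

A link-audit tool built on this idea would score the body copy of each linking page and flag links from pages whose scores fall outside whatever band the rest of a site's healthy backlinks occupy.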
“The readability of content from a linking web page is highly influential to how Penguin views the destination site, that is, the site being linked to,” MathSight’s Managing Director Andreas Voniatis told Search Engine Watch. “Websites should eliminate links from sites that don’t meet the readability thresholds Penguin demands.”
While these results have yet to be confirmed by Google, they do align with common perceptions of the company’s search algorithms. Increasingly, SEO trends have revolved around quality content marketing, driven by Google’s push to provide its users with the best possible answers to their questions.
Google has a unique interest in preserving and improving web content quality on an ongoing basis. comScore’s qSearch analysis of the U.S. search marketplace for December 2013 shows that Google continues to tighten its grip on the industry. Last month, Google sites led the search market with 67.3 percent of all search queries, totaling approximately 12.3 billion unique searches. What’s more, 68.6 percent of total search queries on Google carried organic search results.
Photo credit: Wikimedia Commons