Analysing the Google Penguin algorithm has been a passion of many SEO professionals since well before the latest update. Recently, MathSight revealed some details about how the algorithm changed after the update. Even before the release of Penguin 2.1, Andreas Voniatis, managing director of MathSight, was saying that understanding how Penguin works requires careful thought.
MathSight's data shows that the traffic gains and losses caused by Penguin 2.1 were tied to links to web pages that contain:
MathSight's data supports the theories SEO professionals hold about linking to poor-quality sites, and the idea that low-quality content is penalised. According to Voniatis, Penguin appraises the readability of the content on a linking web page, as well as how that site is linked to other sites. His advice is to remove links to sites that do not meet the readability standards Penguin requires.
How does version 2.1 differ? Voniatis answers that the algorithm has been modified to base its readability assessment concretely on the Flesch-Kincaid metrics. From this we can infer that Google is tightening its readability rules to put a stop to web spam. According to Voniatis, the formula used to determine readability on the Flesch-Kincaid scale is:

RE = 206.835 - (1.015 x ASL) - (84.6 x ASW)

RE = Reading Ease
ASL = average sentence length (the number of words divided by the number of sentences)
ASW = average number of syllables per word (the number of syllables divided by the number of words)

What is still not clear is why Google has changed its readability parameters, but Voniatis says SEO professionals can manually check every linking web page against the Flesch-Kincaid and Dale-Chall scales using free online tools.
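For readers who would rather estimate the score themselves than rely on an online tool, here is a minimal Python sketch of the Flesch Reading Ease calculation above. The function names and the vowel-group syllable counter are illustrative assumptions, not MathSight's or Google's implementation; accurate English syllable counting really needs a dictionary or a more elaborate heuristic.

```python
import re

def count_syllables(word: str) -> int:
    # Rough heuristic: count groups of consecutive vowels,
    # with a common adjustment for a trailing silent 'e'.
    word = word.lower()
    count = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and count > 1:
        count -= 1
    return max(count, 1)

def flesch_reading_ease(text: str) -> float:
    # RE = 206.835 - (1.015 x ASL) - (84.6 x ASW)
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    asl = len(words) / len(sentences)                             # average sentence length
    asw = sum(count_syllables(w) for w in words) / len(words)     # average syllables per word
    return 206.835 - (1.015 * asl) - (84.6 * asw)

if __name__ == "__main__":
    sample = "The cat sat on the mat. It was a sunny day."
    print(round(flesch_reading_ease(sample), 1))  # higher score = easier to read
```

Running it on a page's body text gives a score on the Reading Ease scale, where higher values indicate simpler, more readable copy.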