Closely following Google Panda doesn't make sense

Last week the Google Panda 4.2 update was officially launched, and Google said the new refresh will take months to roll out fully. That revelation comes at a cost, at least for SEOs and companies who closely follow Google's algorithms.

Google's logic behind this new approach is to assess sites more closely and improve SERP quality, thereby encouraging webmasters and site owners to create better, higher-quality sites.

The new Google Panda

Usually, Google updates roll out over a very short time frame, with results that are more black and white. Early versions of Google Panda in fact produced some sharp changes, with many sites disappearing overnight, and certainly not without false positives.

In the past, sites hit by a Panda update that put in the work were able to get back into the algorithm's good graces and see a slow recovery (and vice versa), with SEOs quite often complaining about the outcomes.

Perhaps tired of all the speculation generated on the net, Google's engineers are now trialling this new approach, which is not guaranteed to be limited to Google Panda, of spreading the update over months rather than days, "promising" a constant, gradual re-evaluation of every site.

The new "slow cooking" approach, however, reduces the usefulness of tools that flag an algorithm release, such as the one offered by SearchMetrics. Google aims to reinforce the idea that attributing a loss of visibility to a specific algorithm is not always possible, as it is more likely to be due to several factors at once.

In a nutshell, knowing that a given algorithm has been released can no longer drive short-term strategy, because correlation is not causation.