A recent survey found that 68% of visitors click away from a website if it loads too slowly. And yet the same research also reveals that most website owners believe their site is fast enough.
Both Google and Yahoo! have stated on more than one occasion that speed is one of the most important factors in giving web users the best experience.
It is no surprise, then, that search engines reward faster websites with higher rankings, which in other words means more traffic and lower costs per click.
The long, long story
Everything began two years ago, when Google gave webmasters the Page Speed browser extension to measure how fast a website was. Then came the API, an online tool for developers that produced a series of recommendations which could be rebranded in their own products.
Finally, last year, unhappy with how slowly webmasters were implementing their speed hints, they released an Apache module called mod_pagespeed: a module installed on the server that takes care of page optimisation and compression automatically, so as to improve load times.
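To give an idea of how little effort the module asks of the webmaster, here is a minimal sketch of what enabling it in an Apache configuration could look like. The module path and the chosen filters are illustrative; the actual values depend on your installation and on which optimisations you want to apply.

```apache
# Load the mod_pagespeed module (path varies by distribution)
LoadModule pagespeed_module /usr/lib/apache2/modules/mod_pagespeed.so

# Turn the automatic rewriting on
ModPagespeed on

# Example filters: strip whitespace and HTML comments on the fly
ModPagespeedEnableFilters collapse_whitespace,remove_comments
```

Once enabled, the module rewrites pages as Apache serves them, so the optimisations apply without touching the site's source code.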
However, other factors also contribute to the final load time, such as server performance and the caching system.
As these elements are often outside the webmaster's control (unless they are savvy enough to avoid cheap hosting), at the end of July Google released the latest addition to their speed (obsession) toolset: the Page Speed Service.
The Page Speed Service is, in short, a DNS-based caching system hosted somewhere in Google's data centers. It makes a copy of your site (or at least of its most accessed pages) and serves that copy every time someone requests a page from a site using the service.
The word DNS above should have rung a bell; if it did not, let me briefly explain what happens. By pointing your DNS to Google, every time your website is requested, the first server to answer is one of Google's. Google returns the cached copy of your page in the first instance, and retrieves a fresh copy only if the page is outdated or does not exist in the cache yet.
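The serve-from-cache-or-refresh behaviour described above is a classic TTL cache. This is only an illustrative sketch of that logic, not Google's implementation; the `EdgeCache` class, its `ttl_seconds` parameter, and the `fetch_from_origin` callback are all hypothetical names:

```python
import time

# Hypothetical sketch of the caching logic described above: serve the
# cached copy while it is fresh, otherwise fetch a new one from the
# origin server and store it.
class EdgeCache:
    def __init__(self, fetch_from_origin, ttl_seconds=300):
        self.fetch = fetch_from_origin   # callable: url -> page content
        self.ttl = ttl_seconds           # how long a copy counts as fresh
        self.store = {}                  # url -> (content, stored_at)

    def get(self, url, now=None):
        now = time.time() if now is None else now
        entry = self.store.get(url)
        if entry is not None:
            content, stored_at = entry
            if now - stored_at < self.ttl:
                return content           # fresh copy: no origin hit
        # Outdated or missing: retrieve a fresh copy from the origin
        content = self.fetch(url)
        self.store[url] = (content, now)
        return content
```

With a 300-second TTL, a second request for the same page within five minutes never touches the origin server, which is exactly where the speed gain comes from.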
So, for this mechanism to work, your domain's DNS needs to point at Google's servers. Everything else happens backstage. According to Google, users won't even notice the difference, apart from the faster response.
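In practice, "pointing your DNS at Google" means editing one record in your zone file. A sketch of what that change might look like in a BIND-style zone, assuming the service assigns you a Google-operated hostname as the target (the hostname below is illustrative):

```
; Before: www resolves directly to your own server
; www   IN  A      203.0.113.10

; After: www is a CNAME to a Google-operated hostname, so Google's
; servers answer first and serve the cached copy of the site
www     IN  CNAME  ghs.google.com.
```

From that moment on, every visitor's request lands on Google's infrastructure first, and your own server is contacted only when the cache needs refreshing.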
I’m scared of the Google Page Speed Service. Are you?
Although I appreciate Google's effort to make a better WWW, I'm genuinely scared of pointing my domains' DNS entries to Google's data centers. Their numbers, up to a 60% speed improvement, are very appealing, and the idea of having a cached version of my sites served to each user from the closest server, for a prompt response, is even better.
However, the dark side of this caching mechanism is the massive amount of power Google gains from it. They will be able to collect information on practically everything. Some examples:
- What are the most visited pages of a site?
- What is the real number of visitors?
- What are, in theory, the real time on page and bounce rate (provided your site also has Google Analytics installed, which is quite likely)?
Having said that, this caching approach, which could help in particular scenarios such as a hosting plan with limited bandwidth, is of little use for high-volume, database-driven websites.
These sites are normally the ones that take the longest to load, yet their pages are built dynamically with a server-side language to serve different content for each request, so it is unlikely they will ever point their DNS at Google. What, then, is the real advantage of this Page Speed Service, if not favouring Google alone?
I could be wrong, but that's my opinion, after all. If you see it differently, I will be glad to read your comments.