Google hints at new ranking factors

In a recent interview, Google engineer Matt Cutts talked about how the search engine is going to rank websites in the future; in particular, he hinted that rankings may somehow be affected by the time a website takes to load.

So SEO won’t be the only “trick” a website can rely on; the magic will shift to other factors, such as the webmaster’s (or web agency’s) real ability to build a smart website in terms of code efficiency, and a good hosting company with enough bandwidth and server RAM (to cite just a couple of important factors).

Cutts also revealed the existence of a new Firefox plug-in called Page Speed, which integrates into the well-known Firebug and adds a tab that measures page speed across a list of different factors.

Like most SEOs and webmasters out there, I ran some tests myself, and I was surprised to find an item about caching in the issue list.

As you probably already know, most web pages include resources that change infrequently, such as CSS files, images, JavaScript files, PDFs, and so on.

These resources take time to download over the network, which increases the time it takes to load a web page.

HTTP caching allows these resources to be saved (cached) by the browser (locally, according to its settings) or by a proxy, making subsequent downloads faster. Once a resource is cached, the browser can use the local copy instead of downloading it again on the next visit. Using the cache means reducing round-trip time by eliminating redundant HTTP requests, which cuts the page load time while significantly reducing bandwidth and hosting costs for your site.
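Just to give an idea of what is involved, here is how the caching-related headers of a long-lived static resource might look; the values below are purely illustrative, not a recommendation:

    HTTP/1.1 200 OK
    Content-Type: text/css
    Last-Modified: Thu, 19 Nov 2009 18:06:00 GMT
    Expires: Fri, 19 Nov 2010 18:06:00 GMT
    Cache-Control: public, max-age=31536000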

Caching is therefore one of those aspects every web player should take care of. Needless to say, not using any server-side technology (with which you can control practically everything, including the response headers) or not having direct access to the server configuration can be a problem.

HTML Meta Tags and HTTP Headers

Let me clarify my previous statement. Although authors can add an Expires meta tag to the document’s head section, for caching purposes this tag is useless: it is only honored by the browsers that parse it and act accordingly, and never by proxies, which do not read the HTML at all.

If you are considering the Pragma header instead, be aware that it won’t necessarily keep the page fresh either: HTTP only defines Pragma: no-cache for requests, so caches are free to ignore it in responses.

HTTP headers, on the other hand, give you much more control over how both browser caches and proxies handle your web pages, even though they are never shown in the HTML code (they are added to the response by the web server itself).
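Because those headers never appear in the page source, the quickest way to see what your server actually sends is Firebug’s Net panel. If you prefer to check programmatically, a minimal JavaScript sketch like the one below also works (same-origin resources only; the CSS path is just a placeholder for one of your own files):

    // Fetch a resource with XMLHttpRequest and print its caching-related
    // response headers to the console (Firebug provides console.log).
    function showCachingHeaders(url) {
      var xhr = new XMLHttpRequest();
      xhr.open("GET", url, true);
      xhr.onreadystatechange = function () {
        if (xhr.readyState === 4) {
          var names = ["Cache-Control", "Expires", "Last-Modified", "ETag", "Pragma"];
          for (var i = 0; i < names.length; i++) {
            console.log(names[i] + ": " + xhr.getResponseHeader(names[i]));
          }
        }
      };
      xhr.send(null);
    }

    showCachingHeaders("/css/style.css");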


The solution is…

I thought about how to solve the problem in several ways, but none of them was directly applicable to a static web page and its embedded resources like CSS and JavaScript. The only workable idea I found is a solution for downloadable files (PDF, PPT, documents in general): they can be requested with a simple piece of JavaScript that uses an XMLHttpRequest, while the headers involved in the exchange are adjusted along the way.

Unfortunately, this solution only works for internal links: any external link that points directly to the file will get whatever headers have been set up by default on the server.
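To make the idea a bit more tangible, here is a minimal sketch of one way to read it: prefetch the internal document links with XMLHttpRequest, so that when the server does allow caching the browser already holds a copy by the time the visitor clicks. The file-extension matching and the prefetch-on-load behaviour are my own assumptions for the example, not finished code:

    // Hedged sketch: prefetch internal documents (PDF, PPT, DOC) linked from
    // the current page. Same-origin (relative) links only; external links will
    // keep whatever headers their own server sends.
    function prefetchDocuments() {
      var links = document.getElementsByTagName("a");
      for (var i = 0; i < links.length; i++) {
        var href = links[i].getAttribute("href");
        if (href && href.indexOf("http") !== 0 && /\.(pdf|ppt|doc)$/i.test(href)) {
          var xhr = new XMLHttpRequest();
          xhr.open("GET", href, true);
          xhr.send(null);
        }
      }
    }

    // Run after the page has loaded, so all the links are in the DOM.
    window.onload = prefetchDocuments;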

So, if caching really does become an issue worth pursuing, it will be beneficial for all of you to choose a hosting company, or to develop your site with a server-side technology, that allows you to change the headers very easily.
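To make that last point concrete: with a server-side technology the change is usually a couple of lines. Purely as an illustration (and not tied to any particular host or stack), this is what it can look like in a JavaScript environment such as Node.js; PHP, ASP.NET or a short Apache configuration would achieve exactly the same result:

    // Illustrative only: a tiny Node.js server that attaches a caching header
    // to its responses. The one-week max-age is an arbitrary example value.
    var http = require("http");

    http.createServer(function (req, res) {
      res.writeHead(200, {
        "Content-Type": "text/css",
        "Cache-Control": "public, max-age=604800" // cache for one week
      });
      res.end("body { margin: 0; }");
    }).listen(8080);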

UPDATE: After a few days of frustration that turned into a real obsession with web server speed, I figured out how to start solving the server caching issue (Italian article).

Before concluding, let me also point to another interesting paper released by Yahoo! on how to speed up web pages.