Is Google hinting at new criteria for how it may rank sites?

In a recent interview, Google engineer Matt Cutts discussed how the search engine may rank websites in the future; in particular, he hinted that ranking could somehow be affected by a website's loading time.

So SEO won't be the only "trick" a website can rely on; the magic will now shift to other factors, such as the webmaster's (or web agency's) real ability to build a smart website (in terms of code efficiency), as well as the choice of a good hosting company with enough bandwidth and server RAM (to cite just a couple of important factors).

Cutts also revealed a new plug-in for Firefox, called Page Speed, that integrates into the well-known Firebug and adds a new tab that measures page speed against a list of different factors.

Like most SEOs and webmasters out there, I ran some tests myself, and I was surprised to find an item in the issue list regarding caching.

As you probably already know, most web pages include resources that change infrequently, such as CSS files, images, JavaScript files, PDFs, and so on.

These resources take time to download over the network, which increases the time it takes to load a web page.

HTTP caching allows these resources to be saved (cached) by a browser (locally, according to its settings) or by a proxy, making subsequent loads faster. Once a resource is cached, the browser can use the local copy instead of downloading it again on the next visit.

Caching therefore reduces round-trip time by eliminating redundant HTTP requests, which cuts the page load time while significantly reducing the bandwidth and hosting costs for your site.
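In header terms, "cacheable" boils down to a couple of response headers. Here is a minimal sketch (in Python, purely illustrative; the helper name is mine) of building them for a resource you are happy to have cached for, say, a week:

```python
from datetime import datetime, timedelta, timezone
from email.utils import format_datetime

def cache_headers(max_age_seconds):
    """Build HTTP response headers that mark a resource as cacheable.

    Cache-Control: max-age is honored by HTTP/1.1 caches;
    Expires is the HTTP/1.0 fallback, expressed as an absolute GMT date.
    """
    expires = datetime.now(timezone.utc) + timedelta(seconds=max_age_seconds)
    return {
        "Cache-Control": f"public, max-age={max_age_seconds}",
        "Expires": format_datetime(expires, usegmt=True),
    }

# One week of cacheability for a stylesheet, for example:
headers = cache_headers(604800)
```

Any browser or proxy that sees these headers is allowed to reuse its copy without contacting your server at all until the resource goes stale.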

Caching is therefore one of those aspects every web player should take care of. Needless to say, not using any server-side technology (with which you can control practically everything, including the response headers) or not having direct access to the server configuration can turn out to be a problem.

HTML Meta Tags and HTTP Headers

Let me clarify my previous statement. Although authors can add an Expires meta tag in the document's head section, for caching purposes this meta tag is useless: it is only honored by browsers that parse the HTML and act accordingly, and never by proxies (which don't read the HTML at all).

If you are considering the Pragma header instead, be aware that it won't necessarily keep the page fresh either.

HTTP headers, on the other hand, give you much more control over how both browser caches and proxies handle your web page, even though they never appear in the HTML code (they are appended automatically by the web server).
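As a concrete illustration of the control HTTP headers give you, here is a minimal sketch (Python, function name hypothetical) of the validation handshake that caches use: the server sends Last-Modified, the browser later asks again with If-Modified-Since, and the server can answer 304 Not Modified instead of resending the whole body.

```python
from email.utils import format_datetime, parsedate_to_datetime

def respond(resource_mtime, if_modified_since=None):
    """Return (status, headers): 304 when the client's cached copy
    is still current, otherwise 200 with a fresh Last-Modified."""
    if if_modified_since is not None:
        cached = parsedate_to_datetime(if_modified_since)
        if resource_mtime <= cached:
            # No body needed: the browser reuses its local copy.
            return 304, {}
    return 200, {"Last-Modified": format_datetime(resource_mtime, usegmt=True)}
```

Note that none of this is reachable from a meta tag in the HTML: only the server (or a server-side script) can take part in this exchange.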

The solution is…

I thought about how to solve the problem in several ways, but none of them was directly applicable to a static web page and its embedded resources, such as CSS and JavaScript. The only workable approach I actually found applies to external downloadable files (PDFs, PPTs, documents in general), which may be requested with a simple piece of JavaScript that uses an XMLHttpRequest while the response headers are adjusted accordingly.

Unfortunately, this solution works only for internal links, because any external link that points directly to the file will get whatever headers the server sends by default.

So, if caching really does become an issue worth pursuing, it will pay off for all of you to choose a hosting company, or to build your web pages with a server-side technology, that lets you change the headers very easily.
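To show what "changing the headers easily" can look like with a server-side technology, here is a sketch of a tiny WSGI app (Python; the function names and structure are mine, not a standard API) that serves static files and attaches an explicit Cache-Control header to every response:

```python
import mimetypes
import os

def make_app(root, max_age=86400):
    """Minimal WSGI app serving files under `root` with cache headers.

    Purely illustrative: a real app would also guard against
    path traversal ('..'), support conditional requests, etc.
    """
    def app(environ, start_response):
        path = os.path.join(root, environ.get("PATH_INFO", "/").lstrip("/"))
        if not os.path.isfile(path):
            start_response("404 Not Found", [("Content-Type", "text/plain")])
            return [b"not found"]
        ctype = mimetypes.guess_type(path)[0] or "application/octet-stream"
        with open(path, "rb") as f:
            body = f.read()
        start_response("200 OK", [
            ("Content-Type", ctype),
            # The whole point: the header is ours to set.
            ("Cache-Control", f"public, max-age={max_age}"),
        ])
        return [body]
    return app
```

The same idea applies to PHP, ASP, or any other server-side stack: once a script sits between the file and the client, the caching policy is under your control instead of the server's defaults.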

UPDATE: After some days of frustration, I went through a real obsession with web server speed and figured out how to start solving the server caching issue (Italian article).

Before concluding, let me also point to another interesting paper released by Yahoo! on how to speed up web pages.

 6 Comments

 Started by Sean Carlos  19 November 2009

I personally think people should keep in mind that Google uses over 200 factors in the crawl-rank-return process; the latest Matt Cutts pronouncement always needs to be seen in that bigger perspective (as I'm sure Matt would say if asked).

Credit should be given to Yahoo - they came out with YSlow a while back: http://developer.yahoo.com/yslow/


 Started by Andrea Moro  19 November 2009

Hi Sean, and welcome to my blog.
I really appreciate that you found some time to join in with a comment.

I totally agree about the 200 (or more) factors; in fact, I'm not saying the opposite, or just "hey, use this and your website will rank". It was just a personal opinion about their new suggestion.

You are right, Yahoo! should possibly be credited for the speed angle, but we must note that they generally aren't so forthcoming when announcing key factors about their search engine, and the last time they were (meta keywords) they made a big mistake.


 Started by Carlo  19 November 2009

@Sean, that's right, I agree. Furthermore, I think that if ten web pages all have the same kind of on-page optimization and the same "number" of good inbound links, then loading speed could be useful for SEO. So after "content is king" we could start saying "server is the prince" ;-)


 Started by Andrea Moro  19 November 2009

Hi Carlo, and welcome back to my blog.

LOL. Server is the prince. Maybe a new mascot for some new blog.


 Started by christopher faron  25 November 2009

I don't think it will be a major factor, but like a lot of guidelines set by big G there is some sense to it, especially with the increasing use of the internet via mobile/USB dongles/GSM networks, which have much slower connections than xDSL.


 Started by Andrea Moro  25 November 2009

Hi Chris,

and welcome to my blog. I totally agree with you. It's just one factor to be aware of rather than a big issue. In any case, it's not something to underestimate.

