On the 9th of April 2010, Google announced via their Webmaster Central Blog that website page load times would become one of the factors in Google’s ranking algorithms. They went on to mention that the speed of a website, while being a factor, wouldn’t carry as much weight as, for example, PageRank.
This move is quite unusual for Google, as they don’t normally publicize ranking factors in this way, since doing so would allow people to manipulate the algorithm for their own purposes, regardless of the quality of their site.
SEO companies work by having a general idea of the factors Google takes into account and optimizing their clients’ sites towards them.
This in turn creates sites that have better content, in Google’s view, which allows Google to assert that its natural rankings reflect the best content on the web. Site speed is one factor the public can readily appreciate, since it affects their everyday searching and browsing.
In terms of what this means for website owners and SEO campaigns, it’s important to keep a sense of perspective. Below are a few observations that hopefully make the news a bit clearer.
1. This is good news for the web.
Slow sites are tiresome and universally disliked. The fact that Google have taken the public step of saying speed is ‘part of their ranking algorithm’ means webmasters will sit up, take notice, and start thinking about the fact that their 8.9 second load time just might be a problem, especially as it has been reported that a delay of even one second in loading time can cost a website approximately 10% of its traffic.
This means site owners should begin to weigh graphics-heavy designs against the implications of a slower site. Faster sites reward the site owner as well as the customer.
Google have a lot of influence over web development practices, and they are using it to maximum effect in their quest to make the web faster, recommending a range of tools and advice that site developers can use to make good sites load quickly.
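As a rough illustration of the kind of measurement these tools are concerned with, the sketch below simply times how long a page’s HTML takes to download. It is a minimal example of my own, not any particular Google tool, and the URL is a placeholder; a real page’s load time also includes images, CSS, JavaScript and rendering, which a bare fetch like this doesn’t capture.

```python
import time
import urllib.request


def measure_fetch_time(url: str) -> float:
    """Time how long it takes to download a page's HTML.

    Note: this measures only the raw HTML transfer, not images,
    CSS, JavaScript or rendering, so it understates the full
    load time a visitor actually experiences.
    """
    start = time.monotonic()
    with urllib.request.urlopen(url) as response:
        response.read()  # pull the whole body down
    return time.monotonic() - start


if __name__ == "__main__":
    # Placeholder URL, for illustration only.
    elapsed = measure_fetch_time("https://www.example.com/")
    print(f"Fetched in {elapsed:.2f} seconds")
```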
2. Google wants to reduce load on its own servers.
To reduce the load on its own servers, Google doesn’t want to waste resources on duplicate content. For every page of duplicate material it crawls, Google could instead be checking out another page of fresh content.
In the era of Caffeine and an internet that is growing at an exponential rate, can you imagine the efficiency savings Google could achieve if 50% of websites cleaned up their code a bit?
3. You still need good content and links.
Matt Cutts has said site speed is only going to be a factor in around 1% of cases – i.e. extreme situations. If you’ve got the best site and the strongest links, then a slightly sluggish server will probably not harm you too much.
Conversely, this announcement most certainly does not mean that if you strip your homepage down to three lines of text with no images, CSS, JavaScript or rich media elements, so that it loads in 0.01 seconds, it will suddenly climb up the SERPs. Like most things in life, it’s about balance.
4. Anyone for Chrome?
It’s not too far a leap from the Google search team’s focus on site speed to the Google Chrome browser, which undoubtedly offers a very fast browsing experience. If Google helps people focus on developing sites that load quickly, then surely that opens the door a little wider for people to consider switching to Chrome?
Tenuous, maybe, but in my view getting more people to use their browser is all part of Google’s master plan: learning your browsing habits and how likely you are to click on one of their sponsored links.