Google, Site Speed and SEO:
We don’t know much, but we do know something

Ever since Google announced that it now factors site speed into its search ranking algorithms, it’s played its cards somewhat close to its vest on what this actually means. That, of course, hasn’t stopped those of us in the SEO and performance communities from discussing the issue to bits.

I addressed the issue somewhat when Helen Overland at Search Engine People interviewed me a couple of weeks ago. Helen asked me if I could cite any specific examples of a site improving its search ranking after optimizing its performance. At the time, I said this:

Because Google’s addition of site speed to their search algorithm is still relatively new, and because they haven’t divulged the exact details of the algorithm changes, it isn’t possible yet to point to any specific examples of site speed improving search ranking.

And this:

We do know, however, that Google only considers a site fast if its speed is in the top 20% of its class, and there's a general assumption that any site that performs at the top of its class will get some kind of boost in the rankings. There's another assumption that sites that perform in the middle of the class – 20% to 60% – will probably not see any change in their ranking. And sites that perform at the bottom of the class – 60% and lower – will probably be penalized somewhat. So for site owners, this is the time to take a critical look at their site's speed – alongside their competitors' – and figure out where they rank and what they need to do next.*
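To make those thresholds concrete, here's a minimal Python sketch of how a site owner might bucket their load time against their competitors'. The load times are made up, and the bucket boundaries are Eric Enge's assumptions (see the footnote below), not anything Google has confirmed:

```python
def classify_site_speed(my_load_time, competitor_load_times):
    """Bucket a site's speed against its competitors.

    Thresholds follow the assumptions quoted above (via Eric Enge):
    top 20% = fast, 20-60% = middle, bottom 40% = slow. These are
    speculation about Google's algorithm, not confirmed numbers.
    """
    all_times = sorted(competitor_load_times + [my_load_time])
    percentile = all_times.index(my_load_time) / len(all_times)  # 0.0 = fastest
    if percentile < 0.20:
        return "fast (top 20% of class; possible ranking boost)"
    elif percentile < 0.60:
        return "middle (20-60%; probably no ranking change)"
    else:
        return "slow (bottom 40%; possibly penalized)"

# Hypothetical page load times, in seconds, for nine competing sites:
competitors = [2.1, 3.4, 4.0, 5.2, 6.8, 7.5, 8.1, 9.0, 9.9]
print(classify_site_speed(3.0, competitors))  # -> fast (top 20% of class; ...)
```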

Today, I want to address the first issue:

How one site improved its speed and, as a result, its Googlebot performance

Since talking with Helen, I've come across an example of a website that saw a dramatic improvement in how Googlebot crawled it after doing nothing more than speeding up its pages.

In March, we here at Strangeloop implemented a three-month trial of our site optimization service with a client. That trial has just ended, and we're now reviewing the results.

These graphs show Google’s crawl stats for the site over that time.

[Image: Crawl stats before and after web performance optimization]

Notice that the first and last graphs are almost inverses of each other. By the end of the Strangeloop trial, Googlebot was crawling roughly twice as many pages per day as it could at the outset, because Strangeloop had halved the time Googlebot needed to download each page.

As you probably already know, Google allocates either a set amount of time or a set amount of data for crawling each site. The more pages Googlebot can crawl within those limits, the more of the site gets indexed and the fresher that index stays – and pages that never get crawled can't rank at all.
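The arithmetic behind the doubling is straightforward. Here's a minimal sketch, assuming a purely time-based crawl budget with a made-up value – Google doesn't publish the actual budget or how it's allocated:

```python
def pages_crawlable(time_budget_s, avg_download_time_s):
    # With a fixed time budget, pages crawled scales inversely with
    # per-page download time (a simplification of the real behaviour).
    return int(time_budget_s / avg_download_time_s)

BUDGET_S = 600  # hypothetical: 10 minutes of Googlebot time per visit

before = pages_crawlable(BUDGET_S, avg_download_time_s=1.0)  # 600 pages
after = pages_crawlable(BUDGET_S, avg_download_time_s=0.5)   # 1200 pages
print(before, after)  # halving download time doubles pages crawled
```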

Benchmarking search rankings before and after optimizing site performance

Because we started this test before Google's announcement, we didn't think to benchmark the site's search ranking before implementing Strangeloop. But given this new crawl data, there's good reason to believe that improving the site's performance also helped its ranking.

In the future, we’ll look into benchmarking search ranking before and after implementing Strangeloop so that we can gather more data about how site speed improvements affect search engine rankings.

Have you seen any specific examples of a site that has improved or worsened its Google search ranking after speeding up or slowing down its performance? Let me know. If we can collect and analyze enough meaningful data from these case studies, it would make for a great white paper.

Tomorrow I’ll address my second point: how site owners who care about SEO can take a critical look at their website’s speed, calculate where they rank alongside their competitors, and assess what they need to do next.

*Credit for these assumptions goes to Eric Enge, who postulated these numbers in this post on Search Engine Land.
