TechCrunch: The slowest tech blog, or one of the fastest? Turns out, it’s both.

When I started writing this post, my intent was to do a bit of performance benchmarking for leading tech blogs. But what started off as a standard benchmarking exercise turned into a great opportunity to talk about another approach to understanding your performance numbers in a meaningful way, and why it’s important to look at your site from as many angles as possible.

Shortly after TechCrunch announced its acquisition by AOL yesterday, some folks in the performance community were quick to suggest that one of AOL’s first jobs should be to improve the site’s performance. People cited the fact that the main page takes more than 20 seconds to load, making TechCrunch a prime candidate for performance tuning.

But TechCrunch isn’t the only tech blog out there that readers complain about. So I decided to run some page tests to see how they measure up.

Time to interact (aka “load time”)

When you run a test via Webpagetest, this metric is called “load time”. Until recently I used that term, too, but lately I’ve started calling it “time to interact”. It refers to the amount of time before the document is complete and fully interactive. Whatever you call it, this is the number most people pay attention to.
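In a modern browser, a rough equivalent of this metric can be read from the Navigation Timing API. (This is not how the tests in this post were run — Webpagetest takes its own measurement — just a minimal sketch, using hypothetical mock timestamps for illustration:)

```javascript
// Sketch: deriving "time to interact" (load time) from Navigation Timing.
// In a real page you would pass performance.timing; the object below is
// hypothetical mock data.
function loadTimeSeconds(timing) {
  // loadEventEnd marks the point where the document is complete
  // and fully interactive.
  return (timing.loadEventEnd - timing.navigationStart) / 1000;
}

// Hypothetical timestamps (milliseconds since epoch):
const mockTiming = { navigationStart: 1000, loadEventEnd: 31168 };
console.log(loadTimeSeconds(mockTiming)); // 30.168
```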

| Website | Time to interact (first view, seconds)* |
| --- | --- |
| TechCrunch | 30.168 |
| GigaOM | 19.556 |
| LA Times: Technology Blog | 19.448 |
| Mashable | 17.868 |
| New York Times: Bits Blog | 15.297 |
| Technology Review | 14.973 |
| Average | 19.551 |

This is one of the worst industry benchmarks I’ve ever encountered. In fact, it’s suspiciously bad. There’s no way sites this slow could have such dedicated readerships.

I wanted to take another view, so I used Webpagetest’s video function to create a set of side-by-side videos to show how these sites load in real time (I’ve slowed it down so you can appreciate it in all its incremental glory):

This video reveals a couple of perspectives that you would miss if you just focused on load time…

Time to fully load: Interesting but not necessarily important

This is the amount of time it takes for every page element to load.

| Website | Time to interact (s) | Time to fully load (s) |
| --- | --- | --- |
| TechCrunch | 30.168 | 29.5 |
| GigaOM | 19.556 | 21.9 |
| LA Times: Technology Blog | 19.448 | 26.8 |
| Mashable | 17.868 | 39.6 |
| New York Times: Bits Blog | 15.297 | 29.1 |
| Technology Review | 14.973 | 17.2 |
| Averages | 19.551 | 27.35 |

These numbers aren’t a huge surprise, given that the number of calls to the server for these sites ranges between 129 (Technology Review) and 323 (TechCrunch).
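For what it’s worth, numbers like these can also be pulled programmatically: Webpagetest can return results in a machine-readable format, and a small script can summarize the metrics and request count. The object shape and field names below are hypothetical example data, not a guaranteed API response:

```javascript
// Sketch: summarizing a Webpagetest-style "first view" result object.
// The structure here is a hypothetical mock; consult the actual API
// response for the real field names.
function summarize(firstView) {
  return {
    timeToInteract: firstView.loadTime / 1000,   // ms -> seconds
    fullyLoaded: firstView.fullyLoaded / 1000,   // ms -> seconds
    requests: firstView.requests,                // calls to the server
  };
}

// Hypothetical mock data matching the TechCrunch row above:
const techCrunch = { loadTime: 30168, fullyLoaded: 29500, requests: 323 };
console.log(summarize(techCrunch));
// { timeToInteract: 30.168, fullyLoaded: 29.5, requests: 323 }
```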

Before I started this entire exercise, I made a blindfolded guess about the performance culprits for most of these sites: ads and other bits of third-party gak. But as I’ve discussed elsewhere, a decently optimized page can sort and prioritize third-party content so that visitors are served meaningful content first. So let’s add those numbers to our table…

Time to start drawing: Very important

I define “time to start drawing” as the amount of time it takes for meaningful content to appear on the screen. Let’s take a look:

| Website | Time to interact (s) | Time to fully load (s) | Time to start drawing (s) |
| --- | --- | --- | --- |
| TechCrunch | 30.168 | 29.5 | 5.8 |
| GigaOM | 19.556 | 21.9 | 6.4 |
| LA Times: Technology Blog | 19.448 | 26.8 | 6.1 |
| Mashable | 17.868 | 39.6 | 6.5 |
| New York Times: Bits Blog | 15.297 | 29.1 | 7.2 |
| Technology Review | 14.973 | 17.2 | 5.1 |
| Averages | 19.551 | 27.35 | 6.18 |

Now, I’m not saying that an average time of 6.18 seconds is good. There’s clearly room for improvement here. But it’s not outright terrible, and it’s lower than the Fortune 500 benchmark of around 7 seconds.

And check it out: at 5.8 seconds, TechCrunch is actually the second-fastest site surveyed, not the worst, as the initial page test results indicated.
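Today’s browsers expose a close cousin of “time to start drawing” through the Paint Timing API (again, not something the Webpagetest runs here used — a hedged sketch with hypothetical mock entries):

```javascript
// Sketch: reading "time to start drawing" from Paint Timing entries.
// In a browser you would call performance.getEntriesByType('paint');
// the array below is hypothetical mock data.
function startDrawingSeconds(paintEntries) {
  const paint = paintEntries.find(e => e.name === 'first-contentful-paint');
  return paint ? paint.startTime / 1000 : null; // ms -> seconds
}

const mockEntries = [
  { name: 'first-paint', startTime: 5400 },
  { name: 'first-contentful-paint', startTime: 5800 },
];
console.log(startDrawingSeconds(mockEntries)); // 5.8
```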

This exercise serves as a reminder of three things:

  1. The danger of focusing on just one piece of performance data
  2. The value of benchmarking your site against your competitors (though I’m still a staunch proponent of the audacious goal)
  3. And most important: the need to always look at your site from your visitors’ perspective

Edited 9/30/10 to add: I just realized that I accidentally published the wrong numbers in the “time to interact” columns. I had originally tested the sites on both IE7 and IE8, but decided in the end to focus on the results for IE7. However, I mistakenly entered the IE8 numbers in the “time to interact” column. Duly corrected now, with my apologies for any confusion. Note that the updated numbers still corroborate the thesis behind this post.

*All tests conducted on Webpagetest — IE7 on DSL via the server in Dulles, VA.
