Defining the Web Performance Problem: Payload, page complexity and the browser wars

If you talk to anyone in the performance industry, they’ll be happy to tell you what the nub of the performance problem is. And if you talk to three people in the performance industry, you’ll probably hear three different takes on the situation. This is mine.

Before diving in, we need to separate the wheat from the chaff. Chances are, you have a front-end problem. My opinions below reflect the key front-end issues I see. If your problem lies in the back end, in the generation of your HTML, my diagnosis below won't be accurate.

(If you don’t know which end your problem is at, do a test at webpagetest.org and analyze your site like Steve Souders does in this classic blog post from 2007.)
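Souders' point boils down to his Performance Golden Rule: only a small fraction of total load time is spent fetching the HTML document itself; the rest happens on the front end. Here's a rough sketch of that check in Python, assuming you've exported a HAR file from webpagetest.org (the field names follow the standard HAR 1.2 format; the sample numbers are hypothetical):

```python
def backend_share(har):
    """Estimate the back-end share of load time: time spent fetching the
    first (HTML) response versus the total time across all resources."""
    entries = har["log"]["entries"]
    html_time = entries[0]["time"]           # first request = the HTML document
    total_time = sum(e["time"] for e in entries)
    return html_time / total_time

# Hypothetical HAR fragment; "time" is in milliseconds, per the HAR spec.
sample_har = {"log": {"entries": [
    {"time": 200},   # the HTML document (back-end generation + transfer)
    {"time": 350},   # stylesheet
    {"time": 900},   # images
    {"time": 550},   # scripts
]}}

print(f"Back-end share: {backend_share(sample_har):.0%}")  # Back-end share: 10%
```

If that share comes out small, as it does here, your time is better spent on the front end.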

Moving right along…

Clearly, dozens of performance issues exist, but there are two main problems I see over and over again: the sheer size of web pages, and the way those pages are built for, and consumed by, different browsers. As if these two problems weren't pressing enough, there's a third: users' page-load expectations, which are currently outpacing most websites' ability to deliver.

Problem #1: Web pages are bigger and more complex than ever.

This graph illustrates pretty dramatically the magnitude of the page bloat that’s happened between 1995 and now:

Then: The average page size was a lean, mean 14.1k, because it contained just 2.3 objects. That means just 2.3 calls to whatever data centers were serving the site. (Well, not actually. It's impossible to make 0.3 calls to a data center. But we're dealing with averages here, hence the decimals. Work with me.)

Now: The average page size is 498k and contains about 75 objects – everything from CSS to images to JavaScript. That means 75 server round trips are needed to pull all the page's resources to the user's browser. The upshot: pages that load slowly and inconsistently.

Those of you with a CDN are probably feeling smug at this point, figuring this doesn't apply to you. Think again. Even if your round trips are only 30ms each, those milliseconds add up quickly when you have 75+ of them.
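The round-trip arithmetic can be sketched in a few lines. (The six-connections-per-host figure below is a common browser default, not a number from this post, so treat it as an assumption.)

```python
import math

ROUND_TRIP_MS = 30   # optimistic, CDN-backed latency per request
OBJECTS = 75         # objects on the average page

# Worst case: every object fetched one after another.
serial_ms = OBJECTS * ROUND_TRIP_MS

# More realistic: browsers open several parallel connections per host
# (6 is a common default -- an assumption, not a figure from this post).
PARALLEL = 6
parallel_ms = math.ceil(OBJECTS / PARALLEL) * ROUND_TRIP_MS

print(serial_ms, "ms if fetched serially")        # 2250 ms if fetched serially
print(parallel_ms, "ms with 6 parallel fetches")  # 390 ms with 6 parallel fetches
```

Even the parallel case eats a noticeable chunk of a two-second load budget, and that's network latency alone, before any server think time or rendering.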

You’ll notice that, when it comes to page size, the trend is toward growth. Rapid, unrelenting growth. If we follow the current trend, by 2012 we can expect the average page to grow to 684k and contain 83 objects.

Problem #2: Browsers don’t all work the same way.

It would be great if all browsers worked the same way and did everything exactly the same. And it would be awesome if all users upgraded when new browsers came out. It would also be great if we could colonize Mars or invent non-fattening ice cream. These things are all about equally probable.

While browser manufacturers have, in recent years, worked to comply with common standards, this isn’t always possible, especially given the drive to develop competitive proprietary features. And browsers are always going to lag slightly behind the most up-to-date standards anyway, because, well, let’s be honest: that’s just the nature of any product’s development life cycle.

In short: browsers just work differently from one another, like any other product. And as web developers who are developing for these browsers, we always have to be aware of these differences. (As an aside, if you’re interested in browser performance, you should read another Steve Souders post: his browser performance wishlist.)

And as for user adoption of newer browser versions: as far as many web users are concerned, if it ain't broke, don't fix it. Take my mom, who still feels perfectly comfortable with IE 6. So there you go.

As a result of all of this, we have a multi-browser universe in which each browser has its own preference as to how it consumes and displays pages.

So how do site owners respond to this issue? Well, in an ideal world, a website’s server environment would generate a unique version of each page of its site, optimized to suit the particularities of each individual browser. However, given the sheer volume of pages and the ever-shifting browser landscape, this is an enormous undertaking.
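In practice, site owners approximate this with content negotiation: inspect the request's User-Agent header and serve a variant tuned for that browser family. A minimal sketch follows; the browser families and variant names are illustrative, not a real taxonomy:

```python
def pick_variant(user_agent: str) -> str:
    """Choose a page variant by browser family.
    The categories and variant names here are hypothetical examples."""
    ua = user_agent.lower()
    if "msie 6" in ua:
        return "legacy"              # pared-down markup for ancient IE
    if "chrome" in ua or "safari" in ua:   # check Chrome first: its UA
        return "webkit-optimized"          # string also contains "Safari"
    if "firefox" in ua:
        return "gecko-optimized"
    return "baseline"                # safe default for everything else

print(pick_variant("Mozilla/4.0 (compatible; MSIE 6.0; Windows NT 5.1)"))
# legacy
```

Maintaining tables like this across every page and every new browser release is exactly the enormous undertaking the paragraph above describes, which is why so few sites do it well.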

The result: servers shoving the same inefficient pages at a wide variety of browsers, the equivalent of forcing square pegs into round holes.

And we can’t ignore the most important – and most uncontrollable – element in the web performance conundrum:

Problem #3: Users don’t care about problems 1 and 2.

Users are increasingly demanding:

Notice how it took seven years for user expectations to halve, from 8 seconds to 4 seconds, but then it only took three years for those expectations to halve again, down to 2 seconds. Scared? We should be.

These users don’t care about your awesomely complex site.

They’re totally unsympathetic to your explanation about pesky non-standardized browsers or why they should upgrade.

They expect dynamic, feature-rich, content-rich pages to load in 3 seconds or less. (While the stated expectation is 2 seconds, according to this Forrester study, users are willing to be generous and give you an extra second before they bounce.)

And they don’t want to hear excuses.
