“In five years, your industry won’t even exist,” he said. His rationale was that in the future everyone will have applied Steve Souders’s performance rules to their sites, and there won’t be a role for automated performance tools.
To illustrate, he sent me this page, which had been optimized by a team of in-house developers:
Perfect Page Speed score, short waterfall, fast load time — this is pretty much a picture-perfect chart. (If you’re not familiar with waterfalls, here’s a primer on how to interpret them.)
But we ran it through Site Optimizer anyway, to see what we could do. Here’s what we did:
We cut load time by more than half, which is great. But look at our Page Speed score: only 74 out of 100. And we broke ‘Compress Images’.
What does this mean?
There are three takeaways from this exercise:
- The current public-facing performance rules are incomplete. We sped up this site using techniques that don't appear in the well-known lists of performance rules. For example, when we consolidate images, for one browser we convert them into a format that WebPageTest doesn't recognize, so it gives us an F for Compress Images. But the page is faster. Which matters more: the failing grade or the faster page?
- Front-end website optimization will never be static. When done well, it evolves with changes to browsers and to user behavior, so it should always keep getting better.
- The front-end optimization market is evolving faster than the performance measurement tools. I say this not to criticize these tools. I've repeatedly gone on the record as a big WebPageTest and HttpWatch supporter, and I think these tools are essential to our industry. But they are developed as a response to the performance rules. When the rules change or new rules are added, there's an inevitable lag as the tools evolve to catch up.
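As a rough illustration of the kind of browser-specific image trick described above (not necessarily the exact mechanism our optimizer uses), a server can pick the most compact image format each browser advertises in its Accept header, falling back to a universally supported format. The function name and format preference order here are hypothetical:

```python
def pick_image_format(accept_header: str) -> str:
    """Choose a consolidated-image format from the browser's Accept header.

    The preference order is an assumed smallest-first ranking; PNG is the
    fallback because every browser can decode it.
    """
    for fmt in ("image/webp", "image/jp2", "image/png"):
        if fmt in accept_header:
            return fmt
    return "image/png"

# A browser advertising WebP support gets the smaller format;
# one that doesn't gets the safe fallback.
print(pick_image_format("image/webp,image/*;q=0.8"))  # -> image/webp
print(pick_image_format("text/html,image/*"))         # -> image/png
```

A tool grading the resulting page only against the classic rules would see an unfamiliar format and flag it, even though the bytes delivered are smaller.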
The conclusion is one that I keep coming back to: When it comes to site speed, there really is no such thing as fast enough.