The mainstream web performance community is rapidly amassing reams of data in this area. But when it comes to the mobile web, we’re pretty much at square one. The problem is threefold: a lack of performance-measuring tools, a need for large-scale A/B testing, and a lack of information sharing.
No two performance tests give the same results. To illustrate, I put a high-traffic site, Target.com, through its paces on a handful of commonly used testing tools — Gomez, Keynote, HTTPWatch, and WebPagetest — to see how it performs.
In this webinar, Hooman Beheshti analyzes Google’s home page performance using WebPagetest and HTTPWatch, demonstrates how to interpret waterfall charts, and talks about ways to diagnose and isolate common web application performance problems.
Mystified by waterfall charts, or know someone who is? Today I thought it would be a good idea to take a bird’s-eye view of a typical performance waterfall, both pre- and post-acceleration, that you can keep for your own reference or pass along to anyone who could benefit from it.
In part 2 of my two-part video whiteboarding session for Network World, I talk about the web performance solution landscape, including CDNs and ADCs, and I explain why these don’t address all the opportunities for site optimization.
Here’s the complete slide deck showing how Strangeloop VP Product Hooman Beheshti and I optimized the Velocity home page for our Velocity workshop.
Performance has come a long way in an incredibly short time. It doesn’t seem that long ago (2007, to be specific) that Steve Souders was evangelizing the nitty-gritty of the newly developed YSlow best practices. Here’s a snapshot of how the performance landscape has changed in the past three years: