In the past, I’ve singled out CDNs when discussing the performance value they provide. But when I talk to companies about web performance, they don’t just talk about what their CDN is doing for them: they talk up their ADC, too.
To clarify: ADC (application delivery controller) is a blanket term coined by Gartner for the entire family of load balancing products and services. Companies use ADCs for several reasons: scale, security, and availability. In recent years, with the emergence of site speed as a priority, ADC providers have begun to tout the performance gains they provide.
There’s no doubt that ADC providers offer value in scale, security and availability. But it recently struck me that no one has taken a close look at how well ADC players actually deliver web performance. Today is a good day to start.
The Gartner Magic Quadrant for ADCs is coming out shortly. In honour of that prestigious list, let’s take the ten companies from last year’s edition — along with their own featured ADC case studies, taken straight from their sites — and see how they stack up.
The home page for each site was tested on DSL for IE 7, primarily using WebPageTest’s server location in Dulles, VA, except where noted otherwise. (If you’re wondering “Why IE 7?” please read this.) I ran three tests per site. The median results are presented here.
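If you want to reproduce this kind of run, the same tests can be kicked off through WebPageTest’s public API. Here’s a minimal sketch in Python (the site, location/connectivity ID, and API key below are placeholders, not the values from my tests; valid location IDs come from WebPageTest’s getLocations.php):

```python
from urllib.parse import urlencode

# Build a WebPageTest runtest.php request. All values here are
# illustrative placeholders, not the settings used for this post.
params = {
    "url": "http://www.example.com/",  # hypothetical site under test
    "location": "Dulles_IE7.DSL",      # assumed location.connectivity ID
    "runs": 3,                         # three runs; report the median
    "f": "json",                       # response format
    "k": "YOUR_API_KEY",               # placeholder API key
}
endpoint = "http://www.webpagetest.org/runtest.php?" + urlencode(params)
print(endpoint)
```

Submitting that URL returns a JSON blob with links to poll for the finished results.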
For the purposes of this project, I focused on these performance criteria:
- Start render and load times for first and repeat views
- Keep-alive score
- Compress text score
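The last two criteria boil down to simple response-header checks. Here’s a rough sketch of that grading logic in Python; it mirrors the spirit of WebPageTest’s checks, not its exact scoring:

```python
def grade_response(headers):
    """Grade the two easy wins from a response's headers.

    A rough stand-in for WebPageTest's "keep-alive" and "compress text"
    checks, operating on a dict of response headers (any key casing).
    """
    h = {k.lower(): v.lower() for k, v in headers.items()}
    return {
        # HTTP/1.1 connections persist unless the server says "close"
        "keep-alive": h.get("connection", "keep-alive") != "close",
        # A compressing server answers with Content-Encoding: gzip/deflate
        "compress text": h.get("content-encoding", "") in ("gzip", "deflate"),
    }

# A well-configured server:
print(grade_response({"Connection": "keep-alive", "Content-Encoding": "gzip"}))
# → {'keep-alive': True, 'compress text': True}

# A server with both easy wins turned off:
print(grade_response({"Connection": "close", "Content-Type": "text/html"}))
# → {'keep-alive': False, 'compress text': False}
```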
I wasn’t able to find appropriate case studies for Cisco and Array, so I was confined to testing their own sites.
And it goes without saying that this is by no means a formal study. Consider it more of a revealing bird’s eye view.
| Vendor | Customer | Load time (s), first view | Start render (s), first view | Load time (s), repeat view | Start render (s), repeat view | Keep-alive | Compress text |
|---|---|---|---|---|---|---|---|
|  | University of the Arts London* | 2.747 | 2.296 | 1.433 | 1.166 | A | F |
|  | Royal College of Physicians* | 3.898 | 2.502 | 1.99 | 1.553 | A | C |
|  | San Diego County Credit Union | 6.154 | 2.179 | 3.782 | 1.733 | A | F |
- Page load and start render times were all over the map. The average load time for first-time visitors was 7.724 seconds, which is pretty mediocre.
- Surprisingly slow page load times (9+ seconds) from half the providers tested: Cisco, Brocade, Array, Crescendo and Barracuda.
- Only four of the providers got As in both keep-alives and compress text, two of the easiest performance gains.
- Two providers, Cisco and Barracuda, got Fs in both keep-alives and compress text.
- Overall, half the sites failed to enable keep-alives.
Some mitigating factors
ADC vendors can’t force their customers to use all of a product’s features. If a customer has deliberately turned off compression and keep-alives, the vendor can’t be held totally accountable. But if compression and keep-alives were never turned on in the first place (something we can’t know unless the customer tells us), that, to me, is a vendor mistake. I also find it quite telling when a vendor hasn’t turned on these features for its own site.
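For a sense of how small these easy gains are to switch on, here’s what the equivalent settings look like on a generic Apache front end (illustrative only; every ADC exposes its own toggles for keep-alives and compression):

```apache
# Illustrative only: standard Apache directives for the two easy wins.
# ADC products expose equivalent switches in their own configuration.
KeepAlive On
KeepAliveTimeout 5
MaxKeepAliveRequests 100

# mod_deflate: compress the text responses graded under "compress text"
AddOutputFilterByType DEFLATE text/html text/css application/javascript
```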
There were no perfect winners. Cisco comes across as the poorest performer: its own site is slow, it doesn’t have keep-alives or compression enabled, and it offers no ADC case studies to suggest that its own site’s showing is a fluke. Array is next lowest, for similar reasons. Then Barracuda (whose customer site, interestingly, performs much better than its own), and then Brocade.
Zeus comes off as the best of the lot: decent load times, plus it was the only provider whose case studies all passed both keep-alives and compression. However, it failed to enable compression on its own site, which seems like an incredible oversight. F5, Citrix and Radware are middling, followed by A10.
This raises several questions
Why spend so much money on a device and then not use the basic features to enable keep-alives and compression? Are the products too complicated? Are customers not getting the support they need to use them properly?
If we’re seeing results like these, but these sites are being touted as success stories, what’s the disconnect? Are there different definitions of performance in play? If so, what are they?
A great big fat caveat
As I mentioned at the top of this post, companies use ADCs for several reasons not related to performance: scale, security, and availability. It’s overly simplistic to suggest that you choose your ADC provider based solely on its ability to provide web performance. You need to take a lot of other things into consideration: core functionality, usability, support, price.
All I’m proposing is that when you’re creating your ADC scorecard, you add “web performance” to your list of criteria and ask whether the vendor eats its own dog food.
*Test conducted from Gloucester, UK.
**Test conducted from Delhi, India.
***Test conducted from Paris, FR.