How much performance value do ADCs/load balancers actually provide in the real world?

In the past, I’ve singled out CDNs when discussing what performance value they actually provide. But when I talk to companies about web performance, they don’t just talk about what their CDN is doing for them: they talk up their ADC, too.

To clarify: ADC (aka application delivery controller) is a blanket term coined by Gartner to include the entire family of load balancing products and services. Companies use ADCs for several reasons: scale, security, availability. In recent years, with the emergence of site speed as a priority, ADC providers have begun to tout the performance gains they provide.

There’s no doubt that ADC providers offer value in scale, security and availability. But it recently struck me that no one has taken a close look at how well ADC players actually deliver web performance. Today is a good day to start.

The Gartner Magic Quadrant for ADCs is coming out shortly. In honour of that prestigious list, let’s take the ten companies that made last year’s cut, along with their own featured ADC case studies taken straight from their sites, and see how they stack up.

Methodology

The home page for each site was tested over a DSL connection in IE 7, using WebPagetest’s server location in Dulles, VA, except where noted otherwise. (If you’re wondering “Why IE 7?” please read this.) I ran three tests per site; the median results are presented here.
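
If you want to reproduce this kind of spot test, the runs can be scripted against the public WebPagetest API instead of the web UI. The sketch below is a rough outline only: the location/connectivity string and the result field names are my assumptions, so check getLocations.php and your own result JSON before relying on them, and note that an API key is required.

```python
# Rough sketch: submit a 3-run WebPagetest test and pull the median results.
# The location string ("Dulles_IE7.DSL") and the result field names are
# assumptions; verify against getLocations.php and your own JSON output.
import os
import time

import requests

WPT = "https://www.webpagetest.org"
API_KEY = os.environ["WPT_API_KEY"]  # assumes a WebPagetest API key in the environment

def run_test(url, location="Dulles_IE7.DSL", runs=3):
    """Submit a test and block until the result JSON is available."""
    submit = requests.get(f"{WPT}/runtest.php", params={
        "url": url,
        "k": API_KEY,
        "f": "json",
        "runs": runs,
        "location": location,
        "fvonly": 0,  # 0 = also capture the repeat view
    }).json()
    test_id = submit["data"]["testId"]
    while True:  # poll until the test completes (statusCode 200)
        result = requests.get(f"{WPT}/jsonResult.php",
                              params={"test": test_id}).json()
        if result.get("statusCode") == 200:
            return result["data"]
        time.sleep(10)

data = run_test("http://www.example.com")  # placeholder URL
fv = data["median"]["firstView"]   # WebPagetest picks the median run for you
rv = data["median"]["repeatView"]
print("first view:  load %.3f s, start render %.3f s"
      % (fv["loadTime"] / 1000, fv["render"] / 1000))
print("repeat view: load %.3f s, start render %.3f s"
      % (rv["loadTime"] / 1000, rv["render"] / 1000))
```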

For the purposes of this project, I focused on these performance criteria:

  • Start render and load times for first and repeat views
  • Keep-alive score
  • Compress text score (the last two are spot-checked in the sketch below)
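
For a quick sanity check of those last two scores on a single URL, you don’t need a full test run: two response headers tell most of the story. This is only a minimal sketch; WebPagetest grades every request on the page, while this looks at the root HTML document alone.

```python
# Minimal spot check of keep-alive and text compression for one URL.
# WebPagetest grades every request on the page; this only inspects the
# root HTML response, so treat it as a rough sanity check.
import requests

def spot_check(url):
    resp = requests.get(url, headers={"Accept-Encoding": "gzip"}, timeout=30)
    # Compress text: was the HTML returned gzip-encoded?
    compressed = resp.headers.get("Content-Encoding", "").lower() == "gzip"
    # Keep-alive: HTTP/1.1 connections are persistent unless the server
    # explicitly answers "Connection: close".
    keep_alive = resp.headers.get("Connection", "").lower() != "close"
    return keep_alive, compressed

ka, gz = spot_check("http://www.example.com")  # placeholder URL
print("keep-alive:", "yes" if ka else "no", "| gzip:", "yes" if gz else "no")
```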

I wasn’t able to find appropriate case studies for Cisco and Array, so I was confined to testing their own sites.

And it goes without saying that this is by no means a formal study. Consider it more of a revealing bird’s eye view.


Vendor | Customer | First view: load (s) | First view: start render (s) | Repeat view: load (s) | Repeat view: start render (s) | Keep-alive | Compress text
A10 |  | 3.464 | 0.852 | 1.901 | 0.471 | A | F
 | GenieKnows | 7.682 | 2.046 | 4.337 | 1.458 | F | F
 | Shopzilla | 1.075 | 0.664 | 0.626 | 0.415 | A | A
 | University of the Arts London* | 2.747 | 2.296 | 1.433 | 1.166 | A | F
Array |  | 17.991 | 7.967 | 11.937 | 4.522 | A | F
Barracuda |  | 17.875 | 9.963 | 11.391 | 5.938 | F | F
 | Royal College of Physicians* | 3.898 | 2.502 | 1.99 | 1.553 | A | C
Brocade |  | 22.033 | 8.393 | 15.186 | 5.841 | A | F
 | Petroleum Institute** | 8.243 | 3.557 | 7.234 | 3.201 | A | F
 | San Diego County Credit Union | 6.154 | 2.179 | 3.782 | 1.733 | A | F
 | Limelight | 12.635 | 0.765 | 3.39 | 0.33 | C | D
Cisco |  | 9.683 | 3.536 | 2.83 | 0.625 | F | F
Citrix |  | 3.788 | 2.66 | 2.569 | 2.136 | A | A
 | Live Nation | 11.067 | 5.223 | 3.051 | 1.449 | A | F
 | MRMV | 9.304 | 1.942 | 4.284 | 1.619 | A | F
Crescendo |  | 14.372 | 6.576 | 9.248 | 6.464 | A | F
 | Carfax | 5.081 | 1.798 | 3.599 | 1.22 | B | D
 | Peirce College | 5.527 | 1.376 | 2.823 | 1.029 | A | A
F5 |  | 5.084 | 3.109 | 1.258 | 0.879 | A | A
 | Epson | 8.941 | 3.329 | 2.082 | 0.854 | A | B
 | Transplace | 3.501 | 2.267 | 4.402 | 2.432 | A | F
 | Averitt | 3.861 | 2.18 | 1.352 | 0.896 | A | A
Radware |  | 4.097 | 1.295 | 1.879 | 0.327 | A | A
 | AccuWeather | 8.592 | 2.474 | 6.422 | 1.327 | A | A
 | Ace Hardware | 12.311 | 2.178 | 3.11 | 2.035 | A | D
 | Computershare | 2.208 | 0.866 | 1.155 | 0.851 | A | F
 | Play65 | 3.926 | 2.116 | 3.013 | 2.116 | A | F
Zeus |  | 3.307 | 2.059 | 2.218 | 0.502 | A | F
 | Gilt Groupe | 3.11 | 0.673 | 0.946 | 0.164 | A | A
 | STA Travel* | 10.937 | 0.996 | 4.462 | 0.745 | A | A
 | Triboo*** | 6.96 | 2.791 | 5.148 | 2.942 | A | A
Averages |  | 7.724 | 2.923 | 4.163 | 1.846 |  | 

Observations

  • Page load and start render times were all over the map. The average load time for first-time visitors was 7.724 seconds, which is pretty mediocre.
  • Half of the providers tested had surprisingly slow load times (9+ seconds) for their own sites: Cisco, Brocade, Array, Crescendo and Barracuda.
  • Only four of the providers got As in both keep-alives and compress text, two of the easiest performance gains.
  • Two of the providers, Cisco and Barracuda, got Fs in both keep-alives and compress text.
  • Overall, about half of the sites failed to compress text.

Some mitigating factors

ADC vendors can’t force their customers to use all of a product’s features. If a customer has deliberately turned off compression and keep-alives, the vendor can’t be held fully accountable. But if compression and keep-alives were never turned on in the first place (something we can’t know unless the customer tells us), that looks to me like a vendor failure. I also find it quite telling when a vendor hasn’t enabled these features on its own site.

Conclusions

There were no perfect winners. Cisco comes across as the poorest performer: its own site is slow, with neither keep-alives nor compression enabled, and it offers no ADC case studies that might prove its own performance is a fluke. Array is next lowest, for similar reasons. Then Barracuda (whose customer site, interestingly, performs much better than its own), and then Brocade.

Zeus comes off as the best of the lot: decent load times, plus it was the only provider whose case studies all passed both keep-alives and compression. However, it failed to enable compression on its own site, which seems like an incredible oversight. F5, Citrix and Radware are middling, followed by A10.

This raises several questions

Why spend so much money on a device and then not use the basic features to enable keep-alives and compression? Are the products too complicated? Are customers not getting the support they need to use them properly?

If we’re seeing results like these, but these sites are being touted as success stories, what’s the disconnect? Are there different definitions of performance in play? If so, what are they?

A great big fat caveat

As I mentioned at the top of this post, companies use ADCs for several reasons not related to performance: scale, security, and availability. It’s overly simplistic to suggest that you choose your ADC provider based solely on its ability to provide web performance. You need to take a lot of other things into consideration: core functionality, usability, support, price.

All I’m proposing is that when you’re creating your ADC scorecard, add “web performance” to your list of criteria and make sure you ask whether or not they eat their own dogfood.

*Test conducted from Gloucester, UK.
**Test conducted from Delhi, India.
***Test conducted from Paris, FR.
