Has your site’s third-party content gone rogue? Here’s how to regain control.

In my travels, I’m seeing a strange dichotomy when it comes to performance:

  • On one hand, site owners are pouring massive amounts of money and energy into site development and delivery.
  • On the other hand, these same site owners are ignoring the out-of-control proliferation of third-party scripts on their sites.

This raises the question: What’s the rationale behind running up a $100K+ monthly CDN tab — ostensibly to shave a couple of seconds off your load times — if you’re going to then implement a single line of unoptimized JavaScript that could not only negate those precious seconds, but take down your site completely?

The answer, more often than not, is this: There is no rationale. Most site owners are either unaware of this issue, or they don’t realize its seriousness.

If the issue of third-party performance is new to you, this post will help you:

1. Understand the impact of all those harmless-looking little snippets of code.
2. Regain control over rogue third-party content.

Background: Third-party scripts are everywhere. Some are fine. Most aren’t.

A few months ago, the folks at New Relic did some digging into the most popular third-party APIs used by the 200,000+ applications the company monitors and took a look at which ones performed the best. No surprise, these are all familiar names:

  • Amazon Web Services (response time: 432ms)
  • Twitter (response time: 832ms)
  • Facebook (response time: 918ms)
  • PayPal (response time: 1.788s)

This is good news for site owners… assuming these are the only four scripts you’re running. But third-party content — analytics, ads, trackers, social sharing widgets, etc. — is on the rise.

As I wrote here several months ago, the average top ecommerce site contains 7 third-party scripts, with some sites containing up to 25 scripts. Cumulatively, these can have a massive impact on page performance. Unoptimized scripts can slow down page load by several seconds, or even stall it completely. Third-party scripts are one of the most common points of failure for sites: just a single line of JavaScript can take down your entire site. Despite this, measuring the impact of third-party content on a site’s usability is often an afterthought — if it even gets thought about at all.

If you want to scare yourself, run a simulation using WebPagetest to see how your site would perform if one of its third-party providers went down. For instance, here’s a simulation I ran showing what would happen to Staples.com if its third-party providers failed:

In this case, the main culprit is Omniture, which stalls the site for a full 30 seconds when it goes down. Note that about 50% of top ecommerce sites use Omniture. (I say this not to pick on Omniture, but to point out that if they ever go down, they’re going to take down a lot of sites with them.)
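To try this kind of simulation yourself, WebPagetest’s scripting feature includes a setDnsName command that repoints a provider’s hostname at blackhole.webpagetest.org, a server that accepts connections but never responds — which is exactly what a hung third party looks like to your page. A minimal sketch (both hostnames below are placeholders; substitute your own site and the provider you want to fail):

```text
setDnsName metrics.provider.example blackhole.webpagetest.org
navigate http://www.yoursite.example/
```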

Outages are inevitable. As site owners and managers, our job is to figure out a strategy to mitigate these outages.

I dream of a world where all third-party providers offer clear service level agreements to users.

In an ideal world, all third-party providers would offer a clear SLA that, at the very least:

  • Expresses their annual uptime guarantee as a percentage (ideally, as close to 100% as possible).
  • Describes the process for reimbursing site owners (if site owners are paying for the service provided by the script) if uptime drops below the SLA guarantee.

Right now, most third-party providers don’t offer real-time monitoring of their scripts, nor do they offer meaningful service level agreements (SLAs). My hope is that, as site owners become more educated about the importance of page speed, they’re going to start demanding properly optimized scripts, as well as better monitoring, reporting, and accountability.

In the meantime, what can site owners do to take control of third-party performance?

Until recently, adding third-party code to your site meant giving up a lot of control over how your pages load. Not anymore (sort of). Here are four options at your disposal:

1. Deferral

In simplest terms, deferral is a front-end optimization technique that delays the execution of non-critical scripts until the rest of the page has loaded and rendered on the browser.
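As a rough sketch of the technique (the function name and URL here are illustrative, not from any particular library), deferral boils down to waiting for the window’s load event before injecting the script:

```javascript
// Sketch: inject a non-critical third-party script only after the page's
// load event has fired, so it can't compete with primary content.
function deferScript(url) {
  var inject = function () {
    var s = document.createElement('script');
    s.src = url;
    document.body.appendChild(s);
  };
  if (document.readyState === 'complete') {
    inject(); // page already finished loading; inject immediately
  } else {
    window.addEventListener('load', inject);
  }
}

// e.g. deferScript('https://example.com/analytics-beacon.js');
```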

Pro: It’s a relatively easy fix.

Con: Deferral won’t work for all content. If your site hosts ads, your advertisers won’t be happy to have their ads show up last. Save deferral for third-party scripts like analytics beacons, tracking pixels, and social widgets.

2. Asynchronous loading

With asynchronous loading, third-party scripts load in parallel with the crucial page content. Async code can be tricky to program, which is all the more reason why it’s been gratifying to note its increasing rate of adoption among third-party providers. All the social buttons on this blog are async versions. (You can read about why I removed the non-async StumbleUpon button.)
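The pattern most providers’ async snippets follow can be sketched like this (function name and IDs are mine, for illustration). Creating the script element dynamically means the browser fetches it without blocking HTML parsing:

```javascript
// Sketch: load a third-party script without blocking page rendering.
function loadAsync(url, id) {
  if (document.getElementById(id)) { return; } // already on the page
  var js = document.createElement('script');
  js.src = url;
  js.id = id;
  js.async = true; // execute as soon as available, in any order
  var first = document.getElementsByTagName('script')[0];
  first.parentNode.insertBefore(js, first);
}

// e.g. loadAsync('//platform.twitter.com/widgets.js', 'tweetjs');
```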

Pro: Lets you display ads and other business-critical third-party content without blocking primary content.

Con: Slow third-party scripts will delay the onLoad event. This post will give you a detailed understanding of how the onLoad event works, but the short answer is that a page’s onLoad event determines its load time as measured by performance measurement tools. Too many delayed onLoad events will skew your results. If you’re tracking thousands of pages over extended periods of time, those skewed results will make it a pain to pinpoint other potential performance problems.

3. Third-party timing and script killing

Also known as “tag management”, this technique involves establishing an allotted time for scripts to load. Then, if a script fails to load in that time, it’s either killed or deferred. Strangeloop has been an early pioneer of this technique in both our desktop and mobile FEO solutions, and it’s been gratifying to see it catching on.
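A commercial tag manager does far more than this, but the core idea can be sketched as follows (all names here are illustrative). Note this is a simplification: removing the element doesn’t abort an in-flight request in every browser.

```javascript
// Sketch: give a third-party script a time budget; if it hasn't loaded
// when the budget expires, remove its element and report the kill.
function loadWithTimeout(url, budgetMs, onKill) {
  var js = document.createElement('script');
  js.src = url;
  js.async = true;
  var timer = setTimeout(function () {
    if (js.parentNode) { js.parentNode.removeChild(js); } // kill it
    if (onKill) { onKill(url); }
  }, budgetMs);
  js.onload = js.onerror = function () { clearTimeout(timer); };
  document.body.appendChild(js);
}

// e.g. loadWithTimeout('https://example.com/widget.js', 3000,
//   function (url) { /* log the kill for your audit */ });
```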

Pro: Gives site owners the most control over third-party content.

Con: Doesn’t lend itself to hand-coding; it’s best performed by an automated system.

4. Just say no

Ask yourself if you really need that widget. Perform a cost-benefit analysis of what a proposed new third-party tool offers your site and determine if it’s really worth the performance hit. Because make no mistake about it: there’s always a performance hit. And too many hits, no matter how small, add up.

Pro: Status quo is easy to maintain.

Con: Can be incredibly difficult to resist the siren song of the latest widget du jour.

And don’t forget to audit your third-party scripts.

It doesn’t matter how well you optimize the rest of your site if a single line of external JavaScript can take out the whole thing. If you aren’t already conducting regular audits of your site’s third-party content, you should be. The audit should identify:

  • All the third-party scripts your site is running, and on which pages they appear.
  • Which of the above performance practices (asynchronous loading, deferral, timing/killing) each script follows.
  • Whether each third-party provider offers a service level agreement and, if so, what its terms are.
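To get the first item on that list started, a quick inventory can be pulled straight from the browser console. This sketch (the function name is mine, not from any tool) counts external scripts per host on the current page:

```javascript
// Sketch: inventory every external script on the current page, grouped
// by host name -- a starting point for a third-party audit.
function auditScripts() {
  var hosts = {};
  var scripts = document.getElementsByTagName('script');
  for (var i = 0; i < scripts.length; i++) {
    var src = scripts[i].src;
    if (!src) { continue; } // skip inline scripts
    var host = src.replace(/^https?:\/\//, '').split('/')[0];
    hosts[host] = (hosts[host] || 0) + 1; // count scripts per host
  }
  return hosts;
}

// Run auditScripts() in the console, then chase down each host.
```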

What’s your experience with third-party snippets? Which ones are worth the performance hit? Which ones should send us running for the hills?


18 thoughts on “Has your site’s third-party content gone rogue? Here’s how to regain control.”

  1. Joshua, great article as always! The best advice is most likely #4: don’t accept every widget or ad tag, and if you do, make sure you have an SLA.

    Regarding option #3: it does seem great on paper; however, I have yet to see a site, developer, or vendor that has truly implemented a cross-browser solution that kills an external JavaScript call. Even when looking at some of your clients, like Petco or Autoanything, when randomly delaying third-party tags (Facebook, BoldChat, etc.), the request still hangs for 21 seconds on a Windows machine (probably more on a Mac).

  2. Great post – you offer some insightful advice that I think our readers at DZone would really like to check out. We’re currently trying to expand our readership of developers interested in high performance. Please get in contact with me if you’d like to have this post republished on DZone with attribution to you as the original author and a link back to the original post. Just shoot me an email.

  3. This ties in nicely with an article I read earlier today – http://bit.ly/LbqcTD – and personal experience with Web performance data.

    And hearing from people who spend a great deal of time and money that this is beyond their control means they have handed control of their online brand and reputation to their vendors.

  4. Pingback: QA Hates You » Blog Archive » More Thoughts On Third Party Scripts

  5. I have to question the effectiveness of loading the social buttons this way. Looking at a WebPagetest run of this page – http://www.webpagetest.org/result/120523_6G_HXD/ – I see the same issues that I encounter. Some of these requests, especially Twitter, seem to routinely take forever and block everything else after them. While you have the flexibility to put them last on your page, that’s not always a viable solution, e.g. business users who want them high up on the page and don’t want them flashing in after the rest of the content loads.

  6. Good post.

    There was a recent case I came across of a site being slowed down significantly by a weather widget on the page. In the end it comes down to how reliant you really want to be on third parties, where website performance is a top priority.

  7. Yeah, I experience this when the Twitter code embedded in my site slows down during Twitter’s heavy-load periods.

  8. The solution that makes the website a lot faster and the widgets working perfectly:

    (function(d, s) {
      var js, fjs = d.getElementsByTagName(s)[0],
          load = function(url, id) {
            if (d.getElementById(id)) { return; }
            js = d.createElement(s); js.src = url; js.id = id;
            fjs.parentNode.insertBefore(js, fjs);
          };
      load('//connect.facebook.net/en_US/all.js#appId=272697932759946&xfbml=1', 'fbjssdk');
      load('https://apis.google.com/js/plusone.js', 'gplus1js');
      load('//platform.twitter.com/widgets.js', 'tweetjs');
    }(document, 'script'));

    From an excellent article:

  9. Is there a place where I can read more about your third suggestion – “3. Third-party timing and script killing” or is that a StrangeLoop secret sauce?

  10. Pingback: Roundup: Why the Facebook outage is (yet another) wakeup call for site owners

  11. FYI, looks like the social sharing widget you are using here is loading the Google +1, Twitter and LinkedIn sharing buttons using blocking script tags (along with maps.google.com which I don’t see anywhere in the actual UI).

  12. Whoa… you’re right, Pat. We’ve made a few changes to the blog at the front and back ends since implementing the async buttons… wondering if that had something to do with it. Looking into it now. If we learn anything interesting, I’ll keep you posted. Thanks for the heads up!

  13. Pingback: Three nifty (and free) new performance measurement tools for mobile and third-party content

  14. Pingback: Three nifty (and free) new performance measurement tools for mobile and third-party content – Ovidiu Donciu: software testing, wordpress websites, social marketing, SEO

  15. Pingback: Automated front-end performance: How do you calculate developer ROI?

  16. Thank you, Joshua, for the post.

    Can you please post or link to code examples? I have a third-party chat that sometimes loads very slowly.

    Thanks again

  17. Pingback: Performance 101: An opinionated guide to the 22 links that every developer should read
