Only 1 out of 5 top ecommerce sites uses RUM. Why?

Real user monitoring (RUM) is a white-hot topic in our industry right now. For those new to it, RUM is monitoring technology that records how real users interact with a website, or how a client interacts with a server or cloud-based application. RUM tools use JavaScript that’s injected into a page to provide feedback from the browser or client.
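To make that concrete, here's a minimal sketch of what an injected RUM snippet does, using the browser's Navigation Timing API. This is an illustrative example, not any vendor's actual code; the beacon URL (rum.example.com) and the metric names are hypothetical.

```javascript
// Minimal RUM sketch (illustrative, not any vendor's actual code).
// The Navigation Timing API exposes millisecond timestamps for each
// phase of the page load; a small injected script reads them after
// onload and reports them to a collection endpoint.

// Derive a few common metrics from a Navigation Timing record.
function rumMetrics(t) {
  return {
    dns:      t.domainLookupEnd - t.domainLookupStart,
    connect:  t.connectEnd - t.connectStart,
    ttfb:     t.responseStart - t.navigationStart,          // time to first byte
    domReady: t.domContentLoadedEventEnd - t.navigationStart,
    pageLoad: t.loadEventEnd - t.navigationStart,
  };
}

// In a browser, the injected snippet would run after onload, e.g.:
//   window.addEventListener('load', function () {
//     setTimeout(function () {
//       var m = rumMetrics(performance.timing);
//       // Report via an image "beacon" so no response handling is needed
//       // (rum.example.com is a placeholder collection endpoint):
//       new Image().src = 'https://rum.example.com/beacon?' +
//         Object.keys(m).map(k => k + '=' + m[k]).join('&');
//     }, 0);
//   });

// Outside a browser we can exercise the metric math with mock timestamps:
const mock = {
  navigationStart: 1000, domainLookupStart: 1010, domainLookupEnd: 1030,
  connectStart: 1030, connectEnd: 1080, responseStart: 1200,
  domContentLoadedEventEnd: 2200, loadEventEnd: 3100,
};
console.log(rumMetrics(mock));
// → { dns: 20, connect: 50, ttfb: 200, domReady: 1200, pageLoad: 2100 }
```

The image-beacon trick is what makes these requests visible in a waterfall, which is what the methodology below relies on.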

RUM is a powerful tool for site owners. It allows them to aggregate user data on a massive scale — think billions of page views — and then analyze the data in endless ways.

In short, for performance geeks, RUM is really, really cool. At any given Velocity conference, there’s a handful of sessions on it, and it’s a hot topic if you follow the #webperf hashtag on Twitter.

But to be frank, despite all the talk, my feeling has been that only about 20% of major ecommerce sites have adopted RUM tools. When I talk to people who run big sites, RUM is barely on their radar. (More on that later.) I was talking about this with a colleague last week, and she was surprised my estimate was so low. So I decided to back up my talk with a little ad hoc research.


Here’s what I did:

  1. Took the list of the top 200 retail sites, according to Alexa, and grabbed the middle 50 sites (#76 through #125).
  2. Ran the home page of each site through WebPagetest in order to get their waterfalls.
  3. Looked at the waterfall for each page, searching for RUM beacons from each of these providers: Gomez, New Relic, Boomerang, LogNormal, DynaTrace, Torbit, Yottaa, and WebTuna.
  4. Looked at who’s using tools like Google Analytics and Chartbeat, which gather timings by default, though I didn’t factor these findings into my main analysis. (More on this later, but in short: most people seem not to use the RUM results in Google Analytics.)
  5. Asked ten ecommerce managers if they use Google Analytics’ RUM feature.
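Step 3 above is easy to automate. The sketch below shows the idea: scan the request URLs from a WebPagetest waterfall for hostnames associated with RUM beacons. The provider-to-domain patterns here are my assumptions for illustration; they may be incomplete or out of date, so check each vendor's documentation for actual beacon hosts.

```javascript
// Illustrative mapping of RUM providers to beacon/script hostnames.
// These patterns are assumptions for the sketch; real beacon domains
// vary by vendor and change over time.
const RUM_BEACON_PATTERNS = [
  { provider: 'New Relic', pattern: /nr-data\.net|js-agent\.newrelic\.com/ },
  { provider: 'Gomez',     pattern: /axf8\.net/ },
  { provider: 'Torbit',    pattern: /torbit\.com/ },
];

// Given the request URLs from a waterfall, return the providers detected.
function detectRumProviders(requestUrls) {
  const found = new Set();
  for (const url of requestUrls) {
    for (const { provider, pattern } of RUM_BEACON_PATTERNS) {
      if (pattern.test(url)) found.add(provider);
    }
  }
  return [...found].sort();
}

// Example: a waterfall containing a New Relic beacon request.
const waterfall = [
  'https://www.example-store.com/',
  'https://www.example-store.com/js/app.js',
  'https://bam.nr-data.net/1/abc123?a=1',
];
console.log(detectRumProviders(waterfall)); // → [ 'New Relic' ]
```

In practice I did this by eyeballing the waterfalls rather than scripting it, but the matching logic is the same.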


It’s possible that some companies have developed their own home-grown monitoring tools, which would have escaped my search.

The research into the use of Google Analytics is much more anecdotal than scientific, as the sample is small and not representative of Google Analytics customers.

Finding 1: 84% of sites have not adopted RUM tools.

As mentioned above, I didn’t include Google Analytics and Chartbeat in my interpretation. Among the recognized RUM tools, here’s a look at which ones are currently being used:

[Chart: How many online retailers use real user monitoring tools?]

Finding 2: 9 out of 10 ecommerce companies don’t use the RUM feature in Google Analytics regularly.

While more than half (52%) of the sites I looked at use Google Analytics, most don’t use GA’s RUM feature with any regularity.

In a very non-scientific study, I asked ten ecommerce companies that use GA whether they use its RUM features. Most of them knew the feature exists and had clicked into the screen at least once, but only one of them looks at it regularly (more than once per week).

When asked why they didn’t use it, I heard the following reasons:

  • The data was not comprehensive (gaps across browsers and a limited sampling rate).
  • The data was not trusted. (Some were paranoid about Google. Tin foil, anyone?)
  • Existing performance tests were good enough.

I found these conversations very interesting and would like to learn more about this. It fascinates me that a trove of free data is available to people and they don’t use it. What does this say about the potential long-term success of RUM?

I would like to hear more about your experience using the Google Analytics RUM feature. Is my sample representative? If there is an adoption problem, is it caused by usability or by the data itself?

This research raises a few more questions:

Why so few RUM adopters?

All the pro-RUM talk is dominated by the mega-players like Google, Amazon, and Etsy, but it hasn’t trickled down to the “mortal” companies.

Does this mean RUM is irrelevant for smaller businesses?

I think real user monitoring is an essential tool in every site owner’s toolkit. If you’re not measuring real-world performance on an ongoing basis, then every effort you’re making to improve your application performance is a stab in the dark. As more and more RUM tools are available at various price points, there’s no excuse not to pick the one that suits your needs and make it a core part of your performance practice.

What should we make of the RUM tools that don’t appear to be used by the test sites?

These findings should not necessarily reflect badly on the tools that don’t appear to have been adopted by the sites I tested. There are some solid tools out there, but I don’t think the market has reached a place where the importance of these tools is being widely recognized.


15 thoughts on “Only 1 out of 5 top ecommerce sites uses RUM. Why?”

  1. Perhaps a more interesting question would be how many people were using RUM a year ago? While the number of sites using RUM might still be small, the growth rate is high. If you’d done this study even 6 months ago, the results would have looked drastically different. Keep in mind, Torbit didn’t even have a RUM product until April of this year. Ditto for LogNormal. New Relic and Gomez have been around a lot longer, but we’re all still working to educate the market on why RUM matters. We’re making great progress and I think 2012 will be remembered as the year that RUM started to go mainstream.

  2. Great article, Josh. Thanks. One thing that might increase the adoption of RUM is if there were more case studies on how RUM measurements compare to synthetic. My anecdotal experience is RUM page load times are twice as long as synthetic. If people realized that synthetic tests are a poor reflection of the actual user experience, they might want to see what RUM has to say.

  3. Steve, you’re right. We’re seeing the same thing – synthetic paints a dramatically rosier picture than RUM. It doesn’t help that the CDNs are fighting to maintain the status quo. They look a lot better on Keynote than they do on Torbit.

  4. Josh, great report. We here at New Relic are enjoying strong adoption from the “mortal” companies – our friends at BuiltWith find our RUM JS on “265,607 websites, of which 40,711 websites within the most visited sites on the internet and an additional 224,896 websites on the rest of the web”. Our APM stuff is on more sites, but BuiltWith can only scrape the JS, as you know.

    We are excited for the resource timing spec to provide more actionable data and will surely be integrating that as well.

    I think the old guard of synthetic players convinced the market synthetic is the way to go, when in fact there is TREMENDOUS value in REAL user data.

  5. Great post Josh! I would argue that there are quite a few mainstream sites that HAVE been using some form of RUM for at least a few years – but enterprise adoption can take longer than all of us would like.
    I think the biggest challenge is shifting away from the fixed way of thinking when it comes to synthetic. What we have to start thinking about is how we position real user measurement as ‘truth’ and make it easier for sites to adopt our technology. We have all evangelized at some point or another the benefit of synthetic/real browser testing (which I personally feel still has huge value). IMO a holistic approach to performance management is best – as long as you are leading with RUM by default ;)
    We shared a discussion this year about the challenges of moving the performance needle from within larger organizations. In my experience it’s rare that organizations don’t see the value in performance, it’s just that they are competing with multiple priorities. I for one have faith that we will break through, thanks to the continued work of the ‘tribe’.
    BTW: (boomerang/LogNormal = SOASTA mPulse!)

  6. Pingback: Community News: When the Nerds Go Marching In and More | New Relic blog

  7. Great, insightful article, and some good points. While I can understand that the small number of RUM users may be due to the paranoia mentioned above, I cannot get my head around the “existing performance tests are good enough” reasoning: why would ecommerce site owners not want to improve their results?

  8. Pingback: Web performance and ops - Weekend must-read articles #36

  9. Pingback: Web performance and ops – Weekend must-read articles #36 | AsterHost

  10. What benefit are people getting from using the non-Google Analytics tools listed above? Or do they just not know it exists?

  11. It will be interesting to gather similar data for mobile websites as there is a stronger case for companies to monitor real user data in terms of application performance.

    Great article, Josh

  12. Pingback: This week on the Web Performance Today podcast: Cliff Crocker and Buddy Brewer

  13. Pingback: RUM = real user monitoring | Happy User Experience
