Only 1 out of 5 top ecommerce sites uses RUM. Why?

Real user monitoring (RUM) is a white-hot topic in our industry right now. For those new to it, RUM is monitoring technology that records every interaction real users have with a website, or that a client has with a server or cloud-based application. RUM tools typically use JavaScript injected into the page to report timing data back from the browser or client.
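
To make that concrete, here is a minimal sketch of what an injected RUM beacon can look like, using the W3C Navigation Timing API. The /rum-beacon endpoint and the metric names are made up for illustration; every vendor does this differently.

```javascript
// Minimal RUM beacon sketch (illustrative only; "/rum-beacon" and the
// field names are hypothetical, not any particular vendor's API).
window.addEventListener("load", function () {
  // Wait a tick so loadEventEnd is populated after the load event finishes.
  setTimeout(function () {
    var t = window.performance && window.performance.timing;
    if (!t) { return; } // Older browsers without Navigation Timing

    var metrics = {
      dns:      t.domainLookupEnd - t.domainLookupStart,
      connect:  t.connectEnd - t.connectStart,
      ttfb:     t.responseStart - t.navigationStart,
      domReady: t.domContentLoadedEventEnd - t.navigationStart,
      load:     t.loadEventEnd - t.navigationStart
    };

    // Classic image-beacon trick: the query string carries the timings
    // back to a collection endpoint, where they can be aggregated.
    var query = Object.keys(metrics).map(function (k) {
      return k + "=" + encodeURIComponent(metrics[k]);
    }).join("&");
    new Image().src = "/rum-beacon?" + query;
  }, 0);
}, false);
```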

RUM is a powerful tool for site owners. It allows them to aggregate user data on a massive scale — think billions of page views — and then analyze the data in endless ways.

In short, for performance geeks, RUM is really, really cool. At any given Velocity conference, there’s a handful of sessions on it, and it’s a hot topic if you follow the #webperf hashtag on Twitter.

But to be frank, despite all the talk, my feeling has been that only about 20% of major ecommerce sites have adopted RUM tools. When I talk to people who run big sites, RUM is barely on their radar. (More on that later.) I was talking about this with a colleague last week, and she was surprised my estimate was so low. So I decided to back up my talk with a little ad hoc research.

Methodology

  1. Took the list of the top 200 retail sites, according to Alexa, and grabbed the middle 50 sites (#76 through to #125).
  2. Ran the home page of each site through WebPagetest to generate a waterfall chart for each.
  3. Looked at the waterfall for each page, searching for RUM beacons from each of these providers: Gomez, New Relic, Boomerang, LogNormal, DynaTrace, Torbit, Yottaa, and WebTuna. (A rough sketch of how this kind of scan can be scripted appears after this list.)
  4. Looked at who’s using tools like Google Analytics and Chartbeat, which gather timings by default, though I didn’t factor these findings into the main analysis. (More on this later, but in short: most people seem not to use the RUM results in Google Analytics.)
  5. Asked ten ecommerce managers if they use Google Analytics’ RUM feature.
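
For the curious, the scan in step 3 can be scripted. The sketch below reads a HAR file (WebPagetest can export one) and flags requests to hosts associated with the RUM providers listed above. The hostname substrings are approximate and only meant to illustrate the approach, not to serve as a definitive fingerprint list.

```javascript
// check-rum.js: rough sketch of scanning a WebPagetest HAR export for
// requests to hosts associated with known RUM providers.
// Usage: node check-rum.js <webpagetest-export.har>
// (Hostname substrings below are approximate, for illustration only.)
var fs = require("fs");

var RUM_HOSTS = {
  "Gomez":                 ["gomez.com", "axf8.net"],
  "New Relic":             ["newrelic.com", "nr-data.net"],
  "Boomerang / LogNormal": ["lognormal.net"],
  "DynaTrace":             ["dynatrace.com"],
  "Torbit":                ["torbit.com"],
  "Yottaa":                ["yottaa.com"],
  "WebTuna":               ["webtuna.com"]
};

if (process.argv.length < 3) {
  console.error("Usage: node check-rum.js <webpagetest-export.har>");
  process.exit(1);
}

var har = JSON.parse(fs.readFileSync(process.argv[2], "utf8"));
var found = {};

// Every request in the waterfall is an entry in har.log.entries.
har.log.entries.forEach(function (entry) {
  var url = entry.request.url;
  Object.keys(RUM_HOSTS).forEach(function (provider) {
    RUM_HOSTS[provider].forEach(function (host) {
      if (url.indexOf(host) !== -1) { found[provider] = true; }
    });
  });
});

var providers = Object.keys(found);
console.log(providers.length
  ? "RUM beacons found: " + providers.join(", ")
  : "No known RUM beacons found.");
```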

Caveats

Some companies may have developed their own home-grown monitoring tools, which would have escaped my search.

The research into the use of Google Analytics is much more anecdotal than scientific, as the sample is small and not representative of Google Analytics customers.

Finding 1: 84% of sites have not adopted RUM tools.

As mentioned above, I didn’t include Google Analytics and Chartbeat in my interpretation. Among the recognized RUM tools, here’s a look at which ones are currently being used:

[Chart: how many online retailers use real user monitoring tools?]

Finding 2: 9 out of 10 ecommerce companies don’t use the RUM feature in Google Analytics regularly.

While more than half (52%) of the sites I looked at use Google Analytics, most don’t use GA’s RUM feature with any regularity.

In a very non-scientific study, I asked ten ecommerce companies that use GA whether they use its RUM features. Only one of them regularly looks at the RUM screen (more than once per week), although many knew the feature existed and had clicked through to it at least once.

When asked why they didn’t use it, I heard the following reasons:

  • The data was not comprehensive (limited browser coverage and a small sample rate).
  • The data was not trusted. (Some were paranoid about Google. Tin foil, anyone?)
  • Existing performance tests were good enough.

I found these conversations very interesting and would like to learn more about this. It fascinates me that a trove of free data is available to people and they don’t use it. What does this say about the potential long-term success of RUM?

I would like to hear more about your experience using the Google Analytics RUM feature. Is my sample representative? If there is an adoption problem, is it a usability problem or a data problem?

This research raises a few more questions:

Why so few RUM adopters?

All the pro-RUM talk is dominated by the mega-players like Google, Amazon, and Etsy, but it hasn’t trickled down to the “mortal” companies.

Does this mean RUM is irrelevant for smaller businesses?

I think real user monitoring is an essential tool in every site owner’s toolkit. If you’re not measuring real-world performance on an ongoing basis, then every effort you’re making to improve your application’s performance is a stab in the dark. As more and more RUM tools become available at various price points, there’s no excuse not to pick the one that suits your needs and make it a core part of your performance practice.

What should we make of the RUM tools that don’t appear to be used by the test sites?

These findings should not necessarily reflect badly on the tools that don’t appear to have been adopted by the sites I tested. There are some solid tools out there, but I don’t think the market has reached a place where the importance of these tools is being widely recognized.
