Traditionally, performance benchmarks have only been available through synthetic (simulated) performance monitoring.
But now there’s another way: we’re excited to introduce a new, freely accessible tool that shows how your website performs using real user monitoring (RUM) data from Google’s Chrome User Experience Report (CrUX). The tool creates RUM-based performance benchmarks, helping you understand your site’s performance relative to your competition in a way that synthetic/simulated benchmarks simply can’t.
How it works
The CrUX report contains website performance data from real end users accessing millions of websites across the globe, all collected and reported on by Google.
To try out our new CrUX-powered tool, just enter your site’s domain (assuming it exists within the CrUX dataset), then select an industry benchmark for comparison. The tool generates a comparative report within a few seconds; if desired, you can then filter the visualization by device and/or timing metric to further understand how those factors affect performance.
Querying the CrUX dataset on demand can become expensive, so our approach in this tool is to query the dataset for thousands of domains at a time, process the data, store the processed JSON in a database table, and refresh the stored results periodically. Our visualizations are generated using Vega, a powerful and flexible declarative format for creating, saving, and sharing visualization designs.
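The batch-and-cache approach above can be sketched as follows. This is a minimal illustration, not the tool’s actual implementation: `query_crux` is a hypothetical stand-in for one batched query against the public CrUX dataset (here it returns canned rows so the sketch runs), and the refresh interval, field names, and in-memory dict standing in for the database table are all assumptions.

```python
import json
import time

REFRESH_INTERVAL = 7 * 24 * 3600  # assumed: refresh roughly weekly


def query_crux(domains):
    # Hypothetical placeholder for one batched query covering many
    # domains at once; a real version would hit BigQuery instead.
    return [{"origin": d, "metrics": {"fcp_p50_ms": 1200}} for d in domains]


def refresh_cache(cache, domains, now=None):
    """Re-query only the domains whose stored JSON has gone stale."""
    now = now if now is not None else time.time()
    stale = [
        d for d in domains
        if d not in cache or now - cache[d]["fetched_at"] > REFRESH_INTERVAL
    ]
    for row in query_crux(stale):
        cache[row["origin"]] = {
            "fetched_at": now,
            # Processed metrics stored as JSON text, mirroring the
            # "store the processed JSON in a database table" step.
            "payload": json.dumps(row["metrics"]),
        }
    return cache


cache = refresh_cache({}, ["example.com", "example.org"])
```

Because each run re-queries only stale entries, repeated requests between refreshes are served from the stored JSON rather than from the (expensive) dataset.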
How to interpret your data
We’ve tried to make interpreting your data as simple as possible by leaning on visualization styles that highlight the differences in key performance statistics between the selected sites.
We render these differences as a simplified box plot (a visualization of a five-number summary: the 1st, 25th, 50th (median), 75th, and 99th percentiles). These five key statistics—each shown by a vertical line as seen below—help describe the distribution of user experiences and allow you to make quick numerical comparisons.
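To make the five statistics concrete, here is a short, hedged sketch of computing that five-number summary from a list of load times. The percentile function uses linear interpolation (the common “inclusive” method); the function names and the choice of interpolation are illustrative assumptions, not necessarily what the tool uses internally.

```python
def percentile(sorted_values, p):
    """Linear-interpolation percentile over already-sorted data
    (the 'inclusive' method; an assumption for illustration)."""
    if not sorted_values:
        raise ValueError("no data")
    rank = (p / 100) * (len(sorted_values) - 1)
    lo = int(rank)
    hi = min(lo + 1, len(sorted_values) - 1)
    frac = rank - lo
    return sorted_values[lo] + frac * (sorted_values[hi] - sorted_values[lo])


def five_number_summary(load_times_ms):
    """The five statistics drawn as vertical lines in the box plot."""
    data = sorted(load_times_ms)
    return {p: percentile(data, p) for p in (1, 25, 50, 75, 99)}


summary = five_number_summary([1, 2, 3, 4, 5])
# The median (50th percentile) of this toy sample is 3,
# the 25th percentile is 2, and the 75th percentile is 4.
```

Using the 1st and 99th percentiles instead of the raw minimum and maximum keeps a handful of extreme outliers from stretching the plot and obscuring the comparison.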
However, the lines of the simplified box plot alone do not reveal the interesting aspects of the distribution’s shape. That’s why we also included violin plots in the visualization, to provide more context for the box plots. The total area in each chart represents 100% of all load times in the CrUX dataset for a given device type. From the violin plot, we can learn where values are concentrated, whether the data is symmetrical, and whether gaps or unusual values exist.
Here’s an example visualization showing how the simplified box plot and the violin plot work together:
This is just the beginning of Akamai’s work on RUM-based competitive benchmarks, and we hope it helps you understand and improve your site's performance relative to your selected benchmarks. Later this year, we’ll be introducing additional RUM benchmarking capabilities directly into Akamai’s RUM solution, mPulse. Until then, enjoy the new benchmarks, and enjoy this new tool.
Greg Del Vecchio is a senior product line specialist at Akamai Technologies.