Jonathan Klein from Wayfair wrote a post a few weeks ago about using WebPageTest to measure the performance of their CDN. The results were surprising: Wayfair found that their CDN was delivering minimal performance gains. As you would expect, the post generated lively discussion and plenty of ideas about variables that could be affecting the test's outcome. Several people (myself included) recommended they use Real User Measurement to see how much of an improvement their actual visitors were experiencing.
Last week, Klein posted the results from the Real User Measurement test. After using the tagging feature of our Insight product to run an A/B test on their production site, the results told much the same story as the synthetic test: Wayfair saw no major performance improvement from their CDN.
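The general shape of a RUM A/B test like this is straightforward: each page view is tagged with a variant ("cdn" or "origin"), timing data is collected from the browser, and the two populations are compared. As a rough illustration, here is a minimal sketch using the browser's Navigation Timing API. The function name, beacon endpoint, and variant labels are hypothetical; this is not Torbit's actual tagging API.

```javascript
// Hypothetical sketch of a RUM beacon: derive simple metrics from
// Navigation Timing fields and label them with an A/B variant tag.
function buildBeacon(timing, variant) {
  // Total page load: navigation start to the end of the load event.
  const loadMs = timing.loadEventEnd - timing.navigationStart;
  // Time to first byte: how quickly the server (or CDN edge) responded.
  const ttfbMs = timing.responseStart - timing.navigationStart;
  return { variant, loadMs, ttfbMs };
}

// In a browser you would collect and send this on the load event,
// e.g. (endpoint "/rum" is a placeholder):
//
// window.addEventListener("load", function () {
//   const beacon = buildBeacon(performance.timing, "cdn");
//   navigator.sendBeacon("/rum", JSON.stringify(beacon));
// });
```

With enough page views tagged this way, comparing median load times between the two variants shows what real visitors actually experience, which is exactly the comparison Klein ran.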
The results of these two tests are quite surprising. A CDN is a well-known tool that improves the performance of most websites: you can’t change the speed of light, but you can make sure your content is delivered from servers closer to your visitors. Although disappointed with the results, Klein was careful to point out other benefits of using a CDN. He said the ability to offload origin bandwidth and tolerate traffic spikes was enough to justify the cost on its own.
Performance guru Steve Souders took a look at the results and reminded readers in the comments that:
There are numerous performance best practices. Not all of them apply to every site. But that doesn’t mean the best practice is bad – it just might not be relevant at that time for that particular site.
Souders traced the problem back to several large images that were being served from the CDN, yet were taking far longer than expected to load. I’m confident Wayfair will be able to take this data to their CDN provider and get that particular issue resolved. Many sites like Wayfair spend thousands of dollars on their CDN but have never taken the time to evaluate what performance gains they are actually receiving for their money. I love that Jonathan was willing to set up this test and share the results with the world. It’s a great example of how you can use Real User Measurement to hold your vendors accountable for the performance gains they promise.
Using Torbit Insight, Wayfair was able to set up this test and get meaningful data back in a very short period of time. It’s a great example of how easy it is to use our tagging feature to run a performance-related A/B test. If you haven’t already, be sure to read Jonathan Klein’s full post for all the details on how he conducted the experiment. For anyone else interested in conducting a similar test, send us a note; we’d love to help.