Today, Patrick Meenan (the mastermind behind WebPagetest) wrote a great blog post titled Motivation and Incentive. In this post, Patrick discussed his favorite performance article from 2012 – an article by Kyle Rush about the A/B testing that took place on Obama’s campaign site during the 2012 election. Patrick went on to talk about the importance of having the right incentives in place to make change happen.
I’ve been talking a lot lately about the misaligned incentives of CDNs. I was pleased to see Patrick agrees:
Maybe it’s my tinfoil hat getting a bit tight, but given that CDNs usually bill you for the number of bits they serve on your behalf, it doesn’t feel like they are particularly motivated to make sure you are only serving as many bits as you need to. Things like always gzipping content where appropriate is one of the biggest surprises. It seems like a no-brainer but most CDNs will just pass-through whatever your server responds with and won’t do the simple optimization of gzipping as much as possible (most of them have it as an available setting but it is not enabled by default).
Certainly you don’t want to be building your own CDN but you should be paying very careful attention to the configuration of your CDN(s) to make sure the content they are serving is optimized for your needs.
If you haven’t read his post yet, be sure to check it out. He hits on a lot of important topics as he explores the root motivations behind providers across the entire tech stack.
Over the last few years, I’ve heard countless stories from customers who have been surprised to learn that their CDN didn’t have Gzip enabled. It sounds crazy since Gzip is one of the simplest tricks to make your site load faster. CDNs promise to make your site faster, but they also charge by the byte.
Think about that.
CDNs make more money when they serve larger files, more frequently.
Why are CDNs not doing more to enable compression for their customers? Sadly, as it often turns out, to find the answer you simply need to follow the money. The larger the files you send, the more money your CDN makes. This puts their business goals directly at odds with their marketing that says they want to help make your website fast.
I decided to dig into the data to see how widespread the problem is. I recently shared the results on the Performance Calendar, which is a great resource for anyone who cares about performance. If you use a CDN or have ever questioned whether you’re getting the performance you’re paying for, I hope you’ll check out the full article.
In the last few months since we launched Torbit Insight, hundreds of top retailers and large media properties have adopted Real User Measurement on their sites. In fact, we’ve measured over 3 billion page views for retailers like Wayfair, CafePress and Build.com. As we’ve had the privilege of working with some of the largest sites around, we’ve noticed an ongoing trend. Our customers are starting to depend on the Real User Measurement (RUM) data we give them as their primary source for monitoring their website performance.
Performance measurement has traditionally been done using synthetic testing (sometimes also referred to as active monitoring). Synthetic testing means loading a website at regular intervals from one or more locations around the world to see how fast it loads. This data is then used to generate reports or trigger alarms when there are performance issues with your site. While synthetic testing is certainly useful, hundreds of top sites are turning to Real User Measurement as a source of more accurate data.
While synthetic testing is valuable for deep analysis and debugging, it has a few shortcomings. With synthetic testing you only get visibility into the specific pages that you test. This is typically a small fraction of the pages your customers actually visit, leaving large sections of your site without monitoring. Synthetic testing also gets expensive, especially if you try to increase your coverage to more pages across your site. You’re also putting more stress on your servers, taking valuable capacity away from your actual visitors. Of course, the main problem with synthetic testing is that it makes so many assumptions about your visitors. There are dozens of factors that affect the speed at which someone is able to access your site. Where are they geographically located? What is their connection speed? Which browser are they using? Are they visiting for the first time, or are they a repeat visitor? All of these variables affect the loading experience for your visitors. If you want to know what your visitors are actually experiencing, you have to use Real User Measurement.
It’s impossible to test every variation of location, network connection speed, OS, browser, and add-on. That’s not to say synthetic testing is bad; there’s a place for both.
There are a few key factors accelerating the adoption of Real User Measurement. All of the major browsers now support the web timing specs, which lets us collect highly accurate timing data from the browser itself, starting even before the page is loaded. That means we can time things like DNS lookups and the TCP handshake. One of the challenges of implementing RUM in the past has simply been the massive amount of data it generates. With the explosion of “big data” tools, it’s now feasible to collect billions of samples and make sense of them. Thankfully, you don’t have to build it yourself: we offer a great Real User Measurement tool at Torbit, and we even made it free for people to get started.
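As a rough illustration, here’s a minimal sketch (in TypeScript) of the kind of data the browser’s Navigation Timing API exposes. The beacon endpoint is hypothetical, and a real RUM library would collect far more than this:

```typescript
// A minimal sketch of reading Navigation Timing data in the browser.
// Run after the load event so loadEventEnd is populated.
window.addEventListener("load", () => {
  setTimeout(() => {
    const t = window.performance.timing;

    const dnsLookupMs = t.domainLookupEnd - t.domainLookupStart;
    const tcpHandshakeMs = t.connectEnd - t.connectStart;
    const pageLoadMs = t.loadEventEnd - t.navigationStart;

    // A RUM beacon might report these to a collector ("/rum" is hypothetical).
    const body = JSON.stringify({ dnsLookupMs, tcpHandshakeMs, pageLoadMs });
    navigator.sendBeacon("/rum", body);
  }, 0);
});
```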
Every visitor matters. If your site is slow, chances are you are leaving visitors and revenue on the table. The first step in making your site faster is making sure you have an accurate way to measure your speed.
In the article, I also talked about the importance of using Real User Measurement (RUM) to get an accurate picture of what’s happening on your site. Not only do you have to make sure you are looking at the right metrics, you have to make sure your methodology for collecting your data is correct too.
There’s a big shift happening on the web right now in the way that people measure their websites. People are searching for the truth. People want to look at the real performance numbers from their actual users. Misleading metrics and synthetic tests won’t cut it anymore. You can check out the full article here.
I love it when we find people willing to share detailed performance data for their websites. Last year Etsy announced on their blog that they were committed to speeding up their website and promised to share their progress publicly. Etsy kept their word, and published Site Performance Reports in August and November documenting their efforts to make their site load faster.
A few days ago Jonathan Klein, a Senior Software Engineer at Wayfair, followed in their footsteps and posted a site performance report on Wayfair’s engineering blog comparing their January 2012 site performance on key pages (i.e. home, search results, product browsing, and product pages) to their performance in September 2011. What they found was surprising.
In almost all areas their page load time went up. For example, their search results page, when looking at the 95th percentile load time, went from 1 second to 3.7 seconds. The only place they saw any improvement was in the 95th percentile for their homepage – in September it took 0.54 seconds to load, and in January it decreased to 0.433 seconds (however, when looking at their average load time, the homepage increased from 0.245 seconds in September to 0.267 seconds in January).
Ultimately this caused them to ask, “What happened between September and now?” Jonathan had an explanation:
There is a very simple reason for this – we stopped focusing on performance.
We had made performance a priority for a while – we treated it like a project, we set goals, and we achieved many of them. But then we made the mistake of resting somewhat on these achievements and moved on. Don’t misunderstand — nobody actually said, “we’re done, let’s forget about performance” but at the same time no one was actually dedicated to improving performance over the last 4 months, and only a few projects were explicitly designed to speed things up. Instead the relentless drive for new functionality (which usually ends up taxing our servers) took over and became the focus. And the results once again demonstrate that the natural trend for load time is up if you take your eyes off the target. On top of that, traffic on our sites is steadily increasing, adding further complexity to the situation (though in the end this is a good problem to have).
Now, before you give them a hard time, keep in mind these performance numbers are still far better than most sites on the internet. I’m impressed by Wayfair’s willingness to be transparent about their site’s performance and Jonathan’s honesty in talking about their failures as well as successes. To me, it shows the priority they place on their site’s performance and that they really care about their users’ experience.
As you know, a site’s performance directly impacts a user’s experience. The longer the load time, the more likely a visitor is to leave your site. I know I’ve definitely gotten frustrated while waiting for websites to load. You shouldn’t let your site’s performance slip; you have to be aware of it at all times. Otherwise, as Jonathan notes, you’ll “pay the price” for not making it a priority.
Wayfair has made a big commitment to the performance of their site, and yet they still face the challenge of continually optimizing it with each new release. It’s a struggle for many internet retailers, as you can get stuck doing the same performance optimizations over and over again. That’s one of the reasons so many people are choosing an automated service like Torbit – we take care of the performance optimization process, helping sites load faster and ultimately increasing user satisfaction. Our service gives sites like Wayfair the ability to focus on other important projects while keeping their performance optimized at the same time.
In the end, kudos to companies like Wayfair and Etsy for their transparency. It’s nice to see sites paying attention to their performance and being willing to talk about it publicly – after all, the first step to improving your site’s speed is to measure it.
The following article is a guest post by Dr. Steve Seow. Seow is an architect with Microsoft, has a Ph.D. in Psychology from Brown University and is the author of ‘Designing and Engineering Time’. We thought it would be interesting to get his perspective on performance and the psychology behind how we perceive time.
We can safely assume that many, if not all, the readers of this blog understand that performance is critical to the user experience of any software solution. Many of us are trained to tweak code to optimize performance, and we measure and express the deltas in percentages or time units to quantify the optimization. These metrics are important because any time, money and effort expended needs to be justified and someone (perhaps the person who signs the check) needs to be assured that there is a favorable return on investment (ROI).
Let’s make this more interesting. Suppose you estimate that it will cost $30,000 (over half of your budget) to reduce the processing time of a particular feature from 20 to 17 seconds. A full 3 seconds! Do you pull the trigger? What if the delta is from 10 to 7 seconds? Is there a net positive ROI in each case?
This hypothetical situation is what got me interested in performance, and more specifically, the psychology and economics that go into software engineering practices and decisions. In my first year at Microsoft, an engineering director, knowing my psychology background, asked me a really simple question: how much do we need to improve the timing of X in order for users to even notice the difference? The question was clearly a psychological one. We’re no longer talking about ones and zeros here. We’re talking sensation, perception, and psychophysics. And we’re not forgetting the economics piece; we’ll come back to that.
Psychologists have measured human sensation and perception for over a century. Without going into details, suffice it to say that we are wired to detect differences in the magnitude of a property in a systematic way. A property (say, the brightness of a light) needs to increase in magnitude by a certain percentage before we go “ah, that’s different than before”. This is known as the j.n.d., or just noticeable difference.
Pooling a ton of psychophysical research on time perception and other perceptual modalities, it became clear that a 20% delta will, probabilistically speaking, ensure that users detect a difference in timing. What does this mean for the two scenarios above? In the first case, the 3-second improvement on 20 seconds is a 15% delta, below the rule-of-thumb j.n.d. of 20%. In the second case, the same 3-second improvement is a 30% delta on 10 seconds. Now we can have some confidence that the difference will be detected.
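To make the arithmetic concrete, here’s a toy check in TypeScript (the function name and threshold default are mine, encoding the rule of thumb above, not a standard API):

```typescript
// A timing improvement is likely to be noticed when the delta is at
// least ~20% of the original duration (the j.n.d. rule of thumb).
function isLikelyNoticeable(beforeSec: number, afterSec: number, jnd = 0.2): boolean {
  return (beforeSec - afterSec) / beforeSec >= jnd;
}

console.log(isLikelyNoticeable(20, 17)); // false: 3s is only 15% of 20s
console.log(isLikelyNoticeable(10, 7));  // true:  3s is 30% of 10s
```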
An important thing to remember is that this doesn’t suggest that deltas below 20% are not worth the investment. Recall that at the beginning we were weighing the investment against the proportion of the budget it consumes, so now we’re talking economics. If a feature of your website can be tweaked from 6 seconds to 5 (a 17% improvement) at relatively low cost, you would be foolish not to optimize. The correlation between performance and revenue has been well documented, and shaving a second off your load time can have a meaningful impact on your revenue. Depending on your scale, even a mere 5% improvement could easily be worth the investment.
It’s always great to see website performance getting mainstream coverage, and with Cyber Monday quickly approaching, awareness is at an all-time high. Today, CNBC covered retailers that are preparing for the holiday season and discussed the impact of website performance on sales.
Here are a few key quotes:
“According to Compuware’s Gomez division, 71% of consumers expect websites to load as quickly on their mobile devices as they load at home but very few retailers can meet that need for speed. Speed is critical and we’re getting ready to go into a time of year where the web traffic can grow by ten times or more.”
“Speed is critical. After three seconds, abandonment rates for mobile websites are nearly identical to that of desktop browsing at home. The simple equation is that speed equals sales. Consumers will buy more from faster websites and they’ll abandon slower websites in favor of alternatives.”
Here at Torbit we’re always looking for ways to improve the performance of our clients’ websites in an automated way. Our latest optimization is called CSS Smart Loader and is a better way to load and manage CSS.
When dealing with CSS and performance there are a few rules to consider:
1) Minify / compress your CSS
2) Combine your CSS into one request
3) Put your CSS on a CDN
4) Optimize for best caching
5) Make sure your CSS is non-blocking
Our new CSS Smart Loader handles all of these issues beautifully in order to get the best performance possible. Let’s go through them one at a time and cover the details of how this new optimization works.
Minify / compress your CSS
In order to reduce the overall download size it’s important to minify and compress your CSS resources. Minifying refers to removing excess white space and simplifying syntax in order to reduce the number of characters in the CSS resource. Compressing, in this case, refers to re-encoding the contents of a file via a common standard (GZIP) to reduce the number of bytes in transit. Virtually all browsers support GZIP compression. Torbit automatically handles minifying and gzipping all the CSS on your site, as well as any static third-party CSS you include, in order to minimize the overall CSS download for the page.
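For illustration, here’s a minimal sketch of both steps using Node’s built-in zlib. The regex-based minifier is deliberately naive and only meant to show the idea; real minifiers (including ours) do far more:

```typescript
import { gzipSync } from "zlib";

// Deliberately naive minification: strip comments and collapse whitespace.
function minifyCss(css: string): string {
  return css.replace(/\/\*[\s\S]*?\*\//g, "").replace(/\s+/g, " ").trim();
}

const css = "body {\n  color: #000000; /* default text color */\n}";
const compressed = gzipSync(Buffer.from(minifyCss(css)));

// Serve `compressed` with a "Content-Encoding: gzip" response header.
console.log(`original: ${css.length} bytes, gzipped: ${compressed.length} bytes`);
```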
Combine your CSS into one request
Put your CSS on a CDN
Optimize for best caching
By optimizing for the best caching you can reduce how often your site’s visitors have to download all of your CSS. If you had no caching, every visitor would be forced to download all the CSS on every pageview. Obviously this is not ideal, and you can improve performance by keeping these files in the browser as long as possible. There are several parts to how we improve browser caching. The first is using far-future expires headers and versioning. This amounts to adding a special header to the CSS response that tells the browser it’s safe to cache the file for a very long time (several years). Normally this would be a problem when you change your CSS, since the browser won’t know to re-download the file, so to address this we also do versioning. Versioning refers to changing the filename whenever the file contents change. This prevents pages from using out-of-date files while keeping a long cache life.
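A rough sketch of what this combination looks like on a Node server; the hashing scheme, file paths, and header values are illustrative, not our exact implementation:

```typescript
import { createHash } from "crypto";
import { readFileSync } from "fs";

// Version the filename with a content hash so a changed file gets a new
// URL, and the old, long-cached copy is simply never requested again.
function versionedName(path: string): string {
  const hash = createHash("md5").update(readFileSync(path)).digest("hex").slice(0, 8);
  return path.replace(/\.css$/, `.${hash}.css`);
}

// Far-future caching headers for the versioned file: safe to cache for a
// year because any change to the contents changes the URL.
const cssHeaders = {
  "Content-Type": "text/css",
  "Cache-Control": "public, max-age=31536000",
};

console.log(versionedName("styles/main.css")); // e.g. styles/main.3f2a9c1d.css
```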
The other part of our improved browser caching involves HTML5 local storage. Local storage is basically a key/value data store that lives in the visitor’s browser. The overall size of this storage varies, but is generally around 5MB per domain. Most modern browsers now support local storage, so where available we use it as a smarter cache. Local storage lets us store the individual CSS files (because we pushed them down to the browser as strings) so that we can include only the CSS files needed on subsequent pages. It also means that if one of your files changes we can push only the updated file, or only the new file if another stylesheet is requested on a future pageview. You can also generally get a longer cache life with local storage because the space is specific to your domain, so you’re not competing with other sites for room in the cache. Lastly, local storage is generally cleared less often than the browser cache, which means the files stay in the user’s browser longer and returning visitors are less likely to hit a completely empty cache.
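Here’s a hedged sketch of the local-storage idea. The manifest shape and key scheme are my own invention for illustration, not Torbit’s actual wire format:

```typescript
// A CSS entry the page might embed: a name, a content version, and a URL
// to fetch on a cache miss (all hypothetical).
interface CssEntry {
  name: string;
  version: string;
  url: string;
}

async function loadCss(entry: CssEntry): Promise<void> {
  // Keying on name + version means a changed file is a natural cache miss.
  const key = `css:${entry.name}:${entry.version}`;
  let css = localStorage.getItem(key);

  if (css === null) {
    // Cache miss: fetch the stylesheet and store it for future pageviews.
    css = await (await fetch(entry.url)).text();
    try {
      localStorage.setItem(key, css);
    } catch {
      // localStorage can be full or disabled; fall back to just injecting.
    }
  }

  // Inject the CSS into the page via a <style> tag.
  const style = document.createElement("style");
  style.textContent = css;
  document.head.appendChild(style);
}
```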
We include your inline styles in these optimizations as well, which allows us to minify them and cache them in the browser’s local storage so they won’t have to be downloaded on every pageview – reducing the overall size of every page.
Make sure your CSS is non-blocking
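By default, a stylesheet referenced with a plain link tag blocks rendering until it finishes downloading. The CSS Smart Loader avoids this; as a rough sketch of the general technique (not our exact implementation), CSS can be attached from JavaScript so the page can start rendering right away:

```typescript
// A minimal sketch of loading CSS without blocking rendering: script-
// inserted stylesheets don't hold up the parser the way markup ones do.
function loadCssAsync(href: string): void {
  const link = document.createElement("link");
  link.rel = "stylesheet";
  link.href = href;
  document.head.appendChild(link);
}

// The stylesheet URL is hypothetical.
loadCssAsync("/assets/main.css");
```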
We’re really proud of our new CSS Smart Loader. It’s an important step in improving the overall performance of a webpage, and especially important for improving onready time and perceived performance, since CSS blocks rendering. We’ve got a lot of other exciting performance optimizations in the works. As always, you can do these optimizations yourself, but why bother when we can automate them for you with Torbit!
We’re excited to announce the release of a new performance filter for Torbit-powered sites.
WebP (pronounced “weppy”) is a fairly new image format from Google. In a test with 900,000 images from around the web, Google found the WebP format made images 39.8% smaller than JPEG images of similar quality. Since smaller files download faster than large ones, this new format can have a big impact on website performance. Check out the Google gallery for a few example images.
The only problem is that WebP is currently only supported in the latest versions of Chrome and Opera. For many webmasters, it doesn’t make sense to store duplicate copies of every image just for those two browsers.
That’s where Torbit comes in.
As of Friday, we’ve started making WebP copies of every image and will serve them instead when a visitor is using a browser that supports WebP. We are already seeing impressive results since adding this new optimization, especially on image-heavy sites. We are still gathering data on how much of a difference WebP makes and we plan on sharing those numbers soon.
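Under the hood, the general approach is content negotiation: browsers that support WebP advertise it in their Accept request header. A rough sketch in TypeScript (the handler shape and file layout are hypothetical, and assume precomputed .webp copies exist alongside the originals):

```typescript
import { IncomingMessage } from "http";

// Pick a WebP copy when the browser advertises support via its Accept
// header; otherwise fall back to the original image. Remember to send
// "Vary: Accept" so caches keep the two variants separate.
function pickImage(req: IncomingMessage, imagePath: string): string {
  const accept = req.headers["accept"] ?? "";
  return accept.includes("image/webp")
    ? imagePath.replace(/\.(jpe?g|png)$/i, ".webp")
    : imagePath;
}
```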