Velocity 2012 and WebPerfDays

by Josh Fraser on June 6, 2012

We’re only 19 days away from Velocity 2012, the annual gathering of the brightest minds and leaders in the web performance industry. The conference is taking place in Santa Clara, CA on June 25-27. Each year Velocity attracts the best-known names in the industry and lots of geeks who care about making things fast. The Torbit team will be there and we’re looking forward to catching up with many of our customers and friends in the performance space. If you’re planning to be there, let us know and we’ll be sure to meet up. If you haven’t gotten your tickets yet, it’s not too late to register.

I’m also incredibly excited about the first annual WebPerfDays, a single day “unconference” that is taking place on June 28th (the day after Velocity ends) at the Google campus in Mountain View. Inspired by the ever popular DevOpsDays, WebPerfDays promises to be an event you won’t want to miss. Tickets are already sold out, but I’m told that more will be opening up soon so it’s worth making sure your name is on the waiting list.

Steve Souders and Aaron Kulick have both given an amazing gift to our community with their work planning these events (along with their countless other contributions!). The anticipation is already building here at Torbit. If you’re someone who cares enough about performance to be reading this blog, you probably don’t want to miss out on either one of these events.




The search for truth

by Josh Fraser on May 24, 2012

I’ve been a longtime fan and reader of High Scalability, a blog about building bigger, faster, more reliable websites. Yesterday, I was honored to get the chance to write a guest post for them titled, Averages, Web Performance Data, And How Your Analytics Product Is Lying To You. I wrote about the importance of looking beyond averages when analyzing your performance data.

Did you know that 5% of the pageviews on Walmart.com take over 20 seconds to load? Walmart discovered this recently after adding real user measurement (RUM) to analyze their web performance for every single visitor to their site. Walmart used JavaScript to measure their median load time as well as key metrics like their 95th percentile. While 20 seconds is a long time to wait for a website to load, the Walmart story is actually not that uncommon.
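
To make the idea concrete, here is a minimal sketch (not Walmart’s or Torbit’s actual code) of how median and 95th-percentile load times can be computed in JavaScript from a set of per-visitor measurements; the sample values are hypothetical:

    // Nearest-rank percentile over an array of load times in milliseconds.
    function percentile(values, p) {
      var sorted = values.slice().sort(function (a, b) { return a - b; });
      var index = Math.ceil((p / 100) * sorted.length) - 1;
      return sorted[Math.max(0, index)];
    }

    var loadTimes = [780, 850, 950, 1100, 1200, 3100, 22000]; // hypothetical samples
    console.log('median: ' + percentile(loadTimes, 50) + 'ms');          // 1100ms
    console.log('95th percentile: ' + percentile(loadTimes, 95) + 'ms'); // 22000ms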

In the article, I also talked about the importance of using Real User Measurement (RUM) to get an accurate picture of what’s happening on your site. Not only do you have to make sure you are looking at the right metrics, you have to make sure your methodology for collecting your data is correct too.

There’s a big shift happening on the web right now in the way that people measure their websites. People are searching for the truth. People want to look at the real performance numbers from their actual users. Misleading metrics and synthetic tests won’t cut it anymore. You can check out the full article here.




Making sure you don’t lose impulse buyers

by Josh Fraser on May 21, 2012

The Internet is a sea of impulses. You never know where you’ll end up or what will catch your eye. It’s not surprising that over 22 percent of Internet users claim to have made an impulsive purchase of some kind in the last year, according to the Association of Customer Research.

Many businesses don’t think about the visitors who ended up on their website on an impulse. Maybe that new visitor clicked a random link when they were inspired and has now found a brand new website, product or service sitting right under their nose.

That’s your visitor now. They might be a future customer, brand advocate or maybe they’ll just be a fan. Don’t lose them due to a slow website!

We’ve entered a phase of the Internet where site speed has never mattered more. With so many third-party JavaScript libraries and so many image-heavy websites, speed seems to be the first thing that goes out the door, an afterthought for most companies.

Take this graph, which shows the correlation between sales conversions and page speed for one of our customers. Along the X axis are the average load times experienced across a session for each visitor. The Y axis is the rate at which those visitors converted to paying customers.

When average load time increased from half a second to one and a half seconds, conversions dropped by over 3%. That’s money, eyeballs and impulses – all lost.

We built Torbit Insight because the available tools on the market weren’t able to provide the information our customers wanted to see. We show you how your conversion rate, bounce rate, engagement, attention and revenue are affected by the performance experienced by every single visitor to your site. Who knows how long someone is going to be on your site? Don’t lose them. And if you did lose them, find out why.

Start measuring your speed today and take the first step in improving your performance and retaining your impulse buyers. Try Torbit Insight for free today.




Announcing Torbit Insight: the easiest way to add Real User Measurement to your site

by Josh Fraser on April 25, 2012

Today, we are proud to announce a new product called Torbit Insight. Torbit Insight is a Real User Measurement (RUM) tool that allows website owners to see the actual loading times for every visitor to their site. The best part is that it also lets you see the correlation between your web page speed and core business metrics like your bounce rate and conversion rate.

For the last year and a half at Torbit we have focused on making the Internet a faster place for everyone. Businesses rely on our Site Optimizer product to make their websites load fast because they understand that page speed is crucial to their bottom line. Now, with the launch of Insight, website owners have an easy tool to evaluate how fast their site is loading and quantify how much performance matters to their business.

Torbit Insight is unlike anything that’s available on the market today. Unlike synthetic testing, where your website is loaded from a few key servers around the world, Insight uses JavaScript to measure the actual load times for every visitor to your site. This gives you access to far more data than you would have with synthetic testing. With Insight, you don’t have to make any assumptions about which variables are impacting your website performance, like whether people are visiting the site for the first time, which browser they are using, or which part of the world they live in.

Massive websites like Amazon have the statistics to show that a 0.1 second delay in load time can lead to a 1% drop in their sales. But what about everyone else? What do slow load times mean for your business? We wanted to build a tool to show how important speed is for everyone, from top Internet retailers, to media properties, to startups. There was simply nothing out there that tied it all together to help make a business case for web performance.

Torbit Insight uses data that has only recently become available in browsers to show you the speed of each visitor navigating to your website. Rather than rounding everything into an average, we show you key metrics like the median page load speed along with load times of the 90th, 95th and 99th percentiles of your visitors. These additional metrics are an important differentiator for Insight since performance data has frequent outliers, and the data can be quite misleading if you only consider averages.
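
For the curious, the browser data mentioned above comes from the Navigation Timing API. Here is a minimal sketch of the general RUM technique with a hypothetical beacon endpoint; it illustrates the approach, not Insight’s actual implementation:

    // Capture a real user's load time with the Navigation Timing API and
    // report it via a tracking pixel. The beacon URL is a placeholder.
    window.addEventListener('load', function () {
      setTimeout(function () {                        // wait a tick so loadEventEnd is populated
        var t = window.performance && window.performance.timing;
        if (!t) return;                               // no Navigation Timing support
        var pageLoad = t.loadEventEnd - t.navigationStart;
        var beacon = new Image();
        beacon.src = '/rum-beacon?load=' + pageLoad;  // hypothetical endpoint
      }, 0);
    });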

With graphs that correlate page speed to your bounce rate as well as conversion rates, Torbit Insight helps you understand in real-time where your site is slow and why. We’ll show you where your visitors are coming from and suggest optimizations that can be made on the front-end to drastically increase your website’s performance.

Dozens of top sites across the Internet are already using Insight, including top retailers like Wayfair and large media properties like the Cheezburger Network. Recently we made a video with Jonathan Klein, who leads the performance team at Wayfair, so you can hear what he had to say.

Today, we’re rolling out Insight with three plans: Free, Standard and Premium. We’ve worked hard to build this product and we’re incredibly excited to get to finally share it with you. We look forward to hearing your feedback as we continue to do everything we can to make the internet a faster and better place for everyone.

Learn more or Sign up for free to start measuring your performance today.




Leading internet retailers reveal intimate details about their speed

by Chelsea Fought on January 25, 2012

I love it when we find people willing to share detailed performance data for their websites.  Last year Etsy announced on their blog that they were committed to speeding up their website and promised to share their progress publicly.  Etsy kept their word, and published Site Performance Reports in August and November documenting their efforts to make their site load faster.

A few days ago Jonathan Klein, a Senior Software Engineer at Wayfair, followed in their footsteps and posted a site performance report on Wayfair’s engineering blog comparing their January 2012 site performance on key pages (i.e. home, search results, product browsing, and product pages) to their performance in September 2011.  What they found was surprising.

In almost all areas their page load time went up.  For example, their search results page, when looking at the 95th percentile load time, went from 1 second to 3.7 seconds. The only place they saw any improvement was in the 95th percentile for their homepage – in September it took 0.54 seconds to load, and in January it decreased to 0.433 seconds (however when looking at their average load time, the homepage time increased from 0.245 seconds in September to 0.267 seconds in January).

Ultimately this caused them to ask, “What happened between September and now?”  Jonathan had an explanation:

There is a very simple reason for this – we stopped focusing on performance.

We had made performance a priority for a while – we treated it like a project, we set goals, and we achieved many of them. But then we made the mistake of resting somewhat on these achievements and moved on. Don’t misunderstand — nobody actually said, “we’re done, let’s forget about performance” but at the same time no one was actually dedicated to improving performance over the last 4 months, and only a few projects were explicitly designed to speed things up. Instead the relentless drive for new functionality (which usually ends up taxing our servers) took over and became the focus. And the results once again demonstrate that the natural trend for load time is up if you take your eyes off the target. On top of that, traffic on our sites is steadily increasing, adding further complexity to the situation (though in the end this is a good problem to have).

Now, before you give them a hard time, keep in mind these performance numbers are still far better than most sites on the internet. I’m impressed by Wayfair’s willingness to be transparent about their site’s performance and Jonathan’s honesty in talking about their failures as well as successes. To me, it shows the priority they place on their site’s performance and that they really care about their users’ experience.

As you know, a site’s performance directly impacts a user’s experience. The longer the load time, the more likely a visitor is to leave your site. I know I’ve definitely gotten frustrated while waiting for different websites to load. You shouldn’t let your site’s performance slip; you have to be aware of it at all times. Otherwise, as Jonathan notes at Wayfair, you’ll “pay the price” for not making it a priority.

Wayfair has made a big commitment to the performance of their site, and yet they still have the challenge of continually optimizing it with each new release. It’s a struggle for many internet retailers, as you can get stuck doing the same performance optimizations over and over again. That’s one of the reasons so many people are choosing to use an automated service like Torbit – we take care of the performance optimization process, helping sites reduce their load times and ultimately increase user satisfaction. Our service gives sites like Wayfair the ability to focus on other important projects while keeping their site performance optimized at the same time.

In the end, kudos to companies like Wayfair and Etsy for their transparency regarding their site performance. It’s nice to see sites paying attention to their site’s performance and being willing to talk about it publicly – after all, the first step to improving your site’s speed is to measure it.




Help us stop SOPA – protest on your site!

by Jon Fox on January 15, 2012

In case you haven’t already heard, Congress is currently reviewing new bills that threaten the very fabric of the internet. The Stop Online Piracy Act (SOPA) and its sister bill, the Protect IP Act (PIPA), are currently working their way through the United States House of Representatives and the Senate. If these bills pass, any copyright holder can simply allege that a site infringes on their copyright, and the site could be redirected at the DNS level and cut off from all major ad services and payment processing services – all without any trial or due process of any kind. It’s scary stuff, but we won’t rehash the full issue here. Read this post from the Electronic Frontier Foundation to learn more about this issue.

Many companies have voiced their protest and will be blacking out their sites or informing their users about these upcoming bills. We saw what other companies like Reddit and Craigslist were doing and wanted to offer other sites an easy way to participate. We came up with a JavaScript snippet you can add to your page that will show a protest popup to your visitors. You can also click here to see it live.

The SOPA popup will only be displayed on January 18th from 8am–8pm EST (1300–0100 UTC) by default, but if you’re JavaScript savvy feel free to change that. The popup will also only be displayed to a user once (by setting a cookie in their browser). You can also manually trigger the message before then by adding the hash #stopsopa to the end of any URL that has the JavaScript installed. If you don’t want our hosted version, feel free to grab the source from GitHub; otherwise grab the snippet below and add it to your site. If you’re a Torbit user, you can add this to your site by simply selecting the option in your account on the Torbit filters page.
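
For anyone curious about the mechanics, the display rules described above boil down to logic roughly like the sketch below. This is illustrative only, not the actual snippet; grab the real source from GitHub as mentioned above.

    // Roughly how the display rules can be checked (illustrative only).
    function shouldShowSopaMessage() {
      if (window.location.hash === '#stopsopa') return true;               // manual trigger
      if (document.cookie.indexOf('stopsopa_shown=1') !== -1) return false; // shown once per user
      var now = new Date();
      var start = new Date('2012-01-18T13:00:00Z');                         // 8am EST on January 18th
      var end = new Date('2012-01-19T01:00:00Z');                           // 8pm EST
      return now >= start && now < end;
    }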

Installation

If you’re on WordPress.org, Blogger, or Typepad you can quickly install it using the plugins linked below. For Blogger users, make sure you’re logged in before trying to install it.

WordPress

To install the JavaScript snippet directly, copy this async snippet into the <head> or top of the <body> of your website:
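
The hosted snippet itself isn’t reproduced here, but an async loader of this kind generally looks like the sketch below, wrapped in a script tag. The URL is a placeholder, not the real hosted file.

    // Generic async loader pattern (placeholder URL, not the actual hosted snippet).
    (function () {
      var s = document.createElement('script');
      s.src = 'https://example.com/stopsopa.js';   // hypothetical host
      s.async = true;
      var first = document.getElementsByTagName('script')[0];
      first.parentNode.insertBefore(s, first);
    })();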

Help us protect the internet! Contact your representatives, spread the word, and join us in protesting censorship!




Day 7 of the Performance Calendar: Automating Website Performance

by Jon Fox on December 8, 2011

Stoyan Stefanov was one of the creators of YSlow 2.0 and is now leading performance efforts at Facebook. In 2009 Stoyan started the Performance Calendar which he dubbed “the speed geek’s favorite time of the year”. For the 24 days leading up to Christmas, the Performance Calendar features an article a day from a leading performance expert. It’s like those calendars you used to get as a kid, but instead of getting candy, you get to unwrap awesome performance tips. As you would imagine, we’re big fans here at Torbit.

Yesterday our CEO, Josh Fraser, wrote an article for day 7 of the Performance Calendar titled Automating Website Performance. In it he talks about some of the challenges we’ve faced in the past with Torbit and some of the lessons we’ve learned about automating website performance optimizations. It’s definitely worth a read if you haven’t yet.




Psychology of Software Performance: the Just Noticeable Difference

by Steve Seow on November 29, 2011

The following article is a guest post by Dr. Steve Seow. Seow is an architect with Microsoft, has a Ph.D. in Psychology from Brown University and is the author of ‘Designing and Engineering Time’. We thought it would be interesting to get his perspective on performance and the psychology behind how we perceive time.

We can safely assume that many, if not all, the readers of this blog understand that performance is critical to the user experience of any software solution. Many of us are trained to tweak code to optimize performance, and we measure and express the deltas in percentages or time units to quantify the optimization. These metrics are important because any time, money and effort expended needs to be justified and someone (perhaps the person who signs the check) needs to be assured that there is a favorable return on investment (ROI).

Let’s make this more interesting. Suppose you estimate that it will cost $30,000 (over half of your budget) to reduce the processing time of a particular feature from 20 to 17 seconds. A full 3 seconds! Do you pull the trigger? What if the delta is from 10 to 7 seconds? Is there a net positive ROI in each case?

This hypothetical situation is what got me interested in performance, and more specifically, the psychology and economics that go into software engineering practices and decisions. In my first year at Microsoft, an engineering director, knowing my psychology background, asked me a really simple question: how much do we need to improve the timing of X in order for users to even notice the difference? The question was clearly a psychological one. We’re no longer talking about ones and zeros here. We’re talking sensation, perception, and psychophysics. We’re not forgetting the economics piece. We’ll come back to that.

Psychologists have measured human sensation and perception for over a century. Without going into details, suffice it to say that we are wired to detect differences in the magnitude of a property in a systematic way. A property of something (say, the brightness of a light) needs to increase in magnitude by a certain percentage before we go “ah, that’s different than before”. This is known as the j.n.d. or just noticeable difference.

Pooling from a ton of psychophysical research on time perception and other modalities of perception, it became clear that a 20% j.n.d. will, probabilistically speaking, ensure that users will detect a difference in timing. What does this mean for the two scenarios above? In the first case, the 3-second improvement on 20 seconds is a 15% delta. This is below the rule-of-thumb j.n.d. of 20%. In the second case, however, the same 3-second improvement is a 30% delta on 10 seconds. Now we can have some confidence that the difference will be detected.
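
Expressed as a quick calculation (a sketch of the rule of thumb, not a hard law):

    // The 20% j.n.d. rule of thumb applied to the two scenarios above.
    function isNoticeable(before, after, jnd) {
      jnd = jnd || 0.20;
      return (before - after) / before >= jnd;
    }

    isNoticeable(20, 17);   // false: a 15% delta, below the j.n.d.
    isNoticeable(10, 7);    // true:  a 30% delta, comfortably above it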

An important thing to remember is that this doesn’t suggest that deltas below 20% are not worth the investment. Recall that at the beginning we were weighing the investment against the proportion of the budget it consumes, so now we’re talking economics. If a feature of your website can be tweaked from 6 to 5 seconds, which is a 17% improvement, at relatively low cost, you would be foolish not to go ahead and optimize. The correlation between performance and revenue has been well documented, and shaving a second off your load time can have a meaningful impact on your revenue. Depending on your scale, even a mere 5% delta improvement for your website could easily be worth the investment.

This is merely the simplest application of j.n.d. in the world of software performance. I bring readers further down the rabbit hole in Chapter 5 of my book, Designing and Engineering Time, and on my site PerfScience.com.




CNBC discusses retailers’ need for tech speed

by Josh Fraser on November 21, 2011

It’s always great to see website performance getting mainstream coverage, and with Cyber Monday quickly approaching, awareness is at an all-time high. Today, CNBC covered retailers that are preparing for the holiday season and discussed the impact of website performance on sales.

Here are a few key quotes:

“According to Compuware’s Gomez division, 71% of consumers expect websites to load as quickly on their mobile devices as they load at home but very few retailers can meet that need for speed. Speed is critical and we’re getting ready to go into a time of year where the web traffic can grow by ten times or more.”

And

“Speed is critical. After three seconds, abandonment rates for mobile websites are nearly identical to that of desktop browsing at home. The simple equation is that speed equals sales. Consumers will buy more from faster websites and they’ll abandon slower websites in favor of alternatives.”

You can watch the full clip on the CNBC website.




A better way to load CSS

by Jon Fox on November 17, 2011

Here at Torbit we’re always looking for ways to improve the performance of our clients’ websites in an automated way. Our latest optimization is called CSS Smart Loader and it’s a better way to load and manage CSS.

When dealing with CSS and performance there are a few rules to consider:
1) Minify / compress your CSS
2) Combine your CSS into one request
3) Put your CSS on a CDN
4) Optimize for best caching
5) Make sure your CSS is non-blocking

Our new CSS Smart Loader handles all of these issues beautifully in order to get the best performance possible. Let’s go through them one at a time and cover the details of how this new optimization works.

Minify / compress your CSS
In order to reduce the overall download size it’s important to minify and compress your CSS resources. Minifying refers to removing excess white space and simplifying syntax in order to reduce the number of characters in the CSS resource. Compressing, in this case, refers to re-encoding the contents of a file via a common standard (GZIP) to reduce the number of bytes in transit. All modern browsers (and most older ones) support GZIP compression. Torbit will automatically handle minifying and gzipping all the CSS on your site, as well as any static third-party CSS you include, in order to minimize the overall download of the CSS on the page.
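
As a rough sketch of those two steps (Node.js, illustrative only; the whitespace-stripping “minifier” here is deliberately simplistic and the filename is hypothetical):

    var fs = require('fs');
    var zlib = require('zlib');

    var css = fs.readFileSync('styles.css', 'utf8');   // hypothetical file
    // Naive minification: drop comments, collapse whitespace, trim around punctuation.
    var minified = css
      .replace(/\/\*[\s\S]*?\*\//g, '')
      .replace(/\s+/g, ' ')
      .replace(/\s*([{}:;,])\s*/g, '$1');
    // GZIP the result; it would be served with a Content-Encoding: gzip header.
    var gzipped = zlib.gzipSync(minified);
    console.log(css.length + ' bytes -> ' + gzipped.length + ' bytes');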

Combine your CSS into one request
Another important best practice for performance is to combine all of your CSS into one HTTP request. The idea here is that every request the browser makes has some overhead (headers and connection costs). By reducing the number of requests we can reduce that overhead and improve the overall performance. Our new CSS Smart Loader does this by including all your CSS as strings in a JavaScript file, then attaching new style elements to the DOM to apply those styles to the page. This allows us to deliver all the CSS in a single request while still treating the individual CSS files you originally included as individual attachments to the DOM. That makes it easier to preserve the proper media attributes for each file, ensures the CSS is included in the same place in the DOM where it was originally included (also preserving the proper order), has some caching benefits (which we’ll get to shortly), and prevents an error at the bottom of one CSS file from breaking the CSS in the files that follow it. This gives us much better control while still reducing the number of connections for CSS down to just one.
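
A minimal sketch of that idea (illustrative only, not the actual Smart Loader; the CSS strings below are hypothetical):

    // The combined JavaScript file carries each original stylesheet as a string.
    var stylesheets = [
      { media: 'all',   css: 'body{margin:0}' },           // hypothetical contents
      { media: 'print', css: '.nav{display:none}' }
    ];

    // Each one is attached as its own <style> element, so order and media
    // attributes are preserved even though only one request delivered them all.
    var head = document.getElementsByTagName('head')[0];
    for (var i = 0; i < stylesheets.length; i++) {
      var style = document.createElement('style');
      style.setAttribute('media', stylesheets[i].media);
      style.appendChild(document.createTextNode(stylesheets[i].css));
      head.appendChild(style);
    }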

Put your CSS on a CDN
In case you’re not already familiar with the term, a CDN is a content delivery network. A CDN works by reducing the latency involved in a request by putting servers closer (geographically or in network hops) to your site’s visitors. It’s faster to send a packet over a shorter distance or a smaller number of hops, and this leads to faster downloads of the content for your site’s visitors. In general it’s good practice to put all static files on a CDN (something Torbit can automate for you). Our new CSS Smart Loader also pushes the JavaScript file containing all your CSS to a CDN (one provided by us, or yours if you already have one), which allows us to get the data to the browser even faster.

Optimize for best caching
By optimizing for the best caching you can reduce how often your site’s visitors have to download all of your CSS. If you had no caching, every visitor would be forced to download all the CSS on every pageview. Obviously this is not ideal, and you can improve performance by keeping the contents of these files in the browser as long as possible. There are several parts to how we improve browser caching. The first is by using far-future expires headers and versioning. This amounts to adding a special header to the CSS response that tells the browser it’s safe to cache the file for a very long time (several years). Normally this could be a problem if you change your CSS, since the browser won’t know to re-download the file (because we’ve told it to cache for a very long time), so to address this we also do versioning. Versioning refers to changing the filename whenever the file contents change. This prevents pages from using out-of-date files while still keeping a long cache life.
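
Sketched out, the two pieces look something like this (illustrative only; the filename scheme is hypothetical, not the one Torbit actually uses):

    // Versioning: hash the contents so the filename changes whenever the CSS
    // changes, which makes a very long cache life safe.
    var crypto = require('crypto');
    function versionedFilename(cssContents) {
      var hash = crypto.createHash('md5').update(cssContents).digest('hex').slice(0, 8);
      return 'styles.' + hash + '.css';
    }

    // Far-future caching: the versioned file is served with headers along the lines of
    //   Cache-Control: public, max-age=31536000
    //   Expires: <a date years in the future>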

The other part of our improved browser caching involves HTML5 local storage. HTML5 local storage is a key/value data store that lives in the client’s browser. The overall size of this storage varies, but is generally around 5MB per domain. Most modern browsers now support local storage, so where available we use it as a smarter cache. Local storage allows us to store the individual CSS files (because we pushed them down to the browser as strings) so that we can include only the CSS files needed on subsequent pages. It also means that if one of your files changes we can push only the updated file, or only the new file if another CSS file is requested on a future pageview. You can also generally get a longer cache life by using local storage because the space is specific to that domain, so you’re not competing with items cached for other sites. And lastly, local storage is generally cleared less often than the browser cache, which means the files can stay in the user’s browser longer and it’s more likely that returning visitors won’t start with a completely empty cache.
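
Here is a minimal sketch of that caching flow (illustrative only; the fetchCss callback and key scheme are hypothetical, not the actual Smart Loader):

    // Reuse a CSS string from local storage when the version matches;
    // otherwise download it, store it, and attach it to the page.
    function loadCss(name, version, fetchCss) {
      var key = 'css:' + name + ':' + version;
      var css = null;
      try { css = window.localStorage.getItem(key); } catch (e) {}   // storage unavailable
      if (!css) {
        css = fetchCss(name);                                        // only the missing/updated file
        try { window.localStorage.setItem(key, css); } catch (e) {}  // quota exceeded, etc.
      }
      var style = document.createElement('style');
      style.appendChild(document.createTextNode(css));
      document.getElementsByTagName('head')[0].appendChild(style);
    }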

We include your inline styles in these optimizations as well, which allows us to minify them and cache them in the browser’s local storage so that they don’t have to be downloaded on every page view – reducing the overall size of every page.

Make sure your CSS is non-blocking
And finally, we need to make sure our CSS is non-blocking. In some cases, CSS can block the browser from downloading any other resources. This means the browser has to wait for the CSS to finish downloading before it can start downloading the images and other resources for the rest of the page. Obviously the more we can download in parallel, the better, so we make sure to include the CSS in a non-blocking manner by delivering it in a non-blocking JavaScript request. This ensures that we don’t keep the rest of the resources on the page from downloading, while still giving us all the performance benefits of the other optimizations mentioned above.
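
The difference, in sketch form (the CDN URL is a placeholder):

    // A plain <link rel="stylesheet"> can block rendering and other downloads,
    // whereas a dynamically created script element does not.
    var loader = document.createElement('script');
    loader.src = '//cdn.example.com/css-bundle.js';   // hypothetical combined CSS-in-JS file
    loader.async = true;
    document.getElementsByTagName('head')[0].appendChild(loader);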

We’re really proud of our new CSS Smart Loader. It’s an important step in improving the overall performance of a webpage, and it’s especially important for improving onready time and perceived performance, since CSS normally blocks rendering. We’ve got a lot of other exciting performance optimizations in the works. As always, you can do these optimizations yourself, but why bother when we can automate them for you with Torbit!



