by Josh Fraser on August 29, 2012
On September 5th, I’ll be speaking at the San Francisco Web Performance Meetup. I’ll be revealing some never-before-shared data from measuring billions of pageviews with our Real User Measurement product, Torbit Insight. We’ll be looking at lots of fascinating performance trends, comparing browsers and geographical performance. If you like performance data, make sure you sign up today. You’re not going to want to miss this one.
The event starts at 7pm and will be taking place at 1 Market Plaza, Steuart Tower, 5th Floor in San Francisco. Check out the Meetup page to learn more and reserve your spot.
by Josh Fraser on August 28, 2012
One of the key differentiators of Torbit Insight is that we offer the ability to quantify the value of your website performance. No other product on the market allows you to correlate website speed and revenue. For years we’ve heard stories from companies like Amazon that tell us that one tenth of a second equals one percent of sales. These public case studies are great, but what about your website? We built Torbit Insight because we wanted an easy way for everyone to know how much speed matters. There’s something incredibly powerful about seeing key metrics based on your own data from your own visitors. We’ve now helped hundreds of companies change performance from being just a technical metric (something your engineers worry about) to a business metric that influences real decisions at your organization.
If you use Insight you may have recently noticed a new addition to your dashboard. We added a graph that shows how user engagement is affected by the speed of your website as measured by the number of pages visited per session. Once again, the data tells a clear story: speed matters.
Showing the correlation between speed and revenue has always been a key focus for us. For example, we have a graph showing the correlation between your site speed and your bounce rate. For internet retailers, we offer conversion tracking that makes it easy for e-commerce sites to track the correlation between website performance and actual sales. Other sites have used our conversion tracking to track other activities, like someone requesting a demo or signing up for a mailing list. Advertising-based businesses tend to care about other metrics, like the number of pages viewed per session. More pageviews means more ad impressions, which means more revenue. This new view should help give more visibility into this important user engagement metric. On the x-axis you can see the various speeds at which visitors experienced your site. On the y-axis you can see how many pages your visitors viewed across their session at each of those speeds. In the example above, you can see that there is a huge advantage in having a 1- or 2-second load time. For this particular site, a 1- or 2-second load time ensures that an average visitor will view 23 pages per session. In contrast, if the site takes 15 seconds or longer to load, an average visitor will only view 5 pages per session.
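To make the idea concrete, here is a minimal sketch of how you might bucket your own RUM data into a pages-per-session-by-load-time view like the one described above. The session records and bucket scheme here are invented for illustration; they are not Torbit’s actual data model.

```javascript
// Hypothetical RUM session records: load time (seconds) and pages viewed.
const sessions = [
  { loadTime: 1.2, pages: 24 },
  { loadTime: 1.8, pages: 22 },
  { loadTime: 4.5, pages: 14 },
  { loadTime: 9.0, pages: 8 },
  { loadTime: 16.3, pages: 5 },
  { loadTime: 16.9, pages: 4 },
];

// Group sessions into whole-second load-time buckets (the x-axis) and
// average the pages viewed per session within each bucket (the y-axis).
function pagesPerSessionByLoadTime(sessions) {
  const buckets = new Map();
  for (const { loadTime, pages } of sessions) {
    const bucket = Math.floor(loadTime); // e.g. 1.8s falls in the "1s" bucket
    const entry = buckets.get(bucket) || { total: 0, count: 0 };
    entry.total += pages;
    entry.count += 1;
    buckets.set(bucket, entry);
  }
  const result = {};
  for (const [bucket, { total, count }] of buckets) {
    result[bucket] = total / count;
  }
  return result;
}

const engagement = pagesPerSessionByLoadTime(sessions);
console.log(engagement); // { '1': 23, '4': 14, '9': 8, '16': 4.5 }
```

With real traffic you would feed in millions of sessions, but the shape of the computation is the same: bucket by experienced load time, then average the engagement metric within each bucket.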
The exact numbers and corresponding graph will vary from site to site, but most sites will see a strong correlation. When your site loads faster, people stick around longer and view more pages. Once you know how much speed matters for your site, you have a much better way of knowing how much to invest in site performance, whether that’s the money you spend on developers or on performance-related technology like a CDN or Dynamic Content Optimization.
We’re excited to offer this new feature to all of our Premium and Enterprise customers. Sign up for a free 14-day trial of Premium Torbit Insight and find out how speed affects your user engagement today.
by Josh Fraser on August 24, 2012
I am excited to announce that starting today we have lifted all pageview limits for Torbit Insight, including for those of you on our free plan. We believe it’s important for our customers to be able to see performance data for 100% of their traffic. We’ve grown to a scale where we can handle the extra traffic and we’re happy to offer our service to every site regardless of its size. It’s our small contribution to making the internet a faster and better place for everyone.
We have also made some changes to our pricing that we want to let you know about. Since launching Torbit Insight, we’ve had a chance to work with hundreds of sites and have gotten lots of great feedback about the functionality you love the most. Over the last few months we have added a lot of features without raising our prices at all. Our goals with these changes are to make things simpler, continue adding more value to our product and ultimately make it easier for more people to use Torbit.
Here is an overview of what is changing today:
No more pageview limits. No asterisk.
We will be including our drill down capabilities with our free plan. This includes our loading timeline, map view and browser breakdown.
For the sake of simplicity, we are saying goodbye to our Standard plan. Customers who were on our Standard plan have been automatically upgraded to Premium accounts for the same price.
We’re raising the price of our Premium plan to $499 / month. We know this is a big price increase, but it’s an important change that will allow us to focus more of our energy on the businesses that find the most value in our product today.
Existing Premium customers will be grandfathered in at the previous price. Feel free to upgrade or downgrade your plan as it makes sense for your business.
Please contact us about an Enterprise plan if you need more than 5 domains, page tagging, extended data retention or 24/7 support. If you would like to discuss any of these changes, please feel free to contact us.
by Josh Fraser on August 23, 2012
It’s important to look at the distribution of your data when considering your performance. I’ve written before about the dangers of only looking at your average loading time. Averages can be very misleading. I’ve seen plenty of sites that have a 4 second average loading time, but a 20 second 90th percentile loading time. That’s why we offer a histogram view and always encourage our customers to track their goals using their 90th or 95th percentile loading time.
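To see why the average can mislead, here is a small sketch comparing the arithmetic mean with a nearest-rank 90th percentile. The sample load times are made up, with a couple of slow pageviews standing in for the long tail every real site has.

```javascript
// Sketch: why the average can hide a slow tail. Sample load times (seconds)
// are invented; eight fast pageviews and two very slow ones.
const loadTimes = [2, 2, 3, 3, 3, 4, 4, 4, 40, 40];

const average = loadTimes.reduce((sum, t) => sum + t, 0) / loadTimes.length;

// Nearest-rank percentile: sort, then take the value at ceil(p * n) - 1.
function percentile(values, p) {
  const sorted = [...values].sort((a, b) => a - b);
  const rank = Math.ceil(p * sorted.length) - 1;
  return sorted[rank];
}

console.log(average);                     // 10.5 -- looks merely mediocre
console.log(percentile(loadTimes, 0.9));  // 40   -- the tail tells the real story
```

One visitor in ten here is waiting 40 seconds, yet the average alone would never tell you that. That’s the case for tracking your 90th or 95th percentile.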
We’ve also had requests to include the geometric mean as one of our featured metrics. We thought that was a great idea and geometric mean is now featured on your Torbit dashboard along with your existing metrics (Median, Average, 90th Percentile, 95th Percentile, and 99th Percentile).
For those of you who are unfamiliar with the geometric mean, here is a quick explanation of what this new metric means for you and your performance data.
As you know, there are a lot of different factors that influence how fast your website loads. The geographic proximity of your server to your visitors has a big impact on your speed. It also makes a difference which browser each visitor is using and whether or not they are on a fast internet connection. When you look at your performance data as a whole, you are seeing the combination of many independent variables. End user performance data usually looks like the graph below. The data does not take a normal distribution shape; it is skewed to the right. However, if you took the logarithm of all the data points and graphed them again, you would see a normal distribution, the standard bell curve. This is why it’s called a log-normal distribution.
The arithmetic mean (what we usually think of as an average) is very susceptible to outliers. In pageload times, it’s easy to have a few really slow data points that skew your data. It’s not a problem if you have a normal distribution, since the outliers balance each other out (both visually and mathematically). The problem is, we don’t have a normal distribution; we have a log-normal distribution. As it turns out, when you have a log-normal distribution, the geometric mean is a much better way of representing the central tendency of your data.
A geometric mean is calculated by multiplying your data points together and taking the nth root (n being the number of data points you have) of the resulting product. With this calculation, the geometric mean normalizes the ranges being averaged, so that no range dominates the weighting, and a given percentage change in any of the properties has the same effect on the geometric mean. In this way, the geometric mean tempers outliers so they don’t have undue weight. To learn more about the geometric mean, I’d recommend heading to Wikipedia for a more in-depth explanation of how it is calculated and when it’s most useful.
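Here is the calculation above as a short sketch, using invented load times with one big outlier. It shows both the nth-root-of-the-product definition and the equivalent log-space form (average the logs, then exponentiate), which is numerically safer when you have millions of data points.

```javascript
// Geometric mean two ways. Sample load times (seconds) are invented,
// with one large outlier to show the effect.
const loadTimes = [1, 2, 2, 4, 100];
const n = loadTimes.length;

const arithmeticMean = loadTimes.reduce((s, t) => s + t, 0) / n;

// Direct definition: (x1 * x2 * ... * xn)^(1/n).
const product = loadTimes.reduce((p, t) => p * t, 1);
const geoMean = Math.pow(product, 1 / n);

// Equivalent log-space form; avoids overflow when n is large.
const geoMeanLog = Math.exp(
  loadTimes.reduce((s, t) => s + Math.log(t), 0) / n
);

console.log(arithmeticMean); // 21.8  -- dragged way up by the 100s outlier
console.log(geoMean);        // ~4.37 -- much closer to the typical experience
```

The single 100-second outlier drags the arithmetic mean to 21.8 seconds, while the geometric mean stays near the values most visitors actually saw.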
Your geometric mean will likely be the lowest value on your dashboard, but we didn’t just add this to make you feel better about your site speed. Our goal is always to give you more transparency and a more holistic view into your website performance.
by Josh Fraser on July 31, 2012
We’ve been growing rapidly since launching Torbit Insight at the end of April. We are already processing billions of pageviews every month and are currently processing about 15,000 metrics every second. If you’re curious, we’ve added a real time counter to the bottom of our homepage that shows the live number.
We know these numbers can get a bit mind boggling and they’re not showing signs of slowing down anytime soon. As part of our growth plan, our engineers have completely rebuilt the backend of Torbit Insight. Our new “big data” store will allow us to continue our rapid growth while also making it easier to add new features at scale.
As some of you noticed, our old backend was starting to struggle a bit under the load. I apologize to those of you who reported missing data or other weird issues in the last couple weeks. Thank you for bearing with us. The new backend should bring a lot more stability and reliability going forward. Otherwise, your experience should be largely unaffected. We’ve kept most of the features the same and your data has already been imported into the new system. The conversion tab will look a little different for now as it’s being revamped to work with our new collection system. If you see anything else unusual, please let us know.
We’ve been gathering feature requests from our customers for a while. With this launch, our team will be able to focus again on rolling out the features you’ve been waiting for. If you have other suggestions you want us to consider, feel free to send them directly to me at firstname.lastname@example.org. We love having customers engaged early in the development process as it helps keep us on track.
A huge thanks to our team and especially Jon and Mike on this important accomplishment.
by Josh Fraser on July 23, 2012
Jonathan Klein from Wayfair wrote a post a few weeks ago about using WebPageTest to measure the performance of their CDN. The results were surprising. Wayfair found that their CDN was delivering minimal performance gains. As you would expect, the post generated a lot of lively discussion with lots of ideas about different variables that could be affecting the outcome of the test. Several people (myself included) recommended they use Real User Measurement to see how much of an improvement their actual visitors are experiencing.
Last week, Klein posted the results from the Real User Measurement test. After using the tagging feature of our Insight product to run an A/B test on their production site, the results told much the same story as the synthetic test. Wayfair saw no major performance improvement due to the use of a CDN.
The results of these two tests are quite surprising. A CDN is a well known tool that will improve the performance of most websites. You can’t change the speed of light, but you can make sure your content is delivered from servers closer to your visitors. Although disappointed with the results, Klein was careful to point out other benefits of using a CDN. Klein said the ability to offload origin bandwidth and tolerate traffic spikes was enough to justify the cost of their CDN.
Performance guru Steve Souders took a look at the results and reminded people in the comments that:
There are numerous performance best practices. Not all of them apply to every site. But that doesn’t mean the best practice is bad – it just might not be relevant at that time for that particular site.
Souders was able to trace the problem back to several large images that were being loaded from their CDN but appeared to be taking far longer than expected to load. I’m confident Wayfair will be able to take this data to their CDN and get this particular issue resolved. Many sites like Wayfair are spending thousands of dollars on their CDN, but have never taken the time to really evaluate what sort of performance gains they are receiving for their money. I love that Jonathan was willing to set up this test and share the results with the world. It’s a great example of how you can use Real User Measurement to hold your vendors accountable for the performance gains they promise.
Using Torbit Insight, Wayfair was able to set up this test and get meaningful data back in a very short period of time. It’s a great example of how easy it is to use our tagging feature to do a performance-related A/B test. If you haven’t already, be sure to read Jonathan Klein’s full post for all the details on how he conducted the experiment. For anyone else interested in conducting a similar test, send us a note; we’d love to help.
by Josh Fraser on July 12, 2012
In the last few months since we launched Torbit Insight, hundreds of top retailers and large media properties have adopted Real User Measurement on their sites. In fact, we’ve measured over 3 billion page views for retailers like Wayfair, CafePress and Build.com. As we’ve had the privilege of working with some of the largest sites around, we’ve noticed an ongoing trend. Our customers are starting to depend on the Real User Measurement (RUM) data we give them as their primary source for monitoring their website performance.
Performance measurement has traditionally been done using synthetic testing (sometimes also referred to as active monitoring). Synthetic testing is when you load a website on a regular interval from one or more locations around the world to see how fast it loads. This data is then used to generate reports or trigger alarms when there are performance issues with your site. While synthetic testing is certainly useful, hundreds of top sites are turning to Real User Measurement as a source of more accurate data.
While synthetic testing is valuable for deep analysis and debugging, it has a few shortcomings. With synthetic testing, you only get visibility into the specific pages that you test. This is typically a small fraction of the pages your customers actually visit, leaving you with large sections of your site without monitoring. Synthetic testing gets expensive, especially if you try to increase your coverage to more pages across your site. You’re also putting more stress on your servers, taking valuable capacity away from your actual visitors. Of course, the main problem with synthetic testing is that it makes so many assumptions about your visitors. There are dozens of factors that affect the speed at which someone is able to access your site. Where are they geographically located? What is their connection speed? Which browser are they using? Are they visiting for the first time, or are they a repeat visitor? All of these variables affect the loading experience for your visitors. If you want to know what your visitors are actually experiencing, you have to use Real User Measurement.
It’s impossible to test every variation of location, network connection speed, OS, browser & add-on. That’s not to say synthetic testing is bad. There’s a place for both.
There are a few key factors that are accelerating the adoption of Real User Measurement. All of the major browsers now support the web timing spec. This allows us to collect highly accurate timing data from the browser itself, starting even before the page is loaded, so we can time things like DNS lookups and the TCP handshake. One of the challenges of implementing RUM in the past has been simply the massive amount of data it generates. With the explosion of “big data” tools, it’s now feasible to collect billions of samples and make sense of them. Thankfully, you don’t have to build it yourself; we offer a great Real User Measurement tool at Torbit, and we even made it free for people to get started.
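As a rough sketch of the kind of data the timing spec exposes: the browser records an epoch-millisecond timestamp for each navigation event, and a RUM beacon derives phases by subtracting them. The field names below follow the W3C Navigation Timing spec (`window.performance.timing`); the sample values are invented.

```javascript
// Given a performance.timing-shaped object (epoch milliseconds per event),
// derive the phases a RUM beacon might report.
function derivePhases(t) {
  return {
    dns: t.domainLookupEnd - t.domainLookupStart, // DNS lookup
    tcp: t.connectEnd - t.connectStart,           // TCP handshake
    ttfb: t.responseStart - t.navigationStart,    // time to first byte
    pageLoad: t.loadEventEnd - t.navigationStart, // full page load
  };
}

// Invented sample, in the shape of window.performance.timing:
const timing = {
  navigationStart:   1343000000000,
  domainLookupStart: 1343000000005,
  domainLookupEnd:   1343000000030,
  connectStart:      1343000000030,
  connectEnd:        1343000000075,
  responseStart:     1343000000260,
  loadEventEnd:      1343000001900,
};

console.log(derivePhases(timing));
// { dns: 25, tcp: 45, ttfb: 260, pageLoad: 1900 }
```

In a real browser you would read `window.performance.timing` directly and beacon the derived phases back to a collector, which is essentially what a RUM tag does on every pageview.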
Every visitor matters. If your site is slow, chances are you are leaving visitors and revenue on the table. The first step in making your site faster is making sure you have an accurate way to measure your speed.
by Josh Fraser on June 25, 2012
The Velocity Conference is always a fun event for us and I doubt this year will be an exception. It’s always a great time to catch up with our friends, customers and lots of other smart people who care about performance on the web.
This year we are co-sponsoring a party with our friends at Dyn. If you’re attending, I hope you’ll stop by the Dyn Music + Tech party on Tuesday night. We’ll be handing out free Torbit shirts and other swag. Come have some free food and drinks on us! Hope to see you there!
by Josh Fraser on June 6, 2012
We’re only 19 days away from the Velocity 2012 conference, an annual gathering of the brightest minds and leaders in the web performance industry. The conference is taking place in Santa Clara, CA on June 25-27. Each year Velocity attracts the best known names in the industry and lots of geeks who care about making things fast. The Torbit team will be there and we’re looking forward to catching up with many of our customers and friends in the performance space. If you’re planning to be there, let us know and we’ll be sure to meet up. If you haven’t gotten your tickets yet, it’s not too late to register.
I’m also incredibly excited about the first annual WebPerfDays, a single day “unconference” that is taking place on June 28th (the day after Velocity ends) at the Google campus in Mountain View. Inspired by the ever popular DevOpsDays, WebPerfDays promises to be an event you won’t want to miss. Tickets are already sold out, but I’m told that more will be opening up soon so it’s worth making sure your name is on the waiting list.
Steve Souders and Aaron Kulick have both given an amazing gift to our community with their work planning these events (along with their countless other contributions!). The anticipation is already building here at Torbit. If you’re someone who cares enough about performance to be reading this blog, you probably don’t want to miss out on either one of these events.
by Josh Fraser on May 24, 2012
I’ve been a long-time fan and reader of High Scalability, a blog about building bigger, faster, more reliable websites. Yesterday, I was honored to get the chance to write a guest post for them titled, Averages, Web Performance Data, And How Your Analytics Product Is Lying To You. I wrote about the importance of looking beyond averages when looking at your performance data.
In the article, I also talked about the importance of using Real User Measurement (RUM) to get an accurate picture of what’s happening on your site. Not only do you have to make sure you are looking at the right metrics, you have to make sure your methodology for collecting your data is correct too.
There’s a big shift happening on the web right now in the way that people measure their websites. People are searching for the truth. People want to look at the real performance numbers from their actual users. Misleading metrics and synthetic tests won’t cut it anymore. You can check out the full article here.