Uswitch UK Report reveals drop between peak and off-peak surfing

From the BBC on 16 November 2011.

UK broadband speeds drop by an average of 35% from their off-peak highs when most people are online in the evening, according to a report.

The research, conducted by the comparison site Uswitch, was based on two million broadband speed tests.

The peak surfing period between 7pm and 9pm was the slowest time to be online, the report said.

There were also huge regional variations in the gap between evening and early-morning speeds.

The report suggested the best time to be online was between 2am and 3am.

Users in Evesham, Worcestershire, fared worst, according to the survey, with a massive 69% drop-off between off-peak morning and evening surfing.

Those living in Weston-super-Mare did little better with speeds falling from an off-peak average of 9.5Mbps (megabits per second) to 3.4Mbps in the evening – a 64% drop.

The difference was often most noticeable in rural areas where even peak speeds were relatively slow. In Wadebridge, in Cornwall, speeds nearly halved from 4.1Mbps at off-peak times to 2.1Mbps at peak times.

“It really is surprising just how much broadband speeds fluctuate at different times of the day, with drop-offs of almost 70% in some areas of the UK,” said Uswitch’s technology expert Ernest Doku.

“Not many internet users enjoy the maximum headline broadband speeds offered by providers, and certainly not during the working week,” he added.

New rules

Broadband speed is becoming more important as bandwidth-hungry services such as on-demand TV become more popular.

Telecoms regulator Ofcom recently revealed that British households download an average of 17 gigabytes of data every month over their home broadband connections.

That monthly data diet is equivalent to streaming 11 movies or 12 hours of BBC programmes via iPlayer.

Critics say consumers are being misled by internet service providers who continue to advertise their maximum broadband speeds, even though many users do not get them.

New rules from the Committee of Advertising Practice (CAP) say that from April next year providers will no longer be able to advertise maximum speeds for net packages unless 10% of customers receive them.

Almost half of broadband users are now on packages with advertised speeds above 10Mbps, but the average broadband speed is 6.8Mbps, according to Ofcom.


Site Speed added to Google Analytics

According to searchenginejournal.com, Google has added a site speed tool to its Analytics platform to improve the quality of its service to users.

Google prides itself on speed and is reported to be obsessed with providing customers with instant search. Its love of speed isn’t a ‘suit yourself if you don’t like it’ situation either: page speed will impact both Google AdWords landing-page quality and actual Google SERP rankings, so the company is trying to create a faster experience for all users.

The site speed tracker collects all the information related to page load speed and gives you an indication of which pages are loading the quickest, which browsers are struggling to respond and which visitor demographics are experiencing difficulty.

To make use of this tool, you have to enable the new version of Google Analytics, switch on the time-tracking features and install the new Google Analytics tracking code provided.
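
As a rough illustration (not taken from the searchenginejournal.com piece), the 2011-era setup amounted to the standard asynchronous ga.js snippet plus the Site Speed opt-in call; the property ID below is a placeholder:

    // Rough sketch of the classic asynchronous ga.js snippet with the Site Speed
    // opt-in. 'UA-XXXXX-X' is a placeholder for your own Analytics property ID.
    const _gaq: any[][] = ((window as any)._gaq = (window as any)._gaq || []);

    _gaq.push(['_setAccount', 'UA-XXXXX-X']);
    _gaq.push(['_trackPageview']);
    _gaq.push(['_trackPageLoadTime']); // switches on the Site Speed load-time sampling

    // Standard asynchronous loader for ga.js
    (() => {
      const ga = document.createElement('script');
      ga.async = true;
      ga.src =
        (document.location.protocol === 'https:' ? 'https://ssl' : 'http://www') +
        '.google-analytics.com/ga.js';
      const s = document.getElementsByTagName('script')[0];
      s.parentNode?.insertBefore(ga, s);
    })();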


Why Websites Are Slow & Why Speed Really Matters

Published by Mashable 6th April 2011

What a difference a millisecond can make. When it comes to browsing the web, every tiny moment counts — and the fewer moments that pass between a mouse click and a fully loaded page, the better.

Speed is a bit of an obsession for most web users. We fret over our Internet connections’ and mobile connections’ perceived slowness, and we go bananas for a faster web browser.

Given this better-faster mentality, the consequences for slow-loading pages can be dire for site owners; most users are willing to navigate away after waiting just three seconds, for example. And quite a few of these dissatisfied users will tell others about the experience.

What’s more, our entire perception of how fast or slow a page loads is a bit skewed. While we’re waiting for a site to materialize in a browser tab, pages seem to load about 15% slower than they actually do. The perception gap increases to 35% once we’re away from a computer.

But for the precious milliseconds site owners can shave off page load times, they can see huge returns. For example, Amazon.com increased its revenue by 1% for every 100 milliseconds of load time improvement. And Aol said its users in the top 10% of site speed viewed around 50% more pages than visitors in the bottom 10%.

Site optimization firm Strangeloop has provided us with a slew of graphically organized stats on just how long pages take to load, why they take as long as they do, and just how long the average Joe or Jane is willing to wait around for your site.

Check out the infographic below.

And site owners, if you’re worried about speed, do a quick pulse-check with Google’s free and easy page speed tool, Page Speed Online.

Strangeloop's Site Speed Infographic
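
If you would rather script that pulse-check than use the web form, a hypothetical sketch along the following lines should work, assuming the Page Speed Online REST endpoint and response field shown here (the API key and target URL are placeholders):

    // Hypothetical sketch: ask the Page Speed Online REST API for a URL's score.
    // The endpoint path and the 'score' field are assumptions based on the v1 API;
    // YOUR_API_KEY and the target URL are placeholders.
    const endpoint =
      'https://www.googleapis.com/pagespeedonline/v1/runPagespeed' +
      '?url=' + encodeURIComponent('http://www.example.com/') +
      '&key=YOUR_API_KEY';

    fetch(endpoint)
      .then((res) => res.json())
      .then((report) => {
        console.log('Page Speed score:', report.score); // 0-100, higher is better
      })
      .catch((err) => console.error('Page Speed request failed:', err));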


Jakob Nielsen’s update on website response time and why speed really matters

Website Response Times

Summary:
Slow page rendering today is typically caused by server delays or overly fancy page widgets, not by big images. Users still hate slow sites and don’t hesitate to tell us.

Users really care about speed in interaction design. Thirteen years ago, I wrote a column called “The Need for Speed,” pointing out how much users hated slow-loading Web pages. Back then, big images were the main cause of response-time delays, and our guideline recommended that you keep images small.

Today, most people have broadband, so you might think that download times are no longer a usability concern. And yes, actual image download is rarely an issue for today’s wireline users (though images can still cause delays on mobile devices).

Still, response times are as relevant as ever. That’s because responsiveness is a basic user interface design rule that’s dictated by human needs, not by individual technologies. In a client usability study we just completed, for example, users complained that “it’s being a little slow.”

Speed Matters

Responsiveness matters for two reasons:

  • Human limitations, especially in the areas of memory and attention (as further discussed in our seminar on The Human Mind and Usability). We simply don’t perform as well if we have to wait and suffer the inevitable decay of information stored in short-term memory.
  • Human aspirations. We like to feel in control of our destiny rather than subjugated to a computer’s whims. Also, when companies make us wait instead of providing responsive service, they seem either arrogant or incompetent.

A snappy user experience beats a glamorous one, for the simple reason that people engage more with a site when they can move freely and focus on the content instead of on their endless wait.

In a recent study for our seminar on Brand as Experience, we asked users what they thought about various websites they had used in the past. So, their responses were based not on immediate use (as in normal usability studies), but on whatever past experiences were strong enough to form memories. Under these conditions, it was striking to hear users complain about the slowness of certain sites. Slowness (or speed) makes such an impact that it can become one of the brand values customers associate with a site. (Obviously, “sluggish” is not a brand value that any marketing VP would actively aim for, but the actual experience of using a site is more important than slogans or advertising in forming customer impressions of a brand.)

Indeed, we get findings related to website speed almost every time we run a study. When sites shave as little as 0.1 seconds off response time, the outcome is a juicy lift in conversion rates. Today or the 1990s? Same effect.

Response-Time Limits

The 3 response-time limits are the same today as when I wrote about them in 1993 (based on 40-year-old research by human factors pioneers):

  • 0.1 seconds gives the feeling of instantaneous response — that is, the outcome feels like it was caused by the user, not the computer. This level of responsiveness is essential to support the feeling of direct manipulation (direct manipulation is one of the key GUI techniques to increase user engagement and control — for more about it, see our Principles of Interface Design seminar).
  • 1 second keeps the user’s flow of thought seamless. Users can sense a delay, and thus know the computer is generating the outcome, but they still feel in control of the overall experience and that they’re moving freely rather than waiting on the computer. This degree of responsiveness is needed for good navigation.
  • 10 seconds keeps the user’s attention. From 1–10 seconds, users definitely feel at the mercy of the computer and wish it was faster, but they can handle it. After 10 seconds, they start thinking about other things, making it harder to get their brains back on track once the computer finally does respond.

A 10-second delay will often make users leave a site immediately. And even if they stay, it’s harder for them to understand what’s going on, making it less likely that they’ll succeed in any difficult tasks.
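
As a rough, illustrative sketch (not part of Nielsen’s article), these three limits can be checked against a live page using the browser’s Navigation Timing API:

    // Rough sketch: measure how long the page took to load and bucket the result
    // against the 0.1 s / 1 s / 10 s response-time limits described above.
    window.addEventListener('load', () => {
      const t = performance.timing;
      const loadMs = t.loadEventStart - t.navigationStart;

      let verdict: string;
      if (loadMs <= 100) {
        verdict = 'feels instantaneous';
      } else if (loadMs <= 1000) {
        verdict = 'noticeable delay, but the flow of thought is preserved';
      } else if (loadMs <= 10000) {
        verdict = 'user feels at the mercy of the computer';
      } else {
        verdict = 'attention lost; the user may well have left';
      }

      console.log(`Page loaded in ${loadMs} ms: ${verdict}`);
    });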

Even a few seconds’ delay is enough to create an unpleasant user experience. Users are no longer in control, and they’re consciously annoyed by having to wait for the computer. Thus, with repeated short delays, users will give up unless they’re extremely committed to completing the task. The result? You can easily lose half your sales (to those less-committed customers) simply because your site is a few seconds too slow for each page.

Fancy Widgets, Sluggish Response

Instead of big images, today’s big response-time sinners are typically overly complex data processing on the server or overly fancy widgets on the page (or too many fancy widgets).

Here’s an example from a recent eyetracking study we conducted to generate new material for our seminar on Fundamental Guidelines for Web Usability. The following gaze plots show two different users’ behavior on the same page, which contained a slideshow widget in the top yellow box that required 8 seconds to download:

Gaze plots from two different users viewing the same page: the blue dots indicate where each user looked (one fixation per dot).

The test participant in the top gaze plot fixated a few times within the big empty color block before the content downloaded, then spent the remaining time looking at the rest of the page. This user never looked at the big promotional space after it had rendered.

The second user (bottom gaze plot) happened to be looking away from the screen during the 8 seconds when the promotional content downloaded. Thus, the first time he looked at the page he saw it as intended, complete with the entire promo.

The slideshow occupies 23% of the page, not counting a footer that’s not shown here. The user who had to endure the download delay spent only 1% of her total viewing time within this space. In contrast, the user who in effect received instantaneous page rendering (because he didn’t look until it was done), spent 20% of his viewing time within the slideshow area.

Although 8 seconds might not seem like a big delay, it’s enough to kill this big promo that the company’s Web team probably spent weeks designing. If they had allocated the space to something that rendered in 1 second instead of 8, they would have achieved much better results.

Different Causes, Same Effect

Response times are a matter of user experience: How much time does it take before the computer is ready to serve the user? The reasons behind delays don’t matter to users. All they know is that they’re getting poor service, which is annoying.

Big images in 1997. Slow servers or overly fancy widgets in 2010. Same effect. Make it snappy, and you’ll have a big leg up on the competition and their slow sites.


Inside Aptimize’s Website Acceleration Products

From Web Host Industry Review….

Web content is growing at an enormous rate, making the fast delivery of that content more of a competitive differentiator for companies than just a core competency.

Website performance is a critical component of a business’s success, especially considering that Google recently announced that speed now counts as a major factor in its search-ranking algorithms.

Based on that incentive, New Zealand-based startup Aptimize (www.aptimize.com) is offering companies a range of website acceleration products that can significantly improve the load times of their websites.

“The web is so much more competitive than it was 15 years ago, now that people make richer and richer websites,” says Ed Robinson, co-founder and CEO of Aptimize. “Now the problem is not broadband connectivity or scalability of the servers, but it’s about how to get these massive desktop websites from the servers to the browsers.”

The company recently began offering its flagship product WAX, and its new SharePoint Accelerator, which is designed to improve the performance of Microsoft SharePoint installations, in the US and across the globe.

The products can increase a website’s performance by between 200 and 400 percent without any code changes or additional hardware, says Robinson.

Currently, the company’s largest customer is Microsoft, which used its products to boost the speed of the Microsoft SharePoint website.

Aptimize says the software decreased the website’s domestic load times by 43 percent, bringing them down to just 5.9 seconds. Meanwhile, the website’s international load times were reduced by 54 percent, to just 6.1 seconds.

Robinson says that about 50 percent of the company’s customers run websites built on SharePoint, which eventually led the company to begin offering a SharePoint-specific product.

“We’ve taken a very different approach. [Other solutions] put more servers in place, or more load balancing, or great fiber networking, or put a content delivery network in place,” says Robinson. “Traditionally, when data centers need to speed up websites, they’ll throw more hardware at the problem. Our product installs on the web servers themselves and increases the efficiency of the websites.”

The products have two kinds of pricing. Small-business pricing is calculated server by server, at $3,000 per server.

Enterprise pricing, on the other hand, is calculated at an application level and is priced at $18,000 per application.

And while the company is not without its share of competition, with companies like Aragon Networks, FastWeb and StrangeLoops all providing similar products, Robinson says Aptimize is one of the first in the industry to offer this kind of front-end optimization product.

Aptimize does not have products for websites that operate in the cloud, but Robinson says the company is in early talks with cloud firms to provide support for cloud-based startups.

“We see a huge opportunity for cloud computing,” says Robinson. “We’ve started talking to some hosts about being able to price for that market. We’re investing in creating a specialized data center version of our product that you can put inside a cloud computing or hosting provider, and just plug it in and it will accelerate every site passing through.”


Latest Forrester research released: The Impact of Poor Web Site Performance in Financial Services

Akamai Technologies commissioned Forrester Consulting in the autumn of 2009 to examine the state of the Web in financial services and the impact of Web site performance on both the servicing and the sale of financial products.

Based on this study of 621 US consumers who bank or trade online, Forrester reported the following key findings:

  • The Web channel is vital to an ever-growing percentage of users. The importance of the Web channel for financial services has been growing for years. As of 2009, 57% of online US adults bank online, and 36% of online US adults who have an investment account invest online. Even more important, 29% of online bankers and 27% of online brokers access their accounts on a daily basis, a segment Forrester calls Power Users. These frequent users are less costly for banks to service and drive more direct revenue for brokerage firms.
  • Web site users have high expectations for Web site availability and page load speed. Web users’ expectations grow as the importance of the Web in their lives grows. Seventy-five percent of online bankers and brokers expect 99% Web site availability or higher, and 56% of online bankers and brokers expect Web pages to load in 2 seconds or less, far above the 47% of consumers who are just shopping online.
  • Poor Web site performance leads to dissatisfaction more often than any other factor. Sixty-four percent of online US bankers and brokers have had a dissatisfying experience when servicing their accounts, and Web performance is far and away the biggest reason for this dissatisfaction. Fifty-four percent of online bankers and 33% of online brokers who have had a dissatisfying experience said that a Web site that crashed, froze, or was unavailable was the primary source of their dissatisfaction.
  • The consequences for financial services firms with underperforming sites are higher channel costs, lost sales, and a decrease in user willingness to recommend. Fifty percent of online brokers conducting a transaction would use the phone or a branch if they could not complete the transaction online, and 29% of online US adults researching a financial product would go to a competitor’s site if they could not efficiently view content on a financial services Web site. The ultimate effect of poor performance is a decrease in willingness to recommend a firm, with 48% of online bankers and brokers saying that poor performance had an impact or significant impact on their likelihood of recommending a firm’s services to a friend or family member.

You can sign up via Akamai to read the full report here.