Posted by Pingdom…
We know that webpages on average are getting bigger, and that Adobe Flash is slowly fading away. But even if we tell you that the average web page has grown from 702 kB in 2010 to 1,090 kB today, wouldn’t it be even better to visualize it?
Of course it would! So, with the help of data from the HTTP Archive, that’s exactly what we’ve done.
Web pages are getting bigger
First, let’s look at how the individual components of web pages tracked by the HTTP Archive have developed since November 2010. We can see that images have taken off quite dramatically in size, as have scripts. There appears to be a noticeable decline in the size of Flash files. When it comes to HTML, stylesheets and the rest, there doesn’t seem to have been much change.
If we instead aggregate the data, stacking each type of content on top of the others, we get a much better sense of how much bigger the average web page is today.
But relative size has changed very little
If everything except Flash is growing, what is growing the most? Measured in kilobytes, as you can see above, it’s clear that images are getting larger in combined size, as are scripts.
But what if we instead look at what percentage of the average web page is HTML, stylesheets, images, and so on?
In this view, where 100% represents the total size of a web page, we can see that scripts and images increase some in relative size, and that stylesheets, as well as HTML, have lost out a bit in relative terms. The biggest change, however, is Flash, which has seen its share of the typical web page almost cut in half.
For each type of content, here is the change from November 2010 to July 2012:
- Images: from 59% to 63%
- Scripts: from 16% to 19%
- Stylesheets: from 4% to 3%
- Flash: from 13% to 7% (in absolute terms, Flash has “only” been reduced from 90 kB to 81 kB)
- Other: unchanged at 3%
- HTML: from 5% to 4%
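The relative shares above are simply each content type’s byte count divided by the total page weight. As a minimal sketch, here is that calculation with illustrative kilobyte figures chosen to sum to the article’s 1,090 kB total — they are stand-ins, not the HTTP Archive’s exact per-type numbers:

```python
# Sketch: each content type's share of total page weight.
# The kB values below are illustrative stand-ins, not HTTP Archive data.
sizes_kb = {
    "images": 697,
    "scripts": 207,
    "stylesheets": 33,
    "flash": 76,
    "html": 44,
    "other": 33,
}

total = sum(sizes_kb.values())
shares = {kind: round(100 * kb / total) for kind, kb in sizes_kb.items()}

print(total)   # total page weight in kB (1090 here)
print(shares)  # approximate percentage share per content type
```

With numbers like these, the shares come out close to the July 2012 breakdown listed above (scripts 19%, Flash 7%, stylesheets 3%, and so on).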
What will it look like in another two years?
From this quick comparison, it’s clear that images and scripts take up, relatively speaking, a larger portion of the complete size of websites today, and Flash is slowly disappearing. And when you put Flash against the seemingly ever-increasing size of web pages, Flash is losing out even more.
We will certainly keep an eye on how this develops. If we do a follow-up study in 2014, what do you think will be the main changes? Surely Flash will still be around, but how small will its share have become?
UK broadband speeds drop by an average of 35% from their off-peak highs when most people are online in the evening, according to a report.
The research, conducted by the comparison site Uswitch, was based on two million broadband speed tests.
The peak surfing hours, between 7pm and 9pm, were the slowest time to be online, the report said.
There were also huge regional variations between evening and early morning surfing times.
The report suggested the best time to be online was between 2am and 3am.
Users in Evesham, Worcestershire, fared worst, according to the survey, with a massive 69% drop-off between off-peak morning and evening surfing.
Those living in Weston-super-Mare did little better with speeds falling from an off-peak average of 9.5Mbps (megabits per second) to 3.4Mbps in the evening – a 64% drop.
The difference was often most noticeable in rural areas where even peak speeds were relatively slow. In Wadebridge, in Cornwall, speeds nearly halved from 4.1Mbps at off-peak times to 2.1Mbps at peak times.
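The percentage drops quoted in the report are straightforward peak-versus-off-peak ratios; a quick sketch of the arithmetic, using the figures from the article:

```python
def speed_drop_pct(off_peak_mbps: float, peak_mbps: float) -> float:
    """Percentage fall from the off-peak speed to the evening peak speed."""
    return 100 * (off_peak_mbps - peak_mbps) / off_peak_mbps

# Weston-super-Mare: 9.5 Mbps off-peak down to 3.4 Mbps in the evening
print(round(speed_drop_pct(9.5, 3.4)))  # 64

# Wadebridge: 4.1 Mbps down to 2.1 Mbps -- "nearly halved"
print(round(speed_drop_pct(4.1, 2.1)))  # 49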
“It really is surprising just how much broadband speeds fluctuate at different times of the day, with drop-offs of almost 70% in some areas of the UK,” said Uswitch’s technology expert Ernest Doku.
“Not many internet users enjoy the maximum headline broadband speeds offered by providers, and certainly not during the working week,” he added.
New rules
Broadband speed is becoming more important as bandwidth-hungry services such as on-demand TV become more popular.
Telecoms regulator Ofcom recently revealed that British households download an average of 17 gigabytes of data every month over their home broadband connections.
That monthly data diet is equivalent to streaming 11 movies or 12 hours of BBC programmes via iPlayer.
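The per-item figures implied by that equivalence are easy to back out; this back-of-envelope sketch derives them from the article’s own numbers (they are implied values, not Ofcom data):

```python
# Back-of-envelope check of the 17 GB/month equivalences quoted above.
# The per-item figures are implied by the article, not published by Ofcom.
monthly_gb = 17

gb_per_movie = monthly_gb / 11         # "11 movies" per month
gb_per_iplayer_hour = monthly_gb / 12  # "12 hours of BBC programmes"

print(round(gb_per_movie, 2))          # 1.55 GB per streamed movie
print(round(gb_per_iplayer_hour, 2))   # 1.42 GB per iPlayer hour
```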
Critics say consumers are being misled by internet service providers who continue to advertise their maximum broadband speeds, even though many users do not get them.
New rules from the Committee of Advertising Practice (CAP) say that from April next year providers will no longer be able to advertise maximum speeds for net packages unless 10% of customers receive them.
Almost half of broadband users are now on packages with advertised speeds above 10Mbps, but the average broadband speed is 6.8Mbps, according to Ofcom.
If you want the fastest smartphone Web browser, get an Android phone instead of an iPhone.
But if you want the fastest Web browser on a smartphone, definitely get an iPhone instead of an Android phone.
These contradictory statements are the result of two smartphone browser speed tests, released within a month of each other. The latest comes from Blaze Software, which claims that Samsung’s Nexus S smartphone with Android 2.3 loaded non-mobile optimized Web pages more than a second faster on average than an iPhone 4 running iOS 4.3. The speed difference for mobile Websites was negligible. An earlier study from Compuware Gomez, however, says the iPhone loads pages nearly 17 seconds faster on average than Android.
Who are these companies that came up with drastically different test results? They’re both in the business of optimizing Websites, selling their services to companies that want their pages to load faster. In other words, they both do a lot of Website speed testing, and neither of them is affiliated with Apple or Google.
As for methodology, Blaze gathered data from 45,000 downloads from Fortune 1000 Websites over 3G and Wi-Fi connections, loading each page three times, according to Wired. Gomez used data from its own business customers, analyzing 282 million Web pages served across 200 popular Websites. In other words, both companies used real-world testing instead of benchmarks or closed networks.
And yet, their results are completely different.
Mobile browser speed study from Compuware Gomez
Perhaps the big takeaway here is one that we’ve known for some time: browser speed tests can be pretty unreliable. With real-world testing, there are too many variables, such as network congestion and server problems. Closed networks and benchmarks, on the other hand, aren’t really representative of what real users will experience.
In any case, if you’re complaining that your super-futuristic smartphone renders pages a second or two slower than the competition, you may want to step back, take a walk and rethink your priorities.
Latest blog post from the Microsoft IE9 team…
Five Internet Explorer 9 Performance Objectives:
- Display Time: Perform user actions faster than any modern browser
- Elapsed Time: Execute Web site code faster than any modern browser
- CPU Time: Effectively scale computation better than any modern browser
- Resource Utilization: Require less overall system resources than any modern browser
- Power Consumption: Require less power than any modern browser
Website Response Times
Slow page rendering today is typically caused by server delays or overly fancy page widgets, not by big images. Users still hate slow sites and don’t hesitate to tell us.
Users really care about speed in interaction design. 13 years ago, I wrote a column called “The Need for Speed,” pointing out how much users hated slow-loading Web pages. Back then, big images were the main cause of response-time delays, and our guideline recommended that you keep images small.
Today, most people have broadband, so you might think that download times are no longer a usability concern. And yes, actual image download is rarely an issue for today’s wireline users (though images can still cause delays on mobile devices).
Still, response times are as relevant as ever. That’s because responsiveness is a basic user interface design rule that’s dictated by human needs, not by individual technologies. In a client usability study we just completed, for example, users complained that “it’s being a little slow.”
Responsiveness matters for two reasons:
- Human limitations, especially in the areas of memory and attention (as further discussed in our seminar on The Human Mind and Usability). We simply don’t perform as well if we have to wait and suffer the inevitable decay of information stored in short-term memory.
- Human aspirations. We like to feel in control of our destiny rather than subjugated to a computer’s whims. Also, when companies make us wait instead of providing responsive service, they seem either arrogant or incompetent.
A snappy user experience beats a glamorous one, for the simple reason that people engage more with a site when they can move freely and focus on the content instead of on their endless wait.
In a recent study for our seminar on Brand as Experience, we asked users what they thought about various websites they had used in the past. So, their responses were based not on immediate use (as in normal usability studies), but on whatever past experiences were strong enough to form memories. Under these conditions, it was striking to hear users complain about the slowness of certain sites. Slowness (or speed) makes such an impact that it can become one of the brand values customers associate with a site. (Obviously, “sluggish” is not a brand value that any marketing VP would actively aim for, but the actual experience of using a site is more important than slogans or advertising in forming customer impressions of a brand.)
Indeed, we get findings related to website speed almost every time we run a study. When sites shave as little as 0.1 seconds off response time, the outcome is a juicy lift in conversion rates. Today or the 1990s? Same effect.
The 3 response-time limits are the same today as when I wrote about them in 1993 (based on 40-year-old research by human factors pioneers):
- 0.1 seconds gives the feeling of instantaneous response — that is, the outcome feels like it was caused by the user, not the computer. This level of responsiveness is essential to support the feeling of direct manipulation (direct manipulation is one of the key GUI techniques to increase user engagement and control — for more about it, see our Principles of Interface Design seminar).
- 1 second keeps the user’s flow of thought seamless. Users can sense a delay, and thus know the computer is generating the outcome, but they still feel in control of the overall experience and that they’re moving freely rather than waiting on the computer. This degree of responsiveness is needed for good navigation.
- 10 seconds keeps the user’s attention. From 1–10 seconds, users definitely feel at the mercy of the computer and wish it was faster, but they can handle it. After 10 seconds, they start thinking about other things, making it harder to get their brains back on track once the computer finally does respond.
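The three limits above amount to a simple classification of delays; as a sketch, here is how they map onto perceptual bands (the band labels are my shorthand, not terms from the original column):

```python
def perceived_responsiveness(delay_seconds: float) -> str:
    """Map a delay onto the three classic response-time limits."""
    if delay_seconds <= 0.1:
        return "instantaneous"       # feels caused by the user, not the computer
    if delay_seconds <= 1.0:
        return "uninterrupted flow"  # delay noticed, but control retained
    if delay_seconds <= 10.0:
        return "attention kept"      # user waits, but stays on task
    return "attention lost"          # user starts thinking about other things

print(perceived_responsiveness(0.05))  # instantaneous
print(perceived_responsiveness(3.0))   # attention kept
print(perceived_responsiveness(12.0))  # attention lost
```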
A 10-second delay will often make users leave a site immediately. And even if they stay, it’s harder for them to understand what’s going on, making it less likely that they’ll succeed in any difficult tasks.
Even a few seconds’ delay is enough to create an unpleasant user experience. Users are no longer in control, and they’re consciously annoyed by having to wait for the computer. Thus, with repeated short delays, users will give up unless they’re extremely committed to completing the task. The result? You can easily lose half your sales (to those less-committed customers) simply because your site is a few seconds too slow for each page.
Fancy Widgets, Sluggish Response
Instead of big images, today’s big response-time sinners are typically overly complex data processing on the server or overly fancy widgets on the page (or too many fancy widgets).
Here’s an example from a recent eyetracking study we conducted to generate new material for our seminar on Fundamental Guidelines for Web Usability. The following gaze plots show two different users’ behavior on the same page, which contained a slideshow widget in the top yellow box that required 8 seconds to download:
Gaze plots from two different users: the blue dots indicate where users looked (one fixation per dot).
The test participant in the top gaze plot fixated a few times within the big empty color block before the content downloaded, then spent the remaining time looking at the rest of the page. This user never looked at the big promotional space after it had rendered.
The second user (bottom gaze plot) happened to be looking away from the screen during the 8 seconds when the promotional content downloaded. Thus, the first time he looked at the page he saw it as intended, complete with the entire promo.
The slideshow occupies 23% of the page, not counting a footer that’s not shown here. The user who had to endure the download delay spent only 1% of her total viewing time within this space. In contrast, the user who in effect received instantaneous page rendering (because he didn’t look until it was done), spent 20% of his viewing time within the slideshow area.
Although 8 seconds might not seem like a big delay, it’s enough to kill this big promo that the company’s Web team probably spent weeks designing. If they had allocated the space to something that rendered in 1 second instead of 8, they would have achieved much better results.
Different Causes, Same Effect
Response times are a matter of user experience: How much time does it take before the computer is ready to serve the user? The reasons behind delays don’t matter to users. All they know is that they’re getting poor service, which is annoying.
Big images in 1997. Slow servers or overly fancy widgets in 2010. Same effect. Make it snappy, and you’ll have a big leg up on the competition and their slow sites.