APM as a Service: 4 steps to monitor real user experience in production

From the Compuware APM Blog, 15 May 2013

With our new service platform and the convergence of dynaTrace PurePath Technology with the Gomez Performance Network, we are proud to offer an APMaaS solution that sets a higher bar for complete user experience management, with end-to-end monitoring technologies that include real-user, synthetic, third-party service monitoring, and business impact analysis.

To showcase these capabilities, we used the free trial on our own about:performance blog as a demonstration platform. The blog is based on the popular WordPress technology, which uses PHP and MySQL as its implementation stack. With only 4 steps we get full availability monitoring as well as visibility into every one of our visitors, and we can pinpoint any problem on our blog to the browser (JavaScript errors, slow 3rd party content, …), the network (slow connectivity, a bloated website, …) or the application itself (slow PHP code, inefficient MySQL access, …).

Before we get started, let’s have a look at the Compuware APMaaS architecture. To collect real user performance data, all you need to do is install a so-called Agent on the Web and/or Application Server. The data is sent in an optimized and secure way to the APMaaS Platform. Performance data is then analyzed through the APMaaS Web Portal, with drilldown capabilities into the dynaTrace Client.

Compuware APMaaS is a secure service to monitor every single end user on your application end-to-end (browser to database)

4 Steps to set up APMaaS for our Blog powered by WordPress on PHP

From a high-level perspective, joining Compuware APMaaS and setting up your environment consists of four basic steps:

  1. Sign up with Compuware for the Free Trial
  2. Install the Compuware Agent on your Server
  3. Restart your application
  4. Analyze Data through the APMaaS Dashboards

In this article, we assume that you’ve successfully signed up, and will walk you through the actual setup steps to show how easy it is to get started.

After signing up with Compuware, the first sign of your new Compuware APMaaS environment will be an email notifying you that a new environment instance has been created:

Follow the steps explained in the Welcome Email to get started

While you can immediately take a peek into your brand new APMaaS account at this point, there’s not much to see: Before we can collect any data for you, you will have to finish the setup in your application by downloading and installing the agents.

After installation is complete and the Web Server is restarted, the agents start sending data to the APMaaS Platform – and with dynaTrace 5.5, this also includes the PHP agent, which gives insight into what’s really going on inside the PHP application!

Agent Overview shows us that we have both the Web Server and PHP Agent successfully loaded

Now we are ready to go!

For Ops & Business: Availability, Conversions, User Satisfaction

Through the APMaaS Web Portal, we start with some high-level web dashboards that are also very useful for our Operations and Business colleagues. These show Availability, Conversion Rates as well as User Satisfaction and Error Rates. To show the integrated capabilities of the complete Compuware APM platform, Availability is measured using Synthetic Monitors that constantly check our blog, while all of the other values are taken from real end user monitoring.

Operations View: Automatic Availability and Response Time Monitoring of our Blog

Business View: Real Time Visits, Conversions, User Satisfaction and Errors
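
To make the synthetic side of this concrete: a synthetic availability monitor is, at its core, a script that periodically requests a page and records whether it answered and how fast. The sketch below is only a conceptual illustration of that idea – it is not the Gomez synthetic network, and the URL and check interval are placeholder assumptions.

```python
import time
import urllib.request
from urllib.error import URLError

# Placeholder target and interval, purely for illustration.
URL = "https://apmblog.example.com/"
INTERVAL_SECONDS = 60

def check_once(url: str) -> tuple[bool, float]:
    """Fetch the URL once; return (available, response_time_in_seconds)."""
    start = time.monotonic()
    try:
        with urllib.request.urlopen(url, timeout=10) as response:
            available = 200 <= response.status < 400
    except URLError:
        available = False  # HTTP errors and network failures count as unavailable
    return available, time.monotonic() - start

if __name__ == "__main__":
    for _ in range(3):  # a real monitor would loop indefinitely
        ok, seconds = check_once(URL)
        print(f"available={ok} response_time={seconds:.2f}s")
        time.sleep(INTERVAL_SECONDS)
```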

For App Owners: Application and End User Performance Analysis

Through the dynaTrace client we get a richer view of the real end user data. The PHP agent we installed is a full equivalent of the dynaTrace Java and .NET agents, and features like the application overview together with our self-learning automatic baselining work the same way regardless of the server-side technology:

Application level details show us that we had a response time problem and that we currently have several unhappy end users

Before drilling down into the performance analytics, let’s have a quick look at the key user experience metrics such as where our blog users actually come from, the browsers they use, and whether their geographical location impacts user experience:

The UEM Key Metrics dashboards give us the key metrics of web analytics tools and tie them together with performance data. Visitors from remote locations are clearly impacted in their user experience

If you are responsible for User Experience and interested in some of our best practices I recommend checking our other UEM-related blog posts – for instance: What to do if A/B testing fails to improve conversions?

Going a bit deeper – What impacts End User Experience?

dynaTrace automatically detects important URLs as so-called “Business Transactions.” In our case we have different blog categories that visitors can click on. The following screenshot shows us that we automatically get dynamic baselines calculated for these identified business transactions:

Dynamic baselining detects a significant violation of the baseline during a 4.5 hour period last night
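
dynaTrace’s self-learning baselining is proprietary and considerably more sophisticated than anything we can show in a few lines, but the underlying idea is simple: maintain a moving estimate of “normal” response time and flag samples that exceed it by more than an allowed margin. The sketch below illustrates that idea with an exponentially weighted moving average – an assumption-laden illustration only, not the actual algorithm.

```python
# Conceptual sketch of dynamic baselining: an exponentially weighted
# moving average (EWMA) of response times plus a tolerance band.
# This is NOT dynaTrace's algorithm - just an illustration of the idea.

def find_violations(samples, alpha=0.1, tolerance=1.5):
    """Return indices of samples exceeding the current baseline by the tolerance factor."""
    violations = []
    baseline = samples[0]
    for i, value in enumerate(samples[1:], start=1):
        if value > baseline * tolerance:
            violations.append(i)
        # Fold the new observation into the baseline.
        baseline = alpha * value + (1 - alpha) * baseline
    return violations

# Example: response times in seconds; the spike at indices 5-7 is flagged.
response_times = [0.8, 0.9, 0.85, 0.95, 0.9, 2.4, 2.6, 2.2, 0.9, 0.85]
print(find_violations(response_times))  # -> [5, 6, 7]
```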

Here we see that our overall response time for requests by category slowed down on May 12. Let’s investigate what happened here, and move to the transaction flow which visualizes PHP transactions from the browser to the database and maps infrastructure health data onto every tier that participated in these transactions:

The Transaction Flow shows us a lot of interesting points, such as errors that happen both in the browser and in the WordPress instance. It also shows that we are heavy on 3rd party content but good on server health

Since we are always striving to improve our users’ experience, the first troubling thing on this screen is that we see errors happening in browsers – maybe someone forgot to upload an image when posting a new blog entry? Let’s drill down to the Errors dashlet to see what’s happening here:

3rd party widgets throw JavaScript errors and thereby impact end user experience.

Apparently, some of the third party widgets we have on the blog caused JavaScript errors for some users. Using the error message, we can investigate which widget causes the issue, and where it’s happening. We can also see which browsers, versions and devices this happens on to focus our optimization efforts. If you happen to rely on 3rd party plugins, you may want to check out the blog post You only control 1/3 of your Page Load Performance.

PHP Performance Deep Dive

We will analyze the performance problems on the PHP server side in a follow-up blog post, where we will show you the steps to identify problematic PHP code. In our case it actually turned out to be a problematic plugin that helps us identify bad requests (requests from bots, …).

Conclusion and Next Steps

Stay tuned for more posts on this topic, or try Compuware APMaaS out yourself by signing up here for the free trial!


Internet Retailer Conference annual event: Who had the fastest web site at Internet Retailer 2012?

Recently, the Internet Retailer Conference and Exhibition annual event at McCormick Place West in Chicago drew a record number of attendees, with more than 8,600 in attendance over the four-day event and 564 companies exhibiting e-commerce technologies and services.

This year’s event was focused on “Connecting with the 21st Century Consumer.” A description from the event brochure stated, ‘It was not long ago that having a decently performing retail web site was cool. No more. Today there are millions of e-commerce sites and the competition between them is fierce.

So fierce, in fact, that e-retailers can no longer succeed simply by keeping up with the pack. Growth comes by outperforming your competition and the surest way of doing that is by understanding who are the frequent web shoppers, what they demand from online stores, and how best to reach and serve them.’

To help attendees understand their site’s performance, we ran the “Gomez Challenge” where attendees provided their website URL to have the site’s performance measured in real-time and compared to other participants taking part in the challenge during the event.

The Gomez Challenge is a set of tests that provides event participants – whether performance-focused or just beginning to learn about it – with valuable insight into how both market leaders’ and smaller companies’ sites are performing, and gives IT and business site stakeholders context for discussions on how to balance user experience with site speed.

Over the four-day event, we ran home page tests of participants’ web site performance from multiple geographic locations, looking at web page response time, number of connections, hosts, objects, and page size to provide insight into how each site was performing.

Using a series of waterfall charts and other diagnostics tools built into the Gomez Challenge, the test also provided participants with immediate suggestions for optimizing performance.
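
For readers who want a rough, do-it-yourself flavor of this kind of home page test, here is a minimal sketch that times a single page fetch and reports the transferred size and a crude count of referenced objects. It is nothing like the full Gomez measurement (no real browser, no waterfall, no multiple geographies), and the URL is a placeholder.

```python
import re
import time
import urllib.request

def measure_home_page(url: str) -> dict:
    """Time one fetch of the page and roughly count the objects it references."""
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=15) as response:
        html = response.read().decode("utf-8", errors="replace")
    elapsed = time.monotonic() - start
    # Crude object count: images, scripts and stylesheets referenced in the HTML.
    objects = re.findall(r'<(?:img|script|link)\b[^>]*(?:src|href)=', html, re.IGNORECASE)
    return {
        "response_time_s": round(elapsed, 2),
        "page_size_kb": round(len(html.encode("utf-8")) / 1024, 1),
        "referenced_objects": len(objects),
    }

if __name__ == "__main__":
    print(measure_home_page("https://www.example.com/"))  # placeholder URL
```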

The Gomez Challenge results are presented on a scoreboard that lists each participant along with their results across the following page load thresholds (a simple sketch of this classification follows the list):

  • Green = less than 2 seconds, good customer experiences
  • Yellow = between 2.1 and 5 seconds, considered to be customer impacting
  • Red = more than 5 seconds, critical issues and very customer impacting
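
Purely as an illustration of that scoreboard logic – not the actual Gomez implementation – the thresholds map naturally onto a tiny classification function:

```python
def classify_load_time(seconds: float) -> str:
    """Map a page load time onto the scoreboard's colour thresholds."""
    if seconds <= 2.0:
        return "green"   # good customer experience
    if seconds <= 5.0:
        return "yellow"  # considered to be customer impacting
    return "red"         # critical issues, very customer impacting

for t in (1.4, 3.2, 6.8):
    print(t, "->", classify_load_time(t))  # -> green, yellow, red
```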

The winner of the Gomez Challenge had the fastest average response time during the event across multiple geographies. This year’s challenge winner was Belk.com, the nation’s largest privately owned mainline department store company with 303 Belk stores located in 16 Southern states – congratulations!

Check out your own website with a free test with the Gomez Website Performance Test. You can also find out how your website performs across browsers, compared to your competitors and on mobile applications here.


Uswitch UK Report reveals drop between peak and off-peak surfing

From the BBC on 16 November 2011.

UK broadband speeds drop by an average of 35% from their off-peak highs when most people are online in the evening, according to a report.

The research, conducted by the comparison site Uswitch, was based on two million broadband speed tests.

The peak surfing times between 7pm and 9pm were the slowest to be online, the report said.

There were also huge regional variations between evening and early morning surfing times.

The report suggested the best time to be online was between 2am and 3am.

Users in Evesham, Worcestershire, fared worst, according to the survey, with a massive 69% drop-off between off-peak morning and evening surfing.

Those living in Weston-super-Mare did little better with speeds falling from an off-peak average of 9.5Mbps (megabits per second) to 3.4Mbps in the evening – a 64% drop.

The difference was often most noticeable in rural areas where even peak speeds were relatively slow. In Wadebridge, in Cornwall, speeds nearly halved from 4.1Mbps at off-peak times to 2.1Mbps at peak times.
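
The quoted drop-offs are straightforward percentage changes from the off-peak speed to the peak-time speed; a quick check of the Weston-super-Mare and Wadebridge figures:

```python
def drop_percent(off_peak_mbps: float, peak_mbps: float) -> float:
    """Percentage fall from the off-peak speed to the peak-time speed."""
    return (off_peak_mbps - peak_mbps) / off_peak_mbps * 100

print(round(drop_percent(9.5, 3.4)))  # Weston-super-Mare: ~64%
print(round(drop_percent(4.1, 2.1)))  # Wadebridge: ~49%, i.e. speeds nearly halved
```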

“It really is surprising just how much broadband speeds fluctuate at different times of the day, with drop-offs of almost 70% in some areas of the UK,” said Uswitch’s technology expert Ernest Doku.

“Not many internet users enjoy the maximum headline broadband speeds offered by providers, and certainly not during the working week,” he added.

New rules

Broadband speed is becoming more important as bandwidth-hungry services such as on-demand TV become more popular.

Telecoms regulator Ofcom recently revealed that British households download an average of 17 gigabytes of data every month over their home broadband connections.

That monthly data diet is equivalent to streaming 11 movies or 12 hours of BBC programmes via iPlayer.
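
As a rough sanity check of that comparison – using assumed average sizes, since the article does not state them (roughly 1.5 GB for a streamed movie and about 1.4 GB per hour of iPlayer) – the arithmetic works out:

```python
MONTHLY_GB = 17
MOVIE_GB = 1.5             # assumed average size of one streamed movie
IPLAYER_GB_PER_HOUR = 1.4  # assumed average iPlayer data rate in GB/hour

print(MONTHLY_GB / MOVIE_GB)             # ~11 movies
print(MONTHLY_GB / IPLAYER_GB_PER_HOUR)  # ~12 hours of programmes
```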

Critics say consumers are being misled by internet service providers who continue to advertise their maximum broadband speeds, even though many users do not get them.

New rules from the Committee of Advertising Practice (CAP) say that from April next year providers will no longer be able to advertise maximum speeds for net packages unless 10% of customers receive them.

Almost half of broadband users are now on packages with advertised speeds above 10Mbps but the average broadband speed is 6.8Mbps according to Ofcom.


Compuware Gomez Offers Industry’s First Free IPv6 Website Performance Comparison Test

From CBS Detroit.com 8th June 2011

In recognition of World IPv6 Day, Compuware Gomez has released the industry’s first free IPv6 Website Performance Comparison Test.

This test allows organisations to compare the speed of their IPv4- and IPv6-enabled Web applications. The free Compuware Gomez IPv6 Website Performance Comparison Test is available at www.gomez.com/ipv6-instant-test.

With Internet Protocol IPv4 addresses running out this year, the industry must act quickly to prepare for IPv6 adoption or risk increased costs and limited functionality online for Internet users everywhere. With the migration to IPv6 already under way, it’s critical that organizations ensure their IPv6-enabled applications perform on par with their customers’ user experience expectations.

Compuware’s early analysis of IPv6-enabled sites shows that users generally experience slower response times when accessing them. “With the depletion of IPv4 addresses, the IPv6 transition will affect every business that touches the Internet and the cloud. So organizations need to be ready to ensure the best possible end user experience for this transition,” said Mark Worsey, CIO at GoGrid.

“With GoGrid’s cloud infrastructure powering Gomez’s IPv6 Website Performance Comparison Test companies can easily compare performance of their IPv4- and IPv6-enabled web applications.”

To use the Gomez IPv6 Website Performance Comparison Test, the user submits URLs for IPv4- and IPv6-enabled web sites. The test produces a waterfall chart that compares the response times of the two sites and also shows a screen capture of the IPv6 and IPv4 pages as they appear in an actual browser.
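
If you just want a quick command-line comparison of your own – this is not the Gomez test, which uses real browsers and produces waterfall charts – curl’s -4 and -6 options force a fetch over IPv4 or IPv6 respectively, and -w '%{time_total}' prints the total transfer time. A small wrapper, with a placeholder URL:

```python
import subprocess

def fetch_time(url: str, ip_version: int) -> float:
    """Fetch the URL over IPv4 (-4) or IPv6 (-6) with curl; return total time in seconds."""
    flag = "-4" if ip_version == 4 else "-6"
    result = subprocess.run(
        ["curl", flag, "--silent", "--output", "/dev/null",
         "--write-out", "%{time_total}", url],
        capture_output=True, text=True, check=True,
    )
    return float(result.stdout)

url = "https://www.example.com/"  # placeholder URL
print("IPv4:", fetch_time(url, 4), "s")
print("IPv6:", fetch_time(url, 6), "s")
```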

“IPv6 will play an important role in the future of the Internet, and until now there was no way to test the performance of IPv6-ready websites or compare them to the currently deployed IPv4 sites,” said Steve Tack, chief technology officer of Compuware’s application performance measurement business unit.

“This instant test provides a quick and simple way to measure the response times that a user experiences when using these two protocols and helps ensure organizations experience a smooth and successful transition to IPv6.”


Site Speed added to Google Analytics

According to searchenginejournal.com, Google has added a site speed tool to its analytics service to improve the quality of its service to users.

Google prides itself on speed and is reported to be obsessed with providing customers with instant search. Its love of speed isn’t a ‘suit yourself if you don’t like it’ matter either: speed impacts both Google AdWords page quality and actual Google SERP rankings, so the company is trying to create a faster experience for all users.

The site speed tracker looks at all the information related to the load speed of pages and will give you indications of which pages are loading the quickest, which browsers aren’t responding at all and which demographics are experiencing difficulty.

To make use of this tool, you have to enable the new Google Analytics, turn on the time tracking features and install the new Google Analytics code provided.


Why Websites Are Slow & Why Speed Really Matters

Published by Mashable 6th April 2011

What a difference a millisecond can make. When it comes to browsing the web, every tiny moment counts — and the fewer moments that pass between a mouse click and a fully loaded page, the better.

Speed is a bit of an obsession for most web users. We fret over our Internet connections’ and mobile connections’ perceived slowness, and we go bananas for a faster web browser.

Given this better-faster mentality, the consequences for slow-loading pages can be dire for site owners; most users are willing to navigate away after waiting just three seconds, for example. And quite a few of these dissatisfied users will tell others about the experience.

What’s more, our entire perception of how fast or slow a page loads is a bit skewed. While we’re waiting for a site to materialize in a browser tab, pages seem to load about 15% slower than they actually do. The perception gap increases to 35% once we’re away from a computer.

But for the precious milliseconds site owners can shave off page load times, they can see huge returns. For example, Amazon.com increased its revenue by 1% for every 100 milliseconds of load time improvement. And AOL said its users in the top 10% of site speed viewed around 50% more pages than visitors in the bottom 10%.

Site optimization firm Strangeloop has provided us with a slew of graphically organized stats on just how long pages take to load, why they take as long as they do, and just how long the average Joe or Jane is willing to wait around for your site.

Check out the infographic below.

And site owners, if you’re worried about speed, do a quick pulse-check with Google’s free and easy page speed tool, Page Speed Online.

Strangeloop's Site Speed Infographic


I would definitely Blame Stella; a clever take on website performance monitoring

I came across this site Blame Stella on my travels…

Very nicely presented data, which I am sure the guys will develop further – particularly how they explain to mere mortals some of the terms performance monitoring people bandy around every day…

Blame Stella

An interesting article here with the founder, Delano Mandelbaum, who is based in Montreal, Canada.