Lies, Damned Lies and Website Monitoring Statistics

Date: 3rd November 2010
Author: Deri Jones

Sometimes it’s hard to find out the truth.  Some companies don’t even believe that the truth is out there.

There’s even doubt as to who actually originated the phrase “lies, damned lies and statistics” (it may not have been British Prime Minister Disraeli).

It’s also sometimes hard to find out the truth about what problems users are experiencing 24/7 on a particular website. Website monitoring is a much misunderstood activity, with statistics never far away.

This week was a case of extremes that illustrate this point.

During in-depth discussions with an eCommerce team about plans for 2011 web performance activity, it became apparent that the CTO thought website monitoring was of little value, and he set the scene for the whole team. Their viewpoint was that they already had excellent monitoring data from their servers. That much is true: they have a great server monitoring setup across a diverse range of kit in the eCommerce datacentre, reaching back out to their in-house merchandising systems too.

The agenda included a look at what the money-making Dynamic User Journeys on the site might be: which are most critical to sales, and which routes have had the most problems in the past.

But like trying to persuade Disraeli to take a look at statistics, it was not going to happen; from where they stood, the whole concept of Dynamic Journeys just didn’t seem attractive.

In contrast, we had some extremely positive feedback from a project that is 3 months into a 12-month web performance exercise.

Interestingly, this project had not started on such a positive note. Looking back the eCommerce director said:

“Although at the start of the plan we were committing money and time resources to fundamentally upgrade how we monitor our web site, confidence in the likely success of the venture was not high. We’d been running monitoring of our site for some time before that, and seen some benefit from it – but a number of voices in the team thought the money might have been better spent on hardware!”

But at the meeting this week, it became clear that even a rough calculation of ROI showed that the improvements made to their site, in the light of the key performance information gathered, were significant.

The turning point seems to have come when the team’s understanding shifted: when they realised that some of the statistics they had depended on as website monitoring facts turned out to be lies and damned lies.*

Those numbers were things like performance of:

  • product page – static URL of http://www.company.com/womens/shirts/blue-blouse-2917655
  • search page – static URL of http://www.company.com/search.do?text=black+trousers
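
For contrast, a static-URL check of the kind behind those numbers boils down to timing a GET of one fixed address. A rough Python sketch (the URL is the fictional example above, and the 30-second timeout is an assumption):

    import time
    import urllib.request

    # Time a GET of one fixed address - the same narrow question every run.
    url = "http://www.company.com/search.do?text=black+trousers"
    start = time.monotonic()
    with urllib.request.urlopen(url, timeout=30) as resp:
        status = resp.status
        resp.read()
    print(f"{url} -> {status} in {time.monotonic() - start:.2f}s")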

What the new project had revealed instead was performance based on Dynamic User Journeys.  That means ‘Do what the User Does’, i.e.

  • not following fixed URLs (no real user types in the URL for the pages they visit!)
  • at each step in the Journey:
    • the page is examined
    • a link or button is looked for based on the Journey spec (e.g. choose a product at random from the page)
    • that link is followed

Overall, the flow is just what a real user would do: scan the page, then click the link.
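
To make that concrete, here is a minimal sketch of one journey step in Python, using only the standard library. The link-matching rule, the timeout and the random choice are illustrative assumptions; a real Journey spec would add form fills, basket actions and content checks:

    import random
    import re
    import time
    import urllib.parse
    import urllib.request
    from html.parser import HTMLParser

    class LinkCollector(HTMLParser):
        """Collect the href of every <a> tag seen while parsing a page."""
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    def journey_step(current_url, link_pattern):
        """One dynamic step: fetch the page, time it, pick a matching
        link at random from what is actually there, and follow it."""
        start = time.monotonic()
        with urllib.request.urlopen(current_url, timeout=30) as resp:
            html = resp.read().decode("utf-8", errors="replace")
        elapsed = time.monotonic() - start

        collector = LinkCollector()
        collector.feed(html)
        candidates = [h for h in collector.links if re.search(link_pattern, h)]
        if not candidates:
            # The journey fails the way a user fails: the expected link is not there.
            raise RuntimeError(f"no link matching {link_pattern!r} on {current_url}")

        next_url = urllib.parse.urljoin(current_url, random.choice(candidates))
        return next_url, elapsed

A full journey is then just a chain of such steps (home page, category, random product, add to basket), each one timed and checked against the Journey spec.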

What you get back is a fundamentally different set of numbers.

In this case, it uncovered a whole area of sporadic errors, where users could search and find a product but, under certain conditions, could not then add it to their basket.

It was running at about a 3% sporadic error rate during peak shopping hours.
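
That is the kind of failure only repetition exposes. As a hedged sketch building on journey_step above (shopping_journey is a hypothetical chain of steps ending in an add-to-basket action), the error rate falls out of running the journey over and over and counting the failures:

    def measure_error_rate(journey, runs=200):
        """Run a journey repeatedly; return the fraction of runs that failed."""
        failures = 0
        for _ in range(runs):
            try:
                journey()            # e.g. search -> product -> add to basket
            except RuntimeError:     # an expected link or button was missing/broken
                failures += 1
        return failures / runs

    # error_rate = measure_error_rate(shopping_journey)
    # print(f"sporadic error rate: {error_rate:.1%}")   # roughly 3% at peak here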

Being able to identify and fix that had proven a quick success for the project.

So it all goes to show – the truth is out there. You just need to keep an open mind.

* “A thousand times, dumb-founded lies” – Gomez, “1000 Times”, from the album ‘In Our Gun’
