Monitor what Users do on your website

Date: 7th June 2011
Author: Deri Jones

Aka: Do what the Customer Does, not an artificial simplification of it.

If you want to know WHY users do what they do on your site, web analytics can tell you the What. But only by seeing what the user experience is like 24/7, as the monitoring actually ‘does what users do’, can you find out whether their behaviour is affected by slowdowns at busy periods in certain parts of the user journeys, or by pages showing unexpected or wrong content.

It’s common for organisations to collect lots of data (‘if you can’t measure it, you can’t manage it’, as the mantra goes).

And data on the live website is no exception: with the advent of the web and web analytics over the last 10 years, there is a huge amount of data derived from users hitting pages on your website. Marketing folks have become much more analytical and numeric, and spreadsheet-ROI driven.

But not all data is of equal value in driving your online business forward.

So it was a nice quick win for a project this month, when the client realised that they’d been monitoring lots of stuff on the website but had all along been missing the experience of their actual users. They were thus quickly able to find and fix some glitches that had been hurting their online brand and sales for some time; the ROI was a 2% drop in abandoned carts.

As with many online retailers, the responsibility for measuring the live website 24/7 had been left entirely in the hands of the IT team. Aside from a bunch of server monitoring tools, the IT team had, a while ago, also added monitoring of some pages across the site. They’d strung some URLs together in a chain, and called it journey monitoring.
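To see why that falls short, here is roughly what URL-chain ‘journey’ monitoring amounts to: a minimal sketch in Python, with hypothetical URLs rather than the client’s actual setup:

    import requests

    # A hypothetical chain of standalone URLs, fetched in sequence.
    URLS = [
        "https://www.example-shop.com/",
        "https://www.example-shop.com/search?q=sofa",
        "https://www.example-shop.com/checkout",
    ]

    def check_chain():
        for url in URLS:
            response = requests.get(url, timeout=10)
            print(f"{url}: status={response.status_code} "
                  f"time={response.elapsed.total_seconds():.2f}s")

    check_chain()

Each request is independent: nothing is clicked, and no session or basket state carries from page to page. That gap is exactly what came to light next.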

The information was not really presented to the rest of the business in a digestible form: most departments didn’t put much store by the numbers, though now and again a monthly report might include some of them.

But as part of the Customer Experience project we’d been called in for, we were working with both the non-tech eCommerce and IT teams to work out what sort of website availability metrics and website performance monitoring would make sense as higher-level KPIs that both teams would treat as a common language.

The immediate quick win: the existing monitoring quite simply was not doing what real customers were doing!

This is what we found:

The site has a rich search function: real users would make a search, and then click on something that matched it.

But the monitoring they had in place did not click on anything in the search results! It stopped at the page of search results and went no further.

Hence they were blind to a major part of the user experience their site was offering.
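Monitoring that really ‘does what the user does’ has to drive a browser through the search and on into the results. A minimal sketch using Selenium WebDriver in Python; the site URL, field name and CSS selectors are all hypothetical:

    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Firefox()
    try:
        # Land on the home page, as a real user would.
        driver.get("https://www.example-shop.com/")

        # Run a search (the field name 'q' is an assumption).
        search_box = driver.find_element(By.NAME, "q")
        search_box.send_keys("sofa")
        search_box.submit()

        # The step the old monitoring skipped: click a product
        # in the search results.
        driver.find_element(By.CSS_SELECTOR, ".search-result a").click()

        # Confirm a product page actually loaded, e.g. via its price element.
        price = driver.find_element(By.CSS_SELECTOR, ".product-price").text
        print(f"Reached product page; price shown: {price}")
    finally:
        driver.quit()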

The reason the IT team had stopped where they had was not that they were incompetent. Far from it.

The key reason was that the website sold a wide range of products: some products were big, shipped in big boxes, and required the buyer to decide what delivery slot they wanted, to ensure they would be at home to sign for the goods.

Other products were smaller and needed no special choice of delivery slot, so no special delivery page was shown.

So when the IT team had decided what pages to monitor, they’d got stuck on the fact that if they clicked on a product after a search, the journey might sometimes require a delivery-slot page to be filled in, and sometimes not show that page at all.

With the sensible aim of having consistent data, and the less sensible aim of making the job easier, they decided not to click on anything at all from the search, to avoid the problems of the extra delivery page.

Result: although 99.9% of real users would click on a product on that page, the past monitoring never measured that action.
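The inconsistency itself is straightforward to script around: the journey can check whether the delivery-slot page appeared and handle either branch, instead of abandoning the click altogether. Continuing the hypothetical Selenium sketch above:

    from selenium.common.exceptions import NoSuchElementException

    def handle_optional_delivery_slot(driver):
        """Fill in the delivery-slot page if it appears; carry on if not."""
        try:
            # Selector is an assumption for illustration.
            slot = driver.find_element(By.CSS_SELECTOR, "#delivery-slot option")
        except NoSuchElementException:
            return "no delivery-slot page (smaller product)"
        slot.click()
        driver.find_element(By.CSS_SELECTOR, "#confirm-slot").click()
        return "delivery slot chosen (larger product)"

Either outcome is a valid sample; the journey simply records which branch it took, so the data stays comparable without skipping the click.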

Now the project has rolled out, a number of redesigned user journeys are being measured on the site, and confidence is growing that the monitoring is really ‘doing what the user does’. Non-IT teams such as Marketing are now taking the lead role in specifying the journeys that should be measured, and changing that spec month by month as the desired business outcomes vary: as certain product campaigns drive traffic to one part of the site, or to a subset of the product ranges, and so on.

Once we put in place 24/7 user journey monitoring that included clicking on a product after the search, a price glitch on the website came to light. The full price, the discount % and the discounted price shown for a product would not match: the three numbers contradicted each other! Not all the time, but it was wrong for a percentage of samples each day.

It only impacted certain products in certain product groups, under certain conditions, amounting to a 2% error rate.
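This is the kind of content check a scripted journey can run on every sample: recompute the discounted price from the other two numbers and flag any contradiction. A minimal sketch; the penny-rounding tolerance is an assumption:

    def prices_consistent(full_price, discount_pct, shown_price, tolerance=0.01):
        """The three displayed numbers agree if shown_price equals
        full_price reduced by discount_pct, within rounding."""
        expected = full_price * (1 - discount_pct / 100)
        return abs(expected - shown_price) <= tolerance

    assert prices_consistent(200.00, 25, 150.00)        # consistent
    assert not prices_consistent(200.00, 25, 160.00)    # the glitch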

That price problem had been known about anecdotally for some time; the call centre reported that it was occasionally happening. But only when the website monitoring really did what real users would do did it come to light in a reproducible way, so that the IT team could readily identify and fix it.

The ROI on finding and fixing that one problem was the removal of a pricing error occurring at a 2% rate, which had caused abandoned shopping journeys; the proof was the drop in abandoned journeys.
