And now… Swaggering into the Session Replay market…
To answer the WHY in a traditional Web Analytics sense: you are analysing data based on the tags available in a page. You will have thought of potential scenarios based on your experience and knowledge, and hopefully the tagging will be complete enough to tell the user's story. No matter how experienced you are, millions of user journeys will expose scenarios you never thought possible. In many ways, your knowledge of the web can be a hindrance: what is obvious and intuitive to you is confusing and ridiculous to many others. This is where user session recording can play a pivotal role in fully understanding the user's experience.
I will talk about Premier Inn without going into anything confidential. I'll highlight where user session recording has been invaluable for ironing out some design kinks in the booking paths.
Focusing on these new booking flows, I witnessed international user conversion fall dramatically in Adobe Analytics. Before the booking flow tests went live, I had created some custom code in the form of Decibel 'Goals' (essentially an event-based JS method) for each booking step, capturing the occurrence of inline errors and processing errors. I then created a segment within the Decibel Insight UI to view the sessions of non-converting users who were not from the UK. I could clearly see a spike in errors on specific steps for our foreign guests, reflecting what I was seeing in Adobe Analytics.
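To make the approach concrete, here is a minimal sketch of an event-based error Goal. Note this is not the real Decibel Insight API: `sendGoal()` is a hypothetical stand-in for whatever call your Decibel deployment exposes, and the Goal naming scheme is my own illustration.

```javascript
// Hypothetical sketch of event-based error "Goals" per booking step.
// sendGoal() stands in for the real analytics call; here it just records
// the Goal names so we can inspect them.
const firedGoals = [];
function sendGoal(name) {
  firedGoals.push(name); // real code would notify the analytics tag here
}

// Fire a distinct Goal per booking step and error type, so sessions can
// later be segmented on exactly where and how users hit errors.
function recordError(step, type) {
  sendGoal(`booking-step-${step}:${type}-error`);
}

recordError(2, 'inline');
recordError(2, 'processing');
console.log(firedGoals);
// ['booking-step-2:inline-error', 'booking-step-2:processing-error']
```

Naming Goals by step and error type is what makes the later segmentation trivial: "non-converting, non-UK, fired a step-2 error" becomes a simple filter.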
From this segment I watched only a couple of user sessions before it was apparent there was a massive issue within the form validation. On session review I could see the new form designs only validated UK phone numbers!
Any user with a non-UK phone number was effectively declared insane: their number was complete hokum and they needed to be more honest. The not-so-jolly foreigners were quite rightly perplexed. As I watched the user sessions in my browser I could see the cursors flying around the replay screen. I could almost hear the scalp scratching. The building frustration was palpable, as if I was sat next to them watching their erratic mouse movements crescendo into a bird's-nest pattern just before they gave up. It was absolutely awesome to watch, in a bittersweet way: great for me to pinpoint this major issue so quickly, but terrible for the user's experience.
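The bug itself is easy to reconstruct in miniature. This is a hypothetical sketch, not Premier Inn's actual validation code: a pattern anchored to UK number formats rejects perfectly valid international numbers, while a more permissive check accepts both.

```javascript
// Hypothetical reconstruction of the bug: a validator anchored to UK
// formats (leading +44 or 0) versus a permissive international check.
const ukOnly = /^(\+44|0)\d{9,10}$/;          // UK-style numbers only
const international = /^\+?[\d\s()-]{7,15}$/; // loose E.164-style check

console.log(ukOnly.test('07700900123'));      // true  - UK mobile passes
console.log(ukOnly.test('+33123456789'));     // false - French number rejected
console.log(international.test('+33123456789')); // true - accepted
```

Whatever the real implementation looked like, the symptom in the replays was exactly this: every non-UK number bounced off the form.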
A typical bird's nest, born of a frustrated and lost user
With Road Runner pace I was able to go straight into Adobe Target and exclude international customers from the new booking form tests (5% of site traffic was included in these four tests, the fourth being Control). This clearly highlighted the benefits of soft testing with Adobe Target before you release new designs into the wild.
I hastily shared the Decibel Session links with the UX and Development teams to track and resolve the validation issue. Being able to see the issue without me even having to utter a word meant the JIRA ticket was simple: [FIX THIS QUICKLY + DECIBEL SESSION PLAYBACK LINK].
The issue was traced to the back end, allowing some modification to the booking systems to accommodate the extra telephone field in the new designs. A tweak to the validation ensured this potentially costly oversight was resolved immediately.
One of the major UX changes in the new booking flows was dropping the asterisk for required fields. Inline error validation now alerted users with a flag stating they had missed some required content. This was a major design change and needed to be verified as beneficial for the user's experience.
To keep a like-for-like comparison with the Control group (the standard Premier Inn booking flow), I could only record an error event when the user hit continue. Recording the inline error flags as they appeared in the UI would not have been fair, given that the user may not have missed the alert, and Control validated only on submission rather than on losing focus (the inline flags also seemed overly excitable).
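The submit-time measurement can be sketched like this; again `sendGoal()` is a hypothetical stand-in and the field names are illustrative only. The point is that both variants fire the error Goal at the same moment, when the user hits continue:

```javascript
// Hypothetical sketch: record a submission-error Goal only on continue,
// mirroring when Control validates. sendGoal() stands in for the real call.
function sendGoal(name) { console.log('goal:', name); }

// Which required fields were left empty (or whitespace-only)?
function missingRequired(values, required) {
  return required.filter(f => !values[f] || !values[f].trim());
}

function onContinue(values) {
  const missing = missingRequired(values, ['name', 'email', 'phone']);
  if (missing.length > 0) sendGoal(`submission-error:${missing.length}-fields`);
  return missing;
}

onContinue({ name: 'A. Guest', email: '', phone: '' });
// logs "goal: submission-error:2-fields"
```

Measuring at the submit step, rather than on every inline flag, is what kept the error counts comparable across Control and the test variants.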
The ineffectiveness of inline error reporting was evidently a big issue in the new designs. As you can see from the trended submission error results, one test saw a 120% increase in submission errors from users missing required fields.
We could have sat around crunching data in Adobe Analytics, but with the Decibel error Goals set up, creating a segment was a cinch.
Sure enough, popcorn in hand, we saw users stumbling at the same hurdles. Some users scrolled down the form so fast they were past the error flags before they appeared in view. Some users didn't notice them at all. Many users did not even see the text fields, as if they were invisible.
There was, however, a reassuring consistency in playing back just a few Decibel Insight sessions. Reported errors dropped into line with Control, and conversion rose to just 0.4% below Control, which is fantastic compared to the previous test's 2% below.
There was still work to be done to bring these form designs up to scratch, but using Decibel Insight enabled me to understand the UX design issues very quickly. Merely A/B testing the booking forms would have shown the poor performance across the form versions, but wouldn't have readily highlighted the specific problems… and definitely not so quickly!
Would I recommend the product? Yes I would. For one-off tests or the continued optimisation of pages, I believe it is a very affordable tool that gives a great ROI. Is it as polished as Clicktale? Not yet (this may have changed, as some months have passed), but for someone like me, having the freedom to tinker just makes the software that much more rewarding.
My favourite aspect of the software was the trailing mouse cursor. This alone exuded user frustration: if you saw a bird's nest, you knew something was wrong.
The performance was excellent and it did not slow the pages down, which is always a good thing.