We worked with Smartdriverclub Insurance, who set us the goal of increasing their click to sale rate. By engaging in usability testing and listening to their potential customers, they saw their click to sale rate rise by 14% in under three months.
Smartdriverclub is a relatively new player in the telematics car insurance space and has seen steady growth over the last few years. We worked with their customer success team on a new brief to help them increase their click to sale rate (CTS).
What is telematics insurance?
Telematics, also known as ‘black box insurance’, records and tracks policyholder driving behaviour such as speed, acceleration, braking and cornering, via a small device that plugs into the car’s OBD port or 12v socket. To take out a telematics policy, as with normal car insurance quotes, a customer either goes directly to Smartdriverclub or arrives via an aggregator such as GoCompare. Once a policy is active, customers receive discounts on their next year’s premium if their driving behaviour has been positive.
Understanding customer behaviour through usability testing
Before making recommendations on how to improve their click to sale rate, it was important to build an understanding of how customers were engaging with the existing quote engine. Our recommendations would be presented to business stakeholders, so we would need evidence to support the changes we recommended. We suggested a round of usability testing, starting with 5 participants, as this is a rapid and insightful way to gain real-world customer feedback. By observing how participants move through a website and listening to their instinctive reactions at any pain points, we’re able to build an accurate picture of the problems they face in the user journey, analyse patterns and make positive recommendations for change.
Remote user testing
We use the remote usability testing platform whatusersdo and have enjoyed the flexibility and convenience it provides. Previously we faced challenges recruiting the right participants in a given timeframe; whatusersdo has over 30,000 panel members, which takes the pain out of the recruitment process. The platform allows us to write comprehensive user testing scripts as well as drill down to the most applicable participant type using pre-screener questions. We are then provided with 20-30 minute videos with audio of each individual usability test, typically arriving within 24 hours for analysis.
Writing a user testing script
When writing a user testing script, it’s important to define what questions you would like answers to and work backwards from there. Agreeing on these with the client is an important first step of the process.
For the Smartdriverclub pre-screener questions, we made sure that participants had a driving licence and used their car as a frequent mode of transport. We wouldn’t want a tester who has no desire to drive and takes the bus every day, as this wouldn’t accurately represent the audience.
We started with a scenario to set the scene for the participant to help them understand the context of the user test. This is typically followed by a link to the website we wish to test.
For Smartdriverclub, we wanted users to go through the GoCompare aggregator prior to seeking out their quote. We based this decision on patterns we discovered via analytics, which showed us that 90%+ of traffic arrives from an aggregator directly to the quote page. Building a deeper understanding of how customers use car insurance aggregators and their perception of telematics insurance proved a useful contribution to our research. We were able to gain insight into the decision-making process when choosing a suitable quote.
Reputation, price and the perception of telematics were all critical factors when customers chose an insurer.
Once a participant had landed on the Smartdriverclub quote page, they referenced the testing script and answered open questions about their initial impressions of the website. When writing tasks and asking for verbal responses, it’s important not to ask leading questions, which can influence the response. For example, ask ‘What are your first impressions of this page?’ rather than the leading ‘Do you find this page confusing?’.
We gain a lot of valuable insight when listening to immediate verbal responses, but sometimes participants don’t realise they’re experiencing usability problems and assume that’s simply how something should work. This is where our UX team is able to highlight these issues and note them down for review later on. We use Post-its and a whiteboard to record positive and negative insights, which has proved a rapid and impactful way to record and compare them.
We always recommend having two team members watch the user tests, ensuring nothing is missed or interpreted differently.
Example insights and recommendations we made
Having reviewed the journey from the aggregator right through to the point of payment, the team analysed the feedback. Some interesting patterns were identified, and here are some (not all) of the key insights we uncovered.
1. Most customers didn’t understand it was a telematics policy
It shouldn’t be surprising how little users read on a page when faced with a substantial user journey, and this was evident during the user tests.
Despite the aggregator labelling telematics-specific quotes, a number of participants missed this. On landing on the Smartdriverclub quote page, there was no mention that it was a telematics policy, and participants progressed right up to the device page (one of the last steps), at which point the device benefits were explained in detail. Most participants struggled to see the benefit of telematics, and some were surprised to learn the policy was telematics-based at all.
We recommended providing greater transparency to make customers immediately aware that the policy is telematics-based when arriving from an aggregator. We saw opportunities to utilise the sidebar and show a brief explainer, communicating what telematics is and how it is used, which remains on every page during the journey.
2. The device onboarding was overly complicated and manual
Having previously been through the aggregator and then the Smartdriverclub quote and extras page, the customer was required to read about the device and manually call Smartdriverclub to arrange the delivery of their self-install telematics device. There were other business reasons for this call, but it provided a clear blocker for participants. Most did not want to have to go to the effort of calling the company after taking out their policy online.
We recommended no longer requiring the customer to call Smartdriverclub to arrange the delivery of their device, proposing instead that delivery be handled online. As the customer’s address would already have been obtained during the quote questions, the process would be made more convenient by recapping their address on this page. This was a much larger business decision for Smartdriverclub, but one that could have a significant impact on click to sale rates.
3. Small usability issues
It was evident during the tests that there were small usability issues, not necessarily high-impact, but contributing to user frustration. Checkboxes situated in large groups of text were occasionally missed, and validation messages were quite robotic, with poor reference to their location on the page.
We recommended making checkboxes more visible by adding 1px borders around them, increasing the clickable area and visibility on the page when a user is scanning.
We recommended an error message audit: unhelpful, robotic responses such as ‘invalid input’ for an incorrect email address would be replaced with more context-aware, helpful, human language, such as ‘Please enter a valid email address’.
When multiple errors occurred, we recommended showing these at the top of the page, in a bullet list with anchors to the correct location of the error.
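The approach above can be sketched in code. Here is a minimal TypeScript example (the field names, error codes and message wording are all hypothetical, not Smartdriverclub’s actual implementation) that maps field-level errors to human-readable messages and builds the anchored summary list shown at the top of the page:

```typescript
// Hypothetical field-level error shape produced by form validation.
type FieldError = { field: string; code: string };

// Human, context-aware messages keyed by "field.code" (illustrative wording).
const MESSAGES: Record<string, string> = {
  "email.invalid": "Please enter a valid email address",
  "postcode.missing": "Please enter your postcode so we can find your address",
  "dob.invalid": "Please enter your date of birth as DD/MM/YYYY",
};

// Build the summary bullet list: one entry per error, each with an
// anchor pointing at the field that needs attention.
function buildErrorSummary(
  errors: FieldError[],
): { message: string; anchor: string }[] {
  return errors.map(({ field, code }) => ({
    message: MESSAGES[`${field}.${code}`] ?? "Please check this field",
    anchor: `#${field}`,
  }));
}
```

Rendering each entry as a link (e.g. `<a href="#email">Please enter a valid email address</a>`) lets users jump straight to the problem field rather than hunting for it.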
Having analysed the findings, we presented our recommendations to Smartdriverclub and estimated the effort and impact of each one. This allowed both Si digital and Smartdriverclub to prioritise the design and development schedule and focus initially on the quick wins.
An example of a typical effort/impact scale, showing how we would prioritise user testing feedback.
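As a rough illustration of that prioritisation step, here is a minimal TypeScript sketch (the recommendation names and scores are invented for the example, not our actual figures) that orders recommendations by their impact-to-effort ratio so the quick wins surface first:

```typescript
// Each recommendation is scored on simple 1-5 scales for impact and effort.
type Recommendation = { name: string; impact: number; effort: number };

// Quick wins are high impact for low effort, so sort by the ratio, descending.
function prioritise(recs: Recommendation[]): Recommendation[] {
  return [...recs].sort(
    (a, b) => b.impact / b.effort - a.impact / a.effort,
  );
}
```

A two-axis effort/impact grid on a whiteboard achieves the same ordering visually; the ratio is just a convenient way to express “quick wins first” when the list gets long.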
“It is all looking a lot more positive on the CTS rates. Since mid-Sept CTS has gone up 14% and overall sales are up, so I’m really pleased and the feedback I’m getting is so positive. So much of that is down to you guys at Si, so thank you very much.”
Smartdriverclub were happy to action most of the recommendations in the usability testing report and bravely made some bigger business decisions to simplify the device onboarding process. Since implementing these changes, Smartdriverclub has seen a 14% rise in their click to sale rate, a fantastic improvement that has justified the value of the work. We now repeat this process on a cyclical basis, allowing constant iterative changes to their platform and helping them grow their business.
If you would like the Si digital, product design team to organise a set of user tests on your current website, then please email email@example.com
Plan your project