Key Findings from the World Usability Congress
How can we be sure our customers will actually use a new online service? Which user experience scenario would users prefer? How do we make a complicated system pleasant to use?
We went looking for the answers at the World Usability Congress, which was held in Graz, Austria. Exchanging experiences with internationally renowned experts in the field is simply an opportunity we did not want to miss.
Data-driven UX/UI design was one of the main topics of the congress. Here are some findings you can build on to drive customer satisfaction and increase conversions.
1. Don’t guesstimate. Work on data-based user experience.
A user-centric approach is a must if we want to create value for our customers. But how do we figure out what they really – and I mean really – want?
The consensus across all the discussions and presentations was that initial assumptions must be verified in practice. Using data-based evidence early in the process, even before committing development resources, means more effective development and better outcomes.
Using data, you can:
- Deepen your knowledge of users and their habits
- Accelerate the learning process based on feedback
- Reduce the cost of unnecessary development
- Accelerate time to market
- Get a good starting point for new value-added services.
Because of its clear benefits for both customers and the business, designing from evidence-based hypotheses is clearly the future of online solution design.
2. Combine qualitative and quantitative research to get the whole picture
Which metrics best answer your specific questions? Qualitative research leverages non-numerical data; it demonstrates the “why” or “how”. Quantitative research, on the other hand, shows the “who”, “what”, “when” and “where”.
To get a full picture of your user experience, you need to run qualitative as well as quantitative tests to understand both what’s happening and why.
Qualitative research: Listen to the Voice of the Customer
- Type of research: Qualitative
- Methodology: Usability Testing
Theory is good, practice is better – especially if you can attend Craig Tomlin’s workshop at the World Usability Congress, a hands-on session on user research. We were able to fine-tune our skills in running a user test: asking the right questions, avoiding bias and getting the feedback we needed.
And the main conclusion? To validate assumptions, testing with 5–8 users is enough. It is important to remember that you are testing the functionality, not the user.
Assumptions based on the qualitative research must be validated through quantitative research.
Quantitative Research: Bandit Testing
- Type of research: Quantitative
- Methodology: A/B Testing with Bayesian Evaluation
Jorrin Quest, a data-driven UX expert, presented an interesting alternative to A/B testing. Traditional A/B testing takes a lot of time and patience to reach a sample size big enough to indicate which variant performs better. The problem is that, in the meantime, you lose conversions on the test variants that underperform.
This is where a bandit algorithm comes to the rescue. First, it is designed to handle not just two but many variants. Second, during the test it gradually sends more and more visitors to the best-performing variant while allocating less traffic to underperforming ones.
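The traffic-shifting behaviour described above can be sketched with Thompson sampling, a common Bayesian bandit strategy consistent with the “A/B testing with Bayesian evaluation” idea. This is a minimal illustration, not the specific algorithm presented at the congress; the variant names and conversion rates below are invented for the simulation.

```python
import random

# Each variant tracks observed conversions ("successes") and
# non-conversions ("failures"). In practice these counts would
# come from your analytics; here they start empty.
variants = {
    "A": {"successes": 0, "failures": 0},
    "B": {"successes": 0, "failures": 0},
    "C": {"successes": 0, "failures": 0},
}

def choose_variant(variants):
    """Thompson sampling: draw one sample from each variant's
    Beta posterior and serve the variant with the highest draw.
    Variants with better observed conversion rates win the draw
    more often, but weaker ones still get occasional traffic
    (exploration), so a late bloomer can recover."""
    draws = {
        name: random.betavariate(v["successes"] + 1, v["failures"] + 1)
        for name, v in variants.items()
    }
    return max(draws, key=draws.get)

def record(variants, name, converted):
    """Update the chosen variant's posterior with the outcome."""
    key = "successes" if converted else "failures"
    variants[name][key] += 1

# Simulated traffic, assuming (hypothetical) true conversion
# rates of 3%, 5% and 8% for the three variants.
true_rates = {"A": 0.03, "B": 0.05, "C": 0.08}
for _ in range(10_000):
    name = choose_variant(variants)
    record(variants, name, random.random() < true_rates[name])

served = {n: v["successes"] + v["failures"] for n, v in variants.items()}
print(served)  # most traffic typically ends up on the best variant
```

Note how this addresses the complaint about classical A/B testing: instead of splitting traffic evenly until the experiment ends, the allocation itself adapts, so fewer visitors are lost to underperforming variants while the test is still running.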
3. Start lean
“The time to make an impact on an application’s usability comes early in the process, while your ability to make changes is great and the cost to make those changes is minimal.”
According to speakers at the World Usability Congress, organizations are not prepared to put money into a product upfront without any guarantee it will perform to their standards. They want proof as early as possible. To meet these expectations, developers have begun to adopt the Lean approach.
One of the core aspects of lean development is focusing on outcomes, not features. The first assumption-based release is therefore more of a testing model that has yet to be validated by users. By applying data-based evidence to each release, you can measure the audience response, learn from their reactions, validate assumptions and adjust the solution accordingly. This build-measure-learn feedback loop is the key principle of the Minimum Viable Product (MVP) approach, and it will save you the cost of unnecessary development and minimize the risk of failure.
It was an honour meeting Jeff Gothelf at the congress. Jeff is a lean UX advocate and author of the book “Lean UX: Applying Lean Principles to Improve User Experience”.
“Lean teaches us that we’re always moving from doubt to certainty – preferably in small steps. As our level of certainty in our assumptions increases, the fidelity and sophistication of our MVP also increases.”
To summarise: user experience design is not just about following best practices. It is about leveraging user data to truly address users’ needs.