Making the most of your Maze usability tests
Over the last few years, Maze has become one of my favourite tools for gathering insights and testing ideas and prototypes asynchronously. Below are some tips that will help you optimise your usability tests and make them a fun experience for your users.
Define learning goals
Knowing where to start with testing ideas is always a bit overwhelming, especially when you’re trying to test a whole experience as opposed to a small functionality. Before you jump into Maze, take time to write down the assumptions you used to put together the solution you’re testing and the main learning goals for this test.
- What exactly am I trying to validate?
- What behaviour do I expect users to have when completing this test?
- What knowledge gaps do I have that users completing this test could help me fill?
- How in-depth do insights need to be so that I can iterate the solution? (basically, do you need to ask additional questions or not)
When a user agrees to complete your test, they agree to give you their full attention for a defined period of time, so try to make the most of it. Be mindful of how you structure your test and keep it as short as possible for the type of test it is. On average, I try to keep my Maze tests between 3–6 minutes long.
It’s also important to let users know upfront how much time the test will take to complete. Giving them a heads-up helps reduce mid-test drop-off when a user realises the test is taking longer than expected.
Tip 1: Go through your Maze questions and check if there are any questions you can answer by yourself using analytics/data available.
Tip 2: Reduce the number of questions, and combine questions when and where it makes sense.
Identifying user groups
I always start Maze usability tests by asking users to share their name and job title within our client’s company. From a designer’s perspective, this information is important for two reasons:
- Knowing their role gives you insights into their daily tasks and responsibilities, giving you additional knowledge and context when analysing your test results. Two users with different roles within a business might not have the same needs when completing the same action.
- While Maze users are anonymous, knowing the names of the people who participated lets you reach out to a specific participant if you have follow-up questions for them.
Build up the test and share context
Share enough context with users upfront so that they can successfully complete the task. Users failing a usability test doesn’t necessarily mean that your solution isn’t the right one; it might simply be that they didn’t understand what they were being asked to do in the first place.
Start by assessing how familiar users are with your feature/platform/product. Ask the question directly and use screenshots of your product to help users identify what you’re talking about. Using the ‘conditions’ functionality in Maze, don’t hesitate to provide more information to users who aren’t familiar with your product, keeping in mind that too much information shared upfront could bias your results.
Scenario-based tasks are also a great way to provide context when asking users to complete a task. I use the following structure:
“Imagine you are visiting [product name] to [provide additional context] and you want to [goal you want users to complete]. You land on the following screen and you want to [specific action you want users to take].”
For example: “Imagine you’re visiting [learningnewskills.com] to upskill yourself and you want to [find a specific pathway recommended to you by a colleague]. You land on the following screen and you want to [search for that specific pathway].”
Always ask additional questions
While getting users to go through your flow will tell you how intuitive your solution is, it won’t necessarily tell you much about what worked and what didn’t work with your prototype.
Always ask additional questions after tasking users with completing actions:
- How difficult was it to complete this flow?
- What did you like about the flow you just completed?
- What do you think could be improved? Why?
Eliminate order bias to improve insights quality
When using `single-select` or `multi-select` question formats, switch on the shuffle toggle to make sure the order of your pre-defined answers is randomised for every user.
Using the shuffle functionality should help improve the quality and accuracy of test insights.
Let users share in-depth insights
While pre-defined answers are a good option in specific situations and help users focus on the problem space, always try to give users an opportunity to share their thoughts in a free-form format.
Two ways to do that are:
- Using the `other` option input to allow users to enter a custom answer to a single- or multi-select question
- Using the open question format and asking well-defined questions
If used wisely, Maze tests can generate results and insights as valuable as those from in-person testing sessions, with the advantage of giving you access to a wider group of users. Happy testing!