Validate the Decisions
Purpose of this step
This step is about understanding how your design decisions affect the perceived usefulness and usability of your product.
While for simplicity this page focuses on product-level validation, validation actually happens at two levels: validation of the overall product, and validation of the individual design decisions or assumptions you make as you design the product. At either level, the same general process, described in our recommended approach below, applies.
At this point, you have solutions that will hypothetically help your users achieve their goals. Now, it is time to answer some critical questions:
- Do your users find the solution useful and usable?
- Knowing that your product is not perfect, what should you focus future iterations of work on?
You will gain clarity around how real users will react to your product, both the difficulties they will encounter and what works well for them. This information can then be used to inform how you proceed. Are there significant issues to address before you release? Might enthusiastic reactions to a particular aspect of your product inform marketing initiatives?
Recommended Approach
- Schedule time for validation as part of your project timeline.
- Schedule time for planning & conducting validation studies.
- Schedule time for making corrections based on the insights you will get.
- Plan for validation to be iterative, scoped to design decisions at hand.
- For Agile validation work, try to have some members of your team validating the work of one phase as the rest of the team moves forward with the next phase of work. During sprint planning, you can plan to make changes based on last sprint’s studies.
- Determine research questions.
- Revisit success criteria to inform your research questions.
- Determine your team's uncertainty around design decisions as they relate to product success, evaluated from the end-user or business perspective.
- Document these uncertainties in terms of specific, measurable questions. These are your research questions. Scope to questions that need to be answered at this point in time. If you don’t have specific questions that need to be answered, don’t do the research.
- Prioritize your research questions based on:
- Risk of proceeding with your uncertainty.
- Urgency of answers to the next design decisions to be made.
- Choose the research methods that will answer your questions in a valid and efficient manner.
- How much time do you have? Remember that different methods require varying amounts of effort and time for recruiting, study prep, study length, and analysis.
- Each method carries its own potential biases that you need to guard against.
- For user studies, define participants.
- Who needs to answer your questions?
- Determine what user characteristics are meaningful to your research questions.
- Consider including marginalized and underrepresented groups.
- Plan for how you will recruit these specific participant types.
- Be aware of selection biases for each recruitment method.
- For non-user data studies, select data sources.
- What data do you need in order to answer your questions?
- Determine what variables are meaningful to your research questions.
- Be aware of biases that the data might introduce, both in the data itself and in the analysis of it.
- Set up organizational structure for your collected data to keep it manageable.
- Create study script.
- Focus on script questions that align with your research questions. Resist the urge to confound the study by introducing irrelevant or out-of-scope questions.
- Be clear and precise in your language.
- If you will have human participants, be aware of fatigue and plan accordingly: limit sessions to about 45 minutes or plan for breaks.
- Allow for structured improvisation. Ask questions to understand where the user is coming from and use this in your studies. The less clinical you can make your study, the better.
- Create artifacts/prototypes to be used in the study.
- Create the simplest artifact that will test what you need tested. Additional information or detail in your artifact can confound the data.
- Recruit users.
- Be creative. There is no one best way to recruit users.
- Recruit broadly. Expect low response rates to most recruitment activities.
- Understand selection biases involved in recruitment.
- Pilot the study.
- Conduct at least one run-through of your study with a participant, ideally one who represents your user base.
- Validate whether the study is efficient and effective in answering your research questions.
- Make adjustments to your study design or script as necessary.
- Conduct study.
- Be consistent. Do not change your study design, facilitation, or artifacts during the course of the study. Doing so will confound your results.
- If possible, have someone who is not facilitating take notes. It can be hard to fill the roles of both facilitator and note-taker.
- Always ask for clarification. Be aware of when you might be assuming what the participant means rather than hearing what they are actually saying.
- Resist biasing your results by peeking at them too early.
- Comb through the data and identify relevant findings.
- Synthesize the findings into insights and prioritize them. By grouping related findings, you might uncover larger issues.
- Summarize & prioritize your insights for easy consumption. Focus on giving voice to what the data is saying.
- Set a meeting to review the insights, focusing on those with high priority.
- Help the team understand why the insights are important & their impact on the UX of the product.
- Invite relevant team members to work with you on generating recommendations.
- For each high-priority insight, determine what action should be taken to address it.
- Be opinionated. This should provide concrete direction.
- Sometimes an implementation decision is obvious, sometimes the recommendation will be for more research or to do nothing.
- Note priority/urgency of the recommendations.
- All recommendations should have documented rationale.
- Share summary of recommendations with rest of the team. Focus the team on the high-priority recommendations.
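The risk-and-urgency prioritization of research questions described earlier can be sketched as a small scoring helper. The field names and the weighting (a simple risk × urgency product on 1–5 scales) are illustrative assumptions, not a prescribed formula:

```python
# Hypothetical sketch: rank research questions by risk x urgency.
# The 1-5 scales and example questions are illustrative assumptions.

def prioritize(questions):
    """Return questions sorted by descending risk x urgency score."""
    return sorted(questions, key=lambda q: q["risk"] * q["urgency"], reverse=True)

backlog = [
    {"question": "Do users notice the save confirmation?", "risk": 2, "urgency": 4},
    {"question": "Can users complete checkout unaided?",   "risk": 5, "urgency": 5},
    {"question": "Is the icon set understandable?",        "risk": 3, "urgency": 2},
]

for q in prioritize(backlog):
    print(q["risk"] * q["urgency"], q["question"])
```

The ranking surfaces the highest-stakes, most time-sensitive uncertainty first, which is what should drive the scope of the next study.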
Common Pitfalls
Process Pitfalls
- Not validating early enough.
- Asking the wrong research questions.
- Using the wrong methods to answer questions.
- Thinking a prototype has to be refined before showing it to users.
- Failure to act upon findings.
- Bias in validation process.
- Study design
- Participant selection
- During the study
- In analysis
Participant Selection Pitfalls
- Validating with too few users.
- Participants are not representative of your users.
Useful Resources
Evaluation Tools
- Usability heuristics for evaluating design.
- UW's Accessibility Checklist
- Color Contrast analyzer
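Contrast analyzers like the one listed above implement the WCAG 2 contrast-ratio formula, which you can also compute directly. A minimal sketch (the 4.5:1 threshold mentioned in the comment is the WCAG AA requirement for normal-size text):

```python
# WCAG 2 contrast ratio between two sRGB colors given as (R, G, B) in 0-255.
# Ratios range from 1:1 (identical) to 21:1 (black on white); WCAG AA
# requires at least 4.5:1 for normal-size text.

def _linear(channel):
    """Convert an sRGB channel (0-255) to linear light per WCAG 2."""
    c = channel / 255
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def luminance(rgb):
    """Relative luminance of an sRGB color."""
    r, g, b = (_linear(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(color_a, color_b):
    """Contrast ratio between two colors; argument order does not matter."""
    lighter, darker = sorted((luminance(color_a), luminance(color_b)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# Black text on a white background gives the maximum ratio, 21:1.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 2))  # 21.0
```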
User Testing Artifacts
- Wireframes: (Example A, Example B)
- These do not need to be fully developed designs, but they need just enough detail and differentiation in the most relevant features to evaluate those differences.
- Can be used as a paper prototype or interactive prototype for user testing.
- Creating Wireframes
Data Gathering
Considerations
- Cognitive biases to watch out for in research
- Facilitation, Moderation, and Observation Tips
- Gathering Qualitative Data
- Combining Qualitative & Quantitative Data
- Mine existing data
Methods
- Advice for choosing research methods
- Overview of various research methods
- Contextual Inquiry. The goal of these individual interviews is to gather insights about participants’ context and/or existing workflows.
- Surveys
- Guerrilla research (quick and dirty)