Understand the Problem Space

Purpose of this step

In essence, this step of the UCD process is about understanding what makes the product "useful".

Here, you will define the project scope and provide a basis for decision-making going forward. Your goal is to align the project team around a clear idea of what value the product (or service) brings to:

  • The problem space. Are you solving the right problem? For the right audience?
  • The product's stakeholders. What will your stakeholders consider success? What assumptions do they have around your product?
  • The product's users. What are your users' core needs in this space? What will your users consider success?

Recommended Approach

  1. Get ideas onto paper. Get the team into the same room and get assumptions, expectations & ideas out of people's heads. Focus on what problem(s) people think your product needs to solve.
  2. Talk to stakeholders.
    • Understand real constraints on the project.
    • Understand their assumptions & expectations of the project.
      • It is important to question assumptions and set the expectation that the project will be guided by the discovered real needs of the users.
  3. Understand your audiences. Who do you expect to use your product and why? Be expansive and don't forget about admins, support staff, and other user types.
    • Define each audience type in a short and simple way.
    • If you haven’t already done so, do some user research. There are many methods you can use to learn about your users.
    • What are their overall goals in using the product?
  4. Prioritize use cases (audiences & their goals)
    • Rank use cases by how critical it is to support them (primary/secondary/tertiary).
    • Scope product (who will you ultimately support and toward what goals? Why?).
    • Create workflow diagrams for each goal. These should document how the audience will realistically set about accomplishing their goal (their "workflow") for in-scope use cases. Base this on data and confident assumptions. Keep this implementation-agnostic. You are not trying to design how your users will use your product, but to document how they naturally approach completing their goals.
  5. Identify unknowns (assumptions and open questions) for prioritized use cases
    • Do they require answers?
    • When will answers be necessary?
  6. Define what success is and how to measure it.
    • Ask yourself: what is important from the user's perspective?
      • Use what you have learned about users to define success from a user experience perspective.
      • Is it that users met their goals?
      • Is it that users feel satisfied?
      • Or is it something else?
    • Identify what can be measured to indicate success as you have defined it.
      • Since user experience is a combination of interactions and perceptions, choose a combination of metrics that track interactions and perceptions.
      • Interactions are what people actually do, including clicking, scrolling, and filling out a form. Commonly used interaction measures include time on task, success rate, and user errors. Perceptions are what people think and feel about an experience. Simple satisfaction or ease-of-use rating scales, for example, are measures that gauge how people perceive an experience after using it.

      You don’t have to determine at this time how to gather the metrics, but it is important to know at this stage what will determine whether the product or service was successful, from a user experience perspective. When you are ready for that, however, consider whether there are existing metrics, or combinations of metrics, in your organization that are good indicators of success.
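To make the interaction/perception split concrete, here is a minimal sketch of how a combined set of metrics might be computed from per-session data. The record structure and field names are illustrative assumptions, not part of any prescribed tooling:

```python
from dataclasses import dataclass
from statistics import mean

# Hypothetical per-task session records (illustrative fields only).
@dataclass
class Session:
    completed: bool         # did the user meet their goal? (interaction)
    seconds_on_task: float  # time on task (interaction)
    satisfaction: int       # 1-5 post-task rating (perception)

sessions = [
    Session(True, 95.0, 4),
    Session(True, 140.0, 5),
    Session(False, 210.0, 2),
    Session(True, 80.0, 4),
]

# Interaction metrics: what people actually did.
success_rate = sum(s.completed for s in sessions) / len(sessions)
avg_time = mean(s.seconds_on_task for s in sessions)

# Perception metric: how people rated the experience afterward.
avg_satisfaction = mean(s.satisfaction for s in sessions)

print(f"success rate: {success_rate:.0%}")
print(f"mean time on task: {avg_time:.0f}s")
print(f"mean satisfaction: {avg_satisfaction:.1f}/5")
```

The point is not the arithmetic but the pairing: no single number here tells you whether the experience succeeded; a high success rate with low satisfaction, or the reverse, each tells a different story.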

  7. Get buy-in.
    • Go over scoped use cases and success criteria with team & stakeholders.
    • Discuss in terms of user-benefit and user-judged success.
    • Create a plan for addressing any conflicts or feedback.

At every step, you want to document open questions, assumptions (need validation? when?), and facts. There is nothing inherently wrong with moving forward with unverified assumptions; it is simply a risk that you want to know about, so the team can decide how to proceed.
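Such a running log can be as simple as a shared spreadsheet, but its shape matters: each entry should say what kind of item it is, whether it needs validation, and by when. A minimal sketch of that structure (the field names and example entries are hypothetical, borrowing the grading-app example used later in this document):

```python
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class Kind(Enum):
    FACT = "fact"
    ASSUMPTION = "assumption"
    OPEN_QUESTION = "open question"

@dataclass
class Item:
    kind: Kind
    statement: str
    needs_validation: bool = False
    needed_by: Optional[str] = None  # e.g., a project milestone

log = [
    Item(Kind.FACT,
         "Instructors must submit grades by the registrar deadline."),
    Item(Kind.ASSUMPTION,
         "Most instructors grade from a laptop, not a phone.",
         needs_validation=True, needed_by="before visual design"),
    Item(Kind.OPEN_QUESTION,
         "Do TAs also need grade-entry access?",
         needed_by="before scoping v1"),
]

# Surface the risky assumptions the team should consciously accept or test.
risky = [i for i in log if i.kind is Kind.ASSUMPTION and i.needs_validation]
```

Filtering for unvalidated assumptions, as in the last line, is exactly the "failure to identify risky assumptions" check listed under Common Pitfalls below.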

Common Pitfalls

General Pitfalls

  • Failure to identify risky assumptions
  • Introducing implementation ideas at this early stage. This is about defining the problem & NOT possible solutions. To think about implementation at this point is going to constrain your thinking to what problems you think can be solved, rather than what problems should be solved.

User Research & Validation Pitfalls

  • Unclear research goals (what are you trying to answer? How will data be used?)
  • Trying to shoehorn all research upfront.
    • Be agile and answer what is needed when you need to answer it.
  • Cognitive biases in research and decision making

Pitfalls in Working with Stakeholders

  • Failure to understand business requirements or expectations
  • Failure to get buy-in on what use cases (audiences/goals), and consequently what scope, the product needs to support in v1.
  • Allowing business goals to dictate or prioritize product workflows that are unrealistic or low-priority for users.

Useful Resources

Defining The Problem Space

User Research & Validation

Success Criteria and Metrics

Personas & User Stories

We would emphasize that user stories should be written around broad user goals, rather than around functionality or micro-goals that wouldn't leave a user satisfied. (e.g., for a grading app, we had a story around "As an instructor, I want to be confident that I have successfully submitted my grades to the Registrar.", rather than "As an instructor, I need to enter grades into the system.")

Working with Stakeholders