Stop Asking Users What They Want: Better Questions for Discovery
"What features do you want?"
This is the worst question you can ask users.
Here's what happens:
- They list their competitor's features
- They describe solutions to problems they don't actually have
- They tell you what they think sounds cool
- They suggest things they'll never use
Then you build it. And nobody uses it.
The problem isn't users. They're trying to help.
The problem is the question. Users are terrible product designers. But they're excellent at describing their problems, workflows, and frustrations.
Your job isn't to ask what they want. It's to understand what they need.
Here's how to ask the questions that actually reveal what to build.
Why "What Do You Want?" Fails
Reason 1: Users Think in Solutions, Not Problems
You ask: "What feature do you want?"
They say: "A Kanban board view!"
What they actually need: A way to see task status at a glance.
Better solution: Could be Kanban. Could also be a status dashboard, colored indicators, or better list sorting.
By asking about features, you skip the problem understanding phase.
Reason 2: Users Don't Know What's Possible
You ask: "What would you like us to add?"
They say: "Nothing comes to mind."
What they actually need: They have problems they've accepted as unsolvable.
They won't tell you about problems they don't think can be fixed.
Reason 3: Users Copy Competitors
You ask: "What are we missing?"
They say: "You need [competitor feature]."
Why this happens: They assume the competitor has it for a reason. They haven't considered whether they themselves need it.
Reason 4: Social Desirability Bias
You ask: "Would you pay more for [feature]?"
They say: "Yes!" (Don't want to seem cheap)
What they actually do: Don't upgrade when you build it.
People tell you what makes them look good, not the truth.
Reason 5: Recency Bias
You ask: "What's your biggest pain point?"
They say: [Whatever annoyed them in the last hour]
What they actually need: Might be something they experience weekly that's way more important.
Recent annoyances dominate memory, even when older, recurring problems matter far more.
The Foundation: Jobs-to-be-Done Framework
The best discovery questions focus on jobs, not features.
Key principle: People don't want features. They want to make progress.
Instead of asking: "What features do you want?"
Ask: "What are you trying to accomplish?"
Example transformation:
Bad question: "Do you want dark mode?"
Good question: "When do you use our product? What's your environment like?"
Insight: If they say "I work late at night and bright screens hurt my eyes," THEN dark mode makes sense.
But they might also say "I use it in the office during business hours." Then dark mode isn't solving a real job.
The Question Framework
Use these question types in this order:
Phase 1: Context Questions (Understand Their World)
Goal: Understand their workflow, role, and environment before jumping to product.
Questions:
- "Walk me through your typical day/week."
  - Reveals workflow
  - Shows where your product fits
  - Uncovers adjacent problems
- "What does success look like in your role?"
  - Reveals what they're measured on
  - Shows what they actually care about
  - Helps you connect product to outcomes
- "Who else is involved in [workflow]?"
  - Reveals collaboration needs
  - Shows multi-user requirements
  - Uncovers handoff problems
- "What tools do you use for [task]?"
  - Reveals ecosystem
  - Shows integrations you need
  - Identifies switching costs
Why this works: You're not talking about your product yet. You're understanding their world.
Phase 2: Problem Questions (Find the Pain)
Goal: Identify actual problems, not hypothetical ones.
Questions:
- "What's frustrating about [current process]?"
  - Reveals pain points
  - Shows intensity of problem
  - Identifies workarounds they've created
- "Tell me about the last time you struggled with [task]."
  - Forces specific example
  - Reveals actual behavior
  - Shows frequency (if they can't remember, it's not that painful)
- "What takes longer than it should?"
  - Reveals efficiency problems
  - Shows time waste
  - Quantifies impact
- "What do you wish you could do but can't?"
  - Reveals unmet needs
  - Shows aspiration
  - Identifies gaps
- "What almost stopped you from [achieving goal] recently?"
  - Reveals blockers
  - Shows severity
  - Identifies critical paths
Why this works: You're focused on problems they've actually experienced, not ones they imagine.
Phase 3: Behavior Questions (Reveal Truth)
Goal: Understand what they actually do, not what they say they do.
Questions:
- "Show me the last time you did [task]."
  - Reveals actual workflow
  - Shows workarounds
  - Uncovers steps you didn't know about
- "How did you solve this before you had our product?"
  - Reveals baseline
  - Shows switching reason
  - Validates that you're solving a real problem
- "When's the last time you used [feature]?"
  - Tests if they actually use what they say they value
  - Shows frequency
  - Reveals forgotten features
- "What did you try that didn't work?"
  - Reveals failed attempts
  - Shows problem severity
  - Identifies why other solutions failed
Why this works: Behavior reveals truth. Words hide it.
Phase 4: Priority Questions (Find What Matters Most)
Goal: Separate nice-to-haves from must-haves.
Questions:
- "If you could only change one thing, what would it be?"
  - Forces prioritization
  - Reveals what really matters
  - Cuts through feature wish lists
- "What would you give up to get [thing you mentioned]?"
  - Tests commitment
  - Reveals trade-offs
  - Shows if it's truly valuable
- "How much time/money would this save you?"
  - Quantifies value
  - Tests if problem is worth solving
  - Helps prioritize
- "What happens if we don't fix this?"
  - Reveals consequences
  - Shows severity
  - Tests urgency
Why this works: Everyone wants everything. These questions force honest prioritization.
Phase 5: Decision Questions (Validate Solutions)
Goal: Test if your proposed solution actually solves their problem.
Questions:
- "If we built [solution], how would that change your workflow?"
  - Tests if they can envision using it
  - Reveals integration challenges
  - Shows if it solves root problem
- "What would need to be true for you to use this?"
  - Uncovers barriers
  - Shows conditions for adoption
  - Reveals concerns
- "How would you measure if this was successful?"
  - Reveals success criteria
  - Shows what they actually value
  - Helps define metrics
- "What would stop you from using this?"
  - Surfaces objections
  - Shows concerns
  - Reveals edge cases
Why this works: You're testing solution fit, not asking for feature requests.
The "Five Whys" Technique
When they mention a problem, dig deeper.
Example conversation:
User: "The dashboard is too slow."
You: "Why is that a problem?" (Why #1)
User: "I check it 20 times a day."
You: "Why do you check it so often?" (Why #2)
User: "I need to see if orders came in."
You: "Why do you need to check manually?" (Why #3)
User: "There's no notification system."
You: "Why do you need real-time notifications?" (Why #4)
User: "Time-sensitive orders need immediate fulfillment."
You: "Why is immediate fulfillment important?" (Why #5)
User: "We promised same-day delivery. Delays cost us customers."
Root problem: Not dashboard speed. Missing notification system for time-sensitive orders.
What to build: Real-time alerts, not faster dashboard.
The technique: Keep asking "why" until you reach the root cause.
Questions to Avoid (And Why)
Bad Question 1: "Would you use [feature]?"
Problem: People always say yes. Costs them nothing.
Instead ask: "The last time you needed to do [task], what did you do?"
Why it's better: Tests actual behavior, not hypothetical.
Bad Question 2: "Is [feature] important?"
Problem: Everything sounds important in isolation.
Instead ask: "If you could only have three features, which would they be?"
Why it's better: Forces trade-offs and prioritization.
Bad Question 3: "What features do competitors have that we don't?"
Problem: Leads to feature-matching instead of differentiation.
Instead ask: "What made you choose our product over alternatives?"
Why it's better: Reveals your actual differentiators.
Bad Question 4: "How can we make the product better?"
Problem: Too vague. You'll get surface-level answers.
Instead ask: "Walk me through the last time you got frustrated using our product."
Why it's better: Specific situation reveals specific problems.
Bad Question 5: "Do you like our product?"
Problem: Doesn't reveal anything actionable.
Instead ask: "When do you choose to use our product vs. other tools?"
Why it's better: Shows where you add value vs. where you don't.
The Discovery Interview Template
Here's a 30-minute interview structure:
Minutes 0-5: Context
"Thanks for your time. I'm trying to understand how [user segment] works so we can build better solutions. There are no right or wrong answers — I just want to learn from your experience."
Questions:
- "Tell me about your role."
- "What does a typical day look like?"
Minutes 5-15: Problem Discovery
Questions:
- "Walk me through the last time you [did relevant task]."
- "What was frustrating about that?"
- "How do you solve [problem] today?"
- "What takes longer than it should?"
Minutes 15-25: Deep Dive
Pick the most interesting problem they mentioned.
Questions:
- "Tell me more about [specific problem]."
- "How often does this happen?"
- "What have you tried?"
- "What would change if we solved this?"
Minutes 25-30: Wrap Up
Questions:
- "What didn't I ask that I should have?"
- "Who else should I talk to?"
- "Can I follow up if I have more questions?"
Thank them. End on time.
How to Listen (The Skill Nobody Teaches)
Technique 1: Embrace Silence
After asking a question, shut up.
Resist urge to:
- Fill awkward silence
- Rephrase the question
- Offer examples
Instead: Wait. Count to 10 if you have to.
Why it works: Best insights come after the pause. People need time to think.
Technique 2: Follow Their Energy
When they get excited or frustrated, dig deeper.
Example:
User: "Oh my god, the export process is SO annoying."
Don't: Move to next question.
Do: "Tell me about that. What makes it annoying?"
Why it works: Energy = signal. They care about this.
Technique 3: Notice the Workarounds
Listen for:
- "I usually just..."
- "I have a spreadsheet for..."
- "I copy-paste into..."
- "I ask my colleague to..."
These are gold: Workarounds reveal pain points and desired features.
Technique 4: Ask for the Story, Not the Summary
They say: "The onboarding was confusing."
Don't accept that. Get the story.
Ask: "Walk me through your first day. What happened?"
Why it works: Details reveal the actual problem.
Technique 5: Take Notes on Language
Write down exact phrases users use.
Example:
- You call it "dashboard customization"
- They call it "making it show what I need"
Use their language in your product. It'll resonate.
Analyzing Discovery Interview Results
After 5-10 interviews:
Step 1: Look for Patterns
Ask:
- What did 3+ people mention?
- What surprised you?
- What contradicts assumptions?
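The pattern-finding step can be sketched as a simple tally. The tags and the interview data here are hypothetical placeholders; the 3-person threshold comes from the question above.

```python
from collections import Counter

# Problem tags coded from interview notes (hypothetical data).
# Each inner list is one interviewee's mentioned problems.
interviews = [
    ["filtering", "export", "onboarding"],
    ["filtering", "notifications"],
    ["filtering", "export"],
    ["export", "onboarding"],
]

# Count each tag once per person, even if they raised it repeatedly.
mentions = Counter(tag for interview in interviews for tag in set(interview))

# Keep only patterns: anything 3+ people mentioned independently.
patterns = {tag: n for tag, n in mentions.items() if n >= 3}
print(patterns)  # e.g. {'filtering': 3, 'export': 3}
```

Coding notes into consistent tags is the hard part; the counting itself is trivial once the tags exist.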
Step 2: Categorize Problems
Sort by:
- Frequency (how many people mentioned it)
- Intensity (how frustrated were they)
- Impact (how much time/money does it cost)
Step 3: Map to Jobs
For each problem:
- What job are they trying to do?
- Why does current solution fail?
- What would success look like?
Step 4: Prioritize
Use:
- Frequency × Intensity × Impact = Priority score
- Build highest-scoring problems first
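The scoring formula above can be sketched as a small script. The 1–5 rating scales and the example problems are hypothetical; only the Frequency × Intensity × Impact multiplication comes from the text.

```python
def priority_score(frequency: int, intensity: int, impact: int) -> int:
    """Multiply the three ratings (e.g. on a 1-5 scale) into one score."""
    return frequency * intensity * impact

# Problems tagged during interview analysis (hypothetical data).
problems = [
    {"problem": "No multi-criteria filtering", "frequency": 5, "intensity": 4, "impact": 4},
    {"problem": "Slow dashboard load",         "frequency": 3, "intensity": 2, "impact": 2},
    {"problem": "Confusing onboarding",        "frequency": 2, "intensity": 5, "impact": 3},
]

# Rank highest-scoring problems first.
ranked = sorted(
    problems,
    key=lambda p: priority_score(p["frequency"], p["intensity"], p["impact"]),
    reverse=True,
)

for p in ranked:
    score = priority_score(p["frequency"], p["intensity"], p["impact"])
    print(f"{score:>3}  {p['problem']}")
```

Multiplying (rather than adding) the three ratings deliberately punishes problems that score low on any one dimension: a daily annoyance with no real cost sinks, while a frequent, painful, costly problem rises to the top.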
Common Discovery Mistakes
Mistake 1: Talking More Than Listening
Rule of thumb: the user should talk 80% of the time, you 20%.
If you're doing most of the talking: You're pitching, not learning.
Mistake 2: Asking Leading Questions
Bad: "Don't you think it would be better if...?"
Good: "How does [current process] work for you?"
Fix: Remove your opinions from questions.
Mistake 3: Confirming Your Ideas
Bad: Going into interview hoping to validate feature you want to build.
Good: Going in curious about their problems.
Fix: Be willing to hear your idea is wrong.
Mistake 4: Skipping the "Why"
User: "I want better reporting."
Don't: "Got it, we'll build better reports."
Do: "Why do you need better reports? What's missing?"
Fix: Always ask why at least once.
Mistake 5: Interviewing Only Happy Customers
Problem: They'll tell you everything is great.
Fix: Interview a mix of:
- Recent signups (fresh perspective)
- Churned customers (honest about what failed)
- Power users (deep knowledge)
- Non-users of features (why don't they use it?)
The Follow-Up Framework
After the interview, send this:
Subject: Thanks for your time
Hi [Name],
Thanks for talking with me today. Three things that stuck with me:
1. [Insight 1]
2. [Insight 2]
3. [Insight 3]
Did I get that right?
Also, you mentioned [specific problem]. If we built something to solve that, would you want to beta test it?
Thanks again,
[Your name]
Why this works:
- Confirms you heard them correctly
- Keeps door open
- Identifies potential beta users
Real-World Example
Scenario:
PM wants to build "advanced filtering" because users requested it.
Bad approach:
PM: "Would you use advanced filtering?"
User: "Yes!"
PM: Builds it. Nobody uses it.
Good approach:
PM: "Tell me about the last time you tried to find something in the app."
User: "I was looking for all orders from last month that shipped to California."
PM: "How did you do that?"
User: "I exported to Excel and filtered there."
PM: "Why Excel instead of in the app?"
User: "I can't filter by multiple things at once in your app."
PM: "How often do you need to filter by multiple criteria?"
User: "Every day. Usually region + date range."
PM: "If you could filter by multiple things, what combinations would you use most?"
User: "Region + date, customer + status, product + date."
Insight: Not "advanced filtering." Specific multi-select filters for common combinations.
What PM built: Three pre-configured filter combinations for most common use cases, not a complex advanced filter UI.
Result: 80% adoption (vs. typical 20% for "advanced" features).
The lesson: The right questions revealed the actual job, not the requested feature.
Final Thought
Users can't design your product.
But they can tell you where it fails.
Stop asking what they want.
Start asking what they struggle with.
The insights are in the problems, not the solutions.
Ask better questions.
Build better products.