Interview Techniques That Get Past 'I Would Definitely Use This'
"Would you use this feature?"
"Oh absolutely! I'd use it every day!"
Three months later: They've never touched it.
This is the core problem with user interviews: People lie.
Not intentionally. They genuinely believe they'd use your feature. They want to be helpful. They want to seem like the kind of person who would use it.
But humans are terrible at predicting their own future behavior.
The research backs this up:
- 40% of people who say "I'd definitely buy this" never do
- Users overestimate usage frequency by 3-5x
- Hypothetical questions produce 80% false positives
So what do you do?
You stop asking hypothetical questions. You stop asking about the future. You stop letting users design your product for you.
Instead, you ask questions that reveal truth through past behavior, current struggles, and real trade-offs.
This is how to run interviews that actually predict what users will do — not what they think they'll do.
Why Users Lie (And Don't Know It)
Reason 1: Social Desirability Bias
What it is: People say what makes them look good.
Example:
- You ask: "Would you use a feature to track your health goals?"
- They say: "Yes! I'm very health-conscious."
- Reality: They haven't opened a fitness app in 6 months.
Why: They want to be the person who tracks health goals, even if they're not.
Reason 2: Hypothetical Bias
What it is: Future scenarios feel different than present reality.
Example:
- You ask: "Would you pay $50/month for this?"
- They say: "Absolutely, it would save me hours!"
- Reality: When payment page loads, they close the tab.
Why: Hypothetically spending money is easy. Actually spending it is hard.
Reason 3: Confirmation Bias
What it is: People want to validate your idea (or their own).
Example:
- You ask: "Do you think collaboration features would be valuable?"
- They say: "Oh yes, definitely!"
- Reality: They work alone and never collaborate.
Why: The question suggests collaboration is important. They agree to be helpful.
Reason 4: Memory Distortion
What it is: People misremember their past behavior.
Example:
- You ask: "How often do you use feature X?"
- They say: "At least once a week."
- Reality: Analytics show once a month.
Why: Memorable uses feel more frequent. Routine uses are forgotten.
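If you have product analytics, you don't have to guess at this gap: you can compare claimed frequency against logged behavior. A minimal sketch in Python, assuming a hypothetical events.csv export with user_id, event, and timestamp columns (adjust to whatever your analytics tool actually exports):

```python
import pandas as pd

# Hypothetical export: one row per event, with columns
# user_id, event, timestamp. These names are assumptions, not a real schema.
events = pd.read_csv("events.csv", parse_dates=["timestamp"])

# Look at the last 90 days of usage for the feature in question.
cutoff = events["timestamp"].max() - pd.Timedelta(days=90)
feature = events[(events["event"] == "feature_x_used")
                 & (events["timestamp"] >= cutoff)]

# Actual uses per user per week over that window.
uses_per_week = feature.groupby("user_id").size() / (90 / 7)
print(uses_per_week.describe())
# Compare the median here against what interviewees claim:
# "at least once a week" often turns out to be closer to once a month.
```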
Reason 5: Inability to Predict Habit Formation
What it is: People can't predict if they'll build new habits.
Example:
- You ask: "Would you check this daily?"
- They say: "For sure, I'd build it into my morning routine."
- Reality: They check twice, then forget it exists.
Why: Starting new habits is hard. We all underestimate that.
The Core Principle: Ask About the Past, Not the Future
Bad questions focus on the future:
- "Would you use this?"
- "How often would you use it?"
- "Would you pay for this?"
Good questions focus on the past:
- "Tell me about the last time you [did related task]."
- "How did you solve this problem last week?"
- "Walk me through what you did."
Why this works: Past behavior predicts future behavior. Hypotheticals don't.
The Interview Techniques That Work
Technique 1: The Specific Incident Method
What it is: Force users to describe a specific, recent example.
Instead of: "Do you struggle with [problem]?"
Ask: "Tell me about the last time you tried to [do task]."
Why it works:
- Specific = honest
- Recent = accurate memory
- Concrete = reveals real workflow
Example:
Bad:
- PM: "Do you find it hard to collaborate on documents?"
- User: "Oh yes, collaboration is challenging."
Good:
- PM: "Tell me about the last time you worked on a document with someone else."
- User: "Um... actually, I mostly work alone. When I do collaborate, we just email versions back and forth."
The insight: They don't actually collaborate enough to need collaboration features.
The technique:
- Ask about the last time (not "typically" or "usually")
- Let them struggle to remember (that's data)
- Dig into what actually happened
- Ask follow-ups about the specific incident
Technique 2: The Behavioral Follow-Up
What it is: When someone says they do something, ask for specifics.
User says: "I always check my analytics first thing in the morning."
Don't accept this. Ask:
- "What did you check this morning?"
- "What actions did you take based on what you saw?"
- "How long did you spend on it?"
Why it works: Forces them to ground claims in specifics.
Example:
User: "I'd use AI-powered insights all the time."
PM: "Interesting. Tell me about the last time you used insights or recommendations in a product."
User: "Well... I don't really use those. I prefer to analyze data myself."
The insight: They think they want AI insights, but they don't actually use them.
Technique 3: The Trade-Off Question
What it is: Force real decisions, not hypothetical interest.
Instead of: "Would you like feature X?"
Ask: "If you could only have feature X OR feature Y, which would you choose?"
Why it works: Real decisions require trade-offs. Hypotheticals don't.
Example:
Bad:
- PM: "Would you want mobile notifications?"
- User: "Sure, that sounds useful!"
Good:
- PM: "Would you rather have mobile notifications or better filtering options?"
- User: "Oh, definitely filtering. Notifications would just be noise."
The insight: They don't actually want notifications.
Variations:
- "What would you give up to get this?"
- "If you had to cut one feature, which would it be?"
- "If this cost an extra $10/month, would you still want it?"
Technique 4: The Show-Me Method
What it is: Have them demonstrate, don't just describe.
Instead of: "How would you use this?"
Ask: "Show me on your screen how you do [task] today."
Why it works:
- Reveals actual workflow
- Shows workarounds
- Uncovers pain points
- Can't fake a demonstration
Example:
User says: "I organize everything in folders."
PM: "Can you show me your folder structure?"
User: [Opens screen] "Well... actually I mostly use search. My folders are kind of a mess."
The insight: They don't organize systematically. Search is more important than organization.
How to use it:
- Screen share session
- Ask them to walk through a real task
- Don't interrupt, just observe
- Ask questions after
Technique 5: The Frequency Anchor
What it is: Use specific timeframes, not vague frequencies.
Instead of: "How often do you do this?"
Ask: "How many times did you do this last week?" or "When was the last time?"
Why it works:
- Specific timeframe = concrete memory
- Forces accurate recall
- Reveals actual vs. perceived frequency
Example:
Bad:
- PM: "How often do you export reports?"
- User: "Oh, quite often. Several times a week."
- [Reality: Once a month]
Good:
- PM: "When was the last time you exported a report?"
- User: "Hmm... probably three weeks ago?"
- PM: "And before that?"
- User: "Maybe a month before that?"
The insight: Not "several times a week." Monthly at best.
Technique 6: The Workaround Probe
What it is: Ask about current solutions, not desired features.
Instead of: "What features are you missing?"
Ask: "How are you solving this today?"
Why it works:
- Workarounds reveal real pain
- No workaround = low pain
- Complex workaround = high pain
Example:
User: "I really need automated reporting."
PM: "How do you create reports now?"
User: "I export to Excel, copy-paste into another sheet, manually format it, then email it."
The insight: Complex workaround = real pain. Worth building.
Vs.
User: "I'd love to have dark mode."
PM: "How do you handle using the app at night?"
User: "I just use it. Doesn't really bother me."
The insight: No workaround = low pain. Not worth building.
Technique 7: The Skeptical Silence
What it is: When users make big claims, pause instead of moving on.
User says: "This would save me 10 hours a week!"
Don't: Accept and move on.
Do: Stay silent. Count to 5.
Why it works:
- Awkward silence prompts self-correction
- Users often walk back exaggerations
- Real reflection happens in pauses
Example:
User: "I'd use this feature constantly!"
PM: [Silence for 5 seconds]
User: "Well... actually, probably a few times a week. Maybe less if I'm honest."
The insight: They self-correct when given space.
Technique 8: The Reluctant Customer Pivot
What it is: When users are too positive, introduce friction.
User says: "I'd definitely pay for this!"
Follow with: "What if it cost $100/month?"
Why it works: Introduces real constraint. Tests commitment.
Example:
User: "We'd totally switch to this product."
PM: "What would stop you from switching?"
User: "Well... we'd need to migrate all our data. And retrain the team. And our contract with current vendor doesn't end for 6 months. And we'd need IT approval..."
The insight: They're not actually likely to switch.
The Anti-Pattern Questions (Never Ask These)
Anti-Pattern #1: Leading Questions
Bad: "Don't you think it would be useful to have [feature]?"
Why: Suggests the answer.
Fix: "How do you currently handle [task]?"
Anti-Pattern #2: Hypothetical Feature Requests
Bad: "If we added [feature], would you use it?"
Why: Hypothetical. Everyone says yes.
Fix: "Tell me about a time you needed to [accomplish goal]."
Anti-Pattern #3: Asking Users to Design
Bad: "What features should we build?"
Why: Users aren't product designers.
Fix: "What's frustrating about [current process]?"
Anti-Pattern #4: Validation-Seeking
Bad: "We're thinking of building [feature]. What do you think?"
Why: You're asking for approval, not learning.
Fix: "How do you solve [problem] today?"
Anti-Pattern #5: Multiple Choice
Bad: "Would you prefer option A, B, or C?"
Why: Constrains their thinking. They might want option D.
Fix: "If you could wave a magic wand and change one thing, what would it be?"
Anti-Pattern #6: Agreement Seeking
Bad: "This is a big problem, right?"
Why: Encourages agreement, not honesty.
Fix: "How big of a problem is this for you, on a scale of 1-10?"
The Complete Interview Script Template
Opening (5 minutes)
Thanks for taking the time. I'm trying to understand how [users like you]
handle [relevant task/problem]. There are no right or wrong answers — I
just want to learn from your experience.
Background:
- Tell me about your role.
- What does a typical day look like?
- What are you trying to accomplish in your work?
Problem Discovery (15 minutes)
Specific Incident:
- Tell me about the last time you [did relevant task].
- Walk me through exactly what you did.
- How long did that take?
- What was frustrating about it?
Frequency:
- When was the time before that?
- How many times in the last month?
Current Solution:
- How do you solve this today?
- What tools do you use?
- What workarounds have you created?
- What have you tried that didn't work?
Validation (10 minutes)
[If testing a concept:]
We're exploring [brief description].
- How would this fit into what you just described?
- What concerns would you have?
- What would need to be true for you to use this?
Trade-offs:
- If this meant [trade-off], would you still want it?
- What would you give up to have this?
Money:
- Would you pay for something that solved this?
- How much value would it need to create?
Wrap-up (5 minutes)
- What didn't I ask that I should have?
- Who else should I talk to about this?
- Can I follow up if I have more questions?
How to Interpret What Users Tell You
Positive Signals (Build It)
- Describes detailed workaround they've created
- Can cite specific recent instance of problem
- Mentions problem multiple times unprompted
- Willing to make trade-offs to solve it
- Already paying for partial solution
Example:
"Last week I spent 3 hours manually merging data from three different exports. I've built a whole spreadsheet with macros just to automate part of it. I do this every Monday. It's the worst part of my week."
Weak Signals (Maybe Build)
- Vague descriptions of problem
- Can't remember last time they had problem
- "It would be nice to have"
- No current workaround (they just accept it)
- Excitement about feature but not outcome
Example:
"Yeah, that would be cool. I think I'd use it. Can't remember specifically when I needed it, but I'm sure it would be useful."
Negative Signals (Don't Build)
- Can't describe specific instance
- Haven't tried to solve problem themselves
- Currently happy with alternative
- Hypothetical interest only
- Requirements keep changing as you talk
Example:
"I mean, I guess it could be useful. I haven't really thought about it. How exactly would it work? Well, in that case, I'd also need it to do X and Y..."
Common Interview Traps (And How to Avoid Them)
Trap #1: Confirmation Bias
What it is: Hearing what you want to hear.
Example:
User says: "That sounds interesting."
You hear: "I want this!"
Fix: Note exact words. Review recording. Look for disconfirming evidence.
Trap #2: The Friendly User
What it is: User wants to be helpful, so they're overly positive.
Example:
Their response to everything you describe: "Oh that's great! I'd definitely use that!"
Fix: Introduce friction. Ask about trade-offs. Request specifics.
Trap #3: The Expert Trap
What it is: Interviewing power users who aren't representative.
Example:
Power user: "Everyone needs advanced analytics!"
Reality: 90% of users want simple dashboards.
Fix: Interview mix of user types. Segment feedback.
Trap #4: The Solution Focus
What it is: Users fixate on their solution, not the problem.
Example:
User: "You need to add a calendar view!"
(You never learned WHY they need a calendar view)
Fix: Ask "why" three times. Get to root problem.
Trap #5: The Sample Size Trap
What it is: Drawing conclusions from too few interviews.
Example:
Three users want feature X. You build it. Nobody else cares.
Fix: Interview 8-12 people minimum. Look for patterns, not outliers.
The Post-Interview Analysis
After each interview:
Immediate Notes (While Fresh)
- What surprised me?
- What specific examples did they give?
- What workarounds do they use?
- What did they struggle to remember?
- What trade-offs would they make?
Pattern Recognition (After 5+ Interviews)
- Which problems were mentioned by 3+ people?
- Which specific incidents came up repeatedly?
- What workarounds are common?
- What do power users vs. typical users want?
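One lightweight way to run this check: tag each interview's notes with the problems it surfaced, then count tags across interviews. A minimal sketch with made-up tags:

```python
from collections import Counter

# Hypothetical coded notes: each set holds the problem tags
# one interview surfaced.
interviews = [
    {"manual-reporting", "slow-export"},
    {"manual-reporting", "no-mobile"},
    {"slow-export", "manual-reporting"},
    {"no-mobile"},
    {"manual-reporting"},
]

counts = Counter(tag for tags in interviews for tag in tags)

# Flag problems mentioned by 3+ people: patterns, not outliers.
patterns = {tag: n for tag, n in counts.items() if n >= 3}
print(patterns)  # {'manual-reporting': 4}
```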
The Reality Check
- Is there behavioral evidence (not just stated interest)?
- Did they describe recent, specific instances?
- Are they currently trying to solve this?
- Would they make trade-offs for this?
- Does their behavior match their words?
Real-World Example
Scenario: Testing interest in AI-powered insights feature
Bad interview:
PM: "Would you use AI-powered insights?"
User: "Oh absolutely! AI is so useful. I'd use it all the time."
PM: "Great! What would you want to see?"
User: "Hmm, trends and patterns and... insights I guess?"
Analysis: Hypothetical interest. No specifics. Not validated.
Good interview:
PM: "Tell me about the last time you analyzed your data."
User: "Yesterday. I exported to Excel and looked at last month's numbers."
PM: "What were you trying to figure out?"
User: "Which campaigns performed best."
PM: "How long did that take?"
User: "About 30 minutes of sorting and comparing."
PM: "What made it take that long?"
User: "I had to manually calculate the differences and create charts."
PM: "What if the system automatically highlighted your top performers?"
User: "That would save me time... but I'd still want to see the raw data to verify."
PM: "If it was automated but took away manual analysis, would you use it?"
User: "Probably not. I like to dig into the details myself."
Analysis: They want faster access to data, not AI insights. Build better data views, not AI.
Your Interview Preparation Checklist
Before the interview:
- Clear research question (What am I trying to learn?)
- Interview guide with specific questions
- Recording setup (with permission)
- No leading questions in guide
- Focus on past behavior, not hypotheticals
During the interview:
- Ask about specific, recent examples
- Use silence to prompt deeper thought
- Follow up vague answers with "show me" or "when specifically"
- Notice when they can't remember (that's data)
- Ask about trade-offs, not just interest
After the interview:
- Note what surprised me
- Document specific examples given
- Flag any contradictions between stated and revealed preferences
- Identify patterns across interviews
- Separate strong signals from weak signals
Final Thought
"I would definitely use this" is worthless data.
"Last week I spent 3 hours manually doing X" is gold.
Stop asking users what they want.
Start asking about what they do.
Past behavior beats future hypotheticals.
Every. Single. Time.
Run better interviews.
Build what people actually need.
Not what they think they want.