
Customer Interviews That Produce Truth

By Felix Lenhard

Three years ago, I interviewed a potential customer for a digital product idea. She told me the idea was brilliant. She said she’d definitely buy it. She even told her colleague about it during our call. I left that meeting convinced I had something.

Six months later, when the product launched, she didn’t buy. Neither did her colleague. When I followed up, she said, “Oh, I think our priorities shifted.” They hadn’t shifted. She was never going to buy. She was just being nice.

That experience cost me four months of build time and more money than I’d like to admit. But it taught me the most important lesson in validation: most customer interviews produce lies, and it’s entirely the interviewer’s fault.

The Problem With How Most People Interview

Here’s what a typical founder interview sounds like:

“So I’m building this app that helps small business owners track their expenses automatically. What do you think?”

The customer thinks: “This person is clearly excited about this. I don’t want to be rude. The idea sounds… fine, I guess?”

The customer says: “That sounds really useful! I’d definitely try something like that.”

The founder writes in their notes: “Customer validated the idea. Strong interest.”

This is fantasy. The customer didn’t validate anything. They performed a social courtesy. And the founder mistook politeness for product-market fit.

The root cause is that the interviewer talked about their solution instead of the customer’s problem. The moment you describe your idea, the conversation stops being research and starts being a pitch. And nobody gives honest feedback during a pitch because the social dynamics shift entirely.

I’ve conducted hundreds of these conversations across my work with startups, and the single biggest predictor of useful interviews is this: does the customer know what you’re building? If yes, the data is contaminated. If no, you have a chance at truth.

The Three Rules of Honest Conversations

Rule one: Talk about their life, not your idea.

Don’t describe your product. Don’t hint at it. Don’t even mention the category you’re exploring. Instead, ask about the customer’s current situation, their frustrations, and their existing behavior.

“Tell me about the last time you had to deal with [problem area].”

“How did you handle it?”

“What was the most frustrating part?”

These questions produce stories. Stories contain behavior. And behavior is the only data that matters.

Rule two: Ask about the past, not the future.

“Would you use X?” is a future-tense question. Humans are terrible at predicting their own future behavior. We all say we’ll go to the gym, eat healthy, and read more books. Then we don’t.

But past behavior is concrete. “When did you last face this problem?” has a real answer or it doesn’t. “How much did you spend trying to fix it?” produces a number or an awkward silence. Both are useful.

Rule three: Seek commitment, not compliments.

At the end of a good interview, don’t ask “what do you think?” Ask for something concrete: “Would you be willing to try an early version and give feedback?” or “Can I send you updates as I develop this?” or “Would you pay €X for a solution if I built it?”

These requests have real costs — time, attention, money. If someone says yes to giving you their email, that’s mild interest. If they say yes to a follow-up call, that’s stronger. If they say yes to paying, that’s real validation.

Compliments cost nothing. Commitment costs something. Only the second matters.

The Interview Script I Actually Use

After years of iteration, I’ve settled on a script that consistently produces useful data. Here it is, with the reasoning behind each section.

Opening (2 minutes): “Thanks for taking the time. I’m doing research on [broad problem area]. I don’t have a product to sell — I’m genuinely trying to understand how people deal with [problem]. Everything you share is helpful, even if it seems obvious.”

This framing does three things: it removes sales pressure, positions the conversation as research, and gives them permission to be honest.

Problem exploration (10 minutes):

  • “Tell me about the last time you dealt with [problem area].”
  • “Walk me through what happened.”
  • “What did you try first?”
  • “Then what?”
  • “How often does this come up?”
  • “On a scale of 1-10, how big of a deal is this in your work/life?”

Follow the story. Don’t redirect. Let them ramble. The best insights come from tangents that reveal adjacent problems you hadn’t considered.

Existing solutions (5 minutes):

  • “How do you currently handle this?”
  • “Have you tried any tools or services?”
  • “What did you like/dislike about them?”
  • “How much do you spend on this annually?”
  • “What’s still not working?”

This section reveals your competitive landscape and gives you a price anchor. If they’re spending €500/year on a bad solution, there’s room for a better one at that price point. If they’re spending nothing, either the problem isn’t painful enough or they’ve accepted it as normal.

Closing (3 minutes):

  • “If you could wave a magic wand and fix one thing about how you deal with [problem], what would it be?”
  • “Is there anything I should have asked that I didn’t?”
  • “Would you be open to testing an early version if I build something?”

The magic wand question often produces the most quotable insight. The “anything I missed” question catches things they were thinking but didn’t say. The commitment ask is your validation data point.

I keep this entire script to about 20 minutes. Anything longer and people start performing rather than sharing. Short interviews produce better data than long ones.

Reading Between the Lines: What Signals Actually Mean

Not everything a customer says is equally valuable. Here’s how I grade the signals:

Strong positive signals:

  • They describe spending money on bad solutions
  • They get visibly frustrated recounting the problem
  • They ask when your thing will be available (without you mentioning it)
  • They offer to introduce you to others with the same problem
  • They agree to a follow-up call or pre-order

Weak positive signals (don’t trust these):

  • “That sounds interesting”
  • “I could see myself using something like that”
  • “Yeah, that’s definitely a pain point”
  • “You should talk to my friend about this”

Negative signals (trust these completely):

  • They can’t recall a specific instance of the problem
  • They describe the problem but haven’t tried to solve it
  • They change the subject when you ask about spending
  • They say “I’ll think about it” when asked for commitment
  • Silence after your follow-up email

The asymmetry here is important. Positive signals need verification. Negative signals are almost always true. When someone isn’t interested, believe them immediately. When someone says they’re interested, verify with a commitment ask.

I learned this the hard way during my Vulpine Creations days. In the magic world, audiences tell you something was amazing even when it was mediocre. You learn to watch their actual reactions — the gasp, the lean-in, the silence — rather than the polite applause after. Business validation works exactly the same way.

The 20-Interview Framework

Here’s the structure I use for a complete interview round:

Week 1: Broad exploration (interviews 1-7)

Cast a wide net. Interview people across different segments of your target market. Don’t narrow down yet. You’re looking for patterns, not confirmation.

After seven interviews, you should be able to answer: “What are the top three problems people in this space talk about?” If you can’t, your market is too broad. Narrow it.

Week 2: Pattern verification (interviews 8-14)

Now target the specific segment that showed the most pain. Ask sharper questions. Go deeper on the problems that kept showing up. Start testing your language — the way you describe the problem — and see which framing gets the biggest reaction.

After fourteen interviews, you should know: “Is there a specific problem that multiple people are actively trying to solve and willing to spend money on?”

Week 3: Solution direction (interviews 15-20)

Now — and only now — you can start describing potential solutions. Not your specific solution. Solution approaches. “Some people solve this by doing X. Others try Y. What would you think about Z?”

This is where you test whether your proposed approach matches what customers actually want. It’s also where you start collecting language for your eventual sales page.

After twenty interviews, you should have enough data to make a go/no-go decision. If the data is ambiguous, that’s a no-go. Clear opportunities produce clear signal.

The whole process takes about three weeks and costs nothing but your time. Compare this to building something for three months and then discovering nobody wants it. The math is obvious, but most people skip this step because talking to strangers feels harder than writing code. It’s not harder. It’s just uncomfortable. And discomfort is the price of truth.

Common Mistakes That Destroy Interview Data

Let me catalog the errors I see most often, because I’ve made every single one of them myself.

Mistake 1: Leading questions. “Don’t you think it’s frustrating when…” leads the witness. The customer will agree with whatever emotion you attach to the question. Instead: “How do you feel about…”

Mistake 2: Talking too much. If you’re doing more than 30% of the talking, you’re pitching, not researching. Set a timer. Review your recordings. Most founders are shocked at how much they talk.

Mistake 3: Only interviewing people who are easy to reach. Your friends, your network, people who already like you — these people will be nice. Interview strangers. Cold outreach produces better data because there’s no social obligation to be kind.

Mistake 4: Stopping at “that sounds cool.” When someone says something positive, don’t stop. Push harder: “What specifically sounds useful about that?” or “Would you be willing to pay for that today?” Watch how fast “that sounds cool” turns into “well, I’d have to think about it.”

Mistake 5: Not recording or taking notes. Memory is unreliable, especially when you’re excited about an idea. Record every call (with permission). Review the recordings before making decisions. You’ll catch things you missed live.

Mistake 6: Interpreting instead of quoting. Your notes should contain exact quotes, not your interpretation. “Customer said expense tracking is painful” is interpretation. “Customer said ‘I spend two hours every Friday manually entering receipts and it makes me want to scream’” is data. The second one tells you what to put on your landing page. The first one tells you nothing.

I keep a running document of direct quotes from every interview round I do. When it’s time to build the thing and write copy for it, those quotes become headlines, subheads, and feature descriptions. Your customer’s words are always better than your words.

Key Takeaways

  • Never describe your idea during a customer interview. The moment you do, you’re pitching, not researching, and the data becomes useless.
  • Ask about past behavior, not future intentions. “When did you last face this problem?” produces truth. “Would you use this?” produces polite fiction.
  • Twenty interviews in three weeks is the sweet spot. Start broad, verify patterns, then test solution directions.
  • Trust negative signals immediately; verify positive ones. People don’t fake disinterest, but they routinely fake enthusiasm.
  • Record exact quotes, not interpretations. Your customer’s words become your best marketing copy later.
