Why Surveys Lie and What to Do Instead

Felix Lenhard

I once surveyed 200 people about a new magic product concept. The survey had clean design, clear questions, and a respectable sample size. Seventy-eight percent said they would “definitely” or “probably” buy it.

We built it. Eight percent actually bought it.

That gap — between what people say they will do and what they actually do — is the most expensive lie in business. And surveys are the primary vehicle for that lie. Not because surveys are designed to deceive. Because human beings are designed to be agreeable.

The Mechanics of Dishonesty

People do not lie in surveys on purpose. They lie for three specific reasons, and understanding these reasons is more useful than simply distrusting all survey data.

Reason 1: Social desirability bias. When asked “Would you pay for a product that helps you exercise more?”, people answer with who they want to be, not who they are. They want to be the person who exercises more. So they say yes. Then they go home and sit on the couch, because the person who fills out surveys and the person who makes purchasing decisions live in different parts of the brain.

Reason 2: The hypothetical gap. Answering a survey costs nothing. Buying a product costs money, time, and the mental effort of making a decision. Surveys measure stated intention. Purchasing requires actual commitment. The gap between the two is vast, and it always skews optimistic.

Reason 3: Context collapse. A survey removes the context of a real purchasing decision. In a real purchase, you are weighing this product against everything else you could spend that money on. In a survey, the product exists in isolation. There is nothing competing for your attention or your wallet. The result is artificially inflated interest.

These three mechanisms work together. The person taking your survey wants to be helpful, faces no consequences for saying yes, and is evaluating your product in a vacuum. The wonder is not that surveys lie. The wonder is that anyone trusts them.

The Specific Ways Surveys Mislead Founders

Beyond the general dishonesty of survey responses, there are structural problems with how founders use surveys that make the data even worse.

Leading questions. “Would you find it helpful if there was a tool that saved you time on invoicing?” This is not a question. It is a suggestion disguised as a question. The only reasonable answer is yes. Nobody is going to say “No, I prefer wasting time on invoicing.” But the founder records this as market validation.

Confirmation sampling. Founders send surveys to their friends, their social media followers, and their email list — people who already know them and want to support them. This produces data that confirms the founder’s hopes rather than reflecting market reality.

Feature fishing. “Which of these features would you find most valuable?” is a question that sounds useful but produces noise. People will rank features without any sense of trade-offs. In a real product, adding Feature A means shipping later or increasing the price. In a survey, every feature is free and available immediately.

Missing the “how much” question. “Would you use this product?” is a useless question. “Would you pay EUR 49 per month for this product?” is slightly better. “Here is a link to pay EUR 49 per month for this product” is the only honest version. The gap between “would you” and “will you” is where most failed businesses live.

What Actually Works: Behavior Over Words

The alternative to surveys is not “no research.” It is better research. Research that measures behavior rather than intention.

The landing page test. Create a single web page that describes your product and includes a clear call to action — an email signup, a pre-order button, or a “buy now” link. Drive traffic to it. Measure what people do, not what they say.

A 3% conversion rate on a landing page is more reliable than an 80% positive survey response. Because that 3% took action. They typed their email address or entered their credit card number. They did something that a survey respondent did not.
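To judge whether a conversion rate like that is signal or noise, it helps to put a confidence interval around it rather than staring at a single percentage. Here is a minimal sketch using the standard Wilson score interval; the visitor and signup counts are invented for illustration:

```python
from math import sqrt

def wilson_interval(conversions: int, visitors: int, z: float = 1.96):
    """95% Wilson score confidence interval for a conversion rate."""
    p = conversions / visitors
    denom = 1 + z**2 / visitors
    center = (p + z**2 / (2 * visitors)) / denom
    half = z * sqrt(p * (1 - p) / visitors + z**2 / (4 * visitors**2)) / denom
    return center - half, center + half

# Hypothetical numbers: 15 signups out of 500 visitors = 3% conversion.
low, high = wilson_interval(15, 500)
print(f"conversion: 3.0%, 95% CI: {low:.1%} to {high:.1%}")
```

With 500 visitors, a 3% conversion comes with a fairly wide interval, which is a useful reminder that a landing page test needs enough traffic before you act on the number.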

The pre-sale. Sell the product before it exists. This is the smoke test. Put up a sales page, set a price, and see who buys. If people give you money for something that does not exist yet, you have the strongest possible validation signal.

Yes, this feels uncomfortable. Yes, you need to refund people if you decide not to build it. But the discomfort is precisely what makes the data reliable. The higher the barrier to action, the more meaningful the action is.

The problem interview. Instead of asking “Would you buy X?”, ask “Tell me about the last time you dealt with [problem].” Twenty minutes of structured conversation will reveal whether the problem is real, how people currently handle it, and how much pain it causes. These are the inputs you need to build a solution. Not “would you buy this?” but “how bad is this problem, really?”

The wallet test. At any point in a conversation, you can ask: “If I had this ready next week, would you be willing to pay [specific price] for it?” Then watch their face. The hesitation, the verbal gymnastics, the “well, it depends” — these tell you more than any survey checkbox.

If they say “yes, absolutely, when can I buy it?” without hesitation, you have a signal. If they say “yeah, maybe, sounds interesting,” you do not.

When Surveys Are Actually Useful

I am not saying surveys are always worthless. There are specific situations where they produce useful data.

Post-purchase surveys. Once someone has already bought your product, asking them about their experience is valuable. They have already demonstrated commitment with their wallet. Their feedback is grounded in actual use, not hypothetical interest.

Choosing between validated options. If you have already confirmed that people want a product, surveys can help you prioritize features, choose a name, or pick between design options. The directional decision has been made by behavior. The survey helps optimize.

Large-scale pattern detection. If you already have a few thousand customers, surveying them about their needs, frustrations, and preferences can reveal patterns. The sample is large enough and the relationship is established enough to produce reliable data.

Notice what these have in common: they all involve people who have already taken action. Customers, not prospects. Buyers, not browsers. Surveys work when directed at people whose behavior has already given you a baseline of truth.

The Five-Test Validation Stack

Instead of a survey, I recommend what I call the Five-Test Stack. Each test is progressively harder for the customer, which means each produces progressively more reliable data.

Test 1: Attention. Can you get people to read about your product? Measure this with click-through rates on social posts or ads. If nobody clicks, the problem or the framing does not connect.

Test 2: Interest. Will people give you their email address? This is a small commitment that signals more than a click. A waiting list converts attention into a measurable list.

Test 3: Engagement. Will people respond to questions about their problems? Send a follow-up email to your waiting list asking about their experience with the problem. Reply rate measures real engagement.

Test 4: Commitment. Will people schedule time to talk to you? A customer interview requires fifteen to twenty minutes of someone’s life. If they are willing to give that, the problem matters to them.

Test 5: Payment. Will people give you money? Revenue is the only real validation. Every other test is a proxy. This one is proof.

Each test takes a few days. The whole stack can run in two to three weeks. At the end, you have layered evidence from attention through to payment. That is a thousand times more valuable than a survey with 200 responses and a nice pie chart.
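The stack is, in effect, a funnel, and the arithmetic of tracking it is simple: measure the drop-off between each stage and the one before it. A sketch of that bookkeeping, with every count invented for illustration:

```python
# Hypothetical Five-Test Stack results; all counts are made up.
funnel = [
    ("attention: clicked the ad",     1000),
    ("interest: joined waiting list",   80),
    ("engagement: replied to email",    25),
    ("commitment: booked a call",       10),
    ("payment: paid up front",           3),
]

def stage_conversions(funnel):
    """Conversion rate of each stage relative to the previous one."""
    return [
        (name, count / prev_count)
        for (_, prev_count), (name, count) in zip(funnel, funnel[1:])
    ]

for name, rate in stage_conversions(funnel):
    print(f"{name}: {rate:.0%} of previous stage")
print(f"overall: {funnel[-1][1] / funnel[0][1]:.1%} of initial attention")
```

The stage where the rate collapses tells you which test failed: plenty of clicks but no signups points at the offer, plenty of calls but no payments points at the price.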

The Uncomfortable Truth

The reason surveys remain popular is not that they work. It is that they are comfortable. You can design a survey in an afternoon, send it to your network, and have “data” by Tuesday. The data will tell you what you want to hear. You will feel validated. You will start building.

The alternative — talking to strangers, putting up imperfect landing pages, asking people to pay for something that does not exist — is uncomfortable. It requires vulnerability. It requires accepting that most of the signals you receive will be negative or ambiguous. It requires building conviction before you have proof.

But the discomfort is the signal. The harder it is to get a positive response, the more valuable that positive response is. The easier it is, the less it means.

A survey that says 78% of people want your product means nothing. Eight people paying EUR 49 each for a product that does not exist yet means everything.

Stop surveying. Start testing. The data that hurts is the data that helps.
