Written by Simon, a founder who shipped four products nobody wanted.
The Validation Playbook: How to Test Your Startup Idea Before You Bet Your Savings
Most founders lose six to eighteen months of their life building something nobody buys. The product works. The code is clean. The design looks sharp. But the market doesn't care, because nobody verified the market wanted it in the first place. Startup idea validation isn't a nice-to-have step you squeeze in before launch. It's the only thing standing between you and a very expensive lesson. If you're reading this before writing a single line of code, you're already ahead. Validate your idea before it costs you real money.
Founder intuition is a terrible compass on its own. You're too close to your own idea. You've been thinking about it for weeks, maybe months, and the problem feels obvious because you've lived it. But your experience is a sample size of one. The goal of this playbook is to give you a 30-90 day framework with concrete go/no-go criteria so you're making decisions on evidence, not gut feeling.
Part 1: Document Your Assumptions First
Before you talk to a single customer or spin up a landing page, write down every assumption baked into your idea. That means the specific problem you're solving, exactly who has it, your proposed solution, how you plan to make money and why customers would choose you over what they use today. Most founders skip this step and go straight to building. Don't. Writing assumptions down forces you to see where you're guessing, and those guesses are the things that will kill your startup.
The most common mistake at this stage is confusing features with benefits. "We have an AI-powered dashboard" is a feature. "You'll spend 10 fewer hours a week on compliance reporting" is a benefit. Customers buy outcomes, not capabilities. Get crystal clear on the outcome you're promising before you test anything.
Once you have your assumptions on paper, define your go/no-go thresholds before you start collecting data. This is non-negotiable. If you set the bar after seeing results, you'll unconsciously move the goalposts. For customer interviews, a reasonable threshold is 70% of prospects acknowledging the core problem unprompted. For a B2B SaaS landing page, 2-5% conversion to email signup or demo request is a positive signal. For B2C consumer products, 1-3% is the benchmark. If you're in either category and sitting below 0.5%, that's not a messaging problem. That's a targeting or idea problem.
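If you want to keep yourself honest, the thresholds above are simple enough to wire into a tiny scorecard script. This is an illustrative sketch, not a required tool; the function names are made up here, and the benchmark numbers simply mirror the ones in this section:

```python
# Go/no-go scorecard using this playbook's thresholds (illustrative only).

def interview_signal(confirmed: int, total: int, threshold: float = 0.70) -> bool:
    """True if enough prospects acknowledged the core problem unprompted."""
    return total > 0 and confirmed / total >= threshold

def landing_page_signal(signups: int, visitors: int, segment: str) -> str:
    """Classify landing-page conversion against the benchmarks above."""
    rate = signups / visitors if visitors else 0.0
    floor = {"b2b": 0.02, "b2c": 0.01}[segment]  # positive-signal floor
    if rate >= floor:
        return "go"
    if rate < 0.005:  # below 0.5%: targeting or idea problem, not messaging
        return "no-go"
    return "revisit messaging"

# Example: 8 of 10 interviews confirmed the problem; 3 signups from 120 B2B visitors.
print(interview_signal(8, 10))             # True
print(landing_page_signal(3, 120, "b2b"))  # go (2.5% conversion)
```

The point of scripting it is purely psychological: the verdict is computed from numbers you committed to in advance, so there's no goalpost left to move.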
Finally, identify your riskiest assumption. This is the one thing that, if wrong, makes your entire business fall apart. For most early-stage startups it's the same question: does the market actually want this solution, or are you assuming it does? Design your first validation test to answer that one question as cheaply as possible. A $200 test that answers your riskiest question is worth more than a $20,000 MVP that doesn't.
Part 3: Landing Page and Demand Testing
A minimal landing page is not a product. It's a demand test. Build one page with a clear headline that states the problem, a subheading that describes your solution angle, and three to four bullet points covering key benefits. One call-to-action: "Get early access" or "Schedule a demo." Remove everything else. No navigation. No logo clutter. No blog link. Every element that isn't converting visitors into signups is friction.
You can build this with Webflow or a no-code tool for $50-200. Then drive traffic. Cold outreach to Reddit communities, industry Slack groups and LinkedIn contacts in your target segment is free and often converts better than paid ads at this stage, because you're reaching people already talking about the problem. If you run paid ads, keep the budget to $200-500 and treat it as a learning experiment, not a growth channel.
Test two variants if you can: a problem-focused headline against a solution-focused headline. A specific outcome promise ("Reduce audit prep time by 10 hours a week") will almost always outperform a generic transformation claim ("Compliance automation for teams"). Aim for at least 100 unique visitors per variant before drawing conclusions; below that, the data doesn't mean anything, so be patient.
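The per-variant minimum exists because small samples can't separate signal from noise. One quick sanity check on a headline test is a standard two-proportion z-test; the sketch below uses made-up signup counts and only Python's standard library:

```python
import math

def two_proportion_z(conv_a: int, n_a: int, conv_b: int, n_b: int) -> float:
    """z-statistic for the difference between two conversion rates.
    A rough approximation; it gets shaky when signup counts are very small."""
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    return (conv_a / n_a - conv_b / n_b) / se

# Hypothetical test: outcome-promise headline 12/150, generic headline 3/150.
z = two_proportion_z(12, 150, 3, 150)
print(f"z = {z:.2f}, significant at ~95%: {abs(z) > 1.96}")
```

With 150 visitors per variant this gap clears the ±1.96 bar; halve the traffic and the same conversion rates would not, which is exactly why you wait for the sample size.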
When signups come in, follow up fast. What percentage respond to your follow-up email? Are 30% or more engaging? Are three out of ten willing to get on a call? These are your demand validation signals. Use your landing page builder's built-in analytics to track these rates without spreadsheet chaos.
Part 4: The 30-90 Day Roadmap
Here's how to sequence the work. Weeks one and two are for completing 10 customer interviews and producing a problem validation report. If fewer than 70% of interviews confirm the core problem, pivot the hypothesis or kill the idea. Weeks three and four are for launching your landing page and testing two to three traffic sources. The output is a conversion report. If conversion is below 1% and qualitative feedback from signups is negative, revisit the messaging before proceeding.
Weeks five and six shift to willingness-to-pay testing. Have five to ten conversations with your signups. Probe on budget and purchasing timeline without anchoring them on a price first. If nobody indicates a budget above $50 per month for a SaaS product, the market is either too small or the problem isn't urgent enough to spend on. Weeks seven and eight are the concierge MVP phase. Deliver your solution manually to three to five early customers. Do the work by hand. Use spreadsheets, email and Zoom. If you can't deliver value without a full product, you don't understand the value yet.
Weeks nine through twelve are for market sizing and final decision. Calculate your TAM, SAM and SOM. Run unit economics: what's your estimated CAC, your target LTV and how long does payback take? For B2B, a CAC payback under 12 months is the threshold. For B2C, aim for under six months. If the numbers work and your validation scorecards are green, proceed to build. If they don't, you've spent 90 days and maybe $1,000 learning something that would have cost you $100,000 and a year to learn the hard way.
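The payback and LTV arithmetic is simple enough to script. The numbers below are hypothetical, and the formulas are the common simplified versions (gross-profit payback, churn-based LTV), not the only way to model unit economics:

```python
def cac_payback_months(cac: float, monthly_arpu: float, gross_margin: float) -> float:
    """Months of gross profit needed to recover customer acquisition cost."""
    return cac / (monthly_arpu * gross_margin)

def simple_ltv(monthly_arpu: float, gross_margin: float, monthly_churn: float) -> float:
    """Lifetime value as monthly gross profit over monthly churn rate."""
    return monthly_arpu * gross_margin / monthly_churn

# Hypothetical B2B SaaS numbers: $900 CAC, $120/month ARPU, 80% margin, 3% churn.
payback = cac_payback_months(900, 120, 0.80)  # 9.375 months
print(payback < 12)                # clears the 12-month B2B threshold
print(simple_ltv(120, 0.80, 0.03)) # ~$3,200 lifetime gross profit
```

Run your own estimates through this before committing to a build: if payback blows past the threshold even with optimistic inputs, the scorecard isn't green no matter how good the interviews felt.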
Part 5: The Pitfalls That Kill Good Ideas Early
Confirmation bias is the most dangerous thing in customer discovery. You'll unconsciously recruit people who seem likely to agree with you, ask questions that nudge toward the answer you want and remember the responses that confirm your hypothesis. The fix is structured recruitment from platforms like UserTesting or LinkedIn Recruiter, a fixed question script and a co-founder or trusted advisor who reviews your interview notes for bias.
The second killer pitfall is confusing interest with commitment. "That sounds useful" is not validation. A pre-order is validation. A 30-minute follow-up call booked on the spot is a signal. A paid pilot, even at a steep discount, is the clearest signal of all. Set a concrete commitment threshold before you start collecting responses. Three out of ten prospects should show real commitment, not just polite interest. If you're not hitting that, the problem isn't urgent enough or your solution isn't compelling enough to act on yet.
The third pitfall is ignoring data that contradicts your vision. If your landing page converts at 0.3% across three different traffic sources and three different headlines, that's not a copywriting problem. That's the market telling you something important. The pre-set thresholds you defined in Part 1 exist precisely for this moment. Trust them. Founders who persist past three failed tests without pivoting aren't being resilient. They're being stubborn. There's a difference.
Validation Is About Disproving, Not Proving
The whole point of startup idea validation is not to confirm that your idea is great. It's to find the fastest, cheapest way to prove it's wrong. If you try hard and still can't prove it wrong, you have a real business. The founders who get this right spend four to twelve weeks validating before spending twenty weeks building. They come out the other side with paying customers, real pricing data and a product roadmap shaped by evidence.
Your next move is simple. List your top three assumptions. Circle the riskiest one. Design a test that costs under $200 and can answer it this week. That's the whole game at this stage. Read more on the Validate & Launch blog for frameworks that help you run each phase without spinning your wheels.
Evidence compounds. Every decision you make on real data makes the next decision cheaper and faster. Start there.
