
Product-Market Fit Validation Framework: Complete Guide for Founders

Complete framework for validating product-market fit before building. Real validation methods, metrics that matter, and proven strategies from 50+ successful product launches.
November 4, 2025 · 17 min read

Introduction: Why Most Products Fail

42% of startups fail because they build products nobody wants. Not because of poor execution, funding issues, or competition. They build the wrong thing.

The brutal truth:

  • Users say they want features they'll never use
  • Markets look attractive until you try to sell
  • Assumptions feel like facts until tested
  • Building without validation = expensive gambling

This framework prevents that.

What you'll learn:

  • How to validate before building anything
  • Which metrics actually predict success
  • Testing methods that work
  • Red flags that save months
  • When to pivot vs persevere
  • Real examples from successful products

Based on: 50+ product validations at Precode, £2M+ in validated development, partnerships with UK accelerators, tracked outcomes 12-24 months post-launch.


Part 1: Understanding Product-Market Fit

What Product-Market Fit Actually Means

Marc Andreessen's definition: "Product-market fit means being in a good market with a product that can satisfy that market."

Practically, PMF is when:

  • Users actively seek your product
  • Word-of-mouth growth accelerates
  • Usage retention stays high
  • Sales become easier
  • Team struggles to keep up with demand

PMF is NOT:

  • Having users (could be courtesy usage)
  • Having revenue (could be unsustainable)
  • Having growth (could be paid, not organic)
  • Having funding (investors can be wrong)
  • Having press (hype ≠ demand)

The Three Levels of Validation

Level 1: Problem Validation

  • Does this problem exist?
  • Do people care enough to solve it?
  • How do they solve it today?
  • What's the pain level?

Until validated:

  • Don't design solutions
  • Don't write code
  • Don't raise money for building

Level 2: Solution Validation

  • Does your solution solve the problem?
  • Will people use this approach?
  • What's the minimum they need?
  • What delights vs what's essential?

Until validated:

  • Don't build production code
  • Don't scale operations
  • Don't hire team

Level 3: Business Model Validation

  • Will people pay?
  • How much will they pay?
  • What's acquisition cost?
  • Can this be profitable?

Until validated:

  • Don't scale marketing
  • Don't expand team significantly
  • Don't raise large rounds

Most founders skip to Level 3 without validating 1 and 2. This is expensive.


Part 2: Problem Validation Framework

Step 1: Define Your Hypothesis

Problem hypothesis template:

[Target customer segment] experiences [specific problem]
when [context/situation], causing [negative outcome].

They currently solve this by [current solution],
but this fails because [limitation].

This problem occurs [frequency] and costs them
[time/money/opportunity].

Example:

UK tradespeople (plumbers, electricians) struggle to
create professional quotes on-site when meeting customers,
causing lost sales and follow-up overhead.

They currently use paper quotes or wait until back at
office to send via email, but this fails because:
- 30-40% of quotes are never sent
- Customers expect instant service
- Tradespeople hate admin work

This happens 5-10 times per week per tradesperson
and costs 2-3 hours admin time plus lost sales.

Make it specific:

  • Who exactly?
  • What problem exactly?
  • When exactly?
  • Why do current solutions fail?
  • Quantify the cost

Step 2: Interview Target Customers

Goal: Find if problem exists and matters

Not: Will they buy your solution

Interview 20-30 people:

  • 20 for B2C products
  • 30 for B2B products
  • 50+ for enterprise products

Why so many?

  • First 5: Learning interview technique
  • Next 10: Finding patterns
  • Final 10: Confirming patterns
  • More: Higher confidence

The Mom Test (Rob Fitzpatrick's book, crucial reading):

  • Don't talk about your idea
  • Don't ask hypothetical questions
  • Talk about their life
  • Ask about past behaviour
  • Dig into specifics

Good questions:

"Tell me about the last time [problem scenario]."

  • Gets specific stories
  • Real behaviour, not hypothetical

"How do you currently solve [problem]?"

  • Reveals current solutions
  • Shows if problem worth solving

"What's frustrating about [current solution]?"

  • Finds gaps
  • Identifies opportunities

"If you could wave a magic wand..."

  • Uncovers ideal outcome
  • Not constrained by current reality

"What have you tried in the past?"

  • Shows if they've attempted solutions
  • Indicates problem severity

"How much time/money does [problem] cost you?"

  • Quantifies impact
  • Validates if worth solving

Bad questions (avoid these):

"Would you use a product that...?"

  • Hypothetical
  • Politeness bias
  • Not useful

"How much would you pay for...?"

  • Hypothetical pricing
  • Unreliable
  • Asked too early

"Do you think this is a good idea?"

  • Seeking validation
  • Useless feedback
  • Waste of time

"Would you buy this if I built it?"

  • Politeness: "Maybe!"
  • Reality: Probably not
  • False confidence

Interview process:

  1. Recruit
    • LinkedIn outreach
    • Industry forums
    • Existing network
    • Cold email
    • Offer £20-£50 Amazon voucher
  2. Schedule
    • 30 minutes
    • Video call or phone
    • Record (with permission)
    • Take notes
  3. Interview
    • Intro (2 mins)
    • Their background (5 mins)
    • Problem questions (20 mins)
    • Wrap up (3 mins)
  4. Analyse
    • Transcribe key quotes
    • Note pain points
    • Track patterns
    • Update hypothesis

Step 3: Analyse Results

Look for:

Problem exists:

  • Multiple people describe same problem
  • Specific stories, not vague descriptions
  • Clear frustration
  • Current workarounds exist (proves problem matters)

Problem matters:

  • Frequent occurrence
  • High cost (time or money)
  • Active attempts to solve
  • Willing to pay for solution

Market opportunity:

  • Large enough segment
  • Accessible to reach
  • Growing not shrinking
  • Willing to adopt new solutions

Red flags:

Problem doesn't exist:

  • Can't find people with problem
  • People don't remember examples
  • "Interesting idea" but no pain
  • No current workarounds

Problem doesn't matter:

  • Happens rarely
  • Low cost/impact
  • People don't bother solving
  • "Nice to have" not "must have"

No market:

  • Too small segment
  • Hard to reach
  • Declining market
  • Resistant to change

Decision points:

20+ people confirm the problem → proceed to solution validation

10-19 confirm, 10+ don't → refine the target segment or problem hypothesis

Fewer than 10 confirm → pivot or abandon


Part 3: Solution Validation Framework

Step 1: Create Low-Fidelity Prototype

Goal: Test if your solution approach works

Not: Build polished product

Methods by product type:

For SaaS/web apps:

  • Figma prototype (clickable screens)
  • 5-10 key screens only
  • Rough design fine
  • Show core workflow

For mobile apps:

  • Marvel or Figma prototype
  • Test on phone
  • Key screens only
  • Core journey

For physical products:

  • Cardboard prototype
  • 3D renders
  • Detailed sketches
  • Storyboards

For services:

  • Process diagram
  • Before/after scenarios
  • Customer journey map
  • Role-play walkthrough

Time investment: 1-2 weeks maximum

Cost: £0-£2,000 if outsourced

Step 2: Test With Target Users

Goal: See if solution solves problem

Process:

1. Recruit testers (10-15 people)

  • From problem interviews
  • New people in target segment
  • Mix of super-users and novices

2. Test protocol

  • Show prototype
  • Give realistic task
  • Watch, don't help
  • Note where stuck
  • Ask follow-up questions

3. Key questions

"Walk me through how you'd use this."

  • Tests understanding
  • Reveals confusion
  • Shows natural usage

"What would you do first?"

  • Tests discoverability
  • Reveals priorities
  • Shows expectations

"How does this compare to [current solution]?"

  • Validates improvement
  • Identifies gaps
  • Tests value proposition

"What's missing?"

  • Finds essential features
  • Identifies deal-breakers
  • Validates MVP scope

"If this existed tomorrow, would you use it?"

  • Not "would you buy" yet
  • But tests serious interest
  • Follow with "why/why not?"

4. Observe behaviour

  • Where do they hesitate?
  • What confuses them?
  • What delights them?
  • What's their path?
  • What do they skip?

Step 3: Iterate or Pivot

Strong validation signals:

Users "get it" immediately:

  • Understand purpose in seconds
  • Can complete core task unaided
  • Minimal explanation needed

Clear improvement over current:

  • Faster than current solution
  • Easier than current solution
  • Better outcome than current
  • Would switch from current

Emotional response:

  • Excitement when seeing it
  • "When can I use this?"
  • "Where do I sign up?"
  • Telling others about it

Proceed to building if 8/10 users:

  • Understand it without help
  • Can complete core task
  • Say they'd use it
  • Show genuine interest

Weak validation signals:

Confusion:

  • Don't understand purpose
  • Can't complete task
  • Need lots of explanation
  • "Is this for [wrong segment]?"

Polite interest:

  • "Interesting idea"
  • "Might be useful"
  • "I'd have to try it"
  • No excitement

Missing essentials:

  • "I'd need [missing feature]"
  • "Doesn't solve [part of problem]"
  • "What about [edge case]?"
  • Core workflow incomplete

Iterate if:

  • Some understanding but needs refinement
  • General direction right, execution wrong
  • Core workflow needs adjustment

Pivot if:

  • Consistent confusion across testers
  • Doesn't solve core problem
  • Too complex for value
  • Better opportunities emerged

Part 4: Business Model Validation

Validating Willingness to Pay

Pricing validation methods:

Method 1: Ask directly (after solution validation)

"If this cost £X per [unit], would you buy it?"

Then follow with:

  • "Why/why not?"
  • "What would you pay?"
  • "How does that compare to [current solution cost]?"

Test 3-4 price points:

  • Anchor high first
  • Work down if resistance
  • Note where enthusiasm shifts

Example:

  • "£50/month?" → "Too expensive"
  • "£20/month?" → "Maybe"
  • "£10/month?" → "Definitely"
  • Sweet spot: £15-£20

Method 2: Competitor pricing

Research what customers pay now:

  • Direct competitors
  • Alternative solutions
  • Related tools
  • Time/cost savings

Example:

  • Competitor A: £30/month
  • Competitor B: £45/month
  • Manual process cost: £100/month in time
  • → Price at £25/month captures value, beats competitors

Method 3: Pre-orders

Most powerful validation:

  • Actually collect money
  • Proves real intent
  • Fund development
  • Build early customer base

Approach:

  • Create landing page
  • Explain product
  • Offer pre-order discount (30-50% off)
  • Charge immediately (give refund option)
  • Deliver within stated timeline

Example:

  • Normal price: £100
  • Pre-order: £50
  • Goal: 100 pre-orders = £5,000
  • Validates demand + funds MVP

Red flags:

  • Can't get 10 pre-orders (lack of demand)
  • Refund rate >30% (wrong audience or promise)
  • Price resistance everywhere (value unclear)

Method 4: Concierge MVP

For service-heavy products:

  • Manually deliver service
  • Charge real money
  • Learn what's actually needed
  • Build software to automate

Example:

  • Product: Automated social media scheduling
  • Concierge: You manually schedule for customers
  • Learn: What content, timing, platforms
  • Build: Automate learnings

Validates:

  • Willingness to pay
  • Actual needs vs assumptions
  • Price points
  • Service requirements

Understanding Unit Economics

Key metrics to validate:

Customer Acquisition Cost (CAC):

CAC = (Total Marketing + Sales Costs) / New Customers

Test acquisition channels:

  • Google Ads (quick feedback)
  • LinkedIn Ads (B2B)
  • Content marketing (organic)
  • Cold outreach (manual)

Track:

  • Cost per click
  • Click to lead conversion
  • Lead to customer conversion
  • Total cost per customer

Aim for initial CAC: £100-£500 (depends on product)
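
To make the funnel arithmetic concrete, here's a minimal Python sketch of per-channel CAC, assuming you track spend, clicks, click-to-lead rate and lead-to-customer rate for each channel; every figure below is a hypothetical placeholder, not a benchmark.

# Hypothetical per-channel CAC from spend and funnel conversion rates.
channels = {
    "google_ads":    {"spend": 1_000, "clicks": 800, "lead_rate": 0.10, "close_rate": 0.20},
    "linkedin_ads":  {"spend": 1_500, "clicks": 300, "lead_rate": 0.15, "close_rate": 0.25},
    "cold_outreach": {"spend": 500,   "clicks": 200, "lead_rate": 0.30, "close_rate": 0.10},
}

for name, c in channels.items():
    leads = c["clicks"] * c["lead_rate"]        # click -> lead conversion
    customers = leads * c["close_rate"]         # lead -> customer conversion
    cac = c["spend"] / customers if customers else float("inf")
    print(f"{name}: {customers:.0f} customers, CAC ~£{cac:.0f}")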

Customer Lifetime Value (LTV):

LTV = Average Revenue per Customer × Average Customer Lifespan

Early estimation:

  • SaaS: Monthly price × 12-36 months
  • B2C: Initial purchase + repeat purchases
  • Enterprise: Annual contract × 2-5 years

LTV:CAC Ratio:

Goal: LTV should be 3× CAC minimum

Examples:

Good economics:

  • SaaS: £50/month, keeps 24 months = £1,200 LTV
  • CAC: £300
  • Ratio: 4:1 ✓

Poor economics:

  • SaaS: £10/month, keeps 6 months = £60 LTV
  • CAC: £50
  • Ratio: 1.2:1 ✗

Payback period:

Payback = CAC / (Monthly Revenue per Customer × Gross Margin%)

Aim for: <12 months
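
A minimal Python sketch tying the three checks together, using the hypothetical "good economics" numbers above (the 80% gross margin is an assumption for illustration):

# Hypothetical SaaS unit economics: LTV, LTV:CAC and payback period.
monthly_price = 50.0         # £ per customer per month
avg_lifespan_months = 24     # average customer lifespan
gross_margin = 0.80          # 80% gross margin (assumed)
cac = 300.0                  # £ to acquire one customer

ltv = monthly_price * avg_lifespan_months              # £1,200
ltv_cac_ratio = ltv / cac                              # 4.0
payback_months = cac / (monthly_price * gross_margin)  # 7.5 months

print(f"LTV £{ltv:.0f} | LTV:CAC {ltv_cac_ratio:.1f}:1 | payback {payback_months:.1f} months")
assert ltv_cac_ratio >= 3        # aim for LTV at least 3x CAC
assert payback_months < 12       # aim for <12 month payback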


Part 5: Validation Metrics That Matter

Early-Stage Metrics (Pre-PMF)

Problem validation phase:

Interview completion rate:

  • Target: 70%+ of contacted people agree to interview
  • Low rate: Problem not resonating

Problem recognition:

  • Target: 80%+ confirm problem exists
  • Measures: Real problem vs imagined

Current solution usage:

  • Target: 90%+ have workaround/solution
  • Proves: Problem worth solving

Pain score (1-10 scale):

  • Target: Average >7
  • Measures: Problem severity

Solution validation phase:

Task completion rate:

  • Target: 80%+ complete core task unassisted
  • Measures: Solution usability

Time to understand:

  • Target: <30 seconds to grasp value
  • Measures: Clarity

Preference over current:

  • Target: 80%+ prefer your solution
  • Measures: Competitive advantage

Signup intent:

  • Target: 60%+ say they'd sign up
  • Measures: Genuine interest

Growth-Stage Metrics (Seeking PMF)

Activation:

% of signups who complete core action

Benchmarks:

  • SaaS: 40-60%
  • Consumer: 25-40%
  • Enterprise: 60-80%

Retention:

% of users active after X days

Benchmarks:

  • Day 1: 60-80%
  • Day 7: 30-50%
  • Day 30: 15-30%

PMF indicator: Retention curve flattens (users stick)
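
A minimal Python sketch of how day-N retention could be computed from signup dates and activity logs; the data shape, user IDs and dates are hypothetical, not tied to any particular analytics tool.

from datetime import date

# user_id -> signup date (hypothetical cohort)
signups = {"u1": date(2025, 10, 1), "u2": date(2025, 10, 1), "u3": date(2025, 10, 1)}

# user_id -> dates the user was active
active_days = {
    "u1": {date(2025, 10, 2), date(2025, 10, 8), date(2025, 11, 1)},
    "u2": {date(2025, 10, 2)},
    "u3": set(),
}

def retention(day_n: int) -> float:
    """Share of the cohort still active on or after day N post-signup."""
    retained = sum(
        1 for uid, joined in signups.items()
        if any((d - joined).days >= day_n for d in active_days[uid])
    )
    return retained / len(signups)

for n in (1, 7, 30):
    print(f"Day {n}: {retention(n):.0%}")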

Sean Ellis Test:

"How would you feel if you could no longer use [product]?"
- Very disappointed: >40% = PMF

Net Promoter Score (NPS):

"How likely recommend (0-10)?"
- Promoters (9-10): % minus
- Detractors (0-6): %
= NPS Score

Benchmarks:

  • >50 = Excellent
  • 30-50 = Good
  • <30 = Work needed
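
A minimal Python sketch scoring both surveys (Sean Ellis and NPS) from raw responses; the response lists below are hypothetical.

# Hypothetical survey responses.
sean_ellis = ["very disappointed", "somewhat disappointed", "very disappointed",
              "not disappointed", "very disappointed"]
nps_scores = [10, 9, 8, 7, 6, 10, 3, 9]

very_disappointed_pct = sean_ellis.count("very disappointed") / len(sean_ellis)

promoters = sum(1 for s in nps_scores if s >= 9)     # scores 9-10
detractors = sum(1 for s in nps_scores if s <= 6)    # scores 0-6
nps = (promoters - detractors) / len(nps_scores) * 100

print(f"Sean Ellis: {very_disappointed_pct:.0%} 'very disappointed' (>40% suggests PMF)")
print(f"NPS: {nps:.0f}")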

Organic growth rate:

% new users from word-of-mouth/referral

PMF indicator: >40% organic

Post-PMF Metrics

Revenue growth:

  • Month-over-month: 15-20%+
  • Compounding acceleration

Magic number (SaaS):

(New ARR This Quarter × 4) / Sales+Marketing Spend Last Quarter

Benchmarks:

  • >1.0 = Efficient growth
  • 0.75-1.0 = Good
  • <0.75 = Improve unit economics

Net Revenue Retention:

(Starting ARR + Expansion - Contraction - Churn) / Starting ARR

PMF indicator: >100% (expansion > churn)
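
A minimal worked sketch of both post-PMF ratios, with hypothetical quarterly figures:

# Magic number: annualised new ARR vs last quarter's sales + marketing spend.
new_arr_this_quarter = 120_000            # £ new ARR added this quarter
sales_marketing_last_quarter = 400_000    # £ S&M spend last quarter
magic_number = (new_arr_this_quarter * 4) / sales_marketing_last_quarter   # 1.2

# Net revenue retention over the same period.
starting_arr = 1_000_000
expansion, contraction, churn = 120_000, 20_000, 60_000
nrr = (starting_arr + expansion - contraction - churn) / starting_arr      # 1.04

print(f"Magic number: {magic_number:.2f} (>1.0 = efficient growth)")
print(f"Net revenue retention: {nrr:.0%} (>100% means expansion outpaces churn)")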


Part 6: Real Validation Examples

Case Study 1: SalesLite (Mobile CRM)

Problem validation:

Hypothesis: UK tradespeople struggle to create quotes on-site, losing sales to competitors who provide instant quotes.

Interviews: 25 UK tradespeople

Findings:

  • 22/25 confirmed problem
  • Average 8 quotes/week
  • 35% never convert to sent quotes
  • Lose £15K-£30K annually
  • Hate admin work
  • Want voice input (hands often dirty)

Pain score: 8.2/10

Decision: Strong problem validation → proceed

Solution validation:

Prototype: Figma mobile app (voice input flow)

Testing: 12 tradespeople

Results:

  • 11/12 understood immediately
  • 10/12 completed quote in <2 mins
  • All preferred it to paper or email
  • "When can I use this?" from 9/12

Key insight: Voice input was differentiator

Decision: Build MVP

Business validation:

Pre-orders: £15/month offered

Results:

  • 8 pre-orders from 12 testers
  • Additional 12 from landing page
  • £300/month recurring validated

Launched: 2-week sprint, £25,000

Outcome:

  • 45 paying customers in 3 months
  • £675/month recurring revenue
  • 4.8/5 app store rating
  • Growing 15% monthly

PMF indicators:

  • 65% D30 retention
  • NPS: 58
  • 45% organic growth
  • Sean Ellis: 52% "very disappointed"

Case Study 2: Logistics SaaS (Failed, Then Pivoted)

Initial problem validation:

Hypothesis: Small delivery companies need route optimisation software.

Interviews: 18 delivery company owners

Findings:

  • Only 7/18 confirmed problem
  • Most use driver knowledge
  • Low pain (6.1/10)
  • Already have solutions (Google Maps)
  • Not willing to pay much

Red flag: Weak problem validation

Decision: Should have stopped. Didn't.

Built anyway: £15,000, 6 weeks

Result:

  • Launched to crickets
  • 3 signups, 0 paid
  • Product worked fine
  • Nobody cared enough

Lesson: Strong execution of weak idea = failure

Pivot validation:

New hypothesis (from customer interviews): Drivers, not managers, have the pain. The real problem: proof-of-delivery photos and customer signatures take too long.

Interviews: 20 delivery drivers

Findings:

  • 18/20 confirmed problem
  • 5-10 deliveries daily
  • Photo + signature takes 2-3 mins per stop
  • 20-30 mins wasted daily
  • Pain score: 8.7/10

Decision: Pivot to driver app

Solution validation:

Prototype: Mobile app (one-tap photo + signature)

Testing: 10 drivers

Results:

  • All completed the flow in <30 seconds
  • Enthusiastic response
  • "This saves me 30 minutes a day"

Business validation:

Freemium model: Free for drivers, £5/driver/month for managers

Result:

  • 3 pilot companies
  • 42 drivers using it
  • £210/month recurring

Outcome:

  • £250K seed raised
  • Growing 25% monthly
  • PMF achieved after pivot

Lesson: Listen to users, pivot when validated

Case Study 3: B2B SaaS (False Positive)

Problem validation:

Hypothesis: Marketing teams need better social media scheduling.

Interviews: 30 marketing managers

Findings:

  • 28/30 confirmed frustration with current tools
  • Pain score: 7.5/10
  • Willing to switch

Decision: Strong validation → proceed

Solution validation:

Prototype: Clean UI, unlimited scheduling

Testing: 15 marketing managers

Results:

  • All loved the interface
  • Enthusiastic feedback
  • "Would definitely use this"

Built MVP: £20,000, 8 weeks

Launch result:

  • 150 signups first month
  • Terrible activation (12%)
  • Worse retention (5% D30)
  • Zero paid conversions

What went wrong:

False positive signals:

  • Problem real but low priority
  • "Frustration" ≠ urgent need
  • Existing tools "good enough"
  • Users polite in testing, don't switch in reality
  • No urgent hair-on-fire problem

Key metric missed: Didn't validate problem priority

Should have asked:

  • "When did you last look for alternatives?"
  • "What's preventing you from switching?"
  • "Where does this rank in your problems?"

Lesson: Problem must be urgent, not just real


Part 7: When to Build vs When to Walk Away

Green Lights (Build It)

All three validated:

  1. Problem validation ✓
    • 80%+ confirm problem
    • Pain score >7/10
    • Frequent occurrence
    • Current solutions inadequate
  2. Solution validation ✓
    • 80%+ complete core task
    • Prefer your solution
    • Genuine excitement
    • Clear improvement
  3. Business validation ✓
    • 50%+ willing to pay
    • Unit economics work (LTV:CAC >3:1)
    • Market large enough
    • Accessible customers

Proceed with confidence:

  • 5-Day UX Sprint (£10,000)
  • 1-2 Week MVP Sprint (£12,500-£25,000)
  • Beta launch
  • Iterate toward PMF

Book discovery call

Yellow Lights (Iterate)

Mixed signals:

Scenario 1: Problem strong, solution weak

  • Clear problem validation
  • Solution confusing or incomplete
  • Missing key features

Action: Redesign solution, retest

Scenario 2: Problem weak, solution strong

  • Some people have problem, not many
  • Love solution but wrong segment

Action: Find right target market

Scenario 3: Both okay, business unclear

  • Problem and solution validated
  • Pricing uncertain
  • Unit economics unproven

Action: Test pricing, try pre-orders

Timeline: 2-4 more weeks validation

Red Lights (Walk Away or Pivot)

Stop if:

Can't validate problem:

  • <50% confirm problem
  • Pain score <6/10
  • Low frequency
  • No urgency

Can't validate solution:

  • Consistent confusion
  • Can't complete core task
  • No preference over current
  • Lots of "nice to have" feedback

Can't validate business:

  • Nobody will pay
  • LTV:CAC <2:1
  • Market too small
  • Acquisition cost too high

Economics don't work:

  • CAC >12 month payback
  • High churn (>10%/month)
  • Low LTV (<£500)

Better opportunities exist:

  • Adjacent problem more urgent
  • Different segment more interested
  • Easier problem to solve

Sunk cost fallacy:

  • Don't build because you already invested time
  • Don't build because you told people
  • Don't build because you want to

Build because validation proves it'll work.


Part 8: The Precode Validation Approach

Validation Before Building

Our process:

Week 1-2: Problem validation (you do this)

  • Define hypothesis
  • Interview 20-30 people
  • Analyse results
  • Decide: proceed or pivot

Week 3: Solution design (we do this)

  • 5-Day UX Sprint (£10,000+VAT)
  • High-fidelity prototype
  • Ready for user testing
  • Professional design

Week 4: Solution validation (we help)

  • Test with 10-15 users
  • Watch task completion
  • Gather feedback
  • Iterate designs (included)

Week 5-6: Build validated MVP (we do this)

  • 1-2 Week MVP Sprint (£12,500-£25,000)
  • Working product
  • Production-ready
  • Launch-ready

Total timeline: 6 weeks
Total investment: £22,500-£35,000

vs traditional approach:

  • 3-6 months development
  • £80,000-£150,000
  • Build first, validate later
  • High failure risk

Why This Works

Validation-first advantages:

1. Lower risk

  • Validate £10K investment before £150K
  • Test with hundreds before millions
  • Pivot cheap, not expensive

2. Better products

  • Built what users actually need
  • Informed by real feedback
  • User-tested design

3. Faster to market

  • No wasted features
  • Clear requirements
  • Efficient development

4. Higher success rate

  • 3-4× better outcomes
  • PMF achieved faster
  • Less pivoting needed

Real stats from Precode projects:

Products built with validation:

  • 72% achieve some PMF within 6 months
  • 45% raise funding
  • 38% reach profitability
  • Average time to PMF: 4.5 months

Products built without validation:

  • 28% achieve PMF
  • 15% raise funding
  • 12% reach profitability
  • Average time to PMF: 12+ months (if at all)

Part 9: Validation Tools and Resources

Interview Tools

Recruitment:

  • LinkedIn (professional products)
  • Reddit (consumer products)
  • UserInterviews.com (paid panel)
  • Respondent.io (B2B participants)

Scheduling:

  • Calendly (£8-£12/month)
  • Cal.com (free)

Recording:

  • Zoom (free-£12/month)
  • Loom (free-£10/month)
  • Otter.ai (transcription, £8/month)

Analysis:

  • Notion (organise notes, free-£8/user)
  • Airtable (tracking, free-£20/month)
  • Dovetail (research tool, £25/user/month)

Prototype Tools

Web/SaaS:

  • Figma (£12/user/month, best option)
  • Framer (£5-£15/month)
  • Webflow (£14-£39/month)

Mobile:

  • Figma (best for mobile too)
  • Marvel (£10-£40/month)
  • InVision (free-£8/month)

No-code for validation:

  • Bubble.io (£25-£115/month)
  • Webflow (£14-£39/month)
  • Glide (£25-£40/month)

Landing Page Tools

Quick landing pages:

  • Carrd (£9-£49/year)
  • Landen (£20-£72/month)
  • Unicorn Platform (£8-£16/month)
  • Webflow (£14/month)

Features needed:

  • Clear value proposition
  • Email capture
  • Payment integration (Stripe)
  • Analytics (Google Analytics 4)

Analytics Tools

Early stage:

  • Google Analytics 4 (free)
  • Plausible (£9-£29/month, privacy-friendly)
  • PostHog (free-£50/month)

Product analytics:

  • PostHog (events and funnels)
  • Mixpanel (free up to 100K events)
  • Amplitude (free up to 10M events)

User feedback:

  • Typeform (£21-£69/month)
  • Hotjar (£0-£39/month, heatmaps)
  • Intercom (£59+/month)

Part 10: Next Steps

Your Validation Roadmap

Weeks 1-2: Problem validation

  • Define hypothesis
  • Recruit interviewees
  • Conduct 20-30 interviews
  • Analyse patterns
  • Decision: proceed or pivot

Week 3: Solution design

  • Book 5-Day UX Sprint (£10,000)
  • Get professional prototype
  • Prepare for testing

Week 4: Solution validation

  • Test with 10-15 users
  • Analyse feedback
  • Iterate design
  • Decision: build or iterate

Weeks 5-6: Build MVP

  • Book 1-2 Week MVP Sprint (£12,500-£25,000)
  • Get working product
  • Deploy to production

Weeks 7-10: Beta launch

  • 50-100 beta users
  • Track metrics
  • Gather feedback
  • Iterate

Month 4+: Scale

  • Find PMF
  • Optimise metrics
  • Growth experiments
  • Team building

Ready to Validate Your Idea?

Book free 30-minute discovery call:

https://www.precode.co/discovery

We'll discuss:

  • Your product hypothesis
  • Validation approach
  • Testing strategy
  • Timeline and costs
  • Next steps

No pressure. Honest feedback on validation needs.


Conclusion: Validation Before Building

The expensive way:

  • Build for 6 months
  • Launch to silence
  • Realise wrong product
  • Pivot or die

The smart way:

  • Validate in 4 weeks
  • Build in 2 weeks
  • Launch with confidence
  • Achieve PMF faster

Investment comparison:

Build-first approach:

  • Development: £80K-£150K
  • Time: 6-12 months
  • Success rate: 20-30%
  • Cost if wrong: Everything

Validate-first approach:

  • Validation + MVP: £22.5K-£35K
  • Time: 6 weeks
  • Success rate: 70-75%
  • Cost if wrong: £10K-£22.5K (vs £80K-£150K)

The framework:

  1. Validate problem (free, 2 weeks)
  2. Design solution (£10K, 1 week)
  3. Validate solution (free, 1 week)
  4. Build MVP (£12.5K-£25K, 1-2 weeks)
  5. Launch and iterate (ongoing)

Validation isn't a phase. It's forever.

Even after PMF, keep validating:

  • New features
  • New markets
  • New segments
  • New pricing

Products that win validate constantly.