Why most product teams build the wrong things
Here’s an uncomfortable truth: most product teams spend months building features that customers don’t want, don’t use, or don’t value enough to pay for. They ship on time, hit their velocity targets, and still fail. The problem isn’t execution—it’s that they skipped product discovery or treated it as a one-time event rather than an ongoing practice.
Product discovery is the set of activities that help you decide what to build before you commit engineering resources to building it. Done well, it reduces waste, increases customer value, and gives your team confidence that they’re solving real problems. Done poorly—or not at all—it’s why 80% of features get little to no use, according to Pendo’s research on feature adoption.
This guide breaks down product discovery using the framework that’s become the industry standard: Teresa Torres’ Continuous Discovery Habits. You’ll learn how to structure your discovery practice, run effective customer interviews, use opportunity solution trees to make better decisions, and avoid the mistakes that derail most teams.
What product discovery actually means
Product discovery answers a simple question: Should we build this? It’s the counterweight to delivery, which answers: Can we build this well?
Marty Cagan, whose work at the Silicon Valley Product Group shaped modern product management, defines discovery as the work we do to “tackle the risks before we write a line of production code.” Those risks fall into four categories:
- Value risk: Will customers choose to use or buy this?
- Usability risk: Can customers figure out how to use this?
- Feasibility risk: Can we build this with the time, skills, and technology we have?
- Business viability risk: Does this work for our business model, legal constraints, and stakeholder requirements?
Discovery isn’t about having more meetings or creating more documents. It’s about reducing these risks through rapid learning—talking to customers, testing assumptions, and making evidence-based decisions about where to invest your team’s limited capacity.
The cost of skipping discovery
When teams skip discovery, they’re essentially gambling their engineering budget on untested assumptions. Consider the math: if a four-person team spends a quarter building a feature, that’s roughly $150,000-$300,000 in fully-loaded costs. If that feature fails to deliver value, you’ve wasted that investment and the opportunity cost of what you could have built instead.
Contrast that with spending two weeks on discovery activities—customer interviews, prototype testing, assumption mapping—that might cost $10,000-$20,000 in time. Even if you kill the idea, you’ve saved the majority of the investment and learned something valuable about your customers.
Continuous discovery habits: the Teresa Torres framework
Teresa Torres spent years coaching product teams at companies like CarMax, Snagajob, and Bluprint before codifying her approach in Continuous Discovery Habits (2021). Her framework has become the most widely adopted discovery methodology in tech because it’s specific, actionable, and designed for how modern cross-functional teams actually work.
The core insight of continuous discovery is in the name: discovery isn’t a phase that happens before development. It’s an ongoing practice that runs in parallel with delivery. While your team ships features, you’re simultaneously learning about what to build next. [INTERNAL_LINK: continuous discovery habits]
The six habits of continuous discovery
Torres structures her framework around six interconnected habits:
- Outcome-focused work: Define success as customer behavior change, not feature delivery
- Customer interviews: Talk to customers weekly to build empathy and surface opportunities
- Opportunity mapping: Structure what you learn into an opportunity solution tree
- Assumption testing: Identify and test the riskiest assumptions behind your ideas
- Co-creation with customers: Involve customers in generating and evaluating solutions
- Small, fast experiments: Test ideas before committing to full development
These habits work together. Weekly interviews surface opportunities. The opportunity solution tree organizes those opportunities and connects them to solutions. Assumption testing validates solutions before you build them. The rhythm is continuous—not a one-time research project.
The opportunity solution tree: your discovery operating system
The opportunity solution tree (OST) is Teresa Torres’ most influential contribution to product management. It’s a visual framework that connects your work to outcomes and ensures you’re exploring the problem space before jumping to solutions.
The tree has four levels:
- Outcome: The measurable change in customer behavior you’re trying to create
- Opportunities: Customer needs, pain points, and desires that, if addressed, would drive the outcome
- Solutions: Specific product changes that could address the opportunities
- Experiments: Tests that validate whether solutions will work
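The four levels above can also be sketched as a simple tree data structure. This is an illustrative model only — the node labels, the `Node` class, and the `count` helper are hypothetical examples for this sketch, not part of Torres' framework; in practice teams usually build the tree in a whiteboard tool rather than code.

```python
from dataclasses import dataclass, field

@dataclass
class Node:
    """One node in an opportunity solution tree (illustrative model)."""
    label: str
    kind: str  # "outcome" | "opportunity" | "solution" | "experiment"
    children: list["Node"] = field(default_factory=list)

# Hypothetical example tree: one outcome, one opportunity,
# two candidate solutions, each with an experiment attached.
tree = Node("Increase trial users who complete onboarding", "outcome", [
    Node("New users don't know which features fit their use case", "opportunity", [
        Node("Role-based onboarding checklist", "solution", [
            Node("Prototype test with five trial users", "experiment"),
        ]),
        Node("Template gallery on first login", "solution", [
            Node("Fake-door test measuring click-through", "experiment"),
        ]),
    ]),
])

def count(node: Node, kind: str) -> int:
    """Count nodes of a given kind anywhere in the tree."""
    return (node.kind == kind) + sum(count(c, kind) for c in node.children)

print(count(tree, "solution"))  # prints 2
```

The point of the structure is the discipline it enforces: every experiment hangs off a solution, every solution off an opportunity, and every opportunity off the outcome — so no work exists without a line back to the behavior change you're trying to create.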
Starting with outcomes, not outputs
Most product teams receive their work as a list of features to build. The opportunity solution tree flips this by starting with outcomes—the business result you’re responsible for driving.
A good product outcome is:
- Measurable with existing data or data you can reasonably collect
- Within the team’s influence through product changes
- Connected to a broader business objective
For example, “increase revenue” is too broad—a product team can’t directly control revenue. But “increase the percentage of trial users who complete onboarding” is an outcome the team can influence through product decisions.
Netflix famously measures “hours viewed” as a key outcome. Spotify tracks “time spent listening.” These outcomes focus teams on customer value rather than feature delivery. [INTERNAL_LINK: product metrics]
Mapping opportunities
Opportunities are the meat of the tree. They represent customer needs, pain points, and desires in the customer’s language—not your product’s language.
Bad opportunity: “Users need better onboarding”
Good opportunity: “New users don’t understand which features are most relevant to their use case”
The difference matters. The first leads directly to a solution (improve onboarding). The second opens a problem space where multiple solutions might work—personalization, guided tours, templates, or something you haven’t thought of yet.
Opportunities should be structured hierarchically. A top-level opportunity like “Users struggle to find relevant content” might break down into:
- Search results don’t match user intent
- Users don’t know what to search for
- Content recommendations feel random
- Users can’t filter by their specific criteria
This structure lets you choose where to focus. You can’t solve all problems, but you can pick the most impactful opportunity and explore it deeply.
Generating and testing solutions
Only after you’ve mapped opportunities do you start generating solutions. Torres recommends generating multiple solutions for each opportunity—at least three—to avoid falling in love with your first idea.
Each solution connects to experiments that test your riskiest assumptions. If you assume users will understand your new feature without explanation, design a test for that. If you assume the technical implementation is feasible, run a spike. The goal is to learn fast and fail cheap.
Weekly customer interviews: the engine of discovery
The single most important habit in continuous discovery is talking to customers every week. Not monthly. Not quarterly. Weekly.
Teresa Torres recommends that every product trio—the PM, designer, and tech lead who share responsibility for a product area—conduct at least one customer interview per week. This cadence keeps customer context fresh and creates a steady stream of insights rather than relying on periodic research projects.
Automating interview recruitment
The biggest barrier to weekly interviews is finding participants. Torres advocates for building automated recruitment systems that constantly source interview candidates without manual effort.
Practical approaches include:
- In-product prompts: Ask users at relevant moments if they’d be willing to chat
- Post-transaction surveys: Add a “can we follow up?” question to transactional emails
- Customer success partnerships: Have CS flag customers open to feedback conversations
- Community channels: Recruit from Slack communities, forums, or social media
- Tools like Ethnio or UserInterviews: Build a panel of available participants
Intercom built a culture of customer interviews by making it easy for anyone in the company to book time with customers. Their product team maintained a standing pool of customers who’d opted in, and any employee could schedule a call within 24 hours.
The interview structure that works
Effective discovery interviews are story-based, not question-based. You’re not conducting a survey. You’re trying to understand the customer’s world.
Torres recommends structuring interviews around specific stories—instances when the customer actually did the thing you’re curious about. Instead of asking “What do you usually do when X happens?” ask “Tell me about the last time X happened. Walk me through exactly what you did.”
Specific stories reveal:
- The context in which behavior happens
- The actual sequence of actions (not the idealized version)
- Workarounds and compensating behaviors
- Emotional reactions and unmet needs
A typical interview structure:
- Warm-up (2 min): Thank them, explain the purpose, set expectations
- Context questions (5 min): Understand their role, goals, and situation
- Story mining (20 min): Collect specific stories about relevant behaviors
- Opportunity exploration (10 min): Dig into pain points and unmet needs that surfaced
- Wrap-up (3 min): Ask if there’s anything else, thank them, explain next steps
Synthesizing interviews into opportunities
After each interview, the product trio should spend 15-30 minutes synthesizing what they learned. The output is new opportunities (or refinements to existing ones) added to your opportunity solution tree.
Look for patterns across interviews. When multiple customers describe similar struggles in similar language, you’ve found a real opportunity. When only one customer mentions something, it might be an edge case—or you might not have talked to enough people yet.
Discovery vs. delivery: finding the right balance
Product discovery and delivery aren’t competing activities—they’re complementary. The question isn’t whether to do discovery or delivery, but how to structure both as ongoing, parallel workflows.
In mature product organizations, the split looks roughly like this:
- PMs and designers: 30-50% of time on discovery activities
- Engineers: 10-20% of time on discovery (feasibility assessments, prototypes, experiments), 80-90% on delivery
- The product trio together: 2-4 hours per week on discovery rituals (interviews, synthesis, assumption mapping)
Airbnb’s product teams run discovery and delivery in parallel, with different time horizons. They’re always shipping improvements from decisions made 6-8 weeks ago while simultaneously exploring opportunities that won’t ship for months. The work doesn’t compete because it operates on different timescales. [INTERNAL_LINK: product roadmap]
Dual-track agile
The term “dual-track agile” describes this parallel structure. One track focuses on delivery—shipping validated work through sprints or continuous deployment. The other track focuses on discovery—validating the next set of problems and solutions.
The tracks connect through a “validated backlog”—ideas that have passed through discovery and are ready for detailed implementation. This is different from a traditional backlog full of untested feature requests. Items in a validated backlog have evidence supporting their value, usability, and feasibility.
Common product discovery mistakes (and how to avoid them)
Even teams committed to discovery often undermine themselves with these common mistakes:
Mistake 1: Treating discovery as a phase
Many teams do discovery at the start of a project, then stop once they’ve “figured it out.” But customer needs evolve, your understanding deepens through building, and assumptions you didn’t know you had surface during implementation.
The fix: Make discovery a continuous practice with weekly rituals that persist regardless of where you are in a project.
Mistake 2: Asking customers what to build
The classic Henry Ford quote—“If I asked people what they wanted, they’d have said faster horses”—is overused but contains truth. Customers are experts on their problems, not your solutions.
The fix: Focus interviews on understanding behavior, context, and pain points. Generate solutions yourself, then test them with customers through prototypes and experiments.
Mistake 3: Only talking to happy customers
It’s easier to recruit customers who love your product. They respond to emails, they’re eager to help, they’re pleasant to talk to. But they won’t tell you what’s broken.
The fix: Deliberately recruit churned customers, trial users who didn’t convert, and customers who use competitors. The uncomfortable conversations yield the most valuable insights.
Mistake 4: Validating instead of learning
Confirmation bias is real. If you’ve already decided what to build, you’ll unconsciously design research that confirms your decision. You’ll hear what you want to hear.
The fix: Structure discovery to challenge your assumptions, not confirm them. Actively look for evidence that your idea won’t work. If you can’t find it, that’s meaningful signal.
Mistake 5: Skipping assumption testing
Teams often jump from “customers have this problem” to “let’s build this solution” without testing the assumptions that connect them. Does this solution actually address the problem? Will customers use it? Is it technically feasible?
The fix: For every solution, list your assumptions explicitly. Rank them by risk. Design small experiments to test the riskiest ones before committing to full development. [INTERNAL_LINK: assumption testing]
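One way to make "rank them by risk" concrete is to score each assumption on two axes and sort by their product. The scoring scheme below (1-5 scales for how likely an assumption is wrong and how costly it would be if wrong) and the example claims are assumptions of this sketch, not a prescribed part of any framework.

```python
# Hypothetical assumptions behind a solution idea, each scored 1-5:
#   p_wrong — how likely the assumption is to be wrong
#   impact  — how costly it is if the assumption is wrong
assumptions = [
    {"claim": "Users will understand the feature without explanation",
     "p_wrong": 4, "impact": 5},
    {"claim": "The recommendation model fits our latency budget",
     "p_wrong": 2, "impact": 4},
    {"claim": "Customers will pay extra for this tier",
     "p_wrong": 3, "impact": 5},
]

# Risk = likelihood of being wrong x cost of being wrong.
for a in assumptions:
    a["risk"] = a["p_wrong"] * a["impact"]

# Test the riskiest assumptions first.
ranked = sorted(assumptions, key=lambda a: a["risk"], reverse=True)
for a in ranked:
    print(a["risk"], a["claim"])
```

Whatever scale you use, the discipline is the same: write assumptions down, score them, and spend your first experiment on the one most likely to kill the idea.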
Mistake 6: Discovery without authority
Discovery is pointless if its findings don’t influence what gets built. When stakeholders or leadership dictate the roadmap regardless of evidence, teams learn that discovery is theater.
The fix: This is an organizational problem, not a process problem. Product teams need real authority to make decisions based on what they learn. If they don’t have it, start by using discovery to inform how you execute stakeholder requests, then build credibility over time.
Getting started with continuous discovery
You don’t need to implement everything at once. Torres recommends starting with the highest-leverage habit: weekly customer interviews.
Here’s a practical four-week getting started plan:
Week 1: Set up a recruitment mechanism. Add an in-product prompt or an email asking customers if they’d be willing to chat, and aim to book your first interview.
Week 2: Run your first interviews as a product trio. Keep them story-based, and synthesize what you learned within a day of each conversation.
Week 3: Start an opportunity solution tree. Draft an outcome your team can influence and map the opportunities your interviews have surfaced.
Week 4: Pick one opportunity, generate at least three candidate solutions, and design a small experiment for your riskiest assumption.
Frequently asked questions
What is product discovery in product management?
Product discovery is the process of determining what to build before you build it. It involves customer research, assumption testing, and validating that a solution will actually solve the problem — reducing the risk of building the wrong thing.
What is continuous discovery?
Continuous discovery, popularized by Teresa Torres, is the practice of conducting small, frequent customer interviews (at least weekly) to continuously gather insights that inform product decisions — rather than large, infrequent research projects.
What is the difference between product discovery and product delivery?
Discovery answers ‘what should we build?’ — it’s about finding the right problem and validating solutions. Delivery answers ‘how do we build it?’ — it’s about execution, engineering, and shipping.
How long should product discovery take?
Discovery is continuous, not a phase. Small experiments can be designed and run in days; more complex discovery work might take 2-4 weeks. The goal is to validate before you commit engineering resources.
