Recapping Inspired: How to Create Tech Products Customers Love - Part 4

This is the final entry in a four-part series covering my takeaways from Inspired. In the previous part, I talked about notes from part 3 of the book: Getting the Right Product. This included tools and frameworks for getting to product-market fit, such as OKRs, product vision, and product strategy.

This last part corresponds to part 4 and the short part 5 of the book. The notes cover some more specific methods for product discovery, and for applying those methods at scale in small or large organizations. My hope is that this, like the other parts, can serve as a handy reference for someone who's read Inspired and as a primer for someone who hasn't. As before, I'll end with a few questions for further thought.

Discovery

When Cagan says "productized" or "production-quality", he means a stable, fully functioning product with analytics, a test suite, i18n, etc. We can do product discovery work before this point, and some of it won't even use engineers' time.

The balance we seek is between getting an MVP out fast, doing discovery, and not damaging the brand with something broken.

Discovery is about getting ideas in front of customers early and often, while delivery is about best practices for engineering.

Principles of Discovery

Discovery addresses these risks:

  • value risk (will the customer buy or use this?)
  • usability risk (can the user figure out how to use it?)
  • feasibility risk (can it be built?)
  • business viability risk (does the solution work for the business?)

Principles:

  1. We can't count on customers to tell us what to build; there's a lot they don't know about what's possible.
  2. Spend the most time on establishing compelling value.
  3. Great engineering is hard, but useless without UX.
  4. Functionality, design, and tech are intertwined.
  5. Expect many ideas won't work out.
  6. Validate on real users and customers.
  7. Validate in the fastest and cheapest way possible.
  8. Validate feasibility during discovery, not after.
  9. Validate business viability during discovery.
  10. Discovery is about shared learning.

The ethics of "should we build this?" is not usually a legal question. Knowing the business well helps us make an ethics case if needed.

Expect 10-20 discovery iterations per week in a competent product team.

Discovery Techniques Overview

There are discovery techniques for framing, planning, ideation, prototyping, and testing, and you use them to address the risks mentioned earlier.

Discovery Framing Techniques

Framing techniques are for getting everyone on the same page about the purpose, and they should align with OKRs. They can help identify risks.

Opportunity Assessment

  1. What business objective is the work intended to address? It should line up with a business O from the OKRs.
  2. How will you know if you've succeeded? These are the KRs. E.g., if the O is to reduce churn, does a 1% reduction count as success?
  3. What problem will this solve for our customers?
  4. What type of customer are we focused on? E.g., a persona.

Customer Letter

This is useful for a redesign, since it encompasses several Os. An example is a pretend press release we can work backward from; this is used at Amazon. It's also an exercise I've done as part of the Udacity Product Manager Nanodegree program.

A question it addresses: how does this improve the lives of our customers? It can also be good for evangelism.

Another example is a grateful letter written from a persona's POV.

Startup Canvas

Especially useful for startups and all-new business opportunities. This is a close cousin of the business model canvas. It can also be useful for new PMs to get a handle on things, but it won't change for in-progress work.

A startup canvas is simpler than a business plan for identifying risks, and it doesn't delay getting to the solution.

Note: MBAs are likely to focus hard on business viability risks, which are less of a concern at early stages. The biggest risk may actually be value risk. The answer is to focus on discovery.

Discovery Planning Techniques

Storymap

Storymaps are great for framing, planning, ideation, and organizing prototypes. They help offset the monotony and flatness of a normal backlog. You can think of a storymap along two axes: horizontally, the major user activities; vertically, the user tasks and stories, with higher priority on top. More info recommended here.

Customer Discovery Program

This may be the single best leading indicator if it's done well. It's meant to deal with the situation where sales gets frustrated about features you don't have, etc.

The goal is to get 6-8 reference customers, defined as:

  • not friends or family
  • using the product in production
  • paid real money for it
  • willing to tell others

Reference customers are the single best sales tool you can provide to sales people. Discovering and developing a set of reference customers happens in parallel with discovering and developing the product.

There are variants of this method for different contexts: products for businesses, platform products (public APIs), customer-enabling tools used internally, and products for consumers.

This program is good for larger efforts like a new product, a new geography, or a redesign. You want to find the customers so desperate for a solution that they'll give you a shot. Customer discovery should follow these guidelines:

  • They should be in a single target market (e.g., a geography).
  • Screen out technologists who are eager to try new and shiny.
  • Work closely with product marketing manager.
  • Reference Customers should agree to test early versions, stay in touch, and be a public reference.
  • Emphasize that this is not a custom solution.
  • Better not to charge at first, but if necessary, can meet in the middle and hold a payment in escrow.
  • If it's a great problem, the sales team will lean on you to take more customers, but this needs to be the right set, kept at 6-8. However, you can have an early release program for anyone else.
  • If there's trouble recruiting reference customers, you have failed demand validation.

For the variations:

  • For APIs, you want reference applications that are using your API, instead of reference customers.
  • For customer-enabling products, you want 6-8 internal users who you can ask to tell their colleagues.
  • For consumer products, you want 10-50 reference users: much broader testing with people who have never been exposed to the product, and who can tell others on social media, etc.

The Sean Ellis (he coined the term "growth hacking") test for product-market fit works like this: you survey users who have (as shown by analytics) made it through the core value flow of the product. You then ask them to rate how they'd feel if the product disappeared, on a 4-point scale from "very disappointed" to "not relevant, I don't use it". If > 40% would be very disappointed, you have a good sign of product-market fit.
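The 40% threshold is just arithmetic over survey responses. Here's a minimal Python sketch of the scoring; the function name and response labels are my own, not from the book:

```python
from collections import Counter

def pmf_score(responses):
    """Fraction of respondents who would be 'very disappointed' if the
    product went away (the Sean Ellis product-market fit survey)."""
    counts = Counter(responses)
    total = sum(counts.values())
    return counts["very disappointed"] / total if total else 0.0

# 5 of 10 surveyed users would be very disappointed: 50% > 40%,
# which would be a good sign of product-market fit.
responses = (["very disappointed"] * 5
             + ["somewhat disappointed"] * 3
             + ["not relevant, I don't use it"] * 2)
print(f"{pmf_score(responses):.0%}")
```

Remember the survey only goes to users who analytics show have experienced the core value flow; otherwise the score measures onboarding rather than fit.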

Discovery Ideation Techniques

Customer Interviews

Include the designer, PM, and a rotating engineer. The designer asks the questions, and the PM takes notes.

We're trying to understand: Are our customers who we think they are? Do they really have the problems we think they have? How does the customer solve the problem today? What would be required for them to switch?

Shoot for about 1 hour of their time, 2-3 customer interviews per week every week. If time allows, try some product ideas in the interview. Then, debrief with colleagues.

Concierge Test

Similar to spending time with customer success staff, but here we're identifying the user's problem before they've called us. Spend time with them, letting them show you how they work.

The Power of Customer Misbehavior

An example is eBay's "everything else" category, which led to categories with lots of customers, like cars and event tickets. Another example is Facebook's social graph and public APIs, which led to new developer products.

Encourage customers to use products off-label and learn what they actually want.

Hack Days

Directed hack days are for solving a specific problem like "reduce churn rate". Undirected are anything loosely related to the company mission. These help empowerment too.

Discovery Prototyping techniques

Fred Brooks: "Plan to throw one away; you will, anyhow."

Principles of Prototypes

  1. All prototypes should require at least an order of magnitude less time and money than a fully fleshed out product.
  2. The key benefit of a prototype is that it forces more thinking than talking or writing does.
  3. Prototypes are collaboration tools.
  4. Use lo-fi when possible, hi-fi only when necessary.
  5. Prototypes can be specs for what engineers will build. They may need to be supplemented with acceptance criteria, use cases, and business rules.

Prototypes should address one or more of the main product risks.

Feasibility Prototype

These take a minimum of 1-2 days, and up to a lot longer for new tech (at the time of writing, machine learning was an example). They have little or no error handling, logging, etc. The engineer can estimate how much time the real thing will take; given that estimate, the PM decides whether it's better to pursue another idea with less feasibility risk.

Possible risks that a feasibility prototype may help expose:

  • dependency on another team's changes
  • use of a legacy system
  • use of a new 3rd party component
  • use of new (to the team) tech
  • fault tolerance concern
  • scalability concern
  • performance concern
  • algorithm concern

User Prototype

A user prototype is smoke and mirrors, with no real data. It can range from lo-fi (paper) to hi-fi (almost real, using fake data). The designer can pick the prototyping tools, and must be ready to throw the prototype away.

User prototypes are not for validating value. A novice puts one in front of 10-15 people who say they love it, but whose actions then don't match their words.

Live Data Prototype

We can use live data prototypes when we need to collect actual usage data, for example in game dynamics, search result relevance, social features, or product funnel work. A live data prototype will lack "productization" (as discussed earlier), and should take about 5-10% of the eventual delivery/productization work. Productization is engineering work, and the PM needs to communicate that there will be lots more time required if this prototype is validated. Live data prototypes can happen in days to a week or more.

Hybrid Prototype

An example of this is a "Wizard of Oz" test, where there's a nice front end and a PM or other staff act as the backend.

Discovery Testing Techniques

Often usability and value discovery are done together. Value is often the hardest and most important to assess. Wait before "stirring up" the organization with a business viability evaluation.

Testing Usability

Big companies may have their own user research departments, but don't pay for 3rd-party firms. For recruiting users, you're in good shape at a B2B firm if you have the customer discovery program mentioned earlier.

Otherwise, you can advertise for test subjects on Craigslist, or use SEM on AdWords. You can also potentially get some email addresses from the PMM, or solicit volunteers on the company website, so long as you call and screen them. Going where users congregate (e.g., sports bars for fantasy sports, trade shows for business software) is another good idea, though you'll want to bring thank-you gifts.

If user test candidates come to your workplace, you'll need to compensate them for their time. Better yet, meet them at a coffee shop (this is called "Starbucks testing").

Prep:

  • Usually use a hi-fi user prototype.
  • Define tasks to test in advance.
  • Test fast before you fall in love with your ideas.
  • Have one test admin and one note-taker present.
  • Go to the user tester's workplace if you can.
  • Tools for user testing remotely do exist.

Test:

  • "We're testing the prototype, not you as the user."
  • See what they think they're able to do when they're on the landing page.
  • Keep them in "use-mode", not "critique-mode". Watch what they do, not what they say, don't ask a lot of questions.
  • Keep quiet and if they ask questions, act like a parrot.
  • We're looking for where it's counterintuitive for the user.

After:

Make a quick summary to share with product team members and other relevant stakeholders. Long reports go out of date fast.

Testing Value

Your solution needs to be substantially better than the default. Sometimes you want to test demand, but often you're in an established market. Qualitative testing isn't numerical, but it still gives us good info. For some things, like ad tech, you can test efficacy directly, e.g. by revenue.

Demand Testing Techniques

The fake door demand test uses a realistic-looking prototype, but where there's a button to take a critical action, instead of doing that action or returning data, it goes to a page that checks demand by asking for an email address or some other call to action. You can also do this at an even higher level, by just having a landing page for the whole product with a description and a call to action.
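At its core, a fake door test is just counting how many people who saw the button followed through on the call to action. A minimal in-memory sketch (the class and method names are illustrative, not from the book):

```python
class FakeDoorTest:
    """Tracks impressions of a fake feature button vs. sign-ups on the
    'leave your email' page it leads to."""

    def __init__(self):
        self.impressions = 0
        self.signups = []

    def record_impression(self):
        """One user saw the fake button."""
        self.impressions += 1

    def record_signup(self, email):
        """One user clicked through and left an email address."""
        self.signups.append(email)  # a real test would persist these

    def demand_rate(self):
        """Share of viewers who followed through on the call to action."""
        return len(self.signups) / self.impressions if self.impressions else 0.0
```

For example, 12 sign-ups out of 200 impressions gives a 6% demand rate; what counts as "enough" demand depends on the product and the channel.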

In risk-averse companies, beware of people trying to establish special innovation centers. What does that signal to the "normal" teams, or about the existing products? The best big companies institutionalize the idea that if you don't innovate you die.

You can address brand-protection concerns with A/B tests; at some threshold you can put the customer discovery program and NDAs in place. Small, incremental continuous deployment also helps colleagues not feel blindsided by changes.

Qualitative Value Testing Techniques

Qualitative testing can answer "why?" where quantitative can't. You can interview first and then do a usability test, though usability testing has to precede any questions or evaluation about value. Things like focus groups are too hypothetical.

Specific value tests make sure the tester isn't just being nice. Look for them to pull out a credit card, or to sign a non-binding letter of intent to buy. Other indicators include reputational capital (they'll recommend it to friends, family, their boss) and time (they'll block off time to work with you, though you needn't actually use it). They may also provide login credentials for the product they want to switch from.1

Remember if you have to put an idea on the shelf, you're saving the company money. As PM you should attend every value test.

Quantitative Value Testing Techniques

With a large quantity of data you can get statistically significant information; below that, it's still evidence that can be useful.
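For the "statistically significant" case, the standard tool for comparing conversion rates between two groups is a two-proportion z-test. This isn't from the book; it's a textbook formula, sketched here with only the Python standard library:

```python
from math import sqrt, erf

def two_proportion_z(conversions_a, n_a, conversions_b, n_b):
    """Two-sided z-test for a difference in conversion rates between
    control (A) and treatment (B). Returns (z, p_value)."""
    p_a = conversions_a / n_a
    p_b = conversions_b / n_b
    pooled = (conversions_a + conversions_b) / (n_a + n_b)
    se = sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    z = (p_b - p_a) / se
    # Normal CDF via erf; p-value is the two-tailed probability.
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# 1.0% vs 1.5% conversion on 10,000 users per group.
z, p = two_proportion_z(100, 10_000, 150, 10_000)
```

With those inputs the difference is significant (p below 0.01); with much smaller groups the same rates would only be suggestive evidence.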

An organization usually has a kind of triangle where traffic, time, and risk pull in different directions when it comes to these kinds of tests. A small company has a high appetite for risk (e.g. rolling out a change to large numbers of users) but doesn't have as much time or traffic.

A/B testing is the gold standard. You can make the B group as small as 1% for riskier changes. Invite-only testing is not as predictive, but better for those with a low risk appetite; you can follow it up with a qualitative test. The customer discovery program will also give you data, and of course so will your analytics. At minimum, analytics should include usage metrics.
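Rolling a B group out to 1% is typically done with deterministic hash-based bucketing, so a given user always sees the same variant without storing an assignment table. A sketch under that assumption (the function and experiment names are my own):

```python
import hashlib

def variant(user_id, experiment, treatment_pct=1.0):
    """Assign a user to 'B' (treatment) or 'A' (control), stably.

    Hashing user_id together with the experiment name yields a value
    uniform in [0, 100), so different experiments split independently.
    """
    digest = hashlib.sha256(f"{experiment}:{user_id}".encode()).hexdigest()
    value = int(digest[:8], 16) / 0x100000000 * 100  # uniform in [0, 100)
    return "B" if value < treatment_pct else "A"

# The same user is always bucketed the same way for a given experiment:
assert variant("user-42", "new-checkout") == variant("user-42", "new-checkout")
```

Because assignment is a pure function of (experiment, user_id), the split can be recomputed anywhere, and raising treatment_pct later only moves users from A to B, never the reverse.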

Analytics are freeing because "data beats opinions".

The minimum set of analytics (aka KPIs) should include:

  • user behavior (click paths, engagement)
  • business (active users, conversion rate, lifetime value, retention)
  • financial (ASP, billings, time to close)
  • performance (load time, uptime)
  • operational costs (storage, hosting)
  • go-to-market costs (acquisition costs, cost of sales programs)
  • sentiment (NPS, customer satisfaction, surveys)

Many PMs check analytics first thing in the day because they're always running a test.

Testing Feasibility

Engineers can usually answer feasibility questions quickly, but not if the tech is new to them. Sometimes they need time to make a feasibility prototype; this can be a good indicator that we're doing something only just now possible, AND it gives engineers a chance to learn.

DON'T have a weekly meeting where you throw ideas at engineers and ask for effort estimates; the estimates will be conservative and designed to make you go away. Instead, keep engineers looped in throughout discovery so they're thinking about feasibility as they go.

Hardware has some extra challenges, but tech like 3d printing helps. Feasibility analysis may take longer.

Testing Business Viability

Different stakeholders in the organization have concerns that may become constraints. Partner with them when you have a prototype and before you incur the cost of building. Some of the stakeholders and their constraints:

  • marketing: brand, competitiveness, differentiation, go-to-market channels
  • sales: whether product costs line up with the sales channels (e.g. direct sales is spendy, so the product must be high-priced)
  • customer success: high touch / low touch; does the product line up with the customer success strategy?
  • finance: can we afford it?
  • legal: IP, privacy, compliance
  • business development: contracts with our partners
  • security: will it follow security practices?
  • CEO / COO / GM: wants to know if you've done your homework about different parts of the business and their relations to each other

Differentiate between a user test (test an idea on the user), a product demo (evangelize your idea), and a walkthrough (show to stakeholder to see if they have concerns).

Transformation Techniques

In larger organizations especially, we want to move from (mercenaries, roadmaps, output) to (missionaries, empowerment, outcomes).

Discovery Sprint Technique

This is also known as a design sprint (you can see my example via Udacity's Product Management Nanodegree here). Design sprints are good for companies who are struggling with the idea of an MVP.

In the discovery sprint, you frame the problem, map out the space, and pick a problem to be solved and the target customer, then pursue several approaches to solving it. Narrow down and flesh out solutions, and then make a hi-fi user prototype. This is sometimes doable within a week.

A book recommendation from Cagan: Sprint: How to Solve Big Problems and Test New Ideas in Just Five Days.

Some places have 'agile coaches'. Most of those are really delivery coaches, meaning they only address delivery. A discovery coach is focused on product.

Pilot Team Technique

Find a product team to volunteer to try out these new ways of working, let it run at least two quarters. The comparison to the control groups will be qualitative, but still valuable. It helps if the pilot team is not dependent on other teams that use old methods.

Weaning An Organization Off Roadmaps

Aim to do this over 6-12 months. Every time you reference a roadmap item, include a reminder of the business outcome the feature was meant to address.

For example, if the feature was adding PayPal in an effort to increase conversion, reference conversion. When the feature is live, if the conversion rate does not go up, say you have more ideas on how to move it.

Remember that people (stakeholders) want roadmaps because

  1. They want visibility to make sure the team is addressing the most important stuff.
  2. They want to be able to plan the business around critical events.

Process At Scale

Beware of organizations that claim to do "Agile at Scale".2 As it grows, an organization may try to protect its achievements by formalizing and standardizing how products are produced; that usually happens accidentally. There are some methods for trying to keep the magic.

Managing Stakeholders

A lot of people have opinions, but not all of them are stakeholders. Rule of thumb: people with veto power are stakeholders. An incomplete list: the exec team, business partners, finance, legal, compliance, business development.

The PM should understand the concerns and constraints of each stakeholder, customers, and the industry. Try weekly one-on-ones with stakeholders. Listen, ask many questions. Do it during discovery, not when you've already built. Do it before you put work on the product backlog. Plan for 2-3 hours per week. Lunch and coffee are good for this.

DO NOT have one big meeting with all the stakeholders, because it will degenerate into design by committee, and PowerPoint decks are bad for showing product intentions.

Avoid situations where it boils down to their opinion vs yours, by following "data beats opinions", and doing discovery work.

Only trust executive sign-off on a hi-fi prototype.

You may need to evangelize the duties of your own role as well.

Growing companies may want to borrow the credibility of a big brand like Oracle by hiring from there, but those companies don't innovate and are surviving on the brand alone. If you're coming from one of these types of companies, be open in an interview about how you're trying to leave it and its locked-in processes behind.

Communicating Product Learnings

Have an all-hands with the head of product, 15-30 minutes every 1-2 weeks. It shouldn't be detailed sprint reviews, but just high-level sharing.

The Right Culture

Most of this comes down to culture.

Good Product Team / Bad Product Team

Good | Bad
missionary | mercenary
product ideas from observing customers and data | gathering requirements from sales and customers
understand stakeholders and their customers | get requirements from stakeholders
skilled in discovery techniques | roadmaps
brainstorm across the company | insular
product, engineering, design are side-by-side | functions are siloed
new idea tests all the time | wait for permission to run a test
insist on product skills | don't know what designers are
engineers have time to try prototypes every day | show engineers prototypes only during sprint planning
engage directly with users every week | think the product team is the customer
know many ideas get thrown out | just build from the roadmap
understand speed comes from good techniques, not forced labor | complain colleagues don't work hard enough
use high-integrity commitments carefully and sparingly | complain about being sales-driven
instrument the work, use analytics | analytics are "nice to have"
release continuously | release in bulk
obsess over reference customers | obsess over competitors
celebrate impacts to business results | celebrate a release

Top Reasons for Loss of Innovation

Missing:

  1. customer-centric culture
  2. compelling product vision
  3. focused product strategy
  4. strong product managers
  5. stable product teams
  6. engineers in discovery
  7. corporate courage (for risk)
  8. empowered product teams
  9. product mindset
  10. time to innovate

Top Reasons for Loss of Velocity

  1. technical debt
  2. lack of strong product managers
  3. lack of delivery management
  4. infrequent release cycles
  5. lack of product vision and strategy
  6. lack of colocated, durable product teams
  7. not including engineers early enough during product discovery
  8. not using product design in discovery and instead having them try to do their work at the same time as engineers
  9. changing priorities
  10. consensus culture

Establishing a Strong Product Culture

Many orgs are good at either innovation or execution, sometimes those conflict with each other. Places like Amazon are good at both, but can be brutal to work for.

Innovation culture:

  • experimentation
  • open minds
  • empowerment
  • technology (and analytics)
  • business and customer-savvy
  • skill-set and staff diversity
  • discovery techniques

Execution culture:

  • urgency (wartime mindset)
  • high-integrity commitments
  • empowerment
  • accountability (consequences to reputation)
  • collaboration
  • results
  • recognition

Questions to Leave On

  1. "Data beats opinions" but qualitative data is also useful. When is qualitative data especially useful and when is it not?
  2. Cagan often refers to the FAANG companies as references that have good product culture. Contemporary to the writing of this summary, the public reputations of one or more of those aren't doing so hot. In the case of Facebook, it's on ethical grounds for misinformation, as well as sliding on innovation. In the case of Google, it's on ethical grounds around user privacy. In the case of Amazon, it's around monopoly concerns and worker health and safety. There are many other companies that appear successful and respected. How would you go about seeing whether some of these have strong product cultures? Do they differ substantially in their product practices? How about when the organization is headquartered outside the US?
  3. In a new product or organization, how would you go about finding current best practices for analytics collection tools and KPIs that you might be able to make use of?

1 This is iffy enough for me to make a note of it. Even if you aren't actually going to use someone's credentials, it seems odd to encourage people to share or pass them around, as a matter of security practice or public opsec. There may be wording or safer ways to make the kind of credential ask described here, and Marty may be thinking of those.

2 "SAFe" may be one manifestation?