After the Design Sprint
Design Thinking has become an essential part of how innovation teams surface ideas.
One specific tool has been the design sprint — typically a 5-day exercise using user-centered design methodologies to quickly arrive at a potential solution to a problem.
It’s a fantastic tool - one we use ourselves and teach to our clients. But coming out of a sprint, there’s often the question of what to do next.
Getting Stuck
In our conversations with innovation teams, it's extremely common for a team to run several design sprints, identify a number of potential solutions, and then get stuck.
In many cases, the rough prototype and the small sample of customer feedback, even if positive, aren’t enough to earn the resources internally to move to an MVP build.
And so instead of a stack of ideas piling up, there’s now a stack of partially validated prototypes. You’ve moved the ball forward, but not by much.
Validating the rest of the Lean Canvas
A great tool for summarizing a potential solution on a single sheet of paper is the Lean Canvas.
Underneath an idea is a set of assumptions:
- You’re assuming a certain stakeholder (or set of stakeholders) has a problem.
- That you have the solution to that problem.
- That the problem is solved in a way that creates value for the stakeholder - enough for them to either start paying for it or otherwise incorporate it into their lives.
- That there are enough of these stakeholders out there (and that you can get access to them easily enough) to make the time and effort worth it.
- That the cost to get them to use it is cheap enough relative to the value you extract from their use to make it worthwhile.
- That the solution is defensible enough to avoid being copied immediately.
Design sprints can help solve a subset of these, but obviously not all of them.
Assuming you followed the practices of user-centered design, you did the work of validating stakeholder needs before starting your design sprint. And through the sprint process you’ve uncovered, at least to a degree, evidence that the solution could solve the problem.
But just like a venture fund is unlikely to invest in a company on the basis of a clickable prototype and 6 conversations with end users, your growth board or other innovation governance team likely won’t consider that sufficient to green light the resources to build an MVP.
So how do you close that gap, improve your case and move the initiative forward? Here are 5 suggestions that have helped other clients break the logjam.
Polish the prototype and get more data points
You made something quick and dirty to capture user feedback during the sprint. Now it’s time to create a high-fidelity version of that prototype.
This doesn’t need to be working software. It can still leverage tools like InVision. But it should have the look and feel of a finished product.
With that in hand, go outside the conference room and show it to more users. Capture their feedback around willingness to use and, if necessary, willingness to pay.
Don’t be surprised when you get lukewarm feedback or objections. Use those to improve the prototype, and go back out there.
Getting more data points can help build your case. And interestingly, simply having a highly designed version of your prototype can increase internal momentum as well. It gives the impression that there’s a “there” there.
Showing pretty screens can often generate more enthusiasm than talking about an idea in the abstract (or even showing lower fidelity versions, which can signal that the idea is half-baked).
Create a functional spec
Assuming the solution is a piece of software, a fantastic next step is to turn the prototype into a functional spec document. This typically includes:
- Writing out the user stories in detail.
- Outlining all of your assumptions around integration points and other data that needs to be captured.
- Creating a product roadmap, with the functionality you envision in the MVP and subsequent releases.
- Most importantly, using that information to get a timeline and budget estimate for the build.
Doing this gives your internal team the data they need to assess the level of effort involved, which is an essential part of your case.
It also helps you craft the message in a way that increases the likelihood of success. Too often teams try to pitch the grand vision. They ask for the budget to create the product as it will ultimately be. But this increases your cost and risk, both perceived and actual.
Instead, identify the “core experience” that belongs in the MVP. Focus your effort on nailing the first-time UX and that core experience. Defer the rest.
This is a good practice anyway, as simpler, tighter products tend to result in higher user satisfaction and limit the amount of behavior change needed to adopt.
Getting timeline and budget numbers also helps you identify potential trade-offs, shaping what that core experience does and doesn’t include. It lets you make a more modest ask for MVP development while providing visibility into the costs of the potential long-term roadmap as well.
We typically recommend getting an estimate from an external vendor at this phase, even if your internal team plans to build. This addresses two potential issues:
- It avoids the objection that the internal team has too many other things going on.
- Often the internal estimate is higher than the estimate from the vendor, both in terms of budget and time. Having a second data point is incredibly useful in making the case.
Create a growth model
We’ve talked in detail about how we use growth models to put realistic assumptions around getting traction. We’ve found it to be an invaluable tool for making the MVP case.
The model makes a series of detailed assumptions at every layer of the customer or stakeholder funnel:
- Acquisition: How will we get people to find out about the solution? From what channels? How much will those channels cost? What’s the audience size of those channels?
- Activation: What are the steps necessary for a user to go from awareness to adoption? What are our estimates for conversion rates at each step of that onboarding process?
- Retention: What do we think retention will look like over 3-, 6-, and 12-month periods? (Note that benchmarks are incredibly helpful here.)
- Revenue: If these users are being monetized in some way, what are our assumptions around monetization? Number of ad impressions, % of free to paid subscribers, etc.
- Referral: If there is a referral component, what does that loop look like? Number of invites sent per user, % who accept the invite, their onboarding behavior (note that it’s often different from a user acquired cold because of the referral), etc.
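To make those layers concrete, here is a minimal sketch of such a model in Python for a single acquisition channel. Every number in it (the audience size, the conversion rates, the $29 price point) is an illustrative assumption, not a benchmark - the point is the structure of the funnel, not the figures:

```python
# Illustrative first-pass growth model for one acquisition channel.
# Every number below is a placeholder assumption to replace with your
# own estimates (ideally backed by benchmarks or smoke-test data).

channel_audience = 200_000   # assumed reachable audience in the channel
awareness_rate = 0.05        # assumed share of that audience we reach
cost_per_click = 1.50        # assumed paid-acquisition cost per visitor

visitors = channel_audience * awareness_rate    # Acquisition
signups = visitors * 0.20                       # Activation: visitor -> signup
active_m3 = signups * 0.40                      # Retention at 3 months
paying = active_m3 * 0.10                       # Revenue: free -> paid
monthly_revenue = paying * 29.0                 # assumed $29/month price

acquisition_cost = visitors * cost_per_click

print(f"Visitors: {visitors:,.0f}")
print(f"Paying users at month 3: {paying:,.0f}")
print(f"Monthly revenue: ${monthly_revenue:,.0f}")
print(f"Acquisition spend: ${acquisition_cost:,.0f}")
print(f"Cost per paying user: ${acquisition_cost / paying:,.2f}")
```

Even a toy version like this makes the leverage points visible: doubling the activation rate doubles revenue, while a weak retention assumption quietly erodes everything downstream of it.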
These are obviously all assumptions. But they demonstrate to your team that you’ve given things way more thought. That you have a rational plan for how you will grow the product once it’s launched, and that your estimates for ROI are backed by a detailed breakdown of the levers that influence it.
Create a smoke test
To further validate some of the assumptions in your model, you can create a smoke test focused on acquisition.
A smoke test can take many forms - often it looks like a landing page (with your pretty screens) with a call to action to sign up. Behind that is a page saying we’re not ready yet, asking for an email address to be notified when you launch. You supplement this with a tactical paid acquisition campaign to acquire potential customers.
This helps you validate a whole host of things:
- Which channels are most cost-effective for generating traffic from our target users?
- What value proposition and creative treatment is most effective?
- What percentage of people, once made aware of our product, are interested enough to click a sign-up button?
- Once they see that it’s not ready yet, what percentage still think it’s compelling enough to provide their contact information?
As an optional intermediate step, you can even add a pricing page (if relevant) in between the landing page and email collection. This lets you test various pricing models as well.
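As a back-of-the-envelope illustration, the readout from a smoke test reduces to a few ratios. All of the inputs below are made-up numbers you would replace with real figures from your ad platform and landing-page analytics:

```python
# Hypothetical smoke-test readout. Every input is an illustrative
# assumption, not real campaign data.

ad_spend = 2_000.00     # total paid-acquisition budget for the test
visitors = 1_600        # visitors the campaign drove to the landing page
cta_clicks = 320        # clicked the sign-up call to action
emails_captured = 112   # left an address on the "not ready yet" page

cta_rate = cta_clicks / visitors             # interest once made aware
email_rate = emails_captured / cta_clicks    # still compelled after the reveal
cost_per_lead = ad_spend / emails_captured

print(f"CTA click-through: {cta_rate:.1%}")
print(f"Email capture rate: {email_rate:.1%}")
print(f"Cost per lead: ${cost_per_lead:.2f}")
```

Those three numbers map directly onto the growth model: the click-through and capture rates replace guessed activation assumptions, and cost per lead grounds the acquisition-cost line.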
Conduct cold outreach
Finally, when dealing with b2b solutions, we often advocate for taking that smoke test and supplementing it with an aggressive cold outreach campaign.
You want the landing page up because it gives the idea credibility. You supplement the landing page with email addresses and LinkedIn profiles of people who “work” there. You build a list of targeted potential customers, and you craft a multi-modal outreach campaign to generate potential leads (typically some combination of email, phone and social selling).
This again forces you to get outside the building and test for value prop - customer fit.
While we believe in doing stakeholder research in other ways at the front end (validating pain through problem interviews, etc.), we do believe there is value in trying to sell the solution at this stage.
Getting meetings with potential customers who think the solution exists in some form is helpful because they’re no longer just giving you advice. They’re being sold, which increases resistance. As a result you get a ton of new useful information:
- It surfaces objections you’ll eventually need to learn to overcome.
- It begins to help you map the buying process by understanding who the decision makers might be for a solution like this.
- If coupled with a signed offer letter when there is interest, you give your team the most compelling reason to move forward of all - a signed commitment to purchase the product once it’s live.
Design Sprints are important but insufficient
Please, continue to do design sprints. We know of no better tool to quickly make progress on a concept for a solution.
But realize it’s not a magic bullet. You still have to do the work of building your case to earn the right to move forward with it.
You can fundamentally de-risk an idea and earn the resources to build an MVP in very little time and at very low cost. Consider baking the tools above into your process to increase your velocity of tested ideas.
Manifold has done all the above for clients many times. We’d be happy to help you execute on any of them. To learn more, contact us.