Avoiding Product Failure Through User Demand Testing
16 July, 2019
At Climb Credit, it was important to understand whether there was real demand for the product we wanted to build: one that could recommend a fitting program for students based on the career they are considering. The difficulty, as anyone who has built a recommendation platform knows, is that it takes a great deal of effort to build one based solely on user choice. As a startup, we had limited resources to develop something that didn't yield direct revenue, so the goal was to figure out how to test this market with limited risk and gauge the demand. It is very easy for technology and product leaders to fall into the trap of building quickly, spending six months to a year on a platform without ever validating the demand for it. Usually, they then put the product out there only to watch it fail.
Thanks to my previous job at BlackRock, I was very wary of rushing into something, wasting time and resources, and eventually watching the product fail. One of the things I did there was run hackathons focused on taking ideas and building MVPs while applying design thinking and user testing, so that we could first figure out whether a direction was viable rather than committing on gut feel. Drawing on that experience, I decided that instead of building straight into the project, we should scaffold it first and put a prototype out there. Much of this decision was also inspired by the book Sprint, a great read on design thinking. As a CTO, any time a massive investment is on the table, divorce yourself from the idea and ask whether it is necessary for the company. If so, decide whether it is necessary at that particular scale and what the appropriate level of investment is. Conversely, when you are not sure, set up your projects in a way that allows for experimenting and sandboxing, especially when you are testing a new business area. What we actually did was put a few of us in separate rooms for a time-boxed two-week block. We spent the first week iterating on what a solution would look like, and the second week building the lowest-cost prototype and putting it on the market.
- We took the idea in December and launched a prototype by January. It was a big shift for us, tripling our conversions on the platform: we went from being a lender that students discovered only through their schools to being a primary source of information for students as they made their decisions.
- By investing only a small amount of time, with the understanding that we wanted to keep the project as a sandbox to test for the rest of the year, we iterated through the product multiple times from January until July. Operating in this design-thinking, hackathon mode let us compound our learnings.
- The big learning from constraining our pathways and sandboxing the project was that we gave ourselves room for experimentation and failure, and the freedom to take risks. We could do this because the project did not consume the resources that a large initiative would have pulled from the rest of the business.
- We can now see which aspects can be monetized and fed back into the business model. Similarly, we are now in a position to make concrete decisions based on real data; had we gone with our original plan, we would only be halfway through development by now and no wiser about what would have been better for us.
- CTOs often find themselves wearing the hat of a special-projects person, so it is really helpful to use that position to set up experiments that maximize learning before you commit the business to significant investments.
- Check your assumptions at the door when you put things out in the real world, especially when you are working on sandbox products with a limited audience. It is very easy to fit the data to the narrative. Our assumption was that the information we surfaced would be the most valuable; it is important to let the data go where it needs to go.
- There is really no substitute for putting a product in front of users. Even a poorly finished product in the hands of a real user will give you a much better signal than the most polished prototype you build internally.