Taking a Product from Inception to Market

Scott Jones

VP of Product at Real Eyes



I had a lead role in developing a new product from idea to market launch. The venture resulted in some patents and a steady stream of revenue, which was pretty cool. I was able to get us off the ground in three months.

It was an advertising technology company, a demand-side platform. Customers engaged with us to buy ad space in real-time bidding exchanges. They were looking for opportunities to showcase their brands to very discrete audience segments while users browsed on their mobile devices, laptops, desktops, smart TVs, and internet radios.

We wanted to prove to advertisers that the five hundred grand they invested in our service was well worth it from an ROI perspective. We were hoping to trigger an uptick in whatever key metric they brought to us and to make that change obvious to them through measurement, reporting, and insights.

Actions taken

In the beginning, our CEO and a few other executives got together in a room to declare a strategic initiative: adding foot traffic measurement to our portfolio of ROI measurement capabilities. In other words, the ability to measure the impact of a campaign in driving targeted audiences to physical locations hyper-local to where they live. The company wanted to lean into this metric, and it was our job to figure out how.

I partnered with a data scientist. We spent three weeks studying the foot traffic measurements on our campaigns to see how effective they really were. These were offered by measurement companies on the leading edge of providing this foot traffic capability. The two of us would jump right into calls with the companies sending us their campaign reports, probe them with questions to understand their methodology, test assumptions, generally "poke holes" and "throw rocks," and ultimately come away with an understanding of how they did it end to end.

After surveying the competitors and taking inventory of our own capabilities and opportunities for differentiation, we came up with a plan for how we could leverage what we had to factor our own secret sauce into the equation. We then set the data scientist off to build a proof of concept of our own foot traffic measurement product.

I carry a strong sense of entrepreneurial spirit with me; I tend to get hired by technology companies because I'm scrappy and able to find a way to meet my goals. I needed to build a comprehensive data product for the team so that our data scientist could put in the R&D legwork.

We needed to figure out the mechanics of how this would all work; I had no resources, so I had to steal them. I contacted a data engineer directly and shared the specs of what we needed -- a giant table of observations including mobile device IDs, timestamps of when we saw them in an auction, and the latitude/longitude of where they were observed at the time of a given auction. We went through a few iterations with the data engineer to get this built suitably.
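The shape of that observations table can be sketched as a simple record type. This is a minimal illustration only; the field names and types here are hypothetical, not the actual schema we built:

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class AuctionObservation:
    """One row of the observations table: a device seen in a bid auction."""
    device_id: str    # mobile advertising identifier (e.g. an IDFA/AAID)
    timestamp: int    # Unix epoch seconds when the auction occurred
    latitude: float   # device latitude reported at auction time
    longitude: float  # device longitude reported at auction time
```

Each auction a device appears in adds one row, so the same device ID shows up many times over a campaign, tracing where that device was observed over time.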

The data scientist then did his development work and ultimately completed a POC -- he had to run each module by hand, but the output was a test-and-control study that measured ROI as an increased probability of being observed at a physical location. A test group, composed of a targeted audience segment, would see our ad on their phones, and a control group, a look-alike audience, would not. Using the table of observations described above, we could see where the test and control groups had been over time.

We hoped to illustrate an increase in the probability that a member of the test group, having seen our ad, was later observed at the physical location the ad was trying to drive them to visit. This could be a restaurant or a car dealership -- any type of business where customers show up in person. Measuring geolocation at that level of precision was of great importance to us.
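The mechanics of that test-and-control comparison can be sketched roughly as follows, assuming a visit is inferred when a device's observed coordinates fall within a small radius of the target location. All function names, thresholds, and data shapes here are illustrative assumptions, not our actual production logic:

```python
from math import radians, sin, cos, asin, sqrt

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance in kilometres between two lat/lon points."""
    dlat, dlon = radians(lat2 - lat1), radians(lon2 - lon1)
    a = sin(dlat / 2) ** 2 + cos(radians(lat1)) * cos(radians(lat2)) * sin(dlon / 2) ** 2
    return 2 * 6371.0 * asin(sqrt(a))

def visit_rate(observations, store_lat, store_lon, radius_km=0.1):
    """Fraction of devices observed within radius_km of the store at least once.

    `observations` is an iterable of (device_id, latitude, longitude) tuples.
    """
    visitors = {d for d, lat, lon in observations
                if haversine_km(lat, lon, store_lat, store_lon) <= radius_km}
    devices = {d for d, _, _ in observations}
    return len(visitors) / len(devices) if devices else 0.0

def lift(test_obs, control_obs, store_lat, store_lon):
    """Relative increase in visit probability: exposed (test) vs look-alike control."""
    p_test = visit_rate(test_obs, store_lat, store_lon)
    p_control = visit_rate(control_obs, store_lat, store_lon)
    return (p_test - p_control) / p_control if p_control else float("inf")
```

A positive lift means devices that saw the ad were proportionally more likely to later be observed at the location than the look-alike control group, which is exactly the signal the POC study was designed to surface.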

We produced the POC results and brought them back to our executive suite of senior leaders. This was about a month and a half after the project's inception; I put together a collection of illustrative insights showing how I would sell this service to a potential client and provide value-adding reporting around the data to drive greater understanding of the audience. The feedback was very positive, so I continued to validate the findings with other constituents, including sales and marketing. This confirmed that we had something promising.

This is when I shifted modes; I wanted to figure out the fastest way to get us to market. I put together training and pitch materials that we could present to a potential client -- essentially, a concise way of showing them the opportunity to be found with us. This preliminary set of key insights showcased the stories this foot traffic product could tell: a car dealership that had more potential customers test-drive Lincolns after two weeks with our service, for example, along with insights showing demographic information about those customers -- how far they traveled, what they tend to purchase, the content they consume, even the types of cars they currently drive. We leveraged every dataset at our disposal to illuminate who these visiting customers were. These materials were then used in the field to sell the foot traffic measurement product.

That was our three-month mark. We had gone from February to May, turning the product around and onto the market, with a sales team pushing it for free at the beginning. Part of our strategy involved building in constraints that gave people something they could use while leaving us enough flexibility to continue developing as they used the product. We spent the following summer fleshing out every aspect of the service and making it real, ultimately building it into a fully productionized offering.

Lessons learned

  • Some of the constraints we imposed were budget constraints, which gave us the volume of use and traffic ideal for our study. Campaigns needed to run for a certain period of time and target an audience of a particular size, among other parameters. This set up an ever-flowing pipeline of new insight that we could plug into our optimization engines.
  • All of these things were useful because we could easily show the bang for the buck that our service provided: empirical evidence that our ads got the right customers to go where they needed to go. Our success depended on getting the product to market before we lost our shirts and depleted all of our resources pre-launch.
  • Once we got the validation we needed, we knew we had something viable to work with. We had something that could spit out a result, but it was not scalable at all; we just needed to figure out some way to deliver on the full version. Having validated that people would be interested, we were able to switch gears and figure out how this would all actually work.
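The constraint-driven approach from the first lesson above could be sketched as a simple eligibility gate run before admitting a campaign into a measurement study. The specific thresholds and names below are invented for illustration; our real parameters were campaign-specific:

```python
# Hypothetical minimums for a campaign to yield a statistically useful study.
MIN_DURATION_DAYS = 14
MIN_AUDIENCE_SIZE = 50_000
MIN_BUDGET_USD = 10_000

def eligible_for_measurement(duration_days: int, audience_size: int, budget_usd: float) -> bool:
    """Return True if a campaign meets the minimum volume needed for the study."""
    return (duration_days >= MIN_DURATION_DAYS
            and audience_size >= MIN_AUDIENCE_SIZE
            and budget_usd >= MIN_BUDGET_USD)
```

Gating like this keeps low-volume campaigns from producing noisy, unconvincing results while leaving room to tune the offering as clients use it.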

© 2024 Plato. All rights reserved
