Testing a Business Model Like a Scientist

How do we decide what actions to take in business? Or in life?

In many cases, we base our actions on our models of the world. We think that things work in a particular way, and this determines the choices we make.

There is a strong case for making these choices more the way a scientist does. Here’s an example.

The Nieman Journalism Lab pulled together a set of opinions on the new pay system that the New York Times has put in place, with thoughts from Steven Brill, Anil Dash and Megan McCarthy among others. Here is one of the points that Steve Buttry makes in his piece:

“My friend and former boss Jim Brady says that you can’t build a business model based on what people should do (and newspaper people believe in their bones that people should pay for their content). You build a business model based on what people will do. This tortured maze of exceptions and trigger points is a laughable effort to collect because people should pay but to find a way not to lose the people who won’t pay.”

This is a great point.

What makes it especially useful is that we can set these beliefs up as hypotheses that can be tested. “People will pay for content” is an idea about which we can collect a fair amount of data.
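As a purely illustrative sketch (not from the original post), here is one way a belief like “people will pay for content” can become something measurable: show two hypothetical paywall variants to similar audiences and compare how many visitors actually convert. Every count, name, and number below is made up for the example; the point is that the belief turns into a prediction the data can contradict, not that this is the one right statistical test.

```python
import math

def two_proportion_z_test(conv_a: int, n_a: int, conv_b: int, n_b: int) -> tuple[float, float]:
    """Compare conversion rates of two paywall variants (e.g., hard vs. metered).

    Returns the z statistic and a two-sided p-value via a normal approximation.
    All counts in this sketch are hypothetical; plug in real experiment data.
    """
    p_a = conv_a / n_a
    p_b = conv_b / n_b
    p_pooled = (conv_a + conv_b) / (n_a + n_b)
    se = math.sqrt(p_pooled * (1 - p_pooled) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value from the standard normal CDF (computed with the error function).
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical experiment: 10,000 visitors see a hard paywall, 10,000 see a metered one.
z, p = two_proportion_z_test(conv_a=120, n_a=10_000, conv_b=180, n_b=10_000)
print(f"z = {z:.2f}, p = {p:.4f}")  # a small p suggests the two variants really do differ
```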

A number of people are starting to say that we should treat business models as a set of testable hypotheses. Steve Blank outlines how to do this very well in this set of slides:

His thoughts on the broader trends in start-up experience are well worth reading in full, but for our purposes, jump to the case study that starts on slide 72. He uses the Business Model Canvas to discuss the experience of a start-up called OurCrave, an online social shopping platform.

The case shows what the original business model was, and more importantly, how the assumptions underlying it were tested. Then he shows how the business model changed five times in response to data and testing.

This approach doesn’t guarantee success, but it certainly improves the odds. It demonstrates the reality that Peter Sims describes in a recent post:

“The truth is, most entrepreneurs launch their companies without a brilliant idea and proceed to discover one, or if they do start with what they think is a superb idea, they quickly discover that it’s flawed and then rapidly adapt.”

Thinking about your business model like a scientist can be a good approach, provided you keep two cautions in mind.

The first is to beware of false precision. You want to test your assumptions with data, but you can’t always get the data you need. And the fact of the matter is that your plan will change yet again once you launch it and people really start interacting with your ideas. When testing numbers like these, aim to be within an order of magnitude, and remember that they are estimates.
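For example (a hypothetical sketch, not from the post), one way to treat plan figures as order-of-magnitude estimates is a quick check that flags any actual result landing more than a factor of ten away from the plan number. The subscriber figures below are invented for illustration.

```python
import math

def within_order_of_magnitude(estimate: float, observed: float) -> bool:
    """Return True if the observed value is within a factor of ten of the estimate."""
    return abs(math.log10(observed / estimate)) <= 1.0

# Hypothetical plan vs. early results: the plan projected 5,000 paying subscribers.
print(within_order_of_magnitude(5_000, 800))  # True: same order of magnitude, keep iterating
print(within_order_of_magnitude(5_000, 40))   # False: the underlying assumption looks wrong
```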

The second caution is to avoid making models that say things like “social media always works” or “social media never works.” These absolute cases are never true. What we really want to figure out are the circumstances in which an approach works or doesn’t. Your business model may or may not need social media support, or six sigma in your production, or any number of other things.

Scientists are interested in finding the boundary conditions for rules – when do rules stop working? When testing business model hypotheses, you’re trying to figure out what is right in your particular case. So beware of absolute statements about what will or won’t work.

Experimenting is a critical innovation skill. If you can figure out how to experiment with your business model, you will increase your chances of success. Just remember to test it like a scientist.




Tim Kastelle is a Lecturer in Innovation Management in the University of Queensland Business School. He blogs about innovation at the Innovation Leadership Network.


Comments

  1. Brad Barbera on April 5, 2011 at 9:43 am

    As a scientist (or at least a former engineer, if that counts!), I find this approach very appealing. I would add one other caution to the list: beware of “confirmation bias.” Entrepreneurial efforts are generally very personal. The passion and enthusiasm for an idea can easily cloud judgement, and what seems like an objective approach turns into a self-fulfilling prophecy. As objective as scientists like to believe themselves to be, they fall prey to confirmation bias all the time. Contradictory data are dismissed as “outliers,” while confirmatory data are exaggerated in importance. If one is to implement a scientific approach, be well aware of this danger, and perhaps seek an external assessment of data to help ensure objectivity.
