Thomas Thurston

Some Ideas Are Better Than Others (and I don’t mind saying so)

Updated: Jul 24, 2023

Some ideas are better than others. This is a challenge for business strategists, who are voracious consumers of ideas (books, seminars, case studies, articles, blogs, biographies, etc.), but who aren’t typically trained to judge an idea’s quality. This leaves strategists vulnerable to bias, persuasion, poor heuristics and psychophysiological reactions (called ‘intuition’). Everyone from the doorman to the chairman has an opinion about what a company should do, but who should managers believe?


The Scientific Method is, basically, an entire domain dedicated to the difference between good and bad ideas. Sparing you a long-winded explanation, here’s a quickie:


Steps of Decision Quality (1 = lower quality, 7 = higher quality)

  1. Analysis based on personal insight

  2. Analysis based on consensus, drawn from multiple people

  3. Analysis based on observed representative data with a statistically meaningful sample size

  4. Analysis based on statistically meaningful correlated data (e.g. a ‘back test’)

  5. Analysis based on correlations from training data (back test) and at least one randomized control group (see the sketch after this list)

  6. All of #5, plus a theory of causation vetted by multiple randomized control groups and meta-analyses

  7. All of #6, plus forward-looking predictions and correlated post-results (‘future tests’)
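
To make the jump from Step 4 to Step 5 concrete, here is a minimal sketch in Python (the sock-store setup, numbers and variable names are hypothetical illustrations, not the author’s data or model). A hidden confounder makes a back test report a promotion “lift” that a randomized control group correctly shows to be roughly zero:

```python
# Hypothetical sketch: Step 4 (back test) vs. Step 5 (randomized control group).
# Stores in wealthy areas both run more promotions and sell more socks, so a
# back test on observed history finds a large "lift" from promotions even
# though, in this toy world, promotions have no causal effect on sales.
import random
import statistics

random.seed(0)

def observed_store():
    wealth = random.gauss(0, 1)                           # hidden confounder
    promo = 1 if wealth + random.gauss(0, 1) > 0 else 0   # wealthy stores promote more
    sales = 50 + 10 * wealth + random.gauss(0, 5)         # promo has zero true effect
    return promo, sales

# Step 4: back test on non-randomized history
history = [observed_store() for _ in range(5000)]
promo_sales = [s for p, s in history if p == 1]
other_sales = [s for p, s in history if p == 0]
print("Back-test lift:", statistics.mean(promo_sales) - statistics.mean(other_sales))
# -> a large apparent lift, created entirely by the confounder

def randomized_store():
    wealth = random.gauss(0, 1)
    promo = random.randint(0, 1)                          # promotion assigned by coin flip
    sales = 50 + 10 * wealth + random.gauss(0, 5)
    return promo, sales

# Step 5: randomized control group
trial = [randomized_store() for _ in range(5000)]
treated = [s for p, s in trial if p == 1]
control = [s for p, s in trial if p == 0]
print("Randomized lift:", statistics.mean(treated) - statistics.mean(control))
# -> approximately zero, the true causal effect
```

The numbers are made up; the point is that randomization removes a confounder the back test cannot see, which is exactly what separates Step 5 from Step 4.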


[Graph: reliability of results by analysis step]

The further an idea has matured along the 7 Steps, the more reliable it is likely to be as a basis for decisions. For example, when deciding what product to launch, basing decisions on personal insight (Step 1) is likely to produce the least reliable results.


Wait… what was that?


It’s easy to believe other people’s insights are unreliable; it’s harder to swallow that our own are too.


An Unfortunate Irony

This brings us to an unfortunate irony. The more an idea has matured, the less intuitive it can seem, which causes many to favor worse ideas over better ones. Consider two scenarios:


Scenario 1:

You work for a sock company. Your CEO is being advised by a tall, handsome, charismatic, BMW i8-driving ex-CEO and now Harvard Professor named Victor Powers (I made that up). He and your CEO met at Yale, before Victor became a billionaire by launching a luxury brand of toothpaste. His advice is always to look for mundane products where you can launch a luxury category, and he says your company should launch high-end vicuña wool socks for $1,000 a pair. Victor believes your company can become the “Whole Foods,” “Tesla” or “Armani” of the sock world.


Scenario 2:

You work for a sock company. Your CEO is being advised by a short, mousey Prius-driving statistician named Miles Milktoast from Building 404 who talks about correlated sock purchasing and usage data patterns that were further vetted via multiple randomized control groups, meta-analyses and forward-looking predictions. The results are 72% accurate with 98% statistical confidence, suggesting you should launch a low-end, cheap brand of cushioned socks for poor people with mild-to-moderate bunions or neuromas.
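
The post doesn’t say how Miles arrived at “72% accurate with 98% statistical confidence,” but one plausible reading is a confidence interval around a back-tested hit rate. Here is a hypothetical sketch in Python using a normal approximation (the 1,000-prediction sample size is made up):

```python
# Hypothetical illustration of "72% accurate with 98% statistical confidence":
# a normal-approximation confidence interval around an observed hit rate.
import math

hits, n = 720, 1000          # made-up sample: 720 correct calls out of 1,000
p_hat = hits / n             # observed accuracy = 0.72
z = 2.326                    # z-score for a two-sided 98% confidence level
half_width = z * math.sqrt(p_hat * (1 - p_hat) / n)

print(f"Accuracy: {p_hat:.1%}")
print(f"98% CI: [{p_hat - half_width:.1%}, {p_hat + half_width:.1%}]")
# -> roughly 68.7% to 75.3%: well above a coin flip at this sample size
```

A Wilson or exact binomial interval would be more careful at small samples, but the normal approximation is enough to show what a statement like Miles’ could mean quantitatively.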


I think we’d all like to live in a world where, without question or a moment’s hesitation, executives would always choose the recommendations in Scenario 2.

I also think that, in reality, we all know it’s a coin toss, depending on how good a sense your CEO has for decision quality, statistics, cognitive bias and the Scientific Method.


The broader point I’m trying to illustrate is this: managers need a basic level of competence before they can see the difference between better and worse decisions. Making matters worse, the more an idea matures, the harder it can be to explain to untrained minds. Talk of methodological sophistication (e.g. correlations, control groups, statistical confidence) can have the adverse effect of alienating people’s intuitions, prompting negative feelings that obscure the better course of action.

The better the decision, the worse it can feel to ill-equipped decision makers.


This is especially the case when data runs counter to preexisting biases and intuitions. Every counter-intuitive scientific discovery, by definition, doesn’t sit right with people’s gut feelings at first. This is the intuition dilemma: intuition can only lead to answers that are intuitive, yet reality is often counter-intuitive. As a result, even though counter-intuitive insights are often the most valuable, they can also be the least likely to win acceptance.


A Common Irony in Decision Making:


[Graph: a common irony in decision making]

The Opportunity


This may seem gloomy. Decision makers don’t know good from bad and, left unchecked, their intuitions can systematically favor the wrong conclusions. While it’s bad news for those trapped in the quagmire, it’s fabulous news for those who aren’t.


As a venture capitalist who uses data science to manage a fund, I can tell you the decision-quality shortcomings of others are a huge part of our firm’s competitive advantage. Whether in venture capital or corporate strategy, those who develop an ear for decision quality and escape the traps of intuitive bias have a very real edge over those who don’t… or won’t.


Not all ideas are good ideas. Some are better.
