McNamara fallacy

Usually, having something named after you is a positive experience, at least for statisticians.

Robert McNamara probably didn't feel that way. It's not clear whether McNamara ever considered himself a statistician; he was part of the 1950s scientific management movement that became the precursor to much of what we see today.

It's a rich and interesting history that takes us from the 1950s into the 21st century. A quote from McNamara's Wikipedia page gives a sense of that time. Whilst at Ford:

"McNamara placed a high emphasis on safety: the Lifeguard options package introduced the seat belt (a novelty at the time), padded visor, and dished steering wheel, which helped to prevent the driver from being impaled on the steering column during a collision."

McNamara isn't remembered for this; he's remembered for his failure in the Vietnam War, where 'enemy body counts were taken to be a precise and objective measure of success'. The US continued to succeed at this measure but lost support for the war.

The generalisation of this idea is the McNamara fallacy, which is 'to measure whatever can easily be measured, ignore what's hard to measure, assume what's hard to measure isn't important and that what is hard to measure probably doesn't exist'.

Chris Williamson provides a nice summary where he says:

"...focussing on the most quantifiable measures even if doing so leads from our actual goals. We try to measure what we value but end up valuing what we measure."

There is no more straightforward, concise or accurate explanation of why your data analytics is worthless than this: the pursuit of what is easy to measure.

The intellectual fallacy is believing your organisation can keep doing this and remain unaffected.

No corporation that has failed this century failed for lack of data; they failed because too many people were following pointless dashboards and vanity metrics. Fifty years after McNamara's failure, there's no excuse for making the same dumb mistakes.

Go ask the executives at Palm how they managed to destroy $50bn of value between 2000 and 2010 when they controlled the market for personal digital assistants—the very thing we now all own and can hardly stop staring at.

How long do you think you can last, valuing what you measure instead of measuring what you value?


Why is this hard to implement?

"Did you know that over 80% of a data analyst's time is spent cleaning and preparing data?

No matter how sophisticated your models or visualizations are, your results are only as good as the data they’re built on.

Data cleaning is the backbone of every successful analysis. It helps to remove inaccuracies, fill gaps, and ensure your insights are trustworthy." - Esther Anagu

What Esther has written is a good representation of how most people work. If you work this way, follow her advice; she's correct in saying that the quality of the data determines the value of the outcome.

You can imagine those working under McNamara spent considerable time and effort on data cleaning and preparation to ensure the enemy body counts were correct.

How much time and effort was wasted producing a precise figure that was useless? McNamara was a smart guy who was ahead of his time in many ways, yet he still failed on the world stage.

Fifty years later, we're still writing about the standard approach, and the comments indicate that people find it helpful. The standard approach is to go looking for something (which might not be very meaningful) because you've already spent all that time on data preparation. Economics has a term for this: the sunk-cost fallacy.

The standard approach is like asking someone to film hundreds of hours of footage and then spending your time cleaning and editing that footage in the belief that there's a movie in there somewhere if you spend enough time and effort. There might be a movie, but it probably sucks, and no one will want to watch it.

I reckon there's an alternative approach: take the 80% of time spent cleaning and preparing data (roughly 1,500 hours per year for a full-time analyst) and spend it on designing simple, clever hypotheses. Data is collected as needed, and questions are answered.
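If you want to see where that 1,500-hour figure could come from, here's a minimal back-of-envelope sketch in Python. The working-year numbers are my assumptions, not something from Esther's post; only the 80% share comes from the quote above.

# Rough arithmetic behind the ~1,500-hour figure (inputs are assumptions).
working_weeks = 47          # assumed full-time year: 52 weeks minus leave and holidays
hours_per_week = 40         # assumed standard working week
cleaning_share = 0.8        # "over 80% of a data analyst's time", per the quote above

annual_hours = working_weeks * hours_per_week      # 1,880 hours
cleaning_hours = annual_hours * cleaning_share     # ~1,504 hours

print(f"Hours per year spent cleaning and preparing data: {cleaning_hours:.0f}")

Swap in your own working year and the share you believe; the point is that the bulk of an analyst's year goes to preparation rather than to deciding what's worth measuring.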

The alternative approach is to work out what you want your reel to be, shoot one or two takes, do minimal editing, and post it for the world to see. This is a lot less work for a much higher return. There's a good chance people will watch this, and if it's interesting, make another one. If it's not interesting, the investment was fairly low.

Data analysts are trying to make movies no one wants to watch, when all we want are interesting reels.

If McNamara could lose the Vietnam War with the might of the world's most powerful military by measuring the wrong numbers, what chance do you think you stand using the standard approach 50 years later?