By John Hunter, author of the Curious Cat Management Improvement Blog.
One of the four areas of Deming’s management system is “understanding variation.” The core principle underlying that concept is using data to improve while understanding what data is and is not telling you.
Mistakes in interpreting data very often come from mistaking natural variation for something meaningful. Combine this with our brain's ability to find patterns (even in random data) and with confirmation bias, and problems follow. Using data is very powerful, but it is not enough; you need to use data properly (and pay attention to other important factors that data can't adequately account for).
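A quick sketch of this idea (my own illustration, not an example from the book): simulate ten identical processes producing nothing but random noise, then rank their averages. Some will look "better" than others, yet every difference is natural variation with nothing behind it.

```python
# Hypothetical illustration: ten identical processes, each producing
# 50 measurements of pure random noise around the same true value.
import random
import statistics

random.seed(42)  # fixed seed so the run is reproducible

averages = []
for process in range(10):
    samples = [random.gauss(100, 10) for _ in range(50)]
    averages.append(statistics.mean(samples))

best, worst = max(averages), min(averages)
print(f"'best' process average:  {best:.1f}")
print(f"'worst' process average: {worst:.1f}")
# The gap between "best" and "worst" comes entirely from chance,
# yet it is tempting to invent a story that explains it.
```

The temptation is to reward the "best" process and investigate the "worst," when in fact there is nothing to explain.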
Data can’t lie, but people can be misled, and they can even mislead themselves by misinterpreting data.
How Not to Be Wrong is an excellent book by Jordan Ellenberg on how to use math to avoid making mistakes. A great deal of the book is about the dangers of misinterpreting data and how to avoid being misled.
The book doesn’t discuss variation directly but discusses many ways to be misled by incorrect interpretations of data.
When a theory really has got your brain in its grip, contradictory evidence – even evidence you already know – sometimes becomes invisible.
This is obviously not a problem with math; it is an issue of our psychology. And this point is very well understood by those familiar with Deming’s ideas. It directly ties to two of the other areas of Deming’s System of Profound Knowledge: psychology and theory of knowledge. The book focuses on how to use math to avoid making errors. In doing so, Ellenberg is wise enough to notice that one problem is that we are often trapped by our own brains even when we should know better.
This quote was written about a scientist missing fairly obvious evidence – likely because it just didn’t fit how he was viewing the issue. This is a very common pattern and something you need to attempt to break yourself out of. In my experience this is possible, but it requires developing a habit of continually questioning what evidence supports your belief and trying to find evidence that undermines it. It seems to me scientists are better at doing this than most of us, but even they often fall into traps based on their beliefs, missing evidence so plain that it is hard to understand later how they overlooked it.
We are all subject to similar psychological forces that lead us to accept what is comforting and reject what is troubling. Another tactic I find useful is to remember this: when you reject something troubling, take a bit of extra time to see whether it is sensible to do so, or whether it is something you should investigate further. And question whether you are accepting weak evidence because it is comforting (perhaps due to confirmation bias).
Another tactic is to build numeracy (literacy for numbers) and, with it, the ability to spot data fallacies that often lead people astray. The book provides several examples of traps to avoid.
Dividing one number by another is mere computation; figuring out what you should divide by what is mathematics.
I think this is a great quote, though I must admit I think of that as statistics, not mathematics. But Ellenberg is in the mathematics department at the University of Wisconsin-Madison, not the statistics department, so I can understand why he sees it this way. My father was in the statistics department there, which is probably why I see it the way I do.
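The point of the quote can be shown with a tiny example (hypothetical numbers of my own, not from the book): comparing raw counts points one way, while dividing by the right denominator reverses the picture.

```python
# Hypothetical figures: city A has more accidents in total,
# but also far more drivers on the road.
city_a = {"accidents": 900, "drivers": 600_000}
city_b = {"accidents": 400, "drivers": 200_000}

# Raw counts suggest city A is more dangerous...
print(city_a["accidents"] > city_b["accidents"])  # True

# ...but figuring out what to divide by what (accidents per driver)
# tells the opposite story.
rate_a = city_a["accidents"] / city_a["drivers"]  # 0.0015
rate_b = city_b["accidents"] / city_b["drivers"]  # 0.0020
print(rate_a < rate_b)  # True
```

The division itself is trivial; the judgment about which denominator captures the question you actually care about is where the thinking happens.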
Related: Statistical Techniques Allow Management to do a Better Job – Data are not taken for museum purposes; they are taken as a basis for doing something. – We Must Remember the Proxy Nature of Data – Bigger Impact: 15 to 18 mpg or 50 to 100 mpg?