The System Of Profound Knowledge® (SoPK) is the culmination of W. Edwards Deming’s work on management. The four areas of the system are: appreciation for a system, knowledge of variation, theory of knowledge and psychology. This post explores knowledge of variation in the context of Dr. Deming’s management philosophy.
Even cursory overviews of Dr. Deming's work focus on his contribution to using data to improve the performance of organizations. The problem I (John Hunter) personally see is that the most important aspect is missed. More important than using specific tools (even extremely valuable tools such as the PDSA improvement cycle or control charts) is statistical thinking with an understanding of variation. Even before you analyze any data, this way of thinking changes how you view data and process results. This new way of thinking is the most powerful resource to take from your knowledge of variation.
“Why did something go wrong?” “Why are results so poor?” “How can we repeat this success?” The job of management is to not only ask these and other important performance-related questions, but also to find the right answers and take the right course of action. Dr. Deming provided the means for management to do just that through knowledge of variation.
In any business there are always variations: between people, in output, in service and in product. The outputs of a system result from two types of variation: common cause and special cause variation. Common cause variation is the natural result of the system. In a stable system, common cause variation will be predictable within certain limits.
Special cause variations represent a unique event that is outside the system: for example, a natural disaster.
Distinguishing between these two types of variation, as well as understanding their causes and predicting behavior, is key to management's ability to properly remove problems or barriers in the system.
The control chart is a tool to determine whether the system is in control (the system gives predictable results) and what those predictable results are. We will post on control charts in more detail later (as well as common and special causes). Essentially, though, the control chart is a tool to help you learn and analyze.
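The limits a control chart draws can be computed directly from the data. As a minimal sketch (not any official implementation), here is an individuals (XmR) chart calculation in Python, using the standard 2.66 moving-range constant; the defect counts are invented for illustration:

```python
# Minimal sketch of an individuals (XmR) control chart calculation.
# The constant 2.66 is the standard XmR factor for moving ranges of
# size two; the data below is invented for illustration.

def xmr_limits(data):
    """Return (center line, lower limit, upper limit) for the process."""
    center = sum(data) / len(data)
    # Average moving range between consecutive points.
    mr_bar = sum(abs(b - a) for a, b in zip(data, data[1:])) / (len(data) - 1)
    return center, center - 2.66 * mr_bar, center + 2.66 * mr_bar

def special_causes(data):
    """Indices of points outside the limits: candidate special causes."""
    _, lo, hi = xmr_limits(data)
    return [i for i, x in enumerate(data) if x < lo or x > hi]

# Example: daily defect counts; the spike at index 6 falls outside
# the computed limits and warrants special cause investigation.
counts = [12, 14, 11, 13, 12, 15, 40, 13, 12, 14]
```

Points outside the computed limits are candidates for special cause investigation; everything inside them is the system's natural, common cause variation, however tempting it is to explain each point individually.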
Depending on whether you have a common cause or a special cause, there are different strategies to improve. If you have a special cause, you want to learn about that special event and find ways to either capture what was beneficial or, more likely, learn what was a problem and develop counter-measures to avoid that problem in the future. The key here is that for a special cause you want to learn there is a special cause quickly (as it is much easier to learn about that special event when it is fresh in everyone's mind).
When the system is not producing acceptable results you still need to improve. But the more effective strategies use common cause improvement thinking. Those strategies examine the systemic results (all the data) and seek to find systemic improvements.
As we posted yesterday, Dr. Deming estimated in Out of the Crisis that 94% of causes were common causes, and he actually increased his estimate as time went by. Unfortunately our default mode (as people – relating to our psychology, one of the four areas of Deming's SoPK) is to use special cause thinking. We try to discover what was bad about the specific result (often seeking the person to blame) and fix that. This can work, it is just a very inefficient way to improve. In the 94% or 97% of cases when common cause thinking would be more efficient, we would get better results if we used that thinking.
On the positive side, system improvements are much more productive. If you only fix this special situation you haven't done much to improve future results; perhaps you have eliminated one potential problem. But often you don't even do that.
Instead of learning that your system has a weakness that should be addressed, you just fire the unlucky sap who was sitting at their desk when the problem manifested itself. You convince yourself that you dealt with the problem employee, but in fact the system just periodically gives that result, and a few months later someone else will be sitting in the wrong place at the wrong time.
Again tying to the area of psychology: people think there is much less variation than actually exists. Therefore their natural inclination (without a sufficient knowledge of variation) is to suspect a special problem when actually it is just the natural variation in the system. “Tampering, over-reacting to variation, is a common method of increasing variation – and costs!” – Brian Joiner (from my notes at Brian’s Leading for Rapid Improvement seminar).
When the result is within the expected range for the system (which a control chart will show), using special cause thinking is tampering. This results in lots of effort with little benefit. Without knowledge of variation, superstition can flourish.
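Brian Joiner's point about tampering can be demonstrated with a short simulation in the spirit of Deming's funnel experiment (under its "Rule 2": after each result, move the process setting to compensate for the deviation from target). All names and numbers here are illustrative assumptions:

```python
# A small simulation of tampering: the process contains only common
# cause variation, yet the operator "corrects" the setting after each
# result, so every adjustment just chases noise.
import random

def simulate(n=10_000, target=10.0, sigma=1.0, tamper=False, seed=42):
    rng = random.Random(seed)
    setting, results = target, []
    for _ in range(n):
        x = setting + rng.gauss(0, sigma)   # common cause variation only
        results.append(x)
        if tamper:
            setting -= (x - target)         # "correct" for the deviation
    return results

def variance(xs):
    m = sum(xs) / len(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

hands_off = variance(simulate(tamper=False))
tampered = variance(simulate(tamper=True))
```

Because each adjustment subtracts the last random error from the setting, the adjusted results behave like a difference of two independent errors, and the tampered variance comes out roughly double the hands-off variance: the well-meaning intervention made the process worse.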
Related: Variation, So Meaningful Yet So Misunderstood by Lynda M. Finn – Understanding Variation by Tom Nolan and Lloyd Provost