What is the future of SPC?
Quality assurance in industrial manufacturing has changed fundamentally over the decades. A key tool in this development is statistical process control (SPC), a method that has become indispensable in modern production.
Historical origins: Shewhart and the birth of SPC
In the 1930s, Walter A. Shewhart developed the first quality control charts at Bell Telephone Laboratories. The aim was to monitor process stability through the statistical evaluation of random samples. This laid the foundation for SPC, a method that is still used worldwide today.
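The principle behind such control charts can be pictured with a small sketch: subgroup means are compared against 3-sigma control limits, and points outside those limits signal an unstable process. The data below are invented, and the sigma estimate is simplified (taken directly from the subgroup means rather than from the range-based constants used in classic Shewhart charts), so this is an illustration of the idea, not Shewhart's original procedure.

```python
import statistics

# Illustrative sketch only: Shewhart-style control limits from subgroup means.
# The subgroup data below are made up for demonstration.
subgroups = [
    [10.1, 9.9, 10.0, 10.2],
    [10.0, 10.1, 9.8, 10.1],
    [9.9, 10.0, 10.2, 10.0],
]

# Mean of each random sample (subgroup)
xbars = [statistics.mean(g) for g in subgroups]

# Grand mean and spread of the subgroup means (simplified sigma estimate)
grand_mean = statistics.mean(xbars)
sigma_xbar = statistics.stdev(xbars)

# Classic "3-sigma" control limits: points outside them indicate instability
ucl = grand_mean + 3 * sigma_xbar
lcl = grand_mean - 3 * sigma_xbar

for i, x in enumerate(xbars, start=1):
    state = "in control" if lcl <= x <= ucl else "out of control"
    print(f"sample {i}: mean = {x:.3f} ({state})")
```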
SPC in Germany: Early impetus and institutional anchoring
In Germany, too, the debate on statistical methods began early on. As early as the 1950s, a working group was formed that later became part of the German Society for Quality (DGQ). SPC became increasingly well known through training courses and specialist literature, but its application remained voluntary and often theoretical for a long time.
Ford Q101: The practical breakthrough through customer audits
SPC received a decisive boost with the introduction of the Q101 guideline at Ford in the 1980s. Suppliers were required to monitor their processes statistically and to evaluate them using capability indices. Customer audits ensured that these requirements were actually implemented. At the same time, responsibility for quality shifted from the inspection department to the operator, supported by suitable measurement technology and digital control charts.
This development led to a cultural shift: quality was no longer ‘checked’ but actively ‘produced’.
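The capability indices mentioned above can be illustrated with a short sketch: Cp relates the tolerance width to the process spread, while Cpk also accounts for how well the process is centred. The measurements and specification limits below are assumed values for demonstration, and the calculation presumes a stable, normally distributed process; it is not tied to any specific Q101 worksheet.

```python
import statistics

# Hypothetical measurements and specification limits, for illustration only
measurements = [9.98, 10.02, 10.01, 9.97, 10.03, 10.00, 9.99, 10.02]
lsl, usl = 9.90, 10.10  # assumed lower / upper specification limits

mu = statistics.mean(measurements)
sigma = statistics.stdev(measurements)

# Cp compares the tolerance width with the process spread;
# Cpk additionally penalises an off-centre process.
cp = (usl - lsl) / (6 * sigma)
cpk = min(usl - mu, mu - lsl) / (3 * sigma)

print(f"Cp = {cp:.2f}, Cpk = {cpk:.2f}")
```

In automotive practice a Cpk of at least 1.33 is commonly demanded, although the exact threshold depends on the customer requirement.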
Norms and standards: QS-9000, ISO/TS 16949 and beyond
With the publication of QS-9000 by the AIAG (Chrysler, Ford, GM), the development of SPC took another important turn: statistical process control became a mandatory part of quality management in the automotive industry. The successor standard ISO/TS 16949, now IATF 16949, created a globally applicable standard. Accompanying documents such as the SPC manual and the MSA guideline specified the requirements and became the basis for certification.
International standards such as ISO 7870 (control charts), ISO 22514 (capability studies) and ISO 11462 (SPC guidelines) now regulate implementation and further development.
Software validation and reference data sets
Ford recognised early on that faulty software calculations can lead to incorrect decisions. Together with Q-DAS, the company founded by Edgar Dietrich, who is now managing director of ComPro, Ford developed test data sets for validating SPC software. This approach is continued today in ISO/TR 11462-3.
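The validation principle can be pictured as running the software on a reference data set and checking its results against published expected values within a defined numerical tolerance. The data, expected values and tolerance below are invented for illustration; the actual reference data sets and acceptance criteria are defined in ISO/TR 11462-3 and are not reproduced here.

```python
import statistics

# Sketch of the validation principle with an invented reference case
reference_case = {
    "data": [5.1, 4.9, 5.0, 5.2, 4.8, 5.0],
    "expected_mean": 5.0,
    "expected_stdev": 0.1414,
}

TOLERANCE = 1e-3  # acceptable numerical deviation, chosen here for illustration

mean = statistics.mean(reference_case["data"])
stdev = statistics.stdev(reference_case["data"])

mean_ok = abs(mean - reference_case["expected_mean"]) <= TOLERANCE
stdev_ok = abs(stdev - reference_case["expected_stdev"]) <= TOLERANCE

print("mean check:", "pass" if mean_ok else "fail")
print("stdev check:", "pass" if stdev_ok else "fail")
```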
The future of SPC: AI as the next evolutionary step
The future of SPC lies in intelligent process monitoring. Today, sensors deliver huge amounts of data that can be analysed using machine learning and used for automatic process optimisation. Initial approaches show that the selection of suitable distribution models, previously a manual task, can in future also be carried out by AI.
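A minimal sketch of what automated distribution-model selection could look like: several candidate models are fitted to (simulated) sensor data and ranked by a goodness-of-fit statistic. The candidate list, the Kolmogorov-Smirnov criterion and the simulated readings are assumptions made for illustration, not a description of any particular product or standardised procedure.

```python
import numpy as np
from scipy import stats

# Simulated sensor readings stand in for real process data
rng = np.random.default_rng(0)
data = rng.normal(loc=10.0, scale=0.05, size=500)

# Candidate distribution models (an illustrative, non-exhaustive selection)
candidates = {
    "normal": stats.norm,
    "lognormal": stats.lognorm,
    "weibull": stats.weibull_min,
}

results = {}
for name, dist in candidates.items():
    params = dist.fit(data)                              # maximum-likelihood fit
    ks_stat, _ = stats.kstest(data, dist.cdf, args=params)
    results[name] = ks_stat                              # smaller = better fit

best = min(results, key=results.get)
print("best-fitting model:", best, "with KS statistic", round(results[best], 4))
```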
The selection of suitable measuring systems in accordance with VDA Volume 5 is also increasingly supported digitally. The planned harmonisation between AIAG and VDA is expected to result in a joint SPC manual in the near future. This marks a further step towards global standardisation.
Conclusion: SPC remains, but becomes more intelligent
SPC has evolved from a theoretical concept to the practical basis of modern quality assurance. With the advent of AI, we are once again facing a paradigm shift towards self-learning, adaptive quality systems. For quality professionals, this means that SPC remains relevant, but is becoming more intelligent, faster and more connected.
You can also get an overview of the history of statistical process control in the video "Historical Review and Outlook for Statistical Process Control (SPC)". You can deepen your knowledge of SPC with the reference book "Statistical Procedures for Machine and Process Qualification" from Hanser Verlag.