August 2025 Volume 7

OPERATIONS & MANAGEMENT

THE PARADOX OF THE INVISIBLE DISCIPLINE

By Ray Harkins

Early in my quality management career, while working at a small extrusion and fabrication company, I learned something important: bosses pay attention to the money. And if I focused on cost savings projects, I could stay on their good side. Most of my cost savings efforts at that time focused on eliminating specific types of defects. After all, even a low-frequency defect—especially one that reaches a customer—can drive substantial savings once resolved. Other projects looked inward, targeting inefficiencies in our systems and practices. Lab procedures, control plans, and audit schedules tend to drift out of sync with the products and processes they’re supposed to control. So every now and then, a little system hygiene—an organized cleanup—can free up resources and allow you to reallocate attention to where it’s needed most.

It was during one of those hygiene projects that I stumbled into something I’ve since come to call The Paradox of Invisible Discipline.

When I started my first job as a quality manager, I adopted a robust final product audit process from my predecessors. Every product required some level of audit before shipping. Some audits were simple—just visually comparing a sample to a print. Others were more detailed, requiring full dimensional layouts before the lot could be released. It took months to fully understand the system’s complexity: single audits, double audits, normal and tightened inspection levels—all designed to meet a mix of internal standards, customer-specific requirements, and regulatory mandates.

It took another six months before I felt ready to assess how well the system was actually working. By then, I had developed what I believed was a solid sense of where the audits were aligned and where they had drifted. The good news: most audits were clearly focused on “risky” parts—those with a history of customer complaints or known process variation. That made perfect sense.
Final audits were acting as a last screen in a broad net meant to catch problems before they reached the customer. But a minority of audits didn’t seem to follow that logic. These were parts that had run clean for years—no complaints, low internal defect rates, high confidence. Obviously, I thought, these were success stories. The corrective actions had worked. So why keep throwing audit resources at parts that weren’t causing trouble?

I proposed a shift: move those auditing hours to areas with higher external risk. My boss and fellow managers supported this data-driven decision. And for a while, it worked beautifully. We caught more defects in high-risk areas and prevented several near-escapes.

But then the plan backfired. Those “trouble-free” parts started slipping. Defects crept in. Operators got lax with their own checks. Scrap rates quietly climbed. We were losing control—and we hadn’t seen it coming. What happened?

I’ll tell you what happened. The Paradox of Invisible Discipline happened.

The paradox is this: when a control system is working well—so well that defects vanish and problems don’t reach customers—it looks like nothing is happening. The very success of the system makes its contribution invisible. And as a result, it becomes a target for “optimization.”

Industry tends to reward visible problem-solving. If a manager leads a team through a supplier firestorm or salvages a delayed job with heroic overtime and clever rerouting, they get credit—and rightly so. But when a system runs smoothly because someone quietly maintains a schedule of process audits, gage calibrations, and internal reviews, that’s just “business as usual.” We’re wired to notice crises, not their absence.

In my case, the operators responsible for those “trouble-free” parts knew their lots were being audited, so they paid closer attention to avoid drawing negative attention. Auditors often spotted subtle product shifts and relayed them to operators before issues escalated. We also used double sampling plans, allowing a second sample to be drawn if the first showed a defect. Sometimes, we’d find a flaw but still approve the lot—because the system was designed not just to block defects, but to inform and adjust upstream behavior. Eliminating these audits eliminated the unseen benefit they were providing.

Understanding the Paradox

This experience taught me something that, once seen, is hard to unsee: discipline only feels unnecessary when it’s working. The absence of defects, complaints, or rework isn’t just a lucky break—it’s the result of countless small actions, consistently carried out, that prevent problems before they emerge. But here’s the trap:

• When a system becomes reliable, it no longer draws attention.
• When operators and auditors prevent problems, their effort goes unnoticed.
• And when problems stay gone long enough, people start asking why we’re still checking.

That’s The Paradox of Invisible Discipline in full view.
The better the system performs, the more its mechanisms are questioned or quietly dismantled. And because failures don’t return immediately, we mistake the absence of problems for proof that the discipline is now optional. This is not a failure of logic—it’s a failure of visibility. To protect what works, we need to stop assuming discipline is only valuable when it’s solving active problems. Instead, we need to make invisible discipline visible—by telling the story behind stability, tracking the behaviors that sustain performance, and reinforcing the purpose behind preventive practices. Here’s how.
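The double sampling plans mentioned earlier follow a standard accept/reject/resample logic, which can be sketched in a few lines of Python. The acceptance and rejection numbers below (`ac1`, `re1`, `ac2`) are illustrative assumptions for a small lot, not values from the article or any specific standard; real plans are pulled from tables such as ANSI/ASQ Z1.4 based on lot size and AQL.

```python
def double_sample_decision(defects1, defects2=None, ac1=0, re1=3, ac2=3):
    """Illustrative double sampling decision (acceptance numbers are
    hypothetical, not from the article or a published plan).

    defects1 -- defects found in the first sample
    defects2 -- defects found in the second sample, if one was drawn
    ac1/re1  -- accept/reject numbers for the first sample alone
    ac2      -- accept number for the combined count of both samples
    """
    if defects1 <= ac1:
        return "accept"            # first sample is clean enough
    if defects1 >= re1:
        return "reject"            # first sample is clearly bad
    # Borderline result: draw a second sample and judge the total
    if defects2 is None:
        return "draw second sample"
    return "accept" if defects1 + defects2 <= ac2 else "reject"
```

This mirrors the article’s point: a lot showing one flaw in the first sample is not rejected outright but gets a second sample, and may still be approved—while the borderline result itself feeds back to the operators upstream.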

FIA MAGAZINE | AUGUST 2025 32
