Learn what the Cost of Poor Quality (COPQ) is and how to improve the accuracy of your data as it relates to the cost of poor quality.

What is the Cost of Poor Quality?

Placing the cost of poor quality (COPQ) in context for all stakeholders (particularly stakeholders in IT) requires that you’re able to turn to the right data at the right time to make your business case. The cost of poor quality (i.e., scrap, rework, returns, external failures, and so on) gives upper-level decision makers a stark reminder of the cost of neglecting quality from an enterprise point of view. As such, these metrics are critical, but they should not overshadow other variables in the overall cost of quality equation.

In this post we’ll walk through what the cost of poor quality is, and how to improve the accuracy of your data as it relates to the cost of poor quality.

Cost of Poor Quality Criteria

Defining the Cost of Poor Quality (COPQ)

The Cost of Poor Quality (COPQ) refers to the costs generated as a result of producing defective material. The direct costs are easy to identify, such as labor, rework, disposal, material, and recall costs. However, the indirect costs can also significantly impact your company’s profitability. These costs include:

  • Excessive overtime
  • Warranty costs
  • Returns
  • Excess inventory
  • Lost sales
  • Compliance failure
  • Increased audits
  • Brand reputation damage

COPQ for an average company is about 20% of sales. So a company that generates $100M in revenue can waste $20M addressing poor quality. It’s in your brand’s best interest to identify problems early to reduce the impact on costs, operational resources, brand reputation and, most importantly, your consumers’ health and safety. One of the best ways to ensure product quality and reduce costs is by implementing a quality management solution.
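To make the 20%-of-sales rule of thumb concrete, here is a minimal sketch in Python. The function name and the revenue figure are illustrative (the revenue matches the $100M example above); the 20% rate is the article’s stated average, not a fixed constant for any given company:

```python
def estimate_copq(annual_revenue: float, copq_rate: float = 0.20) -> float:
    """Estimate the cost of poor quality as a fraction of annual revenue.

    copq_rate defaults to the ~20%-of-sales industry average cited above;
    your company's actual rate will differ.
    """
    return annual_revenue * copq_rate

revenue = 100_000_000  # $100M in revenue, as in the example above
print(f"Estimated COPQ: ${estimate_copq(revenue):,.0f}")  # prints "Estimated COPQ: $20,000,000"
```

Even a rough estimate like this can anchor a business case: it puts a dollar figure next to the quality investments being proposed.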

Contextualizing the Cost of Poor Quality

The costs of poor quality are high-profile and, broadly speaking, well understood by researchers. The hard statistics vary from sector to sector, but the overall trend is clear: the cost of quality rises substantially as defects come to light closer to the consumer.

Conversely, the cost of quality is most favorable when quality issues surface early in the value chain. The need to mitigate non-conformances as early as possible is clear. IT’s role in enabling timely resolution of quality concerns is key to your organization’s quality management success. Without the ability to sift through accurate historical data to pinpoint root causes, any quality management professional would struggle to consistently identify and mitigate issues before products move through the value chain.
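The escalation described above can be sketched numerically. The stages and multipliers below are hypothetical, loosely based on the common heuristic that remediation cost grows roughly tenfold at each stage toward the consumer; they are not figures from this article, and actual ratios vary widely by sector:

```python
# Hypothetical cost to fix one defect, by the stage at which it is detected.
# BASE_COST and all multipliers are invented for illustration only.
BASE_COST = 100.0  # cost when caught at incoming inspection

STAGE_MULTIPLIERS = {
    "incoming inspection": 1,
    "in-process manufacturing": 10,
    "final test": 100,
    "in the field (consumer)": 1_000,
}

for stage, multiplier in STAGE_MULTIPLIERS.items():
    print(f"Defect found at {stage}: ${BASE_COST * multiplier:,.0f}")
```

The exact numbers matter less than the shape of the curve: each stage a defect survives multiplies the cost of fixing it, which is why early detection dominates the cost of quality equation.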

Unfortunately, this is where the shortcomings and limitations of your company’s current IT infrastructure and software tools may come into play, and the real-world situation is too often far from ideal. Research by the Aberdeen Group has shown that a significant number of organizations still struggle to measure quality metrics effectively.

You may view this finding as contradictory since your organization has likely made substantial investments in new technology and IT architecture over the last few years, specifically to address this issue. Your organization’s cost of quality metrics tie directly into your IT team’s ability to deliver timely and accurate data to the right personnel before products move on to manufacturing and downstream to your consumers.

IT Sprawl As a Real Phenomenon

As a quality management professional, you should never forget that IT sprawl is a very real phenomenon, particularly among large enterprises with multiple manufacturing sites and offices around the world.

Today, critical quality management data may reside in fragmented silos of data sources, enterprise applications, and proprietary (and therefore expensive to maintain) solutions. Integration is critical to success in today’s leaner manufacturing environment, yet integrating an IT architecture of such complexity is a daunting task, to say the least. Your organization may be very efficient at collecting and storing financial and quality management data, but consolidating additional disparate data sources in the face of emerging quality management issues often leaves much to be desired.

In the end, your company may rely on wholly paper-driven processes to pinpoint and escalate quality management concerns to upper-level decision makers, allowing inefficiencies to dilute quality management processes overall.

Improving the Accuracy of Data Tied to the Cost of Poor Quality

In a worst-case scenario, less-than-ideal IT systems may actually be the root cause of quality issues, especially across an extended supply chain. The fact of the matter is that your organization may not have an optimal IT architecture in place to collaborate with supply chain partners effectively, if at all.

Your company may be missing a very important piece of the quality management puzzle: real-time visibility into supplier quality concerns. As such, your organization may not be measuring cost of quality as well as you might assume.

Large, complex enterprise software systems require careful implementation (and proactive maintenance) to ensure that all components of your software stack coexist peacefully and at a reasonable cost. Context is key when discussing the cost of poor quality with stakeholders in IT, since this data often resides in silos of applications that are not inherently interoperable. For example, your enterprise’s software stack may include multiple ERP or MOM instances across disparate manufacturing sites as a result of mergers and acquisitions activity.

In a data-saturated manufacturing enterprise, critical quality management intelligence may fall through the cracks of your organization’s disjointed corrective and preventive action (CAPA) processes, adversely affecting the cost of poor quality overall.

Learn more

Today, data quality is a mission-critical endeavor. Without timely, accurate, and ideally real-time visibility into quality management issues throughout the value chain, your organization may see minimal gains in its cost of poor quality metrics.

An integrated quality management system is one strategy to consider in the face of such challenges. For more on that, read Balancing the Cost of Quality, Technology Investments, and Integrated QMS.
