Internal Controls: Designed to Fail or Designed for Failure?

All controls will fail. They will fail at a predictable rate. Internal controls not designed for failure are designed to fail.

The week of October 14 was “Risk Awareness Week” (RAW), a series of interactive workshops designed to raise awareness of risk management applications in planning, forecasting, budgeting, construction, investments, and performance management, and intended to significantly enhance decision making.

The tools and techniques discussed provide an objective basis for understanding risks and making sound decisions. I was inspired by what I heard and saw in these presentations.

But what does this have to do with designing, implementing and assessing control effectiveness? The answer today, unfortunately, is almost nothing.

The standards for internal control design, implementation and assessment are largely devoid of any rigorous quantitative analysis, any simulation, any modeling or any recognition whatsoever of human behavioral response. They are designed to fail.

Risk treatment strategies in other fields are designed to treat predicted failure rates and offset known negative impacts. In other words, “controls” as we call them are designed based on predictable failure rates. They are designed to achieve outcomes despite control failure.

There is plenty for auditors and other control practitioners to learn from these RAW workshops.

Measuring “Effectiveness” Requires Measuring Failure

Any assumption that a given control can ever be 100% effective is fundamentally flawed. Controls will always fail. But the rate of failure is predictable, and the nature of failures can be determined and offset.

|  | Designing Controls for Failure | Designing Controls to Fail |
| --- | --- | --- |
| Goal | Achieve a defined, desired outcome | Achieve a control objective rather than a defined business outcome |
| Success criteria | Evidence that the treatment contributes incrementally to the outcome | Evidence that the treatment (e.g., the control) is performed as intended |
| Strategy | Anticipate and manage failure. Effectiveness is achieving targeted failure rates with acceptable negative impacts | Anticipate 100% compliance. Detect and correct failures. Effectiveness is zero failures. Negative impacts are not considered |
| Failure criteria | Adverse impacts or side effects outweigh the benefits | Failure to perform the treatment (e.g., the control) |
| Remediation measures | The objective is achieved through a variety of complementary treatments that offset the expected failure rate. The treatment is designed to recognize failure | Forced compliance with the treatment (e.g., the control). The treatment becomes the objective. Failures are considered “deficiencies” |

Example 1 – Designed for Failure

When seeking regulatory approval for a new drug, manufacturers must conduct extensive fact-based research. One pharmaceutical product with which I am familiar has been proven scientifically to achieve specific beneficial clinical outcomes. However, the research behind its “effectiveness” shows that despite its proven ability to achieve results in most patients:

  • 20% of those taking the medication unintentionally skip 30% of their doses,
  • 15% stop taking the medication because of its side effects, and
  • in a small number of cases potentially fatal reactions occur.

This drug was considered “effective” and approved for use. Measurement of effectiveness is based on the outcome. The rate of and reasons for failures are known and predictable. They are not deficiencies. They are reality.

Physicians try to offset the known failure rates and negative side effects with other complementary measures. They recognize that humans will exhibit a behavioral response to the medication. They constantly measure success against the outcome desired.  The goal is cure, not treatment.
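
To make the outcome-based arithmetic concrete, here is a minimal back-of-the-envelope sketch in Python. The adherence figures come from the example above; the response rates, and the assumption that the non-adherent groups do not overlap, are invented purely for illustration.

```python
# Judge the treatment by the outcome achieved, not by compliance with the regimen.
# Adherence figures are from the example above; response rates are assumptions.
p_stop = 0.15                       # stop taking the medication (side effects)
p_partial = 0.20                    # keep taking it but skip ~30% of doses
p_full = 1 - p_stop - p_partial     # assumed: the two groups do not overlap

response_full = 0.80                # assumed response rate with full adherence
response_partial = 0.60             # assumed response rate with skipped doses
response_stopped = 0.00             # assumed: no benefit after stopping

expected_outcome_rate = (
    p_full * response_full
    + p_partial * response_partial
    + p_stop * response_stopped
)

print(f"Fully adherent: {p_full:.0%}, partially adherent: {p_partial:.0%}, stopped: {p_stop:.0%}")
print(f"Expected share of patients achieving the clinical outcome: {expected_outcome_rate:.0%}")
```

The effectiveness question is answered by that final number, not by the compliance rate with the dosing regimen.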

Example 2 – Designed to Fail

A business decides to reduce the incidence of fraud and error through the introduction and automation of a “treatment” such as Segregation of Duties (SoD).

Here is how I would assess the “effectiveness” of SoD using the logic of the FDA; a rough cost-benefit sketch follows the list. (These figures are based on my experience; yours may differ.)

  • Approximately 20% of the time, SoD is deliberately breached (through shared passwords, pre-signed forms, or other means).
    • A small portion of these breaches results in fraud or abuse. The specific rate of failures resulting in fraud or abuse is knowable and predictable.
  • SoD increases elapsed procurement time for critical processes by an average of 10%.
  • SoD adds about 2-5% to the total economic cost of an average procurement transaction.
  • SoD requirements are often a powerful disincentive to incur operating costs or invest in the business and may have a negative impact of 2-3% on profitability.
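
As promised above, here is a rough cost-benefit sketch of that logic in Python. Only the 20% breach rate and the 2-5% friction cost come from the list; the transaction volume, fraud rates, and dollar amounts are hypothetical assumptions chosen purely for illustration.

```python
# Assess SoD against the business outcome (net benefit), not against compliance.
# Breach rate and friction cost are from the list above; everything else is assumed.
annual_transactions = 10_000            # assumed annual procurement volume
avg_transaction_cost = 5_000            # assumed average transaction value ($)

breach_rate = 0.20                      # SoD deliberately breached (from the list)
fraud_rate_no_control = 0.005           # assumed fraud rate with no SoD at all
avg_fraud_loss = 25_000                 # assumed average loss per fraud event ($)
friction_cost_rate = 0.035              # midpoint of the 2-5% added cost (from the list)

# Assumed: SoD prevents all fraud where it is intact, none where it is breached.
losses_without_sod = annual_transactions * fraud_rate_no_control * avg_fraud_loss
losses_with_sod = annual_transactions * breach_rate * fraud_rate_no_control * avg_fraud_loss
fraud_losses_avoided = losses_without_sod - losses_with_sod

# Costs the control itself adds to every transaction.
friction_cost = annual_transactions * avg_transaction_cost * friction_cost_rate

net_benefit = fraud_losses_avoided - friction_cost
print(f"Fraud losses avoided: ${fraud_losses_avoided:,.0f}")
print(f"Friction cost added:  ${friction_cost:,.0f}")
print(f"Net benefit of SoD:   ${net_benefit:,.0f}")
```

Under these made-up numbers the control is a net cost; with different assumptions it could be a net benefit. The point is that judging SoD by the outcome requires this arithmetic, and the standards never ask for it.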

In the world of GRC, SoD is generally considered “effective” simply if it is implemented. The compliance rate is not predicted or known, and the negative impacts are not recognized.

When the treatment, not the outcome, is the criterion for success, failure is inevitable.

No attempt is made to measure or predict the failure rate, and negative impacts are not recognized. If breaches of SoD are detected, the remedy is more enforced compliance. If a breach of SoD results in fraud, it is considered a failure of SoD. Such reasoning is tautological and leads to endless, destructive repetition.

It is not a failure of SoD. It is a failure of control design.

When the “effectiveness” of a control is judged by the degree of compliance with the control, and not the outcome sought, then that control is designed to fail.

Designing Controls for Failure: What Needs to Change

Define the intended outcome: The business objective is paramount. Abandon the notion of a “control objective”. In my example above, if 100% of the patients took 100% of their doses but the desired clinical outcome was not achieved, the treatment could not be considered effective. The control objective would be met, but the goal of medical treatment is to cure. The goal of internal control is to achieve business objectives, not control objectives.

Recognize and Assess Adverse Impacts: The cost of some treatments exceeds the benefits. Assess the importance of the outcome and weigh the adverse impacts of treatment as part of the design decision.

Define Deficiencies Carefully: A deficiency should be assessed against the target failure rate. Correcting a deficiency must improve performance against the outcome. Tolerate control failures within the target range or change the target range and accept additional adverse impacts.
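
Here is a minimal sketch, assuming a simple binomial model, of what judging an exception count against a target failure rate (rather than against an expectation of zero) might look like. The sample size, exception count, and 5% target are all hypothetical.

```python
# Is the observed failure rate evidence the control misses its target,
# or just the variation the design already anticipated? Binomial model assumed.
from math import comb

def prob_at_least(k: int, n: int, p: float) -> float:
    """P(X >= k) for X ~ Binomial(n, p)."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

target_failure_rate = 0.05   # failure rate the control portfolio was designed to tolerate
sample_size = 60             # control executions tested
failures_observed = 5        # exceptions found

p_value = prob_at_least(failures_observed, sample_size, target_failure_rate)
print(f"Observed failure rate: {failures_observed / sample_size:.1%}")
print(f"Chance of {failures_observed}+ exceptions if the true rate is "
      f"{target_failure_rate:.0%}: {p_value:.1%}")
# Only when that chance is small does the exception count point to a real
# deficiency rather than the failure rate the design already tolerates.
```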

Recognize Human Behavior: There is a reason COSO created the “Control Environment” category: it is a root cause of failure. Over 50% of reported deficiencies under SOX are related to the control environment. Your control portfolio must recognize and enroll the human behavior needed for success.

Design Control Portfolios for Failure: Controls work in combination. Assess the effectiveness of the entire portfolio, not individual controls. My experience suggests that the ratio of controls to risks in client organizations is about 5:1. That ratio should be reversed.
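
A minimal sketch of that portfolio view, assuming (unrealistically) that the controls detect failures independently of one another; all the rates below are hypothetical.

```python
# The residual failure rate of the whole portfolio is what matters,
# not whether each individual control was performed. Independence is assumed.
def residual_failure_rate(inherent_rate: float, detection_probs: list[float]) -> float:
    """Failure rate that slips past every control in the portfolio."""
    miss_all = 1.0
    for p in detection_probs:
        miss_all *= (1 - p)
    return inherent_rate * miss_all

inherent_rate = 0.02                               # hypothetical failure rate with no controls
one_strong_control = [0.90]                        # hypothetical detection probabilities
three_complementary_controls = [0.70, 0.60, 0.50]

print(f"One strong control: {residual_failure_rate(inherent_rate, one_strong_control):.3%}")
print(f"Portfolio of three: {residual_failure_rate(inherent_rate, three_complementary_controls):.3%}")
```

The number to assess is the portfolio’s residual failure rate against the outcome, not the performance of any single control.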

Add Risk Management Tools to your Toolkit: Learn how to apply the quantitative analytical tools of risk management professionals. Predict failure. Model control portfolios for effectiveness. Drive efficiencies and effectiveness into internal control.
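
As a starting point, here is a tiny Monte Carlo sketch of the kind of failure prediction those tools make routine. The distribution, volumes, and rates are all hypothetical assumptions; the point is simply that a failure rate can be predicted rather than assumed to be zero.

```python
# Predict a distribution of control failures instead of assuming none.
# Every parameter and distribution here is a hypothetical assumption.
import random

random.seed(1)
trials = 10_000
failures_per_year = []

for _ in range(trials):
    compliance = random.triangular(0.70, 0.95, 0.85)  # uncertain compliance rate
    transactions = 5_000                               # assumed annual volume
    failure_given_skip = 0.01                          # assumed failure rate when the control is skipped
    failures_per_year.append(transactions * (1 - compliance) * failure_given_skip)

failures_per_year.sort()
print(f"Median predicted failures per year: {failures_per_year[trials // 2]:.1f}")
print(f"90th percentile:                    {failures_per_year[int(trials * 0.9)]:.1f}")
```

With a predicted distribution in hand, target failure rates and complementary treatments can be designed deliberately instead of being discovered later as “deficiencies”.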

Check out my web site at https://riskrevisionist.com/

In the words of Russell Ackoff:

The righter we do the wrong thing, the wronger we become. When we make a mistake doing the wrong thing and correct it, we become wronger. When we make a mistake doing the right thing and correct it, we become righter. Therefore, it is better to do the right thing wrong than the wrong thing right.

Published by Bruce McCuaig

I'm interested in all aspects of risk and compliance management. I want to make it work for business executives, the practitioner community and the business.
