Audit of Departmental Performance Measurement

February 2012

Executive Summary

The audit of Departmental Performance Measurement was identified in Western Economic Diversification Canada's approved 2011-2014 Risk-Based Audit Plan.

The purpose of the audit was to determine how well the department has designed and implemented controls over the collection, reporting, and use of departmental performance information. Collecting and reporting on department-wide results using reliable data is critical to support planning decisions. This audit built upon the results of the internal audit on project performance information that was completed in November 2011.

This audit examined the relevant processes and systems that were in place for fiscal year 2010-11, while recognizing the changes that have been in progress since that period.

Findings and conclusions

Overall, the department has an adequate framework in place to accumulate performance information on a department-wide basis. The department is currently reviewing its performance measurement framework with the intent of streamlining it and reducing the number of its performance measures.

The department can enhance how it documents and follows up on identified variances from targets. In addition, the department should improve the timeliness of its collection of final reports from recipients so that the most current results are reflected in the annual Departmental Performance Report.

Statement of Assurance

In my professional judgement as Chief Audit Executive, sufficient and appropriate audit procedures have been conducted and evidence gathered to support the accuracy of conclusions reached.

The assurance is based on a comparison of the conditions, as they existed at the time, against pre-established audit criteria that were agreed upon with management. The assurance is applicable to the policies and processes examined. The evidence was gathered in accordance with Treasury Board policies, directives, and standards on internal audit for the Government of Canada. Sufficient evidence was gathered to provide senior management with proof of the conclusions derived from this audit.

Donald MacDonald
Chief Audit Executive

Introduction

Background

Federal departments need to demonstrate measurable results for their spending and investments. The department collects and accumulates performance information to demonstrate that departmental strategic results are achieved. This audit focused on performance measurement at the departmental level, building on a previous audit that focused on project performance information. The two audits were coordinated to avoid overlap, and the results of the audit on Project Performance Information were integrated into the planning of this audit.

Audit Objectives

The objective of this engagement was to assess the processes and internal controls over:

  • The development of departmental performance measures and targets, and
  • The reporting on and use of performance results.

Key Risks

Management designs controls to mitigate risks. The audit was intended to provide assurance on the existence and effectiveness of management's controls to mitigate the following risks:

  • Targets are not properly determined;
  • Performance information is incorrectly aggregated and reported;
  • Performance information is not correct or reliable;
  • Indicators are interpreted inconsistently; and
  • Performance information is not factored into decision making.

Scope and Methodology

The audit examined departmental performance measurement as it was in place for fiscal year 2010-11. It included the processes for establishing targets and the subsequent accumulation and reporting of departmental performance information.

The audit gave consideration to the previous audit of project performance information, and the results and findings from that audit were incorporated into the planning of this audit.

Audit work was conducted through interviews and reviews of documentation and databases.

Acknowledgements

We would like to thank the management and staff of headquarters and the regions contacted for the timely cooperation and assistance they provided to the audit team throughout the engagement.

Observations and Recommendations

Target Setting Process

Criteria: Performance expectations are clear and concrete. A process is in place to establish indicators and targets.

The 2010 Treasury Board Policy on Management, Resources and Results Structures requires each department to identify strategic outcomes, to develop a Program Activity Architecture, and to have an appropriate governance structure for decision making. The department developed a Program Activity Architecture (PAA) as a framework to link project results with departmental results, and the PAA forms the basis for the departmental performance measurement framework. The current PAA is made up of five program activities and 21 sub-activities. In total, the current PAA has 108 indicators: 70 outcome measures and 38 output measures. Targets are established for all 108 measures and subsequently reported on through the Management Accountability Framework assessment process and the Departmental Performance Report. The department is currently reviewing the PAA and the Performance Measurement Framework to better reflect the department's current activities and results. In the audit of Project Performance Information, we recommended that the department streamline its performance measurement framework and reduce the number of indicators; the current review of the PAA and Performance Measurement Framework is moving in that direction.

Each year, output and outcome targets are set and presented to the departmental Executive Committee for approval. At mid-year, third quarter and year end, results against targets are reported to the Executive Committee. The Departmental Performance Report forms the basis of reporting the previous year's results to Parliament and Canadians. In some cases, reporting against macro-economic indicators in the Departmental Performance Report was either not possible because data was not available, or the department had to rely on data that was two or more years old. In addition, future targets were often set before the previous year's results had been determined, adding to the challenge of setting an achievable target.

The current PAA and performance measurement framework are cumbersome, with too many sub-activities and indicators. In addition to the PAA indicators, project officers collect and report on indicators that are unique to specific projects. Officers use these indicators to monitor the progress of a project; however, they cannot be rolled up for departmental performance reporting. As a result, much of the information collected cannot be used in departmental performance reporting. The department needs to focus on indicators that accurately represent results; that are simple, valid, reliable, and measurable; and that allow the department to tell its performance story.

It is much easier to measure outputs than outcomes. A challenge for the department is to find a way to measure outcomes that are attributable to a project and that can be used for departmental reporting. Performance measures at the strategic outcome and program activity levels are often general and subject to external influences that the department does not control. It is often easier to measure what is collectible than what is really important.

Results reported against targets often either significantly exceeded or fell well short of the target. Although variances were identified, there was little analysis of the reasons for them, and action plans for significant variances were not provided. In addition, variances from targets were to be considered when setting targets for the following period; however, there was little evidence that trends in results over time were examined. To get the most benefit from variance reporting, the department should develop corrective actions in response to significant variances.

Recommendation # 1: The department should document reasons for significant variances and develop action plans in response to those significant variances.

Accuracy of Results Reporting

Criteria: Results are reported against expectations. Performance information is complete, and supported by reliable sources and data quality.

Although not required, the department includes cumulative performance results at the sub-activity level in its annual Departmental Performance Report to Parliament. In addition, the department reports annual incremental performance results to the Treasury Board Secretariat for the Management Accountability Framework assessment process. Having two distinct and separate annual reporting requirements adds a significant reporting burden, but is undertaken to add clarity to performance results.

We reviewed 48 projects to examine when results were recorded. We noted that the date an outcome was recorded as realized was based on when the final report was entered into the departmental reporting system, not on when the outcome actually occurred. It often took up to three months before results were entered into the system. Of the 48 projects selected, 38 (or 79%) had final results that were reported in a different fiscal year than targeted. Significant administrative delays affect the timeliness of reporting on results.
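
The fiscal-year test described above amounts to a date comparison between the period a result was targeted for and the period it was entered into the system. The following is a minimal illustrative sketch only, using hypothetical record fields and the Government of Canada's April-to-March fiscal year; it does not reflect the department's actual reporting system:

    from datetime import date

    def fiscal_year(d: date) -> str:
        """Return the April-to-March fiscal year label for a date, e.g. "2010-11"."""
        start = d.year if d.month >= 4 else d.year - 1
        return f"{start}-{str(start + 1)[-2:]}"

    # Hypothetical sample records: (date targeted for reporting, date entered in system)
    projects = [
        (date(2011, 3, 15), date(2011, 6, 20)),  # entered in the following fiscal year
        (date(2010, 11, 1), date(2011, 1, 12)),  # entered in the targeted fiscal year
    ]

    mismatched = sum(
        1 for targeted, entered in projects
        if fiscal_year(targeted) != fiscal_year(entered)
    )
    print(f"{mismatched} of {len(projects)} projects reported in a different fiscal year than targeted")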

We examined 68 projects to verify that clients were submitting final reports when due. Of the 68 reports reviewed, 39 (or 57%) were submitted after the final client reporting date agreed to in the contribution agreement. When final reports are submitted or uploaded late, performance may be reported in a period other than the one targeted. Of the 39 late reports, 36% were submitted one month late, 28% two months late, and 36% between three and six months late. One final report in the sample was submitted two years after the final client reporting date. These delays adversely affect the accuracy of departmental performance reporting. The department needs to better enforce client reporting dates to ensure that performance is reported against targets in the period anticipated.
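
The lateness distribution above can be reproduced by bucketing each late report's delay in months. The sketch below is illustrative only, using invented due and submission dates rather than actual audit data:

    from datetime import date

    def months_late(due: date, submitted: date) -> int:
        """Whole calendar months between the agreed reporting date and actual submission."""
        return (submitted.year - due.year) * 12 + (submitted.month - due.month)

    # Hypothetical late submissions: (due date per contribution agreement, actual date)
    late_reports = [
        (date(2011, 3, 31), date(2011, 4, 30)),  # one month late
        (date(2011, 3, 31), date(2011, 5, 31)),  # two months late
        (date(2011, 3, 31), date(2011, 8, 15)),  # five months late
    ]

    buckets = {"1 month": 0, "2 months": 0, "3-6 months": 0, "over 6 months": 0}
    for due, submitted in late_reports:
        m = months_late(due, submitted)
        if m <= 1:
            buckets["1 month"] += 1
        elif m == 2:
            buckets["2 months"] += 1
        elif m <= 6:
            buckets["3-6 months"] += 1
        else:
            buckets["over 6 months"] += 1

    total = len(late_reports)
    for label, count in buckets.items():
        print(f"{label}: {count} ({count / total:.0%})")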

Recommendation # 2: The department needs to enforce the final client reporting date to improve the timeliness and accuracy of departmental performance reporting.

Use of results information in decision making

Criteria: Use of performance information for decision making is demonstrated.

Currently, performance information forms the basis of departmental reporting to central agencies to demonstrate the results the department has achieved. Within the department, however, ongoing monitoring tends to rely more on financial (e.g., spending) indicators than on performance measures in decision making and planning. In fiscal year 2011-12, reporting to the Executive Committee was improved through a dashboard that reports progress towards non-financial targets on a quarterly basis. With the full implementation of the new Investment Strategy and changes to its corporate planning processes, the department will continue to enhance its use of and reliance on performance information when making key planning decisions.

Once the department has a more complete set of baseline performance data, it will be able to enhance its monitoring and reporting through trend analysis. This will help highlight successes and identify areas where the department could refocus resources.

Conclusion

The department has developed an adequate framework for departmental performance reporting, and improvements to the framework are ongoing. To better tell its performance story, the department should continue to streamline the framework, reduce the number of indicators to those that represent what is important to measure, and ensure that the information collected is used in decision making.

Summary Conclusions

Audit Area | Audit Assessment
Performance expectations are clear and concrete. | Criteria partially met
A process is in place to establish indicators and targets. | Criteria met
Results are reported against expectations. | Criteria met
Performance information is complete, and supported by reliable sources and data quality. | Criteria partially met
The use of performance information for decision making is demonstrated. | Criteria mostly met

Audit Strategies and Approach

Planning

Audit planning started in November 2011 and fieldwork was completed in January 2012. Pre-engagement meetings and a preliminary survey were conducted to facilitate the identification of key risks, audit criteria, control elements and audit strategies. Departmental staff were involved as necessary throughout the audit process.

Standards and Methodology

Government of Canada internal auditing standards were used throughout the planning, conducting and reporting phases of the audit. The audit was evidence-based in order to ensure the audit assurance is fully supported. All available evidence was examined and analyzed against the audit criteria to support the results. Sources of evidence included:

  • interviews and document review;
  • review and analysis of policies, background literature and management practices;
  • review of previous audits or reviews by other assurance providers;
  • analytical reviews and elaboration on the cause and effect of conditions; and
  • follow-up on previous internal audits.

Sampling

Internal audit judgementally selected a sample of 48 completed projects reporting results in 2010-11 to test the accuracy and validity of data. In addition, another 68 projects completed between March 2011 and January 2012 were judgementally selected and examined to test compliance with agreed-upon reporting dates.

Donald MacDonald, Chief Audit Executive

Kathy Locke, Audit Manager

Wilfred Dimailig, Auditor
