
Audit of Project Performance Information

September 2011

Executive Summary

The audit of Project Performance Information was identified in the Western Economic Diversification Canada’s (WD) approved 2011-2014 Risk-Based Audit Plan.

The purpose of the audit was to determine how well the department has designed and implemented controls over the collection and use of project performance information. WD needs to demonstrate measurable results for its spending and investments. To do so, WD requires reliable data at the project level. The results and observations from this audit will be integrated with another audit engagement, on departmental performance information, scheduled for later in 2011-12.

The Project Performance Information audit examined the relevant processes and systems that were operational in the most recent complete fiscal year, 2010-11.

 

Findings

Overall, the department has established a detailed framework of standards, procedures and controls for the determination, collection and use of project performance information.

Some of the challenges in the collection and use of project performance information for decision-making and reporting stem from structuring the measurement system around too many indicators and from the widespread use of unique indicators. This makes it difficult to align project results with departmental reporting, and results in some data-collection efforts that may not be necessary.

The controls around the actual collection and recording of performance data in the project management system were operating effectively, with minor exceptions relating to progress reporting. To ensure the reliability of performance information in progress reports submitted by fund recipients, a high degree of reliance is placed on regional staff’s knowledge of their clients and projects, as well as their work experience and judgment.

The department has undertaken a number of initiatives to streamline its results measurement and reporting processes.

Statement of Assurance

In my professional judgement as Chief Audit Executive, sufficient and appropriate audit procedures have been conducted and evidence gathered to support the conclusion that the department has developed a detailed framework of standards, procedures and controls for the determination, collection and use of project performance information. The framework, however, should be streamlined and better integrated for ease of application and improved reporting.

The assurance is based on a comparison of the conditions, as they existed at the time, against pre-established audit criteria that were agreed upon with management. The assurance is applicable to the policies, processes and controls examined. The evidence was gathered in accordance with Treasury Board policies, directives, and standards on internal audit for the Government of Canada. Sufficient evidence was gathered to provide senior management with proof of the conclusions derived from this audit.

Donald MacDonald
Chief Audit Executive

Introduction

Western Economic Diversification Canada needs to demonstrate measurable results for its spending and investments. To do so, the department collects performance or results information for the projects it funds through grants and contributions. The project performance data collected is also used to demonstrate achievement of overall departmental results.

This audit focused on the results information at the individual project level.  Another audit planned for later in 2011-12 will focus on departmental performance information. The results of the current audit will be integrated into the planning of the audit of departmental performance information, as project results should provide the basis for reporting on the performance of the department as a whole.

Audit Objective

The objective of this engagement was to determine how well the department has designed and implemented controls over:

  • the selection, collection and validation of project performance information; and
  • the use of performance information for decision-making and results reporting.

In assessing the quality of project performance information, the audit assessed the controls in place to ensure the relevance, reliability and timeliness of the information.

Key Risks

Management designs controls to manage risks. The audit examined the processes and controls management has designed to mitigate the following key risks, and assessed whether those controls were in place and operating effectively:

  • performance indicators selected do not properly link to results and outcomes;
  • too many performance indicators are selected;
  • errors and delays in the collection and recording of project performance data; and
  • limited or inconsistent use of project performance information in decision-making.

Audit Scope

The audit examined the project performance information processes and systems currently in place, and used in the most recent complete fiscal year (2010-11).

The audit looked at all direct grant and contribution programs, as project data processes may vary from program to program. The audit did not look at overall departmental performance information, as noted earlier.

Acknowledgements

The auditors would like to thank management and staff of headquarters and the regions for the timely cooperation and assistance provided to the audit team throughout the engagement.

Observations and Recommendations

Selection and Use of Project Performance Indicators and Information

Criteria:  A sound process is in place for selecting appropriate project performance measures that are aligned with departmental objectives.

Criteria:  Project performance information is used for decision-making and results reporting.

The department has established, based on its Program Activity Architecture (PAA), a detailed framework for project officers to determine how project results are to be measured.  The framework identifies core program activities, sub-activities and standard performance indicators (commonly referred to as PAA indicators), which collectively form the basis for project and departmental performance and accountability reporting.

The current PAA identifies 68 indicators that may be used for measuring performance of projects under the department’s 4 program activities and 18 sub-activities. The large number of quantitative indicators would suggest a high degree of differentiation and precision of outputs across the department’s activities. However, project officers we interviewed expressed concerns that many indicators were too broad or ambiguous, and subject to differing interpretations by officers and proponents in measuring project results.

The PAA guidelines currently restrict project officers to aligning each project to a single sub-activity and suggest selecting between one and four PAA indicators that fall within the chosen sub-activity. In addition to the 68 standard PAA indicators, the department also permits project officers to use other unique indicators to help them better define outcomes for the diversity of projects that the department undertakes. These unique indicators are not considered mandatory, and the associated results information is not rolled up for departmental reporting.

In our review of 21 project files, we found that project officers generally selected one to two standard PAA indicators per project. In the sample tested, 35 unique indicators were created, compared with 26 standard PAA indicators selected. The widespread use of unique indicators is partly a response to the view held by project officers that the PAA framework is still evolving and not yet fully reflective of what truly measures their projects’ performance, and partly an attempt to address concerns about the ambiguity of the current indicators. Regional employees also consider unique indicators key in demonstrating success stories related to their projects. Not aligning a project to the most appropriate sub-activity also resulted in some PAA indicators being classified as unique, because they fell outside the sub-activity to which the project was aligned.

The choice of indicators and targets is an iterative process involving regional staff and project proponents. Project officers also consider factors such as proponents’ capacity and sophistication in tracking and reporting project results, as well as the burden of reporting. Notwithstanding the managerial review and approval of project indicators and targets, there is some risk that the choice of indicators and targets could be overly influenced by considerations other than those that are most relevant for results measurement and accountability. Project officers also find target setting difficult when projects involve services, span a number of years, or comprise several development and completion stages. More guidance and training would be helpful.

On the one hand, the formal system of performance reporting at the departmental level is built around the PAA and approved standard indicators. On the other hand, officers who are involved in the day-to-day delivery of the department’s programs put a fair amount of emphasis on unique indicators for measuring project results. The resulting divergence of business practices at the operational level and the departmental level should be narrowed to facilitate a stronger alignment of project results with departmental reporting, and to focus data-collection efforts on what the department considers necessary for its operations and reporting. Gathering and processing information on unique indicators, in addition to collecting data on an already large suite of PAA indicators, consumes significant resources.

The Treasury Board of Canada Secretariat’s guide to developing performance measurement strategies points out that "successful implementation of the Performance Measurement Strategy is more likely if indicators are kept to a reasonable number" 1. While many things may be measurable, that does not necessarily make them key to the department’s success. A smaller number of indicators would also help keep the focus on factors that are essential to the department.  The Auditor General of Canada has also observed that "over-complexity of a performance measurement system will lead to implementation problems and will simply frustrate stakeholders.  The easier it is to use and apply, the more likely the stakeholders will adopt and embrace the new approach" 2.

Over the past couple of years, the department has undertaken several initiatives to review and modify its PAA and performance measurement frameworks.  It recently issued a desktop reference guide, Performance Measurement: Grants and Contributions, to help project officers apply the PAA framework more consistently.  A working group with headquarters and regional representation has also been formed this year to examine challenges, potential solutions and improvements in the measurement and reporting of performance.  Regional target setting and the roll-up of regional information are also intended to help better link performance information to decision-making and results reporting.

Recommendation # 1:  The department should continue to streamline its framework for the collection and use of project performance information by:

  • ensuring stronger alignment of project results through a clearer and integrated set of performance measures;
  • limiting the number of performance measures; and
  • providing regular training to staff on the framework and its application.

Collection, Review and Recording of Project Performance Information

Criteria:  Appropriate controls are in place for collecting performance data on a timely basis, ensuring its reliability, and recording it accurately.

Recipients are required to report progress of their projects against the timeline of activity and performance indicators identified in the contribution agreement with the department. The standard requirements established by the department are that progress reporting is to be "a minimum of twice a year," whether or not a claim is submitted. No payment is to be made without the receipt of a progress report. The project management system has a bring-forward alert and reporting capability for overdue progress reports.
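
As a purely illustrative sketch (the audit did not examine the system’s program code, and the function and field names below are hypothetical), bring-forward logic for overdue progress reports might take the following form:

    from datetime import date, timedelta

    # Illustrative sketch only: flag active projects whose most recent progress
    # report is older than roughly six months, reflecting the "minimum of twice
    # a year" standard. Field names ("status", "last_progress_report") are hypothetical.
    REPORTING_INTERVAL = timedelta(days=183)

    def overdue_progress_reports(projects, today):
        """Return active projects that are due for a bring-forward alert."""
        return [p for p in projects
                if p["status"] == "active"
                and today - p["last_progress_report"] > REPORTING_INTERVAL]

    # Example: a project that last reported in October 2010 would be flagged in mid-2011.
    sample = [{"id": "P-001", "status": "active",
               "last_progress_report": date(2010, 10, 15)}]
    print(overdue_progress_reports(sample, today=date(2011, 6, 30)))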

While most project officers are aware of the above standards, the requirements are not always complied with. Of the 21 project files we tested, there were three instances where progress reporting fell short of the standard requirement of two reports per year, and another three instances where progress reporting was less frequent than claim submission. In all cases, however, final progress reports had been documented.

Project officers review performance information contained in progress reports for reasonableness and against the requirements set by the contribution agreement. Managers are also required to periodically review performance data. If there are obvious anomalies, inconsistencies with prior reports, apparent errors or performance issues, project officers would normally seek clarification from the client. Under the departmental policy, a project is not necessarily found to be in default for not achieving the anticipated results.

Site visits and spot checks are additional tools that project officers may use at their discretion.  In the main, however, reliance is placed on the officers’ knowledge of their clients and projects, and their experience and judgment. The actual steps undertaken to review the reliability of performance data on progress reports may therefore vary from one project officer to another.

The key results information in progress reports submitted by fund recipients was properly recorded in most of the projects sampled. Some minor exceptions were noted where explanations for differences between the results reported by clients and those entered in the management system by project officers were not readily apparent.

The project management system has built-in edit checks for entering data in text, date and numeric fields. Most importantly, an error alert is displayed if performance results are not entered in numerical terms. These checks help prevent input errors. A system control also ensures that all key sections of the project officer’s final report in the system are completed before he or she can conclude on a project. Back-up procedures for performance data stored on the system were found to be appropriate.
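
The following is a purely illustrative sketch of the kinds of edit and completeness checks described above; it does not represent the department’s actual system, and the field and section names are hypothetical:

    # Illustrative sketch only: numeric edit check on performance results and a
    # completeness check on the project officer's final report. Section names are hypothetical.
    REQUIRED_FINAL_SECTIONS = ("results_achieved", "officer_assessment", "lessons_learned")

    def validate_performance_result(value):
        """Raise an error alert if a performance result is not entered in numerical terms."""
        try:
            return float(value)
        except (TypeError, ValueError):
            raise ValueError(f"Performance result {value!r} must be entered as a number")

    def can_conclude_project(final_report):
        """Permit conclusion of a project only when all key final-report sections are completed."""
        return all(final_report.get(section) for section in REQUIRED_FINAL_SECTIONS)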

_____________________________

1  Supporting Effective Evaluations: A Guide to Developing Performance Measurement Strategies, Treasury Board of Canada Secretariat, 2009

2  Implementing Results-Based Management: Lessons from the Literature, Auditor General of Canada, March 2000

Conclusion

The department has developed a detailed framework of standards, procedures and controls for the determination, collection and use of project performance information, but the framework should be streamlined and better integrated for ease of application and improved reporting.

Summary Conclusions

  Audit Criteria                                                         Audit Result
  Selection and Use of Project Performance Indicators and Information    Criteria Partially Met
  Collection, Review and Recording of Project Performance Information    Criteria Mostly Met

 

Audit Strategies and Approach

Planning

Audit planning started in May 2011 and fieldwork was completed in August 2011.  Departmental employees were engaged as necessary throughout the audit process to facilitate identification of key risks, audit criteria, control elements and audit strategies.

Standards and Methodology

Government of Canada internal auditing standards were used for the audit. The audit was evidence-based, and the assurance it provides is fully supported. The basis for the audit examination and expectations was communicated to and agreed to by management. The evidence was gathered through risk analysis, interviews, system and process documentation reviews, project file examinations, and project management system data reviews.

Sampling

Internal audit used the IDEA sampling software to randomly select 21 project files from the project management system for testing and confirming processes and controls. The sampling methodology and size ensured that all regions and significant programs were represented in our sample.
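
As a purely illustrative sketch, a stratified random selection of this kind could be expressed as follows; this does not represent the IDEA software’s interface, and the field names are hypothetical:

    import random
    from collections import defaultdict

    # Illustrative sketch only: randomly select project files while ensuring every
    # region/program combination is represented in the sample.
    def stratified_sample(project_files, sample_size, seed=2011):
        rng = random.Random(seed)
        strata = defaultdict(list)
        for f in project_files:
            strata[(f["region"], f["program"])].append(f)
        # Draw one file from each stratum first, then top up the sample at random.
        selected = [rng.choice(group) for group in strata.values()]
        remaining = [f for f in project_files if f not in selected]
        selected += rng.sample(remaining, max(0, sample_size - len(selected)))
        return selected[:sample_size]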

Audit Team

  Name                     Title
  Donald MacDonald         Chief Audit Executive
  Hemendra Shah            Audit Principal
  Kathy Locke              Audit Manager
  Christine Kasianiuk      Auditor

 
