Western Economic Diversification Canada

Observations and Recommendations

Selection and Use of Project Performance Indicators and Information

Criteria:  A sound process is in place for selecting appropriate project performance measures that are aligned with departmental objectives.

Criteria:  Project performance information is used for decision-making and results reporting.

Based on its Program Activity Architecture (PAA), the department has established a detailed framework for project officers to determine how project results are to be measured. The framework identifies core program activities, sub-activities and standard performance indicators (commonly referred to as PAA indicators), which collectively form the basis for project and departmental performance and accountability reporting.

The current PAA identifies 68 indicators that may be used to measure the performance of projects under the department's four program activities and 18 sub-activities. The large number of quantitative indicators would suggest a high degree of differentiation and precision of outputs across the department's activities. However, project officers we interviewed expressed concerns that many indicators were too broad or ambiguous, and subject to differing interpretations by officers and proponents in measuring project results.

The PAA guidelines currently require project officers to align each project to a single sub-activity and suggest selecting between one and four PAA indicators that fall within the chosen sub-activity. In addition to the 68 standard PAA indicators, the department also permits project officers to use other, unique indicators to help them better define outcomes for the diversity of projects that the department undertakes. These unique indicators are not considered mandatory, and the associated results information is not rolled up for departmental reporting.
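To illustrate, the alignment and selection rules can be pictured as a simple validation step. The following Python sketch is illustrative only; the class and field names (SubActivity, Project, validate) are assumptions made for this example, not elements of the department's actual project management system.

    # Illustrative sketch only; names and structures are assumed, not the
    # department's actual system.
    from dataclasses import dataclass, field

    @dataclass
    class SubActivity:
        name: str
        paa_indicators: set[str]  # standard PAA indicators defined for this sub-activity

    @dataclass
    class Project:
        title: str
        sub_activity: SubActivity  # each project is aligned to a single sub-activity
        paa_indicators: list[str] = field(default_factory=list)
        unique_indicators: list[str] = field(default_factory=list)  # not rolled up

        def validate(self) -> None:
            # Guidelines suggest selecting between one and four standard indicators.
            if not 1 <= len(self.paa_indicators) <= 4:
                raise ValueError("select between one and four PAA indicators")
            # Standard indicators must fall within the aligned sub-activity;
            # any others would have to be recorded as unique indicators instead.
            outside = set(self.paa_indicators) - self.sub_activity.paa_indicators
            if outside:
                raise ValueError(f"indicators outside the aligned sub-activity: {outside}")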

In our review of 21 project files, we found that project officers generally selected one or two standard PAA indicators per project. In the sample tested, 35 unique indicators were created, compared with 26 standard PAA indicators selected. The widespread use of unique indicators is partly a response to the view held by project officers that the PAA framework is still evolving and does not yet fully reflect what truly measures their projects' performance, and partly an attempt to address concerns about the ambiguity of the current indicators. Regional employees also consider unique indicators key to demonstrating success stories related to their projects. Not aligning a project to the most appropriate sub-activity also resulted in some PAA indicators being classified as unique, because they fell outside the sub-activity to which the project was aligned.

The choice of indicators and targets is an iterative process involving regional staff and project proponents. Project officers also consider factors such as proponents' capacity and sophistication in tracking and reporting on project results, as well as the burden of reporting. Notwithstanding the managerial review and approval of project indicators and targets, there is some risk that the choice could be overly influenced by considerations other than those most relevant for results measurement and accountability. Project officers also find target setting difficult when projects involve services, span a number of years, or comprise several development and completion stages. More guidance and training would be helpful.

On the one hand, the formal system of performance reporting at the departmental level is built around the PAA and approved standard indicators. On the other hand, officers involved in the day-to-day delivery of the department's programs place considerable emphasis on unique indicators for measuring project results. The resulting divergence of business practices between the operational and departmental levels should be narrowed to better align project results with departmental reporting, and to focus data-collection efforts on what the department considers necessary for its operations and reporting. Gathering and processing information on unique indicators, in addition to collecting data on an already large suite of PAA indicators, consumes significant resources.

The Treasury Board of Canada Secretariat's guide to developing performance measurement strategies points out that "successful implementation of the Performance Measurement Strategy is more likely if indicators are kept to a reasonable number" 1. While many things may be measurable, that does not necessarily make them key to the department's success. A smaller number of indicators would also help keep the focus on factors that are essential to the department. The Auditor General of Canada has similarly observed that "over-complexity of a performance measurement system will lead to implementation problems and will simply frustrate stakeholders. The easier it is to use and apply, the more likely the stakeholders will adopt and embrace the new approach" 2.

Over the past couple of years, the department has undertaken several initiatives to review and modify its PAA and performance measurement frameworks. It recently issued a desktop reference guide, Performance Measurement: Grants and Contributions, to help project officers apply the PAA framework more consistently. A working group with headquarters and regional representation was also formed this year to examine challenges, potential solutions and improvements in the measurement and reporting of performance. Regional target setting and the roll-up of regional information are also intended to help better link performance information to decision-making and results reporting.

Recommendation #1:  The department should continue to streamline its framework for the collection and use of project performance information by:

  • ensuring stronger alignment of project results through a clearer and integrated set of performance measures;
  • limiting the number of performance measures; and
  • providing regular training to staff on the framework and its application.

Collection, Review and Recording of Project Performance Information

Criteria:  Appropriate controls are in place for collecting performance data on a timely basis, ensuring its reliability, and recording it accurately.

Recipients are required to report progress of their projects against the timeline of activity and performance indicators identified in the contribution agreement with the department. The standard requirements established by the department are that progress reporting is to be "a minimum of twice a year," whether or not a claim is submitted. No payment is to be made without the receipt of a progress report. The project management system has a bring-forward alert and reporting capability for overdue progress reports.
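For illustration only, the reporting controls described above might be expressed as simple checks such as those in the following Python sketch; the six-month interval, function names and data shapes are assumptions made for this example, not the department's system.

    from datetime import date, timedelta

    REPORTING_INTERVAL = timedelta(days=183)  # approximates "a minimum of twice a year"

    def report_overdue(last_report: date | None, today: date) -> bool:
        # Bring-forward check: flag a project whose progress report is overdue.
        return last_report is None or today - last_report > REPORTING_INTERVAL

    def can_pay_claim(progress_report_received: bool) -> bool:
        # No payment is to be made without receipt of a progress report.
        return progress_report_received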

While most project officers are aware of the above standards, the requirements are not always complied with. Of the 21 project files we tested, there were three instances where progress reporting fell short of the standard requirement of two reports per year, and another three instances where progress reporting was less frequent than claim submission. In all cases, however, final progress reports had been documented.

Project officers review performance information contained in progress reports for reasonableness and against the requirements set by the contribution agreement. Managers are also required to periodically review performance data. If there are obvious anomalies, inconsistencies with prior reports, apparent errors or performance issues, project officers would normally seek clarification from the client. Under the departmental policy, a project is not necessarily found to be in default for not achieving the anticipated results. 

Site visits and spot checks are another tool that project officers may use at their discretion.  In the main, however, reliance is placed on the officers’ knowledge of their clients and projects, and their experience and judgment. The actual steps undertaken to review the reliability of performance data on progress reports may therefore vary from one project officer to another. 

The key results information in progress reports submitted by fund recipients was properly recorded in most of the projects sampled. Some minor exceptions were noted where explanations for differences between the results reported by clients and those entered in the management system by project officers were not readily apparent.

The project management system has built-in edit checks for entering data in text, date and numeric fields. Most importantly, an error alert is displayed if performance results are not entered in numerical terms. These checks help prevent input errors. A system control also ensures that all key sections of the project officer’s final report in the system are completed before he or she can conclude on a project. Back-up procedures for performance data stored on the system were found to be appropriate. 
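As an illustration of edit checks of this kind, the following Python sketch shows a numeric-field check and a completion check; the function names and report structure are assumptions made for this example, not the actual system's design.

    def check_performance_result(value: str) -> float:
        # Display an error alert if a performance result is not entered numerically.
        try:
            return float(value)
        except ValueError:
            raise ValueError(f"performance results must be numeric: {value!r}") from None

    def can_conclude_project(final_report: dict[str, str],
                             required_sections: tuple[str, ...]) -> bool:
        # All key sections of the final report must be completed before
        # the project officer can conclude on the project.
        return all(final_report.get(section, "").strip() for section in required_sections)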

_____________________________

1  Supporting Effective Evaluations: A Guide to Developing Performance Measurement Strategies, Treasury Board of Canada Secretariat, 2009

2  Implementing Results-Based Management: Lessons from the Literature, Auditor General of Canada, March 2000