Western Economic Diversification Canada

Observations and Recommendations

Target Setting Process

Criteria: Performance expectations are clear and concrete. A process is in place to establish indicators and targets.

The 2010 Treasury Board Policy on Management, Resources and Results Structures requires each department to identify strategic outcomes, develop a Program Activity Architecture, and have an appropriate governance structure for decision making. The department developed a Program Activity Architecture (PAA) as a framework to link project results with departmental results. The PAA forms the basis for the departmental performance measurement framework. The current PAA comprises five program activities and 21 sub-activities, with 108 indicators in total: 70 outcome measures and 38 output measures. Targets are established for all 108 measures and subsequently reported on through the Management Accountability Framework assessment process and the Departmental Performance Report. The department is currently reviewing the PAA and the Performance Measurement Framework to better reflect its current activities and results. In the audit of Project Performance Information, we recommended that the department streamline its performance measurement framework and reduce the number of indicators. The current review of the PAA and Performance Measurement Framework is moving in that direction.

Each year, output and outcome targets are set and presented to the departmental Executive Committee for approval. At mid-year, third quarter and year end, results against targets are reported to the Executive Committee. The Departmental Performance Report forms the basis for reporting the previous year's results to Parliament and Canadians. In some cases, reporting against macro-economic indicators in the Departmental Performance Report was not possible because data was unavailable, or the department had to rely on data that was two or more years old. In addition, future targets were often set before the previous year's results had been determined, adding to the challenge of setting an achievable target.

The current PAA and performance measurement framework are cumbersome, with too many sub-activities and indicators. In addition to the PAA indicators, project officers collect and report on indicators unique to each specific project. Officers use these indicators to monitor the progress of a project; however, they cannot be rolled up for departmental performance reporting. As a result, much of the information collected is unavailable for departmental performance reporting. The department needs to focus on indicators that accurately represent results; that are simple, valid, reliable, and measurable; and that allow the department to tell its performance story.

Outputs are much easier to measure than outcomes. A challenge for the department is to find a way to measure outcomes that are attributable to a project and can be used for departmental reporting. Performance measures at the strategic outcome and program activity level are often general and subject to external influences beyond the department's control. It is often easier to measure what is collectible than what is truly important.

Results reported against targets often either significantly exceeded or fell well short of the target. Although variances were identified, there was little analysis of their causes, and action plans for significant variances were not provided. In addition, staff were instructed to consider variances from targets when setting targets for the following period; however, there was little evidence that trends in results over time were examined. To get the most benefit from variance reporting, the department should develop corrective actions in response to significant variances.
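The screening described above can be sketched in code. The following is an illustrative sketch only: it flags indicator results whose variance from target exceeds a chosen materiality threshold, the kind of first-pass screen that could feed variance analysis and corrective action plans. All indicator names, figures, and the 20% threshold are hypothetical, not drawn from the department's actual framework.

```python
def flag_significant_variances(results, threshold=0.20):
    """Return indicators whose actual result deviates from target
    by more than `threshold` (expressed as a fraction of the target)."""
    flagged = []
    for name, target, actual in results:
        if target == 0:
            continue  # a relative variance cannot be computed
        variance = (actual - target) / target
        if abs(variance) > threshold:
            flagged.append((name, round(variance, 2)))
    return flagged

# Hypothetical indicator data: (indicator, target, actual)
sample = [
    ("jobs_created", 100, 180),   # exceeds target by 80%
    ("firms_assisted", 50, 48),   # within 20% of target, not flagged
    ("patents_filed", 20, 9),     # falls short of target by 55%
]

print(flag_significant_variances(sample))
# [('jobs_created', 0.8), ('patents_filed', -0.55)]
```

Each flagged indicator would then warrant a documented cause and, where appropriate, a corrective action plan, as Recommendation # 1 proposes.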

Recommendation # 1: The department should document reasons for significant variances and develop action plans in response to those significant variances.

Accuracy of Results Reporting

Criteria: Results are reported against expectations. Performance information is complete and supported by reliable sources and quality data.

Although not required to, the department reports cumulative performance results at the sub-activity level to Parliament annually through its Departmental Performance Report. In addition, it reports annual incremental performance results to Treasury Board Secretariat for the Management Accountability Framework assessment process. Maintaining two separate annual reporting requirements adds a significant reporting burden, but is undertaken to add clarity to performance results.

We reviewed 48 projects to examine when results were recorded. We noted that the date an outcome was recorded as realized was based on when the final report was entered into the departmental reporting system, not when the outcome actually occurred. It often took up to three months before results were entered into the system. Thirty-eight of the 48 projects selected (79%) had final results reported in a different fiscal year than targeted. Significant administrative delays affect the timeliness of results reporting.

We examined 68 projects to verify that clients were submitting final reports when due. Of the 68 reports reviewed, 39 (57%) were submitted after the final client reporting date agreed to in the contribution agreement. When final reports are submitted or uploaded late, performance may be reported in a period other than the one targeted. Of the 39 late reports, 36% were submitted one month late, 28% two months late, and 36% between three and six months late. One final report in the sample was submitted two years after the final client reporting date. These delays adversely affect the accuracy of departmental performance reporting. The department needs to better enforce client reporting dates to ensure that performance is reported against targets in the anticipated period.
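The lateness breakdown above can be reproduced from per-report delays. The following is an illustrative sketch only: the per-report delay data is hypothetical, with counts back-calculated from the audit's percentages (36%, 28% and 36% of the 39 late reports, i.e. 14, 11 and 14 reports).

```python
def lateness_breakdown(delays_in_months):
    """Bucket late reports (delay > 0) by months of delay and
    return, per bucket, the count and its share of late reports (%)."""
    late = [d for d in delays_in_months if d > 0]
    buckets = {
        "1 month": sum(1 for d in late if d == 1),
        "2 months": sum(1 for d in late if d == 2),
        "3-6 months": sum(1 for d in late if 3 <= d <= 6),
    }
    total = len(late)
    return {k: (v, round(100 * v / total)) for k, v in buckets.items()}

# Hypothetical delays for the 39 late reports in the sample
delays = [1] * 14 + [2] * 11 + [4] * 14
print(lateness_breakdown(delays))
# {'1 month': (14, 36), '2 months': (11, 28), '3-6 months': (14, 36)}
```

Tracking delays at this level of detail would let the department monitor whether enforcement of client reporting dates is shifting the distribution toward on-time submission.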

Recommendation # 2: The department needs to enforce the final client reporting date to improve the timeliness and accuracy of departmental performance reporting.

Use of results information in decision making

Criteria: Use of performance information for decision making is demonstrated.

Currently, performance information forms the basis of departmental reporting to central agencies to demonstrate results achieved. Within the department, however, ongoing monitoring tends to rely more on financial indicators (e.g., spending) than on performance measures for decision making and planning. In fiscal year 2011-12, reporting to the Executive Committee was improved by introducing a dashboard that reports progress towards non-financial targets on a quarterly basis. With the full implementation of the new Investment Strategy and changes to its corporate planning processes, the department will continue to enhance its use of, and reliance on, performance information when making key planning decisions.

Once the department has a more complete set of baseline performance data, it will be able to enhance its monitoring and reporting through trend analysis. This will help highlight successes and identify areas where the department could refocus resources.