Western Economic Diversification Canada

Performance: Demonstration of Efficiency and Economy


For the most part, the programming was well structured, but process changes could have improved its effectiveness.

For the most part, proponents believed the department was achieving its intended outcomes in an economical manner. Proponents, key informants and focus group participants commonly suggested the department could operate more economically by reducing paperwork and effort required for project development/application, project review/approval and for the accounting aspects of the reporting process. Focus group participants also questioned the purpose of the department's consultation component of the project development phase: to some proponents, it appeared to be an artificial exercise with an unclear goal.

Design of the programming

Most proponents approved of the design of the innovation programming, particularly its flexibility in comparison to other programs that have very specific and limited criteria. Some proponents felt the programming could be improved by decreasing approval times and allowing projects to be altered during their life-cycle. Approval times were somewhat long: only 24% of approvals came within three months, while 38% took more than six months (five of these took more than two years). Practices that were working well for the department and some possible improvements included:

  • The open application process and early dialogue had some advantages. Under an open application process, flexible timelines allow proponents to apply for funding at any time. Proponents consider the department's open application process a key programming success factor because it allows for early dialogue with the department to shape the project. The department could improve the effectiveness of this early dialogue by clarifying departmental priorities and focus so proponents can align their projects. One disadvantage of the open application process is that it can leave a short window between project approval and March 31, the date by which the proponent is required to have spent its annual budget. Although proponents can request a reprofiling of funds from one fiscal year to the next, this is often not possible because the budget is fully committed for the upcoming year.
  • Partnerships with co-funders had advantages and disadvantages. The department currently partners with other funders, and key informants and focus group participants saw several advantages to partnering. Partnering allows the department to benefit from the sector-specific knowledge of those partners; this provides a strong indication that a project will benefit the intended industrial sector and reduces the risk involved in funding innovation. Funding not-for-profit organizations in collaboration with co-funders brings industry stakeholders together in a non-competitive way, which creates synergies within and across sectors, moves sectors forward and permits greater coverage of industry than direct support to companies. Co-funders also provide a good proxy for the relevance of a project. There are, however, some disadvantages. It may be unreasonable to expect co-funders to support truly leading-edge innovative projects, and proponents spend time pursuing additional funds instead of undertaking their projects. To match funding, proponents must find two aligned funding sources at the same time that have similar priorities and are willing to fund the same type of project.
  • Coordination of programming could enhance regional strengths. Focus group participants felt that regional organizations that promote innovation could be coordinated with one another and that the department was well positioned to bring organizations together to coordinate priorities and build upon regional strengths.
  • Funding advances could have helped with cash flow problems. Many funding organizations provide up-front funding with year-end reconciliation. Because departmental funding is in the form of contributions rather than grants, eligible expenses are reimbursed, which can take several months and cause significant cash flow problems and hardship for proponents. The department can advance money; however, the proponent must demonstrate need, which is difficult under the department's current requirements for advances. Only 31 (15%) of the 202 projects received any advance funding, and the proportion appears to be decreasing over time, with only 10% receiving advances since 2009. This raises questions about the accessibility of advance funding.
  • Funding for ongoing operational costs. Thirteen key informants (proponents, staff, co-funders and non-recipients) indicated the department could provide ongoing funding for operational costs of program-based organizations. Focus group participants were divided on the benefit of ongoing funding: some felt it would create a dependence on departmental funding, while others felt programs need ongoing funding for sustainability and there is no other source of funding.
  • Support for not-for-profit organizations versus companies. Eight key informants (proponents, staff, experts and co-funders) believed the department should support companies directly and three of the innovation experts emphasized that the farther away the intervention is from industry, the less the impact on industry. Some focus group participants agreed that greater outcomes would be achieved if the department supported companies directly. However, direct funding to the private sector is risky and a better option would be to provide funding through, for example, not-for-profit commercialization groups because these organizations have worked with the private sector and therefore: 1) are in a position to assess the risks and "weed out" potentially unsuccessful companies; and 2) understand the significant time and effort required in coaching, mentoring and training. Some focus group participants disagreed with direct department support to companies because it would overlap with other programs and department officers would require specific training to support companies effectively.


Administrative Efficiency

As a measure of administrative efficiency, the department's total dollars spent on transfer payments for innovation ($72.8 million) was compared to total operating expenses for innovation ($5.7 million) [22]. The comparison showed that it cost the department one dollar to award and manage $12.80 in innovation transfer payments. The department's efficiency compares favourably with other departments' grants and contributions programs, which reported $3.63 [23], $6.10 [24] and $11.1 [25] in transfer payments per one dollar of cost to the department; however, these measures reflect a wide variety of factors and may not be directly comparable to the innovation programming.
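The administrative-efficiency measure above is a simple ratio of transfer payments to operating expenses. A minimal sketch of the calculation, using the figures from the report (variable names are illustrative only):

```python
# Administrative-efficiency ratio: transfer payment dollars awarded and
# managed per dollar of departmental operating cost. Figures are from the
# report; the report rounds the result to $12.80.
transfer_payments = 72.8e6    # innovation transfer payments ($)
operating_expenses = 5.7e6    # innovation operating expenses ($)

ratio = transfer_payments / operating_expenses
print(f"${ratio:.2f} in transfer payments per $1 of operating cost")
```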

The department's operating resources included 28 full time equivalent (FTE) staff in 2010-11, down from 33 FTE in 2009-10.

For the most part, proponents believed that the department was undertaking activities in an efficient manner. The two most common suggestions for improvement were: 1) improve the approval process in terms of the number of proposal iterations required, the unpredictability of approval, and the time taken to receive a decision; and 2) increase flexibility in departmental accounting and reporting processes.

On average, key informants also felt activities were undertaken in an efficient manner. Some proponents operated under the incorrect perception that the department has no programs; they stated that, because the department funds specific projects rather than delivering rigidly defined programs, it can be efficient, focused, directed and clear in what it is asking for. Those key informants who provided a high efficiency rating credited departmental staff with facilitating an efficient process by developing a strong working relationship early in the process. Other interviewees provided two reasons for a lower efficiency rating: 1) a slow approval process that lacked transparency; and 2) onerous financial or performance reporting requirements.



Although most projects required co-funders, there were three exceptions: two projects funded under the Western Economic Partnership Agreements, which allow the department to fund entire projects, and a third funded under the Western Diversification Program. The third project covered equipment costs that could not be met because of cost overruns during the construction of two facilities; the facilities were nearing completion but could not function as intended without the equipment. On average, the department funded 29% of project costs. Department funding was instrumental in attracting some funding from other sources (Table 5.1). Table 5.2 summarizes the department's contribution to total project costs and the dollars contributed by other funders for each departmental dollar invested (dollars leveraged). Although leveraging varied by program, overall each department dollar was matched by $2.50 from other contributors. A leveraging ratio of $2.50 compares favourably with the $1.44 leveraged by other innovation programming [26].

Table 5.1 Collaborative Funding (Millions of dollars), April 2007 to June 2011
Partner Funding Contribution ($, % of total cost)

| Program          | Provincial | Municipal | Other   | Total      | % of Total Cost |
|------------------|------------|-----------|---------|------------|-----------------|
| WDP              | $79 (15%)  | $88 (16%) | $1 (0%) | $196 (36%) | 67%             |
| WEPA*            | $114 (38%) | $57 (19%) | $0 (0%) | $70 (23%)  | 80%             |
| Innovation Total | $246 (23%) | $203 (19%)| $1 (0%) | $314 (29%) | 71%             |

* Includes Western Economic Partnership Agreements round 2 and round 3.


Table 5.2 Departmental Contributions (Millions of dollars) and Leveraging Ratios, April 2007 to June 2011

| Program          | Total WD Contribution (% of total cost) | Total Cost of Projects | Dollars Leveraged per WD Dollar |
|------------------|------------------------------------------|------------------------|---------------------------------|
| WDP              | $176M (33%)                              | $539M                  | $2.00                           |
| WEPA*            | $59M (20%)                               | $300M                  | $4.00                           |
| Innovation Total | $306M (29%)                              | $1,070M                | $2.50                           |

* Includes Western Economic Partnership Agreements round 2 and round 3.
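The leveraging ratios in Table 5.2 can be reproduced from the departmental contributions and total project costs: dollars leveraged per WD dollar is simply (total cost minus WD contribution) divided by the WD contribution. A short illustrative check (dollars in millions; the table rounds the results):

```python
# Leveraging ratio check for Table 5.2: other funders' dollars contributed
# per departmental (WD) dollar invested. Values are from the report,
# in millions of dollars.
projects = {
    # program: (WD contribution, total project cost)
    "WDP": (176, 539),
    "WEPA": (59, 300),
    "Innovation Total": (306, 1070),
}

for program, (wd, total) in projects.items():
    leveraged = (total - wd) / wd  # dollars from other funders per WD dollar
    print(f"{program}: ${leveraged:.2f} leveraged per WD dollar")
```

For the innovation total, (1,070 - 306) / 306 gives roughly 2.5, matching the $2.50 ratio cited in the text.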

Mapping Analysis Results

It is difficult to place the department's programming within the spectrum of all innovation programming in Canada, partly because there are approximately 500 federal and provincial programs that fund research, technology or firm development in Canada [27] and partly because there is limited evaluation information on those programs. However, the mapping analysis found significant regional variation in both the number and focus of other programming open to proponents funded by the department. In British Columbia there were four notable innovation funding organizations, focusing primarily on specific geographic areas of the province or specific technology sectors. In Alberta there were six funding organizations, with a focus on commercialization. In Saskatchewan there were three funding agencies and a strong sectoral focus on agricultural bioproducts. Manitoba has recently moved toward consolidating its provincial support for innovation.

Best Practices and Possible Improvements

Bringing together the results from all lines of evidence collected under this evaluation yields the following list of best practices and opportunities for improvement.

  • Best Practices. Practices that seem to be working well for the department and its proponents include the open application process, the department's early dialogues with proponents and the department's partnerships with co-funders.
  • Possible Improvement: Program Delivery. Focus group participants and proponents suggested opportunities to simplify program delivery to save departmental resources and reduce client stress. The department can: review its requirements for funding advances to improve accessibility to proponents; reduce the time and paperwork involved in the application, approval and accounting/reporting processes; and provide proponents with clarification on departmental programming, priorities/focus and the purpose of the consultative process.
  • Possible Improvement: Programming Focus. The department can improve its innovation programming by focusing on priority areas while maintaining some of the flexibility that has allowed it to fill funding gaps and accommodate regional variations and needs. Interviewees suggested possible areas of focus, such as whether to fund not-for-profit organizations and/or companies and whether to focus on research and/or commercialization. Commercialization likely requires a longer-term, riskier approach than the current one. In fact, in its own research, the department identified a funding gap in access to risk capital and a potential need for government involvement in technology commercialization for firms in Western Canada [28]. Other literature cautions, however, that the issue is more complicated than simply increasing access to risk capital. The innovation expert member of our steering committee highlighted research showing that commercialization is most likely to occur when there is a 3:1 ratio of private to public investment in research and development; the corresponding ratios of private to public investment in western Canada have changed little since 2002-2003, when they were 1.13:1 (British Columbia), 0.73:1 (Alberta), 0.30:1 (Saskatchewan) and 0.45:1 (Manitoba) [29].
  • Possible Improvement: Performance Measurement. The evaluation was hindered in assessing the success of projects by performance measurement problems. The most important departmental problems related to: 1) the overlap in sub-activities, which prevented analysis by sub-activity; and 2) the widespread use of unique indicators, which concealed many of the important impacts of projects from departmental decision-makers. Arguably, the best projects are those that span more than one sub-activity or activity; however, much of their impact is captured by unique indicators that cannot be aggregated for reporting purposes. The department is currently revamping its Program Activity Architecture and Performance Measurement Framework; ideally, the result will be performance information that is fully available to decision makers and reportable in performance reports.
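The gap between the 3:1 private-to-public R&D benchmark and the western provinces' reported ratios can be made concrete with simple arithmetic. A rough illustration, using the 2002-2003 figures cited above (the growth-factor framing is ours, not the report's):

```python
# How far each province's private:public R&D investment ratio falls short
# of the 3:1 benchmark associated with commercialization. Ratios are the
# 2002-2003 figures cited in the report.
benchmark = 3.0
ratios = {
    "British Columbia": 1.13,
    "Alberta": 0.73,
    "Saskatchewan": 0.30,
    "Manitoba": 0.45,
}

for province, ratio in ratios.items():
    # factor by which private investment would need to grow, holding
    # public investment constant, to reach the 3:1 benchmark
    growth_needed = benchmark / ratio
    print(f"{province}: private R&D investment would need to grow "
          f"~{growth_needed:.1f}x to reach 3:1")
```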


[22] Source: 2010/11 Financial Statements, Note 12.

[23] Source: Industry Canada. February 2011. "Final Evaluation for the Northern Ontario Development Program".

[24] Source: Department of Canadian Heritage. May 2009. "Evaluation of Canadian Arts and Heritage Sustainability Program", page 38.

[25] Source: Department of Canadian Heritage. March 2010. "Summative Evaluation of the Exchanges Canada Program", page 50.

[26] Source: Atlantic Canada Opportunities Agency, January 2010. "Impact Evaluation of the Atlantic Canada Opportunities Agency Innovation Program Sub-activity."

[27] Source: PriceWaterhouseCoopers. February 2011. "Response to R&D Review Panel Consultation Questions".

[28] Source: "Rationale for Government Involvement in Technology Commercialization in Western Canada". April 2011.

[29] Source: Alan Cornford and Stephen Murgatroyd. August 2005. "Is Innovation Working in Western Canada? Challenges and Policy Choices".