Western Economic Diversification Canada

Evaluation Approach and Methodology

This is a theory-based evaluation: program theory guided the evaluation design and analysis. The evaluation was planned as a quasi-experimental design involving a non-equivalent control group. To maximize the objectivity and relevance of the conclusions, the evaluation was guided by a steering committee comprising senior managers and an external innovation expert. Evaluators also sought feedback from program staff throughout the evaluation process.

Evaluation Study Activities

Preliminary Consultations

Preliminary consultations were conducted with regional departmental staff to develop a comprehensive list of projects, the list of key informant interviewees and the case study selections. The evaluation framework was reviewed by the evaluation steering committee and senior departmental management. These consultations also yielded some preliminary evaluation information.

Documents and Literature Review

Four groups of documents were reviewed as part of the evaluation:

  • General background documentation (e.g., Treasury Board submissions and documents describing the innovation programming, its rationale and its theory);
     
  • Departmental reports;[3]
     
  • Program and policy documentation (e.g., Departmental Performance Reports, the departmental database, project files); and
     
  • Literature on innovation programming and best practices.

File Review

Using a customized data extraction template, the evaluation team analysed all information contained in the department's databases (Project Gateway and the GX financial system) as delivered through the department's reporting tool (WD Reporting Centre). The study included all projects approved for funding between April 1, 2007 and the time the database review was completed in June 2011. A sample of nine projects[4] from the Alberta region was selected for paper file review to: 1) validate the accuracy of the database information; and 2) determine whether the paper files contained additional outcome information that would justify reviewing all paper files in all four regions. The sample contained accurate and complete information, suggesting further file review was unnecessary. This decision assumed similar file completion and accuracy standards across all four regions.

In total, 202 innovation projects, totalling $306 million in departmental funding, were approved between April 1, 2007 and June 2011. As of June 2011, 98 projects (49%) were complete or had their last claim approved. The majority of projects were approved under the Western Diversification Program (154 projects totalling $176 million in departmental funding) and its sub-component, the Western Economic Partnership Agreements (36 projects totalling $59 million in departmental funding). Approximately two thirds of projects identified either Knowledge Infrastructure (57 projects) or Technology Adoption and Commercialization (70 projects) as their sub-activity (Table 2.1).
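
To illustrate the kind of tabulation this file review produced, the sketch below summarizes a hypothetical export of the project records. It is illustrative only: the CSV layout and column names (sub_activity, status, wd_funding) are assumptions, not the actual Project Gateway, GX or Reporting Centre schema.

    # Illustrative sketch: summarize a hypothetical CSV export of project records.
    # Column names ("sub_activity", "status", "wd_funding") are assumptions.
    import csv
    from collections import defaultdict

    def summarize_projects(path):
        counts = defaultdict(int)        # projects per sub-activity
        funding = defaultdict(float)     # departmental funding per sub-activity ($)
        complete = total = 0
        with open(path, newline="", encoding="utf-8") as f:
            for row in csv.DictReader(f):
                total += 1
                counts[row["sub_activity"]] += 1
                funding[row["sub_activity"]] += float(row["wd_funding"])
                if row["status"] in ("complete", "last claim approved"):
                    complete += 1
        return counts, funding, complete, total

    counts, funding, complete, total = summarize_projects("innovation_projects.csv")
    print(f"{complete} of {total} projects ({complete / total:.0%}) complete or last claim approved")
    for sub in sorted(counts):
        print(f"{sub}: {counts[sub]} projects, ${funding[sub] / 1e6:.1f}M in departmental funding")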

In British Columbia, Alberta and Saskatchewan, innovation spending levels were higher in 2007-08 and 2010-11 than in the intervening years, while Manitoba's innovation spending decreased every year. Although each region highlighted its key sectors, or clusters, of funding focus in its regional business plan, the data show a similar distribution of projects across sectors, with the exception of life sciences (54 projects), which claimed approximately 44% of all funding. Other sectors receiving more than 5% of funding included other technologies (21% of funding), multi-sector (10% of funding), environmental technology (7% of funding) and information technologies (6% of funding).

TABLE 2.1 Number and Funding of Approved Innovation Projects by Sub-Activity,
April 2007-June 2011
($ MIL = approved departmental (WD) funding, in millions of dollars)

Sub-Activity                      TOTAL           AB            BC            MB            SK
                                  #    $ MIL     #   $ MIL     #   $ MIL     #   $ MIL     #   $ MIL
Community Innovation              17     1.3     1     0.2     4     1.1     1     0.0     1     0.0
Knowledge Infrastructure          57   136.3     2     6.5    35    25.3    13    43.8     7    60.7
Tech. Research/Development        20    24.2     2     2.0     4     2.7     4     3.1    10    16.5
Tech. Skills Development          14    14.3     2     6.9     6     1.1     5     4.7     1     1.6
Tech. Adoption/Comm.              70   106.3    29    54.5    23    21.5     4    13.0    14    17.3
Tech. Linkages                    24    24.1     6     1.2    13    19.8     4     2.3     1     0.8
TOTAL                            202   306.5    42    71.3    95    71.4    31    66.9    34    96.9

Mapping Analysis

The steering committee chose to forego a comparative analysis of similar programming because western Canada is a unique ecosystem requiring unique innovation programming; as such, there are no meaningful comparators for the innovation programming in western Canada. Instead, the analysis consisted of a "mapping" of innovation-related programs to compare the department's programming to the spectrum of programs available to western Canadian organizations pursuing innovation projects. The mapping analysis also included a summary of five reports, written within the last seven years, that examined barriers to innovation and commercialization in Canada and compared Canada's ecosystem with international innovation ecosystems.

The programs selected for the analysis were identified by proponents when asked, "What other funding organizations do you receive funds from?" The resulting mapping covered:

  1. Regional (Provincial) Programs: a summary of existing information and studies on available innovation programming in each region.
     
  2. Federal Programs: a summary of existing information and studies focusing on the four largest federal organizations and programs[5] funding research and development, as per Innovation Canada: A Call to Action.
     
  3. Report Summaries: a review of five reports[6] commissioned by the federal government within the past seven years examining the state of science and technology, innovation and commercialization in Canada (and internationally).

Interviews with Key Informants

The consultants developed and pre-tested the questionnaires and then conducted individual or group key informant interviews by telephone. In total, they completed 73 key informant interviews with 76 key informants, including:

  • 24 proponents and 10 representatives of non-recipient organizations. These participants formed the two comparison groups for the Outcome Assessment, and their selection is detailed in the Outcome Assessment section below;
     
  • 5 declined applicants: two from Saskatchewan and one from each of the other three regions. To select the sample, the list of 20 declined/unfunded applicants was stratified by region and two applicants were chosen at random from each region (a sampling sketch follows this list); of the eight applicants contacted, five agreed to participate;
     
  • 14 interviews with 17 departmental staff and management. Interviewees were those whom departmental management considered to be the key staff and managers involved in innovation programming;
     
  • 11 representatives of other funding agencies: 2 agencies in British Columbia, 4 in Alberta, 1 in Saskatchewan, 2 in Manitoba, and 2 federal agencies. This group included co-funders as well as other organizations that fund innovation projects. Interviewees were identified by departmental program managers as representatives of the co-funding agencies the department works with most frequently, or of other Regional Development Agencies with a similar mandate. In total, 12 representatives of other funding agencies were contacted for an interview but one was non-responsive; and
     
  • 9 interviews with experts in Canadian innovation. These individuals were identified during interviews and meetings as people knowledgeable about innovation in Canada who should be consulted as part of the evaluation. In total, 12 innovation experts were contacted and 9 agreed to participate.
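
The declined-applicant selection above is a straightforward stratified random draw: two applicants per region. The following minimal sketch illustrates it with hypothetical applicant records; the names and fields are placeholders, not evaluation data.

    # Minimal sketch of a stratified random draw: two declined applicants per
    # region, as described above. Applicant records are hypothetical stand-ins.
    import random
    from collections import defaultdict

    def stratified_sample(applicants, per_stratum=2, seed=None):
        rng = random.Random(seed)             # fixed seed makes the draw reproducible
        strata = defaultdict(list)
        for a in applicants:
            strata[a["region"]].append(a)
        sample = []
        for region in sorted(strata):
            group = strata[region]
            sample.extend(rng.sample(group, min(per_stratum, len(group))))
        return sample

    declined = [
        {"name": "Applicant A", "region": "BC"},
        {"name": "Applicant B", "region": "BC"},
        {"name": "Applicant C", "region": "AB"},
        {"name": "Applicant D", "region": "AB"},
        {"name": "Applicant E", "region": "SK"},
        {"name": "Applicant F", "region": "SK"},
        {"name": "Applicant G", "region": "MB"},
    ]   # placeholder stand-ins for the 20 declined/unfunded applicants
    for a in stratified_sample(declined, per_stratum=2, seed=1):
        print(a["region"], a["name"])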


Proponent Survey

There were 140 proponents associated with the 202 innovation projects within the scope of this evaluation. Of the 202 projects, 33 were funded under the Western Economic Partnership Agreements, an initiative undergoing evaluation concurrently with this one; because these proponents would be contacted for the other evaluation, they were excluded from the survey universe. The survey universe also excluded the 24 proponents selected for a key informant interview and the proponents associated with the 8 case studies, leaving 108 potential survey participants.

The survey reached 50 proponents (46%), associated with 62 projects, drawn from all four regions and all six innovation sub-activities. One project was funded under the Rick Hansen Foundation and the remainder under the Western Diversification Program. Although proponents from the other innovation funding vehicles did not participate, their performance measurement information was collected through the file review; only three of the eleven projects funded under those components were complete, all three under the Winnipeg Partnership Agreement.

The consultants developed and pre-tested the questionnaire and sent it to each proponent in the survey universe by email. Proponents could complete the survey by filling out the Word document attachment or over the phone, and each was contacted up to four times with a request to complete the survey.
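
The follow-up procedure amounts to a simple retry protocol. The sketch below illustrates it; send_survey and has_responded are hypothetical stand-ins for the manual email and phone workflow, not actual evaluation tooling.

    # Sketch of the contact protocol: up to four attempts per proponent, stopping
    # once a completed survey is recorded. send_survey and has_responded are
    # hypothetical stand-ins for the manual email/phone process.
    MAX_ATTEMPTS = 4

    def administer_survey(universe, send_survey, has_responded):
        completed, unreachable = [], []
        for proponent in universe:
            for attempt in range(1, MAX_ATTEMPTS + 1):
                send_survey(proponent, attempt)   # email with Word attachment, or phone call
                if has_responded(proponent):
                    completed.append(proponent)
                    break
            else:                                 # no response after four contacts
                unreachable.append(proponent)
        return completed, unreachable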

Case Studies

Each of the four regions was asked to suggest two of its funded innovation projects for case study. The eight projects were to be complete or near completion and to represent a range of outcomes. In total, evaluators interviewed 16 stakeholders across the eight case studies. Specifically, the eight case studies included:

  • Two projects from each region;
     
  • Projects that were indicated in the departmental database as being complete or near completion: three had first claim approved, one had final claim approved and four were complete;
     
  • A mixture of project types: three were Technology Adoption and Commercialization, two were Knowledge Infrastructure, two were Technology Research and Development and one was Technology Skills Development;
     
  • A mixture of proponent types: five were university-based, two were not-for-profit corporations, one was a not-for-profit society, and one was a not-for-profit applied research, development and testing organization;
     
  • A range of sectors: two were life sciences, four were multi-sector, one was other technology and one was environmental technology; and
     
  • A range of sizes in terms of departmental funding: three were approved for between $200,000 and $350,000, three for between $1 million and $2 million, and two for approximately $3 million.

In conducting the case studies, the evaluation team collected all background information and documentation summarizing project implementation, impact and outcomes to date. Site visits were completed for seven of the eight case studies to observe the project, interview representatives involved with the projects and review any other relevant documentation. In one case, a telephone interview was completed instead of a site visit because the proponents had closed their office in western Canada.

Outcome Assessment

The Outcome Assessment was intended to provide additional evidence on attribution and program impacts by selecting two groups for comparison: one group of proponents and one group of organizations who had not received department funding (non-recipients).

  1. Selection of 24 proponents: 81 proponents satisfied the initial selection criteria: 1) the project had received at least $100,000 in funding; 2) the project began between April 2007 and December 2009; and 3) the final report was available to provide complete performance data (a filtering sketch follows this list). The 24 participants were selected from the 81 proponents to represent all regions, sub-activities and types of organizations (academic, incubator, association). When possible, proponents with significant experience with the department (i.e., more than three projects) were chosen over less experienced proponents to maximize the chances of detecting important differences when compared to non-recipients.
     
  2. Selection of 10 non-recipients: the consultants identified an initial list of 55 non-recipients by reviewing departmental business plans and directories of associations. The list was reviewed by regional departmental officers, and 33 organizations were confirmed as non-recipients of any departmental funding. These organizations were contacted for an interview, and several were then removed from the list because they were developing a proposal, awaiting a decision on a submitted proposal, recently declined for departmental funding, or a for-profit organization. In the end, the group consisted of 10 organizations from three regions: British Columbia (4 organizations), Alberta (3 organizations) and Saskatchewan (3 organizations); none of the three organizations contacted in Manitoba agreed to an interview. The 10 selected organizations satisfied the following criteria: 1) not-for-profit organization; 2) mandate focused on economic development; 3) aligned with regional sectoral priorities (e.g., life sciences, information and communication technologies); 4) located, or with a significant presence, in the western provinces; and 5) never received funding from the department's innovation programming.
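
The initial screen in item 1 is effectively a three-condition filter over project records. A minimal sketch follows; the field names are assumptions for illustration, not the department's actual data model.

    # Sketch of the initial proponent screen from item 1: at least $100,000 in
    # departmental funding, a start date between April 2007 and December 2009,
    # and a final report on file. Field names are assumptions.
    from datetime import date

    def meets_initial_criteria(project):
        return (
            project["wd_funding"] >= 100_000
            and date(2007, 4, 1) <= project["start_date"] <= date(2009, 12, 31)
            and project["final_report_available"]
        )

    projects = [
        {"wd_funding": 250_000, "start_date": date(2008, 6, 1), "final_report_available": True},
        {"wd_funding": 80_000, "start_date": date(2008, 6, 1), "final_report_available": True},
        {"wd_funding": 500_000, "start_date": date(2010, 3, 1), "final_report_available": False},
    ]   # illustrative records; the real screen yielded 81 eligible proponents
    shortlist = [p for p in projects if meets_initial_criteria(p)]
    print(len(shortlist))   # prints 1: only the first record passes all three tests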

Focus Groups

The consultants conducted one focus group in each of the four regions in February 2012. All proponents and co-funders who were interviewed were invited to attend the focus group in their region. Departmental participants were selected by the Director General of Operations in each region. There were 48 focus group participants in total: 33 proponents, 4 co-funders and 11 departmental staff. Each focus group contained between 10 and 15 participants.

A consultant presented the field research findings at the focus groups and then facilitated group discussions. The objectives of the focus groups were to review and validate the field research findings and explore ways to enhance the effectiveness and efficiency of similar programming in the future.

Limitations of the Methodology

Project File Review: The evaluation team extracted the data for the file review using the department's reporting tool, the Reporting Centre. The Reporting Centre proved inflexible and not user-friendly; the team spent weeks verifying data, re-running reports and manually adding and removing data. The performance measurement data, in particular, required significant manipulation before it was suitable for analysis. In some instances, it was easier to extract information directly from the department's databases (Project Gateway and GX) than through the Reporting Centre.

Case Studies: Case studies are nonprobability samples of projects chosen for a specific purpose; in this evaluation, they were chosen to represent a range of outcomes. As with any interview process, the case studies may be biased by respondent experience and recall. Some respondents, for example, joined a project after it had started and could not comment on the project's early stages. Finally, there was potential for measurement error because the questionnaires were not rigorously tested for validity or reliability; the meaning of terms such as "technology clusters" and "technological capacity" was open to respondent interpretation. In acknowledgement of this potential for bias, the case studies serve as one of several lines of evidence in this evaluation.

Key Informant Interviews: Twenty of the 73 interviews were conducted with individuals not directly involved with the department, such as experts, representatives of non-recipient organizations, and representatives of other Regional Development Agencies. The level of knowledge and understanding of the department's innovation programming varied among these interviewees, and some could not address every question.

Proponent Interviews and Surveys: Some proponents were difficult to locate because they had left the organization they were with when the departmental funding was received. In these instances, the organization was asked for forwarding contact information or for the contact information of someone currently within the organization who had taken over the project and could speak to it. Other proponents were difficult to locate because their contact information was out of date; in these instances, current contact information was obtained from the organization or its website. Finally, some proponents felt it was too early in their project to rate the achievement of outcomes (particularly immediate outcomes); these proponents were asked instead to predict their project's longer-term outcomes.

Focus Groups: The focus group participants were asked to validate and interpret evaluation findings. The nature of focus groups implies that the main comments do not necessarily reflect the opinions of all participants on an issue.

Outcome Assessment: The objective of the outcome assessment was to compare the success of projects funded by the department with that of projects undertaken by non-recipient organizations. The two groups should therefore be as similar as possible on criteria such as eligibility for departmental funding. Eligibility was based on the department's risk assessment; because the risk assessment was not applied to non-recipients, the capacity and sophistication of non-recipients may not match that of funded proponents. Comparable performance data were often unavailable for non-recipient and funded projects, impeding comparison of performance between the two groups. The small size of the non-recipient sample and the lack of participation from Manitoba further limited the analysis.

Attribution: Determining the value added by the department's innovation programming is challenging over the long term because outcomes such as a stronger knowledge-based economy are the result of many factors working together. This evaluation uses contribution-focused analysis to infer WD's role in achieving strategic outcomes leading to developing and diversifying the western Canadian economy.

 


[3] Several reports informed this evaluation including: 1) Impact Assessment of the Technology Adoption and Commercialization and Knowledge Infrastructure Sub-Activities of the Innovation Component of the Western Diversification Program. Ference Weicker & Company (2009). Accessed at: http://www.wd.gc.ca/images/cont/11987-eng.pdf; 2) Impact Study of WD’s Investments in Western Canada’s life sciences cluster. PricewaterhouseCoopers (2007). Accessed at: http://www.wd.gc.ca/images/cont/10359a-eng.pdf; 3) Various annual reports produced by the Department’s innovation group.

[4] The selected nine files included: 1) all three complete projects excluding the projects funded under the conference support payments; 2) all three projects with status of letter of offer declined or offer of assistance withdrawn; 3) three projects randomly selected from the remaining set of projects.

[5] The four programs included NRC's Industrial Research Assistance Program (IRAP), Sustainable Development Technologies Canada, the Natural Sciences and Engineering Research Council of Canada programs, and the Atlantic Innovation Fund and Business Development Program.

[6] The five reviewed reports included: 1) Innovation Canada: A Call to Action, Review of Federal Support to Research and Development – Expert Review Panel, 2011; 2) People and Excellence: The Heart of Successful Commercialization, Volume I: Final Report of the Expert Panel on Commercialization, 2006; 3) Innovation and Business Strategy: Why Canada Falls Short, Report of the Expert Panel on Business Innovation, Council of Canadian Academies, 2009; 4) State of the Nation 2010 – Canada's Science, Technology and Innovation System: Imagination to Innovation – Building Canadian Paths to Prosperity, Science, Technology and Innovation Council, 2011; and 5) Business Innovation Policies: Selected Country Comparisons, Organisation for Economic Co-operation and Development, 2011.