Western Economic Diversification Canada

Section 2: Evaluation Approach, Design and Methodology

The logic model underlying this theory-based evaluation[4] was developed by the department’s performance measurement group, based on input from an inter-regional team with expertise in business productivity and growth. This logic model supported the refinement of the Performance Measurement Framework that was subsequently approved by the Deputy Minister. A non-experimental research design with multiple lines of evidence was considered appropriate and sufficient to meet the study objectives. A consulting firm contracted by the department gathered data from key informant interviews, surveys and focus groups; the evaluation unit analysed and integrated findings from all lines of evidence into a final report. To maximize the objectivity and relevance of the conclusions, the evaluation was guided by a steering committee, led by an Assistant Deputy Minister and including representatives from three regional offices, and was conducted in consultation with senior management.

2.1 Evaluation Study Activities

Preliminary Consultations

Preliminary consultations were conducted with the steering committee and senior management to finalize the evaluation methodology and framework.

Documents and Literature Review

Three groups of documents were reviewed as part of the evaluation:

  • General background documentation (e.g., Treasury Board Submissions, the programming’s Performance Measurement Strategy, websites, and documents that describe the programming’s rationale, history and theory);
     
  • Departmental databases and website;
     
  • Literature on trends and best practices in business productivity and growth.

File Review

The evaluation reviewed all financial and performance information contained in the department’s databases (Project Gateway and the GX financial system). Eighty-seven projects, totalling $90 million in departmental funding, were approved between April 1, 2009 and March 31, 2014. As of December 2015, 63 projects (72%) were complete or had their last claim approved. The majority of the projects were approved under the Western Diversification Program Authority (57 projects totalling $55 million in departmental funding) or the Western Economic Partnership Agreements (25 projects totalling $32 million in departmental funding). Most projects identified Improve Business Productivity (59 projects), Business Productivity and Growth (15 projects) or Technology Adoption and Commercialization (10 projects) as the sub-activity (Table 2.1).

 

Table 2.1 Number and Funding of Approved Business Productivity and Growth Projects by Sub-program, April 2009-April 2014
Projects by Sub-program           | TOTAL      | AB         | BC         | SK         | MB
                                  |  #   $MIL  |  #   $MIL  |  #   $MIL  |  #   $MIL  |  #   $MIL
Improve Business Productivity     | 59   53    | 15   15    | 30   28    | 10   5     |  4   5
Business Productivity and Growth  | 15   12    |  2   1     |  3   3     |  8   7     |  2   1
Technology Adoption               | 10   22    |  1   0.15  |  7   15    |  1   0.3   |  1   6
Access to Capital[5]              |  2   4     |  0   0     |  0   0     |  1   2     |  1   2
Industry Collaboration            |  1   0.1   |  1   0.1   |  0   0     |  0   0     |  0   0
Total                             | 87   91    | 19   16    | 40   46    | 20   14    |  8   14

(# = number of projects; $MIL = departmental (WD) funding in $ millions)

Note: numbers may not add up due to rounding.

 

Client Survey

A total of 99 project proponents were contacted to complete a survey collecting outcome and economic information not captured in the departmental databases; 18 could not be reached and 59 of the remaining 81 (73%) completed the survey. Project proponents were asked for contact information for key stakeholders (participants) deriving direct benefits from the funded projects, such as individuals trained as part of a project; 23 participants (53%) completed a survey. The sample also included 19 of 65 clients (29%) that applied but did not receive funding; these clients represented a range of organization types, such as local chambers of commerce and professional or industry associations. The regional distribution of survey respondents was 43% (British Columbia), 26% (Alberta), 22% (Saskatchewan) and 10% (Manitoba).

Interviews with Key Informants

The consultants developed and pre-tested the questionnaires and then conducted individual key informant interviews by telephone. Most key informants were selected based on their familiarity and level of involvement with the programming. Of 160 key informants invited to participate, 74 completed interviews. The target was for 40% of interviews (a minimum of 25) to be with individuals least likely to have a strong personal interest in the programming; 28 such interviews (38%) were completed. Key informants were fairly evenly distributed across regions: 21 in British Columbia, 16 in Alberta, 18 in Saskatchewan and 17 in Manitoba, with the remainder from headquarters or Ottawa. The selection criteria and sample sizes are summarized below.

  • Project Proponents and Unfunded Applicants. All proponents and applicants were contacted to complete an online survey. A subgroup of survey participants was then chosen for interviews based on the survey results and their involvement in business productivity and growth-related programming. A total of 20 individuals (12 project proponents and 8 unfunded applicants) were interviewed;
     
  • Funding Partner Organizations. Selected organizations were those that: 1) represented a cross-section of types of funding partners (federal, provincial, private sector); 2) were involved in the greatest number of projects or amounts of funding; 3) were involved with business productivity and growth programming; and 4) were not Funded Project Proponents or Unfunded Applicants for this evaluation. A total of 10 organizations were interviewed;
     
  • Other Government and Community Organizations. Selected organizations were those that: 1) represented provincial, federal or community based organizations responsible for the delivery of business productivity and growth programming; and 2) were not Project Proponents, Unfunded Applicants or Funding Partner Organizations for this evaluation. A total of 18 organizations were interviewed;
     
  • Other Stakeholders and Experts. Chosen individuals were those that: 1) were stakeholders and experts in business productivity and growth (academic, industry associations, other); 2) were not Project Proponents, Unfunded Applicants or Funding Partner Organizations for this evaluation; and 3) were recommended by departmental representatives. A total of 10 stakeholders and experts were interviewed; and
     
  • Departmental Representatives. Staff and management were recommended to the consultant by departmental representatives and chosen to: 1) represent a cross-section of areas of specialization and seniority; and 2) be among those most knowledgeable about the programming. A total of 16 staff were interviewed.

Focus Groups

The consultants conducted three focus groups in January 2015: one each in British Columbia, Alberta and Saskatchewan/Manitoba. The Saskatchewan/Manitoba focus group was held in person in Saskatoon, with concurrent videoconference and teleconference links to Manitoba. Participants were selected based on their knowledge of and involvement in the programming and their familiarity with the needs of small and medium-sized enterprises in the region. In total, 28 representatives, including seven departmental staff, participated in the three focus groups: eight from British Columbia, eight from Alberta and 12 from Saskatchewan/Manitoba.

A consultant presented the field research findings at the focus groups and then facilitated group discussions. The objectives of the focus groups were to review and validate the preliminary findings. The topics under discussion varied somewhat from site to site, depending upon the composition of the groups, the interests of the participants, and the relevance of particular questions to that jurisdiction.

Economic Analysis

The consultant obtained expenditure and full-time equivalent (FTE) data for the 87 projects under review from the department. This data was used to calculate operating and maintenance (O&M) costs and Grants and Contribution (G&C) costs as a percent of total expenditures; G&C costs per FTE; and O&M expenditure per project approved.
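As a minimal sketch, the ratios described above reduce to simple divisions over aggregate figures. The dollar and FTE values below are hypothetical placeholders, not the evaluation's actual data:

```python
# Hypothetical aggregate figures, for illustration only;
# the evaluation's actual expenditure and FTE data are not reproduced here.
gc_expenditures = 18_000_000   # Grants and Contributions (G&C), $
om_expenditures = 1_200_000    # operating and maintenance (O&M), $
total_expenditures = gc_expenditures + om_expenditures
ftes = 10.5                    # full-time equivalents
projects_approved = 87         # projects under review

om_share = om_expenditures / total_expenditures       # O&M as a share of total
gc_share = gc_expenditures / total_expenditures       # G&C as a share of total
gc_per_fte = gc_expenditures / ftes                   # G&C cost per FTE
om_per_project = om_expenditures / projects_approved  # O&M per approved project

print(f"O&M share of expenditures: {om_share:.1%}")
print(f"G&C share of expenditures: {gc_share:.1%}")
print(f"G&C per FTE: ${gc_per_fte:,.0f}")
print(f"O&M per approved project: ${om_per_project:,.0f}")
```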

Leverage rates were estimated by subtracting total project funding from total project costs and dividing that amount by the total project funding. To assess outcomes relative to funding costs, project data, administrative data and survey results were used to determine the aggregate impacts of the projects to date in terms of leading indicators such as the number of businesses created, maintained or expanded; the number of jobs created or maintained; and the dollar increase in sales. The leverage ratios and other measures were then benchmarked against other Grant and Contribution programs[6] from similar economic development and diversification initiatives.
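The leverage calculation can be sketched as follows. The dollar figures are hypothetical, and "total project funding" is read here as the departmental contribution (an assumption, since the text does not spell out the denominator):

```python
# Hypothetical figures, for illustration only.
total_project_costs = 250_000_000  # all funding sources combined, $
wd_funding = 90_000_000            # departmental contribution, $ (assumed denominator)

# Dollars leveraged from other sources per dollar of departmental funding
leverage_rate = (total_project_costs - wd_funding) / wd_funding
print(f"Leverage rate: {leverage_rate:.2f}:1")
```

Under these numbers each departmental dollar would draw roughly $1.78 from other sources; the benchmarking step then compares such ratios across comparable Grant and Contribution programs.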

2.2 Limitations of the Methodology

Key informant interviews: Key informant groups directly involved in the programming were often knowledgeable about the impacts of one or more projects but lacked a complete picture of the projects and areas funded under the sub-program. Departmental representatives also struggled to separate the impacts of the Western Canada Business Service Network partners[7] from those of the projects funded under the sub-program. To mitigate this constraint, representatives were asked to provide specific project examples to support their ratings, while key informants less familiar with the department’s programming and impacts were asked to identify the major needs and challenges facing SMEs and the department’s role in addressing them. To minimize response bias, 38% of key informants were individuals least likely to have a strong personal interest in the programming, such as other government and community representatives and experts.

Surveys: The evaluation findings are based, in part, on the views of those with a vested interest in the programming, whose responses regarding programming outcomes may therefore be biased. To reduce respondent bias and validate interview results, the survey questionnaire and covering letter clearly communicated the purpose of the evaluation, its design and methodology, and the strict confidentiality of responses. Moreover, respondents were asked to provide a rationale for their ratings, including a description of the specific activities that contributed to the reported outcomes. Many projects had no key stakeholders (i.e., participants) associated with them, so the number of surveyed participants was low. To address this challenge, participant responses were used to complement those of project proponents and to add examples and details of areas where projects achieved particular outcomes or generated follow-on investments or projects.

Focus Groups: The focus groups were used to validate and interpret evaluation findings. The focus group discussions reflect the opinions of some participants and may not be representative of all those involved in the program.

Efficiency and Economy: The change in the department’s Program Alignment Architecture (PAA) in 2013 meant that the 2009–13 expenditures would need to be estimated and, therefore, would not be directly comparable with 2013–14. For this reason, the analysis examines only the 2013–14 expenditures. Using only one year of data limits the overall validity and usefulness of the findings in terms of stability (i.e., large G&C expenditures in one year vs. another year) and trends. There was also limited available data on outcome level results for ongoing projects, recently approved projects and completed projects that were not followed up to determine their longer term impacts. Moreover, because there was limited information on the context and rationale underlying observed differences, the analysis focused on benchmarking, rather than explaining, variations in economy and efficiency. Where possible, differences in economy and efficiency were interpreted based on key informant insights or programming considerations. Finally, there were difficulties identifying similar departmental sub-programs for comparison purposes because the size and complexity of projects vary considerably across sub-programs.

Attribution: Ideally, projects would be followed over the long term to capture spin-off benefits and additional outcomes as projects evolve. However, this would not entirely solve the attribution challenge, because long-term outcomes reflect a myriad of interacting factors. This study therefore used a theory of change approach, assuming that the achievement of short- and medium-term impacts would eventually lead to the longer-term outcomes. The use of standardized questions improved response validity and reliability, and both objective and subjective indicators (and therefore quantitative and qualitative data collection methods) were used to address multidimensional concepts accurately.

 


[4] Treasury Board of Canada Secretariat. "Theory-Based Approaches to Evaluation: Concepts and Practices". 2012. This document defines a theory-based approach as one based on a theory of change. A theory of change involves describing and explaining causal linkages between outputs and outcomes in a logic model.

[5] Both projects were under the Community Futures Capitalization Program and supported Community Futures investment funds.

[6] Benchmarking data was obtained from evaluations completed by Atlantic Canada Opportunities Agency, Economic Development Agency for the Regions of Quebec, Employment and Social Development Canada, Federal Economic Development Agency for Southern Ontario, Industry Canada, National Research Council and Western Economic Diversification Canada.

[7] Western Canada Business Service Network includes the following organizations: Canada Business, Women's Enterprise Initiative offices, Aboriginal Business Service Network and Francophone Economic Development Organizations.