Rationalize Evaluation Approach


To arrive at a description of the evaluation approach, how it will operate in the program context, and the rationale for why it is a good fit for that context, the resources of Better Evaluation were used to frame the evaluation.


* Note that the Evaluation Approach and the Rationale for Good Fit appear below the mind-mapping of the evaluation:




PRIMARY INTENDED USERS

Who will care about the program research findings?

Primary Intended Users and their Specific Interests:
Service Delivery Agency
·       Quality control
·       Value for program money
·       Process improvement
·       Advocacy with funder
·       Employer satisfaction
·       Client satisfaction
·       Efficiency
·       Effectiveness
Funder
·       Value for taxpayer money
·       Evaluating Service Delivery Agency performance
·       Performance management
Sponsoring College
·       Community needs and benefits
·       Creating linkages within programs delivered for value-added potential
Ontario Works and other services within the employment network
·       Evaluating the benefits of partnerships
·       Formalizing and evaluating the referral process
Anti-poverty group
·       Community level understanding of the role of the Service Delivery Agency in grassroots local community strategies
Educational Partners
·       Program’s ability to support student employment
·       Support for students during times when school supports are unavailable


PURPOSE OF PROGRAM EVALUATION
What are the accountabilities of the program evaluation?

Accountability: to whom? For what? Through what means?

Program Participants & Parents
·       For: program efficiency; program effectiveness; customer satisfaction
·       Through: youth consultation groups
Service Delivery Agency
·       For: program efficiency; program effectiveness; customer satisfaction
·       Through: regular reporting to the funder; satisfaction surveys for youth and employers
College
·       For: coordination of services
·       Through: reporting at senior leadership team meetings; consultation with leaders in other departments
Partner Agencies
·       For: coordination of services
·       Through: ongoing meetings with program reporting and feedback; referral feedback
Grassroots Community Agencies
·       For: building community resourcefulness
·       Through: attending community consultations; invitation to participate in the program review process



KEY EVALUATION QUESTIONS:
EVALUATION TYPE- APPROPRIATENESS, EFFECTIVENESS, EFFICIENCY

TYPE OF EVALUATION
·       Process
·       Impact
·       Outcome
PURPOSE OF EVALUATION
·       Appropriateness
·       Efficiency
·       Effectiveness

KEY EVALUATION QUESTION
Process Evaluation

Appropriateness
How appropriate are the program service delivery guidelines in meeting the needs of local youth employment?

Impact

Primary: Effectiveness
Were the resources allocated impactful toward short-term program goals?

To what extent did youth grow in labour market understanding, pre and post program?

To what extent were participating youth and employers satisfied with their use of the program?

Outcome

Primary: Efficiency

What difference did the program make for participating youth and employers?

What unintended program outcomes (positive and negative) were produced?

DEFINING PROGRAM SUCCESS


RESULTS
Level of Results
Measures of Success
Good
·       Required number of youth participate in workshops
·       Required number of youth secure employment placements with incentives
·       Required number of youth make use of supports available to buy clothing and work gear
Better
All of the above, but with enhancements:
·       A larger than required number of youth participate in early experience with the local labour market
·       Parent understanding and support is bolstered by the program
Best
All of the above, but with enhancements:
·       Youth have access to staff support, resources and have developed strategies to cope with job and career instabilities.
·       Youth are more aware and better informed about the challenges in precarious employment in order to make early decisions about career and job planning efforts.
·       More youth are considering entrepreneurship as a viable option and are aware of, and strategically introduced to, supportive structures within the community for entrepreneurship skill building. 


PROGRAM EVALUATION APPROACH DESCRIPTION:

The evaluation will take place within the agency that delivers the program and will be conducted by the management and staff who deliver it.  This is expected to work well, given that program staff have intimate knowledge of the program and have already gathered informal feedback and narrative comments about the service and how it is perceived. Management will be an important resource, as their accountability to the funder will be a clear focus of the evaluation.

The evaluation approach will include asking key questions of participating youth and employers and gathering feedback about their experience using the program. Key questions will be addressed through primary stakeholder feedback mechanisms (to be further outlined in the data collection methods and analysis strategies sections of the Program Inquiry Website).
Program evaluation will be enhanced through the presentation of findings to primary intended users of the report findings, including:
-Service Delivery Agency and Staff
-Funder
-Sponsoring College
-Ontario Works
-Anti-poverty Groups
-Educational Partners
Through the engagement of primary users and stakeholders, the evaluation will be assessed on key criteria, including:
-Positive Outcomes and Impacts
-Negative Outcomes and Impacts
-Cost/benefit analysis
-Resources/timing
-Process efficiency and effectiveness

Because the program will be in progress when some of the key questions are posed, the process evaluation is expected to have access to the in-service data needed for the program evaluation.  The program evaluation will begin with a pre-program needs analysis and youth participant self-ratings on skills and understanding of the program delivery messages.  Job developers, working in collaboration with program staff, will have the data needed to understand labour market demands, employer vacancies, and employer needs in general.

RATIONALE FOR APPROACH- GOOD FIT WITHIN PROGRAM CONTEXT:

This approach is a good fit for the program context because it allows for Process Evaluation, through which changes can be implemented based on stakeholder feedback.  The Outcome Evaluation questions already dovetail with the Funder's post-program evaluation of Effectiveness, Efficiency, and Customer Satisfaction.  The data mechanisms are already in place and would not need any special setup, as data flow directly into Ministry systems as they are collected.  The Impact Evaluation questions and processes are expected to need the most setup, with care taken to ensure that participating youth are asked pre-program questions that will allow for post-program comparisons.  The Impact Evaluation questions will help convert what would normally be narrative, qualitative feedback into more concrete, quantitative feedback from a primary source, making the data and evidence more scientific and adding credibility and better information to the results of the evaluation.
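As an illustration of how pre- and post-program self-ratings could be compared to produce the quantitative feedback described above, a minimal sketch follows. The skill categories, rating scale, and values are hypothetical assumptions for illustration only, not program data:

```python
# Hypothetical pre/post self-ratings (1-5 scale) for participating youth.
# Skill names and values below are illustrative assumptions, not program data.
pre_ratings = {
    "labour_market_understanding": [2, 3, 1, 2],
    "job_search_skills": [1, 2, 2, 3],
}
post_ratings = {
    "labour_market_understanding": [4, 4, 3, 3],
    "job_search_skills": [3, 3, 4, 2],
}

def mean(values):
    """Average of a list of ratings."""
    return sum(values) / len(values)

# Report the mean change per skill: a simple quantitative summary
# of growth that would otherwise remain narrative feedback.
for skill in pre_ratings:
    change = mean(post_ratings[skill]) - mean(pre_ratings[skill])
    print(f"{skill}: mean change {change:+.2f}")
```

A positive mean change would support the pre/post comparison question in the Impact Evaluation, while a near-zero or negative change would flag areas for continuous improvement.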
The approach is supported by existing networks of stakeholders who are actively engaged with one another in local communities of practice.   Many of the stakeholders have common goals, so the channels of collaboration are well established and aligned for success.  These networks are accustomed to evaluating findings, whether outcomes are positive or negative, which would limit some of the bias that could otherwise affect the evaluation findings or the process needed to arrive at them. Input from community stakeholders will be a valuable part of the evaluation process and a key resource moving forward.

The approach's openness to negative outcomes is important in this context.  The program is now in its second year, which is when funders typically become receptive to feedback about program pilots.  Given the over-arching expectation of continuous improvement, feedback about what needs to be improved, and thoughts about how and where improvements should happen, is critical.  For example, if the key questions only assumed that the program is working at the level of "good" shown in the table above, there would be no useful data for moving programs toward "better" or "best" success.  A critical perspective is needed to ensure that the program review is effective in pointing to the change needed for participating youth, community employers, and other program stakeholders.

RESOURCES USED:
Better Evaluation: Sharing information to improve evaluation.
Frame Evaluation, Types of Evaluation, What is Data?, Evaluating Evidence
http://www.betterevaluation.org

