AT A GLANCE
Goals-Based Evaluation for Health Promotion
Programs
Published: September 2023
Background
Evaluation is “the systematic assessment of the design, implementation [and/or] results of an initiative for the purposes of learning or decision-making.”1 Evaluation should be systematic, impartial, methodical, and provide information that is credible, reliable and useful.1 Many types of initiatives can be evaluated: from a communication campaign to a healthy public policy to a health promotion program or service. For simplicity, in this resource we will use the term “program” to refer to any and all of these. There are many different types of evaluation that can be conducted, depending on the purpose or stage of the program’s development and implementation.2 For example, one can produce the information needed to inform the development of a new program; assess if a program is carried out with the necessary reach, intensity, and duration; and/or evaluate changes in outcomes attributable to a program.3
This At a Glance provides an overview of a specific approach to evaluation, called goals-based evaluation, which can be used to evaluate programs, including health promotion programs. An accompanying video presentation4 summarizes a ten-step, three-phase model for goals-based evaluation and provides a case study from an Ontario public health unit. These resources aim to strengthen the capacity of health promotion practitioners to plan and evaluate health promotion programs.
As cited in Evaluating Health Promotion Programs: introductory workbook,5(p.1) Michael Scriven defines goals-based evaluation as any type of evaluation which aims to measure the achievement of pre-set goals and objectives. This type of evaluation can measure both processes, such as the procedures and tasks used to deliver a program, and outcomes, such as the results of a program.5
Program planning and program evaluation are closely linked and part of an ongoing cycle.2,3 In particular, goals-based evaluation relies on thoughtful program planning, during which specific information necessary for a robust evaluation can be developed.5 In this way, goals-based evaluation aligns with Public Health Ontario’s (PHO) health promotion program planning process,3 which emphasizes identifying the need for the program, and developing goals, objectives and relevant indicators.
The ten-step model for goals-based evaluation presented in this At A Glance is based on one developed by The Health Communications Unit (THCU) in 1998.6 The model was developed based on THCU’s knowledge of evaluation models as well as their experience in supporting Ontario’s public health units to plan and evaluate health promotion programs. In 2011, THCU moved from the University of Toronto and was integrated into PHO operations.
The Ten Steps for Goals-Based Evaluation of Health
Promotion Programs
The goals-based evaluation model consists of ten steps in three phases: planning, implementation and
utilization. While the model is depicted as a linear process, in reality the process can be more circular or
iterative. It may be necessary to return to previous steps as the evaluation, the program, and the
context in which they are taking place evolve.
Figure 1: Ten Steps for Goals-Based Health Promotion Program Evaluation
Step 1: Describe the Program
Purpose: To gather information for the program evaluation
This step creates the foundation for evaluation. Begin by creating a summary of the program, including a clear description of the need and rationale for the program, and the program’s goals, audience, activities, indicators, inputs, outputs, and outcome objectives5,7 (see Appendix A: Glossary for definitions of these terms). These are the necessary ingredients for program evaluation and will enable the evaluation to produce useful and actionable information.
A logic model can be a helpful way to build understanding and clarity about the program,8 and visually illustrate the relationship between a program’s inputs and the desired outcomes.9 In other words, a logic model can show that the logic or theory underlying the program will plausibly lead to the planned goals and objectives.
Resources to support this step (see Resources section for a complete list):
At A Glance: Planning Health Promotion Programs: This resource summarizes PHO’s six steps for planning health promotion programs.
Focus On: Logic model - a planning and evaluation tool: This document provides an overview of the components of a logic model, examples of logic model designs, and describes the use of logic models in program planning and evaluation.
Focus On: Evaluability assessment - a step model: This resource provides guidance on how to conduct an evaluability assessment and describes the known facilitators and challenges that may arise during the process.
Step 2: Identify and Engage Partners
Purpose: To identify evaluation partners and determine how best to engage them
This step involves identifying the individuals, groups, and organizations who will be impacted by the evaluation’s implementation or results, and who can contribute to the success of the evaluation.10 For simplicity, we will refer to these as “partners”. Consider both internal and external partners, including the program audience. Once the partners are identified, consider the interests and expectations of each, how they might use the evaluation results,5 and ways to engage them in the evaluation process.10
Resources to support this step (see Resources section for a complete list):
Wheel of Engagement: This tool from the Tamarack Institute helps to identify who to engage, and to what degree, with opportunities and ongoing work.
Context and content experts: This paper from the Tamarack Institute explores how to increase the authenticity of community engagement and meaningfully engage both content experts, described as professionals and staff in organizations, service providers and leaders with formal power, and context experts, described as people with lived experience of the situation.
Step 3: Determine Timelines and Available Resources
Purpose: To identify when the evaluation will occur and what resources are available to complete it.
Consider the context in which the evaluation is occurring, and any factors and processes (such as grant
timelines or ethics approvals) that may impact the overall timelines for the evaluation. Next, identify the
resources needed to carry out the evaluation. These could include available dollars for staff salaries,
consultants, data collection, translation or interpretation, as well as supplies, equipment and
communications. Resources can also include the time needed to complete the evaluation, and in-kind
support from partners.
Step 4: Develop Evaluation Questions
Purpose: To select and prioritize the evaluation questions
Building on the information gathered in the previous steps, identify potential questions to guide the
evaluation. As previously stated, this evaluation model is a goals-based model, which can be used to measure processes or outcomes. Evaluation questions differ according to what is being measured.
A process evaluation, or implementation evaluation, determines whether program activities have been implemented as planned/intended, and why or why not.7 Process evaluation questions could include: “Was the program implemented as designed (program fidelity)?”2 “Is the program reaching the intended audience?”2 “How satisfied is the audience with the program?”2
An outcome evaluation, or effectiveness evaluation,7 focuses on the more downstream outcomes of a program.2 It measures the effects of the program on the audience by assessing progress in achieving the planned goal.11 Evaluation questions could include: “What did/does the audience do differently as a result of the program?”10 “Did the program result in unintended consequences or outcomes? What were they?”5 “Did program outcomes increase or decrease?”5
It may be necessary to prioritize evaluation questions, depending on how many are identified, the
resources available for the evaluation, and the timelines for the evaluation. Involve the internal and
external partners identified in Step 2 in the prioritization process when possible. Once evaluation
questions have been selected, the evaluation approach(es) that best suit those questions can
be determined.
Step 5: Select Measurement Methods and Procedures
Purpose: To determine what to measure, how to measure it, and what data collection procedures to use
In this step, determine what to measure, and what procedures to use in order to measure it. This
includes how, when, and from whom the data will be collected, with consideration for ethical conduct
related to data collection. Develop a data collection plan that includes:
What will be measured: Select the indicators that match the program’s key activities and outcomes, and align with the resources available.5 Ensure the data you collect is related to your evaluation questions and avoid collecting information that will not be useful or used.5
How data will be collected: There are four common groups of evaluation methods. Select one or a mix of collection methods to answer the evaluation questions identified:5
1. Review of existing data or documentation, such as meeting notes and program reports;
2. Talk to people directly, via interviews or focus groups;
3. Obtain written responses through surveys or similar methods; and
4. Observe, monitor, or track outcomes through primary or secondary data sources.5
When data will be collected: Determine at which point(s) data collection will take place, for example, before, during or after the program has been implemented;5
From whom data will be collected: Determine who you will collect data from. This includes determining the sample size, ensuring adequate representation that will reflect the program’s audience,10 how participants will be recruited,5 and where data collection will take place.5 In the case of secondary data, detail the sources of that data.
A data collection matrix can be a useful way to summarize the evaluation questions, links to the logic model (if applicable), as well as the indicators, methods, data sources, timelines, roles and responsibilities, and data analysis process.5 The example below was provided by Wellington-Dufferin-Guelph Public Health from an evaluation of a pilot program of an overdose prevention site (OPS). The evaluation featured three evaluation questions:
1. What are the strengths and challenges of the OPS (as it is being implemented through the pilot program)?
2. What are the positive and negative short-term outcomes for clients associated with the use of the OPS?
3. Is an OPS an effective way to keep people who use substances safe in our community?12
The data collection matrix below summarizes plans for the first evaluation question; an illustrative structured sketch of a matrix row follows Table 1.
Table 1. Data Collection Matrix Example (Case Study from Wellington-Dufferin-Guelph Public Health)12
Evaluation Question(s): What critical question(s) will be answered as a result of the evaluation?
Indicators: What will indicate success? (How will you know that the program has been successful?)
Data Collection Methods: How will the data be collected?
Data Sources: From whom and where will the data be collected?
Timelines: When will the data be collected?
Roles & Responsibilities: Who is responsible for data collection, and what is their role?
Data Analysis Methods: How will the data be analyzed?
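For teams that document their evaluation plans electronically, one row of the data collection matrix can also be captured as structured data, which makes it easier to keep the plan consistent across evaluation questions. The sketch below is illustrative only: the field names mirror the categories in Table 1, but the class name and example values are hypothetical placeholders and are not drawn from the Wellington-Dufferin-Guelph case study.

```python
# Illustrative sketch only: one row of a data collection matrix captured as
# structured data. Field names mirror the categories in Table 1; the example
# values are hypothetical placeholders, not content from the case study.
from dataclasses import dataclass
from typing import List

@dataclass
class DataCollectionRow:
    evaluation_question: str        # What critical question(s) will be answered?
    indicators: List[str]           # What will indicate success?
    collection_methods: List[str]   # How will the data be collected?
    data_sources: List[str]         # From whom and where will the data be collected?
    timelines: str                  # When will the data be collected?
    roles_responsibilities: str     # Who is responsible for data collection?
    analysis_methods: str           # How will the data be analyzed?

example_row = DataCollectionRow(
    evaluation_question="What are the strengths and challenges of the program as implemented?",
    indicators=["Participant-reported strengths and challenges"],
    collection_methods=["Interviews", "Review of program reports"],
    data_sources=["Program participants", "Program staff"],
    timelines="Months 3-6 of the pilot",
    roles_responsibilities="Evaluation lead coordinates; trained staff collect data",
    analysis_methods="Thematic analysis of interview transcripts",
)

# Printing the row gives a quick, shareable summary of the plan for one question.
print(example_row)
```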
Step 6: Create the Evaluation Plan
Purpose: To document the decisions made in Steps 1 through 5 in an evaluation plan
The evaluation plan documents all of the decisions and information produced in the model described here. Typically, an evaluation plan includes the program description, the purpose of the evaluation, evaluation questions and methodology, a data analysis plan, budget and timelines, and how the results of the evaluation will be used.5
Ethical issues are an important consideration in any evaluation plan. They can arise at any point throughout the lifecycle of an evaluation, from the initial development of a plan through to the application of findings. For example, evaluators often experience conflicts of interest, especially when the evaluator, or their affiliated organization, is also implementing the program being evaluated. Also, evaluations involve actionable information that can have a direct and immediate impact on the welfare of individuals and their communities. Ethical reflection is therefore an important aspect of creating an evaluation plan and can help ensure evaluations respect the rights and protect the welfare of individuals and their communities. This includes avoiding unnecessary risks (e.g., harms or burdens).
Resources to support this step (see Resources section for a complete list):
A framework for the Ethical Conduct of Public Health Initiatives: This framework provides a
public health lens to the Canadian Tri-Council Policy Statement 2, Ethical Conduct for Research
Involving Humans (TCPS 2). The framework poses ten guiding questions to be considered when
planning and evaluating evidence-generating public health initiatives.
Step 7: Collect Data
Purpose: To collect the data needed to answer each evaluation question
The evaluation results and corresponding recommendations hinge upon the quality of the data collected.13 To ensure that data are reliable, develop standard data collection procedures and tools, and provide training for those collecting data.5 Consider whether participation incentives are appropriate and brainstorm ways to enhance response rates.5
Resources to support this step (see Resources section for a complete list):
Data Collection Methods: This collection of short resources from the Centers for Disease Control and Prevention explores specific data collection methods, including focus groups, questionnaires, interviews and observation.
Step 8: Process and Analyze Data
Purpose: To synthesize and analyze data collected for the evaluation
This step involves engaging with the data to process, analyze, and synthesize information from all sources.
Ensure quality data: Develop and implement quality control techniques to ensure high quality data.5
Organize the data: Data typically requires some “cleaning” and organizing to be ready for analysis. Quantitative and qualitative data may need to be “cleaned” and organized in different ways. Consider using Excel or statistical software (e.g., SAS, R) for quantitative data, and software such as NVivo for qualitative analysis.5
Analyze the data: Qualitative and quantitative data require different analysis techniques. For many evaluations of quantitative data, simple descriptive statistics may be sufficient. This might include counts or frequencies, percentages, measures of central tendency (such as mean, median and mode) and variability. More complex analyses of associations between indicators, or modeling, may also be appropriate. Qualitative analysis may identify themes in the data, guided by the evaluation questions.5 (A brief worked example of descriptive statistics follows this list.)
Synthesize the data: This is the process of organizing and classifying the information that has been collected, summarizing and comparing the results.7
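To illustrate the kind of simple descriptive analysis mentioned above, the short sketch below computes counts, percentages, and measures of central tendency and variability for a small set of hypothetical satisfaction ratings. The data, variable names, and choice of Python are illustrative assumptions only; the same summaries could be produced in Excel, SAS, or R as noted above.

```python
# Minimal sketch of simple descriptive statistics for quantitative evaluation data.
# The ratings below are hypothetical (1-5 satisfaction scores); in practice they
# would come from the cleaned data set prepared earlier in this step.
from collections import Counter
from statistics import mean, median, mode, stdev

satisfaction_ratings = [4, 5, 3, 4, 4, 2, 5, 4, 3, 5, 4, 4]

# Counts and percentages for each rating value
counts = Counter(satisfaction_ratings)
total = len(satisfaction_ratings)
for rating in sorted(counts):
    pct = 100 * counts[rating] / total
    print(f"Rating {rating}: n={counts[rating]} ({pct:.1f}%)")

# Measures of central tendency and variability
print(f"Mean: {mean(satisfaction_ratings):.2f}")
print(f"Median: {median(satisfaction_ratings)}")
print(f"Mode: {mode(satisfaction_ratings)}")
print(f"Standard deviation: {stdev(satisfaction_ratings):.2f}")
```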
Step 9: Interpret and Disseminate Results
Purpose: To interpret and share your evaluation findings, engaging partners to help identify
recommendations
This step involves describing what was learned through the evaluation,2 interpreting the data analyzed and synthesized in the previous step, so that decisions can be made about the program.10 Partners, including the program’s audience, can be included in interpreting the findings.5 Anchor the interpretation to the original evaluation questions.5 Create a list of recommendations that align with the evaluation outcomes.5
Recommendations are actions to consider as a result of what has been learned through the evaluation.7 The recommendations will be informed by the program’s audience and the partners identified in Step 2,2 as well as the purpose for the evaluation (Step 7).7 Review recommendations with partners to identify actionable outcomes and discuss what has been learned from conducting the evaluation and next steps to incorporate results.5
Evaluations are most useful when their results are used, for example by decision-makers, policy-makers, funders, and other groups.2,10 Identify who will use the evaluation results, and how they will use them.7 Then the appropriate channels/formats for the audience and purpose can be selected.5 Presentation of findings can take many forms, such as a written report, slide show presentation, infographic, and/or short informational video. Make results available to partners. Tailor what is disseminated to their specific interest in the evaluation and how they plan to use the results.5
Step 10: Apply Evaluation Findings
Purpose: To use your evaluation results
The ultimate purpose of program evaluation is to use the information.7 Evaluation findings can be used in several ways: to inform decisions regarding the program’s continuation, improvement, or wind-down; to demonstrate that resources are well-managed; and to generate knowledge and understanding of the program or the issue that it addresses.14 In this step, apply the evaluation findings as detailed in the evaluation plan developed in Step 6.
“It is not really program evaluation unless the information is used to make decisions.”2(p.217)
Conclusion
Evaluations help organizations to justify, support and improve programs, and make other important decisions about them.5 A goals-based evaluation will help to determine if the health promotion program has achieved the goals and objectives it was designed to achieve. An effective evaluation involves engaging partners, assessing resources, developing evaluation questions, gathering and analyzing data, and utilizing the results. Proper evaluations take time and resources, but yield valuable results. Taken step by step, anyone can complete a well-designed evaluation, one that encourages beneficial action to follow.5
References
1. Canadian Evaluation Society. What is evaluation? [Internet]. Renfrew, ON: Canadian Evaluation
Society; 2014 [cited 2022 Jul 7]. Available from: https://evaluationcanada.ca/career/what-is-
evaluation.html
2. Jordan T, Dake JA, Fertman CI. Chapter 10, Evaluating and improving health promotion programs. In: Fertman CI, Grim M, editors. Health promotion programs: from theory to practice. 3rd ed. Hoboken, NJ: John Wiley & Sons Inc; 2022. p. 217-39.
3. Ontario. Ministry of Health. Ontario public health standards: requirements for programs,
services and accountability. Toronto, ON: Queen’s Printer for Ontario; 2021. Available from:
https://www.health.gov.on.ca/en/pro/programs/publichealth/oph_standards/docs/protocols_g
uidelines/Ontario_Public_Health_Standards_2021.pdf
4. Ontario Agency for Health Protection and Promotion (Public Health Ontario); Bodkin A (Public
Health Ontario), Alderson K (Wellington-Dufferin-Guelph Public Health); Ackford R (Wellington-
Dufferin-Guelph Public Health), presenters. Promoting health: a (re)introduction to evaluating
health promotion programs [Webinar]. Toronto, ON: King’s Printer for Ontario; 2022 [presented
2022 Jul 27; cited 2022 Nov 1]. Available from: https://www.publichealthontario.ca/-
/media/Event-Presentations/2022/health-promotion-reintroduction-session-
three.pdf?rev=f0ec1aaa7e9546d9b20f059ef4cf52a9&sc_lang=en
5. Ontario Agency for Health Protection and Promotion (Public Health Ontario), Snelling S,
Meserve A. Evaluating health promotion programs: introductory workbook. Toronto, ON:
Queen's Printer for Ontario; 2016. Available from: https://www.publichealthontario.ca/-
/media/documents/E/2016/evaluating-hp-programs-workbook.pdf
6. University of Toronto, Centre for Health Promotion; The Health Communication Unit (THCU).
Evaluating health promotion programs. Toronto, ON: University of Toronto; 1998.
7. Centers for Disease Control and Prevention; Office of the Director, Office of Strategy and
Innovation. Introduction to program evaluation for public health programs: a self-study guide
[Internet]. Atlanta, GA: Centers for Disease Control and Prevention; 2011 [cited 2022 Nov 1].
Available from: https://www.cdc.gov/evaluation/guide/CDCEvalManual.pdf
8. Centers for Disease Control and Prevention, National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention, Division of STD Prevention. Identifying the components of a logic model [Internet]. Atlanta, GA: Centers for Disease Control and Prevention; [cited 2022 Nov 1]. Available from: https://www.cdc.gov/std/program/pupestd/components%20of%20a%20logic%20model.pdf
9. Ontario Agency for Health Protection and Promotion (Public Health Ontario), Abdi S, Mensah G. Focus on: logic model - a planning and evaluation tool. Toronto, ON: Queen’s Printer for Ontario; 2016. Available from: https://www.publichealthontario.ca/-/media/documents/f/2016/focus-on-logic-model.pdf?la=en
10. Ontario. Ministry of Health and Long-Term Care; Health Results Team for Information
Management; Health System Intelligence Project; Ardal S, Butler J, Hohenadel J, Olsen D. The
health planner's toolkit. Module 6: evaluation. Toronto, ON: Queen's Printer for Ontario; 2008.
Available from: https://collections.ola.org/mon/22000/283845.pdf
11. Centers for Disease Control and Prevention, National Center for HIV/AIDS, Viral Hepatitis, STD, and TB Prevention, Division of STD Prevention. Types of evaluation [Internet]. Atlanta, GA: Centers for Disease Control and Prevention; [cited 2022 Nov 1]. Available from: https://www.cdc.gov/std/program/pupestd/types%20of%20evaluation.pdf
12. Wellington-Dufferin-Guelph Public Health. Case study: overdose prevention site evaluation. Guelph, ON: Wellington-Dufferin-Guelph Public Health; 2023.
13. Fertman CI, Grim M, editors. Health promotion programs: from theory to practice. 3rd ed. Hoboken, NJ: John Wiley & Sons Inc.; 2022.
14. Tamarack Institute. Developing evaluations that are used [Internet]. Kitchener, ON: Tamarack
Institute; 2017 [cited 2022 Jul 7]. Available from:
https://www.tamarackcommunity.ca/hubfs/Developing%20Evaluations%20that%20are%20Use
d%20Tool.pdf?hsCtaTracking=595905c0-9a6e-424e-abff-6ee45d31e134%7C59d234ab-c125-
4be2-b269-8821e7cb573b
15. Ontario Agency for Health Protection and Promotion (Public Health Ontario). Planning health promotion programs: introductory workbook. 5th ed. Toronto, ON: Queen's Printer for Ontario; 2018. Available from: https://www.publichealthontario.ca/-/media/Documents/W/2018/workbook-program-planning.zip?sc_lang=en
16. Ontario Agency for Health Protection and Promotion (Public Health Ontario), Snelling S,
Meserve A. Evaluating health promotion programs: introductory workbook. Toronto, ON:
Queen’s Printer for Ontario; 2016. Available from: https://www.publichealthontario.ca/-
/media/Documents/E/2016/evaluating-hp-programs-workbook.pdf?sc_lang=en
Appendix A: Glossary of Planning and Evaluation Terms
Activity: The proposed events or actions that will take place as part of the program.8
Audience: The specific group that the program intends to reach. Identification of the audience should be rooted in the results of a situational assessment which clearly articulates the individuals, networks, organizations, and partners most impacted by the situation which the program intends to improve.15 There can be multiple audiences for a single program: the primary audience is the main population which the program intends to reach, while a secondary (or even tertiary) audience can be impacted or influenced by the program, but is not a direct recipient of it.9
Goal: A statement that reflects the broadest level of results to be achieved by the program.3 The goal clarifies what is important about the program and includes the program’s intended audience.13 Generally speaking, goals use action words such as reduce, eliminate, improve, or increase.13
Indicator: A variable that measures the extent to which the program’s objectives have been met. The type of indicator varies according to what precisely is being measured. Process indicators demonstrate the concrete tasks that the program accomplished. Outcome indicators demonstrate if the program is achieving the desired change. Indicators should be important, obtainable, reliable, and valid.5
Input: The resources needed to implement a program. Examples include funding, staffing, and program materials.8
Logic Model: A logic model is a visual representation of the logic that underpins the public health program. It shows the relationship between the program’s resources (inputs), the program’s activities (outputs), and the program’s results (outcomes).13 A logic model can be as broad or as specific as needed, and may have a design specific to its purpose and audience.9 For example, a logic model can build understanding and clarity of the program, identify resources or the sequencing of activities that should be implemented, or provide a basis for evaluation.8 Logic models commonly contain the program’s goal, inputs, activities, audience, outputs, and outcomes.9 Logic models can also include a description of the situation, assumptions, external factors that might impact the program, and strategies.9
Objectives: The specific and measurable steps that must be carried out in order to attain the program’s goal. Process objectives identify the changes or tasks which are needed in order to implement the program.13 Outcome objectives identify the long-term accomplishments of a program.13 Objectives should be SMART: that is, Specific, Measurable, Actionable, Realistic, and Time-specific.13 Well-written objectives generally include four components:15
Figure 2: Components of an Objective
Outcome: A measurable positive or negative change to the audience of a program,10 that will occur as a result of the program.7 In other words, an outcome captures the effects of a program.7 Outcomes measure the achievement of the program’s goal, and therefore are ambitious and often long-term.10 It may therefore be useful to develop short-term outcomes, which can be measured in weeks or months, and intermediate outcomes, which can be measured in months or years.8
Output: The product or result of the program’s activities.8 Outputs quantify activities by providing numeric values or percentages.9
Process evaluation: Systematically gathering information during program implementation. Process evaluations describe and assess the reach of the program, audience recruitment and retention, perceptions of program quality, acceptability, and fidelity of implementation.2
Resources
Canadian Institutes of Health Research; Natural Sciences and Engineering Research Council of Canada; Social Sciences and Humanities Research Council of Canada. Tri-Council policy statement: ethical conduct for research involving humans - TCPS 2 (2022). Ottawa, ON: His Majesty the King in Right of Canada, as represented by the Minister of Health and the Minister of Innovation, Science and Industry; 2022. Available from: https://ethics.gc.ca/eng/documents/tcps2-2022-en.pdf
Centers for Disease Control and Prevention. Adolescent and school health: program evaluation data collection methods [Internet]. Atlanta, GA: Centers for Disease Control and Prevention; 2014 [cited 2023 May 1]. Data collection & analysis: data collection methods. Available from: https://www.cdc.gov/healthyyouth/evaluation/#anchor_1612534734
Ontario Agency for Health Protection and Promotion (Public Health Ontario); Meserve A. At a
glance: the six steps for planning a health promotion program [Internet]. Toronto, ON: Queen’s
Printer for Ontario; 2015 [cited 2023 Jul 21]. Available from:
https://www.publichealthontario.ca/-/media/documents/S/2015/six-steps-planning-hp-
programs.pdf
Ontario Agency for Health Protection and Promotion (Public Health Ontario), Meserve A, Mensah G. Focus on: evaluability assessment - a step model. Toronto, ON: Queen’s Printer for Ontario; 2018. Available from: https://www.publichealthontario.ca/-/media/documents/f/2018/focus-on-evaluability-assessment.pdf?sc_lang=en
Ontario Agency for Health Protection and Promotion (Public Health Ontario), Abdi S, Mensah G.
Focus on: logic model—a planning and evaluation tool [Internet]. Toronto, ON: Queen’s Printer
for Ontario; 2016 [cited 2023 Jul 4]. Available from: https://www.publichealthontario.ca/-
/media/documents/f/2016/focus-on-logic-model.pdf?la=en
Ontario Agency for Health Protection and Promotion (Public Health Ontario). A framework for
the ethical conduct of public health initiatives [Internet]. Toronto, ON: Queen’s Printer for
Ontario; 2012 [cited 2023 May 1]. Available from: https://www.publichealthontario.ca/-
/media/documents/f/2012/framework-ethical-conduct.pdf?la=en
Tamarack Institute. Wheel of engagement [Internet]. Kitchener, ON: Tamarack Institute; 2017
[cited 2022 Nov 1]. Available from:
https://www.tamarackcommunity.ca/hubfs/Collective%20Impact/Tools/Stakeholder%20Engage
ment%20Wheel%20Tool%20May%202017.pdf?hsCtaTracking=95a70673-e3d3-4b0b-961f-
432670166a60%7Ce578f338-782e-4d86-97d5-efb99bb7f6b5
Attygale L. The context experts [Internet]. Kitchener, ON: Tamarack Institute; 2017 [cited 2022
Nov 1]. Available from:
https://www.tamarackcommunity.ca/hubfs/Resources/Publications/The%20Context%20Expert
s.pdf?hsCtaTracking=56bc3396-2e91-49d8-8efc-95fa20b82878%7Cbddea62d-6f5b-4aa4-8b0d-
292bbd5c5b9b
Taylor-Powell E, Renner M. Analyzing qualitative data [Internet]. Madison, WI: Board of Regents
of the University of Wisconsin System; 2003 [cited 2021 May 1]. Available from:
https://www.betterevaluation.org/sites/default/files/analyzing_qualitative_data.pdf
Authors
Andrea Bodkin, Senior Program Specialist in Health Promotion, Health Promotion, Chronic Disease and Injury Prevention Department, Public Health Ontario
Erin Berenbaum, Evaluation Specialist, Evaluation and Knowledge Mobilization, Public Health Ontario
Reviewers
Dan Harrington, Director, Health Promotion, Chronic Disease and Injury Prevention, Public Health Ontario
Nicole Ethier, Program Coordinator, Foundational Standards, Porcupine Health Unit
Acknowledgements
Public Health Ontario wishes to acknowledge Susan Snelling and Allison Meserve, formerly with Public Health Ontario, who developed the 2016 workbook16 upon which this At A Glance was based, as well as the staff and consultants of the former The Health Communications Unit who developed earlier versions of the workbook.
Citation
Ontario Agency for Health Protection and Promotion (Public Health Ontario), Bodkin A, Berenbaum E.
Goals-based evaluation for health promotion programs. Toronto, ON: King's Printer for Ontario; 2023.
Disclaimer
This document was developed by Public Health Ontario (PHO). PHO provides scientific and technical
advice to Ontario’s government, public health organizations and health care providers. PHO’s work is
guided by the current best available evidence at the time of publication. The application and use of this
document is the responsibility of the user. PHO assumes no liability resulting from any such application
or use. This document may be reproduced without permission for non-commercial purposes only and
provided that appropriate credit is given to PHO. No changes and/or modifications may be made to this
document without express written permission from PHO.
Public Health Ontario
Public Health Ontario is an agency of the Government of Ontario dedicated to protecting and promoting
the health of all Ontarians and reducing inequities in health. Public Health Ontario links public health
practitioners, front-line health workers and researchers to the best scientific intelligence and knowledge
from around the world.
For more information about PHO, visit publichealthontario.ca.
© King’s Printer for Ontario, 2023