How to plan and manage an
environmental education
program –
AND evaluate its success
Philip Cox, Plan:Net Limited (www.plannet.ca)
Gareth Thomson, Alberta Council for Environmental Education (ACEE)
Calgary, Sept 9th; Edmonton, Sept 10th, 2010
A craft that comes by different names…
Outcomes Measurement
Performance Measurement and Evaluation
Outcomes Mapping
Managing by Results
Results Based Management
Introducing Three Outcomes
Measurement Tools…
Logic Model
Risk Analysis and Management Table
Monitoring and Evaluation Plan
Logic Model, Risk Analysis and Management Table, and Monitoring and Evaluation Plan are relevant to all parts of the program cycle: Project Initiation, Proposal Development, Donor Appraisal, Start Up, Implementation, Monitoring/Evaluation, and Phase Out. Across that cycle they serve as:
1. The basis for an introductory conversation with stakeholders
2. The centrepiece of your proposal or plan
3. The basis for workplans and job descriptions
4. A management reference during team meetings
5. A guide for monitoring and reporting on progress and adjusting strategies
6. An aid in designing an evaluation
What is Outcomes
Measurement?
It is a way of thinking and a set of tools to help us:
1.  Convert a project idea into a set of activities and
anticipated changes in a way that is:
•  Mindful of context
•  In keeping with the time and resources available
2.  Keep track of progress against the plan and use this
information to:
•  Make management decisions,
•  Service accountability relationships, and
•  Be a learning organization.
Outcome Measurement is Part
of a Global Trend
•  A response to new realities...
–  Need to demonstrate value to citizens and taxpayers…
–  A response by funding bodies around the globe who are
under pressure to provide evidence that their programs
achieve results
–  EEC, World Bank, USAID, UN, WFP and others need to
provide credible evidence of results they achieve
–  By the mid-1990s, almost all OECD governments had adopted
RBM approaches
–  In Canada - the federal government and all provinces
–  Increasingly, philanthropic organizations such as Alberta
Ecotrust Foundation, EcoAction, etc.
What does this mean in
practical terms?
•  When making a plan, or reporting on
progress, we are challenged like never
before to:
•  Describe the connection between the
work we do and the difference we make
If you are planning…
•  A new web-based information portal for educators on climate change
•  You might focus attention on…
–  The quantity and quality of information loaded on to the portal
–  The 'user-friendliness' of the web site
–  The number of users and how they are accessing the portal
–  And also…
–  What teachers are looking for on the portal
–  How they are using the information to inform their teaching
–  How the teaching-learning experience in the classroom is altered
Exercise
•  Think about a project that you know well
•  Craft two outcome statements that
reflect the change you want to see
•  Share your statements with your
neighbour and then in plenary
Where can Outcomes
Measurement be used in your
organization?
•  Organization as a whole
–  Donor-funded programs
•  Partner organizations
–  Individual partner initiatives/projects
15 Years of Outcomes Measurement - lots of visceral reactions from users
What's yours?
Three Outcomes
Measurement Tools
Logic Model (see "Measuring Success…", pages 19-23)
Risk Analysis and Management Table
Monitoring and Evaluation Plan
The Logic Model
•  Outcomes Measurement practice revolves
around the logic model
Splash and Ripple!
Another way of thinking about Activities and Outcomes
Situation Assessment
Before beginning a results-based plan…
1. Looking Inward
- Mandate
-  Competencies
-  Experience
2. Looking Outward
-  Issues & Opportunities
3. Clarifying Priorities
Situation Assessment - Stakeholders
A matrix of Interest or Importance (low/high) against Influence (low/high):
•  High Interest/Importance, Low Influence: consider ways to strengthen or protect interests
•  High Interest/Importance, High Influence: build a coalition of support
•  Low Interest/Importance, Low Influence: spend more time on the other stakeholders
•  Low Interest/Importance, High Influence: caution - be aware of risks or obstacles
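As a rough illustration (not part of the workshop materials), a Python sketch of sorting stakeholders into the four quadrants above; the 1-5 rating scale, the cutoff, and the example stakeholders are assumptions.

STRATEGY = {
    ("high", "low"): "Consider ways to strengthen or protect interests",
    ("high", "high"): "Build a coalition of support",
    ("low", "low"): "Spend more time on the other stakeholders",
    ("low", "high"): "Caution - be aware of risks or obstacles",
}

def quadrant(interest: int, influence: int, cutoff: int = 3) -> tuple[str, str]:
    # rate interest/importance and influence on a 1-5 scale; "high" means >= cutoff
    return ("high" if interest >= cutoff else "low",
            "high" if influence >= cutoff else "low")

stakeholders = [                  # hypothetical ratings for an EE project
    ("Classroom teachers", 5, 2),
    ("School board", 4, 5),
    ("Local media", 2, 4),
]
for name, interest, influence in stakeholders:
    print(f"{name}: {STRATEGY[quadrant(interest, influence)]}")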
Discussion
•  Generate a list of groups with a stake in
the success of your project
•  Place them on the stakeholder matrix
according to their current
–  Interest/importance
–  Influence
Logic Model Format - One Version
Several parallel chains of Inputs → Activities → Outputs, each chain feeding into Outcomes, with the Outcomes together contributing to an Impact.
Logic Model Format - Another Version
Inputs → Activities → Outputs → Immediate Outcomes → Intermediate Outcomes → Ultimate Outcomes
Inputs, Activities, and Outputs describe HOW; Immediate and Intermediate Outcomes describe WHAT WE WANT; Ultimate Outcomes describe WHY.
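A minimal Python sketch of the logic model levels as a simple record; the dataclass and the method name are illustrative assumptions, not something taken from "Measuring Success…".

from dataclasses import dataclass, field

@dataclass
class LogicModel:
    inputs: list[str] = field(default_factory=list)                   # How
    activities: list[str] = field(default_factory=list)               # How
    outputs: list[str] = field(default_factory=list)                  # How
    immediate_outcomes: list[str] = field(default_factory=list)       # What we want
    intermediate_outcomes: list[str] = field(default_factory=list)    # What we want
    ultimate_outcomes: list[str] = field(default_factory=list)        # Why

    def as_chain(self) -> str:
        # render the progression from Inputs up to Ultimate Outcomes
        levels = [("Inputs", self.inputs), ("Activities", self.activities),
                  ("Outputs", self.outputs),
                  ("Immediate Outcomes", self.immediate_outcomes),
                  ("Intermediate Outcomes", self.intermediate_outcomes),
                  ("Ultimate Outcomes", self.ultimate_outcomes)]
        return "\n".join(f"{name}: {'; '.join(items)}" for name, items in levels)

model = LogicModel(
    inputs=["Trainers", "Materials"],
    activities=["Deliver training"],
    outputs=["Participants complete training"],
    immediate_outcomes=["Trained educators apply Outcomes Measurement tools"],
)
print(model.as_chain())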
Inputs…
•  Are the human and
physical resources that
make a program or
project possible.
•  Link outcomes logic to budgeting.
Example: Train and Mentor Environmental Educators in Outcomes Measurement
Inputs: Trainers, Materials, Equipment, Communications…
(The Activities, Outputs, and Outcomes levels are filled in on the slides that follow.)
Activities
•  Describe the essential work of
the program or project – what
you DO
•  Should be summarized in no
more than 5 statements.
Example: Train and Mentor Environmental Educators in Outcomes Measurement
Activities: Assess training needs; Design training materials; Deliver training; Provide follow-up coaching
Inputs: Trainers, Materials, Equipment, Communications…
Outputs…
•  Short term
…a change that is observable as
you complete the activity.
•  Each output relates to one activity.
•  You have substantial control.
•  Involves
–  a specific group of people - those
directly involved in the activity, or
–  the creation of a product
•  The change helps make outcomes
possible.
Example: Train and Mentor Environmental Educators in Outcomes Measurement
Outputs: Defensible report issued; Relevant, user-friendly training materials in place; Participants complete training; On-the-job queries addressed
Activities: Assess training needs; Design training materials; Deliver training; Provide follow-up coaching
Inputs: Trainers, Materials, Equipment, Communications…
Immediate Outcomes…
•  Observable within the project cycle, following activities
•  Describe the potential created by the outputs
•  You still have considerable control
•  Centre on a very specific group of people - those directly involved in the activity or using the product.
Example: Train and Mentor Environmental Educators in Outcomes Measurement
Immediate Outcomes: Stakeholder commitment to a training strategy; Trained managers apply Outcomes Measurement approaches and tools in the design and implementation of the projects to which they are assigned
Outputs: Defensible report issued; Relevant, user-friendly training materials in place; Participants complete training; On-the-job queries addressed
Activities: Assess training needs; Design training materials; Deliver training; Provide follow-up coaching
Inputs: Trainers, Materials, Equipment, Communications…
Intermediate Outcomes…
•  Describe the key changes you want to
make in the program/project
•  Observable at or shortly after the
completion of the program/project
•  You have influence, but not control
•  Usually involves a larger group of
people than those directly involved in
the program/project
•  Are fewer in number; each flows
naturally from several Immediate
Outcomes
Example: Train and Mentor Environmental Educators in Outcomes Measurement
Intermediate Outcomes: Agency leaders make evidence-based, results-focused planning and management decisions; Donors and community stakeholders recognize the 'value-added' by the agency's work
Immediate Outcomes: Stakeholder commitment to a training strategy; Trained managers apply Outcomes Measurement approaches and tools in the design and implementation of the projects to which they are assigned
Outputs: Defensible report issued; Relevant, user-friendly training materials in place; Participants complete training; On-the-job queries addressed
Activities: Assess training needs; Design training materials; Deliver training; Provide follow-up coaching
Inputs: Trainers, Materials, Equipment, Communications…
Ultimate Outcome
•  Long term
…a picture of a preferred future; the
reason why the program or project is
important.
•  You can only make a contribution and
have an indirect influence.
•  The change usually involves an even
wider group of people
•  Is observable well after the program
or project is complete.
Example: Train and Mentor Environmental Educators in Outcomes Measurement
Ultimate Outcome: People/groups identified in environmental education projects engage with the subject matter in a way that helps them become active stewards of the environment
Intermediate Outcomes: Group leaders make evidence-based, results-focused planning and management decisions; Donors and community stakeholders recognize the 'value-added' by the group's work
Immediate Outcomes: Stakeholder commitment to a training strategy; Trained educators apply Outcomes Measurement approaches and tools in the design and implementation of the projects to which they are assigned
Outputs: Defensible report issued; Relevant, user-friendly training materials in place; Participants complete training; On-the-job queries addressed
Activities: Assess training needs; Design training materials; Deliver training; Provide follow-up coaching
Inputs: Trainers, Materials, Equipment, Communications…
A Progression of Results - Typical Terms
Ultimate Outcomes - Conditions: social, civic, economic, environmental, cultural
Intermediate Outcomes - Performance improved: lifestyle shifts; institutional behaviour changes (policy & programming); social mobilization
Immediate Outcomes - Capacities developed: institutional exchanges/collaborations; applied knowledge; functioning spaces/infrastructure; system improvements/adaptations; new/improved leadership; aligned decision-making
Outputs - What we produce: networks, courses, assessments, publications, trained participants, infrastructure
Activities - What we do: research, build, train/mentor, promote, procure, convene
Inputs - What we need: staff, equipment, supplies, office space, services
See "Measuring the Success…", pages 31-41
Logic Model Quiz
“Organize a Provincial Climate
Change Summit”
Making a Logic Model
- Some Tips
1.  Vantage Point
•  Logic models can be written from many different points of view
•  It is important to be clear who is doing the activities - spending implementation funds
•  There is normally only one vantage point in a framework; it could be:
–  a single entity, or
–  shared across two or more entities that are working in partnership
2.  When building a logic model, consider three variables…
a.  Time
•  Outputs - immediate, observable as the related activities are completed
•  Immediate Outcomes - within the project time-frame, following related activities
•  Intermediate Outcomes - by the end of the project or shortly afterward
•  Ultimate Outcomes - beyond the project
b.  Control
•  Outputs - substantial control
•  Immediate Outcomes - still considerable control
•  Intermediate Outcomes - direct influence
•  Ultimate Outcome - indirect influence
c.  Diffusion
•  Outputs - persons directly involved in the activity
•  Immediate Outcomes - same, possibly early interaction with additional people closely connected (e.g. users, colleagues, family members)
•  Intermediate Outcomes - larger teams, organizations, population segments
•  Ultimate Outcome - systems (communities, networks, societies)
3.  Proportion…
•  Outputs, outcomes and impacts vary in their magnitude according to how big (in money terms) and how long the project is.
•  If you have a three-week, $5,000 project, your outcomes should reflect what is reasonable to expect at three weeks and after $5,000 has been spent.
•  If you have a three-year, $500,000 project, your outcomes should reflect what is reasonable to expect at three years and after $500,000 has been spent.
4.  Scale - macro to micro
Outcome Measurement can
be used at any scale - but
which is best for you?
Writing Activity and Results Statements - Tips
5.  Separating the Action from the Change
–  Activities - where you describe what you, as the implementing group, are DOING
–  Results (output and outcome - immediate, intermediate and ultimate) - where you describe what is DIFFERENT
–  For example…
•  Activity: Train 25 participants in Outcomes Measurement
•  Result: Participants apply Outcomes Measurement knowledge when making project plans
6.  Avoid ‘run-on’ statements - for example:
“Rural population retained in rural areas THROUGH
improving their living conditions BY MEANS OF enhancing
rural income, better access to health, social services and
infrastructure.”
- An impact statement for a rural development project
“Improved political freedoms of local communities TO access,
participate and CONTRIBUTE TOWARDS sustainable
regional development.”
- An impact statement for a governance related project
Difficult to locate the actual change you seek
7.  Do not lose sight of the people in your outcomes statements.
–  Sometimes we write statements without naming the people or groups who are experiencing the change. For example: "increased awareness"
–  Try to include the subject in the statement at all times
8.  Avoid tentative wording like: “able to”,
“have increased capacity to…”,
“contribute toward”. These are vague
in meaning and very hard to measure.
Now your turn…..
•  Describe your group project in a results logic model:
•  1 Ultimate Outcome
•  2 Intermediate Outcomes
•  3-4 Immediate Outcomes
•  4-6 Activities and Outputs
Sample Logic Model Worksheet
SITUATION (i.e. context, problem, identified needs, mandate, objectives)
Organization:     Name of project:     Duration:
How? → What do we want? → Why?
Inputs | Activities | Outputs | Immediate outcomes | Intermediate outcomes | Ultimate outcome
Three Outcomes
Measurement Tools
Results Logic Model
Risk Analysis and Management Table
Monitoring and Evaluation Plan
Enablers and Constraints…
•  Are forces affecting any part of your project, both positively and negatively
Risk Analysis Matrix
Project example: Climate Change Conference
Immediate Outcome: Conference participants deliberate across stakeholder lines on conference topics; they identify climate change strategies around which there is common agreement, as well as strategy areas requiring further discussion
1. What could hamper progress?
2. What is the likelihood of a problem?
3. What effect would the problem have on the Outcome?
Effect \ Likelihood    Low (1)             Medium (2)                High (3)
Significant (3)        Monitor Risks       Risk Mitigation Required  Risk Mitigation Required
Moderate (2)           Acceptable Risks    Monitor Risks             Risk Mitigation Required
Minor (1)              Acceptable Risks    Acceptable Risks          Monitor Risks
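A minimal Python sketch of the scoring this matrix implies: multiply the likelihood rating by the effect rating and map the product to a category. The numeric cut-offs are inferred from the cell labels above, not stated in the slides.

EFFECT = {"minor": 1, "moderate": 2, "significant": 3}
LIKELIHOOD = {"low": 1, "medium": 2, "high": 3}

def risk_category(likelihood: str, effect: str) -> str:
    # score 1-2 -> acceptable, 3-4 -> monitor, 6-9 -> mitigation required
    score = LIKELIHOOD[likelihood] * EFFECT[effect]
    if score <= 2:
        return "Acceptable Risk"
    if score <= 4:
        return "Monitor Risk"
    return "Risk Mitigation Required"

print(risk_category("high", "moderate"))    # Risk Mitigation Required
print(risk_category("low", "significant"))  # Monitor Risk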
Risk Mitigation & Monitoring
•  Risk mitigation strategies should reduce the likelihood of an undesirable event, or minimize the effect the event could have on the project.
•  Risk areas that remain above the level of acceptable risk should be monitored using risk indicators.
A Typical Risk Analysis and Management Table
Columns: Description of Risk | Likelihood of Occurrence | Effect on Planned Results (outputs/outcomes/impact) | Rank in Importance (1 = highest) | Owner of the Risk | Risk Mitigation Strategy | Risk Indicators
Rows: Intermediate outcome level | Immediate outcome level | Output level
Three Outcomes
Measurement Tools
Logic Model
Risk Analysis and Management Table
Monitoring and Evaluation Plan (see "Measuring Success…" Appendices, pages 44-66)
Monitoring and Evaluation
- The Distinction
•  Monitoring
A continuous (or regular) self-assessment of progress: collecting data and comparing current performance with planned activities and outcomes.
•  Evaluation
Independent (often external), periodic, strategically focused assessment of a program/project's continuing relevance, management, governance, results, coverage, external relationships, sustainability, etc.
Conventional vs Participatory Monitoring & Evaluation - Contrasts
Who:  Conventional - external experts.  Participatory - community members, project staff, facilitator(s).
What: Conventional - predetermined indicators of success.  Participatory - people identify their own indicators of success.
How:  Conventional - focus on scientific objectivity, evaluator distanced from other participants, delayed and limited access to results.  Participatory - self-evaluation, simple methods adapted to local conditions, open and immediate sharing of results through local involvement.
When: Conventional - usually mid and end of project.  Participatory - more frequent, small-scale evaluations.
Why:  Conventional - accountability, to address donor requirements.  Participatory - empowerment, to help people initiate, navigate, control.
Adapted from Narayan-Parker, 1993: 12 (taken from Estrella and Gaventa, 1998)
Different kinds of Evaluation
•  Formative - mid-stream, where inquiry is focused on areas where the project can improve
•  Summative - at the end, where inquiry is focused on end results and lessons for future practice
•  What is your experience with these two kinds of evaluation?
Indicators…
•  Evidence that the program
is producing expected
results at the output and
outcome levels.
•  They can be expressed
quantitatively and
qualitatively.
•  Information collected
should help managers
make wise decisions.
What indicators tell us about
the wind…
Who has seen the wind?
Neither you nor I.
But when the trees bow down their heads,
The wind is passing by.
- The Wind, Christina Rossetti, 1830-1894
Quantitative Indicators
Expressed as a percentage, ratio, proportion, or number that can
be analyzed statistically
Examples?
– Number of conferences/public engagement activities held
– Ratio of males to females attending public meetings
– Number and type of research grants distributed
– Proportion of targeted schools making requests for assistance
with educational resources
Quantitative Indicators - Your Turn
•  Climate Change Conference Project
•  Immediate Outcome: Major media outlets cover the Climate Change Summit with ample in-depth news analysis pieces
•  What quantitative indicators could you use?
Qualitative Indicators
Expressed as a change or comparison between two states of
understanding or experience that can be analyzed for patterns
Relies on people’s judgment or perceptions
Examples:
– Trends in the way community newspapers cover citizen
actions to reduce environmental impacts
– Before-after comparison in the way the largest five land
developers are handling wetland areas zoned for
development
Qualitative Indicators - Your Turn
•  Climate Change Conference Project
•  Immediate Outcome: Major media outlets cover the Climate Change Summit with ample in-depth news analysis pieces
•  What qualitative indicators could you use?
Qualitative Indicators - Setting
out Markers of Progress
•  Sometimes you cannot meaningfully track progress
by counting “units” at the beginning (baseline), middle
and end
–  Indicators are not always as easy as “number of people
trained”…
•  You may need to set out…
–  Stages of completion, or
–  Markers of progress
Qualitative Indicators - Setting
out Markers of Progress
•  Consider “policy formulation”…
–  Indicator is ‘Progress in updating ‘x’ policy’
•  You could say…
–  Baseline - ‘0’ updated policy
–  Target - ‘1’ updated policy
•  But how useful is that from a management
point of view?
Qualitative Indicators - Setting out Markers of Progress
•  It would be more useful to identify progress markers. For example:
a)  Agreed need for updated policy
b)  Policy research underway
c)  Policy options formulated
d)  Discussion and approvals underway
e)  Updated policy approved
f)  Updated policy enacted
•  Baseline might be 'a'
•  Yr 1 target might be 'c'
•  Yr 3 target might be 'e'
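A small Python sketch (an illustration, not from the handbook) of tracking progress against the ordered markers above; the function name and reporting wording are assumptions.

MARKERS = [
    "Agreed need for updated policy",      # a
    "Policy research underway",            # b
    "Policy options formulated",           # c
    "Discussion and approvals underway",   # d
    "Updated policy approved",             # e
    "Updated policy enacted",              # f
]

def progress_report(current: str, target: str) -> str:
    # compare the marker reached so far with the target marker for the period
    gap = MARKERS.index(target) - MARKERS.index(current)
    return "Target reached or exceeded" if gap <= 0 else f"{gap} marker(s) short of target"

yr1_target = MARKERS[2]   # 'c'
print(progress_report("Policy research underway", yr1_target))  # 1 marker(s) short of target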
Indicator Checklist
Example - Climate Change Conference: Key stakeholder groups commit financial and human resources to implement the Action Plan
Possible Indicators - How well do they score?
a.  Trends in the level of financial commitment, by action plan strategy, by source
b.  Number and type of organizations publicly endorsing the Climate Change Action Plan
c.  Number and type of initiatives underway in the Province, by action plan strategy
Checklist (score each indicator Yes/No):
•  Validity - does it measure the result?
•  Is it cost-effective to collect the information?
•  Does it help understand how the program or project is affecting men and women, and/or specific sub-groups of people, differently?
•  Does it provide useful information for management decisions?
•  Does it communicate well to stakeholders (public, policy makers, donors)?
Indicators, Baseline, and Target
Indicator - the information to be collected. Example: the number of visits to the organization's new website.
Baseline - what the indicator would tell you at the beginning. Example: 720.
Target - what you want the indicator to tell you at a specific point in the future. Example: 2,000.
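A minimal Python sketch of an indicator record with its baseline and target, using the website-visits example above; the dataclass and the progress calculation are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class Indicator:
    description: str   # the information to be collected
    baseline: int      # what the indicator tells you at the beginning
    target: int        # what you want it to tell you at a future point

    def progress_pct(self, current: int) -> float:
        # share of the baseline-to-target distance covered so far
        return 100 * (current - self.baseline) / (self.target - self.baseline)

visits = Indicator("Visits to the organization's new website", baseline=720, target=2000)
print(f"{visits.progress_pct(1360):.0f}% of the way to target")   # 50% of the way to target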
Sources of Information
•  Data Sources are the people or documents
that provide data, not the method of collecting
data
•  Sources of performance information include:
people, individual beneficiaries, groups of
beneficiaries, organizations, partners, and
documents, etc.
•  To ensure reliability, try not to change data
sources over time.
Methods of Gathering Data
See "Measuring Success…" Appendix I, Pros and Cons, pages 49-51; Appendix IV, Samples, pages 55-66
Collecting the Data
•  Conventional Methods - surveys, focus groups, observation, document reviews, counts, etc.
Collecting the Data
•  Participatory Reflection and Action Methods - modeling/mapping, ranking exercises, calendars, walks, historical profiles
Frequency of Collecting Data
•  For each indicator, determine how often
data will be collected.
•  Indicators for outputs and immediate outcomes are more likely to be collected quarterly or every six months; for intermediate outcomes, less frequently.
Responsibility - who does it?
•  Spell out clearly who will do the
collection and analysis of the data for
each indicator
•  Ensure local capacity is built and made responsible, to support sustainability.
•  Decentralize responsibility as much as
possible.
A Typical Monitoring and Evaluation Plan
- Completed example on page 30 of "Measuring Success…"
Columns: Results Levels | Measurement Indicators | Baseline | Target | Data Source/Method | Report Frequency & Responsibility
Rows: Intermediate Outcome | Immediate Outcome | Output
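As a sketch only, one row of the plan could be captured as a simple Python record; the field names follow the column headings above, while the example values are hypothetical.

from dataclasses import dataclass

@dataclass
class MEPlanRow:
    results_level: str        # Output / Immediate Outcome / Intermediate Outcome
    indicator: str
    baseline: str
    target: str
    data_source_method: str
    report_frequency: str
    responsibility: str

row = MEPlanRow(
    results_level="Immediate Outcome",
    indicator="Number of trained educators applying Outcomes Measurement tools",
    baseline="0",
    target="20 by project end",
    data_source_method="Follow-up survey of training participants",
    report_frequency="Every six months",
    responsibility="Project coordinator",
)
print(row)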
Now your turn…..
1.  Take one outcome statement,
brainstorm indicators
2.  Use the indicator checklist to
refine your best indicators
3.  Identify the source, means of
data gathering (a.k.a.
‘instrument’), and frequency
4.  Repeat, if time permits
5.  Share with a neighbour
Final Discussion
1.  How might you use these
Outcomes Measurement tools
to strengthen your projects?
2.  What questions linger?