
Government for Results: How to
Build and Strengthen Institutional
Capacities
Presentation to a DNP Seminar on Monitoring
for Results
Bogotá, Colombia – 12 March 2008
Keith Mackay
([email protected])
Structure of Presentation
1. Why Countries Want a Monitoring &
Evaluation (M&E) System
2. Colombia and Australia: Some Common,
Important Issues
3. Country Case Study: Australia
4. Important Lessons — Conclusions
5. Useful Resource Materials
Why Countries Want an M&E System
• To support budget decision-making
  = performance-based budgeting
  3 main types: direct, indirect, presentational
• To support national, sectoral and subnational planning
• To design new policies and programs
• To assist ministries, entities and subnational governments in their management
• To strengthen accountability relationships
Related Uses of M&E Information
• To clarify government goals
• To set performance targets; customer service standards
• To contract public services to private sector
• performance contracts; personnel appraisal
• anti-corruption; measuring “leakage” of government funds
• civil society oversight of government performance — “voice”
What Do We Mean by M&E?
• Performance indicators — inputs, processes, outputs, outcomes, impacts
• Clarify program objectives, program logic / results chains, targets
• Service delivery surveys
• Participatory M&E
• Rapid evaluations — Colombia’s E2 evaluations
• Rigorous impact evaluations
• Cost-benefit analysis (a simple illustrative sketch follows this list)
• Each M&E tool has strengths, costs, limitations
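To make the cost-benefit tool concrete, here is a minimal sketch in Python. The program, its cash flows and the discount rates are purely illustrative assumptions, not figures from the presentation; the point is only that the verdict on a program can flip with the discount rate used.

# Minimal cost-benefit sketch: net present value (NPV) of a hypothetical program.
# All figures are illustrative assumptions, not taken from the presentation.

def npv(yearly_net_benefits, discount_rate):
    """Net present value of a stream of yearly net benefits (year 0 first)."""
    return sum(amount / (1 + discount_rate) ** year
               for year, amount in enumerate(yearly_net_benefits))

# Hypothetical program: up-front cost of 100, then net benefits of 30 a year for 5 years.
flows = [-100, 30, 30, 30, 30, 30]
print(f"NPV at  5%: {npv(flows, 0.05):6.1f}")  # about +30: benefits exceed costs
print(f"NPV at 20%: {npv(flows, 0.20):6.1f}")  # about -10: the same program fails at a higher rate

The same structure extends naturally to sensitivity analysis over other assumptions, such as benefit levels or the time horizon.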
Colombia and Australia:
Some Common, Important Issues: 1
1. Data quality, reliability, credibility
2. Data overload, data harmonization, data
cost
3. Data production versus data utilization
4. Skills to use data (and evaluations) well
5. Monitoring or evaluation?
6. Number, types of evaluations conducted
What To Avoid: Data Collected But Not Used
Colombia and Australia:
Some Common, Important Issues: 2
7. Central ministries versus sector ministries
and agencies?
8. How to ensure the use of M&E
information: how to build a performance
culture
  • Role of champions
  • Architecture issues
  • Carrots, sticks and sermons
  • Need to survey extent of use of M&E
9. Isolation of M&E officials
Australia: 1st Generation (1987-1997)
Context: severe budget pressures; many
public sector reforms; autonomy given to
line departments, agencies
  • but they failed to conduct M&E
So, a government evaluation strategy was developed ― to force evaluation
Objectives:
  1. to aid budget decision-making
  2. to strengthen accountability within government
  3. to aid managers in ministries / agencies
1st Generation System − Architecture
• System designed and managed by Department of Finance (DoF)
• Evaluations mandatory ― every 3-5 years for every program
• All ministries must prepare rolling, 3-year plans ― including major evaluations only
• Broad range of evaluation types ― rapid evaluations; formal reviews; CBA; impact evaluations; performance audits; etc.
• By the mid-1990s, some 160 evaluations underway at any time
• Little formal requirement for collection or reporting of performance indicators
Key Roles of DoF
• DoF was main architect of evaluation strategy and other public sector reforms
• Partnership with Dept of PM & Cabinet, Treasury, Public Service Commission
• involvement of DoF policy analysts in evaluation planning and major evaluations
• key role of these analysts in budget process
• evaluation was usually conducted by line departments / agencies
• special DoF unit provided evaluation support to departments / agencies
Achieving Cultural Change Within DoF
• Required strong leadership from Permanent Secretary and his deputies
• Focused recruitment and promotion
• Some on-the-job training
• Staff turnover was required
1st Generation System − Strengths
• Around 80% of new policy proposals and 2/3 of savings options sent to Cabinet were influenced by evaluation findings
• and these findings influenced Cabinet considerably in its budget decision-making
• Australian National Audit Office: high use of evaluation findings by sector depts/agencies
• Evaluation was a collaborative endeavour
Auditor-General: “... the success of evaluation at the Federal level ... was largely due to its full integration into the budget process.”
1st Generation System − Weaknesses
1. Uneven quality of evaluations
2. Insufficient central support for advanced
evaluation training
3. Insufficient formal requirements for the
collection and reporting of performance
information
4. An administrative burden on ministries
was claimed
Australia: 2nd Generation ―
Performance Framework (1997 to 2006)
Genesis was a new conservative government:
• significant reduction in size of civil service
• policy-advising system during budget process largely dismantled
• central oversight and “bureaucratic” rules reduced considerably
• DoF was downsized considerably; its role in providing advice during the budget process and in maintaining budget estimates was also greatly reduced
2nd Generation − Framework
• 2nd Generation system based on mix of principles, expectations, few requirements
• evaluation strategy dismantled: evaluation was “deregulated”
• evaluation was encouraged ― but not required
• emphasis was on performance monitoring (i.e., performance indicators) ― of outputs and outcomes ― and reporting them to Parliament, both ex ante and ex post
2nd Generation System:
How Successful? (i)
Australian National Audit Office: departments’
performance reports to Parliament highly
inadequate
• poor quality data ― weak standards, systems
• little use made of targets or benchmarking
• a lot of data on government outputs, but little on outcomes
• lack of real analysis of performance information
• Parliamentary committees very unhappy with this inadequate information
2nd Generation System:
How Successful? (ii)
OECD (2002): The second generation
reforms “deprived the Finance Ministry of
the information necessary for it to
adequately advise the Minister”
In terms of evaluation, still some good-practice departments
• e.g., Family & Community Services; Employment; Education; Health
Good-Practice: Department of
Employment
• good output / outcome indicators
• service charter; regular surveys of client satisfaction → reported publicly
• Minister rates all policy advice on 11 criteria
• active evaluation branch:
  • conducts rigorous impact evaluations (a simple illustrative sketch follows)
  • has longitudinal datasets
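As a purely illustrative aside (the figures and method below are assumptions, not the department’s actual analysis), a rigorous impact evaluation typically compares the change observed for program participants against the change for a comparison group. A minimal difference-in-differences sketch in Python:

# Minimal difference-in-differences sketch on made-up longitudinal data.
# Figures are hypothetical employment rates (%), not real program results.

participants_before, participants_after = 42.0, 55.0   # treated group, before / after the program
comparison_before, comparison_after = 40.0, 46.0       # comparison group over the same period

change_participants = participants_after - participants_before
change_comparison = comparison_after - comparison_before

# Impact = extra change for participants beyond the change in the comparison group,
# assuming both groups would otherwise have followed parallel trends.
impact_estimate = change_participants - change_comparison
print(f"Estimated program impact: {impact_estimate:.1f} percentage points")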
Good-Practice: Department of Family &
Community Services
• good range of performance information on outputs reported to Parliament
  • quantity, quality, cost
• and some information on intermediate outcomes
• substantive use of its evaluation findings in these reports to Parliament
Good-Practice: Department of Family & Community Services (continued)
• has Research & Evaluation Plan
• 145 research/evaluation projects
• budget of US$20 million per annum
• helps to fund 6 university research institutes
• all funded internally by the department from its total budget allocation
Third Generation System = Strategic
Review Framework (from 2006)
Context: ongoing concerns with coordinating and implementing government programs
• ongoing concern with “connectivity” — coordination across ministries, agencies, and federal / state levels
• desire for a more systematic approach to reviews of government spending
• strong desire of DoF to rebuild its role in the budget process and in policy advising
Third Generation − Framework
Two types of review to ensure spending is
efficient, effective and aligned with gov’t
priorities:
1. Strategic Reviews to focus on major policy
and spending areas
2. Program Reviews to focus on individual
programs
Note: the requirements of the Performance
Monitoring Framework (= 2nd Generation)
will also continue
Prioritizing Reviews
• Key ministers decide Review topics
• DoF secretariat manages the Reviews
• Review teams include DoF and line department officials
• key ministers to decide which Review recommendations to implement
• DoF to monitor implementation
Important Lessons — Conclusions: 1
1. Data quality, reliability, credibility
2. Data overload, data harmonization, data
cost
3. Data production versus data utilization
4. Skills to use data (and evaluations) well
5. Monitoring or evaluation?
6. Number, types of evaluations conducted
Important Lessons — Conclusions: 2
7. Central ministries versus sector
ministries and agencies?
8. How to ensure the use of M&E
information: how to build a performance
culture
  • Role of champions
  • Architecture issues
  • Carrots, sticks and sermons
  • Need to survey extent of use of M&E
9. Isolation of M&E officials
Useful Resource Materials
• World Bank, A Diagnosis of Colombia’s National M&E System, SINERGIA, 2007. http://www.worldbank.org/ieg/ecd/sinergia.html Available in Spanish.
• Manuel Fernando Castro, Insider Insights: Building a Results-Based Management and Evaluation System in Colombia, World Bank, 2008. http://www.worldbank.org/ieg/ecd/docs/ecd_wp_18.pdf
• Keith Mackay, How to Build M&E Systems for Better Government, World Bank, 2007. http://www.worldbank.org/ieg/ecd/better_government.html Available in Spanish.
• Keith Mackay, Two Generations of Performance Evaluation and Management System in Australia, World Bank, 2004. http://lnweb18.worldbank.org/oed/oeddoclib.nsf/DocUNIDViewForJavaSearch/14163969A1A709BD85256E5100013AA8/$file/ecd_wp_11.pdf
Useful Resource Materials
• Independent Evaluation Group, Monitoring & Evaluation Tools, Methods and Approaches, World Bank, 2004. http://www.worldbank.org/ieg/ecd/me_tools_and_approaches.html Available in Spanish.
• Independent Evaluation Group, Influential Evaluations: Evaluations that Improved Performance and Impacts of Development Programs, World Bank, 2004. http://www.worldbank.org/ieg/ecd/influential_evaluations.html Available in Spanish.
• World Bank website on Building Government M&E Systems: www.worldbank.org/ieg/ecd/
• World Bank / Carleton University website on the International Program for Development Evaluation Training: http://www.ipdet.org/