THE ROLE OF LEADERSHIP IN PERFORMANCE MANAGEMENT
Donald P. Moynihan,
La Follette School of Public Affairs,
University of Wisconsin-Madison
Presentation to Chicago Federal Leadership Forum
Have you encountered?
• Strategic planning
• Performance measures
• Performance contracts
• Pay for performance
The role of leadership
• "During my 20 years in the private sector as a CEO and advisor to CEOs, I found that leadership, measurement, and a motivated workforce create the foundation for good performance. I am confident that the same is true in government."
– Jeff Zients, Chief Performance Officer, 2009
Outline
• Defining terms
• Era of governance by performance management
• From Bush to Obama
• How do we use performance systems?
• What fosters use of performance data?
• Summary points
Defining terms
Performance management
• A system that generates performance information through strategic planning and performance measurement routines, and connects this information to decision venues.
Performance regimes
• Performance tools create unprecedented pressure on public actors to perform, in a context where performance is defined by quantitative indicators.
Purposes of Performance Information
• Promote: How can I convince political actors, stakeholders, and the public that my agency is doing a good job?
• Celebrate: What accomplishments are worthy of the important ritual of celebrating success?
• Learn: What is working or not working?
• Improve: What exactly should who do differently to improve performance?
Purposes of Performance Information
• Evaluate: How well is my agency performing?
• Control: How can I ensure that my subordinates are doing the right thing?
• Budget: On what programs, people, or projects should my agency spend the public's money?
• Motivate: How can I motivate employees and collaborators to improve performance?
Era of governance by performance management

Era of Governance by Performance Management
• The rise of a doctrine
• Not new, but more influential than before
• Must justify actions in terms of outputs and outcomes
• Basis for holding new structural forms accountable
• Doctrinal logic for change
Government Performance and Results Act 1993
• Mandated:
  – 5-year strategic plans, updated every 3 years
  – Specific goals and objectives
  – Annual performance reviews and plans
From Bush to Obama
Bush approach
• President's Management Agenda
• "everyone agrees that scarce federal resources should be allocated to programs that deliver results"
• Wanted to integrate performance data into the budget process
Congressional Justifications
• Centered around performance goals
• Pushback from Appropriations Committees
  – Veterans Administration told "to refrain from incorporating 'performance-based' budget documents"; later told: "If the Department wishes to continue the wasteful practice of submitting a budget structure that will not serve the needs of the Congress, the Congress has little choice but to reject that structure and continue providing appropriations that serve its purposes."
  – Two budgets required
Congressional Justifications
• Department of Transportation told: "agencies are directed to refrain from including substantial amounts of performance data within the budget justifications themselves, and to instead revert to the traditional funding information previously provided. Performance-related information may be submitted under separate cover."
• Negative consequences were promised for agencies that ignored this directive: "If the Office of Management and Budget or individual agencies do not heed the Committee's direction, the Committee will assume that individual budget offices have excess resources that can be applied to other, more critical missions."
Program Assessment Rating Tool (PART)
• 5-year summary by OMB of evidence on program performance for 1,016 programs
  – 18 percent are Effective
  – 31 percent are Moderately Effective
  – 29 percent are Adequate
  – 3 percent are Ineffective
  – 19 percent are Results Not Demonstrated
PART as Evidence-based Dialogue
• Third-party program review with a clear opinion
• Greater emphasis on performance
• The standard of proof for program performance can only be satisfied by positive evidence of results
• The burden of proof for performance rests on agencies
• Entire programs are evaluated on a regular basis
• The routine nature of PART creates an incentive to engage
Obama: A pragmatic approach
• "The question we ask today is not whether our government is too big or too small, but whether it works – whether it helps families find jobs at a decent wage, care they can afford, a retirement that is dignified. Where the answer is yes, we intend to move forward. Where the answer is no, programs will end. And those of us who manage the public's dollars will be held to account, to spend wisely, reform bad habits, and do our business in the light of day, because only then can we restore the vital trust between a people and their government."
Example: Pedometer challenge!
• Voluntary
• Belief that transparent performance numbers will change behavior, create a sense of competition, and raise performance
Early evidence on Obama
• Performance measurement will be important
  – "The President is creating a focused team within the White House that will work with agency leaders and the OMB to improve the results and outcomes for Federal Government programs while eliminating waste and inefficiency"
  – Chief Performance Officer
  – Continue to maintain agency-level performance positions
What happens to PART?
• Not clear
• Criticized as ideological, as too broad, and as a data collection exercise
• Analysis remains in place, but new PARTs have not started
• OMB has offered agencies funds for better evaluations
New emphasis on leadership
• Focusing leaders on what matters – key goals
• Accelerating results – Performance Improvement Council; data-driven meetings
• Style: focused collaboration
New focus on information use
• Will be a central aspect of the Obama administration's performance initiatives
• Jeff Zients: "The ultimate test of our performance management efforts is whether or not the information is used"
• Shelley Metzenbaum: "the key performance management challenge facing the Obama administration is to use—not just produce—performance goals and measures"
How do we use performance systems?
Why care about use?
• For reforms to succeed, data must actually be used
• Provides a tractable means of studying the impact of results-based reform
• Public organizations have devoted significant time and resources to creating routines to collect and disseminate data
• Almost no attention to creating routines of use
• How do you use performance data?
Types of responses: 4 Ps
• Passive
• Perverse
• Political
• Purposeful
Passive use of data
• Do the minimum to comply with requirements
• Do not actually use data
• Correlated with cynicism about reforms
Perverse use of data
• Effort substitution: Reducing effort on non-measured dimensions
• Cherry-picking/cream-skimming: Focusing effort on the subgroups of clients most likely to provide the greatest impact on performance measures while effectively denying services to others
• Measure selection: Selecting metrics or data to measure that will offer the most favorable portrayal of a service
• Hiding numbers: Declining to present performance measures that may exist
Perverse use of data
• Output distortion: Manipulating measurement processes to improve measured performance
• Ratchet effects: Curbing productivity in one time period to avoid the setting of more challenging targets in another
• Churning: Frequently adopting different targets or measures to prevent comparison across time
• Cheating: Simply making up numbers; rare, but it does occur
Responding to perversity
• Add new/additional measures
• Change existing measures
• Rely on/cultivate intrinsic norms to limit misbehavior
• Avoid high-powered incentives
Political uses of data
• Process of selecting measures means shaping a program narrative
• "Understand that measuring policy is not a science. It is an art. It is words, and pictures and numbers. And you create impressions, beliefs, understandings and persuasions."
Political uses of data
• Data tells us what happened
• Program officials still need to interpret and explain:
  – why performance did or did not occur;
  – the context of performance;
  – how implementation occurred;
  – an understanding of outside influences on performance; and
  – how to choose which program measure is a priority.
• Exploit ambiguity and subjectivity of data
Political: Ambiguity of data
• Examine same programs, but disagree on data
• Agree on data, but disagree on meaning
• Agree on meaning, but not on next action steps/resources
Political: Subjectivity of data
• Actors will select and interpret performance information consistent with institutional values and purposes
Evidence of Ambiguity in PART
• Ambiguity of terms:
  – e.g., program purpose, quality evaluation, ambitious, having made progress
• How to interpret results? Multiple logics from experiment:
  – Argue that ratings are unreliable
  – Cut poorly managed programs
  – Raise funding for programs with positive assessments
  – Parity: raise funding because a program with a similar assessment received more
  – Delay cuts because progress is being made
  – Clear relationship between resources, need, and program delivery
  – Stakeholder and congressional views
Evidence of Subjectivity with PART
• OMB using PART to expand influence in performance management/policy
  – OMB can define programs, goals, measures, agency responsibility
• Disagreement with agencies/Congress on meaning/relevance of PART
• Experimental evidence:
  – UW students significantly more likely to disagree with OMB, and to argue for higher assessments and resources
Implications for Decisionmaking
• Performance information use reflects the political process; it does not replace it
• Performance information use does not lead to clarity
• Ability to structure dialogue is tied to power
Purposeful use of data
• Use data to improve program performance
• Goal-based learning:
  – efficiency improvements
  – better targeting of resources
  – more informed strategic decisions
  – tying indicators to rewards/sanctions in contract arrangements
Purposeful use of data
• Use of performance information for problem-solving more likely to occur in intra-institutional settings
  – Reduces competing interpretations
• Problem of neglect
  – Actors rarely do anything with the information
Learning forums
• Routines specifically focused on solution-seeking, where actors collectively examine information, consider its significance, and decide how it will affect future action
• What measures are useful for agency officials?
• What other ways can we encourage learning forums?
What fosters performance information use?
The Right Context
• Simple function that is easy to measure
• Clear link between measures of actions and measures of outcomes
• One-dimensional – relatively few measures that do not conflict with one another
• Stakeholder support – clear agreement about purpose
Other factors
• Learning forums
• Mission-based culture/supportive culture
• Resources
• Administrative stability
• Administrative capacity
Quantitative approach
• 3 studies using survey-based data
• Self-reported performance information use
• Results from Moynihan and Pandey (in press) and Moynihan, Wright and Pandey (2009; 2010)
Study 1: Ordinal regression of reported performance information use for decisions

Variable                               Hypothesized direction   Result
Individual beliefs
  Public service motivation            Positive                 ***
Job attributes
  Reward expectation                   Positive                 --
  Generalist leader                    Negative                 -***
  Task-specific experience             Positive
Organizational factors
  Information availability             Positive                 ***
  Developmental culture                Positive                 ***
  Flexibility                          Positive                 *
  Budget staff take adversarial role   Positive/negative        --
External factors
  Citizen participation                Positive/negative        +
  Professional influence               Positive                 +

*** = significant at .001; ** = .01; * = .05; + = .10 (two-tailed tests)
Controls: region, income per capita, government size, population size, population homogeneity
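
For readers curious what the modeling behind Study 1 looks like in practice, below is a minimal sketch of an ordered logit regression in Python with statsmodels. The data are simulated and the variable names are illustrative assumptions; this is not the study's actual data or code.

```python
# Minimal sketch of Study 1's method: an ordinal (ordered logit) regression of
# self-reported performance information use. All data here are simulated.
import numpy as np
import pandas as pd
from statsmodels.miscmodels.ordinal_model import OrderedModel

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "psm": rng.normal(size=n),                # public service motivation
    "info_availability": rng.normal(size=n),  # information availability
    "developmental_culture": rng.normal(size=n),
    "flexibility": rng.normal(size=n),
})
# Simulate a five-category survey outcome loosely consistent with the
# reported signs (positive effects for all four predictors).
latent = (0.5 * df["psm"] + 0.6 * df["info_availability"]
          + 0.5 * df["developmental_culture"] + 0.2 * df["flexibility"]
          + rng.logistic(size=n))
df["pi_use"] = pd.cut(latent, bins=5, labels=False)  # ordinal outcome, 0-4

model = OrderedModel(
    df["pi_use"],
    df[["psm", "info_availability", "developmental_culture", "flexibility"]],
    distr="logit",
)
result = model.fit(method="bfgs", disp=False)
print(result.summary())
```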
Intrinsic vs. extrinsic motivation
• Sense of public service motivation mattered
• Possibility of extrinsic reward did not create an incentive to use data
• Implication: performance information use as extra-role behavior
Organizational factors
• Information availability
  – Supply-side approach
  – Use increases with better information, and when information is tied to management systems
Organizational factors
• Demand-side approach: culture matters
  – Previous work focuses on whether culture welcomed performance management reforms
  – What about broader measures of culture?
  – Developmental culture (adaptability, readiness, growth)
  – Flexibility – unlikely to use data if insights cannot be applied
Specialist vs. generalist leaders
• Task-specific knowledge provides context in which to interpret and apply data
• Leadership role
  – Task-specific leaders more likely to use data than generalist leaders
Other evidence of leadership
• Support/commitment
  – Provision of resources
  – Participation
• What other ways can leadership matter?
Study 2: Transformational leadership
• Approach to leadership consistent with performance:
  – Articulate an appealing vision of the organization's mission and future
  – Model behavior consistent with the vision; serve as an inspiring role model
  – Challenge old assumptions
Propositions
• Transformational leadership behaviors will have an indirect, positive effect on performance information use through their influence on goal clarity
• Transformational leadership behaviors will have an indirect, positive effect on performance information use through their influence on organizational culture
Key measures
• Transformational leadership
  – Asked department heads/assistant city managers the extent to which the city manager demonstrates transformational leadership:
    • articulates his/her vision of the future
    • leads by setting a good example
    • challenges me to think about old problems in new ways
    • says things that make employees proud to be part of the organization
    • has a clear sense of where our organization should be in five years
  – Aggregated responses by organization
Structural Equation Model
[Figure: path diagram in which Transformational Leadership affects Performance Information Use indirectly, through Goal Clarity, Performance Information Availability, and Developmental Culture. Transformational Leadership → Goal Clarity = 0.66*. All paths reported as standardized coefficients; *p < 0.05.]
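
Since the model reports standardized coefficients, each proposition's indirect effect is simply the product of the coefficients along the corresponding route. A minimal sketch of that arithmetic follows, using the one path value recoverable from the figure (0.66) and hypothetical placeholder values for the remaining paths.

```python
# Sketch of how indirect effects are read off a structural equation model:
# multiply the standardized coefficients along each causal route. Only the
# leadership -> goal clarity value (0.66) comes from the figure; the other
# values below are hypothetical placeholders.
paths = {
    ("leadership", "goal_clarity"): 0.66,   # from the figure
    ("goal_clarity", "pi_use"): 0.26,       # hypothetical
    ("leadership", "dev_culture"): 0.33,    # hypothetical
    ("dev_culture", "pi_use"): 0.17,        # hypothetical
}

def indirect_effect(route, paths):
    """Product of the standardized coefficients along a route of nodes."""
    effect = 1.0
    for a, b in zip(route, route[1:]):
        effect *= paths[(a, b)]
    return effect

via_clarity = indirect_effect(["leadership", "goal_clarity", "pi_use"], paths)
via_culture = indirect_effect(["leadership", "dev_culture", "pi_use"], paths)
print(f"Indirect effect via goal clarity:          {via_clarity:.3f}")
print(f"Indirect effect via developmental culture: {via_culture:.3f}")
```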
Implications
• Leadership and management
• Indirect effects are important
• "Setting the table" as a long-term leadership strategy
Study 3: Perceived social impact
• Individuals who see their work as helping others are more likely to use performance information
• Some evidence that individuals who perceive greater social impact are more motivated
• Why should it relate to performance information use?
Key measures
• Perceived social impact
  – I feel that my work makes a positive difference in other people's lives.
  – I am very aware of the ways in which my work is benefiting others.
  – I am very conscious of the positive impact my work has on others.
  – I have a positive impact on others in my work on a regular basis.
Purposeful and political use
• Purposeful
  – I regularly use performance information to make decisions.
  – I use performance information to think of new approaches for doing old things.
  – I use performance information to set priorities.
  – I use performance information to identify problems that need attention.
• Political
  – I use performance information to communicate program successes to stakeholders.
  – I use performance information to advocate for resources to support program needs.
  – I use performance information to explain the value of the program to the public.
Table 2: OLS Regression of Performance Information Use

Independent variables     Purposeful Use   Political Use
Perceived social impact   .43 (.12)***     .52 (.11)***
Goal clarity              .19 (.10)*       .25 (.09)**
Centralization            .05 (.07)        .10 (.08)
Political support         .14 (.07)*       .06 (.07)
Red tape                  .04 (.03)        .02 (.02)
Years in position         .01 (.007)       .006 (.008)
Supervisory level         .03 (.05)        .02 (.04)
Educational level         .05 (.07)        .09 (.06)
Sector                    -.06 (.15)       .08 (.14)
N                         184              186
R2                        .25              .31

*** = .001; ** = .01; * = .05 (one-tailed tests); standard errors in parentheses
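
To make the method behind Table 2 concrete, here is a minimal OLS sketch in Python with statsmodels. The data are simulated and the column names are hypothetical stand-ins for the survey measures; the study's actual estimates are those reported in the table above.

```python
# Minimal sketch of the Table 2 setup: separate OLS regressions of purposeful
# and political performance information use on perceived social impact plus
# controls. All data are simulated; names are hypothetical stand-ins.
import numpy as np
import pandas as pd
import statsmodels.api as sm

rng = np.random.default_rng(1)
n = 186
df = pd.DataFrame({
    "social_impact": rng.normal(size=n),
    "goal_clarity": rng.normal(size=n),
    "centralization": rng.normal(size=n),
})
# Outcomes simulated loosely in line with the table's positive coefficients.
df["purposeful_use"] = (0.4 * df["social_impact"] + 0.2 * df["goal_clarity"]
                        + rng.normal(size=n))
df["political_use"] = (0.5 * df["social_impact"] + 0.25 * df["goal_clarity"]
                       + rng.normal(size=n))

X = sm.add_constant(df[["social_impact", "goal_clarity", "centralization"]])
for outcome in ("purposeful_use", "political_use"):
    result = sm.OLS(df[outcome], X).fit()
    print(outcome)
    print(result.summary().tables[1])  # coefficient table, as in Table 2
```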
Study 4: Experimental approach
• How does performance information matter to decisions?
• How does the framing of performance information affect decisions?
• Respondents given surveys with scenarios and asked to make budget recommendations
• Series of vignettes for different programs
• Half of the vignettes are control, half are treatment
Theoretical background
• Research on decision frames from psychology and behavioral economics
• Performance information is strategically selected and presented – does this work?
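
A simple way to picture the analysis of such an experiment: compare average budget recommendations between respondents who saw the control vignette and those who saw the treatment vignette. The sketch below uses simulated data and hypothetical effect sizes, purely for illustration.

```python
# Sketch of a control-vs-treatment comparison for one vignette: did adding
# performance data shift the recommended budget change? Data are simulated.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
# Hypothetical recommended budget changes, in percent.
control = rng.normal(loc=2.0, scale=3.0, size=60)    # vignette without data
treatment = rng.normal(loc=3.5, scale=3.0, size=60)  # vignette with data added

t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"Control mean recommendation:   {control.mean():.2f}%")
print(f"Treatment mean recommendation: {treatment.mean():.2f}%")
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
```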
Does the Addition of Performance Data Matter?
• Control: no data; treatment: addition of data without clear correlation to resources

The Department of Land and Water Resources is responsible for monitoring and maintaining the water quality of lakes in the county, including two major lakes that are popular for swimming and other water sports during the summer. Estimates of water quality are based on pH levels, pesticides, nitrates and other chemicals in the water.

                                                  2007      2008      2009
Program funding                                   174,000   179,000   182,000
Days water quality of major lakes deemed unsafe   12        14        11
Does the Addition of Performance Data Matter?
• Control: no data; treatment: addition of data with clear relationship to resources

The Department of Social Services delivers a program called the Home Downpayment Initiative. Using a mix of federal, state, and local resources, the program seeks to increase homeownership rates among low-income and minority families. To do so, it provides financial assistance to first-time homebuyers for downpayment and closing costs.

                                      2007      2008      2009
Home Downpayment funding              723,000   747,000   769,000
Number of families purchasing homes   36        38        46
Is outcome data more powerful than output?
• Control: output data; treatment: outcome data

The Department of Health Services offers a program called Health Check, which is a preventive health check-up program made available for anyone under the age of 21 who is currently enrolled in Medicaid. Health Check provides a head-to-toe medical exam, immunizations, eye exam, lab tests, growth and development check, hearing check, nutrition check, and teen pregnancy services. The goal of the program is to prevent the incidence of more serious and more expensive health situations.

                                           2007      2008      2009
Health Check funding                       232,000   244,000   269,000
Clients treated                            1,232     1,401     1,325
Estimated savings due to preventive care   383,000   402,000   389,000
Threshold effects
• Treatment: performance data pass a memorable threshold (200)

The County Tourism Board seeks to increase visits from those who live outside the county, and to increase the use of recreational and cultural opportunities by both locals and outsiders. It collects data from local hotels, restaurants, and other businesses that depend on tourists. In the last number of years, the number of tourists visiting the county has stayed relatively flat at about 100,000, and the Board has focused its marketing budget on "quality, not quantity," by increasing the dollar amount that each tourist spends.

Control version:
                                                2007      2008      2009
Program budget                                  120,898   124,490   131,289
Average daily dollar amount spent by tourists   178       184       195

Treatment version:
                                                2007      2008      2009
Program budget                                  124,294   127,196   135,329
Average daily dollar amount spent by tourists   183       188       201
Including equity measures
• Treatment: addition of equity measure that aligns with mission

The Department of Social Services funds the Early Intervention Program, which provides services for children three and under with developmental delays and disabilities. The mission statement for the Early Intervention Program is: "Our mission is to provide access to therapies that improve child developmental outcomes." The program is administered by a non-profit, and employs therapists to work with children and families in the home environment.

                                           2007      2008      2009
Program budget                             329,677   333,451   341,386
Number of hours of contact per year        8,901     8,977     9,011
Number of children served                  212       233       221
Percent of eligible children below the
poverty line that access services          42%       47%       51%
Summary points: what to do
• Move beyond passive use and limit perverse use
• Focus on political use
  – What is the narrative of your program?
  – What goals are meaningful and telling? How do they relate to the narrative?
  – What goals are essential to explaining program purpose and achievement?
  – How do you frame and communicate measures? Who is your audience?
Summary points: what to do
• Focus on purposeful use
  – Provide resources, be involved, make clear that it is important
  – Encourage the right context for use
  – Foster goal clarity
  – Encourage a supportive culture
  – Create and support learning forums
  – Appeal to intrinsic motivation
  – Focus on demonstrating the significance of measures
Questions/Comments
[email protected]
http://www.lafollette.wisc.edu/facultystaff/moynihandonald.html

The Dynamics of Performance Management
Georgetown University Press