A GRADUATION SPEECH
Michael Lewis, a member of Princeton's Class of 1982 and author of such books as "Liar's Poker" and "Moneyball," speaks at the 2012 Baccalaureate in a speech called "Don't Eat Fortune's Cookie."
http://www.youtube.com/watch?v=CiQ_T5C3hIM
PAY FOR LUCK

From Mathews, Steve (August 2013), "Proving CEOs Overpaid for Luck Helped Stir Pay Backlash," Bloomberg.com: http://mobile.bloomberg.com/news/2013-08-22/proving-ceos-overpaid-for-luck-helped-stir-pay-backlash.html

[Photo: University of Chicago Professor Marianne Bertrand, who found that a pay-for-luck phenomenon occurred with multinational businesses when currency fluctuations, rather than management strategies, improved results. Source: The University of Chicago via Bloomberg]

Marianne Bertrand helped unleash a shareholder backlash against CEO pay with research
she began while still in graduate school.

In a 2001 paper based on her work as a Ph.D. candidate at Harvard University, the 43-year-old labor economist documented that chief executive officers at U.S. oil companies got
raises when their company’s fortunes improved because of changes in global oil
prices beyond their control. The same pay-for-luck phenomenon occurred with
multinational businesses when currency fluctuations, rather than management strategies,
boosted results, she found.
Books on Behavioral Economics
For those interested in a free book on the topic, Policy and Choice: Public Finance through the Lens of Behavioral Economics by William J. Congdon, Jeffrey R. Kling, and Sendhil Mullainathan is available as a free download online.
Bringing Behavioral Economics
into the Classroom
Alan B. Krueger
Princeton University
Why Should You Bring Behavioral
Economics into the Classroom?
• Trains students to avoid making serious mistakes down the road (e.g., don't invest in your employer: Enron)
• Clarifies what is rational and irrational decision making
• Leads to a better understanding of opportunity costs, time discounting, and other economic concepts
• Provides a leg up in the business world
• Provides a richer, more realistic understanding of decision making in practice → Positive Economics
• Can lead to better policies (Pension Reform Bill) → Normative Economics
• Easy to explain and demonstrate in class
Elements of Rational Decision Making
• Utility = the amount of satisfaction a person gets from consuming a good. Utility can also be thought of as happiness (although it technically goes beyond just happiness)
Elements of decision-making:
• Individuals make choices to maximize some objective function (usually a utility function) under the constraints that they face
• The utility function is stable
• If there is uncertainty, individuals maximize expected utility by assigning probabilities to different states of the world (see the sketch below)
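A minimal formal sketch of the last point, in standard notation not used in the slides: with states of the world s, probabilities p_s, and outcomes x_s, the decision-maker picks the option with the highest expected utility

\[
\mathbb{E}[U] = \sum_{s} p_s \, u(x_s), \qquad \sum_{s} p_s = 1 .
\]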
Implications of Traditional Approach
• Compare opportunity cost of various decisions
• Pursue an activity until marginal benefit equals marginal cost (formalized below)
• Sunk costs are sunk
• Consistent behavior
• More choice is better
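The marginal condition in the second bullet can be written in standard textbook form (not shown in the slides): choose the activity level q that maximizes net benefit B(q) - C(q); at an interior optimum

\[
B'(q^{*}) = C'(q^{*}), \quad \text{i.e., } MB(q^{*}) = MC(q^{*}) .
\]

Going past q* means the last unit costs more than it returns; stopping short leaves net benefit on the table.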
Assuming Rationality Has a Very Long History
 The quote is from Thorstein Veblen [(1898), p. 389]:
 “In all the received formulations of economic
theory… the human material with which the inquiry is
concerned is conceived in hedonistic terms; …The
hedonistic conception of man is that of a lightning
calculator of pleasures and pains who oscillates like a
homogeneous globule of desire of happiness under
the impulse of stimuli that shift him about the area,
but leave him intact.”(italics added to the original)
Instrumental Rationality
The traditional approach argues that people follow the dictates
of “instrumental rationality”.
“Players (i.e. economic actors or decision-makers) must not
only have objectives, but know the correct way to achieve
them. But how do the players know the correct way to achieve
their objectives? The instrumental rationality answer is that,
even though the actors may initially have diverse and erroneous
models, the informational feedback process and arbitraging
actors will correct initially incorrect models, punish deviant
behavior, and lead surviving players to correct models.”
Douglass North (1994)
Belief in Rationality Is Difficult to Defeat
 Dan Ariely – author of Predictably Irrational (2008) – highlighted the
skepticism people have of experiments on his blog:
 “...after I gave a presentation at a conference, a fellow I’ll call Mr. Logic (a composite of many people I have debated with over the years) buttonholed me. “I enjoy hearing about all the different kinds of small-scale irrationalities that you demonstrate in your experiments,” he told me, handing me his card. “They’re quite interesting; great stories for cocktail parties.” He paused.
 “But you don’t understand how things work in the real world. Clearly, when
it comes to making important decisions, all of these irrationalities disappear,
because when it truly matters, people think carefully about their options
before they act. And certainly when it comes to the stock market, where the
decisions are critically important, all these irrationalities go away and
rationality prevails.”” [www.predictablyirrational.com/?p=409; posted May
20, 2009].
Seinfeld disagrees with the
rationality assumption!
 “…I must say, with all due respect, I find it very hard to see the logic behind
some of the moves you have made with this fine organization. In the past 20
years you have caused myself, and the city of New York, a good deal of
distress, as we have watched you take our beloved Yankees and reduce them
to a laughing stock…”
 George Costanza upon meeting George Steinbrenner (owner of the New
York Yankees): Seinfeld, “The Opposite” (season 5, 1994)
 “What the hell did you trade Jay Buhner for?! He had 30 home runs and over
100 RBIs last year. He's got a rocket for an arm. You don't know what the hell
you're doin'!”
 Frank Costanza (George’s father) upon meeting George Steinbrenner:
 Seinfeld, “The Caddy” (season 7, 1996)
 The quotes from Seinfeld were found at www.seinfeldscripts.com.
Behavioral Economics
• Fastest growing field in economics
• Behavioral economics is concerned with the ways in which the actual decision-making process influences the decisions that are made in practice; it combines psychology and economics
• Assumes bounded rationality, meaning that people have limited time and capacity to weigh all the relevant benefits and costs of a decision
• Decision making is less than fully rational; people are prone to make predictable and avoidable mistakes
• At the same time, decision making is systematic and amenable to scientific study
Bounded Rationality:
Thinking Is Costly
Example of Bounded Rationality
• A baseball and bat together cost $11. The bat costs $10 more than the ball. How much does the ball cost?
• Write down your answer.
• Half of Harvard students said $1, which is the intuitive answer but wrong!
• Correct answer is 50 cents: the bat then costs $10.50, so the prices sum to $11 and differ by $10 (worked algebra below)
• Lesson: People tend to use "intuitive thinking" or rules of thumb
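The worked algebra, standard but not spelled out in the slides: let b be the price of the ball in dollars, so the bat costs b + 10. Then

\[
b + (b + 10) = 11 \;\Rightarrow\; 2b = 1 \;\Rightarrow\; b = 0.50 .
\]

The intuitive answer of $1 fails the check: a $1 ball implies an $11 bat and a $12 total.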
SIX KEY IDEAS FROM BEHAVIORAL ECONOMICS
1. Letting Sunk Costs Matter. Allowing sunk costs, which have already been paid and
do not affect marginal costs regardless of which option is chosen, to affect a decision.
2. Overconfidence. Believing you will know what will happen in the future to a greater
extent than is justified by available information. Overconfidence also creates a
disconnect between perceived skills and actual skills.
3. Status Quo Bias. A tendency to make decisions by accepting the default option
instead of comparing the marginal benefit to the marginal cost.
4. Faulty Discounting. Being too impatient when it comes to decisions that involve
benefits that are received in the future or discounting future benefits inconsistently
depending on when the delay in receipt of benefits occurs.
5. Framing. Allowing the way a decision is presented to affect the choice that is
selected even though the marginal benefit and marginal cost are unaffected.
6. Desire for Fairness and Reciprocity. A tendency to punish people who treat you
unfairly and to reward those who treat you fairly, even if you do not directly benefit
from those punishments and rewards.
Behavioral Economics recognizes that people respond to incentives, but their
response is not always a rational one.
Status Quo Bias
◦ Decision-makers have an overwhelming tendency to adopt
defaults, to stick with the status quo
◦ Even when the decision is important and the stakes are
large
◦ Even when the decision-maker is told that the default is
suboptimal
◦ Examples from 401(k) plans: participation, savings rate, asset
allocation (company stock).
◦ Other examples: insurance deductibles, organ donation
Example: Brigitte Madrian and Dennis Shea (2001)
• Design: A Fortune 500 company switched its 401(k) default on April 1, 1998. Madrian and Shea examine the behavior of new hires.
• Old default: employees must actively sign up (no automatic contribution); no default asset allocation.
• New default: 3 percent of compensation deducted for the plan; default allocation to a money market fund.
401(k) Participation Increases
[Figure: percent of employees at each specified contribution rate (0%, 1-2%, 3%, 4-5%, 6%, 7-9%, 10%, 11-14%, 15%), before vs. after automatic enrollment; vertical axis runs from 0% to 80%. After automatic enrollment, participation clusters at the 3% default contribution rate.]
Source: Madrian and Shea (2001).
Policy Application: Pension Reform Bill of 2006
The 900-page Pension Protection Act of 2006
comes as the number of people covered by a defined-benefit pension has steadily declined and awareness has
grown about the lack of adequate savings among
Americans.
A majority of workers 45 and older have less than
$50,000 in savings, according to a survey by the Employee
Benefit Research Institute (EBRI). What's more, almost 40
percent of workers over 40 don't participate in a 401(k)
when they are eligible.
The new legislation encourages companies to
automatically enroll 401(k)-eligible employees and to
automatically increase worker contributions every year. It
also allows the plan provider chosen by the employer to
offer investment advice to workers.
Automatic enrollment is expected to boost the
participation rate in 401(k) plans beyond 90 percent.
By Jeanne Sahadi, CNNMoney.com
Libertarian Paternalism:
Set the default to help people,
but they can opt out.
More on Status Quo Bias
http://www.wired.com/wiredscience/2011/04/the-messy-reality-of-judicial-decisions/
◦ When (i.e., at what point in the day) a judge hears a case impacts the decision.
◦ Tired brains choose the status quo.
Overconfidence
◦ People tend to think they are above average!
◦ A survey at the University of Nebraska revealed that 94% of college professors thought they were better teachers than the average at that same institution.
◦ Richard Thaler and Cass Sunstein, in Nudge, find evidence of this phenomenon in surveys of MBA students, drivers, and new business owners, and this is just a partial list. They go on to note that "unrealistic optimism is a pervasive feature of human life; it characterizes most people in most social categories." Thaler and Sunstein (2008), p. 32.
◦ Overconfidence leads us to exaggerate the probability of future success and to fail to uncover why we fail (causing us to repeat failed decisions).
Incompetent Men?
• from Why Do So Many Incompetent Men Become
Leaders? by Tomas Chamorro-Premuzic (posted at
the Harvard Business Review)
• http://blogs.hbr.org/cs/2013/08/why_do_so_many_incompetent_men.html
•
In my view, the main reason for the uneven management sex ratio is our
inability to discern between confidence and competence. That is, because
we (people in general) commonly misinterpret displays of confidence as a
sign of competence, we are fooled into believing that men are better leaders
than women. In other words, when it comes to leadership, the only
advantage that men have over women (e.g., from Argentina to Norway and
the USA to Japan) is the fact that manifestations of hubris — often masked as
charisma or charm — are commonly mistaken for leadership potential, and
that these occur much more frequently in men than in women.
Faulty Discounting
◦ People display inconsistent behavior when choosing for today versus for tomorrow
◦ Many choices involve benefits and costs that are received at different times
◦ People tend to be impatient in the short run, which causes irrational (inconsistent) choices
◦ Example: Would you rather receive $100 right now or $101 in a week? Most people choose $100 right now. But when the choice is between $100 a year from now and $101 in a year and a week, most people choose $101 in a year and a week. People are thus more impatient for decisions involving this week than next year. This is inconsistent, as both choices involve waiting one extra week for one extra dollar.
◦ The technical term is "hyperbolic discounting" (a numerical sketch follows).
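A minimal numerical sketch using the quasi-hyperbolic ("beta-delta") model, one standard formalization of hyperbolic discounting that is not in the slides. Discounted utility is u(c_0) + \beta \sum_{t \ge 1} \delta^{t} u(c_t) with present bias \beta < 1; take \beta = 0.9 and a weekly \delta \approx 1 for illustration:

\[
\text{now vs. next week: } 100 > \beta\delta \cdot 101 = 0.9 \times 101 = 90.9 \;\Rightarrow\; \text{take \$100 now;}
\]
\[
\text{52 vs. 53 weeks out: } \beta\delta^{52} \cdot 100 = 90 < \beta\delta^{53} \cdot 101 = 90.9 \;\Rightarrow\; \text{wait for \$101.}
\]

Both far-off payoffs carry the same present-bias penalty \beta, so the extra week matters only when it moves a payoff out of "right now"; this reproduces the preference reversal in the example.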
Real Consequence of Faulty
Discounting
◦ The average adult has $6,000 of outstanding credit card debt. Few people can afford to pay off $6,000 in full, so many make only the minimum payment each month and pay interest at very high rates on the balance. Why? The attraction of immediate consumption is hard to resist. (Status quo bias prevents many people from taking a bank loan at a lower interest rate to pay off credit card debt.) A back-of-the-envelope sketch of the cost follows.
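A sketch of why minimum payments are so costly. This is illustrative only: the 18% APR, 2% minimum-payment rule, and $25 payment floor are assumptions, not figures from the text.

    # Sketch: months (and total interest) to clear a credit card balance
    # when making only the minimum payment. APR, minimum-payment rate,
    # and the payment floor are illustrative assumptions.
    def months_to_payoff(balance, apr=0.18, min_rate=0.02, floor=25.0):
        monthly_rate = apr / 12
        months, total_interest = 0, 0.0
        while balance > 0:
            interest = balance * monthly_rate           # interest accrued this month
            payment = max(balance * min_rate, floor)    # minimum-payment rule
            payment = min(payment, balance + interest)  # final month: clear the rest
            balance += interest - payment
            total_interest += interest
            months += 1
        return months, total_interest

    months, interest = months_to_payoff(6000)
    print(f"{months} months (about {months / 12:.0f} years), ${interest:,.0f} in interest")

Even under these mild assumptions, the $6,000 balance takes decades to clear, which is why the parenthetical points to a lower-interest bank loan as the rational alternative.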
FRAMING
or how you ask a question impacts the answer
 A framing effect is demonstrated by constructing two
transparently equivalent versions of a given problem, which
nevertheless yield predictably different choices.
 The standard example of a framing problem, which was
developed quite early, is the 'lives saved, lives lost' question, which
offers a choice between two public-health programs proposed to
deal with an epidemic that is threatening 600 lives:
 one program will save 200 lives, the other has a 1/3 chance of
saving all 600 lives and a 2/3 chance of saving none. In this
version, people prefer the program that will save 200 lives for sure.
 In the second version, one program will result in 400 deaths, the other has a 2/3 chance of 600 deaths and a 1/3 chance of no deaths. In this version, people prefer the gamble, even though the two versions describe identical outcomes (a quick arithmetic check follows).
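A quick arithmetic check, not in the original text, that the two frames are equivalent in expected lives:

\[
\text{Sure program: } 200 \text{ saved} \equiv 400 \text{ dead}; \qquad
\text{Gamble: } \mathbb{E}[\text{saved}] = \tfrac{1}{3} \times 600 + \tfrac{2}{3} \times 0 = 200 .
\]

A fully rational decision-maker would therefore choose the same way in both versions; the reversal is driven entirely by the gain ("lives saved") versus loss ("deaths") framing.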
Fairness
The Ultimatum Game
◦ Two players and $10.
◦ One player is given the $10 and must offer some of the money to the other person. If the other person accepts the offer, both keep what they have. If the other person says no to the offer, no one gets anything.
◦ How much should the first person offer? (The standard prediction is sketched below.)
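The textbook benchmark, not spelled out in the slides: with purely self-interested players, the responder gains by accepting any positive amount, so the proposer should offer the smallest positive amount:

\[
\text{responder accepts any } x > 0 \;\Rightarrow\; \text{proposer offers } x = \varepsilon \approx \$0.01 .
\]

In experiments, proposers typically offer around 40-50% of the pot, and responders frequently reject low offers, giving up money to punish perceived unfairness; this is the evidence behind the fairness-and-reciprocity idea above.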
MORE STORIES
 Regression to the mean
 Loss aversion
 More choice is better?
 Confirmation Bias
MORE STORIES: REGRESSION TO THE MEAN AND
LOSS-AVERSION
 Regression to the Mean and How to Treat Your Employees
 Loss aversion: the response to losses is consistently much more intense than the response to corresponding gains
 Imagine you were given $100,000
 Now imagine you were given $1 million and then had $900,000 taken away
 Do you feel the same? (Both paths end at the same $100,000; a standard formalization follows.)
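One standard formalization of loss aversion is Kahneman and Tversky's value function (not given in the slides), which evaluates gains and losses relative to a reference point, with losses weighted by a coefficient \lambda of roughly 2:

\[
v(x) =
\begin{cases}
x^{\alpha}, & x \ge 0,\\
-\lambda(-x)^{\alpha}, & x < 0,
\end{cases}
\qquad \lambda \approx 2.25, \; \alpha \approx 0.88 .
\]

Receiving $100,000 outright is coded as a single gain, while receiving $1 million and then losing $900,000 is coded as a gain plus a heavily weighted loss, so the two paths feel very different even though final wealth is identical.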
IS MORE CHOICE ALWAYS BETTER?
Adding more complex options:
 Complexity delays choice, increasing the fraction of consumers who adopt default options (O'Donoghue and Rabin, 2004).
 Complexity biases choice, since people tend to avoid complex options (Shafir and Tversky, 1994; Iyengar and Kamenica, 2006).
 1/N rule: add a second fund and many investors divide the portfolio 50-50; add a third fund and 1/3 is placed in each (a small sketch of the consequence follows).
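A tiny sketch, illustrative rather than from the slides, of why the 1/N heuristic matters: under 1/N, an investor's equity exposure is set by the composition of the fund menu rather than by the investor's own risk preferences. The function and menus below are hypothetical.

    # Under the 1/N heuristic, the equity share of the portfolio is
    # whatever fraction of the menu happens to be equity funds.
    def one_over_n_equity_share(menu):
        """menu: list of fund types, e.g. ['equity', 'bond']."""
        return menu.count('equity') / len(menu)

    print(one_over_n_equity_share(['equity', 'bond']))            # 0.5
    print(one_over_n_equity_share(['equity', 'equity', 'bond']))  # 0.666...

Adding one more equity fund to the menu mechanically raises equity exposure from one half to two thirds, with no change in anyone's preferences.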
Confirmation Bias
• from The Righteous Mind: Why Good People
Are Divided by Politics and Religion (Jonathan
Haidt)
• “…the tendency to seek out and interpret new
evidence in ways that confirm what you
already think.”
• “People are quite good at challenging
statements made by other people, but if it’s
your belief, then it’s your possession—your
child, almost—and you want to protect it, not
challenge it and risk losing it.”
Confirmation bias and politics
•
from Willful Blindness: Why We Ignore the Obvious at Our Peril (Margaret Heffernan)
• In 2004, a team of cognitive neuroscientists set out to
see what this process actually looks like. Drew Westen,
at Emory University, was interested in what
psychologists call “motivated reasoning” and what
Freud called defense mechanisms: the processes by
which people adjust what they know to avoid bad
feelings such as anxiety and guilt. He theorized that the
brain’s neural networks would try to satisfy two kinds of
constraints: cognitive constraints—we want to put
information together in a way that feels rational—and
emotional constraints, meaning we want to feel good
about the information we take in.
More on confirmation bias and
politics
•
from Willful Blindness: Why We Ignore the Obvious at Our Peril (Margaret Heffernan)
• To test his theories, Westen and his team recruited fifteen committed
Democrats and fifteen committed Republicans to submit to fMRI scans of their
brains while reading political material. As they lay in the scanner, they read
pairs of quotes attributed either to President George W. Bush or to presidential
candidate John Kerry. In each pair, one statement was entirely compatible with
the candidate’s position, but one statement was contradictory.
• Westen wanted to find out whether the brain would treat the contradictions of
the preferred candidate in the same way as it would treat the contradictions of
a disliked candidate. The experiment found that the partisan participants gave a
far rougher ride to the contradictions that came from the candidate they
opposed.
• “They had no trouble seeing the contradictions for the opposition candidate,”
Westen wrote. “But when confronted with potentially troubling political
information, a network of neurons becomes active that produces distress. Not
only did the brain manage to shut down distress through faulty reasoning—but it
did so quickly. The neural circuits charged with regulation of emotional states
seemed to recruit beliefs that eliminated the distress and conflict.”
And more on Bias and Politics
•
from Willful Blindness: Why We Ignore the Obvious at Our Peril (Margaret Heffernan)
• But, said Westen, the brain didn’t stop at eliminating the
uncomfortable contradictions.
• It worked overtime "to feel good, activating reward circuits that give partisans a jolt of positive reinforcement for their biased 'reasoning.'" In Westen's experiment, the reward circuits the brain was using were the same ones that are activated when a junkie gets a fix.
• In other words, when we find the thoughts we agree
with, or are able to eliminate the ones that make us
uncomfortable, we feel that same kind of euphoria and
reassurance that an addict feels when reunited with his
drug of choice: all is right with the world. At least for a
while.
Even more on Bias…
 from The Righteous Mind: Why Good People Are Divided by Politics and
Religion (Jonathan Haidt)
 When subjects are told that an intelligence test gave them a low score, they
choose to read articles criticizing (rather than supporting) the validity of IQ
tests. When people read a (fictitious) scientific study that reports a link
between caffeine consumption and breast cancer, women who are heavy
coffee drinkers find more flaws in the study than do men and less caffeinated
women. Pete Ditto, at the University of California at Irvine, asked subjects to
lick a strip of paper to determine whether they have a serious enzyme
deficiency. He found that people wait longer for the paper to change color
(which it never does) when a color change is desirable than when it indicates a
deficiency, and those who get the undesirable prognosis find more reasons why
the test might not be accurate (for example, “My mouth was unusually dry
today"). The difference between a mind asking "Must I believe it?" versus "Can
I believe it?” is so profound that it even influences visual perception. Subjects
who thought that they’d get something good if a computer flashed up a letter
rather than a number were more likely to see the ambiguous figure as the
letter B, rather than as the number 13. If people can literally see what they
want to see—given a bit of ambiguity—is it any wonder that scientific studies
often fail to persuade the general public?
And More….
• from The Righteous Mind: Why Good People Are Divided by Politics and Religion
(Jonathan Haidt)
• Scientists are really good at finding flaws in studies that contradict
their own views, but it sometimes happens that evidence accumulates
across many studies to the point where scientists must change their
minds. I’ve seen this happen in my colleagues (and myself) many
times, and it’s part of the accountability system of science—you’d
look foolish clinging to discredited theories. But for nonscientists,
there is no such thing as a study you must believe. It’s always possible
to question the methods, find an alternative interpretation of the
data, or, if all else fails, question the honesty or ideology of the
researchers. And now that we all have access to search engines on our
cell phones, we can call up a team of supportive scientists for almost
any conclusion twenty-four hours a day. Whatever you want to believe
about the causes of global warming or whether a fetus can feel pain,
just Google your belief. You’ll find partisan websites summarizing and
sometimes distorting relevant scientific studies. Science is a
smorgasbord, and Google will guide you to the study that’s right for
you.
Another Example of How Partisanship Shapes a Person's View of the World…
Summarizing the Story
• Implicit Decision-making
• Your mind is prone to making quick, automatic decisions. If you are unaware of the biases the mind possesses, it will often reach an answer that is not optimal. Such a process can be thought of as "implicit" decision-making.
• Explicit Decision-making
• An alternative approach is to systematically analyze data and delay, as much as possible, reaching conclusions.
• Once conclusions are reached, be willing to question them as new information becomes available.
• And do not try to consider "all" information; have a process in place to differentiate "useful" from "useless" information.