Advances: Foundations of DA Revisited
Page 1 of 44
Ch 03 060520 V10
The Foundations of Decision Analysis Revisited
Ronald A. Howard
ABSTRACT
For centuries people have speculated on how to improve decision-making without much
professional help in developing clarity of action. Over the last several decades several
important supporting fields have been integrated to provide a discipline, decision
analysis, which can aid decision-makers in all fields of endeavor: business, engineering,
medicine, law, and personal life. Since uncertainty is the most important feature to
consider in making decisions, the ability to represent knowledge in terms of probability,
to see how to combine this knowledge with preferences in a reasoned way, to treat very
large and complex decision problems using modern computation, and to avoid common
errors of thought have combined to produce insights heretofore unobtainable. The
limitation in practice lies in our willingness to use reason rather than in any shortcoming
of the field. This chapter discusses the sources of the discipline, the qualities desired in a
decision process, the logic for finding the best course of action, the process of focusing
attention on important issues in attaining clarity of action, the need for clear and powerful
distinctions to guide our thinking (since most decisions derive from thought and
conversation rather than from computation), and the challenges to the growth of the
discipline.
CONTENTS
Origins
The Motivating Application
Pillars of Decision Analysis
The First Pillar: Systems Analysis
The Second Pillar: Decision Theory
The Third Pillar: Epistemic Probability
The Fourth Pillar: Cognitive Psychology
Edifice of Decision Analysis
Decision Quality
Group Decision Making
Clairvoyance
Desiderata
The Rules
Rule 1: The Probability Rule
Rule 2: The Order Rule
Rule 3: The Equivalence Rule
Rule 4: The Substitution Rule
Rule 5: The Choice Rule
Practice of Decision Analysis
Decision Engineering
Influence, Relevance, and Decision Diagrams
Options
Cogency versus Verisimilitude
Ethics
Language of Decision Analysis
Confusions
Uncertainty about Probability
Deal Characterization and Deal Desirability
Challenges of Decision Analysis
Classical Statistics Persists
Dealing with Multiple Attributes
Direct and Indirect Values
Treating Multi-Attribute Problems Using a Value Function
Other Approaches to Multi-Attribute Problems
Weight and Rate
Analytic Hierarchy Process
Risk Preference
Life and Death Decisions
Future of Decision Analysis
Epilog
Acknowledgment
Revisiting the foundations of decision analysis means seeing what those foundations have
been, how they have evolved, and how well they will serve in the future. Since the entire
book is generally concerned with this subject, this chapter will be a personal view of the
development of decision analysis. Other chapters will discuss many of the topics
commented on here in greater depth. Certain of my opinions may be idiosyncratic, as you
will see as you read further.
I will briefly repeat here comments from original writings: they should be consulted for a
deeper understanding.
Origins
When I was asked to do this chapter, I thought of it as an opportunity to review the progress of
the field since its inception. I decided to return to my 1965 paper entitled "Decision Analysis:
Applied Decision Theory" (DAADT) (Howard, 1966) wherein I define the term "decision
analysis" for the first time:
The purpose of this article is to outline a formal procedure for the analysis of decision
problems, a procedure that I call “decision analysis.” We shall also discuss several of the
practical problems that arise when we attempt to apply the decision analysis formalism.
By the way, I decided to call the field “decision analysis” rather than “decision engineering”
because the latter term sounded manipulative, though in fact it is more descriptive.
The following paragraph from DAADT provides a definition of decision analysis:
Decision analysis is a logical procedure for the balancing of the factors that influence a
decision. The procedure incorporates uncertainties, values, and preferences in a basic
structure that models the decision. Typically, it includes technical, marketing,
competitive, and environmental factors. The essence of the procedure is the construction
of a structural model of the decision in a form suitable for computation and manipulation;
the realization of this model is often a set of computer programs.
Not having read the paper for some time, and expecting it to be antiquated, I was pleased to
see how relevant some of the comments are today. For example, here is the discussion of the
most fundamental distinction underlying decision analysis:
Having defined a decision, let us clarify the concept by drawing a necessary distinction
between a good decision and a good outcome. A good decision is a logical decision -- one based on the uncertainties, values, and preferences of the decision maker. A good
outcome is one that is profitable or otherwise highly valued. In short, a good outcome is
one that we wish would happen. Hopefully, by making good decisions in all the situations
that face us we shall ensure as high a percentage as possible of good outcomes. We may
be disappointed to find that a good decision has produced a bad outcome or dismayed to
learn that someone who has made what we consider to be a bad decision has enjoyed a
good outcome. Yet, pending the invention of the true clairvoyant, we find no better
alternative in the pursuit of good outcomes than to make good decisions.
The distinction between decision and outcome is still not clear for most people. When
someone makes an investment, and then loses money, he often says that he made a bad decision.
If he would make the same decision again if he did not know how it would turn out, then he
would be more accurate in saying that he made a good decision and had a bad outcome. Jaynes
traces this distinction back to Herodotus (Jaynes, 1986):
From the earliest times this process of plausible reasoning preceding decisions has been
recognized. Herodotus, in about 500 BC, discusses the policy decisions of the Persian kings. He
notes that a decision was wise, even though it led to disastrous consequences, if the evidence at
hand indicated it as the best one to make; and that a decision was foolish, even though it led to
the happiest possible consequences, if it was unreasonable to expect those consequences.
Let me now turn to what motivated the writing of the paper.
The Motivating Application
I taught in electrical engineering and industrial management at MIT from 1958 through 1964.
One of my mentors at MIT, Bill Linvill, had taken a position at Stanford University. He invited
me to visit him there for the ’64-’65 academic year. While I was at MIT, I taught Statistical
Decision Theory and Markov decision processes in the General Electric Modern Engineering
Course, which they gave to middle engineering management. When one of the participants
learned I was going out to Stanford, he asked if I could teach the same course in San Jose to
employees of the GE nuclear power division, and I agreed.
At the end of the third weekly lecture in decision theory presented in San Jose, one of the
engineers said that the division was facing a major decision problem with both technical and
business implications. The question was whether to put a superheater of steam on their nuclear
power reactors. He asked whether what we were discussing in class could help with this
problem.
I replied, “Why not?” We spent eight months working on how to put together the dynamic,
preference, and uncertainty issues they faced, marking the beginning of decision analysis and
providing the material that motivated the first decision analysis paper quoted above.
Pillars of Decision Analysis
It is certainly true that we stand on the shoulders of giants. I used to think that there were two
pillars supporting decision analysis, but I came to realize that there were three, and then four. I
shall briefly discuss the first two pillars and emphasize the last two.
The First Pillar: Systems Analysis
Bill Linvill introduced me to systems analysis. Systems analysis grew out of World War II and
was concerned with understanding dynamic systems. Key notions were those of state variables,
feedback, stability, and sensitivity analysis. The field of systems engineering is currently in a
state of resurgence. Decision analysis and systems engineering have many complementary
features (Howard, 1965, 1973).
The Second Pillar: Decision Theory
Decision theory is concerned primarily with making decisions in the face of uncertainty. Its roots
go back to Daniel Bernoulli (Bernoulli, 1738) and Laplace. Bernoulli introduced the idea of
logarithmic utility to explain the puzzle called the St. Petersburg paradox. In the most influential
book on probability ever written (Laplace, 1812), Laplace discusses the espérance mathématique
and the espérance morale. Today we would call these the mean and the certain equivalent.
Howard Raiffa showed how to treat many of the problems of statistics in Bayesian form, and
how to use tree structures to select the best alternative. I learned much from Howard while I
assisted him in teaching the Institute of Basic Mathematics for Application to Business.
The Third Pillar: Epistemic Probability
I did not appreciate the importance of this pillar until well after I began teaching statistical
decision theory. My epiphany began with a manuscript sent to me by Myron Tribus, Dean of
Engineering at Dartmouth. It was a paper of Ed Jaynes (Jaynes, 2003), a professor of physics at
Washington University. Since I needed a haircut, I took it with me to a small barbershop on
Massachusetts Avenue in Cambridge. As I sat in the barber’s chair, I read the first few pages and
thought, “This is pretty silly”. When I arose from the chair 20 minutes later, I had become
completely converted to Jaynes’s way of thinking about probability, and I have been in his debt
ever since.
Jaynes taught that there is no such thing as an objective probability: a probability reflects a
person’s knowledge (or equivalently ignorance) about some uncertain distinction. People think
that probabilities can be found in data, but they cannot. Only a person can assign a probability,
taking into account any data or other knowledge available. Since there is no such thing as an
objective probability, using a term like "subjective probability" only creates confusion.
Probabilities describing uncertainties have no need of adjectives.
This understanding goes back to Cox (2001), Jeffreys (1939), Laplace (1996) and maybe
Bayes, yet somehow it was an idea that had been lost over time. A famous scientist put it best
over 150 years ago:
The actual science of logic is conversant at present only with things either certain,
impossible, or entirely doubtful, none of which (fortunately) we have to reason on.
Therefore the true logic for this world is the calculus of Probabilities, which takes
account of the magnitude of the probability which is, or ought to be, in a reasonable
man's mind. (Maxwell, 1850)
I ask students who have taken a probability class about the origin of the probabilities they use in
the classroom and in their assignments. It turns out that these probabilities originate from the
professor or from a very simple rule based on coin tossing, balls in urns, or card playing that says
to make all elemental probabilities equal. Probability class then teaches you how to transform
this set of probabilities into probabilities of derivative events, like two heads out of ten tosses of
a coin or a royal flush – from inputs to outputs. There is very little discussion of where
probabilities come from, and where they come from has everything to do with the use of
probability. I have never seen an actual decision problem where the assignment of probabilities
could be done using the probability class rule.
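The mechanics of that classroom rule are simple enough to state in a few lines. A sketch in Python (the numbers are the standard textbook examples mentioned above, not an analysis from this chapter): assign equal probability to every elemental outcome, then derive the probabilities of compound events by counting.

```python
from math import comb

# The "probability class rule": every elemental outcome gets equal
# probability; compound-event probabilities follow by counting.

# Two heads in ten tosses of a coin: 2^10 equally likely sequences,
# comb(10, 2) of which contain exactly two heads.
p_two_heads = comb(10, 2) / 2**10

# A royal flush in five cards: comb(52, 5) equally likely hands,
# 4 of which (one per suit) are royal flushes.
p_royal_flush = 4 / comb(52, 5)

print(p_two_heads)               # 45/1024, about 0.0439
print(round(1 / p_royal_flush))  # about 1 hand in 649,740
```

The transformation from inputs to outputs is pure arithmetic; the rule says nothing about where probabilities come from in an actual decision problem.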
DAADT emphasized the importance of the epistemic view:
Another criticism is, “If this is such a good idea, why haven’t I heard of it before?” One
very practical reason is that the operations we conduct in the course of a decision analysis
would be expensive to carry out without using computers. To this extent decision analysis
is a product of our technology. There are other answers, however. One is that the idea of
probability as a state of mind and not of things is only now regaining its proper place in
the world of thought. The opposing heresy lay heavy on the race for the better part of a
century. We should note that most of the operations research performed in World War II
required mathematical and probabilistic concepts that were readily available to Napoleon.
One wonders about how the introduction of formal methods for decision making at that
time might have affected the course of history.
Over the years, many people have tried to modify the probability theory of Bayes, Laplace,
Kolmogorov, and Jaynes for some purpose. Perhaps someday we will see a useful contribution
from such efforts; I believe that, at present, they only serve to make these giants roll over in their
graves.
The Fourth Pillar: Cognitive Psychology
In the 1960s few appreciated the important role that cognitive psychology would play in
understanding human behavior. At the time of DAADT, we just did our best to help experts
assign probabilities. In the 1970s the work of Tversky, Kahneman, and others provided two
valuable contributions. First, it showed that people making decisions relying only on their
intuition were subject to many errors that they would recognize upon reflecting on what they had
done. This emphasized the need for a formal procedure like decision analysis to assist in making
important decisions. The second contribution was to show the necessity for those who are
assisting in the probability and preference assessments to be aware of the many pitfalls that are
characteristic of human thought. Tversky and Kahneman called these heuristics -- methods of
thought that could be useful in general but could trick us in particular settings. We can think of
these as the "optical illusions" of the mind.
An important distinction here is that between "descriptive" and "normative" decision-making.
Descriptive decision-making, as the name implies, is concerned with how people actually make
decisions. The test of descriptive decision-making models is whether they actually describe
human behavior. Normative decision-making is decision-making according to certain rules, or
norms, that we want to follow in our decision-making processes.
To illustrate, I might make mistakes, descriptively, in carrying out operations in arithmetic. I
know they are mistakes because I want to follow the norms of arithmetic. Any violation of the
norms I call a mistake.
We know there is a conflict between our reasoning (normative) and our temptations
(descriptive) that can resolve in favor of either, as illustrated by the person eating a bowl of
peanuts at a cocktail party and saying, "I know I am going to regret this tomorrow."
Many theorists have attempted to change the norms of decision-making to make the norms
agree with descriptive behavior. Every attempt to do so that I have seen creates more problems
than it solves (Howard, 1992a).
Edifice of Decision Analysis
General Concepts
You can think of a decision as a choice among alternatives that will yield uncertain futures, for
which we have preferences. To explain the formal aspects of decision analysis to both students
and to executives I use the image of the three-legged stool shown in Figure 3.1 (Howard, 2000).
Figure 3.1 Decision Essentials
The legs of the stool are the three elements of any decision: what you can do, the alternatives;
what you know, the information you have; and what you want, your preferences. Collectively,
the three legs represent the decision basis, the specification of the decision. Note that if any leg is
missing, there is no decision to be made. If you have only one alternative, then you have no
choice in what you do. If you do not have any information linking what you do to what will
happen in the future, then all alternatives serve equally well because you do not see how your
actions will have any effect. If you have no preferences regarding what will happen as a result of
choosing any alternative, then you will be equally happy choosing any one. The seat of the stool
is the logic that operates on the decision basis to produce the best alternative. We shall soon be
constructing the seat to make sure that it operates correctly.
The stool can be placed anywhere and used to make a decision. However, the most important
choice you make is where to place it. Placement of the stool represents the frame of the decision,
the declaration by the decision maker of what decision is under consideration at this
time. The frame will influence all elements of the decision basis. Framing a decision of where to
live as a renting decision rather than a buying decision will affect the alternatives, information,
and preferences appropriate to the decision basis.
The decision hierarchy of Figure 3.2 shows how the frame separates what is to be decided
upon now from two other potential sources of decisions.
Figure 3.2 The Decision Hierarchy
The top of the hierarchy in the figure represents higher-level decisions with alternatives that
are taken as given at this time. The bottom of the hierarchy represents decisions that will be
made in the future following the decision under consideration. Selection of a proper frame is
perhaps the most important task in decision analysis.
Finally note the person seated on the stool. This figure reminds us that the frame, and every
element of the decision basis, must be a declaration by the decision maker. Decisions are not
found in nature, they are creations of the human mind.
Decision Quality
Decision quality comprises the six elements of the stool. A high quality decision has a proper
frame, a selection of alternatives that respond to the frame, reliable information to an extent
appropriate to the frame, and considered preferences on possible futures. The logic to arrive at a
course of action must be sound, and the decision maker must be committed both to the process
and to the significance of the decision. The decision quality spider shown in Figure 3.3 is a
graphical representation of the qualitative attainment of these elements in any particular decision.
Individuals and groups find it helpful in assessing their decision process.
The distance from the inner circle to the outer one represents the degree of achievement for
each element. The outer circle represents the proper balancing of these elements for this
particular decision. The analysis is not balanced if too many alternatives are considered, too
much information of little value relative to cost is gathered, et cetera. The resulting picture
displays the deficiencies in any of the elements of decision quality.
Figure 3.3 The Decision Quality Spider
[Spider diagram with six axes: Appropriate Frame; Creative Alternatives; Reliable Information, including Models; Clear Preferences; Correct Logic; Commitment to Action. Each axis runs from 0% at the inner circle to 100% at the outer circle, where 100% means further improvement is uneconomical.]
Group Decision Making
Any person, and any organization operating under the direction of a person, can use decision
analysis. An organization using decision analysis is agreeing to act as if it were a single entity
using the logic of a person. Separate groups might have the assignment of creating the frame and
the elements of the decision basis (Howard, 1975). The analysis must use the same high-quality
logic, soon to be described, that is appropriate for a person.
Even when the decision maker is one person, that person may consider the consequences of
the decision on other people. It is useful to define a stakeholder in a decision as "someone who
can affect or will be affected by the decision". Stakeholders can be as disparate as regulators and
customers.
Clairvoyance
A useful construct in achieving clarity in all the dimensions of uncertainty in a decision is that of
the clairvoyant. The clairvoyant can tell us the resolution of any uncertain distinction past,
present, or future as long as 1) the clairvoyant need not exercise any judgment in stating this
resolution and 2) the resolution does not depend upon any future action of the decision maker
unless that action is specified. We say that a distinction meeting these conditions has met the
clarity test. We cannot understand what a distinction like "technical success" means unless it
meets the clarity test. Assigning probabilities to distinctions that do not meet the clarity test is an
exercise in futility.
Once we have the notion of clairvoyance, we can speak of the improvement that we might
make in a decision if clairvoyance on one or more of the uncertainties in the decision were
available. If the prospects of the decision are completely describable by a value measure, we can
compute the value of clairvoyance (Howard, 1966a). The most that should be paid for any
information gathering activity or experiment is the value of clairvoyance on the results of that
activity.
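For a risk-neutral decision maker whose prospects are completely described by a money measure, the value of clairvoyance is simply the difference between the best expected value with the free information and the best expected value without it. A minimal sketch with hypothetical numbers (a venture paying 100 on success, losing 50 on failure, with success probability 0.4):

```python
# Value of clairvoyance for a risk-neutral decision maker (hypothetical deal).
p_success = 0.4
payoff = {"success": 100.0, "failure": -50.0}

# Without clairvoyance: choose the better of investing or not investing.
ev_invest = p_success * payoff["success"] + (1 - p_success) * payoff["failure"]
ev_without = max(ev_invest, 0.0)  # 0.0 is the value of not investing

# With free clairvoyance: learn the outcome first, then invest only on success.
ev_with = (p_success * max(payoff["success"], 0.0)
           + (1 - p_success) * max(payoff["failure"], 0.0))

value_of_clairvoyance = ev_with - ev_without  # 40.0 - 10.0 = 30.0
print(f"value of clairvoyance: {value_of_clairvoyance:.0f}")
```

Here no experiment whose results merely correlate with the venture's outcome could be worth more than 30, since clairvoyance itself is worth exactly that.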
As stated in DAADT:
Thus the decision analysis is a vital structure that lets us compare at any time the values
of such alternatives as acting, postponing action and buying information, or refusing to
consider the problem further. We must remember that the analysis is always based on the
current state of knowledge. Overnight there can arrive a piece of information that changes
the nature of the conclusions entirely. Of course, having captured the basic structure of
the problem, we are in an excellent position to incorporate any such information.
Desiderata
In developing any theory, it is useful to specify the desirable properties we would like to have:
desiderata. Here we shall present desiderata for a theory of decision (Howard, 1992a). We define
the decision composite as the axioms supporting the theory plus all of the theorems that follow
from them.
1. The decision composite must allow me to form inferences about uncertain distinctions
even in the absence of a decision problem. This means that probability must stand on its
own foundation, in accordance with our discussion of epistemic probability.
2. The decision composite must be applicable to any decision I face regardless of type or
field.
3. The decision composite must require that I be indifferent between two alternatives I
consider to have the same probabilities of the same consequences. In other words, I must
be indifferent between two alternatives for which I have created the same descriptions.
4. Reversing the order of contemplating uncertain distinctions should not change inference
or decision. This means, in particular, that changing the order of receiving a given body
of information, including alternatives, should not change any inference or decision. This
property is sometimes called "invariance to data order".
5. If I prefer alternative 1 over alternative 2 when the uncertain event A occurs and if I
prefer 1 over 2 when A does not occur, then I must prefer alternative 1 over 2 when I am
uncertain about the occurrence of A.
6. Once I have ordered my alternatives from best to worst, the non-informative removal of
any of them does not change the preference order of the remaining alternatives.
Non-informative means the removal of the alternative does not provide new information
about the remaining alternatives.
7. The addition of a non-informative new alternative to the basis cannot change the ranking
of the original alternatives.
8. The ability to obtain free clairvoyance on any uncertainty cannot make the decision
situation less attractive.
9. At this epoch my thoughts about how I will behave and choose in the future must be
consistent.
If the consequences of a decision are completely describable in terms of a value measure
chosen so that more will always be preferred to less, then the desiderata can be further refined.
10. If an alternative has various value consequences with associated probabilities, then I must
be able to compute the amount of the value measure I would have to receive in exchange
for the alternative to be indifferent to following it. This selling price of the alternative I
shall call the certain equivalent. This amount will be negative for undesirable alternatives.
11. I must be able to compute the value added by a new alternative: the value must be
nonnegative.
12. I must be able to compute the value of clairvoyance on any uncertain distinction or
collection of uncertain distinctions; the value of clairvoyance cannot be negative.
13. Payments of value that cannot be changed must have no effect on future decisions.
14. Since I am choosing among uncertain futures, there must be no willingness-to-pay to
avoid regret.
The Rules
How could we construct a systematic logical process to serve as the seat of the stool in Figure
3.1? The answer is that we have to agree in our decision-making to follow a set of rules, or
norms, for our decision process. I consider them the rules of actional thought, thought about
action. These norms -- some would call them axioms -- are the foundation of the decision
composite. I shall state them here as requirements that the decision maker is placing on himself,
rather than as requirements imposed upon him. While every decision made using this normative
process must follow the rules, once they are acknowledged they will seldom be referred to in a
formal analysis. Just as a carpenter relies on the axioms of geometry, perhaps without even
knowing them, the decisions made using decision analysis procedures should be so self-evident
in their correctness that there is rarely a need to mention the rules.
Rule 1: The Probability Rule. The probability rule requires that I be able to characterize any
alternative I face in a decision to my satisfaction by introducing uncertain distinctions of various
kinds and degrees and assigning probabilities to them. Once I have done this for all alternatives I
wish to consider within my present frame, I have completed the requirements of the probability
rule.
Rule 2: The Order Rule. The order rule requires that for each alternative I face, I construct the
possible futures formed by selecting the alternative and then one degree of each distinction used
to describe it. I call the result a "prospect". Sequential decisions following revelation of
information require prospects that describe each alternative and the distinctions that follow it.
When I have a complete list of prospects formed by the alternatives and their possible
consequences, I must then order them in a list, starting with the one I like best at the top and the
one I like worst at the bottom. I may have one or more prospects at the same level in the list. It is
possible that when I attempt to create this ordering, I find it difficult because I discover
uncertainties that I did not represent in the probability rule. This means that I must return to the
probability rule, add distinctions to represent them, and then repeat the process. Notice that the
order rule is an ordering of prospects in a certain world: no consideration of uncertainty is
allowed.
Rule 3: The Equivalence Rule. The equivalence rule applies, in principle, to any three prospects
at different levels in the list. Suppose I like prospect A better than prospect B, and I like prospect
B better than prospect C. The equivalence rule requires that I be able to assign a probability of
the best prospect A and one minus that probability of the worst prospect C such that I would be
indifferent to receiving this probability mixture of the best and worst prospects on the one hand
and the intermediate prospect B for certain on the other. We shall call the probability that
establishes this indifference a "preference" probability because it is not the probability of any
uncertainty that the clairvoyant could resolve.
When satisfying the equivalence rule, you may find that some of the prospects ordered in the
order rule require refinement to allow the proper assessment of preference probabilities. For
example, I may know that I prefer a steak dinner to a lamb chop dinner to a hamburger dinner.
However, my preference probability for a steak dinner versus a hamburger dinner that would
make me just indifferent to a lamb chop dinner may well depend on further specification of each
dinner, a task that will require returning to the probability rule and creating new distinctions.
Since an actual decision could have hundreds if not thousands of prospects, and three at
different levels could in such cases be chosen in even more ways than there are prospects, you
can see why we said that the equivalence rule must apply in principle. Rarely will we have to
carry out the assignment of so many preference probabilities.
Rule 4: The Substitution Rule. The substitution rule requires that
if I should face in life any of the situations for which I assessed a preference probability in
the equivalence rule, and
if I assign a probability to receiving the best of the three prospects rather than the worst that
is equal to the preference probability I assigned, then
I remain indifferent between receiving the uncertain deal and the intermediate prospect.
The import of this rule is that the preference among uncertain prospects expressed in the
equivalence rule is not some hypothetical preference, but one that reflects my actual preferences
for uncertain deals. This means that probabilities and preference probabilities may be used
interchangeably in the analysis of decision problems.
Rule 5: The Choice Rule. The choice rule applies whenever I have two prospects at different
levels in my ordered list. If I prefer prospect A to prospect B, and if I face two alternatives with
different probabilities of only those two prospects, then I must choose the alternative with a
higher probability of receiving prospect A. In other words, I must choose the alternative with a
higher probability of the prospect I like better.
Note that this is the only rule that specifies the action I must take, and that it is so self-evident
that if I told someone I was violating it, they would think they had misheard me. Every aspect of
this choice is under my control: my preference and my probability.
I can make any decision using only these five rules. You can think of the process as
transforming an opaque decision situation into a transparent one by a series of transparent steps.
The transparent steps are the rules applied systematically. Using the first four rules, the choice
between any two alternatives can be reduced to an application of the choice rule. If there are
several alternatives, the repeated application of the choice rule will order them (Howard, 1998).
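The reduction just described can be sketched in code. In this hypothetical example (the prospects, probabilities, and preference probabilities are all invented), the substitution rule converts every prospect into an equivalent deal on the best and worst prospects, and the choice rule then simply compares the resulting probabilities of the best prospect:

```python
# Prospects ordered best > intermediate > worst; preference probabilities
# are 1 for the best, 0 for the worst, and the assessed value in between.
pref_prob = {"$100": 1.0, "$40": 0.55, "$0": 0.0}   # invented assessments

def effective_best_probability(alternative, pref_prob):
    """Substitution rule: replace each prospect by its equivalent deal on the
    best and worst prospects, yielding a single probability of the best."""
    return sum(p * pref_prob[prospect] for prospect, p in alternative.items())

def choose(alt_a, alt_b, pref_prob):
    """Choice rule: take the alternative with the higher probability of the
    preferred (best) prospect."""
    pa = effective_best_probability(alt_a, pref_prob)
    pb = effective_best_probability(alt_b, pref_prob)
    return "A" if pa > pb else "B" if pb > pa else "indifferent"

alt_a = {"$100": 0.3, "$40": 0.4, "$0": 0.3}   # reduces to 0.3 + 0.4*0.55 = 0.52
alt_b = {"$100": 0.5, "$40": 0.0, "$0": 0.5}   # reduces to 0.50
winner = choose(alt_a, alt_b, pref_prob)       # -> "A"
```

Because every step is an application of one of the five rules, the final comparison is as transparent as preferring the higher probability of a prize.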
The application of the rules simplifies considerably if I can describe all the prospects
completely in terms of a value measure, money. If I prefer more money to less, as I do, the rules
require that I be able to construct a nondecreasing curve on the money axis, which I shall call the
u-curve and any point on it a u-value. This curve summarizes my preferences for receiving
different amounts of the value measure with different probabilities, my risk preference.
The rules require that I order the alternatives in terms of their mean u-values, the sum of the
u-values of each prospect multiplied by the probability of that prospect. This computation yields
the u-value of any alternative. The certain equivalent of the alternative is then the amount of
money whose u-value equals the u-value of the alternative. Rather than having to be concerned
with assigning the preference probabilities in the order rule to the many possible threesomes of
prospects at different levels, I need only deal with the u-curve. We shall have more to say on the
terminology for this curve later.
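As a concrete sketch, suppose the u-curve is exponential (one common choice, assumed here rather than taken from the text). The certain equivalent of a deal is then the amount of money whose u-value equals the deal's mean u-value:

```python
import math

RHO = 1000.0  # risk tolerance of the exponential u-curve (an assumed value)

def u(x):
    return 1.0 - math.exp(-x / RHO)        # nondecreasing u-curve on money

def u_inverse(uv):
    return -RHO * math.log(1.0 - uv)       # money amount with u-value uv

def certain_equivalent(deal):
    """Mean u-value of the deal, mapped back to money."""
    mean_u = sum(p * u(x) for x, p in deal)
    return u_inverse(mean_u)

deal = [(0.0, 0.5), (1000.0, 0.5)]         # 50-50 chance of $0 or $1000
ce = certain_equivalent(deal)              # about $380, below the $500 mean
```

A certain equivalent below the mean reflects risk aversion: this decision maker would sell the deal for about $380 for certain.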
If there is no uncertainty in a decision situation, the rules are considerably simplified. The
probability rule does not require any assignment of probabilities since every alternative now has
a certain consequence. The order rule is still required because it expresses preferences in a
deterministic world. There is no need for either the equivalence or substitution rules because they
are concerned with preferences when there is uncertainty. The choice rule would still come into
play; however, the probabilities involved would be 1 and 0. For example, would I rather have
$20 or $10?
The rules are required for systematic decision-making in an uncertain world. Simpler rules,
some of which we shall later discuss, cannot handle uncertainty.
Practice of Decision Analysis
The purpose of decision analysis is to achieve clarity of action. If you already know what to do
beyond any doubt, do it. If you do not know what to do, then apply the philosophy of decision
analysis at an appropriate level. The process can be as simple as realizing that you are incorrectly
including sunk costs or failing to recognize an alternative available to you.
Decision Engineering
The professional practice of decision analysis is decision engineering. The rules dictate the
norms, but not how to create the representation of a decision that will skillfully and efficiently
yield clarity of insight to the decision maker. What often happens when people try to analyze
decisions using a structure like a decision tree is that they are tempted to include every possible
uncertainty they can think of and thereby create an unanalyzable bush rather than the spare
structure desired.
Creating a focused analysis requires the continual elimination of every factor that will not
contribute to making the decision. This winnowing has been a feature of decision analysis since
the beginning (Howard, 1968, 1970). Since DAADT, the process has been described as a
decision analysis cycle, depicted in Figure 3.4 (Howard, 1984a).
Figure 3.4 Early Decision Analysis Cycle
A brief description of the cycle is this. After framing the problem and specifying
alternatives, the uncertainties that appear to have an effect upon the decision are given nominal
ranges. The deterministic phase explores the sensitivity of alternatives to these uncertainties to
determine which are worthy of probabilistic analysis. The probabilistic phase encodes probability
distributions on these uncertainties, including necessary conditional distributions. It also requires
assessing the risk preference to be used for the decision. At this point, the best alternative can be
determined, but the process continues to the informational phase to find the value of eliminating
or reducing any or all of the uncertainties. The result is not only the best decision up to this point,
but also clear knowledge of the cost of ignorance. This may lead to new information gathering
alternatives and a repeat of the cycle, or simply to action.
By the way, this process was used in DAADT. Probability assignments were needed for a
material lifetime and three experts were knowledgeable. The probability distributions they
assigned individually and the one they agreed on collectively are shown in Figure 3.5.
Figure 3.5 Probability Assignment
It turned out that the effect of this uncertainty, deemed of great concern at the start of the
analysis, could be minimized by design changes and that the decision hinged on time preference.
There have been many refinements to the cycle over the years (Howard, 1984a, 1988). For
example in preparing for probabilistic evaluation, the “tornado diagram” (so named by a client
observing its shape) shows the sensitivity to uncertainties by the width of horizontal bars,
ordered from the widest to the narrowest. Since the variance of the resulting payoff usually
grows as the square of the length of the bars, only a few uncertainties with the longest bars are
typically needed in the probabilistic evaluation.
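The screening idea can be sketched as follows, using an invented profit model: each uncertainty is swung across its nominal range with the others held at base values, and the resulting payoff swings, sorted widest first, form the tornado bars.

```python
def payoff(price, volume, cost):
    return (price - cost) * volume  # hypothetical profit model (an assumption)

base = {"price": 10.0, "volume": 1000.0, "cost": 6.0}
ranges = {"price": (8.0, 12.0), "volume": (900.0, 1100.0), "cost": (5.5, 6.5)}

# Swing each uncertainty across its range, others held at base values.
bars = []
for name, (lo, hi) in ranges.items():
    swing = abs(payoff(**dict(base, **{name: hi})) - payoff(**dict(base, **{name: lo})))
    bars.append((name, swing))
bars.sort(key=lambda b: b[1], reverse=True)  # widest bar at the top of the tornado

# Variance grows roughly as the square of the bar length, so keep the
# widest bars until they account for ~95% of the total squared swing.
total_sq = sum(s * s for _, s in bars)
kept, cum = [], 0.0
for name, s in bars:
    kept.append(name)
    cum += s * s
    if cum / total_sq >= 0.95:
        break
print(kept)  # -> ['price', 'cost']
```

Only the kept uncertainties would then receive full probability distributions in the probabilistic phase.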
Influence, Relevance, and Decision Diagrams
A development that has aided practice and research over the past three decades is the
introduction of diagrams that contribute to the understanding, communication, and computation
of decision problems. The influence diagram (Howard and Matheson, 1980) provides a structure
that on the one hand is readily understood by decision-makers and yet is formally defined so that
it can serve as the basis for machine computation. I have found it to be an invaluable tool in the
classroom and in executive conversations.
Unfortunately, the use of the word "influence" has led some people into difficulty in creating
the diagrams. I have found it useful to introduce other forms. I call the special form of the
influence diagram that contains only uncertainties represented by uncertain nodes (usually
represented by a circle or oval) a "relevance diagram" (Howard, 1989, 1990). Relevance
diagrams directly address the question of inference in probabilistic networks, by representing the
conditional probability structure of distinctions. Arrows between uncertainties represent the
possibility that the probability of the successor uncertainty is conditional on the originating
uncertainty. Every relevance diagram expresses an assessment order for all uncertainties in the
diagram. A missing arrow is an assertion of the irrelevance of one uncertainty to another given
all uncertainties that are predecessors to both. Many puzzling problems in probability become
transparent when viewed in the form of a relevance diagram.
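As a minimal illustration (with invented numbers), the arrow in a two-node relevance diagram Disease → Test records that P(test result | disease) was assessed; reversing the arrow so as to assess P(disease | test result) is just Bayes' rule:

```python
# Assessed direction (Disease -> Test): invented numbers.
p_disease = 0.01                 # prior probability of disease
p_pos_given_disease = 0.95       # P(positive test | disease)
p_pos_given_healthy = 0.10       # P(positive test | no disease)

# Reversing the arrow (Test -> Disease) is Bayes' rule:
p_pos = p_disease * p_pos_given_disease + (1 - p_disease) * p_pos_given_healthy
p_disease_given_pos = p_disease * p_pos_given_disease / p_pos
# A positive test raises the probability of disease from 0.01 to about 0.088.
```

The diagram makes the assessment order explicit, and the reversal shows why a surprisingly small posterior is no paradox when the prior is low.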
A decision diagram contains decision nodes (rectangles) and the value node (hexagon or
octagon). It may also contain one or more relevance diagrams. Arrows into a decision node are
called "informational" arrows; they signify that the node originating the arrow is known when
that decision is made. Arrows into the value node are called "functional" arrows; they show the
nodes on which value depends. Sometimes a special case of the uncertain node called a
deterministic node (double walled circle or oval) is created. The arrows into such a deterministic
node are functional arrows; the value of the node is computable from its inputs. You can
consider the value node as a special deterministic node.
Figure 3.6 shows a decision diagram for a test that is relevant to an uncertainty. The test costs
money; value resides in the decisions and the uncertainty.
Figure 3.6 A Decision Diagram (nodes: Buy Test?, Test Result, Report, Decision, Uncertainty, Value)
Note that this decision diagram contains a relevance diagram on Test Result and Uncertainty
that has no arrows entering it. Decision diagrams that contain relevance diagrams with this
property are said to be in canonical form.
Why is canonical form important? Arrows that go from a decision node to an uncertain node
are called "influence" arrows. They assert that the probability assignment to an uncertain node
depends upon how a decision is made. They are problematical because they blur a separation we
have made between actions that are under our control, alternatives, and the uncertainties that
might be resolved by a clairvoyant that are not under our control. While influences do not pose
significant problems in computing the best decision, they do increase the difficulty of computing
the value of clairvoyance (Howard, 1990). The simple reason is that if a clairvoyant could tell
you something about an uncertainty affected by an influence, he would be telling you something
about an action on which you have not yet decided and thereby would call into question your
free will. One can avoid this difficulty by eliminating all influences from the decision diagram; if necessary, this is done by creating additional nodes in the relevance diagram conditioned on the influencing decisions. When this has been done, or when the original diagram contains no influences, the diagram is in canonical form.
Options
I consider one of the most important concepts in the practice of decision analysis to be that of an
option, properly defined (Howard, 1996). I do not restrict the term to financial options or even to
so-called real options that people attempt to replicate by portfolios of marketed securities. By
option, I mean an alternative that provides a new decision situation after the revelation of
information. Thus obtaining clairvoyance is an option because it permits you to make a
decision you face after resolving one or more uncertainties. The option may be obtained only at a
price, and there may be an additional cost, the exercise price, of using any alternative after
receiving the information. In computations involving clairvoyance, we typically assume that the
same alternatives will be available after the information is provided -- although this is not
necessary, we might have more or fewer. We also assume that there will be no additional cost to
exercising any of our original alternatives. All these comments apply to the option of performing
an information gathering activity, like an experiment, since we know that the experiment can be
valued by valuing clairvoyance on its results.
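A minimal, risk-neutral sketch (the numbers are invented) shows why clairvoyance is an option: the report arrives before the decision, so the choice can differ in each resolved state.

```python
p_success = 0.3
payoff = {
    "invest":  {"success": 100.0, "failure": -50.0},
    "decline": {"success": 0.0,   "failure": 0.0},
}

def ev(alt):
    return p_success * payoff[alt]["success"] + (1 - p_success) * payoff[alt]["failure"]

# Without clairvoyance: commit to one alternative before the uncertainty resolves.
value_without = max(ev(alt) for alt in payoff)

# With clairvoyance: the report arrives first, so choose the best alternative
# separately in each resolved state, then weight by the prior probabilities.
value_with = (p_success * max(payoff[alt]["success"] for alt in payoff)
              + (1 - p_success) * max(payoff[alt]["failure"] for alt in payoff))

value_of_clairvoyance = value_with - value_without
```

Here the best prior choice is to decline (value 0), but with the report one invests only on "success", so clairvoyance is worth 30: the cost of ignorance mentioned earlier.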
Sometimes options must be bought, like a fire extinguisher for your car; sometimes, they are
free of additional cost, like choosing what movie to see after driving to a multi-screen cinema.
Failure to recognize options and to incorporate them as sequential decisions is one of the
most important and consequential mistakes of decision analysis.
Cogency versus Verisimilitude
In representing a decision of professional size there is sometimes controversy about how much
detail to include. Notice that the standard of excellence for model representations, like model
trains, is verisimilitude: the correspondence of the model in detail to the real situation it is
intended to represent. At a model train exposition you might notice a tiny passenger train with a
bar car. In examining the bar car you might see that there are people inside holding drinks; this is
surely a detailed model. Further scrutiny shows that one of the drinks is clearly a martini
containing an olive; this will be a prize-winning train. Using a magnifying glass, you see that the
olive contains a pimento. Surely, this train will be "best of show".
Is verisimilitude the criterion for decision models? Is a decision model that includes "the
sales tax in Delaware" better than one that does not? The answer is no, unless that factor is
material to the decision. The criterion for decision models is cogency: whether the model leads to
crisp clarity of action for the decision maker. You should eliminate any feature that does not
contribute to this goal. If the decision maker insists on adding such embellishments, they should
be regarded as a professional courtesy, like giving the decision maker a ride to the airport, rather
than as part of professional decision analysis.
Ethics
Decision analysis is amoral, like an adding machine. Like any other powerful tool, people can
use decision analysis for good or ill. They can justify any course of action by manipulating the
elements of the analysis: the alternatives, information, and preferences. As organizations
increasingly accepted decision analysis, I became concerned about its ethical use (Howard, 1980,
1991, 2001), as should anyone teaching or practicing the discipline. The study of decision
analysis is an excellent precursor to ethical discussions for it illuminates both utilitarian
(consequence-based) and formalist (action-based) ethics. We find that characterizing actions as
to whether they are prudential, legal, and ethical is a helpful step in resolving ethical choices.
Language of Decision Analysis
Once again, decision analysis is more about clear thinking than about any of its detailed
procedures. Since even when thinking about a decision by ourselves we are going to use a
language to help us, it is extremely important that the language contain the proper concepts. This
is even more essential if we are discussing our decision with others. Whether supported by
modeling and computation or not, the decision conversation will become the basis for action.
My concern with the language of decision analysis goes back to DAADT:
One aid in reducing the problem to its fundamental components is restricting the
vocabulary that can be used in discussing the problem. Thus we carry on the discussion in
terms of events, random variables, probabilities, density functions, expectations,
outcomes, and alternatives. We do not allow fuzzy thinking about the nature of these
terms. Thus “The density function of the probability” and “The confidence in the
probability estimate” must be nipped in the bud. We speak of “assigning,” not
“estimating,” the probabilities of events and think of this assignment as based on our
“state of information.” These conventions eliminate statements like the one recently made
on a TV panel of doctors who were discussing the right of a patient to participate in
decision making on his treatment. One doctor asserted that the patient should be told of
“some kind of a chance of a likelihood of a bad result.” I am sure that the doctor was a
victim of the pressures of the program and would agree with us that telling the patient the
probability the doctor would assign to a bad result would be preferable.
Some of the communication advantages in using decision analysis were also spelled out:
One of the most important advantages of decision analysis lies in the way it encourages
meaningful communication among the members of the enterprise because it provides a common
language in which to discuss decision problems. Thus engineers and marketing planners with
quite different jargons can appreciate one another’s contributions to a decision. Both can use
the decision-analysis language to convey their feelings to management quickly and effectively.
My concern for precise language continues to the present. Refer to a recent paper on this
subject (Howard, 2004) for a thorough discussion of this issue. My intention is to have the
language we use in describing decisions consist of the simplest, least confusing, most accurate
terms for the concepts under discussion. Table 3.1 summarizes some of the recommended
changes in language that I have found useful in dealing with students and clients.
Table 3.1 Suggested Terminology Changes
Conventional Term | Preferred Term     | Purpose of Change
dependence        | relevance          | To emphasize the informational rather than the causal nature of conditional probability assignments.
outcome           | prospect           | To emphasize that decisions choose uncertain futures rather than only an immediate result.
expected value    | mean               | To recognize that the expected value is seldom to be expected. In discussing probability distributions, use “mean” to describe the first moment.
expected value    | certain equivalent | To recognize that the expected value is seldom to be expected. In describing the value of an alternative with uncertain values, use “certain equivalent.”
utility           | u-curve            | To avoid confusion with other uses of the word utility in related subjects, like marginal utility. The u-curve says nothing in a deterministic world.
Confusions
Uncertainty about Probability: As we previously discussed, the language of probability is
sufficient for describing the phenomenon of uncertainty. Yet people keep trying to build another
level they might call uncertainty about probability to describe any uneasiness they might feel in
the process of probability assignment. This is akin to the previous quote from DAADT where the
doctor said that the patient should be told of “some kind of a chance of a likelihood of a bad
result.” (Howard, 1988a). Once you have internalized the thinking of Laplace and Jaynes, any
notion of uncertainty about probability becomes unnecessary.
Deal Characterization and Deal Desirability: I recently consulted with a Board of Directors
concerned about whether to follow an alternative that had highly uncertain value prospects that
ranged from great profits to serious losses. There had been much previous discussion about
whether this alternative was "too risky" for the company. The members of the board were highly
educated and experienced business people.
The board might see an alternative like this as too risky for two logically separate reasons.
There might be a belief that the chances of large profits were overstated, or perhaps the chances
of losses understated. This would be a concern about the characterization of the deal, about
whether the analysis assigned proper probabilities to each level of profit and loss.
Once the deal has a proper characterization, the question is whether it is attractive given the
company's attitude toward risk: does the company have the stomach for this deal? Failure to
separate issues of characterization and desirability has been a perpetual source of confusion.
From DAADT:
“Often arguments over which is the best decision arise because the participants do not realize
that they are arguing on different grounds. Thus it is possible for A to think that a certain
alternative is riskier than it is in B’s opinion, either because A assigns different probabilities to
the outcomes than B but both are equally risk-averting, or because A and B assign the same
probabilities to the outcomes but differ in their risk aversion. If we are to make progress in
resolving the argument, we must identify the nature of the difficulty and bring it into the open.
Similar clarifications may be made in the areas of time preference or in the measurement of the
value of outcomes.”
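The separation can be made concrete. In this sketch (an exponential u-curve and invented numbers, both assumptions), two parties agree completely on the characterization of a deal yet disagree on its desirability because they differ in risk aversion:

```python
import math

def certain_equivalent(deal, rho):
    """Certain equivalent under an exponential u-curve with risk tolerance rho
    (the exponential form is an assumption for this sketch)."""
    mean_u = sum(p * (1.0 - math.exp(-x / rho)) for x, p in deal)
    return -rho * math.log(1.0 - mean_u)

# A and B agree on the characterization of the deal...
deal = [(1000.0, 0.5), (-200.0, 0.5)]

# ...but differ in risk aversion, so they disagree on its desirability:
ce_a = certain_equivalent(deal, rho=500.0)    # more risk-averse: about $103
ce_b = certain_equivalent(deal, rho=5000.0)   # nearly risk-neutral: about $364
```

Seeing that the disagreement lies entirely in risk attitude, not in the probabilities, is exactly the clarification the passage above calls for.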
Challenges of Decision Analysis
Classical Statistics Persists
In spite of the clear perspectives on uncertainty provided by Laplacian probability, much of the
teaching about uncertainty takes place in statistics classes where students are taught concepts that
can only confuse them. They learn about confidence intervals and confidence statements and
come to believe that a confidence statement is a probability assignment, even though the
professor is careful not to say so. If ever there was a "wolf in sheep's clothing" it is a confidence
statement posing as a probability assignment. Other classical statistics concepts have similar
problems. Hypothesis testing tells you nothing about how to form the hypothesis or how to
determine the probability level at which it can be rejected. Furthermore, if a hypothesis is
rejected, there is no procedure for what to do next. Maximum likelihood procedures are based
only on the data from an experiment and have no place for any preceding knowledge. None of
these methods are what Maxwell had in mind.
Dealing with Multiple Attributes
Decision problems can have prospects with many attributes that the decision maker would like to
consider. We shall here consider some of the procedures that have been developed to address this
problem.
Direct and Indirect Values: Sometimes the decision maker needs to assess preferences on many
fewer attributes than he or she would think. To see why, let us define direct and indirect values.
A direct value is one to be traded off by the decision maker against other direct values. An
indirect value is a distinction that is relevant to a direct value, but is not a direct value itself.
To illustrate, let us consider preferences for tourism and wildlife in Africa. A resort hotel
operator might have values like those shown on the left of Figure 3.7.
Figure 3.7 Values for Tourism and Wildlife (three panels: Hotel Operator’s Values, Environmentalist’s Values, Two Direct Values)
We suppose that he does not care at all about wildlife, but is very concerned about tourism.
He places a direct value on tourism and an indirect value on wildlife. Notice that the arrow
connecting wildlife and tourism in this diagram is a relevance arrow. This means that the
importance of wildlife depends on the information of the hotel operator. If a convincing study
showed that tourism would be unaffected by the amount of wildlife, then the operator would not
have an indirect value on wildlife. But if he believes, as is likely to be the case, that the tourists
come to see wildlife, then he would support measures preserving wildlife habitat and the
prevention of poaching.
The center diagram in Figure 3.7 shows how an environmentalist might value the same
distinctions. The environmentalist places a direct value on wildlife and an indirect value on
tourism. The importance of tourism to the environmentalist will depend on his beliefs about how
the presence of tourists enhances or harms wildlife.
Notice that both of these people have only one direct value and therefore no reason for value
trade-offs. Alternatives are judged by their effect on the one direct attribute. The right diagram in
Figure 3.7 shows a value function for someone who places a direct value on both wildlife and
tourism. Alternatives that produced different levels of these attributes would require trade-offs
between them.
I once conducted a session with oil company executives who believed they had to deal with
about 30 different attributes in making their decisions. After about an hour of discussion, and of
direct and indirect values, they finally agreed that there were only two direct values. One was the
profitability of the company, and the other was harm to people surrounding their facilities as the
result of company operations.
Focusing on direct values considerably simplifies analysis of multiattribute decision
situations.
Treating Multi-Attribute Problems Using a Value Function. Suppose I have reduced a multi-attribute problem I face to n attributes that have direct value. I would want one of these attributes
to be a value measure so that I can compute the value of clairvoyance, or of any experimentation,
in terms of this measure. In this n-dimensional space I now construct iso-preference surfaces,
combinations of attribute levels that are equally desirable to me. I can then identify each surface
by its intercept on the value measure axis. If I now have a joint distribution on the n-1 other
attributes, I will have a derived distribution on the value measure. I can then assign a u-curve on
the value measure and determine the certain equivalent of any alternative that could produce
these attributes, and thereby make my choice. The u-curve on the value measure would imply
preferences under uncertainty for each of the other attributes. No additional information would
be contained in these preferences. Proceeding in this fashion will allow establishing trade-offs
within the attributes and in particular, between any two attributes when the value of all others are
specified. The incorporation of the value measure permits the decision maker to see the
implication of choices in terms of a value scale of common experience. The benefit of being able
to compute the value of clairvoyance or of any other information is attainable only by using a
value measure (Matheson and Howard, 1968).
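A sketch of the procedure, with an assumed linear value function and invented numbers: the value function collapses the attributes onto the value measure, the joint distribution induces a derived distribution on value, and a u-curve on value yields the certain equivalent.

```python
import math

def value(money, quality):
    # Linear value function: iso-preference trade-off of $50 per quality point
    # (both the form and the number are assumptions for this sketch).
    return money + 50.0 * quality

# Joint distribution on the attributes: ((money, quality), probability).
joint = [((1000.0, 2.0), 0.5), ((200.0, 8.0), 0.5)]

# Derived distribution on the value measure.
derived = [(value(m, q), p) for (m, q), p in joint]

# A u-curve on the value measure (exponential, an assumption) gives the
# certain equivalent of the whole multi-attribute alternative.
rho = 2000.0
mean_u = sum(p * (1.0 - math.exp(-v / rho)) for v, p in derived)
ce = -rho * math.log(1.0 - mean_u)  # a bit below the mean value of 850
```

Because everything is expressed on the value measure, the value of clairvoyance on either attribute could now be computed in dollars by the same machinery.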
Other Approaches to Multi-Attribute Problems. Other approaches for multi-attribute problems
divide into two classes: those that satisfy the rules and those that do not. One that satisfies the
rules is the approach of placing a ‘multidimensional utility function’ directly on the attributes
(Keeney and Raiffa, 1976). This approach does not use a value function and as a result cannot
have the advantage of computing the value of clairvoyance unless a value measure is one of the
attributes. A check of the reference (Keeney and Raiffa, 1976) reveals that there is no discussion
of the value of information gathering in the book.
The other class of approaches to the multi-attribute valuation problem consists of
simplifications that do not offer all the benefits of the rules or methods that do not follow the
rules and hence may not meet some of the desiderata.
Weight and Rate. A simple way to handle many attributes is to assign, say, 100 points total and
then assign them to each of the attributes in accordance with their importance. The next step is to
rate each of the alternatives by seeing how many of the points of each attribute are earned by that
alternative. The point value of the alternative is obtained by summing over all attributes. This
procedure is very simple and may be helpful in choosing a car or a stereo. It is not so helpful
when there is uncertainty. Weight and rate assumes that preference for these deterministic
attributes is linear in the attributes: there can be no interaction among them. When I construct my
preferences for a peanut butter and jelly sandwich, I find they do not meet this condition. In
summary, weight and rate methods can be helpful, but cannot bear the weight of decisions with
uncertainty like choosing treatment by drugs or an operation, pharmaceutical development, or
planning finances for retirement.
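For completeness, here is a weight-and-rate sketch with invented weights and ratings; the additive form is exactly what rules out interaction among the attributes:

```python
# Invented weights (100 points total) and ratings (0 to 1 on each attribute).
weights = {"price": 40, "comfort": 35, "reliability": 25}
ratings = {
    "car_a": {"price": 0.9, "comfort": 0.5, "reliability": 0.7},
    "car_b": {"price": 0.6, "comfort": 0.9, "reliability": 0.8},
}

def score(alt):
    # Additive form: no interaction among attributes is possible.
    return sum(weights[a] * ratings[alt][a] for a in weights)

best = max(ratings, key=score)  # -> 'car_b'
```

Nothing in this scheme can represent uncertainty about the attribute levels, which is why it cannot bear the weight of decisions under uncertainty.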
Analytic Hierarchy Process. A widely used process that does not obey the rules is the analytic
hierarchy process (Howard, 1992). The result of the process is a weight and rate system that is
derived on the basis of comparative judgments of importance, preference, and likelihood. The
process has an air of mathematical sophistication, but its results are easily produced in a
spreadsheet by averaging or, equivalently, and with much more ado, by eigenvalues from matrix
iteration. AHP can incorporate uncertainty only approximately, has a major difficulty in
incorporating experimental information, cannot compute the value of information, and can
provide no warranty that the alternative it recommends is the best one. Since it does not follow
the rules, it is subject to failing a desideratum, like the one requiring that removing a
noninformative alternative cannot change the ordering of the existing alternatives (Dyer, 1990).
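The spreadsheet claim above can be checked directly. For a near-consistent pairwise comparison matrix (this 3x3 example is invented and, in fact, perfectly consistent), column-normalized row averaging and principal-eigenvector iteration give the same weights:

```python
# Invented 3x3 pairwise comparison matrix (perfectly consistent here):
A = [[1.0, 2.0, 4.0],
     [0.5, 1.0, 2.0],
     [0.25, 0.5, 1.0]]
n = len(A)

# Spreadsheet route: normalize each column, then average across each row.
col_sums = [sum(A[i][j] for i in range(n)) for j in range(n)]
avg_weights = [sum(A[i][j] / col_sums[j] for j in range(n)) / n for i in range(n)]

# Eigenvalue route: power iteration toward the principal eigenvector.
w = [1.0 / n] * n
for _ in range(50):
    w = [sum(A[i][j] * w[j] for j in range(n)) for i in range(n)]
    s = sum(w)
    w = [x / s for x in w]

# Both give weights of roughly 4/7, 2/7, 1/7.
```

The "much more ado" of the eigenvalue machinery adds nothing here, which is the point being made.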
Why, then, do inferior processes find favor with decision-makers? The answer is that they do
not force you to think very hard or to think in new ways. Since we rarely find epistemic
probability in our educational system, even in engineering, medical, and business schools, it is
not surprising that people generally find it challenging to follow the dictum of Maxwell. In
decision-making, as in many other pursuits, you have a choice of doing something the easy way
or the right way, and you will reap the consequences.
Risk Preference
The notion of risk preference and its representation is still a problem for people and
organizations. I once heard a distinguished decision science professor give a brilliant presentation
to business school professors on the necessity of having a personal risk attitude to guide
decisions. After the lecture, and in private, a graduate student asked the professor what his
personal risk attitude, i.e., u-curve, was. The professor admitted that he did not have one.
The issue of risk preference in my experience goes back to DAADT:
… for example, although we have tended to think of the utility theory as an academic
pursuit, one of our major companies was recently faced with the question, “Is 10 million
dollars of profit sufficient to incur one chance in 1 million of losing 1 billion dollars?"
Although the loss is staggering, it is realistic for the company concerned. Should such a
large company be risk-indifferent and make decisions on an expected value basis? Are
stockholders responsible for diversifying their risk externally to the company or should
the company be risk-averting on their behalf? For the first time the company faced these
questions in a formal way rather than deciding the particular question on its own merits
and this we must regard as a step forward.
Life and Death Decisions
One area in which I have a special interest is the use of decision analysis for making safety
decisions, and in general decisions involving a risk of death or serious bodily harm (Howard,
1978, 1980a, 1984, 1989a, 1999). I distinguish three stages of analysis. The first is risk
assessment to assess the magnitude of risks to life in a proper unit. The second is risk evaluation
to determine the importance of the risk in monetary terms. The third is risk management, or
decision-making, to choose what course of action best balances the advantages, disadvantages,
and safety consequences.
By using a properly sized probability unit, the microprobability, defined as a probability of
one in one million, small risks of death can be appreciated. A micromort, one microprobability of
death, is a handy unit of measure for the death risks faced from accidents. Placing a value on a
micromort permits making many safety decisions such as whether to take a beneficial drug with
possible deadly side effects.
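A sketch of such a calculation, with every number invented: express the drug's fatal-side-effect risk in micromorts, price it with an assumed value per micromort, and compare against the drug's benefit.

```python
MICROMORT = 1e-6  # a one-in-a-million probability of death

# All numbers below are invented for illustration.
p_fatal_side_effect = 3e-6                             # risk of a deadly side effect
risk_in_micromorts = p_fatal_side_effect / MICROMORT   # 3 micromorts

value_per_micromort = 20.0                 # dollars to avoid one micromort (assumed)
risk_cost = risk_in_micromorts * value_per_micromort   # about $60

drug_benefit = 250.0                       # monetary value of the drug's benefit (assumed)
take_drug = drug_benefit > risk_cost       # True: benefit outweighs the safety cost
```

The properly sized unit keeps the arithmetic in comfortable numbers rather than in probabilities like 0.000003.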
Future of Decision Analysis
Is it possible that the discipline of decision analysis developed over the last 40 years is no longer
necessary because of improvements in decision-making? Has the phenomenal growth in
computation and the availability of information obviated the need for decision analysis?
Unfortunately, that is not the case. Executives today are making the same mistakes their parents
and grandparents used to make. I hear consultants who observe poor executive decision making
say, "If we could have only 1% of the waste." Even the field of science has not learned the
lesson. Scientific journals are still accepting hypotheses that cannot be rejected at the 95%
confidence level. I believe that if Laplace could see the state of modern decision-making, he
would be appalled by our failure to use the systems we have been discussing, especially now that
we have the computational and communication tools he could only have dreamed of. With few
exceptions (Decision Education Foundation), students in elementary, secondary, college and
graduate schools do not learn how to think in an uncertain world.
Decision analysis has thrived in certain environments. Some consulting companies rely
extensively upon it. Decision analysis is entrenched in major industries like petroleum and
pharmaceuticals, and is heavily employed in electric power. There is hardly an industry from
paper to moviemaking that has not made use of decision analysis.
It has been said that every strength is accompanied by a weakness, and that is true of decision
analysis. One of its greatest strengths is its transparency: the decision basis is laid out for all
participants to see -- the alternatives considered, the information used and its sources, and finally
the preferences. Organizations wanting to use the full capability of their members to improve the
quality of a decision find this to be a great advantage. However, transparency is equally a threat
to organizations that wish to limit alternatives, control information, and hide preferences. The
more open the organization, private or public, the more it will value the process of decision
analysis.
Epilog
To me, incorporating the principles and philosophy of decision analysis is not just learning the
subject, but more like installing a new operating system in your brain.
Acknowledgment
I thank Ali Abbas for many helpful suggestions in preparing this chapter.
References
Bernoulli, D. (1738). "Specimen Theoriae Novae de Mensura Sortis" (Exposition of a New
Theory on the Measurement of Risk). Commentarii Academiae Scientiarum Imperialis
Petropolitanae, Tomus V [Papers of the Imperial Academy of Sciences in Petersburg, Vol. V],
pp. 175-192.
Cox, R.T. (2001). Algebra of Probable Inference. The Johns Hopkins University Press
Decision Education Foundation. http://decisioneducation.org
Dyer, J.S. (1990). Remarks on the Analytic Hierarchy Process. Management Science, Vol. 36,
No. 3, (249-258).
Howard, R.A. & Matheson, J.E. (1980)."Influence Diagrams", Private Report. Reprinted in
Howard, R.A. & Matheson, J.E. (Eds) READINGS on the Principles and Applications of
Decision Analysis. Strategic Decisions Group, Menlo Park, California, April 1984. To appear
in the journal Decision Analysis.
Howard, R.A. (1965). Bayesian Decision Models for Systems Engineering. IEEE Transactions
on Systems, Science and Cybernetics, Vol. SSC-1, No. 1, November
Howard, R.A. (1966). Decision Analysis: Applied Decision Theory. Proceedings of the Fourth
International Conference on Operational Research, pp. 55-71, Wiley-Interscience.
Reprinted in Howard, R. A. & Matheson, J. E. (Eds.) READINGS on the Principles and
Applications of Decision Analysis. Strategic Decisions Group, Menlo Park, California.
Howard, R.A. (1966a). Information Value Theory. IEEE Transactions on Systems, Science and
Cybernetics, Vol. SSC-2, No. 1, August. (22-26)
Howard, R.A. (1968). The Foundations of Decision Analysis. IEEE Transactions on Systems,
Science and Cybernetics, Vol. SSC-4, No. 3, September, (211-219).
Howard, R.A. (1970). Decision Analysis: Perspectives on Inference, Decision and
Experimentation. Proceedings of the IEEE, Vol. 58, No. 5, May 1970 (632-643).
Howard, R.A. (1973). Decision Analysis in Systems Engineering. In Miles, Jr., R. F. (Ed.)
Systems Concepts: Lectures on Contemporary Approaches to Systems., Wiley-Interscience.
pp. 51-85.
Howard, R.A. (1975). Social Decision Analysis. Proceedings of the IEEE, Vol. 63, No. 3, (359-371).
Howard, R.A. (1978). Life and Death Decision Analysis. Proceedings Second Lawrence
Symposium on Systems and Decision Sciences, Berkeley, California, October.
Howard, R.A. (1980). An Assessment of Decision Analysis. Special Issue on Decision Analysis,
Operations Research, Vol. 28, No. 1, January-February 1980, (4-27).
Howard, R.A. (1980a). On Making Life and Death Decisions. In Schwing, R. C. & Albers, Jr.,
W. A. (Eds.) Societal Risk Assessment, How Safe Is Safe Enough?, General Motors Research
Laboratories, Plenum Press.
Howard, R.A. (1984). On Fates Comparable to Death. Special Issue of Management Science,
April.
Howard, R.A. (1984a). The Evolution of Decision Analysis. In Howard, R. A. & Matheson, J. E.
(Eds.) READINGS on the Principles and Applications of Decision Analysis (pp. 7-16),
Strategic Decisions Group, Menlo Park, California.
Howard, R.A. (1988). Decision Analysis: Practice and Promise. Management Science, Vol. 34,
No. 6, June 1988, (679-695).
Howard, R.A. (1988a). Uncertainty About Probability: A Decision Analysis Perspective. Risk
Analysis, Vol. 8, No. 1, (91-98).
Howard, R.A. (1989). Knowledge Maps. Management Science, Vol. 35, No. 8, August, (903-922).
Howard, R.A. (1989a). Microrisks for Medical Decision Analysis. International Journal of
Technology Assessment in Health Care, Vol. 5, No. 3, (357-370)
Howard, R.A. (1990). From Influence to Relevance to Knowledge. In Oliver, R. M. & Smith,
J.Q. (Eds.) Influence Diagrams, Belief Nets and Decision Analysis (pp. 3-23). John Wiley &
Sons Ltd.
Howard, R.A. (1991). Business Ethics: Tell the Truth. Journal of Management Development,
Vol. 11, No. 4, (4-10).
Howard, R.A. (1992). Heathens, Heretics and Cults: The Religious Spectrum of Decision
Aiding. Interfaces, Vol. 22, No. 6. (15-27).
Howard, R.A. (1992a). In Praise of the Old Time Religion. In Edwards, W. (Ed.). Utility
Theories: Measurements and Applications. Kluwer Academic Publishers, pp. 27-55.
Howard, R.A. (1996). Options. In Zeckhauser, R., Keeney, R., and Sibenius, J. (Eds.) Wise
Choices: Decisions, Games, and Negotiations. Boston: Harvard Business School Press.
Howard, R.A. (1998). Foundations of Professional Decision Analysis: A Manuscript in Process.
Stanford Course Notes.
Howard, R.A. (1999). Life and Death Decisions for Individuals and Couples. In Shanteau, J.,
Mellers, B., and Schum, D (Eds.) Decision Science and Technology: Reflections on the
Contributions of Ward Edwards. Boston: Kluwer Academic Publishers.
Howard, R.A. (2000). Decisions in the Face of Uncertainty. In Alexander, C. (Ed.) Visions of
Risk. London: Pearson Education Limited
Howard, R.A. (2001). The Ethical OR/MS Professional. INTERFACES, Vol. 31, No. 6,
November-December, (69-82)
Howard, R.A. (2004). Speaking of Decisions: Precise Decision Language. Decision Analysis,
Vol. 1, No. 2, June 2004.
Jaynes, E.T. (1986). Bayesian Methods: General Background. In Justice, J. H. (Ed.) Maximum-
Entropy and Bayesian Methods in Applied Statistics. Cambridge University Press, Cambridge, p. 1.
Jaynes, E.T. (2003). Probability Theory: The Logic of Science. Cambridge University Press.
Jeffreys, H. (1939). The Theory of Probability. Oxford University Press, New York (3rd Ed.,
1961)
Keeney, R.L. & Raiffa, H. (1976). Decisions with Multiple Objectives. John Wiley & Sons, Inc.
Laplace, P.S. (1812). Analytical Theory of Probability.
Laplace, P.S. (1996). A Philosophical Essay on Probabilities. Dover Publications.
Matheson, J.E. & Howard, R.A. (1968). An Introduction to Decision Analysis. In Howard, R. A.
and Matheson, J. E. (Eds.) The Principles and Applications of Decision Analysis, Vol. I.
Strategic Decisions Group, Menlo Park, California, USA.