MATH30-6 Lecture 9

Joint Probability Distributions
MATH30-6
Probability and Statistics
Objectives
At the end of the lesson, the students are expected to
• Use joint probability mass functions and joint probability density functions to calculate probabilities;
• Calculate marginal and conditional probability distributions from joint probability distributions; and
• Interpret and calculate covariances and correlations between random variables.
Joint Probability Mass Function
The joint probability mass function of the discrete
random variables X and Y, denoted as f_XY(x, y), satisfies
(1) f_XY(x, y) ≥ 0
(2) Σ_x Σ_y f_XY(x, y) = 1
(3) f_XY(x, y) = P(X = x, Y = y)
(5-1)
• Sometimes referred to as the bivariate probability distribution or bivariate distribution of the random variables
β€’ 𝑃 𝑋 = π‘₯ and π‘Œ = 𝑦 is usually written as 𝑃 𝑋 =
Joint Probability Mass Function
Examples:
5-1/153 Signal Bars Calls are made to check the airline
schedule at your departure city. You monitor the number
of bars of signal strength on your cell phone and the
number of times you have to state the name of your
departure city before the voice system recognizes the
name.
Let
X denote the number of bars of signal strength on your
cell phone
Y denote the number of times you need to state your
departure city
Joint Probability Mass Function
By specifying the probability of each of the points in Fig.
5-1, we specify the joint probability distribution of X and
Y. Similarly to an individual random variable, we define
the range of the random variables (X, Y) to be the set of
points (x, y) in two-dimensional space for which the
probability that X = x and Y = y is positive.
Joint Probability Mass Function
3.14/95 Two refills for a ballpoint pen are selected at
random from a box that contains 3 blue refills, 2 red
refills, and 3 green refills. If X is the number of blue refills and Y is the number of red refills selected, find
(a) the joint probability function f(x, y), and
(b) P[(X, Y) ∈ A], where A is the region {(x, y) | x + y ≤ 1}.
Joint Probability Density Function
A joint probability density function for the continuous
random variables X and Y, denoted as f_XY(x, y), satisfies the following properties:
(1) f_XY(x, y) ≥ 0 for all x, y
(2) ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dx dy = 1
(3) For any region R of two-dimensional space,
P[(X, Y) ∈ R] = ∫∫_R f_XY(x, y) dx dy
(5-2)
Joint Probability Density Function
Examples:
5-2/155 Server Access Time Let the random variable X
denote the time until a computer server connects to your
machine (in milliseconds), and let Y denote the time until
the server authorizes you as a valid user (in milliseconds).
Each of these random variables measures the wait from a
common starting time and X < Y. Assume that the joint
probability density function for X and Y is
π‘“π‘‹π‘Œ π‘₯, 𝑦 = 6 × 10βˆ’6 𝑒 βˆ’0.001π‘₯βˆ’0.002𝑦 for π‘₯ < 𝑦
Joint Probability Density Function
The region with nonzero probability is shaded in Fig. 5-4.
The property that this joint probability density function
integrates to 1 can be verified by the integral of f_XY(x, y) over this region as follows:
∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dy dx = ∫_0^∞ ∫_x^∞ 6 × 10^−6 e^(−0.001x − 0.002y) dy dx
= 6 × 10^−6 ∫_0^∞ ( ∫_x^∞ e^(−0.002y) dy ) e^(−0.001x) dx
Joint Probability Density Function
= 6 × 10^−6 ∫_0^∞ ( e^(−0.002x) / 0.002 ) e^(−0.001x) dx
= 0.003 ∫_0^∞ e^(−0.003x) dx
Therefore, ∫_{−∞}^{∞} ∫_{−∞}^{∞} f_XY(x, y) dy dx = 0.003 (1 / 0.003) = 1.
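As a quick numerical cross-check (an illustration added here, not part of the textbook derivation), the same double integral can be approximated with SciPy; the region x < y is encoded by letting the inner y-limits depend on x:

import numpy as np
from scipy import integrate

# Joint pdf from Example 5-2; dblquad expects the integrand as f(y, x)
f = lambda y, x: 6e-6 * np.exp(-0.001 * x - 0.002 * y)

total, err = integrate.dblquad(f, 0, np.inf, lambda x: x, lambda x: np.inf)
print(total)   # approximately 1.0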
Joint Probability Density Function
The probability that X ≤ 1000 and Y ≤ 2000 is determined as the integral over the darkly shaded region in Fig. 5-5.
P(X ≤ 1000, Y ≤ 2000) = ∫_0^1000 ∫_x^2000 f_XY(x, y) dy dx
= 6 × 10^−6 ∫_0^1000 ( ∫_x^2000 e^(−0.002y) dy ) e^(−0.001x) dx
= 6 × 10^−6 ∫_0^1000 ( (e^(−0.002x) − e^−4) / 0.002 ) e^(−0.001x) dx
= 0.003 ∫_0^1000 ( e^(−0.003x) − e^−4 e^(−0.001x) ) dx
Joint Probability Density Function
= 0.003 [ (1 − e^−3) / 0.003 − e^−4 (1 − e^−1) / 0.001 ]
P(X ≤ 1000, Y ≤ 2000) = 0.003 (316.738 − 11.578)
= 0.915
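The last line of arithmetic can be reproduced directly (plain Python, just a check of the numbers):

from math import exp

term1 = (1 - exp(-3)) / 0.003             # about 316.738
term2 = exp(-4) * (1 - exp(-1)) / 0.001   # about 11.578
print(0.003 * (term1 - term2))            # about 0.915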
Joint Probability Density Function
3.15/96 A privately owned business operates a drive-in
facility and a walk-in facility. On a randomly selected day,
let 𝑋 and π‘Œ, respectively, be the proportions of the time
that the drive-in and the walk-in facilities are in use, and
suppose that the joint density function of these random
variables is
2
𝑓 π‘₯, 𝑦 = 5 2π‘₯ + 3𝑦 , 0 ≀ π‘₯ ≀ 1, 0 ≀ 𝑦 ≀ 1,
0, elsewhere.
(a) Verify condition 2.
(b) Find 𝑃 𝑋, π‘Œ ∈ 𝐴 , where 𝐴 =
1 1
2 4
π‘₯, 𝑦 |0 < π‘₯ < , <
Marginal Probability Mass Function
Marginal probability distribution
• The individual probability distribution of a random variable
The marginal probability mass functions of X alone and of Y alone are
f_X(x) = Σ_y f(x, y)   and   f_Y(y) = Σ_x f(x, y)
Marginal Probability Mass Function
Examples:
5-3/157 Marginal Distribution The joint probability
distribution of X and Y in Fig. 5-1 can be used to find the
marginal probability distribution of X. For example,
f_X(3) = P(X = 3)
= P(X = 3, Y = 1) + P(X = 3, Y = 2) + P(X = 3, Y = 3) + P(X = 3, Y = 4)
= 0.25 + 0.2 + 0.05 + 0.05 = 0.55
The marginal probability distribution for X is found by
summing the probabilities in each column, whereas the
marginal probability distribution for Y is found by
summing the probabilities in each row. The results are
shown in Fig. 5-6.
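Only the x = 3 column of Fig. 5-1 is quoted above, so the sketch below uses just those four entries; with the full figure, the same column sum would give every value of f_X:

# Quoted joint-PMF entries for x = 3
f_xy = {(3, 1): 0.25, (3, 2): 0.20, (3, 3): 0.05, (3, 4): 0.05}

# Marginal of X at x = 3: sum the joint PMF over all y (a column sum in Fig. 5-6)
f_x3 = sum(p for (x, y), p in f_xy.items() if x == 3)
print(f_x3)   # 0.55 (up to floating-point rounding)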
Marginal Probability Mass Function
3.16/98 Show that the column and row totals of Table 3.1
give the marginal distribution of X alone and of Y alone.
Table 3.1: Joint Probability Distribution for Example 3.14
f(x, y)            x = 0    x = 1    x = 2    Row Totals
y = 0              3/28     9/28     3/28     15/28
y = 1              3/14     3/14     0        3/7
y = 2              1/28     0        0        1/28
Column Totals      5/14     15/28    3/28     1
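A small Python sketch that reproduces the row and column totals of Table 3.1 (exact fractions keep the arithmetic clean):

from fractions import Fraction as F

# Table 3.1 entries as {(x, y): f(x, y)}
f = {(0, 0): F(3, 28), (1, 0): F(9, 28), (2, 0): F(3, 28),
     (0, 1): F(3, 14), (1, 1): F(3, 14), (2, 1): F(0),
     (0, 2): F(1, 28), (1, 2): F(0),     (2, 2): F(0)}

# Column totals give the marginal of X; row totals give the marginal of Y
g = {x: sum(p for (xi, yi), p in f.items() if xi == x) for x in (0, 1, 2)}
h = {y: sum(p for (xi, yi), p in f.items() if yi == y) for y in (0, 1, 2)}
print(g)   # marginal of X: 5/14, 15/28, 3/28
print(h)   # marginal of Y: 15/28, 3/7, 1/28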
Marginal Probability Mass Function
3.50/106 Suppose that X and Y have the following joint
probability distribution:
f(x, y)    x = 2    x = 4
y = 1      0.10     0.15
y = 3      0.20     0.30
y = 5      0.10     0.15
(a) Find the marginal distribution of X.
(b) Find the marginal distribution of Y.
Marginal Probability Density
Function
If the joint probability density function of random
variables X and Y is π‘“π‘‹π‘Œ π‘₯, 𝑦 , the marginal probability
density functions of X and Y are
𝑓𝑋 π‘₯ =
𝑦
π‘“π‘‹π‘Œ π‘₯, 𝑦 𝑑𝑦 and π‘“π‘Œ 𝑦 =
π‘₯
π‘“π‘‹π‘Œ π‘₯, 𝑦 𝑑π‘₯
(5-3)
where the first integral is over all points in the range of
𝑋, π‘Œ for which X = x and the second integral is over all
points in the range of 𝑋, π‘Œ for which Y = y.
Marginal Probability Density
Function
A probability for only one random variable, say, for
example, 𝑃 π‘Ž < 𝑋 < 𝑏 , can be found from the marginal
probability distribution of X or from the integral of the
joint probability distribution of X and Y as
𝑏
𝑃 π‘Ž<𝑋<𝑏 =
𝑏
=
𝑏
𝑓𝑋 π‘₯ 𝑑π‘₯ =
π‘Ž
∞
𝑓 π‘₯, 𝑦 𝑑𝑦 𝑑π‘₯
π‘Ž
βˆ’βˆž
∞
𝑓 π‘₯, 𝑦 𝑑𝑦 𝑑π‘₯
π‘Ž
βˆ’βˆž
Marginal Probability Density
Function
Examples:
5-4/157 Server Access Time For the random variables that
denote times in Example 5-2, calculate the probability
that Y exceeds 2000 milliseconds.
This probability is determined as the integral of f_XY(x, y)
over the darkly shaded region in Fig. 5-7. The region is
partitioned into two parts and different limits of
integration are determined for each part.
Marginal Probability Density
Function
𝑃 π‘Œ > 2000
2000
∞
6 × 10βˆ’6 𝑒 βˆ’0.001π‘₯βˆ’0.002𝑦 𝑑𝑦 𝑑π‘₯
=
0
∞
2000
∞
6 × 10βˆ’6 𝑒 βˆ’0.001π‘₯βˆ’0.002𝑦 𝑑𝑦 𝑑π‘₯
+
2000
π‘₯
Marginal Probability Density
Function
The first integral is
∫_0^2000 6 × 10^−6 ( ∫_2000^∞ e^(−0.002y) dy ) e^(−0.001x) dx
= (6 × 10^−6 / 0.002) e^−4 ∫_0^2000 e^(−0.001x) dx
= (6 × 10^−6 / 0.002) e^−4 (1 − e^−2) / 0.001
= 0.0475
Marginal Probability Density
Function
The second integral is
∫_2000^∞ 6 × 10^−6 ( ∫_x^∞ e^(−0.002y) dy ) e^(−0.001x) dx
= (6 × 10^−6 / 0.002) ∫_2000^∞ e^(−0.003x) dx
= (6 × 10^−6 / 0.002) (e^−6 / 0.003) = 0.0025
Therefore,
P(Y > 2000) = 0.0475 + 0.0025 = 0.05
Marginal Probability Density
Function
Alternatively, the probability can be calculated from the
marginal probability distribution of Y as follows. For y > 0,
f_Y(y) = ∫_0^y 6 × 10^−6 e^(−0.001x − 0.002y) dx
= 6 × 10^−6 e^(−0.002y) ∫_0^y e^(−0.001x) dx
= 6 × 10^−6 e^(−0.002y) [ −e^(−0.001x) / 0.001 ]_0^y
= 6 × 10^−6 e^(−0.002y) (1 − e^(−0.001y)) / 0.001
Marginal Probability Density
Function
π‘“π‘Œ 𝑦 = 6 × 10βˆ’3 𝑒 βˆ’0.002𝑦 1 βˆ’ 𝑒 βˆ’0.001𝑦 for y > 0
We have obtained the marginal probability density
function of Y. Now,
∞
𝑃 π‘Œ > 2000 = 6 × 10βˆ’3
= 6 × 10βˆ’3
𝑒 βˆ’0.002𝑦
𝑒 βˆ’0.002𝑦 1 βˆ’ 𝑒 βˆ’0.001𝑦 𝑑𝑦
2000
∞
βˆ’0.002
= 6 × 10βˆ’3
2000
𝑒 βˆ’4
βˆ’
𝑒 βˆ’0.003𝑦
∞
βˆ’0.003
𝑒 βˆ’6
βˆ’
= 0.05
0.002 0.003
2000
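A numerical cross-check of the same answer from the marginal density just derived (SciPy quadrature, added as an illustration):

import numpy as np
from scipy.integrate import quad

# Marginal of Y from Example 5-4, valid for y > 0
f_y = lambda y: 6e-3 * np.exp(-0.002 * y) * (1 - np.exp(-0.001 * y))

p, err = quad(f_y, 2000, np.inf)
print(p)   # approximately 0.05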
Marginal Probability Density
Function
3.40/105 A fast-food restaurant operates both a drive-through
facility and a walk-in facility. On a randomly selected day, let X
and Y, respectively, be the proportions of the time that the
drive-through and walk-in facilities are in use, and suppose that
the joint density function of these random variables is
f(x, y) = (2/3)(x + 2y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
          0, elsewhere.
(a) Find the marginal density of X.
(b) Find the marginal density of Y.
(c) Find the probability that the drive-through facility is busy less
than one-half of the time.
Conditional Probability Mass
Function
Let X and Y be two random variables, discrete or
continuous. The conditional distribution of the random
variable Y, given that X = x, is
π‘“π‘‹π‘Œ π‘₯, 𝑦
π‘“π‘Œ|π‘₯ 𝑦 =
, 𝑓𝑋 π‘₯ > 0.
𝑓𝑋 π‘₯
Similarly, the conditional distribution of the random
variable X, given that Y = y, is
π‘“π‘‹π‘Œ π‘₯, 𝑦
𝑓𝑋|𝑦 π‘₯ =
, π‘“π‘Œ 𝑦 > 0.
π‘“π‘Œ 𝑦
Conditional Probability Mass
Function
Examples:
5-5/159 Signal Bars For Example 5-1, X and Y denote the
number of bars of signal strength and the number of times you need to state your departure city, respectively. Then,
P(Y = 1 | X = 3) = P(X = 3, Y = 1) / P(X = 3)
= f_XY(3, 1) / f_X(3) = 0.25 / 0.55 = 0.454
The probability that Y = 2 given that X = 3 is
P(Y = 2 | X = 3) = P(X = 3, Y = 2) / P(X = 3)
= f_XY(3, 2) / f_X(3) = 0.2 / 0.55 = 0.364
Conditional Probability Mass
Function
Additional Conclusion: Further work shows that P(Y = 3 | X = 3) = 0.091 and P(Y = 4 | X = 3) = 0.091. Note that P(Y = 1 | X = 3) + P(Y = 2 | X = 3) + P(Y = 3 | X = 3) + P(Y = 4 | X = 3) = 1. This set of probabilities defines the conditional probability distribution of Y given that X = 3.
Conditional Probability Mass
Function
3.18/99 Referring to Example 3.14, find the conditional
distribution of X, given that Y = 1, and use it to determine
𝑃 𝑋 = 0|π‘Œ = 1 .
Table 3.1: Joint Probability Distribution for Example 3.14
x
Row
Totals
f(x, y)
0
1
2
0
3/28
3/14
1/28
1
9/28
3/14
0
2
3/28
0
0
15/28
3/7
1/28
Column Totals
5/14
15/28
3/28
1
y
Conditional Probability Mass
Function
3.49/106 Let X denote the number of times a certain
numerical control machine will malfunction: 1, 2, or 3
times on any given day. Let Y denote the number of
times a technician is called on an emergency call. Their
joint probability distribution is given as
f(x, y)    x = 1    x = 2    x = 3
y = 1      0.05     0.05     0.10
y = 3      0.05     0.10     0.35
y = 5      0.00     0.20     0.10
(a) Evaluate the marginal distribution of X.
(b) Evaluate the marginal distribution of Y.
(c) Find 𝑃 π‘Œ = 3|𝑋 = 2 .
Conditional Probability Density
Function
Given continuous random variables X and Y with joint
probability density function π‘“π‘‹π‘Œ π‘₯, 𝑦 , the conditional
probability density function of Y given X = x is
π‘“π‘Œ|π‘₯ 𝑦 =
π‘“π‘‹π‘Œ π‘₯,𝑦
𝑓𝑋 π‘₯
for 𝑓𝑋 π‘₯ > 0
Equivalently,
𝑓𝑋|𝑦
π‘“π‘‹π‘Œ π‘₯, 𝑦
π‘₯ =
, π‘“π‘Œ 𝑦 > 0
π‘“π‘Œ 𝑦
(5-4)
Conditional Probability Density
Function
Because the conditional probability density function
π‘“π‘Œ|π‘₯ 𝑦 is a probability density function for all y in 𝑅π‘₯ , the
following properties are satisfied:
(1) π‘“π‘Œ|π‘₯ 𝑦 β‰₯ 0
(2) π‘“π‘Œ|π‘₯ 𝑦 𝑑𝑦 = 1
(3) 𝑃 π‘Œ ∈ 𝐡|𝑋 = π‘₯ =
range of Y
𝐡
π‘“π‘Œ|π‘₯ 𝑦 𝑑𝑦 for any set B in the
(5-5)
Conditional Probability Density
Function
Examples:
3.19/100 The joint density function for the random
variables 𝑋, π‘Œ , where X is the unit temperature change
and Y is the proportion of spectrum shift that a certain
atomic particle produces is
10π‘₯𝑦 2 , 0 < π‘₯ < 𝑦 < 1
π‘“π‘‹π‘Œ π‘₯, 𝑦 =
0, elsewhere.
a) Find the marginal densities 𝑓𝑋 π‘₯ , π‘“π‘Œ 𝑦 , and the
conditional probability π‘“π‘Œ|π‘₯ 𝑦 .
b) Find the probability that the spectrum shifts more
than half of the total observations, given the
temperature is increased to 0.25 units.
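A symbolic sketch for both parts with SymPy (the support 0 < x < y < 1 enters through the integration limits):

import sympy as sp

x, y = sp.symbols('x y', positive=True)
f = 10 * x * y**2                     # joint density on 0 < x < y < 1

f_X = sp.integrate(f, (y, x, 1))      # marginal of X: 10x(1 - x^3)/3
f_Y = sp.integrate(f, (x, 0, y))      # marginal of Y: 5y^4
f_Y_given_x = sp.simplify(f / f_X)    # conditional density: 3y^2/(1 - x^3)

# b) P(Y > 1/2 | X = 0.25)
p = sp.integrate(f_Y_given_x.subs(x, sp.Rational(1, 4)), (y, sp.Rational(1, 2), 1))
print(p)   # 8/9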
Conditional Probability Density
Function
3.20/100 Given the joint density function
π‘₯ 1 + 3𝑦 2
, 0 < π‘₯ < 2, 0 < 𝑦 < 1
π‘“π‘‹π‘Œ π‘₯, 𝑦 =
4
0, elsewhere,
find 𝑓𝑋 π‘₯ , π‘“π‘Œ 𝑦 , 𝑓𝑋|𝑦 π‘₯ , and evaluate 𝑃
1
4
<𝑋<
Conditional Probability Density
Function
3.53/106 Given the joint density function
π‘“π‘‹π‘Œ
6βˆ’π‘₯βˆ’π‘¦
, 0 < π‘₯ < 2, 2 < 𝑦 < 4,
π‘₯, 𝑦 =
8
0, elsewhere,
find 𝑃 1 < π‘Œ < 3|𝑋 = 1 .
Independence
For random variables X and Y, if any one of the following
properties is true, the others are also true, and X and Y
are independent.
(1) π‘“π‘‹π‘Œ π‘₯, 𝑦 = 𝑓𝑋 π‘₯ π‘“π‘Œ 𝑦 for all x and y
(2) π‘“π‘Œ|π‘₯ 𝑦 = π‘“π‘Œ 𝑦 for all x and y with 𝑓𝑋 π‘₯ > 0
(3) 𝑓𝑋|𝑦 π‘₯ = 𝑓𝑋 π‘₯ for all x and y with π‘“π‘Œ 𝑦 > 0
(4) 𝑃 𝑋 ∈ 𝐴, π‘Œ ∈ 𝐡 = 𝑃 𝑋 ∈ 𝐴 𝑃 π‘Œ ∈ 𝐡 for any sets A
and B in the range of X and Y, respectively.
(5-7)
Independence
Examples:
5-11/162 Independent Random Variables Suppose that
Example 5-2 is modified so that the joint probability
density function of X and Y is f_XY(x, y) = 2 × 10^−6 exp(−0.001x − 0.002y) for x ≥ 0 and y ≥ 0. Show that X and Y are independent and determine P(X > 1000, Y < 1000).
Independence
The marginal probability density function of Y is
f_Y(y) = ∫_0^∞ 2 × 10^−6 e^(−0.001x − 0.002y) dx = 0.002 e^(−0.002y) for y > 0
Similarly, f_X(x) = ∫_0^∞ 2 × 10^−6 e^(−0.001x − 0.002y) dy = 0.001 e^(−0.001x) for x > 0.
Therefore, f_XY(x, y) = f_X(x) f_Y(y) for all x and y, and X and Y are independent.
Independence
To determine the probability requested, property (4) of
Equation 5-7 and the fact that each random variable has
an exponential distribution can be applied.
𝑃 𝑋 > 1000, π‘Œ < 1000 = 𝑃 𝑋 > 1000 𝑃 π‘Œ < 1000
= 𝑒 βˆ’1 1 βˆ’ 𝑒 βˆ’2 = 0.0318
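A one-line arithmetic check of the product (plain Python):

from math import exp
print(exp(-1) * (1 - exp(-2)))   # approximately 0.318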
Independence
3.21/102 Show that the random variables of Example 3.14
are not statistically independent.
Table 3.1: Joint Probability Distribution for Example 3.14
f(x, y)            x = 0    x = 1    x = 2    Row Totals
y = 0              3/28     9/28     3/28     15/28
y = 1              3/14     3/14     0        3/7
y = 2              1/28     0        0        1/28
Column Totals      5/14     15/28    3/28     1
Examples
5-1/167 Show that the following function satisfies the
properties of a joint probability mass function.
x             1      1.5    1.5    2.5    3
y             1      2      3      4      5
fXY(x, y)     1/4    1/8    1/4    1/4    1/8
Determine the following:
(a) P(X < 2.5, Y < 3)
(b) P(X < 2.5)
(c) P(Y < 3)
(d) P(X > 1.8, Y > 4.7)
Examples
(e) 𝐸 𝑋 , 𝐸 π‘Œ , 𝑉 𝑋 , and 𝑉 π‘Œ
(f) Marginal probability distribution of the random
variable X
(g) Conditional probability distribution of Y given that X =
2.5
(h) Conditional probability distribution of X given that Y =
2.
(i) 𝐸 π‘Œ|𝑋 = 1.5
(j) Are X and Y independent?
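The sketch below shows how several of these parts can be computed from the five-point PMF of Exercise 5-1; the remaining parts follow the same pattern:

# Joint PMF of Exercise 5-1: the five points (x, y) and their probabilities
pmf = {(1.0, 1): 0.25, (1.5, 2): 0.125, (1.5, 3): 0.25,
       (2.5, 4): 0.25, (3.0, 5): 0.125}

# (a) P(X < 2.5, Y < 3), (b) P(X < 2.5), (c) P(Y < 3)
print(sum(p for (x, y), p in pmf.items() if x < 2.5 and y < 3))   # 0.375
print(sum(p for (x, y), p in pmf.items() if x < 2.5))             # 0.625
print(sum(p for (x, y), p in pmf.items() if y < 3))               # 0.375

# (e) E(X) and V(X); E(Y) and V(Y) are analogous
EX = sum(x * p for (x, y), p in pmf.items())
VX = sum((x - EX) ** 2 * p for (x, y), p in pmf.items())
print(EX, VX)   # 1.8125 and about 0.4961

# (f) marginal distribution of X
fx = {}
for (x, y), p in pmf.items():
    fx[x] = fx.get(x, 0) + p
print(fx)   # {1.0: 0.25, 1.5: 0.375, 2.5: 0.25, 3.0: 0.125}

# conditional distribution of Y given X = 1.5, then (i) E(Y | X = 1.5)
fy_given = {y: p / fx[1.5] for (x, y), p in pmf.items() if x == 1.5}
print(sum(y * p for y, p in fy_given.items()))   # about 2.667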
Examples
5-2/167 Determine the value of c that makes the function
𝑓 π‘₯, 𝑦 = 𝑐 π‘₯ + 𝑦 a joint probability mass function over
the nine points with x = 1, 2, 3 and y = 1, 2, 3. Determine
the following:
(a) 𝑃 𝑋 = 1, π‘Œ < 4
(b) 𝑃 𝑋 = 1
(c) 𝑃 π‘Œ = 2
(d) 𝑃 𝑋 < 2, π‘Œ < 2
(e) 𝐸 𝑋 , 𝐸 π‘Œ , 𝑉 𝑋 , and 𝑉 π‘Œ
(f) Marginal probability distribution of the random
variable X
(g) Conditional probability distribution of Y given that X =
1
Examples
(h) Conditional probability distribution of X given that Y =
2
(i) 𝐸 π‘Œ|𝑋 = 1
(j) Are X and Y independent?
Examples
5-4/167 Four electronic printers are selected from a large
lot of damaged printers. Each printer is inspected and
classified as containing either a major or a minor defect.
Let the random variables X and Y denote the number of
printers with major and minor defects, respectively.
Determine the range of the joint probability distribution
of X and Y.
Examples
5-6/167 A small-business Web site contains 100 pages with low, moderate, and high graphic content. A sample of four pages is selected without replacement, and X and Y denote the number of pages with moderate and high graphics output in the sample. Determine:
(a) π‘“π‘‹π‘Œ π‘₯, 𝑦
(b) 𝑓𝑋 π‘₯
(c) 𝐸 𝑋
(d) π‘“π‘Œ|3 𝑦
(e) 𝐸 π‘Œ|𝑋 = 3
(f) 𝑉 π‘Œ|𝑋 = 3
(g) Are X and Y independent?
Expected Value of a Function of
Two Random Variables
β„Ž π‘₯, 𝑦 π‘“π‘‹π‘Œ π‘₯, 𝑦
𝐸 β„Ž 𝑋, π‘Œ
𝑋, π‘Œ discrete
=
β„Ž π‘₯, 𝑦 π‘“π‘‹π‘Œ π‘₯, 𝑦 𝑑π‘₯ 𝑑𝑦
(5-13)
𝑋, π‘Œ continuous
Expected Value of a Function of
Two Random Variables
Example:
5-19/171 For the joint probability distribution of the two
random variables in Fig. 5-12, calculate E[(X − μ_X)(Y − μ_Y)].
Covariance
• A measure of linear relationship between the random variables
• If the relationship between the random variables is nonlinear, the covariance might not be sensitive to the relationship, as illustrated in Fig. 5-13(d). The only points with nonzero probability are the points on the circle.
cov(X, Y) = σ_XY = E[(X − μ_X)(Y − μ_Y)]
(5-14)
Covariance
Examples:
5-20/173 In Example 5-1, the random variables X and Y
are the number of signal bars and the number of times
you need to state your departure city, respectively. Is the
covariance between X and Y positive or negative?
Negative covariance
Covariance
4.47/127 For the random variables X and Y whose joint
density function is given in Exercise 3.40 on page 105,
find the covariance.
f(x, y) = (2/3)(x + 2y), 0 ≤ x ≤ 1, 0 ≤ y ≤ 1,
          0, elsewhere.
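A symbolic sketch of this covariance, using the identity cov(X, Y) = E(XY) − E(X)E(Y) (SymPy):

import sympy as sp

x, y = sp.symbols('x y')
f = sp.Rational(2, 3) * (x + 2 * y)          # joint density on the unit square

E = lambda g: sp.integrate(g * f, (x, 0, 1), (y, 0, 1))
mu_x, mu_y = E(x), E(y)                      # 5/9 and 11/18
cov = E(x * y) - mu_x * mu_y
print(cov)                                   # -1/162, about -0.0062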
Correlation
The correlation between random variables X and Y,
denoted as ρXY, is
cov 𝑋, π‘Œ
Οƒπ‘‹π‘Œ
πœŒπ‘‹π‘Œ =
=
πœŽπ‘‹ πœŽπ‘Œ
𝑉 𝑋 𝑉 π‘Œ
(5-15)
For any two random variables X and Y
βˆ’1 ≀ ρXY ≀ +1
(5-16)
If X and Y are independent random variables,
ΟƒXY = ρXY = 0
(5-17)
Covariance
Examples:
5-21/173 Covariance For the discrete random variables X
and Y with the joint distribution shown in Fig. 5-14,
determine σ_XY and ρ_XY.
Correlation
5-22/174 Correlation Suppose that the random variable X
has the following distribution: P(X = 1) = 0.2,
Examples
5-29/175 Determine the covariance and correlation for
the following joint probability distribution
x             2      1      2      4
y             3      4      5      6
fXY(x, y)     1/8    1/4    1/2    1/8
Examples
5-31/175 Determine the value for c and the covariance
and correlation for the joint probability mass function
π‘“π‘‹π‘Œ π‘₯, 𝑦 = 𝑐 π‘₯ + 𝑦 for x = 1, 2, 3 and y = 1, 2, 3.
5-37/175 Determine the covariance and correlation for
the joint probability density function π‘“π‘‹π‘Œ π‘₯, 𝑦 = 𝑒 βˆ’π‘₯βˆ’π‘¦
over the range 0 < x and 0 < y.
Examples
5-39/175 The joint probability distribution is
x             −1     0      0      1
y             0      −1     1      0
fXY(x, y)     1/4    1/4    1/4    1/4
Show that the correlation between X and Y is zero, but X
and Y are not independent.
Summary
• A joint probability mass function is a function used to calculate probabilities for two or more discrete random variables.
• A joint probability density function is a function used to calculate probabilities for two or more continuous random variables.
• A marginal probability mass function is the probability mass function of a discrete random variable obtained from the joint probability distribution of two or more random variables.
Summary
• A marginal probability density function is the probability density function of a continuous random variable obtained from the joint probability distribution of two or more random variables.
• A conditional probability mass function is the probability mass function of the conditional probability distribution of a discrete random variable.
• A conditional probability density function is the probability density function of the conditional probability distribution of a continuous random variable.
Summary
• The covariance is a measure of association between two random variables obtained as the expected value of the product of the two random variables around their means; that is, Cov(X, Y) = σ_XY = E[(X − μ_X)(Y − μ_Y)].
References
• Montgomery and Runger. Applied Statistics and Probability for Engineers, 5th ed. © 2011.
• Walpole, et al. Probability and Statistics for Engineers and Scientists, 9th ed. © 2012, 2007, 2002.