ECO 227Y1
Sample Examination Questions
7 December 2010
1. Below are the last three lines of Ozymandias, a famous short poem by Percy
Bysshe Shelley (1792–1822):
Nothing beside remains. Round the decay
Of that colossal wreck, boundless and bare
The lone and level sands stretch far away.
Pick a word at random from these lines. Find its expected length.
The last three lines contain 21 words in total, distributed as follows:

Word length:  2  3  4  5  6  7  8  9
Frequency:    1  5  4  5  1  3  1  1
The expected length of a word selected at random from the last three lines
of the poem is therefore

2·(1/21) + 3·(5/21) + 4·(4/21) + 5·(5/21) + 6·(1/21) + 7·(3/21) + 8·(1/21) + 9·(1/21)
    = (2 + 15 + 16 + 25 + 6 + 21 + 8 + 9)/21
    = 102/21
    ≈ 4.8571.
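The word count and the expectation can also be verified mechanically; the following is a minimal Python sketch (punctuation is simply stripped before splitting into words):

```python
from fractions import Fraction

# The last three lines of the poem, with punctuation removed.
text = ("Nothing beside remains Round the decay "
        "Of that colossal wreck boundless and bare "
        "The lone and level sands stretch far away")
words = text.split()

# Expected length of a word drawn uniformly at random.
expected_length = Fraction(sum(len(w) for w in words), len(words))
```

Here `expected_length` reduces to 34/7 = 102/21 ≈ 4.8571, matching the hand computation.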
2. Atoms of a radioactive substance disintegrate at random and emit α-particles.
Consider an experiment that consists of counting the number of α-particles
given off in a one-second time interval by one gram of radioactive material. Suppose that past experience leads one to expect that on average, 3.2
α-particles will be emitted during that interval. What is a good approximation to the probability that no more than two α-particles will be counted?
Think of the gram of radioactive material as consisting of a large number
n of atoms, each with probability 3.2/n of disintegrating and sending off an
α-particle during that one-second interval. The number N of α-particles
given off is accordingly a binomial random variable with n trials and success
probability 3.2/n, which by the Poisson Limit Theorem is approximately
Poisson with parameter λ = 3.2 when n is large.
The desired probability, therefore, is

Pr[N ≤ 2] = e^{−3.2} + 3.2 e^{−3.2} + ((3.2)²/2) e^{−3.2} ≈ .380.
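The Poisson Limit Theorem at work can be illustrated numerically: the binomial probability with large n and success probability 3.2/n is very close to the Poisson value (a sketch; the choice n = 100,000 is arbitrary):

```python
from math import comb, exp, factorial

def binom_cdf(k, n, p):
    """P[X <= k] for X ~ Bin(n, p), by direct summation of the pmf."""
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k + 1))

def poisson_cdf(k, lam):
    """P[N <= k] for N ~ Poisson(lam)."""
    return sum(exp(-lam) * lam**i / factorial(i) for i in range(k + 1))

approx = poisson_cdf(2, 3.2)                  # the Poisson approximation
exact = binom_cdf(2, 100_000, 3.2 / 100_000)  # nearly the same number
```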
3. A proofreader detects typographical errors while reading through a book
according to a Poisson process with a rate of one error per page. What is
the probability of the proofreader finding at least one error per page?
The number N of typographical errors on a single page of the book is a
Poisson random variable with parameter λ = 1. The probability of there
being at least one error is therefore
P[N ≥ 1] = 1 − P(N = 0) = 1 − e^{−1} ≈ .632.
4. Let X and Y be random variables such that the marginal density of X is
fX(x) = { (1/√(2x)) e^{−x/2} , x > 0
        { 0                  , x ≤ 0

and the conditional density of Y given X = x is

fY|X(y|x) = (1/√(2π)) x^{1/2} exp(−xy²/2)

for −∞ < y < ∞. Show that E[Y | X = x] = 0 (implying that E[E[Y|X]] =
0) but that E[Y] = ∞.
We have

E[Y | X = x] = ∫_{−∞}^{∞} y · (1/√(2π)) x^{1/2} exp(−xy²/2) dy
            = [ −(1/√(2πx)) exp(−xy²/2) ]_{y=−∞}^{y=∞}
            = 0.
The joint density of X and Y is
fXY(x, y) = fY|X(y|x) fX(x)
          = { (1/(2√π)) exp(−x(1 + y²)/2) , x > 0
            { 0                           , x ≤ 0.
Integrating out x produces the marginal density of Y :
fY(y) = ∫₀^∞ (1/(2√π)) exp(−x(1 + y²)/2) dx
      = [ −(1/((1 + y²)√π)) exp(−x(1 + y²)/2) ]_{x=0}^{x=∞}
      = 1/((1 + y²)√π)
for −∞ < y < ∞. Therefore
E[Y] = ∫_{−∞}^{∞} y/((1 + y²)√π) dy
     = [ (1/(2√π)) log(1 + y²) ]_{y=−∞}^{y=∞}
     = ∞.
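The divergence can also be seen numerically: the partial integrals of y·fY(y) over [0, M] grow like log(1 + M²)/(2√π) and never settle down. A midpoint-rule sketch (the step count is an arbitrary choice):

```python
from math import pi, sqrt, log

def f_Y(y):
    # Marginal density of Y obtained above.
    return 1.0 / (sqrt(pi) * (1.0 + y * y))

def partial_integral(M, steps=200_000):
    """Midpoint-rule approximation of the integral of y*f_Y(y) over [0, M]."""
    h = M / steps
    return sum((i + 0.5) * h * f_Y((i + 0.5) * h) for i in range(steps)) * h

# The partial integrals keep growing with M: no finite limit exists.
vals = [partial_integral(M) for M in (10, 100, 1000)]
```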
5. Suppose that Sn ∼ Bin(n, p). Find the mgf of Sn .
Sn is distributed as ∑_{i=1}^{n} Xi, where X1, ..., Xn are independent Bernoulli
random variables with success probability p. The mgf of each Xi is
MXi(t) = e^t p + 1 − p for i ∈ {1, ..., n}, and so

MSn(t) = E[exp(tSn)] = E[exp(t ∑_{i=1}^{n} Xi)],

which, by independence, gives

MSn(t) = ∏_{i=1}^{n} E[e^{tXi}] = (e^t p + 1 − p)^n.
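The closed form can be checked against the mgf computed directly from the binomial pmf (a sketch; the particular n, p, and t values are arbitrary choices):

```python
from math import comb, exp

def mgf_direct(t, n, p):
    """E[exp(t*Sn)] computed from the binomial pmf."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) * exp(t * k)
               for k in range(n + 1))

def mgf_closed_form(t, n, p):
    """(p*e^t + 1 - p)^n, the formula derived above."""
    return (p * exp(t) + 1 - p) ** n
```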
6. Let X and Y be jointly distributed with joint density
fXY(x, y) ≡ { k(x − y) , 0 ≤ y ≤ x ≤ 1
            { 0        , otherwise.
(a) What is k?
It’s usually best in questions of this sort to plot the region where the
joint density is positive. In this case, fXY > 0 only for (x, y) below
the 45-degree line in Figure 1:
Figure 1: the unit square in the (x, y)-plane, with the region 0 ≤ y ≤ x ≤ 1 lying below the line y = x.
Staring at Figure 1 gives the limits of integration in the following equation:
∫₀¹ ∫₀ˣ k(x − y) dy dx = 1.

But

∫₀¹ ∫₀ˣ k(x − y) dy dx = k ∫₀¹ [xy − y²/2]_{y=0}^{y=x} dx
                       = k ∫₀¹ (x²/2) dx
                       = (k/6) x³ |₀¹
                       = k/6,

so k = 6.
(b) Find the marginal densities of X and Y .
For x ∈ [0, 1], the marginal density of X is given by

fX(x) = ∫₀ˣ 6(x − y) dy = 6 [xy − y²/2]_{y=0}^{y=x} = 3x².

Therefore

fX(x) ≡ { 3x² , 0 ≤ x ≤ 1
        { 0   , otherwise.

For y ∈ [0, 1], the marginal density of Y is given by

fY(y) = ∫_y¹ 6(x − y) dx
      = 6 [x²/2 − yx]_{x=y}^{x=1}
      = 6 (1/2 − y − y²/2 + y²)
      = 3y² − 6y + 3.

Therefore

fY(y) ≡ { 3y² − 6y + 3 , y ∈ [0, 1]
        { 0            , otherwise.
(c) Find the conditional densities of Y given X and of X given Y .
Remember the rule about dividing the joint by the marginal densities.
The conditional densities of Y given X and of X given Y are accordingly given by

fY|X(y|x) ≡ { 2(x − y)/x² , y ∈ [0, x]
            { 0           , otherwise

and

fX|Y(x|y) ≡ { 2(x − y)/(y² − 2y + 1) , x ∈ [y, 1]
            { 0                      , otherwise,

respectively (note that y² − 2y + 1 = (1 − y)²).
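All three parts can be checked with a crude midpoint-rule integration over the unit square (a sketch; the grid size N is an arbitrary choice):

```python
# Midpoint-rule checks for question 6.
N = 1000
h = 1.0 / N
mid = [(i + 0.5) * h for i in range(N)]

def f_joint(x, y):
    # Joint density with k = 6, supported on 0 <= y <= x <= 1.
    return 6.0 * (x - y) if y <= x else 0.0

# Total mass should be 1 when k = 6.
mass = sum(f_joint(x, y) for x in mid for y in mid) * h * h

# Marginal of X at x = .5 should be 3(.5)^2 = .75.
fx_half = sum(f_joint(0.5, y) for y in mid) * h

# Conditional density of Y given X = .5 should integrate to 1.
cond_mass = sum(2 * (0.5 - y) / 0.5**2 for y in mid if y <= 0.5) * h
```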
7. Let T be an exponential random variable with parameter λ. Let X be a
discrete random variable defined as X ≡ k if k ≤ T < k + 1 for k ∈
{0, 1, . . .}. Find the density of X.
Note that X = k if T ∈ [k, k + 1) for k = 0, 1, .... Then

P(X = k) = P(k ≤ T < k + 1)
         = [1 − e^{−λt}]_{t=k}^{t=k+1}
         = (1 − e^{−λ(k+1)}) − (1 − e^{−λk})
         = e^{−λk} − e^{−λ(k+1)}
         = e^{−λk}(1 − e^{−λ}).
This indicates that X is a geometric random variable with parameter 1−e−λ .
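The geometric claim can be sanity-checked in two ways: the proposed pmf sums to (essentially) one, and flooring simulated exponential draws reproduces P(X = 0) = 1 − e^{−λ} (a sketch; λ = 0.7, the seed, and the sample size are arbitrary choices):

```python
from math import exp
import random

lam = 0.7  # arbitrary choice of rate

# The proposed pmf e^{-lam*k} * (1 - e^{-lam}) sums to one (geometric series).
total = sum(exp(-lam * k) * (1 - exp(-lam)) for k in range(200))

# Simulation sketch: X = floor(T) for exponential draws T.
random.seed(0)
draws = [int(random.expovariate(lam)) for _ in range(100_000)]
p0_hat = draws.count(0) / len(draws)  # estimates P(X = 0) = 1 - e^{-lam}
```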
8. Suppose that telephone calls are received at your residence in accordance
with a Poisson process having intensity λ = 2 per hour.
(a) What is the probability of having the phone ring at least once during
any 10-minute interval?
The number of telephone calls at the residence during a 10-minute
(i.e., 1/6-hour) interval is distributed as Poisson(2 × 1/6), i.e., Po(1/3).
As such, the probability that the phone rings at least once during any
10-minute interval is

1 − P(no calls in 10-minute interval) = 1 − (1/3)⁰ exp(−1/3)/0!
                                      = 1 − exp(−1/3)
                                      ≈ .2835.
(b) Suppose you need to step out of the house temporarily. For how long
can you do so and still keep the probability of missing at least one
phone call during that period to something no greater than 50%?
The distribution of the number of telephone calls missed during an
absence from the house lasting t hours is Po(2t). As such,

P(at least one phone call in interval of length t) ≤ 1/2
⇔ 1 − P(no phone calls in interval of length t) ≤ 1/2
⇔ P(no phone calls in interval of length t) ≥ 1/2
⇔ (2t)⁰ exp(−2t)/0! ≥ 1/2
⇔ e^{−2t} ≥ 1/2
⇔ −2t ≥ −log(2)
⇔ t ≤ (1/2) log(2)
⇔ t ≤ .3466.

As such, you are permitted to step out of the house for at most .3466
hours, i.e., about 20.8 minutes, if you want the probability of missing a
phone call during that time to be no greater than .5.
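Both numbers can be reproduced in a couple of lines:

```python
from math import exp, log

# Part (a): at least one call (rate 2 per hour) in a 10-minute interval.
p_ring = 1 - exp(-2 / 6)

# Part (b): longest absence t (in hours) keeping P(miss >= 1 call) <= 1/2.
t_max = log(2) / 2
minutes = 60 * t_max
```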
9. Let X be a continuous random variable with density function
fX(x) ≡ { 2x , 0 ≤ x ≤ 1
        { 0  , otherwise.
(a) Find E [X].
E[X] = ∫₀¹ x · 2x dx = 2 ∫₀¹ x² dx = 2/3.
(b) Find E[X²] and Var[X].

E[X²] = ∫₀¹ x² · 2x dx = 2 ∫₀¹ x³ dx = 2/4 = 1/2.

Therefore

Var[X] = E[X²] − (E[X])² = 1/2 − 4/9 = 1/18.
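Both moments can be confirmed by a midpoint-rule integration (a sketch; the step count is an arbitrary choice):

```python
# Midpoint-rule check of E[X] = 2/3 and Var[X] = 1/18 for the density 2x.
N = 100_000
h = 1.0 / N
xs = [(i + 0.5) * h for i in range(N)]

mean = sum(x * 2 * x for x in xs) * h        # integral of x * 2x over [0, 1]
second = sum(x * x * 2 * x for x in xs) * h  # integral of x^2 * 2x over [0, 1]
var = second - mean * mean
```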
10. Let N1 and N2 be independent random variables following Poisson distributions with parameters λ1 and λ2 , respectively. Show that the distribution
of N ≡ N1 + N2 is Poisson with parameter λ1 + λ2 .
For any nonnegative integer n, we want to show that

P[N = n] = e^{−λ1−λ2}(λ1 + λ2)ⁿ/n!.

Note that if N = n, then there must be a number n1 ∈ {0, 1, ..., n} such
that N1 = n1 and N2 = n − n1. By independence of N1 and N2, we also
have that

P[N1 = n1, N2 = n − n1] = P[N1 = n1] P[N2 = n − n1].

Finally, by virtue of the fact that the values of a binomial(n, p) density sum
to one, we have

∑_{n1=0}^{n} C(n, n1) (λ1/(λ1 + λ2))^{n1} (1 − λ1/(λ1 + λ2))^{n−n1} = 1.
Putting these facts together, we have

P[N = n] = ∑_{n1=0}^{n} P[N1 = n1, N2 = n − n1]
         = ∑_{n1=0}^{n} P[N1 = n1] P[N2 = n − n1]
         = ∑_{n1=0}^{n} (e^{−λ1} λ1^{n1}/n1!) · (e^{−λ2} λ2^{n−n1}/(n − n1)!)
         = (e^{−λ1−λ2}(λ1 + λ2)ⁿ/n!) ∑_{n1=0}^{n} (n!/(n1!(n − n1)!)) (λ1/(λ1 + λ2))^{n1} (λ2/(λ1 + λ2))^{n−n1}
         = (e^{−λ1−λ2}(λ1 + λ2)ⁿ/n!) ∑_{n1=0}^{n} C(n, n1) (λ1/(λ1 + λ2))^{n1} (1 − λ1/(λ1 + λ2))^{n−n1}
         = e^{−λ1−λ2}(λ1 + λ2)ⁿ/n!,

which shows that N indeed follows a Poisson(λ1 + λ2) distribution.
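The convolution argument is easy to verify term by term (a sketch; λ1 = 1.3 and λ2 = 2.4 are arbitrary choices):

```python
from math import exp, factorial

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

def convolution_pmf(n, lam1, lam2):
    """P[N1 + N2 = n], summing over the ways to split n between N1 and N2."""
    return sum(poisson_pmf(n1, lam1) * poisson_pmf(n - n1, lam2)
               for n1 in range(n + 1))
```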
11. Suppose that X has a density function given by
fX(x) ≡ { cx² , x ∈ [0, 1]
        { 0   , otherwise.
(a) Find c.
Since ∫₀¹ fX(x) dx = 1, it must be the case that c = 3.
(b) Find the distribution function.
We have

FX(x) ≡ { 0               , x < 0
        { ∫₀ˣ 3t² dt = x³ , 0 ≤ x < 1
        { 1               , x ≥ 1.
(c) Compute P (.1 ≤ X < .5).
We have

P(.1 ≤ X < .5) = F(.5) − F(.1) = (.5)³ − (.1)³ = .124.
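A one-line check of parts (b) and (c):

```python
def F(x):
    """Distribution function of the density 3x^2 on [0, 1]."""
    if x < 0:
        return 0.0
    return x ** 3 if x < 1 else 1.0

prob = F(0.5) - F(0.1)   # P(.1 <= X < .5)
```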
12. Suppose that X and Y are jointly distributed with density function
fXY(x, y) ≡ { (6/7)(x + y)² , x ∈ [0, 1], y ∈ [0, 1]
            { 0             , otherwise.
(a) Integrate over the appropriate regions to find the following:
i. P (X > Y ).
P(X > Y) = ∫₀¹ ∫₀ˣ (6/7)(x + y)² dy dx
         = (6/7) ∫₀¹ [(x + y)³/3]_{y=0}^{y=x} dx
         = (2/7) ∫₀¹ (8x³ − x³) dx
         = 2 ∫₀¹ x³ dx
         = 2 · x⁴/4 |₀¹
         = 1/2.
ii. P (X + Y ≤ 1).
P(X + Y ≤ 1) = P(Y ≤ 1 − X)
             = ∫₀¹ ∫₀^{1−x} (6/7)(x + y)² dy dx
             = (6/7) ∫₀¹ [(x + y)³/3]_{y=0}^{y=1−x} dx
             = (2/7) ∫₀¹ (1 − x³) dx
             = (2/7)(1 − 1/4)
             = (2/7) × (3/4)
             = 3/14.
iii. P(X ≤ 1/2).

P(X ≤ 1/2) = ∫₀^{1/2} ∫₀¹ (6/7)(x + y)² dy dx
           = (6/7) ∫₀^{1/2} [(x + y)³/3]_{y=0}^{y=1} dx
           = (2/7) ∫₀^{1/2} [(x + 1)³ − x³] dx
           = (2/7) [(x + 1)⁴/4 − x⁴/4]₀^{1/2}
           = (1/14)(81/16 − 1/16 − 1)
           = (1/14) × (64/16)
           = 2/7.
(b) Find the marginal densities of X and Y .
For x ∈ [0, 1], we have

fX(x) = ∫₀¹ (6/7)(x + y)² dy
      = (6/7) · (1/3) (x + y)³ |_{y=0}^{y=1}
      = (2/7)[(x + 1)³ − x³]
      = (2/7)[(x + 1)(x² + 2x + 1) − x³]
      = (2/7)[x³ + 2x² + x + x² + 2x + 1 − x³]
      = (2/7)[3x² + 3x + 1]
      = (6/7)x² + (6/7)x + 2/7,

so

fX(x) = { (6/7)x² + (6/7)x + 2/7 , x ∈ [0, 1]
        { 0                      , otherwise.

By symmetry, we have a virtually identical expression for the marginal
density of Y:

fY(y) = { (6/7)y² + (6/7)y + 2/7 , y ∈ [0, 1]
        { 0                      , otherwise.
(c) Find the conditional densities of X given Y and of Y given X.
Remember the bit about dividing joint by marginal densities in order
to get conditional densities. As such, the conditional density of X
given Y is simply

fX|Y(x|y) ≡ { 6(x + y)²/(6y² + 6y + 2) , x, y ∈ [0, 1]
            { 0                        , otherwise,

while that of Y given X has the same form:

fY|X(y|x) ≡ { 6(x + y)²/(6x² + 6x + 2) , x, y ∈ [0, 1]
            { 0                        , otherwise.
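The normalization, part (a)(i), and the marginal formula can all be checked numerically on a grid (a sketch; the grid size is an arbitrary choice):

```python
# Midpoint-rule checks for question 12.
N = 1000
h = 1.0 / N
mid = [(i + 0.5) * h for i in range(N)]

def f(x, y):
    return 6.0 / 7.0 * (x + y) ** 2

mass = sum(f(x, y) for x in mid for y in mid) * h * h      # should be 1
p_x_gt_y = sum(f(x, y) for x in mid for y in mid if x > y) * h * h
fx_half = sum(f(0.5, y) for y in mid) * h  # (6/7)(.25) + (6/7)(.5) + 2/7
```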
13. Consider a Poisson process on the real line, and denote by N (t1 , t2 ) the
number of events occurring in the interval (t1 , t2 ). If t0 < t1 < t2 , find the
conditional density of N (t0 , t1 ) given that N (t0 , t2 ) = n.
Suppose the Poisson process has intensity λ, so for t0 < t1 < t2,

N(t0, t1) ∼ Po(λ(t1 − t0))
N(t1, t2) ∼ Po(λ(t2 − t1))
N(t0, t2) ∼ Po(λ(t2 − t0)).

In particular, we have that

P[N(t0, t2) = n] = e^{−λ(t2−t0)}(λ(t2 − t0))ⁿ/n!.
In addition, N(t0, t1) is independent of N(t1, t2) by definition, so for any
n1 ∈ {0, ..., n},

P[N(t0, t1) = n1, N(t0, t2) = n] = P[N(t0, t1) = n1, N(t1, t2) = n − n1]
= (e^{−λ(t1−t0)}(λ(t1 − t0))^{n1}/n1!) · (e^{−λ(t2−t1)}(λ(t2 − t1))^{n−n1}/(n − n1)!).
It follows that for n1 ∈ {0, ..., n},

P[N(t0, t1) = n1 | N(t0, t2) = n] ≡ P[N(t0, t1) = n1, N(t0, t2) = n] / P[N(t0, t2) = n]
= (n!/(n1!(n − n1)!)) (t1 − t0)^{n1} (t2 − t1)^{n−n1} (t2 − t0)^{−n}
= C(n, n1) (t1 − t0)^{n1} (t2 − t1)^{n−n1} (t2 − t0)^{−n1−(n−n1)}
= C(n, n1) ((t1 − t0)/(t2 − t0))^{n1} ((t2 − t1)/(t2 − t0))^{n−n1}
= C(n, n1) ((t1 − t0)/(t2 − t0))^{n1} (1 − (t1 − t0)/(t2 − t0))^{n−n1},

i.e., a binomial(n, (t1 − t0)/(t2 − t0)) density.
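The binomial conclusion, and the fact that it does not depend on λ, can be verified numerically (a sketch; λ = 0.9, t0 = 1, t1 = 2.5, t2 = 5, and n = 7 are arbitrary choices):

```python
from math import comb, exp, factorial

def poisson_pmf(k, lam):
    return exp(-lam) * lam**k / factorial(k)

def conditional_pmf(n1, n, lam, t0, t1, t2):
    """P[N(t0,t1) = n1 | N(t0,t2) = n] from the joint-over-marginal ratio."""
    joint = (poisson_pmf(n1, lam * (t1 - t0))
             * poisson_pmf(n - n1, lam * (t2 - t1)))
    return joint / poisson_pmf(n, lam * (t2 - t0))

def binom_pmf(n1, n, p):
    return comb(n, n1) * p**n1 * (1 - p)**(n - n1)
```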
14. Let X be a continuous random variable with density function
fX(x) ≡ { 2x , 0 ≤ x ≤ 1
        { 0  , otherwise.
(a) Find E [X].
E[X] = ∫₀¹ x · 2x dx = 2 ∫₀¹ x² dx = 2/3.
(b) Find E[X²] and Var[X].

E[X²] = ∫₀¹ x² · 2x dx = 2 ∫₀¹ x³ dx = 2/4 = 1/2.

Therefore

Var[X] = E[X²] − (E[X])² = 1/2 − 4/9 = 1/18.
15. Let U1 and U2 be independently distributed and uniform on [0, 1]. Find and
sketch the density function of the random variable S, where S ≡ U1 + U2 .
First of all, note that the joint density of U1 and U2 is very easy to derive,
since U1 and U2 are independent. In particular,
fU1U2(u1, u2) ≡ { 1 , u1, u2 ∈ [0, 1]
                { 0 , otherwise,
which means that the joint distribution of U1 and U2 is evenly spread out
over the unit square. Permissible values of S can be represented as the
vertical intercepts of lines having slope −1 in Figure 2:
Figure 2: the unit square in the (u1, u2)-plane, with the line u1 + u2 = 1 drawn.
It is clear that S takes values in the interval [0, 2]. Given that (U1 , U2 ) only
takes values in the unit square, we need to be careful about limits of integration
when computing probabilities involving S. In particular, for s ∈ [0, 1], we have

P[S ≤ s] = ∫₀ˢ ∫₀^{s−u1} fU1U2(u1, u2) du2 du1
         = ∫₀ˢ ∫₀^{s−u1} du2 du1
         = ∫₀ˢ (s − u1) du1
         = [su1 − u1²/2]₀ˢ
         = s²/2.
On the other hand, for s ∈ (1, 2], we have

P[S > s] = ∫_{s−1}¹ ∫_{s−u1}¹ du2 du1
         = ∫_{s−1}¹ (1 − s + u1) du1
         = [(1 − s)u1 + u1²/2]_{s−1}¹
         = 1 − s + 1/2 + (s − 1)² − (s − 1)²/2
         = 3/2 − s + (s − 1)²/2
         = s²/2 − 2s + 2.
It follows that the distribution function FS(s) ≡ P[S ≤ s] of S is given by

FS(s) ≡ { 0             , s < 0
        { s²/2          , s ∈ [0, 1]
        { 2s − s²/2 − 1 , s ∈ (1, 2]
        { 1             , s > 2.
The density of S is obtained by differentiating each piece of FS, i.e.,

fS(s) ≡ { s     , s ∈ [0, 1]
        { 2 − s , s ∈ (1, 2]
        { 0     , otherwise.
(Plot the density function fS (s). For obvious reasons, the random variable
S in this case is often said to have a triangular distribution.)
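The triangular cdf can also be checked against simulation (a sketch; the seed and sample size are arbitrary choices):

```python
import random

def F_S(s):
    """Distribution function of S = U1 + U2 derived above."""
    if s < 0:
        return 0.0
    if s <= 1:
        return s * s / 2
    if s <= 2:
        return 2 * s - s * s / 2 - 1
    return 1.0

# Simulation sketch: empirical cdf of U1 + U2 at a few points.
random.seed(1)
samples = [random.random() + random.random() for _ in range(200_000)]
emp = {s: sum(v <= s for v in samples) / len(samples) for s in (0.5, 1.0, 1.5)}
```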
16. A study conducted in 1982¹ involved one hundred physicians being presented with the following information:
(a) In the absence of any special information, the probability that a woman
has breast cancer is 1%.
(b) If the patient has breast cancer, the probability that the radiologist
conducting mammogram screening will perform a correct diagnosis is
80%.
(c) If the patient has a benign lesion (i.e., she doesn’t have breast cancer),
the probability that the same radiologist will perform an incorrect diagnosis is 10%.
Each physician was then asked the following question: “What is the probability that a patient with a positive mammogram actually has breast cancer?” Ninety-five of the 100 physicians involved in the study estimated this
probability to be about 75%. Is this the correct answer? Provide justification.
Let B denote the event that a randomly selected woman who is also undergoing mammogram screening actually has breast cancer. Let D denote the
event that the same randomly selected woman is diagnosed with breast cancer, i.e., the event of a positive mammogram. We are given the following:
P[B] = .01
P[D | B] = .80
P[D | Bᶜ] = .10.
Applying Bayes' rule, we have that the conditional probability of a randomly
selected woman actually having breast cancer given a positive diagnosis is

P[B | D] = P[D | B] P[B] / (P[D | B] P[B] + P[D | Bᶜ] P[Bᶜ])
         = (.80 × .01)/(.80 × .01 + .10 × .99)
         ≈ .0748.

It follows that the estimate of 75% is in fact far from being correct.

¹ Eddy, D. M. (1982). “Probabilistic reasoning in clinical medicine: Problems and opportunities”. In Judgment under Uncertainty: Heuristics and Biases, 249–267. D. Kahneman, P. Slovic and A. Tversky (eds.). Cambridge University Press.
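The arithmetic is a direct Bayes'-rule computation:

```python
# Direct Bayes'-rule computation for the mammogram example.
p_B = 0.01           # prior probability of breast cancer
p_D_given_B = 0.80   # correct (positive) diagnosis given cancer
p_D_given_Bc = 0.10  # incorrect (positive) diagnosis given no cancer

p_D = p_D_given_B * p_B + p_D_given_Bc * (1 - p_B)
p_B_given_D = p_D_given_B * p_B / p_D
```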
17. Suppose that X is a random variable with moment-generating function
mX (t).
(a) What is mX (0)?
We have mX(0) = E[e^{X·0}] = E[1] = 1.
(b) If Y = 3X, show that the moment-generating function of Y is mX (3t).
We have mY(t) = E[e^{Yt}] = E[e^{3Xt}] = E[e^{X·3t}] = mX(3t).
(c) Use the moment-generating function of Y to show that E [Y ] = 3E [X]
and V ar [Y ] = 9V ar [X].
Recall that E[Y] = m′Y(0). Since mY(t) = mX(3t), we have from
the chain rule for differentiation that

m′Y(t) = 3 m′X(3t).    (1)

Therefore E[Y] = m′Y(0) = 3 m′X(0) = 3 E[X].

Now consider E[Y²] = m″Y(0). In this connection we have from (1)
that m″Y(t) = 9 m″X(3t). It follows that E[Y²] = m″Y(0) = 9 m″X(0) =
9 E[X²]. Therefore

Var[Y] = E[Y²] − (E[Y])²
       = 9 E[X²] − (3 E[X])²
       = 9{E[X²] − (E[X])²}
       = 9 Var[X].
18. During the holiday rush, customers arrive at the meat counter of Pusateri’s
Fine Foods in accordance with a Poisson process with rate λ = 48 per
hour.
(a) During a given interval of five minutes, what is the probability that
more than 2 customers arrive at the meat counter?
Let X denote the number of customers arriving during a five-minute
interval. Five minutes is one-twelfth of an hour, so it follows that X is
a Poisson(48/12) random variable, i.e., X ∼ Po(4). As such,

P[X > 2] = 1 − P[X ≤ 2]
         = 1 − ∑_{x=0}^{2} e^{−4} 4ˣ/x!
         ≈ .7619.
(b) Let X be the number of customers arriving during a five-minute interval. If it takes approximately 2 minutes to serve each customer, then
the total time (in minutes) spent serving customers arriving during a
five-minute interval is 2X. Find the mean and variance of the total
time in minutes spent serving customers arriving during a five-minute
period. Assume that a sufficient number of employees are available so
that no customer must wait for service.
Since X ∼ P o(4), we have that E [X] = V ar [X] = 4. Set Y = 2X.
Then E [Y ] = 2E [X] = 8, while V ar [Y ] = 4V ar [X] = 16.
(c) What is the probability that the total time spent serving customers arriving during a five-minute period exceeds 4 minutes?
We have P[Y > 4] = P[X > 2] ≈ .7619, as was shown above in part (a).
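Parts (a) and (b) numerically:

```python
from math import exp, factorial

def poisson_sf(k, lam):
    """P[X > k] for X ~ Poisson(lam)."""
    return 1.0 - sum(exp(-lam) * lam**i / factorial(i) for i in range(k + 1))

p_more_than_2 = poisson_sf(2, 4.0)   # part (a)

# Part (b): Y = 2X with X ~ Po(4), so E[Y] = 2*4 and Var[Y] = 4*4.
mean_Y, var_Y = 2 * 4, 4 * 4
```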
19. The length of time (in hours) taken by students to complete a three-hour
exam is a random variable X with density function given by
fX(x) = { cx² + x/9 , 0 ≤ x ≤ 3
        { 0         , otherwise.
(a) Find c.
We have ∫_{−∞}^{∞} fX(x) dx = ∫₀³ fX(x) dx = 1, so

c ∫₀³ x² dx + (1/9) ∫₀³ x dx = 1.    (2)

Solving (2) yields c = 1/18.
(b) What is the probability that a randomly selected student will finish in
less than an hour?
Using the solution to part (a), we have

P[X < 1] = ∫₀¹ (x²/18 + x/9) dx
         = 1/54 + 1/18
         = 4/54
         = 2/27
         ≈ .0741.
(c) Suppose that a particular student requires at least 45 minutes to finish
the exam. Given that this is the case, find the probability that this
student will take at least 90 minutes to finish.
The conditional probability that the student in question takes at least
90 minutes (3/2 hours) to finish given that he or she needs at least 45
minutes (3/4 hours) to complete the exam is given by

P[X ≥ 3/2 | X ≥ 3/4] = P[X ≥ 3/2]/P[X ≥ 3/4]
                     = .8125/.9609375
                     ≈ .8455.
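All three parts follow from the cdf F(x) = x³/54 + x²/18 on [0, 3]:

```python
def F(x):
    """Distribution function of the density x^2/18 + x/9 on [0, 3]."""
    if x < 0:
        return 0.0
    if x > 3:
        return 1.0
    return x ** 3 / 54 + x ** 2 / 18

p_under_1h = F(1)                      # part (b)
p_cond = (1 - F(1.5)) / (1 - F(0.75))  # part (c)
```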
20. Suppose X is a continuous random variable with density given by
fX(x) = { 4x² e^{−2x} , x > 0
        { 0           , otherwise.
Find the mean and variance of X.
The density function indicates that X is a gamma(3, 1/2) random variable
(shape 3, scale 1/2); it follows that E[X] = 3 · (1/2) = 1.5, and
Var[X] = 3 · (1/2)² = .75.
Alternatively, one can compute the following:

E[X] = ∫_{−∞}^{∞} x fX(x) dx
     = ∫₀^∞ 4x³ e^{−2x} dx
     = (1/4) ∫₀^∞ u³ e^{−u} du   (substituting u = 2x)
     = (1/4) Γ(4)
     = (1/4) · 3!
     = 3/2;

while

E[X²] = ∫₀^∞ 4x⁴ e^{−2x} dx
      = (1/8) ∫₀^∞ u⁴ e^{−u} du
      = (1/8) Γ(5)
      = (1/8) · 4!
      = 3.

It follows that

Var[X] = E[X²] − (E[X])² = 3 − 9/4 = 3/4.
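The gamma moments can be confirmed by numerical integration (a sketch; the truncation point and step count are arbitrary choices):

```python
from math import exp

# Midpoint-rule moments of the density 4x^2 e^{-2x}, truncated at x = 40
# (the tail beyond 40 is numerically negligible).
N = 400_000
h = 40.0 / N
xs = [(i + 0.5) * h for i in range(N)]

def f(x):
    return 4 * x * x * exp(-2 * x)

mean = sum(x * f(x) for x in xs) * h
second = sum(x * x * f(x) for x in xs) * h
var = second - mean * mean
```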
21. Pick an individual consumer at random. Suppose that this representative
consumer’s weekly consumption of two goods is adequately modelled by
the behaviour of the ordered pair (X1 , X2 ), where X1 and X2 are jointly
continuous random variables with joint density function given by
fX1X2(x1, x2) = { 6x1² x2 , 0 ≤ x1 ≤ x2, x1 + x2 ≤ 2
               { 0       , otherwise.
(a) What is the probability that X1 + X2 is less than 1?
We have

P[X1 + X2 < 1] = P[0 < X1 < 1/2, X1 ≤ X2 < 1 − X1]
              = ∫₀^{1/2} ∫_{x1}^{1−x1} 6x1² x2 dx2 dx1
              = 3 ∫₀^{1/2} x1²((1 − x1)² − x1²) dx1
              = 3 ∫₀^{1/2} (x1² − 2x1³) dx1
              = 3 (1/24 − 1/32)
              = 1/32
              ≈ .0312.
(b) Derive the marginal density of X1 .
We have

fX1(x1) = ∫_{x1}^{2−x1} 6x1² x2 dx2
        = 3x1²((2 − x1)² − x1²)
        = 12x1²(1 − x1)

when x1 ∈ [0, 1]; otherwise fX1(x1) = 0. Note that the marginal
density of X1 is beta(3, 2).
(c) Derive the conditional density of X2 given X1 = x1 .
Dividing the joint density of X1 and X2 by the marginal density of X1
yields the relevant conditional density. We have

6x1² x2 / (12x1²(1 − x1)) = (1/2) x2 (1 − x1)^{−1}

when 0 ≤ x1 ≤ x2 ≤ 2 − x1, so it follows that the conditional density
of X2 given X1 = x1 is

fX2|X1(x2|x1) = { (1/2) x2 (1 − x1)^{−1} , 0 ≤ x1 ≤ x2 ≤ 2 − x1
               { 0                       , otherwise.
(d) Derive the marginal density of X2 .
When x2 ∈ [0, 1], we have

fX2(x2) = ∫₀^{x2} 6x1² x2 dx1 = 2x2⁴.

When x2 ∈ (1, 2], we have

fX2(x2) = ∫₀^{2−x2} 6x1² x2 dx1 = 2x2(2 − x2)³.

Summarizing, the marginal density of X2 is given by

fX2(x2) = { 2x2⁴         , 0 ≤ x2 ≤ 1
          { 2x2(2 − x2)³ , 1 < x2 ≤ 2
          { 0            , otherwise.
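As a check, a midpoint-rule integration over the support (a sketch; the grid size is an arbitrary choice) confirms that the density integrates to 1, that P[X1 + X2 < 1] = 1/32 ≈ .0312, and that fX1(.5) = 12(.25)(.5) = 1.5:

```python
# Midpoint-rule checks on the region 0 <= x1 <= x2, x1 + x2 <= 2.
N = 1000
h = 2.0 / N
mid = [(i + 0.5) * h for i in range(N)]

def f(x1, x2):
    return 6 * x1 * x1 * x2 if (x1 <= x2 and x1 + x2 <= 2) else 0.0

mass = sum(f(a, b) for a in mid for b in mid) * h * h  # should be 1
p_sum_lt_1 = sum(f(a, b) for a in mid for b in mid if a + b < 1) * h * h
fx1_half = sum(f(0.5, b) for b in mid) * h             # marginal at x1 = .5
```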