
Theory of Probability: Recitation 12 (May 1)
1. Reviews
(1) Change of Variables
Suppose that we know the joint PDF f_{X,Y}(x, y) of X, Y. If two RVs U and V are obtained from
X and Y by

U = F(X, Y),   V = G(X, Y)     (0.1)

for some functions F and G, then we can compute the joint PDF f_{U,V}(u, v) of U, V by using the
Jacobian. There are many equivalent ways to do this, and here we suggest a clear procedure.
1. Regard the relations (0.1) as a system of equations with unknowns X and Y and solve it. Then
we can express X and Y in terms of U and V, namely,

X = P(U, V),   Y = Q(U, V).     (0.2)

2. Compute the Jacobian

J = \det \begin{pmatrix} F_x(X, Y) & F_y(X, Y) \\ G_x(X, Y) & G_y(X, Y) \end{pmatrix}.

Be cautious about the presence of the absolute value. The Jacobian J is expressed in terms of
X and Y, but we can use (0.2) to rewrite J in terms of U and V. Let us denote this result by
J(U, V).
3. The joint PDF of U and V is given by

f_{U,V}(u, v) = f_{X,Y}(P(u, v), Q(u, v)) \times \frac{1}{|J(u, v)|}.
We will follow this procedure (either explicitly or implicitly) in the remaining part. Hence, the
reader should study this procedure with the specific problems below.
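As a quick numerical sanity check of this procedure (my addition, not part of the original notes), the sketch below takes X, Y as independent standard normals and U = X + Y, V = X - Y; the procedure gives f_{U,V}(u, v) = \frac{1}{4\pi} e^{-(u^2+v^2)/4}, which can be compared against simulation:

import numpy as np

# Sanity check of the Jacobian procedure for U = X + Y, V = X - Y with X, Y iid N(0,1).
# Solving gives X = (U+V)/2, Y = (U-V)/2 and |J| = 2, hence
# f_{U,V}(u,v) = (1/(4*pi)) * exp(-(u**2 + v**2)/4).
rng = np.random.default_rng(0)
x, y = rng.standard_normal((2, 1_000_000))
u, v = x + y, x - y

mc = np.mean((u > 0) & (u < 1) & (v > 0) & (v < 1))    # Monte Carlo estimate of P(0<U<1, 0<V<1)

d = 1 / 400                                            # grid integral of the derived density
g = np.arange(d / 2, 1, d)
uu, vv = np.meshgrid(g, g)
grid = np.sum(np.exp(-(uu**2 + vv**2) / 4) / (4 * np.pi)) * d * d

print(mc, grid)                                        # the two values should agree closely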
(2) Order Statistics
Let X_1, X_2, \cdots, X_n be independent RVs with common PDF f_X(x) and CDF F_X(x). Let X_{(k)}
denote the kth smallest value among \{X_1, X_2, \cdots, X_n\}. Then the PDF of X_{(k)} is given by
f_{X_{(k)}}(x) = \frac{n!}{(k-1)!\,(n-k)!} F_X(x)^{k-1} (1 - F_X(x))^{n-k} f_X(x).     (0.3)
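As an illustration (my addition), formula (0.3) is easy to check by simulation; for instance, with n = 5 Uniform(0, 1) samples and k = 3, (0.3) reduces to the Beta(3, 3) density 30x^2(1 - x)^2 on (0, 1):

import numpy as np

# Empirical check of (0.3) for n = 5, k = 3 with Uniform(0,1) samples,
# where the formula reduces to f_{X_(3)}(x) = 30 * x**2 * (1 - x)**2 on (0, 1).
rng = np.random.default_rng(1)
medians = np.sort(rng.uniform(size=(200_000, 5)), axis=1)[:, 2]   # X_(3), the 3rd smallest of 5

x = np.linspace(0.0005, 0.9995, 1000)
pdf = 30 * x**2 * (1 - x) ** 2
dx = x[1] - x[0]
print(pdf.sum() * dx)                          # ≈ 1, so the formula is a valid density
print(medians.mean(), (x * pdf).sum() * dx)    # empirical vs. theoretical mean, both ≈ 0.5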
2. Solutions for Assignment 11
1.
We first express Z_1 and Z_2 in terms of X and Y. This is easy:

Z_1 = X,   Z_2 = Y - X.

Then, we compute the Jacobian as

J = \det \begin{pmatrix} 1 & 0 \\ 1 & 1 \end{pmatrix} = 1.

Finally,

f_{X,Y}(x, y) = f_{Z_1,Z_2}(x, y - x) \times \frac{1}{J} = \frac{1}{2\pi} e^{-\frac{x^2 + (y-x)^2}{2}}

for all x, y ∈ R.
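As a side check (my addition; it assumes, as the final formula suggests, that Z_1 and Z_2 are independent standard normal RVs with X = Z_1 and Y = Z_1 + Z_2), the derived joint density can be verified numerically:

import numpy as np

# Assumes Z1, Z2 iid N(0,1) with X = Z1, Y = Z1 + Z2, and checks the derived
# joint density (1/(2*pi)) * exp(-(x**2 + (y - x)**2)/2) on the box (0,1)x(0,1).
rng = np.random.default_rng(2)
z1, z2 = rng.standard_normal((2, 1_000_000))
x, y = z1, z1 + z2
mc = np.mean((x > 0) & (x < 1) & (y > 0) & (y < 1))    # Monte Carlo box probability

d = 1 / 400
g = np.arange(d / 2, 1, d)
xx, yy = np.meshgrid(g, g)
grid = np.sum(np.exp(-(xx**2 + (yy - xx) ** 2) / 2) / (2 * np.pi)) * d * d
print(mc, grid)                                        # the two values should agree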
2.
We will consider the general case of n bulbs, each with expected lifetime 1/λ, i.e., the lifetime of
each bulb is an Exponential(λ) RV. Then, we have n independent Exponential(λ) RVs X_1, X_2, \cdots, X_n
and have to compute the expectation of X_{(1)}. Let us first consider the PDF of X_{(1)}. Since
f_X(x) = λe^{-λx} and F_X(x) = 1 - e^{-λx}, we can compute f_{X_{(1)}}(x), the PDF of X_{(1)}, by

f_{X_{(1)}}(x) = \frac{n!}{(1-1)!\,(n-1)!} F_X(x)^{1-1} (1 - F_X(x))^{n-1} f_X(x) = nλ e^{-nλx}

for x > 0 by (0.3). Of course f_{X_{(1)}}(x) = 0 for x ≤ 0. This indicates that the distribution of X_{(1)}
is Exponential(nλ). Hence, the expectation is 1/(nλ).
In this specific problem, n = 100 and λ = 1/1000, and therefore the answer is 10.
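A short simulation (my sketch) of this specific case confirms the answer:

import numpy as np

# n = 100 bulbs with Exponential(1/1000) lifetimes (mean 1000 each);
# the mean of the minimum lifetime should be 1/(n*lambda) = 10.
rng = np.random.default_rng(3)
lifetimes = rng.exponential(scale=1000, size=(100_000, 100))   # scale = 1/lambda = 1000
print(lifetimes.min(axis=1).mean())                            # ≈ 10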
3.
We first compute the CDF F(x) by

F(x) = \int_0^x y e^{-y}\, dy = 1 - (x + 1)e^{-x}.

One can obtain this by integration by parts. Now, we have five independent RVs X_1, X_2, \cdots, X_5,
where each corresponds to the functioning time of one motor. The machine will stop operating
as soon as three motors die. Therefore the lifetime of the machine is the random variable X_{(3)}. The
PDF of X_{(3)} can be computed explicitly by (0.3) such that

f_{X_{(3)}}(x) = \frac{5!}{2!\,2!} [1 - (x + 1)e^{-x}]^2 [(x + 1)e^{-x}]^2\, x e^{-x} = 30x(x + 1)^2 e^{-3x} (1 - (x + 1)e^{-x})^2

for x > 0, and f_{X_{(3)}}(x) = 0 for x ≤ 0.
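As a sanity check (my addition), note that the motor density xe^{-x} is the Gamma(2, 1) density, so the derived PDF of X_{(3)} can be compared against simulation:

import numpy as np

# Each motor lifetime has density x*exp(-x) for x > 0, i.e. Gamma(shape=2, scale=1);
# the machine lifetime is the 3rd smallest of 5 such lifetimes.
rng = np.random.default_rng(4)
machine = np.sort(rng.gamma(shape=2.0, scale=1.0, size=(200_000, 5)), axis=1)[:, 2]

x = np.linspace(1e-4, 30, 300_000)
pdf = 30 * x * (x + 1) ** 2 * np.exp(-3 * x) * (1 - (x + 1) * np.exp(-x)) ** 2
dx = x[1] - x[0]
print(pdf.sum() * dx)                          # ≈ 1, so the derived PDF integrates to one
print(machine.mean(), (x * pdf).sum() * dx)    # empirical vs. theoretical mean of X_(3)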
4.
The first step is to express X, Y in terms of U, V. By the second relation, X = V Y, and inserting
this into the first relation gives

U = X + Y = V Y + Y.

From this, we obtain

X = \frac{UV}{V + 1},   Y = \frac{U}{V + 1}.     (0.4)
The second step is to calculate the Jacobian:

J = \det \begin{pmatrix} 1 & 1 \\ \frac{1}{Y} & -\frac{X}{Y^2} \end{pmatrix} = -\frac{X}{Y^2} - \frac{1}{Y} = -\frac{X + Y}{Y^2}.     (0.5)

By (0.4) we then obtain

|J| = \frac{\frac{UV}{V+1} + \frac{U}{V+1}}{\left(\frac{U}{V+1}\right)^2} = \frac{(V + 1)^2}{U}.     (0.6)
Finally,

f_{U,V}(u, v) = f_{X,Y}\left(\frac{uv}{v+1}, \frac{u}{v+1}\right) \times \frac{1}{|J(u, v)|}
             = f_{X,Y}\left(\frac{uv}{v+1}, \frac{u}{v+1}\right) \times \frac{u}{(v+1)^2}
             = \begin{cases} \frac{u}{(v+1)^2} & \text{if } 0 < \frac{uv}{v+1}, \frac{u}{v+1} < 1 \\ 0 & \text{otherwise.} \end{cases}
Remark here that we substituted (0.4) at the first equality and used (0.6) (not (0.5)!) at the second
equality. The final equality follows from the fact that X, Y are independent Uniform(0, 1) RVs, so that

f_{X,Y}(x, y) = \begin{cases} 1 & \text{if } 0 < x, y < 1 \\ 0 & \text{otherwise.} \end{cases}
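As a numerical check (my addition), one consequence of this result is P(U < 1, V < 1) = \int_0^1\int_0^1 \frac{u}{(v+1)^2}\, du\, dv = \frac{1}{4}, which is easy to confirm by simulation:

import numpy as np

# Check of the derived f_{U,V}: with X, Y iid Uniform(0,1), U = X + Y, V = X/Y,
# the density u/(v+1)**2 integrated over (0,1)x(0,1) equals 1/4.
rng = np.random.default_rng(5)
x, y = rng.uniform(size=(2, 1_000_000))
u, v = x + y, x / y
print(np.mean((u < 1) & (v < 1)))              # Monte Carlo estimate, ≈ 0.25

d = 1 / 1000
g = np.arange(d / 2, 1, d)
uu, vv = np.meshgrid(g, g)
print(np.sum(uu / (vv + 1) ** 2) * d * d)      # grid integral of the density, ≈ 0.25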
5.
We first solve the equations. Since X = log V, it is easy to see that

X = \log V,   Y = U - \log V.     (0.7)

The Jacobian is given by

J = \det \begin{pmatrix} 1 & 1 \\ e^X & 0 \end{pmatrix} = -e^X,

and therefore, by (0.7), |J| = e^{\log V} = V. Thus,
f_{U,V}(u, v) = f_{X,Y}(\log v, u - \log v)\, |J|^{-1} = λe^{-λ\log v} \cdot λe^{-λ(u - \log v)} \cdot \frac{1}{v} = \frac{λ^2}{v} e^{-λu}

if 0 < \log v and 0 < u - \log v. These two conditions can be compactly rewritten as 1 < v < e^u, and
therefore the answer is

f_{U,V}(u, v) = \begin{cases} \frac{λ^2}{v} e^{-λu} & \text{if } 1 < v < e^u \\ 0 & \text{otherwise.} \end{cases}
6.
Let U = X + Y and V = Y. Then a computation similar to that of Problem 1 gives us

f_{U,V}(u, v) = f_{X,Y}(u - v, v).

The Jacobian does not appear since J = 1. Hence,

f_{X+Y}(t) = f_U(t) = \int_{-\infty}^{\infty} f_{U,V}(t, x)\, dx = \int_{-\infty}^{\infty} f_{X,Y}(t - x, x)\, dx,
and X + Y is a continuous random variable with this PDF.
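For instance (an example I'm adding), with X, Y iid Exponential(1) the convolution integral gives f_{X+Y}(t) = te^{-t}, which is easy to verify numerically:

import numpy as np

# Convolution check: for X, Y iid Exp(1), the formula gives f_{X+Y}(t) = t * exp(-t).
def f_exp(z):
    return np.exp(-z) * (z > 0)          # Exp(1) density, zero for z <= 0

t = 2.0
x = np.linspace(-10, 10, 200_001)
conv = np.sum(f_exp(t - x) * f_exp(x)) * (x[1] - x[0])   # numerical convolution integral
print(conv, t * np.exp(-t))                              # both ≈ 0.2707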
7.
Let us first compute f_Y(y), the PDF of Y, such that

f_Y(y) = \int_0^{\infty} \frac{1}{y} e^{-\frac{x}{y}} e^{-y}\, dx = e^{-y}

for y > 0. Hence,

f_{X|Y}(x|y) = \frac{f_{X,Y}(x, y)}{f_Y(y)} = \frac{1}{y} e^{-\frac{x}{y}}.

In other words, given Y = y, the distribution of X is Exponential(1/y). Hence the second moment
E(X^2 | Y = y) is 2y^2. (Of course, one can also check this by the brute-force computation
\int_0^{\infty} x^2 f_{X|Y}(x|y)\, dx = \int_0^{\infty} x^2 \frac{1}{y} e^{-\frac{x}{y}}\, dx.)
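A quick simulation (my addition) of the conditional second moment, fixing an arbitrary value of y:

import numpy as np

# Given Y = y, X has density (1/y)*exp(-x/y), i.e. an exponential RV with mean y,
# so its second moment should be 2*y**2.
rng = np.random.default_rng(7)
y = 1.5
x = rng.exponential(scale=y, size=1_000_000)
print(np.mean(x**2), 2 * y**2)                 # both ≈ 4.5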
8. (a), (b)
Let X be the number of accidents next year. Then, given Λ = y, X is a Poisson(y) RV. In other words,

P(X = k | Λ = y) = e^{-y} \frac{y^k}{k!},   k = 0, 1, 2, \cdots

Moreover, f_Λ(y) = e^{-y} for y > 0 (and 0 for y ≤ 0). Hence
P(X = k) = \int_0^{\infty} P(X = k | Λ = y) f_Λ(y)\, dy = \int_0^{\infty} e^{-y} \frac{y^k}{k!} \times e^{-y}\, dy = \frac{1}{k!} \int_0^{\infty} y^k e^{-2y}\, dy.

By the change of variable z = 2y, the last integral can be calculated as

\frac{1}{k!} \int_0^{\infty} \left(\frac{z}{2}\right)^k e^{-z}\, \frac{1}{2}\, dz = \frac{1}{k!\, 2^{k+1}} \int_0^{\infty} z^k e^{-z}\, dz = \frac{1}{k!\, 2^{k+1}} \times k! = \frac{1}{2^{k+1}},
as we studied before. Therefore, the answers for parts (a) and (b) are 1/2 and 1/16, respectively.
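A simulation sketch (my addition; reading off the stated answers, part (a) corresponds to k = 0 and part (b) to k = 3):

import numpy as np

# Lambda ~ Exp(1), then X | Lambda = y ~ Poisson(y); check P(X = k) = 1/2**(k+1).
rng = np.random.default_rng(8)
lam = rng.exponential(scale=1.0, size=1_000_000)
x = rng.poisson(lam)
print(np.mean(x == 0), 1 / 2)      # part (a): k = 0
print(np.mean(x == 3), 1 / 16)     # part (b): k = 3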
(c)
Let Z be the number of accidents this year. Then we have to compute

P(X = 3 | Z = 0) = \frac{P(X = 3, Z = 0)}{P(Z = 0)}.     (0.8)

By part (a), we know that P(Z = 0) = 1/2. We now compute the numerator. Note that, given
Λ = y, X and Z are independent Poisson(y) RVs and hence

P(X = k, Z = l | Λ = y) = e^{-y} \frac{y^k}{k!} \times e^{-y} \frac{y^l}{l!}.
Thus,
P(X = 3, Z = 0) = \int_0^{\infty} P(X = 3, Z = 0 | Λ = y) f_Λ(y)\, dy = \int_0^{\infty} e^{-y} \frac{y^3}{3!} \times e^{-y} \frac{y^0}{0!} \times e^{-y}\, dy = \frac{1}{81}.
The last computation easily follows from the change of variable z = 3y. Consequently, (0.8) equals
(1/81)/(1/2) = 2/81, which is our final answer.
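The conditional probability can also be checked by simulation (my sketch):

import numpy as np

# Given Lambda, X and Z are independent Poisson(Lambda) counts; check P(X = 3 | Z = 0) = 2/81.
rng = np.random.default_rng(9)
lam = rng.exponential(scale=1.0, size=2_000_000)
x, z = rng.poisson(lam), rng.poisson(lam)
print(np.mean(x[z == 0] == 3), 2 / 81)         # both ≈ 0.0247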
3. Solutions for Quiz 5
Question 7.
Consider two parameters c > 0 and α > 0, and let X be a continuous RV with density f given below
(we say that X is Pareto with parameters c and α):

f(x) = \begin{cases} 0 & \text{if } x < c \\ \frac{α}{c} \left(\frac{c}{x}\right)^{α+1} & \text{if } x ≥ c. \end{cases}

(a) Show that f is indeed a probability density function.
(b) Compute P(X > x).
(c) Let y > 0. Compute \lim_{x→∞} P(X > x + y | X > x). What do you conclude? Is that property
true for an exponential RV?
Solution.
(a)
By the formula, it is obvious that f is a non-negative function. Now, let us check that
\int_{-\infty}^{\infty} f(x)\, dx = 1. Since f = 0 for x < c, this integral is \int_c^{\infty} \frac{α}{c} \left(\frac{c}{x}\right)^{α+1} dx. Since
\frac{d}{dx}\left[-\left(\frac{c}{x}\right)^{α}\right] = \frac{α}{c} \left(\frac{c}{x}\right)^{α+1}, this integral is

\left\{-\left(\frac{c}{x}\right)^{α}\right\}_{x=\infty} - \left\{-\left(\frac{c}{x}\right)^{α}\right\}_{x=c} = 0 - (-1) = 1.

Consequently, f is a probability density function.
(b)
By the definition of X, it is obvious that P(X > x) = 1 if x < c, because \int_x^{\infty} f(y)\, dy = \int_c^{\infty} f(y)\, dy
in this case and this integral is 1, as we have seen in part (a). For x ≥ c, in the same way as in
part (a),

P(X > x) = \int_x^{\infty} \frac{α}{c} \left(\frac{c}{y}\right)^{α+1} dy = \left\{-\left(\frac{c}{y}\right)^{α}\right\}_{y=\infty} - \left\{-\left(\frac{c}{y}\right)^{α}\right\}_{y=x} = \left(\frac{c}{x}\right)^{α}.
(c)
By the result of part (b),

P(X > x + y | X > x) = \frac{P(X > x + y, X > x)}{P(X > x)} = \frac{P(X > x + y)}{P(X > x)} = \frac{\left(\frac{c}{x+y}\right)^{α}}{\left(\frac{c}{x}\right)^{α}} = \left(\frac{x}{x + y}\right)^{α}.

Hence,

\lim_{x→∞} P(X > x + y | X > x) = \lim_{x→∞} \left(\frac{x}{x + y}\right)^{α} = 1.
We can conclude that, conditioned on X > x where x is a large enough number, X is highly likely to
be larger than x + y as well, for any fixed positive number y. It would be a little bit surprising if you
take y to be a very large number like 10^{1000000000000000000000}.
For the exponential distribution, we have the memorylessness property P(X > x + y | X > x) = P(X > y),
and therefore the exponential distribution does not have this property.
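To illustrate the contrast (my addition), one can sample the Pareto distribution by the inverse transform X = cU^{-1/α} (since P(X > x) = (c/x)^α) and watch the conditional tail probability increase with x, unlike the memoryless exponential case:

import numpy as np

# Pareto(c, alpha) via inverse transform: P(X > x) = (c/x)**alpha  =>  X = c * U**(-1/alpha).
c, alpha, y = 1.0, 1.5, 2.0
rng = np.random.default_rng(10)
samples = c * rng.uniform(size=5_000_000) ** (-1 / alpha)

for x in [2.0, 10.0, 50.0]:
    tail = samples > x
    empirical = np.mean(samples[tail] > x + y)          # P(X > x+y | X > x), estimated
    print(x, empirical, (x / (x + y)) ** alpha)         # compare with part (b)'s formula
# The conditional tail tends to 1 as x grows; for an exponential RV it would stay at P(X > y).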