Econometric Theory, 18, 2002, 1086–1098. Printed in the United States of America.
DOI: 10.1017/S0266466602185033

PMSE PERFORMANCE OF THE BIASED ESTIMATORS IN A LINEAR REGRESSION MODEL WHEN RELEVANT REGRESSORS ARE OMITTED

AKIO NAMBA
Kobe University

In this paper, we consider a linear regression model when relevant regressors are omitted. We derive the explicit formulae for the predictive mean squared errors (PMSEs) of the Stein-rule (SR) estimator, the positive-part Stein-rule (PSR) estimator, the minimum mean squared error (MMSE) estimator, and the adjusted minimum mean squared error (AMMSE) estimator. It is shown analytically that the PSR estimator dominates the SR estimator in terms of PMSE even when there are omitted relevant regressors. Also, our numerical results show that the PSR estimator and the AMMSE estimator have much smaller PMSEs than the ordinary least squares estimator even when the relevant regressors are omitted.
1. INTRODUCTION
In the context of linear regression, the Stein-rule (SR) estimator proposed by Stein (1956) and James and Stein (1961) dominates the ordinary least squares (OLS) estimator in terms of predictive mean squared error (PMSE) if the model is specified correctly. Moreover, as is shown in Baranchik (1970), the SR estimator is itself dominated by the positive-part Stein-rule (PSR) estimator when the specified model is correct.

As an improved estimator, Theil (1971) proposes the minimum mean squared error (MMSE) estimator. Because Theil's (1971) MMSE estimator includes unknown parameters, Farebrother (1975) suggests an operational variant of the MMSE estimator that is obtained by replacing the unknown parameters of the MMSE estimator by the OLS estimators. Hereafter, we use the term MMSE estimator to denote the operational variant of the MMSE estimator. Because the MMSE estimator satisfies Baranchik's (1970) condition, it dominates the OLS estimator if the number of regressors is larger than or equal to three and if the model is specified correctly. As an extension of the MMSE estimator, Ohtani (1996) considers the adjusted minimum mean squared error (AMMSE) estimator, which is obtained by adjusting the degrees of freedom of the component of the MMSE estimator.

The author is grateful to Kazuhiro Ohtani for his helpful guidance and to Peter C.B. Phillips and anonymous referees for their useful comments and suggestions. Address correspondence to: Akio Namba, Graduate School of Economics, Kobe University, Rokko, Nada-ku, Kobe 657-8501, Japan; e-mail: namba@kobe-u.ac.jp.
These estimators are called biased estimators because they are not unbiased even if the model is specified correctly. However, in most practical situations it is hard to determine which regressors should be included in the model, so a researcher may mistakenly exclude relevant regressors. If relevant regressors are omitted, even the OLS estimator is not unbiased. In such situations, there may be a strong incentive to use the biased estimators if we consider that they are superior to the OLS estimator under some criterion. However, there is little research on the properties of biased estimators when relevant regressors are omitted from the specified model. Some exceptions are Mittelhammer (1984) and Ohtani (1993, 1998). Mittelhammer (1984) shows that the SR estimator no longer dominates the OLS estimator when relevant regressors are omitted. Ohtani (1993) derives the formulae for the PMSEs of the SR estimator and the PSR estimator in the misspecified model and derives a sufficient condition for the PSR estimator to dominate the SR estimator. Also, Ohtani (1998) derives the formulae for the PMSEs of the MMSE estimator and the AMMSE estimator. His numerical results show that the PMSE performance of the AMMSE estimator is better than that of the PSR estimator when the number of regressors included in the specified model is less than or equal to five and the model misspecification is severe. However, in Ohtani (1993, 1998), the formulae for the PMSEs of the SR, PSR, MMSE, and AMMSE estimators are not correct.

Thus, in this paper, we limit our attention to PMSE and compare the sampling performances of the OLS, SR, PSR, MMSE, and AMMSE estimators, though there are several other criteria. (As to other criteria, see, e.g., Gourieroux and Monfort, 1995.) The plan of this paper is as follows. In Section 2 the model and the estimators are presented, and in Section 3 we derive the explicit formulae for the PMSEs of the SR, PSR, MMSE, and AMMSE estimators. It is shown analytically that the PSR estimator dominates the SR estimator in terms of PMSE even when relevant regressors are omitted. In Section 4, using the formulae derived in Section 3, we compare the PMSEs of the estimators by numerical evaluations.
2. THE MODEL AND THE ESTIMATORS

Consider a linear regression model,

$$y = X_1\beta_1 + X_2\beta_2 + \varepsilon, \qquad \varepsilon \sim N(0, \sigma^2 I_n), \tag{1}$$

where $y$ is an $n \times 1$ vector of observations on a dependent variable, $X_1$ and $X_2$ are $n \times k_1$ and $n \times k_2$ matrices of observations on nonstochastic independent variables, $\beta_1$ and $\beta_2$ are $k_1 \times 1$ and $k_2 \times 1$ vectors of regression coefficients, and $\varepsilon$ is an $n \times 1$ vector of normal error terms. We assume that $X_1$ and $[X_1, X_2]$ are of full column rank.

Suppose that the matrix of regressors $X_2$ is omitted mistakenly and the model is specified as

$$y = X_1\beta_1 + \eta, \qquad \eta = X_2\beta_2 + \varepsilon. \tag{2}$$

Then, based on the misspecified model, the ordinary least squares (OLS) estimator of $\beta_1$ is

$$b_1 = S_{11}^{-1} X_1' y, \tag{3}$$

where $S_{11} = X_1' X_1$.
Also, the SR, PSR, MMSE, and AMMSE estimators based on the misspecified model are, respectively,

$$b_{SR1} = \left(1 - \frac{a\, e_1' e_1}{b_1' S_{11} b_1}\right) b_1, \tag{4}$$

$$b_{PS1} = \max\left[0,\; 1 - \frac{a\, e_1' e_1}{b_1' S_{11} b_1}\right] b_1, \tag{5}$$

$$b_{M1} = \left(\frac{b_1' S_{11} b_1}{b_1' S_{11} b_1 + e_1' e_1/n_1}\right) b_1, \tag{6}$$

$$b_{AM1} = \left(\frac{b_1' S_{11} b_1/k_1}{b_1' S_{11} b_1/k_1 + e_1' e_1/n_1}\right) b_1, \tag{7}$$

where $e_1 = y - X_1 b_1$, $n_1 = n - k_1$, and $a$ is a constant such that $0 \le a \le 2(k_1 - 2)/(n_1 + 2)$. If the model is specified correctly, the PMSE of the SR estimator is minimized when $a = (k_1 - 2)/(n_1 + 2)$. Thus, we use this value of $a$ hereafter.
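As a concrete illustration of (3)–(7), the following sketch computes the five estimators from a given response vector and regressor matrix. It is a minimal Python translation written for this purpose (the paper's own computations in Section 4 use FORTRAN); the simulated $y$ and $X_1$ are arbitrary illustrative data.

```python
# A minimal sketch of the estimators in (3)-(7); illustrative data, not from the paper.
import numpy as np

def biased_estimators(y, X1):
    n, k1 = X1.shape
    n1 = n - k1
    a = (k1 - 2) / (n1 + 2)                        # value of a used in the paper
    S11 = X1.T @ X1
    b1 = np.linalg.solve(S11, X1.T @ y)            # OLS estimator, equation (3)
    e1 = y - X1 @ b1                               # residual vector e1 = y - X1 b1
    rss = e1 @ e1                                  # e1'e1
    q = b1 @ S11 @ b1                              # b1'S11 b1
    b_sr = (1.0 - a * rss / q) * b1                # SR estimator, equation (4)
    b_psr = max(0.0, 1.0 - a * rss / q) * b1       # PSR estimator, equation (5)
    b_mmse = q / (q + rss / n1) * b1               # MMSE estimator, equation (6)
    b_ammse = (q / k1) / (q / k1 + rss / n1) * b1  # AMMSE estimator, equation (7)
    return b1, b_sr, b_psr, b_mmse, b_ammse

rng = np.random.default_rng(0)
X1 = rng.normal(size=(20, 5))                      # k1 = 5 regressors, n = 20 observations
y = X1 @ np.ones(5) + rng.normal(size=20)          # omitted relevant regressors would enter y here
print(biased_estimators(y, X1))
```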
3. PMSE OF THE ESTIMATORS

To derive the explicit formulae for the PMSEs of the SR, PSR, MMSE, and AMMSE estimators, we consider the general pretest estimator defined as

$$\hat b_1 = I(F \ge c)\left(1 + \alpha\,\frac{e_1' e_1}{b_1' S_{11} b_1}\right)^{r} b_1, \tag{8}$$

where $I(A)$ is an indicator function such that $I(A) = 1$ if an event $A$ occurs and $I(A) = 0$ otherwise, $F = (b_1' S_{11} b_1/k_1)/(e_1' e_1/n_1)$ is the test statistic for the null hypothesis $H_0\colon \beta_1 = 0$ against the alternative $H_1\colon \beta_1 \ne 0$ based on the misspecified model, $c$ is the critical value of the pretest, and $r$ is an arbitrary integer. The estimator $\hat b_1$ reduces to the SR estimator when $c = 0$, $\alpha = -a$, and $r = 1$, and it reduces to the PSR estimator when $c = a n_1/k_1$, $\alpha = -a$, and $r = 1$. Furthermore, $\hat b_1$ reduces to the MMSE estimator when $c = 0$, $\alpha = 1/n_1$, and $r = -1$, and to the AMMSE estimator when $c = 0$, $\alpha = k_1/n_1$, and $r = -1$, respectively.
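The reductions just stated can be checked directly: evaluating the general form (8) at the four $(c, \alpha, r)$ settings reproduces the estimators of Section 2. The sketch below (illustrative only, with simulated data) does exactly this; for the PSR case the indicator $I(F \ge a n_1/k_1)$ plays the role of the positive-part truncation in (5).

```python
# A sketch of the general pretest form (8) and the (c, alpha, r) settings given in the text.
import numpy as np

def general_pretest(y, X1, c, alpha, r):
    n, k1 = X1.shape
    n1 = n - k1
    S11 = X1.T @ X1
    b1 = np.linalg.solve(S11, X1.T @ y)
    e1 = y - X1 @ b1
    rss, q = e1 @ e1, b1 @ S11 @ b1
    F = (q / k1) / (rss / n1)                      # test statistic for H0: beta1 = 0
    return (F >= c) * (1.0 + alpha * rss / q) ** r * b1

n, k1 = 20, 5
n1 = n - k1
a = (k1 - 2) / (n1 + 2)
settings = {                                       # (c, alpha, r) for each estimator
    "SR":    (0.0,          -a,       1),
    "PSR":   (a * n1 / k1,  -a,       1),
    "MMSE":  (0.0,          1 / n1,  -1),
    "AMMSE": (0.0,          k1 / n1, -1),
}
rng = np.random.default_rng(0)
X1 = rng.normal(size=(n, k1))
y = X1 @ np.ones(k1) + rng.normal(size=n)
for name, (c, alpha, r) in settings.items():
    print(name, general_pretest(y, X1, c, alpha, r))
```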
The PMSE of $\hat b_1$ is defined as

$$\mathrm{PMSE}[\hat b_1] = E[(X_1\hat b_1 - X\beta)'(X_1\hat b_1 - X\beta)] = E[\hat b_1' S_{11}\hat b_1] - 2E[\beta' X' X_1\hat b_1] + \beta' S\beta, \tag{9}$$

where $X = [X_1, X_2]$, $\beta' = [\beta_1', \beta_2']$, and $S = X'X$. The meaning of the PMSE, and some discussion of it, are given in Mittelhammer (1984) and Ohtani (1993). Denoting $u_1 = b_1' S_{11} b_1/\sigma^2$ and $u_2 = e_1' e_1/\sigma^2$, (9) reduces to

$$\begin{aligned}
\mathrm{PMSE}[\hat b_1] = \sigma^2 E\left[I\left(\frac{n_1 u_1}{k_1 u_2} \ge c\right)\left(\frac{u_1 + \alpha u_2}{u_1}\right)^{2r} u_1\right]
- 2E\left[I\left(\frac{n_1 u_1}{k_1 u_2} \ge c\right)\left(\frac{u_1 + \alpha u_2}{u_1}\right)^{r}\beta' X' X_1 b_1\right] + \beta' S\beta.
\end{aligned} \tag{10}$$
We define the functions $H(p, q; \alpha, c)$ and $J(p, q; \alpha, c)$ as

$$H(p, q; \alpha, c) = E\left[I\left(\frac{n_1 u_1}{k_1 u_2} \ge c\right)\left(\frac{u_1 + \alpha u_2}{u_1}\right)^{p} u_1^{q}\right], \tag{11}$$

$$J(p, q; \alpha, c) = E\left[I\left(\frac{n_1 u_1}{k_1 u_2} \ge c\right)\left(\frac{u_1 + \alpha u_2}{u_1}\right)^{p} u_1^{q}\,(\beta' X' X_1 b_1/\sigma^2)\right], \tag{12}$$

where $p$ and $q$ are arbitrary integers. Then we obtain
$$\mathrm{PMSE}[\hat b_1]/\sigma^2 = H(2r, 1; \alpha, c) - 2J(r, 0; \alpha, c) + \lambda_1 + \lambda_2, \tag{13}$$

where $\lambda_1 = \beta' X' X_1 S_{11}^{-1} X_1' X\beta/\sigma^2$, $\lambda_2 = \beta' X' M_1 X\beta/\sigma^2$, $M_1 = I_n - X_1 S_{11}^{-1} X_1'$, and $\lambda_1 + \lambda_2 = \beta' S\beta/\sigma^2$.
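For any particular design and coefficient vector, $\lambda_1$ and $\lambda_2$ can be computed directly from these definitions. The fragment below is only a numerical illustration (the choices of $X_1$, $X_2$, $\beta$, and $\sigma$ are arbitrary); it also verifies the identity $\lambda_1 + \lambda_2 = \beta'S\beta/\sigma^2$.

```python
# Computing lambda_1 and lambda_2 from their definitions; illustrative values only.
import numpy as np

rng = np.random.default_rng(1)
n, k1, k2, sigma = 20, 5, 3, 1.0
X1 = rng.normal(size=(n, k1))
X2 = rng.normal(size=(n, k2))
beta = rng.normal(size=k1 + k2)
X = np.hstack([X1, X2])
S11_inv = np.linalg.inv(X1.T @ X1)
M1 = np.eye(n) - X1 @ S11_inv @ X1.T                 # M1 = I_n - X1 S11^{-1} X1'
Xb = X @ beta
lam1 = Xb @ X1 @ S11_inv @ X1.T @ Xb / sigma**2      # beta'X'X1 S11^{-1} X1'X beta / sigma^2
lam2 = Xb @ M1 @ Xb / sigma**2                       # beta'X'M1 X beta / sigma^2
assert np.isclose(lam1 + lam2, Xb @ Xb / sigma**2)   # lambda_1 + lambda_2 = beta'S beta / sigma^2
print(lam1, lam2)
```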
As is shown in the Appendix, the explicit formulae of $H(p, q; \alpha, c)$ and $J(p, q; \alpha, c)$ are

$$H(p, q; \alpha, c) = \sum_{i=0}^{\infty}\sum_{j=0}^{\infty} w_i(\lambda_1)\, w_j(\lambda_2)\, G_{ij}(p, q; \alpha, c), \tag{14}$$

$$J(p, q; \alpha, c) = \lambda_1\sum_{i=0}^{\infty}\sum_{j=0}^{\infty} w_i(\lambda_1)\, w_j(\lambda_2)\, G_{i+1, j}(p, q; \alpha, c), \tag{15}$$

where

$$G_{ij}(p, q; \alpha, c) = \frac{2^{q}\,\Gamma((k_1 + n_1)/2 + q + i + j)}{\Gamma(k_1/2 + i)\,\Gamma(n_1/2 + j)}\int_{c^*}^{1} t^{k_1/2 - p + q + i - 1}(1 - t)^{n_1/2 + j - 1}[\alpha + (1 - \alpha)t]^{p}\, dt, \tag{16}$$

$w_i(\lambda) = \exp(-\lambda/2)(\lambda/2)^{i}/i!$, and $c^* = k_1 c/(n_1 + k_1 c)$.
The formula for $J(p, q; \alpha, c)$ is substantially different from the formula derived in Theorem 1 of Ohtani (1993). In Ohtani (1993), there are several unnecessary terms in the formulae for the PMSEs of the SR estimator and the PSR estimator.

When $c^* \in (0, 1)$, the integral in (16) can be written as

$$\begin{aligned}
\int_{c^*}^{1} & t^{k_1/2 - p + q + i - 1}(1 - t)^{n_1/2 + j - 1}[\alpha + (1 - \alpha)t]^{p}\, dt \\
&= \sum_{l=0}^{\infty}\binom{k_1/2 - p + q + i - 1}{l}(-1)^{l}\int_{c^*}^{1}(1 - t)^{n_1/2 + l + j - 1}[\alpha + (1 - \alpha)t]^{p}\, dt \\
&= \sum_{l=0}^{\infty}\binom{k_1/2 - p + q + i - 1}{l}(-1)^{l}\,\frac{(1 - c^*)^{n_1/2 + j + l}}{n_1/2 + j + l}\;{}_2F_1\!\left(-p,\; n_1/2 + j + l;\; n_1/2 + j + l + 1;\; (1 - \alpha)(1 - c^*)\right),
\end{aligned} \tag{17}$$

where ${}_2F_1(\cdot,\cdot;\cdot;\cdot)$ is the hypergeometric function. (For the definition and properties of the hypergeometric function, see, e.g., Luke, 1969; Abramowitz and Stegun, 1972; Abadir, 1999.) In the case of the MMSE and AMMSE estimators, the hypergeometric function in (17) converges because $|(1 - \alpha)(1 - c^*)| < 1$. In the case of the SR and PSR estimators, it is a finite series with $p + 1$ terms because $p$ is a positive integer. Thus, (17) converges for all the estimators considered in this paper. Also, when $|(1 - \alpha)(1 - c^*)| > 1$, analytic continuation formulae of the hypergeometric function make (17) convergent. (The author is deeply grateful to one of the referees for suggesting that (17) be derived and for explaining the properties of the hypergeometric function.)
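The second equality in (17) rests on an Euler-type integral representation of the hypergeometric function: for $b > 0$, $\int_{c^*}^{1}(1-t)^{b-1}[\alpha+(1-\alpha)t]^{p}\,dt = [(1-c^*)^{b}/b]\,{}_2F_1(-p,\,b;\,b+1;\,(1-\alpha)(1-c^*))$, applied term by term with $b = n_1/2 + j + l$. The quick numerical check below is an illustration added here (it is not part of the paper); the parameter values are arbitrary.

```python
# Numerical check of the hypergeometric identity used in (17); illustrative values only.
from scipy.integrate import quad
from scipy.special import hyp2f1

def lhs(b, p, alpha, cstar):
    f = lambda t: (1 - t)**(b - 1) * (alpha + (1 - alpha) * t)**p
    return quad(f, cstar, 1.0)[0]

def rhs(b, p, alpha, cstar):
    return (1 - cstar)**b / b * hyp2f1(-p, b, b + 1, (1 - alpha) * (1 - cstar))

cases = [(7.5,  2, -0.3, 0.25),    # SR/PSR-type case: p a positive integer, finite 2F1 series
         (7.5, -1, 1/15, 0.0)]     # MMSE-type case: p = -1 and |(1 - alpha)(1 - c*)| < 1
for b, p, alpha, cstar in cases:
    print(lhs(b, p, alpha, cstar), rhs(b, p, alpha, cstar))   # the two columns should agree
```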
Let $\alpha = -a$ and $r = 1$. We have

$$\begin{aligned}
\mathrm{PMSE}[\hat b_1]/\sigma^2 &= H(2, 1; -a, c) - 2J(1, 0; -a, c) + \lambda_1 + \lambda_2 \\
&= 2\sum_{i=0}^{\infty}\sum_{j=0}^{\infty} w_i(\lambda_1) w_j(\lambda_2)\,\frac{\Gamma((k_1 + n_1)/2 + i + j + 1)}{\Gamma(k_1/2 + i)\,\Gamma(n_1/2 + j)}\int_{c^*}^{1} t^{k_1/2 + i - 2}(1 - t)^{n_1/2 + j - 1}[-a + (1 + a)t]^{2}\, dt \\
&\quad - 2\lambda_1\sum_{i=0}^{\infty}\sum_{j=0}^{\infty} w_i(\lambda_1) w_j(\lambda_2)\,\frac{\Gamma((k_1 + n_1)/2 + i + j + 1)}{\Gamma(k_1/2 + i + 1)\,\Gamma(n_1/2 + j)}\int_{c^*}^{1} t^{k_1/2 + i - 1}(1 - t)^{n_1/2 + j - 1}[-a + (1 + a)t]\, dt + \lambda_1 + \lambda_2.
\end{aligned} \tag{18}$$
Differentiating (18) with respect to $c$ and performing some manipulations, we obtain

$$\begin{aligned}
\frac{\partial\,\mathrm{PMSE}[\hat b_1]/\sigma^2}{\partial c}
= \sum_{i=0}^{\infty}\sum_{j=0}^{\infty} w_i(\lambda_1) w_j(\lambda_2)\,
& \frac{2\,\Gamma((k_1 + n_1)/2 + i + j + 1)}{\Gamma(k_1/2 + i)\,\Gamma(n_1/2 + j)}\,
\frac{k_1^{k_1/2 + i - 1}\, n_1^{n_1/2 + j}\, c^{k_1/2 + i - 2}}{(n_1 + k_1 c)^{(n_1 + k_1)/2 + i + j - 1}} \\
&\times \left[-a + (1 + a)\frac{k_1 c}{n_1 + k_1 c}\right]
\left\{-\left[-a + (1 + a)\frac{k_1 c}{n_1 + k_1 c}\right] + \lambda_1\,\frac{1}{k_1/2 + i}\,\frac{k_1 c}{n_1 + k_1 c}\right\}.
\end{aligned} \tag{19}$$

From (19), when $\alpha = -a$ and $r = 1$, a condition for $\mathrm{PMSE}[\hat b_1]$ to be monotonically decreasing in $c$ is

$$-a + (1 + a)\frac{k_1 c}{n_1 + k_1 c} \le 0. \tag{20}$$

Since $k_1 c/(n_1 + k_1 c) \le a/(1 + a)$ if and only if $c \le a n_1/k_1$, condition (20) holds for all $c \in [0, a n_1/k_1]$. Thus, $\mathrm{PMSE}[\hat b_1]$ is monotonically decreasing on $c \in [0, a n_1/k_1]$ if $\alpha = -a$ and $r = 1$. Because $\hat b_1$ reduces to the SR estimator when $\alpha = -a$, $r = 1$, and $c = 0$, and it reduces to the PSR estimator when $\alpha = -a$, $r = 1$, and $c = a n_1/k_1$, we obtain the following theorem.
THEOREM 1. The PSR estimator dominates the SR estimator in terms of PMSE even when relevant regressors are omitted in the specified model.
Theorem 2 in Ohtani (1993) requires the condition $\lambda_1 \ge \lambda_2$ for the PMSE of the PSR estimator to be smaller than that of the SR estimator. This condition arises because his formula for the PMSE is derived incorrectly. Our result shows that no such condition is required when the formula is derived correctly.
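Theorem 1 can also be illustrated by simulation. The sketch below, which is an illustration added here rather than part of the paper's analysis, repeatedly draws data from a model in which $X_2$ is relevant but omitted from the fitted regression, and compares the simulated PMSEs of the SR and PSR estimators; the design and coefficient values are arbitrary.

```python
# Monte Carlo illustration of Theorem 1 under omitted relevant regressors; illustrative only.
import numpy as np

rng = np.random.default_rng(2)
n, k1, k2, sigma = 20, 5, 3, 1.0
X1 = rng.normal(size=(n, k1))
X2 = rng.normal(size=(n, k2))                  # relevant but omitted from the fitted model
X = np.hstack([X1, X2])
beta = np.full(k1 + k2, 0.5)
Xbeta = X @ beta
n1 = n - k1
a = (k1 - 2) / (n1 + 2)
S11 = X1.T @ X1

def simulated_pmse(estimates):
    fitted = np.asarray(estimates) @ X1.T      # X1 b-hat for each replication
    return np.mean(np.sum((fitted - Xbeta)**2, axis=1))

sr, psr = [], []
for _ in range(20000):
    y = Xbeta + sigma * rng.normal(size=n)
    b1 = np.linalg.solve(S11, X1.T @ y)
    e1 = y - X1 @ b1
    shrink = 1.0 - a * (e1 @ e1) / (b1 @ S11 @ b1)
    sr.append(shrink * b1)
    psr.append(max(shrink, 0.0) * b1)

print("simulated PMSE, SR :", simulated_pmse(sr))
print("simulated PMSE, PSR:", simulated_pmse(psr))   # should not exceed the SR value
```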
Because further theoretical analysis of the PMSEs of the SR, PSR, MMSE, and AMMSE estimators is difficult, we compare them numerically in the next section.
4. NUMERICAL ANALYSIS

In this section, we compare the PMSE performances of the SR, PSR, MMSE, and AMMSE estimators by numerical evaluations. As is shown in Ohtani (1993), the noncentrality parameters are expressed as $\lambda_1 = \Phi_\beta R_1^2$ and $\lambda_2 = \Phi_\beta(1 - R_1^2)$, where $\Phi_\beta = \beta' S\beta/\sigma^2$ and $R_1^2 = \beta' X' X_1 S_{11}^{-1} X_1' X\beta/\beta' S\beta$. Here, $\Phi_\beta$ is the noncentrality parameter that appears in the test of the null hypothesis that all the regression coefficients are zero in the correctly specified model.
Table 1. Relative PMSEs of the SR, PSR, MMSE, and AMMSE estimators for $k_1 = 5$ and $n = 20$

                                       $R_1^2$
 $\Phi_\beta$     0.1      0.3      0.5      0.7      0.9      1.0

 SR
    0.0         .4706    .4706    .4706    .4706    .4706    .4706
    1.0         .5616    .5632    .5640    .5641    .5635    .5629
    2.0         .6306    .6349    .6372    .6374    .6352    .6332
    4.0         .7299    .7381    .7421    .7415    .7355    .7300
    6.0         .7995    .8080    .8116    .8093    .7997    .7909
    8.0         .8519    .8579    .8594    .8552    .8427    .8314
   10.0         .8935    .8948    .8935    .8873    .8727    .8596
   15.0         .9692    .9538    .9447    .9347    .9176    .9018
   20.0        1.0216    .9873    .9715    .9592    .9416    .9248
   25.0        1.0605   1.0079    .9872    .9734    .9562    .9391
   30.0        1.0908   1.0215    .9971    .9825    .9659    .9489
   40.0        1.1344   1.0377   1.0086    .9930    .9777    .9613
   50.0        1.1640   1.0467   1.0147    .9987    .9844    .9689
  100.0        1.2274   1.0621   1.0249   1.0081    .9965    .9843
  150.0        1.2471   1.0662   1.0275   1.0104    .9997    .9895

 PSR
    0.0         .3533    .3533    .3533    .3533    .3533    .3533
    1.0         .4457    .4522    .4579    .4628    .4666    .4682
    2.0         .5148    .5294    .5414    .5506    .5566    .5584
    4.0         .6121    .6419    .6643    .6792    .6858    .6856
    6.0         .6783    .7197    .7486    .7653    .7693    .7658
    8.0         .7267    .7764    .8084    .8244    .8245    .8175
   10.0         .7640    .8196    .8522    .8659    .8621    .8521
   15.0         .8292    .8920    .9204    .9262    .9149    .9003
   20.0         .8720    .9363    .9571    .9558    .9410    .9245
   25.0         .9028    .9658    .9785    .9721    .9561    .9391
   30.0         .9262    .9866    .9919    .9820    .9659    .9489
   40.0         .9598   1.0133   1.0066    .9929    .9777    .9613
   50.0         .9829   1.0294   1.0139    .9987    .9844    .9689
  100.0        1.0390   1.0582   1.0249   1.0081    .9965    .9843
  150.0        1.0618   1.0651   1.0275   1.0104    .9997    .9895

 MMSE
    0.0         .7254    .7254    .7254    .7254    .7254    .7254
    1.0         .7614    .7631    .7644    .7654    .7660    .7661
    2.0         .7883    .7925    .7957    .7977    .7985    .7984
    4.0         .8260    .8354    .8421    .8460    .8466    .8456
    6.0         .8514    .8651    .8743    .8792    .8794    .8772
    8.0         .8699    .8868    .8976    .9029    .9025    .8993
   10.0         .8842    .9033    .9149    .9203    .9192    .9152
   15.0         .9089    .9310    .9427    .9475    .9454    .9400
   20.0         .9251    .9479    .9586    .9625    .9599    .9539
   25.0         .9367    .9591    .9685    .9716    .9690    .9626
   30.0         .9455    .9671    .9752    .9777    .9750    .9686
   40.0         .9582    .9774    .9834    .9849    .9825    .9762
   50.0         .9669    .9838    .9882    .9891    .9869    .9809
  100.0         .9880    .9963    .9969    .9964    .9949    .9904
  150.0         .9963   1.0003    .9995    .9985    .9972    .9935

 AMMSE
    0.0         .3357    .3357    .3357    .3357    .3357    .3357
    1.0         .4328    .4316    .4295    .4267    .4229    .4207
    2.0         .5050    .5064    .5056    .5023    .4963    .4921
    4.0         .6057    .6154    .6202    .6193    .6110    .6033
    6.0         .6731    .6910    .7016    .7037    .6946    .6841
    8.0         .7216    .7463    .7615    .7661    .7566    .7439
   10.0         .7585    .7885    .8070    .8133    .8036    .7891
   15.0         .8215    .8600    .8824    .8902    .8797    .8620
   20.0         .8616    .9044    .9272    .9342    .9228    .9032
   25.0         .8897    .9346    .9562    .9614    .9490    .9284
   30.0         .9107    .9563    .9761    .9793    .9659    .9448
   40.0         .9399    .9853   1.0011   1.0003    .9855    .9641
   50.0         .9595   1.0037   1.0157   1.0117    .9957    .9745
  100.0        1.0049   1.0424   1.0426   1.0295   1.0103    .9913
  150.0        1.0224   1.0556   1.0503   1.0333   1.0127    .9953
Also, $R_1^2$ is interpreted as the coefficient of determination in the regression of $X\beta$ on $X_1$. Thus, if $R_1^2$ is close to unity, the magnitude of the model misspecification is regarded as small, and vice versa.

The parameter values used in the numerical evaluations were $k_1 = 3, 4, 5, 8$; $n = 20, 30, 40$; $R_1^2 = 0.1, 0.3, 0.5, 0.7, 0.9, 1.0$; and various values of $\Phi_\beta$. To compare the PMSEs of the estimators, we evaluated the relative PMSE defined as $\mathrm{PMSE}[\bar b_1]/\mathrm{PMSE}[b_1]$, where $\bar b_1$ is any estimator of $\beta_1$. Thus, the estimator $\bar b_1$ has smaller PMSE than the OLS estimator when the value of the relative PMSE is smaller than unity. The numerical evaluations were executed on a personal computer using FORTRAN code. In evaluating the integral in $G_{ij}(p, q; \alpha, c)$ given in (16), we used Simpson's rule with 200 equal subdivisions. The double infinite series in $H(p, q; \alpha, c)$ and $J(p, q; \alpha, c)$ were judged to converge when the increment of the series became smaller than $10^{-12}$. Because the results for the cases $k_1 = 5, 8$ and $n = 20$ are qualitatively typical, we do not show the results for the other cases.
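For readers without the original FORTRAN program, the relative PMSEs can be reproduced from equation (13) together with the parameterization $\lambda_1 = \Phi_\beta R_1^2$ and $\lambda_2 = \Phi_\beta(1 - R_1^2)$. The sketch below assumes the $H$ and $J$ functions from the sketch following equation (16) are in scope; it also uses $\mathrm{PMSE}[b_1]/\sigma^2 = k_1 + \lambda_2$ for the OLS denominator, a standard result for the misspecified OLS estimator that is not stated explicitly in the text.

```python
# Relative PMSE via equation (13); requires H() and J() from the sketch after equation (16).
def relative_pmse(k1, n, R2, phi, c, alpha, r):
    n1 = n - k1
    lam1, lam2 = phi * R2, phi * (1 - R2)
    pmse = (H(2 * r, 1, alpha, c, k1, n1, lam1, lam2)
            - 2 * J(r, 0, alpha, c, k1, n1, lam1, lam2) + lam1 + lam2)
    return pmse / (k1 + lam2)        # assumed OLS benchmark: PMSE[b1]/sigma^2 = k1 + lambda_2

k1, n = 5, 20
n1 = n - k1
a = (k1 - 2) / (n1 + 2)
for phi in (0.0, 10.0, 50.0):
    print(phi,
          relative_pmse(k1, n, 0.5, phi, 0.0, -a, 1),            # SR
          relative_pmse(k1, n, 0.5, phi, a * n1 / k1, -a, 1))    # PSR
# At Phi_beta = 0 the two values should be close to the .4706 and .3533 entries of Table 1.
```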
Table 1 shows the relative PMSEs of the SR, PSR, MMSE, and AMMSE estimators for $k_1 = 5$ and $n = 20$. Table 1 confirms Theorem 1; that is, the PMSE of the PSR estimator is uniformly smaller than or equal to that of the SR estimator even if relevant regressors are omitted. Also, we can see that the AMMSE estimator has the smallest PMSE over a wide region of the parameter space (e.g., $\Phi_\beta \le 25.0$). In particular, the PMSE of the AMMSE estimator is smaller than that of the other estimators when the model misspecification is severe (e.g., $R_1^2 = 0.1$ and $\Phi_\beta \le 50.0$). Though the maximum of the relative PMSE of the AMMSE estimator is slightly larger than unity, the minimum is much smaller than unity. This indicates that the gain in PMSE
from using the AMMSE estimator instead of the OLS estimator is much larger than the loss, even when there are omitted regressors. Also, the loss from using the MMSE estimator instead of the OLS estimator is very small, though the gain is not so large.

Table 2. Relative PMSEs of the SR, PSR, MMSE, and AMMSE estimators for $k_1 = 8$ and $n = 20$

                                       $R_1^2$
 $\Phi_\beta$     0.1      0.3      0.5      0.7      0.9      1.0

 SR
    0.0         .3571    .3571    .3571    .3571    .3571    .3571
    1.0         .4321    .4320    .4318    .4313    .4306    .4301
    2.0         .4980    .4975    .4964    .4946    .4920    .4904
    4.0         .6105    .6067    .6020    .5960    .5880    .5831
    6.0         .7053    .6944    .6839    .6724    .6584    .6500
    8.0         .7877    .7663    .7485    .7312    .7116    .6998
   10.0         .8611    .8265    .8005    .7774    .7526    .7379
   15.0        1.0170    .9408    .8931    .8571    .8225    .8023
   20.0        1.1456   1.0212    .9529    .9066    .8657    .8419
   25.0        1.2553   1.0804    .9938    .9396    .8946    .8684
   30.0        1.3506   1.1255   1.0233    .9630    .9152    .8874
   40.0        1.5089   1.1891   1.0624    .9932    .9422    .9127
   50.0        1.6355   1.2314   1.0868   1.0116    .9589    .9288
  100.0        2.0101   1.3252   1.1364   1.0477    .9926    .9629
  150.0        2.1888   1.3589   1.1527   1.0590   1.0033    .9750

 PSR
    0.0         .2606    .2606    .2606    .2606    .2606    .2606
    1.0         .3218    .3279    .3336    .3389    .3437    .3460
    2.0         .3739    .3864    .3976    .4072    .4153    .4188
    4.0         .4585    .4834    .5040    .5198    .5304    .5335
    6.0         .5251    .5609    .5888    .6076    .6165    .6171
    8.0         .5792    .6245    .6576    .6769    .6818    .6786
   10.0         .6244    .6778    .7144    .7322    .7318    .7247
   15.0         .7106    .7801    .8198    .8293    .8144    .7984
   20.0         .7722    .8539    .8917    .8898    .8626    .8408
   25.0         .8184    .9099    .9432    .9295    .8935    .8682
   30.0         .8542    .9542    .9815    .9568    .9147    .8874
   40.0         .9057   1.0200   1.0337    .9909    .9421    .9127
   50.0         .9407   1.0670   1.0669   1.0108    .9589    .9288
  100.0        1.0200   1.1872   1.1326   1.0477    .9926    .9629
  150.0        1.0489   1.2400   1.1518   1.0590   1.0033    .9750

 MMSE
    0.0         .8030    .8030    .8030    .8030    .8030    .8030
    1.0         .8145    .8165    .8184    .8203    .8220    .8228
    2.0         .8242    .8283    .8320    .8353    .8382    .8395
    4.0         .8400    .8479    .8546    .8600    .8643    .8660
    6.0         .8525    .8636    .8725    .8793    .8841    .8857
    8.0         .8627    .8766    .8870    .8945    .8994    .9008
   10.0         .8714    .8875    .8990    .9069    .9116    .9126
   15.0         .8885    .9083    .9211    .9290    .9329    .9331
   20.0         .9014    .9232    .9361    .9435    .9465    .9460
   25.0         .9117    .9345    .9468    .9535    .9559    .9549
   30.0         .9202    .9432    .9549    .9608    .9626    .9612
   40.0         .9334    .9559    .9659    .9706    .9716    .9698
   50.0         .9435    .9647    .9731    .9768    .9774    .9753
  100.0         .9713    .9855    .9889    .9898    .9893    .9871
  150.0         .9844    .9935    .9944    .9942    .9934    .9912

 AMMSE
    0.0         .3201    .3201    .3201    .3201    .3201    .3201
    1.0         .3783    .3783    .3779    .3772    .3762    .3755
    2.0         .4271    .4284    .4287    .4279    .4260    .4245
    4.0         .5047    .5107    .5139    .5139    .5103    .5068
    6.0         .5642    .5756    .5822    .5835    .5782    .5727
    8.0         .6116    .6282    .6382    .6406    .6338    .6259
   10.0         .6504    .6718    .6847    .6882    .6797    .6696
   15.0         .7229    .7542    .7725    .7772    .7648    .7493
   20.0         .7737    .8123    .8339    .8384    .8222    .8021
   25.0         .8115    .8556    .8789    .8825    .8628    .8388
   30.0         .8409    .8892    .9133    .9154    .8926    .8653
   40.0         .8837    .9380    .9622    .9609    .9327    .9005
   50.0         .9135    .9717    .9951    .9904    .9578    .9223
  100.0         .9855   1.0526   1.0705   1.0534   1.0077    .9650
  150.0        1.0143   1.0847   1.0988   1.0747   1.0226    .9781
The relative PMSEs of the SR, PSR, MMSE, and AMMSE estimators for $k_1 = 8$ and $n = 20$ are shown in Table 2. As in the case of $k_1 = 5$, Table 2 confirms Theorem 1. When $k_1 = 8$, the PSR estimator has the smallest PMSE over a wide region of the parameter space. Also, the AMMSE estimator has much smaller PMSE than the OLS estimator over a wide region of the parameter space. Though the maximums of the relative PMSEs of the PSR and AMMSE estimators are slightly larger than unity, their minimums are much smaller than unity. Also, the MMSE estimator has smaller PMSE than the OLS estimator for all the values of $\Phi_\beta$ and $R_1^2$ considered here, though the minimum of its PMSE is not so small.

These results are broadly similar to the numerical results in Ohtani (1998). In other words, the effect of his calculation error on the numerical results is not large.
REFERENCES

Abadir, K.M. (1999) An introduction to hypergeometric functions for economists. Econometric Reviews 18, 287–330.
Abramowitz, M. & I.A. Stegun (1972) Handbook of Mathematical Functions. New York: Dover Publications.
Baranchik, A.J. (1970) A family of minimax estimators of the mean of a multivariate normal distribution. Annals of Mathematical Statistics 41, 642–645.
Farebrother, R.W. (1975) The minimum mean square error linear estimator and ridge regression. Technometrics 17, 127–128.
Gourieroux, C. & A. Monfort (1995) Statistics and Econometric Models, vol. 1. New York: Cambridge University Press.
James, W. & C. Stein (1961) Estimation with quadratic loss. In Proceedings of the Fourth Berkeley Symposium on Mathematical Statistics and Probability, vol. 1, pp. 361–379. Berkeley: University of California Press.
Luke, Y.L. (1969) The Special Functions and Their Approximations, vol. 1. New York: Academic Press.
Mittelhammer, R.C. (1984) Restricted least squares, pre-test, OLS, and Stein rule estimators: Risk comparison under model misspecification. Journal of Econometrics 25, 151–164.
Ohtani, K. (1993) A comparison of the Stein-rule and positive-part Stein-rule estimators in a misspecified linear regression model. Econometric Theory 9, 668–679.
Ohtani, K. (1996) On an adjustment of degrees of freedom in the minimum mean squared error estimator. Communications in Statistics—Theory and Methods 25, 3049–3058.
Ohtani, K. (1998) MSE performance of the minimum mean squared error estimators in a linear regression model when relevant regressors are omitted. Journal of Statistical Computation and Simulation 61, 61–75.
Stein, C. (1956) Inadmissibility of the usual estimator for the mean of a multivariate normal distribution. In Proceedings of the Third Berkeley Symposium on Mathematical Statistics and Probability, pp. 197–206. Berkeley: University of California Press.
Theil, H. (1971) Principles of Econometrics. New York: Wiley.
APPENDIX

In this Appendix, we derive the formulae for $H(p, q; \alpha, c)$ and $J(p, q; \alpha, c)$. First, we derive the formula for $H(p, q; \alpha, c)$. The OLS estimator $b_1$ is distributed as $N(\beta_1 + S_{11}^{-1}S_{12}\beta_2,\ \sigma^2 S_{11}^{-1}) = N(S_{11}^{-1}X_1'X\beta,\ \sigma^2 S_{11}^{-1})$, where $X = [X_1, X_2]$, $\beta' = [\beta_1', \beta_2']$, and $S_{12} = X_1'X_2$. Thus, $u_1 = b_1'S_{11}b_1/\sigma^2$ follows the noncentral chi-square distribution with $k_1$ degrees of freedom and noncentrality parameter $\lambda_1 = \beta'X'X_1S_{11}^{-1}X_1'X\beta/\sigma^2$. Also, the residual vector $e_1$ is distributed as $N(M_1X\beta,\ \sigma^2 M_1)$, where $M_1 = I_n - X_1S_{11}^{-1}X_1'$. Thus, $u_2 = e_1'e_1/\sigma^2$ follows the noncentral chi-square distribution with $n_1$ degrees of freedom and noncentrality parameter $\lambda_2 = \beta'X'M_1X\beta/\sigma^2$. Furthermore, $u_1$ and $u_2$ are mutually independent.
Using $u_1$ and $u_2$, $H(p, q; \alpha, c)$ is expressed as

$$H(p, q; \alpha, c) = \sum_{i=0}^{\infty}\sum_{j=0}^{\infty} K_{ij}\iint_{R} u_1^{k_1/2 - p + q + i - 1}\, u_2^{n_1/2 + j - 1}(u_1 + \alpha u_2)^{p}\exp[-(u_1 + u_2)/2]\, du_2\, du_1, \tag{A.1}$$

where

$$K_{ij} = \frac{w_i(\lambda_1)\, w_j(\lambda_2)}{2^{(k_1 + n_1)/2 + i + j}\,\Gamma(k_1/2 + i)\,\Gamma(n_1/2 + j)}, \tag{A.2}$$

$w_i(\lambda) = \exp(-\lambda/2)(\lambda/2)^{i}/i!$, and $R$ is the region $\{u_1, u_2 \mid u_1/u_2 \ge k_1 c/n_1\}$.
Making use of the change of variables $v_1 = u_1/u_2$ and $v_2 = u_2$, (A.1) reduces to

$$\sum_{i=0}^{\infty}\sum_{j=0}^{\infty} K_{ij}\int_{k_1 c/n_1}^{\infty}\int_{0}^{\infty} v_1^{k_1/2 - p + q + i - 1}\, v_2^{(k_1 + n_1)/2 + q + i + j - 1}(\alpha + v_1)^{p}\exp[-v_2(1 + v_1)/2]\, dv_2\, dv_1. \tag{A.3}$$

Again, making use of the change of variable $z = (1 + v_1)v_2/2$, (A.3) reduces to

$$\sum_{i=0}^{\infty}\sum_{j=0}^{\infty} K_{ij}\, 2^{(k_1 + n_1)/2 + q + i + j}\,\Gamma((k_1 + n_1)/2 + q + i + j)\int_{k_1 c/n_1}^{\infty}\frac{v_1^{k_1/2 - p + q + i - 1}(\alpha + v_1)^{p}}{(1 + v_1)^{(k_1 + n_1)/2 + q + i + j}}\, dv_1. \tag{A.4}$$

Finally, making use of the change of variable $t = v_1/(1 + v_1)$ and performing some manipulations, we obtain (14) in the text.
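Because (11) defines $H(p, q; \alpha, c)$ as an expectation over the two independent noncentral chi-square variables, the series (14) can be cross-checked by simulation. The sketch below is an illustration added here (the parameter values are arbitrary) and relies on the $H$ function from the sketch following equation (16).

```python
# Monte Carlo cross-check of the series (14); requires H() from the sketch after equation (16).
import numpy as np

rng = np.random.default_rng(3)
k1, n1, lam1, lam2 = 5, 15, 4.0, 2.0
p, q, alpha, c = 2, 1, -3/17, 1.0

u1 = rng.noncentral_chisquare(k1, lam1, size=1_000_000)   # u1 ~ noncentral chi-square(k1, lambda_1)
u2 = rng.noncentral_chisquare(n1, lam2, size=1_000_000)   # u2 ~ noncentral chi-square(n1, lambda_2)
indicator = (n1 * u1) / (k1 * u2) >= c
mc = np.mean(indicator * ((u1 + alpha * u2) / u1)**p * u1**q)
print("Monte Carlo estimate of (11):", mc)
print("series formula (14)         :", H(p, q, alpha, c, k1, n1, lam1, lam2))
```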
Next, we derive the formula for $J(p, q; \alpha, c)$. Noting that $\partial\lambda_1/\partial\beta_1 = 2(S_{11}\beta_1 + S_{12}\beta_2)/\sigma^2$ and $\partial\lambda_2/\partial\beta_1 = 0$, and differentiating $H(p, q; \alpha, c)$ given in (14) with respect to $\beta_1$, we obtain

$$\begin{aligned}
\frac{\partial H(p, q; \alpha, c)}{\partial \beta_1}
&= \sum_{i=0}^{\infty}\sum_{j=0}^{\infty}\left[\frac{\partial w_i(\lambda_1)}{\partial\beta_1}\right] w_j(\lambda_2)\, G_{ij}(p, q; \alpha, c) \\
&= \sum_{i=0}^{\infty}\sum_{j=0}^{\infty}\left[\frac{S_{11}\beta_1 + S_{12}\beta_2}{\sigma^2}\right][-w_i(\lambda_1) + w_{i-1}(\lambda_1)]\, w_j(\lambda_2)\, G_{ij}(p, q; \alpha, c) \\
&= -\left[\frac{S_{11}\beta_1 + S_{12}\beta_2}{\sigma^2}\right] H(p, q; \alpha, c)
 + \left[\frac{S_{11}\beta_1 + S_{12}\beta_2}{\sigma^2}\right]\sum_{i=0}^{\infty}\sum_{j=0}^{\infty} w_i(\lambda_1)\, w_j(\lambda_2)\, G_{i+1, j}(p, q; \alpha, c).
\end{aligned} \tag{A.5}$$
Because $u_1 = b_1'S_{11}b_1/\sigma^2$ and $b_1 \sim N(S_{11}^{-1}X_1'X\beta,\ \sigma^2 S_{11}^{-1})$, $H(p, q; \alpha, c)$ can be expressed as

$$H(p, q; \alpha, c) = \iint_{R}\left(\frac{u_1 + \alpha u_2}{u_1}\right)^{p} u_1^{q}\, f_N(b_1)\, f_2(u_2)\, du_2\, db_1, \tag{A.6}$$

where $f_2(u_2)$ is the density function of $u_2$ and

$$f_N(b_1) = \frac{1}{(2\pi)^{k_1/2}\,|\sigma^2 S_{11}^{-1}|^{1/2}}\exp\left[-\frac{(b_1 - S_{11}^{-1}X_1'X\beta)'S_{11}(b_1 - S_{11}^{-1}X_1'X\beta)}{2\sigma^2}\right]. \tag{A.7}$$
Differentiating (A.6) with respect to $\beta_1$, we obtain

$$\begin{aligned}
\frac{\partial H(p, q; \alpha, c)}{\partial \beta_1}
&= \iint_{R}\left(\frac{u_1 + \alpha u_2}{u_1}\right)^{p} u_1^{q}\, f_N(b_1)\, f_2(u_2)\left[\frac{S_{11}b_1 - S_{11}\beta_1 - S_{12}\beta_2}{\sigma^2}\right] du_2\, db_1 \\
&= \frac{1}{\sigma^2}\, E\left[I\left(\frac{n_1 u_1}{k_1 u_2} \ge c\right)\left(\frac{u_1 + \alpha u_2}{u_1}\right)^{p} u_1^{q}\, S_{11} b_1\right]
 - \left[\frac{S_{11}\beta_1 + S_{12}\beta_2}{\sigma^2}\right] H(p, q; \alpha, c).
\end{aligned} \tag{A.8}$$

Equating (A.5) and (A.8), and premultiplying by $\beta'X'X_1S_{11}^{-1}$, we obtain (15) in the text.