
Ann. Inst. Statist. Math. Vol. 42, No. 4, 753-768 (1990)

BOOTSTRAP IN MOVING AVERAGE MODELS

ARUP BOSE*

Stat.-Math. Division, Indian Statistical Institute, 203, B. T. Road, Calcutta 700035, India and Department of Statistics, Purdue University, West Lafayette, IN 47907, U.S.A.

(Received December 24, 1987; revised February 13, 1990)

Abstract. We prove that the bootstrap principle works very well in moving average models, when the parameters satisfy the invertibility condition, by showing that the bootstrap approximation of the distribution of the parameter estimates is accurate to the order o(n^{-1/2}) a.s. Some simulation studies are also reported.

Key words and phrases: Moving average models, stationary autoregressions, Cramér's condition, Edgeworth expansions, empirical distribution function, bootstrap.

1. Introduction

The bootstrap procedure was introduced by Efron (1979, 1982). Since then there have been theoretical studies dealing with the accuracy of the bootstrap approximation in various senses (asymptotic normality, Edgeworth expansion, etc.). Some of the references are Bickel and Freedman (1980, 1981), Singh (1981), Beran (1982), Babu and Singh (1984) and Hall (1988). One class of results shows that in the i.i.d. situation, where the normal approximation holds with an error of O(n^{-1/2}), if we replace the normal distribution by a sample-dependent bootstrap distribution, then the error rate is o(n^{-1/2}) a.s.

The bootstrap does not give correct answers in general dependent models. However, some dependent models do allow for an appropriate resampling so that the bootstrap works. Freedman (1984) has shown that the bootstrap gives the correct asymptotic result for two stage least squares estimates in linear autoregressions with possible exogenous variables orthogonal to errors. Basawa et al. (1989) have proven the validity of the bootstrap in explosive first order autoregressions. Bose (1988a) has shown that the rate result alluded to in the i.i.d. situation holds for stationary autoregressions.

*Now at Indian Statistical Institute.



Here we deal with moving average models. The moment estimators of the parameters have an asymptotic normal distribution and the approximation error can be shown to be O(n^{-1/2}). The structure of the process enables appropriate resampling. We show that the bootstrap distribution approximates the distribution of the parameter estimates with an accuracy of o(n^{-1/2}) a.s. The idea is to develop a one-term Edgeworth expansion for the distribution of the parameter estimates and their bootstrapped version. The leading terms of these expansions match and the difference of the second terms is o(n^{-1/2}), yielding the desired result.

2. Preliminaries

Let (Y_t) be a process satisfying Y_t = e_t + Σ_{i=1}^{l} a_i e_{t-i}, where

(A1) (e_t) is i.i.d. ~ F_0, E(e_t) = 0, E(e_t^2) = 1, and E e_t^{2(s+1)} < ∞ for some s ≥ 3.

(A2) (e_1, e_1^2) satisfies Cramér's condition, i.e. for every d > 0 there exists δ > 0 such that

sup_{||t|| ≥ d} | E exp(it'(e_1, e_1^2)) | ≤ 1 − δ.

a_1, a_2,..., a_l are unknown parameters which can be estimated by moment estimates.

Remark 2.1. The assumption that the mean and variance of e_t are known has been made to keep the proofs simple. See Remark 3.2 for a discussion of how this assumption can be dropped. Cramér's condition is required to obtain Edgeworth expansions.

Remark 2.2. The minimum moment assumption we need is E e_t^8 < ∞, which may seem too strong. However, the estimates of the a_i's involve quadratic functions of e_t and we need the (s + 1)-th moment of e_t^2 with s at least 3. This is in contrast to the situation of i.i.d. observations, where the s-th moment suffices to derive the o(n^{-(s-2)/2}) expansion.

We first assume that l = 1, i.e. Y_t = e_t + a e_{t-1}. The moment estimate of a, given the observations Y_0, Y_1,..., Y_n, is a_n = n^{-1} Σ_{t=1}^n Y_t Y_{t-1}. Under our assumptions, a_n is strongly consistent and n^{1/2}(a_n − a) has an asymptotic normal distribution.
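As a quick numerical illustration (not part of the paper), the moment estimate and its consistency can be checked by simulation; the sketch below uses standard normal errors so that E(e_t) = 0 and E(e_t^2) = 1, and the function names are illustrative:

```python
import numpy as np

def simulate_ma1(a, n, rng):
    """Simulate Y_t = e_t + a * e_{t-1} with i.i.d. N(0,1) errors."""
    e = rng.standard_normal(n + 1)      # e_0, e_1, ..., e_n
    return e[1:] + a * e[:-1]           # Y_1, ..., Y_n

def moment_estimate(y):
    """a_n = n^{-1} * sum of Y_t * Y_{t-1}; consistent since E(Y_t Y_{t-1}) = a."""
    return np.sum(y[1:] * y[:-1]) / len(y)

rng = np.random.default_rng(0)
y = simulate_ma1(0.5, 200_000, rng)
a_n = moment_estimate(y)                # close to the true a = 0.5 for large n
```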

Define ẽ_i = Σ_{j=0}^{i-1} (−1)^j a^j Y_{i-j}, and ẽ_1 = Y_1. Using the structure of the process,


(2.1) ẽ_i = e_i − (−a)^i e_0.

Hence ẽ_i and e_i are close for all large i if |a| < 1, which shows that resampling is proper in this situation. (For l > 1, this condition should be replaced by the invertibility condition; see Hannan (1970).) So, motivated by (2.1), we compute the pseudo errors as

ê_{in} = Σ_{j=0}^{i-1} (−1)^j a_n^j Y_{i-j}, i = 2,..., n, ê_{1n} = Y_1.
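The pseudo errors above can be computed by the equivalent first-order recursion ê_i = Y_i − a_n ê_{i−1} obtained by inverting the MA(1) relation; a minimal sketch (function name illustrative, not from the paper):

```python
import numpy as np

def pseudo_errors(y, a_n):
    """e_hat[i] = sum_{j=0}^{i-1} (-1)^j a_n^j Y[i-j], computed via the
    equivalent recursion e_hat[i] = Y[i] - a_n * e_hat[i-1], e_hat[1] = Y[1]."""
    e_hat = np.empty(len(y))
    e_hat[0] = y[0]
    for i in range(1, len(y)):
        e_hat[i] = y[i] - a_n * e_hat[i - 1]
    return e_hat

# With the true a, equation (2.1) gives e_hat_i = e_i - (-a)^i e_0,
# so the reconstruction error decays geometrically when |a| < 1.
rng = np.random.default_rng(1)
e = rng.standard_normal(1001)           # e_0, ..., e_1000
a = 0.5
y = e[1:] + a * e[:-1]                  # Y_1, ..., Y_1000
e_hat = pseudo_errors(y, a)
```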

For ease of notation we will often drop the suffix n. Let G_n denote the empirical distribution function which puts mass n^{-1} at each ê_{in}, i = 1, 2,..., n. Let F̂_n(x) = G_n(x + ē_n), where ē_n = n^{-1} Σ_{i=1}^n ê_{in}. It is expected that F̂_n will be close to F_0 with increasing n. Take an i.i.d. sample (e_i*) from F̂_n and define

Y_i* = e_i* + a_n e*_{i-1}, i = 1,..., n,

dropping the suffix n.

Pretend that a_n is unknown and obtain its moment estimate

a_n* = n^{-1} Σ_{t=1}^n Y_t* Y*_{t-1}.

So the bootstrapped quantity corresponding to n^{1/2}(a_n − a) is n^{1/2}(a_n* − a_n). In the next section we will see how accurate the distribution of n^{1/2}(a_n* − a_n) (given Y_0, Y_1,..., Y_n) is in estimating the distribution of n^{1/2}(a_n − a) as n → ∞.
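The whole resampling scheme of this section can be sketched in a few lines of code; the following is an illustrative implementation (function names, seeds and Monte Carlo sizes are my own choices, not the paper's):

```python
import numpy as np

def bootstrap_ma1(y, a_n, n_boot, rng):
    """Return n_boot replicates of n^{1/2}(a*_n - a_n) for the MA(1)
    moment estimator, resampling from the centered pseudo errors."""
    n = len(y)
    e_hat = np.empty(n)                  # pseudo errors via e_hat[i] = Y[i] - a_n e_hat[i-1]
    e_hat[0] = y[0]
    for i in range(1, n):
        e_hat[i] = y[i] - a_n * e_hat[i - 1]
    e_hat -= e_hat.mean()                # F_hat_n puts mass 1/n at the centered residuals
    reps = np.empty(n_boot)
    for b in range(n_boot):
        e_star = rng.choice(e_hat, size=n + 2, replace=True)  # i.i.d. draws from F_hat_n
        y_star = e_star[1:] + a_n * e_star[:-1]               # Y*_0, Y*_1, ..., Y*_n
        a_star = np.sum(y_star[1:] * y_star[:-1]) / n         # moment estimate recomputed
        reps[b] = np.sqrt(n) * (a_star - a_n)
    return reps

rng = np.random.default_rng(2)
e = rng.standard_normal(501)
y = e[1:] + 0.5 * e[:-1]                 # a sample path with a = 0.5
a_n = np.sum(y[1:] * y[:-1]) / len(y)
reps = bootstrap_ma1(y, a_n, 1000, rng)  # approximates the law of n^{1/2}(a_n - a)
```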

Before discussing the main results, we introduce a few notations. C stands for a generic constant, which in probability arguments may depend on the particular point ω under consideration in the basic probability space. For a sequence of random vectors X_t, S_n = n^{-1/2} Σ_{t=1}^n X_t. G_n ⇒ G denotes that the distribution G_n converges weakly to G (G_n may be random). The function ψ_{n,s} represents the first (s − 1) terms of the Edgeworth expansion of the distribution of S_n, whenever such an expansion is valid. See Bhattacharya and Ranga Rao (1976, p. 145) for the definition of ψ_{n,s} when the X_t are i.i.d. Götze and Hipp (1983) may be consulted for a definition of ψ_{n,s} when the X_t are dependent. For any random vector X, D(X) denotes the dispersion matrix of X. β = (β_1,..., β_k) denotes a vector where each β_i is a nonnegative integer, and for f: R^k → R,

D^β f(x) = ∂^{|β|} f(x_1,..., x_k) / (∂x_1^{β_1} ... ∂x_k^{β_k}) and |β| = β_1 + β_2 + ... + β_k.


3. Main results

We first need some auxiliary results. Let F̃_n denote the empirical distribution function of ẽ_1,..., ẽ_n.

LEMMA 3.1. Under (A1), we have for all k ≤ 2(s + 1):

(a) n^{-1} Σ_{i=1}^n ẽ_i^k → E_{F_0}(e_1^k) and n^{-1} Σ_{i=1}^n ê_{in}^k → E_{F_0}(e_1^k),

(b) F̃_n ⇒ F_0 a.s. and F̂_n ⇒ F_0 a.s.

PROOF. Throughout the proof, arguments are for a fixed point ω in the basic probability space and hence all bounds, etc. depend on ω in general.

(a) To prove the first part, it is enough to show that n^{-1} Σ_{i=1}^n (e_i^k − ẽ_i^k) → 0 a.s. But n^{-1}(e_1^k − ẽ_1^k) → 0 trivially. Furthermore, by (2.1),

| n^{-1} Σ_{i=2}^n (e_i^k − ẽ_i^k) | ≤ n^{-1} Σ_{i=2}^n Σ_{j=0}^{k-1} C(k, j) |e_0|^{k-j} |e_i|^j |a|^{i(k-j)}.

It easily follows from Theorem 2.18 of Hall and Heyde (1980) that n^{-1} Σ_{i=2}^n |e_i|^j |a|^i → 0 for all j ≤ k − 1. To prove the second part, it suffices to show that n^{-1} Σ_{i=1}^n (ê_{in}^k − ẽ_i^k) → 0. Note that a_n → a and |a| < 1. Hence for all large n and for all j ≥ 1,

(3.1) |a| + |a_n − a| ≤ δ < 1 a.s.

and

(3.2) |a_n^j − a^j| = |(a_n − a + a)^j − a^j| ≤ C |a_n − a| j δ^j for some δ < 1.

Hence

n^{-1} Σ_{i=2}^n | ê_{in}^k − ẽ_i^k |

= n^{-1} Σ_{i=2}^n | ( Σ_{j=0}^{i-1} (−1)^j a_n^j Y_{i-j} )^k − ( Σ_{j=0}^{i-1} (−1)^j a^j Y_{i-j} )^k |

≤ n^{-1} Σ_{i=2}^n 2^{k-1} k ( Σ_{j=0}^{i-1} | a_n^j − a^j | | Y_{i-j} | ) ( Σ_{j=0}^{i-1} δ^j | Y_{i-j} | )^{k-1}

(by (3.1) and (3.2)). Since a_n → a a.s., it is enough to show that n^{-1} Σ_{i=2}^n ( Σ_{j=0}^{i-1} δ^j | Y_{i-j} | )^k is bounded a.s. The sequence Z_i = Σ_{j=0}^∞ δ^j | Y_{i-j} |, i ≥ 1, is a stationary autoregressive process of order one and hence is ergodic (see Hannan (1970), p. 204). So the second part of (a) follows from the observation

n^{-1} Σ_{i=2}^n ( Σ_{j=0}^{i-1} δ^j | Y_{i-j} | )^k ≤ n^{-1} Σ_{i=1}^n Z_i^k → E(Z_1^k) < ∞ a.s.

(b) Since (e_t) is i.i.d. F_0, the first part readily follows from (2.1). The second part follows from the following observation: if F_n and G_n are empirical distributions based on the n-tuples (x_1,..., x_n) and (y_1,..., y_n), then for all f such that f′ is bounded,

| E_{F_n}(f) − E_{G_n}(f) | ≤ n^{-1} Σ_{i=1}^n | f(x_i) − f(y_i) | ≤ ||f′||_∞ n^{-1} Σ_{i=1}^n | x_i − y_i |. □

To study the bootstrap approximation, we need an Edgeworth expansion for a_n. For this, we use a result of Götze and Hipp (1983) (henceforth referred to as GH).

Let (X_t) be R^k-valued random variables on (Ω, 𝒜, P). Suppose there exist σ-fields 𝒟_j (write σ(∪_j 𝒟_j) = 𝒟) and a constant a* > 0 such that the following conditions hold.

C(1) E X_t = 0 for all t.

C(2) E||X_t||^{s+1} ≤ M_{s+1} < ∞ for all t, for some s ≥ 3.

C(3) There exist Y_{t,m}, measurable with respect to σ(𝒟_j: |j − t| ≤ m), such that E||X_t − Y_{t,m}|| ≤ C exp(−a*m).

C(4) For all A ∈ σ(𝒟_j: j ≤ n) and B ∈ σ(𝒟_j: j ≥ n + m), |P(A ∩ B) − P(A)P(B)| ≤ C exp(−a*m).

C(5) There exist d, δ > 0 such that for all ||t|| ≥ d and all m, n, E| E( exp(it' Σ_{j=n-m}^{n+m} X_j) | 𝒟_j, j ≠ n ) | ≤ 1 − δ.

C(6) For all A ∈ σ(𝒟_j: |j − n| ≤ p) and all n, p, m, E| P(A | 𝒟_j, j ≠ n) − P(A | 𝒟_j, 0 < |j − n| ≤ m + p) | ≤ C exp(−a*m).

C(7) lim_{n→∞} D( n^{-1/2} Σ_{t=1}^n X_t ) = Σ exists and is positive definite.

Let s_0 be s or s − 1 according as s is even or odd, and let φ_Σ be the normal density with mean 0 and dispersion matrix Σ. The following results are due to Götze and Hipp (1983).

THEOREM 3.1. Let f: R^k → R be a measurable function such that |f(x)| ≤ M(1 + ||x||^{s_0}) for every x ∈ R^k. Assume that C(1)-C(7) hold. Then there exists a positive constant δ_0 not depending on f and M, and for any arbitrary k > 0 there exists a positive constant C depending on M but not on f, such that

| E f(S_n) − ∫ f dψ_{n,s} | ≤ C ω(f, n^{-k}) + o(n^{-(s-2+δ_0)/2}),

where ω(f, n^{-k}) = ∫ sup{ |f(x + y) − f(x)| : |y| ≤ n^{-k} } φ_Σ(x) dx. The term o(·) depends on f through M only.

COROLLARY 3.1. Assume C(1)-C(7). Then the following approximation holds uniformly over convex measurable C ⊆ R^k:

P(S_n ∈ C) = ψ_{n,s}(C) + o(n^{-(s-2)/2}).
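To see what a one-term Edgeworth correction buys over the plain normal approximation, here is a small illustration for the classical i.i.d. case; the formula Φ(x) − κ_3(x² − 1)φ(x)/(6√n) is the standard i.i.d. one-term expansion (not the dependent-case ψ_{n,s} of GH), and the Monte Carlo sizes are arbitrary:

```python
import math
import random

def edgeworth_cdf(x, n, kappa3):
    """One-term Edgeworth approximation to the cdf of a standardized i.i.d. mean:
    Phi(x) - kappa3 * (x^2 - 1) * phi(x) / (6 * sqrt(n))."""
    phi = math.exp(-x * x / 2) / math.sqrt(2 * math.pi)
    big_phi = 0.5 * (1 + math.erf(x / math.sqrt(2)))
    return big_phi - kappa3 * (x * x - 1) * phi / (6 * math.sqrt(n))

# Centered exp(1) summands have third cumulant kappa3 = 2; at x = 0 the normal
# approximation gives 0.5, while the Edgeworth term adds the O(n^{-1/2})
# skewness correction, which tracks the Monte Carlo estimate closely.
random.seed(0)
n, n_rep = 25, 100_000
hits = sum(
    sum(random.expovariate(1.0) for _ in range(n)) - n <= 0.0
    for _ in range(n_rep)
)
mc_estimate = hits / n_rep
corrected = edgeworth_cdf(0.0, n, 2.0)
```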

Let X_t = Y_t Y_{t-1} − a and let 𝒟_j be the σ-field generated by e_j. It can then be easily shown that X_t satisfies the conditions of the above theorem under (A1) and (A2). We omit the details (see Bose (1988b)). Thus we have the following proposition.

PROPOSITION 3.1. Assume that (A1) and (A2) hold. Let S_n = n^{-1/2} Σ_{t=1}^n (Y_t Y_{t-1} − a). Then

(a) Theorem 3.1 holds with the above S_n; consequently,

(b) P(S_n ∈ C) = ψ_{n,s}(C) + o(n^{-(s-2)/2}), uniformly over convex subsets of R.

We now develop an Edgeworth expansion for the bootstrapped version of the above S_n. In what follows we make the convention that the presence of (*) indicates that we are dealing with a bootstrapped quantity; as a result, expectations, etc. are taken with respect to (e_i*) i.i.d. F̂_n given Y_0, Y_1,..., Y_n.

Define

X_j* = Y_j* Y*_{j-1} − a_n, j ≥ 1, and H_n*(t) = E*[ exp( itn^{-1/2} Σ_{j=1}^n X_j* ) ].

We have the following lemmas. The proofs are only sketched and the details can be completed by arguments similar to those used by GH.

LEMMA 3.2. For all |t| ≤ Cn^{ε_0} and |β| ≤ s + 2, we have

| D^β (H_n*(t) − ψ̂*_{n,s}(t)) | ≤ C (1 + m*_{s+1,n}) (1 + |t|^{3(s-2)+|β|}) exp(−C|t|²) n^{-(s-2+ε_0)/2}

for some ε_0 < 1/2, where C depends on the bound of m*_{s+1,n} = the (s + 1)-th moment of X_1*. Here ψ̂*_{n,s} is the Fourier transform of ψ*_{n,s}.

The proof is exactly as the proof of Lemma 3.33 of GH and we omit it.

Let

I_1 = { t : Cn^{ε_0} ≤ |t| ≤ C_1 n^{1/2} } and I_2 = { t : C_1 n^{1/2} ≤ |t| ≤ ε^{-1} n^{1/2} },

where C_1 is to be chosen and 0 < ε < 1 is fixed.

LEMMA 3.3. Under (A1) and (A2), we have for almost every sequence (Y_i),

∫_{I_2} | D^β H_n*(t) | dt = o(n^{-(s-2)/2}).

PROOF. A careful look at the proof of Lemma 3.43 of GH shows that it suffices to show that E*| E*(A* | 𝒟_j*, j ≠ j_p) | < 1 uniformly in t ∈ I_2 and p = 1, 2,..., J, where 𝒟_j* = σ(e_j*) and A* = exp( itn^{-1/2} Σ_{j=j_p-m}^{j_p+m} X_j* ); see GH for the definitions of j_p and m. We omit the details of these definitions since they are not explicitly used in the sequel. It suffices to note that j_p is fixed and that the above expectation, which we denote by δ*_{nm}, is independent of m (see below).

Note that Σ_{j=j_p-m}^{j_p+m} X_j* = e*_{j_p}( Y*_{j_p-1} + a_n Y*_{j_p+2} + e*_{j_p+1} + a_n² e*_{j_p-1} ) + a_n e*_{j_p}² + V, where V is independent of e*_{j_p}. Let K_n* denote the distribution function of Y*_{j_p-1} + a_n Y*_{j_p+2} + e*_{j_p+1} + a_n² e*_{j_p-1}. Then

δ*_{nm} = ∫ | ∫ exp( itn^{-1/2} xy + itn^{-1/2} a_n x² ) dF̂_n(x) | dK_n*(y).

As t varies in I_2, (tn^{-1/2}, tn^{-1/2} a_n) varies in a compact set bounded away from zero. Let D denote any such set in R². Let b_1, b_2 > 0 (to be chosen). Then

δ*_{nm} ≤ K_n*(b_1 ≤ |y| ≤ b_2) I_{1n} + K_n*(|y| < b_1) + K_n*(|y| > b_2),

where

I_{1n} ≤ sup_{b_1 ≤ |y| ≤ b_2} sup_{(d_1, d_2) ∈ D} | ∫ exp( id_1 xy + id_2 x² ) dF̂_n(x) |.

By Lemma 3.1, K_n* ⇒ K a.s., where K is the distribution of Y_{j-1} + a Y_{j+2} + e_{j+1} + a² e_{j-1}, which is non-degenerate. Thus b_1 and b_2 can be chosen such that for all large n,

K_n*(|y| < b_1) + K_n*(|y| > b_2) ≤ a_0 < 1.

Note that F̂_n ⇒ F_0 a.s. and F_0 satisfies Cramér's condition. Since the convergence of characteristic functions to the limit is uniform over compact sets, we have I_{1n} ≤ 1 − γ < 1 for all large n. Thus δ*_{nm} ≤ (1 − γ) + a_0 γ < 1, proving the lemma. □

LEMMA 3.4. Under (A1) and (A2), for a sufficiently small C_1, we have for almost every sequence (Y_i) and |β| ≤ s + 2,

∫_{I_1} | D^β H_n*(t) | dt = o(n^{-(s-2)/2}).

PROOF. We proceed as for Lemma 3.3 but use a different estimate for E*| E*(A* | 𝒟_j*, j ≠ j_p) |. We have to deal with

δ*_{nm} = E*| E*( exp( itn^{-1/2}( e_n* Δ* + a_n e_n*² ) ) | 𝒟_j*, j ≠ n ) |

where Δ* = Y*_{n-1} + a_n Y*_{n+2} + e*_{n+1} + a_n² e*_{n-1}. A Taylor expansion of the conditional characteristic function gives

δ*_{nm} = E*| 1 − (t_n' D*(e_1*, e_1*²) t_n)/(2n) + (θ/6)( ||t_n||³/n^{3/2} ) E*||(e_1*, e_1*²)||³ |

where t_n = (tΔ*, t a_n) and |θ| ≤ 1. Thus

δ*_{nm} ≤ E*| 1 − t_n' D*(e_1*, e_1*²) t_n/(2n) | + ( E*||t_n||³/(6 n^{3/2}) ) μ*_{3n},

where

μ*_{3n} = E*||(e_1*, e_1*²)||³ → E||(e_1, e_1²)||³ a.s.,

E*(||t_n||³) ≤ |t|³ [ E*(Δ*² + a_n²)³ ]^{1/2}.

Note that E*(Δ*² + a_n²)³ → E(Δ² + a²)³ a.s., where Δ = Y_1 + a Y_4 + e_3 + a² e_1. Thus, since |t| ≤ C_1 n^{1/2} on I_1,

( E*||t_n||³/(6 n^{3/2}) ) μ*_{3n} ≤ C |t|³/n^{3/2} ≤ C C_1 ||t||²/n a.s.

Let λ̄(A) and λ̲(A) denote, respectively, the maximum and minimum eigenvalues of A. Denote Σ = D(e_1, e_1²) and Σ_n* = D*(e_1*, e_1*²). Note that λ̄(Σ_n*) → λ̄(Σ) > 0 a.s. and λ̲(Σ_n*) → λ̲(Σ) > 0 a.s. (by Lemma 3.1). Then we have the following relations:

E*| 1 − t_n' Σ_n* t_n/(2n) | ≤ [ E*( 1 − t_n' Σ_n* t_n/(2n) )² ]^{1/2} ≤ [ 1 − E*( t_n' Σ_n* t_n/n ) + λ̄²(Σ_n*) E*( ||t_n||⁴/(4n²) ) ]^{1/2},

λ̄²(Σ_n*) E*( ||t_n||⁴/(4n²) ) ≤ C C_1 ||t||²/n a.s.,

E*( t_n' Σ_n* t_n/n ) ≥ λ̲(Σ_n*) E*( ||t_n||²/n ) ≥ C λ̲(Σ_n*) ||t||²/n a.s.

By combining these estimates and choosing C_1 sufficiently small,

δ*_{nm} ≤ 1 − γ||t||²/n ≤ exp( −γ||t||²/n ) for some γ > 0 a.s.

A look at the proof of Lemma 3.43 of GH shows that this proves the lemma. □

From Lemmas 3.2-3.4 and Lemma 1 of Babu and Singh (1984) (henceforth referred to as BS), we have the following theorems for the cases l = 1 and l > 1, respectively.

THEOREM 3.2. Assume (A1), (A2) and |a| < 1. Suppose f: R → R is such that |f(x)| ≤ M(1 + |x|²). Let σ_n*² = E*(S_n*²). For a.e. Y_0, Y_1,... and uniformly over x ∈ R,

(a) | E* f(S_n*) − ∫ f dψ*_{n,3} | ≤ C ω(f, n^{-k}, σ_n*²) + o(n^{-1/2}),

(b) P*( σ_n*^{-1} S_n* ≤ x ) = ∫_{-∞}^{x} dψ*_{n,3}(σ_n* y) + o(n^{-1/2}) = P( σ^{-1} S_n ≤ x ) + o(n^{-1/2}).

We omit the proof. See BS for a proof in the i.i.d. case.

THEOREM 3.3. Let H be a function from R^l → R which is thrice continuously differentiable in a neighbourhood of β = (β_1,..., β_l). Let h denote the vector of first order partial derivatives of H at β. Assume h ≠ 0 and that (a_i) satisfies the invertibility condition. Let

T_n = n^{1/2} { H( n^{-1} Σ_{k=1}^n Y_k Y_{k-1},..., n^{-1} Σ_{k=1}^n Y_k Y_{k-l} ) − H(β_1,..., β_l) },

T_n* = n^{1/2} { H( n^{-1} Σ_{k=1}^n Y_k* Y*_{k-1},..., n^{-1} Σ_{k=1}^n Y_k* Y*_{k-l} ) − H(β*_{1n},..., β*_{ln}) },

β_i = E( Y_k Y_{k-i} ), β*_{in} = E*( Y_k* Y*_{k-i} ), i = 1,..., l,

σ² = h' Σ h and σ_n*² = h' Σ_n* h,

where Σ and Σ_n* are the limiting dispersion matrices of the corresponding vectors of centered sample autocovariances. Then

sup_x | P( σ^{-1} T_n ≤ x ) − P*( σ_n*^{-1} T_n* ≤ x ) | = o(n^{-1/2}) a.s.

PROOF. Proposition 3.1 and Lemmas 3.2-3.4 remain valid, respectively, for

( n^{-1/2} Σ_{k=1}^n ( Y_k Y_{k-i} − β_i ), i = 1,..., l )

and its bootstrapped version. Arguments analogous to Theorem 3 and Corollary 2 of BS yield the theorem. We omit the details, which involve a Taylor expansion of H and a change of variable formula. □

The above result is true for vector-valued H with proper modifications, since Theorem 3 and Corollary 2 of BS remain true for such functions. The estimates of a_1,..., a_l in a general MA model are smooth functions of n^{-1} Σ_{j=1}^n Y_j Y_{j-i}, i = 1,..., l. Hence Theorem 3.3 can be utilized to prove the results for these parameter estimates.

THEOREM 3.4. Under the assumptions (A1) and (A2), for a.e. Y_0, Y_1,...:

(a) Let l = 1 and |a| < 1. Let σ² and σ_n² be, respectively, the limiting variance of n^{1/2}(a_n − a) and the variance of n^{1/2}(a_n* − a_n) (given Y_0, Y_1,..., Y_n). Then

sup_x | P( n^{1/2}(a_n − a)/σ ≤ x ) − P*( n^{1/2}(a_n* − a_n)/σ_n ≤ x ) | = o(n^{-1/2}).

(b) Let l ≥ 2 and (a_i) satisfy the invertibility condition. Let G_n be the distribution function of Σ^{-1/2} n^{1/2}( a_{1n} − a_1,..., a_{ln} − a_l ), where Σ is the limiting covariance matrix of n^{1/2}( a_{1n} − a_1,..., a_{ln} − a_l ). Let G_n* be the corresponding bootstrapped version. Then

sup_{x ∈ R^l} | G_n(x) − G_n*(x) | = o(n^{-1/2}).

PROOF. The case l = 1 is Theorem 3.2. For l = 2, the moment equations are

a_{2n} = n^{-1} Σ_{t=1}^n Y_t Y_{t-2} and a_{1n}(1 + a_{2n}) = n^{-1} Σ_{t=1}^n Y_t Y_{t-1}.

Thus

( a_{1n} − a_1, a_{2n} − a_2 ) = ( ( λ_{1n} + a_1(1 + a_2) )/( 1 + a_2 + λ_{2n} ) − a_1, λ_{2n} ),

where

λ_{1n} = n^{-1} Σ_{t=1}^n ( Y_t Y_{t-1} − β_1 ) and λ_{2n} = n^{-1} Σ_{t=1}^n ( Y_t Y_{t-2} − β_2 ).

Now the result follows from the multidimensional version of Theorem 3.3. The idea of the proof for a general l is clear from what we have shown. However, solving for the estimates a_{1n},..., a_{ln} becomes increasingly difficult as l increases. □
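The l = 2 moment equations above can be solved directly from the sample autocovariances; a small sketch on simulated data (names illustrative, not from the paper):

```python
import numpy as np

def ma2_moment_estimates(y):
    """Solve a_2n = n^{-1} sum Y_t Y_{t-2} and a_1n (1 + a_2n) = n^{-1} sum Y_t Y_{t-1}."""
    n = len(y)
    g1 = np.sum(y[1:] * y[:-1]) / n      # estimates beta_1 = a_1 (1 + a_2)
    g2 = np.sum(y[2:] * y[:-2]) / n      # estimates beta_2 = a_2
    a2 = g2
    a1 = g1 / (1 + a2)
    return a1, a2

rng = np.random.default_rng(3)
e = rng.standard_normal(200_002)
y = e[2:] + 0.5 * e[1:-1] + 0.3 * e[:-2]   # Y_t = e_t + a_1 e_{t-1} + a_2 e_{t-2}
a1_hat, a2_hat = ma2_moment_estimates(y)   # near the true (0.5, 0.3)
```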

Remark 3.1. For i.i.d. observations, Hall (1988) has shown that error rates of O(n^{-1}) can be achieved for quantile estimates. This is based on an O(n^{-1}) expansion of the bootstrap statistic. Abramovitch and Singh (1985) have shown that an error rate of o(n^{-(s-2)/2}), s ≥ 3, can be obtained for the cdf of a modified bootstrap statistic, provided that a sufficiently high order Edgeworth expansion is valid for the bootstrap statistic. Our attempts to derive O(n^{-1}) results in the present context have not been successful, since we have not yet been able to prove a higher order Edgeworth expansion for the bootstrap distribution.

Remark 3.2. The assumption that (e_t) has mean 0 and variance 1 was imposed to keep the proofs simpler. We sketch below how the case E e_t = μ, E e_t² = σ² (both μ and σ² unknown) can be tackled. We illustrate the case l = 1 only.

The model in this case is Y_t = μ + e_t + a e_{t-1}, where (A1) and (A2) hold but E e_t² = σ² > 0. Under the assumptions (A1) and (A2), the Edgeworth expansion is valid for the distribution of

(3.3) n^{-1/2} ( Σ_{t=1}^n (Y_t − γ_1), Σ_{t=1}^n (Y_t Y_{t-1} − γ_2), Σ_{t=1}^n (Y_t² − γ_3) ),

where γ_1 = E Y_t, γ_2 = E Y_t Y_{t-1} and γ_3 = E Y_t².

The estimates μ̂_n, â_n and σ̂_n² of μ, a and σ² are obtained by solving the following moment equations:

n^{-1} Σ_{t=1}^n Y_t = μ̂_n, n^{-1} Σ_{t=1}^n Y_t Y_{t-1} = μ̂_n² + â_n σ̂_n² and n^{-1} Σ_{t=1}^n Y_t² = μ̂_n² + (1 + â_n²) σ̂_n².

Hence

μ̂_n = n^{-1} Σ_{t=1}^n Y_t, σ̂_n² = [ ȳ_3 + ( ȳ_3² − 4 ȳ_2² )^{1/2} ]/2 and â_n = ȳ_2/σ̂_n²,

where

ȳ_2 = n^{-1} Σ_{t=1}^n Y_t Y_{t-1} − μ̂_n² and ȳ_3 = n^{-1} Σ_{t=1}^n Y_t² − μ̂_n².

Thus all these estimates are smooth functions of Σ_{t=1}^n Y_t, Σ_{t=1}^n Y_t Y_{t-1} and Σ_{t=1}^n Y_t².

Hence for a normalizing factor β_0, the distribution of n^{1/2} β_0 (â_n − a) admits an Edgeworth expansion of order o(n^{-1/2}), with leading term Φ(x) and with the coefficients in the second term (which is O(n^{-1/2})) being smooth functions of a, μ and σ², and of moments of Y_t, Y_t Y_{t-1} and Y_t² of order less than or equal to three. β_0 can be explicitly calculated and depends on a, μ and moments of e_1. The empirical distribution is computed as before; the only difference is that the Y_t's are now replaced by Y_t − μ̂_n. As in the case μ = 0, σ² = 1, an asymptotic expansion is valid for the bootstrapped version of (3.3), which yields an expansion of order o(n^{-1/2}) for the distribution of n^{1/2} β_n* (â_n* − â_n), where β_n* is the bootstrap equivalent of β_0. The leading term in this expansion is also Φ(x) and the polynomial involved in the second term is of the same form as that in the expansion of n^{1/2} β_0 (â_n − a). By the ergodic theorem, the empirical moments of Y_t, Y_t Y_{t-1} and Y_t² converge to the true moments a.s., and hence â_n, μ̂_n and σ̂_n are strongly consistent estimates of a, μ and σ, respectively. Thus the difference between the two expansions is o(n^{-1/2}) a.s.
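The moment equations of this remark can be solved explicitly; the sketch below takes the root of σ̂⁴ − ȳ_3 σ̂² + ȳ_2² = 0 that corresponds to the invertible model (names illustrative, simulated data):

```python
import numpy as np

def ma1_moment_estimates(y):
    """mu_hat = sample mean; with y2 = n^{-1} sum Y_t Y_{t-1} - mu_hat^2 and
    y3 = n^{-1} sum Y_t^2 - mu_hat^2, the quadratic s2^2 - y3 * s2 + y2^2 = 0
    gives s2 = (y3 + sqrt(y3^2 - 4 y2^2)) / 2 (the root with |a_hat| < 1),
    and a_hat = y2 / s2."""
    n = len(y)
    mu = y.mean()
    y2 = np.sum(y[1:] * y[:-1]) / n - mu ** 2
    y3 = np.sum(y * y) / n - mu ** 2
    disc = max(y3 * y3 - 4 * y2 * y2, 0.0)   # equals (sigma^2 (1 - a^2))^2 in the limit
    s2 = (y3 + disc ** 0.5) / 2
    return mu, y2 / s2, s2

rng = np.random.default_rng(4)
e = 2.0 * rng.standard_normal(300_001)       # sigma = 2, so sigma^2 = 4
y = 1.0 + e[1:] + 0.5 * e[:-1]               # mu = 1, a = 0.5
mu_hat, a_hat, s2_hat = ma1_moment_estimates(y)
```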

Page 13: Bootstrap in moving average models

BOOTSTRAP IN MOVING AVERAGE MODELS 765

4. Simulations

It is interesting to see how the bootstrap performs in small samples. The accuracy is expected to decrease as the parameter values move towards the boundary (for l = 1, as |a| → 1). A small simulation study for the moving average model with l = 1 was done. We also simulated the autoregressive process

Y_t = θ Y_{t-1} + e_t, |θ| < 1,

where θ is estimated by the least squares method. The rate of o(n^{-1/2}) is also valid for this situation, as was shown in Bose (1988a). See Bose (1988a) for the details of bootstrapping the distribution of the least squares estimates.

For both the MA and AR models, we generated the e_t's from the N(0, 1) and centered exp(1) densities. The parameter values were set at a = 0.9 and θ = 0.9 and a series of size n = 100 was generated. The distribution of the estimator, standardized by its true mean and true limiting variance, was approximated by using 1000 replications of the series. The first set of n = 100 observations was used to estimate the residuals and generate the bootstrap distribution. The bootstrap distribution was approximated by using 5000 repetitions for the AR case and 10,000 repetitions for the MA case.
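A minimal version of the AR part of this design (centered exp(1) innovations, least squares estimation) might look as follows; this is a sketch of the setup, not the author's code, and the seed and replication counts are arbitrary:

```python
import numpy as np

def ar1_ls(theta, n, errs):
    """Generate Y_t = theta * Y_{t-1} + e_t from Y_0 = 0 and return the least
    squares estimate sum(Y_t Y_{t-1}) / sum(Y_{t-1}^2)."""
    y = np.empty(n + 1)
    y[0] = 0.0
    for t in range(1, n + 1):
        y[t] = theta * y[t - 1] + errs[t - 1]
    return np.sum(y[1:] * y[:-1]) / np.sum(y[:-1] ** 2)

rng = np.random.default_rng(5)
theta, n, n_rep = 0.9, 100, 1000
# centered exp(1) innovations, as in the study
ests = [ar1_ls(theta, n, rng.exponential(1.0, n) - 1.0) for _ in range(n_rep)]
```

With θ = 0.9 and n = 100 the least squares estimate is noticeably biased downward, which is one reason the sampling distribution in Fig. 1 deviates from the standard normal.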

The true (approximate) distribution, the bootstrap distribution and the standard normal distribution are shown in each case in Figs. 1(a)-(d). It is evident that the bootstrap works very well in the AR case and reasonably well in the MA case. Similar results were seen to hold for other parameter values. In fact, the bootstrap does better as we move away from the boundary values ±1. In the AR case, the sampling distributions for the normal and the exponential are close. This robustness is absent in the MA model. The bootstrap captures this deviation to some extent. However, it is not clear why the bootstrap performs better for the exponential than for the normal. This is an interesting topic for further research. Apparently the skewness of the underlying distribution, together with the structure of the model, is affecting the performance of the bootstrap in some way.

See Chatterjee (1985) for some more simulation studies. The study of the behavior of the bootstrap in other complicated time series models is still open. The author is currently working on the bootstrap in the class of nonlinear autoregressive models.

[Figure 1 appears here.]

Fig. 1. Histogram of estimator in bootstrap study. Each panel shows the (estimated) true distribution, the bootstrap approximation and the standard normal approximation: (a) normal AR model, θ = 0.90, N = 100; (b) exponential AR model, θ = 0.90, N = 100; (c) normal MA model, a = 0.90, N = 100; (d) exponential MA model, a = 0.90, N = 100.


Acknowledgements

The author is grateful to Professor G. J. Babu of Pennsylvania State University for his help and suggestions. The contents of this paper are part of the author's Ph.D. thesis written under his guidance at the Indian Statistical Institute. He is also grateful to anonymous referees who read an earlier version of the paper. The three referees of the current version have all made very valuable suggestions. These suggestions have greatly benefited this paper. The author is also extremely grateful to Dr. Tai Fang Lu of Purdue University for her generous help in the simulation study.

REFERENCES

Abramovitch, L. and Singh, K. (1985). Edgeworth corrected pivotal statistics and the bootstrap, Ann. Statist., 13, 116-132.

Babu, G. J. and Singh, K. (1984). On one term Edgeworth correction by Efron's bootstrap, Sankhyā Ser. A, 46, 219-232.

Basawa, I. V., Mallik, A. K., McCormick, W. P. and Taylor, R. L. (1989). Bootstrapping explosive autoregressive processes, Ann. Statist., 17, 1479-1486.

Beran, R. (1982). Estimated sampling distribution: the bootstrap and competitors, Ann. Statist., 10, 212-225.

Bhattacharya, R. N. and Ranga Rao, R. (1976). Normal Approximation and Asymptotic Expansions, Wiley, New York.

Bickel, P. J. and Freedman, D. (1980). On Edgeworth expansion for the bootstrap, Preprint, University of California, Berkeley.

Bickel, P. J. and Freedman, D. (1981). Some asymptotic theory for the bootstrap, Ann. Statist., 9, 1196-1217.

Bose, A. (1988a). Edgeworth correction by bootstrap in autoregressions, Ann. Statist., 16, 1709-1722.

Bose, A. (1988b). Higher order approximations for auto-covariances from linear processes with applications, Statistics, 19, 259-269.

Chatterjee, S. (1985). Bootstrapping ARMA models: some simulations, Preprint, College of Business Administration, Northeastern University, Boston.

Efron, B. (1979). Bootstrap methods: another look at the jackknife, Ann. Statist., 7, 1-26.

Efron, B. (1982). The Jackknife, the Bootstrap and Other Resampling Plans, CBMS-NSF Regional Conference Series in Applied Mathematics, 38, SIAM, Philadelphia.

Freedman, D. (1984). On bootstrapping two-stage least squares estimates in stationary linear models, Ann. Statist., 12, 827-842.

Götze, F. and Hipp, C. (1983). Asymptotic expansions for sums of weakly dependent random vectors, Z. Wahrsch. Verw. Gebiete, 64, 211-239.

Hall, P. (1988). Theoretical comparison of bootstrap confidence intervals, Ann. Statist., 16, 927-952.

Hall, P. and Heyde, C. C. (1980). Martingale Limit Theory and Its Application, Academic Press, New York.

Hannan, E. J. (1970). Multiple Time Series, Wiley, New York.

Singh, K. (1981). On asymptotic accuracy of Efron's bootstrap, Ann. Statist., 9, 1187-1195.