ASYMPTOTIC PROPERTIES OF GAUSSIAN PROCESSES

by

Clifford Qualls¹ and Hisao Watanabe²

Department of Statistics
University of North Carolina at Chapel Hill

Institute of Statistics Mimeo Series No. 736

February, 1971

1. Research supported in part by the Office of Naval Research under Contract No. N00014-67-A-0321-0002. Visiting from University of New Mexico.
2. Research supported in part by the National Science Foundation under Grant No. GU-2059. Visiting from Kyushu University (Japan).



0. Introduction. Let {X(t), -∞ < t < ∞} be a real separable Gaussian process defined on a probability space (Ω, A, P). We assume EX(t) = 0, v²(t) = EX²(t) > 0, and that the covariance function r(t,s) = E(X(t)X(s)) is continuous with respect to t and s. We set ρ(t,s) = r(t,s)/(v(t)v(s)). In this paper, we treat two problems concerning Gaussian processes whose correlation functions satisfy

(0.1)    ρ(t,s) = 1 - |t-s|^α H(|t-s|) + o(|t-s|^α H(|t-s|))   as |t-s| → 0,

where 0 ≤ α ≤ 2 and H varies slowly at zero.

In §1, we relate the magnitude of the tails of the spectral function to the order of continuity of the correlation function. As a consequence, we can show directly the equivalence (in very useful cases) of the Kahane-Nisio condition and the Fernique-Marcus and Shepp condition, which are necessary and sufficient conditions for X(t) to have continuous sample functions in terms of the spectral function and the correlation function, respectively.

The second problem (§2) is to extend a result of Pickands [11], which gives the asymptotic distribution of the maximum Z(t) = max_{0≤s≤t} X(s), to condition (0.1) with 0 < α ≤ 2. Pickands treated stationary Gaussian processes whose covariance function ρ(|t-s|) = ρ(t,s) satisfies condition (0.1) with 0 < α ≤ 2 for H(|t-s|) a constant. Such studies have been done for Hölder continuity of sample functions by Marcus [8], Kôno [6], and Sirao and Watanabe [13]. Our



efforts using Pickands' methods to give the asymptotic distribution of Z(t) for the case α = 0 were not successful. (This is not too surprising.)

In §3, we use the result of §2 to obtain the extension of the 0-1 behavior treated in Watanabe [15] and Qualls and Watanabe [12] to our present case. The method in [12] turns out to be powerful, while the method in [15] does not seem to be successful in our case. Section 4 treats the non-stationary case; in §2 and §3, we assume stationarity.

1. Existence of stationary Gaussian processes. We list some definitions and properties of regularly varying functions that will be required in this and following sections. Some general references on regular variation are Karamata [5], Adamovic [1], and Feller [2].

Definition 1.1. A positive function H(x) defined for x > 0 varies slowly at infinity (at zero) if, for all t > 0,

(1.1)    lim_{x→∞ (x→0)} H(tx)/H(x) = 1.

Definition 1.2. A positive function Q(x) defined for x > 0 varies regularly at infinity (at zero) with exponent a ≠ 0 if, for all t > 0,

(1.2)    lim_{x→∞ (x→0)} Q(tx)/Q(x) = t^a.

A function Q(x) satisfies (1.2) if and only if Q(x) = x^a H(x), where H(x) varies slowly. Also, H(x) varies slowly at infinity if and only if H(1/x) varies slowly at zero.
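Definitions 1.1 and 1.2 are easy to check numerically; the sketch below uses H = log (slowly varying at infinity) and exponent 1.5, both arbitrary illustrative choices not taken from the text.

```python
import math

# Definition 1.1 in action: H(x) = log x varies slowly at infinity,
# and Q(x) = x**1.5 * log x varies regularly with exponent 1.5.
H = lambda x: math.log(x)
Q = lambda x: x ** 1.5 * H(x)

t = 7.0
devs = [abs(H(t * x) / H(x) - 1.0) for x in (1e4, 1e8, 1e12)]
assert devs[0] > devs[1] > devs[2] and devs[2] < 0.1      # H(tx)/H(x) -> 1

x = 1e12
assert abs(Q(t * x) / Q(x) / t ** 1.5 - 1.0) < 0.1        # Q(tx)/Q(x) -> t^1.5
```

The deviation of H(tx)/H(x) from 1 shrinks like log t / log x, which is why the convergence is slow but monotone along the chosen grid.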

Let Q(x) vary regularly with exponent a ≠ 0 and H(x) vary slowly at infinity. Then the following properties hold.

Page 4: 2. No .--GU..2059.---Visiting!romKyus'hu-lJnl,versity ... asymptotic distribution of the maximum Z(t) =maxOSsSt X(s) to condition (0.1) with ° < a S 2. Pickands treated stationary

(1.3) The limits (1.1) and (1.2) converge uniformly in t on any compact subset of the half line (0,∞).

(1.4) lim_{x→∞} x^{-ε} H(x) = 0 and lim_{x→∞} x^{ε} H(x) = ∞ for every ε > 0.

(1.5) The function H(x) varies slowly if and only if

    H(x) = a(x) exp{∫_b^x (ε(t)/t) dt}

for some b > 0, where ε(x) → 0 and a(x) → A as x → ∞ (0 < A < ∞).

Definition 1.3. The slowly varying function H(x) is said to be "normalized" if a(x) ≡ A in property (1.5) above.

(1.6) If H(x) is a "normalized" slowly varying function at infinity, then for any ε > 0 there exists K ≥ 1 such that

    t^ε ≤ H(tx)/H(x) ≤ t^{-ε}

for all x > 0 and all positive t < 1 such that tx ≥ K.

(1.7) If H(x) is a "normalized" slowly varying function at zero, then for any ε > 0 there exists δ > 0 such that

    t^{-ε} ≤ H(tx)/H(x) ≤ t^ε

for all x > 0 and all t > 1 such that tx < δ.

(1.8) If H(x) is a "normalized" slowly varying function at infinity, then for any a > 0 the function x^{-a}H(x) is eventually monotone decreasing.

(1.9) If H(x) varies slowly at infinity, then

    x^{p+1}H(x) / ∫_0^x y^p H(y) dy → p + 1   for p + 1 ≥ 0,



and

    |p+1| x^{p+1}H(x) / ∫_x^∞ y^p H(y) dy → 1   for p + 1 < 0.

Moreover, if ∫_x^∞ (H(y)/y) dy exists, then it varies slowly at infinity and H(x) / ∫_x^∞ (H(y)/y) dy → 0.
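Property (1.9) can be sanity-checked numerically. The choices H = log and p = 1 below are arbitrary; for them the inner integral has the closed form ∫_0^x y log y dy = (x²/2)(log x - 1/2), so the ratio should tend to p + 1 = 2.

```python
import math

# property (1.9) with H = log and p = 1: the ratio
#   x^(p+1) H(x) / int_0^x y^p H(y) dy
# tends to p + 1 = 2, using the closed-form integral above.
def ratio(x):
    return x ** 2 * math.log(x) / ((x ** 2 / 2.0) * (math.log(x) - 0.5))

devs = [abs(ratio(x) - 2.0) for x in (1e4, 1e8, 1e16)]
assert devs[0] > devs[1] > devs[2] and devs[2] < 0.03
```

The error decays like 1/log x, consistent with the slow variation of H.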

Theorem 1.1. Let H(x) vary slowly at infinity. Let f be a positive even function with ∫_{-∞}^∞ f(x) dx = 1 satisfying the following condition: there exist positive constants 0 < α < 2, C₁, C₂, and K such that

    C₁ H(x)/x^{α+1} ≤ f(x) ≤ C₂ H(x)/x^{α+1}   for x ≥ K.

Then, if we set ρ(t) = ∫_{-∞}^∞ e^{itx} f(x) dx and σ²(h) = 2(1-ρ(h)), we have

(1.10)    C₁ ∫_0^∞ (8 sin²(x/2)/x^{α+1}) dx ≤ lim inf_{h→0} σ²(h)/(|h|^α H(1/|h|))
              ≤ lim sup_{h→0} σ²(h)/(|h|^α H(1/|h|)) ≤ C₂ ∫_0^∞ (8 sin²(x/2)/x^{α+1}) dx.

Remark 1.1. The ρ(t) in Theorem 1.1 is a covariance function of a stationary Gaussian process.
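As an illustration of Theorem 1.1 and Remark 1.1 (our own example, not from the text): the standard Cauchy density f(x) = 1/(π(1+x²)) has tails ~ (1/π)x^{-2}, i.e. α = 1 and H ≡ 1/π with C₁ = C₂ = 1 for large x, and its characteristic function is ρ(t) = e^{-|t|}. Both sides of (1.10) are then computable, since ∫_0^∞ 8 sin²(x/2)/x² dx = 2π.

```python
import math

# rho(t) = exp(-|t|) is the characteristic function of the Cauchy density
# f(x) = 1/(pi(1+x^2)) ~ (1/pi)/x^2, i.e. alpha = 1, H == 1/pi, C1 = C2 = 1.
def sigma2(h):
    return 2.0 * (1.0 - math.exp(-h))

h = 1e-6
ratio = sigma2(h) / (h * (1.0 / math.pi))   # should approach 2*pi
assert abs(ratio - 2.0 * math.pi) < 1e-3

# the constant in (1.10): int_0^inf 8 sin^2(x/2)/x^2 dx = 2*pi;
# midpoint Riemann sum on (0, 200]; the tail beyond 200 is at most 8/200.
n, upper = 400_000, 200.0
dx = upper / n
total = 0.0
for i in range(n):
    x = (i + 0.5) * dx
    total += 8.0 * math.sin(x / 2.0) ** 2 / x ** 2 * dx
assert abs(total - 2.0 * math.pi) < 0.05
```

Here σ²(h) = 2(1-e^{-h}) ~ 2h, so the ratio σ²(h)/(h·(1/π)) → 2π, matching the integral constant exactly because C₁ = C₂.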

Proof. Without loss of generality, we take H(x) to be "normalized". First, we note that

    σ²(h) = 2(1-ρ(h)) = 8 ∫_0^∞ sin²(hx/2) f(x) dx
          = 8 ∫_0^{K₁} sin²(hx/2) f(x) dx + 8 ∫_{K₁}^∞ sin²(hx/2) f(x) dx ≡ I₁(h) + I₂(h)

for arbitrary K₁ ≥ K. We may ignore I₁(h) in Theorem 1.1, since lim_{h→0} I₁(h)/h² = 2 ∫_0^{K₁} x² f(x) dx < ∞.

Now we estimate I₂(h) by C₁ J(h) ≤ I₂(h) ≤ C₂ J(h) (K₁ ≥ K), where J(h) = 8 ∫_{K₁}^∞ sin²(hx/2) x^{-(α+1)} H(x) dx. By a change of variables,

    J(h) = h^α ∫_{hK₁}^∞ 8 sin²(x/2) (H(x/h)/x^{α+1}) dx

(1.11)     = h^α [∫_{hK₁}^δ + ∫_δ^{δ'} + ∫_{δ'}^∞] 8 sin²(x/2) (H(x/h)/x^{α+1}) dx   (say),

where hK₁ < δ < δ' < ∞.

Since lim_{h→0} H(x/h)/H(1/h) = 1 uniformly with respect to x in each finite interval (δ,δ'),

    ∫_δ^{δ'} (H(x/h) - H(1/h)) (8 sin²(x/2)/x^{α+1}) dx = o(H(1/h))   as h → 0.

For the first integral of (1.11), we use property (1.6) to obtain, for δ < 1, the bound H(x/h) ≤ x^{-ε₁} H(1/h), where we choose ε₁ > 0 with α+ε₁ < 2 (which is possible since α < 2); the resulting integrand is integrable near zero.

For the third term of the right hand side in (1.11), we use property (1.8) to obtain, for δ' sufficiently large, the bound H(x/h) ≤ (x/δ')^{ε₂} H(δ'/h), where we choose ε₂ > 0 with α-ε₂ > 0; the resulting integrand is integrable at infinity.

Since H(δ'/h)/H(1/h) → 1 as h → 0, we have, as h → 0,

    (1/H(1/h)) ∫_{hK₁}^{δ'} 8 sin²(x/2) (H(x/h)/x^{α+1}) dx - ∫_0^{δ'} (8 sin²(x/2)/x^{α+1}) dx → 0.


Since the contributions of the first and third integrals can be made arbitrarily small by taking δ small and δ' large, and since C₁J(h) ≤ I₂(h) ≤ C₂J(h), we have the desired result. □

Corollary 1.1. Under the conditions of Theorem 1.1, we have

    0 < (αC₁/C₂) ∫_0^∞ (8 sin²(x/2)/x^{α+1}) dx ≤ lim inf_{h→0+} σ²(h)/(1-F(1/h))
        ≤ lim sup_{h→0+} σ²(h)/(1-F(1/h)) ≤ (αC₂/C₁) ∫_0^∞ (8 sin²(x/2)/x^{α+1}) dx < ∞,

where F(x) = ∫_{-∞}^x f(u) du.

Proof. First, for h sufficiently small,

    C₁ ∫_{1/h}^∞ (H(u)/u^{α+1}) du ≤ 1 - F(1/h) ≤ C₂ ∫_{1/h}^∞ (H(u)/u^{α+1}) du.

From this and using property (1.9) and Theorem 1.1, we obtain the corollary. □

Remark 1.2. We can find discussions similar to Theorem 1.1 and Corollary 1.1 in Garsia and Lamperti [3] and Ibragimov and Linnik [4].

Next we consider the case α = 0.

Theorem 1.2.* Let L(x) vary slowly at infinity and assume L(x) is monotone for large x. Let f(x) be a symmetric probability density function such that

    C₁ L(x)/x ≤ f(x) ≤ C₂ L(x)/x   for large x.

* The proof of Theorem 1.2 was communicated to us by Professor Walter L. Smith.


Then, if we define ρ(t) = ∫_{-∞}^∞ e^{itx} f(x) dx, σ²(h) = 2(1-ρ(h)), and H(x) = ∫_x^∞ (L(y)/y) dy, we have

(1.12)    4C₁ ≤ lim inf_{h→0} σ²(h)/H(1/h) ≤ lim sup_{h→0} σ²(h)/H(1/h) ≤ 4C₂,

and H(1/h) varies slowly at zero.

Proof. Without loss of generality, we take L(x) to be "normalized". Let the hypotheses of Theorem 1.2 hold for all x ≥ K, where K > 1 say. Without loss of generality we redefine L(x) on x < K so that L(x) is monotone (decreasing) for all x > 0 and L(x) → L(0) < ∞ as x → 0. (Of course, now the bounds on f(x) given above can only hold for x bounded away from the origin.)

Now as in the proof of Theorem 1.1, we may ignore the integral I₁(h) in

    σ²(h) = 4 ∫_0^K (1 - cos hx) f(x) dx + 4 ∫_K^∞ (1 - cos hx) f(x) dx ≡ I₁(h) + I₂(h),

as h → 0.

By assumption, we have 4C₁ J(h) ≤ I₂(h) ≤ 4C₂ J(h), where J(h) = ∫_K^∞ (1 - cos hx)(L(x)/x) dx. We replace the study of J(h) by the study of J₀(h) = ∫_0^∞ (1 - cos hx)(L(x)/x) dx, since the integral ∫_0^K (1 - cos hx)(L(x)/x) dx again may be ignored as h → 0.

Now J₀(h) = ∫_0^∞ (1 - cos x)(L(x/h)/x) dx is an increasing function of h, since L(x/h) is increasing in h. So we may study the Laplace transform

    J₀*(s) = s ∫_0^∞ e^{-sh} J₀(h) dh = ∫_0^∞ (x L(x)/(s²+x²)) dx.

We write


    J₀*(s) = ∫_0^{sδ'} (x L(x)/(s²+x²)) dx + ∫_{sδ'}^∞ (x L(x)/(s²+x²)) dx ≡ J₁*(s) + J₂*(s),

where δ' is large.

Since J₁*(s) ≤ s^{-2} ∫_0^{sδ'} x L(x) dx ~ δ'² L(sδ')/2 as s → ∞ by property (1.9), we have

(1.13)    J₁*(s)/H(s) → 0   as s → ∞.

Also

    J₂*(s) ≤ ∫_{sδ'}^∞ (L(x)/x) dx = H(sδ').

Now, property (1.9) states that H(1/h) varies slowly at zero and that L(s)/H(s) → 0 as s → ∞ (which we use here). This together with (1.13) yields

    lim sup_{s→∞} J₀*(s)/H(s) ≤ 1,

and a corresponding lower estimate then gives

(1.14)    lim_{s→∞} J₀*(s)/H(s) = 1.

We apply a fundamental Tauberian theorem to J₀*(s) to obtain

(1.15)    lim_{h→0} J₀(h)/H(1/h) = 1,   as desired. □

Remark 1.3. Given a slowly varying function H at infinity, we can find L(·) by solving the functional equation H(x) = ∫_x^∞ (L(u)/u) du. If H is differentiable, then we need only verify that L(x) = -xH'(x) varies slowly. For example, if f(x) ~ θ/(x (log x)^{θ+1}) as x → ∞, θ > 0, then σ²(h) ~ 4/|log h|^θ as h → 0.
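The example in Remark 1.3 can be checked directly: with θ = 2 and L(u) = θ/(log u)^{θ+1}, the substitution v = log u gives H(x) = ∫_x^∞ L(u)/u du = (log x)^{-θ} exactly. A quadrature sketch (θ and the cutoff below are arbitrary test values):

```python
import math

theta = 2.0

# H(x) = int_x^inf L(u)/u du with L(u) = theta/(log u)^(theta+1);
# substituting v = log u gives int_{log x}^inf theta * v^-(theta+1) dv,
# which equals (log x)^-theta.  Take log x = 2, so H(x) = 2^-2 = 0.25.
v_lo, v_hi, n = 2.0, 2000.0, 100_000
dv = (v_hi - v_lo) / n
num = sum(theta * (v_lo + (i + 0.5) * dv) ** -(theta + 1.0) * dv for i in range(n))

assert abs(num - 0.25) < 1e-3                       # quadrature vs closed form
assert abs(math.log(math.e ** 2) ** -theta - 0.25) < 1e-12
```

The truncation error from stopping at v = 2000 is 1/2000², well inside the tolerance.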

Corollary 1.2. Under the same conditions as in Theorem 1.2, we have

    4C₁/C₂ ≤ lim inf_{h→0} σ²(h)/(1-F(1/h)) ≤ lim sup_{h→0} σ²(h)/(1-F(1/h)) ≤ 4C₂/C₁,

where F(x) = ∫_{-∞}^x f(y) dy.


Several authors give necessary and sufficient conditions for the sample functions of stationary Gaussian processes to be continuous. Marcus and Shepp [9] give a condition in terms of σ²(h); i.e., if σ is monotonic increasing, then

    ∫_0^ε σ(h)/(h (log(1/h))^{1/2}) dh < ∞

is a necessary and sufficient condition for continuous sample functions. Also, Nisio [10] has obtained a condition in terms of the spectral distribution function F. Based on her result, Landau and Shepp [7] have shown that if s_n = ∫_{2^n}^{2^{n+1}} dF(λ) is eventually decreasing, then Σ_{n=1}^∞ s_n^{1/2} < ∞ is a necessary and sufficient condition for continuity of sample functions. By using the results of this section, we can make clear the relation between the two criteria.

Now, suppose that the spectral function satisfies the conditions of either Corollary 1.1 or Corollary 1.2. Then we may suppose without harm that σ²(h) ~ 1-F(1/h) as h → 0, so that σ²(2^{-n}) ~ Σ_{k=n}^∞ s_k as n → ∞. If σ(·) is monotone increasing in some interval (0,ε), then

    ∫_0^ε σ(h)/(h (log(1/h))^{1/2}) dh < ∞   if and only if   Σ_{n=1}^∞ σ(2^{-n})/n^{1/2} < ∞.

Now, there exist positive constants C₃ and C₄ such that

(1.16)    C₃ (Σ_{k=n}^∞ s_k)^{1/2} ≤ σ(2^{-n}) ≤ C₄ (Σ_{k=n}^∞ s_k)^{1/2}.

Here, we use the following inequality of Boas (see Marcus and Shepp [9]).

Lemma 1.1. Let g_n = Σ_{j=n}^∞ a_j, where the a_j are positive and decreasing. Then

    (1/2) Σ_{n=1}^∞ a_n^{1/2} ≤ Σ_{n=1}^∞ (g_n/n)^{1/2} ≤ 2 Σ_{n=1}^∞ a_n^{1/2}.

So

    (C₃/2) Σ_{n=1}^∞ s_n^{1/2} ≤ Σ_{n=1}^∞ σ(2^{-n})/n^{1/2} ≤ 2C₄ Σ_{n=1}^∞ s_n^{1/2}.
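The passage between the Marcus-Shepp integral and the dyadic sum Σ σ(2^{-n})/n^{1/2} can be illustrated numerically; σ(h) = h^{0.3} below is a hypothetical monotone modulus, not one taken from the text, and the two quantities should agree within a constant factor.

```python
import math

sig = lambda h: h ** 0.3            # hypothetical increasing sigma(h)

# int_0^(1/2) sig(h) / (h * sqrt(log(1/h))) dh, via u = log(1/h)
u_lo, u_hi, n = math.log(2.0), 80.0, 100_000
du = (u_hi - u_lo) / n
integral = sum(
    sig(math.exp(-(u_lo + (i + 0.5) * du))) / math.sqrt(u_lo + (i + 0.5) * du) * du
    for i in range(n)
)

series = sum(sig(2.0 ** -k) / math.sqrt(k) for k in range(1, 400))

# dyadic-block comparison: the two are equivalent up to constants
# (the natural scale factor here is sqrt(log 2))
ratio = integral / (math.sqrt(math.log(2.0)) * series)
assert 0.5 < ratio < 2.0
```

Each dyadic block h ∈ [2^{-n-1}, 2^{-n}] of the integral contributes about σ(2^{-n})(log 2)^{1/2}/n^{1/2}, which is where the comparison constant comes from.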


Theorem 1.3. If we suppose that the spectral function F satisfies the conditions of either Corollary 1.1 or Corollary 1.2, and that s_n is eventually decreasing and σ(h) is monotone increasing near 0, then

    Σ_{n=1}^∞ s_n^{1/2} < ∞   if and only if   ∫_0^ε σ(h)/(h (log(1/h))^{1/2}) dh < ∞.

Remark 1.4. Of course, if we take the "normalized" version of the regularly varying σ(h) in the case 0 < α < 2, then it is eventually decreasing as h → 0, by property (1.8).

2. The asymptotic distribution of the maximum. For the study of the asymptotic distribution of Z(t) = sup_{0≤s≤t} X(s) in this section, we assume that the process X(t) is stationary, in addition to its covariance function ρ(s) = ρ(t,t+s) satisfying condition (0.1) with 0 < α ≤ 2. We are assuming each X(t) has mean 0 and variance 1. The non-stationary case is discussed in §4. Without loss of generality, we also assume the slowly varying function H(s) in condition (0.1) is "normalized". See §1 for definitions. Let σ²(s) = E{X(t+s)-X(t)}² = 2(1-ρ(s)), let σ̃(·) be a monotone function with σ̃(s) ~ σ(s) as s → 0 (see Remark 2.1), and define

    Λ₁(t) = inf_{0<s≤t} σ(s)/σ̃(s)   and   Λ₂(t) = sup_{0<s≤t} σ(s)/σ̃(s).

The theorem of this section is an extension of Pickands' result [11]. Since

Pickands' methods apply in our case, we will only sketch the proof emphasizing

the points of difference.

Theorem 2.1. If condition (0.1) with 0 < α ≤ 2 holds, σ²(s) > 0 for s ≠ 0, and σ̃(·) is defined as above, then

(2.1)    lim_{x→∞} P[Z(t) > x] / (t ψ(x)/σ̃⁻¹(1/x)) = H_α = lim_{T→∞} T⁻¹ ∫_0^∞ e^s P[sup_{0<t<T} Y(t) > s] ds,

and 0 < H_α < ∞, where {Y(t), 0 ≤ t < ∞} is a non-stationary Gaussian process with Y(0) = 0 a.s., E{Y(t)} = -|t|^α/2, Cov{Y(t₁),Y(t₂)} = (|t₁|^α + |t₂|^α - |t₁-t₂|^α)/2, and ψ(x) = (2π)^{-1/2} x⁻¹ exp(-x²/2).
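Up to its drift -|t|^α/2, the limit process Y has the covariance of a fractional Brownian motion with Hurst index α/2; in particular, for α = 1 the stated covariance (|t₁| + |t₂| - |t₁-t₂|)/2 reduces to min(t₁,t₂), the Brownian covariance. A quick positive-definiteness check on an arbitrary grid (pure-Python Cholesky; the grid is a hypothetical test choice):

```python
import math

alpha = 1.0
ts = [0.5 * k for k in range(1, 9)]          # grid away from t = 0
cov = [[(abs(a) ** alpha + abs(b) ** alpha - abs(a - b) ** alpha) / 2.0
        for b in ts] for a in ts]

for i, a in enumerate(ts):                    # alpha = 1: this is min(a, b)
    for j, b in enumerate(ts):
        assert abs(cov[i][j] - min(a, b)) < 1e-12

# Cholesky factorization succeeds iff the matrix is positive definite
n = len(ts)
L = [[0.0] * n for _ in range(n)]
for i in range(n):
    for j in range(i + 1):
        s = sum(L[i][k] * L[j][k] for k in range(j))
        if i == j:
            d = cov[i][i] - s
            assert d > 0.0                    # positive pivot
            L[i][i] = math.sqrt(d)
        else:
            L[i][j] = (cov[i][j] - s) / L[j][j]
```

The same factorization can be reused to simulate Y on the grid by multiplying L against a vector of independent standard normals and subtracting the drift.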


Remark 2.1. By property (1.8) in §1, the σ(·) defined above (or any "normalized" regularly varying function of positive exponent) is monotone on some small interval (0,δ). This useful fact does not seem to be well known in this context. Of course, any function σ̃(·) with σ̃(s) ~ σ(s) as s → 0 (or σ(·) itself) that is monotone near the origin can be used in Theorem 2.1. In fact, any g(x) ~ σ̃⁻¹(1/x) as x → ∞ could be used, whether g is monotone or not.

Remark 2.2. The condition that σ²(s) > 0 for s ≠ 0 excludes the periodic case. However, if ρ(s) is periodic with period s₀, then Theorem 2.1 holds with t in the denominator of (2.1) replaced by τ = min(t,s₀). See the remarks in [12].

The proof of Theorem 2.1 is accomplished by a series of lemmas; in particular, Lemma 2.3 is a useful discrete version of Theorem 2.1. The connection between the constants of Theorem 2.1 and Lemma 2.3 is that H_α = lim_{a→0} H_α(a)/a. The proof of the first lemma below illustrates the central idea behind the discrete version of Theorem 2.1. For this discrete version, we need a partition of the time interval (0,t) with mesh size Δ(x) approaching 0 at the proper rate as x → ∞. Let Δ(x) = σ̃⁻¹(1/x) for all sufficiently large x.

Lemma 2.1. If the conditions of Theorem 2.1 hold, then for a > 0,

    lim inf_{x→∞} P[Z_x(t) > x] / (t ψ(x)/Δ(x)) ≥ a⁻¹(1 - Σ_{k=1}^∞ 2{1 - Φ(2⁻¹(ka)^{α/2})}),

where Z_x(t) = max_{0≤k≤m} X(ka·Δ(x)), m = [t/(aΔ(x))], [·] denotes the greatest integer function, and Φ(·) is the standard Gaussian distribution function.


Proof. This lemma corresponds to Pickands' Lemma 2.4 [11]. Defining the events B_k = [X(ka·Δ(x)) > x] and using stationarity, we have

    P[Z_x(t) > x] ≥ Σ_{k=0}^m P(B_k) - ΣΣ_{0≤j<k≤m} P(B_j ∩ B_k)
                 ≥ (m+1)(P(B₀) - Σ_{k=1}^m P(B₀ ∩ B_k)).

Now from [11, Lemma 2.3], we record that

(2.2)    P(B₀ ∩ B_k) ≤ 2 P(B₀){1 - Φ(x(1-ρ)^{1/2}(1+ρ)^{-1/2})},

where ρ = ρ(ka·Δ(x)), and obtain

(2.3)    lim inf_{x→∞} P[Z_x(t) > x]/(t ψ(x)/Δ(x)) ≥ a⁻¹(1 - lim sup_{x→∞} 2 Σ_{k=1}^m {1 - Φ(x(1-ρ)^{1/2}(1+ρ)^{-1/2})}).

To study Σ_{k=1}^m {1 - Φ(x(1-ρ)^{1/2}(1+ρ)^{-1/2})}, partition the sum into three parts according to i) ka ≤ 1; ii) ka > 1, ka·Δ(x) < some δ; and iii) ka > 1, ka·Δ(x) ≥ δ. First, lim sup_{x→∞} Σ^{(i)} 2(1-Φ) = Σ^{(i)} lim_{x→∞} 2(1-Φ).

We may ignore the third sum Σ^{(iii)}. For ka·Δ(x) ≥ δ, there exists a positive K such that 1-ρ ≥ K, and

    Σ^{(iii)} 2{1 - Φ(x(1-ρ)^{1/2}(1+ρ)^{-1/2})} ≤ 2(m+1){1 - Φ(x(K/2)^{1/2})} → 0   as x → ∞.

For positive δ₁ sufficiently small, Λ₁(δ₁) > 0. For Σ^{(ii)}, when ka·Δ(x) < δ₁, estimate

    x(1-ρ)^{1/2}(1+ρ)^{-1/2} ≥ 2⁻¹ x σ(ka·Δ(x)) ≥ 2⁻¹ Λ₁(δ₁) x σ̃(ka·Δ(x)).

By property (1.7) in §1 and for ka > 1, there is a positive δ_α such that H(ka·Δ(x))/H(Δ(x)) ≥ (ka)^{-α/2} provided ka·Δ(x) < δ_α. Take δ = min(δ_α, δ₁, t). Consequently

    x(1-ρ)^{1/2}(1+ρ)^{-1/2} ≥ 2⁻¹ Λ₁(δ)(ka)^{α/4}

for ka > 1, ka·Δ(x) < δ, and x ≥ some T large.

Finally, defining a_k(x) = 2{1 - Φ(x(1-ρ)^{1/2}(1+ρ)^{-1/2})} for ka·Δ(x) < δ and a_k(x) = 2{1 - Φ(2⁻¹Λ₁(δ)(ka)^{α/4})} for ka·Δ(x) ≥ δ, we have

    Σ_{k>a⁻¹} sup_{T≤x<∞} a_k(x) < ∞.

It follows that

(2.4)    lim_{x→∞} Σ_{k=1}^m a_k(x) = Σ_{k=1}^∞ lim_{x→∞} a_k(x) = Σ_{k=1}^∞ 2(1 - Φ(2⁻¹(ka)^{α/2})) < ∞,

since

    x(1-ρ)^{1/2}(1+ρ)^{-1/2} ~ 2⁻¹ x σ(ka·Δ(x)) = 2⁻¹ σ(ka·Δ(x))/σ̃(Δ(x)) → 2⁻¹(ka)^{α/2}

as x → ∞. Applying (2.4) in (2.3) completes the proof. □
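The dominating series in (2.4), Σ_k 2{1 - Φ(2⁻¹(ka)^{α/2})}, converges because of the superexponential Gaussian tail; a numerical check (α = a = 1 are arbitrary choices, Φ computed via math.erf):

```python
import math

Phi = lambda x: 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def term(k, a, alpha):
    return 2.0 * (1.0 - Phi(0.5 * (k * a) ** (alpha / 2.0)))

a, alpha = 1.0, 1.0
partial = [sum(term(k, a, alpha) for k in range(1, n + 1)) for n in (50, 100, 200)]

# the partial sums stabilize quickly
assert partial[1] - partial[0] < 1e-2
assert partial[2] - partial[1] < 1e-4
assert partial[2] < 10.0
```

The k-th term decays roughly like exp(-(ka)^α/8), so a few hundred terms already pin the sum down to many digits.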

The partition corresponding to Z_x(t) above was made to depend on a, and the lower estimate of the distribution of Z_x(t) (as well as the upper estimate) depends on a. Since we wish to take a → 0, it is seen that Lemma 2.1 is not sharp enough to obtain the desired result. Hence Lemma 2.2 will be needed.

Lemma 2.2. If the conditions of Theorem 2.1 hold, then for a > 0,

    lim_{x→∞} P[Z_x(na·Δ(x)) > x]/ψ(x) = H_α(n,a) = 1 + ∫_0^∞ e^s P[max_{1≤k≤n} Y(ka) > s] ds.


Proof. Simplifying Pickands' proof [11, Lemma 2.2], we have

    P[Z_x(na·Δ(x)) > x] = P[X(0) > x] + P[X(0) ≤ x, max_{1≤k≤n} X(ka·Δ(x)) > x].

The second term equals

    ∫_{-∞}^x P[max_{1≤k≤n} X(ka·Δ(x)) > x | X(0) = u] φ(u) du,

where φ(u) is the standard Gaussian density function. Substituting u = x - s/x, and defining Y₁(t) = x(X(t·Δ(x)) - x) + s, we obtain

    ψ(x) ∫_0^∞ e^s P[max_{1≤k≤n} X(ka·Δ(x)) > x | X(0) = x - s/x] exp(-s²/(2x²)) ds
        = ψ(x) ∫_0^∞ e^s P[max_{1≤k≤n} Y₁(ka) > s | X(0) = x - s/x] exp(-s²/(2x²)) ds.

Note that

    E(Y₁(t) | X(0) = x - s/x) = x(ρ(t·Δ(x))(x - s/x) - x) + s
        = -x²(1 - ρ(t·Δ(x))) + s(1 - ρ(t·Δ(x)))
        = -x² σ²(Δ(x)) |t|^α/2 + o(1) → -|t|^α/2   as x → ∞;

and that

    Cov(Y₁(t₁), Y₁(t₂) | X(0) = x - s/x) = x²[ρ((t₂-t₁)·Δ(x)) - ρ(t₁·Δ(x)) ρ(t₂·Δ(x))]
        = (x²/2)[-σ²(Δ(x))|t₂-t₁|^α + σ²(Δ(x))|t₁|^α + σ²(Δ(x))|t₂|^α - σ⁴(Δ(x))|t₁t₂|^α/2] + o(1)
        = (1/2)[-|t₂-t₁|^α + |t₁|^α + |t₂|^α] + o(1)   as x → ∞.

Consequently P[max_{1≤k≤n} Y₁(ka) > s | X(0) = x - s/x] → P[max_{1≤k≤n} Y(ka) > s] as x → ∞, and an application of Boole's inequality and the Lebesgue dominated convergence theorem completes the proof. □
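The conditional-mean computation above can be checked numerically for a concrete covariance: with ρ(s) = e^{-s} (so α = 1, H ≡ 1, and σ̃²(s) = 2(1-e^{-s})), E(Y₁(t) | X(0) = x - s/x) should be close to -|t|/2 for large x. All numbers below are hypothetical test values.

```python
import math

rho = lambda s: math.exp(-s)                 # alpha = 1 covariance

x, s, t = 300.0, 1.0, 1.5

# Delta(x) solves sigma~(Delta) = 1/x, i.e. 2(1 - e^-Delta) = x^-2
Delta = -math.log(1.0 - 1.0 / (2.0 * x * x))

# E(Y1(t) | X(0) = x - s/x) = x*(rho(t*Delta)*(x - s/x) - x) + s
r = rho(t * Delta)
cond_mean = x * (r * (x - s / x) - x) + s

assert abs(cond_mean - (-abs(t) / 2.0)) < 0.01
```

Expanding cond_mean as -x²(1-r) + s(1-r) with 1-r ≈ tΔ ≈ t/(2x²) makes the limit -|t|/2 visible directly.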



Lemma 2.3. If the conditions of Theorem 2.1 hold, then for a > 0,

(2.5)    lim_{x→∞} P[Z_x(t) > x]/(t ψ(x)/Δ(x)) = H_α(a)/a,

where 0 < H_α(a) ≡ lim_{n→∞} H_α(n,a)/n < ∞, Z_x(t) = max_{0≤k≤m} X(ka·Δ(x)), and m = [t/(aΔ(x))].

Proof. This lemma corresponds to [11, Lemma 2.5]. For each non-negative integer k, let B_k = [X(ka·Δ(x)) > x], and for an arbitrary positive integer n, let A_k = ∪_{j=(k-1)n}^{kn-1} B_j. Then

(2.6)    P[∪_{k=1}^{m'} A_k] ≤ P[Z_x(t) > x] ≤ P[∪_{k=1}^{m'+1} A_k],

where m' = [(m+1)/n]. By stationarity, P(A_k) = P(A₁) for all k ≥ 1. Consequently

    P[Z_x(t) > x] ≤ (m'+1) P(A₁).

Now using Lemma 2.2, we obtain

(2.7)    lim sup_{x→∞} P[Z_x(t) > x]/(t ψ(x)/Δ(x)) ≤ lim sup_{x→∞} (m'+1) P(A₁)/((t/Δ(x)) ψ(x)) = H_α(n-1,a)/(na).

On the other hand, (2.6) and stationarity imply

(2.8)    P[Z_x(t) > x] ≥ Σ_{k=1}^{m'} P(A_k) - ΣΣ_{1≤k<j≤m'} P(A_k ∩ A_j)
                       ≥ m' P(A₁) - m' Σ_{j=2}^{m'} P(A₁ ∩ A_j)
                       ≥ m'{P(A₁) - Σ_{k=0}^{n-1} Σ_{l=n}^∞ P(B_k ∩ B_l)}.

As in the proof of Lemma 2.1, inequality (2.2) applied to P(B_k ∩ B_l) = P(B₀ ∩ B_{l-k}) and inequality (2.8) yield

(2.9)    lim inf_{x→∞} P[Z_x(t) > x]/(t ψ(x)/Δ(x))


        ≥ {H_α(n-1,a) - Σ_{k=1}^{n-1} Σ_{l=n}^∞ d_{l-k}}/(na),

where d_j = 2{1 - Φ(2⁻¹(ja)^{α/2})}. Since Σ_{j=1}^∞ d_j < ∞, we have lim_{n→∞} Σ_{k=1}^{n-1} Σ_{l=n}^∞ d_{l-k}/(na) = 0, by Kronecker's lemma.

Combining (2.7) and (2.9), we have

    lim_{n→∞} H_α(n-1,a)/(na) ≤ lim inf_{x→∞} P[Z_x(t) > x]/(t ψ(x)/Δ(x))
        ≤ lim sup_{x→∞} P[Z_x(t) > x]/(t ψ(x)/Δ(x)) ≤ lim_{n→∞} H_α(n-1,a)/(na),

and the conclusion (2.5) of Lemma 2.3 follows.

Now (2.7) implies H_α(a) < ∞. By Lemma 2.1, H_α(a) > 0 for a sufficiently large, say for all a > some a₀. For any a > 0, there exists an integer m such that ma > a₀. Now H_α(n,am) ≤ H_α(nm,a) implies H_α(am) ≤ m H_α(a), and so H_α(a) > 0. □

Lemma 2.4. Under the same conditions as Theorem 2.1, it follows for a > 0 and 2^{-α/4} < b < 1 that

    lim sup_{x→∞} P[X(0) ≤ x, Z(a·Δ(x)) > x]/ψ(x) ≤ M(a),   and   lim_{a→0} M(a)/a = 0,

where M(a) is the sum appearing on the right side of (2.10) below and

    R(x) = ∫_x^∞ (1 - Φ(s)) ds.

Proof. We combine Pickands' Lemmas 2.7 and 2.8 [11]. Note that

    [X(0) ≤ x, Z(a·Δ(x)) > x] ⊂ ∪_{k=0}^∞ D_k   and   D_k ⊂ ∪_{j=0}^{2^k-1} E_{j,k},

where


    D_k = [max_{0≤j≤2^k-1} X(ja·Δ(x)/2^k) ≤ x, max_{0≤j≤2^{k+1}-1} X(ja·Δ(x)/2^{k+1}) > x + b^k/x],

and

    E_{j,k} = [X(ja·Δ(x)/2^k) ≤ x, X((2j+1)a·Δ(x)/2^{k+1}) > x + b^k/x].

Then

    lim sup_{x→∞} P[X(0) ≤ x, Z(a·Δ(x)) > x]/ψ(x) ≤ lim sup_{x→∞} Σ_{k=0}^∞ Σ_{j=0}^{2^k-1} P(E_{j,k})/ψ(x).

By using [11, Lemma 2.6], we obtain P(E_{j,k}) ≤ ψ(x) x ρ⁻¹(1-ρ²)^{1/2} R(y), where ρ = ρ(a·Δ(x)/2^{k+1}) and y = y(x) = b^k x⁻¹ ρ(1-ρ²)^{-1/2} - x(1+ρ)⁻¹(1-ρ²)^{1/2}. Consequently

(2.10)    lim sup_{x→∞} P[X(0) ≤ x, Z(a·Δ(x)) > x]/ψ(x) ≤ lim sup_{x→∞} Σ_{k=0}^∞ 2^k x ρ⁻¹(1-ρ²)^{1/2} R(y).

In order to apply the technique used in Lemma 2.1, we need to show

    Σ_{k=0}^∞ 2^k sup_{T≤x<∞} x ρ⁻¹(1-ρ²)^{1/2} R(y) < ∞   for some T > 0.

That this sum is finite follows from the estimates:

    x ρ⁻¹(1-ρ²)^{1/2} ≤ x ρ⁻¹ σ(a·Δ(x)/2^{k+1}) ≤ x β⁻¹ Λ₂ σ̃(a·Δ(x)/2^{k+1})
        ≤ β⁻¹ Λ₂ (a 2^{-k-1})^{α/2} (H(a·Δ(x)/2^{k+1})/H(Δ(x)))^{1/2}
        ≤ β⁻¹ Λ₂ (a 2^{-k-1})^{α/2} (a/2^{k+1})^{-α/4}   by (1.6) in §1
        = β⁻¹ Λ₂ (a/2)^{α/4} 2^{-αk/4},

and

    y(x) = b^k x⁻¹ ρ(1-ρ²)^{-1/2} - x(1+ρ)⁻¹(1-ρ²)^{1/2} ≥ b^k β x⁻¹/σ - 2⁻¹ x σ
        ≥ b^k β Λ₂⁻¹ σ̃(Δ(x))/σ̃(a·Δ(x)/2^{k+1}) - 2⁻¹ Λ₂ σ̃(a·Δ(x)/2^{k+1})/σ̃(Δ(x))
        ≥ b^k β Λ₂⁻¹ (2^{k+1}/a)^{α/4} - 2⁻¹ Λ₂ (2^{k+1}/a)^{-α/4}   by (1.6).

Here β = min_{0≤s≤a·Δ(x)} ρ(s) and Λ₂ = Λ₂(a·Δ(x)); for x > some T, β ≥ 1/2 and Λ₂ ≤ 1+ε. Since b > 2^{-α/4}, the arguments b^k(2^{k+1}/a)^{α/4} grow geometrically, so the sum converges.

Therefore, (2.10) yields

    lim sup_{x→∞} P[X(0) ≤ x, Z(a·Δ(x)) > x]/ψ(x) ≤ Σ_{k=0}^∞ lim sup_{x→∞} 2^k x ρ⁻¹(1-ρ²)^{1/2} R(y) = M(a).


In order to see that M(a)/a → 0 as a → 0, use the estimates R(x) ≤ ψ(x)/x and 1 - Φ(x) ≤ exp(-x²/2) for x ≥ (2π)^{-1/2}; note that one may disregard any finite number of the leading terms of M(a), and factor exp(-c a^{-α/2}) out of the infinite sum M(a). Here c > 0 is a properly chosen constant. □

Proof of Theorem 2.1. See [11, Lemma 2.9]. There it is shown that H_α = lim_{a→0} H_α(a)/a = lim_{T→∞} T⁻¹(1 + ∫_0^∞ e^s P[sup_{0<t<T} Y(t) > s] ds). □

Remark 2.3. Pickands' Theorem 2.1 [11], concerning the expected number of ε-upcrossings, is hereby generalized also.

3. An asymptotic 0-1 behavior. In this section, we use the results of §2 to obtain an extension of the results in Qualls and Watanabe [12]. We again postpone discussion of the non-stationary case to §4. Using the notation of §2, we have

Theorem 3.1. If ρ(t) satisfies (0.1) with 0 < α ≤ 2, σ̃(·) is defined as in §2, and

(3.1)    ρ(t) = O(t^{-γ})   as t → ∞, for some γ > 0,

then, for any positive non-decreasing function φ(t) on some interval [a,∞),

    P[X(t) ≤ φ(t) for all sufficiently large t] = 1 or 0

according as the integral

    I(φ) = ∫_a^∞ (φ(t) σ̃⁻¹(1/φ(t)))⁻¹ exp(-φ²(t)/2) dt

converges or diverges.
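The dichotomy in Theorem 3.1 can be seen numerically. Assume (hypothetically) σ̃(s) = s^{1/2}, so that σ̃⁻¹(1/φ) = φ⁻² and the integrand of I(φ) is φ(t) exp(-φ²(t)/2); then φ² = 2 log t gives divergence, while φ² = 2 log t + 5 log log t gives convergence. A sketch using the substitution u = log t:

```python
import math

def chunk(phi2, u_lo, u_hi, n=50_000):
    # integral of phi(t) * exp(-phi(t)^2 / 2) dt over t in [e^u_lo, e^u_hi]
    du = (u_hi - u_lo) / n
    tot = 0.0
    for i in range(n):
        u = u_lo + (i + 0.5) * du
        p2 = phi2(u)
        tot += math.sqrt(p2) * math.exp(-p2 / 2.0 + u) * du   # dt = e^u du
    return tot

diverging = lambda u: 2.0 * u                        # phi^2 = 2 log t
converging = lambda u: 2.0 * u + 5.0 * math.log(u)   # phi^2 = 2 log t + 5 log log t

d_early, d_late = chunk(diverging, 2.0, 10.0), chunk(diverging, 10.0, 20.0)
c_early, c_late = chunk(converging, 2.0, 10.0), chunk(converging, 10.0, 20.0)

assert d_late > d_early            # divergent: mass keeps accumulating
assert c_late < 0.2 * c_early      # convergent: tail already negligible
```

In the divergent case the integrand in u is √(2u), while in the convergent case it is roughly √(2u)·u^{-2.5}, which is why the contrast appears so quickly.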


Remark 3.1. Monotone σ̃(·) other than the one defined in §2 can be used; see Remark 2.1. Note that condition (3.1) implies ρ(t) is not periodic; consequently Theorem 2.1 is applicable.

Proof. For every ε > 0, assumption (0.1) implies that s^{(α+ε)/2} ≤ σ̃(s) ≤ s^{(α-ε)/2} for all positive s ≤ some δ. In particular, the integrand of I(φ) is eventually a decreasing function of φ.

(1) The case when I(φ) < ∞.

Let t_n = nδ, where δ > 0 and n = 0,1,2,…. By Theorem 2.1 and for fixed δ > 0, we have

    Σ_{n=n₀}^∞ P{sup_{t_n≤t≤t_{n+1}} X(t) > φ(t_n)}
        ≤ C₁ Σ_{n=n₀}^∞ (t_{n+1}-t_n)(φ(t_n) σ̃⁻¹(1/φ(t_n)))⁻¹ exp(-φ²(t_n)/2)
        ≤ C₁' ∫_{n₀δ}^∞ (φ(t) σ̃⁻¹(1/φ(t)))⁻¹ exp(-φ²(t)/2) dt < ∞,

for n₀ sufficiently large. Here C₁, C₁' > 0 are certain constants. So, the Borel-Cantelli lemma yields

    P{sup_{t_n≤t≤t_{n+1}} X(t) ≤ φ(t_n) for all n ≥ some n₀'} = 1;

and consequently P[X(t) ≤ φ(t) for all sufficiently large t] = 1. □

(2) The case when I(φ) = ∞.

For this part of the proof, we need the following lemma.

Lemma 3.1. If Theorem 3.1 in the case I(φ) = ∞ holds under the additional assumption that

    2 log t ≤ φ²(t) ≤ 3 log t   for all large t,

then it holds without this additional assumption.

Proof. From the bounds on σ̃⁻¹(·) given above, there are positive constants C₂ and C₃ such that

(3.2)    C₂ ∫_a^∞ φ(t)^{2/(α+ε)-1} exp(-φ²(t)/2) dt ≤ I(φ) ≤ C₃ ∫_a^∞ φ(t)^{2/(α-ε)-1} exp(-φ²(t)/2) dt.

When 0 < α < 2, choose ε > 0 such that α+ε < 2 and α-ε > 0. When α = 2, it is well known that the H(s) in σ²(s) cannot tend to zero; consequently, we may choose ε = 0 in the left hand side of (3.2) when α = 2. We obtain for 0 < α ≤ 2 that

(3.3)    I(φ) = ∞ whenever J(φ) ≡ ∫_a^∞ φ(t)^{(2/α)-1} exp(-φ²(t)/2) dt = ∞,

for φ satisfying the bounds of Lemma 3.1.

Let φ(t) be an arbitrary positive non-decreasing function such that I(φ) = ∞. Let φ̂(t) = min(max(φ(t), (2 log t)^{1/2}), (3 log t)^{1/2}). To show I(φ̂) = ∞, we may assume φ(t) crosses u(t) = (2 log t)^{1/2} infinitely often as t → ∞. Otherwise, either φ ≤ u and I(φ̂) = I(u) = ∞, or φ > u and I(φ̂) ≥ I(φ) = ∞, for some large a.

The proof of Lemma 1.4 in [12] now shows that J(φ̂) = ∞; and by (3.3), that I(φ̂) = ∞.

That P[X(t) > φ̂(t) i.o.] = 1 implies P[X(t) > φ(t) i.o.] = 1 follows from "Theorem 3.1 when I(ν) < ∞" with ν(t) = (3 log t)^{1/2}; details are given in Lemma 4.1 in [15]. □

The proof of the second part of Theorem 3.1 now proceeds in the same way as in Qualls and Watanabe [12]. We will use the same notation as in [12].


Define a sequence of intervals by I_n = [nλ, nλ+β] for λ > 0 and 0 < β < λ. Let G_k = {s_{k,ν} = kλ + ν/n_k : ν = 0,1,…,[βn_k]} be points in I_k, where n_k = [(σ̃⁻¹(1/φ(kλ+β)))⁻¹]. Let E_k = [max_{s∈G_k} X(s) > φ(kλ+β)]. Now, using Lemma 2.3 in the same way as in [12], we see that I(φ) = ∞ implies Σ P(E_k) = ∞.

So, we only need to prove the asymptotic independence of the E_k's, that is,

(3.4)    lim_{m→∞} lim_{n→∞} |P(∩_{k=m}^n E_k) - Π_{k=m}^n P(E_k)| = 0.

Now, by the use of Lemma 3.1 and the bounds on σ̃⁻¹(·), we have 2 log(kλ+β) ≤ φ²(kλ+β) ≤ 3 log(kλ+β) and

    n_k ≤ φ(kλ+β)^{2/(α-ε)} ≤ (3 log(kλ+β))^{1/(α-ε)},

where ε > 0 and α-ε > 0. Now the proof of (3.4) given in [12] applies without change. □

Remark 3.2. By the use of inequalities (3.2) and Theorem 3.1, we can easily show that for every ε > 0,

    P[1/2 + 1/(α+ε) ≤ lim sup_{t→∞} (2 log t)^{1/2}(Z(t) - (2 log t)^{1/2})/log log t ≤ 1/2 + 1/(α-ε)] = 1;

and consequently

    lim_{t→∞} Z(t)/(2 log t)^{1/2} = 1   a.s.

It is interesting that this is true whatever H may be (as long as it satisfies the assumptions of Theorem 3.1).

Remark 3.3. Generally speaking, it seems to be difficult to compute I(φ) in the criterion of Theorem 3.1 in concrete examples. Of course, inequalities (3.2) may be used except in the critical cases.

4. The non-stationary case. In this section, we show that Slepian's result can be used to generalize §2.

Let X(t) be a separable Gaussian process with zero mean function and correlation function ρ(t,s). We adopt the notation of §2, with modifications for the non-stationary case. At first, we only assume that

(4.1)    1 - C₁ h^α H(h) ≤ ρ(τ,τ+h) ≤ 1 - C₂ h^α H(h)

for 0 < h < δ₁₂ and 0 ≤ τ ≤ t, where 0 < α ≤ 2 and H is slowly varying at zero. Without loss of generality, we take H to be "normalized". By §1 there

exist separable zero mean stationary processes Y₁(t) and Y₂(t) with covariance functions satisfying q₁(h) ~ 1 - C₃ h^α H(h) and q₂(h) ~ 1 - C₄ h^α H(h) as h → 0, respectively. For C₄ < C₂ ≤ C₁ < C₃, we have

(4.2)    q₁(h) ≤ ρ(τ,τ+h) ≤ q₂(h)   for 0 < h < δ₃₄

and 0 ≤ τ ≤ t. We shall use the subscripts 1 and 2 throughout to correspond to the stationary processes Y₁(·) and Y₂(·), respectively.

Remark 4.1. Section §1 does not cover the case α = 2. For a covariance, say q₁(h), with α = 2, we know H(h) has a limit, that the zero limit must be excluded, and that a finite limit is easily handled. The interesting case is when H(s) → ∞. So when α = 2 we assume (4.2) instead of (4.1).

In order to exclude any type of periodic case, we assume κ = sup{ρ(τ,τ+h): δ₃₄ ≤ h, 0 ≤ τ+h ≤ t} < 1.
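Slepian's comparison, the engine of this section, implies that raising the correlation raises P[max ≤ x]. A Monte Carlo sketch for a single bivariate pair (all parameters below are hypothetical test values, with a fixed seed):

```python
import math
import random

def p_max_le(rho, x=1.0, n=100_000, seed=1):
    # estimate P[max(X1, X2) <= x] for standard normals with correlation rho
    rng = random.Random(seed)
    c = math.sqrt(1.0 - rho * rho)
    hits = 0
    for _ in range(n):
        z1 = rng.gauss(0.0, 1.0)
        z2 = rho * z1 + c * rng.gauss(0.0, 1.0)   # correlated with z1
        if z1 <= x and z2 <= x:
            hits += 1
    return hits / n

p_hi, p_lo = p_max_le(0.8), p_max_le(0.2)

# Slepian: higher correlation => larger P[max <= x]
assert p_hi > p_lo + 0.02
```

The same monotonicity, applied coordinate-wise between X and the bounding stationary processes Y₁, Y₂ of (4.2), is what transfers the §2 limits to the non-stationary process.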

Theorem 4.1. If X(·) satisfies (4.1) and κ < 1, then for a > 0,

(4.3)    C₂^{1/α} H_α(a)/a - 2(C₁^{1/α} - C₂^{1/α}) H_α(a)/a ≤ lim inf_{x→∞} P[Z_x(t) > x]/(t ψ(x)/σ̃⁻¹(1/x)),
         C₂^{1/α} H_α - 2(C₁^{1/α} - C₂^{1/α}) H_α ≤ lim inf_{x→∞} P[Z(t) > x]/(t ψ(x)/σ̃⁻¹(1/x)),

and

(4.4)    lim sup_{x→∞} P[Z_x(t) > x]/(t ψ(x)/σ̃⁻¹(1/x)) ≤ C₁^{1/α} H_α(a)/a,
         lim sup_{x→∞} P[Z(t) > x]/(t ψ(x)/σ̃⁻¹(1/x)) ≤ C₁^{1/α} H_α.

Moreover, if X(·) satisfies (0.1) with 0 < α ≤ 2 and K < 1, then for a > 0

(4.5)
$$\lim_{x\to\infty}\frac{P[Z_a(t)>x]}{t\psi(x)/\bar\sigma^{-1}(1/x)} \;=\; \frac{H_\alpha(a)}{a}$$

and

(4.6)
$$\lim_{x\to\infty}\frac{P[Z(t)>x]}{t\psi(x)/\bar\sigma^{-1}(1/x)} \;=\; H_\alpha.$$

Proof. The proof consists of showing that the results depend only on the "local condition" on ρ(τ,τ+h), and not on the total time interval (0,t). Let the integer M be large enough that δ = t/M is less than δ₃₄/2. Define

$$A_j \;=\; \Bigl[\,\max_{(j-1)\delta \,\le\, k a\Lambda(x) \,<\, j\delta} X\bigl(k a\Lambda(x)\bigr) \,>\, x\Bigr].$$

That P A_j ≤ P₁A_j = P₁A₁ is Slepian's result in the non-stationary case, together with the fact that P₁ is a stationary measure. Since Λ(x) ~ C₃^{1/α} Λ₁(x) as x → ∞, Theorem 2.1 (or rather Lemma 2.3) yields

(4.7)
$$\limsup_{x\to\infty}\frac{P[Z_a(t)>x]}{t\psi(x)/\Lambda(x)}
\ \le\ \limsup_{x\to\infty}\frac{P_1[Z_a(\delta)>x]}{\delta\psi(x)/\Lambda(x)}
\ =\ C_3^{1/\alpha}\,\frac{H_\alpha(a)}{a}.$$

Similarly

(4.8)
$$\limsup_{x\to\infty}\frac{P[Z(t)>x]}{t\psi(x)/\Lambda(x)}
\ \le\ \limsup_{x\to\infty}\frac{P_1[Z(\delta)>x]}{\delta\psi(x)/\Lambda(x)}
\ =\ C_3^{1/\alpha}\,H_\alpha.$$

Page 25: 2. No .--GU..2059.---Visiting!romKyus'hu-lJnl,versity ... asymptotic distribution of the maximum Z(t) =maxOSsSt X(s) to condition (0.1) with ° < a S 2. Pickands treated stationary

24

For lower bounds, we consider

(4.9)
$$P[Z_a(t)>x] \ \ge\ \sum_{j=1}^{M} P A_j \;-\; \mathop{\sum\sum}_{1\le i<j\le M} P(A_i\cap A_j).$$

For j − i ≥ 2 in the double sum, we use a well-known device (see, e.g., Lemma 1.5 in [12]) to obtain

$$P(A_i\cap A_j) - P A_i\,P A_j
\ \le\ K_0\sum_{k=1}^{m}\sum_{l=1}^{m} |\rho_{kl}|\,\exp\bigl(-x^2/(1+\rho_{kl})\bigr)\bigl(1-\rho_{kl}^2\bigr)^{-1/2}
\ \le\ K_1\,m^2\exp\bigl(-x^2/(1+K)\bigr),$$

where ρ_{kl} denotes the correlation of the corresponding pair of grid points and m = [t/(aΛ(x))].

Dividing by tψ(x)/Λ(x), we see that the error term and P A_i P A_j approach zero as x → ∞ (since K < 1 gives 1/(1+K) > 1/2, the factor m² exp(−x²/(1+K)) is o(ψ(x))); therefore we may ignore this part of the double sum. For j − i = 1,

(4.10)
$$\limsup_{x\to\infty}\;\frac{\sum_{j-i=1} P(A_i\cap A_j)}{t\psi(x)/\Lambda(x)}
\ \le\ 2\bigl(C_3^{1/\alpha}-C_4^{1/\alpha}\bigr)\,\frac{H_\alpha(a)}{a}.$$

Using (4.10) in (4.9), we have

(4.11)
$$\liminf_{x\to\infty}\frac{P[Z(t)>x]}{t\psi(x)/\Lambda(x)}
\ \ge\ \liminf_{x\to\infty}\frac{P[Z_a(t)>x]}{t\psi(x)/\Lambda(x)}
\ \ge\ C_4^{1/\alpha}\,\frac{H_\alpha(a)}{a} \;-\; 2\bigl(C_3^{1/\alpha}-C_4^{1/\alpha}\bigr)\,\frac{H_\alpha(a)}{a}.$$

Choosing C₃ = C₁ and C₄ = C₂, and letting a → 0 in (4.7), (4.8) and (4.11), we obtain (4.3) and (4.4). Choosing C₁ = C₂ = 1 in (4.3) and (4.4), we obtain (4.5) and (4.6). □
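As a numerical illustration of (4.6) (our own sketch, not part of the paper): the stationary Ornstein–Uhlenbeck process has ρ(h) = e^{−|h|} = 1 − |h| + o(|h|), so condition (0.1) holds with α = 1, H ≡ 1, and Pickands' constant H₁ = 1; here σ̄⁻¹(1/x) = x⁻², so (4.6) predicts P[Z(t) > x] ≈ t·x·φ(x) for large x, with φ the standard normal density. The process choice, step size, and sample counts below are our illustrative assumptions.

```python
import math
import random

# Monte Carlo check (illustration only): for the stationary
# Ornstein-Uhlenbeck process, rho(h) = exp(-|h|), alpha = 1, H == 1,
# and (4.6) predicts P[Z(t) > x] ~ t * x * phi(x) as x -> infinity.

random.seed(1)

t, x = 5.0, 2.5           # time horizon and level (moderate, for speed)
dt, n_paths = 0.01, 4000  # grid step and number of simulated paths
n_steps = int(t / dt)

a = math.exp(-dt)            # exact one-step AR(1) coefficient
b = math.sqrt(1.0 - a * a)   # innovation scale keeping unit variance

exceed = 0
for _ in range(n_paths):
    z = random.gauss(0.0, 1.0)   # stationary start, X(0) ~ N(0,1)
    hit = z > x
    while not hit:
        pass                      # (placeholder removed below)
    # simulate the path on the grid, stopping at the first exceedance
    for _ in range(n_steps):
        if hit:
            break
        z = a * z + b * random.gauss(0.0, 1.0)
        hit = z > x
    exceed += hit

p_hat = exceed / n_paths                                  # MC estimate
phi = math.exp(-x * x / 2.0) / math.sqrt(2.0 * math.pi)   # normal density
p_asym = t * x * phi                                      # prediction from (4.6)

print(p_hat, p_asym)
```

At a moderate level such as x = 2.5 the agreement is only rough: the theorem is a limit statement as x → ∞, and the grid maximum slightly undercounts the continuous maximum.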

Of course, Theorem 3.1 of §3 can easily be generalized to a result analogous to Theorems 2.1 and 2.3 of [12]. We state the following theorem without proof.


Theorem 4.2. If X(·) satisfies (4.1) with 0 < α ≤ 2 for 0 < h < δ and all τ ≥ T, and

(4.12)
$$\rho(\tau,\tau+s) = O(s^{-\gamma}) \quad \text{uniformly in } \tau \text{ as } s\to\infty, \text{ for some } \gamma>0,$$

then, for any positive non-decreasing function φ(t) on some [a, ∞),

$$P[X(t) > v(t)\varphi(t)\ \text{i.o. in } t] = 0 \ \text{or}\ 1$$

according as the integral I(φ) < ∞ or = ∞.


References

(1) Adamovič, D. Sur quelques propriétés des fonctions à croissance lente de Karamata, I, II. Mat. Vesnik 3 (18) (1966), 123-136, 161-172.

(2) Feller, W. An Introduction to Probability Theory and Its Applications, Vol. II. Wiley, New York, 1966.

(3) Garsia, A. and Lamperti, J. A discrete renewal theorem with infinite mean. Comment. Math. Helv. 37 (1963), 221-234.

(4) Ibragimov, I. A. and Linnik, Yu. V. Independent and Stationary Connected Variables. Nauka, Moscow, 1965. (In Russian.)

(5) Karamata, J. Sur un mode de croissance régulière des fonctions. Mathematica (Cluj) 4 (1930), 38-53.

(6) Kôno, N. On the modulus of continuity of sample functions of Gaussian processes. Jour. Math. Kyoto Univ. 10 (1970), 493-536.

(7) Landau, H. J. and Shepp, L. A. On the supremum of a Gaussian process. (To appear.)

(8) Marcus, M. B. Hölder conditions for Gaussian processes with stationary increments. Trans. AMS 134 (1968), 29-52.

(9) Marcus, M. B. and Shepp, L. A. Continuity of Gaussian processes. Trans. AMS 151 (1970), 377-392.

(10) Nisio, M. On the continuity of stationary Gaussian processes. Nagoya Math. Jour. 34 (1969), 89-104.

(11) Pickands, J., III. Upcrossing probabilities for stationary Gaussian processes. Trans. AMS 145 (1969), 51-73.

(12) Qualls, C. and Watanabe, H. An asymptotic 0-1 behavior of Gaussian processes. Univ. of N. Car., Chapel Hill, Inst. of Stat. Mimeo No. 725 (1970).

(13) Sirao, T. and Watanabe, H. On the upper and lower class for stationary Gaussian processes. Trans. AMS 147 (1970), 301-331.

(14) Slepian, D. The one-sided barrier problem for Gaussian noise. Bell Sys. Tech. J. 41 (1962), 463-501.

(15) Watanabe, H. An asymptotic property of Gaussian processes. Trans. AMS 148 (1970), 233-248.


Footnotes

AMS Subject Classifications: Primary 60G15, 42A68, 60G10; Secondary

60F20, 60G17.

Key Phrases: 1. Stationary processes (existence), 2. Fourier transforms, 3. Tauberian theorem for Fourier transforms, 4. Regular variation, 5. Slow variation, 6. Supremum of a Gaussian process, 7. Extreme values of normal processes, 8. Upcrossing probabilities, 9. Law of the iterated logarithm, 10. Zero-one laws, 11. Continuity of sample functions.

1. Research supported in part by the Office of Naval Research under

Contract N00014-67-A-0321-0002.

2. Research supported in part by the National Science Foundation under

Grant GU-2059.