EXTREMES OF AUTOREGRESSIVE THRESHOLD PROCESSES

Claudia Brachner, Vicky Fasen and Alexander Lindner
Advances in Applied Probability, Vol. 41, No. 2 (June 2009), pp. 428-451
Published by the Applied Probability Trust
Stable URL: http://www.jstor.org/stable/27793885


Adv. Appl. Prob. 41, 428-451 (2009)
Printed in Northern Ireland

© Applied Probability Trust 2009

EXTREMES OF AUTOREGRESSIVE THRESHOLD PROCESSES

CLAUDIA BRACHNER,* Allianz Investment Management SE

VICKY FASEN,** Technische Universität München

ALEXANDER LINDNER,*** Technische Universität Braunschweig

Abstract

In this paper we study the tail and the extremal behavior of stationary solutions of threshold autoregressive (TAR) models. It is shown that a regularly varying noise sequence leads in general to only an O-regularly varying tail of the stationary solution. Under further conditions on the partition, it is shown, however, that TAR($S$, 1) models of order 1 with $S$ regimes have regularly varying tails, provided that the noise sequence is regularly varying. In these cases, the finite-dimensional distributions of the stationary solution are even multivariate regularly varying, and its extremal behavior is studied via point process convergence. In particular, a TAR model with regularly varying noise can exhibit extremal clusters. This is in contrast to TAR models with noise in the maximum domain of attraction of the Gumbel distribution which is either subexponential or in the class $\mathcal{L}(\gamma)$ with $\gamma > 0$. In that case it turns out that the tail of the stationary solution behaves like a constant times that of the noise sequence, regardless of the order and the specific partition of the TAR model, and that the process cannot exhibit clusters at high levels.

Keywords: Ergodic; exponential tail; extreme value theory; O-regular variation; point process; regular variation; SETAR process; subexponential distribution; tail behavior; TAR process

2000 Mathematics Subject Classification: Primary 60G70; Secondary 60G10; 60G55

1. Introduction

A (self-exciting) threshold autoregressive ((SE)TAR) model of order $q$ with $S$ regimes is a piecewise AR($q$) process with different regimes, where the current regime depends on the size of the past observations. More precisely, we will consider the following model: let $(Z_k)_{k \in \mathbb{N}_0}$ be an independent and identically distributed (i.i.d.) noise sequence, let $q, p, S, d_1, \ldots, d_p \in \mathbb{N}$ with $d_1 < \cdots < d_p$, let $\{J_i : i = 1, \ldots, S\}$ be a partition of $\mathbb{R}^p$ into pairwise disjoint Borel sets, and let $\alpha_i$, $i = 1, \ldots, S$, as well as $\beta_{ij}$, $i = 1, \ldots, S$, $j = 1, \ldots, q$, be real coefficients. Then we call a process $(X_k)_{k \in \mathbb{N}_0}$ satisfying

$$X_k = \sum_{i=1}^{S} \Bigl( \alpha_i + \sum_{j=1}^{q} \beta_{ij} X_{k-j} \Bigr) \mathbf{1}_{\{(X_{k-d_1}, \ldots, X_{k-d_p}) \in J_i\}} + Z_k, \qquad k \geq \max\{q, d_p\}, \tag{1.1}$$

Received 17 January 2008; revision received 19 February 2009.
* Postal address: Allianz Investment Management SE, Königinstrasse 28, 80802 München, Germany.
** Postal address: Center for Mathematical Sciences, Technische Universität München, Boltzmannstrasse 3, D-85747 Garching, Germany. Email address: [email protected]
Financial support from the Deutsche Forschungsgemeinschaft through a research grant is gratefully acknowledged.
*** Postal address: Institute for Mathematical Stochastics, Technische Universität Braunschweig, Pockelsstrasse 14, D-38106 Braunschweig, Germany. Email address: [email protected]


and for which the starting vector $(X_0, \ldots, X_{\max\{q, d_p\}-1})$ is independent of $(Z_{k+h})_{h \in \mathbb{N}_0}$, a TAR($S$, $q$) process. The current regime at time $k$ is determined by the vector $(X_{k-d_1}, \ldots, X_{k-d_p})$ of past observations, and within each regime, $(X_k)$ follows an AR($q_i$) process with $q_i := \max\{j \in \{1, \ldots, q\} : \beta_{ij} \neq 0\}$. Sometimes the variance of the noise is also allowed to be regime dependent, by replacing $Z_k$ in (1.1) by $\sigma_i Z_k$, where $\sigma_i$ depends on the past in the same way as $\alpha_i$ does, but in this paper we will not consider such a specification. Autoregressive threshold models were introduced by Tong [28] in 1977 and were systematically presented by Tong and Lim [30], who used them as a model for the lynx data. Since then they have found various applications in many areas, such as financial economics, physics, population dynamics, and the neural sciences, to name just a few; see the presentation in [13] for further information

and references. In particular, when used as a model for financial data, it is important to have information about the tail and the extremal behaviors of these models, since stylized facts of financial data are heavy tails and clusters on high levels. The present paper will investigate the tail and the extremal behaviors of TAR models for various classes of driving noise sequences.
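The defining recursion (1.1) is straightforward to simulate. The following sketch is our own illustration, not from the paper: the function name, the Student-t noise, and the example coefficients are all assumptions chosen for demonstration.

```python
import numpy as np

def simulate_tar(n, alphas, betas, partition, d, seed=None):
    """Simulate a TAR(S, q) path following (1.1): within regime i,
    X_k = alpha_i + sum_j beta_{ij} X_{k-j} + Z_k, where regime i is the
    index of the partition cell containing (X_{k-d_1}, ..., X_{k-d_p})."""
    rng = np.random.default_rng(seed)
    q = len(betas[0])
    l = max(q, max(d))
    X = np.zeros(n)
    Z = rng.standard_t(df=3, size=n)  # heavy-tailed noise, chosen for illustration
    for k in range(l, n):
        lag = tuple(X[k - dj] for dj in d)
        i = partition(lag)            # which cell J_i contains the lag vector
        X[k] = alphas[i] + sum(betas[i][j] * X[k - 1 - j] for j in range(q)) + Z[k]
    return X

# Two regimes, order 1: regime decided by the sign of X_{k-1}.
X = simulate_tar(n=1000, alphas=[0.0, 0.5], betas=[[0.4], [-0.3]],
                 partition=lambda lag: 0 if lag[0] <= 0.0 else 1, d=[1], seed=1)
```

The `partition` argument is a callable mapping the lag vector to a regime index, which accommodates arbitrary Borel partitions of $\mathbb{R}^p$.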

A somewhat related paper by Diop and Guegan [10] considered the tail behavior of TAR stochastic volatility models. Observe, however, that the threshold model under consideration in [10] is governed by a different regime switching mechanism, where the regime is not determined by the size of the previous observation, as for the TAR process, but by the sign of the volatility model.

The paper is organized as follows. In Section 2 we collect some known assumptions under which the model has a strictly stationary and geometrically ergodic solution. We then prove that, under the given conditions, the tail of the stationary solution can be estimated by the tail of a certain corresponding AR($q$) sequence, a lemma which will turn out to be a crucial ingredient for the determination of the tail behavior.

Next, in Section 3 we derive the tail and the extremal behaviors of a TAR process with

regularly varying noise. It is shown that the tail of the TAR model is O-regularly varying but not necessarily regularly varying. For this reason, we restrict our attention to the classical TAR($S$, 1) model of order 1 with $S$ regimes, where the partition is a partition into $S$ intervals and the regime is determined by which of these intervals $X_{k-1}$ falls into. In this case we show that the stationary solution has a regularly varying tail, and even that the finite-dimensional distributions are multivariate regularly varying. Furthermore, we derive the extremal behavior by point process convergence. A consequence is that the classical TAR($S$, 1) process can model extremal clusters.

Finally, Section 4 treats general TAR($S$, $q$) models with noise which has at most an

exponentially decreasing tail without being regularly varying. It is then shown that the tail of the TAR model is in the same class and its extremal behavior is determined. Here, the sequence of point processes converges to a Poisson random measure, which reflects, in contrast to the

regularly varying case, the absence of extremal clusters. Some of the results presented in this

paper can also be found in the diploma thesis [3] of the first author.

Throughout the paper, we denote $\mathbb{N} = \{1, 2, \ldots\}$, $\mathbb{N}_0 = \mathbb{N} \cup \{0\}$, $\mathbb{R}_+ = (0, \infty)$, and $\overline{\mathbb{R}} = \mathbb{R} \cup \{+\infty\} \cup \{-\infty\}$, and we use '$\Rightarrow$' to denote weak convergence. For the integer part of a real number $x$, we write $\lfloor x \rfloor = \sup\{n \in \mathbb{Z} : n \leq x\}$. The Dirac measure at a point $x$ will be denoted by $\varepsilon_x$. For two strictly positive functions $g$ and $h$ and a constant $c \in [0, \infty)$, we write $g(t) \sim c\, h(t)$ as $t \to \infty$ if the quotient $g(t)/h(t)$ tends to $c$ as $t \to \infty$. For $x \in \mathbb{R}^l$, we denote by $x^{\mathrm{T}}$ the transpose of $x$ and by $\|x\|$ the maximum norm of $x$. As usual, the positive and negative parts of $x \in \mathbb{R}$ are denoted by $x^+ = \max\{x, 0\}$ and $x^- = \max\{-x, 0\}$, respectively. The tail of a distribution function $F$ will be written as $\overline{F} = 1 - F$.


2. Model assumptions and basic properties

In this paper we restrict our attention to noise sequences $(Z_k)$ which are subexponential (which includes regularly varying noise) or are in the class $\mathcal{L}(\gamma)$ with $\gamma > 0$, which includes tails of the form $\mathrm{P}(Z_1 > x) \sim K x^b e^{-\gamma x}$ as $x \to \infty$ with $b \in \mathbb{R}$ and $K > 0$. When determining the tail behavior, we will further assume the following classical tail balance condition, which is also standard in the extreme value theory of linear ARMA processes, as presented in, e.g., [12, Section A.3.3].

Condition 2.1. (Tail balance (TB).) There are constants $p^+, p^- \in [0, 1]$ such that $p^+ + p^- = 1$, and the tail of $Z_1$ satisfies the TB condition if

$$\mathrm{P}(Z_1 > x) \sim p^+ \mathrm{P}(|Z_1| > x) \quad \text{and} \quad \mathrm{P}(Z_1 < -x) \sim p^- \mathrm{P}(|Z_1| > x) \quad \text{as } x \to \infty.$$

For the existence of strictly stationary solutions $(X_k)_{k \in \mathbb{N}_0}$, we use the following sufficient condition on the distribution of $Z_1$ and the size of the coefficients, which is sufficiently general for our purposes. That it is not a necessary condition for stationarity can be seen from the presentation in [6].

Condition 2.2. Let $(Z_k)_{k \in \mathbb{N}_0}$ be an i.i.d. sequence whose marginal distribution has a Lebesgue density $h$ satisfying $\inf_{x \in K} h(x) > 0$ for every compact set $K \subset \mathbb{R}$. Furthermore, assume that $\mathrm{E}|Z_1|^{\min\{1, \eta\}} < \infty$ for some $\eta > 0$. Denoting $\alpha := \max_{i=1,\ldots,S} |\alpha_i|$ and $\beta_j := \max_{i=1,\ldots,S} |\beta_{ij}|$, assume further that $\beta := \sum_{j=1}^{q} \beta_j < 1$.

Define the function $f : \mathbb{R}^l \to \mathbb{R}$ by

$$f(x_{k-1}, x_{k-2}, \ldots, x_{k-l}) := \sum_{i=1}^{S} \Bigl( \alpha_i + \sum_{j=1}^{q_i} \beta_{ij} x_{k-j} \Bigr) \mathbf{1}_{\{(x_{k-d_1}, \ldots, x_{k-d_p}) \in J_i\}},$$

where $l := \max\{q, d_p\}$, and define $\mathbf{X}_k := (X_k, X_{k-1}, \ldots, X_{k-l+1})^{\mathrm{T}}$. Then the threshold model (1.1) has the representation

$$X_k = f(\mathbf{X}_{k-1}) + Z_k \quad \text{for } k \in \{l, l+1, \ldots\},$$

with $f(\mathbf{X}_{k-1})$ being independent of $Z_k$. Furthermore, we write $\mathbf{Z}_k := (Z_k, 0, \ldots, 0)^{\mathrm{T}}$ and $\mathbf{f}(x) := (f(x), x_1, \ldots, x_{l-1})^{\mathrm{T}}$ for $x := (x_1, \ldots, x_l)^{\mathrm{T}} \in \mathbb{R}^l$. Then $(\mathbf{X}_k)_{k \geq l-1}$ is a Markov chain, where

$$\mathbf{X}_k = \mathbf{f}(\mathbf{X}_{k-1}) + \mathbf{Z}_k \quad \text{for } k \geq l.$$

The following lemma ensures the existence of a geometrically ergodic, strictly stationary solution. Geometric ergodicity has already been proved in [5] (cf. [29, Example A 1.2, p. 464]) and [1, Theorem 3.2 and Example 3.6], under the slightly more restrictive condition that $\mathrm{E}|Z_1| < \infty$. However, the proof in [1] can easily be generalized to the case where only finiteness of $\mathrm{E}|Z_1|^{\min\{1, \eta\}}$ for some $\eta > 0$ is assumed: simply replace the test function $(x_1, \ldots, x_p) \mapsto \max\{|x_1|, \ldots, |x_p|\}$ appearing in the proof of Theorem 3.2 of [1] by $(x_1, \ldots, x_p) \mapsto (\max\{|x_1|, \ldots, |x_p|\})^{\min\{1, \eta\}}$. Also, note that the proof in [1] carries over to the more general partitions considered in (1.1) without change. That geometric ergodicity of the stationary solution implies strong mixing with geometrically decreasing mixing rate, in the sense that

$$\sup_{A \in \sigma(X_j : j \leq m),\; B \in \sigma(X_j : j \geq k+m)} |\mathrm{P}(A \cap B) - \mathrm{P}(A)\mathrm{P}(B)| \leq K \gamma^k, \qquad k, m \in \mathbb{N}, \tag{2.1}$$

for some $0 < \gamma < 1$ and $K > 0$, then follows from [21, Theorem 16.1.5].


Lemma 2.1. Let $(X_k)_{k \in \mathbb{N}_0}$ be a TAR process as given in (1.1), and suppose that Condition 2.2 holds. Then $(X_k)_{k \in \mathbb{N}_0}$ is geometrically ergodic and admits a unique strictly stationary solution, which is strongly mixing with geometrically decreasing mixing rate in the sense of (2.1).

The next lemma is crucial for the analysis of extremes of TAR models. It states that, under our assumptions, there is a causal AR($q$) process whose stationary solution has a tail which is not smaller than that of the stationary solution of the TAR process.

Lemma 2.2. Suppose that Condition 2.2 holds, and let $(X_k)_{k \in \mathbb{N}_0}$ be the stationary solution of the TAR model (1.1). Let $(\widetilde{Z}_k)_{k \in \mathbb{Z}}$ be an i.i.d. sequence such that $\widetilde{Z}_k = |Z_k| + \alpha$ for $k \in \mathbb{N}_0$. Denote by $(\widetilde{X}_k)_{k \in \mathbb{Z}}$ the unique strictly stationary solution of the causal AR($q$) process

$$\widetilde{X}_k := \sum_{j=1}^{q} \beta_j \widetilde{X}_{k-j} + \widetilde{Z}_k, \qquad k \in \mathbb{Z}. \tag{2.2}$$

Then $(\widetilde{X}_k)_{k \in \mathbb{Z}}$ has the almost-surely convergent moving average representation

$$\widetilde{X}_k = \sum_{j=0}^{\infty} \psi_j \widetilde{Z}_{k-j}, \tag{2.3}$$

where $\psi_0 = 1$, $0 \leq \psi_j \leq 1$ for $j \in \mathbb{N}$, and $(\psi_j)_{j \in \mathbb{N}_0}$ is bounded by a geometrically decreasing sequence, i.e. $\psi_j \leq K \gamma^j$ for some $0 < \gamma < 1$ and $K > 0$. Furthermore, for any $m \in \mathbb{N}$, $k_1, \ldots, k_m \in \mathbb{N}_0$, and $x_1, \ldots, x_m \in \mathbb{R}$, it holds that

$$\mathrm{P}(|X_{k_1}| > x_1, \ldots, |X_{k_m}| > x_m) \leq \mathrm{P}(\widetilde{X}_{k_1} > x_1, \ldots, \widetilde{X}_{k_m} > x_m). \tag{2.4}$$
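The comparison (2.4) rests on a pathwise coupling: driving a dominating AR recursion with $|Z_k| + \alpha$ forces $|X_k| \leq X_k^*$ along every path. A numerical sketch for a hypothetical TAR(2, 1) specification (all concrete choices below are ours, for illustration only):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 5000
Z = rng.standard_cauchy(n)                 # regularly varying noise (illustration)

# Hypothetical TAR(2, 1): coefficients per regime, regime set by the sign of X_{k-1}.
alphas, betas = [0.5, -0.2], [0.6, -0.4]
alpha = max(abs(a) for a in alphas)        # alpha of Condition 2.2
beta = max(abs(b) for b in betas)          # beta_1 of Condition 2.2; here beta < 1

X = np.zeros(n)                            # TAR path
Xstar = np.zeros(n)                        # dominating AR(1) path driven by |Z_k| + alpha
for k in range(1, n):
    i = 0 if X[k - 1] <= 0 else 1
    X[k] = alphas[i] + betas[i] * X[k - 1] + Z[k]
    Xstar[k] = beta * Xstar[k - 1] + abs(Z[k]) + alpha

# Pathwise domination |X_k| <= X_k^*, which underlies the tail comparison (2.4).
assert np.all(np.abs(X) <= Xstar + 1e-9)
```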

Proof. Since the polynomial $\Phi(z) := 1 - \sum_{j=1}^{q} \beta_j z^j$ has no zeros for $|z| \leq 1$ as a consequence of $\sum_{j=1}^{q} \beta_j < 1$, it follows that process (2.2) is causal. Expanding $\Phi(z)^{-1} = \sum_{j=0}^{\infty} \psi_j z^j$ in a power series around the origin gives $\psi_0 = 1$, $\psi_1 = \beta_1$, and the recursion $\psi_m = \sum_{j=\max\{0, m-q\}}^{m-1} \beta_{m-j} \psi_j$ for $m \geq 2$. A simple induction argument then shows that $0 \leq \psi_j \leq 1$ for $j \geq 1$ and that $(\psi_m)$ can be dominated by an exponentially decreasing sequence. In particular, $\sum_{j=0}^{\infty} \psi_j^{\min\{1, \eta\}} < \infty$. Since $\mathrm{E}|\widetilde{Z}_1|^{\min\{1, \eta\}} < \infty$ by assumption, this gives almost-sure convergence of (2.3). Define the sequence $(X_k^*)_{k \in \mathbb{N}_0}$ by $X_0^* := |X_0|, \ldots, X_{q-1}^* := |X_{q-1}|$ and

$$X_k^* := \sum_{j=1}^{q} \beta_j X_{k-j}^* + \widetilde{Z}_k, \qquad k \geq q.$$

Then it follows by induction that

$$|X_k| \leq X_k^* \quad \text{for all } k \in \mathbb{N}_0, \tag{2.5}$$

since

$$|X_k| \leq \max_{i=1,\ldots,S} |\alpha_i| + \sum_{j=1}^{q} \max_{i=1,\ldots,S} |\beta_{ij}| \, |X_{k-j}| + |Z_k| \leq \sum_{j=1}^{q} \beta_j X_{k-j}^* + \widetilde{Z}_k = X_k^*.$$

Observe that $(X_{k_1+n}^*, \ldots, X_{k_m+n}^*)$ converges in distribution as $n \to \infty$ to the stationary solution $(\widetilde{X}_{k_1}, \ldots, \widetilde{X}_{k_m})$ of the causal AR($q$) process (2.2), and that the topological boundary of the set $[x_1, \infty) \times \cdots \times [x_m, \infty)$ has $\mathrm{P}_{(\widetilde{X}_{k_1}, \ldots, \widetilde{X}_{k_m})}$-measure 0 as a consequence of the absolute continuity of the distribution of $\widetilde{Z}_1$. Thus, we conclude from (2.5) that

$$\begin{aligned} \mathrm{P}(|X_{k_1}| > x_1, \ldots, |X_{k_m}| > x_m) &= \lim_{n \to \infty} \mathrm{P}(|X_{k_1+n}| > x_1, \ldots, |X_{k_m+n}| > x_m) \\ &\leq \limsup_{n \to \infty} \mathrm{P}(X_{k_1+n}^* > x_1, \ldots, X_{k_m+n}^* > x_m) \\ &= \mathrm{P}(\widetilde{X}_{k_1} > x_1, \ldots, \widetilde{X}_{k_m} > x_m), \end{aligned}$$

showing (2.4).
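The coefficients $\psi_j$ of $\Phi(z)^{-1}$ can be generated directly from the recursion in the proof. A sketch (example coefficients ours) checking $0 \leq \psi_j \leq 1$ and the total mass $\sum_j \psi_j = \Phi(1)^{-1} = 1/(1 - \sum_j \beta_j)$:

```python
def psi_coefficients(beta, nmax):
    """Power series coefficients of 1/Phi(z) with Phi(z) = 1 - sum_j beta_j z^j,
    via psi_0 = 1 and psi_m = sum_{j = max(0, m-q)}^{m-1} beta_{m-j} psi_j."""
    q = len(beta)
    psi = [1.0]
    for m in range(1, nmax + 1):
        psi.append(sum(beta[m - j - 1] * psi[j] for j in range(max(0, m - q), m)))
    return psi

beta = [0.3, 0.2, 0.1]          # example coefficients with sum(beta) < 1 (Condition 2.2)
psi = psi_coefficients(beta, 200)

assert all(0.0 <= p <= 1.0 for p in psi)                 # as claimed in the proof
assert abs(sum(psi) - 1.0 / (1.0 - sum(beta))) < 1e-9    # evaluating 1/Phi(1)
```

The truncation at 200 terms is harmless here because the $\psi_j$ decay geometrically.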

3. Regularly varying noise

Recall that a measurable function $f : (0, \infty) \to (0, \infty)$ is said to be regularly varying (at $\infty$) with index $-\kappa \in \mathbb{R}$, written as $f \in \mathcal{R}_{-\kappa}$, if

$$\lim_{x \to \infty} \frac{f(ux)}{f(x)} = u^{-\kappa} \quad \text{for all } u > 0. \tag{3.1}$$

Functions in $\mathcal{R}_0$ are also called slowly varying functions and, for $\kappa > 0$, it holds that $f \in \mathcal{R}_{-\kappa}$ if and only if $f(x) = x^{-\kappa} L(x)$ for all $x > 0$ with a slowly varying function $L$. For a random variable $Z$ with distribution function $F_Z$, we also write $Z \in \mathcal{R}_{-\kappa}$ to indicate that $Z$ has a regularly varying tail, i.e. that $\overline{F}_Z \in \mathcal{R}_{-\kappa}$. Examples of distributions having regularly varying tails include Pareto distributions and $\alpha$-stable distributions with $\alpha \in (0, 2)$.

3.1. O-regular variation of TAR models

Unlike for linear models such as ARMA processes, stationary solutions of general TAR models with regularly varying noise have in general only O-regularly varying tails. Recall that a measurable function $f : (0, \infty) \to (0, \infty)$ is called O-regularly varying (at $\infty$) if

$$0 < \liminf_{x \to \infty} \frac{f(ux)}{f(x)} \leq \limsup_{x \to \infty} \frac{f(ux)}{f(x)} < \infty \quad \text{for all } u > 0.$$

Clearly, every regularly varying function is O-regularly varying. For TAR processes with regularly varying noise, we now have the following result.

Lemma 3.1. (O-regular variation.) Suppose that Conditions 2.1 and 2.2 hold, and let $(X_k)_{k \in \mathbb{N}_0}$ be a stationary version of the TAR process as given in (1.1). Suppose further that $|Z_1| \in \mathcal{R}_{-\kappa}$ for some $\kappa > 0$. Then

$$p^+ 2^{-\kappa} \leq \liminf_{x \to \infty} \frac{\mathrm{P}(X_0 > x)}{\mathrm{P}(|Z_1| > x)} \leq \limsup_{x \to \infty} \frac{\mathrm{P}(X_0 > x)}{\mathrm{P}(|Z_1| > x)} \leq \sum_{j=0}^{\infty} \psi_j^{\kappa} < \infty, \tag{3.2}$$

where $(\psi_j)_{j \in \mathbb{N}_0}$ is given as in Lemma 2.2. In particular, $\overline{F}_X$ is O-regularly varying if $p^+ > 0$.

Proof. From Lemma 2.2 and [24, Lemma 4.24], we obtain

$$\limsup_{x \to \infty} \frac{\mathrm{P}(X_0 > x)}{\mathrm{P}(|Z_1| > x)} \leq \limsup_{x \to \infty} \frac{\mathrm{P}(\widetilde{X}_0 > x)}{\mathrm{P}(|Z_1| > x)} = \sum_{j=0}^{\infty} \psi_j^{\kappa}.$$


On the other hand, since $X_1 - Z_1$ is independent of $Z_1$, it also holds that

$$\liminf_{x \to \infty} \frac{\mathrm{P}(X_0 > x)}{\mathrm{P}(|Z_1| > x)} \geq \liminf_{x \to \infty} \frac{\mathrm{P}(Z_1 > 2x)}{\mathrm{P}(|Z_1| > x)} \, \mathrm{P}(X_1 - Z_1 > -x) = p^+ 2^{-\kappa}.$$

This gives (3.2), implying O-regular variation of the tail of $X_0$ if $p^+ > 0$.

The next proposition shows that without specific assumptions on the partition, regular variation of the stationary distribution cannot be expected, even for a TAR(2, 1) model.

Proposition 3.1. Let $(Z_k)_{k \in \mathbb{N}_0}$ be an i.i.d. sequence such that $\mathrm{P}(Z_1 > x) \sim x^{-\kappa}$ as $x \to \infty$ for some $\kappa > 0$ and that Conditions 2.1 and 2.2 hold with $p^+ > 0$. For the partition $J_1 := \bigcup_{m \in \mathbb{N}_0} (4^m, 4^{m+1/2}]$ and $J_2 := \mathbb{R} \setminus J_1$, consider the TAR(2, 1) model

$$X_k = \begin{cases} \beta_1 X_{k-1} + Z_k & \text{for } X_{k-1} \in J_1, \\ Z_k & \text{for } X_{k-1} \in J_2, \end{cases}$$

where $0 < \beta_1 < 1$. Then there are constants $0 < c_1 < c_2 < \infty$ such that the stationary solution $(X_k)_{k \in \mathbb{N}_0}$ of the TAR(2, 1) model with distribution function $F_X$ satisfies

$$c_1 x^{-\kappa} \leq \overline{F}_X(x) \leq c_2 x^{-\kappa}, \qquad x \geq 1, \tag{3.3}$$

but $\overline{F}_X$ is not regularly varying.
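Membership in the lacunary partition of Proposition 3.1 reduces to the fractional part of a base-4 logarithm; a sketch (the helper name is ours):

```python
import math

def in_J1(x):
    """Test membership in J_1 = union over m >= 0 of (4^m, 4^(m+1/2)]:
    for x > 1 this holds iff the fractional part of log_4(x) lies in (0, 1/2]."""
    if x <= 1.0:                       # the union starts strictly above 4^0 = 1
        return False
    t = math.log(x) / math.log(4.0)
    frac = t - math.floor(t)
    return 0.0 < frac <= 0.5

# (4^0, 4^(1/2)] = (1, 2], (4^1, 4^(3/2)] = (4, 8], (4^2, 4^(5/2)] = (16, 32], ...
assert in_J1(1.5) and in_J1(6.0) and in_J1(17.0)
assert not (in_J1(3.0) or in_J1(10.0) or in_J1(0.5))
```

On the multiplicative scale the partition repeats with period 4, which is exactly what destroys slow variation of the correction factor in the proof below.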

Proof. Equation (3.3) follows immediately from Lemma 3.1. Let $L$ be the function satisfying

$$\overline{F}_X(x) = \mathrm{P}(X_0 > x) = L(x) x^{-\kappa}, \qquad x > 0.$$

By (3.3), it follows that $c_1 \leq L(x) \leq c_2$ for $x \geq 1$. For $k \in \mathbb{N}_0$, define

$$W_k := \begin{cases} \beta_1 X_k & \text{for } X_k \in J_1, \\ 0 & \text{for } X_k \in J_2. \end{cases}$$

We will show that the assumption that $\overline{F}_X$ is regularly varying gives an O-regularly varying tail of $W_{k-1}$ which is not regularly varying, and then obtain a contradiction with the tail behavior of $X_k = W_{k-1} + Z_k$, where $W_{k-1}$ and $Z_k$ are independent. So we assume that $\overline{F}_X$ is regularly varying. Then it follows that $L$ must be slowly varying.

We will first show that there are constants $0 < d_1 < d_2 < \infty$ such that

$$d_1 x^{-\kappa} \leq \mathrm{P}(W_k > x) \leq d_2 x^{-\kappa}, \qquad x \geq 1. \tag{3.4}$$

Here, the right-hand inequality follows easily from $\mathrm{P}(W_k > x) \leq \mathrm{P}(\beta_1 X_k > x)$ and (3.3). For the left-hand inequality, observe that the regular variation of $\overline{F}_X$ implies that

$$\frac{\mathrm{P}(X_k \in (x, 2x])}{\mathrm{P}(X_k \in (x, 4x])} = \frac{[\mathrm{P}(X_k > x) - \mathrm{P}(X_k > 2x)]/\mathrm{P}(X_k > x)}{[\mathrm{P}(X_k > x) - \mathrm{P}(X_k > 4x)]/\mathrm{P}(X_k > x)} \xrightarrow{x \to \infty} \frac{1 - 2^{-\kappa}}{1 - 4^{-\kappa}}.$$


For $x > 0$, denote by $m(x)$ the unique nonnegative integer such that $4^{m(x)-1} < x/\beta_1 \leq 4^{m(x)}$. Then, given $\varepsilon > 0$, it follows that, for large enough $x \geq x(\varepsilon)$,

$$\begin{aligned} \mathrm{P}(W_k > x) &= \sum_{m=0}^{\infty} \mathrm{P}\Bigl( X_k > \frac{x}{\beta_1},\; X_k \in (4^m, 4^{m+1/2}] \Bigr) \\ &\geq \sum_{m=m(x)}^{\infty} \mathrm{P}(X_k \in (4^m, 4^{m+1/2}]) \\ &\geq (1 - \varepsilon) \frac{1 - 2^{-\kappa}}{1 - 4^{-\kappa}} \sum_{m=m(x)}^{\infty} \mathrm{P}(X_k \in (4^m, 4^{m+1}]) \\ &= (1 - \varepsilon) \frac{1 - 2^{-\kappa}}{1 - 4^{-\kappa}} \mathrm{P}(X_k > 4^{m(x)}). \end{aligned}$$

The left-hand inequality in (3.4) then follows from the corresponding one in (3.3) and $4^{m(x)} \leq 4x/\beta_1$. Thus, it follows from (3.4) that we can write

$$\mathrm{P}(W_k > x) = r(x) x^{-\kappa}, \qquad x \in \mathbb{R}_+, \tag{3.5}$$

where $d_1 \leq r(x) \leq d_2$ for $x \geq 1$. Now, let $(x_m)_{m \in \mathbb{N}}$ be a sequence of numbers such that $x_m/\beta_1 \in [4^{m+4/6}, 4^{m+5/6}]$. Then $\lambda x_m/\beta_1 \in (4^{m+1/2}, 4^{m+1})$ for every $\lambda \in (4^{-1/6}, 4^{1/6})$, so that

$$\mathrm{P}(W_k > \lambda x_m) = \mathrm{P}\Bigl( X_k > \frac{\lambda x_m}{\beta_1},\; X_k \in J_1 \Bigr) = \mathrm{P}\Bigl( X_k > \frac{x_m}{\beta_1},\; X_k \in J_1 \Bigr) = \mathrm{P}(W_k > x_m),$$

since $J_1 \cap (4^{m+1/2}, 4^{m+1}] = \emptyset$, giving

$$r(\lambda x_m) = \lambda^{\kappa} r(x_m) \quad \text{for all } \lambda \in (4^{-1/6}, 4^{1/6}) \text{ and } m \in \mathbb{N}. \tag{3.6}$$

This implies in particular that $r$ is not slowly varying, so that the distribution function of $W_k$ cannot have a regularly varying tail by (3.5).

Choose $\delta > 0$ such that $4^{-1/6} < 1 - \delta < 1 + \delta < 4^{1/6}$, and let $x' := (1 + \delta)x$ and $x'' := (1 - \delta)x$. Write $\mathrm{P}(Z_1 > x) = q(x) x^{-\kappa}$, so that $\lim_{x \to \infty} q(x) = 1$. Then, using exactly the same proof as in [17, p. 278], it follows that, for given $\varepsilon > 0$ and large enough $x \geq x(\varepsilon)$, the tail of $X_k = W_{k-1} + Z_k$ satisfies

$$(1 - \varepsilon)(r(x') + q(x'))(x')^{-\kappa} \leq \mathrm{P}(X_k > x) \leq (1 + \varepsilon)(r(x'') + q(x''))(x'')^{-\kappa}. \tag{3.7}$$

Choosing $x_m$ as before, the left-hand inequality of (3.7) together with (3.6) shows that, for large enough $m$,

$$\begin{aligned} \frac{\mathrm{P}(X_k > x_m)}{(r(x_m) + q(x_m)) x_m^{-\kappa}} &\geq (1 - \varepsilon)(1 + \delta)^{-\kappa} \frac{r(x_m)(1 + \delta)^{\kappa} + q(x_m')}{r(x_m) + q(x_m)} \\ &= (1 - \varepsilon)(1 + \delta)^{-\kappa} \Bigl( 1 + \frac{r(x_m)((1 + \delta)^{\kappa} - 1) + q(x_m') - q(x_m)}{r(x_m) + q(x_m)} \Bigr). \end{aligned}$$

Using $d_1 \leq r(x) \leq d_2$ for $x \geq 1$, $\lim_{x \to \infty} q(x) = 1$, and $\lim_{\delta \downarrow 0} ((1 + \delta)^{\kappa} - 1) = 0$, we conclude that

$$\liminf_{m \to \infty} \frac{\mathrm{P}(X_k > x_m)}{(r(x_m) + q(x_m)) x_m^{-\kappa}} \geq 1.$$

A similar argument holds for the limes superior, so that

$$L(x_m) \sim r(x_m) + q(x_m) \quad \text{as } m \to \infty. \tag{3.8}$$

Taking for $x_m$ the sequences $u_m := 4^{m+9/12} \beta_1$ and $v_m := 4^{m+10/12} \beta_1 = 4^{1/12} u_m$, it follows, from (3.6) and (3.8), that

$$\frac{L(v_m)}{L(u_m)} \sim \frac{r(v_m) + q(v_m)}{r(u_m) + q(u_m)} = \frac{4^{\kappa/12} r(u_m) + q(v_m)}{r(u_m) + q(u_m)} = 1 + \frac{(4^{\kappa/12} - 1) r(u_m) + q(v_m) - q(u_m)}{r(u_m) + q(u_m)} \quad \text{as } m \to \infty,$$

and the latter does not converge to 1 as $m \to \infty$, since $d_1 \leq r(u_m) \leq d_2$ and $\lim_{x \to \infty} q(x) = 1$. Hence, $L$ cannot be slowly varying, contradicting the regular variation of $\overline{F}_X$.

3.2. Regular variation of TAR(S, 1) models with specific partitions

In this and the next subsection we restrict our attention to stationary TAR($S$, 1) models with representation

$$X_k = \sum_{i=1}^{S} (\alpha_i + \beta_i X_{k-1}) \mathbf{1}_{\{X_{k-1} \in J_i\}} + Z_k \quad \text{for } k \in \mathbb{N}, \tag{3.9}$$

where $J_1 = (-\infty, r_1]$, $J_2 = (r_2, \infty)$ for some $r_1, r_2 \in \mathbb{R}$, $r_1 \leq r_2$, and $\{J_i : i = 3, \ldots, S\}$ is a measurable partition of $(r_1, r_2]$. For this model, we are able to compute the tail behavior explicitly, as we will show in the next lemma.

Lemma 3.2. (Regular variation.) Suppose that Conditions 2.1 and 2.2 hold, and let $(X_k)_{k \in \mathbb{N}_0}$ be a stationary version of the TAR($S$, 1) process as given in (3.9). Suppose further that $|Z_1| \in \mathcal{R}_{-\kappa}$ for some $\kappa > 0$. Then $|X_0| \in \mathcal{R}_{-\kappa}$. More precisely, defining

$$\beta_i^+ := \begin{cases} \beta_i, & \beta_i > 0, \\ 0, & \beta_i \leq 0, \end{cases} \qquad \beta_i^- := \begin{cases} -\beta_i, & \beta_i < 0, \\ 0, & \beta_i \geq 0, \end{cases}$$

for $i = 1, 2$, it holds that

$$\lim_{x \to \infty} \frac{\mathrm{P}(X_0 > x)}{\mathrm{P}(|Z_1| > x)} = \frac{p^+ + p^- (\beta_1^-)^{\kappa}}{1 - (\beta_2^+)^{\kappa} - (\beta_1^-)^{\kappa} (\beta_2^-)^{\kappa}} =: \widetilde{p}^+ \tag{3.10}$$

and

$$\lim_{x \to \infty} \frac{\mathrm{P}(X_0 < -x)}{\mathrm{P}(|Z_1| > x)} = \frac{p^- + p^+ (\beta_2^-)^{\kappa}}{1 - (\beta_1^+)^{\kappa} - (\beta_1^-)^{\kappa} (\beta_2^-)^{\kappa}} =: \widetilde{p}^-. \tag{3.11}$$

In particular, $X_0 \in \mathcal{R}_{-\kappa}$ if $\widetilde{p}^+ > 0$. Furthermore, $\widetilde{p}^+ \geq p^+$ and $\widetilde{p}^- \geq p^-$.
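For concrete coefficients, the limit constants in (3.10) and (3.11) are easy to evaluate. The following sketch (helper name and example values ours) also checks the collapse to $p^+/(1 - (\beta_2^+)^{\kappa})$ when $\beta_1 \geq 0$ and the inequalities $\widetilde{p}^{\pm} \geq p^{\pm}$:

```python
def tar1_tail_constants(beta1, beta2, kappa, p_plus):
    """Evaluate the limits (3.10)-(3.11) for the TAR(S, 1) model (3.9),
    with b^+ = max(b, 0) and b^- = max(-b, 0)."""
    p_minus = 1.0 - p_plus
    b1p, b1m = max(beta1, 0.0), max(-beta1, 0.0)
    b2p, b2m = max(beta2, 0.0), max(-beta2, 0.0)
    cross = (b1m ** kappa) * (b2m ** kappa)
    pt_plus = (p_plus + p_minus * b1m ** kappa) / (1.0 - b2p ** kappa - cross)
    pt_minus = (p_minus + p_plus * b2m ** kappa) / (1.0 - b1p ** kappa - cross)
    return pt_plus, pt_minus

pt_plus, pt_minus = tar1_tail_constants(beta1=0.3, beta2=0.5, kappa=2.0, p_plus=0.7)
assert abs(pt_plus - 0.7 / (1.0 - 0.5 ** 2)) < 1e-12   # beta1 >= 0 collapses (3.10)
assert pt_plus >= 0.7 and pt_minus >= 0.3              # p~+ >= p+ and p~- >= p-
```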


Proof. Let $x > 0$ be fixed, and let $(a_n)_{n \in \mathbb{N}}$ be a sequence of positive real numbers tending to $\infty$ as $n \to \infty$. Then, for any $\delta > 0$, we can write

$$\begin{aligned} \mathrm{P}(X_1 > x a_n) &= \mathrm{P}(X_1 > x a_n,\; |Z_1| > \delta a_n,\; |X_0| > \delta a_n) \\ &\quad + \mathrm{P}(X_1 > x a_n,\; |Z_1| > \delta a_n,\; |X_0| \leq \delta a_n) \\ &\quad + \mathrm{P}(X_1 > x a_n,\; |Z_1| \leq \delta a_n,\; |X_0| \leq \delta a_n) \\ &\quad + \mathrm{P}(X_1 > x a_n,\; |Z_1| \leq \delta a_n,\; X_0 > \delta a_n) \\ &\quad + \mathrm{P}(X_1 > x a_n,\; |Z_1| \leq \delta a_n,\; X_0 < -\delta a_n) \\ &=: I + II + III + IV + V, \text{ say.} \end{aligned} \tag{3.12}$$

We will study the tail behavior of the five summands of (3.12). Using the independence of $Z_1$ and $X_0$, we obtain for the first term

$$\lim_{n \to \infty} \frac{I}{\mathrm{P}(|Z_1| > x a_n)} \leq \lim_{n \to \infty} \frac{\mathrm{P}(|Z_1| > \delta a_n) \mathrm{P}(|X_0| > \delta a_n)}{\mathrm{P}(|Z_1| > x a_n)} = 0. \tag{3.13}$$

For the second term of (3.12), observe that

$$\limsup_{n \to \infty} \frac{\mathrm{P}(X_1 > x a_n,\; |Z_1| > \delta a_n,\; |X_0| \leq \delta a_n)}{\mathrm{P}(|Z_1| > x a_n)} \leq \limsup_{n \to \infty} \frac{\mathrm{P}(Z_1 > (x - \beta \delta) a_n - \alpha)}{\mathrm{P}(|Z_1| > x a_n)} = p^+ \frac{(x - \beta \delta)^{-\kappa}}{x^{-\kappa}}.$$

On the other hand,

$$\begin{aligned} &\liminf_{n \to \infty} \frac{\mathrm{P}(X_1 > x a_n,\; |Z_1| > \delta a_n,\; |X_0| \leq \delta a_n)}{\mathrm{P}(|Z_1| > x a_n)} \\ &\quad \geq \liminf_{n \to \infty} \frac{\mathrm{P}(Z_1 > (x + \beta \delta) a_n + \alpha) - \mathrm{P}(Z_1 > (x + \beta \delta) a_n + \alpha,\; |X_0| > \delta a_n)}{\mathrm{P}(|Z_1| > x a_n)} \\ &\quad = p^+ \frac{(x + \beta \delta)^{-\kappa}}{x^{-\kappa}} \quad \text{for } 0 < \delta < x, \end{aligned}$$

and we conclude that

$$\lim_{\delta \downarrow 0} \liminf_{n \to \infty} \frac{II}{\mathrm{P}(|Z_1| > x a_n)} = \lim_{\delta \downarrow 0} \limsup_{n \to \infty} \frac{II}{\mathrm{P}(|Z_1| > x a_n)} = p^+. \tag{3.14}$$

The third term of (3.12) is 0 for large $n$, provided that $\delta < x/2$. For the investigation of $IV$ and $V$, we define

$$\overline{A} = \limsup_{u \to \infty} \frac{\mathrm{P}(X_0 > u)}{\mathrm{P}(|Z_1| > u)}, \qquad \underline{A} = \liminf_{u \to \infty} \frac{\mathrm{P}(X_0 > u)}{\mathrm{P}(|Z_1| > u)},$$

$$\overline{B} = \limsup_{u \to \infty} \frac{\mathrm{P}(X_0 < -u)}{\mathrm{P}(|Z_1| > u)}, \qquad \underline{B} = \liminf_{u \to \infty} \frac{\mathrm{P}(X_0 < -u)}{\mathrm{P}(|Z_1| > u)}.$$

All these terms are finite by Lemma 3.1. Then we obtain

$$\lim_{\delta \downarrow 0} \limsup_{n \to \infty} \frac{IV}{\mathrm{P}(|Z_1| > x a_n)} \leq \lim_{\delta \downarrow 0} \limsup_{n \to \infty} \frac{\mathrm{P}(\alpha_2 + \beta_2^+ X_0 > (x - \delta) a_n)}{\mathrm{P}(|Z_1| > x a_n)} = \overline{A} (\beta_2^+)^{\kappa} \tag{3.15}$$

and, similarly,

$$\lim_{\delta \downarrow 0} \liminf_{n \to \infty} \frac{IV}{\mathrm{P}(|Z_1| > x a_n)} \geq \lim_{\delta \downarrow 0} \liminf_{n \to \infty} \frac{\mathrm{P}(\alpha_2 + \beta_2^+ X_0 > (x + \delta) a_n)}{\mathrm{P}(|Z_1| > x a_n)} \, \mathrm{P}(|Z_1| \leq \delta a_n) = \underline{A} (\beta_2^+)^{\kappa}. \tag{3.16}$$

The bounds for $V$ are

$$\lim_{\delta \downarrow 0} \limsup_{n \to \infty} \frac{V}{\mathrm{P}(|Z_1| > x a_n)} \leq \lim_{\delta \downarrow 0} \limsup_{n \to \infty} \frac{\mathrm{P}(\alpha_1 - \beta_1^- X_0 > (x - \delta) a_n)}{\mathrm{P}(|Z_1| > x a_n)} = \overline{B} (\beta_1^-)^{\kappa} \tag{3.17}$$

and

$$\lim_{\delta \downarrow 0} \liminf_{n \to \infty} \frac{V}{\mathrm{P}(|Z_1| > x a_n)} \geq \lim_{\delta \downarrow 0} \liminf_{n \to \infty} \frac{\mathrm{P}(\alpha_1 - \beta_1^- X_0 > (x + \delta) a_n)}{\mathrm{P}(|Z_1| > x a_n)} \, \mathrm{P}(|Z_1| \leq \delta a_n) = \underline{B} (\beta_1^-)^{\kappa}. \tag{3.18}$$

Then (3.12)-(3.18) give

$$p^+ + \underline{A} (\beta_2^+)^{\kappa} + \underline{B} (\beta_1^-)^{\kappa} \leq \underline{A} \leq \overline{A} \leq p^+ + \overline{A} (\beta_2^+)^{\kappa} + \overline{B} (\beta_1^-)^{\kappa}. \tag{3.19}$$

Since

$$-X_k = \sum_{i=1}^{S} (-\alpha_i + \beta_i (-X_{k-1})) \mathbf{1}_{\{-X_{k-1} \in -J_i\}} - Z_k,$$

we obtain, by symmetry,

$$p^- + \underline{A} (\beta_2^-)^{\kappa} + \underline{B} (\beta_1^+)^{\kappa} \leq \underline{B} \leq \overline{B} \leq p^- + \overline{A} (\beta_2^-)^{\kappa} + \overline{B} (\beta_1^+)^{\kappa}. \tag{3.20}$$

If $\beta_1 \geq 0$ then (3.19) gives

$$\underline{A} = \overline{A} = \frac{p^+}{1 - (\beta_2^+)^{\kappa}}.$$

In the case in which $\beta_1 < 0$ we obtain, by (3.20),

$$p^- + \underline{A} (\beta_2^-)^{\kappa} \leq \underline{B} \quad \text{and} \quad \overline{B} \leq p^- + \overline{A} (\beta_2^-)^{\kappa}. \tag{3.21}$$

Inserting (3.21) into (3.19) yields

$$\frac{p^+ + p^- (\beta_1^-)^{\kappa}}{1 - (\beta_2^+)^{\kappa} - (\beta_2^-)^{\kappa} (\beta_1^-)^{\kappa}} \leq \underline{A} \leq \overline{A} \leq \frac{p^+ + p^- (\beta_1^-)^{\kappa}}{1 - (\beta_2^+)^{\kappa} - (\beta_2^-)^{\kappa} (\beta_1^-)^{\kappa}},$$

which gives the result $\underline{A} = \overline{A} = \widetilde{p}^+$. Inserting this into (3.20) gives $\underline{B} = \overline{B} = \widetilde{p}^-$ also. That $\widetilde{p}^+ \geq p^+$ and $\widetilde{p}^- \geq p^-$ is clear.

Denote by $\|\cdot\|$ the maximum norm and by $\mathbb{S}^m = \{x \in \mathbb{R}^{m+1} : \|x\| = 1\}$ the unit sphere with respect to the maximum norm in $\mathbb{R}^{m+1}$. Recall that a random vector $Y \in \mathbb{R}^{m+1}$ is multivariate regularly varying with index $-\kappa < 0$ (sometimes also termed with index $\kappa > 0$) if there exists a random vector $\Theta$ with values in $\mathbb{S}^m$ such that, for every $x > 0$, the measures

$$\frac{\mathrm{P}(\|Y\| > ux,\; Y/\|Y\| \in \cdot)}{\mathrm{P}(\|Y\| > u)}$$

on $\mathcal{B}(\mathbb{S}^m)$ converge weakly to the measure $x^{-\kappa} \mathrm{P}(\Theta \in \cdot)$ as $u \to \infty$. The distribution of $\Theta$ is called the spectral measure of $Y$ (with respect to the maximum norm). Multivariate regular variation can be defined with respect to any other norm on $\mathbb{R}^{m+1}$, but since all these definitions are equivalent (only the form of the spectral measure differs), we have chosen to work with the maximum norm, which is particularly convenient for our calculations. It is further known that a random vector $Y$ is regularly varying with index $-\kappa$ if and only if there exists a nonzero Radon measure $\sigma$ on $\overline{\mathbb{R}}^{m+1} \setminus \{0\}$ with $\sigma(\overline{\mathbb{R}}^{m+1} \setminus \mathbb{R}^{m+1}) = 0$ and a sequence $(a_n)_{n \in \mathbb{N}}$ of positive numbers increasing to $\infty$ such that

$$n \mathrm{P}(a_n^{-1} Y \in \cdot) \xrightarrow{v} \sigma(\cdot) \quad \text{as } n \to \infty,$$

where '$\xrightarrow{v}$' denotes vague convergence on $\mathcal{B}(\overline{\mathbb{R}}^{m+1} \setminus \{0\})$. For further information regarding multivariate regular variation, we refer the reader to [2] or [23].

Using Lemma 3.2, we will prove that the finite-dimensional distributions

$$X^{(m)} = (X_0, X_1, \ldots, X_m)^{\mathrm{T}} \in \mathbb{R}^{m+1}$$

of the stationary TAR($S$, 1) process (3.9) are multivariate regularly varying for every $m \in \mathbb{N}_0$. For this, we need the definition of the following matrices: let

$$B := \begin{pmatrix} \beta_2^+ & \beta_1^- \\ \beta_2^- & \beta_1^+ \end{pmatrix} \in \mathbb{R}^{2 \times 2}$$

and, for $m \in \mathbb{N}_0$, define

$$C^{(m)} := \begin{pmatrix} I & 0 & \cdots & 0 & 0 \\ B & I & \cdots & 0 & 0 \\ B^2 & B & \cdots & 0 & 0 \\ \vdots & \vdots & \ddots & \vdots & \vdots \\ B^m & B^{m-1} & \cdots & B & I \end{pmatrix}, \qquad S^{(m)} := \begin{pmatrix} 1 & -1 & 0 & 0 & \cdots & 0 & 0 \\ 0 & 0 & 1 & -1 & \cdots & 0 & 0 \\ \vdots & & & & \ddots & & \vdots \\ 0 & 0 & 0 & 0 & \cdots & 1 & -1 \end{pmatrix},$$

where $C^{(m)} \in \mathbb{R}^{2(m+1) \times 2(m+1)}$ and $S^{(m)} \in \mathbb{R}^{(m+1) \times 2(m+1)}$, and $B^m$ denotes the $m$th power of $B$ (with $B^0 = I$). Finally, define

$$\widetilde{C}^{(m)} := S^{(m)} C^{(m)} =: (c_0^+, c_0^-, \ldots, c_m^+, c_m^-) \in \mathbb{R}^{(m+1) \times 2(m+1)}.$$

The vectors $c_k^+$ and $c_k^-$ ($k = 0, \ldots, m$) will be used to describe the spectral measure of $(X_0, \ldots, X_m)^{\mathrm{T}}$. Insight into their structure can be obtained by writing

$$B^m = \begin{pmatrix} b_{11}^{(m)} & b_{12}^{(m)} \\ b_{21}^{(m)} & b_{22}^{(m)} \end{pmatrix} \in \mathbb{R}^{2 \times 2} \quad \text{for } m \in \mathbb{N}_0.$$

Then $b_{ij}^{(m)} \geq 0$ for $m \in \mathbb{N}_0$, $i, j = 1, 2$, and $B^m$ has at most one nonzero element in every column. Setting

$$b_1^{(m)} := b_{11}^{(m)} - b_{21}^{(m)} \quad \text{and} \quad b_2^{(m)} := b_{12}^{(m)} - b_{22}^{(m)} \quad \text{for } m \in \mathbb{N}_0, \tag{3.22}$$

we see that the $j$th components $(c_k^+)_j$ and $(c_k^-)_j$ satisfy

$$(c_k^+)_j = 0 \; \text{ for } j \leq k, \qquad (c_k^+)_j = b_1^{(j-k-1)} \; \text{ for } k + 1 \leq j \leq m + 1,$$

$$(c_k^-)_j = 0 \; \text{ for } j \leq k, \qquad (c_k^-)_j = b_2^{(j-k-1)} \; \text{ for } k + 1 \leq j \leq m + 1.$$

It is easy to check that $(|b_1^{(m)}|)_{m \in \mathbb{N}_0}$ and $(|b_2^{(m)}|)_{m \in \mathbb{N}_0}$ are decreasing sequences, and that $|b_1^{(m)}| \leq \beta^m$ and $|b_2^{(m)}| \leq \beta^m$. In particular, it follows that $c_k^{\pm} \in \mathbb{S}^m$ for $0 \leq k \leq m$. With these preparations, we can now show that the stationary version of the TAR($S$, 1) model (3.9) is multivariate regularly varying.
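The matrices $B$, $C^{(m)}$, and $S^{(m)}$ and the columns $c_k^{\pm}$ can be assembled mechanically. The following sketch (function name and example coefficients ours) verifies that every column has maximum norm 1, i.e. lies in $\mathbb{S}^m$:

```python
import numpy as np

def spectral_columns(beta1, beta2, m):
    """Assemble B, the lower block-Toeplitz C^(m), the differencing matrix S^(m),
    and return S^(m) C^(m), whose columns are c_0^+, c_0^-, ..., c_m^+, c_m^-."""
    b1p, b1m = max(beta1, 0.0), max(-beta1, 0.0)
    b2p, b2m = max(beta2, 0.0), max(-beta2, 0.0)
    B = np.array([[b2p, b1m], [b2m, b1p]])
    C = np.zeros((2 * (m + 1), 2 * (m + 1)))
    for r in range(m + 1):
        for c in range(r + 1):
            C[2 * r:2 * r + 2, 2 * c:2 * c + 2] = np.linalg.matrix_power(B, r - c)
    S = np.zeros((m + 1, 2 * (m + 1)))
    for r in range(m + 1):
        S[r, 2 * r], S[r, 2 * r + 1] = 1.0, -1.0
    return S @ C

Ctilde = spectral_columns(beta1=-0.4, beta2=0.6, m=3)
# Every column c_k^+/- has maximum norm 1, i.e. lies on the unit sphere S^m.
assert Ctilde.shape == (4, 8)
assert np.allclose(np.abs(Ctilde).max(axis=0), 1.0)
```

The maximum norm of each column is attained at its first nonzero entry $b_1^{(0)} = 1$ or $b_2^{(0)} = -1$, since the later entries are dominated geometrically.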

Theorem 3.1. (Multivariate regular variation.) Suppose that Conditions 2.1 and 2.2 hold, and let $(X_k)_{k \in \mathbb{N}_0}$ be a stationary version of the TAR($S$, 1) process as given in (3.9). Suppose further that $|Z_1| \in \mathcal{R}_{-\kappa}$ for some $\kappa > 0$. Then $X^{(m)} = (X_0, \ldots, X_m)^{\mathrm{T}}$ is multivariate regularly varying with index $-\kappa$, and its spectral measure with respect to the maximum norm is given by

$$\mathrm{P}(\Theta^{(m)} \in \cdot) = \frac{1}{\widetilde{p}^+ + \widetilde{p}^- + m} \Bigl( \widetilde{p}^+ \mathbf{1}_{\{c_0^+ \in \cdot\}} + \widetilde{p}^- \mathbf{1}_{\{c_0^- \in \cdot\}} + \sum_{j=1}^{m} \bigl( p^+ \mathbf{1}_{\{c_j^+ \in \cdot\}} + p^- \mathbf{1}_{\{c_j^- \in \cdot\}} \bigr) \Bigr), \tag{3.23}$$

where $\widetilde{p}^+$ and $\widetilde{p}^-$ are defined as in Lemma 3.2, and $c_j^{\pm}$ ($j = 0, \ldots, m$) is defined as above.

Proof. We define $Z^{(m)} = (X_0^+, X_0^-, Z_1^+, Z_1^-, \ldots, Z_m^+, Z_m^-)^{\mathrm{T}} \in \mathbb{R}^{2(m+1)}$ for $m \in \mathbb{N}_0$, which, by Lemma 3.2, is multivariate regularly varying of index $-\kappa$ with spectral measure

$$\mathrm{P}(\widetilde{\Theta}^{(m)} \in \cdot) = \frac{1}{\widetilde{p}^+ + \widetilde{p}^- + m} \Bigl( \widetilde{p}^+ \mathbf{1}_{\{e_1 \in \cdot\}} + \widetilde{p}^- \mathbf{1}_{\{e_2 \in \cdot\}} + \sum_{j=1}^{m} \bigl( p^+ \mathbf{1}_{\{e_{2j+1} \in \cdot\}} + p^- \mathbf{1}_{\{e_{2j+2} \in \cdot\}} \bigr) \Bigr),$$

where $e_j \in \mathbb{R}^{2(m+1)}$, $j = 1, \ldots, 2(m+1)$, is the unit vector with 1 in the $j$th component and 0s elsewhere.

Furthermore, we define $(Y_k)_{k \in \mathbb{N}_0}$ by $Y_0 = X_0$ and

$$Y_k = \beta_2 Y_{k-1} \mathbf{1}_{\{Y_{k-1} > 0\}} + \beta_1 Y_{k-1} \mathbf{1}_{\{Y_{k-1} \leq 0\}} + Z_k = \beta_2 Y_{k-1}^+ - \beta_1 Y_{k-1}^- + Z_k, \qquad k \in \mathbb{N}.$$

Let

$$W := \widetilde{C}^{(m)} Z^{(m)} =: (W_0, \ldots, W_m)^{\mathrm{T}}, \qquad Y := (Y_0, \ldots, Y_m)^{\mathrm{T}},$$

$$\overline{W} := C^{(m)} Z^{(m)} =: (W_0^+, W_0^-, \ldots, W_m^+, W_m^-)^{\mathrm{T}}, \qquad \overline{Y} := (Y_0^+, Y_0^-, \ldots, Y_m^+, Y_m^-)^{\mathrm{T}}.$$

Since $W$ is obtained from $Z^{(m)}$ by a linear transformation, it is easy to see that $W$ is again multivariate regularly varying with spectral measure $\Theta^{(m)}$ as given in (3.23); see, e.g., [2, Proposition A.1] and [14, Lemma 2.1]. We will show that $Y$ is regularly varying with the same spectral measure as $W$. Furthermore, observe that, by the definition of the matrix $C^{(m)}$, it holds that

$$Y_0^+ = W_0^+, \qquad Y_0^- = W_0^-, \qquad W_{k+1}^+ = \beta_2^+ W_k^+ + \beta_1^- W_k^- + Z_{k+1}^+, \qquad W_{k+1}^- = \beta_2^- W_k^+ + \beta_1^+ W_k^- + Z_{k+1}^-,$$

for $k = 0, \ldots, m - 1$.


Let $(a_n)_{n \in \mathbb{N}}$ be a sequence of positive numbers increasing to $\infty$ such that $\lim_{n \to \infty} n \mathrm{P}(|Z_1| > a_n) = 1$, and let $0 < \delta < 1$. Then

$$\mathrm{P}(\|W - Y\| > 2\delta a_n) \leq \mathrm{P}(\|\overline{W} - \overline{Y}\| > \delta a_n) \leq \sum_{k=1}^{m} [\mathrm{P}(|W_k^+ - Y_k^+| > \delta a_n) + \mathrm{P}(|W_k^- - Y_k^-| > \delta a_n)]. \tag{3.24}$$

Let $\Delta_k^{\pm} := W_k^{\pm} - Y_k^{\pm}$ for $k \in \mathbb{N}_0$. First, we will show that

$$\mathrm{P}(|\Delta_1^+| > \delta a_n) = o(\mathrm{P}(|Z_1| > a_n)) \quad \text{as } n \to \infty \tag{3.25}$$

for any $\delta > 0$; by symmetry, the same arguments lead to

$$\mathrm{P}(|\Delta_1^-| > \delta a_n) = o(\mathrm{P}(|Z_1| > a_n)) \quad \text{as } n \to \infty.$$

Then we use induction to prove that, for any $\delta > 0$,

$$\mathrm{P}(|\Delta_k^{\pm}| > \delta a_n) = o(\mathrm{P}(|Z_1| > a_n)) \quad \text{as } n \to \infty, \qquad k \in \mathbb{N}. \tag{3.26}$$

Since the result is trivial if $\beta = 0$, i.e. $\beta_1 = \beta_2 = 0$, we will assume that $\beta \neq 0$ from now on. In order to study (3.25), let $0 < \beta \delta_2 < \delta_1 < \delta$. Then

$$\begin{aligned} \mathrm{P}(|\Delta_1^+| > \delta a_n) &\leq \mathrm{P}(|\Delta_1^+| > \delta a_n,\; Z_1 > \delta_1 a_n,\; |Y_0| \leq \delta_2 a_n) \\ &\quad + \mathrm{P}(|\Delta_1^+| > \delta a_n,\; Z_1 < -\delta_1 a_n,\; |Y_0| \leq \delta_2 a_n) \\ &\quad + \mathrm{P}(|\Delta_1^+| > \delta a_n,\; |Z_1| \leq \delta_1 a_n,\; Y_0 > \delta_2 a_n) \\ &\quad + \mathrm{P}(|\Delta_1^+| > \delta a_n,\; |Z_1| \leq \delta_1 a_n,\; Y_0 < -\delta_2 a_n) \\ &\quad + \mathrm{P}(|\Delta_1^+| > \delta a_n,\; |Z_1| \leq \delta_1 a_n,\; |Y_0| \leq \delta_2 a_n) \\ &\quad + \mathrm{P}(|Z_1| > \delta_1 a_n,\; |Y_0| > \delta_2 a_n) \\ &=: I + II + III + IV + V + VI, \text{ say.} \end{aligned} \tag{3.27}$$

The summand / can be estimated by

/ < P(^|Fo| > 8an, Zx > 8ian, \Y0\ < 82an) = 0, (3.28)

since ?82 < 8. Similarly, we obtain

// < P(i?|^ol > 8an, \Y0\ < 82an) = 0. (3.29)

That /// = 0 also can be seen from

///<P(|A+| >8an, \Zx\<8xan, \?2\Y0 > 8xan)

+ P(|A+| >8afU |Zi| <8ian, 82\?2\an < \?2\Y0 < 8xan) = 0 + 0

= 0 (3.30)

if ?2 ^ 0, and from = Y+ if ?2 = 0 and Y0 > 0. By symmetry, we also have

/V=0. (3.31)


Provided that $\delta_1$ and $\delta_2$ are small, we further have

$$V = 0, \qquad (3.32)$$

and, finally, we estimate

$$VI = P(|Z_1| > \delta_1 a_n)\,P(|Y_0| > \delta_2 a_n) = o(P(|Z_1| > a_n)) \quad \text{as } n \to \infty. \qquad (3.33)$$

Hence, (3.27)-(3.33) give

$$P(|\Delta_1^+| > \delta a_n) = o(P(|Z_1| > a_n)) \quad \text{as } n \to \infty. \qquad (3.34)$$

Next, we assume that (3.26) holds for some $k \in \mathbb{N}$ and every $\delta > 0$. Define $\widetilde W_{k+1}^+ := \phi_2^+ Y_k^+ + \phi_1^- Y_k^- + Z_{k+1}^+$ for $k \in \mathbb{N}_0$, and let $\delta_3 \in (0, 1)$. Then

$$P(|\Delta_{k+1}^+| > \delta a_n) \le P\Big(|\Delta_k^+| \le \frac{\delta_3\delta a_n}{2\bar\phi},\ |\Delta_k^-| \le \frac{\delta_3\delta a_n}{2\bar\phi},\ |\widetilde W_{k+1}^+ - Y_{k+1}^+| > (1-\delta_3)\delta a_n\Big)$$
$$\quad + P\Big(|\Delta_k^+| \le \frac{\delta_3\delta a_n}{2\bar\phi},\ |\Delta_k^-| \le \frac{\delta_3\delta a_n}{2\bar\phi},\ |\phi_2^+\Delta_k^+ + \phi_1^-\Delta_k^-| > \delta_3\delta a_n\Big)$$
$$\quad + P\Big(|\Delta_k^+| > \frac{\delta_3\delta a_n}{2\bar\phi}\Big) + P\Big(|\Delta_k^-| > \frac{\delta_3\delta a_n}{2\bar\phi}\Big)$$
$$=: VII + VIII + IX + X, \quad \text{say.} \qquad (3.35)$$

With exactly the same reasoning that led to (3.34), we obtain

$$VII \le P(|\widetilde W_{k+1}^+ - Y_{k+1}^+| > (1-\delta_3)\delta a_n) = o(P(|Z_1| > a_n)) \quad \text{as } n \to \infty. \qquad (3.36)$$

On the other hand, since $|\phi_2^+\Delta_k^+ + \phi_1^-\Delta_k^-| \le \bar\phi(|\Delta_k^+| + |\Delta_k^-|)$,

$$VIII = 0. \qquad (3.37)$$

Furthermore,

$$IX = o(P(|Z_1| > a_n)) \quad \text{and} \quad X = o(P(|Z_1| > a_n)) \quad \text{as } n \to \infty, \qquad (3.38)$$

by the induction hypothesis. Equations (3.35)-(3.38) then give (3.26). Using (3.24) and (3.26), we hence obtain

$$P(\|W - Y\| > 2\delta a_n) \le P(\|\overline W - \overline Y\| > \delta a_n) = o(P(|Z_1| > a_n)) \quad \text{as } n \to \infty.$$

Since $\delta$ can be arbitrarily small, $Y$ is multivariate regularly varying with the same spectral measure as $W$. Finally,

$$\|X^{(m)} - Y\| \le \sum_{k=1}^{m}\bar\phi^{\,k}\max\{|r_1|, |r_2|\} + \bar a\sum_{j=0}^{m}\bar\phi^{\,j},$$

which follows again by induction, shows that $X^{(m)}$ is multivariate regularly varying with the same spectral measure as $Y$ and, hence, as $W$.

3.3. Extremal behavior of the TAR(S, 1) model with specific partitions

For a locally compact Hausdorff space $E$, we denote by $M_p(E)$ the space of all point measures on $E$. A point process is then a random element with values in $M_p(E)$, and a Poisson point process (which is a Poisson random measure) with mean measure $\vartheta$ will be denoted


by PRM($\vartheta$). In extreme value theory, the space $E$ is often $\mathbb{R}\setminus\{0\}$, $\mathbb{R}$, $[0,\infty)\times(\mathbb{R}\setminus\{0\})$, or $[0,\infty)\times\mathbb{R}$, and the extremal behavior of a stationary sequence $(\xi_k)_{k\in\mathbb{N}}$ is described by the weak limit of the point process $\sum_{k=1}^{n}\varepsilon_{a_n^{-1}(\xi_k - b_n)}$ in $M_p(\mathbb{R}\setminus\{0\})$ or $M_p(\mathbb{R})$ as $n\to\infty$ (depending on the tail behavior of $\xi_1$), or still more informatively by the weak limit of the point process $\sum_{k=1}^{n}\varepsilon_{(k/n,\,a_n^{-1}(\xi_k - b_n))}$ in $M_p([0,\infty)\times(\mathbb{R}\setminus\{0\}))$ or $M_p([0,\infty)\times\mathbb{R})$, respectively, as $n\to\infty$. Here, $a_n > 0$ and $b_n \in \mathbb{R}$ are the norming constants of an associated i.i.d. sequence $(\widehat\xi_k)_{k\in\mathbb{N}}$ with distribution $\widehat\xi_1 \stackrel{\mathrm d}{=} \xi_1$ (where $\stackrel{\mathrm d}{=}$ denotes equality in distribution), such that

$$\exp\Big({-\lim_{n\to\infty}} n\,P(\widehat\xi_1 > a_n x + b_n)\Big) = \lim_{n\to\infty} P\Big(a_n^{-1}\Big(\bigvee_{k=1}^{n}\widehat\xi_k - b_n\Big) \le x\Big) = G(x) \qquad (3.39)$$

for $x$ in the support of $G$, where $G$ is an extreme value distribution, i.e. either a Fréchet distribution, a Gumbel distribution, or a Weibull distribution. For further information about extreme value theory and point processes, we refer the reader to [23] and [25]. Observe in particular that (3.39) holds for $G$ being the Fréchet distribution $\Phi_\kappa(x) = \exp(-x^{-\kappa})1_{(0,\infty)}(x)$ with $\kappa > 0$ if and only if $\widehat\xi_1$ is in $\mathcal{R}_{-\kappa}$, in which case the norming constants $b_n$ can be chosen to be 0 and $a_n$ can be chosen subject to $\lim_{n\to\infty} n\,P(\widehat\xi_1 > a_n) = 1$.
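The Fréchet case just described can be made concrete with a toy computation (ours, not taken from the paper): for a standard Pareto tail $P(\xi > x) = x^{-\kappa}$, $x \ge 1$, one can take $b_n = 0$ and $a_n = n^{1/\kappa}$, and the convergence in (3.39) can be checked directly.

```python
import math

# Illustrative check (not from the paper): for a standard Pareto tail
# P(xi > x) = x**(-kappa), x >= 1, take b_n = 0 and a_n = n**(1/kappa).
# Then n * P(xi > a_n * x) = x**(-kappa) whenever a_n * x >= 1, so
# P(max(xi_1..xi_n)/a_n <= x) = (1 - x**(-kappa)/n)**n -> exp(-x**(-kappa)).

def pareto_tail(x, kappa):
    """P(xi > x) for a standard Pareto(kappa) random variable."""
    return 1.0 if x < 1.0 else x ** (-kappa)

def iid_max_cdf(x, n, kappa):
    """Exact distribution function of a_n^{-1} * max of n i.i.d. Pareto(kappa)."""
    a_n = n ** (1.0 / kappa)
    return (1.0 - pareto_tail(a_n * x, kappa)) ** n

kappa, x = 2.0, 1.5
frechet = math.exp(-x ** (-kappa))
for n in (10**2, 10**4, 10**6):
    print(n, iid_max_cdf(x, n, kappa), frechet)
```

The printed values approach the Fréchet limit $\Phi_\kappa(x)$ as $n$ grows.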

Now we describe the extremal behavior of the stationary TAR(S, 1) model (3.9) via point processes. Observe that the constants $a_n$ defined in (3.40), below, are the norming constants of an i.i.d. sequence with the same distribution as $|X_0|$, rather than that of $X_0$.

Theorem 3.2. (Point process behavior.) Suppose that Conditions 2.1 and 2.2 hold, and let $(X_k)_{k\in\mathbb{N}_0}$ be a stationary version of the TAR(S, 1) process as given in (3.9). Suppose further that $|Z_1| \in \mathcal{R}_{-\kappa}$ for some $\kappa > 0$. Let $0 < a_n \uparrow \infty$ be a sequence of constants such that

$$\lim_{n\to\infty} n\,P(|X_0| > a_n) = 1. \qquad (3.40)$$

Then, as $n\to\infty$,

$$\sum_{k=1}^{\infty}\varepsilon_{(k/n,\,a_n^{-1}X_k)} \stackrel{\mathrm w}{\to} \sum_{k=1}^{\infty}\sum_{j=0}^{\infty}\varepsilon_{(s_k,\,b_1^{(j)}P_k^+ + b_2^{(j)}P_k^-)} \quad \text{in } M_p([0,\infty)\times(\mathbb{R}\setminus\{0\})),$$

where $b_1^{(j)}$ and $b_2^{(j)}$ are given by (3.22), $\sum_{k=1}^{\infty}\varepsilon_{(s_k,P_k)}$ is PRM($\vartheta$) in $M_p([0,\infty)\times(\mathbb{R}\setminus\{0\}))$ with

$$\vartheta(\mathrm dt \times \mathrm dx) = \mathrm dt \times \kappa\theta\big(p^+ x^{-\kappa-1}1_{(0,\infty)}(x) + p^-(-x)^{-\kappa-1}1_{(-\infty,0)}(x)\big)\,\mathrm dx$$

and

$$\theta := \Big(p^+\sum_{j=0}^{\infty}|b_1^{(j)}|^{\kappa} + p^-\sum_{j=0}^{\infty}|b_2^{(j)}|^{\kappa}\Big)^{-1} = (\bar p^+ + \bar p^-)^{-1} \in (0, 1].$$

Here, $\bar p^+$ and $\bar p^-$ are given by (3.10) and (3.11), respectively.

Proof. We apply the results of [7, Theorem 2.7]. By Lemma 2.1, $(X_k)_{k\in\mathbb{N}_0}$ is geometrically strongly mixing. Hence, the mixing condition $\mathcal{A}(a_n)$ of [7, p. 882] is satisfied, meaning that there exists a sequence of positive integers $(v_n)_{n\in\mathbb{N}}$ such that $\lim_{n\to\infty} v_n = \infty$, $\lim_{n\to\infty} v_n/n = 0$, and

$$\operatorname{E}\exp\Big({-\sum_{k=1}^{n}} f(a_n^{-1}X_k)\Big) - \Big(\operatorname{E}\exp\Big({-\sum_{k=1}^{v_n}} f(a_n^{-1}X_k)\Big)\Big)^{\lfloor n/v_n\rfloor} \to 0 \quad \text{as } n \to \infty$$

holds for all step functions $f$ on $\mathbb{R}\setminus\{0\}$ with bounded support which is bounded away from 0.

Furthermore, for $x > 0$,

$$P\Big(\bigvee_{m < k \le v_n}|X_k| > x a_n,\ |X_0| > x a_n\Big) \le \sum_{m < k \le v_n} P(|X_k| > x a_n,\ |X_0| > x a_n)$$
$$\le \sum_{m < k \le v_n} P\Big(\bar\phi^{\,k}|X_0| + \bar a\sum_{j=0}^{k-1}\bar\phi^{\,j} + \sum_{j=1}^{k}\bar\phi^{\,k-j}|Z_j| > x a_n,\ |X_0| > x a_n\Big)$$
$$\le \sum_{m < k \le v_n} P\Big(\bar\phi^{\,k}|X_0| + \frac{\bar a}{1-\bar\phi} + \sum_{j=0}^{\infty}\bar\phi^{\,j}|Z_j| > x a_n,\ |X_0| > x a_n\Big). \qquad (3.41)$$

Let $\delta \in (0, x)$. Then (3.41) and the independence of $X_0$ and $(Z_k)_{k\in\mathbb{N}_0}$ result in

$$P\Big(\bigvee_{m<k\le v_n}|X_k| > x a_n,\ |X_0| > x a_n\Big) \le \sum_{m<k\le v_n} P\Big(\bar\phi^{\,k}|X_0| + \frac{\bar a}{1-\bar\phi} > \delta a_n\Big)$$
$$\quad + \sum_{m<k\le v_n} P\Big(\sum_{j=0}^{\infty}\bar\phi^{\,j}|Z_j| > (x-\delta)a_n\Big)\,P(|X_0| > x a_n)$$
$$=: J_1(n) + J_2(n), \quad \text{say.}$$

Since $|X_0| \in \mathcal{R}_{-\kappa}$ by Lemma 3.2, and since the limit in (3.1) is uniform for $u \in [u_0, \infty)$ for every $u_0 > 0$ (see, e.g. [23, Proposition 0.5]), it follows using dominated convergence that

$$\limsup_{m\to\infty}\lim_{n\to\infty}\frac{J_1(n)}{P(|X_0| > x a_n)} \le \limsup_{m\to\infty}\sum_{k>m}\Big(\frac{\delta\bar\phi^{-k}}{x}\Big)^{-\kappa} \le \limsup_{m\to\infty}\Big(\frac{\delta}{x}\Big)^{-\kappa}\sum_{k>m}(\bar\phi^{\kappa})^{k} = 0.$$

We can estimate $J_2(n)$ by

$$\limsup_{m\to\infty}\lim_{n\to\infty}\frac{J_2(n)}{P(|X_0| > x a_n)} \le \limsup_{m\to\infty}\lim_{n\to\infty} v_n\,P\Big(\sum_{j=0}^{\infty}\bar\phi^{\,j}|Z_j| > (x-\delta)a_n\Big) = 0,$$

since $\lim_{n\to\infty} v_n/n = 0$ and since

$$P(|X_0| > x) \sim (\bar p^+ + \bar p^-)\,P(|Z_1| > x) \sim (\bar p^+ + \bar p^-)(1 - \bar\phi^{\kappa})\,P\Big(\sum_{j=0}^{\infty}\bar\phi^{\,j}|Z_j| > x\Big) \quad \text{as } x \to \infty.$$

Hence, we conclude that

$$\lim_{m\to\infty}\lim_{n\to\infty} P\Big(\bigvee_{m<k\le v_n}|X_k| > x a_n \,\Big|\, |X_0| > x a_n\Big) = 0, \qquad x > 0,$$


which is Equation (2.8) of [7]. Since $X^{(2m)}$ is multivariate regularly varying with index $-\kappa$ and spectral measure described by the vector $\theta^{(2m)} = (\theta_j^{(2m)})_{j=0,\ldots,2m}$ as given in Theorem 3.1, we obtain

$$\operatorname{E}|\theta_m^{(2m)}|^{\kappa} = \frac{1}{\tilde p^+ + \tilde p^- + 2m}\Big(\tilde p^+|b_1^{(m)}|^{\kappa} + \tilde p^-|b_2^{(m)}|^{\kappa} + p^+\sum_{j=0}^{m-1}|b_1^{(j)}|^{\kappa} + p^-\sum_{j=0}^{m-1}|b_2^{(j)}|^{\kappa}\Big).$$

Using further that $(|b_1^{(m)}|)_{m\ge 0}$ and $(|b_2^{(m)}|)_{m\ge 0}$ are nonincreasing sequences, we obtain

$$\operatorname{E}\Big(\bigvee_{k=m}^{2m}|\theta_k^{(2m)}|^{\kappa} - \bigvee_{k=m+1}^{2m}|\theta_k^{(2m)}|^{\kappa}\Big)$$
$$= \frac{1 - p^+|b_1^{(m)}|^{\kappa} - p^-|b_2^{(m)}|^{\kappa} + \tilde p^+\big(|b_1^{(m)}|^{\kappa} - |b_1^{(m+1)}|^{\kappa}\big) + \tilde p^-\big(|b_2^{(m)}|^{\kappa} - |b_2^{(m+1)}|^{\kappa}\big)}{\tilde p^+ + \tilde p^- + 2m}.$$

Then

$$\lim_{m\to\infty}\frac{\operatorname{E}\big(\bigvee_{k=m}^{2m}|\theta_k^{(2m)}|^{\kappa} - \bigvee_{k=m+1}^{2m}|\theta_k^{(2m)}|^{\kappa}\big)}{\operatorname{E}|\theta_m^{(2m)}|^{\kappa}} = \frac{1}{p^+\sum_{j=0}^{\infty}|b_1^{(j)}|^{\kappa} + p^-\sum_{j=0}^{\infty}|b_2^{(j)}|^{\kappa}} = \theta,$$

where $\theta \in (1 - \bar\phi^{\kappa}, 1]$, since $|b_1^{(j)}|, |b_2^{(j)}| \le \bar\phi^{\,j}$ and $|b_1^{(0)}| = |b_2^{(0)}| = 1$. Similarly, defining the probability measures $R_m$ on $M_p(\mathbb{R}\setminus\{0\})$ by

= 1. Similarly, defining the probability measures Rm on M/>(R \ {0}) by

e<[V&? i?fm)r -

v,2=m+1 K2m)ni(E- V2m) ,)

it follows that /j?m_converges weakly as m -> oo to the distribution of the point process

Ef=o?Qj eMp(S\{0}), where co oo oo

7=0 7=0 7=0

with P(xi = 1) = p+ and P(xi = 0) = p~. Hence, the assumptions of [7, Theorem 2.7] are all satisfied. Let J2T=\ ?(~sk,Pk) be PRMW on [0. oo) x (R \ {0}) with

?(dt x dx) = dt x k0x~k~1 1(o,oo)(x)dx,

and let (J2jLo?Qkj)keN

be an i.i.d. sequence with J2jLo?Qkj

= YlJLo?Qj> independent of

YlkLi ?(sk pky Then Theorem 2.7, Corollary 2.4, and Remark 2.3 of [7] imply that

co co oo

?*(*/?,??-'x,) ^

EEe(*,Gyft) in Mp([0' ??> x (S\{0})) as" ~* 00

Jfc=l /c=l y=0


(see also Lemma 4.1.2 of [18] for the connection between the convergence of the process $\sum_{k=1}^{n}\varepsilon_{(k/n,\,a_n^{-1}X_k)}$ and that of $\sum_{k=1}^{\infty}\varepsilon_{(k/n,\,a_n^{-1}X_k)}$ as $n\to\infty$). Since

$$\sum_{k=1}^{\infty}\sum_{j=0}^{\infty}\varepsilon_{(\tilde s_k,\,Q_{kj}\tilde P_k)} \stackrel{\mathrm d}{=} \sum_{k=1}^{\infty}\sum_{j=0}^{\infty}\varepsilon_{(s_k,\,b_1^{(j)}P_k^+ + b_2^{(j)}P_k^-)},$$

the result follows, apart from the representation $\theta = (\bar p^+ + \bar p^-)^{-1}$. To see the latter, suppose, for example, that $\phi_1 < 0$ and $\phi_2 \ge 0$. Then

$$\bar p^+ + \bar p^- = \frac{p^+ + p^-(-\phi_1)^{\kappa}}{1 - \phi_2^{\kappa}} + p^-.$$

On the other hand,

$$b_1^{(j)} = \phi_2^{j}, \quad j \in \mathbb{N}_0, \qquad b_2^{(0)} = -1, \qquad b_2^{(j)} = (-\phi_1)\phi_2^{j-1}, \quad j \in \mathbb{N},$$

so that

$$p^+\sum_{j=0}^{\infty}|b_1^{(j)}|^{\kappa} + p^-\sum_{j=0}^{\infty}|b_2^{(j)}|^{\kappa} = \frac{p^+}{1-\phi_2^{\kappa}} + p^- + \frac{p^-(-\phi_1)^{\kappa}}{1-\phi_2^{\kappa}} = \bar p^+ + \bar p^-.$$

The other cases follow similarly.

Having the point process convergence in Theorem 3.2, it is standard to derive many results about the asymptotic behavior of the stationary sequence $(X_k)_{k\in\mathbb{N}_0}$, such as convergence of the maxima to extremal processes, the asymptotic distribution of the order statistics or of exceedances over high thresholds, or the determination of the extremal index. We will concentrate here on the latter. Recall that a stationary sequence $(\xi_k)_{k\in\mathbb{N}}$ has extremal index $\rho \in (0, 1]$ if there exist norming constants $a_n > 0$ and $b_n \in \mathbb{R}$, and a nondegenerate distribution function $G$ such that (3.39) holds for an associated i.i.d. sequence $(\widehat\xi_k)_{k\in\mathbb{N}}$ with $\widehat\xi_1 \stackrel{\mathrm d}{=} \xi_1$, and

$$\lim_{n\to\infty} P\Big(a_n^{-1}\Big(\bigvee_{k=1}^{n}\xi_k - b_n\Big) \le x\Big) = G(x)^{\rho} \quad \text{for all } x.$$

Under weak mixing conditions (which are satisfied in our case), it is known that the reciprocal $\rho^{-1}$ of the extremal index can be interpreted as the mean cluster size of exceedances over high thresholds. In particular, an extremal index of size 1 says that high exceedances of a stationary sequence behave asymptotically like those of an i.i.d. sequence with the same marginals, while an extremal index which is less than 1 shows that clusters occur. We refer the reader to [12] and [20] for further information regarding the extremal index.

The following result gives the asymptotic behavior of the maxima of the stationary TAR(S, 1) model and its extremal index.

Corollary 3.1. Let the assumptions of Theorem 3.2 hold with $(a_n)_{n\in\mathbb{N}}$ as defined in (3.40), and define $M_n = \max_{k=1,\ldots,n} X_k$ for $n \in \mathbb{N}$. Then, for $x > 0$,

$$\lim_{n\to\infty} P(a_n^{-1}M_n \le x) = \exp\big({-\theta(p^+ + p^-(\phi_1^-)^{\kappa})x^{-\kappa}}\big) = \exp\Big({-\frac{p^+ + p^-(\phi_1^-)^{\kappa}}{\bar p^+ + \bar p^-}}\,x^{-\kappa}\Big)$$

with $\bar p^+$ and $\bar p^-$ as defined in (3.10) and (3.11). In particular, if $\bar p^+ > 0$ then $(X_k)_{k\in\mathbb{N}_0}$ has extremal index

$$\rho = 1 - (\phi_2^+)^{\kappa} - (\phi_1^-)^{\kappa}(\phi_2^-)^{\kappa}.$$
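A hedged numerical companion to the corollary (ours, not from the paper): the closed-form extremal index above, together with a naive blocks estimator on a simulated TAR(2, 1) path. The regime split at 0, the symmetric Pareto-type noise, and all parameter values are illustrative assumptions, not quantities fixed by the text.

```python
import random

# Closed-form extremal index rho = 1 - (phi2^+)^kappa - ((phi1^-)(phi2^-))^kappa
def extremal_index(phi1, phi2, kappa):
    return (1.0
            - max(phi2, 0.0) ** kappa
            - (max(-phi1, 0.0) * max(-phi2, 0.0)) ** kappa)

def simulate_tar(phi1, phi2, n, kappa, rng):
    # TAR(2,1): coefficient phi2 on the nonnegative regime, phi1 on the
    # negative regime, with symmetric Pareto-type noise of index kappa.
    x, path = 0.0, []
    for _ in range(n):
        z = rng.paretovariate(kappa) * rng.choice((-1.0, 1.0))
        x = (phi2 if x >= 0 else phi1) * x + z
        path.append(x)
    return path

def blocks_estimate(path, block_len, threshold):
    # crude estimator: (# blocks containing an exceedance) / (# exceedances)
    exceed = sum(1 for v in path if v > threshold)
    blocks = [path[i:i + block_len] for i in range(0, len(path), block_len)]
    hit = sum(1 for b in blocks if max(b) > threshold)
    return hit / exceed if exceed else float("nan")

rng = random.Random(1)
phi1, phi2, kappa = -0.5, 0.5, 1.0
path = simulate_tar(phi1, phi2, 200_000, kappa, rng)
u = sorted(path)[int(0.999 * len(path))]
print("formula:", extremal_index(phi1, phi2, kappa))
print("blocks estimate:", blocks_estimate(path, 100, u))
```

The blocks estimate is rough for such heavy tails; it is meant only to make the clustering effect of a positive $\phi_2$ visible, not to validate the corollary.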


Proof. Applying the continuous mapping theorem for the functional

$$T_1 : M_p([0,\infty)\times(\mathbb{R}\setminus\{0\})) \to \mathbb{R}, \qquad \sum_{k=1}^{\infty}\varepsilon_{(t_k,x_k)} \mapsto \sup\{x_k : t_k \le 1\},$$

it follows from Theorem 3.2 that

$$a_n^{-1}M_n \stackrel{\mathrm w}{\to} Y := \sup\{b_1^{(j)}P_k^+ + b_2^{(j)}P_k^- : s_k \le 1,\ j \in \mathbb{N}_0\} \quad (n\to\infty)$$
$$= \sup\{P_k^+,\ \phi_1^- P_k^- : s_k \le 1\} = T_1(N),$$

where $N := \sum_{k=1}^{\infty}\varepsilon_{(s_k,P_k^+)} + \varepsilon_{(s_k,\phi_1^- P_k^-)}$. Here, we used the facts that $\sup\{0, b_1^{(j)} : j \in \mathbb{N}_0\} = 1$ and $\sup\{0, b_2^{(j)} : j \in \mathbb{N}_0\} = \phi_1^-$. Since $N$ is PRM($\tilde\vartheta$) with mean measure

$$\tilde\vartheta(\mathrm dt\times \mathrm dx) = \mathrm dt \times \kappa\theta\big(p^+ + p^-(\phi_1^-)^{\kappa}\big)x^{-\kappa-1}1_{(0,\infty)}(x)\,\mathrm dx,$$

we conclude that, for fixed $x > 0$,

$$\lim_{n\to\infty} P(a_n^{-1}M_n \le x) = P(Y \le x) = P(N((0,1]\times(x,\infty)) = 0) = \exp\big({-\theta(p^+ + p^-(\phi_1^-)^{\kappa})x^{-\kappa}}\big).$$

The extremal index $\rho$ of $(X_k)_{k\in\mathbb{N}_0}$ is then given by

$$\rho = \frac{\theta(p^+ + p^-(\phi_1^-)^{\kappa})}{\theta\,\bar p^+} = \frac{p^+ + p^-(\phi_1^-)^{\kappa}}{\bar p^+},$$

which is the claim.

We can also compute the asymptotic cluster probabilities $P(\zeta_1 = j)$ of exceedances $X_k > a_n x$ of length $j \in \mathbb{N}$ for fixed $x > 0$ explicitly, considering the limit behavior of the rescaled times $k/n$ for which $X_k > a_n x$. This is done using the same arguments as in [8, Section 3.D]. We omit the proof.

Corollary 3.2. Let the assumptions of Theorem 3.2 hold with $(a_n)_{n\in\mathbb{N}}$ as defined in (3.40), and suppose that $\bar p^+ > 0$, i.e. $p^+ + p^-(\phi_1^-)^{\kappa} > 0$. Let $(s_k)_{k\in\mathbb{N}}$ be the jump times of a Poisson process with intensity $\theta(p^+ + p^-(\phi_1^-)^{\kappa})x^{-\kappa}$, $x > 0$, fixed. Let $(\zeta_k)_{k\in\mathbb{N}}$ be an $\mathbb{N}$-valued i.i.d. sequence, independent of $(s_k)_{k\in\mathbb{N}}$, with distribution

$$P(\zeta_1 = j) = \frac{p^+\big((\tilde b_1^{(j-1)})^{\kappa} - (\tilde b_1^{(j)})^{\kappa}\big) + p^-\big((\tilde b_2^{(j-1)})^{\kappa} - (\tilde b_2^{(j)})^{\kappa}\big)}{p^+ + p^-(\phi_1^-)^{\kappa}} \quad \text{for } j \in \mathbb{N},$$

where $1 = \tilde b_1^{(0)} \ge \tilde b_1^{(1)} \ge \cdots$ is the order statistic of the sequence $(\max\{0, b_1^{(j)}\})_{j\in\mathbb{N}_0}$ and $\phi_1^- = \tilde b_2^{(0)} \ge \tilde b_2^{(1)} \ge \cdots$ is the order statistic of the sequence $(\max\{0, b_2^{(j)}\})_{j\in\mathbb{N}_0}$. Then

$$\sum_{k=1}^{n} 1_{\{X_k > a_n x\}}\,\varepsilon_{k/n} \stackrel{\mathrm w}{\to} \sum_{k=1}^{\infty}\zeta_k\,\varepsilon_{s_k} \quad \text{as } n \to \infty.$$
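The cluster-size law can be evaluated numerically in the special case $\phi_1 < 0 \le \phi_2$ treated in the proof of Theorem 3.2, where $b_1^{(j)} = \phi_2^j$, $b_2^{(0)} = -1$, and $b_2^{(j)} = (-\phi_1)\phi_2^{j-1}$. The following sketch is ours (parameter values are arbitrary); it builds the order statistics of the positive parts and applies the displayed formula, so the probabilities should sum to 1 by telescoping.

```python
# Cluster-size probabilities P(zeta_1 = j) for the case phi_1 < 0 <= phi_2,
# using the explicit b-sequences stated in the text (truncated at n_terms).

def cluster_probs(phi1, phi2, kappa, p_plus, p_minus, n_terms=200):
    assert phi1 < 0 <= phi2 < 1
    b1 = [phi2 ** j for j in range(n_terms)]
    b2 = [-1.0] + [(-phi1) * phi2 ** (j - 1) for j in range(1, n_terms)]
    # order statistics of (max{0, b_i^(j)})_j, decreasing
    t1 = sorted((max(b, 0.0) for b in b1), reverse=True)
    t2 = sorted((max(b, 0.0) for b in b2), reverse=True)
    denom = p_plus + p_minus * max(-phi1, 0.0) ** kappa
    probs = []
    for j in range(1, n_terms):
        num = (p_plus * (t1[j - 1] ** kappa - t1[j] ** kappa)
               + p_minus * (t2[j - 1] ** kappa - t2[j] ** kappa))
        probs.append(num / denom)
    return probs

probs = cluster_probs(phi1=-0.4, phi2=0.5, kappa=1.5, p_plus=0.5, p_minus=0.5)
print(probs[:5], sum(probs))
```

The reciprocal of the mean of this distribution is, consistently with Corollary 3.1, the extremal index.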


Remarks 3.1. (i) If $\phi_1 = \phi_2 = \cdots = \phi_S = \phi \in (-1, 1)$ and $a_1 = \cdots = a_S = 0$, then process (3.9) is a causal AR(1) process $X_k = \phi X_{k-1} + Z_k$, which can be written as an infinite moving average process $X_k = \sum_{j=0}^{\infty}\phi^j Z_{k-j}$. It is then easy to see that the results obtained in this paper are in line with those of [8] for infinite moving average processes with regularly varying noise.

(ii) Provided that $\bar p^+ > 0$, the extremal index $\rho$ of Corollary 3.1 is strictly less than 1 if and only if $\phi_2 > 0$ or $\phi_1\phi_2 > 0$. In these cases, the TAR(S, 1) model can model clusters.

(iii) The value $\theta$ is the extremal index of $(|X_k|)_{k \in \mathbb{N}_0}$.
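Remark (i)'s identity between the AR(1) recursion and its moving average representation can be checked numerically (our sketch; the coefficient and innovations are arbitrary illustrative values):

```python
# Companion to Remark 3.1(i) (ours): when all regimes share the same
# coefficient phi and all intercepts vanish, iterating the recursion
# X_k = phi * X_{k-1} + Z_k from X_0 = 0 reproduces the truncated moving
# average sum over the innovations observed so far.

def ar1_path(phi, z):
    x, out = 0.0, []
    for zk in z:
        x = phi * x + zk
        out.append(x)   # out[k] uses innovations z[0..k]
    return out

def ma_value(phi, z, k):
    # truncated moving average: sum_{j=0}^{k} phi**j * z[k-j]
    return sum(phi ** j * z[k - j] for j in range(k + 1))

phi = 0.6
z = [0.3, -1.2, 0.7, 2.5, -0.4]
path = ar1_path(phi, z)
print(all(abs(path[k] - ma_value(phi, z, k)) < 1e-12 for k in range(len(z))))
```

For $|\phi| < 1$ the truncation error of the infinite series decays geometrically, which is why the AR(1) results match the infinite moving average results of [8].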

4. Noise with at most an exponentially decreasing tail

In this section we are interested in noise sequences which are lighter tailed than regularly varying functions, but whose tail is still not too light. Since distributions which are not regularly varying and do not have a finite right endpoint can only be in the maximum domain of attraction of the Gumbel distribution if they are in the maximum domain of attraction of some extreme value distribution at all, we will concentrate on specific i.i.d. noise sequences $(Z_k)_{k\in\mathbb{N}_0}$ for which (3.39) holds with $G(x) = \Lambda(x) = \exp(-e^{-x})$ and $\widehat\xi_1 \stackrel{\mathrm d}{=} Z_1$, which we denote by $Z_1 \in \mathrm{MDA}(\Lambda)$. Besides this condition on $Z_1$, we will look at distributions which are either subexponential or have a tail which is close to that of an exponential distribution. More precisely, we will look at distribution functions $F$ in the class $\mathcal{L}(\gamma)$ with $\gamma \in [0, \infty)$, i.e. distribution functions which satisfy $F(x) < 1$ for all $x \in \mathbb{R}$ and for which

$$\lim_{x\to\infty}\frac{\overline F(x + y)}{\overline F(x)} = e^{-\gamma y} \qquad (4.1)$$

holds locally uniformly in $y \in \mathbb{R}$. For $\gamma > 0$, this means that $F$ has a tail which is close to that of an exponential distribution. For $\gamma = 0$, we need a further assumption: a distribution function $F$ (defined on $\mathbb{R}$) is called subexponential if $F \in \mathcal{L}(0)$ and $\lim_{x\to\infty}\overline{F * F}(x)/\overline F(x) = 2$. The class of subexponential distributions will be denoted by $\mathcal{S}$. For a random variable $Z$ with distribution function $F$, we simply write $Z \in \mathcal{S}$ or $Z \in \mathcal{L}(\gamma)$ if $F$ has the corresponding property. Subexponential distributions and those which are in $\mathcal{L}(\gamma)$ for $\gamma > 0$ are handy classes of (semi-)heavy-tailed distributions, and the extremal and the tail behaviors of infinite moving average processes associated with such noise sequences have been studied in [9] and [26], respectively. Examples of distributions in $\mathrm{MDA}(\Lambda) \cap \mathcal{S}$ include tails of the form $\overline F(x) \sim \exp(-x/(\log x)^a)$, $a > 0$, or $\overline F(x) \sim K x^b \exp(-x^p)$, where $p \in (0, 1)$, $K > 0$, and $b \in \mathbb{R}$ ($x \to \infty$), or the lognormal distribution. The class of subexponential distributions also includes those which have regularly varying tails, but these are not in $\mathrm{MDA}(\Lambda)$. Examples in $\mathcal{L}(\gamma)$ with $\gamma > 0$ include distribution functions with tails of the form $\overline F(x) \sim K x^b e^{-\gamma x}$ ($x \to \infty$) with $K > 0$ and $b \in \mathbb{R}$, or certain generalized inverse Gaussian distributions. Observe that $\mathcal{L}(\gamma) \subset \mathrm{MDA}(\Lambda)$ for $\gamma > 0$ since, by (4.1), (3.39) holds with $a_n = \gamma^{-1}$ and $b_n = \overline F^{\leftarrow}(1/n)$. Also, observe that whether a distribution function is in $\mathcal{S} \cap \mathrm{MDA}(\Lambda)$ or in $\mathcal{L}(\gamma)$ with $\gamma > 0$, respectively, is completely determined by its tail behavior. More precisely, if $\overline G(x) \sim c\overline F(x)$ ($x \to \infty$) for some $c \in (0, \infty)$ then $F \in \mathcal{S} \cap \mathrm{MDA}(\Lambda)$ or $F \in \mathcal{L}(\gamma)$ implies the same for $G$; see [22, Lemma 2.4] for the subexponential case (the case $\mathcal{L}(\gamma)$ follows from the definition of $\mathcal{L}(\gamma)$).

As in Section 3, we will first present the tail behavior of the TAR model if $Z$ has the described noise. However, unlike there, we do not have to restrict to specific TAR(S, 1) models to remain in the same noise class; we can handle the general TAR(S, q) model.
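A quick numerical check (our example, not from the paper) that a concrete tail belongs to $\mathcal{L}(\gamma)$: the Gamma(2, 1) tail $\overline F(x) = (1 + x)e^{-x}$ satisfies (4.1) with $\gamma = 1$.

```python
import math

# The Gamma(2,1) tail F̄(x) = (1+x) * exp(-x) lies in L(1):
# F̄(x+y)/F̄(x) = ((1+x+y)/(1+x)) * exp(-y) -> exp(-y) as x -> infinity.

def log_gamma2_tail(x):
    # log F̄(x), computed in log space to avoid floating-point underflow
    return math.log1p(x) - x

y = 2.0
for x in (10.0, 100.0, 1000.0):
    ratio = math.exp(log_gamma2_tail(x + y) - log_gamma2_tail(x))
    print(x, ratio, math.exp(-y))
```

The ratio visibly converges to $e^{-y}$; the polynomial factor $(1+x)$ only perturbs the exponential rate by a vanishing amount, which is the defining feature of the class $\mathcal{L}(\gamma)$.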


Proposition 4.1. (Tail behavior.) Suppose that Conditions 2.1 and 2.2 hold with $p^+ > 0$, and let $(X_k)_{k\in\mathbb{N}_0}$ be a stationary version of the TAR(S, q) process as given in (1.1). Suppose further that $Z_1 \in \mathcal{L}(\gamma)$ with $\gamma > 0$, or that $Z_1 \in \mathcal{S} \cap \mathrm{MDA}(\Lambda)$, in which case we set $\gamma := 0$. Then $\operatorname{E}\exp(\gamma(X_1 - Z_1))$ is finite and

$$P(X_1 > x) \sim \operatorname{E}\exp(\gamma(X_1 - Z_1))\,P(Z_1 > x) \quad \text{as } x \to \infty. \qquad (4.2)$$

In particular, if $Z_1 \in \mathcal{L}(\gamma)$ or $Z_1 \in \mathcal{S} \cap \mathrm{MDA}(\Lambda)$, respectively, then so is $X_1$.

Proof. Similarly to and with the same notation as in Lemma 2.2, we obtain

$$P(|X_k - Z_k| > x) \le P(|\overline X_k + \bar a - Z_k| > x) = P\Big(\sum_{j=1}^{\infty}\psi_j|Z_j| + \bar a > x\Big), \qquad (4.3)$$

so that

$$\limsup_{x\to\infty}\frac{P(|X_k - Z_k| > x)}{P(Z_1 > x)} \le \frac{1}{p^+}\limsup_{x\to\infty}\frac{P\big(\sum_{j=1}^{\infty}\psi_j|Z_j| + \bar a > x\big)}{P(|Z_1| > x)} \qquad (4.4)$$

by Condition 2.1. Since $0 < \psi_j < 1$ for $j \in \mathbb{N}$ and since the $(\psi_j)$ are exponentially decreasing, it follows from Proposition 1.3 of [9] that the right-hand side of (4.4) is 0 if $Z_1 \in \mathcal{S} \cap \mathrm{MDA}(\Lambda)$ and Condition 2.1 is valid, and, hence, that (4.2) holds in the case $Z_1 \in \mathcal{S} \cap \mathrm{MDA}(\Lambda)$ ($\gamma = 0$) by [11, Proposition 1] (for subexponentials on $\mathbb{R}$, see also [22, Lemma 5.1]).

Now, suppose that $\gamma > 0$ and that $Z_1 \in \mathcal{L}(\gamma)$. Together with Condition 2.1, this implies that $\psi|Z_1| \in \mathcal{L}(\gamma/\psi)$ for $0 < \psi < 1$, so that $\operatorname{E}\exp(\delta|Z_1|) < \infty$ for $0 < \delta < \gamma$. Observe that $0 < \psi_j < 1$ for $j \in \mathbb{N}$ and that $\sum_{j=1}^{\infty}\psi_j < \infty$. Choosing $0 < \varepsilon < \gamma$ such that $(\gamma + \varepsilon)^{-1}(\gamma - \varepsilon) > \max_{j\in\mathbb{N}}\{\psi_j\}$, it follows, from (4.3) and Jensen's inequality, that

$$\operatorname{E}\exp\big((\gamma + \varepsilon)|X_1 - Z_1|\big) \le e^{(\gamma+\varepsilon)\bar a}\prod_{j=1}^{\infty}\operatorname{E}\exp\big((\gamma+\varepsilon)\psi_j|Z_j|\big)$$
$$\le e^{(\gamma+\varepsilon)\bar a}\prod_{j=1}^{\infty}\big(\operatorname{E}\exp((\gamma-\varepsilon)|Z_j|)\big)^{(\gamma+\varepsilon)\psi_j/(\gamma-\varepsilon)} < \infty.$$

But, since $Z_1 \in \mathcal{L}(\gamma)$ if and only if $\exp(Z_1) \in \mathcal{R}_{-\gamma}$, and since $\operatorname{E}\exp\big(|X_1 - Z_1|(\gamma+\varepsilon)\big) < \infty$, it follows from Breiman's [4] result on products of regularly varying random variables that

$$P(\exp(X_1) > x) = P\big(\exp(Z_1)\exp(X_1 - Z_1) > x\big) \sim \operatorname{E}\exp\big(\gamma(X_1 - Z_1)\big)\,P(\exp(Z_1) > x) \quad \text{as } x \to \infty,$$

which is (4.2) (the latter equation can also be derived from [9, Proposition 1.1] and [22, Lemma 2.1]).
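The Breiman-type product calculus invoked above can be illustrated exactly in a toy case (ours, not from the paper): for $\xi$ standard Pareto($\kappa$) and an independent two-point factor $\eta$, the tail of $\eta\xi$ is $\operatorname{E}[\eta^{\kappa}]$ times the tail of $\xi$.

```python
# Toy instance of Breiman's lemma (ours): xi standard Pareto(kappa),
# eta independent with P(eta = 1) = P(eta = 2) = 1/2. For x >= 2,
#   P(eta * xi > x) = 0.5 * x**(-kappa) + 0.5 * (x/2)**(-kappa)
#                   = E[eta**kappa] * P(xi > x)   (here even exactly).

def product_tail(x, kappa):
    return 0.5 * x ** (-kappa) + 0.5 * (x / 2.0) ** (-kappa)

def breiman_constant(kappa):
    return 0.5 * 1.0 ** kappa + 0.5 * 2.0 ** kappa  # E[eta**kappa]

kappa = 1.7
for x in (2.0, 10.0, 1000.0):
    print(x, product_tail(x, kappa) / x ** (-kappa), breiman_constant(kappa))
```

In the proof above the bounded factor is $\exp(X_1 - Z_1)$, whose $\gamma$-moment is finite rather than elementary, but the mechanism is the same.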

Similarly to the regularly varying TAR(S, 1) model, the tail of the TAR process is equivalent to the tail of the noise. Next, analogously to the regularly varying case, we show the convergence of a sequence of point processes. In contrast to Theorem 3.2, we obtain convergence to a Poisson random measure. Thus, this model cannot exhibit extremal clusters.


Theorem 4.1. (Point process behavior.) Suppose that Conditions 2.1 and 2.2 hold with $p^+ > 0$, and let $(X_k)_{k\in\mathbb{N}_0}$ be a stationary version of the TAR(S, q) process as given in (1.1). Suppose further that $Z_1 \in \mathcal{L}(\gamma)$ with $\gamma > 0$, or that $Z_1 \in \mathcal{S} \cap \mathrm{MDA}(\Lambda)$. Let $a_n > 0$ and $b_n \in \mathbb{R}$ be sequences of constants such that

$$\lim_{n\to\infty} n\,P(X_0 > a_n x + b_n) = e^{-x} \quad \text{for } x \in \mathbb{R}.$$

Then, as $n \to \infty$,

$$\sum_{k=1}^{\infty}\varepsilon_{(k/n,\,a_n^{-1}(X_k - b_n))} \stackrel{\mathrm w}{\to} \sum_{k=1}^{\infty}\varepsilon_{(s_k,P_k)} \quad \text{in } M_p([0,\infty)\times\mathbb{R}),$$

where $\sum_{k=1}^{\infty}\varepsilon_{(s_k,P_k)}$ is PRM($\mathrm dt \times e^{-x}\,\mathrm dx$).

Proof. First, we investigate the case in which $Z_1 \in \mathcal{L}(\gamma)$ with $\gamma > 0$. Let $u_n = a_n x + b_n$, $n \in \mathbb{N}$. Since $(X_k)_{k\in\mathbb{N}_0}$ is geometrically strongly mixing by Lemma 2.1, the mixing condition $D(u_n)$ of [20, Theorem 5.5.1] holds for $(X_k)_{k\in\mathbb{N}_0}$. It remains to show the anti-clustering condition $D'(u_n)$, i.e.

$$\lim_{k\to\infty}\limsup_{n\to\infty} n\sum_{j=2}^{\lfloor n/k\rfloor} P(X_j > u_n,\ X_1 > u_n) = 0.$$

By Lemma 2.2 we have

$$n\sum_{j=2}^{\lfloor n/k\rfloor} P(X_j > u_n,\ X_1 > u_n) \le n\sum_{j=2}^{\lfloor n/k\rfloor} P(\overline X_j > u_n,\ \overline X_1 > u_n).$$

Furthermore, $P(\overline X_1 > u_n) \sim C\,P(X_1 > u_n)$ as $n \to \infty$ for some constant $C > 0$ by Proposition 4.1. Analogously to the proof of Theorem 7.4 of [26] (cf. [16, proof of Lemma 2]), the moving average process $(\overline X_k)_{k\in\mathbb{N}_0}$ satisfies the $D'(u_n)$ condition, and, hence, $(X_k)_{k\in\mathbb{N}_0}$ also satisfies the $D'(u_n)$ condition. The conclusion then follows by [20, Theorem 5.5.1].

In the remaining case, $Z_1 \in \mathcal{S} \cap \mathrm{MDA}(\Lambda)$, the conclusion follows from [15, Proposition 9] together with (4.3) and (4.4), the right-hand side of which converges to 0 as $x \to \infty$.

We can now obtain the behavior of the running maxima of the stationary sequence and, hence, identify the extremal index to be equal to 1, which implies that the model cannot exhibit clusters at high levels in this case. We omit the proof, which follows the lines of the proof of Corollary 3.1.

Corollary 4.1. Let the assumptions of Theorem 4.1 hold, and let $M_n = \max_{k=1,\ldots,n} X_k$ for $n \in \mathbb{N}$. Then

$$\lim_{n\to\infty} P\big(a_n^{-1}(M_n - b_n) \le x\big) = \exp(-e^{-x}) \quad \text{for } x \in \mathbb{R}.$$

In particular, the extremal index of $(X_k)_{k\in\mathbb{N}_0}$ is equal to 1.

5. Conclusion

We have shown that stationary TAR models with noise in $\mathcal{S} \cap \mathrm{MDA}(\Lambda)$ or $\mathcal{L}(\gamma)$ with $\gamma > 0$ are tail equivalent to their noise sequence, and that they have extremal index equal to 1; hence, they cannot cluster. On the other hand, if the noise sequence is regularly varying then the tail is in general only O-regularly varying, but for TAR(S, 1) models with intervals as partitions, it is tail equivalent to its noise (in particular, it is regularly varying). Moreover, in this case the extremal index is less than 1 in many cases, depending on the coefficients of the TAR model. In these cases the TAR(S, 1) model can exhibit cluster behavior.

It would be interesting to obtain similar results for noise sequences which are superexponential, such as distribution functions $F$ with tails like $\overline F(x) = K x^b \exp(-x^p)$ for $p \in (1, \infty)$. However, already the analysis for infinite moving average processes with such noise sequences is very involved and has been carried out by Rootzén [26], [27]. See also [19] for such moving average processes. But, for the TAR model, owing to the nonlinear regime switch, it seems an open problem how to determine the precise tail behavior of the stationary TAR model even for Gaussian noise, apart from simple situations such as symmetric TAR models.

Acknowledgement

The authors are indebted to a referee for pointing out the relation between $\mathcal{L}(\gamma)$ for $\gamma > 0$ and $\mathrm{MDA}(\Lambda)$.

References

[1] An, H. Z. and Huang, F. C. (1996). The geometric ergodicity of nonlinear autoregressive models. Statistica Sinica 6, 943-956.

[2] Basrak, B., Davis, R. A. and Mikosch, T. (2002). Regular variation of GARCH processes. Stoch. Process. Appl. 99, 95-115.

[3] Brachner, C. (2004). Tailverhalten autoregressiver Thresholdmodelle. Diploma Thesis, Technische Universität München.

[4] Breiman, L. (1965). On some limit theorems similar to the arc-sine law. Theory Prob. Appl. 10, 351-360.

[5] Chan, K. and Tong, H. (1985). On the use of the deterministic Lyapunov function for the ergodicity of stochastic difference equations. Adv. Appl. Prob. 17, 666-678.

[6] Chen, R. and Tsay, R. (1991). On the ergodicity of TAR(l) processes. Ann. Appl. Prob. 1, 613-634.

[7] Davis, R. A. and Hsing, T. (1995). Point process and partial sum convergence for weakly dependent random variables with infinite variance. Ann. Prob. 23, 879-917.

[8] Davis, R. and Resnick, S. (1985). Limit theory for moving averages of random variables with regularly varying tail probabilities. Ann. Prob. 13, 179-195.

[9] Davis, R. and Resnick, S. (1988). Extremes of moving averages of random variables from the domain of attraction of the double exponential distribution. Stoch. Process. Appl. 30, 41-68.

[10] Diop, A. and Guegan, D. (2004). Tail behavior of a threshold autoregressive stochastic volatility model. Extremes 7, 367-375.

[11] Embrechts, P., Goldie, C. M. and Veraverbeke, N. (1979). Subexponentiality and infinite divisibility. Z. Wahrscheinlichkeitsth. 49, 335-347.

[12] Embrechts, P., Klüppelberg, C. and Mikosch, T. (1997). Modelling Extremal Events for Insurance and Finance. Springer, Berlin.

[13] Fan, J. and Yao, Q. (2003). Nonlinear Time Series: Nonparametric and Parametric Methods. Springer, New York.

[14] Fasen, V. (2005). Extremes of regularly varying Lévy-driven mixed moving average processes. Adv. Appl. Prob. 37, 993-1014.

[15] Fasen, V. (2006). Extremes of subexponential Lévy-driven moving average processes. Stoch. Process. Appl. 116, 1066-1087.

[16] Fasen, V., Klüppelberg, C. and Lindner, A. (2006). Extremal behavior of stochastic volatility models. In Stochastic Finance, eds A. Shiryaev et al., Springer, New York, pp. 107-155.

[17] Feller, W. (1971). An Introduction to Probability Theory and Its Applications, Vol. II, 2nd edn. John Wiley, New York.

[18] Hsing, T. (1993). On some estimates based on sample behavior near high level excursions. Prob. Theory Relat. Fields 95, 331-356.

[19] Klüppelberg, C. and Lindner, A. (2005). Extreme value theory for moving average processes with light-tailed innovations. Bernoulli 11, 381-410.

[20] Leadbetter, M. R., Lindgren, G. and Rootzén, H. (1983). Extremes and Related Properties of Random

Sequences and Processes. Springer, New York.


[21] Meyn, S. and Tweedie, R. (1993). Markov Chains and Stochastic Stability. Springer, London.

[22] Pakes, A. G. (2004). Convolution equivalence and infinite divisibility. J. Appl. Prob. 41, 407-424. (Correction: 44 (2007), 295-305.)

[23] Resnick, S. I. (1986). Point processes, regular variation and weak convergence. Adv. Appl. Prob. 18, 66-138.

[24] Resnick, S. I. (1987). Extreme Values, Regular Variation, and Point Processes. Springer, New York.

[25] Resnick, S. I. (2007). Heavy-Tail Phenomena. Springer, New York.

[26] Rootzén, H. (1986). Extreme value theory for moving average processes. Ann. Prob. 14, 612-652.

[27] Rootzén, H. (1987). A ratio limit theorem for the tails of weighted sums. Ann. Prob. 15, 728-747.

[28] Tong, H. (1977). Contribution to the discussion of the paper entitled 'Stochastic modelling of riverflow time series' by A. J. Lawrance and N. T. Kottegoda. J. R. Statist. Soc. A 140, 34-35.

[29] Tong, H. (1990). Nonlinear Time Series. Oxford University Press.

[30] Tong, H. and Lim, K. S. (1980). Threshold autoregression, limit cycles and cyclical data. J. R. Statist. Soc. B

42, 245-292.
