
Journal of Statistical Planning and Inference 140 (2010) 1110–1124


Random sampling of long-memory stationary processes

Anne Philippe^a,*, Marie-Claude Viano^b

^a Université de Nantes, Laboratoire de Mathématiques Jean Leray, UMR CNRS 6629, 2 rue de la Houssinière - BP 92208, 44322 Nantes Cedex 3, France
^b Laboratoire Paul Painlevé, UMR CNRS 8524, UFR de Mathématiques - Bât. M2, Université Lille 1, Villeneuve d'Ascq, 59655 Cedex, France

ARTICLE INFO

Article history:

Received 7 December 2008
Received in revised form 14 September 2009
Accepted 24 October 2009
Available online 3 November 2009

MSC:

60G10

60G12

62M10

62M15

Keywords:

Aliasing

FARIMA processes

Generalised fractional processes

Regularly varying covariance

Seasonal long-memory

Spectral density

0378-3758/$ - see front matter © 2009 Elsevier B.V. All rights reserved.
doi:10.1016/j.jspi.2009.10.011
* Corresponding author.
E-mail address: [email protected] (A. Philippe).

ABSTRACT

This paper investigates the second order properties of a stationary process after random sampling. While a short memory process always gives rise to a short memory one, we prove that long memory can disappear when the sampling law has heavy enough tails. We prove that under rather general conditions the existence of the spectral density is preserved by random sampling. We also investigate the effects of deterministic sampling on seasonal long-memory.

© 2009 Elsevier B.V. All rights reserved.

1. Introduction

The effects of random sampling on the second order characteristics, and more specifically on the memory, of a stationary second order discrete time process are the subject of this paper.

We start from $X = (X_n)_{n\ge 0}$, a stationary discrete time second order process with covariance sequence $\sigma_X(h)$, and a random walk $(T_n)_{n\ge 0}$ independent of $X$. The sampling intervals $\Delta_j = T_j - T_{j-1}$ are independent identically distributed integer-valued random variables with common probability law $S$. We fix $T_0 = 0$.

Throughout the paper we consider the sampled process $Y$ defined by
$$Y_n = X_{T_n}, \quad n = 0, 1, \ldots \qquad (1.1)$$
The particular case where $S = \delta_k$, the Dirac measure at point $k$, shall be referred to as deterministic sampling (some authors prefer systematic or periodic sampling).
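A minimal sketch of the sampling scheme (1.1) in Python; the Gaussian AR(1) process and the geometric law for the intervals are illustrative choices, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

# Sampling intervals Delta_j: i.i.d. positive integers with common law S
# (a geometric law on {1, 2, ...} is used here purely as an illustration).
n_obs = 1000
deltas = rng.geometric(p=0.5, size=n_obs)
T = np.concatenate(([0], np.cumsum(deltas)))   # random walk with T_0 = 0

# A stationary process X observed on 0..T_max; here a Gaussian AR(1).
phi = 0.7
x = np.zeros(T[-1] + 1)
eps = rng.standard_normal(T[-1] + 1)
for t in range(1, T[-1] + 1):
    x[t] = phi * x[t - 1] + eps[t]

y = x[T]   # the sampled process Y_n = X_{T_n}
```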

Either because of practical constraints or because it can be used as a way to reduce serial dependence, deterministic sampling is quite an old and common practice in all areas using time series, such as econometrics, signal processing, climatology and astronomy.


Introducing some randomness in the sampling procedure, as a possible method to avoid the well known aliasing phenomenon, was proposed by several authors. See, without exhaustivity, Beutler (1970), Masry (1971) and Shapiro and Silverman (1960), where several random schemes are introduced. The particular idea of sampling along a random walk was first proposed in Shapiro and Silverman (1960), where simple conditions for alias-free sampling are given.

Random sampling can also be viewed as an interesting way to model data observed at irregularly spaced moments, as is the case in numerous applications. Think for example of observations of sunspots where missing data correspond to weather phenomena, or, in industrial processes, of missing values caused by instrument failures. Several examples of natural occurrence of deterministic or random sampling are well detailed in Greenshtein and Ritov (2000).

In the domain of time series analysis, attention has been particularly paid to the effect of sampling on parametric families of processes. After studies devoted to deterministic sampling (see Brewer, 1975; Niemi, 1984; Sram and Wei, 1986, among others), the effect of random sampling on ARMA or ARIMA processes was investigated in Kadi et al. (1994) and Oppenheim (1982). The main result is that the ARMA structure is preserved, the order of the autoregressive part never increasing after sampling.

Precisely, if $A(L)X_n = B(L)\varepsilon_n$ and $A_1(L)Y_n = B_1(L)\eta_n$ are the minimal representations of $X$ and of the sampled process $Y$, the roots of the polynomial $A_1(z)$ belong to the set $\{\sum_{j\ge 1} S(j)\, r_k^j\}$, where the $r_k$'s are the roots of $A(z)$. As $S(1) \ne 1$ implies $|\sum_{j\ge 1} S(j)\, r_k^j| < |r_k|$, a consequence is that the convergence to zero of $\sigma_Y(h)$ is strictly faster than that of $\sigma_X(h)$.

In other words, sampling an ARMA process along any random walk shortens its memory. In Oppenheim (1982) it is even pointed out that some ARMA processes can give rise to a white noise through a well chosen sampling law.
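For the simplest case, an AR(1) with $\sigma_X(h) = \varphi^h \sigma_X(0)$, the shortening is explicit: $\sigma_Y(h) = E(\sigma_X(T_h)) = \sigma_X(0)\,(E\varphi^{T_1})^h$, with $|E\varphi^{T_1}| = |\sum_{j\ge 1} S(j)\varphi^j| < |\varphi|$. A Monte Carlo sketch of this identity (the geometric sampling law and all numerical values are illustrative choices):

```python
import numpy as np

rng = np.random.default_rng(1)
phi, p = 0.9, 0.4            # AR(1) coefficient; geometric(p) sampling law S
n = 200_000

# For geometric(p) on {1, 2, ...}: E(phi^{T_1}) = sum_j S(j) phi^j
# = p * phi / (1 - (1 - p) * phi), and sigma_Y(h) = (E phi^{T_1})^h.
e_phi_t1 = p * phi / (1 - (1 - p) * phi)

# Monte Carlo check for h = 3: T_h is a sum of h i.i.d. intervals.
h = 3
T_h = rng.geometric(p, size=(n, h)).sum(axis=1)
mc = np.mean(phi ** T_h)
exact = e_phi_t1 ** h

assert abs(mc - exact) < 1e-2
assert abs(e_phi_t1) < abs(phi)   # the memory is strictly shortened
```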

One can wonder whether deterministic or random sampling also reduces long memory. Most papers dealing with this question only consider deterministic sampling. Let us mention Chambers (1998), Hwang (2000), Souza and Smith (2002), Souza (2005) and Souza et al. (2006), where can be found a detailed study of deterministic sampling and time aggregation of the FARIMA$(p,d,q)$ process, with related important statistical questions such as the estimation of the memory parameter $d$ from the sampled process.

Concerning the memory, it is proved in Chambers (1998) that when the spectral density of $(X_n)$ has the typical form
$$f(\lambda) = L(\lambda)\,|\lambda|^{-2d}, \quad 0 < d < 1/2,$$
where $L$ is bounded and slowly varying near zero, it is still the case for the process obtained by any deterministic sampling. In other words, contrary to what happens in the case of short memory processes like ARMA, long memory seems to be unchanged by deterministic sampling.

The question of long-memory processes observed at random instants remains rarely tackled. We note in a recent paper (Bardet and Bertrand, 2008) a wavelet-based estimation of the spectral density of a continuous time Gaussian process having the above typical form at frequency zero. The sampling law has a finite second moment, which means, in view of our results (see Section 3.1 below), that the sampled process keeps the same memory parameter as the initial one.

Concentrating on the second order properties of the processes, in the present paper we give answers to the following questions:

• When the covariance sequence of the basic process $X_n$ is summable (or square summable), is it also the case for the covariance of the sampled process?
• When the spectrum of $X_n$ is absolutely continuous, is it still the case for the sampled process?
• Is it possible to obtain short memory from long memory?
• Can deterministic sampling modify the singular behaviour of the spectral density near zero?

In all the sequel, a second order stationary process $X$ is said to have long memory if its covariance sequence is non-summable:
$$\sum_{h\ge 0} |\sigma_X(h)| = \infty, \qquad (1.2)$$
a definition including the FARIMA family and also the family of Generalised Fractional Processes (see Gray et al., 1989; Leipus and Viano, 2000), whose densities can present singularities at non-zero frequencies.

A question under study in this paper is what can be said of
$$\sum_{h\ge 0} |\sigma_Y(h)| = \sum_{h\ge 0} |E(\sigma_X(T_h))| \qquad (1.3)$$

depending on whether (1.2) is satisfied or not. In Section 2 we prove preservation of $L^p$-convergence of the covariance and of absolute continuity of the spectrum. We show in particular that short memory is always preserved.

The main results of the paper concern changes of memory when sampling a process with regularly varying covariance. They are gathered in Section 3. We show that the intensity of memory of such processes is preserved if $E(T_1) = \sum_j j\,S(j) < \infty$, while this intensity decreases when $E(T_1) = \infty$. For sufficiently heavy tailed $S$, the sampled process has short memory, which is somehow not surprising since, with a heavy tailed sampling law, the sampling intervals can be quite large.


In Section 4 we consider processes presenting long-memory with seasonal effects, and investigate the particular effects of deterministic sampling. We show that in some cases the seasonal effects can totally disappear after sampling. We also give examples where the form $|\lambda|^{-2d}$ of the spectral density at zero frequency is modified by deterministic sampling.

2. Some features unchanged by random sampling

Taking $p = 1$ in Proposition 1 below confirms an intuitive claim: random sampling cannot produce long memory from short memory. Propositions 3–5 state that, at least in all situations investigated in this paper, random sampling preserves the existence of a spectral density.

2.1. Preservation of summability of the covariance

Proposition 1. (i) Let $p \ge 1$. If $\sum |\sigma_X|^p < \infty$, the same holds for $\sigma_Y$.

(ii) In the particular case $p \in [1,2]$, both processes $X$ and $Y$ have spectral densities, linked by the relation
$$f_Y(\lambda) = \frac{1}{2\pi} \sum_{j=-\infty}^{+\infty} \int_{-\pi}^{\pi} e^{-ij\lambda}\, S(y)^{|j|}\, f_X(y)\,\mathrm{d}y,$$
where $S(y) = E(e^{iyT_1})$ is the characteristic function of $S$.

Proof. The covariance sequence of the sampled process is given by
$$\sigma_Y(0) = \sigma_X(0), \qquad \sigma_Y(h) = E(\sigma_X(T_h)) = \sum_{j=h}^{\infty} \sigma_X(j)\, S^{*h}(j), \quad h \ge 1, \qquad (2.1)$$
where $S^{*h}$, the $h$-fold convolution of $S$ with itself, is the probability distribution of $T_h$.

As the sequence $T_h$ is strictly increasing, $\sigma_X(T_h)$ is almost surely a sub-sequence of $\sigma_X(h)$. Then, (i) follows from
$$\sum_h |E(\sigma_X(T_h))|^p \le E\Big(\sum_h |\sigma_X(T_h)|^p\Big) \le \sum_h |\sigma_X(h)|^p.$$

The proof of (ii) is immediate, using (i):
$$f_Y(\lambda) = \frac{1}{2\pi}\sum_{j\in\mathbb{Z}} e^{-ij\lambda}\,\sigma_Y(j) = \frac{1}{2\pi}\sum_{j\in\mathbb{Z}} e^{-ij\lambda}\, E(\sigma_X(T_{|j|})) = \frac{1}{2\pi}\sum_{j\in\mathbb{Z}} \int_{-\pi}^{\pi} e^{-ij\lambda}\, E(e^{iT_{|j|}y})\, f_X(y)\,\mathrm{d}y,$$
where the series converges in $L^2([-\pi,\pi])$ when $p \ne 1$, the covariance being then square-summable without being summable. □

2.2. Preservation of the existence of a spectral density

In this part we assume that $X$ has a spectral density denoted by $f_X$. Firstly, it is well known that the existence of a spectral density is preserved by deterministic sampling (see (4.3) below). Secondly, from Proposition 1 it follows that, for any sampling law, the spectral density of $Y$ exists when the covariance of $X$ is square summable. It should also be noticed that the proof that the ARMA structure is preserved by random sampling (Oppenheim, 1982; see also Kadi et al., 1994, for the multivariate case) gives an explicit form of the spectral density of $Y$ when $X$ is an ARMA process.

The three propositions below show that preservation of the existence of a spectral density by random sampling holds for all the models considered in the present paper. The proofs are based on the properties of the Poisson kernel, recalled in Appendix A.2,
$$P_s(t) = \frac{1}{2\pi}\,\frac{1-s^2}{1-2s\cos t+s^2}, \quad s \in [0,1[, \qquad (2.2)$$
and on the representation, given in Lemma 2, of the covariance of the sampled process.

Lemma 2. For all $j \ge 0$,
$$\sigma_Y(j) = \lim_{r\to 1^-} \int_{-\pi}^{\pi} e^{ij\theta}\, g(r,\theta)\,\mathrm{d}\theta, \qquad (2.3)$$
where
$$g(r,\theta) = \frac{1}{4\pi}\int_{-\pi}^{\pi} f_X(\lambda)\left(\frac{1}{1-re^{-i\theta}S(\lambda)} + \frac{1}{1-re^{-i\theta}S(-\lambda)}\right)\mathrm{d}\lambda \qquad (2.4)$$


$$= \frac{1}{4}\int_{-\pi}^{\pi} f_X(\lambda)\left(\frac{1}{\pi} + P_{r\rho}(\tau-\theta) + P_{r\rho}(\tau+\theta)\right)\mathrm{d}\lambda \qquad (2.5)$$
and where
$$S(\lambda) = \rho(\lambda)\,e^{i\tau(\lambda)}, \qquad (2.6)$$
also denoted $\rho e^{i\tau}$ for the sake of shortness.

Proof. The proof of the lemma is relegated to Appendix A.1.

Proposition 3. If $f_X$ is bounded in a neighbourhood of zero, the sampled process $Y$ has a spectral density given by
$$f_Y(\theta) = \lim_{r\to 1^-} \frac{1}{4\pi}\int_{-\pi}^{\pi} f_X(\lambda)\left(\frac{1}{1-re^{-i\theta}S(\lambda)} + \frac{1}{1-re^{-i\theta}S(-\lambda)}\right)\mathrm{d}\lambda. \qquad (2.7)$$

Proof. Firstly, it is easily seen that, if $\theta \ne 0$, $g(r,\theta)$ has a limit as $r\to 1^-$. Hence, the proof of the proposition simply consists in exchanging the limit and the integration in (2.3) in order to show that
$$\sigma_Y(j) = \int_{-\pi}^{\pi} e^{ij\theta} \lim_{r\to 1^-} g(r,\theta)\,\mathrm{d}\theta,$$
implying that
$$f_Y(\theta) = \lim_{r\to 1^-} g(r,\theta),$$
and thus (2.7) according to (2.4).

Now we prove that the conditions of Lebesgue's theorem hold for (2.3). As we can suppose that the sampling is not deterministic,
$$|S(\lambda)| < 1, \quad \forall \lambda \in\, ]0,\pi]$$
(see Feller, 1971). Hence, thanks to the continuity of $|S(\lambda)|$,
$$\sup_{|\lambda| > \varepsilon} |S(\lambda)| < 1, \quad \forall \varepsilon > 0.$$

The integral (2.4) is split into two parts. Choosing $\varepsilon$ such that $f_X$ is bounded on $I_\varepsilon = [-\varepsilon,\varepsilon]$, and using the fact that the integrand in (2.4) is positive (see (A.7)):
$$\int_{I_\varepsilon} f_X(\lambda)\,\mathrm{Re}\left(\frac{1}{1-re^{-i\theta}S(\lambda)} + \frac{1}{1-re^{-i\theta}S(-\lambda)}\right)\mathrm{d}\lambda \le \Big(\sup_{\lambda\in I_\varepsilon} f_X(\lambda)\Big)\int_{I_\varepsilon} \mathrm{Re}\left(\frac{1}{1-re^{-i\theta}S(\lambda)} + \frac{1}{1-re^{-i\theta}S(-\lambda)}\right)\mathrm{d}\lambda,$$
which leads, thanks to Lemma 13, to
$$\int_{I_\varepsilon} f_X(\lambda)\,\mathrm{Re}\left(\frac{1}{1-re^{-i\theta}S(\lambda)} + \frac{1}{1-re^{-i\theta}S(-\lambda)}\right)\mathrm{d}\lambda \le 4\pi\Big(\sup_{\lambda\in I_\varepsilon} f_X(\lambda)\Big)\, g_{\mathbf{1}}(r,\theta) = 4\pi \sup_{\lambda\in I_\varepsilon} f_X(\lambda), \qquad (2.8)$$
where $g_{\mathbf{1}}$ denotes the function (2.4) computed with $f_X \equiv 1$.

Now,
$$\int_{I_\varepsilon^c} f_X(\lambda)\,\mathrm{Re}\left(\frac{1}{1-re^{-i\theta}S(\lambda)} + \frac{1}{1-re^{-i\theta}S(-\lambda)}\right)\mathrm{d}\lambda = \pi\int_{I_\varepsilon^c} f_X(\lambda)\left(\frac{1}{\pi} + P_{r\rho}(\tau-\theta) + P_{r\rho}(\tau+\theta)\right)\mathrm{d}\lambda.$$
Applying (A.8) with
$$s = r\rho(\lambda) < \rho(\lambda), \quad t = \tau(\lambda) \mp \theta \quad \text{and} \quad \eta = 1 - \sup_{|\lambda|>\varepsilon}|S(\lambda)|$$
yields
$$\pi\int_{I_\varepsilon^c} f_X(\lambda)\left(\frac{1}{\pi} + P_{r\rho}(\tau-\theta) + P_{r\rho}(\tau+\theta)\right)\mathrm{d}\lambda \le \left(1+\frac{2}{\eta}\right)\int_{I_\varepsilon^c} f_X(\lambda)\,\mathrm{d}\lambda \le \left(1+\frac{2}{\eta}\right)\int_{-\pi}^{\pi} f_X(\lambda)\,\mathrm{d}\lambda. \qquad (2.9)$$

Gathering (2.8) and (2.9) leads to the result via Lebesgue's theorem. □

In the next two propositions, the spectral density of $X$ is allowed to be unbounded at zero. Their proofs are in the Appendix.

We first suppose that the sampling law has a finite expectation. It shall be proved in Section 3.1 that in this case the intensity of memory of $X$ is preserved.

Proposition 4. If $E(T_1) < \infty$ and if the spectral density of $X$ has the form
$$f_X(\lambda) = |\lambda|^{-2d}\phi(\lambda), \qquad (2.10)$$


where $\phi$ is non-negative, integrable and bounded in a neighbourhood of zero, and where $0 \le d < 1/2$, then the sampled process has a spectral density $f_Y$ given by (2.7).

Proof. See Appendix A.3.

The case $E(T_1) = \infty$ is treated in the next proposition, under the extra assumption (2.11), meaning that $S(j)$ is regularly varying at infinity. We shall see in Section 3.2 (see Proposition 7) how the parameter $d$ is transformed when the sampling law satisfies condition (2.11). In particular, we shall see that if $1 < \gamma < 3-4d$ the covariance of the sampled process is square summable, implying the absolute continuity of the spectral measure of $Y$. The proposition below shows that this property holds for every $\gamma \in\, ]1,2[$.

Proposition 5. Assume that the spectral density of $X$ has the form (2.10). If the distribution $S$ satisfies the following condition:
$$P(T_1 = j) = S(j) \sim c\,j^{-\gamma} \quad \text{as } j\to\infty, \qquad (2.11)$$
where $1 < \gamma < 2$ and $c > 0$, then the conclusion of Proposition 4 is still valid.

Proof. See Appendix A.4.

3. Random sampling of processes with regularly varying covariances

The propositions below are valid for the family of stationary processes whose covariance $\sigma_X(h)$ decays arithmetically, up to a slowly varying factor. In the first paragraph we show that if $T_1$ has a finite expectation, the rate of decay of the covariance is unchanged by sampling. We then see that, when $E(T_1) = \infty$, the memory of the sampled process is reduced according to the largest finite moment of $T_1$.

3.1. Preservation of the memory when $E(T_1) < \infty$

Proposition 6. Assume that the covariance of $X$ is of the form
$$\sigma_X(h) = h^{-\alpha}L(h),$$
where $0 < \alpha < 1$ and where $L$ is slowly varying at infinity and ultimately monotone (see Bingham et al., 1987). If $E(T_1) < \infty$, then the covariance of the sampled process satisfies
$$\sigma_Y(h) \sim h^{-\alpha}\,(E(T_1))^{-\alpha}L(h)$$
as $h\to\infty$.

Proof. We prove that, as $h\to\infty$,
$$\frac{\sigma_X(T_h)}{h^{-\alpha}L(h)} = \left(\frac{T_h}{h}\right)^{-\alpha}\frac{L(T_h)}{L(h)} \xrightarrow{\text{a.s.}} (E(T_1))^{-\alpha} \qquad (3.1)$$
and that, for $h$ large enough,
$$\left|\frac{\sigma_X(T_h)}{h^{-\alpha}L(h)}\right| \le 1. \qquad (3.2)$$

Since $T_h$ is a sum of independent and identically distributed random variables, $T_h = \sum_{j=1}^{h} \Delta_j$, with common distribution $T_1 \in L^1$, the law of large numbers leads to
$$\frac{T_h}{h} \xrightarrow{\text{a.s.}} E(T_1). \qquad (3.3)$$

Now,
$$\frac{L(T_h)}{L(h)} \xrightarrow{\text{a.s.}} 1; \qquad (3.4)$$
indeed, $T_h \ge h$ because the intervals $\Delta_j \ge 1$ for all $j$, and (3.3) implies that, for $h$ large enough,
$$T_h \le 2E(T_1)\,h.$$
Therefore, using the fact that $L$ is ultimately monotone, we have
$$L(h) \le L(T_h) \le L(2E(T_1)h) \qquad (3.5)$$
if $L$ is ultimately increasing (and the reversed inequalities if $L$ is ultimately decreasing). Finally, (3.5) directly leads to (3.4), since $L$ is slowly varying at infinity (see Theorem 1.2.1 in Bingham et al., 1987). Clearly, (3.3) and (3.4) imply the convergence (3.1).


In order to prove (3.2), we write
$$\frac{\sigma_X(T_h)}{h^{-\alpha}L(h)} = \left(\frac{T_h}{h}\right)^{-\alpha/2}\left(\frac{T_h}{h}\right)^{-\alpha/2}\frac{L(T_h)}{L(h)} = \left(\frac{T_h}{h}\right)^{-\alpha/2}\frac{T_h^{-\alpha/2}L(T_h)}{h^{-\alpha/2}L(h)}.$$
As $\alpha > 0$ and $T_h \ge h$,
$$\left(\frac{T_h}{h}\right)^{-\alpha} \le 1.$$
Moreover, $h^{-\alpha/2}L(h)$ is decreasing for $h$ large enough (see Bingham et al., 1987), so that
$$\frac{T_h^{-\alpha/2}L(T_h)}{h^{-\alpha/2}L(h)} \le 1.$$
These two last inequalities lead to (3.2).

Since $\sigma_Y(h) = E(\sigma_X(T_h))$, we conclude the proof by applying Lebesgue's theorem, and we get
$$\frac{\sigma_Y(h)}{h^{-\alpha}L(h)} \to (E(T_1))^{-\alpha}. \qquad \square$$
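A quick numerical sanity check of Proposition 6, taking $\sigma_X(h) = h^{-\alpha}$ (so $L \equiv 1$) and a finite-mean geometric sampling law; all the numerical choices below are illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(2)
alpha, p = 0.4, 0.5          # geometric(p) on {1, 2, ...}: E(T_1) = 1/p = 2
h, n = 200, 20_000

# sigma_Y(h) = E(sigma_X(T_h)) = E(T_h^(-alpha)) for sigma_X(h) = h^(-alpha),
# with T_h a sum of h i.i.d. sampling intervals.
T_h = rng.geometric(p, size=(n, h)).sum(axis=1)
sigma_y = np.mean(T_h.astype(float) ** (-alpha))

# Proposition 6 predicts sigma_Y(h) ~ (E T_1)^(-alpha) * h^(-alpha).
predicted = (1 / p) ** (-alpha) * h ** (-alpha)
assert abs(sigma_y / predicted - 1) < 0.05
```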

3.2. Decrease of memory when $E(T_1) = \infty$

If $E(T_1) = \infty$, it is known that $T_h/h \to \infty$, implying that the limit in (3.1) is zero. In other words, in this case the convergence to zero of $\sigma_Y(h)$ could be faster than $h^{-\alpha}$. The aim of this section is to give a precise evaluation of this rate of convergence.

Proposition 7. Assume that the covariance of $X$ satisfies
$$|\sigma_X(h)| \le c\,h^{-\alpha}, \qquad (3.6)$$
where $0 < \alpha < 1$. If
$$\liminf_{x\to\infty} x^{\beta}\,P(T_1 > x) > 0 \qquad (3.7)$$
for some $\beta \in (0,1]$ (implying $E(T_1^{\beta}) = \infty$), then
$$|\sigma_Y(h)| \le C\,h^{-\alpha/\beta}. \qquad (3.8)$$

Proof. From hypothesis (3.6),
$$|\sigma_Y(h)| \le E(|\sigma_X(T_h)|) \le c\,E(T_h^{-\alpha}).$$
Then,
$$E(T_h^{-\alpha}) = \sum_{j=h}^{\infty} P(T_h \le j)\,\big(j^{-\alpha} - (j+1)^{-\alpha}\big) \le \alpha\sum_{j=h}^{\infty} P(T_h \le j)\,j^{-\alpha-1}. \qquad (3.9)$$

From hypothesis (3.7) on the tail of the sampling law, it follows that, for $h$ large enough and for all $j \ge h$,
$$P(T_h \le j) \le P\Big(\max_{1\le l\le h} \Delta_l \le j\Big) = P(T_1 \le j)^h \le (1-Cj^{-\beta})^h \le e^{-Ch/j^{\beta}}. \qquad (3.10)$$

Gathering (3.9) and (3.10) then gives
$$E(T_h^{-\alpha}) \le \alpha\sum_{j=h}^{\infty} j^{-\alpha-1}e^{-Ch/j^{\beta}}.$$
The last sum has the same asymptotic behaviour as
$$\int_h^{\infty} x^{-\alpha-1}e^{-Ch/x^{\beta}}\,\mathrm{d}x = h^{-\alpha/\beta}\int_0^{h^{1-\beta}} u^{\alpha/\beta-1}e^{-Cu}\,\mathrm{d}u,$$
and the result follows since
$$\int_0^{h^{1-\beta}} u^{\alpha/\beta-1}e^{-Cu}\,\mathrm{d}u \;
\begin{cases}
\longrightarrow \displaystyle\int_0^{\infty} u^{\alpha/\beta-1}e^{-Cu}\,\mathrm{d}u \ \text{ as } h\to\infty, & \text{when } \beta < 1,\\[6pt]
= \displaystyle\int_0^{1} u^{\alpha/\beta-1}e^{-Cu}\,\mathrm{d}u, & \text{when } \beta = 1. \qquad\square
\end{cases}$$

The next proposition states that the bound in Proposition 7 is sharp under some additional hypotheses.


Proposition 8. Assume that
$$\sigma_X(h) = h^{-\alpha}L(h),$$
where $0 < \alpha < 1$ and where $L$ is slowly varying at infinity and ultimately monotone. If
$$\beta := \sup\{\gamma : E(T_1^{\gamma}) < \infty\} \in\, ]0,1], \qquad (3.11)$$
then, for every $\varepsilon > 0$,
$$\sigma_Y(h) \ge C_1\,h^{-\alpha/\beta-\varepsilon}. \qquad (3.12)$$

Proof. Let $\varepsilon > 0$. We have
$$\frac{\sigma_X(T_h)}{h^{-\alpha/\beta-\varepsilon}} = \frac{T_h^{-\alpha}}{h^{-\alpha/\beta-\varepsilon}}\,L(T_h) = \frac{T_h^{-\alpha-\beta\varepsilon/2}}{h^{-\alpha/\beta-\varepsilon}}\,T_h^{\beta\varepsilon/2}L(T_h) = \left(\frac{T_h}{h^{\delta}}\right)^{-\alpha-\beta\varepsilon/2} T_h^{\beta\varepsilon/2}L(T_h),$$
where
$$\delta = \frac{\alpha/\beta+\varepsilon}{\alpha+\beta\varepsilon/2} = \frac{1}{\beta}\left(\frac{\alpha+\beta\varepsilon}{\alpha+\beta\varepsilon/2}\right).$$

Using Proposition 1.3.6 in Bingham et al. (1987),
$$T_h^{\beta\varepsilon/2}L(T_h) \xrightarrow{\text{a.s.}} +\infty \quad \text{as } h\to\infty.$$
Moreover, we have $\delta > 1/\beta$. Therefore, assumption (3.11) implies $E(T_1^{1/\delta}) < \infty$ with $1/\delta < 1$. Then we can apply the Marcinkiewicz–Zygmund law of large numbers (see Stout, 1974, Theorem 3.2.3), which yields
$$\frac{T_h}{h^{\delta}} \xrightarrow{\text{a.s.}} 0 \quad \text{as } h\to\infty. \qquad (3.13)$$
Therefore, by applying Fatou's lemma,
$$\frac{\sigma_Y(h)}{h^{-\alpha/\beta-\varepsilon}} \xrightarrow[h\to\infty]{} \infty. \qquad \square$$

3.3. Example: the FARIMA processes

A FARIMA$(p,d,q)$ process is defined from a white noise $(\varepsilon_n)_n$, a parameter $d \in\, ]0,1/2[$ and two polynomials $A(z) = z^p + a_1 z^{p-1} + \cdots + a_p$ and $B(z) = z^q + b_1 z^{q-1} + \cdots + b_q$, non-vanishing on the domain $|z| \ge 1$, by
$$X_n = B(L)A^{-1}(L)(I-L)^{-d}\varepsilon_n, \qquad (3.14)$$
where $L$ is the back-shift operator $LU_n = U_{n-1}$. The FARIMA$(0,d,0)$ process
$$W_n = (I-L)^{-d}\varepsilon_n,$$
introduced by Granger (Gray et al., 1989), is especially popular. It is well known (see Brockwell and Davis, 1991) that the covariance of the FARIMA$(p,d,q)$ process satisfies
$$\sigma_X(k) \sim c\,k^{2d-1} \quad \text{as } k\to\infty, \qquad (3.15)$$
allowing to apply the results of Sections 3.1 and 3.2.

Proposition 9. Sampling a FARIMA process (3.14) with a sampling law $S$ such that, for some $\gamma > 1$,
$$S(j) = P(T_1 = j) \sim c\,j^{-\gamma} \quad \text{as } j\to\infty \qquad (3.16)$$
leads to a process $(Y_n)_n$ whose auto-covariance function satisfies the following properties:

(i) if $\gamma > 2$,
$$\sigma_Y(h) \sim C\,h^{2d-1}; \qquad (3.17)$$

(ii) if $\gamma \le 2$,
$$C_1\,h^{(2d-1)/(\gamma-1)-\varepsilon} \le \sigma_Y(h) \le C_2\,h^{(2d-1)/(\gamma-1)}, \quad \forall \varepsilon > 0. \qquad (3.18)$$

Consequently, the sampled process $(Y_n)_n$ has

• the same memory parameter if $\gamma \ge 2$;
• a reduced long memory parameter if $2(1-d) \le \gamma < 2$;
• short memory if $1 < \gamma < 2(1-d)$.
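The trichotomy above can be transcribed directly; in the sketch below the helper name and return format are ours, and the value $d - 1 + \gamma/2$ returned for $\gamma < 2$ is the sampled memory parameter $d_Y$ of (3.20) below:

```python
def sampled_memory(d: float, gamma: float) -> tuple[str, float]:
    """Memory regime of the sampled FARIMA process per Proposition 9,
    with the sampled memory parameter d_Y = d - 1 + gamma/2 when gamma < 2."""
    assert 0 < d < 0.5 and gamma > 1
    if gamma >= 2:
        return "long memory, same parameter", d
    d_y = d - 1 + gamma / 2
    if gamma >= 2 * (1 - d):          # i.e. d_y >= 0
        return "long memory, reduced parameter", d_y
    return "short memory", d_y

# gamma = 2.8 keeps d; gamma = 1.9 reduces d = 0.35 to 0.30.
assert sampled_memory(0.35, 2.8) == ("long memory, same parameter", 0.35)
regime, d_y = sampled_memory(0.35, 1.9)
assert regime == "long memory, reduced parameter" and abs(d_y - 0.30) < 1e-12
```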


Proof. We omit the proof, which is an immediate consequence of Proposition 6 for (i), and of Propositions 7 and 8 with $\beta = \gamma-1$ and $\alpha = 1-2d$ for (ii). □

We illustrate this last result by simulating and sampling FARIMA$(0,d,0)$ processes. Simulations of the trajectories are based on the moving average representation of the FARIMA (see Bardet et al., 2002). The sampling distribution is

$$P(S = k) = \int_k^{k+1} (\gamma-1)\,t^{-\gamma}\,\mathrm{d}t \sim C\,k^{-\gamma}, \qquad (3.19)$$
which is simulated using the fact that, when $u$ is uniformly distributed on $[0,1]$, the integer part of $u^{1/(1-\gamma)}$ is distributed according to (3.19).
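The inverse-transform step can be sketched and checked as follows (a minimal illustration; the tail identity $P(S \ge k) = k^{1-\gamma}$ follows from (3.19)):

```python
import numpy as np

rng = np.random.default_rng(3)
gamma = 1.9

# If u is uniform on (0, 1], then floor(u^(1/(1-gamma))) is distributed
# according to (3.19); in particular P(S >= k) = k^(1-gamma) for k >= 1.
u = 1.0 - rng.uniform(size=1_000_000)         # uniform on (0, 1]
s = np.floor(u ** (1.0 / (1.0 - gamma))).astype(np.int64)

assert s.min() >= 1
emp = np.mean(s >= 10)                        # empirical tail at k = 10
assert abs(emp - 10.0 ** (1 - gamma)) < 5e-3
```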

In Fig. 1 are depicted the empirical auto-covariances of a trajectory of the process $X$ and of two sampled trajectories $Y_1$ and $Y_2$. The memory parameter of $X$ is $d = 0.35$ and the parameters of the two sampling distributions are, respectively, $\gamma_1 = 2.8$ and $\gamma_2 = 1.9$. According to Proposition 9, the process $Y_1$ has the same memory parameter as $X$, while the memory intensity is reduced for the process $Y_2$.

We then estimate the memory parameters of the sampled processes using the FEXP procedure introduced by Moulines and Soulier (1999) and Bardet et al. (2003). The FEXP estimator is adapted to processes having a spectral density; from Propositions 4 and 5, this is the case for our sampled processes.

From Proposition 9, the memory parameter of $Y$ is
$$d_Y = \begin{cases} d & \text{if } \gamma \ge 2,\\ d - 1 + \gamma/2 & \text{otherwise.}\end{cases} \qquad (3.20)$$

Fig. 2 shows the estimate and a confidence interval for $d_Y$. Two values of the memory parameter of $X$ are considered, $d = 0.1$ (lower curves) and $d = 0.35$ (upper curves). In abscissa, the parameter $\gamma$ of the sampling distribution is allowed to vary between 1.7 and 3.3.

• In the case $d = 0.1$, short memory (corresponding to negative values of $d_Y$) is obtained for $\gamma \le 1.8$.
• In the case $d = 0.35$, sampling distributions leading to short memory are too heavy tailed to allow tractable simulations.

Notice that the systematic underestimation of $d_Y$ seems to confirm Souza's diagnosis in Souza et al. (2006).

Fig. 1. Auto-covariance functions of $X$ (left) and of two sub-sampled processes corresponding to $\gamma = 2.8$ (middle) and $\gamma = 1.9$ (right). The number of observed values of the sub-sampled processes is equal to 5000.


Fig. 2. Estimation of the long memory parameter for $d = 0.1$ and $0.35$ as a function of the parameter $\gamma$. The confidence regions are obtained using 500 independent replications. The circles represent the theoretical values of the parameter $d_Y$ defined in (3.20). For each $\gamma$, the estimation of $d_Y$ is evaluated on 5000 observations.

4. Sampling generalised fractional processes

Consider now the family of generalised fractional processes whose spectral density has the form
$$f_X(\lambda) = |F(\lambda)|^2 \prod_{j=1}^{M} \left|(e^{i\lambda}-e^{i\theta_j})^{-d_j}(e^{i\lambda}-e^{-i\theta_j})^{-d_j}\right|^2, \qquad (4.1)$$
with exponents $d_j$ in $]0,1/2[$, and where $|F(\lambda)|^2$ is the spectral density of an ARMA process.

4.1. Decrease of memory by heavy-tailed sampling law

See Gray et al. (1989), Leipus and Viano (2000) and Oppenheim et al. (2000) for results and references on this class of processes. Taking $M = 1$ and $\theta_1 = 0$, it is clear that this family includes the FARIMA processes, which belong to the framework of Section 3. It is proved in Leipus and Viano (2000) that, as $h\to\infty$, the covariance sequence is a sum of periodic sequences damped by a regularly varying factor:
$$\sigma_X(h) = h^{2d-1}\sum \big(c_{1,j}\cos(h\theta_j) + c_{2,j}\sin(h\theta_j) + o(1)\big), \qquad (4.2)$$
where $d = \max\{d_j,\ j = 1,\ldots,M\}$ and where the sum extends over the indexes corresponding to $d_j = d$. Hence, these models generally present seasonal long-memory, and for them the regular form $\sigma_X(h) = h^{2d-1}L(h)$ is lost. Precisely, this regularity remains only when, in (4.1), $d = \max\{d_j,\ j = 1,\ldots,M\}$ corresponds to the unique frequency $\theta = 0$. In all other cases, from (4.2), it has to be replaced by
$$\sigma_X(h) = O(h^{2d-1}).$$

From the previous comments it is clear that Propositions 6 and 8 are no longer valid in the case of seasonal long-memory. This means that in this situation it is not sure that long memory is preserved when the sampling law has a finite expectation; in fact this is an open question. Nevertheless, it is true that long memory can be lost when the sampling law has an infinite expectation. Indeed, Proposition 7 applies with $\alpha = 1-2d$, where $d = \max\{d_j,\ j = 1,\ldots,M\}$, and the following result holds:

Proposition 10. When sampling a process with spectral density (4.1) along a random walk whose sampling law satisfies (3.16), the obtained process has short memory as soon as $1 < \gamma < 2(1-d)$, where $d = \max\{d_j,\ j = 1,\ldots,M\}$.

4.2. The effect of deterministic sampling

In a first step we just suppose that $X$ has a spectral density $f_X$ which is unbounded at one or several frequencies (we call them singular frequencies). It is clearly the case for the generalised fractional processes, and the examples shall be taken in this family. We investigate the effect of deterministic sampling on the number and the values of the singularities. Let us suppose that $S = \delta_k$, the law concentrated at $k$. It is well known that if $X$ has a spectral density, the spectral density of the sampled process $Y = (X_{kn})_n$ is
$$f_Y(\lambda) = \frac{1}{2\ell+1}\sum_{j=-\ell}^{\ell} f_X\!\left(\frac{\lambda-2\pi j}{2\ell+1}\right) \quad \text{if } k = 2\ell+1,$$
$$f_Y(\lambda) = \frac{1}{2\ell}\left(\sum_{j=-\ell+1}^{\ell-1} f_X\!\left(\frac{\lambda-2\pi j}{2\ell}\right) + f_X\!\left(\frac{\lambda-2\pi\ell\,\mathrm{sgn}(\lambda)}{2\ell}\right)\right) \quad \text{if } k = 2\ell. \qquad (4.3)$$
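The odd-$k$ branch of (4.3) is a plain aliasing sum and is easy to implement; a sketch (the function names are ours, not the paper's):

```python
import numpy as np

def f_Y(lam, f_X, k):
    """Spectral density of the sampled process (X_{kn})_n for odd k = 2l + 1,
    computed by the aliasing sum (4.3)."""
    assert k % 2 == 1
    l = (k - 1) // 2
    return sum(f_X((lam - 2 * np.pi * j) / k) for j in range(-l, l + 1)) / k

# Sanity check: a flat spectral density (white noise) is left unchanged.
flat = lambda lam: np.ones_like(np.asarray(lam, dtype=float))
lam = np.linspace(-np.pi, np.pi, 101)
assert np.allclose(f_Y(lam, flat, 3), 1.0)
```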

Proposition 11. Let the spectral density of $(X_n)_n$ have $N_X \ge 1$ singular frequencies. Then, denoting by $N_Y$ the number of singular frequencies of the spectral density of the sampled process $(Y_n = X_{kn})_n$:

• $1 \le N_Y \le N_X$;
• $N_Y < N_X$ if and only if $f_X$ has at least two singular frequencies $\lambda_0 \ne \lambda_1$ such that, for some integer number $j^*$,
$$\lambda_0 - \lambda_1 = \frac{2\pi j^*}{k}.$$

Proof. We choose $k = 2\ell+1$ to simplify; the proofs for $k = 2\ell$ are quite similar. Now remark that, for fixed $j$,
$$\frac{\lambda-2\pi j}{2\ell+1} \in I_j := \left[\frac{-\pi-2\pi j}{2\ell+1},\ \frac{\pi-2\pi j}{2\ell+1}\right] \quad \text{for } \lambda \in [-\pi,\pi].$$
The intervals $I_j$ are non-overlapping and $\bigcup_{j=-\ell}^{\ell} I_j = [-\pi,\pi]$. Now suppose that $f_X$ is unbounded at some $\lambda_0$. Then $f_Y$ is unbounded at the unique frequency $\lambda^*$ satisfying $\lambda_0 = (\lambda^*-2\pi j_0)/(2\ell+1)$ for a suitable value $j_0$ of $j$. In other words, to every singular frequency of $f_X$ corresponds a unique singular frequency of $f_Y$. As a consequence, the number of singular frequencies of $f_Y$ is at least 1, and cannot exceed the number of singularities of $f_X$.

It is clearly possible to reduce the number of singularities by deterministic sampling. Indeed, two distinct singular frequencies $\lambda_0$ and $\lambda_1$ of $f_X$ produce the same singular frequency $\lambda^*$ of $f_Y$ if and only if
$$\frac{\lambda^*-2\pi j_0}{2\ell+1} = \lambda_0 \quad \text{and} \quad \frac{\lambda^*-2\pi j_1}{2\ell+1} = \lambda_1,$$
which happens if and only if $\lambda_0 - \lambda_1$ is a multiple of the basic frequency $2\pi/(2\ell+1)$. □
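The merging criterion is easy to test numerically; a small sketch (the helper name is ours), checked on the seasonal frequencies $\pm 2\pi/3$ used in Example 2 below:

```python
import numpy as np

def frequencies_merge(lam0, lam1, k, tol=1e-9):
    """True iff the singular frequencies lam0 and lam1 of f_X alias to the
    same singular frequency of f_Y after deterministic sampling by k,
    i.e. iff lam0 - lam1 is an integer multiple of 2*pi/k."""
    ratio = (lam0 - lam1) * k / (2 * np.pi)
    return abs(ratio - round(ratio)) < tol

# +-2*pi/3 merge for k = 3 (difference 4*pi/3 = 2*pi * 2/3) ...
assert frequencies_merge(2 * np.pi / 3, -2 * np.pi / 3, 3)
# ... but remain distinct for k = 2.
assert not frequencies_merge(2 * np.pi / 3, -2 * np.pi / 3, 2)
```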

Proposition 11 is illustrated by the following examples, showing the various effects of the aliasing phenomenon when sampling generalised fractional processes.

Example 1. Consider $f_X(\lambda) = |e^{i\lambda}-1|^{-2d}$, the spectral density of the FARIMA$(0,d,0)$ process, and take $k = 2\ell+1$. From (4.3), the sampled spectral density is
$$f_Y(\lambda) = \frac{1}{2\ell+1}\sum_{j=-\ell}^{\ell} \big|e^{i(\lambda-2\pi j)/(2\ell+1)} - 1\big|^{-2d}.$$
Clearly, $f_Y$ has, on $[-\pi,\pi[$, only one singularity, at $\lambda = 0$, associated with the same memory parameter $d$ as $f_X(\lambda)$. The memories of $X$ and $Y$ have the same intensity. This was already noticed in Chambers (1998) and Hwang (2000).

In the three following examples we take $k = 3$.

Example 2. Consider the following spectral density of a seasonal long-memory process (see Oppenheim et al., 2000):
$$f_X(\lambda) = |e^{i\lambda}-e^{2i\pi/3}|^{-2d}\,|e^{i\lambda}-e^{-2i\pi/3}|^{-2d}.$$
The sampled spectral density is
$$f_Y(\lambda) = \frac{1}{3}\sum_{j=-1}^{1} \big|e^{i(\lambda-2\pi j)/3}-e^{2i\pi/3}\big|^{-2d}\,\big|e^{i(\lambda-2\pi j)/3}-e^{-2i\pi/3}\big|^{-2d},$$
and is everywhere continuous on $[-\pi,\pi]$, except at $\lambda = 0$ where
$$f_Y(\lambda) \sim c\,|\lambda|^{-2d}.$$
In other words, the memory is preserved, but the seasonal effect disappears after sampling.

Example 3. We consider a spectral density having a singularity at zero and two other ones:
$$f_X(\lambda) = |e^{i\lambda}-e^{2i\pi/3}|^{-2d_1}\,|e^{i\lambda}-e^{-2i\pi/3}|^{-2d_1}\,|e^{i\lambda}-1|^{-2d_0}.$$


The sampled spectral density is now

\[
f_Y(\lambda) = \frac{1}{3}\sum_{j=-1}^{1} |e^{i(\lambda-2\pi j)/3} - e^{2i\pi/3}|^{-2d_1}\,|e^{i(\lambda-2\pi j)/3} - e^{-2i\pi/3}|^{-2d_1}\,|e^{i(\lambda-2\pi j)/3} - 1|^{-2d_0},
\]

which has only one singularity, at $\lambda = 0$, with exponent $-2\max\{d_0,d_1\}$. More precisely,

\[
f_Y(\lambda) = L(\lambda)|\lambda|^{-2\max\{d_0,d_1\}},
\]

where $L$ is continuous and non-vanishing on $[-\pi,\pi]$. Taking $d_1 > d_0$ shows that deterministic sampling can change the local behaviour of the spectral density at zero frequency. This completes Chambers' result in Chambers (1998).
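Numerically, the local exponent at zero can be estimated from the folded density (a sketch assuming the folding formula (4.3) with $k=3$; names are ours), and it is indeed the larger of the two memory parameters:

```python
import numpy as np

def f_X(lam, d0, d1):
    # Example 3: pole at 0 (exponent d0) and poles at +-2*pi/3 (exponent d1)
    z = np.exp(1j * lam)
    s = (np.abs(z - np.exp(2j * np.pi / 3))
         * np.abs(z - np.exp(-2j * np.pi / 3))) ** (-2 * d1)
    return s * np.abs(z - 1.0) ** (-2 * d0)

def f_Y(lam, d0, d1):
    # assumed folding formula (4.3) with k = 3
    return sum(f_X((lam - 2 * np.pi * j) / 3, d0, d1) for j in (-1, 0, 1)) / 3

d0, d1 = 0.1, 0.35
lams = np.array([1e-5, 1e-8])
vals = f_Y(lams, d0, d1)
slope = np.log(vals[1] / vals[0]) / np.log(lams[1] / lams[0])
print(slope)  # close to -2*max(d0, d1) = -0.7
```

With $d_1 > d_0$, the exponent seen at zero after sampling is $-2d_1$, not $-2d_0$, which is the announced change of local behaviour.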

Example 4. Consider now a case of two seasonalities ($\pm\pi/4$ and $\pm 3\pi/4$) associated with two different memory parameters $d_1$ and $d_2$:

\[
f_X(\lambda) = |e^{i\lambda} - e^{i\pi/4}|^{-2d_1}\,|e^{i\lambda} - e^{-i\pi/4}|^{-2d_1}\,|e^{i\lambda} - e^{3i\pi/4}|^{-2d_2}\,|e^{i\lambda} - e^{-3i\pi/4}|^{-2d_2}.
\]

It is easily checked that $f_Y$ has the same singular frequencies as $f_X$, with an exchange of the memory parameters:

\[
f_Y(\lambda) \sim c\left|\lambda \mp \frac{\pi}{4}\right|^{-2d_2} \text{ near } \pm\frac{\pi}{4},
\qquad
f_Y(\lambda) \sim c\left|\lambda \mp \frac{3\pi}{4}\right|^{-2d_1} \text{ near } \pm\frac{3\pi}{4}.
\]
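The exchange of the two memory parameters can be checked by estimating the local exponents of the folded density on each side (a sketch assuming the folding formula (4.3) with $k=3$; names are ours):

```python
import numpy as np

def f_X(lam, d1, d2):
    # Example 4: poles at +-pi/4 (exponent d1) and +-3*pi/4 (exponent d2)
    z = np.exp(1j * lam)
    a = (np.abs(z - np.exp(1j * np.pi / 4))
         * np.abs(z - np.exp(-1j * np.pi / 4))) ** (-2 * d1)
    b = (np.abs(z - np.exp(3j * np.pi / 4))
         * np.abs(z - np.exp(-3j * np.pi / 4))) ** (-2 * d2)
    return a * b

def f_Y(lam, d1, d2):
    # assumed folding formula (4.3) with k = 3
    return sum(f_X((lam - 2 * np.pi * j) / 3, d1, d2) for j in (-1, 0, 1)) / 3

def local_slope(center, d1, d2):
    eps = np.array([1e-5, 1e-8])
    vals = f_Y(center + eps, d1, d2)
    return np.log(vals[1] / vals[0]) / np.log(eps[1] / eps[0])

d1, d2 = 0.2, 0.4
s_quarter = local_slope(np.pi / 4, d1, d2)       # close to -2*d2 = -0.8
s_three_quarters = local_slope(3 * np.pi / 4, d1, d2)  # close to -2*d1 = -0.4
print(s_quarter, s_three_quarters)
```

The singularity at $\pi/4$ of $f_Y$ is fed by the $3\pi/4$ pole of $f_X$ (through $j=-1$), and conversely, which is exactly the stated exchange.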

Appendix A

A.1. Proof of Lemma 2

Let us consider the two $z$-transforms of the bounded sequence $\sigma_Y(j)$:

\[
\sigma_Y^-(z) = \sum_{j=0}^{\infty} z^{j}\sigma_Y(j), \quad |z|<1,
\]

and

\[
\sigma_Y^+(z) = \sum_{j=0}^{\infty} z^{-j}\sigma_Y(j), \quad |z|>1.
\]

On the one hand, from the representation

\[
\sigma_Y(j) = E(\sigma_X(T_j)) = \int_{-\pi}^{\pi} f_X(\lambda)S(\lambda)^{j}\,d\lambda,
\]

we have

\[
\sigma_Y^-(z) = \sum_{j=0}^{\infty} z^{j}\int_{-\pi}^{\pi} f_X(\lambda)(S(\lambda))^{j}\,d\lambda = \int_{-\pi}^{\pi} \frac{f_X(\lambda)}{1 - zS(\lambda)}\,d\lambda, \quad |z|<1, \qquad (A.1)
\]

\[
\sigma_Y^+(z) = \sum_{j=0}^{\infty} z^{-j}\int_{-\pi}^{\pi} f_X(\lambda)(S(\lambda))^{j}\,d\lambda = \int_{-\pi}^{\pi} \frac{f_X(\lambda)}{1 - \dfrac{S(\lambda)}{z}}\,d\lambda, \quad |z|>1. \qquad (A.2)
\]

On the other hand, let $C_r$ be the circle $|z| = r$. If $0<r<1$, for all $j\ge 0$,

\[
\frac{1}{2i\pi}\int_{C_r} \sigma_Y^-(z)z^{-j-1}\,dz = \frac{1}{2\pi}\int_{-\pi}^{\pi} (re^{i\theta})^{-j}\sigma_Y^-(re^{i\theta})\,d\theta = \sum_{l=0}^{\infty} \frac{r^{l-j}\sigma_Y(l)}{2\pi}\int_{-\pi}^{\pi} e^{i(l-j)\theta}\,d\theta = \sigma_Y(j), \qquad (A.3)
\]

and, similarly, if $r>1$,

\[
\frac{1}{2i\pi}\int_{C_r} \sigma_Y^+(z)z^{j-1}\,dz = \frac{1}{2\pi}\int_{-\pi}^{\pi} (re^{i\theta})^{j}\sigma_Y^+(re^{i\theta})\,d\theta = \sum_{l=0}^{\infty} \frac{r^{j-l}\sigma_Y(l)}{2\pi}\int_{-\pi}^{\pi} e^{i(j-l)\theta}\,d\theta = \sigma_Y(j). \qquad (A.4)
\]

Page 12: Random sampling of long-memory stationary processes

ARTICLE IN PRESS

A. Philippe, M.-C. Viano / Journal of Statistical Planning and Inference 140 (2010) 1110–1124 1121

Gathering (A.1) with (A.3) and (A.2) with (A.4) leads to

\[
\sigma_Y(j) =
\begin{cases}
\dfrac{1}{2\pi}\displaystyle\int_{-\pi}^{\pi} e^{-ij\theta} r^{-j}\left(\int_{-\pi}^{\pi} \frac{f_X(\lambda)}{1 - re^{i\theta}S(\lambda)}\,d\lambda\right)d\theta & \text{if } r<1,\\[2ex]
\dfrac{1}{2\pi}\displaystyle\int_{-\pi}^{\pi} e^{ij\theta} r^{j}\left(\int_{-\pi}^{\pi} \frac{f_X(\lambda)}{1 - \dfrac{e^{-i\theta}S(\lambda)}{r}}\,d\lambda\right)d\theta & \text{if } r>1.
\end{cases} \qquad (A.5)
\]

Changing the integrand in (A.5) into its conjugate when $r<1$, and $r$ into $1/r$ when $r>1$, leads to

\[
\sigma_Y(j) = r^{-j}\int_{-\pi}^{\pi} e^{ij\theta} g(r,\theta)\,d\theta, \quad \forall r\in[0,1[, \qquad (A.6)
\]

where $g(r,\theta)$ is defined in (2.4). As the left-hand side does not depend on $r$, (2.3) is proved.

Let us now prove (2.5). Firstly,

\[
\operatorname{Im}\left(\frac{1}{1 - re^{-i\theta}S(\lambda)} + \frac{1}{1 - re^{-i\theta}S(-\lambda)}\right) = \frac{1}{2i}\left(\frac{1}{1 - re^{-i\theta}S(\lambda)} + \frac{1}{1 - re^{-i\theta}S(-\lambda)} - \frac{1}{1 - re^{i\theta}S(-\lambda)} - \frac{1}{1 - re^{i\theta}S(\lambda)}\right)
\]

is an odd function of $\lambda$. Hence, the imaginary part of the integrand in (2.4) disappears after integration. Secondly,

\[
\operatorname{Re}\left(\frac{1}{1 - re^{-i\theta}S(\lambda)} + \frac{1}{1 - re^{-i\theta}S(-\lambda)}\right) = \frac{1}{2}\left(\frac{1 - r^{2}|S(\lambda)|^{2}}{|1 - re^{-i\theta}S(\lambda)|^{2}} + \frac{1 - r^{2}|S(\lambda)|^{2}}{|1 - re^{-i\theta}S(-\lambda)|^{2}} + 2\right), \qquad (A.7)
\]

and the proof is over.
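The contour identity (A.3), and the $r$-independence behind (A.6), can be illustrated on a toy bounded sequence (our choice $\sigma_Y(j) = 0.7^j$, whose $z$-transform is known in closed form): the Cauchy-type integral recovers $\sigma_Y(j)$ for every radius $r<1$.

```python
import numpy as np

def sigma_minus(z):
    # closed form of sum_{j>=0} (0.7*z)**j for the toy sequence 0.7**j
    return 1.0 / (1.0 - 0.7 * z)

def recover(j, r, n=4096):
    # (1/2*pi*i) over |z|=r of sigma^-(z) z^{-j-1} dz,
    # i.e. (1/2*pi) of (r e^{i t})^{-j} sigma^-(r e^{i t}) dt,
    # approximated by the n-point trapezoidal rule on the circle
    theta = 2 * np.pi * np.arange(n) / n
    z = r * np.exp(1j * theta)
    return np.mean(z ** (-j) * sigma_minus(z)).real

results = [recover(3, r) for r in (0.3, 0.6, 0.9)]
print(results)  # all close to 0.7**3, independently of r
```

The trapezoidal rule on a circle converges extremely fast here, so the three radii agree to near machine precision, mirroring the remark after (A.6).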

A.2. A few technical results

Hereafter we recall some properties used in the paper.

Lemma 12. The Poisson kernel (2.2) satisfies

\[
2\pi P_s(t) \le \frac{1-s^{2}}{(1-s)^{2}} = \frac{1+s}{1-s} \le \frac{2}{\eta}, \quad \forall s \le 1-\eta \text{ with } s<1, \qquad (A.8)
\]

\[
0 < \delta < |t| \le \pi \Longrightarrow P_s(t) < P_s(\delta), \qquad (A.9)
\]

\[
2\pi \sup_{0<s<1} P_s(t) = \frac{1}{|\sin t|}, \quad \forall t \in\, ]-\pi/2,\pi/2[\,. \qquad (A.10)
\]

Proof. The bound (A.8) is direct. Inequality (A.9) comes from the monotonicity of the kernel with respect to $|t|$.

Let us prove (A.10):

\[
\frac{\partial}{\partial s}P_s(t) = \frac{1}{\pi}\,\frac{s^{2}\cos t - 2s + \cos t}{(1+s^{2}-2s\cos t)^{2}}.
\]

It is easily checked that the numerator has one single root $s_0 = (1-|\sin t|)/\cos t$ in $[0,1[$, and that $2\pi P_{s_0}(t) = |\sin t|^{-1}$. $\Box$
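Identity (A.10) and the maximiser $s_0$ can be checked numerically, assuming the usual normalisation $P_s(t) = (1-s^2)/\bigl(2\pi(1+s^2-2s\cos t)\bigr)$ for the kernel (2.2):

```python
import numpy as np

def poisson(s, t):
    # assumed normalisation of the Poisson kernel (2.2)
    return (1 - s ** 2) / (1 + s ** 2 - 2 * s * np.cos(t)) / (2 * np.pi)

t = 0.7                                   # any t in ]-pi/2, pi/2[, t != 0
s0 = (1 - abs(np.sin(t))) / np.cos(t)     # claimed maximiser, here ~0.465
target = 1 / abs(np.sin(t))               # claimed value of 2*pi*sup_s P_s(t)
peak = 2 * np.pi * poisson(s0, t)
s_grid = np.linspace(0.0, 0.999999, 100_001)
sup_grid = (2 * np.pi * poisson(s_grid, t)).max()
print(peak, sup_grid, target)  # the three values agree
```

Both the exact evaluation at $s_0$ and a brute-force grid search over $s$ reproduce $1/|\sin t|$.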

The following result comes from spectral considerations.

Lemma 13. With $r = r(\lambda)$ and $t = t(\lambda)$ defined in (2.6), for all $r\in[0,1[$ and $\theta\in\,]-\pi,\pi[$,

\[
g^{*}(r,\theta) := \frac{1}{4}\int_{-\pi}^{\pi}\left(\frac{1}{\pi} + P_{rr(\lambda)}(t-\theta) + P_{rr(\lambda)}(t+\theta)\right)d\lambda = 1. \qquad (A.11)
\]

Proof. Notice first that if $f_X \equiv 1$, the process $X$ is a white noise with $\operatorname{Var}(X_1) = 2\pi$. Hence, the sampled process is also a white noise with variance $2\pi$. Applying (2.4) to a white noise, we get

\[
2\pi\delta_0(j) = r^{-j}\int_{-\pi}^{\pi} e^{ij\theta}g^{*}(r,\theta)\,d\theta \qquad (A.12)
\]

for all $r\in[0,1[$. Since $r^{j}\delta_0(j) = \delta_0(j)$, we can rewrite (A.12) as

\[
2\pi\delta_0(j) = \int_{-\pi}^{\pi} e^{ij\theta}g^{*}(r,\theta)\,d\theta.
\]

This means that $g^{*}(r,\theta)$ is the Fourier transform of the sequence $(2\pi\delta_0(j))_j$. Consequently $g^{*}(r,\theta) \equiv 1$. $\Box$


A.3. Proof of Proposition 4

As for Proposition 3, the proof consists in finding an integrable function $g(\theta)$ such that

\[
|g(r,\theta)| \le g(\theta), \quad \forall r\in\,]0,1[,\ \theta\in\,]-\pi,\pi[\,. \qquad (A.13)
\]

For that purpose, we need the following estimation of $S(\lambda)$ near zero:

\[
|1 - S(\lambda)| = \left|(1-e^{i\lambda})\sum_{j\ge1} S(j)\,\frac{1-e^{ij\lambda}}{1-e^{i\lambda}}\right| = |1-e^{i\lambda}|\,\left|\sum_{j\ge1} S(j)\,e^{i(j-1)\lambda/2}\,\frac{\sin(j\lambda/2)}{\sin(\lambda/2)}\right| \le |1-e^{i\lambda}|\sum_{j\ge1} jS(j)
\]
\[
= |2i\sin(\lambda/2)e^{i\lambda/2}|\sum_{j\ge1} jS(j) = 2|\sin(\lambda/2)|\sum_{j\ge1} jS(j) \le |\lambda|\sum_{j\ge1} jS(j) = C|\lambda|. \qquad (A.14)
\]
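Bound (A.14) can be checked on a concrete sampling law. The sketch below (our choice: a geometric law, for which the characteristic function $S(\lambda) = \sum_j S(j)e^{ij\lambda}$ has a closed form and $C = \sum_j jS(j) = 1/p$) verifies $|1-S(\lambda)| \le C|\lambda|$ on a grid:

```python
import numpy as np

# Geometric sampling law: S(j) = p * (1-p)**(j-1), j >= 1, so that
# S(lam) = p * e^{i lam} / (1 - (1-p) * e^{i lam}) and C = E[Delta] = 1/p.
p = 0.4
lam = np.linspace(-np.pi, np.pi, 20001)
S = p * np.exp(1j * lam) / (1 - (1 - p) * np.exp(1j * lam))
ok = bool(np.all(np.abs(1 - S) <= np.abs(lam) / p + 1e-12))
print(ok)  # the bound (A.14) holds on the whole grid
```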

Now we use the fact that

\[
|u| \le u_0 < 1 \Longrightarrow |1-u| > 1-u_0 \quad\text{and}\quad |\sin(\arg(1-u))| < u_0.
\]

Since $u_0 < \pi/2$, this implies $|\arg(1-u)| \le \pi u_0/2$. From this and inequality (A.14) we obtain

\[
|\lambda| \le \frac{1}{C} \Longrightarrow |t(\lambda)| \le \frac{\pi C|\lambda|}{2}. \qquad (A.15)
\]

For a fixed $\theta>0$, let $\lambda_0$ be such that $f$ is bounded on $[-\lambda_0,\lambda_0]$. Denoting

\[
b(\theta) = \min\left\{\lambda_0,\ \frac{1}{C},\ \frac{\theta}{\pi C}\right\},
\]

we separate $[-\pi,\pi]$ into four intervals:

\[
[-\pi,-b(\theta)[\,,\quad [-b(\theta),0[\,,\quad ]0,b(\theta)]\,,\quad ]b(\theta),\pi].
\]

In the sequel we only deal with the two last intervals, and concerning the integrand in (2.5) we only treat the part $P_{rr(\lambda)}(t-\theta)$:

\[
I_1(\theta) + I_2(\theta) = \int_0^{b(\theta)} f_X(\lambda)P_{rr(\lambda)}(t(\lambda)-\theta)\,d\lambda + \int_{b(\theta)}^{\pi} f_X(\lambda)P_{rr(\lambda)}(t(\lambda)-\theta)\,d\lambda.
\]

$\bullet$ Bounding $I_1$: From (A.15), since $b(\theta) \le \theta/(\pi C)$, we have $|t(\lambda)| \le \theta/2$, which implies

\[
\frac{\theta}{2} \le |t(\lambda) - \theta|.
\]

Via (A.9) and (A.10), this leads to

\[
P_{rr}(t-\theta) \le P_{rr}(\theta/2) \le \frac{1}{2\theta}.
\]

Consequently,

\[
I_1(\theta) \le \frac{\sup_{[0,b(\theta)]} f(\lambda)}{2\theta}\int_0^{b(\theta)} \lambda^{-2d}\,d\lambda = \frac{\sup_{[0,b(\theta)]} f(\lambda)}{2\theta}\,\frac{(b(\theta))^{-2d+1}}{-2d+1} \le C_1\theta^{-2d},
\]

since $b(\theta) \le \theta/(\pi C)$ and $-2d+1>0$.

$\bullet$ Bounding $I_2$: When $\lambda > b(\theta)$, we have $\lambda^{-2d} \le C_2\max\{\theta^{-2d},1\}$ for some constant $C_2$. Hence

\[
I_2(\theta) \le C_2\max\{\theta^{-2d},1\}\int_{-\pi}^{\pi} f(\lambda)P_{rr(\lambda)}(\theta - t)\,d\lambda. \qquad (A.16)
\]

Since $f$ is bounded in a neighbourhood of zero, the arguments used to prove Proposition 3 show that the integral in (A.16) is bounded by a constant.

Finally, $I_1 + I_2$ is bounded by an integrable function $g(\theta)$ and the proposition is proved.

A.4. Proof of Proposition 5

The following lemma gives the local behaviour of $S(\lambda)$ under assumption (2.11).

Lemma 14. If $S(j) \sim cj^{-\gamma}$ with $1<\gamma<2$ and $c>0$, then

\[
|\lambda|^{1-\gamma}(1 - S(\lambda)) \to Z \quad\text{as } \lambda \to 0, \qquad (A.17)
\]

where $\operatorname{Re}(Z) > 0$ and $\operatorname{Im}(Z) < 0$.

Page 14: Random sampling of long-memory stationary processes

ARTICLE IN PRESS

A. Philippe, M.-C. Viano / Journal of Statistical Planning and Inference 140 (2010) 1110–1124 1123

Proof. From the assumption on $S(j)$,

\[
\sum_{j\ge1} S(j)(1-\cos(j\lambda)) \underset{\lambda\to0}{\sim} c\sum_{j\ge1} j^{-\gamma}(1-\cos(j\lambda))
\quad\text{and}\quad
\sum_{j\ge1} S(j)\sin(j\lambda) \underset{\lambda\to0}{\sim} c\sum_{j\ge1} j^{-\gamma}\sin(j\lambda).
\]

Then, using well known results of Zygmund (1959, pp. 186 and 189), for $\lambda>0$,

\[
c\sum_{j\ge1} j^{-\gamma}(1-\cos(j\lambda)) \underset{\lambda\to0}{\sim} \frac{c\,\Gamma(2-\gamma)\sin(\pi\gamma/2)}{\gamma-1}\,|\lambda|^{\gamma-1} =: c_1|\lambda|^{\gamma-1}
\]

and

\[
c\sum_{j\ge1} j^{-\gamma}\sin(j\lambda) \underset{\lambda\to0}{\sim} c\,\Gamma(1-\gamma)\cos\!\left(\frac{\pi\gamma}{2}\right)|\lambda|^{\gamma-1} =: c_2|\lambda|^{\gamma-1}.
\]

It is clear that $c_1>0$, and $c_2>0$ follows from the fact that $\Gamma(x)<0$ for $x\in\,]-1,0[$ and $\cos(\pi\gamma/2)<0$. Since $1-S(\lambda) = \sum_{j\ge1}S(j)(1-\cos(j\lambda)) - i\sum_{j\ge1}S(j)\sin(j\lambda)$, we get $Z = c_1 - ic_2$, so that $\operatorname{Re}(Z) = c_1 > 0$ and $\operatorname{Im}(Z) = -c_2 < 0$. $\Box$
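The sign pattern of Lemma 14 shows up clearly in a direct computation. The sketch below assumes $S(\lambda) = \sum_j S(j)e^{ij\lambda}$ and takes $S(j)$ proportional to $j^{-\gamma}$ (truncated and normalised, an illustrative choice): the rescaled quantity $\lambda^{1-\gamma}(1-S(\lambda))$ has positive real part and negative imaginary part as $\lambda\downarrow0$.

```python
import numpy as np

gamma, N = 1.5, 1_000_000
j = np.arange(1, N + 1)
w = j.astype(float) ** (-gamma)
w /= w.sum()                      # S(j): a probability law on {1,...,N}

def Z_approx(lam):
    # lam**(1-gamma) * (1 - S(lam)) with S(lam) = sum_j S(j) e^{i j lam}
    S_lam = np.sum(w * np.exp(1j * j * lam))
    return lam ** (1 - gamma) * (1 - S_lam)

zs = [Z_approx(lam) for lam in (0.02, 0.01, 0.005)]
for z in zs:
    print(z.real > 0, z.imag < 0)  # Re(Z) > 0 and Im(Z) < 0, as in (A.17)
```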

In the sequel we take $\lambda>0$. From this lemma, if $\lambda$ is small enough (say $0\le\lambda\le\lambda_0$),

\[
c_3\lambda^{\gamma-1} \le t(\lambda) \le c_3'\lambda^{\gamma-1} \qquad (A.18)
\]

and

\[
1 - c_4\lambda^{\gamma-1} \le r(\lambda) \le 1 - c_4'\lambda^{\gamma-1}, \qquad (A.19)
\]

where the constants are positive.

For a fixed $\theta>0$, we deduce from (A.18) that

\[
\lambda < \min\left\{\lambda_0,\ \left(\frac{\theta}{c_3'}\right)^{1/(\gamma-1)}\right\} \quad\text{implies}\quad 0 < \theta - c_3'\lambda^{\gamma-1} \le \theta - t(\lambda). \qquad (A.20)
\]

After choosing $\lambda_1 \le \lambda_0$ such that $f$ is bounded on $[0,\lambda_1]$, define

\[
c(\theta) = \min\left\{\lambda_1,\ \left(\frac{\theta}{c_3'}\right)^{1/(\gamma-1)}\right\}.
\]

Then we split $[-\pi,\pi]$ into six intervals

\[
[-\pi,-\lambda_1[\,,\ [-\lambda_1,-c(\theta)/2[\,,\ [-c(\theta)/2,0[\,,\ ]0,c(\theta)/2]\,,\ ]c(\theta)/2,\lambda_1]\,,\ ]\lambda_1,\pi].
\]

We only consider the integral on the three last domains and the part $P_{rr(\lambda)}(t-\theta)$ of the integrand in (2.5).

$\bullet$ When $\lambda \in [0,c(\theta)/2]$, inequality (A.20) and properties (A.9) and (A.10) of the Poisson kernel lead to

\[
P_{rr}(\theta - t) \le P_{rr}(\theta - c_3'\lambda^{\gamma-1}) \le \frac{C}{\theta - c_3'\lambda^{\gamma-1}},
\]

whence

\[
I_1 = \int_0^{c(\theta)/2} f_X(\lambda)P_{rr}(\theta-t)\,d\lambda \le C_1'\int_0^{c(\theta)/2} \frac{\lambda^{-2d}}{\theta - c_3'\lambda^{\gamma-1}}\,d\lambda \le C_1'\int_0^{\frac{1}{2}(\theta/c_3')^{1/(\gamma-1)}} \frac{\lambda^{-2d}}{\theta - c_3'\lambda^{\gamma-1}}\,d\lambda = C_2'\,\theta^{(-2d+1)/(\gamma-1)-1}\int_0^{2^{-(\gamma-1)}} \frac{u^{(-2d+1)/(\gamma-1)-1}}{1-u}\,du.
\]

Since $(-2d+1)/(\gamma-1) - 1 > -1$, the last integral is finite, implying

\[
I_1 \le C_3'\,\theta^{(-2d+1)/(\gamma-1)-1},
\]

which is an integrable function of $\theta$.

$\bullet$ Thanks to (A.8) and to the r.h.s. of (A.19), we have on the interval $]c(\theta)/2,\lambda_1]$

\[
P_{rr}(\theta - t) \le \frac{C_3'}{\lambda^{\gamma-1}}.
\]

Since $f$ is bounded on this domain,

\[
I_2 \le C_3'\sup_{0<\lambda\le\lambda_1} f(\lambda)\int_{c(\theta)/2}^{\lambda_1} \lambda^{-2d+1-\gamma}\,d\lambda = \frac{C_3'\sup_{0<\lambda\le\lambda_1} f(\lambda)}{-2d+2-\gamma}\left(\lambda_1^{-2d+2-\gamma} - (c(\theta)/2)^{-2d+2-\gamma}\right)
\]
\[
\le \frac{C_3'\sup_{0<\lambda\le\lambda_1} f(\lambda)}{|-2d+2-\gamma|}\left(\lambda_1^{-2d+2-\gamma} + (c(\theta)/2)^{-2d+2-\gamma}\right) \le C_4'\left(\lambda_1^{-2d+2-\gamma} + \left(\min\left\{\lambda_1,\ \left(\frac{\theta}{c_3'}\right)^{1/(\gamma-1)}\right\}\right)^{-2d+2-\gamma}\right),
\]

where the function between brackets is integrable because

\[
\frac{-2d+2-\gamma}{\gamma-1} = \frac{-2d+1}{\gamma-1} - 1 > -1.
\]

$\bullet$ Finally,

\[
I_3 = \int_{\lambda_1}^{\pi} f_X(\lambda)P_{rr}(\theta - t)\,d\lambda \le \lambda_1^{-2d}\int_{-\pi}^{\pi} f(\lambda)P_{rr}(\theta - t)\,d\lambda,
\]

which has already been treated, since $f$ is bounded near zero.

Gathering the above results on $I_1$, $I_2$ and $I_3$ completes the proof.

References

Bardet, J.M., Bertrand, P., 2008. Wavelet analysis of a continuous-time Gaussian process observed at random times and the application to the estimation of the spectral density. Preprint.

Bardet, J.M., Lang, G., Oppenheim, G., Philippe, A., Taqqu, M., 2002. Generators of long-range dependent processes: a survey. In: Doukhan, P., Taqqu, M., Oppenheim, G. (Eds.), Long-Range Dependence: Theory and Applications. Birkhäuser, Boston.

Bardet, J.M., Lang, G., Oppenheim, G., Philippe, A., Stoev, S., Taqqu, M., 2003. Semi-parametric estimation of the long-range dependent processes: a survey. In: Doukhan, P., Taqqu, M., Oppenheim, G. (Eds.), Long-Range Dependence: Theory and Applications. Birkhäuser, Boston.

Beutler, F.J., 1970. Alias-free randomly timed sampling of stochastic processes. IEEE Trans. Inform. Theory IT-16 (2), 147–152.

Bingham, N.H., Goldie, C.M., Teugels, J.L., 1987. Regular Variation. Encyclopedia of Mathematics and its Applications. Cambridge University Press, Cambridge.

Brewer, K., 1975. Some consequences of temporal aggregation and systematic sampling for ARMA or ARMAX models. J. Econom. 1, 133–154.

Brockwell, P.J., Davis, R.A., 1991. Time Series: Theory and Methods. Springer, New York.

Chambers, M., 1998. Long-memory and aggregation in macroeconomic time-series. Internat. Econom. Rev. 39, 1053–1072.

Feller, W., 1971. An Introduction to Probability Theory and Its Applications, vol. II. Wiley, New York.

Gray, H.L., Zhang, N.F., Woodward, W.A., 1989. On generalized fractional processes. J. Time Ser. Anal. 10, 233–257.

Greenshtein, E., Ritov, Y., 2000. Sampling from a stationary process and detecting a change in the mean of a stationary distribution. Bernoulli 6, 679–697.

Hwang, S., 2000. The effects of systematic sampling and temporal aggregation on discrete time long memory processes and their finite sample properties. Econometric Theory 16, 347–372.

Kadi, A., Oppenheim, G., Viano, M.-C., 1994. Random aggregation of uni and multivariate linear processes. J. Time Ser. Anal. 15, 31–44.

Leipus, R., Viano, M.-C., 2000. Modelling long-memory time series with finite or infinite variance: a general approach. J. Time Ser. Anal. 21, 61–74.

Masry, E., 1971. Random sampling and reconstruction of spectra. Inform. and Control 19 (4), 275–288.

Moulines, E., Soulier, Ph., 1999. Log-periodogram regression of time series with long range dependence. Ann. Statist. 27 (4), 1415–1439.

Niemi, H., 1984. The invertibility of sampled and aggregate ARMA models. Metrika 31, 43–50.

Oppenheim, G., 1982. Échantillonnage aléatoire d'un processus ARMA. C. R. Acad. Sci. Paris Sér. I Math. 295 (5), 403–406.

Oppenheim, G., Ould Haye, M., Viano, M.-C., 2000. Long-memory with seasonal effects. Statist. Inference Stochastic Process. 3, 53–68.

Shapiro, H.S., Silverman, R.A., 1960. Alias-free sampling of random noise. J. Soc. Indust. Appl. Math. 8 (2), 225–248.

Souza, L.R., Smith, J., 2002. Bias in the memory parameter for different sampling rates. Internat. J. Forecast. 18, 299–313.

Souza, L.R., 2005. A note on Chambers' "long-memory and aggregation in macroeconomic time series". Internat. Econom. Rev. 46, 1059–1062.

Souza, L.R., Smith, J., Souza, R.C., 2006. Convex combinations of long-memory estimates from different sampling rates. Comput. Statist. 21, 399–413.

Stram, D.O., Wei, W.S., 1986. Temporal aggregation in the ARIMA processes. J. Time Ser. Anal. 7, 279–292.

Stout, W.F., 1974. Almost Sure Convergence. Academic Press, New York.

Zygmund, A., 1959. Trigonometric Series, vol. 1. Cambridge University Press, Cambridge.