Time Series Models
DESCRIPTION
A quick overview of time series analysis.

TRANSCRIPT
Time Series Models
Topics
• Stochastic processes
• Stationarity
• White noise
• Random walk
• Moving average processes
• Autoregressive processes
• More general processes
Stochastic Processes
Stochastic processes
• Time series are an example of a stochastic or random process
• A stochastic process is 'a statistical phenomenon that evolves in time according to probabilistic laws'
• Mathematically, a stochastic process is an indexed collection of random variables $\{X_t : t \in T\}$
Stochastic processes
• We are concerned only with processes indexed by time, either continuous-time processes such as $\{X_t : t \in (-\infty, \infty)\}$ or discrete-time processes such as $\{X_t : t \in \{0, 1, 2, \ldots\}\}$, i.e. $X_0, X_1, X_2, \ldots$
Inference
• We usually base our inference on a single observation or realization of the process over some period of time, say [0, T] (a continuous interval of time) or at a sequence of time points {0, 1, 2, ..., T}
Specification of a process
• To describe a stochastic process fully, we must specify all the finite-dimensional distributions, i.e. the joint distribution of the random variables $X_{t_1}, X_{t_2}, X_{t_3}, \ldots, X_{t_n}$ for any finite set of times $t_1, \ldots, t_n$
Specification of a process
• A simpler approach is to specify only the moments; this is sufficient if all the joint distributions are normal
• The mean and variance functions are given by
$$\mu_t = E(X_t), \qquad \sigma_t^2 = \mathrm{Var}(X_t)$$
Autocovariance
• Because the random variables comprising the process are not independent, we must also specify their covariance
$$\gamma_{t_1,t_2} = \mathrm{Cov}(X_{t_1}, X_{t_2})$$
Stationarity
Stationarity
• Inference is easiest when a process is stationary: its distribution does not change over time
• This is strict stationarity
• A process is weakly stationary if its mean and autocovariance functions do not change over time
Weak stationarity
• The autocovariance depends only on the time difference or lag between the two time points involved:
$$\mu_t = \mu, \qquad \sigma_t^2 = \sigma^2$$
and
$$\gamma_{t_1,t_2} = \mathrm{Cov}(X_{t_1}, X_{t_2}) = \mathrm{Cov}(X_{t_1+\tau}, X_{t_2+\tau}), \qquad \gamma_{t_1,t_2} = \gamma_{t_2 - t_1}$$
Autocorrelation
• It is useful to standardize the autocovariance function (acvf)
• Consider the stationary case only
• Use the autocorrelation function (acf)
$$\rho_\tau = \frac{\gamma_\tau}{\gamma_0}$$
Autocorrelation
• More than one process can have the same acf
• Properties are:
$$\rho_0 = 1, \qquad \rho_\tau = \rho_{-\tau}, \qquad |\rho_\tau| \le 1$$
White Noise
White noise
• This is a purely random process, a sequence of independent and identically distributed random variables
• Has constant mean and variance
• Also
$$\gamma_k = \mathrm{Cov}(Z_t, Z_{t+k}) = 0, \quad k \ne 0, \qquad \rho_k = \begin{cases} 1 & k = 0 \\ 0 & k \ne 0 \end{cases}$$
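A minimal simulation sketch (not from the slides; numpy is assumed, and the seed and sample size are arbitrary illustrative choices) showing that the sample autocorrelations of white noise are near zero at non-zero lags:

```python
import numpy as np

rng = np.random.default_rng(0)
N = 1000
z = rng.normal(loc=0.0, scale=1.0, size=N)  # i.i.d. N(0,1) white noise

# Sample autocorrelations r_k for a few lags; theory says they should be
# near zero for k != 0 (within roughly 2/sqrt(N) of zero).
zbar = z.mean()
c0 = np.mean((z - zbar) ** 2)
for k in range(1, 6):
    ck = np.mean((z[:-k] - zbar) * (z[k:] - zbar))
    print(f"lag {k}: r_k = {ck / c0:+.3f}")
```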
Random Walk
Random walk
• Start with {Z_t} being white noise or purely random
• {X_t} is a random walk if
$$X_0 = 0, \qquad X_t = X_{t-1} + Z_t = \sum_{k=1}^{t} Z_k$$
Random walk
• The random walk is not stationary:
$$E(X_t) = t\mu, \qquad \mathrm{Var}(X_t) = t\sigma^2$$
• First differences are stationary:
$$\nabla X_t = X_t - X_{t-1} = Z_t$$
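A short sketch, again assuming numpy, illustrating that differencing a simulated random walk recovers the underlying white noise exactly:

```python
import numpy as np

rng = np.random.default_rng(1)
t = 500
z = rng.normal(size=t)
x = np.cumsum(z)          # random walk: X_t = Z_1 + ... + Z_t, with X_0 = 0

# The walk itself is non-stationary (variance grows like t*sigma^2),
# but its first differences recover the white noise.
dx = np.diff(x)
assert np.allclose(dx, z[1:])
print("Var of first half:", x[:250].var(), " second half:", x[250:].var())
```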
Moving Average Processes
Moving average processes
• Start with {Z_t} being white noise or purely random, with mean zero and standard deviation $\sigma_Z$
• {X_t} is a moving average process of order q (written MA(q)) if for some constants $\beta_0, \beta_1, \ldots, \beta_q$ we have
$$X_t = \beta_0 Z_t + \beta_1 Z_{t-1} + \cdots + \beta_q Z_{t-q}$$
• Usually $\beta_0 = 1$
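A hedged sketch of simulating an MA(2) process with numpy (the coefficients are illustrative, not from the slides); the sample variance is compared with the theoretical $\sigma_Z^2 \sum \beta_k^2$:

```python
import numpy as np

rng = np.random.default_rng(2)
N = 20000
beta = np.array([1.0, 0.6, -0.3])        # beta_0 = 1; an MA(2) example
z = rng.normal(size=N + len(beta) - 1)   # white noise with sigma_Z = 1

# X_t = beta_0*Z_t + beta_1*Z_{t-1} + beta_2*Z_{t-2}
x = np.convolve(z, beta, mode="valid")

print("sample var:", x.var())
print("theory    :", (beta ** 2).sum())  # sigma_Z^2 * sum(beta_k^2) = 1.45
```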
Moving average processes
• The mean and variance are given by
$$E(X_t) = 0, \qquad \mathrm{Var}(X_t) = \sigma_Z^2 \sum_{k=0}^{q} \beta_k^2$$
• The process is weakly stationary because the mean is constant and the covariance does not depend on t
Moving average processes
• If the $Z_t$'s are normal then so is the process, and it is then strictly stationary
• The autocorrelation is
$$\rho_k = \begin{cases} 1 & k = 0 \\ \displaystyle\sum_{i=0}^{q-k} \beta_i \beta_{i+k} \Big/ \sum_{i=0}^{q} \beta_i^2 & k = 1, \ldots, q \\ 0 & k > q \\ \rho_{-k} & k < 0 \end{cases}$$
Moving average processes
• Note the autocorrelation cuts off at lag q
• For the MA(1) process with $\beta_0 = 1$:
$$\rho_k = \begin{cases} 1 & k = 0 \\ \beta_1 / (1 + \beta_1^2) & k = \pm 1 \\ 0 & \text{otherwise} \end{cases}$$
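To illustrate the cut-off, a small numpy check (illustrative parameters) comparing the sample autocorrelations of a simulated MA(1) with the theoretical $\beta_1/(1+\beta_1^2)$:

```python
import numpy as np

rng = np.random.default_rng(3)
N, b1 = 50000, 0.7
z = rng.normal(size=N + 1)
x = z[1:] + b1 * z[:-1]                  # MA(1): X_t = Z_t + b1*Z_{t-1}

r1 = np.corrcoef(x[:-1], x[1:])[0, 1]    # sample lag-1 autocorrelation
print("sample rho_1:", r1)
print("theory rho_1:", b1 / (1 + b1 ** 2))                 # 0.7/1.49 ~ 0.470
print("sample rho_2:", np.corrcoef(x[:-2], x[2:])[0, 1])   # ~ 0 (cuts off)
```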
Moving average processes
• In order to ensure there is a unique MA process for a given acf, we impose the condition of invertibility
• This ensures that when the process is written in series form, the series converges
• For the MA(1) process $X_t = Z_t + \beta Z_{t-1}$, the condition is $|\beta| < 1$
Moving average processes
• For general processes, introduce the backward shift operator B:
$$B^j X_t = X_{t-j}$$
• Then the MA(q) process is given by
$$X_t = (\beta_0 + \beta_1 B + \beta_2 B^2 + \cdots + \beta_q B^q) Z_t = \theta(B) Z_t$$
Moving average processes
• The general condition for invertibility is that all the roots of the equation $\theta(B) = 0$ lie outside the unit circle (have modulus greater than one)
Autoregressive Processes
Autoregressive processes
• Assume {Z_t} is purely random with mean zero and standard deviation $\sigma_Z$
• Then the autoregressive process of order p, or AR(p) process, is
$$X_t = \alpha_1 X_{t-1} + \alpha_2 X_{t-2} + \cdots + \alpha_p X_{t-p} + Z_t$$
Autoregressive processes
• The first-order autoregression is
$$X_t = \alpha X_{t-1} + Z_t$$
• Provided $|\alpha| < 1$ it may be written as an infinite-order MA process
• Using the backshift operator we have
$$(1 - \alpha B) X_t = Z_t$$
Autoregressive processes
• From the previous equation we have
$$X_t = \frac{Z_t}{1 - \alpha B} = (1 + \alpha B + \alpha^2 B^2 + \cdots) Z_t = Z_t + \alpha Z_{t-1} + \alpha^2 Z_{t-2} + \cdots$$
Autoregressive processes
• Then $E(X_t) = 0$, and if $|\alpha| < 1$:
$$\mathrm{Var}(X_t) = \sigma_X^2 = \sigma_Z^2 / (1 - \alpha^2)$$
$$\gamma_k = \alpha^k \sigma_Z^2 / (1 - \alpha^2), \qquad \rho_k = \alpha^k \quad (k \ge 0)$$
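A quick numpy check (illustrative $\alpha$ and seed, not from the slides) that the AR(1) sample variance and autocorrelations match $\sigma_Z^2/(1-\alpha^2)$ and $\alpha^k$:

```python
import numpy as np

rng = np.random.default_rng(4)
N, alpha = 20000, 0.8
z = rng.normal(size=N)
x = np.empty(N)
x[0] = z[0]
for t in range(1, N):                    # X_t = alpha*X_{t-1} + Z_t
    x[t] = alpha * x[t - 1] + z[t]

print("sample var:", x.var(), " theory:", 1 / (1 - alpha ** 2))   # ~ 2.78
for k in (1, 2, 3):
    rk = np.corrcoef(x[:-k], x[k:])[0, 1]
    print(f"rho_{k}: sample {rk:.3f}  theory {alpha ** k:.3f}")
```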
Autoregressive processes
• The AR(p) process can be written as
$$(1 - \alpha_1 B - \alpha_2 B^2 - \cdots - \alpha_p B^p) X_t = Z_t$$
or
$$X_t = \frac{Z_t}{1 - \alpha_1 B - \alpha_2 B^2 - \cdots - \alpha_p B^p} = f(B) Z_t$$
Autoregressive processes
• This is for
$$f(B) = (1 - \alpha_1 B - \cdots - \alpha_p B^p)^{-1} = 1 + \beta_1 B + \beta_2 B^2 + \cdots$$
for some $\beta_1, \beta_2, \ldots$
• This gives $X_t$ as an infinite MA process, so it has mean zero
Autoregressive processes
• Conditions are needed to ensure that the various series converge, and hence that the variance exists and the autocovariance can be defined
• Essentially these are requirements that the $\beta_i$ become small quickly enough for large i
Autoregressive processes
• It may not be possible to find the $\beta_i$ explicitly, however
• The alternative is to work with the $\alpha_i$
• The acf is expressible in terms of the roots $\pi_i$, $i = 1, 2, \ldots, p$, of the auxiliary equation
$$y^p - \alpha_1 y^{p-1} - \cdots - \alpha_p = 0$$
Autoregressive processes
• Then a necessary and sufficient condition for stationarity is that $|\pi_i| < 1$ for every i
• An equivalent way of expressing this is that the roots of the equation
$$\phi(B) = 1 - \alpha_1 B - \cdots - \alpha_p B^p = 0$$
must lie outside the unit circle
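A sketch of this root check using numpy.roots; the AR(2) coefficients here are made-up illustrative values, not from the slides:

```python
import numpy as np

# phi(B) = 1 - a1*B - a2*B^2 for an AR(2) with a1 = 0.75, a2 = -0.5.
# numpy.roots expects coefficients ordered from the highest power down.
alpha = [0.75, -0.5]
phi = np.array([-alpha[1], -alpha[0], 1.0])     # -a2*B^2 - a1*B + 1
roots = np.roots(phi)

print("roots of phi(B) = 0:", roots)
print("stationary:", np.all(np.abs(roots) > 1))  # all outside the unit circle
```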
ARMA processes
• Combine AR and MA processes
• An ARMA process of order (p, q) is given by
$$X_t = \alpha_1 X_{t-1} + \cdots + \alpha_p X_{t-p} + Z_t + \beta_1 Z_{t-1} + \cdots + \beta_q Z_{t-q}$$
ARMA processes
• Alternative expressions are possible using the backshift operator:
$$\phi(B) X_t = \theta(B) Z_t$$
where
$$\phi(B) = 1 - \alpha_1 B - \cdots - \alpha_p B^p, \qquad \theta(B) = 1 + \beta_1 B + \cdots + \beta_q B^q$$
ARMA processes
• An ARMA process can be written in pure MA or pure AR form, the operators being possibly of infinite order:
$$X_t = \psi(B) Z_t, \qquad \pi(B) X_t = Z_t$$
• Usually the mixed form requires fewer parameters
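As a sketch of the pure-MA representation (assuming statsmodels is installed; the ARMA(1,1) parameters are illustrative), statsmodels can convert an ARMA model to its infinite-MA weights. Note its ArmaProcess takes polynomial coefficients, so the AR side enters with flipped signs:

```python
import numpy as np
from statsmodels.tsa.arima_process import ArmaProcess

# ARMA(1,1): X_t = 0.7 X_{t-1} + Z_t + 0.4 Z_{t-1}.
proc = ArmaProcess(ar=np.array([1.0, -0.7]), ma=np.array([1.0, 0.4]))

print("stationary:", proc.isstationary, " invertible:", proc.isinvertible)
print("pure-MA weights psi_0..psi_5:", proc.arma2ma(lags=6))
# psi_k = 0.7**(k-1) * (0.7 + 0.4) for k >= 1: decays but never truncates
x = proc.generate_sample(nsample=500)    # a simulated realization
```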
ARIMA processes
• General autoregressive integrated moving average processes are called ARIMA processes
• When differenced, say d times, the process is an ARMA process
• Call the differenced process W_t. Then W_t is an ARMA process and
$$W_t = \nabla^d X_t = (1 - B)^d X_t$$
ARIMA processes
• Alternatively, specify the process as
$$\phi(B) W_t = \theta(B) Z_t$$
or
$$\phi(B) (1 - B)^d X_t = \theta(B) Z_t$$
• This is an ARIMA process of order (p, d, q)
ARIMA processes
• The model for X_t is non-stationary because the AR operator on the left-hand side has d roots on the unit circle
• d is often 1
• The random walk is ARIMA(0, 1, 0)
• Can include seasonal terms (see later)
Non-zero mean
• We have assumed that the mean is zero in the ARIMA models
• There are two alternatives:
  - mean-correct all the W_t terms in the model
  - incorporate a constant term in the model
The Box-Jenkins Approach
Topics
• Outline of the approach
• Sample autocorrelation & partial autocorrelation
• Fitting ARIMA models
• Diagnostic checking
• Example
• Further ideas
Outline of the Box-Jenkins Approach
Box-Jenkins approach
• The approach is an iterative one involving:
  - model identification
  - model fitting
  - model checking
• If the model checking reveals that there are problems, the process is repeated
Models
• Models to be fitted are from the ARIMA class of models (or the SARIMA class if the data are seasonal)
• The major tools in the identification process are the (sample) autocorrelation function and partial autocorrelation function
Autocorrelation
• Use the sample autocovariance and sample variance to estimate the autocorrelation
• The obvious estimator of the autocovariance is
$$c_k = \frac{1}{N} \sum_{t=1}^{N-k} (X_t - \bar{X})(X_{t+k} - \bar{X})$$
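A direct numpy implementation of this estimator (a sketch, not the slides' Minitab workflow):

```python
import numpy as np

def acvf(x, k):
    """Sample autocovariance c_k with divisor N (the 'obvious' estimator)."""
    x = np.asarray(x, dtype=float)
    N, xbar = len(x), x.mean()
    return np.sum((x[: N - k] - xbar) * (x[k:] - xbar)) / N

def acf(x, k):
    """Sample autocorrelation r_k = c_k / c_0."""
    return acvf(x, k) / acvf(x, 0)

rng = np.random.default_rng(5)
x = rng.normal(size=200)
print([round(acf(x, k), 3) for k in range(5)])
```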
Autocovariances
• The sample autocovariances are not unbiased estimates of the autocovariances; the bias is of order 1/N
• Sample autocovariances are correlated, so they may display smooth ripples at long lags which are not in the actual autocovariances
Autocovariances
• Can use a different divisor (N − k instead of N) to decrease bias, but this may increase mean square error
• Can use jackknifing to reduce bias (to order 1/N²): divide the sample in half and estimate using the whole series and both halves
Autocorrelation
• It is more difficult to obtain the properties of the sample autocorrelation
• Generally still biased
• When the process is white noise,
$$E(r_k) \approx -1/N, \qquad \mathrm{Var}(r_k) \approx 1/N$$
• The correlations are approximately normal for large N
Autocorrelation
• This gives a rough test of whether an autocorrelation is non-zero
• If $|r_k| > 2/\sqrt{N}$, suspect the autocorrelation at that lag is non-zero (illustrated in the sketch below)
• Note that when examining many autocorrelations, the chance of falsely identifying a non-zero one increases
• Consider the physical interpretation
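A rough sketch of this significance guide in code (assuming statsmodels for the acf helper; by chance roughly 1 lag in 20 of a white noise series will be flagged):

```python
import numpy as np
from statsmodels.tsa.stattools import acf

rng = np.random.default_rng(6)
x = rng.normal(size=400)                 # white noise for illustration
r = acf(x, nlags=20)                     # r[0] = 1, then lags 1..20

band = 2 / np.sqrt(len(x))               # rough +/- 2/sqrt(N) guide
flagged = [k for k in range(1, 21) if abs(r[k]) > band]
print("lags flagged as non-zero:", flagged)
```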
Partial autocorrelation
• Broadly speaking, the partial autocorrelation is the correlation between X_t and X_{t+k} with the effect of the intervening variables removed
• Sample partial autocorrelations are found from sample autocorrelations by solving a set of equations known as the Yule-Walker equations
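A sketch of this computation in numpy (an illustration, not the slides' method): at each order k, solve the Yule-Walker system built from the autocorrelations and keep the last AR(k) coefficient. The example feeds in the theoretical acf of an AR(1), so the pacf should cut off after lag 1:

```python
import numpy as np

def pacf_yw(r):
    """Partial autocorrelations from autocorrelations r[0..K] by solving
    the Yule-Walker equations at each order k; the lag-k partial
    autocorrelation is the last AR(k) coefficient."""
    K = len(r) - 1
    out = []
    for k in range(1, K + 1):
        R = np.array([[r[abs(i - j)] for j in range(k)] for i in range(k)])
        phi = np.linalg.solve(R, r[1 : k + 1])
        out.append(phi[-1])
    return np.array(out)

# Theoretical acf of an AR(1) with alpha = 0.6 -> pacf cuts off after lag 1.
r = 0.6 ** np.arange(6)
print(np.round(pacf_yw(r), 6))   # [0.6, 0, 0, 0, 0]
```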
Model identification
• Plot the autocorrelations and partial autocorrelations for the series
• Use these to try to identify an appropriate model
• Consider stationary series first
Stationary series
• For an MA(q) process the autocorrelation is zero at lags greater than q, and the partial autocorrelations tail off in an exponential fashion
• For an AR(p) process the partial autocorrelation is zero at lags greater than p, and the autocorrelations tail off in an exponential fashion
Stationary series
• For mixed ARMA processes, both the acf and pacf will have large values up to lags q and p respectively, then tail off in an exponential fashion
• See the graphs in M&W, pp. 136–137
• The approach used is to try fitting a model and examine the residuals
Non-stationary series
• The existence of non-stationarity is indicated by an acf which is large at long lags
• Induce stationarity by differencing
• Differencing once is generally sufficient; twice may be needed
• Overdifferencing introduces autocorrelation and should be avoided
Estimation
Estimation
• We will always fit the model using Minitab (an open-source sketch follows below)
• AR models may be fitted by least squares, or by solving the Yule-Walker equations
• MA models require an iterative procedure
• ARMA models are like MA models
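The slides fit models in Minitab; as a rough open-source analogue (an assumption, not the slides' method), the ARIMA class in statsmodels performs iterative maximum likelihood fitting and reports standard errors. The simulated series and order are illustrative:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(7)
z = rng.normal(size=501)
x = np.cumsum(z[1:] + 0.5 * z[:-1])       # an ARIMA(0,1,1) realization

res = ARIMA(x, order=(0, 1, 1)).fit()     # iterative maximum likelihood fit
print(res.params)                          # MA(1) coefficient and sigma^2
print(res.bse)                             # standard errors -> CIs and tests
```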
Minitab
• Minitab uses an iterative least squares approach to fitting ARMA models
• Standard errors can be calculated for the parameter estimates, so confidence intervals and tests of significance can be carried out
Diagnostic Checking
Diagnostic checking
• Based on residuals
• Residuals should be normally distributed with zero mean, be uncorrelated, and have minimum variance or dispersion
Procedures
• Plot residuals against time
• Draw a histogram
• Obtain a normal scores plot
• Plot the acf and pacf of the residuals
• Plot residuals against fitted values
• Note that residuals are not uncorrelated, but are approximately so at long lags
Procedures
• Portmanteau test
• Overfitting
Portmanteau test
• Box and Pierce proposed a statistic which tests the magnitudes of the residual autocorrelations as a group
• Their test compares Q below with the Chi-Square distribution with K − p − q d.f. when fitting an ARMA(p, q) model:
$$Q = N \sum_{k=1}^{K} r_k^2$$
Portmanteau test
• Ljung & Box found that the test performs poorly unless N is very large
• Instead use the modified Box-Pierce, or Ljung-Box, statistic; reject the model if Q* is too large:
$$Q^* = N(N+2) \sum_{k=1}^{K} \frac{r_k^2}{N - k}$$
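A self-contained sketch of Q* (assuming numpy and scipy are available; K, p and q are passed as arguments):

```python
import numpy as np
from scipy.stats import chi2

def ljung_box(resid, K, p=0, q=0):
    """Modified Box-Pierce (Ljung-Box) statistic Q* and its p-value,
    referred to a chi-square with K - p - q degrees of freedom."""
    e = np.asarray(resid, dtype=float)
    N, ebar = len(e), e.mean()
    c0 = np.sum((e - ebar) ** 2)
    r = np.array([np.sum((e[:-k] - ebar) * (e[k:] - ebar)) / c0
                  for k in range(1, K + 1)])
    q_star = N * (N + 2) * np.sum(r ** 2 / (N - np.arange(1, K + 1)))
    return q_star, chi2.sf(q_star, df=K - p - q)

rng = np.random.default_rng(8)
print(ljung_box(rng.normal(size=300), K=10))   # large p-value for white noise
```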
Overfitting
• Suppose we think an AR(2) model is appropriate. We then fit an AR(3) model
• The estimate of the additional parameter should not be significantly different from zero
• The other parameters should not change much
• This is an example of overfitting
Further Ideas
Other identification tools
• Chatfield (1979, JRSS A), among others, has suggested the use of the inverse autocorrelation to assist with identification of a suitable model
• Abraham & Ledolter (1984, Biometrika) show that although this cuts off after lag p for the AR(p) model, it is less effective than the partial autocorrelation for detecting the AR order
AIC
• The Akaike Information Criterion is minus twice the maximized log-likelihood plus twice the number of parameters
• The number of parameters in the formula penalizes models with too many parameters
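A hedged sketch of order selection by AIC (assuming statsmodels is installed; the simulated AR(1) and the candidate orders are illustrative): fit each candidate and keep the order with the smallest AIC:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

rng = np.random.default_rng(9)
n = 400
z = rng.normal(size=n)
x = np.empty(n)
x[0] = z[0]
for t in range(1, n):
    x[t] = 0.6 * x[t - 1] + z[t]          # true model is AR(1)

# AIC = -2*log(maximized likelihood) + 2*(number of parameters);
# the AR(1) fit should typically achieve the minimum here.
for p in range(4):
    res = ARIMA(x, order=(p, 0, 0)).fit()
    print(f"AR({p}): AIC = {res.aic:.1f}")
```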
Parsimony
• One principle generally accepted is that models should be parsimonious, having as few parameters as possible
• Note that any ARMA model can be represented as a pure AR or pure MA model, but the number of parameters may be infinite
Parsimony
• AR models are easier to fit, so there is a temptation to fit a less parsimonious AR model when a mixed ARMA model is appropriate
• Ledolter & Abraham (1981, Technometrics) show that fitting unnecessary extra parameters, or an AR model when an MA model is appropriate, results in a loss of forecast accuracy
Exponential smoothing
• Most exponential smoothing techniques are equivalent to fitting an ARIMA model of some sort
• Winters' multiplicative seasonal smoothing has no ARIMA equivalent
• Winters' additive seasonal smoothing has a very non-parsimonious ARIMA equivalent
Exponential smoothing
• For example, simple exponential smoothing is the optimal forecasting method for the ARIMA(0, 1, 1) process
• Optimality is obtained by taking the smoothing parameter to be $1 - \theta$ when the model is
$$(1 - B) X_t = (1 - \theta B) Z_t$$
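A numerical sketch of this equivalence (illustrative $\theta$ and seed; assumes numpy): the one-step optimal forecasts of a simulated ARIMA(0,1,1) coincide with the simple exponential smoothing recursion when $\alpha = 1 - \theta$:

```python
import numpy as np

rng = np.random.default_rng(10)
theta, n = 0.6, 300
z = rng.normal(size=n)
x = np.cumsum(z - theta * np.concatenate(([0.0], z[:-1])))  # (1-B)X = (1-theta*B)Z

# One-step optimal forecasts: Xhat_{t+1} = X_t - theta*Z_t, which unrolls to
# the SES recursion Xhat_{t+1} = alpha*X_t + (1-alpha)*Xhat_t with alpha = 1-theta.
alpha = 1 - theta
opt = x - theta * z                       # optimal forecast of X_{t+1} made at t
ses = np.empty(n)
ses[0] = x[0] - theta * z[0]              # initialize both recursions alike
for t in range(1, n):
    ses[t] = alpha * x[t] + (1 - alpha) * ses[t - 1]
print("max abs difference:", np.max(np.abs(opt - ses)))     # ~ 0
```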