
Sven F. Crone
Centre for Forecasting, Department of Management Science
Lancaster University Management School
email: [email protected]

Forecasting with Artificial Neural Networks
EVIC 2005 Tutorial, Santiago de Chile, 15 December 2005
slides on www.neural-forecasting.com


Page 1

Page 2

EVIC’05 © Sven F. Crone - www.bis-lab.com

Lancaster University Management School?

Page 3

What you can expect from this session …

Simple backpropagation algorithm [Rumelhart et al. 1982]

Ep = C(tpj, opj)

Δp wji ∝ − ∂C(tpj, opj) / ∂wji

∂C(tpj, opj) / ∂wji = [∂C(tpj, opj) / ∂netpj] · [∂netpj / ∂wji]

δpj = − ∂C(tpj, opj) / ∂netpj = − [∂C(tpj, opj) / ∂opj] · [∂opj / ∂netpj]

with opj = fj(netpj) and ∂opj / ∂netpj = f'j(netpj):

δpj = − [∂C(tpj, opj) / ∂opj] · f'j(netpj)

for a hidden unit, with netpk = Σi wki opi:

− ∂C(tpj, opj) / ∂opj = − Σk [∂C(tpj, opj) / ∂netpk] · [∂netpk / ∂opj] = Σk δpk wkj

δpj = f'j(netpj) Σk δpk wkj

δpj = { − [∂C(tpj, opj) / ∂opj] · f'j(netpj)   if unit j is in the output layer
      { f'j(netpj) Σk δpk wkj                  if unit j is in a hidden layer
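The backpropagation deltas above can be turned into a working weight update. Below is a minimal pure-Python sketch, not the tutorial's code: the one-hidden-layer network, logistic activations, squared-error cost, learning rate and the toy AND task are all illustrative assumptions.

```python
import math
import random

def logistic(x):
    return 1.0 / (1.0 + math.exp(-x))

def train_mlp(samples, n_hidden=2, epochs=3000, eta=0.5, seed=0):
    """One hidden layer, logistic units, squared-error cost C = 1/2 (t - o)^2.
    Output delta: (t - o) * f'(net); hidden delta: f'(net) * sum_k delta_k w_kj."""
    rnd = random.Random(seed)
    n_in = len(samples[0][0])
    w_hid = [[rnd.uniform(-1, 1) for _ in range(n_in + 1)] for _ in range(n_hidden)]
    w_out = [rnd.uniform(-1, 1) for _ in range(n_hidden + 1)]
    for _ in range(epochs):
        for x, t in samples:
            xb = list(x) + [1.0]                    # append bias input
            h = [logistic(sum(w * v for w, v in zip(ws, xb))) for ws in w_hid]
            hb = h + [1.0]
            o = logistic(sum(w * v for w, v in zip(w_out, hb)))
            d_out = (t - o) * o * (1.0 - o)         # f'(net) = o (1 - o) for logistic
            d_hid = [h[j] * (1.0 - h[j]) * d_out * w_out[j] for j in range(n_hidden)]
            for j in range(n_hidden + 1):           # delta rule: dw proportional to delta * input
                w_out[j] += eta * d_out * hb[j]
            for j in range(n_hidden):
                for i in range(n_in + 1):
                    w_hid[j][i] += eta * d_hid[j] * xb[i]
    def predict(x):
        xb = list(x) + [1.0]
        hb = [logistic(sum(w * v for w, v in zip(ws, xb))) for ws in w_hid] + [1.0]
        return logistic(sum(w * v for w, v in zip(w_out, hb)))
    return predict

# learn the (easy, linearly separable) AND function as a smoke test
AND = [([0.0, 0.0], 0.0), ([0.0, 1.0], 0.0), ([1.0, 0.0], 0.0), ([1.0, 1.0], 1.0)]
predict = train_mlp(AND)
```

After training, `predict([1, 1])` should be close to 1 and the other inputs close to 0; for real forecasting the inputs would be lagged observations of the time series.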

"How to …" on Neural Network Forecasting with limited maths!

CD Start-Up Kit for Neural Net Forecasting

20+ software simulators, datasets, literature & FAQ

Slides, data & additional info on www.neural-forecasting.com

Page 4

1. Forecasting?

2. Neural Networks?

3. Forecasting with Neural Networks …

4. How to write a good Neural Network forecasting paper!

Agenda

Forecasting with Artificial Neural Networks

Page 5

1. Forecasting?

1. Forecasting as predictive Regression

2. Time series prediction vs. causal prediction

3. Why NN for Forecasting?

2. Neural Networks?

3. Forecasting with Neural Networks …

4. How to write a good Neural Network forecasting paper!

Agenda

Forecasting with Artificial Neural Networks

Page 6

ALGORITHMS

TASKS

Forecasting or Prediction?

Data Mining: "Application of data analysis algorithms & discovery algorithms that extract patterns out of the data"

Data Mining

Descriptive Data Mining

Predictive Data Mining

Categorisation: Clustering

Summarisation & Visualisation

Association Analysis

Sequence Discovery

Categorisation: Classification / Regression / Time Series Analysis

K-means Clustering, Neural networks

Feature Selection

Principal Component Analysis

K-means Clustering

Class Entropy

Decision trees

Logistic regress.

Neural networks

Discriminant Analysis

Temporal association rules

Association rules

Link Analysis

Exponential smoothing

(S)ARIMA(x)

Neural networks

Linear Regression

Nonlinear Regression

Neural networks (MLP, RBFN, GRNN)

timing of dependent variable

Page 7

Forecasting or Classification

Choice of method by scale of the dependent and independent variables:
no dependent variable: Clustering, Principal Component Analysis
nominal dependent: Contingency Analysis / Classification
metric dependent: Analysis of Variance / Regression & Time Series Analysis
Metric data can be DOWNSCALED to ordinal scale, and ordinal to nominal scale.

Supervised learning: cases described by inputs and a target
Unsupervised learning: cases described by inputs only
[Schematic: two data matrices of cases × inputs, one with an additional target column]

Simplification from Regression ("How much?") to Classification ("Will the event occur?")
FORECASTING = PREDICTIVE modelling (dependent variable is in the future)
FORECASTING = REGRESSION modelling (dependent variable is of metric scale)

Page 8

Forecasting or Classification?

What the experts say …

Page 9

International Conference on Data Mining

You are welcome to contribute … www.dmin-2006.com !!!

Page 10

1. Forecasting?

1. Forecasting as predictive Regression

2. Time series prediction vs. causal prediction

3. SARIMA-Modelling

4. Why NN for Forecasting?

2. Neural Networks?

3. Forecasting with Neural Networks …

4. How to write a good Neural Network forecasting paper!

Agenda

Forecasting with Artificial Neural Networks

Page 11

Forecasting Models

Time series analysis vs. causal modelling

Time series prediction (univariate): assumes that the data generating process that creates the patterns can be explained only from previous observations of the dependent variable

Causal prediction (multivariate): the data generating process can be explained by the interaction of causal (cause-and-effect) independent variables

Page 12

Classification of Forecasting Methods

Objective Forecasting Methods

Subjective Forecasting Methods

Time Series Methods

CausalMethods

Averages

Exponential Smoothing

Simple Regression

Autoregression ARIMA

Linear Regression

Multiple Regression

"Prophecy", educated guessing …

Moving Averages

Naive Methods

Simple ES

Linear ES

Seasonal ES

Dampened Trend ES

Sales Force Composite

Analogies

Delphi

PERT

Survey techniques

Neural Networks

Neural Networks

Dynamic Regression

Vector Autoregression

Intervention model

Demand Planning Practice: objective methods + subjective correction

Neural Networks ARE time series methods AND causal methods

& CAN be used as Averages & ES, Regression …

Page 13

Definition
A time series is a series of timely ordered, comparable observations yt recorded in equidistant time intervals

Notation
Yt represents the t-th period observation, t = 1, 2, …, n

Time Series Definition

Page 14

Concept of Time Series

An observed measurement is made up of a systematic part and a random part

Approach
Unfortunately we cannot observe either of these!
Forecasting methods try to isolate the systematic part
Forecasts are based on the systematic part
The random part determines the distribution shape

Assumption
Data observed over time is comparable:
The time periods are of identical lengths (check!)
The units they are measured in remain unchanged (check!)
The definitions of what is being measured remain unchanged (check!)
They are correctly measured (check!)

Data errors arise from sampling, from bias in the instruments or the responses, and from transcription.

Page 15

Objective Forecasting Methods – Time Series

Methods of Time Series Analysis / Forecasting
Class of objective methods based on analysis of past observations of the dependent variable alone

Assumption
There exists a cause-effect relationship that keeps repeating itself with the yearly calendar
The cause-effect relationship may be treated as a BLACK BOX
TIME-STABILITY HYPOTHESIS ASSUMES NO CHANGE:
the causal relationship remains intact indefinitely into the future!
the time series can be explained & predicted solely from previous observations of the series

Time series methods consider only past patterns of the same variable
Future events (no occurrence in past) are explicitly NOT considered!
External EVENTS relevant to the forecast must be corrected MANUALLY

Page 16

Diagram 1.1: Trend - long-term growth or decline occurring within a series
Diagram 1.2: Seasonal - more or less regular movements within a year
Diagram 1.3: Cycle - alternating upswings of varied length and intensity
Diagram 1.4: Irregular - random movements and those which reflect unusual events

Simple Time Series Patterns
Trend: long-term movement in series
Cycle: regular fluctuation superimposed on trend (period may be random)
Seasonality: regular fluctuation within a year (or shorter period) superimposed on trend and cycle

Page 17

Regular Components of Time Series

A Time Series consists of superimposed components / patterns:

Signal

level ‘L‘

trend ‘T‘

seasonality ‘S‘

Noise

irregular / error 'e'

Sales = LEVEL + SEASONALITY + TREND + RANDOM ERROR

Page 18

Irregular Components of Time Series

Structural changes in systematic data

PULSE: one-time occurrence on top of systematic stationary / trended / seasonal development

LEVEL SHIFT: one-time / multiple-time shifts on top of systematic stationary / trended / seasonal etc. development

STRUCTURAL BREAKS: trend changes (slope, direction); seasonal pattern changes & shifts

[Two charts (original time series, Aug 2000 - Jul 2002; original vs. corrected data values):
STATIONARY time series with PULSE
STATIONARY time series with level shift]

Page 19

• Time Series decomposed into Components

Level

Trend

Random

Time Series

Components of Time Series

Page 20

Time Series Patterns

Time Series Pattern

STATIONARY Time Series

REGULAR Time Series Patterns

IRREGULAR Time Series Patterns

FLUCTUATING Time Series

INTERMITTENT Time Series

SEASONAL Time Series

TRENDED Time Series

[Three example charts (original vs. corrected data values): a monthly series Jan 77 - Nov 80, an annual series 1960 - 1980, and a monthly series Aug 2000 - Jul 2002]

Yt = f(Et): time series is influenced by level & random fluctuations

Yt = f(St, Et): time series is influenced by level, season and random fluctuations

Yt = f(Tt, Et): time series is influenced by trend from level and random fluctuations

Combination of individual components: Yt = f( St, Tt, Et )

[Two further example charts (monthly series Aug 2000 - Jul 2002; original vs. corrected data values)]

INTERMITTENT: number of periods with zero sales is high (ca. 30%-40%)
FLUCTUATING: time series fluctuates very strongly around level (mean deviation > ca. 50% around mean)
+ PULSES! + LEVEL SHIFTS! + STRUCTURAL BREAKS!

Page 21

Components of complex Time Series

[Four example charts (original vs. corrected data values): monthly series Aug 2000 - Jul 2002, monthly series Jan 77 - Nov 80, annual series 1960 - 1980, and monthly series Jan 60 - Nov 63]

Yt = f( St, Tt, Et )

Sales or observation of the time series at point t consists of a combination of:
base level + seasonal component + trend component + irregular or random error component

Different possibilities to combine components:
Additive model: Yt = L + St + Tt + Et
Multiplicative model: Yt = L * St * Tt * Et
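The two combination schemes can be made concrete with a tiny sketch; the component values below are hypothetical, chosen only to illustrate the arithmetic (in the multiplicative model, season, trend and error act as indices around 1).

```python
def additive_model(L, S, T, E):
    """Additive combination: Yt = L + St + Tt + Et."""
    return L + S + T + E

def multiplicative_model(L, S, T, E):
    """Multiplicative combination: Yt = L * St * Tt * Et
    (seasonal, trend and error expressed as indices around 1)."""
    return L * S * T * E

# hypothetical values: base level 100, seasonal lift, trend increment, small error
y_add = additive_model(100.0, 15.0, 5.0, -2.0)            # -> 118.0
y_mult = multiplicative_model(100.0, 1.15, 1.05, 0.98)    # -> approx. 118.3
```

Note the practical difference: in the additive model the seasonal swing stays the same size as the level grows, while in the multiplicative model it scales with the level.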

Page 22

Classification of Time Series Patterns

Multiplicative Trend Effect

Additive Trend Effect

No Trend Effect

Multiplicative Seasonal Effect

Additive Seasonal Effect

No Seasonal Effect

[Pegels 1969 / Gardner]

Page 23

1. Forecasting?

1. Forecasting as predictive Regression

2. Time series prediction vs. causal prediction

3. SARIMA-Modelling

1. SARIMA – Differencing

2. SARIMA – Autoregressive Terms

3. SARIMA – Moving Average Terms

4. SARIMA – Seasonal Terms

4. Why NN for Forecasting?

2. Neural Networks?

3. Forecasting with Neural Networks …

4. How to write a good Neural Network forecasting paper!

Agenda
Forecasting with Artificial Neural Networks

Page 24

Introduction to ARIMA Modelling

Seasonal Autoregressive Integrated Moving Average Processes: SARIMA
popularised by George Box & Gwilym Jenkins in the 1970s (names often used synonymously)
models are widely studied
put together the theoretical underpinning required to understand & use ARIMA
defined a general notation for dealing with ARIMA models
claim that most time series can be parsimoniously represented by the ARIMA class of models

ARIMA(p,d,q) models attempt to describe the systematic pattern of a time series by 3 parameters:
p: number of autoregressive terms (AR terms) in a time series
d: number of differences to achieve stationarity of a time series
q: number of moving average terms (MA terms) in a time series

Φp(B) (1 − B)^d Zt = δ + Θq(B) et

Page 25

The Box-Jenkins Methodology for ARIMA models

Model Identification

Model Estimation & Testing

Model Application

Data Preparation• Transform time series for stationarity• Difference time series for stationarity

Model selection • Examine ACF & PACF • Identify potential models (p,q)(P,Q)

Model Estimation• Estimate parameters in potential models• Select best model using suitable criterion

Model Diagnostics / Testing• Check ACF / PACF of residuals white noise• Run portmanteau test of residuals

Model Application• Use selected model to forecast

Re-identify

Page 26

ARIMA-Modelling

ARIMA(p,d,q) models
ARIMA - autoregressive terms AR(p), with p = order of the autoregressive part
ARIMA - order of integration, d = degree of first differencing/integration involved
ARIMA - moving average terms MA(q), with q = order of the moving average of the error

SARIMA (p,d,q)(P,D,Q)s, with (P,D,Q) the process for the seasonal lags of period s

Objective
Identify the appropriate ARIMA model for the time series: identify the AR term, the I term and the MA term

Identification through: Autocorrelation Function, Partial Autocorrelation Function

Page 27

ARIMA-Models: Identification of d-term

Parameter d determines the order of integration

ARIMA models assume stationarity of the time series:
stationarity in the mean
stationarity of the variance (homoscedasticity)

Recap: let the mean of the time series at t be μt = E(Yt), and
λt,t = var(Yt), λt,t−τ = cov(Yt, Yt−τ)

Definition
A time series is stationary if its mean level μt is constant for all t and its variance and covariances λt−τ are constant for all t.
In other words: all properties of the distribution (mean, variance, skewness, kurtosis etc.) of a random sample of the time series are independent of the absolute time t of drawing the sample → identity of mean & variance across time

Page 28

ARIMA-Models: Stationarity and parameter d

Sample A, sample B

Stationarity: μ(A) = μ(B), var(A) = var(B), etc.

This time series: μ(B) > μ(A) → trend → non-stationary time series

Is the time series stationary?
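The sample-A/sample-B comparison can be sketched as an informal check (not one of the formal tests such as Dickey-Fuller): split the series in half and compare the moments of the two halves; the function name and the deterministic toy series are illustrative choices.

```python
def split_moments(y):
    """Informal stationarity check: compare mean and variance of the
    first half (sample A) and second half (sample B) of the series."""
    n = len(y) // 2
    a, b = y[:n], y[len(y) - n:]
    def mean(s):
        return sum(s) / len(s)
    def var(s):
        m = mean(s)
        return sum((v - m) ** 2 for v in s) / len(s)
    return (mean(a), var(a)), (mean(b), var(b))

trended = [0.5 * t for t in range(40)]   # deterministic upward trend
(mean_a, var_a), (mean_b, var_b) = split_moments(trended)
# mean_b > mean_a -> the series is not stationary in the mean
```

A roughly equal mean and variance across the halves is consistent with stationarity, but only a formal test gives a significance statement.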

Page 29

ARIMA Models: Differencing for Stationarity

E.g.: time series Yt = {2, 4, 6, 8, 10}
The time series exhibits a linear trend.
1st-order differencing between observation Yt and its predecessor Yt−1 derives a transformed time series:

4 − 2 = 2, 6 − 4 = 2, 8 − 6 = 2, 10 − 8 = 2

The new time series ΔYt = {2, 2, 2, 2} is stationary through 1st differencing
→ d = 1 → ARIMA(0,1,0) model
2nd-order differences: d = 2

[Charts: the trended series xt and the differenced series xt − xt−1]
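The differencing step above is a one-liner to implement; a minimal sketch (the function name is illustrative):

```python
def difference(y, d=1):
    """d-th order (non-seasonal) differencing: delta Yt = Yt - Yt-1,
    applied d times; each pass shortens the series by one."""
    for _ in range(d):
        y = [b - a for a, b in zip(y, y[1:])]
    return y

difference([2, 4, 6, 8, 10])        # -> [2, 2, 2, 2]  (the slide's ARIMA(0,1,0) example)
difference([1, 4, 9, 16, 25], d=2)  # -> [2, 2, 2]     (a quadratic trend needs d = 2)
```

A linear trend vanishes after one pass, a quadratic trend after two, which is exactly how the parameter d is chosen.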

Page 30

ARIMA Models: Differencing for Stationarity

Integration ↔ Differencing: Zt = Yt − Yt−1

Transforms: logarithms etc. … where Zt is a transform of the variable of interest Yt, chosen to make Zt − Zt−1 − (Zt−1 − Zt−2) − … stationary

Tests for stationarity: Dickey-Fuller test, serial correlation test, runs test

Page 31

1. Forecasting?

1. Forecasting as predictive Regression

2. Time series prediction vs. causal prediction

3. SARIMA-Modelling

1. SARIMA – Differencing

2. SARIMA – Autoregressive Terms

3. SARIMA – Moving Average Terms

4. SARIMA – Seasonal Terms

4. Why NN for Forecasting?

2. Neural Networks?

3. Forecasting with Neural Networks …

4. How to write a good Neural Network forecasting paper!

Agenda
Forecasting with Artificial Neural Networks

Page 32

ARIMA-Models – Autoregressive Terms

Description of the autocorrelation structure → autoregressive (AR) term
If a dependency exists between lagged observations Yt and Yt−1, we can describe Yt from the realisation of Yt−1

Equations include only lagged realisations of the forecast variable
ARIMA(p,0,0) model = AR(p) model

Problems
Independence of residuals often violated (heteroscedasticity)
Determining the number of past values is problematic

Tests for autoregression: portmanteau tests — Box-Pierce test, Ljung-Box test

Yt = c + φ1 Yt−1 + φ2 Yt−2 + … + φp Yt−p + et
(Yt = observation in time t; φ = weight of the AR relationship; Yt−1 = observation in t−1 (independent); et = random component ("white noise"))

Page 33

ARIMA Models: Parameter p of Autocorrelation

A stationary time series can be analysed for its autocorrelation structure.
The autocorrelation coefficient for lag k,

rk = Σt=k+1..n (Yt − Ȳ)(Yt−k − Ȳ) / Σt=1..n (Yt − Ȳ)²

denotes the correlation between lagged observations of distance k.

Graphical interpretation: uncorrelated data has low autocorrelations and shows no correlation pattern …

[Scatter plot: x_t against x_t−1, illustrating the autocorrelation between successive observations]

Page 34

ARIMA Models: Parameter p

E.g. time series Yt: 7, 8, 7, 6, 4, 5, 5, 6, 4

lag 1 (pairs Yt, Yt−1): r1 = .62
lag 2 (pairs Yt, Yt−2): r2 = .32
lag 3 (pairs Yt, Yt−3): r3 = .15

[Bar chart: ACF over lags 1, 2, 3 …]

Autocorrelations rk gathered at lags 1, 2, … make up the autocorrelation function (ACF)
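The ACF formula from the previous slide is straightforward to compute; a minimal sketch (the function name and the alternating toy series are illustrative, and this does not attempt to reproduce the slide's r-values):

```python
def acf(y, max_lag):
    """Sample autocorrelation coefficients r_1 ... r_max_lag:
    r_k = sum_{t=k+1..n} (Yt - Ybar)(Yt-k - Ybar) / sum_{t=1..n} (Yt - Ybar)^2."""
    n = len(y)
    ybar = sum(y) / n
    denom = sum((v - ybar) ** 2 for v in y)
    return [
        sum((y[t] - ybar) * (y[t - k] - ybar) for t in range(k, n)) / denom
        for k in range(1, max_lag + 1)
    ]

# an alternating series has strong negative lag-1 and positive lag-2 autocorrelation
r = acf([1.0, -1.0, 1.0, -1.0, 1.0, -1.0], 2)   # -> approx. [-0.833, 0.667]
```

Plotting these coefficients against k gives exactly the ACF bar charts used for model identification on the following slides.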

Page 35

ARIMA-Models – Autoregressive Terms

Identification of AR-terms …?

[Two ACF charts with confidence limits, lags 1-16:
NORM — random independent observations (no significant lags)
AUTO — an AR(1) process? (significant, decaying lags)]

Page 36

ARIMA Models: Partial Autocorrelations

Partial autocorrelations are used to measure the degree of association between Yt and Yt−k when the effects of the other time lags 1, 2, 3, …, k−1 are removed.

A significant AC between Yt and Yt−1 and a significant AC between Yt−1 and Yt−2 induce correlation between Yt and Yt−2! (1st AC = PAC!)

When fitting an AR(p) model to the time series, the last coefficient of Yt−p measures the excess correlation at lag p which is not accounted for by an AR(p−1) model. πp is called the p-th order partial autocorrelation, i.e. the partial autocorrelation coefficient measures the true correlation at Yt−p:

Yt = φ0 + φ1 Yt−1 + φ2 Yt−2 + … + πp Yt−p + νt

πp = corr(Yt, Yt−p | Yt−1, Yt−2, …, Yt−p+1)
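One standard way to obtain these partial autocorrelations from the ACF, without fitting a separate AR(p) regression per lag, is the Durbin-Levinson recursion; the sketch below is not from the tutorial, and the function name and the AR(1) example values are illustrative.

```python
def pacf_from_acf(rho):
    """Durbin-Levinson recursion: from autocorrelations [rho_1, rho_2, ...]
    (rho_0 = 1 implied) to partial autocorrelations [pi_1, pi_2, ...]."""
    phi_prev = []
    pacf = []
    for k in range(1, len(rho) + 1):
        if k == 1:
            phi_k = rho[0]
            phi = [phi_k]
        else:
            num = rho[k - 1] - sum(phi_prev[j] * rho[k - 2 - j] for j in range(k - 1))
            den = 1.0 - sum(phi_prev[j] * rho[j] for j in range(k - 1))
            phi_k = num / den
            # update the AR(k) coefficients from the AR(k-1) ones
            phi = [phi_prev[j] - phi_k * phi_prev[k - 2 - j] for j in range(k - 1)]
            phi.append(phi_k)
        pacf.append(phi_k)
        phi_prev = phi
    return pacf

# for an AR(1) process with phi = 0.8, rho_k = 0.8**k and the PACF cuts off after lag 1
p = pacf_from_acf([0.8, 0.64, 0.512])
```

The cut-off after lag 1 in `p` is the PACF signature used on the next slides to identify AR(1) models.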

Page 37

ARIMA Modelling – AR-Model patterns

AR(1) model = ARIMA(1,0,0): Yt = c + φ1 Yt−1 + et

[Charts: autocorrelation function — pattern in ACF; partial autocorrelation function — 1st lag significant; confidence intervals shown]

Page 38

ARIMA Modelling – AR-Model patterns

AR(1) model = ARIMA(1,0,0): Yt = c + φ1 Yt−1 + et

[Charts: autocorrelation function — pattern in ACF; partial autocorrelation function — 1st lag significant; confidence intervals shown]

Page 39

ARIMA Modelling – AR-Model patterns

AR(2) model = ARIMA(2,0,0): Yt = c + φ1 Yt−1 + φ2 Yt−2 + et

[Charts: autocorrelation function — pattern in ACF; partial autocorrelation function — 1st & 2nd lag significant; confidence intervals shown]

Page 40

ARIMA Modelling – AR-Model patterns

AR(2) model = ARIMA(2,0,0): Yt = c + φ1 Yt−1 + φ2 Yt−2 + et

[Charts: autocorrelation function — dampened sine pattern in ACF; partial autocorrelation function — 1st & 2nd lag significant; confidence intervals shown]

Page 41

ARIMA Modelling – AR-Models

Autoregressive model of order one: ARIMA(1,0,0) = AR(1)

Y_t = c + φ_1·Y_{t-1} + e_t ,   e.g.   Y_t = 1.1 + 0.8·Y_{t-1} + e_t

[Figure: simulated AR(1) series with φ_1 = 0.8, 50 periods]

Higher order AR models:

Y_t = c + φ_1·Y_{t-1} + φ_2·Y_{t-2} + … + φ_p·Y_{t-p} + e_t

with stationarity conditions
for p = 1:  -1 < φ_1 < 1
for p = 2:  -1 < φ_2 < 1  ∧  φ_1 + φ_2 < 1  ∧  φ_2 - φ_1 < 1
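The AR(1) example above can be simulated in a few lines to check its properties numerically (a hedged sketch; the exact sample values depend on the random seed, but the sample mean should sit near c/(1-φ_1) = 5.5 and the lag-1 autocorrelation near φ_1 = 0.8):

```python
import random

random.seed(1)

def simulate_ar1(c, phi, n, burn=100):
    """Simulate Y_t = c + phi*Y_{t-1} + e_t with e_t ~ N(0,1); drop a burn-in."""
    y, prev = [], 0.0
    for t in range(n + burn):
        prev = c + phi * prev + random.gauss(0, 1)
        if t >= burn:
            y.append(prev)
    return y

y = simulate_ar1(1.1, 0.8, 500)
mean_y = sum(y) / len(y)   # theoretical mean: c / (1 - phi) = 5.5
m = mean_y
r1 = (sum((y[t] - m) * (y[t - 1] - m) for t in range(1, len(y)))
      / sum((v - m) ** 2 for v in y))   # sample lag-1 autocorrelation, near 0.8
```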


1. Forecasting?

1. Forecasting as predictive Regression

2. Time series prediction vs. causal prediction

3. SARIMA-Modelling

1. SARIMA – Differencing

2. SARIMA – Autoregressive Terms

3. SARIMA – Moving Average Terms

4. SARIMA – Seasonal Terms

4. Why NN for Forecasting?

2. Neural Networks?

3. Forecasting with Neural Networks …

4. How to write a good Neural Network forecasting paper!

Agenda: Forecasting with Artificial Neural Networks


ARIMA Modelling – Moving Average Processes

Description of the Moving Average structure:
AR models may not approximate the data generator underlying the observations perfectly → residuals e_t, e_{t-1}, e_{t-2}, …, e_{t-q}
Observation Y_t may depend on realisations of previous errors e → regress against past errors as explanatory variables

ARIMA(0,0,q) model = MA(q) model

Y_t = c + e_t - θ_1·e_{t-1} - θ_2·e_{t-2} - … - θ_q·e_{t-q}

with invertibility conditions
for q = 1:  -1 < θ_1 < 1
for q = 2:  -1 < θ_2 < 1  ∧  θ_1 + θ_2 < 1  ∧  θ_2 - θ_1 < 1
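A quick numerical check of the MA(1) case, using the minus-sign convention of the MA(q) formula above (a sketch with illustration values θ_1 = 0.5, c = 10; the sample lag-1 autocorrelation should be near -θ_1/(1+θ_1²) = -0.4, and the lag-2 autocorrelation near zero because the ACF of an MA(q) process cuts off after lag q):

```python
import random

random.seed(2)

def simulate_ma1(c, theta, n):
    """Simulate Y_t = c + e_t - theta*e_{t-1} with e_t ~ N(0,1)."""
    y, e_prev = [], random.gauss(0, 1)
    for _ in range(n):
        e = random.gauss(0, 1)
        y.append(c + e - theta * e_prev)
        e_prev = e
    return y

y = simulate_ma1(10.0, 0.5, 2000)
m = sum(y) / len(y)   # near c = 10: an MA process is always stationary
r1 = (sum((y[t] - m) * (y[t - 1] - m) for t in range(1, len(y)))
      / sum((v - m) ** 2 for v in y))   # near -theta/(1+theta^2) = -0.4
r2 = (sum((y[t] - m) * (y[t - 2] - m) for t in range(2, len(y)))
      / sum((v - m) ** 2 for v in y))   # near 0: ACF cuts off after lag q = 1
```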


[Figure: sample ACF (left) and PACF (right) with confidence intervals, lags 1-16]

ARIMA Modelling – MA-Model patterns

MA(1) model = ARIMA(0,0,1):  Y_t = c + e_t - θ_1·e_{t-1}

Pattern: 1st lag significant (ACF cuts off); decaying pattern in the PACF



[Figure: sample ACF (left) and PACF (right) with confidence intervals, lags 1-16]

ARIMA Modelling – MA-Model patterns

MA(3) model = ARIMA(0,0,3):  Y_t = c + e_t - θ_1·e_{t-1} - θ_2·e_{t-2} - θ_3·e_{t-3}

Pattern: 1st, 2nd & 3rd lags significant (ACF cuts off); decaying pattern in the PACF


ARIMA Modelling – MA-Models

Moving average model of order one: ARIMA(0,0,1) = MA(1)

Y_t = c + e_t - θ_1·e_{t-1} ,   e.g.   Y_t = 10 + e_t + 0.2·e_{t-1}

[Figure: simulated MA(1) series with θ = 0.2, 50 periods]


ARIMA Modelling – Mixture ARMA-Models

Complicated series may be modelled by combining AR & MA terms:
ARMA(1,1) model = ARIMA(1,0,1) model

Y_t = c + φ_1·Y_{t-1} + e_t - θ_1·e_{t-1}

Higher order ARMA(p,q) models:

Y_t = c + φ_1·Y_{t-1} + … + φ_p·Y_{t-p} + e_t - θ_1·e_{t-1} - … - θ_q·e_{t-q}
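As a concrete illustration (not from the slides): with known parameters, a one-step-ahead ARMA(1,1) forecast recovers the residuals recursively and then extrapolates. The function name and the e_0 = 0 initialisation are assumptions for this sketch:

```python
def arma11_one_step(y, c, phi, theta):
    """One-step forecast for Y_t = c + phi*Y_{t-1} + e_t - theta*e_{t-1}.

    Residuals are recovered recursively (with e_0 assumed 0), then the
    forecast is Y_hat_{T+1} = c + phi*Y_T - theta*e_T (future e is 0 in
    expectation).
    """
    e = 0.0
    for t in range(1, len(y)):
        e = y[t] - (c + phi * y[t - 1] - theta * e)
    return c + phi * y[-1] - theta * e

# with theta = 0 this reduces to the AR(1) forecast c + phi*Y_T
f = arma11_one_step([1.0, 2.0, 3.0], 0.5, 0.8, 0.0)
```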


[Figure: sample ACF (left) and PACF (right) with confidence intervals, lags 1-16]

AR(1) and MA(1) model = ARMA(1,1) = ARIMA(1,0,1)

Pattern: 1st lag significant in both the ACF and the PACF


1. Forecasting?

1. Forecasting as predictive Regression

2. Time series prediction vs. causal prediction

3. SARIMA-Modelling

1. SARIMA – Differencing

2. SARIMA – Autoregressive Terms

3. SARIMA – Moving Average Terms

4. SARIMA – Seasonal Terms

5. SARIMAX – Seasonal ARIMA with Interventions

4. Why NN for Forecasting?

2. Neural Networks?

3. Forecasting with Neural Networks …

4. How to write a good Neural Network forecasting paper!

Agenda: Forecasting with Artificial Neural Networks


Seasonality in ARIMA-Models

Identifying seasonal data: spikes in the ACF / PACF at seasonal lags, e.g.
  t-12 & t-13 for monthly data (yearly seasonality)
  t-4 & t-5 for quarterly data

Differences:
  Simple:   ΔY_t = (1-B)Y_t
  Seasonal: Δ_s Y_t = (1-B^s)Y_t ,  with s = seasonality, e.g. 4, 12

Data may require seasonal differencing to remove the seasonality.
To identify the model, specify the seasonal parameters (P,D,Q):
  the seasonal autoregressive order P, the seasonal difference D, and the seasonal moving average order Q

Seasonal ARIMA (p,d,q)(P,D,Q)-model
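Seasonal differencing is a one-liner; on a purely seasonal series with period s = 4 it removes the pattern completely (a minimal sketch, function name assumed):

```python
def diff(y, lag=1):
    """Apply (1 - B^lag) to a series: y_t - y_{t-lag}."""
    return [y[t] - y[t - lag] for t in range(lag, len(y))]

quarterly = [8, 3, 5, 6, 8, 3, 5, 6, 8, 3, 5, 6]   # pure seasonal pattern, s = 4
deseasonalised = diff(quarterly, lag=4)            # seasonality removed
```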

[Figure: ACF of seasonal data with upper/lower confidence limits, lags 1-34]


Seasonality in ARIMA-Models

[Figure: ACF (top) and PACF (bottom) with upper/lower confidence limits, lags 1-34; seasonal spikes → monthly data]


Seasonality in ARIMA-Models

Extension of Notation of Backshift Operator

Δ_s Y_t = Y_t - Y_{t-s} = Y_t - B^s·Y_t = (1-B^s)·Y_t

Seasonal difference followed by a first difference: (1-B)(1-B^s)·Y_t

Seasonal ARIMA(1,1,1)(1,1,1)_4 model:

(1 - φ_1·B)(1 - Φ_1·B^4)(1 - B)(1 - B^4)·Y_t = c + (1 - θ_1·B)(1 - Θ_1·B^4)·e_t

with the factors, left to right: non-seasonal AR(1), seasonal AR(1), non-seasonal difference, seasonal difference; and on the right-hand side: non-seasonal MA(1), seasonal MA(1)
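The composition (1-B)(1-B^s) can be checked directly in code: applied to a series with a linear trend plus a quarterly pattern it yields exactly zero (a sketch with made-up illustration values):

```python
def diff(y, lag=1):
    """Apply (1 - B^lag): y_t - y_{t-lag}."""
    return [y[t] - y[t - lag] for t in range(lag, len(y))]

season = [5, 1, 3, 2]                              # arbitrary quarterly pattern
y = [2 * t + season[t % 4] for t in range(16)]     # linear trend + seasonality
z = diff(diff(y, lag=4), lag=1)                    # (1-B)(1-B^4) y_t
# after the seasonal difference the series is a constant (the trend slope
# times 4); the first difference then removes that constant entirely
```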




Forecasting Models

Time series analysis vs. causal modelling

Time series prediction (univariate): assumes that the data generating process that creates the patterns can be explained only from previous observations of the dependent variable

Causal prediction (multivariate): the data generating process can be explained by the interaction of causal (cause-and-effect) independent variables


Causal Prediction

ARX(p)-Models

General Dynamic Regression Models




Why forecast with NN?

Pattern or noise?

Airline Passenger data: seasonal, trended
The real "model" is disputed: multiplicative seasonality, or additive seasonality with level shifts?

Fresh products supermarket sales: seasonal, events, heteroscedastic noise; the real "model" is unknown


Why forecast with NN?

Pattern or noise?

Random noise: iid (normally distributed, mean 0, std. dev. 1)

BL(p,q) bilinear autoregressive model:

y_t = 0.7·y_{t-1}·ε_{t-2} + ε_t


Why forecast with NN?

Pattern or noise?

TAR(p) Threshold Autoregressive model

y_t = 0.9·y_{t-1} + ε_t    for y_{t-1} ≤ 1
y_t = -0.3·y_{t-1} - ε_t   for y_{t-1} > 1

Random walk:

y_t = y_{t-1} + ε_t
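The TAR equation above is partly garbled in this copy of the slides; the sketch below assumes the reconstruction given above (threshold at y_{t-1} = 1, regime coefficients 0.9 and -0.3) and simulates a short realisation:

```python
import random

random.seed(3)

def simulate_tar(n, burn=100):
    """TAR(1): y_t = 0.9*y_{t-1} + e_t if y_{t-1} <= 1, else -0.3*y_{t-1} - e_t."""
    y, prev = [], 0.0
    for t in range(n + burn):
        e = random.gauss(0, 1)
        prev = 0.9 * prev + e if prev <= 1 else -0.3 * prev - e
        if t >= burn:
            y.append(prev)
    return y

y = simulate_tar(500)
# both regimes are stable (|0.9| < 1, |-0.3| < 1), so the series stays
# bounded while switching between persistent and mean-reverting behaviour
```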


Motivation for NN in Forecasting – Nonlinearity!

The true data generating process is unknown & hard to identify; many interdependencies in business are nonlinear

NN can approximate any LINEAR and NONLINEAR function to any desired degree of accuracy
  can learn linear time series patterns
  can learn nonlinear time series patterns
  can extrapolate linear & nonlinear patterns = generalisation!

NN are nonparametric
  don't assume a particular noise process, e.g. Gaussian
  NN model (learn) linear and nonlinear processes directly from the data
  approximate the underlying data generating process

→ NN are a flexible forecasting paradigm


Motivation for NN in Forecasting - Modelling Flexibility

Unknown data processes require the building of many candidate models!

Flexibility of input variables → flexible coding:
  binary scale [0;1]; [-1;1]
  nominal / ordinal scale (0,1,2,…,10), binary coded [0001, 0010, …]
  metric scale (0.235; 7.35; 12440.0; …)

Flexibility of output variables:
  binary prediction of single class membership
  nominal / ordinal prediction of multiple class memberships
  metric regression (point predictions) OR probability of class membership!

Number of input variables …
Number of output variables …

→ One SINGLE network architecture, MANY applications


Applications of Neural Nets in diverse research fields: 2500+ journal publications on NN & forecasting alone!

Neurophysiology: simulate & explain the brain
Informatics: eMail & URL filtering; virus scanning (Symantec Norton Antivirus); speech recognition & optical character recognition
Engineering: control applications in plants; automatic target recognition (DARPA); explosive detection at airports; mineral identification (NASA Mars Explorer); starting & landing of jumbo jets (NASA)
Meteorology / weather: rainfall prediction; El Niño effects
Corporate business: credit card fraud detection; simulating forecasting methods
Business forecasting domains: electrical load / demand; financial forecasting (currency / exchange rates, stock forecasting etc.); sales forecasting

→ not all NN recommendations are useful for your DOMAIN!

[Figure: citation analysis by year, 1987-2003, for title = (neural AND net* AND (forecast* OR predict* OR time-ser* OR time w/2 ser* OR timeser*)), also evaluating sales-forecasting-related point predictions; publications grow roughly linearly over time, R² = 0.9036 for the linear trend]

Number of publications by business forecasting domain: General Business 1021, Marketing 52, Finance 5, Production 32, Product Sales 51, Electrical Load 83


IBF Benchmark– Forecasting Methods used

NN are applied in corporate Demand Planning / S&OP processes! [Warning: limited sample size]

Applied Forecasting Methods (all industries)

25%

8%

24%

35%

23%

9%

69%

23%

22%

6%

49%

7%

0% 10% 20% 30% 40% 50% 60% 70% 80%

Averages

Autoregressive Methods

Decomposition

Exponential Smoothing

Trendextrapolation

Econometric Models

Neural Networks

Regression

Analogies

Delphi

PERT

Surveys

[For

ecas

tinng

Met

hod]

[number replies]

Time series methods (objective): 61%
Causal methods (objective): 23%
Judgemental methods (subjective): 2x%

Survey at 5 IBF conferences in 2001: 240 forecasters, 13 industries


1. Forecasting?

2. Neural Networks?

1. What are NN? Definition & Online Preview …

2. Motivation & brief history of Neural Networks

3. From biological to artificial Neural Network Structures

4. Network Training

3. Forecasting with Neural Networks …

4. How to write a good Neural Network forecasting paper!

Agenda

Forecasting with Artificial Neural Networks


What are Artificial Neural Networks?

Artificial Neural Networks (NN)„a machine that is designed to model the way in which the brain performs a particular task …; the network is … implemented … or .. simulated in software on a digital computer.“ [Haykin98]

A class of statistical methods for information processing, consisting of a large number of simple processing units (neurons) which exchange information about their activation via directed connections. [Zell97]


Input: time series observations, causal variables, image data (pixels/bits), finger prints, chemical readouts

Processing: black box

Output: time series prediction, dependent variables, class memberships, class probabilities, principal components


What are Neural Networks in Forecasting?

Artificial Neural Networks (NN): a flexible forecasting paradigm; a class of statistical methods for time series and causal forecasting

Highly flexible processing of arbitrary input-to-output relationships
Properties: nonlinear, nonparametric (assumed), error robust (but not outlier robust!)
Data driven modelling: "learning" directly from the data


Input (nominal, interval or ratio scale; not ordinal): time series observations, lagged variables, dummy variables, causal / explanatory variables, …

Processing: black box

Output (ratio scale): regression, single period ahead, multiple periods ahead, …


DEMO: Preview of Neural Network Forecasting

Simulation of NN for Business Forecasting

Airline Passenger Data experiment
  3-layered NN (12-8-1): 12 input units, 8 hidden units, 1 output unit
  12 input lags t, t-1, …, t-11 (past 12 observations) → time series prediction
  t+1 forecast → single step ahead forecast

Benchmark time series [Brown / Box & Jenkins]: 132 observations = 11 years of monthly data
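The (12-8-1) setup implies a sliding-window preprocessing of the series: each training pattern is the past 12 observations paired with the next value. A minimal sketch (function name is an assumption, not from the slides):

```python
def make_patterns(series, n_lags=12, horizon=1):
    """Build (input window, target) pairs for time series prediction:
    the n_lags observations t-n_lags..t-1 predict the value at t+horizon-1."""
    X, y = [], []
    for t in range(n_lags, len(series) - horizon + 1):
        X.append(series[t - n_lags:t])     # past n_lags observations
        y.append(series[t + horizon - 1])  # single-step-ahead target
    return X, y

# 132 monthly observations yield 120 patterns for a (12-8-1) network
X, y = make_patterns(list(range(132)), n_lags=12)
```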


Demonstration: Preview of Neural Network Forecasting

NeuraLab Predict! „look inside neural forecasting“

[Screenshot: time series (actual values) versus NN forecasted values, updated after each learning step; PQ-diagram; in-sample observations & forecasts (training, validation) versus out-of-sample (test); errors on the training / validation / test data sets; absolute forecasting errors]




Motivation for using NN … BIOLOGY!

Human & other nervous systems (animals, insects, e.g. bats) are capable of various complex functions: perception, motor control, pattern recognition, classification, prediction etc.
Speed: e.g. detect & recognise a changed face in a crowd in 100-200 ms
Efficiency etc.

→ brains are the most efficient & complex computers known to date
Comparison: human ≈ 10,000,000,000 neurons; ant ≈ 20,000 neurons

                       Human Brain                Computer (PC)
Processing speed       10^-3 ms (0.25 MHz)        10^-9 ms (2500 MHz PC)
Neurons / transistors  10 billion neurons,        50 million (PC chip)
                       10^3 billion connections
Weight                 1500 grams                 kilograms to tons!
Energy consumption     10^-16 Joule               10^-6 Joule
Computation: vision    100 steps                  billions of steps


History: developed in interdisciplinary research (McCulloch/Pitts 1943), with motivation from the functions of natural neural networks:
  neurobiological motivation
  application-oriented motivation

Research field of soft computing & artificial intelligence: neuroscience, mathematics, physics, statistics, information science, engineering, business management
→ different VOCABULARY: statistics versus neurophysiology!

Brief History of Neural Networks

Timeline (neuroscience → pattern recognition & control → forecasting):
Turing 1936 · McCulloch/Pitts 1943 · Hebb 1949 · Dartmouth Project 1956 · Rosenblatt 1959 · Minsky/Papert 1969 · Kohonen 1972 · Werbos 1974 · Rumelhart/Hinton/Williams 1986 · 1st IJCNN 1987 · 1st journals 1988 · White 1988 (1st paper on forecasting)

Applications in engineering and business:
Minsky 1954 builds the 1st neuro-computer · GE 1954 1st computer payroll system · INTEL 1971 1st microprocessor · IBM 1981 introduces the PC · Neuralware founded 1987 · SAS 1997 Enterprise Miner · IBM 1998 $70bn BI initiative [Smith & Gupta, 2000]




% MATLAB fragment (from the slide): determine the size & type of a trained network before export
neural_net = eval(net_name);                 % resolve the network object by its name
[num_rows, ins] = size(neural_net.iw{1});    % input weight matrix -> number of inputs
[outs, num_cols] = size(neural_net.lw{neural_net.numLayers, neural_net.numLayers-1});
if strcmp(neural_net.adaptFcn, '')           % no adapt function set -> radial basis net
    net_type = 'RBF';
else
    net_type = 'MLP';
end
fid = fopen(path, 'w');                      % open the export file for writing

Motivation & Implementation of Neural Networks

From biological neural networks … to artificial neural networks

A node receives the outputs o_1, o_2, …, o_j over the weights w_i,1, w_i,2, …, w_i,j and computes

net_i = Σ_j w_ij·o_j - θ_j ,   a_i = f(net_i) ,   o_i = a_i

e.g. with a hyperbolic tangent activation:

o_i = tanh( Σ_j w_ij·o_j - θ_j )

Mathematics as abstract representation of reality → use in software simulators, hardware, engineering etc.


Information Processing in biological Neurons

Modelling of biological functions in neurons:
10-100 billion neurons with ~10,000 connections each in the brain
Input (sensory), processing (internal) & output (motoric) neurons

CONCEPT of information processing in neurons …


Biological Representation

Graphical Notation

Alternative notations – information processing in neurons / nodes

net_i = Σ_j w_ij·in_j ,   a_i = f(net_i - θ_i)

with inputs in_1, in_2, …, in_j feeding neuron / node u_i over the weights w_i,1, w_i,2, …, w_i,j, producing the output out_i (input → input function → activation function → output)

Mathematical representation (binary threshold):

y_i = 1 if Σ_j w_ij·x_j - θ_i ≥ 0 ,   y_i = 0 if Σ_j w_ij·x_j - θ_i < 0

alternative (hyperbolic tangent):

y_i = tanh( Σ_j w_ij·x_j - θ_i )

(in statistical terms: β_i ↔ weights w_i ; β_0 ↔ bias θ)


Information Processing in artificial Nodes

CONCEPT of information processing in neurons:
Input function (summation of previous signals)
Activation function (nonlinear): binary step function {0;1}, sigmoid functions (logistic, hyperbolic tangent etc.)
Output function (linear / identity, SoftMax, …)

net_i = Σ_j w_ij·in_j - θ_j ,   a_i = f(net_i) ,   o_i = a_i

Unidirectional information processing: input → input function → activation function → output function → output

e.g. binary threshold:  out_i = 1 if Σ_j w_ij·o_j - θ_i ≥ 0 ,  0 if Σ_j w_ij·o_j - θ_i < 0


Input Functions


Binary Activation Functions

Binary activation calculated from input

e.g.  a_j = f_act(net_j, θ_j)   or   a_j = f_act(net_j - θ_j)


Node function: BINARY THRESHOLD LOGIC
1. weight each individual input by its connection strength
2. sum the weighted inputs
3. add the bias term
4. calculate the output of the node through the BINARY transfer function
→ RERUN with the next input

Information Processing: Node Threshold logic

inputs o_1, o_2, o_3 → weights w_1,i, w_2,i, w_3,i → information processing → output

net_i = Σ_j w_ij·o_j - θ_j ,   a_i = f(net_i) ,   o_i = 1 if net_i ≥ 0, else 0

Worked example: inputs (2.2, 4, 1), weights (0.71, -1.84, 9.01), bias θ = o_0·w_0,i = 8.0:
  2.2·0.71 + 4.0·(-1.84) + 1.0·9.01 = 3.212
  3.212 - 8.0 = -4.788
  -4.788 < 0  →  o_j = 0.00


Continuous Activation Functions

Activation calculated from input

e.g.  a_j = f_act(net_j, θ_j)   or   a_j = f_act(net_j - θ_j)

Logistic function / hyperbolic tangent


Node function: SIGMOID THRESHOLD LOGIC with TanH activation function
1. weight each individual input by its connection strength
2. sum the weighted inputs
3. add the bias term
4. calculate the output of the node through the TanH transfer function
→ RERUN with the next input

Information Processing: Node Threshold logic

inputs o_1, o_2, o_3 → weights w_1,i, w_2,i, w_3,i → information processing → output

net_i = Σ_j w_ij·o_j - θ_j ,   o_i = tanh( Σ_j w_ij·o_j - θ_j )

Worked example: inputs (2.2, 4, 1), weights (0.71, -1.84, 9.01), bias θ = o_0·w_0,i = 8.0:
  2.2·0.71 + 4.0·(-1.84) + 1.0·9.01 = 3.212
  3.212 - 8.0 = -4.788
  tanh(-4.788) ≈ -0.9999  →  o_j ≈ -0.9999


A new Notation … GRAPHICS!

Single Linear Regression … as an equation:

Single Linear Regression … as a directed graph:

1 1 2 2 ...o n ny x x xβ β β β ε= + + + + +

n nn

xβ∑

x1

x2

xn

β1

β2

βn

y

1β0

Unidirectional Information Processing


Why Graphical Notation?

Simple neural network equation without recurrent feedbacks:

y_k = tanh( Σ_j w_kj · tanh( Σ_i w_ji·x_i - θ_j ) - θ_k )  →  Min!

with the hidden-unit activation written out via the hyperbolic tangent:

tanh(z) = (e^z - e^-z) / (e^z + e^-z) ,   z = Σ_i w_ji·x_i - θ_j

[Figure: feed-forward network mapping the inputs y_t, y_t-1, y_t-2, …, y_t-n-1 through hidden units u_n+1 … u_n+5 with biases θ_n+1 … θ_n+5 to the output y_t+1]

→ Simplification for complex models!  (β_i ↔ weights w_i ; β_0 ↔ bias θ)
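The nested equation translates directly into code. A minimal forward pass for one hidden layer (the function name and all weight values below are made-up illustration values, not from the slides):

```python
import math

def mlp_forward(x, W_hid, theta_hid, w_out, theta_out):
    """y = tanh( sum_j w_out[j]*tanh( sum_i W_hid[j][i]*x[i] - theta_hid[j] ) - theta_out )"""
    h = [math.tanh(sum(w * xi for w, xi in zip(row, x)) - th)
         for row, th in zip(W_hid, theta_hid)]                 # hidden activations
    return math.tanh(sum(w * hj for w, hj in zip(w_out, h)) - theta_out)

# illustration: 2 inputs, 2 hidden units, 1 output
y = mlp_forward([0.5, -1.0], [[0.3, 0.7], [-0.4, 0.2]], [0.1, -0.1], [0.6, -0.5], 0.05)
```

Because tanh is bounded, the output always lies in (-1, 1); targets therefore have to be scaled into that range.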


Combination of Nodes

Each node performs the same "simple" processing: with inputs o_1, o_2, o_3 arriving over the weights w_1,i, w_2,i, w_3,i,

o_i = tanh( Σ_j w_ij·o_j - θ_i ) = ( e^(Σ_j w_ij·o_j - θ_i) - e^-(Σ_j w_ij·o_j - θ_i) ) / ( e^(Σ_j w_ij·o_j - θ_i) + e^-(Σ_j w_ij·o_j - θ_i) )

"Simple" processing per node → the combination of simple nodes creates complex behaviour …


Architecture of Multilayer Perceptrons

Architecture of a Multilayer Perceptron: the classic form of feed-forward neural network!

Neurons u_n (units / nodes) ordered in layers: input layer, hidden layer(s), output layer
Unidirectional connections with trainable weights w_n,n
Vector of input signals x_i (input) → vector of output signals o_j (output)

[Figure: MLP with input units u_1 … u_4 (inputs x_1 … x_4), hidden units u_5 … u_11, output units u_12 … u_14 (outputs o_1 … o_3), and weights such as w_1,5, w_5,8, w_8,12]

o_k = tanh( Σ_j w_kj · tanh( Σ_i w_ji · tanh(…) - θ_i ) - θ_j )  →  Min!

Combination of neurons = neural network


Dictionary for Neural Network Terminology

Due to its neuro-biological origins, NN use specific terminology

don‘t be confused: ASK!

Neural Networks       Statistics
Input nodes           independent / lagged variables
Output node(s)        dependent variable(s)
Training              parameterization
Weights               parameters
…                     …




Hebbian Learning

HEBB introduced idea of learning by adapting weights [0,1]

Delta-learning rule of Widrow-Hoff

Hebb: $\Delta w_{ij} = \eta\, o_i\, a_j$

Widrow-Hoff delta rule: $\Delta w_{ij} = \eta\, o_i\,(t_j - a_j) = \eta\, o_i\,(t_j - o_j) = \eta\, o_i\, \delta_j$
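The delta rule above can be sketched in a few lines. This is an illustrative toy, not code from the tutorial: the learning rate, input pattern, and target below are all assumptions, and the node is purely linear.

```python
# Sketch of the Widrow-Hoff delta rule for one linear output node.
# Learning rate, inputs and target are illustrative assumptions.

def delta_rule_step(weights, inputs, target, eta=0.5):
    """One update: delta_w_ij = eta * o_i * (t_j - o_j)."""
    output = sum(w * x for w, x in zip(weights, inputs))  # linear node output o_j
    delta = target - output                               # error signal delta_j
    return [w + eta * x * delta for w, x in zip(weights, inputs)]

# Repeated updates drive the node output toward the target:
w = [0.0, 0.0]
for _ in range(50):
    w = delta_rule_step(w, [1.0, 0.5], target=1.0, eta=0.5)
```

After the loop, the node output for the pattern [1.0, 0.5] is (numerically) at the target 1.0.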


Neural Network Training with Back-Propagation

Training = LEARNING FROM EXAMPLES
1. Initialize connections with randomized weights (symmetry breaking)
2. Show first input pattern (independent variables) (demo only for 1 node!)
3. Forward propagation of input values up to the output layer
4. Calculate error between NN output & actual value (using error / objective function)
5. Backward propagation of errors for each weight back to the input layer

RERUN with next input pattern…

[Figure: input vector x (x_1–x_3), weight matrix W over units 1–10 with weights such as w_{1,4}, w_{3,8}, w_{4,9}, w_{3,10}, output vector o (o_1, o_2), teaching output t, error E = o − t]


Neural Network Training

Simple back propagation algorithm [Rumelhart et al. 1982]

$$E_p = C(t_{pj}, o_{pj}), \qquad \Delta_p w_{ji} \propto -\frac{\partial C(t_{pj}, o_{pj})}{\partial w_{ji}}$$

$$\frac{\partial C(t_{pj}, o_{pj})}{\partial w_{ji}} = \frac{\partial C(t_{pj}, o_{pj})}{\partial net_{pj}} \frac{\partial net_{pj}}{\partial w_{ji}}$$

$$\delta_{pj} = -\frac{\partial C(t_{pj}, o_{pj})}{\partial net_{pj}} = -\frac{\partial C(t_{pj}, o_{pj})}{\partial o_{pj}} \frac{\partial o_{pj}}{\partial net_{pj}}$$

$$o_{pj} = f_j(net_{pj}), \qquad \frac{\partial o_{pj}}{\partial net_{pj}} = f_j'(net_{pj}), \qquad \delta_{pj} = -\frac{\partial C(t_{pj}, o_{pj})}{\partial o_{pj}}\, f_j'(net_{pj})$$

For hidden units:

$$\frac{\partial C(t_{pj}, o_{pj})}{\partial o_{pj}} = \sum_k \frac{\partial C(t_{pj}, o_{pj})}{\partial net_{pk}} \frac{\partial net_{pk}}{\partial o_{pj}} = \sum_k \frac{\partial C(t_{pj}, o_{pj})}{\partial net_{pk}} \frac{\partial}{\partial o_{pj}} \sum_i w_{ki}\, o_{pi} = -\sum_k \delta_{pk}\, w_{kj}$$

$$\delta_{pj} = f_j'(net_{pj}) \sum_k \delta_{pk}\, w_{kj}$$

Combined:

$$\delta_{pj} = \begin{cases} -\dfrac{\partial C(t_{pj}, o_{pj})}{\partial o_{pj}}\, f_j'(net_{pj}) & \text{if unit } j \text{ is in the output layer} \\[1ex] f_j'(net_{pj}) \displaystyle\sum_k \delta_{pk}\, w_{kj} & \text{if unit } j \text{ is in a hidden layer} \end{cases}$$

Weight update:

$$\Delta w_{ij} = \eta\, \delta_j\, o_i, \qquad \delta_j = \begin{cases} f'(net_j)\,(t_j - o_j) & \forall j \text{ output nodes} \\ f'(net_j) \displaystyle\sum_k \delta_k\, w_{jk} & \forall j \text{ hidden nodes} \end{cases}$$

With the logistic activation $f(net_j) = \frac{1}{1 + e^{-\sum_i o_i w_{ij}}}$, so that $f'(net_j) = o_j(1 - o_j)$:

$$\Delta w_{ij} = \eta\, \delta_j\, o_i, \qquad \delta_j = \begin{cases} o_j(1 - o_j)(t_j - o_j) & \forall j \text{ output nodes} \\ o_j(1 - o_j) \displaystyle\sum_k \delta_k\, w_{jk} & \forall j \text{ hidden nodes} \end{cases}$$


Neural Network Training = Error Minimization

Minimize Error through changing ONE weight wj

[Figure: error curve E(w_j) over weight w_j with several local minima, the GLOBAL minimum, and two random starting points]


Error Backpropagation = 3D+ Gradient Descent

Local search on a multi-dimensional error surface

[Figure: 3D error surfaces over two weights, each with multiple valleys]

The task resembles finding the deepest valley in a mountain range:
- local search: fixed step size, follow the steepest descent
- local optimum = any valley; global optimum = deepest valley with the lowest error
- varies with the error surface
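The dependence on the starting point can be sketched with fixed-step-size steepest descent on a one-dimensional surface with two valleys. The error function, step size and starting points below are illustrative assumptions:

```python
# Sketch of fixed-step steepest descent: the valley reached depends on
# the (random) starting point. E(w) = w^4 - 2w^2 is an illustrative
# surface with local minima at w = -1 and w = +1.

def descend(w, steps=500, lr=0.01):
    for _ in range(steps):
        grad = 4 * w**3 - 4 * w   # dE/dw for E(w) = w^4 - 2w^2
        w -= lr * grad            # step against the gradient
    return w

start_1, start_2 = -0.5, 0.5
min_1 = descend(start_1)   # ends in the valley at w = -1
min_2 = descend(start_2)   # ends in the valley at w = +1
```

Two starting points on different sides of the ridge end in different valleys, which is exactly why multiple random initializations matter.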


Demo: Neural Network Forecasting revisited!

Simulation of NN for Business Forecasting

Experiment: Airline Passenger Data
- 3-layered NN (12-8-1): 12 input units, 8 hidden units, 1 output unit
- 12 input lags t, t-1, …, t-11 (past 12 observations) → time series prediction
- t+1 forecast → single-step-ahead forecast

Benchmark time series [Brown / Box & Jenkins]: 132 observations, 13 periods of monthly data


1. Forecasting?

2. Neural Networks?

3. Forecasting with Neural Networks …

1. NN models for Time Series & Dynamic Causal Prediction

2. NN experiments

3. Process of NN modelling

4. How to write a good Neural Network forecasting paper!

Agenda

Forecasting with Artificial Neural Networks


Time Series Prediction with Artificial Neural Networks

ANN are universal approximators [Hornik/Stinchcombe/White92 etc.]

Forecasts as an application of (nonlinear) function approximation; various architectures for prediction (time series, causal, combined …):
- single neuron / node ≈ nonlinear AR(p)
- feedforward NN (MLP etc.) ≈ hierarchy of nonlinear AR(p)
- recurrent NN (Elman, Jordan) ≈ nonlinear ARMA(p,q)
- …

$$\hat{y}_{t+h} = f(x_t) + \varepsilon_{t+h}$$

where ŷ_{t+h} = forecast for t+h, f(·) = linear / non-linear function, x_t = vector of observations in t, ε_{t+h} = independent error term in t+h.

Non-linear autoregressive AR(p)-model:

$$\hat{y}_{t+1} = f(y_t, y_{t-1}, y_{t-2}, \ldots, y_{t-n-1})$$


Neural Network Training on Time Series

Sliding Window Approach of presenting Data

[Figure: lagged observations y_t, y_{t-1}, y_{t-2}, …, y_{t-n-1} feeding input units u_1…u_n, hidden units u_{n+1}…u_{n+5} with biases θ_{n+1}…θ_{n+5}, output forecast y_{t+1}]

Training cycle:
1. Input: present a new data pattern to the neural network
2. Forward pass: calculate the neural network output from the input values
3. Compare: neural network forecast <> actual value
4. Backpropagation: change the weights to reduce the output forecast error
5. New data input: slide the window forward to show the next pattern
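The sliding-window presentation of a series can be sketched as follows. The series values and window length are illustrative (the first values of the airline data are used only as an example):

```python
# Sketch of sliding-window pattern construction: each pattern pairs
# n lagged observations (network inputs) with the next value
# (teaching output). Series and window length are illustrative.

def sliding_windows(series, n):
    patterns = []
    for t in range(n, len(series)):
        patterns.append((series[t - n:t], series[t]))  # (inputs, target y_t)
    return patterns

series = [112, 118, 132, 129, 121, 135, 148]
patterns = sliding_windows(series, n=3)
# first pattern: inputs [112, 118, 132], target 129
```

Sliding the window forward by one step yields the next input/target pattern, exactly as in the training cycle above.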


Neural Network Architectures for Linear Autoregression

Linear autoregressive AR(p)-model:

$$\hat{y}_{t+1} = f(y_t, y_{t-1}, y_{t-2}, \ldots, y_{t-n-1})$$

A single linear node u_j over the lagged inputs computes

$$\hat{y}_{t+1} = y_t w_{1j} + y_{t-1} w_{2j} + y_{t-2} w_{3j} + \ldots + y_{t-n-1} w_{nj} - \theta_j$$

Interpretation: the weights represent the autoregressive terms → same problems / shortcomings as standard AR-models!

Extensions:
- multiple output nodes = simultaneous auto-regression models
- non-linearity through a different activation function in the output node
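The equivalence of a single linear node and an AR(p) forecast can be sketched directly. The lag values, weights and bias below are illustrative assumptions:

```python
# Sketch: a single linear node with lagged inputs computes exactly the
# AR(p)-style forecast on this slide. Weights and bias theta are
# illustrative assumptions.

def linear_ar_node(lags, weights, theta):
    """y_hat(t+1) = sum_i w_i * lag_i - theta."""
    return sum(w * y for w, y in zip(weights, lags)) - theta

lags = [120.0, 115.0, 110.0]     # y_t, y_{t-1}, y_{t-2}
weights = [0.6, 0.3, 0.1]
y_hat = linear_ar_node(lags, weights, theta=0.0)
# 0.6*120 + 0.3*115 + 0.1*110 = 117.5
```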


Neural Network Architecture for Nonlinear Autoregression

Nonlinear autoregressive AR(p)-model:

$$\hat{y}_{t+1} = f(y_t, y_{t-1}, y_{t-2}, \ldots, y_{t-n-1})$$

Extensions: additional layers with nonlinear nodes; linear activation function in the output layer.

$$\hat{y}_{t+1} = \tanh\left(\sum_{i=1}^{n} w_{ij}\, y_{t-i+1} - \theta_j\right)$$


Neural Network Architectures for Nonlinear Autoregression

Nonlinear autoregressive AR(p)-model:

$$\hat{y}_{t+1} = f(y_t, y_{t-1}, y_{t-2}, \ldots, y_{t-n-1})$$

Interpretation:
- an autoregressive AR(p)-approach WITHOUT the moving-average terms of the errors → ≠ nonlinear ARIMA
- similar problems / shortcomings as standard AR-models!

Extensions: multiple output nodes = simultaneous auto-regression models

[Figure: lagged inputs y_t … y_{t-n-1} into units u_1…u_n, hidden units u_{n+1}…u_{n+5} with biases θ_{n+1}…θ_{n+5}, output y_{t+1}]

$$\hat{y}_{t+1} = \tanh\left(\sum_k w_k \tanh\left(\sum_j w_{kj} \tanh\left(\sum_i w_{ji}\, y_{t-i} - \theta_j\right) - \theta_k\right) - \theta\right)$$
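The nested-tanh forecast above is just a forward pass through tanh layers. A minimal sketch for one hidden layer with two nodes follows; all weights, biases and lag values are illustrative assumptions:

```python
import math

# Sketch of a nested-tanh forecast for a tiny (2 input, 2 hidden,
# 1 output) network. All weights, biases and lags are illustrative.

def mlp_forecast(lags, W_h, theta_h, w_o, theta_o):
    hidden = [math.tanh(sum(w * y for w, y in zip(row, lags)) - th)
              for row, th in zip(W_h, theta_h)]
    return math.tanh(sum(w * h for w, h in zip(w_o, hidden)) - theta_o)

y_hat = mlp_forecast([0.2, -0.1],
                     W_h=[[0.5, -0.3], [0.1, 0.8]],
                     theta_h=[0.0, 0.1],
                     w_o=[0.7, -0.4],
                     theta_o=0.0)
```

Because the output node is also a tanh, the forecast is bounded to (-1, 1), which is why the target series must be scaled into that interval (see the preprocessing section).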


Neural Network Architectures for Multiple Step Ahead Nonlinear Autoregression

Nonlinear autoregressive AR(p)-model:

$$(\hat{y}_{t+1}, \hat{y}_{t+2}, \ldots, \hat{y}_{t+n}) = f(y_t, y_{t-1}, y_{t-2}, \ldots, y_{t-n-1})$$

Interpretation: as single autoregressive modeling AR(p)


Neural Network Architectures for Forecasting - Nonlinear Autoregression Intervention Model

Nonlinear autoregressive ARX(p)-model:

$$(\hat{y}_{t+1}, \hat{y}_{t+2}, \ldots, \hat{y}_{t+n}) = f(y_t, y_{t-1}, y_{t-2}, \ldots, y_{t-n-1})$$

Interpretation:
- as single autoregressive modeling AR(p)
- additional event term to explain external events

Extensions: multiple output nodes = simultaneous multiple regression


Neural Network Architecture for Linear Regression

Linear regression model:

$$\hat{y} = f(x_1, x_2, x_3, \ldots, x_n)$$

$$\hat{y}_j = x_1 w_{1j} + x_2 w_{2j} + x_3 w_{3j} + \ldots + x_n w_{nj} - \theta_j$$

[Figure: inputs Max Temperature, Rainfall and Sunshine Hours feeding a single node that outputs Demand]


Neural Network Architectures for Non-Linear Regression (≈ Logistic Regression)

Nonlinear multiple (logistic) regression model:

$$\hat{y}_{t+1} = f(y_t, y_{t-1}, y_{t-2}, \ldots, y_{t-n-1})$$

$$\hat{y}_{t+1} = \operatorname{log}\left(\sum_{i=1}^{n} w_{ij}\, y_{t-i+1} - \theta_j\right)$$

(with "log" denoting the logistic activation function)

[Figure: inputs Max Temperature, Rainfall and Sunshine Hours feeding a logistic node that outputs Demand]


Neural Network Architectures for Non-linear Regression

Nonlinear regression model:

$$\hat{y} = f(x_1, x_2, x_3, \ldots, x_n)$$

$$\hat{y}_j = x_1 w_{1j} + x_2 w_{2j} + x_3 w_{3j} + \ldots + x_n w_{nj} - \theta_j$$

Interpretation:
- similar to linear multiple regression modeling
- without nonlinearity in the output: a weighted expert regime on non-linear regressions
- with nonlinearity in the output layer: ???


Classification of Forecasting Methods

Forecasting Methods
- Objective forecasting methods
  - Time series methods: Naive methods; Averages / Moving Averages; Exponential Smoothing (Simple ES, Linear ES, Seasonal ES, Dampened Trend ES); Autoregression ARIMA; Neural Networks
  - Causal methods: Simple Regression; Linear Regression; Multiple Regression; Dynamic Regression; Vector Autoregression; Intervention model; Neural Networks
- Subjective forecasting methods: "Prophecy" / educated guessing; Sales Force Composite; Analogies; Delphi; PERT; Survey techniques

Demand planning practice: objective methods + subjective correction

Neural Networks ARE time series methods AND causal methods, & CAN be used as Averages & ES, Regression …


Focus

Different model classes of Neural Networks

- Since the 1960s a variety of NN were developed for different tasks: Classification ≠ Optimization ≠ Forecasting → application-specific models
- Different CLASSES of Neural Networks exist for forecasting alone!
- Focus here only on the original Multilayer Perceptrons!


Problem!

The MLP is the most common NN architecture used. MLPs with a sliding window can ONLY capture nonlinear seasonal autoregressive processes nSAR(p,P).

BUT:
- they can model MA(q)-processes through an extended AR(p) window!
- they can model SARMAX-processes through recurrent NNs


1. Forecasting?

2. Neural Networks?

3. Forecasting with Neural Networks …

1. NN models for Time Series & Dynamic Causal Prediction

2. NN experiments

3. Process of NN modelling

4. How to write a good Neural Network forecasting paper!

Agenda

Forecasting with Artificial Neural Networks


Time Series Prediction with Artificial Neural Networks

Which time series patterns can ANNs learn & extrapolate? [Pegels69/Gardner85]

… ??? → Simulation of neural network prediction of artificial time series


Time Series Demonstration – Artificial Time Series

Simulation of NN in Business Forecasting with NeuroPredictor

Experiment: prediction of artificial time series (Gaussian noise)
- stationary time series
- seasonal time series
- linear trend time series
- trend with additive seasonality time series


Time Series Prediction with Artificial Neural Networks

Which time series patterns can ANNs learn & extrapolate? [Pegels69/Gardner85]

Neural networks can forecast ALL major time series patterns:
- NO time-series-dependent preprocessing / integration necessary
- NO time-series-dependent MODEL SELECTION required!!!
- SINGLE MODEL APPROACH FEASIBLE!


Time Series Demonstration A - Lynx Trappings

Simulation of NN in Business Forecasting

Experiment: Lynx trappings at the McKenzie River
- 3-layered NN (12-8-1): 12 input units, 8 hidden units, 1 output unit
- different lag structures: t, t-1, …, t-11 (past 12 observations)
- t+1 forecast → single-step-ahead forecast

Benchmark time series [Andrews / Herzberg]: 114 observations; periodicity? 8 years?


Time Series Demonstration B – Event Model

Simulation of NN in Business Forecasting

Experiment: Mouthwash sales
- 3-layered NN (12-8-1): 12 input units, 8 hidden units, 1 output unit
- 12 input lags t, t-1, …, t-11 (past 12 observations) → time series prediction
- t+1 forecast → single-step-ahead forecast

Spurious autocorrelations from marketing events:
- advertisement with a small lift
- price reductions with a high lift


Time Series Demonstration C – Supermarket Sales

Simulation of NN in Business Forecasting

Experiment: Supermarket sales of fresh products with weather data
- 4-layered NN (7-4-4-1): 7 input units, two hidden layers of 4 units, 1 output unit
- different lag structures: t, t-1, …, t-7 (past observations)
- t+4 forecast → single-step-ahead forecast at horizon t+4


1. Forecasting?

2. Neural Networks?

3. Forecasting with Neural Networks …

1. NN models for Time Series & Dynamic Causal Prediction

2. NN experiments

3. Process of NN modelling

1. Preprocessing

2. Modelling NN Architecture

3. Training

4. Evaluation

4. How to write a good Neural Network forecasting paper!

Agenda

Forecasting with Artificial Neural Networks


Decisions in Neural Network Modelling

NN Modelling Process: manual decisions require expert knowledge

1. Data pre-processing: transformation; scaling; normalizing to [0;1] or [-1;1]
2. Modelling of NN architecture: number of INPUT nodes; number of HIDDEN nodes; number of HIDDEN LAYERS; number of OUTPUT nodes; information processing in nodes (activation functions); interconnection of nodes
3. Training: initializing of weights (how often?); training method (backprop, higher order …); training parameters; evaluation of the best model (early stopping)
4. Application of the Neural Network model
5. Evaluation: evaluation criteria & selected dataset


Modeling Degrees of Freedom

A variety of parameters must be pre-determined for ANN forecasting:

- D = Dataset: [DSE] selection; [DSA] sampling
- P = Preprocessing: [C] correction; [N] normalization; [S] scaling
- A = Architecture: [NI] no. of input nodes; [NS] no. of hidden nodes; [NL] no. of hidden layers; [NO] no. of output nodes; [K] connectivity / weight matrix; [T] activation strategy
- U = Signal processing: [FI] input function; [FA] activation function; [FO] output function
- L = Learning algorithm: [G] choice of algorithm; [PT,L] learning parameters per phase & layer; [IP] initialization procedure; [IN] number of initializations; [O] objective function; [B] stopping method & parameters

→ interactions & interdependencies between parameter choices!


Heuristics to Reduce Design Complexity

Number of hidden nodes in MLPs (in no. of input nodes n):
- 2n+1 [Lippmann87; Hecht-Nielsen90; Zhang/Patuwo/Hu98]
- 2n [Wong91]; n [Tang/Fishwick93]; n/2 [Kang91]
- 0.75n [Bailey90]; 1.5n to 3n [Kaastra/Boyd96] …

Activation function and preprocessing:
- logistic in hidden & output [Tang/Fishwick93; Lachtermacher/Fuller95; Sharda/Patil92]
- hyperbolic tangent in hidden & output [Zhang/Hutchinson93; DeGroot/Wurtz91]
- linear output nodes [Lapedes/Farber87; Weigend89-91; Wong90]

... with interdependencies!

- no research on the relative performance of all alternatives
- no empirical results to support the preference of a single heuristic
- ADDITIONAL SELECTION PROBLEM of choosing a HEURISTIC
- INCREASED COMPLEXITY through interactions of heuristics
- AVOID the selection problem through EXHAUSTIVE ENUMERATION


Tip & Tricks in Data Sampling

Do’s and Don'ts

- Random order sampling? Yes!
- Sampling with replacement? Depends / try!
- Data splitting: ESSENTIAL!!!
  - training & validation for identification, parameterisation & selection
  - testing for ex ante evaluation (ideally multiple ways / origins!)

Simulation Experiments
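The essential three-way split can be sketched as a simple chronological partition. The 60/20/20 proportions below are an illustrative assumption, not a recommendation from the tutorial:

```python
# Sketch of the training / validation / test split: the first part is
# used for parameterisation, the middle for model selection, and the
# final held-back part for ex ante evaluation. Proportions are
# illustrative assumptions.

def split_series(series, train=0.6, valid=0.2):
    n = len(series)
    i, j = int(n * train), int(n * (train + valid))
    return series[:i], series[i:j], series[j:]

data = list(range(100))
train_set, valid_set, test_set = split_series(data)
```

For time series, a chronological split (rather than random shuffling across the boundary) keeps the test set strictly "in the future" of the training data.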


Data Preprocessing

Data transformation:
- verification, correction & editing (data entry errors etc.)
- coding of variables
- scaling of variables
- selection of independent variables (PCA)
- outlier removal
- missing value imputation

Data coding:
- binary coding of external events
- n and n-1 coding have no significant impact; n-coding appears to be more robust (despite issues of multicollinearity)

Modification of data to enhance accuracy & speed


Data Preprocessing – Variable Scaling

Linear interval scaling

Interval features, e.g. "turnover" [28.12; 70; 32; 25.05; 10.17 …] → linear interval scaling to a target interval, e.g. [-1;1]:

$$y = ILower + (IUpper - ILower)\,\frac{x - Min(x)}{Max(x) - Min(x)}$$

e.g. x = 72, Max(x) = 119.95, Min(x) = 0, target [-1;1]:

$$y = -1 + \bigl(1 - (-1)\bigr)\,\frac{72 - 0}{119.95 - 0} = -1 + \frac{144}{119.95} = 0.2005$$

[Figure: scatter plot of income over hair length before scaling (income 0–100,000) and after scaling to [0;1] on both axes]
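The interval-scaling formula can be checked against the slide's worked example (x = 72, range [0; 119.95], target [-1; 1]):

```python
# Sketch of the linear interval scaling formula, verified against the
# slide's worked example.

def scale_interval(x, x_min, x_max, lower=-1.0, upper=1.0):
    return lower + (upper - lower) * (x - x_min) / (x_max - x_min)

y = scale_interval(72, 0, 119.95)   # approx. 0.2005, as on the slide
```

In practice the observed Min(x)/Max(x) of the *training* set are used (with some headroom), so that later test values do not fall outside the activation function's bounded range.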


Data Preprocessing – Variable Scaling

Standardisation / Normalisation

$$y = \frac{x - \mu}{\sigma}$$

Attention: interaction of the interval with the activation function: logistic [0;1], TanH [-1;1].

[Figure: scatter plot of income over hair length, raw and after normalisation]


Data Preprocessing – Outliers

Outliers: extreme values, coding errors, data errors

Outlier impact on scaled variables → potential to bias the analysis; strong impact on linear interval scaling (no normalisation / standardisation)

Actions:
- eliminate outliers (delete records)
- replace / impute the values as missing values
- binning of the variable = rescaling
- normalisation of variables = scaling


Data Preprocessing – Skewed Distributions

Asymmetry of observations

Transform data:
- transformation of data (functional transformation of values)
- linearization or normalisation

Rescale (DOWNSCALE) data to allow better analysis:
- binning of data (grouping of data into groups) → ordinal scale!


Data Preprocessing – Data Encoding

Downscaling & coding of variables:
- metric variables → create bins / buckets of ordinal variables (= BINNING)
  - create buckets of equally spaced intervals
  - create bins of quantiles with equal frequencies
- ordinal variable of n values → rescale to n or n-1 nominal binary variables
- nominal variable of n values, e.g. {Business, Sports & Fun, Woman} → rescale to n or n-1 binary variables

Example: 0 = Business Press, 1 = Sports & Fun, 2 = Woman

Recode as 1-of-N coding → 3 new bit-variables:
1 0 0 = Business Press; 0 1 0 = Sports & Fun; 0 0 1 = Woman

Recode as 1-of-(N-1) coding → 2 new bit-variables:
1 0 = Business Press; 0 1 = Sports & Fun; 0 0 = Woman
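The two coding schemes can be sketched with the slide's magazine categories; the helper names are illustrative:

```python
# Sketch of 1-of-N and 1-of-(N-1) binary coding for a nominal variable,
# using the slide's magazine categories.

CATEGORIES = ["Business Press", "Sports & Fun", "Woman"]

def one_of_n(value, categories=CATEGORIES):
    return [1 if value == c else 0 for c in categories]

def one_of_n_minus_1(value, categories=CATEGORIES):
    # the last category becomes the all-zero reference class
    return one_of_n(value, categories)[:-1]

code_n = one_of_n("Sports & Fun")        # [0, 1, 0]
code_n1 = one_of_n_minus_1("Woman")      # [0, 0]
```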


Data Preprocessing – Impute Missing Values

Missing values:
- a missing feature value for an instance
- some methods interpret " " as 0!
- others create a special class for missing values …

Solutions:
- missing value on an interval scale → impute mean, median, etc.
- missing value on a nominal scale → impute the most prominent value in the feature set

[Figure: scatter plot of income over hair length with missing entries]
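The two imputation rules can be sketched directly; the data values are illustrative assumptions:

```python
from collections import Counter

# Sketch of the two imputation rules on this slide: mean for interval
# variables, most frequent value for nominal ones. Data is illustrative.

def impute_interval(values):
    known = [v for v in values if v is not None]
    mean = sum(known) / len(known)
    return [mean if v is None else v for v in values]

def impute_nominal(values):
    known = [v for v in values if v is not None]
    mode = Counter(known).most_common(1)[0][0]  # most prominent value
    return [mode if v is None else v for v in values]

incomes = impute_interval([10.0, None, 30.0])     # None -> mean 20.0
classes = impute_nominal(["A", "B", None, "A"])   # None -> mode "A"
```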


Tip & Tricks in Data Pre-Processing

Do’s and Don'ts

De-Seasonalisation? NO! (maybe … you can try!)De-Trending / Integration? NO / depends / preprocessing!

Normalisation? Not necessarily correct outliers!Scaling Intervals [0;1] or [-1;1]? Both OK!Apply headroom in Scaling? YES!Interaction between scaling & preprocessing? limited…

Simulation Experiments


Outlier correction in Neural Network Forecasts?

Outlier correction? YES!

Neural networks are often characterized asFault tolerant and robustShowing graceful degradation regarding errorsFault tolerance = outlier resistance in time series prediction?

Simulation Experiments


Number of OUTPUT nodes: given by the problem domain!

Number of HIDDEN LAYERS: 1 or 2 … depends on the information processing in the nodes; also depends on the nonlinearity & continuity of the time series

Number of HIDDEN nodes: trial & error … sorry!

Information processing in nodes (activation functions):
- Sig-Id
- Sig-Sig (bounded & additional nonlinear layer)
- TanH-Id
- TanH-TanH (bounded & additional nonlinear layer)

Interconnection of nodes: ???


Tip & Tricks in Architecture Modelling

Do’s and Don'ts

Number of input nodes? DEPENDS! use linear ACF/PACF to start!Number of hidden nodes? DEPENDS! evaluate each time (few)Number of output nodes? DEPENDS on application!

fully or sparsely connected networks? ???shortcut connections? ???

activation functions logistic or hyperbolic tangent? TanH !!!activation function in the output layer? TanH or Identity!…

Simulation Experiments


1. Forecasting?

2. Neural Networks?

3. Forecasting with Neural Networks …

1. NN models for Time Series & Dynamic Causal Prediction

2. NN experiments

3. Process of NN modelling

1. Preprocessing

2. Modelling NN Architecture

3. Training

4. Evaluation & Selection

4. How to write a good Neural Network forecasting paper!

Agenda

Forecasting with Artificial Neural Networks


Error Variation by Number of Initialisations

[Figure: lowest and highest MAE across 1, 5, 10, 25, 50, 100, 200, 400, 800 and 1600 random initialisations]

Tip & Tricks in Network Training

Do's and Don'ts
- Initialisations? A MUST! Minimum 5-10 times!!!

Simulation Experiments


Tip & Tricks in Network Training & Selection

Do’s and Don'ts

- Initialisations? A MUST! Minimum 5-10 times!!!
- Selection of training algorithm? Backprop OK, DBD OK … but not higher-order methods!
- Parameterisation of the training algorithm? DEPENDS on the dataset!
- Use of early stopping? YES – careful with the stopping criteria!
- …

Suitable backpropagation training parameters (to start with):
- learning rate 0.5 (always < 1!)
- momentum 0.4
- decrease learning rate by 99%

Early stopping on the composite error of training & validation

Simulation Experiments
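These starting parameters can be sketched as a gradient-descent update with momentum and learning-rate decay. The toy quadratic error is an illustrative assumption, and the slide's "decrease learning rate by 99%" is read here as decaying the rate *to* 99% of its value each epoch (factor 0.99):

```python
# Sketch of backprop-style updates with the suggested starting
# parameters: learning rate 0.5, momentum 0.4, 0.99 decay per epoch.
# The quadratic error E(w) = (w - 3)^2 is an illustrative assumption.

def train(w, epochs=100, eta=0.5, momentum=0.4, decay=0.99):
    prev_step = 0.0
    for _ in range(epochs):
        grad = 2 * (w - 3.0)                 # gradient of E(w) = (w - 3)^2
        step = -eta * grad + momentum * prev_step
        w, prev_step = w + step, step
        eta *= decay                         # shrink the learning rate
    return w

w_final = train(0.0)   # converges toward the minimum at w = 3
```

The momentum term smooths oscillation across the valley, while the decaying rate lets the search settle instead of overshooting.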


1. Forecasting?

2. Neural Networks?

3. Forecasting with Neural Networks …

1. NN models for Time Series & Dynamic Causal Prediction

2. NN experiments

3. Process of NN modelling

1. Preprocessing

2. Modelling NN Architecture

3. Training

4. Evaluation & Selection

4. How to write a good Neural Network forecasting paper!

Agenda

Forecasting with Artificial Neural Networks


Experimental Results

Dataset errors, ranked by validation error:

Rank | ANN ID | Training | Validation | Test
1st | 39 (3579) | 0.010850 | 0.011455 | 0.043413
2nd | 10 (5873) | 0.009732 | 0.012093 | 0.023367
… | … | … | … | …
25th | 8 (919) | 0.009632 | 0.013650 | 0.025886
… | … | … | … | …
1440th | 33 (12226) | 0.014504 | 0.146016 | 0.398628
overall lowest | – | 0.009207 | 0.011455 | 0.017760
overall highest | – | 0.155513 | 0.146016 | 0.398628

Experiments ranked by validation error show:
- significant positive correlations between the training & validation sets, validation & test sets, and training & test sets
- inconsistent errors by selection criterion: a low validation error can come with a high test error, and a higher validation error with a lower test error

[Scatter plot: pairwise training vs. validation vs. test errors (axes 0,00 to 0,25) for all 14400 trained ANNs, with candidates marked at the 1st, 6th and 10th percentile of validation error; series TRAIN, VALID, TEST]

Page 137

Problem: Validation Error Correlations

Correlations between dataset errors

The validation error is a questionable selection criterion:
- decreasing correlation between validation and test errors
- high variance on the test error
- same results when ordered by training & test error

Correlation between datasets:

Data included    Train-Validate   Validate-Test   Train-Test
14400 ANNs       0,7786**         0,9750**        0,7686**
top 1000 ANNs    0,2652**         0,0917**        0,4204**
top 100 ANNs     0,2067**         0,1276**        0,4004**
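The collapse of the validation-test correlation among the best candidates can be reproduced with a toy range-restriction simulation. The latent "skill" factor and the lognormal noise below are assumptions chosen purely to illustrate the selection effect, not the tutorial's actual data.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 14400

skill = rng.lognormal(-4.0, 0.8, n)          # latent per-candidate model quality
valid = skill * rng.lognormal(0.0, 0.3, n)   # noisy error realisation per subset
test  = skill * rng.lognormal(0.0, 0.3, n)

def corr(a, b):
    return float(np.corrcoef(a, b)[0, 1])

r_all = corr(valid, test)                    # high: quality differences dominate
top = np.argsort(valid)[:1000]               # the 1000 best by validation error
r_top = corr(valid[top], test[top])          # much lower: range restriction
print(round(r_all, 3), round(r_top, 3))
```

Once we condition on a low validation error, the remaining spread is mostly noise, so ranking within the top candidates by validation error tells us little about their test error.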

[Four S-diagrams of error correlations, plotting E Train, E Valid and E Test (with 20-ANN and 150-ANN moving averages) against the ANNs ordered by error: all 14400 ANNs ordered by validation error, and the top 1000 ANNs ordered by training, validation and test error respectively; error axes range roughly 0 to 0,25]

Page 138

Desirable properties of an error measure:
- summarises the cost consequences of the errors
- robust to outliers
- unaffected by units of measurement
- stable if only a few data points are used

Fildes, IJF, 1992; Armstrong & Collopy, IJF, 1992; Hendry & Clements; Armstrong & Fildes, JoF, 1993, 1994

Page 139

Model Evaluation through Error Measures

Forecasting k periods ahead, we can assess forecast quality using a holdout sample.

Individual forecast error: e_{t+k} = Actual − Forecast

Mean error (ME): add the individual forecast errors. As positive errors cancel out negative errors, the ME should be approximately zero for an unbiased series of forecasts.

Mean squared error (MSE): square the individual forecast errors, sum the squared errors and divide by n.

e_t = Y_t − F_t

MSE = 1/n · Σ_{k=1..n} (Y_{t+k} − F_{t+k})²

ME = 1/n · Σ_{k=1..n} (Y_{t+k} − F_{t+k})
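The ME and MSE definitions can be sketched directly; the function names and the four-point example below are mine, not the slides'.

```python
import numpy as np

def mean_error(actual, forecast):
    # ME: positive and negative errors cancel; approx. zero for unbiased forecasts
    e = np.asarray(actual, float) - np.asarray(forecast, float)
    return float(e.mean())

def mean_squared_error(actual, forecast):
    # MSE: squaring removes the sign and penalises large errors heavily
    e = np.asarray(actual, float) - np.asarray(forecast, float)
    return float((e ** 2).mean())

y = [100, 110, 105, 95]   # actuals
f = [98, 112, 103, 97]    # forecasts: errors +2, -2, +2, -2
print(mean_error(y, f))           # -> 0.0, although every forecast is wrong
print(mean_squared_error(y, f))   # -> 4.0
```

The example shows exactly the cancellation problem the slide warns about: the ME is zero even though no single forecast is correct, while the MSE exposes the errors.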

Page 140

Model Evaluation through Error Measures

Avoid cancellation of positive vs. negative errors: use absolute errors.

Mean absolute error (MAE): take the absolute values of the forecast errors, sum the absolute values and divide by n.

Mean absolute percent error (MAPE): take the absolute values of the percent errors, sum the percent errors and divide by n.

This summarises the forecast error over different lead times. You may need to keep k fixed, depending on the decision to be made based on the forecast:

MAE = 1/n · Σ_{k=1..n} |Y_{t+k} − F_{t+k}|

MAPE = 1/n · Σ_{k=1..n} |Y_{t+k} − F_{t+k}| / Y_{t+k}   (× 100 to express in percent)

For a fixed lead time k, averaged over the n−k+1 origins t = T, …, T+n−k:

MAE(k) = 1/(n−k+1) · Σ_{t=T..T+n−k} |Y_{t+k} − F_{t+k}|

MAPE(k) = 1/(n−k+1) · Σ_{t=T..T+n−k} |Y_{t+k} − F_{t+k}| / Y_{t+k}
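The absolute measures can be sketched the same way, assuming the simple per-period definitions above rather than the multi-origin MAE(k)/MAPE(k) variants.

```python
import numpy as np

def mae(actual, forecast):
    # mean of |Y - F|: no cancellation of signed errors
    return float(np.mean(np.abs(np.asarray(actual, float) - np.asarray(forecast, float))))

def mape(actual, forecast):
    # mean of |Y - F| / Y in percent: scale-independent, but undefined for Y = 0
    a = np.asarray(actual, float)
    return float(np.mean(np.abs((a - np.asarray(forecast, float)) / a)) * 100)

y = [100, 110, 105, 95]
f = [98, 112, 103, 97]
print(mae(y, f))    # -> 2.0
print(mape(y, f))   # about 1.96 percent
```

Unlike the ME on the previous slide, both values are non-zero here, because the absolute value blocks the cancellation of the +2 and −2 errors.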

Page 141

Selecting Forecasting Error Measures

MAPE & MSE are subject to upward bias by a single bad forecast. Alternative measures are based on the median instead of the mean.

Median absolute percentage error: the median is the middle value of a set of errors sorted in ascending order. If the sorted data set has an even number of elements, the median is the average of the two middle values.

Median Squared Error

MdAPE = Med_t ( |e_{f,t}| / y_t × 100 )

MdSE = Med_t ( e_{f,t}² )
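A quick sketch shows why the median-based measures resist a single bad forecast; the five-point series is invented purely for illustration.

```python
import numpy as np

def mdape(actual, forecast):
    a = np.asarray(actual, float)
    return float(np.median(np.abs((a - np.asarray(forecast, float)) / a)) * 100)

def mdse(actual, forecast):
    e = np.asarray(actual, float) - np.asarray(forecast, float)
    return float(np.median(e ** 2))

y = np.array([100.0, 100, 100, 100, 100])
f = np.array([98.0, 102, 99, 101, 50])      # one disastrous forecast (50)

mean_ape = float(np.mean(np.abs((y - f) / y)) * 100)   # 11.2: dragged up by the outlier
mean_se  = float(np.mean((y - f) ** 2))                # 502.0: dominated by 2500
print(mdape(y, f))   # stays near 2.0: the median ignores the single bad month
print(mdse(y, f))    # stays at 4.0
```

One bad month inflates the mean-based MAPE and MSE by an order of magnitude, while the median-based MdAPE and MdSE still describe the typical error.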

Page 142

Evaluation of Forecasting Methods

The baseline model in a forecasting competition is the Naïve 1 "no change" model, ŷ_{t+f} = y_t, used as a benchmark.

Theil's U statistic allows us to determine whether our forecasts outperform this baseline: our method increases accuracy (outperforms the naïve) if U < 1.

U = √[ Σ_t ( (ŷ_{t+f} − y_{t+f}) / y_t )² / Σ_t ( (y_{t+f} − y_t) / y_t )² ]
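Theil's U can be sketched directly from this formula; the two toy forecast series are assumptions used only to show the U = 1 and U < 1 cases.

```python
import numpy as np

def theil_u(actual, forecast):
    # U < 1: the method is more accurate than the naive no-change forecast
    y = np.asarray(actual, float)
    f = np.asarray(forecast, float)
    num = ((f[1:] - y[1:]) / y[:-1]) ** 2    # the method's relative errors
    den = ((y[1:] - y[:-1]) / y[:-1]) ** 2   # the naive model's relative errors
    return float(np.sqrt(num.sum() / den.sum()))

y = np.array([100.0, 104, 108, 112, 116])        # steadily trending actuals
f_good  = y - 1.0                                # always off by one unit
f_naive = np.concatenate(([y[0]], y[:-1]))       # the no-change forecast itself

u_good  = theil_u(y, f_good)    # about 0.25: clearly beats the naive
u_naive = theil_u(y, f_naive)   # 1.0 by construction
print(u_good, u_naive)
```

Feeding the naive forecast into the statistic returns exactly 1, which makes U a convenient self-calibrating benchmark score.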

Page 143

Tips & Tricks in Network Selection

Do’s and Don'ts

- Selection of the model with the lowest validation error? NOT VALID!
- Model & forecasting competition? Always use multiple origins etc.!
…

Simulation Experiments

Page 144

1. Forecasting?

2. Neural Networks?

3. Forecasting with Neural Networks …

1. NN models for Time Series & Dynamic Causal Prediction

2. NN experiments

3. Process of NN modelling

4. How to write a good Neural Network forecasting paper!

Agenda

Forecasting with Artificial Neural Networks

Page 145

How to evaluate NN performance

Valid experiments:
- Evaluate using ex ante accuracy (HOLD-OUT data)
- Use the training & validation sets for training & model selection; NEVER use the test data except for the final evaluation of accuracy!
- Evaluate across multiple time series
- Evaluate against benchmark methods (NAÏVE + domain benchmarks!)
- Evaluate using multiple & robust error measures (not MSE!)
- Evaluate using multiple out-of-samples (time series origins)
- Evaluate as an empirical forecasting competition!

Reliable results:
- Document all parameter choices
- Document all relevant modelling decisions in the process
- Rigorous documentation, to allow re-simulation by others!
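One lightweight way to honour the documentation requirements is to keep every modelling decision in a single serialisable record; the field names and values below are an assumed layout for illustration, not a standard.

```python
import json

# Hypothetical experiment record: every choice needed to re-simulate the run
experiment = {
    "data": {"series": "monthly_sales", "split": [0.6, 0.2, 0.2]},  # train/valid/test
    "preprocessing": {"scaling": "linear to [-0.6, 0.6]", "lags": 13},
    "architecture": {"layers": [13, 8, 1], "activation": "tanh"},
    "training": {"algorithm": "backprop", "learning_rate": 0.5,
                 "momentum": 0.4, "initialisations": 10,
                 "early_stopping": "composite train+valid error", "seed": 42},
    "evaluation": {"error_measures": ["MAE", "MdAPE", "Theil's U"],
                   "origins": "multiple", "benchmarks": ["naive1"]},
}

record = json.dumps(experiment, indent=2)   # archive this alongside the results
print(record)
```

Archiving such a record with each experiment makes the "rigorous documentation" requirement concrete: anyone holding the record and the data can re-run the simulation.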

Page 146

Evaluation through Forecasting Competition

Forecasting Competition
- Split the time series data into 2 sets PLUS multiple ORIGINS!
- Select the forecasting model & best parameters on IN-SAMPLE DATA
- Forecast the next values for DIFFERENT HORIZONS t+1, t+3, t+18?
- Evaluate the error on hold-out OUT-OF-SAMPLE DATA
- Choose the model with the lowest AVERAGE error on OUT-OF-SAMPLE DATA

Results of the M3 competition
- simple methods outperform complex ones
- exponential smoothing OK
- neural networks not necessary
- forecasting VALUE depends on the VALUE of the INVENTORY DECISION
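A competition-style selection rule like the one above can be sketched as follows; the candidate methods and the series are illustrative stand-ins, not the tutorial's own models:

```python
# Sketch of competition-style model selection: fit each candidate on the
# in-sample data only, forecast the hold-out, and pick the model with the
# lowest AVERAGE out-of-sample error across all series.

def evaluate(candidates, series_list, holdout=4):
    scores = {name: [] for name in candidates}
    for series in series_list:
        train, test = series[:-holdout], series[-holdout:]
        for name, fit_and_forecast in candidates.items():
            forecasts = fit_and_forecast(train, len(test))   # in-sample fit only
            err = sum(abs(a - f) for a, f in zip(test, forecasts)) / len(test)
            scores[name].append(err)
    # average across all series, then choose the winner
    avg = {name: sum(errs) / len(errs) for name, errs in scores.items()}
    return min(avg, key=avg.get), avg

candidates = {
    "naive": lambda train, h: [train[-1]] * h,
    "mean":  lambda train, h: [sum(train) / len(train)] * h,
}
winner, avg = evaluate(candidates, [[110, 90, 100, 90, 110, 90, 100, 95]])
```

With this illustrative series the in-sample mean beats the naïve forecast on the hold-out; on other series the ranking can flip, which is exactly why the average over many series decides.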

Page 147

Evaluation of Forecasting Methods

HOLD-OUT DATA: out-of-sample errors count!

[Table: monthly baseline sales (Jan–Aug 2003) with forecasts of Method A and Method B and their absolute errors AE(A), AE(B); beyond "today" the hold-out months t+1, t+2, t+3 are the presumed future, and only those out-of-sample errors count.]

SIMULATED = EX POST Forecasts

Page 148

Evaluation of Forecasting Methods

Different forecasting horizons emulate a rolling forecast …

Evaluate only RELEVANT horizons: omit t+2 if it is irrelevant for planning!
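A sketch of horizon-specific evaluation, assuming a hypothetical structure in which forecasts made at each origin are stored by origin; only the horizons listed (here t+1 and t+3) enter the error calculation:

```python
# Sketch: collect errors separately per forecast horizon so irrelevant
# horizons (e.g. t+2 if unused in planning) can simply be omitted.
# Data and stored forecasts are illustrative stand-ins.

def errors_by_horizon(actuals, forecasts_by_origin, horizons=(1, 3)):
    """forecasts_by_origin[t] holds forecasts made at origin t for t+1, t+2, ..."""
    errs = {h: [] for h in horizons}
    for origin, forecasts in forecasts_by_origin.items():
        for h in horizons:                       # only the RELEVANT horizons
            target = origin + h
            if target < len(actuals) and h <= len(forecasts):
                errs[h].append(abs(actuals[target] - forecasts[h - 1]))
    return {h: sum(e) / len(e) for h, e in errs.items() if e}

actuals = [110, 90, 100, 90, 110, 90, 100, 90]
forecasts_by_origin = {3: [95, 95, 95], 4: [100, 100, 100]}
print(errors_by_horizon(actuals, forecasts_by_origin))
```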

[Table: the same monthly sales example evaluated as a rolling forecast; from each origin the forecasts of Method A and Method B for t+1 and t+2 are compared with the realised sales, and the absolute errors AE(A), AE(B) are summed per horizon.]

Page 149

Evaluation of Forecasting Methods

Single vs. multiple origin evaluation
- Problem of sampling variability!
- Evaluate on multiple origins
- Calculate the t+1 error at each origin
- Calculate the average of the t+1 errors
- GENERALIZE about forecast errors
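The multiple-origin procedure above, sketched with a naïve one-step forecaster as an illustrative stand-in:

```python
# Sketch of multiple-origin (rolling origin) evaluation: forecast from every
# origin, collect the t+1 error at each, and average them to reduce the
# sampling variability of a single train/test split.

def rolling_origin_t1_error(series, forecaster, first_origin):
    errors = []
    for origin in range(first_origin, len(series) - 1):
        history = series[:origin + 1]            # data available at this origin
        forecast = forecaster(history)           # one-step-ahead forecast
        errors.append(abs(series[origin + 1] - forecast))
    return sum(errors) / len(errors)             # average t+1 error

naive = lambda history: history[-1]              # illustrative forecaster
series = [110, 90, 100, 90, 110, 90, 100, 90]
avg_err = rolling_origin_t1_error(series, naive, first_origin=3)
```

Any single origin may favour either method by chance; only the error averaged over origins supports generalisation about which method forecasts better.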

[Table: the same monthly sales example evaluated from multiple origins; the better of Method A and Method B changes from origin to origin (A, B, A, B, …), so only the error averaged over origins supports generalisation.]

Page 150

Software Simulators for Neural Networks

Public Domain Software

Research oriented: SNNS, JavaNNS (Java SNNS), JOONE, …

Commercial Software by Price

High End: NeuralWorks Professional, SPSS Clementine, SAS Enterprise Miner

Midprice: Alyuda, NeuroSolutions, NeuroShell Predictor, NeuralPower, PredictorPro

Research: Matlab library, R package, NeuroLab

Consider the Tashman/Hoover tables on forecasting software for more details

FREE CD-ROM for evaluation

Data from experiments: M3 competition, airline data, lynx data, beer data

Software simulators

Page 151

Neural Networks Software - Time Series friendly!

NeuroSolutions Consultant, NeuroSolutions for Excel, NeuroSolutions for Matlab, TradingSolutions

EasyNN, EasyNN Plus

Braincell

Predictor, Predictor PRO

AI Trilogy: NeuroShell Predictor, NeuroShell Classifier, GeneHunter; NeuroShell 2, NeuroShell Trader, Trader Pro, DayTrader

NeuroDimension Inc.

Neural Planner Inc.

Promised Land

Attrasoft Inc.

Ward Systems

"Let your systems learn the wisdom of age and experience"

Alyuda Inc.

Page 152

Neural Networks Software – General Applications

SAS Enterprise Miner

SPSS Clementine Data Mining Suite

NeuralWorks Professional II/Plus

SAS

SPSS

NeuralWare Inc.

Page 153

Further Information

Literature & websites
- NN forecasting website: www.neural-forecasting.com or www.bis-lab.com
- Google web resources, SAS NN newsgroup FAQ: ftp://ftp.sas.com/pub/neural/FAQ.html
- BUY A BOOK!!! Only one? Get Reed & Marks, 'Neural Smithing'

Journals: forecasting … rather than the technical neural networks literature!

- JBF – Journal of Business Forecasting
- IJF – International Journal of Forecasting
- JoF – Journal of Forecasting

Contact to practitioners & researchers: associations
- IEEE NNS – IEEE Neural Network Society
- INNS & ENNS – International & European Neural Network Society

Conferences
- Neural nets: IJCNN, ICANN & ICONIP by the associations (search Google …)
- Forecasting: IBF & ISF conferences!

Newsgroups: news.comp.ai.nn
Call experts you know … me ;-)

Page 154

Agenda

1. Process of NN Modelling

2. Tips & Tricks for Improving Neural Network based forecasts

a. Copper Price Forecasting

b. Questions & Answers and Discussion

a. Advantages & Disadvantages of Neural Networks

b. Discussion

Business Forecasting with Artificial Neural Networks

Page 155

Advantages … versus Disadvantages!

Advantages
- ANN can forecast any time series pattern (t+1!)
- without preprocessing, no model selection needed!
- ANN offer many degrees of freedom in modeling
- Freedom in forecasting with one single model
- Complete model repository: linear models, nonlinear models, autoregression models, single & multiple regression, multiple step ahead, …

Disadvantages
- Experience essential! Research not consistent
- Explanation & interpretation of ANN weights IMPOSSIBLE (nonlinear combination!)
- Impact of events not directly deducible

Page 156

Questions, Answers & Comments?

Summary Day I
- ANN can forecast any time series pattern (t+1!): without preprocessing, no model selection needed!
- ANN offer many degrees of freedom in modeling
- Experience essential! Research not consistent

What we can offer you:
- NN research projects with complimentary support!
- Support through MBA master thesis in mutual projects

Sven F. Crone
[email protected]

SLIDES & PAPERS available: www.bis-lab.de, www.lums.lancs.ac.uk

Page 157

Contact Information

Sven F. Crone, Research Associate

Lancaster University Management School, Department of Management Science, Room C54, Lancaster LA1 4YX, United Kingdom

Tel +44 (0)1524 593867
Tel +44 (0)1524 593982 direct
Tel +44 (0)7840 068119 mobile
Fax +44 (0)1524 844885

Internet www.lums.lancs.ac.uk
eMail [email protected]