

    FIN 484.01 Final
    Business Forecasting Methods
    Jeremy Bonios
    Fall '08


    TABLE OF CONTENTS

    LIST OF FIGURES AND TABLES ............................................. ii
    INTRODUCTION ............................................................ 1
    EXAMINING THE DATA AND MAKING FORECASTS ................................. 1
      Graphically Examining the Data ........................................ 1
      Naïve Forecasts ....................................................... 1
        Naïve 1 ............................................................. 2
        Naïve 2 ............................................................. 2
        Simple Exponential Smoothing ........................................ 2
        Holt's Exponential Smoothing ........................................ 2
        Winters' Exponential Smoothing ...................................... 3
        Adaptive-Response-Rate Single Exponential Smoothing ................. 3
      Regression Models ..................................................... 3
        Bivariate Regression ................................................ 3
        Multiple Regression ................................................. 4
      Time-Series Decomposition ............................................. 5
      ARIMA [Box-Jenkins] Forecasting Models ................................ 7
        ProCast ............................................................. 7
    CONCLUSIONS ............................................................. 8


    LIST OF FIGURES AND TABLES

    Figure 1 ..................... 9
    Figure 2 ..................... 9
    Figures 3, 4, 5, & 6 ......... 10
    Figures 7 & 8 ................ 11
    Figures 9 & 10 ............... 12
    Figures 11 & 12 .............. 13
    Figures 13 & 14 .............. 14
    Figure 15 .................... 14
    Figure 16 .................... 15
    Figures 17, 18, 19, & 20 ..... 16
    Figure 21 .................... 17
    Figure 22 .................... 19
    Figure 23 .................... 19
    Figure 24 .................... 19
    Figure 25 .................... 20
    Figure 26 .................... 20
    Figure 27 .................... 21
    Figure 28 .................... 21
    Figure 29 .................... 22

    Table 1 ...................... 9
    Table 2 ...................... 15
    Table 3 ...................... 16
    Table 4 ...................... 17
    Table 5 ...................... 18
    Table 6 ...................... 19
    Table 7 ...................... 22
    Table 8 ...................... 23


    EXECUTIVE SUMMARY

    Purpose of the Report

    The purposes of this report are to (1) provide the reader with an introduction to popular business forecasting models, and (2) compare and contrast the ability of those models to forecast a specific time series.

    Business Forecasting Methods

    Forecasting has become increasingly important in today's highly competitive global market. Firms make business decisions based on forecasts of future events, so the ability to better forecast the direction of a particular variable of interest is imperative to remaining competitive.

    This paper introduces the reader to commonly used forecasting methods, including: naïve forecasts, simple exponential smoothing, Holt's exponential smoothing, Winters' exponential smoothing, adaptive-response-rate single exponential smoothing, bivariate and multiple regression, time-series decomposition, ARIMA Box-Jenkins, and ForecastX ProCast. Additionally, comparisons are made between the various models by looking at how well each model fits the historic data being forecast [domestic sales of autos and light trucks from January 1976 to August 2008] as well as how well it is able to predict the future. The summary statistic root-mean-squared error is used as the yardstick for comparisons.

    Conclusions

    1. Although each time series is unique, decomposition models appear to produce relatively accurate forecasts while remaining intuitive and easy to explain to managers.

    2. Time-series decomposition best fit the historic data, while ARIMA Box-Jenkins produced the best forecast using a 12-month holdout period [Table 8]. However, the ARIMA forecast performed only slightly better than decomposition, and for the reasons just mentioned, decomposition is still the preferred method for this data.


    Business Forecasting Methods

    Presented to

    Professor Dheeriya
    Business Forecasting FIN 484.01

    California State University Dominguez Hills

    Prepared by

    Jeremy Bonios

    December 17, 2008


    BUSINESS FORECASTING METHODS

    INTRODUCTION

    The focus of this paper is on business forecasting techniques. Throughout this reading, you will be introduced to 10 distinct forecasting models. Comparisons between the various models will be made by looking at each model's retrospective and forecast root-mean-squared error [RMSE].

    Because there is a sufficient amount of data, a 12-month holdout period was used to evaluate forecast accuracy.

    The data forecast is domestic monthly auto and light-truck sales [sales] measured in thousands. The data used for the forecasts ranged from January 1976 to August 2008, although, as mentioned, a 12-month holdout period was used for evaluative purposes.
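    As a point of reference for the comparisons that follow, the sketch below shows how a 12-month holdout evaluation of this kind can be set up. It is illustrative only: it uses a synthetic stand-in series rather than the actual sales data, with RMSE as the yardstick.

        import numpy as np

        rng = np.random.default_rng(0)

        def rmse(actual, forecast):
            """Root-mean-squared error, the summary statistic used throughout this report."""
            actual = np.asarray(actual, dtype=float)
            forecast = np.asarray(forecast, dtype=float)
            return float(np.sqrt(np.mean((actual - forecast) ** 2)))

        # Synthetic stand-in for the real series: ~392 months of trending, seasonal data.
        months = np.arange(392)
        sales = 1000 + months + 150 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 80, 392)

        history, holdout = sales[:-12], sales[-12:]      # last 12 months form the holdout period

        # Example: score a trivial forecast that repeats the last observed value 12 times.
        flat_forecast = np.repeat(history[-1], 12)
        print("Forecast RMSE over holdout:", round(rmse(holdout, flat_forecast), 2))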

    EXAMINING THE DATA AND MAKING FORECASTS

    With some experience, you can simply look at a time-series plot of your data and determine which forecasting method(s) would best fit it. Once you have settled on a method or two, then comes the fun part: running your forecasts and determining how well they fit and forecast your data.

    Graphically Examining the Data

    The analysis began by plotting the time-series data with a line graph [Figure 1]. A linear long-term trend line was superimposed over the graph to look for stationarity or trend [either positive or negative] in the data. As is obvious from the graph, there is a long-term positive trend in the data.

    Next, I produced an ACF correlogram to confirm whether there was in fact a trend, and to look for seasonality [Table 1 & Figure 2]. Because the autocorrelations did not quickly fall to zero [in fact they never hit zero], I could conclude that there was both a positive trend and seasonality.

    Naïve Forecasts

    The simplest of all forecasting methods, naïve forecasts are based solely on the most recent observations of the variable of interest. Two variations of the naïve method were tested [referred to as Naïve 1 and Naïve 2], with a total of four distinct tests being done [Figures 3, 4, 5, & 6].


    Naïve 1:

    This forecast was produced using Excel. The forecast value is equal to the previous period's value. The formula for the forecast is:

    SalesF(t) = Sales(t-1)

    Although both conceptually easy to understand and calculate, Naïve 1 is useless if you want to forecast several periods into the future. Also, this forecast fails to take into consideration any direction [trend] in the data.
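    A minimal sketch of the Naïve 1 idea (in Python rather than the Excel sheet used for the report): each fitted value is simply the prior period's actual. The short sales list is taken from the first months in Table 5.

        import numpy as np

        def naive1(y):
            """Naive 1: the forecast for period t equals the actual value at t-1."""
            y = np.asarray(y, dtype=float)
            fcst = np.empty_like(y)
            fcst[0] = np.nan                 # no prior observation for the first period
            fcst[1:] = y[:-1]
            return fcst

        sales = [864.6, 973.3, 1216.1, 1163.2]           # Jan-Apr 1976, thousands of units
        print(naive1(sales))                             # [   nan  864.6  973.3 1216.1]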

    Naïve 2:

    Naïve 2 forecasts, although conceptually very similar to Naïve 1 forecasts, take into consideration some proportion of the most recently observed rate of change. The formula for the forecast is:

    SalesF(t) = Sales(t-1) + P × [Sales(t-1) − Sales(t-2)]

    I tried this model at various P values [the proportion of the change between periods], and included three [P @ .25, .50, and .75] in this report for comparison both within this model and against other models. I found that smaller P values produced lower RMSEs with my data.
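    The same idea with the rate-of-change adjustment, sketched below with the P values tested above; numbers are illustrative only.

        import numpy as np

        def naive2(y, p):
            """Naive 2: last actual plus a proportion p of the most recent change."""
            y = np.asarray(y, dtype=float)
            fcst = np.full_like(y, np.nan)
            fcst[2:] = y[1:-1] + p * (y[1:-1] - y[:-2])
            return fcst

        sales = [864.6, 973.3, 1216.1, 1163.2, 1176.1]   # Jan-May 1976
        for p in (0.25, 0.50, 0.75):
            print(p, np.round(naive2(sales, p), 1))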

    Simple Exponential Smoothing

    Simple exponential smoothing forecasts [for a value at any time] are the weighted average of allprevious actual values. They give more weight to the more recent observations, and less weightto older observations.

    The forecasts presented [Figures 7 & 8] were produced using ForecastX, with the smoothing constant [α] adjusted. This model produces better results when there is no trend or seasonality in the data. I included output from two forecasts [α = .9 and α = .5] to show that varying the smoothing constant affected the RMSE, and to show how this model compared against other models. This model performed slightly better than the Naïve 1 forecast, and much better than the three Naïve 2 forecasts. Because the series has both trend and seasonality, I did not expect this model to perform well [which it did not].
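    A sketch of the underlying recursion (not ForecastX's implementation): each forecast blends the previous actual and the previous forecast, weighted by the smoothing constant α.

        import numpy as np

        def simple_exp_smoothing(y, alpha):
            """Simple exponential smoothing: forecast(t) = alpha*actual(t-1) + (1-alpha)*forecast(t-1)."""
            y = np.asarray(y, dtype=float)
            fcst = np.empty_like(y)
            fcst[0] = y[0]                               # initialize with the first actual value
            for t in range(1, len(y)):
                fcst[t] = alpha * y[t - 1] + (1 - alpha) * fcst[t - 1]
            return fcst

        sales = [864.6, 973.3, 1216.1, 1163.2, 1176.1, 1224.9]
        print(np.round(simple_exp_smoothing(sales, alpha=0.9), 1))
        print(np.round(simple_exp_smoothing(sales, alpha=0.5), 1))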

    Holt's Exponential Smoothing

    An extension of simple exponential smoothing, Holt's [two-parameter] exponential smoothing produces forecasts that better fit data exhibiting a trend. Holt's introduces a second smoothing constant [γ] that adjusts for trend.

    I performed multiple forecasts [Figures 9 & 10], once again by adjusting constants, in this case α and the trend smoothing constant γ. Holt's exponential smoothing performed about as well as simple exponential smoothing, and better than Naïve 2. I included output with the smoothing constants set at the following levels: [α = .9, γ = .01] and [α = .5, γ = .01]. Although the forecasts accounted for trend, Holt's falls short because the data also exhibits seasonality.
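    A rough sketch of Holt's two equations, keeping this report's labels (α for level, γ for trend); the initialization and fitting details differ from ForecastX, so the output is illustrative.

        import numpy as np

        def holt(y, alpha, gamma, horizon=12):
            """Holt's two-parameter smoothing: a level equation plus a trend equation."""
            y = np.asarray(y, dtype=float)
            level, trend = y[0], y[1] - y[0]             # crude initialization
            fitted = []
            for t in range(len(y)):
                fitted.append(level + trend)             # one-step-ahead fitted value
                prev_level = level
                level = alpha * y[t] + (1 - alpha) * (level + trend)
                trend = gamma * (level - prev_level) + (1 - gamma) * trend
            forecasts = level + trend * np.arange(1, horizon + 1)
            return np.array(fitted), forecasts

        sales = [864.6, 973.3, 1216.1, 1163.2, 1176.1, 1224.9]
        fitted, future = holt(sales, alpha=0.5, gamma=0.01, horizon=3)
        print(np.round(future, 1))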

  • 8/14/2019 JBonios Forecasting Final

    8/28

    3 | P a g e

    Winters' Exponential Smoothing

    Another extension of the simple smoothing model, Winters' is used for data with both trend and seasonality, and introduces another smoothing constant [β] to adjust for seasonality.

    I performed multiple forecasts [Figures 11 & 12] by adjusting α, β, and γ, and found that Winters' produced the best-fitting and most accurate forecasts of the smoothing models. Through trial and error, I discovered that the forecasts with the lowest RMSE were obtained by leaving γ alone and allowing ForecastX to determine the best value for this constant; in all cases, ForecastX chose 0.00. With regard to α, smaller values [i.e. 0.3] produced lower RMSEs; and with regard to β, lower values [i.e. 0.22] produced lower RMSEs.

    Winters' also confirmed my interpretation of the correlogram [Table 1 & Figure 2], that there was a seasonal component to the data, through its seasonal-index output [i.e. index 1 @ 0.84, index 2 @ .92, index 3 @ 1.16, etc.].
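    A compact sketch of the multiplicative Winters' recursion with the same labeling as above (α level, β seasonal, γ trend); ForecastX's initialization and optimization differ, so the numbers below are purely illustrative.

        import numpy as np

        def winters(y, alpha, beta, gamma, period=12, horizon=12):
            """Multiplicative Holt-Winters: level, trend, and a seasonal index per month."""
            y = np.asarray(y, dtype=float)
            level = y[:period].mean()                            # crude initialization from year 1
            trend = (y[period:2 * period].mean() - y[:period].mean()) / period
            season = list(y[:period] / level)
            fitted = []
            for t in range(len(y)):
                s = season[t % period]
                fitted.append((level + trend) * s)
                prev_level = level
                level = alpha * (y[t] / s) + (1 - alpha) * (level + trend)
                trend = gamma * (level - prev_level) + (1 - gamma) * trend
                season[t % period] = beta * (y[t] / level) + (1 - beta) * s
            fcst = [(level + h * trend) * season[(len(y) + h - 1) % period]
                    for h in range(1, horizon + 1)]
            return np.array(fitted), np.array(fcst)

        # Usage sketch on a synthetic seasonal series:
        m = np.arange(120)
        demo = 1000 + 2 * m + 120 * np.sin(2 * np.pi * m / 12)
        fitted, future = winters(demo, alpha=0.3, beta=0.23, gamma=0.0)
        print(np.round(future[:3], 1))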

    Adaptive-Response-Rate Single Exponential Smoothing:

    A variant of simple exponential smoothing, adaptive-response-rate single exponential smoothing [ADRES] does not require one to choose an α value. This model assumes that there is little or no trend or seasonality in the data being forecast. I tried this model with β at various levels, and included output in this report with β at 0.1 and 0.2 [Figures 13 & 14]. The results [RMSE] were about the same as the simple exponential smoothing model, and for good reason: the ADRES equation is essentially the same as the equation for simple exponential smoothing, and it does not handle seasonality [unless the data is first deseasonalized].
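    A sketch of the adaptive-response idea: α is recomputed each period as the absolute ratio of the smoothed error to the smoothed absolute error, with β controlling how quickly those error measures adapt. Initialization details are assumptions, not ForecastX's.

        import numpy as np

        def adres(y, beta):
            """Adaptive-response-rate single exponential smoothing: alpha adapts to recent errors."""
            y = np.asarray(y, dtype=float)
            fcst = np.empty_like(y)
            fcst[0] = y[0]
            smoothed_err, smoothed_abs_err, alpha = 0.0, 0.0, beta
            for t in range(1, len(y)):
                fcst[t] = alpha * y[t - 1] + (1 - alpha) * fcst[t - 1]
                err = y[t] - fcst[t]
                smoothed_err = beta * err + (1 - beta) * smoothed_err
                smoothed_abs_err = beta * abs(err) + (1 - beta) * smoothed_abs_err
                if smoothed_abs_err != 0:
                    alpha = abs(smoothed_err / smoothed_abs_err)
            return fcst

        sales = [864.6, 973.3, 1216.1, 1163.2, 1176.1, 1224.9, 1130.1, 994.9]
        print(np.round(adres(sales, beta=0.2), 1))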

    Regression Models

    Although the underlying economic theory for sales is rooted in supply and demand, I felt thatregression-based forecasting would be best accomplished by narrowing my focus on demand-side variables. My reasoning is that this market/industry is highly saturated, and automakerscan collectively meet all demand. Over the next section, I will introduce forecasts based uponrelationships with a single independent variable [bivariate regression] and with multipleindependent variables [multiple regression].

    Bivariate Regression

    My analysis began by first considering the variables that I thought might contribute to demand.

    Those variables included, but were not limited to: Time [T], Interest Rate(s) [IR], Interest Rate on New Cars and Trucks [IRNC], Disposable Personal Income [DPI], Consumer Price Index for Gas [CPIG], Population Estimates [POP], and Number of Unemployed [NOU]. Next, I performed bivariate linear regression on each of the variables, and compared each model's historic and forecast RMSEs [Figure 15]. As no great surprise, the model with the lowest historic RMSE @ 160.72 had IRNC as its independent variable. In contrast, the model with the lowest forecast RMSE @ 116.72 had NOU as its independent variable. That said, the bivariate model that performed the best all around was the IR model.
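    For readers outside ForecastX, a bivariate regression of this kind reduces to ordinary least squares with one explanatory variable. The sketch below fits Sales = b0 + b1 × X; the interest-rate values are invented, purely for illustration.

        import numpy as np

        def bivariate_fit(x, y):
            """Ordinary least squares for Sales = b0 + b1 * X (one independent variable)."""
            x = np.asarray(x, dtype=float)
            y = np.asarray(y, dtype=float)
            b1, b0 = np.polyfit(x, y, deg=1)             # slope, intercept
            fitted = b0 + b1 * x
            return b0, b1, fitted

        # Invented illustrative values: an interest-rate series and the matching sales figures.
        rate  = [7.1, 7.3, 6.9, 6.5, 6.8, 7.0]
        sales = [864.6, 973.3, 1216.1, 1163.2, 1176.1, 1224.9]
        b0, b1, fitted = bivariate_fit(rate, sales)
        print(round(b0, 1), round(b1, 1))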


    Multiple Regression

    Next, I performed multiple regression using the independent variables already discussed, plus additional, potentially relevant independent variables that I thought would produce forecasts with better fit and accuracy. The added variables were either additional time-series data I found from online resources or dummy variables, and they produced mixed results.

    Seasonal Dummy Variables: Based upon the seasonality that I saw in the data, I incorporated seasonal dummy variables. The dummy variables did in fact capture this seasonality, and dramatically improved my summary statistics.
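    A sketch of how monthly seasonal dummies (and an event dummy) can be constructed with pandas; the date range for the oil-crisis indicator is illustrative, not the exact months used in the report.

        import pandas as pd

        dates = pd.date_range("1976-01-01", "2008-08-01", freq="MS")      # month-start index
        frame = pd.DataFrame({"month": dates.month}, index=dates)

        # Feb-Dec dummies, with January left out as the base month.
        dummies = pd.get_dummies(frame["month"], prefix="M").drop(columns="M_1")

        # Event dummy: 1s around the 1979 oil crisis (the dates here are illustrative only).
        frame["OIL_CRISIS"] = ((dates >= "1979-01-01") & (dates <= "1980-12-31")).astype(int)

        print(dummies.head(3))
        print(frame["OIL_CRISIS"].sum(), "months flagged")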

    September 11th: Working backwards, September 11, 2001 [9/11] did not seem to influence sales terribly. Reflecting on the events that took place immediately after the 9/11 attacks, I remember that interest rates dropped in an effort to stimulate purchases and the economy. Any negative effects the attacks had on the economy were counter-balanced by these reduced interest rates. My conclusion was that either 9/11 did not affect sales in a meaningful way or I did not accurately capture its impact in my models.

    Black Monday, 1987: Looking at the data, I noticed that there was a significant drop in sales that lasted for several periods. In reflecting on the time-series data, and then confirming my assumptions online, I saw that the stock market crash of 1987, called Black Monday, coincided with this drop in my data. I created a dummy variable and attempted to capture Black Monday's effects by populating the dummy variable with 1s plus/minus this event [Figure 16]. Much to my frustration, I was unable to get this dummy variable to contribute to my models [higher AIC/BIC, lower Adjusted R-square, etc.]. Frustrated and defeated, I left this variable and moved on.

    1979 Oil Crisis: We have a winner! I created a dummy variable to capture a historic event that had a direct impact on auto and light-truck sales: the 1979 Oil Crisis [Figure 16]. After creating the variable, I populated it with 1s around the time of the crisis. Its effects on my models were statistically significant [P-value less than 0.05] and contributed to improved summary statistics [lower AIC/BIC, higher Adjusted R-square, etc.]. All but one of my best performing multiple regression models incorporate this variable.

    Testing With Various Combinations of Independent Variables: All told, I probably ran 100 or more forecasts using various combinations of independent variables. Although my pool of available variables often captured the same, or very similar, data [such as interest rates and interest rates on new cars], in an effort to reduce multicollinearity I tried to keep similar variables out of any one given model. I compared each model [Table 2, Figures 17, 18, 19, & 20] looking at the following pieces of forecast output: P-values of the independent variables [if greater than 0.05]; AIC and BIC; Adjusted R-Square; Durbin-Watson statistic; and correlation between independent variables.

    Settling on Several Models and Comparing Fit and Accuracy: Although I ran more than 100 models, most of them were junk. I kept the best performing models [Table 2, Figures 17, 18, 19, & 20] and kept tweaking by adding, subtracting, or replacing variables. Of this narrowed-down list, I focused on five models, identified as models N, O, P, Q, and R [Table 3].

    I then compared these five models by comparing their summary statistics and their historic and forecast RMSEs [Figure 21]. Comparing historic RMSEs was the easy part, since ForecastX provided this output. The laborious part was comparing the models' forecast RMSEs. I accomplished this


    by first running Holt's forecasts for 12 periods on each of the independent variables found collectively in my five final models. I then created forecasts by taking each model's formula [created by ForecastX] and replacing the variables with the Holt's-generated variable forecasts [Table 4].
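    The two-stage procedure just described can be sketched as follows; the coefficients and regressor values are placeholders, not those of models N through R.

        import numpy as np

        def holt_forecast(y, alpha=0.5, gamma=0.01, horizon=12):
            """Project a regressor 12 months ahead with Holt's smoothing (stage 1)."""
            y = np.asarray(y, dtype=float)
            level, trend = y[0], y[1] - y[0]
            for t in range(1, len(y)):
                prev = level
                level = alpha * y[t] + (1 - alpha) * (level + trend)
                trend = gamma * (level - prev) + (1 - gamma) * trend
            return level + trend * np.arange(1, horizon + 1)

        # Stage 2: plug the projected regressors into the fitted regression equation.
        regressors = {"DPI": [9000.0, 9050.0, 9120.0, 9180.0], "IR": [6.1, 6.0, 5.8, 5.9]}  # invented
        coeffs = {"const": -500.0, "DPI": 0.2, "IR": -10.0}                                  # invented
        future = {name: holt_forecast(series) for name, series in regressors.items()}
        sales_forecast = coeffs["const"] + sum(coeffs[name] * future[name] for name in regressors)
        print(np.round(sales_forecast, 1))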

    Model O Is the Winner, Sort Of: Although all five of my finalists performed quite similarly, especially with regard to their historic RMSE, Model O produced both the best historic and forecast RMSEs [82.02 and 193.01 respectively]. Although Model O beat out all other multiple regression models, it performed poorly when compared to two of the bivariate regression model forecasts [IR @ 137.79 and NOU @ 116.72].

    So, what does this mean? Quite frustrating is what it means. Although my complex models fit the data quite well, they do not appear to forecast as accurately as my simpler bivariate models. Such is the life of a forecaster. Back to the drawing board.

    Time-Series Decomposition

    Time-series decomposition models are used to identify the underlying components of a time series, including the long-term trend [T], seasonality [S], cyclical movements [C], and irregular fluctuations [I]. Decomposition can be understood with the following formula:

    Sales = T × S × C × I

    Once deconstructed, the individual components can be reconstructed to produce excellentforecasts. In fact, the best forecasts that I have produced with the data thus far, taking both fitand accuracy into consideration, were produced with decomposition.

    The Process Deconstructed: I will now take you through the process of taking the raw data and from it producing a time-series decomposition forecast, and conclude with an assessment of its forecasting prowess compared to the other models.

    Deseasonalizing The Data And Finding Seasonal Indices: The first step was to remove the short-term fluctuations in order to reveal the long-term trend and cyclical components of the data. This was accomplished by calculating an appropriate moving average for the series. Because I collected monthly data [domestic sales of cars and light trucks], I used a 12-period moving average. The moving average was calculated as follows [Table 5]:

    MA = (Sales(1) + Sales(2) + … + Sales(11) + Sales(12)) / 12

    Since the moving average spans an even number of observations, the moving averages are not really centered in the middle of each year. To adjust for this, a two-period moving average of the moving averages, the centered moving average [CMA], was calculated as follows [Table 5 Label 2, Figure 23]:

    CMA(t) = (MA(t) + MA(t+1)) / 2

    By comparing the actual values of the series, Sales [Yt], with the deseasonalized data [CMA], the degree of seasonality, called the seasonal factor [SF], was revealed. The calculation is as follows [Table 5]:

    SF = Sales / CMA

    Next, I found the average seasonal factor for each month across the entire series


    [Table 5]. This average became the seasonal index [SI]. The calculation involves summing the seasonal factors for a given month across the series and dividing that sum by the number of observations.

    A seasonal index greater than 1 indicates a period of sales greater than the yearly average, while the reverse is true when SI is less than 1. Looking at my data [Table 6 & Figure 22], you can see that the indices for January, February, September, October, November, and December are less than 1, while those for March through August are greater than 1. Upon further analysis, we can see that March, May, and June on average have the greatest sales [SI @ 1.11, 1.12, and 1.11, respectively].
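    The deseasonalizing steps above translate almost directly into pandas; the following is a sketch, assuming a monthly pandas Series (the demo series here is synthetic, not the sales data).

        import numpy as np
        import pandas as pd

        def seasonal_indices(sales):
            """12-month moving average, centered moving average (CMA), seasonal factors
            SF = Sales / CMA, and seasonal indices SI (average SF per calendar month)."""
            ma = sales.rolling(window=12).mean()                 # trailing 12-month average
            # Average two successive 12-month averages, then shift so the result is
            # centered on the month it describes.
            cma = ((ma + ma.shift(-1)) / 2).shift(-5)
            sf = sales / cma
            si = sf.groupby(sales.index.month).mean()
            return cma, sf, si

        idx = pd.date_range("1976-01-01", periods=48, freq="MS")
        demo = pd.Series(1000 + 100 * np.sin(2 * np.pi * np.arange(48) / 12), index=idx)
        cma, sf, si = seasonal_indices(demo)
        print(si.round(2))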

    Finding the Long-Term Trend: The long-term trend [CMAT] was calculated from the deseasonalized data using ForecastX multiple regression, with a time index as the independent variable and CMA as the dependent variable [Table 5 Label 1]. The resulting formula was:

    CMAT = 1,031.87 + 0.995478 × (Time Index)

    Measuring the Cyclical Component: The cyclical component is the wavelike movement about the long-term trend [CMAT]. It is measured by a cycle factor [CF] using the following equation:

    CF = CMA / CMAT

    The above equation was applied to each period, with exceptions. When creating the centered moving average, multiple periods were lost [six in total]. Additionally, because I had a 12-month holdout period, I lost an additional six periods of CMA. In all, I needed a method to forecast 12 periods of CF [9/2007 through 8/2008], and used ForecastX's Holt-Winters model [Table 5 Label 2].

    A cycle factor greater than one indicates that sales for that period are above the long-term trend in the data. On my graph [Figure 24] I indicated an expansion phase [beginning trough A to peak B] and a contraction phase [peak B to ending trough C]. The expansion and contraction phases presented in my example are not exhaustive, as is obvious from the graph. Although difficult to analyze and project into the forecast period, the long-term trend can help us to anticipate the next turning point in the cycle.

    The Time-Series Decomposition Forecast: To forecast sales [Table 5], I simply multiplied the long-term trend [CMAT] by the seasonal index [SI] by the cycle factor [CF]. The equation is:

    Sales Forecast = CMAT × SI × CF

    You can see from the graph [Figure 25] how well time-series decomposition forecasts sales. I have included for comparison a graph produced by forecasting sales using the Winters' model [Table 5 Label 3, Figure 26]. The decomposition model produced both a low historic RMSE @ 79.44 [fit] and a low forecast RMSE @ 108.73 [accuracy].
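    Recombining the pieces is just the multiplication in the equation above; in the sketch below the trend coefficients are those reported for CMAT, while the seasonal-index dictionary and the cycle factor passed in are illustrative values.

        def decomposition_forecast(time_index, month, si, cf,
                                   intercept=1031.87, slope=0.995478):
            """Sales forecast = CMAT * SI * CF, using the trend line reported above."""
            cmat = intercept + slope * time_index
            return cmat * si[month] * cf

        si = {1: 0.81, 2: 0.90, 3: 1.11}          # January-March seasonal indices from Table 6
        # Illustrative call: time index 385 corresponds to Jan-08 in Table 5; cf is assumed.
        print(round(decomposition_forecast(385, 1, si, cf=0.92), 1))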


    ARIMA [Box-Jenkins] Forecasting Models

    ARIMA Box-Jenkins is a forecasting model which looks only at the pattern of the data being forecast. Rather than attempting to explain our data by incorporating independent variables as we do with multiple regression, or by deconstructing the data into its constituent parts as we do with decomposition, with ARIMA we instead focus our attention on the time series itself. We pass the observed time series through a black box and examine the resulting time series. If the correct black box is chosen, only white noise remains.

    ForecastX is able to calculate ARIMA Box-Jenkins with little effort. Using the software, I was able to generate forecasts that both fit the historic data and forecast the holdout period with accuracy, with RMSEs at 104.29 and 103.31 respectively [Figure 27, Table 7].

    ProCast

    Included in the ForecastX software package is a feature known as ProCast that enables users to perform one-step forecasts. After selecting your time series, you simply instruct ForecastX to run ProCast, and the software will choose from a variety of forecasting techniques and settle on one based upon summary statistics and the particular characteristics of the data being analyzed. The software's authors claim that ProCast will choose the best model. I decided to put the claims to the test.

    I ran ProCast and the software chose Winters' [Figure 28] with the following smoothing constants: level at 0.28, seasonal at 0.23, and trend at 0.00. The historic and forecast RMSEs were 92.30 and 140.30 respectively [Table 7]. Compared to ARIMA, ProCast fit the data better, with a lower historic RMSE. However, ARIMA forecast better than ProCast, with a significantly lower forecast RMSE [ARIMA at 103.31, ProCast at 140.30].

    My conclusion: although ProCast did choose a well performing model, both in a retrospective and a forecast sense, other models introduced in this paper performed better.
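    For readers without ForecastX, an ARIMA(1,0,0) model of the kind listed in Table 8 can be fit with the statsmodels package; this is a sketch on a synthetic stand-in series, and ForecastX's estimation details may differ.

        import numpy as np
        from statsmodels.tsa.arima.model import ARIMA      # assumes statsmodels is installed

        rng = np.random.default_rng(1)
        months = np.arange(380)
        history = 1000 + months + 150 * np.sin(2 * np.pi * months / 12) + rng.normal(0, 60, 380)

        # Fit an ARIMA(1,0,0) model to the history and forecast the 12-month holdout period.
        result = ARIMA(history, order=(1, 0, 0)).fit()
        holdout_forecast = result.forecast(steps=12)
        print(np.round(holdout_forecast[:3], 1))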


    CONCLUSIONS

    The basic time-series decomposition model produced the best fit, with a historic RMSE of 79.44, while ARIMA Box-Jenkins forecast best, with a forecast RMSE of 103.31. However, ARIMA's forecast RMSE was only slightly better than decomposition's.

    If I were to settle on one forecast model for this time series, it would be time-series decomposition, because of how well it performs overall and because of its intuitiveness as a model.

    A graph comparing each model's forecast [Figure 29], as well as a table comparing each model's historic and forecast RMSEs [Table 8], is provided for your review.


    Figure 1
    This graph shows domestic cars and light trucks sold [in thousands] from January 1976 through August 2008, along with a long-term upward trend line. The blue line shows the actual vehicles sold. The straight black line shows the long-term positive trend of vehicles sold.

    Table 1 & Figure 2
    ACF Values for Domestic Sales of Autos and Light Trucks. All coefficients are outside the 90% confidence band, indicating the positive trend in sales.


    Obs    ACF
     1    .7136
     2    .6275
     3    .5576
     4    .4338
     5    .4104
     6    .3552
     7    .3890
     8    .4083
     9    .5285
    10    .5669
    11    .6291
    12    .7917


    Figures 3, 4, 5, & 6
    Four Naïve Forecasts for Domestic Sales of Autos and Light Trucks. Naïve Forecast 1 is simply the previous actual value of sales, that is, SalesF(t) = Sales(t-1). Naïve Forecast 2 takes into account the change between previous periods. Three Naïve 2 forecasts are shown with P @ .25, .50, and .75, producing historic RMSEs of 172.73, 196.57, and 224.65 respectively. Naïve Forecast 1 fits the actual data best, with a historic RMSE of 155.06.


    Figures 7 & 8
    Two Simple Exponential Smoothing Forecasts of domestic cars and light trucks sold [in thousands] from January 1976 through August 2008. The top graph is of a forecast with an alpha of 0.9 and a forecast RMSE of 240.71. The bottom graph is of a forecast with an alpha of 0.5, selected to minimize the forecast RMSE, at 213.22.


    Figures 9 & 10
    Two Holt's Exponential Smoothing Forecasts of domestic cars and light trucks sold [in thousands] from January 1976 through August 2008. The top graph is of a forecast with an alpha of 0.9, a gamma of 0.01, and a forecast RMSE of 240.71. The bottom graph is of a forecast with an alpha of 0.5, a gamma of 0.01, and an improved forecast RMSE of 220.20.


    Figures 11 & 12
    Two Winters' Exponential Smoothing Forecasts of domestic cars and light trucks sold [in thousands] from January 1976 through August 2008. The top graph is of a forecast with an alpha of 0.5, a beta of 0.33, a gamma of 0.0, and a forecast RMSE of 142.59. The bottom graph is of a forecast with an alpha of 0.3, a beta of 0.23, a gamma of 0.0, and an improved forecast RMSE of 139.78.


    Figures 13 & 14
    Two Adaptive-Response-Rate Single Exponential Smoothing Forecasts of domestic cars and light trucks sold [in thousands] from January 1976 through August 2008. The top graph is of a forecast with a beta of 0.2 and a forecast RMSE of 236.47. The bottom graph is of a forecast with a beta of 0.1 and an improved forecast RMSE of 215.50.

    Figure 15
    Comparison of bivariate regression models, each with a different independent variable chosen. T = Time Index; IR = Interest Rate; DPI = Disposable Personal Income; IRNC = Interest Rate New Cars; CPIG = Consumer Price Index Gasoline; POP = National Population Estimates; NOU = Number of Unemployed.


    Bivariate Regression Models
                       T        IR       DPI      IRNC     CPIG     POP      NOU
    Forecast RMSE    214.81   137.79   247.82   208.37   307.96   221.47   116.72
    Historic RMSE    175.58   179.27   171.91   160.72   197.37   175.06   191.65


    Figure 16
    Domestic cars and light trucks sold [in thousands] from January 1976 through August 2008, along with events that I thought, if captured, could contribute to improved multiple regression forecast(s). Starting from the left is the 1979 Oil Crisis; its effects on my models were statistically significant and contributed to improved models. However, I was unable to capture the effects of the 1987 stock market crash [Black Monday] in a meaningful way.

    Table 2
    Comparison of various multiple regression models using the following evaluative criteria: AIC, BIC, Adjusted R-Square, Durbin-Watson, historic RMSE, and the number of variable P-values greater than 0.05. The models that produced the best results [relative to the other models tested] are highlighted in yellow at the bottom of the table.


    Multiple Regression Models for Consideration
    Model      AIC        BIC     Adjusted R-Square   Durbin-Watson    RMSE    P-values > .05
    A        4,790.57   4,794.51       58.55%             0.95        131.87         1
    B        4,741.25   4,745.19       63.60%             1.02        123.58         0
    C        4,676.55   4,680.49       69.21%             1.16        113.50         1
    D        4,606.96   4,610.90       74.29%             1.35        103.57         0
    E        4,598.74   4,602.68       74.77%             1.39        102.45         0
    F        4,534.90   4,538.84       78.62%             1.46         94.20         1
    G        4,984.48   4,988.42       33.16%             0.57        170.20         1
    H        4,593.98   4,597.92       75.02%             1.35        101.81         1
    I        4,522.68   4,526.62       79.29%             1.36         92.70         1
    J        4,548.83   4,552.77       77.82%             1.35         95.94         1
    K        4,535.03   4,538.97       78.67%             1.46         94.21         0
    L        4,529.01   4,532.95       79.00%             1.48         93.47         0
    M        4,523.09   4,527.03       79.27%             1.54         92.75         0
    N        4,464.04   4,467.98       82.21%             1.34         85.81         0
    O        4,431.36   4,435.30       83.58%             1.28         82.20         2
    P        4,433.89   4,437.83       83.56%             1.27         82.48         0
    Q        4,429.66   4,433.60       83.74%             1.24         82.02         1
    R        4,433.42   4,437.36       83.63%             1.20         82.42         0


    Figures 17, 18, 19, & 20
    Comparisons of the various multiple regression models tested. Chart 1 presents a comparison of each model's AIC and BIC statistics. Chart 2 shows each model's Durbin-Watson statistic. Chart 3 contrasts each model's Adjusted R-Square. Chart 4 compares historic RMSEs, with a lower number indicating a better-fitting model.

    Table 3
    List of all the variables considered for multiple regression, with the variables used in the best performing models identified with check marks. The table presents all of the variables used in the models tested. The best performing models, identified as models N, O, P, Q, and R, are shown with check marks indicating which variables were used in each respective model.


    Variables Considered For Models [Defined / Abbr.]:
    Time Index [T]; Time Index² [T2]; Interest Rates [IR]; LIBOR Rates 1-Month [LIBOR1]; LIBOR Rates 3-Month [LIBOR3]; LIBOR Rates 6-Month [LIBOR6]; Consumer Installment Credit [CIC]; Consumer [Nonrevolving] Installment Credit [CICNR]; Savings Deposits [SD]; Loan-to-Value Ratio on New Car [L2VNC]; Interest Rate New Car Loan [IRNC]; Income: Total Personal [ITC]; Personal Consumption Expenditures [PCE]; Disposable Personal Income [DPI]; Disposable Personal Income² [DPI2]; Personal saving as a % of DPI [PSDPI]; Number of Unemployed [NOU]; Unemployment rate, 20 yrs. & over, Male [UR20]; CPI: All items [CPI]; Foreign exchange rate: Japan [FERJP]; CPI: Used cars and trucks [CPIUCT]; CPI: Gasoline (all types) [CPIG]; CPI: Public transportation [CPIPT]; U.S. Dollar: Major Currencies Index, Nominal [EVUD]; Standard & Poor's (S&P) [SPPI]; Monthly National Population Estimates [POP]; Monthly National Population Estimates² [POP2]; Seasonal Dummy [FEB thru DEC]; September 11th Effect [SEP11]; Black Monday 1987 [BLK]; Oil Crisis 1979 [OIL CRISIS]

    Total Number of Variables Used in Model: N = 9, O = 11, P = 9, Q = 8, R = 6


    Figure 21
    Comparison of historic and forecast RMSEs for the best performing multiple regression models. For a complete list of the variables used in each model, refer to Table 3. Model Q produced both the smallest historic RMSE @ 82.02 and the smallest forecast RMSE @ 193.01.

    Table 4
    Summary of the multiple regression formulas produced by each of the best performing models. The formulas presented were created using ForecastX. To produce forecasts, the variables need to be replaced with the Holt's-generated forecasts for each independent variable.

    Multiple Regression Models
                       N        O        P        Q        R
    Forecast RMSE    250.45   228.16   254.14   193.01   216.60
    Historic RMSE     85.81    82.20    82.48    82.02    82.42

    Best Performing Multiple Regression Models

    Model N: Sales = -661.67 + ((T)*-9.57) + ((IR)*-11.73) + ((DPI)*0.578977) + ((PSDPI)*-41.28) + ((NOU)*-0.022799) + ((CPIUCT)*6.95) + ((SPPI)*-0.169739) + ((FEB)*117.82) + ((MAR)*350.41) + ((APR)*257.91) + ((MAY)*355.94) + ((JUN)*335.75) + ((JUL)*240.66) + ((AUG)*234.44) + ((SEP)*158.67) + ((OCT)*177.88) + ((NOV)*78.21) + ((DEC)*128.69) + ((OIL CRISIS)*58.15)

    Model O: Sales = -1762.86 + ((T)*-11.89) + ((IR)*-5.96) + ((CIC)*-0.557691) + ((DPI)*0.968897) + ((PSDPI)*-65.53) + ((NOU)*-0.008512) + ((CPIUCT)*6.3) + ((SPPI)*-0.204488) + ((POP2)*0) + ((FEB)*117.28) + ((MAR)*350.16) + ((APR)*258.29) + ((MAY)*357.94) + ((JUN)*335.95) + ((JUL)*240.54) + ((AUG)*232.28) + ((SEP)*158.13) + ((OCT)*179.27) + ((NOV)*80.29) + ((DEC)*123.57) + ((OIL CRISIS)*51.26)

    Model P: Sales = -1965.43 + ((T)*-12.49) + ((IR)*-4.66) + ((CIC)*-0.625548) + ((DPI)*1.03) + ((PSDPI)*-71.59) + ((CPIUCT)*6.45) + ((SPPI)*-0.195896) + ((FEB)*117.3) + ((MAR)*350.46) + ((APR)*258.3) + ((MAY)*358.35) + ((JUN)*335.29) + ((JUL)*239.79) + ((AUG)*230.94) + ((SEP)*157.31) + ((OCT)*178.98) + ((NOV)*80.07) + ((DEC)*122.74) + ((OIL CRISIS)*59.4)

    Model Q: Sales = -1951.71 + ((T)*-12.15) + ((CIC)*-0.607095) + ((DPI)*1.03) + ((PSDPI)*-72.56) + ((CPIUCT)*5.88) + ((CPIG)*-0.687004) + ((SPPI)*-0.227034) + ((FEB)*117.34) + ((MAR)*351.08) + ((APR)*261.05) + ((MAY)*363.86) + ((JUN)*341.05) + ((JUL)*245.8) + ((AUG)*236.62) + ((SEP)*162.53) + ((OCT)*182.43) + ((NOV)*82.2) + ((DEC)*123) + ((OIL CRISIS)*40.9)

    Model R: Sales = -1934.37 + ((T)*-12.19) + ((CIC)*-0.617483) + ((DPI)*1.04) + ((PSDPI)*-74.05) + ((CPIUCT)*5.7) + ((CPIG)*-0.708253) + ((SPPI)*-0.22771) + ((FEB)*117.22) + ((MAR)*351.05) + ((APR)*259.79) + ((MAY)*363.01) + ((JUN)*340.15) + ((JUL)*246.35) + ((AUG)*236.95) + ((SEP)*162.96) + ((OCT)*182.97) + ((NOV)*82.79) + ((DEC)*123.54)


    Table 5
    Time-Series Decomposition of Sales of Autos and Light Trucks. The long-term trend [CSCMAT], identified by Label 1, was calculated from the deseasonalized data using ForecastX multiple regression with a time index as the independent variable and CMA as the dependent variable. The values of the cycle factor [CF] from 9/2007 through 8/2008 were forecast using the ForecastX Holt-Winters model [see Label 2]. Also included in this table is a forecast of auto and light-truck sales using ForecastX Holt-Winters, to contrast with the decomposition forecast [see Label 3].

    [Table 5 data: monthly rows from Jan-76 through Aug-08 with columns Date, Sales, Time Index, MA, CMA, CSCMAT, SF, SI, CF, Decomposition Forecast (squared error), and Winters' Forecast (squared error). Decomposition: historic RMSE 79.44, forecast RMSE 108.73. Winters': historic RMSE 92.30, forecast RMSE 140.30.]


    Table 6 & Figure 22
    Seasonal Indices for Sales of Autos and Light Trucks. A seasonal index SI > 1 indicates a period of sales greater than the yearly average, while the reverse is true when SI < 1.

    Figure 23
    Sales of Autos and Light Trucks with Centered Moving Average [CMA] and Centered Moving-Average Trend [CMAT]. The dotted line [CSCMAT] shows the long-term positive trend in sales. The trend line equation is: CMAT = 1,031.87 + 0.995 × T. The lighter colored jagged line is the raw data [Sales], while the thinner wavelike line is the deseasonalized data [CMA].

    Figure 24
    Cycle Factor for Sales of Autos and Light Trucks. The cycle factor [CF] is found by dividing the centered moving average [CMA] by the centered moving-average trend [CMAT]. The cycle factor moves slowly around the base line [1.00]. The period between [A] and [B] is called an expansion phase, while the period between [B] and [C] is called a contraction phase.

    Month    SI
    Jan     0.81
    Feb     0.90
    Mar     1.11
    Apr     1.04
    May     1.12
    Jun     1.11
    Jul     1.03
    Aug     1.03
    Sep     0.94
    Oct     0.96
    Nov     0.87
    Dec     0.91


    Figure 25
    Time-Series Decomposition Forecast for Sales of Autos and Light Trucks. This graph shows forecast domestic sales of autos and light trucks. Actual sales from January 1976 through August 2007 were used to develop the forecast through August 2008. These results were obtained by time-series decomposition.

    Figure 26
    Winters' Forecast for Sales of Autos and Light Trucks. This graph shows forecast domestic sales of autos and light trucks. Actual sales from January 1976 through August 2007 were used to develop the forecast through August 2008. These results were obtained using the Holt-Winters method in ForecastX.


    Figure 27
    Box-Jenkins Forecast for Sales of Autos and Light Trucks. This graph shows forecast domestic sales of autos and light trucks. Actual sales from January 1976 through August 2007 were used to develop the forecast through August 2008. These results were obtained using the ARIMA Box-Jenkins method in ForecastX.

    Figure 28
    ProCast Forecast for Sales of Autos and Light Trucks. This graph shows forecast domestic sales of autos and light trucks. Actual sales from January 1976 through August 2007 were used to develop the forecast through August 2008. These results were obtained by selecting ProCast in ForecastX. The method chosen by the software was Winters' with the following smoothing constants: level @ 0.28, seasonal @ 0.23, and trend @ 0.00.


    Table 7
    Comparison of ARIMA Box-Jenkins and ProCast Forecasts for Sales of Autos and Light Trucks. This table shows forecast domestic sales of autos and light trucks produced by the ARIMA Box-Jenkins and ProCast methods. Box-Jenkins produced a forecast with a lower [better] forecast RMSE; however, ProCast produced a forecast resulting in a lower historic RMSE.

    Figure 29
    Comparison of All Introduced Forecasting Models [forecast RMSEs for the holdout period]. Decomposition fit the historic data best with the smallest RMSE @ 79.44. However, ARIMA Box-Jenkins produced the best forecast [holdout period] with an RMSE @ 103.31.

    Date     Actual      Box-Jenkins Forecast    Error²      ProCast Forecast    Error²
    Sep-07   1,310.40         1,348.40           1,443.97        1,268.74         1,735.27
    Oct-07   1,226.60         1,234.06              55.64        1,243.08           271.73
    Nov-07   1,175.40         1,213.20           1,429.11        1,169.79            31.51
    Dec-07   1,384.50         1,377.81              44.74        1,344.44         1,604.89
    Jan-08   1,039.10         1,128.64           8,017.58        1,059.14           401.75
    Feb-08   1,171.80         1,246.83           5,629.30        1,215.18         1,881.91
    Mar-08   1,351.50         1,454.63          10,634.89        1,463.35        12,511.42
    Apr-08   1,243.50         1,305.19           3,805.33        1,355.87        12,628.00
    May-08   1,392.90         1,470.09           5,958.97        1,487.72         8,991.51
    Jun-08   1,185.30         1,390.85          42,250.92        1,471.94        82,161.61
    Jul-08   1,132.20         1,284.37          23,154.84        1,419.65        82,626.61
    Aug-08   1,246.00         1,406.17          25,652.87        1,423.13        31,375.73

    Forecast RMSE               103.31                             140.30
    Historic RMSE               104.29                              92.30


    Table 8
    Comparison of All Introduced Forecasting Models. Decomposition fit the data best with the smallest historic RMSE @ 79.44. However, ARIMA Box-Jenkins produced the best forecast with an RMSE @ 103.31 [about 5 points lower than the decomposition forecast].

    Forecasting Model                        Historic RMSE   Forecast RMSE
    Naïve Forecast 1                             155.26           na
    Naïve Forecast 2 (p = .50)                   196.57           na
    Naïve Forecast 2 (p = .25)                   172.73           na
    Naïve Forecast 2 (p = .75)                   224.65           na
    Simple Exponential (α = .9)                  149.93          240.71
    Simple Exponential (α = .5)                  143.15          213.22
    Holt's (α = .9, γ = .01)                     150.62          249.24
    Holt's (α = .5, γ = .01)                     143.85          220.20
    Winters' (constants blank)                    92.30          140.30
    Winters' (α = .5, β = .33)                    95.03          142.59
    Winters' (α = .3, β = .23)                    92.34          139.78
    Winters' (α = .2, β = .22)                    93.15          143.66
    Adaptive Response (β = .1)                   144.67          215.50
    Adaptive Response (β = .2)                   142.44          236.47
    Bivariate [T]                                175.58          214.81
    Bivariate [IR]                               179.27          137.79
    Bivariate [DPI]                              171.91          247.82
    Bivariate [IRNC]                             160.72          208.37
    Bivariate [CPIG]                             197.37          307.96
    Bivariate [POP]                              175.06          221.47
    Bivariate [NOU]                              191.65          116.72
    Multiple Regression [Model N]                 85.81          250.45
    Multiple Regression [Model O]                 82.20          228.16
    Multiple Regression [Model P]                 82.48          254.14
    Multiple Regression [Model Q]                 82.02          193.01
    Multiple Regression [Model R]                 82.42          216.60
    Decomposition                                 79.44          108.73
    ARIMA Box-Jenkins [1,0,0]                    104.29          103.31
    ProCast                                       92.30          140.30