Fast sequential Bayesian prediction of football using mixtures of state space models

Gareth Ridall, Lancaster University, UK
Anthony Pettitt, QUT, Australia
email: [email protected]

June 29, 2019
Table of contents
1 Introduction and background
2 The likelihood and priors
3 Sequential updating of a state space model
4 Dynamic model averaging
Abstract
We look at the history of Premier League football over the last 23 years using a mixture of state space models whose states are defensive and attacking form (the ability to restrict and to score goals).

We show that the seasons vary considerably in volatility. Some seasons are predictable, with teams maintaining their form throughout. In other seasons the form of the teams varies substantially, with some teams exhibiting sharp changes of style and form.

To model this we use a mixture of models that accommodates both sequences of games of predictable form and sequences of games showing highly variable form. Using mixtures, we arrive at a model with superior predictive properties.
Methodological Background
West, Harrison and Migon (1985) introduced a class of dynamic generalised linear models in which dynamic updates of the sufficient statistics can be made by exploiting conjugacy.

We extend this methodology to models where the dynamic parameters do not have sufficient statistics, but where the full conditional posterior of each parameter is from a known distribution.

We show, using applications, how proxies for the sufficient statistics (which we call quasi-sufficient statistics) can be constructed for the purposes of sequential updating, marginalisation and prediction.

Parameters that cannot be updated in this way are averaged over. These include the discount factors, which represent the rate of change of form, and a further parameter that models correlation and over-dispersion.
Introduction and background
Main ingredients of methodology
We exploit the following ideas, which simplify and speed up inference:

1 Sequential Bayes with parameter discounting (forgetting the past).
2 Exploitation of conjugacy, providing closed-form expressions for the posterior and the cumulative evidence.
3 Mixtures of models with differing discount factors, each weighted by its cumulative evidence.
4 Use of sufficient statistics, or proxies for them, to define the conditional posteriors.
5 Use of multiplicative models only.
6 Total absence of simulation-based methods.
The state space model
Figure: Sequential evolution of the parameters of two conditionally independent Poisson distributions.
Predicting premier league football outcomes
Aim of the analysis
To develop a model, or combination of models, capable of dynamically predicting Premier League football outcomes over the last two decades, using only the final scores, the dates and the home ground advantage. In this way we hope to develop an exploratory tool for identifying changes in form and style, both within and between seasons, for the major clubs.
Results, dates of results and bookmaker odds for over 20 seasons of Premier League football are available online at
http://www.football-data.co.uk/englandm.php
Literature
Residual analysis from stationary univariate Poisson models fitted over each season indicates low, predominantly positive correlation. In addition there is some over-dispersion.

Dixon and Coles (1997) allow correlation only between low-scoring games.

Karlis and Ntzoufras (2003), Crowder et al. (2002) and Koopman and Lit (2013) use latent variables to explain the positive correlations.

Gamerman et al. (2013) construct conjugate state space models for univariate observations.

We suggest a shared multiplicative gamma random effect to explain both the over-dispersion and the positive correlations.
The likelihood and priors
The likelihood
Let i ∈ {1, 2, . . . , 20} denote the home team and j ∈ {1, 2, . . . , 20} the away team, and let the games, in chronological order, be labelled t = 1, . . . , 380.

Let α_{i,t} be the attacking strength of team i at time t, let β_{j,t} be the defensive strength of team j at time t, and let γ_t be the common home ground advantage.

Then the home goals X_t and away goals Y_t at time t are conditionally independent Poisson variables:

\[
\begin{pmatrix} X_t \\ Y_t \end{pmatrix} \,\Big|\, \kappa \;\sim\; \text{Poisson}\begin{pmatrix} \alpha_{i,t}\,\beta_{j,t}\,\gamma_t\,\varepsilon_t \\ \alpha_{j,t}\,\beta_{i,t}\,\varepsilon_t \end{pmatrix} \qquad \text{(Likelihood)}
\]

where ε_t ∼ Gamma(κ, κ) is a shared random effect for game t.
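This likelihood is easy to simulate, which is a useful sanity check on its structure. The sketch below uses only the standard library; the team strengths, the home advantage and κ = 50 are illustrative values, not estimates from the talk, and `sample_poisson` is our own helper.

```python
import math
import random

random.seed(1)

def sample_poisson(rate):
    """Knuth's method; adequate for the small rates typical of football scores."""
    threshold = math.exp(-rate)
    k, p = 0, 1.0
    while True:
        p *= random.random()
        if p <= threshold:
            return k
        k += 1

def simulate_game(alpha_i, beta_j, alpha_j, beta_i, gamma, kappa):
    """One game's (home, away) score from the likelihood above.

    eps ~ Gamma(kappa, kappa) is shared by both rates, so E[eps] = 1
    and the same draw inflates or deflates both scores together,
    which is what induces the positive correlation."""
    eps = random.gammavariate(kappa, 1.0 / kappa)  # shape kappa, rate kappa
    mu = alpha_i * beta_j * gamma * eps            # home scoring rate
    lam = alpha_j * beta_i * eps                   # away scoring rate
    return sample_poisson(mu), sample_poisson(lam)

x, y = simulate_game(1.4, 0.9, 1.1, 0.8, 1.3, kappa=50.0)
```

A small κ makes the shared effect ε_t highly variable, so simulated home and away scores move together more strongly; a large κ recovers near-independent Poisson scores.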
The likelihood
The joint likelihood of data and random effects is

\[
f(\mathbf{x}, \mathbf{y}, \boldsymbol{\varepsilon} \mid \boldsymbol{\alpha}, \boldsymbol{\beta}, \gamma, \kappa) \;\propto\; \exp\!\Big(-\sum_{t=1}^{380} \varepsilon_t \big[\alpha_{i,t}\beta_{j,t}\gamma_t + \alpha_{j,t}\beta_{i,t}\big]\Big) \times \prod_{t=1}^{380} \frac{\kappa^{\kappa}}{\Gamma(\kappa)}\, \varepsilon_t^{\kappa-1} e^{-\kappa\varepsilon_t}\, \big[\varepsilon_t \alpha_{i,t}\beta_{j,t}\gamma_t\big]^{x_t} \big[\varepsilon_t \alpha_{j,t}\beta_{i,t}\big]^{y_t}
\]

The likelihood marginalised over the random effects is

\[
L(\boldsymbol{\alpha}, \boldsymbol{\beta}, \gamma, \kappa) = \int f(\mathbf{x}, \mathbf{y}, \boldsymbol{\varepsilon} \mid \boldsymbol{\alpha}, \boldsymbol{\beta}, \gamma, \kappa)\, d\boldsymbol{\varepsilon}
= \prod_{t=1}^{380} \frac{\Gamma(\kappa + x_t + y_t)}{\Gamma(\kappa)\,\Gamma(x_t+1)\,\Gamma(y_t+1)}\; p_t^{x_t}\, q_t^{y_t}\, (1 - p_t - q_t)^{\kappa}
\]

where p_t = μ_t / (κ + μ_t + λ_t), q_t = λ_t / (κ + μ_t + λ_t), μ_t = α_{i,t}β_{j,t}γ_t and λ_t = α_{j,t}β_{i,t}.
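The marginalised per-game term can be coded directly from this formula, working in logs for numerical stability. A minimal sketch (the function name and the μ, λ, κ values are ours, chosen for illustration):

```python
import math

def bivariate_nb_pmf(x, y, mu, lam, kappa):
    """Marginal (over eps) probability of scores (x, y): one game's
    bivariate negative binomial term of the marginalised likelihood."""
    p = mu / (kappa + mu + lam)
    q = lam / (kappa + mu + lam)
    log_pmf = (math.lgamma(kappa + x + y) - math.lgamma(kappa)
               - math.lgamma(x + 1) - math.lgamma(y + 1)
               + x * math.log(p) + y * math.log(q)
               + kappa * math.log(1.0 - p - q))
    return math.exp(log_pmf)

# sanity check: probabilities over a generous grid of scores sum to ~1
total = sum(bivariate_nb_pmf(x, y, 1.5, 1.1, 20.0)
            for x in range(60) for y in range(60))
```

Summing first over x + y = s turns the double sum into a univariate negative binomial in the total score, which is why the grid total is 1.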
Over-dispersion and positive correlations
This is the bivariate negative binomial model, which is able to explain both over-dispersion and positive correlation. The marginal variance, covariance and correlation of the bivariate distribution can be derived using standard identities:

\[
\mathrm{Var}(X_t) = \mu_t + \frac{\mu_t^2}{\kappa}, \qquad
\mathrm{Cov}(X_t, Y_t) = \frac{\mu_t \lambda_t}{\kappa}, \qquad
\mathrm{Cor}(X_t, Y_t) = \frac{\mu_t \lambda_t}{\sqrt{(\kappa\mu_t + \mu_t^2)(\kappa\lambda_t + \lambda_t^2)}}
\]

where μ_t is the expected score of the home side and λ_t is the expected score of the away side.
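These identities can be verified numerically against the bivariate pmf; the sketch below recomputes Var(X_t) and Cov(X_t, Y_t) from a truncated grid (μ, λ, κ are illustrative values).

```python
import math

def pmf(x, y, mu, lam, kappa):
    """Bivariate negative binomial probability of scores (x, y)."""
    p = mu / (kappa + mu + lam)
    q = lam / (kappa + mu + lam)
    return math.exp(math.lgamma(kappa + x + y) - math.lgamma(kappa)
                    - math.lgamma(x + 1) - math.lgamma(y + 1)
                    + x * math.log(p) + y * math.log(q)
                    + kappa * math.log(1.0 - p - q))

mu, lam, kappa = 1.5, 1.1, 20.0
grid = [(x, y, pmf(x, y, mu, lam, kappa)) for x in range(60) for y in range(60)]

ex = sum(x * w for x, _, w in grid)        # E[X], should be mu
ey = sum(y * w for _, y, w in grid)        # E[Y], should be lam
exx = sum(x * x * w for x, _, w in grid)   # E[X^2]
exy = sum(x * y * w for x, y, w in grid)   # E[XY]

var_x = exx - ex ** 2                      # should equal mu + mu^2 / kappa
cov_xy = exy - ex * ey                     # should equal mu * lam / kappa
```

The truncation at 60 goals leaves negligible mass for these parameter values, so the numerical moments match the closed forms to high precision.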
Sequential updating of a state space model
Sequential inference
We assume that the dynamic parameters θt = αt ,βt , γt evolve overtime in such away that there is a loss of information from posterior toprior through the use of discount parameters. Gamerman et al.,(2013)and others express a conjugate transition equation to the conditionallyPoisson likelihood as(
Xt
Yt
)| κ ∼ Poisson
(αi,tβj,tγtεtαj,tβi,tεt
)(Observation Equation)
ωθt
θt−1∼ Beta
(ωpθt−1, (1− ω)pθt−1
)(State Equation)
We will see that 0 < ω ≤ 1 has the effect of discounting the posteriorfrom the previous observation.
Sequential Bayesian inference
Figure: Sequential Bayesian inference consists of an extend and an update step.
Posterior_{t−1} —Extend→ Prior_t —Update→ Posterior_t

The extend argument below follows by induction:

\[
\theta_{t-1} \mid x_{1:t-1}, y_{1:t-1} \;\sim\; \text{Gamma}\big(p^{\theta}_{t-1},\, q^{\theta}_{t-1}\big) \qquad \text{(Posterior)}
\]
\[
\pi(\theta_t \mid x_{1:t-1}, y_{1:t-1}) \;\propto\; \int \pi(\theta_{t-1} \mid x_{1:t-1}, y_{1:t-1})\, \pi(\theta_t \mid \theta_{t-1})\, d\theta_{t-1} \qquad \text{(Extend)}
\]
\[
\theta_t \mid x_{1:t-1}, y_{1:t-1} \;\sim\; \text{Gamma}\big(\omega p^{\theta}_{t-1},\, \omega q^{\theta}_{t-1}\big) \;=\; \text{Gamma}\big(p^{\theta}_t,\, q^{\theta}_t\big) \qquad \text{(Prior)}
\]
\[
\pi(\theta_t \mid x_{1:t}, y_{1:t}) \;\propto\; \pi(\theta_t \mid x_{1:t-1}, y_{1:t-1})\, p(x_t, y_t \mid \theta_t) \qquad \text{(Update)}
\]
\[
\theta_t \mid x_{1:t}, y_{1:t} \;\sim\; \text{Gamma}\big(\tilde{p}^{\theta}_t,\, \tilde{q}^{\theta}_t\big) \qquad \text{(Posterior)}
\]

and the by-product of the update step is the predictive distribution:

\[
p(x_t, y_t \mid x_{1:t-1}, y_{1:t-1}) = \int p(x_t, y_t \mid \theta_t)\, \pi(\theta_t \mid x_{1:t-1}, y_{1:t-1})\, d\theta_t
= \frac{1}{x_t!\, y_t!} \times \frac{\Gamma(\tilde{p}^{\theta}_t)}{\Gamma(p^{\theta}_t)} \times \frac{(q^{\theta}_t)^{p^{\theta}_t}}{(\tilde{q}^{\theta}_t)^{\tilde{p}^{\theta}_t}}
\]

where tildes denote the post-update values.
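For a single dynamic parameter observed through a conditionally Poisson count, the whole extend/update cycle fits in a few lines. A sketch under our own naming; the starting values p = 30, q = 20 and ω = 0.98 are illustrative.

```python
import math

def extend(p, q, omega):
    """Extend: Gamma(p, q) posterior -> Gamma(omega*p, omega*q) prior.
    The mean p/q is unchanged; the variance grows by a factor 1/omega."""
    return omega * p, omega * q

def update(p, q, x, exposure=1.0):
    """Update for a conditionally Poisson count x whose rate is theta
    times a known multiplier `exposure`:
    Gamma(p, q) -> Gamma(p + x, q + exposure)."""
    return p + x, q + exposure

def log_predictive(p, q, x, exposure=1.0):
    """Normalising constant of the update: the (negative binomial)
    predictive p(x | past), the by-product noted above."""
    p1, q1 = update(p, q, x, exposure)
    return (math.lgamma(p1) - math.lgamma(p) - math.lgamma(x + 1)
            + p * math.log(q) + x * math.log(exposure) - p1 * math.log(q1))

# one cycle: posterior at t-1 -> prior at t -> (predict) -> posterior at t
p, q = extend(30.0, 20.0, omega=0.98)   # prior: Gamma(29.4, 19.6)
lp = log_predictive(p, q, x=2)          # evidence contribution of x = 2
p, q = update(p, q, x=2)                # posterior: Gamma(31.4, 20.6)
```

Because every step is closed form, a full season is just 380 repetitions of this cycle per parameter, with no simulation anywhere.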
The Extend step (before each game)
Figure: The extend step takes the posterior from the last observation and creates the new prior. It expresses loss of information by discounting, or down-weighting, past form. The discount factor ω preserves the mean of the posterior but adds uncertainty to the prior by increasing its variance.
Step 1 Football model: Extending the previous posterior
The priors in terms of the previous posteriors:

\[
\alpha_{i,t} \sim \text{Gamma}\big(p^{\alpha}_{i,t}, q^{\alpha}_{i,t}\big), \quad p^{\alpha}_{i,t} = \omega p^{\alpha}_{i,t-1}, \; q^{\alpha}_{i,t} = \omega q^{\alpha}_{i,t-1} \qquad \text{(AS Home)}
\]
\[
\alpha_{j,t} \sim \text{Gamma}\big(p^{\alpha}_{j,t}, q^{\alpha}_{j,t}\big), \quad p^{\alpha}_{j,t} = \omega p^{\alpha}_{j,t-1}, \; q^{\alpha}_{j,t} = \omega q^{\alpha}_{j,t-1} \qquad \text{(AS Away)}
\]
\[
\beta_{j,t} \sim \text{Gamma}\big(p^{\beta}_{j,t}, q^{\beta}_{j,t}\big), \quad p^{\beta}_{j,t} = \omega p^{\beta}_{j,t-1}, \; q^{\beta}_{j,t} = \omega q^{\beta}_{j,t-1} \qquad \text{(DS Away)}
\]
\[
\beta_{i,t} \sim \text{Gamma}\big(p^{\beta}_{i,t}, q^{\beta}_{i,t}\big), \quad p^{\beta}_{i,t} = \omega p^{\beta}_{i,t-1}, \; q^{\beta}_{i,t} = \omega q^{\beta}_{i,t-1} \qquad \text{(DS Home)}
\]
\[
\gamma_t \sim \text{Gamma}\big(p^{\gamma}_t, q^{\gamma}_t\big), \quad p^{\gamma}_t = \omega p^{\gamma}_{t-1}, \; q^{\gamma}_t = \omega q^{\gamma}_{t-1} \qquad \text{(HGA)}
\]
\[
\varepsilon_t \sim \text{Gamma}(\kappa, \kappa) \qquad \text{(R.E.)}
\]
The mean is preserved but the variance has increased.
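This mean-preserving, variance-inflating behaviour is easy to confirm numerically; the p, q and ω values below are illustrative.

```python
import math

# Gamma(p, q) in shape/rate form: mean = p/q, variance = p/q**2
p, q, omega = 30.0, 20.0, 0.95

mean_before, var_before = p / q, p / q ** 2
p_new, q_new = omega * p, omega * q                  # the extend step
mean_after, var_after = p_new / q_new, p_new / q_new ** 2

assert math.isclose(mean_after, mean_before)         # mean preserved
assert math.isclose(var_after, var_before / omega)   # variance up by 1/omega
```

So ω = 1 keeps the posterior unchanged (static form), while smaller ω forgets past games faster by flattening the prior before each new observation.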
The Update step (after each game)
Figure: The update step is just an application of Bayes' theorem. The prior is updated to incorporate the information in the score of the latest football game. The normalising constant of Bayes' theorem yields the predictive distribution, which gives us the cumulative evidence needed for model comparison.
The full conditional posteriors (after each game)
We denote the fixed parameters by φ. The dynamic parameters θ_t = (α_{i,t}, α_{j,t}, β_{i,t}, β_{j,t}, γ_t, ε_t) are updated at the end of a game at time t using the following joint posterior:

\[
\pi(\theta_t \mid x_{1:t}, y_{1:t}, \phi) \;\propto\;
\underbrace{\exp\!\big[-\varepsilon_t(\alpha_{i,t}\beta_{j,t}\gamma_t + \alpha_{j,t}\beta_{i,t})\big]\, \big[\varepsilon_t\alpha_{i,t}\beta_{j,t}\gamma_t\big]^{x_t} \big[\varepsilon_t\alpha_{j,t}\beta_{i,t}\big]^{y_t}}_{\text{Likelihood}}
\]
\[
\times \underbrace{\alpha_{i,t}^{p^{\alpha}_{i,t}-1} e^{-q^{\alpha}_{i,t}\alpha_{i,t}}\; \alpha_{j,t}^{p^{\alpha}_{j,t}-1} e^{-q^{\alpha}_{j,t}\alpha_{j,t}}}_{\text{Prior attacking strengths}}
\times \underbrace{\beta_{i,t}^{p^{\beta}_{i,t}-1} e^{-q^{\beta}_{i,t}\beta_{i,t}}\; \beta_{j,t}^{p^{\beta}_{j,t}-1} e^{-q^{\beta}_{j,t}\beta_{j,t}}}_{\text{Prior defensive strengths}}
\times \underbrace{\gamma_t^{p^{\gamma}_t-1} e^{-q^{\gamma}_t\gamma_t}}_{\text{HGA}}
\times \underbrace{\varepsilon_t^{\kappa-1} e^{-\kappa\varepsilon_t}}_{\text{Random effect}}
\]
Step 2 Updating the sufficient statistics
The updates can be formulated by examining the full conditional posterior distributions. All the dynamic parameters have known shape parameters. The rate parameters, however, involve the other parameters and so are not known; the expectations from the previous iteration are used as proxies in a quasi-sufficient statistic. The update equations are then

\[
p^{\alpha}_{i,t} \leftarrow p^{\alpha}_{i,t} + x_t, \quad q^{\alpha}_{i,t} \leftarrow q^{\alpha}_{i,t} + \hat{\gamma}_t\,\hat{\beta}_{j,t} \qquad \text{(AS home)}
\]
\[
p^{\alpha}_{j,t} \leftarrow p^{\alpha}_{j,t} + y_t, \quad q^{\alpha}_{j,t} \leftarrow q^{\alpha}_{j,t} + \hat{\beta}_{i,t} \qquad \text{(AS away)}
\]
\[
p^{\beta}_{i,t} \leftarrow p^{\beta}_{i,t} + y_t, \quad q^{\beta}_{i,t} \leftarrow q^{\beta}_{i,t} + \hat{\alpha}_{j,t} \qquad \text{(DS home)}
\]
\[
p^{\beta}_{j,t} \leftarrow p^{\beta}_{j,t} + x_t, \quad q^{\beta}_{j,t} \leftarrow q^{\beta}_{j,t} + \hat{\gamma}_t\,\hat{\alpha}_{i,t} \qquad \text{(DS away)}
\]
\[
p^{\gamma}_t \leftarrow p^{\gamma}_t + x_t, \quad q^{\gamma}_t \leftarrow q^{\gamma}_t + \hat{\alpha}_{i,t}\,\hat{\beta}_{j,t} \qquad \text{(HGA)}
\]
\[
p^{\varepsilon}_t \leftarrow \kappa + x_t + y_t, \quad q^{\varepsilon}_t \leftarrow \kappa + \hat{\gamma}_t\,\hat{\alpha}_{i,t}\,\hat{\beta}_{j,t} + \hat{\alpha}_{j,t}\,\hat{\beta}_{i,t} \qquad \text{(R.E.)}
\]

where \(\hat{\alpha}_{i,t} = p^{\alpha}_{i,t}/q^{\alpha}_{i,t}\), \(\hat{\beta}_{i,t} = p^{\beta}_{i,t}/q^{\beta}_{i,t}\) and \(\hat{\gamma}_t = p^{\gamma}_t/q^{\gamma}_t\).
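The bookkeeping for one game can be sketched as follows. This is our own arrangement of the update equations; the team labels and starting (p, q) values are invented for illustration, and the posterior means of the other parameters are frozen at their pre-game values before any update is applied.

```python
def posterior_mean(stats, name):
    p, q = stats[name]
    return p / q

def update_after_game(stats, i, j, x, y, kappa):
    """Update the quasi-sufficient statistics (p, q) of every dynamic
    parameter after home team i plays away team j with score (x, y).
    `stats` maps names like ('alpha', i) or 'gamma' to [p, q] pairs."""
    # freeze all posterior means before touching anything
    a_i = posterior_mean(stats, ('alpha', i))
    a_j = posterior_mean(stats, ('alpha', j))
    b_i = posterior_mean(stats, ('beta', i))
    b_j = posterior_mean(stats, ('beta', j))
    g = posterior_mean(stats, 'gamma')

    stats[('alpha', i)][0] += x; stats[('alpha', i)][1] += g * b_j    # AS home
    stats[('alpha', j)][0] += y; stats[('alpha', j)][1] += b_i        # AS away
    stats[('beta', i)][0] += y;  stats[('beta', i)][1] += a_j         # DS home
    stats[('beta', j)][0] += x;  stats[('beta', j)][1] += g * a_i     # DS away
    stats['gamma'][0] += x;      stats['gamma'][1] += a_i * b_j       # HGA
    stats['eps'] = [kappa + x + y,
                    kappa + g * a_i * b_j + a_j * b_i]                # R.E.

stats = {('alpha', 'MUN'): [10.0, 8.0], ('alpha', 'CHE'): [9.0, 8.0],
         ('beta', 'MUN'): [8.0, 8.0], ('beta', 'CHE'): [8.0, 9.0],
         'gamma': [12.0, 10.0]}
update_after_game(stats, 'MUN', 'CHE', x=2, y=1, kappa=50.0)
```

Each update is O(1) per game, which is what makes the method fast enough for online use over twenty seasons.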
Updating the cumulative evidence (after each game)
The predictive can be calculated as

\[
p(x_t, y_t \mid x_{1:t-1}, y_{1:t-1}, \phi) = \frac{1}{x_t!\, y_t!}
\times \frac{\Gamma(\tilde{p}^{\alpha}_{i,t})}{\Gamma(p^{\alpha}_{i,t})} \frac{(q^{\alpha}_{i,t})^{p^{\alpha}_{i,t}}}{(\tilde{q}^{\alpha}_{i,t})^{\tilde{p}^{\alpha}_{i,t}}}
\times \frac{\Gamma(\tilde{p}^{\alpha}_{j,t})}{\Gamma(p^{\alpha}_{j,t})} \frac{(q^{\alpha}_{j,t})^{p^{\alpha}_{j,t}}}{(\tilde{q}^{\alpha}_{j,t})^{\tilde{p}^{\alpha}_{j,t}}}
\times \frac{\Gamma(\tilde{p}^{\beta}_{i,t})}{\Gamma(p^{\beta}_{i,t})} \frac{(q^{\beta}_{i,t})^{p^{\beta}_{i,t}}}{(\tilde{q}^{\beta}_{i,t})^{\tilde{p}^{\beta}_{i,t}}}
\times \frac{\Gamma(\tilde{p}^{\beta}_{j,t})}{\Gamma(p^{\beta}_{j,t})} \frac{(q^{\beta}_{j,t})^{p^{\beta}_{j,t}}}{(\tilde{q}^{\beta}_{j,t})^{\tilde{p}^{\beta}_{j,t}}}
\times \frac{\Gamma(\tilde{p}^{\gamma}_t)}{\Gamma(p^{\gamma}_t)} \frac{(q^{\gamma}_t)^{p^{\gamma}_t}}{(\tilde{q}^{\gamma}_t)^{\tilde{p}^{\gamma}_t}}
\times \frac{\Gamma(p^{\varepsilon}_t)}{\Gamma(\kappa)} \frac{\kappa^{\kappa}}{(q^{\varepsilon}_t)^{p^{\varepsilon}_t}} \tag{1}
\]

where tildes denote the values after the update. The evidence for configuration k can be calculated as

\[
Z_{k,t} = \prod_{t^*=1}^{t} p(x_{t^*}, y_{t^*} \mid x_{1:t^*-1}, y_{1:t^*-1}, \phi_k) \tag{2}
\]

Here φ_k, k = 1, 2, . . . , K denotes a particular configuration of the fixed parameters, which include ω and κ, the discount and correlation parameters.
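In code, the product in (1) is best handled in logs, since over a 380-game season the running product in (2) would underflow. A sketch with our own function names and argument layout:

```python
import math

def log_term(p0, q0, p1, q1):
    """log of Gamma(p1)/Gamma(p0) * q0**p0 / q1**p1: one dynamic
    parameter's factor in the predictive (1). (p0, q0) are the
    pre-game values, (p1, q1) the post-game values."""
    return (math.lgamma(p1) - math.lgamma(p0)
            + p0 * math.log(q0) - p1 * math.log(q1))

def game_log_predictive(x, y, pairs, kappa, eps_post):
    """log p(x_t, y_t | past, phi): 1/(x! y!) times one gamma-ratio
    term per dynamic parameter, times the random-effect term whose
    prior is Gamma(kappa, kappa).
    `pairs` holds (p0, q0, p1, q1) for the five dynamic parameters;
    `eps_post` is the (p, q) of the random effect after the game."""
    out = -math.lgamma(x + 1) - math.lgamma(y + 1)
    for p0, q0, p1, q1 in pairs:
        out += log_term(p0, q0, p1, q1)
    p_eps, q_eps = eps_post
    out += (math.lgamma(p_eps) - math.lgamma(kappa)
            + kappa * math.log(kappa) - p_eps * math.log(q_eps))
    return out
```

The running log evidence log Z_{k,t} is then just the cumulative sum of `game_log_predictive` over the games, one sum per configuration φ_k.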
Dynamic model averaging
Dynamic model averaging
We now look at the online evolution of the mixture of models. We set a uniform prior over the configurations of fixed parameters at the beginning of the season, π(φ_k) = 1/K. The posterior weight of each component of the mixture, Ω_{k,t}, is then updated recursively:

\[
\Omega_{k,t} = \pi(\phi_k \mid x_{1:t}, y_{1:t}) = \frac{\Omega_{k,t-1}\; p(x_t, y_t \mid x_{1:t-1}, y_{1:t-1}, \phi_k)}{\sum_{k'=1}^{K} \Omega_{k',t-1}\; p(x_t, y_t \mid x_{1:t-1}, y_{1:t-1}, \phi_{k'})}.
\]

The filtered posteriors of the attacking and defensive strengths are now mixtures of Gamma distributions, e.g.

\[
\pi(\alpha_t \mid x_{1:t}, y_{1:t}) = \sum_{k=1}^{K} \Omega_{k,t}\; \text{Gamma}\big(\alpha_t;\, p^{\alpha}_{k,t}, q^{\alpha}_{k,t}\big), \qquad k = 1, 2, \ldots, K.
\]
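One step of this weight recursion, working in logs for stability, can be sketched as follows (the K = 3 weights and log predictive densities are illustrative numbers, not results):

```python
import math

def update_weights(weights, log_predictives):
    """One dynamic-model-averaging step: reweight the K configurations
    by their one-step predictive density for the latest game, then
    renormalise. The max-subtraction keeps the exponentials stable."""
    log_w = [math.log(w) + lp for w, lp in zip(weights, log_predictives)]
    m = max(log_w)
    unnorm = [math.exp(lw - m) for lw in log_w]
    total = sum(unnorm)
    return [u / total for u in unnorm]

# uniform prior over K = 3 configurations, then one game's evidence
weights = update_weights([1 / 3, 1 / 3, 1 / 3], [-2.0, -1.5, -3.0])
```

Configurations that predict the latest score well gain weight; over a run of games the weights concentrate on whichever (ω, κ) configuration currently tracks the data best.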
Sequential prediction from the dynamic mixture
The cumulative evidence for the mixture of models is the product over games of the component predictive densities, weighted by the posterior weights up to the previous observation:

\[
Z^*_t = \prod_{t^*=1}^{t} \sum_{k=1}^{K} \Omega_{k,t^*-1}\; p(x_{t^*}, y_{t^*} \mid x_{1:t^*-1}, y_{1:t^*-1}, \phi_k) \tag{3}
\]

We show that the cumulative predictive performance of the mixture is usually much better than that of any of its components.

The plots on the next few slides contrast the log evidence of the mixture, Z*_t, with the log evidence of each component of the mixture, Z_{k,t}, k = 1, 2, . . . , 20.
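Equation (3) interleaves the weight updates with the evidence accumulation; a compact sketch (the two-game, two-configuration log densities below are invented for illustration):

```python
import math

def mixture_log_evidence(per_game_log_predictives, prior=None):
    """log Z*_t for the mixture: each game contributes
    log sum_k Omega_{k, t-1} * p_k(game), and the weights are
    refreshed with that game's densities before moving on."""
    K = len(per_game_log_predictives[0])
    weights = prior or [1.0 / K] * K
    log_z = 0.0
    for lps in per_game_log_predictives:
        dens = [w * math.exp(lp) for w, lp in zip(weights, lps)]
        game_pred = sum(dens)          # the mixture predictive for this game
        log_z += math.log(game_pred)
        weights = [d / game_pred for d in dens]
    return log_z

# two games, K = 2 configurations (illustrative log densities)
lz = mixture_log_evidence([[-2.0, -2.5], [-1.8, -2.2]])
```

Because the weights adapt game by game, the mixture can track a component that is good early in the season and switch to another later, which is why its cumulative evidence usually beats every fixed component.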
Weighted prediction of a finite number of predictivedistributions
Figure: In model averaging, the predictives from the K models are combined, weighted by the cumulative evidence from the previous step.
Comparison of evidence for models for 1999 Season
Figure: The log evidence of the mixture compared to the log evidence of each of the components of the mixture.
Comparison of evidence for models for 2001 Season
Figure: The log evidence of the mixture compared to the log evidence of each of the components of the mixture.
Comparison of evidence for models for 2004 Season
Figure: The log evidence of the mixture compared to the log evidence of each of the components of the mixture.
Comparison of evidence for models for 2005 Season
Figure: The log evidence of the mixture compared to the log evidence of each of the components of the mixture.
Comparison of evidence for models for 2009 Season
Figure: The log evidence of the mixture compared to the log evidence of each of the components of the mixture.
Comparison of evidence for models for 2015 Season
Figure: The predictivity of the mixture compared to the log evidence of each of the components of the mixture.
Comparison of evidence for models for 2019 Season
Figure: The log evidence of the mixture compared to the log evidence of each of the components of the mixture.
The evolution of posterior weights with different discount factors

Figure: Sequential posterior weights within and between seasons. ω ∈ {0.95, 0.955, 0.96, 0.965, 0.97, 0.975, 0.98, 0.985, 0.99, 1}.
The evolution of weights of models with different values of κ and ω

Figure: Sequential posterior weights within and between seasons. κ ∈ {10, 20, 50, 200} and ω ∈ {0.98, 0.99, 1}.
Filtered mixtures of abilities for Chelsea and Man United
Figure: The evolution of form of Chelsea (winners in 05, 06, 10, 15, 17) and Man United (winners in 99, 00, 01, 03, 07, 08, 09, 11, 13).
Filtered mixtures of abilities for Arsenal and Liverpool
Figure: The evolution of form of Arsenal (winners in 98, 02, 04) and Liverpool.
Conclusions
MCMC and simulation-based methodologies are of limited use for fast online prediction.

Online sequential updating and model averaging give superior predictions.

Constant exponential smoothing with a fixed rate parameter is not appropriate for football modelling.

A small, varying amount of over-dispersion and positive correlation can be dealt with by model averaging.

Mixtures of models highlight sharp changes in form and result in better predictions.

Embrace conjugacy when you can.
Winners
Season    Winner
1997/98   Arsenal
1998/99   Manchester United
1999/00   Manchester United
2000/01   Manchester United
2001/02   Arsenal
2002/03   Manchester United
2003/04   Arsenal
2004/05   Chelsea
2005/06   Chelsea
2006/07   Manchester United
2007/08   Manchester United
2008/09   Manchester United
2009/10   Chelsea
2010/11   Manchester United
2011/12   Manchester City
2012/13   Manchester United
2013/14   Manchester City
2014/15   Chelsea
2015/16   Leicester City
2016/17   Chelsea
2017/18   Manchester City
2018/19   Manchester City