
Bayesian Filtering and Smoothing

Filtering and smoothing methods are used to produce an accurate estimate of the state of a time-varying system based on multiple observational inputs (data). Interest in these methods has exploded in recent years, with numerous applications emerging in fields such as navigation, aerospace engineering, telecommunications, and medicine.

This compact, informal introduction for graduate students and advanced undergraduates presents the current state-of-the-art filtering and smoothing methods in a unified Bayesian framework. Readers learn what non-linear Kalman filters and particle filters are, how they are related, and their relative advantages and disadvantages. They also discover how state-of-the-art Bayesian parameter estimation methods can be combined with state-of-the-art filtering and smoothing algorithms.

The book’s practical and algorithmic approach assumes only modest mathematical prerequisites. Examples include MATLAB computations, and the numerous end-of-chapter exercises include computational assignments. MATLAB/GNU Octave source code is available for download at www.cambridge.org/sarkka, promoting hands-on work with the methods.

Simo Särkkä worked, from 2000 to 2010, with Nokia Ltd., Indagon Ltd., and the Nalco Company in various industrial research projects related to telecommunications, positioning systems, and industrial process control. Currently, he is a Senior Researcher with the Department of Biomedical Engineering and Computational Science at Aalto University, Finland, and Adjunct Professor with Tampere University of Technology and Lappeenranta University of Technology. In 2011 he was a visiting scholar with the Signal Processing and Communications Laboratory of the Department of Engineering at the University of Cambridge. His research interests are in state and parameter estimation in stochastic dynamic systems and, in particular, Bayesian methods in signal processing, machine learning, and inverse problems with applications to brain imaging, positioning systems, computer vision, and audio signal processing. He is a Senior Member of the IEEE.

www.cambridge.org
© in this web service Cambridge University Press

Cambridge University Press, 978-1-107-03065-7 - Bayesian Filtering and Smoothing, Simo Särkkä, Frontmatter


INSTITUTE OF MATHEMATICAL STATISTICS TEXTBOOKS

Editorial Board
D. R. Cox (University of Oxford)
A. Agresti (University of Florida)
B. Hambly (University of Oxford)
S. Holmes (Stanford University)
X.-L. Meng (Harvard University)

IMS Textbooks give introductory accounts of topics of current concern suitable for advanced courses at master’s level, for doctoral students and for individual study. They are typically shorter than a fully developed textbook, often arising from material created for a topical course. Lengths of 100–290 pages are envisaged. The books typically contain exercises.


Bayesian Filtering and Smoothing

SIMO SÄRKKÄ
Aalto University, Finland


www.cambridge.org
Information on this title: www.cambridge.org/9781107030657

© Simo Särkkä 2013

This publication is in copyright. Subject to statutory exception and to the provisions of relevant collective licensing agreements, no reproduction of any part may take place without the written permission of Cambridge University Press.

First published 2013

Printed in the United Kingdom by Clays, St Ives plc.

A catalogue record for this publication is available from the British Library

Library of Congress Cataloguing in Publication data

ISBN 978-1-107-03065-7 Hardback
ISBN 978-1-107-61928-9 Paperback

Additional resources for this publication at www.cambridge.org/sarkka

Cambridge University Press has no responsibility for the persistence or accuracy of URLs for external or third-party internet websites referred to in this publication, and does not guarantee that any content on such websites is, or will remain, accurate or appropriate.

University Printing House, Cambridge CB2 8BS, United Kingdom

Cambridge University Press is part of the University of Cambridge.

It furthers the University’s mission by disseminating knowledge in the pursuit of education, learning and research at the highest international levels of excellence.

3rd printing 2014


Contents

Preface ix
Symbols and abbreviations xiii

1 What are Bayesian filtering and smoothing? 1
1.1 Applications of Bayesian filtering and smoothing 1
1.2 Origins of Bayesian filtering and smoothing 7
1.3 Optimal filtering and smoothing as Bayesian inference 8
1.4 Algorithms for Bayesian filtering and smoothing 12
1.5 Parameter estimation 14
1.6 Exercises 15

2 Bayesian inference 17
2.1 Philosophy of Bayesian inference 17
2.2 Connection to maximum likelihood estimation 19
2.3 The building blocks of Bayesian models 19
2.4 Bayesian point estimates 20
2.5 Numerical methods 22
2.6 Exercises 24

3 Batch and recursive Bayesian estimation 27
3.1 Batch linear regression 27
3.2 Recursive linear regression 29
3.3 Batch versus recursive estimation 31
3.4 Drift model for linear regression 33
3.5 State space model for linear regression with drift 36
3.6 Examples of state space models 39
3.7 Exercises 46

4 Bayesian filtering equations and exact solutions 51
4.1 Probabilistic state space models 51
4.2 Bayesian filtering equations 54
4.3 Kalman filter 56


4.4 Exercises 62

5 Extended and unscented Kalman filtering 64
5.1 Taylor series expansions 64
5.2 Extended Kalman filter 69
5.3 Statistical linearization 75
5.4 Statistically linearized filter 77
5.5 Unscented transform 81
5.6 Unscented Kalman filter 86
5.7 Exercises 92

6 General Gaussian filtering 96
6.1 Gaussian moment matching 96
6.2 Gaussian filter 97
6.3 Gauss–Hermite integration 99
6.4 Gauss–Hermite Kalman filter 103
6.5 Spherical cubature integration 106
6.6 Cubature Kalman filter 110
6.7 Exercises 114

7 Particle filtering 116
7.1 Monte Carlo approximations in Bayesian inference 116
7.2 Importance sampling 117
7.3 Sequential importance sampling 120
7.4 Sequential importance resampling 123
7.5 Rao–Blackwellized particle filter 129
7.6 Exercises 132

8 Bayesian smoothing equations and exact solutions 134
8.1 Bayesian smoothing equations 134
8.2 Rauch–Tung–Striebel smoother 135
8.3 Two-filter smoothing 139
8.4 Exercises 142

9 Extended and unscented smoothing 144
9.1 Extended Rauch–Tung–Striebel smoother 144
9.2 Statistically linearized Rauch–Tung–Striebel smoother 146
9.3 Unscented Rauch–Tung–Striebel smoother 148
9.4 Exercises 152

10 General Gaussian smoothing 154
10.1 General Gaussian Rauch–Tung–Striebel smoother 154
10.2 Gauss–Hermite Rauch–Tung–Striebel smoother 155


10.3 Cubature Rauch–Tung–Striebel smoother 156
10.4 General fixed-point smoother equations 159
10.5 General fixed-lag smoother equations 162
10.6 Exercises 164

11 Particle smoothing 165
11.1 SIR particle smoother 165
11.2 Backward-simulation particle smoother 167
11.3 Reweighting particle smoother 169
11.4 Rao–Blackwellized particle smoothers 171
11.5 Exercises 173

12 Parameter estimation 174
12.1 Bayesian estimation of parameters in state space models 174
12.2 Computational methods for parameter estimation 177
12.3 Practical parameter estimation in state space models 185
12.4 Exercises 202

13 Epilogue 204
13.1 Which method should I choose? 204
13.2 Further topics 206

Appendix Additional material 209
A.1 Properties of Gaussian distribution 209
A.2 Cholesky factorization and its derivative 210
A.3 Parameter derivatives for the Kalman filter 212
A.4 Parameter derivatives for the Gaussian filter 214

References 219
Index 229


Preface

The aim of this book is to give a concise introduction to non-linear Kalman filtering and smoothing, particle filtering and smoothing, and to the related parameter estimation methods. Although the book is intended to be an introduction, the mathematical ideas behind all the methods are carefully explained, and a mathematically inclined reader can get quite a deep understanding of the methods by reading the book. The book is purposely kept short for quick reading.

The book is mainly intended for advanced undergraduate and graduate students in applied mathematics and computer science. However, the book is suitable also for researchers and practitioners (engineers) who need a concise introduction to the topic on a level that enables them to implement or use the methods. The assumed background is linear algebra, vector calculus, Bayesian inference, and MATLAB® programming skills.

As implied by the title, the mathematical treatment of the models and algorithms in this book is Bayesian, which means that all the results are treated as being approximations to certain probability distributions or their parameters. Probability distributions are used both to represent uncertainties in the models and for modeling the physical randomness. The theories of non-linear filtering, smoothing, and parameter estimation are formulated in terms of Bayesian inference, and both the classical and recent algorithms are derived using the same Bayesian notation and formalism. This Bayesian approach to the topic is far from new. It was pioneered by Stratonovich in the 1950s and 1960s – even before Kalman’s seminal article in 1960. Thus the theory of non-linear filtering has been Bayesian from the beginning (see Jazwinski, 1970).

Chapter 1 is a general introduction to the idea and applications of Bayesian filtering and smoothing. The purpose of Chapter 2 is to briefly review the basic concepts of Bayesian inference as well as the basic numerical methods used in Bayesian computations. Chapter 3 starts with a step-by-step introduction to recursive Bayesian estimation via solving a linear regression problem in a recursive manner. The transition to Bayesian filtering and smoothing theory is explained by extending and generalizing the problem. The first Kalman filter of the book is also encountered in this chapter.

The Bayesian filtering theory starts in Chapter 4 where we derive the general Bayesian filtering equations and, as their special case, the celebrated Kalman filter. Non-linear extensions of the Kalman filter, the extended Kalman filter (EKF), the statistically linearized filter (SLF), and the unscented Kalman filter (UKF) are presented in Chapter 5. Chapter 6 generalizes these filters into the framework of Gaussian filtering. The Gauss–Hermite Kalman filter (GHKF) and cubature Kalman filter (CKF) are then derived from the general framework. Sequential Monte Carlo (SMC) based particle filters (PF) are explained in Chapter 7 by starting from the basic SIR filter and ending with Rao–Blackwellized particle filters (RBPF).
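All of these filters share the predict–update structure of the basic Kalman filter of Chapter 4. The book's own examples are in MATLAB/GNU Octave; purely as an illustrative sketch (not the book's code), one step of the linear Kalman filter can be written in NumPy as follows, with the names m, P, A, Q, H, R chosen to match the book's notation:

```python
import numpy as np

def kf_step(m, P, y, A, Q, H, R):
    """One predict + update step of the linear Kalman filter.

    m, P : previous filtering mean and covariance
    y    : current measurement
    A, Q : dynamic model matrix and process noise covariance
    H, R : measurement model matrix and measurement noise covariance
    """
    # Prediction step: propagate the estimate through the dynamics.
    m_pred = A @ m
    P_pred = A @ P @ A.T + Q
    # Update step: condition on the new measurement y.
    v = y - H @ m_pred                    # innovation
    S = H @ P_pred @ H.T + R              # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)   # Kalman gain
    return m_pred + K @ v, P_pred - K @ S @ K.T

# Scalar random-walk example: x_k = x_{k-1} + q_k, y_k = x_k + r_k.
m, P = kf_step(np.array([0.0]), np.array([[1.0]]), np.array([1.0]),
               np.array([[1.0]]), np.array([[1.0]]),
               np.array([[1.0]]), np.array([[1.0]]))
# m -> [2/3], P -> [[2/3]]
```

The non-linear filters of Chapters 5–7 replace the exact Gaussian prediction and update above with Taylor-series, sigma-point, quadrature, or Monte Carlo approximations.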

Chapter 8 starts with a derivation of the general (fixed-interval) Bayesian smoothing equations and then continues to a derivation of the Rauch–Tung–Striebel (RTS) smoother as their special case. In that chapter we also briefly discuss two-filter smoothing. The extended RTS smoother (ERTSS), statistically linearized RTS smoother (SLRTSS), and the unscented RTS smoother (URTSS) are presented in Chapter 9. The general Gaussian smoothing framework is presented in Chapter 10, and the Gauss–Hermite RTS smoother (GHRTSS) and the cubature RTS smoother (CRTSS) are derived as its special cases. We also discuss Gaussian fixed-point and fixed-lag smoothing in the same chapter. In Chapter 11 we start by showing how the basic SIR particle filter can be used to approximate the smoothing solutions with a small modification. We then introduce the numerically better backward-simulation particle smoother and the reweighting (or marginal) particle smoother. Finally, we discuss the implementation of Rao–Blackwellized particle smoothers.
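For concreteness, one backward step of the linear RTS smoother of Chapter 8 can be sketched as follows (again in NumPy rather than the book's MATLAB; the interface and the names m_f, P_f for the filtering results and m_s, P_s for the smoothed ones are ours):

```python
import numpy as np

def rts_step(m_s_next, P_s_next, m_f, P_f, A, Q):
    """One backward step of the linear Rauch-Tung-Striebel smoother.

    m_s_next, P_s_next : smoothed mean and covariance at step k+1
    m_f, P_f           : filtering mean and covariance at step k
    A, Q               : dynamic model matrix and process noise covariance
    """
    # Re-predict from the filtering result at step k.
    m_pred = A @ m_f
    P_pred = A @ P_f @ A.T + Q
    # Smoother gain, then correct the filtering result backwards in time.
    G = P_f @ A.T @ np.linalg.inv(P_pred)
    m_s = m_f + G @ (m_s_next - m_pred)
    P_s = P_f + G @ (P_s_next - P_pred) @ G.T
    return m_s, P_s
```

The recursion runs from the last time step backwards and is initialized with the filtering solution itself; note that if the smoothed result at step k+1 happens to equal the prediction, the smoother leaves the filtering result at step k unchanged.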

Chapter 12 is an introduction to parameter estimation in state space models, concentrating on optimization and expectation–maximization (EM) based computation of maximum likelihood (ML) and maximum a posteriori (MAP) estimates, as well as on Markov chain Monte Carlo (MCMC) methods. We start by presenting the general methods and then show how Kalman filters and RTS smoothers, non-linear Gaussian filters and RTS smoothers, and finally particle filters and smoothers, can be used to compute or approximate the quantities needed in the implementation of parameter estimation methods. This leads to, for example, classical EM algorithms for state space models, as well as to particle EM and particle MCMC methods. We also discuss how Rao–Blackwellization can sometimes be used to help parameter estimation.
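As a concrete instance of the kind of quantity computed in that chapter, the marginal log-likelihood of a linear Gaussian state space model can be evaluated with a Kalman filter via the prediction error decomposition. The sketch below is illustrative NumPy (the book works in MATLAB, and the function name and interface here are ours, not the book's):

```python
import numpy as np

def kf_loglik(ys, m0, P0, A, Q, H, R):
    """Marginal log-likelihood of measurements ys under a linear
    Gaussian state space model, via the prediction error decomposition."""
    m, P, ll = m0, P0, 0.0
    for y in ys:
        # Kalman filter prediction.
        m = A @ m
        P = A @ P @ A.T + Q
        # Innovation and its covariance give one Gaussian likelihood term.
        v = y - H @ m
        S = H @ P @ H.T + R
        ll -= 0.5 * (len(v) * np.log(2.0 * np.pi)
                     + np.log(np.linalg.det(S))
                     + v @ np.linalg.solve(S, v))
        # Kalman filter update.
        K = P @ H.T @ np.linalg.inv(S)
        m = m + K @ v
        P = P - K @ S @ K.T
    return ll
```

Maximizing such a function over the model parameters (for example, entries of Q and R) gives the ML estimate; adding a log-prior before optimizing gives the MAP estimate.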

Chapter 13 is an epilogue where we give some general advice on the selection of different methods for different purposes. We also discuss and give references to various technical points and related topics that are important, but did not fit into this book.

Each of the chapters ends with a range of exercises, which give the reader hands-on experience in implementing the methods and in selecting the appropriate method for a given purpose. The MATLAB® source code needed in the exercises as well as various other material can be found on the book’s web page at www.cambridge.org/sarkka.

This book is an outgrowth of lecture notes of courses that I gave during the years 2009–2012 at Helsinki University of Technology, Aalto University, and Tampere University of Technology, Finland. Most of the text was written while I was working at the Department of Biomedical Engineering and Computational Science (BECS) of Aalto University (formerly Helsinki University of Technology), but some of the text was written during my visit to the Department of Engineering at the University of Cambridge, UK. I am grateful to the former Centre of Excellence in Computational Complex Systems Research of the Academy of Finland, BECS, and Aalto University School of Science for providing me with the research funding which made this book possible.

I would like to thank Jouko Lampinen and Aki Vehtari from BECS for giving me the opportunity to do the research and for co-operation which led to this book. Arno Solin, Robert Piché, Juha Sarmavuori, Thomas Schön, Pete Bunch, and Isambi S. Mbalawata deserve thanks for careful checking of the book and for giving a lot of useful suggestions for improving the text. I am also grateful to Jouni Hartikainen, Ville Väänänen, Heikki Haario, and Simon Godsill for research co-operation that led to improvement of my understanding of the topic as well as to the development of some of the methods which now are explained in this book. I would also like to thank Diana Gillooly from Cambridge University Press and series editor Susan Holmes for suggesting the publication of my lecture notes in book form. Finally, I am grateful to my wife Susanne for her support and patience during the writing of this book.

Simo Särkkä
Vantaa, Finland


Symbols and abbreviations

General notation
a, b, c, x, t, α, β   Scalars
a, f, s, x, y, α, β   Vectors (boldface in the book)
A, F, S, X, Y   Matrices (boldface capitals)
A, F, S, X, Y   Sets (calligraphic capitals)
A, F, S, X, Y   Spaces (blackboard capitals)

Notational conventions
A^T   Transpose of matrix
A^{-1}   Inverse of matrix
A^{-T}   Inverse of transpose of matrix
[A]_i   i-th column of matrix A
[A]_{ij}   Element at i-th row and j-th column of matrix A
|a|   Absolute value of scalar a
|A|   Determinant of matrix A
dx/dt   Time derivative of x(t)
∂g_i(x)/∂x_j   Partial derivative of g_i with respect to x_j
(a_1, ..., a_n)   Column vector with elements a_1, ..., a_n
(a_1 ⋯ a_n)   Row vector with elements a_1, ..., a_n
(a_1 ⋯ a_n)^T   Column vector with elements a_1, ..., a_n
∂g(x)/∂x   Gradient (column vector) of scalar function g
∂g(x)/∂x   Jacobian matrix of vector-valued function x ↦ g(x)
Cov[x]   Covariance Cov[x] = E[(x − E[x]) (x − E[x])^T] of the random variable x
diag(a_1, ..., a_n)   Diagonal matrix with diagonal values a_1, ..., a_n
√P   Matrix such that P = √P (√P)^T
E[x]   Expectation of x
E[x | y]   Conditional expectation of x given y
∫ f(x) dx   Lebesgue integral of f(x) over the space R^n
p(x)   Probability density of continuous random variable x or probability of discrete random variable x
p(x | y)   Conditional probability density or conditional probability of x given y
p(x) ∝ q(x)   p(x) is proportional to q(x), that is, there exists a constant c such that p(x) = c q(x) for all values of x
tr A   Trace of matrix A
Var[x]   Variance Var[x] = E[(x − E[x])^2] of the scalar random variable x
x ≫ y   x is much greater than y
x_{i,k}   i-th component of vector x_k
x ∼ p(x)   Random variable x has the probability density or probability distribution p(x)
x ≜ y   x is defined to be equal to y
x ≈ y   x is approximately equal to y
x ≃ y   x is assumed to be approximately equal to y
x_{0:k}   Set or sequence containing the vectors {x_0, ..., x_k}
ẋ   Time derivative of x(t)
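A few of the conventions above, spelled out numerically for concreteness (in NumPy here, although the book's own examples are MATLAB/GNU Octave; the particular matrices are arbitrary):

```python
import numpy as np

# diag(a_1, ..., a_n): diagonal matrix with the given diagonal values.
D = np.diag([1.0, 2.0, 3.0])

# sqrt(P): any matrix such that P = sqrt(P) sqrt(P)^T; the lower
# triangular Cholesky factor is one common choice.
P = np.array([[4.0, 2.0],
              [2.0, 3.0]])
L = np.linalg.cholesky(P)
assert np.allclose(L @ L.T, P)

# Cov[x] = E[(x - E[x]) (x - E[x])^T], here estimated from samples.
rng = np.random.default_rng(0)
x = rng.multivariate_normal(np.zeros(2), P, size=100_000)
d = x - x.mean(axis=0)
C = d.T @ d / (x.shape[0] - 1)
assert np.allclose(C, P, atol=0.1)
```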

Symbols
α   Parameter of the unscented transform, or pendulum angle
α_i   Acceptance probability in an MCMC method
ᾱ   Target acceptance rate in an adaptive MCMC
β   Parameter of the unscented transform
δ(·)   Dirac delta function
δx   Difference of x from the mean: δx = x − m
Δt   Sampling period
Δt_k   Length of the time interval: Δt_k = t_{k+1} − t_k
ε_k   Measurement error at the time step k
ε_k   Vector of measurement errors at the time step k
θ   Vector of parameters
θ_k   Vector of parameters at the time step k
θ^(n)   Vector of parameters at iteration n of the EM algorithm
θ^(i)   Vector of parameters at iteration i of the MCMC algorithm
θ*   Candidate point in the MCMC algorithm
θ̂_MAP   Maximum a posteriori (MAP) estimate of parameter θ
κ   Parameter of the unscented transform


λ   Parameter of the unscented transform
λ′   Parameter of the unscented transform
λ″   Parameter of the unscented transform
μ_k   Predicted mean of measurement y_k in a Kalman/Gaussian filter at the time step k
μ_L   Mean in the linear approximation of a non-linear transform
μ_M   Mean in the Gaussian moment matching approximation
μ_Q   Mean in the quadratic approximation
μ_S   Mean in the statistical linearization approximation
μ_U   Mean in the unscented approximation
π(·)   Importance distribution
σ²   Variance
σ_i²   Variance of noise component i
Σ   Auxiliary matrix needed in the EM algorithm
Σ_i   Proposal distribution covariance in the Metropolis algorithm
φ_k(θ)   Energy function at the time step k
Φ(·)   A function returning the lower triangular part of its argument
Φ   An auxiliary matrix needed in the EM algorithm
ξ   Unit Gaussian random variable
ξ^(i)   i-th scalar unit sigma point
ξ   Vector of unit Gaussian random variables
ξ^(i)   i-th unit sigma point vector
ξ^(i_1,...,i_n)   Unit sigma point in the multivariate Gauss–Hermite cubature
a   Action in decision theory, or a part of a mean vector
a_o   Optimal action
a(t)   Acceleration
A   Dynamic model matrix in a linear time-invariant model, the lower triangular Cholesky factor of a covariance matrix, the upper left block of a covariance matrix, a coefficient in statistical linearization, or an arbitrary matrix
A_k   Dynamic model matrix (i.e., transition matrix) of the jump from step k to step k+1
b   The lower part of a mean vector, the offset term in statistical linearization, or an arbitrary vector
B   Lower right block of a covariance matrix, an auxiliary matrix needed in the EM algorithm, or an arbitrary matrix
B_{j|k}   Gain matrix in a fixed-point or fixed-lag Gaussian smoother
c   Scalar constant
C(·)   Cost or loss function


C   The upper right block of a covariance matrix, an auxiliary matrix needed in the EM algorithm, or an arbitrary matrix
C_k   Cross-covariance matrix in a non-linear Kalman filter
C_L   Cross-covariance in the linear approximation of a non-linear transform
C_M   Cross-covariance in the Gaussian moment matching approximation of a non-linear transform
C_Q   Cross-covariance in the quadratic approximation
C_S   Cross-covariance in the statistical linearization approximation
C_U   Cross-covariance in the unscented approximation
d   Positive integer, usually dimensionality of the parameters
d_i   Order of a monomial
dt   Differential of time variable t
dx   Differential of vector x
D   Derivative of the Cholesky factor, an auxiliary matrix needed in the EM algorithm, or an arbitrary matrix
D_k   Cross-covariance matrix in a non-linear RTS smoother, or an auxiliary matrix used in derivations
e_i   Unit vector in the direction of the coordinate axis i
f(·)   Dynamic transition function in a state space model
F_x(·)   Jacobian matrix of the function x ↦ f(x)
F   Feedback matrix of a continuous-time linear state space model
F_xx^(i)(·)   Hessian matrix of x ↦ f_i(x)
F[·]   An auxiliary functional needed in the derivation of the EM algorithm
g   Gravitation acceleration
g(·)   An arbitrary function
g_i(·)   An arbitrary function
g(·)   An arbitrary function (vector-valued)
g^{-1}(·)   Inverse function of g(·)
g̃(·)   Augmented function with elements (x, g(·))
G_k   Gain matrix in an RTS smoother
G_x(·)   Jacobian matrix of the function x ↦ g(x)
G_xx^(i)(·)   Hessian matrix of x ↦ g_i(x)
H_p(·)   p-th order Hermite polynomial
H   Measurement model matrix in a linear Gaussian model, or a Hessian matrix
H_k   Measurement model matrix at the time step k in a linear Gaussian model


H_x(·)   Jacobian matrix of the function x ↦ h(x)
H_xx^(i)(·)   Hessian matrix of x ↦ h_i(x)
h(·)   Measurement model function in a state space model
i   Integer-valued index variable
I   Identity matrix
I_i(θ, θ^(n))   An integral term needed in the EM algorithm
J(·)   Jacobian matrix
k   Time step number
K_k   Gain matrix of a Kalman/Gaussian filter
L   Noise coefficient (i.e., dispersion) matrix of a continuous-time linear state space model
L(·)   Likelihood function
m   Dimensionality of a measurement, mean of the univariate Gaussian distribution, or the mass
m   Mean of a Gaussian distribution
m̃   Mean of an augmented random variable
m_k   Mean of a Kalman/Gaussian filter at the time step k
m_k^(i)   Mean of the Kalman filter in the particle i of RBPF at the time step k
m_{0:T}^(i)   History of means of the Kalman filter in the particle i of RBPF
m̃_k   Augmented mean at the time step k, or an auxiliary variable used in derivations
m_k^-   Predicted mean of a Kalman/Gaussian filter at the time step k just before the measurement y_k
m_k^-(i)   Predicted mean of the Kalman filter in the particle i of RBPF at the time step k
m̃_k^-   Augmented predicted mean at the time step k
m_k^s   Mean computed by a Gaussian fixed-interval (RTS) smoother for the time step k
m_{0:T}^{s,(i)}   History of means of the RTS smoother in the particle i of RBPS
m_{k|n}   Conditional mean of x_k given y_{1:n}
n   Positive integer, usually the dimensionality of the state
n′   Augmented state dimensionality in a non-linear transform
n″   Augmented state dimensionality in a non-linear transform
N   Positive integer, usually the number of Monte Carlo samples
N(·)   Gaussian distribution (i.e., normal distribution)


p                Order of a Hermite polynomial
P                Variance of the univariate Gaussian distribution
P                Covariance of the Gaussian distribution
\tilde{P}        Covariance of an augmented random variable
P_k              Covariance of a Kalman/Gaussian filter at the time step k
P_k^{(i)}        Covariance of the Kalman filter in the particle i of RBPF at the time step k
P_{0:T}^{(i)}    History of covariances of the Kalman filter in the particle i of RBPF
\tilde{P}_k      Augmented covariance at the time step k, or an auxiliary variable used in derivations
P_k^-            Predicted covariance of a Kalman/Gaussian filter at the time step k just before the measurement y_k
\tilde{P}_k^-    Augmented predicted covariance at the time step k
P_k^{-(i)}       Predicted covariance of the Kalman filter in the particle i of RBPF at the time step k
P_k^s            Covariance computed by a Gaussian fixed-interval (RTS) smoother for the time step k
P_{0:T}^{s,(i)}  History of covariances of the RTS smoother in the particle i of RBPS
P_{k|n}          Conditional covariance of x_k given y_{1:n}
q^c              Spectral density of a white noise process
q_i^c            Spectral density of component i of a white noise process
q(·)             Proposal distribution in the MCMC algorithm, or an arbitrary distribution in the derivation of the EM-algorithm
q^{(n)}          Distribution approximation on the nth step of the EM-algorithm
q                Gaussian random vector
q_k              Gaussian process noise
Q                Variance of scalar process noise
Q(·, ·^{(n)})    An auxiliary function needed in the EM-algorithm
Q                Covariance of the process noise in a time-invariant model
Q_k              Covariance of the process noise at the jump from step k to k + 1
r_k              Scalar Gaussian measurement noise
r_k              Vector of Gaussian measurement noises
R                Variance of scalar measurement noise
R                Covariance matrix of the measurement in a time-invariant model
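To see how the filter quantities above fit together, here is a minimal sketch of one Kalman filter predict/update cycle for a scalar state. It is written in Python rather than the book's MATLAB/Octave, and the model values (A, H, Q, R) and measurements are invented for illustration only:

```python
# Scalar Kalman filter step illustrating the notation:
# m_k, P_k (filter mean and variance), m_k^-, P_k^- (predicted mean
# and variance just before y_k), Q and R (process and measurement
# noise variances). A, H, Q, R and the data are invented values.

def kalman_step(m, P, y, A=1.0, H=1.0, Q=0.1, R=0.5):
    """Map (m_{k-1}, P_{k-1}) and measurement y_k to (m_k, P_k)."""
    # Prediction: m_k^- and P_k^-
    m_pred = A * m
    P_pred = A * P * A + Q
    # Update: innovation v_k, innovation covariance S_k, gain K_k
    v = y - H * m_pred
    S = H * P_pred * H + R
    K = P_pred * H / S
    return m_pred + K * v, P_pred - K * S * K

m, P = 0.0, 1.0              # prior mean m_0 and variance P_0
for y in [0.9, 1.2, 1.05]:   # measurements y_{1:3}
    m, P = kalman_step(m, P, y)
```

Each pass through the loop shrinks the variance P_k as another measurement is absorbed, while the mean m_k moves toward the measurements.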


R_k              Covariance matrix of the measurement at the time step k
R                Space of real numbers
R^n              n-dimensional space of real numbers
R^{n×m}          Space of real n × m matrices
S                Number of backward-simulation draws
S_k              Innovation covariance of a Kalman/Gaussian filter at step k
S_L              Covariance in the linear approximation of a non-linear transform
S_M              Covariance in the Gaussian moment matching approximation of a non-linear transform
S_Q              Covariance in the quadratic approximation of a non-linear transform
S_S              Covariance in the statistical linearization approximation of a non-linear transform
S_U              Covariance in the unscented approximation of a non-linear transform
t                Time variable, t ∈ [0, ∞)
t_k              Time of the step k (usually time of the measurement y_k)
T                Index of the last time step, the final time of a time interval
T_k              Sufficient statistics
u                Uniform random variable
u_k              Latent (non-linear) variable in a Rao–Blackwellized particle filter or smoother
u_k^{(i)}        Latent variable value in particle i
u_{0:k}^{(i)}    History of latent variable values in particle i
U(·)             Utility function
U(·)             Uniform distribution
v_k^{(i)}        Unnormalized weight in an SIR particle filter based likelihood evaluation
v_k              Innovation vector of a Kalman/Gaussian filter at step k
w^{(i)}          Normalized weight of the particle i in importance sampling
\tilde{w}^{(i)}  Weight of the particle i in importance sampling
w^{*(i)}         Unnormalized weight of the particle i in importance sampling
w_k^{(i)}        Normalized weight of the particle i on step k of a particle filter
w_{k|n}^{(i)}    Normalized weight of a particle smoother
w_i              Weight i in a regression model
w_k              Vector of weights at the time step k in a regression model
w(t)             Gaussian white noise process
W                Weight in the cubature or unscented approximation
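The relation between the unnormalized weights w^{*(i)} and the normalized weights w^{(i)} can be illustrated with a small self-normalized importance sampling sketch. The Python code, the toy target (an unnormalized standard normal density), and the uniform proposal are my own illustration, not from the book:

```python
import math
import random

# Self-normalized importance sampling: draw x^(i) from a proposal q,
# form unnormalized weights w*^(i) = target(x) / q(x), then normalize
# so that the weights w^(i) sum to one. Toy target and proposal are
# invented for illustration.

def target_unnorm(x):
    return math.exp(-0.5 * x * x)   # unnormalized N(0, 1) density

random.seed(0)
xs = [random.uniform(-5.0, 5.0) for _ in range(10_000)]    # x^(i) ~ q
q_pdf = 1.0 / 10.0                                         # Uniform(-5, 5) density
w_star = [target_unnorm(x) / q_pdf for x in xs]            # unnormalized w*^(i)
total = sum(w_star)
w = [ws / total for ws in w_star]                          # normalized w^(i)
mean_est = sum(wi * xi for wi, xi in zip(w, xs))           # estimate of E[x]
```

Since the target here is symmetric about zero, the weighted mean estimate should be close to the true value E[x] = 0.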


W_i              ith weight in a sigma-point approximation or in Gauss–Hermite quadrature
W_i^{(m)}        Mean weight of the unscented transform
W_i^{(m)'}       Mean weight of the unscented transform
W_i^{(c)}        Covariance weight of the unscented transform
W_i^{(c)'}       Covariance weight of the unscented transform
W_{i_1,...,i_n}  Weight in multivariate Gauss–Hermite cubature
x                Scalar random variable or state, sometimes a regressor variable, or a generic scalar variable
x                Random variable or state
x^{(i)}          ith Monte Carlo draw from the distribution of x
x_k              State at the time step k
\tilde{x}_k      Augmented state at the time step k
x_{0:k}          Set containing the state vectors {x_0, ..., x_k}
x_{0:k}^{(i)}    The history of the states in the particle i
\tilde{x}_{0:T}^{(j)}  State trajectory simulated by a backward-simulation particle smoother
X                Matrix of regressors
X_k              Matrix of regressors up to the time step k
\mathcal{X}^{(·)}            Sigma point of x
\tilde{\mathcal{X}}^{(·)}    Augmented sigma point of x
\mathcal{X}_k^{(·)}          Sigma point of the state x_k
\tilde{\mathcal{X}}_k^{(·)}  Augmented sigma point of the state x_k
\hat{\mathcal{X}}_k^{(·)}    Predicted sigma point of the state x_k
\mathcal{X}_k^{-(·)}         Sigma point of the predicted state x_k
\tilde{\mathcal{X}}_k^{-(·)} Augmented sigma point of the predicted state x_k
y                Random variable or measurement
y_k              Measurement at the time step k
y_{1:k}          Set containing the measurement vectors {y_1, ..., y_k}
\mathcal{Y}^{(·)}            Sigma point of y
\tilde{\mathcal{Y}}^{(·)}    Augmented sigma point of y
\hat{\mathcal{Y}}_k^{(i)}    ith predicted sigma point of the measurement y_k at step k
Z                Normalization constant
Z_k              Normalization constant at the time step k
∞                Infinity


Abbreviations

ADF      Assumed density filter
AM       Adaptive Metropolis (algorithm)
AMCMC    Adaptive Markov chain Monte Carlo
AR       Autoregressive (model)
ARMA     Autoregressive moving average (model)
ASIR     Auxiliary sequential importance resampling
BS-PS    Backward-simulation particle smoother
CDKF     Central differences Kalman filter
CKF      Cubature Kalman filter
CLT      Central limit theorem
CPF      Cubature particle filter
CRLB     Cramer–Rao lower bound
DLM      Dynamic linear model
DOT      Diffuse optical tomography
DSP      Digital signal processing
EC       Expectation correction
EEG      Electroencephalography
EKF      Extended Kalman filter
EM       Expectation–maximization
EP       Expectation propagation
ERTSS    Extended Rauch–Tung–Striebel smoother
FHKF     Fourier–Hermite Kalman filter
FHRTSS   Fourier–Hermite Rauch–Tung–Striebel smoother
fMRI     Functional magnetic resonance imaging
GHKF     Gauss–Hermite Kalman filter
GHPF     Gauss–Hermite particle filter
GHRTSS   Gauss–Hermite Rauch–Tung–Striebel smoother
GPB      Generalized pseudo-Bayesian
GPS      Global positioning system
HMC      Hamiltonian (or hybrid) Monte Carlo
HMM      Hidden Markov model
IMM      Interacting multiple model (algorithm)
INS      Inertial navigation system
IS       Importance sampling
InI      Inverse imaging
KF       Kalman filter
LMS      Least mean squares
LQG      Linear quadratic Gaussian (regulator)


LS       Least squares
MA       Moving average (model)
MAP      Maximum a posteriori
MC       Monte Carlo
MCMC     Markov chain Monte Carlo
MEG      Magnetoencephalography
MH       Metropolis–Hastings
MKF      Mixture Kalman filter
ML       Maximum likelihood
MLP      Multi-layer perceptron
MMSE     Minimum mean squared error
MNE      Minimum norm estimate
MSE      Mean squared error
PF       Particle filter
PMCMC    Particle Markov chain Monte Carlo
PMMH     Particle marginal Metropolis–Hastings
PS       Particle smoother
QKF      Quadrature Kalman filter
RAM      Robust adaptive Metropolis (algorithm)
RBPF     Rao–Blackwellized particle filter
RBPS     Rao–Blackwellized particle smoother
RMSE     Root mean squared error
RTS      Rauch–Tung–Striebel
RTSS     Rauch–Tung–Striebel smoother
SDE      Stochastic differential equation
SIR      Sequential importance resampling
SIR-PS   Sequential importance resampling particle smoother
SIS      Sequential importance sampling
SLDS     Switching linear dynamic system
SLF      Statistically linearized filter
SLRTSS   Statistically linearized Rauch–Tung–Striebel smoother
SMC      Sequential Monte Carlo
TVAR     Time-varying autoregressive (model)
UKF      Unscented Kalman filter
UPF      Unscented particle filter
URTSS    Unscented Rauch–Tung–Striebel smoother
UT       Unscented transform
