

824 Book Reviews

econometricians working in the area of time series econometrics and forecasting and also for library purchase.

Kuldeep Kumar
Bond University
Gold Coast
E-mail: [email protected]

Introductory Time Series with R
P. S. P. Cowpertwait and A. V. Metcalfe, 2009
New York, Springer
xvi + 254 pp., $48.45
ISBN 978-0-387-88697-8

The title speaks for itself. Each of the 12 chapters includes the required minimum of theory and some helpful examples to show how to proceed practically with R. The book seems to agree with the statement (which is usually attributed to Albert Einstein) that

‘example isn’t another way to teach, it is the only way to teach’.

It is well structured: the core material in the chapters is placed between sections entitled ‘Purpose’ and ‘Summary of R commands’ and is complemented with exercises. The exposition is simple and yet profound. The index is helpful. There is enough R code for a book that ends in ‘. . . with R’.

As one would guess from the above summary, this reviewer’s reaction to the book is entirely positive. It covers subjects that are standard and common to many time series analysis books: in order of appearance, time series data, correlation, forecasting strategies, basic stochastic models, regression, stationary models, non-stationary models, long memory processes, spectral analysis, system identification, multivariate models and state space models. However, the choice of material for the chapters is quite arbitrary. For example, the chapter on forecasting comprises three relatively independent topics: leading variables, the Bass model and exponential smoothing. The full contents (and the rest of the book) are freely available from the Springer Web site.

The book has a clear focus on model fitting and data analysis, which makes it particularly helpful to those recognizing that (Akaike, 1983)

‘the statistical problem is not one of testing a given hypothesis but rather one of fitting models with various number of parameters and to decide when to stop fitting’.
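Akaike’s dictum can be made concrete with a small sketch (written in Python here for self-containedness; the data, the polynomial models and all parameter values are invented for illustration and are not taken from the book): fit models with an increasing number of parameters and let AIC decide when to stop fitting.

```python
import math
import random

random.seed(0)

# Simulated data: a quadratic trend plus Gaussian noise (made up here).
n = 120
xs = [i / n for i in range(n)]
ys = [2.0 - 3.0 * x + 1.5 * x * x + random.gauss(0.0, 0.2) for x in xs]

def poly_rss(xs, ys, deg):
    """Residual sum of squares of a degree-`deg` polynomial least-squares
    fit, solved via the normal equations with Gaussian elimination."""
    k = deg + 1
    X = [[x ** j for j in range(k)] for x in xs]
    A = [[sum(X[i][a] * X[i][b] for i in range(len(xs))) for b in range(k)]
         for a in range(k)]
    v = [sum(X[i][a] * ys[i] for i in range(len(xs))) for a in range(k)]
    for col in range(k):                      # forward elimination
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        v[col], v[piv] = v[piv], v[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            v[r] -= f * v[col]
    beta = [0.0] * k                          # back substitution
    for r in range(k - 1, -1, -1):
        beta[r] = (v[r] - sum(A[r][c] * beta[c]
                              for c in range(r + 1, k))) / A[r][r]
    return sum((ys[i] - sum(beta[j] * X[i][j] for j in range(k))) ** 2
               for i in range(len(xs)))

def aic(rss, n, k):
    # Gaussian-likelihood AIC: n*log(RSS/n) plus 2 per parameter.
    return n * math.log(rss / n) + 2 * k

scores = {d: aic(poly_rss(xs, ys, d), n, d + 1) for d in range(1, 6)}
best = min(scores, key=scores.get)
print(best)   # AIC should favour a low-degree model over the degree-5 one
```

The point is exactly the one in the quotation: no hypothesis is tested; models of growing complexity are fitted, and the penalty term tells us when to stop.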

With regard to the R component, the authors do not emphasize the efficiency and optimality of the language, deliberately using loops for transparency and to avoid obscure-looking code.

The book is a clear success. It is not that I cannot see typographical errors, or things to disagree with as a researcher (e.g. on pages 123 and 55 respectively), but that I cannot see a good reason to be negative for longer than a sentence. I recommend it to anyone with suitably limited expertise in time series and/or R. It will certainly help to go through all of its examples, which should bring a beginner to a point where they are ready to set out on a longer journey, with a more advanced time series or R book.

Reference
Akaike, H. (1983) Information measures and model selection. Bull. Int. Statist. Inst., 50, 277–290.

Andrey Kostenko
Monash University
Melbourne
E-mail: [email protected]

The Oxford Handbook of Nonlinear Filtering
D. Crisan and B. Rozovskii (eds), 2011
Oxford, Oxford University Press
xiv + 1064 pp., $110
ISBN 978-0-199-53290-2

This is an excellent collection of 35 well-written papers by about 50 leading scientists. The papers deal with different aspects of non-linear filtering (NLF) problems.

The introduction written by the editors provides a comprehensive overview of NLF problems. The idea is that we deal with a two-component dynamic stochastic system, say (X_t, Y_t), t ≥ 0, where X_t, t ≥ 0, is a non-observable signal, and all the information about it is contained in the observable process Y_t, t ≥ 0, which in some way depends on the signal. The processes involved may have discrete or continuous time and discrete or continuous state space. The problem is to find the optimal mean-square estimator, or optimal filter, of a given function f(X_t) based on the available observations, i.e. on the filtration Y_t = σ(Y_s, s ≤ t), t ≥ 0; the optimal filter is π_t(f) = E[f(X_t) | Y_t]. Deriving an equation for this conditional expectation, or studying its properties, turned out to be an extremely difficult problem.

Perhaps everything started in the early 1960s with the relatively easy linear model leading to the classical Kalman–Bucy filter. This was followed by important contributions by leading mathematicians from all over the world. The handbook provides full information about progress in this area. It should be noted that the NLF problems needed fundamentally new ideas and techniques. Thus, in particular, a new area of research was born: the area of stochastic partial differential equations. Another area which received much attention and intensive development was martingale theory. These two areas were the basis of the progress that was achieved in solving NLF problems. This handbook convincingly demonstrates that we have today a well-developed theory of NLF. For a variety of stochastic models we can write non-trivial stochastic partial differential equations whose solutions describe the dynamics of optimal filters. There are analytical results about their solutions; however, in most of the cases the only possibility for meeting the demands of practice is to use ‘good’ approximations; hence the importance of numerical methods.
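The ‘relatively easy linear model’ of the early 1960s is simple enough to sketch in a few lines. Below is a standard discrete-time Kalman filter on a toy scalar linear-Gaussian model; the model and every parameter value are invented for illustration and are not taken from the handbook. The filter mean m plays the role of π_t(f) for f the identity function.

```python
import math
import random

# Toy linear-Gaussian state-space model (discrete-time analogue of the
# Kalman-Bucy setting): hidden signal X_t, noisy observation Y_t.
a, c = 0.9, 1.0          # state transition and observation coefficients
q, r = 0.5, 1.0          # process and observation noise variances

random.seed(1)
x, xs, ys = 0.0, [], []
for _ in range(200):
    x = a * x + random.gauss(0.0, math.sqrt(q))   # signal dynamics
    y = c * x + random.gauss(0.0, math.sqrt(r))   # observation
    xs.append(x)
    ys.append(y)

# Kalman recursion: m and p are the mean and variance of the conditional
# law of X_t given Y_1, ..., Y_t.
m, p, est = 0.0, 1.0, []
for y in ys:
    m_pred, p_pred = a * m, a * a * p + q         # predict step
    gain = p_pred * c / (c * c * p_pred + r)      # Kalman gain
    m = m_pred + gain * (y - c * m_pred)          # update mean
    p = (1.0 - gain * c) * p_pred                 # update variance
    est.append(m)

mse_filter = sum((e - t) ** 2 for e, t in zip(est, xs)) / len(xs)
mse_raw = sum((y - t) ** 2 for y, t in zip(ys, xs)) / len(xs)
print(mse_filter < mse_raw)   # the filter should beat the raw observations
```

In this linear case the conditional law stays Gaussian, so two numbers suffice to describe the optimal filter; the difficulty the review describes is precisely that no such finite-dimensional recursion exists for general non-linear models.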

In a short book review it is not possible to write about the content of all 35 papers. The papers, however, are carefully written with appropriate references. It is interesting that the authors not only describe specific models and formulate questions and results, but in many cases give quite complete proofs. It is more than curious to see new proofs, short and elegant, of results that were established 30 or 40 years ago.

The papers are grouped in the following parts: I, ‘The foundations of nonlinear filtering’; II, ‘Nonlinear filtering and stochastic partial differential equations’; III, ‘Stability and asymptotic analysis’; IV, ‘Special topics’; V, ‘Estimation and control’; VI, ‘Approximation theory’; VII, ‘The particle approach’; VIII, ‘Numerical methods in nonlinear filtering’; IX, ‘Nonlinear filtering in financial mathematics’. The handbook ends with a comprehensive index.
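As a small taste of the ‘particle approach’ of part VII, here is a minimal bootstrap particle filter for a toy non-linear model. The model, the noise levels and the particle count are all invented for illustration and do not come from the handbook; the particle cloud is a Monte Carlo approximation of the conditional law whose mean approximates π_t(f) for f the identity.

```python
import math
import random

random.seed(2)
Q, R = 0.3, 0.5   # process and observation noise variances (made up)

def simulate(T=100):
    """Toy non-linear model: X_t = sin(X_{t-1}) + noise, Y_t = X_t + noise."""
    x, xs, ys = 0.5, [], []
    for _ in range(T):
        x = math.sin(x) + random.gauss(0.0, math.sqrt(Q))
        xs.append(x)
        ys.append(x + random.gauss(0.0, math.sqrt(R)))
    return xs, ys

def bootstrap_filter(ys, n=500):
    parts = [random.gauss(0.0, 1.0) for _ in range(n)]
    means = []
    for y in ys:
        # propagate each particle through the signal dynamics
        parts = [math.sin(p) + random.gauss(0.0, math.sqrt(Q)) for p in parts]
        # weight by the Gaussian observation likelihood
        w = [math.exp(-(y - p) ** 2 / (2.0 * R)) for p in parts]
        s = sum(w) or 1.0
        w = [wi / s for wi in w]
        means.append(sum(wi * p for wi, p in zip(w, parts)))
        # multinomial resampling to fight weight degeneracy
        parts = random.choices(parts, weights=w, k=n)
    return means

xs, ys = simulate()
est = bootstrap_filter(ys)
mse_pf = sum((e - t) ** 2 for e, t in zip(est, xs)) / len(xs)
mse_obs = sum((y - t) ** 2 for y, t in zip(ys, xs)) / len(xs)
print(round(mse_pf, 3), round(mse_obs, 3))
```

The sketch shows why the particle approach is attractive in practice: the same three steps (propagate, weight, resample) apply to essentially any signal dynamics and observation model, trading exactness for a numerical approximation.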

This volume is good value in size, price and content. As a result of the good job by the publisher, the editors and the authors, we, the readers, have a chance to use and enjoy this handbook, which at the moment is, and for many years to come will be, an irreplaceable source of information for anyone working or interested in NLF problems. Hence, there are good reasons to recommend this invaluable reference book strongly to any science library.

Jordan Stoyanov
Newcastle University
E-mail: [email protected]

Approaching Multivariate Analysis, a Practical Introduction, 2nd edn
P. Dugard, J. Todman and H. Staines, 2010
London, Routledge
440 pp., £49.95
ISBN 978-0-415-47828-1

This book is a friendly, non-mathematical introduction to multivariate analysis. The text focuses very much on the practical side of multivariate statistics. Numerous examples are given, as well as information on how to carry out the analyses that are described in the text by using SPSS and SAS. This is an excellent book, ideally suited for a reader without a mathematical background. The new material that is contained in this second edition is of special interest to medical researchers, although it will also be of much use to psychologists. Full demonstrations and the data used in the book are available on an accompanying Web site.

After a useful introduction describing some general ideas that are needed for the remainder of the topics covered, the book goes on to describe wide-ranging statistical techniques such as analysis of variance, regression, survival analysis and multi-dimensional scaling (to name but a few). Each chapter begins with a brief introduction to the technique in question and closes with guidance on how the reader may report the results of their own statistical analyses. Further reading is also provided (references are kept to a minimum in the text). An extremely useful glossary of terms is included at the end of the text.

The book is well structured and very easy to follow. The authors use their wealth of experience to give some practical guidance (some of which would be very useful even to the more experienced of statisticians). I myself have used material in the book to construct examples for a postgraduate course on applied statistics. The book actively encourages the reader to engage themselves in statistical analyses, and to play around with the examples provided.

I wholeheartedly recommend this book to anyone who is interested in ‘getting their hands dirty’ with some practical statistics but does not have a strong mathematical background. Academics who also teach statistics to non-mathematicians would find the content of the book particularly helpful.

Jonathan Gillard
Cardiff University
E-mail: [email protected]

Multivariable Modeling and Multivariate Analysis for the Behavioral Sciences
B. S. Everitt, 2010
London, CRC Press
xviii + 304 pp., $69.95
ISBN 978-1-439-80769-9

Multivariable Modeling and Multivariate Analysis for the Behavioral Sciences by Brian Everitt is a second-level applied statistics book which is aimed