Introduction to recursive Bayesian filtering


  • 1

    Introduction to recursive Bayesian filtering

    Michael Rubinstein

    IDC

    Problem overview

    Input: (noisy) sensor measurements

    Goal: estimate the most probable state at time k using measurements up to time k' (k' > k: smoothing; k' = k: filtering)

    Many problems require estimation of the state of systems that change over time, using noisy measurements on the system

    Michael Rubinstein

  • 2

    Applications

    Ballistics, Robotics

    Robot localization, Tracking hands/cars/..., Econometrics

    Stock prediction, Navigation

    Many more

    Michael Rubinstein

    Challenges (why is it a hard problem?)

    Measurements: noise, errors

    Detection-specific: full/partial occlusions, deformable objects, entering/leaving the scene, lighting variations

    Efficiency, multiple models and switching dynamics, multiple targets, ...

    Michael Rubinstein

  • 3

    Talk overview

    Background / model setup

    Markovian stochastic processes, the state space model, dynamic systems

    The Bayesian approach: recursive filters, restrictive cases + pros and cons

    The Kalman filter, the grid-based filter

    Particle filters: Monte Carlo integration, importance sampling

    Multiple target tracking: BraMBLe, ICCV 2001

    Michael Rubinstein

    Stochastic processes

    Deterministic process: only one possible reality

    Random process: several possible evolutions (starting point might be known)

    Characterized by probability distributions

    Time series modeling: sequence of random states/variables, measurements available at discrete times

    Michael Rubinstein

  • 4

    State space

    The state vector contains all available information to describe the investigated system; it is usually multidimensional: $X(k) \in \mathbb{R}^{N_x}$

    The measurement vector represents observations related to the state vector: $Z(k) \in \mathbb{R}^{N_z}$

    Generally (but not necessarily) of lower dimension than the state vector

    Michael Rubinstein

    State space

    Tracking example: $X = [x, y, v_x, v_y]^T$, $N_x = 4$

    Econometrics example: $X = [\text{monetary flow}, \text{interest rates}, \text{inflation}]^T$, $N_x = 3$

    Michael Rubinstein

  • 5

    (First order) Markov process

    The Markov property: the likelihood of a future state depends on the present state only

    Markov chain: a stochastic process with the Markov property

    $\Pr[X(k+h) = y \mid X(s) = x(s),\, s \le k] = \Pr[X(k+h) = y \mid X(k) = x(k)], \quad h > 0$

    Michael Rubinstein

    (Diagram: states $x_{k-1}, x_k, x_{k+1}$ along the time axis $k-1, k, k+1$)

    Hidden Markov Model (HMM)

    The state is not directly visible, but output dependent on the state is visible

    (Diagram: hidden states $x_{k-1}, x_k, x_{k+1}$ with observed measurements $z_{k-1}, z_k, z_{k+1}$)

    Michael Rubinstein

  • 6

    Dynamic system

    State equation: $x_k = f_k(x_{k-1}, v_{k-1})$
    $x_k$: state vector at time instant k
    $f_k$: state transition function, $f_k: \mathbb{R}^{N_x} \times \mathbb{R}^{N_v} \to \mathbb{R}^{N_x}$
    $v_{k-1}$: i.i.d. process noise

    Observation equation: $z_k = h_k(x_k, w_k)$
    $z_k$: observations at time instant k
    $h_k$: observation function, $h_k: \mathbb{R}^{N_x} \times \mathbb{R}^{N_w} \to \mathbb{R}^{N_z}$
    $w_k$: i.i.d. measurement noise

    (Diagram: stochastic diffusion of states $x_{k-1}, x_k, x_{k+1}$ generating observations $z_{k-1}, z_k, z_{k+1}$)

    Michael Rubinstein

    A simple dynamic system

    (4-dimensional state space) $X = [x, y, v_x, v_y]^T$

    Constant velocity motion: $f(X, v) = [x + v_x \Delta t,\; y + v_y \Delta t,\; v_x,\; v_y]^T + v$, with $v \sim N(0, Q)$, $Q = \mathrm{diag}(0, 0, q^2, q^2)$

    Only position is observed: $z = h(X, w) = [x, y]^T + w$, with $w \sim N(0, R)$, $R = \mathrm{diag}(r^2, r^2)$

    Michael Rubinstein
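    As a concrete illustration, here is a minimal NumPy sketch that simulates this constant-velocity model; the time step dt, the noise magnitudes q and r, and the initial state are illustration values assumed here, not taken from the slides:

```python
import numpy as np

def simulate(T=100, dt=1.0, q=0.05, r=1.0, seed=0):
    """Simulate x_k = F x_{k-1} + v_k (constant velocity) and z_k = H x_k + w_k (position only)."""
    rng = np.random.default_rng(seed)
    F = np.array([[1, 0, dt, 0],
                  [0, 1, 0, dt],
                  [0, 0, 1,  0],
                  [0, 0, 0,  1]], dtype=float)
    x = np.array([0.0, 0.0, 1.0, 0.5])          # initial state [x, y, vx, vy]
    xs, zs = [], []
    for _ in range(T):
        x = F @ x
        x[2:] += rng.normal(0.0, q, 2)          # process noise v ~ N(0, Q) on the velocities
        z = x[:2] + rng.normal(0.0, r, 2)       # measurement w ~ N(0, R) on the observed position
        xs.append(x)
        zs.append(z)
    return np.array(xs), np.array(zs)

states, measurements = simulate()
```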

  • 7

    The Bayesian approach

    Construct the posterior probability density function $p(x_k|z_{1:k})$ of the state, based on all available information

    Thomas Bayes

    (Figure: posterior density over the sample space)

    By knowing the posterior, many kinds of estimates for $x_k$ can be derived: mean (expectation), mode, median, ...
    It can also give an estimation of the accuracy (e.g. covariance)

    Michael Rubinstein

    Recursive filters

    For many problems, an estimate is required each time a new measurement arrives

    Batch processing: requires all available data

    Sequential processing: new data is processed upon arrival; need not store the complete data set; need not reprocess all data for each new measurement

    Assume no out-of-sequence measurements (solutions for this exist as well)

    Michael Rubinstein

  • 8

    Recursive Bayes filters

    Given:

    System and measurement models in probabilistic form (known statistics of $v_k$, $w_k$):
    $x_k = f_k(x_{k-1}, v_{k-1}) \;\leftrightarrow\; p(x_k|x_{k-1})$ (Markovian process)
    $z_k = h_k(x_k, w_k) \;\leftrightarrow\; p(z_k|x_k)$ (measurements are conditionally independent given the state)

    Initial state $p(x_0|z_0) = p(x_0)$, also known as the prior

    Measurements $z_1, \ldots, z_k$

    Michael Rubinstein

    Recursive Bayes filters

    Prediction step (a priori): $p(x_{k-1}|z_{1:k-1}) \rightarrow p(x_k|z_{1:k-1})$
    Uses the system model to predict forward; deforms/translates/spreads the state pdf due to the random noise

    Update step (a posteriori): $p(x_k|z_{1:k-1}) \rightarrow p(x_k|z_{1:k})$
    Updates the prediction in light of the new data; tightens the state pdf

    Michael Rubinstein

  • 9

    General prediction-update framework

    Assume $p(x_{k-1}|z_{1:k-1})$ is given at time k-1

    Prediction: using the Chapman-Kolmogorov identity + the Markov property,

    $p(x_k|z_{1:k-1}) = \int p(x_k|x_{k-1})\, p(x_{k-1}|z_{1:k-1})\, dx_{k-1}$   (1)

    (system model x previous posterior)

    Michael Rubinstein

    General prediction-update framework

    Update step: using Bayes' rule $p(A|B,C) = \dfrac{p(B|A,C)\, p(A|C)}{p(B|C)}$,

    $p(x_k|z_{1:k}) = p(x_k|z_k, z_{1:k-1}) = \dfrac{p(z_k|x_k)\, p(x_k|z_{1:k-1})}{p(z_k|z_{1:k-1})}$   (2)

    (likelihood from the measurement model x current prior, divided by the evidence)

    where the normalization constant (evidence) is

    $p(z_k|z_{1:k-1}) = \int p(z_k|x_k)\, p(x_k|z_{1:k-1})\, dx_k$

    Michael Rubinstein

  • 10

    Generating estimates

    Knowledge of $p(x_k|z_{1:k})$ enables computing an optimal estimate with respect to any criterion, e.g.

    Minimum mean square error (MMSE): $\hat{x}_{k|k}^{MMSE} = E[x_k|z_{1:k}] = \int x_k\, p(x_k|z_{1:k})\, dx_k$

    Maximum a posteriori (MAP): $\hat{x}_{k|k}^{MAP} = \arg\max_{x_k} p(x_k|z_{1:k})$

    (Figure: MAP and MMSE estimates on a multimodal posterior)

    Michael Rubinstein
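    A minimal sketch of how these two estimates could be read off a discretized 1-D posterior; the grid and the unnormalized density below are made-up illustration values:

```python
import numpy as np

# Hypothetical discretized posterior p(x_k | z_1:k) on a 1-D grid
x_grid = np.linspace(-5.0, 5.0, 1001)
unnormalized = np.exp(-0.5 * (x_grid - 1.2)**2) + 0.4 * np.exp(-0.5 * ((x_grid + 2.0) / 0.5)**2)
posterior = unnormalized / np.trapz(unnormalized, x_grid)   # normalize so it integrates to 1

x_mmse = np.trapz(x_grid * posterior, x_grid)   # MMSE estimate: posterior mean
x_map = x_grid[np.argmax(posterior)]            # MAP estimate: posterior mode
print(x_mmse, x_map)
```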

    General prediction-update framework

    So (1) and (2) give the optimal solution for the recursive estimation problem!

    Unfortunately, this is only a conceptual solution: the integrals are intractable, and we can only implement the pdf up to a finite representation

    However, an optimal solution does exist for several restrictive cases

    Michael Rubinstein

  • 11

    Restrictive case #1

    The posterior at each time step is Gaussian: completely described by its mean and covariance

    If $p(x_{k-1}|z_{1:k-1})$ is Gaussian, it can be shown that $p(x_k|z_{1:k})$ is also Gaussian, provided that:
    $v_k, w_k$ are Gaussian
    $f_k, h_k$ are linear

    Michael Rubinstein

    Restrictive case #1

    Why linear?

    $y = Ax + B \;\Rightarrow\; p(y) \sim N(A\mu + B,\; A\Sigma A^T)$

    Michael Rubinstein

    Yacov Hel-Or

  • 12

    Restrictive case #1

    Why linear?

    $y = g(x) \;\Rightarrow\; p(y) \nsim N(\cdot)$ (a nonlinear mapping of a Gaussian is in general not Gaussian)

    Michael Rubinstein

    Yacov Hel-Or

    Restrictive case #1

    Linear system with additive noise:

    $x_k = f_k(x_{k-1}, v_{k-1}) \;\Rightarrow\; x_k = F_k x_{k-1} + v_{k-1},\quad v_k \sim N(0, Q_k)$
    $z_k = h_k(x_k, w_k) \;\Rightarrow\; z_k = H_k x_k + w_k,\quad w_k \sim N(0, R_k)$

    The simple example again, $f(X, v) = [x + v_x \Delta t,\; y + v_y \Delta t,\; v_x,\; v_y]^T + v$ and $z = h(X, w) = [x, y]^T + w$, in matrix form:

    $\begin{bmatrix} x_k \\ y_k \\ v_{x,k} \\ v_{y,k} \end{bmatrix} = \underbrace{\begin{bmatrix} 1 & 0 & \Delta t & 0 \\ 0 & 1 & 0 & \Delta t \\ 0 & 0 & 1 & 0 \\ 0 & 0 & 0 & 1 \end{bmatrix}}_{F} \begin{bmatrix} x_{k-1} \\ y_{k-1} \\ v_{x,k-1} \\ v_{y,k-1} \end{bmatrix} + N(0, Q)$

    $\begin{bmatrix} x_k^{obs} \\ y_k^{obs} \end{bmatrix} = \underbrace{\begin{bmatrix} 1 & 0 & 0 & 0 \\ 0 & 1 & 0 & 0 \end{bmatrix}}_{H} \begin{bmatrix} x_k \\ y_k \\ v_{x,k} \\ v_{y,k} \end{bmatrix} + N(0, R)$

    Michael Rubinstein

  • 13

    The Kalman filter

    Rudolf E. Kalman

    $p(x_{k-1}|z_{1:k-1}) = N(x_{k-1};\; \hat{x}_{k-1|k-1},\; P_{k-1|k-1})$
    $p(x_k|z_{1:k-1}) = N(x_k;\; \hat{x}_{k|k-1},\; P_{k|k-1})$
    $p(x_k|z_{1:k}) = N(x_k;\; \hat{x}_{k|k},\; P_{k|k})$

    where $N(x; \mu, \Sigma) = |2\pi\Sigma|^{-1/2} \exp\!\left(-\tfrac{1}{2}(x-\mu)^T \Sigma^{-1} (x-\mu)\right)$

    Substituting into (1) and (2) yields the predict and update equations

    Michael Rubinstein

    The Kalman filter

    Predict:
    $\hat{x}_{k|k-1} = F_k \hat{x}_{k-1|k-1}$
    $P_{k|k-1} = F_k P_{k-1|k-1} F_k^T + Q_{k-1}$

    Update:
    $S_k = H_k P_{k|k-1} H_k^T + R_k$
    $K_k = P_{k|k-1} H_k^T S_k^{-1}$
    $\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (z_k - H_k \hat{x}_{k|k-1})$
    $P_{k|k} = (I - K_k H_k) P_{k|k-1}$

    Michael Rubinstein
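    A minimal NumPy sketch of these predict/update equations; the function and variable names are my own, and the matrices are whatever F, H, Q, R describe the system at hand (e.g. the constant-velocity example above):

```python
import numpy as np

def kf_predict(x, P, F, Q):
    """Kalman predict: propagate mean and covariance through the linear dynamics."""
    x_pred = F @ x
    P_pred = F @ P @ F.T + Q
    return x_pred, P_pred

def kf_update(x_pred, P_pred, z, H, R):
    """Kalman update: correct the prediction with measurement z."""
    S = H @ P_pred @ H.T + R                     # innovation covariance
    K = P_pred @ H.T @ np.linalg.inv(S)          # Kalman gain
    x = x_pred + K @ (z - H @ x_pred)            # posterior mean
    P = (np.eye(len(x)) - K @ H) @ P_pred        # posterior covariance
    return x, P
```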

  • 14

    Intuition via 1D example

    Lost at sea, at night, no idea of location; for simplicity, let's assume 1D

    Michael Rubinstein

    * Example and plots by Maybeck, Stochastic models, estimation and control, volume 1

    Example - cont'd

    Time t1: star sighting, denote x(t1) = z1

    Uncertainty (inaccuracies, human error, etc.): denote $\sigma_1$ (normal)

    We can establish the conditional probability of x(t1) given measurement z1

    Michael Rubinstein

  • 15

    Example - cont'd

    Probability for any location, based on the measurement. For a Gaussian density, 68.3% lies within $\pm\sigma_1$. Best estimate of position: mean/mode/median

    Michael Rubinstein

    Example - cont'd

    Time t2 (about the same as t1): a friend (more trained) measures x(t2) = z2 with uncertainty $\sigma(t2) = \sigma_2$. Since she has higher skill: $\sigma_2 < \sigma_1$

  • 16

    Example - cont'd

    f(x(t2)|z1,z2) is also Gaussian

    Michael Rubinstein

    Example contd

    lessthanboth1and2 1=2:average 1>2:moreweighttoz2 Rewrite:

    Michael Rubinstein

  • 17

    Example - cont'd

    The Kalman update rule:

    Best estimate given z2 (a posteriori) = best prediction prior to z2 (a priori) + optimal weighting (Kalman gain) x residual

    Michael Rubinstein

    The Kalman filter

    Predict:
    $\hat{x}_{k|k-1} = F_k \hat{x}_{k-1|k-1}$
    $P_{k|k-1} = F_k P_{k-1|k-1} F_k^T + Q_{k-1}$

    Update:
    $S_k = H_k P_{k|k-1} H_k^T + R_k$
    $K_k = P_{k|k-1} H_k^T S_k^{-1}$
    $\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (z_k - H_k \hat{x}_{k|k-1})$
    $P_{k|k} = (I - K_k H_k) P_{k|k-1}$

    Michael Rubinstein

    In the 1D example: $\mu = \dfrac{\sigma_2^2}{\sigma_1^2 + \sigma_2^2} z_1 + \dfrac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2} z_2 = z_1 + \dfrac{\sigma_1^2}{\sigma_1^2 + \sigma_2^2}(z_2 - z_1)$, with $\dfrac{1}{\sigma^2} = \dfrac{1}{\sigma_1^2} + \dfrac{1}{\sigma_2^2}$
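    A tiny numeric check that this 1-D fusion rule is the same thing as a scalar Kalman update; the values of z1, z2 and the standard deviations are arbitrary illustration numbers:

```python
import numpy as np

z1, s1 = 10.0, 2.0    # first sighting and its std
z2, s2 = 11.0, 1.0    # friend's (more accurate) sighting and its std

# Weighted combination of the two measurements, and the combined variance
mu = (s2**2 / (s1**2 + s2**2)) * z1 + (s1**2 / (s1**2 + s2**2)) * z2
var = 1.0 / (1.0 / s1**2 + 1.0 / s2**2)

# Same result written as a scalar Kalman update: prior z1, gain K, residual (z2 - z1)
K = s1**2 / (s1**2 + s2**2)
mu_kalman = z1 + K * (z2 - z1)

assert np.isclose(mu, mu_kalman)
print(mu, var)
```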

  • 18

    Kalman gain

    $S_k = H_k P_{k|k-1} H_k^T + R_k$
    $K_k = P_{k|k-1} H_k^T S_k^{-1}$
    $\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (z_k - H_k \hat{x}_{k|k-1})$
    $P_{k|k} = (I - K_k H_k) P_{k|k-1}$

    Small measurement error: $\lim_{R_k \to 0} K_k = H_k^{-1}$, so $\lim_{R_k \to 0} \hat{x}_{k|k} = H_k^{-1} z_k$ (trust the measurement)

    Small prediction error: $\lim_{P_{k|k-1} \to 0} K_k = 0$, so $\lim_{P_{k|k-1} \to 0} \hat{x}_{k|k} = \hat{x}_{k|k-1}$ (trust the prediction)

    Michael Rubinstein

    The Kalman filter

    Pros:
    Optimal closed-form solution to the tracking problem (under the assumptions)
    No algorithm can do better in a linear-Gaussian environment!
    All logical estimations collapse to a unique solution
    Simple to implement
    Fast to execute

    Cons:
    If either the system or the measurement model is nonlinear, the posterior will be non-Gaussian

    Michael Rubinstein

  • 19

    Restrictive case #2

    The state space (domain) is discrete and finite

    Assume the state space at time k-1 consists of states $x_{k-1}^i,\; i = 1..N_s$

    Let $w_{k-1|k-1}^i = \Pr(x_{k-1} = x_{k-1}^i \mid z_{1:k-1})$ be the conditional probability of the state at time k-1, given measurements up to k-1

    Michael Rubinstein

    The grid-based filter

    The posterior pdf at k-1 can be expressed as a sum of delta functions:

    $p(x_{k-1}|z_{1:k-1}) = \sum_{i=1}^{N_s} w_{k-1|k-1}^i\, \delta(x_{k-1} - x_{k-1}^i)$

    Again, substitution into (1) and (2) yields the predict and update equations

    Michael Rubinstein

  • 20

    The grid-based filter - Prediction

    $p(x_k|z_{1:k-1}) = \int p(x_k|x_{k-1})\, p(x_{k-1}|z_{1:k-1})\, dx_{k-1}$   (1)

    $p(x_k|z_{1:k-1}) = \sum_{i=1}^{N_s} w_{k|k-1}^i\, \delta(x_k - x_k^i)$, with $w_{k|k-1}^i = \sum_{j=1}^{N_s} w_{k-1|k-1}^j\, p(x_k^i | x_{k-1}^j)$

    The new prior is also a weighted sum of delta functions
    The new prior weights are a reweighting of the old posterior weights using the state transition probabilities

    Michael Rubinstein

    The grid-based filter - Update

    $p(x_k|z_{1:k}) = \dfrac{p(z_k|x_k)\, p(x_k|z_{1:k-1})}{p(z_k|z_{1:k-1})}$   (2)

    $p(x_k|z_{1:k}) = \sum_{i=1}^{N_s} w_{k|k}^i\, \delta(x_k - x_k^i)$, with $w_{k|k}^i = \dfrac{w_{k|k-1}^i\, p(z_k|x_k^i)}{\sum_{j=1}^{N_s} w_{k|k-1}^j\, p(z_k|x_k^j)}$

    The posterior weights are a reweighting of the prior weights using the likelihoods (+ normalization)

    Michael Rubinstein
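    A minimal sketch of these two weight-update rules for a finite state space; the transition matrix and the likelihood values below are hypothetical toy numbers:

```python
import numpy as np

def grid_predict(w_post, trans):
    """Prediction: w_pred[i] = sum_j w_post[j] * p(x_k = state_i | x_{k-1} = state_j)."""
    # trans[i, j] = p(x_k = state_i | x_{k-1} = state_j)
    return trans @ w_post

def grid_update(w_pred, lik):
    """Update: w_post[i] proportional to w_pred[i] * p(z_k | x_k = state_i), then normalized."""
    w = w_pred * lik
    return w / w.sum()

# Toy 3-state example (made-up numbers)
trans = np.array([[0.8, 0.1, 0.1],
                  [0.1, 0.8, 0.1],
                  [0.1, 0.1, 0.8]])
w = np.array([1/3, 1/3, 1/3])            # uniform initial weights
lik = np.array([0.05, 0.7, 0.25])        # p(z_k | x_k = state_i) for the current measurement
w = grid_update(grid_predict(w, trans), lik)
print(w)
```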

  • 21

    The grid-based filter

    Pros:
    $p(x_k|x_{k-1}^i)$ and $p(z_k|x_k^i)$ are assumed known, but there is no constraint on their (discrete) shapes
    Easy extension to a varying number of states
    Optimal solution for the discrete-finite environment!

    Cons:
    Curse of dimensionality: inefficient if the state space is large
    Statically considers all possible hypotheses

    Michael Rubinstein

    Suboptimal solutions

    In many cases these assumptions do not hold: practical environments are nonlinear, non-Gaussian, continuous
    Approximations are necessary:

    Analytic approximations: extended Kalman filter (EKF), Gaussian-sum filters
    Numerical methods: approximate grid-based methods
    Multiple-model estimators
    Sampling approaches: unscented Kalman filter (UKF), particle filters (PF)

    Michael Rubinstein

  • 22

    The extended Kalman filter

    The idea: a local linearization of the dynamic system might be a sufficient description of the nonlinearity

    The model: nonlinear system with additive noise

    $x_k = f_k(x_{k-1}) + v_{k-1},\quad v_{k-1} \sim N(0, Q_{k-1})$
    $z_k = h_k(x_k) + w_k,\quad w_k \sim N(0, R_k)$

    (compare with the linear case $x_k = F_k x_{k-1} + v_{k-1}$, $z_k = H_k x_k + w_k$)

    Michael Rubinstein

    Michael Rubinstein

    The extended Kalman filter

    f, h are approximated using a first-order Taylor series expansion (evaluated at the state estimations):

    $F_k[i,j] = \dfrac{\partial f_k[i]}{\partial x[j]}\bigg|_{x = \hat{x}_{k-1|k-1}}, \qquad H_k[i,j] = \dfrac{\partial h_k[i]}{\partial x[j]}\bigg|_{x = \hat{x}_{k|k-1}}$

    Predict:
    $\hat{x}_{k|k-1} = f_k(\hat{x}_{k-1|k-1})$
    $P_{k|k-1} = F_k P_{k-1|k-1} F_k^T + Q_{k-1}$

    Update:
    $S_k = H_k P_{k|k-1} H_k^T + R_k$
    $K_k = P_{k|k-1} H_k^T S_k^{-1}$
    $\hat{x}_{k|k} = \hat{x}_{k|k-1} + K_k (z_k - h_k(\hat{x}_{k|k-1}))$
    $P_{k|k} = (I - K_k H_k) P_{k|k-1}$

    Michael Rubinstein

  • 23

    The extended Kalman filter

    Michael Rubinstein

    Yacov Hel-Or

    The extended Kalman filter

    Pros:
    Good approximation when the models are near-linear
    Efficient to calculate (de facto method for navigation systems and GPS)

    Cons:
    Only an approximation (optimality not proven)
    Still a single Gaussian approximation: nonlinearity leads to non-Gaussianity (e.g. bimodal posteriors)
    If we have a multimodal hypothesis and choose incorrectly, it can be difficult to recover
    Inapplicable when f, h are discontinuous

    Michael Rubinstein

  • 24

    Intermission

    Questions?

    Michael Rubinstein

    Particle filtering

    Family of techniques:
    Condensation algorithms (MacCormick & Blake, '99)
    Bootstrap filtering (Gordon et al., '93)
    Particle filtering (Carpenter et al., '99)
    Interacting particle approximations (Moral '98)
    Survival of the fittest (Kanazawa et al., '95)
    Sequential Monte Carlo methods (SMC, SMCM)
    SIS, SIR, ASIR, RPF, ...

    Statistics introduced in the 1950s; incorporated in vision in the last decade

    Michael Rubinstein

  • 25

    Particle filtering

    Many variations, one general concept:

    Represent the posterior pdf by a set of randomly chosen, weighted samples (particles)

    Randomly chosen = Monte Carlo (MC)
    As the number of samples becomes very large, the characterization becomes an equivalent representation of the true pdf

    Michael Rubinstein

    (Figure: weighted particles approximating the posterior over the sample space)

    Particle filtering

    Compared to previous methods:
    Can represent any arbitrary distribution (multimodal support)
    Keeps track of several hypotheses simultaneously
    Approximate representation of a complex model, rather than an exact representation of a simplified model

    The basic building block: importance sampling

    Michael Rubinstein

  • 26

    Monte Carlo integration

    Evaluate complex integrals using probabilistic techniques

    Assume we are trying to estimate a complicated integral of a function f over some domain D:

    $F = \int_D f(\vec{x})\, d\vec{x}$

    Also assume there exists some PDF p defined over D

    Michael Rubinstein

    Monte Carlo integration

    Then

    $F = \int_D f(\vec{x})\, d\vec{x} = \int_D \frac{f(\vec{x})}{p(\vec{x})}\, p(\vec{x})\, d\vec{x}$

    But

    $\int_D \frac{f(\vec{x})}{p(\vec{x})}\, p(\vec{x})\, d\vec{x} = E\!\left[\frac{f(\vec{x})}{p(\vec{x})}\right],\quad \vec{x} \sim p$

    This is true for any PDF p over D!

    Michael Rubinstein

  • 27

    Monte Carlo integration

    Now, if we have i.i.d. random samples $\vec{x}^1, \ldots, \vec{x}^N$ sampled from p, then we can approximate $E\!\left[\frac{f(\vec{x})}{p(\vec{x})}\right]$ by

    $F_N = \frac{1}{N} \sum_{i=1}^N \frac{f(\vec{x}^i)}{p(\vec{x}^i)}$

    Guaranteed by the law of large numbers:

    $N \to \infty:\quad F_N \xrightarrow{a.s.} E\!\left[\frac{f(\vec{x})}{p(\vec{x})}\right] = F$

    Michael Rubinstein

    Importance Sampling (IS)

    What about $p(\vec{x}) = 0$?

    If p is very small, $f/p$ can be arbitrarily large, damaging the average
    Design p (the importance or proposal density) such that $f/p$ (the importance weights) is bounded
    Rule of thumb: take p as similar to f as possible

    The effect: get more samples in "important" areas of f, i.e. where f is large

    Michael Rubinstein
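    A small NumPy illustration of the estimator $F_N$; the integrand and the Gaussian proposal below are arbitrary choices made for this demonstration:

```python
import numpy as np

rng = np.random.default_rng(0)

# Target: F = integral of f(x) over the real line; here f is an unnormalized Gaussian bump,
# so the true value is sqrt(2*pi) ~ 2.5066
f = lambda x: np.exp(-0.5 * (x - 2.0)**2)

# Proposal p: a Gaussian roughly matched to f (rule of thumb: p similar to f)
mu_p, sigma_p = 2.0, 1.5
p = lambda x: np.exp(-0.5 * ((x - mu_p) / sigma_p)**2) / (sigma_p * np.sqrt(2 * np.pi))

N = 100_000
x = rng.normal(mu_p, sigma_p, N)          # i.i.d. samples from p
F_N = np.mean(f(x) / p(x))                # importance-sampling Monte Carlo estimate
print(F_N, np.sqrt(2 * np.pi))
```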

  • 28

    Convergence of MC integration

    Chebyshev's inequality: let X be a random variable with expected value $\mu$ and standard deviation $\sigma$. For any real number $k > 0$,

    $\Pr\{|X - \mu| \ge k\sigma\} \le \dfrac{1}{k^2}$

    For example, for $k = \sqrt{2}$ it shows that at least half the values lie in the interval $(\mu - \sqrt{2}\sigma,\; \mu + \sqrt{2}\sigma)$

    Let $y_i = \dfrac{f(x^i)}{p(x^i)}$; then the MC estimator is $F_N = \dfrac{1}{N}\sum_{i=1}^N y_i$

    Pafnuty Lvovich Chebyshev

    Michael Rubinstein

    Convergence of MC integration

    By Chebyshev's, with $k = (1/\delta)^{1/2}$:

    $\Pr\left\{|F_N - E[F_N]| \ge \left(\dfrac{V[F_N]}{\delta}\right)^{1/2}\right\} \le \delta$

    $V[F_N] = V\!\left[\dfrac{1}{N}\sum_{i=1}^N y_i\right] = \dfrac{1}{N^2}\sum_{i=1}^N V[y_i] = \dfrac{1}{N} V[y]$

    $\Pr\left\{|F_N - E[F_N]| \ge \left(\dfrac{V[y]}{N\delta}\right)^{1/2}\right\} \le \delta$

    Hence, for a fixed threshold $\delta$, the error decreases at rate $1/\sqrt{N}$

    Michael Rubinstein

    Michael Rubinstein

  • 29

    Convergence of MC integration

    Meaning:
    1. To cut the error in half, it is necessary to evaluate 4 times as many samples
    2. The convergence rate is independent of the integrand dimension!

    In contrast, the convergence rate of grid-based approximations decreases as $N_x$ increases

    Michael Rubinstein

    IS for Bayesian estimation

    $E(f(x_{0:k})) = \int f(x_{0:k})\, \dfrac{p(x_{0:k}|z_{1:k})}{q(x_{0:k}|z_{1:k})}\, q(x_{0:k}|z_{1:k})\, dx_{0:k}$

    We characterize the posterior pdf using a set of samples (particles) and their weights $\{x_{0:k}^i, w_k^i\}_{i=1}^N$

    Then the joint posterior density at time k is approximated by

    $p(x_{0:k}|z_{1:k}) \approx \sum_{i=1}^N w_k^i\, \delta(x_{0:k} - x_{0:k}^i)$

    Michael Rubinstein

  • 30

    IS for Bayesian estimation

    We draw the samples from the importance density $q(x_{0:k}|z_{1:k})$ with importance weights

    $w_k^i \propto \dfrac{p(x_{0:k}^i|z_{1:k})}{q(x_{0:k}^i|z_{1:k})}$

    Sequential update (after some calculation):

    Particle update: $x_k^i \sim q(x_k|x_{k-1}^i, z_k)$

    Weight update: $w_k^i \propto w_{k-1}^i\, \dfrac{p(z_k|x_k^i)\, p(x_k^i|x_{k-1}^i)}{q(x_k^i|x_{k-1}^i, z_k)}$

    Michael Rubinstein

    Sequential Importance Sampling (SIS)

    $[\{x_k^i, w_k^i\}_{i=1}^N] = \mathrm{SIS}[\{x_{k-1}^i, w_{k-1}^i\}_{i=1}^N, z_k]$

    FOR i = 1:N
        Draw $x_k^i \sim q(x_k|x_{k-1}^i, z_k)$
        Update weights $w_k^i \propto w_{k-1}^i\, \dfrac{p(z_k|x_k^i)\, p(x_k^i|x_{k-1}^i)}{q(x_k^i|x_{k-1}^i, z_k)}$
    END
    Normalize weights

    Michael Rubinstein
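    A minimal sketch of one SIS iteration; the proposal, transition and likelihood densities are passed in as functions, since their particular forms depend on the application and are not fixed by the algorithm:

```python
import numpy as np

def sis_step(particles, weights, z, propose, trans_pdf, lik_pdf, prop_pdf, rng):
    """One SIS iteration: propagate each particle through the proposal and update its weight.

    propose(x_prev, z, rng) -> new particle sampled from q(x_k | x_{k-1}, z_k)
    trans_pdf(x, x_prev)    -> p(x_k | x_{k-1})
    lik_pdf(z, x)           -> p(z_k | x_k)
    prop_pdf(x, x_prev, z)  -> q(x_k | x_{k-1}, z_k)
    """
    new_particles = np.array([propose(xp, z, rng) for xp in particles])
    w = weights * np.array([
        lik_pdf(z, x) * trans_pdf(x, xp) / prop_pdf(x, xp, z)
        for x, xp in zip(new_particles, particles)
    ])
    return new_particles, w / w.sum()          # normalize the weights
```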

  • 31

    State estimates

    Any function $f(x_k)$ can be calculated from the discrete pdf approximation:

    $E[f(x_k)] \approx \sum_{i=1}^N w_k^i\, f(x_k^i)$

    Examples:
    Mean (simple average): weighted mean of the particles
    MAP estimate: the particle with the largest weight
    Robust mean: mean within a window around the MAP estimate

    (Figure: MAP, mean and robust mean estimates on a particle approximation)

    Michael Rubinstein

    Choice of importance density

    Michael Rubinstein

    Hsiao et al.

  • 32

    Choice of importance density

    Most common (suboptimal): the transitional prior

    $q(x_k|x_{k-1}^i, z_k) = p(x_k|x_{k-1}^i)$

    $\Rightarrow\; w_k^i \propto w_{k-1}^i\, \dfrac{p(z_k|x_k^i)\, p(x_k^i|x_{k-1}^i)}{q(x_k^i|x_{k-1}^i, z_k)} = w_{k-1}^i\, p(z_k|x_k^i)$

    Michael Rubinstein

    Compare with the grid filter weight update: $w_{k|k}^i = \dfrac{w_{k|k-1}^i\, p(z_k|x_k^i)}{\sum_{j=1}^{N_s} w_{k|k-1}^j\, p(z_k|x_k^j)}$

    The degeneracy phenomenon

    Unavoidable problem with SIS: after a few iterations most particles have negligible weights
    Large computational effort for updating particles with very small contribution to $p(x_k|z_{1:k})$

    Measure of degeneracy - the effective sample size:

    $N_{eff} = \dfrac{1}{\sum_{i=1}^N (w_k^i)^2}$

    Uniform weights: $N_{eff} = N$; severe degeneracy: $N_{eff} = 1$

    Michael Rubinstein
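    In code, the effective sample size is a one-liner over the normalized weights:

```python
import numpy as np

def effective_sample_size(weights):
    """N_eff = 1 / sum_i (w_i)^2, for normalized weights."""
    w = np.asarray(weights)
    return 1.0 / np.sum(w**2)

print(effective_sample_size([0.25, 0.25, 0.25, 0.25]))   # uniform weights  -> 4.0
print(effective_sample_size([1.0, 0.0, 0.0, 0.0]))       # severe degeneracy -> 1.0
```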

  • 33

    Resampling

    The idea: when degeneracy is above some threshold, eliminate particles with low importance weights and multiply particles with high importance weights

    $\{x_k^{i*}, \tfrac{1}{N}\}_{i=1}^N \;\leftarrow\; \{x_k^i, w_k^i\}_{i=1}^N$

    The new set is generated by sampling with replacement from the discrete representation of $p(x_k|z_{1:k})$, such that $\Pr\{x_k^{i*} = x_k^j\} = w_k^j$

    Michael Rubinstein

    Resampling

    $[\{x_k^{i*}, w_k^i\}_{i=1}^N] = \mathrm{RESAMPLE}[\{x_k^i, w_k^i\}_{i=1}^N]$ (the new weights are uniform, $1/N$)

    Generate N i.i.d. variables $u_i \sim U[0,1]$
    Sort them in ascending order
    Compare them with the cumulative sum of the normalized weights

    Ristic et al. Michael Rubinstein
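    A sketch of this sorted-uniform (multinomial) resampling scheme; the O(N) systematic variant mentioned on the next slide would only change how the u values are generated:

```python
import numpy as np

def resample(particles, weights, rng):
    """Multinomial resampling: draw N sorted U[0,1] variables and compare them
    with the cumulative sum of the normalized weights."""
    N = len(weights)
    u = np.sort(rng.uniform(size=N))          # N i.i.d. U[0,1] samples, sorted ascending
    cdf = np.cumsum(weights)
    idx = np.searchsorted(cdf, u)             # first index whose cumulative weight covers u
    idx = np.minimum(idx, N - 1)              # guard against floating-point round-off at 1.0
    return particles[idx], np.full(N, 1.0 / N)   # resampled particles get uniform weights

rng = np.random.default_rng(0)
parts = np.array([0.0, 1.0, 2.0, 3.0])
w = np.array([0.1, 0.1, 0.7, 0.1])
print(resample(parts, w, rng))
```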

  • 34

    Resampling

    Complexity: O(N log N)
    O(N) sampling algorithms exist

    Michael Rubinstein. Hsiao et al.

    Generic PF

    $[\{x_k^i, w_k^i\}_{i=1}^N] = \mathrm{PF}[\{x_{k-1}^i, w_{k-1}^i\}_{i=1}^N, z_k]$

    Apply SIS filtering: $[\{x_k^i, w_k^i\}_{i=1}^N] = \mathrm{SIS}[\{x_{k-1}^i, w_{k-1}^i\}_{i=1}^N, z_k]$
    Calculate $N_{eff}$
    IF $N_{eff} < N_{thr}$: resample $\{x_k^i, w_k^i\}_{i=1}^N$ into $\{x_k^{i*}, \tfrac{1}{N}\}_{i=1}^N$
    END

    With the transitional prior as importance density (starting from an initial set $\{x_0^i, w_0^i\}_{i=1}^N$ drawn from $X_0$), each iteration becomes:
    Predict: $x_k^i \sim p(x_k|x_{k-1} = x_{k-1}^{i*})$
    Reweight: $w_k^i = p(z_k|x_k = x_k^i)$
    Normalize weights
    Estimate $\hat{x}_k$ (for display)

    Michael Rubinstein
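    Putting the pieces together, a compact bootstrap-style particle filter for the constant-velocity model sketched earlier; the particle count, noise values, broad prior and N/2 resampling threshold are illustrative choices, not values from the slides:

```python
import numpy as np

def bootstrap_pf(zs, N=500, dt=1.0, q=0.05, r=1.0, seed=0):
    """Bootstrap PF (transitional-prior proposal) for the constant-velocity model."""
    rng = np.random.default_rng(seed)
    F = np.array([[1, 0, dt, 0], [0, 1, 0, dt], [0, 0, 1, 0], [0, 0, 0, 1]], float)
    particles = rng.normal(0.0, 5.0, size=(N, 4))     # initial particles from a broad prior
    weights = np.full(N, 1.0 / N)
    estimates = []
    for z in zs:
        # Predict: propagate each particle through the dynamics with process noise
        particles = particles @ F.T
        particles[:, 2:] += rng.normal(0.0, q, size=(N, 2))
        # Reweight with the Gaussian likelihood p(z_k | x_k) of the observed position
        d2 = np.sum((particles[:, :2] - z)**2, axis=1)
        weights *= np.exp(-0.5 * d2 / r**2)
        weights /= weights.sum()
        estimates.append(weights @ particles)          # MMSE estimate (weighted mean)
        # Resample when the effective sample size drops below N/2
        if 1.0 / np.sum(weights**2) < N / 2:
            idx = rng.choice(N, size=N, p=weights)
            particles, weights = particles[idx], np.full(N, 1.0 / N)
    return np.array(estimates)
```

    For example, it could be run on the measurements produced by the simulation sketch shown after the simple dynamic system slide.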

    Red pill or blue pill?

    1. We had enough - show us some videos!

    2. 15 minute walkthrough of a multiple target tracking system

    Michael Rubinstein

  • 37

    Multiple Targets (MTT/MOT)

    Previous challenges:
    Full/partial occlusions
    Entering/leaving the scene

    And in addition:
    Estimating the number of objects
    Computationally tractable for multiple simultaneous targets
    Interaction between objects

    Many works on multiple single-target filters

    Michael Rubinstein

    BraMBLe: A Bayesian Multiple-Blob Tracker

    M. Isard and J. MacCormick
    Compaq Systems Research Center
    ICCV 2001

    Some slides taken from Qi Zhao. Some images taken from Isard and MacCormick

  • 38

    BraMBLe

    First rigorous particle filter implementation with a variable number of targets

    The posterior distribution is constructed over possible object configurations and number

    Sensor: a single static camera
    Tracking: SIR particle filter
    Performance: real time for 1-2 simultaneous objects

    Michael Rubinstein

    The BraMBLe posterior

    $p(x_k|z_{1:k})$

    $x_k$: state at frame k (number, positions, shapes, velocities, ...)
    $z_{1:k}$: image sequence

    Michael Rubinstein

  • 39

    State space

    Hypothesis configuration: $X_k = (m_k,\, x_k^1, x_k^2, \ldots, x_k^{m_k})$

    Object configuration: $x_k^i = (i,\, \mathbf{X}_k^i,\, \mathbf{V}_k^i,\, \mathbf{S}_k^i)$
    identifier: $i$
    position: $\mathbf{X} = (x, z)$
    velocity: $\mathbf{V} = (v_x, v_z)$
    shape: $\mathbf{S} = (w_f, w_w, w_s, w_h, \ldots)$

    $N_x = 1 + 13\, M_{max}$

    Michael Rubinstein

    Object model

    A person is modeled as a generalized cylinder with a vertical axis in world coordinates; calibrated camera

    The shape parameters $\mathbf{S}$ (widths $w$ and heights $h$ at foot, waist, shoulder and head) define the radius profile $r_i(y)$ of the cylinder

    Michael Rubinstein

  • 40

    Observation likelihood $p(Z_t|X_t)$

    The image is overlaid with a rectangular grid (e.g. every 5 pixels)

    At each grid point, a filter-bank response is computed from the Y, Cr, Cb components using Gaussian and Mexican hat (second derivative of Gaussian) filters

    Michael Rubinstein

    Observation likelihood $p(Z_t|X_t)$

    The response values are assumed conditionally independent given X:

    $p(Z|X) = \prod_g p(z_g|X) = \prod_g p(z_g|l_g)$

    where $l_g$ is the occupancy label of grid site g ($l_g = 0$: background; $l_g = 1, 2, 3, \ldots$: number of objects covering the site)

    Michael Rubinstein
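    A rough sketch of this factored likelihood, assuming per-site responses and per-label log-densities are already available; the function names and the toy Gaussian densities standing in for the trained GMMs are hypothetical placeholders:

```python
import numpy as np

def log_likelihood(responses, labels, log_pdf_bg, log_pdf_fg):
    """log p(Z|X) = sum_g log p(z_g | l_g), with grid sites treated as conditionally independent.

    responses: (G, d) array of filter-bank responses, one row per grid site
    labels:    (G,) array with l_g = 0 for background sites, > 0 for foreground sites
    log_pdf_bg / log_pdf_fg: functions mapping a response vector to a log density
    """
    total = 0.0
    for z_g, l_g in zip(responses, labels):
        total += log_pdf_bg(z_g) if l_g == 0 else log_pdf_fg(z_g)
    return total

# Toy demo with isotropic Gaussian log-densities in place of the trained GMMs
log_pdf_bg = lambda z: -0.5 * np.sum(z**2)
log_pdf_fg = lambda z: -0.5 * np.sum((z - 1.0)**2)
resp = np.random.default_rng(0).normal(size=(10, 3))
print(log_likelihood(resp, np.array([0] * 7 + [1] * 3), log_pdf_bg, log_pdf_fg))
```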

  • 41

    Appearance models

    GMMs for the background and for the foreground are trained using k-means

    $p(z_g \mid l_g = 0) = \frac{1}{K_B} \sum_k N(\mu_k^B, \Sigma_k^B)$ (background)
    $p(z_g \mid l_g \neq 0) = \frac{1}{K_F} \sum_k N(\mu_k^F, \Sigma_k^F)$ (foreground)

    (mixtures with K = 4 and K = 16 components)

    Michael Rubinstein

    Observation likelihood

    Michael Rubinstein

    (Figure: per-site log likelihood ratio $\log \dfrac{p(z_g \mid l_g \neq 0)}{p(z_g \mid l_g = 0)}$ overlaid on the image)

  • 42

    System (prediction) model $p(X_t|X_{t-1})$

    The number of objects can change:
    Each object has a constant probability $r$ of remaining in the scene
    At each time step, there is a constant probability $i$ that a new object will enter the scene

    $X_{t-1} = (m_{t-1},\, \tilde{x}_{t-1}^1, \ldots) \;\rightarrow\; X_t = (m_t,\, \tilde{x}_t^1, \ldots)$ via a prediction function and an initialization function

    Michael Rubinstein

    Prediction function

    Motion evolution: damped constant velocity, $\mathbf{X}_t = \mathbf{X}_{t-1} + 0.8\, \mathbf{V}_{t-1}$
    Shape evolution: 1st order auto-regressive process model (ARP)

    Michael Rubinstein

  • 43

    Particles

    N points: $X_t^1, X_t^2, \ldots, X_t^{N-1}, X_t^N$
    N weights: $\pi_t^1, \pi_t^2, \ldots, \pi_t^{N-1}, \pi_t^N$

    Michael Rubinstein

    Estimate

    Denote $\mathcal{M}_t = \{\mu_1, \ldots, \mu_M\}$ the set of existing unique identifiers in $X_t$

    The total probability that object i exists is accumulated over the (particle, target) pairs that contain identifier i

    Michael Rubinstein

  • 44

    Results

    N = 1000 particles; initialization samples are always generated

    Michael Rubinstein

    Results

    A single foreground model cannot distinguish between overlapping objects, which causes id switches

    Michael Rubinstein

  • 45

    Parameters

    Michael Rubinstein

    Summary

    Particle filters were shown to produce good approximations under relatively weak assumptions:
    can deal with nonlinearities
    can deal with non-Gaussian noise
    multiple hypotheses
    can be implemented in O(N)
    relatively simple
    adaptive focus on more probable regions of the state space

    Michael Rubinstein

  • 46

    Inpractice

    1. State(object)model2. System(evolution)modely ( )3. Measurement(likelihood)model4. Initial(prior)state5. Stateestimate(giventhepdf)

    6. PFspecifics1. Proposaldensity2. Resampling method

    Configurationsforspecificproblemscanbefoundinliterature

    Michael Rubinstein

    Isard & Blake, CONDENSATION - conditional density propagation for visual tracking, IJCV 98

    Michael Rubinstein

  • 47

    Isard & Blake, CONDENSATION - conditional density propagation for visual tracking, IJCV 98

    Michael Rubinstein

    Isard & Blake, CONDENSATION - conditional density propagation for visual tracking, IJCV 98

    Girl dancing vigorously to a Scottish reel: 100 particles. Bush blowing in the wind: 1200 particles

    Michael Rubinstein

  • 48

    Okuma et al., Boosted Particle Filter, ECCV 2004

    Goal: track hockey players
    Idea: AdaBoost + PF

    Michael Rubinstein

    Okuma et al., Boosted Particle Filter, ECCV 2004

    Michael Rubinstein

  • 49

    Bibby & Reid, Tracking using Pixel-Wise Posteriors (ECCV 08)

    Michael Rubinstein

    Bibby & Reid, Tracking using Pixel-Wise Posteriors (ECCV 08)

    Michael Rubinstein

  • 50

    Thank you!

    Michael Rubinstein

    References

    Beyond the Kalman Filter / Ristic, Arulampalam, Gordon
    Online tutorial: A Tutorial on Particle Filters for Online Nonlinear/Non-Gaussian Bayesian Tracking / Arulampalam et al., 2002
    Stochastic Models, Estimation and Control / Peter S. Maybeck
    An Introduction to the Kalman Filter / Greg Welch, Gary Bishop
    Particle Filters - An Overview / Matthias Muhlich

    Michael Rubinstein

  • 51

    Sequential derivation 1

    Suppose at time k-1, $\{x_{0:k-1}^i, w_{k-1}^i\}_{i=1}^N$ characterize $p(x_{0:k-1}|z_{1:k-1})$

    We receive a new measurement $z_k$ and need to approximate $p(x_{0:k}|z_{1:k})$ using a new set of samples

    We choose q such that

    $q(x_{0:k}|z_{1:k}) = q(x_k|x_{0:k-1}, z_{1:k})\; q(x_{0:k-1}|z_{1:k-1})$

    And we can generate new particles $x_k^i \sim q(x_k|x_{0:k-1}^i, z_{1:k})$

    Michael Rubinstein

    Sequential derivation 2

    For the weight update equation, it can be shown that

    $p(x_{0:k}|z_{1:k}) = \dfrac{p(z_k|x_k)\, p(x_k|x_{k-1})}{p(z_k|z_{1:k-1})}\, p(x_{0:k-1}|z_{1:k-1}) \;\propto\; p(z_k|x_k)\, p(x_k|x_{k-1})\, p(x_{0:k-1}|z_{1:k-1})$

    And so

    $w_k^i \propto \dfrac{p(x_{0:k}^i|z_{1:k})}{q(x_{0:k}^i|z_{1:k})} = \dfrac{p(z_k|x_k^i)\, p(x_k^i|x_{k-1}^i)\, p(x_{0:k-1}^i|z_{1:k-1})}{q(x_k^i|x_{0:k-1}^i, z_{1:k})\, q(x_{0:k-1}^i|z_{1:k-1})} = w_{k-1}^i\, \dfrac{p(z_k|x_k^i)\, p(x_k^i|x_{k-1}^i)}{q(x_k^i|x_{0:k-1}^i, z_{1:k})}$

    Michael Rubinstein

    Michael Rubinstein

  • 52

    Sequential derivation 3

    Further, if $q(x_k|x_{0:k-1}, z_{1:k}) = q(x_k|x_{k-1}, z_k)$

    then the weights update rule becomes

    $w_k^i \propto w_{k-1}^i\, \dfrac{p(z_k|x_k^i)\, p(x_k^i|x_{k-1}^i)}{q(x_k^i|x_{k-1}^i, z_k)}$   (3)

    (and we need not store the entire particle paths and the full history of observations)

    Finally, the (filtered) posterior density is approximated by

    $p(x_k|z_{1:k}) \approx \sum_{i=1}^N w_k^i\, \delta(x_k - x_k^i)$

    Michael Rubinstein

    Choice of importance density

    Choose q to minimize the variance of the weights

    Optimal choice: $q(x_k|x_{k-1}^i, z_k)_{opt} = p(x_k|x_{k-1}^i, z_k)$, which gives $w_k^i \propto w_{k-1}^i\, p(z_k|x_{k-1}^i)$

    Usually we cannot sample from $q_{opt}$ or solve for $w_k^i$ (in some specific cases it works)

    Most commonly used (suboptimal) alternative: $q(x_k|x_{k-1}^i, z_k) = p(x_k|x_{k-1}^i)$, i.e. the transitional prior, which gives $w_k^i \propto w_{k-1}^i\, p(z_k|x_k^i)$

    Michael Rubinstein

  • 53

    Generic PF

    Resampling reduces degeneracy, but new problems arise:

    1. Limits parallelization
    2. Sample impoverishment: particles with high weights are selected many times, which leads to loss of diversity
       If the process noise is small, all particles tend to collapse to a single point within a few iterations

    Methods exist to counter this as well

    Michael Rubinstein
