Hidden Markov Models
First Story!
Majid Hajiloo, Aria Khademi
Outline
This lecture is devoted to the problem of inference in probabilistic graphical models (aka Bayesian nets). The goal is for you to:
- Practice marginalization and conditioning
- Practice Bayes rule
- Learn the HMM representation (model)
- Learn how to do prediction and filtering using HMMs
Assistive Technology
Assume you have a little robot that is trying to estimate the posterior probability that you are happy or sad
Assistive Technology
Given that the robot has observed whether you are:
- watching Game of Thrones (w)
- sleeping (s)
- crying (c)
- facebooking (f)

Let the unknown state be X = h if you are happy and X = s if you are sad. Let Y denote the observation, which can be w, s, c, or f.
Assistive Technology
We want to answer queries such as the posterior p(X = h | Y = y): how likely are you to be happy given the observed activity?
Assistive Technology
Assume that an expert has compiled a prior model p(X) over the states X ∈ {h, s} and a likelihood model p(Y | X) over the observations Y ∈ {w, s, c, f}.
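Such tables are enough to answer the posterior queries above by Bayes rule. A minimal sketch — the numeric values below are hypothetical, not the slide's actual tables:

```python
# Hypothetical prior and likelihood tables (illustrative numbers only).
# States: h = happy, s = sad; observations: w, s, c, f as defined above.
prior = {"h": 0.8, "s": 0.2}                                  # p(X)
likelihood = {                                                # p(Y | X)
    "h": {"w": 0.5, "s": 0.2, "c": 0.05, "f": 0.25},
    "s": {"w": 0.1, "s": 0.3, "c": 0.4,  "f": 0.2},
}

def posterior(y):
    """p(X | Y = y): multiply prior by likelihood, then normalize."""
    unnorm = {x: prior[x] * likelihood[x][y] for x in prior}
    z = sum(unnorm.values())          # p(Y = y), obtained by marginalizing out X
    return {x: p / z for x, p in unnorm.items()}

print(posterior("c"))                 # crying makes "sad" much more probable
```

The normalizer z is exactly the marginalization-and-conditioning practice the outline promises: p(Y = y) = Σ_x p(x) p(y | x).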
Assistive Technology
But what if, instead of an absolute prior, what we have is a temporal (transition) prior? That is, we assume a dynamical system with an initial distribution p(x_0), a transition model p(x_t | x_{t-1}), and an observation model p(y_t | x_t).

Given a history of observations, say we want to compute the posterior distribution that you are happy at step 3. That is, we want to estimate:

p(x_3 = h | y_1, y_2, y_3)
Dynamic Model
In general, we assume we have an initial distribution p(x_0), a transition model p(x_t | x_{t-1}), and an observation model p(y_t | x_t).
For 2 time steps:
X_0 → X_1 → X_2
       ↓     ↓
      Y_1   Y_2
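These three ingredients can be written down directly as tables. A sketch with hypothetical numbers (the slides do not give these values):

```python
# The three ingredients of the dynamic model (illustrative numbers only).
init = {"h": 0.7, "s": 0.3}                   # p(x_0)
transition = {                                # p(x_t | x_{t-1}), keyed by x_{t-1}
    "h": {"h": 0.9, "s": 0.1},
    "s": {"h": 0.2, "s": 0.8},
}
emission = {                                  # p(y_t | x_t), keyed by x_t
    "h": {"w": 0.5, "s": 0.2, "c": 0.05, "f": 0.25},
    "s": {"w": 0.1, "s": 0.3, "c": 0.4,  "f": 0.2},
}

# Every conditional distribution must sum to one.
for row in [init, *transition.values(), *emission.values()]:
    assert abs(sum(row.values()) - 1.0) < 1e-12
```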
Optimal Filtering
Our goal is to compute, for all t, the posterior (aka filtering) distribution:

p(x_t | y_{1:t})

We derive a recursion to compute p(x_t | y_{1:t}) assuming that we have p(x_{t-1} | y_{1:t-1}) as input. The recursion has two steps:
1. Prediction
2. Bayesian update
Prediction
First, we compute the state prediction:

p(x_t | y_{1:t-1}) = Σ_{x_{t-1}} p(x_t | x_{t-1}) p(x_{t-1} | y_{1:t-1})
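The prediction step is a single marginalization over the previous state. A sketch, using a hypothetical transition model:

```python
# Hypothetical transition model p(x_t | x_{t-1}).
transition = {
    "h": {"h": 0.9, "s": 0.1},
    "s": {"h": 0.2, "s": 0.8},
}

def predict(belief):
    """p(x_t | y_{1:t-1}) = sum over x_{t-1} of p(x_t | x_{t-1}) p(x_{t-1} | y_{1:t-1})."""
    return {
        x_t: sum(transition[x_prev][x_t] * belief[x_prev] for x_prev in belief)
        for x_t in transition
    }

print(predict({"h": 0.5, "s": 0.5}))   # the "sticky" happy state pulls mass toward h
```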
Bayes Update
Second, we use Bayes rule to obtain:

p(x_t | y_{1:t}) = p(y_t | x_t) p(x_t | y_{1:t-1}) / Σ_{x_t} p(y_t | x_t) p(x_t | y_{1:t-1})
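In code, the update is an elementwise product of the predicted belief with the observation likelihood, followed by normalization. A sketch with a hypothetical observation model:

```python
# Hypothetical observation model p(y_t | x_t).
emission = {
    "h": {"w": 0.5, "s": 0.2, "c": 0.05, "f": 0.25},
    "s": {"w": 0.1, "s": 0.3, "c": 0.4,  "f": 0.2},
}

def bayes_update(predicted, y):
    """p(x_t | y_{1:t}) is proportional to p(y_t | x_t) p(x_t | y_{1:t-1})."""
    unnorm = {x: emission[x][y] * predicted[x] for x in predicted}
    z = sum(unnorm.values())          # normalizer = p(y_t | y_{1:t-1})
    return {x: p / z for x, p in unnorm.items()}

print(bayes_update({"h": 0.55, "s": 0.45}, "c"))   # crying shifts belief toward sad
```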
HMM Algorithm
Example. Assume we are given the belief p(x_{t-1} | y_{1:t-1}) together with the transition and observation models above.

Prediction:

p(x_t = h | y_{1:t-1}) = p(x_t = h | x_{t-1} = h) p(x_{t-1} = h | y_{1:t-1}) + p(x_t = h | x_{t-1} = s) p(x_{t-1} = s | y_{1:t-1})

Bayes update: if we observe y_t, the Bayes update yields:

p(x_t = h | y_{1:t}) = p(y_t | x_t = h) p(x_t = h | y_{1:t-1}) / [p(y_t | x_t = h) p(x_t = h | y_{1:t-1}) + p(y_t | x_t = s) p(x_t = s | y_{1:t-1})]
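Putting the two steps together gives the full filtering algorithm, run once per observation. A self-contained sketch with hypothetical model numbers:

```python
# Hypothetical HMM parameters (illustrative only).
init = {"h": 0.7, "s": 0.3}
transition = {"h": {"h": 0.9, "s": 0.1}, "s": {"h": 0.2, "s": 0.8}}
emission = {
    "h": {"w": 0.5, "s": 0.2, "c": 0.05, "f": 0.25},
    "s": {"w": 0.1, "s": 0.3, "c": 0.4,  "f": 0.2},
}

def filter_hmm(observations):
    """Return p(x_t | y_{1:t}) for t = 1..T via predict + Bayes update."""
    belief = init                                             # p(x_0)
    history = []
    for y in observations:
        # Prediction: marginalize over the previous state.
        pred = {
            x: sum(transition[xp][x] * belief[xp] for xp in belief)
            for x in transition
        }
        # Bayes update: weight by the likelihood of y, then normalize.
        unnorm = {x: emission[x][y] * pred[x] for x in pred}
        z = sum(unnorm.values())
        belief = {x: p / z for x, p in unnorm.items()}
        history.append(belief)
    return history

for t, b in enumerate(filter_hmm(["w", "c", "f"]), start=1):
    print(t, b)
```

Each iteration costs O(|X|^2) for the prediction sum, so the whole filter is linear in the length of the observation sequence.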
The END