17
HMMs and Particle Filters

Upload: dory

Post on 23-Feb-2016

TRANSCRIPT

Page 1: HMMs and Particle Filters

HMMs and Particle Filters

Page 2: HMMs and Particle Filters

Observations and Latent States

Markov models don’t get used much in AI.

The reason is that Markov models assume that you know exactly what state you are in, at each time step.

This is rarely true for AI agents.

Instead, we will say that the agent has a set of possible latent states – states that are not directly observed or known to the agent.

In addition, the agent has sensors that allow it to sense some aspects of the environment, to take measurements or observations.

Page 3: HMMs and Particle Filters

Hidden Markov Models

Suppose you are the parent of a college student, and would like to know how studious your child is.

You can’t observe them at all times, but you can periodically call, and see if your child answers.

[Figure: a three-step HMM. Each hidden state Hi is either Sleep or Study. Sleep stays Sleep with probability 0.6 (moves to Study with 0.4); Study moves to Sleep with probability 0.5 (stays Study with 0.5). Each Hi produces an observation Oi: "Answer call or not?"]

Page 4: HMMs and Particle Filters

Hidden Markov Models

H1 → H2 → H3 → …
 |    |    |
 O1   O2   O3

Here’s the same model, with probabilities in tables. The transition and sensor tables are the same at every time step, so each is shown once. (Only the Sleep and Ans rows are shown; the remaining rows are the complements, e.g. P(H2=Study|H1=Sleep) = 0.4.)

Transition model:

Ht Ht+1 P(Ht+1|Ht)
Sleep Sleep 0.6
Study Sleep 0.5

Sensor model:

Ht Ot P(Ot|Ht)
Sleep Ans 0.1
Study Ans 0.8

Prior:

H1 P(H1)
Sleep 0.5
Study 0.5

Page 5: HMMs and Particle Filters

Hidden Markov Models

HMMs (and MMs) are a special type of Bayes Net. Everything you have learned about BNs applies here.

H1 → H2 → H3 → …
 |    |    |
 O1   O2   O3

Transition model (same at every step):

Ht Ht+1 P(Ht+1|Ht)
Sleep Sleep 0.6
Study Sleep 0.5

Sensor model (same at every step):

Ht Ot P(Ot|Ht)
Sleep Ans 0.1
Study Ans 0.8

Prior:

H1 P(H1)
Sleep 0.5
Study 0.5

Page 6: HMMs and Particle Filters

Quick Review of BNs for HMMs

[Diagrams: the building blocks of an HMM as a Bayes Net – a prior node H1, an observation edge H1 → O1, and a transition edge H1 → H2. The usual BN tools (Bayes’ rule, conditioning, marginalization) apply to each.]

Page 7: HMMs and Particle Filters

Hidden Markov Models

Suppose a parent calls and gets an answer at time step 1. What is P(H1=Sleep|O1=Ans)?

By Bayes’ rule:
P(H1=Sleep|O1=Ans) = P(Ans|Sleep)P(Sleep) / [P(Ans|Sleep)P(Sleep) + P(Ans|Study)P(Study)]
= (0.1)(0.5) / [(0.1)(0.5) + (0.8)(0.5)] = 0.05 / 0.45 ≈ 0.111

Notice: before the observation, P(Sleep) was 0.5. By making a call and getting an answer, the parent’s belief in Sleep drops to P(Sleep|Ans) ≈ 0.111.

H1 → O1

H1 H2 P(H2|H1)
Sleep Sleep 0.6
Study Sleep 0.5

H1 O1 P(O1|H1)
Sleep Ans 0.1
Study Ans 0.8

H1 P(H1)
Sleep 0.5
Study 0.5
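This update is a direct application of Bayes’ rule, small enough to check in a few lines of Python (a minimal sketch; the variable names are illustrative, not from the slides):

```python
# Model parameters from the slides.
prior = {"Sleep": 0.5, "Study": 0.5}
p_ans = {"Sleep": 0.1, "Study": 0.8}  # P(O = Ans | H)

# Bayes' rule: P(H1=Sleep | O1=Ans) = P(Ans|Sleep)P(Sleep) / P(Ans)
num = p_ans["Sleep"] * prior["Sleep"]
den = sum(p_ans[h] * prior[h] for h in prior)
posterior_sleep = num / den
print(round(posterior_sleep, 3))  # 0.111
```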

Page 8: HMMs and Particle Filters

Hidden Markov Models

Suppose a parent calls only at time step 2, and gets an answer. What is P(H2=Sleep|O2=Ans)?

H1 → H2
 |    |
 O1   O2

H1 H2 P(H2|H1)
Sleep Sleep 0.6
Study Sleep 0.5

H1 O1 P(O1|H1)
Sleep Ans 0.1
Study Ans 0.8

H1 P(H1)
Sleep 0.5
Study 0.5
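With no observation at step 1, the belief is first pushed through the transition model and then updated on O2. A sketch of this predict-then-update step (a minimal illustration; the dictionary representation is just one convenient choice):

```python
prior = {"Sleep": 0.5, "Study": 0.5}
trans = {"Sleep": {"Sleep": 0.6, "Study": 0.4},
         "Study": {"Sleep": 0.5, "Study": 0.5}}
p_ans = {"Sleep": 0.1, "Study": 0.8}  # P(O = Ans | H)

# Predict: P(H2) = sum over h1 of P(H2|h1) P(h1)
p_h2 = {h2: sum(trans[h1][h2] * prior[h1] for h1 in prior)
        for h2 in prior}
# p_h2["Sleep"] = 0.6*0.5 + 0.5*0.5 = 0.55

# Update on O2 = Ans with Bayes' rule.
den = sum(p_ans[h] * p_h2[h] for h in p_h2)
posterior = {h: p_ans[h] * p_h2[h] / den for h in p_h2}
print(round(posterior["Sleep"], 3))  # 0.133
```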

Page 9: HMMs and Particle Filters

Quiz: Hidden Markov Models

H1 → H2
 |    |
 O1   O2

H1 H2 P(H2|H1)
Sleep Sleep 0.6
Study Sleep 0.5

H1 O1 P(O1|H1)
Sleep Ans 0.1
Study Ans 0.8

H1 P(H1)
Sleep 0.5
Study 0.5

Suppose a parent calls twice, once at time step 1 and once at time step 2. The first time, the child does not answer, and the second time the child does.

Now what is P(H2=Sleep | O1=¬Ans, O2=Ans)?

Page 10: HMMs and Particle Filters

Answer: Hidden Markov Models

H1 → H2
 |    |
 O1   O2

H1 H2 P(H2|H1)
Sleep Sleep 0.6
Study Sleep 0.5

H1 O1 P(O1|H1)
Sleep Ans 0.1
Study Ans 0.8

H1 P(H1)
Sleep 0.5
Study 0.5

Suppose a parent calls twice, once at time step 1 and once at time step 2. The first time, the child does not answer, and the second time the child does.

Now what is P(H2=Sleep | O1=¬Ans, O2=Ans)?

Numerator (sum over H1, with H2 = Sleep):
P(Ans|Sleep) [ P(Sleep|Sleep) P(¬Ans|Sleep) P(Sleep) + P(Sleep|Study) P(¬Ans|Study) P(Study) ]
= 0.1 × (0.6 × 0.9 × 0.5 + 0.5 × 0.2 × 0.5) = 0.1 × 0.32 = 0.032

Denominator (sum over H1 and H2):
0.032 + P(Ans|Study) [ P(Study|Sleep) P(¬Ans|Sleep) P(Sleep) + P(Study|Study) P(¬Ans|Study) P(Study) ]
= 0.032 + 0.8 × (0.4 × 0.9 × 0.5 + 0.5 × 0.2 × 0.5) = 0.032 + 0.184 = 0.216

So P(H2=Sleep | O1=¬Ans, O2=Ans) = 0.032 / 0.216 ≈ 0.148.

It’s a pain to calculate by enumeration.
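The quiz answer can be verified by brute-force enumeration over every hidden-state assignment. A minimal sketch (the joint helper is illustrative, not from the slides):

```python
from itertools import product

states = ["Sleep", "Study"]
prior = {"Sleep": 0.5, "Study": 0.5}
trans = {"Sleep": {"Sleep": 0.6, "Study": 0.4},
         "Study": {"Sleep": 0.5, "Study": 0.5}}
p_ans = {"Sleep": 0.1, "Study": 0.8}  # P(O = Ans | H)

def joint(h1, h2, o1_ans, o2_ans):
    """P(H1=h1, H2=h2, O1, O2) for the chain H1 -> H2 with Hi -> Oi."""
    p_o1 = p_ans[h1] if o1_ans else 1 - p_ans[h1]
    p_o2 = p_ans[h2] if o2_ans else 1 - p_ans[h2]
    return prior[h1] * p_o1 * trans[h1][h2] * p_o2

# Enumerate all hidden assignments, for O1 = not-Ans, O2 = Ans.
num = sum(joint(h1, "Sleep", False, True) for h1 in states)
den = sum(joint(h1, h2, False, True) for h1, h2 in product(states, states))
print(round(num / den, 3))  # 0.148
```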

Page 11: HMMs and Particle Filters

Quiz: Complexity of Enumeration for HMMs

Suppose we have an HMM with T time steps.

To compute any query like P(Hi|O1, …, OT), we need to compute P(O1, …, OT).

How many terms are in this sum, if there are 2 possible values for each Hi?

Page 12: HMMs and Particle Filters

Answer: Complexity of Enumeration for HMMs

Suppose we have an HMM with T time steps.

To compute any query like P(Hi|O1, …, OT), we need to compute P(O1, …, OT).

How many terms are in this sum, if there are 2 possible values for each Hi?

2^T terms in this sum. Regular enumeration is an O(2^T) algorithm.

This makes it intractable, for example, for sentences with 20 words, or DNA sequences with hundreds or millions of base pairs.

Page 13: HMMs and Particle Filters

Specialized Inference Algorithm: Dynamic Programming

There is a fairly simple way to compute this sum exactly in O(T), or linear time, using dynamic programming.

Essentially, this works by computing partial sums, storing them, and re-using them during calculations of the sums for longer sequences.

This is called the forward algorithm.

We won’t cover this here, but you can see the book or online tutorials if you are interested.
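For the curious, the forward recursion is short enough to sketch here, using the Sleep/Study numbers above (a minimal illustration; the dictionary-based representation is just one convenient choice):

```python
def forward(prior, trans, emit, observations):
    """Forward algorithm: returns P(O1..OT) in O(T) time by storing and
    re-using the partial sums alpha[h] = P(O1..Ot, Ht=h)."""
    alpha = {h: prior[h] * emit[h][observations[0]] for h in prior}
    for o in observations[1:]:
        alpha = {h2: emit[h2][o] * sum(alpha[h1] * trans[h1][h2] for h1 in alpha)
                 for h2 in alpha}
    return sum(alpha.values())

prior = {"Sleep": 0.5, "Study": 0.5}
trans = {"Sleep": {"Sleep": 0.6, "Study": 0.4},
         "Study": {"Sleep": 0.5, "Study": 0.5}}
emit = {"Sleep": {"Ans": 0.1, "NoAns": 0.9},
        "Study": {"Ans": 0.8, "NoAns": 0.2}}

# P(O1 = NoAns, O2 = Ans), the denominator from the earlier quiz.
print(round(forward(prior, trans, emit, ["NoAns", "Ans"]), 3))  # 0.216
```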

Page 15: HMMs and Particle Filters

Particle Filter Demos

Real robot localization with particle filter:https://www.youtube.com/watch?v=H0G1yslM5rc&feature=player_embedded

1-dimensional case:https://www.youtube.com/watch?v=qQQYkvS5CzU&feature=player_embedded

Page 16: HMMs and Particle Filters

Particle Filter Algorithm

Inputs:
– Set of particles S, each with location si.loc and weight si.w
– Control vector u (where robot should move next)
– Measurement vector z (sensor readings)

Outputs:
– New particles S’, for the next iteration

Page 17: HMMs and Particle Filters

Particle Filter Algorithm

Init: S’ = empty
For j = 1 to N (N is the number of particles in S):
– Pick a particle si from S randomly (with replacement), in proportion to the weights si.w
– Create a new particle s’
– Sample s’.loc from P(loc | si.loc, u) (the motion model)
– Set s’.w = P(z | s’.loc) (the sensor model)
– Add s’ to S’
End For
For each particle s’ in S’:
– Set s’.w = s’.w / (sum of all weights in S’) (normalize)
End For
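The loop above translates almost line-for-line into Python. A minimal sketch, where motion_model and sensor_model are illustrative stand-ins for the robot’s actual models (names and the toy 1-D example are assumptions, not from the slides):

```python
import math
import random

def particle_filter_step(particles, u, z, motion_model, sensor_model):
    """One particle-filter iteration, following the slide's outline.

    particles: list of (loc, w) pairs.  u: control.  z: measurement.
    motion_model(loc, u) samples a successor location; sensor_model(z, loc)
    returns the likelihood P(z | loc).
    """
    locs = [loc for loc, _ in particles]
    weights = [w for _, w in particles]
    new_particles = []
    for _ in range(len(particles)):
        # Resample: pick a particle in proportion to the weights, with replacement.
        loc = random.choices(locs, weights=weights)[0]
        new_loc = motion_model(loc, u)    # sample s'.loc from P(loc | si.loc, u)
        new_w = sensor_model(z, new_loc)  # set s'.w = P(z | s'.loc)
        new_particles.append((new_loc, new_w))
    total = sum(w for _, w in new_particles)
    return [(loc, w / total) for loc, w in new_particles]  # normalize weights

# Toy 1-D localization: the robot moves by u with a little noise, and the
# sensor likelihood falls off with distance from the reading z.
def motion_model(loc, u):
    return loc + u + random.gauss(0.0, 0.1)

def sensor_model(z, loc):
    return math.exp(-(z - loc) ** 2)

random.seed(0)
particles = [(random.uniform(0, 10), 1.0) for _ in range(500)]
particles = particle_filter_step(particles, u=0.0, z=5.0,
                                 motion_model=motion_model,
                                 sensor_model=sensor_model)
```

After one step the particle cloud concentrates near the sensor reading, and the weights sum to 1, ready for the next iteration.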