Entropy of Hidden Markov Processes


Or Zuk¹, Ido Kanter², Eytan Domany¹

¹Weizmann Institute of Science  ²Bar-Ilan University


Overview

Introduction
Problem Definition
Statistical Mechanics approach
Cover & Thomas Upper-Bounds
Radius of Convergence
Related subjects
Future Directions


HMP - Definitions

Markov process:
X – a Markov process
M – transition matrix, M_{ij} = Pr(X_{n+1} = j | X_n = i)

Hidden Markov process:
Y – a noisy observation of X
N – noise/emission matrix, N_{ij} = Pr(Y_n = j | X_n = i)

[Diagram: the hidden chain X_n → X_{n+1} evolves according to M; each hidden state X_n emits an observation Y_n according to N.]
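To make the definitions concrete, here is a minimal sketch (ours, not from the talk) that samples a hidden Markov process given a transition matrix M and an emission matrix N; the function name and parameters are our own.

```python
import numpy as np

def sample_hmp(M, N, n, rng=None):
    """Sample n steps of a hidden Markov process.

    M[i, j] = Pr(X_{t+1} = j | X_t = i)   (transition matrix)
    N[i, j] = Pr(Y_t  = j | X_t = i)      (emission matrix)
    The hidden chain is started from the uniform distribution.
    """
    rng = np.random.default_rng() if rng is None else rng
    n_states = M.shape[0]
    x = np.empty(n, dtype=int)
    y = np.empty(n, dtype=int)
    x[0] = rng.integers(n_states)                    # initial hidden state
    y[0] = rng.choice(N.shape[1], p=N[x[0]])         # first noisy emission
    for t in range(1, n):
        x[t] = rng.choice(n_states, p=M[x[t - 1]])   # hidden transition
        y[t] = rng.choice(N.shape[1], p=N[x[t]])     # noisy emission
    return x, y
```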


Example: Binary HMP

[Diagram: two-state transition graph on {0, 1} with arrows labelled p(1|0), p(0|1) and self-loops p(0|0), p(1|1); an analogous emission graph labelled q(0|0), q(1|0), q(0|1), q(1|1).]

The transition and emission matrices are

M = \begin{pmatrix} p(0|0) & p(1|0) \\ p(0|1) & p(1|1) \end{pmatrix}, \qquad
N = \begin{pmatrix} q(0|0) & q(1|0) \\ q(0|1) & q(1|1) \end{pmatrix}


Example: Binary HMP (cont.)

For simplicity, we will concentrate on the symmetric binary HMP:

M = \begin{pmatrix} 1-p & p \\ p & 1-p \end{pmatrix}, \qquad
N = \begin{pmatrix} 1-\varepsilon & \varepsilon \\ \varepsilon & 1-\varepsilon \end{pmatrix}

All properties of the process then depend on two parameters, p and ε. Assume (w.l.o.g.) p, ε < ½.
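As a small sketch (our own code, not the authors'), the symmetric parameterization plugs directly into the sampler above:

```python
import numpy as np

def symmetric_binary_hmp(p, eps):
    """Transition and emission matrices of the symmetric binary HMP.

    p   = probability the hidden bit flips            (p   < 1/2)
    eps = probability the observed bit is corrupted   (eps < 1/2)
    """
    M = np.array([[1 - p, p], [p, 1 - p]])
    N = np.array([[1 - eps, eps], [eps, 1 - eps]])
    return M, N

# e.g.  x, y = sample_hmp(*symmetric_binary_hmp(0.2, 0.05), n=1000)
```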


HMP Entropy Rate

Definition:

H = \lim_{n \to \infty} H(Y_n \mid Y_{n-1}, \ldots, Y_1)

H is difficult to compute; it is given as a Lyapunov exponent, which is hard to compute in general [Jacquet et al 04].

What to do? Calculate H in different regimes.
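While no closed form for H is known, the Lyapunov-exponent view suggests a practical numerical estimator: run the forward recursion on one long sample and average the per-symbol log-likelihood. A minimal sketch (ours, not the authors'), reusing the hypothetical sample_hmp and symmetric_binary_hmp helpers above:

```python
import numpy as np

def entropy_rate_mc(M, N, y):
    """Estimate the entropy rate H from one long observation sequence y:
    H ~ -(1/n) log2 Pr(y_1, ..., y_n), computed with the normalized
    forward recursion so nothing underflows."""
    k = M.shape[0]
    alpha = np.full(k, 1.0 / k)        # Pr(X_1), taken uniform
    log2_prob = 0.0
    for t, obs in enumerate(y):
        if t > 0:
            alpha = alpha @ M          # propagate the hidden state
        alpha = alpha * N[:, obs]      # weight by emission probability
        step = alpha.sum()             # Pr(y_t | y_1, ..., y_{t-1})
        log2_prob += np.log2(step)
        alpha /= step                  # renormalize the forward message
    return -log2_prob / len(y)

# e.g.:
# M, N = symmetric_binary_hmp(0.2, 0.05)
# _, y = sample_hmp(M, N, n=200_000)
# print(entropy_rate_mc(M, N, y))     # converges to H as n grows
```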


Different Regimes

p → 0, p → ½ (ε fixed)
ε → 0, ε → ½ (p fixed)

[Ordentlich&Weissman 04] study several regimes.

We concentrate on the 'small-noise regime' ε → 0, where the solution can be given as a power series in ε:

H(\varepsilon) = \sum_{k=0}^{\infty} H_k \varepsilon^k


Statistical Mechanics

First, observe the Markovian property, which factorizes the joint distribution along the chain:

\Pr(X, Y) = \Pr(X_1) \prod_{t=1}^{n-1} M_{X_t X_{t+1}} \prod_{t=1}^{n} N_{X_t Y_t}

Perform a change of variables:
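The slide's equation did not survive the transcript; the following reconstruction is ours, chosen to be consistent with the couplings J and K labelled in the diagram on the next slide. Mapping the binary values to spins, x ∈ {0, 1} → 2x − 1 ∈ {−1, +1}, the symmetric matrix entries become Boltzmann weights:

```latex
% Symmetric matrix entries rewritten as exponential (Boltzmann) weights
% over spins x_t, y_t \in \{-1, +1\}  (reconstruction, not the original slide):
\begin{align*}
  M_{x_t x_{t+1}} &= \frac{e^{J x_t x_{t+1}}}{2\cosh J},
      & J &= \tfrac12 \ln\frac{1-p}{p}, \\
  N_{x_t y_t}     &= \frac{e^{K x_t y_t}}{2\cosh K},
      & K &= \tfrac12 \ln\frac{1-\varepsilon}{\varepsilon}.
\end{align*}
```

With these weights the joint distribution takes the form of a one-dimensional Ising chain: coupling J between neighbouring hidden spins, and coupling K tying each hidden spin to its observation.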


Statistical Mechanics (cont.)

Ising model: spins x_t, y_t ∈ {−1, 1}; this is the setting of one-dimensional spin glasses.

[Diagram: two parallel spin chains of length n. Neighbouring hidden spins x_1, x_2, …, x_n are coupled with strength J; each hidden spin x_t is coupled to its observed spin y_t with strength K.]


Statistical Mechanics (cont.)

Summing over the hidden spins, we get Pr(Y) as the partition function of this chain:

\Pr(Y) \propto \sum_{x_1, \ldots, x_n = \pm 1} \exp\Big( J \sum_{t=1}^{n-1} x_t x_{t+1} + K \sum_{t=1}^{n} x_t y_t \Big)


Statistical Mechanics (cont.)

Computing the entropy (low-temperature / high-field expansion): expanding −(1/n) E[log Pr(Y)] in powers of the noise ε yields the orders H_k.


Cover & Thomas Bounds

It is known (Cover & Thomas 1991) that the entropy rate is sandwiched between two computable conditional entropies:

H(Y_n \mid Y_{n-1}, \ldots, Y_1, X_1) \;\le\; H \;\le\; H(Y_n \mid Y_{n-1}, \ldots, Y_1)

We will use the upper bounds C(n) = H(Y_n | Y_{n-1}, …, Y_1) and derive their orders in ε:

C(n) = \sum_{k=0}^{\infty} C_k(n) \, \varepsilon^k

Question: do the orders C_k(n) 'saturate', i.e. stop changing with n?
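For small n the upper bound is computable exactly by enumerating all observation strings, using C(n) = H(Y_1, …, Y_n) − H(Y_1, …, Y_{n−1}). A brute-force sketch (ours), assuming the hypothetical symmetric_binary_hmp helper above and a uniform initial distribution:

```python
import itertools
import numpy as np

def block_entropy(M, N, n):
    """H(Y_1, ..., Y_n) in bits, by enumerating all observation strings."""
    k = M.shape[0]
    total = 0.0
    for y in itertools.product(range(N.shape[1]), repeat=n):
        alpha = np.full(k, 1.0 / k) * N[:, y[0]]   # forward messages
        for obs in y[1:]:
            alpha = (alpha @ M) * N[:, obs]
        p = alpha.sum()                            # Pr(y_1, ..., y_n)
        total -= p * np.log2(p)
    return total

def ct_upper_bound(M, N, n):
    """Cover & Thomas upper bound C(n) = H(Y_n | Y_{n-1}, ..., Y_1)."""
    return block_entropy(M, N, n) - block_entropy(M, N, n - 1)

# e.g.:
# M, N = symmetric_binary_hmp(0.2, 0.05)
# print([ct_upper_bound(M, N, n) for n in range(2, 8)])  # decreasing in n
```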


Cover & Thomas Bounds (cont.)

[Four heat maps over the (eps, p) plane, at n = 4: (i) the average of the upper and lower bounds; (ii) upper bound minus lower bound; (iii) relative error, (upper − lower) / average; (iv) relative error, (upper − lower) / (1 − average).]


Cover & Thomas Bounds (cont.)

Answer: yes. In fact they 'saturate' sooner than would have been expected: for n ≥ (k+3)/2 they become constant.

We therefore have:

Conjecture 1 (proven for k = 1): for n ≥ (k+3)/2, the order C_k(n) no longer depends on n, and its settled value is H_k.

How do the orders look? Their expression is simpler when written in terms of λ = 1 − 2p, the second eigenvalue of M.

Conjecture 2: a closed-form expression for every order H_k as a function of λ.


First Few Orders

The zeroth order is the entropy rate of the noiseless Markov chain, H_0 = −p \log p − (1−p) \log(1−p).

[The explicit formulas for the higher orders H_1, H_2, … as functions of λ were displayed here but are not preserved in the transcript.]

Note: H_0–H_2 are proven. The rest are conjectures derived from the upper bounds.

Radius of Convergence

When is our approximation good? It is instructive to compare to the i.i.d. model, whose radius of convergence is known. For the HMP the limit is unknown; we estimated it numerically using a fit to the computed orders.

Radius of Convergence (cont.)

[Figures: numerical estimates of the radius of convergence as a function of the parameters.]

Relative Entropy Rate

The relative entropy rate between two HMPs is

D(P \| Q) = \lim_{n \to \infty} \frac{1}{n} \, D\big( P(Y_1, \ldots, Y_n) \,\|\, Q(Y_1, \ldots, Y_n) \big)

and the same small-noise expansion gives its leading orders.


Index of Coincidence

Take two independent realizations Y, Y' (of length n) of the same HMP. What is the probability that they are equal? Since the realizations are independent,

\Pr(Y = Y') = \sum_{y} \Pr(y)^2,

which decays exponentially with n, and we compute its decay rate. Similarly, we can solve for three and four (but not five) realizations. These collision probabilities are Rényi-entropy-type quantities and can give bounds on the entropy rate.
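The collision probability above factorizes over the chain, so it can be computed exactly with a forward recursion over pairs of hidden states. A sketch of this computation (our own construction, reusing the hypothetical symmetric_binary_hmp helper):

```python
import numpy as np

def index_of_coincidence(M, N, n):
    """Pr(Y = Y') for two independent length-n realizations of the HMP.

    Uses Pr(Y = Y') = sum_y Pr(y)^2, computed by a forward recursion
    over pairs (i, j) of hidden states.  E[i, j] = sum_s N[i,s] N[j,s]
    is the probability that chains in states i and j emit equal symbols.
    """
    k = M.shape[0]
    E = N @ N.T                               # emission-collision matrix
    alpha = np.full((k, k), 1.0 / k**2) * E   # joint forward weight, t = 1
    for _ in range(n - 1):
        alpha = (M.T @ alpha @ M) * E         # propagate both hidden chains
    return alpha.sum()

# e.g.:
# M, N = symmetric_binary_hmp(0.2, 0.05)
# ic = [index_of_coincidence(M, N, n) for n in (5, 10, 15)]
# successive ratios expose the exponential decay rate
```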


Future Directions

Proving the conjectures
Generalizations (e.g. general alphabets, the continuous case)
Other regimes
Relative entropy of two HMPs

Thank You
