Markov Chains


Pages 1–5: Markov Chains

$\{X_n : n = 0, 1, 2, \ldots\}$ is a discrete time stochastic process.

If $X_n = i$, the process is said to be in state $i$ at time $n$.

$\{i : i = 0, 1, 2, \ldots\}$ is the state space.

If $P(X_{n+1} = j \mid X_n = i, X_{n-1} = i_{n-1}, \ldots, X_0 = i_0) = P(X_{n+1} = j \mid X_n = i) = P_{ij}$, the process is said to be a Discrete Time Markov Chain (DTMC).

$P_{ij}$ is the transition probability from state $i$ to state $j$.

Page 6: The transition matrix

$$\mathbf{P} = \begin{bmatrix} P_{00} & P_{01} & P_{02} & \cdots \\ P_{10} & P_{11} & P_{12} & \cdots \\ \vdots & \vdots & \vdots & \\ P_{i0} & P_{i1} & P_{i2} & \cdots \\ \vdots & \vdots & \vdots & \end{bmatrix}$$

$$P_{ij} \ge 0 \ \ \forall\, i, j \ge 0; \qquad \sum_{j} P_{ij} = 1, \quad i = 0, 1, \ldots$$

$\mathbf{P}$: transition matrix
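
The following is a minimal sketch, not part of the original slides, of these two ideas in NumPy: checking that a matrix is row-stochastic and sampling a path that obeys the Markov property. The matrix values are placeholders borrowed from Example 1 below.

```python
import numpy as np

# Placeholder two-state transition matrix (rows = current state,
# columns = next state); values taken from Example 1 later in the slides.
P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# Each row must be a probability distribution over the next state.
assert np.all(P >= 0) and np.allclose(P.sum(axis=1), 1.0)

def simulate_dtmc(P, x0, n_steps, seed=0):
    """Sample X_0, X_1, ..., X_n for the chain with transition matrix P."""
    rng = np.random.default_rng(seed)
    path = [x0]
    for _ in range(n_steps):
        # The next state depends only on the current state (Markov property).
        path.append(int(rng.choice(len(P), p=P[path[-1]])))
    return path

print(simulate_dtmc(P, x0=0, n_steps=10))
```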

Pages 7–9: Example 1

Example 1: The probability that it will rain tomorrow depends only on whether it rains today or not:

P(rain tomorrow | rain today) = $\alpha$
P(rain tomorrow | no rain today) = $\beta$

State 0 = rain; State 1 = no rain

$$\mathbf{P} = \begin{bmatrix} \alpha & 1-\alpha \\ \beta & 1-\beta \end{bmatrix}$$
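
A small sketch of how this matrix is assembled; the helper name rain_matrix is ours, not the slides'.

```python
def rain_matrix(alpha, beta):
    """Two-state chain: state 0 = rain, state 1 = no rain."""
    return [[alpha, 1.0 - alpha],   # P(rain | rain), P(no rain | rain)
            [beta,  1.0 - beta]]    # P(rain | no rain), P(no rain | no rain)

print(rain_matrix(0.7, 0.4))        # the values used later in the slides
```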

Pages 10–14: Example 4

Example 4: A gambler wins $1 with probability p and loses $1 with probability 1-p. She starts with $N and quits if she reaches either $M or $0. $X_n$ is the amount of money the gambler has after playing n rounds.

$P(X_n = i+1 \mid X_{n-1} = i, X_{n-2} = i_{n-2}, \ldots, X_0 = N) = P(X_n = i+1 \mid X_{n-1} = i) = p$  (for $i \neq 0, M$)

$P(X_n = i-1 \mid X_{n-1} = i, X_{n-2} = i_{n-2}, \ldots, X_0 = N) = P(X_n = i-1 \mid X_{n-1} = i) = 1-p$  (for $i \neq 0, M$)

$P_{i,i+1} = P(X_n = i+1 \mid X_{n-1} = i)$;  $P_{i,i-1} = P(X_n = i-1 \mid X_{n-1} = i)$

$P_{i,i+1} = p$ and $P_{i,i-1} = 1-p$ for $i \neq 0, M$

$P_{0,0} = 1$ and $P_{M,M} = 1$  (0 and M are called absorbing states)

$P_{i,j} = 0$ otherwise
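
A short sketch, assuming NumPy, that builds this transition matrix for states 0 through M; the helper name is ours.

```python
import numpy as np

def gamblers_ruin_matrix(M, p):
    """Transition matrix of the gambler's ruin chain on states 0, 1, ..., M."""
    P = np.zeros((M + 1, M + 1))
    P[0, 0] = P[M, M] = 1.0      # 0 and M are absorbing
    for i in range(1, M):
        P[i, i + 1] = p          # win $1
        P[i, i - 1] = 1 - p      # lose $1
    return P

print(gamblers_ruin_matrix(M=4, p=0.5))
```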

Page 15: Random walk

A Markov chain whose state space is $0, \pm 1, \pm 2, \ldots$ and whose transition probabilities satisfy $P_{i,i+1} = p = 1 - P_{i,i-1}$ for all $i$, where $0 < p < 1$, is said to be a random walk.
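
A short simulation sketch of the random walk, assuming NumPy: the +1 and -1 steps are drawn independently with probabilities p and 1-p.

```python
import numpy as np

def random_walk(n_steps, p, seed=1):
    """Path of a simple random walk started at 0: +1 w.p. p, -1 w.p. 1 - p."""
    rng = np.random.default_rng(seed)
    steps = rng.choice([1, -1], size=n_steps, p=[p, 1 - p])
    return np.concatenate(([0], np.cumsum(steps)))

print(random_walk(10, p=0.5))
```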

Pages 16–18: Chapman-Kolmogorov Equations

Define the $n$-step transition probabilities
$$P_{ij}^{(n)} = P(X_{n+m} = j \mid X_m = i), \quad n \ge 0, \ i, j \ge 0.$$

$$P_{ij}^{(1)} = P_{ij}$$

$$P_{ij}^{(n+m)} = \sum_{k=0}^{\infty} P_{ik}^{(n)} P_{kj}^{(m)} \quad \text{for all } n, m \ge 0 \text{ and all } i, j \ge 0$$

(the Chapman-Kolmogorov equations)
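
A numerical sanity check of the identity, as a sketch assuming NumPy; the two-state matrix is a placeholder.

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])                 # placeholder two-state chain

def n_step(P, n):
    """P^(n): the n-step transition matrix, by repeated multiplication."""
    out = np.eye(len(P))
    for _ in range(n):
        out = out @ P
    return out

n, m, i, j = 3, 2, 0, 1
lhs = n_step(P, n + m)[i, j]
rhs = sum(n_step(P, n)[i, k] * n_step(P, m)[k, j] for k in range(len(P)))
assert np.isclose(lhs, rhs)                # Chapman-Kolmogorov, entry by entry
```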

Pages 19–23: Chapman-Kolmogorov Equations (derivation)

$$
\begin{aligned}
P_{ij}^{(n+m)} &= P(X_{n+m} = j \mid X_0 = i) \\
&= \sum_{k=0}^{\infty} P(X_{n+m} = j, X_n = k \mid X_0 = i) \\
&= \sum_{k=0}^{\infty} P(X_{n+m} = j \mid X_n = k, X_0 = i)\, P(X_n = k \mid X_0 = i) \\
&= \sum_{k=0}^{\infty} P(X_{n+m} = j \mid X_n = k)\, P(X_n = k \mid X_0 = i) \\
&= \sum_{k=0}^{\infty} P_{kj}^{(m)} P_{ik}^{(n)} = \sum_{k=0}^{\infty} P_{ik}^{(n)} P_{kj}^{(m)}
\end{aligned}
$$

(The fourth equality uses the Markov property to drop the conditioning on $X_0 = i$.)

Pages 24–26: n-step transition matrices

$\mathbf{P}^{(n)} = \left[ P_{ij}^{(n)} \right]$: the matrix of $n$-step transition probabilities.

$$\mathbf{P}^{(n+m)} = \mathbf{P}^{(n)} \times \mathbf{P}^{(m)}$$

(Note: if $\mathbf{A} = [a_{ij}]$ and $\mathbf{B} = [b_{ij}]$, then $\mathbf{A} \times \mathbf{B} = \left[\sum_{k=1}^{M} a_{ik} b_{kj}\right]$.)
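
In code this is ordinary matrix multiplication; a sketch assuming NumPy, with the same placeholder chain as before.

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])

# P^(n) is the n-th matrix power of P, so P^(n+m) = P^(n) x P^(m).
P2 = np.linalg.matrix_power(P, 2)
P4 = np.linalg.matrix_power(P, 4)
assert np.allclose(P4, P2 @ P2)
```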

Page 27: Example 1 (continued)

Example 1: The probability that it will rain tomorrow depends only on whether it rains today or not:

P(rain tomorrow | rain today) = $\alpha$
P(rain tomorrow | no rain today) = $\beta$

What is the probability that it will rain four days from today, given that it is raining today? Let $\alpha = 0.7$ and $\beta = 0.4$.

State 0 = rain; State 1 = no rain

Pages 28–32: Example 1 (continued)

What is $P_{00}^{(4)}$?

$$\mathbf{P} = \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix}$$

$$\mathbf{P}^{(2)} = \mathbf{P} \times \mathbf{P} = \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix} \times \begin{bmatrix} 0.7 & 0.3 \\ 0.4 & 0.6 \end{bmatrix} = \begin{bmatrix} 0.61 & 0.39 \\ 0.52 & 0.48 \end{bmatrix}$$

$$\mathbf{P}^{(4)} = \mathbf{P}^{(2)} \times \mathbf{P}^{(2)} = \begin{bmatrix} 0.61 & 0.39 \\ 0.52 & 0.48 \end{bmatrix} \times \begin{bmatrix} 0.61 & 0.39 \\ 0.52 & 0.48 \end{bmatrix} = \begin{bmatrix} 0.5749 & 0.4251 \\ 0.5668 & 0.4332 \end{bmatrix}$$

$$P_{00}^{(4)} = 0.5749$$
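
The arithmetic above can be reproduced in a couple of lines; a sketch assuming NumPy.

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
P4 = np.linalg.matrix_power(P, 4)
print(P4[0, 0])   # 0.5749: probability of rain four days out, given rain today
```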

Pages 33–36: Unconditional probabilities

How do we calculate $P(X_n = j)$?

Let $\alpha_i \equiv P(X_0 = i)$ be the initial distribution. Then

$$P(X_n = j) = \sum_{i} P(X_n = j \mid X_0 = i)\, P(X_0 = i) = \sum_{i} P_{ij}^{(n)} \alpha_i.$$
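
A sketch of this computation, assuming NumPy; the initial distribution alpha here is a made-up example, not from the slides.

```python
import numpy as np

P = np.array([[0.7, 0.3],
              [0.4, 0.6]])
alpha = np.array([0.4, 0.6])   # hypothetical initial distribution P(X_0 = i)

n = 4
# P(X_n = j) = sum_i alpha_i * P_ij^(n), i.e., a row vector times P^n.
dist_n = alpha @ np.linalg.matrix_power(P, n)
print(dist_n, dist_n.sum())    # a probability distribution over the states
```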

Pages 37–40: Classification of States

State $j$ is accessible from state $i$ if $P_{ij}^{(n)} > 0$ for some $n \ge 0$.

Two states that are accessible to each other are said to communicate ($i \leftrightarrow j$).

Any state communicates with itself, since $P_{ii}^{(0)} = 1$.
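
Accessibility is a reachability question on the directed graph with an edge $i \to j$ whenever $P_{ij} > 0$; a sketch assuming NumPy, with a made-up three-state example.

```python
import numpy as np

def accessible_from(P, i):
    """All states j with P_ij^(n) > 0 for some n >= 0 (graph search)."""
    reach, frontier = {i}, [i]
    while frontier:
        k = frontier.pop()
        for j in np.flatnonzero(P[k] > 0):
            if int(j) not in reach:
                reach.add(int(j))
                frontier.append(int(j))
    return reach

# Tiny gambler's-ruin-style chain: states 0 and 2 are absorbing.
P = np.array([[1.0, 0.0, 0.0],
              [0.5, 0.0, 0.5],
              [0.0, 0.0, 1.0]])
print(accessible_from(P, 0))   # {0}: nothing else is accessible from 0
print(accessible_from(P, 1))   # {0, 1, 2}
```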

Pages 41–43: Properties

State $i$ communicates with state $i$, for all $i \ge 0$.

If state $i$ communicates with state $j$, then state $j$ communicates with state $i$.

If state $i$ communicates with state $j$, and state $j$ communicates with state $k$, then state $i$ communicates with state $k$.

Page 44: Properties (proof of transitivity)

If $i$ communicates with $j$ and $j$ communicates with $k$, then there exist some $n$ and $m$ for which $P_{ij}^{(n)} > 0$ and $P_{jk}^{(m)} > 0$. By the Chapman-Kolmogorov equations,

$$P_{ik}^{(n+m)} = \sum_{r} P_{ir}^{(n)} P_{rk}^{(m)} \ge P_{ij}^{(n)} P_{jk}^{(m)} > 0.$$

Pages 45–47: Classification of States (continued)

Two states that communicate are said to belong to the same class.

Two classes are either identical or disjoint (they have no communicating states).

A Markov chain is said to be irreducible if it has only one class (all states communicate with each other).