Martingale problems and filtering

• Martingale problems for conditional distributions

• Filtering equations

• References


1. Martingale problems for conditional distributions

• Specifying Markov processes

• Examples of generators

• Characterization of Markov processes as solutions of martingale problems

• Conditional distributions for martingale problems

• Partially observed processes

• Filtered martingale problem

• Markov mapping theorem

• Burke's theorem

Kurtz and Ocone (1988); Kurtz and Nappo (2011)


How to specify a Markov model

An E-valued process X is Markov with respect to a filtration {F_t} if

E[f(X(t+s)) | F_t] = E[f(X(t+s)) | X(t)],   f ∈ B(E).

Ordinary differential equations: Ẋ = F(X)

X(t + ∆t) ≈ X(t) + F(X(t))∆t

Stochastic differential equations:

X(t + ∆t) ≈ X(t) + F(X(t))∆t + G(X(t))∆W
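As a purely illustrative numerical companion to these two time-stepping approximations (my own sketch, not from the original slides): the loop below iterates the Euler step for the ODE and the Euler-Maruyama step for the SDE. The drift F, diffusion coefficient G, initial condition, and step size are arbitrary choices.

    import numpy as np

    rng = np.random.default_rng(0)

    def F(x):                  # illustrative drift
        return -x

    def G(x):                  # illustrative diffusion coefficient
        return 0.5

    T, dt = 1.0, 1e-3
    x_ode, x_sde = 1.0, 1.0
    for _ in range(int(T / dt)):
        dW = np.sqrt(dt) * rng.standard_normal()
        x_ode += F(x_ode) * dt                       # X(t+dt) ~ X(t) + F(X(t)) dt
        x_sde += F(x_sde) * dt + G(x_sde) * dW       # ... + G(X(t)) dW
    print(x_ode, x_sde)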


Infinitesimal specification

Deterministic (ode) case:

f(X(t + ∆t)) ≈ f(X(t)) + F(X(t)) · ∇f(X(t))∆t

f(X(t + r)) − f(X(t)) = Σ_i [f(X(t_{i+1})) − f(X(t_i))]
                      ≈ Σ_i F(X(t_i)) · ∇f(X(t_i))(t_{i+1} − t_i),

which suggests

f(X(t + r)) − f(X(t)) − ∫_t^{t+r} F(X(s)) · ∇f(X(s)) ds = 0.


Martingale properties

“Infinitesimal changes of distribution”

E[f(X(t + ∆t)) | F_t] ≈ f(X(t)) + Af(X(t))∆t

or

E[f(X(t + ∆t)) − f(X(t)) − Af(X(t))∆t | F_t] ≈ 0,

which suggests

E[ f(X(t + r)) − f(X(t)) − ∫_t^{t+r} Af(X(s)) ds | F_t ] = 0,

that is,

f(X(t)) − f(X(0)) − ∫_0^t Af(X(s)) ds   is a martingale.


Examples of generators: Jump processes

Poisson process (E = {0, 1, 2, . . .}, D(A) = B(E)):

Af(k) = λ(f(k + 1) − f(k))

Markov chain (E discrete, D(A) = {f ∈ B(E) : f has finite support}):

Af(k) = Σ_l q_{k,l}(f(l) − f(k))

Pure jump process (E arbitrary):

Af(x) = λ(x) ∫_E (f(y) − f(x)) µ(x, dy)
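A small sanity check (my own illustration, not part of the lecture) that the Poisson generator captures the infinitesimal change of distribution: for small ∆t, E[f(X(∆t)) − f(X(0))] should be close to Af(X(0))∆t. The rate, test function, initial state, and sample size below are arbitrary.

    import numpy as np

    rng = np.random.default_rng(1)
    lam, dt, k0 = 2.0, 0.01, 3              # rate, small time step, initial state
    f = lambda k: k ** 2                    # test function on E = {0, 1, 2, ...}
    Af = lambda k: lam * (f(k + 1) - f(k))  # Poisson generator

    # X(dt) = k0 + (number of jumps in [0, dt]) ~ k0 + Poisson(lam * dt)
    samples = k0 + rng.poisson(lam * dt, size=200_000)
    print(np.mean(f(samples) - f(k0)))      # Monte Carlo estimate of E[f(X(dt)) - f(k0)]
    print(Af(k0) * dt)                      # generator prediction; error is O(dt^2)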


Examples of generators: Continuous processes

Standard Brownian motion (E = R^d):

Af = (1/2)∆f,   D(A) = C_c²(R^d)

Diffusion process (E = R^d, D(A) = C_c²(R^d)):

Af(x) = (1/2) Σ_{i,j} a_{ij}(x) ∂²f(x)/∂x_i∂x_j + Σ_i b_i(x) ∂f(x)/∂x_i

Reflecting diffusion (E ⊂ R^d, D(A) = {f ∈ C_c²(E) : η(x) · ∇f(x) = 0, x ∈ ∂E}):

Af(x) = (1/2) Σ_{i,j} a_{ij}(x) ∂²f(x)/∂x_i∂x_j + Σ_i b_i(x) ∂f(x)/∂x_i


The martingale problem for A

X is a solution of the martingale problem for (A, ν_0), ν_0 ∈ P(E), if P X(0)^{-1} = ν_0 and

f(X(t)) − f(X(0)) − ∫_0^t Af(X(s)) ds

is an {F_t^X}-martingale for all f ∈ D(A).

Theorem 1.1 If any two solutions X_1, X_2 of the martingale problem for A satisfying P X_1(0)^{-1} = P X_2(0)^{-1} also satisfy P X_1(t)^{-1} = P X_2(t)^{-1} for all t ≥ 0, then the finite-dimensional distributions of a solution X are uniquely determined by P X(0)^{-1}.

If X is a solution of the MGP for A and Y_a(t) = X(a + t), then Y_a is a solution of the MGP for A.

Theorem 1.2 If the conclusion of Theorem 1.1 holds, then any solution of the martingale problem for A is a Markov process.


A martingale lemma

Let {F_t} and {G_t} be filtrations with G_t ⊂ F_t.

Lemma 1.3 Suppose that

M(t) = U(t) − U(0) − ∫_0^t V(s) ds

is an {F_t}-martingale. Then

E[U(t) | G_t] − E[U(0) | G_0] − ∫_0^t E[V(s) | G_s] ds

is a {G_t}-martingale.

Proof. The lemma follows from the definition and elementary properties of conditional expectations.


Martingale properties of conditional distributions

Corollary 1.4 If X is a solution of the martingale problem for A with respect to the filtration {F_t} and π_t is the conditional distribution of X(t) given G_t ⊂ F_t, then

π_t f − π_0 f − ∫_0^t π_s Af ds          (1.1)

is a {G_t}-martingale for each f ∈ D(A).


Technical conditions

Condition 1.5

1. A : D(A) ⊂ Cb(E) → C(E) with 1 ∈ D(A) and A1 = 0.

2. D(A) is closed under multiplication and separates points.

3. There exist ψ ∈ C(E), ψ ≥ 1, and constants a_f such that f ∈ D(A) implies |Af(x)| ≤ a_f ψ(x).

4. Defining A_0 = {(f, ψ^{-1}Af) : f ∈ D(A)}, A_0 is separable in the sense that there exists a countable collection {g_k} ⊂ D(A) such that every solution of the martingale problem for {(g_k, A_0 g_k) : k = 1, 2, . . .} is a solution of the martingale problem for A_0.

5. A_0 is a pre-generator, that is, A_0 is dissipative and there are sequences of functions µ_n : E → P(E) and λ_n : E → [0,∞) such that for each (f, g) ∈ A and each x ∈ E,

g(x) = lim_{n→∞} λ_n(x) ∫_E (f(y) − f(x)) µ_n(x, dy).          (1.2)


Forward equation

A P(E)-valued function {ν_t, t ≥ 0} is a solution of the forward equation for A if for each t > 0, ∫_0^t ν_s ψ ds < ∞ (see Condition 1.5) and, for each f ∈ D(A),

ν_t f = ν_0 f + ∫_0^t ν_s Af ds.          (1.3)

Note that if π satisfies (1.1), then ν_t = E[π_t] satisfies (1.3).

Theorem 1.6 Under Condition 1.5, every solution of the forward equation corresponds to a solution of the martingale problem.
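As an illustration of (1.3) (my own sketch, with arbitrary parameters): for the Poisson generator Af(k) = λ(f(k+1) − f(k)), the forward equation reduces to the Kolmogorov forward ODEs d/dt ν_t(k) = λ(ν_t(k−1) − ν_t(k)), and a truncated Euler integration starting from ν_0 = δ_0 recovers the Poisson(λt) distribution.

    import numpy as np
    from math import exp, factorial

    lam, T, dt, K = 2.0, 1.0, 1e-3, 40            # rate, horizon, step, truncation level
    nu = np.zeros(K)
    nu[0] = 1.0                                   # nu_0 = delta_0
    for _ in range(int(T / dt)):
        shifted = np.concatenate(([0.0], nu[:-1]))   # nu_t(k-1), with nu_t(-1) = 0
        nu = nu + lam * (shifted - nu) * dt          # Euler step of the forward equation
    print(nu[:5])
    print([exp(-lam * T) * (lam * T) ** k / factorial(k) for k in range(5)])  # exact Poisson(lam*T)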


Martingale characterization of conditional distributions

Theorem 1.7 Suppose that {π_t, t ≥ 0} is a cadlag, P(E)-valued process with no fixed points of discontinuity, adapted to {G_t} and satisfying

E[ ∫_0^t π_s ψ ds ] < ∞,   t > 0,

and that

π_t f − π_0 f − ∫_0^t π_s Af ds

is a {G_t}-martingale for each f ∈ D(A). Then there exist a solution X̃ of the martingale problem for A, a P(E)-valued process {π̃_t, t ≥ 0} with the same distribution as {π_t, t ≥ 0}, and a filtration {G̃_t} such that π̃_t is the conditional distribution of X̃(t) given G̃_t.


Conditioning on a process

Theorem 1.8 If {G_t} in Theorem 1.7 is generated by a cadlag process Y with no fixed points of discontinuity together with π(0), that is,

G_t = F_t^Y ∨ σ(π(0)),

then there exist a solution X̃ of the martingale problem for A, a P(E)-valued process {π̃_t, t ≥ 0}, and a process Ỹ such that ({π̃_t, t ≥ 0}, Ỹ) has the same joint distribution as ({π_t, t ≥ 0}, Y) and π̃_t is the conditional distribution of X̃(t) given F_t^Ỹ ∨ σ(π̃(0)).


Idea of proof

Enlarge the state space so that the current state of the process contains all information about the past of the observation Y.

Let {b_k}, {c_k} ⊂ Cb(E_0) satisfy 0 ≤ b_k, c_k ≤ 1, and suppose that the linear spans of {b_k} and {c_k} are bounded, pointwise dense in B(E_0).

Let a_1, a_2, . . . be an ordering of the rationals with a_i ≥ 1, and let

V_{ki}(t) = c_k(Y(0)) − a_i ∫_0^t V_{ki}(s) ds + ∫_0^t b_k(Y(s)) ds          (1.4)
          = c_k(Y(0)) e^{−a_i t} + ∫_0^t e^{−a_i(t−s)} b_k(Y(s)) ds.


Set V(t) = (V_{ki}(t) : k, i ≥ 1) ∈ [0,1]^∞,

D(Â) = { f(x) ∏_{k,i=1}^m g_{ki}(v_{ki}) : f ∈ D(A), g_{ki} ∈ C¹[0,1], m = 1, 2, . . . },

and

Â(fg)(x, v, u) = g(v)Af(x) + f(x) Σ_{k,i} (−a_i v_{ki} + b_k(u)) ∂_{ki}g(v).

For fg ∈ D(Â),

π_t f · g(V(t)) − π_0 f · g(V(0)) − ∫_0^t ( g(V(s)) π_s Af + π_s f Σ_{k,i} (−a_i V_{ki}(s) + b_k(Y(s))) ∂_{ki}g(V(s)) ) ds

is an {F_t^Y}-martingale, and ν_t defined by

ν_t(fgh) = E[ π_t f · g(V(t)) h(Y(t)) ]

is a solution of the controlled forward equation for Â.


Partially observed processes

Let γ : E → E_0 be Borel measurable.

Corollary 1.9 If, in Theorem 1.8, Y and π satisfy

∫_E h∘γ(x) π_t(dx) = h(Y(t))   a.s.

for all h ∈ B(E_0) and t ≥ 0, then Ỹ(t) = γ(X̃(t)).

(cf. Kurtz and Ocone (1988) and Kurtz (1998))


The filtered martingale problem

Definition 1.10 A P(E)-valued process π and an E_0-valued process Y are a solution of the filtered martingale problem for (A, γ) if

π_t f − π_0 f − ∫_0^t π_s Af ds

is an F_t^Y ∨ σ(π(0))-martingale for each f ∈ D(A) and ∫_E h∘γ(x) π_t(dx) = h(Y(t)) a.s. for all h ∈ B(E_0) and t ≥ 0.

Theorem 1.11 Let ϕ_0 ∈ P(P(E)) and define µ_0 = ∫_{P(E)} µ ϕ_0(dµ). If uniqueness holds for the martingale problem for (A, µ_0), then uniqueness holds for the filtered martingale problem for (A, γ, ϕ_0). If uniqueness holds for the filtered martingale problem for (A, γ, ϕ_0), then {π_t, t ≥ 0} is a Markov process.

Note: π̃_t = H(t, Ỹ, π̃_0) implies π_t = H(t, Y, π_0).


Markov mappings

Theorem 1.12 Let γ : E → E_0 be Borel measurable and let α be a transition function from E_0 into E satisfying

α(y, γ^{-1}(y)) = 1.

Let µ_0 ∈ P(E_0), ν_0 = ∫ α(y, ·) µ_0(dy), and define

C = { ( ∫_E f(z) α(·, dz), ∫_E Af(z) α(·, dz) ) : f ∈ D(A) }.

If Ỹ is a solution of the MGP for (C, µ_0), then there exists a solution Z of the MGP for (A, ν_0) such that Y = γ∘Z and Ỹ have the same distribution on M_{E_0}[0,∞), and

E[f(Z(t)) | F_t^Y] = ∫ f(z) α(Y(t), dz)

(at least for almost every t; for all t if Y has no fixed points of discontinuity).


Uniqueness

Corollary 1.13 If uniqueness holds for the MGP for (A, ν_0), then uniqueness holds for the M_{E_0}[0,∞)-MGP for (C, µ_0). If Y has sample paths in D_{E_0}[0,∞), then uniqueness holds for the D_{E_0}[0,∞)-martingale problem for (C, µ_0).

Existence for (C, µ_0) and uniqueness for (A, ν_0) imply existence for (A, ν_0) and uniqueness for (C, µ_0), and hence that Y is Markov.


Intertwining condition

Let α(y, Γ) be a transition function from E_0 to E satisfying

α(y, γ^{-1}(y)) = 1,

and define S(t) : B(E_0) → B(E_0) by

S(t)g(y) = αT(t)(g∘γ)(y) ≡ ∫_E T(t)(g∘γ)(x) α(y, dx).

Theorem 1.14 (Rogers and Pitman (1981); cf. Rosenblatt (1966)) If for each t ≥ 0,

αT(t)f = S(t)αf,   f ∈ B(E)   (S(t) is a semigroup),

and X is a Markov process with initial distribution α(y, ·) and semigroup T(t), then Y = γ∘X is a Markov process with Y(0) = y and

P{X(t) ∈ Γ | F_t^Y} = α(Y(t), Γ).


Generator for Y

Note that

αT(t)f = S(t)αf,   f ∈ B(E),

suggests that the generator for Y is given by

Cαf = αAf.


Burke's output theorem   Kliemann, Koch, and Marchetti (1990)

X = (Q, D), an M/M/1 queue and its departure process:

Af(k, l) = λ(f(k + 1, l) − f(k, l)) + µ1_{k>0}(f(k − 1, l + 1) − f(k, l))

γ(k, l) = l

Assume λ < µ and define

α(l, {(k, l)}) = (1 − λ/µ)(λ/µ)^k,   k = 0, 1, 2, . . . ,    α(l, {(k, m)}) = 0,   m ≠ l.

Then

αAf(l) = µ Σ_{k=1}^∞ (1 − λ/µ)(λ/µ)^k (f(k − 1, l + 1) − f(k − 1, l)) = λ(αf(l + 1) − αf(l)).
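A quick numerical check of the last identity (my own sketch; the rates and the randomly generated test function f, taken with finite support in k, are arbitrary): with the geometric weights α, the averaged generator αAf(l) should equal λ(αf(l+1) − αf(l)).

    import numpy as np

    rng = np.random.default_rng(2)
    lam, mu = 1.0, 3.0                          # arrival and service rates, lam < mu
    rho = lam / mu
    K = 60                                      # k ranges over 0, ..., K-1
    f = rng.standard_normal((K, 5))             # f(k, l) for l = 0, ..., 4
    f[-2:, :] = 0.0                             # finite support in k, so all sums are finite

    alpha = (1 - rho) * rho ** np.arange(K)     # alpha(l, {(k, l)}) = (1 - rho) rho^k

    def Af(k, l):                               # generator of the M/M/1 queue + departures
        up = lam * (f[k + 1, l] - f[k, l]) if k + 1 < K else 0.0
        down = mu * (f[k - 1, l + 1] - f[k, l]) if k > 0 else 0.0
        return up + down

    alpha_f = lambda l: float(alpha @ f[:, l])  # (alpha f)(l)

    l = 1
    lhs = sum(alpha[k] * Af(k, l) for k in range(K))
    rhs = lam * (alpha_f(l + 1) - alpha_f(l))
    print(lhs, rhs)                             # agree up to floating-point roundoff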


Poisson output

Therefore, there exists a solution (Q, D) of the martingale problem for A such that D is a Poisson process with parameter λ and

P{Q(t) = k | F_t^D} = (1 − λ/µ)(λ/µ)^k,

that is, Q(t) is independent of F_t^D and is geometrically distributed.


Pitman’s theorem

Z standard Brownian motion, M(t) = sup_{s≤t} Z(s), V(t) = M(t) − Z(t)

X(t) = (Z(t), M(t) − Z(t)) = (Z(t), V(t))

Y(t) = 2M(t) − Z(t) = γ(X(t)) = 2V(t) + Z(t)

Af(z, v) = (1/2) f_{zz}(z, v) − f_{zv}(z, v) + (1/2) f_{vv}(z, v),   boundary condition f_v(z, 0) = 0

F(y) = αf(y) = (1/y) ∫_0^y f(y − 2v, v) dv

αAf(y) = (1/2) F''(y) + (1/y) F'(y)
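The last identity, which says that αAf is the generator of a three-dimensional Bessel process applied to F = αf, can be verified symbolically for particular test functions satisfying the boundary condition f_v(z, 0) = 0 (a sketch of mine; the polynomial f below is an arbitrary admissible choice).

    import sympy as sp

    z, v, y = sp.symbols('z v y', real=True)
    f = (z**2 + v**2)**2                         # satisfies f_v(z, 0) = 0

    # Af = (1/2) f_zz - f_zv + (1/2) f_vv
    Af = sp.Rational(1, 2) * sp.diff(f, z, 2) - sp.diff(f, z, v) + sp.Rational(1, 2) * sp.diff(f, v, 2)

    F = sp.integrate(f.subs(z, y - 2 * v), (v, 0, y)) / y        # F = alpha f
    aAf = sp.integrate(Af.subs(z, y - 2 * v), (v, 0, y)) / y     # alpha A f

    bessel3 = sp.Rational(1, 2) * sp.diff(F, y, 2) + sp.diff(F, y) / y
    print(sp.simplify(aAf - bessel3))            # prints 0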


2. Filtering equations

• Bayes and Kallianpur-Striebel formulas

• “Particle representations” of conditional distributions

• Continuous time filtering in Gaussian white noise

• Derivation of filtering equations

• Zakai equation

• Kushner-Stratonovich equation

• Point process observations

• Spatial observations with additive white noise

• Cluster detection

• Signal in noise with point process observations

• Exit time observations

• Uniqueness

Kurtz and Xiong (1999, 2001)


Bayes and Kallianpur-Striebel formulas   Kallianpur and Striebel (1968)

Let X and Y be random variables defined on (Ω, F, P). Suppose we want to calculate the conditional expectation

E^P[f(X) | Y].

If P << Q with dP = L dQ, the Bayes formula says

E^P[f(X) | Y] = E^Q[f(X)L | Y] / E^Q[L | Y].

If X = h(U, Y), L = L(U, Y), and U and Y are independent under Q, then

E^P[f(X) | Y] = ∫ f(h(u, Y)) L(u, Y) µ_U(du) / ∫ L(u, Y) µ_U(du),

where µ_U is the distribution of U.

Method: Find a reference measure under which what we don't know is independent of what we do know.


Monte Carlo integration Handschin and Mayne (1969)

Let U_1, U_2, . . . be iid with distribution µ_U. Then

E^P[f(X) | Y] = lim_{n→∞} [ Σ_{i=1}^n f(h(U_i, Y)) L(U_i, Y) ] / [ Σ_{i=1}^n L(U_i, Y) ].

Note that (U_1, Y), (U_2, Y), . . . is a stationary (in fact exchangeable) sequence. Consequently,

lim_{n→∞} (1/n) Σ_{i=1}^n f(h(U_i, Y)) L(U_i, Y) = E^Q[ f(h(U_1, Y)) L(U_1, Y) | I ]
 = E^Q[ f(h(U_1, Y)) L(U_1, Y) | Y ]
 = ∫ f(h(u, Y)) L(u, Y) µ_U(du)   a.s. Q,

where I is the invariant σ-field of the stationary sequence.
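A toy numerical illustration of this weighted Monte Carlo scheme (my own; the distributions and likelihood below are illustrative choices, not from the lecture): under Q, U and Y are independent standard normals, X = h(U, Y) = U, and dP = L dQ with L(u, y) = exp(uy − y²/2), so that under P the conditional law of X given Y = y is N(y, 1) and E^P[X | Y = y] = y.

    import numpy as np

    rng = np.random.default_rng(3)
    y = 0.7                                        # observed value of Y

    h = lambda u, y: u                             # X = h(U, Y)
    f = lambda x: x
    L = lambda u, y: np.exp(u * y - y ** 2 / 2)    # dP = L dQ

    U = rng.standard_normal(100_000)               # iid samples from mu_U = N(0, 1)
    w = L(U, y)
    estimate = np.sum(f(h(U, y)) * w) / np.sum(w)  # self-normalized weighted average
    print(estimate, y)                             # exact answer is y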


Continuous time filtering in Gaussian white noise

To understand the intuition behind the standard “observation in Gaussian white noise” filtering model, suppose X is the signal of interest and noisy observations of the form

O_n(k/n) = h(X(k/n)) (1/n) + (1/√n) ζ_k

are made every n^{-1} time units. For large n, the noise swamps the signal at any one time point.

Assume that the ζ_k are iid with mean zero and variance σ². Then Y_n(t) = Σ_{k=1}^{[nt]} O_n(k/n) is approximately

Y(t) = ∫_0^t h(X(s)) ds + σW(t).          (2.1)

Note, however, that E[f(X(t)) | F_t^{Y_n}] does not necessarily converge to E[f(X(t)) | F_t^Y]; nevertheless, we take (2.1) as our observation model.


Reference measure

By the Girsanov formula,

E^P[g(X(t)) | F_t^Y] = E^Q[g(X(t)) L(t) | F_t^Y] / E^Q[L(t) | F_t^Y],

where under Q, X and Y are independent, Y is a Brownian motion with mean zero and variance σ²t, and

L(t) = exp{ ∫_0^t (h(X(s))/σ²) dY(s) − (1/2) ∫_0^t (h²(X(s))/σ²) ds },

that is,

L(t) = 1 + ∫_0^t (h(X(s))/σ²) L(s) dY(s).

Under Q, σ^{-1}Y is a standard Brownian motion, so under P,

W(t) = σ^{-1}Y(t) − ∫_0^t σ^{-1} h(X(s)) ds

is a standard Brownian motion.


Monte Carlo solution Pardoux (1991)

Let X_1, X_2, . . . be iid copies of X that are independent of Y under Q, and let

L_i(t) = 1 + ∫_0^t (h(X_i(s))/σ²) L_i(s) dY(s).

Note that

φ(g, t) ≡ E^Q[g(X(t)) L(t) | F_t^Y] = E^Q[g(X_i(t)) L_i(t) | F_t^Y].

Claim:

(1/n) Σ_{i=1}^n g(X_i(t)) L_i(t) → E^Q[g(X(t)) L(t) | F_t^Y].
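A rough Euler-scheme sketch of this particle approximation (my own construction, with σ = 1 and an illustrative linear-Gaussian model so that the Kalman-Bucy filter gives an exact benchmark): a "true" signal generates the observation increments, independent copies X_i evolve with the same dynamics, their weights L_i solve dL_i = h(X_i)L_i dY, and the normalized weighted average approximates E[X(T) | F_T^Y].

    import numpy as np

    rng = np.random.default_rng(4)
    T, dt = 1.0, 1e-3
    steps, n = int(T / dt), 2000                 # time steps and number of particles

    h = lambda x: x                              # observation function, sigma = 1
    # "True" signal (OU: dX = -X dt + dB) and observation increments dY = h(X) dt + dW
    x_true, dY = 0.0, np.empty(steps)
    for k in range(steps):
        dY[k] = h(x_true) * dt + np.sqrt(dt) * rng.standard_normal()
        x_true += -x_true * dt + np.sqrt(dt) * rng.standard_normal()

    # Particles X_i with the signal dynamics (independent of Y) and log-weights
    X, logL = np.zeros(n), np.zeros(n)
    m, P = 0.0, 0.0                              # Kalman-Bucy reference for this linear model
    for k in range(steps):
        logL += h(X) * dY[k] - 0.5 * h(X) ** 2 * dt
        X += -X * dt + np.sqrt(dt) * rng.standard_normal(n)
        m += -m * dt + P * (dY[k] - m * dt)
        P += (1.0 - 2.0 * P - P ** 2) * dt

    w = np.exp(logL - logL.max())
    print(np.sum(X * w) / np.sum(w), m)          # particle estimate vs Kalman-Bucy mean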


Zakai equation

For simplicity, assume σ = 1 and that X is a diffusion,

X(t) = X(0) + ∫_0^t σ(X(s)) dB(s) + ∫_0^t b(X(s)) ds,

where under Q, B and Y are independent standard Brownian motions. Since

g(X(t)) = g(X(0)) + ∫_0^t g'(X(s)) σ(X(s)) dB(s) + ∫_0^t Ag(X(s)) ds,

and B and Y are independent under Q (so the cross variation of g(X) and L vanishes),

g(X(t))L(t) = g(X(0)) + ∫_0^t L(s) dg(X(s)) + ∫_0^t g(X(s)) dL(s)

 = g(X(0)) + ∫_0^t L(s) g'(X(s)) σ(X(s)) dB(s) + ∫_0^t L(s) Ag(X(s)) ds + ∫_0^t g(X(s)) h(X(s)) L(s) dY(s).


Monte Carlo derivation of Zakai equation

X_i(t) = X_i(0) + ∫_0^t σ(X_i(s)) dB_i(s) + ∫_0^t b(X_i(s)) ds,

where the (X_i(0), B_i) are iid copies of (X(0), B). As before,

g(X_i(t))L_i(t) = g(X_i(0)) + ∫_0^t L_i(s) g'(X_i(s)) σ(X_i(s)) dB_i(s)
 + ∫_0^t L_i(s) Ag(X_i(s)) ds + ∫_0^t g(X_i(s)) h(X_i(s)) L_i(s) dY(s),

and hence, averaging over i,

φ(g, t) = φ(g, 0) + ∫_0^t φ(Ag, s) ds + ∫_0^t φ(gh, s) dY(s).


Kushner-Stratonovich equation

π_t g = E^P[g(X(t)) | F_t^Y] = φ(g, t)/φ(1, t)

 = φ(g, 0)/φ(1, 0) + ∫_0^t (1/φ(1, s)) dφ(g, s) − ∫_0^t (φ(g, s)/φ(1, s)²) dφ(1, s)
   + ∫_0^t (φ(g, s)/φ(1, s)³) d[φ(1, ·)]_s − ∫_0^t (1/φ(1, s)²) d[φ(g, ·), φ(1, ·)]_s

 = π_0 g + ∫_0^t π_s Ag ds + ∫_0^t (π_s gh − π_s g π_s h) dY(s)
   + ∫_0^t σ² π_s g (π_s h)² ds − ∫_0^t σ² π_s gh π_s h ds

 = π_0 g + ∫_0^t π_s Ag ds + ∫_0^t (π_s gh − π_s g π_s h)(dY(s) − π_s h ds)

Note: π_t in Theorem 1.11 is π_t × δ_{Y(t)} in the current notation.
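When the signal takes finitely many values, the last equation can be integrated directly. The sketch below (my own two-state example with σ = 1 and arbitrary rates) applies an Euler step of the Kushner-Stratonovich equation to the probability vector p_t = (π_t{0}, π_t{1}); for a finite-state signal this is the Wonham filter.

    import numpy as np

    rng = np.random.default_rng(5)
    T, dt = 1.0, 1e-3

    Qgen = np.array([[-1.0, 1.0],                # generator (rate matrix) of the signal
                     [ 2.0, -2.0]])
    h = np.array([0.0, 1.0])                     # observation function h(state)
    p = np.array([0.5, 0.5])                     # pi_0

    state = rng.choice(2, p=p)                   # "true" signal path
    for _ in range(int(T / dt)):
        dY = h[state] * dt + np.sqrt(dt) * rng.standard_normal()
        # Euler step: d(pi g) = pi(Ag) dt + (pi(gh) - pi(g) pi(h)) (dY - pi(h) dt)
        ph = p @ h
        p = p + (Qgen.T @ p) * dt + (h * p - ph * p) * (dY - ph * dt)
        p = np.clip(p, 0.0, None)
        p = p / p.sum()                          # guard against discretization error
        if rng.random() < -Qgen[state, state] * dt:   # signal jumps with probability ~ rate * dt
            state = 1 - state
    print(p, state)                              # filtered distribution vs true final state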


Spatial observations with additive white noise

Signal:

X(t) = X(0) + ∫_0^t σ(X(s)) dB(s) + ∫_0^t b(X(s)) ds + ∫_{S_0×[0,t]} α(X(s), u) W(du × ds)

     = X(0) + ∫_0^t σ(X(s)) dB(s) + ∫_{S_0×[0,t]} α(X(s), u) Y(du × ds)
       + ∫_0^t ( b(X(s)) − ∫_{S_0} α(X(s), u) h(X(s), u) µ_0(du) ) ds

Observation:

Y(A, t) = ∫_0^t ∫_A h(X(s), u) µ_0(du) ds + W(A, t)

Under Q, Y is Gaussian white noise on S_0 × [0,∞) with

E[Y(A, t) Y(B, s)] = µ_0(A ∩ B) t ∧ s,

and dP|_{F_t} = L(t) dQ|_{F_t}, where

L(t) = 1 + ∫_{S_0×[0,t]} L(s) h(X(s), u) Y(du × ds).


Apply Itô's formula

Under P, X is a diffusion with generator

Af(x) = (1/2) Σ_{i,j} a_{ij}(x) ∂_i∂_j f(x) + Σ_i b_i(x) ∂_i f(x),

where

a(x) = σ(x)σ(x)^T + ∫_{S_0} α(x, u) α(x, u)^T µ_0(du).

Then

f(X(t))L(t) = f(X(0)) + ∫_0^t L(s) ∇f(X(s))^T σ(X(s)) dB(s)
 + ∫_{S_0×[0,t]} L(s) ( ∇f(X(s)) · α(X(s), u) + f(X(s)) h(X(s), u) ) Y(du × ds)
 + ∫_0^t L(s) Af(X(s)) ds.


Particle representation

Let B_i be independent standard Brownian motions, independent of Y on (Ω, F, P_0), and let

X_i(t) = X_i(0) + ∫_0^t σ(X_i(s)) dB_i(s) + ∫_{S_0×[0,t]} α(X_i(s−), u) Y(du × ds)
 + ∫_0^t ∫_{S_0} ( b(X_i(s)) − α(X_i(s), u) h(X_i(s), u) ) µ_0(du) ds

L_i(t) = 1 + ∫_{S_0×[0,t]} L_i(s) h(X_i(s), u) Y(du × ds).

Then

φ(f, t) = E^{P_0}[f(X(t)) L(t) | F_t^Y] = lim_{n→∞} (1/n) Σ_{i=1}^n f(X_i(t)) L_i(t).


Zakai equation

Since

f(X_i(t))L_i(t) = f(X_i(0)) + ∫_0^t L_i(s) ∇f(X_i(s))^T σ(X_i(s)) dB_i(s)
 + ∫_{S_0×[0,t]} L_i(s) ( ∇f(X_i(s)) · α(X_i(s), u) + f(X_i(s)) h(X_i(s), u) ) Y(du × ds)
 + ∫_0^t L_i(s) Af(X_i(s)) ds,

averaging gives

φ(f, t) = φ(f, 0) + ∫_0^t φ(Af, s) ds + ∫_{S_0×[0,t]} φ( ∇f · α(·, u) + f h(·, u), s ) Y(du × ds).


Kushner-Stratonovich equation

It follows that

π_t f = φ(f, t)/φ(1, t)

 = π_0 f + ∫_0^t π_s Af ds
   + ∫_{S_0×[0,t]} ( π_s(∇f · α(·, u) + f h(·, u)) − π_s f π_s h(·, u) ) Y(du × ds)
   + ∫_0^t ∫_{S_0} ( π_s f π_s h(·, u) − π_s(∇f · α(·, u) + f h(·, u)) ) π_s h(·, u) µ_0(du) ds

 = π_0 f + ∫_0^t π_s Af ds
   + ∫_{S_0×[0,t]} ( π_s(∇f · α(·, u) + f h(·, u)) − π_s f π_s h(·, u) ) Ỹ(du × ds),

where Ỹ is the innovation measure

Ỹ(A, t) = Y(A, t) − ∫_0^t ∫_A π_s h(·, u) µ_0(du) ds.


Cluster detection: Possible applications

• Internet packets that form a malicious attack on a computer system.

• Financial transactions that form a collusive trading scheme.

• Earthquakes that form a single seismic event.


The model Wu (2009)

The observations form a marked point process O with marks in E,

O(A, t) = N(A, t) + C(A, t),

with

N(A, t) = ∫_{A×[0,∞)×[0,t]} 1_{[0,γ(u)]}(v) ξ_1(du × dv × ds)

C(A, t) = ∫_{A×[0,∞)×[0,t]} 1_{[0,λ(u,η_{s−})]}(v) ξ_2(du × dv × ds),

where ξ_1 and ξ_2 are independent Poisson random measures on E × [0,∞) × [0,∞) with mean measure ν × ℓ × ℓ, ℓ denoting Lebesgue measure, and

η_t(A × [0, r]) = ∫_{E×[0,t]} 1_A(u) 1_{[0,r]}(s) C(du × ds).


Radon-Nikodym derivative

Lemma 2.1 On (Ω, F, Q), let N and C be independent Poisson random measures with mean measures ν_0(du × ds) = γ(u)ν(du)ds and ν_1(du × ds) = λ(u)ν(du)ds, respectively, that are compatible with {F_t}. Let L satisfy

L(t) = 1 + ∫_{E×[0,t]} ( λ(u, η_{s−})/λ(u) − 1 ) L(s−) ( C(du × ds) − λ(u)ν(du)ds ),          (2.2)

and assume that L is an {F_t}-martingale. Define dP|_{F_t} = L(t) dQ|_{F_t}. Under P, for all A such that ∫_0^t ∫_A λ(u, η_s) ν(du) ds < ∞, t > 0,

C(A, t) − ∫_{A×[0,t]} λ(u, η_s) ν(du) ds

is a local martingale, and N is independent of C and is a Poisson random measure with mean measure ν_0.


The general filtering equations

Theorem 2.2

φ(f, t) = φ(f, 0) − ∫_{E×[0,t]} φ( f(·)(λ(u, ·) − λ(u)), s ) ν(du) ds
 + ∫_{E×[0,t]} φ( f(· + δ_{(u,s)}) λ(u, ·)/λ(u) − f(·), s− ) · λ(u)/(λ(u) + γ(u)) · O(du × ds)

and

π_t f = π_0 f
 + ∫_{E×[0,t]} [ π_{s−}( f(· + δ_{(u,s)}) λ(u, ·) ) − π_{s−}λ(u, ·) π_{s−}f ] / [ π_{s−}λ(u, ·) + γ(u) ] O(du × ds)
 − ∫_{E×[0,t]} ( π_s( f(·)λ(u, ·) ) − π_s f π_s λ(u, ·) ) ν(du) ds.


Simplify

Problem: The difficulty of computing the distribution: 2^{O(E,t)} possible states.

Need to compromise: compute π_t f = E^P[f(η_t) | F_t] for a “small” collection of f.

Suppose one observes u_i at time τ_i and set y_i = (u_i, τ_i):

θ(y_i)(·) = 1_{{y_i is a point in the cluster}}

θ_0(y_i)(·) = 1_{{y_i is the latest point in the cluster}}

We need to be able to evaluate π_t λ(u, ·).


A Markov scenario

Consider

λ(u, η_t) = Σ_{i=1}^{O(E,t)} λ(u, y_i) θ_0(y_i) + ε(u),

where θ_0(y_i) = 1_{{y_i is the latest point in the cluster}}.

Get a closed system for {π_t θ_0(y_i)}.

Let θ(y)(·) = 1_{{y is a point in the cluster}}.

Get a closed system for

{π_t θ_0(y_i), π_t θ(y_i), π_t θ(y_i)θ_0(y_j)}.


Exit time observations Krylov and Wang (2011)

Let X^τ be a diffusion process in a domain D stopped at τ = inf{t : X^τ(t) ∉ D^o}. Let Y_0 satisfy

Y_0(t) = ∫_0^t h(X^τ(s)) ds + W_0(t),

where h(x) = 0 for x ∈ ∂D. We are interested in the conditional distribution of X^τ(t) given F_t^{Y_0,τ} = σ(Y_0(s), τ ∧ s : s ≤ t).

For i = 1, 2, . . ., let N_i be independent, unit Poisson processes and define

Y_i(t) = N_i( ∫_0^t 1_{∂D}(X^τ(s)) ds ).

Let

F_t^n = σ(Y_0(s), . . . , Y_n(s) : s ≤ t)

and τ_n = inf{t : ∨_{1≤i≤n} Y_i(t) ≥ 1}. Then τ = lim_{n→∞} τ_n, and we have

lim_{n→∞} E[f(X^τ(t)) | F_t^n] = E[f(X^τ(t)) | ∨_n F_t^n] = E[f(X^τ(t)) | F_t^{Y_0,τ}].


Reference measure

Under the reference measure Q_n, Y_0 is a standard Brownian motion and the Y_i, i = 1, . . . , n, are independent Poisson processes with intensity n^{-1}.

Setting θ(t) = 1_{∂D}(X(t∧τ)), Y^n = (Y_0, Y_1, . . . , Y_n), and S(Y^n(t)) = Σ_{i=1}^n Y_i(t),

L_0(X^τ, Y_0, t) = exp{ ∫_0^t h(X^τ(s)) dY_0(s) − (1/2) ∫_0^t h²(X^τ(s)) ds }

L_i(X^τ, Y_i, t) = exp{ ∫_0^t log(nθ(s−)) dY_i(s) − ∫_0^t (θ(s) − n^{-1}) ds }

Then

L_n(X^τ, Y^n, t) = L_0(X^τ, Y_0, t) ∏_{i=1}^n L_i(X^τ, Y_i, t).


SDE for Radon-Nikodym derivative

L_n(X^τ, Y^n, t)
 = L_0(X^τ, Y_0, t) exp{ Σ_{i=1}^n ∫_0^t log(nθ(s−)) dY_i(s) − ∫_0^t (nθ(s) − 1) ds }
 = L_0(X^τ, Y_0, t) 1_{{τ_n > t}} exp{ −∫_0^t (nθ(s) − 1) ds }
   + L_0(X^τ, Y_0, t) 1_{{τ < τ_n ≤ t}} n^{S(Y^n(t))} exp{ −n(t − τ) + t }.

So

L_n(X^τ, Y^n, t) = 1 + ∫_0^t L_n(X^τ, Y^n, s) h(X^τ(s))^T dY_0(s)
 + Σ_{i=1}^n ∫_0^t L_n(X^τ, Y^n, s−)(nθ(s−) − 1) d(Y_i(s) − n^{-1}s).


Zakai equation

Writing L_n(t) = L_n(X^τ, Y^n, t) and

M_ϕ(t) = ϕ(X^τ(t)) − ϕ(X(0)) − ∫_0^t A^τϕ(X^τ(s)) ds,

we have

L_n(t)ϕ(X^τ(t)) = ϕ(X^τ(0)) + ∫_0^t ϕ(X^τ(s)) dL_n(s) + ∫_0^t L_n(s) A^τϕ(X^τ(s)) ds + ∫_0^t L_n(s) dM_ϕ(s)

 = ϕ(X^τ(0)) + ∫_0^t ϕ(X^τ(s)) L_n(s) h(X^τ(s))^T dY_0(s)
   + Σ_{i=1}^n ∫_0^t ϕ(X^τ(s)) L_n(s−)(nθ(s−) − 1) d(Y_i(s) − n^{-1}s)
   + ∫_0^t L_n(s) A^τϕ(X^τ(s)) ds + ∫_0^t L_n(s) dM_ϕ(s).


Averaging

The Zakai equation becomes

⟨V^n(t), ϕ⟩ = ⟨V(0), ϕ⟩ + ∫_0^t ⟨V^n(s), A^τϕ⟩ ds + ∫_0^t ⟨V^n(s), ϕ h^T⟩ dY_0(s)
 + Σ_{i=1}^n ∫_0^t ⟨V^n(s−), (n 1_{∂D} − 1)ϕ⟩ d(Y_i(s) − n^{-1}s).


Kushner-Stratonovich equation

⟨π^n(t), ϕ⟩ = ⟨π(0), ϕ⟩ + ∫_0^t ⟨π^n(s), A^τϕ⟩ ds
 + ∫_0^t ( ⟨π^n(s), ϕ h^T⟩ − ⟨π^n(s), ϕ⟩⟨π^n(s), h^T⟩ ) (dY_0(s) − ⟨π^n(s), h⟩ ds)
 − ∫_0^t ⟨π^n(s−), (n 1_{∂D} − 1)ϕ⟩ ds + ∫_0^t ⟨π^n(s−), ϕ⟩ ⟨π^n(s−), (n 1_{∂D} − 1)⟩ ds
 + Σ_{i=1}^n ∫_0^t ( ⟨π^n(s−), 1_{∂D}ϕ⟩ / ⟨π^n(s−), 1_{∂D}⟩ − ⟨π^n(s−), ϕ⟩ ) dY_i(s)

 = ⟨π(0), ϕ⟩ + ∫_0^t ⟨π^n(s), A^τϕ⟩ ds
 + ∫_0^t ( ⟨π^n(s), ϕ h^T⟩ − ⟨π^n(s), ϕ⟩⟨π^n(s), h^T⟩ ) (dY_0(s) − ⟨π^n(s), h⟩ ds)
 + Σ_{i=1}^n ∫_0^t ( ⟨π^n(s−), 1_{∂D}ϕ⟩ / ⟨π^n(s−), 1_{∂D}⟩ − ⟨π^n(s−), ϕ⟩ ) (dY_i(s) − ⟨π^n(s−), 1_{∂D}⟩ ds).


Stopped equation

Stopping at τ_n, we have

⟨π^n(t∧τ_n), ϕ⟩ = ⟨π(0), ϕ⟩ + ∫_0^{t∧τ_n} ⟨π^n(s), A^τϕ⟩ ds
 + ∫_0^{t∧τ_n} ( ⟨π^n(s), ϕ h^T⟩ − ⟨π^n(s), ϕ⟩⟨π^n(s), h^T⟩ ) (dY_0(s) − ⟨π^n(s), h⟩ ds)
 + 1_{{τ_n ≤ t}} ( ⟨π^n(τ_n−), 1_{∂D}ϕ⟩ / ⟨π^n(τ_n−), 1_{∂D}⟩ − ⟨π^n(τ_n−), ϕ⟩ )
 − n ∫_0^{t∧τ_n} ( ⟨π^n(s), 1_{∂D}ϕ⟩ / ⟨π^n(s), 1_{∂D}⟩ − ⟨π^n(s), ϕ⟩ ) ⟨π^n(s), 1_{∂D}⟩ ds.

Note that, assuming P{τ_n < ∞} = 1, we must have

E[ n ∫_0^{τ_n} ⟨π^n(s), 1_{∂D}⟩ ds ] = 1

and E[⟨π^n(s), 1_{∂D}⟩ 1_{{s<τ_n}}] = E[⟨π^n(s), 1_{∂D}⟩ 1_{{τ≤s<τ_n}}] = O(n^{-1}).


Convergence

Let D^o = {x : ρ(x) > 0} with |∇ρ(x)| > 0 on ∂D. Let k_n → ∞ in such a way that, at least along a subsequence, for t < τ,

∫_0^t k_n² ⟨π^n(s), e^{−k_nρ} ϕ⟩ ds → ∫_0^t ⟨β(s), ϕ⟩ ds.

Define ϕ_n(x) = ϕ(x) e^{−k_nρ(x)}. Then for t < τ,

∫_0^t n ⟨π^n(s), 1_{∂D}ϕ⟩ ds → ∫_0^t ⟨β(s), (1/2) Σ_{i,j} a_{ij} ∂_iρ ∂_jρ ϕ⟩ ds,

and we should have

⟨π^n(τ_n−), 1_{∂D}ϕ⟩ / ⟨π^n(τ_n−), 1_{∂D}⟩ → ⟨β(s), Σ_{i,j} a_{ij} ∂_iρ ∂_jρ ϕ⟩ / ⟨β(s), Σ_{i,j} a_{ij} ∂_iρ ∂_jρ⟩.


Uniqueness for the Kushner-Stratonovich equation

Each of the Kushner-Stratonovich equations is of the form

π_t g = π_0 g + ∫_0^t π_s Ag ds + ∫_0^t G(π_{s−}) d(Y(s) − π_s h ds).

Each of the stochastic integrators would be a martingale if π were the conditional distribution process. Suppose that is not assumed.

In each case, under some restrictions on h and π, we can do a change of measure to make Y(t) − ∫_0^t π_s h ds a martingale. Under the “new” measure, (Y, π) is a solution of the corresponding filtered martingale problem.

By uniqueness for the filtered martingale problem, π_t = H(t, Y) for some appropriately measurable transformation H. But this transformation does not depend on the change of measure, so π was already the conditional distribution process.


References

J. E. Handschin and D. Q. Mayne. Monte Carlo techniques to estimate the conditional expectation in multi-stage non-linear filtering. Internat. J. Control (1), 9:547–559, 1969. ISSN 0020-7179.

G. Kallianpur and C. Striebel. Estimation of stochastic systems: Arbitrary system process with additive white noise observation errors. Ann. Math. Statist., 39:785–801, 1968. ISSN 0003-4851.

W. Kliemann, G. Koch, and F. Marchetti. On the unnormalized solution of the filtering problem with counting observations. IEEE Trans. Inform. Theory, 36(6):1415–1425, 1990.

N. V. Krylov and Teng Wang. Filtering partially observable diffusions up to the exit time from a domain. Stochastic Process. Appl., 121(8):1785–1815, 2011. ISSN 0304-4149. doi: 10.1016/j.spa.2011.04.008. URL http://dx.doi.org/10.1016/j.spa.2011.04.008.

Thomas G. Kurtz. Martingale problems for conditional distributions of Markov processes. Electron. J. Probab., 3: no. 9, 29 pp. (electronic), 1998. ISSN 1083-6489.

Thomas G. Kurtz and Giovanna Nappo. The filtered martingale problem. In Dan Crisan and Boris Rozovskii, editors, Handbook on Nonlinear Filtering, chapter 5, pages 129–165. Oxford University Press, 2011.

Thomas G. Kurtz and Daniel L. Ocone. Unique characterization of conditional distributions in nonlinear filtering. Ann. Probab., 16(1):80–107, 1988. ISSN 0091-1798.

Thomas G. Kurtz and Jie Xiong. Particle representations for a class of nonlinear SPDEs. Stochastic Process. Appl., 83(1):103–126, 1999. ISSN 0304-4149.

Thomas G. Kurtz and Jie Xiong. Numerical solutions for a class of SPDEs with application to filtering. In Stochastics in Finite and Infinite Dimensions, Trends Math., pages 233–258. Birkhäuser Boston, Boston, MA, 2001.


Étienne Pardoux. Filtrage non linéaire et équations aux dérivées partielles stochastiques associées. In École d'Été de Probabilités de Saint-Flour XIX–1989, volume 1464 of Lecture Notes in Math., pages 67–163. Springer, Berlin, 1991.

L. C. G. Rogers and J. W. Pitman. Markov functions. Ann. Probab., 9(4):573–582, 1981. ISSN 0091-1798. URL http://links.jstor.org/sici?sici=0091-1798(198108)9:4<573:MF>2.0.CO;2-G&origin=MSN.

Murray Rosenblatt. Functions of Markov processes. Z. Wahrscheinlichkeitstheorie und Verw. Gebiete, 5:232–243, 1966.

Zhengxiao Wu. A cluster identification framework illustrated by a filtering model for earthquake occurrences. Bernoulli, 15(2):357–379, 2009. ISSN 1350-7265. doi: 10.3150/08-BEJ159. URL http://dx.doi.org/10.3150/08-BEJ159.


Abstract: Martingale problems and filtering

Martingale problems for conditional distributions

Let X be a Markov process characterized as the solution of a martingale problem with generator A, and let Y(t) be given by a function of X(t). The conditional distribution of X(t) given observations of Y up to time t is characterized as the solution of a filtered martingale problem. Uniqueness for the original martingale problem implies uniqueness for the filtered martingale problem, which in turn implies the Markov property for the conditional distribution considered as a probability-measure-valued process.

Derivation and uniqueness for filtering equations

The conditional distribution of a partially observed Markov process can be characterized as a solution of a filtered martingale problem.


In a variety of settings, this characterization in turn implies that the conditional distribution is given as the unique solution of a filtering equation. Previous results will be reviewed, and new uniqueness results based on local martingale problems and a local forward equation will be presented.