
Page 1:

Segue from time series to point processes.

Y = 0, 1    E(Y) = Prob{Y = 1}

(Y1, Y2)    E(Y1 Y2) = Prob{(Y1, Y2) = (1,1)}

{Y(t)} case.

mean level: cY(t) = Prob{Y(t) = 1}

product moment: Prob{Y(t1) = 1, Y(t2) = 1} = E{Y(t1)Y(t2)}

Naïve interpretations

Stationary case: cYY(t1 - t2)

Can use acf, ccf, ..., i.e. the stationary-series R functions

Page 2:

Can approximate point process data {τj, j = 1, ..., J} by 0-1 time series data

points isolated, pick small Δt

time series Y(t/Δt); its length T = τJ/Δt can be large
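A minimal R sketch of this binning step (the point times tau, the bin width dt, and the record length are made-up example values), ending with the ordinary acf mentioned on Page 1:

    # Bin point times tau into a 0-1 series Y and apply the ordinary acf.
    set.seed(1)
    tau <- sort(runif(200, 0, 100))      # example point times on (0, 100)
    dt  <- 0.1                           # bin width Delta t, chosen small
    counts <- hist(tau, breaks = seq(0, 100, by = dt), plot = FALSE)$counts
    Y <- as.numeric(counts > 0)          # 0-1 series; small dt keeps bins to <= 1 point
    acf(Y, lag.max = 100)                # stationary-series R function applied to Y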

Page 3:

Point processes on the line. Nerve firing.

Page 4:

Stochastic point process. Building blocks

Process on R: {N(t)}, t in R, with a consistent set of distributions

Pr{N(I1) = k1, ..., N(In) = kn},   k1, ..., kn integers ≥ 0

The I's are Borel sets of R.

Consistency example. If I1, I2 disjoint,

Pr{N(I1) = k1, N(I2) = k2, N(I1 ∪ I2) = k3}

= 1 if k1 + k2 = k3

= 0 otherwise

Guttorp book, Chapter 5
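As an illustration of this consistency requirement, one can check on a simulated realization that counts over disjoint intervals add (a sketch with made-up intervals and rate, not from the slides):

    set.seed(2)
    tau <- cumsum(rexp(500, rate = 2))            # a homogeneous Poisson process, rate 2
    inI <- function(a, b) tau > a & tau <= b      # indicator of the interval (a, b]
    I1 <- c(0, 5); I2 <- c(7, 12)                 # disjoint intervals
    k1 <- sum(inI(I1[1], I1[2]))
    k2 <- sum(inI(I2[1], I2[2]))
    k3 <- sum(inI(I1[1], I1[2]) | inI(I2[1], I2[2]))   # N(I1 u I2)
    c(k1 = k1, k2 = k2, k3 = k3, additive = k1 + k2 == k3)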

Page 5:

Points:  ... ≤ τ-1 ≤ τ0 ≤ τ1 ≤ ...

the discontinuities of {N}

N(t) = #{0 < τj ≤ t}

Simple: τj ≠ τk if j ≠ k

points are isolated

dN(t) = 0 or 1

Surprise. A simple point process is determined by its void probabilities

Pr{N(I) = 0} I compact
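A small R sketch of the counting function N(t) = #{0 < τj ≤ t} built from example point times (tau below is made-up data):

    set.seed(3)
    tau <- sort(runif(20, 0, 10))            # 20 example point times
    N <- stepfun(tau, 0:length(tau))         # unit jumps at each tau_j
    N(c(2.5, 5, 10))                         # N(t) at a few values of t
    plot(N, do.points = FALSE, main = "Counting function N(t)")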

Page 6:

Conditional intensity. Simple case

History: Ht = {τj ≤ t}

Pr{dN(t) = 1 | Ht} = λ(t:ω) dt,   a r.v.

Has all the information

Probability that the points in [0,T) are t1, ..., tN:

Pr{dN(t1) = 1, ..., dN(tN) = 1}

= λ(t1) ··· λ(tN) exp{- ∫0T λ(t) dt} dt1 ··· dtN

≈ [1 - λ(h)h][1 - λ(2h)h] ··· λ(t1)λ(t2) ···
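The log of this probability, Σj log λ(τj) - ∫0T λ(t) dt, is the usual starting point for fitting. A sketch for the special case of a deterministic intensity (an inhomogeneous Poisson process) with the illustrative form λ(t) = exp(α + βt); the data tau and all numerical values are made up:

    # Log-likelihood of an inhomogeneous Poisson process on [0, Tend)
    loglik <- function(par, tau, Tend) {
      alpha <- par[1]; beta <- par[2]
      lam <- function(t) exp(alpha + beta * t)
      sum(log(lam(tau))) - integrate(lam, 0, Tend)$value
    }
    set.seed(4)
    tau <- sort(runif(50, 0, 10))                 # example point times on [0, 10)
    loglik(c(log(5), 0), tau, Tend = 10)          # evaluate at one parameter value
    optim(c(0, 0), function(p) -loglik(p, tau, Tend = 10))$par   # crude ML fit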

Page 7:

Dirac delta.

Picture a r.v. U = 0 with probability 1;

then E{g(U)} = g(0)

Picture a r.v. V with distribution N(0, σ²), σ small;

then E{g(V)} approaches g(0) as σ decreases, for g continuous at 0

Picture that U has a density δ(u), a generalized function

then E{g(U)} = ∫ g(u) δ(u) du

Properties: ∫ δ(u) du = 1, δ(u) = 0 for u ≠ 0

dH(u) = δ(u) du for H the Heaviside function
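A quick numerical illustration of the second point, with an arbitrary example g (not from the slides):

    g <- function(u) cos(u) + u^2                 # any g continuous at 0; here g(0) = 1
    set.seed(5)
    sapply(c(1, 0.1, 0.01), function(sigma) mean(g(rnorm(1e5, 0, sigma))))
    # the three averages approach g(0) = 1 as sigma decreases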

Page 8:

Parameters. Suppose points are isolated

dN(t) = 1 if point in (t,t+dt]

= 0 otherwise

1. (Mean) rate/intensity.

E{dN(t)} = pN(t)dt

= Pr{dN(t) = 1}

Σj g(τj) = ∫ g(s) dN(s)

E{Σj g(τj)} = ∫ g(s) pN(s) ds

E{N(t)} = ∫0t pN(s) ds

Trend: pN(t) = exp{α + βt}    Cycle: pN(t) = exp{α + β cos(γt + φ)}
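A sketch simulating points with the trend-type rate pN(t) = exp{α + βt} by thinning and checking N(T) against E{N(T)} = ∫0T pN(s) ds; α, β and T are made-up values, and thinning is one standard simulation device, not something from the slides:

    set.seed(6)
    alpha <- 0; beta <- 0.2; Tend <- 10
    pN   <- function(t) exp(alpha + beta * t)            # trend-type rate
    lmax <- pN(Tend)                                      # dominating rate for thinning
    cand <- cumsum(rexp(1000, rate = lmax))               # candidate points at rate lmax
    cand <- cand[cand < Tend]
    tau  <- cand[runif(length(cand)) < pN(cand) / lmax]   # keep each with prob pN/lmax
    c(observed = length(tau), expected = integrate(pN, 0, Tend)$value)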

Page 9:

Product density of order 2.

Pr{dN(s)=1 and dN(t)=1}

= E{dN(s)dN(t)}

= [δ(s-t) pN(t) + pNN(s,t)] ds dt

Factorial moment

E{N(t)[N(t) - 1]} = ∫0t ∫0t pNN(u,v) du dv
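For a homogeneous Poisson process of rate λ this identity is easy to check by simulation, since then pNN(u,v) = λ² and the double integral over [0,t]² is (λt)² (illustrative values below):

    set.seed(7)
    lambda <- 3; t <- 2
    N <- rpois(1e5, lambda * t)                   # N(t) over many realizations
    c(simulated = mean(N * (N - 1)), theory = (lambda * t)^2)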

Page 10:

Autointensity.

Pr{dN(t)=1|dN(s)=1}

= (pNN(s,t)/pN(s)) dt,   s ≠ t

= hNN(s,t)dt

= pN (t)dt if increments uncorrelated

Page 11:

Covariance density/cumulant density of order 2.

cov{dN(s), dN(t)} = qNN(s,t) ds dt,   s ≠ t

= [δ(s-t) pN(s) + qNN(s,t)] ds dt   generally

qNN(s,t) = pNN(s,t) - pN(s) pN(t),   s ≠ t

Page 12:

Identities.

1. Σj,k g(τj, τk) = ∫∫ g(s,t) dN(s) dN(t)

Expected value.

E{∫∫ g(s,t) dN(s) dN(t)}

= ∫∫ g(s,t)[δ(s-t) pN(t) + pNN(s,t)] ds dt

= ∫ g(t,t) pN(t) dt + ∫∫ g(s,t) pNN(s,t) ds dt

Page 13:

2. cov{Σj g(τj), Σk h(τk)}

= cov{∫ g(s) dN(s), ∫ h(t) dN(t)}

= ∫∫ g(s) h(t)[δ(s-t) pN(s) + qNN(s,t)] ds dt

= ∫ g(t) h(t) pN(t) dt + ∫∫ g(s) h(t) qNN(s,t) ds dt

Page 14:

Product density of order k.

t1,...,tk all distinct

Prob{dN(t1)=1,...,dN(tk)=1}

=E{dN(t1)...dN(tk)}

= pN...N (t1,...,tk)dt1 ...dtk

E{N(t)(k)} = ∫0t ··· ∫0t pN...N(t1, ..., tk) dt1 ··· dtk

where N(t)(k) = N(t)(N(t) - 1) ··· (N(t) - k + 1), the k-th factorial power

Page 15:

Cumulant density of order k.

t1,...,tk distinct

cum{dN(t1),...,dN(tk)}

= qN...N (t1 ,...,tk)dt1 ...dtk

Page 16:

Stationarity.

Joint distributions,

Pr{N(I1+t) = k1, ..., N(In+t) = kn},   k1, ..., kn integers ≥ 0

do not depend on t for n=1,2,...

Rate.

E{dN(t)} = pN dt

Product density of order 2.

Pr{dN(t+u)=1 and dN(t)=1}

= [δ(u) pN + pNN(u)] dt du

Page 17:

Autointensity.

Pr{dN(t+u)=1|dN(t)=1}

= (pNN(u)/pN) du,   u ≠ 0

= hN(u)du

Covariance density.

cov{dN(t+u),dN(t)}

= [δ(u) pN + qNN(u)] dt du
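A crude estimate of the autointensity from data: histogram the positive pairwise lags τk - τj and normalize by T, the bin width, and the estimated rate (edge effects ignored; the data below are a simulated Poisson process, for which the estimate should sit near pN):

    set.seed(8)
    Tend <- 500
    tau  <- cumsum(rexp(1500, rate = 2)); tau <- tau[tau < Tend]   # Poisson process, rate 2
    pN   <- length(tau) / Tend                   # estimated mean rate
    b    <- 0.25                                 # lag bin width
    lags <- outer(tau, tau, "-"); lags <- lags[lags > 0 & lags < 10]
    pNNu <- hist(lags, breaks = seq(0, 10, by = b), plot = FALSE)$counts / (Tend * b)
    hNu  <- pNNu / pN                            # estimated autointensity h(u)
    plot(seq(b/2, 10 - b/2, by = b), hNu, type = "h", xlab = "u", ylab = "h(u)")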

Page 18:

Page 19:

Mixing.

cov{dN(t+u),dN(t)} small for large |u|

|pNN(u) - pNpN| small for large |u|

hNN(u) = pNN(u)/pN ~ pN for large |u|

∫ |qNN(u)| du < ∞

Page 20:

Algebra/calculus of point processes.

Consider the process {τj, τj + u}. Stationary case

dN(t) = dM(t) + dM(t+u)

Taking "E", pNdt = pMdt+ pMdt

pN = 2 pM

Page 21:

Taking "E" again, for s ≠ t,

pNN(s,t) ds dt = E{dN(s) dN(t)}

= E{dM(s)dM(t)} + E{dM(s)dM(t+u)} + E{dM(s+u)dM(t)} + E{dM(s+u)dM(t+u)}

= [2 pMM(s-t) + pMM(s-t-u) + pMM(s-t+u) + {δ(s-t-u) + δ(s-t+u)} pM] ds dt

so, with v = s - t,

pNN(v) = 2 pMM(v) + pMM(v-u) + pMM(v+u) + [δ(v-u) + δ(v+u)] pM

Page 22:

Association. Measuring? Due to chance?

Are two processes associated? E.g. a t.s. and a p.p.

How strongly?

Can one predict one from the other?

Some characteristics of dependence:

E(XY) ≠ E(X) E(Y)

E(Y|X) = g(X)

X = g(ε), Y = h(ε), ε a r.v.

f(x,y) ≠ f(x) f(y)

corr(X,Y) ≠ 0

Page 23:

Bivariate point process case.

Two types of points, σj and τk

Crossintensity.

Prob{dN(t)=1|dM(s)=1}

=(pMN(t,s)/pM(s))dt

Cross-covariance density.

cov{dM(s),dN(t)}

= qMN(s,t) ds dt    (no δ( ) term)
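The cross-intensity can be estimated in the same histogram style as the autointensity, now using lags between N points and M points; the bivariate data below are simulated, with each N point a delayed copy of an M point, so the estimate peaks at small u:

    set.seed(10)
    Tend <- 500
    tauM <- cumsum(rexp(1200, rate = 1)); tauM <- tauM[tauM < Tend]   # process M
    tauN <- sort(tauM + rexp(length(tauM), rate = 2))                 # process N
    pM   <- length(tauM) / Tend                  # estimated rate of M
    b    <- 0.1                                  # lag bin width
    lags <- outer(tauN, tauM, "-"); lags <- lags[lags > 0 & lags < 5]
    hMN  <- hist(lags, breaks = seq(0, 5, by = b), plot = FALSE)$counts / (Tend * b * pM)
    plot(seq(b/2, 5 - b/2, by = b), hMN, type = "h", xlab = "u", ylab = "cross-intensity")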

Page 24: