
2. Stationary Processes and Models

A random variable x is a rule for assigning to every outcome s of an experiment a number x(s).
– For example: the outcome of a die toss.

– For example: the outcome of a survey: Very satisfied / Satisfied / Fair / Dissatisfied / Very dissatisfied, coded 5 4 3 2 1.

The outcomes of an experiment are unknown before it is conducted. This is why the variable is called random.

In many cases, we can collect many outcomes and characterize the variable using its distribution function.

For example: the previous survey example.


The distribution function:

Although the variable is random, we can somehow understand its characteristics and predict its outcomes if we know the distribution function.

[Figure: histogram of the survey outcomes 1–5, showing the counts used to form the empirical distribution.]
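As a small sketch of the idea above, we can turn observed counts into an empirical distribution and then characterize the random variable by its mean and spread. The counts below are assumed for illustration, not real survey data.

```python
import numpy as np

# Hypothetical survey counts for scores 1..5 (assumed values).
scores = np.array([1, 2, 3, 4, 5])
counts = np.array([30, 45, 80, 55, 50])

# Empirical probability mass function: normalize counts to sum to 1.
pmf = counts / counts.sum()

# With the distribution in hand we can characterize the random
# variable, e.g. by its mean and standard deviation.
mean = np.sum(scores * pmf)
std = np.sqrt(np.sum((scores - mean) ** 2 * pmf))
print(pmf, mean, std)
```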

A random (stochastic) process x(t) is a rule for assigning to every outcome s of an experiment a time function x(t,s).

If t is fixed, x(t,s) is a random variable. If s is fixed, x(t,s) is an outcome (a time function) of a particular experiment. If both s and t are fixed, x(t,s) is a number.

[Figure: an ensemble of sample functions x(t,s) plotted against t, one curve per outcome s.]

Examples: – Record of a speech.

– Record of temperature.

– Received signal in mobile communication.

How do we characterize a random process mathematically? Use statistics.

First-order statistics: the density function of x(t)

[Figure: densities f[x(t0)], f[x(t0+1)], f[x(t0+2)] sketched at successive time instants along the t axis.]

Second order statistics: f[x(t1),x(t2)]

N-th order statistics: f[x(t1),x(t2),..,x(tN)]

In any case, we need infinitely many distribution functions, which are almost impossible to specify.

A random process x(t) is called strict-sense stationary if its statistical properties are invariant to a shift of the time origin. Letting time be discrete, we have

f(x(n₁)) = f(x(n₁+Δ))
f(x(n₁), x(n₂)) = f(x(n₁+Δ), x(n₂+Δ))
⋮
f(x(n₁), x(n₂), …, x(n_N)) = f(x(n₁+Δ), x(n₂+Δ), …, x(n_N+Δ))

Except for the first-order statistics, we still have infinitely many distributions to specify.

A random process is called wide-sense stationary (WSS) if

E{x(n)} = μ (a constant)
E{x(n) x*(n−τ)} = r(τ) (a function of τ only)

The function r(τ) is called the autocorrelation function. Note that r(0) = E{|x(n)|²} is the average power of x(n). The autocovariance is defined as

c(τ) = r(τ) − |μ|²

For real signals, we have r(−τ) = r(τ). For complex signals, we have r(−τ) = r*(τ).

Thus, μ and r(τ) completely characterize a WSS process.

Note that the characterization of a random process by the mean and the autocorrelation function is not unique.

Conclusion: In general, we can only partially characterize a small portion of random processes.

In real applications, how can we know μ and r(τ) of a random process?

It is usually impractical to obtain μ and r(τ) from ensemble averages, so we may want to use time averages.

The time-average estimate of the mean is

μ̂(N) = (1/N) Σ_{n=0}^{N−1} x(n)

Note that the estimate is a random variable itself, and E{μ̂(N)} = μ.
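A minimal sketch of the time-average mean estimate, using an assumed test signal (white Gaussian noise with true mean μ = 2), for which one realization's time average lands close to μ:

```python
import numpy as np

# Time-average mean estimate mu_hat(N) = (1/N) * sum_{n} x(n).
# Test signal (assumed): white Gaussian noise with true mean mu = 2.
rng = np.random.default_rng(0)
mu, N = 2.0, 100_000
x = mu + rng.standard_normal(N)

mu_hat = x.mean()   # time average over one realization
print(mu_hat)
```

For this uncorrelated (hence mean-ergodic) process the estimator's standard deviation is 1/√N, so the estimate is within a few multiples of 1/√N of μ.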

The ensemble and time average: the ensemble average runs across realizations s at a fixed time, while the time average runs along t for a single realization.

We say that the process x(n) is mean ergodic in the mean-square sense if

lim_{N→∞} E{|μ̂(N) − μ|²} = 0

It can be shown that

E{|μ̂(N) − μ|²} = (1/N²) Σ_{n=0}^{N−1} Σ_{k=0}^{N−1} c(n−k)
              = (1/N) Σ_{l=−(N−1)}^{N−1} (1 − |l|/N) c(l)

For N → ∞, |l|/N → 0. Thus, the mean-ergodic condition implies

lim_{N→∞} (1/N) Σ_{l=−(N−1)}^{N−1} c(l) = 0

Thus, if the process is asymptotically uncorrelated, it is mean ergodic in the MSE sense.

In the same way, a process is said to be correlation ergodic in the MSE sense if the autocorrelation can be estimated using the time average

r̂(k; N) = (1/N) Σ_{n=0}^{N−1} x(n) x*(n−k)

Let u(n) = [u(n), u(n−1), …, u(n−M+1)]ᵀ. The correlation matrix of u(n) is defined as R = E{u(n)uᴴ(n)}.
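The time-average autocorrelation estimate can be sketched as follows, using an assumed test signal (zero-mean unit-variance white noise, for which r(0) = 1 and r(k) = 0 for k ≠ 0):

```python
import numpy as np

# Time-average autocorrelation estimate
# r_hat(k; N) = (1/N) * sum_n x(n) x*(n-k).
rng = np.random.default_rng(1)
N = 200_000
x = rng.standard_normal(N)   # assumed test signal: white noise

def r_hat(x, k):
    """Biased time-average autocorrelation estimate at lag k >= 0."""
    return np.sum(x[k:] * np.conj(x[:len(x) - k])) / len(x)

r0, r1 = r_hat(x, 0), r_hat(x, 1)
print(r0, r1)
```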


The correlation matrix plays an important role in adaptive signal processing.

Property 1: the correlation matrix (of a stationary process) is Hermitian, i.e., Rᴴ = R.

Property 2: the correlation matrix is Toeplitz.

Property 3: the correlation matrix is always nonnegative definite.

Let y = xᴴu(n). Then E{|y|²} = E{yy*} = E{xᴴu(n)uᴴ(n)x} = xᴴE{u(n)uᴴ(n)}x = xᴴRx ≥ 0.
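Properties 1–3 can be checked numerically. The autocorrelation sequence r(k) = 0.9^k below is an assumed example (the autocorrelation shape of a stable AR(1) process, so R is a valid correlation matrix):

```python
import numpy as np

# Build the M x M correlation matrix R from an autocorrelation
# sequence and check Properties 1-3 (assumed example sequence).
M = 4
r = 0.9 ** np.arange(M)   # r(0), r(1), ..., r(M-1)

# Real process: R[i, j] = r(|i - j|), Toeplitz by construction.
R = np.array([[r[abs(i - j)] for j in range(M)] for i in range(M)])

is_hermitian = np.allclose(R, R.conj().T)             # Property 1
eigvals = np.linalg.eigvalsh(R)
is_nonneg_definite = bool(np.all(eigvals >= -1e-12))  # Property 3
print(is_hermitian, is_nonneg_definite)
```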

Property 4: Let uᴮ(n) = [u(n−M+1), u(n−M+2), …, u(n)]ᵀ, the backward arrangement of u(n). Then E{uᴮ(n) uᴮᴴ(n)} = Rᵀ.

Property 5: the correlation matrices R_M and R_{M+1} are related by

R_{M+1} = [ r(0)  rᴴ ;  r  R_M ],   where rᴴ = [r(1), r(2), …, r(M)],

and

R_{M+1} = [ R_M  rᴮ* ;  rᴮᵀ  r(0) ],   where rᴮᵀ = [r(−M), r(−M+1), …, r(−1)].

This can be easily shown as follows. In this case, the input vector is decomposed as

u_{M+1}(n) = [u(n), u_Mᵀ(n−1)]ᵀ,

i.e., the current sample u(n) stacked on top of the delayed M-vector u_M(n−1).

Similarly, for the second partition the input vector is decomposed as

u_{M+1}(n) = [u_Mᵀ(n), u(n−M)]ᵀ.

Stochastic model: Any hypothesis that may be applied to explain or describe the hidden laws that are supposed to govern or constrain the generation of some physical data of interest.

This is equivalent to asking: how do we synthesize a stochastic process? Or, how do we characterize a stochastic process?

Typically, using a model, we only need a small set of parameters to do the job.

[Figure: white Gaussian noise v(n) drives a linear filter whose output is the stochastic process u(n), where

E{v(n)} = 0,   E{v(n)v*(n−k)} = σ_v² δ(k).]

Autoregressive (AR) model :

u(n) + a₁u(n−1) + … + a_M u(n−M) = v(n),

or equivalently

u(n) = −a₁u(n−1) − … − a_M u(n−M) + v(n).

z-transform representation:

U(z) = V(z) / (1 + a₁z⁻¹ + a₂z⁻² + … + a_M z⁻ᴹ)

The AR model is used most often, since its parameter identification is easier.

Note that the poles must be inside the unit circle. Strictly speaking, the AR process is not stationary; if n is large, we say that it is asymptotically stationary.

For example: u(n)=a u(n-1)+v(n)

Iterating the recursion gives

u(0) = v(0)
u(1) = a u(0) + v(1) = a v(0) + v(1)
u(2) = a u(1) + v(2) = a² v(0) + a v(1) + v(2)

Thus, we have the variance of u(n) as

σ_u²(0) = σ_v²
σ_u²(1) = (1 + a²) σ_v²
σ_u²(2) = (1 + a² + a⁴) σ_v²
⋮
σ_u²(n) = σ_v² (1 − a^{2(n+1)}) / (1 − a²)  →  σ_v² / (1 − a²)  as n → ∞, for |a| < 1.

* One can show that the autocorrelation also has the same characteristics. Thus, AR processes are asymptotically stationary.

The generation of an AR process:
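The generation of an AR(1) process and its asymptotic stationarity can be sketched as follows (a = 0.8 and σ_v = 1 are assumed values); the ensemble variance grows from σ_v² toward σ_v²/(1 − a²):

```python
import numpy as np

# AR(1) generation: u(n) = a*u(n-1) + v(n), v(n) white Gaussian.
rng = np.random.default_rng(2)
a, sigma_v = 0.8, 1.0        # assumed parameters
N, trials = 2000, 1000

v = sigma_v * rng.standard_normal((trials, N))
u = np.zeros((trials, N))
u[:, 0] = v[:, 0]
for n in range(1, N):
    u[:, n] = a * u[:, n - 1] + v[:, n]

var_early = u[:, 0].var()             # = sigma_v^2 at n = 0
var_late = u[:, -1].var()             # ensemble variance at large n
var_theory = sigma_v**2 / (1 - a**2)  # asymptotic value
print(var_early, var_late, var_theory)
```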


Moving Average (MA) Model:

u(n) = b₀v(n) + b₁v(n−1) + … + b_N v(n−N)

z-transform representation:

U(z) = (b₀ + b₁z⁻¹ + … + b_N z⁻ᴺ) V(z)

ARMA model:

U(z) = (b₀ + b₁z⁻¹ + … + b_N z⁻ᴺ) / (1 + a₁z⁻¹ + a₂z⁻² + … + a_M z⁻ᴹ) · V(z)

Computing MA and ARMA model coefficients requires solving systems of nonlinear equations.

Generation of ARMA signal:

[Figure: direct-form filter realization of the ARMA model — white Gaussian noise v(n) passes through a tapped delay line with feedforward coefficients b₀, b₁, …, b_N and feedback coefficients a₁, a₂, …, a_M, summed to produce the ARMA process u(n).]
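The tapped-delay-line diagram above is a direct-form IIR filter, so an ARMA process can be generated by filtering white noise. The coefficients below are assumed for illustration; note that lfilter's denominator vector is [1, a₁, …, a_M], matching 1 + a₁z⁻¹ + … + a_M z⁻ᴹ:

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(3)
b = [1.0, 0.5]           # b0, b1 (MA part; assumed values)
a = [1.0, -0.5, 0.25]    # 1, a1, a2 (AR part; poles inside unit circle)

v = rng.standard_normal(5000)  # white Gaussian noise input
u = lfilter(b, a, v)           # ARMA process output
print(u.shape)
```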

Multiplying the AR difference equation

u(n) + a₁u(n−1) + … + a_M u(n−M) = v(n)

by u*(n−l) and taking expectations, we have

E{u(n)u*(n−l)} + a₁E{u(n−1)u*(n−l)} + … + a_M E{u(n−M)u*(n−l)} = E{v(n)u*(n−l)},

i.e.,

r(l) + a₁r(l−1) + … + a_M r(l−M) = 0,   l > 0,

since v(n) is uncorrelated with past outputs. For l = 0, E{v(n)u*(n)} = σ_v², and we have

r(0) + a₁r(−1) + … + a_M r(−M) = σ_v².

Let l = 1, 2, …, M. We then obtain a set of linear equations to solve for the aᵢ's:

r(1) + a₁r(0) + a₂r(−1) + … + a_M r(−M+1) = 0
r(2) + a₁r(1) + a₂r(0) + … + a_M r(−M+2) = 0
⋮
r(M) + a₁r(M−1) + a₂r(M−2) + … + a_M r(0) = 0

Then, using r(−k) = r*(k) (= r(k) for real signals), in matrix form:

[ r(0)     r(1)     …  r(M−1) ] [ a₁ ]      [ r(1) ]
[ r(1)     r(0)     …  r(M−2) ] [ a₂ ]  = − [ r(2) ]
[ ⋮                           ] [ ⋮  ]      [ ⋮    ]
[ r(M−1)   r(M−2)   …  r(0)   ] [ a_M ]     [ r(M) ]

Once the aᵢ's are found, we can then find σ_v² from the l = 0 equation. The mapping is

{r(0), r(1), …, r(M)}  ↔  {σ_v², a₁, a₂, …, a_M}.
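A sketch of solving the Yule-Walker equations numerically: simulate an AR(2) process with known coefficients (σ_v = 1 assumed), estimate r(k) by time averaging, and solve the M = 2 linear system for a₁, a₂:

```python
import numpy as np

rng = np.random.default_rng(4)
a_true = np.array([-0.1, -0.8])   # u(n) + a1 u(n-1) + a2 u(n-2) = v(n)
N = 100_000

v = rng.standard_normal(N)
u = np.zeros(N)
for n in range(2, N):
    u[n] = -a_true[0] * u[n - 1] - a_true[1] * u[n - 2] + v[n]
u = u[1000:]                      # discard the start-up transient

def r_hat(x, k):
    """Biased time-average autocorrelation estimate at lag k >= 0."""
    return np.dot(x[k:], x[:len(x) - k]) / len(x)

r = np.array([r_hat(u, k) for k in range(3)])
Rm = np.array([[r[0], r[1]],      # real process, so r(-k) = r(k)
               [r[1], r[0]]])
a_est = np.linalg.solve(Rm, -r[1:3])
sigma_v2 = r[0] + a_est @ r[1:3]  # the l = 0 equation
print(a_est, sigma_v2)
```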

Example, AR(2): u(n) + a₁u(n−1) + a₂u(n−2) = v(n)

The 2nd-order AR process has the characteristic equation

1 + a₁z⁻¹ + a₂z⁻² = 0.

Thus, we have two poles, and they must be located inside the unit circle. It turns out that the following conditions must be satisfied:

a₁ + a₂ ≥ −1,   a₂ − a₁ ≥ −1,   −1 ≤ a₂ ≤ 1.

Three examples are considered:
(1) a₁ = −0.1, a₂ = −0.8 (poles: 0.9458, −0.8458)
(2) a₁ = 0.1, a₂ = −0.8 (poles: −0.9458, 0.8458)
(3) a₁ = −0.975, a₂ = 0.95 (poles: 0.4875 ± j0.8440)
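The three examples can be verified directly: the poles are the roots of z² + a₁z + a₂, and stability (asymptotic stationarity) requires them to lie inside the unit circle:

```python
import numpy as np

# The three (a1, a2) pairs from the examples above.
examples = [(-0.1, -0.8), (0.1, -0.8), (-0.975, 0.95)]

max_radii = []
for a1, a2 in examples:
    poles = np.roots([1.0, a1, a2])   # roots of z^2 + a1 z + a2
    max_radii.append(np.abs(poles).max())
    # Equivalent stability-triangle conditions:
    assert a1 + a2 >= -1 and a2 - a1 >= -1 and -1 <= a2 <= 1
    print(a1, a2, poles)
```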

Permissible region:


Samples:

Input white noise


The autocorrelation for (1)


The autocorrelation for (2)


The autocorrelation for (3)


Is an AR process stationary?
– No, it is non-stationary.

– It is asymptotically stationary. This can be clearly seen from its recursive equation. For example, for a first-order AR process, we have

x(n) = a x(n−1) + v(n).

We then have

σ_x²(1) = a² σ_x²(0) + σ_v²
σ_x²(2) = a⁴ σ_x²(0) + a² σ_v² + σ_v²

As we can see, even the variance is not constant in the AR process.

However, as n approaches infinity, it becomes stationary.

Order selection:
– Given a random process, how can we select an order for a modeling AR process?

An information-theoretic criterion (AIC):

AIC(m) = −2 L(θ̂_m) + 2m

– m: the model order
– uᵢ = u(i), i = 1, 2, …, N: the observations
– θ̂_m = [θ̂₁ₘ, θ̂₂ₘ, …, θ̂ₘₘ]ᵀ: the estimated parameters, with L(θ̂_m) = max ln f_U(u₁, …, u_N | θ_m) the maximized log-likelihood.

Minimum description length (MDL) criterion:

MDL(m) = −L(θ̂_m) + (1/2) m ln N

– N: the number of data points. In both cases, the order minimizing the criterion is selected.
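A sketch of AIC/MDL order selection under a simplifying assumption: the Gaussian log-likelihood of an AR(m) fit is approximated (up to constants) by L(m) = −(N/2) ln σ_m², where σ_m² is the prediction-error variance from the order-m Yule-Walker solution:

```python
import numpy as np

rng = np.random.default_rng(5)
a_true = [-0.1, -0.8]            # true process is AR(2), assumed
N = 50_000
v = rng.standard_normal(N)
u = np.zeros(N)
for n in range(2, N):
    u[n] = -a_true[0] * u[n - 1] - a_true[1] * u[n - 2] + v[n]

def r_hat(x, k):
    return np.dot(x[k:], x[:len(x) - k]) / len(x)

r = np.array([r_hat(u, k) for k in range(8)])

aic, mdl = {}, {}
for m in range(1, 7):
    Rm = np.array([[r[abs(i - j)] for j in range(m)] for i in range(m)])
    a = np.linalg.solve(Rm, -r[1:m + 1])
    sigma2 = r[0] + a @ r[1:m + 1]   # prediction-error variance
    L = -0.5 * N * np.log(sigma2)    # log-likelihood up to constants
    aic[m] = -2 * L + 2 * m
    mdl[m] = -L + 0.5 * m * np.log(N)

aic_order = min(aic, key=aic.get)
mdl_order = min(mdl, key=mdl.get)
print(aic_order, mdl_order)
```

MDL's heavier penalty makes it consistent: it recovers the true order 2 here, while AIC may occasionally prefer a slightly larger order.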

Let u(n) denote a complex Gaussian process consisting of N samples with
– a mean of zero:  E{u(n)} = 0,  n = 1, 2, …, N

– an autocorrelation function denoted by  r(k) = E{u(n)u*(n−k)},  k = 0, 1, …, N−1,

and the set of autocorrelation values defines the correlation matrix R of the Gaussian process u(n).

The density function of u = [u(1), u(2), …, u(N)]ᵀ is

f_U(u) = 1/(πᴺ det(R)) · exp(−uᴴR⁻¹u)

Note that f_U(u) is 2N-dimensional (N real parts and N imaginary parts). We use N(0, R) to denote a Gaussian process with correlation matrix R.

Properties:
– The process u(n) is stationary in the strict sense.

– The process is circularly complex:  E{u(n)u(n−k)} = 0 for all k. It is also referred to as circularly complex Gaussian.

– Let uₙ = u(n), for n = 1, 2, …, N, denote samples of a Gaussian process. If k ≠ l,

E{u_{s₁}u_{s₂}…u_{s_k} u*_{t₁}u*_{t₂}…u*_{t_l}} = 0,

and if k = l,

E{u_{s₁}…u_{s_l} u*_{t₁}…u*_{t_l}} = Σ_π E{u_{s₁}u*_{t_π(1)}} E{u_{s₂}u*_{t_π(2)}} … E{u_{s_l}u*_{t_π(l)}},

where π denotes a permutation of {1, 2, …, l}.

The last identity is called the Gaussian moment factoring theorem. A special case is

E{u₁u₂u₃*u₄*} = E{u₁u₃*}E{u₂u₄*} + E{u₁u₄*}E{u₂u₃*}.

Linear transformation of a random process: let a discrete-time linear filter with frequency response H(e^{jω}) map the input x(n) to the output u(n). Windowing both signals to N samples and taking Fourier transforms, U_N(e^{jω}) ≈ H(e^{jω}) X_N(e^{jω}), so

E{|U_N(e^{jω})|²} = |H(e^{jω})|² E{|X_N(e^{jω})|²}

and

S_u(ω) = lim_{N→∞} (1/N) E{|U_N(e^{jω})|²} = |H(e^{jω})|² lim_{N→∞} (1/N) E{|X_N(e^{jω})|²} = |H(e^{jω})|² S_x(ω).

For an AR process, u(n) is generated by passing white Gaussian noise v(n) through a discrete-time all-pole filter H(z), so

S_u(ω) = |H(e^{jω})|² S_v(ω) = σ_v² |H(e^{jω})|².

Thus, as long as we know the poles' positions, we can figure out the PSD of the AR process.
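The relation S_u(ω) = σ_v² |H(e^{jω})|² can be checked numerically for an assumed AR(1) filter H(z) = 1/(1 − a z⁻¹), a = 0.8, by averaging windowed periodograms (1/N)|U_N(ω)|² over many realizations:

```python
import numpy as np
from scipy.signal import lfilter

rng = np.random.default_rng(6)
a, sigma_v = 0.8, 1.0     # assumed filter and noise parameters
N, trials = 1024, 500

w = 2 * np.pi * np.arange(N) / N
S_theory = sigma_v**2 / np.abs(1 - a * np.exp(-1j * w))**2

psd = np.zeros(N)
for _ in range(trials):
    v = sigma_v * rng.standard_normal(N)
    u = lfilter([1.0], [1.0, -a], v)     # AR(1) realization
    psd += np.abs(np.fft.fft(u))**2 / N  # periodogram (1/N)|U_N|^2
psd /= trials

rel_err = np.abs(psd - S_theory) / S_theory
print(rel_err.mean())
```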

Power spectrum analyzer:– Ideal bandpass filter


Spectrum analysis of stochastic processes: Let u(n) be a random process and let u_N(n) = u(n) for n = 0, 1, …, N−1 and u_N(n) = 0 otherwise [a windowed version of u(n)]. Then

u(n) = lim_{N→∞} u_N(n).

The Fourier transform (FT) of u_N(n) and its conjugate are

U_N(ω) = Σ_{n=0}^{N−1} u(n) e^{−jωn},   U_N*(ω) = Σ_{k=0}^{N−1} u*(k) e^{jωk}.

Then,

|U_N(ω)|² = Σ_{n=0}^{N−1} Σ_{k=0}^{N−1} u(n)u*(k) e^{−jω(n−k)}

E{|U_N(ω)|²} = Σ_{n=0}^{N−1} Σ_{k=0}^{N−1} E{u(n)u*(k)} e^{−jω(n−k)} = Σ_{n=0}^{N−1} Σ_{k=0}^{N−1} r(n−k) e^{−jω(n−k)}.

Let l = n − k. We may rewrite the above formula as

(1/N) E{|U_N(ω)|²} = Σ_{l=−(N−1)}^{N−1} (1 − |l|/N) r(l) e^{−jωl}.

Thus,

lim_{N→∞} (1/N) E{|U_N(ω)|²} = Σ_{l=−∞}^{∞} r(l) e^{−jωl} = S(ω).

Thus, the FT of the autocorrelation function is called the power spectral density (PSD) of a process.

Let S(ω) be a PSD. Then S(ω)dω corresponds to the average power contributed to the total power by components of the process with frequencies located between ω and ω + dω.

Property : The PSD of a stationary process is real and nonnegative.

Property: The frequency support of the PSD is the Nyquist interval (−π, π].

Property: The PSD of a real stationary process is even (if it is complex, this is not necessarily true).

Property:

r(0) = (1/2π) ∫_{−π}^{π} S(ω) dω.
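The last property can be checked numerically using the AR(1) PSD S(ω) = σ_v²/|1 − a e^{−jω}|² (a = 0.8, σ_v = 1 assumed), for which r(0) = σ_v²/(1 − a²):

```python
import numpy as np

a, sigma_v = 0.8, 1.0     # assumed AR(1) parameters
M = 100_000
w = np.linspace(-np.pi, np.pi, M + 1)
S = sigma_v**2 / np.abs(1 - a * np.exp(-1j * w))**2

dw = w[1] - w[0]
r0_numeric = np.sum(S) * dw / (2 * np.pi)  # Riemann-sum approximation
r0_theory = sigma_v**2 / (1 - a**2)
print(r0_numeric, r0_theory)
```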
