Chapter 6: Correlation Functions


Notes and figures are based on or taken from materials in the course textbook: Probabilistic Methods of Signal and System Analysis (3rd ed.) by George R. Cooper and Clare D. McGillem; Oxford Press, 1999. ISBN: 0-19-512354-9.

B.J. Bazuin, Spring 2015, ECE 3800


6-1 Introduction

6-2 Example: Autocorrelation Function of a Binary Process

6-3 Properties of Autocorrelation Functions

6-4 Measurement of Autocorrelation Functions

6-5 Examples of Autocorrelation Functions

6-6 Crosscorrelation Functions

6-7 Properties of Crosscorrelation Functions

6-8 Examples and Applications of Crosscorrelation Functions

6-9 Correlation Matrices for Sampled Functions

Concepts

Autocorrelation: of a Binary/Bipolar Process; of a Sinusoidal Process

Linear Transformations

Properties of Autocorrelation Functions

Measurement of Autocorrelation Functions

An Anthology of Signals

Crosscorrelation

Properties of Crosscorrelation

Examples and Applications of Crosscorrelation

Correlation Matrices for Sampled Functions


5-5.3 A random process has sample functions of the form

$$X(t) = A \sum_{n=-\infty}^{\infty} f(t - nT - t_0)$$

where A and T are constants and $t_0$ is a random variable that is uniformly distributed between 0 and T. The function f(t) is defined by

$$f(t) = 1, \quad 0 \le t \le \frac{T}{2}$$

and zero elsewhere.

Also note:

$$f_{t_0}(t_0) = \frac{1}{T}, \quad 0 \le t_0 \le T$$

a. Find the mean and second moment.

$$E[X(t)] = E\Big[A \sum_n f(t - nT - t_0)\Big] = A\, E\Big[\sum_n f(t - nT - t_0)\Big]$$

For any time t, there is a probability that f(t) is present or absent, based on the random variable $t_0$. As $t_0$ is uniformly distributed across the bit period T and f(t) is present for exactly 1/2 of the period,

$$E\Big[\sum_n f(t - nT - t_0)\Big] = \frac{1}{T}\int_0^T f(t - nT - t_0)\, dt_0 = \frac{1}{T}\cdot\frac{T}{2} = \frac{1}{2}$$

and

$$E[X(t)] = \frac{A}{2}$$

For the second moment,

$$E\big[X(t)^2\big] = E\Big[A^2 \Big(\sum_n f(t - nT - t_0)\Big)^{\!2}\Big] = A^2\, E\Big[\sum_n f(t - nT - t_0)^2\Big]$$

since the pulses do not overlap and f is either 0 or 1. As previously stated, the pulse is present for half the period, so

$$E\big[X(t)^2\big] = \frac{A^2}{2}$$


b. Find the time average and squared average.

$$\langle X(t)\rangle = \frac{1}{T}\int_{t_0}^{t_0+T} A \sum_n f(t - nT - t_0)\, dt$$

$$\langle X(t)\rangle = \frac{A}{T}\int_0^{T} f(t)\, dt = \frac{A}{T}\int_0^{T/2} dt = \frac{A}{T}\cdot\frac{T}{2} = \frac{A}{2}$$

and

$$\langle X(t)^2\rangle = \frac{1}{T}\int_{t_0}^{t_0+T} A^2 \sum_n f(t - nT - t_0)^2\, dt$$

$$\langle X(t)^2\rangle = \frac{A^2}{T}\int_0^{T} f(t)^2\, dt = \frac{A^2}{T}\int_0^{T/2} dt = \frac{A^2}{T}\cdot\frac{T}{2} = \frac{A^2}{2}$$

c. Can this process be stationary?

Yes, the mean and variance are not functions of time. Therefore, we expect it to be wide-sense stationary.

d. Can this process be ergodic?

Based on the information provided, there is no specific reason to suggest that the process is not ergodic.


Chapter 6: Correlation Functions

The Autocorrelation Function

For a sample function defined by samples in time of a random process, how alike are the different samples?

Define: $X_1 = X(t_1)$ and $X_2 = X(t_2)$

The autocorrelation is defined as:

$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = E[X_1 X_2]$$

The above function is valid for all processes, stationary and non-stationary.

For WSS processes:

$$R_{XX}(t_1, t_1 + \tau) = E[X(t)X(t+\tau)] = R_{XX}(\tau)$$

If the process is ergodic, the time average is equivalent to the probabilistic expectation, or

$$\Re_{XX}(\tau) = \langle x(t)\,x(t+\tau)\rangle = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} x(t)\,x(t+\tau)\, dt$$

and

$$\Re_{XX}(\tau) = R_{XX}(\tau)$$


Example: $x(t) = A\sin(2\pi f t)$, for A a uniformly distributed random variable on $(-2, 2)$.

$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = E\big[A\sin(2\pi f t_1)\cdot A\sin(2\pi f t_2)\big]$$

$$R_{XX}(t_1, t_2) = E[A^2]\cdot \frac{1}{2}\big[\cos(2\pi f (t_2 - t_1)) - \cos(2\pi f (t_2 + t_1))\big]$$

for $t_2 = t_1 + \tau$, and with $E[A^2] = \frac{(2-(-2))^2}{12} = \frac{16}{12}$,

$$R_{XX}(t_1, t_1+\tau) = \frac{1}{2}\cdot\frac{16}{12}\big[\cos(2\pi f \tau) - \cos(2\pi f (2t_1 + \tau))\big] = \frac{16}{24}\big[\cos(2\pi f \tau) - \cos(2\pi f (2t_1 + \tau))\big]$$

Because the result depends on $t_1$ as well as $\tau$, this is a non-stationary process.

The time based formulation:

$$\Re_{XX}(\tau) = \langle x(t)\,x(t+\tau)\rangle = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} x(t)\,x(t+\tau)\, dt$$

$$\Re_{XX}(\tau) = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} A\sin(2\pi f t)\, A\sin(2\pi f (t+\tau))\, dt$$

$$\Re_{XX}(\tau) = \frac{A^2}{2}\,\lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} \big[\cos(2\pi f \tau) - \cos(2\pi f (2t + \tau))\big]\, dt$$

$$\Re_{XX}(\tau) = \frac{A^2}{2}\cos(2\pi f \tau) - \frac{A^2}{2}\,\lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} \cos(4\pi f t + 2\pi f \tau)\, dt$$

$$\Re_{XX}(\tau) = \frac{A^2}{2}\cos(2\pi f \tau)$$

Acceptable, but the R.V. is still present?!

$$E\big[\Re_{XX}(\tau)\big] = \frac{E[A^2]}{2}\cos(2\pi f \tau) = \frac{16}{24}\cos(2\pi f \tau)$$


Example: $x(t) = A\sin(2\pi f t + \theta)$, for A a constant and $\theta$ a uniformly distributed random variable on $(0, 2\pi)$.

$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = E\big[A\sin(2\pi f t_1 + \theta)\cdot A\sin(2\pi f t_2 + \theta)\big]$$

$$R_{XX}(t_1, t_2) = \frac{A^2}{2}\, E\big[\cos(2\pi f (t_2 - t_1)) - \cos(2\pi f (t_1 + t_2) + 2\theta)\big]$$

$$R_{XX}(t_1, t_2) = \frac{A^2}{2}\cos(2\pi f (t_2 - t_1)) - \frac{A^2}{2}\, E\big[\cos(2\pi f (t_1 + t_2) + 2\theta)\big]$$

Of note is that the phase need only be uniformly distributed over 0 to π in the previous step (the expectation over 2θ is still taken over a full cycle)!

$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = \frac{A^2}{2}\cos(2\pi f (t_2 - t_1))$$

for $t_2 = t_1 + \tau$,

$$R_{XX}(\tau) = \frac{A^2}{2}\cos(2\pi f \tau)$$

but also

$$\Re_{XX}(\tau) = R_{XX}(\tau) = \frac{A^2}{2}\cos(2\pi f \tau)$$

i.e., the time-average result (computed below) matches the probabilistic expectation.

Assuming a uniformly distributed random phase “simplifies the problem” …

Also of note, if the amplitude is an independent random variable, then

$$R_{XX}(\tau) = \frac{E[A^2]}{2}\cos(2\pi f \tau)$$

The time based formulation:

$$\Re_{XX}(\tau) = \langle x(t)\,x(t+\tau)\rangle = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} x(t)\,x(t+\tau)\, dt$$

$$\Re_{XX}(\tau) = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} A\sin(2\pi f t + \theta)\, A\sin(2\pi f (t+\tau) + \theta)\, dt$$

$$\Re_{XX}(\tau) = \frac{A^2}{2}\,\lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} \big[\cos(2\pi f \tau) - \cos(2\pi f (2t + \tau) + 2\theta)\big]\, dt$$

$$\Re_{XX}(\tau) = \frac{A^2}{2}\cos(2\pi f \tau)$$
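As a quick numerical check of this random-phase result (my own illustrative sketch, not from the textbook), the ensemble average E[X(t)X(t+τ)] can be approximated by averaging over many random draws of the phase; the amplitude, frequency, observation time, and lag grid below are arbitrary choices.

% Monte Carlo check of R_XX(tau) = (A^2/2)*cos(2*pi*f*tau) for x(t) = A*sin(2*pi*f*t + theta)
A = 2; f = 5; t1 = 0.3;              % fixed amplitude, frequency, and observation time (arbitrary)
tau = linspace(-0.5, 0.5, 201);      % lag values
Ntrials = 1e5;
theta = 2*pi*rand(Ntrials,1);        % phase uniform on (0, 2*pi)
x1 = A*sin(2*pi*f*t1 + theta);       % X(t1) for each trial
Rhat = zeros(size(tau));
for k = 1:length(tau)
    x2 = A*sin(2*pi*f*(t1 + tau(k)) + theta);   % X(t1 + tau)
    Rhat(k) = mean(x1 .* x2);                   % ensemble-average estimate of R_XX(tau)
end
Rtheory = (A^2/2)*cos(2*pi*f*tau);
plot(tau, Rhat, tau, Rtheory, '--'), legend('Monte Carlo estimate','(A^2/2)cos(2\pi f\tau)')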


Example:

$$x(t) = B\,\mathrm{rect}\!\left(\frac{t - t_0}{T}\right)$$

for $B = +A$ with probability p and $B = -A$ with probability $(1-p)$, and $t_0$ a uniformly distributed random variable on $\left(-\frac{T}{2}, \frac{T}{2}\right)$. Assume B and $t_0$ are independent.

$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)] = E\left[B\,\mathrm{rect}\!\left(\frac{t_1 - t_0}{T}\right) B\,\mathrm{rect}\!\left(\frac{t_2 - t_0}{T}\right)\right]$$

$$R_{XX}(t_1, t_2) = E\left[B^2\,\mathrm{rect}\!\left(\frac{t_1 - t_0}{T}\right) \mathrm{rect}\!\left(\frac{t_2 - t_0}{T}\right)\right]$$

As the random variables are independent,

$$R_{XX}(t_1, t_2) = E[B^2]\; E\left[\mathrm{rect}\!\left(\frac{t_1 - t_0}{T}\right) \mathrm{rect}\!\left(\frac{t_2 - t_0}{T}\right)\right]$$

$$R_{XX}(t_1, t_2) = \big[A^2 p + A^2 (1-p)\big]\; E\left[\mathrm{rect}\!\left(\frac{t_1 - t_0}{T}\right) \mathrm{rect}\!\left(\frac{t_2 - t_0}{T}\right)\right]$$

$$R_{XX}(t_1, t_2) = A^2\,\frac{1}{T}\int_{-T/2}^{T/2} \mathrm{rect}\!\left(\frac{t_1 - t_0}{T}\right) \mathrm{rect}\!\left(\frac{t_2 - t_0}{T}\right) dt_0$$

For $t_1 = 0$ and $t_2 = \tau$,

$$R_{XX}(0, \tau) = A^2\,\frac{1}{T}\int_{-T/2}^{T/2} \mathrm{rect}\!\left(\frac{t_0}{T}\right) \mathrm{rect}\!\left(\frac{\tau - t_0}{T}\right) dt_0$$

The integral can be recognized as being a triangle, extending from –T to T and zero everywhere else.

$$R_{XX}(\tau) = A^2\,\mathrm{tri}\!\left(\frac{\tau}{T}\right)$$

$$R_{XX}(\tau) = \begin{cases} 0, & \tau < -T \\ A^2\left(1 + \dfrac{\tau}{T}\right), & -T \le \tau \le 0 \\ A^2\left(1 - \dfrac{\tau}{T}\right), & 0 \le \tau \le T \\ 0, & \tau > T \end{cases}$$


The time based formulation:

$$\Re_{XX}(\tau) = \langle x(t)\,x(t+\tau)\rangle = \lim_{C\to\infty} \frac{1}{2C}\int_{-C}^{C} x(t)\,x(t+\tau)\, dt$$

$$\Re_{XX}(\tau) = \lim_{C\to\infty} \frac{1}{2C}\int_{-C}^{C} B\,\mathrm{rect}\!\left(\frac{t - t_0}{T}\right) B\,\mathrm{rect}\!\left(\frac{t + \tau - t_0}{T}\right) dt$$

Make the change of variable $t' = t - t_0$ in the integral, and average only over the finite interval T (the pulse duration):

$$\Re_{XX}(\tau) = B^2\,\frac{1}{T}\int_{-T/2}^{T/2} \mathrm{rect}\!\left(\frac{t'}{T}\right) \mathrm{rect}\!\left(\frac{t' + \tau}{T}\right) dt' = B^2\,\mathrm{tri}\!\left(\frac{\tau}{T}\right)$$

$$E\big[\Re_{XX}(\tau)\big] = E[B^2]\,\mathrm{tri}\!\left(\frac{\tau}{T}\right) = \big[A^2 p + A^2 (1-p)\big]\,\mathrm{tri}\!\left(\frac{\tau}{T}\right) = A^2\,\mathrm{tri}\!\left(\frac{\tau}{T}\right)$$


Properties of Autocorrelation Functions

1) $R_{XX}(0) = E[X^2] = \overline{X^2}$, or $\Re_{XX}(0) = \langle x(t)^2\rangle$

The mean squared value of the random process can be obtained by observing the zeroth lag of the autocorrelation function.

2) $R_{XX}(-\tau) = R_{XX}(\tau)$

The autocorrelation function is an even function in time. Only positive (or negative) lags need to be computed for an ergodic, WSS random process.

3) $|R_{XX}(\tau)| \le R_{XX}(0)$

The autocorrelation function is a maximum at 0. For periodic functions, other values may equal the zeroth lag, but never be larger.

4) If X has a DC component, then $R_{XX}$ has a constant component.

$$X(t) = \bar{X} + N(t)$$

$$R_{XX}(\tau) = \bar{X}^2 + R_{NN}(\tau)$$

Note that the mean value can be computed from the autocorrelation function constant!

5) If X has a periodic component, then $R_{XX}$ will also have a periodic component of the same period.

Think of:

$$X(t) = A\cos(\omega t + \theta), \quad \theta \in (0, 2\pi)$$

where A and ω are known constants and θ is a uniform random variable.

$$R_{XX}(\tau) = E[X(t)X(t+\tau)] = \frac{A^2}{2}\cos(\omega\tau)$$


5b) For signals that are the sum of independent random processes, the autocorrelation is the sum of the individual autocorrelation functions.

$$W(t) = X(t) + Y(t)$$

$$R_{WW}(\tau) = R_{XX}(\tau) + R_{YY}(\tau) + 2\bar{X}\bar{Y}$$

For non-zero mean functions (let w, x, y be zero mean and W, X, Y have means $\bar{W}$, $\bar{X}$, $\bar{Y}$),

$$R_{WW}(\tau) = \bar{W}^2 + R_{ww}(\tau) = \bar{X}^2 + R_{xx}(\tau) + \bar{Y}^2 + R_{yy}(\tau) + 2\bar{X}\bar{Y}$$

$$\bar{W}^2 + R_{ww}(\tau) = R_{xx}(\tau) + R_{yy}(\tau) + \big(\bar{X} + \bar{Y}\big)^2$$

Then we have

$$\bar{W}^2 = \big(\bar{X} + \bar{Y}\big)^2 \quad\text{and}\quad R_{ww}(\tau) = R_{xx}(\tau) + R_{yy}(\tau)$$

6) If X is ergodic and zero mean and has no periodic component, then

$$\lim_{|\tau|\to\infty} R_{XX}(\tau) = 0$$

No good proof provided.

The logical argument is that eventually new samples of a non-deterministic random process will become uncorrelated. Once the samples are uncorrelated, the autocorrelation goes to zero.

7) Autocorrelation functions cannot have an arbitrary shape. One way of specifying the permissible shapes is in terms of the Fourier transform of the autocorrelation function. That is, if

$$\mathcal{F}\big[R_{XX}(\tau)\big] = \int_{-\infty}^{\infty} R_{XX}(\tau)\, e^{-j\omega\tau}\, d\tau$$

then the restriction states that

$$\mathcal{F}\big[R_{XX}(\tau)\big] \ge 0 \quad \text{for all } \omega$$
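This nonnegativity can be checked numerically for any candidate shape. The sketch below is my own illustration (with an assumed unit-width triangular autocorrelation); it approximates the Fourier integral with a Riemann sum and confirms the transform stays nonnegative.

% Numerical check that the transform of a triangular autocorrelation is nonnegative
dtau = 0.01;
tau  = -1:dtau:1;
Rxx  = 1 - abs(tau);                 % assumed triangular autocorrelation, Rxx(0) = 1
w    = linspace(-40, 40, 1001);      % frequency grid (rad/s)
S    = zeros(size(w));
for k = 1:length(w)
    S(k) = sum(Rxx .* exp(-1j*w(k)*tau)) * dtau;   % Riemann-sum Fourier integral
end
min(real(S))     % >= 0 (up to numerical rounding); the exact transform has a sinc^2 shape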


Additional concepts that are useful, dealing with a constant offset:

$$X(t) = a + N(t), \quad N(t)\ \text{zero mean}$$

$$R_{XX}(\tau) = E[X(t)X(t+\tau)] = E\big[(a + N(t))(a + N(t+\tau))\big]$$

$$R_{XX}(\tau) = a^2 + E[N(t)N(t+\tau)] = a^2 + R_{NN}(\tau)$$

$$X(t) = a + Y(t)$$

$$R_{XX}(\tau) = E\big[(a + Y(t))(a + Y(t+\tau))\big] = E\big[a^2 + aY(t) + aY(t+\tau) + Y(t)Y(t+\tau)\big]$$

$$R_{XX}(\tau) = a^2 + aE[Y(t)] + aE[Y(t+\tau)] + E[Y(t)Y(t+\tau)]$$

$$R_{XX}(\tau) = a^2 + 2a\bar{Y} + R_{YY}(\tau)$$

Exercise 6-1.1

A random process has a sample function of the form

$$X(t) = \begin{cases} A, & 0 \le t \le 1 \\ 0, & \text{elsewhere} \end{cases}$$

where A is a random variable that is uniformly distributed from 0 to 10.

Find the autocorrelation of the process.

$$f_A(a) = \frac{1}{10}, \quad 0 \le a \le 10$$

Using

$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)]$$

$$R_{XX}(t_1, t_2) = E[A^2], \quad \text{for } 0 \le t_1 \le 1,\ 0 \le t_2 \le 1$$

$$R_{XX}(t_1, t_2) = \int_0^{10} a^2\,\frac{1}{10}\, da = \frac{1}{10}\cdot\frac{a^3}{3}\bigg|_0^{10} = \frac{1000}{30}$$

$$R_{XX}(t_1, t_2) = \frac{100}{3}, \quad \text{for } 0 \le t_1 \le 1,\ 0 \le t_2 \le 1 \ \ (\text{and zero otherwise})$$
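A one-line Monte Carlo check of the key expectation (my own illustration, not part of the exercise): for A uniform on (0, 10), E[A²] should come out near 100/3 ≈ 33.3.

% Monte Carlo check that E[A^2] = 100/3 for A uniform on (0, 10)
A = 10*rand(1e6, 1);     % samples of the uniform amplitude
mean(A.^2)               % should be close to 100/3 = 33.33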


Exercise 6-1.2

Define a random process Z(t) as

$$Z(t) = X(t+1) - X(t)$$

where X(t) is a sample function from a stationary random process whose autocorrelation function is

$$R_{XX}(\tau) = \exp(-\tau^2)$$

Write an expression for the autocorrelation function of Z(t).

$$R_{ZZ}(t_1, t_2) = E[Z(t_1)Z(t_2)]$$

$$R_{ZZ}(t_1, t_2) = E\big[(X(t_1+1) - X(t_1))(X(t_2+1) - X(t_2))\big]$$

$$R_{ZZ}(t_1, t_2) = E[X(t_1+1)X(t_2+1)] - E[X(t_1+1)X(t_2)] - E[X(t_1)X(t_2+1)] + E[X(t_1)X(t_2)]$$

Note: based on the autocorrelation definition, $\tau = t_2 - t_1$.

$$R_{ZZ}(t_1, t_2) = R_{XX}(t_2 - t_1) - R_{XX}(t_2 - t_1 - 1) - R_{XX}(t_2 - t_1 + 1) + R_{XX}(t_2 - t_1)$$

$$R_{ZZ}(t_1, t_2) = 2R_{XX}(\tau) - R_{XX}(\tau - 1) - R_{XX}(\tau + 1)$$

$$R_{ZZ}(\tau) = 2\exp(-\tau^2) - \exp(-(\tau-1)^2) - \exp(-(\tau+1)^2)$$

Note that for this example, the autocorrelation is strictly a function of tau and not t1 or t2. As a result, we expect that the random process is stationary.

Helpful in defining a stationary random process:

A stationary random process will have an autocorrelation that depends on the time difference, not the absolute times.


Autocorrelation of a Binary Process

Binary communications signals are discrete, non-deterministic signals with plenty of random variables involved in their generation.

Nominally, each binary bit lasts for a defined period of time, $t_a$, but the sequence starts at a random delay $t_0$ (uniformly distributed across one bit interval, $0 \le t_0 < t_a$). The amplitude may be a fixed constant or a random variable.

[Figure: a bipolar binary waveform with bit period $t_a$, random starting delay $t_0$, and amplitudes $\pm A$; bits bit(0) through bit(4) are shown.]

The bit values are independent, identically distributed with probability p that the amplitude is A and q = 1 – p that the amplitude is –A.

$$f_{t_0}(t_0) = \frac{1}{t_a}, \quad 0 \le t_0 \le t_a$$

Determine the autocorrelation of the bipolar binary sequence, assuming p = 0.5.

$$x(t) = \sum_k a_k\,\mathrm{rect}\!\left(\frac{t - t_0 - k t_a - \frac{t_a}{2}}{t_a}\right)$$

$$R_{XX}(t_1, t_2) = E[X(t_1)X(t_2)]$$

For samples more than one period apart, $|t_2 - t_1| > t_a$, we must consider

$$E[a_k a_j] = pA\cdot pA + pA\cdot(1-p)(-A) + (1-p)(-A)\cdot pA + (1-p)(-A)\cdot(1-p)(-A)$$

$$E[a_k a_j] = A^2\big(p^2 - 2p(1-p) + (1-p)^2\big) = A^2\big(4p^2 - 4p + 1\big)$$

For p = 0.5:

$$E[a_k a_j] = A^2\big(4p^2 - 4p + 1\big) = 0$$


For samples within one period, $|t_2 - t_1| < t_a$,

$$E[a_k a_k] = A^2 p + A^2 (1-p) = A^2$$

$$E[a_k a_{k+1}] = A^2\big(4p^2 - 4p + 1\big) = 0$$

There are two regions to consider: the portion where the two samples fall in the same bit, and the portion where they fall in adjacent bits. Averaged over the uniformly distributed delay $t_0$, the same-bit overlap is triangular. Therefore, for $0 \le \tau \le t_a$,

$$R_{XX}(\tau) = E[a_k a_k]\,\frac{1}{t_a}\int_{\tau}^{t_a} dt + E[a_k a_{k+1}]\,\frac{1}{t_a}\int_{0}^{\tau} dt$$

and for $-t_a \le \tau \le 0$,

$$R_{XX}(\tau) = E[a_k a_k]\,\frac{1}{t_a}\int_{0}^{t_a + \tau} dt + E[a_k a_{k-1}]\,\frac{1}{t_a}\int_{t_a + \tau}^{t_a} dt$$

With $E[a_k a_{k\pm 1}] = 0$ for p = 0.5, these reduce to

$$R_{XX}(\tau) = A^2\,\frac{1}{t_a}\int_{\tau}^{t_a} dt = A^2\left(1 - \frac{\tau}{t_a}\right), \quad 0 \le \tau \le t_a$$

$$R_{XX}(\tau) = A^2\,\frac{1}{t_a}\int_{0}^{t_a + \tau} dt = A^2\left(1 + \frac{\tau}{t_a}\right), \quad -t_a \le \tau \le 0$$

Therefore

$$R_{XX}(\tau) = \begin{cases} A^2\left(1 + \dfrac{\tau}{t_a}\right), & -t_a \le \tau \le 0 \\ A^2\left(1 - \dfrac{\tau}{t_a}\right), & 0 \le \tau \le t_a \end{cases}$$

or, recognizing the structure,

$$R_{XX}(\tau) = A^2\left(1 - \frac{|\tau|}{t_a}\right), \quad |\tau| \le t_a$$

This is simply a triangular function with a maximum of $A^2$, extending for a full bit period in both time directions.
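The triangular result is easy to confirm numerically. The sketch below is my own illustration (not from the textbook): it builds a long random bipolar NRZ waveform with a random starting delay and compares the unbiased correlation estimate against A²(1 − |τ|/t_a); the amplitude, samples per bit, and record length are arbitrary.

% Numerical check of R_XX(tau) = A^2*(1 - |tau|/ta) for a random bipolar binary waveform (p = 0.5)
A = 1;                                      % bit amplitude (+/- A)
ns = 20;                                    % samples per bit, so ta = ns samples
nbits = 5000;                               % number of bits in the record
bits = A*(2*(rand(nbits,1) > 0.5) - 1);     % i.i.d. +/-A bit values
x = kron(bits, ones(ns,1));                 % rectangular NRZ pulses
x = circshift(x, randi(ns) - 1);            % random starting delay t0
[Rhat, lags] = xcorr(x, 2*ns, 'unbiased');  % estimate over lags up to two bit periods
Rtheory = A^2 * max(1 - abs(lags)/ns, 0);
plot(lags, Rhat, lags, Rtheory, '--'), legend('estimate', 'A^2(1-|\tau|/t_a)')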


For unequal bit probability

$$R_{XX}(\tau) = \begin{cases} A^2\left[\left(1 - \dfrac{|\tau|}{t_a}\right) + \big(4p^2 - 4p + 1\big)\dfrac{|\tau|}{t_a}\right], & |\tau| \le t_a \\ A^2\big(4p^2 - 4p + 1\big), & |\tau| \ge t_a \end{cases}$$

As there are more of one bit or the other, there is always a positive correlation between bits (the curve is a minimum for p = 0.5) that peaks to $A^2$ at $\tau = 0$.

Note that if the amplitude is a random variable, the expected value of the bits must be further evaluated. Such as,

$$E[a_k a_k] = E\big[a_k^2\big] \qquad\text{and}\qquad E[a_k a_{k+1}] = E[a_k]\,E[a_{k+1}] = \bar{A}^2$$

In general, the autocorrelation of communications signal waveforms is important, particularly when we discuss the power spectral density later in the textbook.


Examples of discrete waveforms used for communications, signal processing, controls, etc.

(a) Unipolar RZ & NRZ, (b) Polar RZ & NRZ , (c) Bipolar NRZ , (d) Split-phase Manchester, (e) Polar quaternary NRZ.

From Chap. 11: A. Bruce Carlson, P.B. Crilly, Communication Systems, 5th ed., McGraw-Hill, 2010. ISBN: 978-0-07-338040-7

In general, a periodic bipolar “pulse” that is shorter in duration than the pulse period will have the autocorrelation function

$$R_{XX}(\tau) = A^2\,\frac{t_w}{t_p}\left(1 - \frac{|\tau|}{t_w}\right), \quad |\tau| \le t_w$$

for a pulse of width $t_w$ existing in a time period $t_p$, assuming that positive and negative levels are equally likely.


Exercise 6-3.1

a) An ergodic random process has an autocorrelation function of the form

$$R_{XX}(\tau) = 9\exp(-4|\tau|) + 16\cos(10\tau) + 16$$

Find the mean-square value, the mean value, and the variance of the process.

The mean-square (2nd moment) is

$$E[X^2] = R_{XX}(0) = 9 + 16 + 16 = 41$$

The constant portion of the autocorrelation represents the square of the mean. Therefore

$$\bar{X}^2 = 16 \quad\text{and}\quad \bar{X} = \pm 4$$

Finally, the variance can be computed as

$$\sigma_X^2 = E[X^2] - \bar{X}^2 = 41 - 16 = 25$$

b) An ergodic random process has an autocorrelation function of the form

$$R_{XX}(\tau) = \frac{4\tau^2 + 6}{\tau^2 + 1}$$

Find the mean-square value, the mean value, and the variance of the process.

The mean-square (2nd moment) is

$$E[X^2] = R_{XX}(0) = \frac{6}{1} = 6$$

The constant portion of the autocorrelation represents the square of the mean. Therefore

$$\bar{X}^2 = \lim_{\tau\to\infty} \frac{4\tau^2 + 6}{\tau^2 + 1} = 4 \quad\text{and}\quad \bar{X} = \pm 2$$

Finally, the variance can be computed as

$$\sigma_X^2 = E[X^2] - \bar{X}^2 = 6 - 4 = 2$$
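A tiny numerical sanity check of part (b), as my own illustration: evaluating R_XX at τ = 0 gives the mean-square value, and evaluating it at a very large lag approaches the constant (squared-mean) part.

% Sanity check of Exercise 6-3.1(b)
Rxx = @(tau) (4*tau.^2 + 6) ./ (tau.^2 + 1);
meansquare  = Rxx(0)                  % mean-square value, equals 6
squaredmean = Rxx(1e6)                % approaches the constant part, 4
variance    = meansquare - squaredmean   % approximately 2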


The Crosscorrelation Function

For two sample functions defined by samples in time of two random processes, how alike are the different samples?

Define: $X_1 = X(t_1)$ and $Y_2 = Y(t_2)$

The cross-correlation is defined as:

$$R_{XY}(t_1, t_2) = E[X_1 Y_2] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} x_1\, y_2\, f(x_1, y_2)\, dx_1\, dy_2$$

$$R_{YX}(t_1, t_2) = E[Y_1 X_2] = \int_{-\infty}^{\infty}\int_{-\infty}^{\infty} y_1\, x_2\, f(y_1, x_2)\, dy_1\, dx_2$$

The above function is valid for all processes, jointly stationary and non-stationary.

For jointly WSS processes:

$$R_{XY}(t_1, t_1 + \tau) = E[X(t)Y(t+\tau)] = R_{XY}(\tau)$$

$$R_{YX}(t_1, t_1 + \tau) = E[Y(t)X(t+\tau)] = R_{YX}(\tau)$$

Note: the order of the subscripts is important for cross-correlation!

If the processes are jointly ergodic, the time average is equivalent to the probabilistic expectation, or

$$\Re_{XY}(\tau) = \langle x(t)\,y(t+\tau)\rangle = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} x(t)\,y(t+\tau)\, dt$$

$$\Re_{YX}(\tau) = \langle y(t)\,x(t+\tau)\rangle = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} y(t)\,x(t+\tau)\, dt$$

and

$$\Re_{XY}(\tau) = R_{XY}(\tau), \qquad \Re_{YX}(\tau) = R_{YX}(\tau)$$


Properties of Crosscorrelation Functions

1) The values at the zeroth lag have no particular significance and do not represent mean-square values. It is true that the “ordered” crosscorrelations must be equal at 0:

$$R_{XY}(0) = R_{YX}(0) \qquad\text{or}\qquad \Re_{XY}(0) = \Re_{YX}(0)$$

2) Crosscorrelation functions are not generally even functions. However, there is an antisymmetry to the ordered crosscorrelations:

$$R_{XY}(-\tau) = R_{YX}(\tau)$$

For

$$\Re_{XY}(-\tau) = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} x(t)\,y(t-\tau)\, dt$$

Substitute $t' = t - \tau$:

$$\Re_{XY}(-\tau) = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} x(t'+\tau)\,y(t')\, dt'$$

$$\Re_{XY}(-\tau) = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} y(t')\,x(t'+\tau)\, dt' = \Re_{YX}(\tau)$$

3) The crosscorrelation does not necessarily have its maximum at the zeroth lag. This makes sense if you are correlating a signal with a time-delayed version of itself. The crosscorrelation should be a maximum when the lag equals the time delay!

It can be shown, however, that

$$\big|R_{XY}(\tau)\big| \le \sqrt{R_{XX}(0)\,R_{YY}(0)}$$

As a note, the crosscorrelation may not achieve this maximum anywhere …

4) If X and Y are statistically independent, then the ordering is not important:

$$R_{XY}(\tau) = E[X(t)Y(t+\tau)] = E[X(t)]\,E[Y(t+\tau)] = \bar{X}\bar{Y}$$

and

$$R_{XY}(\tau) = R_{YX}(\tau) = \bar{X}\bar{Y}$$


5) If X is a stationary random process and is differentiable with respect to time, the crosscorrelation of the signal and its derivative is given by

$$R_{X\dot{X}}(\tau) = \frac{dR_{XX}(\tau)}{d\tau}$$

Defining the derivative as a limit:

$$\dot{X}(t) = \lim_{\varepsilon\to 0} \frac{X(t+\varepsilon) - X(t)}{\varepsilon}$$

and the crosscorrelation

$$R_{X\dot{X}}(\tau) = E\big[X(t)\dot{X}(t+\tau)\big] = \lim_{\varepsilon\to 0} E\left[X(t)\,\frac{X(t+\tau+\varepsilon) - X(t+\tau)}{\varepsilon}\right]$$

$$R_{X\dot{X}}(\tau) = \lim_{\varepsilon\to 0} \frac{E[X(t)X(t+\tau+\varepsilon)] - E[X(t)X(t+\tau)]}{\varepsilon}$$

$$R_{X\dot{X}}(\tau) = \lim_{\varepsilon\to 0} \frac{R_{XX}(\tau+\varepsilon) - R_{XX}(\tau)}{\varepsilon} = \frac{dR_{XX}(\tau)}{d\tau}$$

Similarly,

$$R_{\dot{X}\dot{X}}(\tau) = -\frac{d^2 R_{XX}(\tau)}{d\tau^2}$$

That is all the properties for the crosscorrelation function.


Example: A delayed signal in noise … cross correlated with the original signal …

Given a signal $X(t)$ and

$$Y(t) = a\,X(t - t_d) + N(t)$$

find

$$R_{XY}(\tau) = E[X(t)Y(t+\tau)]$$

Then

$$R_{XY}(\tau) = E\big[X(t)\big(a\,X(t + \tau - t_d) + N(t+\tau)\big)\big] = a\,R_{XX}(\tau - t_d) + R_{XN}(\tau)$$

For noise independent of the signal,

$$R_{XN}(\tau) = \bar{X}\bar{N}$$

and therefore

$$R_{XY}(\tau) = a\,R_{XX}(\tau - t_d) + \bar{X}\bar{N}$$

For either X or N a zero-mean process, we would have

$$R_{XY}(\tau) = a\,R_{XX}(\tau - t_d)$$

Does this suggest a way to determine the time delay based on what you know of the autocorrelation?

The maximum of the autocorrelation is at zero; therefore, the maximum of the crosscorrelation can reveal the time delay!

Think of any electronic system based on the time delay of a known, transmitted signal pulse (e.g., radar or sonar ranging) …
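A minimal sketch of this idea (my own illustration, not from the textbook): cross-correlate a noisy, attenuated, delayed copy of a known pulse with the original and read the delay off the peak lag. The pulse shape, delay, gain, and noise level below are arbitrary assumptions.

% Estimate a time delay from the peak of the crosscorrelation
fs = 1000;                              % sample rate (Hz)
t  = (0:1999)'/fs;
x  = exp(-((t - 0.4)/0.02).^2);         % known transmitted pulse (arbitrary Gaussian shape)
td = 0.350;                             % true delay (s)
D  = round(td*fs);                      % delay in samples
y  = 0.8*[zeros(D,1); x(1:end-D)] + 0.1*randn(size(x));   % delayed, attenuated pulse in noise
[Rxy, lags] = xcorr(y, x);              % crosscorrelation peaks near lag = D
[~, imax] = max(Rxy);
td_hat = lags(imax)/fs                  % estimated delay, close to 0.350 s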


Back to autocorrelation examples, using a time average. Example: Rect. Function

From Dr. Severance’s notes;

$$X(t) = \sum_k A(k)\,\mathrm{rect}\!\left(\frac{t - t_0 - kT}{T}\right)$$

where the A(k) are zero-mean, identically distributed, independent random variables, $t_0$ is a random variable uniformly distributed over one bit period T, and T is a constant. k indexes the bit positions.

$$R_{XX}(t, t+\tau) = E\left[\sum_k A(k)\,\mathrm{rect}\!\left(\frac{t - t_0 - kT}{T}\right)\ \sum_j A(j)\,\mathrm{rect}\!\left(\frac{t + \tau - t_0 - jT}{T}\right)\right]$$

$$R_{XX}(t, t+\tau) = \sum_k \sum_j E\big[A(k)A(j)\big]\; E\!\left[\mathrm{rect}\!\left(\frac{t - t_0 - kT}{T}\right)\mathrm{rect}\!\left(\frac{t + \tau - t_0 - jT}{T}\right)\right]$$

but

$$E\big[A(k)A(j)\big] = \begin{cases} E[A^2], & k = j \\ 0, & k \ne j \end{cases}$$

Therefore (one summation goes away),

$$R_{XX}(t, t+\tau) = E[A^2]\,\sum_k E\!\left[\mathrm{rect}\!\left(\frac{t - t_0 - kT}{T}\right)\mathrm{rect}\!\left(\frac{t + \tau - t_0 - kT}{T}\right)\right]$$

Now using the time-average concept,

$$\Re_{XX}(\tau) = \langle x(t)\,x(t+\tau)\rangle = \lim_{T_o\to\infty} \frac{1}{2T_o}\int_{-T_o}^{T_o} x(t)\,x(t+\tau)\, dt$$

Establishing the time average as the average over the kth bit (notice that the delay is incorporated into the integration limits, and assuming that the signal is wide-sense stationary),

$$\Re_{XX}(\tau) = E[A^2]\,\frac{1}{T}\int_{t_0 - T/2}^{t_0 + T/2} \mathrm{rect}\!\left(\frac{t + \tau - t_0}{T}\right) dt$$


For $T \ge \tau \ge 0$, the two rect functions overlap from $t_0 - \frac{T}{2}$ to $t_0 + \frac{T}{2} - \tau$, so

$$\Re_{XX}(\tau) = E[A^2]\,\frac{1}{T}\left[\left(t_0 + \frac{T}{2} - \tau\right) - \left(t_0 - \frac{T}{2}\right)\right] = E[A^2]\,\frac{T - \tau}{T}$$

$$\Re_{XX}(\tau) = E[A^2]\left(1 - \frac{\tau}{T}\right), \quad 0 \le \tau \le T$$

Due to symmetry, $\Re_{XX}(-\tau) = \Re_{XX}(\tau)$, we have

$$\Re_{XX}(\tau) = E[A^2]\left(1 - \frac{|\tau|}{T}\right), \quad |\tau| \le T$$


Example: Rect. Function that does not extend for the full period

$$X(t) = \sum_k A(k)\,\mathrm{rect}\!\left(\frac{t - t_0 - kT}{\alpha T}\right)$$

where the A(k) are zero-mean, identically distributed, independent random variables, $t_0$ is a random variable uniformly distributed over one bit period T, and T and α are constants (with α < 1). k indexes the bit positions.

From the previous example, the average value during the time period was defined as:

$$\Re_{XX}(\tau) = E[A^2]\,\frac{1}{T}\int_{t_0 - T/2}^{t_0 + T/2} \mathrm{rect}\!\left(\frac{t + \tau - t_0}{T}\right) dt$$

For this example, the period remains the same, but the integration limits set by the rect functions change to:

$$\Re_{XX}(\tau) = E[A^2]\,\frac{1}{T}\int_{t_0 - \alpha T/2}^{t_0 + \alpha T/2} \mathrm{rect}\!\left(\frac{t + \tau - t_0}{\alpha T}\right) dt$$

For $\alpha T \ge \tau \ge 0$, the overlap runs from $t_0 - \frac{\alpha T}{2}$ to $t_0 + \frac{\alpha T}{2} - \tau$, so

$$\Re_{XX}(\tau) = E[A^2]\,\frac{1}{T}\left[\left(t_0 + \frac{\alpha T}{2} - \tau\right) - \left(t_0 - \frac{\alpha T}{2}\right)\right] = E[A^2]\,\frac{\alpha T - \tau}{T} = E[A^2]\,\alpha\left(1 - \frac{\tau}{\alpha T}\right), \quad 0 \le \tau \le \alpha T$$

Due to symmetry, we have

$$\Re_{XX}(\tau) = E[A^2]\,\alpha\left(1 - \frac{|\tau|}{\alpha T}\right), \quad |\tau| \le \alpha T$$

which matches the general pulse result above with $t_w = \alpha T$ and $t_p = T$.


Measurement of the Autocorrelation Function

We love to use time averages for everything. For wide-sense stationary, ergodic random processes, time averages are equivalent to statistical or probability-based values.

$$\Re_{XX}(\tau) = \langle x(t)\,x(t+\tau)\rangle = \lim_{T\to\infty} \frac{1}{2T}\int_{-T}^{T} x(t)\,x(t+\tau)\, dt$$

Using this fact, how can we use short-term time averages to generate auto- or cross-correlation functions?

An estimate of the autocorrelation is defined as:

$$\hat{R}_{XX}(\tau) = \frac{1}{T - \tau}\int_0^{T - \tau} x(t)\,x(t+\tau)\, dt$$

Note that the time average is performed across as much of the signal as is available after the time shift by tau.

In most practical cases, the computation is performed on digital samples taken at specific time intervals $\Delta t$. For lags based on the available time step, $\tau = k\,\Delta t$, with N + 1 samples spanning the available time interval, we have:

$$\hat{R}_{XX}(k\,\Delta t) = \frac{1}{N - k + 1}\sum_{i=0}^{N-k} x(i\,\Delta t)\; x\big((i + k)\,\Delta t\big)$$

or, writing $x_i = x(i\,\Delta t)$,

$$\hat{R}_{XX}(k) = \frac{1}{N - k + 1}\sum_{i=0}^{N-k} x_i\, x_{i+k}$$

In computing this autocorrelation, the weighting term $\frac{1}{N-k+1}$ approaches 1 when k = N. At this point the entire summation consists of one point and is therefore a poor estimate of the autocorrelation. For useful results, k << N!
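A direct implementation of the sampled estimator above, as a short sketch (my own, not from the textbook); for nonnegative lags it matches MATLAB's xcorr(x, maxlag, 'unbiased'), which uses the same 1/(N − k + 1) weighting.

% Unbiased autocorrelation estimate: Rhat(k) = 1/(N-k+1) * sum_{i=0..N-k} x_i * x_{i+k}
x = randn(1000, 1);                 % example data record, N + 1 = 1000 samples
N = length(x) - 1;
maxlag = 50;                        % keep k << N
Rhat = zeros(maxlag + 1, 1);
for k = 0:maxlag
    Rhat(k+1) = sum(x(1:N-k+1) .* x(1+k:N+1)) / (N - k + 1);
end
Rxc = xcorr(x, maxlag, 'unbiased');         % built-in estimator for the same lags
max(abs(Rhat - Rxc(maxlag+1:end)))          % agreement to numerical precision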


It can be shown that the expected value of this estimate equals the actual autocorrelation; therefore, this is an unbiased estimate.

$$E\big[\hat{R}_{XX}(k\,\Delta t)\big] = E\left[\frac{1}{N - k + 1}\sum_{i=0}^{N-k} x_i\, x_{i+k}\right] = \frac{1}{N - k + 1}\sum_{i=0}^{N-k} E\big[x_i\, x_{i+k}\big]$$

$$E\big[\hat{R}_{XX}(k\,\Delta t)\big] = \frac{1}{N - k + 1}\,(N - k + 1)\, R_{XX}(k\,\Delta t) = R_{XX}(k\,\Delta t)$$

As noted, the validity of each of the summed autocorrelation lags can and should be brought into question as k approaches N. As a result, a biased estimate of the autocorrelation is commonly used. The biased estimate is defined as:

$$\tilde{R}_{XX}(k) = \frac{1}{N + 1}\sum_{i=0}^{N-k} x_i\, x_{i+k}$$

Here, a constant weight instead of one based on the number of elements summed is used. This estimate has the property that the estimated autocorrelation decreases as k approaches N.

The expected value of this estimate can be shown to be

$$E\big[\tilde{R}_{XX}(k)\big] = \left(1 - \frac{k}{N + 1}\right) R_{XX}(k\,\Delta t)$$

The variance of this estimate can be shown to be (the math is not done at this level)

$$\mathrm{Var}\big[\tilde{R}_{XX}(k\,\Delta t)\big] \le \frac{2}{N}\sum_{k=-M}^{M} R_{XX}(k\,\Delta t)^2$$

This equation can be used to estimate the number of time samples needed for a useful estimate.
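As a sketch of that use (my own illustration, with an assumed exponential autocorrelation), the bound can be evaluated and solved for the record length N that keeps the estimator variance below a chosen target.

% Using the variance bound to size the data record (assumed Rxx(k*dt) = exp(-|k|*dt))
dt = 0.01;                          % sample spacing (s)
M  = 2000;                          % lag range over which Rxx is significant
k  = -M:M;
Rxx = exp(-abs(k)*dt);              % assumed autocorrelation values
target_var = 1e-3;                  % desired variance of the estimate
N = ceil(2*sum(Rxx.^2)/target_var)  % samples needed so that (2/N)*sum(Rxx.^2) <= target_var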


Matlab Example

%%
% Figure 6_2
%
clear; close all;
nsamples = 1000;
n = (0:1:nsamples-1)';
%%
% rect
rect_length = 100;
x = zeros(nsamples,1);
x(100+(1:rect_length)) = 1;
% Scale to make the maximum unity
Rxx1 = xcorr(x)/rect_length;
Rxx2 = xcorr(x,'unbiased');
nn = -(nsamples-1):1:(nsamples-1);
figure
plot(n,x)
figure
plot(nn,Rxx1,nn,Rxx2)
legend('Raw','Unbiased')

[Figure: the resulting plot of the ‘Raw’ and ‘Unbiased’ autocorrelation estimates versus lag, over lags −1000 to 1000 and amplitudes of roughly −0.2 to 1.2.]