EAS 305 Random Processes Viewgraph 1 of 10
Random Processes
Definitions:
A random process is a family of random variables $\{X(t, \lambda)\}$ indexed by a parameter $t \in T$, where $T$ is called the index set.
The experiment outcome is $\lambda_i$, and $X(t, \lambda_i) = x_i(t)$ is a whole function of $t$. This real-valued function is called a sample function. The set of all sample functions is an ensemble.
[Figure: an ensemble of sample functions $x_1(t)$, $x_2(t)$, $x_3(t)$ corresponding to outcomes $\lambda_1$, $\lambda_2$, $\lambda_3$. Sampling the ensemble at fixed times $t_1$ and $t_2$ yields the random variables $X_1 = X(t_1)$ and $X_2 = X(t_2)$, taking the values $x_i(t_1)$ and $x_i(t_2)$.]
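These definitions can be sketched numerically. In the sketch below, a random-phase sinusoid (an illustrative choice, not a process from the slides) plays the role of $X(t, \lambda)$: each draw of the phase is one outcome $\lambda_i$ and fixes one whole sample function, while fixing $t = t_1$ instead gives an ordinary random variable across the ensemble.

```python
import numpy as np

rng = np.random.default_rng(0)
t = np.linspace(0.0, 1.0, 200)                 # index set T (discretized)

# Each outcome lambda_i fixes one whole sample function x_i(t):
# here, a 5 Hz sinusoid with a uniformly random phase (assumed example).
n_outcomes = 5000
phases = rng.uniform(0.0, 2.0 * np.pi, size=n_outcomes)
ensemble = np.cos(2.0 * np.pi * 5.0 * t[None, :] + phases[:, None])

# Fixing t = t1 turns the process into a random variable X(t1),
# whose statistics are taken across the ensemble (down the rows).
x_at_t1 = ensemble[:, 50]
print(x_at_t1.mean())                          # near 0 for a random phase
```

Each row of `ensemble` is one sample function; each column is one random variable of the family.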
EAS 305 Random Processes Viewgraph 2 of 10
Statistics of Random Processes
I. Distributions and Densities
Random process $X(t)$. For a particular value of $t$, say $t_1$, we have a random variable $X(t_1) = X_1$. The distribution function of this random variable is defined by

    F_X(x_1; t_1) = P\{X(t_1) \le x_1\},

and is called the first-order distribution of $X(t)$. The corresponding first-order density function is

    f_X(x_1; t_1) = \frac{\partial}{\partial x_1} F_X(x_1; t_1).

For $t_1$ and $t_2$, we get two random variables $X(t_1) = X_1$ and $X(t_2) = X_2$. Their joint distribution is called the second-order distribution:

    F_X(x_1, x_2; t_1, t_2) = P\{X(t_1) \le x_1, X(t_2) \le x_2\},

with corresponding second-order density function

    f_X(x_1, x_2; t_1, t_2) = \frac{\partial^2}{\partial x_1 \partial x_2} F_X(x_1, x_2; t_1, t_2).

The nth-order distribution and density functions are given by

    F_X(x_1, \dots, x_n; t_1, \dots, t_n) = P\{X(t_1) \le x_1, \dots, X(t_n) \le x_n\}, and

    f_X(x_1, \dots, x_n; t_1, \dots, t_n) = \frac{\partial^n}{\partial x_1 \cdots \partial x_n} F_X(x_1, \dots, x_n; t_1, \dots, t_n).
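The first-order distribution is just an ordinary CDF taken across the ensemble at a fixed time. A minimal numerical check, assuming a hypothetical random-phase sinusoid for which the CDF has a simple closed form:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000
phases = rng.uniform(0.0, 2.0 * np.pi, size=n)

t1 = 0.25
samples = np.cos(2.0 * np.pi * t1 + phases)    # the random variable X(t1)

# first-order distribution F_X(x1; t1) = P{ X(t1) <= x1 }
x1 = 0.5
F_hat = np.mean(samples <= x1)

# for cos(theta) with theta uniform on [0, 2pi), the closed form is
# F(x1) = 1 - arccos(x1)/pi, independent of t1
F_exact = 1.0 - np.arccos(x1) / np.pi
print(F_hat, F_exact)
```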
EAS 305 Random Processes Viewgraph 3 of 10
II. Statistical Averages (ensemble averages)

The mean of $X(t)$ is defined by

    \overline{X(t)} = E[X(t)] = \int_{-\infty}^{\infty} x\, f_X(x; t)\, dx.

The autocorrelation of $X(t)$ is defined by

    R_{XX}(t_1, t_2) = E[X(t_1) X(t_2)] = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} x_1 x_2\, f_X(x_1, x_2; t_1, t_2)\, dx_1\, dx_2.

The autocovariance of $X(t)$ is defined by

    C_{XX}(t_1, t_2) = E\{[X(t_1) - \overline{X(t_1)}][X(t_2) - \overline{X(t_2)}]\} = R_{XX}(t_1, t_2) - \overline{X(t_1)}\,\overline{X(t_2)}.

The nth joint moment of $X(t)$ is defined by

    E[X(t_1) X(t_2) \cdots X(t_n)] = \int_{-\infty}^{\infty} \cdots \int_{-\infty}^{\infty} x_1 x_2 \cdots x_n\, f_X(x_1, \dots, x_n; t_1, \dots, t_n)\, dx_1\, dx_2 \cdots dx_n.

III. Stationarity

Strict-Sense Stationarity

A random process is called strict-sense stationary (SSS) if its statistics are invariant with respect to any time shift, i.e.

    f_X(x_1, \dots, x_n; t_1, \dots, t_n) = f_X(x_1, \dots, x_n; t_1 + c, \dots, t_n + c).

It follows that $f_X(x_1; t_1) = f_X(x_1; t_1 + c)$ for any $c$; hence the first-order density of an SSS process $X(t)$ is independent of time: $f_X(x_1; t) = f_X(x_1)$. Similarly, $f_X(x_1, x_2; t_1, t_2) = f_X(x_1, x_2; t_1 + c, t_2 + c)$. By setting $c = -t_1$, we get $f_X(x_1, x_2; t_1, t_2) = f_X(x_1, x_2; t_2 - t_1)$. Thus, if $X(t)$ is SSS, the joint density of the random variables $X(t)$ and $X(t + \tau)$ is independent of $t$ and depends only on the time difference $\tau$.
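The SSS conclusion — that the joint statistics of $X(t)$ and $X(t+\tau)$ depend only on $\tau$ — can be checked empirically. The sketch below again uses the assumed random-phase sinusoid, whose autocorrelation has the closed form $R_{XX}(t_1, t_2) = \tfrac{1}{2}\cos\omega(t_2 - t_1)$:

```python
import numpy as np

rng = np.random.default_rng(2)
omega = 2.0 * np.pi
n = 200_000
phases = rng.uniform(0.0, 2.0 * np.pi, size=n)

def X(t):
    # random-phase sinusoid: a strict-sense stationary example process
    return np.cos(omega * t + phases)

# ensemble autocorrelation R_XX(t1, t2) = E[X(t1) X(t2)]
t1, t2 = 0.3, 0.8
r12 = np.mean(X(t1) * X(t2))

# shifting both times by c leaves it unchanged: only t2 - t1 matters
c = 5.0
r12_shifted = np.mean(X(t1 + c) * X(t2 + c))
print(r12, r12_shifted, 0.5 * np.cos(omega * (t2 - t1)))
```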
EAS 305 Random Processes Viewgraph 4 of 10
Wide-Sense Stationarity

A random process $X(t)$ is said to be wide-sense stationary (WSS) if its mean is constant (independent of time),

    E[X(t)] = \overline{X},

and its autocorrelation depends only on the time difference $\tau$:

    E[X(t) X(t + \tau)] = R_{XX}(\tau).

As a result, the autocovariance of a WSS process also depends only on the time difference $\tau$:

    C_{XX}(\tau) = R_{XX}(\tau) - \overline{X}^2.

Setting $\tau = 0$ in $E[X(t) X(t + \tau)] = R_{XX}(\tau)$ results in $E[X^2(t)] = R_{XX}(0)$. The average power of a WSS process is independent of time $t$, and equals $R_{XX}(0)$.

An SSS process is WSS, but a WSS process is not necessarily SSS.

Two processes $X(t)$ and $Y(t)$ are jointly wide-sense stationary (jointly WSS) if each is WSS and their cross-correlation depends only on the time difference $\tau$:

    R_{XY}(t, t + \tau) = E[X(t) Y(t + \tau)] = R_{XY}(\tau).

Also, the cross-covariance of jointly WSS $X(t)$ and $Y(t)$ depends only on the time difference $\tau$:

    C_{XY}(\tau) = R_{XY}(\tau) - \overline{X}\,\overline{Y}.

IV. Time Averages and Ergodicity

The time-averaged mean of a sample function $x(t)$ of a random process $X(t)$ is defined as

    \langle x(t) \rangle = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t)\, dt,

where the symbol $\langle \cdot \rangle$ denotes time averaging.

Similarly, the time-averaged autocorrelation of the sample function $x(t)$ is
EAS 305 Random Processes Viewgraph 5 of 10
    \langle x(t) x(t + \tau) \rangle = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} x(t) x(t + \tau)\, dt.

Both $\langle x(t) \rangle$ and $\langle x(t) x(t+\tau) \rangle$ are random variables, since they depend on which sample function resulted from the experiment. If $X(t)$ is stationary, then

    E[\langle x(t) \rangle] = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} E[x(t)]\, dt = \overline{X}:

the expected value of the time-averaged mean is equal to the ensemble mean. Also,

    E[\langle x(t) x(t+\tau) \rangle] = \lim_{T \to \infty} \frac{1}{T} \int_{-T/2}^{T/2} E[x(t) x(t+\tau)]\, dt = R_{XX}(\tau):

the expected value of the time-averaged autocorrelation is equal to the ensemble autocorrelation.

A random process $X(t)$ is ergodic if its time averages are the same for all sample functions, and are equal to the corresponding ensemble averages. In an ergodic process, all statistics can therefore be obtained from a single sample function.

A stationary process $X(t)$ is called ergodic in the mean if $\langle x(t) \rangle = \overline{X}$, and ergodic in the autocorrelation if $\langle x(t) x(t+\tau) \rangle = R_{XX}(\tau)$.

Correlations and Power Spectral Densities

Assume all random processes below are WSS.

I. Autocorrelation $R_{XX}(\tau)$:

    R_{XX}(\tau) = E[X(t) X(t+\tau)].

Properties of $R_{XX}(\tau)$: $R_{XX}(-\tau) = R_{XX}(\tau)$; $|R_{XX}(\tau)| \le R_{XX}(0)$; $R_{XX}(0) = E[X^2(t)]$.
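Ergodicity means a single sample function suffices. The sketch below (again using the assumed random-phase sinusoid, which is ergodic in the mean and autocorrelation) draws one outcome, forms one sample function, and compares its time averages with the ensemble values $\overline{X} = 0$ and $R_{XX}(\tau) = \tfrac{1}{2}\cos\omega\tau$:

```python
import numpy as np

rng = np.random.default_rng(3)
omega = 2.0 * np.pi * 3.0
phi = rng.uniform(0.0, 2.0 * np.pi)      # one outcome -> one sample function

fs = 2000.0                              # samples per unit time
t = np.arange(0.0, 200.0, 1.0 / fs)
x = np.cos(omega * t + phi)              # the single sample function x(t)

# time-averaged mean <x(t)>: should match the ensemble mean 0
time_mean = x.mean()

# time-averaged autocorrelation <x(t) x(t+tau)> at lag tau = lag/fs
lag = 100
time_acf = np.mean(x[:-lag] * x[lag:])
ens_acf = 0.5 * np.cos(omega * lag / fs) # ensemble R_XX(tau)
print(time_mean, time_acf, ens_acf)
```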
EAS 305 Random Processes Viewgraph 6 of 10
II. Cross-Correlation $R_{XY}(\tau)$:

    R_{XY}(\tau) = E[X(t) Y(t+\tau)].

Properties of $R_{XY}(\tau)$: $R_{XY}(-\tau) = R_{YX}(\tau)$; $|R_{XY}(\tau)| \le \sqrt{R_{XX}(0) R_{YY}(0)}$; $|R_{XY}(\tau)| \le \frac{1}{2}[R_{XX}(0) + R_{YY}(0)]$.

III. Autocovariance $C_{XX}(\tau)$:

    C_{XX}(\tau) = R_{XX}(\tau) - \overline{X}^2.

IV. Cross-Covariance $C_{XY}(\tau)$:

    C_{XY}(\tau) = R_{XY}(\tau) - \overline{X}\,\overline{Y}.

Two processes are (mutually) orthogonal if $R_{XY}(\tau) = 0$, and uncorrelated if $C_{XY}(\tau) = 0$.

V. Power Spectral Density $S_{XX}(\omega)$:

The power spectral density $S_{XX}(\omega)$ is the Fourier transform of $R_{XX}(\tau)$:

    S_{XX}(\omega) = \int_{-\infty}^{\infty} R_{XX}(\tau) e^{-j\omega\tau}\, d\tau.

Thus

    R_{XX}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{XX}(\omega) e^{j\omega\tau}\, d\omega.

Properties: $S_{XX}(\omega)$ is real; $S_{XX}(\omega) \ge 0$; it is an even function, $S_{XX}(-\omega) = S_{XX}(\omega)$; and (a Parseval-type relation)

    \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{XX}(\omega)\, d\omega = R_{XX}(0) = E[X^2(t)].

VI. Cross Spectral Densities

    S_{XY}(\omega) = \int_{-\infty}^{\infty} R_{XY}(\tau) e^{-j\omega\tau}\, d\tau,   S_{YX}(\omega) = \int_{-\infty}^{\infty} R_{YX}(\tau) e^{-j\omega\tau}\, d\tau.

Therefore

    R_{XY}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{XY}(\omega) e^{j\omega\tau}\, d\omega,   R_{YX}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{YX}(\omega) e^{j\omega\tau}\, d\omega.

Since $R_{XY}(\tau) = R_{YX}(-\tau)$, it follows that $S_{XY}(\omega) = S_{YX}(-\omega) = S_{YX}^{*}(\omega)$.
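The Parseval-type relation has an exact discrete analogue: for a finite record, the average of the periodogram over frequency equals the average power. A minimal sketch (the periodogram is only a crude PSD estimate, used here just to exhibit the identity and the even-symmetry property):

```python
import numpy as np

rng = np.random.default_rng(4)
n = 4096
x = rng.standard_normal(n)          # one finite record of a noise process

# periodogram: discrete estimate of the power spectral density
Xf = np.fft.fft(x)
S = (np.abs(Xf) ** 2) / n

# discrete Parseval-type relation:
# mean of S over frequency = average power = estimate of R_XX(0) = E[X^2]
print(S.mean(), np.mean(x ** 2))    # identical up to rounding

# even symmetry for a real-valued record: S[k] == S[n - k]
print(np.allclose(S[1:], S[1:][::-1]))
```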
EAS 305 Random Processes Viewgraph 7 of 10
Random Processes in Linear Systems
I. System Response:

Consider an LTI system with impulse response $h(t)$. The output is

    Y(t) = L[X(t)] = h(t) * X(t) = \int_{-\infty}^{\infty} h(\zeta) X(t - \zeta)\, d\zeta.

II. Mean and Autocorrelation of Output:

    E[Y(t)] = \overline{Y(t)} = E\left[\int_{-\infty}^{\infty} h(\zeta) X(t - \zeta)\, d\zeta\right] = \int_{-\infty}^{\infty} h(\zeta) E[X(t - \zeta)]\, d\zeta = \int_{-\infty}^{\infty} h(\zeta) \overline{X(t - \zeta)}\, d\zeta = h(t) * \overline{X(t)}.

    R_{YY}(t_1, t_2) = E[Y(t_1) Y(t_2)] = E\left[\int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\zeta) h(\mu) X(t_1 - \zeta) X(t_2 - \mu)\, d\zeta\, d\mu\right]
                     = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\zeta) h(\mu) E[X(t_1 - \zeta) X(t_2 - \mu)]\, d\zeta\, d\mu = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\zeta) h(\mu) R_{XX}(t_1 - \zeta, t_2 - \mu)\, d\zeta\, d\mu.

If the input $X(t)$ is WSS, then

    E[Y(t)] = \overline{Y} = \int_{-\infty}^{\infty} h(\zeta) \overline{X}\, d\zeta = \overline{X} \int_{-\infty}^{\infty} h(\zeta)\, d\zeta = \overline{X} H(0):

the mean of the output is a constant. Also,

    R_{YY}(t_1, t_2) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\zeta) h(\mu) R_{XX}(t_2 - t_1 + \zeta - \mu)\, d\zeta\, d\mu, so R_{YY}(\tau) = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\zeta) h(\mu) R_{XX}(\tau + \zeta - \mu)\, d\zeta\, d\mu:

$Y(t)$ is WSS.

III. Power Spectral Density of Output:

    S_{YY}(\omega) = \int_{-\infty}^{\infty} R_{YY}(\tau) e^{-j\omega\tau}\, d\tau = \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} \int_{-\infty}^{\infty} h(\zeta) h(\mu) R_{XX}(\tau + \zeta - \mu) e^{-j\omega\tau}\, d\tau\, d\zeta\, d\mu = |H(\omega)|^2 S_{XX}(\omega),

    R_{YY}(\tau) = \frac{1}{2\pi} \int_{-\infty}^{\infty} S_{YY}(\omega) e^{j\omega\tau}\, d\omega = \frac{1}{2\pi} \int_{-\infty}^{\infty} |H(\omega)|^2 S_{XX}(\omega) e^{j\omega\tau}\, d\omega.

The average power of the output is

    E[Y^2(t)] = R_{YY}(0) = \frac{1}{2\pi} \int_{-\infty}^{\infty} |H(\omega)|^2 S_{XX}(\omega)\, d\omega.
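For a flat (white) input spectrum, the output-power relation reduces in discrete time to $E[Y^2] = R_{XX}(0)\sum_k h_k^2$, by applying the Parseval-type relation to $|H(\omega)|^2$. A hedged sketch with a toy FIR filter (the taps are an arbitrary illustrative choice, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(5)
n = 500_000
x = rng.standard_normal(n)            # discrete white input: R_XX(0) = 1

h = np.array([0.25, 0.5, 0.25])       # impulse response of a toy LTI filter
y = np.convolve(x, h, mode="valid")   # Y = h * X

# predicted output power: R_XX(0) * sum(h^2), the discrete counterpart of
# E[Y^2] = (1/2pi) * integral |H(w)|^2 S_XX(w) dw   with S_XX flat
predicted = np.sum(h ** 2)
print(np.mean(y ** 2), predicted)
```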
EAS 305 Random Processes Viewgraph 8 of 10
Special Classes of Random Processes
I. Gaussian Random Process:
A random process $X(t)$ is Gaussian if, for every $n$ and every choice of times $t_1, \dots, t_n$, the random variables $X(t_1), \dots, X(t_n)$ are jointly Gaussian.
II. White Noise:

A random process $X(t)$ is white noise if $S_{XX}(\omega) = \eta/2$. Its autocorrelation is

    R_{XX}(\tau) = \frac{\eta}{2}\,\delta(\tau).

III. Band-Limited White Noise:

A random process $X(t)$ is band-limited white noise if

    S_{XX}(\omega) = \eta/2 for |\omega| \le \omega_B, and 0 for |\omega| > \omega_B.

Then

    R_{XX}(\tau) = \frac{1}{2\pi} \int_{-\omega_B}^{\omega_B} \frac{\eta}{2} e^{j\omega\tau}\, d\omega = \frac{\eta \omega_B}{2\pi} \cdot \frac{\sin \omega_B \tau}{\omega_B \tau}.
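The band-limited result can be checked by evaluating the inverse-transform integral numerically; the values of $\eta$, $\omega_B$, and $\tau$ below are arbitrary illustrative choices:

```python
import numpy as np

eta, wB, tau = 2.0, 10.0, 0.7              # arbitrary illustrative values

# R_XX(tau) = (1/2pi) * int_{-wB}^{wB} (eta/2) e^{j w tau} dw;
# the imaginary (sine) part integrates to zero by symmetry
w, dw = np.linspace(-wB, wB, 200_001, retstep=True)
integrand = (eta / 2.0) * np.cos(w * tau)
trapezoid = (integrand.sum() - 0.5 * (integrand[0] + integrand[-1])) * dw
R_num = trapezoid / (2.0 * np.pi)

# closed form from the slide: (eta*wB / 2pi) * sin(wB*tau)/(wB*tau)
R_closed = (eta * wB / (2.0 * np.pi)) * np.sin(wB * tau) / (wB * tau)
print(R_num, R_closed)
```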
EAS 305 Random Processes Viewgraph 9 of 10
IV. Narrowband Random Process:
Let $X(t)$ be a WSS process with zero mean. Let its power spectral density $S_{XX}(\omega)$ be nonzero only in a narrow frequency band of width $2W$ that is very small compared to a center frequency $\omega_c$; then $X(t)$ is a narrowband random process. When white or broadband noise is passed through a narrowband linear filter, narrowband noise results. When a sample function of the output is viewed on an oscilloscope, the observed waveform appears as a sinusoid of random amplitude and phase. Narrowband noise is conveniently represented by

    X(t) = V(t) \cos[\omega_c t + \phi(t)],

where $V(t)$ and $\phi(t)$ are the envelope function and phase function, respectively. From the trigonometric identity for the cosine of a sum we get the quadrature representation of the process:

    X(t) = V(t) \cos\phi(t) \cos\omega_c t - V(t) \sin\phi(t) \sin\omega_c t = X_c(t) \cos\omega_c t - X_s(t) \sin\omega_c t,

where

    X_c(t) = V(t) \cos\phi(t)  (in-phase component),
    X_s(t) = V(t) \sin\phi(t)  (quadrature component),

and conversely

    V(t) = \sqrt{X_c^2(t) + X_s^2(t)},   \phi(t) = \arctan\frac{X_s(t)}{X_c(t)}.

To detect the quadrature component $X_s(t)$ and the in-phase component $X_c(t)$ from $X(t)$, we use a pair of mixers followed by low-pass filters.
EAS 305 Random Processes Viewgraph 10 of 10
Properties of $X_c(t)$ and $X_s(t)$:

1. Same power spectrum:

    S_{X_c}(\omega) = S_{X_s}(\omega) = S_{XX}(\omega - \omega_c) + S_{XX}(\omega + \omega_c) for |\omega| \le W, and 0 elsewhere.

2. Same mean and variance as $X(t)$: $\overline{X}_c = \overline{X}_s = \overline{X} = 0$ and $\sigma_{X_c}^2 = \sigma_{X_s}^2 = \sigma_X^2$.

3. $X_c(t)$ and $X_s(t)$ are uncorrelated: $E[X_c(t) X_s(t)] = 0$.

4. If the input process $X(t)$ is Gaussian, then so are the in-phase and quadrature components.

5. If $X(t)$ is Gaussian, then for a fixed $t$, $V(t)$ is a random variable with a Rayleigh distribution, and $\phi(t)$ is a random variable uniformly distributed over $[0, 2\pi]$.
[Figure: quadrature detector. $X(t)$ is applied to two mixers driven by $2\cos\omega_c t$ and $-2\sin\omega_c t$; each product is passed through a low-pass filter, yielding $X_c(t)$ and $X_s(t)$, respectively.]
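The quadrature detector can be sketched directly: multiply $X(t)$ by $2\cos\omega_c t$ and $-2\sin\omega_c t$ and low-pass the products to reject the $2\omega_c$ terms. Everything below is an assumed toy setup (carrier, envelopes, and a simple moving-average filter standing in for the low-pass filters):

```python
import numpy as np

fs = 10_000.0                               # sample rate (assumed)
wc = 2.0 * np.pi * 1000.0                   # carrier at 1 kHz (assumed)
t = np.arange(0.0, 1.0, 1.0 / fs)

# slowly varying in-phase and quadrature components (narrowband premise)
Xc = np.cos(2.0 * np.pi * 2.0 * t)
Xs = 0.5 * np.sin(2.0 * np.pi * 3.0 * t)
X = Xc * np.cos(wc * t) - Xs * np.sin(wc * t)

def lowpass(sig, n=5):
    # moving average whose spectral nulls fall on 2*wc -- a stand-in LPF
    return np.convolve(sig, np.ones(n) / n, mode="same")

# mixers from the diagram, then low-pass:
# X * 2cos(wc t)  = Xc + Xc*cos(2 wc t) - Xs*sin(2 wc t)
# X * -2sin(wc t) = Xs - Xs*cos(2 wc t) - Xc*sin(2 wc t)
Xc_hat = lowpass(X * 2.0 * np.cos(wc * t))
Xs_hat = lowpass(X * (-2.0) * np.sin(wc * t))

mid = slice(10, -10)                        # skip filter edge effects
err = max(np.max(np.abs(Xc_hat[mid] - Xc[mid])),
          np.max(np.abs(Xs_hat[mid] - Xs[mid])))
print(err)                                  # small residual ripple
```

The length-5 moving average nulls frequencies at multiples of fs/5 = 2 kHz, which is exactly where the unwanted $2\omega_c$ products sit.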