# 1. Random Processes - MIT


13.42 Design Principles for Ocean Vehicles, Prof. A.H. Techet, Spring 2005

## 1. Random Processes

A random variable, $x(\zeta)$, can be defined from a random event, $\zeta$, by assigning values $x_i$ to each possible outcome, $A_i$, of the event. Next define a random process, $x(t, \zeta)$, a function of both the event and time, by assigning to each outcome of a random event, $\zeta$, a function in time, $x_1(t)$, chosen from a set of functions, $x_i(t)$:

$$
\begin{matrix}
A_1 & \to & p_1 & \to & x_1(t) \\
A_2 & \to & p_2 & \to & x_2(t) \\
\vdots & & \vdots & & \vdots \\
A_n & \to & p_n & \to & x_n(t)
\end{matrix}
\tag{6}
$$

This menu of functions, $x_i(t)$, is called the ensemble (set) of the random process and may contain infinitely many $x_i(t)$, which can be functions of many independent variables.

EXAMPLE: Roll the dice. The outcome is $A_i$, where $i = 1{:}6$ is the number on the face of the die, and choose some function

$$x_i(t) = t^i \tag{7}$$

to be the random process.

## 3.1. Averages of a Random Process

Since a random process is a function of time we can find the averages over some period of time, $T$, or over a series of events. The calculation of the average and variance in time is different from the calculation of the statistics, or expectations, as discussed previously.

TIME AVERAGE (Temporal Mean):

$$M\{x_i(t)\} = \lim_{T \to \infty} \frac{1}{T}\int_0^T x_i(t)\,dt = \overline{x_i}^{\,t} \tag{8}$$

TIME VARIANCE (Temporal Variance):

$$V^t\{x_i(t)\} = \lim_{T \to \infty} \frac{1}{T}\int_0^T \left[x_i(t) - M\{x_i(t)\}\right]^2 dt \tag{9}$$

TEMPORAL CROSS/AUTO-CORRELATION: This gives us the correlation, or similarity, between the signal and its time-shifted version.

$$R_i^t(\tau) = \lim_{T \to \infty} \frac{1}{T}\int_0^T \left[x_i(t) - M\{x_i(t)\}\right]\left[x_i(t+\tau) - M\{x_i(t)\}\right] dt \tag{10}$$

- $\tau$ is the correlation variable (time shift).
- $|R_i^t|$ is between 0 and 1.
- If $R_i^t$ is large (i.e. $R_i^t(\tau) \to 1$) then $x_i(t)$ and $x_i(t+\tau)$ are similar.
For example, a sinusoidal function is similar to itself delayed by one or more periods.

- If $R_i^t$ is small then $x_i(t)$ and $x_i(t+\tau)$ are not similar; for example, white noise would result in $R_i^t(\tau) = 0$.

EXPECTED VALUE:

$$\mu_x(t_1) = E\{x(t_1, \zeta)\} = \int_{-\infty}^{\infty} x\, f(x, t_1)\, dx \tag{11}$$

STATISTICAL VARIANCE:

$$\sigma_x^2(t_1) = E\left\{[x(t_1, \zeta) - \mu_x(t_1)]^2\right\} = \int_{-\infty}^{\infty} [x - \mu_x(t_1)]^2 f(x, t_1)\, dx \tag{12}$$

AUTO-CORRELATION:

$$R_{xx}(t_1, t_2) = E\left\{\left[x(t_1, \zeta) - E\{x(t_1, \zeta)\}\right]\left[x(t_2, \zeta) - E\{x(t_2, \zeta)\}\right]\right\} \tag{13}$$

Example: Roll the dice, $k = 1{:}6$, and assign to the event $A_k$ a random process function:

$$x_k(t) = a \cos(k \omega_o t) \tag{14}$$

Evaluate the time statistics:

MEAN: $M^t\{x_k(t)\} = \lim_{T \to \infty} \frac{1}{T}\int_0^T a \cos(k \omega_o t)\, dt = 0$

VARIANCE: $V^t\{x_k(t)\} = \lim_{T \to \infty} \frac{1}{T}\int_0^T a^2 \cos^2(k \omega_o t)\, dt = \frac{a^2}{2}$

CORRELATION: $R^t\{x_k(t)\} = \lim_{T \to \infty} \frac{1}{T}\int_0^T a^2 \cos(k \omega_o t) \cos(k \omega_o [t + \tau])\, dt = \frac{a^2}{2}\cos(k \omega_o \tau)$

Looking at the correlation function we see that if $k \omega_o \tau = \pi/2$ then the correlation is zero; for this example it would be the same as taking the correlation of a sine with a cosine, since cosine is simply the sine function phase-shifted by $\pi/2$, and cosine and sine are not correlated.

Now if we look at the STATISTICS of the random process, for some time $t = t_o$,

$$x_k(t_o, \zeta) = a \cos(k \omega_o t_o) = y(\zeta_k) \tag{15}$$

where $k$ is the random variable ($k = 1, 2, 3, 4, 5, 6$) and each event has probability $p_i = 1/6$.

EXPECTED VALUE: $E\{y(\zeta_k)\} = \sum_{k=1}^{6} p_k x_k = \frac{1}{6}\sum_{k=1}^{6} a \cos(k \omega_o t_o)$

VARIANCE: $V\{y(\zeta_o)\} = \frac{1}{6}\sum_{k=1}^{6} a^2 \cos^2(k \omega_o t_o)$

CORRELATION: $R_{yy}(t_o, \tau) = E\{y(t_o, \zeta_k)\, y(t_o + \tau, \zeta_k)\}$

STATISTICS $\ne$ TIME AVERAGES

In general the expected value does not match the time-averaged value of a function, i.e.
</p><p>the statistics are time dependent whereas the time averages are time independent. </p><p>2. Stationary Random Processes </p><p>A stationary random process is a random process, ( )X t , , whose statistics (expected </p><p>values) are independent of time. For a stationary random process: </p><p>1 1( ) { ( )} ( )x t E x t f t = , </p><p>2 221 1 1( ) ( ) [ ( ) ( )]x x xV t t E x t t </p><p>= = =</p><p>( ) ( ) ( )xx xxR t R f t , = = </p><p>( ) ( 0) ( )V t R t V f t= , = </p><p>The statistics, or expectations, of a stationary random process are NOT necessarily equal to </p><p>the time averages. However for a stationary random process whose statistics ARE equal to </p><p>the time averages is said to be ERGODIC. </p></li><li><p>EXAMPLE: Take some random process defined by ( )y t , : </p><p> ( ) cos( ( ))oy t a t , = + (16) </p><p> ( ) cos( )i o iy t a t = + (17) </p><p>where ( ) is a random variable which lies within the interval 0 to 2 , with a constant, </p><p>uniform PDF such that </p><p> 1 2 for(0 2 )</p><p>( )0 else</p><p>f </p><p>/ ; </p><p>= ; (18) </p><p>STATISTICAL AVERAGE: the statistical mean is not a function of time. </p><p> 2</p><p>0</p><p>1{ ( )} cos( ) 02o o o</p><p>E y t a t d</p><p>= + = (19) </p><p>STATISTICAL VARIANCE: Variance is also independent of time. </p><p> 2</p><p>( ) ( 0)2oaV t R = = = (20) </p><p>STATISTICAL CORRELATION: Correlation is not a function of t, is a constant. </p><p> 2 2</p><p>0</p><p>2</p><p>{ ( ) ( )} ( )1 cos( ) cos( [ ] )</p><p>21 cos2</p><p>o o o</p><p>o o o o</p><p>o</p><p>E y t y t R t</p><p>a t t d</p><p>a</p><p>, + , = ,</p><p>= + + +</p><p>=</p><p> (21) </p><p>Since statistics are independent of time this is a stationary process! 
Let's next look at the temporal averages for this random process:

MEAN (TIME AVERAGE):

$$m\{y(t, \zeta_i)\} = \lim_{T \to \infty} \frac{1}{T}\int_0^T a \cos(\omega_o t + \theta_i)\, dt = \lim_{T \to \infty} \frac{a}{\omega_o T}\Big[\sin(\omega_o t + \theta_i)\Big]_0^T = 0 \tag{22}$$

TIME VARIANCE:

$$V^t = R^t(0) = \frac{a^2}{2} \tag{23}$$

CORRELATION:

$$R^t(\tau) = \lim_{T \to \infty} \frac{1}{T}\int_0^T a^2 \cos(\omega_o t + \theta_i)\cos(\omega_o [t + \tau] + \theta_i)\, dt = \frac{a^2}{2}\cos(\omega_o \tau) \tag{24}$$

STATISTICS = TIME AVERAGES

Therefore the process is considered to be an ERGODIC random process!

N.B.: This particular random process will be the building block for simulating water waves.

## 3. Ergodic Random Processes

Given the random process $y(t, \zeta)$ it is simplest to assume that its expected value is zero. Thus, if the expected value equals some constant, $E\{x(t, \zeta)\} = x_o$, where $x_o \ne 0$, then we can simply adjust the random process such that the expected value is indeed zero: $y(t, \zeta) = x(t, \zeta) - x_o$.

The autocorrelation function $R(\tau)$ is then

$$R(\tau) = E\{y(t, \zeta)\, y(t + \tau, \zeta)\} = R^t(\tau) = \lim_{T \to \infty} \frac{1}{T}\int_0^T y(t, \zeta_i)\, y(t + \tau, \zeta_i)\, dt \tag{25}$$

with CORRELATION PROPERTIES:

1. $R(0) = \text{variance} = \sigma^2 = (\text{RMS})^2 \ge 0$
2. $R(\tau) = R(-\tau)$
3. $R(0) \ge |R(\tau)|$

EXAMPLE: Consider the following random process, a summation of cosines of different frequencies, similar to water waves:

$$y(t, \zeta) = \sum_{n=1}^{N} a_n \cos(\omega_n t + \psi_n(\zeta)) \tag{26}$$

where the $\psi_n(\zeta)$ are all independent random variables in $[0, 2\pi]$ with a uniform PDF. This random process is stationary and ergodic with an expected value of zero.

The autocorrelation $R(\tau)$ is

$$R(\tau) = \sum_{n=1}^{N} \frac{a_n^2}{2}\cos(\omega_n \tau) \tag{27}$$
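Because the sum-of-cosines process is ergodic, the autocorrelation computed as a time average over one long realization should reproduce the closed-form ensemble result. The sketch below checks this numerically for a two-component example; the amplitude/frequency pairs, time step, window length, and lag are illustrative assumptions, not values from the notes.

```python
import math
import random

random.seed(1)

# Assumed illustrative two-component "spectrum" (a_n, w_n) with random phases
amps = [1.0, 0.5]
freqs = [2.0 * math.pi, 6.0 * math.pi]
psis = [random.uniform(0.0, 2.0 * math.pi) for _ in amps]

# One long realization of y(t) = sum_n a_n cos(w_n t + psi_n)
dt, N = 0.001, 200_000  # window T = N*dt = 200 s approximates T -> infinity
samples = [
    sum(a * math.cos(w * k * dt + p) for a, w, p in zip(amps, freqs, psis))
    for k in range(N)
]

def R_time(tau):
    """Autocorrelation as a time average over the single realization."""
    s = round(tau / dt)
    n = N - s
    return sum(samples[k] * samples[k + s] for k in range(n)) / n

tau = 0.2
# Closed-form ensemble autocorrelation: sum_n (a_n^2 / 2) cos(w_n * tau)
R_formula = sum(a * a / 2.0 * math.cos(w * tau) for a, w in zip(amps, freqs))

print(R_time(tau))   # time average over one realization...
print(R_formula)     # ...agrees with the ensemble formula, since the process is ergodic
```

The agreement holds for any lag and any draw of the phases, which is what makes a single measured wave record usable as a statistical description of the sea state.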