Probability Theory and Random Processes

Post on 08-Feb-2016


Probability Theory and Random Processes

Communication Systems, 5ed., S. Haykin and M. Moher, John Wiley & Sons, Inc., 2006.

Probability
Probability theory is based on phenomena that can be modeled by an experiment with an outcome that is subject to chance.
Definition: A random experiment is repeated n times (n trials) and the event A is observed m times (m occurrences). The probability of A is the relative frequency of occurrence m/n.

Probability Based on Set Theory
Definition: An experiment has K possible outcomes, where the kth outcome is represented as the sample point sk. The set of all outcomes forms the sample space S. The probability measure P satisfies the
Axioms:
1. 0 ≤ P[A] ≤ 1
2. P[S] = 1
3. If A and B are two mutually exclusive events (the two events cannot occur in the same experiment), P[A ∪ B] = P[A] + P[B]; otherwise P[A ∪ B] = P[A] + P[B] − P[A ∩ B]
The complement satisfies P[Ā] = 1 − P[A].
If A1, A2, …, Am are mutually exclusive events that together cover the sample space, then P[A1] + P[A2] + … + P[Am] = 1.

Venn Diagrams
Events A and B mutually exclusive in the sample space S: a sample sk can come from A, from B, or from neither, but never from both.
Events A and B not mutually exclusive in the sample space S: a sample sk can also come from both A and B (the intersection A ∩ B).

Conditional Probability
Definition: An experiment involves a pair of events A and B, where the probability of one is conditioned on the occurrence of the other. Example: P[A|B] is the probability of event A given the occurrence of event B.
In terms of the sets and subsets:
P[A|B] = P[A ∩ B] / P[B]
P[A ∩ B] = P[A|B] P[B] = P[B|A] P[A]
Definition: If events A and B are independent, then the conditional probability is simply the elementary probability, e.g. P[A|B] = P[A], P[B|A] = P[B].

Random Variables
Definition: A random variable is the assignment of a variable to represent a random experiment.
X(s) denotes a numerical value for the outcome s. When the sample space is a number line, x = s.
Definition: The cumulative distribution function (cdf) assigns a probability value to the occurrence of X within a specified range: FX(x) = P[X ≤ x].
Properties:
0 ≤ FX(x) ≤ 1
FX(x1) ≤ FX(x2) if x1 ≤ x2 (nondecreasing)
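The probability axioms and the conditional-probability relation can be sanity-checked by simulation. A minimal sketch, assuming two illustrative die-roll events (A: the roll is even, B: the roll is 1 or 2); relative frequencies play the role of probabilities:

```python
import random

random.seed(0)

# Relative-frequency probabilities for two illustrative die-roll events:
# A: the roll is even, B: the roll is 1 or 2 (both chosen just for the demo)
n = 100_000
m_A = m_B = m_AB = m_AuB = 0
for _ in range(n):
    roll = random.randint(1, 6)
    in_A = roll % 2 == 0
    in_B = roll <= 2
    m_A += in_A
    m_B += in_B
    m_AB += in_A and in_B
    m_AuB += in_A or in_B

p_A, p_B = m_A / n, m_B / n
p_AB, p_AuB = m_AB / n, m_AuB / n
p_A_given_B = p_AB / p_B          # P[A|B] = P[A ∩ B] / P[B]

# Axiom 3 (general form): P[A ∪ B] = P[A] + P[B] - P[A ∩ B]
assert abs(p_AuB - (p_A + p_B - p_AB)) < 1e-12
```

The union identity holds exactly for the counts themselves, so it holds for the relative frequencies as well; the individual frequencies only approach the true probabilities as n grows.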

Random Variables
Definition: The probability density function (pdf) is an alternative description of the probability of the random variable X:
fX(x) = d/dx FX(x)
P[x1 < X ≤ x2] = P[X ≤ x2] − P[X ≤ x1] = FX(x2) − FX(x1) = ∫ fX(x) dx over the interval (x1, x2]

Example Distributions
Uniform distribution: fX(x) = 1/(b − a) for a ≤ x ≤ b, and 0 otherwise.
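The cdf/pdf relation can be checked numerically for the uniform distribution. A sketch with illustrative endpoints a = 2, b = 5 (the probe interval [x1, x2] is arbitrary):

```python
# Uniform distribution on [a, b]; endpoints a = 2, b = 5 are illustrative choices
a, b = 2.0, 5.0
pdf = lambda x: 1.0 / (b - a) if a <= x <= b else 0.0     # fX(x)
cdf = lambda x: min(max((x - a) / (b - a), 0.0), 1.0)     # FX(x)

# P[x1 < X <= x2] = FX(x2) - FX(x1) = integral of fX over (x1, x2]
x1, x2 = 2.5, 4.0
prob_from_cdf = cdf(x2) - cdf(x1)

steps = 10_000                     # midpoint-rule numerical integration
h = (x2 - x1) / steps
prob_from_pdf = sum(pdf(x1 + (k + 0.5) * h) for k in range(steps)) * h

assert abs(prob_from_cdf - 0.5) < 1e-9
assert abs(prob_from_pdf - prob_from_cdf) < 1e-6
```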

Several Random Variables
CDF: FX,Y(x, y) = P[X ≤ x, Y ≤ y]

Marginal cdf: FX(x) = FX,Y(x, ∞)

PDF: fX,Y(x, y) = ∂²FX,Y(x, y)/∂x∂y

Marginal pdf: fX(x) = ∫ fX,Y(x, y) dy over all y

Conditional pdf: fY|X(y|x) = fX,Y(x, y) / fX(x)
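The marginal and conditional relations have direct discrete analogues that are easy to verify. A sketch with a small, hypothetical joint pmf:

```python
# A small, hypothetical joint pmf p(x, y) on {0, 1} x {0, 1}
joint = {(0, 0): 0.1, (0, 1): 0.2, (1, 0): 0.3, (1, 1): 0.4}

# Marginals: sum the joint pmf over the other variable
p_x = {x: sum(p for (xx, y), p in joint.items() if xx == x) for x in (0, 1)}
p_y = {y: sum(p for (x, yy), p in joint.items() if yy == y) for y in (0, 1)}

# Conditional pmf: p(y | x) = p(x, y) / p(x)
p_y_given_x1 = {y: joint[(1, y)] / p_x[1] for y in (0, 1)}

assert abs(sum(joint.values()) - 1.0) < 1e-12
assert abs(p_x[1] - 0.7) < 1e-12 and abs(p_y[1] - 0.6) < 1e-12
assert abs(sum(p_y_given_x1.values()) - 1.0) < 1e-12      # a valid pmf
```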

Statistical Averages
Expected value: E[X] = μX = ∫ x fX(x) dx over all x

Function of a random variable: E[g(X)] = ∫ g(x) fX(x) dx over all x

Text Example 5.4

Statistical Averages
nth moments: E[X^n] = ∫ x^n fX(x) dx over all x

Central moments: E[(X − μX)^n] = ∫ (x − μX)^n fX(x) dx over all x
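As a concrete check, the moments of a uniform [0, 1] variable can be computed by numerical integration (theory: E[X^n] = 1/(n + 1), second central moment = 1/12):

```python
# Moments of a uniform [0, 1] random variable by midpoint-rule integration;
# for this density fX(x) = 1, so the integrals reduce to sums of powers
steps = 100_000
h = 1.0 / steps
xs = [(k + 0.5) * h for k in range(steps)]

moment = lambda n: sum(x ** n for x in xs) * h            # E[X^n]
mu = moment(1)                                            # mean = 0.5
central = lambda n: sum((x - mu) ** n for x in xs) * h    # E[(X - mu)^n]

assert abs(moment(2) - 1 / 3) < 1e-6
assert abs(central(2) - 1 / 12) < 1e-6
```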

The second moment is the mean-square value of X, E[X²]; the second central moment is the variance of X, σX² = E[(X − μX)²].

Joint Moments
Correlation: E[XY]
- The expected value of the product; also seen as a weighted inner product.

Covariance: Cov[X, Y] = E[(X − μX)(Y − μY)] = E[XY] − μX μY
- The correlation of the centered (zero-mean) variables.

Correlation coefficient: ρ = Cov[X, Y] / (σX σY)
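A quick simulation illustrates covariance and the correlation coefficient. A sketch assuming a hypothetical linear relation Y = 2X + N, with X and N independent standard Gaussians (so Cov[X, Y] = 2 and ρ = 2/√5):

```python
import random

random.seed(1)

# Hypothetical linear relation: Y = 2X + N, with X and N independent N(0, 1);
# theory: Cov[X, Y] = 2, Var[X] = 1, Var[Y] = 5, rho = 2 / sqrt(5)
n = 50_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [2.0 * x + random.gauss(0.0, 1.0) for x in xs]

mean = lambda v: sum(v) / len(v)
mx, my = mean(xs), mean(ys)

cov = mean([x * y for x, y in zip(xs, ys)]) - mx * my     # E[XY] - muX muY
var_x = mean([x * x for x in xs]) - mx * mx
var_y = mean([y * y for y in ys]) - my * my
rho = cov / (var_x ** 0.5 * var_y ** 0.5)                 # correlation coefficient

assert abs(cov - 2.0) < 0.1
assert abs(rho - 2 / 5 ** 0.5) < 0.02
```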

Random Processes
Definition: a random process is a time-varying random variable X(t).

Mean of the random process: μX(t) = E[X(t)]

Definition: a random process is first-order stationary if its first-order pdf does not change with time.

Definition: the autocorrelation is the expected value of the product of the process observed at two different times:
RX(t1, t2) = E[X(t1)X(t2)]

A process is stationary to second order if it has constant mean and variance and its autocorrelation depends only on the time difference t2 − t1.

Random Processes
Definition: the autocovariance of a stationary random process is
CX(t1, t2) = RX(t1, t2) − μX²

Properties of Autocorrelation
Definition: the autocorrelation of a stationary process depends only on the time difference τ:
RX(τ) = E[X(t)X(t + τ)]

Mean-square value: RX(0) = E[X²(t)]

Autocorrelation is an even function: RX(−τ) = RX(τ)

Autocorrelation has its maximum at zero: |RX(τ)| ≤ RX(0)

Example
Sinusoidal signal with random phase: X(t) = A cos(2πfct + Θ), where Θ is uniformly distributed over (−π, π].

Autocorrelation: RX(τ) = (A²/2) cos(2πfcτ)
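For X(t) = A cos(2πfct + Θ) with Θ uniform on (−π, π], the autocorrelation works out to RX(τ) = (A²/2) cos(2πfcτ). A Monte Carlo sketch with illustrative values A = 2, fc = 5 (the reference time t is arbitrary, since the process is stationary):

```python
import math, random

random.seed(2)

# X(t) = A*cos(2*pi*fc*t + Theta), Theta uniform on (-pi, pi];
# A = 2 and fc = 5 are illustrative choices, t is an arbitrary reference time
A, fc = 2.0, 5.0
t, tau = 0.3, 0.03
n_trials = 200_000

acc = 0.0
for _ in range(n_trials):
    theta = random.uniform(-math.pi, math.pi)
    acc += (A * math.cos(2 * math.pi * fc * t + theta)
            * A * math.cos(2 * math.pi * fc * (t + tau) + theta))
rx_est = acc / n_trials                     # Monte Carlo estimate of RX(tau)

rx_theory = (A ** 2 / 2) * math.cos(2 * math.pi * fc * tau)
assert abs(rx_est - rx_theory) < 0.05
```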

As X(t) is compared to itself at another time, we see the periodic behavior of X(t) in its correlation.

Cross-correlation
Two random processes have the cross-correlation RXY(t, u) = E[X(t)Y(u)].

For jointly wide-sense stationary processes, the cross-correlation depends only on the time difference: RXY(τ) = E[X(t)Y(t + τ)].

Example
Output of an LTI system when the input is a random process (Text 5.7).

Power Spectral Density
Definition: the Fourier transform of the autocorrelation function is called the power spectral density:
SX(f) = ∫ RX(τ) exp(−j2πfτ) dτ

Consider the units of X(t): Volts or Amperes. Autocorrelation is the projection of X(t) onto itself, so the resulting units are Watts (normalized to 1 Ohm).

Properties of PSD
Zero-frequency value of the PSD: SX(0) = ∫ RX(τ) dτ

Mean-square value: E[X²(t)] = ∫ SX(f) df

PSD is non-negative: SX(f) ≥ 0

PSD of a real-valued RP is an even function: SX(−f) = SX(f)

Which theorem does this property resemble?
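These properties can be observed on a simulated discrete-time process. A sketch using a hypothetical MA(1) process and the periodogram as a crude PSD estimate; the discrete analogue of the mean-square property is Parseval's relation:

```python
import cmath, math, random

random.seed(3)

# A hypothetical discrete-time WSS process: x[n] = w[n] + 0.5*w[n-1] (MA(1))
N = 256
w = [random.gauss(0.0, 1.0) for _ in range(N + 1)]
x = [w[n + 1] + 0.5 * w[n] for n in range(N)]

# Periodogram PSD estimate: S_k = |X_k|^2 / N, with X_k the DFT of x
X = [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
     for k in range(N)]
S = [abs(Xk) ** 2 / N for Xk in X]

mean_square = sum(v * v for v in x) / N

assert all(s >= 0 for s in S)                                  # non-negative
assert all(abs(S[k] - S[N - k]) < 1e-6 for k in range(1, N))   # even (real x)
assert abs(sum(S) / N - mean_square) < 1e-6                    # Parseval
```

The direct DFT here is O(N²) for clarity; an FFT would be used in practice.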

Example
Text Example 5.12: mixing of a random process with a sinusoidal process
Y(t) = X(t) cos(2πfct + Θ), with Θ uniformly distributed over (−π, π] and independent of X(t)

Autocorrelation: RY(τ) = (1/2) RX(τ) cos(2πfcτ)

PSD: SY(f) = (1/4)[SX(f − fc) + SX(f + fc)]

X(t) is wide-sense stationary (to make it easier); Θ is uniformly distributed but not time-varying.

PSD of LTI System
Start with what you know and work the math.

PSD of LTI System
The PSD reduces to SY(f) = |H(f)|² SX(f)
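The relation SY(f) = |H(f)|² SX(f) can be illustrated deterministically: a complex exponential is an eigenfunction of an LTI system, so its power at frequency f0 is scaled by exactly |H(f0)|². A sketch with a hypothetical two-tap FIR filter:

```python
import cmath, math

# Hypothetical two-tap FIR filter h[n]; H(f) is its frequency response
h = [1.0, 0.5]
H = lambda f: sum(hk * cmath.exp(-2j * math.pi * f * k) for k, hk in enumerate(h))

# A complex exponential at frequency f0 passes through the LTI system scaled
# by H(f0), so its power is scaled by |H(f0)|^2
f0, N = 0.1, 1000
x = [cmath.exp(2j * math.pi * f0 * n) for n in range(N)]
y = [h[0] * x[n] + h[1] * x[n - 1] for n in range(1, N)]   # y[n] = (h * x)[n]

power_in = sum(abs(v) ** 2 for v in x[1:]) / (N - 1)       # unit power
power_out = sum(abs(v) ** 2 for v in y) / (N - 1)

assert abs(power_out - abs(H(f0)) ** 2 * power_in) < 1e-9
```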

The system shapes the power spectrum of the input, as expected from a filtering-like operation.

Gaussian Process
The Gaussian probability density function for a single variable is
fX(x) = (1/(√(2π) σX)) exp(−(x − μX)²/(2σX²))

When the distribution has zero mean and unit variance,
fY(y) = (1/√(2π)) exp(−y²/2)

The random variable Y is said to be normally distributed as N(0,1)
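As a numerical check, the N(0, 1) density integrates to 1, and by symmetry P[Y ≤ 0] = 1/2. A sketch using midpoint-rule integration over [−8, 8] (the tails beyond are negligible):

```python
import math

# Standard normal pdf: fY(y) = exp(-y^2 / 2) / sqrt(2*pi)
pdf = lambda y: math.exp(-y * y / 2) / math.sqrt(2 * math.pi)

# Midpoint-rule integration over [-8, 8]
steps = 100_000
lo, hi = -8.0, 8.0
h = (hi - lo) / steps
total = sum(pdf(lo + (k + 0.5) * h) for k in range(steps)) * h
half = sum(pdf(lo + (k + 0.5) * h) for k in range(steps // 2)) * h  # up to y = 0

assert abs(total - 1.0) < 1e-6
assert abs(half - 0.5) < 1e-6                # P[Y <= 0] = 1/2 by symmetry
```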

Properties of a Gaussian Process
The output of an LTI system is Gaussian if the input is Gaussian.

The joint pdf is completely determined by the set of means and autocovariance functions of the samples of the Gaussian process

If a Gaussian process is wide-sense stationary, then it is also stationary in the strict sense.

If the samples of a Gaussian process are uncorrelated, then they are statistically independent.

Noise
Shot noise

Thermal noise

White noise

Narrowband noise