# Probability, Random Variables and Random Processes: Part 1

Post on 14-Oct-2014


A short introduction to Probability, Random Variables and Random Processes: Part 1 of 3

## Transcript

Probability, Random Variables and Random Processes with Applications to Signal Processing
Ganesh G., Member (Research Staff), Central Research Laboratory, Bharat Electronics Limited, Bangalore-13

02/May/2007

1 of 30, ganesh.crl@gmail.com


Organization of the Topic: Probability, Random Variables and Random Processes with Applications to Signal Processing

- Part 1: Probability
- Part 2: Random Variables with Applications to Signal Processing
- Part 3: Random Processes with Applications to Signal Processing


Contents 1. Probability

- Why study Probability
- Four approaches to Probability definition
- A priori and A posteriori Probabilities
- Concepts of Joint, Marginal, Conditional and Total Probabilities
- Bayes' Theorem and its applications
- Independent events and their properties
- Tips and Tricks
- Example


Contents 2. Random Variables

- The Concept of a Random Variable
- Distribution and Density functions
- Discrete, Continuous and Mixed Random variables
- Specific Random variables: Discrete and Continuous
- Conditional and Joint Distribution and Density functions
- Functions of One Random Variable
- Transformations of Continuous and Discrete Random variables
- Expectation
- Moments: Moments about the origin, Central Moments, Variance and Skew
- Characteristic Function and Moment Generating Functions
- Chebyshev and Schwarz Inequalities
- Chernoff Bound


Notes:

1. Distribution and Density functions / Discrete, Continuous and Mixed Random variables / Specific Random variables: Discrete and Continuous / Conditional and Joint Distribution and Density functions
2. Why functions of Random variables are important to signal processing / Transformations of Continuous and Discrete Random variables / Expectation / Moments: Moments about the origin, Central Moments, Variance and Skew / Functions that give Moments: Characteristic Function and Moment Generating Functions / Chebyshev and Schwarz Inequalities / Chernoff Bound

Contents 2. Random Variables Contd.

- Multiple Random Variables
  - Joint distribution and density functions
  - Joint Moments (Covariance, Correlation Coefficient, Orthogonality) and Joint Characteristic Functions
  - Conditional distribution and density functions
- Random Vectors and Parameter Estimation
  - Expectation Vectors and Covariance Matrices
  - MMSE Estimator and ML Estimator
- Sequences of Random Variables
  - Random Sequences and Linear Systems
  - WSS and Markov Random Sequences
  - Stochastic Convergence and Limit Theorems / Central Limit Theorem
  - Laws of Large Numbers

Notes:

1. Functions of two Random variables / Joint distribution and density functions / Joint Moments (Covariance, Correlation Coefficient, Orthogonality) and Joint Characteristic Functions / Conditional distribution and density functions
2. Expectation Vectors and Covariance Matrices / Linear Estimator, MMSE Estimator / ML Estimators {S&W}*
3. Random Sequences and Linear Systems / WSS Random Sequences / Markov Random Sequences {S&W} / Stochastic Convergence and Limit Theorems / Central Limit Theorem {Papoulis} {S&W} / Laws of Large Numbers {S&W}

*Note: Shown inside the brackets {..} are codes for Reference Books. See page 30 of 30 of this document for references.

Contents 3. Random Processes

- Introduction to Random Processes
  - The Random Process Concept
  - Stationarity, Time Averages and Ergodicity
  - Some important Random Processes: Wiener and Markov Processes
- Spectral Characteristics of Random Processes
- Linear Systems with Random Inputs
  - White Noise
  - Bandpass, Bandlimited and Narrowband Processes
- Optimum Linear Systems
  - Systems that maximize SNR
  - Systems that minimize MSE


Notes:

1. Correlation functions of Random Processes and their properties {Peebles}; {S&W}; {Papoulis}
2. Power Spectral Density and its properties, relationship with autocorrelation; White and Colored Noise concepts and definitions {Peebles}
3. Spectral Characteristics of LTI System response; Noise Bandwidth {Peebles}; {S&W}
4. Matched Filter for Colored Noise/White Noise; Wiener Filters {Peebles}

Contents 3. Random Processes Contd.

- Some Practical Applications of the Theory
  - Noise in an FM Communication System
  - Noise in a Phase-Locked Loop
  - Radar Detection using a single Observation
  - False Alarm Probability and Threshold in GPS
- Applications to Statistical Signal Processing
  - Wiener Filters for Random Sequences
  - Expectation-Maximization (E-M) Algorithm
  - Hidden Markov Models (and their specifications)
  - Spectral Estimation
  - Simulated Annealing


Notes:

1. {Peebles}; consider the Code Acquisition scenario in GPS applications as one example of finding the false alarm rate
2. Kalman Filtering; applications of Hidden Markov Models (HMMs) to Speech Processing {S&W}

Probability: Part 1 of 3


Why study Probability

- Probability plays a key role in the description of noise-like signals.
- There are nearly uncountably many situations where we cannot make any categorical deterministic assertion regarding a phenomenon, because we cannot measure all the contributing elements.
- Probability is a mathematical model that helps us study physical systems in an average sense.
- Probability deals with averages of mass phenomena occurring sequentially or simultaneously: Noise, Radar Detection, System Failure, etc.

Notes:

1. {R.G.Brown}, pp. 1
2. {S&W}, pp. 2
3. {S&W}, pp. 2
4. {Papoulis}, pp. 1 [4.1: add electron emission, telephone calls, queueing theory, quality control, etc.]
5. Extra: {Peebles}, pp. 2: [How do we characterize random signals? One: how to describe any one of a variety of random phenomena (the contents shown in Random Variables are required); Two: how to bring time into the problem so as to create the random signal of interest (the contents shown in Random Processes are required).] ALL these CONCEPTS are based on PROBABILITY Theory.

Four approaches to Probability definition

1. Probability as Intuition
2. Probability as the ratio of Favorable to Total Outcomes:

   P[A] = n_E / n

3. Probability as a measure of Frequency of Occurrence:

   P[A] = lim (n → ∞) n_E / n


Notes:

1. Refer to their failures in {Papoulis}, pp. 6-7
2. {S&W}, pp. 2-4
3. Slide not required!? Only of Historical Importance?
4. The Classical Theory (ratio of Favorable to Total Outcomes) approach cannot deal with outcomes that are not equally likely, and it cannot handle uncountably infinite outcomes without ambiguity.
5. The problem with the relative frequency approach is that we can never perform the experiment an infinite number of times, so we can only estimate P(A) from a finite number of trials. Despite this, the approach is essential in applying probability theory to the real world.
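The point in note 5 can be sketched in a few lines of Python: a simulation can only ever approximate the limit in the relative-frequency definition, but the estimate n_E / n does settle toward the true probability as the number of trials grows. (A minimal illustrative sketch, not from the slides; the event and trial counts are chosen arbitrarily.)

```python
import random

# Relative-frequency estimate of P(A) for A = "a fair die shows an even number".
# The true value is 3/6 = 0.5; the estimate n_E / n approaches it as n grows,
# which is why the limit in the definition can only ever be approximated.
random.seed(42)  # fixed seed so the run is repeatable

def relative_frequency(n_trials):
    """Estimate P(even) as n_E / n from n_trials simulated die rolls."""
    n_event = sum(1 for _ in range(n_trials) if random.randint(1, 6) % 2 == 0)
    return n_event / n_trials

for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n))
```

Each run gives only an estimate from finitely many trials; the scatter around 0.5 shrinks roughly like 1/sqrt(n).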

Four approaches to Probability definition

Probability Based on an Axiomatic Theory:

(i) P(A) ≥ 0 (probability is a nonnegative number)

(ii) P(S) = 1 (probability of the whole sample space is unity)

(iii) If A ∩ B = ∅, then P(A ∪ B) = P(A) + P(B). - A. N. Kolmogorov

For a partition A1, A2, ..., An of the sample space: P(A1 + A2 + ... + An) = 1.

Notes:

1. Experiment, Sample Space, Elementary Event (Outcome), Event. Discuss the equations: why are they so? Refer {Peebles}, pp. 10
2. Uses of the Axiomatic Theory: refer {Kolmogorov}
3. Consider a simple resistor, R = V(t) / I(t). Is this true under all conditions? Fully accurate (inductance and capacitance)? Clearly specified terminals? Refer {Papoulis}, pp. 5
4. Mutually Exclusive/Disjoint Events? [Refer point (iii) above: when P(AB) = 0.] When is a set of Events called a Partition/Decomposition/Exhaustive (refer the last point in the above slide)? What is its use? (Ans: refer the Tips and Tricks page of this document)
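The three axioms above can be checked concretely on a small finite probability space. This sketch (illustrative only; the die example and events are my own, not from the slides) uses exact fractions so the axioms hold without rounding error:

```python
from fractions import Fraction

# A finite probability space for one roll of a fair die, used to check
# Kolmogorov's axioms numerically.
sample_space = {outcome: Fraction(1, 6) for outcome in range(1, 7)}

def P(event):
    """Probability of an event (a set of outcomes): sum of outcome probabilities."""
    return sum(sample_space[w] for w in event)

A = {2, 4, 6}   # "roll is even"
B = {1, 3}      # disjoint from A

# Axiom (i): every probability is nonnegative
assert all(p >= 0 for p in sample_space.values())
# Axiom (ii): the whole sample space has probability 1
assert P(set(sample_space)) == 1
# Axiom (iii): additivity for disjoint events (A | B is set union in Python)
assert A & B == set()
assert P(A | B) == P(A) + P(B)

print(P(A), P(B), P(A | B))  # exact fractions: 1/2 1/3 5/6
```

Note that axiom (iii) needs the disjointness check: for overlapping events, P(A ∪ B) = P(A) + P(B) − P(AB) instead.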

A priori and A posteriori Probabilities

A priori Probability: relating to reasoning from self-evident propositions, or presupposed by experience (before the experiment is conducted).

A posteriori Probability: reasoning from the observed facts (after the experiment is conducted).


Notes:

1. {S&W}, pp. 3
2. Also called Prior Probability and Posterior Probability
3. Their role: Bayes' Theorem. Priors are of two types: Informative Priors and Uninformative (Vague/Diffuse) Priors; refer {Kemp}, pp. 41-42
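The prior-to-posterior update mentioned in the notes can be shown with a small Bayes' theorem calculation. The detection scenario and all numbers below are hypothetical (not from the slides), in the spirit of the radar/GPS detection examples listed in the contents:

```python
# A priori and a posteriori probabilities via Bayes' theorem.
# Hypothetical detector: flags a target with probability 0.95 when one is
# present, and falsely flags with probability 0.05 when none is present.
p_target = 0.01                 # a priori probability, before the observation
p_flag_given_target = 0.95      # detection probability (likelihood)
p_flag_given_no_target = 0.05   # false-alarm probability

# Total probability of a flag (denominator of Bayes' theorem)
p_flag = (p_flag_given_target * p_target
          + p_flag_given_no_target * (1 - p_target))

# A posteriori probability of a target, given that the detector flagged
p_target_given_flag = p_flag_given_target * p_target / p_flag
print(round(p_target_given_flag, 4))  # prints 0.161
```

The observation raises the probability of a target from the 1% prior to about 16%: still far from certain, because false alarms dominate when the prior is small.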

Concepts of Joint, Marginal, Conditional and Total Probabilities

Let A and B be two experiments, either successively conducted or simultaneously conducted.

Let A1, A2, ..., An be a partition of A, and B1, B2, ..., Bn be a partition of B.

This leads to the Array of Joint and Marginal Probabilities.


Notes:

1. {R.G.Brown}, pp. 12-13
2. Conditional probability, in contrast, is usually explained through the relative frequency interpretation of probability; see for example {S&W}, pp. 16

Concepts of Joint, Marginal, Conditional and Total Probabilities

| A \ B    | Event B1 | Event B2 | ... | Event Bn | Marginal Probabilities |
|----------|----------|----------|-----|----------|------------------------|
| Event A1 | P(A1 B1) | P(A1 B2) | ... | P(A1 Bn) | P(A1)                  |
| Event A2 | P(A2 B1) | P(A2 B2) | ... | P(A2 Bn) | P(A2)                  |
| ...      | ...      | ...      | ... | ...      | ...                    |
| Event An | P(An B1) | P(An B2) | ... | P(An Bn) | P(An)                  |
| Marginal Probabilities | P(B1) | P(B2) | ... | P(Bn) | Sum = 1 |

Notes:

1. From {R.G.Brown}, pp. 12-13
2. Joint Probability?
3. What happens if Events A1, A2, ..., An are not a partition but just some disjoint/Mutually Exclusive Events? Similarly for the Events Bj?
4. Summing out a row, for example, gives the probability of an event Ai of Experiment A irrespective of the outcomes of Experiment B
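The array of joint and marginal probabilities above can be built directly in code: summing a row marginalizes out B, summing a column marginalizes out A, and dividing a joint entry by a marginal gives a conditional probability. (A sketch with made-up numbers for a 2x3 table; nothing here is taken from the slides.)

```python
# Joint probability table P(Ai Bj) for partitions A1..A2 and B1..B3.
# The six entries sum to 1, as the bottom-right cell of the array requires.
joint = {
    ("A1", "B1"): 0.10, ("A1", "B2"): 0.20, ("A1", "B3"): 0.10,
    ("A2", "B1"): 0.15, ("A2", "B2"): 0.25, ("A2", "B3"): 0.20,
}

# Marginal P(Ai): sum the row over all Bj (summing out B)
P_A = {a: sum(p for (ai, _), p in joint.items() if ai == a) for a in ("A1", "A2")}
# Marginal P(Bj): sum the column over all Ai (summing out A)
P_B = {b: sum(p for (_, bj), p in joint.items() if bj == b) for b in ("B1", "B2", "B3")}

# Conditional probability P(A1 | B2) = P(A1 B2) / P(B2)
P_A1_given_B2 = joint[("A1", "B2")] / P_B["B2"]

assert abs(sum(joint.values()) - 1.0) < 1e-12  # the full table sums to 1
print(P_A, P_B, round(P_A1_given_B2, 4))
```

This also answers note 3 in passing: if the Ai were disjoint but not exhaustive, the table entries would sum to less than 1 and the "marginals" would no longer be full probabilities.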