# Probability and Random Processes



Jinho Choi, GIST

February 2017


What Albert Einstein said:

- As I have said so many times, God doesn't play dice with the world.

- Two things are infinite: the universe and human stupidity.

So, he believed in infinity, but not in randomness.


Main aims

- Understand fundamental issues of probability theory (pdf, mean, and variance)

- Understand joint and conditional pdfs, independence, and correlation

- Learn properties of Gaussian random variables and be able to derive the Chernoff bound

- Understand random processes with key definitions

- Be able to compute the mean and variance of samples from random processes

Probability Theory

The three axioms, given a sample space $\Omega$, a family of sets $\mathcal{F}$ of allowable events, and a measure $\Pr(\cdot)$:

- The first axiom: the probability of an event $E$ is a non-negative real number:

$$\Pr(E) \ge 0.$$

- The second axiom:

$$\Pr(\Omega) = 1.$$

- The third axiom: for any countable sequence of mutually exclusive events $E_1, E_2, \ldots$,

$$\Pr(E_1 \cup E_2 \cup \cdots) = \sum_{i=1}^{\infty} \Pr(E_i).$$


Random variables: a random variable is a mapping from the sample space (an abstract space) to the set of real numbers:

$$X : \Omega \to (-\infty, \infty), \qquad \omega \mapsto X(\omega).$$

The main idea of random variables is to describe random events by numbers.


Example of random variables: a game of dice

- Before you roll a die, the number of dots is unknown. This number can be considered as a random variable.
- Once the die is rolled, we have a particular number, which is one of $\{1, \ldots, 6\}$ (e.g., 2, 3, or 4). This is called a realization.


Cumulative distribution function (CDF):

$$F_X(x) = \Pr(X \le x),$$

where $X$ is the random variable (r.v.) and $x$ is a real number. By definition, the CDF is a nondecreasing function.

Probability density function (PDF):

$$f_X(x) = \frac{d}{dx} F_X(x)$$

Example: for a fair die, $F_X(x)$ is a staircase function that jumps by $1/6$ at each of $x = 1, 2, \ldots, 6$, taking the values $1/6, 2/6, \ldots, 5/6, 1$.


There are different types of r.v.s, such as:

- continuous r.v.: $X$ takes a continuous value
- discrete r.v.: $X$ takes a discrete value

Examples:

- the phase $\theta$ of a sinusoid $\sin(\omega_c t + \theta)$: a continuous r.v.
- the number of dots of a die: a discrete r.v.


A discrete r.v. with binomial distribution

- Consider a random experiment that has two possible outcomes. For example, the outcome of this experiment can be expressed (1 to denote success; 0 to denote failure) as

$$Y = \begin{cases} 1, & \text{with probability } p; \\ 0, & \text{with probability } 1-p. \end{cases}$$

- Consider a sum of $n$ outcomes from independent experiments:

$$X = \sum_{j=1}^{n} Y_j \in \{0, 1, \ldots, n\}.$$

- Then $X$ is the binomial random variable with parameters $n$ and $p$, written $X \sim B(n, p)$:

$$\Pr(X = j) = \binom{n}{j} p^j (1-p)^{n-j}, \quad j = 0, \ldots, n.$$


A continuous r.v. $X$ has the probability density function (pdf)

$$f_X(x) = \frac{d}{dx} F_X(x).$$

- As $F_X(x)$ is nondecreasing, $f_X(x) \ge 0$.
- In general,

$$\int_{-\infty}^{t} f_X(x)\,dx = F_X(t).$$

- Since $\lim_{x\to\infty} F_X(x) = 1$, $\int_{-\infty}^{\infty} f_X(x)\,dx = 1$.

For a discrete r.v., the pdf becomes the probability mass function (pmf), which is an actual probability.

- Example of a die:

$$f_X(k) = \Pr(X = k) = \frac{1}{6}, \quad k = 1, 2, \ldots, 6.$$


Mean and Variance

For a r.v. $X$, the mean of $X$ (or of $g(X)$, where $g(x)$ is a function of $x$) is given by

- for a continuous r.v.:

$$E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx \quad\text{or}\quad E[g(X)] = \int_{-\infty}^{\infty} g(x) f_X(x)\,dx$$

- for a discrete r.v.:

$$E[X] = \sum_k x_k \Pr(x_k) \quad\text{or}\quad E[g(X)] = \sum_k g(x_k) \Pr(x_k)$$

The variance is given by

$$\mathrm{Var}(X) = E[(X - E[X])^2]$$

The mean and variance are used to characterize a random variable.
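As a sketch of these definitions, the mean and variance of a fair die can be computed exactly with Python's `fractions` module (an illustrative choice, not part of the original slides):

```python
from fractions import Fraction

# pmf of a fair die: Pr(X = k) = 1/6 for k = 1, ..., 6
outcomes = range(1, 7)
p = Fraction(1, 6)

mean = sum(k * p for k in outcomes)               # E[X]
var = sum((k - mean) ** 2 * p for k in outcomes)  # E[(X - E[X])^2]

assert mean == Fraction(7, 2)    # E[X] = 7/2
assert var == Fraction(35, 12)   # Var(X) = 35/12
```

Using exact fractions avoids floating-point rounding, so the results match the hand calculation exactly.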


Mean of $X \sim B(n, p)$:

$$
\begin{aligned}
E[X] &= \sum_{j=0}^{n} j \binom{n}{j} p^j (1-p)^{n-j} \\
&= \sum_{j=1}^{n} j\,\frac{n!}{j!(n-j)!}\, p^j (1-p)^{n-j} \\
&= \sum_{j=1}^{n} \frac{n(n-1)!}{(j-1)!(n-j)!}\, p\, p^{j-1} (1-p)^{n-j} \\
&= np \sum_{j=1}^{n} \frac{(n-1)!}{(j-1)!(n-j)!}\, p^{j-1} (1-p)^{n-j} \\
&= np\,(p + (1-p))^{n-1} = np.
\end{aligned}
$$
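The closed form $E[X] = np$ can be checked numerically against the pmf; `binom_pmf` and `binom_mean` below are illustrative helper names, not from the slides:

```python
from math import comb

def binom_pmf(j, n, p):
    """Pr(X = j) for X ~ B(n, p)."""
    return comb(n, j) * p**j * (1 - p)**(n - j)

def binom_mean(n, p):
    """E[X] computed directly from the pmf."""
    return sum(j * binom_pmf(j, n, p) for j in range(n + 1))

# pmf sums to 1, and the mean matches the closed form np
assert abs(sum(binom_pmf(j, 12, 0.25) for j in range(13)) - 1.0) < 1e-12
assert abs(binom_mean(12, 0.25) - 12 * 0.25) < 1e-12
```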


Geometric random variable:

- pmf: $\Pr(X = k) = (1-p)^{k-1} p$, $k \ge 1$. Note that

$$\sum_{k=1}^{\infty} (1-p)^{k-1} p = p \sum_{k=0}^{\infty} (1-p)^k = p\,\frac{1}{1-(1-p)} = 1.$$

- Example: the number of independent flips of a coin until a head first appears.
- Mean: letting $q = 1-p$, it can be shown that

$$E[X] = \sum_{k=1}^{\infty} k (1-p)^{k-1} p = p\,\frac{d}{dq} \sum_{k=0}^{\infty} q^k = p\,\frac{d}{dq}\,\frac{1}{1-q} = p\,\frac{1}{(1-q)^2} = \frac{1}{p}.$$
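A quick way to sanity-check $E[X] = 1/p$ is to simulate flips until the first success; this sketch assumes Python's `random` module with a fixed seed for reproducibility:

```python
import random

random.seed(0)

def geometric_sample(p):
    """Number of Bernoulli(p) trials until the first success."""
    k = 1
    while random.random() >= p:  # random() < p counts as a success
        k += 1
    return k

p = 0.25
samples = [geometric_sample(p) for _ in range(100_000)]
mean = sum(samples) / len(samples)

assert abs(mean - 1 / p) < 0.1  # E[X] = 1/p = 4
```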


A continuous r.v. with uniform distribution

Let us consider a uniform r.v. $X$ with the pdf

$$f_X(x) = \begin{cases} \frac{1}{A}, & 0 \le x \le A; \\ 0, & \text{otherwise.} \end{cases}$$

The mean is

$$E[X] = \int_0^A x\,\frac{1}{A}\,dx = \frac{1}{A}\left.\frac{x^2}{2}\right|_{x=0}^{A} = \frac{A}{2}.$$

The variance is

$$E[(X - E[X])^2] = \int_0^A \left(x - \frac{A}{2}\right)^2 \frac{1}{A}\,dx = \int_{-A/2}^{A/2} z^2\,\frac{1}{A}\,dz = \frac{1}{A}\cdot\frac{z^3}{3}\bigg|_{-A/2}^{A/2} = \frac{A^2}{12}.$$
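A Monte Carlo check of $E[X] = A/2$ and $\mathrm{Var}(X) = A^2/12$; the sample size, seed, and choice $A = 3$ are illustrative:

```python
import random

random.seed(1)
A = 3.0
samples = [random.uniform(0, A) for _ in range(200_000)]

mean = sum(samples) / len(samples)
var = sum((x - mean) ** 2 for x in samples) / len(samples)

assert abs(mean - A / 2) < 0.02      # A/2 = 1.5
assert abs(var - A**2 / 12) < 0.02   # A^2/12 = 0.75
```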


More examples on expectation

Q) Let $X$ be a r.v. and let $a$ and $c$ be constants. Show that

$$E[aX] = aE[X] \quad\text{and}\quad E[X + c] = E[X] + c$$

A)

$$E[aX] = \int (a x) f_X(x)\,dx = a \int x f_X(x)\,dx = a E[X]$$

$$E[X + c] = \int (x + c) f_X(x)\,dx = \int x f_X(x)\,dx + \int c f_X(x)\,dx = E[X] + c$$

Q) Show that $E[X^2] = \mathrm{Var}(X) + (E[X])^2$.
A)

$$
\begin{aligned}
E[(X - E[X])^2] &= E[X^2 - 2(E[X])X + (E[X])^2] \\
&= E[X^2] - 2(E[X])E[X] + (E[X])^2 \\
&= E[X^2] - (E[X])^2
\end{aligned}
$$


Q) Show that $\mathrm{Var}(aX + c) = a^2\,\mathrm{Var}(X)$.

Q) Suppose that $X$ is a r.v. with mean 1 and variance 3. Find $E[3X^2 + 2X]$.
A)

$$E[3X^2 + 2X] = 3E[X^2] + 2E[X] = 3(\mathrm{Var}(X) + (E[X])^2) + 2E[X] = 3(3 + 1^2) + 2\cdot 1 = 14.$$


Jensen's inequality: for a convex function $g(x)$,

$$E[g(X)] \ge g(E[X]).$$

A convex function satisfies

$$\lambda g(x_1) + (1-\lambda) g(x_2) \ge g(\lambda x_1 + (1-\lambda) x_2), \quad \lambda \in [0, 1].$$
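The inequality can be illustrated empirically with the convex function $g(x) = x^2$, for which $E[X^2] \ge (E[X])^2$; the distribution and seed below are illustrative choices:

```python
import random

random.seed(2)
samples = [random.gauss(1.0, 2.0) for _ in range(50_000)]

g = lambda x: x * x  # a convex function

E_gX = sum(g(x) for x in samples) / len(samples)  # E[g(X)] ~ 1 + 4 = 5
g_EX = g(sum(samples) / len(samples))             # g(E[X]) ~ 1

assert E_gX >= g_EX  # Jensen: E[g(X)] >= g(E[X])
```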


Gaussian or normal random variable, $X \sim \mathcal{N}(\mu, \sigma^2)$:

$$f_X(x) = \frac{1}{\sqrt{2\pi\sigma^2}}\,\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right),$$

where the mean is

$$E[X] = \int_{-\infty}^{\infty} x f_X(x)\,dx = \mu$$

and the variance is

$$E[(X - E[X])^2] = \int_{-\infty}^{\infty} (x-\mu)^2 f_X(x)\,dx = \sigma^2.$$


Normal or Gaussian pdfs:


Normal or Gaussian cdfs:


Q-function: the tail of the standard normal pdf. For $X \sim \mathcal{N}(0, 1)$,

$$\Pr(X \ge x) = Q(x) \triangleq \int_x^{\infty} \frac{1}{\sqrt{2\pi}}\,\exp\left(-\frac{t^2}{2}\right) dt.$$
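The Q-function has no elementary closed form, but it can be evaluated through the complementary error function, since $Q(x) = \tfrac{1}{2}\,\mathrm{erfc}(x/\sqrt{2})$:

```python
from math import erfc, sqrt

def Q(x):
    """Standard normal tail probability: Q(x) = 0.5 * erfc(x / sqrt(2))."""
    return 0.5 * erfc(x / sqrt(2))

assert abs(Q(0) - 0.5) < 1e-12        # half the mass lies above the mean
assert abs(Q(1.0) - 0.158655) < 1e-5  # well-known value of Q(1)
```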


Conditional probability

The conditional probability of an event $A$ given $B$ is denoted and given by

$$\Pr(A \mid B) = \frac{\Pr(A, B)}{\Pr(B)}.$$

Q) Find the probability that the face with one dot occurs, given that an odd number of dots is observed:
A) We have

$$\Pr(\text{odd}) = \frac{1}{2} \quad\text{and}\quad \Pr(1, \text{odd}) = \Pr(1) = \frac{1}{6}.$$

Hence, it follows that

$$\Pr(1 \mid \text{odd}) = \frac{\Pr(1, \text{odd})}{\Pr(\text{odd})} = \frac{1}{3}.$$
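The same conditional probability can be estimated by simulation: count how often a 1 appears among odd outcomes. Seed and sample size are illustrative:

```python
import random

random.seed(3)
rolls = [random.randint(1, 6) for _ in range(120_000)]

odd = [r for r in rolls if r % 2 == 1]           # condition on the event "odd"
p_one_given_odd = sum(1 for r in odd if r == 1) / len(odd)

assert abs(p_one_given_odd - 1 / 3) < 0.015  # Pr(1 | odd) = 1/3
```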


Multiple random variables: the joint pdf is written as

$$f_{XY}(x, y) = \frac{\partial^2}{\partial x \partial y} F_{XY}(x, y) = \frac{\partial^2}{\partial x \partial y} \Pr(X \le x, Y \le y)$$

Conditional pdf:

$$f_{X|Y}(x|y) = \begin{cases} \dfrac{f_{XY}(x, y)}{f_Y(y)}, & \text{if } f_Y(y) \ne 0; \\ 0, & \text{otherwise.} \end{cases}$$

Marginalization:

$$f_X(x) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dy \quad\text{or}\quad f_Y(y) = \int_{-\infty}^{\infty} f_{XY}(x, y)\,dx$$


Expectation with two r.v.s: for continuous r.v.s, a double integral is needed:

$$E[g(X, Y)] = \iint g(x, y) f_{XY}(x, y)\,dx\,dy \quad\text{(continuous)}$$

$$E[g(X, Y)] = \sum_x \sum_y g(x, y) \Pr(X = x, Y = y) \quad\text{(discrete)}.$$

The conditional expectation is defined by

$$E[X|Y] = \int x f_{X|Y}(x|Y)\,dx \quad\text{(continuous)}$$

$$E[X|Y] = \sum_x x \Pr(X = x \mid Y) \quad\text{(discrete)}.$$

Note that $E[X|Y] = g(Y)$ is a function of $Y$, and is therefore itself a random variable.


Q) Show that $E[XY] = E[Y\,E[X|Y]]$.
A)

$$
\begin{aligned}
E[XY] &= \iint x y\, f_{XY}(x, y)\,dx\,dy \\
&= \iint x y\, f_{X|Y}(x|y) f_Y(y)\,dx\,dy \\
&= \int y \left( \int x f_{X|Y}(x|y)\,dx \right) f_Y(y)\,dy \\
&= \int y\, E[X|y]\, f_Y(y)\,dy = E[Y\,E[X|Y]].
\end{aligned}
$$


Exponential distribution and memorylessness

- The exponential distribution is given by

$$f(x; \lambda) = \begin{cases} \lambda e^{-\lambda x}, & \text{if } x \ge 0; \\ 0, & \text{otherwise.} \end{cases}$$

- The mean and variance are $1/\lambda$ and $1/\lambda^2$, respectively.
- An exponentially distributed random variable $T$ obeys the following relation:

$$\Pr(T > s + t \mid T > s) = \Pr(T > t), \quad s, t \ge 0.$$

This property is called memorylessness.
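Memorylessness follows directly from the exponential tail $\Pr(T > t) = e^{-\lambda t}$, and can be verified numerically (the value $\lambda = 0.5$ is an illustrative choice):

```python
from math import exp

lam = 0.5

def tail(t):
    """P(T > t) for T ~ Exp(lam)."""
    return exp(-lam * t)

s, t = 1.0, 2.0
p_cond = tail(s + t) / tail(s)  # P(T > s+t | T > s) via the definition

assert abs(p_cond - tail(t)) < 1e-12  # memorylessness
```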


Independence and correlation

The joint cdf (of more than 2 r.v.s) is written as

$$F_{X_1,\ldots,X_n}(x_1, \ldots, x_n) = \Pr(X_1 \le x_1, \ldots, X_n \le x_n).$$

The joint pdf is given by

$$f_{X_1,\ldots,X_n}(x_1, \ldots, x_n) = \frac{\partial^n}{\partial x_1 \cdots \partial x_n} F_{X_1,\ldots,X_n}(x_1, \ldots, x_n).$$

The marginal pdf is given by

$$f_{X_1}(x_1) = \int_{x_2} \cdots \int_{x_n} f_{X_1,\ldots,X_n}(x_1, \ldots, x_n)\,dx_2 \cdots dx_n.$$

If $X_1, \ldots, X_n$ are independent, then

$$F_{X_1,\ldots,X_n}(x_1, \ldots, x_n) = F_{X_1}(x_1) \cdots F_{X_n}(x_n)$$

and

$$f_{X_1,\ldots,X_n}(x_1, \ldots, x_n) = f_{X_1}(x_1) \cdots f_{X_n}(x_n).$$


- Note: if $X$ and $Y$ are independent,

$$f_{X|Y}(x|y) = \frac{f_{XY}(x, y)}{f_Y(y)} = f_X(x).$$

- The correlation of $X$ and $Y$ is defined as

$$\mathrm{corr}(X, Y) = E[XY]$$

and the covariance is defined as

$$\mathrm{cov}(X, Y) = E[(X - E[X])(Y - E[Y])].$$

If $\mathrm{cov}(X, Y) = 0$, $X$ and $Y$ are said to be uncorrelated (not "uncovarianced").
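Independence implies zero covariance, which a simulation with two independently drawn Gaussian samples illustrates (seed and sample size are illustrative):

```python
import random

random.seed(4)
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [random.gauss(0, 1) for _ in range(n)]  # drawn independently of xs

mx = sum(xs) / n
my = sum(ys) / n
cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / n

assert abs(cov) < 0.02  # independent => uncorrelated (cov ~ 0)
```

Note the converse is false in general: uncorrelated r.v.s need not be independent.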


Joint Gaussian random variables: let us define the random vector as

$$\mathbf{x} = [X_1\ X_2\ \cdots\ X_n]^{\mathrm{T}}.$$

The random variables $X_i$ are jointly Gaussian if the pdf can be written as

$$f(x_1, x_2, \ldots, x_n) = \frac{1}{\sqrt{(2\pi)^n \det(\mathbf{C})}}\,\exp\left(-\frac{1}{2}(\mathbf{x} - \mathbf{m})^{\mathrm{T}} \mathbf{C}^{-1} (\mathbf{x} - \mathbf{m})\right),$$

where the mean vector is $\mathbf{m} = E[\mathbf{x}] = [E[X_1]\ \ldots\ E[X_n]]^{\mathrm{T}}$ and the covariance matrix is

$$\mathbf{C} = E[(\mathbf{x} - \mathbf{m})(\mathbf{x} - \mathbf{m})^{\mathrm{T}}] = \begin{bmatrix} E[X_1 X_1] - E[X_1]E[X_1] & \cdots & E[X_1 X_n] - E[X_1]E[X_n] \\ \vdots & \ddots & \vdots \\ E[X_n X_1] - E[X_n]E[X_1] & \cdots & E[X_n X_n] - E[X_n]E[X_n] \end{bmatrix}.$$


Moment Generating Function (MGF)

- Goal: finding $m_n = E[X^n]$ (the $n$th moment).
- Define the MGF as

$$M_X(t) = E[e^{tX}].$$

- Using the Taylor series,

$$M_X(t) = E\left[\sum_{k=0}^{\infty} \frac{1}{k!}(tX)^k\right] = \sum_{k=0}^{\infty} \frac{t^k m_k}{k!}.$$

- From this, the $k$th moment can be found as

$$m_k = \left.\frac{d^k}{dt^k} M_X(t)\right|_{t=0}.$$
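Differentiating the MGF at $t = 0$ can be mimicked with finite differences. The sketch below uses the standard normal MGF $M(t) = e^{t^2/2}$ (its closed form is standard; the step size is an illustrative choice) to recover $m_1 = 0$ and $m_2 = 1$:

```python
from math import exp

def M(t):
    """MGF of a standard normal r.v.: M(t) = exp(t^2 / 2)."""
    return exp(t * t / 2)

h = 1e-4
m1 = (M(h) - M(-h)) / (2 * h)          # central difference: first derivative at 0
m2 = (M(h) - 2 * M(0) + M(-h)) / h**2  # central difference: second derivative at 0

assert abs(m1 - 0.0) < 1e-6  # E[X] = 0
assert abs(m2 - 1.0) < 1e-6  # E[X^2] = 1
```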


- The MGF can be seen as a (two-sided) Laplace transform of the pdf $f_X(x)$, evaluated at $-t$:

$$M_X(t) = \int_{-\infty}^{\infty} e^{tx} f_X(x)\,dx.$$

Thus, using the inverse Laplace transform, the pdf can be found from the MGF.

- Let $X$ and $Y$ be independent random variables. The MGF of $Z = X + Y$ is given by

$$M_Z(t) = E[e^{tZ}] = E[e^{t(X+Y)}] = M_X(t) M_Y(t).$$

Thus, the pdf of $Z$ is the inverse Laplace transform of $M_X(t) M_Y(t)$.

- Equivalently, the pdf of $Z$ is the convolution of the pdfs of $X$ and $Y$: $f_Z(x) = f_X(x) * f_Y(x)$.


Sum of independent Gaussian random variables: let $X_i$ be independent Gaussian r.v.s, $X_i \sim \mathcal{N}(\mu_i, \sigma_i^2)$. Then

$$Y = \sum_i X_i$$

is also a Gaussian r.v.

- The MGF of $X_i$ is $M_i(t) = \exp(\mu_i t + \frac{1}{2}\sigma_i^2 t^2)$.
- The MGF of $Y$ is

$$M_Y(t) = \prod_i M_i(t) = \exp\left(\sum_i \mu_i t + \frac{1}{2}\sum_i \sigma_i^2 t^2\right)$$

$$\Rightarrow\quad Y \sim \mathcal{N}\left(\sum_i \mu_i,\ \sum_i \sigma_i^2\right).$$
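The "means add, variances add" conclusion can be checked by simulation; the three (mean, std) pairs, seed, and sample size below are illustrative:

```python
import random

random.seed(5)
params = [(0.0, 1.0), (2.0, 0.5), (-1.0, 2.0)]  # (mean, std) of each X_i
n = 100_000
sums = [sum(random.gauss(m, s) for m, s in params) for _ in range(n)]

mean = sum(sums) / n
var = sum((y - mean) ** 2 for y in sums) / n

assert abs(mean - 1.0) < 0.05              # sum of means: 0 + 2 - 1 = 1
assert abs(var - (1 + 0.25 + 4.0)) < 0.15  # sum of variances: 5.25
```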


Gaussian-related distributions

- $\chi^2$ distribution: the pdf of the $\chi^2$ distribution with $N$ degrees of freedom is written as

$$f_X(x) = \frac{1}{\Gamma(N/2)\,2^{N/2}}\, x^{(N/2)-1} e^{-x/2}, \quad x \ge 0,$$

where $\Gamma(x) = \int_0^{\infty} t^{x-1} e^{-t}\,dt$ is the Gamma function. If $x = n$ is an integer, $\Gamma(n) = (n-1)!$.

Let $X_1, X_2, \ldots, X_N$ be independent identically distributed (i.i.d.) random variables (r.v.s) with $X_i \sim \mathcal{N}(0, 1)$. Then $Y = \sum_{i=1}^{N} X_i^2$ is a $\chi^2$ r.v. with $N$ degrees of freedom.


$\chi^2$ pdf:


- The $\chi^2$ distribution with 2 degrees of freedom is an exponential distribution:

$$f_X(x) = \frac{1}{2} e^{-x/2}, \quad x \ge 0.$$

- The exponential distribution with parameter $\lambda$ (denoted by $X \sim \mathrm{Exp}(\lambda)$) is given by

$$f(x; \lambda) = \begin{cases} \lambda e^{-\lambda x}, & x \ge 0; \\ 0, & x < 0. \end{cases}$$

- The mean and variance of $X \sim \mathrm{Exp}(\lambda)$ are

$$E[X] = \frac{1}{\lambda}, \qquad \mathrm{Var}(X) = \frac{1}{\lambda^2}.$$


- Let $X_1$ and $X_2$ be i.i.d. Gaussian r.v.s with $X_i \sim \mathcal{N}(0, 1)$. Define $Y = X_1 + jX_2$. Then $Y$ is a circularly symmetric complex Gaussian (CSCG) r.v. The mean and variance are

$$E[Y] = 0, \quad E[Y^2] = 0, \quad\text{and}\quad E[|Y|^2] = 2.$$

- From a zero-mean CSCG random vector to a real-valued Gaussian random vector:

$$\mathbf{y} = \mathbf{x}_1 + j\mathbf{x}_2 \quad\Rightarrow\quad \begin{bmatrix} \mathbf{x}_1 \\ \mathbf{x}_2 \end{bmatrix} \sim \mathcal{N}\left(\begin{bmatrix} \mathbf{0} \\ \mathbf{0} \end{bmatrix}, \begin{bmatrix} E[\mathbf{x}_1\mathbf{x}_1^{\mathrm{T}}] & E[\mathbf{x}_1\mathbf{x}_2^{\mathrm{T}}] \\ E[\mathbf{x}_2\mathbf{x}_1^{\mathrm{T}}] & E[\mathbf{x}_2\mathbf{x}_2^{\mathrm{T}}] \end{bmatrix}\right)$$

- Then, what is the distribution of $\mathbf{z} = \mathbf{A}\mathbf{y}$ when $\mathbf{y}$ is a zero-mean CSCG random vector?


- The real-valued $\mathbf{A}$: $\mathbf{A} \to \tilde{\mathbf{A}} = [$ …


Remarks:

- Let $X$ be a CSCG r.v. Then $|X|^2$ becomes a $\chi^2$ r.v.
- The Gamma function $\Gamma(n)$:

$$\Gamma(n) = \int_0^{\infty} t^{n-1} e^{-t}\,dt, \quad n > 0.$$

Some properties:

$$\Gamma(n+1) = n\,\Gamma(n)$$

and, for integer $n$,

$$\Gamma(n+1) = n!$$

In addition,

$$\Gamma\left(\frac{1}{2}\right) = \sqrt{\pi}.$$
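These Gamma-function identities can be checked directly with Python's `math.gamma`:

```python
from math import gamma, sqrt, pi, factorial

# Gamma(n + 1) = n * Gamma(n), here at the non-integer point n = 3.5
assert abs(gamma(4.5) - 3.5 * gamma(3.5)) < 1e-9

# Gamma(n + 1) = n! for integer n, here n = 5
assert abs(gamma(6) - factorial(5)) < 1e-9

# Gamma(1/2) = sqrt(pi)
assert abs(gamma(0.5) - sqrt(pi)) < 1e-12
```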


Rician and Rayleigh random variables

- The Rayleigh r.v. has a single-parameter pdf given by

$$f_X(x) = \frac{x}{\sigma^2}\, e^{-x^2/2\sigma^2}, \quad x \ge 0,$$

and the cdf is

$$F_X(x) = 1 - e^{-x^2/2\sigma^2}, \quad x \ge 0.$$

If $X_1$ and $X_2$ are independent Gaussian random variables with mean 0 and variance $\sigma^2$, then $X = \sqrt{X_1^2 + X_2^2}$ is a Rayleigh r.v.
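The two-Gaussian construction can be checked against the closed-form cdf above; $\sigma$, the test point, seed, and sample size are illustrative:

```python
import random
from math import sqrt, exp

random.seed(6)
sigma = 1.5
n = 100_000

# Rayleigh samples built from two independent N(0, sigma^2) components
samples = [sqrt(random.gauss(0, sigma) ** 2 + random.gauss(0, sigma) ** 2)
           for _ in range(n)]

x = 2.0
empirical = sum(1 for s in samples if s <= x) / n
theoretical = 1 - exp(-x**2 / (2 * sigma**2))  # Rayleigh cdf F_X(x)

assert abs(empirical - theoretical) < 0.01
```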


- The Rician pdf is written as

$$f_X(x) = \frac{x}{\sigma^2}\, I_0\left(\frac{x\mu}{\sigma^2}\right) e^{-(x^2 + \mu^2)/2\sigma^2}, \quad x \ge 0,$$

where $I_0(x)$ is the modified Bessel function of zero order. It can be shown that if $X_1$ and $X_2$ are independent Gaussian, with $X_1 \sim \mathcal{N}(\mu, \sigma^2)$ and $X_2 \sim \mathcal{N}(0, \sigma^2)$, then $X = \sqrt{X_1^2 + X_2^2}$ is a Rician r.v.


Approximations: from binomial to Poisson to Gaussian

- A (discrete) Poisson random variable with parameter $\lambda$, denoted by $\mathrm{Pois}(\lambda)$, has the following distribution:

$$\Pr(X = k) = \frac{e^{-\lambda} \lambda^k}{k!}, \quad k = 0, 1, \ldots$$

The following approximations are often useful:

- A binomial random variable $B(n, p)$ can be approximated by a Poisson random variable with $\lambda = np$ if $n$ is sufficiently large (and $p$ is small).
- A Poisson random variable with parameter $\lambda$ can be approximated by a Gaussian random variable if $\lambda$ is large.
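The binomial-to-Poisson approximation is easy to see numerically: for large $n$ and small $p$, the two pmfs are nearly identical pointwise. The parameters below ($n = 1000$, $p = 0.005$, so $\lambda = 5$) are illustrative:

```python
from math import comb, exp, factorial

n, p = 1000, 0.005
lam = n * p  # Poisson parameter lambda = np = 5

def binom_pmf(k):
    return comb(n, k) * p**k * (1 - p)**(n - k)

def pois_pmf(k):
    return exp(-lam) * lam**k / factorial(k)

# largest pointwise difference over the region carrying essentially all the mass
max_gap = max(abs(binom_pmf(k) - pois_pmf(k)) for k in range(30))
assert max_gap < 1e-3
```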


Limit of $X \sim B(n, p)$ (taking a heuristic approach):

$$
\begin{aligned}
\Pr(X = k) &= \binom{n}{k} p^k (1-p)^{n-k} \\
&\approx \frac{n^k}{k!}\, p^k\, \frac{(1-p)^n}{(1-p)^k} \quad\text{(for large } n\text{)} \\
&\approx \frac{(np)^k}{k!}\, e^{-pn}\, e^{pk} \quad\text{(using } 1-x \approx e^{-x},\ |x| \ll 1\text{)}
\end{aligned}
$$

Let $\lambda = pn$ and take $n \to \infty$, which means that $p$ approaches 0 (so $e^{pk} \to 1$). Then

$$\Pr(X = k) \to \frac{\lambda^k e^{-\lambda}}{k!},$$

which is the Poisson distribution with parameter $\lambda$.


Sum of independent Poisson r.v.s is also Poisson

- The MGF of $\mathrm{Pois}(\lambda)$:

$$M_X(t) = E[e^{tX}] = \sum_{k=0}^{\infty} e^{tk}\,\frac{\lambda^k e^{-\lambda}}{k!} = e^{-\lambda} \sum_{k=0}^{\infty} \frac{(\lambda e^t)^k}{k!} = e^{-\lambda} e^{\lambda e^t} = \exp(\lambda(e^t - 1)).$$

- $Z = X + Y$, where $X \sim \mathrm{Pois}(\lambda_X)$ and $Y \sim \mathrm{Pois}(\lambda_Y)$, has the MGF

$$M_Z(t) = M_X(t) M_Y(t) = e^{\lambda_X(e^t - 1)}\, e^{\lambda_Y(e^t - 1)} = e^{(\lambda_X + \lambda_Y)(e^t - 1)},$$

which leads to $Z \sim \mathrm{Pois}(\lambda_X + \lambda_Y)$.


Standardized Poisson random variable approaches Gaussian

- Let $X \sim \mathrm{Pois}(\lambda)$.
- $E[X] = \lambda$ and $\mathrm{Var}(X) = \lambda$ (check these!).
- The standardized Poisson r.v. becomes

$$\frac{X - E[X]}{\sqrt{\mathrm{Var}(X)}} = \frac{X - \lambda}{\sqrt{\lambda}} \quad \left(\to Z \sim \mathcal{N}(0, 1) \text{ as } \lambda \to \infty\right).$$

- The MGF is

$$
\begin{aligned}
E\left[e^{t(X-\lambda)/\sqrt{\lambda}}\right] &= E\left[e^{tX/\sqrt{\lambda}}\right] e^{-t\sqrt{\lambda}} = \exp\left(\lambda\left(e^{t/\sqrt{\lambda}} - 1\right)\right) e^{-t\sqrt{\lambda}} \\
&= \exp\left(\lambda\left(\frac{t}{\sqrt{\lambda}} + \frac{t^2}{2!\,\lambda} + \frac{t^3}{3!\,\lambda^{3/2}} + \cdots\right) - t\sqrt{\lambda}\right) \\
&\to \exp\left(\frac{t^2}{2}\right) \quad\text{as } \lambda \to \infty,
\end{aligned}
$$

which is the MGF of $Z \sim \mathcal{N}(0, 1)$.


Transformation Method

- Suppose that $U$ is uniformly distributed in the interval $[0, 1]$.
- From $U$, we may want to generate a r.v. $X$ that has cdf $F_X(x)$.
- To this end, define

$$Z = F_X^{-1}(U).$$

- Then, since $\Pr(U < t) = t$ for $t \in [0, 1]$,

$$\Pr(Z \le x) = \Pr(F_X^{-1}(U) \le x) = \Pr(U \le F_X(x)) = F_X(x).$$

- That is, $Z$ is a r.v. with cdf $F_X(x)$.
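As a sketch of the method, an exponential r.v. can be generated from a uniform one, since $F(x) = 1 - e^{-\lambda x}$ inverts to $F^{-1}(u) = -\ln(1-u)/\lambda$; the parameter, seed, and sample size are illustrative:

```python
import random
from math import log

random.seed(7)
lam = 2.0

def exp_inv_cdf(u):
    """Inverse of F(x) = 1 - exp(-lam*x): F^{-1}(u) = -ln(1-u)/lam."""
    return -log(1 - u) / lam

# transformation method: push uniform samples through the inverse cdf
samples = [exp_inv_cdf(random.random()) for _ in range(200_000)]
mean = sum(samples) / len(samples)

assert abs(mean - 1 / lam) < 0.01  # E[X] = 1/lam = 0.5
```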
