
  • ECE353: Probability and Random Processes

    Lecture 5 - Cumulative Distribution Function and Expectation

    Xiao Fu

    School of Electrical Engineering and Computer Science, Oregon State University

    E-mail: xiao.fu@oregonstate.edu

  • From PMF to CDF

    Recall that the PMF of a discrete RV is P_X(x) = P[X = x].

    Definition: the Cumulative Distribution Function (CDF) is

    F_X(x) := P[X ≤ x].

    The CDF is very useful since in many cases we care about P[X ≤ x].

    It also comes in very handy when calculating quantities like P[ℓ < X ≤ u].


  • From PMF to CDF

    Example: X with PMF

    P_X(x) =
      0.15, x = 1
      0.65, x = 2
      0.2,  x = 3
      0,    otherwise.

    [Figure: stem plot of the PMF, with mass 0.15 at x = 1, 0.65 at x = 2, and 0.2 at x = 3.]


  • From PMF to CDF

    Example: from PMF to CDF (F_X(x) = P[X ≤ x]). Starting from

    P_X(x) =
      0.15, x = 1
      0.65, x = 2
      0.2,  x = 3
      0,    otherwise,

    and accumulating the probability mass value by value gives

    F_X(x) =
      0,    x < 1
      0.15, 1 ≤ x < 2
      0.8,  2 ≤ x < 3
      1,    x ≥ 3.

    [Figure: staircase plot of F_X(x), jumping from 0 to 0.15 at x = 1, to 0.8 at x = 2, and to 1 at x = 3.]

  • Properties of CDF

    [Figure: the staircase CDF from the previous example, with levels 0.15, 0.8, and 1 starting at x = 1, 2, and 3.]

    Some important properties of the CDF:

    1) F_X(−∞) = 0 and F_X(+∞) = 1.
    2) F_X(x) ≥ 0.
    3) For any x′ ≥ x, we have F_X(x′) ≥ F_X(x), i.e., the CDF is non-decreasing.
    4) F_X(x) is constant between two consecutive values x_1 and x_2 of a discrete RV.
    5) P[ℓ < X ≤ u] = F_X(u) − F_X(ℓ).

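    The CDF construction above is easy to check numerically. Below is a minimal Python sketch (assuming NumPy is available; the helper name cdf is illustrative, not from the lecture) that builds F_X from the example PMF and uses property 5 to compute an interval probability.

```python
import numpy as np

# PMF from the example: P[X=1] = 0.15, P[X=2] = 0.65, P[X=3] = 0.2
values = np.array([1, 2, 3])
pmf = np.array([0.15, 0.65, 0.2])

def cdf(x):
    """F_X(x) = P[X <= x]: sum the PMF over all values not exceeding x."""
    return pmf[values <= x].sum()

# Staircase shape: 0 below 1, then 0.15, 0.8, and 1.0
print([cdf(x) for x in (0.5, 1, 1.5, 2, 2.5, 3, 4)])

# Property 5: P[l < X <= u] = F_X(u) - F_X(l)
l, u = 1, 3
print(cdf(u) - cdf(l))                            # 0.85
print(pmf[(values > l) & (values <= u)].sum())    # same answer directly from the PMF
```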

  • Sample Mean and Expectation

    Consider a collection of random samples {X_1, . . . , X_N}. Compute the sample mean:

    sample mean := (1/N) Σ_{i=1}^{N} X_i.

    X_i corresponds to the ith sample, or the outcome of the ith trial of a random experiment.

    The problem with the sample mean is that it is itself random.

    To fix this, a solution is to take N → ∞, in which case, under certain conditions, it can be shown that the sample mean converges to the ensemble mean, or the expectation of the random variable X.

    Definition: the expectation of a RV X is defined as

    E[X] := μ_X = Σ_{x ∈ S_X} x P_X(x).

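    As a quick illustration of the two notions above, here is a hedged sketch (NumPy assumed) that computes the expectation of the earlier example RV from its PMF and shows that sample means of finite batches fluctuate around it.

```python
import numpy as np

values = np.array([1, 2, 3])
pmf = np.array([0.15, 0.65, 0.2])

# Ensemble mean (expectation): a fixed number computed from the PMF
print((values * pmf).sum())        # 1*0.15 + 2*0.65 + 3*0.2 = 2.05

# Sample means: random, different for every batch of samples
rng = np.random.default_rng(0)
for _ in range(3):
    samples = rng.choice(values, size=1000, p=pmf)
    print(samples.mean())          # fluctuates around 2.05
```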

  • Expectation

    Example: Let us roll a fair die. We have S_X = {1, 2, 3, 4, 5, 6} and

    P_X(x) =
      1/6, x ∈ {1, 2, 3, 4, 5, 6}
      0,   otherwise.

    E[X] = Σ_{x ∈ {1,2,...,6}} (1/6) · x
         = (1/6) (1 + 2 + 3 + 4 + 5 + 6)
         = 3.5

    Why did we say the sample mean converges to E[X]?

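    One way to see the convergence the question hints at is to simulate it. A small sketch (standard library only) rolling a fair die: the sample mean settles near E[X] = 3.5 as the number of rolls grows.

```python
import random

random.seed(1)
for n in (100, 10_000, 1_000_000):
    rolls = [random.randint(1, 6) for _ in range(n)]
    print(n, sum(rolls) / n)   # approaches E[X] = 3.5 as n grows
```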

  • Expectation

    Consider a collection of samples {X_1, . . . , X_N}. Denote

    N(x_j) = Σ_{i=1}^{N} 1(X_i = x_j),  where 1(X = x) = 1 if X = x and 0 otherwise.

    In plain words, N(x_j) is the number of times x_j appears in the set of samples.

    Consequently, we have

    (1/N) Σ_{i=1}^{N} X_i = (1/N) Σ_{x_j ∈ S_X} x_j N(x_j) = Σ_{x_j ∈ S_X} x_j · N(x_j)/N,

    where we have lim_{N→∞} N(x_j)/N = P_X(x_j).

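    The identity above can be checked directly: grouping samples by value reproduces the sample mean, and the relative frequencies N(x_j)/N approach P_X(x_j) for large N. A sketch (NumPy assumed):

```python
import numpy as np

values = np.array([1, 2, 3])
pmf = np.array([0.15, 0.65, 0.2])

rng = np.random.default_rng(0)
samples = rng.choice(values, size=100_000, p=pmf)
N = samples.size

# Left-hand side: ordinary sample mean (1/N) * sum_i X_i
lhs = samples.mean()

# Right-hand side: sum_j x_j * N(x_j)/N, where N(x_j) counts occurrences of x_j
counts = np.array([(samples == x).sum() for x in values])
rhs = (values * counts / N).sum()

print(lhs, rhs)       # equal up to floating-point rounding
print(counts / N)     # close to [0.15, 0.65, 0.2] for large N
```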

  • Expectation of Bernoulli RV

    Example: Bernoulli 0-1 RV:

    X =
      0, w.p. 1 − p
      1, w.p. p.

    By definition, E[X] = 0 · (1 − p) + 1 · p = p.

    What if we have

    X =
      3, w.p. 1 − p
      5, w.p. p?

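    Applying the same definition to the follow-up RV gives E[X] = 3(1 − p) + 5p = 3 + 2p. A tiny sketch by direct enumeration (plain Python; p = 0.3 is an arbitrary illustrative value, and the helper name expectation is not from the lecture):

```python
# E[X] = sum_x x * P_X(x), by direct enumeration over the support
def expectation(pmf):
    return sum(x * px for x, px in pmf.items())

p = 0.3   # illustrative value

print(expectation({0: 1 - p, 1: p}))   # Bernoulli 0-1 RV: E[X] = p = 0.3
print(expectation({3: 1 - p, 5: p}))   # values 3 and 5: E[X] = 3 + 2p = 3.6
```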

  • Expectation of Geometric RV

    Example: Geometric RV:

    P_X(x) =
      p(1 − p)^{x−1}, x ∈ {1, 2, 3, . . .}
      0,              otherwise.

    By definition, we have

    E[X] = Σ_{x=1}^{∞} x p (1 − p)^{x−1}

         = p Σ_{x=1}^{∞} x q^{x−1}        (q = 1 − p)

         = p Σ_{x=1}^{∞} d(q^x)/dq


  • Expectation of Geometric RV

    By definition, we have

    E[X] = Σ_{x=1}^{∞} x p (1 − p)^{x−1}

         = p · d(Σ_{x=1}^{∞} q^x)/dq        (q = 1 − p)

         = p · d(q (1 + q + q^2 + . . .))/dq

         = p · d(q/(1 − q))/dq

         = p · 1/(1 − q)^2

         = 1/p.

    This is intuitive. Consider the coffee shop example: the number of visits you need before you meet your barista is, on average, inversely proportional to the probability of meeting him/her there on any given visit.

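    The coffee-shop intuition is easy to simulate. A sketch (standard library only; p = 0.2 is an arbitrary illustrative value): counting visits until the first success and averaging gives roughly 1/p.

```python
import random

random.seed(0)
p = 0.2   # probability of meeting the barista on any given visit

def visits_until_success():
    """Draw a Geometric(p) sample on {1, 2, ...}: count trials until the first success."""
    n = 1
    while random.random() >= p:   # failure with probability 1 - p
        n += 1
    return n

trials = [visits_until_success() for _ in range(100_000)]
print(sum(trials) / len(trials))   # close to 1/p = 5
```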

  • Expectation of Poisson RV

    Example: Poisson RV:

    P_X(x) =
      (λ^x / x!) e^{−λ}, x ∈ {0, 1, 2, 3, . . .}
      0,                 otherwise.

    By definition, we have

    E[X] = Σ_{x=0}^{∞} x (λ^x / x!) e^{−λ}

         = Σ_{x=1}^{∞} x (λ^x / x!) e^{−λ} = Σ_{x=1}^{∞} (λ^x / (x − 1)!) e^{−λ}

         = Σ_{y=0}^{∞} (λ^{y+1} / y!) e^{−λ} = λ Σ_{y=0}^{∞} (λ^y / y!) e^{−λ}

         = λ,

    since the last sum is the Poisson PMF summed over its entire support, which equals 1.
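    A numerical check of E[X] = λ (standard library only; λ = 2.5 is an arbitrary illustrative value): truncating the sum Σ_x x (λ^x / x!) e^{−λ} at a large x recovers λ.

```python
import math

lam = 2.5   # illustrative rate; the lecture keeps lambda symbolic

def poisson_pmf(x, lam):
    return lam**x * math.exp(-lam) / math.factorial(x)

# Truncate the infinite sum; terms beyond x = 100 are negligible for this lambda
print(sum(x * poisson_pmf(x, lam) for x in range(101)))   # approximately 2.5
```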

  • Limit of Poisson RV

    Theorem: The Poisson PMF is the limit of the Binomial(n, p) PMF as n → ∞ and p → 0 with λ = np held fixed.

    Proof: The Binomial(n, p) PMF is

    (n choose k) p^k (1 − p)^{n−k}.

    Taking p = λ/n, we wish to show

    (n choose k) (λ/n)^k (1 − λ/n)^{n−k}  →  (λ^k / k!) e^{−λ}.

    The left-hand side (LHS) can be written as

    (n! / (k! (n − k)!)) (λ^k / n^k) (1 − p)^{n−k} = (λ^k / k!) [ n(n − 1)(n − 2) · · · (n − k + 1) / n^k ] (1 − p)^{n−k}.


  • Limit of Poisson RV

    Let's continue. The LHS equals

    (λ^k / k!) [ n(n − 1)(n − 2) · · · (n − k + 1) / (n · n · · · n) ] (1 − p)^{n−k}.

    We have

    lim_{n→∞} (λ^k / k!) [ n(n − 1)(n − 2) · · · (n − k + 1) / (n · n · · · n) ] (1 − p)^{n−k}

      = (λ^k / k!) lim_{n→∞} (1 − λ/n)^{n−k}          (the bracketed term tends to 1)

      = (λ^k / k!) lim_{n→∞} (1 − λ/n)^n (1 − λ/n)^{−k}

      = (λ^k / k!) lim_{n→∞} (1 − λ/n)^n.

    By basic calculus, we have lim_{n→∞} (1 − 1/n)^n = e^{−1}, and more generally lim_{n→∞} (1 − λ/n)^n = e^{−λ}. Hence the limit is (λ^k / k!) e^{−λ}, which is the Poisson PMF.

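    The limit can also be observed numerically. A sketch (standard library only; λ = 3 and k = 4 are arbitrary illustrative choices) comparing the Binomial(n, λ/n) PMF with the Poisson(λ) PMF as n grows:

```python
import math

lam, k = 3.0, 4   # illustrative choices

def binom_pmf(k, n, p):
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

print("Poisson(lam) at k:", lam**k * math.exp(-lam) / math.factorial(k))

for n in (10, 100, 1000, 10_000):
    print(n, binom_pmf(k, n, lam / n))   # approaches the Poisson value as n grows
```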
