
EE 5110: Probability Foundations for Electrical Engineers July-November 2015

Lecture 20: Expectation of Discrete RVs, Expectation over Different Spaces

Lecturer: Dr. Krishna Jagannathan          Scribe: Arjun Bhagoji

20.1 Expectations of Discrete RVs

A discrete random variable X(ω) (one that takes only countably many values) can be represented as follows:

Definition 20.1 $X(\omega) = \sum_{i=1}^{\infty} a_i I_{A_i}(\omega)$, where $X \geq 0$.

In the canonical representation, the $a_i$'s are non-negative and distinct, and the $A_i$'s are disjoint. It is easy to see that the $A_i$'s partition the sample space. Let us now define a sequence of simple random variables which approximate X from below.

Definition 20.2 Define $X_n(\omega) = \sum_{i=1}^{n} a_i I_{A_i}(\omega)$.

Figure 20.1: A simple random variable: Ω is partitioned into $A_1, A_2, A_3, A_4$, and X takes the constant value $a_i$ on each $A_i$.

Note that $X_n(\omega) \leq X_{n+1}(\omega)$ for every ω and every $n \geq 1$. Next, let us fix $\omega \in \Omega$. Since the $A_i$'s partition Ω, there exists $k \geq 1$ such that $\omega \in A_k$. Thus, $X_n(\omega) = a_k$ for all $n \geq k$, and $X_n(\omega) = 0$ for all $n < k$. Therefore,

$$\lim_{n \to \infty} X_n(\omega) = X(\omega) \quad \forall \omega \in \Omega. \tag{20.1}$$

In other words, $X_n$ is a sequence of simple functions converging monotonically to X.


Now, applying the Monotone Convergence Theorem (MCT) to the sequence of random variables $X_n$,

$$\begin{aligned}
E[X] &= \lim_{n \to \infty} E[X_n] \\
&= \lim_{n \to \infty} \sum_{i=1}^{n} a_i P(A_i) \\
&= \lim_{n \to \infty} \sum_{i=1}^{n} a_i P(X = a_i),
\end{aligned}$$

$$\Rightarrow E[X] = \sum_{i=1}^{\infty} a_i P(X = a_i). \tag{20.2}$$
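As a quick illustration of (20.2), the following sketch (a hypothetical numerical example, not part of the lecture) computes the partial sums $E[X_n]$ for a Poisson random variable; they increase monotonically toward $E[X] = \lambda$, exactly as the MCT argument predicts.

```python
import math

# Hypothetical example: X ~ Poisson(lam), a non-negative discrete RV with E[X] = lam.
lam = 3.0
def pmf(k):
    return math.exp(-lam) * lam**k / math.factorial(k)

# The partial sums E[X_n] = sum_{i<=n} a_i * P(X = a_i) (here a_i = i) increase
# monotonically to E[X], mirroring the MCT argument above.
for n in (5, 10, 20, 40):
    e_xn = sum(i * pmf(i) for i in range(n + 1))
    print(f"E[X_{n}] = {e_xn:.6f}")   # approaches lam = 3.0
```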

The limit of the sum is well defined, since X is a non-negative random variable: the partial sums either converge to a non-negative real number or diverge to +∞. If X is discrete but takes on both positive and negative values, we write $X = X^+ - X^-$, where $X^+ = \max(X, 0)$ and $X^- = -\min(X, 0)$. Then, we compute

$$E[X] = E[X^+] - E[X^-]. \tag{20.3}$$

The above is meaningful only when at least one of the expectations on the right-hand side is finite. We now give some examples; a numerical sketch follows the list.

1. $X \sim \text{Geometric}(p)$: here

   $$E[X] = \sum_{i=1}^{\infty} i (1-p)^{i-1} p = \frac{1}{p},$$

   where the sum is evaluated by differentiating the geometric series $\sum_{i=0}^{\infty} x^i = \frac{1}{1-x}$ at $x = 1-p$. This tells us that, for a geometric random variable, the expected number of trials until the first success scales as $\frac{1}{p}$.

2. $P(X = k) = \frac{6}{\pi^2}\frac{1}{k^2}$ for $k \geq 1$: for this probability distribution, the expectation is

   $$E[X] = \frac{6}{\pi^2} \sum_{i=1}^{\infty} i \left(\frac{1}{i^2}\right) = \frac{6}{\pi^2} \sum_{i=1}^{\infty} \frac{1}{i} = +\infty, \tag{20.4}$$

   since the harmonic series diverges. This example shows that a random variable can have infinite expectation.

3. $P(X = k) = \frac{3}{\pi^2}\frac{1}{k^2}$ for $k \in \mathbb{Z} \setminus \{0\}$: for this probability distribution, we compute $E[X] = E[X^+] - E[X^-]$. However, both $E[X^+]$ and $E[X^-]$ are infinite (each is a constant multiple of the harmonic series), so $E[X]$ is not defined. This is an example of a discrete random variable with undefined expectation.
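The first two examples can be explored numerically. In the sketch below (the truncation levels are arbitrary choices for illustration), the geometric partial sums settle at 1/p, while the partial sums for the $6/\pi^2$ distribution grow like $\frac{6}{\pi^2} \ln N$ without converging.

```python
import math

p, N = 0.25, 10_000

# Example 1: partial sums of E[X] for X ~ Geometric(p) converge to 1/p = 4.
geo = sum(i * (1 - p)**(i - 1) * p for i in range(1, N + 1))
print(f"geometric partial sum: {geo:.6f}   (1/p = {1/p})")

# Example 2: P(X = k) = (6/pi^2)/k^2; the partial sums of E[X] grow like
# (6/pi^2) * ln(N), so the expectation is +infinity.
for n in (100, 10_000, 1_000_000):
    s = (6 / math.pi**2) * sum(1 / k for k in range(1, n + 1))
    print(f"N = {n:>9}: partial sum = {s:.4f}")
```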

20.2 Connection between Riemann and Lebesgue integrals

The connection is given by the following theorem which we state without proof.

Theorem 20.3 Let f be measurable and Riemann integrable over an interval [a, b]. Then, $\int_{[a,b]} f \, d\lambda$ exists, and

$$\int_{[a,b]} f \, d\lambda = \int_a^b f(x) \, dx. \tag{20.5}$$

Here, λ is the Lebesgue measure on $\mathbb{R}$. The integral on the left is a Lebesgue integral, while the one on the right is the standard Riemann integral.
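Theorem 20.3 can be checked numerically. The sketch below (a minimal illustration with an assumed integrand $f(x) = x^2$ on [0, 1]; none of it is from the lecture) contrasts the two viewpoints: the Riemann sum partitions the domain, while the Lebesgue-style sum partitions the range, here realized via the layer-cake identity $\int f \, d\lambda = \int_0^{\sup f} \lambda(\{x : f(x) > y\}) \, dy$ for non-negative f. Both approach the exact value 1/3.

```python
import numpy as np

# Illustrative integrand on [a, b]; the exact integral of x^2 over [0, 1] is 1/3.
f = lambda x: x**2
a, b = 0.0, 1.0
n = 100_000

# Riemann view: partition the DOMAIN and sum f(x_i) * dx.
x = np.linspace(a, b, n, endpoint=False)
riemann = np.sum(f(x)) * (b - a) / n

# Lebesgue view (for f >= 0): partition the RANGE and estimate the measure
# lambda({x : f(x) > y}) at each level y by the fraction of grid points above y.
vals = np.sort(f(x))
levels = np.linspace(0.0, vals[-1], n, endpoint=False)
above = n - np.searchsorted(vals, levels, side="right")  # counts of {f > y}
lebesgue = np.sum((above / n) * (b - a)) * (vals[-1] / n)

print(f"Riemann sum  : {riemann:.6f}")
print(f"Lebesgue sum : {lebesgue:.6f}   (exact value: {1/3:.6f})")
```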


20.3 Expectations on different spaces

We often want to compute the expectation of a function of a random variable, say Y = f(X), where both X and Y are random variables and f(·) is a measurable function on $\mathbb{R}$. The following theorem asserts that the expectation can be computed over different spaces, yielding the same answer. For example, we can compute the expectation of Y by working in either the X-space or the Y-space, to write (for discrete random variables)

$$\sum_i y_i P(Y = y_i) = \sum_i f(a_i) P(X = a_i), \tag{20.6}$$

where $y_i = f(a_i)$. This is just a special case of the following theorem.

Theorem 20.4 Denote the probability measure on the sample space by P, the measure on the range space of X by $P_X$, and the measure on the range space of Y by $P_Y$. Then,

$$\int Y \, dP = \int f \, dP_X = \int y \, dP_Y,$$

where Y = f(X) and the integrals are taken over the respective spaces.

Figure 20.2: Different spaces considered

Proof: Let f be a simple function taking the values $y_1, y_2, \ldots, y_n$. Then,

$$\begin{aligned}
\int Y \, dP &= \sum_{i=1}^{n} y_i \, P(\{\omega : Y(\omega) = y_i\}) \\
&= \sum_{i=1}^{n} y_i \, P(\{\omega : f(X(\omega)) = y_i\}).
\end{aligned}$$


Now, looking at the second integral, we have

$$\begin{aligned}
\int f \, dP_X &= \sum_{i=1}^{n} y_i \, P_X(\{x \in \mathbb{R} : f(x) = y_i\}) \\
&= \sum_{i=1}^{n} y_i \, P_X(f^{-1}(y_i)) \\
&= \sum_{i=1}^{n} y_i \, P(\{\omega : X(\omega) \in f^{-1}(y_i)\}) \\
&= \sum_{i=1}^{n} y_i \, P(\{\omega : f(X(\omega)) = y_i\}).
\end{aligned}$$

Now, we extend the above to the case where f is a non-negative measurable function. Let $f_n$ be a sequence of simple functions such that $f_n \uparrow f$, according to the construction given in the previous lecture. Thus, $f_n(X) \uparrow f(X)$, and

$$\begin{aligned}
\int Y \, dP &= \int (f \circ X) \, dP \\
&= \int f(X) \, dP \\
&= \lim_{n \to \infty} \int f_n(X) \, dP \quad \text{(by MCT)} \\
&= \lim_{n \to \infty} \int f_n \, dP_X \quad \text{(each $f_n$ is simple)} \\
&= \int f \, dP_X \quad \text{(by MCT)}.
\end{aligned}$$

This can now be extended straightforwardly to the case where f takes both positive and negative values.

A simple corollary of this theorem is that $\int X \, dP = \int x \, dP_X$.
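To make Theorem 20.4 concrete, here is a minimal numerical sketch (the finite sample space, the map X, and the function f are assumptions made up for illustration): the same expectation is computed over Ω, over the range space of X, and over the range space of Y, and all three sums agree.

```python
from collections import defaultdict

# A hypothetical finite probability space: outcomes with their probabilities.
P = {"w1": 0.2, "w2": 0.3, "w3": 0.4, "w4": 0.1}
X = {"w1": -1, "w2": 0, "w3": 1, "w4": 1}   # a discrete RV on Omega
f = lambda x: x**2                          # Y = f(X)

# (1) Over Omega: integral of Y dP = sum_w f(X(w)) P(w).
e_omega = sum(f(X[w]) * P[w] for w in P)

# (2) Over the X-space: integral of f dP_X = sum_a f(a) P(X = a).
PX = defaultdict(float)
for w in P:
    PX[X[w]] += P[w]
e_xspace = sum(f(a) * pa for a, pa in PX.items())

# (3) Over the Y-space: integral of y dP_Y = sum_y y P(Y = y).
PY = defaultdict(float)
for a, pa in PX.items():
    PY[f(a)] += pa
e_yspace = sum(y * py for y, py in PY.items())

print(e_omega, e_xspace, e_yspace)   # all three agree: 0.2 + 0 + 0.5 = 0.7
```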

20.4 Exercises

1. [Dimitri P. Bertsekas] Let X be a random variable with PMF $p_X(x) = \frac{x^2}{a}$ for $x \in \{-3, -2, -1, 0, 1, 2, 3\}$ and zero otherwise. Compute a and E[X].

2. [Dimitri P. Bertsekas] As an advertising campaign, a chocolate factory places golden tickets in some of its candy bars, with the promise that a golden ticket is worth a trip through the chocolate factory and all the chocolate you can eat for life. If the probability of finding a golden ticket is p, find the expected number of bars you need to eat to find a ticket.

3. [Dimitri P. Bertsekas] On a given day, your golf score takes each value in the range 101 to 110 with probability 0.1, independently of other days. Determined to improve your score, you decide to play on three different days and declare as your score the minimum X of the scores $X_1$, $X_2$ and $X_3$ on the different days. By how much has your expected score improved as a result of playing on three days?

4. [Papoulis] A biased coin is tossed and the first outcome is noted. Let the probability of a head be p and that of a tail be q = 1 - p. The tossing is continued until the outcome is the complement of the first outcome, thus completing the first run. Let X denote the length of the first run. Find the PMF of X and show that $E[X] = \frac{p}{q} + \frac{q}{p}$.