hw3



E2 202 (Aug–Dec 2015) Homework Assignment 3

Discussion: Friday, Sept. 4

1. Basic properties of variance.

(a) Verify that Var(aX) = a^2 Var(X) for any a ∈ R.

(b) Show that Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y).
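
Not part of the assignment: a quick numerical sanity check of the two identities in Problem 1. The distribution of X, the coupling with Y, and the sample size below are arbitrary illustrative choices; any pair of random variables with finite variances would do.

```python
# Empirical check of Var(aX) = a^2 Var(X) and
# Var(X + Y) = Var(X) + Var(Y) + 2 Cov(X, Y) on simulated data.
import numpy as np

rng = np.random.default_rng(0)
n = 200_000
a = 3.0
X = rng.exponential(scale=2.0, size=n)   # any distribution with finite variance works
Y = 0.5 * X + rng.normal(size=n)         # correlated with X so that Cov(X, Y) != 0

# (a) Var(aX) should equal a^2 Var(X).
print(np.var(a * X), a**2 * np.var(X))

# (b) Var(X + Y) should equal Var(X) + Var(Y) + 2 Cov(X, Y).
cov_xy = np.cov(X, Y, bias=True)[0, 1]   # bias=True matches np.var's normalization
print(np.var(X + Y), np.var(X) + np.var(Y) + 2 * cov_xy)
```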

2. Moment generating functions.

(a) Determine the moment generating function of a random variable X having a GEOM(p) distribution, i.e.,

P(X = k) = (1 − p)^{k−1} p, for k = 1, 2, 3, ....

Use the MGF to compute the first two moments of X.

(b) Verify that for an r.v. X taking values in {±1, ±2, ±3, ...}, with

P(X = k) = (3/π^2) · (1/k^2), for k = ±1, ±2, ±3, ...,

the moment generating function M_X(t) is not finite for t ≠ 0.
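
Not part of the assignment: a numerical sanity check for Problem 2(a), comparing a truncated series for E[e^{tX}] under GEOM(p) with the standard closed form p e^t / (1 − (1 − p)e^t), which is finite only for t < −ln(1 − p). The values of p and t below are illustrative.

```python
# Truncated MGF series for GEOM(p) versus the standard closed form,
# plus the first two moments computed directly from the pmf.
import numpy as np

p, t = 0.3, 0.1                        # illustrative; need t < -ln(1 - p) ≈ 0.357
k = np.arange(1, 1001)

series = np.sum((1 - p) ** (k - 1) * p * np.exp(t * k))
closed = p * np.exp(t) / (1 - (1 - p) * np.exp(t))
print(series, closed)                  # should agree to many decimal places

# E[X] = 1/p and E[X^2] = (2 - p)/p^2 for GEOM(p).
print(np.sum(k * (1 - p) ** (k - 1) * p), 1 / p)
print(np.sum(k**2 * (1 - p) ** (k - 1) * p), (2 - p) / p**2)
```

For part (b), the analogous series has terms proportional to e^{tk}/k^2, which do not even tend to zero as k → ∞ (for t > 0) or as k → −∞ (for t < 0).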

3. Characteristic functions.

(a) Let Φ_X denote the characteristic function of a random variable X. Let W = a + X and Y = aX for some constant a ∈ R. Express the characteristic functions Φ_W and Φ_Y in terms of Φ_X.

(b) Recall that the pdf of a Gaussian random variable Z with mean 0 and variance 1 is given by

f_Z(z) = (1/√(2π)) e^{−z^2/2}, z ∈ R.

Compute the characteristic function of Z.

(c) From parts (a) and (b), deduce the characteristic function of a Gaussian r.v. with mean µ and variance σ^2.
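
Not part of the assignment: a Monte Carlo sanity check for Problem 3, using the standard facts that Φ_Z(t) = e^{−t^2/2} for Z ~ N(0, 1) and Φ_X(t) = e^{iµt − σ^2 t^2/2} for X ~ N(µ, σ^2). The parameter values and sample size below are illustrative.

```python
# Monte Carlo estimates of E[e^{itZ}] and E[e^{itX}] versus the
# known closed-form characteristic functions (agreement is approximate).
import numpy as np

rng = np.random.default_rng(1)
z = rng.standard_normal(500_000)
mu, sigma, t = 1.5, 2.0, 0.7

# Standard normal: Phi_Z(t) = exp(-t^2 / 2).
print(np.mean(np.exp(1j * t * z)), np.exp(-t**2 / 2))

# Shift/scale as in part (a): X = mu + sigma * Z is N(mu, sigma^2),
# with Phi_X(t) = exp(1j * mu * t - sigma^2 * t^2 / 2).
x = mu + sigma * z
print(np.mean(np.exp(1j * t * x)), np.exp(1j * mu * t - sigma**2 * t**2 / 2))
```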

4. Let X_1, X_2, ..., X_n be n independent random variables, with each X_i satisfying P(X_i = +1) = P(X_i = −1) = 1/2.

(a) Determine the mean and variance of X = (1/n) ∑_{i=1}^{n} X_i.

(b) Use Chebyshev’s inequality to bound the probability P(|X| > δ) for δ > 0.

(c) Use the Chernoff bounding technique to show that for 0 < δ < 1,

P(X > δ) ≤ 2^{−n[1 − h((1+δ)/2)]},

where h(·) is the binary entropy function defined by h(x) = −x log_2(x) − (1 − x) log_2(1 − x) for x ∈ [0, 1]. Hence, by symmetry,

P(|X| > δ) = 2 · P(X > δ) ≤ 2 · 2^{−n[1 − h((1+δ)/2)]}.

The point of this exercise is that the Chernoff bound yields an exponential decay in n for the tail probability of X, while Chebyshev’s inequality only yields a 1/n decay.
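
Not part of the assignment: a small simulation comparing the Chebyshev bound 1/(nδ^2) and the Chernoff bound 2 · 2^{−n[1 − h((1+δ)/2)]} from Problem 4 against a Monte Carlo estimate of P(|X| > δ). The values of n, δ, and the number of trials are illustrative choices.

```python
# Empirical tail probability of the sample mean of n iid +/-1 variables,
# compared with the Chebyshev and Chernoff bounds from Problem 4.
import numpy as np

def h(x):
    """Binary entropy function in bits."""
    return -x * np.log2(x) - (1 - x) * np.log2(1 - x)

rng = np.random.default_rng(2)
n, delta, trials = 100, 0.3, 200_000

# X = (1/n) * sum of n iid +/-1 variables; simulate via the number of +1's.
heads = rng.binomial(n, 0.5, size=trials)
x_bar = (2 * heads - n) / n

empirical = np.mean(np.abs(x_bar) > delta)
chebyshev = 1.0 / (n * delta**2)                      # uses Var(X) = 1/n
chernoff = 2.0 * 2.0 ** (-n * (1 - h((1 + delta) / 2)))

print(empirical, chebyshev, chernoff)
```

With these illustrative values the Chernoff bound is already an order of magnitude tighter than the Chebyshev bound, and the gap widens rapidly as n grows.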