Independence of random variables


Page 1: Independence of random variables


Independence of random variables

• Definition

Random variables X and Y are independent if their joint distribution function factors into the product of their marginal distribution functions, i.e.

FX,Y(x,y) = FX(x) FY(y) for all x, y.

• Theorem

Suppose X and Y are jointly continuous random variables. Then X and Y are independent if and only if the product of any density for X and any density for Y is a joint density for the pair (X,Y), i.e.

fX,Y(x,y) = fX(x) fY(y) for all x, y.

Proof:

• If X and Y are independent random variables and Z = g(X), W = h(Y), then Z and W are also independent.
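For instance, if X and Y are independent Exponential(1) random variables, then fX(x) = e^−x and fY(y) = e^−y for x, y > 0, so the joint density is fX,Y(x,y) = e^−(x+y) on the positive quadrant; conversely, any joint density of this product form identifies X and Y as independent.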

Page 2: Independence of random variables


Example

• Suppose X and Y are discrete random variables whose values are the non-negative integers and whose joint probability function is

pX,Y(x,y) = e^−2 / (x! y!), x, y = 0, 1, 2, …

Are X and Y independent? What are their marginal distributions?

• Factorization is enough for independence, but we need to be careful with constant terms for the factors to be marginal probability functions.
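Sketch of the factorization (with the constant e^−2 as reconstructed above): e^−2 / (x! y!) = (e^−1/x!)(e^−1/y!), and each factor is a Poisson(1) probability function, so X and Y are independent Poisson(1) random variables. Splitting e^−2 as e^−1 · e^−1 is exactly the care with constants the comment refers to: each factor must sum to 1 on its own.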

Page 3: Independence of random variables


Example and Important Comment

• The joint density for X, Y is given by

fX,Y(x,y) = 4xy for 0 ≤ x ≤ 1, 0 ≤ y ≤ 1, and 0 otherwise.

• Are X, Y independent?

• Independence requires that the set of points where the joint density is positive be the Cartesian product of the set of points where the marginal densities are positive, i.e. the set of points where fX,Y(x,y) > 0 must be a (possibly infinite) rectangle.
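To see why the support condition matters, consider the density g(x,y) = 8xy for 0 ≤ y ≤ x ≤ 1 and 0 otherwise (an added illustration). The formula 8xy factors into a function of x times a function of y, but the support is a triangle rather than a rectangle; the marginals are gX(x) = 4x³ and gY(y) = 4y(1 − y²), and their product does not reproduce g, so the two variables are not independent.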

Page 4: Independence of random variables


Conditional densities

• If X, Y are jointly distributed continuous random variables, the conditional density function of Y | X is defined to be

fY|X(y|x) = fX,Y(x,y) / fX(x)

if fX(x) > 0, and 0 otherwise.

• If X, Y are independent then fY|X(y|x) = fY(y).

• Also, fX,Y(x,y) = fY|X(y|x) fX(x).

Integrating both sides over x we get

fY(y) = ∫ fY|X(y|x) fX(x) dx

• This is a useful application of the law of total probability for the continuous case.
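The last identity lends itself to a quick numerical check. Below is a minimal Python sketch (not from the slides); the model X ~ Exponential(1) with Y | X = x ~ N(x, 1), and all names in it, are illustrative assumptions:

# Numerical check of fY(y) = ∫ fY|X(y|x) fX(x) dx for a hypothetical model:
# X ~ Exponential(1) and, given X = x, Y ~ N(x, 1).
import numpy as np

def f_X(x):
    return np.exp(-x) * (x > 0)  # Exponential(1) density

def f_Y_given_X(y, x):
    return np.exp(-0.5 * (y - x) ** 2) / np.sqrt(2 * np.pi)  # N(x, 1) density

# Numerically integrate fY|X(y|x) fX(x) over x to get the marginal at y0.
x = np.linspace(0.0, 50.0, 200_000)
y0 = 1.5
f_Y_numeric = np.trapz(f_Y_given_X(y0, x) * f_X(x), x)

# Monte Carlo check: simulate the two-stage experiment and estimate fY(y0)
# from the fraction of Y values in a small window around y0.
rng = np.random.default_rng(0)
X = rng.exponential(1.0, size=2_000_000)
Y = rng.normal(X, 1.0)
h = 0.01
f_Y_mc = np.mean(np.abs(Y - y0) < h) / (2 * h)

print(f_Y_numeric, f_Y_mc)  # the two estimates should agree closely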

Page 5: Independence of random variables


Example

• Consider the joint density

fX,Y(x,y) = 2e^−(x+y) for 0 < x < y < ∞, and 0 otherwise.

• Find the conditional density of X given Y and the conditional density of Y given X.
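A sketch of the solution, assuming the density reads 2e^−(x+y) on 0 < x < y as reconstructed above: the marginals are fX(x) = ∫x^∞ 2e^−(x+y) dy = 2e^−2x for x > 0 and fY(y) = ∫0^y 2e^−(x+y) dx = 2e^−y(1 − e^−y) for y > 0, so fY|X(y|x) = e^−(y−x) for y > x and fX|Y(x|y) = e^−x / (1 − e^−y) for 0 < x < y. Note that the support 0 < x < y is not a rectangle, so X and Y are not independent.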

Page 6: Independence of random variables


Properties of Expectations Involving Joint Distributions

• For random variables X, Y and constants a, b ∈ ℝ,

E(aX + bY) = aE(X) + bE(Y)

Proof:

• For independent random variables X, Y,

E(XY) = E(X)E(Y)

whenever these expectations exist.

Proof:
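A one-line sketch of the second proof in the jointly continuous case: by independence fX,Y(x,y) = fX(x) fY(y), so E(XY) = ∫∫ xy fX(x) fY(y) dx dy = (∫ x fX(x) dx)(∫ y fY(y) dy) = E(X)E(Y); the discrete case is the analogous double sum.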

Page 7: Independence of random variables


Covariance

• Recall: Var(X + Y) = Var(X) + Var(Y) + 2E[(X − E(X))(Y − E(Y))]

• Definition

For random variables X, Y with E(X), E(Y) < ∞, the covariance of X and Y is

Cov(X,Y) = E[(X − E(X))(Y − E(Y))]

• Covariance measures whether or not X − E(X) and Y − E(Y) tend to have the same sign.

• Claim: Cov(X,Y) = E(XY) − E(X)E(Y)

Proof:

• Note: If X, Y are independent then E(XY) = E(X)E(Y), and so Cov(X,Y) = 0.
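Sketch of the claim: expanding the product and using linearity, Cov(X,Y) = E[XY − XE(Y) − YE(X) + E(X)E(Y)] = E(XY) − E(X)E(Y) − E(X)E(Y) + E(X)E(Y) = E(XY) − E(X)E(Y).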

Page 8: Independence of random variables


Example

• Suppose X, Y are discrete random variables with joint probability function given by the table below.

• Find Cov(X,Y). Are X, Y independent?

            y = −1   y = 0   y = 1   pX(x)
  x = −1     1/8      1/8     1/8
  x = 0      1/8       0      1/8
  x = 1      1/8      1/8     1/8
  pY(y)
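Working from the table: the marginals are pX(−1) = pX(1) = 3/8 and pX(0) = 2/8, and likewise for pY by symmetry, so E(X) = E(Y) = 0. Also E(XY) = (−1)(−1)(1/8) + (−1)(1)(1/8) + (1)(−1)(1/8) + (1)(1)(1/8) = 0, hence Cov(X,Y) = 0. Yet X, Y are not independent: pX,Y(0,0) = 0 while pX(0)pY(0) = (2/8)(2/8) ≠ 0. This is the standard illustration that zero covariance does not imply independence.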

Page 9: Independence of random variables


Important Facts

• Independence of X, Y implies Cov(X,Y) = 0 but NOT vice versa.

• If X, Y independent then Var(X+Y) = Var(X) + Var(Y).

• Whether or not X, Y are independent,

Var(X + Y) = Var(X) + Var(Y) + 2Cov(X,Y).

• Cov(X,X) = Var(X).

Page 10: Independence of random variables


Example

• Suppose Y ~ Binomial(n, p). Find Var(Y).
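Sketch: write Y = X1 + … + Xn, where the Xi are independent Bernoulli(p) indicators of success on each trial. Each Var(Xi) = E(Xi²) − (E(Xi))² = p − p² = p(1 − p), and since the Xi are independent the variances add, so Var(Y) = np(1 − p).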

Page 11: Independence of random variables


Properties of Covariance

For random variables X, Y, Z and constants a, b, c, d ∈ ℝ:

• Cov(aX + b, cY + d) = acCov(X,Y)

• Cov(X + Y, Z) = Cov(X,Z) + Cov(Y,Z)

• Cov(X,Y) = Cov(Y,X)

Page 12: Independence of random variables


Correlation

• Definition

For X, Y random variables, the correlation of X and Y is

ρ(X,Y) = Cov(X,Y) / √(V(X)V(Y))

whenever V(X), V(Y) ≠ 0 and all these quantities exist.

• Claim: ρ(aX + b, cY + d) = ρ(X,Y) whenever ac > 0 (if ac < 0, the sign of the correlation flips)

Proof:

• This claim means that the correlation is scale invariant.
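Sketch: Cov(aX + b, cY + d) = acCov(X,Y) by the covariance properties above, while V(aX + b) = a²V(X) and V(cY + d) = c²V(Y). Hence ρ(aX + b, cY + d) = acCov(X,Y) / (|a||c|√(V(X)V(Y))), which equals ρ(X,Y) when ac > 0 and −ρ(X,Y) when ac < 0.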

Page 13: Independence of random variables


Theorem

• For X, Y random variables, whenever the correlation ρ(X,Y) exists it must satisfy

−1 ≤ ρ(X,Y) ≤ 1

Proof:
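Sketch: standardize X* = (X − E(X))/√V(X) and Y* = (Y − E(Y))/√V(Y). Then 0 ≤ V(X* ± Y*) = V(X*) + V(Y*) ± 2Cov(X*,Y*) = 2 ± 2ρ(X,Y), and the two inequalities rearrange to −1 ≤ ρ(X,Y) ≤ 1.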

Page 14: Independence of random variables


Interpretation of Correlation ρ

• ρ(X,Y) is a measure of the strength and direction of the linear relationship between X, Y.

• If X, Y have non-zero variance, then ρ(X,Y) ∈ [−1, 1].

• If X, Y are independent, then ρ(X,Y) = 0. Note that this is not the only situation in which ρ(X,Y) = 0!

• Y is a linearly increasing function of X if and only if ρ(X,Y) = 1.

• Y is a linearly decreasing function of X if and only if ρ(X,Y) = −1.

Page 15: Independence of random variables


Example

• Find Var(X - Y) and ρ(X,Y) if X, Y have the following joint density

fX,Y(x,y) = 3x for 0 ≤ y ≤ x ≤ 1, and 0 otherwise.
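Computing Var(X − Y) and ρ(X,Y) by hand takes several integrals, so here is a minimal Monte Carlo sketch (not from the slides, and assuming the density 3x on 0 ≤ y ≤ x ≤ 1 as reconstructed above). It uses the facts that fX(x) = 3x², so X can be sampled as U^(1/3) for U uniform on (0,1), and that given X = x, Y is uniform on (0, x):

# Monte Carlo estimates of Var(X - Y) and rho(X, Y) under the reconstructed
# density f(x, y) = 3x on 0 <= y <= x <= 1.
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Marginal of X: fX(x) = 3x^2 on (0, 1), so X = U^(1/3) by inverse CDF.
X = rng.uniform(size=n) ** (1 / 3)
# Conditional: given X = x, Y ~ Uniform(0, x).
Y = rng.uniform(size=n) * X

print(np.var(X - Y))            # exact value is 19/320 ≈ 0.059
print(np.corrcoef(X, Y)[0, 1])  # exact value is 3/√57 ≈ 0.397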

Page 16: Independence of random variables


Conditional Expectation

• For X, Y discrete random variables, the conditional expectation of Y given X = x is

E(Y | X = x) = Σy y pY|X(y|x)

and the conditional variance of Y given X = x is

V(Y | X = x) = Σy (y − E(Y | X = x))² pY|X(y|x) = E(Y² | X = x) − (E(Y | X = x))²

where these are defined only if the sums converge absolutely.

• In general,

E(h(Y) | X = x) = Σy h(y) pY|X(y|x)
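As a quick illustration using the table from the covariance example above: pY|X(y|0) = pX,Y(0,y)/pX(0) puts mass 1/2 on each of y = −1 and y = 1, so E(Y | X = 0) = 0 and V(Y | X = 0) = E(Y² | X = 0) − 0² = 1.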

Page 17: Independence of random variables


• For X, Y continuous random variables, the conditional expectation of Y given X = x is

E(Y | X = x) = ∫ y fY|X(y|x) dy

and the conditional variance of Y given X = x is

V(Y | X = x) = ∫ (y − E(Y | X = x))² fY|X(y|x) dy = E(Y² | X = x) − (E(Y | X = x))²

• In general,

E(h(Y) | X = x) = ∫ h(y) fY|X(y|x) dy
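For instance, under the density fX,Y(x,y) = 3x on 0 ≤ y ≤ x ≤ 1 from the earlier example, fY|X(y|x) = 3x/(3x²) = 1/x on (0, x), i.e. Y | X = x is uniform on (0, x), so E(Y | X = x) = x/2 and V(Y | X = x) = x²/12.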

Page 18: Independence of random variables


Example

• Suppose X, Y are continuous random variables with joint density function

fX,Y(x,y) = e^−y for 0 ≤ x ≤ 1, 0 < y < ∞, and 0 otherwise.

• Find E(X | Y = 2).

Page 19: Independence of random variables


More about Conditional Expectation

• Assume that E(Y | X = x) exists for every x in the range of X. Then, E(Y | X ) is a random variable. The expectation of this random variable is

E [E(Y | X )]

• Theorem

E[E(Y | X)] = E(Y)

This is called the “Law of Total Expectation”.

Proof:
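Sketch in the continuous case: E[E(Y | X)] = ∫ E(Y | X = x) fX(x) dx = ∫∫ y fY|X(y|x) fX(x) dy dx = ∫ y (∫ fY|X(y|x) fX(x) dx) dy = ∫ y fY(y) dy = E(Y), where the inner integral is the continuous law of total probability from earlier.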

Page 20: Independence of random variables


Example

• Suppose we roll a fair die; whatever number comes up we toss a coin that many times. What is the expected number of heads?
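Working this example with the theorem: let N be the number rolled and Y the number of heads, so that Y | N = n ~ Binomial(n, ½) and E(Y | N) = N/2. Then E(Y) = E[E(Y | N)] = E(N)/2 = 3.5/2 = 1.75.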

Page 21: Independence of random variables


Theorem

• For random variables X, Y

V(Y) = V [E(Y|X)] + E[V(Y|X)]

Proof:
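Sketch: E[V(Y|X)] = E[E(Y²|X)] − E[(E(Y|X))²] = E(Y²) − E[(E(Y|X))²], while V[E(Y|X)] = E[(E(Y|X))²] − (E[E(Y|X)])² = E[(E(Y|X))²] − (E(Y))². Adding the two expressions gives E(Y²) − (E(Y))² = V(Y).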

Page 22: Independence of random variables


Example

• Let X ~ Geometric(p).

Given X = x, let Y have conditionally the Binomial(x, p) distribution.

• Scenario: perform Bernoulli trials with success probability p until the 1st success, so X is the number of trials. Then do x more trials and count the number of successes, which is Y.

• Find E(Y), V(Y).
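Sketch (with X counting all trials up to and including the first success, so E(X) = 1/p and V(X) = (1 − p)/p²): E(Y | X) = pX, so E(Y) = E[E(Y | X)] = pE(X) = 1. Then V(Y) = V[E(Y | X)] + E[V(Y | X)] = V(pX) + E[Xp(1 − p)] = p² · (1 − p)/p² + p(1 − p) · (1/p) = (1 − p) + (1 − p) = 2(1 − p).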

Page 23: Independence of random variables


Law of Large Numbers

• Toss a coin n times.

• Suppose

Xi = 1 if the ith toss came up H, and Xi = 0 if the ith toss came up T.

• The Xi's are Bernoulli random variables with p = ½ and E(Xi) = ½.

• The proportion of heads is X̄n = (X1 + … + Xn)/n.

• Intuitively, X̄n approaches ½ as n → ∞.

Page 24: Independence of random variables


Markov’s Inequality

• If X is a non-negative random variable with E(X) < ∞ and a > 0, then

P(X ≥ a) ≤ E(X)/a

Proof:
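Sketch (continuous case): E(X) = ∫0^∞ x f(x) dx ≥ ∫a^∞ x f(x) dx ≥ a ∫a^∞ f(x) dx = aP(X ≥ a); dividing by a gives the inequality.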

Page 25: Independence of random variables


Chebyshev’s Inequality

• For a random variable X with E(X) < ∞ and V(X) < ∞, for any a > 0,

P(|X − E(X)| ≥ a) ≤ V(X)/a²

• Proof:
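Sketch: apply Markov's Inequality to the non-negative random variable (X − E(X))²: P(|X − E(X)| ≥ a) = P((X − E(X))² ≥ a²) ≤ E[(X − E(X))²]/a² = V(X)/a².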

Page 26: Independence of random variables


Back to the Law of Large Numbers

• We are interested in a sequence of random variables X1, X2, X3, … that are independent and identically distributed (i.i.d). Let

X̄n = (1/n)(X1 + … + Xn)

Suppose E(Xi) = μ and V(Xi) = σ². Then

E(X̄n) = (1/n) Σ E(Xi) = (1/n) · nμ = μ

and

V(X̄n) = (1/n²) Σ V(Xi) = (1/n²) · nσ² = σ²/n

• Intuitively, V(X̄n) → 0 as n → ∞, so X̄n → E(Xi) = μ.

Page 27: Independence of random variables


• Formally, the Weak Law of Large Numbers (WLLN) states the following:

• Suppose X1, X2, X3, … are i.i.d with E(Xi) = μ < ∞ and V(Xi) = σ² < ∞. Then for any positive number a,

P(|X̄n − μ| > a) → 0

as n → ∞.

This is called Convergence in Probability.

Proof:
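Sketch via Chebyshev's Inequality: E(X̄n) = μ and V(X̄n) = σ²/n from the previous slide, so P(|X̄n − μ| > a) ≤ σ²/(na²) → 0 as n → ∞.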

Page 28: Independence of random variables


Example

• Flip a coin 10,000 times. Let

Xi = 1 if the ith toss came up H, and Xi = 0 if the ith toss came up T.

• E(Xi) = ½ and V(Xi) = ¼.

• Take a = 0.01. Then by Chebyshev's Inequality,

P(|X̄n − ½| ≥ 0.01) ≤ (¼) / (10,000 × (0.01)²) = ¼

• Chebyshev's Inequality gives a very weak upper bound.

• Chebyshev's Inequality works regardless of the distribution of the Xi's.
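The bound can be checked empirically. Below is a minimal Python simulation (not from the slides) that estimates the actual probability P(|X̄n − ½| ≥ 0.01) for n = 10,000 fair flips; the true value is about 0.045 (the event is a two-standard-deviation deviation of X̄n), far below the Chebyshev bound of ¼:

# Empirical check of the Chebyshev bound for n = 10,000 fair-coin flips.
import numpy as np

rng = np.random.default_rng(2)
n, trials, a = 10_000, 100_000, 0.01

# The number of heads in each experiment is Binomial(n, 1/2).
heads = rng.binomial(n, 0.5, size=trials)
X_bar = heads / n

empirical = np.mean(np.abs(X_bar - 0.5) >= a)
print(empirical)  # roughly 0.045, far below the Chebyshev bound of 0.25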

Page 29: Independence of random variables


Strong Law of Large Numbers

• Suppose X1, X2, X3, … are i.i.d with E(Xi) = μ < ∞. Then X̄n converges to μ as n → ∞ with probability 1. That is,

P( lim n→∞ (X1 + X2 + … + Xn)/n = μ ) = 1

• This is called convergence almost surely.