
Introduction to Stochastic Models, GSLM 54100


Page 1: Introduction to Stochastic Models, GSLM 54100

Page 2: Outline

independence of random variables

variance and covariance

two useful ideas

examples

conditional distribution

Page 3: Independent Random Variables

two random variables X and Y are independent if all events generated by X and all events generated by Y are independent

discrete X and Y

P(X = x, Y = y) = P(X = x) P(Y = y) for all x, y

continuous X and Y

fX,Y(x, y) = fX(x) fY(y) for all x, y

any X and Y

FX,Y(x, y) = FX(x) FY(y) for all x, y

Page 4: Proposition 2.3

E[g(X)h(Y)] = E[g(X)]E[h(Y)] for independent X, Y

different meanings of E()

Ex #7 of WS #5 (Functions of independent random variables)

Let X and Y be independent and identically distributed (i.i.d.) random variables, each equally likely to be 1, 2, or 3

Z = XY. E(X) = ? E(Y) = ? What is the distribution of Z? Does E(Z) = E(X)E(Y)?

E(Z) as the mean of a function of X and Y, or as the mean of a random variable Z
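The answers can be checked by brute-force enumeration. A minimal Python sketch (mine, not part of the worksheet) that lists the distribution of Z and confirms E(Z) = E(X)E(Y):

```python
from fractions import Fraction
from itertools import product

# X, Y i.i.d., each equally likely to be 1, 2, or 3; Z = XY.
values = [1, 2, 3]
p = Fraction(1, 3)                        # P(X = x) = P(Y = y) = 1/3

dist_Z = {}                               # distribution of Z = XY
for x, y in product(values, repeat=2):
    dist_Z[x * y] = dist_Z.get(x * y, Fraction(0)) + p * p

E_X = sum(x * p for x in values)          # E(X) = E(Y) = 2
E_Z = sum(z * q for z, q in dist_Z.items())
print(dist_Z)                             # mass 1/9, 2/9, 2/9, 1/9, 2/9, 1/9 on 1, 2, 3, 4, 6, 9
print(E_X, E_Z, E_Z == E_X * E_X)         # 2, 4, True
```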

Page 5: Proposition 2.3

E[g(X)h(Y)] = E[g(X)]E[h(Y)] for independent X, Y

different meanings of E()

E[g(X)] = ∫ g(x) fX(x) dx

E[h(Y)] = ∫ h(y) fY(y) dy

E[g(X)h(Y)] = ∫∫ g(x)h(y) fX,Y(x, y) dx dy

= ∫∫ g(x)h(y) fX(x) fY(y) dx dy (by independence)

= [∫ g(x) fX(x) dx][∫ h(y) fY(y) dy] = E[g(X)]E[h(Y)]

x and y are dummy variables
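Since the proof only uses the factorization of the joint density, the identity is easy to spot-check numerically. A rough Monte Carlo sketch with an assumed example (X ~ Exp(1), Y ~ Uniform(0, 1), g(x) = x², h(y) = cos y), which is not from the notes:

```python
import numpy as np

# Spot-check of E[g(X)h(Y)] = E[g(X)]E[h(Y)] for independent X and Y.
rng = np.random.default_rng(0)
n = 1_000_000
x = rng.exponential(1.0, size=n)             # X ~ Exp(1)
y = rng.uniform(0.0, 1.0, size=n)            # Y ~ Uniform(0, 1), independent of X

lhs = np.mean(x**2 * np.cos(y))              # E[g(X)h(Y)]
rhs = np.mean(x**2) * np.mean(np.cos(y))     # E[g(X)] E[h(Y)]
print(lhs, rhs)                              # both close to 2*sin(1) ≈ 1.68
```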

Page 6: Variance and Covariance (Ross, pp. 52-53)

Cov(X, Y) = E(XY) − E(X)E(Y)

Cov(X, X) = Var(X)

Cov(X, Y) = Cov(Y, X)

Cov(cX, Y) = c Cov(X, Y)

Cov(X, Y + Z) = Cov(X, Y) + Cov(X, Z)

Cov(Σi Xi, Σj Yj) = Σi Σj Cov(Xi, Yj)

Var(X1 + … + Xn) = Σi=1..n Var(Xi) + 2 Σ1≤i<j≤n Cov(Xi, Xj)
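A numerical sanity check of the last identity (the covariance matrix below is made up for illustration): generate correlated samples and compare the variance of the sum with the right-hand side.

```python
import numpy as np

# Var(X1 + X2 + X3) vs. sum of variances plus twice the pairwise covariances.
rng = np.random.default_rng(1)
cov_true = np.array([[1.0, 0.3, 0.1],
                     [0.3, 2.0, 0.5],
                     [0.1, 0.5, 1.5]])
samples = rng.multivariate_normal(mean=[0.0, 0.0, 0.0], cov=cov_true, size=500_000)

cov_hat = np.cov(samples, rowvar=False)                  # estimated covariance matrix
lhs = samples.sum(axis=1).var()
rhs = cov_hat.trace() + 2 * (cov_hat[0, 1] + cov_hat[0, 2] + cov_hat[1, 2])
print(lhs, rhs)                                          # both close to 6.3
```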

Page 7: Two Useful Ideas

Page 8: Two Useful Ideas

for X = X1 + … + Xn, E(X) = E(X1) + … + E(Xn),

no matter whether Xi are independent or not

for a prize randomly assigned to one of the n lottery tickets, the probability of winning the prize = 1/n for all tickets

the order of buying a ticket does not change the probability of winning
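A quick simulation of the second idea (parameters chosen arbitrarily): the winning ticket position is uniform over 1, …, n, so every buying position wins with probability 1/n.

```python
import random
from collections import Counter

# Each trial assigns the prize to one of n ticket positions uniformly at random;
# the empirical winning frequency of every position should be close to 1/n.
n, trials = 5, 200_000
wins = Counter(random.randrange(n) for _ in range(trials))
for position in range(n):
    print(position + 1, wins[position] / trials)   # each close to 1/n = 0.2
```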

Page 9: Applications of the Two Ideas

the following are interesting applications

mean of Bin(n, p) (Ex #7(b) of WS #8)

variance of Bin(n, p) (Ex #8(b) of WS #8)

the probability of winning a lottery (Ex #3(b) of WS #9)

mean of hypergeometric random variable (Ex #4 of WS #9)

mean and variance of random number of matches (Ex #5 of WS #9)

Page 10: Mean of Bin(n, p), Ex #7(b) of WS #8

X ~ Bin(n, p)

find E(X) from E(I1 + … + In), where Ii = 1 if trial i is a success and Ii = 0 otherwise

E(X) = E(I1+…+In) = np
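A small check of the indicator argument (the values of n and p below are arbitrary): linearity gives np at once, and the same number falls out of the binomial pmf.

```python
from fractions import Fraction
from math import comb

# X ~ Bin(n, p): the indicator argument gives E(X) = n*p without any independence;
# the mean is also recomputed directly from the binomial pmf as a cross-check.
n, p = 10, Fraction(1, 4)
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}
mean_from_pmf = sum(k * pk for k, pk in pmf.items())
print(mean_from_pmf, n * p, mean_from_pmf == n * p)   # 5/2, 5/2, True
```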

Page 11: Variance of Bin(n, p), Ex #8(b) of WS #8

X ~ Bin(n, p)

find V(X) from V(I1+…+In)

V(X) = V(I1 + … + In) = nV(I1) = np(1 − p), since the Ii are independent and all covariance terms vanish

Var(X1 + … + Xn) = Σi=1..n Var(Xi) + 2 Σ1≤i<j≤n Cov(Xi, Xj)
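The same kind of check for the variance (again with arbitrary n and p): the indicators are independent, so the covariance terms in the identity above all vanish.

```python
from fractions import Fraction
from math import comb

# X ~ Bin(n, p): with independent indicators all covariances are 0, so
# V(X) = n*V(I1) = n*p*(1-p); compare with the variance computed from the pmf.
n, p = 10, Fraction(1, 4)
pmf = {k: comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n + 1)}
mean = sum(k * pk for k, pk in pmf.items())
var_from_pmf = sum((k - mean)**2 * pk for k, pk in pmf.items())
print(var_from_pmf, n * p * (1 - p), var_from_pmf == n * p * (1 - p))   # 15/8, 15/8, True
```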

Page 12: Probability of Winning a Lottery, Ex #3(b) & (c) of WS #9

a grand prize is randomly assigned to one of n lottery tickets

(b) Let n ≥ 3. Find the probability that the third person who buys a ticket wins the grand prize

(c). Let Ii = 1 if the ith person who buys a ticket wins the grand prize, and Ii = 0 otherwise, 1 ≤ i ≤ n

(i). Show that all Ii have the same (marginal) distribution

(ii). Find cov(Ii, Ij) for i ≠ j

(iii). Verify that Var(I1 + … + In) = Σi=1..n Var(Ii) + 2 Σ1≤i<j≤n Cov(Ii, Ij)

Page 13: Probability of Winning a Lottery, Ex #3(b) & (c) of WS #9

(b) A = the third person buying a lottery wins the grand prize

find P(A) when there are 3 persons

Sol. P(A) = (2/3)(1/2) = 1/3 (the first two buyers miss, and then the third buyer wins for sure)

actually the order does not matter: think of randomly throwing a ball into one of three boxes

Page 14: Probability of Winning a Lottery, Ex #3(b) & (c) of WS #9

(c)(i). P(Ij = 1) = 1/n for any j

for i ≠ j, cov(Ii, Ij) = E(IiIj) − E(Ii)E(Ij)

E(IiIj) = 0 (two different buyers cannot both win), so cov(Ii, Ij) = −1/n²

checking: Σj=1..n Ij = 1, so Var(Σj=1..n Ij) = Var(1) = 0; indeed Σi=1..n Var(Ii) + 2 Σ1≤i<j≤n Cov(Ii, Ij) = n·(1/n)(1 − 1/n) + n(n − 1)·(−1/n²) = 0
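The covariance and the variance check can be written out exactly for any small n (n = 4 below is just an example):

```python
from fractions import Fraction

# Lottery indicators: exactly one of the n positions wins, each with probability 1/n,
# and two different positions can never both win, so E(IiIj) = 0 for i != j.
n = 4
p_win = Fraction(1, n)
cov = Fraction(0) - p_win * p_win             # cov(Ii, Ij) = 0 - (1/n)(1/n) = -1/n^2
print(cov, cov == Fraction(-1, n**2))         # -1/16, True

# The indicators sum to 1, so the variance of the sum must come out to 0.
var_sum = n * p_win * (1 - p_win) + n * (n - 1) * cov
print(var_sum)                                # 0
```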

Page 15: Hypergeometric in the Context of Ex #4 of WS #9

3 balls are randomly picked from 2 white & 3 black balls

X = the total number of white balls picked

P(X = 0) = C(2,0)C(3,3)/C(5,3) = 1/10

P(X = 1) = C(2,1)C(3,2)/C(5,3) = 3/5

P(X = 2) = C(2,2)C(3,1)/C(5,3) = 3/10

E(X) = 6/5
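The pmf and the mean can be recomputed mechanically (only the counts from the slide are used):

```python
from fractions import Fraction
from math import comb

# Hypergeometric pmf for Ex #4: X = number of white balls among 3 drawn from
# 2 white + 3 black balls, computed directly from binomial coefficients.
white, black, draws = 2, 3, 3
total = white + black
pmf = {k: Fraction(comb(white, k) * comb(black, draws - k), comb(total, draws))
       for k in range(min(white, draws) + 1)}
print(pmf)                                    # P(X=0)=1/10, P(X=1)=3/5, P(X=2)=3/10
print(sum(k * q for k, q in pmf.items()))     # E(X) = 6/5
```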

Page 16: Hypergeometric in the Context of Ex #4 of WS #9

Ex #4(c). Assume that the three picked balls are put in bins 1, 2, and 3 in the order of being picked

(i). Find P(bin i contains a white ball), i = 1, 2, & 3

(ii). Define Bi = 1 if the ball in bin i is white in color, i = 1, 2, and 3. Find E(X) by relating X to B1, B2, and B3

Page 17: Hypergeometric in the Context of Ex #4 of WS #9

(i). P(bin i contains a white ball) = 2/5, since each of the 5 balls is equally likely to end up in bin i

(ii). Bi = 1 if the ball in bin i is white in color, and = 0 otherwise

X = B1 + B2 + B3

E(Bi) = P(bin i contains a white ball) = 2/5

E(X) = E(B1) + E(B2) + E(B3) = 6/5
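A simulation sketch of part (c) (the 'W'/'B' labels are just for illustration): sample 3 of the 5 balls in order, and estimate both P(bin i contains a white ball) and E(X).

```python
import random

# Draw 3 of the 5 balls in order, place them in bins 1-3, and estimate the chance
# that each bin holds a white ball as well as the mean number of white balls drawn.
balls = ['W', 'W', 'B', 'B', 'B']
trials = 200_000
bin_white = [0, 0, 0]
total_white = 0
for _ in range(trials):
    picked = random.sample(balls, 3)          # an ordered sample without replacement
    for i, ball in enumerate(picked):
        bin_white[i] += (ball == 'W')
    total_white += picked.count('W')

print([c / trials for c in bin_white])        # each close to 2/5
print(total_white / trials)                   # close to E(X) = 6/5
```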

Page 18: Hypergeometric in the Context of Ex #4 of WS #9

Ex #4(d). Arbitrarily label the white balls as 1 and 2.

 (i). Find P(white ball 1 is put in a bin); find P(white ball 2 is put in a bin)

(ii). let Wi = 1 if the white ball i is put in a bin, and Wi = 0 otherwise, i = 1, 2; find E(X) from Wi

Page 19: Hypergeometric in the Context of Ex #4 of WS #9

(i) P(white ball 1 is put in a bin) = 3/5, since each ball is equally likely to be one of the 3 balls picked; the same holds for white ball 2

(ii) Wi = 1 if the white ball i is put in a bin, and Wi = 0 otherwise, i = 1, 2. Find E(X) by relating X to W1 and W2

X = W1 + W2

E(Wi) = P(white ball i is put in a bin) = 3/5

E(X) = E(W1) + E(W2) = 6/5
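Both indicator decompositions of Ex #4 give the same mean, which is a one-line check:

```python
from fractions import Fraction

# Three bin indicators with mean 2/5 each, or two white-ball indicators with
# mean 3/5 each; either way E(X) = 6/5.
print(3 * Fraction(2, 5), 2 * Fraction(3, 5))   # 6/5, 6/5
```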

Page 20: Mean and Variance of Random Number of Matches

Ex #5 of WS #9: gift exchange among n participants; X = total # of participants who get back their own gifts

(a). Find P(the ith participant gets back his own gift)

(b). Let Ii = 1 if the ith participant gets back his own gift, and Ii = 0 otherwise, 1 ≤ i ≤ n. Relate X to I1, …, In

(c). Find E(X) from (b)

(d). Find cov(Ii, Ij) for i ≠ j

(e). Find V(X)

Page 21: Mean and Variance of Random Number of Matches

Ex #5 of WS #9 (a). P(the ith participant gets back his own gift) = 1/n

each gift being equally likely to be picked by the person

(b). Ii = 1 if the ith participant gets back his own gift, and Ii = 0 otherwise, 1 ≤ i ≤ n; X = I1 + … + In

(c). E(X) = E(I1+ …+In) = 1

(d). for i ≠ j, cov(Ii, Ij) = E(IiIj) − E(Ii)E(Ij)

E(IiIj) = P(Ii = 1, Ij = 1) = P(Ii = 1|Ij = 1)P(Ij = 1) = 1/[n(n-1)]

cov(Ii, Ij) = 1/[n²(n − 1)]

(e). V(X) = n·(1/n)(1 − 1/n) + 2·[n(n − 1)/2]·1/[n²(n − 1)] = (n − 1)/n + 1/n = 1
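A Monte Carlo sketch of Ex #5 (the value of n below is arbitrary): for any n, the number of participants who get their own gift back should have mean 1 and variance 1.

```python
import random
import statistics

# Random gift exchange: a uniform random permutation assigns gifts to participants;
# count the fixed points (participants who receive their own gift) in each trial.
n, trials = 8, 100_000
matches = []
for _ in range(trials):
    gifts = list(range(n))
    random.shuffle(gifts)                      # gifts[i] = gift received by participant i
    matches.append(sum(i == g for i, g in enumerate(gifts)))

print(statistics.mean(matches))                # close to E(X) = 1
print(statistics.variance(matches))            # close to V(X) = 1
```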

Page 22: Example 1.11 of Ross

It is still too complicated to discuss. Let us postpone its discussion until we have covered conditional probability.

Page 23: Chapter 2

material to read: from page 21 to page 59 (section 2.5.3)

Examples highlighted: Examples 2.3, 2.5, 2.17, 2.18, 2.19, 2.20, 2.21, 2.30, 2.31, 2.32, 2.34, 2.35, 2.36, 2.37

Sections and material highlighted: 2.2.1, 2.2.2, 2.2.3, 2.2.4, 2.3.1, 2.3.2, 2.3.3, 2.4.3, Proposition 2.1, Corollary 2.2, 2.5.1, 2.5.2, Proposition 2.3, 2.5.3, Properties of Covariance

Page 24: Chapter 2

Exercises #5, #11, #20, #23, #29, #37, #42, #43, #44, #45, #46, #51, #57, #71, #72

Page 25: Conditional Distributions

Page 26: Conditional Distribution

X ~ {pn} and A is an event

0 ≤ P(X = n|A) ≤ 1

Σn P(X = n|A) = Σn P(X = n, A)/P(A) = P(A)/P(A) = 1

{P(X = n|A)} is a probability mass function, called the conditional distribution of X given A
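A tiny illustration (the fair-die example is mine, not from the notes): conditioning a pmf on an event A just renormalizes the probabilities inside A, and the result sums to 1.

```python
from fractions import Fraction

# Fair die roll X conditioned on A = {X is even}: P(X = n | A) = P(X = n, A) / P(A).
p = {n: Fraction(1, 6) for n in range(1, 7)}
A = {2, 4, 6}
P_A = sum(p[n] for n in A)
cond = {n: (p[n] / P_A if n in A else Fraction(0)) for n in p}
print(cond)                   # probability 1/3 on each of 2, 4, 6 and 0 elsewhere
print(sum(cond.values()))     # 1
```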

Page 27: Conditional Distribution

define Z = (X|A)

Z is a random variable

E(Z) and Var(Z) are well defined: E(X|A), the conditional mean of X given A, and Var(X|A), the conditional variance of X given A

event A can be defined by a random variable, e.g., A = {Y = 3}

Page 28: Ex #1 of WS #5

Exercise 1. (Joint and conditional distributions) The joint distribution of X and Y is shown below, where pm,n = P(X = m, Y = n).

p1,1 = 0; p1,2 = 1/8; p1,3 = 1/8;
p2,1 = 1/4; p2,2 = 1/4; p2,3 = 0;
p3,1 = 1/8; p3,2 = 0; p3,3 = 1/8.

Find the (marginal) distribution of X. Find the (marginal) distribution of Y. Find the conditional distributions of (X|Y = 1), (X|Y = 2), and (X|Y = 3).

Find the conditional means E(X|Y = 1), E(X|Y = 2), and E(X|Y = 3). Find the conditional variances V(X|Y = 1), V(X|Y = 2), and V(X|Y = 3).
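All parts of the exercise can be read off from the joint pmf; a short sketch that recomputes the answers appearing on the next few pages (only the pm,n values above are used):

```python
from fractions import Fraction as F

# Marginals, conditional distributions, conditional means and variances of Ex #1,
# computed directly from the joint pmf p[(m, n)] = P(X = m, Y = n).
p = {(1, 1): F(0), (1, 2): F(1, 8), (1, 3): F(1, 8),
     (2, 1): F(1, 4), (2, 2): F(1, 4), (2, 3): F(0),
     (3, 1): F(1, 8), (3, 2): F(0),   (3, 3): F(1, 8)}

pX = {m: sum(p[(m, n)] for n in (1, 2, 3)) for m in (1, 2, 3)}   # 1/4, 1/2, 1/4
pY = {n: sum(p[(m, n)] for m in (1, 2, 3)) for n in (1, 2, 3)}   # 3/8, 3/8, 1/4

for n in (1, 2, 3):
    cond = {m: p[(m, n)] / pY[n] for m in (1, 2, 3)}             # distribution of (X|Y=n)
    mean = sum(m * q for m, q in cond.items())
    var = sum(m**2 * q for m, q in cond.items()) - mean**2
    print(n, cond, mean, var)   # means 7/3, 5/3, 2; variances 2/9, 2/9, 1
```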

Page 29: Ex #1 of WS #5

Exercise 1. (Joint and conditional distributions) The joint distribution of X and Y is shown below, where pm,n = P(X = m, Y = n).

p1,1 = 0; p1,2 = 1/8; p1,3 = 1/8;

p2,1 = 1/4; p2,2 = 1/4; p2,3 = 0;

p3,1 = 1/8; p3,2 = 0; p3,3 = 1/8.

distribution of X: p1 = 1/4, p2 = 1/2, p3 = 1/4

distribution of Y: p1 = 3/8, p2 = 3/8, p3 = 1/4

Page 30: Ex #1 of WS #5

Exercise 1. (Joint and conditional distributions) The joint distribution of X and Y is shown below, where pm,n = P(X = m, Y = n).

p1,1 = 0; p1,2 = 1/8; p1,3 = 1/8;
p2,1 = 1/4; p2,2 = 1/4; p2,3 = 0;
p3,1 = 1/8; p3,2 = 0; p3,3 = 1/8.

conditional distribution of

(X|Y = 1): p(X=1|Y=1) = 0; p(X=2|Y=1) = 2/3; p(X=3|Y=1) = 1/3
(X|Y = 2): p(X=1|Y=2) = 1/3; p(X=2|Y=2) = 2/3; p(X=3|Y=2) = 0
(X|Y = 3): p(X=1|Y=3) = 1/2; p(X=2|Y=3) = 0; p(X=3|Y=3) = 1/2

Page 31: Ex #1 of WS #5

Exercise 1. (Joint and conditional distributions) The joint distribution of X and Y is shown below, where pm,n = P(X = m, Y = n).

p1,1 = 0; p1,2 = 1/8; p1,3 = 1/8;
p2,1 = 1/4; p2,2 = 1/4; p2,3 = 0;
p3,1 = 1/8; p3,2 = 0; p3,3 = 1/8.

(X|Y = 1) is a random variable with a well-defined distribution

the conditional means are well defined:
E[(X|Y = 1)] = (2)(2/3) + (3)(1/3) = 7/3
E[(X|Y = 2)] = 5/3
E[(X|Y = 3)] = 2

Page 32: Ex #1 of WS #5

Exercise 1. (Joint and conditional distributions) The joint distribution of X and Y is shown below, where pm,n = P(X = m, Y = n).

p1,1 = 0; p1,2 = 1/8; p1,3 = 1/8;
p2,1 = 1/4; p2,2 = 1/4; p2,3 = 0;
p3,1 = 1/8; p3,2 = 0; p3,3 = 1/8.

(X|Y = 1) is a random variable with a well-defined distribution

the conditional variances are well defined:
V(X|Y = 1) = E(X²|Y = 1) − [E(X|Y = 1)]² = 2/9
V(X|Y = 2) = 2/9
V(X|Y = 3) = 1

Page 33: Ex #1 of WS #5

note the mapping defined by the conditional means E[(X|Y = 1)] = 7/3, E[(X|Y = 2)] = 5/3, E[(X|Y = 3)] = 2

on {Y = 1}, the mapping gives 7/3; on {Y = 2}, it gives 5/3; on {Y = 3}, it gives 2

the mapping E(X|Y), i.e., the conditional mean, defines a random variable

E[E(X|Y)] = (3/8)(7/3) + (3/8)(5/3) + (1/4)(2) = 2

incidentally E(X) = 2
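A two-line check that averaging the conditional means against the distribution of Y reproduces E(X) in this example:

```python
from fractions import Fraction as F

# E[E(X|Y)] computed from the marginal of Y and the three conditional means,
# compared with E(X) computed from the marginal of X.
pY = {1: F(3, 8), 2: F(3, 8), 3: F(1, 4)}
cond_mean = {1: F(7, 3), 2: F(5, 3), 3: F(2)}
pX = {1: F(1, 4), 2: F(1, 2), 3: F(1, 4)}
print(sum(pY[n] * cond_mean[n] for n in pY))   # E[E(X|Y)] = 2
print(sum(m * pX[m] for m in pX))              # E(X) = 2
```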

Page 34: Ex #1 of WS #5

note the mapping defined by the conditional variances: V(X|Y = 1) = 2/9, V(X|Y = 2) = 2/9, V(X|Y = 3) = 1

on {Y = 1}, the mapping gives 2/9; on {Y = 2}, it gives 2/9; on {Y = 3}, it gives 1

the mapping V(X|Y), i.e., the conditional variance, defines a random variable

E[V(X|Y)] = (3/8)(2/9) + (3/8)(2/9) + (1/4)(1) = 5/12
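The same averaging applied to the conditional variances of this example:

```python
from fractions import Fraction as F

# E[V(X|Y)] from the marginal of Y and the three conditional variances.
pY = {1: F(3, 8), 2: F(3, 8), 3: F(1, 4)}
cond_var = {1: F(2, 9), 2: F(2, 9), 3: F(1)}
print(sum(pY[n] * cond_var[n] for n in pY))    # 5/12
```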