
1. The random variables X1 and X2 are independent, and each has p.m.f. f(x) = (x + 2) / 6 if x = –1, 0, 1.

(a) Find E(X1 + X2).

E(X1) = E(X2) = (–1)(1/6) + (0)(2/6) + (1)(3/6) = 1/3

E(X1 + X2) = E(X1) + E(X2) = 1/3 + 1/3 = 2/3

(b) Find the p.m.f. of Y = X1 + X2, and use this p.m.f. to find E(Y).

The space of Y is {–2, –1, 0, 1, 2}

P(Y = –2) = P(X1 = –1 ∩ X2 = –1) = P(X1 = –1) P(X2 = –1) = (1/6)(1/6) = 1/36


P(Y = –1) = P({X1 = –1 ∩ X2 = 0} ∪ {X1 = 0 ∩ X2 = –1}) =

P(X1 = –1 ∩ X2 = 0) + P(X1 = 0 ∩ X2 = –1) =

P(X1 = –1) P(X2 = 0) + P(X1 = 0) P(X2 = –1) =

(1/6)(2/6) + (2/6)(1/6) = 4/36 = 1/9

P(Y = 0) = P({X1 = –1 ∩ X2 = 1} ∪ {X1 = 1 ∩ X2 = –1} ∪ {X1 = 0 ∩ X2 = 0}) =

P(X1 = –1 ∩ X2 = 1) + P(X1 = 1 ∩ X2 = –1) + P(X1 = 0 ∩ X2 = 0) =

P(X1 = –1) P(X2 = 1) + P(X1 = 1) P(X2 = –1) + P(X1 = 0) P(X2 = 0) =

(1/6)(3/6) + (3/6)(1/6) + (2/6)(2/6) = 10/36 = 5/18

P(Y = 1) = 12/36 = 1/3

P(Y = 2) = 9/36 = 1/4


The p.m.f. of Y is g(y) =

1/36 if y = – 2

4/36 = 1/9 if y = – 1

10/36 = 5/18 if y = 0

12/36 = 1/3 if y = 1

9/36 = 1/4 if y = 2

E(Y) = (–2)(1/36) + (–1)(4/36) + (0)(10/36) + (1)(12/36) + (2)(9/36) = 24/36 = 2/3 (as expected from part (a))
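As a quick sanity check (an addition to the notes, not part of the original solution), the p.m.f. of Y and E(Y) can be approximated by simulation; the sample size and seed below are arbitrary choices.

```python
import numpy as np

# Hypothetical check: simulate X1, X2 with p.m.f. f(x) = (x + 2)/6 on {-1, 0, 1}
rng = np.random.default_rng(0)
values = np.array([-1, 0, 1])
probs = np.array([1/6, 2/6, 3/6])

n = 200_000
x1 = rng.choice(values, size=n, p=probs)
x2 = rng.choice(values, size=n, p=probs)
y = x1 + x2

# Empirical p.m.f. of Y should be close to 1/36, 4/36, 10/36, 12/36, 9/36
for v in (-2, -1, 0, 1, 2):
    print(v, np.mean(y == v))

print("E(Y) approx:", y.mean())   # should be close to 2/3
```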

(c) What type of distribution does W = X1^2 have?

The space of W is {0, 1}.

P(W = 0) = P(X1 = 0) = 1/3

P(W = 1) = P({X1 = –1} ∪ {X1 = 1}) = 2/3

W has a Bernoulli distribution with p = 2/3.


2. Suppose that the random variable X has a b(n1 , p) distribution, that the random variable Y has a b(n2 , p) distribution, and that the random variables X and Y are independent. What type of distribution does the random variable V = X + Y have?

V has a b(n1 + n2 , p) distribution.
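One standard way to justify this (added here as a supporting note; the original solution states the result directly) is through moment-generating functions. Since X and Y are independent,

$$M_V(t) = M_X(t)\,M_Y(t) = \left[(1-p) + pe^{t}\right]^{n_1}\left[(1-p) + pe^{t}\right]^{n_2} = \left[(1-p) + pe^{t}\right]^{n_1+n_2},$$

which is the m.g.f. of a b(n1 + n2, p) random variable.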

3. The random variables X1 and X2 are independent and respectively have p.d.f.s

f1(x1) = 4x1^3 if 0 < x1 < 1 and f2(x2) = 2x2 if 0 < x2 < 1.

(a) Find the joint p.d.f. of (X1, X2).

Since X1 and X2 are independent, their joint p.d.f. is

f(x1, x2) = f1(x1) f2(x2) = 8 x1^3 x2 if 0 < x1 < 1 , 0 < x2 < 1


Section 5.2

Suppose X1 and X2 are continuous type random variables with joint p.d.f. f(x1, x2) and with space S ⊆ R^2.

Suppose Y = u(X1, X2) where u is a function. A method for finding the p.d.f. of the random variable Y is the distribution function method. To use this method, one first attempts to find a formula for the distribution function of Y, that is, a formula for G(y) = P(Y ≤ y). Then, the p.d.f. of Y is obtained by taking the derivative of the distribution function, that is, the p.d.f. of Y is g(y) = G′(y). This method will generally involve working with a double integral.

Suppose Y1 = u1(X1, X2) and Y2 = u2(X1, X2) where u1 and u2 are functions. A method for finding the joint p.d.f. of the random variables (Y1, Y2) is the change-of-variables method. This method can be used when the two functions define a one-to-one mapping between S ⊆ R^2 and T ⊆ R^2 as follows:

y1 = u1(x1, x2) , y2 = u2(x1, x2)  with inverse  x1 = v1(y1, y2) , x2 = v2(y1, y2)


Then the space of (Y1, Y2) is T ⊆ R^2, and the joint p.d.f. of (Y1, Y2) is

g(y1, y2) = f[ v1(y1, y2) , v2(y1, y2) ] |J| ,

where J is defined to be the determinant of a Jacobian matrix as follows:

$$J = \det\begin{bmatrix} \dfrac{\partial v_1}{\partial y_1} & \dfrac{\partial v_1}{\partial y_2} \\[1.5ex] \dfrac{\partial v_2}{\partial y_1} & \dfrac{\partial v_2}{\partial y_2} \end{bmatrix}$$

J is called the Jacobian determinant.

Each of the distribution function method and the change-of-variables method can be extended in a natural way to a situation where (X1, X2) is replaced with (X1, X2, … , Xn) for n > 2.

Return to Class Exercise #3.


(b) Define the random variables Y1 = X1^2 and Y2 = X1X2. Use the change-of-variables method to find the joint p.d.f. of (Y1, Y2).

First, we find the space of Y1 = X1^2 and Y2 = X1X2 as follows:

[Figure: the unit square 0 < x1 < 1, 0 < x2 < 1 in the (x1, x2)-plane, with corners (0,0), (1,0), (0,1), (1,1), is mapped to the region in the (y1, y2)-plane with corners (0,0), (1,0), (1,1), bounded by y2 = 0, y1 = 1, and the curve y2 = √y1.]

(Look at where the boundaries are mapped.)

The space of (Y1, Y2) is 0 < y1 < 1 , 0 < y2 < √y1   or   0 < y2 < 1 , y2^2 < y1 < 1.

Then, we find the inverse transformation as follows:

y1 = x1^2 , y2 = x1x2  ⟹  x1 = √y1 , x2 = y2 / √y1


3. - continued

Next, we find the Jacobian determinant as follows:

$$J = \det\begin{bmatrix} \dfrac{\partial x_1}{\partial y_1} & \dfrac{\partial x_1}{\partial y_2} \\[1.5ex] \dfrac{\partial x_2}{\partial y_1} & \dfrac{\partial x_2}{\partial y_2} \end{bmatrix} = \det\begin{bmatrix} \dfrac{1}{2\sqrt{y_1}} & 0 \\[1.5ex] -\dfrac{y_2}{2\,y_1^{3/2}} & \dfrac{1}{\sqrt{y_1}} \end{bmatrix} = \frac{1}{2y_1}$$

The joint p.d.f. of Y1 and Y2 is

$$g(y_1, y_2) = 8\left(\sqrt{y_1}\right)^3 \left(\frac{y_2}{\sqrt{y_1}}\right)\left|\frac{1}{2y_1}\right| = 4y_2 \quad \text{if } 0 < y_1 < 1,\ 0 < y_2 < \sqrt{y_1}$$

( or 0 < y2 < 1 , y2^2 < y1 < 1 )
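As an optional check (an addition to the notes, not part of the original solution), the Jacobian and joint p.d.f. above can be reproduced symbolically with sympy; the variable names are arbitrary.

```python
import sympy as sp

y1, y2 = sp.symbols('y1 y2', positive=True)

# Inverse transformation for Y1 = X1**2, Y2 = X1*X2
x1 = sp.sqrt(y1)
x2 = y2 / sp.sqrt(y1)

# Jacobian determinant of (x1, x2) with respect to (y1, y2)
J = sp.Matrix([x1, x2]).jacobian([y1, y2]).det()
print(sp.simplify(J))                      # 1/(2*y1)

# Joint p.d.f. g(y1, y2) = f(x1, x2) * |J| with f(x1, x2) = 8*x1**3*x2
g = sp.simplify(8 * x1**3 * x2 * sp.Abs(J))
print(g)                                   # 4*y2
```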


(c) Find the marginal p.d.f. for Y1 and the marginal p.d.f. for Y2 .

$$g_1(y_1) = \int_{y_2=0}^{\sqrt{y_1}} 4y_2\, dy_2 = \Big[\,2y_2^2\,\Big]_{y_2=0}^{\sqrt{y_1}} = 2y_1 \quad \text{if } 0 < y_1 < 1$$

To find g2(y2), we first observe that the space 0 < y1 < 1 , 0 < y2 < √y1 can be described as 0 < y2 < 1 , y2^2 < y1 < 1.

$$g_2(y_2) = \int_{y_1=y_2^2}^{1} 4y_2\, dy_1 = \Big[\,4y_2 y_1\,\Big]_{y_1=y_2^2}^{1} = 4y_2 - 4y_2^3 \quad \text{if } 0 < y_2 < 1$$
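A quick symbolic check of these marginals (an added illustration, not from the original notes):

```python
import sympy as sp

y1, y2 = sp.symbols('y1 y2', positive=True)

# Marginals of g(y1, y2) = 4*y2 on 0 < y2 < sqrt(y1) < 1
g1 = sp.integrate(4*y2, (y2, 0, sp.sqrt(y1)))      # 2*y1
g2 = sp.integrate(4*y2, (y1, y2**2, 1))            # 4*y2 - 4*y2**3
print(g1, sp.expand(g2))

# Both marginals should integrate to 1 over their supports
print(sp.integrate(g1, (y1, 0, 1)), sp.integrate(g2, (y2, 0, 1)))
```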


4. The random variables X1 and X2 have joint p.d.f.

f(x1, x2) = 8x1x2 if 0 < x1 < x2 < 1.

(a) Are X1 and X2 independent random variables? Why or why not?

Since the space of (X1, X2) is not rectangular, X1 and X2 cannot possibly be independent.

(b) Define the random variables Y1 = X1 / X2 and Y2 = X2 – X1. Use the change-of-variables method to find the joint p.d.f. of (Y1, Y2).

First, we find the space of Y1 = X1 / X2 and Y2 = X2 – X1 as follows:

[Figure: the triangle 0 < x1 < x2 < 1 in the (x1, x2)-plane, with corners (0,0), (0,1), (1,1), is mapped to the triangle in the (y1, y2)-plane with corners (0,0), (0,1), (1,0).]

(Look at where the boundaries are mapped.)

The space of (Y1, Y2) is 0 < y1 < 1 , 0 < y2 < 1 – y1   or   0 < y2 < 1 , 0 < y1 < 1 – y2.


Then, we find the inverse transformation as follows:

y1 = x1 / x2 , y2 = x2 – x1  ⟹  x1 = y1 y2 / (1 – y1) , x2 = y2 / (1 – y1)

Next, we find the Jacobian determinant as follows:

$$J = \det\begin{bmatrix} \dfrac{\partial x_1}{\partial y_1} & \dfrac{\partial x_1}{\partial y_2} \\[1.5ex] \dfrac{\partial x_2}{\partial y_1} & \dfrac{\partial x_2}{\partial y_2} \end{bmatrix} = \det\begin{bmatrix} \dfrac{y_2}{(1-y_1)^2} & \dfrac{y_1}{1-y_1} \\[1.5ex] \dfrac{y_2}{(1-y_1)^2} & \dfrac{1}{1-y_1} \end{bmatrix} = \frac{y_2}{(1-y_1)^2}$$


4. - continued

The joint p.d.f. of Y1 and Y2 is

$$g(y_1, y_2) = 8\left(\frac{y_1 y_2}{1-y_1}\right)\left(\frac{y_2}{1-y_1}\right)\left|\frac{y_2}{(1-y_1)^2}\right| = \frac{8\, y_1 y_2^3}{(1-y_1)^4} \quad \text{if } 0 < y_1 < 1,\ 0 < y_2 < 1 - y_1$$

( or 0 < y2 < 1 , 0 < y1 < 1 – y2 )
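The same kind of symbolic check used for Exercise 3 can be applied to this transformation (again an addition, not part of the notes):

```python
import sympy as sp

y1, y2 = sp.symbols('y1 y2', positive=True)

# Inverse transformation for Y1 = X1/X2, Y2 = X2 - X1
x1 = y1*y2 / (1 - y1)
x2 = y2 / (1 - y1)

# Jacobian determinant and joint p.d.f. g = f(x1, x2) * |J| with f = 8*x1*x2
J = sp.Matrix([x1, x2]).jacobian([y1, y2]).det()
print(sp.simplify(J))                      # equals y2/(1 - y1)**2

g = sp.simplify(8 * x1 * x2 * J)           # J > 0 on the support, so |J| = J
print(g)                                   # equals 8*y1*y2**3/(1 - y1)**4
```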

(c) Find the marginal p.d.f. for Y1 and the marginal p.d.f. for Y2 .

$$g_1(y_1) = \int_{y_2=0}^{1-y_1} \frac{8\, y_1 y_2^3}{(1-y_1)^4}\, dy_2 = \left[\frac{2\, y_1 y_2^4}{(1-y_1)^4}\right]_{y_2=0}^{1-y_1} = 2y_1 \quad \text{if } 0 < y_1 < 1$$


To find g2(y2), we first observe that the space 0 < y1 < 1 , 0 < y2 < 1 – y1 can be described as 0 < y2 < 1 , 0 < y1 < 1 – y2. Then

$$g_2(y_2) = \int_{y_1=0}^{1-y_2} \frac{8\, y_1 y_2^3}{(1-y_1)^4}\, dy_1 = \int_{0}^{1-y_2} 8 y_2^3\left[\frac{1}{(1-y_1)^4} - \frac{1}{(1-y_1)^3}\right] dy_1 = \left[\frac{8 y_2^3\,(3y_1 - 1)}{6(1-y_1)^3}\right]_{y_1=0}^{1-y_2} = \frac{8 - 12y_2 + 4y_2^3}{3} \quad \text{if } 0 < y_2 < 1$$
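As an added sanity check, (X1, X2) can be simulated from the joint p.d.f. 8x1x2 (here by inversion, drawing X2 from its marginal 4x2^3 and then X1 from the conditional 2x1/x2^2, which is one convenient scheme and not something stated in the notes), and the simulated Y1 and Y2 compared with the marginal p.d.f.s just derived.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

# Simulate (X1, X2) with joint p.d.f. 8*x1*x2 on 0 < x1 < x2 < 1 by inversion:
#   X2 has marginal p.d.f. 4*x2**3, so X2 = U**(1/4)
#   X1 | X2 = x2 has p.d.f. 2*x1/x2**2 on (0, x2), so X1 = x2*sqrt(V)
x2 = rng.random(n) ** 0.25
x1 = x2 * np.sqrt(rng.random(n))

y1 = x1 / x2
y2 = x2 - x1

# g1(y1) = 2*y1 gives P(Y1 <= 0.5) = 0.25
print(np.mean(y1 <= 0.5))
# g2(y2) = (8 - 12*y2 + 4*y2**3)/3 gives P(Y2 <= 0.5) = (8*0.5 - 6*0.5**2 + 0.5**4)/3
print(np.mean(y2 <= 0.5), (8*0.5 - 6*0.5**2 + 0.5**4) / 3)   # both about 0.854
```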

5. Study Example 5.2-3 in the textbook, and compare this with the distribution function method for this same situation, suggested in Text Exercise 5.2-6.


6. The random variables X1 and X2 are independent, and each has a gamma(1, 2) (or χ²(2), or exponential(2)) distribution.

(a) Find the joint p.d.f. of (X1, X2).

Since X1 and X2 are independent, their joint p.d.f. is

$$f(x_1, x_2) = \frac{1}{4}\exp\!\left(-\frac{x_1 + x_2}{2}\right) \quad \text{if } 0 < x_1,\ 0 < x_2$$

(b) Define the random variable Y = X1 + X2. Use the distribution function method to find the p.d.f. of Y.

The space of Y = X1 + X2 is {y | 0 < y}

The distribution function of Y = X1 + X2 is G(y) = P(Y ≤ y) =

$$\begin{aligned}
P(X_1 + X_2 \le y) &= P(X_1 \le y - X_2) = P(0 < X_1 \le y - X_2,\ 0 < X_2 < y) \\[1ex]
&= \int_0^{y}\!\!\int_0^{\,y-x_2} \frac{1}{4}\exp\!\left(-\frac{x_1 + x_2}{2}\right) dx_1\, dx_2 \\[1ex]
&= \int_0^{y} \frac{1}{2}\exp\!\left(-\frac{x_2}{2}\right)\left[\int_0^{\,y-x_2} \frac{1}{2}\exp\!\left(-\frac{x_1}{2}\right) dx_1\right] dx_2 \\[1ex]
&= \int_0^{y} \frac{1}{2}\exp\!\left(-\frac{x_2}{2}\right)\Big[-\exp\!\left(-\frac{x_1}{2}\right)\Big]_{x_1=0}^{\,y-x_2}\, dx_2 \\[1ex]
&= \int_0^{y} \frac{1}{2}\exp\!\left(-\frac{x_2}{2}\right)\left[1 - \exp\!\left(-\frac{y - x_2}{2}\right)\right] dx_2 \\[1ex]
&= \int_0^{y} \left[\frac{1}{2}\exp\!\left(-\frac{x_2}{2}\right) - \frac{1}{2}\exp\!\left(-\frac{y}{2}\right)\right] dx_2 \\[1ex]
&= \Big[-\exp\!\left(-\frac{x_2}{2}\right) - \frac{x_2}{2}\exp\!\left(-\frac{y}{2}\right)\Big]_{x_2=0}^{\,y}
= 1 - \exp\!\left(-\frac{y}{2}\right) - \frac{y}{2}\exp\!\left(-\frac{y}{2}\right)
\end{aligned}$$

The p.d.f. of Y is

$$g(y) = G'(y) = \frac{y}{4}\exp\!\left(-\frac{y}{2}\right) \quad \text{if } 0 < y$$

We recognize that Y has a gamma(2, 2) (or χ²(4)) distribution.
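A symbolic check of this distribution-function calculation (added here as an illustration, not part of the original notes):

```python
import sympy as sp

x1, x2, y = sp.symbols('x1 x2 y', positive=True)

# Joint p.d.f. of two independent gamma(1, 2) (exponential, mean 2) random variables
f = sp.Rational(1, 4) * sp.exp(-(x1 + x2) / 2)

# Distribution function method: G(y) = P(X1 + X2 <= y) as a double integral
G = sp.integrate(f, (x1, 0, y - x2), (x2, 0, y))
g = sp.simplify(sp.diff(G, y))
print(g)   # should simplify to y*exp(-y/2)/4, the gamma(2, 2) / chi-square(4) p.d.f.
```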


7. The random variables U and V are independent and have respectively a χ²(r1) distribution and a χ²(r2) distribution.

(a) Find the joint p.d.f. of (U, V).

Since U and V are independent, their joint p.d.f. is

$$f(u, v) = \frac{u^{r_1/2-1}\, v^{r_2/2-1}}{\Gamma(r_1/2)\,\Gamma(r_2/2)\, 2^{(r_1+r_2)/2}} \exp\!\left(-\frac{u + v}{2}\right) \quad \text{if } 0 < u,\ 0 < v$$

(b) Define the random variable W = (U / r1) / (V / r2). Use the distribution function method to find the p.d.f. of W by completing the steps outlined, after first reviewing the following two facts from calculus:

(1) For fixed limits a and b,

$$\frac{d}{dy}\int_a^b g(x, y)\, dx = \int_a^b \frac{\partial}{\partial y}\, g(x, y)\, dx$$


Example:

$$\frac{d}{dy}\int_{-1}^{3} (x^3 y + x y^2)\, dx = \frac{d}{dy}\left[\frac{x^4 y}{4} + \frac{x^2 y^2}{2}\right]_{x=-1}^{3} = \frac{d}{dy}\,(20y + 4y^2) = 20 + 8y$$

$$\int_{-1}^{3} \frac{\partial}{\partial y}\,(x^3 y + x y^2)\, dx = \int_{-1}^{3} (x^3 + 2xy)\, dx = \left[\frac{x^4}{4} + x^2 y\right]_{x=-1}^{3} = 20 + 8y$$


7. - continued

(2) From the chain rule and the Fundamental Theorem of Calculus, we have that

$$\frac{d}{dt}\int_a^{h(t)} g(x)\, dx = g(h(t))\, h'(t)\,.$$

Example:

$$\frac{d}{dt}\int_{-1}^{\,t + 3\sqrt{t}} x^3\, dx = \left(t + 3\sqrt{t}\right)^3\left(1 + \frac{3}{2\sqrt{t}}\right)$$

which agrees with first evaluating the integral and then differentiating:

$$\frac{d}{dt}\left[\frac{x^4}{4}\right]_{x=-1}^{\,t+3\sqrt{t}} = \frac{d}{dt}\left[\frac{(t + 3\sqrt{t})^4}{4} - \frac{1}{4}\right] = \left(t + 3\sqrt{t}\right)^3\left(1 + \frac{3}{2\sqrt{t}}\right)$$
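Both facts can be verified symbolically for these examples (an added sketch, not part of the notes):

```python
import sympy as sp

x, y = sp.symbols('x y')
t = sp.symbols('t', positive=True)

# Fact (1): differentiation can pass inside an integral with fixed limits
lhs1 = sp.diff(sp.integrate(x**3*y + x*y**2, (x, -1, 3)), y)
rhs1 = sp.integrate(sp.diff(x**3*y + x*y**2, y), (x, -1, 3))
print(lhs1, rhs1)                 # both 8*y + 20

# Fact (2): FTC plus the chain rule, with upper limit h(t) = t + 3*sqrt(t)
h = t + 3*sp.sqrt(t)
lhs2 = sp.diff(sp.integrate(x**3, (x, -1, h)), t)
rhs2 = h**3 * sp.diff(h, t)
print(sp.simplify(lhs2 - rhs2))   # 0
```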


The space of W = (U / r1) / (V / r2) is {w | 0 < w}.

The distribution function of W is

$$G(w) = P(W \le w) = P\!\left(\frac{U/r_1}{V/r_2} \le w\right) = P\!\left(U \le \frac{r_1 w V}{r_2}\right) =$$

$$\int_0^{\infty}\!\left[\int_0^{\,r_1 w v / r_2} \frac{1}{\Gamma(r_1/2)\,\Gamma(r_2/2)\, 2^{(r_1+r_2)/2}}\; u^{r_1/2-1} e^{-u/2}\, du\right] v^{r_2/2-1} e^{-v/2}\, dv$$


7. - continued

Using facts (1) and (2) above, the p.d.f. of W is g(w) = G′(w) =

$$\int_0^{\infty} \frac{1}{\Gamma(r_1/2)\,\Gamma(r_2/2)\,2^{(r_1+r_2)/2}} \left(\frac{r_1 w v}{r_2}\right)^{r_1/2-1} \exp\!\left(-\frac{r_1 w v}{2 r_2}\right)\frac{r_1 v}{r_2}\; v^{r_2/2-1} e^{-v/2}\, dv$$

(To simplify this p.d.f., we could either (1) make an appropriate change of variables in the integral, as is done in Example 5.2-4 of the textbook, or (2) do some algebra to make the formula under the integral a p.d.f. which we know must integrate to one (1), as we shall do here.)

$$= \int_0^{\infty} \frac{(r_1/r_2)^{r_1/2}\, w^{r_1/2-1}}{\Gamma(r_1/2)\,\Gamma(r_2/2)\,2^{(r_1+r_2)/2}}\; v^{(r_1+r_2)/2-1} \exp\!\left(-\frac{r_1 w v + r_2 v}{2 r_2}\right) dv$$

$$= \frac{(r_1/r_2)^{r_1/2}\, w^{r_1/2-1}\,\Gamma\!\left(\tfrac{r_1+r_2}{2}\right)}{\Gamma(r_1/2)\,\Gamma(r_2/2)\,2^{(r_1+r_2)/2}} \int_0^{\infty} \frac{1}{\Gamma\!\left(\tfrac{r_1+r_2}{2}\right)}\; v^{(r_1+r_2)/2-1} \exp\!\left(-\frac{v}{2/(1 + r_1 w/r_2)}\right) dv$$

$$= \frac{\Gamma\!\left(\tfrac{r_1+r_2}{2}\right)(r_1/r_2)^{r_1/2}\, w^{r_1/2-1}}{\Gamma(r_1/2)\,\Gamma(r_2/2)\,\left(1 + r_1 w/r_2\right)^{(r_1+r_2)/2}} \int_0^{\infty} \frac{1}{\Gamma\!\left(\tfrac{r_1+r_2}{2}\right)\left[2/(1 + r_1 w/r_2)\right]^{(r_1+r_2)/2}}\; v^{(r_1+r_2)/2-1} \exp\!\left(-\frac{v}{2/(1 + r_1 w/r_2)}\right) dv$$

The integrand in this last integral is the p.d.f. for a random variable having a gamma( (r1 + r2)/2 , 2/(1 + r1w/r2) ) distribution. Since the integral must be equal to one (1), we now have that the p.d.f. of W is

$$g(w) = \frac{\Gamma\!\left(\tfrac{r_1+r_2}{2}\right)(r_1/r_2)^{r_1/2}\, w^{r_1/2-1}}{\Gamma(r_1/2)\,\Gamma(r_2/2)\,\left(1 + r_1 w/r_2\right)^{(r_1+r_2)/2}} \quad \text{if } 0 < w$$

This is the p.d.f. for a random variable having a Fisher's f distribution with r1 numerator degrees of freedom and r2 denominator degrees of freedom. This distribution is important in some future applications of the theory of statistics.
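As an added numerical check of this result, simulated values of W = (U/r1)/(V/r2) can be compared against the F distribution in scipy; the degrees of freedom and sample size below are arbitrary choices.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
r1, r2 = 5, 10                     # arbitrary degrees of freedom for the check
n = 100_000

u = rng.chisquare(r1, size=n)      # U ~ chi-square(r1)
v = rng.chisquare(r2, size=n)      # V ~ chi-square(r2)
w = (u / r1) / (v / r2)

# Empirical quantiles of W should track the F(r1, r2) quantiles
for q in (0.5, 0.9, 0.95, 0.99):
    print(q, np.quantile(w, q), stats.f.ppf(q, r1, r2))
```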


8. Suppose the random variable F has an f distribution with r1 numerator degrees of freedom and r2 denominator degrees of freedom.

(a) If r1 = 5 and r2 = 10, then P(F < 3.33) = 0.95

(b) If r1 = 5 and r2 = 10, then P(F > 5.64) = 0.01

(c) f0.025(4, 8) = 5.05

(d) f0.975(4, 8) = 1/f0.025(8, 4) = 1/8.98 = 0.111

(e) f0.025(8, 4) = 8.98

(f) f0.975(8, 4) = 1/f0.025(4, 8) = 1/5.05 = 0.198
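These table values, and the reciprocal relation f_(1–α)(r1, r2) = 1 / f_α(r2, r1) used above, can be checked with scipy (an added illustration, not part of the original notes):

```python
from scipy import stats

# (a), (b): probabilities for F(5, 10)
print(stats.f.cdf(3.33, 5, 10))        # approximately 0.95
print(stats.f.sf(5.64, 5, 10))         # approximately 0.01

# (c)-(f): upper percentage points f_alpha(r1, r2) = F^{-1}(1 - alpha)
print(stats.f.ppf(0.975, 4, 8))        # approximately 5.05
print(stats.f.ppf(0.025, 4, 8))        # approximately 0.111 = 1/8.98
print(stats.f.ppf(0.975, 8, 4))        # approximately 8.98
print(stats.f.ppf(0.025, 8, 4))        # approximately 0.198 = 1/5.05
```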