RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC.

TRANSCRIPT

Page 1: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC.

1

Page 2: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Variable

• Recall:
• Variable: A characteristic of a population or sample that is of interest to us.
• Random variable: A function defined on the sample space S that associates a real number with each outcome in S.

2

Page 3: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

DISCRETE RANDOM VARIABLES

• If the set of all possible values of a r.v. X is a countable set, then X is called a discrete r.v.

• The function f(x) = P(X=x) for x = x1, x2, … that assigns a probability to each value x is called the probability density function (p.d.f.) or probability mass function (p.m.f.).

3

Page 4: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Example

• Discrete Uniform distribution:

• Example: throw a fair die. P(X=1)=…=P(X=6)=1/6

4

P(X = x) = 1/N,  x = 1, 2, …, N;  N = 1, 2, …

Page 5: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

CONTINUOUS RANDOM VARIABLES

• When sample space is uncountable (continuous)

• Example: Continuous Uniform(a,b)

5

f(x) = 1/(b - a),  a ≤ x ≤ b.

Page 6: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

CUMULATIVE DISTRIBUTION FUNCTION (C.D.F.)

• The CDF of a r.v. X is defined as F(x) = P(X ≤ x).
• Note that P(a < X ≤ b) = F(b) - F(a).
• A function F(x) is a CDF for some r.v. X iff it satisfies

6

lim x→-∞ F(x) = 0

lim x→+∞ F(x) = 1

lim h→0+ F(x + h) = F(x), i.e. F(x) is continuous from the right

a < b implies F(a) ≤ F(b), i.e. F(x) is non-decreasing.

Page 7: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Example

• Consider tossing three fair coins.
• Let X = number of heads observed.
• S = {TTT, TTH, THT, HTT, THH, HTH, HHT, HHH}
• P(X=0) = P(X=3) = 1/8;  P(X=1) = P(X=2) = 3/8

7

x F(x)

(-∞,0) 0

[0,1) 1/8

[1,2) 1/2

[2,3) 7/8

[3, ∞) 1
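As an illustration, the pmf and CDF in the table above can be reproduced by enumerating the eight equally likely outcomes; a minimal Python sketch:

```python
from itertools import product
from fractions import Fraction

# Enumerate the 8 equally likely outcomes of tossing three fair coins.
outcomes = list(product("HT", repeat=3))

# pmf of X = number of heads
pmf = {}
for outcome in outcomes:
    x = outcome.count("H")
    pmf[x] = pmf.get(x, Fraction(0)) + Fraction(1, len(outcomes))

# CDF F(x) = P(X <= x), accumulated over the support in increasing order
cdf, running = {}, Fraction(0)
for x in sorted(pmf):
    running += pmf[x]
    cdf[x] = running

print({x: str(pmf[x]) for x in sorted(pmf)})   # {0: '1/8', 1: '3/8', 2: '3/8', 3: '1/8'}
print({x: str(cdf[x]) for x in sorted(cdf)})   # {0: '1/8', 1: '1/2', 2: '7/8', 3: '1'}
```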

Page 8: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Example

• Let

8

f(x) = 2(1 + x)^(-3) for x > 0,  and f(x) = 0 for x ≤ 0

F(x) = P(X ≤ x) = ∫ from 0 to x of 2(1 + t)^(-3) dt = 1 - (1 + x)^(-2) for x > 0,  and F(x) = 0 for x ≤ 0

P(0.4 < X < 0.45) = ∫ from 0.4 to 0.45 of f(x) dx = F(0.45) - F(0.4) ≈ 0.035
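Taking the density above, f(x) = 2(1+x)^(-3) for x > 0, the closed-form CDF and the probability can be checked numerically; an illustrative sketch using scipy:

```python
from scipy.integrate import quad

f = lambda x: 2 * (1 + x) ** (-3)      # pdf for x > 0
F = lambda x: 1 - (1 + x) ** (-2)      # cdf obtained by integrating the pdf

prob, _ = quad(f, 0.40, 0.45)          # P(0.40 < X < 0.45) by numerical integration
print(round(prob, 3))                  # 0.035
print(round(F(0.45) - F(0.40), 3))     # 0.035
```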

Page 9: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

JOINT DISTRIBUTIONS

• In many applications there is more than one random variable of interest, say X1, X2, …, Xk.

JOINT DISCRETE DISTRIBUTIONS

• The joint probability mass function (joint pmf) of the k-dimensional discrete rv X = (X1, X2, …, Xk) is

f(x1, x2, …, xk) = P(X1 = x1, X2 = x2, …, Xk = xk)

for all possible values (x1, x2, …, xk) of X.

9

Page 10: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

JOINT DISCRETE DISTRIBUTIONS

• A function f(x1, x2,…, xk) is the joint pmf for some vector valued rv X=(X1, X2,…,Xk) iff the following properties are satisfied:

f(x1, x2, …, xk) ≥ 0 for all (x1, x2, …, xk)

and

10

Σ_x1 … Σ_xk f(x1, x2, …, xk) = 1

Page 11: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Example

• Tossing two fair dice gives 36 possible sample points.
• Let X = sum of the two dice; Y = |difference of the two dice|.
• For example:
– For (3,3), X=6 and Y=0.
– For both (4,1) and (1,4), X=5 and Y=3.

11

Page 12: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Example

• Joint pmf of (X,Y):

12

y = 0: f(x, 0) = 1/36 for x = 2, 4, 6, 8, 10, 12
y = 1: f(x, 1) = 1/18 for x = 3, 5, 7, 9, 11
y = 2: f(x, 2) = 1/18 for x = 4, 6, 8, 10
y = 3: f(x, 3) = 1/18 for x = 5, 7, 9
y = 4: f(x, 4) = 1/18 for x = 6, 8
y = 5: f(x, 5) = 1/18 for x = 7

All other (x, y) cells are equal to 0.

e.g. P(X=7,Y≤4)=f(7,0)+f(7,1)+f(7,2)+f(7,3)+f(7,4)=0+1/18+0+1/18+0=1/9
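For illustration, the joint pmf, the marginal probabilities used on the next slides, and P(X=7, Y≤4) can all be reproduced by enumerating the 36 equally likely rolls:

```python
from fractions import Fraction
from collections import defaultdict

joint = defaultdict(Fraction)   # joint pmf f(x, y)
px = defaultdict(Fraction)      # marginal pmf of X (the sum)
py = defaultdict(Fraction)      # marginal pmf of Y (the absolute difference)

for d1 in range(1, 7):
    for d2 in range(1, 7):
        x, y = d1 + d2, abs(d1 - d2)
        p = Fraction(1, 36)
        joint[(x, y)] += p
        px[x] += p
        py[y] += p

print(px[2], py[2])                                                # 1/36 2/9  (2/9 = 4/18)
print(sum(p for (x, y), p in joint.items() if x == 7 and y <= 4))  # 1/9
```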

Page 13: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

MARGINAL DISCRETE DISTRIBUTIONS

• If the pair (X1,X2) of discrete random variables has the joint pmf f(x1,x2), then the marginal pmfs of X1 and X2 are

13

f1(x1) = Σ_x2 f(x1, x2)   and   f2(x2) = Σ_x1 f(x1, x2)

Page 14: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Example

• In the previous example,

14

P(X = 2) = Σ_{y=0..5} P(X = 2, y) = P(X = 2, y = 0) + … + P(X = 2, y = 5) = 1/36

P(Y = 2) = Σ_{x=2..12} P(x, Y = 2) = 4/18

Page 15: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

JOINT DISCRETE DISTRIBUTIONS

• JOINT CDF: F(x1, …, xk) = P(X1 ≤ x1, …, Xk ≤ xk).

• F(x1, x2) is a cdf iff

lim x1→-∞ F(x1, x2) = 0 for all x2, and lim x2→-∞ F(x1, x2) = 0 for all x1

lim x1→∞, x2→∞ F(x1, x2) = 1

lim h→0+ F(x1 + h, x2) = lim h→0+ F(x1, x2 + h) = F(x1, x2) (continuity from the right)

P(a < X1 ≤ b, c < X2 ≤ d) = F(b, d) - F(b, c) - F(a, d) + F(a, c) ≥ 0 for all a < b and c < d.

15

Page 16: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

JOINT CONTINUOUS DISTRIBUTIONS

• A k-dimensional vector valued rv X=(X1, X2,…,Xk) is said to be continuous if there is a function f(x1, x2,…, xk), called the joint probability density function (joint pdf), of X, such that the joint cdf can be given as

16

F(x1, x2, …, xk) = ∫ from -∞ to x1 ∫ from -∞ to x2 … ∫ from -∞ to xk f(t1, t2, …, tk) dtk … dt2 dt1

Page 17: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

JOINT CONTINUOUS DISTRIBUTIONS

• A function f(x1, x2,…, xk) is the joint pdf for some vector valued rv X=(X1, X2,…,Xk) iff the following properties are satisfied:

f(x1, x2, …, xk) ≥ 0 for all (x1, x2, …, xk)

and

17

∫ … ∫ f(x1, x2, …, xk) dx1 dx2 … dxk = 1

Page 18: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

JOINT CONTINUOUS DISTRIBUTIONS

• If the pair (X1, X2) of continuous random variables has the joint pdf f(x1, x2), then the marginal pdfs of X1 and X2 are

18

f1(x1) = ∫ f(x1, x2) dx2   and   f2(x2) = ∫ f(x1, x2) dx1

Page 19: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

JOINT DISTRIBUTIONS

• If X1, X2,…,Xk are independent from each other, then the joint pdf can be given as

And the joint cdf can be written as

19

f(x1, x2, …, xk) = f1(x1) f2(x2) … fk(xk)

F(x1, x2, …, xk) = F1(x1) F2(x2) … Fk(xk)

Page 20: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

CONDITIONAL DISTRIBUTIONS

• If X1 and X2 are discrete or continuous random variables with joint pdf f(x1, x2), then the conditional pdf of X2 given X1 = x1 is defined by

f(x2 | x1) = f(x1, x2) / f1(x1),  for any x1 such that f1(x1) > 0;  f(x2 | x1) = 0 elsewhere.

• For independent rvs,

f(x2 | x1) = f2(x2)   and   f(x1 | x2) = f1(x1).

20

Page 21: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Example

Statistical Analysis of Employment Discrimination Data (example from Dudewicz & Mishra, 1988; data from Dawson, Hankey and Myers, 1982)

21

% promoted (number of employees)

Pay grade Affected class others

5 100 (6) 84 (80)

7 88 (8) 87 (195)

9 93 (29) 88 (335)

10 7 (102) 8 (695)

11 7 (15) 11 (185)

12 10 (10) 7 (165)

13 0 (2) 9 (81)

14 0 (1) 7 (41)

The affected class might be, for example, a minority group or women.

Page 22: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Example, cont.

• Does this data indicate discrimination against the affected class in promotions in this company?

• Let X=(X1,X2,X3) where X1 is pay grade of an employee; X2 is an indicator of whether the employee is in the affected class or not; X3 is an indicator of whether the employee was promoted or not

• x1={5,7,9,10,11,12,13,14}; x2={0,1}; x3={0,1}

22

Page 23: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Example, cont.

• E.g., in pay grade 10 of this occupation (X1=10) there were 102 members of the affected class and 695 members of the other classes. Seven percent of the affected class in pay grade 10 had been promoted, that is (102)(0.07)=7 individuals out of 102 had been promoted.

• Out of 1950 employees, only 173 are in the affected class; this is not atypical in such studies.

23

Pay grade Affected class others

10 7 (102) 8 (695)

Page 24: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Example, cont.

• E.g. probability of a randomly selected employee being in pay grade 10, being in the affected class, and promoted: P(X1=10,X2=1,X3=1)=7/1950=0.0036 (Probability function of a discrete 3 dimensional r.v.)

• E.g. probability of a randomly selected employee being in pay grade 10 and promoted:

P(X1=10, X3=1)= (7+56)/1950=0.0323 (Note: 8% of 695 -> 56) (marginal probability function of X1 and X3)

24

Pay grade Affected class others

10 7 (102) 8 (695)

Page 25: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Example, cont.

• E.g. the probability that an employee is in the other class (X2=0) given that the employee is in pay grade 10 (X1=10) and was promoted (X3=1):

P(X2=0 | X1=10, X3=1) = P(X1=10, X2=0, X3=1) / P(X1=10, X3=1) = (56/1950)/(63/1950) = 0.89 (conditional probability)

• The probability that an employee is in the affected class (X2=1) given that the employee is in pay grade 10 (X1=10) and was promoted (X3=1):

P(X2=1 | X1=10, X3=1) = (7/1950)/(63/1950) = 0.11

25
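Since every probability here is a count divided by 1950, the joint, marginal, and conditional probabilities can be reproduced directly from the pay-grade-10 promotion counts (7 from the affected class, 56 from the others); a minimal sketch:

```python
total = 1950                 # total number of employees in the study
promoted_affected = 7        # pay grade 10, affected class, promoted (7% of 102)
promoted_others = 56         # pay grade 10, other classes, promoted (8% of 695)

p_joint = promoted_affected / total                                     # P(X1=10, X2=1, X3=1)
p_marginal = (promoted_affected + promoted_others) / total              # P(X1=10, X3=1)
p_cond_other = promoted_others / (promoted_affected + promoted_others)  # P(X2=0 | X1=10, X3=1)

print(round(p_joint, 4), round(p_marginal, 4), round(p_cond_other, 2))  # 0.0036 0.0323 0.89
```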

Page 26: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

26

Describing the Population

• We’re interested in describing the population by computing various parameters.

• For instance, we calculate the population mean and population variance.

Page 27: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

27

EXPECTED VALUES

Let X be a rv with pdf fX(x) and let g(X) be a function of X. Then the expected value (or the mean or the mathematical expectation) of g(X) is

E[g(X)] = Σ_x g(x) fX(x), if X is discrete

E[g(X)] = ∫ g(x) fX(x) dx, if X is continuous

provided the sum or the integral exists, i.e., -∞ < E[g(X)] < ∞.

Page 28: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

28

EXPECTED VALUES

• E[g(X)] is finite if E[| g(X) |] is finite.

Σ_x |g(x)| fX(x) < ∞, if X is discrete

∫ |g(x)| fX(x) dx < ∞, if X is continuous

Page 29: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

29

Population Mean (Expected Value)

• Given a discrete random variable X with values xi, that occur with probabilities p(xi), the population mean of X is

E(X) = μ = Σ_{all xi} xi p(xi)

Page 30: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

30

Population Variance

– Let X be a discrete random variable with possible values xi that occur with probabilities p(xi), and let E(X) = μ. The variance of X is defined by

V(X) = σ² = E[(X - μ)²] = Σ_{all xi} (xi - μ)² p(xi)

The standard deviation is σ = sqrt(σ²).

The variance is measured in squared units (unit × unit); the standard deviation is measured in the original units.

Page 31: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

31

EXPECTED VALUE

• The expected value or mean value of a continuous random variable X with pdf f(x) is

E(X) = μ = ∫_{all x} x f(x) dx

• The variance of a continuous random variable X with pdf f(x) is

Var(X) = σ² = E[(X - μ)²] = ∫_{all x} (x - μ)² f(x) dx = E(X²) - μ² = ∫_{all x} x² f(x) dx - μ²

Page 32: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

32

EXAMPLE

• The pmf for the number of defective items in a lot is as follows:

p(x) = 0.35 for x = 0
p(x) = 0.39 for x = 1
p(x) = 0.19 for x = 2
p(x) = 0.06 for x = 3
p(x) = 0.01 for x = 4

Find the expected number and the variance of defective items.

Results: E(X)=0.99, Var(X)=0.8699
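A quick way to verify these results (illustrative sketch):

```python
pmf = {0: 0.35, 1: 0.39, 2: 0.19, 3: 0.06, 4: 0.01}   # p(x) from the slide

mean = sum(x * p for x, p in pmf.items())              # E(X)
var = sum(x**2 * p for x, p in pmf.items()) - mean**2  # Var(X) = E(X^2) - [E(X)]^2

print(round(mean, 2), round(var, 4))                   # 0.99 0.8699
```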

Page 33: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

33

EXAMPLE

• Let X be a random variable. Its pdf is f(x) = 2(1 - x), 0 < x < 1.

Find E(X) and Var(X).
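The slide expects the integrals to be done by hand (E(X) = 1/3 and Var(X) = 1/18); as an illustration, they can also be evaluated numerically with scipy:

```python
from scipy.integrate import quad

f = lambda x: 2 * (1 - x)                      # pdf on 0 < x < 1

mean, _ = quad(lambda x: x * f(x), 0, 1)       # E(X)   = 1/3
ex2, _ = quad(lambda x: x**2 * f(x), 0, 1)     # E(X^2) = 1/6
var = ex2 - mean**2                            # Var(X) = 1/18

print(round(mean, 4), round(var, 4))           # 0.3333 0.0556
```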

Page 34: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

34

Laws of Expected Value

• Let X be a rv and a, b, and c be constants. Then, for any two functions g1(x) and g2(x) whose expectations exist:

a) E[a g1(X) + b g2(X) + c] = a E[g1(X)] + b E[g2(X)] + c

b) If g1(x) ≥ 0 for all x, then E[g1(X)] ≥ 0.

c) If g1(x) ≤ g2(x) for all x, then E[g1(X)] ≤ E[g2(X)].

d) If a ≤ g1(x) ≤ b for all x, then a ≤ E[g1(X)] ≤ b.

Page 35: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

35

Laws of Expected Value and Variance

Let X be a rv and c be a constant.

Laws of Expected Value:  E(c) = c;  E(X + c) = E(X) + c;  E(cX) = cE(X)

Laws of Variance:  V(c) = 0;  V(X + c) = V(X);  V(cX) = c²V(X)

Page 36: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

EXPECTED VALUE

36

E(Σ_{i=1..k} ai Xi) = Σ_{i=1..k} ai E(Xi)

If X and Y are independent,

E[g(X) h(Y)] = E[g(X)] E[h(Y)]

The covariance of X and Y is defined as

Cov(X, Y) = E[(X - E(X))(Y - E(Y))] = E(XY) - E(X) E(Y)

Page 37: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

EXPECTED VALUE

37

If X and Y are independent,

Cov(X, Y) = 0

The converse is generally not true! It does hold under the normal distribution:

If (X, Y) ~ Normal, then X and Y are independent iff Cov(X, Y) = 0.

Page 38: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

EXPECTED VALUE

38

Var(X1 + X2) = Var(X1) + Var(X2) + 2 Cov(X1, X2)

If X1 and X2 are independent,

Var(X1 + X2) = Var(X1) + Var(X2)

Page 39: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

CONDITIONAL EXPECTATION AND VARIANCE

39

E(Y | x) = Σ_y y f(y | x), if X and Y are discrete

E(Y | x) = ∫ y f(y | x) dy, if X and Y are continuous

Var(Y | x) = E(Y² | x) - [E(Y | x)]²

Page 40: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

CONDITIONAL EXPECTATION AND VARIANCE

40

E[E(Y | X)] = E(Y)

Var(Y) = E_X[Var(Y | X)] + Var_X[E(Y | X)]

(EVVE rule)

Proofs available in Casella & Berger (1990), pgs. 154 & 158

Page 41: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Example

• An insect lays a large number of eggs, each surviving with probability p. Consider a large number of mothers. X: number of survivors in a litter; Y: number of eggs laid

• Assume:

• Find: expected number of survivors, i.e. E(X)

41

X | Y ~ Binomial(Y, p)

Y | Λ ~ Poisson(Λ)

Λ ~ Exponential(β)

Page 42: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Example - solution

E(X) = E[E(X | Y)] = E(Yp) = p E(Y) = p E[E(Y | Λ)] = p E(Λ) = pβ

42
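The result E(X) = pβ can also be sanity-checked by simulation; a rough sketch (the parameter values p = 0.1 and β = 20 are arbitrary choices for illustration, not from the slides):

```python
import numpy as np

rng = np.random.default_rng(0)
p, beta, n = 0.1, 20.0, 1_000_000

lam = rng.exponential(scale=beta, size=n)   # Lambda ~ Exponential(beta), so E(Lambda) = beta
eggs = rng.poisson(lam)                     # Y | Lambda ~ Poisson(Lambda)
survivors = rng.binomial(eggs, p)           # X | Y ~ Binomial(Y, p)

print(survivors.mean(), p * beta)           # simulated mean is close to p*beta = 2.0
```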

Page 43: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

43

SOME MATHEMATICAL EXPECTATIONS

• Population Mean: μ = E(X)

• Population Variance: σ² = Var(X) = E[(X - μ)²] = E(X²) - μ² ≥ 0

(a measure of the deviation from the population mean)

• Population Standard Deviation: σ = sqrt(σ²) ≥ 0

• Moments: μ'_k = E(X^k) is the k-th moment;  μ_k = E[(X - μ)^k] is the k-th central moment

Page 44: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

44

SKEWNESS

• Measure of lack of symmetry in the pdf.

Skewness = μ3 / σ³ = μ3 / μ2^(3/2)

If the distribution of X is symmetric around its mean μ, then μ3 = 0 and Skewness = 0.

Page 45: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

45

KURTOSIS

• Measure of the peakedness of the pdf. Describes the shape of the distribution.

Kurtosis = μ4 / σ⁴ = μ4 / μ2²

Kurtosis = 3: Normal
Kurtosis > 3: Leptokurtic (peaked and fat tails)
Kurtosis < 3: Platykurtic (less peaked and thinner tails)

Page 46: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

46

Measures of Central Location

• Usually, we focus our attention on two types of measures when describing population characteristics:
– Central location
– Variability or spread

Page 47: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

47

Measures of Central Location

• The measure of central location reflects the locations of all the data points.
• How?

With one data point, clearly the central location is at the point itself.

With two data points, the central location should fall in the middle between them (in order to reflect the location of both of them).

But if a third data point appears on the left-hand side of the midrange, it should “pull” the central location to the left.

Page 48: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

48

Mean = (Sum of the observations) / (Number of observations)

• This is the most popular measure of central location

The Arithmetic Mean

Page 49: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

49

The Arithmetic Mean

Sample mean:  x̄ = (Σ_{i=1..n} xi) / n,  where n is the sample size

Population mean:  μ = (Σ_{i=1..N} xi) / N,  where N is the population size

Page 50: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

50

• Example: The reported times on the Internet of 10 adults are 0, 7, 12, 5, 33, 14, 8, 0, 9, 22 hours. Find the mean time on the Internet.

x̄ = (Σ_{i=1..10} xi) / 10 = (x1 + x2 + … + x10) / 10 = (0 + 7 + … + 22) / 10 = 11.0 hours

The Arithmetic Mean

Page 51: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

51

The Arithmetic Mean

• Drawback of the mean: It can be influenced by unusual observations, because it uses all the information in the data set.

Page 52: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

52

The Median

• The Median of a set of observations is the value that falls in the middle when the observations are arranged in order of magnitude. It divides the data in half.

Example: Find the median of the time on the Internet for the 10 adults of the previous example.

Even number of observations: 0, 0, 5, 7, 8, 9, 12, 14, 22, 33 → the median is the average of the two middle values, (8 + 9)/2 = 8.5.

Comment: Suppose only 9 adults were sampled (exclude, say, the longest time, 33). Odd number of observations: 0, 0, 5, 7, 8, 9, 12, 14, 22 → the median is the middle value, 8.

Page 53: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

53

• The Mode of a set of observations is the value that occurs most frequently.

• Set of data may have one mode (or modal class), or two or more modes.

The modal class

The Mode

Page 54: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

54

• Find the mode for the data in the Example. Here are the data again: 0, 7, 12, 5, 33, 14, 8, 0, 9, 22

Solution

• All observations except “0” occur once. There are two “0”s. Thus, the mode is zero.
• Is this a good measure of central location?
• The value “0” does not reside at the center of this set (compare with the mean = 11.0 and the median = 8.5).

The Mode
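For illustration, all three measures for the Internet-time data can be computed in a few lines of Python:

```python
import statistics

hours = [0, 7, 12, 5, 33, 14, 8, 0, 9, 22]   # reported hours on the Internet

print(sum(hours) / len(hours))    # 11.0  (the arithmetic mean)
print(statistics.median(hours))   # 8.5   (average of the two middle values, 8 and 9)
print(statistics.mode(hours))     # 0     (the only value that occurs twice)
```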

Page 55: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

55

Relationship among Mean, Median, and Mode

• If a distribution is symmetric and bell shaped, then the mean, median and mode coincide.

• If a distribution is asymmetrical, and skewed to the left or to the right, the three measures differ.

For a symmetric distribution: Mean = Median = Mode.

For a positively skewed distribution (“skewed to the right”): Mode < Median < Mean.

Page 56: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

56

Relationship among Mean, Median, and Mode

• If a distribution is asymmetric and skewed to the left or to the right, the three measures differ.

A positively skewed distribution (“skewed to the right”): Mode < Median < Mean.

A negatively skewed distribution (“skewed to the left”): Mean < Median < Mode.

Page 57: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

57

Measures of variability

• Measures of central location fail to tell the whole story about the distribution.

• A question of interest still remains unanswered:

How much are the observations spread out around the mean value?

Page 58: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

58

Measures of variability

Observe two hypothetical data sets:

The average value provides a good representation of the observations in the data set. (Small variability)

This data set is now changing to...

Page 59: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

59

Measures of Variability

Observe two hypothetical data sets:

Small variability: the average value provides a good representation of the observations in the data set.

Larger variability: the same average value does not provide as good a representation of the observations in the data set as before.

Page 60: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

60

– The range of a set of observations is the difference between the largest and smallest observations.

– Its major advantage is the ease with which it can be computed.

– Its major shortcoming is its failure to provide information on the dispersion of the observations between the two end points.

The Range

But how do all the observations spread out between the smallest and the largest observation? The range cannot assist in answering this question.

Page 61: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

61

The Variance

This measure reflects the dispersion of all the observations.

The variance of a population of size N, x1, x2, …, xN, whose mean is μ, is defined as

σ² = Σ_{i=1..N} (xi - μ)² / N

The variance of a sample of n observations x1, x2, …, xn whose mean is x̄ is defined as

s² = Σ_{i=1..n} (xi - x̄)² / (n - 1)

Page 62: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

62

Why not use the sum of deviations?

Consider two small populations, A and B, both with mean 10:

A: 8, 9, 10, 11, 12
B: 4, 7, 10, 13, 16

Deviations from the mean in A: 8-10 = -2, 9-10 = -1, 11-10 = +1, 12-10 = +2 → Sum = 0

Deviations from the mean in B: 4-10 = -6, 7-10 = -3, 13-10 = +3, 16-10 = +6 → Sum = 0

The mean of both populations is 10, but the measurements in B are more dispersed than those in A. A measure of dispersion should agree with this observation. Can the sum of deviations be a good measure of dispersion? The sum of deviations is zero for both populations; therefore, it is not a good measure of dispersion.

Page 63: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

63

Let us calculate the variance of the two populations:

σ²_A = [(8-10)² + (9-10)² + (10-10)² + (11-10)² + (12-10)²] / 5 = 2

σ²_B = [(4-10)² + (7-10)² + (10-10)² + (13-10)² + (16-10)²] / 5 = 18

Why is the variance defined as the average squared deviation? Why not use the sum of squared deviations as a measure of variation instead? After all, the sum of squared deviations increases in magnitude when the variation of a data set increases!

The Variance

Page 64: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

64

Which data set has a larger dispersion?

(Dot plots of two data sets: A, with observations between 1 and 3 around a mean of 2, and B, with observations at 1 and 5 around a mean of 3.)

Data set B is more dispersed around the mean.

Let us calculate the sum of squared deviations for both data sets.

The Variance

Page 65: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

65

Sum_A = (1-2)² + … + (1-2)² + (3-2)² + … + (3-2)² = 10

Sum_B = (1-3)² + (5-3)² = 8

SumA > SumB. This is inconsistent with the observation that set B is more dispersed.

The Variance

Page 66: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

66

However, when calculated on a “per observation” basis (the variance), the data set dispersions are properly ranked:

σ²_A = Sum_A / N_A = 10/5 = 2

σ²_B = Sum_B / N_B = 8/2 = 4

The Variance

Page 67: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

67

• Example: The following sample consists of the number of jobs six students applied for: 17, 15, 23, 7, 9, 13. Find its mean and variance.

• Solution:

x̄ = (Σ_{i=1..6} xi) / 6 = (17 + 15 + 23 + 7 + 9 + 13) / 6 = 84/6 = 14 jobs

s² = Σ_{i=1..n} (xi - x̄)² / (n - 1) = (1/(6-1)) [(17-14)² + (15-14)² + … + (13-14)²] = 33.2 jobs²

The Variance

Page 68: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

68

s² = (1/(n-1)) [Σ_{i=1..n} xi² - (Σ_{i=1..n} xi)² / n]

   = (1/(6-1)) [(17² + 15² + … + 13²) - (17 + 15 + … + 13)² / 6] = 33.2 jobs²

The Variance – Shortcut method
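Both the definition and the shortcut formula give the same answer on the job-applications data; a minimal illustrative check:

```python
data = [17, 15, 23, 7, 9, 13]   # number of jobs applied for
n = len(data)
mean = sum(data) / n            # 14.0 jobs

# Definition: s^2 = sum((x - mean)^2) / (n - 1)
s2_direct = sum((x - mean) ** 2 for x in data) / (n - 1)

# Shortcut: s^2 = (sum(x^2) - (sum(x))^2 / n) / (n - 1)
s2_shortcut = (sum(x * x for x in data) - sum(data) ** 2 / n) / (n - 1)

print(mean, s2_direct, s2_shortcut)   # 14.0 33.2 33.2
```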

Page 69: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

69

• The standard deviation of a set of observations is the square root of the variance.

Sample standard deviation:  s = sqrt(s²)

Population standard deviation:  σ = sqrt(σ²)

Standard Deviation

Page 70: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

70

• The coefficient of variation of a set of measurements is the standard deviation divided by the mean value.

• This coefficient provides a proportionate measure of variation.

Population coefficient of variation:  CV = σ / μ

Sample coefficient of variation:  cv = s / x̄

A standard deviation of 10 may be perceived as large when the mean value is 100, but only moderately large when the mean value is 500.

The Coefficient of Variation

Page 71: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Percentiles

• Example from http://www.ehow.com/how_2310404_calculate-percentiles.html

• Your test score, e.g. 70%, tells you how many questions you answered correctly. However, it doesn’t tell how well you did compared to the other people who took the same test.

• If the percentile of your score is 75, then you scored higher than 75% of other people who took the test.

71

Page 72: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

72

Sample Percentiles

• Percentile: The pth percentile of a set of measurements is the value for which
  • p percent of the observations are less than that value, and
  • (100 - p) percent of the observations are greater than that value.

Page 73: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

73

Sample Percentiles

• Find the 10th percentile of 6 8 3 6 2 8 1.
• Order the data: 1 2 3 6 6 8 8.
• 7 × (0.10) = 0.70; round up to 1.
• The first observation, 1, is the 10th percentile.
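The rule used above (order the data, compute n × p, round up, take that observation) can be written as a small function; a sketch under that convention (other percentile conventions exist and can give slightly different answers):

```python
import math

def percentile(data, p):
    """p-th percentile using the 'order the data, round n*p up' rule described above."""
    ordered = sorted(data)
    position = max(math.ceil(len(ordered) * p / 100), 1)   # position counted from 1
    return ordered[position - 1]

print(percentile([6, 8, 3, 6, 2, 8, 1], 10))   # 1  (7 * 0.10 = 0.7, rounded up to position 1)
```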

Page 74: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

74

• Commonly used percentiles
– First (lower) quartile, Q1 = 25th percentile
– Second (middle) quartile, Q2 = 50th percentile
– Third quartile, Q3 = 75th percentile
– Fourth quartile, Q4 = 100th percentile
– First (lower) decile = 10th percentile
– Ninth (upper) decile = 90th percentile

Page 75: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

75

Quartiles and Variability

• Quartiles can provide an idea about the shape of a histogram

(Histograms with Q1, Q2, and Q3 marked: a positively skewed histogram and a negatively skewed histogram.)

Page 76: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

76

Interquartile Range

Interquartile range = Q3 – Q1

• A large value indicates a large spread of the observations.

Page 77: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

77

Paired Data Sets and the Sample Correlation Coefficient

• The covariance and the coefficient of correlation are used to measure the direction and strength of the linear relationship between two variables.

– Covariance: is there any pattern to the way two variables move together?
– Coefficient of correlation: how strong is the linear relationship between two variables?

Page 78: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

78

Covariance

Population covariance:  COV(X, Y) = Σ (xi - μx)(yi - μy) / N

where μx (μy) is the population mean of the variable X (Y) and N is the population size.

Sample covariance:  cov(x, y) = Σ (xi - x̄)(yi - ȳ) / (n - 1)

where x̄ (ȳ) is the sample mean of the variable X (Y) and n is the sample size.

Page 79: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

79

• If the two variables move in opposite directions, (one increases when the other one decreases), the covariance is a large negative number.

• If the two variables are unrelated, the covariance will be close to zero.

• If the two variables move in the same direction, (both increase or both decrease), the covariance is a large positive number.

Covariance

Page 80: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

80

• Compare the following three sets

Covariance

Set 1:  xi = 2, 6, 7;  yi = 13, 20, 27
(xi - x̄) = -3, 1, 2;  (yi - ȳ) = -7, 0, 7;  (xi - x̄)(yi - ȳ) = 21, 0, 14
x̄ = 5, ȳ = 20, cov(x, y) = 17.5

Set 2:  xi = 2, 6, 7;  yi = 27, 20, 13
(xi - x̄) = -3, 1, 2;  (yi - ȳ) = 7, 0, -7;  (xi - x̄)(yi - ȳ) = -21, 0, -14
x̄ = 5, ȳ = 20, cov(x, y) = -17.5

Set 3:  xi = 2, 6, 7;  yi = 20, 27, 13
x̄ = 5, ȳ = 20, cov(x, y) = -3.5
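These covariances (and the corresponding correlation coefficients introduced on the next slide) can be checked with numpy, whose np.cov uses the same n-1 divisor as the sample formula; an illustrative sketch:

```python
import numpy as np

x = [2, 6, 7]
sets = {"set 1": [13, 20, 27], "set 2": [27, 20, 13], "set 3": [20, 27, 13]}

for name, y in sets.items():
    cov = np.cov(x, y, ddof=1)[0, 1]    # sample covariance of x and y
    r = np.corrcoef(x, y)[0, 1]         # sample correlation coefficient
    print(name, round(cov, 1), round(r, 2))
# set 1: 17.5, 0.94   set 2: -17.5, -0.94   set 3: -3.5, -0.19
```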

Page 81: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

81

– This coefficient answers the question: how strong is the association between X and Y?

The coefficient of correlation

Population coefficient of correlation:  ρ = COV(X, Y) / (σx σy)

Sample coefficient of correlation:  r = cov(X, Y) / (sx sy)

Page 82: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

82

r close to +1 (or COV(X, Y) > 0): strong positive linear relationship

r = 0 (or COV(X, Y) = 0): no linear relationship

r close to -1 (or COV(X, Y) < 0): strong negative linear relationship

The coefficient of correlation

Page 83: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

83

• If the two variables are very strongly positively related, the coefficient value is close to +1 (strong positive linear relationship).

• If the two variables are very strongly negatively related, the coefficient value is close to -1 (strong negative linear relationship).

• No straight line relationship is indicated by a coefficient close to zero.

The Coefficient of Correlation

Page 84: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

The Coefficient of Correlation

84

Page 85: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Correlation and causation

• Recognize the difference between correlation and causation — just because two things occur together, that does not necessarily mean that one causes the other.

• For random processes, causation means that if A occurs, that causes a change in the probability that B occurs.

85

Page 86: RANDOM VARIABLES, EXPECTATIONS, VARIANCES ETC

Correlation and causation

• The existence of a statistical relationship, no matter how strong it is, does not imply a cause-and-effect relationship between X and Y. For example, let X be the size of vocabulary and Y be the writing speed for a group of children. There will most probably be a positive relationship, but this does not imply that an increase in vocabulary causes an increase in the speed of writing. Other variables, such as age and education, will affect both X and Y.

• Even if there is a causal relationship between X and Y, it might be in the opposite direction, i.e. from Y to X. For example, let X be a thermometer reading and let Y be the actual temperature. Here Y will affect X.

86