Chapter 1 Probability and Distributions Math 6203 Fall 2009 Instructor: Ayona Chatterjee


Page 1: Chapter 1 Probability and Distributions

Chapter 1: Probability and Distributions

Math 6203, Fall 2009

Instructor: Ayona Chatterjee

Page 2: Chapter 1 Probability and Distributions

1.1 INTRODUCTION

• Every experiment has one or more outcomes.
• Random experiment → a single outcome cannot be predicted, but outcomes over a long series of performances follow a certain rule.
• Sample space → the collection of every possible outcome.
  – Denoted by 𝒞.
  – Let c denote an element of 𝒞 and let C represent a collection of elements of 𝒞.

Page 3: Chapter 1 Probability and Distributions

• Examples: Toss a coin, toss 2 coins, throw a die. Write sample spaces.

• We define f/N as the relative frequency of the event C in N performances of the experiment.
• As N increases, f/N → p.
• Here p is the number that the relative frequency of the event C will approximately equal in future performances of the experiment.
• p is called the probability of the event C.
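The long-run relative frequency idea can be illustrated with a short simulation. This is a Python sketch of my own (the function names and the coin example are not from the slides); it estimates p for the event "heads" on a fair coin.

```python
import random

def relative_frequency(trials, event, experiment, seed=0):
    """Estimate P(event) by the relative frequency f/N over N performances."""
    rng = random.Random(seed)
    hits = sum(1 for _ in range(trials) if event(experiment(rng)))
    return hits / trials

# Toss a fair coin; the event C is "the outcome is heads".
toss = lambda rng: rng.choice(["H", "T"])
is_heads = lambda outcome: outcome == "H"

# f/N settles near p = 0.5 as N grows.
for n in (100, 10_000, 1_000_000):
    print(n, relative_frequency(n, is_heads, toss))
```

As N increases the printed frequencies stabilize near 0.5, the probability of the event.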

Page 4: Chapter 1 Probability and Distributions

• Note:
  – The above idea works only if the experiment can be repeated under identical conditions.
  – But sometimes p is personal or subjective.
  – In both cases the mathematical theory is the same.

Page 5: Chapter 1 Probability and Distributions

1.2 SET THEORY

• Set → a collection of particular objects.
  – Notation: c ∈ C means c is an element of the set C.

• One-one correspondence – Every member of A is paired with exactly one member of B. No two ordered pairs have the same first element or the same second element.

Page 6: Chapter 1 Probability and Distributions

Definitions

• If each element of C1 is also an element of C2, the set C1 is called a subset of the set C2 (written C1 ⊂ C2).

• Example: C1 = {x: 0 ≤ x ≤ 1} and C2 = {x: -1 ≤ x ≤ 2}; then C1 is a subset of C2.

• If C1 ⊂ C2 and C2 ⊂ C1, then C1 = C2.

Page 7: Chapter 1 Probability and Distributions

• Null set: if C has no elements, C = Φ.
• Union: the set of all elements that belong to at least one of the sets C1 and C2 is called the union of C1 and C2. Notation: C1 ∪ C2.
• Intersection: the set of all elements that belong to each of the sets C1 and C2 is called the intersection of C1 and C2. Denoted by C1 ∩ C2.

  – Note: C ∪ C = C and C ∩ C = C.

Page 8: Chapter 1 Probability and Distributions

• Complement: the set consisting of all elements of the sample space 𝒞 that are not elements of C is called the complement of C. We will denote the complement of C by Cᶜ.

• De Morgan’s Laws

• Let us prove the laws and do some examples for union, intersection, complement with – C1 ={x: 0≤x ≤1} and C2 ={x: -1 ≤x ≤2}

  (C1 ∪ C2)ᶜ = C1ᶜ ∩ C2ᶜ
  (C1 ∩ C2)ᶜ = C1ᶜ ∪ C2ᶜ
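De Morgan's laws can be verified mechanically on finite sets. This is a small Python check of my own (the particular universe and sets are arbitrary choices, not the slide's interval example).

```python
# Verify De Morgan's laws on finite sets; `universe` plays the role of
# the sample space, and complement(A) = universe - A.
universe = set(range(10))
C1 = {0, 1, 2, 3}
C2 = {2, 3, 4, 5}

complement = lambda A: universe - A

# (C1 ∪ C2)^c = C1^c ∩ C2^c
assert complement(C1 | C2) == complement(C1) & complement(C2)
# (C1 ∩ C2)^c = C1^c ∪ C2^c
assert complement(C1 & C2) == complement(C1) | complement(C2)
print("De Morgan's laws hold on this example")
```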

Page 9: Chapter 1 Probability and Distributions

• Let C be a set in one-dimensional space and let Q(C) equal the number of points in C that correspond to positive integers. Then Q(C) is a function of the set C.
  – Example: C = {x: 0 < x < 5}, then Q(C) = 4.
  – C = {-1, -2}, Q(C) = 0.

• Let C be a set in one-dimensional space and let Q(C) = ∑ f(x) over C, or
  Q(C) = ∫_C f(x) dx.
  – Note: f(x) has to be chosen with care, or else the sum or integral may fail to exist.

Page 10: Chapter 1 Probability and Distributions

Examples

• Let f(x) = (1/2)^x for x = 1, 2, 3, ..., and f(x) = 0 elsewhere. With C = {x: 0 < x < 3}, find Q(C) = ∑_C f(x).

• Let C be one-dimensional and let Q(C) = ∫_C e^(-x) dx. With C1 = {x: 0 ≤ x ≤ 2} and C2 = {x: 1 ≤ x ≤ 3}, find Q(C1 ∪ C2) and Q(C1 ∩ C2).

Page 11: Chapter 1 Probability and Distributions

1.3 THE PROBABILITY SET FUNCTION

• σ-field: Let B be a collection of subsets of 𝒞. We say B is a σ-field if
  – B is not empty.
  – If C ∈ B, then Cᶜ ∈ B.
  – If the sequence of sets {C1, C2, ...} is in B, then ∪_{i=1}^∞ Ci ∈ B.

• Example: B = {C, Cᶜ, Φ, 𝒞} is a σ-field.
• The σ-field generated by the open intervals of real numbers is called the Borel σ-field.

Page 12: Chapter 1 Probability and Distributions

Probability definition

• Let 𝒞 be a sample space and let B be a σ-field on 𝒞. Let P be a real-valued function defined on B. Then P is a probability set function if P satisfies the following three conditions:
  1. P(C) ≥ 0 for all C ∈ B.
  2. P(𝒞) = 1.
  3. If {Cn} is a sequence of sets in B and Cm ∩ Cn = Φ for all m ≠ n, then
     P(∪_{n=1}^∞ Cn) = ∑_{n=1}^∞ P(Cn).

Page 13: Chapter 1 Probability and Distributions

Theorems-work on proofs

• For each event C ∈ B, P(C) = 1 - P(Cᶜ).
• The probability of the null set is zero, that is, P(Φ) = 0.
• If C1 and C2 are events such that C1 ⊂ C2, then P(C1) ≤ P(C2).
• For each C ∈ B, 0 ≤ P(C) ≤ 1.
• If C1 and C2 are events in 𝒞, then P(C1 ∪ C2) = P(C1) + P(C2) - P(C1 ∩ C2).

Page 14: Chapter 1 Probability and Distributions

Inequalities

• Boole's inequality:
  P(C1 ∪ C2 ∪ ... ∪ Ck) ≤ P(C1) + P(C2) + ... + P(Ck)

• Bonferroni's inequality:
  P(C1 ∩ C2) ≥ P(C1) + P(C2) - 1
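Both inequalities can be checked exactly on a small finite sample space. This is a Python sketch of my own (the two-dice events are an illustrative choice, not from the slides), using exact rational arithmetic.

```python
from fractions import Fraction as F

# Sample space for a roll of two fair dice; P(event) counts favorable points.
space = [(i, j) for i in range(1, 7) for j in range(1, 7)]
P = lambda event: F(sum(1 for w in space if event(w)), len(space))

C1 = lambda w: w[0] == 1            # first die shows 1
C2 = lambda w: w[0] + w[1] == 7     # sum is 7
union = lambda w: C1(w) or C2(w)
inter = lambda w: C1(w) and C2(w)

assert P(union) <= P(C1) + P(C2)       # Boole's inequality
assert P(inter) >= P(C1) + P(C2) - 1   # Bonferroni's inequality
print(P(union), P(C1) + P(C2))
```

Here P(C1) = P(C2) = 6/36 and the events overlap at (1, 6), so the union has probability 11/36 < 12/36.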

Page 15: Chapter 1 Probability and Distributions

Definitions

• Mutually exclusive events: two events with no elements in common.

• Equilikely case: if there are k equally likely outcomes, each outcome has probability 1/k.

• Permutations and combinations.
  – Examples from the poker hand, page 17.

Page 16: Chapter 1 Probability and Distributions

Theorems

• Let {Cn} be an increasing sequence of events. Then
  lim_{n→∞} P(Cn) = P(lim_{n→∞} Cn) = P(∪_{n=1}^∞ Cn).

• Let {Cn} be a decreasing sequence of events. Then
  lim_{n→∞} P(Cn) = P(lim_{n→∞} Cn) = P(∩_{n=1}^∞ Cn).

• Let {Cn} be an arbitrary sequence of events. Then
  P(∪_{n=1}^∞ Cn) ≤ ∑_{n=1}^∞ P(Cn).

Page 17: Chapter 1 Probability and Distributions

1.4 CONDITIONAL PROBABILITY AND INDEPENDENCE

• Let C1 and C2 both be subsets of C. We want to define the probability of C2 relative to C1. This we call conditional probability, denoted by P(C2 | C1 ).

• Since C1 is the sample space now, the only elements of interest are the ones in both C1 and C2.

• P(C2 | C1) = P(C1 ∩ C2) / P(C1)
• P(C1 | C1) = 1
• Multiplication rule: P(C1 ∩ C2) = P(C1) P(C2 | C1)

Page 18: Chapter 1 Probability and Distributions

Properties of P(C2 | C1 )

• P(C2 | C1) ≥ 0
• P(C2 ∪ C3 ∪ ... | C1) = P(C2 | C1) + P(C3 | C1) + ..., provided C2, C3, ... are mutually disjoint sets.
• P(C1 | C1) = 1

Page 19: Chapter 1 Probability and Distributions

Examples

• Example: A hand of 5 cards is to be dealt at random without replacement from an ordinary deck of cards. Find the probability of an all-spade hand given that there are at least 4 spades in the hand.

• From an ordinary deck of playing cards, cards are to be drawn successively at random without replacement. Find the probability that the third spade appears on the sixth draw.

Page 20: Chapter 1 Probability and Distributions

Bayes' Theorem

• Law of total probability: assume that the events C1, ..., Ck are mutually exclusive and exhaustive. Let C be another event; then C occurs with one and only one of the events Ci.

• Bayes' theorem gives the conditional probability of Cj given C.

  P(C) = ∑_{i=1}^k P(Ci) P(C | Ci)

  P(Cj | C) = P(Cj ∩ C) / P(C) = P(Cj) P(C | Cj) / ∑_{i=1}^k P(Ci) P(C | Ci)
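The two displayed formulas translate directly into code. This is a Python sketch of my own (the two-urn priors and likelihoods are hypothetical numbers for illustration, not from the text).

```python
from fractions import Fraction as F

# Hypothetical setup: two urns chosen with equal prior probability;
# likelihood[urn] = P(drawing a red ball | that urn).
prior = {"C1": F(1, 2), "C2": F(1, 2)}
likelihood = {"C1": F(3, 10), "C2": F(7, 10)}

def posterior(priors, likes):
    """Bayes' theorem: P(Cj | C) = P(Cj) P(C | Cj) / sum_i P(Ci) P(C | Ci)."""
    # The denominator is the law of total probability, P(C).
    total = sum(priors[k] * likes[k] for k in priors)
    return {k: priors[k] * likes[k] / total for k in priors}

post = posterior(prior, likelihood)
print(post)  # {'C1': Fraction(3, 10), 'C2': Fraction(7, 10)}
```

After observing a red ball, the posterior mass shifts toward the urn with more red balls, exactly as the formula predicts.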

Page 21: Chapter 1 Probability and Distributions

Bayes' Theorem

• The probabilities P(Ci ) are called prior probabilities.

• The probabilities P(Ci | C) are called posterior probabilities.

• Let's work through Example 1.4.5, page 25, from the textbook.

Page 22: Chapter 1 Probability and Distributions

Independent Events

• Let C1 and C2 be two events. We say that C1 and C2 are independent if
  P(C1 ∩ C2) = P(C1) P(C2)

• Suppose we have three events. Then the events are mutually independent if and only if they are pairwise independent and
  P(C1 ∩ C2 ∩ C3) = P(C1) P(C2) P(C3)

• Let's work through example 1.4.10, page 29.
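Independence can be confirmed by exact counting on a finite sample space. This is a Python check of my own (the events "first die even" and "second die greater than 4" are an illustrative choice).

```python
from fractions import Fraction as F

# Two fair dice; events on different dice should be independent.
space = [(i, j) for i in range(1, 7) for j in range(1, 7)]
P = lambda event: F(sum(1 for w in space if event(w)), len(space))

C1 = lambda w: w[0] % 2 == 0   # first die is even, P = 1/2
C2 = lambda w: w[1] > 4        # second die is 5 or 6, P = 1/3

# Independence: P(C1 ∩ C2) = P(C1) P(C2) = 1/6.
assert P(lambda w: C1(w) and C2(w)) == P(C1) * P(C2)
print(P(C1), P(C2), P(C1) * P(C2))
```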

Page 23: Chapter 1 Probability and Distributions

1.5 RANDOM VARIABLES

• Consider a random experiment with a sample space C. A function X which assigns to each element c (c belongs to the sample space) one and only one number X(c) = x, is a random variable.

• Notation:
  – B: any event
  – D: a new sample space
  – {di}: simple events which belong to D
• The induced probability is
  P_X(B) = P[{c ∈ 𝒞 : X(c) ∈ B}]

Page 24: Chapter 1 Probability and Distributions

PMF

• Here P_X is completely determined by the function
  p_X(di) = P[{c : X(c) = di}],  i = 1, ..., m.

• The function p_X(di) is called the probability mass function (pmf).

• Example: write the pmf for X, the sum of the up faces on a roll of a pair of 6-sided dice.

• Compute the probability of the event B = {x : x = 7, 11}.
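The dice example above can be worked out by enumeration. This is a Python sketch of my own, building the pmf of the sum and then the probability of B = {7, 11}.

```python
from fractions import Fraction as F
from collections import Counter

# pmf of X = sum of the up faces on a pair of fair 6-sided dice.
counts = Counter(i + j for i in range(1, 7) for j in range(1, 7))
pmf = {x: F(n, 36) for x, n in counts.items()}

# Probability of the event B = {x : x = 7, 11}.
p_B = pmf[7] + pmf[11]
print(p_B)  # 6/36 + 2/36 = 8/36 = 2/9
```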

Page 25: Chapter 1 Probability and Distributions

Cumulative Distribution Function

• Let X be a random variable. Then its cdf is defined as
  F_X(x) = P_X((-∞, x]) = P(X ≤ x)

• Continuing with the previous example, define the cdf of X.

• For a continuous random variable, the probability density function can be obtained as
  f_X(x) = dF_X(x)/dx

Page 26: Chapter 1 Probability and Distributions

Note

• If X and Y are two random variables and F_X(x) = F_Y(x) for all x in R, then X and Y are equal in distribution, denoted X =_D Y.

• Though X and Y may be equal in distribution, X and Y themselves may be quite different.

• See the example in the book.

Page 27: Chapter 1 Probability and Distributions

Theorems

• Let X be a random variable with cumulative distribution function F(x). Then

• Let X be a rv with cdf Fx . Then for a < b, P[a<X≤b]=Fx (b)-Fx (a)

• F is nondecreasing: for all a and b, if a < b then F(a) ≤ F(b).
• lim_{x→-∞} F(x) = 0.
• lim_{x→∞} F(x) = 1.
• F is right-continuous: lim_{x↓x0} F(x) = F(x0).

Page 28: Chapter 1 Probability and Distributions

Theorem

• For any random variable

  P[X = x] = F_X(x) - F_X(x-) for all x ∈ R, where F_X(x-) = lim_{z↑x} F_X(z)

• Let's work through the proof.

Page 29: Chapter 1 Probability and Distributions

1.6 DISCRETE RANDOM VARIABLES

• We say a random variable is a discrete rv if its space is either finite or countable.

  – Consider a sequence of independent flips of a coin. Let X be the number of flips required to obtain the first head. What is the pmf of X?

– Five fuses are chosen from a lot of 100 fuses. The lot contains 20 defective fuses. Let X be the number of non-defective fuses, what is the pmf for X?

Page 30: Chapter 1 Probability and Distributions

Properties of pmf

(i) 0 < p_X(x) ≤ 1 for x ∈ D,
(ii) ∑_{x∈D} p_X(x) = 1.

Page 31: Chapter 1 Probability and Distributions

Transformations

• Suppose we are interested in some random variable Y, which is some transformation of X, say Y=g(X). Assume X is discrete with space Dx.

• Then the space of Y is D_Y = {g(x) : x ∈ D_X}.
• When g is one-to-one, the pmf of Y can be obtained as
  p_Y(y) = P[Y = y] = P[g(X) = y] = P[X = g^(-1)(y)] = p_X(g^(-1)(y))

Page 32: Chapter 1 Probability and Distributions

Example

• Let X have the pmf

• Find the pmf of Y, where Y = X².

  p_X(x) = (3! / (x!(3-x)!)) (1/3)^x (2/3)^(3-x) for x = 0, 1, 2, 3,
  and p_X(x) = 0 elsewhere.
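The transformation can be carried out by enumeration. This is a Python sketch of my own: it builds the binomial pmf on the slide and pushes it through g(x) = x², which is one-to-one on {0, 1, 2, 3}.

```python
from fractions import Fraction as F
from math import comb

# Binomial(3, 1/3) pmf, as on the slide.
p_X = {x: F(comb(3, x)) * F(1, 3)**x * F(2, 3)**(3 - x) for x in range(4)}

# g(x) = x^2 is one-to-one on {0, 1, 2, 3}, so p_Y(y) = p_X(g^(-1)(y)).
p_Y = {x * x: p for x, p in p_X.items()}

print(p_Y)  # supported on {0, 1, 4, 9}
assert sum(p_Y.values()) == 1
```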

Page 33: Chapter 1 Probability and Distributions

1.7 CONTINUOUS RANDOM VARIABLE

• We say a random variable is a continuous random variable if its cumulative distribution function F_X(x) is a continuous function for all x ∈ R.

• We define the cdf as
  F_X(x) = ∫_{-∞}^x f_X(t) dt

• The probability density function can be defined as
  f_X(x) = dF_X(x)/dx

Page 34: Chapter 1 Probability and Distributions

Note

P(a < X ≤ b) = F_X(b) - F_X(a) = ∫_a^b f_X(t) dt

f_X(t) ≥ 0

∫_{-∞}^∞ f_X(t) dt = 1

Page 35: Chapter 1 Probability and Distributions

Example

• Let the random variable X be the time in seconds between incoming telephone calls at a busy switchboard. Suppose that a reasonable pdf for X is
  f_X(x) = (1/4) e^(-x/4) for 0 < x < ∞, and f_X(x) = 0 elsewhere.
  Find P(X > 4).
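For this pdf the answer has a closed form, P(X > 4) = e^(-4/4) = e^(-1). This is a Python check of my own that compares the closed form against a simple Riemann sum over the tail.

```python
from math import exp

# f_X(x) = (1/4) e^(-x/4) for x > 0, so P(X > 4) = e^(-1).
closed_form = exp(-1)

# Left Riemann sum over [4, 200]; the tail beyond 200 is negligible.
dx = 0.001
numeric = sum(0.25 * exp(-x / 4) * dx
              for x in (4 + k * dx for k in range(int(196 / dx))))

print(closed_form, numeric)
assert abs(closed_form - numeric) < 1e-3
```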

Page 36: Chapter 1 Probability and Distributions

Transformation

• Let X be a continuous random variable with pdf f_X(x) and support S. Let Y = g(X), where g(x) is a one-to-one differentiable function on the support of X, S. Denote the inverse of g by x = g^(-1)(y) and let dx/dy = d[g^(-1)(y)]/dy. Then the pdf of Y is given by
  f_Y(y) = f_X(g^(-1)(y)) |dx/dy|  for y ∈ S_Y

Page 37: Chapter 1 Probability and Distributions

Example

• Let X have the pdf f(x) = 1 for 0 < x < 1 (and 0 elsewhere). Find the pdf of Y, where Y = -2 log X.
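Applying the transformation theorem here gives x = g^(-1)(y) = e^(-y/2), dx/dy = -(1/2) e^(-y/2), and hence f_Y(y) = (1/2) e^(-y/2) for y > 0, so that P(Y > 2) = e^(-1). This is a Monte Carlo check of my own (the sample size and seed are arbitrary choices).

```python
import random
from math import log, exp

# If X ~ Uniform(0,1) and Y = -2 ln X, then f_Y(y) = (1/2) e^(-y/2), y > 0,
# so P(Y > 2) = e^(-1). Estimate that probability by simulation.
rng = random.Random(42)
N = 200_000
hits = sum(1 for _ in range(N) if -2 * log(rng.random()) > 2)

print(hits / N, exp(-1))
assert abs(hits / N - exp(-1)) < 0.01
```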

Page 38: Chapter 1 Probability and Distributions

1.8 EXPECTATION OF A RANDOM VARIABLE

• Let X be a random variable. If X is a continuous random variable with pdf f(x) and
  ∫_{-∞}^∞ |x| f(x) dx < ∞,
  then the expectation of X is
  E(X) = ∫_{-∞}^∞ x f(x) dx.

Page 39: Chapter 1 Probability and Distributions

Example for a discrete RV

• Let the random variable X of the discrete type have the pmf given by the table below. Find E(X).

X 1 2 3 4

P(X) 4/10 1/10 3/10 2/10
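The expectation for the tabled pmf is a one-line sum. This is a Python sketch of my own using exact fractions.

```python
from fractions import Fraction as F

# E(X) = sum of x * p(x) for the pmf in the table.
pmf = {1: F(4, 10), 2: F(1, 10), 3: F(3, 10), 4: F(2, 10)}
EX = sum(x * p for x, p in pmf.items())
print(EX)  # 4/10 + 2/10 + 9/10 + 8/10 = 23/10
```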

Page 40: Chapter 1 Probability and Distributions

Theorem

• Expectation of a constant is a constant itself, that is E(k)=k.

• Let X be a random variable and let Y = g(X) for some function g.
  – Suppose X is continuous with pdf f_X(x). If
    ∫_{-∞}^∞ |g(x)| f_X(x) dx < ∞,
    then the expectation of Y exists and is given by
    E(Y) = ∫_{-∞}^∞ g(x) f_X(x) dx

Page 41: Chapter 1 Probability and Distributions

Theorem

• Let g1(X) and g2(X) be functions of a random variable X. Suppose the expectations of g1(X) and g2(X) exist. Then for any constants k1 and k2, the expectation of k1 g1(X) + k2 g2(X) exists and is given by
  E[k1 g1(X) + k2 g2(X)] = k1 E[g1(X)] + k2 E[g2(X)]
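Linearity of expectation can be verified exactly on a discrete example. This is a Python check of my own (the pmf, g1, g2, k1, k2 are arbitrary illustrative choices).

```python
from fractions import Fraction as F

# A small discrete pmf and an exact expectation operator E[g(X)].
pmf = {1: F(1, 2), 2: F(1, 4), 3: F(1, 4)}
E = lambda g: sum(g(x) * p for x, p in pmf.items())

g1 = lambda x: x
g2 = lambda x: x * x
k1, k2 = F(2), F(-3)

# E[k1 g1(X) + k2 g2(X)] = k1 E[g1(X)] + k2 E[g2(X)]
lhs = E(lambda x: k1 * g1(x) + k2 * g2(x))
rhs = k1 * E(g1) + k2 * E(g2)
assert lhs == rhs
print(lhs)
```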

Page 42: Chapter 1 Probability and Distributions

1.9 SOME SPECIAL EXPECTATIONS

• Some expectations have special names.
  – E(X) is called the mean value of X.
  – It is denoted by µ.
  – The mean is the first moment (about 0) of a random variable.

Page 43: Chapter 1 Probability and Distributions

Variance

• Let X be a random variable with finite mean µ and such that E[(X - µ)²] is finite. Then the variance of X is defined to be E[(X - µ)²]. It is usually denoted by σ² or by Var(X).
  – We can write σ² = E(X²) - µ².
  – Here σ is called the standard deviation of X.
    • It measures the dispersion of the points in the space relative to the mean.
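The two expressions for the variance agree, which can be confirmed exactly on the earlier discrete example. This is a Python check of my own (reusing the tabled pmf from the expectation example).

```python
from fractions import Fraction as F

# Variance two ways: E[(X - mu)^2] and E(X^2) - mu^2.
pmf = {1: F(4, 10), 2: F(1, 10), 3: F(3, 10), 4: F(2, 10)}
E = lambda g: sum(g(x) * p for x, p in pmf.items())

mu = E(lambda x: x)
var_def = E(lambda x: (x - mu) ** 2)      # definition
var_alt = E(lambda x: x * x) - mu ** 2    # shortcut formula
assert var_def == var_alt
print(var_def)  # 67/10 - (23/10)^2 = 141/100
```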

Page 44: Chapter 1 Probability and Distributions

Moment Generating Function

• Let X be a random variable such that for some h > 0 the expectation of e^(tX) exists for -h < t < h. The moment generating function (mgf) of X is defined to be the function M(t) = E(e^(tX)) for -h < t < h.

• If X and Y have the same distribution, then they have the same mgf in a neighborhood of 0. The converse is also true.

Page 45: Chapter 1 Probability and Distributions

Note

M(0) = 1

M^(m)(0) = E(X^m) = ∫_{-∞}^∞ x^m f(x) dx
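The moment-generating property can be checked numerically for a standard example not on the slides: if X ~ Exponential(1), then M(t) = 1/(1-t) for t < 1, so M'(0) should equal E(X) = 1. This is a Python sketch of my own.

```python
# mgf of X ~ Exponential(1): M(t) = 1/(1-t) for t < 1.
def mgf(t):
    return 1 / (1 - t)

assert mgf(0) == 1  # M(0) = 1 always holds

# Central-difference approximation of M'(0); should equal E(X) = 1.
h = 1e-5
deriv = (mgf(h) - mgf(-h)) / (2 * h)
assert abs(deriv - 1) < 1e-6
print(deriv)
```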