
Probability and Random Processes

Serik Sagitov, Chalmers University of Technology and Gothenburg University

Abstract

Lecture notes based on the book Probability and Random Processes by Geoffrey Grimmett and David Stirzaker. Last updated August 12, 2013.

Contents

Abstract

1 Random events and variables
    1.1 Probability space
    1.2 Conditional probability and independence
    1.3 Random variables
    1.4 Random vectors
    1.5 Filtration

2 Expectation and conditional expectation
    2.1 Expectation
    2.2 Conditional expectation and prediction
    2.3 Multinomial distribution
    2.4 Multivariate normal distribution
    2.5 Sampling from a distribution
    2.6 Probability generating, moment generating, and characteristic functions
    2.7 Inequalities

3 Convergence of random variables
    3.1 Borel-Cantelli lemmas
    3.2 Modes of convergence
    3.3 Continuity of expectation
    3.4 Weak law of large numbers
    3.5 Central limit theorem
    3.6 Strong LLN

4 Markov chains
    4.1 Simple random walks
    4.2 Markov chains
    4.3 Stationary distributions
    4.4 Reversibility
    4.5 Poisson process and continuous-time Markov chains
    4.6 The generator of a continuous-time Markov chain

5 Stationary processes
    5.1 Weakly and strongly stationary processes
    5.2 Linear prediction
    5.3 Linear combination of sinusoids
    5.4 The spectral representation
    5.5 Stochastic integral
    5.6 The ergodic theorem for weakly stationary processes
    5.7 The ergodic theorem for strongly stationary processes
    5.8 Gaussian processes

6 Renewal theory and queues
    6.1 Renewal function and excess life
    6.2 LLN and CLT for the renewal process
    6.3 Stopping times and Wald's equation
    6.4 Stationary distribution
    6.5 Renewal-reward processes
    6.6 Regeneration technique for queues
    6.7 M/M/1 queues
    6.8 M/G/1 queues
    6.9 G/M/1 queues
    6.10 G/G/1 queues

7 Martingales
    7.1 Definitions and examples
    7.2 Convergence in L2
    7.3 Doob's decomposition
    7.4 Hoeffding's inequality
    7.5 Convergence in L1
    7.6 Bounded stopping times. Optional sampling theorem
    7.7 Unbounded stopping times
    7.8 Maximal inequality
    7.9 Backward martingales. Strong LLN

8 Diffusion processes
    8.1 The Wiener process
    8.2 Properties of the Wiener process
    8.3 Examples of diffusion processes
    8.4 The Ito formula
    8.5 The Black-Scholes formula

1 Random events and variables

1.1 Probability space

A random experiment is modeled in terms of a probability space $(\Omega, \mathcal{F}, P)$:

- the sample space $\Omega$ is the set of all possible outcomes of the experiment;

- the $\sigma$-field (or sigma-algebra) $\mathcal{F}$ is a collection of measurable subsets $A \subseteq \Omega$ (which are called random events) satisfying
  1. $\Omega \in \mathcal{F}$,
  2. if $A_i \in \mathcal{F}$, $i = 1, 2, \ldots$, then $\bigcup_{i=1}^{\infty} A_i \in \mathcal{F}$ (countable unions),
  3. if $A \in \mathcal{F}$, then $A^c \in \mathcal{F}$ (complementary event);

- the probability measure $P$ is a function on $\mathcal{F}$ satisfying three probability axioms:
  1. if $A \in \mathcal{F}$, then $P(A) \ge 0$,
  2. $P(\Omega) = 1$,
  3. if $A_i \in \mathcal{F}$, $i = 1, 2, \ldots$ are pairwise disjoint, then $P\big(\bigcup_{i=1}^{\infty} A_i\big) = \sum_{i=1}^{\infty} P(A_i)$.


De Morgan's laws:

$$\Big(\bigcup_i A_i\Big)^c = \bigcap_i A_i^c, \qquad \Big(\bigcap_i A_i\Big)^c = \bigcup_i A_i^c.$$

Properties derived from the axioms:

$$P(\emptyset) = 0, \qquad P(A^c) = 1 - P(A),$$

$$P(A \cup B) = P(A) + P(B) - P(A \cap B).$$

Inclusion-exclusion rule:

$$P(A_1 \cup \ldots \cup A_n) = \sum_i P(A_i) - \sum_{i<j} P(A_i \cap A_j) + \sum_{i<j<k} P(A_i \cap A_j \cap A_k) - \ldots + (-1)^{n+1} P(A_1 \cap \ldots \cap A_n).$$
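The inclusion-exclusion rule can be verified exactly on a small finite probability space. The sketch below is not part of the notes; the sample space and the three events are arbitrary choices for illustration. It compares the probability of the union with the alternating sum over intersections:

```python
import itertools
from fractions import Fraction

# Uniform probability space on Omega = {0, ..., 11}: P(A) = |A| / |Omega|.
OMEGA = set(range(12))

def prob(event):
    return Fraction(len(event), len(OMEGA))

# Three overlapping events (arbitrary choices for illustration).
A1 = {n for n in OMEGA if n % 2 == 0}   # even outcomes
A2 = {n for n in OMEGA if n % 3 == 0}   # multiples of 3
A3 = {n for n in OMEGA if n < 5}        # small outcomes
events = [A1, A2, A3]

# Left-hand side: probability of the union.
lhs = prob(A1 | A2 | A3)

# Right-hand side: alternating sum over all non-empty sub-collections.
rhs = Fraction(0)
for r in range(1, len(events) + 1):
    for combo in itertools.combinations(events, r):
        rhs += (-1) ** (r + 1) * prob(set.intersection(*combo))

assert lhs == rhs  # inclusion-exclusion holds exactly
```

Working over `Fraction` keeps all probabilities exact, so the two sides agree with no floating-point tolerance.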

Definition 1.3 Distribution function (cumulative distribution function):

$$F(x) = F_X(x) = P_X\{(-\infty, x]\} = P(X \le x).$$

In terms of the distribution function we get

$$P(a < X \le b) = F(b) - F(a), \qquad P(X < x) = F(x-), \qquad P(X = x) = F(x) - F(x-).$$

Any monotone right-continuous function with

$$\lim_{x \to -\infty} F(x) = 0 \quad \text{and} \quad \lim_{x \to \infty} F(x) = 1$$

can be a distribution function.

Definition 1.4 The random variable X is called discrete, if for some countable set of possible values

$$P(X \in \{x_1, x_2, \ldots\}) = 1.$$

Its distribution is described by the probability mass function $f(x) = P(X = x)$.

The random variable X is called (absolutely) continuous, if its distribution has a probability density function $f(x)$:

$$F(x) = \int_{-\infty}^{x} f(y)\,dy, \quad \text{for all } x,$$

so that $f(x) = F'(x)$ almost everywhere.
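To make the relation between a density and its distribution function concrete, here is a small numerical check. The choice of the Exponential(1) distribution is our own illustration, not one singled out by the notes: the probability of an interval computed as F(b) - F(a) should agree with the integral of the density over that interval.

```python
import math

# Density and distribution function of the Exponential(1) law.
def f(y):
    return math.exp(-y) if y >= 0 else 0.0

def F(x):
    return 1.0 - math.exp(-x) if x >= 0 else 0.0

# Midpoint-rule approximation of the integral of f over (a, b].
def integral(a, b, n=100_000):
    h = (b - a) / n
    return h * sum(f(a + (k + 0.5) * h) for k in range(n))

a, b = 0.5, 2.0
# F(b) - F(a) matches the integral of the density over (a, b].
assert abs((F(b) - F(a)) - integral(a, b)) < 1e-8
```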

Example 1.5 The indicator of a random event $1_A = 1\{A\}$ with $p = P(A)$ has a Bernoulli distribution:

$$P(1_A = 1) = p, \qquad P(1_A = 0) = 1 - p.$$

For several events, $S_n = \sum_{i=1}^{n} 1_{A_i}$ counts the number of events that occurred. If independent events $A_1, A_2, \ldots$ have the same probability $p = P(A_i)$, then $S_n$ has a binomial distribution Bin(n, p):

$$P(S_n = k) = \binom{n}{k} p^k (1-p)^{n-k}, \qquad k = 0, 1, \ldots, n.$$
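The binomial example can be checked by simulation. In the sketch below (not part of the notes; n, p, the seed, and the trial count are arbitrary choices) S_n is built exactly as in the example, as a sum of independent Bernoulli indicators, and the empirical frequencies are compared with the Bin(n, p) mass function:

```python
import math
import random

random.seed(42)
n, p = 10, 0.3

def binom_pmf(k):
    # P(S_n = k) = C(n, k) * p^k * (1 - p)^(n - k)
    return math.comb(n, k) * p**k * (1 - p)**(n - k)

# Simulate S_n = 1_{A_1} + ... + 1_{A_n} many times.
trials = 200_000
counts = [0] * (n + 1)
for _ in range(trials):
    s = sum(1 for _ in range(n) if random.random() < p)
    counts[s] += 1

# Empirical frequencies should be close to the binomial pmf.
for k in range(n + 1):
    assert abs(counts[k] / trials - binom_pmf(k)) < 0.01
```

With 200,000 trials the standard error of each frequency is on the order of 0.001, so the 0.01 tolerance is comfortable.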

Example 1.6 (Cantor distribution) Consider $(\Omega, \mathcal{F}, P)$ with $\Omega = [0, 1]$, $\mathcal{F} = \mathcal{B}_{[0,1]}$, and

$$P([0, 1]) = 1,$$
$$P([0, 1/3]) = P([2/3, 1]) = 2^{-1},$$
$$P([0, 1/9]) = P([2/9, 1/3]) = P([2/3, 7/9]) = P([8/9, 1]) = 2^{-2},$$

and so on. Put $X(\omega) = \omega$; its distribution, called the Cantor distribution, is neither discrete nor continuous. Its distribution function, called the Cantor function, is continuous but not absolutely continuous.
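The Cantor function can be evaluated numerically from the ternary expansion of its argument: ternary digits 0 and 2 are halved and reread as binary digits, and the expansion is cut at the first digit 1. This is a standard construction, not spelled out in the notes; the sketch below is our own.

```python
def cantor_function(x, depth=40):
    """Approximate the Cantor function at x in [0, 1] via ternary digits."""
    result, weight = 0.0, 0.5
    for _ in range(depth):
        x *= 3
        digit = int(x)
        x -= digit
        if digit == 1:
            # First ternary digit 1: the remaining binary digits are 1000...
            result += weight
            break
        result += weight * (digit // 2)  # digit 0 -> bit 0, digit 2 -> bit 1
        weight /= 2
    return result

# The function is constant, equal to 1/2, on the middle-third gap (1/3, 2/3).
assert cantor_function(0.4) == 0.5
assert cantor_function(0.5) == 0.5
```

Because the function is constant on every gap of the Cantor set, it is continuous and non-decreasing, yet its derivative is zero almost everywhere, which is why it is not absolutely continuous.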

1.4 Random vectors

Definition 1.7 The joint distribution of a random vector $X = (X_1, \ldots, X_n)$ is the function

$$F_X(x_1, \ldots, x_n) = P(\{X_1 \le x_1\} \cap \ldots \cap \{X_n \le x_n\}).$$

Marginal distributions:

$$F_{X_1}(x) = F_X(x, \infty, \ldots, \infty),$$
$$F_{X_2}(x) = F_X(\infty, x, \infty, \ldots, \infty),$$
$$\ldots$$
$$F_{X_n}(x) = F_X(\infty, \ldots, \infty, x).$$

The existence of the joint probability density function $f(x_1, \ldots, x_n)$ means that the distribution function

$$F_X(x_1, \ldots, x_n) = \int_{-\infty}^{x_1} \ldots \int_{-\infty}^{x_n} f(y_1, \ldots, y_n)\,dy_1 \ldots dy_n, \quad \text{for all } (x_1, \ldots, x_n),$$

is absolutely continuous, so that

$$f(x_1, \ldots, x_n) = \frac{\partial^n F(x_1, \ldots, x_n)}{\partial x_1 \ldots \partial x_n}$$

almost everywhere.

Definition 1.8 Random variables $(X_1, \ldots, X_n)$ are called independent if for any $(x_1, \ldots, x_n)$

$$P(X_1 \le x_1, \ldots, X_n \le x_n) = P(X_1 \le x_1) \ldots P(X_n \le x_n).$$

In the jointly continuous case this is equivalent to

$$f(x_1, \ldots, x_n) = f_{X_1}(x_1) \ldots f_{X_n}(x_n).$$
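For discrete random variables the analogous factorization criterion can be checked directly. The sketch below is our own illustration with two independent fair dice: it computes the marginal pmfs from the joint pmf and verifies that the joint equals their product.

```python
import itertools
from fractions import Fraction

# Joint pmf of two independent fair dice: every pair has probability 1/36.
sides = range(1, 7)
joint = {(x1, x2): Fraction(1, 36) for x1 in sides for x2 in sides}

def marginal(joint, axis):
    # Sum the joint pmf over the other coordinate.
    m = {}
    for outcome, p in joint.items():
        key = outcome[axis]
        m[key] = m.get(key, Fraction(0)) + p
    return m

f1, f2 = marginal(joint, 0), marginal(joint, 1)

# Independence <=> the joint pmf factorizes into the product of marginals.
assert all(joint[(x1, x2)] == f1[x1] * f2[x2]
           for x1, x2 in itertools.product(sides, sides))
```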

Example 1.9 In general, the joint distribution cannot be recovered from the marginal distributions alone.
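As one concrete illustration of this point (our own, not necessarily the example the notes go on to give), two different joint distributions on $\{0,1\}^2$ can share identical marginals:

```python
from fractions import Fraction

half, quarter = Fraction(1, 2), Fraction(1, 4)

# Independent Bernoulli(1/2) pair vs. a perfectly correlated pair.
joint_indep = {(0, 0): quarter, (0, 1): quarter,
               (1, 0): quarter, (1, 1): quarter}
joint_dep = {(0, 0): half, (0, 1): Fraction(0),
             (1, 0): Fraction(0), (1, 1): half}

def marginals(joint):
    # Marginal pmfs of the first and second coordinate.
    mx = {v: sum(p for (x, _), p in joint.items() if x == v) for v in (0, 1)}
    my = {v: sum(p for (_, y), p in joint.items() if y == v) for v in (0, 1)}
    return mx, my

# Same marginals, different joint distributions.
assert marginals(joint_indep) == marginals(joint_dep)
assert joint_indep != joint_dep
```

Both joints have Bernoulli(1/2) marginals, yet one describes independent coordinates and the other perfectly correlated ones, so the marginals alone cannot determine the joint law.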
