
SOLUTIONS MANUAL FOR

Stochastic Processes: An Introduction, Second Edition

by

Peter W. Jones
Peter Smith


CRC Press is an imprint of the Taylor & Francis Group, an informa business

Boca Raton London New York

Chapman & Hall/CRC
Taylor & Francis Group
6000 Broken Sound Parkway NW, Suite 300
Boca Raton, FL 33487-2742

© 2010 by Taylor and Francis Group, LLC
Chapman & Hall/CRC is an imprint of Taylor & Francis Group, an Informa business

No claim to original U.S. Government works

Printed in the United States of America on acid-free paper
10 9 8 7 6 5 4 3 2 1

International Standard Book Number: 978-1-4398-4100-6 (Paperback)

This book contains information obtained from authentic and highly regarded sources. Reasonable efforts have been made to publish reliable data and information, but the author and publisher cannot assume responsibility for the validity of all materials or the consequences of their use. The authors and publishers have attempted to trace the copyright holders of all material reproduced in this publication and apologize to copyright holders if permission to publish in this form has not been obtained. If any copyright material has not been acknowledged please write and let us know so we may rectify in any future reprint.

Except as permitted under U.S. Copyright Law, no part of this book may be reprinted, reproduced, transmitted, or utilized in any form by any electronic, mechanical, or other means, now known or hereafter invented, including photocopying, microfilming, and recording, or in any information storage or retrieval system, without written permission from the publishers.

For permission to photocopy or use material electronically from this work, please access www.copyright.com (http://www.copyright.com/) or contact the Copyright Clearance Center, Inc. (CCC), 222 Rosewood Drive, Danvers, MA 01923, 978-750-8400. CCC is a not-for-profit organization that provides licenses and registration for a variety of users. For organizations that have been granted a photocopy license by the CCC, a separate system of payment has been arranged.

Trademark Notice: Product or corporate names may be trademarks or registered trademarks, and are used only for identification and explanation without intent to infringe.

Visit the Taylor & Francis Web site at http://www.taylorandfrancis.com

and the CRC Press Web site at http://www.crcpress.com

Preface

The website includes answers and solutions to all the end-of-chapter problems in the textbook Stochastic Processes: An Introduction. We hope that they will prove of help to lecturers and students. The original problems, as numbered in the text, are also included, so that the material can be used as an additional source of worked problems.

There are, of course, references to results and examples from the textbook, and the manual should be viewed as a supplement to the book. To help identify the sections and chapters, the full contents of Stochastic Processes follow this preface.

Every effort has been made to eliminate misprints or errors (or worse), and the authors, who were responsible for the LaTeX code, apologise in advance for any which occur.

Peter Jones
Peter Smith
Keele, May 2009


Contents of Stochastic Processes

Chapter 1: Some Background in Probability
1.1 Introduction
1.2 Probability
1.3 Conditional probability and independence
1.4 Discrete random variables
1.5 Continuous random variables
1.6 Mean and variance
1.7 Some standard discrete probability distributions
1.8 Some standard continuous probability distributions
1.9 Generating functions
1.10 Conditional expectation
Problems

Chapter 2: Some Gambling Problems
2.1 Gambler's ruin
2.2 Probability of ruin
2.3 Some numerical simulations
2.4 Expected duration of the game
2.5 Some variations of gambler's ruin
2.5.1 The infinitely rich opponent
2.5.2 The generous gambler
2.5.3 Changing the stakes
Problems

Chapter 3: Random Walks
3.1 Introduction
3.2 Unrestricted random walks
3.3 Probability distribution after n steps
3.4 First returns of the symmetric random walk
3.5 Other random walks
Problems

Chapter 4: Markov Chains
4.1 States and transitions
4.2 Transition probabilities
4.3 General two-state Markov chain
4.4 Powers of the transition matrix for the m-state chain
4.5 Gambler's ruin as a Markov chain
4.6 Classification of states
4.7 Classification of chains
Problems

Chapter 5: Poisson Processes
5.1 Introduction
5.2 The Poisson process
5.3 Partition theorem approach
5.4 Iterative method
5.5 The generating function
5.6 Variance for the Poisson process
5.7 Arrival times
5.8 Summary of the Poisson process
Problems

Chapter 6: Birth and Death Processes
6.1 Introduction
6.2 The birth process
6.3 Birth process: generating function equation
6.4 The death process
6.5 The combined birth and death process
6.6 General population processes
Problems

Chapter 7: Queues
7.1 Introduction
7.2 The single server queue
7.3 The stationary process
7.4 Queues with multiple servers
7.5 Queues with fixed service times
7.6 Classification of queues
7.7 A general approach to the M(λ)/G/1 queue
Problems

Chapter 8: Reliability and Renewal
8.1 Introduction
8.2 The reliability function
8.3 The exponential distribution and reliability
8.4 Mean time to failure
8.5 Reliability of series and parallel systems
8.6 Renewal processes
8.7 Expected number of renewals
Problems

Chapter 9: Branching and Other Random Processes
9.1 Introduction
9.2 Generational growth
9.3 Mean and variance
9.4 Probability of extinction
9.5 Branching processes and martingales
9.6 Stopping rules
9.7 The simple epidemic
9.8 An iterative scheme for the simple epidemic
Problems

Chapter 10: Computer Simulations and Projects


Chapters of the Solutions Manual

Chapter 1: Some Background in Probability

Chapter 2: Some Gambling Problems

Chapter 3: Random Walks

Chapter 4: Markov Chains

Chapter 5: Poisson Processes

Chapter 6: Birth and Death Processes

Chapter 7: Queues

Chapter 8: Reliability and Renewal

Chapter 9: Branching and Other Random Processes


Chapter 1

Some background in probability

1.1. The Venn diagram of three events is shown in Figure 1.5 (in the text). Indicate on the diagram the following events:
(a) A ∪ B; (b) A ∪ (B ∪ C); (c) A ∩ (B ∪ C); (d) (A ∩ C)ᶜ; (e) (A ∩ B) ∪ Cᶜ.

[Figure 1.1: five Venn diagrams of the events A, B, C in the sample space S, one for each of the events (a)–(e).]

The events are shaded in Figure 1.1.

1.2. In a random experiment, A, B, C are three events. In set notation write down expressions for the events:
(a) only A occurs;
(b) all three events A, B, C occur;
(c) A and B occur but C does not;
(d) at least one of the events A, B, C occurs;
(e) exactly one of the events A, B, C occurs;
(f) not more than two of the events occur.


(a) A ∩ (B ∪ C)ᶜ; (b) A ∩ (B ∩ C) = A ∩ B ∩ C; (c) (A ∩ B) ∩ Cᶜ; (d) A ∪ B ∪ C;
(e) A ∩ (B ∪ C)ᶜ represents an event in A but in neither B nor C; therefore the answer is

(A ∩ (B ∪ C)ᶜ) ∪ (B ∩ (A ∪ C)ᶜ) ∪ (C ∩ (A ∪ B)ᶜ);

(f) "not more than two of the events occur" is the complement of "all three occur", that is (A ∩ B ∩ C)ᶜ.

1.3. For two events A and B, P(A) = 0.4, P(B) = 0.5 and P(A ∩ B) = 0.3. Calculate
(a) P(A ∪ B); (b) P(A ∩ Bᶜ); (c) P(Aᶜ ∪ Bᶜ).

(a) From (1.1), P(A ∪ B) = P(A) + P(B) − P(A ∩ B), it follows that

P(A ∪ B) = 0.4 + 0.5 − 0.3 = 0.6.

(b) Since A = (A ∩ Bᶜ) ∪ (A ∩ B), and A ∩ Bᶜ and A ∩ B are mutually exclusive,

P(A) = P[(A ∩ Bᶜ) ∪ (A ∩ B)] = P(A ∩ Bᶜ) + P(A ∩ B),

so that

P(A ∩ Bᶜ) = P(A) − P(A ∩ B) = 0.4 − 0.3 = 0.1.

(c) Since Aᶜ ∪ Bᶜ = (A ∩ B)ᶜ, then

P(Aᶜ ∪ Bᶜ) = P[(A ∩ B)ᶜ] = 1 − P(A ∩ B) = 1 − 0.3 = 0.7.

1.4. Two distinguishable fair dice a and b are rolled. What are the elements of the sample space? What is the probability that the sum of the face values of the two dice is 9? What is the probability that at least one 5 or at least one 3 appears?

The elements of the sample space are listed in Example 1.1. The event A_1, that the sum is 9, is given by

A_1 = {(3, 6), (4, 5), (5, 4), (6, 3)}.

Hence P(A_1) = 4/36 = 1/9.

Let A_2 be the event that at least one 5 or at least one 3 appears. Then, by counting the elements in the sample space in Example 1.1, P(A_2) = 20/36 = 5/9.

1.5. Two distinguishable fair dice a and b are rolled. What is the probability that the sum of the faces is not more than 6?

Let the random variable X be the sum of the faces. By counting events in the sample space in Example 1.1, P(X ≤ 6) = 15/36 = 5/12.
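Both counts are easy to verify by brute-force enumeration of the 36 outcomes; a minimal Python sketch (the helper prob is ours, purely illustrative):

```python
from fractions import Fraction
from itertools import product

# All 36 equally likely outcomes for two distinguishable dice a and b.
sample_space = list(product(range(1, 7), repeat=2))

def prob(event):
    # Exact probability of an event, given as a predicate on outcomes.
    favourable = sum(1 for outcome in sample_space if event(outcome))
    return Fraction(favourable, len(sample_space))

print(prob(lambda d: d[0] + d[1] == 9))   # 1/9  (Problem 1.4)
print(prob(lambda d: 5 in d or 3 in d))   # 5/9  (Problem 1.4)
print(prob(lambda d: d[0] + d[1] <= 6))   # 5/12 (Problem 1.5)
```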

1.6. A probability function {p_n}, (n = 0, 1, 2, . . .), has the probability generating function

G(s) = Σ_{n=0}^∞ p_n s^n = (1/4)(1 + s)(3 + s)^{1/2}.

Find the probability function {p_n} and its mean.

Note that G(1) = 1. Using the binomial theorem, with C(a, n) = a(a − 1) · · · (a − n + 1)/n! denoting the (generalised) binomial coefficient,

G(s) = (1/4)(1 + s)(3 + s)^{1/2} = (√3/4)(1 + s)(1 + s/3)^{1/2}
     = (√3/4) Σ_{n=0}^∞ C(1/2, n)(s/3)^n + (3√3/4) Σ_{n=1}^∞ C(1/2, n − 1)(s/3)^n.

The probabilities can now be read off from the coefficients of the series:

p_0 = √3/4,   p_n = (√3/(3^n · 4))[C(1/2, n) + 3C(1/2, n − 1)],   (n = 1, 2, . . .).

The expected value is given by

µ = G′(1) = (1/4) d/ds[(1 + s)(3 + s)^{1/2}]_{s=1}
          = [(1/4)(3 + s)^{1/2} + (1/8)(1 + s)(3 + s)^{−1/2}]_{s=1}
          = 5/8.
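The series expansion and the mean can be checked symbolically; a short sketch assuming SymPy is available (the comparison with the closed form for p_n is illustrative):

```python
import sympy as sp

s = sp.symbols('s')
G = sp.Rational(1, 4) * (1 + s) * sp.sqrt(3 + s)

# Power series of G about s = 0; its coefficients are the probabilities p_n.
series = sp.series(G, s, 0, 5).removeO()
print([series.coeff(s, n) for n in range(5)])   # p_0 = sqrt(3)/4, ...

# Compare with the closed form for p_n, n >= 1.
for n in range(1, 5):
    pn = sp.sqrt(3) / (4 * 3**n) * (sp.binomial(sp.Rational(1, 2), n)
                                    + 3 * sp.binomial(sp.Rational(1, 2), n - 1))
    assert sp.simplify(series.coeff(s, n) - pn) == 0

print(sp.diff(G, s).subs(s, 1))   # mean G'(1) = 5/8
```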

1.7. Find the probability generating function G(s) of the Poisson distribution (see Section 1.7) with parameter α given by

p_n = e^{−α} α^n / n!,   n = 0, 1, 2, . . . .

Determine the mean and variance of {p_n} from the generating function.

Given p_n = e^{−α} α^n / n!, the generating function is given by

G(s) = Σ_{n=0}^∞ p_n s^n = Σ_{n=0}^∞ e^{−α} α^n s^n / n! = e^{−α} Σ_{n=0}^∞ (αs)^n / n! = e^{α(s−1)}.

The mean and variance are given by

µ = G′(1) = d/ds[e^{α(s−1)}]_{s=1} = α,

σ² = G″(1) + G′(1) − [G′(1)]² = [α²e^{α(s−1)} + αe^{α(s−1)} − α²e^{2α(s−1)}]_{s=1} = α,

as expected.
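The same derivative manipulations can be reproduced mechanically; a minimal sketch assuming SymPy:

```python
import sympy as sp

s, alpha = sp.symbols('s alpha', positive=True)
G = sp.exp(alpha * (s - 1))            # Poisson pgf

G1 = sp.diff(G, s).subs(s, 1)          # G'(1)
G2 = sp.diff(G, s, 2).subs(s, 1)       # G''(1)

print(sp.simplify(G1))                 # mean: alpha
print(sp.simplify(G2 + G1 - G1**2))    # variance: alpha
```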

1.8. A panel contains n warning lights. The times to failure of the lights are the independent random variables T_1, T_2, . . . , T_n which have exponential distributions with parameters α_1, α_2, . . . , α_n respectively. Let T be the random variable of the time to first failure, that is

T = min{T_1, T_2, . . . , T_n}.

Show that T has an exponential distribution with parameter Σ_{j=1}^n α_j. Show also that the probability that the i-th panel light fails first is α_i / (Σ_{j=1}^n α_j).

The probability that no warning light has failed by time t is

P(T ≥ t) = P(T_1 ≥ t ∩ T_2 ≥ t ∩ · · · ∩ T_n ≥ t)
         = P(T_1 ≥ t) P(T_2 ≥ t) · · · P(T_n ≥ t)
         = e^{−α_1 t} e^{−α_2 t} · · · e^{−α_n t} = e^{−(α_1 + α_2 + · · · + α_n)t},

which is the survival probability of an exponential distribution with parameter Σ_{j=1}^n α_j.

The probability that the ith light fails first is obtained by summing, over small intervals (t, t + δt), the probability that T_i falls in the interval while every other light survives beyond t:

P(T_i = T) = Σ_{δt} Π_{j≠i} P(T_j > t) P(t < T_i < t + δt)
           = Σ_{δt} Π_{j≠i} P(T_j > t) [e^{−α_i t} − e^{−α_i(t+δt)}]
           ≈ Σ_{δt} Π_{j≠i} P(T_j > t) α_i δt e^{−α_i t}
           → ∫_0^∞ α_i exp[−t Σ_{j=1}^n α_j] dt = α_i / Σ_{j=1}^n α_j

as δt → 0.
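Both results lend themselves to simulation; a rough Monte Carlo sketch with illustrative rates α = (0.5, 1.0, 2.0) (chosen arbitrarily here, not taken from the text):

```python
import random

alphas = [0.5, 1.0, 2.0]          # illustrative failure rates
trials = 100_000

first = [0] * len(alphas)
total_time = 0.0
for _ in range(trials):
    times = [random.expovariate(a) for a in alphas]
    i = min(range(len(alphas)), key=times.__getitem__)
    first[i] += 1                 # light i failed first
    total_time += times[i]

print(total_time / trials)              # ~ 1/sum(alphas) = 0.2857...
print([c / trials for c in first])      # ~ [1/7, 2/7, 4/7]
```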

1.9. The geometric probability function with parameter p is given by

p(x) = q^{x−1} p,   x = 1, 2, . . . ,

where q = 1 − p (see Section 1.7). Find its probability generating function. Calculate the mean and variance of the geometric distribution from its pgf.

The generating function is given by

G(s) = Σ_{x=1}^∞ q^{x−1} p s^x = (p/q) Σ_{x=1}^∞ (qs)^x = (p/q) · qs/(1 − qs) = ps/(1 − qs),

using the formula for the sum of a geometric series. The mean is given by

µ = G′(1) = d/ds[ps/(1 − qs)]_{s=1} = [p/(1 − qs) + pqs/(1 − qs)²]_{s=1} = 1/p.

For the variance,

G″(s) = d/ds[p/(1 − qs)²] = 2pq/(1 − qs)³

is required. Hence

σ² = G″(1) + G′(1) − [G′(1)]² = 2q/p² + 1/p − 1/p² = (1 − p)/p².

1.10. Two distinguishable fair dice a and b are rolled. What are the probabilities that:
(a) at least one 4 appears;
(b) only one 4 appears;
(c) the sum of the face values is 6;
(d) the sum of the face values is 5 and one 3 is shown;
(e) the sum of the face values is 5 or only one 3 is shown?

From the Table in Example 1.1:
(a) If A_1 is the event that at least one 4 appears, then P(A_1) = 11/36.
(b) If A_2 is the event that only one 4 appears, then P(A_2) = 10/36 = 5/18.
(c) If A_3 is the event that the sum of the faces is 6, then P(A_3) = 5/36.
(d) If A_4 is the event that the sum of the face values is 5 and one 3 is shown, then P(A_4) = 2/36 = 1/18.
(e) If A_5 is the event that the sum of the faces is 5 or only one 3 is shown, then, since the sum is 5 in 4 outcomes, exactly one 3 appears in 10 outcomes, and the two events share the outcomes (2, 3) and (3, 2), P(A_5) = (4 + 10 − 2)/36 = 12/36 = 1/3.

1.11. Two distinguishable fair dice a and b are rolled. What is the expected sum of the face values? What is the variance of the sum of the face values?

Let N be the random variable representing the sum x + y, where x and y are the face values of the two dice. Then

E(N) = (1/36) Σ_{x=1}^6 Σ_{y=1}^6 (x + y) = (1/36)[6 Σ_{x=1}^6 x + 6 Σ_{y=1}^6 y] = 7,

and

V(N) = E(N²) − [E(N)]² = (1/36) Σ_{x=1}^6 Σ_{y=1}^6 (x + y)² − 7²
     = (1/36)[12 Σ_{x=1}^6 x² + 2(Σ_{x=1}^6 x)²] − 49
     = (1/36)[(12 × 91) + (2 × 21²)] − 49 = 35/6 = 5.833 . . . .
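Exact rational arithmetic over the 36 outcomes reproduces both moments; a minimal sketch:

```python
from fractions import Fraction

sums = [x + y for x in range(1, 7) for y in range(1, 7)]
n = len(sums)

E = sum(map(Fraction, sums)) / n
E2 = sum(Fraction(v * v) for v in sums) / n
print(E, E2 - E**2)   # 7 and 35/6
```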

1.12. Three distinguishable fair dice a, b and c are rolled. How many possible outcomes are there for the faces shown? When the dice are rolled, what is the probability that just two dice show the same face values and the third one is different?

The sample space contains 6³ = 216 elements of the form (in the order a, b, c)

S = {(i, j, k)},   (i = 1, . . . , 6; j = 1, . . . , 6; k = 1, . . . , 6).

Let A be the required event. Suppose that a and b have the same face value, which can occur in 6 ways, and that c has a different face value, which can occur in 5 ways. Hence the total number of ways in which a and b are the same but c is different is 6 × 5 = 30. The faces b and c, or c and a, could also be the same, so the total number of ways for the possible outcome is 3 × 30 = 90. Therefore the required probability is

P(A) = 90/216 = 5/12.
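The count of 90 favourable triples can be confirmed by enumeration; a short check in Python:

```python
from fractions import Fraction
from itertools import product

triples = list(product(range(1, 7), repeat=3))
pairs = sum(1 for t in triples if len(set(t)) == 2)   # exactly two equal faces
print(Fraction(pairs, len(triples)))                  # 90/216 = 5/12
```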

1.13. In a sample space S, the events B and C are mutually exclusive, but A and B, and A and C are not. Show that

P(A ∪ (B ∪ C)) = P(A) + P(B) + P(C) − P(A ∩ (B ∪ C)).

From a well-shuffled pack of 52 playing cards a single card is randomly drawn. Find the probability that it is a club or an ace or the king of hearts.

From (1.1) (in the book),

P(A ∪ (B ∪ C)) = P(A) + P(B ∪ C) − P(A ∩ (B ∪ C)).   (i)

Since B and C are mutually exclusive,

P(B ∪ C) = P(B) + P(C).   (ii)

From (i) and (ii), it follows that

P(A ∪ (B ∪ C)) = P(A) + P(B) + P(C) − P(A ∩ (B ∪ C)).

Let A be the event that the card is a club, B the event that it is an ace, and C the event that it is the king of hearts. We require P(A ∪ (B ∪ C)). Since B and C are mutually exclusive, we can use the result above. The individual probabilities are

P(A) = 13/52 = 1/4;   P(B) = 4/52 = 1/13;   P(C) = 1/52,

and since A ∩ (B ∪ C) is the ace of clubs,

P(A ∩ (B ∪ C)) = 1/52.

Finally

P(A ∪ (B ∪ C)) = 1/4 + 1/13 + 1/52 − 1/52 = 17/52.
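A direct enumeration over a 52-card deck gives the same 17/52; a small sketch (the rank/suit encoding is ours):

```python
from fractions import Fraction

suits = ['clubs', 'diamonds', 'hearts', 'spades']
ranks = ['A'] + [str(i) for i in range(2, 11)] + ['J', 'Q', 'K']
deck = [(rank, suit) for suit in suits for rank in ranks]

hits = sum(1 for rank, suit in deck
           if suit == 'clubs' or rank == 'A' or (rank, suit) == ('K', 'hearts'))
print(Fraction(hits, len(deck)))   # 17/52
```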

1.14. Show that

f(x) =
  0,                       x < 0,
  1/(2a),                  0 ≤ x ≤ a,
  (1/(2a)) e^{−(x−a)/a},   x > a,

is a possible probability density function. Find the corresponding probability function.

Check the density function as follows:

∫_{−∞}^∞ f(x) dx = (1/(2a)) ∫_0^a dx + (1/(2a)) ∫_a^∞ e^{−(x−a)/a} dx
                 = 1/2 − (1/2)[e^{−(x−a)/a}]_a^∞ = 1.

The probability function is given, for 0 ≤ x ≤ a, by

F(x) = ∫_{−∞}^x f(u) du = ∫_0^x (1/(2a)) du = x/(2a),

and, for x > a, by

F(x) = ∫_0^x f(u) du = ∫_0^a (1/(2a)) du + ∫_a^x (1/(2a)) e^{−(u−a)/a} du
     = 1/2 − (1/(2a))[a e^{−(u−a)/a}]_a^x
     = 1 − (1/2) e^{−(x−a)/a}.
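Numerical integration gives a quick sanity check on both f and F; a sketch assuming SciPy is available, with a = 2 as an arbitrary illustrative value:

```python
import math
from scipy.integrate import quad

a = 2.0   # arbitrary illustrative value of the parameter

def f(x):
    if x < 0:
        return 0.0
    if x <= a:
        return 1 / (2 * a)
    return math.exp(-(x - a) / a) / (2 * a)

# Split at the kink x = a so quad integrates each smooth piece.
total = quad(f, 0, a)[0] + quad(f, a, math.inf)[0]
print(total)                                  # ~ 1.0

x = 3.5
Fx = quad(f, 0, a)[0] + quad(f, a, x)[0]
print(Fx, 1 - 0.5 * math.exp(-(x - a) / a))   # both ~ 0.7638
```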

1.15. A biased coin is tossed. The probability of a head is p. The coin is tossed until the first head appears. Let the random variable N be the total number of tosses including the first head. Find P(N = n), and its pgf G(s). Find the expected value of the number of tosses.

The probability that the total number of tosses is n, that is, n − 1 tails followed by the first head, is

P(N = n) = (1 − p)^{n−1} p,   (n ≥ 1).

The probability generating function is given by

G(s) = Σ_{n=1}^∞ (1 − p)^{n−1} p s^n = (p/(1 − p)) Σ_{n=1}^∞ [(1 − p)s]^n
     = (p/(1 − p)) · s(1 − p)/[1 − s(1 − p)]
     = ps/[1 − s(1 − p)],

after summing the geometric series. For the mean, we require G′(s), given by

G′(s) = p/[1 − s(1 − p)] + sp(1 − p)/[1 − s(1 − p)]² = p/[1 − s(1 − p)]².

The mean is given by µ = G′(1) = 1/p.
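The mean 1/p is easy to confirm by simulation; a minimal sketch (the head probability p = 0.3 is an arbitrary illustrative choice):

```python
import random

p = 0.3            # illustrative head probability
trials = 100_000

def tosses_until_first_head():
    n = 1
    while random.random() >= p:   # tail with probability 1 - p
        n += 1
    return n

mean = sum(tosses_until_first_head() for _ in range(trials)) / trials
print(mean, 1 / p)   # sample mean ~ 3.33
```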

1.16. The m random variables X_1, X_2, . . . , X_m are independent and identically distributed, each with a gamma distribution with parameters n and α. The random variable Y is defined by

Y = X_1 + X_2 + · · · + X_m.

Using the moment generating function, find the mean and variance of Y.

The probability density function of the gamma distribution with parameters n and α is

f(x) = (α^n / Γ(n)) x^{n−1} e^{−αx}.

It was shown in Section 1.9 that the moment generating function for Y is given, in general, by

M_Y(s) = [M_X(s)]^m,

where X has a gamma distribution with the same parameters. Hence

M_Y(s) = (α/(α − s))^{nm} = (1 − s/α)^{−nm} = 1 + (nm/α)s + (nm(nm + 1)/(2α²))s² + · · · .

Hence

E(Y) = nm/α,   V(Y) = E(Y²) − [E(Y)]² = nm(nm + 1)/α² − (nm/α)² = nm/α².
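Both moments can be checked by simulating Y directly; a sketch with arbitrary illustrative parameters n = 3, m = 4, α = 2 (note that random.gammavariate takes a scale parameter, so the scale is 1/α):

```python
import random

n, m, alpha = 3, 4, 2.0    # illustrative parameters
trials = 100_000

# gammavariate(shape, scale): scale = 1/alpha for rate alpha.
ys = [sum(random.gammavariate(n, 1 / alpha) for _ in range(m))
      for _ in range(trials)]

mean = sum(ys) / trials
var = sum((y - mean) ** 2 for y in ys) / trials
print(mean, n * m / alpha)         # ~ 6.0
print(var, n * m / alpha ** 2)     # ~ 3.0
```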

1.17. A probability generating function with parameter 0 < α < 1 is given by

G(s) = [1 − α(1 − s)] / [1 + α(1 − s)].

Find p_n = P(N = n) by expanding the series in powers of s. What is the mean of the probability function {p_n}?

Applying the binomial theorem,

G(s) = [1 − α(1 − s)] / [1 + α(1 − s)]
     = (1 − α)[1 + (α/(1 − α))s] / {(1 + α)[1 − (α/(1 + α))s]}
     = ((1 − α)/(1 + α)) (1 + αs/(1 − α)) Σ_{n=0}^∞ (αs/(1 + α))^n
     = ((1 − α)/(1 + α)) Σ_{n=0}^∞ (α/(1 + α))^n s^n + (α/(1 + α)) Σ_{n=0}^∞ (α/(1 + α))^n s^{n+1}.

The summation of the two series leads to

G(s) = ((1 − α)/(1 + α)) Σ_{n=0}^∞ (α/(1 + α))^n s^n + Σ_{n=1}^∞ (α/(1 + α))^n s^n
     = (1 − α)/(1 + α) + (2/(1 + α)) Σ_{n=1}^∞ (α/(1 + α))^n s^n.

Hence

p_0 = (1 − α)/(1 + α),   p_n = 2α^n/(1 + α)^{n+1},   (n = 1, 2, . . .).

The mean is given by

G′(1) = d/ds[ (1 − α(1 − s))/(1 + α(1 − s)) ]_{s=1} = [2α/[1 + α(1 − s)]²]_{s=1} = 2α.
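The coefficients p_n and the mean 2α can be verified symbolically; a sketch assuming SymPy, with the illustrative value α = 1/3:

```python
import sympy as sp

s = sp.symbols('s')
a = sp.Rational(1, 3)   # illustrative value of alpha
G = (1 - a * (1 - s)) / (1 + a * (1 - s))

series = sp.series(G, s, 0, 5).removeO()
assert sp.simplify(series.coeff(s, 0) - (1 - a) / (1 + a)) == 0
for n in range(1, 5):
    assert sp.simplify(series.coeff(s, n) - 2 * a**n / (1 + a)**(n + 1)) == 0

print(sp.diff(G, s).subs(s, 1))   # mean = 2*alpha = 2/3
```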

1.18. Find the moment generating function of the random variable X which has the uniform distribution

f(x) = 1/(b − a) for a ≤ x ≤ b, and f(x) = 0 for all other values of x.

Deduce E(X^n).

The moment generating function of the uniform distribution is

M_X(s) = ∫_a^b (e^{xs}/(b − a)) dx = (1/(b − a)) (1/s)(e^{bs} − e^{as})
       = (1/(b − a)) Σ_{n=1}^∞ ((b^n − a^n)/n!) s^{n−1}.

Hence

E(X) = (1/2)(b + a),   E(X^n) = (b^{n+1} − a^{n+1}) / ((n + 1)(b − a)).

1.19. A random variable X has the normal distribution N(µ, σ²). Find its moment generating function.

By definition,

M_X(s) = E(e^{Xs}) = (1/(σ√(2π))) ∫_{−∞}^∞ e^{sx} exp[−(x − µ)²/(2σ²)] dx
       = (1/(σ√(2π))) ∫_{−∞}^∞ exp[(2σ²xs − (x − µ)²)/(2σ²)] dx.

Apply the substitution x = µ + σ(v + σs): then

M_X(s) = exp(sµ + (1/2)σ²s²) ∫_{−∞}^∞ (1/√(2π)) e^{−v²/2} dv = exp(sµ + (1/2)σ²s²),

since the remaining integral equals 1 (see the Appendix for the integral).

Expansion of the exponential function in powers of s gives

M_X(s) = 1 + µs + (1/2)(σ² + µ²)s² + · · · .

So, for example, E(X²) = µ² + σ².

1.20. Find the probability generating functions of the following distributions, in which 0 < p < 1:
(a) Bernoulli distribution: p_n = p^n (1 − p)^{1−n}, (n = 0, 1);
(b) geometric distribution: p_n = p(1 − p)^{n−1}, (n = 1, 2, . . .);
(c) negative binomial distribution with parameter r, expressed in the form

p_n = C(r + n − 1, r − 1) p^r (1 − p)^n,   (n = 0, 1, 2, . . .),

where r is a positive integer and C(·, ·) is the binomial coefficient. In each case find also the mean and variance of the distribution using the probability generating function.

(a) For the Bernoulli distribution,

G(s) = p_0 + p_1 s = (1 − p) + ps.

The mean is given by

µ = G′(1) = p,

and the variance by

σ² = G″(1) + G′(1) − [G′(1)]² = p − p² = p(1 − p).

(b) For the geometric distribution (with q = 1 − p),

G(s) = Σ_{n=1}^∞ p q^{n−1} s^n = ps Σ_{n=0}^∞ (qs)^n = ps/(1 − qs),

summing the geometric series. The mean and variance are given by

µ = G′(1) = [p/(1 − qs)²]_{s=1} = 1/p,

σ² = G″(1) + G′(1) − [G′(1)]² = [2pq/(1 − qs)³]_{s=1} + 1/p − 1/p² = (1 − p)/p².

(c) For the negative binomial distribution (with q = 1 − p),

G(s) = Σ_{n=0}^∞ C(r + n − 1, r − 1) p^r q^n s^n
     = p^r (1 + r(qs) + (r(r + 1)/2!)(qs)² + · · ·) = p^r/(1 − qs)^r.

The derivatives of G(s) are given by

G′(s) = rqp^r/(1 − qs)^{r+1},   G″(s) = r(r + 1)q²p^r/(1 − qs)^{r+2}.

Hence the mean and variance are given by

µ = G′(1) = rq/p,

σ² = G″(1) + G′(1) − [G′(1)]² = r(r + 1)q²/p² + rq/p − r²q²/p² = rq/p².

1.21. A word of five letters is transmitted by code to a receiver. The transmission signal is weak, and there is a 5% probability that any letter is in error, independently of the others. What is the probability that the word is received correctly? The same word is transmitted a second time with the same errors in the signal. If the same word is received, what is the probability now that the word is correct?

Let A_1, A_2, A_3, A_4, A_5 be the events that the letters in the word are correct. Since the events are independent, the probability that the word is correctly transmitted is

P(A_1 ∩ A_2 ∩ A_3 ∩ A_4 ∩ A_5) = 0.95^5 ≈ 0.774.

If a letter is sent a second time, the probability that the same error occurs twice is 0.05² = 0.0025. Hence the probability that the letter is correct is 0.9975. For 5 letters the probability that the word is correct is 0.9975^5 ≈ 0.988.

1.22. A random variable N over the positive integers has the probability distribution with

p_n = P(N = n) = −α^n / (n ln(1 − α)),   (0 < α < 1; n = 1, 2, 3, . . .).

What is its probability generating function? Find the mean of the random variable.

The probability generating function is given by

G(s) = −Σ_{n=1}^∞ α^n s^n / (n ln(1 − α)) = ln(1 − αs)/ln(1 − α)

for 0 ≤ s < 1/α. Since

G′(s) = −α/[(1 − αs) ln(1 − α)],

the mean is

µ = G′(1) = −α/[(1 − α) ln(1 − α)].

1.23. The source of a beam of light is a perpendicular distance d from a wall of length 2a, with the perpendicular from the source meeting the wall at its midpoint. The source emits a pulse of light randomly in a direction θ, the angle between the direction of the pulse and the perpendicular, chosen uniformly in the range −tan⁻¹(a/d) ≤ θ ≤ tan⁻¹(a/d). Find the probability distribution of x (−a ≤ x ≤ a), where the pulses hit the wall. Show that its density function is given by

f(x) = d / [2(x² + d²) tan⁻¹(a/d)]

(this is the density function of a Cauchy distribution). If a → ∞, what can you say about the mean of this distribution?

Figure 1.2 shows the beam and wall.

[Figure 1.2: Source and beam for Problem 1.23; the source is a distance d from the wall, the beam makes angle θ with the perpendicular and meets the wall at displacement x along its length 2a.]

Let X be the random variable representing the displacement at which the pulse hits the wall. Then

P(−a ≤ X ≤ x) = P(−a ≤ d tan θ ≤ x) = P(−tan⁻¹(a/d) ≤ θ ≤ tan⁻¹(x/d))
             = [tan⁻¹(x/d) + tan⁻¹(a/d)] / [2 tan⁻¹(a/d)],

by uniformity. The density is given by

f(x) = d/dx { [tan⁻¹(x/d) + tan⁻¹(a/d)] / [2 tan⁻¹(a/d)] } = d / [2(x² + d²) tan⁻¹(a/d)].

The mean is given by

µ = ∫_{−a}^a x d / [2(x² + d²) tan⁻¹(a/d)] dx = 0,

since the integrand is an odd function and the limits are ±a. For the infinite wall the integral defining the mean becomes divergent.
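The distribution of the hit point is easy to simulate by sampling θ uniformly; a sketch with illustrative values d = 1, a = 2, comparing an empirical interval frequency with the integral of f:

```python
import math
import random

d, a = 1.0, 2.0                      # illustrative distance and half-width
theta_max = math.atan(a / d)
trials = 100_000

hits = [d * math.tan(random.uniform(-theta_max, theta_max))
        for _ in range(trials)]

# Empirical frequency of hits in [x1, x2] against the integral of f.
x1, x2 = 0.0, 1.0
empirical = sum(x1 <= x <= x2 for x in hits) / trials
exact = (math.atan(x2 / d) - math.atan(x1 / d)) / (2 * theta_max)
print(empirical, exact)              # both ~ 0.355
```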

1.24. Suppose that the random variable X can take the integer values 0, 1, 2, . . . . Let p_j and q_j be the probabilities

p_j = P(X = j),   q_j = P(X > j),   (j = 0, 1, 2, . . .).

Show that, if

G(s) = Σ_{j=0}^∞ p_j s^j,   H(s) = Σ_{j=0}^∞ q_j s^j,

then (1 − s)H(s) = 1 − G(s). Show also that E(X) = H(1).

Using the series for H(s),

(1 − s)H(s) = (1 − s) Σ_{j=0}^∞ q_j s^j = Σ_{j=0}^∞ q_j s^j − Σ_{j=0}^∞ q_j s^{j+1}
            = q_0 + Σ_{j=1}^∞ (q_j − q_{j−1}) s^j
            = q_0 − Σ_{j=1}^∞ P(X = j) s^j
            = 1 − p_0 − Σ_{j=1}^∞ p_j s^j = 1 − G(s),

since q_j − q_{j−1} = −P(X = j) and q_0 = 1 − p_0. Note that generally H(s) is not a probability generating function. The mean of the random variable X is given by

E(X) = Σ_{j=1}^∞ j p_j = G′(1) = H(1),

differentiating the formula above.
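The identity is easy to check on a concrete example; a sketch assuming SymPy, taking X geometric on {0, 1, 2, . . .} with illustrative p = 1/4, so that p_j = pq^j and q_j = q^{j+1}:

```python
import sympy as sp

s = sp.symbols('s')
p = sp.Rational(1, 4)   # illustrative parameter
q = 1 - p

G = p / (1 - q * s)     # sum of p_j s^j with p_j = p*q^j
H = q / (1 - q * s)     # sum of q_j s^j with q_j = P(X > j) = q^(j+1)

print(sp.simplify((1 - s) * H - (1 - G)))        # 0, so (1-s)H(s) = 1 - G(s)
print(H.subs(s, 1), sp.diff(G, s).subs(s, 1))    # both equal E(X) = q/p = 3
```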
