
Page 1: Ph.D. Qualifying Examination in Probability · Ph.D. Qualifying Examination in Probability and Mathematical Statistics Department of Mathematics University of Louisville Fall 2012

Ph.D. Qualifying Examination in Probability

Department of Mathematics, University of Louisville

August 13, 2015, 9:00am–12:30pm

Do 3 problems from each section.

SECTION 1

Problem 1.

(a) State and prove Borel-Cantelli’s Lemma.

(b) Suppose X is a random variable whose only values are positive integers. Show that EX = ∑_{n=1}^∞ P(X ≥ n).

(c) Suppose X, X_1, X_2, . . . are i.i.d. random variables that take only positive integer values. Suppose their common probability function is as follows:

For each n = 1, 2, . . . , P(X = n) = C/n^3.

Here C is a normalizing constant – the positive constant such that ∑_{n=1}^∞ C/n^3 = 1.

Compute P(X_n ≥ n for infinitely many positive integers n).

Problem 2. Suppose that X_n are positive i.i.d. random variables with finite mean and S_n = X_1 + X_2 + · · · + X_n. Show that max_{1≤k≤n} X_k / S_n converges to 0 almost surely.
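Though not part of the exam, the claim in Problem 2 is easy to probe numerically. A minimal Monte Carlo sketch, with Exp(1) variables chosen arbitrarily as the positive i.i.d. sequence:

```python
import random

random.seed(0)

def max_over_sum(n):
    # Draw n i.i.d. Exp(1) variables and return max_{1<=k<=n} X_k / S_n.
    xs = [random.expovariate(1.0) for _ in range(n)]
    return max(xs) / sum(xs)

# The ratio should drift toward 0 as n grows (roughly like log n / n here).
for n in (10, 1_000, 100_000):
    print(n, max_over_sum(n))
```

The simulation only illustrates the almost-sure statement; the proof requires the SLLN together with X_n/n → 0 a.s.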

Problem 3. Thomas tosses a fair coin twice. Let X be the number of heads.

(a) Write the probability space of this experiment.

(b) Write the sigma algebra generated by all possible events.

(c) Let G be the sigma algebra generated by the events with only one head. Find E(X|G).

Problem 4.

(a) State the Strong Law of Large Numbers.

(b) Let f(t) = t/(t + 1). Compute

lim_{n→∞} ∫_0^∞ · · · ∫_0^∞ f( (x_1 + · · · + x_n)/n ) · 2^n exp{−2x_1 − · · · − 2x_n} dx_1 · · · dx_n.

Make sure to show sufficient details and justify important steps.
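One way to read the integral in (b), stated here as an interpretation rather than a solution: it is E[f(X̄_n)] with X_1, . . . , X_n i.i.d. Exponential of rate 2 (density 2e^{−2x}), so the SLLN sends X̄_n → 1/2 a.s., and since 0 ≤ f ≤ 1, bounded convergence suggests the limit f(1/2) = 1/3. A rough Monte Carlo check of that reading:

```python
import random

random.seed(0)

def f(t):
    return t / (t + 1)

def mc_estimate(n, reps=2_000):
    # Estimate E[f((X_1 + ... + X_n)/n)] with X_i i.i.d. Exponential(rate 2).
    total = 0.0
    for _ in range(reps):
        xbar = sum(random.expovariate(2.0) for _ in range(n)) / n
        total += f(xbar)
    return total / reps

print(mc_estimate(200))  # should hover near f(1/2) = 1/3
```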


SECTION 2

Problem 5.

(a) State Markov’s inequality.

(b) Suppose that X is a random variable with moment generating function M(t) which is defined for all real numbers t. Prove that P(X ≥ x) ≤ e^{−tx} M(t) for t ≥ 0.

(c) Suppose that Y has density function

f(y) = θ^α y^{α−1} e^{−θy} / Γ(α) for y > 0

with θ > 0 and α > 0. Prove that P( Y ≥ 3α/θ ) ≤ (3/e^2)^α.
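For intuition on 5(c), the bound can be spot-checked numerically: the exact Gamma tail sits well below (3/e²)^α ≈ 0.406^α, consistent with the Chernoff-style bound from (b). A sketch (the Gamma rate/scale convention here is an assumption matching the density above):

```python
import math
import random

random.seed(0)

def tail_prob(alpha, theta=1.0, reps=100_000):
    # Monte Carlo estimate of P(Y >= 3*alpha/theta) for Y ~ Gamma(alpha, rate theta).
    thr = 3 * alpha / theta
    return sum(random.gammavariate(alpha, 1 / theta) >= thr
               for _ in range(reps)) / reps

for alpha in (1.0, 2.0, 5.0):
    bound = (3 / math.e**2) ** alpha  # the bound to be proved
    assert tail_prob(alpha) <= bound
    print(alpha, bound)
```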

Problem 6.

(a) Define a Standard Brownian Motion (SBM), {B(t), t ≥ 0}, and show that B(t) is a martingale.

(b) Show that the process X_t = (1/√c) B(ct), c > 0, is also a SBM whenever B(t) is a SBM.

(c) Let B_n be a standard Brownian motion evaluated only at integer times n = 1, 2, . . .. Show that the process B_n^2 − n forms a martingale.

Problem 7. Suppose that the random variables X_n are defined on the same probability space and there is a constant c such that X_n converges in distribution to c. Prove or disprove each of the following:

(a) X_n converges to c in probability;

(b) X_n converges to c a.s.

Problem 8.

(a) State and prove the central limit theorem in R^d, d ≥ 1.

(b) Suppose X_1, X_2, . . . are i.i.d. random variables such that

P(X_1 = 2) = P(X_1 = −2) = 1/4 and P(X_1 = 0) = 1/2.

Compute approximately

P( ∑_{k=1}^{10,000} X_k < 0 < ∑_{k=1}^{10,000} (X_k^2 − 2) ).

All the details must be shown including computing integrals.
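For 8(b), the CLT heuristic (the two sums are asymptotically normal and uncorrelated, since E[X_1^3] = 0) points at an answer near 1/4. A slow but simple simulation to compare against, not a substitute for the required computation:

```python
import random

random.seed(0)

def one_trial(n=10_000):
    # X takes values 2 and -2 with probability 1/4 each, and 0 with probability 1/2.
    s1 = s2 = 0
    for _ in range(n):
        u = random.random()
        x = 2 if u < 0.25 else (-2 if u < 0.5 else 0)
        s1 += x
        s2 += x * x - 2
    return s1 < 0 < s2

reps = 500
print(sum(one_trial() for _ in range(reps)) / reps)  # near 1/4
```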


Ph.D. Qualifying Examination in Probability and Mathematical Statistics

Department of Mathematics, University of Louisville

August 14, 2013, 9:00am–12:30pm

Do all problems from both sections.

Section 1: Probability

1. Let Ω = (0, 1] and A_n = { (m/2^n, (m+1)/2^n] : m = 0, 1, . . . , 2^n − 1 }.

Let F_n be the σ-field generated by A_n.
(a) State a valid definition of a σ-field.
(b) List all sets in F_1 and list all sets in F_2.

(c) Is ∪_{n=1}^∞ F_n a σ-field? If yes, prove it. If not, justify your answer.

2. Let X, X_1, X_2, . . . be random variables on a probability space. For each of the following statements, determine whether it is true or false. Then, give a proof of the statement if it is true or give a counterexample if it is false.

(a) If lim_{n→∞} E|X_n − X| = 0, then X_n →_p X.
(b) If X_n →_p X, then X_n →_d X.
(c) If X_n →_d X, then X_n →_p X.

3. Suppose X_1, X_2, . . . are independent, E[X_n] = 0 for all n, and {E[X_n^4]}_{n=1}^∞ is bounded.

(a) Show that E[ ( ∑_{i=1}^n X_i )^4 ] = ∑_{i=1}^n E[X_i^4] + 6 ∑∑_{1≤i<j≤n} E[X_i^2] E[X_j^2].

(b) Show that P( | (1/n) ∑_{i=1}^n X_i | > ε ) ≤ (1/(ε^4 n^4)) E[ ( ∑_{i=1}^n X_i )^4 ].

(c) Show that ∑_{n=1}^∞ P( | (1/n) ∑_{i=1}^n X_i | > ε ) converges.

(d) Prove that (1/n) ∑_{i=1}^n X_i → 0 with probability 1.


4. Suppose that {X_k}_{k=1}^∞ is an independent sequence of random variables such that X_k is uniformly distributed on the interval (−1 − 2^{−k}, 1 + 2^{−k}).

(a) Compute µ_k = E[X_k] and σ_k^2 = Var[X_k].

(b) Find the value A ∈ (0, ∞) such that lim_{n→∞} E[ ∑_{k=1}^n X_k^4 ] / n = A.

(c) Let s_n^2 = ∑_{k=1}^n σ_k^2. Find the value B ∈ (0, ∞) such that lim_{n→∞} s_n^4 / n^2 = B.

(d) Prove that ∑_{k=1}^n (X_k − µ_k) / s_n →_d Z, where Z is a standard normal random variable.

Section 2: Mathematical Statistics

5. Suppose that X has a Gamma(α, 1) distribution with density

f_X(x|α) = x^{α−1} e^{−x} / Γ(α), x > 0.

(a) Find the probability density function f_Y(y|α) for the random variable Y = √X.

(b) Using part (a), show that Γ(1/2) = √π.

(c) A family of probability density functions is called an exponential family if it can be expressed as f(x|θ) = h(x) C(θ) exp{ W(θ) t(x) }. Is f_Y(y|α) an exponential family? If yes, define θ and find h(x), C(θ), W(θ), and t(x). If not, justify your answer.

6. Let X1, X2, . . . , Xn be independent Normal random variables with unknownmean µ and variance 1.

(a) Show that ∑_{i=1}^n X_i is a complete and sufficient statistic for µ.
(b) Find the uniformly minimum variance unbiased estimator of µ.
(c) Does the estimator in (b) attain the Cramér-Rao lower bound? Justify your answer.

7. Let X1, X2, . . . , Xn be a sample from probability mass function

P(X = k) = 1/N if k = 1, 2, . . . , N, and 0 otherwise.

(a) Find the maximum likelihood estimator N̂ of N.
(b) Show that P(N̂ > k) = 1 − (k/N)^n for k = 1, . . . , N.
(c) For sample size n = 2, compute E[N̂].
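For 7(c) with n = 2, writing E[N̂] = ∑_k k · P(N̂ = k) and using (b) simplifies, after some algebra, to the candidate value (N + 1)(4N − 1)/(6N); that closed form is our simplification, not part of the exam text. A brute-force enumeration confirms the arithmetic:

```python
from fractions import Fraction

def e_max_two(N):
    # Exact E[max(X1, X2)] by enumerating the N^2 equally likely pairs.
    total = Fraction(0)
    for a in range(1, N + 1):
        for b in range(1, N + 1):
            total += max(a, b)
    return total / N**2

for N in (1, 2, 5, 10):
    closed = Fraction((N + 1) * (4 * N - 1), 6 * N)  # candidate closed form
    assert e_max_two(N) == closed
    print(N, e_max_two(N))
```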


8. Let X_1, X_2, . . . , X_n be independent random variables following the Poisson distribution with probability mass function

P(X = k) = λ^k e^{−λ} / k! if k = 0, 1, 2, . . . , and 0 otherwise.

(a) Let X̄_n = ( ∑_{i=1}^n X_i ) / n. Find the mean and standard deviation of X̄_n.

(b) Let Z_n = (X̄_n − E[X̄_n]) / √Var[X̄_n]. Using the fact that Z_n →_d Z where Z follows a standard normal distribution, so that P(Z_n < z_{1−α}) → 1 − α as n → ∞, construct an approximate (for large n) one-sided 100(1 − α)% confidence interval for λ with lower confidence limit λ_ℓ based on the pivot Z_n.

(c) If n = 10, λ = 1/10, and α = .05, compute the exact coverage probability of the approximate 95% confidence interval proposed in (b). (Note that z_{.95} ≈ 1.645.)
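For 8(c), assuming the interval in (b) comes from inverting the pivot inequality (X̄_n − λ)/√(λ/n) < z_{1−α} (the exact form of λ_ℓ depends on your answer to (b)), the coverage event is S < nλ + z√(nλ) with S = ∑ X_i ~ Poisson(nλ), so a short Poisson tail sum gives the exact answer:

```python
import math

n, lam, z = 10, 0.1, 1.645
mu = n * lam  # S = X_1 + ... + X_n ~ Poisson(n*lam) = Poisson(1)

# Coverage = P(S < mu + z*sqrt(mu)); the cutoff is 1 + 1.645 = 2.645,
# so the event is S <= 2.
cutoff = mu + z * math.sqrt(mu)
k_max = math.ceil(cutoff) - 1
coverage = sum(math.exp(-mu) * mu**k / math.factorial(k)
               for k in range(k_max + 1))
print(round(coverage, 4))  # 2.5/e, about 0.9197
```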


Ph.D. Qualifying Examination in Probability and Mathematical Statistics

Department of Mathematics, University of Louisville

Fall 2012

Do ALL the problems from both sections

1 Probability

1. (a) We toss a coin twice. Write the sample space Ω. Also, write the σ-algebra generated by {HT} and {TH}.

(b) Show that, for two events A and B, if A ⊂ B, then P(A) ≤ P(B).

(c) Prove that P(|X| > a) ≤ E(X^2) / a^2.

2. Consider a game with the following rule. One tosses a fair coin. If itis a tail, one gets nothing. If it is a head, one gets a certain amountof money, which follows the uniform distribution between $0 and $10.Find the distribution function of the money one gets from this game.
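A quick empirical check of a candidate answer to Problem 2: the winnings have an atom of mass 1/2 at 0 and are otherwise uniform on (0, 10), which suggests F(t) = 1/2 + t/20 on [0, 10). That formula is our reading, not given in the exam; simulating the game to compare:

```python
import random

random.seed(0)

def payout():
    # Toss a fair coin; tails pays 0, heads pays Uniform(0, 10) dollars.
    return 0.0 if random.random() < 0.5 else random.uniform(0, 10)

draws = [payout() for _ in range(100_000)]
for t in (0.0, 2.5, 5.0, 9.9):
    emp = sum(d <= t for d in draws) / len(draws)
    print(t, emp, 0.5 + t / 20)  # empirical CDF vs candidate F(t)
```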

3. Let (Ω, F, P) be a probability space and X_n be a sequence of random variables defined on Ω. If X_n converges almost uniformly to X, then show that X_n converges almost everywhere to X.

4. Let X_n be independent and identically distributed random variables with P(X_n = 1) = 1/2 and P(X_n = −1) = 1/2. Show that (1/n) ∑_{j=1}^n X_j converges to 0 in probability.


2 Mathematical Statistics

1. The random variable X follows the exponential distribution Exp(β) with pdf f(x) = (1/β) e^{−x/β} I_{(0<x<∞)}.

(a) Find the pdf of Y = X^{1/γ}, γ > 0.

(b) A family of pdfs is called an exponential family if it can be expressed as f(x|θ) = h(x) C(θ) exp( ∑_{i=1}^k W_i(θ) t_i(x) ). Does X make an exponential family? If yes, define θ and find h(x), C(θ), W_i(θ), t_i(x). If not, justify your answer.

(c) Does Y make an exponential family? If yes, define θ and find h(x), C(θ), W_i(θ), t_i(x). If not, justify your answer.

2. Let X_1, X_2, ..., X_n be a random sample from a distribution with pdf f(x|θ) = (1/θ) e^{−x/θ} I_{(0<x<∞)}, where 0 < θ < ∞.

(a) Find the maximum likelihood (ML) estimator of θ. Is this ML estimator of θ a sufficient statistic for θ?

(b) Using the method of moments, find a consistent estimator of θ.

3. X_i, i = 1, 2, ..., n, are i.i.d. from the Bernoulli distribution with probability of success p.

(a) Find a CSS for p.

(b) Find the UMVUE of p(1 − p).
(c) Notice that X_1(1 − X_2) is an unbiased estimator of p(1 − p). Using this fact, the CSS you found in part (a), and the Lehmann-Scheffé theorem, find the UMVUE.
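For 3(c), conditioning X_1(1 − X_2) on S = ∑ X_i leads, by Lehmann-Scheffé, to the usual answer S(n − S)/(n(n − 1)); we state that formula as an assumption to verify, since deriving it is the exam task. An exhaustive unbiasedness check over all 2^n outcomes:

```python
from fractions import Fraction
from itertools import product

def umvue_mean(n, p):
    # E[ S(n-S)/(n(n-1)) ] under n i.i.d. Bernoulli(p), by full enumeration.
    total = Fraction(0)
    for xs in product((0, 1), repeat=n):
        s = sum(xs)
        prob = p**s * (1 - p)**(n - s)
        total += Fraction(s * (n - s), n * (n - 1)) * prob
    return total

n = 4
for p in (Fraction(1, 3), Fraction(1, 2), Fraction(3, 4)):
    assert umvue_mean(n, p) == p * (1 - p)
print("unbiased for n =", n)
```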

4. Let X_1, X_2, ..., X_n be a random sample of size n from a normal population with mean µ and variance σ^2. What is the likelihood ratio test of size α for testing the null hypothesis H_0 : µ = µ_0 versus the alternative hypothesis H_a : µ ≠ µ_0?


Ph.D. Qualifying Examination in Probability and Mathematical Statistics

Department of Mathematics, University of Louisville

Fall 2011

1 Probability

Do any three problems from this section.

1. Let (Ω, F, P) be a probability space. Let {E_n}_{n=1}^∞ be a sequence of events in F and

E = { ω | ω ∈ E_n for infinitely many n ∈ N }.

(a) If ∑_{n=1}^∞ P(E_n) < ∞, then prove that P(E) = 0.

(b) If ∑_{n=1}^∞ P(E_n) = ∞ and the sequence {E_n} consists of mutually independent events, then prove that P(E) = 1.

2. Let (Ω, F, P) be a probability space, X a random variable on Ω, and F the cdf of X. Then prove that F satisfies the following:

(a) If x1 ≤ x2, then F (x1) ≤ F (x2) for all x1, x2 ∈ R.

(b) F is right continuous at each xo ∈ R.

(c) lim_{x→−∞} F(x) = 0 and lim_{x→∞} F(x) = 1.

3. Let (Ω, F, P) be a probability space. Let X be a random variable and f : R → R be a Borel measurable function. Then show that the composite function f ∘ X is a random variable.

4. State precisely a version of the Central Limit theorem. Give a proof ofthe Central Limit theorem.


2 Mathematical Statistics

Do any three problems from this section.

1. Suppose that X_1, X_2, ..., X_n is a random sample from a population X having density function

f(x; θ) = (θ^2 / x^3) e^{−θ/x} if 0 < x < ∞, and 0 otherwise,

where θ > 0. It is known that E(X_i) = θ and E(1/X_i) = 2/θ for each i = 1, 2, ..., n.

(a) Determine the estimator of θ using the method of moments.

(b) Compute the likelihood function for this random sample.

(c) Find the maximum likelihood estimator of θ.

(d) Show that ∑_{i=1}^n (1/X_i) is a sufficient statistic for θ.

(e) Find the Fisher information I(θ) in a single observation from thisdensity function.

(f) Using the maximum likelihood estimator based on the sample, construct an approximate 90% confidence interval for θ.

(g) Suppose n = 8 and ∑_{i=1}^8 (1/X_i) = 10. Using this information and your confidence interval in (f), can you reject the null hypothesis H_0 : θ = 1 in favor of H_a : θ ≠ 1 at the significance level α = 0.10?

(h) Verify that the likelihood ratio test of the null hypothesis H_0 : θ = θ_0 against H_a : θ ≠ θ_0 has a critical region of the form

{ (x_1, x_2, ..., x_n) ∈ R^n_+ : ( ∑_{i=1}^n 1/x_i )^{2n} exp( −θ_0 ∑_{i=1}^n 1/x_i ) ≤ k }

for some constant k.


2. Let X_1, X_2, ..., X_n be a random sample from a normal distribution with mean zero and variance θ. Consider the class of estimators c ∑_{i=1}^n X_i^2.

(a) What value of the constant c leads to an unbiased estimator?

(b) Does this estimator achieve Cramer-Rao lower bound?

(c) Find the unique value of c which minimizes the mean squared error E[ ( c ∑_{i=1}^n X_i^2 − θ )^2 ] uniformly.
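For 2(c): with T = ∑ X_i^2 one has T/θ ~ χ²_n, so E[T] = nθ and E[T²] = n(n + 2)θ², and minimizing the resulting quadratic in c gives the classical answer c = 1/(n + 2), which we treat here as a candidate to verify rather than the worked solution. A check against neighboring values:

```python
from fractions import Fraction

def mse(c, n, theta=1):
    # With T = sum X_i^2 and T/theta ~ chi^2_n: E[T] = n*theta, E[T^2] = n(n+2)*theta^2.
    et, et2 = n * theta, n * (n + 2) * theta**2
    return c * c * et2 - 2 * c * theta * et + theta**2

n = 5
c_star = Fraction(1, n + 2)  # candidate minimizer
for c in (Fraction(1, n), Fraction(1, n + 1), Fraction(1, n + 3)):
    assert mse(c_star, n) < mse(c, n)
print("c* =", c_star)
```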

3. Answer both parts of this question.

(a) Let X_1, X_2, ..., X_n be a random sample from a continuous population X with a distribution function F(x; θ). If F(x; θ) is monotone in θ, then show that the statistic

Q = −2 ∑_{i=1}^n ln( 1 − F(X_i; θ) )

has a chi-square distribution with 2n degrees of freedom.

(b) If X_1, X_2, ..., X_n is a random sample from a population with density

f(x; θ) = θ x^{θ−1} if 0 < x < 1, and 0 otherwise,

where θ > 0 is an unknown parameter, what is a 100(1 − α)% confidence interval for θ?

4. Let X_1, X_2, ..., X_n be a random sample of size n from a normal population with mean µ and known variance σ^2. What is the likelihood ratio test of size α for testing the null hypothesis H_0 : µ = µ_0 versus the alternative hypothesis H_a : µ ≠ µ_0?


Probability and Mathematical Statistics Qualifier Exam, May 18, 2007

Remark: Choose three problems from 1–4, and choose three problems from 5–8.

Name :

Problem 1: Let (Ω, F, P) = ([0, 1), Borel, Lebesgue). For n = 0, 1, 2, 3, ... let Q_n = σ( { [k/2^n, (k+1)/2^n) : 0 ≤ k < 2^n } ) be the dyadic partitions of [0, 1). Let X(s) = s, 0 ≤ s < 1.

(a) Find the equation of, and sketch, E(X|Q_0), E(X|Q_1) and E(X|Q_2).

(b) Let Y_n = E(X|Q_n). Show that E(Y_6|Q_4) = Y_4. Find E(Y_4|Q_6).

Problem 2:
(a) Suppose that Ω ∈ F and that A, B ∈ F implies A − B ∈ F. Show that F is a field.

(b) Suppose that Ω ∈ F and that F is closed under the formation of complements and finite disjoint unions. Find a counterexample which shows that F need not be a field.

Problem 3:
(a) Let X, Y ∈ L^1. If X and Y are independent, show that XY ∈ L^1. Give an example to show XY need not be in L^1 in general.

(b) Let X_n be i.i.d. random variables with P(X_n = 1) = 1/2 and P(X_n = −1) = 1/2. Show that (1/n) ∑_{j=1}^n X_j converges to 0 in probability.

Problem 4:
(a) Suppose that X_1, X_2, . . . is an independent sequence and Y is measurable with respect to σ(X_n, X_{n+1}, . . .) for each n. Show that there exists a constant a such that P(Y = a) = 1.


(b) Prove for integrable X that

E[X] = ∫_0^∞ P(X > t) dt − ∫_{−∞}^0 P(X < t) dt.

Problem 5:
(a) If Z ∼ N(0, 1), prove the following:

P(|Z| ≥ t) ≥ √(2/π) · t/(t^2 + 1) · e^{−t^2/2}.

(b) Let X_1, X_2, ..., X_n be an i.i.d. sample from B(n, p). Show that there is no unbiased estimator of g(p) = 1/p.

(c) Suppose there are n AA Financial bonds, and let X_i denote the spread return, in a given month, of the ith bond, i = 1, . . . , n. Suppose that they are all independent and follow normal distributions with a common mean µ but different variances σ_i^2. Show that the joint distribution of X = (X_1, X_2, . . . , X_n) makes an exponential family and find C(θ), h(x), w_i(θ) and T_i(x).

Problem 6: Let X_1, X_2, ..., X_n be an i.i.d. random sample with density

f(x|µ, σ) = (1/σ) exp( −(x − µ)/σ ) I_{(µ,∞)}(x),

where σ > 0.
(a) Find a two-dimensional sufficient statistic for (µ, σ).

(b) Find the distribution of (n/σ)(X_{(1)} − µ).

(c) Find the distribution of ∑_{i=1}^n (X_i − X_{(1)}) / σ.

(d) Using (b) and (c), check that the sufficient statistic you found in (a) is also complete.


Problem 7:
(a) Let X_1, X_2, ..., X_n be i.i.d. from Uniform(0, θ). Find the complete sufficient statistic for θ. Also, find the UMVUE of θ.

(b) Let X_1, X_2, ..., X_n be i.i.d. from Poisson(λ). Find a complete sufficient statistic of λ. Also, find the UMVUE of η = P(X_1 = a).

Problem 8:
(a) Let X_1, X_2, ..., X_n be i.i.d. from a distribution having p.d.f. of the form f(x) = θ x^{θ−1} I_{(0,1)}(x). Find the rejection region of the most powerful test for H_0 : θ = 1 versus H_1 : θ = 2.

(b) Let X_1, X_2, ..., X_n be i.i.d. from a distribution having p.d.f. of the form f(x) = e^{−(x−θ)} I_{[θ,∞)}(x). Find the likelihood ratio test for H_0 : θ ≤ θ_0 versus H_1 : θ > θ_0.


Preliminary Exam: Probability and Mathematical Statistics May 2006

Name:

Instructions: Clearly mark the answers and show all the work. You may use calculators when appropriate. Solve three problems from each part. Clearly mark the ones you are NOT solving.

PART ONE: Probability
Solve three out of the next four problems

1. (50 points) This question consists of three parts.

(a) (15 points) Let X_i be a sequence of random variables such that

lim_{n→∞} Var(S_n)/n^2 = 0,   (1)

where S_n = ∑_{i=1}^n X_i. Show that

(S_n − ES_n)/n →^P 0 as n → ∞.   (2)

(b) (15 points) Now, suppose that we replace the condition (1) with the assumption that the X_i are pairwise uncorrelated and satisfy sup_i EX_i^2 < ∞. Show that the result (2) above also holds under these alternative assumptions.

(c) (20 points) Assuming that the X_i are independent random variables such that sup_i EX_i^4 < ∞, show that the convergence in probability in (2) can be strengthened to convergence almost surely.

Page 1 of 8 Please go on to the next page. . .


2. (50 points) Given a probability space (Ω, F_0, P), an F_0-measurable random variable X and another σ-field F ⊂ F_0, the conditional expectation of X given F is defined to be any random variable Y which is F-measurable and satisfies

∫_A X dP = ∫_A Y dP

for all A ∈ F.

(a) (15 points) To warm up, consider how this definition relates to the one taught in undergraduate probability. Specifically, suppose that Ω_1, Ω_2, . . . is a finite or infinite partition of Ω into disjoint sets, each of which has positive probability (with respect to P), and let F = σ(Ω_1, Ω_2, . . .) be the σ-field generated by these sets. Then show that on each Ω_i,

E(X|F) = E(X; Ω_i) / P(Ω_i).

(b) (15 points) Let F_1 ⊂ F_2 be two σ-fields on Ω. Then show that

1. E( E(X|F_1) | F_2 ) = E(X|F_1), and
2. E( E(X|F_2) | F_1 ) = E(X|F_1).

(c) (20 points) Let Ω = {a, b, c}. Give an example of P, F_1, F_2, and X in which

E( E(X|F_1) | F_2 ) ≠ E( E(X|F_2) | F_1 ).


3. (50 points) This problem consists of two parts.

(a) (25 points) Let X be an N(0, 1) random variable. Let

M(s) = E e^{sX} = ∫_{−∞}^∞ (1/√(2π)) exp( sx − x^2/2 ) dx.

Show that M(s) = e^{s^2/2}.

(b) (25 points) Show that for any positive integer n we have EX^{2n+1} = 0 and

EX^{2n} = (2n)! / (2^n n!) = (2n − 1)(2n − 3) . . . 3 · 1.

(Hint: Note that e^{s^2/2} = ∑_{k=0}^∞ s^{2k} / (2^k k!).)
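The moment formula in (b) is easy to sanity-check numerically: the closed form for small n, plus a Monte Carlo estimate of E[X^4], which should land near 3:

```python
import math
import random

def even_moment(n):
    # (2n)!/(2^n n!) = (2n-1)(2n-3)...3*1
    return math.factorial(2 * n) // (2**n * math.factorial(n))

random.seed(0)
reps = 200_000
m4 = sum(random.gauss(0, 1)**4 for _ in range(reps)) / reps
print(even_moment(1), even_moment(2), even_moment(3))  # 1, 3, 15
print(m4)  # Monte Carlo E[X^4], near even_moment(2) = 3
```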


4. (50 points) This question consists of two parts.

(a) (25 points) Show that for any c.d.f. F and any a ≥ 0,

∫ [F(x + a) − F(x)] dx = a.

(b) (25 points) Let X be a random variable with range {0, 1, 2, . . .}. Show that if EX < ∞ then

EX = ∑_{i=1}^∞ P(X ≥ i).


PART TWO: Mathematical Statistics
Solve three out of the next four problems

5. (50 points) Let U_1, . . . , U_n be i.i.d. random variables having uniform distribution on [0, 1] and Y_n = ( ∏_{i=1}^n U_i )^{−1/n}. Show that

√n (Y_n − e) →^D N(0, e^2),

where e = exp(1).
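A simulation consistent with Problem 5: since Y_n = exp( −(1/n) ∑ log U_i ) and −log U_i ~ Exp(1), the delta method with g(x) = e^x predicts the N(0, e²) limit. This sketch compares the empirical mean and scaled variance of √n (Y_n − e), with n and the replication count chosen arbitrarily:

```python
import math
import random

random.seed(0)

def y_n(n):
    # Y_n = (prod U_i)^(-1/n) = exp(average of -log U_i).
    return math.exp(sum(-math.log(random.random()) for _ in range(n)) / n)

n, reps = 400, 2_000
zs = [math.sqrt(n) * (y_n(n) - math.e) for _ in range(reps)]
mean = sum(zs) / reps
var = sum(z * z for z in zs) / reps
print(round(mean, 2), round(var / math.e**2, 2))  # mean near 0, ratio near 1
```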


6. (50 points) Let φ be a UMP test of level α ∈ (0, 1) for testing the simple hypothesis P_0 vs P_1. If β is the power of the test, show that β ≥ α, with equality if and only if P_0 = P_1.


7. (50 points) Let X_1, . . . , X_n be independently and identically distributed with density

f(x, θ) = (1/σ) exp( −(x − µ)/σ ), x ≥ µ,

where θ = (µ, σ^2), −∞ < µ < ∞, σ^2 > 0.

(a) Find maximum likelihood estimates of µ and σ^2.

(b) Find the maximum likelihood estimate of P_θ(X_1 ≥ t) for t > µ.


8. (50 points) Let X_1, . . . , X_n be i.i.d. from the Bernoulli distribution with unknown probability of success P(X_1 = 1) = p ∈ (0, 1).

(a) (20 points) Show that S = ∑_{i=1}^n X_i is a complete and sufficient statistic.

(Hint: ∑_{k=0}^n (n choose k) g(k) p^k (1 − p)^{n−k} = (1 − p)^n ( ∑_{k=0}^n (n choose k) g(k) ζ^k ), where ζ = p/(1 − p).)

(b) (30 points) Find the UMVUE for p^m when m ≤ n is a positive integer.


Preliminary Examination Probability and Statistics May 23, 2003

Instruction: You are required to complete this examination within 3 hours. Do only 5 problems, at least 2 from each group. Work on one side of the page only. Start each problem on a new page. In order to receive full credit, answer each problem completely and show all work.

GROUP A

1. If A_1, A_2, ..., A_n, ... is a sequence of events in a sample space S such that A_1 ⊆ A_2 ⊆ · · · ⊆ A_n ⊆ · · ·, then prove that

P( ∪_{n=1}^∞ A_n ) = lim_{n→∞} P(A_n).

Why is this result called the continuity theorem for the probability measure?

2. Consider the function

f(x, y) = 2 if 0 < x < y < 1, and 0 otherwise.

(a) Show that f is a bivariate density function.

Let the random variables X and Y have f as their joint density function. Determine

(b) the marginal density, f_1(x), of X;
(c) the conditional density, f(y|x), of Y given that X = x;
(d) the mean of this conditional density, that is, E(Y|X = x);
(e) the variance of this conditional density, that is, Var(Y|X = x).
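A numerical cross-check for (d): the constant density 2 on the triangle 0 < x < y < 1 is also the joint density of (min(U, V), max(U, V)) for independent uniforms, which suggests the candidate E(Y|X = x) = (1 + x)/2; that formula is our conjecture to be confirmed analytically, not part of the exam text. A sketch:

```python
import random

random.seed(0)

# f(x, y) = 2 on 0 < x < y < 1 is the joint density of (min(U,V), max(U,V))
# for independent U, V ~ Uniform(0, 1).
pairs = [(min(u, v), max(u, v))
         for u, v in ((random.random(), random.random()) for _ in range(200_000))]

x0, eps = 0.4, 0.02
ys = [y for x, y in pairs if abs(x - x0) < eps]
cond_mean = sum(ys) / len(ys)
print(cond_mean, (1 + x0) / 2)  # empirical vs candidate E(Y|X = x0)
```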

3. Answer any four of the following:

(a) State precisely Chebyshev’s inequality.

(b) State precisely the weak law of large numbers.

(c) State precisely Kolmogorov’s inequality.

(d) State precisely the strong law of large numbers.

(e) State precisely the central limit theorem.


4. Answer the following:

(a) Suppose X is a nonnegative random variable with mean E(X). Then show that

P(X ≥ t) ≤ E(X)/t for all t > 0.

(b) Suppose X_1, X_2, ... is a sequence of random variables defined on a sample space S. Define the notion of convergence in probability of the sequence of random variables X_n to X.

(c) Suppose the random variable X and the sequence X_1, X_2, ... of random variables are defined on a sample space S. Define the notion of convergence almost surely of the sequence of random variables X_n to X.

(d) Suppose X is a random variable with cumulative distribution function F(x) and the sequence X_1, X_2, ... of random variables has cumulative distribution functions F_1(x), F_2(x), ..., respectively. Define the notion of convergence in distribution of the sequence of random variables X_n to X.

GROUP B

5. Answer the following:

(a) What is the basic principle of the maximum likelihood estimation?

(b) Define an estimator for a population parameter θ.

(c) Define an unbiased estimator of a parameter θ.

(d) Define a sufficient estimator of a parameter θ.

(e) Let X_1, X_2, ..., X_n be a random sample from a normal population with known mean µ and unknown variance σ^2 > 0. Determine the maximum likelihood estimators of µ and σ^2.

6. Observations y_1, y_2, ..., y_n are assumed to come from a model with E(Y_i|x_i) = α + βx_i, where α and β are unknown parameters, and x_1, x_2, ..., x_n are given constants. Derive the least squares estimates of the parameters α and β in terms of the sums of squares S_xx and S_yy and the sum of cross products S_xy, where

S_xy = ∑_{i=1}^n (x_i − x̄)(y_i − ȳ).
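The derivation should land on the familiar closed form β̂ = S_xy/S_xx and α̂ = ȳ − β̂ x̄, stated here as the expected endpoint rather than the derivation itself. A numerical check on synthetic data, with true α = 1.5 and β = 2.0 chosen arbitrarily:

```python
import random

random.seed(0)

xs = [float(i) for i in range(20)]
ys = [1.5 + 2.0 * x + random.gauss(0, 0.1) for x in xs]  # small noise

xbar, ybar = sum(xs) / len(xs), sum(ys) / len(ys)
sxx = sum((x - xbar) ** 2 for x in xs)
sxy = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
beta_hat = sxy / sxx          # candidate: S_xy / S_xx
alpha_hat = ybar - beta_hat * xbar
print(alpha_hat, beta_hat)    # near 1.5 and 2.0
```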


7. Answer the following:

(a) Define a 100(1− α)% confidence interval for a parameter θ.

(b) Define an interval estimator of a parameter θ.

(c) Define a pivotal quantity for a parameter θ.

(d) If X_1, X_2, ..., X_n is a random sample from a population with density

f(x; θ) = θ x^{θ−1} if 0 < x < 1, and 0 otherwise,

where θ > 0 is an unknown parameter, then derive a 100(1 − α)% approximate confidence interval for θ assuming the sample size is large.

8. Answer the following:

(a) Define a hypothesis test.

(b) Define the power function of a hypothesis test.

(c) State precisely the Neyman-Pearson Lemma.

(d) Let X_1, X_2, ..., X_n be a random sample from a normal population with mean µ and known variance σ^2. Derive the likelihood ratio test of size α for testing the null hypothesis H_0 : µ = µ_0 versus the alternative hypothesis H_a : µ ≠ µ_0.
