Problems on Probability

Posted on 10-Dec-2015


IISc Bangalore, problems on probability basics.


E2 202 (Aug–Dec 2015)
Homework Assignment 1

Discussion: Friday, Aug. 21

1. Let F : R → [0, 1] be a non-decreasing, right-continuous function with lim_{x→−∞} F(x) = 0 and lim_{x→∞} F(x) = 1. Let P be the probability measure on the Borel σ-algebra on R defined by P((−∞, x]) = F(x) for all x ∈ R. Using the axioms of probability and the properties of F, determine the probability assigned by P to the closed interval [a, b], a ≤ b. In particular, what is P({a})? What happens if F is continuous (i.e., both left-continuous and right-continuous)?
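One small discrete example can suggest what to aim for (an illustration, not the requested proof). The distribution below and the helper names `F`, `F_left`, and `P_closed` are our own, not part of the assignment; they compare probabilities of closed intervals against values and left limits of the CDF.

```python
from fractions import Fraction

# Illustration (not a proof): for a small discrete distribution, compare
# P([a, b]) and P({a}) with values and left limits of the CDF F.
pmf = {1: Fraction(1, 4), 2: Fraction(1, 2), 3: Fraction(1, 4)}

def F(x):
    """CDF: P(X <= x)."""
    return sum(p for v, p in pmf.items() if v <= x)

def F_left(x):
    """Left limit F(x-), i.e. P(X < x)."""
    return sum(p for v, p in pmf.items() if v < x)

def P_closed(a, b):
    """P(a <= X <= b), computed directly from the pmf."""
    return sum(p for v, p in pmf.items() if a <= v <= b)

a, b = 2, 3
assert P_closed(a, b) == F(b) - F_left(a)   # closed interval uses the left limit at a
assert P_closed(a, a) == F(a) - F_left(a)   # P({a}) equals the jump of F at a
```

If F is continuous, the jump F(a) − F(a−) vanishes, which is the case the last part of the problem asks about.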

2. Let X and Y be discrete random variables, and let g, h be functions such that g(X) and h(Y) are also random variables. Prove that if X and Y are independent, then so are g(X) and h(Y).
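Before writing the proof, it can help to check the claim on one concrete pair of distributions (a sanity check, not a proof). The pmfs and the functions g, h below are arbitrary choices of ours; the check verifies the product rule for the induced pmf of (g(X), h(Y)).

```python
from itertools import product

# Sanity check (not a proof): for one example of independent discrete X and Y,
# verify P(g(X)=u, h(Y)=v) = P(g(X)=u) P(h(Y)=v) for all (u, v).
pX = {0: 0.3, 1: 0.7}
pY = {0: 0.5, 1: 0.25, 2: 0.25}
g = lambda x: x * x    # arbitrary example g
h = lambda y: y % 2    # arbitrary example h (not injective)

# Joint pmf of (g(X), h(Y)), using independence of X and Y.
joint = {}
for (x, px), (y, py) in product(pX.items(), pY.items()):
    key = (g(x), h(y))
    joint[key] = joint.get(key, 0.0) + px * py

# Marginal pmfs of g(X) and h(Y).
pgX = {}
for x, px in pX.items():
    pgX[g(x)] = pgX.get(g(x), 0.0) + px
phY = {}
for y, py in pY.items():
    phY[h(y)] = phY.get(h(y), 0.0) + py

assert all(abs(joint[(u, v)] - pgX[u] * phY[v]) < 1e-12 for u, v in joint)
```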

3. Let g : X → Y be some mapping. The inverse image of a set A ⊆ Y under the mapping g is the set g−1(A) := {x ∈ X : g(x) ∈ A}. Thus, for example, X : Ω → R is a random variable w.r.t. a σ-algebra F iff X−1((−∞, x]) ∈ F for all x ∈ R.

Verify the following:

(a) g−1(A ∪ B) = g−1(A) ∪ g−1(B), and this extends to countable unions as well.

(b) g−1(Ac) = [g−1(A)]c.

Now, let B denote the Borel σ-algebra on R. A function g : R → R is said to be Borel measurable if for all sets A ∈ B, the inverse image g−1(A) is also in B. Prove that if X is a random variable on some probability space (Ω, F, P) and g is Borel measurable, then Y = g(X) is also a random variable on (Ω, F, P).
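The set identities in (a) and (b) can be spot-checked on finite sets before proving them in general (an illustration only; the sets and the mapping below are our own toy choices).

```python
# Finite-set check (not a proof) of the inverse-image identities (a) and (b).
X = {1, 2, 3, 4}
Y = {"a", "b"}
g = {1: "a", 2: "a", 3: "b", 4: "b"}   # a mapping g : X -> Y, as a dict

def preimage(A):
    """g^{-1}(A) = {x in X : g(x) in A}."""
    return {x for x in X if g[x] in A}

A, B = {"a"}, {"b"}
assert preimage(A | B) == preimage(A) | preimage(B)   # (a): unions are preserved
assert preimage(Y - A) == X - preimage(A)             # (b): complements are preserved
```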

4. Let X be a non-negative random variable with distribution function FX. Prove that

E[X] = ∫₀^∞ (1 − FX(x)) dx.

[Hint: Prove this for discrete random variables by using an “area under the curve” interpretation of an integral. For continuous random variables, you may assume that FX is differentiable, with (d/dx)(1 − FX(x)) = −fX(x), and use integration by parts on the formula E[X] = ∫₀^∞ x fX(x) dx.]
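For a non-negative integer-valued X, the complementary CDF 1 − FX is constant on each interval [k, k+1), so the integral collapses to the tail sum Σ_{k≥0} P(X > k). The pmf below is an arbitrary example of ours, used only to check this special case numerically (not a proof).

```python
# Numerical check (not a proof): for non-negative integer-valued X,
# the integral of 1 - F_X over [0, infinity) equals the tail sum
# sum over k >= 0 of P(X > k), and both equal E[X].
pmf = {0: 0.1, 1: 0.2, 2: 0.4, 5: 0.3}   # arbitrary example distribution

E = sum(k * p for k, p in pmf.items())   # E[X] from the definition
tail_sum = sum(sum(p for v, p in pmf.items() if v > k)
               for k in range(max(pmf)))  # P(X > k) for k = 0, ..., max-1

assert abs(E - tail_sum) < 1e-12
```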

5. Let X and Y be simple random variables defined on the same probability space such that X ≤ Y. Prove, using the definition of expectation, that E[X] ≤ E[Y].
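A simulation can make the claim concrete, though the problem asks for a proof from the definition. The two simple random variables below (indicator-style functions of a sampled outcome ω) are arbitrary choices of ours satisfying X ≤ Y pointwise.

```python
import random

# Monte-Carlo illustration (not the requested proof): on a common sample
# space, X <= Y pointwise forces the empirical means to satisfy the same
# inequality on every realization of the outcomes.
random.seed(0)
omega = [random.random() for _ in range(10_000)]   # sampled outcomes in [0, 1)

X = [1.0 if w > 0.5 else 0.0 for w in omega]       # simple r.v. X(w)
Y = [2.0 if w > 0.25 else 1.0 for w in omega]      # simple r.v. Y(w), with Y >= X

assert all(x <= y for x, y in zip(X, Y))           # X <= Y pointwise
assert sum(X) / len(X) <= sum(Y) / len(Y)          # empirical means ordered too
```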

6. Let X and Y be random variables defined on the same probability space, such that E[X], E[Y], and E[XY] all exist.

(a) Prove that if X and Y are independent, then E[XY ] = E[X]E[Y ].

(b) Is the converse true? Prove or give a counterexample.
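Part (a) can be sanity-checked on one concrete pair of independent discrete distributions before proving it in general (the pmfs below are arbitrary choices of ours; this checks a single example, not the theorem, and says nothing about part (b)).

```python
from itertools import product

# One-example check (not a proof) of part (a): for independent discrete
# X and Y, E[XY] should factor as E[X] * E[Y].
pX = {-1: 0.5, 2: 0.5}
pY = {0: 0.25, 1: 0.25, 3: 0.5}

EX = sum(x * p for x, p in pX.items())
EY = sum(y * p for y, p in pY.items())
EXY = sum(x * y * px * py                      # joint pmf = px * py by independence
          for (x, px), (y, py) in product(pX.items(), pY.items()))

assert abs(EXY - EX * EY) < 1e-12
```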
