Who is Crazier: Bayes or Fisher? A Missing (Data) Perspective on Fiducial Inference

Keli Liu and Xiao-Li Meng
Department of Statistics, Harvard University
November 15, 2013

Outline: The Art of Creating Missingness · A Partial Look at Fiducial · Observed Not At Random · Is Utopia Possible? · Conclusions



Page 2: Let's Meet the Crazies

Inference for the Mean of a Normal

    (x̄ − µ) / (s/√n) = t

Frequentist: 1 − α level interval: x̄ ± (s/√n) t_{n−1, 1−α/2}.

Bayes:
  Assume µ is random.
  Assume µ has an improper (flat) distribution.
  [µ | x̄, s] ∼ x̄ − (s/√n) t_{n−1}.

Fisher:
  Fiducial equation: µ = x̄ − (s/√n) t.
  Assume [t | x̄, s] ∼ t_{n−1}.
  [µ | x̄, s] ∼ x̄ − (s/√n) t_{n−1}.
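All three columns land on the same t-based answer. A minimal simulation sketch (in Python, with made-up data; the variable names and toy numbers are ours, not the slides') confirms that the fiducial/flat-prior recipe µ = x̄ − (s/√n) t, with t ∼ t_{n−1}, reproduces the frequentist t-interval:

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.normal(loc=10.0, scale=3.0, size=20)      # toy sample, n = 20
n, xbar, s = len(x), x.mean(), x.std(ddof=1)
se = s / np.sqrt(n)

# Fiducial / flat-prior Bayes recipe: draw t ~ t_{n-1}, set mu = xbar - se * t
mu_draws = xbar - se * rng.standard_t(df=n - 1, size=200_000)
fid_lo, fid_hi = np.quantile(mu_draws, [0.025, 0.975])

# Frequentist t-interval; the critical value t_{n-1, 0.975} is itself
# estimated by Monte Carlo here to keep the sketch numpy-only
tcrit = np.quantile(rng.standard_t(df=n - 1, size=500_000), 0.975)
freq_lo, freq_hi = xbar - tcrit * se, xbar + tcrit * se

print((fid_lo, fid_hi))
print((freq_lo, freq_hi))  # numerically the same interval, up to Monte Carlo error
```

The quantiles of the fiducial draws and the classical interval endpoints agree up to Monte Carlo error, which is the numerical face of the three-way coincidence on this slide.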


Page 13: Augmenting the Inferential Framework

Fiducial Equation (Taraldsen and Lindqvist 2013)

    X = G(θ, U), where X ∈ 𝒳, θ ∈ Θ, U ∈ 𝒰.

G is the structural equation; U is (God's) uncertainty.
Together, G and U determine the sampling distribution.
The sampling distribution does not determine G and U.

Method        Type of Replication             Relevant?
Frequentist   data given parameter, X | θ
Bayes         parameter given data, θ | X     ✓
Fiducial      uncertainty given data, U | X   ✓

Why should finding U|x be any easier than finding θ|x?
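The claim that the sampling distribution does not pin down (G, U) can be seen in a small numerical sketch (Python; the particular structural maps below are our own toy choices, not examples from the talk). Two different structural equations for a N(θ, 1) observable produce empirically indistinguishable data:

```python
import numpy as np

rng = np.random.default_rng(1)
theta = 2.0
n = 100_000

# Two distinct structural equations with the same N(theta, 1) sampling law:
#   G1(theta, U) = theta + U,  U ~ N(0, 1)
#   G2(theta, U) = theta - U,  U ~ N(0, 1)   (a different map, same pushforward)
x1 = theta + rng.standard_normal(n)
x2 = theta - rng.standard_normal(n)

# Compare empirical CDFs on a grid: the data cannot distinguish G1 from G2
grid = np.linspace(-2.0, 6.0, 41)
cdf1 = (x1[:, None] <= grid).mean(axis=0)
cdf2 = (x2[:, None] <= grid).mean(axis=0)
max_gap = np.max(np.abs(cdf1 - cdf2))
print(max_gap)  # small: both structural equations generate the same sampling distribution
```

Since data alone cannot separate G1 from G2, the choice of structural equation is an extra modeling input, which is exactly why conditioning on U|X is not automatically easier than conditioning on θ|X.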


Page 23: Detour via a Fiducial Bridge?

[Diagram: in the original space (X, θ), the road from the data to the destination, the posterior π(θ|x), is blocked by a roadblock, the prior π(dθ). The structural equation G lifts the problem to the augmented space (X, θ, U), allowing a pitstop at f(U|x): the proposed "fiducial bridge" around the prior.]


Page 31: Which Smells Fishier: Bayes or Fiducial?

Inference for the Mean of a Normal

    (x̄ − µ) / (s/√n) = t

t is our missing data, with prior distribution t ∼ t_{n−1}.

Bayes:
  Objective prior for µ.
  What does "objective prior" mean?
  Ad hoc arguments give π(µ) ∝ 1.

Fiducial:
  Objective posterior for t.
  What does "objective posterior" mean?
  Ignore the information on t in (x̄, s) that is tied to π.

Objective posterior: throw away data until we don't need a prior on µ.


Page 43: An Old Idea

1. Obtain f(U|x) without invoking π(dθ).
2. Use the structural relation x = G(θ, U) to obtain π(θ|x) from f(U|x).

"A probability statement concerning e [the error] is ipso facto a probability statement concerning θ." (Fraser 1968)

"One can get a random realization from the fiducial distribution of ξ by generating U and solving the structural equation for ξ." (Hannig 2009)

"The key point is that knowing θ is equivalent to knowing U; in other words, inference on θ is equivalent to predicting the value of the unobserved U." (Martin, Zhang, and Liu 2010)
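The two-step recipe can be sketched in a few lines of Python. To keep the sketch short we use a location model with known scale (an assumed toy setting of ours, simpler than the unknown-variance example running through the talk): x̄ = G(θ, U) = θ + U with U ∼ N(0, 1/n).

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy structural equation (our assumption, not the talk's example):
#   xbar = G(theta, U) = theta + U,  with U ~ N(0, 1/n), scale known
n, theta_true = 25, 5.0
xbar = theta_true + rng.normal(scale=1 / np.sqrt(n))

# Step 1: f(U | x). In this simple location model the fiducial recipe
# keeps the a priori law U ~ N(0, 1/n) unchanged after seeing xbar.
u_draws = rng.normal(scale=1 / np.sqrt(n), size=100_000)

# Step 2 (Hannig's recipe): solve the structural equation for theta.
theta_draws = xbar - u_draws

print(theta_draws.mean(), theta_draws.std())  # centered at xbar, spread about 1/sqrt(n)
```

Each draw of U, pushed through the solved structural equation, yields one draw from the fiducial distribution of θ; whether Step 1 is legitimate in richer models is precisely the question the talk goes on to examine.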


A Missing (Data) Perspective: Prior = Nuisance

The conditional distribution of X given U depends on π:

f(x|U, π) = ∫ f(x|U, θ) π(dθ) = ∫ 1{x = G(θ, U)} π(dθ)

Predicting the Missing U

f(U|x, π) ∝ f(U) f(x|U, π).

We can treat π(dθ) as an infinite-dimensional nuisance parameter in the "U-likelihood", f(x|U, π).

π can be viewed as a nuisance parameter only if we switch the problem from inference for θ to prediction of U.

Can we get to π(θ|x) without going through π(θ)?

Can we predict U without any knowledge of the nuisance?
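That f(U|x, π) genuinely depends on the nuisance π can be checked numerically. The sketch below (a hypothetical one-observation location model, x = θ + U with U ∼ N(0,1), not an example from the talk) evaluates f(U|x, π) on a grid for two priors and shows the two predictions of U disagree:

```python
import numpy as np

def norm_pdf(z, mean=0.0, sd=1.0):
    return np.exp(-0.5 * ((z - mean) / sd) ** 2) / (sd * np.sqrt(2.0 * np.pi))

x_obs = 1.5                       # one observation from x = theta + U, U ~ N(0, 1)
u = np.linspace(-5.0, 5.0, 2001)  # grid for the unobserved U
du = u[1] - u[0]

def u_posterior(prior_pdf):
    # f(U|x, pi) propto f(U) f(x|U, pi); in this model f(x|U, pi) = pi(x - U).
    w = norm_pdf(u) * prior_pdf(x_obs - u)
    return w / (w.sum() * du)     # normalize on the grid

post_tight = u_posterior(lambda th: norm_pdf(th, 0.0, 0.5))  # concentrated prior
post_wide = u_posterior(lambda th: norm_pdf(th, 0.0, 5.0))   # diffuse prior

gap = np.max(np.abs(post_tight - post_wide))  # the two predictions of U differ
```

A nonzero gap is exactly why π must be treated as a nuisance parameter when predicting U.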


A Meaningful Definition of Objective

U-Likelihood

f(x|U, π) contains all information in x about U and π.

Goal: extract information on U not contaminated by the nuisance parameter π.

Technique: throw away data contaminated by π; reduce x to A(x).

Objective Posterior for U

f(U|A(x)) is an objective posterior for U if it does not depend on the value of the nuisance parameter π.

Requires f (A(x)|U, π) = f (A(x)|U).


Didn’t You Say You Were Ancillary?

Does there exist a statistic A(x) s.t. f(A(x)|U) is free of the nuisance π?

How about statistics ancillary for θ?

An Ancillarity Paradox from Basu (1964)

Let (X, Y) have fiducial equation

X = ε1 and Y = ρ ε1 + √(1 − ρ²) ε2, where the εi are i.i.d. N(0, 1).

X and Y are marginally ancillary but not jointly ancillary.

P(X = x | ε, ρ) = 1{x = ε1} is free of ρ.

P(Y = y | ε, ρ) = 1{y = ρ ε1 + √(1 − ρ²) ε2} depends on ρ.

The situation reverses on switching the roles of X and Y in the fiducial equation.

Whether or not A(x) is free of π given U depends on G.
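Basu's paradox is easy to verify by simulation. The sketch below (sample size chosen only for illustration) confirms that each coordinate is standard normal for every ρ, while the joint distribution tracks ρ through the correlation:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 200_000

def draw_xy(rho):
    # Basu's fiducial equations: X = eps1, Y = rho*eps1 + sqrt(1 - rho^2)*eps2.
    e1 = rng.standard_normal(n)
    e2 = rng.standard_normal(n)
    return e1, rho * e1 + np.sqrt(1.0 - rho**2) * e2

x0, y0 = draw_xy(0.0)
x9, y9 = draw_xy(0.9)

# Each coordinate is N(0, 1) whatever rho is (marginal ancillarity)...
# ...but the joint law depends on rho through corr(X, Y).
r0 = np.corrcoef(x0, y0)[0, 1]
r9 = np.corrcoef(x9, y9)[0, 1]
```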


Representational Ancillarity

Definition of R-Ancillarity

A statistic A(x) is representationally ancillary w.r.t. G if there exists a representation AG s.t. A(G(θ, U)) = AG(U) for all θ.

A(x) is R-ancillary if and only if U | A(x), π ∼ U | A(x).

Lemma (also see Barnard, 1995)

If A1(x) and A2(x) are both R-ancillary w.r.t. G, then they are jointly R-ancillary w.r.t. G.

Write A1, A2 as A1,G(U), A2,G(U). Then (A1,G(U), A2,G(U)) remains R-ancillary w.r.t. G.

In Basu’s paradox, X and Y are both ancillary, but they are not R-ancillary with respect to the same G.
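The distinction can be made concrete by holding U fixed and varying the parameter (a small illustration of the definition, not from the talk): in Basu's equations, X admits a representation free of ρ while Y does not.

```python
import numpy as np

eps = np.array([0.7, -1.2])   # a fixed realization of U = (eps1, eps2)

def G(rho, U):
    # Basu's structural equation, with rho playing the role of theta.
    e1, e2 = U
    return np.array([e1, rho * e1 + np.sqrt(1.0 - rho**2) * e2])

xs = [G(rho, eps)[0] for rho in (0.0, 0.5, 0.9)]
ys = [G(rho, eps)[1] for rho in (0.0, 0.5, 0.9)]
# X = eps1 takes one value for all rho: a representation AG(U), so R-ancillary.
# Y changes with rho for the same U, so Y is not R-ancillary w.r.t. this G.
```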


What Do Representations Buy Us?

Fiducial inference depends on G because whether a statistic is ancillary for π depends on whether it is R-ancillary for G.

Defining ancillarity as distributional independence of A(x) from θ is insufficient.

We require the notion of representational (or functional) independence.


Is Fiducial Hazardous?

                    Cox Hazard Model                 Fiducial
Interested in...    regression coef. β               U
Don’t know...       baseline hazard λ(dt)            prior π(dθ)
Lossy compress...   full data (failure times)        full data X
Inference from...   ranks                            A(x)
                    (λ ancillary, β non-ancillary)   (π ancillary, U non-ancillary)
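The Cox column of the analogy can be checked directly. This sketch uses the standard proportional-hazards construction, where failure times are T_i = Λ0⁻¹(E_i · e^{−β z_i}) with E_i unit exponential (values hypothetical): the ranks of the failure times are unchanged when the baseline hazard changes, yet still depend on β.

```python
import numpy as np

rng = np.random.default_rng(3)
n, beta = 8, 1.3
z = rng.standard_normal(n)          # covariates
E = rng.exponential(size=n)         # the model's "U": unit-rate exponentials

# T_i = Lambda0_inverse(E_i * exp(-beta * z_i)), so the ranks of the T_i
# depend only on the core quantity below, not on the monotone Lambda0.
core = E * np.exp(-beta * z)

T_a = core                          # baseline Lambda0(t) = t
T_b = np.sqrt(core)                 # baseline Lambda0(t) = t^2
ranks_a = np.argsort(np.argsort(T_a))
ranks_b = np.argsort(np.argsort(T_b))
# Same ranks under both baselines: the ranks are ancillary for lambda (the
# nuisance) while remaining informative about beta through z.
```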

Page 87: Who is Crazier: Bayes or Fisher? A Missing (Data) Perspective on Fiducial Inference

Who's Crazier? 13/31, Keli Liu and Xiao-Li Meng

Fiducial Is Just Partial Likelihood

Partial Likelihood for U

Assume the decomposition x = (T(x), A(x)), where A(x) is R-ancillary w.r.t. G.

The joint (U, π)-likelihood factors as

f(x|U, π) = f(T(x)|A(x), U, π) f(A(x)|U).

f(T(x)|A(x), U, π) requires prior specification.

f(A(x)|U) is known, independent of the prior π.

Fiducial Recipe for U|x

1. Ignore the factor f(T(x)|A(x), U, π) because its information on U "cannot be disentangled" from the prior.

2. Use f(A(x)|U) to obtain f(U|A(x)) ∝ f(U)f(A(x)|U).

3. Pretend f(U|A(x)) = f(U|x). Refreshing or Revolting?

Page 97: Who is Crazier: Bayes or Fisher? A Missing (Data) Perspective on Fiducial Inference

Who's Crazier? 14/31, Keli Liu and Xiao-Li Meng

When the Recipe Works...

Exponential Hyperbola

V = (1/λ)E1 and W = λE2, where E1, E2 are iid exponential.

λ̂ = (W/V)^{1/2} is the MLE and A = (VW)^{1/2} is an R-ancillary.

Solve λ̂ = λ(E2/E1)^{1/2} to obtain λ = λ̂(E1/E2)^{1/2}.

Impute (E2/E1)^{1/2} according to its prior distribution. Parallels using the marginal likelihood f(λ̂|λ).

But λ̂ is not sufficient. Can we do better?

We observe A = (E1E2)^{1/2}. Impute (E2/E1)^{1/2} according to f(E1, E2|A). Parallels using the conditional likelihood f(λ̂|A, λ).

Why does conditioning on ancillary statistics recover second-order information?

Fiducial Answer: It increases our efficiency of predicting U.
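A quick simulation (a sketch; the true λ and simulation sizes are arbitrary choices) confirms that the marginal fiducial recipe above yields intervals with exact frequentist coverage, since λ̂/λ = (E2/E1)^{1/2} is a pivot:

```python
import numpy as np

rng = np.random.default_rng(0)
lam0 = 2.0                               # true lambda, arbitrary for the demo
n_sim, n_fid = 2000, 4000
covered = 0
for _ in range(n_sim):
    E1, E2 = rng.exponential(size=2)
    V, W = E1 / lam0, lam0 * E2          # the observed data
    lam_hat = np.sqrt(W / V)             # MLE
    # marginal fiducial draws: impute the ratio of fresh exponentials from its prior
    Estar = rng.exponential(size=(n_fid, 2))
    draws = lam_hat * np.sqrt(Estar[:, 0] / Estar[:, 1])
    lo, hi = np.quantile(draws, [0.025, 0.975])
    covered += (lo <= lam0 <= hi)
coverage = covered / n_sim
print(coverage)                          # close to the nominal 0.95
```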

Page 108: Who is Crazier: Bayes or Fisher? A Missing (Data) Perspective on Fiducial Inference

Who's Crazier? 15/31, Keli Liu and Xiao-Li Meng

An Often Forgotten Ingredient in Fiducial Cooking

X has the following fiducial equation:

X = G(θ, U) = { 0 if 0 ≤ U < θ;  1 if θ ≤ U < 1 },

where U ~ Unif[0, 1] and θ ∈ [1/4, 1/2].

Key Question: Without π(θ), what do we know about U? What is the free information in X about U?

The only additional information available to us is the fact that the value of U and x must be compatible (Hannig 2009).

If X = 0, then U ∈ [0, 1/2]: shall we predict U as Unif[0, 1/2]? If X = 1, then U ∈ [1/4, 1]: shall we predict U as Unif[1/4, 1]?

What are we forgetting by doing this?
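What is being forgotten shows up in simulation: for any fixed θ, U given X = 0 is Unif[0, θ), not Unif[0, 1/2). A sketch with a hypothetical θ0 = 0.3:

```python
import numpy as np

rng = np.random.default_rng(1)
theta0 = 0.3                         # hypothetical true value in [1/4, 1/2]
U = rng.uniform(size=200_000)
X = (U >= theta0).astype(int)        # the fiducial equation G(theta0, U)
# Among realizations with X = 0, U is Unif[0, theta0), not Unif[0, 1/2):
mean_U_given_X0 = U[X == 0].mean()
print(mean_U_given_X0)               # near theta0/2 = 0.15, not 0.25
```

The naive Unif[0, 1/2] prediction has mean 0.25; the simulated conditional mean sits near 0.15, so predicting U from compatibility alone discards real information.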

Page 115: Who is Crazier: Bayes or Fisher? A Missing (Data) Perspective on Fiducial Inference

Who's Crazier? 16/31, Keli Liu and Xiao-Li Meng

We Observe and Observe that We Observe

x-Compatible Regions

If we generate data x = G(θ0, U0), we learn that U0 resides in

M(G(θ0, U0)) = {U : ∃θ ∈ Θ s.t. G(θ0, U0) = G(θ, U)}.

Hannig (2009) assumes that

f(U|M(X) = M(x)) = f(U|U ∈ M(x)),

which hence does not depend on π.

The event {M(X) = M(x)} contains two pieces of information:

1. U0 ∈ M(x).

2. How we came to observe U0 ∈ M(x).

To correctly use the information in (1), we need to condition on the how, i.e., condition on the observation process.
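The two x-compatible regions for the binary example can be computed mechanically from the definition of M; a grid sketch:

```python
import numpy as np

def G(theta, u):
    # the fiducial equation of the binary example
    return 0 if u < theta else 1

thetas = np.linspace(0.25, 0.5, 101)     # Theta = [1/4, 1/2]
us = np.arange(0.0, 1.0, 0.001)
# M(x) = {u : there exists theta in Theta with G(theta, u) = x}
M0 = [u for u in us if any(G(t, u) == 0 for t in thetas)]
M1 = [u for u in us if any(G(t, u) == 1 for t in thetas)]
print(min(M0), max(M0))                  # approximately the interval [0, 1/2)
print(min(M1), max(M1))                  # approximately the interval [1/4, 1)
```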

Page 122: Who is Crazier: Bayes or Fisher? A Missing (Data) Perspective on … › programs › scientific › 13-14 › ... · 2013-11-15 · Let’s Meet the Crazies Inference for Mean of

Who’sCrazier?16/31

Keli Liu andXiao-Li Meng

The Art ofCreatingMissingness

A Partial Lookat Fiducial

Observed NotAt Random

Is UtopiaPossible?

Conclusions

We Observe and Observe that We Observe

x-Compatible Regions

If we generate data x = G (θ0,U0), we learn that U0 resides in

M (G (θ0,U0)) = {U : ∃θ ∈ Θ s.t. G (θ0,U0) =G (θ,U)} .

Hannig (2009) assumes that

f (U|M(X) =M(x)) = f (U|U ∈M(x)),

hence does not depend on π.

The event {M(X) =M(x)} contains two pieces of information:

1 U0 ∈M(x).

2 How we came to observe U0 ∈M(x).

To correct use the information in (1), we need to conditionon the how, i.e., condition on the observation process.

Page 123: Who is Crazier: Bayes or Fisher? A Missing (Data) Perspective on … › programs › scientific › 13-14 › ... · 2013-11-15 · Let’s Meet the Crazies Inference for Mean of

Who’sCrazier?17/31

Keli Liu andXiao-Li Meng

The Art ofCreatingMissingness

A Partial Lookat Fiducial

Observed NotAt Random

Is UtopiaPossible?

Conclusions

Bernoulli with Restricted Parameter Space

Reminder: U ~ Unif[0, 1] and θ ∈ [1/4, 1/2].

X = G(θ, U) = 0 if 0 ≤ U < θ; 1 if θ ≤ U < 1.

M(0) = [0, 1/2], M(1) = [1/4, 1].

I{B} is the indicator function of the event B. Let Ix ≡ I{U ∈ M(x)} for x = 0, 1.

We do not observe Ix. It must be inferred from the data.

What We Actually Observe

Iobs = X I1 + (1 − X) I0   (Note: Iobs = 1)

For x ∈ X (the sample space), define

Ox = 1 if Iobs = Ix; 0 otherwise.
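A short simulation of this setup confirms the parenthetical note: observing X = x guarantees U ∈ M(x), so Iobs = 1 always. (Drawing θ uniformly is only a device for generating cases; the sets M(0) and M(1) are fixed and do not depend on that choice.)

```python
# The slides' Bernoulli example: theta in [1/4, 1/2], U ~ Unif[0, 1],
# X = G(theta, U) = 0 if U < theta, 1 otherwise.
import numpy as np

rng = np.random.default_rng(0)
n = 100_000
theta = rng.uniform(0.25, 0.5, size=n)   # just to generate many cases
U = rng.uniform(0.0, 1.0, size=n)
X = (U >= theta).astype(int)             # G(theta, U)

# Indicators I_x = I{U in M(x)} with M(0) = [0, 1/2], M(1) = [1/4, 1].
I0 = (U <= 0.5).astype(int)
I1 = (U >= 0.25).astype(int)

# What we actually get to see: I_obs = X * I1 + (1 - X) * I0.
I_obs = X * I1 + (1 - X) * I0
print(I_obs.min())  # 1: X = 0 forces U < theta <= 1/2, X = 1 forces U >= theta >= 1/4
```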

A Sleight of Hand

M(X) contains the information Ix through Iobs and {Ox}.

f(U | M(X), π) = f(U | Iobs, O0, O1, π)

Rewrite the posterior using the law of total probability:

Σ_{t ∈ {0,1}} f(U | Ix = t, O0, O1, π) P(Ix = t | Iobs, O0, O1)

Problem: The distribution of (O0, O1) depends on π.

The Sleight of Hand

To remove the dependence on π, make the substitution

f(U | Ix = t, O0, O1, π) = f(U | Ix = t)

We ignore how we learned {Ix = t}.

O, We Just Ignored You!

A Mismatch

Using the sleight of hand, rewrite the now π-free posterior:

Σ_{t ∈ {0,1}} f(U | Ix = t) P(Ix = t | Iobs, O0, O1)

We ignored (O0, O1) in the first term BUT continue to condition on them in the second term: incoherent.

It is this incoherence that leads to the incorrect assumption

f(U | M(X) = M(x), π) = f(U | U ∈ M(x))

Confidence Validity (Rubin 1976)

The posterior f(U | U ∈ M(x)) leads to valid confidence regions for U if the observation process is ignorable:

P(Ix = 1 | O0 = o0, O1 = o1, U, π) = P(Ix = 1 | U)

A Free Lunch? Take a Second Look.

Suppose we observe I0 = 1 and I1 is missing.

The condition for confidence validity does not hold:

P(I0 = 1 | U, θ) = I{U ≤ 1/2}

P(I0 = 1 | O0 = 1, O1 = 0, U, θ) = I{U ≤ θ}

Confidence-valid inference for U requires modeling (O0, O1).

The distribution of (O0, O1) depends on π: the reduction X → M(X) does not throw away enough information.

The information U ∈ M(x) seems free. But using it correctly requires paying the price of a prior on θ.
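The failure is easy to see by simulation (fixing θ = 0.3 purely for illustration; any θ in the parameter space tells the same story): given that we observed X = 0, U is distributed Unif[0, θ], not Unif[0, 1/2], so the naive truncation f(U | U ∈ M(0)) overstates where U can be.

```python
# Sketch: the observation process {O0, O1} carries information about U
# that the naive truncation f(U | U in M(0)) throws away.
# Illustrative assumption: theta fixed at 0.3.
import numpy as np

rng = np.random.default_rng(1)
theta = 0.3
U = rng.uniform(0.0, 1.0, size=200_000)
U_given_x0 = U[U < theta]       # the draws for which X = 0 was observed

print(U_given_x0.max())         # close to theta = 0.3, far below 1/2
print(U_given_x0.mean())        # close to theta/2 = 0.15, not the naive 1/4
```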

The Consequences of Being Cheap

Actual Posterior, X = 0

E_{π(θ|x=0)}[I{U ≤ θ}/θ]

Naive Posterior, X = 0

2 I{U ≤ 1/2}

Ignoring {O0, O1} is equivalent to assuming a point-mass prior at θ = 1/2, the most dogmatic prior of all!
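A Monte Carlo check makes the gap concrete (taking, as an illustrative assumption of ours, π uniform on [1/4, 1/2]): under the actual posterior, P(U ≤ 1/4 | X = 0) = (1/4)/E[θ] = (1/4)/(3/8) = 2/3, while the naive truncation Unif[0, 1/2] assigns only 1/2.

```python
# Compare the actual posterior of U given X = 0 (averaging over an
# assumed uniform prior on [1/4, 1/2]) with the naive truncation
# f(U | U in M(0)) = Unif[0, 1/2].
import numpy as np

rng = np.random.default_rng(2)
n = 500_000
theta = rng.uniform(0.25, 0.5, size=n)
U = rng.uniform(0.0, 1.0, size=n)
U_post = U[U < theta]                 # draws from f(U | X = 0, pi)

p_actual = (U_post <= 0.25).mean()    # P(U <= 1/4 | X = 0) = 2/3
p_naive = 0.25 / 0.5                  # Unif[0, 1/2] gives 1/2
print(p_actual, p_naive)
```

The two answers differ by a sixth of posterior mass on a single interval, which is the price of silently substituting the point-mass prior at θ = 1/2.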

Know the Price

Takeaway: The information U ∈ M(x) is not free; using it may require assuming a prior π.

Question 1: So when is this information free? When is U | M(x) objective, i.e., free of π?

Question 2: What is the maximal amount of free information about U? What is the "best" objective posterior for U?

How Small Can You Go?

U | M(x) is an objective posterior for U if and only if an R-ancillary statistic captures the information in M(x).

R-Ancillary Regions

If A(x) is R-ancillary s.t. A(G(θ0, U0)) = AG(U0), the R-ancillary region defined by A is the set

AG(U0) = {U : AG(U) = AG(U0)}

The smaller the R-ancillary region, the more informative A is.

Any R-ancillary region is rougher than the x-compatible region:

M(G(θ0, U0)) ⊂ AG(U0)   ∀ θ0, ∀ AG

What is the smallest we can make AG? Does a smallest region even exist?

Page 159: Who is Crazier: Bayes or Fisher? A Missing (Data) Perspective on … › programs › scientific › 13-14 › ... · 2013-11-15 · Let’s Meet the Crazies Inference for Mean of

Who’sCrazier?23/31

Keli Liu andXiao-Li Meng

The Art ofCreatingMissingness

A Partial Lookat Fiducial

Observed NotAt Random

Is UtopiaPossible?

Conclusions

How Small Can You Go?

U|M(x) is an objective posterior for U if and only if anR-ancillary statistic captures the information in M(x).

R-Ancillary Regions

If A(x) is R-ancillary s.t. A(G (θ0,U0)) = AG (U0), theR-ancillary region defined by A is the set

AG (U0) = {U :AG (U) = AG (U0)}

The smaller the R-ancillary region, the more informative A is.

Any R-ancillary region is rougher than the x compatible region.

M(G (θ0,U0)) ⊂ AG (U0) ∀θ0, ∀AG

What is the smallest we can make AG? Does a smallestregion even exist?

From the Cave to Utopia?

R-ancillary statistics are jointly R-ancillary: the intersection of R-ancillary regions is itself an R-ancillary region.

Utopia ⊂ Cave

U_G(U_0) = ⋃_{θ_0 ∈ Θ} M(G(θ_0, U_0)) ⊂ ⋂_{A: R-ancillary for G} A_G(U_0) = C_G(U_0)

The maximal R-ancillary, A_max, restricts U_0 to the Cave.

Ultimately, we hope to restrict U_0 to Utopia, which is the universal "Cramér–Rao lower bound for conditioning."

Can we achieve Utopia?
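The Utopia set can be computed by brute force in a toy case. The sketch below is an illustration under the same hypothetical assumptions as before: a two-observation location model x = G(θ, U) = (θ + U_1, θ + U_2) on an integer grid. It forms Utopia as the union over θ_0 of the compatible regions and checks that it sits inside an R-ancillary region (hence inside the Cave, which is the intersection of all such regions).

```python
# Hypothetical two-observation location model, for illustration only:
# x = G(theta, U) = (theta + U1, theta + U2).

def G(theta, U):
    return (theta + U[0], theta + U[1])

THETAS = range(-5, 6)
GRID = [(u1, u2) for u1 in range(-5, 6) for u2 in range(-5, 6)]

def compatible_region(x):
    # M(x) = {U : x = G(theta, U) for some theta}
    return {U for U in GRID for theta in THETAS if G(theta, U) == x}

U0 = (1, 3)

# Utopia_G(U0) = union over theta0 of M(G(theta0, U0)).
utopia = set()
for theta0 in THETAS:
    utopia |= compatible_region(G(theta0, U0))

# The region of the residual R-ancillary A(x) = x2 - x1.
ancillary_region = {U for U in GRID if U[1] - U[0] == U0[1] - U0[0]}

# Utopia sits inside every R-ancillary region, hence inside the Cave.
assert utopia <= ancillary_region
```

In this location model the two sets actually coincide: Utopia is the full level set of U_2 − U_1, foreshadowing the next slide's claim that Utopia is achievable.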

Can Statisticians Achieve Utopia?

Despite all the cynicism in the world, we happily report...

Utopia Can Always Be Achieved

Utopia = ⋃_{θ_0 ∈ Θ} M(G(θ_0, U_0)) = ⋂_{A: R-ancillary for G} A_G(U_0) = Cave

1. The collection of Utopia sets, {U_G(U_0)}_{U_0 ∈ U}, partitions the pivot space into equivalence classes.

2. The set of pivot-space equivalence classes corresponds to a set of sample-space equivalence classes.

3. The index of the sample-space equivalence class is observed and is R-ancillary.

Utopia represents an upper bound on the informativeness of an R-ancillary. It is always achieved.
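The three steps above can be traced in the same hypothetical two-observation location model x = (θ + U_1, θ + U_2), where (with θ unrestricted) the Utopia set of U_0 is the level set of U_2 − U_1. The sketch below, an illustration under that assumption, checks that these level sets partition a pivot grid and that the class index is observed, i.e., computable from the data alone.

```python
# Hypothetical location model: Utopia sets are level sets of U2 - U1.
GRID = [(u1, u2) for u1 in range(-5, 6) for u2 in range(-5, 6)]

def utopia_index(U):
    # Index of the Utopia (equivalence) class containing U.
    return U[1] - U[0]

# Step 1: the Utopia sets partition the pivot grid.
classes = {}
for U in GRID:
    classes.setdefault(utopia_index(U), set()).add(U)
assert sum(len(c) for c in classes.values()) == len(GRID)  # disjoint cover

# Steps 2-3: the class index is observed and R-ancillary, since
# x2 - x1 = (theta + U2) - (theta + U1) is free of theta.
theta, U0 = 4, (-2, 1)
x = (theta + U0[0], theta + U0[1])
assert x[1] - x[0] == utopia_index(U0)
```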


Who’sCrazier?26/31

Keli Liu andXiao-Li Meng

The Art ofCreatingMissingness

A Partial Lookat Fiducial

Observed NotAt Random

Is UtopiaPossible?

Conclusions

Making Relevant Subsets Relevant Again

Fisher (1934) argued that we should condition on relevantsubsets. Of what space?

Classical Way

Subsets of the samplespace, X.

Level-sets of ancillarystatistics.

Paradoxes: Which ancillarystatistics? Existence?

New Way

Subsets of the pivot space,U.

Level-sets of R-ancillarystatistics.

Unique “most relevant”subset—Utopia.

U, the uncertainty, dictates how hard the inference problem is.

Condition on U⇔ Condition on difficulty of inference.

Want to give same effort for all difficulties.

The Best for U

The optimal achievable objective (free of π) posterior for U: f(U | A_max).

"Heaven Is Possible" If and Only If...

With respect to fixed G,

f(U | A_max(x)) = f(U | M(x)), ∀x ∈ X

if and only if, for all U ∈ U and all θ ∈ Θ, M(G(θ, U)) = U_G(U), where U_G(U) is the Utopia set containing U.

Interpretation

M(G(θ, U)) is the information we get back about U when data are generated using θ.

Invariance of M(G(θ, U)) to θ implies that the information content is independent of the true parameter value.
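The invariance condition can be checked directly in a toy case. The sketch below is an illustration under hypothetical assumptions: the two-observation location model x = (θ + U_1, θ + U_2) with θ unrestricted, where x = G(θ, U) is solvable iff the differences match, so M(x) = {U : U_2 − U_1 = x_2 − x_1}. It verifies that M(G(θ, U)) is one and the same set for every θ.

```python
# Hypothetical location model with unrestricted theta, for illustration.
GRID = [(u1, u2) for u1 in range(-3, 4) for u2 in range(-3, 4)]

def G(theta, U):
    return (theta + U[0], theta + U[1])

def compatible_region(x):
    # With theta unrestricted, M(x) = {U : U2 - U1 = x2 - x1}.
    return {U for U in GRID if U[1] - U[0] == x[1] - x[0]}

U = (0, 2)
regions = {frozenset(compatible_region(G(theta, U))) for theta in range(-10, 11)}

# One distinct region: the information about U does not depend on theta,
# which is exactly the "heaven" invariance condition.
assert len(regions) == 1
```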


When Is the Best for U Good Enough for θ?

An Objective Posterior for θ

1. f(U | A_max) = f(U | M(x)).

2. x = G(θ, U) can be solved for θ, yielding a function, θ(U; x), from M(x) to Θ.

Then θ(U; x), as a function of U ∼ f(U | A_max), induces a frequentist-calibrated posterior distribution for θ.

Most Relevant Confidence Regions

Let C_U be a 1 − α probability set w.r.t. f(U | A_max).

For fixed G, C_U is a most relevant confidence region for U.

The mapping θ(U; x) (not necessarily a function) converts C_U into a most relevant 1 − α level confidence region, C_θ, for θ.

C_θ attains posterior probability 1 − α if (1) and (2) hold.
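Frequentist calibration can be checked by simulation in the deck's opening normal-mean example, where the fiducial posterior is μ | x ∼ x̄ − (s/√n) t_{n−1}. The Monte Carlo sketch below is an illustration (sample size, seed, and trial count are my own choices): the 1 − α posterior set for μ, i.e., the usual t-interval, covers the true mean at the nominal rate.

```python
# Monte Carlo check that the fiducial t-posterior for a normal mean
# yields intervals with the nominal frequentist coverage.
import math
import random

random.seed(7)
n, mu, sigma = 5, 10.0, 3.0        # illustrative choices
t_crit = 2.776445                  # t_{4, 0.975}, standard table value
trials = 20000
covered = 0
for _ in range(trials):
    xs = [random.gauss(mu, sigma) for _ in range(n)]
    xbar = sum(xs) / n
    s = math.sqrt(sum((x - xbar) ** 2 for x in xs) / (n - 1))
    half = t_crit * s / math.sqrt(n)   # half-width of the 95% posterior set
    covered += (xbar - half <= mu <= xbar + half)

coverage = covered / trials
assert 0.94 < coverage < 0.96      # matches the nominal 1 - alpha = 0.95
```

The coverage is exact in theory for every n, which is the "frequentist-calibrated" property claimed for the induced posterior.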

Who's Crazier? 29/31
Keli Liu and Xiao-Li Meng

Posterior Reflections: From U to θ Through G

Invert x = G(θ, U) to θ*(U; x) = {θ : G(θ, U) = x}.

If M(x) = U_G(U_0):

θ*(U; x) is a singleton: the fiducial posterior is θ*(U; x) | A_max.

θ*(U; x) is not a singleton: Dempster-Shafer theory gives a distribution over 2^Θ.

If M(x) ⊊ U_G(U_0): A_max is not informative enough. Approximate R-ancillarity?

Uncongeniality: inconsistent use of the full data, x, and the partial data, A_max, in different phases of the analysis.

Non-uniqueness: how does one choose G?
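When θ*(U; x) is set-valued, the slide points to Dempster-Shafer theory. A minimal stdlib-Python sketch, using a hypothetical Bernoulli illustration of my own choosing (not from the slides): with x = 1{U ≤ θ} and U ∼ Uniform(0, 1), observing x = 1 inverts only to the random set θ*(U; 1) = [U, 1], and belief/plausibility of a hypothesis come from whether that set is contained in, or merely intersects, the hypothesis.

```python
import random

random.seed(1)

# Data-generating equation: x = 1{U <= theta}, U ~ Uniform(0, 1).
# Observing x = 1 does not pin theta down; it only says theta >= U,
# so the inversion theta*(U; 1) = [U, 1] is a random SET, not a point.
draws = [random.random() for _ in range(100_000)]

H_lo = 0.5  # hypothesis H: theta >= 0.5
# Belief(H)       = P(theta*(U;1) entirely inside H) = P(U >= 0.5).
# Plausibility(H) = P(theta*(U;1) intersects H); [U, 1] always contains
#                   1 >= 0.5, so the plausibility is exactly 1.
belief = sum(u >= H_lo for u in draws) / len(draws)
plausibility = sum(max(u, H_lo) <= 1.0 for u in draws) / len(draws)
print(round(belief, 2), plausibility)  # belief near 0.5, plausibility 1.0
```

The gap between belief and plausibility is the mass on sets straddling the hypothesis boundary; a point-valued inversion would collapse the two into one posterior probability.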

Who's Crazier? 30/31
Keli Liu and Xiao-Li Meng

Bibliography

Barnard, G. A. (1995). Pivotal models and the fiducial argument. International Statistical Review, 63(3): 309-323.

Basu, D. (1964). Recovery of ancillary information. Sankhyā, 26: 3-16.

Dawid, A. P. and Stone, M. (1982). The functional-model basis of fiducial inference. Annals of Statistics, 10(4): 1054-1067.

Dempster, A. P. (2008). The Dempster-Shafer calculus for statisticians. International Journal of Approximate Reasoning, 48: 365-377.

Fraser, D. A. S. (1968). The Structure of Inference. John Wiley & Sons, New York-London-Sydney.

Hannig, J. (2009). On generalized fiducial inference. Statistica Sinica, 19: 491-544.

Martin, R., Zhang, J., and Liu, C. (2010). Dempster-Shafer theory and statistical inference with weak beliefs. Statistical Science, 25(1): 72-87.

Rubin, D. B. (1976). Inference and missing data. Biometrika, 63(3): 581-592.

Taraldsen, G. and Lindqvist, B. (2013). Fiducial theory and optimal inference. Annals of Statistics, 41(1): 323-341.