Inductive Amnesia: The Reliability of Iterated Belief Revision (PowerPoint presentation transcript)
Inductive Amnesia
The Reliability of Iterated Belief Revision
A Table of Opposites

Even                 | Odd
Straight             | Crooked
Reliability          | "Confirmation"
Performance          | "Primitive norms"
Correctness          | "Coherence"
Classical statistics | Bayesianism
Learning theory      | Belief Revision Theory
The Idea

Belief revision is inductive reasoning
A restrictive norm prevents us from finding truths we could have found by other means
Some proposed belief revision methods are restrictive
The restrictiveness is expressed as inductive amnesia
Inductive Amnesia

No restriction on memory...
No restriction on predictive power...
But prediction causes memory loss...
And perfect memory precludes prediction!
Fundamental dilemma
Outline

I. Seven belief revision methods
II. Belief revision as learning
III. Properties of the methods
IV. The Goodman hierarchy
V. Negative results
VI. Positive results
VII. Discussion
Points of Interest

Strong negative and positive results
Short run advice from limiting analysis
2 is magic for reliable belief revision
Learning as cube rotation
Grue
Part I
Iterated Belief Revision
Bayesian (Vanilla) Updating

Propositions are sets of "possible worlds".
New evidence E arrives and is intersected with the current belief state B.

Perfect memory
No inductive leaps

B' = B * E = B ∩ E
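As a toy sketch (Python; the finite set-of-strings representation of worlds and all names are illustrative, not from the talk), vanilla updating is plain set intersection:

```python
# Vanilla (qualitative Bayesian) updating: a belief state B is a set of
# possible worlds, and new evidence E is simply intersected with it.

def update(B, E):
    """B * E = B intersect E: perfect memory, no inductive leaps."""
    return B & E

B = {"w1", "w2", "w3"}     # currently live possibilities
E = {"w2", "w3", "w4"}     # new evidence
B2 = update(B, E)          # {"w2", "w3"}

assert B2 == {"w2", "w3"}
# Epistemic hell: evidence disjoint from B leaves the empty set.
assert update(B, {"w4"}) == set()
```

Because every result is a subset of every datum seen so far, memory is perfect; because the result is never strengthened beyond the data, there are no inductive leaps.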
Epistemic Hell

Surprise! When new evidence E contradicts the belief state B (B ∩ E = ∅), vanilla updating lands in epistemic hell.

Scientific revolutions
Suppositional reasoning
Conditional pragmatics
Decision theory
Game theory
Databases
Ordinal Entrenchment (Spohn 88)

Epistemic state S maps worlds to ordinals
Belief state of S = b(S) = S⁻¹(0)
Determines "centrality" of beliefs
Model: orders of infinitesimal probability

(diagram: worlds stacked by rank 0, 1, 2, ...; B = b(S) is the bottom level)
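A minimal sketch of this representation (Python; integer ranks stand in for arbitrary ordinals, a simplifying assumption of mine):

```python
# A Spohn-style epistemic state: a map from worlds to ranks, with lower
# rank meaning "more entrenched / more plausible".

def belief(S):
    """b(S) = S^{-1}(0): the worlds at the lowest rank."""
    low = min(S.values())
    return {w for w, r in S.items() if r == low}

S = {"w1": 0, "w2": 0, "w3": 1, "w4": 2}
assert belief(S) == {"w1", "w2"}
```

Taking the minimum rank instead of literal rank 0 lets `belief` tolerate un-normalized states.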
Belief Revision Methods

* takes an epistemic state and a proposition to an epistemic state: S * E = S'
(diagram: belief state b(S) before, b(S * E) after)
Spohn Conditioning *C (Spohn 88)

New evidence E may contradict b(S). *C conditions an entire entrenchment ordering.

Perfect memory
Inductive leaps
No epistemic hell on consistent sequences
Epistemic hell on inconsistent sequences
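A minimal sketch of *C (Python; epistemic states as dicts from worlds to integer ranks — an illustrative encoding, not the paper's formal definition):

```python
def conditioning(S, E):
    """Spohn conditioning *C: restrict the whole entrenchment ordering to
    the E-worlds and shift it down so the lowest survivor has rank 0.
    An inconsistent evidence sequence empties the state: epistemic hell."""
    kept = {w: r for w, r in S.items() if w in E}
    if not kept:
        raise ValueError("epistemic hell: evidence refutes every world")
    low = min(kept.values())
    return {w: r - low for w, r in kept.items()}

S = {"w1": 0, "w2": 1, "w3": 2}
S2 = conditioning(S, {"w2", "w3"})
assert S2 == {"w2": 0, "w3": 1}   # inductive leap: w2 is now believed
```

Memory is perfect because refuted worlds are gone for good; the price is that contradictory data cannot be absorbed.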
Lexicographic Updating *L (Spohn 88, Nayak 94)

Lift refuted possibilities above non-refuted possibilities, preserving order.

Perfect memory on consistent sequences
Inductive leaps
No epistemic hell
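A sketch of *L under the same dict-of-ranks encoding (compressing ranks to consecutive levels is my choice for readability):

```python
def lexicographic(S, E):
    """Lexicographic updating *L: lift every refuted (non-E) world above
    every non-refuted (E) world, preserving the order inside each group."""
    e_levels = sorted({r for w, r in S.items() if w in E})
    o_levels = sorted({r for w, r in S.items() if w not in E})
    return {w: (e_levels.index(r) if w in E
                else len(e_levels) + o_levels.index(r))
            for w, r in S.items()}

S = {"w1": 0, "w2": 1, "w3": 2}
S2 = lexicographic(S, {"w2", "w3"})
# w2 and w3 keep their order at the bottom; refuted w1 is lifted above.
assert S2 == {"w2": 0, "w3": 1, "w1": 2}
```

No world is ever discarded, so epistemic hell cannot occur, and on consistent sequences memory is perfect.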
Minimal or "Natural" Updating *M (Spohn 88, Boutilier 93)

Drop the lowest possibilities consistent with the data to the bottom, and raise everything else up one notch.

Inductive leaps
No epistemic hell
But...
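A sketch of *M in the same encoding (illustrative; assumes the datum is consistent with at least one world):

```python
def natural(S, E):
    """Minimal ("natural") updating *M: drop the lowest E-consistent
    worlds to the bottom and push everything else up one notch."""
    low = min(r for w, r in S.items() if w in E)
    return {w: (0 if (w in E and r == low) else r + 1)
            for w, r in S.items()}

S = {"w1": 0, "w2": 1, "w3": 1}
S2 = natural(S, {"w2", "w3"})
assert S2 == {"w2": 0, "w3": 0, "w1": 1}
```

Because a refuted world rises only one notch, it can later sink back below still-live rivals — the source of the amnesia discussed next.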
Amnesia

What goes up can come down
Belief no longer entails past data

(animation: after evidence E and then E', a world refuted by E sinks back to the bottom under *M)
The Flush-to-α Method *F,α (Goldszmidt and Pearl 94)

Send non-E worlds to a fixed level α and drop E-worlds rigidly to the bottom.

Perfect memory on sequentially consistent data if α is high enough
Inductive leaps
No epistemic hell
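A sketch of *F,α in the same dict-of-ranks encoding (illustrative; assumes E is satisfiable):

```python
def flush(S, E, alpha):
    """Flush-to-alpha *F,alpha: send every non-E world to the fixed level
    alpha and drop the E-worlds rigidly to the bottom."""
    low = min(r for w, r in S.items() if w in E)
    return {w: (r - low if w in E else alpha) for w, r in S.items()}

S = {"w1": 0, "w2": 1, "w3": 3}
S2 = flush(S, {"w2", "w3"}, alpha=5)
assert S2 == {"w2": 0, "w3": 2, "w1": 5}   # w1 flushed to level 5
```

If α is below the top of the state, a refuted world can land beneath live ones and memory is lost — hence the "high enough" proviso.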
Ordinal Jeffrey Conditioning *J,α (Spohn 88)

Drop E-worlds to the bottom. Drop non-E worlds to the bottom and then jack them up to level α.

Perfect memory on consistent sequences if α is large enough
No epistemic hell
Reversible
But...
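A sketch of *J,α in the same encoding (illustrative):

```python
def jeffrey(S, E, alpha):
    """Ordinal Jeffrey conditioning *J,alpha: drop the E-worlds to the
    bottom; drop the non-E worlds to the bottom too, then jack them up
    to level alpha."""
    e_low = min(r for w, r in S.items() if w in E)
    others = [r for w, r in S.items() if w not in E]
    o_low = min(others) if others else 0
    return {w: (r - e_low if w in E else r - o_low + alpha)
            for w, r in S.items()}

S = {"w1": 5, "w2": 0, "w3": 1}
S2 = jeffrey(S, {"w2", "w3"}, alpha=1)
# Refuted w1 falls from level 5 to level 1: "empirical backsliding".
assert S2 == {"w2": 0, "w3": 1, "w1": 1}
```

The drop-then-raise structure is what makes the operation reversible — and also what lets a refuted possibility become more plausible than it was before.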
Empirical Backsliding

Ordinal Jeffrey conditioning can increase the plausibility of a refuted possibility.
The Ratchet Method *R,α (Darwiche and Pearl 97)

Like ordinal Jeffrey conditioning, except refuted possibilities move up by α from their current positions.

Perfect memory if α is large enough
Inductive leaps
No epistemic hell
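A sketch of *R,α in the same encoding (illustrative; assumes E is satisfiable):

```python
def ratchet(S, E, alpha):
    """Ratchet *R,alpha: like ordinal Jeffrey conditioning, except each
    refuted world moves up by alpha from its *current* position, so it
    can never backslide."""
    low = min(r for w, r in S.items() if w in E)
    return {w: (r - low if w in E else r + alpha) for w, r in S.items()}

S = {"w1": 5, "w2": 0, "w3": 1}
S2 = ratchet(S, {"w2", "w3"}, alpha=1)
assert S2 == {"w2": 0, "w3": 1, "w1": 6}   # w1 only ever rises
```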
Part II
Belief Revision as Learning
Iterated Belief Revision

S0 * () = S0
S0 * (E0, ..., En, En+1) = (S0 * (E0, ..., En)) * En+1

(diagram: S0 → S1 → S2 under successive data E0, E1, with belief states b(S0), b(S1), b(S2))
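The recursion is an ordinary left fold; a sketch (Python, with set intersection standing in for an arbitrary one-step revision operator):

```python
from functools import reduce

def revise_all(S0, evidence, star):
    """S0 * (E0, ..., En): fold the one-step operator * over the data,
    per S0 * () = S0 and
    S0 * (E0, ..., En, En+1) = (S0 * (E0, ..., En)) * En+1."""
    return reduce(star, evidence, S0)

star = lambda B, E: B & E          # vanilla updating as the one-step *
B0 = {"w1", "w2", "w3"}
assert revise_all(B0, [], star) == B0
assert revise_all(B0, [{"w1", "w2"}, {"w2", "w3"}], star) == {"w2"}
```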
A Very Simple Learning Paradigm

A mysterious system emits an outcome sequence, e.g. 0 0 1 0 0 ...
e is one of the possible infinite trajectories; e|n is its initial segment through stage n.
Empirical Propositions

Empirical propositions are sets of possible trajectories. Some special cases:

[e|n] = the proposition that e|n has occurred (the "fan" of trajectories extending e|n)
[k, n] = the proposition that k occurs at stage n
{e} = the proposition that the future trajectory is exactly e
Trajectory Identification

(*, S0) identifies e ⇔ for all but finitely many n,
b(S0 * ([0, e(0)], ..., [n, e(n)])) = {e}

(animation over the possible trajectories: the belief states b(S0), b(S1), b(S2), ... converge to {e}, etc.)
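The definition can be exercised in a toy simulation (Python; finite-length tuples stand in for infinite trajectories, and lexicographic updating serves as the revision method — all representation choices are mine):

```python
def lexicographic(S, E):
    """*L: refuted worlds above survivors, order preserved in each group."""
    e_levels = sorted({r for w, r in S.items() if w in E})
    o_levels = sorted({r for w, r in S.items() if w not in E})
    return {w: (e_levels.index(r) if w in E
                else len(e_levels) + o_levels.index(r))
            for w, r in S.items()}

def belief(S):
    low = min(S.values())
    return {w for w, r in S.items() if r == low}

K = {(0, 0, 0, 0), (0, 0, 1, 0), (1, 1, 1, 1)}   # candidate trajectories
e = (0, 0, 1, 0)                                  # the true trajectory
S = {w: 0 for w in K}                             # flat initial ranking
for n in range(len(e)):
    datum = {w for w in K if w[n] == e[n]}        # the proposition [n, e(n)]
    S = lexicographic(S, datum)

assert belief(S) == {e}   # belief has converged to {e}
```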
Reliability

Let K be a set of possible outcome trajectories.
(*, S0) identifies K ⇔ (*, S0) identifies each e in K.
Identifiability Characterized

Proposition: K is identifiable just in case K is countable.
Completeness and Restrictiveness

* is complete ⇔ each identifiable K is identifiable by (*, S0), for some choice of S0.
Else * is restrictive.
Part III
Properties of the Methods
Timidity and Stubbornness

Timidity: no inductive leaps without refutation.
Stubbornness: no retractions without refutation.

"Belief is Bayesian in the non-problematic case."
All the proposed methods are timid and stubborn.
Vestige of the dogma that probability rules induction.
Local Consistency

Local consistency: the updated belief must always be consistent with the current datum.
All the methods under consideration are designed to be locally consistent.
Positive Order-Invariance

Positive order-invariance: rankings among worlds satisfying all the data so far are preserved.
All the methods considered are positively order-invariant.
Data-Retentiveness

Data-retentiveness: each world satisfying all the data is placed above each world failing to satisfy some datum.
Data-retentiveness is sufficient but not necessary for perfect memory.
*C, *L are data-retentive.
*R,α, *J,α are data-retentive if α is above the top of S.
Enumerate and Test

A method enumerates and tests just in case it is:
locally consistent,
positively order-invariant,
data-retentive.

Enumerate-and-test methods: *C, *L; the methods with parameter α, if α is above the top of S0.

(diagram: the entrenchment ordering on live possibilities is preserved; refuted possibilities go to an epistemic dump above)
Completeness

Proposition: If * enumerates and tests, then * is complete.

Proof: Let S0 be an enumeration of K. Let e be in K.
Feed successive data along e: [0, e(0)], [1, e(1)], ..., [n, e(n)], ...
(animation: at each step, local consistency, positive invariance, and data-retentiveness keep e descending while refuted streams stack above it, until e is uniquely at the bottom: convergence)
Question

What about the methods that aren't data-retentive?
Are they complete?
If not, can they be objectively compared?
Part IV:
The Goodman Hierarchy
The Grue Operation (Nelson Goodman)

A way to generate inductive problems of ever higher difficulty.

e ‡ n = the trajectory that agrees with e before stage n and flips every outcome from stage n on.

(diagrams: e ‡ n diverges from e at n; (e ‡ n) ‡ m rejoins e at m; ((e ‡ n) ‡ m) ‡ k diverges again at k)
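A sketch of the operation, encoding binary trajectories as functions from stages to outcomes (an assumed representation):

```python
def grue(e, n):
    """e 'grued' at n: agree with e before stage n, flip every outcome
    from stage n on."""
    return lambda k: e(k) if k < n else 1 - e(k)

e = lambda k: 0                  # the all-zeros trajectory
g = grue(e, 3)                   # 0 0 0 1 1 1 ...
gg = grue(grue(e, 3), 5)         # 0 0 0 1 1 0 0 ... a finite variant of e

assert [g(k) for k in range(6)] == [0, 0, 0, 1, 1, 1]
assert [gg(k) for k in range(7)] == [0, 0, 0, 1, 1, 0, 0]
```

The second grue flips the tail back, so an even number of grues always yields a finite variant of e.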
The Goodman Hierarchy

Gn(e) = the set of all trajectories you can get by gruing e up to n positions.
Gn_even(e) = the set of all trajectories you can get by gruing e an even number of distinct positions up to 2n.

G0(e) ⊆ G1(e) ⊆ G2(e) ⊆ G3(e) ⊆ ...; likewise G0_even(e) ⊆ G1_even(e) ⊆ ...
The Goodman Limit

Gω(e) = ∪n Gn(e)
Gω_even(e) = ∪n Gn_even(e)

Proposition: Gω_even(e) = the set of all finite variants of e.
The Goodman Spectrum

(flattened table; rows G0(e), G1(e), G2(e), G3(e), ..., Gω(e); columns Min, Flush, Jeffrey, Ratch, Lex, Cond. Each cell records whether the method succeeds on that class and, for the parametrized methods, the parameter required — entries such as "yes", "no", "α = 2", "α = 3", "α = n + 1" appear. The results slides below recover the highlights: Min fails from G1(e) on, Flush-to-α requires α = n + 1 on Gn(e), Ratchet succeeds with α = 2, and Lex and Cond succeed throughout.)
The Even Goodman Spectrum

(flattened table; rows G0_even(e), G1_even(e), G2_even(e), ..., Gn_even(e), ..., Gω_even(e); same method columns. Entries such as "yes", "no", "α = 0", "α = 1", "α = n + 1" appear; notably, Ratchet and Jeffrey identify Gω_even(e) with α = 1, as proved in Part VI.)
Part V:
Negative Results
Epistemic Duality

"tabula rasa" Bayesian ... "conjectures and refutations" Popperian

Epistemic Extremes

Perfect memory, no projections ... projects the future, may forget
(*J,2 is marked between the extremes)
Opposing Epistemic Pressures

Compression for memory vs. rarefaction for inductive leaps.
Identification requires both. Is there a critical value of α for which they can be balanced for a given problem K?
Methods *S,1; *M Fail on G1(e)

(the Goodman spectrum table is repeated, highlighting the G1(e) row)
Proof: Suppose otherwise.
Feed e until e is uniquely at the bottom.
By the well-ordering condition, some rivals e' and e'' agreeing with the data so far sit above e (else...).
Now feed e' forever. By stage n the picture is the same (positive order-invariance; timidity and stubbornness).
At stage n + 1, e stays at the bottom (timid and stubborn).
So e' can't travel down (definitions of the rules).
e'' doesn't rise (definitions of the rules).
Now e'' makes it to the bottom at least as soon as e'.
Method *R,1 Fails on G2(e) (with Oliver Schulte)

(the Goodman spectrum table is repeated, highlighting the G2(e) row)
Proof: Suppose otherwise.
Bring e uniquely to the bottom, say at stage k.
Start feeding a = e ‡ k.
By some stage k', a is uniquely down. So between k + 1 and k', there is a first stage j when no finite variant of e is at the bottom.
Let c in G2(e) be a finite variant of e that rises to level 1 at j. So c(j - 1) ≠ a(j - 1).
Let d be a up to j and e thereafter. So d is in G2(e). Since d differs from e, d is at least as high as level 1 at j.
Show: c agrees with e after j.
Case j = k + 1: then c could have been chosen as e, since e is uniquely at the bottom at k.
Case j > k + 1: then c wouldn't have been at the bottom if it hadn't agreed with a (disagreed with e). So c has already used up its two grues against e.
Feed c forever after. By positive invariance, the method either never projects c or forgets the refutation of c at j - 1.
The Internal Problem of Induction

Necessary condition for success by positively order-invariant methods: no data stream is a k-limit point of data streams as low as it after it has been presented for k steps.

(figures: a "bad" configuration violating the condition and a "good" one satisfying it)
Corollary: Stacking Lemma

Necessary condition for identification of Gn+1(e) by positively order-invariant methods: if e is at the bottom level after being presented up to stage k, then some data stream e' in Gn+1(e) - Gn(e) agreeing with the data so far is at least at level n + 1.

(Why? Else the limit-point condition above is violated.)
Even Stacking Lemma

Similarly for Gn+1_even(e).
Method *F,n Fails on Gn(e)

(the Goodman spectrum table is repeated, highlighting the Flush column)
Proof for α = 4: Suppose otherwise.
Bring e uniquely to the bottom.
Apply the stacking lemma: let e' be in G4(e) - G3(e) at or above level 4.
Let e'' be the same except at the first place k where e' differs from e.
Feed e' forever after.
Timidity, stubbornness and positive invariance hold the picture fixed up to k.
Ouch! Positive invariance, timidity and stubbornness, and α = 4 yield the contradiction.

Method *F,n Fails on Gn_even(e)

Same, using the even stacking lemma.
Part VI:
Positive Results
How to Program Epistemic States

Hamming distance:
Δ(e, e') = {n : e(n) ≠ e'(n)}
δ(e, e') = |Δ(e, e')|

(figure: two trajectories e, e' with δ(e, e') = 9)
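A sketch (Python; the in-general-infinite disagreement set is truncated at a finite horizon purely for illustration):

```python
def delta_set(e, f, horizon):
    """The set of stages (up to horizon) where trajectories e, f disagree."""
    return {n for n in range(horizon) if e(n) != f(n)}

def delta(e, f, horizon):
    """The Hamming distance: the number of disagreement stages."""
    return len(delta_set(e, f, horizon))

e = lambda n: 0
f = lambda n: 1 if n in (2, 5, 7) else 0
assert delta_set(e, f, 10) == {2, 5, 7}
assert delta(e, f, 10) == 3
```

The "mod e" Hamming ordering then compares disagreement sets by inclusion: a is below b mod e iff delta_set(e, a) is a subset of delta_set(e, b).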
Hamming Algebra

a ≤H b mod e ⇔ Δ(e, a) is a subset of Δ(e, b)

(the Hamming cube on three bits:)
        1 1 1
1 1 0   1 0 1   0 1 1
1 0 0   0 1 0   0 0 1
        0 0 0
Epistemic States as Boolean Ranks

(figure: the Hamming algebra rooted at e generates a ranking of Gω_even(e), with e at the bottom)
Advantage of Hamming Rank

No violations of the limit point condition over finite levels: nobody lower can match this.
*R,1, *J,1 Can Identify Gω_even(e)

(the even Goodman spectrum table is repeated)
Proof: Let S be generated by finite ranks of the Hamming algebra rooted at e.
Let a be an arbitrary element of Gω_even(e). So a is at a finite level of S.
Consider the principal ideal of a: these are the possibilities that differ from e only where a does.
So these are the possibilities that have just one difference from the truth for each level below the truth.
Induction = hypercube rotation.
(example frames: the cube rotates until a reaches the bottom; convergence)
Other possibilities have more than enough differences to climb above the truth.
*J,1 doesn't backslide since the rotation keeps refuted possibilities rising.
*R,2 is Complete

(the Goodman spectrum table is repeated)
Proposition: *R,2 is a complete function identifier.

Proof: Let K be countable. Partition K into finite variant classes C0, C1, ..., Cn, ...
Impose the Hamming distance ranking on each equivalence class.
Now raise the nth Hamming ranking by n.
(figure: columns C0, C1, C2, C3, C4 in S, each lifted successively higher)
Otherwise the construction might generate horizontal limit points.
(figure: columns C0 ... C4 at equal heights in S)
*R,2 is Complete
Data streams in different columns differ infinitely often from the truth.
*R,2 is Complete
Data streams in the same column just barely make it, because they jump by 2 for each difference from the truth
*R,2 is Complete
Convergence occurs at least by the stage when 2m differences from e have been observed for each ei below e that is not in the column of e
[Figure: classes C0–C4 with e, the variants e0, e3, e4, and the bound m]
How about *J,2?
The same thing works, right?
The Wrench in the Works
Proposition: Even *J,2 can’t succeed if we add ¬e to Geven(e) when S extends the Hamming ranking
Proof: Suppose otherwise
The Wrench in the Works
Feed ¬e until it is uniquely at the bottom
The Wrench in the Works
Let n exceed k and the original height of ¬e. Set a = ¬e ‡ n and b = ¬e ‡ n+1
The Wrench in the Works
By positive invariance, timidity and stubbornness:
The Wrench in the Works
By positive invariance, timidity, stubbornness, and the fact that ¬e was alone in the basement:
Ouch!!!
[Figure: sequences e and e′ at distance (e, e′) = 6]
Solution: A Different Initial State
Goodman Distance: δ(e, e′) = {n : e′ grues e at n}; g(e, e′) = |δ(e, e′)|
Hamming vs. Goodman Algebras
a ≤H b mod e  iff  ∆(e, a) is a subset of ∆(e, b) (Hamming difference sets)
a ≤G b mod e  iff  δ(e, a) is a subset of δ(e, b) (Goodman grue sets)
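On finite 0/1 prefixes the two set-valued distances can be sketched as follows, assuming "e′ grues e at n" means the agreement pattern between the two sequences flips at position n (a tail reversal):

```python
def diff_set(e, a):
    # Hamming difference set: positions where a disagrees with e
    return {n for n, (x, y) in enumerate(zip(e, a)) if x != y}

def grue_set(e, a):
    # Goodman set: positions where the agreement pattern flips,
    # i.e. where a starts or stops disagreeing with e (a tail reversal)
    d = [x != y for x, y in zip(e, a)]
    return {n for n in range(len(d)) if d[n] != (d[n - 1] if n else False)}

def goodman_distance(e, a):
    # g(e, a) = |grue_set(e, a)|
    return len(grue_set(e, a))
```

A single tail reversal then counts as one grue point but arbitrarily many Hamming differences, which is why the two algebras order the same sequences differently.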
[Figure: the Goodman and Hamming algebras on the three-bit sequences 000–111, each ordered by set inclusion]
Epistemic States as Boolean Ranks
[Figure: Goodman and Hamming rankings of e, with G(e), Geven(e), Godd(e)]
Epistemic States as Boolean Ranks
[Figure: the Goodman ranking of e, with levels G0(e), G1(e), G2(e), G3(e)]
*J,2 can identify G(e)
[Figure: the ranks G0(e), G1(e), G2(e), G3(e), G(e)]
[Table: performance of Min, Flush, Jeffrey, Ratch, Lex, and Cond on these ranks; surviving entries record "yes"/"no" verdicts and parameter values (= 0, = 1, = 2, = 3, = n + 1)]
*J,2 can identify G(e)
Proof: Use the Goodman ranking as initial state. Show by induction that the method projects e until a grue occurs at n, then projects e ‡ n until another grue occurs at n′, etc.
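The induction in this proof can be mimicked on finite 0/1 prefixes, reading e ‡ n as e with its tail flipped from position n onward (an assumed reading of the ‡ notation):

```python
def tail_reverse(e, n):
    # e ‡ n: flip every bit of e from position n onward
    return e[:n] + [1 - x for x in e[n:]]

def project(e, data):
    """Conjecture after seeing the prefix `data`: follow e, applying a
    tail reversal at each observed grue point, as in the proof sketch."""
    h = list(e)
    for n, x in enumerate(data):
        if x != h[n]:              # a grue is observed at stage n
            h = tail_reverse(h, n)
    return h
```

Each surprise costs exactly one tail reversal, so the conjecture always matches the data seen so far and stabilizes once the last grue point of the true stream has been observed.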
Part VII: Discussion
Summary
Belief revision as inductive inquiry
Reliability vs. intuitive symmetries
Intuitive symmetries imply reliability for large
Intuitive symmetries restrict reliability for small
Sharp discriminations among proposed methods
Isolation of fundamental epistemic dilemma
= 2 as fundamental epistemic invariant
Learning as cube rotation
Surprising relevance of tail reversals