Probabilistic Inference
Agenda
• Review of probability theory
• Joint, conditional distributions
• Independence
• Marginalization and conditioning
• Intro to Bayesian Networks
Probabilistic Belief
Consider a world where a dentist agent D meets with a new patient P
D is interested only in whether P has a cavity; so, a state is described with a single proposition – Cavity
Before observing P, D does not know if P has a cavity, but from years of practice, he believes Cavity with some probability p and ¬Cavity with probability 1 − p
The proposition is now a boolean random variable and (Cavity, p) is a probabilistic belief
Probabilistic Belief State
The world has only two possible states, which are respectively described by Cavity and ¬Cavity
The probabilistic belief state of an agent is a probability distribution over all the states that the agent thinks possible
In the dentist example, D’s belief state is:
Cavity: p    ¬Cavity: 1 − p
Multivariate Belief State
We now represent the world of the dentist D using three propositions – Cavity, Toothache, and PCatch
D’s belief state consists of 2^3 = 8 states, each with some probability:
{Cavity∧Toothache∧PCatch, ¬Cavity∧Toothache∧PCatch, Cavity∧¬Toothache∧PCatch, …}
The belief state is defined by the full joint probability of the propositions:

              Toothache           ¬Toothache
          PCatch  ¬PCatch     PCatch  ¬PCatch
Cavity     0.108   0.012       0.072   0.008
¬Cavity    0.016   0.064       0.144   0.576
Probabilistic Inference
              Toothache           ¬Toothache
          PCatch  ¬PCatch     PCatch  ¬PCatch
Cavity     0.108   0.012       0.072   0.008
¬Cavity    0.016   0.064       0.144   0.576

P(Cavity ∨ Toothache) = 0.108 + 0.012 + 0.072 + 0.008 + 0.016 + 0.064 = 0.28
Probabilistic Inference
              Toothache           ¬Toothache
          PCatch  ¬PCatch     PCatch  ¬PCatch
Cavity     0.108   0.012       0.072   0.008
¬Cavity    0.016   0.064       0.144   0.576

P(Cavity) = 0.108 + 0.012 + 0.072 + 0.008 = 0.2
Probabilistic Inference
              Toothache           ¬Toothache
          PCatch  ¬PCatch     PCatch  ¬PCatch
Cavity     0.108   0.012       0.072   0.008
¬Cavity    0.016   0.064       0.144   0.576

Marginalization: P(c) = Σt Σpc P(c∧t∧pc), using the conventions that c = Cavity or ¬Cavity and that Σt is the sum over t ∈ {Toothache, ¬Toothache}
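A quick sketch of this computation in Python, with the joint stored as a dictionary from (cavity, toothache, pcatch) truth values to probabilities; the variable names are mine, not from the slides:

    # Full joint distribution from the table above.
    joint = {
        (True,  True,  True):  0.108, (True,  True,  False): 0.012,
        (True,  False, True):  0.072, (True,  False, False): 0.008,
        (False, True,  True):  0.016, (False, True,  False): 0.064,
        (False, False, True):  0.144, (False, False, False): 0.576,
    }

    # Marginalization: P(Cavity) = Σt Σpc P(Cavity∧t∧pc)
    p_cavity = sum(q for (c, t, pc), q in joint.items() if c)
    print(p_cavity)   # ≈ 0.2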
Kolmogorov’s Probability Axioms
• 0 ≤ P(a) ≤ 1
• P(true) = 1, P(false) = 0
• P(a ∨ b) = P(a) + P(b) − P(a ∧ b)
Decoding Probability Notation
• Capital letters A,B,C denote random variables
• Each random variable X can take one of a set of possible values x ∈ Val(X)
– Confusingly, this is often left implicit
• Probabilities are defined over logical statements, e.g., P(X=x)
– Confusingly, this is often written P(x)…
Decoding Probability Notation
• Interpretation of P(A) depends on whether A is treated as a logical statement or as a random variable
• As a logical statement
– Just the probability that A is true
– P(¬A) = 1 − P(A)
• As a random variable
– Denotes both P(A is true) and P(A is false)
– For non-boolean variables, denotes P(A=a) for all a ∈ Val(A)
• Computation requires marginalization
– Belief space A, B, C, D, … often left implicit
– Marginalize the joint distribution over all combinations of B, C, D… (event space)
Decoding Probability Notation
• How to interpret P(A∧B) = P(A,B)?
• As a logical statement
– Probability that both A and B are true
• As random variables
– {P(A=0,B=0), P(A=1,B=0), P(A=0,B=1), P(A=1,B=1)}
– P(A=a,B=b) for all a ∈ Val(A), b ∈ Val(B) in general
• Equal to P(A)P(B)? No!!! (except under very special conditions)
Quiz: What does this mean?
P(A∨B) = P(A) + P(B) − P(A∧B)

P(A=a ∨ B=b) = P(A=a) + P(B=b) − P(A=a ∧ B=b)
for all a ∈ Val(A) and b ∈ Val(B)
Marginalization
• Express P(A,B) in terms of P(A,B,C)
• If C is boolean,
– P(A,B) = P(A,B,C=0) + P(A,B,C=1)
• In general
– P(A,B) = Σc∈Val(C) P(A,B,C=c)
Conditional Probability
P(A∧B) = P(A|B) P(B) = P(B|A) P(A)

P(A|B) is the posterior probability of A given B

Axiomatic definition: P(A|B) = P(A,B)/P(B)
              Toothache           ¬Toothache
          PCatch  ¬PCatch     PCatch  ¬PCatch
Cavity     0.108   0.012       0.072   0.008
¬Cavity    0.016   0.064       0.144   0.576

P(Cavity|Toothache) = P(Cavity∧Toothache)/P(Toothache) = (0.108 + 0.012)/(0.108 + 0.012 + 0.016 + 0.064) = 0.6
Interpretation: After observing Toothache, the patient is no longer an “average” one, and the prior probability (0.2) of Cavity is no longer valid
P(Cavity|Toothache) is calculated by keeping the ratios of the probabilities of the 4 cases unchanged, and normalizing their sum to 1
              Toothache           ¬Toothache
          PCatch  ¬PCatch     PCatch  ¬PCatch
Cavity     0.108   0.012       0.072   0.008
¬Cavity    0.016   0.064       0.144   0.576

P(Cavity|Toothache) = P(Cavity∧Toothache)/P(Toothache) = (0.108 + 0.012)/(0.108 + 0.012 + 0.016 + 0.064) = 0.6
P(¬Cavity|Toothache) = P(¬Cavity∧Toothache)/P(Toothache) = (0.016 + 0.064)/(0.108 + 0.012 + 0.016 + 0.064) = 0.4

P(c|Toothache) = (P(Cavity|Toothache), P(¬Cavity|Toothache)) = α P(c ∧ Toothache)
= α Σpc P(c ∧ Toothache ∧ pc) = α [(0.108, 0.016) + (0.012, 0.064)] = α (0.12, 0.08) = (0.6, 0.4)
where α is the normalization constant
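The same conditioning-by-normalization as a minimal sketch, reusing the joint dictionary from the earlier example:

    # Keep only the worlds where Toothache holds, then renormalize.
    toothache_worlds = {k: q for k, q in joint.items() if k[1]}
    p_toothache = sum(toothache_worlds.values())   # ≈ 0.2
    p_cavity_given_t = sum(q for (c, t, pc), q in toothache_worlds.items() if c) / p_toothache
    print(p_cavity_given_t)   # ≈ 0.6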
Conditioning
• Express P(A) in terms of P(A|B), P(B)
• If B is boolean,
– P(A) = P(A|B=0) P(B=0) + P(A|B=1) P(B=1)
• In general
– P(A) = Σb∈Val(B) P(A|B=b) P(B=b)
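A minimal sketch of conditioning over the same joint dictionary; the helper p is an assumed convenience, not from the slides:

    # P of the set of worlds satisfying a predicate over (cavity, toothache, pcatch).
    def p(pred):
        return sum(q for k, q in joint.items() if pred(k))

    # Conditioning on B = Toothache: P(Cavity) = Σb P(Cavity|B=b) P(B=b)
    p_t = p(lambda k: k[1])
    p_c_given_t     = p(lambda k: k[0] and k[1]) / p_t
    p_c_given_not_t = p(lambda k: k[0] and not k[1]) / (1 - p_t)
    print(p_c_given_t * p_t + p_c_given_not_t * (1 - p_t))   # ≈ 0.2 = P(Cavity)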
Conditional Probability
P(A∧B) = P(A|B) P(B) = P(B|A) P(A)

P(A∧B∧C) = P(A|B,C) P(B∧C) = P(A|B,C) P(B|C) P(C)

P(Cavity) = Σt Σpc P(Cavity∧t∧pc)
          = Σt Σpc P(Cavity|t,pc) P(t∧pc)
Independence
Two random variables A and B are independent if
P(A∧B) = P(A) P(B), hence if P(A|B) = P(A)

Two random variables A and B are independent given C, if
P(A∧B|C) = P(A|C) P(B|C), hence if P(A|B,C) = P(A|C)
Updating the Belief State
              Toothache           ¬Toothache
          PCatch  ¬PCatch     PCatch  ¬PCatch
Cavity     0.108   0.012       0.072   0.008
¬Cavity    0.016   0.064       0.144   0.576
Let D now observe Toothache with probability 0.8 (e.g., “the patient says so”)
How should D update its belief state?
Updating the Belief State
              Toothache           ¬Toothache
          PCatch  ¬PCatch     PCatch  ¬PCatch
Cavity     0.108   0.012       0.072   0.008
¬Cavity    0.016   0.064       0.144   0.576

Let E be the evidence such that P(Toothache|E) = 0.8
We want to compute P(c∧t∧pc|E) = P(c∧pc|t,E) P(t|E)
Since E is not directly related to the cavity or the probe catch, we consider that c and pc are independent of E given t, hence: P(c∧pc|t,E) = P(c∧pc|t)
To get the 4 probabilities in the Toothache columns, we normalize their sum to 0.8; to get the 4 probabilities in the ¬Toothache columns, we normalize their sum to 0.2
Updating the Belief State
              Toothache           ¬Toothache
          PCatch  ¬PCatch     PCatch  ¬PCatch
Cavity     0.108   0.012       0.072   0.008
¬Cavity    0.016   0.064       0.144   0.576

Updated belief state:

              Toothache           ¬Toothache
          PCatch  ¬PCatch     PCatch  ¬PCatch
Cavity     0.432   0.048       0.018   0.002
¬Cavity    0.064   0.256       0.036   0.144
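A minimal sketch of this soft-evidence update, reusing the joint dictionary and the p helper from the earlier sketches:

    # Scale each world by P(t|E)/P(t) for its value of t,
    # with P(Toothache|E) = 0.8 and prior P(Toothache) = 0.2.
    p_t = p(lambda k: k[1])
    updated = {k: q * (0.8 / p_t if k[1] else 0.2 / (1 - p_t))
               for k, q in joint.items()}
    print(updated[(True, True, True)])   # ≈ 0.432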
Issues
If a state is described by n propositions, then a belief state contains 2^n states (possibly, some have probability 0)
Modeling difficulty: many numbers must be entered in the first place
Computational issue: memory size and time
Bayes’ Rule and other Probability Manipulations
P(A∧B) = P(A|B) P(B) = P(B|A) P(A)
P(A|B) = P(B|A) P(A) / P(B)

Can manipulate distributions, e.g.:
P(B) = Σa P(B|A=a) P(A=a)
Can derive P(A|B), P(B) using only P(B|A) and P(A)
So, if any variables are conditionally independent, we should be able to decompose joint distribution to take advantage of it
Toothache and PCatch are independent given Cavity (or ¬Cavity), but this relation is hidden in the numbers! [Verify this]
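A quick numerical check of the “[Verify this]” exercise, reusing joint and p from the earlier sketches:

    import math

    # Verify P(t∧pc|c) = P(t|c) P(pc|c) for both values of Cavity.
    for c in (True, False):
        p_c      = p(lambda k: k[0] == c)
        p_t_c    = p(lambda k: k[0] == c and k[1]) / p_c
        p_pc_c   = p(lambda k: k[0] == c and k[2]) / p_c
        p_t_pc_c = p(lambda k: k[0] == c and k[1] and k[2]) / p_c
        assert math.isclose(p_t_pc_c, p_t_c * p_pc_c)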
Bayesian networks explicitly represent independence among propositions to reduce the number of probabilities defining a belief state
              Toothache           ¬Toothache
          PCatch  ¬PCatch     PCatch  ¬PCatch
Cavity     0.108   0.012       0.072   0.008
¬Cavity    0.016   0.064       0.144   0.576
Bayesian Network
Notice that Cavity is the “cause” of both Toothache and PCatch, and represent the causality links explicitly
Give the prior probability distribution of Cavity
Give the conditional probability tables of Toothache and PCatch
Cavity → Toothache
Cavity → PCatch

P(Cavity) = 0.2

P(Toothache|c):  Cavity 0.6   ¬Cavity 0.1
P(PCatch|c):     Cavity 0.9   ¬Cavity 0.2

5 probabilities, instead of 7
P(c∧t∧pc) = P(t∧pc|c) P(c) = P(t|c) P(pc|c) P(c)
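A minimal sketch checking this factorization against the joint dictionary from earlier (the CPT variable names are mine):

    import math

    p_cav  = 0.2
    cpt_t  = {True: 0.6, False: 0.1}   # P(Toothache=true | Cavity=c)
    cpt_pc = {True: 0.9, False: 0.2}   # P(PCatch=true | Cavity=c)

    for (c, t, pc), q in joint.items():
        prior = p_cav     if c  else 1 - p_cav
        pt    = cpt_t[c]  if t  else 1 - cpt_t[c]
        ppc   = cpt_pc[c] if pc else 1 - cpt_pc[c]
        assert math.isclose(pt * ppc * prior, q)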
Conditional Probability Tables
Cavity → Toothache
Cavity → PCatch

P(Cavity) = 0.2
P(Toothache|c):  Cavity 0.6   ¬Cavity 0.1
P(PCatch|c):     Cavity 0.9   ¬Cavity 0.2

P(c∧t∧pc) = P(t∧pc|c) P(c) = P(t|c) P(pc|c) P(c)
           P(t|c)   P(¬t|c)
Cavity      0.6      0.4
¬Cavity     0.1      0.9

Rows sum to 1
If X takes n values, just store n−1 entries
Naïve Bayes Models
P(Cause,Effect1,…,Effectn) = P(Cause) Πi P(Effecti | Cause)

Cause → Effect1, Effect2, …, Effectn
Naïve Bayes Classifier
P(Class,Feature1,…,Featuren) = P(Class) Πi P(Featurei | Class)

Class → Feature1, Feature2, …, Featuren

P(C|F1,…,Fk) = P(C,F1,…,Fk)/P(F1,…,Fk) = 1/Z P(C) Πi P(Fi|C)
Given features, what class?
Classes: Spam / Not Spam; English / French / Latin; …
Features: word occurrences
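A minimal naive Bayes classifier sketch; the class priors and per-word likelihoods below are made-up toy numbers, not from the slides:

    # P(C|F1,…,Fk) = 1/Z P(C) Πi P(Fi|C)
    p_class = {"spam": 0.4, "ham": 0.6}
    p_word_given_class = {
        "spam": {"offer": 0.30, "meeting": 0.02},
        "ham":  {"offer": 0.05, "meeting": 0.20},
    }

    def posterior(words):
        scores = dict(p_class)                    # start from the priors P(C)
        for c in scores:
            for w in words:
                scores[c] *= p_word_given_class[c][w]
        z = sum(scores.values())                  # normalization constant Z
        return {c: s / z for c, s in scores.items()}

    print(posterior(["offer"]))   # spam ≈ 0.8, ham ≈ 0.2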
Equations Involving Random Variables
• C = A ∨ B
• C = max(A,B)
• Constrains joint probability P(A,B,C)
• Nicely encoded as a causality relationship

A → C ← B

Conditional probability given by an equation rather than a CPT
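A minimal sketch: the “CPT” of a deterministic node defined by C = A ∨ B collapses to an indicator function.

    # P(C=c | A=a, B=b) for the equation C = A or B:
    # probability 1 when the equation holds, 0 otherwise.
    def p_c_given_ab(c, a, b):
        return 1.0 if c == (a or b) else 0.0

    print(p_c_given_ab(True, True, False))    # 1.0
    print(p_c_given_ab(False, True, False))   # 0.0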
A More Complex BN
Burglary → Alarm ← Earthquake
Alarm → JohnCalls
Alarm → MaryCalls
(causes on top, effects below)

Directed acyclic graph
Intuitive meaning of an arc from x to y: “x has direct influence on y”
P(B) = 0.001    P(E) = 0.002

B  E  P(A|B,E)
T  T  0.95
T  F  0.94
F  T  0.29
F  F  0.001

A  P(J|A)        A  P(M|A)
T  0.90          T  0.70
F  0.05          F  0.01

Size of the CPT for a node with k parents: 2^k
A More Complex BN
10 probabilities, instead of 31
What does the BN encode?
Each of the beliefs JohnCalls and MaryCalls is independent of Burglary and Earthquake given Alarm or ¬Alarm
For example, John does not observe any burglaries directly

P(b∧j) ≠ P(b) P(j)
P(b∧j|a) = P(b|a) P(j|a)
The beliefs JohnCalls and MaryCalls are independent given Alarm or ¬Alarm
For instance, the reasons why John and Mary may not call if there is an alarm are unrelated

P(b∧j|a) = P(b|a) P(j|a)      P(j∧m|a) = P(j|a) P(m|a)
P(b∧j|¬a) = P(b|¬a) P(j|¬a)   P(j∧m|¬a) = P(j|¬a) P(m|¬a)

A node is independent of its non-descendants given its parents

Burglary and Earthquake are independent
Locally Structured World
A world is locally structured (or sparse) if each of its components interacts directly with relatively few other components
In a sparse world, the CPTs are small and the BN contains far fewer probabilities than the full joint distribution
If the # of entries in each CPT is bounded by a constant, i.e., O(1), then the # of probabilities in a BN is linear in n – the # of propositions – instead of 2^n for the joint distribution
But does a BN represent a belief state? In other words, can we compute the full joint distribution of the propositions from it?
Calculation of Joint Probability
Burglary → Alarm ← Earthquake;  Alarm → JohnCalls, Alarm → MaryCalls

P(B) = 0.001    P(E) = 0.002

B  E  P(A|B,E)
T  T  0.95
T  F  0.94
F  T  0.29
F  F  0.001

A  P(J|A)        A  P(M|A)
T  0.90          T  0.70
F  0.05          F  0.01

P(j∧m∧a∧¬b∧¬e) = ??
P(j∧m∧a∧¬b∧¬e)
= P(j∧m|a,¬b,¬e) P(a∧¬b∧¬e)
= P(j|a,¬b,¬e) P(m|a,¬b,¬e) P(a∧¬b∧¬e)    (J and M are independent given A)

P(j|a,¬b,¬e) = P(j|a)    (J and B, and J and E, are independent given A)
P(m|a,¬b,¬e) = P(m|a)

P(a∧¬b∧¬e) = P(a|¬b,¬e) P(¬b|¬e) P(¬e) = P(a|¬b,¬e) P(¬b) P(¬e)    (B and E are independent)

P(j∧m∧a∧¬b∧¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e)
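The same product as straight-line arithmetic, a minimal sketch using the CPT values from the tables above:

    # P(j∧m∧a∧¬b∧¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e)
    p_j_a, p_m_a    = 0.90, 0.70
    p_a_given_nb_ne = 0.001
    p_nb, p_ne      = 1 - 0.001, 1 - 0.002
    print(p_j_a * p_m_a * p_a_given_nb_ne * p_nb * p_ne)   # ≈ 0.00062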
Calculation of Joint Probability
Burglary → Alarm ← Earthquake;  Alarm → JohnCalls, Alarm → MaryCalls

P(B) = 0.001    P(E) = 0.002

B  E  P(A|B,E)
T  T  0.95
T  F  0.94
F  T  0.29
F  F  0.001

A  P(J|A)        A  P(M|A)
T  0.90          T  0.70
F  0.05          F  0.01

P(j∧m∧a∧¬b∧¬e) = P(j|a) P(m|a) P(a|¬b,¬e) P(¬b) P(¬e) = 0.9 × 0.7 × 0.001 × 0.999 × 0.998 ≈ 0.00062
P(x1∧x2∧…∧xn) = Πi=1,…,n P(xi|parents(Xi))
(this product defines the full joint distribution table)
Since a BN defines the full joint distribution of a set of propositions, it represents a belief state
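A generic sketch of this product rule; the dictionary encoding of a network (node → (parents, P(node=true | parent values))) is my own assumption, filled with the burglary-network numbers from the tables above:

    bn = {
        "B": ((), {(): 0.001}),
        "E": ((), {(): 0.002}),
        "A": (("B", "E"), {(True, True): 0.95, (True, False): 0.94,
                           (False, True): 0.29, (False, False): 0.001}),
        "J": (("A",), {(True,): 0.90, (False,): 0.05}),
        "M": (("A",), {(True,): 0.70, (False,): 0.01}),
    }

    def joint_prob(x):
        # P(x1∧…∧xn) = Πi P(xi | parents(Xi)) for a full assignment x
        prob = 1.0
        for var, (parents, cpt) in bn.items():
            p_true = cpt[tuple(x[par] for par in parents)]
            prob *= p_true if x[var] else 1 - p_true
        return prob

    print(joint_prob({"B": False, "E": False, "A": True, "J": True, "M": True}))
    # ≈ 0.00062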
The BN gives P(t|c). What about P(c|t)?

P(Cavity|t) = P(Cavity∧t)/P(t) = P(t|Cavity) P(Cavity) / P(t)    [Bayes’ rule]

P(c|t) = α P(t|c) P(c)

Querying a BN is just applying the trivial Bayes’ rule on a larger scale… algorithms next time
Querying the BN
Cavity → Toothache

P(C) = 0.1

C  P(T|C)
T  0.4
F  0.01111
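A minimal sketch of this query on the small network above, computing P(c|t) = α P(t|c) P(c):

    p_c = 0.1
    p_t_given_c = {True: 0.4, False: 0.01111}
    unnorm = {c: p_t_given_c[c] * (p_c if c else 1 - p_c) for c in (True, False)}
    z = sum(unnorm.values())    # = P(t), so the normalization constant α = 1/z
    print(unnorm[True] / z)     # P(Cavity|Toothache) ≈ 0.8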
More Complicated Singly-Connected Belief Net
(diagram: a singly-connected network over Battery, Radio, SparkPlugs, Gas, Starts, Moves)
Some Applications of BN
Medical diagnosis
Troubleshooting of hardware/software systems
Fraud/uncollectible debt detection
Data mining
Analysis of genetic sequences
Data interpretation, computer vision, image understanding

(figure: image regions R1–R4 with Region = {Sky, Tree, Grass, Rock} and an “Above” relation)
BN to evaluate insurance risks
Purposes of Bayesian Networks
Efficient and intuitive modeling of complex causal interactions
Compact representation of joint distributions: O(n) rather than O(2^n)
Algorithms for efficient inference with new evidence (more on this next time)
Reminder
• HW4 due on Thursday!