Assignment 3 Question 2: Sample Solution
Prior:
  P(C)  = 0.60
  P(~C) = 0.40
Joint Model:
  P(~A,~B|C) = 0.33    P(~A,~B|~C) = 0.25
  P(A,~B|C)  = 0.33    P(A,~B|~C)  = 0.00
  P(~A,B|C)  = 0.00    P(~A,B|~C)  = 0.50
  P(A,B|C)   = 0.33    P(A,B|~C)   = 0.25
Naïve Model:
  P(A|C)  = 0.67    P(A|~C)  = 0.25
  P(~A|C) = 0.33    P(~A|~C) = 0.75
  P(B|C)  = 0.33    P(B|~C)  = 0.75
  P(~B|C) = 0.67    P(~B|~C) = 0.25
Here, I'm pre-computing the values used in the rows below, for convenience. They are computed using the table of numbers above; for example, P(A,B|C) = P(A|C) * P(B|C). We compute the probabilities this way because we are making the "naïve" assumption: A and B are independent given C.

Naïve Model (pre-computed joint conditionals):
  P(~A,~B|C) = 0.2222    P(~A,~B|~C) = 0.1875
  P(A,~B|C)  = 0.4444    P(A,~B|~C)  = 0.0625
  P(~A,B|C)  = 0.1111    P(~A,B|~C)  = 0.5625
  P(A,B|C)   = 0.2222    P(A,B|~C)   = 0.1875
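As a sanity check (not part of the original solution), here is a minimal Python sketch that derives these pre-computed values from the per-feature conditionals above. The names p_a_given, p_b_given and naive_joint are my own, and exact thirds are used where the tables show the rounded values 0.33 and 0.67.

    # Per-feature conditionals from the Naive Model table above.
    # Dictionary keys are the value of C (True = C, False = ~C).
    p_a_given = {True: 2/3, False: 1/4}   # P(A=1|C), P(A=1|~C)
    p_b_given = {True: 1/3, False: 3/4}   # P(B=1|C), P(B=1|~C)

    def naive_joint(a, b, c):
        # P(A=a, B=b | C=c) under the naive assumption:
        # the product of the two per-feature conditionals.
        pa = p_a_given[c] if a else 1 - p_a_given[c]
        pb = p_b_given[c] if b else 1 - p_b_given[c]
        return pa * pb

    print(round(naive_joint(True, True, True), 4))    # 0.2222 = P(A,B|C)
    print(round(naive_joint(True, False, False), 4))  # 0.0625 = P(A,~B|~C)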
Now, we use Bayes Rule to compute P(C|A,B). For example,

  P(C|A,B) = P(A,B|C) * P(C) / P(A,B)

where P(A,B|C) will come from either the Joint model or from the Naïve model. To compute P(A,B), we will use the summation formula given on slides 35 and 36 of the Probabilities and Bayes Classifiers slides. Specifically, we will use

  P(C|A,B) = P(A,B|C) * P(C) / (P(A,B|C) * P(C) + P(A,B|~C) * P(~C))
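Continuing the sketch above (and reusing its naive_joint helper), this is one way the summation formula could be coded. The names posterior_c and joint are hypothetical, and the Joint Model's 0.33 entries are treated as exact thirds so the results agree with the table below.

    # Priors from the problem statement.
    p_c = {True: 0.6, False: 0.4}

    def posterior_c(likelihood, a, b):
        # P(C=1 | A=a, B=b), with P(A,B) expanded by summing over C.
        num = likelihood(a, b, True) * p_c[True]
        den = num + likelihood(a, b, False) * p_c[False]
        return num / den

    def joint(a, b, c):
        # Likelihoods read straight off the Joint Model table:
        # (a, b) -> (P(. | C), P(. | ~C)).
        table = {
            (False, False): (1/3, 0.25),
            (True,  False): (1/3, 0.00),
            (False, True):  (0.0, 0.50),
            (True,  True):  (1/3, 0.25),
        }
        return table[(a, b)][0 if c else 1]

    print(round(posterior_c(joint, True, False), 4))        # 1.0
    print(round(posterior_c(naive_joint, True, False), 4))  # 0.9143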
Now we can produce a table of probabilities computed from each of the Joint and Naïve models. Notice how the probabilities computed from the Naïve model are less extreme.

  Feature values    Probabilities for C=1
  A   B             P(C=1|Joint)   P(C=1|Naïve)
  0   0             0.6667         0.6400
  1   0             1.0000         0.9143
  0   1             0.0000         0.2286
  1   1             0.6667         0.6400
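For completeness, the same sketch can regenerate the whole comparison table; this loop is illustrative only and reuses the helpers defined above.

    # Regenerate the comparison table from the helpers above.
    print("A  B  P(C=1|Joint)  P(C=1|Naive)")
    for a, b in [(0, 0), (1, 0), (0, 1), (1, 1)]:
        pj = posterior_c(joint, bool(a), bool(b))
        pn = posterior_c(naive_joint, bool(a), bool(b))
        print(f"{a}  {b}  {pj:.4f}        {pn:.4f}")

The printed naïve posteriors (0.6400, 0.9143, 0.2286, 0.6400) sit closer to the prior P(C) = 0.60 than the joint-model posteriors, which is the "less extreme" behaviour noted above.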