CS 416 Artificial Intelligence, Lecture 9: Logical Agents (Chapter 7)


Page 1

CS 416 Artificial Intelligence

Lecture 9

Logical Agents

Chapter 7

Page 2

Chess Article

Garry Kasparov reflects on computerized chess

• IBM should have released the contents of Deep Blue to the chess community to advance research on computation as it relates to chess

• Kudos to Deep Junior for putting information in the public domain so the state of the art can advance

• Deep Blue made one good move that surprised Kasparov (though he thinks a person was in the loop)

• Deep Junior made a fantastic sacrifice that reflects a new accomplishment for computerized chess

http://www.opinionjournal.com/extra/?id=110003081

Page 3

Where are we?

We’ve studied classes of search problems

• Find the best sequence of actions

– Uninformed search (BFS, Iterative Deepening)

– Informed search (A*)

• Find the best value of something (possibly a sequence)

– Simulated annealing, genetic algorithms, hill climbing

• Find the best action in an adversarial setting

Page 4

Logical Agents

What are we talking about when we say “logical”?

• Aren’t search-based chess programs logical?

– Yes, but knowledge is used in a very specific way

Win the game

Not useful for extracting strategies or understanding other aspects of chess

• We want to develop more general-purpose knowledge systems that support a variety of logical analyses

Page 5

Why study knowledge-based agents?

Partially observable environments

• combine available information (percepts) with general knowledge to select actions

Natural Language

• Language is too complex and ambiguous. Problem-solving agents are impeded by the high branching factor.

Flexibility

• Knowledge can be reused for novel tasks. New knowledge can be added to improve future performance.

Page 6

Components of a knowledge-based agent

Knowledge Base

• Store information

– knowledge representation language

• Add information (Tell)

• Retrieve information (Ask)

• Perform inference

– derive new sentences (knowledge) from existing sentences
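As a rough sketch of this Tell/Ask interface (illustrative Python, not the course's code; the class and function names are assumptions for the example):

class KnowledgeBase:
    def __init__(self, entails):
        self.sentences = []      # sentences in some knowledge representation language
        self.entails = entails   # inference routine: entails(sentences, query) -> bool

    def tell(self, sentence):
        """Add information to the KB."""
        self.sentences.append(sentence)

    def ask(self, query):
        """Retrieve information: does the KB entail the query?"""
        return self.entails(self.sentences, query)

def kb_agent_step(kb, percept, percept_sentence, candidate_actions):
    """One Tell/Ask cycle: record the percept, then pick an action the KB supports."""
    kb.tell(percept_sentence(percept))
    for action in candidate_actions:
        if kb.ask(f"Good({action})"):    # hypothetical query form
            kb.tell(f"Did({action})")
            return action
    return None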

Page 7

The wumpus world

A scary world, indeed

• A maze in a cave

• A wumpus who will eat you

• One arrow that can kill the wumpus

• Pits that can entrap you (but not the wumpus, for it is too large to fall in)

• A heap of gold somewhere

Page 8

But you have sensing and action

Sensing (each percept is either on or off – a single bit)

• the wumpus emits a stench in adjacent squares

• pits cause a breeze in adjacent squares

• gold causes a glitter you see when in its square

• walking into a wall causes a bump

• the death of the wumpus can be heard everywhere in the world
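Since every percept is a single bit, one convenient representation is a record of five booleans (an illustrative sketch, not prescribed by the course):

from dataclasses import dataclass

@dataclass(frozen=True)
class Percept:
    stench: bool    # wumpus in an adjacent square
    breeze: bool    # pit in an adjacent square
    glitter: bool   # gold in the current square
    bump: bool      # just walked into a wall
    scream: bool    # the wumpus has died (audible everywhere)

# Example: a square with a breeze and nothing else.
p = Percept(stench=False, breeze=True, glitter=False, bump=False, scream=False)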

Page 9

But you have sensing and action

Action

• You can turn left or right 90 degrees

• You can move forward

• You can shoot an arrow in your facing direction

Page 10

An example

Page 11

An example

Page 12

Our agent played well

• Used inference to relate two different percepts observed from different locations

• Agent is guaranteed to draw correct conclusions if percepts are correct

Page 13

Knowledge Representation

Must be syntactically and semantically correct

Syntax

• the formal specification of how information is stored

– a + 2 = c (typical mathematical syntax)

– a2y += (not legal syntax for infix (regular math) notation)

Semantics

• the meaning of the information

– a + 2 = c (c is a number whose value is 2 more than a)

– a + 2 = c (the symbol that comes two after ‘a’ in the alphabet is ‘c’)

Page 14

Logical Reasoning

Entailment

• one sentence follows logically from another

– the sentence α entails the sentence β

• α entails β (α ⊨ β) if and only if, for every model in which α is true, β is also true

Model: a description of the world where every relevant sentence has been assigned truth or falsehood
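Read literally, this definition suggests a brute-force test: enumerate every model and check that β holds whenever α does. A minimal sketch in Python (names are illustrative; a model is a dict from proposition symbols to truth values, and sentences are functions from a model to True/False):

from itertools import product

def entails(alpha, beta, symbols):
    """alpha entails beta iff beta is true in every model in which alpha is true."""
    for values in product([True, False], repeat=len(symbols)):
        model = dict(zip(symbols, values))
        if alpha(model) and not beta(model):
            return False      # a model where alpha holds but beta does not
    return True

# Example: (P ^ Q) entails P, but P does not entail (P ^ Q).
p_and_q = lambda m: m["P"] and m["Q"]
p = lambda m: m["P"]
print(entails(p_and_q, p, ["P", "Q"]))   # True
print(entails(p, p_and_q, ["P", "Q"]))   # False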

Page 15

An example

After one step in the wumpus world

• Knowledge base (KB) is

– a set of all game states that are possible given:

rules of the game

percepts

• How does the KB represent the game?

Page 16

Building the KB

Consider a KB intended to represent the presence of pits in a Wumpus world where [1,1] is clear and [2,1] has a breeze

• There are three cells with two conditions each

• 2^3 = eight possible models

• According to percepts and rules, the KB is well defined

Page 17

Model Checking

The agent wishes to check all models of the game in which a pit is in the three candidate spots

• Enumerate all models where the three candidate spots may have pits

• 3 candidate squares, two conditions each

• 2^3 = eight models

Page 18

Checking entailment

Can “α1: There is no pit in [1,2]” be true?

• Enumerate all states where α1 is true

• For all models where the KB is true, α1 is true also

• The KB entails α1

Page 19

Checking entailment

Can “α2: There is no pit in [2,2]” be true?

• Enumerate all states where α2 is true

• For all models where the KB is true, α2 is not always true

• The KB does not entail α2
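A hedged sketch of exactly this check over the eight pit models from the last few slides (variable names are illustrative; P12 means "pit in [1,2]", and so on):

from itertools import product

cells = ["P12", "P22", "P31"]    # the three candidate pit squares

def kb(m):
    """No breeze in [1,1], breeze in [2,1]; [1,1] and [2,1] are known pit-free,
    so the breeze rules reduce to the two conditions below."""
    no_breeze_11 = not m["P12"]              # ~B1,1: no pit adjacent to [1,1]
    breeze_21 = m["P22"] or m["P31"]         #  B2,1: a pit adjacent to [2,1]
    return no_breeze_11 and breeze_21

alpha1 = lambda m: not m["P12"]   # "there is no pit in [1,2]"
alpha2 = lambda m: not m["P22"]   # "there is no pit in [2,2]"

models = [dict(zip(cells, vals)) for vals in product([True, False], repeat=3)]
kb_models = [m for m in models if kb(m)]      # 3 of the 8 models satisfy the KB

print(all(alpha1(m) for m in kb_models))      # True:  the KB entails alpha1
print(all(alpha2(m) for m in kb_models))      # False: the KB does not entail alpha2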

Page 20

Logical inference

Entailment permitted logical reasoning

• we inferred new knowledge from entailments

Inference algorithms

• The method of logical inference we demonstrated is called model checking because we enumerated all possibilities to find the inference

Page 21

Inference algorithms

There are many inference algorithms

• If an inference algorithm i can derive α from KB, we write KB ⊢i α

Page 22

Inference Algorithms

Sound

• The algorithm derives only entailed sentences

• An unsound algorithm derives falsehoods

Complete

• The inference algorithm can derive any sentence that is entailed

– Does this mean the inference algorithm cannot become caught in an infinite loop?

Page 23

Propositional (Boolean) Logic

Syntax of allowable sentences

• atomic sentences

– indivisible syntactic elements

– Use uppercase letters to represent a proposition that can be true or false

– True and False are predefined propositions where True means always true and False means always false

Page 24

Atomic sentences

Syntax of atomic sentences

• indivisible syntactic elements

• Use uppercase letters to represent a proposition that can be true or false

• True and False are predefined propositions where True means always true and False means always false

Page 25

Complex sentences

Formed from atomic sentences using connectives

• ~ (not): the negation

• ^ (and): the conjunction

• V (or): the disjunction

• => (implies): the implication

• <=> (if and only if): the biconditional
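One lightweight way to hold such sentences in a program (an illustrative sketch with assumed class names, not the course's implementation) is one small class per connective:

from dataclasses import dataclass
from typing import Union

@dataclass(frozen=True)
class Symbol:
    name: str

@dataclass(frozen=True)
class Not:
    arg: "Sentence"

@dataclass(frozen=True)
class And:
    left: "Sentence"
    right: "Sentence"

@dataclass(frozen=True)
class Or:
    left: "Sentence"
    right: "Sentence"

@dataclass(frozen=True)
class Implies:
    left: "Sentence"
    right: "Sentence"

@dataclass(frozen=True)
class Iff:
    left: "Sentence"
    right: "Sentence"

Sentence = Union[Symbol, Not, And, Or, Implies, Iff]

# Example: B1,1 <=> (P1,2 V P2,1)
B11, P12, P21 = Symbol("B11"), Symbol("P12"), Symbol("P21")
rule = Iff(B11, Or(P12, P21))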

Page 26

Backus-Naur Form (BNF)
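In Backus-Naur Form, the grammar of propositional-logic sentences (written with the ASCII connectives above) is roughly:

Sentence        ->  AtomicSentence | ComplexSentence
AtomicSentence  ->  True | False | Symbol
Symbol          ->  P | Q | R | ...
ComplexSentence ->  ~ Sentence
                 |  ( Sentence ^ Sentence )
                 |  ( Sentence V Sentence )
                 |  ( Sentence => Sentence )
                 |  ( Sentence <=> Sentence )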

Page 27

Propositional (Boolean) Logic

Semantics

• given a particular model (situation), what are the rules that determine the truth of a sentence?

• use a truth table to compute the value of any sentence with respect to a model by recursive evaluation
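Continuing the sentence classes sketched after Page 25 (this assumes those definitions), recursive evaluation against a model is only a few lines; each branch encodes one connective's truth-table semantics:

def pl_true(s, model):
    """Truth value of sentence s in the given model (dict: symbol name -> bool)."""
    if isinstance(s, Symbol):
        return model[s.name]
    if isinstance(s, Not):
        return not pl_true(s.arg, model)
    if isinstance(s, And):
        return pl_true(s.left, model) and pl_true(s.right, model)
    if isinstance(s, Or):
        return pl_true(s.left, model) or pl_true(s.right, model)
    if isinstance(s, Implies):
        return (not pl_true(s.left, model)) or pl_true(s.right, model)
    if isinstance(s, Iff):
        return pl_true(s.left, model) == pl_true(s.right, model)
    raise ValueError(f"unknown sentence type: {s!r}")

# Example: B1,1 <=> (P1,2 V P2,1), evaluated where there is a breeze and a pit in [2,1].
print(pl_true(rule, {"B11": True, "P12": False, "P21": True}))   # True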

Page 28

Truth table
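For reference, the standard truth table for the five connectives (the recursive evaluation above just reads off the appropriate row):

P      Q      ~P     P ^ Q   P V Q   P => Q   P <=> Q
false  false  true   false   false   true     true
false  true   true   false   true    true     false
true   false  false  false   true    false    false
true   true   false  true    true    true     true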

Page 29

Example from wumpus

A square is breezy only if a neighboring square has a pit

• B1,1 <=> (P1,2 V P2,1)

A square is breezy if a neighboring square has a pit

• (P1,2 V P2,1) => B1,1

The former is more powerful and true to the Wumpus rules

Page 30

A wumpus knowledge base

• Initial conditions

– R1: ~P1,1 (no pit in [1,1])

• Rules of Breezes (for a few example squares)

– R2: B1,1 <=> (P1,2 V P2,1)

– R3: B2,1 <=> (P1,1 V P2,2 V P3,1)

• Percepts

– R4: ~B1,1

– R5: B2,1

We know: R1 ^ R2 ^ R3 ^ R4 ^ R5

Page 31

Inference

Does the KB entail α (KB ⊨ α)?

• Is there a pit in [1,2]: P1,2?

• Consider only what we need

– B1,1  B2,1  P1,1  P1,2  P2,1  P2,2  P3,1

– 2^7 = 128 models to check

• For each model, see if the KB is true

• For all models where KB = True, see if α is true

Model checking
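A hedged sketch of this 2^7 enumeration, writing R1 through R5 directly as Python conditions over a model of the seven symbols (names are illustrative, not the course's code):

from itertools import product

symbols = ["B11", "B21", "P11", "P12", "P21", "P22", "P31"]

def kb(m):
    """R1 ^ R2 ^ R3 ^ R4 ^ R5 from the wumpus knowledge base."""
    r1 = not m["P11"]                                    # R1: ~P1,1
    r2 = m["B11"] == (m["P12"] or m["P21"])              # R2: B1,1 <=> (P1,2 V P2,1)
    r3 = m["B21"] == (m["P11"] or m["P22"] or m["P31"])  # R3: B2,1 <=> (P1,1 V P2,2 V P3,1)
    r4 = not m["B11"]                                    # R4: ~B1,1
    r5 = m["B21"]                                        # R5: B2,1
    return r1 and r2 and r3 and r4 and r5

no_pit_12 = lambda m: not m["P12"]    # query: there is no pit in [1,2]

models = [dict(zip(symbols, vals)) for vals in product([True, False], repeat=len(symbols))]
print(all(no_pit_12(m) for m in models if kb(m)))   # True: the KB entails ~P1,2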

Page 32

Inference

Truth table

Page 33

Concepts related to entailment

logical equivalence

• Two sentences a and b are logically equivalent if they are true in the same set of models… a ≡ b

validity (or tautology)

• a sentence that is true in all models

– P V ~P

– deduction theorem: a entails b if and only if the sentence (a => b) is valid

satisfiability

• a sentence is satisfiable if it is true in some model

• a entails b if and only if (a ^ ~b) is unsatisfiable
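The last bullet is the usual reduction of entailment to unsatisfiability. A small sketch built on the pl_true function and sentence classes above (so it assumes those definitions):

from itertools import product

def satisfiable(sentence, symbols):
    """True if the sentence is true in at least one model over the given symbols."""
    return any(pl_true(sentence, dict(zip(symbols, vals)))
               for vals in product([True, False], repeat=len(symbols)))

def entails_by_refutation(a, b, symbols):
    """a entails b  iff  (a ^ ~b) is unsatisfiable."""
    return not satisfiable(And(a, Not(b)), symbols)

# Example: (P ^ Q) entails P, since (P ^ Q ^ ~P) has no satisfying model.
P, Q = Symbol("P"), Symbol("Q")
print(entails_by_refutation(And(P, Q), P, ["P", "Q"]))   # True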

Page 34

Logical Equivalences

Know these equivalences
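The standard list, written with the ASCII connectives used in these slides (≡ means "is logically equivalent to"):

(a ^ b) ≡ (b ^ a)                       commutativity of ^
(a V b) ≡ (b V a)                       commutativity of V
((a ^ b) ^ c) ≡ (a ^ (b ^ c))           associativity of ^
((a V b) V c) ≡ (a V (b V c))           associativity of V
~(~a) ≡ a                               double-negation elimination
(a => b) ≡ (~b => ~a)                   contraposition
(a => b) ≡ (~a V b)                     implication elimination
(a <=> b) ≡ ((a => b) ^ (b => a))       biconditional elimination
~(a ^ b) ≡ (~a V ~b)                    De Morgan
~(a V b) ≡ (~a ^ ~b)                    De Morgan
(a ^ (b V c)) ≡ ((a ^ b) V (a ^ c))     distributivity of ^ over V
(a V (b ^ c)) ≡ ((a V b) ^ (a V c))     distributivity of V over ^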

Page 35

Reasoning w/ propositional logic

Inference Rules

• Modus Ponens:

– Whenever sentences of the form α => β and α are given, the sentence β can be inferred

R1: Green => Martian

R2: Green

Inferred: Martian

Page 36

Reasoning w/ propositional logic

Inference Rules

• And-Elimination

– Any of the conjuncts can be inferred

R1: Martian ^ Green

Inferred: Martian

Inferred: Green

Use truth tables if you want to confirm inference rules

Page 37

Example of a proof

(Grid figure: [1,1] is marked ~P, ~B (no pit, no breeze), [2,1] is marked B (breeze), and the remaining frontier squares are marked P? (pit unknown).)

Page 38

Example of a proof

(Grid figure, after applying the inference rules: the squares adjacent to the breeze-free [1,1] are now marked ~P (no pit); the remaining candidates stay P?.)

Page 39

Constructing a proof

Proving α is like searching

• Find a sequence of logical inference rules that lead to the desired result

• Note the explosion of propositions

– Good proof methods ignore the countless irrelevant propositions

Page 40

Monotonicity of knowledge base

The knowledge base can only get larger

• Adding new sentences to the knowledge base can only make it get larger

– If (KB entails α), then ((KB ^ β) entails α)

• This is important when constructing proofs

– A logical conclusion drawn at one point cannot be invalidated by a subsequent entailment