Logical Agents (I)
Artificial Intelligence, Spring, 2010
Instructor: Tsung-Che Chiang, [email protected]
Department of Computer Science and Information Engineering, National Taiwan Normal University

TRANSCRIPT

Page 1

Artificial Intelligence, Spring, 2010

Logical Agents (I)

Instructor: Tsung-Che Chiang, [email protected]

Department of Computer Science and Information Engineering, National Taiwan Normal University

Page 2

[Homework feedback table: a list of student IDs with point deductions, annotated (in Chinese) with the error categories: wrong statistics, wrong results, wrong execution, wrong implementation, wrong compilation.]

Page 3

Outline

Knowledge-based Agents
The Wumpus World
Logic
Propositional Logic
Reasoning Patterns in Propositional Logic
Effective Propositional Inference
Agents based on Propositional Logic
Summary

Page 4

Knowledge-based Agents

Humans know things and do reasoning.

Knowledge and reasoning play a crucial role in dealing with partially observable environments, e.g. diagnosing a patient.

Understanding natural language also requires reasoning.

Page 5

Knowledge-based Agents

Knowledge-based agents can benefit from knowledge expressed in very general forms to suit many purposes.

They are able to
accept new tasks explicitly described by goals,
achieve competence by being told new knowledge about the environment, and
adapt to changes by updating the relevant knowledge.

Page 6

Knowledge-based Agents

The central component of a knowledge-based agent is its knowledge base (KB).

A KB is a set of sentences. Each sentence is expressed in a knowledge representation language and represents some assertion about the world.

Page 7

Knowledge-based Agents

TELL: add new sentences to the KB
ASK: query what is known

Both tasks may involve inference: deriving new sentences from old.

When one ASKs a question of the KB, the answer should follow from what has been TELLed to the KB.

Page 8

Knowledge-based Agents

Artificial Intelligence: A Modern Approach, 2nd ed., Figure 7.1

The KB may initially contain some background knowledge.
The details of the representation language are hidden inside three functions.
The details of the inference mechanism are hidden inside TELL and ASK.
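A minimal Python sketch of the generic knowledge-based agent program described above (in the spirit of Figure 7.1). The names KBAgent, make_percept_sentence, make_action_query, and make_action_sentence are placeholders assumed for this sketch; kb is any object offering tell() and ask().

class KBAgent:
    # make_percept_sentence, make_action_query, and make_action_sentence are
    # hypothetical helpers that build sentences in the chosen representation
    # language; their details are hidden from the agent program.
    def __init__(self, kb):
        self.kb = kb      # knowledge base offering tell() and ask()
        self.t = 0        # time counter

    def agent_program(self, percept):
        # TELL the KB what the agent perceives at time t
        self.kb.tell(make_percept_sentence(percept, self.t))
        # ASK the KB which action should be performed now
        action = self.kb.ask(make_action_query(self.t))
        # TELL the KB that the chosen action is being executed
        self.kb.tell(make_action_sentence(action, self.t))
        self.t += 1
        return action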

Page 9

Knowledge-based Agents

One can build a knowledge-based agent by TELLing it what it needs to know. But how?

Declarative approach
It adds one by one the sentences that represent the designer's knowledge.
Design of the representation language is important.

Procedural approach
It encodes desired behaviors directly as program code.
Minimizing the role of explicit representation and reasoning can result in a much more efficient system.

Page 10

Knowledge-based Agents

We will see both declarative and procedural approaches later.

A successful agent must combine both elements in its design.

Page 11

The Wumpus World

PEAS description
Environment:
It is a 4×4 grid of rooms.
The agent always starts from [1, 1], facing to the right.
The locations of the gold and the wumpus are chosen randomly.
Each square other than the start can be a pit with probability 0.2.
The agent has only one arrow.

Artificial Intelligence: A Modern Approach, 2nd ed., Figure 7.2

Page 12

The Wumpus World

PEAS description
Performance measure:
+1000 for picking up the gold
-1000 for falling into a pit or being eaten by the wumpus
-1 for each action
-10 for shooting the (only one) arrow

Page 13

The Wumpus World

PEAS description
Actuators:
Move forward
Turn left/right by 90°
Grab
Shoot

Sensors:
Stench
Breeze
Glitter
Bump
Scream

Artificial Intelligence: A Modern Approach, 2nd ed., Figure 7.2

Page 14

The Wumpus World

[Figure: an example wumpus world, with the Stench, Breeze, and Glitter percepts marked on the grid.]

Page 15

The Wumpus World

Fundamental property of reasoning

“In each case where the agent draws a conclusion from the available information, that conclusion is guaranteed to be correct if the available information is correct.”

Page 16

Logic

Sentences in the KB are expressed according to the syntax of the representation language, e.g. “x+y=4” is a well-formed sentence, whereas “x4y+=” is not.

A logic must also define the semantics of the language. It defines the truth of each sentence w.r.t. each possible world.

Page 17

Logic

Model
We will use the term model in place of “possible world.”
We will say “m is a model of α” to mean that sentence α is true in model m.

Entailment
We use α ╞ β to mean that the sentence α entails the sentence β.
α ╞ β if and only if in every model in which α is true, β is also true.
The truth of β is contained in the truth of α.

Page 18

Logic

Example
The agent has detected nothing in [1, 1] and a breeze in [2, 1].
It is interested in whether the adjacent squares [1, 2], [2, 2], and [3, 1] contain pits.
There are 2³ = 8 possible models.

The percepts and the rules of the wumpus world constitute the KB.

The KB is true in models that follow what the agent knows.

Page 19

Logic

[Figure: the 8 possible models of pits in [1, 2], [2, 2], and [3, 1], given “nothing in [1, 1]” and “breeze in [2, 1]”.]

Page 20

Logic

Artificial Intelligence: A Modern Approach, 2nd ed., Figure 7.5

α1 = “There is no pit in [1, 2]”
KB ╞ α1

Page 21

Logic

Artificial Intelligence: A Modern Approach, 2nd ed., Figure 7.5

α2 = “There is no pit in [2, 2]”
KB ⊭ α2 (in some models in which KB is true, α2 is false)

Page 22

Logic

The previous example shows how an inference algorithm called “model checking” works.

It enumerates all possible models to check that α is true in all models in which KB is true.
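A small Python sketch (not from the slides) of this model-checking idea, applied to the example on the preceding slides. Only the three unknown pit symbols are enumerated; kb_true encodes what the agent knows (no breeze in [1, 1], breeze in [2, 1]), and the symbol names P12, P22, P31 are assumptions of the sketch.

from itertools import product

def kb_true(m):
    # No breeze in [1, 1] rules out a pit in [1, 2]; the breeze in [2, 1]
    # requires a pit in [2, 2] or [3, 1].
    return (not m['P12']) and (m['P22'] or m['P31'])

def entails(query):
    # KB |= query iff query holds in every model in which the KB holds.
    symbols = ['P12', 'P22', 'P31']
    for values in product([True, False], repeat=len(symbols)):
        m = dict(zip(symbols, values))
        if kb_true(m) and not query(m):
            return False
    return True

print(entails(lambda m: not m['P12']))   # alpha1 = "no pit in [1, 2]": True
print(entails(lambda m: not m['P22']))   # alpha2 = "no pit in [2, 2]": False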

Page 23

Logic

If an inference algorithm i can derive α from KB, we write KB ├i α.

An inference algorithm is called sound or truth-preserving if it derives only entailed sentences.
Model checking is a sound algorithm (when it is applicable).

An inference algorithm is complete if it can derive any sentence that is entailed.

Page 24

Logic

“If KB is true in the real world, then any sentence derived from KB by a sound inference procedure is also true in the real world.”

Artificial Intelligence: A Modern Approach, 2nd ed., Figure 7.6

Page 25

Logic

Grounding
How do we know that KB is true in the real world?
The simple answer is that the agent's sensors create the connection.

What about the rest of the agent's knowledge?
Knowledge that is not a direct representation of a single percept could be produced by a sentence construction procedure called learning.
KB may not be true in the real world, but with good learning procedures there is reason for optimism.

Page 26

Tea Time

Wumpus World
http://www.youtube.com/watch?v=TgRXLA1EY4A

Wumpus World Game
http://www.inthe70s.com/games/wumpus/index.shtml#
http://www.funzac.com/play/Wumpus%20World.html

Page 27

Propositional Logic

Syntax
A proposition symbol stands for a proposition that can be true or false.
special symbols: True and False

The atomic sentences are indivisible syntactic elements. They consist of a single proposition symbol.

Complex sentences are constructed from simpler sentences using logical connectives.

A literal is either an atomic sentence or a negated atomic sentence.

Page 28

Propositional Logic

A BNF grammar

Sentence        → AtomicSentence | ComplexSentence
AtomicSentence  → True | False | Symbol
Symbol          → P | Q | R | ...
ComplexSentence → ¬Sentence                  (negation)
                | (Sentence ∧ Sentence)      (conjunction)
                | (Sentence ∨ Sentence)      (disjunction)
                | (Sentence ⇒ Sentence)      (implication)
                | (Sentence ⇔ Sentence)      (biconditional)

Page 29

Propositional Logic

Semantics
In propositional logic, a model simply fixes the truth value for every proposition symbol.
The semantics must specify how to compute the truth value of any sentence, given a model.

Artificial Intelligence: A Modern Approach, 2nd ed., Figure 7.8

Truth table
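A minimal Python sketch (not from the slides) of computing the truth value of a sentence in a given model, following the truth-table semantics above. The encoding is an assumption of the sketch: a sentence is either a symbol name (a string) or a nested tuple whose first element names the connective, and a model is a dict mapping symbol names to True/False.

def pl_true(sentence, model):
    if isinstance(sentence, str):                # atomic sentence (symbol)
        return model[sentence]
    op, *args = sentence
    if op == 'not':
        return not pl_true(args[0], model)
    if op == 'and':
        return all(pl_true(a, model) for a in args)
    if op == 'or':
        return any(pl_true(a, model) for a in args)
    if op == '=>':                               # implication
        return (not pl_true(args[0], model)) or pl_true(args[1], model)
    if op == '<=>':                              # biconditional
        return pl_true(args[0], model) == pl_true(args[1], model)
    raise ValueError('unknown connective: %r' % op)

# e.g. B1,1 <=> (P1,2 v P2,1) in a model where B1,1 and P2,1 hold
model = {'B11': True, 'P12': False, 'P21': True}
print(pl_true(('<=>', 'B11', ('or', 'P12', 'P21')), model))   # True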

Page 30

Propositional Logic

“P ⇒ Q” says that “If P is true, then I am claiming that Q is true. Otherwise, I am making no claim.”

“P ⇔ Q” is true whenever both “P ⇒ Q” and “Q ⇒ P” are true.
e.g.
B1,1 ⇔ (P1,2 ∨ P2,1)

B1,1 ⇒ (P1,2 ∨ P2,1) is true but incomplete.

Page 31

Propositional Logic

We often omit the parentheses by obeying the order of precedence (from highest to lowest): ¬, ∧, ∨, ⇒, and ⇔.

¬P ∨ Q ∧ R ⇒ S is equivalent to ((¬P) ∨ (Q ∧ R)) ⇒ S.
We allow A ∧ B ∧ C, A ∨ B ∨ C, and A ⇔ B ⇔ C.
However, we do not allow A ⇒ B ⇒ C since it is ambiguous: A ⇒ (B ⇒ C) and (A ⇒ B) ⇒ C have different meanings.

Page 32

Propositional Logic

A logical knowledge base is a conjunction of sentences.

If we start with an empty KB and do TELL(KB, S1), ..., TELL(KB, Sn), then we have KB = S1 ∧ ... ∧ Sn.

Page 33

Propositional Logic

A simple knowledge base for the wumpus world (only considering the pits)

There is no pit in [1, 1].
R1: ¬P1,1

A square is breezy if and only if there is a pit in a neighboring square. (True in all wumpus worlds)
R2: B1,1 ⇔ (P1,2 ∨ P2,1)
R3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)

Agent percepts
R4: ¬B1,1
R5: B2,1

[Figure: the agent's map; visited squares are marked OK, the breeze is marked B, and possible pit squares are marked P?]

Page 34

Inference

The aim of logical inference is to decide whether KB ╞ α for some sentence α.

Our first algorithm will enumerate the models and check that α is true in every model in which KB is true.
e.g. In the previous slide, we have seven relevant proposition symbols. There are 2⁷ = 128 possible models, and in three of them KB is true.

Page 35

Inference

α1 = ¬P1,2

R1: ¬P1,1
R2: B1,1 ⇔ (P1,2 ∨ P2,1)
R3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
R4: ¬B1,1
R5: B2,1

Artificial Intelligence: A Modern Approach, 2nd ed., Figure 7.9
[Figure: the truth table over the 128 models of the seven symbols; KB is true in exactly three of them, and α1 is true in all three.]

Page 36

Inference

Exercise
Write down the related rules.
Apply the truth table to do model checking.

Wumpus world: Is there a breeze in [2, 2]?

Minesweeper: Where is the mine?

[Figures: a wumpus-world grid with a known breeze marked B and the query square marked B?, and a minesweeper board whose revealed cells show 1, with covered cells labeled A, B, C, D.]

Page 37

Inference

Artificial Intelligence: A Modern Approach, 2nd ed., Figure 7.10
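A Python sketch (not the textbook's code) of the truth-table entailment check that Figure 7.10 describes, reusing pl_true from the earlier sketch: it recursively enumerates all assignments to the given symbols and checks that α holds in every model in which the KB holds.

def tt_entails(kb, alpha, symbols):
    def check_all(symbols, model):
        if not symbols:
            # A complete model: wherever the KB is true, alpha must be true.
            return pl_true(alpha, model) if pl_true(kb, model) else True
        first, rest = symbols[0], symbols[1:]
        return (check_all(rest, {**model, first: True}) and
                check_all(rest, {**model, first: False}))
    return check_all(list(symbols), {})

# e.g. R2 and R4 alone already entail "no pit in [1, 2]"
kb = ('and', ('<=>', 'B11', ('or', 'P12', 'P21')), ('not', 'B11'))
print(tt_entails(kb, ('not', 'P12'), ['B11', 'P12', 'P21']))   # True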

Page 38

Inference

Analysis
TT-Entails is sound and complete. However, ...
If KB and α contain n symbols, then there are 2ⁿ possible models.
The time complexity is O(2ⁿ) and the space complexity is O(n).

We will see more efficient algorithms later.
But every known inference algorithm for propositional logic has a worst-case complexity that is exponential in the size of the input. (Propositional entailment is co-NP-complete.)

Page 39

Inference

Before we plunge into the details of logical inference algorithms, we need some additional concepts related to entailment:

equivalence
validity
satisfiability

Page 40

Inference

Logical equivalence
Two sentences α and β are logically equivalent if they are true in the same set of models. We write this as α ≡ β.

An alternative definition is
α ≡ β if and only if α ╞ β and β ╞ α.

Page 41

Inference

Logical equivalence

Artificial Intelligence: A Modern Approach, 2nd ed., Figure 7.11
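For reference, the standard equivalences listed in that figure are:

(α ∧ β) ≡ (β ∧ α)    commutativity of ∧
(α ∨ β) ≡ (β ∨ α)    commutativity of ∨
((α ∧ β) ∧ γ) ≡ (α ∧ (β ∧ γ))    associativity of ∧
((α ∨ β) ∨ γ) ≡ (α ∨ (β ∨ γ))    associativity of ∨
¬(¬α) ≡ α    double-negation elimination
(α ⇒ β) ≡ (¬β ⇒ ¬α)    contraposition
(α ⇒ β) ≡ (¬α ∨ β)    implication elimination
(α ⇔ β) ≡ ((α ⇒ β) ∧ (β ⇒ α))    biconditional elimination
¬(α ∧ β) ≡ (¬α ∨ ¬β)    De Morgan
¬(α ∨ β) ≡ (¬α ∧ ¬β)    De Morgan
(α ∧ (β ∨ γ)) ≡ ((α ∧ β) ∨ (α ∧ γ))    distributivity of ∧ over ∨
(α ∨ (β ∧ γ)) ≡ ((α ∨ β) ∧ (α ∨ γ))    distributivity of ∨ over ∧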

Page 42

Inference

Validity (tautology)
A sentence is valid if it is true in all models.

The deduction theorem:
For any sentences α and β, α ╞ β if and only if the sentence (α ⇒ β) is valid.

We can think of the TT-ENTAILS algorithm as checking the validity of (KB ⇒ α).
Conversely, every valid implication sentence describes a legitimate inference.

Page 43

Inference

Satisfiability
A sentence is satisfiable if it is true in some model.
If a sentence α is true in a model m, we say that m satisfies α or that m is a model of α.

Determining the satisfiability of sentences in propositional logic was the first problem proved to be NP-complete.

Page 44

Inference

Satisfiability
Validity and satisfiability are connected:
α ╞ β if and only if the sentence (α ∧ ¬β) is unsatisfiable.

Proving β from α by checking the unsatisfiability of (α ∧ ¬β) is called proof by contradiction.
One assumes a sentence β to be false and shows that this leads to a contradiction with known axioms α.
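A short Python sketch (not from the slides) connecting validity, satisfiability, and entailment, reusing pl_true from the earlier sketch. The helper symbols(), which collects the proposition symbols of a sentence, is an assumption of the sketch.

from itertools import product

def symbols(s):
    if isinstance(s, str):
        return {s}
    return set().union(*(symbols(a) for a in s[1:]))

def satisfiable(s):
    # true in some model?
    syms = sorted(symbols(s))
    return any(pl_true(s, dict(zip(syms, vals)))
               for vals in product([True, False], repeat=len(syms)))

def valid(s):
    # true in all models, i.e. its negation is unsatisfiable
    return not satisfiable(('not', s))

def entails_by_refutation(alpha, beta):
    # alpha |= beta  iff  (alpha and not beta) is unsatisfiable
    return not satisfiable(('and', alpha, ('not', beta)))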

Page 45

Reasoning Patterns in PL

Inference rules
Modus Ponens
And-Elimination
And-Introduction
Or-Introduction

All of the logical equivalences in slide 41 can be used as inference rules.
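For reference, the standard forms of these rules (premises first, conclusion after "infer") are:
Modus Ponens: from α ⇒ β and α, infer β.
And-Elimination: from α ∧ β, infer α (or β).
And-Introduction: from α and β, infer α ∧ β.
Or-Introduction: from α, infer α ∨ β for any sentence β.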

Page 46

Reasoning Patterns in PL

Example

R1: ¬P1,1
R2: B1,1 ⇔ (P1,2 ∨ P2,1)
R3: B2,1 ⇔ (P1,1 ∨ P2,2 ∨ P3,1)
R4: ¬B1,1
R5: B2,1

R6: (B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)    [biconditional elimination of R2]
R7: (P1,2 ∨ P2,1) ⇒ B1,1    [And-Elimination of R6]
R8: ¬B1,1 ⇒ ¬(P1,2 ∨ P2,1)    [contraposition of R7]
R9: ¬(P1,2 ∨ P2,1)    [Modus Ponens with R4 and R8]
R10: ¬P1,2 ∧ ¬P2,1    [De Morgan's rule with R9]

Page 47

Reasoning Patterns in PL

The sequence of applications of inference rules is called a proof.

Finding proofs is exactly like finding solutions to search problems.
The successor function can be defined to generate all possible applications of inference rules.

Page 48

Reasoning Patterns in PL

Searching for proofs is an alternative to enumerating models.
Although inference in propositional logic is NP-complete, finding a proof can be highly efficient, e.g. the previous proof ignores B3,1, P1,1, P2,2, etc.

The simple truth-table algorithm, on the other hand, would be overwhelmed by the exponential explosion of models.

Page 49

Reasoning Patterns in PL

Monotonicity
The set of entailed sentences can only increase as information is added to the KB.

For any sentences α and β,
if KB ╞ α, then (KB ∧ β) ╞ α.

It means that inference rules can be applied whenever suitable premises are found in the KB; the conclusion of the rule must follow regardless of what else is in the KB.

Page 50

Conjunctive Normal Form

A sentence expressed as a conjunction of disjunctions of literals is said to be in conjunctive normal form (CNF).
A sentence in k-CNF has exactly k literals per clause.

Every sentence can be transformed into a CNF sentence.
exercise: B1,1 ⇔ (P1,2 ∨ P2,1)

Page 51

Conjunctive Normal Form

Example

B1,1 ⇔ (P1,2 ∨ P2,1)
(B1,1 ⇒ (P1,2 ∨ P2,1)) ∧ ((P1,2 ∨ P2,1) ⇒ B1,1)
(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬(P1,2 ∨ P2,1) ∨ B1,1)
(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ ((¬P1,2 ∧ ¬P2,1) ∨ B1,1)
(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ ((¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1))

(¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)
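A Python sketch (not from the slides) of this conversion on the tuple encoding used in the earlier sketches: eliminate ⇔, eliminate ⇒, move ¬ inwards, then distribute ∨ over ∧. It handles binary connectives only, which is enough for the example above; the resulting conjuncts may still be nested, but each is a disjunction of literals.

def eliminate_iff_imp(s):
    # Rewrite <=> and => in terms of 'and', 'or', 'not'.
    if isinstance(s, str):
        return s
    op, *args = s
    args = [eliminate_iff_imp(a) for a in args]
    if op == '<=>':
        a, b = args
        return ('and', ('or', ('not', a), b), ('or', ('not', b), a))
    if op == '=>':
        a, b = args
        return ('or', ('not', a), b)
    return (op, *args)

def move_not_inwards(s):
    # Push 'not' down to the symbols (De Morgan, double negation).
    if isinstance(s, str):
        return s
    op, *args = s
    if op == 'not':
        a = args[0]
        if isinstance(a, str):
            return s
        if a[0] == 'not':
            return move_not_inwards(a[1])
        if a[0] == 'and':
            return ('or', *[move_not_inwards(('not', x)) for x in a[1:]])
        if a[0] == 'or':
            return ('and', *[move_not_inwards(('not', x)) for x in a[1:]])
    return (op, *[move_not_inwards(a) for a in args])

def distribute_or(s):
    # Distribute 'or' over 'and' so that 'and' ends up outermost.
    if isinstance(s, str) or s[0] == 'not':
        return s
    op, a, b = s[0], distribute_or(s[1]), distribute_or(s[2])
    if op == 'or':
        if not isinstance(a, str) and a[0] == 'and':
            return ('and', distribute_or(('or', a[1], b)),
                           distribute_or(('or', a[2], b)))
        if not isinstance(b, str) and b[0] == 'and':
            return ('and', distribute_or(('or', a, b[1])),
                           distribute_or(('or', a, b[2])))
    return (op, a, b)

def to_cnf(s):
    return distribute_or(move_not_inwards(eliminate_iff_imp(s)))

# Reproduces the example above (modulo nesting of the 'and's/'or's).
print(to_cnf(('<=>', 'B11', ('or', 'P12', 'P21'))))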

Page 52

Resolution

Unit resolution:
from (l1 ∨ ... ∨ lk) and m, where li and m are complementary literals,
infer (l1 ∨ ... ∨ li-1 ∨ li+1 ∨ ... ∨ lk).

Full resolution:
from (l1 ∨ ... ∨ lk) and (m1 ∨ ... ∨ mn), where li and mj are complementary literals,
infer (l1 ∨ ... ∨ li-1 ∨ li+1 ∨ ... ∨ lk ∨ m1 ∨ ... ∨ mj-1 ∨ mj+1 ∨ ... ∨ mn).

Page 53

Resolution

The resulting clause should contain only one copy of each literal. (The removal of multiple copies of literals is called factoring.)
e.g. resolving (A ∨ B) with (A ∨ ¬B) yields (A ∨ A), which is reduced to A.

The resolution rule applies only to disjunctions of literals. (But, recall that every sentence can be transformed into a 3-CNF sentence.)

Page 54

Resolution

Any complete search algorithm, applying only the resolution rule, can derive any conclusion entailed by any knowledge base.

Given that A is true, we cannot generate the consequence A ∨ B. But we can answer whether A ∨ B is true.

This is called refutation completeness, meaning that resolution can always be used to either confirm or refute a sentence.

Page 55

A Resolution Algorithm

Resolution-based inference procedures follow the principle of proof by contradiction.

To show that KB ╞ α, we show that (KB ∧ ¬α) is unsatisfiable.
First, (KB ∧ ¬α) is converted into CNF.
Then, each pair of clauses that contain complementary literals is resolved to produce a new clause.

Page 56

A Resolution Algorithm

The process continues until one of two things happens:
there are no new clauses that can be added, in which case KB does not entail α; or
two clauses (P and ¬P) resolve to yield the empty clause, in which case KB entails α.

Page 57

A Resolution Algorithm

Artificial Intelligence: A Modern Approach, 2nd ed., Figure 7.12

Page 58

A Resolution Algorithm

Example
R2: B1,1 ⇔ (P1,2 ∨ P2,1)   // CNF: (¬B1,1 ∨ P1,2 ∨ P2,1) ∧ (¬P1,2 ∨ B1,1) ∧ (¬P2,1 ∨ B1,1)
R4: ¬B1,1

We wish to prove α = ¬P1,2.

Artificial Intelligence: A Modern Approach, 2nd ed., Figure 7.13
[Figure: the clauses of KB ∧ ¬α in CNF and the resolvents produced, ending with the empty clause.]

Any clause containing two complementary literals can be discarded.
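A Python sketch (not from the slides) of this resolution procedure, with a clause represented as a frozenset of literals; a literal is a symbol string or a ('not', symbol) pair. The clause list at the end re-encodes this example (the CNF of KB ∧ ¬α for KB = R2 ∧ R4 and α = ¬P1,2); the empty clause is derived, so KB ╞ α.

def resolve(ci, cj):
    # All resolvents of two clauses; factoring happens via the set union.
    resolvents = []
    for lit in ci:
        comp = lit[1] if isinstance(lit, tuple) else ('not', lit)
        if comp in cj:
            resolvents.append((ci - {lit}) | (cj - {comp}))
    return resolvents

def pl_resolution(clauses):
    # Returns True iff the clause set is unsatisfiable (i.e. KB |= alpha).
    clauses = set(frozenset(c) for c in clauses)
    while True:
        new = set()
        pairs = [(ci, cj) for ci in clauses for cj in clauses if ci != cj]
        for ci, cj in pairs:
            for resolvent in resolve(ci, cj):
                if not resolvent:          # empty clause: contradiction found
                    return True
                new.add(resolvent)
        if new.issubset(clauses):          # nothing new: satisfiable
            return False
        clauses |= new

kb_and_not_alpha = [
    {('not', 'B11'), 'P12', 'P21'},        # from R2 in CNF
    {('not', 'P12'), 'B11'},
    {('not', 'P21'), 'B11'},
    {('not', 'B11')},                      # R4
    {'P12'},                               # the negated query, not alpha
]
print(pl_resolution(kb_and_not_alpha))     # True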

Page 59

A Resolution Algorithm

Analysis
The resolution closure RC(S) of a set of clauses S is the set of all clauses derivable by repeated application of the resolution rule to clauses in S or their derivatives.

RC(S) must be finite because there are only finitely many distinct clauses constructed from the symbols P1, ..., Pk appearing in S.
Hence PL-RESOLUTION always terminates.

Note that the last sentence might not be true without the factoring step that removes multiple copies of literals.

Page 60

A Resolution Algorithm

Analysis
The Ground Resolution Theorem:
“If a set of clauses is unsatisfiable, then the resolution closure of those clauses contains the empty clause.”

We can prove this theorem by demonstrating its contrapositive: if the closure RC(S) does not contain the empty clause, then S is satisfiable.

Page 61

A Resolution Algorithm

Analysis (contd.)
We can construct a model for S with suitable truth values for P1, ..., Pk.

For i from 1 to k:
if there is a clause containing the literal ¬Pi such that all its other literals are false under the assignment of P1, ..., Pi-1, then assign false to Pi;
otherwise, assign true to Pi.

Page 62

Forward & Backward Chaining

The completeness of resolution makes it a very important inference method.

In many practical situations, however, the full power of resolution is not needed.

Real-world KBs often contain only clauses of a restricted kind called Horn clauses.

Page 63

Forward & Backward Chaining

Horn clauses
A Horn clause is a disjunction of literals of which at most one is positive.
In the following algorithm, we assume that each clause contains exactly one positive literal for simplicity.
Exactly one positive literal: definite clause
Definite clause without negative literals: fact
Without positive literal: integrity constraint

Every Horn clause can be written as an implication.
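For example (an illustration, not from the slides): the Horn clause ¬A ∨ ¬B ∨ C can be written as the implication (A ∧ B) ⇒ C, with the negative literals forming the premise and the single positive literal the conclusion; a fact C corresponds to True ⇒ C, and an integrity constraint ¬A ∨ ¬B can be read as (A ∧ B) ⇒ False.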

Page 64

Forward & Backward Chaining

Horn clauses
Inference with Horn clauses can be done through the forward chaining and backward chaining algorithms.

Deciding entailment with Horn clauses can be done in time that is linear in the size of the KB.

Page 65

Forward & Backward Chaining

Forward chaining
It begins from known facts (the clauses with only a positive literal) in the KB.
If all the premises of an implication are known, then its conclusion is added to the set of known facts.
The process continues until the query is added or until no further inferences can be made.

Page 66

Forward & Backward Chaining

Artificial Intelligence: A Modern Approach, 2nd ed., Figure 7.14

HEAD[c]: the positive literal of c
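A Python sketch (not the textbook's code) of this forward-chaining check. Definite clauses are encoded as (premises, conclusion) pairs, with facts having an empty premise list; this encoding, and the small KB at the end (modeled on the Figure 7.15 example), are assumptions of the sketch.

from collections import deque

def pl_fc_entails(clauses, query):
    count = [len(premises) for premises, _ in clauses]   # unsatisfied premises
    inferred = set()
    agenda = deque(head for premises, head in clauses if not premises)  # facts
    while agenda:
        p = agenda.popleft()
        if p == query:
            return True
        if p in inferred:
            continue
        inferred.add(p)
        for i, (premises, head) in enumerate(clauses):
            if p in premises:
                count[i] -= 1
                if count[i] == 0:          # all premises known: add the head
                    agenda.append(head)
    return False

kb = [(['P'], 'Q'), (['L', 'M'], 'P'), (['B', 'L'], 'M'),
      (['A', 'P'], 'L'), (['A', 'B'], 'L'), ([], 'A'), ([], 'B')]
print(pl_fc_entails(kb, 'Q'))              # True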

Page 67

Forward & Backward Chaining

Artificial Intelligence: A Modern Approach, 2nd ed., Figure 7.15

Page 68

Forward & Backward Chaining

Analysis
Forward chaining is sound: every inference is essentially an application of Modus Ponens.
Forward chaining is complete:
Consider the final state of the inferred table after the algorithm terminates.
The table contains true for each symbol inferred during the process, and false for all other symbols.
We can view this table as a model of the KB.
Any atomic sentence entailed by the KB must be true in this model, and thus it is inferred by the algorithm.

Page 69

Forward & Backward Chaining

Backward chaining
As its name suggests, it works backwards from the query.
If the query q is known to be true, no work is needed.
Otherwise, the algorithm finds those implications in the KB that conclude q.
If all the premises of one of those implications can be proved true (by backward chaining), then q is true.
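A Python sketch (not from the slides) of this goal-directed procedure, using the same (premises, conclusion) encoding as the forward-chaining sketch; the visited set stops the search from looping on a goal that is already under consideration.

def pl_bc_entails(clauses, query, visited=frozenset()):
    if query in visited:
        return False
    for premises, head in clauses:
        if head == query:
            # q is proved if every premise of some clause concluding q is proved.
            if all(pl_bc_entails(clauses, p, visited | {query}) for p in premises):
                return True
    return False

# e.g. pl_bc_entails(kb, 'Q') is True for the kb of the forward-chaining sketch.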

Page 70

Forward & Backward Chaining

Page 71

Forward & Backward Chaining

Forward chaining is an example of data-driven reasoning.
It can be used within an agent to derive conclusions from incoming percepts.

Backward chaining is a form of goal-directed reasoning.
It is useful for answering specific questions such as "What should I do now?"
Its cost is often lower since it touches only relevant facts.

An agent should share the work between forward and backward reasoning.