
  • Knowledge Representation (Overview)

    Marek Sergot

    October 2015

    Introduction

    From SOLE 2014 . . .

    Prof Marek Sergot is the most lucid, patient, engaging, humourous, enthusiastic and approachable lecturer one could ever hope to have. It is a privilege to encounter such a lecturer.

    The most interesting of all the courses offered . . . My only suggestion for improvement would be to offer this course in the first term and offer a second that goes into other areas (e.g. argumentation) in more depth in the spring.

    This happened on several occasions and I believe it is not acceptable.

    Marek Sergot (October 2015) 2 / 51

    Introduction

    Knowledge Representation∗

    ∗ includes reasoning

    a huge sub-field of AI

    a variety of representation/modelling formalisms, mostly (these days, always) based on logic

    assorted representation problems

    So more or less: applied logic

    Marek Sergot (October 2015) 3 / 51

    Introduction

    KR 2008: 11th International Conference on Principles of Knowledge Representation and Reasoning

    Exception tolerant and inconsistency-tolerant reasoning, Paraconsistent logics

    Nonmonotonic logics, Default Logics, Conditional logics, Argumentation

    Temporal reasoning and spatial reasoning

    Causal reasoning, Abduction, Model-based diagnosis

    Reasoning about actions and change, Action languages

    Reasoning, planning, or decision making under uncertainty

    Representations of vagueness, Many-valued and fuzzy logics

    Graphical representations for belief and preference

    Reasoning about belief and knowledge, Epistemic and Doxastic logics, Multi-agent logics of belief and knowledge

    Logic programming, Constraint logic programming, Answer set programming

    Computational aspects of knowledge representation

    Marek Sergot (October 2015) 4 / 51

  • Introduction

    KR 2008: 11th International Conference on Principles of Knowledge Representation and Reasoning

    Concept formation, Similarity-based reasoning

    Belief revision and update, Belief merging, Information fusion

    Description logics, ontologies

    Qualitative reasoning, Reasoning about physical systems

    Decision theory, Preference modeling and representation, Reasoning about preference

    KR and Autonomous agents: Intelligent agents, Cognitive robotics

    KR and Multi-agent systems: Negotiation, Group decision making, Cooperation, Interaction, KR and game theory

    Natural language processing, Summarization, Categorization

    KR and Machine learning, Inductive logic programming, Knowledge discovery and acquisition

    WWW querying languages, Information retrieval and web mining, Website selection and configuration

    Philosophical foundations and psychological evidence

    Marek Sergot (October 2015) 5 / 51

    Introduction

    KR 2014: 14th International Conference on Principles of Knowledge Representation and Reasoning

    Applications of KR

    Argumentation

    Belief revision and update, belief merging, information fusion

    Computational aspects of knowledge representation

    Concept formation, similarity-based reasoning

    Contextual reasoning

    Description logics

    Explanation finding, diagnosis, causal reasoning, abduction

    Inconsistency- and exception tolerant reasoning, paraconsistent logics

    KR and autonomous agents, cognitive robotics, multi-agent systems, logical models of agency

    KR and data management, ontology-based data access, queries and updates over incomplete data

    KR and decision making, decision theory, game theory and economic models

    KR and machine learning, inductive logic programming, knowledge discovery and acquisition

    KR and the Web, Semantic Web, formal approaches to knowledge bases

    KR in games, general game playing, reasoning in video games and virtual environments, believable agents

    KR in natural language understanding and question answering

    KR in image and video understanding

    Marek Sergot (October 2015) 6 / 51

    Introduction

    KR 2014: 14th International Conference on Principles of Knowledge Representation and Reasoning

    Logical approaches to planning and behavior synthesis

    Logic programming, answer set programming, constraint logic programming

    Nonmonotonic logics, default logics, conditional logics

    Reasoning about norms and organizations, social knowledge and behavior

    Philosophical foundations of KR

    Ontology languages and modeling

    Preference modeling and representation, reasoning about preferences, preference-based reasoning

    Qualitative reasoning, reasoning about physical systems

    Reasoning about actions and change, action languages, situation calculus, dynamic logic

    Reasoning about knowledge and belief, epistemic and doxastic logics

    Spatial reasoning and temporal reasoning

    Uncertainty, representations of vagueness, many-valued and fuzzy logics, relational probability models

    Marek Sergot (October 2015) 7 / 51

    Introduction

    KR 2012: Programme

    Decision Making and Reasoning about Preferences

    Short: – Description Logics – Belief Revision – Argumentation – Reasoning and Search – Logic Programming – Principled Applications

    Uncertainty

    Description Logics I

    Answer Set Programming and Logic Programming

    Inconsistency-Tolerant and Similarity-Based Reasoning

    Knowledge Representation and Knowledge-Based Systems

    Epistemic Logics

    Automated Reasoning and Computation

    Belief Revision II

    Abstraction and Diagnosis

    Argumentation

    Spatial and Temporal Reasoning

    Description Logics II

    Reasoning about Actions and Planning

    Marek Sergot (October 2015) 8 / 51

  • Introduction

    Marek Sergot (October 2015) 9 / 51

    Introduction

    Aims of this course

    Logic
    Logic ≠ classical (propositional) logic !!

    Computational logic
    Logic programming ≠ Prolog !!

    Some examples

    defeasible (non-monotonic) rules

    priorities (preferences)

    action + ‘inertia’

    ‘practical reasoning’: what should I do?

    Marek Sergot (October 2015) 10 / 51

    Introduction

    Logic of conditionals (‘if ... then ...’)

    material implication (classical →)

    ‘strict implication’

    causal conditionals

    counterfactuals

    conditional obligations

    defeasible (non-monotonic) conditionals...

    Marek Sergot (October 2015) 11 / 51

    Defeasible reasoning

    Example

    A recent article about the Semantic Web was critical about the use of logic for performing useful inferences in the Semantic Web, citing the following example, among others:

    ‘People who live in Brooklyn speak with a Brooklyn accent. I live in Brooklyn. Yet I do not speak with a Brooklyn accent.’

    According to the author,

    ‘each of these statements is true, but each is true in a different way. The first is a generalization that can only be understood in context.’

    The article was doubtful that there are any practical ways of representing such statements.

    www.shirky.com/writings/semantic_syllogism.html

    Marek Sergot (October 2015) 12 / 51

  • Defeasible reasoning

    His point (the classical syllogism)

    ∀x (p(x) → q(x))
    p(a)
    ────────
    q(a)

    In logic programming notation:

    q(x) ← p(x)
    p(a)
    ────────
    q(a)

    Marek Sergot (October 2015) 13 / 51
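The logic-programming version of the syllogism can be executed directly by forward chaining. A minimal Python sketch (the representation of rules and facts as strings here is illustrative, not the notes' notation):

```python
# Forward chaining over ground Horn clauses (illustrative sketch).
# A rule is a (head, body) pair; facts are atoms represented as strings.
rules = [("q(a)", ["p(a)"])]      # q(x) <- p(x), instantiated for a
facts = {"p(a)"}

changed = True
while changed:                    # apply rules until a fixpoint is reached
    changed = False
    for head, body in rules:
        if all(b in facts for b in body) and head not in facts:
            facts.add(head)
            changed = True

print(sorted(facts))              # the conclusion q(a) has been derived
```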

    Defeasible reasoning

    Solution

    We need either or both of:

    a new kind of conditional

    a special kind of defeasible entailment

    ∀x (p(x) ⇝ q(x))
    p(a)
    ────────
    q(a)

    There is a huge amount of work on this in AI!

    Marek Sergot (October 2015) 14 / 51

    Defeasible reasoning

    The Qualification Problem (1)

    “All birds can fly . . . ”
    flies(X) ← bird(X)

    “. . . unless they are penguins . . . ”
    flies(X) ← bird(X), ¬ penguin(X)

    “. . . or ostriches . . . ”
    flies(X) ← bird(X), ¬ penguin(X), ¬ ostrich(X)

    “. . . or wounded . . . ”
    flies(X) ← bird(X), ¬ penguin(X), ¬ ostrich(X), ¬ wounded(X)

    “. . . or dead, or sick, or glued to the ground, or . . . ”

    Marek Sergot (October 2015) 15 / 51

    Defeasible reasoning

    The Qualification Problem (2)

    Let BIRDS be the set of rules about flying birds.
    Even if we could list all these exceptions, classical logic would still not allow

    BIRDS ∪ {bird(frank)} |= flies(frank)

    We would also have to affirm all the qualifications:

    ¬ penguin(frank)
    ¬ ostrich(frank)
    ¬ wounded(frank)
    ¬ dead(frank)
    ¬ sick(frank)
    ¬ glued to ground(frank)

    ...

    Marek Sergot (October 2015) 16 / 51

  • Defeasible reasoning

    Classical logic is inadequate

    A |= α means that α is true in all models of A.

    We want a new kind of ‘entailment’:

    BIRDS ∪ {bird(frank)} |=∆ flies(frank)

    From BIRDS and bird(frank) it follows by default—in the absence of information to the contrary—that flies(frank).

    This kind of reasoning will be defeasible.

    Marek Sergot (October 2015) 17 / 51

    Defeasible reasoning

    Non-monotonic logics

    Classical logic is monotonic:

    If KB |= α then KB ∪ X |= α
    New information X always preserves old conclusions α.

    Default reasoning is typically non-monotonic. Can have:

    KB |=∆ α but KB ∪ X ⊭∆ α

    BIRDS ∪ {bird(frank)} |=∆ flies(frank)
    But

    BIRDS ∪ {bird(frank)} ∪ {penguin(frank)} ⊭∆ flies(frank)

    There have been huge developments in AI over the last 25 years in non-monotonic logics for default reasoning and other applications.

    Marek Sergot (October 2015) 18 / 51
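A minimal operational sketch of this defeasible entailment in Python, using negation as failure: `flies` holds for a bird unless abnormality (here: being a penguin or an ostrich) is provable from the knowledge base. The tuple encoding of atoms is an assumption of this sketch:

```python
# Defeasible 'flies' via negation as failure (illustrative encoding):
#   flies(X) <- bird(X), not abnormal(X);   abnormal(X) <- penguin(X)
def flies(x, kb):
    # 'not abnormal(x)' succeeds unless abnormality is provable from kb
    abnormal = ("penguin", x) in kb or ("ostrich", x) in kb
    return ("bird", x) in kb and not abnormal

kb = {("bird", "frank")}
print(flies("frank", kb))         # True: flies by default

kb.add(("penguin", "frank"))      # new information arrives ...
print(flies("frank", kb))        # False: the old conclusion is withdrawn
```

The second call shows the non-monotonicity: adding information retracts a previously derived conclusion, which classical |= can never do.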

    Defeasible reasoning

    Three main approaches

    1 ‘Preferential entailment’

    KB |=∆ α — α is true in all preferred models of KB
    The ‘preferred’ models are usually minimal in some appropriate sense.

    2 Define |=∆ in terms of classical consequence |=. Usually:

    KB |=∆ α iff KB ∪ ext(KB) |= α

    Non-monotonic because, in general:

    KB ∪ ext(KB) ⊈ KB ∪ X ∪ ext(KB ∪ X)

    3 Extend the language with a new form of defeasible conditional/rule (e.g. ⇝).

    There are strong relationships between these three methods.
    We will see some examples of each.

    Marek Sergot (October 2015) 19 / 51

    Defeasible reasoning

    Some sources of defeasible reasoning

    Typical and stereotypical situations

    Generalisations and exceptions...

    Marek Sergot (October 2015) 20 / 51

  • Defeasible reasoning

    Example

    Typically (by default, unless there is reason to think otherwise, . . . ) abird can fly.

    Except that ostriches, which are birds, typically cannot fly.

    Also penguins, which are birds, cannot fly.

    Except that magic ostriches can fly (in general).

    Jim is an ostrich (an ordinary ostrich) who can fly.

    Frank is a magic ostrich who cannot fly.

    No bird can fly if it is dead (no exceptions!).

    Marek Sergot (October 2015) 21 / 51

    Defeasible reasoning

    Furthermore . . .

    A dead bird is abnormal from the point of view of flying (and singing) butnot necessarily from the point of view of having feathers.

    Birds have feathers.

    Birds who have no feathers cannot fly.

    Frank, the magic ostrich, has no feathers (and cannot fly).

    Marek Sergot (October 2015) 22 / 51

    Defeasible reasoning

    Multiple extensions: “The Nixon diamond”

    Quakers are typically pacifists.

    Republicans are typically not pacifists.

    Richard Nixon is a Quaker.

    Richard Nixon is a Republican

    Do we conclude that Nixon is a pacifist or not?

    Marek Sergot (October 2015) 23 / 51

    Defeasible reasoning

    Another example

    Alcoholics are typically adults.
    alcoholic ⇝ adult

    Adults are typically healthy.
    adult ⇝ healthy

    Do we conclude?

    Alcoholics are typically healthy.
    alcoholic ⇝ healthy

    Marek Sergot (October 2015) 24 / 51

  • Defeasible reasoning

    On the other hand . . .

    Alcoholics are typically adults.
    alcoholic ⇝ adult

    Adults are not children.
    adult → ¬child

    Do we conclude?

    Alcoholics are typically not children.
    alcoholic ⇝ ¬child

    Marek Sergot (October 2015) 25 / 51

    Defeasible reasoning

    Some sources of defeasible reasoning

    Typical and stereotypical situations

    Generalisations and exceptions

    Conventions of communication

    ‘Closed World Assumptions’

    ‘Circumscription’

    Autoepistemic reasoning (reasoning about your own beliefs)

    Burdens of proof (e.g. in legal reasoning)

    Persistence and change in temporal reasoning

    Marek Sergot (October 2015) 26 / 51

    Temporal reasoning

    Temporal reasoning: The Frame Problem

    Actions change the truth value of some facts, but almost everything else remains unchanged.

    Painting my house pink changes the colour of the house to pink. . .

    but does not change:

    the age of my house is 93 years
    the father of Brian is Bill
    the capital of France is Paris

    ...

    Qualification problems!

    Marek Sergot (October 2015) 27 / 51

    Temporal reasoning

    Temporal reasoning: Default Persistence (‘Inertia’)

    Actions change the truth value of some facts, but almost everything else remains unchanged.

    p[t] ⇝ p[t+1]

    Some facts persist ‘by inertia’, until disturbed by some action.

    Marek Sergot (October 2015) 28 / 51
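Default persistence is easy to sketch operationally: copy the state forward unchanged and overwrite only the action's known effects. A hedged Python illustration using the house-painting example (the `effects` action theory and the state keys are invented for illustration):

```python
# Default persistence ('inertia') sketched operationally: every fluent
# keeps its value from t to t+1 unless the action explicitly sets it.
effects = {"paint_pink": {"colour": "pink"}}      # illustrative action theory

def step(state, action):
    new = dict(state)                     # everything persists by default ...
    new.update(effects.get(action, {}))   # ... except the action's effects
    return new

s0 = {"colour": "white", "age": 93, "capital_of_france": "Paris"}
s1 = step(s0, "paint_pink")
print(s1)   # colour has changed; age and capital_of_france persist
```

The frame problem is precisely that a purely classical axiomatisation cannot get away with this "copy everything else" shortcut: each non-change would have to be stated explicitly.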

  • Temporal reasoning

    Temporal reasoning: Ramifications

    win causes rich
    lose causes ¬rich
    rich ⇒ happy

    So an occurrence of win indirectly causes happy.

    (We still have to figure out how to deal with ⇒.
    Material implication → is too weak. It doesn’t represent causal connections.)

    Marek Sergot (October 2015) 29 / 51

    Conditionals

    Material implication

    Everyone in Ward 16 has cancer.

    ∀x ( in ward 16(x) → has cancer(x) )

    But compare:

    ∀x ( in ward 16(x) ⇒ has cancer(x) )

    Being in Ward 16 causes you to have cancer.
    x has cancer because x is in Ward 16.

    Marek Sergot (October 2015) 30 / 51

    Conditionals

    The ‘paradoxes of material implication’

    A → (B → A)
    ¬A → (A → B)
    (¬A ∧ A) → B

    ((A ∧ B) → C) → ((A → C) ∨ (B → C))

    (A → B) ∨ (B → A)

    Marek Sergot (October 2015) 31 / 51
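Each of these formulas really is a classical tautology, which is exactly why material implication feels wrong as a reading of "if ... then ...". A quick check by exhaustive truth tables in Python:

```python
# Checking that each 'paradox' is a classical tautology by truth table.
from itertools import product

def imp(p, q):                       # material implication
    return (not p) or q

for A, B, C in product([False, True], repeat=3):
    assert imp(A, imp(B, A))                          # A -> (B -> A)
    assert imp(not A, imp(A, B))                      # ¬A -> (A -> B)
    assert imp((not A) and A, B)                      # (¬A ∧ A) -> B
    assert imp(imp(A and B, C), imp(A, C) or imp(B, C))
    assert imp(A, B) or imp(B, A)                     # (A -> B) ∨ (B -> A)
print("all five formulas are tautologies")
```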

    Conditionals

    Logic of conditionals (‘if ... then ...’)

    material implication (classical →)

    ‘strict implication’

    causal conditionals

    counterfactuals

    conditional obligations

    defeasible (non-monotonic) conditionals...

    Marek Sergot (October 2015) 32 / 51

  • Action

    A favourite topic — action

    Action

    state change/transition

    agency

    what is it ‘to act’?

    logic

    computer science

    philosophy

    Marek Sergot (October 2015) 33 / 51

    Action

    Agency: an example of ‘proximate cause’

    The transition diagram:

    s0 {water, ¬poison, intact, alive(c)}
       ──τ1 (a:poisons)──→
    s1 {¬water, poison, intact, alive(c)}
       ──τ2 (b:pricks)──→
    s2 {¬water, poison, ¬intact, alive(c)}
       ──τ3 (c:goes)──→
    s3 {¬water, ¬poison, ¬intact, dead(c)}

    J.A. McLaughlin. Proximate Cause. Harvard Law Review 39(2):149–199 (Dec. 1925)

    Marek Sergot (October 2015) 34 / 51

    Action

    Example: a problem of ‘practical’ moral reasoning

    (Atkinson & Bench-Capon)

    Hal, a diabetic, has no insulin. Without insulin he will die.

    Carla, also a diabetic, has (plenty of) insulin.

    Should Hal take (steal) Carla’s insulin? (Is he so justified?)

    If he takes it, should he leave money to compensate?

    Suppose Hal does not know whether Carla needs all her insulin.
    Is he still justified in taking it?
    Should he compensate her?

    (Why?)

    Marek Sergot (October 2015) 35 / 51

    Defeasible rules

    Defeasible rules (reasons to believe):

    winter ⇝ heating on
    sunny ⇝ window open
    heating on ⇝ ¬window open
    warm ⇝ ¬heating on
    winter ⇝ ¬warm

    winter : {¬warm, heating on, ¬window open}

    winter, sunny : {¬warm, heating on, ¬window open}
                    {¬warm, heating on, window open}

    Marek Sergot (October 2015) 36 / 51
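One way to see where the two extensions on this slide come from is to fire the defeasible rules greedily in every possible order, skipping any rule whose conclusion would contradict what has already been accepted, and collecting the distinct outcomes. A brute-force Python sketch (the encoding of literals as strings with a `-` prefix is an assumption of this sketch, not the notes' notation):

```python
# Brute-force enumeration of extensions: fire the defeasible rules in
# every order, greedily, skipping any rule whose conclusion contradicts
# what is already accepted.  A rule is a (body, head) pair of literals.
from itertools import permutations

def neg(lit):
    return lit[1:] if lit.startswith("-") else "-" + lit

rules = [("winter", "heating_on"), ("sunny", "window_open"),
         ("heating_on", "-window_open"), ("warm", "-heating_on"),
         ("winter", "-warm")]

def extension(facts, order):
    s = set(facts)
    changed = True
    while changed:
        changed = False
        for body, head in order:
            if body in s and head not in s and neg(head) not in s:
                s.add(head)
                changed = True
    return frozenset(s - set(facts))     # report only the derived literals

facts = ["winter", "sunny"]
exts = {extension(facts, order) for order in permutations(rules)}
for e in sorted(map(sorted, exts)):
    print(e)        # the slide's two extensions, and no others
```

Which extension is reached depends only on whether the window is opened before or after the heating rule blocks it; the priorities on the next slide are what eliminate one of the two.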

  • Defeasible rules

    Add priorities:

    winter ⇝ heating on   (lowest)
    sunny ⇝ window open
    heating on ⇝ ¬window open
    warm ⇝ ¬heating on
    winter ⇝ ¬warm   (highest)

    winter : {¬warm, heating on, ¬window open}

    winter, sunny : {¬warm, heating on, ¬window open}
    (the extension {¬warm, heating on, window open} is now crossed out)

    Marek Sergot (October 2015) 37 / 51

    Defeasible rules

    Defeasible conditional imperatives

    Colonel : ¬(heat on ∧ open window)¹
    Major : sunny ⇝ open window

    Sergeant: winter ⇝ heat on

    winter : {heat on, ¬open window}

    winter, sunny : {heat on, ¬open window}   (crossed out: Major annoyed!)
                    {¬heat on, open window}   Sergeant annoyed!

    ¹ equivalent to (heat on → ¬open window),
    which is not the same as heat on ⇝ ¬open window

    Marek Sergot (October 2015) 38 / 51

    Defeasible rules

    Example: party

    law : ¬(drink ∧ drive)
    wife: drive

    friends: drink

    law > wife law > friends

    wife > friends: {drive, ¬drink}

    friends > wife: {drink, ¬drive}

    Marek Sergot (October 2015) 39 / 51

    Defeasible rules

    Example: party (temporal version)

    drink[1] → drank[2]
    law : drank[2] ⇝ ¬drive[2]

    wife: drive[0] ⇝ drive[2]
    friends: drink[1]

    no look ahead (any ordering)
    {drive[0], drink[1], ¬drive[2]}   wife annoyed!

    look ahead (wife > friends)
    {drive[0], ¬drink[1], drive[2]}   friends annoyed!

    look ahead (friends > wife)
    {drive[0], drink[1], ¬drive[2]}   wife annoyed!

    What if drive[0] ↔ drive[2]?

    Marek Sergot (October 2015) 40 / 51

  • Defeasible rules

    Hal, Carla and Dave

    has insulin(Carla)

    has insulin(Dave)

    diabetic(Dave)

    has insulin(X) ⇝ take from(X) :: life(Hal)

    has insulin(X) ⇝ ¬take from(X) :: property(X)
    has insulin(X) ∧ diabetic(X) ⇝ ¬take from(X) :: life(X)

    take from(X) ⇝ pay(X) :: property(X)

    ⇝ ¬pay(X) :: property(Hal)

    ¬(take from(X) ∧ take from(Y) ∧ X ≠ Y)

    life(X) > property(Y),  life(Hal) > property(Y),

    life(X) > life(Hal)

    Marek Sergot (October 2015) 41 / 51

    Defeasible rules

    Hal, Carla and Dave

    has insulin(Carla)

    has insulin(Dave)

    diabetic(Dave)

    Altruistic Hal: life(X) > life(Hal) > property(Y)

    {¬take from(Dave), take from(Carla), ¬pay(Dave), pay(Carla)}

    Epistemically cautious Hal: diabetic(X) ← not ¬diabetic(X)
    {¬take from(Dave), ¬take from(Carla), ¬pay(Dave), ¬pay(Carla)}

    Selfish (and cautious) Hal: life(Hal) > life(X) > property(Y)

    {take from(Dave), ¬take from(Carla), pay(Dave), ¬pay(Carla)}
    {¬take from(Dave), take from(Carla), ¬pay(Dave), pay(Carla)}

    Marek Sergot (October 2015) 42 / 51

    Defeasible rules

    An example from Asimov

    r1 : ¬harm(a)
    r2 : commands(a, α) ⇝ α

    r1 > r2

    The point of the story:

    commands(a, tell(a))
    tell(a) → harm(a)

    Conclusion:
    {¬tell(a), ¬harm(a)}

    Marek Sergot (October 2015) 43 / 51

    Defeasible rules

    Another story (not Asimov)

    r1(a) : ¬kill(a)
    r1(b) : ¬kill(b)
    r2(a) : commands(a, α) ⇝ α
    r2(b) : commands(b, α) ⇝ α

    [r1(a), r1(b)] > [r2(a), r2(b)]

    The poignancy of the story:

    kill(a) ∨ kill(b)

    Conclusions:
    {kill(a), ¬kill(b)}   {kill(b), ¬kill(a)}

    Marek Sergot (October 2015) 44 / 51

  • Defeasible rules

    Another story (not Asimov)

    r1(a) : ¬kill(a)
    r1(b) : ¬kill(b)
    r2(a) : commands(a, α) ⇝ α
    r2(b) : commands(b, α) ⇝ α

    [r1(a), r1(b)] > [r2(a), r2(b)]

    The poignancy of the story:

    kill(a) ∨ kill(b)

    Another twist:

    commands(a, kill(b))

    Naive satisficer:
    {¬kill(a), kill(b)}

    Marek Sergot (October 2015) 45 / 51

    Contents

    Aims

    Logic
    Logic ≠ classical (propositional) logic !!

    Computational logic
    Logic programming ≠ Prolog !!

    Some examples (action, ‘practical reasoning’, . . . )

    Marek Sergot (October 2015) 46 / 51

    Contents

    Contents (not necessarily in this order)

    Models, theories, consequence relations

    Logic databases/knowledge bases; models, theories

    Defeasible reasoning, defaults, non-monotonic logics, non-monotonicconsequence

    Some specific non-monotonic formalisms

    normal logic programs, extended logic programs, Reiter default logic, . . . , ‘nonmonotonic causal theories’, . . . , Answer Set Programming
    priorities and preferences

    Temporal reasoning: action, change, persistence(and various related concepts)

    If time permits, examples from

    ‘practical reasoning’, action, norms . . .
    priorities and preferences

    Marek Sergot (October 2015) 47 / 51

    Contents

    Assumed knowledge

    Basic logic: syntax and semantics; propositional and first-order logic.

    Basic logic programming: syntax and semantics, inference andprocedural readings (Prolog), negation as failure.

    Previous AI course(s) not essential.

    There is no (compulsory) practical lab work — though you are encouraged to implement/run the various examples that come up.

    Recommended reading

    References for specific topics will be given in the notes.

    For background: any standard textbook on AI (not essential)

    Marek Sergot (October 2015) 48 / 51

  • Contents

    Other possible topics, not covered in this course

    Assorted rule-based formalisms; procedural representations

    Structured representations (1) — old-fashioned (frames, semantic nets, conceptual graphs), and their new manifestations

    Structured representations (2) — VERY fashionable

    description logics (previously ‘terminological logics’)
    See e.g. http://www.dl.kr.org

    “Ontologies”

    Develop ‘ontology’ for application X and world-fragment Y.
    ‘Ontology’ as used in AI means ‘conceptual framework’.

    Marek Sergot (October 2015) 49 / 51

    Contents

    Other possible topics, not covered in this course

    Goals, plans, mentalistic structures (belief, desire, intention, . . . )

    associated in particular with multi-agent systems.

    Belief system dynamics: belief revision – no time

    Formal theories of argumentation – closely related to the topics of this course

    Probabilistic approaches (various)

    Some of these topics are covered in other MEng/MAC courses.

    Marek Sergot (October 2015) 50 / 51

    Contents

    Description logic (example)

    Bavaria ⊑ Germany
    Person
    Lager ⊑ Beer
    Sam : Person

    Person drinks Beer
    Person lives in Germany

    Person ⊓ ∃lives in.Bavaria
    Sam : Person ⊓ ∃lives in.Bavaria

    Person ⊓ ∃lives in.Bavaria ⊑ Person ⊓ ∀drinks.Lager

    Conclude:
    Sam : Person ⊓ ∀drinks.Lager

    Marek Sergot (October 2015) 51 / 51
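The example can be checked in a tiny finite interpretation where concepts are sets and roles are binary relations. This is model checking in one hand-built interpretation, not a DL reasoner, and the individuals besides Sam (a region and a beer, "helles") are invented for illustration:

```python
# Model-checking the description-logic example in a tiny finite
# interpretation: concepts are sets, roles are sets of pairs.
domain   = {"sam", "bavaria", "germany", "helles"}
Person   = {"sam"}
Bavaria  = {"bavaria"}
Germany  = {"bavaria", "germany"}     # so Bavaria ⊑ Germany holds here
Lager    = {"helles"}
Beer     = {"helles"}                 # so Lager ⊑ Beer holds here
lives_in = {("sam", "bavaria")}
drinks   = {("sam", "helles")}

def exists(role, concept):            # ∃role.concept
    return {x for x in domain
            if any((x, y) in role and y in concept for y in domain)}

def forall(role, concept):            # ∀role.concept
    return {x for x in domain
            if all(y in concept for y in domain if (x, y) in role)}

# Sam : Person ⊓ ∃lives_in.Bavaria
print("sam" in (Person & exists(lives_in, Bavaria)))   # True
# Sam : Person ⊓ ∀drinks.Lager (the concluded assertion)
print("sam" in (Person & forall(drinks, Lager)))       # True
```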

    Introduction · Defeasible reasoning · Temporal reasoning · Conditionals · Action · Defeasible rules · Contents