part 1: algorithmic self-governance
DESCRIPTION
Part 1 presentation for Jeremy Pitt's Case Study at the FoCAS Summer School 2014, Heraklion, Crete.

TRANSCRIPT

Algorithmic Self-Governance . . .

Jeremy Pitt
Department of Electrical and Electronic Engineering
Imperial College London

FoCAS Summer School, University of Crete, Crete, 23-27/06/2014
Self-Organising Multi-Agent Systems
Preliminary assumptions and definitions
Assume – ‘Multi-Agent Systems’ is a known concept
‘Organising’ implies intentionally arranging or changing a target object
‘Self’ implies agents are organising something that affects them directly, without external intervention
Just to be confusing, we call this form of self-organisation “organised adaptation” . . .
. . . and we’ve also referred to adaptive institutions
Tutorial (Lecture 1) structure
Background: Organised Adaptation
Focus: Dynamic Norm-Governed Systems
Specification Language: the Event Calculus
Example: A Voting Protocol
Summary: Organised Adaptation + Socio-Technical System = Algorithmic Self-Governance
Jeremy Pitt Algorithmic Self-Governance . . . 2 / 42
Motivation
Pick a network:
individual people, forming online communities or social networks via computer-mediated communication
computing devices, forming ad hoc networks, MANETs, VANETs, sensor networks, etc.
business processes, forming virtual enterprises/organizations, holonic manufacturing, computational economies, etc.
May be classified as open systems (in the sense of Hewitt)
autonomous components of heterogeneous provenance
can assume that components can communicate (i.e. a common language)
cannot assume a common objective or central controller
Features
Common features of open systems:
Dynamic and ‘volatile’: the environment, network topology and constituent nodes can vary rapidly and unpredictably
‘Evolutionary’: known nodes can come/go, but there can also be new nodes and node ‘death’
Co-dependence and internal competition: nodes need others to satisfy their own requirements, but may also behave to maximise individual (rather than collective) utility
Partial knowledge: no single source of knowledge; the union of knowledge may be inconsistent
Sub-ideal operation: nodes may fail to comply with the system specification, by accident, necessity, or design

Actuality (what is the case) and ideality (what ought to be the case) do not necessarily coincide
Addressing the Features
Distributed functionality and co-dependence
requires collective and coordinated interaction
requires role-based ‘co-operative work’
‘Evolution’
requires self-assessment, and resilience to (unexpected) change
Decentralised control and partial knowledge
requires sub-group decision-making and consensus formation
Unpredictable behaviour and sub-ideal operation
requires monitoring, conflict resolution, and restoration of compliant states
Everything requires agreed rules and well-defined procedures, and rules and procedures for changing rules and procedures

These rules and procedures have to be applied by the system components to themselves (or ‘things’ which affect them, like roles, structures – and the rules themselves)
This is Organised Adaptation
Organised Adaptation vs. Emergence
Organised Adaptation differs from Emergence
Emergent Adaptation:
the non-introspective application . . .
of hard-wired local computations, . . .
with respect to physical rules and/or the environment, . . .
which achieve unintended or unknown global outcomes
. . . as opposed to . . . Organised Adaptation:

the introspective application . . .
of soft-wired local computations, . . .
with respect to physical rules, the environment and conventional rules, . . .
in order to achieve intended and coordinated global outcomes.
And the ‘things’ which ‘do’ Organised Adaptation are – self-organising multi-agent systems
Examples of Organised Adaptation (‘Real’ Life)
[Figure: real-life examples of organised adaptation - Emergence; Coordination; Organisational Adaptation; Role Assignment; Legal Proclamation. Caption: white smoke from the Sistine Chapel of St. Peter's Basilica announces a new Pope!]
Representative Approaches (Multi-Agent/Autonomic Systems)
Max-flow Networks
Unity
OMACS (Organisational Model for Adaptive ComputationalSystems)
Adaptive Decision-Making Frameworks
Dynamic Argument Systems
Organic Computing
Law-Governed Interaction
Dynamic Norm-Governed Systems
Dynamic Norm-governed Systems
Dynamic Norm-Governed Multi-Agent Systems
Social Constraints
Physical power, institutionalised power, and permission
Obligations, and other complex normative relations
Sanctions and penalties
Roles and actions (communication language)
Communication Protocols
Protocol stack: object-/meta-/meta-meta-/etc. level protocols
Transition protocols to instigate and implement change
Specification Space
Identify changeable components of a specification (Degrees of Freedom: DoF)
Define a ‘space’ of specification instances, and a notion of distance
Define rules about moving between instances
Social Constraints
Three types of ‘can’
Physical capability
Institutional power

The performance by a designated agent, occupying a specific role, of a certain action, which has conventional significance, in the context of an institution
A special kind of ‘certain action’ is the speech act

Permission (& obligation)
Can have (physical or institutional) power with/without permission
Sometimes power implies permission
Sanctions and enforcement policies
Right, duty, entitlement, and other more complex relations
Social constraints can be adapted for intentional, run-time modification of the institution
Communication Protocols
[Figure: the protocol stack - a level 0 object protocol (e.g. resource-sharing) is governed by level 1 up to level k-1 meta-protocols (e.g. voting over rule modification), linked by initialisation and rule-modification transitions]
Any protocol for norm-governed systems can be in level 0.

Any protocol for decision-making over rule modification can be in level n, n > 0.

Attention is also paid to the transition protocols: the procedures with which a meta-protocol is initiated.
Specification Space
We define the Degrees of Freedom (DoF) of a protocol.

A protocol specification with n DoF creates an n-dimensional specification space, where each dimension corresponds to a DoF.

A specification point represents a complete protocol specification — a specification instance — and is denoted by an n-tuple, where each element of the tuple expresses the value of a DoF.
[Figure: a two-dimensional specification space, with axes DoF1 and DoF2; each point is a specification instance]
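The ‘space’ and ‘distance’ above can be made concrete: a specification instance is just a tuple of DoF values. A minimal Python sketch, with hypothetical DoF names (quorum size and majority threshold are illustrative, not from the lecture):

```python
from math import dist  # Euclidean distance between two points

# Hypothetical 2-DoF voting-protocol specification space:
# DoF1 = quorum size, DoF2 = majority threshold (names are illustrative).
dofs = ("quorum", "majority_threshold")

current  = (5, 0.5)    # specification instance: quorum of 5, simple majority
proposed = (7, 0.66)   # another specification point in the same space

# One possible notion of distance between specification instances:
d = dist(current, proposed)
print(round(d, 3))     # distance moved if the system adapts to `proposed`
```

Rules about moving between instances can then be phrased as constraints on this distance, e.g. only permitting adaptations below some threshold per decision cycle.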
Representing protocols in a dynamic norm-governed system
Computational Logic: A Recap
Propositional Logic
Predicate Logic
Modal and Temporal Logic
Automated Reasoning
Prolog
the Event Calculus: a language for representing and reasoning about conventional procedures for organised adaptation
The Event Calculus (EC)
General purpose language for representing events, and for reasoning about the effects of events.

An action language with a logical semantics. Therefore, there are links to:

Implementation directly in Prolog.
Implementation in other programming languages.

Prolog:

specification is its own implementation;
hence executable specification.
Fluents and Events
Focus on events rather than situations; local states rather than global states
Fluents
A fluent is a proposition whose value changes over time
A local state is a period of time during which a fluent holds continuously
Events
initiate and terminate . . .
. . . a period of time during which a fluent holds continuously
Example
give(X, obj, Y) initiates has(Y, obj)
give(X, obj, Y) terminates has(X, obj)
A sequence of such events forms a narrative
Simplified Event Calculus
Inertial fluents hold their values continuously
Values are assigned initially (at the start)
Values are given when asserted (initiated)
Values persist until disturbed (terminated)
Otherwise we have ‘missing information’
A formula of the form
Event terminates fluent
Has persistence-disturbing effect, but no assertional force
A formula of the form
Event initiates fluent
Has assertional force, but no persistence-disturbing effect
Suppose
win lottery initiates rich
lose wallet terminates rich
Example
Given
win lottery initiates rich
Winning the lottery initiates rich (but you might be rich already)
lose wallet terminates rich
Losing your wallet terminates rich (but you might not be rich when you lose it)
[Timeline figure: a narrative of alternating win and lose events; each win initiates rich (assertional force, no persistence-disturbing effect), each lose terminates rich (persistence-disturbing effect, no assertional force); between a win and the next lose we assume still rich]
Events and Narratives in the Simplified EC
Events occur at specific times (when they ‘happen’)
Assume that all events are instantaneous
Aside: there is a refinement of EC for events which have duration
Here, we will use non-negative integer time-points
Does not mean we assume that time is discrete
Does not mean that time points have to be integers
We only need a relative/partial ordering for events
For non-negative integers, < will do
Read < as ‘earlier than’ or ‘before’
A set of events, each with a given time, is called a narrative
Inference in the SEC is non-monotonic
Events in a narrative can be processed in a different order to that in which they occurred
General Formulation
The narrative (what happens when) is represented by:

initially F
Fluent F holds at the initial time point (usually 0)

E happensat T
Event/action of type E occurred/happened at time T

The effects of actions are represented by:

E initiates F at T
The occurrence of an event of type E at time T starts a period of time for which fluent F holds

E terminates F at T
The occurrence of an event of type E at time T ends a period of time for which fluent F holds

The general query:

F holdsat T
Fluent F holds at time T

F holdsfor P
Fluent F holds for time period P (P is of the form (T1,T2])
Method of Computation: The EC ‘Engine’
F holdsat T ←
    E happensat Te ∧ Te < T ∧
    E initiates F at Te ∧
    not (F brokenbetween Te and T)

F holdsat T ←
    0 ≤ T ∧ initially F ∧
    not (F brokenbetween 0 and T)

F brokenbetween Te and T ←
    E′ happensat Ti ∧ Te ≤ Ti ∧ Ti < T ∧
    E′ terminates F at Ti
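These three clauses transcribe almost directly into executable code. Below is a minimal Python rendering of the engine (illustrative only, restricted to unconditional initiates/terminates; the lecture's own implementation route is Prolog):

```python
# A minimal Simplified Event Calculus 'engine': a direct transcription of the
# three clauses above, restricted to unconditional initiates/terminates.

happens = []          # the narrative: (event, time) pairs
initially = set()     # fluents holding at time 0
initiates = {}        # event -> set of fluents it initiates
terminates = {}       # event -> set of fluents it terminates

def broken_between(f, te, t):
    # F brokenbetween Te and T: some E' terminates F at Ti, with Te <= Ti < T
    return any(te <= ti < t and f in terminates.get(e, set())
               for e, ti in happens)

def holds_at(f, t):
    # Clause 1: F was initiated at some earlier Te and not broken since
    if any(te < t and f in initiates.get(e, set()) and
           not broken_between(f, te, t) for e, te in happens):
        return True
    # Clause 2: F held initially and has not been broken since time 0
    return 0 <= t and f in initially and not broken_between(f, 0, t)

# The lottery/wallet example from earlier:
initiates["win_lottery"] = {"rich"}
terminates["lose_wallet"] = {"rich"}
happens += [("win_lottery", 1), ("lose_wallet", 4)]

print(holds_at("rich", 3))   # True: initiated at 1, not yet broken
print(holds_at("rich", 4))   # True: strict comparison, endpoint not included
print(holds_at("rich", 5))   # False: broken by lose_wallet at 4
```

Note how the strict comparisons of the clauses carry over: the fluent does not hold at the time point of the initiating event, and still holds at the time point of the terminating event.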
Notes
Time comparisons are strict: therefore a fluent does not hold at the time point at which it is initiated
Negation-as-failure (not(. . .)) ensures that inferences arenon-monotonic
Action pre-conditions can be expressed as integrity constraints
Some actions can’t be performed at the same time
For example: give(X, obj, Y) ∧ give(X, obj, Z) ∧ not(Y = Z)
Every time the narrative changes, query the integrity constraints to check consistency
A simple extension allows many-valued (as well as boolean)fluents
Form is F = V
For boolean-valued fluents, V ∈ {true, false}
There is a difference between:
kill(X) initiates alive(X) = false at T
kill(X) terminates alive(X) = true at T
Classic Example: the Yale Shooting Problem
Due to Steve Hanks and Drew McDermott in 1987
One actor, Fred, who turns out to be a turkey, and a gun
Two fluents
one for the state of the gun, which can either be loaded or unloaded
one for the state of Fred, which can either be dead or alive
Two actions
load the gun, after which the gun is loaded
shoot the gun, after which Fred is dead, and the gun unloaded
A naive formulation
¬loaded(N) ∧ load(N) → loaded(N + 1)
alive(N) ∧ loaded(N) ∧ shoot(N) → dead(N + 1)
loaded(N) ∧ shoot(N) → ¬loaded(N + 1)
Reasoning about YSP (1)
Given:
{alive(1), ¬loaded(1), load(1), shoot(2)}

We can prove:
alive(2) ∧ loaded(2) ∧ dead(3) ∧ ¬loaded(3)
But we can also prove
dead(2) ∧ loaded(2) ∧ dead(3) ∧ ¬loaded(3)
Because we did not say:
alive(N) ∧ load(N)→ alive(N + 1)
Reasoning about YSP (2)
Given:
{alive(1), ¬loaded(1), load(1), shoot(2), load(3), shoot(4)}

We can prove:
alive(2) ∧ dead(3) ∧ dead(4) ∧ dead(5) ∧ loaded(2) ∧ ¬loaded(3) ∧ loaded(4) ∧ ¬loaded(5)
But we can also prove
alive(2) ∧ dead(3) ∧ alive(4) ∧ dead(5) ∧ loaded(2) ∧ ¬loaded(3) ∧ loaded(4) ∧ ¬loaded(5)
Because we did not say:
dead(N) ∧ load(N)→ dead(N + 1)
Do we have to do this for everything – state explicitly that everything not changed stays the same?
This is the frame problem
EC formulation of the YSP (due to Marek Sergot)

Example (‘Yale Shooting Problem’), in Prolog notation:
initiates(load,loaded).
initiates(shoot,dead,T) :- holds_at(loaded,T).
terminates(shoot,loaded).
terminates(shoot,alive,T) :- holds_at(loaded,T).
initially(alive).
happens(shoot,2).
happens(load,3).
happens(shoot,5).
happens(shoot,8).
happens(load,9).
happens(shoot,11).
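For comparison, the same specification and narrative can be assimilated by a hand-rolled engine outside Prolog. The Python below is an illustrative transcription, not the original code; note how the conditional effects call holds_at at the time point of the event:

```python
# The Yale Shooting Problem narrative, run through a hand-rolled
# Simplified Event Calculus engine (illustrative transcription of the Prolog).

happens = [("shoot", 2), ("load", 3), ("shoot", 5), ("shoot", 8),
           ("load", 9), ("shoot", 11)]
initially = {"alive"}

def initiates(event, fluent, t):
    if event == "load" and fluent == "loaded":
        return True
    if event == "shoot" and fluent == "dead":
        return holds_at("loaded", t)          # conditional effect
    return False

def terminates(event, fluent, t):
    if event == "shoot" and fluent == "loaded":
        return True
    if event == "shoot" and fluent == "alive":
        return holds_at("loaded", t)          # conditional effect
    return False

def broken_between(fluent, te, t):
    return any(te <= ti < t and terminates(e, fluent, ti)
               for e, ti in happens)

def holds_at(fluent, t):
    if any(te < t and initiates(e, fluent, te) and
           not broken_between(fluent, te, t) for e, te in happens):
        return True
    return 0 <= t and fluent in initially and not broken_between(fluent, 0, t)

print(holds_at("alive", 4))        # True
print(holds_at("alive", 5))        # True: shows how end points are treated
print(holds_at("dead", 6.22256))   # True: time points need not be integers
print(holds_at("loaded", 7))       # False: the shoot at 5 unloaded the gun
```

The shoot at time 2 has no effect because the gun is not yet loaded, so there is no need for explicit frame axioms: persistence is handled once and for all by the engine.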
[Timeline figure, time points 0 to 12: alive holds until the shoot at 5; loaded holds for (3,5] and again for (9,11]; dead holds from the shoot at 5 onwards]
?- holds_at(alive, 4).
yes
?- holds_at(alive, 5).
yes % shows how end points are treated
?- holds_at(dead,6.22256).
yes
?- holds_for(loaded,P).
P = (3,5];
P = (9, 11]
Don’t try
?- holds_at(loaded, T).
Too many answers!! (Time is not discrete/integer)
Action pre-conditions
You can’t load and shoot a gun at the same time.
How do we express such action pre-conditions?
Action pre-conditions = integrity constraints
incons :-
happens(load, T), happens(shoot, T).
Here incons stands for ‘inconsistency’. It doesn’t matter what you choose (as long as it isn’t a Prolog built-in predicate).
Now to check consistency of the narrative
?- incons.
Every time the narrative changes, run the query to check consistency.
(There are techniques for efficient incremental integrity constraint checking in deductive databases. Details omitted.)
It is often helpful to include an extra argument in incons to give an indication of the type of inconsistency that has been detected.

You can put any kind of message you like in this argument. Personally, I like to put a Prolog term representing the nature of the inconsistency, like this:
incons((load,T)-happens(shoot,T)) :-
happens(load, T), happens(shoot, T).
Here (load,T)-happens(shoot,T) is just a Prolog term. (- is just a Prolog function symbol. It has no special meaning.)
Now the consistency checking query
?- incons(X).
returns in X a record of what kind of inconsistency it is.
Notice that: action pre-conditions in event calculus will (typically) be more complicated than in situation calculus. Many events can happen simultaneously in event calculus, and some combinations aren’t possible. That can’t happen in situation calculus — only one action at a time (unless you use some exotic version).
Motivating Example: Collective Choice Arrangements
Voting Protocol
There is a set of agents S, a subset of whom belong to an institution I, some of whom occupy the role of voters who are entitled to vote, and a designated agent in I occupying the role of chair, who declares the result of a vote. The protocol stipulates that a specific session (action situation) is opened, the chair calls for a ballot on a specific motion, the voters cast their votes (express their preference), and the chair counts the votes and declares the result according to the standing (collective choice) rules.
Voting Protocol: Informal Description
Informal specification of a decision-making procedure according to Robert’s Rules of Order (Newly Revised):

a committee meets and the chair opens a session
a committee member requests and is granted the floor
that member proposes a motion
another member seconds the motion
the members debate the motion
the chair calls for those in favour to cast their vote
the chair calls for those against to cast their vote
the motion is carried or not, according to the standing rules of the committee
Voting Protocol: Graphical Description
Various options for graphical representation:

UML sequence diagrams
State diagrams
[State diagram: motion states M1 pending, M2 proposed, M3 seconded, M4 voting, M5 voted, M6 resolved; transitions labelled with the protocol actions open_session, propose, second, open_ballot, vote, revoke, abstain, close_ballot, declare, close_session]
Note certain simplifications to the RONR specification:

No floor request, debate or agenda
Voting, changing of votes etc., concurrently
An Event Calculus Specification
Basic Items: Events and Fluents
Institutional Powers
Voting and Counting Votes
Permission and Obligation
Sanctions
Objection
Actions
Action                       Indicating . . .
open_session(Ag, S)          open and close a session
close_session(Ag, S)
propose(Ag, M)               propose and second a motion
second(Ag, M)
open_ballot(Ag, M)           open and close a ballot
close_ballot(Ag, M)
vote(Ag, M, aye)             vote for or against a motion,
vote(Ag, M, nay)             abstain or change vote
abstain(Ag, M)
revoke(Ag, M)
declare(Ag, M, carried)      declare the result of a vote
declare(Ag, M, not_carried)
Fluents
Fluent             Range
sitting(S)         boolean
status(M)          {pending, proposed, seconded, voting(T), voted, resolved}
votes(M)           N × N
voted(Ag, M)       {nil, aye, nay, abs}
resolutions(S)     list of motions
qualifies(Ag, R)   boolean
role_of(Ag, R)     boolean
pow(Ag, Act)       boolean
per(Ag, Act)       boolean
obl(Ag, Act)       boolean
sanction(Ag)       list of integers
Institutional Power
Recall: an empowered agent performs a designated action in a context, which creates or changes an institutional fact.
We want to express the effects of the designated protocol (speech) actions, in particular:

vote
open_session and open_ballot
declare

For the specification of the effects of these actions, it is important to distinguish between:

the act of (‘successfully’) casting a vote, and
the act by means of which the casting of the vote is signalled (e.g. sending a message of a particular form via a TCP/IP socket connection).
Institutional Power
Institutional power to open the ballot on a motion:

pow(C, open_ballot(C, M)) = true holdsat T ←
    status(M) = seconded holdsat T ∧
    role_of(C, chair) = true holdsat T

Institutional power to cast a vote:

pow(V, vote(V, M, _)) = true holdsat T ←
    status(M) = voting(_) holdsat T ∧
    not role_of(V, chair) = true holdsat T ∧
    role_of(V, voter) = true holdsat T ∧
    voted(V, M) = nil holdsat T
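Operationally, such a power rule is just a predicate over a snapshot of the social state. A Python sketch of the power-to-vote rule above (the state layout and field names are illustrative assumptions, not the lecture's implementation):

```python
# The power-to-vote rule as a predicate over a snapshot of the social state
# (dictionary layout and names are illustrative).

def pow_vote(state, v, m):
    """V is empowered to vote on M iff M is open for voting, V occupies the
    voter role but not the chair role, and V has not yet voted on M."""
    return (state["status"].get(m, "pending").startswith("voting")
            and "voter" in state["role_of"].get(v, set())
            and "chair" not in state["role_of"].get(v, set())
            and state["voted"].get((v, m)) == "nil")

state = {
    "status": {"m1": "voting(4)"},   # ballot on m1 opened at time 4
    "role_of": {"vAgent": {"voter"}, "cAgent": {"chair", "voter"}},
    "voted": {("vAgent", "m1"): "nil", ("cAgent", "m1"): "nil"},
}
print(pow_vote(state, "vAgent", "m1"))   # True
print(pow_vote(state, "cAgent", "m1"))   # False: the chair cannot vote
```

Once an agent has voted, voted(V, M) is no longer nil, so the power (and with it the possibility of double-voting) lapses automatically.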
Effects of Institutional Power (1)
Chair performs open_ballot(C, M):

open_ballot(C, M) initiates votes(M) = (0, 0) at T ←
    pow(C, open_ballot(C, M)) = true holdsat T

open_ballot(C, M) initiates voted(V, M) = nil at T ←
    pow(C, open_ballot(C, M)) = true holdsat T ∧
    role_of(V, voter) = true holdsat T

open_ballot(C, M) initiates status(M) = voting(T) at T ←
    pow(C, open_ballot(C, M)) = true holdsat T

Now voters have the power to cast votes
Effects of Institutional Power (2)
Casting and counting votes:

vote(V, M, aye) initiates votes(M) = (F1, A) at T ←
    pow(V, vote(V, M, _)) = true holdsat T ∧
    votes(M) = (F, A) holdsat T ∧
    F1 = F + 1

vote(V, M, aye) initiates voted(V, M) = aye at T ←
    pow(V, vote(V, M, _)) = true holdsat T

Power to revoke a vote is now granted (revocation without a vote was ‘meaningless’)

Power is also used to advance the status of a motion, perform role assignment, etc.
Permission
‘Right’ aspect of enfranchisement
Agents have the power to vote
Agents have the permission to vote

In this case (although not always) power implies permission

Nobody should stop them from exercising their power
Therefore the chair’s power to close the ballot is not always permitted

pow(C, close_ballot(C, M)) = true holdsat T ←
    status(M) = voting(_) holdsat T ∧
    role_of(C, chair) = true holdsat T

per(C, close_ballot(C, M)) = true holdsat T ←
    role_of(C, chair) = true holdsat T ∧
    status(M) = voting(T′) holdsat T ∧
    T > T′ + 10
Obligation
‘Entitlement’ aspect of enfranchisement
‘Access’ to the ‘voting machine’ is a ‘physical’ issue
Correct vote count: as above
A ‘fair’ outcome: obligation to declare the result correctly, e.g. a simple majority vote

obl(C, declare(C, M, carried)) = true holdsat T ←
    role_of(C, chair) = true holdsat T ∧
    status(M) = voted holdsat T ∧
    votes(M) = (F, A) holdsat T ∧
    F > A
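Stripped of the EC wrapping, the obligation reduces to a simple-majority test over the votes(M) pair. An illustrative Python sketch (function and argument names are assumptions, not from the lecture):

```python
# The chair's obligation to declare the motion carried, as a predicate:
# chair role, ballot closed (status 'voted'), and F > A.

def obl_declare_carried(roles, status, votes):
    votes_for, votes_against = votes    # the votes(M) = (F, A) fluent
    return ("chair" in roles and status == "voted"
            and votes_for > votes_against)

print(obl_declare_carried({"chair", "voter"}, "voted", (2, 1)))   # True
print(obl_declare_carried({"chair", "voter"}, "voted", (1, 2)))   # False
```

Other collective choice rules (e.g. a two-thirds majority) would only change the final comparison, which is exactly why the threshold is a natural Degree of Freedom.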
Sanction
The chair always has the power to close a ballot
It has permission to exercise the power only after some timehas elapsed
If it closes the ballot early, it may be sanctioned
close_ballot(C, M) initiates sanction(C) = [(102, M)|S] at T ←
    role_of(C, chair) = true holdsat T ∧
    per(C, close_ballot(C, M)) = false holdsat T ∧
    sanction(C) = S holdsat T
The sanction results in penalty only if someone objects
Feature of RONR: ‘anything goes unless someone objects’
The Event Calculus: Implementation Routes
EC has been mainly used for narrative assimilation:
Given a narrative, check that it is consistent
Given a consistent narrative, check what holds when
There are many EC variants and implementations, fordifferent purposes or requirements
Discrete Event Calculus Reasoner, for proving properties & planning
Cached Event Calculus, for efficient narrative assimilation
etc. . . .
We will show the use of the Simplified EC for narrative assimilation
Pre-process directly into Prolog
The Event Calculus: Narrative Assimilation
[Figure: narrative assimilation - the initial social state, social constraints and narrative below are assimilated by the EC engine to produce the resulting social state]

Initial Social State:

initially( role_of(cAgent,chair) = true ).
initially( role_of(cAgent,voter) = true ).
initially( role_of(pAgent,voter) = true ).
...

Social Constraints:

...
holdsAt( obl(C, declare(C,M,carried))=true, T ) :-
    holdsAt( role_of(C,chair)=true, T ),
    holdsAt( status(M)=voted, T ),
    holdsAt( votes(M)=(F,A), T ),
    F > A.
...

Narrative:

happens( open_session(cAgent, sesh), 1).
happens( propose(pAgent, m1), 2).
happens( second(sAgent, m1), 3).
happens( open_ballot(cAgent, m1), 4).
happens( vote(pAgent, m1, aye), 5).
happens( vote(sAgent, m1, nay), 6).
happens( vote(vAgent, m1, nay), 7).
happens( revoke(sAgent, m1), 8).
happens( vote(sAgent, m1, aye), 9).
happens( close_ballot(cAgent, m1), 10).
happens( declare(cAgent, m1, not_carried), 11).
happens( close_session(cAgent, sesh), 12).

Resulting Social State: roles, powers, permissions, obligations, sanctions
Example: The Voting Protocol
EC Specification pre-processed into Prolog program
Process narratives for consistency and ‘what holds when’
agent   roles            powers                        permissions       obligations       sanctions
cAgent  chair, voter     close_ballot, close_session   close_ballot
pAgent  voter, proposer
sAgent  voter, proposer  vote                          vote
vAgent  voter

happens(vote(sAgent, m1, aye))

cAgent  chair, voter     close_ballot, close_session   close_ballot      close_ballot
pAgent  voter, proposer
sAgent  voter, proposer
vAgent  voter

happens(close_ballot(cAgent, m1))

cAgent  chair, voter     declare, close_session        declare(carried)  declare(carried)
pAgent  voter, proposer
sAgent  voter, proposer
vAgent  voter

happens(declare(cAgent, m1, not_carried))

cAgent  chair, voter     close_session                 close_session                       102
pAgent  voter, proposer  propose                       propose
sAgent  voter, proposer  propose                       propose
vAgent  voter
Summary
Introduced the concept of organised adaptation
We have reviewed the framework of dynamic norm-governed(multi-agent) systems
We have studied the Event Calculus as one language forspecifying the salient aspects of the framework
We have specified a voting protocol in the Event Calculus
This is the basis of engineering self-organising (norm-governed) multi-agent systems
But what happens when we ‘inject’ these systems intosocio-technical systems?
Organised Adaptation + Socio-Technical System = Algorithmic Self-Governance