Space Applications for Distributed Constraint Reasoning

Brad Clement, Tony Barrett
Artificial Intelligence Group
Jet Propulsion Laboratory
California Institute of Technology
[email protected]
http://ai.jpl.nasa.gov/
Outline
• Applications
  – multi-spacecraft missions
  – “collaborative” mission planning
  – network scheduling
• Current approaches
• Challenges for DCR
• Unsolicited opinions
Multi-Robot Control
• Goal selection
• Future commanding
• Commanding now
  – mode estimation/diagnosis
• Perception & actuation
[Diagram: six robots, each running an Analyst → Planner → Executive → Control stack]
Control of/by Humans
• Goal selection
• Future commanding
• Commanding now
  – mode estimation/diagnosis
• Perception & actuation
[Diagram: the same Analyst → Planner → Executive → Control stacks, with humans filling some of the layers]
Distributed Constrained Optimization

Optimize a function of variable assignments with both local and non-local constraints.

[Diagram: an Analyst → Planner → Executive → Control stack]
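One way to picture the problem statement above: a toy brute-force DCOP in which each hypothetical agent owns one variable and the objective sums local (unary) and non-local (cross-agent) constraint costs. All names and numbers here are illustrative assumptions, not anything from the missions in this talk:

```python
from itertools import product

# Each agent owns one variable; minimize local + non-local costs.
domains = {"planner": [0, 1, 2], "executive": [0, 1, 2]}

def local_cost(assign):
    # Each agent's private preference over its own variable.
    return assign["planner"] + (2 - assign["executive"])

def nonlocal_cost(assign):
    # Cross-agent constraint: the two agents should agree.
    return 3 * abs(assign["planner"] - assign["executive"])

def solve(domains):
    # Centralized brute force, just to make the objective concrete;
    # a real DCR algorithm would search this space via message passing.
    names = list(domains)
    best, best_cost = None, float("inf")
    for values in product(*(domains[n] for n in names)):
        assign = dict(zip(names, values))
        cost = local_cost(assign) + nonlocal_cost(assign)
        if cost < best_cost:
            best, best_cost = assign, cost
    return best, best_cost

best, cost = solve(domains)
print(best, cost)
```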
Space Applications
• multiple rovers
• spacecraft constellation
• Earth orbiters
• Mars network
• DSN antenna allocation
• mission planning
• construction, repair
• crew operations
Decentralize decision-making?
• competing objectives (self-interest)
• control is already distributed
• communication constraints/costs
• computation constraints
Applications – Multiple Spacecraft
Over 40 multi-spacecraft missions proposed!
– Autonomous single spacecraft missions have not yet reached maturity.
– How can we cost-effectively manage multiple spacecraft?
[Diagram: proposed mission programs — Earth Observing System, Sun-Earth Connections, Origins Program, Structure & Evolution of the Universe, Mars Network, NMP]
Applications – Multiple Spacecraft
Classification of Phenomena (Underlying Scientific Questions)

Five Classification Metrics
• Signal Location
  – Where are the signals?
• Signal Isolation
  – How close are distinct signals in the phenomenon?
• Information Integrity
  – How much noise is inherent in each signal?
• Information Rate
  – How fast do the signals change?
• Information Predictability
  – How predictable is the phenomenon?

[Diagrams: signals from the celestial sphere (x/y) and signals from the magnetosphere (over time t)]
Applications – Multiple Spacecraft
Multiple Platform Mission Types

[Diagram: two quadrant charts — Isolation & Integrity (axes Noise, low–high, vs. Resolution Need, low–high) and Rate & Predictability (axes Rate, low–high, vs. Predictability, low–high) — placing Single Spacecraft, Signal Separation, Signal Space Coverage, and Signal Combination mission types]
Space Applications – Science
How to Distribute?

Who gets which components?

[Diagram: candidate distributions of Analyst, Planner, Executive, and GN&C components across spacecraft connected by cross-links]
Autonomous Signal Separation
• Why many executives?
  – Each spacecraft can have local anomalies.
  – During an anomaly communications can be lost due to drift.
• Why only one planner?
  – During normal operations the spacecraft are guaranteed to be able to communicate.
  – Since spacecraft join to make an observation, only one analyst is needed.

[Diagram: one spacecraft carries the full Analyst/Planner/Executive/GN&C stack; the others carry only Executive and GN&C]
Autonomous Signal Space Coverage
• Why many planners?
  – Cross-link is lost during normal operations, but spacecraft still have to manage local activities and respond to science events.
• Why communicate at all?
  – The value of local measurements is enhanced when combined with data from others. Analysts must coordinate over collection.

[Diagram: each spacecraft carries its own Analyst/Planner/Executive/GN&C stack]
Autonomous Signal/Mission Combination
• How does this differ from signal space coverage?
  – Each entity has different capabilities
    • Sensors: radar, optical, IR...
    • Mobility: satellite, rover...
    • Communications abilities.
  – Each mission has its own motivations.
• There is a competition where each mission wants to optimize its own objectives in isolation.

[Diagram: heterogeneous missions, each with its own Analyst/Planner/Executive/GN&C stack]
Space Applications – Mission Operations
Decentralize decision-making?
• competing objectives (self-interest)
• control is already distributed
• communication constraints/costs
• computation constraints
• multiple instruments on spacecraft contend for resources
• multiple scientists may compete for one instrument (HST)
• scientists work with operations staff to make sure goals can be safely achieved
• plans must be validated (carefully simulated)
• changes made by users in parallel invalidate validation
[Diagram: Techsat-21 mission operations architecture — payload ground stations (i.e., Datalynx, USN), a spacecraft ground station (i.e., RSC), and AFSCN; L-band commands, S-band telemetry and quick-look payload data, X-band payload data delivered overnight by FTP; a MOC with ASPEN mission planning, SCL commanding, a simulation environment, flight dynamics, command verification against engineering models on a PPC cluster, TT&C workstations, payload ops workstations, and a data center for pass playback, SOH display, trending, and anomaly resolution; the mission planning loop exchanges new, rejected, rescheduled, and removed activities, local constraints, confirmations, and schedule updates]
Applications - Deep Space Network (DSN)
Applications - Deep Space Network (DSN)
Decentralize decision-making?
• competing objectives (self-interest)
• control is already distributed
• communication constraints/costs
• computation constraints
• 56 missions
• 12 antennas
  – different capabilities
  – shared equipment
  – geometric constraints
  – human operator constraints
• some schedule as long as 10 years into the future
• some require schedule freeze 6 months out
• complicated requirements originally from agreements with NASA, with flexibility in antennas, timing, numbers of tracks, gaps, etc.
• schedule centrally generated; meetings and horse trading to resolve conflicts
• similar to coordinating operations across missions
Applications – DSN Arrays
Decentralize decision-making?
• competing objectives (self-interest)
• control is already distributed
• communication constraints/costs
• computation constraints
• NASA may build 3600 10m weather-sensitive antennas
• 1200 at each complex, in groups of 100 spread over a wide area
• High automation requested—one operator for 100 or 1200 antennas
• Spacecraft may use any number of antennas for varying QoS, and may need a link carried across complexes
• Only some subsets of antenna signals can be combined
  – depends on design of wiring/switching to combiners
  – combiners may be limited
• Local response time should be minimized
[Diagram: DSCC array sites, each with signal processors feeding array signal processing and other DSN systems]
Mars Network
• Network traffic scheduled far in advance
• Windows of comm availability
• Need to react to unexpected events and reschedule
• Missions must control own spacecraft
• Comm affects resources that are needed for other operations
• Continual negotiation
[Diagram: relay network among MGS, MEX, Odyssey, MER A, and MER B]
How does DCR fit?
• Goal selection
  – task allocation
• Future commanding
  – meeting scheduling
• Commanding now
  – mode estimation/diagnosis
• Perception & actuation
[Diagram: the six Analyst → Planner → Executive → Control stacks from the multi-robot slide]
Distributed Constraint Reasoning for Planning & Scheduling

• Allocating events/resources to time slots (meeting scheduling)
  – Hannebauer and Mueller, AAMAS 2001
  – Maheswaran et al., AAMAS 2004
  – Modi & Veloso, AAMAS 2005
• Coordinating plans by making coordination decisions variables
  – Cox et al., AAMAS 2005
[Diagram: partial plans from Init to Goal and Goal', with merge decision variables Merge(A, A') and Merge(A', A) and threat-resolution variables such as Threat(A, C', Z) and Threat(A', Goal, Z) over actions a, c, and z]
Shared Activity Coordination

Shared activities implement team plans, joint actions, and shared states/resources

[Diagram: three Planner/Executive pairs whose plans are linked through shared (equated) activities]
Shared Activity Coordination
(SHAC, Clement & Barrett, 2003)

– continual coordination algorithm
– language for coordinating planning agents
– framework for defining and implementing automated interactions between planning agents (a.k.a. coordination protocols/algorithms)
– software
  • planner-independent interface
  • protocol class hierarchy
  • testbed for evaluating protocols
Shared Activity Model

• parameters (string, integer, etc.)
  – constraints (e.g. agent4 allows start_time [0,20], [40,50])
• decompositions (shared subplans)
• permissions – to modify parameters, move, add, delete, choose decomposition, constrain
• roles – maps each agent to a local activity
• protocols – defined for each role
  – change constraints
  – change permissions
  – change roles
    • includes adding/removing agents assigned to activity
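A minimal sketch of this model as a data structure. Field and method names here are my own assumptions for illustration; SHAC's actual representation may differ:

```python
from dataclasses import dataclass, field

@dataclass
class SharedActivity:
    """Hypothetical sketch of the shared-activity model above."""
    name: str
    # parameter -> per-agent allowed intervals,
    # e.g. {"start_time": {"agent4": [(0, 20), (40, 50)]}}
    constraints: dict = field(default_factory=dict)
    decompositions: list = field(default_factory=list)   # shared subplans
    # agent -> set of permissions ("move", "add", "delete", ...)
    permissions: dict = field(default_factory=dict)
    roles: dict = field(default_factory=dict)            # agent -> local activity
    protocols: dict = field(default_factory=dict)        # role -> protocol

    def may(self, agent, permission):
        # A protocol grants or revokes entries in self.permissions.
        return permission in self.permissions.get(agent, set())

obs = SharedActivity("joint_observation")
obs.constraints["start_time"] = {"agent4": [(0, 20), (40, 50)]}
obs.permissions["agent4"] = {"move", "constrain"}
print(obs.may("agent4", "move"), obs.may("agent4", "delete"))
```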
Control Protocols for a Shared Activity

• Chaos
  – A free-for-all among planners
• Master/Slave
  – The master has permissions, slaves don’t
• Round Robin
  – Master role passes round-robin among planners
• Asynchronous Weak Commitment (AWC)
  – Neediest planner becomes master
• Variations
  – how many planners share activity
  – use of constraints
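The variants above suggest a small protocol class hierarchy. Class and method names are my own assumptions, sketched for illustration rather than taken from the SHAC software:

```python
class Protocol:
    """Decides which agents may modify the shared activity this step."""
    def masters(self, agents, step):
        raise NotImplementedError

class Chaos(Protocol):
    def masters(self, agents, step):
        return set(agents)                    # free-for-all: everyone may modify

class MasterSlave(Protocol):
    def __init__(self, master):
        self.master = master
    def masters(self, agents, step):
        return {self.master}                  # only the fixed master has permissions

class RoundRobin(Protocol):
    def masters(self, agents, step):
        return {agents[step % len(agents)]}   # mastership rotates each step

agents = ["a", "b", "c"]
print(Chaos().masters(agents, 0))
print(MasterSlave("a").masters(agents, 7))
print(RoundRobin().masters(agents, 4))
```

Putting the choice behind one `masters()` interface is what lets a testbed swap protocols on the same planning problem.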
Asynchronous Weak Commitment

AWC::modifyPermissions()
  – if have highest priority
    • remove self’s modification permissions (add, move, delete)
  – else
    • give self modification permissions

AWC::modifyConstraints()
  – if cannot resolve local conflicts and conflicts with constraints of higher ranking agents
    • set own rank to highest rank plus one
    • generate parameter constraints (no-good) describing locally consistent values
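The pseudocode above can be sketched in Python. Function names and the no-good representation are my own assumptions; this illustrates the rule, not SHAC's actual interface:

```python
MODIFY = {"add", "move", "delete"}

def modify_permissions(self_rank, other_ranks, permissions):
    """Highest-priority agent gives up modification permissions so the
    others adapt around its plan; everyone else keeps (or regains) them."""
    if self_rank > max(other_ranks, default=-1):
        return permissions - MODIFY
    return permissions | MODIFY

def modify_constraints(self_rank, all_ranks, conflicts_unresolvable, consistent_values):
    """If conflicts with higher-ranked agents' constraints cannot be
    resolved, escalate rank and publish a no-good describing the
    locally consistent parameter values."""
    if conflicts_unresolvable:
        return max(all_ranks) + 1, ("no-good", consistent_values)
    return self_rank, None

print(modify_permissions(5, [2, 3], {"add", "move", "delete", "constrain"}))
print(modify_constraints(1, [1, 2, 3], True, {"start_time": (0, 20)}))
```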
Experiments – Abstract Problem
• joint measurements
• capability matching
• 3-9 spacecraft
• 1-7 capabilities
• 1-9 joint goals each requiring 1-4 of each capability
Experimental Results (Progress over CPU time)

[Chart: number of problems solved vs. max CPU time (seconds) for AWC, RR, Chaos, and M/S. Chaos produces invalid solutions; M/S is not complete.]
Computing Consensus Windows

[Animation across slides: agents A, B, and C exchange numbered messages along a timeline leading up to execution. A consensus window is computed before the execution time; within it the agents decide by voting or auction (highest rank decides), and votes are collected before the window closes.]
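A toy sketch of a decision inside a consensus window: votes arriving before the window closes are tallied, and the highest-ranked agent breaks ties. This is purely illustrative of the scheme on the slides; names and data shapes are my own:

```python
def decide(votes, ranks, window_close):
    """votes: list of (agent, option, arrival_time); late votes are dropped."""
    counted = [(a, o) for a, o, t in votes if t <= window_close]
    tally = {}
    for _, option in counted:
        tally[option] = tally.get(option, 0) + 1
    top = max(tally.values())
    tied = [o for o, n in tally.items() if n == top]
    if len(tied) == 1:
        return tied[0]
    # tie: the vote of the highest-ranked agent decides
    leader = max((a for a, _ in counted), key=ranks.get)
    return dict(counted)[leader]

votes = [("A", "t1", 3), ("B", "t2", 4), ("C", "t1", 9)]
ranks = {"A": 1, "B": 2, "C": 3}
print(decide(votes, ranks, window_close=5))   # C's vote arrives too late
```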
Causal Inconsistency

A SHAC protocol is proven sound if
• the underlying planners are sound,
• the protocol ensures that only one agent has permissions over any piece of information, and
• it employs causally consistent communication

[Diagram: agents a, b, and c exchanging add, delete, add/master, and update messages, numbered 1–8]

Order of events
1. a is master and shares with (adds to roles) b
2. b receives add from a
3. a replaces b with c and makes c master
4. c receives add message making it master
5. c makes b master and removes self (deletes)
6. b receives add/master from c (before delete from a)
7. a receives update from c
8. b receives delete from a
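A minimal vector-clock sketch of causally consistent delivery: a message is held until everything that causally precedes it has been delivered, which would block the problematic ordering in step 6 above (b taking mastership from c before a's delete arrives). Illustrative only; not the SHAC implementation:

```python
def can_deliver(msg_clock, sender, local_clock):
    """Deliver iff the message is the sender's next one and the receiver
    has already seen everything the sender had seen from other agents."""
    for agent, count in msg_clock.items():
        if agent == sender:
            if count != local_clock.get(agent, 0) + 1:
                return False               # out-of-order from the sender
        elif count > local_clock.get(agent, 0):
            return False                   # missing a causal predecessor
    return True

# b has seen 1 message from a and none from c; c's message presupposes
# having seen 2 messages from a, so delivery must wait for a's delete.
print(can_deliver({"a": 2, "c": 1}, "c", {"a": 1, "c": 0}))   # False
print(can_deliver({"a": 1, "c": 1}, "c", {"a": 1, "c": 0}))   # True
```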
Summary
• Many space applications for distributed constraint reasoning
• Many involve model-based causal systems
• Need to map these systems to DCRs
  – how are CSPs mapped?
• Need to handle
  – continuous variables (including cpu)
  – limited computation
  – not 1000 computers, but 2-10
  – communication outages, unreliability, guarantees