Parallel Computing and Vehicle Routing
Teodor Gabriel Crainic
TeodorGabriel.Crainic@CIRRELT.net
VeRoLog PhD School, Cagliari, June 1-2, 2018
© Teodor Gabriel Crainic 2018
Parallel Optimization
Several processes work simultaneously
on several processors
with the common goal of
solving a given problem instance
Solving Problems with O.R.
Difficult decision problems in planning and managing
complex systems
Transport, Logistics, Telecom, Production, Health, …
Operations Research methodology to address/solve them
Modelling: Mathematical formulations
Resolution: Solution algorithms/methods
Optimization and Solution Methods
Models with “nice” mathematical properties
Exact mathematical programming methods
e.g., Linear and convex programming
Hard optimization problems with “nice” mathematical
properties and small dimensions
Exact implicit enumeration methods
e.g., Branch-&-Bound
& cut & price & …
Polyhedral methods (cutting planes)
Larger instances? Decomposition + parallel computing
Optimization and Solution Methods (2)
All others?
Larger dimensions
Hard optimization problems, e.g., combinatorial
Optimization problems without “nice” mathematical
properties
Non-optimization problems
…
Heuristics, meta-heuristics, matheuristics
Decomposition + parallel computing
Plan
Parallel Computing
Parallel meta-heuristics
Cooperative search
A rapid perspective on Parallel Branch & …
We focus on algorithmic design, not on implementation
(nor particular computing architectures)
The world is (massively) parallel ☺
Problems are large and complex (in space and time)
Parallel computing to better represent, simulate, “solve”,
understand them
Simulating actual systems (astrophysics, biology,
health, genomics, material engineering, …) and
virtual ones (games, movies, virtual reality, …)
Solving larger / more complex problems (VRP …)
Concurrency – performing several things
simultaneously, e.g., collaborative search
Better use of resources
Parallel Computing – Why ?
Several processes work simultaneously on several
processors with the common goal of solving a given
problem instance
Solving (for us) = Finding the (an) optimal or
a (good) feasible solution
Parallelism
Decompose the computational load (of solving)
Distribute the resulting tasks to available processors
Solve !
Extract (reconstruct) the solution
Parallel Computing – How ?
Solve more rapidly
Mathematical problems, exact solution algorithms
Aim for speedup =
Best sequential time / p-processor parallel time
The same algorithm, the same solution
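The speedup measure above is a simple ratio; a minimal sketch (helper names are mine, not from the talk):

```python
def speedup(t_sequential: float, t_parallel: float) -> float:
    """Speedup = best sequential time / p-processor parallel time."""
    return t_sequential / t_parallel

def efficiency(t_sequential: float, t_parallel: float, p: int) -> float:
    """Fraction of linear speedup actually obtained with p processors."""
    return speedup(t_sequential, t_parallel) / p

# 120 s sequentially, 40 s on 4 processors:
print(speedup(120.0, 40.0))        # 3.0
print(efficiency(120.0, 40.0, 4))  # 0.75
```

A speedup of p (efficiency 1.0) is the linear ideal; exact methods aim to approach it while returning the same solution as the sequential run.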
Better representation
Simulation
Finer tasks, more complex interactions
Broader & more robust search
Heuristics
Parallel Computing Goals
Parallel Search – Sources of Parallelism
How to divide the task of solving the problem instance
into more or less independent tasks to be addressed more
or less simultaneously?
What is this “complete solving” task?
How to define tasks?
How to distribute the set of tasks (global search)?
Functional
The “algorithm”
Decompose computing-intensive tasks
Work on the same or dedicated parts of the data
Rarely changes the sequential algorithm
Search space separation
Decompose the problem domain (solution/search
space) or
Decompose the problem structure
Work on each part with particular solution method
Require “control” of the overall search
Parallel Computing Decomposition Types
Task granularity
Fine or coarse grained
Inter-task (processor) communications
Synchronous = All finish current task before
exchanging
Asynchronous = Individuals proceed when ready
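The synchronous/asynchronous distinction can be sketched with two toy worker pools (a minimal Python illustration using threads; all names are mine):

```python
import threading
import queue

# Synchronous: every worker finishes its current task before any exchange.
barrier = threading.Barrier(3)
results = []
lock = threading.Lock()

def sync_worker(wid: int):
    local = wid * wid          # stand-in for "current task"
    barrier.wait()             # all workers must reach this point first
    with lock:
        results.append(local)  # exchange happens only after the barrier

threads = [threading.Thread(target=sync_worker, args=(i,)) for i in range(3)]
for t in threads: t.start()
for t in threads: t.join()
print(sorted(results))         # [0, 1, 4]

# Asynchronous: each worker posts its result whenever it is ready.
mailbox = queue.Queue()

def async_worker(wid: int):
    mailbox.put(wid * wid)     # no waiting for the others

threads = [threading.Thread(target=async_worker, args=(i,)) for i in range(3)]
for t in threads: t.start()
for t in threads: t.join()
print(sorted(mailbox.queue))   # [0, 1, 4]
```

Same results here, but the synchronous version makes everyone wait for the slowest worker at each exchange, which is the overhead discussed later.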
Parallel Computing Decomposition
Performing the search & communications
How tasks (including the complete algorithm),
information (e.g., status), data (e.g., new solutions or
components), commands are initiated, exchanged,
terminated
Sometimes together, sometimes separated
How the data is kept
Centralized (“master-slave”)
A unique decision maker
Decentralized (“cooperation”, “collegial”)
Multi-level, hierarchical, …
Parallel Computing Control
Meta-heuristics
Master strategies (heuristics) to guide and modify other
heuristics to produce solutions beyond those normally
identified by local search heuristics (Glover 1986)
Neighbourhoods & populations
Avoiding getting trapped into
Local optima
Sequences of visited solutions (cycling)
Trying to avoid overlooking promising regions
Memories and learning capabilities
Local search is often one of the guided heuristics
Meta-heuristics (2)
Neighborhood Search Methods
Tabu Search, Adaptive Large Neighbourhood Search
(ALNS), Variable Neighbourhood Search (VNS +),
GRASP, Guided Local Search, Simulated Annealing,
and so on
Population-based methods
Evolutionary (genetic) methods, Scatter Search, Path
Relinking
Swarm intelligence
Ant colonies, bee swarms, schools of fish, …
Hybrids, Matheuristics
Meta-heuristics (3)
Different but with a few fundamental common design
elements and characteristics
Guiding strategies
Selecting neighbourhoods, individual-selection rules,
search phases (intensify/diversify, population
management), …
Moves
Exploring/Selecting in the neighbourhood (solution
transformation, individual mating, …)
Learning: explicit (memories), implicit (population), none
Common strategies may be designed
Parallel Meta-heuristics Goals
Accelerate the search
For comparable solution quality (at least)
Broaden the search for better solutions
For comparable wall-clock time
Build a more robust search method for better solutions
The method performs well on different types of
problem instances
Looks “deeper” and “broader” for solutions with
good values for specific attributes
Sources of Parallelism in Parallel MH Search
The “complete solving” task
Finding a good, the best possible, solution
Significantly less predefined than exact solution
methods (e.g., Branch-and-Bound)
May change with the “decomposition” definition
A new class of meta-heuristics
How to define tasks?
Functional or search space separation
Functional Decomposition
Several tasks work in parallel on the “same” data
The “inner loop” of meta-heuristics, e.g.,
neighbourhood/fitness evaluation
Not much in meta-heuristics …,
But may become important for local search, hierarchical
schemes, etc.
Search Space Separation
A very rich source
Search Space (Domain) decomposition (data parallelism)
Explicit separation,
Same meta-heuristic on each component
Construct complete solution
Multi-search (threads, walks, …)
Implicit separation
Independent solvers (meta-heuristics, exact, …) work
on complete space or partial problems
Construct complete solution (latter case)
Combining Decompositions & Strategies
Combining strategies & methods (including exact ones)
“Hybrids”: not much information in this name …
Hierarchical designs
Cooperation
Integrative Cooperative Search
Partition or cover (region overlap)
Building complete solutions out of partial ones
Control/guidance of the overall search given the result of
concurrent explorations (if)
What & how & when information is exchanged (if any)
How to gather global information (if any)
How to create new information (if any)
What to do with received information
What search strategies
Parallel Search – Main Design Issues
A 3-Dimension Taxonomy of Algorithmic Design
Search control cardinality: 1C, pC
Search control and communications: RS, KS, C, KC
Search differentiation: SPSS, SPDS, MPSS, MPDS
Control of
- Work (who does what)
- Communications
(who talks to whom, why, how,
when, and about what)
- Knowledge
(who knows what, e.g., “are we done yet?”,
and what is newly created)
How many are controlling?
A 3-Dimension Taxonomy (2)
Search control cardinality: 1C, pC
Search control and communications: Rigid Synchronous (RS),
Knowledge Synchronous (KS), Asynchronous “Collegial” (C),
Knowledge-creating Collegial (KC)
Search differentiation: Same/Different starting
points/populations × Same/Different strategies
(SPSS, SPDS, MPSS, MPDS)
1-Control, RS/KS, SPSS: Functional Decomposition
Low-level parallelism
Accelerates intensive tasks
Does not change the algorithm
(may change behavior)
Neighbourhood evaluation
Local Search
Education (individual improvement) for populations
Fitness evaluation for populations
Swarms
Master updates pheromone/global matrix
Slaves perform an individual’s work (e.g.,
construction heuristic)
Does not change the sequential behavior & solution
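The master-slave neighbourhood evaluation above can be sketched for a toy 2-opt neighbourhood (a minimal illustration; the instance and function names are mine — threads stand in for slave processors, and the parallel scan returns exactly what a sequential scan would):

```python
from concurrent.futures import ThreadPoolExecutor

def tour_length(tour, dist):
    """Cost of a closed tour under a distance matrix."""
    return sum(dist[tour[i]][tour[(i + 1) % len(tour)]] for i in range(len(tour)))

def two_opt_move(tour, i, j):
    """Reverse the tour segment between positions i and j."""
    return tour[:i] + tour[i:j + 1][::-1] + tour[j + 1:]

def best_neighbour(tour, dist, workers=4):
    """Master: enumerate all 2-opt moves; slaves evaluate them in parallel.
    Returns the same (cost, move) a sequential scan would return."""
    moves = [(i, j) for i in range(1, len(tour) - 1)
                    for j in range(i + 1, len(tour))]
    def evaluate(move):
        i, j = move
        return tour_length(two_opt_move(tour, i, j), dist), move
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return min(pool.map(evaluate, moves))

dist = [[0, 1, 2, 1],
        [1, 0, 1, 2],
        [2, 1, 0, 1],
        [1, 2, 1, 0]]
print(best_neighbour([0, 2, 1, 3], dist))  # (4, (1, 2)): reversing yields 0-1-2-3
```

Only the evaluation is decomposed; the move chosen, and thus the search trajectory, is unchanged.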
1C/RS & KS: Functional Decomposition (2)
1C/KS: Master-Slave Probing / Look Ahead
Allow slaves to execute a few iterations to select the most
promising trajectory
The parallel search method differs from the sequential one and yields different solutions
Good speedups when evaluation is costly
QAP, VRP with ejection chains, fitness evaluation
Probing: better in quality, but, more complex to control,
outperformed by pC methods
Low-level decomposition is of interest for very large
neighbourhoods/populations, Local Search in hierarchical
setups, swarms, and Graphics Processing Units
1C&pC/KS Search Space (Domain) Decomposition
“Domain” decomposition
Perform the search on parts of
search space (“partition”)
Reconstruct global solution
Two types of control
Explicit Search Space Decomposition
How to separate? Eliminate or freeze variables?
Partition or cover?
Global solution reconstruction?
Moves restricted to own zone or not?
Modify decomposition and repeat?
To avoid missing subspaces in between partitions
1C = master-slave (real-time ambulances’ routing)
pC = reconstruction through synchronous exchanges
(first proposal for VRP 1993; ants for VRP 2006)
Potential for very large instances (e.g., time-dependent
network design), hierarchical strategies, …
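The explicit decomposition questions above can be illustrated with a toy sketch (entirely my own names and instance; a round-robin split stands in for a real geographic partition, and a nearest-neighbour heuristic stands in for the per-zone solver):

```python
def decompose(customers, k):
    """Split the customer set into k zones (round-robin here; a real
    implementation would partition geographically)."""
    return [customers[i::k] for i in range(k)]

def route_zone(depot, zone, dist):
    """Nearest-neighbour route over one zone, starting and ending at the depot."""
    route, left, cur = [depot], set(zone), depot
    while left:
        cur = min(left, key=lambda c: dist[cur][c])
        route.append(cur)
        left.remove(cur)
    return route + [depot]

def solve_by_decomposition(depot, customers, dist, k=2):
    """Work on each part separately, then reconstruct the global solution
    (here trivially: one vehicle route per zone)."""
    return [route_zone(depot, zone, dist) for zone in decompose(customers, k)]

dist = {0: {1: 1, 2: 2, 3: 3, 4: 4},
        1: {0: 1, 3: 2}, 2: {0: 2, 4: 2},
        3: {0: 3, 1: 2}, 4: {0: 4, 2: 2}}
print(solve_by_decomposition(0, [1, 2, 3, 4], dist))  # [[0, 1, 3, 0], [0, 2, 4, 0]]
```

The weakness the slide points out shows immediately: moves never cross zone boundaries, hence the need to modify the decomposition and repeat.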
Multi-search Parallel Meta-heuristics
Algorithm-based search space separation
Several “independent” meta-heuristics work
simultaneously on the same problem and search space
Communications = algorithm design
None: Independent Search
All processes synchronize through exogenous (hard
coded) or collegial control
Processes communicate asynchronously
The most successful
Direct or indirect communications/exchanges
Knowledge creation
pC/RS: Independent (Multi) Search
One of the first strategies
Parallel multi-start
Easy to implement
“Not bad” ….
pC/RS Independent Search
[Diagram: several solvers each perform a meta-heuristic
(SPDS/MPSS/MPDS) and return their best solution;
the best of the bests is selected]
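A minimal sketch of this independent (pC/RS) scheme, assuming differentiation by seed only (MPSS); the toy objective and names are mine, and threads stand in for what would be separate processes or machines:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def random_restart_search(seed, n_iters=200):
    """One independent solver: random search over a toy 1-D objective.
    A different seed gives each solver its own starting point/trajectory."""
    rng = random.Random(seed)
    best_x = rng.uniform(-10, 10)
    best_f = (best_x - 3.0) ** 2          # toy objective, minimum at x = 3
    for _ in range(n_iters):
        x = rng.uniform(-10, 10)
        f = (x - 3.0) ** 2
        if f < best_f:
            best_x, best_f = x, f
    return best_f, best_x

def independent_multi_search(n_solvers=4):
    """No communication during the runs; select the best of the bests."""
    with ThreadPoolExecutor(max_workers=n_solvers) as pool:
        results = list(pool.map(random_restart_search, range(n_solvers)))
    return min(results)
```

The only inter-solver interaction is the final `min` — which is why the strategy is easy to implement and "not bad", but learns nothing during the search.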
Cooperative Multi-Search Strategies
pC/KS, pC/C, pC/KC
Cooperative Multi-search Strategies
The most successful
Solvers participate in the cooperative search
Information is shared (and created)
Meaningful & timely
The information-sharing (and guidance) mechanism
How the solvers interact (who exchanges, when, how)
What information is shared (if any)
How is the shared information used
Locally: How each solver acts on / transforms the received
information / creates new one, before making it available
Globally (if)
Cooperative Multi-search Strategies (2)
Exchanged information
Meaningful
Timely
Aim
Enhance individual solver performance
Create “global” view of system to guide the search
What is exchanged
Good solutions, e.g., overall best, elite, local best at
end of improvement sequence
Context information, e.g., memories
Global guidance (control)
Cooperative Multi-search Strategies (3)
How it is exchanged – Communication topology
Directly at agreed meeting points, diffusion, or
broadcasting (humm, danger here, see next!)
Indirectly, through independent data structures (and
process)
How it is exchanged – Communication mode
Synchronously
Asynchronously
Cooperative Multi-search Strategies (4)
General remarks on cooperation
Tends to favor regions where good solutions were found
Need for diversification mechanisms
No (major) differences between neighbourhood- and
population-based methods
Often combined
Main principle of swarm methods
Cooperative Multi-search Strategies (5)
Knowledge synchronization
Solvers stop and exchange at pre- or
dynamically-determined moments
Collegial
Asynchronous exchanges initiated by one of the
solvers
Simple information exchanges
Knowledge collegial
Extract new information from exchanged data
Global guidance (control)
Synchronous Cooperation
Synchronous strategies show poor performance
compared to asynchronous and independent search
Less reactive to environment
Larger computation overheads
Premature convergence of the dynamic process
Global broadcasting: May yield a global random search
Broadcast-based Communications
[Diagram: searches i, j, k, l, h, r; one broadcasts “New best
solution!!”; the others are interrupted (“STOP! Better
solution”, “@!!%$ I was improving!”)]
Synchronous Cooperation (2)
Need to control tightly the information exchanges
What, when, to whom, handling of import data, …
Should not be overwhelming
Asynchronous Cooperation
Most successful
Global solution (decision) emerges out of the collective
behaviour/decisions/solutions & exchanges of the
individual solvers → a new meta-heuristic class
More “agile”, adaptable to the search evolution
Exchange appropriate information in a timely manner
Aim for quality, meaningfulness, parsimony
What information is exchanged?
Good solutions – local best(s) (& diverse)
Context information (local “learning”)
Asynchronous Cooperation (2)
When is information exchanged?
Send on reaching new local optima or important
information (e.g., strategic variable) discovered
Request at decision moments, e.g., diversification or
generation renewal
Send & request jointly?
Reduces communication overhead
Maybe not, to make available significant new
information (improved solutions/individuals) sooner
Asynchronous Cooperation (3)
What does one do with imported data?
Integrate into choices and transform
What does one do with the exchanged data?
New information & guidance may be derived
Asynchronous Cooperation Exchanges
A logical communication graph to be mapped on the
computer architecture (for synchronous as well)
Direct, solver(s)-to-solver(s)
Most population-based methods (migration)
Swarms
Multi-level
Indirect, through data warehouse / blackboard / pool /
central (central / adaptive) memory
Solvers work on
Full problem
Partial problem + integration
Controlled Direct Exchange Mechanisms
[Diagram: direct exchanges between searches i, j, k, l, h, r
over a controlled (restricted) communication graph]
Direct Exchange Mechanisms: Diffusion
[Diagram: diffusion of information between neighbouring
searches i, j, k, l, h, r on an incomplete communication graph]
One-to-one or one-to-many mainly in genetic-based
population-based methods & some swarms
Island methods (ants: part of colony = island)
Migration initiated within one population
Send new best individual
Request when stagnant population
Individual accepted when different and better than
worst in receiving population
Complete communication graphs not necessary
Swarms: diffusion (often)
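The migration rules above (send the best individual; accept it only when different and better than the receiver's worst) can be sketched as follows — a minimal, deterministic toy with names and the acceptance test of my own phrasing:

```python
def migrate(pop_a, pop_b, fitness):
    """One-to-one migration between two islands: each sends its current
    best individual; the receiver accepts a migrant only if it is not
    already present and beats the worst individual it would replace."""
    best_a = min(pop_a, key=fitness)
    best_b = min(pop_b, key=fitness)
    for pop, migrant in ((pop_a, best_b), (pop_b, best_a)):
        worst = max(pop, key=fitness)
        if migrant not in pop and fitness(migrant) < fitness(worst):
            pop[pop.index(worst)] = migrant
    return pop_a, pop_b

# Toy islands evolving toward the target value 5.0 (lower fitness = better):
fitness = lambda x: abs(x - 5.0)
island_a, island_b = [0.0, 1.0], [9.0, 4.0]
migrate(island_a, island_b, fitness)
print(island_a)  # [4.0, 1.0]: island b's best replaced island a's worst
print(island_b)  # [9.0, 4.0]: island a's best (1.0) did not beat 9.0
```

In a full island GA this exchange would be triggered within one population (e.g., when it stagnates), not on a fixed schedule.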
Direct Exchange Mechanisms
The Multi-level Paradigm
[Diagram: Coarsening Phase A0 → A1 → A2 → … → Ai-1 → Ai;
Search Phase at Ai; Refinement Phase Ai → Ai-1 → … → A1 → A0]
Multi-level Search
[Diagram: searches at levels L, …, 2, 1, 0; coarsen the
problem instance going up; refine the solution by
interpolation (and search) coming down]
Multi-level Method Paradigm
Coarsen the solution space into hierarchical sub-spaces
Coarsening / aggregation operators
Find (search) a “good” solution at the top-most level
Search on the most coarsened problem
Project back (interpolate) the solution at the higher level
to guide the search at the current level
Projection / interpolation operators
(relatively “simple” search)
The search at level 0 yields the solution
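The coarsen / search-at-the-top / project-and-refine loop can be sketched on a toy continuous problem, where "coarsening" is simply a coarser grid and "interpolation" restricts the next, finer grid around the incumbent (all names and the toy objective are mine):

```python
def grid_search(f, lo, hi, step):
    """The 'relatively simple' search performed at one level: scan a grid."""
    xs, x = [], lo
    while x <= hi + 1e-9:
        xs.append(x)
        x += step
    return min(xs, key=f)

def multilevel_search(f, lo, hi, levels=4, coarse_step=1.0):
    """Search the most coarsened version first, then repeatedly project the
    incumbent down one level (halve the step) and search around it."""
    step = coarse_step
    best = grid_search(f, lo, hi, step)      # top-most (coarsest) level
    for _ in range(levels - 1):
        step /= 2                            # refinement: a finer grid ...
        best = grid_search(f, best - 2 * step, best + 2 * step, step)  # ... searched locally
    return best

f = lambda x: (x - 2.3) ** 2
print(multilevel_search(f, -10.0, 10.0))     # 2.25
```

Each level does cheap work on a small problem; the final level-0 search produces the solution, exactly as in the paradigm above.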
Cooperative Multi-level Search
[Diagram: as the multi-level search, with elite solutions and
memories passed between neighbouring levels during refinement,
and the coarsening modified on re-coarsening]
Cooperative Multi-level Search (2)
Several (re-)coarsening operators
Aggregation, variable fixing, partitioning, …
Coarsening: critical element
Several projection/interpolation operators
One/several searches
Use of local elite sets and memories to guide operations
at neighbouring levels
Excellent results: (hyper) graph partitioning, network
design, feature selection in bio-medical data, …
Cooperative Multi-level with Central Memory
[Diagram: searches at levels L, 2, 1, 0 all linked to a central
memory holding the elite set; a design that needs to be studied]
Indirect, Memory-based Cooperation
[Diagram: searches i, j, k, l cooperating indirectly through a
memory / pool / reference set / elite set / data warehouse]
pC Memory-based Mechanisms
No direct exchanges among solvers
Data is deposited into memory according to internal logic of the solver
It is extracted on request according to guiding mechanism
Asynchronous operations and process independence easy to enforce
Easy to keep track of exchanged data, easy to manipulate it to build new data and search directions
Dynamic and adaptive structure
The same cooperation logic may be implemented without the common repository structure, but it is not efficient
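These memory-based mechanisms can be sketched as a small thread-safe pool (a minimal illustration; the class, the capacity rule, and the rank-biased weights are my assumptions, not a prescribed design):

```python
import random
import threading

class CentralMemory:
    """Shared pool of elite solutions; solvers deposit and request
    asynchronously, with no direct solver-to-solver exchange."""

    def __init__(self, capacity=10):
        self._pool = []                      # (cost, solution) pairs, best first
        self._lock = threading.Lock()
        self._capacity = capacity

    def deposit(self, cost, solution):
        """A solver deposits according to its own internal logic."""
        with self._lock:
            self._pool.append((cost, solution))
            self._pool.sort(key=lambda p: p[0])
            del self._pool[self._capacity:]  # keep only the elite

    def request(self, rng):
        """Extraction on request, biased toward good ranks."""
        with self._lock:
            if not self._pool:
                return None
            weights = [1.0 / (rank + 1) for rank in range(len(self._pool))]
            return rng.choices(self._pool, weights=weights, k=1)[0]

    def best(self):
        with self._lock:
            return self._pool[0] if self._pool else None

memory = CentralMemory(capacity=2)
for cost, sol in [(5, "a"), (3, "b"), (9, "c")]:
    memory.deposit(cost, sol)
print(memory.best())                     # (3, 'b'); (9, 'c') was dropped
print(memory.request(random.Random(0)))  # one of (3, 'b') / (5, 'a')
```

Because every exchange passes through the pool, it is easy to log what was exchanged and to plug in richer ordering or knowledge-creation logic later.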
pC/C Indirect, Memory-based Cooperation
[Diagram: central Memory holds the solution set (population),
context information, and newly created information, ordered by
quality and diversity; each solver S in the solver set sends
solutions and requests extractions under the search control /
guidance mechanism]
Used with almost all classes of neighbourhood meta-heuristics
Extraction from memory: random, biased by rank (and
diversity)
As in all cooperative strategies, solvers may belong to
different meta-heuristic or exact solution-method class
Different methods may be activated at different
phases within solvers
Context data (e.g., long-term central memories built on
short-term local ones) may be built
pC/C Indirect, Memory-based Cooperation
pC/KC Indirect, Memory-based Cooperation
There is a great richness in the information exchanged
Good and diverse solutions of various origins
Local context
Memory = Population of elite solutions dynamically
updated
One may analyze & learn & create new information,
new knowledge out of the exchanged one
Create new solutions (genetic, scatter, path relinking)
Build global context information
Build guiding information
Simple pC/KC Experiment for Network Design
pC/C with several Tabu Searches and Central Memory
+
A GA that takes its population from Memory and returns
improved individuals
Outperformed the initial pC/C
Best solution never from GA but improved by the Tabus
Worthwhile to involve different solvers and create new
solutions
[Diagram: several TABUs and a GENETIC Algo cooperating
through the central Memory]
pC/KC Indirect, Memory-based Cooperation
[Diagram: as the pC/C memory design, plus: solvers send local
context along with their solutions; the memory creates new
solutions and global context, and solvers extract guidance
information as well as solutions]
Adaptive Memory
Rochat & Taillard, 1995
In memory
Components of elite solutions (e.g., routes)
Measures of how good the solutions were
Possibly for other attribute-based measures
Memory
Extracts components out of received solutions
Computes a value for each component
Historic smoothing
Context data, e.g., frequency in good solutions
Adaptive Memory (2)
New information creation
Within solvers
(could be performed by different solvers)
Each solver
Selects components & constructs an initial solution
Biased random selection (in-memory solution value,
context data)
Improves the solution (meta-heuristic)
Sends best solution(s) + solution value(s) to memory
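The adaptive-memory cycle above — store scored components of elite solutions, then construct new starting solutions from them — can be sketched as follows (a toy after the Rochat & Taillard idea; the class, the smoothing rule, and the greedy selection are my simplifications, the actual selection being biased-random):

```python
from collections import defaultdict

class AdaptiveMemory:
    """Holds components (routes) of elite solutions with a smoothed score
    reflecting how good the solutions they came from were."""

    def __init__(self, alpha=0.5):
        self.alpha = alpha
        self.score = defaultdict(float)   # route (as tuple) -> smoothed value

    def add_solution(self, routes, solution_cost):
        value = 1.0 / solution_cost       # better solution -> higher value
        for route in routes:
            key = tuple(route)
            self.score[key] = self.alpha * self.score[key] + (1 - self.alpha) * value

    def construct(self, customers):
        """Select mutually disjoint routes to seed a solver (greedy here;
        the actual method uses biased random selection)."""
        chosen, covered = [], set()
        for route, _ in sorted(self.score.items(), key=lambda kv: -kv[1]):
            if covered.isdisjoint(route):
                chosen.append(list(route))
                covered.update(route)
        return chosen, set(customers) - covered  # leftovers need repair

memory = AdaptiveMemory()
memory.add_solution([[1, 2], [3, 4]], solution_cost=10)  # good solution
memory.add_solution([[1, 3], [2, 4]], solution_cost=20)  # weaker solution
print(memory.construct({1, 2, 3, 4}))  # ([[1, 2], [3, 4]], set())
```

Routes from the better solution score higher and are reassembled; the solver then improves the constructed solution and deposits its best back.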
Adaptive Memory (3)
Very successful, widespread, VRP particularly
Several interesting ideas
Set-covering formulation to select components for
solvers (VRPTW; Schulze & Fahle, 1999)
Two-level hierarchical method for real-time vehicle
routing and dispatching (Gendreau et al, 1999)
First level: pC/KC/MPSS
Second level: master-slave with domain decomposition
Central Memory
Crainic, Toulouse, Gendreau, 1996
In memory
Complete solutions, received and created
Local context information received
Global context information created
Guidance information: solutions, solution
components, target variable or attribute values,
e.g., patterns of arcs in routes
Measures of performance for solutions, solvers,
guidance data, …
Central Memory (2)
New information creation
In memory
could be outsourced to a (group of) solver(s)
Memory
Memory = Elite population → Generate new
solutions
Integrate local context information into global one
(includes performance measures)
Extract global attributes from population + context
Generate guidance data
Central Memory (3)
Solvers
Request (good) solution(s) at appropriate times
e.g., diversification or parent selection
Receive information, solutions and guidance
Integrate it according to internal logic
Improve solutions
Heuristics (constructive, improving, post-optimizing),
neighbourhood (tabu search, VNS, etc.) or population (GA,
Path Relinking) meta-heuristics, exact methods
Send new best solution(s) to memory
Send context information
Simple pC/KC for VRPTW (Le Bouthillier & Crainic, 2005)
An Advanced pC/KS for VRPTW
Le Bouthillier, Crainic, Kropf, 2005.
Measures based on atomic element = arc
Independent of routes
Frequency & evolution of arc appearance in
good/average/bad elite solutions (memory)
Pattern = vector of arcs to consider
Build patterns (various lengths) of arcs based on
frequencies and search evolution and send to solvers to
Intensify / diversify the search, e.g., strongly
prefer / avoid the arcs in pattern when selecting arcs
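Counting arc appearances across the elite memory and turning the most frequent ones into a pattern can be sketched directly (function names and the toy elite set are mine):

```python
from collections import Counter

def arc_frequencies(elite_solutions):
    """Frequency of each arc (the atomic element, independent of routes)
    over a set of elite solutions; each solution is a list of routes."""
    freq = Counter()
    for routes in elite_solutions:
        for route in routes:
            freq.update(zip(route, route[1:]))
    return freq

def build_pattern(freq, length):
    """Pattern = vector of arcs; here the most frequent ones, which a
    solver would be told to strongly prefer (intensification); the rarest
    could be avoided instead (diversification)."""
    return [arc for arc, _ in freq.most_common(length)]

elite = [[[0, 1, 2, 0], [0, 3, 0]],
         [[0, 1, 2, 0], [0, 3, 4, 0]]]
freq = arc_frequencies(elite)
print(build_pattern(freq, 2))  # [(0, 1), (1, 2)]
```

Tracking these frequencies separately in good, average, and bad elite solutions, as the slide describes, is what lets the memory distinguish promising arcs from merely common ones.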
pC/KS Structure (VRPTW)
[Diagram: central memory of ordered elite solutions with
extraction rules and data creation; solvers TABUROUTE, Unified
Tabu, two GAs (subchain), construction and post-optimization
heuristics; solvers request solutions or parents, receive
guidance, and send new bests]
Advanced pC/KSs for VRP
Jin, Crainic, Løkketangen, 2012, 2014
Cluster solutions based on arcs in common + history
& solution-quality measures = how well the region
was explored & how good solutions are
Intensification and diversification goals
Heuristic solvers to improve existing solutions + set
covering solvers generate new solutions from elite
solutions in memory
Gröer, Golden, Wasil, 2011
Set covering on routes in memory to generate new
and better ones + Meta-heuristic to improve solutions
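The set-covering step — assembling a new solution from routes stored in memory — can be sketched with a greedy cover (my own simplification; the cited works solve an exact set-covering model over the memory's routes):

```python
def greedy_route_cover(customers, routes, route_cost):
    """Greedy stand-in for the set-covering step: repeatedly pick the route
    with the lowest cost per newly covered customer until every customer
    is covered. Assumes the route pool can cover all customers."""
    uncovered, chosen = set(customers), []
    while uncovered:
        best = min((r for r in routes if uncovered & set(r)),
                   key=lambda r: route_cost[tuple(r)] / len(uncovered & set(r)))
        chosen.append(best)
        uncovered -= set(best)
    return chosen

routes = [[1, 2], [2, 3], [3, 4]]                 # routes taken from memory
costs = {(1, 2): 2.0, (2, 3): 2.0, (3, 4): 2.0}
print(greedy_route_cover({1, 2, 3, 4}, routes, costs))  # [[1, 2], [3, 4]]
```

The recombined solution is then handed to a meta-heuristic for improvement, closing the create-improve-deposit loop.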
Cooperation & Multi-Attribute Optimization
In most cooperative meta-heuristics, including pC/KC,
solvers work on the complete problem (& search space)
Increasingly, large number of interacting attributes
(characteristics)
Larger than in “classical” (“academic”) settings
Problem characterization
Objectives
Uncertainty
That one desires to (must ☺) address simultaneously
Integrative Cooperative Search
pC/KC with partial solvers and integrators
[Diagram: DECOMPOSITION of the initial complete problem into
partial problems; partial solvers produce partial solution
sets; INTEGRATION solvers build the complete solution set
worked on by complete solvers; GUIDANCE and SHARING of
CONTEXT DATA throughout]
Decision-set decomposition to yield “simpler problems”,
e.g., MDPVRP → MDVRP & PVRP
Integrative Cooperative Search
Decision-Set decomposition: opportunistic along sets of
decision variables
Partial solvers in pC/KC groups
Integrators
Construct complete solutions out of partial ones
Complete-solution population (+ solvers)
Global Search Coordinator
Monitoring
Guidance
Crainic et al. (2006, 2009), Lahrichi et al. (2015)
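A trivial integrator for the MDPVRP example can be sketched as follows (entirely my own toy: real integrators are meta-heuristics or optimization models, and must also repair conflicts between the partial solutions):

```python
def integrate(depot_assignment, day_assignment):
    """Toy integrator for MDPVRP -> MDVRP & PVRP: merge the depot decision
    from the MDVRP partial solution with the visit-day decision from the
    PVRP partial solution into one complete assignment per customer."""
    shared = depot_assignment.keys() & day_assignment.keys()
    return {c: (depot_assignment[c], day_assignment[c]) for c in shared}

depots = {1: "D1", 2: "D2"}    # from the MDVRP partial solver group
days = {1: (1, 3), 2: (2, 4)}  # from the PVRP partial solver group
print(integrate(depots, days))  # customer -> (depot, visit days)
```

Each decision set is fixed while the other is optimized, which is why the complete solutions produced by integration must then be evaluated and possibly improved by the complete solver group.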
Integrative Cooperative Search – Decomposition
Decision-Set decomposition
Group variables to yield known or “easier” problem
settings
MDPVRP: along depot and period decisions →
MDVRP & PVRP
Locate antennas and select values for 7 attributes:
location versus “local” attributes → location & 3 groups
of attributes
Fix (not eliminate) variables not in partial problem
setting
[Diagram: Partial Solver Groups PSGi, PSGj, …; each holds a
partial solution set Pi plus context and newly created
information, a partial solver set Si, and a Local Search
Coordinator LSCi]
Partial Solver Group
One or several solvers per partial problem
pC/KC organization
Solvers, central memory = elite population + context
Local Search Coordinator manages the central memory
+ interacts with the global search
[Diagram: the Integrator set I draws on the Partial Solver
Groups PSGi, PSGj, …]
Integrative Cooperative Search – Integration
Integrators
Build complete solutions by mixing partial
solutions (from PSGs) with promising solution features
Aim for
Solution quality & Computational efficiency
Preserving/pass on critical features
Several different integrators may work simultaneously
Simple transmission
Meta-heuristics, e.g., path relinking, Unified Hybrid
Genetic Search
Optimization models, e.g., set covering, selection with
critical-feature preservation
[Diagram: the Integrator set I feeds a Complete Solver Group
CSG (complete solution set P plus context and newly created
information, complete solver set S, coordinator CSC),
alongside the Partial Solver Groups PSGi, PSGj]
Integrative Cooperative Search – Cooperation
Global Solver Group & Global Search Coordinator
The (global) central memory
Elite solution, context, guiding information
(“instructions”)
Solvers (possibly)
[Diagram: the Global Search Coordinator GSC oversees the
Integrator set I, the Partial Solver Groups PSGi, PSGj,
and the Complete Solver Group CSG]
Integrative Cooperative Search – Cooperation
Global Solver Group & Global Search Coordinator
Context information
Out of solutions in central memory,
e.g., frequency of presence of (customer, depot,
period pattern) in MDPVRP solutions
Combining contexts of PSGs
Monitor PSG evolution for, e.g., loss of diversity,
search/population stagnation, unexplored zones, …
Build and send guiding information for PSGs to orient
the search toward promising features
New solutions or components, modify values of fixed
variables, change decomposition or solvers
Perspectives
Low-level parallelism (1C) →
Search space (domain) decomposition →
Independent Search →
Cooperative multi-search (& hybridization)
Synchronous cooperation →
Asynchronous cooperation →
Asynchronous cooperation with knowledge creation
Each fulfills a particular type of task
All may be needed at some time
A meta-heuristic class/paradigm on its own
Perspectives (2)
Still many issues, gaps in knowledge, challenges
Is “best” local search acceleration still best when
integrated in a meta-heuristic?
Usage of Graphic Processing units, behaviour of local
search, integration
“New” computing platforms
Search-space (& low-level) decomposition for
hierarchical methods
Dynamic separation
Integrating with cooperation
Perspectives (3)
Context data integration/generation/utilization
Solver relative performance
Strategic variable/characteristic identification
Learning & guidance
Creating new meaningful information out of shared data
Cooperation of various meta-heuristic and exact solvers
Understanding & modelling cooperation
New application fields, e.g., stochastic programming