An automatic algorithm selection approach for nurse rostering
An automatic algorithm selection approach for nurse rostering
Tommy Messelis, Patrick De Causmaecker
CODeS research group, member of ITEC-IBBT-K.U.Leuven
outline
introduction
automatic algorithm selection
our case: nurse rostering
experimental setup
results
conclusions
future work
observation
[chart: performance of algorithms A, B, C and D across instances 1 to 10]
Many different algorithms exist that tackle the same problem class.
Most of them perform well on some instances, while on other instances their performance is worse.
There is no single best algorithm that outperforms all others on all instances.
how to pick the best algorithm?
It would be great to know in advance which algorithm to run on a given instance:
minimize the total cost over all instances
use resources as efficiently as possible
Automatic algorithm selection in a portfolio
[chart repeated: performance of algorithms A, B, C and D across instances 1 to 10]
empirical hardness
hardness or complexity is linked to the solution method that is used
a hard instance for one algorithm can be easy to solve for another algorithm
empirical hardness models
map problem instance features onto performance measures of an algorithm
such models are used for performance prediction
automatic algorithm selection
learn an empirical hardness model for each algorithm in a portfolio
when presented with a new, unseen instance:
predict the performance of each algorithm
run the algorithm with the best predicted performance
hopefully achieve a better overall performance than any of the components individually!
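The selection loop above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: the two `lambda` cost models are hypothetical stand-ins for learned empirical hardness models, and the `size` feature is invented for the example.

```python
# Minimal sketch of portfolio-based algorithm selection.
# One performance model per algorithm; run the one with the
# lowest predicted cost. The models here are hypothetical
# stand-ins for learned empirical hardness models.
models = {
    "A": lambda f: 10 + 5 * f["size"],   # predicted cost of Alg. A
    "B": lambda f: 30 + 1 * f["size"],   # predicted cost of Alg. B
}

def select_algorithm(features):
    """Return the name of the algorithm with the lowest predicted cost."""
    return min(models, key=lambda name: models[name](features))

print(select_algorithm({"size": 3}))   # A: 25 vs B: 33 -> "A"
print(select_algorithm({"size": 10}))  # A: 60 vs B: 40 -> "B"
```

The portfolio wins whenever the two toy models cross: small instances go to A, large ones to B.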
outline
introduction
automatic algorithm selection
our case: nurse rostering
experimental setup
results
conclusions
future work
nurse rostering problem
the problem of finding an assignment of nurses to a set of shifts during a scheduling period that satisfies:
all hard constraints, e.g. minimal coverage on every day
as many soft constraints as possible, e.g. a nurse may want to be free on Wednesdays, but it might not always be possible
hard combinatorial optimisation problem: too complex to solve to optimality, so approximation methods (metaheuristics) are used
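To make the hard-constraint idea concrete, here is a toy check of minimal daily coverage. The roster representation (a list of nurse/day assignments) and the coverage threshold are illustrative assumptions, not the INRC instance format.

```python
# Toy check of one hard constraint: minimal coverage on every day.
# The roster format below is an illustrative assumption, not the INRC format.
from collections import Counter

MIN_COVERAGE = 2  # assumed: at least 2 nurses on duty per day

# roster: list of (nurse, day) shift assignments
roster = [("ann", 0), ("bob", 0), ("ann", 1), ("cid", 1), ("bob", 2)]

def coverage_violations(roster, n_days, min_cov):
    """Count days whose staffing falls below the minimum coverage."""
    per_day = Counter(day for _, day in roster)
    return sum(1 for d in range(n_days) if per_day[d] < min_cov)

print(coverage_violations(roster, 3, MIN_COVERAGE))  # 1 (day 2 has only one nurse)
```

A feasible roster must bring this count to zero; soft constraints are instead accumulated into a cost to be minimized.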
INRC 2010
first International Nurse Rostering Competition, 2010
co-organized by our research group (CODeS)
well-specified format for generic nurse rostering (NRP) instances
set of competitive algorithms, working on the same instance specification
ideal sandbox for an automatic algorithm selection tool
experiments
building empirical hardness models for a set of algorithms
six-step procedure, as first introduced by:
K. Leyton-Brown, E. Nudelman, Y. Shoham. Learning the empirical hardness of optimisation problems: The case of combinatorial auctions. In Principles and Practice of Constraint Programming, 2002.
step 1: instance distribution
step 2: algorithm selection
step 3: feature selection
step 4: data generation
step 5: feature elimination
step 6: model construction
using the models to predict the performance of the algorithms allows for automatic algorithm selection
empirical hardness models
1. instance distribution
use of an instance generator that produces real-world-like instances, similar to the competition instances
2. algorithms & performance criteria
two competitors of the INRC 2010:
alg. A: variable neighbourhood search
alg. B: tabu search
quality of the solutions is measured as the accumulated cost of constraint violations (the lower, the better)
empirical hardness models
3. feature set
305 features: easily computable properties of the instances
size, scheduling period, workforce structure, contract regulations, nurses' requests
described in detail by:
T. Messelis, P. De Causmaecker. An NRP feature set. Technical report, 2010. http://www.kuleuven-kortrijk.be/codes/
empirical hardness models
4. data generation
instance set: 500 instances (400 training instances, 100 test instances)
all feature values are computed
algorithms are run on all instances and the quality of the solutions is determined
using the computer cluster of the Flemish Supercomputer Center (VSC)
5. feature elimination
201 useless (univalued) or correlated features are eliminated
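The feature-elimination step can be sketched as follows: drop columns that take a single value, then drop one column of each highly correlated pair. The data, the 0.95 correlation threshold, and the column layout are assumptions for illustration only.

```python
# Sketch of step 5: eliminate univalued and correlated features.
# Synthetic data; the 0.95 threshold is an assumption for illustration.
import numpy as np

rng = np.random.default_rng(1)
X = rng.random((100, 5))
X[:, 2] = 7.0                  # univalued feature: carries no information
X[:, 4] = X[:, 0] * 2 + 1e-9   # (almost) perfectly correlated with column 0

keep = [j for j in range(X.shape[1]) if X[:, j].std() > 0]  # drop constants
corr = np.corrcoef(X[:, keep], rowvar=False)

selected = []
for i, j in enumerate(keep):
    # keep a column only if it is not highly correlated with one already kept
    if all(abs(corr[i, keep.index(k)]) < 0.95 for k in selected):
        selected.append(j)

print(selected)  # columns 2 (constant) and 4 (redundant) are gone
```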
empirical hardness models
6. model learning
using several learning methods provided by the Weka tool
tree learning techniques were the most accurate
example: Alg. A, R² = 0.93
[scatter plot: predicted quality versus real quality for Alg. A on the test set]
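The model-learning step can be mimicked with scikit-learn in place of Weka (an assumption; the slides used Weka): fit a regression tree mapping feature vectors to solution cost on a training split, then score R² on a held-out test split. All data here is synthetic.

```python
# Sketch of step 6 with scikit-learn standing in for Weka (assumption):
# learn a regression tree from instance features to solution cost,
# then measure R^2 on a held-out test set. Data is synthetic.
import numpy as np
from sklearn.tree import DecisionTreeRegressor
from sklearn.metrics import r2_score

rng = np.random.default_rng(0)
X = rng.random((500, 8))                                   # synthetic features
y = 100 * X[:, 0] + 50 * X[:, 1] + rng.normal(0, 2, 500)   # synthetic cost

X_train, X_test = X[:400], X[400:]   # 400 training, 100 test (as in the slides)
y_train, y_test = y[:400], y[400:]

model = DecisionTreeRegressor(max_depth=6, random_state=0).fit(X_train, y_train)
r2 = r2_score(y_test, model.predict(X_test))
print(f"R^2 on test set: {r2:.2f}")
```

With a strong signal relative to the noise, the tree reaches a high R², in the spirit of the R² = 0.93 reported for Alg. A.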
automatic algorithm selection
given an unseen instance:
use the empirical hardness models to predict the performance of both algorithms
run the algorithm with the best predicted performance
unfortunately, the results were not good: the performance of the portfolio was worse than 'always Alg. A'
automatic algorithm selection
the performance of Alg. A and Alg. B is very similar; in most cases, the difference is small
performance predictions include a certain error
comparing these predictions does not produce an accurate outcome
other approaches are also possible!
automatic algorithm selection
building a classifier that predicts either 'Alg. A' or 'Alg. B' for a given instance
67% of the test instances are correctly classified
for wrongly classified instances, the difference between both algorithms is not large
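The classifier approach skips the two regression models and predicts the winner directly. A sketch, again with scikit-learn standing in for Weka and with synthetic, trivially separable data (real instances would be far harder, hence the reported 67% accuracy):

```python
# Sketch of the classifier approach: predict directly which algorithm
# will perform best. scikit-learn stands in for Weka (assumption);
# features and labels are synthetic.
import numpy as np
from sklearn.tree import DecisionTreeClassifier

rng = np.random.default_rng(2)
X = rng.random((500, 8))                     # synthetic instance features
# synthetic label: "A" is best when feature 0 is small, "B" otherwise
y = np.where(X[:, 0] < 0.5, "A", "B")

X_train, X_test = X[:400], X[400:]           # 400 training, 100 test
y_train, y_test = y[:400], y[400:]

clf = DecisionTreeClassifier(random_state=0).fit(X_train, y_train)
accuracy = (clf.predict(X_test) == y_test).mean()
print(f"correctly classified: {accuracy:.0%}")
```

Comparing predicted labels directly sidesteps the error-amplification problem of comparing two noisy regression outputs.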
automatic algorithm selection tool
use the classifier to predict the algorithm that will perform best
run this algorithm
results
the portfolio performs better than any of the components individually
measuring the sum of the costs of the obtained solutions for the test instances (lower is better):
Algorithm A: 13996
Algorithm B: 13884
portfolio: 13840
[bar chart: sum of solution quality on the test set]
outline
introduction
automatic algorithm selection
our case: nurse rostering
experimental setup
results
conclusions
future work
conclusions
it is possible:
to build a portfolio containing state-of-the-art algorithms
to construct an automatic algorithm selection tool that accurately selects the best algorithm to run
improving the performance of good algorithms
using simple machine learning techniques
considering the existing algorithms as black boxes
a good strategy to overcome the weaknesses of certain algorithms with the strengths of other algorithms
future work
improving the performance even more:
adding more algorithms: different variants of the current algorithms
learn other things: the difference in performance, the probability that a certain algorithm will perform best
applying this to other domains: currently working on project scheduling problems
thank you
any questions?