A Membrane Algorithm for the Min Storage Problem
Dipartimento di Informatica, Sistemistica e Comunicazione
Università degli Studi di Milano – Bicocca
WMC 2006 – Leiden – Lorentz Center – 21 July 2006
Alberto Leporati [email protected]
Dario Pagani [email protected]
WMC 7 - Leiden - 21 July 2006 2
Talk outline
- The ConsComp and Min Storage problems
- Computational properties of Min Storage (NP-hardness, 2-approximability)
- Nishida's membrane algorithm for TSP
- A membrane algorithm for Min Storage:
  - first try: crossover and mutation
  - second try: simple local optimization
- Some traditional heuristics for Min Storage
- Computer experiments:
  - generation of random instances
  - comparison with traditional heuristics
Conservative computations
Let:
- G be an n-input/m-output gate (function)
- S_in = x_1, x_2, ..., x_k be a sequence of input values
- S_out = G(x_1), G(x_2), ..., G(x_k) be the corresponding sequence of output values
For i = 1, 2, ..., k, we define the energy function:
  e_i = E(x_i) - E(G(x_i))
If Σ_{i=1}^{k} e_i = 0, then we say the computation is conservative.
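The definition above can be turned into a direct check. A minimal sketch, where the energy function E, the gate G, and the toy values below are illustrative assumptions, not from the talk:

```python
def energy_balance(E, G, inputs):
    """Return the list e_i = E(x_i) - E(G(x_i)) for a sequence of inputs."""
    return [E(x) - E(G(x)) for x in inputs]

def is_conservative(E, G, inputs):
    """The computation is conservative iff the e_i sum to zero."""
    return sum(energy_balance(E, G, inputs)) == 0

# Toy example: G negates its input and E is the absolute value,
# so every e_i = 0 and any computation is conservative.
E = abs
G = lambda x: -x
assert is_conservative(E, G, [3, -1, 4])
```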
Gates with storage unit
In a conservative computation it may be e_i > 0 or e_i < 0 for some i in {1, 2, ..., k}.
We assume gates are equipped with a storage unit, able to store a limited amount C of energy.
We call C the capacity of the gate.
Remark: without loss of generality, we can assume that all e_i's are integer values.
Stored energy during a computation
If G(x_1), G(x_2), ..., G(x_k) are computed in this order, then the energy stored in the gate at each step is:
  st_0 = 0
  st_1 = st_0 + e_1 = E(x_1) - E(G(x_1))
  st_2 = st_1 + e_2 = st_1 + E(x_2) - E(G(x_2))
  ...
  st_k = e_1 + e_2 + ... + e_k = 0
For the computation to be C-feasible, it must hold:
  0 <= st_i <= C   for all i in {1, 2, ..., k}
Question: is there a permutation σ in S_k such that the computation of G(x_σ(1)), G(x_σ(2)), ..., G(x_σ(k)) is C-feasible?
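For a given order of the e_i, the C-feasibility condition is just a scan over prefix sums. A minimal sketch:

```python
from itertools import accumulate

def is_C_feasible(es, C):
    """es: the e_i values in the order the gate computes them.

    The running stored energy st_i is the i-th prefix sum; it must
    stay within [0, C] at every step.
    """
    return all(0 <= st <= C for st in accumulate(es))

# The same multiset in two orders: one order is 2-feasible, the other is not.
assert is_C_feasible([2, -1, 1, -2], C=2)       # prefix sums: 2, 1, 2, 0
assert not is_C_feasible([-1, 2, 1, -2], C=2)   # first prefix sum is -1
```

Note that the decision problem asks whether *some* order is C-feasible, which is the hard part; checking a given order is linear time.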
The ConsComp decision problem
Instance:
- a set {e_1, e_2, ..., e_k} of integer numbers such that e_1 + e_2 + ... + e_k = 0
- an integer number C > 0
Question: is there a permutation σ in S_k such that
  0 <= Σ_{j=1}^{i} e_σ(j) <= C   (the i-th prefix sum)
for each i in {1, 2, ..., k}?
Theorem: ConsComp is NP-complete.
Proof: reduction from Partition (ask for details!).
Complexity of ConsComp
Theorem: ConsComp is NP-complete in the strong sense.
Proof: reduction from 3-Partition.
We conjecture that the problem remains NP-complete even when the instance contains only a constant number of distinct element values.
Notice that Bin Packing and 3-Partition with a constant number of element types are polynomial; of course, the proofs for those problems do not carry over to ConsComp.
The Min Storage optimization problem
Optimization problem whose natural decision version is ConsComp:
- name: Min Storage
- instance: a set {e_1, e_2, ..., e_k} of integer numbers such that e_1 + e_2 + ... + e_k = 0
- solution: a permutation σ in S_k such that Σ_{j=1}^{i} e_σ(j) >= 0 for each i in {1, 2, ..., k}
- measure: max_{1<=i<=k} Σ_{j=1}^{i} e_σ(j)
Complexity of Min Storage
Min Storage is in NPO.
Since ConsComp is strongly NP-complete, Min Storage is NP-hard in the strong sense.
Trivial bounds for the optimal solution:
- upper bound: (1/2) * Σ_{i=1}^{k} |e_i|
- lower bound: max_{1<=i<=k} |e_i|
It is easily shown that the problem is 2-approximable:
- best known upper bound: 2 * max_{1<=i<=k} |e_i| - 1
Open problem: is it 3/2-approximable?
Complexity of Min Storage
A 2-approximation algorithm for Min Storage exists.
There is no FPTAS for Min Storage.
Open problem: is there a PTAS?
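The slide's pseudocode did not survive the transcript; the following is a plausible greedy sketch of such a 2-approximation, an assumption rather than the talk's verbatim algorithm. It keeps every prefix sum within [0, 2*max|e_i| - 1], which is at most twice the lower bound max|e_i|:

```python
def two_approx(es):
    """Order the e_i so every prefix sum stays in [0, 2*M - 1], M = max|e_i|.

    Greedy rule (reconstruction, not the talk's exact algorithm): while
    the stored energy is below M, emit a remaining positive element if
    any (the sum stays below 2*M); otherwise emit a non-positive one
    (the sum stays nonnegative, because the remaining elements always
    total minus the current storage).
    """
    M = max(abs(e) for e in es)
    pos = sorted(e for e in es if e > 0)                   # pop() -> largest
    neg = sorted((e for e in es if e <= 0), reverse=True)  # pop() -> closest to 0
    order, st = [], 0
    while pos or neg:
        e = pos.pop() if (st < M and pos) else neg.pop()
        st += e
        order.append(e)
    return order
```

Since the optimum is at least max|e_i|, the maximum stored energy of this order is at most twice the optimum.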
Membrane algorithms
Introduced by Nishida, and applied to TSP.
Inspired by Membrane Computing.
Composed of three components:
- a linear collection of regions separated by nested membranes
- local optimization sub-algorithms that work upon a small number of candidate solutions
- a transport mechanism to move candidate solutions to the immediately inner or outer region
Notation:
- M = number of nested membranes
- D = number of iterations before halting
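The three components above can be wired together as in the sketch below. The function names, region sizes, and the exact transport policy are illustrative assumptions, not Nishida's precise formulation:

```python
def membrane_algorithm(init, local_opt, fitness, M, D, per_region=2):
    """Generic membrane-algorithm skeleton (illustrative sketch).

    regions[0] is the innermost region; lower fitness is better.
    local_opt(solution, region_index) is the per-region subalgorithm.
    """
    regions = [[init() for _ in range(per_region)] for _ in range(M)]
    for _ in range(D):                              # D iterations before halting
        for i in range(M):                          # local optimization per region
            regions[i] = [local_opt(s, i) for s in regions[i]]
        for i in range(M):                          # transport mechanism
            regions[i].sort(key=fitness)
            if i > 0:
                regions[i - 1].append(regions[i].pop(0))   # best solution inward
            if i < M - 1:
                regions[i + 1].append(regions[i].pop())    # worst solution outward
    return min(regions[0], key=fitness)             # final answer from region 0
```

A toy usage: minimizing the distance to 42 on integers, where the local optimizer just moves one step toward the target.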
Membrane algorithms
[Figure: four nested membranes, with regions numbered 0 (innermost) to 3 (outermost)]
Nishida’s algorithm for TSP
The local optimization subalgorithm for region 0 is tabu search:
- exchanges two nodes in the candidate solution (throws away previously considered solutions)
In the other regions:
- edge exchange crossover: combines two solutions to produce two new solutions
- mutation, performed in region i with probability i/M
Halting condition: prefixed number of iterations.
Nishida’s algorithm for TSP
20 tests on eil51, with D = 40000. Optimal solution is 426
20 tests on kroA100, with D = 100000. Optimal solution is 21282
Membrane alg. for Min Storage
First try: MA4MS.
Same structure as Nishida's algorithm for TSP (membrane structure, number of solutions).
Local optimization subalgorithms:
- region 0: LocalSearch4MS
- other regions: Partially Matched Crossover (PMX), followed by Mutation
Fitness measure:
  F(σ) = max_{1<=i<=k} Σ_{j=1}^{i} e_σ(j)   if Σ_{j=1}^{i} e_σ(j) >= 0 for all i in {1, 2, ..., k}
  F(σ) = NumVPS * Σ_{i=1}^{k} |e_i|         otherwise
(where NumVPS is the number of violated, i.e. negative, prefix sums)
LocalSearch4MS
Neighborhood of candidate solution σ, w.r.t. position p: the set of k-1 solutions
  Neigh(σ, p) = {σ_{i,p} : i != p}
where σ_{i,p} is obtained from σ by exchanging the elements in positions i and p.
LocalSearch4MS looks for the best (i.e., lowest fitness) solution in Neigh(σ, p):
- p is chosen as the first position for which the prefix sum is negative (if any)
- if all prefix sums are nonnegative, p is chosen at random
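A sketch of one LocalSearch4MS step, under the assumption that the fitness is the MA4MS measure described earlier (max prefix sum when feasible, a penalty that always exceeds any feasible measure otherwise):

```python
import random
from itertools import accumulate

def fitness(order):
    """Assumed MA4MS fitness: max prefix sum if all prefixes are >= 0,
    otherwise NumVPS (count of negative prefixes) times sum of |e_i|."""
    prefixes = list(accumulate(order))
    if all(s >= 0 for s in prefixes):
        return max(prefixes)
    violated = sum(1 for s in prefixes if s < 0)
    return violated * sum(abs(e) for e in order)

def local_search_4ms(order):
    """Best of the k-1 solutions obtained by swapping position p with any other."""
    prefixes = list(accumulate(order))
    # p = first position whose prefix sum is negative, else a random one
    p = next((i for i, s in enumerate(prefixes) if s < 0),
             random.randrange(len(order)))
    best = list(order)
    for i in range(len(order)):
        if i == p:
            continue
        cand = list(order)
        cand[i], cand[p] = cand[p], cand[i]
        if fitness(cand) < fitness(best):
            best = cand
    return best
```

For example, the infeasible order [-1, 2, 1, -2] has its first negative prefix at position 0; swapping it with position 1 yields the feasible order [2, -1, 1, -2] with measure 2.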
Partially Matched Crossover
[Figure: PMX example on two parent permutations, cut at positions l and r; the segments between the cut points are exchanged, and clashing elements outside the cut points are repaired through the mapping induced by the exchanged segments]
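Since the worked figure is garbled, here is a standard PMX implementation of the operator the slide illustrates (cut points l and r are inclusive and 0-based; the concrete parents below are illustrative, not the slide's example):

```python
def pmx(parent1, parent2, l, r):
    """Partially Matched Crossover: returns two children.

    Each child keeps the other parent's segment [l, r] and fills the
    remaining positions from its own parent, following the mapping
    induced by the exchanged segments whenever a value would repeat.
    """
    def child_of(p, q):
        mapping = {q[i]: p[i] for i in range(l, r + 1) if q[i] != p[i]}
        child = list(p)
        child[l:r + 1] = q[l:r + 1]           # copy q's segment
        for i in list(range(l)) + list(range(r + 1, len(p))):
            v = p[i]
            while v in mapping:               # resolve clashes via the mapping
                v = mapping[v]
            child[i] = v
        return child
    return child_of(parent1, parent2), child_of(parent2, parent1)

c1, c2 = pmx([1, 2, 3, 4, 5, 6, 7], [5, 4, 6, 7, 2, 1, 3], l=2, r=4)
# c1 == [1, 5, 6, 7, 2, 3, 4], c2 == [2, 7, 3, 4, 5, 1, 6]
```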
Mutation
After applying PMX, to each of the new solutions we apply Mutation:
- in region i, the probability of applying Mutation is i/M
- Mutation simply chooses two positions at random (with uniform distribution) and exchanges the two elements
[Figure: example permutation 2 6 3 1 4 7 5, with two positions exchanged]
Generation of instances
Problem: generate k-tuples (X_1, X_2, ..., X_k) such that Σ_{i=1}^{k} X_i = 0 in a uniform way, where X_1, X_2, ..., X_k are discrete, independent random variables uniformly distributed over the set of integers {-M, ..., M}.
First idea: extract a k-tuple and discard it if Σ_{i=1}^{k} X_i != 0.
If we approximate the distribution of Y = Σ_{i=1}^{k} X_i with N(0, k*M*(M+1)/3), we get:
  Pr[Y = 0] ≈ 6.91 * 10^-8   for k = 100 and M = 10^6
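The estimate can be reproduced numerically: Var(X_i) = M(M+1)/3 for X_i uniform on {-M, ..., M}, and Pr[Y = 0] is approximated by the normal density at 0. A quick check, using only the figures stated above:

```python
import math

def pr_hit_zero(k, M):
    """Normal approximation of Pr[X_1 + ... + X_k = 0]."""
    sigma = math.sqrt(k * M * (M + 1) / 3)       # Var(Y) = k * M * (M + 1) / 3
    return 1 / (sigma * math.sqrt(2 * math.pi))  # density of N(0, sigma^2) at 0

print(f"{pr_hit_zero(100, 10**6):.2e}")  # -> 6.91e-08, as on the slide
```

So on average about 1.4 * 10^7 k-tuples must be drawn before one with zero sum appears, which makes plain rejection impractical.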
Generation of instances
Alternative idea: extract a (k-1)-tuple and discard it if X_k = -Σ_{i=1}^{k-1} X_i is not in {-M, ..., M}.
Approximating the distribution of Y = Σ_{i=1}^{k-1} X_i with N(0, (k-1)*M*(M+1)/3), we get:
  Pr[-M <= Σ_{i=1}^{k-1} X_i <= M] ≈ 0.138   for k = 100 and M = 10^6
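The alternative scheme can be sketched directly; with k = 100 and M = 10^6 roughly one draw in seven (probability ≈ 0.138) is accepted, so the loop below terminates quickly in practice:

```python
import random

def random_instance(k, M, rng=random):
    """Rejection sampling: draw k-1 uniform values, set the last one to
    the negated sum, and retry whenever it falls outside {-M, ..., M}."""
    while True:
        xs = [rng.randint(-M, M) for _ in range(k - 1)]
        last = -sum(xs)
        if -M <= last <= M:
            return xs + [last]

inst = random_instance(100, 10**6)
assert sum(inst) == 0 and all(abs(x) <= 10**6 for x in inst)
```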
Error of approximation
The probability distribution of Y = Σ_{i=1}^{k-1} X_i is almost uniform over the accepted range:
  Pr[Y = 0] - Pr[Y = M] ≈ 1.04 * 10^-9   for k = 100 and M = 10^6
Coefficient of approximation
Given:
- a heuristic A
- an instance I of Min Storage
let:
- c_A(I) be the value returned by A
- opt(I) be the optimal solution
The performance ratio (coefficient of approximation) of A over I is:
  app_A(I) = c_A(I) / opt(I)
Note that app_A(I) >= 1, and the closer it is to 1, the better c_A(I) is.
Coefficient of approximation
For large instances, we are not able to compute opt(I):
- we use max_{1<=i<=k} |e_i| instead
- we thus obtain upper bounds on the coefficient of approximation
We actually compute the average coefficient of approximation:
  app_MA4MS = (1/N) * Σ_{i=1}^{N} F_i / opt_i
where N is the number of tests.
Some experiments with MA4MS
10000 tests with M=10 and D=50
10000 tests with M=30 and D=150
MA4MS with local search only
Second try: MA4MS LS.
We use LocalSearch4MS as the local optimization subalgorithm in every region.
10000 tests with M=30 and D=150
In the following, we will use only MA4MS LS!
System’s parameters
We have tried to make M and D grow together…
System’s parameters
…as well as one at a time.
Average coefficient of approximation for M varying between 10 and 100, for D=150
System’s parameters
Average coefficient of approximation for D varying between 50 and 500, for M=80
Some heuristics for Min Storage
The Greedy algorithm (time: Θ(k^2)).
Some Θ(k log k) heuristics:
- MinMax
- MaxMin
- MaxMinMaxMin
- MinMaxMinMax
The Best Fit algorithm (time: Θ(k^2)).
Some assumptions:
- all lists are implemented through arrays
- removal of a generic element from a list requires constant time
- sorting requires Θ(k log k) steps
First experiment
100 instances of 12 elements from the set {-10^6, ..., 10^6}.
Goal: comparison between optimal solutions and coefficients of approximation
Second experiment
10^5 instances of 100 elements from the set {-10^6, ..., 10^6}.
Performance ratios have been computed using max_{1<=i<=k} |e_i| instead of opt(I).
Third experiment
100 instances of 100 elements from the set {-10^6, ..., 10^6}.
Initial energy ranging between 0 and max_{1<=i<=k} |e_i|, with steps of 100.
Fourth experiment
The same as the third experiment, but with initial energy ranging between 0 and 1.1 * max_{1<=i<=k} |e_i|, with steps of 100.
Fifth experiment
10^5 instances of 100 elements from the set {-10^6, ..., 10^6}.
Initial energy fixed to (1/2) * max_{1<=i<=k} |e_i|.
Conclusions
- Min Storage seems to be easy to solve on (almost uniformly) randomly chosen instances.
- MA4MS LS is among the best heuristics (the best for the relaxed problem).
- Best Fit performs better when the initial energy is set to 0, but not when the initial energy is > 0.
- MA4MS LS seems to be stable.
- Comparison with analogous algorithms for further problems?
Last slide !
Thanks for your attention!